MHz or GHz


MHz or GHz, what does it mean? Many people ask what Megahertz (MHz) and Gigahertz (GHz) mean. Worse, some people don't ask, and assume they already know. It isn't hard to understand (it's just clock speed), but it isn't as easy as some assume either. Many people get it wrong and assume that MHz (or GHz) measures how much work a computer can get done: but it's more like RPM in a car than MPH. It can mean the engine is working harder, but not necessarily that you're going faster.

Basics

Hertz (Hz) means one cycle, or clock tick, per second. Mega (abbreviated 'M') means million, and giga (abbreviated 'G') means billion. So a MHz is a million cycles (clock ticks) per second, and a GHz is a billion cycles per second.
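The arithmetic above can be sketched in a few lines of Python. The 2 GHz figure here is just an invented example, not a claim about any particular processor:

```python
# Clock rate in cycles per second, and the time each cycle takes.
MHZ = 1_000_000        # 1 MHz = one million cycles per second
GHZ = 1_000_000_000    # 1 GHz = one billion cycles per second

clock_hz = 2 * GHZ               # a hypothetical 2 GHz processor
cycle_time_ns = 1e9 / clock_hz   # nanoseconds per clock tick

print(clock_hz)       # 2000000000 cycles every second
print(cycle_time_ns)  # 0.5 ns between ticks
```

So at 2 GHz, a new clock tick arrives every half a nanosecond.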

Each cycle is a stage or step for the computer, where it can get some instruction (or partial instruction) done. Think of an assembly line, and each clock, or each time someone rings a bell, the part being assembled moves to the next person in the line and a little more gets done, until ultimately, the completed product comes out the end of the line. So the faster you clock (ring that bell), the faster the part gets through the line.

This means that the faster the rating of the computer, or the more MHz / GHz, the faster the computer, right? Well, not completely.

Each computer (processor) is designed differently. So MHz and GHz ratings are only directly comparable between the exact same version of the same processor.

Imagine that you have two different assembly lines: one where four people work at each station adding things to the part, and another where only one person works at each station. The line with fewer people per station would need more total stations to complete the product. Conversely, the one with more getting done per station takes fewer stations to finish a part.

However, the line with simpler stages has fewer people at each station, so they are less likely to interfere with each other. Thus you can usually run that assembly line faster (move from station to station a little quicker).

So which is better? It depends.

If one gets twice as much done per stage, but the other one runs twice as fast, they are basically equal.
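That trade-off reduces to simple multiplication: work per second is work per cycle times cycles per second. The two hypothetical chips below are invented numbers to illustrate the point, not real processors:

```python
# Work per second = (work done per cycle) x (cycles per second).
def work_per_second(work_per_cycle, clock_hz):
    """Total units of work completed each second."""
    return work_per_cycle * clock_hz

# Chip A: twice as much done per stage, half the clock.
chip_a = work_per_second(work_per_cycle=2.0, clock_hz=1_000_000_000)
# Chip B: half as much per stage, twice the clock.
chip_b = work_per_second(work_per_cycle=1.0, clock_hz=2_000_000_000)

print(chip_a == chip_b)  # True: equal performance despite different GHz
```

Chip B has the bigger GHz number on the box, yet both get exactly the same amount of work done per second.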

Computers (processors) work a lot like that.

• MHz or GHz is usually just a measure of the stage time (how much time between stages).
• The more internal stages they have, the faster the assembly line can go (and the faster the clock can be), but that doesn't mean that they are always getting more work done than another processor. What also matters is how much is getting done at each stage.

Another term for these "stages" is steps in a pipeline -- hardware people call a design with many stages a deep pipeline, and one with fewer stages a shallow (or short) pipeline.
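One more sketch ties pipeline depth to clock speed: the clock can tick no faster than the slowest stage allows, so chopping the work into more, shorter stages permits a faster clock. The stage delays below are made-up numbers for illustration:

```python
# The maximum clock rate is limited by the slowest pipeline stage.
# Stage delays are in nanoseconds; both pipelines do 5 ns of total work.
deep_pipeline = [0.25] * 20    # 20 short stages
shallow_pipeline = [1.0] * 5   # 5 longer stages

def max_clock_ghz(stage_delays_ns):
    # One tick per stage, so the slowest stage sets the clock period.
    return 1.0 / max(stage_delays_ns)

print(max_clock_ghz(deep_pipeline))     # 4.0 GHz
print(max_clock_ghz(shallow_pipeline))  # 1.0 GHz
```

The deep pipeline advertises a 4 GHz clock against the shallow pipeline's 1 GHz, even though each instruction passes through the same 5 ns of total work.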

A little after 2000, Intel made its processors look good by using many more stages (a deep pipeline of around 30 stages), which let them clock really fast. AMD's processors had a slower clock but a much shallower pipeline (fewer stages), and AMD's design actually got more done and performed better.

So clock speed (MHz or GHz) is not an actual measure of a computer's performance, just of how fast it is ticking internally.

Written 2001.09.10