A CPU's clock speed is measured in gigahertz (GHz) or, historically, megahertz (MHz). You won't find many CPUs measured in MHz anymore.
A cycle is like a "pulse", and during every pulse billions of transistors open and close, changing the state of the CPU. This state represents something important to an application or process. Because there are billions of transistors, a lot of work can happen inside a single cycle.
If you have a CPU with a 3.0 GHz clock speed, then you have a CPU that can run at 3.0 billion cycles per second.
A CPU can be thought of as a clock. That is to say, a CPU literally "ticks" like a clock, except you can't hear it (and there isn't a literal clock hand moving!), and it does this a few billion times per second.
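The GHz-to-cycles arithmetic is simple enough to sketch in a few lines of Python (the function name here is just for illustration):

```python
# 1 GHz = 1,000,000,000 (one billion) cycles per second;
# 1 MHz = 1,000,000 (one million) cycles per second.
def cycles_per_second(ghz):
    """Convert a clock speed in gigahertz to raw cycles per second."""
    return int(ghz * 1_000_000_000)

print(cycles_per_second(3.0))  # a 3.0 GHz CPU: 3,000,000,000 cycles per second
```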
I could explain this further myself, but let's see what Intel, one of the biggest manufacturers of CPUs in the world (you might even be using one of their CPUs to read this book), has to say on the matter of the CPU clock speed:
> Your CPU processes many instructions (low-level calculations like arithmetic) from different programs every second. The clock speed measures the number of cycles your CPU executes per second, measured in GHz (gigahertz).
Breaking this down a bit...
> Your CPU processes many instructions
Notice the use of the word "instructions"? That's the instruction set we mentioned earlier.
> ... from different programs every second.
Programs are basically compiled to a CPU's instruction set. Hopefully that specific point is now making a bit more sense.
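Python offers a glimpse of this idea: its built-in `dis` module shows the low-level instructions a function compiles down to. (These are bytecode instructions for Python's virtual machine rather than CPU machine code, but the principle is the same: a program reduced to a fixed set of simple instructions.)

```python
import dis

def add(a, b):
    return a + b

# Print the instructions this function was compiled into. You'll see
# opcodes that load the two arguments and then add them together.
dis.dis(add)
```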
> ... The clock speed measures the number of cycles your CPU executes per second, measured in GHz (gigahertz).
So the CPU "ticks" (cycles) billions of times every second, and across those cycles it processes billions of instructions. Thus, the more cycles a CPU can "fit" into a second, the faster it is.
These clock speeds are measured in "hertz". They've gone through various prefixes over the years, but the two most common you'll hear today are "megahertz" and "gigahertz".
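If you're curious what your own CPU reports, on Linux the kernel exposes the current clock speed of each core in `/proc/cpuinfo` (an assumption worth flagging: this path is Linux-specific, and macOS and Windows expose the same information through other tools). A minimal sketch:

```python
def read_clock_speed_mhz(path="/proc/cpuinfo"):
    """Return the first core's current clock speed in MHz, or None."""
    with open(path) as f:
        for line in f:
            # Lines look like: "cpu MHz         : 2900.000"
            if line.lower().startswith("cpu mhz"):
                return float(line.split(":", 1)[1])
    return None
```

Note that this reports the *current* speed: modern CPUs scale their clock up and down to save power, so the number may differ from the advertised "base" clock.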
To bring together the concepts of an instruction set (architecture), 32-bit and 64-bit, and finally the clock speed, we can build an image of what's happening when we run software on our computer:
- The software is executed on the CPU in the instruction set it understands
- Each of these instructions takes one (or more) cycles in the CPU
- The CPU can process billions of cycles per second
- And each instruction can address a certain amount of memory, using either 32-bit or 64-bit addresses
This means, essentially, that software is broken down into the CPU's instruction set (compiled), and then those instructions are executed (cycled through) billions of times per second, addressing memory with either 32-bit or 64-bit addresses so the software can do its work (like manipulating a photograph).
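To make the 32-bit versus 64-bit point concrete: an n-bit address can distinguish 2^n memory locations (bytes, on byte-addressable machines). A small illustration:

```python
def addressable_bytes(address_bits):
    """How many distinct bytes an n-bit memory address can reach."""
    return 2 ** address_bits

print(addressable_bytes(32))  # 4,294,967,296 bytes -- the classic 4 GiB limit
print(addressable_bytes(64))  # 18,446,744,073,709,551,616 bytes
```

This is exactly why 32-bit systems top out at around 4 GiB of usable RAM, while 64-bit systems can address far more memory than any machine physically has.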