The Importance of Binary Numbers in Computing

Melanie Gross
Aug 12, 2011
Updated • May 7, 2012

Binary numbers consist of only two digits, 0 and 1. This seems very inefficient and simple to us humans, who are used to working in base 10, but for a computer base 2, or binary, is the perfect numbering system. This is because all calculations in a computer are based on millions of transistors that are either in an on position or an off position. So there we have it: 0 for off, and 1 for on. But that on its own isn't very interesting or useful. A single switch that is either off or on tells us nothing and doesn't allow us to do any maths at all, which, after all, is what we want computers for.
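To make the on/off idea concrete, here is a minimal sketch (in Python, which the article itself doesn't use, so this is purely illustrative): a row of hypothetical switches, read left to right, is just a binary number, with each position worth twice the one to its right.

```python
# Each "switch" is a bit: 1 = on, 0 = off.
switches = [1, 0, 1, 1]  # a hypothetical row of four switches

# Reading left to right, each position is worth double the next:
# 1*8 + 0*4 + 1*2 + 1*1 = 11
value = 0
for bit in switches:
    value = value * 2 + bit

print(value)       # 11
print(bin(value))  # 0b1011 - Python's own binary notation agrees
```

The same loop works for any number of switches, which is exactly what the grouping into bytes in the next paragraph relies on.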

In order to do anything useful we have to group our switches (called bits) into something bigger. For example, eight bits become a byte, and by varying each bit between 1 and 0 we end up with 256 combinations. All of a sudden we have something useful we can work with. We can now represent any number from 0 to 255 (256 values in total, since counting starts at 0 rather than 1) for our mathematics, and if we use two bytes, the number of combinations for our sixteen bits becomes 65,536. Quite staggering considering we're only talking about sixteen transistors.
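The counts above follow from a single rule: n two-state switches give 2 to the power n combinations. A quick check of the figures in the paragraph (again in Python, purely as illustration):

```python
# n two-state switches (bits) give 2**n combinations,
# representing the values 0 through 2**n - 1.
for n in (8, 16):
    print(f"{n} bits: {2 ** n} combinations, values 0 to {2 ** n - 1}")

# 8 bits: 256 combinations, values 0 to 255
# 16 bits: 65536 combinations, values 0 to 65535
```

Every extra bit doubles the count, which is why capacity grows so quickly as transistors are added.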

Now, in modern computers, a CPU is likely to have anything up to a billion transistors. That's 1,000 million switches, all working together and switching billions of times per second, and if we can count to sixty-five thousand with only sixteen transistors, think what we can achieve with a billion.

But many people have forgotten the basics of the computer processor these days. To many it's just a chip that you stick into a motherboard to make it go. No thought is given to the sheer number of calculations that go on inside a processor, even just to display the article you're reading right now. This is probably because these transistors are now so small that you need a microscope to see them, and they can be packed into a processor core so densely that the wires connecting them are many times thinner than a human hair. Even now, the scientists of Silicon Valley are working on ways to fit even more transistors into the same space, each one only a handful of atoms across.

This is all even more amazing when we think back to the days of the first computers. A simple processor needed an entire building of space, not just a small square a few centimeters across, and these behemoths were very low powered by comparison: perhaps capable of a mere 70 thousand instructions per second in the 1970s, against well into the trillions today. But at the end of the day, all of this is done with billions of tiny switches, off and on, 0 and 1.




  1. anonymous said on August 10, 2017 at 4:16 pm

    It doesn’t say why its useful

  2. hi said on December 9, 2015 at 11:04 am

    I used this for school…….it was boring……..very boring.

    1. Anonymous said on November 1, 2016 at 6:28 pm

      why it is boring

  3. Chris Wong said on October 6, 2014 at 9:05 pm

    This sounds like a lot of stuff I don’t understand.

  4. yaya said on February 1, 2013 at 10:31 pm

    what is the use of teaching binary numbers in schools? how can it helps to the students in real life?

  5. Anonymous said on December 2, 2012 at 9:55 pm

    how are binary and transistors related?

    1. Dumb Bun said on January 29, 2018 at 2:45 am

      cuz they are the same

  6. kaviarasu said on June 20, 2012 at 5:15 pm

    it is useful to us

  7. lisapopieypb said on August 21, 2011 at 6:35 pm

    That is until we get quantum computers as an everyday desktop. Gone will be conventional bits, replaced with qbits which can take on a range of values.

  8. Berttie said on August 12, 2011 at 11:59 pm

    Melanie, zero has been counted as a number since about 900 CE in Asia, a little later in Europe, and even earlier in the Americas with the Maya using it from at least 36 BCE. The use of a round symbol to denote zero was suggested by Muhammad ibn Ahmad al-Khwarizmi of Persia around 976 CE.

  9. Martin Brinkmann said on August 12, 2011 at 10:19 am

    just to clarify, we do not actually lose a number, the counting just starts from 0 and not from 1.
