The Importance of Binary Numbers in Computing
Binary numbers consist of only two digits, 0 and 1. This seems very inefficient and simple to us humans, who are used to working in base 10, but for a computer base 2, or binary, is the perfect numbering system. This is because all calculations in a computer are carried out by millions of transistors, each of which is either in an on position or an off position: 0 for off, and 1 for on. But that on its own isn't very interesting or useful. A single switch that is either off or on tells us almost nothing and doesn't allow us to do any maths at all, which, after all, is what we want computers for.
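To make the on/off idea concrete, here is a minimal sketch (in Python, not part of the original article) of how a row of switches encodes a number: read the bits left to right, doubling the running total and adding each new bit.

```python
# Each bit is a switch: 1 = on, 0 = off.
# A group of bits encodes a number via powers of two.
switches = [1, 0, 1, 1]  # on, off, on, on

value = 0
for bit in switches:
    value = value * 2 + bit  # shift the total left one place, add the new bit

print(value)  # the pattern 1011 in binary is 11 in decimal
```

Python can do the same conversion directly with `int("1011", 2)`, which confirms the loop's result.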
In order to do anything useful we have to group our switches (called bits) into something bigger. For example, eight bits make a byte, and by varying the state of each bit, either 1 or 0, we end up with 256 combinations. All of a sudden we have something useful to work with: we can now represent any number from 0 up to 255 (256 values in total, since counting starts at zero), and if we use two bytes, the number of combinations for our sixteen bits becomes 65,536. Quite staggering considering we're only talking about sixteen transistors.
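The counts above follow from a single rule: n bits give 2 to the power n distinct patterns, representing the values 0 through 2^n − 1. A short Python check (added here for illustration) reproduces the article's figures:

```python
# n bits give 2**n distinct patterns, covering values 0 through 2**n - 1.
for n_bits in (8, 16):
    combos = 2 ** n_bits
    print(f"{n_bits} bits: {combos} patterns, values 0 to {combos - 1}")
# 8 bits: 256 patterns, values 0 to 255
# 16 bits: 65536 patterns, values 0 to 65535
```

Every extra bit doubles the number of patterns, which is why going from one byte to two jumps from 256 to 65,536 rather than merely to 512.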
Now, in modern computers, a CPU is likely to have anything up to a billion transistors. That's a thousand million switches all working together, with signals travelling at a sizeable fraction of the speed of light, and if we can count to sixty-five thousand with only sixteen transistors, think what we can achieve with a billion.
But many people have forgotten the basics of the computer processor these days. To many it's just a chip that you stick into a motherboard to make it go. No thought is given to the sheer number of calculations going on inside a processor, even just to display the article you're reading right now. This is probably because these transistors are now so small that you need a microscope to see them, and they can be packed into a processor core so tightly that the wires connecting them are many times thinner than a human hair. Even now, the engineers of Silicon Valley are working on ways to fit still more transistors into the same space, so that each one is only a few dozen atoms across.
This is all even more amazing when we think back to the days of the first computers. A simple processor would need an entire building of space, not a small square a few centimetres across, and those behemoths were very low powered in comparison: perhaps capable of a mere 70 thousand instructions per second back in the 1970s, compared with well into the trillions today. But at the end of the day, all this is done with billions of tiny switches, off and on, 0 and 1.
Just to clarify, we do not actually lose a number; the counting just starts from 0 rather than from 1.
Melanie, zero has been counted as a number since about 900 CE in Asia, a little later in Europe, and even earlier in the Americas with the Maya using it from at least 36 BCE. The use of a round symbol to denote zero was suggested by Muhammad ibn Ahmad al-Khwarizmi of Persia around 976 CE.
That is, until we get quantum computers as everyday desktops. Gone will be conventional bits, replaced with qubits, which can take on a range of values.
it is useful to us
how are binary and transistors related?
cuz they are the same
what is the use of teaching binary numbers in schools? how can it help students in real life?
This sounds like a lot of stuff I don’t understand.
I used this for school…….it was boring……..very boring.
why is it boring?
It doesn't say why it's useful.