So What Is a Computer?
In today's world and using today's technology, a computer is an electronic device that uses digital logic under the direction of a program for carrying out calculations and making decisions based on those calculations. By this essential definition, a computer has four elements to its construction: electronics, digital logic, programming, and calculating/decision-making.
That's a mouthful of a definition, and one that's pretty hard to swallow, at least in one piece. To help you understand what all this means, let's take a closer look at each of the elements of this definition.
The technology that makes the circuits of a computer work is called electronics. Dig into this word, and you'll find that it comes from electrons, the lightweight particles carrying a negative charge that comprise one of the fundamental constituents of atoms. The flow of electrons through metals is what we normally think of as electricity, a word taken from the Greek word for amber, elektrum. Static electricity, the stuff that creates sparks when you touch a metal doorknob after shuffling across a carpet on a dry winter day, was once most readily produced by rubbing amber with a silk cloth. So, the stuff that makes computers work is named after what's essentially petrified pine tree sap.
Electronics is a technology that alters the flow of charges through electrical circuits. In the more than two centuries since Benjamin Franklin started toying with kites and keys, scientists have learned how to use electricity to do all sorts of useful things, with two of the most important being to operate motors and light lamps. Motors can use electricity to move and change real-world objects. Lights not only help you see but also let you see what an electrical circuit is doing.
For the record, this electricity stuff isn't essential for building a computer. Babbage showed how it could be done with cams, levers, gears, and an old crank (which might have been Babbage himself). If you, like Babbage, have too much time on your hands, you could build a computer for yourself that runs hydraulically or with steam. In fact, scientists hope to build computers that toy with quantum states.
However, electricity and electronics have several big advantages for running a computer over nearly everything else (except that quantum stuff, and that's why scientists are interested in it). Electricity moves quickly, at nearly the speed of light. Electrical devices are easy to interface with (or connect to) the real world. Think of those motors and electric lights. They operate in the real world, and an electrical computer can change the currents that run these motors and lights. Moreover, engineers have mastered the fabrication of electrical circuits of all sizes, down to those so small you can't see them, even if you can see the results of their work on your computer screen. And above all, electrical circuits are familiar, with off-the-shelf parts readily available, so you can easily tinker with electrical devices and build them economically. And that's the bottom line. Just as celebrities are famous primarily for being famous, electronics are used a lot because they are used a lot.
Most of the circuits inside a computer use a special subset of electronic technology called digital electronics. The most important characteristic of the digital signals in these circuits is that they usually have only two states, which are most often described as on and off (or one and zero). Some special digital systems have more than two states, but they involve more than you need to understand right now.
Usually the states are defined as the difference between two voltage levels, typically zero and some standard voltage, or between a positive and negative voltage value. The important part about the states of digital electronics is not what they are but what is between them—nothing. Certainly you can find a whole world of numbers between zero and one—fractions come to mind—but with digital technology the important fact is not whether there could be something between the digital states, but that anything other than the two digital states gets completely ignored. In essence, digital technology says if something is not a zero it is a one. It cannot be anything else.
Think about it: defining the world this way can make sense. For example, an object is either a horse or it is not a horse. Take a close look. It has hooves, four legs, a mane, and a tail, so you call it a horse. If it has six legs and a horny shell and, by the way, you just stepped on it, it is probably not a horse. Yes, we could get on shaky ground with things such as horseflies, but nit-picking like that is just noise, which is exactly what the two digital states ignore.
This noise-free either/or design is what makes digital technology so important. Noise is normally a problem with electrical signals. The little bit of electricity in the air leaks into the electrical signals flowing through wires. The unwanted signal becomes noise, something that interferes with the signal you want to use. With enough noise, the signal becomes unusable. Think of trying to converse over the telephone with a television playing in the background. At some point, the TV gets so loud that holding a phone conversation becomes impossible. The noise level is simply too high.
Digital signals, however, allow circuits to ignore the noise. For computers, that's wonderful, because every little bit of added noise could confuse results. Adding two plus two would equal four plus some noise—perhaps just a little, but a little might be the difference between being solvent and having your checking account overdrawn. Noise-free digital technology helps ensure the accuracy of computer calculations.
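This noise rejection can be sketched in a few lines of code. The voltage levels and noise range below are illustrative assumptions, not a real circuit model: zero volts stands for "0," five volts for "1," and the decision threshold sits halfway between.

```python
import random

THRESHOLD = 2.5  # anything below reads as 0, anything at or above reads as 1

def decode(voltage):
    """Snap a (possibly noisy) voltage back to a clean digital state."""
    return 1 if voltage >= THRESHOLD else 0

clean_bits = [0, 1, 1, 0, 1]
ideal_volts = [5.0 if b else 0.0 for b in clean_bits]

# Corrupt every bit with up to one volt of random noise.
noisy_volts = [v + random.uniform(-1.0, 1.0) for v in ideal_volts]

# Thresholding throws the noise away and recovers the original bits exactly.
recovered = [decode(v) for v in noisy_volts]
print(recovered == clean_bits)  # True
```

Because the noise never pushes a voltage across the halfway line, the decoded bits come out identical to the originals, which is exactly the "two plus two equals four, with no added noise" guarantee described above.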
But sometimes things can get tricky. Say you encounter a beast with hooves, four legs, a mane, a tail, and black-and-white stripes. That's a horse of a different color—a creature that most zoologists would call a zebra and spend hours telling you why it's not a horse. The lesson here (besides being careful about befriending didactic zoologists) is that how you define the difference between the two digital states is critical. You have to draw the line somewhere. Once you do—that is, once you decide whether a zebra is a horse or not—it fits the two-state binary logic system.
Logic is what we use to make sense of digital technology. Logic is a way of solving problems, so you can consider it a way of thinking. For computers, "logically" describes exactly how they think. Computers use a special system of logic that defines rules for making decisions based, roughly, on the same sort of deductive reasoning used by Sherlock Holmes, some great Greek philosophers, and even you (although you might not be aware of it—and might not always use it).
Traditional logic uses combinations of statements to reach a conclusion. Here is an example of logical reasoning:

Premise 1: Dragons eat people.

Premise 2: I am a person.

Conclusion: If I go outside, I will be eaten.
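The dragon syllogism can be expressed directly in two-state logic. This is only a sketch of the idea: the computer never judges whether the premises are true, it just combines the values it is given.

```python
# Premises, accepted unquestioningly as digital states.
dragons_eat_people = True
i_am_a_person = True

# The conclusion follows mechanically from the premises.
i_will_be_eaten_if_i_go_outside = dragons_eat_people and i_am_a_person
print(i_will_be_eaten_if_i_go_outside)  # True
```

Feed the same machinery a false premise (say, a race of vegetarian dragons), and it will just as mechanically conclude False; the logic is consistent either way.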
Computer circuitry is designed to follow the rules of a formal logic system and will always follow the rules exactly. That's one reason why people sometimes believe computers are infallible. But computers make no judgments about whether the statements they operate on are true or false. As long as they get logically consistent statements (that is, the statements don't contradict one another), they will reach the correct conclusions those statements imply.
Computers are not like editors, who question the content of the information they are given. To a computer, proposition 1, "Dragons eat people," is accepted unquestioningly. Computers don't consider whether a race of vegetarian dragons might be running about.
Computers don't judge the information they process because they don't really process information. They process symbols represented by electrical signals. People translate information into symbols that computers can process. The process of translation can be long and difficult. You do some of it by typing, and you know how hard that is—translating thoughts into words, then words into keystrokes.
Computers work logically on the symbols. For the computer, these symbols take electronic form. After all, electrical signals are the only things that they can deal with. Some symbols indicate the dragon, for example. They are the data. Other symbols indicate what to do with the data—the logical operations to carry out. All are represented electronically inside the computer.
Engineers figured out ways of shifting much of the translation work from you to the computer. Consequently, most of the processing power of a computer is used to translate information from one form to another, from something compatible with human beings into an electronic form that can be processed by the logic of the computer. Yes, someone has to write logic to make the translation—and that's what computer programming is all about.
A computer is programmable, which means that it follows a program. A program is a set of instructions that tells the computer how to carry out a specific task. In that way, a program is like a recipe for chocolate chip cookies (a metaphor we'll visit again) that tells, step by step, how to mix ingredients together and then burn the cookies.
Programmability is important because it determines what a computer does. Change the program the computer follows, and it will start performing a new, different task. The function of the computer is consequently determined by the program.
That's how computers differ from nearly all machines that came before them. Other machines are designed for a specific purpose: A car carries you from here to there; an electric drill makes holes in boards and whatever else gets in the way; a toaster makes toast from bread. But a computer? It can be a film editor, a dictation machine, a sound recorder, or a simple calculator. The program tells it what to do.
Calculating and Decision-Making
The real work that goes on inside your computer bears no resemblance to what you ask it to do. Programs translate human-oriented tasks into what the computer actually does. And what the computer does is amazingly simple: It reacts to and changes patterns of bits, data, and logic symbols.
Most of the time the bit-patterns the computer deals with represent numbers. In fact, any pattern of binary bits—the digital ones and zeroes that are the computer's fodder—translates directly into a binary number. The computer manipulates these bit-patterns or binary numbers to come up with its answers. In effect, it is calculating with binary numbers.
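The translation from bit-pattern to number is direct enough that one line of Python shows it. The particular pattern here is just an example.

```python
pattern = "1101"         # four bits: on, on, off, on
value = int(pattern, 2)  # read the pattern as a base-2 number:
                         # 1*8 + 1*4 + 0*2 + 1*1
print(value)             # 13
```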
The computer's calculations involve addition, subtraction, multiplication, division, and a number of other operations you wouldn't consider arithmetic, such as logic operations (and, or, and not) and strange tasks such as moving bits in a binary code left and right.
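Those operations map one-for-one onto operators most programming languages expose. A brief sketch in Python, using two arbitrary four-bit codes:

```python
a = 0b1100  # 12
b = 0b1010  # 10

print(a + b)        # 22 -- ordinary arithmetic
print(a & b)        # 8  (0b1000) -- AND: 1 only where both bits are 1
print(a | b)        # 14 (0b1110) -- OR: 1 where either bit is 1
print(~a & 0b1111)  # 3  (0b0011) -- NOT, trimmed back to four bits
print(a << 1)       # 24 (0b11000) -- shift left: doubles the value
print(a >> 2)       # 3  (0b0011)  -- shift right: bits slide off the end
```

Strange as moving bits left and right sounds, the shifts are what make fast multiplication and division by powers of two possible.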
More importantly, the computer can compare two binary codes and do something based on that comparison. In effect, it decides whether one code is bigger (or smaller, or whatever) than another code and acts on that decision.
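That compare-and-act step looks like this in code. The two codes and the actions are made up for illustration; the point is only the branching.

```python
code_a = 0b0110  # 6 -- could stand for a number, a letter, anything
code_b = 0b0100  # 4

# Compare the two codes and act on the result -- the computer's
# basic decision-making step.
if code_a > code_b:
    action = "take one branch"
else:
    action = "take the other branch"
print(action)  # take one branch
```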
Work your way up through this discussion, and you'll see that those codes aren't necessarily numbers. They could be translations of human concerns and problems—or just translations of letters of the alphabet. In any case, the computer can decide what to do with the results, even if it doesn't understand the ideas behind the symbols it manipulates.
The whole of the operation of the computer is simply one decision after another, one operation after another, as instructed by the computer's program, which is a translation of human ideas into a logic system that uses binary code that can be carried out by electronic signals.
So what is a computer? You have one answer—one that more than anything else tells you that there is no easy answer to what a computer is. In fact, it takes an entire book to explain most of the details, the ins and outs of a computer. This book.