The misfit between computers and humans: blaming the bit

Computers in the year 2015 are not much different from the first personal computers built in the seventies. Many will disagree with me on this, but let me refine my claim: I think computers have not fundamentally changed. Sure, they’re faster, they have more capacity, and they’re much better connected to each other than they were before. They have also become much cheaper, which makes them far more widely available, and as a result the number of applications of the computer has exploded.

However, there’s one important thing that has not changed: the foundation of the computer, the bit.

“Computers have not fundamentally changed over the years.”

Computers work, and have always worked, with bits: zero or one, true or false. Furthermore, computers basically do only two things. One: they store bits on devices like a hard drive or other types of memory, and they can retrieve them from those devices. Two: they can do (advanced) calculations and comparisons with bits, according to logical instructions. Current storage capacities and processor speeds make computers very powerful when it comes to calculations (think of weather prediction models) and storing/retrieving bits (think of databases, or the Internet).

Although the bit definitely contributes to the computer’s power, it is also its limitation. All information you’d like to store on a computer, or have a computer work with (even the commands you’d like to give it), somehow needs to be converted to bits; put differently, the information needs to be encoded according to some encoding mechanism (I will come back to this mechanism later on). When retrieved from the storage device, the bits need to be decoded to obtain the original information.

When it comes to numbers, this is quite easy: a bit is already numerical, and numbers can readily be encoded as bits. But when it comes to other kinds of information, like language, images, or speech, there’s a mismatch between the nature of the storage (bits) and the nature of the information, and storing it in bits no longer seems so easy or efficient. Another difficulty with storing information in bits is the fact that, for humans, information is always related to other information. To make myself clear, let’s consider a piece of information: the word ‘phone’, for instance. For humans, this word is related to the image of a phone, the English language, the history of telephony, and many more things. But how do you store all of that using just 0’s and 1’s?
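To make the encoding step concrete, here is a minimal sketch in Python. It is my own illustration; the values and the choice of UTF-8 are just assumptions for the example. A number maps onto bits almost directly; a word first has to pass through an encoding scheme, and even then only the characters end up in the bits, none of the associations.

number = 42
word = "phone"

# A number maps naturally onto bits.
print(format(number, "b"))                        # 101010

# Text first has to be encoded (here with UTF-8) to become bytes, i.e. bits.
raw = word.encode("utf-8")
print(" ".join(format(b, "08b") for b in raw))    # 01110000 01101000 01101111 01101110 01100101

# Decoding reverses the mechanical step and gives the word back...
print(raw.decode("utf-8"))                        # phone

# ...but nothing in those bits captures what 'phone' means to a human:
# the image of a phone, the English language, the history of telephony.
# All of that would have to be encoded and linked explicitly as well.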

And here the problem arises: there’s a big misfit between humans and computers. When we look at the human brain, although we still don’t know exactly how it works, we see that it does not work with bits. Information is stored without any conversion or encoding, at least not in the way a computer does it, using bits. Another important difference: information is not stored in local packages, the way a computer stores it on a storage device. When new information arrives in the brain, it is integrated, not simply added (as computers do). All kinds of (neurological) relations are created in the brain the moment the new information arrives. Completely different from what computers do.

I’m not a brain expert, and it would go too far to get into the exact differences between the human brain and computers, but I think I can safely state that our brain works quite differently from a computer. And personally I think this is the reason why it’s sometimes so hard to work with computers.

But let’s take a step back here. How DO we manage to work with a computer? The answer is: through an intermediary called software. Software is what we use to bridge the fundamental gap between computers and humans. Databases exist because of software. Your monitor shows images because of software. Your iPod plays music because of software.

“Software is what we use to bridge the fundamental gap between computers and humans.”

Software also translates human commands into machine code. However, this does not mean interaction between humans and computers always goes smoothly. In fact, communication between humans and computers is still very constrained and limited. The intermediary, software, has its limitations. Not because it is not a good enough translator; it’s just that not all human communication can be translated into computer code. There’s a fundamental limitation. When we type or click very specific commands, then yes, software can translate them easily. But when we start talking to our computer, that’s a whole different story. Maybe one day there will be software that can perfectly capture what we’re saying (i.e. turn the sounds we produce with our voice into letters, words and sentences), but then again, how do we translate that into commands a computer will be able to understand and act upon? Don’t forget that a computer just takes 0’s, 1’s and calculation instructions.

Let me give an example. Let’s say I’d like to tell my computer (verbally): ‘Book me the first flight to New York’. I can imagine there will be software that understands this spoken sentence and translates it into computer commands. I can imagine it in this particular case, because it is a very well-defined command. Assuming the computer has access to flight booking systems, it will simply search those systems for flights to New York, find the first one, book it, and even execute the payment.

Now, let’s change the sentence slightly: ‘Book me the first cheap flight to New York’. I added just one word: ‘cheap’. For a computer, this changes everything! Because of the word ‘cheap’, the command is no longer well defined, at least not for a computer, and cannot be translated into machine code. What is cheap for a flight to New York? 500 dollars? 1,000 dollars? Exact instructions, please!
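To make the contrast concrete, here is a hypothetical sketch in Python. The flight data, the book() stub and the max_price parameter are all invented for illustration; no real booking system or API is implied. The first command maps onto exact instructions; the second only becomes executable once a human has replaced ‘cheap’ with a hard number.

flights = [
    {"id": "NY100", "departure": "06:15", "price": 620},
    {"id": "NY204", "departure": "08:40", "price": 410},
    {"id": "NY318", "departure": "11:05", "price": 380},
]

def book(flight):
    # Stub: a real system would call a booking service and handle payment.
    return "booked " + flight["id"]

def book_first_flight(flights):
    # 'Book me the first flight to New York' -- fully defined:
    # take the earliest departure and book it.
    first = min(flights, key=lambda f: f["departure"])
    return book(first)

def book_first_cheap_flight(flights, max_price):
    # 'Book me the first CHEAP flight' -- the machine cannot act until
    # 'cheap' has been pinned down to an exact number like max_price.
    cheap = [f for f in flights if f["price"] <= max_price]
    first = min(cheap, key=lambda f: f["departure"])
    return book(first)

print(book_first_flight(flights))              # booked NY100
print(book_first_cheap_flight(flights, 500))   # booked NY204 -- but who decided 500?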

This is just a simple example, concerning the relatively simple task of booking a flight. I hope it illustrates how big the misfit between humans and computers is, and how limited software’s possibilities are to close it. Software will become more advanced in the future, no doubt about that. But without a fundamental change in computers, I don’t think the misfit will ever disappear. If we really want to get along well with computers, or drastically increase their capabilities (to get to advanced artificial intelligence, for instance), we have to make them more human. And that would mean getting rid of the bit.

“Without any fundamental change in computers, I don’t think the misfit will ever disappear.”
