Saturday, November 23, 2013

Computer science: the prequel

So I've skimmed a few blogs that have already been written and it's weird to see how many of them are just about C. I mean, yeah it's been incredibly influential and is the precursor of at least a dozen languages, but it's hardly a starting point. It's like writing a biography about Lincoln when you've been asked to summarize the history of the United States. I'm going to take you back a little farther, to the great-great grandparents of computer science.

Hopefully you've heard of Ada Lovelace and Charles Babbage; if not, I'm honestly disappointed in you. It's like being a music major and never having heard of Bach. These two invented computer science back when the lightbulb wasn't even a thing yet.

Wow. Such light. So patented.

Charles Babbage is known today as "the father of computing," and for good reason. Babbage, like any other mathematician of his time, relied on mathematical tables for his computations. These tables held precomputed values of logarithms, exponentials, and other functions that we now rely on calculators for.

"Table of common logarithms." Instead of running a calculation through a calculator, your great-great-granddad looked it up in a book like this. 

Every calculation in one of these books (and you can see from the picture that there are thousands of them) was done by a computer. Not a computer like you and I are using, but a human computer. That's the origin of the word -- a "computer" in the nineteenth century was a person who did computations for a living. If that sounds like a massive pain to you, you're right: it took a group of three people six months, working nearly nonstop, to do the calculations to predict the return of Halley's comet.

Babbage thought there had to be a better way. In 1821 he conceived of a machine to autonomously compile mathematical tables for polynomial functions; he called it the Difference Engine and quickly sought government funding. The name comes from the method of finite differences, which reduces tabulating a polynomial to nothing but repeated addition, exactly the sort of thing you can build out of gears (there's a quick sketch of how that works below). Though he initially received some money to work on the project, insufficient funds and personal conflicts with his engineer led to the ultimate abandonment of the Difference Engine in 1834. About fifteen years later, he designed a revamped engine (creatively named Difference Engine No. 2) that would use fewer parts and could calculate seventh-order polynomials with numbers up to thirty-one digits long. Babbage made no effort to actually construct this engine, but the Science Museum in London built a working Difference Engine No. 2 in 1991 for the bicentenary of his birth, and a second one was completed later. You can see a video about how these work here, or you could take fifteen minutes in a car and go see one yourself, since the Computer History Museum (yes, there is one. it's in Mountain View) has that second engine on display.

And it's absolutely gorgeous.
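Here's that sketch. The whole trick behind both Difference Engines is the method of finite differences: for a polynomial of degree n, the nth differences between successive values are constant, so once the first column of differences is set up, every later table entry falls out of pure addition, and gears are very good at addition. The snippet below is just a modern toy in Python to show the idea; the particular polynomial and step size are arbitrary examples I picked, not a claim about what Babbage actually tabulated.

```python
# The method of finite differences that the Difference Engines mechanized:
# after seeding the first set of differences, every later table entry
# comes from additions alone -- no multiplication needed while cranking.

def difference_table(poly, start, step, count):
    """Tabulate poly(x) at start, start+step, ... using only additions."""
    degree = len(poly) - 1

    def evaluate(x):  # used only to seed the initial differences
        return sum(c * x**i for i, c in enumerate(poly))

    # Seed: the first degree+1 values, then their successive differences.
    row = [evaluate(start + i * step) for i in range(degree + 1)]
    diffs = []
    while row:
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]

    # Crank the engine: each new table entry is a cascade of additions.
    results = []
    for _ in range(count):
        results.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return results

# Example: tabulate x^2 + x + 41 at x = 0, 1, ..., 9.
print(difference_table([41, 1, 1], start=0, step=1, count=10))
# [41, 43, 47, 53, 61, 71, 83, 97, 113, 131]
```

Run it and the table writes itself with nothing fancier than a cascade of additions, which is exactly what the Engine's columns of gears were doing mechanically.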

But wait, there's more! Despite everything your algebra II teacher may have told you, there's more to life than polynomials. Babbage figured that out somewhere between Difference Engines one and two, and began conceptualizing an all-purpose programmable computing machine. This machine was called the Analytical Engine (creative naming wasn't really a thing in the 1800s) and is what really earned Babbage his title as the first computer pioneer:

It was programmable using punched cards, an idea borrowed from the Jacquard loom used for weaving complex patterns in textiles. The Engine had a 'Store' [memory!!] where numbers and intermediate results could be held, and a separate 'Mill' [a CPU!!] where the arithmetic processing was performed. It had an internal repertoire of the four arithmetical functions and could perform direct multiplication and division. It was also capable of functions for which we have modern names: conditional branching, looping (iteration), microprogramming, parallel processing, latching, polling, and pulse-shaping, amongst others, though Babbage nowhere used these terms. It had a variety of outputs including hardcopy printout, punched cards, graph plotting and the automatic production of stereotypes - trays of soft material into which results were impressed that could be used as molds for making printing plates. [text source here, but the link and notes are mine]
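If the bracketed notes up there sound like a stretch, they really aren't: a store of numbered variables, a mill that does the arithmetic, and a deck of cards driving the sequence is recognizably the same shape as the machines we program today. Here's a deliberately tiny sketch of that shape in Python; the card format and the operation names are invented for illustration and look nothing like Babbage's actual notation.

```python
# A toy "Store and Mill" machine. The card format and operation names are
# made up for illustration; Babbage's actual cards looked nothing like this.
# The point is the separation: the store holds variables, the mill does the
# arithmetic, and a sequence of cards drives the whole thing.

def run(cards):
    store = {}   # the 'Store': named variable columns
    pc = 0       # which card the engine is currently reading
    while pc < len(cards):
        op, *args = cards[pc]
        if op == "SET":                   # put a constant into the store
            store[args[0]] = args[1]
        elif op == "ADD":                 # the 'Mill' at work: dest = a + b
            store[args[0]] = store[args[1]] + store[args[2]]
        elif op == "MUL":                 # the Engine could multiply directly too
            store[args[0]] = store[args[1]] * store[args[2]]
        elif op == "BRANCH_IF_ZERO":      # conditional branching
            if store[args[0]] == 0:
                pc = args[1]
                continue
        elif op == "JUMP":                # loop back to an earlier card
            pc = args[0]
            continue
        pc += 1
    return store

# Example: compute 6 * 7 by repeated addition, purely to exercise the loop
# and the conditional branch.
cards = [
    ("SET", "v0", 0),             # card 0: running total
    ("SET", "v1", 6),             # card 1: addend
    ("SET", "v2", 7),             # card 2: counter
    ("SET", "v3", -1),            # card 3: for counting down
    ("BRANCH_IF_ZERO", "v2", 8),  # card 4: stop when the counter hits zero
    ("ADD", "v0", "v0", "v1"),    # card 5: total = total + 6
    ("ADD", "v2", "v2", "v3"),    # card 6: counter = counter - 1
    ("JUMP", 4),                  # card 7: back around the loop
]
print(run(cards)["v0"])           # -> 42
```

(Yes, a single MUL card would have done it in one step; the loop is only there to show branching and iteration in action.)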

But hardware's no good without software, right? That's where Augusta Ada King, Countess of Lovelace -- more commonly known as Ada Lovelace -- comes in (you've probably read her dad's poetry). Lovelace was a longtime friend of Babbage and took a great interest in his work. When an Italian mathematician, Luigi Federico Menabrea, published a paper in French (not Italian. it's weird.) about Babbage's Analytical Engine, Babbage asked Lovelace to translate the paper into English and to add her own notes. The two collaborated for a year on a series of notes that ended up longer than the paper itself. In these, Lovelace included step-by-step sequences for solving several mathematical problems, most famously a method for computing Bernoulli numbers. These were the world's first programs written for a general-purpose computer. And they were written in 1843. Seriously. The first computer programs were written while the Oregon Trail was the new big thing.
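For flavor, here's roughly the kind of thing her most famous program computed. The snippet uses a standard textbook recurrence for the Bernoulli numbers written in modern Python; it is not a transcription of her actual table of operations for the Engine, and the sign convention (B_1 = -1/2) is the modern one rather than hers.

```python
# Bernoulli numbers via the usual recurrence:
#   B_0 = 1,   B_m = -1/(m+1) * sum_{k=0}^{m-1} C(m+1, k) * B_k
# A modern restatement of the problem from Note G, not Lovelace's own steps.
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return exact values of B_0 through B_n."""
    B = []
    for m in range(n + 1):
        acc = Fraction(0)
        for k in range(m):
            acc += comb(m + 1, k) * B[k]
        B.append(Fraction(1) if m == 0 else -acc / (m + 1))
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```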

Lovelace fully understood the impact the Analytical Engine could have on the world. She toyed with the idea of an engine that could be fed any series of characters, not just numbers, and have rules to manipulate those characters. Everything, Lovelace recognized, could ultimately be expressed by numbers: Numbers could represent "[letters] of the alphabet. They could represent musical notes. They could represent positions on a chess board" [source]. Electronic instruments hadn't even been invented yet, and Lovelace was imagining computer-generated music. She's been called the "prophet of the computer age," and that title is well-deserved. 
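To make that concrete with a trivial modern example (nothing Lovelace actually proposed; the A=0 through Z=25 encoding and the shift-by-three rule are arbitrary choices of mine): once letters are numbers, a purely arithmetical rule can transform them and hand you back symbols.

```python
# Symbols as numbers under rules: encode letters, apply an arithmetic rule,
# decode back to letters. The encoding and the rule are arbitrary examples.
def encode(text):
    return [ord(c) - ord('A') for c in text]

def decode(nums):
    return ''.join(chr(n + ord('A')) for n in nums)

def shift(nums, k):
    return [(n + k) % 26 for n in nums]   # a purely numerical rule

print(decode(shift(encode("ADA"), 3)))    # -> 'DGD'
```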

Computer history long predates computers. Thinking that all the major developments came with the C language and its syntax completely ignores the beauty of a programmable mechanical computer and the incredible foresight that predicted developments over a century before they were ultimately realized.

If you got this far, here's a link to a webcomic that I found about Lovelace and Babbage, which I haven't read yet but seems to not suck. 
