December 10, 2012

Ada Lovelace, the daughter of the poet Lord Byron, was the first computer programmer.

The Countess of Lovelace is celebrated today on her 197th birthday with a Google doodle and blog post:
After a chance encounter when aged 17, Ada became friends with Charles Babbage and grew fascinated by his idea to build an “Analytical Engine.”...

While Babbage saw it as a mathematical calculator, Ada understood it had much more potential. She realised it was, in essence, a machine that could manipulate symbols in accordance with defined rules, and—crucially—that there was no reason the symbols had to represent only numbers and equations.
“The Analytical Engine weaves algebraic patterns just as the Jacquard loom weaves flowers and leaves.” Ada Lovelace, 1843
As Ada eloquently argued, such a device could do far more than mathematics. She even mused about its potential to compose music:
“Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.”

55 comments:

edutcher said...

The language, Ada, is named for her.

bagoh20 said...

She might be underwhelmed with what we do with it today.

I often fantasize about bringing people from the past like Ben Franklin to today and being their tour guide of our world. I think the most amazing thing would be an airliner with 300 people aboard flying at 500 MPH, 5 miles high, and doing it by the thousands all day, every day.

That and the wonder that is the ShamWow.

Unknown said...

I read about this woman last year in this book, which I enjoyed as much as anything I read all year.

(The author is not the Gleick who was embarrassed in some AGW fakery a year or two ago, but his brother.)

rhhardin said...

It's one of those "amazing because she's only a woman" stories.

A guy wouldn't be mentioned for such trivialities.

Larry J said...

edutcher said...
The language, Ada, is named for her.


Yes, but please don't hold that against her. I hated programming in Ada. It'd be like comparing the failed Sergeant York Division Air Defense weapon system to the WWI Medal of Honor winner.

Larry J said...

Also, just as Lady Ada Lovelace was the first programmer, many of the first American computer programmers were women.

Revenant said...

She might be underwhelmed with what we do with it today.

Don't be silly.

deanz1 said...

Interestingly, it was the Jacquard loom, not the Analytical Engine, that eventually facilitated modern computer technology (via IBM).

YoungHegelian said...

@Larry,

And what those women were programming wasn't even a digital computer, but an analog one. A whole different kettle of fish.

Analog computers are programmed by hand-wiring connections across various types of circuits (e.g. a capacitor to model an integration function). The "answer" you get back is a voltage, which you then convert back into your real-world quantity.
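
To make that concrete, here is a minimal digital sketch of what a patched-in integrator circuit computes; the time step, input signal, and volts-to-units scale factor are all illustrative, not taken from any particular machine:

    import numpy as np

    # A capacitor wired as an integrator outputs the running integral of its
    # input voltage. Digitally, that's just a cumulative (Euler) sum.
    dt = 0.001                         # seconds per step (assumed)
    t = np.arange(0.0, 1.0, dt)
    v_in = np.sin(2 * np.pi * t)       # input voltage: one full sine period
    v_out = np.cumsum(v_in) * dt       # the "answer" voltage on the output

    # Convert the output voltage back to a real-world number, as the
    # operator would; the scale factor here is a placeholder.
    volts_per_unit = 2.5
    print(v_out[-1] / volts_per_unit)  # ~0: a sine integrates to zero over a period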

I saw one back in 1979 in one of the back rooms at the NASA building in Huntsville, AL where I worked. There was still at that time one old guy who knew how to wire it up.

edutcher said...

Larry J said...

The language, Ada, is named for her.

Yes, but please don't hold that against her. I hated programming in Ada.


Wouldn't dream of it, but I thought it was an interesting tidbit for the non-programmers among us.

Larry J said...

No, according to the sources I've found (one link from many), ENIAC was a digital computer. Yes, it was programmed using patch cords. That made the accomplishment of those young women even more remarkable. The engineers were busy building the computer so there was little or no documentation. Those women had to figure it out largely on their own. No small feat, that.

YoungHegelian said...

@edutcher/larry,

Yes, but please don't hold that against her. I hated programming in Ada.

Ada was the military's variant of the "one size fits all" mentality that periodically sweeps through the world of corporate IT.

Before that, IBM was pushing PL/1 as the answer to every programming need. PL/1 was a great language, but it was a monster.

Anonymous said...

Personally, I find it astonishing that anyone of that era, male or female, had the intuition that "computation" doesn't just mean "solving math problems," since exactly what "computation" is didn't get put on a sound footing until the 1930s. My favorite quote along these lines is "On two occasions I have been asked, 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question."

As for Ada the language, I've always felt it gets a bum rap from programmers. If I had to choose between Ada on one hand and C/C++ on the other, I'd happily choose Ada.

Baron Zemo said...

She also invented the blowjob, and her technique was handed down from mother to daughter until her great-great-great-great-great-great-granddaughter popularized it in a landmark film in the early 1970s.

MadisonMan said...

Begin

I never had to program in Ada, I don't think. I do recall a class where we did PL/C, Lisp, assembler and one other. So maybe that one other was Ada.

End

YoungHegelian said...

@larry,

You seem to be right. It is a digital computer. From the article:

ENIAC stored a maximum of twenty 10-digit decimal numbers. Its accumulators combined the functions of an adding machine and storage unit.

If it's got accumulator registers, it sure as hell is digital.

It sure looks like an analog machine, though.

Mea Culpa.

Michael K said...

" Dean Douthat said...
Interestingly, it was the Jacquard loom, not the Analytical Engine, that eventually facilitated modern computer technology (via IBM)"

You beat me to it.

edutcher said...

BTW, yesterday was Amazing Grace's birthday.

Must be something about this time of year.

Chip Ahoy said...

Ada encounters poet.

Larry J said...

What I remember about trying to program in Ada in the late 1980s was that the compiler was horribly slow (several minutes to compile even a simple program) and the syntax was so rigid that I called it a MacFascist language. My instructor said, "If you can get anything to compile, odds are it's correct." Actually, it would've been semantically correct but could still contain logic errors like any other language.
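
The distinction holds in any language: the compiler checks form, not intent. A toy example of "compiles fine, still wrong" (sketched in Python for brevity rather than Ada):

    # Parses and runs without complaint, but the logic is wrong:
    # operator precedence makes this a + (b / 2), not (a + b) / 2.
    a, b = 3.0, 4.0
    average = a + b / 2
    print(average)       # 5.0 -- accepted by the language, yet not the average
    print((a + b) / 2)   # 3.5 -- what was actually intended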

When I was working with it, Ada wasn't object-oriented but I heard they may have added some of the features later.

bagoh20 said...

I wrote a program for recording employee work times (electronic timekeeping) 20 years ago in Pascal, even though I wasn't a programmer then, and definitely am not now. I was just playing around. Fast forward to about 4 months ago, and I decided that using 20-year-old DOS programs didn't make sense today. Surely we could find a Windows version that would do the same job and be superior.

After finding and trying a number of them, none were as flexible, fast, reliable and sensible as the one a computer novice wrote late at night, half drunk in the early 90's.

I was the one most in favor of dropping it and moving on, but in the end, it was a unanimous decision to just go back to the old (free) program, because it took half the work and effort to use compared with the modern ones costing thousands of dollars, and it provided better results anyway.

Progress is a tricky word, because "Doing nothing is a high standard."

bagoh20 said...

"She might be underwhelmed with what we do with it today.
~~
Don't be silly."


It still doesn't do what she envisioned 160 years ago, i.e., compose wonderful music.

Bruce Hayden said...

Yes, but please don't hold that against her. I hated programming in Ada.

Thank goodness that I wasn't working for DoD at the time. Mostly working in C and Algol variants by then. C was great - you could cheat to your heart's content, and #defines allowed me to make programs architecture-independent, even when implementing shared virtual memory on systems that didn't have it.

I thought that the Ada language was ridiculous, but did try to figure out how to code stuff in it. Luckily, I never got a compiler that would work.

Didn't like PL/1 much better, and did have to write in it some. IBM, of course, had their own variant. In any case, one of my biggest problems with that language was what went on underneath the surface, and, in particular, with type conversions. You might think that you are comparing two binary numbers, but then find that the program doesn't work because PL/1 converted them to strings. That sort of thing. It was supposed to make it easier to program, but turned out to make it much harder to debug.
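
For readers who never met PL/1, the failure mode described above is easy to sketch. In Python the conversion has to be written out explicitly, which is exactly the point; in PL/1 it could happen silently:

    # If a compiler silently converts numbers to strings before comparing,
    # the comparison becomes lexicographic rather than numeric.
    a, b = 10, 9
    print(a < b)            # False -- numeric comparison, as intended
    print(str(a) < str(b))  # True  -- "10" sorts before "9" as text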

With C and then object oriented programming, the paradigm went the other way, with programmers being forced more and more to specifically call out their data types, and, thus, to some extent, their functionality (because, of course, more and more of that was hidden as OO programming progressed).

Handling of data was why, I think, PL/1 was such a pig, especially on non-IBM hardware. Adding two binary numbers together requires getting them into registers and issuing an add instruction. Very fast. Much slower if you are continually converting back and forth between string and binary integer, integer and float, etc. Of course, that meant that PL/1 sold a lot more IBM big-iron hardware, which was invariably the goal of much of what IBM did with its software at the time. Nevertheless, PL/1 was much better than COBOL, created under the direction of another famous woman in computing, Rear Adm. Grace Hopper.

jimbino said...

I built my first digital computer out of a deck of cards, from an article in Scientific American, ca. 1958.

Gabriel Hanna said...

There was no computer for her to program. It was never built. It is questionable how much math she even understood, judging from her mistakes in translating mathematical works.

Rob Crawford said...

Also, just as Lady Ada Lovelace was the first programmer, many of the first American computer programmers were women.

Because they started out as calculators.

No, really. They did the number-crunching for the theorists.

Rob Crawford said...

After finding and trying a number of them, none were as flexible, fast, reliable and sensible as the one a computer novice wrote late at night, half drunk in the early 90's.

Ballmer Peak. Google it.

Christy said...

I built my first analog computer in the 60s. Probably from something I read in Scientific American.

Gabriel Hanna said...

She realised it was, in essence, a machine that could manipulate symbols in accordance with defined rules, and—crucially—that there was no reason the symbols had to represent only numbers and equations.

She CLAIMED it could do these things, but she did not demonstrate that it was capable of doing so. It was flowery language not backed by evidence.

Claiming Lovelace as the first programmer, or the first to understand the capabilities of computers, is something like claiming Jules Verne as the first rocket scientist or submarine captain.

Alex said...

I remember doing Ada in college and I hated every moment of it. C FTW!!!

Alex said...

But more specifically C# is the greatest programming language ever created.

SteveR said...

I remain thankful that my Computer Science 101 (essentially Fortran) teacher was such an ass that the one semester of punch cards and syntax errors was all the programming I ever undertook. I'm glad for those who have and do, but it's a way of thinking I am very bad at.

Palladian said...

"All but one of the programs cited in her notes had been prepared by Babbage from three to seven years earlier. The exception was prepared by Babbage for her, although she did detect a "bug" in it. Not only is there no evidence that Ada ever prepared a program for the Analytical Engine but her correspondence with Babbage shows that she did not have the knowledge to do so."

Palladian said...

But, as rhhardin pointed out, the narrative of "woman invents computer programming!" is just too important to let historical facts get in the way.

YoungHegelian said...

SteveR,

I'm glad for those who have and do, but it's a way of thinking I am very bad at.

Years ago, my wife's (then fiancée) company paid to have her take a Fortran course at GWU in downtown DC. I had been programming for quite some time by then, and so to help & to be with my squeezums, I often accompanied her to the labs for the exercises (it was a night course).

Well, my wife really didn't need my help. She took to it easily, so I helped other pupils in the class. For some of them, the concepts of programming & their application were so brutally difficult as to produce tears of frustration.

I helped where I could, but I couldn't create aptitude when it wasn't there.

Lucien said...

@Baron Zemo:

Don't be ridiculous. Everyone knows the blowjob was invented by the Greek philosopher Fellatio, who, after a very painful visit to Sodom, devised it as a less intrusive and more palatable alternative.

Later the practice spread by . . . word of mouth.

Unknown said...

Too bad St. John didn't mention something that kind of seemed like a computer in Revelation. Then maybe Google would recognize Easter or Christmas.

Gahrie said...

I often fantasize about bringing people from the past like Ben Franklin to today and being their tour guide of our world. I think the most amazing thing would be...

You want to know the one that would have the biggest and most immediate impact to most of them?

Toilet paper.

virgil xenophon said...

All of this discussion about conceptually innovative female ideas from unlikely quarters reminds me of Hollywood's Hedy Lamarr, who was largely responsible for inventing skip-frequency radio transmissions to prevent enemy jamming in WWII.

Holmeses said...

Babbage never built his full-scale engine, but it was constructed from his designs by the Science Museum in London for the 200th anniversary of his birth. I've seen it in person and it is quite an amazing thing to see. Video at the link.

ken in tx said...

It is my understanding that the first computers were women who computed artillery trajectories for the military.

MadisonMan said...

That's Hedley.

Kirk Parker said...

Bruce H.,

"C was great... I thought that the Ada language was ridiculous"

Well... they really were designed with different problem sets in mind.

I say this as someone whose only experience with Ada consists of having read a few lines, now and then, accompanying articles in Byte and Embedded Systems Programming. However, one of my contacts at Boeing says that most (or maybe all? I forget) of the actual flight software continues to be written in Ada. (His stuff, doing analysis of flight test data, is all in C/C++.)

Alex,

C#???? Oh dear Ghu no!

Fred Drinkwater said...

Oh, Ada. At my then (mil contractor) employer, we were visited by Grady Booch, flogging Ada as the Alpha and Omega of programming languages. Since we were all about hard-real-time control systems, our persistent question was "What about machine code insertion?" No answer. Then later "Oh, you won't ever have to do that."

Later, one of our instructors supposedly came out with the infamous "If it compiles without error, it is correct." Laughs all 'round; I'm only sorry I didn't hear it in person.

What a monster. I still get headaches thinking about the multitasking model. "Rendezvous", blech.

And I say that as one who grew up with PL/1, Fortran II, IV and V, Pascal, C, custom variants, various assemblies, custom microcodes, actual analog computers, and Bog knows what else.

Ms. Lovelace, OTOH, was actually pretty interesting, at least according to the literature I've seen.

Carl said...

Meh. This is such transparent beta suckuppery it astonishes me any women are actually impressed by it.

As Feynman once aptly observed, a substantial portion of the genius of genius is knowing what's worth working on. This is why men like Newton and Galileo, von Neumann and Turing, are bona fide savants, while pulp heroes like Leonardo da Vinci and Ada Lovelace are mere dilettantes. The former had good insight into what problems could be solved, with the knowledge and technology of their days, and they didn't waste their time with those that could not. That's why they ended up with actual enduring advances instead of a load of fascinating questions and really cool prophesies and hints, which, you know, with $3, will get you a nice coffee.

Ada Lovelace was about as much use to actual technological achievement as Jules Verne.

And this is truly ludicrous, not to mention insulting to every other woman (or man) of the day:

Ada envisaged a day when a single machine would be capable of a myriad of tasks, limited only by the creativity of its programmer. At the time—nearly a century before the first computers were built—it was a flash of brilliance.

Oh right. Because nobody but a genius can possibly imagine the uses of a piece of technology before it's actually been constructed. No one today, for example, can imagine what could be done with a robot that has the intelligence of a dog, or with a starship that can travel to Alpha Centauri in 4 hours, or with a genetic therapy that cures all types of cancer with a $50 injection, or with a portable fusion plant, or a practical quantum computer or teleportation device, et cetera.

Ada Lovelace was apparently the only person in the early 19th century with imagination? Or...gee, maybe she was one of the few with the luxury and lack of serious immediate aims to write down her daydreams in exquisite detail.

Dante said...

I'm a computer programmer. The essence of computers is how to define them with a mathematical model. That is what a Turing machine is (Turing is the homosexual who broke the Enigma code, and later committed suicide). He proved a mathematical Turing machine is as powerful as any RAM machine, that is, any computer we make today.
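
As a minimal illustration of the model (a toy sketch, not Turing's own construction): a Turing machine is just a tape, a read/write head, and a state-transition table. This one increments a binary number in place:

    # Transition table: (state, symbol) -> (symbol to write, head move, next state).
    # This machine adds 1 to a binary number written on the tape.
    RULES = {
        ("right", "0"): ("0", +1, "right"),  # scan right to the end of the number
        ("right", "1"): ("1", +1, "right"),
        ("right", "_"): ("_", -1, "carry"),  # hit the blank; back up and carry
        ("carry", "1"): ("0", -1, "carry"),  # 1 + carry = 0, keep carrying
        ("carry", "0"): ("1", 0, "done"),    # 0 + carry = 1, finished
        ("carry", "_"): ("1", 0, "done"),    # carry past the top digit
    }

    def run(tape_str):
        tape = {i: c for i, c in enumerate(tape_str)}
        head, state = 0, "right"
        while state != "done":
            symbol = tape.get(head, "_")     # blank cells read as "_"
            tape[head], move, state = RULES[(state, symbol)]
            head += move
        cells = range(min(tape), max(tape) + 1)
        return "".join(tape.get(i, "_") for i in cells).strip("_")

    print(run("1011"))  # prints 1100: binary 11 + 1 = 12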

In any event, Ada's insights (or dreams), remind me of what an English professor told me when I told him that computers would one day be intelligent. He said "So, at one time a person looked at a helium balloon, and said that eventually we would reach the moon."

"But, we DID," I foolishly said. Not to push myself down too much, I realized my error, but at the same time did not. I thought humans were able to do anything.

Then, about fifteen years ago, I watched a very interesting PBS show. In it, a professor was describing some quantum mechanics on the whiteboard. It was simply complex stuff: it could have been algebra. Then he turned around, the camera panned back, and he said "Got that?" to a dog with a wagging tail.

Of course, the dog could not understand quantum physics, or maybe even algebra; the valid point was that maybe humans can't understand some things either.

So Ada's thoughts about the possibilities of computers aren't so important in the realm of reality. They were much like a balloonist's thoughts of reaching the moon. Maybe they inspire people, which is great. But inspiration is 1% of the problem, and lately, far less.

Meanwhile, my favorite unlikely female character is Hedy Lamarr. First, she is beautiful, which is always a plus. But she was an actress who actually did something seminal. She was granted, on August 11, 1942, US Patent 2,292,387. She did something with music, to let the notes come out clear, and applied it to communications technologies. Her ideas are still in use in cellular networks and WiFi today (the practical applications for radio were understood even then; she patented it to make torpedoes jamming-resistant).
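
The core trick in that patent is simple to sketch: both ends derive the same pseudo-random hop schedule from a shared secret, so a jammer parked on any single frequency corrupts only a fraction of the signal. A minimal sketch (the seed value is illustrative; the 88 channels come from the patent's piano-roll mechanism):

    import random

    N_CHANNELS = 88   # the 1942 patent used 88 frequencies, one per piano key

    def hop_schedule(shared_seed, n_hops):
        # Transmitter and receiver run the same generator from the same
        # seed, so they hop in lockstep without broadcasting the schedule.
        rng = random.Random(shared_seed)
        return [rng.randrange(N_CHANNELS) for _ in range(n_hops)]

    tx = hop_schedule(shared_seed=1942, n_hops=8)
    rx = hop_schedule(shared_seed=1942, n_hops=8)
    assert tx == rx   # synchronized hopping; a fixed-frequency jammer
    print(tx)         # hits at most a hop or two out of the eight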

Now, THAT's amazing. I hope she liked sex, too. Sorry, Leslyn.

Gabriel Hanna said...

Why does Emmy Noether never get a Google doodle? She only established the deepest foundations of all possible physical laws ever, and contributed to quantum mechanics and relativity, as well as being one of the greatest mathematicians of all time, male or female. I sincerely hope this is not why.

Instead we have to celebrate quirky, attractive female aristocrats with minimal accomplishments.

I do not want to see Noether lumped with the other figures from PC history, so I suppose it is best.

Gabriel Hanna said...

If you ask me, that's the real sexism right there. Only attractive women can be held up as role models. For men it doesn't matter nearly as much.

For example, a female character in an action movie must kick ass, must be hot, and have no muscle mass whatever. You can't have a bulky, ugly, scarred woman kicking ass, which would make a lot more SENSE. Intelligence or physical prowess, in a woman, must be coupled with extreme attractiveness as an offset.

Michael McNeil said...

Great novel by Bruce Sterling and William Gibson, The Difference Engine, in which Babbage's Engine was built and worked, depicting the huge impact that functional digital computers might have had on the mid-19th century.

Peter said...

Surely one can make the case that Ada Lovelace was not the first programmer, as there was prior art: specifically, those who programmed Jacquard looms.

Unlike Babbage's Difference Engine, Jacquard looms were an actually deployed technology.

Dante said...

What, the photos are in black and white? She isn't ugly, if that's what you mean. She's Jewish? On the other hand, there is this interesting quotation on her from Wikipedia:

Mostly unconcerned about appearance and manners, she focused on her studies to the exclusion of romance and fashion. A distinguished algebraist Olga Taussky-Todd described a luncheon, during which Noether, wholly engrossed in a discussion of mathematics, "gesticulated wildly" as she ate and "spilled her food constantly and wiped it off from her dress, completely unperturbed". Appearance-conscious students cringed as she retrieved the handkerchief from her blouse and ignored the increasing disarray of her hair during a lecture. Two female students once approached her during a break in a two-hour class to express their concern, but they were unable to break through the energetic mathematics discussion she was having with other students.