August 7, 2014

70 years ago today: IBM presented the Mark I computer to Harvard.

"The Automatic Sequence Controlled Calculator (Harvard Mark I) was the first operating machine that could execute long computations automatically."
A steel frame 51 feet (16 m) long and eight feet high held the calculator.... The ASCC used 500 miles (800 km) of wire with three million connections, 3,500 multipole relays with 35,000 contacts, 2,225 counters, 1,464 tenpole switches and tiers of 72 adding machines, each with 23 significant numbers.

27 comments:

The Crack Emcee said...

Where white guys deserve credit, I give it:

Remarkable...

Nonapod said...

A loop was accomplished by joining the end of the paper tape containing the program back to the beginning of the tape (literally creating a loop).

How would you stop the loop?
while (tape_exists) {
}
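The joke can be made concrete. A minimal Python sketch of a looped paper tape (the instruction names are invented for illustration): the program repeats endlessly, and the only way to stop it is for an operator to "cut the tape."

```python
# Simulate a paper tape joined end-to-end: instructions repeat forever
# until the operator cuts the tape (here, after 7 instructions).
tape = ["LOAD", "ADD", "STORE"]  # hypothetical instruction names

executed = []
position = 0
while tape:  # the loop ends only when the "tape" is gone
    executed.append(tape[position % len(tape)])
    position += 1
    if position == 7:  # the operator cuts the tape
        tape = []

print(executed)
```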

Ignorance is Bliss said...

Yeah, but could you use it to download porn?

Big Mike said...

The Mark I was not a stored program computer; it represented an evolutionary dead end.

rhhardin said...

Corporations give away computers that consume too much power to run for what they produce.

Sometimes you can find a university that will take it.

David said...

And even more remarkable that nowadays you can do this work with a chip the size of a contact lens.

FleetUSA said...

I had an old friend who was at Harvard then. He would marvel at the size and power of desktops in 2000 compared to that huge computer.

William said...

At the end of WWII it was universally agreed that the greatest technological advance to come out of that war was the atom bomb. The computer at Bletchley Park was dismantled and forgotten......I think it's safe to say that the computer has had a more profound, pervasive, and lasting effect on humanity than the A-bomb. I always think of this when they talk about scientific consensus as though that settles the issue.

MountainMan said...

Actually, we should give special credit to a great woman, the late Rear Admiral Grace Murray Hopper, USN, who was one of the first programmers of the Harvard Mark I. She is credited with inventing the word "bug" when an error was caused by a moth getting caught in one of the relays on the Mark I. She went on to have a distinguished career in the Navy and the computer field and is credited as one of the developers of the COBOL programming language, which was once one of the most widely used languages in business application programming. I had the honor of spending an evening with her at dinner and a meeting with a roomful of young engineers and programmers about 35 years ago. She was a remarkable woman. She spent her entire career in the Navy, which she loved. She always was seen in public in her uniform, even after retirement. The Navy named a ship for her, the destroyer USS Hopper.

Jolan.False said...

Might it be time for a "Harvard" tag?

Smilin' Jack said...

I had an advisor who used a machine like this. The output went to an electric typewriter--the old-fashioned kind with individual mechanical keys. One day my advisor noticed that a technician would occasionally reach into the typewriter and flick something. When asked what he was doing, the technician replied that when the typewriter was going fast, sometimes two keys would get stuck together on their way to the paper. So he would look to see which one was ahead, and give it a flick so it would hit the paper first. He was quite pleased with his solution to the problem. Weeks of calculations had to be redone.

Fred Drinkwater said...

rhhardin: Heh. In '87 my company had a pair of VAX 11/750s that we kept for testing our devices (they had unique interface hardware). We used to joke that all they were really good for was heating the room. They burned about 500 watts and ran at 3 megahertz. Later, I knew a senior DEC engineer in Palo Alto (one of the principal guys behind e-mail) who had one in his house; he had to get rid of it because of the heat. He replaced it with a MicroVAX, which was a nice machine, but DEC collapsed not long thereafter. Sic transit gloria mundi.

Eric the Fruit Bat said...

You can still find spare parts for the Mark I on the shelf at Radio Shack.

Unknown said...

"I, for one, welcome our new computer overlords..."
- Grandpa Just Mike, August 1944

jameswhy said...

Take all the wonder you feel in thinking of how far we've come in the last 70 years and project it out 70 years in the future. And we're worrying that we might be producing too much carbon dioxide?? How they will laugh at our inability to see what will be perfectly obvious 70 years hence.

AustinRoth said...

Crack - that is because Darkies knew their place back then; sweeping the floors and cleaning the toilets.

Kind of what I guess you do for a living.

David said...

AustinRoth said...
Crack - that is because Darkies knew their place back then; sweeping the floors and cleaning the toilets.

Kind of what I guess you do for a living.


Austin, I suggest you confine your interaction with Crack to the digital kind. Otherwise you will end up being the broom. You are already the turd.

Alex said...

Amazing: from this monstrosity to the iPhone in less than 7 decades. Steve Jobs is a god.

Quaestor said...

Hardly a month passes when there isn't some significant anniversary in the history of computing, and frankly I'm bored with the commemoratives. Mostly these anniversaries are significant to computers as we know them today only in the negative sense, i.e. how not to build a modern computational device.

One contentious claim that always pops up is priority. Who's on first? Brits of no particular education will always point to Colossus, Tommy Flowers's electronic cryptanalytic system installed at the MoD's Bletchley Park facility during WWII, as the "first modern electronic computer." (Often the person making this claim is some bigoted numbskull like BBC's Jeremy Clarkson, millionaire TV presenter and self-confessed computer-hater.)

Electronic and modern for its time Colossus certainly was, but a computer? Well.... A smart phone designed like Colossus would be exceedingly dumb compared to the cheapest Android knock-off made in Vietnam today. The sine qua non of a computing device acceptable as such is a characteristic known as Turing completeness. Colossus was not complete. Neither was Harvard Mark I, for that matter.

The first Turing-complete machine was the all-American ENIAC. So take that, Jeremy Clarkson! And while I'm at it, the Corvette is a better sports car than any hunk of British pig iron since the D-Type.

So what is Turing completeness, you may ask? To avoid TL;DR, let me just say that a computer is complete if it can emulate a Turing machine. The completeness of your iPhone or your iPad is debatable. My hunch is that they are not. But your Mac and your PC are complete, though many have their completeness disabled on-die. The iMac or Windows PC on your desk is the direct descendant of ENIAC, whereas Colossus is only their idiot cousin who spent its life locked in an attic, out of embarrassing public view.
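To make "emulating a Turing machine" concrete, here is a minimal single-tape emulator in Python. The rule encoding and the two-state busy-beaver program it runs are standard textbook examples, not anything from the machines discussed above:

```python
from collections import defaultdict

def run_turing_machine(rules, start, halt, max_steps=1000):
    """Run a one-tape Turing machine. rules maps (state, symbol) ->
    (write, move, next_state); move is -1 (left) or +1 (right).
    Returns the number of 1s left on the tape when the machine halts."""
    tape = defaultdict(int)  # blank tape of 0s, unbounded in both directions
    head, state = 0, start
    for _ in range(max_steps):
        if state == halt:
            break
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += move
    return sum(tape.values())

# The classic two-state busy beaver: writes four 1s, then halts.
bb2 = {
    ("A", 0): (1, +1, "B"),
    ("A", 1): (1, -1, "B"),
    ("B", 0): (1, -1, "A"),
    ("B", 1): (1, +1, "H"),
}
print(run_turing_machine(bb2, "A", "H"))  # → 4
```

A machine is Turing-complete if, given enough memory, it can run an emulator like this for any rule table; Colossus and the Mark I could not.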

Quaestor said...

David wrote: Austin, I suggest you confine your interaction with Crack to the digital kind...

Hear, hear. Egging on the resident loony is gauche if nothing else.

chuck said...

She is credited with inventing the word "bug"

IIRC, Edison used the word in his notebooks, so I suspect it traces back at least as far as telegraphy.

AustinRoth said...

David - What, only Crack is allowed to make absurdly racist comments?

Jeff said...

Before he got on the racism kick, Crack built up a lot of good will here by making a lot of smart and original comments. And his MachoResponse blog used to have some posts that were profound, some passionate, and some just hilarious. His picture posts were amazing.

Crack earned some respect around here.

sinz52 said...

In the biopic "Edison the Man" (1940), Thomas Edison (played by Spencer Tracy) remarks that one of his dynamos "developed a few bugs at the last minute."

And in Edison's own writings, he said that "bugs" are what difficulties and problems are often called. That implies that the term "bug" meaning a defect goes back even further.

The term "debug" (meaning the correction of the defect or fault) goes back to World War II.

AustinRoth said...

Gee, that's funny. I have been coming here and commenting for over 6 years, and I don't remember seeing that, Crack.

All I have seen is a "community organizer" mentality, turning every post of Ann's into a reason to throw his ridiculous racist crap everywhere.

Even if at some point in the past he was not the a-hole he is now, so what? I have not seen anything remotely resembling thoughtful commentary from him - only bomb-throwing.

So, I decided to lob a few back at him. I know it will change nothing, and it will not be the majority of what I post, just when I feel like poking the obnoxious prick with a sharp stick.

Joe said...

And even more remarkable that nowadays you can do this work with a chip the size of a contact lens.

That's huge! A chip no bigger than the end of a standard pencil has as much calculating power as the computers that put man on the moon.

A chip the size of a contact lens could contain a System on a Chip (SoC).

Alex said...

Crack's mind snapped on the Trayvon thing. His posting literally became different overnight.