Those tools were more than inventive bundles of computer code. The C language and Unix reflected a point of view, a different philosophy of computing than what had come before....

I must admit I'd never heard of the man, but... thanks.
Minicomputers represented a step in the democratization of computing, and Unix and C were designed to open up computing to more people and collaborative working styles....
C was designed for systems programmers who wanted to get the fastest performance from operating systems, compilers and other programs. “C is not a big language — it’s clean, simple, elegant,” [said Brian Kernighan, a computer scientist at Princeton University who worked with Mr. Ritchie at Bell Labs.] “It lets you get close to the machine, without getting tied up in the machine.”
October 14, 2011
"The tools that Dennis built — and their direct descendants — run pretty much everything today."
RIP Dennis Ritchie.
61 comments:
Our society has been built by countless little-known people who labored to make things better for all of us. Most of what we daily take for granted is the accretion of the work products of such people from previous generations.
Who cares, he's a dead white male
Thanks for frontpaging this. C and its derivatives are the basis for much software in use today. Even Java borrows heavily from its syntax.
OT: I'm putting the Blue Star image to rest this week. For the uninformed :) The blue star banner was started in the early 40's to signify that the household had a member serving in the armed forces. At one point we were a 2 star household. My wife retired this month, thus we are retiring our last star.
PS: The Gold Star represents a household that has lost a family member in combat, hence, Gold Star Mother....
I must admit I'd never heard of the man, but... thanks.
He reminds me of Bill Clinton that way.. slaving away in total obscurity ;)
Actually, C derived languages comprise better than 90% of the commercial programming languages being used today. However, it was not just the language. Unix, the operating system, is based on many concepts that have since become foundational for modern computer operating systems. It is no stretch to say that without Unix, Windows would probably never have come about. Mac OS X actually is based on a variant of Unix, and Linux is a Unix clone. Further, Unix is the OS that powers the Internet.
It is difficult to overstate Ritchie's impact on programming and, to a somewhat lesser extent, OS evolution.
David
Thank you for this, Madame.
As I say, his work is in large part responsible for the digitization of our lives.
Also, what Joe, Kylos, and David (no doofus) said.
PS Sarge, permit me to tip my hat to your wife in honor of her service.
I can't tell you how many times on how many job sites people "borrowed" my "K&R" (their definitive book on C programming), only to never find a way to bring it back....goddamn it!
I musta gone through at least 5-6 copies that "wandered off".
That we have "the internet" is due to Dennis Ritchie's work. Not only was the internet built (and still run today) around Unix servers, but the foundational communications protocol all internet data uses (TCP/IP) started off in life as the default communications stack of Unix servers.
All the other major vendors had their own network protocols (DEC had DECNET, IBM SNA, Novell IPX, etc), but if you wanted to talk to the backbone servers on the still developing internet for things like URL name resolution, you had to speak TCP/IP.
Now, thanks to the internet, almost everybody runs TCP/IP everywhere. And on top of TCP/IP, now the universal standard, I'm sending out this data on port 80 (http) for you guys to read.
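If you've never seen it, here's a bare-bones sketch of what "talking on port 80" actually looks like at the C level -- the host name is just a stand-in, and error handling is kept to the bare minimum:

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netdb.h>
#include <sys/socket.h>

int main(void)
{
    /* look up the host and the standard http port (80) */
    struct addrinfo hints, *res;
    memset(&hints, 0, sizeof hints);
    hints.ai_family   = AF_UNSPEC;
    hints.ai_socktype = SOCK_STREAM;      /* TCP */
    if (getaddrinfo("example.com", "80", &hints, &res) != 0)
        return 1;

    /* open a TCP connection to the server */
    int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0)
        return 1;

    /* a minimal HTTP request, the same kind every browser sends on port 80 */
    const char *req = "GET / HTTP/1.0\r\nHost: example.com\r\n\r\n";
    write(fd, req, strlen(req));

    /* dump whatever comes back */
    char buf[512];
    ssize_t n;
    while ((n = read(fd, buf, sizeof buf)) > 0)
        fwrite(buf, 1, (size_t)n, stdout);

    close(fd);
    freeaddrinfo(res);
    return 0;
}

Plain BSD-style sockets, which is how Unix machines have been speaking TCP/IP since the early days.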
Thanks for highlighting Dennis Ritchie's passing.
We owe so much to the geniuses at Bell Labs: Ken Thompson and Dennis Ritchie for B, C and Unix in the late '60s and early '70s, and Bjarne Stroustrup for adding classes to C and creating the next programming language standard, C++.
Kernighan and Ritchie were the authors of the bible on the C language. I have at least a couple of copies of K&R. Ritchie defined the unbelievably influential and elegant C language. Ritchie also was a principal developer of the unbelievably influential and elegant unix operating system.
If you have an android based phone, the core of that code is from unix. Unix is everywhere still.
correction--the tools that George Boole built--and the logical positivists (i.e., binary)--which a few decades later programmers such as Ritchie used as code.
but compared to that phony Guru Jobs an authentic techie.
This is probably the best eulogy of DMR that I've found. Especially the line at the end:
"C is a poster child for why it's essential to keep those people who know a thing can't be done from bothering the people who are doing it."
Golly Young Bagelian aka Byro ..learned about yr first protocol from googling a few hours today
TCP-IP was DoD gear originally, dimwit. I doubt Ritchie was much involved (tho'...possibly-- BellLabs had some DoD connections). Yr not about TCP-IP or any computing. Yr about Tee-shirt sales, YB.
@Drill SGT, tell your wife "welcome home." And I'm glad neither of your stars ever turned gold.
I did. I learned C and UNIX when I worked for AT&T in the 80's and we were selling computers. These were multi user systems, perfect for both C and UNIX. I use Apple computers today and they are very reminiscent of them.
Vicki from Pasadena
A great man has died.
So J has to show up to dishonor his memory.
I didn't like the K&R book. I could only handle learning C from Dick and Jane Writes Some Code.
Too bad you know nothing about Ritchie,or his work,eh Byro Nixon, the LDS thespian (wait--google bit and byte!). Or are you Alan Grayson today--depends on the state of yr yr bipolar disorder.
BellLabs --Nixon admin snitch-gear
Kernighan and Ritchie was THE standard text for generations of student programmers.
Still found on the shelves in the cubes of the old fart programmers across the land.
What does this J fellow know what we don't ? Where does he get off claiming some special knowledge ?
Based on what I have seen of the commenters, this site is full of the genuine old IT characters, who probably did teach his (J's) grandma to suck eggs.
Me, I go back to paper tape and punch cards.
Thank you, Dennis.
Our business software, for which I was by default SYSADM, ran on Xenix and then Unix.
Never got beyond the few dribs and drabs of AWK and shell script I needed to write, but I was awestruck.
Terse. Compact. All lean and no fat. Elegant, simple, powerful.
buwaya -- I wrote Fortran to control a paper tape that turned on (or off) 300 fans in an auto plant. Damn thing looked like a cheese wheel for Helen Keller.
Just be thankful that "C" was successful. If it wasn't the next letter in sequence would have been .... "P".
And we can all imagine the kinds of jokes that would involve!
:)
"Me, I go back to paper tape and punch cards."
Ahhh. Memories.
And not good ones either. That tape ripped if you even looked at it wrong.
@YoungHegelian: not really sure what you're getting at. unix predates the internet protocol suite by over a decade. the original unix development was in 1969; v4 of the ip suite wasn't officially active until 1983.
The funny thing is that C was written for internal use at Bell Labs to write programs to test telephone switches.
yeah, paper tape storage was awful. i got grant money to help operate oregon's pine mountain observatory in the early 80s. pmo had a 36" cassegrain reflecting telescope that had a computer controlled two-axis sidereal drive. quite accurate when it worked, but one fine day a visitor (that was the story i was told anyway) stepped on the paper tape and bent it enough that it tore. after a couple of days, we couldn't load the program anymore, and it took a very long time to get a replacement, which didn't happen until after i left. so for about 6 months, that telescope was essentially useless for research.
it made one hell of a personal viewing instrument, however. you had to wear sunglasses to look at the moon; saturn's moons and neptune's rings were amazing.
punch cards were also pretty evil. there were long cardboard boxes that would contain thousands of cards (i don't remember how many cards per box. 2500? 5000?). there were also carts that held a number of the boxes. really big programs would have their own cart.
one of the saddest things i've ever personally seen was an accident between two carts that overturned both at a corridor intersection in a computer center. intermingled cards, from multiple programs, were scattered everywhere.
i don't have any idea how that mess was untangled.
@lewsar
What I'm getting at is that the reason TCP/IP is now everywhere is because it was adopted by the Unix community as the standard network stack, and the internet was built around Unix.
I don't know how long you've been in the IT biz, but it wasn't that long ago that TCP/IP wasn't the standard network stack. There was no standard. It was all vendor specific.
Or, do you have some other understanding of why the internet isn't based on SNA?
@lewsar,
I apologize for asking how long you've been in the biz. If you've done cards it's been a long damn time.
My bad.
Too bad you know nothing about Ritchie,or his work,eh Byro Nixon, the LDS thespian (wait--google bit and byte!). Or are you Alan Grayson today--depends on the state of yr yr bipolar disorder.
Anyone who can interpret this babble is welcome to it.
J is a disgrace.
edwardroyce said...
-----------------
LOL!!
-----
(unrelated) We have a friend who had a passing acquaintance with Ritchie, but his wife went around saying that Ritchie was this guy's PhD adviser.
lewsar said...
@YoungHegelian: not really sure what you're getting at. unix predates the internet protocol suite by over a decade.
--------------
Yep, you're correct.
YoungHegelian said...
-----------
TCP/IP was developed for ARPANET (by Vint Cerf and co; he was the guy who came up with the 32-bit IP address), which became the Internet as we know it today. ARPANET was predominantly used by the Unix communities across the universities, in the US and elsewhere, that had access to ARPANET/NSFNet (precursors to the Internet).
I must admit I'd never heard of the man...
Neither has most of the Smurfhouse--but a veracity-requirement never got in the way of teabagger bloviation. Few minutes in wikiland--and they're experts, though most don't know a bit from a byte.
No, Byro Sorepaw the t-shirt boy--yr the disgrace, no matter how many names you use, wicca-queer. No be sure to wiki bit vs byte, phony
'bout Annie time, flunkie---ahhyeah...that's user friendly.
... though most don't know a bit from a byte.
Nor do you, J, nor do you.
"i don't have any idea how that mess was untangled."
Depends on whether that was before or after IBM introduced the punch card sorter. And I'm way too lazy right now to wiki it or google it.
I'm pretty sure I've never had to deal with pre-sorter on punch cards.
On the other hand some recruiter tried to convince me that going back to the AS/400 and programming webapps using RPG was a -good- idea.
I ain't that dumb. Close. But no cigar. :)
J,
... though most don't know a bit from a byte.
We can tell a butt when we see one, though.
All I remember is that in Pascal and COBOL et al., if you had a syntax error, you had to actually look through and understand all the code to figure out where the error was. C was a godsend. The syntax was simple and not in the least ambiguous. That is the reason everyone after has modelled their language syntax after C. Thanks dude. Vaya con dios.
But everybody knows who "Snooki" is.
We're so fucked.
@LakeLevel, C may not have been ambiguous, but there's no such thing as a C programmer who hasn't, many times!, written
if (a = b)
instead of
if (a == b)
For those of you who don't know C, don't worry about it.
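For the ones who do, a minimal sketch of how it bites (the variable names are made up, naturally):

#include <stdio.h>

int main(void)
{
    int a = 0, b = 5;

    /* Meant to compare a with b; actually assigns b to a,
       and the if tests the assigned value (5, which counts as true). */
    if (a = b)
        printf("oops: this branch runs even though a started out as 0\n");

    /* What was intended: */
    if (a == b)
        printf("well, now they really are equal\n");

    return 0;
}

Modern compilers will at least grumble about the first if when you turn warnings on; back then you just stared at the listing until you saw it.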
True, Big Mike.
IIRC, one reason that the computer science guys went to C and unix is that a C compiler was always available for unix operating systems. You could write device drivers in C for unix, which made it easier to add and control new devices.
Otherwise you had to write the device driver in an assembly language, which varied from computer architecture to computer architecture.
IIRC, the source code for unix was available very early on. Unix could be studied and modified.
So unix helped C spread and C helped unix spread.
“It lets you get close to the machine, without getting tied up in the machine.”
This is the best short answer to the "what the hell is C, and why is it needed?" question. Getting "tied up in the machine" means programming at the CPU level, often called Assembly Language or just Assembler. The most valuable courses I took as an undergrad involved programming in Assembler. Learn how to do that and all the mystery of the black box -- the Intel or AMD chip at the heart of that thing on your desk or your lap -- evaporates. Assembly language is a mighty hard way to get anything practical done with a computer because it is so alien to the way we normally think about a task. People apply intelligence, insight and intuition to everything we do. Computers have none of those qualities, in spite of whatever illusions a clever programmer can cook up.

Assembler also permits the programmer to force an issue with the system if there's a subtle flaw that evades his analysis. If your program always crashes and dumps at the same point because it must have F0F0 at relative such-and-such, then you can fix it quick and dirty by writing a bit more code that puts that value there when it's needed. This is called hammer-lock programming, or putting a gun to the f*cking thing's head.

Before Ritchie most software was written in some flavor of BASIC, COBOL or FORTRAN, and most programmers never touched a project that required a lower-level approach. They're all pretty similar, with strengths in various fields: COBOL for banking, insurance, etc., FORTRAN for engineers and scientists, BASIC for everybody else, kinda like the three estates of feudal France. The advantage is that the code looks sorta like English and the logic is pretty clear for another programmer, or even a non-programmer, to follow. The problem is that since loose and fluid human reasoning seldom coincides with the deterministic and thoughtless logic of the computer, the human-like language of COBOL compiles into a huge mess of machine code which is inherently inefficient and hard to debug. When COBOL was invented in the 1950s it went for years with subtle errors in the standard compiler, so one could write a perfectly logical and evidently flawless program and have it fatally corrupted by the compiler. This is the origin of the apocryphal stories of people getting billion-dollar electric bills and such.

The huge size of even simple programs authored in COBOL wasn't an issue when every computer was a building-sized system built by IBM or Univac; they had enough resources (very expensive resources, mind you) to cope with the inefficiencies. But when minicomputers started appearing, the limitations of the high-level languages became critical. If you wanted to do something really useful with a PDP-series system, you needed to program it at the CPU level. Then C appeared, and the mini became a contender. C was the trump of doom to the huge institutional machines like the 360s and the Burroughs 6000s. Suddenly a box the size of a refrigerator could do what had previously required a machine that needed a building the size of an average house.
Both C and Unix are elegant. Elegant implies as small and simple as possible, no clutter. Not only is this more beautiful but the smallness and simplicity of C and Unix make them simpler and cheaper to implement.
Maybe the key concept in making them elegant was defining a small, powerful set of core mechanisms and then reusing those mechanisms over and over and over.
You see that same idea in architecture.
C is for sissies. Real men still program in Fortran.
No, Byro Sorepaw the t-shirt boy--yr the disgrace, no matter how many names you use, wicca-queer. No be sure to wiki bit vs byte, phony,
J is so deluded that he imagines all of his critics are the same person.
And he wouldn't know a byte if it bit him.
Rest in peace.
I must admit I'd never heard of the man
Just one of those things about how the world works. More noticeable this month than last because he was in the same general field as someone who was far more famous.
J thinks a logical positivist has something to do with binary. Tee hee.
Long time reader, first time commenter.
Thank you for blogging on this, Ann. The fact you hadn't heard of Ritchie before proves something about your instincts for what's important or something.
For the nerds,
main( )
{
    printf("hello, world");
}
And for those who want to learn more: http://en.wikipedia.org/wiki/Hello_world_program
"... programming webapps using RPG "
Oh. My. Gosh.
Might as well use Klingon.
Oh, wait--RPG is Klingon, isn't it?
I completely agree about the importance of Dennis Ritchie's contributions to the programming world.
I would also like to add a note of thanks to his colleagues who are still with us--who perhaps deserve to be better known while they're still around to appreciate it.
"For those of you who don't know C, don't worry about it."
Yeah C needed a better assignment operator than "=". Then again COBOL needed a better statement terminator than ".". The smallest possible bit of punctuation. Seriously. I can't remember the number of times I've pored over program listings looking for a misplaced "." or two. Ugh.
Then again my favorite bit of C silliness was, and this was a few years ago so my memory might not be as accurate as I would wish, obtaining a pointer to a function.
I had to help a co-worker debug a particularly ugly program where the guy decided that the best possible design involved an array of pointers to functions.
... Yeah don't ask.
Well the guy forgot that some library functions weren't actually functions but macros. And you cannot obtain a valid pointer to a macro. So occasionally his program would either start doing some crazy stuff or just completely crash.
300k of code in a single file. Code review; save your sanity. :)
#include <stdio.h>
main( )
{
    printf("Goodbye, Dennis");
}
hmmm.....C code, and html don't mix
edwardroyce,
"the guy decided that the best possible design involved an array of pointers to functions...... Yeah don't ask."
No, this is eminently askable: there is certainly nothing wrong with that sort of design in places where it's really appropriate. And also (regarding "And you cannot obtain a valid pointer to a macro"): shouldn't that simply fail to compile?
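For what it's worth, here's a minimal sketch of the legitimate version of the pattern (the handler names are invented for illustration):

#include <stdio.h>

/* two handlers with the same signature */
static void say_hello(void)   { printf("hello, world\n"); }
static void say_goodbye(void) { printf("goodbye, Dennis\n"); }

/* the dispatch table: an array of pointers to functions */
static void (*handlers[])(void) = { say_hello, say_goodbye };

int main(void)
{
    size_t i;
    for (i = 0; i < sizeof handlers / sizeof handlers[0]; i++)
        handlers[i]();   /* call through the pointer */
    return 0;
}

As for pointers to macros: a function-like macro's name doesn't even expand unless it's followed by a '(', so a modern compiler would normally reject &SOME_MACRO as an undeclared identifier -- unless a real function of the same name also exists behind the macro, in which case it compiles and you quietly get the function, which may or may not behave the way the macro did. Pre-ANSI toolchains were considerably more forgiving, which could explain the run-time fireworks edwardroyce describes.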
MadAs,
Html entities are your friend:
<, written as &lt;
and
>, written as &gt;
(and of course the 'written as' parts are written &amp;lt; etc. ad infinitum...)
Remember, the foundation of a C++ class is an array of pointers to functions.
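Roughly what the compiler is building for you behind the curtain; here's a hand-rolled sketch in plain C (the type and function names are made up for illustration):

#include <stdio.h>

struct shape;   /* forward declaration so the table can mention it */

struct shape_vtable {   /* the table of pointers to functions */
    void   (*draw)(const struct shape *);
    double (*area)(const struct shape *);
};

struct shape {
    const struct shape_vtable *vt;   /* each object points at its class's table */
    double w, h;
};

static void rect_draw(const struct shape *s)
{
    printf("rect %gx%g\n", s->w, s->h);
}

static double rect_area(const struct shape *s)
{
    return s->w * s->h;
}

static const struct shape_vtable rect_vt = { rect_draw, rect_area };

int main(void)
{
    struct shape r = { &rect_vt, 3.0, 4.0 };

    r.vt->draw(&r);                          /* "virtual" dispatch through the table */
    printf("area = %g\n", r.vt->area(&r));
    return 0;
}

Real C++ layers inheritance and constructors on top, but in the typical implementation the bones of virtual dispatch are just this: one table per class, one hidden pointer per object.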
And I like '=' as an assignment. Beats the awkward ":=".
I love assembly, but it's pretty damn difficult these days to optimize for modern CPUs. C/C++ is like a super macro assembler.
Unfortunately, the one legacy of K&R is that to save space they moved open braces to the preceding line and that became a horrid format.
"Unfortunately, the one legacy of K&R is that to save space they moved open braces to the preceding line and that became a horrid format. "
I don't know whether to say
Yes++;
or if it should better be:
Yes *= 1024;