james-iry.blogspot.com/2009/05/brief-incomplete-and-mostly-wrong.html
A Brief, Incomplete, and Mostly Wrong History of Programming Languages

1801 - Joseph Marie Jacquard uses punch cards to instruct a loom to weave "hello, world" into a tapestry. Redditers of the time are not impressed due to the lack of tail call recursion, concurrency, or proper capitalization.

1842 - Ada Lovelace writes the first program. She is hampered in her efforts by the minor inconvenience that she doesn't have any actual computers to run her code. Enterprise architects will later relearn her techniques in order to program in UML.

1936 - Alan Turing invents every programming language that will ever be but is shanghaied by British Intelligence to be 007 before he can patent them.

1936 - Alonzo Church also invents every language that will ever be but does it better. His lambda calculus is ignored because it is insufficiently C-like. This criticism occurs in spite of the fact that C has not yet been invented.

1940s - Various "computers" are "programmed" using direct wiring and switches. Engineers do this in order to avoid the tabs vs spaces debate.

1957 - John Backus and IBM create FORTRAN. There's nothing funny about IBM or FORTRAN. It is a syntax error to write FORTRAN while not wearing a blue tie.

1959 - After losing a bet with L. Ron Hubbard, Grace Hopper and several other sadists invent the COmmon Business-Oriented Language (COBOL). Years later, in a misguided and sexist retaliation against Adm. Hopper's COBOL work, Ruby conferences frequently feature misogynistic material.

1970 - Guy Steele and Gerald Sussman create Scheme. Their work leads to a series of "Lambda the Ultimate" papers culminating in "Lambda the Ultimate Kitchen Utensil." This paper becomes the basis for a long running, but ultimately unsuccessful run of late night infomercials. Lambdas are relegated to relative obscurity until Java makes them popular by not having them.

1970 - Niklaus Wirth creates Pascal, a procedural language. Critics immediately denounce Pascal because it uses "x := x + y" syntax instead of the more familiar C-like "x = x + y". This criticism happens in spite of the fact that C has not yet been invented.

1972 - Dennis Ritchie invents a powerful gun that shoots both forward and backward simultaneously. Not satisfied with the number of deaths and permanent maimings from that invention he invents C and Unix.

1972 - Alain Colmerauer designs the logic language Prolog. His goal is to create a language with the intelligence of a two year old. He proves he has reached his goal by showing a Prolog session that says "No." to every query.

1973 - Robin Milner creates ML, a language based on the M&M type theory. ML begets SML which has a formally specified semantics. When asked for a formal semantics of the formal semantics Milner's head explodes. Other well known languages in the ML family include OCaml, F#, and Visual Basic.

1980 - Alan Kay creates Smalltalk and invents the term "object oriented." When asked what that means he replies, "Smalltalk programs are just objects." When asked what objects are made of he replies, "objects." When asked again he says "look, it's all objects all the way down. Until you reach turtles."

1983 - Bjarne Stroustrup bolts everything he's ever heard of onto C to create C++. The resulting language is so complex that programs must be sent to the future to be compiled by the Skynet artificial intelligence. Skynet's motives for performing the service remain unclear but spokespeople from the future say "there is nothing to be concerned about, baby," in an Austrian accented monotone. There is some speculation that Skynet is nothing more than a pretentious buffer overrun.

1987 - Larry Wall falls asleep and hits Larry Wall's forehead on the keyboard. Upon waking Larry Wall decides that the string of characters on Larry Wall's monitor isn't random but an example program in a programming language that God wants His prophet, Larry Wall, to design. Perl is born.

1990 - A committee formed by Simon Peyton-Jones, Paul Hudak, Philip Wadler, Ashton Kutcher, and People for the Ethical Treatment of Animals creates Haskell, a pure, non-strict, functional language. Haskell gets some resistance due to the complexity of using monads to control side effects. Wadler tries to appease critics by explaining that "a monad is a monoid in the category of endofunctors, what's the problem?"

1991 - Dutch programmer Guido van Rossum travels to Argentina for a mysterious operation. He returns with a large cranial scar, invents Python, is declared Dictator for Life by legions of followers, and announces to the world that "There Is Only One Way to Do It."

1995 - Yukihiro "Matz" Matsumoto creates Ruby. The language is later renamed Ruby on Rails by its real inventor, David Heinemeier Hansson. [The bit about Matsumoto inventing a language called Ruby never happened and better be removed in the next revision of this article - DHH]

1995 - Brendan Eich reads up on every mistake ever made in designing a programming language, invents a few more, and creates LiveScript. Later, in an effort to cash in on the popularity of Java the language is renamed JavaScript. Later still, in an effort to cash in on the popularity of skin diseases the language is renamed ECMAScript.

1996 - James Gosling invents Java. Java is a relatively verbose, garbage collected, class based, statically typed, single dispatch, object oriented language with single implementation inheritance and multiple interface inheritance.

2001 - Anders Hejlsberg invents C#. C# is a relatively verbose, garbage collected, class based, statically typed, single dispatch, object oriented language with single implementation inheritance and multiple interface inheritance.

2003 - A drunken Martin Odersky sees a Reese's Peanut Butter Cup ad featuring somebody's peanut butter getting on somebody else's chocolate and has an idea. He creates Scala, a language that unifies constructs from both object oriented and functional languages. This pisses off both groups and each promptly declares jihad.

Footnotes

[1] Fortunately for computer science the supply of curly braces and angle brackets remains high.
Objective C is nice, but it's not considered an influential language. It didn't have anything that could inspire a new language designer; in the end its object model is just a light Smalltalk layer over C. I'd say it didn't make "History".
Sorry it wasn't as enlightening as I thought it would be. And this page of yours constantly contacted and downloaded MBs of data from S3 or Amazon or whatever, eating up my precious EDGE quota.
Don't forget: 1968: Chuck Moore gets a sweet gig at the National Radio Astronomy Observatory. Realising his poor hygiene and complete inability to speak any human language would inevitably lead to unemployment, he invents Forth, a language interpreter with no features at all and even less syntax than Lisp, in which he rewrites all the observatory's software in a week so that it runs ten times as fast and is totally unmaintainable by anyone but him. Geeks surprise him by taking the language and running with it, making N squared incompatible implementations where N = the number of users. They do this because Star Trek: The Next Generation is still ten years in the future and nobody has invented the Klingon language yet.
PHP: Began as a simple way to include a few elements in a web page. The syntax slowly evolved through a sophisticated voting system by the masses, and by tacking features on in a way described as "willy nilly". Later, Perl developers flocked to PHP, dissatisfied with the complexity of Perl. They solved this problem by adding millions of functions that all followed different naming conventions and did not always do what the name suggested. Objects were introduced late in the game but are largely unused because there are functions that'll do just about anything in PHP.
What about this one: 1993 - Roberto Ierusalimschy invents Lua, a very powerful programmer repellent. He mixes all the things programmers hate: simplicity, indexes starting at 1, no object orientation, "~=" meaning "not equal", avoidance of bloatware, etc.
I think we should punish the Smalltalk guy for inventing an inferior and disturbing paradigm. We should also punish Stroustrup and Gosling for popularizing it. Because of them, millions of programmers have suffered and thousands have killed themselves.
And Pascal, which was mentioned, was written as a joke until someone decided to implement it on the PC. And hey, if you can't find something funny to say about Ada, you're not a real comedian. Now where's my old book on programming languages, the one with the Tower of Babel on it?
I see no inaccuracies whatsoever in this excellent and perspicacious history. I hereby declare it a Reliable Source and will just be off to Wikipedia to add every word.
The comment on PHP is missing its beginning: In 1994, a guy named Rasmus Lerdorf, who hates programming, printed out an HTML page and shook the printout so hard that pieces of it fell off (this is known as "shaken page syndrome"). He glued those pieces back together and typed the result into the computer.
As previously mentioned, you missed Forth, which has led many to a Fifth to help forget about it, and to taking the Fifth to deny all knowledge. Also, it is of particular celestial importance, controlling telescopes and the Sun (the Sun monitor is Forth).
Sorry, everyone, for my comment maligning Objective C. I used sour milk in my cereal and couldn't think straight after that. And lastly, what about HTML, about which I like to say: it's not "what you see is what you get" (WYSIWYG), it's "what you get is what you deserve!"
If you haven't seen it yet, Dick Gabriel and Guy Steele have a talk called "50 in 50", where they discuss 50 programming languages in 50 minutes. I think they originally wrote it for the History of Programming Languages conference, but they've since done it at OOPSLA and JAOO.
Anonymous wrote:
> Objective C is nice but it's
> not considered an influential
> language.
Unless you con...