5/21 I was not a CS major and never took the compiler class. What is
a good book to help me write a compiler? And what about an interpreter?
Do they basically involve the same issues, except for code generation
and optimization?
\_ Use the book by Ronald Mak, called "Writing Compilers and
Interpreters" (the C++ edition). It's a practical, hands-on "lab" type book which
doesn't get bogged down in too much theory. It's a lot of book to go
through, but if you pace yourself and follow the examples you'll
have a good, practical knowledge about writing interpreters and
compilers. I would avoid theory books such as the Dragon book since
from what you've indicated you probably want practice over theory,
(I doubt you're planning to write the next OCaml or Haskell or some
other junk language that will never see the light of day...) Also,
compilers and interpreters are no longer written from scratch in the
sense you're thinking of. Nowadays people use metacompiler tools to
build the front end for stuff like this (e.g. lex and yacc).
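To make the lex/yacc point concrete, here's a minimal hand-written sketch (invented for this post, not taken from Mak's book) of the kind of lexer and recursive-descent parser those tools generate for you -- an evaluator for "+", "*", and parentheses:

```cpp
#include <cassert>
#include <cctype>
#include <string>

// Hand-rolled equivalent of what lex (tokens) and yacc (grammar) automate.
// Grammar: expr := term ('+' term)*   term := factor ('*' factor)*
//          factor := number | '(' expr ')'
struct Parser {
    std::string src;
    size_t pos = 0;

    char peek() {                       // lexer: skip whitespace, look ahead
        while (pos < src.size() && isspace((unsigned char)src[pos])) ++pos;
        return pos < src.size() ? src[pos] : '\0';
    }
    long number() {                     // lexer: scan an integer token
        long n = 0;
        while (pos < src.size() && isdigit((unsigned char)src[pos]))
            n = n * 10 + (src[pos++] - '0');
        return n;
    }
    long factor() {
        if (peek() == '(') { ++pos; long v = expr(); ++pos; return v; }
        return number();
    }
    long term() {
        long v = factor();
        while (peek() == '*') { ++pos; v *= factor(); }
        return v;
    }
    long expr() {
        long v = term();
        while (peek() == '+') { ++pos; v += term(); }
        return v;
    }
};

long eval(const std::string& s) { Parser p{s}; return p.expr(); }
```

With lex/yacc you'd instead write the token patterns and grammar rules declaratively and get this code generated.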
\_ Oh God!!! Here we go with everyone responding about how great the
dragon book and CLR are.
\_ Let me be the first to say what a piece of crap the dragon book
was (is). It's written so badly it often took me 5-6 reads to
understand a paragraph, often requiring me to diagram what the
author was writing. The book seems to go to great lengths to
avoid clear examples too, which makes it more fun.
\_ agree, this book uses the most cryptic English ever.
\_ You should take basic CS classes before reading the compiler book.
Otherwise it'll be easy to get lost. Do you know regular/push-down
automata and the rest of that Chomsky-hierarchy shit? If not, you'd
better get to know them before you get into compilers.
\_ agree on basic CS courses, I dunno what chom-whatever was, and
I still got thru this course. you basically need to know data
structures and be reasonably decent at programming.
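For the record, the regular vs. push-down distinction mentioned above boils down to something like this (toy example, names invented here): a token such as an identifier can be recognized with finite state alone, but balanced parentheses need a stack -- a push-down automaton -- which is why lexers use regular expressions and parsers use context-free grammars.

```cpp
#include <cassert>
#include <cctype>
#include <string>

// Regular language: [A-Za-z_][A-Za-z0-9_]*  -- two states, no memory.
bool isIdentifier(const std::string& s) {
    if (s.empty() || !(isalpha((unsigned char)s[0]) || s[0] == '_')) return false;
    for (char c : s)
        if (!(isalnum((unsigned char)c) || c == '_')) return false;
    return true;
}

// Context-free language: balanced parens. The counter stands in for
// the PDA's stack (push on '(', pop on ')'); no DFA can track depth.
bool isBalanced(const std::string& s) {
    int depth = 0;
    for (char c : s) {
        if (c == '(') ++depth;
        else if (c == ')' && --depth < 0) return false;  // pop underflow
    }
    return depth == 0;
}
```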
\_ I took CS164 (w/ Hilfinger, no less) but by the time I graduated, I
totally forgot most of the stuff I learned. What are some actual
applications of writing a lexer/parser as opposed to taking
something off the shelf or just putting together a very simple
language that is easily parsed by Perl?
\_ None whatsoever, since it's already been proven that current
languages (C, Pascal, Scheme, Lisp, Forth or any fully functional
ALGOL-type language) are as powerful as languages can get. In
other words all modern computer programming languages are
essentially equivalent and it is impossible to write a more
functional language than what is already out there. The
people who are trying to invent new languages are merely
wasting time and are essentially arguing about style rather
than substance. --williamc
\_ Not exactly. Though programming languages that are Turing
complete are equally powerful in what they can compute,
some are more expressive than others -- this is something
you can quantify using a model like denotational semantics.
\- if you think the dragon book is confusing, have you
tried reading anything by Christopher Strachey? --psb
\_ That's true, certain languages are definitely better at
certain things than others (e.g. you can do some
things in Perl much more quickly than in C or Java and
vice versa). However, the point is that there isn't
anything that really requires another language that
isn't already out there. We've been at OOP for what,
the past 20-30 years? Pattern programming has never
really taken off except in very fundamental class
design. So what's really left to "invent" in a
new language? If you argue for better parallelism
for MPPs I'd say we had languages like that with
Modula-2/3. --williamc
\_ One idea I was toying with was separating 'object' from
'data structure' in the language. The language library
provides mathematical objects for you to use:
(graphs, sets, trees, etc), and changes the
implementation (at compile time) depending on how they
are used, in the same way that databases do query
optimization. I don't think this has been done yet.
I think the field of 'improving tools' for programmers
and people representing knowledge is wide open. -- ilyas
\_ The Self virtual machine will change the machine
code implementation of your data structure depending
on how it is used. And in a prototype-based language,
like Self, it's pretty easy to define interfaces
that change behavior as you use them.
\_ I think what I want is for the compiler to do this,
not the programmer. Say if you use a set but only
iterate over it, an array will do, but if you do
random access, you want a hashtable. A compiler
can figure these things out, and substitute the
right data structure, while a programmer can think
about properties of sets themselves. It's cool
that Self does this, but I wonder if it does
'complexity analysis' to figure out what data
struct to use like "this set is accessed randomly
a linear number of times, so we want a data struct
which supports random access in constant time", and
so on. These are the kinds of decisions a
programmer makes, and it would be neat if occasionally
the compiler could take over this job. -- ilyas
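A runtime sketch of what ilyas is describing (he wants the compiler to do this statically; this toy class, with an invented name and an arbitrary threshold, instead profiles at runtime): a set that starts life as a flat vector, cheap to build and iterate, and migrates itself to a hash table once membership tests dominate.

```cpp
#include <algorithm>
#include <cassert>
#include <unordered_set>
#include <vector>

// AdaptiveSet: the programmer thinks "set"; the representation is chosen
// from observed usage. Threshold of 8 lookups is made up for illustration.
class AdaptiveSet {
    std::vector<int> vec;               // iteration-friendly representation
    std::unordered_set<int> hash;       // random-access representation
    bool usingHash = false;
    int lookups = 0;

public:
    void insert(int x) {
        if (usingHash) hash.insert(x);
        else if (std::find(vec.begin(), vec.end(), x) == vec.end())
            vec.push_back(x);
    }
    bool contains(int x) {
        if (!usingHash && ++lookups > 8) {        // profile says: many lookups
            hash.insert(vec.begin(), vec.end());  // migrate to hash table
            vec.clear();
            usingHash = true;
        }
        if (usingHash) return hash.count(x) != 0;
        return std::find(vec.begin(), vec.end(), x) != vec.end();
    }
    bool migrated() const { return usingHash; }
};
```

The compile-time version ilyas wants would make this decision from static complexity analysis of the access pattern, with no bookkeeping left in the generated code.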
Actually, looking back, I think part of the reason I don't remember
anything is that I took it with Hilfinger and was too busy
deciphering his project specs and doing the projects and not busy
enough learning the theory and applications... but at least I can
still pick up something like the Java or JVM spec and understand it.
\_ Many compiler/interpreters are for some very specialized language.
It only has one application, and you might not even recognize
it as a programming language.