2/21 List of big design blunders in computer science, I'll start:
\_ IPv6 needed because IPv4's running out (reality: NAT made IPv4
more abundant, hence IPv6 adoption has slowed to a crawl)
\_ That's not a design blunder; IPv6 is still needed, just not
as quickly as was first anticipated. -tom
\_ NAT is a bad thing(tm). It breaks applications that need
end-to-end connectivity (see the sketch at the end of this
thread). It also makes it difficult to manage large clusters
of systems that each use the same NAT address space. In the
IPv4 world we have been stuck using overlay networks to deal
with these problems. If everyone were using IPv6, people
would not need these kinds of hacks.
\_ I'm increasingly convinced that the future is IPv6
overlays that have to be negotiated/constructed dynamically
by some sort of link control protocol where all the
paranoid authz checks can be done by the folks who
think firewalls and NAT are the greatest thing since
sliced cables.
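\_ A concrete example of the end-to-end breakage: protocols like
FTP's PORT command advertise the address getsockname() reports,
and behind NAT that is a private address no outside peer can
reach. A minimal sketch (192.0.2.1 is just a documentation
address; no packets are actually sent):

        #include <stdio.h>
        #include <string.h>
        #include <arpa/inet.h>
        #include <netinet/in.h>
        #include <sys/socket.h>
        #include <unistd.h>

        int main(void) {
            /* "connecting" a UDP socket only picks a route and a
               local address; nothing goes on the wire */
            int s = socket(AF_INET, SOCK_DGRAM, 0);
            struct sockaddr_in peer, me;
            socklen_t len = sizeof me;
            char buf[INET_ADDRSTRLEN];

            memset(&peer, 0, sizeof peer);
            peer.sin_family = AF_INET;
            peer.sin_port = htons(9);
            inet_pton(AF_INET, "192.0.2.1", &peer.sin_addr);
            connect(s, (struct sockaddr *)&peer, sizeof peer);

            getsockname(s, (struct sockaddr *)&me, &len);
            /* behind NAT this prints 10.x/172.16.x/192.168.x --
               useless to advertise to the outside world */
            printf("address I would advertise: %s\n",
                   inet_ntop(AF_INET, &me.sin_addr, buf, sizeof buf));
            close(s);
            return 0;
        }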
\_ Therac-25, baby.
\_ DOS, 640K RAM is enough for everyone (reality: never enough RAM)
\_ Why is DOS a blunder? For many applications DOS works well
enough (e.g. my DSLR runs DOS and it works just fine).
\_ gets(), strcpy(), strcat(), and all other C standard library
functions that assume infinite buffer sizes.
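\_ A minimal sketch of the failure mode and the bounded
alternatives (the 16-byte buffer is arbitrary):

        #include <stdio.h>

        int main(void) {
            char buf[16];

            /* gets(buf) has no length argument at all: input
               longer than 15 chars smashes the stack. It was
               finally removed in C11 for exactly this reason.
               strcpy/strcat are the same disease: they copy
               until they hit a NUL, however far away that is. */

            /* the bounded replacements take the buffer size */
            if (fgets(buf, sizeof buf, stdin)) {
                char dst[32];
                snprintf(dst, sizeof dst, "copy: %s", buf);
                fputs(dst, stdout);
            }
            return 0;
        }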
\_ C++, period. Ugly, ugly, ugly.
\_ Go away troll.
\_ Y2K: first the prevalence of the bug, then the overblown
reaction to it
\_ bug != design decision. People designed systems with two digits
to hold the year because it was the right design tradeoff at the
time. If any of the designers had really expected the systems to
stay in use for literally decades, they would have decided
otherwise.
\_ Wouldn't it have been more space-efficient to represent the
year as a single byte, offset from 1900? That would have kept
them safe until 2155 and saved a byte. Would that have been
more computationally expensive?
\_ You obviously aren't familiar with BCD and its prevalence
in the financial world.
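\_ To answer the space question: C's struct tm already stores the
year exactly that way (tm_year = years since 1900), so the
single-byte scheme was no stranger to Unix. The financial world
instead used packed BCD -- one decimal digit per nibble --
because mainframes like the S/360 had native decimal arithmetic
and records were stored digit-for-digit. A sketch of the two
encodings:

        #include <stdio.h>

        int main(void) {
            /* binary offset from 1900: one byte covers 1900-2155 */
            unsigned char year_bin = 1999 - 1900;     /* 0x63 */

            /* packed BCD "99": tens digit in the high nibble,
               ones digit in the low one. Two digits, one byte,
               no century -- so 2000 wraps around to "00". */
            unsigned char year_bcd = (9 << 4) | 9;    /* 0x99 */

            printf("binary 0x%02x -> %d\n", year_bin, 1900 + year_bin);
            printf("BCD    0x%02x -> %d%d\n", year_bcd,
                   year_bcd >> 4, year_bcd & 0x0f);
            return 0;
        }

Arithmetic on the binary form is no slower; the tradeoff was
compatibility with decimal record formats, not computation.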
\_ Microsoft Bob. -gm
\_ I just looked at it. It actually seems pretty cool, despite
the primitive-looking GUI. What happened to it?
\_ I was referring in particular to its "password reset" feature,
which would prompt you for a new password if you entered the
wrong password three times. As for Bob in general, I don't
think it was ever really adopted, and its purpose (make the
Windows UI easier to use) became obsolete. -gm
\_ The password thing is just an implementation fuckup. -John
\_ MBONE SHALL RULE ZE VORLD!!! MUAHAHHA!!
--Professor Larry "The Slammer" Rowe.
\_ slammer?
\_ JavaScript. Language sucks, features suck, security sucks.
\_ That's ECMAscript beotch!!!
\_ The unification of data types and conceptual types in programming
languages. Unification isn't even the right word, because these
two generally have not been separated to begin with.
Also, the general philosophy of early CS pioneers of designing
for non-malicious, cooperative use. We are still dealing with the
repercussions of THAT (unsafe languages, problems with network
protocols, etc). -- ilyas
Designing languages for the 'average case' rather than the
'best case' (I am talking about users of languages). Designing for
the average gives you Java. -- ilyas
\_ Multics. The entire x86 security ring architecture. Java.
SMTP (sans authentication -- see the sample session at the
bottom of the thread).
\_ Java? Yeah, that's a big design blunder -- a language that is
easy to program in and works on all sorts of different devices,
not to mention fuels my paycheck every month. Maybe the
transistor is another big mistake?
\_ I see your "Multics" and raise you a "Nachos". -gm
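\_ Re: SMTP sans authentication -- on a classic open relay,
forging mail is a short telnet session, because the server takes
MAIL FROM entirely on faith. A sample session (host, addresses,
and exact response strings are made up, but the commands are
straight RFC 821):

        $ telnet relay.example.com 25
        220 relay.example.com ESMTP
        HELO example.org
        250 relay.example.com
        MAIL FROM:<president@whitehouse.gov>
        250 ok
        RCPT TO:<victim@example.com>
        250 ok
        DATA
        354 go ahead
        Subject: totally legitimate mail

        send money
        .
        250 ok, queued
        QUIT
        221 bye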