Jims Email-Improving Software
Last updated at 12:35 am UTC on 17 January 2006
The following is a lightly edited email that I sent to the list in June of 2001. The question of the day was, "How do we improve software?" and the related question, "Why does hardware get faster?", which we interpret as getting better. This is a three-parter; the next part is at: Jims Email-Improving Software
Bob Jarvis wrote the original email:
That's an interesting question. I'd like to ask two related questions:
Why and how does hardware get faster?
Can we do similar things to improve software?
I don't know the answers to these questions. (OK, I know a little - just
enough to know that I don't know enough). I'd be curious to hear someone
expound upon these topics.
[I thought that Tim Olson wrote a much better reply than mine: Improving Hardware and Software Reply]
> Why and how does hardware get faster?
While I don't know the complete answer, the process is very well financed,
and the guys who create hardware are smart, "ee-centrically" educated, and
know esoteric stuff like math and physics. Go to a place like Intel, TI, or
HP and count how many non-electrical-engineers are out designing hardware.
Compare that with the software world. Most software guys can't code their
way out of a paper bag, and very few have the equivalent concentrated
"technical" background that an EE does.
There has always been quite a bit of financial incentive to make faster and
smaller hardware. Hardware marketing is much better too. When they sell
you a CPU, they tell you "You know, the CPU that you're buying sucks, but
it's so cheap you have to buy it. In two years, the equivalent CPU will be
4x faster." In fact, Intel invests in software companies that "waste" CPU
time, things like 3D graphics, digital video, and networking in order to
build a demand for faster processors. Plus they ship new product on time.
It is very easy to measure hardware performance. If you put 2x transistors
on a chip of the same size and double the clock speed, it goes faster. Sure,
there is a little cheating and tweaking of benchmarks in comparing the same
generation hardware, but you can always rely that a two generation
difference (e.g. Pentium II vs. Pentium IV) will produce stunning results.
Place the two side by side and you'll quickly realize that they are not the
same computing experience. On top of that, you'll fondly remember how
fast you thought the PII was three years ago when you bought it (for
twice as much money as your new machine), and then realize in 2005 you'll
say the same thing about the "blazingly fast" PIV you just bought. Compare
that with software. How can you tell when one piece of software is better
than the other?
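The two-generation arithmetic above can be sketched numerically. This is a back-of-the-envelope illustration only; the 1.5-year doubling period and the assumption that performance scales with transistor count are my simplifications, not figures from the email:

```python
# Rough sketch (my numbers, not the email's): if transistor counts, and
# in this simplified model performance, double every `doubling_period_years`,
# then a two-generation gap produces the "stunning results" described above.
def projected_speedup(years, doubling_period_years=1.5):
    """Speedup factor after `years`, assuming periodic doubling."""
    return 2 ** (years / doubling_period_years)

# Three years between the PII-era machine and the PIV-era machine:
print(projected_speedup(3))  # 4.0
```

The same formula is why the comparison is so easy to make for hardware and so hard for software: there is no agreed-upon quantity for software that doubles on a schedule.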
Same thing in networking. If you are regularly using a broadband connection,
going back to a dial-up connection is, at best, an extremely painful
exercise. Conceptually similar, but in the real world they are a
fundamentally different experience. And in your heart you know that the
1.5 Mb connection is just barely usable to begin with, and when is the real
broadband that we've been hearing about going to get here at a reasonable
price?
On top of all that, people pour a ton of money into hardware development.
Software development is a pauper in comparison.
> Can we do similar things to improve software?
I'm pretty convinced that we can't "improve" software at this point in time.
Software is inherently invisible. You can't see it, you can't touch it, you
even have a difficult time measuring it. Compare this with hardware. You can
almost always tell when one piece of hardware is better than the other, even
if you have to strap it to a bench and watch it on an oscilloscope. Software
on the other hand is much more about personal taste. Does it do this, that,
or the other thing like I want it to? It's not entirely clear, when you
compare two pieces of software, which is "better". I know in some circles
the bad one is whichever has Microsoft written on it, but other than that
it's very difficult to objectively tell good software from bad. Usually
"bad" software is judged from a UI perspective, which might be less than
20% of an actual program.
Take a recent discussion on the Squeak list as an example. Let's see, Squeak
is an extensible programming environment, with what we would consider a
'modern' architecture today (automatic garbage collection, VM, GUI, etc.).
Forget that the guts are 20+ years old. People on this list (who should know
better) complain:
There's a thousand methods in class Morph !!!
Morphic is not fast enough.
Morphic is bloated and takes up too much space.
Never mind what it does. Never mind that we have never quantified under what
circumstances it doesn't meet some criteria, or even what those criteria
are. (I'll note here that it seems to work OK on my machines, and I don't
know what these guys are talking about.)
Hardware guys figured out all of this stuff long ago. When was the last time
you heard an EE say, "I sure wish we could get rid of an extra couple of
hundred thousand transistors for no apparent reason", or "There are just so
many transistors that I can understand!".
Think about how a hardware guy would approach all this for a moment. Number
one, he would never think that a "thousand methods" is a point of concern.
In the most basic sense, he does not believe in some unification theory
whereby everything is small and simple and elegant. Things are what they
are, and if he has to deal with a thousand things so be it. Make tools to
handle a thousand things. Slow? Why, where, analyze it. Bloated, space
problems? What are the requirements? What takes up all the space? Without
defining the problem, it's going to be damn hard to solve it.
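The "Slow? Why, where, analyze it" step has an obvious concrete form: profile before rewriting. A minimal sketch, where the routine and the workload are made-up stand-ins for whatever code is alleged to be too slow:

```python
# Measure first, rewrite second. `suspect_routine` is a hypothetical
# stand-in for the code someone claims is "not fast enough".
import cProfile
import io
import pstats

def suspect_routine():
    # Placeholder workload standing in for the allegedly slow code.
    return sum(i * i for i in range(100_000))

profiler = cProfile.Profile()
result = profiler.runcall(suspect_routine)

# Print the top few entries by cumulative time; this is the data that
# turns "it's slow" into "this part is slow, by this much".
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(3)
report = stream.getvalue()
print("profiled" if "function calls" in report else "no data")
```

Only after a report like this exists can anyone say whether a rewrite met its goal, which is exactly the point being made above.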
Instead of building tools to efficiently deal with the almost inevitable
thousand method class, the Squeak guys are saying, "Let's rewrite all of
this code so that we can meet some unspecified goals (4x faster, smaller,
whatever) ". And no matter what the end result, there will be a vocal
minority that screams "It's just a big fat pig in comparison to MVC!!!".
That is doubly true of any graphical UI that someone builds in comparison
with the highly regarded textual interface. How would we measure how much
better the rewritten software is? By how many methods there are? By how fast
it runs?
It seems to be part of the software mentality to go 'back to the future'.
You hear things like, "In 1969 I used to solve the exact same problem in
less than 278 lovingly hand crafted, hand assembled instructions that I
hacked in through the front console". Look at the groundswell support for
Linux in the last few years. Apparently folks think it's a clever idea, I
guess I should say revolution, to spend a whole lot of time, energy and
effort rebuilding 30 year old operating systems. "It's really good, it's
written in assembler (OK, C, the same thing). That makes it fast and small.
It's open source!!!". Which, of course, makes it so much different than the
GNU and BSD variations of Unix. Or, software reanimates old ideas into
something "new" like Java. Round and round, the whole thing looks pretty
stuck to me.
In general, I think this is the rule software lives by. Hardware designers
build for the future. They always press forward with design. Software people
do the exact opposite. In fact, there always seems this immense pressure in
the software world to make software run faster and smaller, even though you
would think that hardware would take care of that for them!! Not that I'm
against those things, but that time could be much better spent. I think it
is funny that everyone believes a browser is a great user interface.
At the very least, software guys seem to design for whatever machine they
currently have available, with very little foresight as to what the future
might bring. Strange, considering that hardware guys have laws about how
fast a machine will be X amount of time from now. It's amazingly rare that
you see a software team start designing code with the assumption, "It's
going to take 12 months to write this product, so let's assume the
capabilities of the hardware to be what a machine of that time frame is
forecasted to be". Instead, the software guys usually say something along
the lines of "It has to run on a five-year-old machine". So they end up
writing five-year-old software (six years old by the time the code is done).
Of course, everyone has turned a five-year-old machine into landfill by
then, and the software does not take advantage of any of a new machine's
capabilities.
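The "design for the forecasted machine" idea reduces to simple arithmetic. A sketch, where the 18-month doubling period and the MHz figures are my assumptions rather than anything from the email:

```python
def forecast_machine_speed(current_mhz, dev_months, doubling_months=18):
    """Forecast a typical machine's speed at ship time, assuming
    performance doubles every `doubling_months` (the 18-month rule
    of thumb is my assumption, not a figure from the email)."""
    return current_mhz * 2 ** (dev_months / doubling_months)

# A 12-month project starting when 1000 MHz machines are typical should
# target roughly a 1600 MHz machine at ship time, not today's hardware:
print(round(forecast_machine_speed(1000, 12)))  # 1587
```

The contrast with "it has to run on a five-year-old machine" is the point: the same formula run backwards five years shows how much capability that requirement throws away.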
Hardware guys are smart, educated, and well financed. They have goals.
They have plans. They actually use their brains. They're ready to go out,
kick some ass, and take some names. Software guys are very soft in
comparison.
This seems like more than 2 cents' worth, sorry.