Zurgle Paper
Last updated at 11:02 pm UTC on 11 July 2018
Note 2018: Dynabook links added

July 3, 2001, by Jim Benson jb@speed.net


Over the last few months, I’ve been thinking about how to improve my software production capabilities. Kick some ass; take some names, that sort of thing. Maybe not total world domination, but I certainly want to leave that open as an option. (Actually I would settle on being able to write simple programs in a reasonable amount of time, but don’t tell anyone). Unfortunately this has required me to actually use my brain and think, something that I find tiring at best.

History

When I first started out on my computer journey, I learned a few things early on which could be condensed into one statement. Programming, what is the proper pejorative? Oh yeah, programming sucks. I guess it tells you quite a bit about me when you consider that I make my living programming.

So why do I think programming is difficult? In my experience, it's because the machine that I'm programming hates me. Sits there like a big fat pig, sucking juice right out of the wall, waiting for me to feed it a niggle that ultimately causes it to crash.

Lord knows I have enough niggles to feed an army of machines. Uninitialized pointers, jumps outside of memory bounds, infinite loops. My 'off by one' errors alone fill up three warehouses in south New Jersey. And I'm building more all of the time.

[As an aside: I usually use MS-Windows machines, which have an unquenchable thirst for niggles. Fortunately, Microsoft builds software with built-in niggle generators. My current favorite: The machine hard crashes inexplicably, so a hard reboot is in order. When the machine comes back up, it scolds that 'the computer was not shut down properly' (helpful hint: 'Always shut down the computer from the menu') and proceeds to run a procedure called ScanDisk. ScanDisk exhaustively scans all of the local hard drives on the machine.

Empirical evidence suggests that the introduction of ScanDisk correlated with a sharp spike in the price of coffee on world markets, as people everywhere took a break, and the price has kept climbing to this very day. Coincidence? I think not.

I am not quite sure what ScanDisk is scanning for, though I suspect that it is looking for licensing violations of Microsoft products. I do know that you have to be one brave soul to interrupt ScanDisk during this process, unless you want to spend the rest of the day reinstalling all of your software. The end result is that if you are programming and the machine crashes, it's a long time before you're back developing again. I have been told that later versions of the OS don't exhibit this behavior, but I don't think I'll be able to test it on the machine I threw from the tenth-floor office window when it started its fourth ScanDisk within 30 minutes [1].]

Out here in the dark, cold, cruel world I started to wonder what had happened. I had been promised in school that computers were good, and that modern software would be easy, even fun, to use and write. I thought that computers wanted to be my friend. I was promised that I wouldn't have to worry about the mundane bookkeeping minutiae that have plagued programming from time immemorial. Keyboard typing? A thing of the past. Other advanced technologies like speech interfaces, or light pens, or tablets, or really advanced hand waving while wearing attractive computer-enabled goggles and gloves would be introduced. It didn't really matter. I would just vaguely describe what I was trying to accomplish, and the machine would take care of the rest. I would spend the rest of my time on a beach in Barbados, sipping some tasty tropical drink with an umbrella in it.

OK, maybe that was a little optimistic. Certainly I thought that these 'old' ideas would be the bare minimum in whatever programming environment(s) became popular. There would be a plethora of languages at my beck and call, each better than the last, allowing pinpoint accuracy for the solution of particularly vexing problems.

By the way, when I say 'Interactive Programming Environment', I mean something different from what Microsoft calls Visual Basic, or what Emacs provides for 'C'. A programmer should be able to reach out and 'touch' anything on the machine, examine it, change it, redefine how it works, all at run time. Plus, that reach should extend anywhere within the system, or at least anyplace he has a 'rightful' place to be. On his personal computer, the master of his domain as it were, I would assume that to be anywhere.
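
To make that concrete, here is roughly the kind of thing I mean, as a sketch you could type into a Squeak Workspace and evaluate while everything is live. The class Zurgle and its greet method are invented purely for illustration:

  "A Workspace 'do it': define a class, give it behavior, then change that
   behavior while an instance of it is still alive."
  | z |
  Object subclass: #Zurgle
      instanceVariableNames: ''
      classVariableNames: ''
      poolDictionaries: ''
      category: 'Zurgle-Examples'.

  "Compile a method into the running class."
  (Smalltalk at: #Zurgle) compile: 'greet
      ^ ''Hello from a live object'''.

  "Talk to a live instance."
  z := (Smalltalk at: #Zurgle) new.
  Transcript show: z greet; cr.

  "Redefine the method while the instance still exists; the next send picks it up."
  (Smalltalk at: #Zurgle) compile: 'greet
      ^ ''Hello again, redefined on the fly'''.
  Transcript show: z greet; cr.

Nothing gets shut down, recompiled offline, or relinked; the image just keeps running while you change it out from under yourself.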

There is no evidence that we need these things for creating great software. After all, if you can write what people tell me is wonderfully advanced software like Linux using just Emacs and C, you can pretty much build anything. Tens of thousands of people work on Linux, and the parts that aren't perfect soon will be. Now, you're saying to yourself, "Why doesn't this guy just suck it up, be a man, and really learn to program? After all, once you identify those silly coding mistakes, why in the world would you make them again? Off-by-one errors don't happen on my watch, pal." Even the people who brought you 'Think different' see it that way.

My answer to this is that we have done a great injustice to the current generation of programmers, and by extension computer users, if people think this way.

The Land of Milk and Honey

If you look back into 'ancient' computer times, circa the mid-1980s, there was a very strange breed of dinosaur on the earth. If you follow the fossil record closely enough you'll notice that these "apartment refrigerator sized" computers appear to be descended from the Mini-Computer-asaurus, and look to be the direct ancestors of the modern-day computer workstation. Here's a picture of these Lisp Machines manufactured by Symbolics in a relaxed, natural setting.

I'll tell you why I think these machines are interesting, and why talking about programming should even include a reference to hardware at all. Back around that time, there were quite a few experiments being performed on microcoded execution units. In short, this facility allows one to write custom CPU instructions, which is invaluable in constructing things like virtual machines. Bit twiddling at its best: the programmer defines the instruction set that the computer executes. This is in stark contrast to the general-purpose CPUs of today with fixed instruction sets. This capability was certainly not unique to the Lisp Machines; a notable example is the hardware produced at Xerox PARC in the 1970s to run Smalltalk and Interlisp-D.

I believe that this feature became obsolete when people marketing the first RISC architecture processors convinced everyone that a machine executing only extremely simple instructions could outperform a similar processor with a more complicated instruction set at a better price point. This was ostensibly due to the fact that the RISC architecture could be optimized for the execution of a small set of simple instructions, and simply waving the Turing machine flag at the end of the day proved that any program produced on one architecture could run on the other. Accountants everywhere rejoiced, and programmers had their work cut out for them. [Interestingly enough, Transmeta is using the mirror opposite of this argument to market their new processors. They claim that general-purpose microcodeable machines are fast enough to emulate any fixed instruction machine. They even have their own Turing machine flags ready!]

Rather than say where each of these ideas originated and sound like a thesis with references and such, I’m just going to list what I think are some important points of the Lisp Machine. I’ll just point out here that while other machines of the time had many of these features, few embodied so many of the concepts of ‘modern’ computer design and were commercially available.

First and foremost, the Lisp Machine provided fast Lisp execution. Lisp was the assembler of the machine as it were, which allowed the operating system to be written in 100% Lisp. All levels of the software were written in Lisp, which assured compatibility with any Lisp application produced.

As Alan Kay (a recurring character within this story) will tell you, for complex systems the compiler and development environment need to be written in the same language they support. It's the only way to grow code. I think a corollary to that is that the 'strength' of the underlying language ultimately determines how powerful the system can become.

The memory garbage collector was a first class citizen of the operating system. By implementing a tagged memory architecture and combining it with a processor-level instruction set for dealing with memory allocation, the garbage collector was extremely efficient. The tagged memory architecture also allowed for dynamic typing of memory and excellent monitoring capabilities.

Most importantly, superb development and debugging tools were built directly into the operating system.

It had some good networking facilities built in.

Outstanding documentation was a hallmark of Symbolics systems. Documentation existed in electronic form using innovative hypertext much like a modern web-browser, and also in printed form.

The dialect on the Lisp machine allowed:

As I’m sure you know, the whole thing failed miserably. Rather than go into all of the reasons why this idea failed, I’ll just give you one quote:


With an installed worldwide base of some 7000 LISP machines, there is even the possibility that there are actually more [LISP] machines in the marketplace than there are experienced LISP hackers. Frightening.

Harvey P. Newquist, AI Trends ’88, No. 48

Patrick Winston says that this is the primary reason that Symbolics was “doomed” from its conception.

Now, when these guys talk about Lisp, the first words out of their mouths are something to the effect that the lambda calculus is a universal model of computation; that is, any computation that can be expressed on a Turing machine can also be expressed in the lambda calculus. Lisp is a direct implementation of the lambda calculus.

If you're like me, that last paragraph has a lot of 'eye glaze over' factor. Add to that a near-fetish for parentheses, and you begin to understand why Lisp might not become the everyman computing environment that the Lisp machines' capabilities would at first suggest. Another factor is that, at the time, they cost a fair amount of money, though now you can reconstruct one for around $1,500 USD.

Rather than wax philosophical about Lisp, I'll just gloss over the main facts (as is my usual style) and state that it's relatively hard to program in Lisp without quite a bit of training. It's not so much that the bar is set very high as that there is a bar there to begin with. Everyone knows that programmers are undereducated slobs.

The reason that I'm talking about hardware here is that in most cases, hardware is just software that has been turned 'hard' or 'firm'. There are definitely some cases of 'hard' hardware, such as key presses on a keyboard or bits streaming through an I/O device, but many of the things that we consider hardware today, such as a CPU's instruction set, are the result of a conscious design decision to freeze software and make it 'hard'. That includes things like CPUs, ROMs, firmware, etc. The Lisp machine architecture is interesting from the perspective that software (and by extension the programmer) controls the machine, from the very lowest functions (like programming the instruction set) to some of the most advanced AI programs devised. In the final analysis, the Lisp Machine is much more about software than hardware, and it empowered the user in the true sense of what the 'personal computing' revolution promised.

Back to the Future

So, back to the Alan Kay character in the story. Kay had this vision way back when that has been popularized as children going to the beach with their Dynabooks and having a grand old time, while learning all of the secrets of the universe locked within.

Dynabooks are advanced computing devices

It sounds like kids were actually having fun with computers, and I’m thinking if they’re going to the beach tasty tropical drinks can’t be far behind!

If you remember back a little earlier in the story, I noted that Xerox PARC had microcoded execution units. Those units supported a couple of programming environments, Interlisp-D and Smalltalk. Interlisp-D is another implementation of the Lisp ideal, similar in concept to the Lisp Machine.

Kay coined the term 'object-oriented programming', demonstrated by a programming language called Smalltalk. Everything was an object, and objects talked to each other using messages.
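
In Smalltalk that is quite literally all there is to the syntax: a receiver, a message, and maybe some arguments. A few lines, using nothing but stock classes, show the flavor:

  3 + 4.                         "the object 3 receives the message + with the argument 4"
  'squeak' asUppercase.          "a string answers a new string, 'SQUEAK'"
  #(3 1 2) asSortedCollection.   "a collection answers a sorted version of itself"

  "Even control flow is messages: a Boolean receives ifTrue:ifFalse:
   with two block objects as its arguments."
  3 < 4
      ifTrue: [Transcript show: 'three is less'; cr]
      ifFalse: [Transcript show: 'something is very wrong'; cr].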

Smalltalk was the exploratory vehicle that Kay’s group was using to pursue some of the implementation ideas of a Dynabook type of computing environment.

Conceptually, the Smalltalk implementation was much the same as the Lisp Machine. The microcoded execution units supported atomic Smalltalk instructions, allowing efficient implementations of the virtual machine and garbage collector. The operating system was written in Smalltalk, and the language included advanced data structures and algorithms in its class libraries. An advanced development environment was created for programming in Smalltalk. New paradigms like overlapping windows and menus were invented for a better user interface. However, for various reasons, a similarly themed 'Smalltalk Machine' was never effectively marketed.

An interesting note here. Even though Kay is a mathematician by training, one does not think of things like ‘lambda calculus’ when looking at Smalltalk. You think happy thoughts of kids playing with computers while exploring the world around them. The kids can ‘talk’ to their machines using simple to understand programming constructs, with friendly objects and messages.

Spring forward a few years, to the late 1990s. Kay's group decides to restart this research in earnest using Smalltalk. The original Smalltalk-80 evolved into Squeak. The basic idea is pretty simple: take Smalltalk-80, place it in a slingshot, pull back real hard, and let go. Because the development of programming languages and environments appeared to have stalled, just use Smalltalk as a base and run with it. Shouldn't take long to get back to the front of the pack. That's where we are with Squeak today.

Here we go

Kay’s group is working hard at bringing a ‘dynamic medium for creative thought’ to the Internet using active essays. From the Squeakland.org site:

Squeak aims to have “no threshold”, in that many five year olds can explore ideas in it; and “no ceiling”: its range includes all of the things that can be done with computers.

Does the phrase “all of the things that can be done with computers” sound familiar? And not even one mention of lambda calculus. You’ve got to like that. If five year olds can use this thing, maybe I have a chance. OK, all you have to do at this point is download Squeak, install it on your machine (almost any machine will do they say) and you are the king of the world ...

I started working with Squeak several years ago. In part, the way that I use my machine is in direct contrast to the Squeak team's goals. I usually use only one architecture and one operating system. Squeak runs everywhere, on all platforms. Personally, I never worry about whether it works on a PDA, or an Alpha 64, or an SGI Ultra Reality engine. It's just me and my PC.

All should be fine and dandy at this point. A fresh bed of rose petals everyday. PCs have gotten fast enough that all that stuff about machine level instruction microcoding is not that big of an issue. I certainly don’t have as many operating system worries as I once had. Except …

It doesn't do what I want. I don't get a real good feel for being in control of my computing environment. I always seem a step away from getting it to do what I want. I always get the feeling that "if I just knew this one class or method, I would 'get it' and the world would be wonderful." However, I just always end up learning one more class or method without the insight as to how it all fits together. With Squeak running on my machine, I don't even get the satisfaction of throwing niggles at it and watching it choke and die on them. When I throw a likely niggle its way, a cute little pink dialog asks me, "I've found a niggle. Would you please be so kind as to remove it? I might choke and die." I hate cute.

Part of the history of Squeak is the introduction of Morphic to replace the previous Smalltalk-80 Model-View-Controller (MVC) interface paradigm. Morphic is way cool: a direct-manipulation reification of the user interface. One of the side effects of grafting Morphic into Squeak was that the entire development environment, which includes code browsers, file lists, dialog boxes, virtually everything you see and interact with on the screen, had to be reimplemented. As such things go, the end result was successful, in that they accomplished exactly what they set out to do.

I'll point out here that Kay's team is not concerned with developing a programming environment, but rather an end-user experience for use over the Internet. While what follows may sound like criticism, I won't sugarcoat it: it probably is. However, it's not pointed at Kay's group; rather, it's just stuff that I am changing for my own personal satisfaction.

Some people dismiss the programming environment as unimportant, arguing that in the long run it doesn't make much difference. After all, there are relatively few programmers using computers versus other users. I say it's much more important, because the programming environment determines how other programs are developed, and is thus the limiting factor in building better software. In addition, it is very important because it affects me. This may surprise you, but I'm not as concerned with other people using their machines as I am concerned with me using my machine.

In using Squeak from day to day, I've had some problems. The idea here is that I want a Smalltalk Machine, just like the Lisp machines of lore. I'll compromise a little, keep an extra 50 grand in my pocket, and live with the fact that I won't be microcoding my PC for the Squeak VM. In some sense, that's good because I can still blame the hardware guys for not putting automatic garbage collection in hardware a long time ago, where it belongs. Those jerks.

Also I’ve promised myself to keep out of the VM as much as possible, and treat it as a black box. Not that I’m afraid of it so much as I believe that there is much more to be gained working on the other side of the fence in the image itself.

The effects of implementing a direct MVC replacement in Morphic are far reaching. Several were clever; for example, a fairly simple window interface became very straightforward. An unfortunate side effect is that much of the code is what I'll call 'tangled' within the implementation. Morphic is a different view of life than MVC, as morphs tend to be a combination of the view and controller. Some morphs carry their own model within themselves. So in a direct port of MVC, morphs tended to just burrow into the code rather than separate themselves out and live above the crowd.

Another effect of evolving a live system is that the underlying substrate was constantly changing. If you look at an early release of Squeak versus the latest development environment, you’ll notice that much of the original foundation has changed dramatically. For example, the popular AlignmentMorph became obsolete when most of its functionality was placed into the base class Morph. This creates quite a bit of confusion for new users or the uninformed.
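
To stay with that example: where older code reached for an AlignmentMorph to lay out a row of submorphs, a plain Morph does the same job in recent images. Roughly, and treating the exact selectors as a sketch since they have shifted a bit from release to release:

  "A plain Morph acting as a horizontal row, the job AlignmentMorph used to do."
  | row |
  row := Morph new
      color: Color veryLightGray;
      layoutPolicy: TableLayout new;    "give the morph a layout policy"
      listDirection: #leftToRight;      "lay the submorphs out as a row"
      hResizing: #shrinkWrap;
      vResizing: #shrinkWrap;
      cellInset: 4;
      yourself.
  #(red yellow green) do: [:each |
      row addMorphBack: (Morph new color: (Color perform: each); extent: 40@40; yourself)].
  row openInWorld.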

In my Smalltalk Machine world, I envision myself sitting in front of a new-generation machine with a nice-sized display. I currently have a Pentium IV with 512 MB of memory, two 80-gigabyte drives, and a 21-inch monitor at 1600x1200 resolution. I have had more than one monitor attached at times. Because Squeak runs on so many machines, the interface is nominally designed for resolutions around 800x600 or 1024x768. One result of this disparity is that the one-pixel window edge that you use for window resizing is difficult to grab on a higher-resolution display.

One thing to remember is that MVC was designed in historic times. Many of the interface elements that we have today, such as menu bars, pull down combo lists, and various other widgets did not exist back then, and are lacking in the current implementation. While some say that those things are not needed (after all we need a better interface anyway), they would seem to be handy to have lying about just in case.

Also, some of the standard accoutrements that you expect on modern machines are either missing or exhibit quite different behavior than what one would expect. For example, the ‘blinking cursor’ at the insertion point in a text-editing field is absent, and the topmost window loses keyboard input focus when the mouse leaves the window.

The window manager, while doing an admirable job as an MVC surrogate, is in need of a little help. In particular, each member of the window frame such as the close box, collapse box, label area etc. is an instance variable of the SystemWindow object. This should all be refactored into a new object, WindowFrame, which would control the window decorations, labels and draw an actual window frame for resizing the window itself. There are several things that need to be looked at in this area. A side effect of rehashing SystemWindow is that different looks or ‘themes’ to the interface can be added fairly easily to provide a little personal touch to the environment.
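
For what it's worth, here is the back-of-the-envelope version of what I have in mind. None of this exists in the image today; WindowFrame and everything inside it are placeholders for the refactoring, not code you should expect to find anywhere:

  "Hypothetical: pull the frame decorations out of SystemWindow into one object."
  Morph subclass: #WindowFrame
      instanceVariableNames: 'window closeBox collapseBox labelArea theme'
      classVariableNames: ''
      poolDictionaries: ''
      category: 'Zurgle-Windows'.

  "The frame owns its own decorations, so SystemWindow asks its frame for them
   instead of juggling each part as a separate instance variable."
  WindowFrame compile: 'initialize
      super initialize.
      closeBox := SimpleButtonMorph new label: ''X''; yourself.
      collapseBox := SimpleButtonMorph new label: ''-''; yourself.
      labelArea := StringMorph contents: ''untitled''.
      self addMorphBack: closeBox; addMorphBack: collapseBox; addMorphBack: labelArea'.

The point is simply that the close box, collapse box, and label stop being loose instance variables of SystemWindow and become parts of one frame object that knows how to draw itself, hit-test itself, and take on a theme.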

Morphic has a built in ‘layering’ of objects. You’ve probably seen effects like floating toolbar palettes in popular image editing programs, or dialog boxes that always stay on top even when underlying windows are selected. This is one area that needs to be addressed by the window manager. This work would probably extend to the RealEstateAgent to provide different window placement strategies or constraints to the window manager. A needed example of this is an ObjectExplorer that places itself close to the object that is being examined.

Jon Hyland's idea that a tool class should remember a preferred size when directed needs to be implemented. The window manager also needs to be able to control dialog boxes in a more civilized style.

Likewise, the menu system has grown by leaps and bounds over the years. While I admire all of the functionality that my many menu friends bring me, I'll be damned if I can find any one of them at any given time. A reordering is due, to add a little bit of order back into the World. It would be nice to have a way to make the menus look different, too.

One huge mess in the current system is in the area of preferences. I suggest that this area needs to be looked at carefully, and that preference settings be given the ability to be stored externally, so that they can be carried from image to image easily.
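
Even something as crude as the following would do for a start. This is a sketch against the old class-side Preferences protocol (valueOfFlag:, enable:, disable:); the file name and the particular flags are just examples, not a proposal for a format:

  "Dump a few preference flags to a plain text file..."
  | flags stream |
  flags := #(browseWithPrettyPrint balloonHelpEnabled fastDragWindowForMorphic).
  stream := FileStream forceNewFileNamed: 'zurgle.prefs'.
  [flags do: [:flag |
      stream nextPutAll: flag asString;
             tab;
             nextPutAll: (Preferences valueOfFlag: flag) printString;
             cr]]
      ensure: [stream close].

  "...and read them back in on the other side."
  stream := FileStream readOnlyFileNamed: 'zurgle.prefs'.
  [[stream atEnd] whileFalse:
      [ | tokens |
      tokens := (stream upTo: Character cr) substrings.
      tokens size = 2 ifTrue:
          [tokens second = 'true'
              ifTrue: [Preferences enable: tokens first asSymbol]
              ifFalse: [Preferences disable: tokens first asSymbol]]]]
      ensure: [stream close].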

I’ve never been able to use the FileList effectively. I need to work on that area. FileList2 is much better, but I need better handholding when searching, sorting and selecting files.

Key bindings are nearly as complex as the menu system. I think it’s about time to teach it a lesson.

Documentation? One thing that is strange about Squeak is that due to its history, much of its core is very well documented. I’ll say the rest is less so. Some type of automated documentation process must be put into place (remember the Symbolics machine?), and it must be used. The Squeak team was kind enough to put in a hypertext based system just for this purpose; it would be nice to take advantage of that.

Would you like some cheese to go with that whine?

Enough. As far as applications go, there are quite a few different areas that Squeak is lacking in. My specialty happens to be digital video. Squeak doesn’t do very much in that area. Simple apps like word processing and databases are just plain missing (the lack of a database is especially annoying). There are certainly interesting areas of programming I would like to explore, like Aspect Oriented Programming. However, every time I started working on a given specialized application, I realized that I wasn’t really happy working in the current Squeak environment. If I couldn’t be happy programming in Squeak, then magic apps wouldn’t jump out the other end. Thus the beginning of Project Zurgle.


What is the aim of the 'Zurgle' project?


Project Zurgle is a concentrated effort to clean up some of the rougher edges of Squeak and begin work on a workstation-class development environment. The project includes the aforementioned goals, but hopefully it will grow into something more useful. While some of the goals of Squeak and Zurgle may intersect, I intend to make my design decisions with fatter machines in mind; that is, the machines that I use.

First up, window management. I need to be able to resize my windows and move them around. My solution is to add window frames. I am not a fan of the yellow dot for resizing window panes. I’m planning to kill the yellow dot. If I’m adding window frames, I might as well refactor SystemWindow. If I’m refactoring SystemWindow, then I should probably come up with a way to describe an arbitrary window using some type of layout language. Now the boys working on the Stable Squeak project think that XML is the way to go, and have implemented some such changes.

I need to investigate that more thoroughly. One of the quandaries of Morphic is that you have this direct graphical manipulation paradigm. You build your interface objects concretely on the display. However, once you're through building, it is very difficult to figure out how to interact with them, or to come up with a way to store compound morphs and deal with them programmatically. My guess is that the window frame/layout and Morphic representation problems are very similar, if not exactly alike, and a little cleverness could solve both in one fell swoop. One experiment would be to use XML as an intermediate representation, but some type of user interface builder is probably warranted.
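
For scale, here is what any such description ultimately has to boil down to, namely what you write by hand today. A two-pane window built with stock selectors; the label, the panes, and the fractions are arbitrary:

  "Building a two-pane window by hand; a layout language would just be a
   declarative way of saying the same thing."
  | window |
  window := SystemWindow labelled: 'Zurgle sketch'.
  window
      addMorph: (Morph new color: Color lightYellow; yourself)
          frame: (0@0 corner: 1@0.7);       "top pane, 70% of the height"
      addMorph: (Morph new color: Color lightBlue; yourself)
          frame: (0@0.7 corner: 1@1).       "bottom pane, the rest"
  window openInWorld.

An XML (or any other) description of that window only has to name the panes and their fractional frames; the hard part is keeping the description in sync once someone has dragged things around directly on the screen.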

What I’m doing here is implementing a lightweight window manager. This might also be a nice time to visit the Cassowary constraint system. I also need to come up with a more intuitive way of documenting things as I go along.

As you can guess, such a beast has far-reaching implications throughout the system, and I'm currently out hunting down the problems that I have encountered. In some sense, I want to just have the current system work with a new windowing system by shoehorning it into the image. However, I think that in practice I need to abstract the whole deal, so that I can deal more effectively with system implementation details such as event dispatching and key bindings.

Right up front here, I'll just point out that I'm probably not the sharpest tool in the shed. However, given enough time and energy I can usually figure out what's going on in the image by looking through the code, and I have enough experience to implement the changes I've noted above. After all, my father proudly tells me "you're not a complete idiot". I think he means it as a compliment, though sometimes it's hard to tell. The preceding changes are not earth-shattering; in fact, it's just a whole lot of fluff added on top of Squeak to make it conform to modern norms. Whether the modern norms are better, worse, or sacrilegious doesn't really matter to me; they're what I'm used to and are probably close enough for a start.



[1] No actual Personal Computers were hurt in the creation of this document