Scamper
Last updated at 11:10 pm UTC on 8 May 2017

Scamper 2017


Metacello repository: https://github.com/HPI-SWA-Teaching/Scamper
Metacello new
  baseline: 'Scamper';
  repository: 'github://HPI-SWA-Teaching/Scamper:dev/packages';
  onConflict: [:ex | ex allow];
  load 


Current development head: https://github.com/HPI-SWA-Teaching/Scamper/tree/dev

Scamper architecture https://github.com/HPI-SWA-Teaching/Scamper/wiki/Scamper-Architecture

Rendering pipeline (data flow): see the diagram "Scamper rendering pipe.PNG".

Wiki: https://github.com/HPI-SWA-Teaching/Scamper/wiki

Installation into Squeak 5.1 http://forum.world.st/How-do-I-install-the-Scamper-browser-td4945136.html

Installation into Squeak 6.0a http://forum.world.st/Scamper-in-Squeak6-0a-17230-td4945217.html

Display of simple tables works fine now. See Scamper-HTML-Morphs and HTMLTableMorph.


See also

Headless Chrome

Overview 2006

Scamper is a simple web browser which runs in Squeak. Currently it supports:

Some things that don't work:

Enough is supported that you can browse around on the Web. Images with transparent backgrounds even look pretty cool.

But still, why use a Squeak browser at all, instead of Netscape? Two reasons:

On a deep level, Squeak is a nice place to be. The code and the objects are all there, easily accessible if you ever want to get at them. A Squeak user isn't limited to just looking at the pretty rendered HTML pages, but can get directly at the underlying objects which structure them; they can move within the web, instead of just looking at it through glass. Sure, most of the time users are content to just read; but when they want to do something more, shouldn't they have the opportunity?

On a pragmatic level, if you already do most of your work within Squeak, it would be nice to be able to do mundane things like looking up Web documents within Squeak, too.

The improve Scamper page

Executing

The most current version of Scamper is in the Squeak release; there is no separate installation currently available. Run Scamper by choosing "open..." from the world menu and then selecting "web browser".

Architecture

Here is an overview of the design, for anyone interested in web hacking.

Network-URL category
parsing absolute and relative URLs, and downloading Web documents through an abstract interface. Also includes a "MIMEDocument" class which combines data with its content-type and source URL.
HTML category
The biggest part. It includes four main subparts:
  1. HtmlTokenizer: divides a text stream into a sequence of HTML tags, text (handling character reference entities) and comments.
  2. HtmlParser: takes results from the above and generates a tree of HTML entities which is more convenient to manipulate.
  3. HtmlFormatter: takes the result of the above and generates a formatted Text object suitable for display.
  4. FormInput hierarchy: handles the set of inputs from a form.
Scamper
a simple Web browser with a mousey name. Display is handled via the pluggable views.
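For anyone who wants to poke at these parts from a workspace, the whole pipeline can be exercised in a few lines. This is a hedged sketch: Url absoluteFromText: and retrieveContents are standard Squeak selectors, but the HtmlParser and formatting selectors shown here are assumptions and may differ between Scamper versions.

```smalltalk
"Workspace sketch of the rendering pipeline.
 #parse: and #formattedText are assumed selectors; check the
 actual protocols of HtmlParser and the HTML entity classes."
| url document tree |
url := Url absoluteFromText: 'http://www.squeak.org/'.   "Network-URL: parse the URL"
document := url retrieveContents.                        "download; answers a MIMEDocument"
tree := HtmlParser parse: document content readStream.   "tokenize and parse into an entity tree"
tree formattedText                                       "format into a Text for display"
```

Inspecting the intermediate `tree` (rather than only the final Text) is exactly the "getting at the underlying objects" that the Overview section below advertises.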


Thoughts


Contributors

Scamper was originally put together by Lex Spoon.

David J. Pennell has been working on dynamic HTML generation, and pointed out the existing Document Object Model at http://w3c.org/DOM/. Feel free to jump in and explain this better!

Comments

Hey, this is a Swiki after all. Any comments are quite welcome.

I just grabbed a copy of Squeak 3.4 (latest update #5170); I tried to load up http://meta.wikipedia.org/ in Scamper and found myself confronted with an exception. It seems that valueOfHtmlEntity: doesn't handle hexadecimal character references. Although a complete Smalltalk newbie, I managed (after much trial and error) to get it hacked in (yay!). I'm sure my code is a bit ugly, but it seems to work.
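For anyone hitting the same exception: the fix boils down to spotting an 'x' (or 'X') after the '&#' and switching the radix from 10 to 16. A hypothetical sketch, not the actual change set linked below; the method name and its argument convention are made up for illustration:

```smalltalk
"Illustrative method: decode a numeric character reference.
 The argument is the text between '&#' and ';', e.g. '38' or 'x26';
 both of those should answer $&. Not the code from the change set."
valueOfNumericReference: aString
    | radix digits |
    (aString first = $x or: [aString first = $X])
        ifTrue: [radix := 16. digits := aString allButFirst]
        ifFalse: [radix := 10. digits := aString].
    ^ Character value: (Integer readFrom: digits base: radix)
```

If your image's Integer readFrom:base: insists on a stream argument, pass `digits readStream` instead of the string.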

In-progress fixes: http://leuksman.com/smalltalk/Brion's%20Scamper%20fixes.2.cs

Brion Vibber


Cookies

One thing annoying about Navelgator is the series of dialog boxes you get if you want to approve cookies before they are accepted. As an alternative, I propose a "cookie browser": the bottom of the page could contain a list of recently received cookies, with a check box by each one so the user can indicate whether to accept that cookie for sending back to the server. As cookies arrive, they get stuffed into the cookie browser. The default behavior (send or not send) could be an easy user-controlled switch.
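The proposal above can be modeled with a handful of plain objects. A minimal sketch, assuming nothing about Scamper's actual HTTP code; every name here is illustrative:

```smalltalk
"Sketch of the proposed cookie browser's model: each incoming
 cookie is kept with an approval flag, false by default, and only
 approved cookies are offered when building a request."
| jar approvedCookies |
jar := OrderedCollection new.
"A cookie arrives and is stuffed into the jar, unapproved:"
jar add: (Dictionary new
    at: #cookie put: 'session=abc123; domain=example.org';
    at: #approved put: false;
    yourself).
"The user ticks its check box in the cookie browser:"
jar first at: #approved put: true.
"When sending a request, collect only the approved cookies:"
approvedCookies := (jar select: [:each | each at: #approved])
    collect: [:each | each at: #cookie]
```

A list morph over `jar` with a check box per row would give exactly the "cookie browser" pane described above, and the default-behavior switch is just the initial value of the #approved flag.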

Scamper Removal and Install Packages


ScamperRemoval.sar
Scamper.sar

Fix Documentation