[FoRK] The Arc Challenge: Tastes great! Less filling!
<jbone at place.org> on
Sun Feb 3 00:46:48 PST 2008
So PG's responded to the griping about Arc's apparent disconnect
between ambition and approach. In his response he says "So working
on what makes programs short rather than what's hard to implement
translates to: I chose what to work on based on the value to the
user, rather than the cost to me."
I'm not sure that makes any sense at all.
In general I agree that a language's ability to make programs shorter
is correlated with value to the user, but there's so much more to it
than the kind of stuff that Graham seems to have fussed over so
much. But then he offers us a gem. From:
"I'm going to propose a simple problem as a challenge. We'll collect
solutions in each of the popular languages, and compare their
lengths. Here it is:
Write a program that causes the url said (e.g. http://localhost:port/
said) to produce a page with an input field and a submit button. When
the submit button is pressed, that should produce a second page with
a single link saying "click here." When that is clicked it should
lead to a third page that says "you said: ..." where ... is whatever
the user typed in the original input field. The third page must only
show what the user actually typed. I.e. the value entered in the
input field must not be passed in the url, or it would be possible to
change the behavior of the final page by editing the url."
"Here's the answer in Arc:"
(defop said req
  (aform [w/link (pr "you said: " (arg _ "foo"))
           (pr "click here")]
    (input "foo")
    (submit)))
A quick aside: I think PG is cheating a bit here; how do you get
the web server fired up? How is the port configured? Are those
"aform", "input", and "submit" library functions / macros / etc.,
or are they builtins in the core language? Point being, those few
tokens are not LITERALLY all that's required to get the computer to
behave as desired...
Well, well, well... what a fun game. I'm going to play a little
thought experiment here to illustrate a couple of points.
My good friend and Lisp guru Peter Oreo has been working on a new
language called Noah for, oh, 2 days now. It's a variant of Lisp.
In fact he's been writing it in Arc, it's just a "compiler" for an
extended variant of Arc, written in raw Arc, that compiles to raw
Arc. In fact the only difference between Noah and Arc is that Noah
has a builtin function "F" that, when invoked without arguments,
causes the url said (e.g. http://localhost:port/said) ...well, you
get where I'm going here. So he's got this builtin. The program
Paul Graham has written in 5 very fat lines of Arc above can be
written in Peter Oreo's Noah programming language by simply writing:

(f)

He wins! Wait, not quite yet he doesn't.
My other good friend and UNIX toolchain guru Phil Milano has watched
Peter working on Noah, and makes the following observation. He notes
that any domain-specific task which can be carried out in a library
can be given a very concise calling signature. So he's rightly not
too terribly impressed with what Peter Oreo has achieved. But, Phil
notes, some of the UNIX tools provide something different that leads
to greater *general* computational expressivity and terseness: some
of them provide *different computational / processing paradigms.*
You can, for example, often create awk programs that are shorter than
other language equivalents, because awk assumes an implicit
processing model of looping over its input. Similarly the shell
comprises an algebra of stream operators; this enables programs
which can be cast as dataflow transformations to be very tersely
expressed.
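To make the implicit-processing-model point concrete, here's a small
sketch (the input data is made up for illustration): awk's "program" is
only the per-line action, and the shell's pipe operator composes stream
transformations with no loop written anywhere.

```shell
# awk loops over its input implicitly; the program is just a per-line
# action plus an END clause. This sums the first column:
printf '3\n4\n5\n' | awk '{ total += $1 } END { print total }'
# prints: 12

# The shell's pipeline is an algebra of stream operators: find the most
# frequent line without writing any explicit iteration.
printf 'b\na\nb\n' | sort | uniq -c | sort -rn | head -1 | awk '{ print $1, $2 }'
# prints: 2 b
```

The equivalent in a general-purpose language needs an explicit read
loop, an accumulator, and a sort call; the terseness here comes from
the paradigm, not from shorter identifiers.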
He realizes that he can beat Peter Oreo at the Arc Challenge by
exploiting this notion of a language having an implied processing
model. He takes Oreo's code for Noah, extends it ever so slightly,
and creates Flood. Flood is exactly like Noah except that its
default, its null program, is implicitly Noah's (f) program, which is
PG's (defop said...) in native Arc. All he has to do is invoke flood
at the command line with no options....
He wins! But who cares, really?
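The Oreo/Milano moves translate directly into shell (a contrived
sketch; the function name `f` just mirrors Noah's hypothetical
builtin): any amount of machinery can hide behind a one-token call,
which is why raw token counts prove so little.

```shell
# Step 1 (Noah): bury the whole behavior in a "builtin"...
f() { printf 'you said: %s\n' "$1"; }

# ...and the "program" that performs the task is now a single token:
f hello
# prints: you said: hello

# Step 2 (Flood): make f the interpreter's implicit default action,
# and the winning program shrinks to the empty string. Here the
# FLOOD_INPUT default is a hypothetical convention standing in for
# Flood's null program.
: "${FLOOD_INPUT:=hello}"
f "$FLOOD_INPUT"
```

Both steps are legal, and both are beside the point: the work didn't
get smaller, it just moved below the line where we stopped counting.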
My first point is this: in some ways, The Arc Challenge in and of
itself validates some of the critics' arguments.
My larger point is this: if you really want to make certain classes
of programs easier to write, or some classes of programs *possible*
as a solo practitioner, then there's more to it than simply
optimizing syntax and bumming tokens out of the common cases. That's
necessary, to some degree, but not sufficient for the big win, for
the 100 Year Language. And to a large degree the opportunity for
bumming syntax down to size has already been exploited; the
degree of code compression achieved by e.g. Python, or any Lisp, etc.
vs. e.g. Java is likely to be much larger than the degree of code
compression achieved by Arc vs. e.g. its MzScheme host language.
To the extent that Arc (or any 100YL contender) *does* achieve a
greater degree of compression, it will be (as in the Challenge
solution) via rich(er) standard facilities. Or it will be via novel
processing models, etc. --- and *some* of the richest of those
opportunities require heavier lifting than what we've seen here.
Things like Erlang and its Scheme derivative Termite, for example,
imply a whole new model for factoring systems to enable concurrency
and distribution... and that's just one example. Arc's got nothing
to offer on that front that's not already present in MzScheme; in
some cases it offers less than MzScheme does.
What language designers should really be doing, IMHO, is attempting
to discover why e.g. Python turns average coders into productivity
engines, and / or why Haskell enables elite programmers to routinely
crank out products that would require teams in other languages ---
and enables them to do so in less than 5k LoC.
Despite my earlier critical comments, I'm actually slightly
optimistic about Arc's prospects. After watching the responses for
the last couple of days it appears to me that, if nothing else, Arc
(if it can get to the "walks" phase, much less "runs" --- and yes, I
realize that my peanut gallery commentary isn't helping! ;-) has the
potential to rally a large part of the extremely fragmented Lisp
community. We'll see. But I bet you a cheeseburger (in a can*) that
if and when it does achieve that kind of critical mass, it will look
more like its host language (in terms of libraries, environments,
tools, community infrastructure, etc.) than it does today.
Here's my challenge to anybody out there:
- build a GUI toolkit for Arc (not just tk bindings, please!)
- build a packaging / software distro mechanism for Arc (cf. gems,
etc.)
- port / embed scsh in Arc
- port / embed Termite in Arc
- generic DB API and DB bindings for Arc
- FUSE bindings for Arc
- a consolidated download / build / install for Arc
That'll get you some momentum. Note too that even if all you're
after is having Arc be the best Web development language, you *still*
need a few of those (packaging, DB bindings, a less hokey download /
build / install.)
Footnote: after mulling this over all day, drafting it, sending it
off to a valued colleague to get editorial input, etc., I finally
decided to post it. A quick scan of reddit revealed a post with the
provocative name "What are programming languages for?" Instead of
hitting send I read that post, only to discover that somewhere down
near the bottom the author essentially articulates the same argument
I'm making with the Peter Oreo / Phil Milano examples. Beat to the
punch. C'est la vie. Cf.
* Ugh. Yes you can haz cheezburger... in a can.