I've been thinking about this challenge in terms of Erlang idioms and worked up a (very) minimal web framework using yaws. With SPEWF <http://code.google.com/p/spewf/> this might look like:
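For reference, here's a minimal sketch of the plain-yaws side of this, using the stock appmod interface (yaws dispatches each request to out/1 and renders the tuple you return). The SPEWF wrapper itself isn't shown, so treat the module and contents as illustrative:

    %% Minimal yaws appmod: yaws calls out/1 for each request and
    %% renders the returned tuple. This is stock yaws, not SPEWF's
    %% own API.
    -module(hello).
    -export([out/1]).

    out(_Arg) ->
        {html, "<h1>hello from yaws</h1>"}.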
The second def to me implies that add-1 returns a function, not a value.
Since Arc supports subscripting a data structure by using it in function position (I'm sorry if that's a terrible way to say it--I'm not a Lisper), the last expression can be non-uglified:
(def add2 [+ (_ 0) (_ 1)])
So [ ] denotes an anonymous single-argument function, and when more than one argument is passed to it, _ is bound to the list of arguments.
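Transposed into Erlang purely for illustration (the list-of-arguments convention here is the proposal above, not anything Arc or Erlang actually does), add2 under those semantics would be:

    %% A single-parameter fun whose one argument is the list of
    %% "real" arguments; (_ 0) and (_ 1) become positional lookups.
    Add2 = fun(Args) -> lists:nth(1, Args) + lists:nth(2, Args) end,
    Add2([3, 4]).   %% returns 7

(Note lists:nth/2 is 1-indexed where the Arc sketch is 0-indexed.)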
Actually, forget everything I said - I agree with your first sentence. I had originally thought there might be an elegant way to ditch formal parameter lists when there aren't optional or rest parameters. But the cost seems too high for the value added.
It's totally possible that I'm missing the point, but how are (_ 0) and (_ 1) any better than x and y? Sure, this gives you an implicit parameter list, but when you go beyond short functions like add2, wouldn't it be nice to have names for your variables? My understanding was that [ _ ] was for all those quick, inline anonymous functions, not for defining non-trivial functions.
I think "better string processing" is below the level of writing a grammar, even a mini-grammar.
If you treat strings as lists of characters and have good pattern-matching, this problem mostly solves itself, without anything as ugly as regular expressions. You have the rich set of list operations and function-based predicates, and with clause selection by pattern, that's mostly what you need.
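To make that concrete in Erlang, where a string really is a list of character codes, a whitespace tokenizer needs nothing beyond clause selection by pattern (a sketch; the function names are mine):

    %% Split a string into words on spaces, using only pattern
    %% matching and list recursion -- no regular expressions.
    words(Str) -> words(Str, [], []).

    words([$\s | T], Cur, Acc) -> words(T, [], emit(Cur, Acc));
    words([C | T],   Cur, Acc) -> words(T, [C | Cur], Acc);
    words([],        Cur, Acc) -> lists:reverse(emit(Cur, Acc)).

    %% Flush the current (reversed) word, skipping empty runs.
    emit([],  Acc) -> Acc;
    emit(Cur, Acc) -> [lists:reverse(Cur) | Acc].

So words("the quick fox") gives ["the","quick","fox"].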
Yes, I know strings-as-lists seems like a terribly suboptimal thing to do; but it's an optimization challenge (how do you optimize a particular backing representation of lists so that certain kinds of operations--substring matching, tokenizing, etc.--happen efficiently?), not something to bake into the language.
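Erlang's binaries are one existence proof: a compact backing representation for byte strings that still supports the same clause-selection style, e.g.:

    %% Prefix-match on a binary instead of a char list; the
    %% representation changes but the pattern-matching style doesn't.
    method(<<"GET ",  Rest/binary>>) -> {get,  Rest};
    method(<<"POST ", Rest/binary>>) -> {post, Rest};
    method(_)                        -> unknown.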
I have to say I think regular expressions belong ghetto-ized into a library. If they are bundled with the language, and a language is a tool to think in, then they encourage the programmer to think of all data as textual strings to be transformed into structured data (over and over) via the text-matching engine.
This is one of the most irritating legacies of Perl and no new language should make that mistake. I'm not saying regular expressions have no place, but it's a specific and special problem domain, not a general one.
If I had to vote, it would be to have Prolog/Erlang/Haskell-type pattern-matching. I'm not sure what pg means when he says it's good for writing append and that's it--as an Erlang programmer it seems like bread and butter to me.
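For the record, append by pattern-matching is two clauses in Erlang, and the same two-clause shape (base case, cons case) covers a huge amount of everyday list code:

    %% Structural recursion by pattern: one clause per case.
    append([H | T], Ys) -> [H | append(T, Ys)];
    append([],      Ys) -> Ys.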
What I don't really understand is why strings are constrained at all... why aren't they just lists of integers? Graham himself has suggested exactly this in his essays, and other functional languages do it.
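Erlang, for one, does exactly this; in the shell:

    %% A string literal is just notation for a list of integers,
    %% so every list operation applies to it directly.
    1> "abc" =:= [97, 98, 99].
    true
    2> [C + 1 || C <- "abc"].
    "bcd"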
I'm guessing that pg wrote it to use lists, but it turned out too slow (he has a site to run on Arc), so he's using regular strings for the time being. The first thing you do when writing programs that munge large amounts of text in a language like Haskell, for instance, is to stop using the built-in strings-as-lists and switch to a sensible library like ByteString.
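The Erlang analogue of that switch is moving from char lists to binaries for bulk text: the program shape stays the same while the representation changes (a sketch; the function names are mine):

    %% Count newlines over a char list, then over a binary -- the
    %% compact representation you'd switch to for bulk text, much
    %% as Haskell programs switch to ByteString.
    count_nl_list(Str) ->
        length([C || C <- Str, C =:= $\n]).

    count_nl_bin(Bin) -> count_nl_bin(Bin, 0).
    count_nl_bin(<<$\n, Rest/binary>>, N) -> count_nl_bin(Rest, N + 1);
    count_nl_bin(<<_,   Rest/binary>>, N) -> count_nl_bin(Rest, N);
    count_nl_bin(<<>>,                 N) -> N.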