The downside of doing that is that you can't do nice simple things like
  (when (find 1 '(1 2 3)) (prn "Found 1"))
You would have to use match instead, but I agree there will always be cases where you do need that option.
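(If I'm following the thread right, here's the underlying problem both approaches are trying to solve: in Arc, find returns the item it found, so searching for nil is ambiguous:)

  (find 1   '(1 2 3))    ; => 1    found
  (find 5   '(1 2 3))    ; => nil  not found
  (find nil '(nil 2 3))  ; => nil  found, but indistinguishable
                         ;         from the failure case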
I think defcon and match are a great idea, although I don't think you need all that power for what you're doing. Returning a uniq that indicates failure would be just as effective. defcon would be good as a standard way to create new types, and because they'd all work with match, you'd gain a lot of power by following that convention. This kind of destructuring is one of ML's best features, and I was tempted to implement it for Arc myself.
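A sketch of what I mean by the uniq approach (find2 and failure are names I just made up):

  (= failure (uniq))   ; a fresh, uninterned symbol

  (def find2 (x seq)
    (if (mem x seq) x failure))

  (let r (find2 nil '(nil 2 3))
    (unless (is r failure)
      (prn "found nil")))   ; prints: found nil

Since no caller can ever construct failure, "found nil" and "not found" no longer collide.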
K is actually a very good language for Arc to steal ideas from. It's extremely concise (although it doesn't have to be line noise as above) and it does everything with lists.
yes, it doesn't have to be line noise - see eliza.q for an example. q is k without ambivalence: symbols denote only binary functions, and unaries have keywords. e.g. in k, x+y is addition and +x is transpose; in q, + is addition and flip is transpose. the q programmer can go further down this road, e.g. define x#y as 'take', then use x take y. as i mentioned in a follow-up on comp.lang.functional, part of how k achieves concision is to keep the primitive set small and orthogonal over a small set of datatypes. the programmer seeks representations which are tuned to the primitives. i think of this as coding to the grain of the language.
a small but important correction: k doesn't do everything with lists (although it does do a lot with them). k has atoms and lists. it also has dictionaries (maps of names to values); tables, which are transposed dictionaries of lists; and keytables, which are maps from utables to tables, where a utable is a table of unique records. as pg knows, it has first-class functions, but not closures.
people outside the k world may have the impression that k is just APL with an ascii character set. at this point in its development, i think it is more useful to think of k as a functional query language. in APL, the primitives understand atoms and lists (scalars and arrays). in k, they also understand maps; since a table is a transposed map of lists, a table is also a list of atomic maps.
i hope these remarks are useful to all of you arcturians.
Ooh... I hadn't seen withs before. That'll be useful.
But you're right, it still doesn't fix it. The closest thing is labels in Common Lisp, but that can only be used to create functions. Perhaps if even CL can't do it then it's not that useful after all.
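For anyone else who hadn't seen it, withs is like with but binds sequentially, so each binding sees the previous ones (like let* in CL). A quick sketch of both what it does and why it still doesn't fix it:

  (withs (x 2
          y (* x x))   ; y sees x, because withs binds in order
    (+ x y))           ; => 6

  ; but mutual recursion still fails: f's body refers to the
  ; global g, not the g bound on the next line
  (withs (f (fn (n) (if (is n 0) 'even (g (- n 1))))
          g (fn (n) (if (is n 0) 'odd  (f (- n 1)))))
    (f 3))             ; error unless some global g already exists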
I originally considered writing a macro that did just that, and even called it module, but then I realised I was packing a load of disparate features into one macro just so it would look like modules in other languages.
I don't think the point is conventions (at least not yet). I think the point is that modules may be an onion, and we don't want any onions.
If the module system is an onion, it is an onion with a kernel, which is really a convention.
For example, see SLIB, the portable Scheme library: http://www-swiss.ai.mit.edu/~jaffer/SLIB.html . It is designed to work with R5RS Scheme and earlier, so it can't rely on any module system. Instead, it uses a naming convention, which works pretty well. The problem is that, to make it work, all programmers have to agree on and follow the convention. So-called module systems make it easier to follow the rules, either by making the convention invisible (e.g. by effectively adding prefixes automatically) or by preventing programmers from doing things differently.
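The convention really is as bare as it sounds; a tiny sketch in Arc (mylib is a made-up name):

  ; a "module" by naming convention alone: every public name
  ; carries the mylib- prefix, so it can't collide with other code
  (def mylib-double (x) (* 2 x))
  (def mylib-halve  (x) (/ x 2))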
In the discussion of R6RS Scheme, where a module system was finally adopted, some people argued that you can roll your own module system out of the given primitives, and that this had been proven in production code. But if you use your own module system (or convention) and I use my own, it will be difficult to mix our two libraries. Whatever it is, we have to agree on one way. I think that's really the point.
I agree that module systems are conventions. What I don't agree with is that we should uncritically adopt them in Arc. There may be a better way. What that is, I don't know.
Then the important question is this: when you say better, better in what sense? Beyond a certain point, people have different criteria, and your "better" and my "better" start to diverge.
In Scheme, such matters have traditionally been settled by agreeing to disagree: instead of choosing one option, the spec just leaves it unspecified. But R6RS had a goal of allowing portable libraries, so it had to choose; for modules, it is better to have a single, lesser module system than to have multiple (each "better" in various regards) or none at all.
In Arc, ultimately it's up to pg.
[Edit: To put it more diplomatically, the R6RS module system is "less agreed upon". I do understand it is better by its own design criteria.]
Good question. I'm sort of inspired by PG's thoughts about object-oriented programming: that it would be better if languages offered the features a la carte rather than packaging them up into objects.
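For instance (a sketch, not a proposal): one of the things objects package up is private state, and in Arc a closure already gives you that a la carte:

  (def counter ()
    (let n 0
      (fn () (++ n))))   ; each counter closes over its own n

  (= c (counter))
  (c)   ; => 1
  (c)   ; => 2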
I like the footnote; it's very relevant to this discussion:
> So perhaps packages will turn out to be a reasonable way of providing modularity. It is prima facie evidence on their side that they resemble the techniques that programmers naturally use in the absence of a formal module system.