I suspect that this change to association lists, together with functional position lookups, destructuring-bind, and a judicious use of conses, could potentially eliminate the need for tables in Arc's core and:
- solve the optional arg problem 
- permit apply to be subsumed by the dot notation 
I still have a lot of details to work out before I can make a compelling case for this, though.
I don't see the point. o.o I'd much prefer to write '((a 1) (b 2)) rather than '((a . 1) (b . 2)) and destructure using (let (k v) ...) rather than (let (k . v) ...). Actually, I'd write '((a 1) (b 2)) as (objal a 1 b 2), but the destructuring issue is something I'd just deal with and fume over. :-p
Is there any downside?
Those are the downsides. :)
It's more efficient and more isomorphic to a hash table.
I think you save about 1/3 the conses when creating and adding to alists, so there is that.
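To make the cons arithmetic concrete, here's a sketch in Arc notation (counts are per two-entry alist):

```lisp
; Each entry of a dotted alist is a single cons, plus one spine cons:
'((a . 1) (b . 2))   ; 2 conses per entry -> 4 conses total
; i.e. (cons (cons 'a 1) (cons (cons 'b 2) nil))

; Each entry of a two-element alist needs an extra cons for the nil
; terminator of the inner list:
'((a 1) (b 2))       ; 3 conses per entry -> 6 conses total
; i.e. (cons (cons 'a (cons 1 nil))
;            (cons (cons 'b (cons 2 nil)) nil))
```

Three conses per entry drop to two, hence the roughly one-third savings.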
But isomorphic to a hash table? The most official way we can compare them is with 'tablist and 'listtab, which use the list-of-two-element kind of alist.
Also, IIRC, Rainbow displays tables as #hash((a 1) (b 2)), and I couldn't be happier. There's so much " . nil" cruft when viewing big tables in official Arc.
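For reference, the round trip through those two utilities looks like this (sketched from memory of arc.arc; exact ordering of the output list isn't guaranteed):

```lisp
; 'listtab consumes the list-of-two-elements kind of alist:
(listtab '((a 1) (b 2)))   ; -> a table mapping a -> 1, b -> 2

; ...and 'tablist produces the same kind back:
(tablist (obj a 1 b 2))    ; -> ((a 1) (b 2)), in some order
```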
could potentially eliminate the need for tables in Arc's core
Arc doesn't have enough table support. XP Keys are compared via Racket 'equal? (or via weirder methods in Rainbow and Jarc), and I haven't gone to the trouble to make tables that somehow dispatch via an extensible 'iso.
I want efficient lookup in big tables for the sake of Lathe's namespace system and Penknife's environments, and I'll get that by dropping to the underlying platform if I need to--I already do for weak tables--but I'd rather not. If official Arc ever removes table support, I hope it also adds 'defcall so I can put tables back in.
- solve the optional arg problem
- permit apply to be subsumed by the dot notation
How are those related? The only point of connection I see is that they're other things that could use dotted lists, but even that's not especially true for optional args. Did you mean to say that you suspect some change regarding dotted lists (or just the way we look at them) will help with both alists and these other cases?
I'm talking about the core notion of a hash table. It's composed of key-value pairs, not key-value "lists of two". :P This is a restatement of bogomipz's point made elsewhere in this thread.
> Arc doesn't have enough table support.
If alists were better supported, you could use them in place of tables in every case except where the utmost efficiency is required.
But I'm not sure it's even correct to frame this as an axioms vs. efficiency debate. Something I've learned from PicoLisp is that heterogeneous data structures slow down the general case by complicating memory allocation and garbage collection. PicoLisp manages to be a fast interpreter (say what?), in part because it uses the cons cell for everything.
> I'd much prefer to write '((a 1) (b 2)) rather than '((a . 1) (b . 2)) and destructure using (let (k v) ...) rather than (let (k . v) ...).
I think this is a cosmetic issue that has to do only with our visual representation of cons pairs and Arc's incumbent ssyntax.
For example, if you changed the ssyntax so that a.b expanded to (a . b) instead of (a b), then these snippets would be more pleasant to write: '(a.1 b.2) and (let k.v ...). I'm not actually proposing this particular solution, but it should illustrate my point that the issue is only syntactic/cosmetic.
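Under that hypothetical ssyntax (again, not a proposal, just an illustration), the dotted-alist idioms would read like this:

```lisp
; Hypothetical: if a.b read as (a . b) instead of (a b), then an
; alist literal and its destructuring could look like:
'(a.1 b.2)           ; would read as ((a . 1) (b . 2))

(let k.v (car al)    ; would destructure as (let (k . v) ...)
  (list k v))
```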
> How are those related? [...] Did you mean to say that you suspect some change regarding dotted lists (or just the way we look at them) will help with both alists and these other cases?
Well, I did say I still have some details to work out. ;)
I think your paraphrase is accurate. A "change regarding dotted lists (or just the way we look at them)" is what I was trying to express with "judicious use of conses" in the grandparent.
A pair is a list of two in English because English lists aren't nil-terminated. But Arc lists are, so we're talking about the difference between (key . val) and (key . (val . nil)).
I don't have a great answer to your triple/singleton question yet except to ask that you consider the following:
- The fundamental data structure of lisp is the cons pair, so perhaps pairs warrant some special treatment over singletons, triples, etc.
- The demand for associative arrays in general-purpose programming is far greater than that for any kind of triple-based data structure, which is why tables have their own type in Arc to begin with
Update: Cons pairs are so powerful that we've used them as the base for almost our entire language. And yet the associative array structure (which screams "pair"!) that we've made from them (i.e. alists) is so inadequate that we all outsource that functionality to tables instead. Around tables we've then developed the conveniences for syntax, etc.... Doesn't this seem a bit kludgy for The Hundred-Year Language?
The main advantage of cons pairs, in my mind, is that they're all the same size, so it's easier to reason about them and memory-manage them on a low level. They're also just as powerful as they need to be to support an expressive language. But that doesn't make them ideal abstractions for exploratory programming, especially when an equivalent abstraction in the same language takes fewer characters to type out and is even better supported thanks to 'map, 'any, etc.
I've been overly dramatic here too. I mostly wanted to help you make sure you were on a path that held water while giving you some hooks to convince me by... but I brought some external pet peeves into the mix and got worked up. XP Please do continue with your train of thought. ^^ Here's hoping the train mixes underwater hooks, or something.
It's something about Arc's built-in types that bothers me. They seem so ad hoc. You have this beautiful axiomatic thing going on in the core with conses, and then suddenly tables enter the mix. From that point forward, odd utilities get defined with an if branch that checks for the table type.
In this thread, I've been worried about tables cluttering the core language and you about them not being well-supported enough. In truth, I think both of our concerns are legitimate (yours is for sure, because tables really are better than alists for some applications). The problem is that the present implementation doesn't do either of them justice.
I'd like to know what you think of this proposal: keep the core language definitions to symbols and conses. Then support each additional type in a dedicated file (e.g. numbers.arc, tables.arc, queues.arc). These types can either reach down into Racket to borrow one of its types (likely for numbers or tables) or be annotated constructs built from existing types (likely for queues, trees or alists), and then use the extend idiom to give them support in the various utilities and the reader.
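A minimal sketch of what a dedicated queues.arc might look like, assuming an Anarki-style 'extend whose test guards the new clause (the names and rep layout here are illustrative, loosely modeled on arc.arc's queues):

```lisp
; queues.arc: an annotated construct built from existing types.
(def queue ()
  (annotate 'queue (list nil nil 0)))  ; (items last count)

; ...and then teach existing utilities about the new type:
(extend len (x) (isa x 'queue)
  ((rep x) 2))   ; third slot of the rep holds the count
```

The point being that nothing in the core itself ever has to mention the queue type.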
"support each additional type in a dedicated file.. either reach down into Racket to borrow one of its types or be annotated constructs built from existing types, and then use the extend idiom to give them support in the various utilities.."
or defgeneric? 8-) I was moved by the same concerns you describe: I never want to see an (if (isa x 'table) ..) in arc code.
Agreed with both of you, but I'd go further: I don't want to see (isa x 'cons) or (isa x 'sym) either, if possible. I'd rather every type be treated as equally non-fundamental. Of course, s-expression syntax special-cases those types sorta intrinsically, but I'm sure 'defgeneric could be used along with one of aw's "access the Arc compiler from Arc" patches. ^_^
It might be difficult and/or impossible though, considering that 'defgeneric needs to be defined in terms of something. So does calling, since the most obvious way to specify a custom calling behavior is to give that behavior as a function to call! XD
Like so many other opinions of mine, this is something that's going into Penknife if at all possible, even if the core currently needs a bunch of rewriting to get it to work.
To fix the "'defgeneric needs to be defined in terms of something" issue, I'm currently considering having most things be built-in rulebooks, with rulebook being a built-in type if necessary.
For the calling issue, I'm going to have the built-in call behavior try certain hardwired things first, and only move on to the customizable calling rulebook if those don't work. I intend for it to be possible to replace the interaction environment with one that uses a different call behavior, so even that hardwired-ness should be kinda seamless with the language.
For now, these things are all hand-wavy, and I'm open to better ideas. ^^
> but I'd go further: I don't want to see (isa x 'cons) or (isa x 'sym) either, if possible. I'd rather every type be treated as equally non-fundamental. Of course, s-expression syntax special-cases those types sorta intrinsically
Wow, I'm really interested in whether there's a way to have s-expressions that don't special-case conses and symbols. shader's just-in suggestion makes me think there might be a way to merge conses and symbols into a single type, though. Could it be possible?
Essentially, all you need to do is extend 'ac, since compiling is almost all that happens to Arc expressions. In the short term, there's no need to worry about whether a custom type represents a function call, a literal, etc. As long as it compiles, you can start returning it from macros or reader syntaxes.
In the long term, there may be other things that would be useful to extend, like 'ac-macex, 'ac-expand-ssyntax, and 'expand=. Also, it may be easier for a custom syntax type to support 'expand= if there's a separate utility it can extend in order to have all the functionality of a function call. That way it can be 'sref'ed.
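As a sketch of the short-term approach, assuming 'ac has been exposed to Arc and made extendable (as in aw's patches; 'my-syntax and 'expand-my-syntax are purely illustrative names):

```lisp
; Teach the compiler about a custom syntax type by expanding it
; into something 'ac already understands:
(extend ac (s env) (isa s 'my-syntax)
  (ac (expand-my-syntax s) env))
```

Once this clause is in place, macros and reader syntaxes can return 'my-syntax values freely, since compiling is almost all that happens to them.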
Thanks for this guide. It should come in handy for me. :)
If I start messing around with Arc's internals too hard, though, I may not be able to resist trying to turn it into an interpreter. I'm too attracted to the notion of first-class environments, eval, and fexprs lately. (In this case, I'd be extending eval rather than ac, correct?)
Or maybe I should just stop being such a damn purist. Have to take things one step at a time anyway. ac is a logical place to start.
It's been on my radar before. I've read some of the thread you linked to and some of what's on his GitHub. I like the general idea of giving ' and , more power to control evaluation, but I'm afraid I don't grok the language very well yet. :-/
Update: To clarify my confusion, the documentation talks a lot about closures (e.g. that ' does some kind of closure-wrapping), but I thought the language was supposed to be fexpr-based. I don't understand yet what fexprs have to do with closure-wrapping, but I really should study the language more closely.
Eight's documentation is in a terrible state (in part because there are still many things about which I've yet to make up my mind), so blame me for any confusion.
Here's the gist: Fexprs, like macros, take expressions as arguments (duh). Those expressions are made up of symbols (duh). Because a fexpr is evaluated at runtime, those symbols may already be bound to values when the fexpr is called. Eight keeps track of which symbol is bound to which value at the place the expression originated (where the programmer wrote it) --- even if you cons expressions together, or chop them into pieces. This eliminates the need for (uniq), but still allows for anaphoric fexprs when symbol-leaking is desired.
When I wrote the docs on GitHub, I called an expression plus any accompanying bindings a 'closure' (even though it wasn't a function). I also didn't know the word 'fexpr'. I've read a few dozen more old lisp papers since then, and hopefully on the next go-round my vocabulary will be much improved.
"there might be a way to merge conses and symbols into a single type"
Interesting idea. This might help a lot with implementing lisp in strongly typed languages. I suppose atoms could just be cons cells with nil in their cdr slot. The only problem then is how do you get the actual value out of an atom, and what is it?
This might help a lot with implementing lisp in strongly typed languages.
Don't most of them have option types or polymorphism of some kind? If you've got a really rigid one, at least you can represent every value in the lisp as a structure with one element being the internal dynamic type (represented as an integer if necessary), at least two child elements of the same structure type, and one element of every built-in type you'll ever need to manipulate from the lisp (like numbers and sockets). Then you just do manual checks on the dynamic type to see what to do with the rest. :-p
The only problem is then how do you get the actual value out of an atom, and what is it?
I say the programmer never gets the actual value out of the atom. :-p It's just handled automatically by all the built-in functions. However, this does mean the cons cell representation is completely irrelevant to a high-level programmer.
> I suppose atoms could just be cons cells with nil in their cdr slot.
Could they be annotated conses with symbol in the car and value in the cdr (initialized to nil)? nil itself could then be a cons with the nil symbol in the car and nil in the cdr. This should achieve the cons-symbol duality for nil that's usually desired. (Follow-up question: annotate is an axiom, right?)
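A sketch of what I mean, purely illustrative (make-atom is a made-up name, and the self-referential nil is only gestured at in the comment):

```lisp
; An atom as an annotated cons: name in the car, value in the cdr
; (initialized to nil).
(def make-atom (name)
  (annotate 'sym (cons name nil)))

; nil would then be the atom whose car is the nil symbol and whose
; cdr is nil, giving it both its symbol face and its empty-list face.
```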
I don't want to see (isa x 'cons) or (isa x 'sym) either
Totally with you there.
I don't want to get too hung up on 'purity'. It's ok to use tables in the core if you need them for defgeneric or something. It's ok to have a few isas early on. iso is defined as a non-generic bootstrap version in anarki before eventually being overridden, so stuff like that seems fine to me. I just want to move past the bootstrap process as quickly as possible.
iso is defined as a non-generic bootstrap version in anarki before eventually being overridden
Sure, that's an okay way to go about it. ^_^ Since I'm doing the Penknife core library stuff from the top down right now, I'm just writing things the way I want to write them, trying to determine what axioms I need before the core library is loaded. If the high-level axioms are defined in another lower-level library, that's just fine, but I don't know why I'd bother with that when I can just put them in the Arc part of the core. :-p
I think I had trouble digesting it a few months ago because it depended on so many utilities I was unfamiliar with: vtables, defmethod, pickles (and it was compared with extend, which I didn't understand back then :-o ). Giving it another try...
vtables and pickles aren't utilities, just implementation details for defgeneric.
Basically, vtables contains a hash table for each generic function, mapping a type to an implementation: "If len gets a string, do this. If it gets a table, do that." The body given to defgeneric sets up vtable entries for a few default types (cons, mainly :), and defmethod lets you add to vtables later.
If the generic function doesn't find an entry in vtables it falls back on searching the pickles table for a procedure to convert that type to cons, before retrying.
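Here's a rough sketch of that dispatch logic (illustrative only, not Anarki's actual code; vtables* and pickles* stand in for the real tables):

```lisp
(= vtables* (table))   ; fn name -> (table mapping type -> implementation)
(= pickles* (table))   ; type -> procedure converting a value to a cons

(def dispatch (name x . args)
  (let vt (or (vtables* name) (table))
    (aif (vt (type x))
          (apply it x args)                  ; direct method for this type
         (pickles* (type x))
          (apply dispatch name (it x) args)  ; pickle to cons, then retry
         (err "no method for" name (type x)))))
```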
Let me know if this makes sense.
(Names: I believe vtables comes from C++, and pickle is the Python primitive for serialization.)