
Cognition: A new antisyntax language redefining metaprogramming

noelwelsh
27 replies
2d10h

Some advice to the author: you can considerably tighten up your writing by putting the most important things first. Take the introduction (which is, bizarrely, not the Introduction that comes three paragraphs later). There are over 300 words before the actual project, Cognition, is mentioned (in the second sentence of the second paragraph). All this stuff about Lisp is great, but is that the most important part of the project? Should it not be something about the project itself?

When I'm reading something informational (rather than recreational) I'm always asking myself "is this worth my time?" You should address this as soon as possible, by telling the reader what the document is about right at the start. "Cognition is a new language exploring user modifiable syntax" or something similar. I didn't get past the first four paragraphs because I couldn't determine it was worth continuing.

ordu
10 replies
2d9h

> When I'm reading something informational (rather than recreational) I'm always asking myself "is this worth my time?"

It isn't. You will not be using this language, and even if you do, you'll get all the information from the documentation, not from this article. If your time is money, you wasted it reading the article.

Really, why do some people believe that all the content on the internet must be attuned to their personal quirks? Why do they believe it is better to change the internet than to adapt to what is already here? It is text, not video or something sequential. You can scan it diagonally, looking for something that interests you. You can reject it if nothing is found. Or you can return to the beginning and read sequentially from there. And these options are available for text structured in any way. I highly recommend learning the technique; it can deal with whole books, by selecting the useful pages to read and rejecting most of the others, which tell you nothing new.

I'd argue that diverse article styles are much better, because they force you to consciously and actively sort through the information you are consuming. You shouldn't do it passively, because your mind becomes lazy and stops thinking while consuming.

OTOH, I would agree with you if it were not text but a video. I hate videos because you need to decide upfront whether you are investing time into watching or not. 2x speed and 5-10 second skips help somewhat, but do not solve the problem.

ikari_pl
2 replies
2d8h

Can't agree more; all the arguments in this comment are also why I don't understand video and YouTube popularity at all. I can't do anything to quickly get a grasp of what's in a video, or whether it answers my question. Sometimes, even if the answer is a yes or no, the sequential format forces me to wait a few minutes to learn it. By that time, I'll have forgotten why I wanted it in the first place.

pmontra
1 replies
2d3h

In my experience, videos about how to cut wood or use tools are infinitely better than reading the equivalent text. Videos about writing software are infinitely worse than text. The difference is that wood and tools are 3D objects, software is text.

runevault
0 replies
1d23h

I'd say it depends on the software. If you are just writing code that manipulates text, I completely agree. If you're dealing with 3D rendering such as Blender or game engines, or with audio like DAWs, etc. that have components beyond the raw text, video can have value, because just staring at the code doesn't tell you what watching/hearing it in motion will.

throwaw12
1 replies
2d8h

> It isn't. You will not be using this language, and even if you do, you'll get all the information from the documentation, not from this article. If your time is money, you wasted it reading the article.

Hard disagree; we don't always read things that have a direct impact on what we do. Sometimes ideas in one area can spark solutions for other problems.

Imagine aerospace engineers never looking at birds, or military equipment not getting inspiration from chameleons.

ordu
0 replies
2d7h

> we don't always read things that have a direct impact on what we do. Sometimes ideas in one area can spark solutions for other problems.

Of course, but this effect is unpredictable. You could get an idea while reading some fiction, because the plot sparked a chain reaction of associations in your brain. But whether it happens or not is not predictable from the abstract of an article. GP is clearly talking about something else.

> military equipment not getting inspiration from chameleons.

A good example. If your goal is to fight enemies, it would be counterproductive to seek out black swans by finding and studying new life forms. But you can still do it in a "fishing mode", just looking for something that seems interesting.

Science lives on government support exactly because it is unpredictable. It can sometimes discover electricity, but most of the efforts of scientists yield little to no useful knowledge. One cannot know a priori whether their research will have a big impact or not.

thefaux
1 replies
2d

> Really, why do some people believe that all the content on the internet must be attuned to their personal quirks?

I read the OP as constructive feedback, not an attack. I also think their advice is practically bog-standard writing advice, not a personality quirk.

I would also suggest that you might look in the mirror at your own comment when accusing someone of imposing their personality quirks on someone else.

arnorhs
0 replies
1d17h

Constructive feedback usually has a more inviting, accepting, and helpful tone. The commenter you are referring to seemed dismissive and slightly rude to me, tbh.

Aside from the tone, the specific feedback was also a miss, imo. It was written as if the commenter were reading a marketing page or a Show HN post, which this is not. It is simply a blog post. An article. To me it provided a bunch of context and was kind of enjoyable.

When you have a marketing page or some kind of "check it out" post, there is a certain expectation on the reader's part, and it's even reasonable to complain when the post does not get to the point. I agree with that.

This is not such a thing.

flir
0 replies
2d7h

Inverted pyramid structure is a useful tool.

Sometimes it's inappropriate, and ultimately the author is the best judge of this.

If you view text composition as a UX problem, it will help you figure out when to use the tool and when not to.

Examples where it isn't used (clickbait headlines, recipe blogs with three pages of meandering before they get to the ingredients, SEO "optimised" youtube videos) are, IMO, examples of dark UX patterns more often than not. But your use-case may be valid.

(This comment was written using an inverted pyramid structure.)

barrkel
0 replies
2d6h

The article is badly written if the author's intent is to communicate their ideas to other people.

_gabe_
0 replies
2d4h

For what it’s worth, I was also thinking this is a very wordy article. It has lots of asides, and seems to get distracted from the main point. It gets distracted enough that I have a hard time following it, which is not conducive to what any writer wants to achieve: telling some sort of story.

kwhitefoot
5 replies
2d3h

> All this stuff about Lisp is great, but is that the most important part of the project?

It's clearly not the most important part of the project but it serves to illustrate the kind of problem that the project intends to solve. Without something like this section the following sections would be even more difficult to understand.

reikonomusha
3 replies
2d3h

Well, the other problem is that the stuff the article says about Lisp is incorrect and gravely misinformed.

For instance:

> This makes the left and right parenthesis unchangable from within the language

The parenthesis character in Common Lisp can be redefined, even just temporarily, to anything you want.
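
To illustrate — a minimal sketch using only standard readtable machinery. The bracket characters are just for readability here (redefining ( itself works the same way):

    ;; A readtable in which [ and ] act as list delimiters.
    (defvar *bracket-readtable* (copy-readtable nil))

    (set-macro-character #\[
                         (lambda (stream char)
                           (declare (ignore char))
                           (read-delimited-list #\] stream t))
                         nil *bracket-readtable*)

    ;; Make ] terminate reads, the way ) does.
    (set-syntax-from-char #\] #\) *bracket-readtable*)

    ;; With *readtable* bound to *bracket-readtable*,
    ;; [+ 1 2] reads as the list (+ 1 2).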

This is a misunderstanding and shouldn't be in your opening play.

It sounds like the author thought of something they felt was neat, but felt they had to justify its existence first by a quick pot-shot at another similar thing. It would be more effective if either it were correct, or they just described their own creation from the get-go.

ret2pop
1 replies
1d21h

Writer of this article here.

It was never meant to be a pot-shot, and I have nothing against Lisp. I can see why it reads that way; we added it because we wanted to illustrate why people should care.

As to your claim about us being wrong: I don't have an issue with being wrong, and maybe we are. At the same time, I think it is possible that there are misunderstandings causing people to believe we aren't doing something new. Again, maybe we're not.

We're two 18-year-olds, fresh out of high school. It's a research project, but we're not graduate students.

A lot of these comments are claiming it's not new because reader macros exist. From my understanding, our tokenization system is unique because it can all be done at runtime without backtracking, executing everything instantly, which is possible because Cognition always makes use of the text read in, and never makes use of anything not yet read in, which means you don't have to backtrack. I mean, you could backtrack, but it would be less elegant.

If I'm wrong about this then that's fine but then we still made something cool without even knowing it existed beforehand.

pjc50
0 replies
1d10h

For a lot of this stuff, there is no "wrong" because it's a matter of taste and familiarity, rather like asking which of the human alphabets is "wrong". It's undoubtedly very clever, a reimagining of lexing from scratch.

On the other hand, I'm adding it to my list of examples of "left-handed scissors" languages, along with LISP and FORTH themselves: languages which a few percent of people regard as more intuitive, but which most users do not, preferring ALGOL derivatives.

killerstorm
0 replies
1d4h

Common Lisp comes with the absolute best syntax customization tooling I know of.

Like if you want to integrate JSON or XML into the language syntax you can.

Rather ironic to use it as an example of inflexible syntax.

Here's an example which adds completely integrated JSON support in under 100 lines of code, using only standard language APIs: https://gist.github.com/chaitanyagupta/9324402
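
For a taste of the mechanism — this is a hedged sketch, not the gist's code — a dispatching reader macro can build data at read time:

    ;; Make ] a terminating character, like ).
    (set-macro-character #\] (get-macro-character #\)) nil)

    ;; A #[ ... ] literal that reads its contents into a vector.
    (set-dispatch-macro-character #\# #\[
        (lambda (stream subchar arg)
          (declare (ignore subchar arg))
          (coerce (read-delimited-list #\] stream t) 'vector)))

    ;; #[1 2 3] now reads as the vector #(1 2 3).

The gist linked above does the same sort of thing, just with a full JSON grammar.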

wruza
0 replies
2d3h

It’s kind of a tradition now for flashy projects (not this one) to not mention the problems they are solving, if any, and god forbid they explain the shortcomings and trade-offs made. Feels like marketers changed careers to programming but failed to get the idea. That is… frustrating, especially when praised by “clients”. Imagine tfw postgres would go full-disney on their frontpages.

kagevf
3 replies
2d4h

When I saw the headline, my first thought was "what about Lisp macros?" so at least for me it starts out by addressing exactly that question ...

reikonomusha
2 replies
2d3h

The author did not address Common Lisp's reader macros, or Racket's #lang syntax. It's not like either of those languages are obscure (relatively speaking).

kagevf
1 replies
2d2h

Right, but they hedge their way out of it with "This makes the left and right parenthesis unchangable from within the language (not conceptually, but under some implementations it is not possible)" so it's not even clear which "Lisp" they're talking about. If I'm not mistaken, ( itself is a reader macro in CL.
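
You're not mistaken — in a stock readtable, ( is driven by an ordinary reader-macro function, which the standard exposes (sketch of a quick REPL check):

    ;; get-macro-character returns the function handling a character
    ;; in the current readtable, or NIL for ordinary constituents.
    (get-macro-character #\()   ; => a function (plus NIL for non-terminating-p)
    (get-macro-character #\a)   ; => NIL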

lispm
0 replies
2d1h

It's also not that relevant. You can add different syntax variants; the Lisp evaluator doesn't see parentheses in any way, just actual interned data.

gwd
1 replies
2d10h

The order seemed pretty rational to me. Describe the problem, then introduce your solution. I knew pretty much within a few sentences this was going to be some quixotic solution to a "problem" 99.999% of people don't care about (including me, as I've heard of Lisp but never used it outside of emacs config files), but I kept reading anyway, because why not.

ikari_pl
0 replies
2d8h

it also had headers, and is structured in a way that solved all the above issues for me entirely

wruza
0 replies
2d3h

As a guy who’s aware of lisp ways, I found first few paragraphs absolutely useful for establishing context and hinting at what’s next, at the same time retaining my attention. TFA is fine, just not everyone is its audience.

If it was “look, shktshfdthjkl\n\nbhhj, so cool”, it would signal rocket rainbow unicorn and lose me at that. IMO we need more properly structured non-SV prose like this in tech, not less.

nemetroid
0 replies
2d8h

I think it’s completely fine. The text as written identifies what problem it’s trying to solve within the first two sentences.

That, to me, is much more useful to gauge my interest than your proposed introduction.

drittich
0 replies
2d4h

I tend to agree. I'm interested in the concept, but the opening line seems to justify its need as a reaction to s-expression syntax in Lisp. Knowing nothing about that, I fear I'm going to miss the context of the whole article, and I also can't determine whether this is a straw man or not. And, as another commenter mentioned, it makes this whole thing feel like it's serving a very niche need. Which doesn't jibe with the title, which is very generalized, and could be quite a compelling concept.

MarceColl
0 replies
2d4h

For me as a Lisp programmer I think it makes a lot of sense to start there since it sets the stage.

ashton314
11 replies
2d14h

I'm not persuaded this is a better idea than, say, Racket's ability to configure the reader layer [1]. This lets you create, for example, an embedded Datalog implementation [2] that uses Datalog syntax but interops with other Racket modules. (The underlying data model doesn't change.) This gives you the ability to metaprogram without being confined to S-expressions, but it does so in a high-level way.

It is very neat to see this kind of syntax bootstrapping. I think there's some value (in a researchy-sense) to being able to do that. But I'm not sure if there's something fundamentally "better" about this approach over Racket's approach.

Postscript: Lisp (and Scheme and Racket) macros typically operate on AST (typically because Lisp has reader macros and Racket has a full-bodied reader extension) but Rhombus [3] operates on a "shrubbery", which is like an AST but it defers some parsing decisions until later. This gives macros a little flexibility in extending the syntax of the language. Another interesting point in the design space!

[1]: https://docs.racket-lang.org/guide/hash-reader.html

[2]: https://docs.racket-lang.org/datalog/datalog.html

[3]: Flatt, Allred & Angle et al. (2023-10-16) Rhombus: A New Spin on Macros without All the Parentheses, Proceedings of the ACM on Programming Languages. https://doi.org/10.1145/3580417

lispm
8 replies
2d11h

> Lisp ... macros typically operate on AST

In Lisp they don't. See Emacs Lisp, Common Lisp, ISLISP. A Lisp macro gets only passed some data and returns some data. There is nothing like an AST.

If we define a macro foo-macro and we call it:

    (foo-macro ...)

then ... can be any data.

Example:

    (defmacro rev (&rest items)
      (reverse items))

The macro REV above gets data passed and reverses it: the &rest list of ITEMS gets reversed. ITEMS is the list of source arguments in the macro call.

We can then write:

    (rev 1 2 3 4 +)

    (rev (rev 10 n -) (+ a 20 b) (rev 30 a *) list)

Example:

    CL-USER 7 > (let ((n 4) (a 3) (b 2))
                  (rev (rev 10 n -)
                       (+ a 20 b)
                       (rev 30 a *)
                       list))
    (90 25 -6)

There is no syntax tree. All the macro does is reverse the list of its source arguments.

If I let the macro describe the thing it gets passed, then we get this:

    CL-USER 8 > (defmacro rev (&rest items)
                  (DESCRIBE items)
                  (reverse items))
    REV

    CL-USER 9 > (rev 1 2 3 4 +)

    (1 2 3 4 +) is a LIST
    0      1
    1      2
    2      3
    3      4
    4      +
    10

As a side effect, we can see that the macro gets passed a simple list. A list of numbers and a symbol ("symbol" is also a data type). Not text. Not an AST.

Also data which is possibly not read, but computed by other code. We can compute the arguments of the macro REV and pass the thing to EVAL. Works the same as above.

    CL-USER 10 > (eval (append '(rev) (loop for i from 1 to 4 collect i) '(+)))

    (1 2 3 4 +) is a LIST
    0      1
    1      2
    2      3
    3      4
    4      +
    10

The macro only gets that data and can do anything with it. There is no notion of an abstract syntax tree: the data does not need to be a tree, it does not need to be valid Lisp code, and it carries no syntactical information. Generally, a macro also does not need to return valid Lisp code. All that matters is that the evaluator eventually sees valid Lisp code, not what any individual macro returns.

In Lisp, the "reader" by default only parses a data layer: symbolic expressions. EVAL, macros, and other Lisp functionality mostly get data passed. EVAL has to figure out what (a b c) actually is as a program. It can traverse it in an interpreter or compile it. A compiler may internally create AST representations -> but that is up to the implementation.

The Lisp language, then, is typically defined not over a text syntax, but a data syntax.

A Lisp interpreter processes s-expressions during execution, not text. It's a "List Processor", not a Text Processor. Lisp, not Texp. ;-) The function COMPILE gets an s-expression passed, not text.

Racket and Scheme have other macro systems.

agumonkey
4 replies
2d9h

From my limited understanding, it seems like lispers mean AST in an ad-hoc way. There's no statically predefined solid structure describing what an if-node or a lambda-node is. We all agree to squint and look at (<idea> <parameters-potentially-recursive>) as an AST in spirit.

Scheme did implement actual syntax objects, and other lisps may have a concrete AST (pun slightly intended) layer, but it's not required to enjoy the benefits of sexps as code and data.

my two cents

lispm
3 replies
2d9h

An "AST in spirit" is in our mind, but not the machine.

    (let ((s (s s)))
     (flet ((s (s) (declare (type s s)) s))
      (tagbody (s) s (go s))))

The function READ has no idea what S is: variable, operator name, data symbol, type, go tag, function name, macro name, ...? For each occurrence above we don't know what S is. We would need to parse it according to some syntax (and possibly know the current state of the runtime).

An AST would be the product of a parser, and the parser would parse the code above according to some provided syntax. The AST would then encode a tree built from that syntax, with the nodes classified. In Lisp, symbols and lists have multiple uses, and the s-expression does not encode which use is meant: is it data, or is it some kind of syntactical element? There is also no defined parser and no syntax encoded at that level. READ is at best an s-expression reader; it has zero knowledge of what the lists and symbols are supposed to mean in Lisp. For the reader, (+ a b), (a + b), (a b +) are just s-expressions, not Lisp code.

agumonkey
2 replies
2d7h

I mostly agree, but it seems (I can only speak as a spectator/reader) that this lack of information was not a deal breaker for the community, or even for producing large systems. The type resolution will happen at the use site, and if `s` does not exist where it should (a binding in the environment, a tag in the right namespace) or is not what it should be, your code fails and people adjust.

Were there serious defects caused by this dynamic nature (honest question)? It seems to me that people adjusted to it without big troubles. Not that I'm against any improvement.

p_l
0 replies
2d6h

More complex structures are built from that input by macros and compiler machinery. Nothing says the compiler has to stick to it.

Nanopass shows an example of using "rewriting" while keeping to pretty much the same structure for the AST until you end up with native code.

lispm
0 replies
2d6h

The Lisp compiler will need to figure it out and complain about various problems: undefined function/variable/tag/..., argument list mismatch, type mismatch, etc.

The compiler will expand the macros at compile time and the generated source code then is checked.

One thing this macro system enables is macros which can implement relatively arbitrary syntactical extensions, not restricted by a particular syntax. That can be seen as useful flexibility or as a potential problem (-> it needs knowledge and discipline while implementing macros, with the goal of ensuring the maintainability of the code).

ndr
2 replies
2d10h

How is a list (of nested lists) not a tree?

lispm
0 replies
2d10h

It could theoretically be a graph.

For example I can make a circular list being a part of the source and the macro may process it.

Get the first two items from a list and add them:

    CL-USER 21 > (defmacro add-2 (a)
                   (list '+ (first a) (second a)))
    ADD-2

Now we construct a circular list and use the macro:

    CL-USER 22 > (add-2 #1=(1 2 3 . #1#))
    3

The circular list is really a part of the source code:

    CL-USER 23 > '(add-2 #1=(1 2 3 . #1#))
    (ADD-2 #1=(1 2 3 . #1#))

Another example: one could pass in a string and have the macro parse the string...

aidenn0
0 replies
2d3h

It usually is a tree, just not a syntax tree. For example, the "if" special form in CL has the syntax:

  (IF <test-expr> <then-expr> [<else-expr>])

But appearing in a macro expansion, it would just be a list whose first item is the symbol "IF" from the "COMMON-LISP" package. In addition, you could put e.g.

  (IF foo bar baz biff quux)

inside the body of a macro, and you would get a list that does not match the syntax of the "if" special form, despite superficially looking like such a list. This doesn't match anything one would call an "AST" in other languages, where an AST would enforce the syntactical correctness of special forms.

pjc50
0 replies
2d9h

I'm not sure that something which uses brainfuck as its default example is intending you to take it seriously. Personally I burst out laughing at the introduction of "metacrank".

aidenn0
0 replies
2d13h

Heck, I'm not convinced that this is a better idea than the Common Lisp readtable [1], and I think Racket's #lang is more ergonomic than the CL readtable.

1: Which is powerful enough to implement a C compiler in: https://github.com/vsedach/Vacietis

rukuu001
10 replies
2d15h

The "bootstrapping code for a very minimal syntax" reads like a practical joke. Their mention of Brainfuck set expectations appropriately:

  ldfgldftgldfdtgl
  df 
  
  dfiff1 crank f

redrobein
5 replies
2d15h

Well, the example _is_ building a brainfuck interpreter. I guess this is the perception that comes with having minimal syntax like this, but it's probably also a functional requirement. I'd recommend people read the whole thing before writing it off as a joke.

ret2pop
4 replies
2d14h

Indeed. I am the writer of this blog post and Matthew and I never intended for this to be a joke. In fact, we thought we had programmed something unique that nobody had done before, and the bootstrapping code syntax is minimal by design (you don't want to assume a default complicated syntax).

vidarh
1 replies
2d14h

Since you're here, note that I can't resize your text on Chrome on Android by default, and so I can't read your page on my phone at all without my glasses without switching the browser to desktop mode.

ret2pop
0 replies
2d14h

Oof. I'll fix that in some time.

jihiggins
0 replies
2d2h

i think you might lose people in the transition from the turing tape machine style bootstrapping to using full "words" for operations. it might help to compare it to something like regexes? (obviously they're not similar in the way that forth / stack langs are, but they're more commonly known and more commonly thought of as useful)

i think it'd also help a lot to have some gifs of the state machine operating on stuff

or maybe you could have something with "syntax" highlighting. a navigable timeline that changes the highlighting based on the current ignore lists and / or stack contents as you step through the code could be neat

BoiledCabbage
0 replies
2d10h

I think it's a really beautiful and innovative thing you've made here. And you should be proud of it.

I think a lot of people on HN nowadays can only see the world through the lens of Javascript and as a result really can't appreciate what this is.

vidarh
0 replies
2d12h

As mentioned elsewhere I think they do themselves a disservice by starting there, instead of starting with a reasonable set of delimiters etc., and then referring to an aside that explains how to bootstrap to that point. The lowest levels get too esoteric for most.

It's a bit like being taught functional programming by starting with composing everything from combinators, including church numerals for numbers. Sure, knowing those bits is worthwhile theory, but not until you're convinced about the higher level utility.

tomphoolery
0 replies
2d15h

yo check out this cool new program i wrote:

    ajksbdnfjasdnflaksdfbnf

giancarlostoro
0 replies
2d15h

I think I'd rather just learn Assembly... Lol this is comical to look at, but wow.

damiankennedy
0 replies
2d13h

I love the irony of proposing someone program like this without spelling mistakes immediately after a paragraph containing a spelling mistake.

kibwen
5 replies
2d14h

This is an interesting article, and I hope the authors ignore the snarky comments here and aren't discouraged from pursuing their dark magic rituals.

That said, personally I think that Forth is as philosophically pure as I'm willing to gaze up the ladder of programming purity. :P

ret2pop
4 replies
2d14h

Writer of this article: Thank you! I don't mind the snarky comments. In fact, I welcome them, as I think they are pretty funny themselves. We'll most certainly be working on more dark magic in the future.

pjc50
0 replies
1d10h

It reminded me of that as well. Very clever and very alien.

ulbu
0 replies
1d22h

you're supposed to touch a few nerves. that means you're showing something different from the comfort of most. something really different. the two most common reactions to this are irritation and laughter.

not much is novel nowadays. great job!

mst
0 replies
2d

I saw it as ... akin to poetry, almost?

Then again I was once a pure math grad, so "beautiful, fascinating, albeit completely practically useless" is something I mean as a compliment.

I need to read up on Stem and then come back and read this post of yours again to better appreciate it, I suspect. But that sounds like fun.

boerseth
5 replies
2d12h

A little tricky to read this. With the ground constantly changing underneath your feet, there's a feeling that rules and words get introduced and then redefined willy-nilly. The whole thing has a sense of "Numberwang" about it, which I think is part of why it comes across as satire. Another big part is no doubt how ridiculously the bootstrapping stage was written, but that seems intentional.

There's clearly something deep going on, but I will have to come back to this after an even deeper cup of coffee.

ret2pop
4 replies
2d12h

There's a lot to explain, and it has occurred to me that I have explained it in a suboptimal manner. (Writer of this post, by the way.) The problem is that there is just so much to communicate (the design for this language with Matthew took 3 weeks of back-and-forth over several hours a day). A lot to fill in for people who don't know me in real life.

binary132
2 replies
1d22h

I thought it was cool, if a little impenetrable without extra effort! It reminded me a bit of Urbit’s old theorycraft, which I think is still an open question as to whether it’s an elaborate ruse, so the idea did cross my mind that I was getting my leg pulled, but it seems to be in earnest. I have some similar thoughts in a related but different domain, but user-extensible syntax is certainly a nice and uncommon language feature.

ret2pop
1 replies
1d17h

oh yeah, Urbit; I'm quite aware of the project and can recognize that to some I might seem to be explaining things in an intentionally confusing manner. I do think the original intent of Urbit was basically to naturally select nerds (I mean, Curtis Yarvin talking about the old UseNet made it really seem that way), and no, that was not the intent here. It literally is just hard to figure out how to communicate the core concepts of Cognition and why we wanted it to be that way.

binary132
0 replies
16m

A high cognitive barrier to entry can serve as a certain sort of healthy gatekeeping for projects that might not be a good fit for everyone. :)

Modified3019
0 replies
2d10h

I’m a non programmer, and while I can often just barely grasp what might possibly be going on in a “normal” language, I can’t even begin to pretend to make sense of what you’ve made. I’m glad some are able to appreciate what you’ve done.

I suspect the reason it's suspected of being a joke is that the mention of Lisp and Brainfuck primes readers for one, combined with examples and concepts that seem to require a much stronger than normal technical background. So for the average Joe it ends up in the "turbo encabulator" zone, where it's not quite clear whether what's going on is real or in jest. The prerequisites for understanding just aren't there.

I also suspect a non-zero percentage of readers have involuntary audio flashbacks of Soulja Boy when they see that many “cranks” on a page.

trwm
4 replies
2d14h

> This is of course true, but there's something very broken with lisp: metaprogramming and programming aren't the same thing,

Metaprogramming and programming are the same thing. It's just that nearly every language, Lisps included (but hilariously not m4), gets quotation wrong. Lisp gets around this with macros, which let you ignore quotation and deal with meta-language statements expressed as object-language statements when they clearly should not be.

This issue stems from the fact that space in both the object and meta language is treated as the termination of an atom, without distinction between the two.

> Cognition is different in that it uses an antisyntax that is fully postfix. This has similarities with concatenative programming languages

Postfix languages are a dual of prefix languages and suffer from the same issue. You either need to define the arity of all symbols ahead of time and forgo higher-order functions, or you need a pair of delimiters which can serialise a tree. Relying on an implicit zeroth-order stack solves the problem in the same way a lobotomy solves depression.

ret2pop
1 replies
2d14h

Thanks for your feedback! I'm not accusing you of not reading the full article, but you should if you haven't already. Also, we don't know the extent to which we made anything new; if you think you can do what we're doing in lisp in some way, you're free to prove us wrong.

JonChesterfield
0 replies
2d2h

I'm pretty sure postfix and prefix encode the same space of languages. They parse to the same tree.

rendaw
0 replies
2d13h

Do you have an example of quotation in lisp vs m4? I'm interested in what you asserted, but I think I need something more concrete.

hughesjj
0 replies
2d11h

> Relying on an implicit zeroth order stack solved the problem in the same way a lobotomy solves depression.

I mean, fair and flowery, but an implicit stack has been a thing ever since proto-computers/calculators. Just as a lobotomy reduces the availability of higher-order processing, so too does going back to the utmost primitives when calculating a string of instructions.

https://www.hpmuseum.org/rpnvers.htm

im3w1l
4 replies
2d13h

So just a thought I had reading this. Can't you do arbitrary metaprogramming in basically any language by just adding a single keyword...

STOPPARSE

It would be defined as causing the normal parser to stop, and any text / bytes after that keyword is fed as input to the program.
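A toy version of the idea, sketched in Python (the STOPPARSE marker and the `handle` hook are made up for illustration):

```python
# Toy STOPPARSE: the host language parses normally up to the marker, and
# everything after the marker is handed as raw text to user code.
def run(source):
    host, _, rest = source.partition("STOPPARSE\n")
    namespace = {}
    exec(host, namespace)              # normal parse/eval up to the marker
    handler = namespace.get("handle")  # user-supplied parser for the tail
    return handler(rest) if handler else rest

program = """\
def handle(text):
    return text.upper()
STOPPARSE
any syntax at all, even ((unbalanced
"""
print(run(program))  # → ANY SYNTAX AT ALL, EVEN ((UNBALANCED
```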

vidarh
1 replies
2d12h

Ruby has that... But you're then just implementing your own transpiler or interpreter in full, instead of "just" redefining parts of the grammar. I'm not sold on this approach, but it's an interesting read at least.

mst
0 replies
2d

I know of multiple ways to do it in perl - 'use Foo;' can introduce a source filter that slurps the entire remaining file text, thus hiding it from the perl compiler.

__DATA__ sooort of counts and is probably the closest to pure STOPPARSE as GP is thinking of.

I also, once, wrote https://p3rl.org/Devel::Declare which basically grabbed the current read position and Did Things, including calling back into the existing perl parser to not have to re-implement everything - that was how perlers first got access to a mostly* working 'method' keyword.

(now there's a proper way of doing that in perl core, and Devel::Declare is (very happily to me) obsolete, but it was essential to prove the concept and the user interest to justify adding the actual feature to the core interpreter)

[*] the caveats are many and believe me I'm well aware the entire thing was a giant hack from the moment I first started writing it, but rather than trying to write them out I'll just say "however hacky you imagine this was, what I actually did is probably worse" and leave it there.

ret2pop
0 replies
2d12h

Yes, although then at that point you'd be writing your own parser and interpreter basically, which Cognition automates to some extent

dragonwriter
0 replies
2d11h

Not really, not as most people (I think) would understand the term.

That just gives you arbitrary input. You can write your own interpreter for a language and do the input in that language, but that’s not “arbitrary metaprogramming” in the host language, at least as most people understand it. Or you can leverage whatever metaprogramming facilities are offered by the host via an external DSL you implement, which may be a more ergonomic way of metaprogramming the host language, but still is limited to the metaprogramming facilities exposed by the host language, so it’s only “arbitrary metaprogramming” to the extent that the host language already exposes arbitrary metaprogramming.

burlesona
4 replies
2d15h

So at first I thought this was a joke. But… read to the end. It’s perhaps a strange art project, but it’s not a joke.

ret2pop
3 replies
2d15h

Hi, I'm the one that wrote this blog post! A lot of comments treat this project as a joke or some form of satire, but we've spent 3 months writing this after initially coming up with the idea because we thought nobody had created such a general programming language before. It's good to get some feedback and maybe we've been in our own heads for a little too long, but it's good that someone gets it.

tromp
0 replies
2d9h

we thought nobody had created such a general programming language before

How is this more general than a language like Binary Lambda Calculus, which can bootstrap into any other language, like Brainfuck in 112 bytes of bootstrap, or even into Cognition itself?

ryanlbrown
0 replies
2d14h

I read it as probably containing something deep that I should save for when I have time to go through the details. Maybe we syntaxlings need time to get comfortable.

JoelMcCracken
0 replies
2d14h

Yeah it seemed clear to me it was a research project/demonstration of possibilities. Very cool.

planckscnst
3 replies
2d14h

This is really cool! For those who take it as a joke or who scoff at the choices (like the whitespace thing), at least read until section 4 and then skip to section 9.

vidarh
2 replies
2d12h

Regarding the "whitespace thing", I think they're doing themselves a disservice by not starting with predefined rules that allow whitespace-delimited words, and deferring until later the demonstration of how you can bootstrap to that point.

In general, show more of what is possible in a more accessible way first, before presenting "line noise".

Overall, that document would be helped by some editing to put up front more of what a reader who hasn't spent months thinking about this will find interesting.

ret2pop
1 replies
2d11h

ding! I think the bootstrapping code might've bitten us. Of course there's no practical reason why we _needed_ to not have the delim flag flipped and a couple of delimiters defined, but there's a conceptual symmetry in it being the way it is. Still, I agree in large part with your point.

It seems as though you've read the entire article and understood a decent portion of it. I'm impressed because I think I explained this suboptimally.

vidarh
0 replies
2d11h

I think it's fine to show that you can do it, I think the main thing is to flip the order a bit.

E.g. "here's a cool thing we can do <demonstrate the outcome of significantly changing a readable syntax>" to hook people, "here's how <show how you change syntax with higher level helpers>", "and if you really want to know how to bootstrap this from basics <here comes the linenoise>".

Maybe compare how e.g. Forth is often introduced with how people describe the bootstrapping of a simplistic Forth like Jonesforth or Sectorforth [2]. Showing people how they can define their own words, and how that fundamentally changes how they work with the language afterwards, is cool to a lot of people who have no interest in details like how you can implement even numbers with a minimal set of primitives (e.g. Sectorforth relies on that: it doesn't have builtin numbers [3]).

Both are interesting to me, but I'm weird, and I think for most people it'd be easier to maintain their interest if those two aspects are either separate articles or at least if the bootstrapping is relegated to a standalone section they're clearly told they can skip.

[1] https://news.ycombinator.com/item?id=31368212

[2] https://github.com/cesarblum/sectorforth

[3] The Sectorforth Hello world defines every numeric constant it needs like this:

    : -1 ( x -- x -1 ) dup dup nand dup dup nand nand ;
    : 0 -1 dup nand ;
    : 1 -1 dup + dup nand ;
    : 2 1 1 + ;
    : 4 2 2 + ;
    : 6 2 4 + ;
Which is fun if you're a language geek. Not so convincing if you want to know if Forth is fort you [EDIT: That mistake was wholly unintentional, but I'll leave it].

nabla9
2 replies
2d9h

His premise is not true.

Common Lisp has reader macros. You can change the syntax to anything you like. There is even a Fortran compiler written using reader macros to read Fortran syntax.

Common Lisp has:

1. reader macros, for read time,

2. macros,

3. compiler macros, for compile time.

The macro language for all of these is Common Lisp itself.

Metaprogramming has little to do with macros or syntax. The term refers to the ability to manipulate the semantics and meaning of types, interfaces, classes, methods, and so on. There is CLOS and the Common Lisp Metaobject Protocol for that if CL itself is not strong enough.

Zambyte
1 replies
2d4h

[...] meaning there will always be rigid syntax within lisp (its parentheses or the fact that it needs to have characters that tell lisp to read ahead).

They are talking about CL reader macros here. You can use a different tokenizer in CL with reader macros, but you have to use an expression in the readtable to say that you're switching tokenizers. It seems like in Cognition you can call a function and that will switch tokenizers in the caller's context.
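A rough illustration of that difference (Python, with entirely invented mechanics; neither CL's readtable nor Cognition works like this internally): a reader whose tokenizer is a slot that a token in the stream can swap out mid-read:

```python
# Illustrative only: a reader loop whose tokenizer can be replaced by a
# token ("chars!") encountered in the stream itself.
def by_words(text, i):
    while i < len(text) and text[i] == " ":   # skip delimiting spaces
        i += 1
    j = i
    while j < len(text) and text[j] != " ":
        j += 1
    return (text[i:j] or None), j

def by_chars(text, i):
    # one token per character, spaces included
    return (text[i], i + 1) if i < len(text) else (None, i)

def read(text):
    tokenize, i, out = by_words, 0, []
    while True:
        tok, i = tokenize(text, i)
        if tok is None:
            return out
        if tok == "chars!":          # this token swaps the tokenizer itself
            tokenize = by_chars
        else:
            out.append(tok)

print(read("foo bar chars! xyz"))  # → ['foo', 'bar', ' ', 'x', 'y', 'z']
```

Everything after `chars!` is tokenized character by character, without the reader ever being told up front where the switch would happen.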

p_l
0 replies
2d4h

Every entry in the readtable can switch to its own reader, and in fact can allow such behaviour as described.

DemocracyFTW2
2 replies
2d15h

And do note the whitespace (line 2 has a whitespace after df, line 3 has a whitespace, and the newlines matter).

thx but no thx

the previous line ends with three space characters to indicate sarcasm; interpret literally wherever trailing whitespace is not readily discernible

samatman
0 replies
2d1h

The entire point of this exercise appears to be this: Forth has one character you cannot redefine, the space, what happens if we remove that restriction?

The part of the bootstrap you're referring to is, in fact, the part where they tell the reader to treat space and newline as delimiters, so it appears you're complaining about whitespace being significant in the part of the program where they declare it to be a delimiter.

Such is your right, of course, but it does lead one to wonder if you had some better idea for how to go about this.

mst
0 replies
2d

Those space characters are how it makes space actually a thing rather than just another character as it was beforehand.

I don't see how you could do that -without- a literal space being relevant in that way once.

sevenoftwelve
1 replies
2d12h

I am researching formal methods systems to conduct cryptographic proofs; I think a big issue in that space is having syntax sugar in the core compiler, so I am pursuing ways to move syntactic sugar into a separate macro language.

Your post sounds like an interesting research project, but the examples given in the blog post seem somewhat discouraging to me. When I gaze at one of your examples, it seems obtuse, and while I am sure I could understand the examples by not just skimming the article, this does tell me that the grammars you used are not very self-explanatory: your syntax expresses meaning relative to the specific grammar, as opposed to expressing meaning relative to the English language or to pre-existing, well-established programming languages.

Is there some example of self-explanatory grammars and self-explanatory bootstrapping code in your language?

ret2pop
0 replies
2d12h

Self-explanatory grammars: there's the stem-like syntax that I showcase, and you could write a lisp-like syntax in it pretty easily. I'd imagine you don't want to use this in production yet. As for self-explanatory bootstrap code: the bootstrap code is the way it is because Cognition doesn't want to assume some complex syntax from the start.

kazinator
1 replies
2d11h

This is of course true, but there's something very broken with lisp

Big words from a C project that has not a single warning option in its CFLAGS.

paufernandez
0 replies
2d4h

Ad Hominem attack... ;)

dustingetz
1 replies
2d3h

Please, what is the problem this solves? What are some example instances of things that the lisp approach makes difficult or impossible that this alternative makes straightforward?

mst
0 replies
2d

I think it's not so much directly trying to solve a specific problem as trying to strip away even more hardcoded parts, proving it's possible to start from an even smaller-than-lisp level of syntax and still build back up to something structured.

There's e.g. a giant pile of pure maths that demonstrates how to build up basic normal maths from a much tinier set of axioms, and I find this makes a lot more sense when taken in a similar spirit.

(helps I was a pure maths grad once, mind, my metaphor may be utterly useless to anybody who isn't)

beders
1 replies
2d12h

The issue with stack-based languages is that my brain is not good at keeping the stack in my head.

It's like packing things into a backpack and then only at the end learning that we are going to Alaska instead of Hawaii.

mst
0 replies
2d

You can recreate this sensation in the real world by interacting with airlines. Well, at least from the POV of your backpack.

Silliness aside, I have the same problem and probably need to, yet again, beat my head against FORTH, preferably with an interpreter that somehow shows me the stack as it goes. Or maybe factor-lang instead on the grounds that it's higher level and also that I can escape to lexical variables when completely stuck.

DrDroop
1 replies
2d3h

What would Pythagoras' formula look like in your language? That has been my problem with stack-based languages in the past: great at doing high-level abstract mathematical operations, fixed points and all of that jazz, but then it becomes impossible to read basic stuff without doing a dry run in your head.

sfink
0 replies
2d1h

I have found that with some practice, stack-based is better for writing, infix is better for reading. At least for equations. Pythagoras is trivial to write:

    a a * b b * + sqrt
and in fact the process of entering a calculation can be informative, since you are thinking about the sequence of operations more. Writing infix requires either looking ahead to figure out when to add parentheses, or backing up and adding them after the fact. But I still have to simulate bits and pieces in my head when reading postfix.
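For anyone who wants to see the mechanics, the line above runs through a trivial stack loop; a quick sketch in Python (names invented):

```python
import math

# Minimal RPN evaluator for the Pythagoras expression above.
def rpn(tokens, env):
    stack = []
    binops = {"*": lambda x, y: x * y, "+": lambda x, y: x + y}
    for tok in tokens:
        if tok in binops:
            y, x = stack.pop(), stack.pop()
            stack.append(binops[tok](x, y))
        elif tok == "sqrt":
            stack.append(math.sqrt(stack.pop()))
        else:
            stack.append(env[tok])   # look up variables like a and b
    return stack.pop()

print(rpn("a a * b b * + sqrt".split(), {"a": 3, "b": 4}))  # → 5.0
```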

vikR0001
0 replies
1d23h

In my opinion, this article is a hilarious joke, a satire. I mean...

Let's take a look at what the bootstrapping code for a very minimal syntax looks like:

    ldfgldftgldfdtgl
    df 
     
    dfiff1 crank f

thom
0 replies
2d15h

Laughed out loud when I scrolled far enough to see an actual example. Still not sure if this is a subtle satire about people’s aversion to Lisp syntax. The ideas here are really interesting though, of pushing metaprogramming further up the chain to tokenisation not just parsing. Either way, I hope it’s a compliment to say this could easily have appeared on Aphyr’s blog.

spindle
0 replies
2d10h

This is lovely.

I'm having trouble working out how this differs from Factor, in which left bracket and quote are also not primitive.

simultsop
0 replies
2d13h

I hope this does not become a mainstream language. If it does the number of brains being .... I can't continue to imagine it!

okwhateverdude
0 replies
2d12h

Honestly, this feels very Forth-y, just with finer-grained control over tokenization. It would be cool if you could compare and contrast the two, because in Forth I can redefine dictionary words and have control over parsing much like Cognition. Your bootstrapping looks very similar to the scaffold-building you do in Forths, using primitives to define higher-level words. Cognition even has implicit stacks. Are you sure you didn't just invent a new flavor of Forth?

nielsbot
0 replies
2d13h

meta comment: there's a typo: "percise solution" should be "precise solution"

miloignis
0 replies
2d13h

Thank you for making this! I've fallen into investigating the limits of language flexibility before, (and got into f-exprs and Shutt's thesis about Kernel, a great time!), but I've never seen anything quite like this.

It's inspired me to mess around in this area again, and I'll definitely have to re-read your blog post a few times.

mikewarot
0 replies
2d1h

I sense there is something very powerful buried in here which is worth spending time to grok.

My understanding is that this is a Forth system, that tries to address a big problem with such systems, the messiness around "immediate" words, etc. It does this by using a global variable called crank.

Am I right?

mik1998
0 replies
2d10h

it makes the process of retroactively changing the sequence in which these tokens are delimited impossible

Is the author claiming you cannot write a reader macro for postfix code? Am I understanding that correctly?

I don't personally understand how the author's readtable system is much different from Lisp's (or even from some Forth macro systems, although I've only touched the surface of those).

maximamas
0 replies
2d10h

Tough crowd in these comments. I think this is a beautiful project and its applications to AI are very interesting.

lifthrasiir
0 replies
2d13h

This is something impossible in other languages, being able to program your own tokenizer for some foreign language from within cognition, and have future code be tokenized exactly like how you want it to be. [Emphases in original]

Doesn't that sound like TeX \catcode? In fact, I think you can bootstrap Cognition from TeX as well.

leke
0 replies
2d2h

The more I scrolled the more I thought it looked like REBOL.

gorgoiler
0 replies
2d14h

This is a practical example of bootstrapping a minimal machine into an interpreter for a high level language. As I remember it being taught to me the importance of being able to do this — with a Turing machine, lambda calculus, etc. — is to show that the high level language is equivalent to the fundamental language and therefore whatever can be reasoned about the former can be applied to the latter.

The first and only example I can think of is the halting problem. On a practical scale one might prove that the base language doesn’t have memory leaks, so neither can the derived language? What are the advantages of bootstrapping like this? If the answer is simply another form of “because it’s there” (Mallory, re climbing Everest) then I respect that!

galaxyLogic
0 replies
2d2h

A good addition to this article might be the definition of basic Lisp in Cognition. That would be kind of meta-Lisp.

gabesullice
0 replies
2d9h

I think cognition programs' ability to define and redefine and even step into and out of syntax structures at runtime is really beautiful. Particularly because the machinery is so minimal. I'm not a language expert so I don't know if that's novel or not.

Reading the article was really enjoyable. As a reader, I could feel the authors' excitement and recognize the joy they've felt as they've climbed over hilltops only to realize a new range of possibilities.

If I understand it correctly, what's really being said here is that, with Cognition, you can build truly "thinking" machines.

Programs can write and execute their own novel subroutines, based on new input and without ever being halted and restarted with new instructions. And that means the program can learn and adapt by building new abstractions and possibly connect itself to new APIs.

To me, that's more exciting than a bigger neural network or a new training technique.

fallat
0 replies
2d4h

Very Forth-y.

edoardo-schnell
0 replies
2d13h

Holy divinity of UX, why won't you let your mobile users zoom in?

dmead
0 replies
2d13h

Is this a joke?

cess11
0 replies
2d8h

Brilliant, thanks for sharing.

bschmidt1
0 replies
2d14h

The first words of the article

  Lisp programmers claim that
You already know you're in for a fun ride lol

bradley13
0 replies
2d11h

"the trap of having some form of syntax"

Syntax provides structure. Or do you think sentence this without you syntax read can?

"Cognition is different in that it uses an antisyntax that is fully postfix"

Postfix is syntax. Just ask any German-speaking person about verbs on the ends of sentences. Read this, you can.

In the first example given, the ordering of the operands and operators is important. That is syntax.

This is someone trying to invent a stupidly compact language. Reminds me a lot of APL. Hints for the authors: (a) You haven't done away with syntax, you have just made it difficult for humans to read and understand, (b) readability and understandability are important factors in programming.

bananaflag
0 replies
2d9h

Is this related to reflective dialects of Lisp? Or is it just a shallow resemblance?

autocole
0 replies
2d13h

Unseemly lang is another project that goes the other direction, but I don’t remember if it has balanced brace requirements https://github.com/paulstansifer/unseemly

asrp
0 replies
2d1h

Very nice work and cool project.

However, the example in "3. Baremetal Cognition" is explained in an overly convoluted way, with many choices that IMO detract from the point that (I think) you're trying to make. There are typos that make it even harder to understand.

1. Use something like underscore instead of spaces, and maybe even another character like period instead of newline. You can explain after the section that you could have used space and newline instead of _ and .

2. Immediately after showing

    ldfgldftgldfdtgl
    df_
    _
    dfiff1_crank_f
you can parse it out for the reader, as something like

    "l" 'set-non-delim eval
    "gl" 'set-non-delim eval
    "tgl" 'set-non-delim eval
    "dtgl"
      "\n" 'set-non-delim eval
      "_\n"
        "_\n" 'set-non-delim eval
      'set-ignore eval
    eval
and so on. Or maybe even

    (set-non-delim "l")
    (set-non-delim "gl")
    (set-non-delim "tgl")
    (set-non-delim "dtgl")
    (set-non-delim "_\n")
    (set-ignore "_\n")
    (dtgl)
and only then would you go through the source, character by character. Just because the source is hard for humans to read doesn't mean we need to stick to it in an explanatory example.

3. > Delimiters have an interesting rule, and that is that the delimiter character is excluded from the tokenized word unless we have not ignored a character in the tokenization loop, in which case we collect the character as a part of the current token and keep going.

There are four "negations" in this sentence ("excluded", "unless", "not", "ignored"), all combining to explain something ostensibly simple: when to end tokens added to the stack (or container). This, together with whitelist, blacklist, delim, and singlet, needs much cleaner naming and description.

Also, "set non-delimiter" is itself an extra negation.

4. There's an error right after "Now, for the rest of the code: ". The third line contains two spaces instead of a single one. (Using suggestion 1 would have also avoided this for yourself.)

# Comment about the actual content

5. I can kind of see the rationale for this (which is also explained in the beginning). However, I don't see exactly where we'd set clear boundaries, since we can always stuff semantics into the initial parser. For example, instead of having `f` bound to eval, we could have set `f` to execute the entire bootstrapping sequence and then rebind `f` to eval. So the entire example would be reduced to just `f`.

I guess we'd have to argue about whether the initial set of functions we are allowed to use is somehow primitive enough. But even `d` (set-non-delim), while it only toggles some values in an array (or list), piggybacks on the parser's inherent ability to skip characters, and `i` (set-ignore) needs inversion implemented in the parser.

6. Here we assume that one byte per character is the default starting state of the world, but Unicode and other encodings don't work that way, so you'd need some parser to get started anyway. And in that case, is an initial parser using space and end-of-line as separators really unusual?

7. I don't see why (not) reading ahead would be such an important property for modifiable syntax. You just need to not read ahead too much, like the entire rest of the file or stream.

8. Regardless, I think this is worth exploring, but also keep these questions in mind while doing so.

ape4
0 replies
2d4h

Is it April Fools day?

andoando
0 replies
2d14h

I have no idea what's happening here, but metaprogramming does sound interesting (for some reason, not sure what that is)

al2o3cr
0 replies
1d23h

I hear they're going to use this for the next generation of game consoles from Soulja Boy. Watch him crank that meta-crank!

aeonik
0 replies
2d7h

Very cool work, I feel like I'm looking at one of the "elementary particles" of computation here.

Pamar
0 replies
2d11h

Quote: Baremetal cognition has a couple of >>>perculiar<<< attributes, and it is remarkably like the Brainfuck programming language.

In Italian "perculare" is a (slang) verb meaning "to make fun of someone, in a rude way".

I doubt this was intentional. If it is, well, hats off to you.

MarceColl
0 replies
2d4h

This is mind-bending and I LOVE IT. I'm gonna try to play with this if my daughter allows :) Amazing job

James_K
0 replies
2d10h

but there's something very broken with lisp: metaprogramming and programming aren't the same thing, meaning there will always be rigid syntax within lisp (its parentheses or …)

Quite the opposite, you'll find this is a strength of lisp code. That all its syntax is always represented as a tree makes it very easy to edit and manipulate in an editor which supports operations on that tree. What's more, you could add routines in your editor to display the code in any format you like and translate back to the lisp format when writing to the disk. This is much more difficult in most other programming languages.

Heliodex
0 replies
2d14h

Wow. The proverb of "Languages that don't change your way of thinking about programming are not worth learning" in full swing here, I feel like I've been enlightened.