Every time I read about SICP, I get frustrated all over again about Javascript. It could have been Scheme and all web development would have benefited.
I always see people gush over this book, but is it still worth reading for someone that already has a ton of experience programming?
I started reading it around a year ago with the personal goal of doing every exercise. I'm not done yet (almost done with chapter 4). It has definitely taken a lot of my time, but it really has changed my thinking about a lot of language constructs I find in other languages.
For instance, the part about tagging objects with their type reshaped my thinking about static type systems in general. Static typing, for example in C, is essentially just moving the location of the type tag from existing at runtime within the struct, to the "analysis" phase of the compiler. It's a pretty simple idea, but it made the "degree of dynamism" tradeoff click in my head. All the information _has_ to exist somewhere, and the only question is where. And Scheme is just at an extreme where _everything_ is stored dynamically.
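For the curious, the tagging machinery in question (SICP ch. 2.4) is tiny; a minimal sketch:

```scheme
;; SICP ch. 2.4: the type tag travels with the datum at runtime,
;; instead of living only in the compiler's analysis phase.
(define (attach-tag tag contents) (cons tag contents))
(define (type-tag datum) (car datum))
(define (contents datum) (cdr datum))

(define z (attach-tag 'rectangular (cons 3 4)))
(type-tag z)  ; => rectangular -- inspected at runtime, not compile time
(contents z)  ; => (3 . 4)
```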
Static typing, for example in C, is essentially just moving the location of the type tag from existing at runtime within the struct, to the "analysis" phase of the compiler.
See also "shift left"
All the information _has_ to exist somewhere, and the only question is where.
Not entirely sure what you mean about this, but not all type information has to exist somewhere. Sometimes things get encoded into a type that might never be represented otherwise -- like whether an integer is a width or a height, or whether a string is a name or an address.
Whether something (a variable) is a width or a height is encoded somewhere in the code, because a width gets used differently than a height.
Then the code is written to make sure you never make the mistake of sending a width to a height, because that would be silly.
The type (width or height) is represented as logic in the code.
Theoretically, if you have a type system that lets you know something is a width or a height, you can often reduce the logic that keeps track of these things. But my experience is that that level of granularity in your typing reduces dynamism too much. I would rather keep track of it in code (which will probably have to deal with it anyway) than make my type system deal with it.
There are some cool mechanisms in type-forward languages that lead to minimal dynamism loss along with safety. For instance, you can do scientific programming that carries units along. So you really can divide a length by a time with no problems: your result just happens to be a velocity, and will come out in meters per second (or whichever your favorite units are). Adding meters to kilometers? Congrats, you might have to say which physical representation you want in float-land. But then you are saved from adding seconds to milliliters, because then the types will tell you it's complete nonsense.
But note that to make all of that work well you really need your language to provide a lot of syntactic sugar features which are missing in most older languages, so all the operations read seamlessly instead of being full of ceremony. One could do all of this in, say, old Java 1.4 or something, but the amount of type annotation, along with the massive amounts of boilerplate to get all of the operations working right and checked at compile time, would make it all Not-Worth-It(tm). But with enough language features, and a smart enough compiler, `val acc = mySpeed / someSeconds` will work, straight up, with the safety included.
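Since this thread is about SICP, here is the same idea with runtime tags in Scheme (a sketch with made-up helpers; a type-forward language would move these checks to compile time, which is the point above):

```scheme
;; quantities carry their unit as a runtime tag (hypothetical helpers)
(define (qty value unit) (cons unit value))
(define (unit-of q) (car q))
(define (val-of q) (cdr q))

(define (q+ a b)                       ; same unit required
  (if (eq? (unit-of a) (unit-of b))
      (qty (+ (val-of a) (val-of b)) (unit-of a))
      (error "cannot add:" (unit-of a) (unit-of b))))

(define (q/ a b)                       ; division derives a new unit
  (qty (/ (val-of a) (val-of b))
       (list (unit-of a) '/ (unit-of b))))

(q/ (qty 10 'meters) (qty 2 'seconds))     ; => ((meters / seconds) . 5)
(q+ (qty 3 'seconds) (qty 5 'milliliters)) ; => error: cannot add
```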
But then you are saved from adding seconds to milliliters, because then the types will tell you it's complete nonsense.
I was thinking about this as I was reading your comment, and wondering: is it nonsense? If I have 5 seconds of time as well as 10 milliliters of water... I can consider them as being packaged together... which is addition. And I can subtract 3 milliliters and 6 seconds from them without running out of what I had. Nothing wrong with that, really. Five potatoes and a gallon of water is the same notion, just more familiar. Seems no more nonsensical than dividing length by time, right? Food for thought...
That's nonsense because addition loses the structure. 3 litres and 6 seconds becomes indistinguishable from 4 litres and 5 seconds. Maybe that's what you want, but it's realistically pretty much never what you want.
That's not how it works. Addition losing structure doesn't mean "strip out everything from the textual description that isn't a number". It just loses some properties, like ordering.
Which means, for example: (3 liters and (i.e. "+") 6 seconds) + (4 liters and 2 oranges) = (7 liters and 6 seconds and 2 oranges). Perfectly sensible addition, which you do all the time. There's no stipulation the output has to be a single number...
Alright you're just talking about a different operation entirely but have decided to give it the name "addition". Not particularly insightful, but thanks anyway.
What? Addition isn't just for numbers. This is literally no different from adding vectors. I'm sure you've seen that in school?
Bundling the time and the volume might be better called "forming a vector" or "forming an ordered pair" rather than "addition". You can then perform addition and subtraction with the resulting ordered pairs or vectors.
There's no need for ordering (it's a bag rather than a vector), but yeah.
thanks for saying it a lot better than I did!
That just means it exists in your head.
Not really. Whether an integer represents a width or a height won't even exist in your head if you're implementing a function like max(a, b).
The type is encoded in the name of the function.
That's a pretty bad system, because you say it "won't even exist in your head," but I don't even know if I should pass it integers, floats, or a string, or what it will return.
Because inside the abstraction max it doesn't represent a width or height, they are just numbers (or just some "comparable things"). But outside the abstraction, at some level, you'll definitely know what a and b are.
exist in your head if you're implementing a function like max(a, b)
Surely you have an idea of what you're using that function for while you're writing it.
The compiler is doing a check to see if the function makes some minimal logical sense (i.e., a type check), but it cannot read your mind. This implies the types are actually an actualization of the thoughts in your head, in a formal manner.
At the extreme end, and with no claims of scalability, you have shell: everything is a string.
I'm skeptical that sophisticated type systems need to exist.
Perhaps I should dust off my copy and start reading SICP
Yes. But instead of taking my word for it why don’t you just start reading it and decide for yourself?
Do you really need this explained? It's a 600 page technical book and people have a limited amount of time to sort through everything that exists.
It's a 600 page technical book
You do understand that you don't have to read an entire book before forming an opinion, right?
sort through everything that exists
"Book that has been considered a classic for forty years and was used as the intro text at MIT for decades" is a long, long way from "everything that exists".
I did begin reading it a few years ago, and it seemed like a lisp flavored intro CS book. Why do I want to read an intro CS book? I've read those before. Yet people promise me that this one is different, while giving me nothing but condescending promises of enlightenment.
I guess the honest question is "what are you looking for in a response to your question?"
You asked if you should read it, and almost everyone who bothered to reply has said yes. Each time this topic comes up and people ask if they should read it the majority of responses are 'yes'. And I presume if you've been around software for that long, you've seen all of those threads and previous questions and answers like I have - meaning you likely knew what people would say when they responded.
So, in the end, it's up to you to decide if you'll read it or not.
I don't know you personally, but from my life experience it sounds like something I've seen in other circumstances. You've more or less decided that you're not going to read it, but feel like you're missing out, and you want someone who has read it to say "it's ok to not read it" - so you can resolve both the feelings around the decision not to read it and the discomfort of feeling like you're missing out.
It's possible I'm way off the mark on the above, but I mean it to be helpful - as I can say I've seen what looks to be this same pattern many times in life.
Or, what you're likely looking for: in the end, there are only so many hours in the day. You gave it a fair shot and it wasn't your vibe. It doesn't say anything about your strength as an engineer; it just has a specific approach, and it's not a match for everyone. That doesn't make it a bad thing, it just means you'd rather spend your time learning and exploring in other ways. And it probably would've had more impact earlier in your career than at the level of experience you have now.
But of course before wrapping up I do need to undo my comment: I think you should give it another go. Maybe skim past the early part if it feels a bit too introductory and come back to it later. But it's a book that continues to grow on you the more time has passed since you read it. The concepts it presents are subtle but impactful in changing how you think about software. You don't fully grasp it when you read it; it's just that afterwards you start seeing things through its lens often. I haven't read any other book like it.
I think it's clear he wants arguments whose strength he can then evaluate, not just a yes/no answer.
If you are experienced then I’d say skim whatever sounds interesting from chapters 1-3, then do chapters 4/5 as they contain most of the interesting stuff.
Maybe not for you, but there is a bunch of expert beginners out there who would benefit from reading up on intro CS stuff.
Why do I want to read an intro CS book? I've read those before.
"We shall not cease from exploration, and the end of all our exploring will be to arrive where we started and know the place for the first time." - TS Eliot.
You aren't a curious person who studies things for their own sake and finds wonder in exploring ideas. That's fine, this book is clearly not for you so why concern yourself with it? Most of my friends who are obsessed with their field count many intro books as their favorites and frequent re-reads.
condescending promises of enlightenment.
It sounds like you're more upset that other people enjoy and therefore recommend this book. You're the one asking for proof that it's worth your time. It's clearly not. People who are curious and like to explore ideas in computing recommend this book frequently to other like minded people.
If you don't like sushi, why question somebody's recommendation for their favorite omakase place?
Weasel words. Considered a classic by who? And why would being an "intro text at MIT" mean it's good for someone who is already experienced? If anything, that should show it's not worth his time.
Agree but you are replying to a user named "lisper".
Maybe he struggles with a speech impediment, you insensitive clod!
</Slashdot_nostalgia>
This book is at the top of nearly every list of the best programming books of all time. It's worth reading. Is it an easy read? No. It's sort of like the Iliad: not an easy read, but worth it for anyone who wants to be considered well read.
It's a 600 page technical book
why don’t you just start reading it and decide for yourself?
A journey of a thousand miles....
Have you never been to a book store?
You don't have to read every word of a book to understand if it's interesting to you. I purchased a bunch of technical books the other day that I had never heard of based on opening them up, reading a bit of the intro and flipping through the examples.
Relatively few of my favorite books have come through recommendations compared to those that I have come across through serendipitous discovery.
For anyone who has no time to browse books, most of the best books in existence will be of little interest.
Do not waste your time and energy relying on the opinion of others. Always check yourself.
I think it is, and if you don't want to take my word for it, John Carmack thinks the same :)
There's an old video [0] where he talks in-depth about his explorations of functional programming. It's perhaps a bit more about Haskell than about SICP, but in it he's also quite enthusiastic about what a veteran programmer can still learn from this book.
Did he study it before or after Doom?
This question is answered immediately in the linked video.
This takes a lot longer to type than "before" or "after".
Think how long it would have taken to click the link!
(About 9 minutes, acc. to the parallel subthread. My time on this thread is about 90 seconds so far not incl this current comment.)
Still haven't clicked it, huh? If you did you'd see that the link takes you to straight to the moment in the video where he starts talking about functional programming.
11 minutes in, he mentions SICP as a book that he ended up working through exercises "in the past year" (so 2012 timeframe, well after the original Doom but before the reboot).
Thank you! :) Provides helpful context vis-a-vis evaluating possible critical-path enabling / contributing factors to achieving a key milestone.
I don't have time to watch the video right now, but according to gpt watching the video for me: after.
"Don't cast your pearls before swine".
When I was younger I used to passionately defend those things I saw as beautiful, but that was before years of experience talking with people who are passionate about their fields and learning, and with those who never will be: If you lack the innate curiosity to explore those things others have declared marvelous, then this book will offer you no value.
Every time I crack this book open I get excited. I've read it multiple times and done most of the exercises. I can think of few other books that really expose the beauty and simultaneously the strong engineering foundations of software.
You have "tons of experience programming" and sound like you've already decided you know what needs to be known (otherwise why even ask rather than just read it free online), I doubt this will offer you anything you haven't already seen before.
I might find it beautiful, but I'm also just f'in busy. Why do you think that if I enjoyed it I would've already read it? Enjoying things is great, but it's not my main desideratum for doing something.
Then it's not for you and you can do something else, like commenting about why you should read it.
If you lack the innate curiosity to explore those things others have declared marvelous, then this book will offer you no value.
Every time I crack this book open I get excited. I've read it multiple times and done most of the exercises. I can think of few other books that really expose the beauty and simultaneously the strong engineering foundations of software.
---
https://www.stilldrinking.org/programming-sucks ( https://news.ycombinator.com/item?id=7667825 and others)
Every programmer occasionally, when nobody’s home, turns off the lights, pours a glass of scotch, puts on some light German electronica, and opens up a file on their computer. It’s a different file for every programmer. Sometimes they wrote it, sometimes they found it and knew they had to save it. They read over the lines, and weep at their beauty, then the tears turn bitter as they remember the rest of the files and the inevitable collapse of all that is good and true in the world.
I recommend (not ironically) Double Binded Sax by the group named Software.
I recommend (not ironically) Double Binded Sax by the group named Software.
After you finish that, I recommend queuing up Friedrich Nietzsche by Klaus Schulze.
otherwise why even ask rather than just read it free online
Not GP, but my time is limited, so asking "is this something an experienced programmer would find worthwhile and insightful" is a fair question.
To be honest I'd recommend just taking a look. It's been 10 years since I first encountered the book (and programming) and I'm sure the lessons can be found elsewhere.
But I find the book a marvel of pedagogy and rank it as maybe one of the greatest textbooks of all time across disciplines. The lessons are packed so densely yet so concisely that you'll appreciate different things on successive reads.
If you're experienced, it will also read very easily and quickly, so it becomes quite an easy and enjoyable skim - and then you don't have to rely on other people's accounts of whether it is or isn't worth your time.
| I can think of few other books

I'm interested to know which books …
That depends entirely on what kind of a ton of experience you have. If you have written Java/PHP/Python/JS for 20 years, SICP might teach you lots of new tricks. If you have worked a lot with functional languages, probably fewer, but still probably some in later parts of the book.
I'd worked exclusively with functional languages, and assignment operators implemented as lambdas blew my mind more than all the monads in Haskell did.
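(Presumably something like SICP ch. 3.1, where a lambda closing over mutable state gives you an object without any class machinery; a sketch:)

```scheme
;; SICP ch. 3.1: assignment plus a closure yields local state
(define (make-withdraw balance)
  (lambda (amount)
    (if (>= balance amount)
        (begin (set! balance (- balance amount))
               balance)
        "Insufficient funds")))

(define w (make-withdraw 100))
(w 30) ; => 70
(w 30) ; => 40 -- the balance persists between calls
```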
I'm still working through chapter 1 and, despite having some previous Scheme experience, let being syntactic sugar for lambdas just blew my mind the other day.
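For anyone who hasn't hit that section yet, the desugaring is just:

```scheme
;; these two expressions are equivalent:
(let ((x 1) (y 2))
  (+ x y))

((lambda (x y) (+ x y))
 1 2)            ; => 3 in both cases
```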
No
Because...
Reading that book made me good at recursive programming :-)
Which made me good at recursive programming, which made me good at recursive programming.
I'd personally say especially so. I'm guessing those taking it as freshmen are missing out on 75% of the good stuff.
It’s free online. And YouTube has lectures from the authors- presented in all their early 80’s glory.
Whether it's worth reading isn't a question others can answer for you, but "a ton of experience programming" definitely doesn't imply it isn't worth reading as it's perfectly possible to get through an entire programming career without encountering the ideas and ways of thinking that you'll find in it.
Yes, I bet it will give you new perspective on a number of things.
That book was my first programming class and programming ever: CS61A at Berkeley. When I started taking it, one of the TAs said that the ideas in the class were wasted on us - they seemed simple but we wouldn't grasp the fullness of the ideas until we had more experience. He said the class should be required to take twice: your first and last class.
It gives you the basics of things like recursion and message passing that are easy to use immediately (well, once you figure them out -- it took me two weeks to be able to write even the simplest recursive function), but there are large conceptual things you won't really see the depth in until you have more experience. The last project is the metacircular evaluator - writing a Scheme program to interpret Scheme programs. It is difficult to understand that without having a lot of other background.
Even then, along the way, learning things like exceptions as implemented through call/cc might be a little too much for a beginner programmer. The book in some ways talks about concepts way outside of a beginner's comfort zone.
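For the curious, the trick looks roughly like this (a minimal sketch, not a full exception system):

```scheme
;; call/cc captures "the rest of the computation"; invoking the
;; captured continuation escapes back out, like a throw.
(define (safe-div a b)
  (call-with-current-continuation
   (lambda (escape)
     (if (= b 0)
         (escape 'division-by-zero)  ; non-local exit
         (/ a b)))))

(safe-div 10 2) ; => 5
(safe-div 10 0) ; => division-by-zero
```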
well worth reading even for veteran senior engineers. worth re-reading every few years too.
Depends on what you learned during that "ton of experience programming." My only regret was that I had, but stupidly ignored, the MIT course notes that preceded publication of SICP. I didn't realize what I had missed until many years later when I sat down to work my way through SICP. I coulda been a contender.
No. If you want a deeper understanding of programming, write your own static analysis / theorem prover.
It definitely is a "classic" book for a good reason.
I think how valuable this book will be to you really depends on your formal education. If you have a CS degree, basically everything in there should have been covered during your studies; you already know how to operate on trees, write recursive algorithms, and build an interpreter. The main value to you would then be seeing it all come together in one place.
If you don't have that kind of education I believe it is extremely valuable to see programming from a theoretical perspective. The book is very good at laying out a cohesive narrative of how you go from very basic structures to complex programs and how these programs can be executed by a computer. The highlight definitely is the implementation of lisp in lisp.
There were also chapters, mostly towards the end, which I think aged quite poorly and seemed mostly irrelevant. I don't think there is much lost by skipping them.
Depends on the kinds of experience. If you have experience writing compilers and interpreters - probably not a lot. If your experience is in application development you might learn a lot.
I worked through it after years of practical experience as a self-taught programmer, and I learnt a lot (and found it quite challenging, especially the mathy parts).
Only if you’re comfortable being converted to lisp
I used it as a teaching aid to help colleagues from our application teams get the background and skills to work on our core platform. I certainly got a lot out of it because I had to do all the exercises, and I think if I hadn't had to prepare like that I would have skimmed more and gained less. But I already had a background that covered a lot of the same areas.
So it is going to depend a lot on your background and the effort you put in. It may be better done as part of a reading group where you can discuss the exercises than on your own as some are meant to highlight why certain approaches are hard and are not commonly used.
I would say yes. I read it when I was already an experienced programmer and I still found it worth it. Scheme is one extreme in terms of styles of programming languages, and it allows the authors to present a certain way of thinking about programming - one that is natural in Scheme - that was still new to me.
I'd say it is less about your experience with programming and more about your interest in it. SICP has quite a profound style that's in between academic and poetic. For me, it was both fundamentally informative and inspirational. So, if you feel like structuring your computer science foundation and like an imaginative approach to programming and math that favors aesthetics, give it a try.
Experience in software development isn't linear so depending on which path your experience has taken it could be very insightful and useful, or it could be stuff you already know.
I'd say for the vast majority of developers, experienced or otherwise, it's a good read.
Haven't actually done the book. Can vouch for the lectures[1] though. Very much worth it to get a different perspective on various language structures I had just taken for granted.
Watching 2-3 of these will probably answer your question.
[1] https://groups.csail.mit.edu/mac/classes/6.001/abelson-sussm...
I used to be a teaching assistant for CS 61A (intro to programming) at Berkeley teaching from this book with Brian as the instructor.
One of Brian's primary points is the following:
Scheme ... has a very simple, uniform notation for everything. Other languages have one notation for variable assignment, another notation for conditional execution, two or three more for looping, and yet another for function calls. Courses that teach those languages spend at least half their time just on learning the notation. In my SICP-based course at Berkeley, we spend the first hour on notation and that's all we need; for the rest of the semester we're learning ideas, not syntax.
Bullshit. Again, I was a TA for this course. You do not spend the rest of the semester on ideas, you spend the rest of the semester on the students being very confused.
This "everything looks the same" property of Scheme and of all LISP-like languages is a bug, not a feature. When the semantics is different, humans need the syntax to be different. In contrast, LISP/Scheme make everything look the same. It is quite hard to even tell a noun from a verb. This makes learning it and teaching it hard, not easy.
Brian is selling a fantasy here. If you think Scheme is so great, look at this nightmare of examples showing the various ways to implement the factorial function in Scheme: https://erkin.party/blog/200715/evolution/
All of this "abstractions first, reality second" agenda is just a special case of what I call "The Pathology of the Modern": the pathological worship of the abstract over the concrete. Everything modernism touches turns into shit. I am done with living in modernist shit and I hope you are too.
I wouldn't have spoken up except for this comment. As a freshman, I took 6.001, the MIT course that Structure and Interpretation of Computer Programs was based on, and I loved it. As a graduate student, I taught 6.001 three times, twice as head TA under Prof. Sussman and once under Prof. Abelson. In addition to helping make up problem sets, quizzes, and exams, my responsibilities included teaching seven or eight one-hour, five-student tutorial sessions per week as well as teaching a forty-student recitation section once per semester. I graded assignments as well. My point is that I have a lot of experience with what students found challenging in the course.
Prof. Harvey's claim rings completely true to me. Students understood the syntax quickly, and spent little time on it. It was not a point of frequent confusion. There were plenty of difficult concepts in the course, but the details of the programming language were not, for most students, among them.
Students who already had programming experience when they started the course often had more trouble than inexperienced students, but mostly because they had to unlearn imperative habits since the imperative features of the language, except for I/O, weren't used until late in the course.
SICP covers a huge breadth of material, from basic computational ideas to algorithms and data structures to interpreters and compilers to query languages to concurrency, and does it in an entertaining and challenging way. Even decades later, I find myself pulling ideas from it in my daily programming work.
I worked at Google for almost twelve years, and I can't count the times I found myself muttering, when reading a design document, "I wish this person had read SICP."
I'm certainly biased, but I would encourage anyone who would like to become a better software engineer to read SICP and study it carefully. Take your time with it, but do read it.
Thanks for speaking up. At this point no one is really presenting any evidence so it’s a necessary evil to offset the Lisp slander even if it is, like the parent comment, not much more than an appeal to authority / popularity.
Syntax is neither natural nor unnatural to humans, but it's a fact that fewer symbols to memorize is easier than more symbols to memorize. The problem is a failure to launch. Some people never truly understand that it's not just syntax, it's semantics. Data is code, code is data. That's why it all looks the same. This artificial distinction in "C-like languages" is more harmful for the enlightened programmer than it is helpful. Unfortunately not everyone who reads SICP experiences enlightenment the first time (or ever, I guess?)
I really want to like the idea you're describing; however, I've found that in practice there absolutely is a practical difference between code, data, and types. I mean, they literally live in different sections of a process. If you design a program that runs on a real machine, and you spend a lot of time thinking about what the program should do and how it can put the limited resources of the system to good use - you absolutely need to think about code and data separately. Mostly think about data, really.
The one area where "code is data" remains a nice idea in my mind is for metaprogramming. And whenever I've done more metaprogramming than small doses, I've come to regret it later, no matter what the language was. (Small doses of metadata can be done even in statically typed, AOT compiled languages without RTTI).
The reason is, I think, that just the basic data structures and simple procedures built into a language allow you to express most everything you need, in a very direct manner. The distinct concepts you come up with as a programmer can usually be defined directly in the base language. Metaprograms won't create new concepts as such; it's only code run in a different phase. There is definitely a case for generic/templated data structures, but it seems best to use them sparingly and judiciously. Be wary of them duplicating a lot of code, fattening up and slowing down your system at compile time and/or runtime.
Information hierarchies are empirically important and are an essential part of communications design. Uniform syntax makes information hierarchies harder to parse, because the boundaries around different types of information all look the same. It's the same reason we have different sized headings, bold text, etc. They are distinct markers.
So yes, fewer symbols means easier memorization, but you could take that to the extreme and you'll find that binary is harder to read than assembly.
I think Lisp is really elegant, and the power to treat a program as a data structure is very cool. But scanning Lisp programs visually always takes me a little more effort than most other languages.
Honestly GP is making two very valid points though.
Something that Clojure does is differentiating between () = lists of calls, [] = vectors (the go to sequential data structure), {} = maps. This definitely helps the eye to see more structure at a cursory glance. It has a little bit more syntax compared to Scheme, but the tradeoff seems to be worthwhile.
Secondly, I think it's very healthy to be wary of indirection and abstraction. I'm not sure I agree with the tone and the generalization about modernism, but I think there's a burden of proof, so to speak, when it comes to adding abstractions, especially in the long term.
I think Scheme works well for the kind of conceptual overview the course is trying to provide. I think there is something to the argument that Scheme syntax is not ideal for readability of larger programs, but I would wager that the bigger reason some students find SICP confusing is the same reason it blows others’ minds - the whole approach is at a higher level of abstraction than most “intro to programming” classes.
Yes, I agree these are two good points. I also experienced teaching SICP and would say the overall position of the GP is incorrect and results in a less profound understanding of programming.
I've had an introductory Scheme course in a smaller university, and have experience designing data structures, creating parsers & interpreters, and with multi-threading and networking.
I was never one to really dig Lisp. I prefer the structure and the groundedness of a statically typed systems language (I mostly do systems work). But I took on reading SICP in the hope of finding something new and interesting, and to level up my skills. However, I got bored by it. I probably made it through more than half of the book.
It's a bummer because I'm left with the feeling of missing out. Am I not worthy or too obtuse to get what's so great about the book? Or maybe I am in fact not the target audience, having too much practical experience that the book doesn't seem worth my while.
If you're comfortable writing interpreters you've probably already picked up on most of the "big ideas" SICP is trying to teach. It's a very good introductory book, but it is still just an introduction.
Take your time with it, but do read it.
And do the harder exercises. Really do them, not just read and tell yourself you understand how to do that one and move on.
It is quite hard to even tell a noun from a verb
What?
Unless the list is quoted or something, the first item after the opening paren is always the "verb", yes?
There's nothing stopping any other item from being a verb, no? (Not the verb, but a verb.) Anything involving higher order functions?
In the context of the verb, everything else is a noun. When you understand what the verb does, then you can care about the difference between a verb and a noun.
Certainly, but the original quote was "It is quite hard to even tell a noun from a verb" (emph. added), and this is correct, you can't tell whether an identifier refers to a function or variable in Scheme by sight alone. This seems desirable if one wants first-class functions, and is very much an intentional choice for Scheme, but it can admittedly be more difficult to build up a mental model of new code if you have no idea what's a variable and what's a function (esp. amidst an already-difficult-to-grok sea of parentheses).
Notably, this isn't intrinsic to Lisps - Common Lisp uses a different syntax and namespace for function names and variables. My understanding is that Scheme et al's decision to merge the namespaces/syntax was not without controversy in the Lisp community (the Lisp-1 v Lisp-2 debate).[0]
[0] http://www.nhplace.com/kent/Papers/Technical-Issues.html
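A quick illustration of the single-namespace point:

```scheme
;; Scheme is a Lisp-1: procedures are ordinary values, so nothing
;; about a name tells you whether it's a "verb" or a "noun".
(define square (lambda (x) (* x x)))
(define f square)      ; f now names the same procedure
((if #t square f) 3)   ; => 9; even the operator position is just
                       ;    an expression to evaluate
```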
The only "verb" is the open paren. Other languages just make this simple and fundamental rule way more complicated.
Does it beat OO Java classes with students crying? Or months spent warning kids not to mutate an iterator, or else you're gonna cry again?
Mostly kidding but different paradigms bear different pain points it seems.
Oh, and lastly, the let-us-care-not-about-syntax stance is also argued at Brown (Krishnamurthi and his team, IIRC).
That said, I'd be curious to hear what your students had to say about scheme confusing traits.
I taught both SICP and Java, and I can confirm Java was far more confusing to students. Classes vs instances, inheritance, polymorphism. Why was everything a class? Don't I just want the computer to do something to some input?
I don’t think OO should be taught to students who aren’t already familiar with structs and passing functions around.
If those two things are already well-understood, the nature of OO as a some syntactical sugar and a couple lookup tables is readily apparent.
Without that background, the terminology seems weird and arbitrary and the behavior magical.
And the public static void main, and then the endless conversations about packages and public/private fields - which backfired very pragmatically, since at the time unit test frameworks didn't have a way to call private methods... Ironically, by the time you're done with the basics, nobody has the stamina left to learn anonymous inner classes.
The thing is, somehow syntax and some forms of abstraction cast a magic spell on most of the population (at times myself included). It's your mental interface to the semantics, so you want more syntax to be able to have more abilities - but syntax composes badly.
At least to me, that's why I enjoyed Lisps / lambda calculus: it reduces the domain into a more homogeneous space, and suddenly more things are possible with less. Although it seems that the mainstream rather enjoys doing simple things with verbose tools (it does look like you're doing a lot of work, with a lot of advanced terminology) than solving hard problems with APL one-liners (hyperbole).
Different psychologies ?
Why would the students be confused? By what exactly?
This "everything looks the same" property of Scheme and of all LISP-like languages is a bug, not a feature.
But you are mixing things up here. There are things that look different. Most things in Scheme can be understood as function calls and match that syntax, but there is different syntax for define, let, cond, if, and others. Not everything looks the same. What you might actually mean is that everything is made of s-expressions. That is actually very helpful when you work with code. It makes it very easy to move things around, especially in comparison to languages like Python, with its significant whitespace indentation.
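Concretely:

```scheme
;; most things look like calls, but the special forms have their
;; own shapes and evaluation rules:
(define (f x) (* x x))   ; a definition, not a call
(if (> 3 0) 'pos 'neg)   ; a conditional; evaluates only one branch
(let ((y 2)) (+ y 1))    ; a binding form with its own structure
(f 3)                    ; an actual procedure call => 9
```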
When the semantics is different, humans need the syntax to be different.
I learned multiple languages before Scheme, and they did leave their scars, but I find that I do not need syntax to be that different. Maybe I am not human.
In contrast, LISP/Scheme make everything look the same. It is quite hard to even tell a noun from a verb.
Is that a feature of the English language? I have rarely had this issue in Scheme. Perhaps it is because I think a lot about names when naming things.
This makes learning it and teaching it hard, not easy.
Maybe I only had bad classes and lectures before reading SICP on my own, but I found that I learned much more from it than most teaching before it was able to teach me.
Brian is selling a fantasy here. If you think Scheme is so great, look at this nightmare of examples showing the various ways to implement the factorial function in Scheme: https://erkin.party/blog/200715/evolution/
And what exactly is your criticism?
That there are many ways of writing the function? That is a property of many general purpose programming languages. For example we could look at something like Ruby, where it has become part of the design to allow you many ways to do the same thing.
Or the richness of programming concepts available in Scheme? Is that a bad thing? I think not. You don't have to use every single one of them. No one forces you to. But am I glad to have them available when I have a good reason to use them.
Surely you are aware that the page you link to is at least partially in jest?
All of this "abstractions first, reality second" agenda is just a special case of what I call "The Pathology of the Modern": the pathological worship of the abstract over the concrete. Everything modernism touches turns into shit. I am done with living in modernist shit and I hope you are too.
I don't know where you got the idea that SICP lauds "abstractions first, reality second". This is not the essence of SICP. SICP invents abstractions once it has shown that some previous approach was not sufficient. A good example is the whole "develop a package" thing, where the requirements grow piece by piece and data-directed programming is introduced.
Just an anecdote.
I took CS61A by Brian Harvey in 2009. I loved the course and I actually spent very little time learning the syntax and most of the time learning the concepts.
So I fully agree with Prof. Brian Harvey here.
Bullshit. Again, I was a TA for this course. You do not spend the rest of the semester on ideas, you spend the rest of the semester on the students being very confused.
I was a TA on an SICP course at a UK university, and I disagree with you. The students weren't confused; the simple syntax really helped, and, because all the students had good maths knowledge, a functional style was a lot more intuitive than an imperative one.
FYI, the course has since been replaced with Python programming.
A funny thing to me is I took CS 50 in '85 or '86 (CS 50 and CS 55 were later split into the CS61X series), and instead of Scheme we used Logo for functional programming.... using Brian Harvey's textbook. He was not teaching the course that semester.
At least part of the goal of CS 50 at that time was explicitly to weed out students. They didn't want undeclared students to waste a whole lot of time on CS only to find out they were not going to be accepted into CS. Instead, they went through one hard course to find out. Perhaps that explains why some of it was overwhelming to some students?
You could also portray this as yet another case of theory trumping practice, which is also symptomatic of modernism.
The idea that a language based on a small, elegant set of composable primitives is inherently better for programming in the large as well has not been borne out in practice.
Upvoted for an interesting take, even though I disagree with some of it.
I took 61A from bh. Personally, I agree with bh's statement that you quoted. Where I encountered difficulty was applying the ideas in a different context (e.g. C or Java). Brian spent time addressing this precise difficulty (in the last lecture or so), but it still wasn't enough for me.
I do heartily agree with you calling out "the pathological worship of the abstract over the concrete". Knuth's Concrete Mathematics was also bucking this trend (e.g. https://youtu.be/GmpxxC5tBck?si=tRHQmuA4a-Hapogq&t=78). I'm curious, once you came to this opinion/realization, how did your teaching/learning change?
I won't dispute your experience, but for me the point did hold true. SICP was my first introduction to anything Lisp-like, and by that point I'd done C/C++, a bit of Java, quite a bit of Perl/Python, and of course BASIC.
And I was really surprised how quickly and effortlessly I picked up the part of Scheme taught in the book. Faster than any language I had encountered thus far - Python included.
People who complain about parentheses have never tried structural editing. Change my mind!
I tried Racket with the recommended IDE setup (VS Code) - does that have structural editing?
Here are my notes from 2023-07:
https://docs.racket-lang.org/more/index.html
* okay, seriously. functional programming is nice, but these fucking parentheses are ridiculous. the VSCode extension is okay, but doesn't help at all with formatting, etc.
* "car" and "cons", yeey, but "first" would have been so hard?
* the whole "define x 'x" is also meh.
* no return, so sometimes something just takes the last one and returns.
* there's string->url ... why not string->parse-url .. no, would have been too verbose. MAYBE YOU COULD HAVE SAVED SPACE BY OMITTING THE FUCKING PARENTHESES
/end notes

Hehe... well... I think I will keep trying it again every few years. Is there a pythonish version, where indentation matters and there's no need for wrapping parens?
but these fucking parentheses are ridiculous
I thoroughly agree. I am deeply into functional programming, but syntax built entirely around endlessly nested parentheses has never felt like anything but a nightmare to me. Doubly so because even in 'clean' code it's reusing the same syntax with what are for most coders three clearly different logical concerns (defining functions, listing statements to execute in order, and invoking functions).
what are for most coders three clearly different logical concerns
That's the imperative model, whose foundation is the Turing machine. Lisp comes from the lambda calculus, and you're not really doing the above. At its core there's the value (or the symbol that represents it), then the abstraction, which defines how to get the value, and the application, which lets you say what to replace the unknowns (variables) in your abstraction with, so you can get the value. And it's recursive.
A program can be a value (not that useful), an abstraction (kind of like a nice theorem), or an application (the useful choice). Defining a function is creating a named abstraction, which is just a special value. Invoking a function is applying the parameters to that abstraction, which means replacing variables to get a more simplified version, with the goal of finally having a value. If you can't get a value, that means the result is still an abstraction and you still have to do more applications.
You either have a symbol or atom (which is a value) or you have a list which is an application (except the empty list, which is the same thing as nil). An abstraction is created by special forms like (defun foo (bar) ...) in CL. but the result of the latter is still a symbol. An atom is equivalent to itself, and a list is equivalent to having applied the abstraction represented by the first element to the rest. Anything else is either special forms, macros, or syntactic sugar.
So unless you're applying imperative thinking to the lambda calculus model, there's no confusion possible.
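A worked reduction in Scheme notation:

```scheme
;; application = substitute the argument for the variable,
;; repeat until only a value remains:
((lambda (x) (+ x 1)) 4)
;; => (+ 4 1)   ; 4 replaces x in the abstraction's body
;; => 5         ; a value: nothing left to apply
```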
I don't have to be confused by it to dislike how it ends up working out in actual code, even "clean" code, in terms of ease of understanding and maintainability.
these fucking parentheses are ridiculous
They're just different. And once you've become familiar with the language, you miss s-expressions every day, because they're just that easy to work with, especially with something like paredit. Why? Because the grammar is easy to parse and reason about. The whole program is a tree, and evaluation mostly works from the leaves to the root.
"car" and "cons", yeey, but "first" would have been so hard?
It comes from the nature of the language. "cons" constructs a pair of values; "car" gets the first one, while "cdr" returns the second one. But lists are composed of cons cells (check how that works), and in that case you could argue for "head" and "tail" as the function names. But "car" and "cdr" came first, and you can alias them easily.
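That is, a list is just nested pairs:

```scheme
;; '(1 2 3) is sugar for a chain of cons cells ending in
;; the empty list:
(cons 1 (cons 2 (cons 3 '())))   ; => (1 2 3)
(car '(1 2 3))                   ; => 1      (the "head")
(cdr '(1 2 3))                   ; => (2 3)  (the "tail")
```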
no return, so sometimes something just takes the last one and returns
The computing paradigm is expression evaluations, not sequence of instructions (although sequencing is there). An s-expression is always equivalent to something, and that something is what you're computing. Something like (first '("name" "email")) is the same as "name". Or (if (> x 0) :pos :neg) with x = 5 is the same as (if t :pos :neg) and the same as :pos. [0]
No return is needed. It's tricky when doing iteration, but whenever you're doing sequencing (they mention that in the docs for CL), the last value is equivalent to the whole thing.
[0]: https://en.wikipedia.org/wiki/Lambda_calculus#Reduction
Your notes indeed suggest that you've not been using structural editing.
Aside from that, you could have tried to use `first` instead of `car`. It would've worked.
And yes, there happens to be a pythonic version named `Rhombus`, which is the flagship language of the racket system.
DrRacket probably would have served you better. That or Emacs.
You might be interested in Rhombus: Racket with ML-inspired syntax. https://docs.racket-lang.org/rhombus/index.html
* there's string->url ... why not string->parse-url .. no, would have been too verbose. MAYBE YOU COULD HAVE SAVED SPACE BY OMITTING THE FUCKING PARENTHESES
string->url is consistent with the way they do things in Racket. Note in that same document you linked the use of number->string and string->number; the -> indicates a type conversion. Along with string->url there is also the reverse, url->string, and some other conversion functions. That consistency is actually pretty nice: it means you can guess and check ("I have a string and want a url, will this work?" Oh, great, it does!) or guess and search the docs before checking with the REPL or whatever. https://docs.racket-lang.org/net/url.html
* "car" and "cons", yeey, but "first" would have been so hard?
car shows up once, cons not at all, but he does use cdr. first, second, and rest are available; I don't know why he didn't use them in this demonstration. If you want to use first, go for it.

I complain about parentheses, and yet I'm building a structure editor[1]. Two complaints about parentheses:
- Parentheses obscure the structure of the language. When different syntax is used for different parts of the language, it's easier to visually scan the code. This is simply because punctuation/arrows/etc. visually stand out more than plain identifiers.
- Parenthetical languages have just as much syntax as non-parenthetical languages, it's just a shittier syntax. Try writing (let (x 1) (y 2) (+ x y)) and see what happens: it's a syntax error. Look at the indentation rules in DrRacket: there's let-like, cond-like, etc., because all of those constructs have different syntax. But it all kind of blends together in a sea of parens.
This weakness of paren-heavy languages is also their greatest strength, though. Since the syntax is so generic, it's easy to extend, and these language extensions can be indistinguishable from the base language. That's a big deal!
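For reference, the shape let actually accepts:

```scheme
;; let takes one parenthesized list of bindings, then the body:
(let ((x 1)
      (y 2))
  (+ x y))   ; => 3
```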
BTW, what structure editor would you recommend? Which have you tried?
I would never have gotten into programming if all I had been shown were some quixotic features of Scheme I couldn't use for anything in the real world. What attracted me to programming was quickly learning how to draw a graph of a function, how to make a simple worm-like game, how to dream about creating a flight simulator and drawing the plane's dashboard with all its dials and making them move (even if the flying part stayed deep in dreams), drawing a car on the screen with some air arrows to simulate aerodynamics and dreaming of building software around it. But dry string manipulation with weird syntax was not very interesting, and in fact rather repulsive.
I love SICP, but I will say this: there's a certain class of "great" textbooks that sometimes get recommended by experts, that aren't actually that great for someone learning something for the first time. They get the reputation as being "great" because they are great for people that already have a good basic grasp of the material: They are precise and pithy, and they contain insights that are deep and useful, but hard to appreciate if this is your first encounter with the material. (Kleppner & Kolenkow would be a good example of a freshman physics textbook in this category.) With all due respect, I would put SICP in this category. Which, of course, is a noble and important category, but maybe not the most important category if you're just starting out on a topic.
Very good point. That was literally the magic of early Basic. One could do something quickly. And the imperative nature of the language gave a simple point of attention mental model. I feel that point is lost in many criticisms.
When I was first introduced to computers, I was shown BASIC and Logo (Apple II plus). Logo felt a lot like a LISP (at least, that's how it was presented to me... ) with a very easy way to make graphics (turtle model).
I couldn't really get into Logo- it felt too indirect, too computer-sciencey, so I went into BASIC and later Machine Language to learn how to build games (I was influenced by Choplifter, Lode Runner, Wizardry, and Ultima). It wasn't until some time later that I learned LISP and began to appreciate functional programming and the turtle model (I still am amused at how much I learned on that original Apple II that still applies today to tech like SVG).
I reflect a lot on how Norvig was a LISP guy and then when he saw what was happening in scientific computing, pivoted to treating Python like LISP (https://www.norvig.com/python-lisp.html and https://www.norvig.com/lispy.html) although I don't know that anybody really anticipated we'd have systems like Jax which really turn functional programming in an imperative language on its head.
The danger in demanding to just be taught how to be a good code monkey is that that increases your risk of getting stuck just being used as a code monkey.
My intro CS class was taught in Scheme. I haven't found what I learned in that class to be quixotic at all. Instead, I view it as setting me up with a strong foundation that colleagues of mine who learned in a more "popular" language lack. It's put me at a permanent advantage for designing and building software - even decades later, I find that there are ways of decomposing problems that I can understand easily, and colleagues of mine who learned on Java still have trouble with, because they simply lack the "design vocabulary" that would enable them to think about the problem in the right way.
We never again used the language past that first introductory course, but I still don't think it was a waste of time, because it allowed us to cover so much ground so quickly. In 3 months we went from basic fundamentals to building our own object-oriented programming system from scratch. That's amazingly powerful. A deep knowledge of what object-oriented programming actually is, and how it works under the hood, is usually considered an advanced and esoteric concept that few people really understand. Fast forward about 15 years, and having that kind of knowledge in my head allowed me to take a job at a Java shop, having never touched the language previously, and then within just a few short months have a deeper understanding of how Java works than colleagues of mine who had been using it on a daily basis for 20 years. So that they were coming to me for help with Java problems, despite having an order of magnitude more experience with the language.
Rewind back to school, and it's the same story. The second class in the freshman curriculum used C++. It was not a course in C++, mind - it was a class on algorithms and data structures. Learning C++ was just the first couple of weeks. And learning it mostly consisted of being taught the syntax and the build and debugging tools. The semantics for everything - pointers, classes, templates, etc. - took a negligible amount of time because it could be explained to us in terms of concepts we had already learned by building ourselves - in a 3 month intro class.
Quixotic features like functions and variables.
Edit: Sorry, was cranky from being hungry. I take issue with “useless in the real world.” The first chapters introduce variables, functions, recursion, and lambdas, and not much more than that. That’s the beauty of the book. Flexible ideas that are immensely useful and form a great foundation for learning computing. It’s not the best total beginner book, I’ll admit, but it’s a great beginner-intermediate book for those who seek mastery.
I always found the first half of the book a light read, an exciting read, it was as wonderful to read for me as Lord Dunsany, I sped through it.
When it started to get to object orientation I started having problems following. It just didn't make sense. Maybe I should try again, but it just always made me feel that for my particular mind functional thinking was better.
From personal experience, I can say a person can build a fruitful bread-and-butter programming career on the first two chapters of the book, provided you solve literally all the exercises. Such is the density of know-how and design context packed in there. This also makes the book a challenging read. I feel the mathy content is also not for everyone. After all, it was aimed at MIT engineering undergrads, who are expected to have a baseline mathematical facility well above the average high school level.
I find the difficulty of the chapters goes as the square of their number - viz., chapter 5 is 25 times as hard as chapter 1. I also find that basically no one goes past chapter 4. I've only ever seen one good talk on it, and the guy was so far down the esolang rabbit hole that he had an IBM APL keyboard.
Not sure I'd call APL an esolang - it's just an old language focused on array manipulation like you'd find in numpy
Hm... I read half of the book (with exercises) and stopped. I don't remember why, but it was probably OO for me too.
FWIW the book has moved to https://mitp-content-server.mit.edu/books/content/sectbyfn/b...
The old url at http://mitpress.mit.edu/sicp/full-text/book/book.html is now 404, and most of what you find on that site will try to get you to do an "eTextbook rental".
Is the 'How to get Scheme' still appropriate even though it was last updated in 2003?
Are you asking whether the "MIT/GNU Scheme" project is still viable?
The last release is from January 2023. Seems fine enough for learning.
Apart from another HTML rendition, there's also an e-reader-ready epub rendition (from the HTMLs obv.) at https://github.com/sarabander/sicp?tab=readme-ov-file#sicp
Can someone compare it to PAIP? I assume SICP handles functional programming and CS basics, and PAIP is more about programming practice in general?
Similar themes. SICP is a little more academic. You build a compiler, do more math, etc.
PAIP also focuses more on Common Lisp specifics.
Different audiences too. SICP is meant for first year college students with little or no CS or programming experience. PAIP assumes no Common Lisp experience, but it's not an introductory text for CS. It's an introductory text for GOFAI and Common Lisp.
Though, PAIP covers a lot of SICP content, but differently. It also includes implementations of programming languages. PAIP is only partly an introductory text to Common Lisp, it's more about actual software design and development with Common Lisp.
IMHO PAIP is extremely well written. Book and Code: https://github.com/norvig/paip-lisp
SICP has a mixed reception. Many find it excellent; others find it too focused on teaching CS concepts in a math-oriented way. Thus several alternatives to SICP have been written, using Scheme, Logo, other functional programming languages, ... I think it's a great book, with a lot of excellent books written in reaction to it.
I want a searchable resource where I give my inputs and desired output and brief description of the algorithm and it gives me the algorithm and code.
Same with TAOCP, but I greatly prefer Scheme to MIX.
Otherwise these works just seem like doomsday-rebuilding references rather than knowledge with any practical presentation.
SICP isn't an algorithms book or a reference. It's very unlike The Art of Computer Programming.
I think your searchable resource is called a "software engineer".
I want a searchable resource where I give my inputs and desired output and brief description of the algorithm and it gives me the algorithm and code.
It sounds like you'd like to outsource your problem solving and thinking. Some things require work, and from that you learn and grow, which is the greatest reward.
And here I thought the goal of SICP was to scare off folks who were getting into CS because it was trendy rather than because they enjoyed it...
Seriously, it was a really valuable foundational course. But 100% it scared some folks to other majors.
With a 50% fail rate, I think the standard course load does a very good job of that anyway.
And here I thought the goal of SICP was to scare off folks who were getting into CS
Well, the acronym is pretty close to SCP...
In addition to the free book online, YouTube has the original lectures from the early '80s, complete with quirky humor, super early attempts at graphics, and late-'70s fashions all around.
Oh, and educational too!
And the best introductory lesson ever, where they explain to you what programming is.
Prof. Abelson's original lectures are available on YouTube and WELL worth the time to watch. It's unfortunate that this type of granular understanding of CS is seemingly not taught anymore; it might help dampen the hype cycles of late, which equate a computer to a divine being of limitless potential instead of a dumb box full of switches.
My favorite class in college was my fundamentals of computer systems class.
It finally clicked how assembly instructions could draw images on a screen and all the magic inside my computer vanished instantly.
Big fan of dumb switch boxes
It's an interesting text for sure, and even became a bit of a meme, but I think SICP has only contributed to the trend of overabstraction and increasing inefficiency in software.
I had been programming professionally for 3 years (since age 15) when I took "SICP" at MIT in 1980 (we used LISP; Scheme would come a few years later). The course is designed to appeal to the ego of the professors who teach it, and it is designed to mold MIT students into AI researchers, which is the whole point of the MIT curriculum:
1. They taught a bunch of extremely advanced concepts, like infinite lists of prime numbers where the 'CAR' and 'CDR' functions could be used to iterate down the list, with the next prime computed on demand (roughly the stream construction sketched after this list).
2. At the end of the course, they made the horrific mistake of encouraging students to write their programs in LISP and then embed the LISP code into other languages, such as Algol, which produces the most extreme spaghetti code in the history of mankind!
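For readers who haven't seen it, that first idea is roughly SICP's stream construction. A minimal sketch in portable Scheme, with cons-stream written as a delay/force macro:

    (define-syntax cons-stream
      (syntax-rules ()
        ((cons-stream a b) (cons a (delay b)))))
    (define (stream-car s) (car s))
    (define (stream-cdr s) (force (cdr s)))

    ;; The integers from n, as an infinite stream
    (define (integers-from n)
      (cons-stream n (integers-from (+ n 1))))

    (define (stream-filter pred s)
      (if (pred (stream-car s))
          (cons-stream (stream-car s)
                       (stream-filter pred (stream-cdr s)))
          (stream-filter pred (stream-cdr s))))

    ;; Sieve of Eratosthenes: each prime filters its multiples
    ;; out of the rest of the stream, on demand
    (define (sieve s)
      (cons-stream
       (stream-car s)
       (sieve (stream-filter
               (lambda (n) (not (zero? (remainder n (stream-car s)))))
               (stream-cdr s)))))

    (define primes (sieve (integers-from 2)))
    ;; (stream-car (stream-cdr (stream-cdr primes))) => 5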
The problem with SICP is that it teaches many advanced language concepts but the students are unprepared to absorb them and haven't ever struggled with the types of problems these advanced concepts are meant to solve! It's like showing a peasant farmer how to drive a modern combine before they've ever tried ploughing their field with a horse plough! I can pretty safely say, as a systems programmer, I have used exactly zero of the concepts taught in SICP back in 1980.
But most importantly, when professors teach SICP, they can feel good about themselves, because they get the misguided impression that they are teaching something that grows their students' capabilities as programmers, and when talking to other professors teaching introductory computer science, the SICP professors can say, "look at this cool shit my students did - I bet your students wish they could write shit as cool as this!" - end of story.
Incidentally, I never went to class and got an "A" in the class. It was annoying to have to take this remedial brainwashing class after already taking 2 other CS classes (including introduction to programming and assembly language programming) at the University of Illinois.
I've slowly been working through SICP, currently on chapter 3: https://mk12.github.io/sicp/. There are lots of other websites, but what sets mine apart is that the exercise solutions are written as modules with explicit dependencies on other parts of the book, and they are tested in three Scheme implementations. I describe my approach here: https://mk12.github.io/sicp/exercise/language.html.
We'll find out pretty soon whether the course can survive my own retirement.
(Footnote: Nope. Berkeley's new first course for majors uses Python...)
This is the saddest part of the whole post. Taking the intro course in Scheme set me on a path of thinking about computing in a whole new way. While they try to teach the same concepts with Python, it's just not the same thing as using a truly functional language.
A huge loss for Cal students for sure.
most introductions to computer science use whatever is the "hot" language of the moment...
So the coming introductory courses will use JS + LLMs?
This book kicks ass. Ignore the incurious.
I have never gotten past the first few pages. Had it ever been a mandatory class in my education, I would be working with something else today.
I miss the structure in the layout of this read.
The best lesson I took from SICP (which I read years after I was already a competent programmer) is that the very moment you write your first function, you are in effect creating a DSL for solving your domain problem.
It truly levelled me up as a programmer when I was hit with that insight.
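A trivial illustration of that insight in Scheme, with invented names: two one-line definitions, and the rest of the program already reads in the domain's vocabulary rather than in raw arithmetic.

    (define (net->gross net rate) (* net (+ 1 rate)))
    (define (discount price pct) (* price (- 1 pct)))

    ;; The expression below is written in the little language
    ;; those two functions just created:
    (discount (net->gross 100 0.2) 0.1) ; => 108.0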
Sadly my first exposure to programming was in FORTRAN on a mainframe (when I took a summer course at a college in 8th grade). I could not initially make heads or tails out of fortran or the imperative approach, but I knew it was magical. Eventually bought a VIC-20 and other computers later.
By the time I was ready to start college, I was well versed in a number of (imperative) programming languages, data structures, algorithms, etc. However, I was exposed to SICP in college and fell in love with both the book and the emerging functional languages (this was the late 80s).
SICP remains my favorite CS / programming book of all time.
The only way it would have "benefited" would be that the web would only be developed by lisp programmers.
To the vast majority of programmers, syntax matters. C-style with brackets, or Python whitespace, or Ruby do/end: these better fit the brains of the majority of programmers. Perhaps not the majority of HN readers, but the majority of corporate devs.
Another example of this is Erlang and Elixir. Elixir adds a couple of features over Erlang, macros and protocols, but Erlang does everything else. What made Elixir take off where Erlang didn't, after decades, is that Elixir has a syntax that people are comfortable with. Erlang has a syntax that will summon Cthulhu.
Is it an innate property of humans that the curly-brace style is more natural? I wonder if in an alternate universe where Lisp took off as the browser language people would find it more natural instead. It seems like somewhat of a chicken-egg problem.
Does decades of empirical evidence not prove that people are more comfortable with imperative, curly brace programming over s-expressions? It's not a chicken and egg problem. The egg has hatched and nested parentheses lost.
You may be right, idk, but I want to point out that you’re conflating two orthogonal concepts: S-expressions and imperative vs. functional programming.
There are lisp dialects that are very imperative, for example elisp, but they still use S-expressions. Historically they might have been considered “functional” because they have first-class functions and higher-order functions like mapcar, but nowadays practically every modern programming language (except go!) has these.
The thing all lisp dialects have in common is not where they land on the imperative vs. functional spectrum, but rather the fact that the syntax is trivial and so it’s easy to write powerful macros.
I think the simple uniform syntax is the main reason why Lisp never became popular.
Code is communication, and communication needs redundancy for error correction. You can see it in natural languages, and it makes sense to have it in programming languages as well. Using different kinds of syntax for expressing different ideas is an easy way to increase redundancy without making the code more verbose.
Lisp did become popular.
Then the AI Winter killed it and people avoided it like the plague.
The Lisp machine companies killed Lisp. They were the ones who blew Lisp sky high, but they also created expensive, monstrous workstations running Lisp images requiring tens of megabytes of RAM. Developers used them for prototyping and then had to cobble something together to ship to the users, who had hardware like IBM PC machines with less than a meg of RAM.
Today's cruft, like your Python, JS, whatever, would not stand a chance in the world of the 1980s on that hardware.
It's amazing how far they were able to bloat up Lisp while continuing to peddle it commercially.
Leaner Lisps running on small systems existed all along, but they couldn't rescue Lisp from the associations brought about by big Lisp.
This is what fascinates me about Unix: they created an OS which works with text and processes, as opposed to binary structures and function calls, when computers were hundreds of times slower. Even today the overhead of process creation and serialization for pipes is not negligible; how the hell did they manage it in the 1970s?
Or as I tell my colleagues who push for more abstract syntax: do you want your brain to do compilation each time you read something, or just have verbose text giving hints at each line?
It's weird that people prefer reading implicit text.
Isn’t this a big reason we have syntax highlighting? You can use color and styling to give you those hints that are otherwise implicit in text.
Clojure has all those other braces also. They’re just used for data structure literals rather than blocks of code.
I think it's quite telling that almost all of the innovations in lisp (garbage collection, first class functions, repl etc) have been absorbed into more popular languages except for s-expression syntax, which remains a small niche despite many great implementations of s-expression based languages.
Because as soon as you adopt the s-expressions, what you've got is no longer <language>, but Lisp itself. Something like this (a hypothetical reconstruction of the kind of example meant):
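    ;; (hypothetical reconstruction) C-style:
    ;;   if (x > 0) { y = x * 2; } else { y = 0; }
    ;; would become, as s-expressions:
    (if (> x 0)
        (set! y (* x 2))
        (set! y 0))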
No. There is a good GitHub gist rant I can't find anymore, but if we call every AST in the form of s-expressions Lisp, then is anything Lisp? A programming language has to have an associated evaluation strategy; otherwise it's just data. What you wrote only makes sense to execute as C code, which, sure, you can write a compiler for in your given Lisp as well (just as you can write a C compiler taking a C AST in any other language, so it's not special at all).
This one? https://gist.github.com/no-defun-allowed/4f0a06e17b3ce74c6ae...
It also responds to a few parents up "almost all of the innovations in lisp [...] have been absorbed into more popular languages" - pervasive interactivity hasn't even been taken up by some "Lisps", let alone has it been absorbed outside Lisp.
Yes! Thanks for digging it up for me! I can't find anything on Google for the life of me since they switched to vector search, even though I used to be able to find even obscure blog posts...
Right. From which we can infer people like many things about lisp except for the syntax.
Has there ever been research on this? Perhaps this situation has come about because the schools people must go to to get the programming jobs only teach the Javascript way? It seems circular logic to say that the current paradigm must be superior for the fact that it is the current paradigm. Is it possible that there are other reasons it reached that status?
does n=1 count? :)
Some time ago I tried Racket, and just no. Recently I tried Scala ZIO HTTP, and yes.
Maybe it's the types? Maybe it's the parens. But probably both. I cannot really recall my experience, just that manipulating code was ridiculously clunky. My assumption was that the IDE would manage the parens for me, and that when I moved something somewhere it would figure out if I messed up the parens... and no, nothing. I had to balance them by hand.
One reason Emacs is popular for Lisp programming is that the paredit package (or its newer competitor, smartparens) does basically exactly what you describe: structural editing of sexp-based languages.
Decades of empirical evidence prove that people are more comfortable with functional, reactive, begin/end-delimited programming, i.e. Excel.
Millennia of empirical evidence, through to today, show that most people are more comfortable not coding at all.
True, the majority of humans never even saw a computer.
No, it doesn’t.
What has happened in reality is that C became really popular, and then all the people designing languages they wanted to be popular, rather than experimental or boundary-pushing, obviously chose a syntax which was familiar to most programmers, i.e. a syntax like C's.
Further, one can disprove that the syntax is particularly important by simply pointing to Python, which became immensely popular despite a lack of curly braces, and, even worse, with significant whitespace, simply because colleges and bootcamps decided it would be a good language to teach programming to beginners.
Arguably Python and C are much more similar to each other than either is to a Lisp.
I would argue the important part is the blocks in the former two, which sort of get lost in the homogeneity of Lisps. Whether a block is marked with curly braces or indentation doesn't matter much; what matters is that blocks look distinct from an ordinary expression. Of course well-formatted Lisp code indents as well, but there is still a lot of visual noise there, which I would guess makes it harder to visually inspect the code.
Of course familiarity with a given way is significantly more important. We all learned the non-intuitive notation of written math; to Chinese people their writing system is the intuitive one; etc.
No, because people who start in programming do not go to a syntax comfort clinic, where they are tested, and then assigned to a programming language.
It only proves that those languages are the most learned because they are the most popular in industry.
It says nothing about what makes a language easy to learn.
Hmm, can't find the paper (mostly clutter from language bootcamp results) but around a decade or so back there was an education research project that concluded that teaching SQL first, rather than any imperative language (regardless of punctuation), was better for getting students to develop reasonable mental models for computing. (Unfortunately without the reference I can't address what the criteria for "better" were - but "what people get paid to do" isn't really proof of comfort at any level...)
Does over a century of empirical evidence not prove that people are more comfortable with keyboards whose top row is laid out "QWERTYUIOP"?
I would argue that imperative programming is most natural - it's what everyone gravitates to in the beginning. Then, at a sufficient level of complexity, a programmer gravitates to solutions like OOP or FP, but there's an obvious trade off in readability there. 99 Bottles of Beer implemented with a loop is intrinsically going to be easier to read than an implementation with tail recursion, even though the latter is generally better. Lisp's inside-out parentheses style adds yet more cognitive load on top of that.
Many things are socially constructed, but not everything.
When 6.001 (the introductory class for which SICP was written) was launched, most of the students who took it had never used a computer before. Yes, MIT students. This was around ~1980. And in the first hour of their first class they were already doing symbolic differentiation in scheme.
I think your view of what's "natural" is a just-so story.
People heavily trained in maths can take quickly to languages designed to make programming look like maths, that's hardly a surprise.
I wouldn't base my assumptions about what most people find natural on the experience of MIT students taking 6.001 in 1980.
(Not to mention, 'doing' is doing a lot of heavy lifting in that sentence. I could show you an intricate sentence in French in the first hour of your first French class, but unless you came up with it yourself, are you demonstrating much learning just yet?)
But you are basing your assumptions on absolutely nothing.
I'm basing my assumptions on my own experience, both learning to code and teaching others to code, which isn't nothing to me, but may well be nothing to you. (No shade intended; that's totally valid.)
I would certainly be interested in the results of a study that put a simpler interpreter / compiler and a language reference in front of motivated non-programmers, but I strongly suspect that the amount of elegant tail recursion we'll see will be limited (and I'd very much expect there to be a correlation between that and a training in mathematics).
Imho, data comes from experiments, but experiments come from hypotheses, and hypotheses come from experience.
Yeah, mid-1980s, the thing incoming students had to unlearn was BASIC, not anything with curly braces. (Source: I was a 6.001 Lab TA at the time.) Of course, the next class on the rotation used CLU, where you had to unlearn "recursion is free".
Imperative programming is probably the most intuitive, but I'm doubtful curly braces and C-like syntax are anything more than coincidence. The first programming language was Fortran, and it didn't look anything like C. This is a really old Fortran program copied from a book:
Most modern programming languages seem to take inspiration from C, which took inspiration from BCPL, and that from Algol. Others took inspiration from Algol directly, like Ada, or Lua. And Python has indentation-based block structure, rather than having blocks of statements delimited by braces or an "end" keyword.

I always liked Pascal's BEGIN and END statements instead of curly braces. There is also Basic, where the blocks are built into the control flow statements, like FOR I=1 TO 5 [code here] NEXT I.
I'd argue a lot of programming language evolution is influenced by the capabilities of our IDEs. When you code in a text editor, the terse syntax of C is great and brings advantages over the verbosity of Pascal, Basic, or, god forbid, Cobol. Once your editor does auto-indentation, the braces seem redundant and you get Python. Smart completions from IntelliSense are essential to efficiently writing C#, and now that LSP has brought that to every IDE and smart text editor, we have the explosion in popularity of more explicit and more powerful type systems (TypeScript, typed Python, Rust). Programming languages are shaped by their environment, but the successful ones far outlive the environment that shaped them.
It really depends on your mindset. I grew up with math (composable operators, no side effects) and a lot of immutable + virtual-operations software (Maya, Samplitude, Shake, Combustion)... so to me imperative programming, with all the control flow, subtly changing state, time dependencies, and coupling of concerns, was almost instantly a fatal issue.
Backus also shifted away from imperative-inspired languages to design the FP/FL languages (I thought they were contemporaries of BCPL, but they came ten years later, even later than APL), even though he contributed to FORTRAN directly.
I'd argue an FP implementation with map (something like `[99..1].map(|n| f'{n} bottles of beer ... {n-1} bottles of beer on the wall').join('\n\n')`) is inherently as readable as the for loop, and not really more complex.
There are lots of great parts in FP, and for the last ~10-15 years imperative programming languages have made a lot of effort to add them to their syntax. You just need to leave out the more dogmatic parts that make FP popular in academia.
Hehe, it's easy if you ignore half the song, the singular for n=1 and the whole n=0 case! (Not that we're talking about rocket science if you don't, but c'mon, oranges to oranges!)
I agree with you otherwise though.
Why is tail recursion better generally? I'm not familiar with FP very much, but it feels like loops more closely resemble the way computers execute them than tail recursion.
It's more concise and illustrates the common subproblem. Loops make you model a state machine in your head, which I'd rather leave to the computer.
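For what it's worth, here is a minimal Scheme sketch of the two styles, with the verse trimmed to one line. A proper Scheme implementation runs the tail-recursive version in constant stack space, just like the loop:

    ;; Explicit loop state, via the R5RS do form
    (define (bottles-loop)
      (do ((n 99 (- n 1)))
          ((= n 0))
        (display n)
        (display " bottles of beer on the wall")
        (newline)))

    ;; Tail recursion: the "loop variable" is just an argument,
    ;; and the recursive call is the last thing the function does
    (define (bottles n)
      (if (> n 0)
          (begin
            (display n)
            (display " bottles of beer on the wall")
            (newline)
            (bottles (- n 1)))))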
Why do you believe this is anything more than an historical accident?
For example, it wasn't what Alonzo Church gravitated to when he invented the lambda calculus in the 1930s, before any programming languages or indeed general-purpose computers existed.
First, you don't need to use explicit tail recursion. See e.g. https://99-bottles-of-beer.net/language-haskell-1070.html
Second, this sounds like unfamiliarity, not anything inherent. Why is it "intrinsically easier to read"? For a tail-recursive version, the main function in Haskell amounts to a couple of pattern-matching equations, and with a bit of experience you'd write it even more compactly. It's only less easy to read if you're completely unfamiliar with the concepts of pattern matching and recursion. But the same is true of any programming language. Given the above, what's a "for loop" and why would you need one? Sounds complicated and unnatural.
CPUs are imperative.
This is completely socially constructed.
Lisp was once a very popular introductory programming language and students learned it just as easily or easier than any other language.
Assembly is imperative, so there's a lot to be said for a language that mimics how the computer actually works. Lisps always leave me saying, "oh, that's clever."
OOP is imperative programming. It's just function calls where the first parameter is to the left of the function name, after all.
A better name for "non-OOP" programming is procedural programming, where you organize code in long blocks that go straight down, code duplication is accepted vs jumping all over the place, etc. Honestly underrated. It can be quite easy to understand.
Strictly-evaluated FP is also imperative. The only really different languages are the ones with different evaluation systems or that can do things besides evaluate - people like to say Haskell is the best here but I think it's actually unification languages like Mercury. Maybe even SQL with transactions.
I don't know if it's innate but it's what we have. Lisp has been around about as long as programming, it's had plenty of time to catch on, it hasn't.
Maybe innate, maybe it's an offshoot of teaching math in an infix style, 1 + 2 vs. + 1 2.
Lisp became very popular, then died off rapidly due to association with the AI Winter.
But good college math departments teach reverse Polish notation; i.e., Hewlett-Packard over Texas Instruments. It’s demonstrably more advanced / efficient.
I don't think it's been tested at all. For people who took and finished a course in Lisp as their first programming language, how many "hate parens"?
I have no trouble with Lisp's parens; I like them. What I never liked, though, is that the first item in a list is an operator, a verb let's say, and the rest are the nouns; whereas you can also have nested lists, say of numbers, where there are no operators. Never felt right (not that I can think of a better way, at least not one worth adding more parens for).
I think it's innate that having differentiated syntax for different types of grouping is natural. Look at mathematical papers where people will introduce new brackets with new meanings. (Indeed look at the entirety of QM for a clear, simple case)
Some Scheme and Lisp dialects have that. For example, Racket often uses square brackets instead of parentheses for things like clauses of a cond expression, and Clojure uses square brackets for vector literals and curlies for hash map literals.
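For instance, idiomatic Racket sets off cond clauses with square brackets, which the reader treats as interchangeable with parentheses:

    (define (sign n)
      (cond
        [(negative? n) -1]
        [(zero? n) 0]
        [else 1]))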
Common Lisp leaves brackets like {} and [] to the user (aka the developer). It supports "reader macros", where the user can extend or supersede the syntax of s-expressions.
So, specialized tools/libraries/applications can introduce these brackets for their own use. Examples are embedded SQL expressions, notations for Frames (special objects, in kind of a mix of OOP and Logics), grammar terms, etc.
Thus it explicitly supports the idea of "people will introduce new brackets with new meanings".
C-like syntax is brutally hostile to programming beginners. There is not a shred of anything natural about it.
There's nothing natural about programming, because no one likes to be that formalized in their thinking process, especially with things that should be common sense (although ambiguity is still an issue). It's the recursive definitions that get people. Especially when you point out that the things a computer can do form a very small set; it's just that it can do them very fast.
You can see that when observing novices programming (without Stack Overflow or similar help). They often assume that things will get done (magically) as soon as they call that function. And their code organization reflects ad hoc thinking instead of a planned endeavor.
The formality of programming is unnatural to many (though the impulse to formality and robustness, as displayed in logic and math, is clearly millennia old, however niche), but a fair bit of it is as natural as language itself, or magic.
The curly braces themselves are 100% irrelevant, as evidenced by the many, many successful and well-liked languages which don't use them, including Python, which is in the running for the most-used language these days. They're an implementation detail.
What's closer to innate is the Algorithmic Language, Algol for short, the common ancestor of the vast majority of languages in common use (but not, notably, Lisps).
Algol was designed based on observational data of how programmers, who had to somehow turn their ideas into assembler to run on machines, would write out those ideas. Before it was code, it was pseudocode, and the origins predate electronic computers: pseudocode was used to express algorithms to computers back when that was a profession rather than an object.
That pseudocode could have been anything, because it was just a way of working out what you then had to persuade the machine to do. But it gravitated toward a common vocabulary of control structures, assignment expressions, arithmetic as expressed in PEMDAS style, subroutine calls written like functions, indexing with square brackets on both sides of an assignment, and so on. I revert to pseudocode frequently when I'm stuck on something, and get a lot of benefit from the practice.
So I do think that what's common in imperative languages captures something which is somewhat innate to the way programmers think about programs. Lisp was also a notation! And it fits the way some people think very well. But not the majority. I have some thoughts about why, which you can deduce an accurate sketch of from what I chose to highlight in the previous paragraph.
Then why does this web page use indentation to clarify who's replying to whom, instead of {}s?
That alternative universe was the early 1980s, when Lisp was very popular to learn due to being considered the best language for AI.
Because Lisp is trivial to parse, it's easy to make an extension for VS Code that shows and edits Common Lisp / Scheme looking completely different; you can make it unrecognisable. If the () are the only thing bothering people, then this is very simple to resolve. It can hardly be the only thing, though. It's also easy to build operators similar to the ones you have in Python etc., like filter and map, so you don't have to write explicit recursion. These exist already, but you could build them yourself in a few hours (see the sketch below).
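Something like this, in Scheme (my-map and my-filter are so named only to avoid shadowing the built-ins):

    (define (my-map f lst)
      (if (null? lst)
          '()
          (cons (f (car lst)) (my-map f (cdr lst)))))

    (define (my-filter pred lst)
      (cond ((null? lst) '())
            ((pred (car lst))
             (cons (car lst) (my-filter pred (cdr lst))))
            (else (my-filter pred (cdr lst)))))

    ;; (my-filter odd? (my-map (lambda (x) (* x x)) '(1 2 3 4 5)))
    ;; => (1 9 25)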
So it's probably just what people learn first, plus a lack of 'marketing' or negative PR ("there are no libraries or ecosystem!" was the thing that least bothered me about CL, but people with npm-leftpad experience seem bothered by it).
It's interesting, as I have worked in almost everything in production: C/C++ (including the MS 90s flavour), Delphi, VB, Perl, PHP, Java, C#, Haskell, F#, Common Lisp, Erlang, TS/JS, Python, Ruby, asm (Z80, ARM, x86), and I simply have not had a better overall experience than CL. The others are better at some things, but as an overall experience, CL is just a pleasure.
Very much a personal anecdote, but I spent about a month earlier this year seriously learning various Lisps (CL, Racket, Scheme, Clojure). I stopped when it clicked for me: Lisps are a mess. Everything I wanted from Lisp I found in Haskell.
I'm reasonably confident that all the anecdotes you hear about 10x improvements from switching to Lisp are just programmers learning about functional programming and good design patterns for the first time. But those aren't contingent on using a Lisp, and I'd argue using Lisp brings an enormous amount of cruft and baggage that makes FP seem far more alien and difficult than it needs to be.
The 10x seems to mostly have been from programmers switching from extremely low level languages like C++ to something far higher level. Paul Graham also talks about macros in his well known essay, but I honestly think a lot of the value one gets from Lisp can be found with Python. There are a lot of things you don't have to worry about like manual memory management and so on. Python isn't as fast as lisp or as beautiful (opinion), but the ecosystem is very impressive and the community isn't as fractured as the lisp community (for example see the bipolar lisp programmer essay).
I don't think FP by itself is that massive of a win despite what some dubious studies or zealots say, but it's certainly better than enterprise Java. I've read my fair share of horror stories of Haskell in production too.
Ultimately, programming languages are tools, and different tools are appropriate for different jobs.
There's nothing stopping you from writing a massively-scaling e-commerce site in Verilog and running it on an FPGA, but it - uh - probably isn't the soundest course of action.
Yep. I'd agree with that statement. Although there are a lot of tools I'd struggle to use practically anywhere.
With all due respect, he is ironically probably the biggest blub programmer.
I think the idea that Lisp was so much more productive than other languages originates from a much earlier time. But now the most important features of Lisp - like garbage collection - are commonly available in most languages.
I agree with most of what you said here, but I want to emphasize that this is not necessarily a good outcome. Fitting the brains of corporate devs is not the metric to optimize if your goal is to make the best tool for the job; the majority of corporate devs are extremely mediocre at their jobs even with a language they're not scared of.
All that to say, I completely emphatically agree with the original comment. The world would have been so much better off with Scheme as the language of the web.
If the "job" is to make lots and lots of software, even if most of it is mediocre, then the best tool is what will enable millions of mediocre developers to develop, not just thousands of elite developers.
Significant white space would have been a disaster on the web.
As much as everyone poops on JS, it is a very forgiving language for embedding.
JS is fine for scripting, sprinkling a bit of interactivity on a page. The issue is when you want to create whole software out of it, and the last thing you want is forgiveness. You want the compiler and linter complaining loudly.
You have no idea whether this is actually true, or whether people have just fit their brains to what is out there.
The idea that programming language syntax fits people's brains rings untrue for anyone who has watched beginners struggle with it, or remembers being one.
Plus, add a heaping tablespoon of survivorship bias.
1. Many people try programming.
2. The vast majority of the people who try programming are subject to external forces that guide them to whatever they learn and use.
3. Out of these, a certain fraction sticks with it and is found programming in the long run, even working in it.
We could easily conclude (probably quite wrongly) that the popular languages turn people away from programming, except for a few weirdos for whom they click.
And yet, when you tell them the reasons why some other syntax than their Java/PHP/Python syntax would be better, they usually counter with "It's just syntax." or "Every language can achieve the same result, why care so much about syntax?" or similar.
I would need a source for that.
I think most programmers' brains have simply not been exposed to other syntaxes much, or were exposed much too late, when their brains had already calcified around C-style syntax. Or they don't actually care.
"The only way it would have "benefited" would be that the web would only be developed by lisp programmers."
Considering the state of the web I do not think this is making the argument you intend.
Millions have learned javascript because it is the technology of the web. Are they better off?
So many people have been introduced to programming, computer science, and programs through the abstractions provided by javascript, and that sucks
To be fair, what made Elixir take off, other than syntax, was also the overall developer experience, including documentation, a package manager, a unit-testing framework out of the box, a web framework initially inspired by Rails, etc.
Slightly off topic, but worth mentioning: the Erlang and Elixir communities support each other very well. For example, not only is Elixir built on top of Erlang, but Erlang also adopts some things from Elixir: the monadic expression `with` from Elixir inspired `maybe` in Erlang, and starting with OTP 27, Erlang uses ExDoc, introduced by Elixir, to generate documentation.
When Lisp was popular, developers learned it as easily or easier than any other language.
People aren't born familiar with C-style syntax. Quite the opposite: many people struggle with it for a long time! Back in the day it was super common to get frustrated because you left off a semicolon or something. Nowadays IDEs probably help, but what's the point of syntax that the computer could write for you?
There is so much begging the question in this thread that it’s absolutely mind-boggling.
JavaScript was originally developed in two weeks due to time constraints from Brendan Eich's employer. It would probably have just as many design mistakes if he had stuck with a scheme-like syntax instead. It's just hard to create a well-designed language in such a short amount of time.
Yes, Javascript is one of the seminal "sales already signed the contract" situations in tech.
Given the circumstances, it's amazing javascript isn't way worse.
It could have been VBScript.
As someone who's both versed in Lisps (CL, Racket, Scheme, Clojure) and JavaScript: thank god.
It's like the Lisp crowd can't be honest with itself and acknowledge the serious shortcoming of Lisp projects: maintainability.
Every single Lisp project out there (hell, this also applies to Haskell) is so radically different, because everybody's too busy reinventing their own abstractions and macros (and language extensions, in the case of Haskell), that you just throw your hands up in the air in dismay.
It's like the Lisp crowd, which can barely attract like-minded people to collaborate on some simple open-source projects (so you get 20 broken JSON parsers), cannot see the link between that issue and industrial Lisps.
I love lisps, they are fun, they have their place in the industry when you leverage them to their core strengths.
Try to ask yourself: why does PHP have more quality killer software than all of the Lisps combined?
Isn't that the problem with Javascript too, though? Javascript isn't good enough, let's all use jQuery. No, jQuery isn't good enough, let's all use Vue. No, Vue isn't good enough, let's all use Angular. No, Angular isn't good enough, let's all use React. No, React isn't good enough, let's...
This is very disingenuous. The frontend world does move quickly, but the part of the stack you're describing has been reasonably stable.
Vanilla JS was replaced by jQuery after about a decade. A decade later, React/Vue/Angular (React came first, not last) replaced jQuery.
There was a paradigm shift in the kind of websites that people wanted to build. For better or worse, we went from HTML + CSS with some JavaScript (jQuery) to SPAs. It's easier for a team of junior developers to build a SPA with React compared to jQuery.
The abstractions that you're describing changed because the goal changed.
---
There has been plenty of churn around build tools, libraries on top of React/Vue, the paradigms within React/Vue (Hooks, Vue 3 composition API), state management libraries, and the adoption of TypeScript.
Again, this isn't really reinventing abstractions as much as dealing with an evolving language that wasn't designed for the applications we're building today (though it is clearly quite capable!)
ECMAScript 2015: proper tail calls are now in the spec
Chromium: we’re not doing proper tail calls
But now they have them... for WebAssembly.
having call/cc would make my life better
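For anyone who hasn't met it, a minimal sketch of the kind of thing call/cc enables: early exit from a traversal without threading status flags through every caller.

    (define (find-first pred lst)
      (call-with-current-continuation
       (lambda (return)
         (for-each (lambda (x)
                     (if (pred x) (return x)))
                   lst)
         #f)))

    ;; (find-first even? '(1 3 4 5)) => 4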
I should write a Lisp browser.
R5RS Scheme (standardised in 1998) didn't even have structs. To make anything like the DOM practical, you'd have needed some non-standard extensions that could well have ended up being a mess anyway.
Hey, at least there’s Clojurescript.