Haskell has had a profound impact on the way I think about programming and how I architect my code and build services. The stateless nature of Haskell is something that many rediscover at different points in their careers. E.g. in web dev, it's mostly about offloading state to the database and treating the application as "dumb nodes." That's what most K8s deployments do.
The type system in Haskell, particularly union types, is incredibly powerful, easy to understand for the most part (you don't need to understand monads that deeply to use them), and highly useful.
And I had a lot of fun micro-optimizing Haskell code for Project Euler problems back when I was studying.
Give it a try. Especially if you don't know what to expect, I can guarantee that you'll be surprised!
Granted, the tooling is sh*t.
Haskell also changed the way I think about programming. But I wonder if it would have as much of an impact on someone coming from a language like Rust, or even modern C++, which have adopted many of Haskell's features?
True. I often think of Rust as a best-of compilation of Haskell and C++ (although I read somewhere that OCaml had a greater influence on it, but I don’t know that language well enough)
In real life, I find that Haskell suffers from trying too hard to use the most general concept that's applicable (no pun intended). Haskell programs happily use “Either Err Val” and “Left x” where other languages would use the more expressive but less general “Result Err Val” and “Error x”. Also, I don’t want to mentally parse nested liftM2s or learn the 5th effect system ;-)
Whoever wrote that is wrong.
If we could wave a magic wand and remove Haskell's influence on Rust, Rust would still exist in some kind of partial form. If we waved the same wand and removed OCaml's influence, Rust would no longer exist at all.
You are the one who is wrong, I'm afraid.
Which OCaml features exist in Rust but not Haskell? The trait system looks very similar to Haskell typeclasses, but I'm not aware of any novel OCaml influence on the language.
> Which OCaml features exist in Rust but not Haskell?
Rust's most important feature! The bootstrapped implementation.
I'm not convinced the implementation language of the compiler counts as a feature of the Rust language. If the argument is that Rust wouldn't have been invented without the original author wanting a 'systems OCaml' then fine. But it's possible Rust would still look similar to how it does now in a counterfactual world where the original inspiration was Haskell rather than OCaml, but removing the Haskell influence from Rust as it is now would result in something quite different.
Rust isn't just a language, though.
Additionally, unlike some languages that are formally specified before turning to implementation, Rust has subscribed to design-by-implementation. The implementation is the language.
That just means the semantics of the language are defined by whatever the default implementation does. It's a big stretch to conclude that means Rust 'was' OCaml in some sense when the compiler was written with it. Especially now the Rust compiler is written in Rust itself.
You're overthinking again. Read what is said, not what you want it to say in some fairytale land.
The original Rust compiler was written in OCaml. That's not evidence it "had an influence", but it's highly striking considering how many other languages Graydon could've used.
Yes: if a person knows nothing else about Rust and the languages that might have influenced it, then the fact that the original Rust compiler was written in OCaml should make that person conclude tentatively that OCaml was the language that influenced the design of Rust the most.
I'm not one to hold that one shouldn't form tentative conclusions until one "has all the facts". Also, I'm not one to hold that readers should trust the opinion of an internet comment writer they know nothing about. I could write a long explanation to support my opinion, but I'm probably not going to.
It's like trying to say Elixir wasn't influenced the most by Erlang
Was any Elixir interpreter or compiler ever written in Erlang?
If not, what is the relevance of your comment?
Elixir’s implementation still has significant parts written in Erlang. I don’t know if it’s a majority but it’s a lot. e.g.: https://github.com/elixir-lang/elixir/blob/aef7e4eab521dfba9...
https://doc.rust-lang.org/reference/influences.html
Haskell has algebraic data types, pattern matching and type inference, too, and has had them since Haskell first appeared in 1990.
Although SML is older (1983), OCaml is younger than Haskell.
https://doc.rust-lang.org/reference/influences.html
Rust was bootstrapped in OCaml.
https://github.com/mozilla/rust/tree/ef75860a0a72f79f97216f8...
I think it does, actually. Python also has many of Haskell's features (list comprehensions, map/filter/reduce, itertools, functools, etc.). But I only started reaching for those features after learning about them in Haskell.
In Python, it's very easy to just write out a for-loop to do these things, and you don't necessarily go looking for alternative ways unless you already know the functional equivalents. But in Haskell you're forced to do things this way, since there is no for-loop available. But after learning that way of thinking, the result is more compact code with arguably less risk of bugs.
If anything, Python encourages you to use loops because the backwards arrangement of the arguments to map and filter makes it painful to chain them.
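A small made-up Python sketch of both points, summing the squares of the even numbers: the explicit loop is the path of least resistance, while the map/filter version nests and reads inside-out because the function argument comes first.

    nums = [1, 2, 3, 4, 5, 6]

    # The path of least resistance in Python: an explicit loop.
    total = 0
    for n in nums:
        if n % 2 == 0:
            total += n * n

    # The functional equivalent: map and filter take the function first and
    # the iterable second, so chaining them nests and reads inside-out.
    total_fp = sum(map(lambda n: n * n, filter(lambda n: n % 2 == 0, nums)))

    assert total == total_fp == 56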
filter has also lost ground in favor of list comps, partially because Guido hates FP [0], and probably due to that, there has been a lot of effort towards optimizing list comps over the years, and they’re now generally faster than filter (or map, sometimes).
[0]: https://www.artima.com/weblogs/viewpost.jsp?thread=98196
Yes, but how do you chain them?
I made up the syntax for the last one, but most functional languages have a nice syntax for it; F# has one, and so does plain shell.
You don't. Use generator syntax, which is really the more Pythonic way to do it.
First off, writing f3(f2(f1(x))) is painful - keeping track of parentheses. If you want to insert a function in the middle of the chain you have some bookkeeping to do.
Second, that's all good and well if all you want to do is map. But what if you need combinations of map and filter as well? You're suddenly dealing with nested comprehensions, which few people like.
In F#, it'll still just be a flat pipeline.
Here's an example from real code I wrote. This would not be fun to write with list comprehensions, but you could manage (only two list comprehensions). Now here's some other code. BTW, some of the named functions above are defined with their own chain of maps and filters.
An alternative for Python is to flip what you're iterating over at the outermost level. It's certainly not as clean as F#, but neither is it as bad as the original example if there are a lot of functions. The list comprehension can then be moved up to mimic more closely what you're doing with F#, allowing for operations other than "map", and you can go a step further if you don't like the "result" reassignment (see the sketch just below this exchange).
Fair, but how would it look if you had some filters and reduces thrown in the middle?
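(A hedged sketch of the "flip the iteration" alternative described above; the original code isn't shown in this thread, and the step functions here are made up.)

    from functools import reduce

    # Hypothetical pipeline steps standing in for the original (unshown) code.
    def strip_ws(s):   return s.strip()
    def lowercase(s):  return s.lower()
    def shorten(s):    return s[:10]

    steps = [strip_ws, lowercase, shorten]
    data = ["  Hello World  ", "  FUNCTIONAL PYTHON  "]

    # Flip the iteration: loop over the functions at the outermost level and
    # reassign the intermediate result. The inner comprehension could also
    # carry an `if` clause, so this isn't limited to "map".
    result = data
    for step in steps:
        result = [step(x) for x in result]

    # A step further: fold over the steps to avoid the reassignment.
    result2 = reduce(lambda acc, step: [step(x) for x in acc], steps, data)

    assert result == result2 == ["hello worl", "functional"]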
In my F# file of 300 lines[1], I do this chaining over 20 times in various functions. Would you really want to write the Python code your way every time, or wouldn't you prefer a simpler syntax? People generally don't do it your way often because it has a higher mental burden than it does with the simple syntax in F# and other languages. I don't do it 20 times because of an obsession, but because it's natural.
[1] Line count is seriously inflated due to my habit of chaining across multiple lines as in my example above.
My example was just a way to do it with plain python and nothing special. There are libraries that use operator overloading to get more F#-style syntax.
For example: https://ryi.medium.com/flexible-piping-in-python-with-pipey-...
And another mentioned there: https://pypi.org/project/pipe/
I think we can just let this rest. These kinds of operations are not as ergonomic in python. That's pretty clear. No example provided is even remotely close to the simplicity of the F# example. Acquiesce.
You do realize this was in my original comment, right?
The fact is the language just works against you in this area if you have to jump through hoops to approximate a feature other languages just have. And I don't even mean extra syntax like F#'s pipe operators (although I do love them). Just swapping the arguments so you could chain the calls would look a lot better, if a little LISPy. It really is that bad.
Generators require a __next__ method, yield statement, or generator comprehension. What you've got is lambdas and a list comprehension. Rewriting using generators would look something like:
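(The original snippet isn't preserved here; below is a rough sketch of the idea, with made-up stage names.)

    # Each stage is lazy, but every intermediate step still gets its own name.
    raw = ["10", "oops", "3", "42"]

    digits  = (s for s in raw if s.isdigit())   # filter stage
    numbers = (int(s) for s in digits)          # map stage
    doubled = (n * 2 for n in numbers)          # another map stage

    print(list(doubled))  # [20, 6, 84]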
It's nicer in a way, certainly closer to the pipe syntax the commenter you're replying to is looking for, but kind of janky to have to name all the intermediate steps.
This is still backwards?
You're not using generator syntax anywhere in that example.
Don't Haskell and Python use the same argument order?
(Technically the Python version should be cast to a list to have identical behavior.)
Same with comprehensions (although nesting comprehensions will always get weird).
Check out Swift, too!
Why would I use swift when more cross-platform solutions exist?
Heck, even coming from Python (2) it felt very underwhelming and hugely oversold. (Edit: To be fair, I'd done a bit of OCaml years earlier, so algebraic data types weren't some huge revelation.)
Laziness is mostly an anti-pattern.
Sum types are finally coming to C#. That’ll make it the first “Mainstream” language to adopt them. Will it be as solid and simple as Haskell’s implementation? Of course not. Will having a backing ecosystem make up for that deficiency? Yes.
Python has sum types
optional_int: int | None = None
This is semantically not the same as a sum type (as understood in the sense of Rust, which is afaik the academically accepted way)!
Python's `A | B` is a union operation, but in Rust a sum type is always a disjoint union. In Python, if `A = B = None`, then `A | B` has one possible instance.
In Rust, this sum type has two possible instances. This might not sound like a big deal, but the semantics are quite different.
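Roughly, in Python terms (the type names below are made up for illustration): a union of two identical types collapses into one case, while tagging each case keeps them distinct even when they carry the same payload, or none at all.

    from dataclasses import dataclass

    # Tagged, disjoint cases: even with no payload there are two distinct
    # possible instances, which is what the Rust-style sum type gives you.
    @dataclass
    class NotStarted:
        pass

    @dataclass
    class Finished:
        pass

    Status = NotStarted | Finished  # two cases; a plain union of two identical types collapses to one

    def report(s: Status) -> str:
        match s:
            case NotStarted():
                return "not started"
            case Finished():
                return "finished"

    print(report(NotStarted()), report(Finished()))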
Sorry, I could not grok the difference, even after reading a few Rust examples.
def foo(x: int | None = None): ...
... just means the variable's default value is None in a function definition. But it could be either in an actual function call.
There's no difference there because the types are already disjoint.
Say you wanted to define some function taking `YYMMDD | MMDDYY`. If both YYMMDD and MMDDYY are just aliases to `str`, then you gain no information, you cannot discriminate on which one it is, since the union `str | str` just reduces to `str`.
Sum types are disjoint unions; you can't just say `str | str`, the terms are wrapped in unique nominal data constructors, like:
enum Date { MMDDYY(String), YYMMDD(String) }
Then when accepting a `Date` you can discriminate which format it's in. You could do the same in Python by defining two unique types and using `MMDDYY | YYMMDD`.
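A hedged Python sketch of that approach (the type names mirror the Rust example above; the date strings are made up):

    from dataclasses import dataclass

    # Two distinct nominal wrappers around the same payload type (str).
    @dataclass
    class MMDDYY:
        raw: str

    @dataclass
    class YYMMDD:
        raw: str

    Date = MMDDYY | YYMMDD

    def year(d: Date) -> str:
        # Unlike `str | str`, the format can now be discriminated on.
        match d:
            case MMDDYY(raw=r):
                return r[4:6]
            case YYMMDD(raw=r):
                return r[0:2]

    print(year(MMDDYY("123199")), year(YYMMDD("991231")))  # "99" "99"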
Every dynamically typed language effectively has one big Sum type that holds all of the other types. IMO this is one reason why dynamic languages have been so popular (because Sum types are incredibly useful, and mainstream statically typed languages have historically had very poor support for them).
I like this observation! It explains a lot.
What counts as mainstream for you?
Java has recently added sealed classes/interfaces which offer the same features as sum types, and I would argue that Java is definitely mainstream.
Kotlin has a similar feature. It might be used less than Java, but it's the default language for Android.
Swift has `enum` for sum types and is the default language for iOS and MacOS.
Likewise for Rust, which is gaining traction recently.
Typescript also has union/sum types and is gaining lot of traction.
Sealed classes still won't let you have e.g. String|Integer, though I'll grant you that java is certainly mainstream.
Scala 3 has had union types for 4 years now. Scala can be used to do Haskell style pure FP, but with much better tooling. And it has the power of the JVM, you can fall back to Java libraries if you want.
You don't really need `String|Integer`; for most use cases an isomorphic type that you can exhaustively pattern match on is more than enough, and sealed classes (along with the support in `switch` expressions) do exactly that.
For that matter, PASCAL has had variant records (i.e. sum types) since the 1970s.
Did it have an ergonomic way to exhaustively match on all the variants? Since the 70s?
How does the ABI work? If a library adds a new constructor, but I am still linking against the old version, I imagine that it could be reading the wrong fields, since the constructor it's reading is now at a different index?
Rust is mainstream, just not used in enterprise applications
AWS uses Rust extensively
Not really, other mainstream languages got there first.
Sounds a lot like my experience. I never really used Haskell for "real work", where I need support for high-performance numerical calculations, which is simply better in other languages (Python, Julia, C/C++, Fortran).
But learning functional programming through Haskell – mostly by following the "Learn you a Haskell" book and then spending time working through Project Euler exercises using it – had a quite formative effect on how I write code.
I even ended up baking some functional programming concepts into my Fortran code later. For instance, I implemented the ability to "map" functions on my data structures, and made heavy use of "pure functions" which are supported by the modern Fortran standard (the compiler then checks for side effects).
It's however hard to go all the way on functional programming in HPC contexts, although I wish there were better libraries available to enable this.
I think it is a shame Haskell has gained a reputation of being hard, because it can be an enriching learning experience. Lots of its complexity is accidental, and comes from the myriad of language extensions that have been created for research purposes.
There was an initiative to define a simpler subset of the language, which IMHO would have been great, but it didn't take off: https://www.simplehaskell.org. Ultimately, one can stick to Haskell 98 or Haskell 2010 plus some newer cherry-picked extensions.
I think Elm is a fantastic "simplified Haskell" with pretty good beginner-friendly guides. It's unfortunate that Elm is mostly tied to the frontend and has been effectively abandoned for the last couple of years.
Interestingly, Elm has inspired a host of "successors", including Gleam + Lustre, which look really great (I haven't had a chance to really try them yet).
What about Roc or Koka? Or simply moving to OCaml? It's looking pretty great after v5, with multicore and effect handlers.
OCaml is great but the type system is actually quite different from Haskell's once you get into it. It also has many "escape hatches" out of the functional pathway. Even if you approach it with a learner's discipline you'll run into them even in the standard lib.
With haskell you can look to the ecosystem to see how to accomplish specific things with a pure functional approach. When you look at ocaml projects in that way you often find people choosing not to.
Oh but the OCaml module system is the bees knees.
Yeah I didn't mean any of this as a negative lol. I haven't touched haskell since I learned ocaml. I still think haskell has the edge as an educational language for functional programming and type systems though, which is kind of what we're talking about but not entirely.
No worries, I was just adding my two cents.
Elm's strengths are its constraints, which allow for simple, readable code that's easy to test and reason about - partly because libraries are also guaranteed to work within those constraints.
I've tried and failed several times to write Haskell in an Elm style, even though the syntax is so similar. It's probably me (it's definitely me!), but I've found that as soon as you depend on a library or two outside of prelude their complexities bleed into your project and eventually force you into peppering that readable, simple code with lifts, lenses, transformations and hidden magic.
Not to mention the error messages and compile times make developing in Haskell a chore in comparison.
p.s. Elm has not been abandoned, it's very active and getting better every day. You just can't measure by updates to the (stable, but with a few old bugs) core. For a small, unpopular language there is so much work going into high quality libraries and development tools. Check out
https://elmcraft.org/lore/elm-core-development
for a discussion.
Elm is so nice to work in. Great error messages, near-instant compile times, and a great ecosystem of static analysis, scaffolding, scripting, and hot-reloading tools make the live development cycle super nice - it actually feels like what the Lispers always promised would happen if we embraced repl-driven development.
Thanks for the Elmcraft FAQ link. It's a great succinct explanation from the Elm leadership perspective (though tellingly not from the Elm leadership).
I feel like I understand that perspective, but I also don't think I'm wrong in claiming Elm has been effectively abandoned in a world where an FAQ like that needs to be written.
I'm not going to try to convince you though, enjoy Elm!!
I've often wondered if its reputation for being hard is accurate. Not necessarily because of syntax etc., but because if you don't already have a grounding in programming/engineering/comp sci, it can be difficult to fit the insights Haskell provides into any meaningful framework. That was my experience anyway; I came to it too early and didn't understand the significance.
Sounds a lot like the C++ experience.
In my time learning Haskell a decade ago, it was rare to find some code that wasn't using an experimental extension.
Pure functions are a crazy useful abstraction. Complex business logic? Extract it into a type-safe pure function. Still too "unsafe"? Testing pure functions is fast and simple. Unclear what a complex function does? Extract it into meaningful pure functions.
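A minimal Python sketch of that refactoring (the domain and names are made up): the business rule lives in a pure function, and the impure edges stay thin.

    from dataclasses import dataclass

    @dataclass
    class Order:
        subtotal: float
        country: str

    # Pure function: everything it needs comes in as arguments, so testing it
    # requires no setup, mocks, or database.
    def total_with_tax(subtotal: float, tax_rate: float) -> float:
        return round(subtotal * (1 + tax_rate), 2)

    # The impure edge (config lookup, I/O, persistence) just calls into it.
    TAX_RATES = {"DE": 0.19, "US": 0.07}

    def checkout(order: Order) -> float:
        rate = TAX_RATES.get(order.country, 0.0)
        return total_with_tax(order.subtotal, rate)

    # Fast, simple test of the complex bit:
    assert total_with_tax(100.0, 0.19) == 119.0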
Haskell sounds like a good language to hone your programming skills. What kind of projects is Haskell suited for to get started (besides Euler project)? I use Python primarily for scientific research (mostly numerical computation).
I think the tooling being not ideal is a reflection of how mature/serious the community is about non-academic usage. Haskell has been around for ages, but it never really escaped its academic nature. I actually studied in Utrecht in the nineties, when there was a lot of activity around this topic. Erik Meijer, who later created F# at MS, was a teacher there, and a lot of work was being done with Gofer, a Haskell predecessor, which I learned and used at the time. All our compiler courses were basically fiddling with compiler generator frameworks that came straight out of the graduate program. Awesome research group at the time.
My take on this is that this was all nice and interesting but a lot of this stuff was a bit academic. F# is probably the closest the community got to having a mature tooling and developer ecosystem.
I don't use Haskell myself and have no strong opinions on the topic. But usually a good community response to challenges like this is somebody stepping up and doing something about it. That starts with caring enough. If nobody cares, nothing happens.
Smalltalk kicked off a small tool revolution in the nineties with its refactoring browser. Smalltalk was famous for having its own IDE. That was no accident. Alan Kay, who was at Xerox PARC famously said that the best way to predict the future was to invent it. And of course he was (and is) very active in the Smalltalk community and its early development. Smalltalk was a language community that was from day one focused on having great tools. Lots of good stuff came out of that community at IBM (Visual Age, Eclipse) and later Jetbrains and other IDE makers.
Rust is a good recent example of a community that's very passionate about having good tools as well. Down to the firmware and operating system and everything up. In terms of IDE support they could do better perhaps. But I think there are ongoing efforts on making the compiler more suitable for IDE features (which overlap with compiler features). And of course Cargo has a good reputation. That's a community that cares.
I use Kotlin myself. Made by Jetbrains and heavily used in their IDEs and toolchains. It shows. This is a language made by tool specialists, which is why I love it. Not necessarily one for functional purists, even though some Scala users have reluctantly switched to it, and the rest are flirting with things like Haskell and Elixir.
I'd say it's more of a reflection of how having a very big company funding the language is making a difference.
People like to link Haskell's situation to its academic origins, but in reality, most of the issues with the ecosystem are related to acute underfunding compared to mainstream languages.
One doesn't happen without the other. Haskell is hugely influential with its ideas and impact. But commercially it never really took off. Stuff like that needs to come from within the community; it's never going to come from the outside.
Either the community is large enough for it, or it comes from the sponsoring company.
Few languages start off by being in the first situation. The first example that comes to my mind (Python), well... Tooling was a long and painful road. And if the language hadn't been used/backed by many prominent companies, I don't see how man-hours would have flowed into tooling.
Python is a good example. Guido van Rossum was an academic when he built python. And then later he got employed to work on Python because indeed a lot of people found his work useful. By the time that happened, python was already quite widely used though.
Also, timing-wise it's a good example, because Python emerged in the early nineties, around the same time the Haskell community started forming. Haskell had a few years' head start, actually.
The difference was that python became popular quite early in things like Linux distributions and even though Haskell was available and similarly easy to install in those, it never really caught on. Sponsored development usually happens as a result of people finding uses for a language, not before.
“Greece, Rome’s captive, took Rome captive.”
The languages of engineering-aligned communities may appear to have won the race, though they have been adopting significant ideas from Haskell and related languages in their victories.
Something went wrong in the adoption process.
Haskell's biggest benefit is functions, not methods. To define a function, you need to stop directly mutating, and instead rely on maps, folds, filters, etc. The bargain was: you give up your familiar and beloved for-loops, and in return you get software that will yield the same output given the same input.
So what happened with the adoption? The Java people willingly gave up the for-loops in favour of Streams/maps/filters. But they didn't take up the reward of software that yields the same output given the same input.
What's something else in the top-5 killer Haskell features? No nulls. The value proposition is: if you have a value, then you have a value, no wondering about whether it's "uninitialised". The penalty you pay for this is more verbosity when representing missing values (i.e. Maybe).
Again, the penalty (verbose missing values, i.e. Optional<>) was adopted, and the reward (confidently-present values) was not.
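Roughly the same trade-off expressed in Python terms (made-up example): the annotation is the verbosity, and a type checker such as mypy or pyright supplies the reward by refusing to let the missing case go unhandled.

    def parse_port(raw: str) -> int | None:
        # The "Maybe": the result is explicitly allowed to be absent.
        return int(raw) if raw.isdigit() else None

    def connect(raw: str) -> str:
        port = parse_port(raw)
        if port is None:                    # the missing case must be dealt with...
            return "no valid port"
        return f"connecting on {port}"      # ...and here `port` is confidently an int

    print(connect("8080"), connect("oops"))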
Ah, the joys of having a Scala `Option` type and still having to consider the cases of Some[thing], Nothing and... null!
Yes, well-written Scala code knows not to use null with reckless abandon, but since when do your coworkers coming from Java know to show restraint?
The type system is a big part and elements of that have shown up elsewhere. I’m with you on the belief that we should have better adoption for immutability, pure functions, and equational reasoning.
JavaScript promises can work analogously to the Maybe monad if you want them to.
Swift’s optionals are essentially the same thing as the Maybe monad.
Pretty sure F# was created by Don Syme, not Erik Meijer.
You are right. My mistake. They both worked for Microsoft though and Erik Meijer did work on things like Linq, which was an important part of the F# ecosystem. Also his work seems to have inspired Don Syme.
And I will, as strongly as possible, emphasize the opposite: you should not.
If you are already experienced in functional programming, as well as in statically typed functional programming or something lovely in the ML family of languages, then and only then does Haskell make sense to learn.
If you are looking to learn about either FP in general or statically typed FP, Haskell is about the single worst language anyone can start with. More people have been discouraged from using FP because they started with Haskell than is probably appreciated. The effort-to-insight ratio for Haskell is incredibly high.
You can learn the majority of the concepts faster in another language, with likely 1/10th the effort. For general FP, learn Clojure, Racket, or another Scheme. For statically typed FP, learn F#, Scala, OCaml, or even Elm.
In fact, if you really want to learn Haskell, it is faster to learn Elm and then Haskell than it is to just learn Haskell, because the amount of weeds you have to navigate through to get to the concepts in Haskell is so high that you can first learn the concepts and approach in a tiny language like Elm, and it will more than save the time it would take to understand those approaches by trying to learn Haskell directly. It seems unbelievable, but I found it to be very true. You can learn two languages faster than just one because of how muddy Haskell is.
Now, that said, FP is valuable and, in my opinion, a cleaner design, which is why in general our industry keeps shifting that way. Monoids, Functors, and Applicatives are nice design patterns. Pushing side effects to the edge of your code (which is enforced by types) is a great practice. Monads are way overhyped; thinking in types is way undervalued. But you can get all of these concepts without learning Haskell.
So that's the end of my rant as I've grown tired of watching people dismiss FP because they confuse the great concepts of FP with the horrible warts that come with Haskell.
Haskell is a great language, and I'm glad I learned it (and am in no way an expert at it), but it is the single worst language for an introduction to FP concepts. If you're already deep in FP, it's an awesome addition to your toolbox of concepts, and for that specific purpose I highly recommend it.
And finally, LYAH is a terrible resource.
Is it worth learning JavaScript before learning Elm?
Elm has stopped being updated. I naturally assumed it was quietly abandoned. https://elm-lang.org/news
I'm no front-end expert, but I have a working knowledge of HTML & JS. I feel like it would still be OK without any JS background, but I could be wrong on that.
That said, the language is small enough that you can go through the tutorial in a weekend. You'll know pretty quickly whether you feel like you're picking it up or it feels too foreign.
My gut feel is general programming experience is enough, but don't hold me to that one.
Could you elaborate? I know LYAH doesn't teach enough to write real programs, and does not introduce necessary concepts such as monad transformers, but why is it so terrible as an introduction to Haskell and FP? (In my mind, incomplete/flawed != terrible... Terrible means "avoid at all costs").
As for your overall point, I remember articles posted here on HN about someone teaching Haskell to children (no prior exposure to any other prog lang) with great success.
(not gp, and from memory a while ago, so please forgive the lack of exact quotes & page numbers)
Bunch of places where the tone masked or downplayed real issues in ways that made other text suspect. As a concrete example, `head [] -> Exception` with something like "of course it errors if you take the first part of something that's not there" and `take 1 [] -> []` with "obviously taking one thing from an empty list gets you an empty list" -- uh, no. Maybe it's a historical wart, maybe there are good technical reasons, but different behavior in these cases is definitely not obvious!
I hear this a lot, but am curious about two things: (a) which bit(s) of the toolchain are you thinking about specifically -- I know HLS can be quite janky but I haven't really been blocked by any tooling problems myself; (b) have you done much Haskell in production recently -- i.e. is this scar tissue from some time ago, or have you tried the toolchain recently and still found it to be lacking?
Every time I use cabal and/or stack, it gives me a wall of errors and I just reinstall everything all the time.
If you share a transcript from a cabal session I'll look into this for you.
And if you share stack transcripts I’ll look into those for you.
I’ve experienced this too, the tools can certainly be improved, but also a little more understanding of what they do and how to interpret their error messages could help you (I am guessing).
Would you mind explaining what you mean by stateless?
Haskell functions are pure, like mathematical functions: the same input to a function produces the same output every time, regardless of the state of the application. That means the function cannot read or write any data that is not passed directly to it as an argument. So the program is "stateless" in that the behavior does not depend on anything other than its inputs.
This is valuable because you as the developer have a lot less stuff to think about when you're trying to reason about your program's behavior.
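A minimal illustration of the contrast in Python (not Haskell syntax, and the example is made up, but the idea carries over):

    # Pure: the output depends only on the arguments passed in.
    def add_item(cart: tuple[str, ...], item: str) -> tuple[str, ...]:
        return cart + (item,)

    # Impure: the result depends on hidden state that changes between calls.
    _cart: list[str] = []

    def add_item_impure(item: str) -> list[str]:
        _cart.append(item)
        return list(_cart)  # return a copy so the two calls can be compared

    # Same input, same output for the pure version:
    assert add_item((), "apple") == add_item((), "apple")

    # Same input, different output for the impure one:
    assert add_item_impure("apple") != add_item_impure("apple")  # ["apple"] vs ["apple", "apple"]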
It's good that the question about 'stateless' was raised, because these are two different things. Working with pure functions does indeed have the above benefits, but a dumb web node deferring its behaviour to a stateful database is not stateless in that sense, and so does not have the above benefits.
Exactly the same for me.
Stack and Stackage (one of the package managers and library distribution systems in Haskell-land) are the best I've found in any language.
Other than that I also found some tools to be lacking.
What makes you say that stack is the best you found in any language? I use it daily, and in my experience I'd put it just a bit above PHP's composer
In other package managers there is no guarantee that the libs work together. Stackage tests the whole ecosystem. You use Stackage X.Y (not libA X.Y + libB X.Y, etc.).
In all other ecosystems there are some libs that do not work together well, and usually you find out the hard way.