
Calculus with Julia

dfee
75 replies
21h2m

I skipped around in the book a bit and found it interesting. I’d consider encouraging my kids to learn calculus this way.

However, I am curious about the first paragraph of the preface:

Julia is an open-source programming language with an easy to learn syntax that is well suited for this task.

Why is Julia better suited than any other language?

staplung
43 replies
20h13m

Julia has a bunch of little niceties for mathematics:

* Prefixing a variable with a scalar will implicitly multiply, as in standard math notation (e.g. 3x would evaluate to 6 if x is 2)

* Lots of unicode support. Some feel almost gratuitous (∈ and ∉ are operators that work as you would expect) but it's pretty nice to have π predefined (it's even an irrational datatype) and to use √ as an operator (e.g. √2 is a valid expression and evaluates to a float). It's not just that Julia supports these constructions but that it provides a convenient way to enter them.

* This is a little less relevant for calculus but vectors and matrices are first-class types in Julia. Entering and visually parsing matrices is so much easier in Julia than in Python.

  m = [1 2 3;
       4 5 6;
       7 8 9]
vs.

  m = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
Transposition is a single-character operator ('). The dot product can be done with the dot operator (m ⋅ n), and A\b works as it does in MATLAB.

* Julia supports broadcasting. It also has comprehensions, but with broadcasting I personally find much less need for them.

* Rationals are built in with very simple syntax (1//2). A short snippet illustrating several of these niceties follows below.
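
A quick REPL sketch of several of those niceties together (output values indicative; exact float printing may differ by version):

    using LinearAlgebra        # for the ⋅ (dot) operator

    x = 2
    3x                         # 6, implicit multiplication by a numeric literal
    √2                         # 1.4142135623730951
    π                          # π = 3.1415926535897... (an Irrational constant)
    2 ∈ [1, 2, 3]              # true

    A = [1 2; 3 4]
    A'                         # transpose (adjoint) via a single quote
    [1, 2] ⋅ [3, 4]            # 11, dot product
    A \ [1, 2]                 # solve A * v = b, as in MATLAB

    sin.([0, π/2, π])          # broadcasting: apply sin elementwise
    1//2 + 1//3                # 5//6, exact rational arithmetic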

RivieraKid
25 replies
19h45m

I quite like and use Julia but wish there was a language mixing the best aspects of Julia and Swift (which I think can be done without many compromises, i.e. it would be a better language overall).

Some things I don't like about Julia:

- array.mean().round(digits=2) is more readable than round(mean(array), digits=2)

- Poor support for OOP (no, pure FP is not the optimal programming approach).

abdullahkhalids
15 replies
16h34m

round(mean(array), digits=2)

is how mathematicians have thought for hundreds of years now. They write it this way because it's indeed easier to parse, and philosophically it correctly represents what is happening mathematically.

array.mean().round(digits=2)

is how a subset of software engineers have thought about computation in the last couple of decades.

Technology should whenever possible adopt already existing conventions, because technology is created to serve the user, and not vice versa.

adastra22
14 replies
16h21m

is how mathematicians have thought for hundreds of years now.

"Because tradition" is not a good argument. We can and should strive for better.

The modern OOP-derived style of `array.mean().round(digits=2)` is just as exact in its specification, and arguably more semantically precise as well, since it organizes the meaningful bits to sit together.

Also I think you don't account for how much mathematical notation has changed over the years. The `round(mean(array), digits=2)` structure is relatively recent, early 20th century I believe, and the result of active reforms.

abdullahkhalids
8 replies
14h4m

mathematical notation has changed over the years.

I agree. Math notation has changed in response to problems mathematicians have faced, and as their thinking has evolved. It should not change because tool builders for mathematicians - when the tool builders in most cases are not even professional mathematicians - decide some other notation is better.

array.mean().round(digits=2)

might be good (it is not) when you are operating on one object. As soon as you have a binary operation (say multiplication), this notation completely breaks down, because the operation is often symmetric, or "near" symmetric, and the operands are not so different as to justify writing something as ridiculous as x.product(y).

But more importantly, (common/dominant) mathematics is functional in nature. It's decidedly not object oriented. Given a matrix, M, doing something like M.eigenvalues() is vomit-inducing because eigenvalues is not a property of a matrix M. For matrices, it's a map from a set of linear transformations (of which M is merely one item) onto a set of numbers. It is its own thing, separate from M.

adastra22
4 replies
13h56m

Now you're gatekeeping on who is a real mathematician?

I don't think this subthread is going to be productive and I'm bowing out.

nightlyherb
3 replies
12h35m

I'm not the GP, but I'm sorry you felt gatekept. I don't think GP conveyed "real mathematicians should do FP", but rather something else, so I'll try my best to share what I understood from that comment. Hope this makes you feel better.

Even though the notation changed over the years, the paradigm of "numbers as immutable entities and pure functions" has been the dominant way that math is presented, compared to something like "encapsulated objects that interact with each other via sending messages".

I don't think this has to be this way, and I do think that "real math" can also be laid out assuming principles of OOP. However, I do suspect it's the way it is because the laws of nature are unchanging, in contrast to the logic of a business application.

Because Julia is a tool with the target audience of mathematicians and scientists, I think it's a sensible decision to embrace the usual way of thinking, as opposed to presenting a relatively different way of thinking which steepens the learning curve. Not because data and functions are fundamentally better than OOP, but because it's more pragmatic for the target audience.

adastra22
2 replies
11h21m

I understood the other poster correctly. This view of mathematics is as limiting and ignorant as the other poster’s.

The entire field of theoretical computer science, to which functional programming and type theory are closely tied, is a branch of mathematics. The Church-Turing thesis, which gives birth to our field, equates the two at a very fundamental level. Questions about type theory, programming language design, and programmer ergonomics are fundamentally about math and applied math.

Maybe you and the other poster have in mind specific fields of math, but then you need to make claims for why those fields are sufficiently different as to be exempt from applicability of any of the advances in notation observed in other fields.

Your implicit assumption that you can divide computer science into a different bucket from “real math” is incorrect, and gatekeeping.

As I said though, I don’t think this is a profitable debate to have here on HN.

nightlyherb
0 replies
10h38m

Okay, I see what you mean.

Maybe you and the other poster have in mind specific fields of math,

Yes, because this is about Julia, I assumed we are talking about the specific fields of math that happen to be commonly used in mathematical and scientific computing, such as the ones learned in university math and science courses.

Your implicit assumption that you can divide computer science into a different bucket from “real math” is incorrect, and gatekeeping.

It is regrettable that we had a miscommunication. I agree with you that computer science belongs in the same bucket as "real math". The thing is, in the context of Julia it is not easy to read "math" as meaning not only the domains that Julia is concerned with, but really all fields of math, including theoretical computer science. At least thanks to your comment, I see what you mean more clearly, and I think it'll help some other potential readers as well.

but then you need to make claims for why those fields are sufficiently different as to be exempt from applicability of any of the advances in notation observed in other fields.

I'm curious to know specifically which advances in notation observed in other fields you mean. By this, do you mean dot notation for method application? I'm unsure if `a.method(b).anotherMethod(c)` is more advanced than `a |> method(b) |> anotherMethod(c)` (Edit: or `(anotherMethod(c) . method(b))(a)`) notation-wise.

nightlyherb
0 replies
9h12m

I'd like to add another point worth mentioning. When I read the other poster's post, I was thinking about the semantics of OOP, specifically encapsulated objects and messages. I may have misinterpreted the other poster, but I hope it at least makes my other messages a bit clearer. (I think it's a little unwieldy to use matrices that encapsulate their state for linear algebra, for example.)

bee_rider
2 replies
12h25m

In programming it is typical to have an "eig" function, or something of the sort, which returns the eigenvalues of a matrix. eig(M) seems no less vomit-inducing than M.eigenvalues(). Usually when mathematicians want eigenvalues they write something like "Given a matrix M with eigenvalues \lambda" and the eigenvalues just show up.

samatman
1 replies
4h37m

In Julia:

    M = [1 2 3; 4 5 6; 7 8 9]
    λ = eigvals(M)
Which is much the same.

bee_rider
0 replies
2h49m

Sure, my point is that even the math-y programming languages and libraries don’t map very well to how mathematicians seem to describe eigenvalues.

Personally I think of them as a characteristic of the matrix (but I just write numerical software, I’m not a mathematician) and implement them as the result of a process. So I don’t really see any reason to favor either syntax or call one more mathematical.

FabHK
3 replies
15h19m

I used to miss "OOP" in Julia. The funny thing about the dot notation is that it makes the limitations of single dispatch seem natural. Once you get used to multiple dispatch and its benefits, the dot notation seems limiting.

`x.plus(y)` dispatches based on the (runtime) type of x (polymorphism, i.e. single dispatch), and might call different functions based on the (compile-time) type of y (function overloading).

In Julia, it would be `plus(x, y)`, and it can dispatch based on the runtime type of both x and y (or determine the correct method to call at compile time already, if the types of x and y can be determined then).
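
A tiny sketch of that dispatch-on-both-arguments idea (the types here are made up purely to show the mechanics):

    struct DenseThing end       # placeholder types, purely illustrative
    struct SparseThing end

    plus(x::DenseThing,  y::DenseThing)  = "dense + dense method"
    plus(x::DenseThing,  y::SparseThing) = "dense + sparse method"
    plus(x::SparseThing, y::DenseThing)  = plus(y, x)               # reuse the mixed method
    plus(x::SparseThing, y::SparseThing) = "sparse + sparse method"

    plus(DenseThing(), SparseThing())    # method chosen from the types of both arguments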

adastra22
2 replies
11h17m

Virtual method dispatch isn’t always a good thing though. It’s a tradeoff and typically you really want one or the other. C++ makes both available for that reason, although C++’s syntax is horrible.

jakobnissen
1 replies
10h0m

It's multiple dispatch though, not virtual dispatch. If I understand correctly, virtual dispatch means the method is picked at runtime. In Julia, the compiler is able to pick the correct method at compile time, even when dispatching on all arguments of the function.

samatman
0 replies
4h33m

This is correct, but it's important to realize that compile time continues during runtime.

Julia's execution model is quite distinctive, and it does take a thorough understanding of how it plays out to get native-code speeds out of it.

nightlyherb
0 replies
13h19m

As a person who respects many programming paradigms including OOP and FP, I beg to disagree. Specifically, I'm assuming you're talking about OOP the paradigm, not only the dot notation, for which the pipe operator is a viable alternative.

The modern OOP-derived style of `array.mean().round(digits=2)` is just as exact in its specification, and arguably more semantically precise as well, since it organizes the meaningful bits to sit together.

I would have agreed with you if I were developing business applications. However, for the domain of math and science, every piece of material that I have seen assumes the paradigm of numbers as immutable entities and functions, not the paradigm of mutable encapsulated objects and interacting messages. To a mathematician or scientist the semantics of a `Vector` (as data) and a `mean` function that takes the `Vector` as input should be more natural than a black-box `Vector` object that can receive a `mean` method.

abisen
2 replies
19h36m

If I am not using named arguments I find myself using the pipe operator a lot. I also find it more readable.

array |> mean |> round

For scenarios with named arguments there is a slightly less clean workaround:

array |> mean |> x->round(x, digits=2)

andyferris
1 replies
18h44m

array |> mean |> round(; digits=2) should also work and is a bit more concise.

SatvikBeri
0 replies
15h18m

This doesn't work, at least for me on 1.10. `round(; digits=2)` would have to produce a function. But Julia doesn't have automatic currying, so that only happens if someone has implemented a special-case version (e.g. (==)(3) returns a partial function that tests for equality with 3.)

staplung
1 replies
19h22m

Well, Julia also has a pipeline operator built in so:

  array |> mean |> x->round(x; digits=2)
will do what you want and is arguably more readable than dot notation.

Not supporting OOP is very much intentional. It has structs and multiple dispatch (which has proven to be insanely useful in Julia's domain of scientific computing) so about the only thing you lose is the namespacing that dot syntax facilitates.

brabel
0 replies
7h20m

That's not more readable, you're just switching a few characters.

Any language that has Uniform Function Call Syntax (UFCS) [1] lets you replace wrapped function calls with linear calls like that. It's clearly easier to read, and much easier to write, especially when you consider programmers' tooling (in case it's not obvious why: you can type the name of a symbol and a '.' and the IDE shows you the available functions; this also works for free functions that take the symbol's type as the first argument). That could work for the pipe operator as well, of course, which is why I said the two things are basically equivalent, just with slightly different notation.

[1] https://en.wikipedia.org/wiki/Uniform_Function_Call_Syntax

ssivark
1 replies
14h53m

Poor support for OOP (no, pure FP is not the optimal programming approach).

Julia is not really a pure functional programming language, though it does support that style if one desires.

It's actually a "multiple dispatch" language... thinking of OOP as single dispatch, Julia is conceptually OOP on steroids -- but to benefit from the full expressiveness, the programming style looks different from the typical OOP syntax/patterns one is used to.

waveBidder
0 replies
13h4m

Having spent my PhD writing in Julia, coming back to object-oriented programming felt like programming with a hand tied behind my back. Many operations just make more sense dispatching on multiple arguments rather than just one, and cramming methods into a class felt unnatural.

patagurbon
0 replies
19h38m

You can overload your own types to get a lot of those things (the OOP part is harder, but you can sort of emulate it with traits). Overloading `getproperty` handles the former (dot-syntax) case; a rough sketch follows below.

Of course it's not built in, which I understand is annoying if that's your preferred coding style. I personally am sad that really good traits aren't encoded in the language.
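
For what it's worth, a rough sketch of the `getproperty` trick for the dot-syntax case (a hypothetical `Stats` type, not a recommended pattern):

    struct Stats
        data::Vector{Float64}
    end

    # Route `s.mean` to a callable; fall back to the real fields otherwise.
    function Base.getproperty(s::Stats, name::Symbol)
        name === :mean && return () -> sum(getfield(s, :data)) / length(getfield(s, :data))
        return getfield(s, name)
    end

    s = Stats([1.0, 2.0, 3.0])
    s.mean()    # 2.0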

3abiton
13 replies
18h59m

But why would it be better than Mathematica, for example, for calculus?

moelf
11 replies
18h40m

it's very hard to argue anything is "better" than Wolfram/Mathematica when it comes to symbolic stuff; after all, most theoretical physicists use Mathematica for their professional work.

But for (entry-level) learning and possibly pivoting to applications, Julia is delightful to use and can transition into some symbolic and numerical work. Besides, it's free and open source.

llm_trw
10 replies
18h23m

most theoretical physicists use Mathematica for their professional work.

They do not.

fiforpg
9 replies
18h9m

Surely, many do? The ones I interacted with did, for example. I'd venture that the preferred CAS varies with schools and countries.

llm_trw
8 replies
17h44m

Some do, most don't.

Unless you're working on undergraduate integrals, it's pretty useless for advanced stuff without writing your own library. By that point you may as well write it in a language that's more performant and cheaper.

abdullahkhalids
6 replies
16h38m

I agree that only some physicists use Mathematica. But I haven't really seen it used for calculus. Maybe some differential equations.

But mostly for symbolic algebraic manipulation. I used it during my PhD to work with groups. Instead of having to calculate stuff by hand, you can just ask Mathematica to do it. Also, lots of stuff with tensors in GR is so easy to do in Mathematica.

adgjlsfhk1
5 replies
16h5m

I think it's fair to say that most math/physics people use mathematica from time to time, but largely for different things than they use other programming languages for. It's very good as a CAS, but it's a pretty bad programming language for things that don't have analytical solutions.

llm_trw
4 replies
15h44m

It isn't. Mathematica is very much a niche product in academia.

The people who would get most out of it are students, but for some god forsaken reason universities don't support them.

I was in a pilot class with Mathematica back in 2006, and the reviews of the class were _all_ 5 stars; students on average got 10% higher marks in all other subjects they took that year.

They didn't run the course again.

Sagemath is now equally good if a teacher defines a DSL for the students to use in a class.

bee_rider
3 replies
12h20m

Sagemath seems much more appropriate; it is better not to bind students to proprietary tools.

ykonstant
2 replies
10h0m

The only "problem" with sagemath is that it is based on Python. The rationale is that Python is easy to start using and widely known. This is the usual "make it easy for newcomers" trap.

For the mathematical constructs we care about in symbolic programming, I have found Python's syntax and Sage's menagerie of objects awful to use. Initially you feel comfortable, but when you want to do some real work, it gets horribly in the way. The Wolfram language, a LISP variant, is less familiar and harder for a newbie to learn but it is vastly superior for actual work.

llm_trw
1 replies
8h22m

Wolfram is not a lisp, even under the hood it is its own thing.

ykonstant
0 replies
7h0m

Thanks, let me correct myself:

The only "problem" with sagemath is that it is based on Python. The rationale is that Python is easy to start using and widely known. This is the usual "make it easy for newcomers" trap.

For the mathematical constructs we care about in symbolic programming, I have found Python's syntax and Sage's menagerie of objects awful to use. Initially you feel comfortable, but when you want to do some real work, it gets horribly in the way. The Wolfram language, not a LISP variant, is less familiar and harder for a newbie to learn but it is vastly superior for actual work.

ssivark
0 replies
14h47m

Unless you're working on undergraduate integrals, it's pretty useless for advanced stuff without writing your own library. By that point you may as well write it in a language that's more performant and cheaper.

Hard disagree. Mathematica's symbolic dexterity makes abstract reasoning (with equations/expressions) very easy. Think of it as the companion tool for anyone doing pages of algebra that would go into a paper, or form the backend for some code.

The numerical capabilities of Mathematica are passable but nothing fancy. Once you have the math figured out, you might even want to reimplement "in a language that's more performant and cheaper." But I haven't seen anything come close to Mathematica for convenience of symbolic reasoning -- not just as a technology (lisp is pretty good) but as a ready-for-use product.

adastra22
0 replies
16h19m

TFA covers that question:

There are many examples of integrating a computer algebra system (such as Mathematica, Maple, or Sage) into the calculus conversation. Computer algebra systems can be magical. The popular WolframAlpha website calls the full power of Mathematica while allowing an informal syntax that is flexible enough to be used as a backend for Apple’s Siri feature. (“Siri what is the graph of x squared minus 4?”) For learning purposes, computer algebra systems model very well the algebraic/symbolic treatment of the material while providing means to illustrate the numeric aspects. These notes are a bit different in that Julia is primarily used for the numeric style of computing and the algebraic/symbolic treatment is added on. Doing the symbolic treatment by hand can be very beneficial while learning, and computer algebra systems make those exercises seem kind of redundant, as the finished product can be produced much easier.

TL;DR: they want the student to actually do some of the work that Mathematica magics away.

eggy
1 replies
6h43m

I like Julia and Python, however when I'm playing with doing math, or using paper, I like APL or J.

The above example in J:

   3 3 $ i.9
0 1 2
3 4 5
6 7 8

   3 3 $ >: i.9
1 2 3
4 5 6
7 8 9

J is 0-indexed. ">:" is the increment operator. APL allows you to set a parameter to do 1-indexed vs. 0-indexed.

APL:

   3 3 ρ ι9
1 2 3
4 5 6
7 8 9

There is a Calculus text for learning it with J:

https://www.jsoftware.com/books/pdf/calculus.pdf

garbthetill
0 replies
4h50m

Interesting, I was thinking of sharpening my maths skills by programming. Have you tried something like Lean, Agda, etc.? If so, how do they fare against APL or J for doing maths?

I honestly thought APL was extinct.

llm_trw
0 replies
18h26m

    m = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]

constantcrying
22 replies
20h24m

Why is Julia better suited than any other language?

Because it is a language specifically targeted for doing numerical analysis.

Python, which is its main "competitor", has notoriously poor syntax for mathematical operations, minuscule standard library support for mathematics, a very limited type system and abysmal runtime performance. Julia addresses all of these issues and gives a relatively simple-to-read language, which often closely resembles mathematical notation and has quite decent performance.

mgaunard
21 replies
20h9m

"its" not "it's"

numpy has similar syntax to MATLAB and Julia

Python's type system is pretty rich and allows overloading all operators

patagurbon
14 replies
19h42m

I would argue that NumPy's default element-wise operations and 0 based indexing can lead to a lot of cognitive overhead vs. the "more math-y" notation of MATLAB or Julia, where overloaded operators like block matrix notation or linear solves are much closer to the written math.

The type system in Julia is perhaps even better than MATLAB for some linear algebra operations (a Vector is always a Vector, a scalar is always a scalar). But the Python ecosystem is often far superior to both, and MATLAB has toolboxes which don't have equivalents in either other language. Julia has some uniquely strong package ecosystems (autodiff to some extent, differential equations, optimization), but far fewer.

daynthelife
6 replies
18h2m

I disagree on the zero-based indexing complaint. Indeed, the fact that Julia indexes from 1 is the sole reason I will never use an otherwise great language. I can’t comprehend how people came to the conclusion this was a good idea.

harshreality
1 replies
5h21m

Julia, R, Mathematica, Matlab, Fortran, Smalltalk, Lua...

Julia being a mostly functional, non-systems-programming language, where memory addressing is usually in the background, I think you'd find that you rarely use numeric-literal indexing and that you were getting upset over nothing.

The typical looping over i from 1 to length(a) is not idiomatic in Julia, where looping over eachindex() or pairs() abstracts away any concern over index type (OffsetArray can have an arbitrary first index). There's firstindex() and lastindex(); bracket indexing syntax has begin and end special keywords. Multidimensional arrays have axes().
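
A small sketch of that index-generic style (nothing version-specific intended; `mysum` is just an illustrative name):

    function mysum(a)
        s = zero(eltype(a))
        for i in eachindex(a)      # correct for 1-based arrays, OffsetArrays, views, ...
            s += a[i]
        end
        return s
    end

    a = [10, 20, 30]
    a[begin], a[end]               # (10, 30), with no literal 1 or length(a) in sight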

"But what about modular arithmetic to calculate 1-based indexes?"

Julia has mod1(), fld1(), etc., which also provide a hint to anyone reading the code that it might be computing things to be used as index-like objects (rather than pure arithmetic or offset computation).

The strict way to handle indexing, I suppose, would be to have distinct 1-indexed (e.g. Ind1) and 0-indexed (e.g. Ind0) Int-based types, and refuse to accept regular Ints as indexes. That would eliminate ambiguity, at the cost of Int-to-Ind[01] conversions all over the place (if one insists on using lots of numeric literals as indexes). And how would one abstract away modular arithmetic differences between the Ind0 and Ind1 types? By redefining regular mod and div for Ind1? That would be hella confusing, so that abstraction probably isn't viable.

JanisErdmanis
0 replies
46m

The strict way to handle indexing, I suppose, would be to have distinct 1-indexed (e.g. Ind1) and 0-indexed (e.g. Ind0) Int-based types, and refuse to accept regular Ints as indexes.

I would be happy if indexing with an unsigned integer (UInt) started from 0, whereas indexing with a signed integer started from 1. Also, a module-level parameter expressing a preference for whether a literal integer should be interpreted as Int or UInt would alleviate many usability issues there.

dmz73
1 replies
16h10m

It seems that a lot of programmers have trouble distinguishing between indexing and offsets. This is probably due to the legacy of the C language, which conflates the two, possibly for performance and simplicity reasons. Same goes for case sensitivity in programming languages, again a legacy of C... are Print, print and PRINT really different functions, is that really what you intended???

Anyway, indexing is enumerating elements, and it naturally starts at 1 for the first element and goes to Count for the last. Offsets are a pointer-arithmetic artifact where the first element is at offset 0 for most, but not all, data structures.

Then you get "modern" languages like Python which adopt indexing from 0 (no offsets, since there are no pointers) and ranges where the start is inclusive and the end is exclusive... very natural behavior there. Oh, and let's make it case sensitive so you never know if it's Print, print, PRINT... or count, Count and COUNT until you run for 20 min and then crash.

gugagore
0 replies
5h27m

I'm confused that you're bringing case sensitivity into the discussion. For simplicity, early systems could not be case sensitive because they didn't have a large enough character set.

The comment about crashing due to spelling error really should be addressed by some form of static analysis, not by making name lookups more relaxed, in my opinion.

constantcrying
0 replies
10h32m

I can’t comprehend how people came to the conclusion this was a good idea.

If you look at a textbook for numerical analysis, the algorithms will likely be 1-indexed. Julia is a language primarily for writing down mathematics; it would be quite silly not to use mathematical conventions.

FabHK
0 replies
15h12m

You think mathematicians had it all wrong for centuries?

Incidentally, I find it amusing that in Europe, the ground floor is zero, while in the US, it's one. (A friend of mine arrived at college in the US and was told that her room was on the first floor. She asked whether there was a lift as her suitcase was quite heavy...)

staunton
5 replies
18h52m

the Python ecosystem is often far superior to both

What would you say the Python ecosystem has that the others are missing? The obvious thing is pytorch/tensorflow/.... I can't think of much else though.

constantcrying
2 replies
10h34m

The obvious thing is pytorch/tensorflow/.... I can't think of much else though.

Julia has Flux, which is great. To create a neural network with Flux, you write the forward operation however you want, auto-differentiate the whole thing, and get a trainable net.

It is extremely flexible and intuitive.
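
A very rough sketch of that workflow (assuming a recent Flux release; `Chain`, `Dense`, and `Flux.Losses.mse` are standard Flux names, but APIs shift between versions):

    using Flux

    model = Chain(Dense(2 => 16, relu), Dense(16 => 1))    # an arbitrary forward pass
    x, y = rand(Float32, 2, 32), rand(Float32, 1, 32)

    # Differentiate the whole forward computation with respect to the model's parameters
    grads = Flux.gradient(m -> Flux.Losses.mse(m(x), y), model)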

What Julia lacks is everything which isn't numerical analysis though.

staunton
1 replies
6h10m

I agree that Flux.jl is very cool but you can't use it to train a SOTA LLM, can you? The Julia ecosystem does not have the resources to even attempt building a viable alternative to Pytorch.

I'm still curious what concrete things you mean that Julia is missing. Maybe I'm so stuck doing numerical analysis that I can't think of anything else...

constantcrying
0 replies
4h33m

I'm still curious what concrete things you mean that Julia is missing.

Basically everything not related to numerical analysis. The problem isn't that there isn't an option which does work, but that those options aren't developed and mature enough to actually use for a commercial project.

E.g. look at GUI libraries: the native ones are just wrappers around other libraries and poorly documented. There is a single web framework, but is it backed well enough that you would bet your entire company on it being continually supported?

If you are doing anything which isn't related to numerical analysis, Julia isn't your first choice, it also isn't your second or third choice. And I think that is fine, clearly Julia wants to be a good at a particular thing and has achieved that.

vegesm
1 replies
4h21m

PyTorch and TensorFlow are pretty big. It means all state-of-the-art research code is not written in Julia, so it is a non-starter for any neural-network-based project.

constantcrying
0 replies
38m

That is just total nonsense. You can of course do neural network research without PyTorch and TensorFlow. Julia is especially good here, with Flux being the most flexible neural network library which exists.

PyTorch and TensorFlow are important and a very good reason to use Python or C++, but those two being unavailable matters more for industry; of course your research might need them, but it might very well not.

hughesjj
0 replies
18h56m

0 based indexing

I mean, zero based indexing isn't exclusive to computing. Plenty of times in physics or math we do x_0 for the start of a sequence vs x_1.

Although, fair, our textbook had the definition of a sequence being a mapping of elements to Z+, which excludes zero.

constantcrying
4 replies
19h58m

numpy has similar syntax to MATLAB and Julia

Numpy has absolutely awful syntax compared to Julia or MATLAB. It is fundamentally limited by Python's syntax, which was never meant to support the rich operations you can do in Julia.

I have done quite a bit in both languages and it is extremely clear which language was designed for numerical analysis and which wasn't.

Python's type system is pretty rich and allows overloading all operators

The language barely has support for type annotations. Duck typing is actually a very bad thing for doing numerical analysis, since you actually care a lot about the data types you are operating on.

Python's type system is so utterly inadequate that you need an external library to have things like proper float types.

mgaunard
3 replies
11h48m

You're just repeating that it's bad without any argument.

Numpy's syntax is the same as MATLAB's.

Both MATLAB and Julia also use dynamic typing, though it is true Julia does have support for ad-hoc static typing through annotations in a way that is stricter than Python's.

Numpy does provide all the traditional floating-point types.

constantcrying
2 replies
10h37m

Numpy's syntax is the same as MATLAB's.

That is just plain false. Can you try reading the numpy documentation? They have an article about exactly that topic and even say that numpy's syntax is more verbose.

If you read that documentation, you would also realize that numpy has a different approach to arrays than both MATLAB and Julia.

Here: https://numpy.org/doc/stable/user/numpy-for-matlab-users.htm...

"MATLAB’s scripting language was created for linear algebra so the syntax for some array manipulations is more compact than NumPy’s."

mgaunard
1 replies
8h53m

The only difference is that MATLAB has more operators like matrix multiplication in addition to element-wise multiply, but that applies to Julia too.

constantcrying
0 replies
7h15m

No, that is not the only difference. I literally linked you an article in the numpy documentation which goes over dozens of differences, large and small. I encourage you to read the section about the different numpy types; those concepts don't exist in MATLAB, where everything is an n-dimensional array and matrices as such don't exist as a special object.

There are also many differences between Julia and MATLAB, although Julia obviously borrowed a lot from MATLAB, for the obvious reason that MATLAB had a similar design goal in its language.

That doesn't change the fact that Python can't overcome not having been designed for numerical analysis, while both MATLAB and Julia were (although those two also had somewhat different design objectives themselves, especially when it comes to the role of functions).

samatman
0 replies
4h22m

All operators, you say? Which of the following:

    ← → ↔ ↚ ↛ ↞ ↠ ↢ ↣ ↦ ↤ ↮ ⇎ ⇍ ⇏ ⇐ ⇒ ⇔ ⇴ ⇶ ⇷ ⇸ ⇹ ⇺ ⇻ ⇼ ⇽ ⇾ ⇿ ⟵ ⟶ ⟷ ⟹ ⟺ ⟻ ⟼ ⟽ ⟾ ⟿ ⤀ ⤁ ⤂ ⤃ ⤄ ⤅ ⤆ ⤇ ⤌ ⤍ ⤎ ⤏ ⤐ ⤑ ⤔ ⤕ ⤖ ⤗ ⤘ ⤝ ⤞ ⤟ ⤠ ⥄ ⥅ ⥆ ⥇ ⥈ ⥊ ⥋ ⥎ ⥐ ⥒ ⥓ ⥖ ⥗ ⥚ ⥛ ⥞ ⥟ ⥢ ⥤ ⥦ ⥧ ⥨ ⥩ ⥪ ⥫ ⥬ ⥭ ⥰ ⧴ ⬱ ⬰ ⬲ ⬳ ⬴ ⬵ ⬶ ⬷ ⬸ ⬹ ⬺ ⬻ ⬼ ⬽ ⬾ ⬿ ⭀ ⭁ ⭂ ⭃ ⥷ ⭄ ⥺ ⭇ ⭈ ⭉ ⭊ ⭋ ⭌ ← → ⇜ ⇝ ↜ ↝ ↩ ↪ ↫ ↬ ↼ ↽ ⇀ ⇁ ⇄ ⇆ ⇇ ⇉ ⇋ ⇌ ⇚ ⇛ ⇠ ⇢ ↷ ↶ ↺ ↻ --> <-- <-->
How about these ones:

    > < >= ≥ <= ≤ == === ≡ != ≠ !== ≢ ∈ ∉ ∋ ∌ ⊆ ⊈ ⊂ ⊄ ⊊ ∝ ∊ ∍ ∥ ∦ ∷ ∺ ∻ ∽ ∾ ≁ ≃ ≂ ≄ ≅ ≆ ≇ ≈ ≉ ≊ ≋ ≌ ≍ ≎ ≐ ≑ ≒ ≓ ≖ ≗ ≘ ≙ ≚ ≛ ≜ ≝ ≞ ≟ ≣ ≦ ≧ ≨ ≩ ≪ ≫ ≬ ≭ ≮ ≯ ≰ ≱ ≲ ≳ ≴ ≵ ≶ ≷ ≸ ≹ ≺ ≻ ≼ ≽ ≾ ≿ ⊀ ⊁ ⊃ ⊅ ⊇ ⊉ ⊋ ⊏ ⊐ ⊑ ⊒ ⊜ ⊩ ⊬ ⊮ ⊰ ⊱ ⊲ ⊳ ⊴ ⊵ ⊶ ⊷ ⋍ ⋐ ⋑ ⋕ ⋖ ⋗ ⋘ ⋙ ⋚ ⋛ ⋜ ⋝ ⋞ ⋟ ⋠ ⋡ ⋢ ⋣ ⋤ ⋥ ⋦ ⋧ ⋨ ⋩ ⋪ ⋫ ⋬ ⋭ ⋲ ⋳ ⋴ ⋵ ⋶ ⋷ ⋸ ⋹ ⋺ ⋻ ⋼ ⋽ ⋾ ⋿ ⟈ ⟉ ⟒ ⦷ ⧀ ⧁ ⧡ ⧣ ⧤ ⧥ ⩦ ⩧ ⩪ ⩫ ⩬ ⩭ ⩮ ⩯ ⩰ ⩱ ⩲ ⩳ ⩵ ⩶ ⩷ ⩸ ⩹ ⩺ ⩻ ⩼ ⩽ ⩾ ⩿ ⪀ ⪁ ⪂ ⪃ ⪄ ⪅ ⪆ ⪇ ⪈ ⪉ ⪊ ⪋ ⪌ ⪍ ⪎ ⪏ ⪐ ⪑ ⪒ ⪓ ⪔ ⪕ ⪖ ⪗ ⪘ ⪙ ⪚ ⪛ ⪜ ⪝ ⪞ ⪟ ⪠ ⪡ ⪢ ⪣ ⪤ ⪥ ⪦ ⪧ ⪨ ⪩ ⪪ ⪫ ⪬ ⪭ ⪮ ⪯ ⪰ ⪱ ⪲ ⪳ ⪴ ⪵ ⪶ ⪷ ⪸ ⪹ ⪺ ⪻ ⪼ ⪽ ⪾ ⪿ ⫀ ⫁ ⫂ ⫃ ⫄ ⫅ ⫆ ⫇ ⫈ ⫉ ⫊ ⫋ ⫌ ⫍ ⫎ ⫏ ⫐ ⫑ ⫒ ⫓ ⫔ ⫕ ⫖ ⫗ ⫘ ⫙ ⫷ ⫸ ⫹ ⫺ ⊢ ⊣ ⟂ ⫪ ⫫ <: >:
Or these?

    + - − ¦ |\|| ⊕ ⊖ ⊞ ⊟ |++| ∪ ∨ ⊔ ± ∓ ∔ ∸ ≏ ⊎ ⊻ ⊽ ⋎ ⋓ ⟇ ⧺ ⧻ ⨈ ⨢ ⨣ ⨤ ⨥ ⨦ ⨧ ⨨ ⨩ ⨪ ⨫ ⨬ ⨭ ⨮ ⨹ ⨺ ⩁ ⩂ ⩅ ⩊ ⩌ ⩏ ⩐ ⩒ ⩔ ⩖ ⩗ ⩛ ⩝ ⩡ ⩢ ⩣ 
These?

    * / ⌿ ÷ % & · · ⋅ ∘ × |\\| ∩ ∧ ⊗ ⊘ ⊙ ⊚ ⊛ ⊠ ⊡ ⊓ ∗ ∙ ∤ ⅋ ≀ ⊼ ⋄ ⋆ ⋇ ⋉ ⋊ ⋋ ⋌ ⋏ ⋒ ⟑ ⦸ ⦼ ⦾ ⦿ ⧶ ⧷ ⨇ ⨰ ⨱ ⨲ ⨳ ⨴ ⨵ ⨶ ⨷ ⨸ ⨻ ⨼ ⨽ ⩀ ⩃ ⩄ ⩋ ⩍ ⩎ ⩑ ⩓ ⩕ ⩘ ⩚ ⩜ ⩞ ⩟ ⩠ ⫛ ⊍ ▷ ⨝ ⟕ ⟖ ⟗ ⨟
There's a few more, don't worry

    << >> >>> // ^ ↑ ↓ ⇵ ⟰ ⟱ ⤈ ⤉ ⤊ ⤋ ⤒ ⤓ ⥉ ⥌ ⥍ ⥏ ⥑ ⥔ ⥕ ⥘ ⥙ ⥜ ⥝ ⥠ ⥡ ⥣ ⥥ ⥮ ⥯ ↑ ↓

jakderrida
2 replies
20h47m

I'm just guessing, but:

1. Open Source.

2. Easy to learn. (kind of)

3. Robust plotting and visualization capabilities, which are integral (I f*cking swear no pun intended) to understanding calculus, the foundational purpose of calculus being to find the area under the curve. (something I really wish they told me day one of Pre-Calc)

4. This one is just a vague feeling, but the fact that Python's most robust Symbolic Regression package, SymPy, relies on running Julia in the background to do all the real work suggests to me that Julia is somehow just superior when it comes to formulas as opposed to just calculations. IDK how, though.

constantcrying
1 replies
20h21m

1-3 apply exactly the same to Python; 4 is just false: SymPy is written in Python and has nothing to do with Julia.

patagurbon
0 replies
19h34m

They're thinking of PySR but you are right on all points!

dnfsod
1 replies
20h48m

I also clicked around and felt the formatting and progression of the book was a bit confusing, but found some of the Julia features intriguing. (like “postfixing” allowing the same pencil notation of f’ and f’’ et al)

In my opinion, and experience, the best “calculus book” is Learn Physics with Functional Programming which only relies on libraries for plotting, and uses Haskell rather than Julia.

https://www.lpfp.io/

Why is Julia better suited than any other language?

Julia is known as a “programming language for math” and was designed with that conceit steering a lot of its development.

Explicitly it supports a lot of mathematical notation that matches handwritten or latex symbols.

Implicitly they may be referencing the simplified (see: Pythonic) syntax, combined with broad interoperability (this tutorial uses SymPy for a lot of the heavy lifting), lots of built-in parallel computing primitives, and its use of JIT compilation allowing for fast iteration/exploration.

barrenko
0 replies
11h52m

Thank you for reminding me of LPFP!

patagurbon
0 replies
20h53m

Julia inherited a lot of MATLAB DNA, like standard-library linear algebra and 1-based indices. In addition, its Unicode support is very commonly used, and it can intentionally be written to look like pseudocode in many cases.

g0wda
0 replies
12h29m

Multiple dispatch is the right abstraction for programming mathematics. It provides both the flexibility required to create scientific software in layers (science has a LOT more function overloading compared to any other field), and the information required to compile to fast machine code.

blagie
0 replies
20h56m

This is a good overview:

https://computationalthinking.mit.edu/Spring21/

The very short story is that it's a language somewhat similar to Python (and, likewise, easy-to-use), but with much richer syntax for expressing mathematics directly.

It also has richer notebooks. The key property is that in Python notebooks, you run cells; Julia notebooks handle things like dependencies between cells. If I change x, either as a number or via a slider, all the dependent things update. You can define a plot, add sliders, and it just works.
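
The reactive behavior described above is, I believe, Pluto.jl; a minimal sketch of the slider-plus-plot idea (assuming a Pluto notebook with the PlutoUI and Plots packages):

    using PlutoUI, Plots

    @bind a Slider(1:10)             # a slider rendered in the notebook, bound to `a`

    plot(x -> a * sin(x), 0, 2π)     # this cell re-runs automatically whenever `a` changes

In Pluto each of those lines would be its own cell, and the plot cell re-executes whenever the slider moves.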

(Also: I'm not an expert in Julia; most of my work is in Python and JavaScript. I'm sure there are other reasons as well, but the two above come out very clearly in similar courses)

ultrasounder
17 replies
18h31m

This comes at a very opportune time in my life, when my ward enters their HS junior year and they are taking SVC (single-variable calculus). Question to the author (in case they are monitoring this thread): is it appropriate for a high-schooler with just an intro to Python?

seanhunter
4 replies
11h55m

I'm going to say a strong no to that. The programming parts are fine, but on a quick skim, the mathematics is written in a way they will likely find very confusing unless they already know calculus, and it will leave them thinking they are bad at maths, whereas actually the author is just not really trying to explain the maths to someone who doesn't already know it.

For example, look at the diagram in [1]. It has the unlabeled axes of a shaded L-shaped box with a curve going through it, and it is then followed by a bunch of equations where he derives the formula for integration by parts using a set of parametric equations with multiple substitutions etc. I know what integration by parts is and how it works really well. This is possibly the most confusing way you could derive the formula and/or explain it, and this diagram really adds absolutely nothing unless you already know and understand the concept. If you have seen the exceptional diagrams and illustrations and clarity of explanation in a book like "Calculus" by James Stewart, the contrast is so stark it really jumps out.

The normal way of explaining integration by parts starts the way the author does (with the product rule for derivatives) and shows a few examples of taking the derivative of things using the product rule so you get an intuition for what the form of the resulting antiderivative looks like. You then go through the derivation of the formula for integration by parts by using the product rule for derivatives, integrating both sides and splitting the resulting integral[2]. It's very clear and easy to follow. And if they want to actually help the student to do this they teach something like the tabular/"DI" method so the student isn't tearing their hair out getting the signs mixed up when integrating by parts multiple times.

[1] https://jverzani.github.io/CalculusWithJuliaNotes.jl/integra...

[2] My own notes on this derivation which I took when learning this are here. Note that this isn't me really trying to explain it to a beginner - it's just my personal notes but it's still a lot easier to follow than the example given https://publish.obsidian.md/uncarved/3+Resources/Public/Inte...
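
For reference, the standard derivation described above, written out (just the textbook identity, nothing specific to either set of notes):

    % product rule, integrate both sides, then rearrange:
    (u v)' = u' v + u v'
    u v = \int u' v \, dx + \int u v' \, dx
    \int u v' \, dx = u v - \int u' v \, dx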

ultrasounder
1 replies
1h15m

Wow! My secret plan is also to grok the math behind the Transformer while helping my HSchooler out. And I think your resource is super good.

seanhunter
0 replies
55m

That's awesome. Very happy you found it useful. Bear in mind of course that while I do try to get it right, there will be errors etc. If you find it helpful there is a lot more to come - I have a bunch more notes that I need to edit and make new diagrams for but I should be publishing soonish, and then I am learning all the time. Please let me know if there's anything that you found confusing/hard to understand[1] or if you see any errors.

[1] because that indicates I don't understand it well enough myself and I need to work on it more and then fix the note.

barrenko
1 replies
11h20m

Hi sean, we have the same end-goal possibly.

seanhunter
0 replies
2h41m

That's awesome. Hope your studies are going well and you achieve what you are hoping for. I'm finding it fantastic so far I have to say. Super-interesting and always more to discover.

nl
4 replies
17h4m

Stick with Python. Try SymPy.

ultrasounder
1 replies
16h40m

Great. Thanks. Any suggestions for a book or course that uses SymPy to teach calculus (SVC)?

ssivark
0 replies
14h59m

I imagine a language with decent support for macros would be much much more ergonomic for symbolic computation. Even ignoring all the other pros & cons, just this one reason would push one strongly towards Julia, Lisp(s), Mathematica, etc for symbolic computation.
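
As one small illustration of that macro-driven ergonomics in Julia (a sketch assuming the Symbolics.jl package):

    using Symbolics

    @variables x y                 # the macro introduces symbolic variables into scope
    D = Differential(x)

    expr = x^2 + 3x*y
    expand_derivatives(D(expr))    # 2x + 3y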

cfgauss2718
0 replies
13h20m

SymPy is a poor tool to learn because it simply doesn't scale to the problems one most often encounters, even in schooling. Frankly, CASes are so general and unintelligent that problems with well-known and elegant closed-form solutions, when presented to a system like SymPy, result in output which is often not even human readable: thousands of algebraic terms, for instance, to describe the equations of motion for a simple double pendulum.

Personally I have found that the best tools are 1) a firm grasp of elementary calculus (differential, series, and integral) and 2) exposure to simple numerical methods that apply to a broad range of problems.

Armed with this knowledge, in my opinion, Julia is a far superior language to python and its package ecosystem has a brighter future. Indeed, the language has been built with a focus on mathematical modeling and efficient numerical computation. It is a more natural starting point for those with an interest in mathematical modeling and engineering science, and will serve anyone who learns it better than python+numpy+sympy+scipy.

hyencomper
2 replies
12h29m

IMHO a HS student learning Calculus should first learn the subject with pen and paper for a while before programming. It is important to work through the problems and think about the fundamental concepts involved, rather than thinking about the syntax involved in coding their solutions. Practising with pen and paper helps to internalize the subject matter better.

ultrasounder
0 replies
1h18m

Makes perfect sense. And the parent's suggestions for books are what I needed. Fun summer project: learn SVC and MVC together with my rising junior. What can go wrong?

galdosdi
1 replies
4h27m

I taught myself calculus at that age (11th grade), obviously had a hard time at first, and persevered by sampling a good half dozen different books till I found one that made things click. These are my credentials for making the following recommendation:

Get the book Quick Calculus by Kleppner and Ramsey. Nothing else I used came close, for developing intuition about the concepts when they're fresh and new.

Once they have mastered that, any good book will do. James Stewart's is fantastic but so big and thick you may need to use it as a resource to intelligently select appropriate readings and problems from as they go, rather than just starting at page 1 and assigning every problem. But the main thing is to really get the basics of what a derivative and integral and limit are from the get-go, and nothing I tried compares to Quick Calculus for that.

Using this Julia book or some other similar book to select exercises, based on your appropriate judgment, to supplement Stewart would also be very cool if your student has an interest in programming. Let them complete them in Python if they like; Python should be fine. I recall implementing a numerical integrator and a symbolic differentiator in Python at the same age, as a way of consolidating my knowledge of calculus, and found both to be very valuable and fun exercises. Symbolic differentiation seemed like magic especially, but it was just a matter of parsing and then continually adding rules for each rule I learned in math.

ultrasounder
0 replies
1h29m

Thanks for taking the time to answer. This is why I keep coming back to HN: for concrete recommendations based on one's own experience. Ordered a used Quick Calculus.

pclmulqdq
0 replies
8m

As a post-calculus introduction to Julia and other similar mathematical computing languages, this doesn't seem like a bad thing to do. Understanding how to use one of these at a high level is pretty useful, and any time I have to do real math for work, Mathematica makes things a lot easier. R and Matlab are also big in the space, but iPython notebooks with SymPy might also work (although SymPy is comparatively underpowered).

adastra22
0 replies
16h27m

I disagree with sibling comment. No prior exposure to Julia or even programming is required for this course.

Alifatisk
14 replies
20h48m

For someone coming from Matlab, is Julia a valid replacement?

BenFranklin100
6 replies
20h2m

While not a Matlab or Julia user, I think you may be neglecting the nearly 40 years of code, toolboxes, and countless examples of common engineering problems solved in Matlab. Engineers tend to be more of a practical sort than developers, and would rather apply a known solution to a problem than mess around with newish software languages.

constantcrying
2 replies
6h52m

One enormous difference is that MATLAB is paid, which, ironically, makes it much much easier to use in a corporate context.

For MATLAB it isn't just toolboxes, but also integrations with other tools. So it depends on what you are trying to do, but if your problem is taking in data, computing, and putting out data, Julia can absolutely compete with MATLAB, even on the most practical "I really just want this to work and look at the results" level.

BenFranklin100
1 replies
1h34m

Being paid does not make it easier to use in a corporate environment, that is nonsense. I specifically tried to wean my Mech E.’s off Matlab because I don’t want to pay for Matlab licenses for our company.

The big win for Matlab seemed to be that there are pre-existing solutions/examples for hundreds of common engineering problems that only need to be tweaked for the problem at hand. Engineers like this. They seem pretty happy and productive in Matlab too, so I finally gave up and decided it was in my best interest to let them use it.

constantcrying
0 replies
48m

Being paid does not make it easier to use in a corporate environment, that is nonsense.

No, it isn't. It is the single biggest issue in getting it into your corporation. I could easily get a 10k euro PC if I had any reasonable need for it that I could coherently explain to my boss. Getting Julia installed on it would take months of debates with IT and tens of thousands in internal expenses.

It is the same reason why corporations use Red Hat: there is nothing about Red Hat's Linux which makes it inherently superior to e.g. Debian, and definitely nothing that would justify the price. But corporations still willingly pay for it.

There are multiple reasons for this. First is support: if you aren't paying someone, there is nobody who will support you. Second is liability, which is enormous: MathWorks is willing to take legal responsibility for their mistakes. That is a very big reason people use MATLAB; in some areas it is basically the only option for that reason. Third, paid software is easy for everyone else to understand; you wouldn't believe how hard it is to tell people that something good is actually free, and in fact so free that you can do with it whatever you want. If you aren't part of an enormous organization this might be hard to understand, but for most people free means "trial version" or "scam".

agucova
2 replies
19h20m

This is true, but even engineers see the advantages of Julia. My engineering school has gone from almost pure Matlab usage to many key engineering courses switching to Julia due to its simplicity and friendliness.

It's also SOTA for many engineering applications, particularly for acausal modelling and scientific machine learning (see https://sciml.ai), which has led to big companies like Pfizer adopting it [1]. And for engineers writing novel libraries, it clearly has a strong edge. See for example the work by NASA's JPL [2, 3], the FAA [4] or the CliMa project [5].

[1]: https://juliahub.com/case-studies/pfizer/ (see also https://info.juliahub.com/case-studies) [2]: https://exoplanets.nasa.gov/news/1669/seven-rocky-trappist-1... [3]: https://ntrs.nasa.gov/citations/20170008266 [4]: https://youtu.be/19zm1Fn0S9M [5]: https://clima.caltech.edu/

logtempo
1 replies
19h5m

Tools are valid relative to a context (when you have a hammer, everything is a nail, etc.). In the industrial context, Julia is not valid; in research/education, it is. That does not mean that Julia will never be relevant in industry.

patagurbon
0 replies
18h47m

Our upcoming JuliaCon 2024 has a significant number of industrial talks and a minisymposium. ASML uses Julia quite widely in a definitively industrial context.

https://juliacon.org/2024/

nsajko
0 replies
20h10m

Very much so. Julia code can be both much nicer and much more performant than Matlab code can be.

constantcrying
0 replies
6h57m

It depends on whether you are depending on specific MATLAB toolboxes or integrations.

If you don't, Julia will feel good and be a more than adequate replacement. It has a somewhat similar syntax and many of the Array niceties of MATLAB. Additionally it has a good type system and a much better handling of "general purpose" programming, e.g. it isn't weird about where functions are.

I can only recommend that you try it out.

antegamisou
0 replies
19h34m

No because Matlab is never going away from the industry. Same for Python/matplotlib btw.

adgjlsfhk1
0 replies
20h44m

yes.

SatvikBeri
0 replies
15h11m

In some ways – it's much faster, it generally has a lot of features that make it better as a programming language, it's much easier to parallelize. As someone who's written both professionally, I would say Julia is usually the better choice today.

But there is a huge array of existing algorithms that are already implemented in languages like Matlab or Stata that may not have corresponding equivalents. If what you want is one of these, it's often hard to justify using another language (though in practice I've usually found it pretty easy to port Matlab/Python/Stata to Julia).

FabHK
0 replies
15h3m

Yes. It's like giving a glass of ice water to somebody in hell (with apologies to Steve Jobs, who said that about iTunes on Windows, back when iTunes was not the mess it turned into later).

Mind you, I have huge respect for Cleve Moler and Matlab and all they have accomplished (making LINPACK and EISPACK and all that easily accessible). And they've worked hard to overcome the limitations of initially having only one datatype: a matrix.

But Julia as a modern general purpose language is just so much more pleasant to work with, while retaining nearly all of Matlab's power.

tiffanyh
6 replies
19h40m

LuaJIT faster

I find it interesting that after all these years of little development on LuaJIT, and active development in Julia, LuaJIT is still faster (while also being general purpose).

https://julialang.org/benchmarks/

slwvx
1 replies
19h37m

I find it interesting that after all these years of HN, commenters still ignore the main topic of the OP (a calculus book) to focus on the language it's written in

patagurbon
0 replies
19h30m

LuaJIT is only appreciably faster on text ingestion and printing in those benchmarks. Julia is also plenty general purpose, just targeted (initially at least) towards the numerical community. LuaJIT is fantastically impressive though.

neonsunset
0 replies
6h15m

Now they need to include .NET and System.Numerics.Tensors but that would make a lot of popular options look bad :)

(particularly go which has no business being used in this area)

gc9
0 replies
19h15m

Note that "The benchmark data shown above were computed with Julia v1.0.0" (2018)

fiforpg
3 replies
18h16m

One has to be a little careful with designing courses like this. They are most likely to be of interest to people who already know, at least to some extent, both (i) calculus and (ii) programming. That is, the (presumable) target audience — those who are learning either of these subjects — is not really ready to take in such a class.

Anecdotally, my personal attempts at incorporating only slightly exotic CASes (Maxima or Sagemath) into calculus courses were met with tepid response at best. Part of the issue was, I believe, that freshmen are rarely interested in setting up software for a non-CS course.

That being said, for slightly higher-level classes it can work quite well as an optional component — I've had really good results with Python projects in an ODE course. Python not being a niche language certainly helped, too.

dr_kiszonka
0 replies
17h31m

This is a common challenge in teaching stats, where students have to learn statistical concepts and how to use statistical software simultaneously. In the end, I think it is worth the challenge and beats having students calculate everything by hand & look up values in tables.

__rito__
0 replies
10h18m

"I've had really good results with Python projects in an ODE course."

Which textbook did you use for this course? What was the reference material? Could you share anything about this course? Lecture notes, code, slides, books, anything.

Egrodo
0 replies
13h19m

As a self taught programmer who never got farther than algebra, this looks awesome to me and I may be that small target audience haha.

jagged-chisel
1 replies
21h34m

PDF link in the page header is 404

mhd
0 replies
21h12m

I assume they turned it off intentionally:

"These notes may be compiled into a pdf file through Quarto. As the result is rather large, we do not provide that file for download. For the interested reader, downloading the repository, instantiating the environment, and running quarto to render to pdf in the quarto subdirectory should produce that file (after some time)."

blagie
1 replies
20h53m

I like the concept.

I'd much rather have this built on top of or from something like MOOCulus.

https://ximera.osu.edu/mooculus/calculus1

Holistically, I prefer MOOCulus. However, the value-add of Calculus with Julia is large. If the two could integrate somehow... the key thing about MOOCulus is that the writing quality is better (much less verbose), and the integrated exercises mean kids follow it closely. It's also very refined, from a lot of classroom use.

If it were forked and Julia-enhanced, that would be a very big step up, though. As would the addition of applications.

bmitc
0 replies
20h34m

I checked out the website, and it's really hard to even know how to use it. Plus, the first thing I clicked on, "Equal or Not?", has errors.

richrichie
0 replies
4h32m

“A computation is a temptation that must be resisted as long as possible.” - J.P. Boyd.

anthk
0 replies
9h15m

Maxima with Gnuplot and the bundled docs is pretty good. Also, I think there was a pretty complete intro/guide to Maxima in PDF.

DeathArrow
0 replies
6h48m

Funnily enough I just started to re-learn math (linear algebra, calculus and statistics) because I want to learn ML.

While I learn various things I also try to use them in simple Python implementations (to my shame, I never used Python before). It's quite nice how you can rotate vectors and plot functions using matplotlib. I can draw things by hand, but not as nicely.