Comments on: Is Mojo The Fortran For AI Programming, Or More? https://www.nextplatform.com/2023/09/08/is-mojo-the-fortran-for-ai-programming-or-more/

By: Slim Albert https://www.nextplatform.com/2023/09/08/is-mojo-the-fortran-for-ai-programming-or-more/#comment-213503 Wed, 13 Sep 2023 03:09:52 +0000 In reply to Hubert.

Indeed! The languages of the ML (Meta Language) family (Standard ML, SML/NJ, CAML, Poplog, …) are formally verifiable ancestors of both Haskell and Python, mostly on holiday from the genealogy plot, except for Miranda, which admirably keeps watch on the constellation (esp. the 7th planet, whose name shall not be written, because: giggles!).

By: Slim Albert https://www.nextplatform.com/2023/09/08/is-mojo-the-fortran-for-ai-programming-or-more/#comment-213458 Tue, 12 Sep 2023 02:21:15 +0000 In reply to Erich.

Interesting link! I like that the code (e.g. with autotune) seems rather hardware-agnostic (provided, I guess, that Mojo has the needed HW-specific libraries available). The largest bit of speedup (400x) comes when they introduce static typing (C-like) to replace Python’s dynamic typing (LISP-like), making Mojo the lovechild of C++ and Python headlined in this TNP article.

Compilation by itself (with dynamic typing; maybe Cython could do it) gives a 4.5x speedup; then vectorization, parallelization, and tiling together give a further 40x (total speedup ≈ 4.5 × 400 × 40 ≈ 72,000x). I suspect, though, that calling BLAS and LAPACK through something like SciPy/NumPy/pyBLAS from Python would also speed things up quite a bit over the generic nested-loop matmul.
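Something like the quick timing sketch below would test that suspicion (the matrix size and random data are illustrative assumptions; the `@` operator dispatches to whatever BLAS NumPy was built against):

```python
import time
import numpy as np

n = 128  # kept small so the pure-Python loops finish in a few seconds
A = np.random.rand(n, n)
B = np.random.rand(n, n)

def matmul_loops(A, B):
    """Deliberately naive triple-nested-loop matmul, the slow baseline."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for k in range(n):
            a = A[i, k]
            for j in range(n):
                C[i, j] += a * B[k, j]
    return C

t0 = time.perf_counter(); C1 = matmul_loops(A, B); t1 = time.perf_counter()
t2 = time.perf_counter(); C2 = A @ B;              t3 = time.perf_counter()

print(f"nested loops: {t1 - t0:.3f} s   BLAS via A @ B: {t3 - t2:.6f} s")
print("results match:", np.allclose(C1, C2))
```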

Either way, it seems that the coder still needs to know what they’re doing to get good performance (no free gastronomy!).

By: Hubert https://www.nextplatform.com/2023/09/08/is-mojo-the-fortran-for-ai-programming-or-more/#comment-213437 Mon, 11 Sep 2023 16:00:10 +0000

Like T. Hogberg (above), I quite liked the intro, especially the cool genealogy plot, which induces a bit of nostalgia and provides a frame for discussing PL evolutions. I would have put Haskell as descending from CAML/OCaml functional style, inspired by some Scheme conciseness and Prolog’s declarative pattern-matchingness, with monads lurking within (disguised as either nomads or dromedaries, to the mildly dyslexic), but I’m no genealogist.

Feels odd to think I’ve been at this for forty years (1983+), through assembly, BASIC, Pascal, Fortran, Lisp, Scheme, Prolog, Smalltalk, C, Forth, JavaScript, Verilog, and the higher-level Maple and MATLAB. The puzzle-solving abilities of Haskell intrigue me as a next language to approach (not just the curry!).

I also wonder if LLVM IR, MLIR, WebAssembly, asm.js, the JVM, and other typed virtual ISAs, VMs, and bytecode interpreters might one day converge to produce a common target for HW implementation: something like RISC-VI, RISC-VJ, or even CISC-VM++?

By: Erich https://www.nextplatform.com/2023/09/08/is-mojo-the-fortran-for-ai-programming-or-more/#comment-213436 Mon, 11 Sep 2023 15:40:29 +0000 In reply to emerth.

Here’s a nice walk-through of how they accomplish the speedup, using a matrix multiply example:
https://docs.modular.com/mojo/notebooks/Matmul.html

In this example, they run the base Python matrix multiplication code as Mojo code and get a 4.5x speedup.

Then they add types, SIMD vectorization, parallelization, a cache-locality optimization (tiling), and finally loop unrolling, and get over a 70,000x improvement over standard Python code.

I think there are a few takeaways here… the first is that the vanilla Python code not only “just worked” but also got a >4x speedup when run under Mojo, presumably due to the compilation. The second is that if you really need to squeeze out additional performance, there are several knobs that can be turned to get some pretty extraordinary improvements over vanilla Python code.
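To make the cache-locality knob a bit more concrete, here is a hypothetical plain-Python sketch of loop tiling (cache blocking); the tile size is an arbitrary choice, and pure CPython won’t actually show the cache benefit because interpreter overhead dominates, but it is the same transformation the notebook applies with typed buffers:

```python
def matmul_tiled(A, B, C, n, tile=32):
    """Blocked (tiled) matmul: work on tile x tile sub-blocks so the
    pieces of B and C being touched stay resident in cache."""
    for i0 in range(0, n, tile):
        for k0 in range(0, n, tile):
            for j0 in range(0, n, tile):
                for i in range(i0, min(i0 + tile, n)):
                    for k in range(k0, min(k0 + tile, n)):
                        a = A[i][k]
                        for j in range(j0, min(j0 + tile, n)):
                            C[i][j] += a * B[k][j]

# Tiny usage example with plain lists of lists.
n = 4
A = [[1.0] * n for _ in range(n)]
B = [[2.0] * n for _ in range(n)]
C = [[0.0] * n for _ in range(n)]
matmul_tiled(A, B, C, n, tile=2)
print(C[0][0])  # 1.0 * 2.0 summed over 4 values of k -> 8.0
```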

By: Timothy Prickett Morgan https://www.nextplatform.com/2023/09/08/is-mojo-the-fortran-for-ai-programming-or-more/#comment-213433 Mon, 11 Sep 2023 13:05:29 +0000 In reply to Cristian Vasile.

Neat! It looked a bit retro. But the magnificent part was the tree, or rather the trees, not the graphics, which were kinda, how shall I say, low-rez?

By: emerth https://www.nextplatform.com/2023/09/08/is-mojo-the-fortran-for-ai-programming-or-more/#comment-213397 Sun, 10 Sep 2023 20:50:11 +0000

35,000x. That’s the most egregious cherry-picking I have seen in decades.

By: HuMo https://www.nextplatform.com/2023/09/08/is-mojo-the-fortran-for-ai-programming-or-more/#comment-213392 Sun, 10 Sep 2023 16:39:41 +0000 In reply to Eric Olson.

Julia’s children (at https://discourse.julialang.org/t/julia-mojo-mandelbrot-benchmark/103638/43) are cooking up a storm of souped-up Mandelbrot recipes to compare with Mojo’s vectorized-and-parallelized dish of the same name. The OP has 7.4 ms for Julia to Mojo’s 2.1 ms, and others’ results vary with the seasonings (down to at least 0.9 ms) … to be taken with pinches of salt. 8^d
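For reference, the kernel being benchmarked is essentially the escape-time iteration below; this is a minimal, unoptimized Python sketch (grid size, view window, and iteration cap are illustrative assumptions, not the benchmark’s settings):

```python
def mandelbrot_point(c, max_iter=200):
    """Count iterations of z -> z*z + c before |z| exceeds 2."""
    z = 0j
    for i in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return i
    return max_iter

# Crude ASCII rendering over a small grid spanning the classic view window.
width, height = 80, 40
for y in range(height):
    row = ""
    for x in range(width):
        c = complex(-2.0 + 3.0 * x / width, -1.25 + 2.5 * y / height)
        row += "*" if mandelbrot_point(c) == 200 else " "
    print(row)
```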

By: Cristian Vasile https://www.nextplatform.com/2023/09/08/is-mojo-the-fortran-for-ai-programming-or-more/#comment-213385 Sun, 10 Sep 2023 14:43:54 +0000

“magnificent programming language genealogy tree from Wikipedia”
Tim, this chart was created using Graphviz (https://graphviz.org/gallery/), an open-source application developed back in the day by Ma Bell, also known as AT&T.
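For the curious, a chart in that style takes only a few lines; here is a hypothetical sketch using the `graphviz` Python bindings (the bindings and the ancestry edges are assumptions for illustration, not how the Wikipedia chart was actually produced):

```python
from graphviz import Digraph  # pip install graphviz; also needs the Graphviz binaries

g = Digraph("genealogy", format="png")
# Illustrative ancestry edges only, not the Wikipedia genealogy.
g.edge("Fortran", "ALGOL")
g.edge("ALGOL", "C")
g.edge("C", "C++")
g.edge("C++", "Python")
g.edge("Python", "Mojo")
g.render("genealogy")  # writes the DOT source plus genealogy.png
```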

By: Cristian Vasile https://www.nextplatform.com/2023/09/08/is-mojo-the-fortran-for-ai-programming-or-more/#comment-213380 Sun, 10 Sep 2023 13:03:03 +0000

“Mojo guys can add a sprinkle of rust”
You can compile Rust code as a C-like library and use a Python C foreign function interface to call the Rust functions.
More details for the Rust side are here: https://doc.rust-lang.org/nomicon/ffi.html#calling-rust-code-from-c and for the Python side here: https://cffi.readthedocs.io/en/latest/
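A minimal sketch of the Python side, assuming a Rust crate built as a `cdylib` that exports `#[no_mangle] pub extern "C" fn add(a: i32, b: i32) -> i32` (the function name and library path are illustrative assumptions):

```python
from cffi import FFI  # pip install cffi

ffi = FFI()
ffi.cdef("int add(int a, int b);")  # declare the C signature the Rust library exports

# Path to the compiled Rust cdylib is an assumption; adjust for your build output.
lib = ffi.dlopen("./target/release/libadder.so")

print(lib.add(2, 3))  # calls into Rust, prints 5
```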

By: Slim Albert https://www.nextplatform.com/2023/09/08/is-mojo-the-fortran-for-ai-programming-or-more/#comment-213352 Sat, 09 Sep 2023 19:58:15 +0000 In reply to Eric Olson.

MATLAB syntax (and GNU Octave’s) is great for implementing the main loop of the error-backpropagation training algo for a multi-layer feed-forward ANN of arbitrary size (using matrix-vector ops): it takes only about 15 lines of highly readable code!
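For a rough idea of what that main loop looks like, here is a hypothetical NumPy equivalent for a single hidden layer with sigmoid activations and a squared-error loss (layer sizes, learning rate, and random data are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((100, 4))                          # 100 samples, 4 inputs (illustrative data)
Y = rng.random((100, 2))                          # 2 target outputs per sample
W1 = 0.1 * rng.standard_normal((4, 8)); b1 = np.zeros(8)   # input -> hidden weights
W2 = 0.1 * rng.standard_normal((8, 2)); b2 = np.zeros(2)   # hidden -> output weights
sig = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for epoch in range(1000):
    H = sig(X @ W1 + b1)                # forward pass, hidden layer
    O = sig(H @ W2 + b2)                # forward pass, output layer
    dO = (O - Y) * O * (1 - O)          # output delta for squared error + sigmoid'
    dH = (dO @ W2.T) * H * (1 - H)      # error backpropagated to the hidden layer
    W2 -= lr * (H.T @ dO) / len(X); b2 -= lr * dO.mean(axis=0)
    W1 -= lr * (X.T @ dH) / len(X); b1 -= lr * dH.mean(axis=0)

print("final training MSE:", float(((O - Y) ** 2).mean()))
```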
