Machine Learning in Elixir
Learning to Learn with Nx and Axon
by Sean Moriarity
Stable Diffusion, ChatGPT, Whisper—these are just a few examples of
incredible applications powered by developments in machine learning.
Despite the ubiquity of machine learning applications running in
production, there are only a few viable language choices for data
science and machine learning tasks. Elixir’s Nx project seeks to change
that. With Nx, you can leverage the power of machine learning in your
applications, using the battle-tested Erlang VM in a pragmatic language
like Elixir. In this book, you’ll learn how to leverage Elixir and the
Nx ecosystem to solve real-world problems in computer vision, natural
language processing, and more.
The Elixir Nx project aims to make machine learning possible without the
need to leave Elixir for solutions in other languages. And even if
concepts like linear models and logistic regression are new to you,
you’ll be using them and much more to solve real-world problems in no
time.
Start with the basics of the Nx programming paradigm—how it differs
from the Elixir programming style you’re used to and how it enables you
to write machine learning algorithms. Use your understanding of this
paradigm to implement foundational machine learning algorithms from
scratch. Go deeper and discover the power of deep learning with Axon.
Unlock the power of Elixir and learn how to build and deploy machine
learning models and pipelines anywhere. Learn how to analyze, visualize,
and explain your data and models.
Discover how to use machine learning to solve diverse problems from
image recognition to content recommendation—all in your favorite
programming language.
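To give a flavor of the Nx paradigm the book teaches, here is a minimal sketch (assuming the `:nx` dependency is installed; the module and function names are illustrative, not from the book):

```elixir
defmodule Sketch do
  import Nx.Defn

  # defn compiles numerical definitions so they can run on
  # pluggable backends and compilers (CPU, or GPU via EXLA)
  defn scale_and_sum(t, factor) do
    t
    |> Nx.multiply(factor)
    |> Nx.sum()
  end
end

Sketch.scale_and_sum(Nx.tensor([1, 2, 3]), 2)
# a scalar tensor holding 12
```

Inside `defn`, ordinary-looking Elixir operates on tensors instead of plain numbers, which is the shift in thinking the early chapters cover.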
What You Need
You’ll need a computer with a working installation of Elixir v1.12 and
Erlang/OTP 24. For some of the more compute-intensive examples, you'll
want to use EXLA, which currently only supports x86-64 platforms. While
not explicitly required, some examples will demonstrate programs running
on accelerators such as CUDA/ROCm-enabled GPUs and Google TPUs. Most of
these programs will still run fine on a regular CPU, just over much
longer periods of time.
Resources
Releases:
2024/08/28
P1.0
First printing.
2024/08/13
B5.0
*Production is complete. Now it’s on to layout and the printer.
2024/05/10
B4.0
*Fixes the following issues in Model Everything with Transformers:
*Addressed errata with new Kino Image APIs
*Updated conversational serving to use a modern LLM
*Fixes the following issues in Learn without Supervision:
*Addressed errata with new Kino Image APIs
*Content complete; headed to production.
2024/02/23
B3.0
*Typos fixed
*Fixed issues with usage of old Scholar metrics in Traditional Machine Learning
*Uses new Bumblebee APIs in Model Everything with Transformers
*Fixed logging issue in Learn without Supervision
*New chapters released:
*Put Machine Learning into Practice
*That’s a Wrap
- Preface

- Foundations of Machine Learning
- Make Machines That Learn
- Classifying Flowers
- Learning with Elixir
- Wrapping Up
- Get Comfortable with Nx
- Thinking in Tensors
- Using Nx Operations
- Representing the World
- Going from def to defn
- Wrapping Up
- Harness the Power of Math
- Understanding Machine Learning Math
- Speaking the Language of Data
- Thinking Probabilistically
- Tracking Change
- Wrapping Up
- Optimize Everything
- Learning with Optimization
- Regularizing to Generalize
- Descending Gradients
- Peering into the Black Box
- Wrapping Up
- Traditional Machine Learning
- Learning Linearly
- Learning from Your Surroundings
- Using Clustering
- Making Decisions
- Wrapping Up
- Deep Learning
- Go Deep with Axon
- Learn to See
- Identifying Cats and Dogs
- Introducing Convolutional Neural Networks
- Improving the Training Process
- Going Beyond Image Classification
- Wrapping Up
- Stop Reinventing the Wheel
- Identifying Cats and Dogs Again
- Fine-Tuning Your Model
- Understanding Transfer Learning
- Taking Advantage of the Machine Learning Ecosystem
- Wrapping Up
- Understand Text
- Classifying Movie Reviews
- Introducing Recurrent Neural Networks
- Understanding Recurrent Neural Networks
- Wrapping Up
- Forecast the Future
- Predicting Stock Prices
- Using CNNs for Single-Step Prediction
- Using RNNs for Time-Series Prediction
- Tempering Expectations
- Wrapping Up
- Model Everything with Transformers
- Paying Attention
- Going from RNNs to Transformers
- Using Transformers with Bumblebee
- Wrapping Up
- Learn Without Supervision
- Compressing Data with Autoencoders
- Learning a Structured Latent
- Generating with GANs
- Learning Without Supervision in Practice
- Wrapping Up
- Machine Learning in Practice
- Put Machine Learning into Practice
- Deciding to Use Machine Learning
- Setting Up the Application
- Integrating Nx with Phoenix
- Seeding Your Databases
- Building the Search LiveView
- Wrapping Up
- That’s a Wrap
- Learning from Experience
- Diffusing Innovation
- Talking to Large Language Models
- Compressing Knowledge
- Moving Forward
Author
Sean Moriarity is the author of Genetic Algorithms in Elixir: Solve
Problems Using Evolution, co-creator of the Nx library, and creator of
the Axon deep learning framework. Sean’s interests include mathematics,
machine learning, and artificial intelligence.