Commit 6e71be1

Add MLX Swift blog post (#553)
1 parent d69cb20 commit 6e71be1

File tree

* _data/authors.yml
* _posts/2024-02-20-mlx-swift.md

2 files changed, +120 -0 lines changed

_data/authors.yml (+16)
```diff
@@ -403,3 +403,19 @@ ahoppen:
   name: Alex Hoppen
   github: ahoppen
   about: Alex Hoppen works on the Swift team at Apple, focusing on parsing-related technologies like SourceKit, swift-syntax and code completion.
+
+dkoski:
+  name: David Koski
+  github: davidkoski
+
+ahannun:
+  name: Awni Hannun
+  github: awni
+
+rcollobert:
+  name: Ronan Collobert
+  github: andresy
```

_posts/2024-02-20-mlx-swift.md (+104)
@@ -0,0 +1,104 @@
---
layout: post
published: true
date: 2024-02-20 10:00:00
title: On-device ML research with MLX and Swift
author: [dkoski, ahannun, rcollobert]
---

The Swift programming language has a lot of potential for machine learning research because it combines the ease of use and high-level syntax of a language like Python with the speed of a compiled language like C++.

[MLX](https://github.com/ml-explore/mlx) is an array framework for machine learning research on Apple silicon. MLX is intended for research and not for production deployment of models in apps.

[MLX Swift](https://github.com/ml-explore/mlx-swift/) expands MLX to the Swift language, making experimentation on Apple silicon easier for ML researchers.

As part of this release we are including:
* A comprehensive Swift API for MLX core
* Higher level neural network and optimizers packages
* An example of text generation with Mistral 7B
* An example of MNIST training
* A C API to MLX which acts as the bridge between Swift and the C++ core

We are releasing all of the above under a permissive [MIT license](https://github.com/ml-explore/mlx-swift/blob/main/LICENSE).

This is a big step to enable ML researchers to experiment using Swift.

### Motivation

MLX has several important features for machine learning research that few if any existing Swift libraries support. These include:

* Native support for hardware acceleration. MLX can run compute-intensive operations on the CPU or GPU.
* Automatic differentiation for training neural networks and other gradient-based machine learning models.

For more information on MLX see the [documentation](https://ml-explore.github.io/mlx).

The Swift programming language is fast, easy to use, and works well on Apple silicon. With MLX Swift, you now have a researcher-friendly machine learning framework with the ability to easily experiment on different platforms and devices.

### A Quick Tour

Getting [set up](https://ml-explore.github.io/mlx-swift) with MLX Swift is quick and easy with Xcode or SwiftPM.
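
If you are using SwiftPM directly, a minimal `Package.swift` along the following lines should work. This is a hedged sketch rather than an official template: the version requirement, platform settings, and the particular library products pulled in (`MLX`, `MLXRandom`, `MLXNN`) are assumptions for illustration; check the MLX Swift repository for the current release and product names.

```swift
// swift-tools-version:5.9
// Hypothetical manifest for a small MLX Swift experiment.
// The version and platform requirements below are assumptions;
// use the latest tagged release of mlx-swift.
import PackageDescription

let package = Package(
    name: "MLXExperiment",
    platforms: [.macOS(.v14), .iOS(.v16)],
    dependencies: [
        .package(url: "https://github.com/ml-explore/mlx-swift", from: "0.10.0")
    ],
    targets: [
        .executableTarget(
            name: "MLXExperiment",
            dependencies: [
                .product(name: "MLX", package: "mlx-swift"),
                .product(name: "MLXRandom", package: "mlx-swift"),
                .product(name: "MLXNN", package: "mlx-swift")
            ]
        )
    ]
)
```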

In MLX Swift, building and performing operations with N-dimensional arrays is simple. In the following example, all of the operations will be run on the default device, which is the GPU unless otherwise specified.

```swift
import MLX
import MLXRandom

let r = MLXRandom.normal([2])
print(r)
// array([-0.125875, 0.264235], dtype=float32)

let a = MLXArray(0 ..< 6, [3, 2])
print(a)
// array([[0, 1],
//        [2, 3],
//        [4, 5]], dtype=int32)

// last element of 0th row
print(a[0, -1])
// array(1, dtype=int32)

// slice of the first two rows
print(a[0 ..< 2])
// array([[0, 1],
//        [2, 3]], dtype=int32)

// add with broadcast
let b = a + r
print(b)
// array([[-0.125875, 1.26424],
//        [1.87413, 3.26424],
//        [3.87413, 5.26424]], dtype=float32)
```

You can also use function transformations in MLX Swift. Function transformations in MLX are useful for training models with automatic differentiation as well as optimizing compute graphs for speed or memory use. Below is an example which computes the gradient of a function.

```swift
func fn(_ x: MLXArray) -> MLXArray {
    x.square()
}

let gradFn = grad(fn)

let x = MLXArray(1.5)
let dfdx = gradFn(x)

// prints 2 * 1.5 = 3
print(dfdx)
```
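
Because `gradFn` is just a Swift function, it composes with ordinary control flow. As a rough sketch (not one of the shipped examples; the data, step size, and the use of `mean()` here are illustrative assumptions), a few steps of gradient descent on a one-parameter least-squares problem might look like this:

```swift
import MLX

// Toy data for illustration only: y = 3 * x
let xs = MLXArray([0, 1, 2, 3] as [Float])
let ys = MLXArray([0, 3, 6, 9] as [Float])

// Mean squared error for a single scalar weight, closing over the data.
func loss(_ w: MLXArray) -> MLXArray {
    (xs * w - ys).square().mean()
}

let lossGrad = grad(loss)

// Plain gradient descent; the 0.05 step size is an arbitrary choice.
var w = MLXArray(Float(0))
for _ in 0 ..< 50 {
    w = w - MLXArray(Float(0.05)) * lossGrad(w)
}

print(w)
// should end up close to array(3, dtype=float32)
```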

The documentation contains a few more complete [examples](https://ml-explore.github.io/mlx-swift/MLX/documentation/mlx/examples) to help you get started with MLX Swift:

* Text generation with an LLM: A complete LLM text generation example with Mistral 7B. The example will generate text using any Mistral or Llama-style model, including pre-quantized MLX models, many of which are available on [Hugging Face](https://huggingface.co/models?library=mlx&sort=trending).
* Training an MLP on MNIST: The example trains a simple multi-layer perceptron to classify MNIST digits using the MLX Swift neural network and optimizers packages.

### Further Resources

Here are a few more resources to get started with MLX Swift:

* [Swift documentation and examples](https://ml-explore.github.io/mlx-swift)
* [GitHub repository](https://github.com/ml-explore/mlx-swift)
* We encourage you to file an [issue](https://github.com/ml-explore/mlx-swift/issues) if you encounter any problems or have suggestions for improvements.
* We welcome contributions. If you are interested in contributing to MLX Swift, check out our [contribution guidelines](https://github.com/ml-explore/mlx-swift/blob/main/CONTRIBUTING.md).

0 commit comments
