Learn PyTorch with project-based tutorials. These tutorials demonstrate modern techniques with readable code.

Applying recurrent neural networks to natural language tasks, from classification to generation.

* [Classifying Names with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb)
* [Generating Shakespeare with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-generation/char-rnn-generation.ipynb)
* [Generating Names with a Conditional Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/conditional-char-rnn/conditional-char-rnn.ipynb)
* [Translation with a Sequence to Sequence Network and Attention](https://github.com/spro/practical-pytorch/blob/master/seq2seq-translation/seq2seq-translation.ipynb)
* [Exploring Word Vectors with GloVe](https://github.com/spro/practical-pytorch/blob/master/glove-word-vectors/glove-word-vectors.ipynb)
* *WIP* Sentiment Analysis with a Word-Level RNN and GloVe Embeddings
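The character-level tutorials above all rest on the same first step: encoding each character of a string as a one-hot vector, giving a tensor of shape `(sequence_length, batch_size, n_letters)` that an RNN can consume one time step at a time. A minimal sketch of that encoding (the `all_letters` vocabulary here is a simplified stand-in, not the exact set used in the tutorials):

```python
import string

import torch

# Hypothetical vocabulary: just the ASCII letters (52 symbols).
all_letters = string.ascii_letters
n_letters = len(all_letters)

def line_to_tensor(line):
    """One-hot encode a string as a (seq_len, 1, n_letters) tensor,
    one time step per character, with a batch size of 1."""
    tensor = torch.zeros(len(line), 1, n_letters)
    for i, ch in enumerate(line):
        tensor[i][0][all_letters.index(ch)] = 1
    return tensor

print(line_to_tensor("Albert").size())  # torch.Size([6, 1, 52])
```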
#### Series 2: RNNs for timeseries data

* *WIP* Predicting discrete events with an RNN
## Get Started

The quickest way to run these on a fresh Linux or Mac machine is to install [Anaconda](https://www.continuum.io/anaconda-overview).

I assume you have at least installed PyTorch, know Python, and understand Tensors:
### PyTorch basics

* http://pytorch.org/ for installation instructions
* [Official PyTorch tutorials](http://pytorch.org/tutorials/) for more tutorials (some of these tutorials are included there)
* [Deep Learning with PyTorch: A 60-minute Blitz](http://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html) to get started with PyTorch in general
* [Introduction to PyTorch for former Torchies](https://github.com/pytorch/tutorials/blob/master/Introduction%20to%20PyTorch%20for%20former%20Torchies.ipynb) if you are a former Lua Torch user
* [jcjohnson's PyTorch examples](https://github.com/jcjohnson/pytorch-examples) for a more in-depth overview (including custom modules and autograd functions)
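As a quick sanity check that your PyTorch installation works before starting the tutorials, a few Tensor basics (a minimal sketch, not part of the tutorials themselves):

```python
import torch

# Create a 2x3 tensor of uniform random values and inspect its shape
x = torch.rand(2, 3)
print(x.size())  # torch.Size([2, 3])

# Elementwise arithmetic works like NumPy arrays
y = torch.ones(2, 3)
z = x + y * 2
print(z.size())  # torch.Size([2, 3])

# Matrix multiplication: (2x3) @ (3x2) -> (2x2)
w = torch.rand(3, 2)
print(x.mm(w).size())  # torch.Size([2, 2])
```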
### Recurrent Neural Networks

* [The Unreasonable Effectiveness of Recurrent Neural Networks](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) shows a bunch of real life examples
* [Deep Learning, NLP, and Representations](http://colah.github.io/posts/2014-07-NLP-RNNs-Representations/) for an overview on word embeddings and RNNs for NLP
* [Understanding LSTM Networks](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) is about how LSTMs work specifically, but also informative about RNNs in general
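To complement the reading above, here is one forward pass through PyTorch's built-in `nn.RNN`: the network consumes a sequence step by step, carrying a hidden state from each step to the next (a toy illustration with made-up sizes, not code from the tutorials):

```python
import torch
import torch.nn as nn

# An RNN mapping 10-dimensional inputs to a 20-dimensional hidden state
rnn = nn.RNN(input_size=10, hidden_size=20, num_layers=1)

# A sequence of 5 steps, batch size 1; the hidden state starts at zero
inputs = torch.randn(5, 1, 10)
h0 = torch.zeros(1, 1, 20)

# output holds the hidden state at every step; hn is the final hidden state
output, hn = rnn(inputs, h0)
print(output.size())  # torch.Size([5, 1, 20])
print(hn.size())      # torch.Size([1, 1, 20])
```

Note that `output[-1]` and `hn[0]` are the same vector here: the last step's hidden state, which classification models typically feed into a final linear layer.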
### Machine translation

* [Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation](http://arxiv.org/abs/1406.1078)
* [Sequence to Sequence Learning with Neural Networks](http://arxiv.org/abs/1409.3215)
* [Neural Machine Translation by Jointly Learning to Align and Translate](https://arxiv.org/abs/1409.0473)
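The attention mechanism introduced in the last paper can be sketched in a few lines: score each encoder output against the current decoder state, softmax the scores into weights, and take the weighted sum as a context vector (a simplified dot-product variant for illustration, not the exact formulation from the paper):

```python
import torch
import torch.nn.functional as F

hidden_size, seq_len = 8, 5

# Encoder outputs for a 5-step source sentence, and one decoder hidden state
encoder_outputs = torch.randn(seq_len, hidden_size)
decoder_hidden = torch.randn(hidden_size)

# Dot-product score between the decoder state and each encoder output
scores = encoder_outputs @ decoder_hidden  # shape: (seq_len,)

# Normalize the scores into attention weights that sum to 1
weights = F.softmax(scores, dim=0)

# Context vector: attention-weighted sum of the encoder outputs
context = weights @ encoder_outputs  # shape: (hidden_size,)

print(weights.size())  # torch.Size([5])
print(context.size())  # torch.Size([8])
```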