Deep Learning for NLP with Pytorch
----------------------------------
**Translated by**: `Choigapju <http://github.com/Choigapju>`_

These tutorials will walk you through the key ideas of deep learning
programming using Pytorch. Many of the concepts (such as the computation
graph abstraction and autograd) are not unique to Pytorch and are
relevant to any deep learning toolkit out there.

They are focused specifically on NLP for people who have never written
code in any deep learning framework (e.g., TensorFlow, Theano, Keras, DyNet).
The tutorials assume working knowledge of core NLP problems: part-of-speech
tagging, language modeling, etc. They also assume familiarity with neural
networks at the level of an intro AI class (such as one from the Russell and
Norvig book). Usually, these courses cover the basic backpropagation algorithm
on feed-forward neural networks, and make the point that they are chains of
compositions of linearities and non-linearities. This tutorial aims to get
you started writing deep learning code, given that you have this prerequisite
knowledge.

Note that these tutorials are about *models*, not data. For all of the models,
a few test examples are created with small dimensionality so you can see how
the weights change as the model trains. If you have some real data you want to
try, you should be able to rip out any of the models from this notebook
and use them on it.
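As a taste of what the tutorials cover, the two ideas above (a network as a composition of a linearity and a non-linearity, and autograd walking the computation graph) can be sketched in a few lines. This is a minimal illustrative example, not taken from the tutorials themselves; the tensor names and dimensions are arbitrary.

```python
import torch

# A tiny feed-forward step: a linear map followed by a non-linearity.
# Dimensions are deliberately small so the weights are easy to inspect.
torch.manual_seed(0)

x = torch.randn(1, 3)                      # one example with 3 features
W = torch.randn(3, 2, requires_grad=True)  # weights tracked by autograd
b = torch.zeros(2, requires_grad=True)

y = torch.tanh(x @ W + b)   # composition of a linearity and a non-linearity
loss = y.sum()
loss.backward()             # autograd traverses the computation graph

print(W.grad.shape)         # gradients have the same shape as the weights
```

Inspecting `W.grad` after each such backward pass is exactly how the small-dimensionality examples in these tutorials let you watch the weights change during training.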

1. pytorch_tutorial.py
	Introduction to PyTorch
	https://tutorials.pytorch.kr/beginner/nlp/pytorch_tutorial.html

2. deep_learning_tutorial.py
	Deep Learning with PyTorch
	https://tutorials.pytorch.kr/beginner/nlp/deep_learning_tutorial.html

3. word_embeddings_tutorial.py
	Word Embeddings: Encoding Lexical Semantics
	https://tutorials.pytorch.kr/beginner/nlp/word_embeddings_tutorial.html

4. sequence_models_tutorial.py
	Sequence Models and Long Short-Term Memory Networks
	https://tutorials.pytorch.kr/beginner/nlp/sequence_models_tutorial.html

5. advanced_tutorial.py
	Advanced: Making Dynamic Decisions and the Bi-LSTM CRF
	https://tutorials.pytorch.kr/beginner/nlp/advanced_tutorial.html