
Commit 74f9867: New translations
1 parent caaca11

9 files changed (+13 -8 lines)

_posts/2018-05-09-visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention.md (+1 -1)

@@ -3,7 +3,7 @@ layout: prediction_post
 published: True
 title: Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention)
 ---
-<span class="discussion">Translations: <a href="https://blog.csdn.net/qq_41664845/article/details/84245520">Chinese (Simplified)</a>, <a href="https://tips-memo.com/translation-jayalmmar-attention">Japanese</a>, <a href="https://nlpinkorean.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/">Korean</a>, <a href="http://dml.qom.ac.ir/2021/10/03/visualizing-a-neural-machine-translation-model/">Persian</a>, <a href="https://habr.com/ru/post/486158/">Russian</a>, <a href="https://medium.com/@SenemAktas/n%C3%B6ral-makine-%C3%A7eviri-modelini-g%C3%B6rselle%C5%9Ftirme-seq2seq-modelinin-attention-mekanizmas%C4%B1-b12581b5a1df">Turkish</a></span>
+<span class="discussion">Translations: <a href="https://blog.csdn.net/qq_41664845/article/details/84245520">Chinese (Simplified)</a>, <a href="https://lbourdois.github.io/blog/nlp/Seq2seq-et-attention/">French</a>, <a href="https://tips-memo.com/translation-jayalmmar-attention">Japanese</a>, <a href="https://nlpinkorean.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/">Korean</a>, <a href="http://dml.qom.ac.ir/2021/10/03/visualizing-a-neural-machine-translation-model/">Persian</a>, <a href="https://habr.com/ru/post/486158/">Russian</a>, <a href="https://medium.com/@SenemAktas/n%C3%B6ral-makine-%C3%A7eviri-modelini-g%C3%B6rselle%C5%9Ftirme-seq2seq-modelinin-attention-mekanizmas%C4%B1-b12581b5a1df">Turkish</a></span>
 <br />
 <span class="discussion">Watch: MIT's <a href="https://youtu.be/53YvP6gdD7U?t=335">Deep Learning State of the Art</a> lecture referencing this post</span>

_posts/2018-06-27-illustrated-transformer.md (+1 -1)

@@ -7,7 +7,7 @@ title: The Illustrated Transformer
 <a href="https://news.ycombinator.com/item?id=18351674" class="hn-link">Hacker News (65 points, 4 comments)</a>, <a href="https://www.reddit.com/r/MachineLearning/comments/8uh2yz/p_the_illustrated_transformer_a_visual_look_at/" class="">Reddit r/MachineLearning (29 points, 3 comments)</a>
 </span>
 <br />
-<span class="discussion">Translations: <a href="https://blog.csdn.net/yujianmin1990/article/details/85221271">Chinese (Simplified)</a>, <a href="https://a-coles.github.io/2020/11/15/transformer-illustre.html">French</a>, <a href="https://tips-memo.com/translation-jayalmmar-transformer">Japanese</a>, <a href="https://nlpinkorean.github.io/illustrated-transformer/">Korean</a>, <a href="https://habr.com/ru/post/486358/">Russian</a>, <a href="https://hackernoon.com/el-transformador-ilustrado-una-traduccion-al-espanol-0y73wwp">Spanish</a>, <a href="https://trituenhantao.io/tin-tuc/minh-hoa-transformer/">Vietnamese</a></span>
+<span class="discussion">Translations: <a href="https://blog.csdn.net/yujianmin1990/article/details/85221271">Chinese (Simplified)</a>, <a href="https://a-coles.github.io/2020/11/15/transformer-illustre.html">French 1</a>, <a href="https://lbourdois.github.io/blog/nlp/Transformer/">French 2</a>, <a href="https://tips-memo.com/translation-jayalmmar-transformer">Japanese</a>, <a href="https://nlpinkorean.github.io/illustrated-transformer/">Korean</a>, <a href="https://habr.com/ru/post/486358/">Russian</a>, <a href="https://hackernoon.com/el-transformador-ilustrado-una-traduccion-al-espanol-0y73wwp">Spanish</a>, <a href="https://trituenhantao.io/tin-tuc/minh-hoa-transformer/">Vietnamese</a></span>
 <br />
 <span class="discussion">Watch: MIT's <a href="https://youtu.be/53YvP6gdD7U?t=432">Deep Learning State of the Art</a> lecture referencing this post</span>

_posts/2018-12-03-illustrated-bert.md (+1 -1)

@@ -7,7 +7,7 @@ title: The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)
 <a href="https://news.ycombinator.com/item?id=18751469" class="hn-link">Hacker News (98 points, 19 comments)</a>, <a href="https://www.reddit.com/r/MachineLearning/comments/a3ykzf/r_the_illustrated_bert_and_elmo_how_nlp_cracked/" class="">Reddit r/MachineLearning (164 points, 20 comments)</a>
 </span>
 <br />
-<span class="discussion">Translations: <a href="https://blog.csdn.net/qq_41664845/article/details/84787969">Chinese (Simplified)</a>, <a href="https://a-coles.github.io/2020/11/15/bert-illustre.html">French</a>, <a href="https://tech-magazine.opt.ne.jp/entry/2020/05/01/132654">Japanese</a>, <a href="https://nlpinkorean.github.io/illustrated-bert/">Korean</a>, <a href="http://blog.class.vision/1397/09/bert-in-nlp/">Persian</a>, <a href="https://habr.com/ru/post/487358/">Russian</a></span>
+<span class="discussion">Translations: <a href="https://blog.csdn.net/qq_41664845/article/details/84787969">Chinese (Simplified)</a>, <a href="https://a-coles.github.io/2020/11/15/bert-illustre.html">French 1</a>, <a href="https://lbourdois.github.io/blog/nlp/BERT/">French 2</a>, <a href="https://tech-magazine.opt.ne.jp/entry/2020/05/01/132654">Japanese</a>, <a href="https://nlpinkorean.github.io/illustrated-bert/">Korean</a>, <a href="http://blog.class.vision/1397/09/bert-in-nlp/">Persian</a>, <a href="https://habr.com/ru/post/487358/">Russian</a></span>

 <strong>2021 Update:</strong> I created this brief and highly accessible video intro to BERT

_posts/2019-03-27-illustrated-word2vec.md (+1 -1)

@@ -9,7 +9,7 @@ title: The Illustrated Word2vec
 </span>
 <br />
 <span class="discussion">
-Translations: <a href="https://mp.weixin.qq.com/s?__biz=MjM5MTQzNzU2NA==&mid=2651669277&idx=2&sn=bc8f0590f9e340c1f1359982726c5a30&chksm=bd4c648e8a3bed9817f30c5a512e79fe0cc6fbc58544f97c857c30b120e76508fef37cae49bc&scene=0&xtrack=1#rd">Chinese (Simplified)</a>, <a href="https://databreak.netlify.com/2019-04-25-illustrated_word2vec/">Korean</a>, <a href="https://pessoalex.wordpress.com/2019/03/29/o-word2vec-ilustrado/">Portuguese</a>, <a href="https://habr.com/ru/post/446530/">Russian</a>
+Translations: <a href="https://mp.weixin.qq.com/s?__biz=MjM5MTQzNzU2NA==&mid=2651669277&idx=2&sn=bc8f0590f9e340c1f1359982726c5a30&chksm=bd4c648e8a3bed9817f30c5a512e79fe0cc6fbc58544f97c857c30b120e76508fef37cae49bc&scene=0&xtrack=1#rd">Chinese (Simplified)</a>, <a href="https://lbourdois.github.io/blog/nlp/word_embedding/">French</a>, <a href="https://databreak.netlify.com/2019-04-25-illustrated_word2vec/">Korean</a>, <a href="https://pessoalex.wordpress.com/2019/03/29/o-word2vec-ilustrado/">Portuguese</a>, <a href="https://habr.com/ru/post/446530/">Russian</a>
 </span>

_posts/2019-06-26-visual-numpy.md (+1 -1)

@@ -8,7 +8,7 @@ title: A Visual Intro to NumPy and Data Representation
 <a href="https://news.ycombinator.com/item?id=20282985" class="hn-link">Hacker News (366 points, 21 comments)</a>, <a href="https://www.reddit.com/r/MachineLearning/comments/c5nc89/p_a_visual_intro_to_numpy_and_data_representation/" class="">Reddit r/MachineLearning (256 points, 18 comments)</a>
 </span>
 <br />
-<span class="discussion">Translations: <a href="http://www.junphy.com/wordpress/index.php/2019/10/24/visual-numpy/">Chinese 1</a>, <a href="https://github.com/kevingo/blog/blob/master/ML/visual-numpy.md">Chinese 2</a>, <a href="https://note.mu/sayajewels/n/n95edaedb0fc5">Japanese</a>, <a href="https://chloamme.github.io/translation/2021/12/20/visual-numpy-korean.html">Korean</a></span>
+<span class="discussion">Translations: <a href="http://www.junphy.com/wordpress/index.php/2019/10/24/visual-numpy/">Chinese 1</a>, <a href="https://github.com/kevingo/blog/blob/master/ML/visual-numpy.md">Chinese 2</a>, <a href="https://note.mu/sayajewels/n/n95edaedb0fc5">Japanese</a>, <a href="https://chloamme.github.io/2021/12/20/visual-numpy-korean.html">Korean</a></span>

_posts/2019-08-12-illustrated-gpt2.md (+1 -1)

@@ -9,7 +9,7 @@ title: The Illustrated GPT-2 (Visualizing Transformer Language Models)
 <a href="https://news.ycombinator.com/item?id=20677411" class="hn-link">Hacker News (64 points, 3 comments)</a>, <a href="https://www.reddit.com/r/MachineLearning/comments/cp8prq/p_the_illustrated_gpt2_visualizing_transformer/" class="">Reddit r/MachineLearning (219 points, 18 comments)</a>
 </span>

-<span class="discussion">Translations: <a href="https://chloamme.github.io/translation/2021/12/08/illustrated-gpt2-korean.html">Korean</a>, <a href="https://habr.com/ru/post/490842/">Russian</a></span>
+<span class="discussion">Translations: <a href="https://lbourdois.github.io/blog/nlp/GPT2/">French</a>, <a href="https://chloamme.github.io/2021/12/08/illustrated-gpt2-korean.html">Korean</a>, <a href="https://habr.com/ru/post/490842/">Russian</a></span>

_posts/2019-11-26-a-visual-guide-to-using-bert-for-the-first-time.md (+1 -1)

@@ -5,7 +5,7 @@ title: A Visual Guide to Using BERT for the First Time
 ---

-<span class="discussion">Translations: <a href="http://www.junphy.com/wordpress/index.php/2020/10/20/a-visual-guide-using-bert/">Chinese</a>, <a href="https://chloamme.github.io/translation/2021/12/22/a-visual-guide-to-using-bert-for-the-first-time-korean.html">Korean</a>, <a href="https://habr.com/ru/post/498144/">Russian</a></span>
+<span class="discussion">Translations: <a href="http://www.junphy.com/wordpress/index.php/2020/10/20/a-visual-guide-using-bert/">Chinese</a>, <a href="https://chloamme.github.io/2021/12/22/a-visual-guide-to-using-bert-for-the-first-time-korean.html">Korean</a>, <a href="https://habr.com/ru/post/498144/">Russian</a></span>

 <div class="img-div-any-width" markdown="0">
 <image src="/images/distilBERT/bert-distilbert-sentence-classification.png"/>

_posts/2020-07-27-how-gpt3-works-visualizations-animations.md (+1 -1)

@@ -7,7 +7,7 @@ title: How GPT3 Works - Visualizations and Animations
 <a href="https://news.ycombinator.com/item?id=23967887" class="hn-link">Hacker News (397 points, 97 comments)</a>, <a href="https://www.reddit.com/r/MachineLearning/comments/hwxn26/p_how_gpt3_works_visuals_and_animations/" class="">Reddit r/MachineLearning (247 points, 27 comments)</a>
 </span>
 <br />
-<span class="discussion">Translations: <a href="https://www.arnevogel.com/wie-gpt3-funktioniert/">German</a>, <a href="https://chloamme.github.io/translation/2021/12/18/how-gpt3-works-visualizations-animations-korean.html">Korean</a>, <a href="https://blogcn.acacess.com/how-gpt3-works-visualizations-and-animations-zhong-yi">Chinese (Simplified)</a>, <a href="https://habr.com/ru/post/514698/">Russian</a></span>
+<span class="discussion">Translations: <a href="https://www.arnevogel.com/wie-gpt3-funktioniert/">German</a>, <a href="https://chloamme.github.io/2021/12/18/how-gpt3-works-visualizations-animations-korean.html">Korean</a>, <a href="https://blogcn.acacess.com/how-gpt3-works-visualizations-and-animations-zhong-yi">Chinese (Simplified)</a>, <a href="https://habr.com/ru/post/514698/">Russian</a></span>
 <br />

 The tech world is [abuzz](https://www.theverge.com/21346343/gpt-3-explainer-openai-examples-errors-agi-potential) with GPT3 hype. Massive language models (like GPT3) are starting to surprise us with their abilities. While not yet completely reliable for most businesses to put in front of their customers, these models are showing sparks of cleverness that are sure to accelerate the march of automation and the possibilities of intelligent computer systems. Let's remove the aura of mystery around GPT3 and learn how it's trained and how it works.

_posts/2021-12-27-illustrated-retro.md (+5)

@@ -4,6 +4,11 @@ published: False
 title: The Illustrated Retro Transformer
 ---

+
+<span class="discussion">
+Translations: <a href="https://chloamme.github.io/2022/01/08/illustrated-retrieval-transformer-korean.html">Korean</a>
+</span>
+
 looking at DeepMind's Retro Transformer, which at 7.5B parameters is on par with GPT3 and models 25X its size in knowledge-intensive tasks.

 A big moment for Large Language Models (LLMs) for reasons I'll mention in this thread.
