_includes/transcripts/ys_ml5.html (+5 −5)
@@ -14,18 +14,18 @@
 <p>With the debug mode enabled, ml5.js can also visualize the training progress on the right-hand side.</p>
 <p>It helps us to debug and improve our neural network.</p></div><div class="slide" role='region' aria-label="Slide 8 of 17" id="slide-8" data-fmt="pdf" data-src="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=8"><noscript><a href="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=8">Slide 8</a></noscript></div><div role='region'><p>Here is a collection of other models and methods that ml5.js provides.</p>
 <p>You can learn more about them on the ml5 website.</p>
-<p>Ml5 has a wide collection of image, sound and text-based models with a variety of applications, such as detecting objects, human bodies, hand poses and faces, generating text, images and joins, implementing image translations, classifying audios, detecting pitch and analyzing words and sentences.</p>
-<p>Ml5.js also provides neural network feature extractor and classifier and k-means as helper functions.</p></div><div class="slide" role='region' aria-label="Slide 9 of 17" id="slide-9" data-fmt="pdf" data-src="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=9"><noscript><a href="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=9">Slide 9</a></noscript></div><div role='region'><p>How do I use ml5.js?</p></div><div class="slide" role='region' aria-label="Slide 10 of 17" id="slide-10" data-fmt="pdf" data-src="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=10"><noscript><a href="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=10">Slide 10</a></noscript></div><div role='region'><p>We can run a model in the browser with ml5.js in three simple steps.</p>
+<p>Ml5 has a wide collection of image, sound and text-based models with a variety of applications, such as detecting objects, human bodies, hand poses and faces, generating text, images and drawings, implementing image translations, classifying audios, detecting pitch and analyzing words and sentences.</p>
+<p>Ml5.js also provides NeuralNetwork, FeatureExtractor, KNNClassifier and KMeans as helper functions.</p></div><div class="slide" role='region' aria-label="Slide 9 of 17" id="slide-9" data-fmt="pdf" data-src="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=9"><noscript><a href="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=9">Slide 9</a></noscript></div><div role='region'><p>How do I use ml5.js?</p></div><div class="slide" role='region' aria-label="Slide 10 of 17" id="slide-10" data-fmt="pdf" data-src="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=10"><noscript><a href="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=10">Slide 10</a></noscript></div><div role='region'><p>We can run a model in the browser with ml5.js in three simple steps.</p>
 <p>First, create a model.</p>
-<p>Secondly, ask the model to classify or critique something based on a input, like an image or a text.</p>
+<p>Secondly, ask the model to classify or predict something based on a input, like an image or a text.</p>
 <p>And step three, getting the results.</p>
 <p>It also has great integration with p5.js, a JavaScript library for creating graphics and animations in the browser, which makes it easier to get inputs from webcam or microphones and also to show the outputs with canvas, image or audio.</p></div><div class="slide" role='region' aria-label="Slide 11 of 17" id="slide-11" data-fmt="pdf" data-src="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=11"><noscript><a href="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=11">Slide 11</a></noscript></div><div role='region'><p>How is ml5.js built?</p></div><div class="slide" role='region' aria-label="Slide 12 of 17" id="slide-12" data-fmt="pdf" data-src="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=12"><noscript><a href="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=12">Slide 12</a></noscript></div><div role='region'><p>Besides the core library, the ml5.js project also includes examples, documentations, guides for training and data collection, learning materials for workshops and courses.</p></div><div class="slide" role='region' aria-label="Slide 13 of 17" id="slide-13" data-fmt="pdf" data-src="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=13"><noscript><a href="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=13">Slide 13</a></noscript></div><div role='region'><p>Ml5.js extends the functionality of tf.js.</p>
 <p>It uses tf.js models, data API, layer API and the face API.</p>
 <p>Under the hood, it utilizes the CPU, <a class=dfn>WebGL</a>, or <a class=dfn>WebAssembly</a> in the browser.</p>
 <p>Ml5.js provides a high-level and beginner-friendly API to users.</p></div><div class="slide" role='region' aria-label="Slide 14 of 17" id="slide-14" data-fmt="pdf" data-src="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=14"><noscript><a href="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=14">Slide 14</a></noscript></div><div role='region'><p>Web applications are very accessible.</p>
 <p>There are a lot of web applications made by the ml5.js community.</p>
 <p>Here are a few examples.</p>
-<p>A Whac-A-Mole game that you can play with your webcam, a flying game where you can control your characters with your voice, an interactive story reading experiments that uses your voice as input to generate stories and joins.</p>
+<p>A Whac-A-Mole game that you can play with your webcam, a flying game where you can control your characters with your voice, an interactive story reading experiments that uses your voice as input to generate stories and drawings.</p>
 <p>There are many more applications built with ml5.js that you can find at its community page.</p>
 <p>People find the low effort in using existing browser API desirable.</p>
 <p>For example, using webcam and microphones with the ability of rendering output easily to image, canvas, audio or text elements on the DOM.</p>
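
For context on the "three simple steps" in the hunk above, here is a minimal sketch of that flow using the ml5.js 0.x callback-style imageClassifier. The 'MobileNet' model name is one of the pre-trained models ml5.js bundles; the `<img id="image">` element is an assumption of this example, not something from the talk.

```js
// Step 1: create a model. 'MobileNet' is one of ml5's pre-trained image models;
// modelReady fires once it has finished loading.
const classifier = ml5.imageClassifier('MobileNet', modelReady);

function modelReady() {
  // Step 2: ask the model to classify an input, here an <img id="image">
  // element (the id is an assumption of this sketch).
  classifier.classify(document.getElementById('image'), gotResult);
}

function gotResult(error, results) {
  if (error) {
    console.error(error);
    return;
  }
  // Step 3: get the results, an array of labels with confidence scores.
  console.log(results[0].label, results[0].confidence);
}
```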
@@ -43,6 +43,6 @@
 <p>And lastly, port it into ml5.js to provide high-level API to users.</p>
 <p>Here, the first step, which is implementing the model in <a class=dfn>TensorFlow</a> and train it is the most time-consuming step and not all the operations are supported between different machine learning frameworks.</p>
 <p>Therefore, it will be very helpful to have a standard model format for the web or have a tool that can make this step easier.</p>
-<p>Our next project is making the conversion between different machine learning frameworks easier.</p></div><div class="slide" role='region' aria-label="Slide 17 of 17" id="slide-17" data-fmt="pdf" data-src="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=17"><noscript><a href="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=17">Slide 17</a></noscript></div><div role='region'><p>Here are some more links about ml5.js.</p>
+<p>ONNX project is making the conversion between different machine learning frameworks easier.</p></div><div class="slide" role='region' aria-label="Slide 17 of 17" id="slide-17" data-fmt="pdf" data-src="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=17"><noscript><a href="https://www.w3.org/2020/Talks/mlws/ys_ml5.pdf#page=17">Slide 17</a></noscript></div><div role='region'><p>Here are some more links about ml5.js.</p>
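
As a rough illustration of the porting workflow the second hunk describes (implement and train the model in TensorFlow, convert it, then wrap it in a beginner-friendly API), the sketch below assumes the trained model has already been converted to the TensorFlow.js layers format, for example with the tensorflowjs_converter tool. The model URL, the flat numeric input, and the single-output assumption are placeholders for this example; the thin predict() wrapper only hints at the kind of high-level API ml5.js layers on top of tf.js.

```js
import * as tf from '@tensorflow/tfjs';

// Load a model that was trained in TensorFlow and converted to the
// TensorFlow.js layers format (a model.json plus weight shards).
// The URL is a placeholder, not a real model from the talk.
async function loadConvertedModel() {
  const model = await tf.loadLayersModel('https://example.org/models/demo/model.json');

  // A thin wrapper that hides the tensor bookkeeping, roughly the kind of
  // high-level API ml5.js exposes after porting a model.
  return {
    predict(inputArray) {
      return tf.tidy(() => {
        const input = tf.tensor2d([inputArray]); // assumes a flat numeric input
        const output = model.predict(input);     // assumes a single-output model
        return Array.from(output.dataSync());
      });
    },
  };
}
```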