<?xml version="1.0" encoding="UTF-8"?>
<!-- generator="FeedCreator 1.8" -->
<?xml-stylesheet href="https://wiki.slq.qld.gov.au/lib/exe/css.php?s=feed" type="text/css"?>
<rss version="2.0">
    <channel xmlns:g="http://base.google.com/ns/1.0">
        <title>SLQ Wiki workshops:public:machine_learning:uncanny_valley</title>
        <description></description>
        <link>https://wiki.slq.qld.gov.au/</link>
        <lastBuildDate>Wed, 15 Apr 2026 08:17:20 +0000</lastBuildDate>
        <generator>FeedCreator 1.8</generator>
        <image>
            <url>https://wiki.slq.qld.gov.au/lib/tpl/mikio/images/favicon.ico</url>
            <title>SLQ Wiki</title>
            <link>https://wiki.slq.qld.gov.au/</link>
        </image>
        <item>
            <title>Machine Learning 03 - Entering the Uncanny Valley of Speech</title>
            <link>https://wiki.slq.qld.gov.au/doku.php?id=workshops:public:machine_learning:uncanny_valley:start&amp;rev=1605506206&amp;do=diff</link>
            <description>

&lt;h1 class=&quot;sectionedit1&quot; id=&quot;machine_learning_03_-_entering_the_uncanny_valley_of_speech&quot;&gt;Machine Learning 03 - Entering the Uncanny Valley of Speech&lt;/h1&gt;
&lt;div class=&quot;level1&quot;&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;
&lt;div class=&quot;wrap_hideslide plugin_wrap&quot;&gt;
&lt;p&gt;
With State Library closed to the public due to COVID-19, for this Machine Learning (ML) workshop we won&amp;#039;t be able to use the Digital Media Lab at The Edge.
&lt;/p&gt;
&lt;/div&gt;
&lt;p&gt;
This online workshop recaps our previous workshops, and explores the world of Text To Speech (TTS) voice synthesisers and Speech To Text (STT) voice recognition built with ML.  The workshop is not an introduction to coding or math, but we will give a general overview of how ML is defined and where it is commonly used today.
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
We&amp;#039;ve chosen an approach that demonstrates the power and limitations of ML and leaves you with an understanding of how to use an online ML environment, along with ideas on how to use State Library resources to explore ML further.
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
The first half of the workshop will cover:
&lt;/p&gt;
&lt;ul&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; a basic explanation of ML&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; recap of previous ML workshops&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; ML for speech &lt;/div&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
The second half of the workshop explores how to implement ML research using Google&amp;#039;s Colab platform.
&lt;/p&gt;

&lt;/div&gt;
&lt;h2 class=&quot;sectionedit4&quot; id=&quot;outcomes&quot;&gt;Outcomes&lt;/h2&gt;
&lt;div class=&quot;level2&quot;&gt;
&lt;ul&gt;
&lt;li class=&quot;level1 node&quot;&gt;&lt;div class=&quot;li&quot;&gt; A general ML background&lt;/div&gt;
&lt;ul&gt;
&lt;li class=&quot;level2&quot;&gt;&lt;div class=&quot;li&quot;&gt; ML for speech&lt;/div&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li class=&quot;level1 node&quot;&gt;&lt;div class=&quot;li&quot;&gt; using Google Colab&lt;/div&gt;
&lt;ul&gt;
&lt;li class=&quot;level2&quot;&gt;&lt;div class=&quot;li&quot;&gt; Spleeter (audio source separation) &lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level2&quot;&gt;&lt;div class=&quot;li&quot;&gt;TTS (Mozilla TTS)&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level2&quot;&gt;&lt;div class=&quot;li&quot;&gt;STT (Mozilla DeepSpeech)&lt;/div&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;/div&gt;
&lt;h2 class=&quot;sectionedit5&quot; id=&quot;requirements&quot;&gt;Requirements&lt;/h2&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;p&gt;
All we need to get started for this workshop is a Google account to access Google Colab in the second half of the workshop.  If you don&amp;#039;t have one you can quickly &lt;a href=&quot;https://accounts.google.com/signup/v2/webcreateaccount?hl=en&amp;amp;flowName=GlifWebSignIn&amp;amp;flowEntry=SignUp&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://accounts.google.com/signup/v2/webcreateaccount?hl=en&amp;amp;flowName=GlifWebSignIn&amp;amp;flowEntry=SignUp&quot; rel=&quot;ugc nofollow noopener&quot;&gt;
sign up&lt;/a&gt;.  If you don&amp;#039;t want to create a Google account, you can always just follow along with the examples.
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;/div&gt;
&lt;div class=&quot;plugin_include_content plugin_include__workshops:public:machine_learning:ideepcolor:start&quot;&gt;

&lt;h2 class=&quot;sectionedit8&quot; id=&quot;background&quot;&gt;Background&lt;/h2&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;p&gt;
Machine Learning (ML) is a subset of Artificial Intelligence (AI), which is a fast-moving field of computer science (CS).  A good way to think about how these fields overlap is with a diagram.
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/detail.php?id=workshops%3Apublic%3Amachine_learning%3Auncanny_valley%3Astart&amp;amp;media=workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:ideepcolor:ml_diagram.png&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:ideepcolor:ml_diagram.png&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=600&amp;amp;tok=89dc93&amp;amp;media=workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:ideepcolor:ml_diagram.png&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;600&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;

&lt;h3 id=&quot;machine_learning_-_why_now&quot;&gt;Machine Learning - Why Now?&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
While many of the concepts are decades old, and the mathematical underpinnings have been around for centuries, the explosion in the use and development of ML has been enabled by the creation and commercialisation of massively parallel processors.  This specialised computer hardware is most commonly found in the Graphics Processing Units (GPUs) inside desktop and laptop computers, where it takes care of the display of 2D and 3D graphics. The same processing architecture that accelerates the rendering of 3D models onscreen is ideally suited to solving ML problems, resulting in specialised programming platforms, Application Programming Interfaces (APIs) and programming libraries for AI and ML.
&lt;/p&gt;

&lt;/div&gt;

&lt;h3 id=&quot;common_machine_learning_uses&quot;&gt;Common Machine Learning Uses&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
One way to think of ML is as a  &lt;em&gt;recommendation system&lt;/em&gt;.  
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Based on input data (a lot of input data&lt;sup&gt;&lt;a href=&quot;#fn__1&quot; id=&quot;fnt__1&quot; class=&quot;fn_top&quot;&gt;1)&lt;/a&gt;&lt;/sup&gt;)
&lt;/p&gt;
&lt;ul&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; a machine learning system is trained&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; a model is generated&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; the model is used to make recommendations (is implemented) on &lt;em&gt;new&lt;/em&gt; data - see the sketch below.  &lt;/div&gt;
&lt;/li&gt;
&lt;/ul&gt;
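&lt;p&gt;
To make that cycle concrete, here is a minimal Python sketch of the train, model, recommend steps. It uses scikit-learn purely for illustration - the library is an assumption here and not part of this workshop&amp;#039;s materials.
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# minimal sketch of the train -&amp;gt; model -&amp;gt; recommend cycle
# (illustrative only; assumes scikit-learn is installed)
from sklearn.neighbors import KNeighborsClassifier

# training: input data with known labels
X_train = [[0, 0], [0, 1], [1, 0], [1, 1]]  # features
y_train = ['blue', 'blue', 'red', 'red']    # labels

# a model is generated from the training data
model = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)

# the model makes a recommendation about *new* data
print(model.predict([[0.9, 0.2]]))  # -&amp;gt; ['red']&lt;/pre&gt;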
&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
One extremely common application of this is image recognition. 
&lt;/p&gt;
&lt;div class=&quot;wrap_hideslide plugin_wrap&quot;&gt;
&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/detail.php?id=workshops%3Apublic%3Amachine_learning%3Auncanny_valley%3Astart&amp;amp;media=workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:ideepcolor:2020-02-21_14_19_57-8725096366_d1fe677cc5_o_fb.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:ideepcolor:2020-02-21_14_19_57-8725096366_d1fe677cc5_o_fb.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=600&amp;amp;tok=420cc2&amp;amp;media=workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:ideepcolor:2020-02-21_14_19_57-8725096366_d1fe677cc5_o_fb.jpg&quot; class=&quot;mediacenter&quot; alt=&quot;&quot; width=&quot;600&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
When Facebook asks you to tag a photo with names, you are providing them with a nicely annotated data set for &lt;strong&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Supervised_learning&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://en.wikipedia.org/wiki/Supervised_learning&quot; rel=&quot;ugc nofollow noopener&quot;&gt;supervised learning&lt;/a&gt;&lt;/strong&gt;.  They can then use this data set to train a model that then recognises (makes a recommendation about) other photos &lt;a href=&quot;https://www.wired.com/story/facebook-will-find-your-face-even-when-its-not-tagged/&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://www.wired.com/story/facebook-will-find-your-face-even-when-its-not-tagged/&quot; rel=&quot;ugc nofollow noopener&quot;&gt; with you or your friend in it&lt;/a&gt;. 
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
&lt;a href=&quot;https://www.youtube.com/watch?v=Pc2aJxnmzh0&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://www.youtube.com/watch?v=Pc2aJxnmzh0&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Snapchat filters&lt;/a&gt; use image recognition to make a map of your features, then apply masks and transformations in real-time.
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/detail.php?id=workshops%3Apublic%3Amachine_learning%3Auncanny_valley%3Astart&amp;amp;media=workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:ideepcolor:31402867165_4506ee548d_b.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:ideepcolor:31402867165_4506ee548d_b.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=600&amp;amp;tok=c21134&amp;amp;media=workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:ideepcolor:31402867165_4506ee548d_b.jpg&quot; class=&quot;mediacenter&quot; alt=&quot;&quot; width=&quot;600&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;

&lt;h3 id=&quot;deep_learning&quot;&gt;Deep Learning&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Today we are going to go a little “deeper” inside ML, exploring deep learning. Deep learning uses multiple layers of algorithms, in an artificial neural network, inspired by the biological neural networks inside all of us. A rough sketch of what “multiple layers” looks like in code follows the diagram below.
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/detail.php?id=workshops%3Apublic%3Amachine_learning%3Auncanny_valley%3Astart&amp;amp;media=workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:ideepcolor:dl_diagram.png&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:ideepcolor:dl_diagram.png&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=600&amp;amp;tok=f56850&amp;amp;media=workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:ideepcolor:dl_diagram.png&quot; class=&quot;media&quot; title=&quot; &quot; alt=&quot; &quot; width=&quot;600&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;
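&lt;p&gt;
As an illustration of those stacked layers, here is a short Python sketch using the Keras API from TensorFlow (an assumption for illustration only - this workshop does not use Keras). Each Dense layer is one layer of the network.
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# a sketch of a small deep (multi-layer) neural network
# (illustrative only; assumes TensorFlow/Keras is installed)
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(10,)),  # layer 1
    tf.keras.layers.Dense(64, activation='relu'),                     # layer 2
    tf.keras.layers.Dense(1, activation='sigmoid'),                   # output layer
])
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()  # prints the stack of layers&lt;/pre&gt;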

&lt;/div&gt;
&lt;/div&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;/div&gt;
&lt;div class=&quot;plugin_include_content plugin_include__workshops:public:machine_learning:ideepcolor:start&quot;&gt;

&lt;h2 class=&quot;sectionedit18&quot; id=&quot;interactive_deep_colorization&quot;&gt;Interactive Deep Colorization&lt;/h2&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;p&gt;
Let&amp;#039;s take a look at the subject of our first ML workshop, &lt;a href=&quot;https://richzhang.github.io/ideepcolor/&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://richzhang.github.io/ideepcolor/&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Real-Time User-Guided Image Colorization with Learned Deep Priors&lt;/a&gt; (ideepcolor), by Richard Zhang, Jun-Yan Zhu, Phillip Isola, Xinyang Geng, Angela S. Lin, Tianhe Yu and Alexei A. Efros.
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Here is a talk about the details of their paper.
&lt;/p&gt;
&lt;iframe src=&quot;//www.youtube-nocookie.com/embed/rp5LUSbdsys&quot; height=&quot;239&quot; width=&quot;425&quot; class=&quot;vshare__none&quot; allowfullscreen=&quot;&quot; frameborder=&quot;0&quot; scrolling=&quot;no&quot;&gt;&lt;/iframe&gt;
&lt;p&gt;
I encourage you to watch the above talk in full…
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
but the TL;DR version is that they have:
&lt;/p&gt;
&lt;ul&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; trained a neural network on millions of images&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; combined this with simulated human interaction&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; produced a model that recommends an initial colourisation&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; that takes user input to refine the colourisation.  &lt;/div&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
The user input is provided through a Graphical User Interface (&lt;abbr title=&quot;Graphical User Interface&quot;&gt;GUI&lt;/abbr&gt;), and the end result can be exported, along with information about how the model made its recommendations.
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
You can check out a video of the demo in action here.
&lt;/p&gt;
&lt;iframe src=&quot;//www.youtube-nocookie.com/embed/eL5ilZgM89Q&quot; height=&quot;239&quot; width=&quot;425&quot; class=&quot;vshare__none&quot; allowfullscreen=&quot;&quot; frameborder=&quot;0&quot; scrolling=&quot;no&quot;&gt;&lt;/iframe&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;/div&gt;
&lt;div class=&quot;plugin_include_content plugin_include__workshops:public:machine_learning:paper_to_product&quot;&gt;

&lt;h2 class=&quot;sectionedit22&quot; id=&quot;ml_-_from_paper_to_product&quot;&gt;ML - From Paper to Product&lt;/h2&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;p&gt;
We&amp;#039;ll be exploring a few ML ideas, but to start with let&amp;#039;s follow &lt;a href=&quot;https://www.youtube.com/watch?v=rp5LUSbdsys&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://www.youtube.com/watch?v=rp5LUSbdsys&quot; rel=&quot;ugc nofollow noopener&quot;&gt;&amp;quot;Real-Time User-Guided Image Colorization with Learned Deep Priors&amp;quot;, by Richard Zhang, Jun-Yan Zhu, Phillip Isola, Xinyang Geng, Angela S. Lin, Tianhe Yu, Alexei A. Efros&lt;/a&gt; from paper to product. It&amp;#039;s not a recent paper in ML terms, where it seems every month brings another breakthrough, but we can follow this paper right through to its release in &lt;a href=&quot;https://www.adobe.com/au/products/photoshop-elements/whats-new.html&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://www.adobe.com/au/products/photoshop-elements/whats-new.html&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Adobe Photoshop Elements 2020&lt;/a&gt;.
&lt;/p&gt;

&lt;/div&gt;

&lt;h3 id=&quot;research_papers_on_arxivorg&quot;&gt;Research Papers on arXiv.org&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
&lt;a href=&quot;https://arxiv.org/&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://arxiv.org/&quot; rel=&quot;ugc nofollow noopener&quot;&gt;arXiv.org&lt;/a&gt; is probably the world&amp;#039;s biggest and fastest growing collection of preprint electronic scientific papers from mathematics, physics, astronomy, electrical engineering, computer science, quantitative biology, statistics, mathematical finance and economics.  All of the ML ideas we will be looking at were either first published on arXiv.org, or reference papers on the site.  
&lt;/p&gt;

&lt;/div&gt;

&lt;h4 id=&quot;finding_a_paper&quot;&gt;Finding a Paper&lt;/h4&gt;
&lt;div class=&quot;level4&quot;&gt;

&lt;p&gt;
Papers on arXiv.org are moderated but not peer-reviewed, which means the speed and volume of publishing on this open-access repository is overwhelming. To get started, let&amp;#039;s say we are interested in the re-colourisation of black and white images of our grandparents at their wedding, but we don&amp;#039;t want to do all the work ourselves. We&amp;#039;d also like to interact in real-time and guide the process, so we can get the colour of grandad&amp;#039;s suit and grandma&amp;#039;s bouquet just right.
&lt;/p&gt;

&lt;p&gt;
So let&amp;#039;s search for “real-time guided image colourisation”, but using the American English spelling “colorization”. &lt;a href=&quot;https://arxiv.org/search/?query=real-time+guided+image+colorization&amp;amp;searchtype=title&amp;amp;abstracts=show&amp;amp;order=-announced_date_first&amp;amp;size=50&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://arxiv.org/search/?query=real-time+guided+image+colorization&amp;amp;searchtype=title&amp;amp;abstracts=show&amp;amp;order=-announced_date_first&amp;amp;size=50&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Searching&lt;/a&gt; for “real-time guided image colorization” brings up our paper straight away, with a handy &lt;a href=&quot;https://arxiv.org/pdf/1705.02999.pdf&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://arxiv.org/pdf/1705.02999.pdf&quot; rel=&quot;ugc nofollow noopener&quot;&gt;PDF&lt;/a&gt; link&lt;sup&gt;&lt;a href=&quot;#fn__2&quot; id=&quot;fnt__2&quot; class=&quot;fn_top&quot;&gt;2)&lt;/a&gt;&lt;/sup&gt;.  
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/detail.php?id=workshops%3Apublic%3Amachine_learning%3Auncanny_valley%3Astart&amp;amp;media=workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:arxiv_search.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:arxiv_search.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=800&amp;amp;tok=192303&amp;amp;media=workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:arxiv_search.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;800&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;
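&lt;p&gt;
arXiv also has a public HTTP API, so the same search can be run from a few lines of Python. This is just a sketch of the documented query endpoint - the web search above is all we actually need for the workshop.
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# a sketch of the same search via the public arXiv API
# (endpoint and parameters are from the arXiv API documentation)
from urllib.request import urlopen
from urllib.parse import urlencode

params = urlencode({
    'search_query': 'all:real-time guided image colorization',
    'max_results': 5,
})
with urlopen('http://export.arxiv.org/api/query?' + params) as response:
    # the response is an Atom XML feed with titles, abstracts and PDF links
    print(response.read().decode()[:500])&lt;/pre&gt;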

&lt;/div&gt;

&lt;h4 id=&quot;examining_the_abstract&quot;&gt;Examining the Abstract&lt;/h4&gt;
&lt;div class=&quot;level4&quot;&gt;

&lt;p&gt;
All research papers begin with an abstract, and a well-written abstract will tell us all we need to know about whether the paper is relevant for us, particularly if we are looking for a working demonstration. This time we are in luck - there is a link to &lt;a href=&quot;https://richzhang.github.io/ideepcolor/&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://richzhang.github.io/ideepcolor/&quot; rel=&quot;ugc nofollow noopener&quot;&gt;an ideepcolor demo site&lt;/a&gt; at the end of the abstract.
&lt;/p&gt;

&lt;/div&gt;

&lt;h4 id=&quot;check_out_the_demo&quot;&gt;Check out the Demo&lt;/h4&gt;
&lt;div class=&quot;level4&quot;&gt;

&lt;p&gt;
The demo for ideepcolor looks great, and we&amp;#039;ve got a link at the top of the page to where ideepcolor is &lt;a href=&quot;https://github.com/junyanz/interactive-deep-colorization&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://github.com/junyanz/interactive-deep-colorization&quot; rel=&quot;ugc nofollow noopener&quot;&gt;implemented&lt;/a&gt; on GitHub. 
&lt;/p&gt;

&lt;/div&gt;

&lt;h3 id=&quot;code_implementation_on_githubcom&quot;&gt;Code Implementation on github.com&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
&lt;a href=&quot;https://github.com/&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://github.com/&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Github.com&lt;/a&gt; is a website used by software developers to create, collaborate on and share source code, and is most likely the largest repository of source code in the world. Github is named after &lt;a href=&quot;https://git-scm.com/&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://git-scm.com/&quot; rel=&quot;ugc nofollow noopener&quot;&gt;git&lt;/a&gt;, a free and open-source (&lt;a href=&quot;https://en.wikipedia.org/wiki/Free_and_open-source_software&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://en.wikipedia.org/wiki/Free_and_open-source_software&quot; rel=&quot;ugc nofollow noopener&quot;&gt;FOSS&lt;/a&gt;) distributed version-control system for tracking changes in source code during software development.  Git means that developers from all over the world can work on the same code and, if the project is open source, build on, expand and re-purpose shared code&lt;sup&gt;&lt;a href=&quot;#fn__3&quot; id=&quot;fnt__3&quot; class=&quot;fn_top&quot;&gt;3)&lt;/a&gt;&lt;/sup&gt;.  But let&amp;#039;s back up a bit and cover what source code is.
&lt;/p&gt;

&lt;/div&gt;

&lt;h4 id=&quot;using_the_source&quot;&gt;Using the Source&lt;/h4&gt;
&lt;div class=&quot;level4&quot;&gt;

&lt;p&gt;
&lt;a href=&quot;https://simple.wikipedia.org/wiki/Source_code&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://simple.wikipedia.org/wiki/Source_code&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Source code&lt;/a&gt; is the instructions for a computer program, contained in a simple text document. 
&lt;/p&gt;

&lt;p&gt;
For a computer to run a program, the source code either has to be
&lt;/p&gt;

&lt;ul&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; &lt;strong&gt;compiled&lt;/strong&gt; into binary machine code by a &lt;a href=&quot;https://simple.wikipedia.org/wiki/Compiler&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://simple.wikipedia.org/wiki/Compiler&quot; rel=&quot;ugc nofollow noopener&quot;&gt;compiler&lt;/a&gt;; this file is executable - in this case execute just means it can be read, understood and acted on by the computer, or&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; &lt;strong&gt;interpreted&lt;/strong&gt; by another program, which directly executes the code.&lt;/div&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Here is an example of source code. In this case it&amp;#039;s a simple program in the C programming language that prints “Hello, world!” on the screen.
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;#include &amp;lt;stdio.h&amp;gt;

int main(void)
{
    printf(&amp;quot;Hello, world!\n&amp;quot;);
    return 0;
}&lt;/pre&gt;

&lt;p&gt;
Despite the strange symbols, if you know how the C language is written, this program is &lt;strong&gt;human readable&lt;/strong&gt;. 
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Once this code is run through a compiler, we get a binary executable file - which is &lt;strong&gt;machine readable&lt;/strong&gt;. 
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
But with the right tools (like a HEX editor) we can still open the file and edit it.  
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Here is the binary for our “Hello World!” program.
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/detail.php?id=workshops%3Apublic%3Amachine_learning%3Auncanny_valley%3Astart&amp;amp;media=workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:helloworld_binary.png&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:helloworld_binary.png&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=800&amp;amp;tok=14270b&amp;amp;media=workshops:prototypes:2022-23delivery-lasercutcovers:machine_learning:helloworld_binary.png&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;800&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;

&lt;h4 id=&quot;open_source&quot;&gt;Open Source&lt;/h4&gt;
&lt;div class=&quot;level4&quot;&gt;

&lt;p&gt;
Dokuwiki (the software we are using for this wiki) is open source, developed publicly, and freely &lt;a href=&quot;https://www.dokuwiki.org/&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://www.dokuwiki.org/&quot; rel=&quot;ugc nofollow noopener&quot;&gt;available&lt;/a&gt; on the internet. Anyone is able to grab the source code and run it, modify it or redistribute it. 
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Below is an example of the &lt;strong&gt;open source&lt;/strong&gt; code for this wiki, which is written in a language called &lt;a href=&quot;https://simple.wikipedia.org/wiki/PHP&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://simple.wikipedia.org/wiki/PHP&quot; rel=&quot;ugc nofollow noopener&quot;&gt;php&lt;/a&gt;.
&lt;/p&gt;
&lt;pre class=&quot;code file php&quot;&gt;  &lt;span class=&quot;co1&quot;&gt;// define all DokuWiki globals here (needed within test requests but also helps to keep track)&lt;/span&gt;
  &lt;span class=&quot;kw2&quot;&gt;global&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$ACT&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$INPUT&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$QUERY&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$ID&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$REV&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$DATE_AT&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$IDX&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt;
  &lt;span class=&quot;re0&quot;&gt;$DATE&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$RANGE&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$HIGH&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$TEXT&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$PRE&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$SUF&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$SUM&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$INFO&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$JSINFO&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;;&lt;/span&gt;
  &lt;span class=&quot;kw1&quot;&gt;if&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#40;&lt;/span&gt;&lt;a target=&quot;_tab&quot; href=&quot;http://www.php.net/isset&quot;&gt;&lt;span class=&quot;kw3&quot;&gt;isset&lt;/span&gt;&lt;/a&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#40;&lt;/span&gt;&lt;span class=&quot;re0&quot;&gt;$_SERVER&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#91;&lt;/span&gt;&lt;span class=&quot;st_h&quot;&gt;'HTTP_X_DOKUWIKI_DO'&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#93;&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#41;&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#41;&lt;/span&gt; &lt;span class=&quot;br0&quot;&gt;&amp;#123;&lt;/span&gt;
  &lt;span class=&quot;re0&quot;&gt;$ACT&lt;/span&gt; &lt;span class=&quot;sy0&quot;&gt;=&lt;/span&gt; &lt;a target=&quot;_tab&quot; href=&quot;http://www.php.net/trim&quot;&gt;&lt;span class=&quot;kw3&quot;&gt;trim&lt;/span&gt;&lt;/a&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#40;&lt;/span&gt;&lt;a target=&quot;_tab&quot; href=&quot;http://www.php.net/strtolower&quot;&gt;&lt;span class=&quot;kw3&quot;&gt;strtolower&lt;/span&gt;&lt;/a&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#40;&lt;/span&gt;&lt;span class=&quot;re0&quot;&gt;$_SERVER&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#91;&lt;/span&gt;&lt;span class=&quot;st_h&quot;&gt;'HTTP_X_DOKUWIKI_DO'&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#93;&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#41;&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#41;&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;;&lt;/span&gt;
  &lt;span class=&quot;br0&quot;&gt;&amp;#125;&lt;/span&gt; &lt;span class=&quot;kw1&quot;&gt;elseif&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#40;&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;!&lt;/span&gt;&lt;a target=&quot;_tab&quot; href=&quot;http://www.php.net/empty&quot;&gt;&lt;span class=&quot;kw3&quot;&gt;empty&lt;/span&gt;&lt;/a&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#40;&lt;/span&gt;&lt;span class=&quot;re0&quot;&gt;$_REQUEST&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#91;&lt;/span&gt;&lt;span class=&quot;st_h&quot;&gt;'idx'&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#93;&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#41;&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#41;&lt;/span&gt; &lt;span class=&quot;br0&quot;&gt;&amp;#123;&lt;/span&gt;
  &lt;span class=&quot;re0&quot;&gt;$ACT&lt;/span&gt; &lt;span class=&quot;sy0&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;st_h&quot;&gt;'index'&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;;&lt;/span&gt;
  &lt;span class=&quot;br0&quot;&gt;&amp;#125;&lt;/span&gt; &lt;span class=&quot;kw1&quot;&gt;elseif&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#40;&lt;/span&gt;&lt;a target=&quot;_tab&quot; href=&quot;http://www.php.net/isset&quot;&gt;&lt;span class=&quot;kw3&quot;&gt;isset&lt;/span&gt;&lt;/a&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#40;&lt;/span&gt;&lt;span class=&quot;re0&quot;&gt;$_REQUEST&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#91;&lt;/span&gt;&lt;span class=&quot;st_h&quot;&gt;'do'&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#93;&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#41;&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#41;&lt;/span&gt; &lt;span class=&quot;br0&quot;&gt;&amp;#123;&lt;/span&gt;
  &lt;span class=&quot;re0&quot;&gt;$ACT&lt;/span&gt; &lt;span class=&quot;sy0&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;re0&quot;&gt;$_REQUEST&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#91;&lt;/span&gt;&lt;span class=&quot;st_h&quot;&gt;'do'&lt;/span&gt;&lt;span class=&quot;br0&quot;&gt;&amp;#93;&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;;&lt;/span&gt;
  &lt;span class=&quot;br0&quot;&gt;&amp;#125;&lt;/span&gt; &lt;span class=&quot;kw1&quot;&gt;else&lt;/span&gt; &lt;span class=&quot;br0&quot;&gt;&amp;#123;&lt;/span&gt;
  &lt;span class=&quot;re0&quot;&gt;$ACT&lt;/span&gt; &lt;span class=&quot;sy0&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;st_h&quot;&gt;'show'&lt;/span&gt;&lt;span class=&quot;sy0&quot;&gt;;&lt;/span&gt;
  &lt;span class=&quot;br0&quot;&gt;&amp;#125;&lt;/span&gt;
&amp;nbsp;&lt;/pre&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
How did we get hold of the source code for this wiki?  In this case, all we did was look in the dokuwiki source found on &lt;a href=&quot;https://github.com/splitbrain/dokuwiki&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://github.com/splitbrain/dokuwiki&quot; rel=&quot;ugc nofollow noopener&quot;&gt;github&lt;/a&gt;, pick a bit of code at random and throw it in our wiki.
&lt;/p&gt;
&lt;div class=&quot;wrap_hideslide plugin_wrap&quot;&gt;
&lt;p&gt;
So, finding the source for open software is easy, but doing the same thing with a closed-source program is usually difficult or impossible: either you purchase or are given access to the code. Any other method may break all manner of licenses and laws.
&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;

&lt;h3 id=&quot;ideepcolor_on_github&quot;&gt;ideepcolor on Github&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
For a project like ideepcolor, Github is where researchers and developers describe how they achieved their results with real, working code.  There is generally an introduction, which should contain any major updates to the project, then we go through the prerequisites, setting up or getting started, installation, training and (hopefully) application.  There are a number of ways to demonstrate the application of a model; ideepcolor has built a custom Graphical User Interface (&lt;abbr title=&quot;Graphical User Interface&quot;&gt;GUI&lt;/abbr&gt;), which we demonstrated in our first ML workshop. More commonly, demos are done with a &lt;a href=&quot;https://jupyter.org/&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://jupyter.org/&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Jupyter notebook&lt;/a&gt; or &lt;a href=&quot;https://colab.research.google.com/notebooks/welcome.ipynb&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://colab.research.google.com/notebooks/welcome.ipynb&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Google Colab&lt;/a&gt; notebook. Now, let&amp;#039;s look at the updates at the top of the ideepcolor repository - which tell us:
&lt;/p&gt;
&lt;div class=&quot;wrap_center wrap_round wrap_info plugin_wrap&quot; style=&quot;width: 60%;&quot;&gt;
&lt;p&gt;
10/3/2019 Update: Our technology is also now available in Adobe Photoshop Elements 2020. See this &lt;a href=&quot;https://helpx.adobe.com/photoshop-elements/using/colorize-photo.html&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://helpx.adobe.com/photoshop-elements/using/colorize-photo.html&quot; rel=&quot;ugc nofollow noopener&quot;&gt;blog&lt;/a&gt; and &lt;a href=&quot;https://www.youtube.com/watch?v=tmXg4N4YlJg&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://www.youtube.com/watch?v=tmXg4N4YlJg&quot; rel=&quot;ugc nofollow noopener&quot;&gt;video&lt;/a&gt; for more details.
&lt;/p&gt;
&lt;/div&gt;
&lt;p&gt;
So it looks like this project has moved to the next stage - integrating ideepcolor into a commercial product.
&lt;/p&gt;

&lt;/div&gt;
&lt;/div&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;/div&gt;
&lt;h1 class=&quot;sectionedit28&quot; id=&quot;speech_synthesis&quot;&gt;Speech Synthesis&lt;/h1&gt;
&lt;div class=&quot;level1&quot;&gt;

&lt;p&gt;
Like many of the 20th century&amp;#039;s technological innovations, the first modern speech synthesiser can be traced back to the invention of the &lt;a href=&quot;https://en.wikipedia.org/wiki/Vocoder&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://en.wikipedia.org/wiki/Vocoder&quot; rel=&quot;ugc nofollow noopener&quot;&gt;vocoder&lt;/a&gt; at &lt;a href=&quot;https://en.wikipedia.org/wiki/Bell_Labs&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://en.wikipedia.org/wiki/Bell_Labs&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Bell Labs&lt;/a&gt;.  Derived from this, the &lt;a href=&quot;https://en.wikipedia.org/wiki/Voder&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://en.wikipedia.org/wiki/Voder&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Voder&lt;/a&gt; was demonstrated at the 1939 World&amp;#039;s Fair.
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:voder_demonstrated_on_1939_new_york_world_fair_-_the_voder_fascinates_the_crowds_-_bell_telephone_quarterly_january_1940_.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:voder_demonstrated_on_1939_new_york_world_fair_-_the_voder_fascinates_the_crowds_-_bell_telephone_quarterly_january_1940_.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=600&amp;amp;tok=bc1f4c&amp;amp;media=workshops:public:machine_learning:uncanny_valley:voder_demonstrated_on_1939_new_york_world_fair_-_the_voder_fascinates_the_crowds_-_bell_telephone_quarterly_january_1940_.jpg&quot; class=&quot;mediacenter&quot; alt=&quot;&quot; width=&quot;600&quot; /&gt;&lt;/a&gt; &lt;sup&gt;&lt;a href=&quot;#fn__4&quot; id=&quot;fnt__4&quot; class=&quot;fn_top&quot;&gt;4)&lt;/a&gt;&lt;/sup&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:homer_dudley_october_1940_._the_carrier_nature_of_speech_._bell_system_technical_journal_xix_4_495-515._--_fig.8_schematic_circuit_of_the_voder.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:homer_dudley_october_1940_._the_carrier_nature_of_speech_._bell_system_technical_journal_xix_4_495-515._--_fig.8_schematic_circuit_of_the_voder.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=600&amp;amp;tok=0978a5&amp;amp;media=workshops:public:machine_learning:uncanny_valley:homer_dudley_october_1940_._the_carrier_nature_of_speech_._bell_system_technical_journal_xix_4_495-515._--_fig.8_schematic_circuit_of_the_voder.jpg&quot; class=&quot;mediacenter&quot; alt=&quot;&quot; width=&quot;600&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;

&lt;h3 id=&quot;historical_audio_examples&quot;&gt;Historical Audio Examples&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Here is a playlist of various historical TTS methods. 
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://soundcloud.com/user-552764043&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://soundcloud.com/user-552764043&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://soundcloud.com/user-552764043&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;h1 class=&quot;sectionedit29&quot; id=&quot;modern_state_of_the_art_tts&quot;&gt;Modern State of the Art TTS&lt;/h1&gt;
&lt;div class=&quot;level1&quot;&gt;

&lt;p&gt;
Now it&amp;#039;s time to have some fun with TTS - check out the man holding the frog below…
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://vo.codes/#speak&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://vo.codes/#speak&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://vo.codes/#speak&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
And have a listen to some interesting examples from pop/meme culture.
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://fifteen.ai/examples&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://fifteen.ai/examples&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://fifteen.ai/examples&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
&lt;a href=&quot;https://www.youtube.com/watch?v=drirw-XvzzQ&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://www.youtube.com/watch?v=drirw-XvzzQ&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://www.youtube.com/watch?v=drirw-XvzzQ&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;/div&gt;
&lt;h2 class=&quot;sectionedit30&quot; id=&quot;wavenet&quot;&gt;Wavenet&lt;/h2&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;p&gt;
Modern deep learning based synthesis started with the release of &lt;a href=&quot;https://deepmind.com/blog/article/wavenet-generative-model-raw-audio&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://deepmind.com/blog/article/wavenet-generative-model-raw-audio&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Wavenet&lt;/a&gt; in 2016 by Google&amp;#039;s &lt;a href=&quot;https://deepmind.com&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://deepmind.com&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Deepmind&lt;/a&gt;.  
&lt;/p&gt;

&lt;p&gt;
WaveNet changes this paradigm by directly modelling the raw waveform of the audio signal, one sample at a time. As well as yielding more natural-sounding speech, using raw waveforms means that WaveNet can model any kind of audio, including music.&lt;sup&gt;&lt;a href=&quot;#fn__5&quot; id=&quot;fnt__5&quot; class=&quot;fn_top&quot;&gt;5)&lt;/a&gt;&lt;/sup&gt;
&lt;/p&gt;
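&lt;p&gt;
The key idea is &lt;em&gt;autoregression&lt;/em&gt;: each new audio sample is predicted from the samples that came before it. Here is a deliberately toy Python sketch of that loop - the real WaveNet replaces the stand-in function with a deep network of dilated convolutions.
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# toy sketch of sample-by-sample (autoregressive) audio generation
# (illustrative only - not the actual WaveNet model)
import random

def next_sample(history):
    # stand-in for the neural network: predict the next sample
    # from everything generated so far
    return 0.9 * history[-1] + random.gauss(0, 0.01)

audio = [0.0]
for _ in range(16000):  # one second of audio at 16 kHz
    audio.append(next_sample(audio))&lt;/pre&gt;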

&lt;/div&gt;
&lt;h2 class=&quot;sectionedit31&quot; id=&quot;tacotron_and_tacotron2&quot;&gt;Tacotron and Tacotron2&lt;/h2&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;p&gt;
Wavenet was followed by Tacotron (also from Google) in 2017.  
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://google.github.io/tacotron/publications/tacotron/index.html&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://google.github.io/tacotron/publications/tacotron/index.html&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://google.github.io/tacotron/publications/tacotron/index.html&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;
Then Tacotron2 
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://ai.googleblog.com/2017/12/tacotron-2-generating-human-like-speech.html&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://ai.googleblog.com/2017/12/tacotron-2-generating-human-like-speech.html&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://ai.googleblog.com/2017/12/tacotron-2-generating-human-like-speech.html&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;/div&gt;
&lt;h1 class=&quot;sectionedit32&quot; id=&quot;google_colab&quot;&gt;Google Colab&lt;/h1&gt;
&lt;div class=&quot;level1&quot;&gt;

&lt;p&gt;
Google&amp;#039;s Colaboratory&lt;sup&gt;&lt;a href=&quot;#fn__6&quot; id=&quot;fnt__6&quot; class=&quot;fn_top&quot;&gt;6)&lt;/a&gt;&lt;/sup&gt;, or “Colab” for short, allows you to write and execute Python in your browser, with
&lt;/p&gt;
&lt;ul&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; Zero configuration required&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; Free access to GPUs&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; Easy sharing&lt;/div&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;/div&gt;

&lt;h3 id=&quot;python&quot;&gt;Python&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Python is an open source programming language that was made to be easy-to-read and powerful&lt;sup&gt;&lt;a href=&quot;#fn__7&quot; id=&quot;fnt__7&quot; class=&quot;fn_top&quot;&gt;7)&lt;/a&gt;&lt;/sup&gt;.  Python is:
&lt;/p&gt;
&lt;ul&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; a high-level language (meaning the programmer can focus on what to do instead of how to do it)&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; an interpreted language (interpreted languages do not need to be compiled to run)&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; often described as a “batteries included” language due to its comprehensive standard library - as the short example below shows.&lt;/div&gt;
&lt;/li&gt;
&lt;/ul&gt;
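&lt;p&gt;
Here is a small, self-contained sketch of those points - readable at a glance, run directly by the interpreter, and using only modules that ship with Python itself:
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# &amp;quot;batteries included&amp;quot;: these modules ship with every Python install
from statistics import mean
from pathlib import Path

print(mean([1, 2, 3, 4]))  # high-level: say *what* you want -&amp;gt; 2.5
print(Path.cwd())          # the current working directory&lt;/pre&gt;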

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
A program called an interpreter runs Python code on almost any kind of computer. In our case Python will be interpreted by Google Colab, which is based on Jupyter notebooks.
&lt;/p&gt;

&lt;/div&gt;

&lt;h3 id=&quot;jupyter_notebooks&quot;&gt;Jupyter Notebooks&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Jupyter Notebook is an open-source web application that allows you to create and share documents that contain live code, equations, visualizations and narrative text&lt;sup&gt;&lt;a href=&quot;#fn__8&quot; id=&quot;fnt__8&quot; class=&quot;fn_top&quot;&gt;8)&lt;/a&gt;&lt;/sup&gt;.  Usually Jupyter notebooks require set-up for a specific purpose, but Colab takes care of all this for us.
&lt;/p&gt;

&lt;/div&gt;
&lt;h1 class=&quot;sectionedit33&quot; id=&quot;getting_started_with_colab&quot;&gt;Getting Started with Colab&lt;/h1&gt;
&lt;div class=&quot;level1&quot;&gt;

&lt;p&gt;
The only requirement for using Colab is (unsurprisingly) a Google account.  Once you have a Google account, let&amp;#039;s jump into our first ML example - &lt;a href=&quot;https://github.com/deezer/spleeter&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://github.com/deezer/spleeter&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Spleeter&lt;/a&gt; - that we mentioned earlier.  Go to the Colab here:
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://colab.research.google.com/github/deezer/spleeter/blob/master/spleeter.ipynb&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://colab.research.google.com/github/deezer/spleeter/blob/master/spleeter.ipynb&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://colab.research.google.com/github/deezer/spleeter/blob/master/spleeter.ipynb&lt;/a&gt;
&lt;/p&gt;
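&lt;p&gt;
Under the hood, the notebook drives Spleeter&amp;#039;s Python interface. As a rough sketch of what that looks like (based on Spleeter&amp;#039;s documented API - the notebook&amp;#039;s exact cells may differ):
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# sketch: split a song into vocals + accompaniment with Spleeter
# (based on Spleeter's documented Python API; the Colab notebook may differ)
from spleeter.separator import Separator

# '2stems' separates the mix into vocals and accompaniment
separator = Separator('spleeter:2stems')
separator.separate_to_file('audio_example.mp3', 'output/')&lt;/pre&gt;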

&lt;/div&gt;

&lt;h3 id=&quot;making_a_colab_copy&quot;&gt;Making a Colab Copy&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
The first step is to make a copy of the notebook to our Google Drive - this means we can save any changes we like.
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:01_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:01_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=400&amp;amp;tok=226d26&amp;amp;media=workshops:public:machine_learning:uncanny_valley:01_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;400&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
This will trigger a Google sign-in
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:02_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:02_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=400&amp;amp;tok=37d516&amp;amp;media=workshops:public:machine_learning:uncanny_valley:02_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;400&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
and then your copy will open in a new tab.
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:03_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:03_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=400&amp;amp;tok=de70db&amp;amp;media=workshops:public:machine_learning:uncanny_valley:03_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;400&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;

&lt;h3 id=&quot;select_a_runtime&quot;&gt;Select a Runtime&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Next we change our runtime (the kind of processor we use)
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:04_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:04_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=400&amp;amp;tok=094e43&amp;amp;media=workshops:public:machine_learning:uncanny_valley:04_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;400&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
to a GPU, to take advantage of Google&amp;#039;s free GPU offer.
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:04.5_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:04.5_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=400&amp;amp;tok=99d85c&amp;amp;media=workshops:public:machine_learning:uncanny_valley:04.5_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;400&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Now let&amp;#039;s connect to our hosted runtime
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:05_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:05_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=400&amp;amp;tok=c775bb&amp;amp;media=workshops:public:machine_learning:uncanny_valley:05_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;400&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
and check the specs…
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:06_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:06_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=400&amp;amp;tok=153dfd&amp;amp;media=workshops:public:machine_learning:uncanny_valley:06_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;400&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;!-- EDIT{&amp;quot;target&amp;quot;:&amp;quot;section&amp;quot;,&amp;quot;name&amp;quot;:&amp;quot;Getting Started with Colab&amp;quot;,&amp;quot;hid&amp;quot;:&amp;quot;getting_started_with_colab&amp;quot;,&amp;quot;codeblockOffset&amp;quot;:1,&amp;quot;secid&amp;quot;:33,&amp;quot;range&amp;quot;:&amp;quot;5999-7477&amp;quot;} --&gt;
&lt;h2 class=&quot;sectionedit34&quot; id=&quot;step_through_the_notebook&quot;&gt;Step Through the Notebook&lt;/h2&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;p&gt;
Now it&amp;#039;s time to actually use the notebook! Before we start, let&amp;#039;s go over how notebooks work:
&lt;/p&gt;
&lt;ul&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; The notebook is divided into sections, with each section made up of cells.  &lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; These cells have code pre-entered into them. &lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; A play button on the left of each cell runs (executes) its code. &lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; The output of the cell is printed (or displayed) directly below each cell. &lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; The output could be text, pictures, audio or video. &lt;/div&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Cells usually contain Python code, but can also run commands in bash - the UNIX command-line shell. Cells containing bash commands start with an exclamation mark &lt;code&gt;!&lt;/code&gt;
&lt;/p&gt;
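
&lt;p&gt;
For example, these two (hypothetical) cells would both run in Colab - the first as ordinary Python, the second handed to the shell because of the leading &lt;code&gt;!&lt;/code&gt;:
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# a Python cell - runs in the notebook as regular Python
print(&quot;hello from Python&quot;)

# a bash cell - the leading ! hands the command to the shell
!echo hello from bash&lt;/pre&gt;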

&lt;/div&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;p&gt;
Our first section is called “Install Spleeter” and contains the bash command &lt;code&gt;apt install ffmpeg&lt;/code&gt;. This installs ffmpeg in our runtime, which is used to process audio. Press the play button…
&lt;/p&gt;
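
&lt;p&gt;
In the notebook the cell carries the leading exclamation mark, since it is a bash command:
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# install the ffmpeg audio/video tool into the hosted runtime
!apt install ffmpeg&lt;/pre&gt;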

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:07_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:07_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=400&amp;amp;tok=c5f136&amp;amp;media=workshops:public:machine_learning:uncanny_valley:07_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;400&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
ffmpeg will be downloaded and installed to our runtime.
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:08_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:08_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=600&amp;amp;tok=85ee7a&amp;amp;media=workshops:public:machine_learning:uncanny_valley:08_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;600&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Next we will run &lt;code&gt;pip&lt;/code&gt;, the &lt;a href=&quot;https://pypi.org/project/pip/&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://pypi.org/project/pip/&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Python package manager&lt;/a&gt;, to install the spleeter Python package.
&lt;/p&gt;
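
&lt;p&gt;
The cell is a one-liner along these lines:
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# install the spleeter package (and its dependencies) into the runtime
!pip install spleeter&lt;/pre&gt;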

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:09_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:09_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=1200&amp;amp;tok=aa27f1&amp;amp;media=workshops:public:machine_learning:uncanny_valley:09_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;1200&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
This will take a while - and at the end we will get a message saying we need to restart our runtime due to some compatibility issues &lt;sup&gt;&lt;a href=&quot;#fn__9&quot; id=&quot;fnt__9&quot; class=&quot;fn_top&quot;&gt;9)&lt;/a&gt;&lt;/sup&gt;
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:10_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:10_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=1200&amp;amp;tok=197b84&amp;amp;media=workshops:public:machine_learning:uncanny_valley:10_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;1200&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Go ahead and restart 
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:11_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:11_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=600&amp;amp;tok=e181ef&amp;amp;media=workshops:public:machine_learning:uncanny_valley:11_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;600&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Next is another bash command 
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;wget&lt;/pre&gt;

&lt;p&gt;
which we use to (web)get our example audio file.
&lt;/p&gt;
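
&lt;p&gt;
In the notebook the cell looks something like this (the exact URL may differ in your copy):
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# download the example track into the runtime filesystem
!wget https://github.com/deezer/spleeter/raw/master/audio_example.mp3&lt;/pre&gt;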

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:12_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:12_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=800&amp;amp;tok=300a00&amp;amp;media=workshops:public:machine_learning:uncanny_valley:12_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;800&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
And the next cell uses the Python &lt;code&gt;Audio&lt;/code&gt; command to give us a nice little audio player so we can hear our example.
&lt;/p&gt;
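
&lt;p&gt;
&lt;code&gt;Audio&lt;/code&gt; comes from IPython, which Colab notebooks are built on - a minimal sketch, assuming the file downloaded above is called &lt;code&gt;audio_example.mp3&lt;/code&gt;:
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;from IPython.display import Audio

# render an inline player for the downloaded file
Audio(&quot;audio_example.mp3&quot;)&lt;/pre&gt;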

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:13_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:13_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=600&amp;amp;tok=6d1c4f&amp;amp;media=workshops:public:machine_learning:uncanny_valley:13_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;600&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Now it&amp;#039;s finally time to use the spleeter tool with the &lt;code&gt;separate&lt;/code&gt; command &lt;sup&gt;&lt;a href=&quot;#fn__10&quot; id=&quot;fnt__10&quot; class=&quot;fn_top&quot;&gt;10)&lt;/a&gt;&lt;/sup&gt; as &lt;code&gt;!spleeter separate&lt;/code&gt;, and let&amp;#039;s pass the &lt;code&gt;-h&lt;/code&gt; flag &lt;sup&gt;&lt;a href=&quot;#fn__11&quot; id=&quot;fnt__11&quot; class=&quot;fn_top&quot;&gt;11)&lt;/a&gt;&lt;/sup&gt; to show us the built-in help for the command.
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:14_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:14_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=800&amp;amp;tok=d9f96e&amp;amp;media=workshops:public:machine_learning:uncanny_valley:14_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;800&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Now that we know what we are doing, we run the tool for real, using the &lt;code&gt;-i&lt;/code&gt; flag to define the input as our downloaded example, and the &lt;code&gt;-o&lt;/code&gt; flag to define our output destination as the directory (folder) &lt;code&gt;output&lt;/code&gt;. By default spleeter will download and use the &lt;a href=&quot;https://github.com/deezer/spleeter/wiki/2.-Getting-started#using-2stems-model&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://github.com/deezer/spleeter/wiki/2.-Getting-started#using-2stems-model&quot; rel=&quot;ugc nofollow noopener&quot;&gt;2stems model&lt;/a&gt;.
&lt;/p&gt;
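
&lt;p&gt;
Assuming the example file is named &lt;code&gt;audio_example.mp3&lt;/code&gt;, the cell boils down to:
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# split the example into vocals + accompaniment (2stems is the default)
!spleeter separate -i audio_example.mp3 -o output&lt;/pre&gt;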

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:15_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:15_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=1200&amp;amp;tok=256712&amp;amp;media=workshops:public:machine_learning:uncanny_valley:15_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;1200&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Another bash command, &lt;code&gt;ls&lt;/code&gt; (list), shows us the contents of our output directory.
&lt;/p&gt;
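
&lt;p&gt;
With the default 2stems model you should see one folder per input file, each holding a vocals track and an accompaniment track - something like:
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;!ls output/audio_example
# expected listing: accompaniment.wav  vocals.wav&lt;/pre&gt;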

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:16_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:16_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=800&amp;amp;tok=bbda8b&amp;amp;media=workshops:public:machine_learning:uncanny_valley:16_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;800&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
And finally another couple of &lt;code&gt;Audio&lt;/code&gt; commands to hear our result!
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?media=workshops:public:machine_learning:uncanny_valley:17_colab_spleeter.jpg&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:17_colab_spleeter.jpg&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=800&amp;amp;tok=ee6ffe&amp;amp;media=workshops:public:machine_learning:uncanny_valley:17_colab_spleeter.jpg&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;800&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;

&lt;h3 id=&quot;things_to_try&quot;&gt;Things to try&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Check out the &lt;a href=&quot;https://github.com/deezer/spleeter/wiki/2.-Getting-started#separate-sources&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://github.com/deezer/spleeter/wiki/2.-Getting-started#separate-sources&quot; rel=&quot;ugc nofollow noopener&quot;&gt;usage instructions&lt;/a&gt; for the separate tool on the GitHub site and try your own 4stems and 5stems separations.
Use your own audio files to test the separation.
&lt;/p&gt;
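
&lt;p&gt;
As a starting point, the &lt;code&gt;-p&lt;/code&gt; flag selects a different pretrained configuration - a sketch for a 4stems split, assuming the same example file:
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# separate into vocals, drums, bass and other
!spleeter separate -i audio_example.mp3 -o output -p spleeter:4stems&lt;/pre&gt;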

&lt;/div&gt;
&lt;!-- EDIT{&amp;quot;target&amp;quot;:&amp;quot;section&amp;quot;,&amp;quot;name&amp;quot;:&amp;quot;Step Through the Notebook&amp;quot;,&amp;quot;hid&amp;quot;:&amp;quot;step_through_the_notebook&amp;quot;,&amp;quot;codeblockOffset&amp;quot;:1,&amp;quot;secid&amp;quot;:34,&amp;quot;range&amp;quot;:&amp;quot;7478-11130&amp;quot;} --&gt;
&lt;h1 class=&quot;sectionedit35&quot; id=&quot;speech_to_text_with_mozilla_deepspeech&quot;&gt;Speech to Text with Mozilla DeepSpeech&lt;/h1&gt;
&lt;div class=&quot;level1&quot;&gt;

&lt;p&gt;
Our next challenge will be to adapt the latest version of Mozilla&amp;#039;s DeepSpeech for use in Google Colab.
&lt;/p&gt;

&lt;p&gt;
We will be using the documentation here:
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://deepspeech.readthedocs.io/en/v0.8.0/USING.html#getting-the-pre-trained-model&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://deepspeech.readthedocs.io/en/v0.8.0/USING.html#getting-the-pre-trained-model&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://deepspeech.readthedocs.io/en/v0.8.0/USING.html#getting-the-pre-trained-model&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;
We will adapt this Colab notebook to run the latest version of Mozilla DeepSpeech:
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://colab.research.google.com/github/tugstugi/dl-colab-notebooks/blob/master/notebooks/MozillaDeepSpeech.ipynb#scrollTo=4OAYywPHApuz&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://colab.research.google.com/github/tugstugi/dl-colab-notebooks/blob/master/notebooks/MozillaDeepSpeech.ipynb#scrollTo=4OAYywPHApuz&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://colab.research.google.com/github/tugstugi/dl-colab-notebooks/blob/master/notebooks/MozillaDeepSpeech.ipynb#scrollTo=4OAYywPHApuz&lt;/a&gt;
&lt;/p&gt;
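
&lt;p&gt;
Following those docs, the key cells would look something like this (the filenames match the v0.8.0 release; check the docs for the current ones, and swap in your own recording for the placeholder &lt;code&gt;my_audio.wav&lt;/code&gt;):
&lt;/p&gt;
&lt;pre class=&quot;code&quot;&gt;# install the DeepSpeech inference client
!pip install deepspeech

# download the pre-trained English model and scorer
!wget https://github.com/mozilla/DeepSpeech/releases/download/v0.8.0/deepspeech-0.8.0-models.pbmm
!wget https://github.com/mozilla/DeepSpeech/releases/download/v0.8.0/deepspeech-0.8.0-models.scorer

# transcribe a 16kHz mono WAV file
!deepspeech --model deepspeech-0.8.0-models.pbmm --scorer deepspeech-0.8.0-models.scorer --audio my_audio.wav&lt;/pre&gt;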

&lt;/div&gt;
&lt;!-- EDIT{&amp;quot;target&amp;quot;:&amp;quot;section&amp;quot;,&amp;quot;name&amp;quot;:&amp;quot;Speech to Text with Mozilla Deepspeech&amp;quot;,&amp;quot;hid&amp;quot;:&amp;quot;speech_to_text_with_mozilla_deepspeech&amp;quot;,&amp;quot;codeblockOffset&amp;quot;:1,&amp;quot;secid&amp;quot;:35,&amp;quot;range&amp;quot;:&amp;quot;11131-11635&amp;quot;} --&gt;
&lt;h2 class=&quot;sectionedit36&quot; id=&quot;text_to_speech_with_mozilla_tts&quot;&gt;Text to Speech with Mozilla TTS&lt;/h2&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;p&gt;
Our final example is TTS with Mozilla TTS:
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://colab.research.google.com/drive/1u_16ZzHjKYFn1HNVuA4Qf_i2MMFB9olY?usp=sharing#scrollTo=6LWsNd3_M3MP&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://colab.research.google.com/drive/1u_16ZzHjKYFn1HNVuA4Qf_i2MMFB9olY?usp=sharing#scrollTo=6LWsNd3_M3MP&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://colab.research.google.com/drive/1u_16ZzHjKYFn1HNVuA4Qf_i2MMFB9olY?usp=sharing#scrollTo=6LWsNd3_M3MP&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;
You can dive straight into this and use it to generate speech. This example uses the Tacotron2 and MultiBand-MelGAN models and the LJSpeech dataset.
&lt;/p&gt;

&lt;/div&gt;

&lt;h3 id=&quot;run_all_cells&quot;&gt;Run All Cells&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/detail.php?id=workshops%3Apublic%3Amachine_learning%3Auncanny_valley%3Astart&amp;amp;media=workshops:public:machine_learning:uncanny_valley:01_melgan.png&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:01_melgan.png&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=400&amp;amp;tok=6ca8e7&amp;amp;media=workshops:public:machine_learning:uncanny_valley:01_melgan.png&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;400&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;

&lt;h3 id=&quot;generate_speech&quot;&gt;Generate Speech&lt;/h3&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
&lt;a href=&quot;https://wiki.slq.qld.gov.au/lib/exe/detail.php?id=workshops%3Apublic%3Amachine_learning%3Auncanny_valley%3Astart&amp;amp;media=workshops:public:machine_learning:uncanny_valley:02_melgan.png&quot; class=&quot;media&quot; target=&quot; _blank&quot; title=&quot;workshops:public:machine_learning:uncanny_valley:02_melgan.png&quot; rel=&quot;noopener&quot;&gt;&lt;img src=&quot;https://wiki.slq.qld.gov.au/lib/exe/fetch.php?w=400&amp;amp;tok=c8405b&amp;amp;media=workshops:public:machine_learning:uncanny_valley:02_melgan.png&quot; class=&quot;media&quot; alt=&quot;&quot; width=&quot;400&quot; /&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;!-- EDIT{&amp;quot;target&amp;quot;:&amp;quot;section&amp;quot;,&amp;quot;name&amp;quot;:&amp;quot;Text to Speech with Mozilla TTS&amp;quot;,&amp;quot;hid&amp;quot;:&amp;quot;text_to_speech_with_mozilla_tts&amp;quot;,&amp;quot;codeblockOffset&amp;quot;:1,&amp;quot;secid&amp;quot;:36,&amp;quot;range&amp;quot;:&amp;quot;11636-12173&amp;quot;} --&gt;
&lt;h2 class=&quot;sectionedit37&quot; id=&quot;going_further&quot;&gt;Going Further&lt;/h2&gt;
&lt;div class=&quot;level2&quot;&gt;

&lt;p&gt;
ML is such a big and fast-moving area of research that there are countless other ways to explore and learn. Here are a few two-minute videos to pique your interest:
&lt;/p&gt;
&lt;ul&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; &lt;a href=&quot;https://www.youtube.com/watch?v=EjVzjxihGvU&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://www.youtube.com/watch?v=EjVzjxihGvU&quot; rel=&quot;ugc nofollow noopener&quot;&gt;Video restoration&lt;/a&gt;&lt;/div&gt;
&lt;/li&gt;
&lt;li class=&quot;level1&quot;&gt;&lt;div class=&quot;li&quot;&gt; &lt;a href=&quot;https://www.youtube.com/watch?v=Lu56xVlZ40M&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://www.youtube.com/watch?v=Lu56xVlZ40M&quot; rel=&quot;ugc nofollow noopener&quot;&gt;OpenAI Plays Hide and Seek&lt;/a&gt;&lt;/div&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;/div&gt;
&lt;div class=&quot;level3&quot;&gt;

&lt;p&gt;
Make sure you check out the resources on Lynda, which you have free access to as a State Library of Queensland member.
&lt;/p&gt;

&lt;/div&gt;
&lt;!-- EDIT{&amp;quot;target&amp;quot;:&amp;quot;section&amp;quot;,&amp;quot;name&amp;quot;:&amp;quot;Going Further&amp;quot;,&amp;quot;hid&amp;quot;:&amp;quot;going_further&amp;quot;,&amp;quot;codeblockOffset&amp;quot;:1,&amp;quot;secid&amp;quot;:37,&amp;quot;range&amp;quot;:&amp;quot;12174-12656&amp;quot;} --&gt;
&lt;h1 class=&quot;sectionedit38&quot; id=&quot;links&quot;&gt;Links&lt;/h1&gt;
&lt;div class=&quot;level1&quot;&gt;

&lt;p&gt;
&lt;a href=&quot;https://machinelearningforkids.co.uk/#!/links#top&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://machinelearningforkids.co.uk/#!/links#top&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://machinelearningforkids.co.uk/#!/links#top&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://experiments.withgoogle.com/collection/ai&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://experiments.withgoogle.com/collection/ai&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://experiments.withgoogle.com/collection/ai&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;
&lt;a href=&quot;https://openai.com/blog/&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://openai.com/blog/&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://openai.com/blog/&lt;/a&gt;
&lt;/p&gt;

&lt;/div&gt;
&lt;!-- EDIT{&amp;quot;target&amp;quot;:&amp;quot;section&amp;quot;,&amp;quot;name&amp;quot;:&amp;quot;Links&amp;quot;,&amp;quot;hid&amp;quot;:&amp;quot;links&amp;quot;,&amp;quot;codeblockOffset&amp;quot;:1,&amp;quot;secid&amp;quot;:38,&amp;quot;range&amp;quot;:&amp;quot;12657-&amp;quot;} --&gt;&lt;div class=&quot;footnotes&quot;&gt;
&lt;div class=&quot;fn&quot;&gt;&lt;sup&gt;&lt;a href=&quot;#fnt__1&quot; id=&quot;fn__1&quot; class=&quot;fn_bot&quot;&gt;1)&lt;/a&gt;&lt;/sup&gt; 
&lt;div class=&quot;content&quot;&gt;the ideepcolor training set is 1.3 million images&lt;/div&gt;&lt;/div&gt;
&lt;div class=&quot;fn&quot;&gt;&lt;sup&gt;&lt;a href=&quot;#fnt__2&quot; id=&quot;fn__2&quot; class=&quot;fn_bot&quot;&gt;2)&lt;/a&gt;&lt;/sup&gt; 
&lt;div class=&quot;content&quot;&gt;this is obviously a contrived example of course, but the principle applies regardless&lt;/div&gt;&lt;/div&gt;
&lt;div class=&quot;fn&quot;&gt;&lt;sup&gt;&lt;a href=&quot;#fnt__3&quot; id=&quot;fn__3&quot; class=&quot;fn_bot&quot;&gt;3)&lt;/a&gt;&lt;/sup&gt; 
&lt;div class=&quot;content&quot;&gt;if made available under an &lt;a href=&quot;https://en.wikipedia.org/wiki/Open-source_software&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://en.wikipedia.org/wiki/Open-source_software&quot; rel=&quot;ugc nofollow noopener&quot;&gt;appropriate license&lt;/a&gt;&lt;/div&gt;&lt;/div&gt;
&lt;div class=&quot;fn&quot;&gt;&lt;sup&gt;&lt;a href=&quot;#fnt__4&quot; id=&quot;fn__4&quot; class=&quot;fn_bot&quot;&gt;4)&lt;/a&gt;&lt;/sup&gt; 
&lt;div class=&quot;content&quot;&gt;&lt;span class=&quot;wrap_lo &quot;&gt;By Internet Archive Book Images - &lt;a href=&quot;https://www.flickr.com/photos/internetarchivebookimages/14776509983/Source&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://www.flickr.com/photos/internetarchivebookimages/14776509983/Source&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://www.flickr.com/photos/internetarchivebookimages/14776509983/Source&lt;/a&gt; book page: &lt;a href=&quot;https://archive.org/stream/belltelephonemag19amerrich/belltelephonemag19amerrich#page/n78/mode/1upReference[Fig.4]&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://archive.org/stream/belltelephonemag19amerrich/belltelephonemag19amerrich#page/n78/mode/1upReference[Fig.4]&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://archive.org/stream/belltelephonemag19amerrich/belltelephonemag19amerrich#page/n78/mode/1upReference[Fig.4]&lt;/a&gt; The Voder Fascinates the Crowds from: Williams, Thomas W. (January 1940) I. At the New York World&amp;#039;s Fair. &quot;Our Exhibits at Two Fairs&quot;. Bell Telephone Quarterly XIX (1): 65. &quot;The Voder Fascinates the Crowds - The manipulative skill of the operator&amp;#039;s fingers makes the Voder&amp;#039;s voice almost too good to be true&quot;, No restrictions, &lt;a href=&quot;https://commons.wikimedia.org/w/index.php?curid=43343073&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://commons.wikimedia.org/w/index.php?curid=43343073&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://commons.wikimedia.org/w/index.php?curid=43343073&lt;/a&gt;&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;
&lt;div class=&quot;fn&quot;&gt;&lt;sup&gt;&lt;a href=&quot;#fnt__5&quot; id=&quot;fn__5&quot; class=&quot;fn_bot&quot;&gt;5)&lt;/a&gt;&lt;/sup&gt; 
&lt;div class=&quot;content&quot;&gt;&lt;a href=&quot;https://deepmind.com/blog/article/wavenet-generative-model-raw-audio&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://deepmind.com/blog/article/wavenet-generative-model-raw-audio&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://deepmind.com/blog/article/wavenet-generative-model-raw-audio&lt;/a&gt;&lt;/div&gt;&lt;/div&gt;
&lt;div class=&quot;fn&quot;&gt;&lt;sup&gt;&lt;a href=&quot;#fnt__6&quot; id=&quot;fn__6&quot; class=&quot;fn_bot&quot;&gt;6)&lt;/a&gt;&lt;/sup&gt; 
&lt;div class=&quot;content&quot;&gt;&lt;a href=&quot;https://colab.research.google.com/notebooks/intro.ipynb&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://colab.research.google.com/notebooks/intro.ipynb&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://colab.research.google.com/notebooks/intro.ipynb&lt;/a&gt;&lt;/div&gt;&lt;/div&gt;
&lt;div class=&quot;fn&quot;&gt;&lt;sup&gt;&lt;a href=&quot;#fnt__7&quot; id=&quot;fn__7&quot; class=&quot;fn_bot&quot;&gt;7)&lt;/a&gt;&lt;/sup&gt; 
&lt;div class=&quot;content&quot;&gt;&lt;a href=&quot;https://simple.wikipedia.org/wiki/Python_(programming_language)&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://simple.wikipedia.org/wiki/Python_(programming_language)&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://simple.wikipedia.org/wiki/Python_(programming_language)&lt;/a&gt;&lt;/div&gt;&lt;/div&gt;
&lt;div class=&quot;fn&quot;&gt;&lt;sup&gt;&lt;a href=&quot;#fnt__8&quot; id=&quot;fn__8&quot; class=&quot;fn_bot&quot;&gt;8)&lt;/a&gt;&lt;/sup&gt; 
&lt;div class=&quot;content&quot;&gt;&lt;a href=&quot;https://jupyter.org/&quot; class=&quot;urlextern&quot; target=&quot;_tab&quot; title=&quot;https://jupyter.org/&quot; rel=&quot;ugc nofollow noopener&quot;&gt;https://jupyter.org/&lt;/a&gt;&lt;/div&gt;&lt;/div&gt;
&lt;div class=&quot;fn&quot;&gt;&lt;sup&gt;&lt;a href=&quot;#fnt__9&quot; id=&quot;fn__9&quot; class=&quot;fn_bot&quot;&gt;9)&lt;/a&gt;&lt;/sup&gt; 
&lt;div class=&quot;content&quot;&gt;this is not unusual when using a hosted runtime&lt;/div&gt;&lt;/div&gt;
&lt;div class=&quot;fn&quot;&gt;&lt;sup&gt;&lt;a href=&quot;#fnt__10&quot; id=&quot;fn__10&quot; class=&quot;fn_bot&quot;&gt;10)&lt;/a&gt;&lt;/sup&gt; 
&lt;div class=&quot;content&quot;&gt;confusingly, we need to call it from bash (with the exclamation mark)&lt;/div&gt;&lt;/div&gt;
&lt;div class=&quot;fn&quot;&gt;&lt;sup&gt;&lt;a href=&quot;#fnt__11&quot; id=&quot;fn__11&quot; class=&quot;fn_bot&quot;&gt;11)&lt;/a&gt;&lt;/sup&gt; 
&lt;div class=&quot;content&quot;&gt;a fancy way of saying option&lt;/div&gt;&lt;/div&gt;
&lt;/div&gt;
</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
        <category>workshops:public:machine_learning:uncanny_valley</category>
            <pubDate>Mon, 16 Nov 2020 15:56:46 +0000</pubDate>
        </item>
    </channel>
</rss>
