<h1>Curious Kids: who is Siri?</h1>
<figure><img src="https://images.theconversation.com/files/268595/original/file-20190410-2927-1im5x47.jpg?ixlib=rb-1.1.0&amp;rect=8%2C0%2C5742%2C3785&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/childhood-technology-family-concept-little-kids-1030882333?src=Nsa2ob-AspNt5AZ9KgX_0Q-1-1">Shutterstock.</a></span></figcaption></figure><figure class="align-left ">
<img alt="" src="https://images.theconversation.com/files/165749/original/image-20170419-32713-1kyojyz.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=237&amp;fit=clip" srcset="https://images.theconversation.com/files/165749/original/image-20170419-32713-1kyojyz.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=376&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/165749/original/image-20170419-32713-1kyojyz.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=376&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/165749/original/image-20170419-32713-1kyojyz.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=376&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/165749/original/image-20170419-32713-1kyojyz.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=472&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/165749/original/image-20170419-32713-1kyojyz.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=472&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/165749/original/image-20170419-32713-1kyojyz.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=472&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p><em><a href="https://theconversation.com/au/topics/curious-kids-36782">Curious Kids</a> is a series by <a href="https://theconversation.com/uk">The Conversation</a>, which gives children of all ages the chance to have their questions about the world answered by experts. All questions are welcome: you or an adult can send them – along with your name, age and town or city where you live – to curiouskids@theconversation.com. We won’t be able to answer every question, but we’ll do our best.</em></p>
<hr>
<blockquote>
<p><strong><em>Who is Siri? – Miles, aged four, London, UK.</em></strong></p>
</blockquote>
<p>Thanks for the question, Miles. The first thing to know is that <a href="https://www.apple.com/siri/">Siri</a> is not a “who” – Siri is a “what”. Siri is a “<a href="https://hbr.org/2015/09/the-president-of-sri-ventures-on-bringing-siri-to-life">virtual assistant</a>” which you can control with your voice. </p>
<p>Siri is “virtual” because it is not a real person, and an “assistant” because it can help you out by doing tasks like reading text messages, playing music, taking photos or reminding you when it’s your friend’s birthday. </p>
<p>Siri is part of the computer program, called an “operating system”, which makes Apple devices – like iPhones, computers, watches and iPads – work. By the way, a computer program is a set of written instructions that tells a computer what to do. </p>
<h2>Speaking with Siri</h2>
<p>When you speak to Siri, it will respond and speak back to you. Siri understands all of what you say – not just a few words – because it has a special program designed to do just that.</p>
<p>You wake Siri up by saying “Hey Siri” to an Apple device. A speech recogniser program in the device listens for these words and uses another program to check that the words are actually “Hey Siri”, and not any other sounds from nearby. </p>
<p>When that happens, Siri will wake up, collect your voice command and change it into a file of data, which the device can understand. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/268615/original/file-20190410-2924-1toxddh.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/268615/original/file-20190410-2924-1toxddh.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=401&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/268615/original/file-20190410-2924-1toxddh.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=401&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/268615/original/file-20190410-2924-1toxddh.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=401&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/268615/original/file-20190410-2924-1toxddh.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=504&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/268615/original/file-20190410-2924-1toxddh.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=504&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/268615/original/file-20190410-2924-1toxddh.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=504&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Hey, Siri – can you hear me?</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/pretty-baby-smiles-talk-by-phone-146931245">Shutterstock.</a></span>
</figcaption>
</figure>
<p>Most of the program that makes up Siri is not on your device, but on the <a href="https://ieeexplore.ieee.org/abstract/document/4804045">cloud</a>. No, not the ones in the sky – the cloud is a network of computers and programs, accessed through the internet.</p>
<p>So, Siri will send the file with your command in it to the cloud. There, Siri will use Apple’s main computers to access a database of questions and answers, search the internet or connect with apps, so that it can carry out the task you asked it to do. </p>
<p>Once it has done this, it sends the results and actions back to your Apple device. Siri then tells you the answer, or lets you know that it has done what you asked.</p>
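<p>If you are curious what those steps look like to a programmer, here is a very small sketch written in the Python programming language. Everything in it – the function names and the pretend cloud address – is made up just to show the idea; Apple’s real software is far bigger and cleverer.</p>
<pre><code># A toy sketch of the steps described above: listen, spot the wake
# words, send the command to the cloud, and speak the answer.
# All names and the web address are invented for illustration.
import json
import urllib.request

def looks_like_hey_siri(audio_chunk):
    """Stand-in for the on-device program that checks the wake words."""
    return b"hey siri" in audio_chunk.lower()

def ask_the_cloud(command_audio):
    """Send the recorded command to a pretend cloud service and
    return the answer it sends back."""
    request = urllib.request.Request(
        "https://assistant.example.com/ask",   # placeholder address, not a real service
        data=command_audio,
        headers={"Content-Type": "application/octet-stream"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["answer"]

def assistant_loop(audio_chunks):
    """audio_chunks: a stream of microphone recordings (assumed to exist)."""
    audio_chunks = iter(audio_chunks)
    for chunk in audio_chunks:
        if looks_like_hey_siri(chunk):      # 1. wake words spotted on the device
            command = next(audio_chunks)    # 2. the next recording is your request
            print(ask_the_cloud(command))   # 3. the cloud works out the answer
</code></pre>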
<h2>Artificial intelligence</h2>
<p>All this can happen because Siri uses artificial intelligence. Artificial intelligence programs help computers and devices do complex tasks, without humans having to write every step into the program. </p>
<p>Artificial intelligence programs can work out what to do for themselves, and learn as they work. That’s why Siri gets better at understanding your voice, and what you might want, the more you talk to it. </p>
<p>Siri’s original voice is believed to belong to a real person called <a href="https://www.nbcnews.com/technolog/theres-real-human-behind-voice-siri-her-name-susan-8C11337218">Susan Bennett</a>. Susan recorded her voice to help build the technology that Siri uses to speak to you. </p>
<p>The people working at Apple will not say whether they used Susan’s voice for Siri – but we do know that in newer versions of Siri <a href="https://searchmobilecomputing.techtarget.com/definition/Siri">hundreds of different voices were used</a> to make the new female voice.</p>
<h2>Why a woman?</h2>
<p>People often ask <a href="https://theconversation.com/theres-a-reason-siri-alexa-and-ai-are-imagined-as-female-sexism-96430">why virtual assistants always have female voices</a>. They think this makes it seem like only women, and not men, can be personal assistants and helpers – or that women are always the assistants and not the bosses.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/268606/original/file-20190410-2931-7whae7.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/268606/original/file-20190410-2931-7whae7.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/268606/original/file-20190410-2931-7whae7.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/268606/original/file-20190410-2931-7whae7.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/268606/original/file-20190410-2931-7whae7.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/268606/original/file-20190410-2931-7whae7.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/268606/original/file-20190410-2931-7whae7.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Your own personal dad-ssistant.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/mixed-race-toddler-child-son-black-1319345612?src=gyceyLGXuYoFYXTJXrWQqA-1-8">Shutterstock.</a></span>
</figcaption>
</figure>
<p>Of course we know this is not true – dads, grandads, brothers and uncles can be very helpful too and women can be bosses! The people who created Siri realised this was unfair, and you can now change Siri’s voice to sound like a man. </p>
<p>Siri is named after the company that invented the voice recognition technology it uses, which was called <a href="https://searchmobilecomputing.techtarget.com/definition/Siri">Siri Inc</a>. Apple kept the name when it bought Siri Inc. </p>
<p>Siri was launched in 2011 and was the very <a href="https://voicebot.ai/2017/07/14/timeline-voice-assistants-short-history-voice-revolution/">first voice virtual assistant</a>, but others, such as Microsoft’s Cortana and Amazon’s Alexa, came quickly after. </p>
<p>Soon we will see more virtual assistants in our homes doing jobs for us. Just remember though, they are not people or friends – just very clever, speaking computer programs.</p>
<hr>
<p><em>More <a href="https://theconversation.com/topics/curious-kids-36782?utm_source=TCUK&amp;utm_medium=linkback&amp;utm_campaign=TCUKengagement&amp;utm_content=CuriousKidsUK">Curious Kids</a> articles, written by academic experts:</em></p>
<ul>
<li><p><em><a href="https://theconversation.com/curious-kids-what-are-meteorites-made-of-and-where-do-they-come-from-114408?utm_source=TCUK&amp;utm_medium=linkback&amp;utm_campaign=TCUKengagement&amp;utm_content=CuriousKidsUK">What is a meteorite made of and where do they come from? – the children of Year Five at Leigh St. Mary’s Church of England Primary School, Lancashire, UK.</a></em></p></li>
<li><p><em><a href="https://theconversation.com/curious-kids-why-do-tigers-have-whiskers-110791?utm_source=TCUK&amp;utm_medium=linkback&amp;utm_campaign=TCUKengagement&amp;utm_content=CuriousKidsUK">Why do tigers have whiskers? – Valentina, aged four, London, UK.</a></em></p></li>
<li><p><em><a href="https://theconversation.com/curious-kids-how-did-the-months-get-their-names-113558?utm_source=TCUK&amp;utm_medium=linkback&amp;utm_campaign=TCUKengagement&amp;utm_content=CuriousKidsUK">How did the months get their names? - Sylvie, aged eight, Brisbane, Australia.</a></em></p></li>
</ul>
<p class="fine-print"><em><span>Allison Gardner is affiliated with the following organisations: co-founder of Women Leading in AI; member of IEEE; member of the Fabian Society; member of the Labour party and Co-op and a local councillor for Newcastle-under-Lyme Borough Council; member of GMB, Unite and UCU.</span></em></p>
<p class="fine-print"><em>Allison Gardner, Teaching Fellow in Bioinformatics/Head of Foundation Year Science, Keele University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>The promise of the “learn to code” movement</h1>
<figure><img src="https://images.theconversation.com/files/248557/original/file-20181203-194953-16inz5r.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Truly learning to code involves more than episodic experiences. Students should ideally develop a &#39;coding mindset.&#39;</span> <span class="attribution"><a class="source" href="https://unsplash.com/photos/kwzWjTnDPLk">Nesa by makers/Unsplash</a></span></figcaption></figure>
<p>This week, educators, students and the public around the world are participating in <a href="https://csedweek.org/">Computer Science Education Week</a> by organizing and leading one-hour coding tutorials. </p>
<p>By the start of the week, more than <a href="https://hourofcode.com/ca/events/all/ca">2,700 Canadian coding events</a> had been registered with <a href="https://code.org">Code.org</a>, a not-for-profit organization in the United States that promotes the week.
This annual event embodies the spirit of the “learn to code” movement; it aims to spark interest and engage students from primary grades to senior secondary levels in developing coding skills. </p>
<p><a href="https://ec.europa.eu/digital-single-market/en/coding-21st-century-skill">Governments</a>, <a href="https://www.forbes.com/sites/forbestechcouncil/2017/10/18/considering-a-new-job-here-are-10-industries-in-need-of-programmers/#3c0905e65fe3">corporations</a>, <a href="https://k12cs.org">associations in the computer science field</a> and <a href="https://code.org/quotes">trend-setters</a> all assert that learning to code will play a key role in the future. In this context, learning to code is often presented as a panacea to the <a href="https://www.theguardian.com/careers/careers-blog/2015/apr/14/coding-isnt-just-for-the-next-zuckerberg-it-can-help-dentists-too">job market problems</a> of the 21st century. </p>
<p>But for educators, there are multiple factors to consider when deciding what coding skills and which approaches to promote. How should they present what coding offers? </p>
<h2>Disillusioned workforce</h2>
<p>We take particular interest in this topic. Together we combine years of training in computer science, educational technology and educational psychology; our research aims to develop a teaching and learning model for introducing computer programming concepts and logic in a down-to-earth way. </p>
<p>We want research in computer science education to suit the needs and characteristics of 21st-century learners.
Otherwise, the cost will be an ill-prepared and disillusioned workforce. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/248112/original/file-20181130-194928-1h186mt.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/248112/original/file-20181130-194928-1h186mt.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=450&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/248112/original/file-20181130-194928-1h186mt.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=450&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/248112/original/file-20181130-194928-1h186mt.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=450&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/248112/original/file-20181130-194928-1h186mt.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=566&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/248112/original/file-20181130-194928-1h186mt.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=566&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/248112/original/file-20181130-194928-1h186mt.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=566&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The learn-to-code movement is promising and represents an answer to preparing learners for a digital future. Nonetheless, educators have a responsibility to ensure computer science education fully suits the needs and characteristics of 21st-century learners.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/michaelpollak/14005409228">Michael Pollak/flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<h2>Why code?</h2>
<p>In an era of an insecure job market, when <a href="http://reports.weforum.org/future-of-jobs-2018/workforce-trends-and-strategies-for-the-fourth-industrial-revolution/?doing_wp_cron=1543552438.9746980667114257812500">redundant professions are projected to be eliminated while new ones arise</a>, learning to code gives hope to our collective imagination. </p>
<p>It creates the promise of alternative sources of income as well as opportunities for self-employment given the <a href="https://www.forbes.com/sites/janicegassam/2018/08/15/the-25-highest-paying-jobs-in-america-in-2018/#30f747d55fd5">demand of coding skills in a variety of industries</a>.</p>
<p>Learning to code is not just a younger-generation trend. For example, <a href="https://scratch.mit.edu">Scratch</a> is a <a href="https://scratch.mit.edu/statistics/">popular</a> tool used <a href="http://ims.mii.lt/ims/konferenciju_medziaga/ICER'10/docs/p69.pdf">in and outside of classrooms</a> to create, share and remix games. It allows intergenerational learning where youth, adults and seniors can <a href="https://www.tandfonline.com/doi/abs/10.1080/15350770.2018.1404855">create game prototypes</a>. </p>
<p>Coding can be used to automate tasks, solve complex problems, forecast, or simulate events that did not happen yet. A trendy area of interest for businesses is <a href="https://www.morningfuture.com/en/article/2018/02/21/data-analyst-data-scientist-big-data-work/235/">data analytics</a>, a field involving
making sense of massive amounts of data.</p>
<p>Because we live in a digital world, many of the problems we encounter – troubleshooting technical computer issues, controlling devices, or managing online brands – can be solved with coding. </p>
<p>For a long time, researchers have associated coding with the <a href="https://journals.sagepub.com/doi/10.3102/00346543060001065">development of problem-solving skills</a>.
Jeannette Wing coined the term <a href="http://www.cs.cmu.edu/%7E./15110-s13/Wing06-ct.pdf">computational thinking</a> to denote attitudes and skills, including problem-solving and analyzing systems, that can be drawn from fundamental concepts of computer science. </p>
<p>This notion of computational thinking presented an opportunity for educators to explore how coding could be used as a <a href="https://www.sciencedirect.com/science/article/pii/S0747563214004634?via%3Dihub">means for developing other relevant skills, such as problem-solving, creative thinking and critical judgement</a>.</p>
<h2>Believe the hype?</h2>
<p>In the U.S., jobs for computer programmers are <a href="https://www.bls.gov/ooh/computer-and-information-technology/computer-programmers.htm#tab-6">projected to decrease</a> because contracts are being outsourced. But the hype around coding is still increasing. </p>
<p>Due to this gap, critics suggest that the movement will potentially <a href="https://www.theguardian.com/technology/2017/sep/21/coding-education-teaching-silicon-valley-wages">create a cheaper workforce</a>. Once everyone learns to code, the market will become overcrowded and employers will not need to offer a competitive salary.</p>
<p>While participating in a coding event may suggest that learning to code is easy, the truth is that episodic experience does not translate to coding skills. In making learning to code attractive, there is a danger of <a href="https://www.cs.utexas.edu/%7EEWD/transcriptions/EWD10xx/EWD1036.html">misrepresenting computer programming by oversimplifying concepts</a>. To develop as a coder requires effort, persistence and patience. </p>
<p>Computer science researcher Leon Winslow estimated in 1996 that it <a href="https://dl.acm.org/citation.cfm?id=234872">takes approximately 10 years to turn a novice into an expert coder</a>. Researchers have been debating the best way to teach introductory computer programming. There is <a href="https://www.seas.upenn.edu/%7Eeas285/Readings/Pears_SurveyTeachingIntroProgramming.pdf">no consensus yet on the answer</a>.</p>
<p>Further, how can we ensure that what kids learn today will be aligned with the jobs and needs of the future? We can only speculate.</p>
<h2>Fourth industrial revolution</h2>
<p>Klaus Schwab, founder and executive chairman of the World Economic Forum, highlights that with the <a href="https://www.weforum.org/about/the-fourth-industrial-revolution-by-klaus-schwab">emergence of the fourth industrial revolution</a>, information and the ability to manipulate it will be essential for survival in a future workforce. </p>
<p>We do know information management and manipulation will be key to creating and maintaining physical, digital and biological systems that will be part of our homes and workplaces. We know we have complex problems to solve. </p>
<p>Coding can help by processing raw observations into concrete simulations: that means using data from the past and present to create model scenarios to forecast the future. </p>
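<p>As a small, hypothetical illustration of that idea, the Python snippet below fits a straight-line trend to a few invented yearly measurements and extends it into the future. Real climate or traffic models are vastly more sophisticated, but the principle – past data in, projected scenario out – is the same.</p>
<pre><code># A minimal sketch of "using data from the past to forecast the future":
# fit a straight-line trend to yearly measurements and extend it forward.
# The numbers are invented for illustration.
years = [2015, 2016, 2017, 2018]
values = [10.0, 11.2, 12.1, 13.3]   # e.g. traffic volume, temperature, sales

# Least-squares slope and intercept, written out by hand.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(values) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
         / sum((x - mean_x) ** 2 for x in years))
intercept = mean_y - slope * mean_x

for future_year in (2019, 2020, 2021):
    print(future_year, round(intercept + slope * future_year, 2))
</code></pre>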
<p>Such simulations could be used to fight <a href="https://www.technologyreview.com/s/420595/how-coders-can-help-fight-climate-change/">climate change</a>, to <a href="https://www.geotab.com/blog/reduce-traffic-congestion/">reduce traffic</a> and even to <a href="https://www.wired.com/story/how-coders-are-fighting-bias-in-facial-recognition-software/">fight racial bias</a> in social media. </p>
<p>Creativity and critical thinking will also be fundamental, <a href="https://www.weforum.org/reports/the-future-of-jobs">as these skills will probably be among the few ways to compete with artificial intelligence</a>. </p>
<p>Workers will require swift decision-making skills in an accelerated work environment requiring flexibility and adaptability. </p>
<p>This scenario still calls for the capacity to create and understand code, but the requirements are more complex. A key to addressing future challenges through coding lies in finding opportunities to complement the learn-to-code movement. </p>
<h2>A coding mindset</h2>
<p>We propose that beginner coders start with an attractive and engaging activity, but also explicitly develop what could be called “the coding mindset.” </p>
<p>This mindset represents a gradual development of computer programming knowledge and strategies, but also includes analyzing systems, solving problems, persisting in front of errors, being resourceful and collaborating.</p>
<p>To teach the coding mindset, educators need to include more explicit foundational computer science concepts and competencies, such as <a href="https://research.hackerrank.com/developer-skills/2018/">creating algorithms to solve problems, debugging existing programs, and designing systems to accomplish new tasks or gather data</a>. </p>
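<p>As a small, invented example of two of those competencies – writing an algorithm and then debugging it – consider the Python snippet below, where a quick test exposes an edge case the first version missed.</p>
<pre><code># Writing an algorithm, then debugging it when a simple test fails.
def average(numbers):
    """Return the mean of a list of numbers."""
    total = 0
    for n in numbers:
        total += n
    return total / len(numbers)        # hidden bug: an empty list divides by zero

try:
    average([])                        # a test case exposes the problem
except ZeroDivisionError:
    print("bug found: average([]) divides by zero")

def average_fixed(numbers):
    """The debugged version makes the edge case explicit."""
    if not numbers:
        return 0.0                     # design choice: define the mean of nothing as 0
    return sum(numbers) / len(numbers)

print(average_fixed([2, 4, 6]))        # 4.0
</code></pre>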
<p>Learning to code should not be intimidating. But it should fulfil promises, not simply hype mythic dreams.</p>
<p class="fine-print"><em><span>Ann-Louise Davidson receives funding from SSHRC and Concordia University.</span></em></p>
<p class="fine-print"><em><span>Ivan Ruby does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Ivan Ruby, Ph.D. Student, Concordia University, and Ann-Louise Davidson, Concordia University Research Chair, Maker culture; Associate Professor, Educational Technology, Concordia University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Taking a second look at the learn-to-code craze</h1>
<figure><img src="https://images.theconversation.com/files/195068/original/file-20171116-15412-kukk7v.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Are computers in the classroom more helpful to students – or the companies that sell the machines?</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Schools-Too-Much-Testing/04b897dcb320476699eec2eac3db7992/13/0">AP Photo/Sue Ogrocki</a></span></figcaption></figure>
<p>Over the past several years, the idea that computer programming – or “coding” – is the key to the future for children and adults alike has become received wisdom in the United States. The aim of making <a href="https://obamawhitehouse.archives.gov/the-press-office/2016/01/30/fact-sheet-president-obama-announces-computer-science-all-initiative-0">computer science</a> a “<a href="https://obamawhitehouse.archives.gov/blog/2016/01/30/computer-science-all">new basic</a>” skill for all Americans has driven the formation of dozens of <a href="http://girlswhocode.com">nonprofit</a> <a href="http://code.org">organizations</a>, <a href="http://flatironschool.com/">coding</a> <a href="https://www.codecademy.com/">schools</a> and <a href="https://www.congress.gov/bill/115th-congress/house-bill/3316">policy programs</a>.</p>
<p>As this year’s annual <a href="https://csedweek.org/">Computer Science Education Week</a> begins, it is worth taking a closer look at this recent coding craze. The Obama administration’s “<a href="https://obamawhitehouse.archives.gov/blog/2016/01/30/computer-science-all">Computer Science For All</a>” initiative and the <a href="https://www.recode.net/2017/9/25/16276904/president-donald-trump-ivanka-tech-stem-computer-science-coding-education-amazon-google">Trump administration’s effort</a> are both based on the idea that computer programming is not only a fun and exciting activity, but a necessary skill for the jobs of the future.</p>
<p>However, the American history of these education initiatives shows that their primary beneficiaries aren’t necessarily students or workers, but rather the <a href="https://blogs.microsoft.com/on-the-issues/2016/01/30/microsoft-supports-white-house-initiative-to-expand-access-to-computer-science/">influential tech companies</a> that <a href="https://www.recode.net/2017/9/26/16364662/amazon-facebook-google-tech-300-million-donald-trump-ivanka-computer-science">promote the programs</a> in the first place. The current campaign to teach American kids to code may be the latest example of <a href="http://technet.org/membership/members">tech companies</a> using concerns about <a href="http://www.newschools.org/">education</a> to achieve their own goals. This raises some important questions about who stands to gain the most from the recent computer science push. </p>
<h2>Old rhetoric about a ‘new economy’</h2>
<p>One of the earliest corporate efforts to get computers into schools was Apple’s <a href="http://hackeducation.com/2015/02/25/kids-cant-wait-apple">“Kids Can’t Wait” program</a> in 1982. Apple co-founder Steve Jobs <a href="http://americanhistory.si.edu/comphist/sj1.html#kids">personally lobbied</a> Congress to pass the <a href="https://www.congress.gov/bill/97th-congress/house-bill/5573">Computer Equipment Contribution Act</a>, which would have allowed companies that donated computers to schools, libraries and museums to deduct the equipment’s value from their corporate income tax bills. While his efforts in Washington failed, he succeeded in his home state of California, where companies could claim a <a href="https://www.ftb.ca.gov/Archive/Law/legis/1981_FedTax.pdf">tax credit for 25 percent</a> of the value of computer donations.</p>
<p>The bill was clearly a corporate tax break, but it was framed in terms of educational gaps: According to a <a href="https://digitalcommons.law.ggu.edu/cgi/viewcontent.cgi?httpsredir=1&amp;article=1472&amp;context=caldocs_assembly">California legislative analysis</a>, the bill’s supporters felt that “computer literacy for children is becoming a necessity in today’s world” and that the bill would help in “placing needed ‘hardware’ in schools unable to afford computers in any other way.”</p>
<p>Kids Can’t Wait took advantage of Reagan-era concerns that Americans were “<a href="https://press.princeton.edu/titles/10208.html">falling behind</a>” global competitors in the “new economy.” In 1983, a U.S. Department of Education report titled “<a href="https://www2.ed.gov/pubs/NatAtRisk/index.html">A Nation at Risk</a>” warned that the country’s “once unchallenged preeminence in commerce, industry, science, and technological innovation is being overtaken by competitors throughout the world.” The report’s authors blamed the American education system for turning out graduates who were underprepared for a fast-changing, technology-infused workplace. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=396&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=396&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=396&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=498&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=498&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=498&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Federal officials, including then House Speaker Newt Gingrich, launched an effort to get classrooms online in 1995.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Watchf-AP-A-DC-USA-APHS256304-House-Speaker-New-/651639f3821f422998d24c11d7050d1d/171/0">AP Photo/Dennis Cook</a></span>
</figcaption>
</figure>
<p>Over the past 30 years, the same rhetoric has appeared again and again. In 1998, Bill Clinton <a href="http://www.presidency.ucsb.edu/ws/?pid=58384">proclaimed</a> that “access to new technology means … access to the new economy.” In 2016, U.S. Chief Technology Officer Megan Smith described the Obama administration’s coding initiative as an “<a href="https://www.theatlantic.com/education/archive/2016/02/obamas-push-for-computer-science-education/459276/">ambitious, all-hands-on-deck effort</a> to get every student in America an early start with the skills they’ll need to be part of the new economy.”</p>
<p>While technology is often framed as the solution for success in a globalized labor market, the evidence is less clear. In his 2001 book “<a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674011090">Oversold and Underused: Computers in the Classroom</a>,” education researcher Larry Cuban warned that technology on its own would not solve “education’s age-old problems,” such as <a href="https://www.theatlantic.com/business/archive/2016/08/property-taxes-and-unequal-schools/497333/">inequitable funding</a>, <a href="https://www.washingtonpost.com/local/education/crumbling-school-facilities-causing-anxiety-for-parents/2015/05/12/ca83a91a-f800-11e4-a13c-193b1241d51a_story.html">inadequate facilities</a> and <a href="https://news.vice.com/story/american-educators-teach-longer-for-less-pay-than-their-foreign-peers">overworked teachers</a>.</p>
<p>Cuban found that some educational technology initiatives from the 1990s did help students get access to computers and learn basic skills. But that didn’t necessarily <a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674011090">translate into higher-wage jobs</a> when those students entered the workforce. However, the equipment and software needed to teach them brought large windfalls for tech companies – in 1995 the industry was <a href="http://www.nytimes.com/1995/09/11/business/apple-holds-school-market-despite-decline.html">worth US$4 billion</a>.</p>
<h2>Under pressure</h2>
<p>If computers in schools didn’t work as promised two decades ago, then what’s behind the current coding push? Cuban <a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674011090">points out</a> that few school boards and administrators can resist pressure from business leaders, public officials and <a href="http://news.gallup.com/poll/184637/parents-students-computer-science-education-school.aspx">parents</a>. Organizations like the <a href="http://www.csforall.org/">CS For All Consortium</a>, for example, have a large membership of education companies who are taking advantage of funding from <a href="https://cardenas.house.gov/media-center/press-releases/c-rdenas-416d65726963612043616e20436f646520">state legislatures</a>.</p>
<p>A huge boost comes from the tech giants, too. Amazon, Facebook, Google, Microsoft and others are collectively <a href="http://blogs.edweek.org/edweek/DigitalEducation/2017/10/300_million_computer_science_pledge.html">contributing $300 million</a> to the Trump administration’s new federal initiative – no doubt seeing, as The New York Times observed, the potential to “<a href="https://www.nytimes.com/2017/09/26/technology/computer-science-stem-education.html">market their own devices and software</a> in schools as coding classes spread.” </p>
<p>This isn’t always the best deal for students. In 2013, the Los Angeles Unified School District planned to give Apple iPads to every student in every school – at a cost of <a href="https://www.wired.com/2015/05/los-angeles-edtech/">$1.3 billion</a>. The program was a fiasco: The iPads had technical problems and incomplete software that made them <a href="https://gizmodo.com/the-la-school-systems-1-3-billion-ipad-fiasco-comes-to-1733569377">essentially useless</a>. The fallout included <a href="http://www.govtech.com/education/What-Went-Wrong-with-LA-Unifieds-iPad-Program.html">investigations by the FBI and the U.S. Securities and Exchange Commission</a>, and a legal settlement in which Apple and its partners <a href="http://www.latimes.com/local/lanow/la-me-ln-la-unified-ipad-settlement-20150925-story.html">repaid the school district $6.4 million</a>.</p>
<p>However, tech companies are framing their efforts in more noble terms. In June 2017, Microsoft president Brad Smith compared the efforts of tech industry nonprofit <a href="https://code.org/">Code.org</a> to previous efforts to improve science and technology training in the United States. Recalling the <a href="https://www.gpo.gov/fdsys/pkg/STATUTE-72/pdf/STATUTE-72-Pg1580.pdf">focus on scientific research</a> that drove the <a href="https://www.history.com/topics/space-race">Space Race</a>, Smith <a href="https://www.nytimes.com/2017/06/27/technology/education-partovi-computer-science-coding-apple-microsoft.html">said</a>, “We think computer science is to the 21st century what physics was to the 20th century.” </p>
<p>Indeed, tech companies are having a very hard time <a href="https://www.bostonglobe.com/business/2016/02/19/the-war-for-tech-talent-escalates/ejUSbuPCjPLCMRYlRZIKoJ/story.html">hiring and retaining software engineers</a>. With new concerns about <a href="https://www.wired.com/2017/04/trumps-executive-order-wont-give-tech-clarity-h-1b-visas/">restrictions on visas</a> for skilled immigrant workers, the industry could definitely benefit from a workforce trained with public dollars. </p>
<p>For some tech companies, this is an explicit goal. In 2016, Oracle and Micron Technology helped write a state <a href="https://legislature.idaho.gov/wp-content/uploads/sessioninfo/2016/legislation/H0379.pdf">education bill</a> in Idaho which read, “It is essential that efforts to increase computer science instruction, kindergarten through career, be driven by the needs of industry and be developed in partnership with industry.” While two lawmakers <a href="http://www.spokesman.com/blogs/boise/2016/feb/02/house-backs-launching-computer-science-initiative-idaho-schools-though-2-members-object/">objected to the corporate influence</a> on the bill, it passed with an overwhelming majority.</p>
<h2>History repeating?</h2>
<p>Some critics argue that the goal of the coding push is to massively increase the number of programmers on the market, <a href="https://www.theguardian.com/technology/2017/sep/21/coding-education-teaching-silicon-valley-wages">depressing wages</a> and bolstering tech companies’ profit margins. Though there is no concrete evidence to support this claim, the fact remains that <a href="http://www.epi.org/files/2013/bp359-guestworkers-high-skill-labor-market-analysis.pdf">only half of college students</a> who majored in science, technology, engineering or math-related subjects get jobs in their field after graduation. That certainly casts doubt on the idea that there is a “<a href="https://www.technologyreview.com/s/608707/the-myth-of-the-skills-gap/">skills gap</a>” between workers’ abilities and employers’ needs. Concerns about these disparities have helped <a href="https://www.theatlantic.com/education/archive/2016/02/obamas-push-for-computer-science-education/459276/">justify investment</a> in tech education over the <a href="https://www.youtube.com/watch?v=w2zU9g3WU5M">past 20 years</a>. </p>
<p>As millions of dollars flow to technology companies in the name of education, they often bypass other major needs of U.S. schools. Technology in the classroom can’t solve the problems that <a href="https://www.npr.org/sections/ed/2017/05/22/529534031/president-trumps-budget-proposal-calls-for-deep-cuts-to-education">budget cuts</a>, <a href="https://www.theatlantic.com/education/archive/2015/07/too-many-kids/397451/">large class sizes</a> and <a href="https://www.washingtonpost.com/news/answer-sheet/wp/2016/08/16/think-teachers-arent-paid-enough-its-worse-than-you-think/">low teacher salaries</a> create. Worse still, new research is finding that <a href="https://press.princeton.edu/titles/11029.html">contemporary tech-driven educational reforms</a> may end up intensifying the problems they were trying to fix. </p>
<p>Who will benefit most from this new computer science push? History tells us that it may not be students.</p>
<p><em>Editor’s notes: This is an updated version of an article originally published Dec. 4, 2017. It was updated Dec. 8, 2017, to correct the year Larry Cuban’s book was first published.</em></p>
<p class="fine-print"><em><span>Kate M. Miltner does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Kate M. Miltner, Ph.D. Candidate in Communication, University of Southern California, Annenberg School for Communication and Journalism. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Building privacy right into software code</h1>
<figure><img src="https://images.theconversation.com/files/157370/original/image-20170217-10200-1yp574g.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Putting privacy right in the code.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/red-digital-keyhole-concept-cyber-security-518678134">Keyhole image via shutterstock.com</a></span></figcaption></figure>
<p>When I was 15, my parents did not allow me to use AOL Instant Messenger. All of my friends used it, so I had to find a way around this rule. I would be found out if I installed the software on my computer, so I used the <a href="https://www.lifewire.com/how-to-sign-in-aim-express-1949378">web browser version</a> instead. Savvy enough to delete my internet history every time, I thought my chatting was secret.</p>
<p>Then one day my mother confronted me with all the times I had gone on Instant Messenger in the past week. Whenever I visited the site, it had left a trail of cookies behind. Intended to make my user experience more convenient, <a href="http://computer.howstuffworks.com/internet/basics/question82.htm">cookies saved my login information for repeat visits</a>. Unfortunately, the cookies made my life less convenient: My mother knew how to inspect them to determine when I had been illicitly instant messaging.</p>
<p>Since then, I have been very interested in protecting user privacy. I studied computer science in college and ended up pursuing a career in the field. I became fascinated with programming languages, the construction materials for the information age. <a href="https://www.technologyreview.com/s/536356/toolkits-for-the-mind/">Languages shape how programmers think about software, and how they construct it</a>, by making certain tasks easier and others harder. For instance, some languages allow rapid website prototyping, but don’t handle large amounts of traffic very well.</p>
<p>Regarding my main interest, I discovered that many of today’s most common languages make it difficult for programmers to protect users’ privacy and security. It’s bad enough that this state of affairs means programmers have lots of opportunities to make privacy-violating errors. Even worse, it means we users have trouble understanding what computer programs are doing with our information – even as we increasingly rely on them in our daily lives.</p>
<h2>A history of insecurity</h2>
<p>As part of the first generation who <a href="https://doi.org/10.1177/1461444806059871">came of age on the internet</a>, I enjoyed the benefits of participating in digital life, like instant messaging my friends when I was supposed to be doing homework. I also knew there was the potential for unintended information leaks.</p>
<p>A then-crush once told me that he took advantage of a fleeting Facebook opportunity to discover that I was among his top five stalkers. For a brief period of time, when a user <a href="http://gawker.com/390004/whos-stalking-you-on-facebook">typed “.” into the search bar</a>, the autocompleted searches were the users who most searched for them. I was mortified, and avoided even casual browsing on Facebook for a while.</p>
<p>This small social crisis was the result of a programming problem, a combination of both human programmer error and a shortcoming of the language and environment in which that human worked. And we can’t blame the programmer, because the languages Facebook uses were not built with modern security and privacy in mind. They need the programmer to manage everything by hand.</p>
<h2>Spreading protections across the program</h2>
<p>As those older languages developed into today’s programming environments, security and privacy remained as add-ons, rather than built-in automatic functions. Though programmers try to keep instructions for different functions separate, code dedicated to enforcing privacy and security concerns gets mixed in with other code, and spread all throughout the software.</p>
<p>The decentralized nature of information leaks is what allowed my mother to catch me messaging. The web browser I used stored evidence of my secret chatting in more than one place – in both the history of what sites I visited and in the cookie trail I left behind. Clearing only one of them left me vulnerable to my mother’s scrutiny.</p>
<p>If the program had been built in such a way that all evidence of my activity was handled together, it could have known that when I deleted the history, I wanted the cookies deleted too. But it wasn’t, it didn’t and I got caught.</p>
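<p>A toy Python model of that situation might look like the snippet below (the details are invented): the browser keeps my activity in two separate places, and “clear history” only empties one of them.</p>
<pre><code># Two separate records of the same activity, handled separately.
history = []    # list of visited sites
cookies = {}    # site -> saved login data, kept to make repeat visits easier

def visit(site):
    history.append(site)
    cookies[site] = "login token"

def clear_history():
    history.clear()                 # ...but nobody remembers to clear the cookies

visit("aim.example.com")
clear_history()
print(history)   # []  -- looks clean
print(cookies)   # {'aim.example.com': 'login token'}  -- the leftover trail
</code></pre>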
<h2>Making programmers do the work</h2>
<p>The problem gets even more difficult in modern online systems. Consider what happens when I share my location – let’s say Disney World – on Facebook with friends who are nearby. On Facebook, this location will be displayed on my “timeline.” But it will also be used for other purposes: Visitors to Disney World’s Facebook page can see <a href="http://www.techlicious.com/tip/complete-guide-to-facebook-privacy-settings/">which of their friends has also been to the amusement park</a>. I can tell Facebook to limit who can see that information about me, so people I don’t know can’t go to Disney World’s page and see “Jean Yang checked in 1 hour ago.” </p>
<p>It is the programmer’s job to enforce these privacy restrictions. Because privacy-related code is scattered throughout all the programs Facebook uses to run its systems, the programmer must be vigilant everywhere. To make sure nobody finds out where I am unless I want them to, the programmer must tell the system to check my privacy settings everywhere it uses my location value, directly or indirectly. </p>
<p>Every time a programmer writes instructions to refer to my location – when displaying my profile, the Disney World page, the results of queries such as “friends at Disney World” and countless other places – she has to remember to include instructions to check my privacy settings and act accordingly.</p>
<p>This results in a tangle of code connecting the rules and their implementation. It is easy for programmers to make mistakes, and difficult for anybody else to check that the code is doing what it’s supposed to do.</p>
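<p>In code, the situation looks roughly like the hypothetical Python sketch below – every function and field name here is invented, not Facebook’s actual code. The point is that the same permission check must be repeated wherever the location value appears, and forgetting it in any one place silently leaks the data.</p>
<pre><code># Privacy checks scattered across the program: every use of the
# location value must remember to call can_see_location().
def can_see_location(owner, viewer):
    return viewer in owner["friends"]

def render_profile(owner, viewer):
    if can_see_location(owner, viewer):                    # check #1
        return f"{owner['name']} is at {owner['location']}"
    return owner["name"]

def render_place_page(place, users, viewer):
    lines = []
    for owner in users:
        if owner["location"] == place and can_see_location(owner, viewer):  # check #2
            lines.append(f"{owner['name']} checked in at {place}")
    return lines

def friends_at(place, users, viewer):
    # check #3 -- leave it out here, and strangers learn where I am
    return [u["name"] for u in users
            if u["location"] == place and can_see_location(u, viewer)]
</code></pre>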
<h2>Shifting the burden to computers</h2>
<p>The best way to avoid these problems is to take the task of privacy protection away from humans and entrust it to the computers themselves. We can – and should – develop programming models that allow us to more easily incorporate security and privacy into software. <a href="https://www.cs.cornell.edu/andru/papers/jsac/sm-jsac03.pdf">Prior research in what is called “language-based information flow”</a> looks at how to automatically check programs to ensure that sloppy programming is not inadvertently violating privacy or other data-protection rules.</p>
<p>Even with tools that can check programs, however, the programmer needs to do the heavy lifting of writing programs that do not leak information. This still involves writing those labor-intensive and error-prone privacy checks throughout the program. My work on a new programming model called “<a href="http://projects.csail.mit.edu/jeeves/">policy-agnostic programming</a>” goes one step farther, making sloppy programming impossible. In these systems, programmers attach security and privacy restrictions directly to every data value.</p>
<p>For instance, they could label location as information requiring protection. The program itself would understand that my “Disney World” location should be shown only to my close friends. They could see that not only on my own page, but on Disney World’s page.</p>
<p>But people I don’t know would be shown a less specific value in both places. Perhaps friends of my friends might see “away from home,” and total strangers could only learn that I was “in the United States.” Looking at my page, they wouldn’t be able to tell exactly where I am. And if they went to the Disney World page, I wouldn’t appear there either.</p>
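<p>The rough Python sketch below shows the flavour of this approach – it is only an illustration of the idea, not the actual syntax of the policy-agnostic systems linked above. The policy travels with the value itself, and every place that displays the value goes through the same gate.</p>
<pre><code># A value that carries its own privacy policy (illustrative only).
class Protected:
    def __init__(self, views, policy):
        self.views = views      # level of detail -> what that level may see
        self.policy = policy    # function mapping a viewer to a level of detail

    def show_to(self, viewer):
        return self.views[self.policy(viewer)]

def location_policy(viewer):
    if viewer == "close friend":
        return "exact"
    if viewer == "friend of a friend":
        return "vague"
    return "country"

my_location = Protected(
    {"exact": "Disney World",
     "vague": "away from home",
     "country": "in the United States"},
    location_policy,
)

# My page, Disney World's page, search results: they all ask the value
# itself what this particular viewer is allowed to see.
print(my_location.show_to("close friend"))     # Disney World
print(my_location.show_to("total stranger"))   # in the United States
</code></pre>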
<p>With this type of structure, the humans need no longer write code to repeatedly check which information should be shared; the computer system handles that automatically. That means one less thing for programmers to think about. It also helps users feel more confident that some element of a complicated piece of software – much less a human error – won’t violate their personal privacy settings.</p>
<p>With software programs handling our driving, shopping and even <a href="http://www.lifehacker.co.uk/2015/02/16/tinderbox-bot-intelligently-swipes-tinder-matches-can-even-start-conversation">choosing potential dates</a>, we have much bigger problems than our mothers seeing our internet cookies. If our computers can protect our privacy, that would be a huge improvement to our rapidly changing world.</p>
<p class="fine-print"><em><span>Jean Yang is an Assistant Professor in the Computer Science Department at Carnegie Mellon University. Jean receives funding from DARPA and the National Science Foundation.</span></em></p>
<p class="fine-print"><em>Jean Yang, Assistant Professor of Computer Science, Carnegie Mellon University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Hunting hackers: An ethical hacker explains how to track down the bad guys</h1>
<figure><img src="https://images.theconversation.com/files/154807/original/image-20170130-7689-1bxpv08.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Looking deep into computer activities.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/3d-person-magnifier-standing-on-laptop-99185414">Via shutterstock.com</a></span></figcaption></figure>
<p>When a cyberattack occurs, ethical hackers are called in to be digital detectives. In a certain sense, they are like regular police detectives on TV. They have to search computer systems to find ways an intruder might have come in – a digital door or window left unlocked, perhaps. They look for evidence of entry the attacker left behind, like an electronic footprint in the dirt. And they try to determine what might have been copied or taken.</p>
<p>Understanding this process has become more important to the public in light of recent events in the news. In October 2016, the U.S. officially said Russia was trying to embarrass respected political figures and <a href="http://www.cnn.com/2016/10/07/politics/us-blames-russia-for-targeting-election-systems/">interfere with the U.S. presidential election process</a>. Specifically, <a href="http://www.politico.com/story/2017/01/white-house-vladimir-putin-election-hacking-donald-trump-233299">the Obama administration formally blamed Russia</a> for hacking into the Democratic National Committee’s computer systems. The statement hinged on the investigative capabilities of American ethical hackers working for both private companies and government agencies.</p>
<p>But how do people track down hackers, figuring out what they have done and who they are? What’s involved, and who does this sort of work? The answer is that ethical hackers like me dig deep into digital systems, examining files logging users’ activity and deconstructing malicious software. We often team up with intelligence, legal and business experts, who bring outside expertise to add context for what we can find in the electronic record.</p>
<h2>Detecting an intrusion</h2>
<p>Typically, an investigation begins when someone, or something, detects an unauthorized intrusion. Most network administrators set up <a href="https://www.lifewire.com/introduction-to-intrusion-detection-systems-ids-2486799">intrusion detection systems</a> to help them keep an eye on things. Much like an alarm system on a house, the intrusion detection software watches specific areas of a network, such as where it connects to other networks or where sensitive data are stored. </p>
<p>When it spots unusual activity, like an unauthorized user or a surprisingly high amount of data traffic to a particular off-site server, the intrusion detection system alerts network administrators. They act as <a href="http://www.csoonline.com/article/2935584/data-breach/cybersecurity-first-responders.html">cybersecurity first responders</a> – like digital firefighters, police officers and paramedics. They react to the alert and try to figure out what happened to trigger it.</p>
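<p>A single rule in such a system can be very simple. The Python sketch below – with an invented log format and threshold – flags any internal machine that sends an unusually large amount of data to a single outside server, the kind of pattern that would trigger an alert to the administrators.</p>
<pre><code># A toy intrusion-detection rule: alert when too much data flows
# from one internal machine to one outside address.
from collections import defaultdict

THRESHOLD_BYTES = 500_000_000   # "surprisingly high" for this pretend network

def check_transfers(log_lines):
    """Each log line: 'source_ip destination_ip bytes_sent' (invented format)."""
    totals = defaultdict(int)
    for line in log_lines:
        src, dst, sent = line.split()
        totals[(src, dst)] += int(sent)
    return [(src, dst, total) for (src, dst), total in totals.items()
            if total > THRESHOLD_BYTES]

alerts = check_transfers([
    "10.0.0.5 203.0.113.9 400000000",
    "10.0.0.5 203.0.113.9 300000000",   # the same pair again pushes it over the limit
])
print(alerts)   # [('10.0.0.5', '203.0.113.9', 700000000)]
</code></pre>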
<p>This can include a <a href="http://www.ibm.com/developerworks/library/wa-webattack/wa-webattack-pdf.pdf">wide range of attacks</a>, from random, unstructured incursions by individuals and small groups to well-organized and precision-targeted strikes from hackers backed by government agencies. Any of them can set off an intrusion alarm in a variety of ways.</p>
<h2>The immediate response</h2>
<p>Many times, the initial investigation centers on collecting, organizing and analyzing large amounts of network data. <a href="https://web.stanford.edu/class/msande91si/www-spr04/readings/week1/InternetWhitepaper.htm">Computer networking equipment and servers keep records</a> of who connects, where the connection comes from and what the user does on the system.</p>
<p>Depending on what that analysis shows, the administrator may be able to fix the problem right away, such as by preventing a particular user from logging in, or <a href="https://www.pluralsight.com/blog/it-ops/access-control-list-concepts">blocking all network traffic coming from a particular place</a>. But a more complex issue could require calling a sophisticated <a href="https://www.cert.org/incident-management/csirt-development/csirt-faq.cfm">incident response team</a>.</p>
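<p>Some of that first response can be scripted. The Python sketch below uses an invented log format to count failed logins per source address and then prints a firewall rule – here an iptables command, one common way to drop traffic from an address – for any source that stands out.</p>
<pre><code># Triage sketch: find addresses with many failed logins and suggest a block.
from collections import Counter

def suspicious_sources(log_lines, limit=20):
    """Each log line: 'timestamp source_ip LOGIN_OK|LOGIN_FAIL user' (invented format)."""
    failures = Counter(
        line.split()[1] for line in log_lines if line.split()[2] == "LOGIN_FAIL"
    )
    return [ip for ip, count in failures.items() if count >= limit]

sample_log = ["2017-01-30T02:11:04 198.51.100.7 LOGIN_FAIL admin"] * 25
sample_log.append("2017-01-30T02:20:00 10.0.0.8 LOGIN_OK alice")

for ip in suspicious_sources(sample_log):
    print(f"iptables -A INPUT -s {ip} -j DROP")   # block all traffic from that place
</code></pre>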
<p>Ideally, each company or organization should have its own internal team or rapid access to a team from outside. Most countries, including the U.S., have their own <a href="https://www.us-cert.gov/">national response teams</a>, often government employees supplemented by private contractors with particular expertise. These teams are groups of ethical hackers who are trained to investigate deeper or more challenging intrusions. In addition to any self-taught skills, these people often have additional experience from the military and higher education. Their most vital expertise is in what is called “<a href="https://weatherhead.case.edu/news/2015/02/23/how-hackers-think">just-in-time learning</a>,” or figuring out how to apply their skills to new situations on the fly.</p>
<p>They conduct larger-scale digital forensic inquiries and analyze malicious software that may have been introduced during the attack. Typically, these teams work to stop the attack and prevent future attacks of that type. The teams can, at times, hunt down the attackers.</p>
<h2>Attributing an attack</h2>
<p>Determining the identity or location of a cyberattacker is incredibly difficult because <a href="https://www.wired.com/2016/12/hacker-lexicon-attribution-problem/">there’s no physical evidence to collect or observe</a>. Sophisticated hackers can cover their digital tracks. Although there are many different <a href="https://www.researchgate.net/publication/235170094_Techniques_for_Cyber_Attack_Attribution">attribution techniques</a>, the best approach takes advantage of more than one. These techniques often include looking very closely at any files or data left behind by the attackers, or stolen and released as part of the incursion. </p>
<p>Response teams can analyze the grammar used in comments that are commonly embedded in software code, as programmers leave notes to each other or for future developers. They can <a href="http://www.lawpro.ca/LawPRO/metadata.pdf">inspect files’ metadata</a> to see whether text has been translated from one language to another. </p>
<p>For example, <a href="http://nationalinterest.org/feature/the-weird-logic-behind-russias-alleged-hacking-17963">in the DNC hack</a>, American cyber experts could look at the specific files published on Wikileaks. Those files’ metadata indicated that some of them contained text converted from the Cyrillic characters of the Russian alphabet to the Latin characters of English.</p>
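<p>A toy version of that kind of check is easy to sketch. The snippet below is a simplified illustration rather than the investigators’ actual method: it scans some hypothetical metadata fields for characters from the Cyrillic script, one small hint that a document passed through a Russian-language system at some point.</p>
<pre><code>import unicodedata

def contains_cyrillic(text):
    """Return True if any character in the text comes from a Cyrillic script."""
    for ch in text:
        if unicodedata.name(ch, "").startswith("CYRILLIC"):
            return True
    return False

# Hypothetical metadata fields recovered from a leaked document.
metadata = {"last_modified_by": "Иван Петров", "company": "", "title": "Report"}
for field, value in metadata.items():
    if contains_cyrillic(value):
        print(f"Clue: metadata field '{field}' contains Cyrillic text: {value}")
</code></pre>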
<p>Investigators can even <a href="https://arstechnica.com/security/2016/06/guccifer-leak-of-dnc-trump-research-has-a-russians-fingerprints-on-it/">identify specific sociocultural references</a> that can provide clues to who conducted the attack. The person or group who claimed responsibility for the DNC hack – <a href="http://www.bbc.com/news/technology-36913000">using the name Guccifer 2.0</a> – <a href="http://motherboard.vice.com/read/dnc-hacker-guccifer-20-interview">claimed to be Romanian</a>. But he had a hard time speaking Romanian fluently, <a href="http://motherboard.vice.com/read/dnc-hacker-guccifer-20-interview">suggesting he wasn’t actually a native</a>. In addition, Guccifer 2.0 used a smiley-face symbol different from the one Americans typically type. Instead of typing “:)”, <a href="https://guccifer2.wordpress.com/2016/06/15/dnc/">Guccifer 2.0 just typed “)”</a> – leaving out the colon, a habit that suggested he was Eastern European.</p>
<p>Experienced cyber-investigators build an edge by tracking many significant threats over time. Just like with “cold cases” in regular police work, comparing the latest attack to previous ones can sometimes reveal links, adding pieces to the puzzle. </p>
<p>This is particularly true when dealing with what are called “<a href="https://www.secureworks.com/blog/advanced-persistent-threats-apt-a">advanced persistent threats</a>.” These are attacks that progress gradually, with very sophisticated tactics unfolding over long periods of time. Often attackers custom-design these intrusions to <a href="https://nakedsecurity.sophos.com/2012/10/30/whodunnit-aramco-hack/">exploit specific weaknesses in their targets’ computer systems</a>. That customization can reveal clues, such as programming style – or even choice of programming language – that combine with other information to suggest who might be responsible.</p>
<p>The <a href="https://etd.ohiolink.edu/pg_10?0::NO:10:P10_ACCESSION_NUM:case1427809862">cyber-defense community has another advantage</a>: While attackers typically work alone or in small groups and in secret, ethical hackers work together across the world. When a clue emerges in one investigation, it’s <a href="http://dx.doi.org/10.2139/ssrn.2326634">common for hackers to share that information</a> – either publicly on a blog or in a scholarly paper, or just directly with other known and trusted investigators. In this way, we build a body of evidence and layers of experience in drawing conclusions.</p>
<p>Very often, a report from an attack investigation will yield clues or suggestions, perhaps that an attacker was Russian or was <a href="https://www.wired.com/2014/12/sony-hack-what-we-know/">using a keyboard with Korean characters</a>. Only when the conclusions are <a href="https://krebsonsecurity.com/2017/01/who-is-anna-senpai-the-mirai-worm-author/">clear and irrefutable will investigators directly accuse specific attackers</a>. When they do, though, they often share all the information they have. That bolsters the credibility of their conclusions, helps others identify weaknesses or failures of logic – and it shares all that knowledge with the rest of the community, making the next investigation that much easier.</p>
<p>The most skilled hackers can write self-erasing code, fake their web addresses, route their attacks through the devices of innocent victims and make it appear that they are in multiple countries at once. This makes arresting them very hard. In some attacks, we are able to identify the perpetrator, as happened to celebrity-email hacker <a href="https://www.nytimes.com/2014/11/11/world/europe/for-guccifer-hacking-was-easy-prison-is-hard-.html">Guccifer 1.0</a>, who was <a href="http://www.reuters.com/article/us-usa-cyber-guccifer-idUSKCN1175FB">arrested and imprisoned</a>. </p>
<p>But when the attack is more advanced, coordinated across multiple media platforms and leveraging skillful social engineering over years, it’s likely a government-sponsored effort, making arrests unlikely. That’s what happened when Russia hacked the U.S. presidential election. Of course, <a href="http://www.cnn.com/2016/12/29/politics/russia-sanctions-announced-by-white-house/">diplomatic sanctions are an option</a>. But pointing fingers between world superpowers is always a dangerous game.</p><img src="https://counter.theconversation.com/content/70927/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Timothy Summers is the CEO of Summers &amp; Company, a cyber strategy consulting firm. That company does not conduct intrusion response work, but does advise clients about minimizing risk of future cyberattacks. He also has provided input to other companies in support of their development of cybersecurity training programs.</span></em></p>Cyberdetectives look for digital doors or windows left unlocked, find electronic footprints in the dirt and examine malicious software for clues about who broke in, what they took and why.Timothy Summers, Director of Innovation, Entrepreneurship, and Engagement, University of MarylandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/717272017-01-23T18:02:27Z2017-01-23T18:02:27ZMobile phones offer a new way for Africa's students to learn programming<figure><img src="https://images.theconversation.com/files/153854/original/image-20170123-8067-im6k7g.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Students could learn how to program with the right applications on their mobile phones.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>It’s not easy for Computer Science students at most universities in Africa to practice and develop their programming skills. They have the ability to program, but access to desktop or laptop computers might be a problem. I experienced this first-hand while teaching programming at a Kenyan university.</p>
<p>Most African universities have public computer laboratories, but these tend to be used to teach various classes, hence limiting students’ access. Many institutions may also have very few computers for a large number of students. This means that students might need to access computers outside the classroom in order to practise programming. Yet, most people in developing countries <a href="http://www.pewglobal.org/2015/03/19/internet-seen-as-positive-influence-on-education-but-negative-influence-on-morality-in-emerging-and-developing-nations/technology-report-15/">do not</a> own computers at home.</p>
<p>Limited access to PCs aggravates the learning difficulties faced by programming students. This is especially true because programming is best learnt through practice. However, most students own mobile phones. Cell phones are the most <a href="http://www.pewglobal.org/2015/04/15/cell-phones-in-africa-communication-lifeline/">widely used</a> devices among students in developing countries – and, indeed, among Africans more generally. </p>
<p>I therefore set out to develop a solution that would enable students to learn programming using mobile phones. The biggest challenge was turning mobile phones into functional programming environments. After all, they aren’t designed with programming in mind. They have small screens and small keypads that impede their use as programming platforms.</p>
<p>So I designed what I called scaffolding – or supporting – techniques that allow for the effective construction of programs on mobile phones using the Java language. These techniques can also address new learners’ needs. <a href="https://open.uct.ac.za/handle/11427/16609">The results</a>, taken from my work with 182 students at four universities in South Africa and Kenya, are encouraging.</p>
<h2>Techniques for mobile phones</h2>
<p>The scaffolding techniques I designed can be used on Android platforms. They are specifically aimed at students learning <a href="https://docs.oracle.com/javase/tutorial/java/concepts/">Object Oriented Programming</a> using Java.</p>
<p>The technology works by offering three types of scaffolding techniques:</p>
<ol>
<li><p>Automatic scaffolding, which consists of supporting techniques automatically presented on the interface. These include instructions on which buttons to press, error prompts and suggestions to view an example while working on a program. These techniques fade away as the student becomes more familiar with the application.</p></li>
<li><p>Static scaffolding, which involves supporting techniques that never fade away. I included two such techniques. One presents the layout of a Java program on the main interface, so the student always has a visual representation before interacting with the program. This technique is said to particularly <a href="http://web.media.mit.edu/%7Eedith/publications/1996-persp.taking.pdf">support</a> a new student’s learning. The second static scaffolding technique involves creating the program one part at a time, breaking it into smaller pieces. This is an effective way to support the creation of a program on small-screen devices like mobile phones.</p></li>
<li><p>User-initiated scaffolding, which consists of supporting techniques that a student can activate on demand, such as hints, worked examples and tutorials.</p></li>
</ol>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=237&amp;fit=clip" srcset="https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=708&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=708&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=708&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=890&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=890&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=890&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A student puts the scaffolding for mobile phones to the test.</span>
<span class="attribution"><span class="source">Dr Chao Mbogo</span></span>
</figcaption>
</figure>
<p>I tested these techniques on the students while they constructed Java programs on mobile phones. Their feedback was largely positive and suggested that scaffolding techniques designed specifically for mobile phones, and based on students’ needs, could support the learning of programming on these devices. </p>
<h2>Findings and challenges</h2>
<p>Desktop programming environments have complex interfaces. Their large screens make it possible to expose students to large amounts of information in one sitting, and to give them support, in one place, without having to leave the interface. Providing all this functionality and support in one interface doesn’t work well on small screens.</p>
<p>But my research suggests that small screens have some advantages. Students told me that the simpler interface on a small screen helped them to focus on the task at hand. When they had to create a program one step at a time, they didn’t have to grasp a huge amount of information all at once. This may assist their learning in the long run. </p>
<p>Certainly, the study wasn’t perfect. The scaffolding I developed was only for Android platforms, which excludes users from other platforms such as Windows and iOS. And while mobile phones are far more common among students than private desktop or laptop computers, there are some students who do not have and cannot afford even these devices. </p>
<p>My research is not over yet. My next steps will take these problems into account. For example, the techniques I designed will be tested on other programming languages – such as C++ – and on other mobile platforms. I am also keen to investigate the design of such scaffolding for tablets, which are becoming more common among African university students.</p>
<h2>Next steps</h2>
<p>The study I’ve described here relates to my PhD, which I was awarded at the University of Cape Town in December 2015. Since then a number of my peers have suggested other areas to explore and improve. From 2017 my programming students at Kenya Methodist University will use the prototype I tested in a longitudinal study. None of them have ever used a mobile phone to program, so this will be a new experience.</p>
<p>For the foreseeable future, African universities and other institutions offering programming subjects will continue to struggle with resources. As long as this situation persists and students’ access to mobile phones and tablets grows, the techniques I’m developing could offer a smart solution that allows the continent to keep producing young programmers.</p><img src="https://counter.theconversation.com/content/71727/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Chao Charity Mbogo received funding for her Ph.D. research, related fieldwork and related conference grants from Hasso Plattner Institute (HPI), Department of Computer Science at the University of Cape Town, Google, The International Network for Postgraduate Students in the area of ICT4D (IPID), ACM-W, and Schlumberger’s Faculty for the Future fellowship. </span></em></p>Computer programming is best learned through practice, but students in developing economies don't always have access to desktop or laptop computers. Mobile phones may be the solution.Chao Charity Mbogo, Researcher and Lecturer of Computer Science, Mentor, Kenya Methodist UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/668982016-10-20T10:00:19Z2016-10-20T10:00:19ZMoving toward computing at the speed of thought<figure><img src="https://images.theconversation.com/files/141687/original/image-20161013-3950-fue6g8.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">When will computers and humans interact fully?</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/pic-150572768/stock-photo-background-design-of-human-head-and-symbolic-elements-on-the-subject-of-human-mind-consciousness-imagination-science-and-creativity.html">Illustration via shutterstock.com</a></span></figcaption></figure><p>The <a href="http://www.computerhistory.org/timeline/1950/">first computers</a> cost millions of dollars and were locked inside rooms equipped with special electrical circuits and air conditioning. The only people who could use them had been trained to write programs in that specific computer’s language. Today, <a href="https://en.wikipedia.org/wiki/Kinect">gesture-based interactions</a>, using <a href="https://www.raspberrypi.org/blog/the-eagerly-awaited-raspberry-pi-display/">multitouch pads and touchscreens</a>, and exploration of virtual 3D spaces allow us to interact with digital devices in ways very similar to how we interact with physical objects.</p>
<p>This newly immersive world is not only open for more people to experience; it also allows almost anyone to exercise their own creativity and innovative tendencies. No longer are these capabilities dependent on being a math whiz or a coding expert: Mozilla’s “<a href="https://aframe.io/">A-Frame</a>” is making the task of building complex virtual reality models much easier for programmers. And Google’s “<a href="https://www.tiltbrush.com/">Tilt Brush</a>” software allows people to build and edit 3D worlds without any programming skills at all. </p>
<p>My own research hopes to develop the next phase of <a href="http://www.3dinputbook.com/books/interactivedesign.html">human-computer interaction</a>. We are monitoring people’s brain activity in real time and recognizing specific thoughts (of “tree” versus “dog” or of a particular pizza topping). It will be yet another step in the historical progression that has brought technology to the masses – and will widen its use even more in the coming years.</p>
<h2>Reducing the expertise needed</h2>
<p>From those early computers dependent on machine-specific programming languages, the first major improvement allowing more people to use computers was the development of the <a href="http://groups.engin.umd.umich.edu/CIS/course.des/cis400/fortran/fortran.html">Fortran programming language</a>. It expanded the range of programmers to scientists and engineers who were comfortable with mathematical expressions. This was the <a href="http://www-03.ibm.com/ibm/history/ibm100/us/en/icons/punchcard/">era of punch cards</a>, when programs were written by <a href="http://www.computerhistory.org/revolution/punched-cards/2">punching holes in cardstock</a>, and output had no graphics – <a href="http://www.theatlantic.com/technology/archive/2014/01/the-lost-ancestors-of-ascii-art/283445/">only keyboard characters</a>.</p>
<p>By the late 1960s mechanical plotters let programmers draw simple pictures by telling a computer to raise or lower a pen, and move it a certain distance horizontally or vertically on a piece of paper. The commands and graphics were simple, but even <a href="https://ia801600.us.archive.org/4/items/bitsavers_calcompProtromechanicalPlottersOct80_1482960/Programming_CalComp_Electromechanical_Plotters_Oct80.pdf">drawing a basic curve required understanding trigonometry</a>, to specify the very small intervals of horizontal and vertical lines that would look like a curve once finished. </p>
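<p>The arithmetic was simple but tedious, which is part of what kept plotting out of reach for non-programmers. The sketch below is a modern, hypothetical re-creation of the technique rather than actual plotter code: it uses sine and cosine to turn a quarter circle into a list of short pen movements of the sort those machines understood.</p>
<pre><code># Hypothetical re-creation of the plotter-era technique: approximate a
# quarter circle of radius 100 with many short, straight pen moves.
import math

RADIUS = 100.0
STEPS = 90  # one short segment per degree of arc

x_prev, y_prev = RADIUS, 0.0  # starting point at angle 0
for step in range(1, STEPS + 1):
    angle = math.radians(step)
    x = RADIUS * math.cos(angle)
    y = RADIUS * math.sin(angle)
    dx, dy = x - x_prev, y - y_prev  # tiny horizontal and vertical offsets
    print(f"MOVE PEN BY ({dx:.2f}, {dy:.2f})")  # stand-in for a real plotter command
    x_prev, y_prev = x, y
</code></pre>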
<p>The 1980s introduced what has become the familiar windows, icons and mouse interface. That gave nonprogrammers a much easier time creating images – so much so that many <a href="http://procartoon.com/5-top-drawing-tablets-cartooning/">comic strip authors</a> and artists <a href="http://comicsforbeginners.com/digital-vs-drawing-paper/">stopped drawing in ink</a> and began <a href="http://blog.dilbert.com/post/102544363506/cartoonist-tools">working with computer tablets</a>. <a href="http://www.theverge.com/2014/6/12/5804070/the-amazing-animation-software-behind-how-to-train-your-dragon-2">Animated films went digital</a>, as programmers developed sophisticated proprietary tools for use by animators. </p>
<p>Simpler tools became <a href="https://en.wikipedia.org/wiki/List_of_3D_animation_software">commercially available for consumers</a>. In the early 1990s the <a href="https://www.opengl.org/">OpenGL library</a> allowed programmers to build 2D and 3D digital models and add color, movement and interaction to these models. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/141680/original/image-20161013-3944-8gseyu.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/141680/original/image-20161013-3944-8gseyu.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=237&amp;fit=clip" srcset="https://images.theconversation.com/files/141680/original/image-20161013-3944-8gseyu.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=450&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/141680/original/image-20161013-3944-8gseyu.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=450&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/141680/original/image-20161013-3944-8gseyu.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=450&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/141680/original/image-20161013-3944-8gseyu.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=566&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/141680/original/image-20161013-3944-8gseyu.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=566&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/141680/original/image-20161013-3944-8gseyu.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=566&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Inside a CAVE system.</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File%3ACAVE_Crayoland.jpg">Davepape</a></span>
</figcaption>
</figure>
<p>In recent years, 3D displays have become <a href="http://archive.ncsa.illinois.edu/Cyberia/VETopLevels/VR.Systems.html">much smaller and cheaper</a> than the multi-million-dollar CAVE and similar immersive systems of the 1990s. Those older systems needed a space 30 feet wide, 30 feet long and 20 feet high to fit their rear-projection equipment. Now <a href="https://www.cnet.com/products/samsung-gear-vr-innovator-edition-for-galaxy-s6-and-s6-edge/review/">smartphone holders</a> can provide a <a href="https://www.cnet.com/products/google-cardboard/">personal 3D display</a> for less than US$100. </p>
<p>User interfaces have gotten similarly more powerful. Multitouch pads and touchscreens recognize movements of multiple fingers on a surface, while devices such as the <a href="http://www.nintendo.com/wiiu">Wii</a> and <a href="http://www.xbox.com/en-US/xbox-one/accessories/kinect">Kinect</a> recognize movements of arms and legs. A company called Fove has been working to develop a VR headset that will <a href="http://www.getfove.com/">track users’ eyes</a>, and which will, among other capabilities, let people make eye contact with virtual characters.</p>
<h2>Planning longer term</h2>
<p>My own research is helping to move us toward what might be called “<a href="http://www.arcstone.com/blog/2007/11/computing-at-the-speed-of-thought">computing at the speed of thought</a>.” Low-cost open-source projects such as <a href="http://openbci.com/">OpenBCI</a> allow people to assemble their own neuroheadsets that capture brain activity noninvasively. </p>
<p>Ten to 15 years from now, hardware/software systems using those sorts of neuroheadsets could assist me by recognizing the nouns I’ve thought about in the past few minutes. If such a system replayed the topics of my recent thoughts, I could retrace my steps and remember what triggered my most recent thought.</p>
<p>With more sophistication, perhaps a writer could wear an inexpensive neuroheadset, imagine characters, an environment and their interactions. The computer could deliver the <a href="https://www.washingtonpost.com/archive/lifestyle/1979/03/22/rosemary-rogers-the-princess-of-passion-pulp/11615603-34cf-4910-8186-069db7e64e21/">first draft of a short story</a>, either as a text file or even as a video file showing the scenes and dialogue <a href="http://www.one-story.com/index.php?page=stories&amp;story_id=209">generated in the writer’s mind</a>.</p>
<h2>Working toward the future</h2>
<p>Once human thought can communicate directly with computers, a new world will open before us. One day, I would like to play games in a <a href="https://www.theguardian.com/technology/2016/oct/12/video-game-characters-emotional-ai-developers?CMP=Share_iOSApp_Other">virtual world that incorporates social dynamics</a> as in the <a href="https://promweek.soe.ucsc.edu/2011/11/12/gameplay-and-social-physics/">experimental games</a> “<a href="https://promweek.soe.ucsc.edu/">Prom Week</a>” and “<a href="http://www.interactivestory.net/">Façade</a>” and in the commercial game “<a href="https://versu.com/2014/05/28/blood-laurels/">Blood &amp; Laurels</a>.” </p>
<p>This type of experience would not be limited to game play. Software platforms such as an enhanced <a href="https://versu.com/">Versu</a> could enable me to <a href="http://www.nytimes.com/2014/07/07/arts/video-games/text-games-in-a-new-era-of-stories.html?_r=0">write those kinds of games</a>, developing characters in the same virtual environments they’ll inhabit.</p>
<p>Years ago, I envisioned an easily modifiable application that allows me to have <a href="http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.36.2075&amp;rep=rep1&amp;type=pdf">stacks of virtual papers</a> hovering around me that I can easily grab and rifle through to find a reference I need for a project. I would love that. I would also really enjoy playing “<a href="http://harrypotter.wikia.com/wiki/Quidditch">Quidditch</a>” with other people while we all experience the sensation of flying via head-mounted displays and control our brooms by tilting and twisting our bodies.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/PPHkfVX4rjs?wmode=transparent&amp;start=0" frameborder="0" allowfullscreen></iframe>
<figcaption><span class="caption">An early, single-player virtual reality version of ‘Quidditch.’</span></figcaption>
</figure>
<p>Once low-cost motion capture becomes available, I envision new forms of digital story-telling. Imagine a group of friends acting out a story, then matching their bodies and their captured movements to 3D avatars to reenact the tale in a synthetic world. They could use multiple virtual cameras to “film” the action from multiple perspectives, and then construct a video.</p>
<p>This sort of creativity could lead to much more complex projects, all conceived in creators’ minds and made into virtual experiences. Amateur historians without programming skills may one day be able to construct augmented reality systems in which they can superimpose onto views of the real world selected images from historic photos or digital models of buildings that no longer exist. Eventually they could add avatars with whom users can converse. As technology continues to progress and become easier to use, the <a href="http://www.wikihow.com/Make-a-Diorama">dioramas built of cardboard</a>, modeling clay and twigs by children 50 years ago could one day become explorable, life-sized virtual spaces.</p><img src="https://counter.theconversation.com/content/66898/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Frances Van Scoy has received funding from the National Science Foundation. She owns a relatively small number of shares of Intel stock.</span></em></p>A long historical progression has brought technology to the masses – and will expand our capabilities as far as we can imagine.Frances Van Scoy, Associate Professor of Computer Science and Electrical Engineering, West Virginia UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/666612016-10-11T05:21:29Z2016-10-11T05:21:29ZAda Lovelace blazed a trail in science – we need more women to follow in her footsteps<figure><img src="https://images.theconversation.com/files/140893/original/image-20161007-21433-oiiffu.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Science demonstration at the Royal Institution.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/paul_clarke/15367793868/in/photolist-ppZXHf-pq1jYk-dkFrCm-dkrLsg-oJu33g-ppTz75-poSXZj-xWyvW-ppXH1z-poR1mz-dkrLbd-ppXus8-dkrK9n-oJu1tV-poQnJ7-dkrJLC-pG66YF-dkrN7s-pEekH5-ppR7jn-dkrNCS-poN95i-ppTxx3-dkrNH7-7ta78R-oKDD2R-dkrK4j-ppQmLa-pGwhXf-poMYZK-pGcY9t-dkrLVL-pEmFQW-pGjPQp-dkrJTk-pEmbY7-dkrNyC-dkrNbG-ppXuex-pEf6sQ-ppWcaL-pEmaXu-poT7Su-dkrMLN-oKAsk7-dkrMaW-oKAEGG-pG6Q16-pGp7E5-ppXunD">Paul Clarke</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>Ada Lovelace day falls on the second Tuesday of October every year – it is a time to celebrate the achievements of women in STEM subjects: science, technology, engineering and mathematics. But who was Ada Lovelace and why choose a Victorian titled lady as a champion for today’s potential heroes? </p>
<p>Countess Lovelace (Augusta Ada Byron) was born in 1815, yet we celebrate her today as the world’s first programmer, or possibly as its first debugger. In 1843 she wrote a paper about a computer called the <a href="http://sydneypadua.com/2dgoggles/the-marvellous-analytical-engine-how-it-works/">Analytical Engine</a> that was never even built. That didn’t matter. The machine was designed to solve mathematical problems. She was able to understand how solutions to those problems could be broken down into simple steps that could be given to the machine to execute. </p>
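<p>Her program computed a sequence called the Bernoulli numbers, but the underlying idea – reducing a mathematical question to a short list of primitive operations the machine repeats – can be illustrated with a far simpler, modern example (the sketch is mine, not hers):</p>
<pre><code># A modern illustration of Lovelace's insight, not her actual program:
# reduce "add up the squares of the numbers 1 to 10" to primitive steps
# - multiply, add, count - repeated over and over by the machine.
total = 0
n = 1
while n != 11:              # repeat the same simple steps ten times
    square = n * n          # one multiplication
    total = total + square  # one addition
    n = n + 1               # advance the counter
print(total)                # prints 385
</code></pre>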
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/140896/original/image-20161007-21423-1qva70.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/140896/original/image-20161007-21423-1qva70.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=237&amp;fit=clip" srcset="https://images.theconversation.com/files/140896/original/image-20161007-21423-1qva70.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=954&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/140896/original/image-20161007-21423-1qva70.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=954&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/140896/original/image-20161007-21423-1qva70.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=954&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/140896/original/image-20161007-21423-1qva70.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=1199&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/140896/original/image-20161007-21423-1qva70.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=1199&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/140896/original/image-20161007-21423-1qva70.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=1199&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Noble.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/101251639@N02/9672674214/in/photolist-fJJZ45-6v9xpg-8xpmTH-69RPF3-7N65f4-dmcddu-pou9wr-7N5LNB-aq1tuV-yojuiY-e4FQ4g-73sPCa-7NkUKy-afiire-6a2yf2-diJh2g-34KgKN-cnMLMd-diJgYT-diJgHx-7Jiq41-diZ7yu-diZ7LN-diJed1-diJgat-8DFEfp-diZ7vN-diJgiX-gJ3nJS-av7oji-gJ48up-diZ9FH-diJezs-diZ9UK-diZ8Vg-diJgnn-diZ9rT-diJgNM-diZ6Tf-diJe6A-diZ8Mk-diJh7B-Aix7cD-diJgtF-diJgT4-diZ7sm-diJdGw-diZ9M8-diZ9JH-7N9MkL">Chris Monk/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Why is she our hero? She was smart. She pursued her interest in mathematics even though that was extremely unusual for a woman of her time. She had dogged determination in solving problems. She was able to understand the Analytical Engine and wrote <a href="https://www.fourmilab.ch/babbage/sketch.html">“A Sketch of the Analytical Engine” with extensive notes</a> explaining its operation more clearly for a wider audience. In some ways she knew more about the machine than its inventor, Charles Babbage, did. She detected an error in instructions written by him for the machine. Most importantly, she had insight and realised that Babbage’s machine was capable of much more than its original purpose of numerical calculations. If it could manipulate numbers, it could just as easily manipulate words or even music. As she wrote: </p>
<blockquote>
<p>[The Analytical Engine] might act upon other things besides number … for instance … the fundamental relations of pitched sounds in the science of harmony and of musical composition … the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.</p>
</blockquote>
<p>For many years Lovelace’s achievements were ignored: in part because computers (general purpose machines to solve all sorts of problems) were not built until the 1940s. Her name was resurrected by Alan Turing in his famous paper, “Computing Machinery and Intelligence”, in which he talked about Lovelace’s objection to artificial intelligence – that computers are not capable of originating anything and can do only what we instruct them to.</p>
<p>Turing is another key figure in the history of computing who has been more recently acclaimed as a true hero due to his work <a href="https://theconversation.com/codebreaking-has-moved-on-since-turings-day-with-dangerous-implications-34448">on wartime cryptography</a>, on understanding biology through mathematics, and on some really elegant theory on what is and is not possible for a computer to do.</p>
<p>Why are heroes important? We need to inspire future ones. We all need role models to look up to, to show how great human achievement can be. And we need role models who somehow “look like us”. Lovelace is an important hero – for men too but especially for women in computing, and in STEM more widely, as a woman who succeeded at a time when women were not at all encouraged into education, let alone science.</p>
<p>Diversity is important in science: we need teams composed of all sorts of people, bringing different perspectives, to solve problems. Not only that, companies with more diverse teams are shown to be more profitable. A <a href="http://www.mckinsey.com/business-functions/organization/our-insights/why-diversity-matters">2015 McKinsey report</a> found that “companies in the top quartile of gender diversity were 15% more likely to have above median financial returns, relative to their national industry median”. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/140891/original/image-20161007-21416-f84u44.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/140891/original/image-20161007-21416-f84u44.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/140891/original/image-20161007-21416-f84u44.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/140891/original/image-20161007-21416-f84u44.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/140891/original/image-20161007-21416-f84u44.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/140891/original/image-20161007-21416-f84u44.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/140891/original/image-20161007-21416-f84u44.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Scientific demonstration at the Royal Institution, 2014.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/paul_clarke/15367793868/in/photolist-ppZXHf-pq1jYk-dkFrCm-dkrLsg-oJu33g-ppTz75-poSXZj-xWyvW-ppXH1z-poR1mz-dkrLbd-ppXus8-dkrK9n-oJu1tV-poQnJ7-dkrJLC-pG66YF-dkrN7s-pEekH5-ppR7jn-dkrNCS-poN95i-ppTxx3-dkrNH7-7ta78R-oKDD2R-dkrK4j-ppQmLa-pGwhXf-poMYZK-pGcY9t-dkrLVL-pEmFQW-pGjPQp-dkrJTk-pEmbY7-dkrNyC-dkrNbG-ppXuex-pEf6sQ-ppWcaL-pEmaXu-poT7Su-dkrMLN-oKAsk7-dkrMaW-oKAEGG-pG6Q16-pGp7E5-ppXunD">Paul Clarke</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>So it’s a real problem for us that there are low numbers of women in computing (15% undergraduates), mathematics (39% undergraduates), physical sciences (39% undergraduates) and engineering (15% undergraduates), according to <a href="https://www.hesa.ac.uk">HESA</a>, which collects data on higher education in the UK. Although biological sciences have more women at undergraduate level (59%) this drops at more senior levels, suggesting women are moving out of the subject. You might say that says more about talent and inclination – women just don’t want to do these subjects – but that would be wrong. It’s a societal problem that women are being put off: the UNESCO/OECD <a href="http://yfactor.org/actwise/">2015 Yfactor report</a> showed women are represented more equally in STEM subjects in the Middle East, North Africa, and south and west Asia.</p>
<p>So we must do more to promote STEM careers for women. There are lots of terrific organisations out there who already run talks and workshops and hackathons and taster events for young people to find out what STEM means. Your local school, college or university may run an after-school coding club. There are local science fairs that aim to reach out and show how scientists are changing our world. These are great places to meet scientists who love to talk about their work with others. For girls, particularly look out for <a href="http://www.stemettes.org/">Stemettes</a>, <a href="http://sciencegrrl.co.uk/">ScienceGrrl</a>, <a href="http://www.codefirstgirls.org.uk/">Code First: Girls</a> and <a href="https://www.techfuturegirls.com/">TechFuture Girls</a>. </p>
<p>What other actions can help? We need prominent role models and advocates of STEM to demonstrate a wider range of “what a scientist looks like”. A <a href="http://csc.mrc.ac.uk/celebrating-women-science-ada-lovelace-day-2016/">new MRC awards scheme</a> to promote modern female heroes of computing and mathematics launches on Ada Lovelace Day 2016. These will recognise women first for their scientific achievements, but also as leaders in their field who are inspiring others. While Lovelace encourages us to showcase these women, women in STEM shouldn’t just be promoted on a single day. Women are doing fantastic work every day – and society needs the next generation of scientists to take up the baton for the future.</p><img src="https://counter.theconversation.com/content/66661/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Carron Shankland does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Lovelace showed great insight into her subject and for that she's still a hero to others.Carron Shankland, Professor of Computing Science and Deputy Head, School of Natural Sciences, University of StirlingLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/639712016-08-25T09:51:21Z2016-08-25T09:51:21ZThis little-known pioneering educator put coding in the classroom<figure><img src="https://images.theconversation.com/files/135034/original/image-20160822-18708-1n0go6u.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Seymour Papert lectures on LOGO, computers and education.</span> <span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File%3APapert-moscow-1987.jpg">Shen-montpellier</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>A man who was arguably the most influential educator of the last 50 years – though he was not widely known to the American public – died on July 31. A respected mathematician and early pioneer of artificial intelligence, Seymour Papert was 88. His career presaged much of today’s focus on education in science, technology, engineering and mathematics, and helped shape the classroom of today and of the future.</p>
<p>As an academic, he paved the way for generations of researchers. In their 1969 book <a href="https://mitpress.mit.edu/books/perceptrons">“Perceptrons,”</a> Papert and his MIT colleague Marvin Minsky were early advocates of the need to investigate the computational details of how early artificial intelligence actually functioned. Today that approach underpins evaluations of neural networks, which foster <a href="https://www.microsoft.com/en-us/research/publication/deep-learning-methods-and-applications/">deep learning</a>, a computational technique widely used in <a href="http://www.computerworld.com/article/3088712/data-analytics/putting-deep-learning-to-work.html">data analytics</a>.</p>
<p>But perhaps his most powerful legacy sprang from his research into learning, specifically the role of computers in education. Papert argued that learning was most successful when students were engaged in creative acts – when they were making things. For him, computers allowed and encouraged creation in a broad range of areas, and could therefore be a key to unlocking better teaching and learning.</p>
<p>As early as the mid-1960s, he was advocating for children to be taught to <a href="https://dspace.mit.edu/bitstream/handle/1721.1/5835/AIM-247.pdf?sequence=2">program computers</a>. At the time, of course, even the smallest computers were the <a href="http://www.computerhistory.org/timeline/1964/#169ebbe2ad45559efbc6eb35720a7bd4">size of office filing cabinets</a>. They weren’t used in schools; <a href="http://childrenstech.com/blog/archives/17504">his idea was considered extreme</a> and <a href="http://www.papert.org/articles/GhostInTheMachine.html">elitist</a>. Papert persisted. Today, every modern student should be grateful to him.</p>
<h2>Inventing LOGO</h2>
<p>In 1967 Papert channeled his research interest in coding and children into the development of the programming language <a href="http://el.media.mit.edu/logo-foundation/what_is_logo/history.html">LOGO</a>. It was intended to help students ranging in age from 5 to 17 learn mathematical concepts through programming and the use of words. Students gave simple instructions to a turtle – first a physical robot and later an icon on a screen – such as “LEFT 90” to turn 90 degrees to the left or “FORWARD 5” to advance a specific distance in a straight line. </p>
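<p>Python’s standard “turtle” module is a direct descendant of Papert’s turtle, so it gives a feel for how a few such commands combine. This short sketch – a modern stand-in rather than LOGO itself – draws a square by repeating the same two instructions four times, the kind of first program many LOGO students wrote.</p>
<pre><code># A modern stand-in for a classic beginner LOGO exercise:
# draw a square by repeating "go forward, turn left 90 degrees" four times.
import turtle

pen = turtle.Turtle()
for side in range(4):
    pen.forward(100)  # like LOGO's "FORWARD 100"
    pen.left(90)      # like LOGO's "LEFT 90"

turtle.done()  # keep the drawing window open
</code></pre>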
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/mOly9i7hmzk?wmode=transparent&amp;start=0" frameborder="0" allowfullscreen></iframe>
<figcaption><span class="caption">A LOGO turtle returns to life.</span></figcaption>
</figure>
<p>LOGO was powerful for its day, and very easy to learn and use. Many generations of students used LOGO in their mathematics classes to extend their understanding of geometry and arithmetic and to solve complex problems. </p>
<h2>Creating learning opportunities</h2>
<p>In 1980 Papert wrote <a href="http://www.wired.com/2007/03/the_origins_of_/">“Mindstorms: Children, Computers, and Powerful Ideas”</a> articulating his vision of how children should use a computer: </p>
<blockquote>
<p>the child programs the computer and, in doing so, both acquires a sense of mastery over a piece of the most modern and powerful technology and establishes an intimate contact with some of the deepest ideas from science, from mathematics, and from the art of intellectual model building.</p>
</blockquote>
<p>Papert strongly advocated that children need to be creators, and he believed this was best done through playful exploration. Children, he argued, could take control of their own learning by using the materials around them, fostering their independence and curiosity. Otherwise, students would rely heavily on answers from textbooks or teachers; they would not develop key skills such as problem solving, independence and building on their own knowledge, but would instead be told of outcomes secondhand.</p>
<p>Throughout the 1980s, computers started appearing in school classrooms. They were typically used to teach elementary programming languages such as LOGO and BASIC. The <a href="https://hourofcode.com/us">movement to teach children to code</a> had begun.</p>
<p>Today the most <a href="https://tltl.stanford.edu/content/you-cannot-think-about-thinking-without-thinking-about-what-seymour-papert-would-think">popular first programming language</a>, especially for primary school students, is <a href="https://scratch.mit.edu/">Scratch</a>, developed by Mitch Resnick and colleagues at MIT in 2003. Papert was a <a href="http://www.nytimes.com/2016/08/02/technology/seymour-papert-88-dies-saw-educations-future-in-computers.html?_r=0">mentor and friend</a> to Resnick; LOGO’s user-friendliness survives in the newer language, which also adds the ability for students to build communities <a href="http://cacm.acm.org/magazines/2009/11/48421-scratch-programming-for-all/fulltext">online and through social media</a>.</p>
<iframe src="https://embed-ssl.ted.com/talks/mitch_resnick_let_s_teach_kids_to_code.html" width="100%" height="360" frameborder="0" scrolling="no" webkitallowfullscreen="" mozallowfullscreen="" allowfullscreen=""></iframe>
<p>Efforts to teach children to code – Papert’s vision – have been taken up by several large-scale initiatives and organizations such as <a href="https://code.org">code.org</a>, <a href="https://coderdojo.com/">CoderDojo</a> and <a href="https://hourofcode.com/">hourofcode.com</a>. These organizations work hard to reach every student – including girls, who are often underrepresented in computing classes – across the socioeconomic spectrum. And they operate globally: Code.org, for example, has materials available in 45 languages and coding groups in <a href="https://code.org/about">more than 180 countries</a>.</p>
<h2>Lessons in three dimensions</h2>
<p>Papert was also enormously interested in how children could learn through experimentation and play. He stated that “education has very little to do with explanation, <a href="http://www.legofoundation.com/da-dk/newsroom/articles/2016/honoring-seymour-papert">it has to do with engagement</a> and falling in love with the material.” </p>
<p>In 1985, Papert started a collaboration with the LEGO Group, working on connecting LEGO bricks to a computer. This idea moved beyond the physical turtle robots originally controlled by LOGO, and became the first widely used programmable robotics system for children. Even today LEGO’s programmable robot system is named Mindstorms, after Papert’s book. The <a href="https://education.lego.com/en-us/middle-school/explore">Mindstorms kits</a> are used extensively in elementary and middle-school classrooms across the world and also in competitions such as the <a href="http://www.firstinspires.org/robotics/fll">FIRST LEGO League</a>, where young students design, build and program a LEGO robot to complete a series of challenges. The challenge changes on a yearly basis, but always requires students to program instructions into their robot and complete as many tasks as possible within two-and-a-half minutes.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/dJSeMeAGmXE?wmode=transparent&amp;start=0" frameborder="0" allowfullscreen></iframe>
<figcaption><span class="caption">A student-programmed robot performs in a FIRST Lego League competition.</span></figcaption>
</figure>
<p>Underpinning Papert’s vision was his work on learning and thinking. This was influenced by the Constructivist theory developed by Swiss psychologist <a href="http://www.piaget.org/">Jean Piaget</a>, who argued that knowledge is constructed by learners who are <a href="http://www.thirteen.org/edonline/concept2class/constructivism/">actively engaged in learning</a> rather than passively receiving information. From this starting point, Papert developed his own <a href="https://tltl.stanford.edu/content/seymour-papert-s-legacy-thinking-about-learning-and-learning-about-thinking">Constructionist theory of learning</a>, in which he argued that people build knowledge most effectively when they are consciously building something – whether practical or theoretical.</p>
<p>Computers, programming and robotics provide ideal platforms to enable children to construct elements of the world around them. Though the man himself didn’t get much popular recognition, Papert’s legacy remains in each and every classroom where students are actively engaged in learning.</p><img src="https://counter.theconversation.com/content/63971/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Leon Sterling receives funding from the Australian Research Council. </span></em></p><p class="fine-print"><em><span>Therese Keane does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Seymour Papert's vision has helped computers become widespread in education today, and gave birth to the movement to teach children to program.Therese Keane, Senior Lecturer in Education, Swinburne University of TechnologyLeon Sterling, Professor Emeritus, Swinburne University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/612002016-07-10T20:40:51Z2016-07-10T20:40:51ZHow to keep more girls in IT at schools if we're to close the gender gap<figure><img src="https://images.theconversation.com/files/129156/original/image-20160704-19124-mwhu9q.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Too many girls are opting out of IT in school so we need to make it more mainstream.</span> <span class="attribution"><span class="source">Shutterstock/bikeriderlondon</span></span></figcaption></figure><p>The world is increasingly embracing digital technology, and so too are <a href="https://theconversation.com/an-education-for-the-21st-century-means-teaching-coding-in-schools-42046">our schools</a>. But many girls are still missing out on developing IT and programming skills.</p>
<p>IT classes in schools mostly focus on basic skills, such as how to use email or spreadsheets, or use tablets to access online quizzes and educational games. Programming and algorithm-based problem solving don’t form a part of the typical school day. They tend to get taught only in extra-curricular classes, such as coding clubs.</p>
<p>But these tend to attract kids who’ve already expressed an interest in technology and want to learn more. The students who don’t know what coding is, or who don’t identify with computer culture (often in the form of computer gaming), are less inclined to participate in these extra-curricular clubs.</p>
<p>This kind of opt-in training means many girls are missing out, particularly if they perceive IT to be a pastime for boys. </p>
<p><a href="http://www.theatlantic.com/business/archive/2014/12/toys-are-more-divided-by-gender-now-than-they-were-50-years-ago/383556/">Gender stereotyping of toys</a> may also push girls away from technical interests. Parents tend to buy gadgets for boys more than for girls, as suggested in the United States by a <a href="http://www.npr.org/sections/money/2014/10/21/357629765/when-women-stopped-coding">National Public Radio story</a> on plunging numbers of women studying computer science. </p>
<p>Or girls may not be as interested in computer games due to the lack of female protagonists, as argued eloquently by 15-year-old student – and coding teacher – <a href="http://cacm.acm.org/magazines/2016/6/202643-a-byte-is-all-we-need/fulltext">Ankita Mitra</a>. Or perhaps <a href="https://theconversation.com/the-real-reason-more-women-dont-code-59663">girls simply don’t feel welcome</a> in these clubs. </p>
<p>A recent <a href="http://digitalcareers.edu.au/wp-content/uploads/2015/04/Female-Participation.pdf">report on female participation in computing</a> from Australia’s <a href="http://digitalcareers.edu.au/">Digital Careers</a> group explores the lack of engagement by girls in computing. It concludes that the best strategy to increase the proportion of women participating in computing and IT is compulsory and sustained engagement with an integrated digital technologies curriculum, including gender inclusive activities.</p>
<h2>Schools must mainstream IT</h2>
<p>Digital technology skills are not going to be optional for our students for much longer. The United Kingdom has implemented a <a href="https://www.theguardian.com/technology/2014/sep/04/coding-school-computing-children-programming">coding curriculum</a> that will see children as young as five learning to program.</p>
<p>Here in Victoria, a new <a href="http://victoriancurriculum.vcaa.vic.edu.au/technologies/digital-technologies/introduction/rationale-and-aims">digital technologies curriculum</a> came into effect late last year, reflecting the national <a href="http://www.australiancurriculum.edu.au/technologies/digital-technologies/rationale">Australian Curriculum: Digital Technologies</a>. </p>
<p>From 2017, students will be taught computational thinking, and will learn to collect and interpret data using automated tools, and to transform data into information through digital solutions.</p>
<p>These changes require teachers who are able to deliver the lessons, and appreciate the value of the skills. The <a href="http://blogs.adelaide.edu.au/cser/">Computer Science Education Research Group</a> at the University of Adelaide is playing a vital role in preparing teachers for the new curriculum, by developing online courses that focus on teaching in the digital technologies learning area.</p>
<p>Schools also have to find a place for these activities in an already overcrowded curriculum, forcing them to consider their priorities for educating children.</p>
<h2>Pursuing IT should not be hard</h2>
<p>I hosted a fascinating panel discussion on Women in IT, for last month’s <a href="https://theconversation.com/au/topics/computing-turns-60">60th anniversary celebration of computing in Australian universities</a>. The panellists had more than a century of experience in IT between them, and explored the many factors that both drew them into IT, and helped them to stay. </p>
<p>A comment made by the youngest panellist struck a particular chord. Neha Soni, a business analyst at Deloitte, observed, </p>
<blockquote>
<p>I’m the kind of person that, if you tell me I <em>can’t</em> do something, then I’ll be even more determined to do it!</p>
</blockquote>
<p>An informal poll suggested that many of the audience members – largely, very accomplished women in IT – shared this attitude.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/127702/original/image-20160622-19789-1phseb9.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/127702/original/image-20160622-19789-1phseb9.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/127702/original/image-20160622-19789-1phseb9.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=401&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/127702/original/image-20160622-19789-1phseb9.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=401&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/127702/original/image-20160622-19789-1phseb9.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=401&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/127702/original/image-20160622-19789-1phseb9.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/127702/original/image-20160622-19789-1phseb9.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/127702/original/image-20160622-19789-1phseb9.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Some of the panellists in a discussion on Women in IT: (left to right) Mark Johnson (Shine Technologies), Leonie Walsh (Lead Scientist of Victoria), Neha Soni (Deloitte) and Cecily Macdougall (Australian Computer Society).</span>
<span class="attribution"><span class="source">The University of Melbourne</span></span>
</figcaption>
</figure>
<p>This suggests that the women who are successful in IT today are <a href="http://www.huffingtonpost.com/entry/how-to-blaze-a-trail-lessons-from-9-incredible-women_us_57691b11e4b034ff3eeffc42">trailblazers</a>. They are determined, and have been willing (and able) to push through barriers to pursue their passion. </p>
<p>It suggests they have fought through <a href="http://www.geekwire.com/2014/women-tech-panel/">the myriad anxieties that women in IT</a> often express, and have survived. They have done this, it would seem, despite lack of encouragement and, in some cases, outright rejection.</p>
<p>Does it have to be so hard? Do girls need to have that fire and determination to have a successful career in IT?</p>
<p>I suggest that we have to mainstream IT, for both boys and girls. We need to make it just as normal as reading, writing and arithmetic. As normal as a career as a doctor (<a href="https://theconversation.com/female-doctors-in-australia-are-hitting-glass-ceilings-why-51325">where women may soon outnumber men</a>) or as an educator (<a href="http://www.acara.edu.au/reporting/national_report_on_schooling_2012/schools_and_schooling_2012/staff_2012.html">where women already do</a>). Something that everyone learns, and anyone who finds it interesting can pursue at more advanced levels or maybe choose for their career.</p>
<p>This sentiment is echoed by a 17-year-old student from Methodist Ladies’ College in Melbourne, <a href="http://www.theage.com.au/victoria/female-student-makes-history-and-heads-to-coding-olympics-20160619-gpmjai.html">Belinda Shi</a>, who will be representing Australia in the International Olympiad in Informatics (<a href="http://www.ioinformatics.org/index.shtml">IOI</a>) in August. </p>
<p>She doesn’t want to be seen as “<em>that female</em> on the informatics team” but rather to be recognised for her programming abilities. Being singled out for your gender is not always comfortable.</p>
<h2>Cultural change through the schools</h2>
<p>I hope for a time when we don’t have to talk about engaging girls in IT, because girls are naturally engaged in it through their learning. And for a time when we don’t have to highlight the accomplishments of women in IT, but can celebrate the accomplishments of deserving individuals. </p>
<p>A time when the cultural barriers have been removed, and it is just as easy and normal for a girl to pursue a career in technology as it is for a boy. </p>
<p>I believe that with strong integration of digital technologies in the school curriculum, that time is not far away. As girls and boys learn digital technologies together, with supportive teachers, stereotypes will fade and women and men will work comfortably side by side. </p>
<p>Studying IT will simply be another part of everyday life. Because IT already is.</p><img src="https://counter.theconversation.com/content/61200/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Karin Verspoor works for the University of Melbourne. She receives funding from the Australian Research Council, the Defence Science and Technology Group, and the Victorian Department of Health and Human Services. She is affiliated with Victorian ICT for Women and is a volunteer for the upcoming &quot;Go Girl, Go for IT&quot; event aimed at high school girls.</span></em></p>Too many girls are missing out on learning IT and computer programming skills that could serve them well in the future economy.Karin Verspoor, Associate Professor, Department of Computing and Information Systems, University of MelbourneLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/604182016-06-06T01:21:54Z2016-06-06T01:21:54ZGoogle wins in court, and so does losing party Oracle<figure><img src="https://images.theconversation.com/files/125183/original/image-20160603-11620-15adj3n.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Everybody wins!</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/pic-402095701/stock-vector-gold-trophy-cup-of-winner-in-two-hands-illustration.html">Trophy and hands via shutterstock.com</a></span></figcaption></figure><p>Oracle <a href="http://arstechnica.com/tech-policy/2016/05/google-wins-trial-against-oracle-as-jury-finds-android-is-fair-use/">recently lost its attempt</a> to use patent and copyright law to force Google to pay US$9 billion for using parts of its Java computer language. Nine billion dollars isn’t chump change, not even for Google, but despite the verdict against Oracle, I’d say Google is not the only winner.</p>
<p>The dispute between the two internet giants was <a href="http://arstechnica.com/tech-policy/2016/05/how-oracle-made-its-case-against-google-in-pictures/">whether Google had needed Oracle’s permission to use computer code</a> called the Java API. The API, and therefore the legal issue, relates to some pretty technical details about how computer programs work – how the instructions programmers write are followed on different hardware devices and different software operating systems.</p>
<p>The outcome of the case, decided in parts by a judge, an appeals court and a jury, was that Google’s use of computer code didn’t violate Oracle’s patents, and that Oracle could copyright its code. However, the jury found that Google’s use did not violate the copyright restrictions because it significantly expanded on the existing copyrighted materials, an exception in law called “<a href="http://www.copyright.gov/fair-use/">fair use</a>.”</p>
<p>It is not only a victory for Google, which has done nothing wrong and need not pay Oracle any money. Programmers remain allowed to use a very popular programming language without fear of crippling legal penalties – which in turn benefits the public, who use <a href="https://github.com/trending/java">apps and websites made with Java</a>. And while technically the legal loser, Oracle also won in a way, because it will benefit from Java’s continued popularity.</p>
<h2>What’s an API?</h2>
<p>To understand the heart of the dispute, we first need to grasp what an Application Programming Interface (API) is and what it does for programmers. At its simplest, an API defines the specific details of how a program interacts with a computer’s operating system and the underlying hardware.</p>
<p>Computer manufacturers use a <a href="http://www.makeuseof.com/tag/whats-inside-your-computer-the-story-of-every-component-you-need-to-know-3/">wide range of specific components</a>: hard drives and memory storage units with different sizes, faster or slower processing chips, smaller and larger screens. They also choose <a href="http://www.computerworld.com/article/3050931/microsoft-windows/windows-comes-up-third-in-os-clash-two-years-early.html">different operating systems</a>, such as Windows, Apple's OS X and Linux – each of which is regularly upgraded with a new version.</p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/125184/original/image-20160603-11585-ehiq95.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=237&amp;fit=clip" srcset="https://images.theconversation.com/files/125184/original/image-20160603-11585-ehiq95.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/125184/original/image-20160603-11585-ehiq95.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/125184/original/image-20160603-11585-ehiq95.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/125184/original/image-20160603-11585-ehiq95.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/125184/original/image-20160603-11585-ehiq95.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/125184/original/image-20160603-11585-ehiq95.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Hoping to avoid nightmares: a Java programmer.</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File%3AProgrammer_writing_code_with_Unit_Tests.jpg">Joonspoon</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>Each variation might handle basic functions differently – such as reading a file, connecting to the internet, or drawing images on the screen. For a computer programmer, that is a nightmare. Nobody wants to write a program that works only on a <a href="http://www.dell.com/us/p/inspiron-15-3552-laptop/pd?oc=fncwc008sb&amp;model_id=inspiron-15-3552-laptop">Dell laptop with a 15-inch screen, a 500 GB hard drive, 4 GB of RAM, running Windows 10</a> – and no other computer. And nobody wants to write the extremely large number of slight variations to make sure a program works on every machine, either.</p>
<p>The API solves that problem for the programmer, handling the complicated and difficult details of exactly how any specific computer will act. That leaves programmers free to concentrate on what they want a computer program to do, without having to worry about precisely how. It’s better for the user, too. If she has (for example) <a href="https://java.com/en/download/">Java installed</a> on whatever computer she uses, programs written in Java will run.</p>
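<p>To make that concrete, here is a minimal sketch of the idea – the file name is made up, and this is illustrative rather than production code. The program simply asks the standard Java API to open and read a file, and the API worries about how each operating system and piece of hardware actually does it.</p>
<pre><code>import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ReadNotes {
    public static void main(String[] args) throws IOException {
        // The same call works on Windows, OS X or Linux: the Java API hides
        // how each operating system actually opens and reads the file.
        try (BufferedReader reader = Files.newBufferedReader(Paths.get("notes.txt"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
</code></pre>
<p>These few lines run unchanged on any machine with Java installed, which is exactly the convenience the API exists to provide.</p>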
<h2>Java itself</h2>
<p>The Java API contains methods for everything from reading and writing a file, to drawing on a screen, to handling web security certificates. Without a functioning copy of the API, programs in Java are fundamentally broken. Clearly, therefore, he who controls the API controls the language. </p>
<p>Oracle, when it <a href="http://www.oracle.com/us/corporate/press/018363">bought Sun Microsystems</a>, bought the rights to Java and its API. The crux of the legal battle was how this control is exerted and how far it extends.</p>
<p>No one denied that Oracle has a valid copyright on the language and API specification. This is a good thing. It means I can’t just make a copy of Java, give it a name (like “Darjeeling”), and call it a new language that I own. Similarly, a company can’t change the API arbitrarily and still call it the Java API.</p>
<h2>What did Google do?</h2>
<p>When it <a href="https://googleblog.blogspot.com/2008/09/first-android-powered-phone.html">released Android in 2008</a>, Google added software and hardware development to its existing internet service business. If its products were going to succeed, they needed to be able to run lots of interesting programs. The easiest way to ensure that was to make sure the new devices could understand at least one computer language that’s already <a href="http://spectrum.ieee.org/computing/software/the-2015-top-ten-programming-languages">widely used by programmers</a>. Java was a natural choice.</p>
<p>The alternative would have been to <a href="https://msdn.microsoft.com/en-us/library/67ef8sbd.aspx">create a new language</a>, but that pathway is fraught with difficulties. Introducing a new language requires convincing programmers that it is worth using and giving them time and resources to learn the language.</p>
<p>Once Google decided on Java, it needed to connect Java programs to Android’s hardware and software – it needed a Java API for Android.</p>
<h2>Sharing names for computer commands</h2>
<p>Rather than commissioning Oracle to write it, Google wrote the software in-house, customizing it for cellphone hardware. For example, Bluetooth, touch-screen gestures and telephone calls are not handled in Oracle’s standard Java API; they are solely in Android-specific code. </p>
<p>However, to be sure Android devices could run existing Java software, Google wrote its Android Java with some of the same commands as Oracle’s version of Java. Both Android and Oracle support the same standard <a href="https://docs.oracle.com/javase/7/docs/api/java/io/package-summary.html">file-handling methods</a>, letting programmers use the same <em>Files.newInputStream(filename)</em> call to initiate the arcane and complex Java file-reading process.</p>
<p>Google didn’t copy the code Oracle had written for other hardware or software systems. It wrote <a href="https://developer.android.com/reference/classes.html">all-new Android-specific</a> instructions for devices to follow each command, but to help programmers, gave many common commands the same name Oracle used.</p>
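<p>A rough sketch makes the distinction clearer – the names below are invented for illustration and are not actual Oracle or Google source code. What the two versions share is the declaration (the command’s name, parameters and return type); the instructions behind it are written independently.</p>
<pre><code>// Illustrative only – invented names, not actual Oracle or Google code.
// Both versions declare the same method, so a program written against one
// compiles and runs against the other.
class LibraryVersionA {
    static int max(int a, int b) {
        if (a > b) { return a; }      // one vendor's instructions
        return b;
    }
}

class LibraryVersionB {
    static int max(int a, int b) {
        return (a > b) ? a : b;       // different instructions, same behaviour
    }
}

public class SameNameDifferentCode {
    public static void main(String[] args) {
        System.out.println(LibraryVersionA.max(3, 7));  // prints 7
        System.out.println(LibraryVersionB.max(3, 7));  // prints 7
    }
}
</code></pre>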
<p>Oracle’s <a href="http://arstechnica.com/tech-policy/2012/04/oracles-ip-war-against-google-finally-going-to-trial-whats-at-stake/">lawyers sharpened their knives</a> and the battle was on. Could Google use the same names, even if the code they referred to was different?</p>
<h2>The stakes were high</h2>
<p>If Oracle had won, Java’s days as a primary programming language for Android – the <a href="https://bgr.com/2016/06/02/apples-mobile-market-share-sees-big-drop-in-may-as-android-skyrockets/">world’s most popular smartphone system</a> – would have been numbered. Very quickly, Google would have chosen a new language for Android programmers to use, and published a conversion tool to translate existing Java apps into the new language. Then it would have stopped supporting Java. (I suspect <a href="https://www.microsoft.com/en-us/">one of Oracle’s competitors</a> would have offered Google excellent licensing terms to choose another language.)</p>
<p>Programmers would have lost. The tools to write code for Android would have been, at a bare minimum, more expensive and less flexible. The public would have lost, because new and interesting apps would both be more expensive and released less frequently.</p>
<p>Finally, Oracle would have lost because programming in Java would no longer be a viable option for a major market. Computer languages compete for popularity, so fewer programmers would choose to program in Java, reducing the pool of people who were comfortable and competent in Java. Instead they would choose others, like <a href="https://www.python.org/">Python</a> or <a href="https://www.ruby-lang.org/en/">Ruby</a>. With fewer people working in Java, Oracle’s primary way of making money from it (creating <a href="https://www.oracle.com/java/index.html">Java-based computer systems</a> that can be expanded by third-party developers) would slowly decline.</p>
<p>Instead, while Oracle doesn’t get $9 billion from Google, the programming community – and those of us who use apps and websites every day – gets to keep using an important tool, without fear of a similarly large lawsuit in the future.</p><img src="https://counter.theconversation.com/content/60418/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Robert Harrison does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Google saves $9 billion, programmers and users get to keep a popular language and its apps – and a key Oracle product stays alive.Robert Harrison, Professor of Computer Science, Georgia State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/499382015-11-09T11:05:37Z2015-11-09T11:05:37ZHow computers broke science – and what we can do to fix it<figure><img src="https://images.theconversation.com/files/101142/original/image-20151107-16258-100l6yf.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Computer... or black box for data?</span> <span class="attribution"><a class="source" href="http://ftp.arl.mil/ftp/historic-computers/">US Army</a></span></figcaption></figure><p>Reproducibility is one of the cornerstones of science. Made popular by British scientist <a href="http://www.bbk.ac.uk/boyle/boyle_learn/boyle_introduction.htm">Robert Boyle</a> in the 1660s, the idea is that a discovery should be reproducible before being accepted as scientific knowledge.</p>
<p>In essence, you should be able to produce the same results I did if you follow the method I describe when announcing my discovery in a scholarly publication. For example, if researchers can reproduce the effectiveness of a new drug at treating a disease, that’s a good sign it could work for all sufferers of the disease. If not, we’re left wondering what accident or mistake produced the original favorable result, and would doubt the drug’s usefulness.</p>
<p>For most of the history of science, researchers have reported their methods in a way that enabled independent reproduction of their results. But, since the introduction of the personal computer – and the point-and-click software programs that have evolved to make it more user-friendly – reproducibility of much research has become questionable, if not impossible. Too much of the research process is now shrouded by the opaque use of computers that many researchers have come to depend on. This makes it almost impossible for an outsider to recreate their results. </p>
<p>Recently, several groups have proposed similar solutions to this problem. Together they would break scientific data out of the black box of unrecorded computer manipulations so independent readers can again critically assess and reproduce results. Researchers, the public, and science itself would benefit.</p>
<h2>Computers wrangle the data, but also obscure it</h2>
<p>Statistician <a href="http://edge.org/annual-question/2014/response/25340">Victoria Stodden</a> has described the unique place personal computers hold in the history of science. They’re not just an instrument – like a telescope or microscope – that enables new research. The computer is revolutionary in a different way; it’s a tiny factory for producing all kinds of new “scopes” to see new patterns in scientific data. </p>
<p>It’s hard to find a modern researcher who works without a computer, even in fields that aren’t intensely quantitative. Ecologists use computers to simulate the effect of disasters on animal populations. Biologists use computers to search massive amounts of DNA data. Astronomers use computers to control vast arrays of telescopes, and then process the collected data. Oceanographers use computers to combine data from satellites, ships and buoys to predict global climates. Social scientists use computers to discover and predict the effects of policy or to analyze interview transcripts. Computers help researchers in almost every discipline identify what’s interesting within their data.</p>
<p>Computers also tend to be personal instruments. We typically have exclusive use of our own, and the files and folders it contains are generally considered a private space, hidden from public view. Preparing data, analyzing it, visualizing the results – these are tasks done on the computer, in private. Only at the very end of the pipeline comes a publicly visible journal article summarizing all the private tasks.</p>
<p>The problem is that most modern science is so complicated, and most journal articles so brief, it’s impossible for the article to include details of many important methods and decisions made by the researcher as he analyzed his data on his computer. How, then, can another researcher judge the reliability of the results, or reproduce the analysis?</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/101143/original/image-20151107-16263-lyvih5.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/101143/original/image-20151107-16263-lyvih5.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/101143/original/image-20151107-16263-lyvih5.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=391&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/101143/original/image-20151107-16263-lyvih5.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=391&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/101143/original/image-20151107-16263-lyvih5.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=391&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/101143/original/image-20151107-16263-lyvih5.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=492&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/101143/original/image-20151107-16263-lyvih5.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=492&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/101143/original/image-20151107-16263-lyvih5.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=492&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Good luck recreating the analysis.</span>
<span class="attribution"><span class="source">US Army</span></span>
</figcaption>
</figure>
<h2>How much transparency do scientists owe?</h2>
<p>Stanford statisticians <a href="http://statweb.stanford.edu/%7Ewavelab/Wavelab_850/wavelab.pdf">Jonathan Buckheit and David Donoho</a> described this issue as early as 1995, when the personal computer was still a fairly new idea.</p>
<blockquote>
<p>An article about computational science in a scientific publication is not the scholarship itself, it is merely <strong>advertising</strong> of the scholarship. The actual scholarship is the complete software development environment and the complete set of instructions which generated the figures.</p>
</blockquote>
<p>They make a radical claim. It means all those private files on our personal computers, and the private analysis tasks we do as we prepare for publication, should be made public along with the journal article.</p>
<p>This would be a huge change in the way scientists work. We’d need to prepare from the start for everything we do on the computer to eventually be made available for others to see. For many researchers, that’s an overwhelming thought. Victoria Stodden has found the <a href="http://www.iassistdata.org/conferences/2010/presentation/1394">biggest objection to sharing files</a> is the time it takes to prepare them by writing documentation and cleaning them up. The second biggest concern is the risk of not receiving credit for the files if someone else uses them. </p>
<h2>A new toolbox to enhance reproducibility</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/101113/original/image-20151106-16263-gkt7vd.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/101113/original/image-20151106-16263-gkt7vd.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=237&amp;fit=clip" srcset="https://images.theconversation.com/files/101113/original/image-20151106-16263-gkt7vd.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=745&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/101113/original/image-20151106-16263-gkt7vd.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=745&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/101113/original/image-20151106-16263-gkt7vd.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=745&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/101113/original/image-20151106-16263-gkt7vd.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=937&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/101113/original/image-20151106-16263-gkt7vd.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=937&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/101113/original/image-20151106-16263-gkt7vd.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=937&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">What secrets are within the computer?</span>
<span class="attribution"><a class="source" href="http://ftp.arl.mil/ftp/historic-computers/">US Army</a></span>
</figcaption>
</figure>
<p>Recently, several different groups of scientists have converged on recommendations for tools and methods to make it easier to keep track of files and analyses done on computers. These groups include <a href="http://doi.org/10.1371/journal.pcbi.1003285">biologists</a>, <a href="http://dx.doi.org/10.1890/ES14-00402.1">ecologists</a>, <a href="http://doi.org/10.1038/nature.2014.16014">nuclear engineers</a>, <a href="http://doi.org/10.3389/fninf.2012.00009">neuroscientists</a>, <a href="http://doi.org/10.1007/s10614-007-9084-4">economists</a> and <a href="http://polmeth.wustl.edu/methodologist/tpm_v18_n2.pdf">political scientists</a>. <a href="http://doi.org/10.1371/journal.pbio.1001745">Manifesto-like papers</a> lay out their recommendations. When researchers from such different fields converge on a common course of action, it’s a sign a major watershed in doing science might be under way.</p>
<p>One major recommendation: <a href="http://www.the-scientist.com/?articles.view/articleNo/43632/title/Get-With-the-Program/">minimize and replace</a> point-and-click procedures during data analysis as much as possible by using scripts that contain instructions for the computer to carry out. This solves the problem of ephemeral mouse movements, which leave few traces, are difficult to communicate to other people and are hard to automate. Such movements are common during data cleaning and organizing tasks using a spreadsheet program like Microsoft Excel. A script, on the other hand, contains unambiguous instructions that can be read by its author far into the future (when the specific details have been forgotten) and by other researchers. Scripts can also be included within a journal article, since they aren’t big files. And they can easily be adapted to automate research tasks, saving time and reducing the potential for human error.</p>
<p>We can see examples of this in <a href="http://doi.org/10.1128/AEM.01369-15">microbiology</a>, <a href="http://doi.org/10.1098/rspb.2014.1631">ecology</a>, <a href="http://doi.org/10.1080/09692290.2012.727362">political science</a> and <a href="http://doi.org/10.1016/j.jhevol.2015.03.014">archaeology</a>. Instead of mousing around menus and buttons, manually editing cells in a spreadsheet and dragging files between several different software programs to obtain results, these researchers wrote scripts. Their scripts automate the movement of files, the cleaning of the data, the statistical analysis, and the creation of graphs, figures and tables. This saves a lot of time when checking the analysis and redoing it to explore different options. And by looking at the code in the script file, which becomes part of the publication, anyone can see the exact steps that produced the published results. </p>
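<p>The groups behind these recommendations mostly point to languages such as R and Python for this kind of work, but the idea is independent of any particular language. As a toy sketch – the file names are hypothetical and the analysis is deliberately trivial – here is what a scripted step looks like: every decision, from skipping blank rows to writing out the result, is recorded and can be re-run by anyone.</p>
<pre><code>import java.io.BufferedReader;
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Paths;

// A toy, re-runnable "analysis script": every step from raw data to result
// is written down, so anyone can repeat it exactly.
public class AnalyseMeasurements {
    public static void main(String[] args) throws IOException {
        double sum = 0;
        int count = 0;
        // measurements.csv is a hypothetical input file with one number per line.
        try (BufferedReader in = Files.newBufferedReader(Paths.get("measurements.csv"))) {
            String line;
            while ((line = in.readLine()) != null) {
                if (line.trim().isEmpty()) {
                    continue;               // skip blank rows instead of hand-editing them
                }
                sum += Double.parseDouble(line.trim());
                count++;
            }
        }
        // The result is written to a file that can accompany the paper.
        try (PrintWriter out = new PrintWriter("mean.txt")) {
            out.println("mean = " + (sum / count));
        }
    }
}
</code></pre>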
<p>Other recommendations include the use of <a href="http://library.queensu.ca/ojs/index.php/IEE/article/view/4608/4898">common, nonproprietary file formats</a> for storing files (such as CSV, or comma-separated values, for tables of data) and <a href="http://doi.org/10.1371/journal.pcbi.1000424">simple rubrics</a> for <a href="https://github.com/Reproducible-Science-Curriculum/rr-init">systematically organizing files</a> into folders to make it easy for others to understand how the information is structured. They recommend free software that is available for all computer systems (e.g. Windows, Mac and Linux) for analyzing and visualizing data (such as <a href="http://doi.org/10.1038/517109a">R</a> and <a href="http://doi.org/10.1038/518125a">Python</a>). For collaboration, they recommend a free program called <a href="http://www.scfbm.org/content/8/1/7">Git</a>, which helps track changes when many people are editing the same document.</p>
<p>Currently, these are the tools and methods of the avant-garde, and many midcareer and senior researchers have only a vague awareness of them. But many undergraduates are learning them now. Many graduate students, seeing personal advantages to getting organized, using open formats, free software and streamlined collaboration, are <a href="http://doi.org/10.1038/nature.2014.15799">seeking out training</a> and tools from volunteer organizations such as <a href="http://software-carpentry.org/">Software Carpentry</a>, <a href="http://www.datacarpentry.org/">Data Carpentry</a> and <a href="https://ropensci.org">rOpenSci</a> to fill the gaps in their formal training. My university recently created an <a href="http://escience.washington.edu">eScience Institute</a>, where we help researchers adopt these recommendations. Our institute is part of a <a href="https://www.moore.org/programs/science/data-driven-discovery/data-science-environments">bigger movement</a> that includes similar institutes at <a href="http://bids.berkeley.edu/">Berkeley</a> and <a href="http://datascience.nyu.edu/">New York University</a>.</p>
<p>As students learning these skills graduate and progress into positions of influence, we’ll see these standards become the new normal in science. Scholarly journals will require code and data files to accompany publications. Funding agencies will require they be placed in publicly accessible online repositories. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/101114/original/image-20151106-16255-kk5jm.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/101114/original/image-20151106-16255-kk5jm.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/101114/original/image-20151106-16255-kk5jm.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=241&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/101114/original/image-20151106-16255-kk5jm.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=241&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/101114/original/image-20151106-16255-kk5jm.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=241&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/101114/original/image-20151106-16255-kk5jm.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=303&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/101114/original/image-20151106-16255-kk5jm.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=303&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/101114/original/image-20151106-16255-kk5jm.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=303&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Example of a script used to analyze data.</span>
<span class="attribution"><span class="license">Author provided</span></span>
</figcaption>
</figure>
<h2>Open formats and free software are a win/win</h2>
<p>This change in the way researchers use computers will be beneficial for public engagement with science. As researchers become more comfortable sharing more of their files and methods, members of the public will have much better access to scientific research. For example, a high school teacher will be able to show students raw data from a recently published discovery and walk the students through the main parts of the analysis, because all of these files will be available with the journal article.</p>
<p>Similarly, as researchers increasingly use free software, members of the public will be able to use the same software to remix and extend results published in journal articles. Currently many researchers use expensive commercial software programs, the cost of which makes them inaccessible to people outside of universities or large corporations. </p>
<p>Of course, the personal computer is not the sole cause of problems with <a href="https://theconversation.com/we-found-only-one-third-of-published-psychology-research-is-reliable-now-what-46596">reproducibility</a> in <a href="https://theconversation.com/half-of-biomedical-research-studies-dont-stand-up-to-scrutiny-and-what-we-need-to-do-about-that-45149">science</a>. Poor experimental design, inappropriate statistical methods, a highly competitive research environment and the <a href="https://theconversation.com/real-crisis-in-psychology-isnt-that-studies-dont-replicate-but-that-we-usually-dont-even-try-47249">high value placed on novelty</a> and publication in high-profile journals are all to blame.</p>
<p>What’s unique about the role of the computer is that we have a solution to the problem. We have clear recommendations for mature tools and well-tested methods borrowed from computer science research to improve the reproducibility of research done by any kind of scientist on a computer. With a small investment of time to learn these tools, we can help restore this cornerstone of science.</p><img src="https://counter.theconversation.com/content/49938/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Ben Marwick receives funding from the Australian Research Council. He is affiliated with the University of Washington eScience Institute (Data Science Fellow). He contributes to the Software Carpentry, Data Carpentry, and rOpenSci organisations. Currently he is a Senior Research Fellow at the Centre for Archaeological Science at the University of Wollongong. </span></em></p>Virtually every researcher relies on computers to collect or analyze data. But when computers are opaque black boxes that manipulate data, it's impossible to replicate studies – a core value for science.Ben Marwick, Associate Professor of Archaeology, University of WashingtonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/450852015-08-18T10:06:28Z2015-08-18T10:06:28ZIn the push for marketable skills, are we forgetting the beauty and poetry of STEM disciplines?<figure><img src="https://images.theconversation.com/files/92112/original/image-20150817-5110-17nt3z4.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">There is beauty in mathematical ideas and proofs.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/lucapost/694780262/in/photolist-24oVY3-diC14-4f3Jaz-4JqiwG-a5sw5-2RsTt1-geDfL-agNTbS-bz6igw-4f7GCA-aZLKD4-acJS5w-zdTJr-o8nVHc-6GsoZ-A3oZS-cd1WBC-8BMbiL-jXn1k8-jy4a28-4ikigj-usq3wD-6zjnBu-oo7TWg-anDsYW-2RsUqE-rzSR2m-pktg1Y-6aBPfC-qzzDXg-akeS8f-LfcF1-wdC58y-fkp13e-e9XnEF-73kFqy-d4AxJs-97N2Vr-baxAc-ugXsf-oqbq-8hDsUX-acJS9E-cVnd-pnMLBq-acJS7o-vhEJ3-6Mbpka-pDrCn8-5XbUTK">lucapost</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p>Thousands of students are preparing to begin their job searches with newly earned STEM (science, technology, engineering and mathematics) degrees in hand, eagerly waiting to use the logical, analytical and practical skills they’ve acquired.</p>
<p>However, as qualified as they might be, they could be missing one critical component of the STEM field – art.</p>
<p>I pursued an education and career in computer science and mathematics. And I know only too well that in the field of computer science, there is often an emphasis on elegance and beauty alongside sheer practicality. Indeed, programming itself is sometimes referred to as an <a href="http://ruben.verborgh.org/blog/2013/02/21/programming-is-an-art/">art</a>.</p>
<p>It is the same in related fields. The discipline of mathematics has long championed beauty as an important quality of ideas and proofs. And, of course, many engineers value elegance and beauty as important components in their designs and solutions.</p>
<h2>Poetry is at the heart of technology</h2>
<p>As many seasoned programmers and mathematicians will tell you, there is <a href="http://www.i-programmer.info/news/200-art/6808-writing-code-as-poetry-poetry-as-code.html">poetry in technology</a>. In fact, some regard such poetry as being at the heart of what they do. </p>
<p>In the 1980s, <a href="http://www.tracykidder.com/">Tracy Kidder</a> wrote <em>The Soul of a New Machine</em>, a book about the <a href="https://www.nytimes.com/books/99/01/03/specials/kidder-soul.html">pressure and effort</a> of building a next-generation computer. But more importantly, that account opened a lot of people’s eyes to the passion and beauty in creating these machines.</p>
<p>Many of the engineers in the book repeatedly explained that they didn’t work for the money, but rather for the gratification of invention and design – in essence, the beauty of it.</p>
<p>Indeed, reading that book was especially meaningful to me as I began my own studies in computing. </p>
<p>As I know through experience, constructing something poetic/beautiful is very fulfilling to the practitioner. Computer scientists value elegance and beauty in the creation of algorithms and computer programs. </p>
<h2>Mathematical beauty around us</h2>
<p>Similarly, ideas of beauty and poetry have always been important in mathematics.</p>
<p>Prominent mathematicians and computer scientists have long embraced elegance, beauty, poetry and literacy in the code that they write and the theorems that they prove. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/92115/original/image-20150817-25727-jkcswp.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/92115/original/image-20150817-25727-jkcswp.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=361&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/92115/original/image-20150817-25727-jkcswp.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=361&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/92115/original/image-20150817-25727-jkcswp.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=361&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/92115/original/image-20150817-25727-jkcswp.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=453&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/92115/original/image-20150817-25727-jkcswp.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=453&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/92115/original/image-20150817-25727-jkcswp.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=453&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Beauty is important in programming as well.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/seeminglee/8921779798/in/photolist-eAosiC-eM7Nt3-8fUHJ5-dJJaMa-75vxwE-7YmQp6-nVVCfx-7FiU1M-88RAmR-qcSVGW-73zoNZ-o9rumE-o9jEvD-o9CnNi-kJz7fa-75tWag-6sr9f1-74T8P4-9fK69A-pVwTQC-8zPfCv-75vGtW-6h7DLk-ao4yLs-6KosGR-9NnUjX-75y7TJ-9NqG1N-8KuSmz-9hYhPp-9NqGR3-9bavnP-qXct22-imxhyi-9NnXnt-4qfL61-byidYK-9wCAyR-e9VDki-8w1M8R-9fK64C-gDR3aL-bCL9xk-dDNWKm-73Arz1-81JzLb-j7vKq2-mocVVz-9s18nz-bBYzuW">See-ming Lee</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>These ideas, in fact, have been around for millennia. Indeed, the extreme separation of the disciplines is relatively new in Western history.</p>
<p>Those doing science (natural philosophy) and mathematics were also often doing poetry and music. Many of today’s disciplines were once subsumed under philosophy. So surprise at the idea that science and mathematics could be poetic is a relatively recent phenomenon.</p>
<p>For example, Pythagoras was a philosopher/scientist/mystic/mathematician who <a href="http://www.pbs.org/wgbh/nova/physics/great-math-mystery.html">explored beauty</a> in art and music.</p>
<p>This attention to beauty and pattern continued through Fibonacci and beyond.</p>
<p>Fibonacci (13th century), considered the leading mathematician of the Middle Ages, is probably best known for the <a href="https://www.mathsisfun.com/numbers/fibonacci-sequence.html">Fibonacci Sequence</a> named after him: each number in the sequence is the sum of the previous two (e.g., start with 1 and 2; add them to get 3, then add 2 and 3 to get 5, and so on: 1, 2, 3, 5, 8, 13, 21, 34, …). Fibonacci <a href="http://io9.com/5985588/15-uncanny-examples-of-the-golden-ratio-in-nature">discovered</a> that much else that we regard as beautiful follows this elegant pattern.</p>
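<p>A few lines of code are enough to generate the sequence from that rule – this small example simply starts from 1 and 2, as above, and repeatedly adds the previous two numbers.</p>
<pre><code>public class Fibonacci {
    public static void main(String[] args) {
        long previous = 1;
        long current = 2;
        System.out.print(previous + ", " + current);
        for (int i = 0; i < 8; i++) {
            long next = previous + current;   // each number is the sum of the previous two
            System.out.print(", " + next);
            previous = current;
            current = next;
        }
        System.out.println(", ...");          // prints 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, ...
    }
}
</code></pre>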
<p>This technical, mathematical beauty is evident in all of nature – from flower petals and shells to spiral galaxies and hurricanes.</p>
<h2>Discoveries come through intuition</h2>
<p>Intuition and discovery, rather than a kind of routine analysis, are important in computing as well as mathematics and science. Significant insights are known to come through intuition.</p>
<p>Intuition is deemed to be so important that developing “computer intuition” is one of the goals in the subfield of artificial intelligence.</p>
<p>So, in computing, there is really no “standard” way to write complex, interesting and aesthetically pleasing programs. Little surprise then that Stanford Professor Emeritus <a href="http://www-cs-faculty.stanford.edu/%7Euno/taocp.html">Donald Knuth’s</a> four-volume masterpiece is titled <em>The Art of Computer Programming</em>.</p>
<p>Similarly, years ago a colleague in the arts told me about the <a href="http://www.pbs.org/wgbh/nova/physics/andrew-wiles-fermat.html">PBS show</a> on Andrew Wiles’s proof of <a href="http://mathworld.wolfram.com/FermatsLastTheorem.html">Fermat’s Last Theorem</a>. Wiles, a British mathematician, devoted much of his career to proving Fermat’s Last Theorem, a problem that no one had been able to solve for 300 years.</p>
<p>My colleague confided that she was moved to tears during the program. Until that show she had thought that mathematics was cold, dry, absolute and passionless. That show completely changed her view so that she could finally see the passion and the poetry that permeates the STEM fields.</p>
<h2>STEM versus liberal arts?</h2>
<p>Many STEM graduates today spend their college years enrolling only in courses they believe will benefit them in their field, zeroing in on skills that will make them more marketable in the digital age, while overlooking social sciences, humanities and the arts. Of course, likewise, many humanities students try to avoid taking science and mathematics courses. </p>
<p>And it shouldn’t be this way; but that’s a discussion for another day.</p>
<p>It’s projected that <a href="http://www.bls.gov/emp/ep_table_101.htm">685,000 new employment opportunities</a> will be created by 2022 in computer and mathematical occupations.</p>
<p>But today’s students need to remember that technology is not just a matter of rote procedure – completing the task according to set protocol; that would not be particularly elegant. </p>
<p>As sciences, technology and computing become ever more powerful forces in the world, it’s important that the people piecing these things together are ethical and bring in the human attributes that are central to a liberal arts education.</p>
<p>We need thinkers, visionaries and creative minds. As the technology industry grows – and with it, employment opportunities – we need more candidates who are rooted in thought and fewer who can simply carry out a task.</p>
<p>For those students graduating with a liberal arts degree, who are unsure where their job hunt will take them, we welcome you with open arms to technology, mathematics and computing.</p>
<p>And for those in technology, celebrate your humanity and the available cultural riches; become aware of the intuition and the poetry in what you do. Bring with you your love for beauty, passion and artistry, and be prepared to use them.</p><img src="https://counter.theconversation.com/content/45085/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Paul Myers does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Poetry is at the heart of technology. Did not Pythagoras find the connections between beautiful music and mathematics?Paul Myers, Chair of Computer Science , Trinity UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/437982015-06-25T09:08:16Z2015-06-25T09:08:16ZHow computers are learning to make human software work more efficiently<figure><img src="https://images.theconversation.com/files/86265/original/image-20150624-31501-vfj9r7.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Need a computer doctor? Dial 100110011001</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/s/artificial+intelligence/search.html?page=5&amp;thumb_size=mosaic&amp;inline=140863105">agsandrew</a></span></figcaption></figure><p>Computer scientists have a history of borrowing ideas from nature, such as evolution. When it comes to optimising computer programs, a very interesting evolutionary-based approach has emerged over the past five or six years that could bring incalculable benefits to industry and eventually consumers. We call it genetic improvement. </p>
<p>Genetic improvement involves writing an automated “programmer” that manipulates the source code of a piece of software through trial and error with a view to making it work more efficiently. This might include swapping lines of code around, deleting lines and inserting new ones – very much like a human programmer. Each manipulation is then tested against some quality measure to determine if the new version of the code is an improvement over the old version. It is about taking large software systems and altering them slightly to achieve better results.</p>
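<p>In outline, the trial-and-error loop looks something like the sketch below. It is greatly simplified – the “program” is just an array of lines, the mutation is a random swap, and the quality measure is a stand-in for running a real test suite – and every name is a hypothetical placeholder rather than part of any actual genetic-improvement tool.</p>
<pre><code>import java.util.Random;

// Greatly simplified sketch of a genetic-improvement loop (hill climbing on
// a "program"). All names are illustrative placeholders.
public class GeneticImprovementSketch {

    // The automated "programmer": swap two lines at random, like a tiny code edit.
    static String[] mutate(String[] program, Random rng) {
        String[] copy = program.clone();
        int i = rng.nextInt(copy.length);
        int j = rng.nextInt(copy.length);
        String tmp = copy[i];
        copy[i] = copy[j];
        copy[j] = tmp;
        return copy;
    }

    // Placeholder quality measure: the number of adjacent lines already in
    // alphabetical order stands in for "how many tests pass".
    static double score(String[] program) {
        int ok = 0;
        for (int i = 1; i < program.length; i++) {
            if (program[i - 1].compareTo(program[i]) <= 0) {
                ok++;
            }
        }
        return ok;
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        String[] best = {"line C", "line A", "line B"};
        double bestScore = score(best);
        for (int generation = 0; generation < 1000; generation++) {
            String[] candidate = mutate(best, rng);
            double candidateScore = score(candidate);
            if (candidateScore > bestScore) {    // keep a change only if it improves quality
                best = candidate;
                bestScore = candidateScore;
            }
        }
        System.out.println("best score found: " + bestScore);
    }
}
</code></pre>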
<h2>The benefits</h2>
<p>These interventions can bring a variety of benefits in the realm of what programmers describe as the functional properties of a piece of software. They might improve how fast a program runs, for instance, or remove bugs. They can also be used to help transplant old software to new hardware. </p>
<p>The potential doesn’t stop there. Because genetic improvement operates on source code, it can also improve the so-called non-functional properties. These include all the features that are not concerned purely with the input-output behaviour of programs, such as the amount of bandwidth or energy that the software consumes. These are often particularly tricky for a human programmer to deal with, given the already challenging problem of building correctly functioning software in the first place.</p>
<p>We have seen a few examples of genetic improvement beginning to be recognised in recent years – albeit still within universities for the moment. A good early one dates <a href="http://www.genetic-programming.org/combined.php">from 2009</a>, when such an automated “programmer” built by the University of New Mexico and the University of Virginia fixed 55 out of 105 bugs in various different kinds of software, ranging from a media player to a Tetris game. For this it won $5,000 (£3,173) and a Gold Humie Award, which is awarded for achievements produced by genetic and evolutionary computation.</p>
<p>In the past year, UCL in London has overseen two research projects that have demonstrated the field’s potential (full disclosure: both have involved co-author William Langdon). The first <a href="http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6733370">involved</a> a genetic-improvement program that could take a large complex piece of software with more than 50,000 lines of code and speed up its functionality by 70 times. </p>
<p>The second <a href="http://link.springer.com/chapter/10.1007%2F978-3-319-09940-8_20">carried out</a> the first automated wholesale transplant of one piece of software into a larger one by taking a linguistic translator called <a href="http://babelxl.com">Babel</a> and inserting it into an instant-messaging system called <a href="https://pidgin.im">Pidgin</a>. </p>
<h2>Nature and computers</h2>
<p>To understand the scale of the opportunity, you have to appreciate that software is a unique engineering material. In other areas of engineering, such as electrical and mechanical engineering, you might build a computational model before you build the final product, since it allows you to push your understanding and test a particular design. On the other hand, software is its own model. A computational model of software is still a computer program. It is a true representation of the final product, which maximises your ability to optimise it with an automated programmer. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/86266/original/image-20150624-31507-6x2wvk.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/86266/original/image-20150624-31507-6x2wvk.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=237&amp;fit=clip" srcset="https://images.theconversation.com/files/86266/original/image-20150624-31507-6x2wvk.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=851&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/86266/original/image-20150624-31507-6x2wvk.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=851&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/86266/original/image-20150624-31507-6x2wvk.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=851&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/86266/original/image-20150624-31507-6x2wvk.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=1069&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/86266/original/image-20150624-31507-6x2wvk.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=1069&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/86266/original/image-20150624-31507-6x2wvk.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=1069&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Thank you, Mr Darwin.</span>
<span class="attribution"><a class="source" href="http://www.shutterstock.com/cat.mhtml?lang=en&amp;language=en&amp;ref_site=photo&amp;search_source=search_form&amp;version=llv1&amp;anyorall=all&amp;safesearch=1&amp;use_local_boost=1&amp;autocomplete_id=14351489844696635000&amp;search_tracking_id=b0uFEorKwiKQdv3sptk7sA&amp;searchterm=evolution&amp;show_color_wheel=1&amp;orient=&amp;commercial_ok=&amp;media_type=images&amp;search_cat=&amp;searchtermx=&amp;photographer_name=&amp;people_gender=&amp;people_age=&amp;people_ethnicity=&amp;people_number=&amp;color=&amp;page=1&amp;inline=252138244">Everett Historical</a></span>
</figcaption>
</figure>
<p>As we mentioned at the beginning, there is a rich tradition of computer scientists borrowing ideas from nature. Nature inspired genetic algorithms, for example, which crunch through the millions of possible answers to a real-life problem with many variables to come up with the best one. Examples include anything from devising a wholesale road distribution network to fine-tuning the design of an engine. </p>
<p>Though the evolution metaphor has become something of a millstone in this context, <a href="https://theconversation.com/why-we-fell-out-of-love-with-algorithms-inspired-by-nature-42718">as discussed here</a>, genetic algorithms have had a number of successes, producing results that are comparable with those written by humans, or even better.</p>
<p>Evolution also inspired <a href="http://whatis.techtarget.com/definition/genetic-programming">genetic programming</a>, which attempts to build programs from scratch using small sets of instructions. It is limited, however. One of its many criticisms is that it cannot even evolve the sort of program that would typically be expected of a first-year undergraduate, and will not therefore scale up to the huge software systems that are the backbone of large multinationals. </p>
<p>This makes genetic improvement a particularly interesting deviation from this discipline. Instead of trying to rewrite the whole program from scratch, it succeeds by making small numbers of tiny changes. It doesn’t even have to confine itself to genetic improvement as such. The Babel/Pidgin example showed that it can extend to transplanting a piece of software into a program in a similar way to how surgeons transplant body organs from donors to recipients. This is a reminder that the overall goal is automated software engineering. Whatever nature can teach us when it comes to developing this fascinating new field, we should grab it with both hands.</p><img src="https://counter.theconversation.com/content/43798/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Machines are not very good at writing software from scratch, but they're getting pretty good at improving on human efforts.John R. Woodward, Lecturer in Computer Science, University of StirlingJustyna Petke, Research Associate at the Centre for Research on Evolution, Search and Testing, UCLWilliam Langdon, Principal Research Associate, UCLLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/425172015-06-01T05:17:38Z2015-06-01T05:17:38ZOracle vs Google case threatens foundations of software design<figure><img src="https://images.theconversation.com/files/83402/original/image-20150529-15241-vuwpgj.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Copyright keeps appearing where it&#39;s not wanted.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/48690880@N03/5814893360">Christopher Dombres</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>The Java programming language, which has <a href="http://www.infoworld.com/article/2923773/java/java-at-20-how-java-changed-programming-forever.html">just turned 20 years old</a>, provides developers with a means to write code that is independent of the hardware it runs on: “<a href="http://www.computerweekly.com/feature/Write-once-run-anywhere">write once, run anywhere</a>”. </p>
<p>But, ironically, while Java was intended to make programmers’ lives easier, the <a href="https://www.eff.org/cases/oracle-v-google">court case</a> between Oracle, Java’s owner, and Google over Google’s use of Java as the basis of its Android mobile operating system may make things considerably more difficult. </p>
<p>Google adopted Java for Android apps, using its own, rewritten version of the Java run-time environment (the <a href="https://anturis.com/blog/java-virtual-machine-the-essential-guide/">Java virtual machine</a> or VM) called <a href="https://source.android.com/devices/tech/dalvik/index.html">Dalvik</a>. The Oracle vs Google court case centres around the use of Java in Android, particularly in relation to Application Program Interface (API) calls. </p>
<p>An API is a standard set of interfaces that a developer can use to communicate with a useful piece of code – for example, to exchange input and output, access network connections, graphics hardware, hard disks, and so on. For developers, using an existing API means not having to reinvent the wheel by accessing ready-made code. For those creating APIs, making them publicly and freely accessible encourages developers to use them and create compatible software, which in turn makes it more attractive to end users. </p>
<p>For example, <a href="https://www.opengl.org/">OpenGL</a> and <a href="https://msdn.microsoft.com/library/windows/apps/hh452744">Microsoft’s DirectX</a> are two APIs that provide a standardised interface for developers to access 3D graphics hardware, as used in videogames or modelling applications. Hardware manufacturers ensure their hardware is compatible with the API standard, the OpenGL Consortium and Microsoft update their APIs to ensure the latest hardware capabilities are addressed and games developers get a straightforward interface compatible with many different types of hardware, making it easier to create games.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/83441/original/image-20150530-15207-qaygn7.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/83441/original/image-20150530-15207-qaygn7.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/83441/original/image-20150530-15207-qaygn7.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=297&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/83441/original/image-20150530-15207-qaygn7.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=297&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/83441/original/image-20150530-15207-qaygn7.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=297&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/83441/original/image-20150530-15207-qaygn7.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=373&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/83441/original/image-20150530-15207-qaygn7.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=373&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/83441/original/image-20150530-15207-qaygn7.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=373&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Java runtime and compatible Android equivalent.</span>
<span class="attribution"><span class="source">Author provided</span></span>
</figcaption>
</figure>
<h2>Fight for your right to API</h2>
<p>Google designed Android so that Java developers could bring their code to Android by recreating (most of) the standard Java API calls used in the Java libraries and supported by the standard Java VM. The case revolves around whether doing this – by essentially re-creating the Java API rather than officially licensing it from Oracle – is a breach of copyright. If the court finds in favour of Oracle, it will set a precedent that APIs are copyrightable, and so make developers’ lives a lot more legally complex.</p>
<p>To be clear, the case doesn’t revolve around any claim that Google reused actual code belonging to Oracle, but that the code it produced mimicked what Oracle’s Java run-time environment was capable of.</p>
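<p>To see what is at stake in code terms, here is a minimal sketch in Python (hypothetical module and function names; nothing below is Oracle’s or Google’s actual code). The “API” is the function name and signature that calling programs depend on; the two bodies behind it are written independently.</p>
<pre><code># vendor_text.py -- the "original" library (hypothetical example)
def word_count(text):
    """Return the number of whitespace-separated words in text."""
    return len(text.split())


# clone_text.py -- an independent re-implementation of the same API:
# identical name and signature, completely different body.
def word_count(text):
    """Return the number of whitespace-separated words in text."""
    total, in_word = 0, False
    for ch in text:
        if ch.isspace():
            in_word = False
        elif not in_word:
            total, in_word = total + 1, True
    return total


# A program written against the API runs unchanged with either module;
# the legal question is whether copying the interface alone infringes.
print(word_count("write once, run anywhere"))   # 4
</code></pre>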
<p>The initial finding came in May 2012, when a US court agreed with Google’s claim that using the APIs in this way falls under fair use, and that Oracle’s copyright was not infringed. Then in May 2014, the US Federal Circuit reversed part of the ruling in favour of Oracle, especially related to the issue of copyright of an API. Now, at the US Supreme Court’s request, <a href="http://arstechnica.com/tech-policy/2015/05/white-house-sides-with-oracle-tells-supreme-court-apis-are-copyrightable/">the White House has weighed in</a> in Oracle’s favour.</p>
<h2>Can you ‘own’ an API?</h2>
<p>For most in the industry, a ruling that it’s possible to copyright an API would be a disaster. It would mean that many companies would have to pay extensive licence fees, and even face having to write their own APIs from scratch – even those needed to programmatically achieve only the simplest of things. If companies can prevent others from replicating their APIs through recourse to copyright law, then all third-party developers could be locked out. And if both the call to the API and the functionality behind it were covered by copyright, a rival implementation would have to behave differently as well – otherwise it would count as a copy.</p>
<p>In the initial trial, District Judge William Alsup <a href="http://www.cnet.com/news/judge-william-alsup-master-of-the-court-and-java/">taught himself Java</a> to learn the foundation of the language. He decided that to allow the copyrighting of Java’s APIs would allow the copyrighting of an improbably broad range of generic (and therefore uncopyrightable) functions, such as interacting with window menus and interface controls. The Obama administration’s intervention is to emphasise its belief that the case should be decided on whether Google had a right under fair use to use Oracle’s APIs.</p>
<h2>It’s like the PC all over again</h2>
<p>Something like this has happened before. When IBM produced its original PC in 1981 (the <a href="http://oldcomputers.net/ibm5150.html">IBM 5150</a>), a key aspect was access to the system calls provided by the PC BIOS, which booted the computer and managed basic hardware such as keyboard, monitor, floppy disk drive and so on. Without access to the BIOS it wasn’t possible to create software for the computer. </p>
<p>One firm, Compaq, <a href="http://mashable.com/2014/05/29/halt-and-catch-fire-amc-compaq/">decided to reverse-engineer the BIOS calls</a> to create its own, compatible version – hence the term “IBM PC compatible” became standard language to describe a program that would run on an IBM model or on any of the third-party hardware from other manufacturers that subsequently blossomed. IBM’s monopoly on the PC market was opened up, and the market exploded into what we see today – would this have happened had IBM been able to copyright its system calls?</p>
<p>So 20 years after its birth, and thanks to the groundwork laid by its original creator, Sun Microsystems, Java has become one of the most popular programming languages in the world by being cross-platform and (mostly) open. But now that openness seems to have led it into a trap. The wrong decision in this case could have a massive impact on the industry, where even using a button on a window could require some kind of licence – and licence fees. For software developers, it’s a horrible thought. Copyrighting APIs would lock many companies into complex agreements – and lock out many other developers from creating software for certain platforms.</p>
<p>For Google, there’s no way of extracting Java from Android now; Android’s runaway success is bringing Google a whole lot of problems. But as we go about building a world that runs on software, be assured that one way or another this ruling will have a massive effect on us all.</p><img src="https://counter.theconversation.com/content/42517/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Bill Buchanan does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>A decision against Google in its court case against Oracle this week could lay the ground for upheaval in the industry.Bill Buchanan, Head, Centre for Distributed Computing, Networks and Security, Edinburgh Napier UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/424962015-05-28T12:16:10Z2015-05-28T12:16:10ZReport into air traffic control failure shows we need a better approach to programming<figure><img src="https://images.theconversation.com/files/83241/original/image-20150528-32187-1fj0vc0.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">The higher they are, the further they have to fall.</span> <span class="attribution"><a class="source" href="http://commons.wikimedia.org/wiki/File:Changi_Airport_Air_Traffic_Control_(141922192).jpg">Ramil Sagum</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>The causes of the National Air Traffic Services (<a href="http://www.nats.aero/about-us/what-we-do/our-control-centres/">NATS</a>) flight control centre system failure in December 2014 that affected 65,000 passengers directly and up to 230,000 indirectly have been revealed in a recently published report.</p>
<p>The <a href="http://www.caa.co.uk/docs/2942/Independent%20Enquiry%20Final%20Report%202.0.pdf">final report</a> from the UK Civil Aviation Authority’s <a href="http://www.caa.co.uk/application.aspx?appid=7&amp;mode=detail&amp;nid=2411">Independent Inquiry Panel</a> set up after the incident examines the cause of and response to the outage at the Swanwick control centre in Hampshire, one of two sites controlling UK airspace (the other is at Prestwick in Scotland). Safety is key, said the report. I agree. And safety was not compromised in any way. Bravo!</p>
<p>“Independent” is a relative term: after all, the panel includes Joseph Sultana, director of Eurocontrol’s Network Management, and NATS’s operations chief Martin Rolfe, as well as UK Civil Aviation Authority board member and director of safety and airspace regulation Mark Swan – all of whom have skin in the game. (Full disclosure: a panel member, Professor John McDermid, is a valued colleague of many years.)</p>
<p>For a thorough analysis, however, it’s essential to involve people who know the systems intimately. Anyone who has dealt with software knows that often the fastest way to find a fault in a computer program is to ask the programmer who wrote the code. And the NATS analysis and recovery involved the programmers too, Lockheed Martin engineers who built the system in the 1990s. This is one of two factors behind the “rapid fault detection and system restoration” during the incident on December 12.</p>
<p>The report investigates three things: the system outage – its cause and how the system was restored; NATS’ operational response to the outage; and what this says about how well the findings and recommendations following the last major incident, a year earlier, had been implemented. I look only at the first here, but arguably the other two are more important in the end.</p>
<h2>Cause and effect</h2>
<p>In the NATS control system, real-time traffic data is fed into controller workstations by a system component called the System Flight Server (SFS). The SFS architecture is what is called “hot back-up”. There are two identical components (called “channels”) computing the same data at the same time. Only one is “live” in the running system. If this channel falls over, then the identical back-up becomes the live channel, so the first can be restored to operation while offline. </p>
<p>This works quite well to cope with hardware failures, but is no protection against faults in the system logic, as that logic is running identically on both channels. If a certain input causes the first channel to fall over, then it will cause the second to fall over in exactly the same way. This is what happened in December.</p>
<p>The report describes a “latent software fault” in code written in the 1990s. Workstations in active use by controllers and supervisors, whether for control or observation, are counted as Atomic Functions (AF). Their number should be limited by the SFS software to a maximum of 193, but in fact the limit was set to 151, and the SFS fell over when the count reached 153.</p>
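<p>A minimal sketch of that failure mode in Python (hypothetical names and simplified behaviour based on the report’s account, not the actual Lockheed Martin code) shows why a hot back-up offers no protection here:</p>
<pre><code># Sketch of the failure mode described above (hypothetical, simplified).
DESIGN_LIMIT = 193   # the maximum the system was meant to allow
CODED_LIMIT = 151    # the limit the software actually enforced


class ChannelFailure(Exception):
    pass


class Channel:
    """One of the two identical SFS channels in the hot back-up pair."""

    def handle_update(self, active_workstations):
        if active_workstations > CODED_LIMIT:
            # Latent fault: 153 workstations is within the design limit
            # of 193, but exceeds the wrongly coded 151.
            raise ChannelFailure("workstation limit exceeded")
        return "update processed"


primary, backup = Channel(), Channel()
for name, channel in (("primary", primary), ("back-up", backup)):
    try:
        channel.handle_update(active_workstations=153)
    except ChannelFailure as fault:
        # Hot back-up copes with hardware failure, not logic faults:
        # the identical back-up falls over on exactly the same input.
        print(name, "channel down:", fault)
</code></pre>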
<h2>Deja vu</h2>
<p>My first thought is that we’ve heard this before. As far back as 1997-98, evidence given to the House of Commons Select Committee on Environment, Transport and Regional Affairs <a href="http://www.parliament.the-stationery-office.co.uk/pa/cm199798/cmselect/cmenvtra/360iv/et0407.htm">reported</a> that the NATS system, then under development, was having trouble scaling from 30 to 100 active workstations. But this recent event was much simpler than that – it’s the kind of fault you see often in first-year university programming classes and which students are trained to avoid through inspection and testing. </p>
<p>There are technical methods known as static analysis to avoid such faults – and static analysis of the 1990s was well able to detect them. But such thorough analysis may have been seen as an impossible task: it was <a href="http://www.parliament.the-stationery-office.co.uk/pa/cm199798/cmselect/cmenvtra/360iv/et0407.htm">reported</a> in 1995 that the system exhibited 21,000 faults, of which 95% had been eliminated by 1997 (hurray!) – leaving 1,050 which hadn’t been (boo!). Not counting, of course, the fault which triggered the December outage. (I wonder how many more are lurking?)</p>
<p>How could an error not tolerated in undergraduate-level programming homework enter software developed by professionals over a decade <a href="http://www.computerweekly.com/feature/A-brief-history-of-an-air-traffic-control-system">at a cost approaching a billion pounds</a>?</p>
<h2>Changing methods</h2>
<p>Practice has changed since the 1990s. Static analysis of code in critical systems is now regarded as necessary. So-called <a href="http://www.eschertech.com/products/correct_by_construction.php">Correct by Construction</a> (CbyC) techniques, in which how software works is defined in a specification and then developed through a process of refinement in such a way as <a href="http://proteancode.com/keynote.pdf">demonstrably to avoid</a> common sources of error, have proved their worth. NATS nowadays successfully uses key systems developed along CbyC principles, such as <a href="http://nats.aero/blog/2013/07/how-technology-is-transforming-air-traffic-management">iFacts</a>.</p>
<p>But change comes only gradually, and old habits are hard to leave behind. For example, <a href="https://nakedsecurity.sophos.com/2014/02/24/anatomy-of-a-goto-fail-apples-ssl-bug-explained-plus-an-unofficial-patch/">Apple’s “goto fail” bug</a>, which surfaced in 2014 in many of its systems, rendered void an internet security function essential for trust online – validating website authentication certificates. Yet it was caused by a single duplicated line of code – essentially a programming typo – that could and should have been caught by the most rudimentary static analysis. </p>
<p>Unlike the public enquiry and report undertaken by NATS, Apple has said little about either how the problem came about or the lessons learned – and the same goes for the developers of many other software packages that lie at the heart of the global computerised economy.</p><img src="https://counter.theconversation.com/content/42496/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Peter Bernard Ladkin presented evidence to the UK House of Commons Transportation Sub-committee on the development of the Swanwick system in 1997 and 1998. His tech-transfer company Causalis Limited received consulting payments from BT Systems, as well as from Serco for due-diligence analysis of the Swanwick system, for their bids during the privatisation of NATS near the turn of the millennium.</span></em></p>Software is now too critical to how the world works, so we need to enforce ways to ensure it's better.Peter Bernard Ladkin, Professor of Computer Networks and Distributed Systems, University of BielefeldLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/422592015-05-28T03:59:55Z2015-05-28T03:59:55ZA bit of coding in school may be a dangerous thing for the IT industry<figure><img src="https://images.theconversation.com/files/82818/original/image-20150525-32548-1xhi0hf.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Teaching children to code is nothing new but does that teach them enough about the IT industry.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/sanjoselibrary/15799650667/">Flickr/San Jos Library</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>It sounds compelling, but what does Opposition leader Bill Shorten actually mean when he says all secondary school pupils should be taught “digital technologies, computer science and coding”? And, equally importantly, why?</p>
<p>In his <a href="http://www.alp.org.au/budget_reply_speech">budget reply speech</a>, as part of a focus on science and technology education, he stated that “coding” should be part of Australia’s national curriculum.</p>
<p>He doesn’t define precisely what he means by “coding”, but it’s likely the speech was at least partly inspired by the work of <a href="http://code.org/">code.org</a>, a US-based multi-national computing education programme.</p>
<p><a href="https://www.whitehouse.gov/blog/2014/12/10/president-obama-first-president-write-line-code">Barack Obama participated</a> in one of the organisation’s largest projects, the introductory “<a href="https://hourofcode.com/au">Hour of Code</a>” for primary school students, allegedly becoming the first US President to write a computer program in the process.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/AI_dayIQWV4?wmode=transparent&amp;start=0" frameborder="0" allowfullscreen></iframe>
</figure>
<p><a href="http://code.org/">code.org</a>’s lessons introduce basic programming skills in bite-size chunks, in simplified programming environments that make it easy to get small examples working.</p>
<p>While a great deal of thought (and not a small amount of time, talent and money) has gone into constructing <a href="http://code.org/">code.org</a>’s lessons, many of the approaches are not new. Attempts to teach children programming date to the late 1960s with the Logo programming language. </p>
<p>One of Logo’s key innovations was “turtle graphics”, in which students learned programming by controlling the movement of a “turtle” that could move forwards or backwards, rotate, or raise and lower a pen to draw as it moved. </p>
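<p>The idea survives essentially unchanged today: Python’s standard turtle module is a descendant of Logo’s turtle graphics, and a first program still looks like this (a minimal sketch):</p>
<pre><code># Drawing a square with Python's standard turtle module,
# which reimplements Logo-style turtle graphics.
import turtle

pen = turtle.Turtle()
for _ in range(4):        # repeat four times...
    pen.forward(100)      # ...move 100 units in the direction the turtle faces
    pen.left(90)          # ...then rotate 90 degrees anticlockwise
turtle.done()             # keep the drawing window open
</code></pre>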
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/82665/original/image-20150522-12478-7atfny.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/82665/original/image-20150522-12478-7atfny.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/82665/original/image-20150522-12478-7atfny.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=450&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/82665/original/image-20150522-12478-7atfny.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=450&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/82665/original/image-20150522-12478-7atfny.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=450&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/82665/original/image-20150522-12478-7atfny.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=566&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/82665/original/image-20150522-12478-7atfny.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=566&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/82665/original/image-20150522-12478-7atfny.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=566&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Turtle Graphics and the LOGO programming language.</span>
<span class="attribution"><span class="source">Screenshot</span></span>
</figcaption>
</figure>
<p>Hour of Code’s
<a href="http://studio.code.org/s/frozen/stage/1/puzzle/3">Frozen-based tutorial</a> puts turtle graphics, and MIT’s graphical <a href="https://scratch.mit.edu/">Scratch programming language</a>, in Disney wrapping. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/82664/original/image-20150522-12482-sxv3pm.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/82664/original/image-20150522-12482-sxv3pm.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/82664/original/image-20150522-12482-sxv3pm.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=309&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/82664/original/image-20150522-12482-sxv3pm.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=309&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/82664/original/image-20150522-12482-sxv3pm.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=309&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/82664/original/image-20150522-12482-sxv3pm.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=388&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/82664/original/image-20150522-12482-sxv3pm.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=388&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/82664/original/image-20150522-12482-sxv3pm.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=388&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Programming tutorial at code.org.</span>
<span class="attribution"><span class="source">Screenshot</span></span>
</figcaption>
</figure>
<p>The fact that these approaches are evolutionary rather than revolutionary doesn’t make them bad. We can say from personal experience that learning to code in bite-size chunks can be both instructive and inspirational. This kind of introduction set us on the paths that led to our careers in computing.</p>
<p>It’s fun, creative and much more accessible than pop-culture stereotypes about introverted “genius” programmers. By democratising access to such introductory lessons in coding, a broader pool of people may enter the profession, which is a very good thing.</p>
<h2>It might be coding but is it IT?</h2>
<p>But this approach to teaching, in focusing on the accessible and the enjoyable, has the unfortunate side effect of misrepresenting much of the work IT professionals really do.</p>
<p>To create and work with information technology at a professional level, both computer science and software engineering are essential skills. But they are rarely even hinted at in coding exercises for children, even at secondary level.</p>
<p>The core of the discipline of computer science – algorithmics (the topic of a <a href="http://www.vcaa.vic.edu.au/Pages/vce/studies/algorithmics/algorithmicsindex.aspx">new unit</a> in the Victorian Certificate of Education) – is the science of solving problems by devising and analysing procedures. These procedures can then be turned into computer programs. It is rewarding, but much like its sister discipline mathematics, it’s hard, sometimes tedious and often highly abstract.</p>
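<p>A small illustration of what “devising and analysing procedures” means in practice (a sketch in Python, not material from the VCE unit itself): two procedures that solve the same problem – finding a name in a sorted list – at very different costs.</p>
<pre><code># Two procedures for the same problem: find a value in a sorted list.
# Algorithmics is about devising such procedures and analysing their cost.

def linear_search(items, target):
    """Check every entry in turn: up to len(items) comparisons."""
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1


def binary_search(items, target):
    """Halve the range each step: roughly log2(len(items)) comparisons."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1


names = ["Ada", "Alan", "Edsger", "Grace", "Tim"]   # already sorted
print(linear_search(names, "Grace"), binary_search(names, "Grace"))   # 3 3
</code></pre>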
<p>Similarly, a huge fraction of what IT professionals do falls in the domain of software engineering – the profession focused on building large software systems within the constraints of the real world. Planning, design and management of large development teams are very challenging problems.</p>
<p>That said, over the past 50 years we have learned a lot about how <em>not</em> to build large software systems, and have found systematic methods that work far better than unstructured trial and error.</p>
<h2>Team effort</h2>
<p>So, while getting some small programs to run might inspire students, the kind of things that can realistically be taught to a broad range of children only help a little as preparation for professional study in IT.</p>
<p>In fact, they might be counterproductive. The creative ad-hockery of individuals working alone or in pairs for short periods is not an approach that scales to large systems that actually meet the needs of their stakeholders. For that, you need the analysis that computer science brings and especially the skills and perspective that software engineering teaches. </p>
<p>But in our experience as tertiary educators, many students believe that they have come to university to learn how to perform solo ad-hockery in a particularly expert manner. </p>
<p>Consequently, they do not take anything that isn’t “programming” seriously until they run headlong into the year-long team projects in their final year of tertiary education.</p>
<p>At best, they have a lot of catching up to do. At worst, they don’t learn from the experience and enter the employment market without the skills and attitudes that successful IT professionals exhibit and that switched-on employers expect. <a href="http://www.techrepublic.com/blog/10-things/10-types-of-programmers-youll-encounter-in-the-field/">Cowboy coders</a> are those who can’t or won’t work in teams and who focus on code to the detriment of actual user needs, and they will find it difficult to compete for jobs. </p>
<p>Cowboy coding is hardest to discourage in those students with the most confidence in their own programming abilities, usually students with the most pre-university programming experience. Learning ad-hoc programming without exposure to computer science and software engineering runs the risk of exacerbating the misconceptions behind cowboy coding.</p>
<h2>What if you don’t want to work in IT?</h2>
<p>According to a <a href="http://www.awpa.gov.au/publications/Documents/ICT-STUDY-FINAL-28-JUNE-2013.pdf">report</a> by the Australian Workforce and Productivity Agency, just 4% of the Australian workforce is employed in information technology roles. The majority of Australians, now and into the future, will not make information technology their career.</p>
<p>So what of those who don’t dedicate their lives to IT, what do they get out of a bit of exposure to coding at school?</p>
<p>Given the role that computing plays in our lives, there’s a strong argument that participation in our democracy will be enhanced by giving every Australian some knowledge of how computers function. A future attorney-general might not be quite so confused about the meaning of “metadata”, for instance. </p>
<p>One potential justification for coding in schools is that an increasing number of professions call for some programming skills. Scientists and engineers of all kinds, financial analysts and managers (among others) will have to either perform programming themselves, or work closely with those who do.</p>
<p>Even the relatively technology-phobic journalism profession is conducting a lively internal debate about the value of journalists knowing how to code, both to create interactive content and to support the kind of statistical analysis performed in data journalism (exemplified by the work of Nate Silver, the American statistician who writes about <a href="http://fivethirtyeight.com/contributors/nate-silver/">baseball</a> and <a href="http://www.theguardian.com/politics/2015/apr/27/nate-silver-statistician-us-2012-predicts-uk-general-election-result">elections</a>).</p>
<p>But it’s important to keep what non-specialists can achieve in perspective. If you want to see the results of amateurs struggling to design, plan and manage large, complex physical construction projects, watch an episode of the TV show The Block. Large software construction projects are difficult for amateurs to pull off for similar reasons.</p>
<h2>Realistic ambitions</h2>
<p>The Australian national curriculum already contains a proposed “<a href="http://www.australiancurriculum.edu.au/Curriculum/Overview">digital technologies</a>” section, and a revised version is likely to be adopted in the next few years. This curriculum proposes to teach computing with a heavy focus on computer science and software engineering, far beyond the coding mentioned in Shorten’s speech.</p>
<p>The proposed curriculum is extremely ambitious. Were it to be taken literally, Year 10 students across the nation would be expected to master abilities and bring a mindset that we struggle to achieve in our graduates, who have a strong interest in the topic and years of specialist training as well as the patience, focus and perspective that adulthood brings.</p>
<p>As such, we are extremely sceptical that much of this can actually be effectively taught to the end of mandatory curricula at Year 10 level, let alone earlier in secondary or primary schooling.</p>
<p>In practice, we think it likely that students in the compulsory years of schooling will do some experimental, creative small-scale coding in lessons like those provided by code.org, but at best will be told about computer science and software engineering, which has very limited effect.</p>
<p>Telling students about these disciplines, even at tertiary level, does not on its own lead to any ability to apply them, or even to recognise when they might be needed.</p>
<p>We believe that there is indeed justification for teaching students more about computers in the years of compulsory schooling. But this should be tempered with a clear focus on what we want students to learn, and some realism about what can be achieved in a school setting in compulsory lessons.</p>
<p>Otherwise, we not only run the risk of wasting valuable resources, we run the risk of actively harming students’ chances of later mastering the full range of skills required of IT professionals.</p><img src="https://counter.theconversation.com/content/42259/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Teaching children to code with computers is only part of the challenge to preparing people for a career in the IT industry. But it can also do more harm that good in some cases.Robert Merkel, Lecturer in Software Engineering, Monash UniversityRobyn McNamara, Assistant Lecturer in Software Engineering; PhD candidate in Computer Science Education, Monash UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/420462015-05-20T20:05:30Z2015-05-20T20:05:30ZAn education for the 21st century means teaching coding in schools<figure><img src="https://images.theconversation.com/files/82337/original/image-20150520-30551-1gphxk7.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Programs like Hour of Code introduce computer programming to students in an engaging manner.</span> <span class="attribution"><span class="source">Hour of Code 2014/Flickr</span>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p>Bill Shorten’s <a href="http://billshorten.com.au/labors-plan-for-coding-in-schools">recent announcement</a> that, if elected, a Labor Government would “ensure that computer coding is taught in every primary and secondary school in Australia” has brought attention to an increasing world trend. </p>
<p>Estonia introduced <a href="http://ubuntulife.net/computer-programming-for-all-estonian-schoolchildren/">coding in primary schools</a> in 2012 and the UK <a href="http://www.bbc.com/news/technology-29010511">followed suit</a> last year. US-led initiatives such as <a href="http://code.org/">Code.org</a> and the “<a href="https://hourofcode.com/au">Hour of Code</a>”, supported by organisations such as Google and Microsoft, advocate that every school student should have the opportunity to learn computer coding. </p>
<p>There is merit in <a href="http://www.afr.com/technology/its-time-to-teach-our-school-kids-a-new-language-code-20150309-13v4qp">school students learning coding</a>. We live in a digital world where computer programs underlie everything from business and marketing to aviation, science and medicine, to name just a few fields. During a recent presentation at a radio station, one of our hosts said that IT would have been a better background for his career in radio than journalism. </p>
<p>There is also a strong case to be made that Australia’s future prosperity will depend on delivering advanced services and digital technology, and that programming will be essential to this end. Computer programs and software are known to be a <a href="http://scholar.harvard.edu/files/jorgenson/files/02_jorgenson_ho_samuels19nov20101_2.pdf">strong driver</a> of <a href="http://www.siia.net/Admin/FileManagement.aspx/LinkClick.aspx?fileticket=ffCbUo5PyEM%3D&amp;portalid=0">productivity improvements</a> in many fields. </p>
<p>Being introduced to coding gives students an appreciation of what can be built with technology. We are surrounded by devices controlled by computers. Understanding how they work, and imagining new devices and services, are enhanced by understanding coding. </p>
<p>Of course, not everyone taught coding will become a coder or have a career in information technology. Art is taught in schools with no expectation that the students should become artists.</p>
<h2>Drag and drop</h2>
<p>A computer program is effectively a means of automating processes. Programs systematically and reliably follow processes and can be used to exhaustively try all the possibilities.</p>
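<p>A few lines of Python, for example, can exhaustively check every possibility in a way no person would attempt by hand (a made-up toy example):</p>
<pre><code># Exhaustively try all the possibilities: find every three-digit number
# that equals the sum of the cubes of its own digits.
matches = []
for n in range(100, 1000):
    digits = [int(d) for d in str(n)]
    if sum(d ** 3 for d in digits) == n:
        matches.append(n)
print(matches)   # [153, 370, 371, 407]
</code></pre>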
<p>The <a href="http://www.devsaran.com/blog/10-best-programming-languages-2015-you-should-know">languages</a> used to program computers have <a href="http://www.abc.net.au/technology/articles/2013/01/11/3667939.htm">evolved</a> in the 70 years we have been building computers. Interfaces and programming environments have become more natural and intuitive. Language features reflect the applications they’re used for.</p>
<p>What is needed to easily express a business process, scientific equation, or data analysis technique is not necessarily the same as what is needed to rapidly develop a video game.</p>
<p>However, throughout the evolution of programming languages, the fundamental principles have remained the same. Computer programming languages express three essential things (illustrated in the short sketch after this list): </p>
<ol>
<li><p>The order in which a sequence of instructions is performed</p></li>
<li><p>A means of repeating a sequence of instructions a prescribed number of times</p></li>
<li><p>A test that decides whether or not a sequence of instructions is performed.</p></li>
</ol>
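<p>A minimal Python sketch showing all three at once:</p>
<pre><code>total = 0                       # 1. instructions run in sequence, top to bottom
for number in range(1, 11):     # 2. repetition: run the indented block ten times
    if number % 2 == 0:         # 3. a test deciding whether instructions run
        total = total + number
print(total)                    # prints 30, the sum of the even numbers up to 10
</code></pre>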
<p>While personal preference influences <a href="http://langpop.com/">which computer language</a> a programmer uses, there is a greater understanding of which languages work well for teaching introductory programming. For example, <a href="https://scratch.mit.edu/">Scratch</a> is popular for primary school students and is quick to learn. <a href="http://www.ps.uni-saarland.de/alice/">Alice</a> has been used to help students quickly build computer animations. <a href="https://www.python.org/">Python</a> is increasingly used for scientific applications. <a href="http://en.wikipedia.org/wiki/Visual_programming_language">Visual programming languages</a> – where students can drag-and-drop icons rather than type code – allow for rapid development of simple programs. </p>
<p>At Swinburne University of Technology we run <a href="http://www.swinburne.edu.au/media-centre/news/2015/04/swinburne-partners-with-the-brainary-to-deliver-robotics-workshops.html">workshops</a> that introduce school students to programming <a href="https://www.aldebaran.com/en/humanoid-robot/nao-robot">NAO robots</a>. Students use the <a href="http://doc.aldebaran.com/1-14/software/choregraphe/choregraphe_overview.html">Choregraphe environment</a> to link robot actions from a library. </p>
<p>Students previously unused to programming can develop interesting robot projects in a couple of days. More sophisticated development of the robot requires students to use a more detail-oriented language, such as Python or <a href="http://en.wikipedia.org/wiki/C%2B%2B">C++</a>. The simpler options lead to a positive student experience.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/82338/original/image-20150520-30504-13hkelw.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/82338/original/image-20150520-30504-13hkelw.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/82338/original/image-20150520-30504-13hkelw.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=404&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/82338/original/image-20150520-30504-13hkelw.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=404&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/82338/original/image-20150520-30504-13hkelw.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=404&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/82338/original/image-20150520-30504-13hkelw.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=507&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/82338/original/image-20150520-30504-13hkelw.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=507&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/82338/original/image-20150520-30504-13hkelw.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=507&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The Nao robot can be programmed easily to perform a range of tasks.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/brettdavis/4436987052/in/photolist-7L5HEL-9jhfb2-9tpLfe-9jheUe-9jhe3P-9tsEk1-5b6h9b-5b6DRG-9sQrBJ-9jkr6d-hwbdyc-hw9LcY-hwa1rj-hwb2H4-hwacu1-hwaL48-hw9TCQ-74rqLE-5b6FHj-5b6xcS-5b6HQL-5b2qRa-5b6DqN-5b6moy-5b6yuN-5b6nc7-5b2qDV-5b2hV6-5b2hwD-5b6EcU-5b2eMr-5b6wDj-5b6gWs-5b6Cpm-5b6jHh-5b6BVw-5b2nHe-5b28kZ-5b2eG6-5b266P-5b6F6S-5b6u9L-5b6nKb-5b2cpt-5b6wHN-5b6rsC-5b6zVE-5b26wk-5b6Axj-5b6n5A">Brett Davis/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span>
</figcaption>
</figure>
<h2>Computational thinking</h2>
<p>Writing and then executing a program gives immediate feedback as to whether you have correctly expressed instructions for the computer. Ultimately, the understanding of how to express concepts so that a computer can perform tasks accurately and efficiently is far more important than the details of the programming language. </p>
<p>Underlying all computer programs are <a href="https://theconversation.com/au/topics/algorithm">algorithms</a>, which specify in a more abstract way how a task is to be done. Algorithmic thinking – also called <a href="http://en.wikipedia.org/wiki/Computational_thinking">computational thinking</a> – underlies computer science, and there has been a growing movement to teach algorithmic thinking in schools. </p>
<p>The new <a href="http://www.australiancurriculum.edu.au/">national curriculum</a> reflects algorithmic processes, and materials are being developed to help teachers with the new curriculum. Victoria has recently developed a new subject for the Victorian Certificate of Education (<a href="http://www.vcaa.vic.edu.au/Pages/vce/index.aspx">VCE</a>) entitled <a href="http://www.vcaa.vic.edu.au/Pages/vce/studies/algorithmics/algorithmicsindex.aspx">Algorithmics</a>.
There are even materials for teaching algorithmic thinking without computers. The <a href="http://csunplugged.org/">Computer Science Unplugged</a> movement, led by Tim Bell and colleagues at the University of Canterbury, has developed resources that teach students concepts through movement and fun activities. </p>
<h2>Teaching for this century</h2>
<p>Teaching computer coding in schools is very different from initiatives that advocate for computers in the classroom. I was not, and am still not, supportive of compulsory laptop programs in schools. </p>
<p>The idea is not necessarily to expose students to the technology itself, which is almost inevitable these days with the wide penetration of mobile phones. Rather, students are exposed to the skills needed to develop computer applications. </p>
<p>While the extent of IT skills shortages is a contentious topic, there is no doubt that not enough of the best and brightest are studying computer science at university. A significant factor is insufficient exposure to the subject at school. Teaching coding in schools is aimed at addressing that gap.</p>
<p>It might be said that whatever programming language is taught will be obsolete by the time the students enter the workforce. My experience is that, if taught properly, students can rapidly transfer the principles of one language to another. </p>
<p>In the 19th and 20th centuries, the challenge was to understand the physical world, and harness force and energy. This understanding percolated into the school curriculum. In the 21st century, the challenge is to understand and harness data, information and knowledge. Computer programming is a necessary way of introducing students to these concepts.</p><img src="https://counter.theconversation.com/content/42046/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Leon Sterling works for Swinburne University of Technology. He receives funding from the Australian Research Council. He is the immediate past president of the Australian Council of Deans of ICT and chairs the Australian Informatics Olympiad Committee.</span></em></p>If we want students to be well prepared for the 21st century, then we should be teaching coding in school.Leon Sterling, Pro Vice Chancellor Digital Frontiers, Swinburne University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/379612015-03-09T18:50:21Z2015-03-09T18:50:21ZTo stop the machines taking over we need to think about fuzzy logic<figure><img src="https://images.theconversation.com/files/74030/original/image-20150306-3317-w7ieka.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">A model of the Terminator from the popular movie series where machines take over the world.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/shutterjunkie/3877277138">Flickr/Edwin Montufar</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/">CC BY-NC-SA</a></span></figcaption></figure><p>Amid all the <a href="https://theconversation.com/is-stephen-hawking-right-could-ai-lead-to-the-end-of-humankind-34967">dire warnings</a> that machines run by artificial intelligence (<a href="https://theconversation.com/au/topics/artificial-intelligence">AI</a>) will one day take over from humans we need to think more about how we program them in the first place.</p>
<p>The technology may be too far off to seriously entertain these worries – for now – but much of the distrust surrounding AI arises from misunderstandings in what it means to say a machine is “thinking”.</p>
<p>One of the current aims of AI research is to design machines, algorithms, input/output processes or mathematical functions that can mimic human thinking as much as possible.</p>
<p>We want to better understand what goes on in human thinking, especially when it comes to decisions that cannot be justified other than by drawing on our “intuition” and “gut-feelings” – the decisions we can only make after learning from experience.</p>
<p>Consider the human that hires you after first comparing you to other job-applicants in terms of your work history, skills and presentation. This human-manager is able to make a decision identifying the successful candidate.</p>
<p>If we can design a computer program that takes exactly the same inputs as the human-manager and can reproduce its outputs, then we can make inferences about what the human-manager really values, even if he or she cannot articulate their decision on who to appoint other than to say “it comes down to experience”.</p>
<p>This kind of research is being <a href="http://www.sciencedirect.com/science/article/pii/S0165011406002612">carried out today</a> and applied to understand risk-aversion and risk-seeking behaviour of financial consultants. It’s also being looked at in the field of <a href="http://www.britannica.com/EBchecked/topic/400270/MYCIN">medical diagnosis</a>.</p>
<p>These human-emulating systems are not yet being asked to make decisions, but they are certainly being used to help guide human decisions and reduce the level of human error and inconsistency.</p>
<h2>Fuzzy sets and AI</h2>
<p>One promising area of research is to utilise the framework of <a href="https://www.calvin.edu/%7Epribeiro/othrlnks/Fuzzy/fuzzysets.htm">fuzzy sets</a>. Fuzzy sets and fuzzy logic were formalised by <a href="http://www.sciencedirect.com/science/article/pii/S1026309811000666">Lotfi Zadeh</a> in 1965 and can be used to mathematically represent our knowledge pertaining to a given subject.</p>
<p>In everyday language what we mean when accusing someone of “fuzzy logic” or “fuzzy thinking” is that their ideas are contradictory, biased or perhaps just not very well thought out.</p>
<p>But in mathematics and logic, “fuzzy” is a name for a research area that has quite a sound and straightforward basis.</p>
<p>The starting point for fuzzy sets is this: many decision processes that can be managed by computers traditionally involve truth values that are binary: something is true or false, and any action is based on the answer (in computing this is typically encoded by 0 or 1).</p>
<p>For example, our human-manager from the earlier example may say to human resources:</p>
<ul>
<li>IF the job applicant is aged 25 to 30</li>
<li>AND has a qualification in philosophy OR literature</li>
<li>THEN arrange an interview.</li>
</ul>
<p>This information can all be written into a hiring algorithm, based on true or false answers, because an applicant either is between 25 and 30 or is not, they either do have the qualification or they do not.</p>
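<p>Encoded literally (a toy Python sketch, not a real HR system), the rule is nothing more than a handful of true-or-false tests:</p>
<pre><code># The manager's first rule as crisp, binary logic (toy example).
def arrange_interview(age, qualifications):
    right_age = 25 <= age <= 30
    right_subject = "philosophy" in qualifications or "literature" in qualifications
    return right_age and right_subject


print(arrange_interview(27, ["philosophy"]))   # True: arrange an interview
print(arrange_interview(40, ["literature"]))   # False: outside the age band
</code></pre>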
<p>But what if the human-manager is somewhat more vague in expressing their requirements? Instead, the human-manager says:</p>
<ul>
<li>IF the applicant is tall</li>
<li>AND attractive</li>
<li>THEN the salary offered should be higher.</li>
</ul>
<p>The problem HR faces in encoding these requests into the hiring algorithm is that it involves a number of subjective concepts. Even though height is something we can objectively measure, how tall should someone be before we call them tall?</p>
<p>Attractiveness is also subjective, even if we only account for the taste of the single human-manager.</p>
<h2>Grey areas and fuzzy sets</h2>
<p>In fuzzy sets research we say that such characteristics are fuzzy. By this we mean that whether something belongs to a set or not, whether a statement is true or false, can gradually increase from 0 to 1 over a given range of values.</p>
<p>One of the hardest things in any fuzzy-based software application is how best to convert observed inputs (someone’s height) into a fuzzy degree of membership, and then further establish the rules governing the use of connectives such as AND and OR for that fuzzy set.</p>
<p>To this day, and likely in years or decades into the future, the rules for this transition are human-defined. For example, to specify how tall someone is, I could design a function that says a 190cm person is tall (with a truth value of 1) and a 140cm person is not tall (or tall with a truth value of 0).</p>
<p>Then from 140cm, for every increase of 5cm in height the truth value increases by 0.1. So a key feature of any AI system is that we, normal old humans, still govern all the rules concerning how values or words are defined. More importantly, we define all the actions that the AI system can take – the “THEN” statements. </p>
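<p>That description of “tall” translates directly into code. A minimal Python sketch of the membership function above, together with the usual fuzzy versions of AND and OR (taking the minimum and maximum of the memberships, as Zadeh proposed):</p>
<pre><code># Membership function for "tall", following the figures above:
# 140cm or below is not tall (0.0), 190cm or above is tall (1.0),
# rising by 0.1 for every 5cm in between.
def tall(height_cm):
    if height_cm <= 140:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return round((height_cm - 140) / 50, 2)


# Zadeh's standard connectives: AND as minimum, OR as maximum.
def fuzzy_and(a, b):
    return min(a, b)


def fuzzy_or(a, b):
    return max(a, b)


print(tall(140), tall(165), tall(190))   # 0.0 0.5 1.0
print(fuzzy_and(tall(165), 0.9))         # 0.5: "tall AND attractive" (0.9 is a made-up membership)
</code></pre>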
<h2>Human–robot symbiosis</h2>
<p>An area called <a href="http://www.cs.berkeley.edu/%7Ezadeh/papers/What%20Computing%20with%20Words%20Means%20to%20Me-CIM%202010.pdf">computing with words</a> takes the idea further by aiming for seamless communication between a human user and an AI computer algorithm.</p>
<p>For the moment, we still need to come up with mathematical representations of subjective terms such as “tall”, “attractive”, “good” and “fast”. Then we need to design a function for combining such comments or commands, followed by another mathematical definition for turning the result we get back into an output like “yes he is tall”.</p>
<p>In conceiving the idea of computing with words, researchers envisage a time where we might have more access to base-level expressions of these terms, such as the brain activity and readings when we use the term “tall”.</p>
<p>This would be an amazing leap, although mainly in terms of the technology required to observe such phenomena (the number of neurons in the brain, let alone synapses between them, is somewhere near the number of galaxies in the universe).</p>
<p>Even so, designing machines and algorithms that can emulate human behaviour to the point of mimicking communication with us is still a long way off.</p>
<p>In the end, any system we design will behave as it is expected to, according to the rules we have designed and the program that governs it.</p>
<h2>An irrational fear?</h2>
<p>This brings us back to the big fear of AI machines turning on us in the future.</p>
<p>The real danger is not in the birth of genuine artificial intelligence – that we will somehow manage to create a program that can become self-aware, such as HAL 9000 in the movie 2001: A Space Odyssey or Skynet in the Terminator series.</p>
<p>The real danger is that we make errors in encoding our algorithms or that we put machines in situations without properly considering how they will interact with their environment.</p>
<p>These risks, however, are the same that come with any human-made system or object.</p>
<p>So if we were to entrust, say, the decision to fire a weapon to AI algorithms (rather than just the guidance system), then we might have something to fear.</p>
<p>Not a fear that these intelligent weapons will one day turn on us, but rather that we programmed them – given a series of subjective options – to decide the wrong thing and turn on us.</p>
<p>Even if there is some uncertainty about the future of “thinking” machines and what role they will have in our society, a sure thing is that we will be making the final decisions about what they are capable of.</p>
<p>When programming artificial intelligence, the onus is on us (as it is when we design skyscrapers, build machinery, develop pharmaceutical drugs or draft civil laws), to make sure it will do what we really want it to.</p><img src="https://counter.theconversation.com/content/37961/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Simon James does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>If machines run by artificial intelligence take over the world it's only because we programmed them to do so. So how can fuzzy logic help us prevent that?Simon James, Lecturer in Mathematics, Deakin UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/379852015-02-25T19:30:45Z2015-02-25T19:30:45ZMachines master classic video games without being told the rules<figure><img src="https://images.theconversation.com/files/73018/original/image-20150225-1795-1bnuq2e.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Imagine a machine that can learn things from scratch, no pre-programmed rules. What could it do?</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/matley0/3720313238/">Flickr/Marco Abis</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p>Think you’re good at classic arcade games such as Space Invaders, Breakout and Pong? Think again.</p>
<p>In a groundbreaking paper <a href="http://dx.doi.org/10.1038/nature14236">published today in Nature</a>, a team of researchers led
by <a href="http://deepmind.com/">DeepMind</a> co-founder Demis Hassabis reported developing a deep neural network that was able to learn to play such games at an expert level.</p>
<p>What makes this achievement all the more impressive is that the program was not given <em>any</em> background knowledge about the games. It just had access to the score and the pixels on the screen. </p>
<p>It didn’t know about bats, balls, lasers or any of the other things we humans need to know about in order to play the games. </p>
<p>But by playing lots and lots of games many times over, the computer learnt first how to play, and then how to play well.</p>
<h2>A machine that learns from scratch</h2>
<p>This is the latest in a series of breakthroughs in deep learning, one of the hottest topics today in artificial intelligence (AI).</p>
<p>Actually, DeepMind isn’t the first such success at playing games. Twenty years ago a computer program known as <a href="http://www.bkgm.com/articles/tesauro/tdl.html">TD-Gammon</a> learnt to play backgammon at a super-human level also using a neural network.</p>
<p>But TD-Gammon never did so well at similar games such as chess, Go or checkers (draughts).</p>
<p>In a few years’ time, though, you’re likely to see such deep learning in your Google search results. Early last year, inspired by results like these, Google bought DeepMind for <a href="https://www.theinformation.com/google-beat-facebook-for-deepmind-creates-ethics-board">a reported UK£500 million</a>.</p>
<p>Many other technology companies are spending big in this space.</p>
<p>Baidu, the “Chinese Google”, set up the <a href="http://idl.baidu.com/en/">Institute of Deep Learning</a> and <a href="http://www.technologyreview.com/news/527301/chinese-search-giant-baidu-hires-man-behind-the-google-brain/">hired experts</a> such as Stanford University professor <a href="http://cs.stanford.edu/people/ang/">Andrew Ng</a>.</p>
<p>Facebook has set up its <a href="https://research.facebook.com/ai/">Artificial Intelligence Research Lab</a> which is led by another deep learning expert, <a href="http://yann.lecun.com/">Yann LeCun</a>.</p>
<p>And more recently <a href="http://techcrunch.com/2014/07/29/twitter-acquires-image-search-startup-madbits/">Twitter acquired Madbits</a>, another deep learning startup.</p>
<h2>What is the secret sauce behind deep learning?</h2>
<p><a href="https://gigaom.com/2014/11/14/on-reddit-geoff-hinton-talks-google-and-future-of-deep-learning/">Geoffrey Hinton</a> is one of the pioneers in this area, and is another recent Google hire. In an inspiring keynote talk at last month’s annual meeting of the <a href="http://www.aaai.org/home.html">Association for the Advancement of Artificial Intelligence</a>, he outlined three main reasons for these recent breakthroughs</p>
<p>First, lots of Central Processing Units (<a href="http://www.pcmag.com/encyclopedia/term/40436/cpu">CPUs</a>). These are not the sort of neural networks you can train at home. It takes thousands of CPUs to train the many layers of these networks. This requires some serious computing power.</p>
<p>In fact, a lot of progress is being made using the raw horse power of Graphics Processing Units (<a href="http://www.pcmag.com/encyclopedia/term/43886/gpu">GPUs</a>), the super fast chips that power graphics engines in the very same arcade games.</p>
<p>Second, lots of data. The deep neural network plays the arcade game millions of times.</p>
<p>Third, a couple of nifty tricks for speeding up the learning such as training a collection of networks rather than a single one. Think the wisdom of crowds.</p>
<h2>What will deep learning be good for?</h2>
<p>Despite all the excitement about deep learning technologies, though, there are limits to what they can do.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/lge-dl2JUAM?wmode=transparent&amp;start=0" frameborder="0" allowfullscreen></iframe>
<figcaption><span class="caption">DeepMind co-founder Demis Hassaabis on the potential of Artificial intellgience to solve some of biggest problems that humanity faces.</span></figcaption>
</figure>
<p>Deep learning appears to be good for low-level tasks that we do without much thinking: recognising a cat in a picture, understanding some speech on the phone or playing an arcade game like an expert.</p>
<p>These are all tasks we have “compiled” down into our <em>own</em> marvellous neural networks.</p>
<p>Cutting through the hype, it’s much less clear if deep learning will be so good at high level reasoning. This includes proving difficult mathematical theorems, optimising a complex supply chain or scheduling all the planes in an airline.</p>
<h2>Where next for deep learning?</h2>
<p>Deep learning is sure to turn up in a browser or smartphone near you before too long. We will see products such as a <a href="https://www.apple.com/ios/siri/">super-smart Siri</a> that simplifies your life by <em>predicting</em> your next desire.</p>
<p>But I suspect there will eventually be a deep learning backlash in a few years’ time, when we run into the limitations of this technology, especially if more deep learning startups sell for hundreds of millions of dollars. It will be hard to meet the expectations that all these dollars entail.</p>
<p>Nevertheless, deep learning looks set to be another piece of the AI jigsaw. Putting these and other pieces together will see much of what we humans do replicated by computers.</p>
<p>If you want to hear more about the future of AI, I invite you to the <a href="http://www.nextbigthingsummit.com.au/">Next Big Thing Summit</a> in Melbourne on April 21, 2015. This is part of the two-day <a href="http://www.con-nect.com.au/">CONNECT</a> conference taking place in the Victorian capital. </p>
<p>Along with AI experts such as <a href="http://robots.stanford.edu/">Sebastian Thrun</a> and <a href="http://people.csail.mit.edu/brooks/">Rodney Brooks</a>, I will be trying to predict where all of this is taking us.</p>
<p>And if you’re feeling nostalgic and want to try your hand at one of these games, go to Google Images and search for “atari breakout” (or follow this <a href="https://g.co/doodle/uk3vrq">link</a>). You’ll get a browser version of the Atari classic to play.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/73025/original/image-20150225-1795-2b2cs3.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/73025/original/image-20150225-1795-2b2cs3.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/73025/original/image-20150225-1795-2b2cs3.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=364&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/73025/original/image-20150225-1795-2b2cs3.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=364&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/73025/original/image-20150225-1795-2b2cs3.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=364&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/73025/original/image-20150225-1795-2b2cs3.png?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=458&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/73025/original/image-20150225-1795-2b2cs3.png?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=458&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/73025/original/image-20150225-1795-2b2cs3.png?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=458&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A web browser version of Atari’s breakout found in Google images search.</span>
<span class="attribution"><a class="source" href="https://g.co/doodle/uk3vrq">Google Images</a></span>
</figcaption>
</figure>
<p>And once you’re an expert at Breakout, you might want to head to <a href="https://www.atari.com/arcade">Atari’s arcade website</a>.</p><img src="https://counter.theconversation.com/content/37985/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Toby Walsh does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Tech companies are investing big in artificial intelligence research that allows machines to learn things from scratch, with no pre-programmed rules. So what's the potential for this new technology?Toby Walsh, Professor of AI at UNSW, Research Group Leader, Data61Licensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/370672015-02-03T13:12:04Z2015-02-03T13:12:04ZBuffer overflows are the ghosts that will always be among us<figure><img src="https://images.theconversation.com/files/70825/original/image-20150202-27762-19nsr03.png?ixlib=rb-1.1.0&amp;rect=0%2C136%2C798%2C597&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">The ghosts of Linux.</span> <span class="attribution"><a class="source" href="https://openclipart.org/detail/178475/ghost-by-bartm-178475">BartM</a></span></figcaption></figure><p>Following the trend of giving catchy names to serious operating system security flaws, the Linux vulnerability <a href="http://www.computerworld.com/article/2877900/ghost-flaw-in-linux-can-be-exploited-through-wordpress-other-php-apps.html">revealed recently</a> by security researchers Qualys has been called Ghost. </p>
<p>Like <a href="https://theconversation.com/dont-panic-about-heartbleed-but-have-a-spring-clean-anyway-25509">Heartbleed</a> and <a href="https://theconversation.com/bigger-than-heartbleed-bug-in-bash-leaves-millions-of-web-servers-vulnerable-32231">Shellshock</a> before it, the name is not plucked out of the air but refers to the functions called “gethostbyname” in which the flaw appears. </p>
<p>These functions translate user-friendly domain addresses such as example.com into numerical network IP addresses, such as 93.184.216.34, and are part of the <a href="http://www.gnu.org/software/libc/index.html">GNU C library</a>, which is included in practically every Linux system. This matters because most servers on the internet run Linux, so there are an enormous number of potentially vulnerable systems. Successfully exploited, the flaw could allow an attacker to gain control of the system.</p>
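<p>For the curious, most programming languages expose the same lookup. A one-line Python sketch (the address shown is only an example and may change) of the kind of call that ultimately reaches the system’s name resolver:</p>
<pre><code>import socket

# The standard-library wrapper around the system's name lookup:
# it turns a human-readable name into a numeric IP address.
print(socket.gethostbyname("example.com"))   # e.g. 93.184.216.34
</code></pre>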
<p>The Ghost flaw is an example of a <a href="http://www.cse.scu.edu/%7Etschwarz/coen152_05/Lectures/BufferOverflow.html">buffer overflow</a>, one of the most persistent types of security problems that appears endlessly in lists of security vulnerabilities. For any computer security researcher it’s a case of déjà vu.</p>
<h2>Ghost hunting</h2>
<p>A function is allocated a certain amount of memory to store the parameters or data it uses. A buffer overflow attack works because the function doesn’t correctly define or check the parameters it is sent. A malicious user can supply parameters larger than the allocated memory space, which results in data being written into memory outside that allocation – and therefore beyond whatever security restrictions had been placed on it. If this data is executable code, the system can be fooled into running it, potentially with greater system privileges.</p>
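<p>A toy Python sketch can illustrate the idea, even though real overflows happen in languages like C that write directly into memory (the “stack frame” and “return address” below are purely a simulation, not how Python itself behaves):</p>
<pre><code># A 12-byte toy "stack frame": an 8-byte buffer followed by a 4-byte
# slot standing in for the return address.
frame = bytearray(12)
frame[8:12] = (0x1000).to_bytes(4, "little")       # legitimate return address

def unsafe_copy(data: bytes) -> None:
    # No check of len(data) against the 8-byte buffer -- the essence of an overflow.
    for i, b in enumerate(data):
        frame[i] = b

# Eight bytes fill the buffer; the next four spill over the boundary.
unsafe_copy(b"A" * 8 + (0xBADC0DE).to_bytes(4, "little"))
print(hex(int.from_bytes(frame[8:12], "little")))  # 0xbadc0de, not 0x1000
</code></pre>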
<p>The amount of memory that can be overwritten in the Ghost vulnerability is really very small (either four or eight bytes, depending on whether the system is 32-bit or 64-bit). But even this tiny amount of memory may be sufficient to allow a complete compromise of the system. The degree of skill needed to exploit this particular bug may be very high but Qualys has offered an <a href="https://community.qualys.com/blogs/laws-of-vulnerabilities/2015/01/27/the-ghost-vulnerability">example of code that exploits the flaw</a> based on something as simple as sending an email to a mail server. </p>
<p>Very few applications are known to be remotely exploitable – and many more recent applications don’t use the gethostbyname functions at all. However, applications using the PHP coding language are a <a href="http://blog.sucuri.net/2015/01/critical-ghost-vulnerability-released.html">significant source of concern</a> – for example, the popular WordPress blogging software is identified as potentially susceptible, so it’s not just obscure software that’s affected.</p>
<p>Buffer overflows are part of an even larger collection of exploits arising from a lack of proper parameter checking. In many online database access applications, a malicious user (or application) can supply input parameters that have been specially crafted so that they override any built-in checking. The most common of these is known as an <a href="http://www.acunetix.com/websitesecurity/sql-injection/">SQL injection</a> attack. Buffer overflows and SQL injection attacks are similar in that both rely on deliberately malformed data being sent to program functions that cannot properly process it, and both exploit the absence of proper checking.</p>
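<p>A small Python and SQLite sketch (the table, rows and input are invented for illustration) shows the difference between pasting user input straight into a query and passing it as a checked parameter:</p>
<pre><code>import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'a1b2'), ('bob', 'c3d4')")

user_input = "nobody' OR '1'='1"   # deliberately malformed input

# Vulnerable: the input becomes part of the SQL text, so the attacker's
# quote characters rewrite the query and every row comes back.
query = "SELECT * FROM users WHERE name = '" + user_input + "'"
print(conn.execute(query).fetchall())

# Safer: a parameterised query treats the input purely as data.
print(conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall())
</code></pre>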
<p>This is largely an avoidable problem. There have been concerted efforts by the software development world to seek out and fix buffer overflows in code. It seems, however, that they will always be with us.</p>
<h2>Write once, check twice</h2>
<p>Qualys have worked with various Linux distributors in advance of announcing the vulnerability so that patches for all major distributions have been available since January 27, 2015. If you are running a variant of Linux such as Debian 7, Red Hat Enterprise Linux 6/7, CentOS 6/7, and Ubuntu 12.04, you would do very well to ensure that your system patches are up to date.</p>
<p>Taking a step back, the reaction of the computing community will be a mixture of “yikes”, “phew”, and “yawn”. The first, because the vulnerability is present in a significant number of systems worldwide. The second, because in a great many cases it’s difficult to exploit and so there’s time to roll out the patches that fix the problem. And the third, because we’ve seen it all before. </p>
<p>This particular flaw was recognised and fixed as far back as 2013 – and may have been <a href="http://www.theregister.co.uk/2015/01/27/glibc_ghost_vulnerability/">present since around 2000</a>. However, as the fix was not classified as a security problem, many popular distributions of Linux didn’t include it in their updates. </p>
<p>And so it comes back to haunt us – and it will certainly not be the last of such vulnerabilities we see. To make buffer overflows a thing of the past will require an enormous amount of due diligence – systematic, thorough code review and testing – as new code is written. But the sheer volume of code that exists, such as the potentially 15-year-old lines that include this flaw, never mind that being written anew, should give some indication of the scale of the task.</p><img src="https://counter.theconversation.com/content/37067/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>John Clark does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Following the trend of giving catchy names to serious operating system security flaws, the Linux vulnerability revealed recently by security researchers Qualys has been called Ghost. Like Heartbleed and…John Clark, Professor of Critical Systems, Deputy Head of Department (Research), University of YorkLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/335222014-11-11T08:59:44Z2014-11-11T08:59:44ZIt's possible to write flaw-free software, so why don't we?<figure><img src="https://images.theconversation.com/files/64139/original/x587hfzh-1415630068.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">If Spock would not think it illogical, it&#39;s probably good code.</span> <span class="attribution"><a class="source" href="http://commons.wikimedia.org/wiki/File:Agda_proof.jpg">Alexandre Buisse</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>Legendary Dutch computer scientist <a href="http://www.cs.utexas.edu/users/EWD/">Edsger W Dijkstra</a> famously remarked that “<a href="http://homepages.cs.ncl.ac.uk/brian.randell/NATO/nato1969.PDF">testing shows the presence, not the absence of bugs</a>”. In fact the only definitive way to establish that software is correct and bug-free is through mathematics. </p>
<p>It has long been known that software is hard to get right. Since <a href="http://computer.org/computer-pioneers/pdfs/B/Bauer.pdf">Friedrich L Bauer</a> organised the very first <a href="http://homepages.cs.ncl.ac.uk/brian.randell/NATO/NATOReports/">conference on “software engineering”</a> in 1968, computer scientists have devised methodologies to structure and guide software development. One of these, sometimes called strong software engineering or more usually <a href="http://users.ece.cmu.edu/%7Ekoopman/des_s99/formal_methods/">formal methods</a>, uses mathematics to ensure error-free programming.</p>
<p>As the economy becomes ever more computerised and entwined with the internet, flaws and bugs in software increasingly lead to economic costs from fraud and loss. But despite having heard expert evidence that echoed Dijkstra’s words and emphasised the need for the correct, verified software that formal methods can achieve, the UK government seems not to have got the message.</p>
<h2>Formal software engineering</h2>
<p>The UK has always been big in formal methods. Two British computer scientists, Tony Hoare (<a href="http://www.cs.ox.ac.uk/people/tony.hoare/">Oxford 1977-</a>, <a href="http://research.microsoft.com/en-us/news/features/hoare-080411.aspx">Microsoft Research 1999-</a>) and the late <a href="http://www.cl.cam.ac.uk/archive/rm135/">Robin Milner</a> (Edinburgh 1973-95, Cambridge 1995-2001) were given <a href="http://amturing.acm.org/">Turing Awards</a> – the computing equivalent of the Nobel Prize – for their work in formal methods.</p>
<p>British computer scientist <a href="http://homepages.cs.ncl.ac.uk/cliff.jones/">Cliff B Jones</a> was one of the inventors of the <a href="http://overturetool.org/method/">Vienna Development Method</a> while working for IBM in Vienna, and IBM UK and Oxford University Computing Laboratory, led by Tony Hoare, won a <a href="https://www.gov.uk/queens-awards-for-enterprise">Queen’s Award for Technological Achievement</a> for their work to formalise IBM’s <a href="http://www.bcs.org/upload/pdf/advprog-apr06.pdf">CICS software</a>. In the process they further developed the <a href="http://formalmethods.wikia.com/wiki/Z_notation">Z notation</a> which has become one of the major formal methods. </p>
<p>The formal method process entails describing what the program is supposed to do using logical and mathematical notation, then using <a href="http://math.berkeley.edu/%7Ehutching/teach/proofs.pdf">logical and mathematical proofs</a> to verify that the program indeed does what it should. For example, the following Hoare logic formula describing a program’s function shows how formal methods reduce code to something as irreducibly true or false as 1 + 1 = 2.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/63570/original/jcbyhhbs-1415035184.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/63570/original/jcbyhhbs-1415035184.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=245&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/63570/original/jcbyhhbs-1415035184.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=245&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/63570/original/jcbyhhbs-1415035184.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=245&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/63570/original/jcbyhhbs-1415035184.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=308&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/63570/original/jcbyhhbs-1415035184.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=308&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/63570/original/jcbyhhbs-1415035184.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=308&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Hoare logic formula: if a program S started in a state satisfying P takes us to a state satisfying Q, and program T takes us from Q to R, then first doing S and then T takes us from P to R.</span>
</figcaption>
</figure>
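<p>For readers who prefer symbols to pictures, the rule in the figure is usually written as an inference rule, roughly as follows in LaTeX notation:</p>
<pre><code>% Sequential composition in Hoare logic:
% if {P} S {Q} and {Q} T {R} both hold, then so does {P} S; T {R}.
\frac{\{P\}\; S \;\{Q\} \qquad \{Q\}\; T \;\{R\}}
     {\{P\}\; S;\, T \;\{R\}}
</code></pre>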
<p>Taught at most UK universities since the mid-1980s, formal methods have seen considerable use by industry in <a href="http://www.inrialpes.fr/vasy/fmics/">safety-critical systems</a>. Recent advances have reached a point where formal methods’ capacity to check and verify code can be applied at scale with powerful automated tools.</p>
<h2>Government gets the message</h2>
<p>Is there any impetus to see them used more widely, however? When the Home Affairs Committee took evidence in its <a href="http://www.publications.parliament.uk/pa/cm201314/cmselect/cmhaff/70/70.pdf">E-crime enquiry</a> in April 2013, <a href="http://www.profjimnorton.com/">Professor Jim Norton</a>, former chair of the <a href="http://www.bcs.org">British Computer Society</a>, told the committee:</p>
<blockquote>
<p>We need better software, and we know how to write software very much better than we actually do in practice in most cases today… We do not use the formal mathematical methods that we have available, which we have had for 40 years, to produce better software.</p>
</blockquote>
<p>Based on Norton’s evidence, the committee put forward in recommendation 32 “that software for key infrastructure be provably secure, by using mathematical approaches to writing code.”</p>
<p>Two months later in June, the Science and Technology Committee <a href="http://www.publications.parliament.uk/pa/cm201314/cmselect/cmsctech/uc252-i/uc25201.htm">took evidence</a> on the <a href="https://www.gov.uk/service-manual/digital-by-default">Digital by Default</a> programme of internet-delivered public services. One invited expert was <a href="http://www.thomas-associates.co.uk/">Dr Martyn Thomas</a>, founder of <a href="http://www.altran.co.uk/">Praxis</a>, one of the most prominent companies using formal methods for safety-critical systems development. Asked how to achieve the required levels of security, he replied that: </p>
<blockquote>
<p>Heroic amounts of testing won’t give you a high degree of confidence that things are correct or have the properties you expect… it has to be done by analysis. That means the software has to be written in such a way that it can be analysed, and that is a big change to the way the industry currently works.</p>
</blockquote>
<p>The committee <a href="http://www.parliament.uk/documents/commons-committees/science-technology/130709-Chair-to-Francis-Maude.pdf">sent an open letter</a> to cabinet secretary Francis Maude asking whether the government “was confident that software developed meets the highest engineering standards.”</p>
<h2>Trustworthy software is the answer</h2>
<p>The government, in its <a href="http://www.parliament.uk/documents/commons-committees/home-affairs/E-crime-Government-Response-Cm-8734.pdf">response to the E-crime report</a> in October 2013, stated: </p>
<blockquote>
<p>The government supports Home Affairs Committee recommendation 32. To this end the government has invested in the <a href="http://uk-tsi.org.uk">Trustworthy Software Initiative</a>, a public/private partnership initiative to develop guidance and information on secure and trustworthy software development.</p>
</blockquote>
<p>This sounded very hopeful. Maude’s <a href="http://www.parliament.uk/documents/commons-committees/science-technology/Correspondence/131031MaudeDigitalbyDefault.pdf">reply to the Science and Technology committee</a> that month was not published <a href="https://twitter.com/CommonsSTC/status/527074057515446272">until October 2014</a>, but stated much the same thing.</p>
<p>So one might guess that the TSI had been set up specifically to address the committee’s recommendation, but this turns out not to be the case. The TSI was established in 2011, in response to governmental concerns over (cyber) security. Its “<a href="http://www.uk-tsi.org/?page_id=1175">initiation phase</a>” in which it drew from academic expertise on trustworthy software ended in August 2014 with the production of a guide entitled the Trustworthy Security Framework, available as British Standards Institute standard <a href="http://shop.bsigroup.com/ProductDetail/?pid=000000000030284608">PAS 754:2014</a>.</p>
<p>This is a very valuable collection of risk-based software engineering practices for designing trustworthy software (and not, incidentally, the “agile, iterative and user-centric” practices described in the <a href="https://www.gov.uk/service-manual/digital-by-default">Digital by Default service manual</a>). But so far formal methods have been given no role in this. In a <a href="http://ssdri-web.s3-website-eu-west-1.amazonaws.com/TSI_2012_165_SQM_2012_Keynote_Web.pdf">keynote address</a> at the 2012 BCS Software Quality Metrics conference, TSI director <a href="http://www2.warwick.ac.uk/fac/sci/wmg/research/csc/people/">Ian Bryant</a> gave formal methods no more than a passing mention as a “technical approach to risk management”.</p>
<p>So the UK government has been twice advised to use mathematics and formal methods to ensure software correctness, but having twice indicated that the TSI is its vehicle for achieving this, nothing has happened. Testing times for software correctness, then, something that will continue for as long as it takes for Dijkstra’s message to sink in.</p><img src="https://counter.theconversation.com/content/33522/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Eerke Boiten is a senior lecturer in the School of Computing at the University of Kent, and Director of the University&#39;s interdisciplinary Centre for Cyber Security Research. He receives funding from EPSRC for the CryptoForma Network of Excellence on Cryptography and Formal Methods. He is a member of BCS and board member of its specialist group on Formal Aspects of Computer Science and editorial board member of their journal. Friedrich L. Bauer is his &quot;academic grandfather&quot;, see <a href="http://genealogy.math.ndsu.nodak.edu/id.php?id=76349">http://genealogy.math.ndsu.nodak.edu/id.php?id=76349</a>.</span></em></p>Legendary Dutch computer scientist Edsger W Dijkstra famously remarked that “testing shows the presence, not the absence of bugs”. In fact the only definitive way to establish that software is correct…Eerke Boiten, Senior Lecturer, School of Computing and Director of Interdisciplinary Cyber Security Centre, University of KentLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/322132014-09-30T03:07:57Z2014-09-30T03:07:57ZWindows 9, iOS8 – the balance between bugs and upgrades<figure><img src="https://images.theconversation.com/files/60364/original/cyv5z2mk-1412040343.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Upgrade rage - what to do when things go wrong?</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/nalbertini/6364521809">Flickr/Nicola Albertini</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p>In the tech world there is a dynamic tension between needing to get products to market before the competition and the need to take enough time to make those products completely defect-free and user friendly. </p>
<p>It is a tricky balancing act. Cut too many corners and release a product too soon, and you risk a withering backlash. Release a product too late and your competitors become established on the high ground and you are left playing catch-up. </p>
<p>Finding the sweet-spot in the middle, now <em>that</em> is the trick. One only has to look at the recent past to see a <a href="http://www.techradar.com/au/news/phone-and-communications/mobile-phones/the-embarrassing-climbdown-tech-firms-would-rather-you-forgot-1190571">litany of gadgets</a> that went wrong when they should have gone right, leaving the company in damage control. </p>
<h2>Something new … but does it work?</h2>
<p>In the ultra-competitive <a href="http://www.marketsandmarkets.com/PressReleases/smartphones-market.asp">billion dollar smartphone market</a> the stakes are high and getting higher, while the price of failure can be brutal. </p>
<p>To the <a href="https://theconversation.com/samsung-makes-fun-of-apples-misfortunes-whilst-its-own-problems-pass-unnoticed-32212">delight</a> of its competitors, the successful debut of the iPhone 6 soon turned into a PR nightmare for Apple.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/60250/original/gn6cpxw6-1411963139.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/60250/original/gn6cpxw6-1411963139.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=237&amp;fit=clip" srcset="https://images.theconversation.com/files/60250/original/gn6cpxw6-1411963139.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=429&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/60250/original/gn6cpxw6-1411963139.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=429&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/60250/original/gn6cpxw6-1411963139.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=429&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/60250/original/gn6cpxw6-1411963139.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=539&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/60250/original/gn6cpxw6-1411963139.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=539&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/60250/original/gn6cpxw6-1411963139.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=539&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">What to expect from the next Windows.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/tecnomovida/11925281394">Flickr/Tecnomovida Caracas</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/">CC BY-NC-SA</a></span>
</figcaption>
</figure>
<p>Bugs in its new <a href="http://www.theverge.com/2014/9/24/6839235/apple-ios-8-0-1-released">iOS8</a> and the now infamous <a href="http://www.cnet.com/news/even-blackberry-is-making-fun-of-the-bendable-iphone-6/">bendable iPhone 6+</a> caused <a href="http://www.huffingtonpost.com/2014/09/25/apple-stock-price-bendgate-ios-8_n_5882154.html">Apple stock</a> to lose well over US$20 billion in value in the days following the incidents. The <a href="http://www.businessinsider.com.au/apple-icloud-problems-before-nude-celebrity-photo-hack-2014-9">iCloud hacking scandal</a> some weeks earlier did not help.</p>
<p>Meanwhile, Microsoft is <a href="http://in.reuters.com/article/2014/09/26/microsoft-windows-idINKCN0HL1KJ20140926">this week</a> giving the world a taste of Windows 9, due for release sometime in 2015. CEO <a href="http://www.infoworld.com/article/2608434/microsoft-windows/satya-nadella-at-six-months--grading-microsoft-s-new-ceo.html">Satya Nadella</a> must surely be hoping for a warmer reception for this version of Windows than the much criticised <a href="https://theconversation.com/what-to-expect-from-the-next-generation-of-windows-28763">Windows 8</a>. </p>
<p>Leaked information suggests a <a href="http://www.pcadvisor.co.uk/new-product/windows/3496959/windows-9-release-date-price-features-beta-technical-preview-leaked/">good showing</a> from Windows 9 – and it will need to be good as Microsoft has a lot riding on it. </p>
<h2>Finding the sweet-spot</h2>
<p>Hi-tech companies roll the dice every time they release a new product. If they have come up with a timely idea, brought it to the market in good shape, and for the right price, then all may go well. Maybe, hopefully. </p>
<p>But why does it sometimes <em>not</em> go well? There are a lot of variables, but there is one that is so influential that it is worth mentioning in an article aimed at a general audience.</p>
<p>Software developers everywhere will be familiar with the concept of “<a href="http://www.techopedia.com/definition/27913/technical-debt">Technical Debt</a>” even if they don’t know it by that name. Outside of the industry, few people will have heard of it. </p>
<p>Tight production deadlines on software projects often mean that, for the sake of getting the job done on time, short-cuts are taken. Putting off development work now that will need to be done later is what we mean by going into technical debt.</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/60253/original/b9w39sbx-1411965748.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/60253/original/b9w39sbx-1411965748.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=237&amp;fit=clip" srcset="https://images.theconversation.com/files/60253/original/b9w39sbx-1411965748.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=514&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/60253/original/b9w39sbx-1411965748.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=514&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/60253/original/b9w39sbx-1411965748.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=514&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/60253/original/b9w39sbx-1411965748.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=646&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/60253/original/b9w39sbx-1411965748.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=646&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/60253/original/b9w39sbx-1411965748.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=646&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The impact of technical debt.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/michael_mayer/8701850930">Flickr/Michael Layer</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/">CC BY-NC-SA</a></span>
</figcaption>
</figure>
<p>It might solve a short-term problem but it creates a more serious one in the long-term. Unless the technical debt is repaid by filling in the gaps, the rising complexity of the system makes further changes increasingly difficult. </p>
<p>As time goes by, and the unpaid interest becomes compounded, the debt grows larger and deadlines are inevitably missed. There is now more accrued technical debt than there is time to complete the work necessary to repay it. This is the crunch facing many software project managers.</p>
<p>This is not to suggest that software developers should never go into technical debt, only that before going <em>into</em> debt, there needs to be a practical plan for <a href="http://www.infoq.com/articles/managing-technical-debt">how the debt is to be managed</a>.</p>
<h2>Software engineering is the answer</h2>
<p>Reducing technical debt and raising the quality of software can be achieved if developers think and act more like engineers. That means using a model to approach the whole process in a more disciplined way. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/60271/original/rmmbgzpk-1411970359.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/60271/original/rmmbgzpk-1411970359.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/60271/original/rmmbgzpk-1411970359.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=480&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/60271/original/rmmbgzpk-1411970359.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=480&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/60271/original/rmmbgzpk-1411970359.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=480&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/60271/original/rmmbgzpk-1411970359.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=603&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/60271/original/rmmbgzpk-1411970359.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=603&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/60271/original/rmmbgzpk-1411970359.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=603&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">No body likes it when things go wrong – the blue screen of death.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/zarrsadus/4957022991">Flickr/Zachary Long</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span>
</figcaption>
</figure>
<p>This is bound to raise the ire of some, but the fact is, anyone can call themselves a programmer. While there are many good ones around, there are also those who will cut corners and take the quick and dirty path. </p>
<p>Real software engineers, on the other hand, like engineers from any discipline, approach their work by applying reliable, proven methods. If the method is rigorously applied, then a quality outcome will be the result. </p>
<p>When there is a bridge to be built, the civil engineer applies a tried and true method to design a bridge that will not fall down. The same is true of any other kind of engineer as they go about their work. We need software to be done this way too.</p>
<p>With so many aspects of our lives being controlled by software, it is time that IT developers came into line with every other profession whose work has real bearing on people’s lives. </p>
<p>If the car you drive or the aeroplane you fly in were as unreliable as much of the software that we use every day, there would be a public outcry – and rightly so. </p>
<p>It costs money, takes time and disciplined effort to produce good quality software; there’s no magic formula. Programmers everywhere could learn a lot from software engineering practice. In the end, everyone wins. </p>
<p>Microsoft is pinning its hopes on Windows 9 finding ready acceptance with an increasingly sceptical and difficult-to-please market. For its sake, let’s hope it has got it right this time.</p><img src="https://counter.theconversation.com/content/32213/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>From 2000 to 2008 I was affiliated with the Software Engineering Institute at Carnegie-Mellon University in the US as a Capability Maturity Model Integration (CMMI) Instructor. During that time I did training on behalf of the Australian Defence Materiel Organisation for the purpose of raising the software development process maturity of its suppliers in the and Defence Contracting industry. During that period, I also earned CMMI consultancy income from non-Defence-related organisations seeking to improve their software development processes. I am not currently engaged in this business and have no plans to be in the immediate future.</span></em></p>In the tech world there is a dynamic tension between needing to get products to market before the competition and the need to take enough time to make those products completely defect-free and user friendly…David Tuffley, Lecturer in Applied Ethics and Socio-Technical Studies, Griffith UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/312662014-09-04T11:22:40Z2014-09-04T11:22:40ZCoding classes should bring in everyone, not just children<p>The US and UK governments often mirror each other’s strategies when it comes to new education policies, and the <a href="http://www.bbc.co.uk/news/technology-29010511">recent introduction</a> of coding into the school curriculum is no exception. From this September, all children aged five and up will have to learn to code, with the English <a href="http://www.bbc.co.uk/news/technology-25857276">coding revolution</a> reflecting the vision encapsulated in President Barack Obama’s <a href="Don't%20just%20play%20on%20your%20phone%E2%80%94program%20it">famous quote</a>: “Don’t just play on your phone – program it!”</p>
<p>This change is being accompanied by a surge of resources aimed at helping children <a href="http://www.theguardian.com/technology/2014/jan/27/tablets-schools-coding-kids-education-ipad">code creatively</a>, with tools ranging from non-digital board games such as <a href="http://www.thinkfun.com/robotturtles/">Robot Turtles</a> to Google’s completely visual, character-free <a href="https://code.google.com/p/blockly/?redir=1">programming language</a>. While some tools are commercially produced, others, like <a href="http://darecollaborative.net/2014/05/16/programming-for-new-computing-curriculum/">The Missionmaker Core</a>, have been developed through research and development projects in collaboration with schools. </p>
<p>It’s not just about the tools though. Concerns have been raised about the government’s inadequate training <a href="http://www.ibtimes.co.uk/coding-classroom-schools-across-england-introduce-coding-curriculum-1463188">plans for teachers</a>, and the risks of simplifying coding into <a href="https://theconversation.com/keep-it-creative-to-get-kids-into-coding-15968">procedural building blocks</a> rather than conceptualising it as a new <a href="http://ec.europa.eu/digital-agenda/en/coding-21st-century-skill">21st century skill</a>.</p>
<h2>Mixed signals</h2>
<p>The problem is that we’re getting our coding metaphors mixed up. Mother Jones editor Tasneem Raja <a href="http://www.motherjones.com/media/2014/06/computer-science-programming-code-diversity-sexism-education">argues</a> that good coders are like good cooks, able to create inventive dishes out of a few basic ingredients. Others compare coding to <a href="http://dash.tumblr.com/post/74211793429/when-learning-to-code-always-type-it-do-not-copy-and">music and composing</a>: there is a rhythm and melody to it. Another popular metaphor is that of <a href="http://www.smashingmagazine.com/2010/05/05/the-poetics-of-coding/">poetry</a> and art.</p>
<p>There are some important similarities between these metaphors: they all share the notion of working steadily towards proficiency. Those who code for hours every day are likely to be the ones who become good at it. All three metaphors also implicitly point to audience awareness: a musician, poet or cook derives great delight from those they “code” for.</p>
<h2>Code for and with the community</h2>
<p>Another way of looking at coding is that of creating a story, built by a community. If we characterise coding in this way, we move the concept beyond linear code-writing to multi-dimensional coding, co-created by multiple authors, who actively make as well as consume the code. </p>
<p>Seeing coding as community-story projects can help answer questions around how to educate and foster a generation that loves coding rather than teaching a set of skills demanded by employers. It moves us to conceptualising coding as part of computing science which can be <a href="http://hechingerreport.org/content/teaching-computer-science-without-touching-computer_17198/">taught without touching a computer</a> and which needs to be taught <a href="https://theconversation.com/teachers-need-confidence-to-teach-coding-properly-22414">differently to different age groups</a>.</p>
<p>Importantly, it implies that children and teachers have to collaborate to use online tools together. Teachers could code apps and websites with the children, for various contexts of use. We need more examples of apps which are innovative and meet specific needs, like we saw with the <a href="http://agent4change.net/innovation/innovation/1206-devon-students-step-up-school-apps-revolution.html">Devonport High School for Boys app</a>, created by Plymouth students aged 14 and 15 for students, staff and parents to communicate better with each other. </p>
<p>Similarly, a group of students across year groups could collaborate on coding projects, <a href="http://philbagge.blogspot.co.uk/2013/05/reflections-on-teaching-computing.html?utm_source=BP_recent">borrow ideas and re-purpose them</a>. </p>
<h2>The power of the right metaphor</h2>
<p>Seeing coding in this way might provoke a society-wide dialogue about empowering more people to become involved in the creation of the content they would like to see in the digital sphere. It could inspire politicians, the private sector or not-for-profit organisations to support free coding lessons to parents, grandparents and the general public, and so avoid widening the <a href="https://theconversation.com/reinventing-entertainment-for-a-digital-generation-29665">cross-generational digital divide</a> even further. </p>
<p>It could also provide an accessible way in which to demonstrate the need for gender and racial diversity in the coding industry. With a new generation of community coders, we are less likely to see <a href="http://newsroom.fb.com/news/2014/06/building-a-more-diverse-facebook/">social software applications</a> designed for and by predominantly young urban white men.</p>
<p>Metaphors have the power to create realities we would like to see. If we are ever to reduce the cross-generational gap in digital skills we have been experiencing since the 1990s, and the <a href="http://www.joanganzcooneycenter.org/press/always-connected-young-childrens-media-use-on-the-rise/">digital divides</a> within generations that have been growing since the early 2000s, the metaphor of community storytelling seems like a good one.</p><img src="https://counter.theconversation.com/content/31266/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Natalia Kucirkova receives funding as a KTP Associate. She is affiliated with The Open University and Booktrust.</span></em></p>The US and UK governments often mirror each other’s strategies when it comes to new education policies, and the recent introduction of coding into the school curriculum is no exception. From this September…Natalia Kucirkova, KTP Associate for Booktrust, The Open UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/279482014-06-12T17:19:22Z2014-06-12T17:19:22ZHow the love of one teenager brought Tweetdeck to its knees<figure><img src="https://images.theconversation.com/files/50958/original/5jb7h36t-1402577304.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=496&amp;fit=clip" /><figcaption><span class="caption">Not so tight Florian!</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/79586895@N00/948979876/sizes/o/">ladyb</a>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p>TweetDeck, a Twitter app with millions of users, is back online after a rather surprising security scare. For several hours, the service was taken down all because a 19-year-old user tried to add a cute heart to his messages.</p>
<p>It seems that <a href="https://twitter.com/firoxl">Florian</a>, a budding young programmer from Austria, had run a small amount of code in the TweetDeck interface in an attempt to add a heart icon at the end of his tweets.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=327&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=327&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=327&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=411&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=411&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=411&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Florian reveals his discovery.</span>
<span class="attribution"><a class="source" href="https://twitter.com/firoxl">Twitter</a></span>
</figcaption>
</figure>
<p>Once Florian realised he had found a weakness in TweetDeck that would allow him to introduce a heart, he announced it triumphantly to the world. He says that he <a href="http://www.theregister.co.uk/2014/06/12/tweetdeck_xss_vuln_uncovered_by_heart_hunting_teenager/">tried to alert</a> Twitter, which owns the service, to the weakness but received no response.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="" src="https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=319&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=319&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=319&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=401&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=401&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=401&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A lot of retweets for @derGeruhn.</span>
<span class="attribution"><span class="source">Twitter</span></span>
</figcaption>
</figure>
<p>The flaw was soon used to create a worm: a line of code sent out as a tweet from a Twitter account, which caused tens of thousands of users to retweet it automatically without realising. The account that posted the original tweet, @derGeruhn, is owned by a German student called <a href="http://www.washingtonpost.com/news/the-intersect/wp/2014/06/11/who-is-dergeruhn-the-twitter-account-that-40000-tweetdeck-users-just-involuntarily-retweeted/">Andy Perdana</a>. It’s not known if he was deliberately involved or had his account hijacked.</p>
<p>TweetDeck picked up the tweet and retweeted it for anyone with the app open on their machine. It was then retweeted around 80,000 times, including by the BBC, which retweeted it to ten million followers.</p>
<p>It was just like the old days, when worms would infect systems and hog them to the point that they became unusable. In this case, Twitter stepped in, and switched off the function that allowed the messages to be retweeted.</p>
<h2>What’s up with the web?</h2>
<p>At the moment, security threats seem to emerge on some of the biggest sites every day, and this is at least in part due to how we build and run websites these days.</p>
<p>As more and more services are hosted in the cloud and more code is run on web servers, we are using HTML and JavaScript more than ever. In the past, software development teams would spend a great deal of time testing their programs to destruction to spot weaknesses, but these programming languages were never designed to be secure.</p>
<p>To make things worse, the teams writing web-based code often have little training in how to actually test their applications. Code that runs in Windows or Mac programs is rigorously tested, but code that runs on the web often is not. Programmers who test their own programs often do not exercise them in ways that will make them break, so they miss important problems.</p>
<p>The TweetDeck hack was about as simple as they come: it just exploited a flaw that had been overlooked in testing. As Florian himself pointed out, he should never have been allowed to introduce his loved-up code in the first place.</p>
<p>The code that runs on web servers is often quite messy, so security needs to be taught from day one. Software development teams must learn how to secure their code, especially by checking data input at the gate. They should know that users should never be allowed to add code without it being checked first. </p>
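<p>“Checking data at the gate” can be as simple as escaping it before it is ever rendered. A minimal Python sketch (TweetDeck itself is written in JavaScript, so this is an analogy rather than its actual code) of why escaping stops injected markup from running:</p>
<pre><code>import html

# Roughly the shape of the TweetDeck flaw: text typed by one user is
# dropped straight into the page other users see, so any markup in it
# runs as code in their browsers.
tweet = '&lt;script&gt;alert("xss")&lt;/script&gt; so much love'

print("raw:     ", tweet)                # a browser would execute the script
print("escaped: ", html.escape(tweet))   # rendered as harmless visible text
</code></pre>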
<p>For some reason, we often don’t teach security to software developers – in particular, how to handle unexpected input from users or external systems, and how to encrypt data. This lack of understanding often leads to passwords and user credentials not being stored in a secure way.</p>
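<p>Storing credentials safely is equally well-trodden ground. A hedged Python sketch of the usual approach – a salted, deliberately slow hash rather than plaintext or a bare digest (the parameters here are illustrative, not a recommendation):</p>
<pre><code>import hashlib, hmac, os

def hash_password(password: str, salt: bytes) -> bytes:
    # PBKDF2: a salted, deliberately slow hash -- never store the password itself.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def verify(password: str, salt: bytes, stored: bytes) -> bool:
    # Constant-time comparison avoids leaking information through timing.
    return hmac.compare_digest(hash_password(password, salt), stored)

salt = os.urandom(16)                         # a fresh random salt per user
stored = hash_password("correct horse", salt)

print(verify("correct horse", salt, stored))  # True
print(verify("wrong guess", salt, stored))    # False
</code></pre>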
<p>This lack of training has got to change. Luckily, in this case, there was no real damage done, but if a single teenager can prompt the collapse of one of the biggest names on the web, we should really be taking away a serious warning about security. </p>
<p class="fine-print"><em><span>Bill Buchanan does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>TweetDeck, a Twitter app with millions of users, is back online after a rather surprising security scare. For several hours, the service was taken down all because a 19-year-old user tried to add a cute…Bill Buchanan, Head, Centre for Distributed Computing, Networks and Security, Edinburgh Napier UniversityLicensed as Creative Commons – attribution, no derivatives.