
Description

February is an absolutely crazy month for Sydney Machine Learning, with two meetups and the kickoff of our FasterAi Study Group!

In another first, Domain.com.au has kindly agreed to host this upcoming Sydney Machine Learning meetup. Come along for beer, pizza, networking and two talks: Benjamin Wilson PhD, Chief Scientist at Lateral, on simulating Turing machines with recurrent neural nets, and Rinat Sadykov, CTO of Cozitrip, on how to implement neural networks for facial recognition.

First Presentation Summary:

Facial recognition - ever wondered how the neural nets used for everything from face payments to the tracking of criminals actually work?

Rinat will go over how neural networks are used for everything from detecting a face to identifying the person, with an overview of what the deep learning network is doing at each stage.
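The identification stage of such a pipeline is often framed as comparing face embeddings. As a minimal sketch (assuming a detector and embedding network have already produced fixed-length vectors; the `identify`, `cosine_similarity` names and the threshold value are illustrative, not from the talk):

```python
import numpy as np

def cosine_similarity(a, b):
    # Similarity between two embedding vectors, in [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(query_embedding, gallery, threshold=0.6):
    # Compare the query face embedding against a gallery of known
    # identities; return the best match, or None if nothing is close enough.
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(query_embedding, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```

In a real system the embeddings would come from a deep network trained so that images of the same person map to nearby vectors; the comparison step itself stays this simple.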

Second Presentation Summary:

Proponents of recurrent neural nets (RNNs) can often be heard boasting that "RNNs are Turing complete". That is, they claim that for any computable function, there is a (carefully constructed) RNN that computes it.

This means that there is even an RNN that simulates a universal Turing machine -- so there is a single RNN that can compute any computable function! In this talk, Ben will present the highlights of the now classical (1995) paper of Siegelmann and Sontag, where this extraordinary statement is proven. As we'll see, the key to the proof is a beautiful construction for encoding the discrete, symbolic machinery of the Turing machine (stacks holding bits and stack operations like pop, push, etc.) in the continuous, rational-valued world of recurrent neural nets.