Events in September 2019

Fall 2019: Foundations of Deep Learning

Instructor:
This course will be taught by David Banks, Professor of the Practice of Statistics at Duke University and Director of SAMSI.

Course Outline:
This course is being offered in conjunction with the SAMSI semester-long research program on Deep Learning. The course will start with a review of standard neural networks and then progress to modern deep learning, including convolutional neural networks, recurrent neural networks, generative adversarial networks, and various kinds of autoencoders. We shall discuss training strategies, architecture search, regularization, and quantization.

There will be mathematics in the course, and a degree of mathematical sophistication is expected from the students, but the material will all be self-contained. The emphasis will be upon heuristics and applications. There will be projects and presentations at the end of the semester, and students will work on those in small groups. Each group will need to have at least one member who can program in Python or a comparable language.

Talk Title: There is a Kernel Method for That
Speaker: Ernest Fokoue, Professor of Statistics, Rochester Institute of Technology

Abstract:
In this lecture, I will present a general tour of some of the most commonly used kernel methods in statistical machine learning and data mining. I will touch on elements of artificial neural networks and then highlight their intricate connections to some general-purpose kernel methods like Gaussian process learning machines. I will also resurrect the famous universal approximation theorem and will most likely ignite a [controversial] debate around the theme: could it be that [shallow] networks like radial basis function networks or Gaussian processes are all we need for well-behaved functions? Do we really need many hidden layers, as the hype around Deep Neural Network architectures seems to suggest, or should we heed Ockham’s principle of parsimony, namely “Entities should not be multiplied beyond necessity” (“Entia non sunt multiplicanda praeter necessitatem”)? I intend to spend the last 15 minutes of this lecture sharing my personal tips and suggestions with our precious postdoctoral fellows on how to make the most of their experience.
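As a concrete illustration of the shallow kernel machines the abstract alludes to, here is a minimal sketch (not taken from the talk; all parameter values are illustrative assumptions) of kernel ridge regression with a Gaussian (RBF) kernel recovering a smooth function from noisy samples:

```python
import numpy as np

def rbf_kernel(X, Y, length_scale=0.5):
    """Gaussian (RBF) kernel matrix between two sets of 1-D inputs."""
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-d2 / (2 * length_scale ** 2))

# Noisy samples of a smooth target function.
rng = np.random.default_rng(0)
X_train = np.linspace(0, 2 * np.pi, 30)
y_train = np.sin(X_train) + 0.05 * rng.standard_normal(30)

# Kernel ridge regression: solve (K + lam*I) alpha = y.
K = rbf_kernel(X_train, X_train)
alpha = np.linalg.solve(K + 1e-3 * np.eye(30), y_train)

# Predict on a fine grid; the shallow kernel machine recovers sin(x) well.
X_test = np.linspace(0, 2 * np.pi, 100)
y_pred = rbf_kernel(X_test, X_train) @ alpha
err = np.max(np.abs(y_pred - np.sin(X_test)))
print(err)
```

This is the sense in which a single "layer" of kernel units can approximate well-behaved functions; the debate the speaker raises is whether depth buys anything essential beyond this for such targets.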

Talk Title: Attacking the Curse of Dimensionality using Sums of Separable Functions
Speaker: Martin Mohlenkamp, Associate Professor, Department of Mathematics, Ohio University

Abstract:
Naive computations involving a function of many variables suffer from the curse of dimensionality: the computational cost grows exponentially with the number of variables. One approach to bypassing the curse is to approximate the function as a sum of products of functions of one variable and compute in this format. When the variables are indices, a function of many variables is called a tensor, and this approach is to approximate and use the tensor in the (so-called) canonical tensor format. In this talk I will describe how such approximations can be used in numerical analysis and in machine learning.
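To make the storage argument concrete, here is a small sketch (illustrative, not from the talk) of the separable idea: a function of many variables that happens to be rank-1 separable can be stored and evaluated with cost linear in the number of variables, instead of exponential:

```python
import numpy as np

# f(x_1,...,x_d) = exp(x_1 + ... + x_d) is exactly rank-1 separable:
# f = prod_i exp(x_i).
d, n = 10, 50                    # 10 variables, 50 grid points per variable
grid = np.linspace(0, 1, n)

# Separable (canonical) format: one factor vector per variable.
# Storage is d*n = 500 numbers, versus n**d = 50**10 ≈ 10**17 tensor entries.
factors = [np.exp(grid) for _ in range(d)]

def f_separable(idx):
    """Evaluate f at a grid multi-index in O(d) operations."""
    return np.prod([factors[i][idx[i]] for i in range(d)])

idx = (3, 7, 0, 12, 5, 5, 20, 1, 9, 49)
direct = np.exp(sum(grid[i] for i in idx))
assert np.isclose(f_separable(idx), direct)
```

A general function needs a sum of several such separable terms (the canonical tensor format); the computational methods in the talk operate on those sums directly.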

Abstract

Hurricane-driven storm surge is one of the deadliest and costliest natural disasters, making precise quantification of the surge hazard of great importance. Physics-based computer models of storm surge can be implemented at a wide range of fidelities, and the danger posed by surge makes greater fidelity highly desirable. However, such models and their high-dimensional outputs tend to come at great computational cost, which can make highly detailed studies prohibitive. These needs make it important to develop an emulator that combines high-dimensional output from multiple complex computer models at different fidelity levels. We propose a parallel partial autoregressive cokriging model that addresses these issues. Using a data-augmentation technique, model parameters are estimated via a Monte Carlo expectation-maximization algorithm, and predictions are made in a computationally efficient way when input designs across fidelity levels are not nested. With this methodology, high-fidelity storm surges can be generated much more quickly in coastal flood studies, facilitating the risk assessment of storm surge hazards.
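The autoregressive idea underlying cokriging can be sketched very simply. The following toy example (an assumption-laden illustration in the Kennedy–O'Hagan spirit, not the authors' parallel partial cokriging model) links a cheap low-fidelity output to an expensive high-fidelity one through a scale factor plus a discrepancy term, estimated here by least squares on synthetic data:

```python
import numpy as np

# Synthetic outputs of two fidelity levels on shared inputs x.
x = np.linspace(0, 1, 20)
y_low = np.sin(2 * np.pi * x)          # cheap, low-fidelity model
y_high = 0.8 * y_low + 0.3 * x         # expensive model = rho * low + discrepancy

# Autoregressive link: y_high(x) ≈ rho * y_low(x) + delta(x),
# with delta modeled here as a simple linear trend for illustration.
A = np.column_stack([y_low, np.ones_like(x), x])   # columns: rho, intercept, slope
coef, *_ = np.linalg.lstsq(A, y_high, rcond=None)
rho = coef[0]
print(rho)
```

In the actual methodology, the discrepancy delta is a Gaussian process rather than a fixed trend, and estimation must handle non-nested designs, which is where the Monte Carlo EM machinery comes in.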

Triangle Machine Learning Day

Description:
Modeled after similar events occurring in New York and Boston, the goal is to bring together researchers and applied scientists working in all different areas of machine learning, including industrial applications, academic theory, and everything in between, for a day of technical talks and posters.

The program will include a poster session; anyone wishing to contribute a poster must submit an abstract, and due to space limitations not all abstracts will be selected. The deadline for poster submissions is August 29, 2019. Poster selections will be announced on September 5.

Heavy snacks will be provided for all participants; breakfast and lunch are on your own. There are many restaurants in the neighboring Brodhead Center.

Abstract

We use topological data analysis and machine learning to study a seminal model of collective motion in biology. This model describes agents interacting nonlinearly via attractive-repulsive social forces and gives rise to collective behaviors such as flocking and milling. To classify the emergent collective motion in a large library of numerical simulations and to recover model parameters from the simulation data, we apply machine learning techniques to two different types of input. First, we input time series of order parameters traditionally used in studies of collective motion. Second, we input measures based in topology that summarize the time-varying persistent homology of simulation data over multiple scales. This topological approach does not require prior knowledge of the expected patterns. For both unsupervised and supervised machine learning methods, the topological approach outperforms the traditional one.
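One of the traditional order parameters the abstract refers to is polarization, which measures how aligned the agents' velocities are. A minimal sketch (illustrative values; not the study's actual simulation data) looks like this:

```python
import numpy as np

def polarization(velocities):
    """Polarization order parameter: norm of the mean unit-velocity vector.
    Near 1 indicates flocking (aligned motion); near 0 indicates
    disordered motion or milling (rotation about a center)."""
    v = np.asarray(velocities, dtype=float)
    units = v / np.linalg.norm(v, axis=1, keepdims=True)
    return np.linalg.norm(units.mean(axis=0))

# 100 aligned agents (flocking): polarization close to 1.
flock = np.tile([1.0, 0.2], (100, 1))

# 100 agents with random headings (disordered): polarization near 0.
rng = np.random.default_rng(2)
theta = rng.uniform(0, 2 * np.pi, 100)
disordered = np.column_stack([np.cos(theta), np.sin(theta)])

print(polarization(flock), polarization(disordered))
```

A time series of such scalars is the first kind of input the abstract describes; the topological alternative replaces these hand-picked summaries with persistent-homology features computed from the agent configurations themselves.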
