Session I: Automotive requirements for next generation sensor technology – near, medium and long term

How the traditional auto industries and sensor industries are being disrupted: near – medium – long term

Analysis of the auto companies that have come to Silicon Valley – Chinese / European / Japanese / Korean. What insight does this give into the momentum, movement and disruption of the traditional supply chain?

New OEMs, new T1s, new autonomous driving start-ups – how is the industry re-shaping; how are traditional business models being challenged and adapted?

Analysing new partnerships and M&As

9:30 am

Euro NCAP: In pursuit of Vision Zero

Richard Schram | Technical Manager of Euro NCAP

The presentation will show the near, medium and long term requirements Euro NCAP is developing, as well as an overview of the achievements so far.

10:00 am

Advanced sensor suite requirements for various levels of autonomous driving

Boris Shulkin | Vice President, Research and Development of Magna International Inc.

Networking Lunch

IS Auto Americas 2017 | Track A

Track A: Imaging and Sensors

Ubiquitous usage of optical flow sensors

With the sudden explosion of autonomous concepts in automobiles, technologies providing real-time motion, depth mapping, and tracking have become increasingly important. Dynamic Vision Sensor (DVS) technology has evolved to address the challenging requirements of ADAS, offering an optimal balance between resolution, speed and power consumption. The DVS supplements other sensing technologies to enable high-performance system solutions and can be extended for use in multiple automotive scenarios.

In-cabin monitoring systems: challenges & solutions

- Exploring the requirements for the next generation of driver monitoring systems and the transition to in-cabin monitoring

- Looking at multi-spectral analytics and sensor fusion technologies to overcome current-generation limitations

- Evolution of in-cabin monitoring solutions in the context of driving automation levels

3:00 pm

3D LIDAR: The key to mass-market adoption of autonomous transportation

Anand Gopalan | CTO of Velodyne LiDAR, Inc.

Three-dimensional LIDAR technology has been instrumental in ushering in the autonomous mobility revolution, with today’s 3D LIDAR sensors positioned as an integral element of the sensor suite for most self-driving cars. However, as we move from a small number of test fleets to mass deployment across multiple segments, dollar cost and energy consumption will start to come into focus as key hurdles. The presentation will discuss how these hurdles are being addressed to enable a quantum leap in cost and compute efficiency across the autonomous system.

3:30 pm

Afternoon Networking Break

4:00 pm

Where does LiDAR fit in the ADAS, automated driving and autonomous vehicle roadmap?

Nicholas Gagnon | Business Development – Automotive of LeddarTech

In the next 5-10 years we will see more ADAS systems, and a significant ramp-up of breakthrough Highly Automated Driving (HAD) systems using several types of sensors, implemented in passenger, recreational and commercial vehicles.

Cameras, radars, LiDAR, ultrasonic technologies, and their fusion will contribute greatly to increasing safety on the road for all users.

This presentation will review where LiDAR technology fits in this roadmap and present its value. The requisite features and performance aspects of LiDAR sensors requested and envisioned by OEMs and Tier-1s will also be discussed.

4:30 pm

Development of a smartphone-based remote parking system to realize driverless parking

With the advancement of autonomous driving (AD) technology, remote parking systems have gained significant attention as a step toward practical autonomous vehicles. This presentation focuses on a next-generation ADAS and remote parking system, including driverless parallel and perpendicular parking as well as exiting, all controlled from a smartphone.

Radars play a major role in providing sensing capabilities for active safety automotive applications and autonomous cars. Multi-transmitter and multi-receiver radar systems are becoming popular for detecting and classifying objects in complex urban driving scenarios. This presentation will describe hardware and software modules for a multi-mode cascaded radar data processing system and a satellite radar processing system. We derive system configuration and data processing requirements based on the current state of the art and future MIMO radar processing requirements. The proposed system is able to meet these requirements with ≤75% AWR12x and <60% TDA2x utilization.

5:30 pm

Laser Diode Solutions for 3D Depth Sensing LiDAR Systems

Tomoko Ohtsuki | Product Line Manager of Lumentum

LiDAR is a required sensor for Level 3 and 4 ADAS applications. The illumination source for LiDAR systems significantly impacts not only performance, but also reliability and cost. A variety of 3D depth sensing (3DS) diode laser illuminators developed by Lumentum have been adopted and advanced in the consumer electronics market, with a solid track record of reliability and quality. New high-power VCSEL array technologies are enabling the more demanding, high-volume 3DS applications. We will present the robustness of 3DS cameras based on VCSEL array illuminators against sunlight, a temperature range of -40°C to 115°C, and other environmental stresses required for automotive-qualified products.

5:45 pm

Chair's Closing Remarks for the Day

6:00 pm

Evening Networking Reception

IS Auto Americas 2017 | Track B

Track B: Computer Vision, Deep Learning and AI

2:00 pm

Essential tools of the self-driving ecosystem

Lorant Pocsveiler | Head of US Office of AImotive

AImotive is developing a hardware-agnostic, scalable solution for enabling fully autonomous self-driving cars, relying on cameras as primary sensors to accomplish the essential tasks of object recognition and classification, localization, decision making and trajectory planning. The presentation will provide insight into the toolkit developed for training and verification of the full software stack, including but not limited to: calibration, data collection, semi-supervised annotation and simulation.

How low-level sensor fusion creates a system of sensors that's better than the sum of its parts, producing the higher-quality perception required by autonomous vehicles

Creating hardware flexibility and system robustness through neural networks

3:00 pm

Developing robust ADAS: deep learning at the edge

Bruno Fernandez-Ruiz | CTO / co-founder of Nexar

Robust driving policy models depend on large training datasets exposing the true diversity of the real world. Current approaches are limited to models trained on homogeneous data from a small number of vehicles running in controlled environments. This presentation will discuss a network of connected devices building an end-to-end driving policy that can leverage the 10 trillion miles driven every year.

3:30 pm

Afternoon Networking Break

4:00 pm

An ecosystem approach to ADAS sensing

ADAS sensing challenges are multifaceted, including signal-to-noise ratio, object identification, computational power, and design integration. A key to addressing these problems is an ecosystem approach that takes into account the light source, the sensor, the object being sensed, and synergistic enhancements to all three. The presentation will include technologies that can make both machine-readable signs and the optical sensors themselves invisible to humans, discuss the use of this technology in automotive design to enable optical codes for remote sensing, and consider the potential implications for the training of neural networks.

4:30 pm

Autonomous driving needs raw data fusion

Youval Nehmadi | CTO of VAYAVISION

In this session we will present how raw data sensor fusion provides the advanced perception needed for self-driving. Using both camera and LiDAR to generate a high-resolution 3D RGBd environment model, we overcome the limitations of each sensor on its own. The session will include comparisons between the methods under real-life conditions.

5:00 pm

Cybersecurity considerations for autonomous vehicle sensors

Threats against sensors represent one of several potential attack surfaces in an autonomous vehicle system. The presentation will begin with an analysis of the cybersecurity threats against the sensors in an autonomous vehicle and an examination of specific threat models. This will be followed by an overview of the attack surface of a typical autonomous vehicle sensor and an exploration of specific attack vectors. Finally, methods to harden sensors against these attack vectors will be presented, along with a review of best practices.

5:30 pm

Chair's Closing Remarks for the Day

6:00 pm

Evening Networking Reception

IS Auto Americas 2017 | Day 2

Opening Remarks and Registration

8:00 am

Conference Registration

8:15 am

IS Auto Americas | Day 2 Remarks

Session II: Optimising the building blocks for next generation auto sensors

Performance-critical imaging systems require a comprehensive approach to provide a solution that meets or exceeds multiple competing criteria. The best design methodology for a team involved in these applications is to follow a systems engineering process. This presentation will outline how to apply systems engineering concepts to rugged/automotive camera module design. An overview of the approach, design trade-offs, and manufacturing techniques, supported by real-world examples, will allow the audience to leave with concrete steps to take into their own designs.

9:00 am

360° Surround vision sharper than your own eyes

Patrice Roulet | Director of Engineering and Co-Founder of Technology of Immervision

The automobiles, trucks, buses and farm vehicles of tomorrow are evolving – getting connected, smarter, even autonomous. Yet their vision remains limited, mainly due to the use of restrictive old-generation lenses. A new generation of super-wide-angle lenses capturing the surroundings in full 360° increases the performance of ADAS technologies and beyond, enabling disruptive use cases for automotive video cameras. This new generation of wide-angle lenses, customisable like different organic eyeballs, will become an essential building block in the car's evolution. In this talk, we will introduce the automotive industry's unique needs and the impact of wide field-of-view lens requirements such as chief ray angle, large temperature range and low F-number. We will explore the fundamental contributions of panomorph optic technology compared to other alternatives. We will conclude with multiple applications where 360° technologies will shape the upcoming world of safety, sharing and freedom.

9:30 am

Optical challenges and opportunities in the auto sector - next generation lenses and more

Viewing Safety: An Integrator’s Perspective

AI and self-driving cars

NVIDIA will discuss the technology behind self-driving cars, specifically the different neural networks that allow the car to recognize its environment, determine where it is, read traffic signs and signals, and decide how best to manage its path forward, all while keeping everyone safe.

11:45 am

Investing in the building blocks to optimise range, resolution, intelligence and cost for next generation sensors. Where to place your bets in the components stack?

Panelists include:

Steven Hong, Partner, Kleiner Perkins Caufield & Byers

Presentation followed by:

Start-up showcase

Showcasing the most promising sensor and machine vision technologies which will have a real impact on the auto sector

C.A.S.E. trends are completely revolutionizing the automotive industry, and suppliers must adapt to survive

Connected: suppliers have a role to play in supporting OEMs to get to the next level and helping mitigate cybersecurity threats

Autonomous: the biggest disruption of our lifetime, and only a few players will prevail

Electrification: China is already the largest producer of electric vehicles and is the true "Giga Factory"

Quality and warranty risks: new technologies account for 50% of quality issues

Analysing strategic M&A transactions driven by C.A.S.E.

Supplier takeaways: how to navigate this transition

2:00 pm

A scalable approach to providing cognition for self-driving cars

Sravan Puttagunta | CEO and Co-founder of Civil Maps

Extracting context from the vehicle's environment is one of the major challenges to achieving autonomous driving. While this can be accomplished in highly controlled scenarios today, scalable solutions are not yet deployed. In this talk, we explore the crucial role of HD Semantic Maps in efficiently providing cognition to autonomous vehicles. We look at innovations in “marrying” the HD map space (long term memory) with the sensor space (sensor memory) and how to approach this with advanced localization in 6 degrees of freedom (6DoF), machine vision, and smart compression technology.

Afternoon Networking Break

Image sensors are the eyes of a highly autonomous vehicle, and the more we move towards Level 5 autonomy (self-driving cars), the more the ECUs will rely on sensing components, including image sensors, to make self-driving decisions. Undetected faulty sensing may lead to wrong driving decisions and may pose a significant hazard to human life. Furthermore, since driving decisions will be made by intelligent systems and not by humans, unauthorized images acquired by the system may also pose a significant hazard to human life. This presentation describes the key challenges and requirements for an automotive image sensor to be considered trustworthy and provides possible recommendations for fulfilling each requirement.

Closing Keynote Presentation

Huei Peng | Roger L. McCarthy Professor of Mechanical Engineering and Director of Mcity, University of Michigan

Despite continued progress in technology, driver education and enforcement, the number of fatalities and injuries caused by ground vehicles remains high. In the US, about 35,000 people were killed in 2015, and worldwide the number is more than a million. Technologies supporting the development of automated and connected vehicles have the potential to improve motor vehicle safety and dramatically impact congestion, energy consumption, and mobility. In this talk, the key recent developments will be summarized, including activities at Mcity.

5:00 pm

Closing panel: Routes to and requirements for autonomous vehicles - what can be done to enhance collaboration through the sensor and broader, interconnected ecosystem?