Get Involved

Speakers and Presentation Topics
Technology Showcase Presenters
Past Speakers
Call for Speakers

Speakers and Presentation Topics

(listed alphabetically, by speaker’s last name)

Sensor fusion: bridging the gap from ADAS to autonomous driving
Eric Balles, ScD
Managing Director, Transport and Energy
Draper

The functionality of ADAS sensors, many of which are based on MEMS components, is nothing short of astounding. In the past five years, vision systems with embedded object recognition and tracking have become commonplace, and miniature radar modules are widely available with 3D object tracking for dozens of simultaneous targets. As both are deployed on more production vehicles, their costs continue to decrease, and their life-saving features reach more consumers each year through Level 2 and Level 3 automated driving systems. It is only natural, then, to ask: can these sensors meet the needs of Level 4 and Level 5 autonomous systems? Over the past two years we have explored this question and discovered challenges and opportunities in adapting ADAS technologies to Level 4+ automated driving. At the same time, a new approach has arisen that breaks the current paradigm of using the on-board processing outputs of today’s ADAS sensors, instead pushing “raw” data from each sensor into a centralized high-powered processor where machine-learning algorithms perform all the required fusion tasks. We will review lessons learned from integrating ADAS vision and radar sensors into a prototype Level 4+ autonomous driving system that fuses the “object” tracking data provided by each ADAS sensor into a 360-degree map of cars and pedestrians during urban driving. We will compare this approach with the emerging trend of centralized high-performance fusion and discuss the impacts both approaches have on future autonomous vehicle architectures.
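
To make the object-level approach concrete, here is a minimal, illustrative sketch (not from the talk) of how tracked objects reported by a camera and a radar might be merged into a single map by nearest-neighbor association; all names, data structures and thresholds are assumptions for illustration:

```python
# Minimal sketch of object-level fusion: each ADAS sensor reports tracked
# objects in vehicle coordinates, and a simple nearest-neighbor association
# merges detections of the same target into one 360-degree object map.
import math

def fuse_object_tracks(camera_objs, radar_objs, gate_m=2.0):
    """Merge per-sensor object lists; objects are dicts with x, y in meters."""
    fused = [dict(obj, sources=["camera"]) for obj in camera_objs]
    for r in radar_objs:
        best, best_d = None, gate_m
        for f in fused:
            d = math.hypot(f["x"] - r["x"], f["y"] - r["y"])
            if d < best_d:
                best, best_d = f, d
        if best is not None:
            # Same physical target seen by both sensors: average positions,
            # keep the radar range rate, and record both sources.
            best["x"] = (best["x"] + r["x"]) / 2
            best["y"] = (best["y"] + r["y"]) / 2
            best["range_rate"] = r.get("range_rate")
            best["sources"].append("radar")
        else:
            fused.append(dict(r, sources=["radar"]))
    return fused

camera = [{"x": 12.0, "y": 1.5}, {"x": -6.0, "y": -2.0}]
radar = [{"x": 12.4, "y": 1.2, "range_rate": -3.1}]
for obj in fuse_object_tracks(camera, radar):
    print(obj)
```

The raw-data alternative described in the abstract would instead ship each sensor's unprocessed measurements to a central processor and skip this per-sensor tracking entirely.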

Biography: Dr. Eric Balles joined Draper in 2011 to launch and build a commercial business focused on solving the most difficult issues facing the transportation and energy sectors. He has overall responsibility for these commercial areas at Draper, including strategic planning, customer relationships, business development, project execution and financial performance. Dr. Balles’ career spans several industry sectors, including automotive, electricity and oil & gas. He started his career in the automotive industry and has worked extensively in Europe with automakers and tier-one suppliers. Dr. Balles earned SB, SM and ScD degrees in mechanical engineering from the Massachusetts Institute of Technology.


Cooperative sensing for connected automation
Roger Berg
Vice President North America Research and Development
DENSO International America

Recent key initiatives and programs promoting the research and deployment of active safety features and automated transportation have been grabbing media headlines for the past few years. Most current high-visibility developments for ADAS/automated vehicles focus on using traditional on-board sensing along with advanced machine learning for perception and computation to execute the driving task. This presentation and discussion reflect on scenarios where such traditional on-board sensing (for example, multi-spectral devices such as radars, cameras, LIDARs and ultrasonic transducers, or their combination) does not currently always meet the needs of highly automated operation, particularly in denser urban or suburban environments, where cooperative sensing and response might provide additional benefits for the occupants of a highly automated vehicle. We discuss the possible advantages of adding various types of wireless connectivity to optimize the safety and efficiency of on-board and roadside cooperative ADAS/AV systems and their operation in the transportation domain.

Biography: Roger Berg is Vice President of DENSO’s North American Research and Development group. He is responsible for DENSO’s regional research and development of connected vehicles, automation and cyber security at laboratories in both California and Michigan. He also coordinates closely with DENSO’s research and product groups in sister organizations located in the US, Japan, Germany and China. Mr. Berg earned a Bachelor of Science degree in Electrical Engineering from the University of Illinois in Urbana, Ill., and a Master of Science in Electrical Engineering from the Illinois Institute of Technology in Chicago. He is the inventor or co-inventor of eight U.S. and international patents.


LIDAR augmentation techniques for affordable ADAS deployment
Raul Bravo
CEO
Dibotics

Autonomous driving and ADAS markets are fueling an unprecedented level of investment and energy focused on developing smaller, cheaper and better LIDAR sensors, allowing for better perception and understanding of the environment around the car. However, the LIDAR hardware is only part of the equation. Real-time 3D perception software for localization, segmentation, and obstacle detection and tracking solves the other part. This presentation will discuss how LIDAR-only data can be used to achieve a high level of reliability and application features for ADAS, de-correlated from other sources of information such as radar, and why a combination of software and hardware is mandatory to deliver both high-performance and low-cost perception capabilities in ADAS applications. We will present how a technique called “sensor augmentation” can help meet the objective of affordable and accurate 3D perception.
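
As a rough illustration of the kind of lidar-only processing the talk covers, the sketch below separates ground returns from obstacle returns with a simple height threshold and bins obstacles into a 2D grid; real pipelines such as Dibotics' are far more sophisticated, and every name and threshold here is an illustrative assumption:

```python
# Minimal sketch of one lidar-only perception step: split a point cloud
# into ground and obstacle returns by height, then accumulate obstacle
# points into a coarse 2D occupancy grid for downstream detection.
from collections import defaultdict

def segment_and_grid(points, ground_z=0.2, cell_m=0.5):
    """points: iterable of (x, y, z) in meters, sensor frame."""
    grid = defaultdict(int)  # (col, row) -> obstacle point count
    ground, obstacles = [], []
    for x, y, z in points:
        if z <= ground_z:
            ground.append((x, y, z))
        else:
            obstacles.append((x, y, z))
            grid[(int(x // cell_m), int(y // cell_m))] += 1
    return ground, obstacles, dict(grid)

cloud = [(5.0, 0.1, 0.05), (5.1, 0.0, 0.1), (8.0, 1.0, 0.9), (8.1, 1.1, 1.4)]
ground, obstacles, grid = segment_and_grid(cloud)
print(len(ground), "ground points,", len(obstacles), "obstacle points")
print("occupied cells:", grid)
```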

Biography: Raul Bravo is CEO and co-founder of Dibotics and creator of the Augmented LIDAR technology. Raul is a serial entrepreneur in mobile robotics and a start-up coach, with an extensive 15-year background in both bootstrapped and VC-backed start-up creation and growth. An engineer from UPC (Barcelona, Spain) with an MBA from the Collège des Ingénieurs (Paris, France), he has filed 10 patents and received 27 awards over his engineering and entrepreneurial career, among them the “MIT Technology Review – Top 10 Innovators Under 35”.


Evolution of ADAS architectures, sensor fusion, and the future of the autonomous car
Jeremy Carlson
Principal Analyst and Manager
IHS Markit

The autonomous car figures prominently in the future vision of the automotive and mobility industries. Many OEMs have set intermediate milestones in the form of advanced driver assistance system (ADAS) functions on the path to a fully autonomous car, with significant growth across nearly all ADAS applications as the market marches from Level 0 into Level 2/3 with the latest vehicles coming to market. Challenges on that path are plentiful, from the relatively simple but cost-effective democratization of features to more complex technical challenges in sensor technologies such as lidar, cost-efficient and computationally scalable sensor fusion, and vehicle architectures to connect everything together. This presentation will illustrate how the building blocks of autonomous driving are evolving, with an emphasis on Levels 3 to 5, a focus on key additions to the architecture such as lidar, sensor fusion ECUs and in-vehicle architecture, and the changing landscape as autonomous technology shapes personal mobility.

Biography: Jeremy Carlson is a Principal Analyst and Manager on the automotive team at IHS Markit, covering autonomous driving, mobility and automotive technology. He has worked as an analyst in automotive electronics market research for nine years, with a focus on driver assistance, sensors, autonomous vehicles and mobility. He now leads the Autonomous Driving practice at IHS Markit in addition to being a key contributor on emerging mobility topics. Mr. Carlson’s primary areas of focus include automated and autonomous driving and new mobility services, building on years of experience in advanced driver assistance systems, technologies and sensors. Complementary research includes technical topics, regulation and legislation, and the deployment of new technologies as they enter and become more broadly available across the market. He has worked with a number of OEM, supplier and technology companies on both syndicated and custom analysis, supporting critical decisions that shape the automotive and transportation business landscape.


Critical component for 77 GHz imaging automotive radar: metamaterial frequency-adaptive steerable technology (M-FAST)
Bernard Casse, PhD
Area Manager
PARC, a Xerox company

Today, there is no 77 GHz high-resolution 3D imaging radar – a crucial weatherproof sensor for autonomous driving that is capable of sensing, discriminating, and tracking road objects. The state of the art in automotive radar is digital beamforming (DBF), which has overly complex hardware, is computationally intensive (slow), and cannot support high signal-to-noise ratio and high resolution simultaneously. DBF is currently reaching its limits in terms of performance. At PARC, we developed M-FAST (metamaterial frequency-adaptive steerable technology), a disruptive technology based on engineered metamaterials that is capable of true analog beamsteering while being free from the limitations of DBF. M-FAST is poised to displace DBF-based technologies and represents the next frontier for 77 GHz imaging radars. In this talk, we will provide an overview of this transformative technology and also discuss Metawave, PARC’s VC-backed spinoff chartered to accelerate development of M-FAST for both automotive and 5G communications.
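
For context on the DBF baseline that M-FAST is contrasted with, the following sketch shows conventional digital beamforming for a uniform linear array at 77 GHz: per-element phase weights steer the beam, and the summed response peaks at the steered angle. The array size and angles are illustrative assumptions:

```python
# Minimal sketch of digital beamforming (DBF) for a uniform linear array.
# Each element's signal is phase-shifted and summed; the choice of phase
# weights points the beam, and the gain falls off away from that angle.
import numpy as np

C = 3e8                # speed of light, m/s
F = 77e9               # 77 GHz automotive radar band
LAM = C / F            # wavelength, ~3.9 mm
N_ELEM = 8             # number of array elements
D = LAM / 2            # half-wavelength element spacing

def steering_weights(theta_deg):
    """Per-element phase weights that point the array toward theta."""
    k = 2 * np.pi / LAM
    n = np.arange(N_ELEM)
    return np.exp(-1j * k * D * n * np.sin(np.radians(theta_deg)))

def array_gain(weights, theta_deg):
    """Normalized array response toward theta for the given weights."""
    a = steering_weights(theta_deg)  # plane-wave phases from theta
    return abs(np.vdot(weights, a)) / N_ELEM

w = steering_weights(20.0)  # steer the beam to +20 degrees
for angle in (0.0, 10.0, 20.0, 30.0):
    print(f"{angle:5.1f} deg -> normalized gain {array_gain(w, angle):.2f}")
```

An analog beamsteering front end like the one the talk describes performs this steering in hardware before digitization, rather than computing the weighted sums numerically per angle.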

Biography: Dr. Bernard Casse manages the Metamaterial Devices and Applications R&D area at PARC. He is responsible for making strategic investments in early-stage technology platforms, overseeing a portfolio of multi-year, multi-million-dollar projects, supporting applied R&D operations, defining the strategic agenda for emerging technologies, and leading a team of world-class performers. Prior to his role at PARC, he was a program manager at Physical Sciences Inc. (PSI), a defense contractor, where he led and managed U.S. Government-sponsored programs focused on developing disruptive technologies and advanced manufacturing. Earlier, Bernard was a research scientist at the Electronic Materials Research Institute at Northeastern University and, in parallel, a qualified cleanroom user at the Harvard Center for Nanoscale Systems (CNS) and the Center for Functional Nanomaterials (CFN) at Brookhaven National Laboratory. Bernard holds a PhD in Physics from the National University of Singapore and was a member of the Technical Staff at the Singapore Synchrotron Light Source (SSLS).


Designing sensor algorithms for the automobile environment
Tony Gioutsos
Director of Sales and Marketing
TASS International

The difference between the automobile environment and other environments when designing an algorithm is substantial. Most important is the need for safety: when driving a vehicle at high speed, any error can produce tragic results. Because automobiles are expected to survive 15 years in all kinds of conditions, it is practically impossible to test sensor algorithms against every scenario that could be encountered. Sensor variation also occurs throughout the life of the automobile, causing even more difficulty. In this presentation, we outline a generic approach to designing sensor algorithms that are robust to the real world. We look back on techniques and testing methods used in other automotive sensing system algorithms. Areas of discussion include sensor noise, Monte Carlo techniques, the combination of real-world testing, test track testing and simulation, intelligent outside-the-box design, concatenated events, and more.
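
The Monte Carlo idea mentioned above can be illustrated with a toy example: run a simple threshold detector against thousands of randomized traces that model noise and lifetime gain drift, and count misses and false alarms. All distributions and parameters below are illustrative assumptions, not values from the talk:

```python
# Minimal sketch of Monte Carlo robustness testing for a sensor algorithm:
# randomize the conditions a 15-year vehicle life might produce (noise,
# unit-to-unit gain spread, aging drift) and measure failure rates.
import random

def detect(trace, threshold=1.0):
    """Trivial stand-in algorithm: fire if any sample exceeds threshold."""
    return max(trace) > threshold

def run_trials(n_trials=10_000, event_amplitude=1.5):
    misses = false_alarms = 0
    for _ in range(n_trials):
        gain = random.gauss(1.0, 0.1)            # sensor gain variation
        noise = lambda: random.gauss(0.0, 0.15)  # per-sample sensor noise
        quiet = [noise() for _ in range(100)]            # no event present
        event = [gain * event_amplitude + noise() for _ in range(100)]
        if detect(quiet):
            false_alarms += 1
        if not detect(event):
            misses += 1
    return misses / n_trials, false_alarms / n_trials

miss_rate, fa_rate = run_trials()
print(f"miss rate {miss_rate:.4f}, false-alarm rate {fa_rate:.4f}")
```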

Biography: Mr. Tony Gioutsos has been involved with automotive safety systems since 1990. As Director of Electronics R&D for both Takata and Breed Technologies, he was at the forefront of the safety revolution. His cutting-edge work on passive safety algorithm design and testing led him to start the first automotive algorithm company in 1994. After receiving both his BSEE and MSEE (specializing in communications and signal processing) from the University of Michigan, Mr. Gioutsos worked on satellites and radar imaging for defense applications before joining Takata. He has been a consultant for various companies in areas such as biomedical applications, gaming software, legal expert advisory, and numerous automotive systems. Mr. Gioutsos is currently Director of Sales and Marketing in the Americas for TASS International, where he has continued to define active safety algorithm testing requirements while working on various other state-of-the-art approaches to enhance automated and connected car robustness. He has been awarded over 20 patents and presented over 75 technical papers.


Photonic technologies for LIDAR in autonomous and ADAS applications
Jake Li
Marketing Engineer
Hamamatsu

From fleet to commercial vehicles, a growing number of new and existing technologies are important for the development of a fully autonomous vehicle. Aside from traditional sensors such as cameras, ultrasonic and radar, LIDAR technologies are becoming the key enabler in the fusion of sensors needed to achieve higher levels of autonomous control (Levels 4/5). Today, there are already multiple designs of LIDAR systems whose key components are photonic devices such as light sources, photodetectors and MEMS mirrors. This presentation will provide an overview of the tradeoffs of LIDAR versus competing sensor technologies (camera, radar, and ultrasonic) that reinforce the need for sensor fusion, and will summarize and compare various mechanical and solid-state LIDAR designs. Guidelines for selecting photonic components such as photodetectors, light sources, and MEMS mirrors will also be discussed.
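
For readers new to the underlying principle, the sketch below shows the direct time-of-flight arithmetic common to the flash and scanning lidar designs discussed: range follows from the round-trip delay of a laser pulse at the photodetector. The numbers are illustrative:

```python
# Minimal sketch of direct time-of-flight ranging: a laser pulse travels
# out to the target and back, so range is half the round trip at the
# speed of light. The timing requirement motivates fast photodetectors.
C = 299_792_458.0  # speed of light, m/s

def range_from_delay(t_round_trip_s):
    """One-way distance: the pulse travels out and back."""
    return C * t_round_trip_s / 2.0

def delay_from_range(d_m):
    return 2.0 * d_m / C

# A target 150 m away returns the pulse after about a microsecond, and
# centimeter-level ranging implies roughly 66 ps timing resolution.
print(f"150 m  -> {delay_from_range(150.0) * 1e9:.1f} ns round trip")
print(f"1.0 us -> {range_from_delay(1.0e-6):.1f} m")
```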

Biography: Jake Q. Li is a marketing engineer at Hamamatsu Corporation, in charge of research and analysis for various market segments, with a concentration in the automotive LIDAR market. He is knowledgeable about the optical components that are important parts of LIDAR system designs, including photodetectors – such as the MPPC (a type of silicon photomultiplier), avalanche photodiodes, and PIN photodiodes – and light emitters. He has an expert understanding of the solid-state technology needs of the autonomous automotive market. Drawing on his experience with the specific requirements of LIDAR systems, he guides customers through the process of selecting the photodetectors that best fit their individual needs and assists with important decisions regarding various LIDAR designs.


Millions of scenarios, thousands of requirements: managing the testing challenge for automated driving
Vivek Moudgal
Vice President of Sales
dSPACE

The success of future AD-capable vehicles on the road rests, to a large extent, on the capability and availability of sensing technology. Once the algorithms have been developed, the software will need to be put through a battery of tests covering every imaginable scenario to confirm its operation before it is deployed in production vehicles. Executing such tests in a vehicle on test tracks and public roads is impossible in a reasonable time frame, given the sheer number of scenarios and environmental conditions that must be accounted for. A simulation-based approach is necessary, one that allows for flexibility, reproducibility, and rigorous testing. Simulation-based testing offers a spectrum of capabilities spanning MIL, SIL and HIL testing. The SIL approach complements HIL testing by making it possible to maximize test coverage in an offline simulation environment, using arrays of computers operating in parallel and crunching through testing scenarios 24/7. This presentation will address the capabilities and fidelity needed in sensor and plant modeling to fulfill the needs of testing ADAS/AD features in both offline and real-time environments while reducing the need for on-road testing.
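
As a toy illustration of the SIL idea, the sketch below sweeps a grid of scenario parameters through an offline "simulation" in parallel; the constant-deceleration AEB check is a stand-in model, not dSPACE tooling, and all parameters are assumptions:

```python
# Minimal sketch of offline SIL-style scenario sweeping: enumerate a grid
# of scenario variants and evaluate each in parallel across CPU cores.
from itertools import product
from multiprocessing import Pool

def simulate(params):
    """Does braking at this deceleration avoid impact in this variant?"""
    speed_mps, target_gap_m, decel_mps2 = params
    stopping_dist = speed_mps ** 2 / (2.0 * decel_mps2)
    return params, stopping_dist <= target_gap_m

if __name__ == "__main__":
    speeds = [10, 15, 20, 25, 30]   # ego speed, m/s
    gaps = [20, 40, 60, 80]         # distance to stopped target, m
    decels = [4.0, 6.0, 8.0]        # achievable braking, m/s^2
    scenarios = list(product(speeds, gaps, decels))
    with Pool() as pool:
        results = pool.map(simulate, scenarios)
    failures = [p for p, ok in results if not ok]
    print(f"{len(scenarios)} scenarios, {len(failures)} failures")
```

A production test farm replaces the one-line physics with full vehicle, sensor and environment models, but the parallel parameter-sweep structure is the same.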

Biography: Vivek Moudgal is the Vice President of Sales for dSPACE and has been responsible for sales operations in the company's North American market since 2003. Vivek joined dSPACE in 1993 as a technical support engineer and spent his first 10 years in various roles in the engineering department, including supporting, executing, and managing software development projects. Throughout his tenure with the company, he has gained expertise in the application of model-based development tools for control software development and validation.


What is the role of inertial sensors in autonomous vehicles?
Ville Nurmiainen
Business Development Manager
Murata Electronics

Inertial sensors are widely used today in agricultural applications to improve GPS positioning down to sub-inch accuracy by compensating for antenna position when a vehicle is driving on a slope. In autonomous vehicles, inertial sensors can also be used to support accurate localization, but the concept is somewhat different than in typical agricultural applications. The GPS signal alone does not provide sufficient localization accuracy: signals can be blocked or disturbed in “urban canyon” environments, and when driving at slow speeds or in stop-and-go traffic, GPS is not able to track heading and velocity changes well. Inertial sensors can also provide a back-up system for a vehicle in case of visual sensor malfunction; for example, they can guide a vehicle to a safe stop if the visual sensors (camera, radar, LIDAR) do not provide reliable information. We have investigated what level of localization accuracy can be achieved when using state-of-the-art ESC (electronic stability control) sensors to compensate for GPS non-idealities and outages. Can existing ESC sensors be a reasonable-cost solution that provides the required accuracy level for localization in a high-definition map environment? To demonstrate these sensors’ capabilities and limitations, we have developed and tested in-house sensor fusion algorithms that combine data from our 6-DOF IMU (based on ESC sensors) with GPS and wheel speed information. In this presentation, we will share information about the test setup and the results of the actual driving tests.
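
A drastically simplified sketch of the predict/correct structure behind such GPS/IMU fusion appears below: position is dead-reckoned from accelerometer data between GPS fixes, and each fix pulls the estimate back toward the measurement. A production system would run a full Kalman filter over a 6-DOF IMU, wheel speeds and GPS; the 1D model and fixed gains here are illustrative assumptions:

```python
# Minimal 1D sketch of inertial/GPS fusion: integrate acceleration to
# predict position (dead reckoning), then blend in sparse GPS fixes to
# bound the drift that pure integration would otherwise accumulate.
def fuse(imu_accels, gps_fixes, dt=0.01, k_pos=0.2, k_vel=0.05):
    pos = vel = 0.0
    track = []
    for i, a in enumerate(imu_accels):
        vel += a * dt                 # predict from inertial data
        pos += vel * dt
        fix = gps_fixes.get(i)        # GPS arrives only on some steps
        if fix is not None:
            resid = fix - pos         # innovation: measurement - prediction
            pos += k_pos * resid      # blend the fix into the state
            vel += k_vel * resid
        track.append(pos)
    return track

accels = [1.0] * 200 + [0.0] * 300        # accelerate 2 s, then cruise
fixes = {100: 0.52, 300: 4.05, 450: 6.98}  # sparse, slightly noisy fixes
print(f"final position estimate: {fuse(accels, fixes)[-1]:.2f} m")
```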

Biography: Ville Nurmiainen has over 20 years’ experience providing inertial sensors to the automotive industry. He started his career at VTI, a company that has been the market leader in low-g accelerometers for safety-critical applications since the early 1990s. Ville joined Murata when the company acquired VTI in 2012. He has worked in various positions at VTI/Murata, from product design to product management and marketing, and has experience working with all the major Tier 1 and OEM companies worldwide. Currently Ville works as a business development manager based in Novi, Michigan, where he works closely with the automotive industry to identify new business opportunities for sensors, especially in ADAS and autonomous driving applications.


Advanced packaging technologies for ADAS applications
Urmi Ray, PhD
Senior Director, Product Technology Marketing
STATS ChipPAC

Recent trends in the automotive market, such as semi- and fully autonomous vehicles, have spiked consumers’ awareness and interest and, in turn, have created significant momentum across the semiconductor supply chain to deliver cost-effective technology that promotes comfort and safety. A long list of semiconductor solution providers (such as makers of processors, sensors, and supporting chipsets) is playing a critical role in enabling reliable hardware for the ADAS software stack. In this presentation, we will take a look at the history of packaging solutions for ADAS sensor devices such as radars, LIDARs, and cameras. Then we will fast-forward to today and review how ADAS packaging technologies have advanced to enable high-performance automotive radar and optical solutions. We will also discuss the latest packaging platforms, which enable multi-chipset integration as a system-in-package (SiP), offering added value for ADAS device makers.

Biography: Dr. Urmi Ray is a Senior Director of Product and Technology Marketing for STATS ChipPAC, focusing on advanced system–in-package (SiP) solutions. Before joining STATS ChipPAC, she spent 11 years at Qualcomm, developing advanced technologies such as 2.5D and 3D solutions, multi-chip package integration and interactions. Prior to this, she worked at Lucent Technologies Bell Labs in multiple areas related to electronics assembly and reliability. Dr. Ray received her PhD from Columbia University in New York City. She has been actively involved in the semiconductor industry with several conference publications as well as patented inventions in the field of advanced packaging. She is also currently serving on the board of directors for the International Microelectronics Assembly and Packaging Society (IMAPS) as the Vice President of Technology.


Solid-state VCSEL illumination modules for flash LIDAR in ADAS and automated driving applications
Luke Smithwick
Chief Marketing Officer
TriLumina

LIDAR has been recognized as a pivotal technology that can be used in sensor fusion architectures with vision and radar to greatly increase the accuracy of object and free-space recognition. However, most existing LIDAR systems are too expensive, require rapidly spinning mirrors or similar moving parts, and are physically large. This necessitates the development of low-cost, solid-state, small-form-factor LIDAR systems that provide sufficient optical output power while remaining eye-safe. Although a few solid-state illumination alternatives have been proposed, they have not come to fruition in a meaningful way due to multiple limitations of the technologies. This presentation will compare current LIDAR illumination technologies, including proven, cost-effective, solid-state illumination solutions based on back-emitting VCSEL arrays that enable eye-safe flash LIDAR systems for multiple use cases in the vehicle. These same scalable VCSEL illumination modules can also be used for in-cabin monitoring and rear camera illumination.

Biography: Luke Smithwick has over 25 years of experience in business and technical leadership spanning semiconductors, software, hardware and core R&D, with more than a decade focused on the automotive industry. He is currently the Chief Marketing Officer of TriLumina, the industry’s leading provider of solid-state VCSEL illumination modules for automotive applications. He joined TriLumina from Silicon Valley start-up CloudCar, where he was VP of Marketing, Partnerships and Business Development. Prior to that, he was Director of Automotive Infotainment Products at Qualcomm, driving product management and focusing engineering on achieving automotive qualification of high-complexity application processors. Luke started his automotive career at Freescale (now NXP) as Director of Software and Solution Technologies, where he started an automotive professional services P&L and founded Freescale’s automotive software and solutions team. This evolved into Luke taking leadership of the P&L, product marketing, solutions, strategy and vision for Freescale’s Automotive Driver Information and Infotainment business as Global Operations and Business Manager. Earlier in his career, Luke focused on complex IP licensing at Aware as VP and GM of Licensing, was CTO of NetBind, and held a number of senior marketing and sales positions at GlobeSpan Semiconductor. He was an advanced data communications technology researcher at AT&T Bell Laboratories, responsible for breakthroughs in VDSL technology. He did postgraduate work at Princeton University and Columbia University and holds a BSEE and an MS in electrical engineering from the University of Florida. He holds 14 technology patents and has published multiple industry and technical papers.


ADAS to autonomous: ongoing evolution of LIDAR and infrared camera sensors
Rajeev Thakur
Regional Marketing Manager, Infrared Business Unit
OSRAM Opto Semiconductors

The future of LIDAR and infrared cameras in the automotive market continues to evolve rapidly. Low-resolution flash LIDAR is being pushed into headlamps and tail lamps, as the corners of the car are premium locations for lighting and sensing. High-resolution long-range LIDAR is bifurcating into scanning and flash (i.e., 3D camera) approaches. Lasers are selected for wavelength, emitted power, driver, package, efficiency and other factors, and they have to be matched to receivers with compatible peak wavelength sensitivity, gain, input voltage, packaging, and so on. The selection of such key components drives the design into a time and resource tunnel of more than a year and a few million dollars. As camera technology is replicated around the car for 360-degree coverage, it still lacks range (80 to 200 m), resolution (1.2 to 8 MP) and sensitivity in low light (visible to IR). Concepts that use the visible and infrared spectrum in the camera to extract information are being developed, and mechanical IR filters are giving way to newer software filters and hardware processing chips. This presentation will examine some of these rapidly evolving technologies that are bringing LIDAR and infrared camera technology to market for ADAS and autonomous applications.

Biography: Rajeev Thakur is currently the Product Marketing Manager at OSRAM Opto Semiconductors, responsible for infrared product management and business development in the NAFTA automotive market. His current focus is on LIDAR, driver monitoring, night vision, blind spot detection and other ADAS applications. Thakur joined OSRAM Opto Semiconductors in 2014. He has worked in the Detroit automotive industry since 1990, for companies such as Bosch, Johnson Controls, and Chrysler, and has concept-to-launch experience in occupant sensing, seating and powertrain sensors. He holds a master’s degree in manufacturing engineering from the University of Massachusetts, Amherst and a bachelor’s degree in mechanical engineering from Guindy Engineering College in Chennai, India. He is a licensed professional engineer and holds a number of patents on occupant sensing. He is also a member of the SAE Active Safety Standards development committee.


Making sense of the automotive LIDAR marketplace
Harvey Weinberg
Division Technologist for Automotive Business Unit
Analog Devices

The development of affordable LIDAR parallels that of radar. It took roughly 50 years for radar system designers to develop technology that eventually standardized system approaches for particular applications. LIDAR technology is experiencing this same learning curve today, but trailing radar by over a decade. The result is a widely varying, and confusing, assortment of design approaches being proposed for automotive LIDAR. Fortunately, the principles of operation of the “whiz-bang” technologies being proposed for automotive LIDAR have already been examined by the very early technology adopters (military and aerospace), and the underlying physics is well understood. As usual, the new crop of LIDAR technologies promises improved engineering solutions built on these well-known basic principles of operation. This talk aims to help the system designer or user understand what principles of operation are available and, via physics and math, explain what a well-engineered LIDAR system based on each particular principle of operation is capable of.
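
One piece of the physics and math in question is the lidar range equation for a diffuse target that fills the beam: received power falls off as 1/R², which bounds what any architecture can achieve. The sketch below evaluates it with illustrative, assumed parameter values:

```python
# Minimal sketch of the lidar range equation for a Lambertian target
# filling the beam: P_rx = P_tx * rho * (A_rx / (pi * R^2)) * eta
#                          * exp(-2 * alpha * R)
# where rho is target reflectivity, A_rx the receiver aperture, eta the
# optical efficiency, and alpha the one-way atmospheric attenuation.
import math

def received_power(p_tx_w, range_m, reflectivity=0.1,
                   aperture_m2=5e-4, efficiency=0.7, alpha_per_m=5e-5):
    geometric = aperture_m2 / (math.pi * range_m ** 2)
    atmospheric = math.exp(-2.0 * alpha_per_m * range_m)
    return p_tx_w * reflectivity * geometric * efficiency * atmospheric

# Doubling the range quarters the return, independent of architecture.
for r in (50, 100, 200):
    p = received_power(p_tx_w=75.0, range_m=r)
    print(f"{r:4d} m -> {p * 1e9:.2f} nW at the detector")
```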

Biography: Harvey Weinberg is the Division Technologist for the Analog Devices Automotive Business Unit. Over the past few years he has been working on long-time-horizon technology identification as it pertains to automotive, lately focused principally on LIDAR. Prior roles at Analog Devices include System Application Engineering Manager for the Automotive BU and, before that, leader of the Applications Engineering group for MEMS inertial sensors. He has 8 US patents in technologies ranging from ultrasonic airflow measurement, to inertial sensor applications, to LIDAR systems. He has been at Analog Devices for 19 years. Before that, he worked for 12 years as a circuit and systems designer specializing in process control instrumentation. He holds a Bachelor of Electrical Engineering degree from Concordia University in Montreal, Canada.


Technology Showcase Presenters

(listed alphabetically, by speaker’s last name)

Nidal Abbas
Business Development Manager
Excelitas Technologies

Sagor Bhuiyan
Technology Strategist
AutoHarvest

Tony Gioutsos
Director of Sales and Marketing
TASS International

Makoto Motoyoshi
CEO
Tohoku-MicroTec


Past Speakers

GNSS for ADAS applications
Chaminda Basnayake, PhD
Principal Engineer
Renesas Electronics America

GPS/GNSS (global navigation satellite systems) will likely be the core positioning system in most, if not all, ADAS. While other onboard sensors such as radar, vision and lidar sense and interpret the surrounding environment, GNSS and maps will enable vehicles to see beyond the horizons of those sensors. In addition, GNSS and vehicle connectivity may be required to enable robust sensing, especially in real-life scenarios where sensor vision is limited by obstructions. Ongoing modernization of GPS (i.e., multiple civilian signals) and the deployment of other GNSS such as Galileo add more robustness to the GNSS solutions available to future ADAS implementations. This presentation will review key GNSS use cases in ADAS applications, the GNSS technology options available, and possible limitations and challenges. Discussion will also include how complementary sensor technologies in ADAS can be used with GNSS to design systems that meet the reliability, complexity, and cost constraints for mass deployment.

Biography: Dr. Chaminda Basnayake is a principal engineer with Renesas Electronics America, where he leads all Renesas V2X activities in North America. Prior to that, he was with General Motors R&D and GM OnStar as a senior engineer. He served as the GPS/GNSS subject matter expert for GM and the USDOT-automotive OEM collaboration Crash Avoidance Metrics Partnership (CAMP) consortium during his 10 years at GM. He has authored or co-authored over 20 GM patents and numerous publications and presentations on GNSS, V2X, and telematics systems. His career highlights include winning GM’s highest award for excellence and being named one of the 50 Leaders to Watch in the GNSS industry by GPS World magazine. Dr. Basnayake holds a PhD in geomatics engineering from the University of Calgary, Canada.


Optimizing resolution and cost of lidar sensors in a fusion environment
Jean-Yves Deschênes
President
Phantom Intelligence

While the lidar industry is pursuing the very high-density point clouds used in mapping applications, ADAS functions such as collision warning can rely on much less dense distance data, with cameras handling the high density required for obstacle classification. To optimize costs, different densities can be used for different applications (e.g., high density for frontal collision and navigation purposes, and lower pixel densities for side or rear collision warning). New lidar technology alternatives, including solid-state, MEMS-scanned, mechanically-scanned and phased-array techniques, promise to make lidar sensing even more cost-competitive. The problem now is choosing the right technology from the available alternatives. In addition to explaining the nuances of the available and emerging technologies, this presentation will discuss the selection criteria for various lidar solutions so that both system designers and users can pursue the most cost-effective one for their design.

Biography: As co-founder and President of Phantom Intelligence, a Tier One company that develops collision warning sensors and components based exclusively on lidar technology, Jean-Yves Deschênes oversees the definition of strategic orientations and manages the resources required for executing the company’s strategic objectives. In this role he contributes to the company’s international visibility and to the adoption of its solutions. Deschênes has a computer science degree from Université Laval in Canada and over 30 years of combined software, optics and now lidar technology experience, as well as extensive experience on projects involving international collaborations and radical new uses of technology.


ADAS sensing: a change in the road ahead?
Mark Fitzgerald
Associate Director, Global Automotive Practice
Strategy Analytics

Advanced driver assistance system (ADAS) applications will continue to be the largest driver of automotive sensor growth through 2022. The growing demand for ADAS in mass-market vehicles comes from: (a) increasing consumer awareness of and demand for ADAS technologies; (b) the potential for reduced insurance and repair costs when using ADAS; (c) increasing demand for highly specified compact-segment models; (d) growing OEM competition, using safety features as a means of differentiation; and (e) mandates (or pseudo-mandates) requiring the fitment of technologies such as autonomous emergency braking (AEB). These mandates will mean that new solutions are required to meet the cost/performance requirements of the compact model segments. Changes to the existing approach to ADAS sensing will come both from current suppliers’ continued improvement of their systems and from new players with innovative ideas. The talk will provide a comprehensive overview of sensor technologies for ADAS applications, including cameras, bolometers, lidar, ultrasonic and infrared sensors, as well as short-, medium- and long-range radar.

Biography: Mark Fitzgerald is the Associate Director for the Global Automotive Practice at Strategy Analytics. He manages inquiries and analytical research of the North American market for the Automotive Electronics (AES) and Automotive Multimedia and Communications Service (AMCS) market segments. Mark’s experience includes research, forecasting and consulting on automotive electronics and sensor applications in powertrain control, passenger safety, vehicle stability, and in-vehicle infotainment and connectivity systems. Mr. Fitzgerald is the author of Strategy Analytics’ Automotive Sensor Demand Forecast and Outlook report. Prior to joining Strategy Analytics, Mark was a marketing analyst for Pollak Engineered Products, a tier-one supplier of sensor and switch products to the automotive industry, where he was responsible for performing and managing technology development and competitive market analysis in the area of automotive electronics. Previously, Mark headed the North American Automotive Forecast Database Operations for IHS Automotive. Mr. Fitzgerald holds a BS in Business Management from Providence College.


Solid-state lidar for wide-spread ADAS and automated driving deployment
Axel Fuchs, PhD
Vice President of Business Development
Quanergy

Achieving automotive reliability and cost targets for wide deployment of ADAS requires significant technology advancements. Solid-state lidar and 3D perception have the capability to meet these challenges, offering improvements in range, resolution, accuracy and speed of object detection. Sensor performance, form factor reduction and power consumption are also areas needing considerable improvement relative to existing mechanical lidar systems: integrated solid-state lidar sensors can be small enough to fit in the palm of a hand and be mounted behind a grille, inside a bumper, inside a side-view mirror or behind a rear-view mirror. However, sensing hardware is only part of the system solution; 3D perception software for real-time object detection, tracking, and classification is the other part. The data collected is used to greatly improve the accuracy and reliability of environment perception for advanced driver assistance and active safety systems. This presentation will discuss an integrated solid-state solution for adding value to ADAS systems and enabling safe autonomous driving.

Biography: Dr. Axel Fuchs is Vice President of Business Development at Quanergy Systems. He is a technology executive with 30 years of experience managing business and innovation for Daimler, Motorola, and smaller high-tech companies. Before joining Quanergy, he held senior business development positions at Telenav, EZ4Media, and Universal Electronics. Earlier, Dr. Fuchs led the systems engineering department of the Motorola Automotive Group, where he collaborated with all major carmakers on integrating innovative electronics systems, and worked for Daimler-Benz, where he co-founded the Daimler-Benz research office in Palo Alto, California. There he pioneered the first Internet-connected car and received the Smithsonian Award for Excellence. Dr. Fuchs holds more than 12 system-level patents in digital media, mobile computing, navigation, automotive electronics, communications, and control engineering. He holds a doctorate in electrical engineering from Darmstadt University of Technology, Germany.


Advanced physics based sensor simulation approaches for testing automated and connected vehicles
Tony Gioutsos
Director of Sales and Marketing
TASS International

In order to provide a “due care” testing approach for automated and connected vehicle technology, advanced sensor simulation must be involved. Although real-world (field) tests and test track testing are required, simulation can provide the bulk of the testing and also provide tests that cannot be produced via real-world or test track testing. Furthermore, to provide the most accurate validation, sensor simulation closest to the “raw data” level is preferred. Besides the deterministic data that a simulation program can produce, it is also important to develop probabilistic models that correlate with real-world data. In this talk, advanced physics-based sensor models with deterministic and probabilistic components will be introduced. The models described include camera, radar and V2X. These models can be used to produce receiver operating characteristic (ROC) curves and other measures of detection and estimation system performance, and using these measures allows for a robust system in real-world operating conditions.
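
To illustrate how a probabilistic sensor model yields an ROC curve, the sketch below samples detector scores for target-present and target-absent cases and sweeps the decision threshold; the Gaussian score model is a stand-in assumption, not one of TASS's physics-based models:

```python
# Minimal sketch of building an ROC curve from a probabilistic sensor
# model: sample detection scores with and without a target present, then
# sweep the decision threshold and record (P_false_alarm, P_detect) pairs.
import random

random.seed(0)
absent = [random.gauss(0.0, 1.0) for _ in range(5000)]   # noise only
present = [random.gauss(2.0, 1.0) for _ in range(5000)]  # signal + noise

def roc_point(threshold):
    p_fa = sum(s > threshold for s in absent) / len(absent)
    p_d = sum(s > threshold for s in present) / len(present)
    return p_fa, p_d

# Each threshold trades false alarms against missed detections.
for thr in (0.5, 1.0, 1.5, 2.0):
    p_fa, p_d = roc_point(thr)
    print(f"thr {thr:3.1f}: P_fa {p_fa:.3f}  P_d {p_d:.3f}")
```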

Biography: Mr. Gioutsos has been involved with automotive safety systems since 1990. As Director of Electronics R&D for both Takata and Breed Technologies, he was at the forefront of the safety revolution. His cutting-edge work on passive safety algorithm design and testing led to a startup company that was purchased by Breed Technologies. After receiving his master’s degree in electrical engineering from the University of Michigan, Mr. Gioutsos worked on satellites and radar imaging for defense applications before joining Takata. He has been a consultant for various companies in areas such as biomedical applications, gaming software, legal expert advisory, and numerous automotive systems. Mr. Gioutsos is currently Director of Sales and Marketing in the Americas for TASS International, where he has continued to define active safety algorithm testing requirements while working on various other state-of-the-art approaches to enhance automated and connected car robustness. He has been awarded over 20 patents and presented over 40 technical papers.


Sensor fusion: the “holy grail” of advanced safety and driving automation
Moderator: David McNamara
Technology Mining
Magna International

Driving automation, especially Level 3 and 4 systems, challenges today's ADAS systems to be robust under all driving conditions, with 360-degree coverage to detect the “car that came out of nowhere”. The New Car Assessment Program rating system will incentivize OEMs to introduce new technology, such as automatic emergency braking (AEB) and pedestrian collision warning, to prevent intractable accident scenarios like the “person who came out of nowhere”. The conventional wisdom is that a wide range of sensor technologies will be integrated using sensor fusion: GPS, cameras, ultrasonics, radar, LIDAR and V2X connectivity. Automation requires data from this diverse set of sensors to be fused into a complete and trustworthy understanding of the driving environment. Engineers often invoke a “kitchen sink” system by suggesting we integrate all known sensor technologies. This approach, though enticing, raises issues of affordability and the practicality of integrating such a diverse sensor data set. We can easily envision a car with 8 to 12 vision sensors, two long-range radar sensors, and four LIDARs at the corners, plus V2X connectivity. Sensor characterization, systems engineering discipline, and sensor fusion are the enablers for designing a robust, affordable system, one that is available not only on luxury vehicles but also offered on mass-market vehicles. This panel of leading industry experts will provide insights into how we can build robust Level 3 and 4 automation from a set of affordable, high-performance sensors. We will identify and examine sensor technologies and address development and integration challenges. How do we reliably employ connectivity (e.g., V2X) and cloud-based solutions? What system design approaches (centralized versus decentralized control) and what role for cloud-based data should we consider? And what sensor fusion strategies should we adopt?

Biography: David A. McNamara is with Magna International R&D, with responsibility for technology mining as part of Magna's Open Innovation Process for identifying and evaluating promising companies and technologies. Innovation is a hallmark of Magna, the leading North American automotive manufacturer. In the growing area of ADAS sensors, Magna recently announced its new vision system, the EYERIS® Generation 3.0 camera system, to enable future advanced safety and driving systems. Dave's career in automotive electronics spans 39 years, with extensive experience in the design and launch of high-volume, robust automotive systems: navigation, radar-based ACC, audio/multimedia, device connectivity, V2X and, most recently, driving automation. His career at Ford Product Development and Ford Research, from 1976 until his retirement in 2006, included several Ford firsts: the 97MY Mondeo navigation system, the 2000MY Jaguar radar-based ACC, THX Audiophile for Lincoln, and the 2006 Jaguar audio connectivity system, a forerunner of SYNC. From 2006 to 2015, prior to his position at Magna R&D, Dave worked as an automotive electronics consultant (MTS LLC), advancing a wide range of safety and V2X related projects for clients that included the USDOT, automotive OEMs and suppliers.


The next generation of optical time-of-flight technology: accelerating ADAS deployment with increased performance and affordability
Michael Poulin
Director of Product Management
LeddarTech

Optical time-of-flight sensors are seen as a key enabling technology for ADAS and autonomous cars. However, many challenges have slowed their adoption in commercial deployments, as affordable optical sensors tend to fall short in performance, while higher-end solutions remain cost-prohibitive for mainstream, high-volume implementations. This session will give you a better understanding of how a new generation of optical detection and ranging sensors is emerging, based on an advanced optical time-of-flight technology. Packaged into chipsets, this technology enables highly optimized solid-state optical sensor systems that bridge the cost-performance gap, meeting the automotive industry’s stringent requirements for applications that span from cost-sensitive ADAS to high-performance autonomous vehicle systems. Examples of how these compact optical sensors can easily be integrated into standard automotive components, such as headlamps and rear lamps, will be presented, along with results from road trials featuring vehicle-mounted sensors.

Biography: Michael Poulin is Director of Product Management at LeddarTech, a supplier of advanced detection and ranging solutions based on the patented, leading-edge Leddar® sensing technology. He has been with the company since 2010, notably as Director of Product Development and as Project Manager. Specializing in digital signal processing and embedded systems development, his previous work included leading and contributing to system and software engineering for the development of neurosensing and neurostimulation implantable medical devices, motorized prosthetics and embedded systems in telecommunications and other industries.


The role of imaging in ADAS safety systems
Narayan Purohit
Product Line Manager for Automotive Image Sensor Products
ON Semiconductor

Advanced driver assistance systems (ADAS) are crucial to providing drivers with the safety, performance and comfort demanded of modern cars. The logical next step for this technology is the ability to effectively assist the driver when their course of action is deemed a risk to themselves or other road users. Today, ADAS technology provides a breadth of capability far beyond what was offered five years ago. The distinguishing feature of next-generation ADAS will be the ability to ‘decide’ when to override driver control, or when to give the human the benefit of the doubt. Conversely, autonomous vehicles do not need to factor in human input beyond very basic start/stop functions and any necessary safety overrides. Autonomous cars are likely to require the fusion of many different sensing technologies, including lidar, radar, camera systems, and others. This presentation will discuss the implications of high-performance, yet cost-effective, image sensing in these next-generation systems.

Biography: Narayan Purohit is currently the Product Line Manager for automotive image sensor products at ON Semiconductor. He has been with ON Semiconductor/Aptina Imaging for over six years in strategic and business development roles, responsible for the automotive image sensor business and product portfolio, from product definition to the market rollout of several new image sensor products. Narayan has over 20 years of experience spanning design, product development, applications, product management, business development and strategic marketing for various high-performance products, including image sensors. He holds an MSEE from Michigan State University.


(2016 Technology Showcase speaker)

AutoHarvest ecosystem welcomes the engineer
Jayson Pankin
Co-founder, President and CEO
AutoHarvest Foundation

AutoHarvest Foundation, a 501(c)(3) nonprofit, created and operates a unique innovation ecosystem led by some of the most highly respected figures in the automotive and manufacturing industries. In 2012, AutoHarvest.org was launched as the world’s only truly neutral and global online meeting place for innovators of all types with an interest in advanced manufacturing. The system allows users to showcase capabilities, technologies and needs system-wide and then privately connect with fellow inventors and commercializers to explore technology and business development opportunities of mutual interest. The AutoHarvest interest group consists of over 300 prominent R&D and manufacturing organizations from industry, government and academia. Please visit www.autoharvest.org.

Biography: Jayson D. Pankin is co-founder, President, and CEO of the nonprofit AutoHarvest Foundation. Jayson and his partner, Dr. David E. Cole, created a unique innovation ecosystem led by some of the most highly respected figures in the automotive and manufacturing industries. In 2012, AutoHarvest.org was launched as the world’s only truly neutral and global online meeting place for innovators of all types with an interest in advanced manufacturing intellectual property. From 2003 to 2010, Jayson led Delphi Automotive’s commercialization activities targeting spin-outs of potentially disruptive technologies into start-up companies. He has been a venture partner specializing in early-stage and turnaround situations, and was named by IAM Magazine for two years running as one of the World’s Leading IP Strategists. He earned his BBA in accounting and MBA in international business at the George Washington University.


Call for Speakers

If you’d like to participate as a speaker, please call Dr. Mike Pinelis at 734-277-3599 or send a brief email with your proposed presentation topic to mike@memsjournal.com. All speakers will receive a complimentary pass to the conference.

Conference scope includes topics related to automotive sensors and electronics such as:

  • Vehicle vision alternatives
  • Solid state lidar for production vehicles
  • Radar vs. lidar – are the options changing?
  • Transitioning vision sensing from luxury to high volume vehicles
  • High resolution video cameras / improving the resolution of video cameras
  • Infrared (IR) cameras for both the visible and IR spectrum
  • The role of ultrasonic sensors in ADAS
  • Sensor fusion, and handling image processing and the vast amounts of data from vision and ranging sensors with advanced algorithms and computing power
  • Sensing/seeing beyond 200-300 meters (GPS and cloud data)
  • 5G cellular and Dedicated Short Range Communications (DSRC) impact on ADAS