Abstract: Head-up displays in automobiles today use a microdisplay and magnifier optics to create a virtual image at a fixed depth in front of the car. We developed a spatial-light-modulator-based holographic 3D HUD technology that uses eye tracking to calculate an optimized computer-generated hologram (CGH). This approach avoids the accommodation-vergence conflict present in typical stereoscopic displays and produces an image that mimics natural human vision: distant objects appear out of focus when the viewer focuses on closer ones. When integrated with the cameras and other sensors of next-generation cars, the holographic HUD provides a true augmented reality experience. It is software upgradable and offers a number of inherent optical advantages, such as dynamic aberration correction, software-based adaptation to different car models, high light efficiency, and true 3D AR capability with correct parallax cues.
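For readers unfamiliar with CGH computation, the sketch below illustrates the general idea with the classic Gerchberg-Saxton phase-retrieval iteration; it is an assumed, simplified illustration only (the target pattern, resolution, and iteration count are arbitrary), not the eye-tracked optimization described in the talk.

# Minimal Gerchberg-Saxton sketch for computing a phase-only CGH.
# Illustrative assumption only; not the optimization used in this work.
import numpy as np

def gerchberg_saxton(target_amplitude, iterations=50):
    """Retrieve a phase-only hologram whose far-field intensity
    approximates |target_amplitude|^2."""
    # Start from a random phase guess in the hologram plane.
    phase = np.random.uniform(0, 2 * np.pi, target_amplitude.shape)
    for _ in range(iterations):
        # Propagate hologram plane -> image plane (FFT models far-field propagation).
        field_img = np.fft.fft2(np.exp(1j * phase))
        # Keep the propagated phase, replace the amplitude with the target.
        field_img = target_amplitude * np.exp(1j * np.angle(field_img))
        # Propagate back to the hologram plane.
        field_holo = np.fft.ifft2(field_img)
        # Enforce the phase-only constraint of the spatial light modulator.
        phase = np.angle(field_holo)
    return phase

if __name__ == "__main__":
    # Toy target: a bright square on a dark background.
    target = np.zeros((256, 256))
    target[96:160, 96:160] = 1.0
    cgh_phase = gerchberg_saxton(target)
    print(cgh_phase.shape)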
Bio: Dr. Goksen Yaralioglu received his PhD degree from the Electrical and Electronics Engineering Department of Bilkent University, Turkey, in 1999. Between 1999 and 2006, he worked as an Engineering Research Associate at Stanford University, where he developed microelectromechanical systems (MEMS) sensors and actuators. He then joined Invensense as a MEMS design engineer and later directed production test and calibration of MEMS inertial sensors at the same company. Dr. Yaralioglu has published more than 50 journal articles and 4 book chapters, and he holds 10 patents. Recently, he co-founded CY Vision Inc (San Jose, CA) to develop next-generation AR displays for automotive applications.