Application of Lasers for Sensing & Free Space Communication

25 June 2018 – 28 June 2018, Wyndham Orlando Resort International Drive, Orlando, Florida, United States

Laser systems satisfy many critical needs in sensing and high-bandwidth free-space optical (FSO) communications. This meeting will especially highlight means by which quantum protocols can enhance the performance of these important application areas. Challenges common to both sensing and communication systems, such as atmospheric effects, are also of strong interest. In addition to applications and systems, this meeting covers enabling technologies that are often common to diverse laser-based systems. Advances in sources, waveform/wavefront modulation techniques, receivers, optical apertures, and post-detection processing schemes are needed and will be explored within this meeting.


Topics

1.) Quantum protocols
         Quantum imaging & sensing
         Quantum communications
2.) Lidar for imaging
         3D imaging
         Range profiling
         Photon counting imaging & ranging
         Synthetic aperture lidar & other coherent systems
3.) Lidar for remote sensing
         DIAL & other atmospheric lidar
         Wind sensing
         Remote vibrometry
4.) Free-space optical communication
         Atmospheric effects and compensation
5.) Component technologies
         Laser sources
         Receivers
         Optical apertures & beam control
         Post-detection processing algorithms
 



Speakers

  • Paul McManamon, Exciting Technology LLC, United States
    Experiences as an Expert Witness in the Uber vs Google/Waymo Lidar for Driverless Car Case (Tutorial)
  • Claudine Besson, Office Natl d'Etudes Rech Aerospatiales, France
    New Developments in Active Sensing at Onera
  • Charles Bouman, Purdue University, United States
    A Bayesian Framework for Imaging and Atmospheric Sensing using Coherent Laser Radar
  • Robert Boyd, Ottawa University, Canada
    Quantum Key Distribution (QKD) Using Full Laguerre-Gauss Encoding
  • Scott Davis, Analog Devices Inc., United States
    Compact Steering Technologies for Automotive LiDAR: a Comparison Between Liquid Crystal Clad Waveguides and Optical MEMs
  • Patrick Feneyrou, Thales Research and Technology, France
    Novel Development for FMCW Lidar
  • Mohammad Hashemi, University of Miami, United States
  • Mark Itzler, Princeton Lightwave Inc., United States
    Automotive LiDAR with short-wave infrared Geiger-mode detectors
  • Gordon Keeler, Sandia National Laboratories Livermore, United States
    Opportunities for LIDAR and Free-Space Optical Communications Using Micro-Scale Photonics Technologies
  • Jeff Lundeen, Ottawa University, Canada
    Weak Value Amplification: What is it and is it useful?
  • Patrick Maine, QUANTEL, France
    Lasers for LIDAR & LIDAR Systems: Recent Developments at Quantel and Keopsys, Spanning Pulsed Laser Diodes, Eyesafe Fiber Lasers and High Average Power DPSS Lasers
  • Mehul Malik, University of Rochester, Austria
    Noise-resistant Entanglement-based Quantum Communication
  • Whitney Mason, Defense Advanced Res Projects Agency, United States
    Reimagine
  • Mohammad Mirhosseini, California Institute of Technology, United States
    Encoding Quantum Information on the Full Spatial Bandwidth of Photons
  • Dmitry Pushin, University of Waterloo, Canada
    Structured Waves: From Matter to Light
  • Kevin Resch, University of Waterloo, Canada
    Ultrafast Measurement of Energy-time Entangled States
  • Barry Sanders, University of Calgary, Canada
    Machine Learning for Adaptive Quantum Metrology
  • Zhimin Shi, University of Florida, United States
    Turbulence-Resistant Free Space Communication Using Vector Beams
  • Andrew Sparks, Analog Devices, United States
  • Karin Stein, Fraunhofer IOSB, Germany
    Correction of Atmospheric Effects on Laser Beams for Sensing and Communication
  • Alan Willner, University of Southern California, United States
    Free-Space Quantum Communication Links using Orbital-Angular-Momentum
  • Franco Wong, Massachusetts Institute of Technology, United States
    Quantum Key Distribution at Gigabit-per-second Secret-key Rates



Committee

  • Walter Buell, The Aerospace Corporation, United States, Chair
  • Edward Watson, University of Dayton, United States, Chair
  • Claudine Besson, Office Natl d'Etudes Rech Aerospatiales, France
  • Robert Boyd, Ottawa University, Canada
  • Richard Heinrichs, Massachusetts Institute of Technology, United States
  • Thomas Karr, OUSD(R&E), United States
  • Robert Lamb, Selex ES, United Kingdom
  • Paul McManamon, Exciting Technology LLC, United States
  • David Rabb, US Air Force Research Laboratory, United States
  • Karin Stein, Fraunhofer IOSB, Germany



Plenary Session

Paul Debevec

Google VR, USA

Light Fields and Light Stages for Photoreal Movies, Games, and Virtual Reality

This talk will present work from USC ICT and Google VR in creating actors and environments for movies, games, and virtual reality. The Light Stage computational illumination and facial scanning systems are geodesic spheres of inward-pointing LED lights which have been used to create digital actor effects in movies such as Avatar, Benjamin Button, and Gravity, and have recently been used to create photoreal digital actors based on real people in movies such as Furious 7, Blade Runner 2049, and Ready Player One. The lighting reproduction process of light stages allows omnidirectional lighting environments captured from the real world to be accurately reproduced in a studio, and has recently been extended with multispectral capabilities to enable LED lighting to accurately mimic the color rendition properties of daylight, incandescent, and mixed lighting environments. They have also recently used their full-body light stage in conjunction with natural language processing and automultiscopic projection to record and project interactive conversations with survivors of the World War II Holocaust. Debevec will conclude by discussing the technology and production processes behind "Welcome to Light Fields", the first downloadable virtual reality experience based on light field capture techniques, which allow the visual appearance of an explorable volume of space to be recorded and reprojected photorealistically in VR, enabling full 6DOF head movement.

About the Speaker

Paul Debevec is a research professor at the University of Southern California and the associate director of graphics research at USC's Institute for Creative Technologies. Debevec's Ph.D. thesis (UC Berkeley, 1996) presented Façade, an image-based modeling and rendering system for creating photoreal architectural models from photographs. Using Façade he led the creation of virtual cinematography of the Berkeley campus for his 1997 film The Campanile Movie, whose techniques were used to create virtual backgrounds in The Matrix. Subsequently, Debevec pioneered high dynamic range image-based lighting techniques in his films Rendering with Natural Light (1998), Fiat Lux (1999), and The Parthenon (2004); he also led the design of HDR Shop, the first high dynamic range image editing program. At USC ICT, Debevec has led the development of a series of Light Stage devices for capturing and simulating how objects and people reflect light, used to create photoreal digital actors in films such as Spider-Man 2, Superman Returns, The Curious Case of Benjamin Button, and Avatar, as well as 3D display devices for telepresence and teleconferencing. He received ACM SIGGRAPH's first Significant New Researcher Award in 2001, co-authored the 2005 book High Dynamic Range Imaging from Morgan Kaufmann, and chaired the SIGGRAPH 2007 Computer Animation Festival. He serves as Vice President of ACM SIGGRAPH and is a member of the Visual Effects Society, the Academy of Motion Picture Arts and Sciences, and the Academy's Science and Technology Council.

Jason Eichenholz

Luminar Technologies, USA

OSA Light the Future Presentation: The Role of Optics and Photonics in the Vehicles of Tomorrow

In this presentation, OSA Fellow Jason Eichenholz will take a high-level look at the future of optics and photonics technologies in autonomous vehicles. Optics are a crucial component in an industry headed for extreme disruption over the next few decades and will play a critical role in shaping the future of navigation, the passenger experience, and the ultimate safety of the autonomous trip. Eichenholz will discuss the key optical components, including LiDAR, laser headlights, passenger monitoring, and interior lighting and displays; the role each plays inside a future automobile; and their impact on the transportation industry.

About the Speaker

Jason Eichenholz is a serial entrepreneur and pioneer in laser, optics, and photonics product development and commercialization. Over the course of his twenty-five-year career, his unique blend of business and technical leadership has resulted in hundreds of millions of dollars of new photonics products being brought to market. Eichenholz is an inventor on ten U.S. patents on new types of solid-state lasers, displays, and photonic devices.

Laurent Pueyo

Space Telescope Science Institute, USA

Exoplanet Imaging: From Precision Optics to Precision Measurements

During this plenary talk, Laurent Pueyo will present recent observational results in exoplanet imaging and discuss prospects for similar experiments on NASA missions such as the upcoming James Webb Space Telescope and the Large UV/Optical/IR Surveyor, currently under study.

About the Speaker

Laurent Pueyo is an astronomer at the Space Telescope Science Institute, in Baltimore, Maryland. He earned his doctorate from Princeton University in 2008 and conducted his post-doctoral work as a NASA Fellow at the Jet Propulsion Laboratory and as a Sagan Fellow at the Johns Hopkins University. His research focuses on imaging faint planets around nearby stars. He has pioneered advanced data analysis methods that are now standard tools used to study extrasolar planets, and invented an optical technique that is now baselined for future NASA missions. At STScI his duties include optimizing the extrasolar-planet imaging capabilities of NASA's James Webb Space Telescope (JWST), scheduled to launch in late 2019. He is also a member of the Science and Technology Definition Team for the Large Ultraviolet Optical and Infrared telescope, a future observatory that will identify Earth-sized planets and assess their habitability.



Special Events

 

Digital Holographic Microscopy: Present and Future Panel Discussion

Monday, 25 June, 12:30–14:00
Join the OSA Holography and Diffractive Optics Technical Group for a panel discussion exploring potential breakthroughs in digital holographic microscopy. Brief presentations from our featured panelists will be followed by a moderated question and answer session, helping facilitate the exchange of information with our community. Contact TGactivities@osa.org to register, pending availability.

Congress Reception

Monday, 25 June; 18:30–20:00
Come join your colleagues for drinks, networking, and thoughtful discussion, and enjoy light fare. The reception is open to all full conference attendees. Conference attendees may purchase extra tickets for their guests.

Student & Early Career Professional Development & Networking Lunch and Learn

Tuesday, 26 June; 12:30-14:00
This program will provide a unique opportunity for students and early career professionals who are close to finishing, or have recently finished, their doctoral degree to interact with experienced researchers. Key industry and academic leaders in the community will be matched with each student based on the student's preferences or similarity of research interests. Students will have an opportunity to discuss their ongoing research and career plans with their mentor, while mentors will share their professional journey and provide useful tips to those who attend. Lunch will be provided.

This workshop is complimentary for OSA Members, and space is limited. Because of space limitations, not all who apply will be able to attend; priority will be given to those who have recently graduated or are close to graduation.
Hosted by OSAF

50th Anniversary of Introduction to Fourier Optics by Joseph Goodman

Tuesday, 26 June; 13:30-19:30
This year marks the 50th anniversary of the publication of Introduction to Fourier Optics by Joseph Goodman, a book that has had a fundamental influence on the field of optical imaging. To commemorate this anniversary, a special series of talks will be presented, ranging from Fourier optics in the classroom to the evolution of the field.

Join the Image Sensing and Pattern Recognition Technical Group for a small reception immediately following the conclusion of the program.
 
  • Joseph W. Goodman, Stanford University, USA, Origins and Evolution of Introduction to Fourier Optics
  • James Fienup, University of Rochester, USA, ABCD Matrix Analysis for Fourier-Optics Imaging
  • Raymond Kostuk, University of Arizona, USA, A Review of the Wonderful Discussion of Holography by Professor Goodman in His Book, Introduction to Fourier Optics
  • James Leger, University of Minnesota Twin Cities, USA, What’s the Problem? Insight and Inspiration Derived from Solving the Exercises in J. Goodman’s Classic Book Introduction to Fourier Optics
  • Masud Mansuripur, University of Arizona, USA, Fourier Optics in the Classroom
  • Demetri Psaltis, Ecole Polytechnique Federale de Lausanne, Switzerland, The Transition of Fourier Optics Towards Computational Imaging and Digital Holography
  • William T. Rhodes, Florida Atlantic University, USA, Teaching Fourier Optics: What I do Differently after 50 Years
  • Bahaa Saleh, University of Central Florida, USA

Illumicon II

Tuesday, 26 June 2018, 19:00 – 21:00
You are invited to join the OSA Display Technology Technical Group for Illumicon II, an exclusive members-only event. Building on the declarations established at the inaugural Illumicon, convened in 2016, attendees will come together to discuss and debate emerging trends, technologies, and opportunities in advanced 3D displays. Our discussions will also seek input on how the Display Technology Technical Group can further engage the 3D community in the years ahead. Illumicon II attendees will converge over drinks and appetizers at a confidential location. Entrance will be granted to those able to provide the secret Illumicon II event password. RSVP to tgactivities@osa.org to receive the event location and password.

 

Applications of Visual Science Technical Group Networking Lunch

Wednesday, 27 June 2018, 12:00–13:00
Members of the OSA Applications of Visual Science Technical Group are invited to join us for a networking lunch on Wednesday. The event will provide an opportunity to connect with fellow attendees who share an interest in this field and to learn more about this technical group. Contact TGactivities@osa.org to register, pending availability.



Tour of Laser Propagation Facilities at Kennedy Space Center

Thursday, 28 June (13:00–18:00) and Friday, 29 June (07:00–12:00)
Additional Fee: $25 per person.
*Fee includes transportation only. Transportation will leave from the conference hotel and return to the Orlando International Airport and the conference hotel.

During this tour at Kennedy Space Center (KSC), you will see various facilities used for outdoor field experiments such as laser propagation measurements. The tour will include UCF's Townes Institute Science and Technology Experimentation Facility (TISTEF), the Shuttle Landing Facility (SLF), and the Vehicle Assembly Building (VAB). TISTEF is a site for experiments that require deployment in a fielded setting; it consists of a 1 km grass range equipped with atmospheric monitoring instruments and multiple scintillometers, as well as capabilities for optical tracking and remote sensing. From this site, slant-path measurements can be made over the 13 km path to the top of the VAB. The 5 km long SLF is ideal for longer path measurements because of its homogeneity and flatness (the earth's curvature has been removed). This tour is made possible by the pcAOP committee and the University of Central Florida.

 

Student Grand Challenge: The Optical Systems of the Future

The challenge is open to OSA student members and their advisors interested in presenting concepts for enhanced machine vision, or systems that enhance the human vision system by augmenting or extending another human sense.
 
Individuals or teams are invited to submit ideas for either a novel passive or active optical system in the form of a 35-word abstract and a 2-page summary highlighting the novelty, originality, and feasibility of the concept. (Additional materials may include videos highlighting system mock-ups or demos.)
 
Up to four finalists will be chosen to attend the Imaging and Applied Optics Congress, 25–28 June 2018 in Orlando, Florida, USA, to present a 3-minute synopsis of their concept as well as to host a poster during the conference poster session. Finalists will receive a travel stipend of up to $2,000 USD to cover airfare and hotel, as well as full technical registration for the congress.
 
Two winners will be announced on-site and will receive a recognition plaque and a $250 prize.

Sponsored by Lockheed Martin and the OSA Foundation.

Passive Optical System Challenge Problem

The image processing community strives to duplicate human vision. For certain specific and well-defined tasks we have matched or surpassed human capability, but we still struggle with poorly defined and dynamic environments. A category-by-category comparison of machine vision and human vision includes:
  • Spectrum: Machine vision is superior, as human vision is limited to the visible spectrum. Machine vision can also resolve narrower spectral steps and larger dynamic ranges than our eyes.
  • Resolution: Human vision is superior. Current machine vision systems approaching 8K x 8K formats are starting to close the gap, but only in the visible band.
  • Focus: Human vision is superior, able to focus from very close to very far with a single lens element. The eye's aperture is limited by the size of the pupil for distant objects. Machine vision systems are designed specifically for very close or very far ranges and do not suffer from this aperture limit, but they require many lens elements to accomplish the same tasks as the human eye.
  • Optical Processing: Human vision plus the brain is superior to machine vision in pattern recognition and decision making.
The passive optical system challenge is to create a novel concept, technology, or system for improving results in one of the eight categories below:
  • Image Processing: Ideas focused on detection and categorization of objects in the field of view.
  • Lens Technology: Ideas focused on optical sensors.
  • High-Speed Data Transport: Ideas focused on fast and efficient transport of high-resolution image data and streams.
  • Adaptable Lens Technology: Ideas focused on adaptable optical sensors.
  • Liquid Lens Optical Sensors: Ideas focused on liquid lenses.
  • Artificial Intelligence: Ideas focused on image-based cognition.
  • AR/VR Technology: Ideas focused on augmented and virtual reality technologies.
  • Other: Ideas that do not fall into one of the existing categories.

Active Optical System Challenge

Given that the human vision system is intrinsically passive, how would you use active sensing techniques to augment it to mimic or extend the human senses? Augmentation could mean adding higher-precision 3D vision, active foveated imaging, active IR-assisted sensing, vibrometry, polarimetry, sensing motion in the field of view (FOV), chemical/biological sensing, seeing through fog or turbulence, etc. Many such systems have been demonstrated, but they are often large, heavy, and costly.
 
The active optical system challenge is to come up with novel sensor concepts that mimic at least two of the human senses at a distance of at least 10 m, with the sensor fitting into one third the volume of the human brain (roughly 0.5 liters). More sensing modalities are encouraged, especially those that extend what humans can do.
 
  1. Sight (e.g., producing 2D or 3D images)
  2. Hearing (e.g., measuring object vibrations through optical means)
  3. Smell (e.g., chemical/biological sensing)
  4. Taste (e.g., chemical/biological sensing)
  5. Touch (e.g., characterization of surface texture and/or temperature)

Rules

  • Open to undergraduate and graduate students only.
  • Teams must include at least one OSA student member and at least one advisor who is an OSA member.
  • Required submission format: PDF with a 35-word abstract and a 2-page summary.
    • Optional submission material: videos, system mock-ups, demonstrations.

Key Criteria

  • Compliance: Is the idea submission complete, and does it comply with the rules of the challenge?
  • Novelty: Does the idea describe a novel approach to providing a solution?
  • Originality: How original is the proposed technology or use of existing technology?
  • Relevance: How well does the idea relate to the topic and provide a solution aligned with the goals of this challenge?
  • Feasibility: How likely is it that the idea can be prototyped?

Evaluation

Submissions will be evaluated by a committee of Imaging and Applied Optics Congress leadership and Lockheed Martin executives.
