Computational Optical Sensing and Imaging

22–26 June 2020, OSA Virtual Event

From theoretical advances to experimental demonstrations, this meeting presents the latest computational imaging research, with computational sensing and imaging applications spanning fundamental science to medicine, security, and defense.


Topics

Topic Categories

  • Compressive sensing
  • Tomographic imaging
  • Light-field sensing
  • Digital holography (including digital holographic microscopy)
  • SAR and LiDAR
  • Infrared imaging and spectral imaging
  • Phase retrieval
  • Computational spectroscopy and microscopy
  • Inverse problems (including blind deconvolution)
  • Phase diversity
  • Point spread function engineering
  • Super-resolution techniques
  • Lensless imaging (including coherent diffraction imaging)
  • Ptychography and its variants
  • Deep learning for computational imaging
  • Applications of structured illumination for imaging
  • Applications of optical frequency combs for imaging
  • Imaging through turbid media (and in biomedical tissue)
  • Unconventional imaging modalities (such as ghost imaging, quantum imaging, mutual intensity imaging)
  • Indirect or non-line-of-sight imaging using scattering surfaces
  • Imaging using metasurfaces
  • Imaging at extreme scales
  • 3D imaging and computational displays
  • Computational imaging powered by computer vision
  • Mathematical foundations of computational imaging and sensing

Subtopic Categories

  • Physical pre-processing architectures for Optical Sensing and Imaging
  • Adaptive architectures for Optical Sensing and Imaging
  • Physics-aware ML for Optical Sensing and Imaging
  • Model-based inversion methods for Optical Sensing and Imaging
  • Computer vision for Optical Sensing and Imaging
  • Mathematical Foundations of Optical Sensing and Imaging
  • Other



Speakers

  • Matthew Kupinski, University of Arizona, United States
    Keynote Talk: Imaging Science in System and Algorithm Design
  • Eduardo Bendek, Jet Propulsion Laboratory, United States
    Applications of Diffractive Pupils for Aliased Wave Front Control and Astrometry for Exoplanet Detection
  • Charles Bouman, Purdue University, United States
    Coherent Plug-and-Play: A Framework for Integrating AI and Physical Models in Coherent Optical Imaging Problems
  • Antonio Miguel Caravaca Aguirre, University of Colorado at Boulder, France
    Multimodal Endo-microscopy Using Multimode Fibers
  • Roman Genov, University of Toronto, Canada
    Coded-exposure-pixel Image Sensors
  • Andrew Harvey, University of Glasgow, United Kingdom
    Engineering the Pupil: From Smaller Thermal Imagers to 3D Microscopy
  • Lingling Huang, Beijing Institute of Technology, China
    In-situ Hologram Generation Based on All Dielectric Bifocal Metalens
  • Malena Inés Español, Arizona State University, United States
    Multilevel Methods for Imaging Applications
  • Ori Katz, Hebrew University of Jerusalem, Israel
    Exploiting Speckle Dynamics for Super-resolution
  • Kedar Khare, Indian Institute of Technology Delhi, India
    Robust Laser Beam Engineering Using Complementary Diffraction
  • Malgorzata Kujawinska, Politechnika Warszawska, Poland
    True Volumetric Measurements of Cells and Tissues by Limited Angle Holographic Tomography
  • Allard Mosk, Universiteit Utrecht, Netherlands
    Optimized Light Fields for Imaging and Metrology in Scattering Media
  • Wolfgang Osten, Universität Stuttgart, Germany
    Some Unconventional Ways to Make Holograms, Interferograms and Microscopic Images
  • Stefan Sinzinger, Technische Universität Ilmenau, Germany
    Resonant Microoptics for Enhanced Computational Imaging and Sensing Solutions
  • J. Scott Tyo, University of New South Wales, Australia
    Controlling Polarimeter Bandwidth by Modulating in Multiple Domains



Committee

  • Amit Ashok, University of Arizona, United States (Chair)
  • Michael Gehm, Duke University, United States (Chair)
  • Tatiana Alieva, Universidad Complutense de Madrid, Spain (Program Chair)
  • Jun Ke, Beijing Institute of Technology, China (Program Chair)
  • Florian Willomitzer, Northwestern University, United States (Program Chair)
  • Seung-Whan Bahk, University of Rochester, United States
  • Katie Bouman, California Institute of Technology, United States
  • Marc Christensen, Southern Methodist University, United States
  • Oliver Cossairt, Northwestern University, United States
  • Vidya Ganapati, Swarthmore College, United States
  • Andrew Harvey, University of Glasgow, United Kingdom
  • Wolfgang Heidrich, King Abdullah Univ of Sci & Technology, Saudi Arabia
  • Roarke Horstmeyer, Duke University, United States
  • Edmund Lam, University of Hong Kong, Hong Kong
  • Rajesh Menon, University of Utah, United States
  • Figen Oktem, Middle East Technical University (METU), Turkey
  • YongKeun Park, Korea Advanced Inst of Science & Tech, South Korea
  • Prasanna Rangarajan, Southern Methodist University, United States
  • Monika Ritsch-Marte, Innsbruck Medical University, Austria
  • Giuliano Scarcelli, University of Maryland at College Park, United States
  • Paulo Silveira, Occipital Inc, United States
  • Indranil Sinharoy, Samsung Research America Dallas, United States
  • Yong Song, Beijing Institute of Technology, China
  • Esteban Vera, P. Universidad Catolica de Valparaiso, Chile
  • Gordon Wetzstein, Stanford University, United States



Plenary Session

Katie Bouman

California Institute of Technology, USA

Capturing the First Picture of a Black Hole and Beyond

This talk will present the methods and procedures used to produce the first image of a black hole from the Event Horizon Telescope, as well as discuss future developments. It had been theorized for decades that a black hole would leave a "shadow" on a background of hot gas. Taking a picture of this black hole shadow would help to address a number of important scientific questions, both on the nature of black holes and the validity of general relativity. Unfortunately, the shadow is so small that imaging it with traditional approaches would require an Earth-sized radio telescope. This talk discusses techniques the Event Horizon Telescope Collaboration has developed to photograph a black hole using the Event Horizon Telescope, a network of telescopes scattered across the globe. Imaging a black hole's structure with this computational telescope required us to reconstruct images from sparse measurements heavily corrupted by atmospheric error. This talk will summarize how the data from the 2017 observations were calibrated and imaged, and explain some of the challenges that arise with a heterogeneous telescope array like the EHT. The talk will also discuss future developments, including how machine learning methods are being developed to help design future telescope arrays.
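The core reconstruction problem described in the abstract, recovering an image from sparse, noisy samples of its Fourier transform, can be illustrated in miniature. The sketch below is a toy example and not the EHT pipeline: the ring "sky," the random sampling mask, and the ISTA solver are all hypothetical stand-ins for the real (far richer) instrument model and algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "sky": a bright ring on a dark background, a crude stand-in for a black-hole shadow.
n = 32
yy, xx = np.mgrid[:n, :n]
r = np.hypot(xx - n / 2, yy - n / 2)
truth = ((r > 6) & (r < 9)).astype(float)

# A radio interferometer samples the sky's Fourier transform only at sparse uv-points;
# model that here by keeping ~15% of frequencies, with additive measurement noise.
mask = rng.random((n, n)) < 0.15
mask[0, 0] = True  # always keep the zero frequency (total flux)
F = lambda img: np.fft.fft2(img, norm="ortho")
Finv = lambda spec: np.real(np.fft.ifft2(spec, norm="ortho"))
noise = 0.1 * (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
data = (F(truth) + noise) * mask

def ista(data, mask, lam=0.05, steps=200):
    """Minimize (1/2)||mask * F(x) - data||^2 + lam * ||x||_1 by iterative soft thresholding."""
    x = np.zeros(mask.shape)
    for _ in range(steps):
        grad = Finv((F(x) * mask - data) * mask)  # gradient of the data-fit term
        x = x - grad                              # unit step: the masked unitary operator has norm <= 1
        x = np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)  # sparsity-promoting shrinkage
    return x

recon = ista(data, mask)
err = np.linalg.norm(recon - truth) / np.linalg.norm(truth)
print(f"relative reconstruction error: {err:.2f}")
```

The sparsity prior is what lets the solver say something sensible about the 85% of frequencies that were never measured; EHT imaging methods use more sophisticated regularizers and must additionally handle station-dependent atmospheric phase errors.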

About the Speaker

Katie Bouman is an assistant professor in the Computing and Mathematical Sciences Department at the California Institute of Technology. Before joining Caltech, she was a postdoctoral fellow at the Harvard-Smithsonian Center for Astrophysics. She received her Ph.D. in EECS from MIT, where she worked in the Computer Science and Artificial Intelligence Laboratory (CSAIL), and her bachelor's degree in Electrical Engineering from the University of Michigan. The focus of her research is on using emerging computational methods to push the boundaries of interdisciplinary imaging.

David J. Brady

Duke University, USA

Defining the Digital Camera

Conventionally, "the camera" is well defined: it consists of a lens to form an image and a sensor to measure the image. In the modern camera, however, the image is formed computationally rather than by the lens. The camera consists of a variety of sensor resources, potentially including lens and sensor arrays with various forms of active illumination and 3D sensing. Camera designers must select these resources within size, weight, cost and power budgets to maximize the quality of computed media. While this approach creates design challenges, it also enables 100x increases in pixel count per unit volume, 100x decreases in operational power per pixel, and dramatic improvements in spatial, spectral, temporal and range resolution. This talk reviews design strategies for heterogeneous sensor array cameras and analyzes system performance for various recent designs.
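The budgeted resource-selection problem the abstract describes can be made concrete with a toy sketch. All component names, quality scores, and budget figures below are hypothetical (not from the talk), and real camera design uses far richer quality models than an additive score, but the combinatorial shape of the problem is the same: pick the subset of sensor resources that maximizes expected quality without exceeding any budget.

```python
from itertools import combinations

# Hypothetical sensor resources: (name, quality score, cost in $, power in W)
resources = [
    ("wide-angle lens", 5.0, 200, 1.0),
    ("telephoto array", 9.0, 650, 3.5),
    ("depth sensor", 4.0, 300, 2.0),
    ("NIR illuminator", 3.0, 150, 4.0),
    ("spectral filter wheel", 6.0, 400, 1.5),
]

COST_BUDGET = 1000   # dollars
POWER_BUDGET = 6.0   # watts

def best_subset(resources, cost_budget, power_budget):
    """Exhaustively search all subsets for the one maximizing total quality
    while respecting both the cost and power budgets."""
    best, best_q = (), 0.0
    for k in range(len(resources) + 1):
        for combo in combinations(resources, k):
            cost = sum(r[2] for r in combo)
            power = sum(r[3] for r in combo)
            quality = sum(r[1] for r in combo)
            if cost <= cost_budget and power <= power_budget and quality > best_q:
                best, best_q = combo, quality
    return [r[0] for r in best], best_q

picked, quality = best_subset(resources, COST_BUDGET, POWER_BUDGET)
print(picked, quality)
# Note the tradeoff: the high-quality telephoto array loses out to three
# cheaper resources whose combined score is higher within the same budgets.
```

Exhaustive search is fine for a handful of candidate resources; with many candidates this becomes a multi-dimensional knapsack problem, typically attacked with dynamic programming or integer programming.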

About the Speaker

David J. Brady is the Fitzpatrick Professor of Photonics at Duke University. In 2012, Professor Brady led the team that built the world's first terrestrial gigapixel camera. He subsequently founded Aqueti, Inc., which manufactures array cameras. Brady has also worked on numerous applications of compressive measurement and computational imaging; in 2013 he was awarded the SPIE Dennis Gabor Award for the development of compressive holography. His recent work focuses on the use of compressive measurement and artificial intelligence to improve data quality and quantity in parallel cameras, with the ultimate goal of handheld gigapixel cameras. Brady is a fellow of OSA, SPIE and IEEE.



Special Events

Women of Imaging and Sensing Meet and Greet

Grab your coffee, soda or beverage of your choice and join other women of Sensing & Imaging for an informal virtual get-together. Members of each committee will be on hand to answer any questions you may have, or simply log in to learn a bit about OSA's diversity and inclusion efforts and share your ideas on helping ensure our community and this meeting are as welcoming and inclusive as possible.

Volunteer Engagement I – OSA Technical Groups

Join OSA Board of Meetings Technical Group Development Chair Daniel Smalley to learn more about the governing structure and activities of OSA Technical Groups. The session will include a brief overview and time for Q&A.

Introductory Remarks and Plenary Session I (Sensing Congress)

OSA Career Lab: Developing Profitable Technology Products

Developing products that make money is the primary goal of most technology companies, but it is not an easy task to accomplish. Many factors affect whether a product is ultimately successful. This session provides an overview of the fundamentals of developing products that will make money for your company.

Volunteer Engagement II – OSA Meetings

Join members of the Sensing and Imaging committee to discuss the roles, responsibilities and time commitment needed to serve on a meeting committee. The session will include a brief overview and time for Q&A.

Technical Groups: Illumicon

You are invited to join the OSA Display Technology Technical Group for Illumicon, an exclusive members-only event. Building on the declarations established at past Illumicon gatherings, attendees will converge online to discuss and debate emerging trends, technologies and opportunities in advanced 3D displays. Entrance to the online event will be granted to those able to enter the secret password.

Volunteer Engagement III – OSA Publishing

Join Kara Peters, NC State University, USA, Applied Optics Topical Editor and Samuel Thurman, Lockheed Martin Coherent Technologies, USA, JOSA A Topical Editor to learn how to become a reviewer, what editors are looking for in a reviewer, and what makes a good review. The session will include a brief overview and time for Q&A.

Introductory Remarks and Plenary Session II (Imaging Congress)

Student and Early Career Professionals Happy Hour

Join fellow students and early career professionals for an informal virtual get-together. Grab your coffee, soda or beverage of choice for a chance to meet other students and early career professionals from across the world and swap stories of life in graduate school and beyond. Share the joys, trials, challenges, and camaraderie of the hard work.
