2nd RPI Space Imaging Workshop

October 28-30, 2019
Embassy Suites Saratoga Springs,
86 Congress Street, Saratoga Springs, NY 12866

Asteroid Bennu. Image composed of 12 PolyCam images collected by the OSIRIS-REx spacecraft. Credit: NASA/Goddard/University of Arizona

Poorly resolved sequence of images of IDEFIX/Ariane Rocket Body 42P. Images from GT-SORT and provided courtesy of Dr. Marcus Holzinger.

Dione. Original image provided as .IMG file from: Porco, C.P., CASSINI ORBITER SATURN ISSNA/ISSWA 2 EDR VERSION 1.0, CO-S-ISSNA/ISSWA-2-EDR-V1.0, 2005.

Registration is CLOSED.

DOWNLOAD FULL PROGRAM (1.2MB PDF)


Although images of space objects have been used for decades, many common challenges remain in the use of image data for planetary science, optical navigation, and space situational awareness. Many of these challenges may be addressed by recent advancements in imaging sensors, computer vision algorithms, and computing resources, all of which set the stage for a fundamental change in how we will exploit image data for future space missions. This workshop aims to facilitate the exchange of ideas among all of the communities using space image data and to identify areas of common need. The unique workshop format will include oral presentations, poster sessions, and focused breakout sessions to discuss topics of interest.

Keynote Speakers

Carter Emmart, Ph.D.
Director of Astrovisualization, American Museum of Natural History
Keynote Speaker

OpenSpace: a Tool for Immersive Interactive Data Context Visualization

Monday, October 28, 2019, 11:20-12:20

OpenSpace is a NASA-supported open-source software collaboration between the American Museum of Natural History (AMNH) and three leading universities in the field of data visualization research: Sweden’s Linköping University (LiU), the University of Utah’s Scientific Computing and Imaging (SCI) Institute, and New York University’s Tandon School of Engineering. It was designed as an interactive visualization tool to contextualize data and simulations across the known spatial and temporal scale range of the universe, presented across a range of display environments. OpenSpace originated from a foundational concept in the millennium rebuilding of New York City’s Hayden Planetarium: to take audiences beyond the night sky into 3D charted space by visualizing the measured layout of the universe. It has grown from two decades of collaboration with LiU that produced the commercial Uniview software (by SCISS AB) and precursor tools such as NCSA’s Partiview. Influenced by the vast collections at AMNH, its exhibits that contextualize nature across time, and the immersive display tradition of the planetarium, OpenSpace aspires to transport audiences into vast fields of data, augmenting what can be seen, traveling to places only visited robotically, and beyond to scales and timeframes impossible to experience in any other way than through rigorous data visualization. Multiple modes of visualization and a modular structure aspire to make OpenSpace a flexible presentation and data exploration tool for a range of users.

Carter Emmart, Ph.D., is the Director of Astrovisualization at the American Museum of Natural History. He directs the award-winning, data-visualization-based space show productions at the Hayden Planetarium. Since 1998 he has overseen the development of the AMNH Digital Universe 3D Atlas. In 2002, he began a collaboration with Sweden’s Linköping University, hosting a series of master’s thesis projects that led to the NASA-supported OpenSpace software. Carter’s career began as a space artist, with an academic background in astronomy and geophysics, and he comes from a family tradition in the arts.

Andrew Johnson, Ph.D.
Principal Robotics System Engineer, Jet Propulsion Laboratory
Keynote Speaker

Computer Vision for Planetary Descent and Landing

Tuesday, October 29, 2019, 11:20-12:20

The Mars Exploration Rovers used the Descent Image Motion Estimation System (DIMES) to estimate velocity during landing.  Although the approach was quite controversial at the time, it was ultimately successful for both the Spirit and Opportunity landings.  This opened the door for future uses of computer vision during landing, and at that time, the Mars program was particularly interested in position estimation to enable pinpoint landing. In 2004, some initial funding was provided to look into its feasibility. Now, 15 years later, after numerous starts and stops, changes in funding, distractions, and breakthroughs, we have developed the Lander Vision System (LVS) for Mars 2020.  This system estimates map-relative position during landing so that known hazards in the landing ellipse can be avoided.  This talk will start with lessons learned from DIMES and then trace the technology development arc that led to the flight implementation of LVS.

Dr. Andrew E. Johnson graduated with Highest Distinction from the University of Kansas in 1991 with a BS in Engineering Physics and a BS in Mathematics. In 1997, he received his Ph.D. in computer vision from the Robotics Institute at Carnegie Mellon University. Currently, he is a Principal Robotics System Engineer at NASA’s Jet Propulsion Laboratory, where he is developing guidance, navigation, and control technologies for autonomous navigation and mapping during descent to planets, moons, comets, and asteroids. He is currently the Guidance, Navigation, and Control Subsystem Manager for Mars 2020, which includes a new vision-based terrain relative navigation system for landing and an enhanced rover navigation system for rapid autonomous driving over challenging terrain.

Schedule


Monday: October 28, 2019

7:30-8:30

Registration

 

8:30-9:00

Welcome

 

9:00-9:20

DTM Production from LROC NAC Images
Madeleine R. Manheim (Arizona State Univ.), Megan R. Henriksen (Arizona State Univ.), Mark S. Robinson (Arizona State Univ.), and the LROC Team

PDF

9:20-9:40

Remote Sensing Method to Model Terrain Shape by Detecting Reliable Ground Points
Mohammed A. Yousefhussien, Walter V. Dixon, and James V. Miller (General Electric Global Research)

PDF

9:40-10:00

Planetary Surface Image Generation for Testing Future Space Missions with PANGU
Iain M. Martin (Univ. of Dundee), Martin N. Dunstan (Univ. of Dundee), Manuel Sanchez Gestido (ESA)

PDF

10:00-10:20

Break

 

10:20-10:40

Rendering the Titan Environment for Dragonfly
Carolyn A. Sawyer and Nishant L. Mehta (JHU APL)

 

10:40-11:00

A Novel Surface Feature Navigation Algorithm Using Ray Tracing
Chris Gnam (Univ. at Buffalo), Andrew Liounis (NASA GSFC), Benjamin Ashman (NASA GSFC), Kenneth Getzandanner (NASA GSFC), Joshua Lyzhoft (NASA GSFC), Jeffrey Small (Aerospace Corp.), Dolan Highsmith (Aerospace Corp.), Coralie Adam (KinetX), Jason Leonard (KinetX), Peter Antreasian (KinetX), Dante S. Lauretta (Univ. of Arizona)

PDF

11:00-11:20

Break

 

11:20-12:20

Day 1 Keynote: Carter Emmart (American Museum of Natural History)

 

12:20-2:00

Lunch

 

2:00-2:20

NASA SPLICE Project: Developing the Next Generation Hazard Detection System
Carolina I. Restrepo (NASA GSFC), Ronney Lovelace (NASA JSC), Ronald R. Sostaric (NASA JSC), John M. Carson (NASA JSC)

PDF

2:20-2:40

Monocular Visual-Inertial Odometry with Dynamic LiDAR Scaling for Safe and Precise Landing on Unmapped Planetary Bodies
Chris Owens, Kori Macdonald, Jeremy Hardy, and Andrew Horchler (Astrobotic)

PDF

2:40-3:00

A Vision-Based Navigation Capability for Precise Lunar Landing
Courtney E. Mario, Ted J. Steiner, Laura S. Henderson, Ian T. Fletcher, and Alison M. Siegmann (Draper)

PDF

3:00-3:20

GENEVIS: Generic Vision Based Navigation for Descent & Landing
Paul Duteis (Airbus Defence), Roland Brochard (Airbus Defence), Darius Djafari-Rouhani (Airbus Defence), and Manuel Sanchez Gestido (ESA)

PDF

3:20-3:40

Break

 

3:40-4:00

A Comparison of Feature Extraction Methods for Terrain Relative Navigation
Manoranjan Majji (TAMU), Andrew Simon (TAMU), Carolina I. Restrepo (NASA GSFC), and Ronney Lovelace (NASA JSC)

 

4:00-4:20

Efficient Vision-Based Terrain Relative Navigation Without Dependence on Feature Identification
James S. McCabe (NASA JSC)

PDF

4:20-6:00

Break

 

6:00-9:00

Dinner, Posters, Student Paper Competition

 

 

Evening Poster Session

Regular

Analytic Center of Illumination for Groups of Geometric Primitives
Kevin Kobylka and John Christian (RPI)

PDF

Regular

Deep Learning-Based Crater Detection for Lunar Terrain Relative Navigation
Lena M. Downes (MIT), Ted J. Steiner (Draper), and Jonathan P. How (MIT)

PDF

Regular

The Optical Navigation Toolkit
Devin Renshaw, William Parker, Paul McKee, Courtney Hollenberg, Steven Beseler, Chris Grome, and John Christian (RPI)

PDF

Regular

Triangles vs. Quadrilaterals: Selecting the Right 3D Model Format for Space Science and Exploration
Ryan Thibeault, Lillian Hong, and John Christian (RPI)

PDF

Regular

Visualizing Space Data in Virtual Reality
Lillian Hong, Ryan Thibeault, and John Christian (RPI)

PDF

Student Competition

Addressing Feature Shadowing with Application to Vision-based Terrain Relative Navigation
Jonathan Manni (CU-Boulder), Nisar Ahmed (CU-Boulder), Jay McMahon (CU-Boulder), and Courtney Mario (Draper)

PDF

Student Competition

Limb-based Shape Modeling: A Demonstration on Itokawa
Dahlia A. Baker and Jay W. McMahon (CU-Boulder)

PDF

Student Competition

Center and Apparent Diameter Optical Navigation on Mars Orbit
Thibaud Teil (CU-Boulder), Hanspeter Schaub (CU-Boulder), and Daniel Kubitschek (LASP)

PDF

Student Competition

Small Body Pose and Shape Estimation from Silhouettes via Extended Target Tracking
Enrico M. Zucchelli, Brandon A. Jones, and Ryan P. Russell (UT-Austin)

PDF

 

Tuesday: October 29, 2019

7:30-8:30

Registration

 

8:30-8:40

Day 2 Opening Remarks

 

8:40-9:00

Orion Optical Navigation Performance and Testing
Christopher D’Souza, Rebecca Inman, and Kyle Smith (NASA JSC)

 

9:00-9:20

Performance Bounds for Star Centroid Localization in Diffraction-Limited Digital Images
John Christian and Jacob Kowalski (RPI)

PDF

9:20-9:40

Optical Navigation for Autonomous Approach of Small Unknown Bodies
Jacopo Villa (KTH-Royal Inst. of Technology), Saptarshi Bandyopadhyay (JPL), Benjamin Morrell (JPL), Benjamin Hockman (JPL), Shyamkumar Bhaskaran (JPL), Issa Nesnas (JPL)

PDF

9:40-10:00

Break

 

10:00-10:20

Optical Navigation for New Horizons' Flyby of Kuiper Belt Object (486958) 2014 MU69
Derek S. Nelson (KinetX), Erik J. Lessac-Chenen (KinetX), John Y. Pelgrift (KinetX), Coralie D. Adam (KinetX), Fred J. Pelletier (KinetX), Jeremy Bauman (KinetX), Dale Stanbridge (KinetX), John R. Spencer (SwRI), Simon B. Porter (SwRI), Marc W. Buie (SwRI), Mark E. Holdridge (JHU APL), Harold A. Weaver (JHU APL), Catherine B. Olkin (SwRI), and S. Alan Stern (SwRI)

 

10:20-10:40

The Benefits of Subsampling Optical Navigation Images as Applied to the New Horizons Flyby of (486958) 2014 MU69
Declan M. Mages, William M. Owen Jr., Joseph E. Riedel, Shyam Bhaskaran (JPL)

PDF

10:40-11:00

Optical Navigation Preparations for a Possible Binary System During the New Horizons Extended Mission
John Y. Pelgrift (KinetX), Erik J. Lessac-Chenen (KinetX), Derek S. Nelson (KinetX), Coralie D. Adam (KinetX), Jeremy Bauman (KinetX), Joel Fischetti (KinetX), Fred J. Pelletier (KinetX), Bobby Williams (KinetX), Mark E. Holdridge (JHU APL), Harold A. Weaver (JHU APL), John R. Spencer (SwRI), Simon B. Porter (SwRI), Marc W. Buie (SwRI), Catherine B. Olkin (SwRI), and S. Alan Stern (SwRI)

PDF

11:00-11:20

Break

 

11:20-12:20

Day 2 Keynote: Andrew Johnson (JPL)

 

12:20-2:00

Lunch

 

2:00-2:20

Template Matching Used for Small Body Optical Navigation with Poorly Detailed Objects
Joshua R. Lyzhoft (NASA GSFC), Andrew J. Liounis (NASA GSFC), Coralie Adam (KinetX), Peter Antreasian (KinetX), Dante S. Lauretta (Univ. of Arizona), and the OSIRIS-REx Team

PDF

2:20-2:40

Transition from Star-based to Landmark Optical Navigation During OSIRIS-REx Navigation Campaign at Asteroid Bennu
Coralie D. Adam (KinetX), Leilah K. McCarthy (KinetX), Eric M. Sahr (KinetX), Derek S. Nelson (KinetX), John Y. Pelgrift (KinetX), Erik J. Lessac-Chenen (KinetX), Jason M. Leonard (KinetX), Peter G. Antreasian (KinetX), Eric E. Palmer (PSI), John R. Weirich (PSI), Robert W. Gaskell (PSI), Olivier S. Barnouin (JHU APL), Michael C. Moreau (NASA GSFC), Dante S. Lauretta (Univ. of Arizona)

PDF

2:40-3:00

Independent Optical Navigation Processing for the OSIRIS-REx Mission using the Goddard Image Analysis and Navigation Tool
Andrew Liounis (NASA GSFC), Jason Swenson (NASA GSFC), Jeffrey Small (The Aerospace Corp.), Josh Lyzhoft (NASA GSFC), Benjamin Ashman (NASA GSFC), Kenneth Getzandanner (NASA GSFC), Dolan Highsmith (The Aerospace Corp.), Michael Moreau (NASA GSFC), Coralie Adam (KinetX), Peter Antreasian (KinetX), Dante S. Lauretta (Univ. of Arizona), and the OSIRIS-REx Team

PDF

3:00-3:20

The Image Constraint Measurement Type for Orbit Determination and Geophysical Parameter Estimation
Kenneth Getzandanner (NASA GSFC), Jason Leonard (KinetX), Andrew Liounis (NASA GSFC), Erwan Mazarico (NASA GSFC), Coralie Adam (KinetX), Peter Antreasian (KinetX), Dante S. Lauretta (Univ. of Arizona), and the OSIRIS-REx Team

 

3:20-3:40

Break

 

3:40-4:00

Reconstruction of Active Bennu Particle Events from Sparse Optical Data
John Y. Pelgrift (KinetX), Erik J. Lessac-Chenen (KinetX), Coralie D. Adam (KinetX), Derek S. Nelson (KinetX), Leilah K. McCarthy (KinetX), Eric M. Sahr (KinetX), Jason M. Leonard (KinetX), Dante S. Lauretta (Univ. of Arizona), and the OSIRIS-REx Team

PDF

4:00-4:20

Monitoring the Operational Environment of Active Asteroid (101955) Bennu
Carl W. Hergenrother, Frank Shelly, Chester Maleszewski, Rose P. Garcia, Kris Becker, Eric Christensen, Dante S. Lauretta (Univ. of Arizona)

PDF

4:20-4:40

Towards Robust Learning-based Pose Estimation of Noncooperative Spacecraft
Tae Ha Park, Sumant Sharma, and Simone D’Amico (Stanford)

PDF

4:40-5:00

Break

5:00-6:30

Seminar: NASA Navigation Filter Best Practices
Russell Carpenter (NASA GSFC) and Christopher D’Souza (NASA JSC)

VIEW

 

Wednesday: October 30, 2019

8:30-Noon

Short Course: Space Image Processing and Computer Vision with MATLAB
Taught by instructors from MathWorks

 

Technical Papers, Presentations, and Posters:

Extended abstracts are sought in all aspects of image analysis for space applications, and may be submitted to one of the following three categories: presentation (manuscript optional), poster (manuscript optional), student competition (manuscript required). Example topics of particular interest include:

  • Image processing techniques as applied to space imagery
  • Estimation of a body’s photometric properties from images
  • Stereophotogrammetry (SPG)
  • Stereophotoclinometry (SPC)
  • Lightcurve inversion
  • Optical navigation (OPNAV)
  • 3D modeling of space objects and planetary terrain
  • Image-based terrain relative navigation (TRN)
  • Image-based simultaneous localization and mapping (SLAM)
  • Image-based virtual reality (VR) and augmented reality (AR) for space exploration
  • Visualization and rendering pipelines for space exploration
  • Imaging with non-traditional electro-optical (EO) sensors
  • Space imaging hardware

Extended abstracts are expected to be approximately 750-1,500 words and should (approximately) follow the workshop template: MS Word Template (39KB docx) or LaTeX Template (31KB, on GitHub). These extended abstracts may not exceed two pages (plus, at the author’s discretion, an optional third page with supplemental images). Details on page and text formatting are provided within the template.

All extended abstracts must be submitted via the webform at the bottom of this page. If accepted, revised extended abstracts or full manuscripts are due by October 21, 2019. Full manuscripts are optional for all categories except the student paper competition. Extended abstracts (or, if submitted, full manuscripts) will be posted on the workshop website. If no revision is provided, the accepted extended abstract will be posted online. Workshop no-shows will have their abstract (or manuscript) removed from the website agenda and archive.

Please remember to allocate enough time for your extended abstract, manuscript, presentation, and/or poster to receive whatever organizational approvals are necessary for presentation in an open/public forum. These approval processes can often take many weeks.

Breakout Session Proposals:

In addition to the technical presentations and posters, the workshop will feature a series of breakout sessions where attendees can meet in small groups to discuss topics of interest to the community. We are accepting proposals for breakout session topics. Breakout session proposals should be no more than one (1) page in length.

Student Paper Competition:

Papers with students as the primary author are eligible for submission to the student paper competition. Student authors must attend the workshop and will present their work in a special poster session. Papers will be evaluated on technical merit and impact to the field, with the top papers receiving awards.

Abstract Submission Deadlines:

Extended Abstracts Due: April 26, 2019 [extended to May 10, 2019]
Author Notification: May 31, 2019
Revised Ext. Abstract (or Optional Manuscript) Due: October 21, 2019

 

Submit Your Abstract: 

Short Course: Space Image Processing and Computer Vision with MATLAB

Taught by instructors from MathWorks, this hands-on MATLAB short course will cover practical applications of image processing and computer vision in space systems.  In this interactive session, you will use MATLAB to explore the following:

  • Automate image processing workflows with new MATLAB apps
  • Feature-based Terrain Relative Navigation (TRN) with MATLAB
  • Object detection and recognition for Rendezvous and Proximity Operations (RPO)
  • Brief introduction to deep learning for computer vision

ISS image courtesy of NASA

Registration

Registration for the Workshop is now closed. If you have any questions about registration, please contact John Christian at chrisj9@rpi.edu

Travel & Lodging

Venue: 

The workshop will be held at Embassy Suites by Hilton Saratoga Springs.

Lodging: 

A block of rooms has been reserved at the Embassy Suites by Hilton Saratoga Springs at the government per diem rate ($117.00 USD/night). This special room rate will be available until September 28, 2019, or until the group block is sold out, whichever comes first. The direct link to make room reservations as part of this block is here. If you wish to make reservations over the phone, the group code is “RPI”.


Transportation: 

Saratoga Springs, NY, is located 30 minutes north of Albany, NY, on Interstate 87, with easy access to Albany International Airport (ALB) and the Albany-Rensselaer Amtrak station.

For attendees flying into Albany and who do not wish to rent a car, there are a variety of shuttle and taxi services that will take you to Saratoga Springs. Additionally, both Uber and Lyft operate out of the Albany airport.

Sponsors

Rensselaer Polytechnic Institute
http://www.rpi.edu
American Astronautical Society
http://astronautical.org/

Planning Committee

Workshop General Chair:

Workshop Planning Committee:

  • Coralie Adam, KinetX
  • Shyam Bhaskaran, NASA Jet Propulsion Laboratory
  • Lylia Benhacine, C.S. Draper Laboratory
  • Daniella DellaGiustina, The University of Arizona
  • Christopher D’Souza, NASA Johnson Space Center
  • Carolin Frueh, Purdue University
  • Marcus Holzinger, University of Colorado Boulder
  • Andrew Horchler, Astrobotic Technology, Inc.
  • Andrew Liounis, NASA Goddard Space Flight Center
  • Jay McMahon, University of Colorado Boulder
  • Ossi Saarela, MathWorks