October 28-30, 2019
Embassy Suites Saratoga Springs,
86 Congress Street, Saratoga Springs, NY 12866

Asteroid Bennu. Image composed of 12 PolyCam images collected by the OSIRIS-REx spacecraft. Credit: NASA/Goddard/University of Arizona

Poorly resolved sequence of images of IDEFIX/Ariane Rocket Body 42P. Images from GT-SORT, provided courtesy of Dr. Marcus Holzinger.

Dione. Original image provided as a .IMG file from: Porco, C.P., CASSINI ORBITER SATURN ISSNA/ISSWA 2 EDR VERSION 1.0, CO-S-ISSNA/ISSWA-2-EDR-V1.0, 2005.
Registration is CLOSED.
Although images of space objects have been used for decades, many common challenges remain in the use of image data for planetary science, optical navigation, and space situational awareness. Many of these challenges may be addressed by recent advances in imaging sensors, computer vision algorithms, and computing resources, all of which set the stage for a fundamental change in how image data will be exploited on future space missions. This workshop aims to facilitate the exchange of ideas among all of the communities using space image data and to identify areas of common need. The unique workshop format includes oral presentations, poster sessions, and focused breakout sessions to discuss topics of interest.
Keynote Speakers

OpenSpace: a Tool for Immersive Interactive Data Context Visualization
Monday, October 28, 2019, 11:20-12:20
OpenSpace is a NASA-supported, open-source software collaboration between the American Museum of Natural History (AMNH) and three leading universities in data visualization research: Sweden's Linköping University (LiU), the University of Utah's Scientific Computing and Imaging (SCI) Institute, and New York University's Tandon School of Engineering. It was designed as an interactive visualization tool that contextualizes data and simulations across the known spatial and temporal scales of the universe, presented across a range of display environments. OpenSpace originated from a foundational concept in the millennium rebuilding of New York City's Hayden Planetarium: to take audiences beyond the night sky into 3D charted space by visualizing the measured layout of the universe. It has grown out of two decades of collaboration with LiU that produced the commercial Uniview software (by SCISS AB) and precursor tools such as NCSA's Partiview. Influenced by the vast collections at AMNH, its exhibits that contextualize nature across time, and the immersive display tradition of the planetarium, OpenSpace aspires to transport audiences into vast fields of data: augmenting what can be seen, traveling to places visited only robotically, and reaching scales and timeframes impossible to experience in any way other than through rigorous data visualization. Multiple modes of visualization and a modular structure aim to make OpenSpace a flexible presentation and data-exploration tool for a range of users.
Carter Emmart, Ph.D., is the Director of Astrovisualization at the American Museum of Natural History, where he directs the award-winning, data-driven space show productions at the Hayden Planetarium. Since 1998 he has overseen the development of the AMNH Digital Universe 3D Atlas. In 2002, he began a collaboration with Sweden's Linköping University, hosting a series of master's thesis projects that led to the NASA-supported OpenSpace software. Carter's career began as a space artist; he has an academic background in astronomy and geophysics and comes from a family tradition in the arts.

Computer Vision for Planetary Descent and Landing
Tuesday, October 29, 2019, 11:20-12:20
The Mars Exploration Rovers used the Descent Image Motion Estimation System (DIMES) to estimate velocity during landing. Although the approach was quite controversial at the time, it was ultimately successful for both the Spirit and Opportunity landings. This opened the door for future uses of computer vision during landing, and at that time the Mars program was particularly interested in position estimation to enable pinpoint landing. In 2004, some initial funding was provided to study its feasibility. Now, 15 years later, after numerous starts and stops, changes in funding, distractions, and breakthroughs, we have developed the Lander Vision System (LVS) for Mars 2020. This system estimates map-relative position during landing so that known hazards in the landing ellipse can be avoided. This talk will start with lessons learned from DIMES and then trace the technology development arc that led to the flight implementation of LVS.
Dr. Andrew E. Johnson graduated with Highest Distinction from the University of Kansas in 1991 with a BS in Engineering Physics and a BS in Mathematics. In 1997, he received his Ph.D. in computer vision from the Robotics Institute at Carnegie Mellon University. He is a Principal Robotics System Engineer at NASA's Jet Propulsion Laboratory, where he develops guidance, navigation, and control technologies for autonomous navigation and mapping during descent to planets, moons, comets, and asteroids. He is currently the Guidance, Navigation, and Control Subsystem Manager for Mars 2020, which includes a new vision-based terrain relative navigation system for landing and an enhanced rover navigation system for rapid autonomous driving over challenging terrain.
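For readers new to the topic of this keynote, the core idea of map-relative localization can be illustrated with a toy sketch: correlate a descent image against an onboard reference map and take the correlation peak as the position estimate. The Python snippet below is a minimal illustration only, not the actual LVS algorithm; the map, descent image, and all parameters are synthetic.

```python
# Minimal sketch of map-relative localization by normalized cross-correlation.
# NOT the Mars 2020 LVS implementation; map and descent image are synthetic.
import numpy as np
import cv2

rng = np.random.default_rng(0)
site_map = rng.random((512, 512)).astype(np.float32)  # stand-in orthophoto map

# Simulate a descent image as a noisy crop of the map at a known location.
true_row, true_col = 200, 310
descent = site_map[true_row:true_row + 64, true_col:true_col + 64].copy()
descent += 0.05 * rng.standard_normal(descent.shape).astype(np.float32)

# Slide the descent image over the map; the correlation peak gives the
# estimated map-relative position of the camera footprint.
scores = cv2.matchTemplate(site_map, descent, cv2.TM_CCOEFF_NORMED)
_, peak, _, peak_xy = cv2.minMaxLoc(scores)
est_col, est_row = peak_xy  # minMaxLoc returns (x, y) order

print(f"true (row, col) = ({true_row}, {true_col}); "
      f"estimated = ({est_row}, {est_col}); score = {peak:.3f}")
```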
Schedule
DOWNLOAD FULL PROGRAM (1.2MB PDF)
Monday: October 28, 2019
7:30-8:30 | Registration
8:30-9:00 | Welcome
9:00-9:20 | DTM Production from LROC NAC Images
9:20-9:40 | Remote Sensing Method to Model Terrain Shape by Detecting Reliable Ground Points
9:40-10:00 | Planetary Surface Image Generation for Testing Future Space Missions with PANGU
10:00-10:20 | Break
10:20-10:40 | Rendering the Titan Environment for Dragonfly
10:40-11:00 | A Novel Surface Feature Navigation Algorithm Using Ray Tracing
11:00-11:20 | Break
11:20-12:20 | Day 1 Keynote: Carter Emmart (American Museum of Natural History)
12:20-2:00 | Lunch
2:00-2:20 | NASA SPLICE Project: Developing the Next Generation Hazard Detection System
2:20-2:40 | Monocular Visual-Inertial Odometry with Dynamic LiDAR Scaling for Safe and Precise Landing on Unmapped Planetary Bodies
2:40-3:00 | A Vision-Based Navigation Capability for Precise Lunar Landing
3:00-3:20 | GENEVIS: Generic Vision Based Navigation for Descent & Landing
3:20-3:40 | Break
3:40-4:00 | A Comparison of Feature Extraction Methods for Terrain Relative Navigation
4:00-4:20 | Efficient Vision-Based Terrain Relative Navigation Without Dependence on Feature Identification
4:20-6:00 | Break
6:00-9:00 | Dinner, Posters, Student Paper Competition
Evening Poster Session
Regular Posters:
- Analytic Center of Illumination for Groups of Geometric Primitives
- Deep Learning-Based Crater Detection for Lunar Terrain Relative Navigation
- The Optical Navigation Toolkit
- Triangles vs. Quadrilaterals: Selecting the Right 3D Model Format for Space Science and Exploration
- Visualizing Space Data in Virtual Reality
Student Competition Posters:
- Addressing Feature Shadowing with Application to Vision-based Terrain Relative Navigation
- Limb-based Shape Modeling: A Demonstration on Itokawa
- Center and Apparent Diameter Optical Navigation on Mars Orbit
- Small Body Pose and Shape Estimation from Silhouettes via Extended Target Tracking
Tuesday: October 29, 2019
7:30-8:30 | Registration
8:30-8:40 | Day 2 Opening Remarks
8:40-9:00 | Orion Optical Navigation Performance and Testing
9:00-9:20 | Performance Bounds for Star Centroid Localization in Diffraction-Limited Digital Images
9:20-9:40 | Optical Navigation for Autonomous Approach of Small Unknown Bodies
9:40-10:00 | Break
10:00-10:20 | Optical Navigation for New Horizons' Flyby of Kuiper Belt Object (486958) 2014 MU69
10:20-10:40 | The Benefits of Subsampling Optical Navigation Images as Applied to the New Horizons Flyby of (486958) 2014 MU69
10:40-11:00 | Optical Navigation Preparations for a Possible Binary System During the New Horizons Extended Mission
11:00-11:20 | Break
11:20-12:20 | Day 2 Keynote: Andrew Johnson (JPL)
12:20-2:00 | Lunch
2:00-2:20 | Template Matching Used for Small Body Optical Navigation with Poorly Detailed Objects
2:20-2:40 | Transition from Star-based to Landmark Optical Navigation During OSIRIS-REx Navigation Campaign at Asteroid Bennu
2:40-3:00 | Independent Optical Navigation Processing for the OSIRIS-REx Mission using the Goddard Image Analysis and Navigation Tool
3:00-3:20 | The Image Constraint Measurement Type for Orbit Determination and Geophysical Parameter Estimation
3:20-3:40 | Break
3:40-4:00 | Reconstruction of Active Bennu Particle Events from Sparse Optical Data
4:00-4:20 | Monitoring the Operational Environment of Active Asteroid (101955) Bennu
4:20-4:40 | Towards Robust Learning-based Pose Estimation of Noncooperative Spacecraft
4:40-5:00 | Break
5:00-6:30 | Seminar: NASA Navigation Filter Best Practices
Wednesday: October 30, 2019
8:30-Noon | Short Course: Space Image Processing and Computer Vision with MATLAB
Technical Papers, Presentations, and Posters:
Extended abstracts are sought in all aspects of image analysis for space applications and may be submitted to one of three categories: presentation (manuscript optional), poster (manuscript optional), or student competition (manuscript required). Example topics of particular interest include the following (a short illustrative sketch of one topic, OPNAV, appears after the list):
- Image processing techniques as applied to space imagery
- Estimation of a body’s photometric properties from images
- Stereophotogrammetry (SPG)
- Stereophotoclinometry (SPC)
- Lightcurve inversion
- Optical navigation (OPNAV)
- 3D modeling of space objects and planetary terrain
- Image-based terrain relative navigation (TRN)
- Image-based simultaneous localization and mapping (SLAM)
- Image-based virtual reality (VR) and augmented reality (AR) for space exploration
- Visualization and rendering pipelines for space exploration
- Imaging with non-traditional electro-optical (EO) sensors
- Space imaging hardware
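To give one of the topics above a concrete flavor, here is a minimal Python sketch of a basic OPNAV building block: converting a measured image centroid (in pixels) into a camera-frame line-of-sight unit vector under an ideal pinhole model. The camera parameters and centroid below are hypothetical values; real OPNAV pipelines also model lens distortion, attitude, and light-time effects.

```python
# Minimal OPNAV sketch: pixel centroid -> unit line-of-sight vector.
# All camera parameters and the centroid are made-up illustration values.
import numpy as np

def pixel_to_los(u, v, fx, fy, cx, cy):
    """Map pixel coordinates (u, v) to a camera-frame unit line-of-sight vector."""
    los = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return los / np.linalg.norm(los)

fx = fy = 4000.0   # focal length in pixels (assumed)
cx = cy = 1023.5   # principal point for a hypothetical 2048 x 2048 detector

print(pixel_to_los(1500.2, 980.7, fx, fy, cx, cy))
```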
Extended abstracts are expected to be approximately 750-1,500 words and should follow the workshop template: MS Word Template (39KB docx) or LaTeX Template (31KB, on GitHub). Extended abstracts may not exceed two pages (plus, at the author's discretion, an optional third page with supplemental images). Details on page and text formatting are provided within the template.
All extended abstracts must be submitted via the webform at the bottom of this page. If accepted, revised extended abstracts or full manuscripts are due by October 21, 2019. Full manuscripts are optional for all categories except the student paper competition. Extended abstracts (or, if submitted, full manuscripts) will be posted on the workshop website. If no revision is provided, the accepted extended abstract will be posted online. Workshop no-shows will have their abstract (or manuscript) removed from the website agenda and archive.
Please remember to allocate enough time for your extended abstract, manuscript, presentation, and/or poster to receive whatever organizational approvals are necessary for presentation in an open/public forum. These approval processes can often take many weeks.
Breakout Session Proposals:
In addition to the technical presentations and posters, the workshop will feature a series of breakout sessions where attendees can meet in small groups to discuss topics of interest to the community. We are accepting proposals for breakout session topics. Breakout session proposals should be no more than one (1) page in length.
Student Paper Competition:
Papers with a student as the primary author are eligible for submission to the student paper competition. Student authors must attend the workshop and will present their work in a special poster session. Papers will be evaluated on technical merit and impact to the field, with the top papers receiving awards.
Abstract Submission Deadlines:
Extended Abstracts Due: April 26, 2019 [extended to May 10, 2019]
Author Notification: May 31, 2019
Revised Ext. Abstract (or Optional Manuscript) Due: October 21, 2019
Short Course: Space Image Processing and Computer Vision with MATLAB
Taught by instructors from MathWorks, this hands-on MATLAB short course will cover practical applications of image processing and computer vision in space systems. In this interactive session, you will use MATLAB to explore the following (a brief illustrative sketch of the feature-matching idea behind TRN appears after the list):
- Automating image processing workflows with new MATLAB apps
- Feature-based terrain relative navigation (TRN) with MATLAB
- Object detection and recognition for rendezvous and proximity operations (RPO)
- A brief introduction to deep learning for computer vision
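As a taste of the TRN item above, the sketch below (in Python with OpenCV rather than the course's MATLAB, for consistency with the other sketches on this page) shows the feature detection and matching step at the heart of feature-based TRN. The synthetic imagery and all parameters are made up for illustration.

```python
# Rough sketch of feature-based TRN's matching step: detect and match ORB
# keypoints between a reference map and a simulated descent image.
import numpy as np
import cv2

rng = np.random.default_rng(1)
ref_map = np.zeros((480, 640), np.uint8)
for _ in range(200):  # draw random rectangles so ORB has corners to find
    x, y = int(rng.integers(0, 600)), int(rng.integers(0, 440))
    w, h = int(rng.integers(5, 40)), int(rng.integers(5, 40))
    cv2.rectangle(ref_map, (x, y), (x + w, y + h), int(rng.integers(50, 255)), -1)

descent = ref_map[100:340, 160:480].copy()  # simulated camera view of the map

orb = cv2.ORB_create(nfeatures=500)
kp_map, des_map = orb.detectAndCompute(ref_map, None)
kp_img, des_img = orb.detectAndCompute(descent, None)

# Brute-force Hamming matching with cross-checking for one-to-one matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_img, des_map), key=lambda m: m.distance)
print(f"{len(matches)} candidate feature matches between image and map")
```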


Registration
Registration for the Workshop is now closed. If you have any questions about registration, please contact John Christian at chrisj9@rpi.edu.
Travel & Lodging

The workshop will be held at Embassy Suites by Hilton Saratoga Springs.

A block of rooms has been reserved at the Embassy Suites by Hilton Saratoga Springs at the government per diem rate ($117.00 USD/night). This special room rate will be available until September 28, 2019, or until the group block is sold out, whichever comes first. The direct link to make room reservations as part of this block is here. If you wish to make reservations over the phone, the group code is "RPI".
Map of Venue
Saratoga Springs, NY, is located 30 minutes north of Albany, NY, on Interstate 87, with easy access to Albany International Airport (ALB) and the Albany-Rensselaer Amtrak station.
For attendees flying into Albany and who do not wish to rent a car, there are a variety of shuttle and taxi services that will take you to Saratoga Springs. Additionally, both Uber and Lyft operate out of the Albany airport.
Sponsors


Planning Committee
Workshop General Chair:
- John Christian, Rensselaer Polytechnic Institute, chrisj9@rpi.edu
Workshop Planning Committee:
- Coralie Adam, KinetX
- Shyam Bhaskaran, NASA Jet Propulsion Laboratory
- Lylia Benhacine, C.S. Draper Laboratory
- Daniella DellaGiustina, The University of Arizona
- Christopher D’Souza, NASA Johnson Space Center
- Carolin Frueh, Purdue University
- Marcus Holzinger, University of Colorado Boulder
- Andrew Horchler, Astrobotic Technology, Inc.
- Andrew Liounis, NASA Goddard Space Flight Center
- Jay McMahon, University of Colorado Boulder
- Ossi Saarela, MathWorks