
TLDR: Working with Rendered.ai, a team from Planet demonstrated that synthetic satellite imagery can be used to simulate real imagery collection before a satellite is even launched. The Planet team used the Rendered.ai platform and a synthetic data channel built on RIT’s DIRSIG simulator to create synthetic hyperspectral imagery emulating the upcoming Tanager constellation.


In the satellite imaging market, the opportunities, costs, and risks can literally be out of this world. Traditionally, satellite and sensor manufacturers would design a collection platform, test optics and electronics in the lab, and then wait through the long and expensive process of manufacturing, launching, and deploying a satellite into orbit. Only once the satellite was deployed and beginning to collect and transmit data could the company start working through exploitation workflows and processing pipelines to demonstrate the value of the collected data.

Planet Labs PBC (Planet), driven by the mission to image all of Earth’s landmass every day, has carved out an important place in the hierarchy of Earth observation data providers, with both commercial and government applications.

Rendered.ai has worked with Planet to offer a solution that supports testing of software pipelines for Planet’s new class of hyperspectral satellites by developing simulated data prior to launch. Rendered.ai helped Planet efficiently use the best-in-class imagery simulation tool DIRSIG™, created by the Rochester Institute of Technology (RIT) Digital Imaging and Remote Sensing Laboratory (DIRS).

Rendered.ai offers a solution that could enable DIRSIG to be used more broadly by Planet, accelerating the creation of simulated imagery at higher volumes and in an easier-to-use, web-based experience. Rendered.ai provides a platform as a service, hosted on AWS cloud infrastructure, that can take just about any simulation engine, containerize it for deployment to the cloud, and use it to generate virtually unlimited amounts of simulated sensor imagery and video. The Rendered.ai platform uses containerized simulations in a standard framework that enables simulated, or synthetic, imagery generation accompanied by 100% accurate label information for computer vision training.

The Planet team, led by Planet Mission Director Mark Keremedjiev, has been working with Rendered.ai to generate radiometrically accurate data with DIRSIG based on Planet’s soon-to-be-launched Tanager hyperspectral satellite constellation. Essentially, Rendered.ai and DIRSIG together enable Planet to simulate a rapidly moving satellite capturing many different spectra of light reflected from the surface of the Earth, months before the actual vehicle is launched into space.

“The synthetic data we generated with Rendered.ai and DIRSIG have been incredibly valuable to a wide range of pre-launch activities on our Tanager program,” said Mark Keremedjiev. “Through this effort we have been able to enhance many areas of the program and have helped demonstrate the value we expect to deliver to our hyperspectral customer base ahead of launch.”

The combined offering of DIRSIG working inside the standard Rendered.ai framework allows Planet’s experts to configure simulations with characterizations of the expected motion of Tanager with its VIS-SWIR imaging spectrometer. Datacubes, large multi-channel imagery files that capture many slices of light intensity at small increments across a part of the electromagnetic spectrum, are generated in the cloud using DIRSIG containerized within Rendered.ai. “Captured” values are simulated based on a realistic model of the Planet platform. The DIRSIG simulation is driven by configurable parameters representing the temporal and spatial capture spectra as well as the satellite platform motion.


A typical graph showing how synthetic data runs can be configured in Rendered.ai. Graphs are used as a visual representation of the configuration options in a synthetic data channel, or application.
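For readers unfamiliar with graph-based configuration, the sketch below shows one way such a graph could be expressed in code: nodes carry parameter values, and links wire scene, platform, and sensor descriptions into a render step. All node classes, parameter names, and values here are hypothetical illustrations, not the actual Planet/DIRSIG channel schema.

```python
# Hypothetical node-and-link description of a synthetic data graph.
# Every class name and parameter below is an illustrative assumption.
graph = {
    "nodes": [
        {"name": "Scene",    "nodeClass": "DIRSIGScene",
         "values": {"location": "agricultural_site_01"}},
        {"name": "Platform", "nodeClass": "SatellitePlatform",
         "values": {"altitude_km": 500, "ground_speed_km_s": 7.5}},
        {"name": "Sensor",   "nodeClass": "VisSwirSpectrometer",
         "values": {"num_bands": 420, "range_nm": [400, 2500]}},
        {"name": "Render",   "nodeClass": "DIRSIGRender", "values": {}},
    ],
    "links": [
        {"from": "Scene",    "to": "Render", "port": "scene"},
        {"from": "Platform", "to": "Render", "port": "platform"},
        {"from": "Sensor",   "to": "Render", "port": "sensor"},
    ],
}
```

In the platform itself these graphs are edited visually; a code representation like this is simply the form a programmatic client would manipulate.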


Rendered.ai enables Planet to test an imagery product generation pipeline in which synthetic data jobs are accessed via SDK calls. The generated datacubes represent raw sensor output, also referred to as Level Zero or L0 data, as if images are being directly collected from a space-borne platform. Each parametrically variable synthetic data run may be used to simulate tightly controlled scenarios such as having the same sensor pass over the same field at a similar time of day over several different days. The simulated packages of data are then used to exercise the Planet imagery processing pipeline in which data are processed and repackaged depending on the needs of an end customer.
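As a rough illustration of that workflow, the Python sketch below re-runs the same scenario across several simulated days and hands the resulting raw datacubes to a pipeline under test. Every function in it is a hypothetical stand-in, not the documented Rendered.ai SDK.

```python
# Minimal sketch of an SDK-driven pipeline test. submit_run() and
# download_l0() are hypothetical stand-ins for real SDK calls.

def submit_run(graph_overrides):
    """Stand-in: submit a synthetic data job with parameter overrides."""
    return {"job_id": abs(hash(frozenset(graph_overrides.items())))}

def download_l0(job, dest="l0_datacubes/"):
    """Stand-in: wait for the job and return local datacube paths."""
    return [f"{dest}cube_{job['job_id']}.h5"]

# Simulate the same sensor passing over the same field at a similar
# local time of day on several different days, then exercise the
# downstream processing pipeline on the raw (Level Zero) output.
for day in range(1, 6):
    job = submit_run({"capture_day": day, "local_time": "10:30",
                      "target": "field_017"})
    for cube_path in download_l0(job):
        print("feeding to processing pipeline:", cube_path)
```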

The output demonstrates high physical accuracy. Embedded in a Rendered.ai synthetic data channel, the DIRSIG simulator can generate raw photon counts or perform the temporal integration calculation used to validate that the DIRSIG simulation accurately emulates physically captured sensor information.
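To make “temporal integration” concrete, the sketch below applies the textbook radiometry relation for converting at-aperture spectral radiance into photoelectron counts over an integration time. It is offered only to illustrate the kind of calculation being validated; it is not DIRSIG’s internal model, and all instrument values are assumptions.

```python
H = 6.626e-34  # Planck constant, J*s
C = 2.998e8    # speed of light, m/s

def expected_electrons(radiance, wavelength_um, band_width_um,
                       aperture_area_m2, pixel_solid_angle_sr,
                       integration_time_s, quantum_efficiency):
    """Textbook photoelectron estimate for one spectral band.

    radiance is at-aperture spectral radiance in W / (m^2 sr um).
    Illustrative only: not DIRSIG's implementation.
    """
    photon_energy = H * C / (wavelength_um * 1e-6)         # J per photon
    power = (radiance * aperture_area_m2 *
             pixel_solid_angle_sr * band_width_um)         # W on the pixel
    photons = power * integration_time_s / photon_energy   # photons per dwell
    return photons * quantum_efficiency                    # photoelectrons

# A 10 nm band at 1.6 um with order-of-magnitude instrument values.
print(expected_electrons(radiance=5.0, wavelength_um=1.6, band_width_um=0.01,
                         aperture_area_m2=0.01, pixel_solid_angle_sr=3.6e-9,
                         integration_time_s=5e-3, quantum_efficiency=0.8))
```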

Validation information for each pixel is available in the generated datasets, along with material classifications. Thousands of spectral curves have been gathered and validated to represent radiometrically accurate materials in each scene.

The simulation capability even includes accurate methane plume data. Configurable scene and sensor parameters in Rendered.ai, for example, allow an atmospheric scientist to analyze simulated features in the imagery, such as methane plumes, as if the data captured the actual phenomenology of gas emissions in a normal atmosphere.
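As a loose illustration, a plume might be described to a simulation by a handful of parameters like those below; the keys and values are hypothetical assumptions for exposition, not the actual channel schema.

```python
# Illustrative (hypothetical) plume parameters for one simulation run.
methane_plume = {
    "origin_lonlat": [-103.52, 31.87],  # point of origin in the scene
    "emission_rate_kg_per_hr": 500.0,   # source strength
    "wind_speed_m_s": 3.0,              # drives downwind plume geometry
    "wind_direction_deg": 225.0,
    "plume_model": "gaussian",          # simple dispersion-shape choice
}
```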

Examples of hyperspectral imagery generated in Rendered.ai using DIRSIG.

A wide variety of physically accurate synthetic data can be generated with DIRSIG in Rendered.ai, enabling pixel-level control over environmental effects, material properties, cloud cover, atmosphere properties, and much more.

Because channels run in a hosted platform with an open pattern for deploying changes to the cloud, updates to the data generation channel, such as new features in DIRSIG, are immediately available to the Planet data processing team. Another benefit of the platform is the published Rendered.ai SDK, which lets users manipulate data generation graphs to control a wide range of variation in simulations.

The Planet channel in Rendered.ai can be used for multiple applications, such as the following (a configuration sketch follows the list):

  • Setting different target locations over a large outdoor scene to capture scans of diverse terrain types from different sensor angles and collection times
  • Creating simulations of varied crop types in agricultural fields using procedural methods that incorporate accurate radiometric materials
  • Modifying atmospheric conditions based on latitude, season of the year, and types of aerosols
  • Adding methane plumes to a specific location with unique plume geometry and point of origin
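
As referenced above, the sketch below suggests how those kinds of variation could be swept programmatically; all parameter names are illustrative assumptions rather than the real channel’s graph schema.

```python
import itertools
import random

# Hypothetical sweep over the kinds of variation listed above.
terrains    = ["cropland", "desert", "forest", "coastal"]
seasons     = ["winter", "spring", "summer", "fall"]
view_angles = [-20, 0, 20]  # off-nadir angle, degrees

for terrain, season, angle in itertools.product(terrains, seasons,
                                                view_angles):
    run_config = {
        "target_terrain": terrain,
        "season": season,
        "off_nadir_deg": angle,
        "aerosol_model": random.choice(["rural", "urban", "maritime"]),
        "add_methane_plume": random.random() < 0.25,  # plume in ~25% of runs
    }
    # submit_run(run_config)  # stand-in for an SDK job-submission call
    print(run_config)
```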

Parameters that create scene and atmospheric variation are represented as visual graphs in Rendered.ai and can be configured by Planet staff to create diverse datasets for exploring and testing potential future captures from the Tanager satellites.

The Planet team’s use of Rendered.ai is a great example of how a cloud platform simplifies access to complex technologies, making them accessible to more users within an organization and increasing efficiency with tools that may not typically be part of an organization’s workflow. Because DIRSIG and other simulation tools are available in Rendered.ai synthetic data channels built on a common framework, they can be customized and deployed rapidly for a wide range of applications that require physically accurate data, in an industry increasingly dependent on access to data for better-performing algorithms.

Acknowledgements

We thank Mark Keremedjiev and the Planet team for reviewing and allowing us to share this story. We are also happy to continue working with Scott Brown and the DIRS Lab team at RIT, who provided essential support for Planet’s use cases.
