
PCI Geomatics introduces Analysis Ready Data (ARD) Tools

March 19th, 2019 by Susan Smith

Kevin Jones, Executive Director of Marketing for PCI Geomatics, spoke with GISCafe Voice about the release of new software for Geomatica and GXL, the company’s flagship software for complete and integrated desktop and enterprise geoimage processing. Geomatica features tools for remote sensing, digital photogrammetry, geospatial analysis, mosaicking, and more that can be deployed through the Geomatica desktop, Python workflows, or through large-volume-production systems. The focus of the new release is enabling big data processing for large archives of satellite data, which need to be processed to a scientifically rigorous level known as Analysis Ready Data (ARD). The new ARD tools provide methods to create datasets that can then be used for Multi-Temporal Analytics (MTA) leveraging the Open Data Cube infrastructure.

How recently did you develop the technology to conduct multi-temporal analysis?

We have made MTA the focus of our 2018 release – the October 1, 2018 release was themed “ARD Tools for Multi-Temporal Analysis.” Analysis Ready Data (ARD) tools address critical requirements for performing authoritative and scientifically accurate single-image analysis and multi-temporal analytics.

Why is it important to align data perfectly?

Each satellite sensor our customers work with provides different positional accuracy. We’ve implemented rigorous methods that not only use the satellite ephemeris information, but also include new methods to co-register imagery to a known and accurate base layer using a process we call super registration. We have proven that we can achieve registration accuracies of 1/10th of a pixel, which is critical for multi-temporal analyses. The actual time it takes to process imagery depends on the size of the data and, of course, what compute infrastructure is deployed. We offer a range of solutions, from desktop and enterprise to cloud-based large-volume production systems, that our customers can deploy (the Geomatica and GXL platforms).
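The co-registration idea can be illustrated with a minimal sketch. The snippet below is not PCI’s super-registration algorithm, just a standard phase-correlation approach in NumPy for estimating the whole-pixel offset between two acquisitions; reaching the sub-pixel (1/10th-pixel) accuracy described above would additionally require correlation-peak interpolation or upsampling.

```python
import numpy as np

def phase_correlation_shift(ref, target):
    """Estimate the (dy, dx) translation of `target` relative to `ref`
    using phase correlation (peak of the normalized cross-power spectrum)."""
    F_ref = np.fft.fft2(ref)
    F_tgt = np.fft.fft2(target)
    cross = np.conj(F_ref) * F_tgt
    cross /= np.abs(cross) + 1e-12        # keep phase, discard magnitude
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = np.array(peak, dtype=float)
    # offsets past half the image size wrap around to negative shifts
    for axis, size in enumerate(corr.shape):
        if shift[axis] > size // 2:
            shift[axis] -= size
    return shift

# toy check: circularly shift a random image by (3, -5) and recover the offset
rng = np.random.default_rng(0)
image = rng.random((64, 64))
moved = np.roll(image, shift=(3, -5), axis=(0, 1))
dy, dx = phase_correlation_shift(image, moved)
```

Phase correlation is attractive for this kind of alignment because it is insensitive to global brightness differences between dates, which matters when the two acquisitions have not yet been radiometrically normalized.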

How do customers reconcile changes on the ground as opposed to changes in atmosphere? Or is that done through the Geomatica platform?

Customers want to know what has changed on the ground, therefore they need to remove atmospheric effects, in addition to seasonal solar illumination effects. These new ARD tools are targeted at normalizing the data across time, so that changes on the ground are what our customers can measure.
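As a toy illustration of why this normalization matters (and not the atmospheric model Geomatica actually uses), consider dark-object subtraction, one of the simplest haze corrections: if the darkest pixels in a scene should be near zero reflectance, any offset they carry is attributed to the atmosphere and removed, making acquisitions from different dates comparable.

```python
import numpy as np

def dark_object_subtraction(band, percentile=0.1):
    """Crude haze correction: treat the darkest pixels as zero-reflectance
    targets and subtract their value from the entire band."""
    haze = np.percentile(band, percentile)
    return np.clip(band - haze, 0.0, None)

# two acquisitions of the same (synthetic) scene; the second carries a
# uniform +0.05 offset added by atmospheric haze
rng = np.random.default_rng(1)
scene = rng.uniform(0.02, 0.6, size=(32, 32))
date1 = scene
date2 = scene + 0.05

# after correction the two dates agree, so on-the-ground change
# (here: none) can be measured directly
corrected1 = dark_object_subtraction(date1)
corrected2 = dark_object_subtraction(date2)
```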

What can customers accomplish with the archive of imagery?

Deep temporal stacks can help operational and research customers ask questions of the data that would previously have been very difficult to answer because of the need to pre-process imagery. With these steps addressed and the data integrated into the Open Data Cube (ODC), customers can produce products such as a flood frequency and spatial extent map, based on actual measurements spanning the real occurrence of these events over long time periods, using an algorithm known as Water Observations from Space (WOfS). Other examples include mapping permanent land change by detecting breaks in NDVI temporal patterns through another algorithm available through the ODC, Continuous Change Detection and Classification (CCDC).
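A WOfS-style summary product reduces to a simple idea once the stack is analysis-ready: classify each clear observation as wet or dry, then report the per-pixel fraction of wet observations. The sketch below uses an illustrative single threshold on a synthetic water index; the actual WOfS classifier is a trained decision tree, not a threshold.

```python
import numpy as np

# toy stack: 10 acquisitions of a 4x4 scene, values standing in for a
# water index such as NDWI (positive ~ water)
rng = np.random.default_rng(2)
stack = rng.normal(loc=-0.2, scale=0.15, size=(10, 4, 4))
stack[:, 1, 1] = 0.4                       # one pixel is wet on every date

clear = np.ones(stack.shape, dtype=bool)   # per-observation clear-sky mask
wet = (stack > 0.0) & clear

# per-pixel fraction of clear observations classified as water
frequency = wet.sum(axis=0) / clear.sum(axis=0)
```

The point of the pre-processing described above is that `frequency` is only meaningful if every slice of the stack is co-registered and radiometrically consistent; otherwise misalignment and haze masquerade as water.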

Who are most of your customers?

PCI Geomatics has customers around the world, mainly in academia, government and private industry. In terms of ARD, we see the potential market being any organization that can access large repositories of imagery, which in our industry typically falls under the purview of organizations that collect / archive satellite imagery (through ground receiving stations). To make the archives more valuable (for authoritative operational or research use), the process of making it “Analysis Ready” is critical.

When was the Open Data Cube project released?

We released our initial version of the ARD Tools in October 2018. We plan another release in mid-2019 that will further improve and automate ARD data generation, as well as populate the Open Data Cube automatically through the Geomatica and GXL platform. The Open Data Cube is a project initiated by Geoscience Australia, and was originally known as the Australian Geoscience Data Cube.

Is the Canadian government the primary customer for DataCube projects such as deforestation?

Canada is the ideal use case for ARD. We have many environmental challenges and one of the largest landmasses globally. The Canadian government is very much focused on better understanding the cumulative effects of climate change, so mapping deforestation, urbanization, and flood severity is of great interest. We are piloting this big data processing approach with Environment and Climate Change Canada (ECCC) and Natural Resources Canada (NRCan).

Can you give a sample workflow of the Analysis Ready Data Concept through pulling information from DataCube with GXL software to source data adopted by CEOS?

 Starting with raw data sitting in the archives, the steps would be as follows:
– Ingest large volumes of raw satellite data through GXL, collecting all required metadata to perform pre-processing steps
– Perform Top of Atmosphere (TOA) reflectance calibration using sensor gain and offset metadata to convert raw digital numbers to at-sensor reflectance
– Perform Bottom of Atmosphere (BOA) calibration using known / expected ground reflectance data and modeling the atmospheric effects to generate surface reflectance products
– Map solar illumination angles for each acquisition together with local topography to remove topographic effects, producing topographically normalized data products (which removes inter-seasonal illumination effects on ground reflectance)
– Perform super registration of data using known positional information and achieve 1/10th pixel registration or better
– Automatically publish pre-processed imagery to the Open Data Cube (via an automated workflow in the GXL)
– Connect algorithms (such as WOfS, CCDC) to the data via the ODC – done through a web-based CEOS tool or Jupyter notebooks
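The TOA calibration step above can be sketched as the standard linear DN-to-reflectance rescaling with a solar-elevation correction. The gain, offset, and sun elevation below are hypothetical placeholders; real values come from the sensor metadata collected during ingest.

```python
import numpy as np

# hypothetical per-band calibration coefficients; actual values are read
# from the scene metadata gathered at ingest time
REFLECTANCE_MULT = 2.0e-5     # gain applied to raw digital numbers
REFLECTANCE_ADD = -0.1        # offset
SUN_ELEVATION_DEG = 55.0      # solar elevation for this acquisition

def dn_to_toa_reflectance(dn, gain, offset, sun_elev_deg):
    """Convert raw digital numbers to top-of-atmosphere reflectance,
    corrected for the solar elevation angle (Landsat-style rescaling)."""
    rho = gain * dn.astype(np.float64) + offset
    return rho / np.sin(np.radians(sun_elev_deg))

dn = np.array([[5000, 10000], [20000, 40000]], dtype=np.uint16)
toa = dn_to_toa_reflectance(dn, REFLECTANCE_MULT, REFLECTANCE_ADD,
                            SUN_ELEVATION_DEG)
```

The solar-elevation division is what makes summer and winter acquisitions of the same target comparable; the remaining atmospheric and topographic effects are handled by the BOA and topographic-normalization steps.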

 Can you give measurements of the increased algorithm accuracy over time?

Validation studies are underway. In reality, many of these types of measurements can be achieved; however, the lack of a consistent method for producing authoritative pre-processed data has limited the full use of the archives. Through systematic processing and automation, the information locked away in archives can be released once it is connected to the right algorithms.
