Susan Smith has worked as an editor and writer in the technology industry for over 16 years. As an editor she has been responsible for the launch of a number of technology trade publications, both in print and online. Currently, Susan is the Editor of GISCafe and AECCafe.
Geospatial Predictions for 2017 from Boundless CEO, Andy Dearing
February 2nd, 2017 by Susan Smith
As I wrote last year, 80 percent of all business data contains a location component, yet most organizations either are not using it or don’t know how. Boundless’ open, cloud-based, and highly scalable platform allows developers to deploy an entire GIS infrastructure with just one line of code, while analysts can visualize all of their geospatial data in real time without any licensing fees.
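The article does not show what that one-line deployment looks like. As an illustrative sketch only, a container-based launch of a map server in the GeoServer/PostGIS style might be a single command; the image name here is a hypothetical placeholder, not an actual Boundless artifact:

```shell
# Hypothetical one-line deployment of a containerized map server.
# "example/geoserver" is a placeholder image name, not a real Boundless distribution.
docker run -d --name gis-stack -p 8080:8080 example/geoserver
```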
This development solution addresses the increasing demand for an alternative to proprietary geospatial solutions (Esri and Hexagon, for example). Boundless offers greater functionality than Esri’s ArcGIS at 10 percent of the cost.
Boundless CEO Andy Dearing spoke to GISCafe Voice about the company’s outlook for 2017.
How do you see businesses approaching and managing their geospatial data in 2017?
Organizations today have access to more data than ever before, and it continues to grow at an astounding rate. The problem now is not gaining access to the data; it is the ability to scale and process it in order to solve business problems. Location-based data is no exception.
Additionally, open source technologies will continue to proliferate in modern IT enterprises, as they have become an essential component for gathering, organizing, and connecting the dots between vast amounts of spatial data at our fingertips.
How does geospatial data impact the new technological advancements that are beginning to take shape?
The geospatial industry is propelling the science behind today’s technology trends (integrating IoT data, mapping drone information, and analyzing imagery from small satellites) more rapidly than ever before. We are only beginning to really understand how these new sources of information can help us make better decisions.
With the emergence of machine learning concepts, geoprocessing techniques can be applied to make sense of data streams and “big data” architectures. This means that organizations are not only understanding the “what” behind location content, but the “why” as well. The ability to gain deeper insights into change detection, trend analysis, and predictive modeling will take hold in these open, elastic infrastructures.
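To make the change-detection idea concrete, here is a minimal, self-contained sketch (my illustration, not Boundless code; the grid size and threshold are arbitrary assumptions, stand-ins for a proper statistical test): point readings from a stream are bucketed into grid cells, and a cell is flagged when its recent mean departs from its historical mean.

```python
from collections import defaultdict

def cell(lon, lat, size=1.0):
    """Bucket a coordinate into a grid cell (size in degrees; arbitrary choice)."""
    return (int(lon // size), int(lat // size))

def detect_change(history, recent, threshold=2.0):
    """Flag grid cells whose recent mean reading departs from the historical mean.

    history/recent: iterables of (lon, lat, value) tuples.
    threshold: arbitrary absolute difference used for illustration.
    """
    def means(stream):
        sums = defaultdict(lambda: [0.0, 0])
        for lon, lat, value in stream:
            s = sums[cell(lon, lat)]
            s[0] += value
            s[1] += 1
        return {c: total / n for c, (total, n) in sums.items()}

    base, now = means(history), means(recent)
    return {c for c in base if c in now and abs(now[c] - base[c]) > threshold}

# Readings: (longitude, latitude, sensor value)
history = [(10.2, 45.1, 5.0), (10.4, 45.3, 5.2), (20.1, 30.5, 7.0)]
recent = [(10.3, 45.2, 9.5), (20.2, 30.6, 7.1)]
print(detect_change(history, recent))  # → {(10, 45)}: that cell's mean jumped
```

In a production pipeline the same grouping-and-comparison step would run over a streaming or “big data” backend rather than in-memory dictionaries, but the geoprocessing logic is the same shape.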
When you say “machine learning,” is that synonymous with “deep learning”? Either way, can you describe what each of these terms means and represents in the industry?
If you ask the experts in AI, machine learning, and deep learning, they will be able to give you the subtle nuances between all of these terms. As we use the term in our trends prediction, machine learning really means applying algorithms to data streams to detect patterns and trends and ultimately provide answers based on your criteria, with location specifically accounted for in those algorithms. When you start implementing deep learning, you can take those patterns and trends and make more informed, actionable decisions, or set up predictive models based on historical trends or behaviors. A good example of this is how machine learning was used to understand the Arab Spring, helping predict where conflict would occur based on triggers on social media.
What do you think about accuracy as we move into a greater age of assimilating lots of disparate data in the cloud and having such a volume of data to analyze and process?
In general, a higher volume, variety, and velocity of data contributing to a problem yields more accurate and valid results. This is no different in the geospatial industry. For instance, OpenStreetMap (OSM) crowdsources information from users around the globe; the more contributions to the dataset, the more accurate and valid it becomes.
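The crowdsourcing point can be illustrated with a toy simulation (my sketch, not from the interview): if each contributor reports a landmark’s position with independent noise, the averaged estimate tightens as the number of contributions grows.

```python
import random
import statistics

def crowd_estimate(true_pos, n_contributors, noise=0.01, seed=42):
    """Average n noisy (lon, lat) reports; noise in degrees is an arbitrary choice."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    lons = [true_pos[0] + rng.gauss(0, noise) for _ in range(n_contributors)]
    lats = [true_pos[1] + rng.gauss(0, noise) for _ in range(n_contributors)]
    return statistics.mean(lons), statistics.mean(lats)

def error(est, true_pos):
    """Euclidean distance in degrees between estimate and true position."""
    return ((est[0] - true_pos[0]) ** 2 + (est[1] - true_pos[1]) ** 2) ** 0.5

true_pos = (2.2945, 48.8584)  # illustrative coordinates
few = error(crowd_estimate(true_pos, 10), true_pos)
many = error(crowd_estimate(true_pos, 10_000), true_pos)
print(f"error with 10 reports: {few:.5f}; with 10,000 reports: {many:.5f}")
```

With independent errors, the standard error of the averaged position shrinks roughly with the square root of the number of contributors, which is the statistical intuition behind “more contributions, more accuracy.”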
However, in order to take advantage of this, you need the technology and infrastructure to support and handle this large volume of information. Open source uniquely provides the ability to aggregate multiple formats of information across distributed services to assimilate it and provide actionable insights. Additionally, open source technologies let you scale up and out horizontally to meet current and future data-processing demands.
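A minimal sketch of the scale-out idea (my illustration, not Boundless’ implementation): shard features by grid cell, process the shards concurrently, and merge the partial results. A real deployment would fan the shards out to distributed workers; threads here just stand in for that.

```python
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

def shard_by_cell(features, size=10.0):
    """Partition (lon, lat, value) features into grid-cell shards."""
    shards = defaultdict(list)
    for lon, lat, value in features:
        shards[(int(lon // size), int(lat // size))].append(value)
    return shards

def summarize(values):
    """Per-shard work: a trivial count and total; real work would be geoprocessing."""
    return len(values), sum(values)

def process(features, workers=4):
    shards = shard_by_cell(features)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(summarize, shards.values())  # each shard independently
    count = total = 0
    for n, s in results:  # merge the partial results
        count += n
        total += s
    return count, total

features = [(10.0, 45.0, 1.0), (11.0, 46.0, 2.0), (120.0, -30.0, 3.0)]
print(process(features))  # → (3, 6.0)
```

Because each shard is processed independently, adding workers (or machines) scales the computation horizontally, which is the property the open source stacks described above are built around.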
What is Boundless’ role in each of these areas, going forward?
Boundless is right in the middle of this area today, and will be well into the future. Open source implementations are designed to be interoperable, flexible, and scalable, well beyond proprietary, stovepiped platforms. Boundless is enhancing our technology platform to be truly multi-tenant, enabling GIS to work within elastic environments and across platforms to support IoT operations. And all of our technology is built on modern, industry-standard DevOps practices, truly providing an elastic and scalable geospatial technology platform for the modern enterprise.