GISCafe Guest: Sanjay Gangal

Sanjay Gangal is the President of IBSystems, the parent company of AECCafe.com, MCADCafe, EDACafe.Com, GISCafe.Com, and ShareCG.Com.

GISCafe Predictions 2024 – 1Spatial
February 7th, 2024, by Sanjay Gangal

By Seb Lessware, CTO (Chief Technology Officer), 1Spatial

What lies ahead for 2024? It's all about Geospatial AI: Navigating the Future of Automation, Drones, and Data Aggregation.
One method of capturing this unstructured or semi-structured data is using drones, which for several years have been the highlights of geospatial hardware shows. They are widely used for inspection via cameras, or for point-cloud capture on projects, but mostly just for human visualisation. If the AI techniques described above improve the automated management of structured spatial data, then this would drive the use of drones not only for human interpretation but also for structured data capture, so one improvement would unlock the other.

In the meantime, there is still a disconnect between the data produced by the design and build phases of construction – held in CAD formats, drawings or point clouds – and the data needed by large-scale data-management systems of record. The handover and adoption of this information has been a big driver for projects we have been involved in over the last few years. We are seeing a tipping point where automatic validation and integration of this data is now the norm, so more organisations will adopt this approach. Some projects, such as the National Underground Asset Register, are no longer asking 'how do we ingest, integrate, maintain and share this data?' but 'what are the future use cases for this hugely valuable and up-to-date structured data asset?'.

The growth of automation in data capture and data ingestion also drives the need to measure and protect data quality, to ensure that automation does not introduce a loss of quality which might otherwise have been spotted by the people capturing the data. Automating data quality alongside data capture means the data is then suitable for powerful use cases such as underpinning digital twins and smart cities. These large-scale data-aggregation projects mean there will be a better data framework from which these smart uses can flourish, and we hope to see more of that in the coming year.
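The kind of automated validation described above can be pictured as a set of declarative quality rules applied to every incoming record before it is accepted into the system of record. The following is a minimal Python sketch of that idea; the record structure and rule names are hypothetical illustrations, not any specific product's rule engine:

```python
# Minimal sketch of rule-based data-quality checks at ingestion time.
# The record fields and rules here are illustrative assumptions.

def has_required_attributes(record):
    """Completeness rule: every asset must carry these attributes."""
    required = {"asset_id", "asset_type", "geometry"}
    return required.issubset(record)

def coordinates_in_range(record):
    """Plausibility rule: coordinates must be valid lon/lat pairs."""
    return all(-180 <= x <= 180 and -90 <= y <= 90
               for x, y in record.get("geometry", []))

RULES = [has_required_attributes, coordinates_in_range]

def validate(records):
    """Split incoming records into accepted and rejected sets."""
    accepted, rejected = [], []
    for record in records:
        if all(rule(record) for rule in RULES):
            accepted.append(record)
        else:
            rejected.append(record)
    return accepted, rejected

incoming = [
    {"asset_id": "P1", "asset_type": "pipe",
     "geometry": [(-0.12, 51.50), (-0.13, 51.51)]},
    {"asset_id": "P2", "geometry": [(200.0, 95.0)]},  # missing type, bad coords
]
accepted, rejected = validate(incoming)
print(len(accepted), len(rejected))  # → 1 1
```

The point of expressing the checks as a list of small rule functions is that the rejected records, and the rule each one failed, can be reported back automatically – replacing the human eye that would otherwise have caught the problem during manual capture.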
Data aggregation hubs might only be a stepping stone towards a federated data-mesh approach. Aggregating data that is mastered in many different systems by physically mirroring it in an up-to-date data hub is a great way to get consistent data in a consistent structure, with a single system to manage resilience, performance, security and role-based access. But there will always be a lag between what is stored in the hub and the latest version of the data, which might be updated on an hourly basis. A federated model, in which the data is pulled live from each data-mastering organisation's system, would provide an even more up-to-date view.

In the shorter term this is usually achieved using metadata catalogues, which can be searched to find and link to relevant data that can then be streamed or downloaded. This catalogue approach allows the data to remain in the mastering systems, but it is not usually made available in a consistent structure or format, so it is harder to aggregate for use. Data federation is harder still, especially when an agreed structure is needed for virtual aggregation, because it requires agreement on the structure and encoding of the data, as well as a high level of technical maturity at each data custodian to provide live services which are scalable and secure. While there are good standards for data sharing from organisations such as the OGC (Open Geospatial Consortium), and good examples of live data feeds being used in production, it will be interesting to see whether more widespread secure data federation progresses this year – possibly not yet. All these capabilities are underpinned by web connectivity and are therefore also at risk of hacking and disruption.
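Virtual aggregation over federated sources amounts to fetching each custodian's live data and mapping it into the agreed common structure on the fly. A minimal Python sketch of that pattern follows; the fetch functions are stand-ins for live service requests (such as OGC API Features endpoints), and all the schemas and names are hypothetical:

```python
# Sketch of federated virtual aggregation: each custodian exposes its own
# native schema, and a per-source adapter maps records into one agreed
# common structure. The fetchers mock what would be live service calls.

def fetch_water_utility():
    # Custodian A's native schema (illustrative)
    return [{"id": "W-17", "kind": "main", "lon": -0.12, "lat": 51.50}]

def fetch_telecom_operator():
    # Custodian B's native schema (illustrative)
    return [{"ref": "T-03", "class": "duct", "x": -0.13, "y": 51.51}]

def adapt_water(rec):
    return {"asset_id": rec["id"], "asset_type": rec["kind"],
            "location": (rec["lon"], rec["lat"]), "source": "water"}

def adapt_telecom(rec):
    return {"asset_id": rec["ref"], "asset_type": rec["class"],
            "location": (rec["x"], rec["y"]), "source": "telecom"}

SOURCES = [(fetch_water_utility, adapt_water),
           (fetch_telecom_operator, adapt_telecom)]

def federated_query():
    """Pull live from every source and return one consistent view."""
    return [adapt(rec) for fetch, adapt in SOURCES for rec in fetch()]

for asset in federated_query():
    print(asset["asset_id"], asset["asset_type"], asset["source"])
```

The adapters encode exactly the agreement on structure that the text describes: each new custodian only has to supply a fetcher and a mapping, but without that agreed target structure the virtual view cannot exist.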
The AI techniques described above, which can automate positive outcomes, can also be used to speed up and empower cyber criminals, terrorists and 'state actors' for negative outcomes, so the ongoing security arms race will continue at full speed with continual upgrades, testing and best practice. Whether there will be any seismic changes in the security area we don't know, but it is an ongoing discipline that must be kept up with to sustain and improve confidence in these systems, and to ensure they can continue to be connected in a trusted and secure way.

In summary, many of these developments enable more automation, and automation drives efficiency and opens up new opportunities, so we should see various outcomes becoming real this year:

- Automated AI data-capture experiments will start to show whether they are viable.
- New data-aggregation projects will start to automate ingestion by enforcing rigorous data checks.
- Existing aggregation projects will start to benefit from leveraging their data in new and innovative ways.

About the Author: With a degree in Cybernetics and Computer Science, Seb joined Laser-Scan (which became 1Spatial) in 1997 as a Software Engineer. After working on many projects and a broad range of software as a Senior and then Principal Software Engineer, he moved into consultancy and then product management, which provided insight into customer and industry needs and trends. After leading Product Management for a number of years, Seb is now Chief Technology Officer (CTO) at 1Spatial.

Category: GIS Industry Predictions