It is interesting to see what the Open Geospatial Consortium has been discussing lately in the way of geospatial technology trends:
“All predictions are wrong; some are useful. Predictions of geospatial technology trends have been the topic of recent discussions by the OGC Board of Directors and the OGC Planning Committee. One of my roles as OGC Chief Engineer is to offer a slate of “ripe issues” as a basis for these discussions. This blog provides an overview of the ripe issues developed in March 2013 and explains how they were developed. Future blogs will discuss each issue individually.
The ripe issues of geospatial technology identified in March 2013 are:
The Power of Location
Internet of Things
Geographers of the future
These issues were developed by reviewing over 200 recent articles from information technology journals published by IEEE, ACM, and others, as well as from geospatial industry magazines and other publications. Geospatial World’s recent “Thought Leaders Edition” was particularly useful in identifying issues from a geospatial industry perspective.
The U.S. National Institute of Justice has announced that it expects to award a maximum of three discretionary grants for research that explores the relationship between theory and geospatial predictive policing strategies.
No award amount was specified for this program.
According to the announcement, this funding opportunity is open to any entity, such as state, county, city, township and special district governments; Native American tribal governments and organizations; institutions of higher education; Historically Black Colleges and Universities; Tribally-Controlled Colleges and Universities; non-profits; for-profits; small businesses; eligible agencies of the federal government; and faith-based or community organizations.
The NIJ is seeking proposals that “focus on linking theories to current policing strategies, discerning potential disconnects in the levels of analysis between theory and practice, explicating what effects this may have on findings, and, finally, addressing means of adapting theory and practice based on the results.”
Robert Cheetham, CEO and president of Azavea, spoke about Temporal Geocoder, a web-based historical geocoder the company is developing for address-level temporal geocoding.
GISCafe Voice: Do you think this is the first time-enabled geocoder to be developed?
Robert Cheetham: There have been previous efforts to create time-based place name gazetteers. The China Historical GIS project <http://www.fas.harvard.edu/~chgis/> is a good example of a place name geocoder that has some similar ideas. There is a similar effort underway in New York City, led by the New York Public Library, that is also aimed at place names. But, to our knowledge, this is the first attempt to create an address-level temporal geocoder. We hope to merge both address and place name geocoding into the same system.
GISCafe Voice: What types of technology will be employed in building Temporal Geocoder?
Robert Cheetham: We plan to use Leaflet, Python, Django and PostGIS. There is also some parallel work being done by a sub-project of the OpenStreetMap project and we hope to collaborate with that effort as well. We plan to release the Database Editor under an open source license in order to make it possible for other communities to build similar databases as well as to cultivate a community around this type of work.
GISCafe Voice: How will the information for the historical aspect be displayed?
Robert Cheetham: We plan to create two basic software tools, both of which will be web-based. The first will be a database editing software tool that will enable people to indicate changes in the street network as well as street name changes and aliases. This Historical Street Database Editor will be able to display: a) the current streets; b) the street grid for a specific historical reference period; and c) a historical reference map that has been scanned and georeferenced.
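The core idea behind address-level temporal geocoding described above — street segments carrying validity intervals, so the same physical block resolves under different names depending on the reference date — can be sketched as follows. This is a minimal illustration only; the class, field names, and sample streets are assumptions, not Azavea's actual Temporal Geocoder schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class StreetSegment:
    # Hypothetical record shape; a real system would also carry geometry.
    name: str                  # street name (or alias) during this period
    low: int                   # lowest house number on the segment
    high: int                  # highest house number on the segment
    lon: float                 # representative coordinates
    lat: float
    valid_from: date           # when this name took effect
    valid_to: Optional[date]   # None = still current

def temporal_geocode(segments, street, number, when):
    """Return (lon, lat) for an address as it existed on a given date."""
    for seg in segments:
        in_period = seg.valid_from <= when and (
            seg.valid_to is None or when < seg.valid_to
        )
        if in_period and seg.name == street and seg.low <= number <= seg.high:
            return (seg.lon, seg.lat)
    return None

# Illustrative data: the same block under two names in different eras.
segments = [
    StreetSegment("Delaware Ave", 100, 199, -75.14, 39.96,
                  date(1850, 1, 1), date(1937, 1, 1)),
    StreetSegment("Columbus Blvd", 100, 199, -75.14, 39.96,
                  date(1937, 1, 1), None),
]

# Both lookups hit the same coordinates, but only for the matching era:
print(temporal_geocode(segments, "Delaware Ave", 120, date(1900, 6, 1)))
print(temporal_geocode(segments, "Columbus Blvd", 120, date(2013, 6, 1)))
```

In a PostGIS-backed implementation like the one Cheetham describes, the same date-interval filter would typically live in the SQL query rather than in application code.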
DMC International Imaging (DMCii) is helping the Algerian Space Agency (ASAL) predict the spread of locust plagues across North Africa. This effort is part of an aggressive approach to tackling the age-old problem of locust infestation using satellite imagery.
The past decade witnessed a giant leap in various industries, with 3D technology being implemented in electronic devices and other products. The need for 3D mapping arose from efforts to make 2D maps more advanced and realistic. This was done by introducing sensors, cameras, scanners, GPS components, and other acquisition devices that capture real-world 3D imagery, which is then built into models incorporated into maps. This type of technology is often used in modern computer programs to provide a lifelike view of a place or thing on a map.
Portable GPS devices use 3D mapping technology to provide automated directions. These devices have small screens that display a three-dimensional view of roads and maps. This is a good tool for people who travel or hike in unfamiliar areas because the device uses satellites to pinpoint its exact location. Building schematics are blueprints used for the construction of houses, and 3D mapping technology is often used to create them. This tool makes it easy to draw a three-dimensional version of a house plan. These plans are typically used to obtain building permits and construction materials before any building starts.
The improved 3D experience in smartphones, tablets, notebooks, PCs, cars, etc. is set to revolutionize the market for mobile and other GPS-enabled devices by broadening the horizons for users to locate things easily with any device. This report looks at the various applications of 3D modeling and mapping applied in various business verticals. It analyzes the challenges and opportunities for 3D mapping and modeling as well as its impact in the marketplace. The report also gives insights into the global adoption trends, key market players, future scope, drivers, and restraints in the market, along with growth potential across different geographies. It also analyzes various factors that will drive and restrain the market over the next 5 years.
Key Topics Covered:
2 Executive Summary
3 Market Overview
4 3D Mapping And Modeling: Market Size And Forecast
5 3D Mapping: Market Size And Forecast, By Applications
6 Market Size And Forecast By 3D-Enabled Devices: Its Influence On 3D Mapping
7 3D-Enabling Devices: Its Influence On 3D Mapping
8 3D Mapping And 3D Modeling: Market Size And Forecast, By Verticals
9 3D Mapping And 3D Modeling: Market Size And Forecast, By Regions
The Federal Geographic Data Committee has initiated a process to develop a new strategic plan for the National Spatial Data Infrastructure. On March 7, FGDC held an NSDI Leaders Forum at the U.S. Department of the Interior to begin to gather input to help build the foundation for the new strategic plan. More than 20 key geospatial associations, plus other leaders in the geospatial community, were invited to the forum to participate in a dialogue about a strategic plan to guide the future development of the NSDI. GITA was represented by Talbot Brooks, President of the Board of Directors. John Moeller, former Board Member, also participated in the forum. The FGDC plans to work collaboratively with partners and stakeholders in the geospatial community to develop and implement a new strategy for continued and sustainable development of the NSDI.
The forum was a very good start for gaining input and ideas. Over the almost three hours of discussion, all of the participants had the opportunity to voice the perspectives of different segments of the geospatial community. This input, along with that from a meeting with federal agencies earlier in the week, will help FGDC build an expanded outline for the strategic plan. The timeline calls for the plan to be completed by the end of 2013. While this could shift a bit during the next few months, FGDC anticipates having a draft plan out for public review in mid-to-late summer.
As a key stakeholder in the NSDI, GITA sees a spatial data infrastructure as a critical national infrastructure asset similar to physical infrastructure assets. GITA supports the development of an updated NSDI Strategy and encourages members to contribute to the planning activities. For more information go to http://www.fgdc.gov/nsdi-plan. You may also provide Talbot with your ideas and input by contacting him at email@example.com.
Bill Okubo, Exelis enterprise product manager, spoke on the ENVI Services Engine from Exelis. Exelis Visual Information Solutions developed an enterprise-enabled processing engine that gives remote users access to the power of ENVI image analysis and IDL applications from a web or mobile client interface. The working name for this capability is the ENVI and IDL Services Engine (ESE). This engine now enables the remote user to access the same compiled ENVI and IDL functions and procedures that remote sensing scientists have utilized for decades at the desktop level.
TomTom spokesperson Lea Armstrong talked about TomTom’s recent announcements, which include the extension of its longstanding partnership with leading PND supplier MiTAC International Corporation. Under the global agreement, TomTom will provide a range of world-class navigational services for use in all of the PND brands owned by MiTAC including Mio, Navman and Magellan.
TomTom also announced an agreement to supply maps and road data to location-based services provider Telmap. With this partnership, Telmap, an Intel company, will utilize TomTom content to enhance its products and services, including mobile navigation apps distributed via application stores and mobile operators worldwide.
GISCafe Voice: How will this relationship with MiTAC be extended?
Lea Armstrong: This is an extension of the 2009 service agreement for another three years. The agreement covers EMEA, APAC, and Mexico for Mio, as well as the USA and Canada for Magellan.
GISCafe Voice: Did Telmap have an existing relationship with TomTom and if so, how did that differ from the current partnership?
Bill Emison, senior account manager for Geospatial Solutions at Merrick & Company, talked about the new QC module in Version 7.1 of MARS (Merrick Advanced Remote Sensing) software.
Airborne LiDAR – urban area
The Merrick Advanced Remote Sensing (MARS) software suite is a comprehensive, production-grade Windows application designed to visualize, manage, process and analyze LiDAR point cloud data.
The Quality Control module is designed to provide an automated tool for verifying compliance of a LiDAR point cloud dataset to the LiDAR Base Specification Version 1.0 from the USGS (U.S. Geological Survey). The application sits on top of MARS as an extension.
“As a data vendor there have been many contracts where we’ve had to comply with those specs, and in an effort to do that effectively, we started to build tools two to three years ago,” said Emison. “We completely automated the entire spec. Our goal is to deliver data one time and not have to do rework, so it was important to identify issues before the dataset went out the door.”
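The kind of automated compliance check Emison describes might look like the following sketch, which validates a tile's point density and classification codes. The thresholds and allowed codes here are illustrative placeholders, not the actual USGS LiDAR Base Specification values, and this is not Merrick's MARS implementation.

```python
# Hedged sketch of an automated LiDAR QC check: verify nominal point
# density and that every point carries an allowed classification code.
# Both constants below are assumptions chosen for illustration.

ALLOWED_CLASSES = {1, 2, 7, 9, 10, 11}   # e.g. unclassified, ground, noise, water
MIN_DENSITY = 2.0                         # required points per square meter (assumed)

def qc_tile(points, tile_area_m2):
    """points: iterable of (x, y, z, classification) tuples.

    Returns a small report dict indicating whether the tile passes.
    """
    points = list(points)
    density = len(points) / tile_area_m2
    bad_classes = sorted({c for (_, _, _, c) in points if c not in ALLOWED_CLASSES})
    return {
        "density": density,
        "density_ok": density >= MIN_DENSITY,
        "bad_classes": bad_classes,           # codes found outside the allowed set
        "passes": density >= MIN_DENSITY and not bad_classes,
    }

# Synthetic 20 m x 20 m tile: one ground-classified point every 0.5 m.
tile = [(x * 0.5, y * 0.5, 100.0, 2) for x in range(40) for y in range(40)]
report = qc_tile(tile, tile_area_m2=400.0)
print(report["passes"], round(report["density"], 1))   # → True 4.0
```

A production tool would run many more checks from the specification (vertical accuracy, overlap consistency, file header fields), but each follows the same pattern: compute a metric over the point cloud and compare it against the spec's threshold before the dataset goes out the door.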