September 01, 2008
Update on the DHS Data Model
by Susan Smith - Managing Editor
Each GIS Weekly Review delivers to its readers news concerning the latest developments in the GIS industry, GIS product and company news, featured downloads, customer wins, and coming events, along with a selection of other articles that we feel you might find interesting. Brought to you by GISCafe.com. If we miss a story or subject that you feel deserves to be included, or you just want to suggest a future topic, please contact us! Questions? Feedback? Click here. Thank you!
GISWeekly examines select top news each week, picks out worthwhile reading from around the web, and special interest items you might not find elsewhere. This issue will feature Industry News, Top News of the Week, Acquisitions/Agreements/Alliances, Announcements, People, Training, New Products, Around the Web and Events Calendar.
GISWeekly welcomes letters and feedback from readers, so let us know what you think. Send your comments to me at
Susan Smith, Managing Editor
Update on the DHS Data Model
By Susan Smith
What is the Department of Homeland Security Data Model?
The DHS Data Model (DHS GDM) is the product of the Department of Homeland Security (DHS) Office of the Chief Information Officer (OCIO), Geospatial Management Office (GMO), in support of urgent DHS mission requirements. I’ve spoken to a number of people about the Data Model, and it appears to be an effort to aggregate data from state and local governments, which have some of the best base and event data, with federal data. At first glance it may look like the DHS wants to put it all together in one database, but it is more of an attempt to create a common understanding among 50+ states, 3,100+ counties, and 20,000 taxing entities: a “service oriented architecture” published in Unified Modeling Language (UML) and XML.
The Department of Homeland Security (DHS) Data Model has been around since July of 2007, openly published on the Federal Geographic Data Committee’s (FGDC) website and based, according to Mark Eustis of the DHS OCIO and GMO, and Mike Lee of the FGDC Homeland Security Working Group, on “open government and international standards.”
A definition of the DHS Geospatial Data Model (GDM) found on the FGDC site reads as follows: “This DHS GDM is a standards based, logical data model to be used for collection, discovery, storage, and sharing of homeland security geospatial data. The model will support development of the Department’s services-based geospatial architecture, and will serve as an extract, transform, and load (ETL) template for content aggregation.”
According to a report published in July 2006, the DHS is mandated to comply with standards; the standards listed at that time include the National Information Exchange Model (NIEM) and the FGDC framework standards.
According to Eustis, “the GDM is a harmonized collection of International, Federal, and certain community standards…as well as national plan recommendations, DHS sector-specific requirements, and various emergency-response and general standards models from pertinent resources. Here’s the abridged list:
FGDC Framework Data Content Standard
USGS Project Bluebook data model (ESRI form)
National Information Exchange Model (NIEM)
Secondary and/or associated sector-specific resources
FEMA MultiHazard; Emergency Management & Infrastructure Protection
DHS Infrastructure Protection Taxonomy, v1.0
Geographic Names Information System (GNIS) Feature IDs and types
Feature types for FGDC Emergency Management Symbology
National Incident Management System (NIMS) Resource types
National Hydrography Dataset (NHD) Feature Types
FGDC Cadastral Subcommittee, Revised Cadastral Model
National Response Plan Categories
Homeland Security Infrastructure Protection (HSIP) Feature Types
NASA Land Cover Classification types
American Planning Association (APA) Land Use Classifications
FGDC and ISO Geospatial Metadata
The GDM that is the source for the data schemas produced by the GDM-o-Matic is an entirely open, completely standards-based conceptual model.”
So far the DHS Data Model aggregates much of the domain experience of the emergency community, including FEMA, the FGDC, the National Response Framework, data products emanating from the NGA, and modeling done against the feature types within the Homeland Security Infrastructure Program (HSIP) dataset.
The bottom line is that the standards the DHS refers to are informed by federal Homeland Security agency standards and other federal agencies and their reliance on ESRI software and geodatabases. The DHS Data Model is not software or platform independent at this time.
“Think of the data model like the New York Public Library: it contains everything there is to know about almost everything that could happen in and around the emergency services world, which includes the potential impacts to environment, infrastructure, and a number of different domain areas,” explained Eustis. The “schema generation tool,” or “GDM-o-Matic,” an implementation-model translation tool, allows people to extract a direct implementation model and put it into their ESRI software. “It will also publish an open XML format product in our next iteration, but the schemas that are produced now are an XML product, formatted to fit into the standards that ESRI uses.”
Two questions come to mind in reviewing this information: 1) Does the DHS Data Model use Geography Markup Language (GML), the XML grammar defined by the Open Geospatial Consortium (OGC) to express geographical features? GML serves as a modeling language for geographic systems as well as an open interchange format for geographic transactions on the Internet (as defined by the OGC website). And 2) Where does extract, transform, and load (ETL), the kind of translation technology built by Safe Software, come in?
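To make the first question concrete, here is what a GML-encoded feature looks like in practice. This is an illustrative sketch only: the feature name and properties are invented for this article and are not drawn from the DHS GDM schema. The point is that GML is ordinary XML, parseable by any standard toolkit.

```python
# A minimal, hypothetical point feature encoded with GML geometry,
# parsed with Python's standard library to show GML is plain XML.
import xml.etree.ElementTree as ET

GML_NS = "http://www.opengis.net/gml"

doc = """<Hydrant xmlns:gml="{ns}">
  <name>Hydrant 42</name>
  <gml:Point srsName="EPSG:4326">
    <gml:pos>35.2271 -80.8431</gml:pos>
  </gml:Point>
</Hydrant>""".format(ns=GML_NS)

root = ET.fromstring(doc)
pos = root.find("gml:Point/gml:pos", {"gml": GML_NS}).text
lat, lon = (float(v) for v in pos.split())
print(lat, lon)  # 35.2271 -80.8431
```

Because the encoding is open XML, the same document can be consumed by ESRI tools, Oracle Spatial loaders, or a ten-line script like the one above — which is the interoperability argument GML's proponents make.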
Eustis explained that “GML is an object-based language and ESRI is not. ESRI is the tool that DHS uses internally and it’s the tool that, according to Daratech, has an 89% market share in the state and local communities.”
However, pure GML is available directly on the FGDC website with some Oracle Spatial implementations, and “as the tool evolves I guess there’s an expectation that we will be publishing pure open GML-NIEM-compliant XML and work towards the open formats,” Eustis added.
Because this is an evolutionary process, Eustis said that they have started with the tool for which the DHS and NASA have an enterprise license, and that is ESRI, so “ESRI geodatabases and formatting standards are the industrial standard that folks tend to be using. The hosts and nomenclature (of the DHS Data Model) all come from open standards and it’s our hope and plan, depending upon reaction from the community, to continue to support and move towards more and more formats and standards. The next one will be the NIEM XML. The XML statement can be parsed by any tool you want. There’s nothing hidden.”
NIEM is the National Information Exchange Model, a collaborative program between DHS, the Department of Justice, and a number of other entities within the federal community. NIEM builds on the longstanding Global Justice XML Data Model (GJXDM).
The modeling team also builds Oracle implementations to serve other users, such as Autodesk, Intergraph and MapInfo users who work with Oracle Spatial. “The plan is to ensure that this technology remains as open, accessible, and useful as possible; thus the sponsorship and release through the FGDC,” said Lee.
Safe Software, creators of extract, transform, and load (ETL) technology, plans to adapt the technology that the DHS has produced in the schema generation tool (sometimes called the GDM-o-Matic) so that it can generate schemas and transfer files back and forth that conform to the GDM. Safe will build this capability into its software.
Eustis described the GDM as a bigger conceptual repository of information; the schema generation tool, or GDM-o-Matic, allows people within particular sectors of homeland security in particular geographic areas to reach into that repository, pull out information that is very specific to them, and share data through that common knowledge. People always have different ways of doing things in different places. The GDM conceptual model provides a common operational picture and reference point for users to find the features and attributes they need and, if necessary, use the ETL process.
“Sharing data is not a trivial process,” Eustis pointed out. “One group called a hydrant a hydrant, another group might call a hydrant “hyd” or another group might call it a standpipe. You need to have some mechanism of standardizing the nomenclature and then providing the tools that people can map from one space to the next.”
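Eustis’s hydrant example is, at heart, a vocabulary crosswalk problem, and it can be sketched in a few lines. This is an illustration of the idea only — the table and function names are invented here and are not part of the GDM or of any Safe Software product.

```python
# Sketch of the nomenclature-mapping step Eustis describes: each local
# data source labels the same real-world feature differently, and an
# ETL pass rewrites local terms to the shared model's term.
# All names below are invented for illustration.

CROSSWALK = {
    "hydrant": "Hydrant",
    "hyd": "Hydrant",
    "standpipe": "Hydrant",
}

def to_common_model(record: dict) -> dict:
    """Rewrite a local record's feature type to the shared vocabulary."""
    local_type = record["type"].strip().lower()
    # Unknown terms pass through unchanged for a human to review later.
    return {**record, "type": CROSSWALK.get(local_type, local_type)}

local_records = [
    {"id": 1, "type": "hyd"},
    {"id": 2, "type": "Standpipe"},
    {"id": 3, "type": "hydrant"},
]
shared = [to_common_model(r) for r in local_records]
print(shared)  # every record now carries the common type "Hydrant"
```

The real work in a production ETL tool is building and maintaining crosswalk tables like this at the scale of thousands of feature types and jurisdictions, which is exactly the mapping effort the GDM is meant to anchor.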
Paul Daisey of the Geographic Division, U.S. Census Bureau, incidentally, was an editor of the OGC’s GML specification. Eustis said that UML, the conceptual language of the GDM, could be “exported as a GML model if someone wants one.”
The DHS Data Model follows the formula in place for ESRI’s other industry-specific data models.
Ken Marshall, Data Content Advisor for GeoConnections, Canada’s national spatial data infrastructure initiative, was the lead for developing the Canadian National Infrastructure Data Model which leveraged the U.S. DHS Geospatial Data Model as its template. However, what is different about the National Infrastructure Data Model is that it is open.
Paula Rojas, GeoConnections Canadian Geospatial Data Infrastructure Content Coordinator at Natural Resources Canada, said, “Our data model in Canada is open. When we developed it, we had it tested on various database platforms so it can be database independent and software independent. In Canada we used a user driven approach that was the requirement of the emergency management community. We found out what they needed, and we based our data model on their needs and then they consulted on the data model and were able to get feedback on it before we finalized it, which was useful to them.”
Rojas was quick to point out that there is a difference between system interoperability and data interoperability. “When we talk about systems being able to interact with each other through true web services and standard interfaces, those are some of the standards put out by the OGC, like the Web Map Service (WMS), etc. When we’re talking about data content standards, which is about the data itself (not how the systems communicate together, but how the system understands the data), that’s where we develop things like data models and data schemas, to have common structures and languages about data so it can be shared and understood by different systems.”
“Each data model is more of a data standard,” explained Rojas, “so when you’re looking at comments about compliance with OGC, etc., I don’t think that’s relevant.” The UML data modeling language is the standard used to describe the DHS data model, according to Rojas. When it comes to data content standards or data encoding standards in the OGC, right now they have the GML spec and the simple features spec, but those are independent of what the data model is. “So I don’t understand people’s concerns with compatibility with the OGC. OGC is not in the business of semantic data standards, meaning standards for specific thematic communities. In this case, it would be the public safety communities’ needs for critical infrastructure data. It’s very specific.”
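The “system interoperability” side of Rojas’s distinction is worth making concrete: an OGC Web Map Service request is just a parameterized HTTP GET, independent of whatever data model sits behind the server. The sketch below builds such a request; the host and layer name are placeholders, not a real DHS or GeoConnections endpoint.

```python
# Illustrative WMS GetMap request URL, built with the standard library.
# The endpoint and layer name are hypothetical; the parameter names
# (SERVICE, REQUEST, LAYERS, BBOX, ...) are the ones the OGC WMS
# interface standard defines.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "hydrants",             # illustrative layer name
    "SRS": "EPSG:4326",
    "BBOX": "-81.0,35.0,-80.5,35.5",  # minx,miny,maxx,maxy
    "WIDTH": "512",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}
url = "http://example.com/wms?" + urlencode(params)
print(url)
```

Any WMS-conformant server answers such a request with a rendered map image, regardless of how its underlying data is modeled — which is Rojas’s point that interface standards and data content standards solve different problems.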
Ken Marshall gave his perspective on why Canada looked at the DHS Data Model and what they understand it to be. “In 2004, several agencies in Canada got together with the military in the U.S. to come up with something they call the Cross Border Infrastructure Plan. Since a lot of the Canadian population lives close to the American border, and major cities are close, they wanted all the geospatial information within that 100-mile swath on either side of the border so both Canadian and U.S. military could have a more complete picture of the infrastructure in that area. So they developed the Cross Border Infrastructure Plan, which was the foundation for the data model.” On the Canadian side, GeoConnections led the development of the National Infrastructure Data Model at the recommendation of its public safety program, which advised that the model was needed to support the Cross Border Infrastructure Plan.
Public Safety Canada, the Canadian counterpart to the DHS, has the mandate for federal emergency management coordination and broader strategic policy considerations for emergency management and public safety in Canada. It is the lead on critical infrastructure programming in Canada, dealing with issues such as how governments will work together to protect critical infrastructure and planning for the upcoming 2010 Olympics in Vancouver.
Public Safety Canada created ten infrastructure sectors, including energy, utilities, and transportation. “One key difference is that it is a very strategic data model, in the sense that it’s really intended for emergency managers to use; it doesn’t describe or structure all the attributes of the infrastructure that we’re attempting to describe. It’s very high level, and really to be used by emergency managers, based on consultations and the history of what was done before in the Cross Border Infrastructure environment,” said Marshall.
DHS and Public Safety Canada intend to work together in case of a cross border emergency. People in the different organizations will be able to use the same underlying database and information and look at the same common operational picture.
Sam Bacharach, Executive Director, Outreach and Community Adoption, for the OGC, agreed that “the data model is something OGC doesn’t do.” He added, “The existing data model is an absolute necessity for us to move forward.”
Bacharach described the DHS Data Model as “something I can translate my data into and give to you, it’s a format you can give me data in and I will understand it. So I’ve only got to do one transform algorithm, I don’t have to map in 10,000 different data sources. It’s tough when you’ve got people from the city, county, state and federal and the commercial world – many of whom have data too. From a data model perspective they’re doing a great job.”
It takes an enormous number of people to manage data and to collect it from the various counties, municipalities, and states, Bacharach pointed out, and there isn’t enough money to do this. The answer is the DHS Data Model, because “they need something now. As an example, the City of Charlotte serves street center line data out of its own servers on the National Map, so when they make a change it doesn’t have to go through North Carolina and the federal government and whoever else has to be included. It’s just there. That’s the direction we need to be going.”
“Think of bandwidth, think of serving capacity, and think of data,” said Bacharach. “As bandwidth gets faster, people will say I’d use the National Map but it is so slow compared to Google Maps. The National Map has one machine sitting in Sioux Falls, South Dakota; it’s got a big fat pipe, but it’s only one machine, and Google has half of its 500,000 servers serving maps. Is it any wonder Google Maps is faster?”
The government needs to take this factor into account, said Bacharach, as right now if there was a catastrophe in a big city and that city was serving live vector data, they would have a hard time serving 500 emergency responders who need the data now. They also don’t yet have the security framework in place that would keep others out.
Also, in an emergency, many people want to access maps immediately. “If you’ve got a little server and 50,000 people trying to get data out of it, the people who need the data can’t get to it. The mechanisms to be able to control that data by role by user, plus serving the data are being developed now.”
Bacharach said that the DHS Data Model v.2 is already being used by government agencies who are using the OGC interface and GML encodings to move data from one system to another. The Department of Defense has mandated these standards be used, and all the large vendors support the core OGC standards anyway.
“When you realize three or four government civilians and four or five contractors are the sum total of the Geospatial Management Office of DHS, an organization with the responsibilities they’ve got? They’re pretty busy people,” Bacharach noted. “If they weren’t trying to stand an agency up they’d still be super busy.”
In summary, Bacharach said the DHS Data Model effort “is not incompatible with OGC at all. Right now they’re focusing on the database, and as it moves out and as more people get hold of it and use it, I’m confident it will go the open standardized way.”
Top News of the Week
Pitney Bowes MapInfo announced the launch of Jobs in the Field, the latest addition to the suite of mobile working software for Confirm, the most sophisticated and comprehensive suite of specialist asset and infrastructure management functionality on the market.
The Jobs in the Field module allows work instructions produced in Confirm to be transferred to handheld devices as jobs to be completed out in the field. The seamless integration of this functionality allows mobile operatives to receive new instructions, report on the progress of work, and input completed work details so that payment claims may be made.
Sustainable Map Solutions (SMS), a purveyor of municipal software, announced a partnership with mPower Innovations, a GIS software application developer. The partnership allows SMS to distribute mPower Integrator, an online mapping utility, with SMS' online permitting package TOTAL-Permit. The first phase of the mPower Integrator solution is offered to municipalities, counties, and states in an effort to provide cost effective online GIS services.
NAVTEQ, a global provider of digital map data for location-based solutions and vehicle navigation, announced that Loopt is teaming up with NAVTEQ to bring its interoperable social-mapping application to a new network of consumers. The social networking solution uses NAVTEQ data for North America and is commercially available on select carriers.
Ubisoft and GeoEye, Inc. announced that GeoEye, a premier provider of satellite, aerial and geospatial information, has provided high-resolution images taken from its commercial Earth-imaging IKONOS satellite for integration into Tom Clancy’s H.A.W.X for the Xbox 360 video game and entertainment system from Microsoft, the Sony PLAYSTATION3 computer entertainment system and on Windows-based PC. Tom Clancy’s H.A.W.X will be available in early 2009.
The Open Geospatial Consortium, Inc. (OGC) will participate in the 2008 Free and Open Source Software for Geospatial (FOSS4G) Conference, September 29 - October 3, 2008, at the Cape Town International Convention Centre, Cape Town, South Africa.
Safe Software announced that its software solution FME, the recognized standard in spatial ETL (extract, transform and load), now offers complete support for CityGML.
The FME 2009 beta introduces the ability to write CityGML data, adding to the read support which has been available since the format's early draft stages. This combination of read and write capabilities enables CityGML users to communicate in the common specification while still using their preferred systems for data analyses. FME Server, Safe Software's spatial data distribution solution, will also be extended in 2009 to add capabilities for sharing CityGML data via the OGC web feature service (WFS) protocol.
Intelligent Addressing (IA) is taking a leading role as Britain’s sole representative in the EURADIN (EURopean ADdresses INfrastructure) project. The EURADIN project, which runs over two years, aims to contribute to the harmonization of European addresses and to propose solutions for interoperability and access, facilitating the creation of new value-added products and services across Europe. The results of the project will be used as a reference for all European member states to fulfill their obligations under the INSPIRE directive. From 2009, any new address datasets across Europe will have to accommodate the agreed data structure, and any existing datasets must do so by 2016.
James W. Sewall Company is pleased to announce the hire of Chester C. Bigelow, III, PWS, as Senior Environmental Scientist. Mr. Bigelow brings to Sewall 25 years’ experience in aquatic and wetland ecology and environmental engineering, with a specialization in water quality monitoring, wetland restoration, and ecological conservation and management.
CartoPac Field Solutions, powered by Spatial Data Technologies, announced that Victoria Bosworth has been hired as the company’s marketing coordinator. CartoPac Field Solutions is a leader in the mobile mapping industry and offers a number of Trimble-compatible applications that give field personnel the ability to be more productive and efficient in the collection of critical field data.
The Naval Surface Warfare Center (NSWC) Dahlgren has named ESRI an approved prime contractor on the SeaPort Enhanced (SeaPort-e) online portal. By being listed on SeaPort-e, ESRI can make available a broad range of engineering, technical, and programmatic services related to geographic information system (GIS) and information technology. ESRI can provide these services to the U.S. Navy, the U.S. Marine Corps, and the Defense Threat Reduction Agency (DTRA).
Details about ESRI's participation in SeaPort-e, along with points of contact, are available by visiting
Overwatch Geospatial Systems, an operating unit of Textron Systems, a Textron Inc. company, will offer training classes for Feature Analyst, LIDAR Analyst, and Urban Analyst in Sterling, Virginia, September 16-18, 2008. The annual Overwatch Geospatial Training Days provide extensive instruction for Overwatch Geospatial software users, enhancing their skill sets, productivity, and knowledge base. Half-day training classes will offer new insight into the latest advances in 2D and 3D feature extraction.
All training classes will be held at the Overwatch Geospatial Systems offices in Sterling, Virginia. Discounts are available for users who enroll in multiple classes. A class schedule, pricing and sign-up information are all available at the
ESRI announced the new ArcGIS API for Flex Beta at the ESRI International User Conference in San Diego, California, which was held August 4–8, 2008.
ArcGIS API for Flex is integrated with Adobe Flex Builder 3 and can be downloaded for free from ESRI. Flex is a client-side technology that is rendered by Flash Player 9 or Adobe AIR. This means that application developers now have the capability to combine geographic information system (GIS)-based Web services from ArcGIS Server with other Web content and display it in fast, visually rich, and expressive mapping applications that can be deployed over the Web or to the desktop.
Orbit Geospatial Technologies presents Orbit GIS 4.3 for Windows on 32-bit and 64-bit machines. The Orbit GIS Desktop solution now comes with full OGC WMS support, raster processing tools, and over 85 new and improved functions.
Orbit GIS 4.3 offers improvements ranging from full OGC WMS support and dramatically improved raster data management to improved graphics and symbol tools and a wide range of usability enhancements focused on user friendliness. This release ships simultaneously for 32-bit and 64-bit Windows platforms, with identical capabilities.
Idevio announced that Yahoo! Local listings and Yahoo! Traffic are featured for mobile users via the mobile service Locago. Locago releases more local content for mobile users in the US by aggregating Yahoo! Local and Yahoo! Traffic for mobile devices.
XATA Corporation announced it will add improved digital mapping software to its XATANET on-demand fleet operations software. The upgraded mapping software will provide XATANET end-users with more high-quality data, including larger and more dynamic maps, to help better track exact vehicle location and improve route details. Available in November 2008, the enhanced mapping capability is part of the XATANET 4.3 release.
Around the Web
Mobile GIS Speeds Inventory, ArcUser Online -- by Pete Pearson, HWA GeoSciences Inc.; Jeremiah Podleski, Perteet Inc.; and Garet Couch, Wind Environmental Services, LLC
September 7 - 11, 2008
Keystone Conference Center
Keystone, CO USA
The National States Geographic Information Council (NSGIC) is an organization committed to efficient and effective government through the prudent adoption of geospatial information technologies (GIT).
Radisson Hotel Boston, 200 Stuart Street
If you are an existing MapInfo Professional customer or a GIS specialist, you will not want to miss the overview session on MapInfo Professional version 9.5. Take a detailed view of the new capabilities and functions now available to you. See live demos of the new features such as CAD-like tools, .NET programmability, support for SQL Server 2008, a sneak peek at vector translucency, and much more. Then join the Pitney Bowes MapInfo Product Management team for a session where you will have the opportunity to provide personal and direct input on future product enhancements.
September 11 - 12, 2008
Radisson SAS Hotel
The 2008 Incident Management Summit was instigated by the Ministerie van Verkeer en Waterstaat who have dedicated their time and energy into making this event the most prestigious and forward thinking geo-spatial platform for European transport management. It brings together many of the world's foremost geo-infrastructure experts, high ranking public sector delegates representing governmental agencies from across the whole of Europe and globally trusted solution providers who will be on-site and available to offer cutting edge technology designed specifically for this arena.
September 14 - 16, 2008
Join IMTA for its first global conference in North America. Vancouver is one of the most beautiful cities in the world. Nestled between the majestic coastal mountains of British Columbia and the tides of the Pacific Ocean, it is a year-round destination that lures travelers from all parts of the globe.
Brighton, United Kingdom
Do you have customers that rely on location based information? Find out how they can track assets, find customers, manage mobile work forces and make sure their customers find them before they find a competitor. Learn how to combine your expertise and Microsoft’s technology to integrate mapping applications into your client’s business. For more information or to register please follow this link.
September 21 - 24, 2008
Marriott Westchase Hotel
Houston, TX USA
The 16th Annual GIS for Oil & Gas Conference and Exhibition is the world's largest gathering of oil and gas GIS professionals. Finding innovative solutions can be the key to unlocking today's unique technological challenges. GIS technology can provide solutions that include many operational and economic benefits to oil and gas operations. Whether you're new to geospatial technologies or a knowledgeable veteran, you'll discover how GIS can help you address regulatory compliance issues, proactively manage organizational assets, and enhance your bottom line.
September 28 - October 1, 2008
Renaissance Washington DC
Join us for the 2008 ESRI Health GIS Conference and learn firsthand how health and human services organizations are using GIS to shape global health. Hear from thought leaders knowledgeable in the value of GIS and see what geospatial solutions can do for you in your line of work. From applying GIS to a problem, issue, or clinical practice to developing an enterprise-wide approach, this conference is your opportunity to explore how important GIS technology is to health and human services professionals.
September 29 - October 3, 2008
Cape Town Convention Center
Cape Town, South Africa
The 2008 Free and Open Source Software for Geospatial (FOSS4G) conference gathers developers and users of open source geo-spatial software from around the world to discuss new directions, exciting implementations, and growing business opportunities in the field of open source geo-spatial software.
September 29 - October 3, 2008
The Spatial Sciences Institute's Remote Sensing and Photogrammetry (RS & P) Commission and the 14ARSPC Organising Committee look forward to welcoming all delegates to the 14ARSPC in Darwin, Northern Territory, held from 29 September - 3 October 2008.
The first of the ARSPC biennial series was held in 1979. It is exciting that for the first time in the conference's 29-year history, it will be held in Australia's most northern tropical city, Darwin. The ARSPC is the RS&P Commission's premier conference showcasing the best of Australasia's technical and application-driven remote sensing and photogrammetry research. The conference also plays an important role in presenting research and future directions from around the globe to RS&P Commission members and the wider Australasian spatial community.
You can find the full GISCafe event calendar here
To read more news, click here
-- Susan Smith, GISCafe.com Managing Editor.