September 01, 2008
Update on the DHS Data Model
The DHS Data Model follows the formula established for ESRI’s other industry-specific data models.
Ken Marshall, Data Content Advisor for GeoConnections, Canada’s national spatial data infrastructure initiative, led the development of the Canadian National Infrastructure Data Model, which used the U.S. DHS Geospatial Data Model as its template. What sets the National Infrastructure Data Model apart, however, is that it is open.
Paula Rojas, GeoConnections Canadian Geospatial Data Infrastructure Content Coordinator at Natural Resources Canada, said, “Our data model in Canada is open. When we developed it, we had it tested on various database platforms so it can be database independent and software independent. In Canada we used a user-driven approach based on the requirements of the emergency management community. We found out what they needed, and we based our data model on their needs, and then they consulted on the data model and were able to give feedback on it before we finalized it, which was useful to them.”
Rojas was quick to point out that there is a difference between system interoperability and data interoperability. “When we talk about systems being able to interact with each other through true web services and standard interfaces, those are some of the standards put out by the OGC, like the Web Map Service (WMS). When we’re talking about data content standards, we’re talking about the data itself: not how the systems communicate, but how a system understands the data. That’s where we develop things like data models and data schemas, common structures and languages for data, so it can be shared and understood by different systems.”
“Each data model is more of a data standard,” explained Rojas, “so when you’re looking at comments about compliance with OGC, I don’t think that’s relevant.” According to Rojas, UML, itself a standard modeling language, is what is used to describe the DHS data model. As for data content or data encoding standards at the OGC, it currently has the GML specification and the Simple Features specification, but both are independent of any particular data model. “So I don’t understand people’s concerns with compatibility with the OGC. OGC is not in the business of semantic data standards, meaning standards for specific thematic communities. In this case, it would be the public safety community’s needs for critical infrastructure data. It’s very specific.”
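Rojas’s distinction can be sketched in code: a data content standard fixes what the data means (field names, types, units) regardless of how it travels between systems. A minimal illustration follows; the field names and types are entirely hypothetical and are not taken from the actual DHS or Canadian models.

```python
# Minimal sketch of a data content standard: a schema the sharing
# communities agree on, independent of the transport (WMS, WFS, file
# exchange). All field names and types here are hypothetical.

SHARED_SCHEMA = {
    "facility_id": str,
    "sector": str,        # e.g. "energy", "transportation"
    "latitude": float,
    "longitude": float,
}

def conforms(record):
    """A record is understood the same way by all parties only if it
    carries exactly the agreed fields with the agreed types."""
    return (set(record) == set(SHARED_SCHEMA)
            and all(isinstance(record[k], t)
                    for k, t in SHARED_SCHEMA.items()))

bridge = {"facility_id": "CB-0042", "sector": "transportation",
          "latitude": 49.0025, "longitude": -122.7573}
print(conforms(bridge))  # True: both systems read this record identically
```

Any record that adds, drops, or retypes a field fails the check, which is exactly the kind of mismatch a shared data model is meant to prevent.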
Ken Marshall gave his perspective on why Canada looked at the DHS Data Model and what they understand it to be. “In 2004, several agencies in Canada got together with the military in the U.S. to come up with something they call the Cross Border Infrastructure Plan. Since a lot of the Canadian population lives close to the American border, and major cities are close, they wanted all the geospatial information within that 100-mile swath on either side of the border so both Canadian and U.S. military could have a more complete picture of the infrastructure in that area. So they developed the Cross Border Infrastructure Plan, which was the foundation for the data model.” On the Canadian side, GeoConnections led the development of the National Infrastructure Data Model on the recommendation of its public safety program, which advised that such a model was needed to support the Cross Border Infrastructure Plan; that plan, in turn, served as the foundation for the National Infrastructure Data Model.
Public Safety Canada, the Canadian counterpart to DHS in the U.S., has the mandate for federal emergency management coordination and for broader strategic policy on emergency management and public safety in Canada. It is the lead on critical infrastructure programming in Canada, dealing with issues such as how governments will work together to protect critical infrastructure and planning for the 2010 Olympics in Vancouver.
Public Safety Canada defined ten infrastructure sectors, including energy, utilities and transportation. “One key difference is that it is a very strategic data model, in the sense that it’s really intended for emergency managers to use. It doesn’t describe or structure all the attributes of the infrastructure that we’re attempting to describe. It’s very high level, really to be used by emergency managers, based on consultations and on the history of what was done before in the Cross Border Infrastructure environment,” said Marshall.
DHS and Public Safety Canada intend to work together in case of a cross border emergency. People in the different organizations will be able to use the same underlying database and information and look at the same common operational picture.
Sam Bacharach, Executive Director, Outreach and Community Adoption, for the OGC, agreed that “the data model is something OGC doesn’t do.” He added, “The existing data model is an absolute necessity for us to move forward.”
Bacharach described the DHS Data Model as “something I can translate my data into and give to you; it’s a format you can give me data in and I will understand it. So I’ve only got to do one transform algorithm, I don’t have to map in 10,000 different data sources. It’s tough when you’ve got people from the city, county, state and federal levels, and the commercial world, many of whom have data too. From a data model perspective they’re doing a great job.”
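Bacharach’s “one transform” point is the hub-and-spoke argument for a common model: each source writes a single mapping into the shared model, instead of every pair of systems maintaining a translator. A rough sketch follows; all field names are hypothetical and are not taken from the actual DHS model.

```python
# Hub-and-spoke data integration: one transform per source into a common
# model, versus a translator for every ordered pair of systems.
# Every field name below is invented for illustration.

def from_city(rec):
    # A city might publish {"name", "lat", "lon"} in its own schema.
    return {"label": rec["name"], "y": rec["lat"], "x": rec["lon"]}

def from_state(rec):
    # A state feed might use different keys for the same information.
    return {"label": rec["title"], "y": rec["northing"], "x": rec["easting"]}

def pairwise_translators(n):
    # Without a common model: one translator per ordered pair of systems.
    return n * (n - 1)

def hub_translators(n):
    # With a common model: one transform per source.
    return n

print(pairwise_translators(10), "vs", hub_translators(10))  # 90 vs 10
```

With ten data providers the pairwise approach needs 90 translators; a shared model needs ten, one per provider, which is the economy Bacharach is pointing at.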
It takes an enormous number of people to manage data, and collect it from the various counties, municipalities and states, Bacharach pointed out. There isn’t enough money to do this. The answer is the DHS Data Model, because “they need something now. As an example, the City of Charlotte serves street center line data out of its own servers on the National Map, so when they make a change it doesn’t have to go through North Carolina and the federal government and whoever else has to be included. It’s just there. That’s the direction we need to be going.”
“Think of bandwidth, think of serving capacity and think of data,” said Bacharach. “As bandwidth gets faster, people will say, I’d use the National Map, but it is so slow compared to Google Maps. The National Map has one machine sitting in Sioux Falls, South Dakota; it’s got a big fat pipe, but it’s only one machine, and Google has half of its 500,000 servers serving maps. Is it any wonder Google Maps is faster?”
The government needs to take this factor into account, said Bacharach: right now, if there were a catastrophe in a big city and that city was serving live vector data, it would have a hard time serving 500 emergency responders who need the data immediately. Cities also don’t yet have the security framework in place that would keep others out.
Also, in an emergency, many people want to access maps immediately. “If you’ve got a little server and 50,000 people trying to get data out of it, the people who need the data can’t get to it. The mechanisms to be able to control that data by role by user, plus serving the data are being developed now.”
Bacharach said that the DHS Data Model v.2 is already being used by government agencies that rely on the OGC interfaces and GML encodings to move data from one system to another. The Department of Defense has mandated that these standards be used, and all the large vendors support the core OGC standards anyway.
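The GML-encoded data movement Bacharach describes can be sketched with Python’s standard library. In the example below, the `gml` namespace URI is the real OGC one, but the `ci:Facility` feature type, its namespace, and the sample feature are invented for illustration; they are not part of any actual DHS or OGC schema.

```python
import xml.etree.ElementTree as ET

# Hedged sketch of consuming a GML-encoded feature, as might arrive from
# an OGC web service. The "ci" feature type and namespace are hypothetical.

NS = {"gml": "http://www.opengis.net/gml",
      "ci": "http://example.org/critical-infrastructure"}

SAMPLE = """<gml:featureMember xmlns:gml="http://www.opengis.net/gml"
    xmlns:ci="http://example.org/critical-infrastructure">
  <ci:Facility>
    <ci:name>Peace Arch Border Crossing</ci:name>
    <ci:location>
      <gml:Point srsName="EPSG:4326">
        <gml:pos>49.0025 -122.7573</gml:pos>
      </gml:Point>
    </ci:location>
  </ci:Facility>
</gml:featureMember>"""

def parse_facilities(xml_text):
    """Extract (name, lat, lon) tuples from a GML feature fragment."""
    root = ET.fromstring(xml_text)
    out = []
    for facility in root.findall("ci:Facility", NS):
        name = facility.find("ci:name", NS).text
        lat, lon = (float(v) for v in
                    facility.find(".//gml:pos", NS).text.split())
        out.append((name, lat, lon))
    return out

print(parse_facilities(SAMPLE))
```

Because the encoding is a published standard rather than a vendor format, any receiving system can write this kind of parser once and consume features from every compliant source.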
“When you realize three or four government civilians and four or five contractors are the sum total of the Geospatial Management Office of DHS, an organization with the responsibilities they’ve got? They’re pretty busy people,” Bacharach noted. “If they weren’t trying to stand an agency up they’d still be super busy.”
In summary, Bacharach said the DHS Data Model effort “is not incompatible with OGC at all. Right now they’re focusing on the database, and as it moves out and as more people get hold of it and use it, I’m confident it will go the open standardized way.”
Top News of the Week
Pitney Bowes MapInfo announced the launch of Jobs in the Field, the latest addition to its suite of mobile working software for Confirm, which the company bills as the most sophisticated and comprehensive suite of specialist asset and infrastructure management functionality on the market.
The Jobs in the Field module allows work instructions produced in Confirm to be transferred to handheld devices as jobs to be completed out in the field. The seamless integration of this functionality allows mobile operatives to receive new instructions, report on the progress of the work and input completed work details so payment claims may be made.
-- Susan Smith, GISCafe.com Managing Editor.