GISCafe Weekly Review March 31st, 2014

Retail and the ArcGIS Platform
March 31, 2014  by Matt Sheehan


We overuse the word exciting. But these truly are great times to be working in the location services and GIS fields. Not only is location technology being used more widely and by an ever more diverse range of users, but new sectors and verticals are emerging. Retail in particular presents a fascinating world of opportunity.

Esri has put considerable focus on retail GIS.


Many of our clients use desktop GIS applications such as ArcMap. Some have developed Web GIS applications. One common theme across all the clients we work with is an increasing need to integrate their GIS data and applications.

Data! More data! Still more data! The exploding appetite for enriching GIS datasets with more and better data to support decision making is contributing to the rising demand for custom datasets. It is clear that richly attributed, custom datasets will soon become the “coin of the GIS realm.” In addition to the increasing availability of precision data, the demand for more, better, and faster GIS data conflation is also driven by the National Geospatial-Intelligence Agency’s recent directive to suppliers to aggregate existing data to meet stringent NGA requirements.

New tools and different methods are now required to create the comprehensively attributed, custom datasets that are replacing the “good enough” datasets of the past. Some early adopters in the GIS market already use a new technology, automated conflation, or “intelligent aggregation” of GIS data, and believe it will give them a competitive advantage that separates the winners from the losers. This article reviews the concept of conflation and two toolkits that can help any GIS professional be more efficient and accurate.
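To make the idea concrete, here is a minimal, hypothetical sketch of attribute conflation in Python: point features from an “incoming” dataset are matched to the nearest feature in a “base” dataset within a distance tolerance, and their attributes merged. The datasets, field names, and 25-metre tolerance are illustrative assumptions only; real conflation tools use far more sophisticated geometric and semantic matching.

from math import radians, sin, cos, asin, sqrt

def haversine_m(lon1, lat1, lon2, lat2):
    """Great-circle distance between two lon/lat points, in metres."""
    lon1, lat1, lon2, lat2 = map(radians, (lon1, lat1, lon2, lat2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def conflate(base, incoming, max_dist_m=25.0):
    """Merge attributes from `incoming` features onto the nearest `base`
    feature within `max_dist_m`; unmatched incoming features are appended.
    Each feature is a dict with 'lon', 'lat' and arbitrary attribute keys."""
    merged = [dict(f) for f in base]
    for inc in incoming:
        nearest, best = None, max_dist_m
        for f in merged:
            d = haversine_m(inc["lon"], inc["lat"], f["lon"], f["lat"])
            if d <= best:
                nearest, best = f, d
        if nearest is not None:
            # Enrich the matched base feature with any attributes it lacks.
            for k, v in inc.items():
                nearest.setdefault(k, v)
        else:
            merged.append(dict(inc))
    return merged

# Hypothetical usage: enrich a store location with third-party footfall data.
stores = [{"lon": -111.89, "lat": 40.76, "name": "Store A"}]
footfall = [{"lon": -111.8901, "lat": 40.7601, "weekly_visits": 5400}]
print(conflate(stores, footfall))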

Fragments of the missing Malaysia Airlines Flight MH370 are believed to have been found in the Indian Ocean, according to a press conference by Malaysian prime minister Najib Razak. Inmarsat satellite data was instrumental in locating the debris. The event has baffled technologists: the plane disappeared from radar two weeks ago, and even now the evidence is not conclusive that the debris belongs to the missing airliner. It is further proof that all the technology in the world cannot guarantee our safety, and that tracking systems can be manually switched off by someone determined to make a plane disappear.

Right after the aircraft disappeared, Inmarsat joined the search for the plane. Although the main Aircraft Communications Addressing and Reporting System (which would usually transmit the plane’s position) was turned off, one of Inmarsat’s satellites continued to pick up a series of automated hourly ‘pings’ from a terminal on the plane, which would normally be used to synchronize timing information.

Inmarsat analyzed these pings and was thereby able to establish that MH370 continued to fly for at least five hours after leaving Malaysian airspace, and that it had flown along one of two ‘corridors’, one arcing north and the other south. As various news reports explained, this information came from the Doppler effect: the change in received frequency caused by the motion of the satellite in its orbit, which yielded two predicted paths for the flight, one northerly and one southerly. Inmarsat engineers produced this prediction, something that had never been done before, according to Chris McLaughlin, senior vice president of external affairs at Inmarsat. He said the technology to track the position and speed of an aircraft could be made available on planes for less than a dollar an hour. The plane was reportedly flying at a cruising height above 30,000 feet.
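The underlying physics is the classical Doppler relation: the offset between the received and nominal carrier frequencies is proportional to the line-of-sight speed between the transmitter and the satellite. A minimal sketch follows; the 1.6 GHz L-band carrier and the 200 Hz offset are purely illustrative assumptions, not Inmarsat’s actual figures or method.

C = 299_792_458.0          # speed of light, m/s

def radial_velocity(observed_hz, nominal_hz):
    """Line-of-sight speed implied by a Doppler shift (non-relativistic).
    Positive values mean the transmitter is closing on the satellite."""
    return C * (observed_hz - nominal_hz) / nominal_hz

# Illustrative numbers only: an L-band carrier near 1.6 GHz and a
# measured offset of a couple of hundred hertz.
nominal = 1.6e9
observed = nominal + 200.0
print(f"radial velocity ~ {radial_velocity(observed, nominal):.1f} m/s")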

Although this information was given to Malaysian officials by March 12, the Malaysian government did not acknowledge it publicly until March 15, according to the Wall Street Journal. The delay has been sharply criticized in the press and is thought to have cost considerable time in the search for the lost aircraft.


Inmarsat’s engineers continued their analysis of the pings and produced a much more detailed Doppler effect model for the northern and southern paths. They compared these models with the trajectories of other aircraft on similar routes and were able to confirm a match between Inmarsat’s predicted southerly path and readings from other planes on that same route.

These satellite pings, coupled with assumptions about the plane’s speed, made it possible for Australia and the US National Transportation Safety Board to narrow the search area to just 3 per cent of the southern corridor on March 18th.

“We worked out where the last ping was, and we knew that the plane must have run out of fuel before the next automated ping, but we didn’t know what speed the aircraft was flying at – we assumed about 450 knots,” said McLaughlin. “We can’t know when the fuel actually ran out, we can’t know whether the plane plunged or glided, and we can’t know whether the plane at the end of the time in the air was flying more slowly because it was on fumes.”
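That reasoning can be made concrete with a back-of-the-envelope bound: with hourly pings and an assumed cruise speed, the aircraft cannot lie more than one hour’s flight beyond the arc defined by its final ping. A rough sketch, where the one-hour interval and 450-knot assumption come from the quote above and everything else is illustrative:

KNOT_TO_KMH = 1.852        # one nautical mile per hour, in km/h

def max_range_km(speed_knots, hours_between_pings=1.0):
    """Upper bound on how far the aircraft could travel between two
    automated hourly pings at an assumed ground speed."""
    return speed_knots * KNOT_TO_KMH * hours_between_pings

# At the assumed 450 knots, the aircraft can be at most roughly
# 833 km beyond the arc of its final ping.
print(f"{max_range_km(450):.0f} km")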

Inmarsat handed the analysis to the UK Air Accidents Investigation Branch (AAIB) this week. So far, the cause of the crash remains unknown.


Starting any software project can be daunting, and GIS in particular presents many confusing choices. With this in mind we have just written a free guide, Web and Mobile GIS Projects from Idea to Reality in 3 Easy Steps, which should help kick-start your Web or mobile GIS project:


