After decades—centuries, even—the question of whether or not life forms from other galaxies occasionally visit Earth remains unanswered. For the latest attempt to unravel this age-old mystery, the National Geographic Channel assembled a team of trained investigators to visit several sites where unidentified flying objects allegedly have been sighted. The network aired the investigations in its Chasing UFOs series in summer 2012.
Ben McGee (left), an investigator on the National Geographic Channel Chasing UFOs series, sets up a survey grid using the Topcon IS-3 imaging station while Scott Langbein, Topcon’s director of product marketing, provides technical assistance.
Whether the forensic investigations proved that some UFOs are actually spaceships transporting alien life forms is up to viewers to decide. As is the case with all programming on the network, viewers learn something about the planet in the process of being entertained. The team made a discovery of its own as well, learning something new about terrestrial positioning technology after being equipped with an instrument that combines advanced imaging with high-accuracy surveying capabilities.
ESRI is one of the best-known names in geospatial intelligence and certainly one of the largest forces for standards and systems unification in this area, and it now has its sights firmly set on making cloud technologies work for geospatial data.
Many industries, and many people who use geospatial data in their domestic lives, are already enjoying the cost-reducing, de-localised, multi-platform benefits of cloud-based systems. But when dealing with vast datasets that must move at high speed and with the utmost security, how can the cloud outperform current data architecture models?
DGI spoke to John Day, Director of Defence Business Development for ESRI at DGI 2012 to find out.
Bob Murrett has had one of the most distinguished careers in the geospatial intelligence arena, including serving as Director of NGA, Director of Naval Intelligence, and Vice-Director for Intelligence of the Joint Staff. Now a professor at the Institute for National Security and Counterterrorism and a member of the faculty of the Maxwell School at Syracuse University in New York, he remains very much at the forefront of geo and multi-int.
Bob shared his thoughts at DGI 2012 on exactly what and how the geospatial intelligence landscape is changing and what developments we can expect to see over the next 12 months and beyond…
Normally, DGI would never condone torture under any circumstances, but when it comes to squeezing the maximum information out of your geospatial dataset, they are all for it!
Making the most of your pixels, that is, gathering as much data as you can from your imagery, is one of the most important factors in multi-int today, and Digital Globe is on the frontier of research and development of new technologies. But before marching forward, you need to know your past.
To explain more, DGI spoke to Jack Hild, VP of US Defence Strategy at Digital Globe.
Geospatial intelligence is most commonly linked to defence and the military, but more and more civilian organisations are harnessing the power of geospatial information to radically improve their capabilities. Unsurprisingly, the police force is one such area.
An Associate Professor of Criminology, Law, Society & Planning, Policy & Design at the University of California, Irvine speaks to DGI’s Online Editor Dan Mellins-Cohen about the innovative and highly successful implementation of geospatial intelligence by the LAPD not only to catch criminals, but to proactively prevent crime.
Newly released system of integrated, industry-specific software solutions boosts work efficiency across functional and geographic boundaries
One industry that is characterized by continuous improvement is information technology. An IT innovation for business that has begun to penetrate the mainstream of business activity is enterprise cloud computing, which utilizes groups of computers in various locations that aggregate data storage as well as Internet gateways for network access from any location. The result is a level of data processing power and storage that rivals those of local or wide-area networks—accessible to work groups that are spread out across countries, continents or the entire globe.
Field crews can store project data in a subscriber account that is accessible to other work groups, rather than in separate, self-contained field devices.
We recorded 18 video interviews at the Esri User Conference in San Diego this week. The conference was extremely well attended, with more than 15,000 attendees, more than 200 exhibitors, and hundreds of great presentations. The weather in San Diego was perfect, making the conference even more enjoyable.
The following video interviews will be published in the coming weeks as we complete the editing process:
Chad St. Amand, GIS Manager of Tembec, Timmins, Forest Resource Management, presents how using lidar can improve operational planning for forestry managers at the Esri Forestry GIS Conference in May 2012. This was the most popular presentation at the conference.
ScribeKey, LLC (www.scribekey.com), a Boston-area GIS data integration company specializing in metadata and data cleansing, has released version 1 of the GIS Data Profiler. The profiler, an Esri ArcGIS 9 or 10 add-on, captures information describing the structure, contents, and meaning of geospatial datasets. This information, which includes highly detailed, concise descriptions of feature classes, tables, data fields, value lists, indexes, metadata, geodatabase relationships, subtypes, domains, and more, is saved by default in an easy-to-use and easy-to-share MS Access database. Profile information can also be saved to larger back-end databases such as SQL Server or Oracle. The profiler can process Shapefile, Personal Geodatabase, File Geodatabase, SDE, and SDC datasets.
Data profilers, typically found in enterprise business intelligence and decision support environments, are used to help with a wide variety of data-centric projects including dataset revision tracking, data QA/QC, application development, migration and conversion ETL, schema matching, metadata and data dictionary development, and more. Essentially, data profiling provides the ability to generate a highly structured and detailed table of contents and index for a database. Profiling makes it much easier to search through, become familiar with, and make the best use of even the largest data stores.
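To make the idea concrete, here is a minimal sketch of field-level profiling in plain Python. This is an illustration of the general technique, not ScribeKey's actual implementation; the table name and attribute records below are invented, loosely modelled on road attributes.

```python
def profile_table(name, records):
    """Build a per-field summary: inferred type, null count, distinct values."""
    profile = {"table": name, "row_count": len(records), "fields": {}}
    field_names = records[0].keys() if records else []
    for field in field_names:
        values = [r.get(field) for r in records]
        non_null = [v for v in values if v is not None]
        profile["fields"][field] = {
            # Infer the type from the first non-null value seen
            "type": type(non_null[0]).__name__ if non_null else "unknown",
            "null_count": len(values) - len(non_null),
            "distinct_count": len(set(non_null)),
            "sample_values": sorted(set(non_null))[:5],
        }
    return profile

# Hypothetical feature-class attribute rows (invented for illustration)
roads = [
    {"fid": 1, "name": "Main St", "mtfcc": "S1400", "lanes": 2},
    {"fid": 2, "name": "Oak Ave", "mtfcc": "S1400", "lanes": None},
    {"fid": 3, "name": "I-90",    "mtfcc": "S1100", "lanes": 4},
]

p = profile_table("roads", roads)
print(p["row_count"], p["fields"]["mtfcc"]["distinct_count"])  # prints: 3 2
```

A real profiler does the same kind of summarisation against the geodatabase schema itself, then persists the results to a database so they can be searched and compared across dataset revisions.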
Profile Database Form Describing Feature Classes and Tables for US Census TIGER/Line Data
ShareMap is gradually moving out of its beta stage. The feature set is almost complete, and dozens of maps created with ShareMap already enrich multiple Wikipedia articles.
Two major features have been introduced recently:
OpenStreetMap (OSM) is an enormous source of geospatial data, and an OSM import mechanism was one of ShareMap's first features. But sometimes OSM data is not enough, especially when creating a historical map. For example, if we want to map the dismantled tramway system in Cleveland, we cannot find the track positions in OSM. We can, however, reuse a pre-war map of the system (whose copyright has expired, so it is fully compatible with the Creative Commons license). We can treat this old map as an overlay on our interactive map, but first we have to calibrate it. In simple terms, calibration is just selecting several corresponding calibration points on the raster map (the old one) and on the interactive map. Once the points are selected, the system warps the raster to fit. Under the hood, the well-known open-source GDAL library is used. This video describes the map calibration process in detail:
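For readers curious about what calibration actually computes, here is a minimal sketch using numpy: a least-squares affine fit from control-point pairs, which is the simplest of the transform models GDAL can apply when warping a raster against ground control points. The pixel and map coordinates below are invented for illustration, not taken from the Cleveland map.

```python
import numpy as np

def fit_affine(pixel_pts, map_pts):
    """Least-squares fit of an affine transform mapping pixel -> map coords."""
    px = np.asarray(pixel_pts, dtype=float)
    mp = np.asarray(map_pts, dtype=float)
    # Design matrix rows are [x, y, 1]; solve for map-x and map-y together.
    G = np.hstack([px, np.ones((len(px), 1))])
    coeffs, *_ = np.linalg.lstsq(G, mp, rcond=None)
    return coeffs  # 3x2 matrix: x-coefficients, y-coefficients, offsets

def apply_affine(coeffs, pixel_pt):
    """Transform one pixel coordinate into map coordinates."""
    x, y = pixel_pt
    return tuple(np.array([x, y, 1.0]) @ coeffs)

# Hypothetical calibration points: corners of a scanned map and their
# real-world longitude/latitude on the interactive map.
pixel_pts = [(0, 0), (1000, 0), (0, 800), (1000, 800)]
map_pts   = [(-81.8, 41.6), (-81.6, 41.6), (-81.8, 41.4), (-81.6, 41.4)]

c = fit_affine(pixel_pts, map_pts)
print(apply_affine(c, (500, 400)))  # approximately (-81.7, 41.5)
```

With three or more well-spread points the fit is determined; extra points average out the user's picking error. Old maps with non-uniform distortion need higher-order polynomial or thin-plate-spline warps, which GDAL also supports.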