The GIS Lens

Collecting Oceans Of Data – Problem Or Opportunity?

February 24th, 2014, by Skip Levens, Director of Technical Marketing, Quantum
Sensor, packet, video and signal data is not only all-digital and collected around the clock, but it is also scaling up in size and resolution. The resulting massive surge in storage required to save it all is either a problem for your organization – or a major opportunity.

Plotting geophysical trends, tracking industrial changes or monitoring surveillance imagery requires both broad-area spectral analysis and high-resolution digital images. Ideally, you can combine multiple inputs for richer analysis to help make the most informed decision as close to real time as you can get. Better imagery and capture techniques from ever-higher-resolution platforms, available around the clock, mean that your raw data ingest is massive. Given competitive pressures, your need to collect, route and make sense of that data quickly is sharper than ever. Throwing away data that you can't handle is not an option, but neither is pumping data into an archive with no way to retrieve and manage it intelligently.

Handling the influx of data requires a scalable storage platform that allows multiple systems to work against a common pool of data. High-performance shared file systems can serve as the glue that binds ingest, processing and distribution systems together. A shared data pool can allow direct data access at Fibre Channel speeds, without the scalability and performance bottlenecks found in most NAS solutions. The architecture also provides the flexibility to add systems as application requirements and customer demands evolve. In some instances, streamlining production workflows might be sufficient, but in most cases, creating imagery products is only half of the goal.

The Need for Content Retention

In industries where trend analysis is increasingly important, companies that manage geospatial imagery also need to keep their data sets stored and readily accessible as data analysis demands evolve. This could apply to changes in plankton levels in our oceans, shifting global environmental patterns that could affect the health of food supplies, or activities in a foreign nation that could affect national security. It also applies to natural resources, where petabytes of location-referenced seismic data collected from around the world must be preserved for the petroleum industry's current and future exploration and extraction activities. Spatial as well as temporal comparisons are needed to answer important questions, and with the increasing resolution of imagery and the growing complexity of geospatial data types, retention and data management become critical.

Automated Data Movement, Simplified Data Access

A high-performance shared storage platform, such as Quantum StorNext, can ingest and process an ever-growing amount of data in a timely fashion and store it in a simplified, standardized long-term repository; a small sketch of this kind of policy-driven data movement closes this post. With a modern architecture, companies can scale faster and evolve with the next generation of visualization and interpretation techniques. The goal is to produce quality imagery and provide trend data that enables better planning, smarter analysis and quick action on evolving geophysical and geopolitical conditions. A strategic implementation like this may not always anticipate demand, but you can now safely plan for it – and turn what is a problem for other organizations into your major opportunity.
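To make the automated data movement idea concrete, here is a minimal Python sketch of one policy-driven tiering pass. This is an illustration only, not StorNext code or its API: the mount points /mnt/ingest and /mnt/archive, the 30-day age policy, and the SQLite catalog are all assumptions invented for the example. The point is the pattern: cold files migrate to cheaper storage on a schedule, and a catalog records where each file went, so nothing lands in the archive without a way to find it again.

import shutil
import sqlite3
import time
from pathlib import Path

# Hypothetical tier locations -- substitute your own mount points.
INGEST_TIER = Path("/mnt/ingest")    # fast, shared ingest/processing pool
ARCHIVE_TIER = Path("/mnt/archive")  # cheaper long-term repository
AGE_THRESHOLD = 30 * 24 * 3600       # policy: move files untouched for 30 days
CATALOG = "catalog.db"               # index that keeps archived data findable

def open_catalog():
    """Create (or open) a catalog mapping each file to its current tier."""
    db = sqlite3.connect(CATALOG)
    db.execute(
        "CREATE TABLE IF NOT EXISTS files "
        "(name TEXT PRIMARY KEY, tier TEXT, archived_at REAL)"
    )
    return db

def tier_old_files(db):
    """Move cold files to the archive tier and record where they went."""
    now = time.time()
    for path in INGEST_TIER.rglob("*"):
        if path.is_file() and now - path.stat().st_mtime > AGE_THRESHOLD:
            dest = ARCHIVE_TIER / path.relative_to(INGEST_TIER)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(dest))
            db.execute(
                "INSERT OR REPLACE INTO files VALUES (?, ?, ?)",
                (str(path.relative_to(INGEST_TIER)), "archive", now),
            )
    db.commit()

if __name__ == "__main__":
    db = open_catalog()
    tier_old_files(db)

In practice, a platform like StorNext applies policies of this kind within the file system itself, so applications keep working against a single namespace and never have to chase data across tiers; the sketch externalizes that logic only to show the moving parts.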