Analytical Graphics Inc. (AGI) is previewing the capability to stream high-resolution 3D models into any Cesium-based client.
A provider of Spatial IT solutions, Boundless, has released the newest version of its enterprise geospatial software platform, OpenGeo Suite 4.6. The Suite powers web, mobile, and desktop maps and applications across organizations large and small, and the new release improves performance, reliability, and styling.
GISCafe Voice spoke with Boundless’ chief marketing officer, Sean Brady, to find out more about the platform release:
GISCafe Voice: What would be an example of cost differential using OpenGeo Suite 4.6 rather than a proprietary geospatial solution?
Sean Brady: There are no traditional license costs associated with OpenGeo Suite, either client-side or server. As a result, as you scale deployments (across both IT environments as well as users) organizations incur no incremental costs other than underlying infrastructure costs. Proprietary geospatial solutions incur license costs on both a per-user basis as well as the number of cores used on the server side, so costs increase with scale.
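Brady's scaling argument can be sketched with a quick cost model. The license prices and infrastructure figures below are invented purely for illustration; they are not Boundless or vendor pricing:

```python
# Hypothetical annual cost comparison as a deployment scales.
# All dollar figures are invented for illustration only.

def open_source_cost(servers, users, infra_per_server=5000):
    """Open-source stack: only underlying infrastructure scales."""
    return servers * infra_per_server

def proprietary_cost(servers, users, cores_per_server=8,
                     infra_per_server=5000, per_core_license=2000,
                     per_user_license=300):
    """Proprietary stack: infrastructure plus per-core and per-user licenses."""
    return (servers * infra_per_server
            + servers * cores_per_server * per_core_license
            + users * per_user_license)

# Small pilot vs. scaled-out deployment:
for servers, users in [(1, 10), (4, 200)]:
    print(servers, users,
          open_source_cost(servers, users),
          proprietary_cost(servers, users))
```

Under these assumed numbers, scaling from 1 server/10 users to 4 servers/200 users grows the open-source bill only by added infrastructure, while the proprietary bill grows with every core and seat as well.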
GISCafe Voice: When you say “anyone” can build maps, etc. do you mean anyone with certain geospatial qualifications?
Sean Brady: This is the benefit of what we at Boundless call “Spatial IT”. It means Spatial no longer needs to require special qualifications, because IT professionals familiar with database technologies like PostgreSQL and web development languages like CSS can build and style maps. As an industry, if we want geospatial to grow in adoption by traditional market verticals, we have to make the technologies more accessible to the IT shops that are already in place without needing to hire scarce geospatial experts.
GISCafe Voice: Do organizations need IT/geospatial departments to get the suite implemented in their companies?
Sean Brady: Again, as an industry, if we want geospatial to grow beyond specialized geospatial shops, we have to make it accessible to other parts of the business. Organizations still need IT, web, or application development expertise to leverage the power of OpenGeo Suite, but those resources are available in much greater quantity and are already invested in as strategic efforts.
GISCafe Voice: Do you have examples of deployment cases?
Sean Brady: You can find cases on our website underneath our various product offerings. We have case studies posted about deployments at organizations like NOAA, TriMet, and Asheville, North Carolina.
GISCafe Voice: Are you moving into other market areas, if so, which ones?
Sean Brady: In the spirit of Spatial IT, we’re working to make our software accessible to multiple market areas. If you visit our website at http://boundlessgeo.com/resources/, you’ll note multiple market verticals we are currently targeting and working with.
GISCafe Voice: What do you think is the most profound offering of Suite 4.6 that differentiates it from competing Open Source geospatial software?
Sean Brady: In the open source community we like to think we’re not competing with other open source technologies. A rising tide lifts all boats in our space: the more people that use open source geospatial technology, the better we all do. It’s why we’re committed to OGC standards and interoperability – if you wish to use something different at a certain layer because you have different objectives, then please, go ahead. Where Boundless works to differentiate is by responding to what we perceive as gaps in what’s out there – our improved Composer offering in OpenGeo Suite 4.6 addresses market needs for web-based map design and styling tools using the simplified YSLD syntax.
New capabilities and enhancements in Version 4.6 include:
An enhanced OpenGeo Suite Composer, which allows anyone to build and style maps by making it easier to add data to GeoServer, style layers, and publish to the web.
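For readers unfamiliar with YSLD, the syntax Composer uses is a YAML-based alternative to GeoServer's XML SLD styling language. A minimal style might look like the following sketch (the layer name, colors, and attribute are hypothetical examples, not taken from the release):

```yaml
# Hypothetical YSLD style: draws a polygon layer with a gray fill
# and labels each feature by its "name" attribute.
title: parcels-example
feature-styles:
- rules:
  - symbolizers:
    - polygon:
        fill-color: '#D3D3D3'
        stroke-color: '#333333'
        stroke-width: 1
    - text:
        label: ${name}
        font-size: 12
```

The flat key-value form is the point: an IT professional comfortable with YAML or CSS can read and edit it without SLD's verbose XML nesting.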
In an interview with Todd Steiner, marketing director, Geospatial Imaging and Optical Solutions, and Tim Lemmon, marketing director, Geospatial Software Solutions, GISCafe Voice discussed Trimble's recent announcement of an expanded portfolio of geospatial solutions for surveyors, engineers and mapping professionals. Highlights include new total stations, a new GNSS receiver and new field and office software features. According to company materials, the solutions save time, reduce costs, streamline workflows and produce high-quality geospatial deliverables across a wide range of industries.
I’ve often thought that weather prediction was the only profession where the professionals could be wrong 90% of the time and still get paid a good salary. Perhaps the new application programming interfaces from Exelis will change all that for the future of weather and meteorology.
Technology for electric utilities is not always particularly exciting, but Space-Time Insight raises the bar for the industry in terms of providing virtual reality to actually access and analyze situations arising in utility facilities.
CoreLogic recently released new wildfire data, the CoreLogic Wildfire Risk Analysis, which states that nearly 900,000 single-family homes across 13 states in the western U.S. are currently designated at “High” or “Very High” risk for wildfire damage, representing a combined total reconstruction value estimated at more than $237 billion. Of the total homes identified, just over 192,000 homes fall into the “Very High Risk” category alone, with total reconstruction cost valued at more than $49.6 billion. Other categories include “Moderate” and “Low” risk. GISCafe spoke with Dr. Tom Jeffery of CoreLogic to find out the scoop on this important new information for homeowners, insurance companies and other stakeholders.
Dr. Tom Jeffery: In the past we used what’s called the assessed property value, which is based on tax assessment. We’ve changed that so it matches what we do with storm surge, which is the reconstruction value of these homes. This is the cost of labor and materials in each location to replace the structure that would be lost in the event a wildfire destroyed the whole thing. California is right at the top of the list, in most cases, because of wildfire risk throughout the state, but Colorado and Texas are also usually ranked very high, and they continue to be in this report. There is one overarching factor that pops out whenever we do these reports: when we see the results for the first time, we see how many homes are at risk across the U.S. and what those values are, and they are exceptionally high in those areas.
GISCafe Voice: What determines what states are ranked high?
TJ: Because you have large population centers in California, Texas and Colorado, and those urban areas continue to grow, the pace of growth increases from year to year. All three of those states continue to see urban expansion, and new homes are constructed further and further out into areas that have higher risk. A lot of people combined with a lot of risk is what puts those three at the top of the list.
GISCafe Voice: How do you assess the risk score?
TJ: The risk score itself is really based on several factors we combine. The first is the risk on the property, and that determines our categories of moderate, high and very high. It’s based on what fuel is there: the vegetation, whether there’s a change of terrain, whether there’s a steep slope that enhances the risk. But for the score we actually want to look outside the property boundary at risk in close proximity to that property. If you own a property, maybe you have a nice manicured lawn and decorative trees, and you don’t have any risk on the property. But just outside the boundary there could be a lot of chaparral, in Southern California for instance, or a dense conifer or pine forest in other areas. If that exists really close to your property, that raises the risk value.
So we measure the distance from a property to what’s around it in terms of risk, add that to the category of risk on the property, and that gives us the score. The score is numeric, 0-100, and anything 80 and above is extremely high risk. We have those broken out in the tables. If you look at the U.S. as a whole, there are about 192,000 properties listed as very high, and that’s looking only at the risk on the property. As soon as we look at the score, in the 81-100 range, we go from 192,000 all the way up to 1.1 million. Those homes on the urban edge pushing out into the wilder areas are the ones the score is picking up, and that’s why the count jumps from 192,000 to 1.1 million. The homes that don’t have the risk within their own boundaries but have it just outside are the ones most at risk.
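The two-part logic Dr. Jeffery describes, an on-property category plus a proximity adjustment for fuel just outside the boundary, can be sketched as follows. The category point values, distance decay, and thresholds here are invented for illustration; CoreLogic's actual model is proprietary:

```python
# Illustrative sketch of a 0-100 wildfire risk score that combines
# on-property risk with proximity to off-property fuel.
# All point values and the 500 m decay are invented, not CoreLogic's model.

CATEGORY_POINTS = {"low": 10, "moderate": 30, "high": 55, "very high": 80}

def risk_score(on_property_category, distance_to_fuel_m):
    """Base points from the on-property category, plus a bonus that
    fades from 40 points at 0 m to 0 points at 500 m from dense fuel."""
    base = CATEGORY_POINTS[on_property_category]
    proximity = max(0.0, 1 - distance_to_fuel_m / 500) * 40
    return min(100, round(base + proximity))

# A manicured "low" parcel 25 m from chaparral scores far above its
# on-property category alone; a distant one keeps only its base points.
print(risk_score("low", 25))
print(risk_score("low", 600))
print(risk_score("very high", 10))
```

This mirrors why the count jumps from 192,000 to 1.1 million once proximity is scored: parcels with little risk inside their own boundary still climb sharply when heavy fuel sits just outside it.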
GISCafe Voice: What are insurance companies concerned with when they consult with you?
TJ: Most of those discussions with insurance company representatives revolve around mitigation: how can homeowners reduce the risk on the property, and which properties need it? Insurance companies have to write these policies, and there are so many high-risk policies they can’t ignore them. More and more, what they’re trying to do is identify the high-risk properties, then identify ways they can talk to landowners and homeowners about clearing brush around the homes, making sure there isn’t a wood shake roof, and all the other things that reduce the risk on higher-risk properties. It helps the homeowner in the long run because there’s less risk to the home, and it also helps the insurance companies, so both benefit from what homeowners can do to reduce risk on the property.
In an interview, Kalyn Sims, vice president of Public Safety Products at Intergraph SG&I, and Russ Johnson, director of Public Safety for Esri, discussed the latest collaboration between the two companies to enhance geospatial capabilities for public safety and security agencies. Through the collaboration, the companies will work to more tightly align Intergraph’s Computer-Aided Dispatch system, I/CAD, with Esri’s ArcGIS Platform.