Achieving Image Analysis in a Post-COVID World
GISCafe Voice, June 12th, 2020, by Susan Smith

Susan Smith has worked as an editor and writer in the technology industry for over 16 years. As an editor she has been responsible for the launch of a number of technology trade publications, both in print and online. Currently, Susan is the Editor of GISCafe and AECCafe, as well as those sites' newsletters and blogs. She writes on a number of topics, including but not limited to geospatial, architecture, engineering and construction. As many technologies evolve and occasionally merge, Susan finds herself uniquely situated to cover diverse topics with facility.
As part of its Geospatial Distancing series, L3Harris Geospatial recently hosted a panel discussion webinar entitled "How will image analysis get done in a post-COVID world?"
The question arose: if remote work is the new reality, what are the challenges for an industry already drowning in big data? The previous pace of small, long-term incremental steps within the geospatial industry is no longer enough to keep up with the fast-tracked technological shifts happening right now on a global scale. Panelists Beau Legeer (Esri), Zachary Norman (L3Harris Geospatial) and Andrew Fore (L3Harris Geospatial) discussed what it is like to work in the geospatial industry now, how data is stored and consumed, and how to adapt in an industry that consumes and analyzes as much data as this one does.

Beau Legeer: Over the years in geospatial, our workflows have remained relatively consistent even as we have tried to embrace new technologies. A lot of us were coming into the office, downloading data, and consuming it that way for image analysis and exploitation. Then everybody went home. At home you don't have that desktop PC with lots of power and space, so you can't work that way there. What I've seen is people go home, still perform geospatial analysis, and embrace new technologies. We're coming out of the really strict lockdowns and realizing life did go on; we were able to do our analysis with similar vigor and results. Is this the new normal? Can we keep embracing techniques and patterns that don't require us to be in the office, and how can we help people do that?

Apart from geospatial generally, thinking about remote sensing, raster imagery, and really large datasets, do you see this as an opportunity to take the plunge into the cloud? A couple of years ago people weren't accessing it.

Legeer: Yes, it's happening because it has to happen. The cloud is available anywhere, and it's a great place for organizations and individuals to put data. It doesn't require significant hardware, just comms. We were sent home at a time when comms were pretty good; a lot of us have home infrastructure that handles streaming video, and if we can do that, we can stream information from the cloud. The question becomes how to actually consume that data from the cloud, through desktop or web-based analysis tools. It forced the hand of something the industry had been trying to do for so long.

Andrew Fore: We see that with our academic customers. Professors whose labs have historically been in person are challenged to teach online courses that didn't already have that infrastructure set up, for example by working with Amazon to get onto AWS. A lot of universities already have online systems, but there are a lot of geospatial courses that are not online. Penn State had to go fully online. That was an emergency measure to make sure people could work from home temporarily. How do you extend it if people want to keep working from home, or offices reduce their space? How do you make that realistic?

Zachary Norman: People say, why not just use remote computers? The easiest way to get started with the cloud or a remote desktop is simply to remote into a machine, and that's a great, easy starting point. We did that beforehand because we had access to large machines and deep learning workloads, and we will continue to do it going forward. But the world of not being in the office is the remote desktop experience, and that is not ideal for some of the visualization workflows we do, where we pan and zoom through datasets, even as technology and bandwidth get faster. Remote desktop can get you there, but this situation forces us to think about moving datasets into the cloud, so you can leverage that data from anywhere, using highly available storage. Licensing is taken care of because you're accessing the desktops you're accustomed to. Getting into new patterns that embrace large data and cloud storage is a good goal for us. Those who were already there probably didn't slow down; they just shifted from one location to another using the same web browser.
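The streaming pattern Norman describes can be illustrated with a short Python sketch. This example is not from the webinar: the Cloud-Optimized GeoTIFF URL is hypothetical, and the open-source rasterio library is assumed here simply as one common way to do windowed reads over HTTP; the panelists' own tooling may differ.

import rasterio
from rasterio.windows import Window

# Hypothetical, publicly readable Cloud-Optimized GeoTIFF in object storage.
COG_URL = "https://example.com/imagery/scene.tif"

with rasterio.open(COG_URL) as src:                      # GDAL performs HTTP range reads
    print(src.crs, src.width, src.height)                # metadata comes from the file header alone
    tile = src.read(1, window=Window(0, 0, 512, 512))    # fetch only a 512 x 512 window of band 1
    print("mean of streamed tile:", float(tile.mean()))

Because only the header and the requested window cross the network, an analyst on ordinary home broadband can inspect a scene that would be impractical to download in full, which is the behavior the panel is pointing to.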
For services in particular, are you thinking of automated processes, streaming on the fly? Some of our remote sensing scientists say it's really great to have data on your machine, where they can tweak results and see them, but in a web client you don't have as much control as you do with a heavy desktop client.

Legeer: The web client doesn't have all the tools for the type of ad hoc analysis you're talking about. Core technologies like WebGL are beginning to improve the visualization experience on the web, and client-side tools and functions are filling the gap between web client and desktop. A couple of years ago the web client was great for tile viewing; if you wanted to do dynamic enhancements, you would push the data server side or go to the desktop for ad hoc analysis. I'm hoping this movement to home-based or remote analysis can accelerate companies building more robust clients. I think the technology is there, and it's just a matter of time before the scientist user can do as much in a browser as on a heavy-duty desktop. That would close the loop. Combine that with backend processing on servers, which is almost required for the size of data people want to use for city-, state-, or country-wide analysis, and you can build a solution that spans ad hoc analysis to large-scale analysis, all while sitting at home on your web client.

People are variously relying on a brick of a hard drive, VPNs, or remote servers, so how do you deal with cybersecurity around things like that?

Norman: For us, there are public clouds and government clouds, so you're cordoned off from the rest of the world. You're probably going to have trouble if you have sensitive or controlled data; you may have to stay on premises because of the rules and regulations around that data, and there may not be a great solution for those applications. But a majority of users can probably work with publicly available data and the connected solutions they can access today. I'm a programmer, as is Beau, and it has been phenomenal how many libraries there are for web development and hardware acceleration, so the viewing experience in the browser is getting closer to what powerful desktop apps can do.

Legeer: Salesforce and Microsoft Office have shown that with single sign-on, applications can be pretty secure. We can follow the lead of industries that have been successful with this and remove the need for VPNs: users with credentials get in securely. The cloud vendors we work with are very focused on security, and we can bake that into our geospatial solutions where we are accessing their data. We need to look at what others are doing successfully and see what we can apply to our industry, with single sign-on working the same way you get into Microsoft Outlook.
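The single-sign-on pattern Legeer points to amounts to exchanging application or user credentials for a short-lived token and presenting it over HTTPS, with no VPN in the path. The sketch below is illustrative only: the identity-provider and imagery-service URLs are hypothetical, and a standard OAuth 2.0 client-credentials flow is assumed rather than any specific vendor's implementation.

import requests

TOKEN_URL = "https://idp.example.com/oauth2/token"       # hypothetical identity provider
SERVICE_URL = "https://imagery.example.com/api/scenes"   # hypothetical imagery service

# Exchange application credentials for a short-lived bearer token.
token_resp = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "client_credentials",
        "client_id": "analysis-app",
        "client_secret": "<elided>",
    },
    timeout=30,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Call the geospatial service with the token; security rides on HTTPS plus credentials, not a VPN.
scenes = requests.get(
    SERVICE_URL,
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=30,
)
scenes.raise_for_status()
print(scenes.json())

The same bearer token can accompany requests to server-side processing endpoints, which is what would let a thin web client at home drive the large, backend-hosted analyses described above.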
What are the challenges in funding this?

Legeer: One of the main challenges is short-term funding for a change in software. Professors who run summer labs or programs that require people to come into the lab suddenly have to get onto AWS, and the procurement process is challenging because you must work your way up the ladder. In the education industry there are a lot of great things happening with what companies are providing, for example open data access and geospatial companies getting licensing out quickly. A lot of students are having a great time with our software and ArcGIS. But professors have had to deal with colleagues in IT, every dollar counts, and you have to pay monthly fees for access. It requires a shift in how money is spent: a regularly recurring expense rather than one every couple of years, against budgets that are usually annual. A lot of that money has been spent on renewing subscriptions for desktop licensing, and not much is left for filling these critical needs.

Fore: Any organization that embraces a SaaS model will be the most successful. All you need is a web browser and a web connection, and you can run it anywhere. But it is like Salesforce in that it doesn't do heavy-duty geospatial analysis and it doesn't stream data in. Can we shift things that have traditionally happened on the desktop to SaaS?