What the rugged tablet market has been waiting for: scientific-grade, accurate measurements on a tablet for those engaged in non-land-surveying work. These rugged tablets can deliver centimeter-level measurement accuracy faster and more easily than conventional land surveying equipment, and at a fraction of the cost – which, in turn, improves safety for first responders during collision reconstruction, natural disasters and crime scene work, according to Kevin Tsai, senior product engineer for DT Research. The combination of accurate measurement, small size and the ability to perform other functions on the same device makes these tablets extremely flexible and efficient.
DT Research, a designer and manufacturer of purpose-built computing solutions for vertical markets, announced the DT372AP-TR Rugged RTK Tablet, a lightweight, military-grade tablet purpose-built with Real Time Kinematic (RTK) technology, which enhances the precision of position data derived from satellite-based positioning systems, according to company materials. The tablet enables 3D point cloud creation with centimeter-level accuracy – accuracy that meets the high standards required for scientific-grade evidence in court.
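For readers new to the concept, here is a minimal, hypothetical sketch of the base/rover correction idea behind differential GNSS. It is a gross simplification of RTK (which resolves carrier-phase ambiguities in real time) and is not DT Research's implementation; all coordinates and the function name are invented for illustration.

```python
# Toy differential-correction sketch (a gross simplification of RTK: real RTK
# resolves carrier-phase ambiguities; this only shows the base/rover idea).
# All coordinates below are hypothetical ECEF-style values in meters.

def differential_correction(base_known, base_measured, rover_measured):
    """Correct a rover fix using the error observed at a base station
    whose true position is known."""
    # Error the base station sees in its own satellite-derived fix
    error = tuple(m - k for m, k in zip(base_measured, base_known))
    # Assume the nearby rover sees roughly the same error, and remove it
    return tuple(r - e for r, e in zip(rover_measured, error))

base_known    = (4167000.00, 872000.00, 4738000.00)   # surveyed base position
base_measured = (4167001.20, 871999.10, 4738000.80)   # base's raw GNSS fix
rover_raw     = (4167050.70, 872030.40, 4737980.10)   # rover's raw GNSS fix

print(differential_correction(base_known, base_measured, rover_raw))
```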
In July 2018, a deeply disturbing and violent video began to circulate on social media. Taking place in Cameroon, it depicts two women and two young children being led at gunpoint away from a village by a group of Cameroonian soldiers. Blindfolded, the victims are forced to the ground and shot 22 times by the soldiers.
3DR, the creators of Site Scan, recently announced it has entered into a partnership with Esri to develop Site Scan Esri Edition, a customized version of its full end-to-end Site Scan product. Cody Benkelman, Esri product manager for Drone2Map for ArcGIS and Full Motion Video, spoke with GISCafe Voice about the upcoming development and the Site Scan Esri Edition.
“Site Scan Esri Edition is an app focused exclusively on providing flight planning,” said Benkelman, “something Esri does not provide, and our customers have been requesting.”
The Site Scan Esri Edition app is designed to handle full drone mission planning and to work with Esri’s Drone2Map for ArcGIS software for post-processing, transferring drone-captured data into the Esri ArcGIS ecosystem. There is also drone processing capability within ArcGIS Pro called “orthomapping,” and users of Site Scan Esri Edition will be able to process data in ArcGIS Pro through the orthomapping workflow as well.
Site Scan Esri Edition will allow you to do the flight planning and will connect directly to ArcGIS Online. It works well for enterprise users, as many organizations already have a lot of their own data available on ArcGIS Online.
“They’ll have their own field boundaries, site boundaries, vectors along power lines or other linear features, and much of that data will already be accessible on ArcGIS Online,” said Benkelman. “Site Scan Esri Edition will allow those users to connect directly to ArcGIS Online via the internet. They can drop ArcGIS Online layers directly into the flight planning process.”
Benkelman said that Site Scan Esri Edition is good both for enterprise users and for those who only fly a drone once or twice a year. Through ArcGIS Online, users have access to a vast amount of existing data, such as USDA NAIP imagery, Landsat and Sentinel-2 imagery, FAA flight maps, weather data and worldwide terrain data. Users can also access custom data layers from their FedRAMP-authorized ArcGIS Online organization account as base and reference data for their drone flight planning mission.
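As an illustration of the kind of hosted content a flight planner can pull from ArcGIS Online, the following sketch uses the ArcGIS API for Python to search public imagery layers. This is not how Site Scan Esri Edition is built; the search terms and the organization URL in the comment are assumptions made for the example.

```python
# Illustrative only: pulling reference layers from ArcGIS Online with the
# ArcGIS API for Python. This is not Site Scan Esri Edition's implementation;
# it just shows the kind of hosted content a flight planner can draw on.
from arcgis.gis import GIS

# Anonymous connection to ArcGIS Online (sign in for organization content)
gis = GIS()

# Search public hosted imagery such as USDA NAIP
naip_items = gis.content.search("USDA NAIP", item_type="Imagery Layer", max_items=5)
for item in naip_items:
    print(item.title, item.id)

# An organization account would instead authenticate, for example:
# gis = GIS("https://myorg.maps.arcgis.com", "username", "password")  # hypothetical org URL
```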
In contrast, Benkelman noted, “Many ArcGIS users worldwide are increasing their use of the existing Site Scan product, as ArcGIS is the end destination for a lot of drone data, so even if they’re using different drone hardware or different flight planning applications, a lot of that data ends up in ArcGIS Online or behind an organization’s firewall as proprietary data.”
From Leuven, Belgium, GNSS receiver manufacturer Septentrio recently announced the addition of the AsteRx-i S to its GNSS/INS product portfolio.
According to company materials, the AsteRx-i S combines Septentrio’s compact, multi-frequency, multi-constellation GNSS engine with an ultralight, external, industrial-grade MEMS-based IMU. Calibrated for wide temperature ranges, the AsteRx-i S delivers accurate and reliable integrated GNSS/IMU positioning to the centimeter level, as well as full attitude, at high update rates and low latency.
Key benefits for users:
• GNSS/INS positioning with 3D attitude: heading, pitch and roll
• Multi-constellation, multi-frequency, all-in-view RTK receiver
• AIM+ interference monitoring and mitigation system
• High-update rate, low-latency positioning and attitude
• Small & ultralight IMU (10 grams)
• Robust calibration for wide temperature ranges
Septentrio product manager Gustavo Lopez answered some questions for GISCafe Voice about the addition of the AsteRx-i S to Septentrio’s portfolio and the open interface of its core technology.
1. What problems are you attempting to solve with the AsteRx-i S?
One important aspect of high-end positioning technology is relying on advanced systems that combine the benefits of GNSS with the benefits of industrial-grade IMUs. A GNSS/INS solution opens extra possibilities for applications working in difficult environments or where reliable 3D orientation is needed.
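To make the GNSS/INS idea concrete, here is a minimal one-dimensional sketch of loose coupling: the IMU propagates the position at a high rate while lower-rate GNSS fixes pull the drifting inertial solution back. This is purely conceptual and does not represent Septentrio's proprietary fusion engine; all rates, noise levels and the blending gain are invented.

```python
# Minimal 1D sketch of loosely coupled GNSS/INS fusion. The IMU propagates the
# state at 100 Hz; 1 Hz GNSS fixes correct the drift from a biased, noisy IMU.
import numpy as np

dt, imu_rate, gnss_every = 0.01, 100, 100     # 100 Hz IMU, GNSS every 100 steps
true_accel = 0.2                              # m/s^2, constant for the toy case
pos, vel = 0.0, 0.0                           # fused state
true_pos, true_vel = 0.0, 0.0
gain = 0.2                                    # complementary blending gain

rng = np.random.default_rng(0)
for k in range(10 * imu_rate):                # simulate 10 seconds
    true_vel += true_accel * dt
    true_pos += true_vel * dt

    accel_meas = true_accel + 0.05 + rng.normal(0, 0.02)    # biased, noisy IMU
    vel += accel_meas * dt                                   # inertial propagation
    pos += vel * dt

    if k % gnss_every == 0:                                  # 1 Hz GNSS update
        gnss_pos = true_pos + rng.normal(0, 0.02)            # ~2 cm position noise
        pos += gain * (gnss_pos - pos)                       # pull state toward GNSS

print(f"true {true_pos:.3f} m, fused {pos:.3f} m")
```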
2. Is the technology for AsteRx-i S something you created in house or was it as a result of an acquisition?
The AsteRx-i S (like the AsteRx-i V) uses an external industrial-grade IMU; however, the entire GNSS algorithmic engine is Septentrio’s proprietary technology, carrying the stamp of quality and performance we are normally well known for. This means the product also inherits Septentrio’s intuitive, open interface, making it easy to integrate into multiple applications.
CoreLogic®, a leading global property information, analytics and data-enabled solutions provider, recently announced the launch of its new publicly accessible risk information resource center, Hazard HQ™. This new information hub will offer individuals, media and companies high-level analyses and up-to-date data insights on the immediate risks natural catastrophes pose to properties across the country.
The latest risk summary for Hazard HQ focuses on the ongoing California wildfires. As comprehensive risk assessment needs increase alongside growing economic losses from natural catastrophes, Hazard HQ offers a high-level risk perspective for individuals and companies who wish to understand how hazards like earthquakes, floods, hurricanes, severe convective storms, wildfires, wind and volcanic activity can impact their regions.
Senior leader of content and strategy for CoreLogic, Maiclaire Bolton Smith, spoke with GISCafe Voice about the new resource center and how it is dedicated to offering catastrophe insights about events while they are happening.
Does Hazard HQ take in citizen information?
No, it focuses on information from CoreLogic. CoreLogic can provide insight and information on wildfire, hurricane, earthquake or flooding, offering insights on the number of properties that could be at risk, the area that could be impacted and the home value that could be lost. No information is pulled from citizens. It’s our opportunity to share information with others to help them protect themselves and be able to recover from financial catastrophe.
It really evolved as a way for us to share information easily.
We’ve had all these devastating wildfires this summer already. We always try to learn from the events that have happened, and we’ll always be providing more information on research. For example, with regard to the wildfire in Sonoma County, California last year that impacted Santa Rosa, over the past six months we’ve done a lot of research looking at the reconstruction from that wildfire, the state of the homes being rebuilt, and some of the insurance impacts and implications from that event. An event doesn’t end when the event itself ends; it’s a long process afterwards to really recover from it, so we will continue to share more information on an ongoing basis as we continue to research events.
How do you expect the risk analysis you did last year to impact or help in the assessment of the damage from the Mendocino fire, as an example, right now?
The biggest factor is that it brings awareness to the impact that these devastating events do have. We hear about the hundreds of thousands of acres burned, but a lot of times the fires are burning in remote areas and there are not a lot of properties at risk. It’s devastating to see the area burned, but what we want to focus on is bringing awareness to insurers and other people about where there are homes and properties at risk, and focus on the human aspects of it. What people can take away from our previous research is:
• Being prepared for hazards that could happen, whether it be a flood, earthquake, hurricane, etc. We’re prone to disasters all the time in various parts of the country.
• Awareness of the events that can happen. Our main goal is to work with insurance companies and help them understand what properties are valued at, to be able to insure properties properly.
• The general public needs to know they need insurance for a lot of these hazards. Insurance can really help them recover from events when they do happen. Hopefully they won’t be impacted, but if they are, knowing their risk and being able to accelerate their recovery is a huge bonus.
Say a customer is obtaining insurance for things they expect, but what about these events that happen way beyond anyone’s expectations?
Unfortunately, those rare events are the wild card that is really beyond planning scenarios. I’m actually a seismologist by training, and I spend a lot of time training people to know their earthquake risk. I always say the number one thing people can do to prepare for an earthquake is believe that it can happen, and that’s the same with all disasters. The possibility is there that it may occur. These events are hard for people to conceptualize and plan for.
At CoreLogic we do risk modeling where we look at the range of events that can happen – from the more common events to the very extreme events. That’s the information we provide to insurance companies, including what the worst-case scenario could look like.
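As a rough illustration of how such a range of events is summarized, the sketch below builds an exceedance-probability view from simulated annual losses. CoreLogic's catastrophe models are proprietary; the loss distribution and thresholds here are invented purely for illustration.

```python
# Illustrative exceedance-probability (EP) summary from simulated annual losses.
# The loss numbers are invented; real catastrophe models are far more detailed.
import numpy as np

rng = np.random.default_rng(42)
years = 10_000
# Hypothetical annual losses in $M: many small years, a heavy tail of big ones
annual_loss = rng.lognormal(mean=2.0, sigma=1.2, size=years)

for threshold in (10, 50, 100, 500):
    prob = (annual_loss >= threshold).mean()          # exceedance probability
    rp = 1 / prob if prob > 0 else float("inf")       # implied return period
    print(f"P(annual loss >= ${threshold}M) = {prob:.3%}  (~{rp:.0f}-year return period)")
```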
I have spoken to CoreLogic many times. In the past, the company has said that with the fires it expects an increase in losses to homes: because people have built closer to forests, and because forests are not cleared as often, we run a higher risk.
President and CEO of Esri, Jack Dangermond, was proud to point out at his Plenary session at the Esri User Conference this year that this is the 38th conference, and that the purpose of the conference is the same as it was 38 years ago: to be together, share knowledge and have fun.
Data providers abound in the GIS and geospatial industry. Choices range from mapping, built and natural terrain modeling, geodetic and engineering surveying, photogrammetry, satellite imagery and real-time satellite data, remote sensing, aerial and ground-based LiDAR surveys, geographic and land information systems (GIS/LIS), the geospatial web, asset inventory, 3D scanning, and spatial computing and analysis, to much more.
Cepton Technologies, Inc., a provider of 3D LiDAR solutions for automotive, industrial and mapping applications, recently introduced its Vista LiDAR sensor at the annual NVIDIA GPU Technology Conference, making it immediately available for the autonomous vehicle market.
Kinesis, the global vehicle tracking solution from Telematics at Radius Payment Solutions in Crewe, UK, recently recorded the milestone of over 2 billion vehicle miles tracked. Since launching in the UK three years ago, Kinesis has installed its state-of-the-art tracking hardware in more than 50,000 vehicles across Europe, Southeast Asia and North America.
Mark Smith, CEO of Geospatial Corporation, spoke this week with GISCafe Voice about the challenges of mapping the underground, which includes mapping underwater. The company’s goal is to create an underground “map of the world,” by doing it “one pipeline at a time.” This is a sensible approach to a project that may seem a bit like trying to eat an elephant (start with the toes!). With the help of sensors and Geospatial’s cloud-based GIS platform, GeoUnderground, it looks like the goal is highly attainable.
What are specific challenges to mapping underground utilities?
The most obvious challenge is that the pipelines and conduits are underground or underwater, and that makes the selection of the data acquisition methodology very important. I like to say that the difference between locating and mapping is pretty straightforward. Locators attempt to “clear” an area for a specific reason, such as in preparation for a construction project. At Geospatial Corporation, we approach a project in a very “holistic” manner. We know there is no “silver bullet” that will allow us to accurately map every type of buried infrastructure within a facility, right of way or municipality. We know that we need to use many types of data acquisition technologies to obtain a complete “picture” or “map” of the underground.
In addition, getting this vast amount of data properly into a GIS platform from the field, often with numerous techs collecting below and above ground over large areas, is in itself a trick. For this we have developed GeoUnderground, our proprietary cloud-based GIS platform built on Google Maps. GeoUnderground provides an economical, SaaS-based, powerful yet very simple-to-use GIS platform accessible from any phone. Our goal is to have every data acquisition tool seamlessly integrate into GeoUnderground.
What solutions do you provide to achieve goals?
At Geospatial we consider our data acquisition technologies to be simply “sensors on a platform.” The platform could be designed to run inside a pipeline or conduit and carry various types of gyroscopic or electromagnetic sensors. These technologies are extremely accurate under most conditions and allow us to accurately map, in x, y and z, pipelines and conduits from as small as 1.5 inches up to 20 feet in diameter. These technologies are often used on projects for telecom clients (such as AT&T, Comcast and Verizon), and are also applicable to sewers, gas lines and numerous other types of infrastructure. We have developed a method of combining technologies to geo-reference the video collected inside a pipeline during periodic inspections. This allows the pipeline owner to locate any defects within the pipeline, providing an exact x, y and z location of the defect, and allows the video data to be stored, viewed, edited and shared on GeoUnderground. We are constantly looking for new types of data acquisition and data management technologies to add to GeoUnderground. To this end, we are creating strategic alliances with numerous sensor companies.
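For a sense of how a gyroscopic probe can produce an x, y and z centerline, here is a minimal dead-reckoning sketch that integrates heading and pitch over measured distance travelled. Geospatial's actual probes and processing are proprietary; the sample readings below are hypothetical.

```python
# Sketch of the dead-reckoning idea behind gyroscopic pipeline mapping:
# integrate heading and pitch over measured distance to build an x,y,z
# centerline. The sample readings are hypothetical.
import math

# (distance increment m, heading deg from north, pitch deg from horizontal)
readings = [
    (1.0,  90.0,  0.0),
    (1.0,  90.0, -2.0),
    (1.0,  95.0, -2.0),
    (1.0, 100.0, -1.0),
]

x = y = z = 0.0
centerline = [(x, y, z)]
for ds, heading, pitch in readings:
    h, p = math.radians(heading), math.radians(pitch)
    x += ds * math.cos(p) * math.sin(h)   # east
    y += ds * math.cos(p) * math.cos(h)   # north
    z += ds * math.sin(p)                 # up (negative pitch = descending)
    centerline.append((round(x, 3), round(y, 3), round(z, 3)))

print(centerline)
```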
Are you creating a map of the world’s underground infrastructure and if so, when do you think that will be completed and how will it be maintained?
Yes, our slogan is that we are creating a map of the world’s underground, one pipeline at a time. In reality, we are aggregating data on behalf of our clients that is slowly but surely creating a map of the underground. As more and more of our clients realize the benefits of mapping and knowing the location of their critical assets, the mitigation of risk and the ROI obtainable from sophisticated analysis, they will accelerate the mapping of their underground and above-ground assets. More and more infrastructure stakeholders are beginning to plan to map their entire facilities.
How do Blockchain technologies figure in?
It’s a massive undertaking to attempt to map the underground. Just as we are constantly finding new sensor applications, we are also exploring new software applications utilizing Blockchain, machine learning and artificial intelligence.
How do you renovate or replace utility structures that are underwater?
Geospatial doesn’t repair or replace pipelines, but we do have several ways to map pipelines underwater, involving either our gyroscopic technologies or our electromagnetic technologies. We have successfully mapped a telecom conduit under the East River in New York City, as well as the Harlem River in NYC, the Savannah River in Georgia and the Intracoastal Waterway in Charleston, along with many other rivers and lakes across the USA.
What do you think will be the result of mapping the outdated infrastructure, and how might it be maintained or retrofitted using your data?
A few years back, no one would have guessed that all of the above-ground infrastructure would be digitally mapped from the air, from unmanned drones or from the streets. The underground infrastructure is the last unmapped frontier. We can only begin to speculate about the many uses and benefits derived from having an accurate 3D map of the underground. Smart City initiatives, increasing federal and state requirements for gas and oil pipelines, an abundance of new sensors creating the Internet of Things, and the ability to run risk analysis on critical pipelines all require management to know the exact position and depth of our critical infrastructure.