 GISCafe Voice

SPAR3D 2016 Expo and Conference Special Report

April 14th, 2016 by Susan Smith

The four morning keynotes kicking off SPAR3D 2016 Expo and Conference in The Woodlands, Texas, Tuesday morning included Eddie Paddock, Engineering/VR Technical Discipline lead, NASA Johnson Space Center, Greg Bentley, CEO Bentley Systems, Inc., David Smith, CTO, Wearality, and Curtis Chan, technical evangelist, Autodesk.

Google Cardboard

Paddock spoke on the topic, “VR/AR/MR Technologies at NASA’s JSC; Past, Present and Future.”  At the Johnson Space Center they use VR for training astronauts, among other things, and are looking at augmented and mixed reality.

The Orion space station office funds NASA to provide a VR training facility, products, and services to NASA stakeholders. NASA has also built a head-mounted display, and supports robotic operations including simulations, 3D graphics, and immersive HMDs.

Eddie Paddock, Johnson Space Center

According to Paddock, the lab started in the ’90s supporting the shuttle and the Hubble repair mission. “As the space station was built, we built up training for the onboard space station, and we have a mass handling robot that simulates mass in weightlessness, a mixed reality robot. We’re looking at Oculus, at game engines like Unity and Unreal, and at avatar tracking systems like Lighthouse and Visualize. They have some kind of passive or non-passive reflector sensor and infrared emitters to give you location in space. We’re also looking at Chroma Key to filter out objects.”
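Chroma keying, mentioned above as a way to filter out objects, works by replacing every pixel close to a designated key color. NASA's actual pipeline is not described in the talk; the following is a generic sketch of the technique, with all function and parameter names my own:

```python
import numpy as np

def chroma_key_mask(rgb, key=(0, 255, 0), tol=80):
    """Boolean mask of pixels within `tol` (Euclidean RGB distance) of the key color."""
    diff = rgb.astype(np.int32) - np.array(key, dtype=np.int32)
    return np.sqrt((diff ** 2).sum(axis=-1)) < tol

def composite(fg, bg, key=(0, 255, 0), tol=80):
    """Replace key-colored pixels in `fg` with the corresponding pixels of `bg`."""
    mask = chroma_key_mask(fg, key, tol)  # True where the backdrop shows through
    out = fg.copy()
    out[mask] = bg[mask]
    return out
```

In a VR lab setting, the "background" could be a rendered space scene composited behind live camera imagery of the trainee and hardware.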

In the ’90s, NASA bought primitive VR headsets and then built some of its own. The most recent is built with Oculus hardware: they took an Oculus apart and used one display for each eye. Paddock said it has good contrast, at twice the fidelity of a stock Oculus. “On the top are sensors, and we have an electromagnetic tracking system and some avatar stuff,” said Paddock.

They also build their own software, DOUG Graphics, a 3D graphics engine and renderer with lots of parallel processing and scripting built in. Paddock said transitioning to Unity or Unreal would be cheaper.

The astronauts are always tethered, and there is about 10 minutes of compressed nitrogen fuel available to them if they should become untethered. They can train for this in the VR lab. Underwater training gives them the feeling of weightlessness, but they can’t see what’s in space, so the lab augments that training with an immersive space experience.

HoloLens or 3D glasses can augment the view with overlaid images. HoloLens can identify tables and fixtures, and you can bring up overlays of procedures or interact dynamically with an iPad on the ground, according to Paddock. You can see in a pump what you need to fix, and someone can dynamically interact with it. The device is untethered and just coming out now. This approach gives a good representation of augmented reality, but you are limited in field of view and performance.

NASA is also looking at ways to integrate VR with the Active Response Gravity Offload System (ARGOS). The idea is to integrate an immersive VR system, including full-body avatar tracking, into it.

For the Orion program, astronauts need to be able to exercise in space, so NASA is working on an exercise device that can be used in a small space shared by four people.

Many immersive environments for operations and planning for Mars missions can be built using VR. HoloLens has a GPU that augments reality with graphics, and uses a SLAM (simultaneous localization and mapping) technique to build a geometric model of the room. It can interact dynamically with a remote device to augment the reality: if you have to work on an engine motor, it knows where you are and can overlay graphics on the real scene, immersively.
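A full SLAM system like the one in HoloLens jointly estimates the device's pose and the room geometry; that is far beyond a short sketch. As a toy illustration of just the mapping half (pose assumed known, names and cell size my own), range readings can be accumulated into an occupancy grid:

```python
import math

def update_occupancy(grid, pose, scans, cell=0.25):
    """Mark grid cells hit by range readings -- the mapping half of a toy
    SLAM loop (localization is assumed already solved here, for brevity).
    `pose` is (x, y, heading_rad); `scans` is [(bearing_rad, range_m), ...]."""
    x, y, th = pose
    for bearing, rng in scans:
        hx = x + rng * math.cos(th + bearing)  # world-frame hit point
        hy = y + rng * math.sin(th + bearing)
        grid.add((int(hx // cell), int(hy // cell)))  # occupied cell index
    return grid
```

Real SLAM additionally corrects the pose estimate from the same sensor data, closing the loop between localization and mapping.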

Greg Bentley, CEO of Bentley Systems, spoke on the topic, “Connecting Virtuality and Reality: BIM Advancements Converge for the Pope’s Visit to Philadelphia.”

Bentley presented an overview of the company’s software, including the convergence of technologies. Convergence starts with digital engineering, to which 3D context is added; Bentley calls this “reality modeling.” In digital engineering, advancing BIM, the “B” stands for better asset performance.

Projects start with design modeling, then add analytical and construction modeling, but the whole project tracks through to the performance of the asset. Both the virtuality and the reality can be coordinated through Bentley’s CONNECT Edition.

“We acquired the Acute3D software,” said Bentley. “It enables using ordinary digital photography to create 3D meshes. The result is not a point cloud but a seamless reality mesh, which scales to any level of precision and accuracy. Captured serially with a UAV, it records successive statuses during the construction of a hospital.”

Bentley’s objective was to apply the technology to infrastructure projects, so it used a drone supplemented by pictures from the ground. “You can even use a smartphone, down to any level of accuracy, supplemented by additional photos. That becomes part of the reality mesh, and we can connect that to reliability software for transformers. The reality mesh lives in the environment where engineers work.”

“As a business we had the opportunity to create a 3D reality mesh of where the Pope would visit,” said Bentley. “There was a corridor of interest where higher precision was required, and we used a helicopter with Aerometrics imagery, supplemented with street-level photographs taken with an ordinary camera, using ContextCapture.” ContextCapture does not require much expertise to deliver high-quality reality modeling. Sixteen hours of processing on one computer produced a reality mesh, including the inside of the cathedral, which engineers used inside MicroStation.

The advantage comes when you converge reality and virtuality. In the case of the Pope’s visit, the plan included 60,000 temporary structures, barricades, water sources, and much more. Trips to the field weren’t necessary, as planners could use precise geo-location to place objects in the model.
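Placing objects by geo-location typically means converting GPS latitude/longitude into local planar offsets around a site origin. The article doesn't describe Bentley's internal math; the sketch below uses a standard equirectangular approximation, which is accurate enough over a city-block-scale site (the coordinates in the test are illustrative, not from the project):

```python
import math

EARTH_R = 6_371_000.0  # mean Earth radius, meters

def gps_to_local(lat, lon, origin_lat, origin_lon):
    """Approximate east/north offsets in meters from a site origin,
    via an equirectangular projection (fine for site-sized extents)."""
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    east = EARTH_R * d_lon * math.cos(math.radians(origin_lat))
    north = EARTH_R * d_lat
    return east, north
```

With offsets in meters, a barricade or stage modeled in engineering software can be dropped into the reality mesh at its surveyed position without a field visit.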

The company ESM Productions was responsible for the large scale production of the Pope’s visit.

Bentley used its LumenRT software to enliven the plan, simulating digital nature for the Pope’s visit, with time of day and season rendered as in the movies. The environment could be enlivened with moving people and crowds, helping assure the security of the Pope and the nearly 1 million people in attendance.

“When you produce a model it captures the GPS coordinates of the models,” said Bentley.

The new Comcast tower in Philadelphia was designed by Foster + Partners in London.

“In MicroStation we can engineer the shading system, use LumenRT to solve the problem of shading, and converge reality and virtuality,” said Bentley. “Pictometry imagery showed a simulation of how the construction sequence would proceed. First we used a reality modeling capture to see how far construction had come by the time of the high-definition survey, then compared that with the most recent survey. We went back last month with a helicopter and drone to see how far it had come.”

Bentley is also looking at improving transportation capacity using OpenRoads ConceptStation, which lets users populate, envision, and enliven a roadway and bridge concept. The cost of the project and materials, and how it will appear in Philadelphia, are of primary concern.

“With the confluence of digital photography and ubiquitous UAVs, there is a greater opportunity for you, because surveying can be done continuously rather than occasionally,” noted Bentley. “We have Pointools, Descartes, and ContextCapture, and our engineering apps start from the premise of reality modeling. You can load huge reality meshes onto a smartphone.”

Wearality Sky

David Smith, CTO of Wearality and a pioneer of 3D, VR, and AR technologies, talked about how his company is dedicated to creating and delivering AR products for consumers and business.

“Think of AR as a superset,” Smith said. “I have seen the evolution of computers from a different perspective. My mentor is Alan Kay, creator of the personal computer. We’re on the threshold of something that is going to empower us in a way none of us can imagine.”

According to Smith, the emergence and innovation of smartphones have been the primary driver of the VR revolution. We are coming close to visual acuity, he said.

He noted that iPhone performance has grown with each generation of CPU up through the iPhone 6, with roughly 80x increased performance, particularly in the GPU.

“There has been a huge dip in PC sales,” said Smith. “While smartphone sales are in the billions, PCs are in the millions. The vast majority of investment in technology is in mobile — low power high performance, trackers, cameras — all things that define VR fall into this category.”

“Mobile VR like Cardboard is the dominant platform. These devices are all designed to be carried with you to some degree, but they are still sort of big. Not too far in the future, each time you get a smartphone, it will have VR.”

Every single part of VR is being improved to make it better in spite of the fact it’s for phones, and it is very high quality and costs nearly nothing.

Everything is great about it except the lenses, said Smith: Cardboard has at best a 110-degree view, light leaks out, the exit pupil is very narrow, there is chromatic aberration, peripheral vision is out of focus, and the lenses are heavy. “Unlike the rest of VR technologies that are being taken care of with smartphone technology, there is no advancement in lenses.”

That’s where Wearality comes in with its head-wearable displays. New capabilities required by Lockheed Martin customers demanded new kinds of devices. The Wearality panorama lens creates a very wide field of view (FOV); existing lenses provide a small FOV (70 degrees in Cardboard).

This new lens is designed for humans to look through; almost every other lens is an off-the-shelf design. You can get a full 180-degree view with the Wearality lens.

“We can get a 120- to 180-degree view with full peripheral vision, you can wear eyeglasses with them, we have the widest exit pupil (like with binoculars) and a very thin lens, and it enables a foldable VR device,” said Smith. “We can deliver a bigger picture than a 100-inch TV or IMAX. This works through your phone.”
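The relationship between the lens and the field of view can be made concrete with a standard thin-lens magnifier approximation: FOV = 2·atan(w / 2f), where w is the display width and f the lens focal length. The numbers below are illustrative, not Wearality's actual specifications:

```python
import math

def horizontal_fov_deg(display_width_mm, focal_length_mm):
    """Horizontal field of view of a simple magnifier-style HMD,
    using the thin-lens approximation FOV = 2 * atan(w / (2f))."""
    return math.degrees(2 * math.atan(display_width_mm / (2 * focal_length_mm)))
```

The formula shows why a wide FOV is hard: for a fixed phone-screen width, it demands a short focal length, which in a conventional lens brings exactly the distortion, chromatic aberration, and weight problems Smith describes.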

There is huge cell phone use in many countries for entertainment, so to have this type of experience on the phone will be a real selling point. This technology is possibly two years away.

“When you buy a mobile device it’s not going in your pocket, it’s going to be something you wear,” said Smith. “It will change the way you think, work and communicate. You’re defined more by how you communicate than anything.”


Curtis Chan, technical evangelist at Autodesk, talked about ReCap, one of Autodesk’s over 180 products. ReCap bridges the gap between the reality capture devices (laser scanners, handheld scanners, UAV / drones) and the Autodesk portfolio (AutoCAD, Revit, Inventor, Navisworks, Infraworks, BIM 360,…).

Among those products is one hardware product, Autodesk’s own 3D printer, but 99% of what the company produces is software.

Freeway Interchange

Now ReCap includes the ability to do reality mesh, making it possible to create high quality meshes from reality data. Users can fix meshes and optimize them for use in digital workflows, subtractive manufacturing, additive manufacturing and 3D interactive media.

The reality mesh capability creates 3D meshes from reality inputs such as photogrammetry, handheld scanners, and laser scanners.
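ReCap's internal mesh-fixing steps are not detailed in the talk, but one routine operation in any such pipeline is vertex welding: merging near-duplicate vertices produced by scan registration so the mesh becomes watertight for downstream workflows. A generic sketch (all names my own):

```python
def weld_vertices(vertices, faces, tol=1e-6):
    """Merge vertices closer than `tol` (snapped to a grid) and re-index
    faces accordingly -- a typical mesh clean-up step before digital
    workflows such as additive manufacturing."""
    grid = {}      # quantized coordinate -> new vertex index
    remap = []     # old index -> new index
    welded = []
    for v in vertices:
        key = tuple(round(c / tol) for c in v)
        if key not in grid:
            grid[key] = len(welded)
            welded.append(v)
        remap.append(grid[key])
    new_faces = [tuple(remap[i] for i in f) for f in faces]
    return welded, new_faces
```

Production tools combine this with hole filling, degenerate-face removal, and decimation, but the re-indexing pattern is the same.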

The potential of 3D scanning is formidable: indoor management of assets with geo-location, transportation projects, and surveying with new mobile mapping solutions from vendors such as Trimble and Surphaser. As prices continue to come down for accurate, more accessible 3D scanning products, the door opens for more stakeholders to use 3D scanning with less investment and less specialized expertise.

At SPAR3D, the SurphSLAM high-precision mobile mapping solution from Surphaser


