The Pitfalls—and Potential—of Real-Time Engineering
Today’s digital world is full of real-time experiences that blur the line between what’s real and what’s rendered.
Remote design teams don VR headsets to tweak high-definition prototypes and then send them through all-virtual review cycles. Online shoppers use phones to overlay photorealistic 3D furniture models in their family rooms, exploring colors, positions, and dimensions before making a purchase. From corporate teams to individual consumers, more users are enjoying—and expecting—the benefits of real-time experiences. Unfortunately, meeting those expectations remains an uphill battle for engineering teams.
While real-time engineering tools and processes do exist, most are still slow and cumbersome. It’s a challenge that’s bound to escalate: the extended reality (XR) market is growing at a rapid clip among consumers and corporations alike. The XR market—made up of augmented reality (AR), virtual reality (VR), and mixed reality (MR)—is predicted to jump from $27 billion in 2021 to $250 billion by 2024. Manufacturers are leaning into XR’s potential to slash engineering timelines, lower design costs, and deliver better customer experiences. PwC research suggests that the use of VR/AR in product development alone could boost GDP by $360 billion by 2030.1,2
We’re digital optimists by nature, so we’re excited about the growth of XR. We believe emerging technologies will unleash real-time engineering’s full potential to transform how products are designed and delivered. In the meantime, though, common challenges persist across many engineering teams.
Real Pain Points Demand Progress
There’s no question that a gap exists between the potential for real-time engineering and the place we are today.
Engineering software wasn’t intended for customer-facing visualizers. Today’s digital consumers increasingly want interactive product catalogs that let them experience real-time renderings of everything from a new wardrobe to a kitchen redesign to a vehicle’s interior. But CAD data is meant for manufacturing, not personalized marketing. We often encounter requests to repurpose assets that were originally intended for engineering use only, along with demands for an automated process to bring those designs into real time for customers. At this point, the automated processes that do exist don’t meet our requirements for content quality.
Undefined source data complicates export. By its very nature, CAD software includes every minute detail that is required for manufacturing output. That’s great for the factory floor, but not for export to real-time rendering. Since all CAD data elements are treated equally, every thread on every bolt is exported as an equally important design component that has to be rendered in real time—causing a real waste of time and energy, and adding unnecessary complexity to an already tedious process.
CAD software and XR hardware speak different languages. While immersive hardware is making progress in rendering high-quality visuals, computing power isn’t keeping up. Real-time rendering on a graphics processing unit (GPU) requires specific optimizations, especially for VR headsets. Not only does CAD data have to be converted into polygon meshes, but these meshes have to be optimized for display in real time, and these systems simply don’t understand each other natively. Progress is happening—five or more years ago it could take an entire server rack to run a VR experience of a single product—but accuracy and fidelity are still being lost in translation between manufacturing software and XR hardware.
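To make that optimization step concrete, here is a minimal, purely illustrative Python sketch of vertex-clustering simplification—one common family of techniques for reducing a dense, CAD-derived polygon mesh before real-time display. This is our own toy example, not the pipeline of any particular CAD or XR tool: vertices are snapped to a coarse grid, duplicates are merged, and triangles that collapse to a line or point are discarded.

```python
def cluster_decimate(vertices, triangles, cell=1.0):
    """Simplify a triangle mesh by vertex clustering.

    vertices:  list of (x, y, z) tuples
    triangles: list of (i, j, k) index triples into vertices
    cell:      grid spacing; larger cells merge more vertices
    """
    cell_of = lambda v: tuple(round(c / cell) for c in v)

    index_of, new_vertices, vmap = {}, [], []
    for v in vertices:
        k = cell_of(v)
        if k not in index_of:
            # First vertex in this grid cell becomes the representative.
            index_of[k] = len(new_vertices)
            new_vertices.append(tuple(c * cell for c in k))
        vmap.append(index_of[k])

    new_triangles = []
    for a, b, c in triangles:
        a2, b2, c2 = vmap[a], vmap[b], vmap[c]
        if len({a2, b2, c2}) == 3:  # drop degenerate (collapsed) triangles
            new_triangles.append((a2, b2, c2))
    return new_vertices, new_triangles
```

In practice, tools use far more sophisticated methods (edge-collapse decimation with error quadrics, for example), but the trade-off is the same: spend triangles where the viewer will notice them, and merge away the detail—like individual bolt threads—that a headset can’t render fast enough anyway.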
A Real-Time Revolution Can Transform Engineering
Despite the current challenges, we’re bullish about better engineering through real-time solutions. Here’s a rundown of new developments we’re excited about, and what we’re looking forward to as the space continues to evolve.
Faster, lighter, longer-lasting hardware. We expect wearables will continue to improve with each iteration and start to meet—or even exceed—user expectations. Microsoft’s HoloLens and HTC VIVE are already delivering impressive experiences in manufacturing and engineering, with a comfortable fit and fully articulated hand tracking that lets users manipulate virtual content with natural motions. Other VR headsets are making progress in offering a 100-degree-plus field of view with improved resolution, and a flurry of lightweight smart glasses options are giving workers instant access to critical specs and documents.
As the world’s leading brands ramp up their investments in enterprise XR, the global market for VR in automotive alone is expected to grow at a CAGR of 45% for the next five years.3 We envision a day in the near future when it’s considered normal for executives and design teams alike to slide on featherlight 180-degree headsets and review life-size 3D prototypes, inspecting virtual designs with flawless fidelity and zero lag. Automakers already started down this path during the pandemic. Of course, to deliver prototyping breakthroughs on a broader scale, we also need software that can keep up with the enhanced hardware.
Smarter CAD software with real-time integration. CAD software that is designed with real-time experiences in mind will go a long way toward making real-time engineering a reality in manufacturing. Delivering better, faster, cheaper prototyping will require CAD that can handle both manufacturing and real-time previews, letting engineers label individual elements as significant/insignificant before exporting the data.
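As a hypothetical illustration of where that labeling could lead, here is a short Python sketch of an export step that keeps only engineer-flagged significant parts and fits them within a triangle budget for a real-time target. The `CadPart` type, the `significant` flag, and the budget logic are our own inventions for the sketch, not features of any existing CAD package.

```python
from dataclasses import dataclass


@dataclass
class CadPart:
    name: str
    triangle_count: int
    significant: bool  # engineer-assigned label at design time


def build_realtime_export(parts, budget=100_000):
    """Select parts for a real-time preview export.

    Keeps only parts flagged significant, then greedily packs the
    cheapest parts first until the triangle budget is exhausted.
    Returns the selected parts and their total triangle count.
    """
    kept = [p for p in parts if p.significant]
    kept.sort(key=lambda p: p.triangle_count)

    export, total = [], 0
    for p in kept:
        if total + p.triangle_count > budget:
            break  # budget exhausted; remaining parts stay CAD-only
        export.append(p)
        total += p.triangle_count
    return export, total
```

The point isn’t the packing strategy—it’s that a single bit of intent from the engineer (“this bolt thread doesn’t matter for the preview”) lets an automated export discard detail that today has to be cleaned up by hand.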
We’re excited about early forays into smarter software, like ESI’s IC.IDO platform, which enabled Volkswagen’s remote engineering team to design, prototype, and review its Nivus vehicle in a completely digital environment without creating a single physical prototype.4 Epic Games is also looking to change the game with Datasmith, a suite of tools for importing CAD models into Unreal Engine to create interactive presentations and visualizations. Ford, Volkswagen, and other automakers have already set up virtual design studios to continue reaping the rewards of advances in real-time engineering, and GM’s studio will open in Q1 2023 with a goal to “rapidly pilot visualization and immersive technology, including augmented and virtual reality,” and then go a step further with an “increased footprint allowing for improved output of physical and virtual proof of concepts and show cars.”5 We believe these virtual environments can unlock not only time savings but also creativity and innovation, making it easier for product teams to experiment with new materials and move through review gates more rapidly, without having to create expensive new hard-body prototypes from scratch.
Real-time artificial intelligence. We’re already at a point where AI and machine learning can assist in art direction, visual exploration, and generating imagery. As the technology becomes ever-smarter, it’s also getting better at understanding and creating three-dimensional images based on text.
We’re on the cusp of a market revolution—one that will introduce 3D modeling AI capabilities; automate tedious, inefficient processes; and free product teams to focus their talents on the more important goals of experimentation and innovation. It might take a bit longer than we’d like, but we have zero doubt that innovators will develop AI systems that can take manufacturing models and transform them into something that is not only usable for real time, but is optimized for real time.
Vectorform is proud to work with many of the world’s leading brands early in the development cycle to stay on top of the latest advances in XR, and we maintain partnerships with many smaller, cutting-edge design shops. While these experts work on creating next-gen XR and AI software, we can help guide your teams through the real-time tools and processes that exist today. Our Digital Product Development team can also help your company explore the pros and cons of each option, and how to prepare for and manage the change associated with shifting to a more real-time approach to engineering and workflows.