Gaming Technology is Changing How We Design and Visualize Products

Summary

Recent software and hardware advancements in computer graphics have crossed the chasm: it is now faster, less expensive, and of comparable quality to render virtual objects and scenes at scale than it is to shoot real-world scenes. The technology enabling this relies on I) the new, more powerful capabilities of rendering engines, and II) a new class of authoring tools that make it easy and scalable to paint physics-obeying material properties onto 3D models. This new class of authoring tools was created over the last decade by Allegorithmic – now part of Adobe. After becoming the de facto standard for AAA game design, Allegorithmic is now disrupting new areas such as product visualization and design, bringing unprecedented photorealism as well as cost and time savings.

The evolution of product visualization

Paper catalogue, online catalogue, online 3D viewer, augmented reality application

Over the past several decades, there have been important changes in the medium used to show goods and products, going from print formats to the web, to the interactive web, and now extending to augmented reality. The goal has always been to provide stakeholders, collaborators, or buyers with as much visual information about products as possible, enabling the most informed decisions. In the process of going from the most tangible medium (print) to the most contextual (augmented reality), one factor has remained the same: the need to reproduce product imagery with the highest possible level of fidelity. For a long time, the only way to achieve that goal was to manufacture the product and take a real photo of it.

But with the growing number of product images required by e-commerce and the proliferation of SKUs (consider, for example, that Amazon adds 500,000 new products every day), this approach is no longer scalable. Companies are moving to fully digital content pipelines in which rendered virtual images of their goods replace photo shoots, delivering a more flexible, cost-effective, and scalable way to show their products even before they are manufactured. Until recently, the most difficult part of this process was the time needed to create true photorealistic content from rendered 3D models. Recent developments in computer graphics have eliminated those roadblocks.

Advancements in computer graphics and rendering

Traditional Phong shading, PBR shading, image rendered in Adobe Dimension showing PBR maps

Starting in the 1970s, the most commonly used computer graphics technique for generating photorealistic images of 3D models has been ray tracing. Ray tracing generates an image by tracing the path of light through the pixels of an image plane and simulating the effects of its encounters with virtual objects. For a long time, ray tracing was relegated to highly specialized animation film and visual effects studios because of its computational cost and lack of standardization, making the science of rendering photorealistic 3D models an art that only shader programmers could successfully craft, with custom solutions and implementations.
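To make the idea concrete, here is a minimal, illustrative ray tracer in Python that follows the loop described above: for each pixel, cast a ray from the camera through the image plane, intersect it with the scene (here, a single sphere), and shade the hit point with a simple diffuse term. The scene contents, light direction, and ASCII output are assumptions made for brevity; real renderers add reflections, refraction, global illumination, and far more sophisticated materials.

```python
# Minimal sketch of the ray tracing loop: one ray per pixel, one sphere,
# simple diffuse shading. All scene values are illustrative assumptions.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the sphere, or None if it misses."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width=40, height=20):
    camera = (0.0, 0.0, 0.0)
    sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
    light = normalize((1.0, 1.0, 1.0))
    rows = []
    for j in range(height):
        row = ""
        for i in range(width):
            # Map the pixel to a point on the image plane and build a ray.
            x = 2 * (i + 0.5) / width - 1
            y = 1 - 2 * (j + 0.5) / height
            direction = normalize((x, y, -1.0))
            t = intersect_sphere(camera, direction, sphere_center, sphere_radius)
            if t is None:
                row += "."
                continue
            # Shade the hit point with a simple Lambertian (diffuse) term.
            hit = tuple(o + t * d for o, d in zip(camera, direction))
            normal = normalize(tuple(h - c for h, c in zip(hit, sphere_center)))
            brightness = max(0.0, dot(normal, light))
            row += " .:-=+*#%@"[int(brightness * 9)]
        rows.append(row)
    print("\n".join(rows))

if __name__ == "__main__":
    render()
```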

In recent years, more and more of the computational algorithms needed for ray tracing have shifted into the chip, bringing ray tracing to real time on new high-end hardware, as demonstrated in the joint effort between Nvidia RTX and Adobe Dimension. Simultaneously, a broad standardization took place in the definition of 3D model materials and shading. Instead of using custom shader code to represent the way light bounces off virtual objects, Physically Based Rendering (PBR) simulates the real-world physics of light, and material texture maps define the way light bounces off 3D models.

Material texture maps used to render a rusty metal material. Rendered in Adobe Dimension.

This major paradigm shift brought creative control to the artist. PBR has become the universal standard both in real-time applications built on game engines and in ray tracing, resulting in a massive step up in interoperability between 3D applications, rendering engines, and studios. Since PBR aims to simulate the real-world physics of light, it is also best suited to deliver true photorealistic imagery.
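As a rough illustration of what “simulating the real-world physics of light” looks like in practice, the Python sketch below evaluates a single light against the metallic-roughness material model that PBR pipelines commonly build on (GGX microfacet distribution, Smith geometry term, Schlick Fresnel). Every engine implements this differently; the inputs here are simply the values a renderer would sample from a material’s texture maps, and the example numbers are made up.

```python
# Simplified single-light evaluation of a metallic-roughness PBR material.
# Inputs are the quantities a renderer samples from texture maps:
# base_color, metallic, roughness. This is a sketch, not engine code.
import math

def dot(a, b):
    return max(0.0, sum(x * y for x, y in zip(a, b)))

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def brdf(base_color, metallic, roughness, n, v, l):
    """Cook-Torrance specular plus Lambert diffuse, metallic-roughness style."""
    h = normalize(tuple(a + b for a, b in zip(v, l)))
    n_l, n_v, n_h, v_h = dot(n, l), dot(n, v), dot(n, h), dot(v, h)

    alpha = roughness * roughness
    # GGX normal distribution: how microfacets are oriented for this roughness.
    d = alpha ** 2 / (math.pi * (n_h ** 2 * (alpha ** 2 - 1) + 1) ** 2)
    # Smith/Schlick-GGX geometry term: microfacet self-shadowing.
    k = alpha / 2
    g = (n_l / (n_l * (1 - k) + k)) * (n_v / (n_v * (1 - k) + k))
    # Schlick Fresnel: reflectance rises at grazing angles; metals tint it.
    f0 = lerp((0.04, 0.04, 0.04), base_color, metallic)
    f = tuple(c + (1 - c) * (1 - v_h) ** 5 for c in f0)

    specular = tuple(d * g * c / (4 * n_l * n_v + 1e-6) for c in f)
    # Metals have no diffuse component; dielectrics use the base color.
    diffuse = tuple((1 - metallic) * c / math.pi for c in base_color)
    return tuple((sp + di) * n_l for sp, di in zip(specular, diffuse))

# Example: a fairly rough, fully metallic sample point (values are illustrative).
print(brdf(base_color=(0.45, 0.25, 0.15), metallic=1.0, roughness=0.7,
           n=(0.0, 0.0, 1.0), v=normalize((0.0, 0.3, 1.0)), l=normalize((0.4, 0.4, 1.0))))
```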

The need for Substance and its impact

Example content created with Substance tools: video games, product visualization, 3D artwork. Middle image courtesy of Hyundai.

With both real-time and ray tracing rendering engines rapidly aligning on PBR as the standard, there is a need for new tools. Creating the texture maps needed for PBR – metallic, roughness, and so on – by hand and one by one is a very tedious and unintuitive process. Why should an artist have to know the roughness or metallic properties of, say, rusted metal, and how they vary across a 3D model, just to create an organic, photorealistic pattern? That information can easily be pre-encoded in a material with high-level controls given to the artist, who can customize it as needed.
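The pre-encoding idea can be sketched as follows: a single, hypothetical high-level control (rust_amount) drives several coordinated PBR maps at once, so the artist moves one slider instead of hand-painting metallic and roughness values texel by texel. This is only an illustration of the concept in Python, not how Substance materials are actually implemented.

```python
# Illustrative "parametric rusty metal" material: one high-level control
# produces matching base color, roughness, and metallic maps in one pass.
# The noise function and parameter names are invented for this sketch.
import random

def noise(u, v, seed=7):
    """Cheap, deterministic pseudo-noise in [0, 1]; stands in for real procedural noise."""
    random.seed(hash((int(u * 64), int(v * 64), seed)))
    return random.random()

def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def rusty_metal(width, height, rust_amount=0.5):
    """Return three aligned PBR maps driven by a single high-level control."""
    steel, rust = (0.56, 0.57, 0.58), (0.45, 0.22, 0.12)
    base_color, roughness, metallic = [], [], []
    for j in range(height):
        for i in range(width):
            u, v = i / width, j / height
            # Rust appears wherever the noise falls below the control value.
            r = 1.0 if noise(u, v) < rust_amount else 0.0
            base_color.append(lerp(steel, rust, r))   # rust tints the surface
            roughness.append(0.3 + 0.6 * r)           # rust is much rougher
            metallic.append(1.0 - r)                  # rust is no longer metallic
    return base_color, roughness, metallic

# One slider, three consistent texture maps.
maps = rusty_metal(width=8, height=8, rust_amount=0.7)
print(len(maps[0]), "texels per map")
```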

This is what Allegorithmic’s Substance technology provides, among other things – an easy way to create (with Substance Designer) and paint (with Substance Painter) PBR materials directly on 3D models, authoring several texture maps at once with the simple motion of a brush, in real time, and leveraging concepts very familiar to Photoshop users such as layers and blending modes. This was a game changer first for PBR pipelines in game design, with most of the top AAA gaming studios adopting it – Ubisoft and Naughty Dog among them – and a growing number of indie developers using it.

Substance is a concept developed by Allegorithmic: a higher-level, abstract descriptor of a 3D model’s material properties. A Substance can encode customizable texture maps as well as procedural functionality affecting the overall material result. In Substance Painter, Allegorithmic also introduced the power of physical simulation, which greatly simplifies the creation of weathered materials. How laborious would it be for an artist to correctly create the effect of a water drop, or of wear on a curved surface? With Allegorithmic’s tools, artists and designers have far more time for their creative expression.

Impact on the film and VFX industry

Real-time green screen substitution in TV shows, animated TV series using real-time rendering, movie content created with Allegorithmic’s tools

The ability to render in real time using game engine technology, together with the consistency and photorealism provided by PBR, has also brought breakthroughs in the way movie and TV studios create content.

More and more video content is being created using real-time gaming pipelines, whether for TV series, for real-time green screen compositing in TV shows, or for full-on CG movies. The speed of iteration and the consequent cost savings have been driving this transformation, and the creative experience becomes more engaging when wait-and-see operations are removed, making these new workflows feel like digital puppeteering.

Among Adobe’s video production tools, Adobe After Effects already offers capabilities for compositing 3D and 2D content. Adobe is planning to develop additional workflows to make video compositing a more immediate and seamless experience. By expanding the 3D capabilities in After Effects, it will be possible to bring 3D content textured in Substance tools directly into After Effects and composite 3D and 2D content in real-time, with the true photorealism required by the film and VFX industry.

Embracing Substance and PBR as the industry standard

Three years ago, Adobe introduced a standard PBR material developed jointly with Allegorithmic. That standard is used in Adobe Dimension, Project Aero, Adobe Capture, and every 3D material in Adobe Stock, and it is broadly supported by Substance tools.

With the acquisition of Allegorithmic, Adobe will also broaden the integration of the Substance SDK in its products, making Substance even more of a worldwide standard for materials. For example, the Substance SDK is already used inside Adobe Capture, the first app on the market capable of creating a PBR material in real time from a single photo. The Substance SDK is also integrated with the industry’s most-used 3D tools and engines, providing unique interoperability.

With PBR as a material standard and Substance as its editing format, Adobe is bringing a unified material pipeline for gaming, film, product visualization and design to the world.

What it means for immersive media

Adobe recently previewed Project Aero, a new multi-platform system for authoring Augmented Reality (AR) experiences. AR is a natural domain extension for Adobe as it expands the concept of compositing. Photoshop and After Effects are the most established tools for image and video compositing, respectively, while Aero brings compositing to the real world and to real time. Photorealism is a crucial need for AR as it allows the perfect blend of the real and the virtual, making the AR experience not only believable but also easier and more intuitive.

The Aero engine uses PBR as its primary material type, leveraging machine learning technology in the lighting space to bring the best photorealism and making real and virtual objects indistinguishable in AR. The existing workflow between Substance Painter and Dimension is currently the best way to bring 3D models with PBR materials into Aero, and more workflows will follow.

Conclusions

After conquering the gaming industry, Allegorithmic’s tools are rapidly spreading to film, design, product visualization and marketing, with massive time and cost savings.

It is no surprise that gaming and VFX are the leading industries for high-end 3D visualization: their final product is digital, and moving to 3D has been a necessity. We’re now seeing more and more innovation emerge in other industries traditionally focused on physical products and product design, such as CPG, apparel, retail, packaging, and more.

Today, products are very often designed in 3D, but with a number of workarounds that make the process more difficult and time-consuming: the designer iterating on the next set of colorways often has to go back to a 2D workflow, the merchandiser laying out the store design has to fake the assortment by compositing renders in Photoshop, and the UX designer working on the next version of the e-commerce page or AR application has to redevelop the assets from scratch. With Allegorithmic’s tools, companies can now leverage the 3D models of their product across the entire customer experience – from e-commerce content to marketing material, from catalogues to immersive experiences.

Adobe has been working with Allegorithmic since 2015 to help develop and expand the reach of these authoring capabilities. Adobe Creative Cloud has long been the leader in providing creators with the capabilities they need across media formats, and today, we are enhancing 3D content creation and design capabilities with the acquisition of Allegorithmic, the industry standard for 3D texturing and material creation. By combining Adobe’s industry-leading imaging, video and motion graphics tools with Allegorithmic’s 3D design tools, Adobe will empower designers, VFX artists working in film and television, video game creators, advertisers and more to deliver powerful, interactive experiences while reducing cost and time-to-market.

Appendix

Physically Based Rendering (PBR): Physically Based Rendering is a subset of computer-generated imagery that uses physically derived shading models. In practical terms, instead of hacking the “look” of a 3D scene with custom shaders, we agree to leverage the laws of physics (e.g. energy conservation) to describe how light bounces off 3D objects, and we develop material maps that describe the relevant properties (e.g. roughness, translucency, shininess). PBR is valuable because it provides consistent rendering across applications, does not require writing shader code, and produces photorealistic results.
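For context, the physical model that PBR approximates is commonly summarized by the rendering equation, with energy conservation constraining the material term; this standard formulation is included here only as background.

```latex
% Outgoing radiance at a surface point x in direction \omega_o:
% emitted light plus incoming light reflected according to the BRDF f_r,
% whose parameters (roughness, metallic, ...) come from the material maps.
\[
L_o(x,\omega_o) = L_e(x,\omega_o)
  + \int_{\Omega} f_r(x,\omega_i,\omega_o)\, L_i(x,\omega_i)\,(\omega_i \cdot n)\, d\omega_i
\]
% Energy conservation: a physically based BRDF never reflects more energy
% than it receives, for any outgoing direction.
\[
\int_{\Omega} f_r(x,\omega_i,\omega_o)\,(\omega_i \cdot n)\, d\omega_i \;\le\; 1
\qquad \forall\, \omega_o
\]
```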