AMD Showcases Video of the Mercedes-AMG Petronas F1 W12 Car Rendered in Blender 3.0 Using Two AMD Radeon PRO W6800 GPUs

Jason R. Wilson
Source: AMD

AMD and The Pixelary have created a 3D animation video to celebrate the Mercedes-AMG Petronas F1 Team's win over the most recent Formula One racing season. The victory marks the team's eighth straight FIA Formula One World Constructors' Championship. The video is also the second time AMD has collaborated with The Pixelary on an animation rendered on AMD hardware.

AMD and The Pixelary show off the combined power of AMD hardware and Blender 3.0's rendering enhancements in a new demonstration

This year's video differs from the two companies' previous collaboration. Last year's animation, featuring the 2020 championship-winning W11 EQ Performance car, was created with AMD's Radeon ProRender engine plug-in and the previous version of Blender. This time, AMD and The Pixelary harness two AMD Radeon PRO W6800 graphics cards and the new GPU support in Blender 3.0's updated Cycles renderer.


Mike Pan, Art Director of The Pixelary, a Canada-based animation studio specializing in vehicle animation and product visualization, describes how the two companies created the recent video demonstration.

Blender 3.0's Cycles renderer recently received a major update that improves performance. For artists, Blender 3.0 also brings closer feature parity between processors and graphics cards from different manufacturers. AMD's HIP API (Heterogeneous-computing Interface for Portability) replaces the aging OpenCL backend used in Blender 2.9x. HIP allows Cycles to render through a single unified code path across AMD and NVIDIA graphics cards as well as CPUs. It delivers very fast rendering speeds on supported AMD Radeon graphics cards, is validated on the AMD Radeon PRO W6800 and AMD Radeon RX 6000 series desktop GPUs, and additionally supports graphics cards based on the AMD RDNA and RDNA 2 architectures.
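For readers who want to try a similar setup, a minimal Blender 3.0 Python sketch along the following lines switches Cycles to the HIP backend and enables the detected GPUs. The property names come from the standard bpy API; the two-GPU configuration mirrors the demo hardware but is otherwise an assumption.

```python
# Minimal sketch: select the HIP backend for Cycles in Blender 3.0
# (run inside Blender's Python console; device names depend on the machine).
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "HIP"          # replaces "OPENCL" used on AMD cards in Blender 2.9x
prefs.get_devices()                        # refresh the device list

# Enable every detected HIP device, e.g. both Radeon PRO W6800 cards
for device in prefs.devices:
    device.use = (device.type == "HIP")

bpy.context.scene.cycles.device = "GPU"    # tell Cycles to render on the GPU(s)
```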

The Pixelary had previously created sixty wallpapers of the Mercedes-AMG F1 W12 car using AMD's Radeon ProRender, which gave the studio ideas for an animation demonstration built on the newer technology. The one hurdle The Pixelary faced was adapting the assets created for ProRender and converting them to Blender Cycles. The conversion was less of a headache than expected because Radeon ProRender already uses Blender's native shader network. Thanks to that earlier work, the transition mostly came down to adjusting some ray visibility flags to refine the car asset model: gaps between panels and screws read better graphically once their reflections and shadows are limited. The efficiency of this approach let the animators adjust the model section by section rather than taking the asset apart and reworking individual pieces.
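As an illustration of that kind of per-object ray visibility tweak, a short bpy sketch might look like the following. The collection name is hypothetical; in Blender 3.0 these flags live directly on the object (in 2.9x they sat under cycles_visibility).

```python
# Hypothetical sketch: limit reflections and shadows for small detail parts
# such as panel gaps and screws, grouped in an illustrative collection.
import bpy

for obj in bpy.data.collections["PanelGapsAndScrews"].objects:
    obj.visible_glossy = False   # keep these parts out of glossy reflections
    obj.visible_shadow = False   # and stop them from casting shadows
```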

Source: AMD

Blender Cycles offers volumetrics, improved light handling, and motion blur, features that rendering engines often struggle to handle correctly. Since the car is predominantly black and highly glossy, The Pixelary relied on specular reflections to define its shape. With GPU rendering and Blender's improved rendering performance, lighting becomes much more lifelike and can produce superb results in near real time. The differences between Blender's OpenGL-based Eevee renderer and the updated Cycles were striking enough for The Pixelary to push the animation process further.
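A rough sketch of that Eevee-versus-Cycles workflow in Blender's Python API could look like this; the sample count and output settings are illustrative values, not figures from the project.

```python
# Sketch: preview a scene with the rasterized Eevee engine,
# then switch the same scene to path-traced GPU Cycles for the final frame.
import bpy

scene = bpy.context.scene

# Fast look-development pass with Eevee
scene.render.engine = "BLENDER_EEVEE"

# Final-quality pass with Cycles on the GPU
scene.render.engine = "CYCLES"
scene.cycles.device = "GPU"
scene.cycles.samples = 1024          # arbitrary illustrative sample count
bpy.ops.render.render(write_still=True)
```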

Source: AMD
[Image gallery: Eevee and Cycles comparisons, temporal denoising, look development, and a collage of stills from the animation]

Mike Pan explains the concept of temporal denoising used in the video demonstration.

When it comes to the final rendering, we rendered everything at 4K with high sample counts to achieve the highest quality possible. Denoising was added to further smooth out the image.

The main concept of temporal denoising is to render individual frames at lower sample counts and merge several frames to reduce noise. This approach works extremely well for low-motion shots since the frame-to-frame difference is small. Despite this, simply blending the frames together will still result in ghosting, so we used motion vectors data to merge the frames in a more intelligent way...

As a result, mixing 3 frames together and rendering each frame at 1/3rd the sample count, we can effectively reduce our render time by 3 times with virtually no artifacts.
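The quoted approach can be sketched in a few lines of NumPy: warp the neighbouring frames onto the current one using per-pixel motion vectors (such as those from Cycles' Vector pass) and average the aligned frames. Everything below is an illustrative approximation, not The Pixelary's actual pipeline code.

```python
# Rough sketch of temporal denoising with motion-compensated frame blending.
import numpy as np

def warp(frame, motion):
    """Reproject a frame to the current frame using 2D motion vectors (H, W, 2)."""
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip((xs + motion[..., 0]).round().astype(int), 0, w - 1)
    src_y = np.clip((ys + motion[..., 1]).round().astype(int), 0, h - 1)
    return frame[src_y, src_x]

def temporal_blend(prev_frame, cur_frame, next_frame, motion_prev, motion_next):
    """Average the current frame with motion-compensated neighbours to cut noise."""
    aligned_prev = warp(prev_frame, motion_prev)
    aligned_next = warp(next_frame, motion_next)
    return (aligned_prev + cur_frame + aligned_next) / 3.0
```

Each frame can then be rendered at roughly a third of the usual sample count, matching the render-time saving Pan describes.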

Source: AMD

The final edit was assembled in Blender's Sequencer, which Pan describes as a fairly basic video editor that was nonetheless adequate for the final presentation.
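Stitching rendered shots together in the Sequencer can likewise be scripted; the following bpy sketch uses placeholder file paths and frame ranges purely for illustration.

```python
# Sketch: assemble rendered shots in Blender's Video Sequencer.
import bpy

scene = bpy.context.scene
seq = scene.sequence_editor_create()     # create the sequencer if it does not exist

seq.sequences.new_movie(name="shot_01", filepath="//renders/shot_01.mp4",
                        channel=1, frame_start=1)
seq.sequences.new_movie(name="shot_02", filepath="//renders/shot_02.mp4",
                        channel=1, frame_start=251)

scene.render.filepath = "//final_edit.mp4"
```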

Source: AMD
