LocalRay Interview – ‘We’ve Partnered with One of the Leading Console Manufacturers on Raytracing’

Alessio Palumbo

In early January, when we weren't even aware of COVID-19 and everything still seemed to be going fine, the yearly Consumer Electronics Show in Las Vegas offered the usual great showcase of new technologies. Among those, a relatively small Israeli software company called Adshir presented LocalRay, a software-only raytracing solution designed to be available even on low-end mobile devices.

Following that presentation, we reached out to Adshir to chat about LocalRay with company founder Reuven Bakalash, who previously founded HyperRoll (acquired by Oracle in 2009) and Lucidlogix Technologies (acquired by Google last year). Dr. Bakalash is the named inventor on over 200 patents in total, as you can see in this rather thorough list.

At last, a few weeks ago we managed to schedule an interview focused on LocalRay and its capabilities. Below you'll find the full transcript, where Bakalash reveals that the technology will be used by 'one of the leading console manufacturers' and by 'a major smartphone company' with deals to be announced later this year.

Greetings, Reuven! Tell us about LocalRay. 

Let's start from the beginning. I had been working on ray tracing, or at least thinking about ray tracing, for a long time. At the beginning of the '90s, I spent a couple of years at Stony Brook University, New York, on my postdoc, and that was the first time I encountered ray tracing. It was very pioneering.

We had a guest who came and gave a talk, and he showed us a raytraced image; it was the first time I saw ray tracing. And this was just the beginning. It was a beautiful image, something unseen at the time. He said this was going to be mainstream in computer graphics. I asked him how long it would take to generate such a nice picture. He replied three weeks, for a single image.

This shows how computationally complex the application of ray tracing is. Now, ten years ago, after computers got much stronger, we saw the first film rendered entirely with ray tracing, Avatar.

The rendering was done in New Zealand, in a big server farm, using about 50K computers. It took them, on average, 40 hours for a single frame.

Now, two years ago, there was a breakthrough, with a logical shift to what is called hybrid ray tracing. There was a great need in gaming to have raytracing, to make games more immersive.

Now, gaming is a completely different world, because it is real time. The only way to do it is to switch to hybrid raytracing, which mixes raytracing with rasterization technology, which excels at speed rather than quality. This happened at SIGGRAPH 2018, where NVIDIA presented its RTX 2080 graphics card and, at the same conference, we presented our software-based LocalRay raytracing product. It was running on different levels of computing, from AMD and NVIDIA high-end graphics cards to laptops and tablets, and a year ago we also showcased it on smartphones.

Let me give you a snapshot of Adshir as a company. It was founded in 2014 to design real-time, low-power raytracing. This is very important, as the low-power factor enables it for mobile devices as well.

The business model is software licensing. We are targeting the Virtual and Augmented Reality, PC and console gaming, and mobile gaming markets. We are protecting our technology with 25 granted patents and about 10 pending. The company is rather small, but we expect to grow to about 25 engineers by the end of this year. We already have some initial revenue coming from the mobile market.

I wasn't at CES, but I saw the articles and the video demonstrations. In an interview with VentureBeat, Adshir's Executive VP Offir Remez said that you have multiple deals lined up for LocalRay on console and mobile, but you cannot talk about them yet. Is that right?

That's right. It's still too early. But we are working on it and the results are very good. And the technology is completely different.

Hybrid raytracing is based on solving what are called intersection tests. You have millions upon millions of rays going into the scene, and each ray has to find an intersection with one of the polygons of the scene. Now, you can imagine there are millions of polygons and millions of rays, and this is the high complexity of the application. In order to speed it up, other ray tracing systems such as NVIDIA's RTX use what is called an acceleration structure. This is a very big binary tree that is built from the scene. So, every single ray has to traverse this tree to finally get to the triangles, and that's where the intersection tests are done. These tests tell you whether there's a hit or not; if there is a hit, maybe this is the polygon or triangle that matches the ray. Otherwise, you keep looking for another one. So, this is very complicated. And the problem is not just the long time it takes for the rays to traverse the tree and perform these billions of intersection tests.
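
To picture what Bakalash is describing, here is a minimal C++ sketch of the classic approach: a binary tree of bounding boxes that every ray traverses, running per-triangle intersection tests only in the leaves it reaches. This is our own illustration of the general technique, with invented names; it is not Adshir's or NVIDIA's actual code.

```cpp
// Minimal sketch of ray traversal through a binary acceleration structure
// (a BVH), ending in per-triangle intersection tests. Illustrative only.
#include <algorithm>
#include <cmath>
#include <utility>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Ray { Vec3 origin, dir; };
struct Triangle { Vec3 v0, v1, v2; };

// One node of the binary tree built over the scene.
struct BVHNode {
    float boundsMin[3], boundsMax[3]; // box around everything below this node
    int left = -1, right = -1;        // child indices; -1 marks a leaf
    std::vector<int> tris;            // triangle indices, filled only in leaves
};

// Slab test: does the ray pass through this node's bounding box at all?
static bool hitsBox(const Ray& r, const BVHNode& n) {
    float tmin = 0.0f, tmax = 1e30f;
    const float ro[3] = {r.origin.x, r.origin.y, r.origin.z};
    const float rd[3] = {r.dir.x, r.dir.y, r.dir.z};
    for (int i = 0; i < 3; ++i) {
        float inv = 1.0f / rd[i];
        float t0 = (n.boundsMin[i] - ro[i]) * inv;
        float t1 = (n.boundsMax[i] - ro[i]) * inv;
        if (t0 > t1) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
    }
    return tmin <= tmax;
}

// Moller-Trumbore: the per-triangle "is there a hit?" test Bakalash mentions.
static bool hitsTriangle(const Ray& r, const Triangle& t, float& dist) {
    Vec3 e1 = sub(t.v1, t.v0), e2 = sub(t.v2, t.v0);
    Vec3 p = cross(r.dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < 1e-8f) return false;   // ray parallel to the triangle
    float inv = 1.0f / det;
    Vec3 s = sub(r.origin, t.v0);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;
    Vec3 q = cross(s, e1);
    float v = dot(r.dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;
    dist = dot(e2, q) * inv;
    return dist > 1e-8f;                        // hit in front of the ray origin
}

// Each ray walks the tree, skipping whole subtrees whose boxes it misses,
// so only a small fraction of the scene's triangles is ever tested.
static bool traverse(const std::vector<BVHNode>& nodes,
                     const std::vector<Triangle>& tris,
                     const Ray& ray, int node, float& nearest) {
    const BVHNode& n = nodes[node];
    if (!hitsBox(ray, n)) return false;
    if (n.left < 0) {                           // leaf: run the intersection tests
        bool hit = false;
        for (int i : n.tris) {
            float d;
            if (hitsTriangle(ray, tris[i], d) && d < nearest) { nearest = d; hit = true; }
        }
        return hit;
    }
    bool hitL = traverse(nodes, tris, ray, n.left, nearest);
    bool hitR = traverse(nodes, tris, ray, n.right, nearest);
    return hitL || hitR;
}
```

Multiply this walk by millions of rays per frame and the cost Bakalash describes becomes clear.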

There's another problem that's even worse if you want to animate the scene, and that's exactly what is done in gaming. Say you want to animate, especially with skinned animation (which is not just moving an object; it means the character can move hands and legs and so on). The problem is that this acceleration structure is static, not dynamic, which means that for animation, which changes your scene, you have to rebuild the acceleration structure. Now, this rebuilding is very, very expensive in terms of time, so it works against animation. There's a problem. The way this is handled, for example by NVIDIA, is with brute force, by just using a very, very powerful GPU. This, of course, speeds up everything. But if you want low power, this is a real problem; you cannot make it work. So we started to design the whole thing bottom-up, with the idea of delivering this solution to mobiles, to the lowest end.

What we have is actually a different kind of acceleration structure. This is a proprietary, dynamic acceleration structure, which means you can change the scene and animate it, and this is done very fast. You don't have to rebuild the structure; the animation only makes local changes. So, it's very fast and very efficient, and this is the big difference. It's also cross-platform and cross-level software, which means it can run on different platforms, starting from mobile, to tablets, to laptops, up to high-end NVIDIA and AMD GPUs. This is the kind of vertical and horizontal scalability we wanted to achieve.
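
Adshir has not published how its dynamic structure works, so the following is only a sketch of the general idea of "local changes". One well-known way to avoid a full rebuild is refitting: when skinned animation moves some triangles, only the bounding boxes on the paths from those leaves up to the root are recomputed, while the tree's shape stays intact. All names here are invented for illustration and should not be read as LocalRay's actual implementation.

```cpp
// Hedged sketch of "refitting" a bounding-volume tree after animation:
// update only the boxes touched by moved geometry instead of rebuilding.
#include <algorithm>
#include <vector>

struct AABB {
    float min[3], max[3];
    void grow(const AABB& o) {          // expand this box to also cover o
        for (int i = 0; i < 3; ++i) {
            min[i] = std::min(min[i], o.min[i]);
            max[i] = std::max(max[i], o.max[i]);
        }
    }
};

struct Node {
    AABB box;
    int parent = -1;                    // -1 at the root
    int left = -1, right = -1;          // -1 for leaves
};

// After skinning moves some triangles, recompute only the changed leaf boxes
// and push the updates upward. Cost is O(tree depth) per moved leaf instead
// of rebuilding the whole structure from scratch.
void refit(std::vector<Node>& nodes, const std::vector<int>& movedLeaves,
           const std::vector<AABB>& newLeafBoxes) {
    for (size_t i = 0; i < movedLeaves.size(); ++i) {
        int n = movedLeaves[i];
        nodes[n].box = newLeafBoxes[i];              // local change at the leaf
        for (int p = nodes[n].parent; p != -1; p = nodes[p].parent) {
            AABB merged = nodes[nodes[p].left].box;  // parent box = union
            merged.grow(nodes[nodes[p].right].box);  // of its children's boxes
            nodes[p].box = merged;
        }
    }
}
```

The known trade-off of plain refitting is that boxes can grow loose over long animations; whatever Adshir layers on top of local updates is precisely the proprietary part.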

Does that mean your LocalRay technology also works on Intel's integrated graphics for example? Did you test that?

We run it, and you can see it on the website and also on YouTube: the New York scene with Spider-Man. It runs on a high-end GPU, but it runs on a tablet as well. That one is a Microsoft Surface tablet, where the GPU is Intel's integrated Iris, and it drives the demo very nicely. LocalRay also runs on ARM, on smartphones.

And as you've said, it can also run on consoles, right?

I wouldn't say that it runs just yet, but we are working on it and it will run.

I understand that you cannot go into detail about these deals. But is the deal with a new console manufacturer or a manufacturer that's already well established in the console market?

Of course, yeah. One of the leading.

That is interesting and we're definitely looking forward to this particular reveal. By the way, I'm wondering if LocalRay can somehow take advantage of the RT cores available in NVIDIA's RTX graphics cards.

We actually solve the problem with our software only. But of course, every software algorithm can be supported by hardware. So maybe in the future there will be a development; I mean, not by us, because we are not a chip company, but maybe some chip companies will develop hardware acceleration for our solution. This probably will happen.

Have you already been in contact with AMD?

We talked with AMD in the past, and we talked with others too, so it wasn't specifically AMD, but we talked to them as well.

What about NVIDIA?

No, they have their RTX raytracing system which is completely different.

Your LocalRay is meant to be fast enough to run even on low-power devices, as you've said. But in terms of quality, how much of a difference is there with the RTX raytracing that we've seen so far in some PC games?

No, this is physically correct. It must be physically correct, because otherwise it's not raytracing. In hybrid raytracing, you have the luxury that you don't have to ray trace the entire scene. This is true not just for us but for NVIDIA's RTX as well, since they are also using a hybrid ray tracing approach.

So, if you want to run it in real time and you have, for example, too much data in the scene, you can make only part of the scene raytraced. But that part of the scene, the raytraced part, will be physically correct. Of course, if you run the same game or the same scene on a high-end GPU and on a smartphone, it's not the same. With mobile games, you want to be in line with high-end PCs, but of course it's not exactly the same; it cannot be the same. But it's still ray tracing, and it still offers a different quality than rasterization.
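
As a rough illustration of that "only part of the scene" idea, here is a tiny C++ sketch of a hybrid frame: rasterization fills every pixel first, and rays are spawned only for surfaces flagged for ray tracing. The types and function names are invented for this example; they are not LocalRay's or any engine's actual API.

```cpp
// Hedged sketch of hybrid rendering: rasterize everything, then ray trace
// only the pixels whose material was flagged (mirrors, water, and so on).
#include <vector>

struct Pixel {
    bool rayTracedMaterial;  // artist marked this surface for ray tracing
    float color[3];          // already filled by ordinary rasterization
};

struct GBuffer {
    int width = 0, height = 0;
    std::vector<Pixel> pixels;
};

// Stand-in for a real reflection trace (acceleration structure traversal,
// shading of the hit point); here it just tints the pixel to mark the path.
static void traceReflection(Pixel& p) {
    p.color[0] *= 0.9f;
    p.color[1] *= 0.95f;
    p.color[2] = 1.0f;
}

static void composeFrame(GBuffer& gb) {
    for (Pixel& p : gb.pixels) {
        // Only the flagged subset of the scene pays the ray tracing cost,
        // and that subset is physically correct; everything else stays raster.
        if (p.rayTracedMaterial) traceReflection(p);
    }
}
```

On a weaker device, the same idea scales down simply by flagging fewer surfaces, which matches the cross-level behavior Bakalash describes.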

That is the point, because today games are still built on rasterization, while NVIDIA's raytracing is supposed to make the games nicer. But the question is, do we want raytracing to be just nice to have? Is that enough? Because if you have the same game, just nicer, it's not enough. I think what we need is to make raytracing in games a kind of must-have. Raytracing has some unique features that you cannot find in raster, like reflections.

So using these specific features of ray tracing, you can create a new kind of game. And I assume that in a few years, a couple of years, we will start to see games which are different, where playing with rasterization is not enough; you can only play them with raytracing, and that makes something completely different. I'm not talking just about augmented reality, where if you are not able to make a physically correct character, you won't believe it is real. Also in regular games, the point is not just to make them more realistic but to use features like reflections as gameplay tools that assist you in winning the game. If you don't have them, you cannot win the game. See what I mean?

Indeed. I'm wondering, have you already licensed the LocalRay software for Unity and Unreal Engine?

Probably by the end of 2020. Actually, LocalRay today runs integrated with Unity; everything that is done with LocalRay is done with Unity, so all the demos that you see are made with Unity. But we are not ready to sell it yet.

From the CES reports, the option to integrate LocalRay on a hardware level was also mentioned. Is that happening in some cases?

It's happening. I cannot talk about this, but it was already sold on the market, not by us but by a big company, a big mobile company. And this is the beginning. I can't tell you which company, but it is a major company in the smartphone market.

Is your software going to use OpenGL, Vulkan, or DirectX 12?

We run on Windows and on Android. So DirectX on Windows and Vulkan on Android.

Alright. Does LocalRay use DirectX Raytracing (DXR), Microsoft's dedicated API?

Right now, we're not using Microsoft's DXR API, but we actually don't need this extension. These extensions exist, you see, for hardware-based acceleration structures.

Of course, some game developers have already been using NVIDIA's RTX platform for almost two years now. Do you think there will be a kind of competition with them, or is it going to be easy for game developers to make their games compatible with both NVIDIA's RTX and your LocalRay?

It's very simple. We have some artists, game artists, working with LocalRay. It is integrated into Unity; you just use it as a material. This is very easy and it saves a lot of time for artists, because instead of coming up with workarounds to fake the effects with different rasterization tricks, you get them directly with ray tracing.

Yeah, I'm aware of that. This could also make game development smoother and faster.

Yes, yes. Because the most expensive factor in game development is the human work of artists. This could reduce the work of artists.

So far, no Virtual Reality (VR) games feature any form of raytracing. VR is already complex on its own because, for one thing, it has to render two images to create stereoscopy. Could your low-power, fast, software-only LocalRay solution help usher raytracing into VR games as well?

That's the problem: you need to double the frame rate. So it's still too early, but it will happen. It depends, of course, on the scene, the game, the complexity, but it will happen.
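
To put some illustrative numbers on that: a flat mobile game might target a single 1080p image at 30 or 60 frames per second, while a VR headset must render two views, one per eye, at 72 or 90 Hz each to avoid discomfort. Under those assumptions, the renderer has to produce roughly three to six times as many pixels per second, and every ray-traced effect scales with that pixel count, which is why Bakalash considers it too early for low-power hardware.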

When do you think we'll see the first commercial game using the LocalRay raytracing software?

Later this year. 2020, for sure.

Alright. Thank you for your time.
