Garrett Johnson

Software Engineer at Toyota's Woven Planet | NASA JPL | Living in Tokyo, Japan

#threejs collaborator and graphics enthusiast. All opinions are my own. Maintainer of some cool three.js projects.

@garrettkjohnson on Twitter.

Just added a mode to three-gpu-pathtracer that lets you render out ray depth rather than surface color, to help visualize where some of the more expensive surfaces are.

These pictures show the debug image raw and with a postprocess color ramp applied.
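
The ramp itself is nothing fancy. Roughly this kind of mapping, as a sketch (not the actual postprocess code, and maxDepth is just an illustrative cap):

// Sketch: map an accumulated ray depth value to a blue-to-red ramp
// so the more expensive pixels stand out.
function depthToRampColor( rayDepth, maxDepth = 10 ) {
  const t = Math.min( rayDepth / maxDepth, 1 );
  return [ t, 0.15, 1 - t ]; // [ r, g, b ], blue = cheap, red = expensive
}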

#threejs #webgl #raytracing

Three-gpu-pathtracer revision 14 is out! This release adds support for volumetric fog along with some small performance improvements!

Changelog here:
github.com/gkjohnson/three-gpu

#threejs #webgl #raytracing #javascript

And the model is by "ganzhugav" on Sketchfab!

sketchfab.com/3d-models/colour

Another bonus shot - here the fog is just contained within a cylinder around the model.

2/2

I've gotten some of the final quirks worked out of the fog material for three-gpu-pathtracer, including the fireflies that were coming from subsequent specular hits.

Demo here!
gkjohnson.github.io/three-gpu-

1/2

#threejs #webgl #raytracing #javascript #graphics

I wonder how it would work if the same seed was used for the first few rays but a per-px seed was introduced for the next few? The benefits of stratification may be lost, tho.
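
Something like this little sketch is what I'm imagining (purely hypothetical, not in the project):

// Sketch: share one seed across the whole image for the first bounces,
// then fall back to the per-pixel seed so later bounces decorrelate.
function seedForBounce( bounce, frameSeed, pixelSeed, sharedBounces = 2 ) {
  return bounce < sharedBounces ? frameSeed : pixelSeed;
}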

More experimenting to be done 🤷 Any added thoughts appreciated!

4/4

You can see the negative effects of using the same seed across the whole image in the glossy floor reflections here.

Maybe adding a smaller ray variation per pixel so rays travel in roughly the same direction per wavefront would help the look & still give perf improvement?

3/4

Using the same seed for an 8x8 px block gives similar perf improvements, presumably because the wavefronts can share a lot of calculations. But the artifacts are still severe.
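
Roughly what I mean by per-pixel vs. block-shared seeds, as a sketch (the hash here is just a stand-in, not the shader's RNG):

// Sketch: per-pixel seed vs. a seed shared by every pixel in an 8x8 block.
function hash( x ) {
  // Wang-style integer hash, stand-in for the shader's RNG
  x = ( ( x ^ 61 ) ^ ( x >>> 16 ) ) >>> 0;
  x = ( x + ( x << 3 ) ) >>> 0;
  x = ( x ^ ( x >>> 4 ) ) >>> 0;
  x = Math.imul( x, 0x27d4eb2d ) >>> 0;
  return ( x ^ ( x >>> 15 ) ) >>> 0;
}

function pixelSeed( x, y, width, frame ) {
  // every pixel gets its own random sequence
  return hash( hash( x + y * width ) + frame );
}

function blockSeed( x, y, width, frame ) {
  // all pixels in an 8x8 block share a sequence, so neighboring GPU threads
  // branch the same way and sample the same data
  const bx = x >> 3, by = y >> 3;
  return hash( hash( bx + by * Math.ceil( width / 8 ) ) + frame );
}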

2/4

I tried to remove the per-pixel random seed from the pathtracer to see what kind of perf gain it would give & I'm seeing over 2x FPS per sample! Lots of artifacts over time tho.

I wonder if there's a way to keep some perf gain without quality loss? 👇

1/4

#threejs #raytracing

@demofox Yup! Already using RR - something like this would be an option for users to set with caveats stating that it will affect the fidelity of the final render for the sake of render time.
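
(For context, the RR here is the usual throughput-based termination, roughly like this sketch:)

// Sketch of standard Russian roulette: kill dim paths with some probability
// and boost the survivors so the estimate stays unbiased.
function russianRoulette( throughput, rand ) {
  const p = Math.min( Math.max( throughput[ 0 ], throughput[ 1 ], throughput[ 2 ] ), 0.95 );
  if ( rand >= p ) return null; // terminate this path
  return throughput.map( c => c / p ); // continue with compensated throughput
}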

I'm planning to look into a setting to limit the number of fog scatters to prevent cases where rays will bounce around extensively within the fog.

And then I'll probably try artificially attenuating the lighting contribution from multiple fog bounces to alleviate fireflies. 🤞
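
Roughly the shape of it, as a sketch (maxScatters and the falloff are made-up knobs, not existing options):

// Sketch: hard-cap the number of volume scatter events and artificially
// damp the contribution of the later ones to tame fireflies.
function fogScatterWeight( scatterCount, maxScatters = 4, falloff = 0.5 ) {
  if ( scatterCount > maxScatters ) return 0; // stop scattering entirely
  return Math.pow( falloff, Math.max( scatterCount - 1, 0 ) ); // 1, 0.5, 0.25, ...
}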

2/2

More fog rendering with three-gpu-pathtracer. With the added particle bounces these renders take so much longer. I'm going to try a few things listed below to limit the number of fog bounces to improve render time & reduce fireflies, but any other ideas are helpful!

1/2

#threejs #webgl

@sguimmara Yeah I've tried a number of things. This is an issue others have seen on Adreno, as well, unfortunately.

I plan to provide a utility class that would tell you whether the pathtracer can run on your platform (shaders compile, precision is high) so you can provide a reasonable fallback otherwise.
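
Probably something along these lines, as a sketch (the function name is hypothetical, and the float render target requirement is my assumption):

// Sketch: check the WebGL features the pathtracer relies on before using it.
function canRunPathTracer( gl ) {
  // fragment shaders need real highp float support
  const highp = gl.getShaderPrecisionFormat( gl.FRAGMENT_SHADER, gl.HIGH_FLOAT );
  // float render targets for sample accumulation (assumption)
  const floatTargets = gl.getExtension( 'EXT_color_buffer_float' ) !== null;
  return highp !== null && highp.precision > 0 && floatTargets;
}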

Hopefully compute shaders & WebGPU will bring more consistency to everything

2/2

As three-gpu-pathtracer becomes more complex I'm starting to think I'll have to start using structs for raytracing hits rather than keep accommodating this Qualcomm Android precision bug. It's becoming a pain to maintain otherwise.

Any thoughts on this?

1/2

#threejs #webgl

@demofox Even for path tracing? At the moment I'm using Owen Scrambled Sobol numbers after what was discussed in this thread, where it sounded like blue noise wasn't the best option:

twitter.com/Atrix256/status/16

In this case the fog isn't being rendered with raymarching - it's a fog with random chance to hit a particle that scatters the ray in a random direction.
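
i.e. the classic homogeneous-medium trick, roughly like this sketch (not necessarily the exact shader logic):

// Sketch: sample the distance to the next "particle" hit; if it comes before
// the surface, scatter the ray there in a new random direction.
function sampleFogHit( density, surfaceDistance, rand ) {
  const scatterDistance = - Math.log( 1 - rand ) / density;
  return scatterDistance < surfaceDistance
    ? { scattered: true, distance: scatterDistance }   // hit a fog particle
    : { scattered: false, distance: surfaceDistance }; // passed through to the surface
}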

Fog volumetrics are coming next in three-gpu-pathtracer! This is an image with a fairly bright spotlight & a volume filling the whole space.

It adds a lot of bounces & variance to the scene so it's more intensive + requires more samples, but it's a nice effect.

If I understand right, this is something that would benefit pretty heavily from bidirectional path tracing, but that will have to come another time 😅

#threejs #raytracing

Here's a quick look at some of the models from the model-viewer tests that use iridescence and sheen! Both features are quite a bit improved in this release.

2/2

three-gpu-pathtracer v0.0.13 is now out! This version includes iridescence & sheen quality fixes as well as improved perf w/ early path termination & quilt rendering for multi-view displays!

Notes:
github.com/gkjohnson/three-gpu

#threejs #webgl #raytracing #graphics #holograms

1/2

Since the path tracer needs to keep rendering to accumulate samples over multiple frames, I've added a utility for rendering the path traced frames to a quilt, which is then rendered directly to the WebXR buffer rather than using three.js' ArrayCamera. It's a bit of a hack but it works!
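
For reference, the quilt is just a grid of views packed into a single texture, so each path traced view accumulates into a tile like this (sketch, hypothetical helper):

// Sketch: pixel rectangle for view i in a cols x rows quilt texture.
function quiltTile( viewIndex, cols, rows, quiltWidth, quiltHeight ) {
  const tileWidth = Math.floor( quiltWidth / cols );
  const tileHeight = Math.floor( quiltHeight / rows );
  const x = ( viewIndex % cols ) * tileWidth;
  const y = Math.floor( viewIndex / cols ) * tileHeight;
  return { x, y, width: tileWidth, height: tileHeight };
}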

I've finally gotten a model live rendering to the Looking Glass portrait with three-gpu-pathtracer using WebXR! You'll need a display to see the effect but there's a preview rendered, as well.

Here's the link!
gkjohnson.github.io/three-gpu-

#hologram #ar #vr #threejs #webgl #raytracing
