This project builds a browser WebGPU scene where:
- Zig (compiled to WASM) owns world state, camera math, floor mesh data, and WGSL emission.
- JavaScript only handles browser APIs: WASM loading, WebGPU setup, input events, and frame submission.
- Video playback is rendered onto a quad inside the 3D scene with a 3D TV-style frame around it.
- `BoomBox.fbx` is loaded at runtime and rendered on the floor near the TV.
- `tape.glb` is loaded at runtime and rendered on the floor near the TV screen.
- `player.glb` is loaded at runtime and rendered on the floor in front of the TV.
- A curved 3D banner above the TV reads "VJS 10 Theater".
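Since Zig emits the WGSL itself, the JavaScript host only needs to read the shader text out of the shared linear memory. A minimal sketch of that decoding step, assuming the Zig side hands back a `(ptr, len)` pair (the exact export names and a stand-in buffer here are hypothetical):

```javascript
// Sketch: decode a WGSL string exposed by the WASM side as (ptr, len)
// into the shared linear memory buffer.
function readWasmString(memoryBuffer, ptr, len) {
  const bytes = new Uint8Array(memoryBuffer, ptr, len);
  return new TextDecoder("utf-8").decode(bytes);
}

// Simulated usage with a stand-in for WASM linear memory:
const fakeMemory = new ArrayBuffer(64);
const wgsl = "@vertex fn vs() {}";
new Uint8Array(fakeMemory).set(new TextEncoder().encode(wgsl), 8);
console.log(readWasmString(fakeMemory, 8, wgsl.length));
```

In the real host the buffer would be `instance.exports.memory.buffer`, and the decoded string would be passed to `device.createShaderModule({ code })`.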
Run `npm install`, then `npm run dev`. This script:
- builds the Zig WASM module via `zig build wasm`
- copies `zig-out/web/webgpu_world.wasm` into `web/public/`
- starts Vite with `web/` as the app root
Run `npm run build`. Build output:

- `zig-out/web/index.html`
- `zig-out/web/assets/*`
- `zig-out/web/webgpu_world.wasm`
- Move: `W` `A` `S` `D`
- Up/Down: `E` / `Q`
- Sprint: `Shift`
- Look: mouse (click the canvas or use the `Lock Cursor` button for pointer lock)
- In-world video: use the `Play Video` / `Pause Video` buttons
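On the host side, the controls above reduce to folding the currently held keys into a per-frame movement vector that gets forwarded to the WASM update function. A sketch of that mapping (the sprint multiplier value and axis conventions are assumptions for illustration):

```javascript
// Sketch: map the set of held keys to a movement vector for one frame.
// -z is "forward"; Shift applies a sprint multiplier (value assumed).
function movementFromKeys(keys) {
  const speed = keys.has("Shift") ? 2.5 : 1.0;
  let x = 0, y = 0, z = 0;
  if (keys.has("w")) z -= 1; // forward
  if (keys.has("s")) z += 1; // back
  if (keys.has("a")) x -= 1; // strafe left
  if (keys.has("d")) x += 1; // strafe right
  if (keys.has("e")) y += 1; // up
  if (keys.has("q")) y -= 1; // down
  return { x: x * speed, y: y * speed, z: z * speed };
}

// Example: W + D + Shift held → forward-right at sprint speed.
console.log(movementFromKeys(new Set(["w", "d", "Shift"])));
```

The host would maintain the `keys` set from `keydown`/`keyup` events and call this once per animation frame.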
- Uses `@videojs/react@next` from npm in `web/src/videojs-player-host.js`.
- Renders a hidden `createPlayer(...)` `Provider` + `VideoSkin` + `Video` tree and reuses the rendered `<video>` element as the WebGPU video texture source.
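When the `<video>` element's frames are sampled onto the TV quad, the video's aspect ratio generally won't match the quad's, so the frame needs letterboxing. A sketch of the per-axis UV scale computation (the quad aspect value is an assumption for illustration; the project may handle this differently):

```javascript
// Sketch: compute per-axis UV scale so a video of the given pixel size
// fits the quad without stretching (letterbox/pillarbox as needed).
function fitVideoToQuad(videoW, videoH, quadAspect) {
  const videoAspect = videoW / videoH;
  if (videoAspect > quadAspect) {
    // Video is wider than the quad: fill width, shrink height.
    return { scaleX: 1, scaleY: quadAspect / videoAspect };
  }
  // Video is taller than (or matches) the quad: fill height, shrink width.
  return { scaleX: videoAspect / quadAspect, scaleY: 1 };
}

console.log(fitVideoToQuad(1920, 1080, 16 / 9)); // 16:9 video exactly fills a 16:9 quad
```

The resulting scale pair would be uploaded as a uniform and applied to the quad's texture coordinates in the fragment shader.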