If you didn’t know, Minecraft has a built-in lightmapper which it uses to decide light levels. It’s very simple and just uses flood fill, which is why light sources form rhombus shapes, but that’s fitting for a pixelated block game.
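
To see why flood fill makes those rhombus shapes: light loses one level per block as it spreads, so the lit region is everything within a Manhattan distance of the source, which is a diamond. A minimal 2D sketch (not Minecraft’s actual code, just the same idea):

```python
from collections import deque

def flood_fill_light(size, source, level=15):
    """Spread light from a source, losing 1 level per block (flood fill)."""
    grid = [[0] * size for _ in range(size)]
    sx, sy = source
    grid[sy][sx] = level
    queue = deque([(sx, sy)])
    while queue:
        x, y = queue.popleft()
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < size and 0 <= ny < size and grid[ny][nx] < grid[y][x] - 1:
                grid[ny][nx] = grid[y][x] - 1
                queue.append((nx, ny))
    return grid

grid = flood_fill_light(9, (4, 4), level=5)
# Every lit cell ends up with: light = level - manhattan_distance(cell, source),
# so the nonzero region is a diamond (rhombus) around the source.
```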

There’s Minecraft RTX, which bypasses the lightmapper entirely and uses very expensive real-time ray tracing, although at least it renders at a lower resolution and upscales with an algorithm, and it requires a specific type of graphics card.

But why not improve the built-in lightmapper instead? It should be possible to:

  • Increase the resolution to 1/4 block instead of 1 block
  • Use RGB instead of grayscale light levels
  • Give each block a radiance color based on the average color of its texture
  • For lighting, do:
  1. Calculate direct sun exposure with a simple raycast, apply the sun color
  2. Calculate sky exposure, apply the sky color
  3. Calculate direct light sources, apply each light source’s color
  4. Do light bounces using the block radiance colors, apply the resulting color
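
The four passes could then be combined per cell into a single RGB value. A rough sketch of what that combination step might look like (the parameter names and the sun/sky colors here are made up for illustration, not Minecraft internals):

```python
def combine_passes(direct, sky, emissive, bounce,
                   sun_color=(1.0, 0.95, 0.85), sky_color=(0.5, 0.65, 1.0)):
    """Combine the four lighting passes into one RGB value for a cell.

    direct/sky are scalar exposure factors in [0, 1]; emissive and bounce
    are RGB tuples already scaled by light-source / block-radiance colors.
    """
    r = direct * sun_color[0] + sky * sky_color[0] + emissive[0] + bounce[0]
    g = direct * sun_color[1] + sky * sky_color[1] + emissive[1] + bounce[1]
    b = direct * sun_color[2] + sky * sky_color[2] + emissive[2] + bounce[2]
    # Clamp so fully lit cells don't blow out past white.
    return tuple(min(c, 1.0) for c in (r, g, b))
```

A cell in full sunlight with nothing else contributing just gets the sun color; a cave cell gets only the emissive and bounce terms, which is where the colored block radiance would show up.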

There are also a lot of ways to optimize:

  • Asynchronously calculate the closest chunks first
  • Only use the current chunk and the chunks connected to it straight and diagonally
  • Recycle sky exposure, useful when only the sun is moving
  • Only update sky lighting asynchronously every 2 seconds and blend in between
  • After the lighting is baked, rendering is MUCH faster since it’s just coloring geometry with the stored data
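
The first two bullets together mean: take the 3×3 neighborhood of chunks around the player (straight + diagonal neighbors) and relight them nearest-first. A small sketch of that ordering, with made-up chunk coordinates:

```python
def chunk_update_order(player_chunk, radius=1):
    """Order the (2*radius+1)^2 neighborhood of chunks nearest-first."""
    px, py = player_chunk
    neighborhood = [(px + dx, py + dy)
                    for dx in range(-radius, radius + 1)
                    for dy in range(-radius, radius + 1)]
    # Closest chunks first, so the player's own area is relit
    # before the straight neighbors, and diagonals come last.
    return sorted(neighborhood, key=lambda c: (c[0] - px) ** 2 + (c[1] - py) ** 2)

order = chunk_update_order((0, 0))
# order[0] is the player's chunk, then the 4 straight neighbors, then the 4 diagonals.
```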

Async - spread the calculation over a period of time instead of doing it all at once, reducing lag spikes
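
One common way to do that kind of time-slicing is a resumable job with a per-tick time budget: the game loop pumps it once per tick and it stops itself when the budget runs out. A hedged sketch (the function and parameter names are illustrative):

```python
import time

def relight_incrementally(cells, compute, budget_ms=2.0):
    """Generator: process cells until the per-tick time budget runs out, then yield.

    Call next() on it once per game tick; the lighting work is spread
    across many ticks instead of causing one big lag spike.
    """
    deadline = time.perf_counter() + budget_ms / 1000.0
    for cell in cells:
        compute(cell)
        if time.perf_counter() >= deadline:
            yield  # hand control back to the game loop until next tick
            deadline = time.perf_counter() + budget_ms / 1000.0
```

Usage would be something like `job = relight_incrementally(dirty_cells, recompute_light)` and then `next(job, None)` once per tick until it’s exhausted.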

It would be much faster, would still technically be ray tracing, and would achieve similar lighting, making it a better option than full real-time ray tracing. Thoughts?