Generative AI could be an interesting approach to the issue of solving the "what happens if you destroy [any particular element]" aspect.
For a lot of games you'd probably still want specific destinations set in the map; maybe now it's just much more open-ended in how you get there. Think the ascend-through-matter stuff in Tears of the Kingdom, but more open-ended in a "just start trying to dig anywhere" way, with gen AI figuring out exactly how much dirt or other material piles up when you dig in a specific place.
Or for games with more of an emphasis on random drops, or random maps, you could leverage some of the randomness more directly. Could be really cool for a roguelike.
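The "leverage the randomness more directly" idea for roguelikes usually comes down to seeded, reproducible randomness, so a run can be shared or replayed. A minimal sketch of that (the `roll_drops` helper and the loot table are hypothetical, just plain stdlib Python):

```python
import random

def roll_drops(seed, loot_table, n_rolls=3):
    """Roll n_rolls weighted drops, reproducibly, from a seed.

    Hypothetical helper: the same seed always yields the same
    drops, so a generated run and its loot can be shared/replayed.
    """
    rng = random.Random(seed)  # private RNG; doesn't disturb global state
    items = list(loot_table)
    weights = [loot_table[item] for item in items]
    return [rng.choices(items, weights=weights)[0] for _ in range(n_rolls)]

table = {"sword": 1, "potion": 5, "gold": 10}
# Same seed, same drops -- the property that makes seeded runs shareable:
assert roll_drops(42, table) == roll_drops(42, table)
```

A generative model could then sit on top of a scheme like this as a fancier "noise source" for maps or items, while the seed still keeps runs reproducible.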
There's still a ways to go, but I don't really think AI will be required if game engine pipelines extend PBR to destruction physics. The biggest bottleneck is frankly the performance hit it would entail, along with the dynamic lighting it inherently requires.
I'd like to do a deeper dive into the two approaches, but on a surface level one interesting note is that Oasis specifically mentions using a purpose-built ASIC (presumably for inference?):
> When Etched's transformer ASIC, Sohu, is released, we can run models like Oasis in 4K.
They made a great decision by letting me download the video of my journey when the demo ended. I saw some neat constructs and was thinking, "I wish I could show this to some friends." I can!
If you get lost in the dark like I did, just press E to bring up the items menu. Doing that a couple of times and moving around a bit brought some novelty back into view for it to work with.