Portals and Quake
181 points by ibobev 19 hours ago | 47 comments
  • paulryanrogers 17 hours ago |
    Interesting. I recall Prey being the first game to hype portals for rendering, though I think theirs may have been for drawing, not just culling? Does anyone know how it worked?

    Dark Forces apparently also used portals for culling, IIRC.

    • to11mtm 16 hours ago |
      I think Build used some form of portals as well... primarily for water/underwater transitions, but there was at least one level in Duke3D that used it for some non-euclidean geometry.
      • d3VwsX 16 hours ago |
        Pretty sure water was just a teleporter? You may be thinking about the mirror effect? I think that was a kind of portal effect. You had to make a large empty section behind each mirror, which the engine used to render a mirrored copy of things in front of the mirror.

        You could make some weird impossible geometries by just superimposing sections. Two sections could occupy the exact same coordinates, but movement and rendering were done across edges shared by sections and did not care whether there was some other section in the same space, as long as there was never a way to see both sections at the same time.
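
        A minimal sketch of that idea in C (made-up data): traversal only ever follows shared edges, so nothing stops two sectors from sitting on the same world-space coordinates.

          /* Build-style sector traversal over hypothetical data. Movement
             crosses shared edges only; the engine never asks whether some
             other sector occupies the same space. */
          #include <stdio.h>

          typedef struct Sector Sector;

          typedef struct {
              float x0, y0, x1, y1;  /* wall segment in world space */
              Sector *neighbor;      /* NULL = solid wall, else walk through */
          } Wall;

          struct Sector {
              const char *name;
              Wall walls[1];
          };

          int main(void) {
              /* Two rooms deliberately placed on the SAME coordinates... */
              static Sector a = { "room A" };
              static Sector b = { "room B" };
              a.walls[0] = (Wall){ 0, 0, 10, 0, &b };  /* shared edge into B */
              b.walls[0] = (Wall){ 0, 0, 10, 0, &a };  /* ...and back into A */

              /* ...yet crossing the shared edge is all that matters. */
              Sector *cur = &a;
              for (int step = 0; step < 4; step++) {
                  printf("standing in %s\n", cur->name);
                  cur = cur->walls[0].neighbor;        /* cross the portal edge */
              }
              return 0;
          }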

        • CyberDildonics 16 hours ago |
          > Pretty sure water was just a teleporter? You may be thinking about the mirror effect?

          That's what they just said. They didn't mention the mirror effect. The mirror effect was done with duplicate geometry that occupied the overlapping space of whatever was behind the mirror. The Build engine also didn't have vertical levels.

          • paulryanrogers 13 hours ago |
            Shadow Warrior (Build) did have some room-over-room stuff. But IIRC it still leveraged offset sectors, like Duke's water. FWIW, some modern Doom ports appear to have this as well, at least with levels that support it.
        • Sharlin 8 hours ago |
          A teleporter is just a portal that you can’t see through.

          > You could make some weird impossible geometries by just superimposing sections. Two sections could occupy the exact same coordinates, but movement and rendering were done across edges shared by sections and did not care about if there was some other section in the same space. As long as there was never a way to see those sections at the same time.

          This is exactly what portals are.

      • smitelli 16 hours ago |
        E2L11 “Lunatic Fringe.” The map has an outer ring containing 720 degrees of hallway. Surprisingly disorienting even when you know that’s what’s going on.
    • phire 16 hours ago |
      No. Prey was one of the first games to hype portals and non-euclidean maps for gameplay.

      Portals for rendering date back to the dawn of 3D graphics, and many early FPS engines used the concept.

      But these portals were only there as a rendering optimisation. While you could abuse them to make non-euclidean maps, the tools were intended to make proper maps, and the portals would be invisible to players.
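
      As a rough sketch of the optimisation (invented geometry; a real engine clips view frustums or screen columns, not just axis-aligned rectangles):

        /* Portal culling: recurse only into sectors reachable through the
           on-screen part of a portal. */
        #include <stdio.h>

        typedef struct { int x0, y0, x1, y1; } Rect;  /* screen-space box */

        typedef struct Sector Sector;
        typedef struct {
            Rect screen;     /* portal's projected screen rectangle */
            Sector *target;  /* sector visible through this portal */
        } Portal;

        struct Sector {
            const char *name;
            int num_portals;
            Portal portals[2];
        };

        static Rect clip(Rect a, Rect b) {  /* intersection of two boxes */
            Rect r = { a.x0 > b.x0 ? a.x0 : b.x0, a.y0 > b.y0 ? a.y0 : b.y0,
                       a.x1 < b.x1 ? a.x1 : b.x1, a.y1 < b.y1 ? a.y1 : b.y1 };
            return r;
        }

        static int empty(Rect r) { return r.x0 >= r.x1 || r.y0 >= r.y1; }

        static void render(const Sector *s, Rect view) {
            printf("draw %s in [%d,%d]-[%d,%d]\n",
                   s->name, view.x0, view.y0, view.x1, view.y1);
            for (int i = 0; i < s->num_portals; i++) {
                Rect v = clip(view, s->portals[i].screen);
                if (!empty(v))  /* recurse only through visible portals */
                    render(s->portals[i].target, v);
            }
        }

        int main(void) {
            static Sector hall = { "hall" };
            static Sector yard = { "yard" };
            static Sector room = { "room", 2 };
            room.portals[0] = (Portal){ {100, 60, 220, 140}, &hall };
            room.portals[1] = (Portal){ {400, 60, 500, 140}, &yard };
            render(&room, (Rect){ 0, 0, 320, 200 });  /* yard: culled */
            return 0;
        }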

      • smolder 14 hours ago |
        Marathon 2 by Bungie also had weird non-euclidean maps with overlapping sections in the mid-90s, I think.

        I miss the customizability of that game. Early Bungie works were up there with early Blizzard or early Valve. They sold a tool for fun instead of casting bait for profit.

        • Terr_ 13 hours ago |
          I recall that some of the developer commentary for either Portal or Portal 2 mentioned that they began level development by using a special version of the namesake non-euclidean portals to connect different chambers and hallways that were being developed in parallel (per distinct loading-screen level). Then, later in development, they ran a tool to stitch all that geometry together in a more normal way, and they had only one case of impossible overlap, which continued to use the non-euclidean link.

          _________

          Edit: Found it:

          > [...] When we started the project making any big structural change in a level or the order of levels would lead to hours or even days of busy work trying to reconnect things and make sure they lined up again. If we ever wanted to ship something the size of Portal with the finely tuned balance we desired then we needed a way to be able to make big changes to the layout of the game without paying the cost of making everything line up again. We needed a way to bend space. We needed to think with portals. Using portals to connect different areas in the world we could make any type of impossible space work out. You could look through a hallway into the next room but the hallway might be on the other side of the map and the room you are looking into might be in a completely different orientation. We could seamlessly insert an elevator, a huge expansive vista, a room that was bigger on the inside than the outside, or even create an infinite fall by connecting a shaft back into itself. Soon every connection between any space was a portal. We would even switch them on the fly. Even a simple door worked like the cartoons - just a facade painted on a wall that seamlessly opened somewhere else entirely. Once the game settled down we were able to finalize our path and remove all of the world portals. There's only one impossible space left in the whole game - see if you can figure out where it is.

          -- "World Portals". Portal 2 developer commentary

        • endgame 12 hours ago |
          IIRC Duke3D also had a map where you had to walk a circular path twice to get back to where you started.
          • Synaesthesia 4 hours ago |
            Duke3D used teleportation to simulate height. For instance, at the beginning of the game, when you drop down from the rooftop, you actually teleport to another part of the map.
      • chamomeal 13 hours ago |
        Do you know anything about Prey 2017’s “looking glass”? Does it involve any special techniques?

        I know nothing about this stuff, but when I saw it in-game I was really impressed. You can even break the screens into shards, and the little pieces still maintain the effect when they're lying on the floor or flying through the air.

        • adastra22 13 hours ago |
          I've never played the game, but your description doesn't sound very hard to implement. Keep in mind there is no physical camera when rendering, so tricks like this become rather easy. You just set up the camera appropriately and render the perspective shown by the portal, transformed into view space, using an alpha mask to make sure it only draws pixels where the glass fragment is.
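
          A minimal sketch of that, using an OpenGL stencil mask (one common form of the alpha-mask idea); draw_glass_shard, draw_world, and set_camera are hypothetical engine helpers, and a GL context is assumed to exist:

            /* Stencil-masked portal rendering, one level deep. */
            #include <GL/gl.h>

            extern void draw_glass_shard(void);   /* visible portal surface */
            extern void draw_world(void);
            extern void set_camera(int through_portal);

            void render_portal_view(void) {
                /* 1. Stamp the shard's pixels into the stencil buffer only. */
                glEnable(GL_STENCIL_TEST);
                glClear(GL_STENCIL_BUFFER_BIT);
                glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
                glDepthMask(GL_FALSE);
                glStencilFunc(GL_ALWAYS, 1, 0xFF);
                glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
                draw_glass_shard();

                /* 2. Draw the scene from the portal's viewpoint, but only
                      where the stencil was stamped. */
                glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
                glDepthMask(GL_TRUE);
                glClear(GL_DEPTH_BUFFER_BIT);
                glStencilFunc(GL_EQUAL, 1, 0xFF);
                glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
                set_camera(1);   /* viewpoint transformed through the portal */
                draw_world();

                /* 3. Draw everything else from the normal camera. */
                glStencilFunc(GL_NOTEQUAL, 1, 0xFF);
                set_camera(0);
                draw_world();
                glDisable(GL_STENCIL_TEST);
            }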
          • keyringlight 5 hours ago |
            Isn't the more common usage of this in skyboxes? You'd have one area where the gameplay is taking place, then another area in the map that shows the larger-scale surroundings, and the portal makes it appear that the gameplay area sits within the other.

            The designer needs to confine the player so they can't break the illusion (whereas Prey lets Morgan Yu shatter that fourth wall), or you can do things like put the player on a speeding train traveling through scenery, where falling off is fatal.
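
            A sketch of that skybox variant, in the same hypothetical-helper style as above (the 1/16 scale is a Source-engine-style convention, an assumption rather than anything from the article):

              /* 3D skybox: draw a miniature "surroundings" room first, then
                 the full-scale play area on top of it. */
              #include <GL/gl.h>

              extern void set_camera_at(float x, float y, float z);
              extern void draw_sky_room(void);   /* miniature scenery room */
              extern void draw_play_area(void);
              extern float cam_x, cam_y, cam_z;  /* player camera, world units */
              extern float sky_x, sky_y, sky_z;  /* miniature room's origin */

              void render_frame(void) {
                  /* Scale the camera's position down into the miniature room,
                     so the tiny scenery reads as huge and distant. */
                  const float s = 1.0f / 16.0f;
                  set_camera_at(sky_x + cam_x * s,
                                sky_y + cam_y * s,
                                sky_z + cam_z * s);
                  draw_sky_room();

                  /* Clear depth so the play area always draws in front, then
                     render the real, full-scale world. */
                  glClear(GL_DEPTH_BUFFER_BIT);
                  set_camera_at(cam_x, cam_y, cam_z);
                  draw_play_area();
              }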

    • corysama 16 hours ago |
      > Dark Forces apparently also used portals for culling, IIRC.

      DF used BSP for culling. But I made a mod level long ago, playing with features, and built a stairway in the middle of a room that you could only see from in front of it. It was some kind of free-standing portal with no surrounding support.

      Fun fact: the creator of the DF engine told me he based it on a reverse engineering of a beta release of DOOM. Apparently, the final release of DOOM cut a lot of engine features to gain speed. But DF shipped with them, and maybe a few more.

      • paulryanrogers 14 hours ago |
        > Fun fact: the creator of the DF engine told me he based it on a reverse engineering of a beta release of DOOM.

        Can you provide a source? As I understand it, DF is based on the (pre-release?) TIE Fighter engine, and DF began production before Doom released. That's why, for example, DF has some non-textured 3D models like the turrets and mouse droids. I follow The Force Engine (a reverse-engineered fan port) and Dev Game Club (whose hosts worked at LucasArts).

        Supposedly DF devs did see Doom (perhaps even a pre release version), but already had their own tech working by that point.

        > Apparently, the final release of DOOM cut a lot of engine features to gain speed.

        Doubt. There are a lot of interviews with the id guys, and I don't recall anyone saying they cut significant technical features. They made Doom in only 13 months, minus almost a month to port Wolf3D to the SNES. Maybe Carmack -- or the released alphas and betas -- can clarify? Or perhaps you are confusing some of the Doom porting efforts, which did cut down levels and sometimes features significantly.

    • bluedino 15 hours ago |
      Descent used portals
    • adastra22 13 hours ago |
      As mentioned by others, Descent used portal rendering all the way back in 1995. It's a simple trick that only requires an alpha mask, which even the earliest hardware supported.

      But certainly the game Portal (2007) hyped using portals for rendering prior to Prey.

      • usea 13 hours ago |
        > But certainly the game Portal (2007) hyped using portals for rendering prior to Prey.

        They were referring to Prey (2006).

        • Tempat 10 hours ago |
          Also worth noting that Prey (2006) started development in 1995.
          • keyringlight 5 hours ago |
            There are a handful of videos from E3 ('97 or '98) showing off the original 3D Realms development of Prey. One of the things that didn't survive into Human Head's version on the Doom 3 engine was rotating portals; or if they did survive, they were rare enough that I don't remember them.
      • bananaboy 10 hours ago |
        Original Descent was software rendered though.
        • adastra22 8 hours ago |
          2D graphics cards had alpha masking.
          • rep_lodsb 6 hours ago |
            But not standard VGA.
  • hgs3 15 hours ago |
    Some additional points of interest: Quake level designers could use "hint" brushes to help the BSP compiler determine where to create cells. Starting with Quake II, designers were able to place "area portals", which are portals programmers could toggle at runtime (think disabling a doorway portal when the door is closed).
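
    The toggle itself is tiny; a sketch from memory of the id-released Quake II game source (g_misc.c), field names approximate:

      /* func_areaportal: flipped by its paired door. */
      void Use_Areaportal (edict_t *ent, edict_t *other, edict_t *activator)
      {
          ent->count ^= 1;  /* toggle open/closed */
          /* When closed, the engine stops rendering (and the PVS stops
             "seeing") whatever lies beyond this portal. */
          gi.SetAreaPortalState (ent->style, ent->count);
      }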
    • rollcat 4 hours ago |
      I remember discovering all these optimizations The Hard Way when making maps for id tech 3 games as a kid. I always tried to build grand and detailed spaces, and the BSP/vis would choke on all the geometry.
      • swayvil 2 hours ago |
        Me too.

        Giant open space. A baron, way off, barely visible, throws a firebolt. Takes like 30 seconds to get to me.

        What Doom mapmaking needs is higher-level tools. Like a tunnel kit: just plug together pipes and fittings. Or crank out mazes generatively.

  • fourseventy 15 hours ago |
    This makes me really want to play Quake 3 with a portal gun.
  • dalant979 15 hours ago |
    Some related content from Youtube channel "Matt's Ramblings":

    Quake's PVS: A hidden gem of rendering optimization: https://www.youtube.com/watch?v=IfCRHSIg6zo

    How Quake's software renderer ELIMINATES overdraw: https://www.youtube.com/watch?v=zdXsHWHxeBY

    I added portals into software Quake: https://www.youtube.com/watch?v=kF-7Jd37gYk

  • smolder 14 hours ago |
    Acknowledge Splitgate 2.
  • jofla_net 13 hours ago |
    I remember the analogue to this in Unreal Engine 1: zones, I believe. Same concept, though. Good years.
    • markus_zhang 11 hours ago |
      Does anyone know why Unreal Engine 1 ran pretty smoothly while having far superior graphics? Or maybe it's not as superior as I thought? I remember going into a net bar in 1998 when it came out and being COMPLETELY blown away -- and that PC didn't even have a good graphics card, so it was choppy.
      • andrewmcwatters 11 hours ago |
        Newer temporal rendering techniques result in lower-quality frames and output when the game's prevailing art style is not photorealistic.

        There's a lot of "smearing" as a result. It's not just you; it's how newer rendering techniques in the Unreal Engine end up being perceived by most people.

      • chickenzzzzu 11 hours ago |
        I'm thinking it's likely the number of supported colors and integer vs floating point precision.

        You'll notice that Quake textures are very similar to PS1 textures in that they are pixelated and use a limited number of colors, whereas N64 textures have more of a smooth gradient.

        There are likely differences in the lighting systems as well. This is why I think people compare Quake II or even Quake III Arena to UE1. The OG Quake really was a hack just to be the first to do six-degrees-of-freedom textured 3D graphics on a PC, and I think they were the first under those exact constraints. My history is a little fuzzy; SEGA certainly had them beat by multiple years on arcade boards, but those were all custom, and other games that had six degrees of freedom were not textured. It was a busy time!

        • whaleofatw2022 5 hours ago |
          Descent had textures, although they were arguably not-even-PS1 quality, and some things were probably still flat-shaded.
      • Sharlin 8 hours ago |
        The original Unreal had great graphics for the time but did require a very beefy machine.
        • markus_zhang 4 hours ago |
          Ah, I wish they'd open-source the UE1 engine. I know some of the OldUnreal community members got to see the source code and made patches for it, as well as migrating the UT editor to it, which is way better than the original VB one.
      • dahart 2 hours ago |
        I’d be curious if you can find some examples of superior UE1 graphics compared to the best of what you can find in UE5. If your question is based on your memory from 98, I would assume the answer is that UE1 graphics is not superior to today’s, but things that look good today are less surprising than they were then. I just googled UE1 images, and everything I saw in several pages of results looks rather dated to my eyes.

        If you’re thinking more about the smoothness & framerate of the game, then that’s entirely subjective. UE1 did not prevent choppiness or make things fast, and neither does UE5. That’s entirely at the discretion of the game developer. If you use fewer polygons and use simple shaders, the game will be fast and smooth. If you use more than the hardware & engine can handle, then your game will be choppy.

        There are a lot of advancements in real-time lighting today, which means that the look doesn’t necessarily change, but is computed at run time. Most of the nice lighting in UE1 was baked into textures or vertex data, and could not change during the game. Today with shaders and global illumination algorithms running in real-time, the lighting can change in response to a moving sun, moving objects & characters, changing materials, etc.
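
        A rough contrast of the two approaches in C (invented names):

          /* Baked vs. dynamic lighting, reduced to its essence. */
          typedef struct { float r, g, b; } Color;

          extern Color sample_lightmap(float u, float v);  /* precomputed offline */
          extern Color evaluate_light(int i, float x, float y, float z);
          extern int num_lights;

          /* UE1-era: look up lighting that was baked ahead of time.
             Cheap, but frozen -- nothing in the scene may move. */
          Color shade_baked(Color albedo, float u, float v) {
              Color l = sample_lightmap(u, v);
              return (Color){ albedo.r * l.r, albedo.g * l.g, albedo.b * l.b };
          }

          /* Modern: re-evaluate every light per frame. Costly, but the
             sun, objects, and materials can all change at runtime. */
          Color shade_dynamic(Color albedo, float x, float y, float z) {
              Color sum = { 0, 0, 0 };
              for (int i = 0; i < num_lights; i++) {
                  Color l = evaluate_light(i, x, y, z);
                  sum.r += l.r; sum.g += l.g; sum.b += l.b;
              }
              return (Color){ albedo.r * sum.r, albedo.g * sum.g,
                              albedo.b * sum.b };
          }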

        • markus_zhang an hour ago |
          Er, I was thinking about UE1 versus Quake 2, definitely not UE5.
  • leoc 10 hours ago |
    Edit: ugh, I missed dalant979's comment, which already covers this: https://news.ycombinator.com/item?id=42662830
  • Jyaif 7 hours ago |
    Is this technique still used in modern engines to determine which part of a level to render?