I thought at first that GetProjectionRaw could be a problem, but then realized it's not. As you say, the intermediate eye-to-head transformation matrix is sufficient to transform any screen orientation and eye position into a straight-up skewed frustum projection, which can be expressed by the four boundary parameters returned by GetProjectionRaw.

There's no compatibility problem. Per-eye half-angles of 90° or more can't be expressed by any projection matrix, but they're not necessary here.
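For illustration, here is roughly how a run-time can turn four GetProjectionRaw-style boundary tangents into an off-axis projection matrix. This is a sketch with my own names and an OpenGL-style convention, not actual SteamVR code; it also shows why a 90° half-angle is out of reach, since it would need an infinite boundary tangent.

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat4 = std::array<float, 16>; // column-major, OpenGL convention

// Build an off-axis (skewed frustum) projection from the four boundary
// tangents-of-half-angles (left < right, bottom < top), analogous to what
// a run-time would do with values from a GetProjectionRaw-like function.
Mat4 projectionFromRaw(float left, float right, float bottom, float top,
                       float zNear, float zFar)
{
    Mat4 m{}; // zero-initialized
    m[0]  = 2.0f / (right - left);           // x scale
    m[5]  = 2.0f / (top - bottom);           // y scale
    m[8]  = (right + left) / (right - left); // x skew (off-axis shift)
    m[9]  = (top + bottom) / (top - bottom); // y skew
    m[10] = -(zFar + zNear) / (zFar - zNear);
    m[11] = -1.0f;
    m[14] = -2.0f * zFar * zNear / (zFar - zNear);
    return m;
}

// A per-eye half-angle of 90 degrees would require tan(90 deg) = infinity
// as a boundary tangent, which is why no projection matrix can express it.
float halfAngleToTangent(float degrees)
{
    return std::tan(degrees * 3.14159265358979f / 180.0f);
}
```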

An issue in how almost all games are rendered: from one viewpoint, with planar projection.

That's not how VR works. Games are rendered from two viewpoints, still using planar projection, but that's not the problem. In detail, SteamVR applications query per-eye projection matrices and eye-to-head transformations through the OpenVR functions IVRSystem::GetProjectionMatrix and IVRSystem::GetEyeToHeadTransform, use those to render the virtual world into left and right frame buffers, and then submit those buffers to the compositor service for lens distortion correction, reprojection, synchronization, and finally display.
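A rough sketch of that per-eye setup, with the OpenVR calls replaced by plain matrix parameters since this sketch does not link against the SDK:

```cpp
#include <array>
#include <cassert>

using Mat4 = std::array<float, 16>; // column-major

// Ordinary 4x4 matrix multiplication, column-major storage.
Mat4 multiply(const Mat4& a, const Mat4& b)
{
    Mat4 r{};
    for (int c = 0; c < 4; ++c)
        for (int row = 0; row < 4; ++row)
            for (int k = 0; k < 4; ++k)
                r[c * 4 + row] += a[k * 4 + row] * b[c * 4 + k];
    return r;
}

// Per-eye camera pose: the tracked head pose composed with the per-eye
// eye-to-head transform (as would come from GetEyeToHeadTransform). The
// view matrix used for rendering is the inverse of this, and the full
// per-eye transform is projection * view.
Mat4 eyePose(const Mat4& headPose, const Mat4& eyeToHead)
{
    return multiply(headPose, eyeToHead);
}
```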

SteamVR generates those matrices based on the geometry specification provided by the active HMD. The old OpenVR driver API only supported parallel screens, but the update apparently allows general screen layouts, by sending general eye-to-head matrices from the HMD to SteamVR. If an HMD implements this API correctly, and a SteamVR application doesn't do something stupid, there will be no distortion.

The fact that there was distortion in the Tested demo means that 1) the demos were still using the old API, 2) they were using the new API, but the new function hadn't been implemented yet, or 3) the new function was implemented, but wasn't working correctly. In any case, it's not the game's fault. IPD issues would fall under case 3, but the distortion caused by those should be minor.

OpenVR is a published programming interface by which VR software developers on one side, and VR hardware developers such as Pimax on the other side, can communicate with SteamVR.

Both SteamVR and OpenVR are developed solely by Valve.

Re 2: Who knows. I missed the recent OpenVR API change because I haven't been working on the OpenVR compatibility layer of my own VR run-time software since early July, but, based on their comments on their own forum, the Pimax folks knew at least as of a few days ago. Did they only find out after the Tested demo? Did they not realize they had a problem until the reviewers brought it up? Did they know all along, but weren't able (or willing) to communicate this properly to the reviewers during their interview?

You are correct. The version of openvr_driver.h in my local repository does not have that function, yet the head version on github (I just checked) has it. This appears to be a recent API change. I also noticed that my local copy advertises interface version IVRServerDriverHost_004, the same version advertised by the changed header on github. Something there looks fishy, but anyway, it appears Valve are on the case -- if that's in fact what that method is supposed to do.

Good point about reprojection. It assumes that rendered frames are in eye tangent space. If that doesn't hold, reprojection may go wrong -- but it might also work out OK, as "wrong" tangent space is still tangent space. It's hard to judge without running experiments, though.

There are two OpenVR interfaces: one between display/controller hardware and the SteamVR run-time, and one between the run-time and VR applications.

The low-level hardware interface is defined in openvr_driver.h, in the OpenVR SDK distribution released on github. I have been using that interface to let low-level OpenVR drivers directly interface with my own VR run-time, so I know it pretty well.

OpenVR-based applications get generic 4x4 matrices from the SteamVR run-time, but the run-time, in turn, creates those matrices based on the information given to it by the HMD itself. If there is no way for HMDs to notify SteamVR of rotated screens, there is no way for SteamVR to create appropriate projection matrices.

Now, any particular OpenVR-based application can tweak the projection matrices handed to it in any way it wants, so special Pimax-made VR applications could create matrices that do take screen rotation into account, but that won't work for any third-party applications.
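As an illustration of the kind of tweak such a Pimax-specific application could apply: fold a rotation about the vertical axis into the per-eye transform, so that otherwise unchanged rendering code draws into a canted eye frustum. This is a sketch with made-up names; the real fix belongs inside SteamVR.

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat4 = std::array<float, 16>; // column-major

// Rotation about the vertical (y) axis by the given angle, the kind of
// matrix an application could compose into the eye-to-head transform to
// account for a screen canted outward by that angle.
Mat4 yawRotation(float degrees)
{
    float r = degrees * 3.14159265358979f / 180.0f;
    float c = std::cos(r), s = std::sin(r);
    Mat4 m{};
    m[0] = c;  m[2] = -s;  // first column:  ( c, 0, -s, 0)
    m[5] = 1.0f;           // second column: ( 0, 1,  0, 0)
    m[8] = s;  m[10] = c;  // third column:  ( s, 0,  c, 0)
    m[15] = 1.0f;
    return m;
}
```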

I don't know the Pimax people, and haven't personally tried their 8k headset. That said, I have tried many other VR headsets in the last few years, and talked with a lot of headset hardware developers. In my experience, many of them -- the big three obviously excluded -- don't really know what they are doing. They might be really good at implementing OLED screen electronics, or manufacturing lens/screen assemblies, but I've often noticed a certain lack of awareness of how it all needs to fit together with the software side to create a correct VR experience.

Meaning, I would not be surprised if they noticed that something was slightly "off" during prototype testing, but didn't know the root cause, or didn't deem it a deal breaker.

Standard perspective projection, implemented as 4x4 matrices, is the one correct way to generate images for flat screens, "wasted" pixels around the periphery or not. That goes out the window once the "flat" part no longer holds. Ray casting, with each pixel's ray direction pre-computed or measured, works for any screen geometry, including screens warped by lenses.
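To make the ray casting alternative concrete, here is a minimal sketch: per-pixel ray directions are precomputed into a table (from planar projection here, but for a warped screen they would be measured or derived from the lens/screen geometry instead), then each ray is tested against the scene. The sphere scene and all names are illustrative.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Precompute one normalized ray direction per pixel for a flat screen with
// the given tangent-of-half-FoV; looking down -z. For a non-flat screen
// this table would simply be filled with different, measured directions.
std::vector<Vec3> planarRayTable(int w, int h, float tanHalfFov)
{
    std::vector<Vec3> rays(static_cast<size_t>(w) * h);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float px = (2.0f * (x + 0.5f) / w - 1.0f) * tanHalfFov;
            float py = (1.0f - 2.0f * (y + 0.5f) / h) * tanHalfFov;
            float len = std::sqrt(px * px + py * py + 1.0f);
            rays[static_cast<size_t>(y) * w + x] =
                {px / len, py / len, -1.0f / len};
        }
    return rays;
}

// Toy scene: does a ray from the origin along dir hit a sphere of the
// given radius centered 5 units down the -z axis?
bool hitsSphere(const Vec3& dir, float radius)
{
    Vec3 c{0.0f, 0.0f, -5.0f};
    float b  = dir.x * c.x + dir.y * c.y + dir.z * c.z; // dot(dir, center)
    float d2 = c.x * c.x + c.y * c.y + c.z * c.z - b * b;
    return d2 <= radius * radius;
}
```

Note that nothing in the per-pixel loop relies on linearity; that is exactly what makes this approach work for arbitrary screen geometry, and also what makes it a poor fit for the GPU rasterization pipeline.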

The performance benefit of standard perspective over ray casting comes from the former's linearity. The entire GPU rasterization pipeline is predicated on that. Meaning, even with curved screens it might be more efficient to render to one or more virtual flat screens, and then warp the resulting image(s) to the final frame buffer, exactly as it's done now for lens distortion correction.

Of course, it is possible to tweak projection in some minor ways to reduce the amount of warping that's needed afterwards, Nvidia's "lens-matched shading" being one such tweak. The normally hidden fourth component of post-projection vectors (the homogeneous weight) is used to virtually stretch the corners of the view frustum, to mimic the pincushion distortion induced by typical HMD lenses.
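The w-trick itself is simple enough to sketch. After projection, the clip-space w component is increased proportionally to |x| and |y|, so the perspective divide compresses pixels toward the frustum corners; the coefficient here is a made-up tuning value, not Nvidia's, and real lens-matched shading applies it per quadrant.

```cpp
#include <cassert>
#include <cmath>

struct Clip { float x, y, w; }; // clip-space vertex (z omitted for brevity)

// Inflate w based on distance from the view center, mimicking the corner
// compression of pincushion-style lens distortion. 'a' is illustrative.
Clip lensMatched(Clip v, float a)
{
    v.w += a * (std::fabs(v.x) + std::fabs(v.y));
    return v;
}

// Normalized device coordinate after the perspective divide.
float ndcX(const Clip& v) { return v.x / v.w; }
```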

In short, (tweaked) perspective projection, followed by non-linear warping to account for the first step's errors, is likely to stick around for a while. Whether future non-flat screens need less or more correction really depends on the particular circumstances.

Impossible to say from these pictures without knowing the internal geometry of the headset and the properties of the lenses. Overlap of the left/right images doesn't directly correspond to degrees, as the mapping from pixels to degrees is non-linear and depends on the position of the eyes w.r.t. the screens.

To judge it, you'd need a test scene containing essentially a compass rose with the viewer at its center; you could then read the angular overlap directly from which parts of the compass are visible in each eye.
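The arithmetic behind reading overlap off such a compass rose is just interval intersection: each eye covers an angular interval of the horizon, and the binocular overlap is where the two intervals intersect. A sketch, with illustrative names and angles in degrees:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Per-eye horizontal half-angles, measured from the eye's optical axis:
// 'inner' toward the nose, 'outer' away from it.
struct EyeFov { float inner, outer; };

// Binocular overlap for two mirror-symmetric eyes whose optical axes are
// canted outward by cantDegrees. The overlap region lies between the two
// nasal edges, so only the inner half-angles matter (assuming the usual
// case where outer coverage is wide enough not to limit it).
float binocularOverlap(const EyeFov& fov, float cantDegrees)
{
    float leftEyeMax  =  (fov.inner - cantDegrees); // rightmost extent, left eye
    float rightEyeMin = -(fov.inner - cantDegrees); // leftmost extent, right eye
    return std::max(0.0f, leftEyeMax - rightEyeMin);
}
```

With this model, canting the screens outward trades binocular overlap for total FoV degree-for-degree on each side, which is the design tension discussed above.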

My blind guess is that binocular overlap is somewhere north of 80°, with per-eye FoV north of 140°, because otherwise the reviewers would probably have noticed the lack of it. Our eyes' natural binocular overlap is around 120°. For comparison, the Rift CV1's binocular overlap is "only" 74° at optimal viewing distance, and people are noticing that.

Sorry for yelling. The distortion visible in the 200° image will get canceled out when that image is shown in a 200° FoV headset. Not by the lenses (that distortion is additional, and would get canceled out by the lens distortion correction step that's not shown here), but by virtue of viewing a flat screen from up close. This is how it works.

If you want to see the second image the way it really looks, you have to view it full-screen on your monitor, and then put your eye about an inch or less from the monitor's center.
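The "inch or less" figure is just geometry: a flat image rendered with horizontal field of view fov looks undistorted when the eye sits at distance d = (imageWidth / 2) / tan(fov / 2) from its center. A sketch of that formula (units are whatever the width is measured in):

```cpp
#include <cassert>
#include <cmath>

// Distance at which a flat image rendered with the given horizontal FoV
// subtends exactly that FoV at the eye, canceling the projection's
// apparent distortion.
float correctViewingDistance(float imageWidth, float fovDegrees)
{
    return (imageWidth / 2.0f) /
           std::tan(fovDegrees * 3.14159265358979f / 360.0f);
}
```

For a wide-FoV image on a typical monitor that distance comes out to a few centimeters, hence the eye-against-the-screen advice.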

Minor quibbles: The second image does not show a 200° FoV. It's a single-eye image, and the Pimax's single-eye FoV is smaller than 180° (I don't know exactly how much smaller). Perspective projection, which is used in the second image, cannot create FoVs of 180° or more.

Second quibble: I said this is correct in principle, but it's probably still slightly wrong. The Pimax's screens are angled w.r.t. each other, and SteamVR currently does not support that. That is probably the cause for the visible distortion that reviewers have complained about.

SteamVR is indeed the problem here, as it currently does not support HMDs with angled screens. There is no parameter to report screen angles from the low-level HMD driver to the run-time, so the run-time (and hence VR applications) currently assume that the screens are coplanar. This has to be handled by Valve.