Just to emphasize one point, right now the driver only supports adjusting the screen brightness. GPU switching should come along at some point, but there are other issues that have to be resolved on the machine I'm working with before it's even possible to work on the hybrid graphics support.

Is the device spoken of here a display mux? Does that mean that these notebooks could potentially switch between GPUs just by turning on the GPU and flipping the mux?

For the moment I'm assuming an X restart would be required. Later, once we have more complete GPU switching (and DMA-BUF/PRIME/Bumblebee/whatever), we'll probably be able to power up the sleeping GPU, migrate all the buffers over to it, switch the mux, and then power down the other GPU within a running X/Wayland session.
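For the mux flip itself, the kernel already exposes a manual control on muxed machines through the vga_switcheroo debugfs interface (the real control file is `/sys/kernel/debug/vgaswitcheroo/switch`). Below is a minimal sketch of the write involved; the helper function and its path parameter are my own invention so the logic can be exercised against a scratch file, and it assumes debugfs is mounted and the GPU drivers registered with vga_switcheroo:

```python
# Sketch of driving the vga_switcheroo mux from userspace.
# The real control file lives at /sys/kernel/debug/vgaswitcheroo/switch;
# the path is a parameter here purely so the helper can be tested
# against an ordinary file.

VALID_COMMANDS = {
    "IGD",  # point the mux at the integrated GPU
    "DIS",  # point the mux at the discrete GPU
    "ON",   # power up the GPU not currently driving the display
    "OFF",  # power down the inactive GPU
}

def flip_mux(command, switch_path="/sys/kernel/debug/vgaswitcheroo/switch"):
    """Write a vga_switcheroo command; rejects unknown commands."""
    if command not in VALID_COMMANDS:
        raise ValueError(f"unknown vga_switcheroo command: {command}")
    with open(switch_path, "w") as f:
        f.write(command)
```

Note that an IGD/DIS switch only actually takes effect once no client holds the outgoing GPU, which is exactly why the X restart matters today.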

Note that in general it's not possible to just switch GPUs under running applications. Even if you move buffers, you also need to recompile shaders and recreate various other resources. Many of these resources have hardware-dependent properties, and the application itself makes choices about which resources and features to use. For example, the integrated GPU may be an Intel part that barely supports D3D10/GL3 while the discrete chip has full D3D11/GL4 support, and the application may want to use those enhanced features when they're available.

The only way to handle this reliably is to introduce a Windows-style lost-device message and force the application to completely tear down and recreate its graphics context from scratch. That's a pain in the ass to handle (though Windows D3D devs are already used to it), and it's also a major incompatible break in the X11 protocol. It may be possible to make it a new protocol extension that applications can opt in to; apps that don't opt in would stay locked to the GPU they started on, and the OS should be able to tell the user, when switching from the discrete to the integrated GPU, that some applications can't switch and must be restarted before the power-hungry discrete GPU can be fully disabled.
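The lost-device flow described above boils down to: every context-owning client registers teardown/recreate hooks, and the switch notification drives them, with each client re-choosing features against the new device's capabilities. A minimal sketch of that protocol, with all class and method names hypothetical (this is not any real X11 or D3D API):

```python
# Hypothetical sketch of a "device lost" protocol: on a GPU switch,
# every registered client destroys its device-dependent resources and
# rebuilds them against the new device's capabilities.

class GpuDevice:
    def __init__(self, name, gl_version):
        self.name = name
        self.gl_version = gl_version  # capabilities differ per GPU

class Context:
    """One application's graphics context."""
    def __init__(self, device):
        self.device = None
        self.shaders = []
        self.recreate(device)

    def teardown(self):
        # Compiled shaders, buffer objects, etc. are device-specific
        # and cannot simply be migrated to another GPU.
        self.shaders.clear()
        self.device = None

    def recreate(self, device):
        self.device = device
        # The app re-chooses features based on the new device's caps,
        # e.g. only using tessellation on a GL4-capable chip.
        self.shaders = ["basic.glsl"]
        if device.gl_version >= 4.0:
            self.shaders.append("tessellation.glsl")

def switch_gpu(contexts, new_device):
    # Broadcast "device lost", then "device reset", to every context
    # that opted in to the extension.
    for ctx in contexts:
        ctx.teardown()
    for ctx in contexts:
        ctx.recreate(new_device)
```

In the opt-in scheme, only contexts that registered for these events would appear in the `contexts` list; everything else pins its original GPU.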

Some Windows implementations of GPU switching actually just disallow switching GPU mode while any app has an open D3D/D2D/GL context. That of course includes all modern browsers, including the Windows 7 default browser. It's pretty awful, and whatever happens with GPU switching here, that approach is just not acceptable. Granted, it's still 100x better than having to restart all of X11, which might as well be a whole OS reboot from the user's perspective.