[Public WebGL] dual GPU setups default to integrated GPU

Kenneth Russell [email protected]
Mon Jan 30 17:45:18 PST 2017


preferLowPowerToHighPerformance would apply equally to WebGL 2.0 as it does
to WebGL 1.0. It wasn't dropped from the spec.

https://www.khronos.org/registry/webgl/specs/latest/2.0/#2.2 mandates that
a few of the context creation attributes must be honored, but that's
because multisampled renderbuffers are a mandatory part of the OpenGL ES
3.0 spec.
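
For illustration, here's a rough, untested sketch of passing the attribute
at WebGL 2.0 context creation and then checking which attributes the
implementation actually honored (just my reading of the spec, not normative):

const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl2', {
  // Same attribute as in WebGL 1.0; true asks for the low-power (integrated)
  // GPU, but implementations are free to ignore the hint.
  preferLowPowerToHighPerformance: true,
  antialias: true
});
if (gl) {
  // getContextAttributes() reports the attributes actually in effect.
  console.log(gl.getContextAttributes());
}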

I'm not a D3D expert, so I don't know how feasible it is to render to a
texture on one D3D device and display it on another. To the best of my
knowledge, Edge doesn't dynamically activate the discrete GPU when WebGL's
active and switch back to the integrated GPU when it isn't. It uses the
"0th" GPU, whatever that is according to the control panel settings.

-Ken


On Tue, Jan 24, 2017 at 6:30 AM, Maksims Mihejevs <[email protected]>
wrote:

> Even my home laptop uses the integrated GPU for Chrome by default,
> regardless of battery/plug. The NVIDIA Control Panel has a per-program
> preset, and I've seen it set to Integrated by default.
>
> On 24 January 2017 at 14:02, Ben Adams <[email protected]> wrote:
>
>> The dGPU should always be used when on power; is this only a decision
>> that affects the choice when on battery?
>>
>>
>> On 24 January 2017 at 13:00, Maksims Mihejevs <[email protected]> wrote:
>>
>>> We've recognised the same problem: a lot of people with Windows laptops
>>> and dual GPUs get the integrated GPU by default.
>>>
>>> This is not the case for Edge, however, which gives it an advantage
>>> over other browsers on Windows laptops.
>>>
>>> On 24 Jan 2017 9:26 a.m., "Florian Bösch" <[email protected]> wrote:
>>>
>>>> On dual-GPU laptops, such as some Windows and macOS models, the OS has
>>>> a GPU-switching function that switches to the discrete GPU for
>>>> graphics-intensive applications (such as games, CAD software, etc.).
>>>>
>>>> However, when a browser is running, the integrated GPU is often used,
>>>> regardless of whether a tab is doing something graphics-intensive with
>>>> WebGL or not
>>>> (https://twitter.com/grorgwork/status/823719997616701440).
>>>>
>>>> On Windows, a user can influence this through a series of complicated
>>>> steps to designate the preferred GPU, but it is a machine-wide setting,
>>>> and it has sometimes been ignored (I don't know if that's still the case).
>>>>
>>>> Obviously this presents a problem for WebGL developers. We neither want
>>>> to drain a user's battery unnecessarily, nor do we want to force a user
>>>> with a discrete GPU to receive worse performance should they wish to use
>>>> a graphics-intensive WebGL application.
>>>>
>>>> In WebGL1 there was a context creation flag called
>>>> "preferLowPowerToHighPerformance", but I'm not aware of how widely it
>>>> was implemented, and apparently it's also ignored on macOS (it defaults
>>>> to false, yet the discrete GPU is still not used).
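>>>>
>>>> For reference, a minimal, untested sketch of how that flag is passed at
>>>> context creation; browsers are free to ignore the hint:
>>>>
>>>> const canvas = document.createElement('canvas');
>>>> // preferLowPowerToHighPerformance defaults to false ("prefer high
>>>> // performance"), yet the discrete GPU is still not selected on macOS.
>>>> const gl = canvas.getContext('webgl', {
>>>>   preferLowPowerToHighPerformance: false
>>>> });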
>>>>
>>>> WebGL2 has no equivalent context creation flag.
>>>>
>>>> Questions:
>>>>
>>>>    1. It would seem we have a sufficient mechanism to express a GPU
>>>>    preference; is this a correct assessment?
>>>>    2. Why was preferLowPowerToHighPerformance dropped from WebGL2?
>>>>    3. Why is preferLowPowerToHighPerformance ignored for WebGL1 on
>>>>    some configurations where it would be most useful?
>>>>    4. Should an additional mechanism be introduced so a user can
>>>>    switch between GPUs for individual tabs at their own choice?
>>>>
>>>>
>>
>