[Public WebGL] Re: webgl/swiftshader

John Davis [email protected]
Wed Feb 22 09:10:03 PST 2012

I'd love to be able to force use of the software rasterizer at runtime.

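A minimal sketch of the kind of software/hardware check discussed in this thread, assuming the `WEBGL_debug_renderer_info` extension and a `failIfMajorPerformanceCaveat`-style context attribute — both assumptions relative to this thread, and neither guaranteed to be exposed by every browser:

```javascript
// Sketch: guess whether a WebGL context is likely software-rendered.
// Both APIs used here are assumptions; a browser may expose neither.

// Pure helper: classify a renderer string as a known software rasterizer.
function isSoftwareRendererString(renderer) {
  return /swiftshader|software|llvmpipe/i.test(renderer || "");
}

function detectSoftwareRendering(canvas) {
  // Ask for a context that refuses a "major performance caveat"
  // (e.g. software rendering); getContext returns null in that case.
  const gl = canvas.getContext("webgl", { failIfMajorPerformanceCaveat: true });
  if (!gl) {
    return { hardware: false, gl: canvas.getContext("webgl") };
  }
  // Cross-check the unmasked renderer string where the extension exists.
  const ext = gl.getExtension("WEBGL_debug_renderer_info");
  if (ext) {
    const renderer = gl.getParameter(ext.UNMASKED_RENDERER_WEBGL);
    if (isSoftwareRendererString(renderer)) {
      return { hardware: false, gl };
    }
  }
  return { hardware: true, gl };
}
```

An application could then apply exactly the policy Ashley describes: require hardware, warn that the content may be slow, or fall back to an image or video.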
On Tuesday, February 21, 2012, Ashley Gullen wrote:

> Why not just introduce some flag which indicates whether the
> implementation is rendering with specialised hardware (GPU) or not
> (software)?  Then each application developer can test their individual apps
> both ways and see if for practical purposes both work, or if only one
> works.  Then each developer can either support both, require GPU support,
> or have some notification that says something like "Your computer can run
> this content, but it may be slow.  Are you sure you want to continue?"  By
> default it would use whatever's available, so the developer would have to
> intervene to either require GPU support or introduce a notification.
> Sure, some beefy CPUs may be able to out-do low-end GPU hardware,
> but I think the vast majority of software-rendering cases will be ordinary
> low-to-mid power computers with perfectly capable GPUs that are blacklisted
> due to old drivers.  (Judging by the Mozilla link earlier, 50% of people
> don't get hardware-accelerated WebGL.)  As noted previously, most of the
> time any GPU will far outdo software rendering.  So for practical purposes
> a flag indicating software rendering is a good heuristic that rendering
> will be slow.
> IMO running performance tests on startup to judge performance is out of
> the question because every single different application needs its own
> performance test.  This will likely require special coding, and the
> developer may not even create a fair representation, resulting in bad data
> and the wrong renderer being chosen.  Personally, I have no idea how I would
> create a single fair test for any particular game that always chooses the
> best renderer for the vast array of system configurations out there.  Think
> about a game with loads of different levels, each with different rendering
> characteristics and heavily dependent on user input and skill level - how
> do you make a worthwhile five second test for that?  I don't think it can
> be solved by frameworks either, because if there were some generalised test
> that could always pick the best renderer, we wouldn't have this problem in
> the first place.
> So I think there really ought to be a software/hardware rendering flag,
> and that's a good heuristic.  For eye candy on a web page, just don't show
> it if it's going to be software rendered, and fall back to an image or
> video.  For a simple game, you can probably get by with software rendering.
>  For Crysis 3: Browser Wars, require hardware acceleration.  That's still
> better than the "no WebGL, no content" situation that we face without
> SwiftShader.
> My 2c.
> Ashley Gullen
> Scirra.com
> On 21 February 2012 15:36, John Tamplin <[email protected]> wrote:
>> On Mon, Feb 20, 2012 at 2:14 PM, Steve Baker <[email protected]> wrote:
>>> If you're doing something like selling widgets on the Internet, then
>>> users with GPUs should get a 3D rendering of your new SuperWidget-3000
>>> that they can spin around and interact with, and low-end users should
>>> get a movie clip or a photo gallery of still images.  If you care
>>> enough about getting your message to all of the users out there, then
>>> this is a small price to pay.
>>> But if you're a game designer, there is nothing worse than finding a
>>> valid WebGL environment and then discovering that it takes 30 seconds to
>>> render a single frame, and having to resort to disgusting timing tricks
>>> to try to deduce whether a GPU is present.  I'd *much* rather a system
>>> report "Sorry, you need a machine that can run WebGL" than have it try
>>> to run my game at 0.03Hz.
>> Maybe the answer is to have some way to ask the question "I have x
>> triangles and need y fps and features z; what is the likelihood this WebGL
>> implementation can deliver that?"  Given that, the app can make a
>> reasonable decision: tell the user "sorry, you need to upgrade to use
>> this app", run as-is, or scale back.
>> As I understand it, there isn't any way to do this now other than to try
>> it and time the result, which gives a poor user experience anyway.  The
>> alternative is to detect and whitelist configurations, or leave it up to
>> the user -- "here is the known-good version, you can try the WebGL version
>> if you are brave".
>> --
>> John A. Tamplin
>> Software Engineer (GWT), Google
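The "try it and time the result" approach John describes could be sketched as below. The 500 ms window, the `drawFrame` callback, and the split into a pure helper are all hypothetical choices, not anything the thread specifies:

```javascript
// Sketch of a brief runtime frame-rate probe ("try it and time the result").

// Pure helper: frames per second from a list of frame timestamps (ms).
function fpsFromTimestamps(timestamps) {
  if (timestamps.length < 2) return 0;
  const elapsedMs = timestamps[timestamps.length - 1] - timestamps[0];
  return ((timestamps.length - 1) * 1000) / elapsedMs;
}

// Browser-side driver: render a representative scene for ~windowMs,
// then hand the measured rate to a caller-supplied decision callback.
function probeFrameRate(drawFrame, onResult, windowMs = 500) {
  const stamps = [];
  function tick(now) {
    drawFrame();
    stamps.push(now);
    if (now - stamps[0] < windowMs) {
      requestAnimationFrame(tick);
    } else {
      onResult(fpsFromTimestamps(stamps));
    }
  }
  requestAnimationFrame(tick);
}
```

As the thread notes, such a probe is only as representative as the scene it draws, which is exactly the objection raised above to per-application performance tests.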
