[Public WebGL] Re: webgl/swiftshader

Steve Baker [email protected]
Mon Feb 20 11:14:57 PST 2012


While I'm not actively against a software rendering fallback, I honestly
think it's a waste of effort to make one - and that it'll do some people
more harm than good.

Ten years ago, a significant number of people were still using 9600 baud
dialup, but many were starting to get higher-speed connections.  There
were many websites that were *SO* slow at 9600 baud that they were
effectively inaccessible.  The fix for this was to have "Low Bandwidth
Version" links for people who had dialup...you don't see many sites like
that anymore - but they used to be everywhere.

When designing a new software technology that is expected to be pervasive
- one cannot afford to target the lowest common denominator because to do
so is to invest heavily in a dying population of users.  If we did that
then we wouldn't be able to put 1024x1024 pixel images up on our web pages
because a few die-hards still only have dialup and it takes them ~10
minutes to load it.

WebGL *must* take a leap forward and assume the existence of a GPU or else
it'll be useless for anything beyond minimal eye-candy.  This is an
'intercept' technology - we design for the future and let the world catch
up with us.  That way we're ready when the technology is so pervasive
that it's safe to assume it exists everywhere (just as we now assume that
nobody we care about still has dialup).

Sadly, we're now in that interim period where not all computers will be
able to display all of this cool new stuff.  But that shouldn't stop web
designers from using it.  There is a perfectly good way to test to see
whether you have a valid WebGL environment - and if you don't, and you
care enough about that decreasing population of people without a GPU -
then you find some other way to convey your message to them.
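
Something along these lines is enough for that test - a minimal sketch,
where the fallback message is just a placeholder for whatever movie clip
or photo gallery you'd actually serve:

    // Minimal sketch: ask the canvas for a WebGL context and branch on the result.
    function hasWebGL(): boolean {
      try {
        const canvas = document.createElement("canvas");
        // Browsers of this era often exposed the context as "experimental-webgl".
        const gl = canvas.getContext("webgl") ||
                   canvas.getContext("experimental-webgl");
        return gl !== null;
      } catch (e) {
        return false;
      }
    }

    if (!hasWebGL()) {
      // No usable WebGL: show the non-3D version of the content instead.
      // (Placeholder only - a real page would swap in a video or still images.)
      document.body.textContent = "Sorry, you need a machine that can run WebGL.";
    }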

If you're doing something like selling widgets on the Internet - then
users with GPUs should get a 3D rendering of your new SuperWidget-3000
that they can spin around and interact with - and low end users should get
a movie clip or a photo gallery of still images or something.  If you care
enough about getting your message to all of the users out there then this
is a small price to pay.

But - if you're a game designer - there is nothing worse than finding a
valid WebGL environment and then discovering that it takes 30 seconds to
render a single frame, leaving you to resort to disgusting timing tricks
to try to deduce whether there is a GPU or not.  I'd *much* rather have a
system report "Sorry, you need a machine that can run WebGL" than have it
try to run my game at 0.03Hz.
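
For what it's worth, the timing trick looks something like this rough
sketch, where drawFrame() is a hypothetical callback that renders one
representative frame of your scene and the 50 ms cutoff is an arbitrary
guess rather than a number from any spec:

    // Crude heuristic: time a few representative frames and guess whether the
    // renderer is fast enough to plausibly be hardware accelerated.
    function probablyHasGPU(
      gl: WebGLRenderingContext,
      drawFrame: () => void,       // hypothetical: draws one typical frame
      frames: number = 10,
      maxMsPerFrame: number = 50   // assumed cutoff; slower than this smells like software
    ): boolean {
      const start = performance.now();
      for (let i = 0; i < frames; i++) {
        drawFrame();
      }
      // finish() blocks until the queued GL work is done, so the elapsed time
      // reflects real rendering cost rather than just command submission.
      gl.finish();
      const msPerFrame = (performance.now() - start) / frames;
      return msPerFrame < maxMsPerFrame;
    }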

If software rendering were (say) 50% or even 10% as fast as the GPU, then
I'd be tempted to try to accommodate it - but it isn't.  It's about 1% of
the speed in small windows and 0.1% of the speed in large windows.  I
can't make my content scale that much.  If I have a 1000 triangle cow in
my scene - I can't drop that to a 10 triangle cow for software rendering
because you simply cannot topologically make a cow in 10 triangles!  I'd
have to do it by drawing fewer cows - but if my game centers on chasing
herds of cows then the gameplay falls apart if you don't have enough of
them.

  -- Steve

Florian Bösch wrote:
> On Mon, Feb 20, 2012 at 6:51 PM, Alvaro Segura
> <[email protected]> wrote:
>
>> My view is that software rendering is not only welcome but even
>> necessary.
>>
> It's very necessary, there's no denying that.
>
>
>> Real world sites can't afford to use a technology that will not work
>> for a large percentage of users or they will face countless complaints.
>
> That depends on what real world page you're doing. If you're writing a
> WebGL game intended for gamers, and not the next article in the New York
> Times, your requirements differ a lot.
>
>
>> Users of low end systems can understand things working suboptimally
>> for them, but not that they don't work at all.
>
> Some things will not work at all with software rendering, mainly in the
> range of performance. For instance, if your use case requires 5000
> triangles but performance is too slow, you can probably scale down to
> 2500 triangles. But if your use case requires 2 million triangles, there
> simply isn't any way that's gonna fit into 2500 triangles. There are
> similar "can simply not scale down that much" issues around texel
> throughput, texture units, extensions, VRAM usage, etc.
>
>
>> Standards-based Web technology must not impose special hardware
>> requirements:
>>
> Efficient 3D rendering unfortunately does require special hardware.
> Therefore that statement would have to be restated as "WebGL should be
> restricted to the capabilities of software rendering only", which for
> most intents and purposes would be "no WebGL at all".  Clearly that's
> not a valid position.
>
>
>
>> So, my opinion is: a fallback is necessary, i.e. a "better than nothing"
>> fallback.
>>
> Yes, a software fallback is nice, but it does not mean things will
> magically work for everybody. They will work for some.
>
>
>> In the time before programmable shaders and when 3D acceleration was
>> less common, software rendering was quite ubiquitous. VRML software
>> renderers perform quite well for their needs; they just work everywhere,
>> but work much better on good 3D hardware.
>>
> Nobody uses VRML (or X3D or O3D), which proves the point that another
> forward shading rasterizer and scene-graph library is something nobody
> has wanted for the last 20 years. Thinking "it worked for VRML, so it's
> gonna work for WebGL" is flawed.
>








