A couple of posts back I mentioned the current diversification of HTML5 video codecs, as browsers continue to build more and more HTML5 support into their nightly builds based on what is, currently, still a draft version of HTML5. You see, all of these awesome new tags like video, audio and (to some extent) canvas rely on a browser being able to natively support that content; on having that technology built in.

Prior to HTML5 this was managed with "plug-ins". You had your QuickTime plug-in, your Windows Media Player plug-in; during the HTML4 era the plethora of plug-ins expanded rapidly: Silverlight, Flash, AIR and so on. All were just API technologies that allowed the browser to tap into the power of other software via calls built into the plug-in. That's what made plug-ins such a pain, though: there were so many technologies, each with an associated plug-in. Whilst this was fine for very specific, technology-centric applications, for more common content requirements, like say video, it was a pain; and dare I say it, that is where Flash offered a hallowed solution. A single format that required a single plug-in that ran on all the major engines: Trident, Gecko, WebKit, Presto. That was its success. Singularity.

That is my dilemma. Now we are back in the '80s to some extent. The whole point of HTML5 is to have unity, ubiquity, a single standard. It's great that my browser natively supports HTML5 video; but browsers don't all share the same codec to play that video.

WebKit-based browsers like Safari and Chrome have gone the H.264 route, as has, unbelievably, Microsoft with its Trident engine.
Opera (Presto) and Firefox (Gecko) have both gone with the Ogg Theora codec, citing potential future licensing and patent issues as their reason for not going the H.264 route.
Google have had some foresight regarding those same licensing/patent issues and now have their OWN codec: the VP8 video codec.

Why am I repeating this? Because this is the problem. For true HTML5 compatibility I now need to encode my video in at least two different formats. I'm back to the QuickTime vs. Windows Media plug-in issue. Then Flash stepped in; and I'm guessing this is the approach Google are taking with their VP8 codec in this analogy. They hope to offer the ubiquity Flash did. If I were Adobe I'd be focusing on that right now: ubiquitous video, the HTML5 equivalent of Flash.
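In practice, shipping video across this split means listing multiple encodings and letting each browser pick the first one it can decode, which is what the video element's canPlayType() check is for. A minimal sketch of that selection logic; the pickSource helper and the Theora-only mock below are mine, for illustration, standing in for a real video element:

```javascript
// Given a list of candidate encodings, return the first one the browser
// says it can play. In a real page, canPlayType would come from
// document.createElement('video').canPlayType; it returns 'probably',
// 'maybe', or '' (empty string) for "no".
function pickSource(sources, canPlayType) {
  for (var i = 0; i < sources.length; i++) {
    if (canPlayType(sources[i].type) !== '') {
      return sources[i].src;
    }
  }
  return null; // no native support; fall back to a Flash player
}

// Mock of a Theora-only browser (e.g. Firefox/Gecko or Opera/Presto).
function mockCanPlayType(type) {
  return /theora/.test(type) ? 'probably' : '';
}

var src = pickSource([
  { src: 'clip.mp4', type: 'video/mp4; codecs="avc1.42E01E"' },
  { src: 'clip.ogv', type: 'video/ogg; codecs="theora, vorbis"' }
], mockCanPlayType);
```

This is the same first-match logic the video tag's multiple source children perform declaratively; either way, the point stands that you still have to produce and host both files.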

Anyway, back to the original title: WebGL. What is it? WebGL is a context of the canvas tag. Through its API and traditional JavaScript you can expose 3D graphics content inside the HTML canvas tag and utilize the power of your users' graphics hardware. You can initialize powerful renderers and shaders, and experiment with bump mapping and anisotropic filtering (things game developers have been doing for years) to display some amazing 3D graphics. The O3D WebGL API is just one amazing example of this. WARNING: you need a WebGL-compliant browser and some pretty decent graphics hardware (ATI/NVIDIA) to appreciate this at its best.
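Getting at that power starts with asking the canvas for a WebGL context, and in these nightly builds even the context's name varies between engines, so a fallback chain is common. A minimal sketch; the mock canvas here is mine, standing in for a real DOM element fetched with document.getElementById:

```javascript
// Try the names WebGL has shipped under across nightly builds and
// return the first context the canvas will hand back.
function getGLContext(canvas) {
  var names = ['webgl', 'experimental-webgl', 'moz-webgl', 'webkit-3d'];
  for (var i = 0; i < names.length; i++) {
    var ctx = canvas.getContext(names[i]);
    if (ctx) {
      return ctx;
    }
  }
  return null; // no WebGL support in this browser
}

// Mock canvas: this pretend browser only answers to the
// 'experimental-webgl' name, as several current builds do.
var fakeGL = { name: 'gl' };
var mockCanvas = {
  getContext: function (name) {
    return name === 'experimental-webgl' ? fakeGL : null;
  }
};

var gl = getGLContext(mockCanvas);
```

Once you hold the context object, everything else (buffers, shaders, draw calls) goes through it, which is why this one call is the gateway to all the renderer and shader work mentioned above.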

But again it's like the video/audio codec situation. Whilst WebGL is an already-defined standard, there are now several "middle-ware" implementations out there.

O3D, GLSL and a ton of other libraries that incorporate WebGL: GLGE, C3DL, Copperlicht, SpiderGL, SceneJS, Processing.js and XB PointStream.

Could this be another HTML5 divergence, with each browser deciding to implement different "middle-ware" WebGL APIs? Will my Gecko WebGL code not work in a WebKit-based browser?

In either case I can foresee developers who are able to command the full extent of the canvas tag being in demand. The canvas tag is literally that: a canvas onto which you need to paint with other APIs; WebGL and SVG are just two examples. I think we could even see game developers joining web teams, bringing their seasoned experience of OpenGL development to the internet. The canvas tag is definitely a deep well. With the potential for so many APIs to run within the canvas, could this be a new vector for hackers to execute arbitrary code? WebGL APIs running in the canvas give developers a chance to leverage hardware. What else can be leveraged from within the canvas?

I think ultimately, though, the canvas tag combined with API technologies will open up a gateway to a new generation of the internet: applications running in the cloud, including gaming applications reliant on OpenGL technology. I think computers will simply become "portals" to the internet. The iPad is possibly a current precursor.

More on WebGL

WebGL Experiments