Oliver Nassar

Progressive enhancement

August 13, 2009

I was in an interview not too long ago for a company with a terrible terrible idea, with easily the most terrible team leader I've ever had the misfortune of speaking with in a professional context, when someone in the interview asked me about 'Progressive Enhancement'.

I didn't know what the term meant, and when he explained it to me, it made logical sense; it's something that I've been doing for years, but I just never knew the formal name of it.

In a nutshell, progressive enhancement is a way of developing and designing a web experience to be optimal for the viewer. It's a very general term/description, but that's because it can be applied in many different areas. For example, CSS is in itself a progressive enhancement technique. The reason I say that is that when Google indexes your site, it doesn't see CSS and it doesn't see much layout; it just sees a bunch of text, code and tags that it gives importance to.

Now imagine your friend checked out that site; if they viewed it the way Google does, they wouldn't be able to use it much (the internet of the early 90s was like this). CSS came along and was designed to lift/boost the visual experience for a visitor who isn't a robot and can understand visual context.

But that's the most general case; two that I've been doing lately are CSS3 graceful degradation and ajax-based hash-requests (my own term that I just thought of now). CSS3 graceful degradation is what it sounds like. I throw in a bunch of CSS3 for browsers like IE8, Safari 4 and FF 3.5 that makes the user interface sparkle and work better. The other browsers aren't affected, so nothing bad happens.
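
Just to illustrate (the selector and values here are made up), something like this is all it takes; browsers that don't understand the CSS3 properties skip right over them:

/* baseline styling every browser understands */
.button {
    background: #2d6ca2;
    color: #fff;
    padding: 6px 12px;

    /* CSS3 extras: newer browsers get the polish, older ones ignore these lines */
    border-radius: 4px;
    box-shadow: 0 1px 3px rgba(0, 0, 0, 0.3);
    text-shadow: 0 1px 1px rgba(0, 0, 0, 0.2);
}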

This one is commonplace, and has been for a while, but something that's really cool and super effective (something Digg should've done for their Digg bar URL fiasco, which I can safely say as a backseat-driver) is anchor link experience degradation (another term I've coined just now, and it's linked to the one above). Basically, all these anchors have href values that point to another page/experience, right? Well, using some not-so-fancy javascript, you can override the default behaviour for browsers that support javascript (e.g. not Google bots) to change the experience.

So for example, I've got a link on my homepage that points to my profile, yadda yadda, right? Google's bot is going to hit this, visit that page, index it, and keep on moving. A user would normally do the same, but using progressive enhancement, I can make the experience so much better. I can make it so that when they hit the link, I capture the event, prevent it from happening, and instead take the href value and make an ajax call to the same URL. My server is configured to notice that this is an ajax call for the profile page, and returns the contents of that page (or view) in a JSON object. I then replace only the part of the page that would be different for the profile. This prevents a full browser resource re-load (e.g. headers, footers, assets, etc. don't get loaded/requested again).
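
The client side of that looks roughly like this (it's just a sketch; the element ids, the JSON shape and the header I'm sending are placeholders for illustration):

// hypothetical profile link and content area
var link = document.getElementById('profile-link');
link.onclick = function (event) {
    if (event && event.preventDefault) {
        event.preventDefault();
    }

    // hit the same url the href points to, but over ajax
    var xhr = new XMLHttpRequest();
    xhr.open('GET', link.href, true);
    xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            // server noticed the ajax request and returned a JSON object
            var view = JSON.parse(xhr.responseText);

            // only swap out the part of the page that's different
            document.getElementById('content').innerHTML = view.html;
        }
    };
    xhr.send(null);

    return false; // cancel the normal navigation in older browsers
};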

Now this goes beyond the core of what ajax is about; ajax is about enhancing the user experience by preventing unnecessary loads. This is about changing the entire experience altogether, to make your site completely ajax-driven, bookmarkable (via URLs like website.com/#/users/onassar/profile/), navigable through the forward/back buttons, and faster. It takes a lot of work on both the server and client side, but it changes the entire scope of your application/site. And because you're using progressive enhancement, you're keeping Google happy with new links for it to index (unlike Flash), and you're making two versions of your site: an ajax one, which is faster, and a regular synchronous one, which can be used by old-school browsers.
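
The hash side of it can be sketched out like this (loadView here is a hypothetical helper standing in for the ajax call above):

// read the path out of the hash, e.g. "/users/onassar/profile/"
function handleHash() {
    var path = window.location.hash.replace(/^#/, '');
    if (path) {
        loadView(path);
    }
}

function loadView(path) {
    // hypothetical: same idea as the sketch above; ask the server for
    // the view at this path over ajax and swap only the content area in
}

// fires when a link updates the hash, and on the forward/back buttons
window.onhashchange = handleHash;

// handle someone landing directly on a bookmarked url
// like website.com/#/users/onassar/profile/
handleHash();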

This technique has always been baked into everything I've done on the client side, and while I didn't know the name (and to be honest, as long as you're implementing it, the name shouldn't/doesn't matter), it's something I'd really push for wherever possible. It has a lot to do with some more abstract concepts such as RESTful resources/requests, but at the heart of it, it's about improving the user experience as much as possible, and degrading gracefully when it won't work.

Regarding the Digg fiasco: basically, a bunch of people originally complained that when Digg changed all the anchor URLs to point to their Digg bar, which pushed out a link in a frame, they were losing SEO juice. Now, while Google could get pissed at this, a fairly straightforward progressive enhancement technique could've been to write out links like this:

<a href="http://iwantseojuice.com/random" rel="b38d0">yadda</a>

Google would visit, move on through to the page, and pass along the 'yadda' SEO juice. But then, by adding in some nice client-side JS (weird that I have to say that, since there are no server-side JS ports), I could prevent the default action of that anchor and redirect the browser to digg.com/[REL VALUE].
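
Something along these lines (just a rough sketch):

// for javascript-capable visitors only; crawlers never run this and
// just follow the href, so the SEO juice keeps flowing
var anchors = document.getElementsByTagName('a');
for (var i = 0; i < anchors.length; i++) {
    (function (anchor) {
        var rel = anchor.getAttribute('rel');
        if (!rel) {
            return;
        }
        anchor.onclick = function () {
            // send the real visitor off to the Digg bar instead
            window.location.href = 'http://digg.com/' + rel;
            return false; // cancel the normal navigation
        };
    })(anchors[i]);
}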

This would keep the juice flowing and improve the experience by redirecting to the Digg bar URL. While I'm sure they thought of this (super smart dudes over there), and maybe they ran into a Google/search engine blacklisting situation since they were in fact leading Google to a different experience than the one intended for the real visitor, I would've been curious to see how bad the upset would've been.