A little earlier today, having read how Sky broadband had blocked the jQuery CDN, I tweeted:
"What the Sky/jQuery thing teaches us is that unpredictable factors can cause good JS to fail. Plan by designing pages to work without first." — Drew McLellan (@drewm) January 27, 2014
The internet, as a network, is designed to be tolerant of faults. If parts of the network fail, the damage gets routed around and things keep working.

HTML is designed to be tolerant of faults. If a document has unrecognised tags, or only partially downloads, or is structured weirdly, the browser will do its best to keep displaying as much of that page as it can without throwing errors.

CSS is designed to be tolerant of faults. If a selector doesn't match, or a property is not supported, or a value is unrecognised, the browser steps over the damage and keeps going.
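For the specific failure that prompted the tweet, a script blocked at the CDN, one common mitigation is to fall back to a locally hosted copy when the CDN copy never arrives. This is a minimal sketch of that pattern, not a recommendation from the original post, and no substitute for making the page work without JavaScript in the first place; the version number and local path are placeholders:

<script src="https://code.jquery.com/jquery-1.11.0.min.js"></script>
<script>
  // If the CDN copy was blocked, or the network failed, window.jQuery
  // is undefined. Fall back to a copy served from our own domain.
  // The local path and version are illustrative, not from the post.
  window.jQuery || document.write('<script src="/js/jquery-1.11.0.min.js"><\/script>');
</script>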
This isn't a new concept; it's a very old one. What is new, however, is the backlash against this very simple idea by people who, at the same time, consider themselves to be professional web developers.
Somewhere along the line that all got lost. I'm not sure where – it was still alive and well when jQuery launched with its "find something, do something" approach (that's progressive enhancement). It was lost by the time AngularJS was ever considered an approach of any merit whatsoever.
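To make that concrete, here is a minimal sketch of "find something, do something" used as progressive enhancement. The markup, selectors and URLs are illustrative, not taken from the original post; the point is that the plain link works on its own, and the script only adds to it:

// Assumed markup, which works as a normal navigation without any JavaScript:
// <a href="/contact" class="js-contact">Contact us</a>

$(function () {
  // Find something: if the element isn't on the page, or this script
  // never loads at all, the link still works as plain HTML.
  $('.js-contact').on('click', function (event) {
    event.preventDefault();
    // Do something: enhance the existing behaviour, e.g. pull the form
    // into the current page instead of navigating away.
    $('#main').load('/contact #contact-form');
  });
});

If jQuery never arrives, as it didn't for Sky's customers, none of this runs and the user simply gets the unenhanced page rather than a broken one.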
When did the industry stop caring about this stuff, and why? We spend hours in test labs working on the best user experience we can deliver, and then don’t care if we deliver nothing. Is it because we half expect what we’re building will never launch anyway, or will be replaced in 6 months?