Stephen O'Grady has had a script or two broken by a Gmail alteration: "To be a bit less harsh, while Google probably had good reasons for making the change, it would have been great to see them be proactive and notify people of the change via their blog or some other mechanism." I'm surprised there isn't more discussion about this.

Greasemonkey is a cool idea, but these scripts are so fragile it's not funny; they make side-effected GETs look robust. The page DOM is not part of the API contract, which is the basis of Alex Bosworth's argument. In programmer terms it's like depending on the field names rather than the field values. Build on this stuff and you more or less commit to lockstepping with the server. It feels like a house of cards (see the sketch at the end of this post).

Someone might respond by saying that, well, I'm sure 404s seemed like a house of cards 12 years ago when the Web dropped backlinking, or 8 years ago when XML dropped shorttags*. Fine, except I've not heard the Greasemonkey approach articulated in terms of a tradeoff to get adoption - anyone out there think DOM scraping is an architectural insight?

* And while we're at it, here's a strawman. Retrofitting backlinks dominates Web innovation - pagerank, wikis, tags, folksonomies, trackback, pingback, bloglines, del.icio.us, pubsub, technorati - enabling backlinking is what releases value. When people talk about building out social computing infrastructure, backlinking is also the basis for that....
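To make the fragility concrete, here's a minimal sketch of the kind of dependency a Greasemonkey-style script takes on. The selector and function name are made up, not Gmail's actual markup; the point is that the script's only contract is whatever class names and nesting the page happens to ship today.

```typescript
// A minimal sketch, not Gmail's real markup: ".msg-row.unread" is a made-up
// selector standing in for whatever class names the page happens to use today.
function countUnread(doc: Document): number {
  // The only "contract" here is the current DOM shape. Rename the class or
  // restructure the markup server-side and this silently returns 0 -
  // the data on screen hasn't changed, but the scraper has broken.
  return doc.querySelectorAll(".msg-row.unread").length;
}

// By contrast, a field published in an API response is a value the server
// promises to keep meaningful; a class name promises nothing.
```

That's the sense in which these scripts lockstep with the server: every markup refactor is a potential breaking change, even when the data the user sees is identical.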