The existing Web architecture already provides a couple of ways for polling-based applications to save bandwidth, including HTTP conditional GET and gzip compression over HTTP. Based on a quick check with Rex Swain's HTTP Viewer, very few web sites (including Dylan Green's) actually support both of these well-known bandwidth-saving techniques. Using both can cut bandwidth costs by an order of magnitude (by a factor of 10 for the mathematically challenged). Before coming up with sophisticated hacks for perceived problems, it'd be nice if website administrators actually used existing best practices instead of trying to reinvent the wheel in more complex ways.
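To make the two techniques concrete, here's a minimal server-side sketch of how a feed endpoint can honor them. The function name and signature are illustrative, not from any particular framework: the client's `If-None-Match` header drives the conditional GET (a 304 with no body when the cached copy is current), and `Accept-Encoding` drives gzip compression of the response body.

```python
import gzip

def handle_feed_request(request_headers, feed_etag, feed_body):
    """Illustrative sketch of HTTP conditional GET plus gzip for a feed.

    request_headers: dict of the incoming HTTP request headers
    feed_etag: the current ETag for the feed (e.g. a hash of its content)
    feed_body: the feed XML as a string
    Returns (status_code, response_headers, response_body_bytes).
    """
    # Conditional GET: if the client's cached copy is still current,
    # answer 304 Not Modified and send no body at all.
    if request_headers.get("If-None-Match") == feed_etag:
        return 304, {}, b""

    # Otherwise send the full feed, gzip-compressed if the client
    # advertises support for it via Accept-Encoding.
    headers = {"ETag": feed_etag, "Content-Type": "application/rss+xml"}
    body = feed_body.encode("utf-8")
    if "gzip" in request_headers.get("Accept-Encoding", ""):
        headers["Content-Encoding"] = "gzip"
        body = gzip.compress(body)
    return 200, headers, body
```

A polling client that remembers the `ETag` from its last fetch pays for the full (compressed) body only when the feed actually changes; every other poll costs a few hundred bytes of headers.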
That said, it would be a nice additional optimization for web sites to provide only the items that haven't been read by a particular client on each request for the RSS feed. However, I'd like to see us learn to crawl before we try to walk.
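A hypothetical sketch of that per-client optimization, assuming the client can report the id of the last item it has seen (the function and field names here are invented for illustration, not part of any RSS spec):

```python
def unread_items(items, last_seen_id):
    """Return only the feed entries newer than the last item the
    client reports having seen. 'items' is assumed newest-first;
    an unrecognized id falls back to the full feed."""
    fresh = []
    for item in items:
        if item["id"] == last_seen_id:
            break  # everything from here on, the client already has
        fresh.append(item)
    return fresh
```

With something like this, each poll would carry only the delta instead of the entire feed, though it requires per-client state (or a client-supplied marker) that plain conditional GET does not.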