The good news is that the basics of URIs, HTTP connection handling and caching were not a problem; every implementation passed with flying colors. Send Cache-Control: no-cache or max-age and they'll do the right thing; in general they parse the headers, forward them on, and return the response correctly.
The bad news is that more complex functionality is spottily supported, at best. I suspect this is because everyday browsing doesn't exercise HTTP the way more advanced uses such as WebDAV and service APIs do.
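The cache math these implementations get right is genuinely simple: a stored response is reusable only while its age stays under max-age, and never when it's marked no-cache. A rough sketch of that freshness check (simplified from the RFC 2616 rules; function names are mine):

```python
def parse_cache_control(header):
    """Parse a Cache-Control header like 'no-cache, max-age=60' into a dict."""
    directives = {}
    for part in header.split(","):
        part = part.strip()
        if not part:
            continue
        name, _, value = part.partition("=")
        # valueless directives (no-cache, no-store) are stored as True
        directives[name.lower()] = value or True
    return directives

def is_fresh(cache_control, age_seconds):
    """A cached response may be served without revalidation only if it
    isn't marked no-cache/no-store and its age is below max-age."""
    cc = parse_cache_control(cache_control)
    if "no-cache" in cc or "no-store" in cc:
        return False
    if "max-age" in cc:
        return age_seconds < int(cc["max-age"])
    # no explicit freshness info; a real cache may fall back to heuristics
    return False
```

This skips plenty of the real spec (Expires, heuristic freshness, revalidation), but it is the core of what every implementation tested here handled correctly.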
Tag: http
Browser Cache Usage
40-60% of Yahoo!’s users have an empty cache experience
huh? wow. i wonder how we as an industry can do better.
S3CDNFilter
phil writes a servlet filter to make CDN via S3 a snap. too many acronyms?
General-purpose intermediation
It ought to be trivial to attach an observer and/or filter to HTTP pipelines. Among other things, it could shovel data into a search engine so that I could instantly recall a remembered transaction by search term, by date, or by site.
stefano’s pipelines coming to HTTP. this enabled jon to reverse engineer a gmail api
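the observer idea is easy to sketch: tap the pipeline so every request/response pair flows past a callback that indexes it for later recall. a toy in-memory version (all class and method names are hypothetical, not any real pipeline API) that makes transactions searchable by term, date, or site:

```python
from datetime import date

class TransactionIndex:
    """Toy HTTP-pipeline observer: records each transaction and supports
    recall by search term, date, or site. In a real setup, observe()
    would be attached as a filter/tap on the pipeline."""

    def __init__(self):
        self.transactions = []

    def observe(self, site, body, when=None):
        # Called once per completed HTTP transaction.
        self.transactions.append({
            "site": site,
            "body": body,
            "date": when or date.today(),
        })

    def search(self, term=None, site=None, on=None):
        # Any combination of criteria; None means "don't filter on this".
        def match(t):
            return ((term is None or term in t["body"])
                    and (site is None or t["site"] == site)
                    and (on is None or t["date"] == on))
        return [t for t in self.transactions if match(t)]
```

Wiring this to a real proxy or client library is the hard part; the point is that the observer itself is a few lines, and everything interesting (full-text search, persistence) hangs off it.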
Reducing HTTP Requests
html fetch time is ~10% of total page load time.
Detecting 304
And for feed consumers: while supporting these headers can save you bandwidth, computing a hash of the content may save you processing time.
i can confirm that 😉
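the trick here: hash the response body into an ETag, and when the client echoes it back in If-None-Match, skip the body and answer 304. a minimal server-side sketch (framework-agnostic; the function names are mine):

```python
import hashlib

def make_etag(body):
    """Derive a strong validator from the body: any change to the
    content changes the tag."""
    return '"%s"' % hashlib.sha1(body).hexdigest()

def respond(body, if_none_match=None):
    """Return (status, headers, body). If the client's cached validator
    still matches, send 304 with no body; otherwise 200 plus the tag."""
    etag = make_etag(body)
    if if_none_match == etag:
        return 304, {"ETag": etag}, b""
    return 200, {"ETag": etag}, body
```

You still generate (or at least hash) the content to compute the tag, but you skip serializing and sending the body, which for a fat feed is most of the cost.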
michael radwin on web caching
Michael Radwin knows what he is talking about: Caching and Cache-busting for Content Publishers. highly recommended.
case sensitivity in web dav
it is 2003. hard to believe windows still has issues with case sensitivity in its web dav clients.
WebDAV and CMS
henri makes some interesting observations about webdav, and how it could be used in cms:
WebDAV could also be used as an interop protocol for exchanging content between different CMS systems. The problem is that there would need to be a common data model for this. Options include adopting DocBook or Zope CMF as the common data model. Having Dublin Core properties for all resources in a common namespace would also be a possibility. HTTP error codes should be used for providing information on failed or succeeded creates and updates. The DeltaV specification also defines an XML format for the body of the error reports (mgd_errstr, etc).
generating webdav views from dynamic cms content will not be without difficulties, but will lift open source cms to a whole different level.
Reliable HTTP
A bit unclear why this is needed:
Reliable HTTP (HTTPR) is a new protocol that offers the reliable delivery of HTTP packets between the server and client. This solves a number of issues that are evident in current HTTP and opens the way to reliable messaging between Web services.