HTML fetch time is ~10% of total page load time.
Tag: performance
Detecting 304
As for feed consumers: while supporting these headers can save you bandwidth, computing a hash of the content may also save you processing time.
I can confirm that 😉
JavaScript Code Inefficiencies
need to mine this at some point
Encoded Polyline Algorithm
aha, gmaps employs some clever compression for polylines, using offsets
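The offset trick can be sketched like this: each coordinate is scaled to five decimal places, delta-encoded against the previous point (so most values are tiny), zigzag-shifted so negatives stay small, and emitted in 5-bit chunks offset by 63 into printable ASCII. A rough JS version, not Google's actual code:

```javascript
// Sketch of the encoded polyline scheme: deltas + zigzag + 5-bit chunks.
function encodeValue(delta, out) {
  let v = delta < 0 ? ~(delta << 1) : delta << 1; // zigzag: sign into low bit
  while (v >= 0x20) {
    out.push(String.fromCharCode((0x20 | (v & 0x1f)) + 63)); // continuation bit
    v >>= 5;
  }
  out.push(String.fromCharCode(v + 63));
}

function encodePolyline(points) {        // points: [[lat, lng], ...]
  let prevLat = 0, prevLng = 0;
  const out = [];
  for (const [lat, lng] of points) {
    const latE5 = Math.round(lat * 1e5);
    const lngE5 = Math.round(lng * 1e5);
    encodeValue(latE5 - prevLat, out);   // only the offset is encoded
    encodeValue(lngE5 - prevLng, out);
    prevLat = latE5;
    prevLng = lngE5;
  }
  return out.join('');
}

// encodePolyline([[38.5, -120.2]]) → "_p~iF~ps|U"
```

Because only offsets are encoded, a long smooth path compresses to a couple of characters per point.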
new considered harmful
Do not use new Number, new String, or new Boolean. These forms produce unnecessary object wrappers; just use simple literals instead. The same goes for new Function: a function literal is smaller and faster.
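A quick illustration of why the wrappers bite:

```javascript
// Wrapper objects vs. primitive literals.
const wrapped = new Number(0);  // an object, not a number
const literal = 0;

console.log(typeof wrapped);    // "object"
console.log(typeof literal);    // "number"

// Wrapper objects are truthy even when they wrap a falsy value:
if (new Boolean(false)) {
  // this branch runs, which is almost never what you want
}

// Literals compare the way you expect; wrappers compare by identity.
console.log(literal === 0);     // true
console.log(wrapped === 0);     // false
```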
MySQL Camp Google Notes
The most fascinating bits I took out of it are how they use a partitioning/sharding strategy similar to (but notably different in some ways from) WordPress.com's, and that they use DNS to manage all load balancing, high availability, datacenter failover, etc. DNS is a pretty powerful building block.
some interesting tricks
Marissa Mayer at Web 2.0
Marissa went on to describe how they rolled out a new version of Google Maps that was lighter (in page size) and rendered much faster. Google Maps immediately saw substantial boosts in traffic and usage.
reiser4 benchmarks
reiser4 can do 50% more req/s than reiser3, and 33% more than ext3
Avoiding JavaScript Memory Leaks
1) Set XMLHttpRequest's onreadystatechange to null once the request completes. 2) Clean up all your DOM event handlers on unload. 3) Never put anything in a DOM expando or property other than a primitive value unless you plan on cleaning it up.
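The first rule can be sketched like this, using a stand-in xhr object (in the browser it would be a real XMLHttpRequest); the helper name is hypothetical. The closure assigned to onreadystatechange references the xhr, and the xhr references the closure, so nulling the handler breaks the cycle that older garbage collectors couldn't reclaim:

```javascript
// Sketch: break the xhr <-> handler reference cycle when the request is done.
function fetchWithCleanup(xhr, onDone) {
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {
      onDone(xhr.responseText);
      xhr.onreadystatechange = null; // release the closure and its scope
    }
  };
  xhr.send();
}
```

Rule 2 is the same idea applied to DOM nodes: remove on unload every handler you attached, so no closure keeps a detached node alive.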
New memcached release
1.2.0 brings reduced CPU usage and increased memory efficiency, plus bug fixes