Instead of the [Web Developer toolbar] and [OctaGate SiteTimer] plugins, you can use another great Firefox plugin: Firebug!
I haven’t had any problems with it so far, but maybe I’m overlooking something.
@magic.ant: Thanks, I forgot to mention Firebug. It’s great for debugging, but does it break down page sizes as well? (Update: Firebug does show page sizes on the Net tab, not sure how I missed that one! I’ve updated the article.)
@Blaise: The ThinkVitamin article goes into more detail – apparently some versions of Netscape and Internet Explorer (4-6) have issues correctly decompressing gzipped content:
There may be workarounds by detecting the browser and returning different code, but I don’t think it’s worth the risk.
However, if you do get it working in all browsers, I’d love to hear about it!
A series of tips to keep JavaScript loads from taking too long and making the user wait, something people don’t tend to like, especially now that there are more and more effects and features that make the…
HTTP already allows gzipping the payload, so there is really no need to think about gzipping it yourself.
I want a script that loads as the page loads.
Can I do it?
Please help me.
Hi Raheel, check out the example in part 2:
There’s a script that loads after 5 seconds – you can change the delay to 30 seconds instead (30 * 1000 milliseconds).
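To make that concrete, here’s a minimal sketch of the delayed-load approach. The function name `loadScript` and the URL `extra.js` are placeholders of my own, not from the article:

```javascript
// Delay before fetching the extra script: 30 * 1000 ms = 30 seconds.
// (The article's example uses 5 * 1000.)
var DELAY_MS = 30 * 1000;

// Append a <script> tag to <head>, which makes the browser fetch and run it.
function loadScript(url) {
  var script = document.createElement('script');
  script.type = 'text/javascript';
  script.src = url;
  document.getElementsByTagName('head')[0].appendChild(script);
}

// Guarded so the snippet also parses outside a browser.
if (typeof window !== 'undefined') {
  window.onload = function () {
    setTimeout(function () {
      loadScript('extra.js'); // placeholder URL -- point at your own script
    }, DELAY_MS);
  };
}
```

Waiting for `window.onload` before even starting the timer keeps the extra script from competing with the page’s own content for bandwidth.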
Kalid, I don’t agree with your comment on not using gzip. Gzip support is required by the HTTP 1.1 protocol, which most modern browsers implement. The bug in IE6 that the ThinkVitamin article references affects not just gzipped content, but other content as well. People are far better off using gzip than not.
Also, regarding your comments on caching: you need to be careful about which HTTP 1.1 caching headers you use, because users could be going through HTTP 1.0 proxies, resulting in odd/unexpected behavior.
Hi Trevin, thanks for the comments.
- Regarding gzip compression, I agree that it is extremely useful (in fact, I’m researching the best way to turn it on for InstaCalc).
One problem with IE6 is that it says it accepts gzip’d content but may have problems decoding it:
As a result, webmasters serving gzip’d content have to resort to hacks like detecting the browser user-agent and returning regular content to IE6, even if it says it can accept compressed content:
These tricks can be done, but may be tough for a newbie. Of course, we can just take a scorched-earth approach and let IE6 choke if it can’t render the page, but that’s a tough stance to take given that IE6 still has a large fraction of browser share.
However, I agree with you that output compression is extremely valuable. In my tests it shaves off over 2/3 of the bandwidth, so I really, really want to enable it (and find a suitable workaround for IE6).
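For reference, the user-agent workaround usually takes roughly the shape of the standard mod_deflate recipe from the Apache 2.x docs — enable compression, then fall back for browsers with known gzip bugs. A sketch (treat the exact `BrowserMatch` patterns as the usual boilerplate; they don’t specifically exclude IE6):

```apache
# Compress common text types (Apache 2.x, mod_deflate)
AddOutputFilterByType DEFLATE text/html text/plain text/css application/x-javascript

# Netscape 4.x has problems with gzipped content other than HTML
BrowserMatch ^Mozilla/4 gzip-only-text/html
# Netscape 4.06-4.08 have worse problems -- disable gzip entirely
BrowserMatch ^Mozilla/4\.0[678] no-gzip
# IE identifies itself as Mozilla/4, but handles gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
```

To take the cautious route for IE6 specifically, a stricter line like `BrowserMatch "MSIE 6" no-gzip` would turn compression off for it entirely, at the cost of serving it uncompressed pages.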
- Yes, caching can be tricky as well. As I follow these topics down the rabbit hole, I’m seeing more of the intricacies.
In general, though, it appears that an old HTTP 1.0 proxy won’t cache something when only a newer HTTP 1.1 header is set. You might not get the performance benefit when going through an old proxy, but I’m not sure what other impact there would be.
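One way to hedge against old HTTP 1.0 proxies is to send both generations of headers — the HTTP 1.1 `Cache-Control` plus the older HTTP 1.0 `Expires`. A sketch using Apache’s mod_expires and mod_headers (the one-week lifetime and the `.js`/`.css` match are arbitrary examples of mine):

```apache
# Assumes mod_expires and mod_headers are enabled.
# mod_expires emits both an Expires header (HTTP 1.0) and a matching
# Cache-Control: max-age (HTTP 1.1), so proxies that only understand
# Expires can still cache the response.
ExpiresActive On
ExpiresByType application/x-javascript "access plus 1 week"
ExpiresByType text/css "access plus 1 week"

# Optionally mark these responses as publicly cacheable as well
<FilesMatch "\.(js|css)$">
  Header append Cache-Control "public"
</FilesMatch>
```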
Both of these are probably topics for a follow-up article.