Nov 14, 2009

Client Side Caching: proxy servers and forced reload

In the previous blog post we talked about caching basics. Now let's review how proxies actually work and how we can force a cache reload.

Cache reload

The main issue with far-future Expires headers is that the browser doesn't re-request the resource but takes it from its local cache. So any changes you make to your website won't be visible to 'old' users (those with cached styles and scripts; HTML documents usually aren't cached so aggressively).

So what can we do about this? How can we tell browsers to re-request such resources?

Main cache reload patterns

There are two main patterns to force browsers (user agents) to request the current asset once more.

  • Add any GET parameter to the file name (which should indicate the new state of this asset). For example
    styles.css -> styles.css?20091114
  • Change the "physical" file name. For example
    styles.css -> styles.v20091114.css

Both approaches change the URL of the asset and force the browser to re-request it.
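The two patterns above can be sketched in a few lines. This is an illustrative helper, not Web Optimizer's actual code; it stamps the asset's modification time into the URL either as a GET parameter or as part of the file name:

```python
import os

def versioned_url(path, use_query_string=False):
    """Build a cache-busting URL for a static asset from its mtime.

    Illustrative sketch: a real site would generate these URLs in its
    templating layer rather than from the raw file path."""
    mtime = int(os.path.getmtime(path))
    if use_query_string:
        # Pattern 1: append the version as a GET parameter.
        return "%s?%d" % (path, mtime)
    # Pattern 2: embed the version into the file name itself.
    name, ext = os.path.splitext(path)
    return "%s.v%d%s" % (name, mtime, ext)
```

Because the stamp is derived from the file's mtime, the URL changes automatically whenever the file is modified, so no manual version bump is needed.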

Cache reload and proxy servers

As you can see, the first approach is simpler than the second, but there are a few possible issues with it. First of all, some proxy servers don't cache URLs with GET parameters (i.e. our styles.css with a query string may not be cached at all). So if you have a lot of visitors from a network behind one firewall, the asset will be served to each visitor separately, without any caching by the proxy server. This slows down overall website speed, and sometimes this can be critical.

But how can we use a new file name without actually changing anything on the file system? Is there any way to do this with only a change in the HTML code? Yes!

Apache rewrite rules

The Apache web server has a powerful tool to perform 'hidden' redirects to local files (these are called 'internal redirects'). We can handle the second approach with just one predefined rule for all files (in our case the version is a set of digits after .v):

RewriteEngine On
RewriteRule ^(.*)\.v[0-9]+\.css$ $1.css

So all such URLs will be redirected to their physical equivalents, but you can change the .v part of the URL at any time — and browsers will request the asset once more.
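To see what the rule does, here is the same regular expression applied in Python. This only emulates the mapping for illustration; in production Apache performs it internally on every request:

```python
import re

# Same pattern as the RewriteRule above: the ".v<digits>" token is
# stripped, so every versioned URL maps back to one physical file.
REWRITE = re.compile(r"^(.*)\.v[0-9]+\.css$")

def physical_path(url):
    """Return the physical file a versioned URL resolves to."""
    m = REWRITE.match(url)
    return m.group(1) + ".css" if m else url
```

For example, `physical_path("/cache/styles.v20091114.css")` yields `/cache/styles.css`, while an unversioned URL passes through unchanged.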

Automated cache reload

There are several ways to automate the cache reload process for all changed files. Since Web Optimizer combines all resources into one file, it needs to re-check the mtime (modification time) of every source file and re-combine all the resources.

The issues with re-checking all combined files were already described last month, so it's generally not a good idea to check them all on every page visit. Instead, we can cache the result of all previous checks in one combined file and check only its mtime. This is what happens by default: we check the modification time of just one file (the combined CSS or JS one) and add it either as a GET parameter or as part of the file name.
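The split between the expensive full check and the cheap per-request check can be sketched as follows. This is a simplified stand-in (plain concatenation instead of real combining/minification, hypothetical function names), not Web Optimizer's implementation:

```python
import os

def rebuild_if_stale(combined_path, source_paths):
    """Run off the hot path (e.g. on deploy or a periodic check):
    re-combine when any source file is newer than the combined one.
    Plain concatenation stands in for the real combining step."""
    stale = not os.path.exists(combined_path) or any(
        os.path.getmtime(src) > os.path.getmtime(combined_path)
        for src in source_paths
    )
    if stale:
        with open(combined_path, "wb") as out:
            for src in source_paths:
                with open(src, "rb") as f:
                    out.write(f.read())

def cache_version(combined_path):
    """Per-request cost is a single stat() on the combined file."""
    return int(os.path.getmtime(combined_path))
```

The point is that every page visit pays for one stat() call in `cache_version`, while the many-file mtime scan in `rebuild_if_stale` runs only occasionally.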

This is applied to all such files (those that should be cached on the client side) and results in two kinds of timestamped URLs for a combined file such as /cache/website.css: in one the timestamp goes as a GET parameter, in the other as part of the file name (which the Apache mod_rewrite rule above transforms back to /cache/website.css).

Overall schema

So what is the overall caching algorithm for the website?

  1. Check if we have the combined file. If not, create it.
  2. Check the mtime of the combined file and, if required, add it to the URL (using one of the ways described above).
  3. The browser receives HTML code with the URL of the combined file.
  4. The browser checks whether it has this URL cached. If yes, we are finished here.
  5. If not, the browser requests the file (which is already prepared on the server or cached on a proxy).
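The server side of this schema (steps 1 and 2) can be condensed into one helper. Names and the empty-file stub are purely illustrative; a real step 1 would run the combining logic described earlier:

```python
import os

def asset_url(combined_path, public_url):
    """Steps 1-2 of the schema: ensure the combined file exists, then
    stamp its mtime into the public URL (file-name variant, so the
    mod_rewrite rule shown earlier maps it back to the physical file)."""
    if not os.path.exists(combined_path):
        # Step 1 (stub): a real build would combine the sources here.
        open(combined_path, "wb").close()
    # Step 2: embed the mtime into the URL.
    name, ext = os.path.splitext(public_url)
    return "%s.v%d%s" % (name, int(os.path.getmtime(combined_path)), ext)
```

Steps 3-5 then happen entirely on the client: the browser either finds the stamped URL in its cache or fetches it once and caches it until the stamp changes.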

