Caching
With Ruby and Rails we often want to cache static content, so that as few requests as possible have to go through the Rails stack. Nginx in front is a great tool for this, and we can use its abilities to add caching easily.
App
Let's imagine we have a simple Sinatra app. For the purpose of this post it will have one route, /ohhai, that shows the current time. This is a great way to check that our caching is working fine. The code of the app is really simple:
Also, to make it easy to start, I have created a config.ru rackup file describing how to start the app (the repo also contains a start.sh script ;)
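A sketch of what that rackup file might look like; `OhHaiApp` and `app.rb` are my own placeholder names, and 6677 is the port used in this post:

```ruby
# config.ru -- rackup file (sketch)
# start with: rackup config.ru -p 6677
require './app'

run OhHaiApp
```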
If you are using the code from my repo https://github.com/JakubOboza/003-nginx-cache-stale-example, all you need to do to start it is:
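Something along these lines (the exact steps may differ; the repo ships a start.sh script):

```shell
git clone https://github.com/JakubOboza/003-nginx-cache-stale-example
cd 003-nginx-cache-stale-example
./start.sh
```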
I configured the app with my own path/URI and to look for the upstream server on port 6677, so you need to change these if you are using different settings.
Caching
Our app is running now. Let's add caching; for this we will need an Nginx frontend config. In most cases I create a single Nginx config per server in the sites-available directory and symlink it into sites-enabled (like Apache does by default). I like this setup: it makes it much easier to maintain more than one site, which is common both in development environments and on shared application servers.
Nginx config file
I will show the complete Nginx config file for this example and then explain each bit one by one.
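The original file is gone, so here is a sketch of what it looked like. The zone name (app_cache), cache path, upstream name, and root path are placeholders of this sketch; the port (6677), server name (example_stale.local), and expiry times (1 minute for 200/302, 1 hour for 404) are the ones used in this post:

```nginx
# /etc/nginx/sites-available/example_stale -- sketch, not the original file

upstream example_stale_app {
  # the Sinatra app from this post listens on port 6677
  server 127.0.0.1:6677;
}

# where cached responses live on disk, plus a named shared-memory zone
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=app_cache:10m max_size=100m;

server {
  listen 80;
  server_name example_stale.local;
  root /var/www/example_stale;   # hard-coded path, change it for your machine

  location / {
    proxy_pass http://example_stale_app;
    proxy_set_header Host $host;

    proxy_cache app_cache;           # use the zone defined above
    proxy_cache_valid 200 302 1m;    # a new time every minute
    proxy_cache_valid 404      1h;   # refresh 404s every hour
    proxy_cache_use_stale error timeout http_500 http_502 http_503 http_504;
  }
}
```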
It isn’t even that long ;)
upstream
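The upstream block would look something like this (the name is my placeholder; the port is the one from the post):

```nginx
upstream example_stale_app {
  server 127.0.0.1:6677;
  # server unix:/tmp/example_stale.sock;   # unix socket variant
}
```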
In this part we describe our upstream, in other words where our application server will be listening. It is easiest to just use a port, but you can configure it to use a unix socket if you want to gain a bit of performance.
global cache config
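Something like the following (the cache path and zone name are placeholders of this sketch):

```nginx
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=app_cache:10m max_size=100m;
```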
This directive sets the place where the cache is stored, names the zone, and sets how big it can be. We will refer to this zone later on in the proxy pass cache config.
app cache config
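A sketch of that server block; the root path and upstream/zone names are my placeholders, the rest follows the settings described in this post:

```nginx
server {
  listen 80;
  server_name example_stale.local;
  root /var/www/example_stale;   # change this for your machine

  location / {
    proxy_pass http://example_stale_app;
    proxy_cache app_cache;
    proxy_cache_valid 200 302 1m;
    proxy_cache_valid 404      1h;
    proxy_cache_use_stale error timeout http_500 http_502 http_503 http_504;
  }
}
```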
Here we configure the server name for our application, the port it listens on, and the root directory (this is hard-coded for my Mac, so you should change it). The whole magic happens in the location block; it contains a few things that are important for us.
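The directive in question (app_cache is a placeholder zone name):

```nginx
proxy_cache app_cache;
```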
This sets which cache zone we will use: the same cache zone we defined in proxy_cache_path with keys_zone.
Next we set how long responses should be cached. I used 1 minute for the 200 and 302 statuses; this lets us see on our example app how it works, since each minute we see a new time :). This is awesome! You can also set a different expiry time for other statuses; here we refresh the 404 cache every hour (it could be days :) ).
Last but not least is serving stale content if the upstream is dead.
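The directive looks something like this (the exact list of conditions is a guess; error and timeout are the important ones for a dead upstream):

```nginx
proxy_cache_use_stale error timeout http_500 http_502 http_503 http_504;
```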
This enables serving stale content when the upstream is dead. It is nice if you want to keep providing some content even when your backend is down.
Test
You can test it now. Or wait… with my config you first have to add an entry to /etc/hosts:
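The entry maps the server name from the Nginx config to localhost:

```
127.0.0.1 example_stale.local
```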
Now you can go to example_stale.local/ohhai (or just curl example_stale.local/ohhai) and see how our cache works. Even better, you can now kill your app server and still see the cached content being served correctly.
Results
First request / Next requests (screenshots)
–> http://www.youtube.com/watch?v=lgoXUzIwXk0
Cheers
How can you use it? That depends on your app architecture, but for every bit of content you create that is “static”, it is a great thing to have. I like this feature of Nginx and I hope this post will help you ;).