• Category Archives: caching

Ruby on Rails and Dragonfly: tracking processed elements with a two-layer cache – Memcached and ActiveRecord underneath

Dragonfly is a great on-the-fly processing gem, especially for apps that change rapidly and need files processed by multiple processors (for example, when you need to generate many thumbnails for the same attached image). Since it is an on-the-fly processor, it has some downsides, and the biggest one, in my opinion, is keeping track of […]

Read more at the source

Setting Akamai Edge-Control headers with Ruby on Rails

Just a short and sweet little tip.

Several months ago we moved one of our clients over to Akamai’s Content Delivery Network (CDN). We were previously using a combination of Amazon S3 and CloudFront with some benefits, but we were finding that several key areas of the world were not well covered by Amazon (yet) for asset delivery. Along with that, we really wanted to take advantage of the CDN for more of our HTML content, with a lot of complex rules related to geo-targeting and regionalization of content.

I’ll try to cover those topics in another post, but wanted to share a few tidbits of code that we are using to manage Akamai’s Edge-control caches from within our Rails application.

With Akamai, we’re able to tell their Edge servers whether they should hold on to the response, so they can avoid an extra request to the origin (aka our Rails application). From Rails, we just added a few helper methods to our controllers so that we can litter our application with various expiration times.

  # Sets the Edge-control header for Akamai
  # acceptable formats include:
  #   1m, 10m, 90m, 2h, 5d
  def set_cache_control_for(maxage="20m")
    headers['Edge-control'] = "!no-store, max-age=#{maxage}"
  end
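The duration strings above follow a simple number-plus-unit pattern, so it is easy to guard against typos before the value reaches the header. A minimal plain-Ruby sketch of that idea (the `edge_control_value` helper and its validation regex are mine, not part of the original app or Akamai’s API) might look like:

```ruby
# Builds the Edge-control header value, rejecting duration strings
# that don't match the number+unit formats listed above
# (e.g. "10m", "2h", "5d"). Plain Ruby, no Rails required.
DURATION_FORMAT = /\A\d+[mhd]\z/

def edge_control_value(maxage = "20m")
  unless maxage =~ DURATION_FORMAT
    raise ArgumentError, "bad max-age format: #{maxage.inspect}"
  end
  "!no-store, max-age=#{maxage}"
end

puts edge_control_value("4h")  # => !no-store, max-age=4h
```

Inside a controller, the helper would simply assign this value to `headers['Edge-control']` as shown above.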

This allows us to do things like:

  class ProductsController < ApplicationController
    def show
      @product = Product.find(params[:id])
      set_cache_control_for("4h")
    end
  end

Then when Akamai gets a request for http://domain.com/products/20-foo-bar, it’ll try to keep a cached copy around for four hours before it hits our server again.

Read more at the source