CMS PageSpeed Optimization Integration

Web performance vs Content Management Systems

Content management systems are not known for their efficiency from a web performance perspective - especially the slightly older ones. From the vendor's perspective, it makes sense to build a very generic, one-size-fits-all solution that caters to the needs of a broad range of customers.

A downside of the one-size-fits-all approach is that the HTML, scripts, images, and CSS generated by such systems are often far from lean and mean. For example, some systems will serve any image that you upload for display on a page without first optimizing it for the web.
Another case is that of a marketer who enters blocking JavaScript into a content field for A/B split testing purposes, thereby unknowingly introducing a slowdown that might hurt overall conversion rates.

In both examples, the users of the CMS cannot be blamed for the degraded user experience on the website.

PageSpeed to the rescue

These inefficiencies are where Google's PageSpeed comes to the rescue, as it is able to automatically rewrite web server output for fast delivery. Images are automatically resized and (thoroughly) optimized for both desktop and mobile, and JavaScript is rewritten so it does not block the browser's network and rendering tasks.
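As a simplified illustration of what that rewriting looks like (the file name is hypothetical, and the exact URL format depends on the enabled filters and module version), an image reference in the generated HTML such as

    <img src="/media/hero-banner.jpg">

may end up being served as an optimized, cache-extended variant along the lines of

    <img src="/media/hero-banner.jpg.pagespeed.ic.AbCdEf1234.jpg">

while the original URL keeps working for clients that request it directly.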

Controlling PageSpeed optimizations from the CMS

Here is a lightweight way to integrate PageSpeed into your CMS: you can control which optimizations PageSpeed applies from your server-side code by sending out a response header named "PageSpeedFilters".

The trick is to let the CMS generate that header and populate it with the filters a user has specified for a page.
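For example, assuming the header takes the same comma-separated filter syntax as PageSpeed's query-string options (+ to enable, - to disable a filter), a page that needs image lazy-loading but must not have its JavaScript deferred could be served with response headers like:

    HTTP/1.1 200 OK
    Content-Type: text/html; charset=utf-8
    PageSpeedFilters: +lazyload_images,-defer_javascript

lazyload_images and defer_javascript are existing PageSpeed filter names; which filters actually make sense is, of course, page-specific.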

This is particularly useful after the initial tuning phase, in case a CMS user needs to enable or disable filters for one or more pages.

Approaching an integration - carefully

At We-Amp, we recently added PageSpeed optimization into the mix for an agency that runs lots of (e-commerce) websites through a content management system.

Steps taken in our approach:

  1. Install the PageSpeed module for the relevant server technology, IIS in our case.
  2. Set the base optimization level of the PageSpeed module to 'PassThrough', which means HTML is only parsed and reserialized, without any rewriting (see the configuration sketch after this list). Let this run for a while, so that a significant amount of live traffic passes through. In this phase, keep an eye on conversion rates and event logs.
  3. If all is OK, it is time to flip the switch and set the optimization level to something a little more aggressive: optimizing for bandwidth. This optimization level is designed to be as safe as possible, so PageSpeed will be very conservative in its rewriting.
  4. Monitor the server load, conversion rates, and PageSpeed statistics for a while to make sure all is well. If the server load is spiking, it should usually settle down quickly as PageSpeed learns how best to optimize the website. If you are tight on CPU, you can throttle by:
    • Limiting the number of concurrent image optimizations
    • Randomly dropping optimizing rewrites (eventually everything will be optimized anyway)
  5. Repeat steps 3 and 4 for all websites involved.
  6. At this point, it is time to start tuning websites by enabling more aggressive optimizations in a controlled way. To do so effectively, measurement tooling should be in place. A/B split test experiments are a very good way to make sure you are on the right track when changing configuration, and PageSpeed's built-in experiment framework facilitates exactly that.
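The sketch below ties these steps to concrete configuration, using Apache-style mod_pagespeed directive names purely as an illustration; the IIS module's configuration syntax may differ, and the directive names and values should be verified against the documentation for the module version you run.

    # Step 2: pass-through mode - HTML is parsed and reserialized, nothing is rewritten yet
    ModPagespeed on
    ModPagespeedRewriteLevel PassThrough

    # Step 3: later, flip to the conservative, bandwidth-oriented rewrite level
    # ModPagespeedRewriteLevel OptimizeForBandwidth

    # Step 4: throttle optimization work when CPU is tight
    ModPagespeedImageMaxRewritesAtOnce 2
    # Randomly skip a share of rewrites; unrewritten resources are simply served as-is until a later request
    ModPagespeedRewriteRandomDropPercentage 50

    # Step 6: enable extra filters in a controlled way, and A/B test them with the experiment framework
    # (the Analytics ID is a placeholder; experiments report their results through Google Analytics)
    ModPagespeedEnableFilters lazyload_images,defer_javascript
    ModPagespeedRunExperiment on
    ModPagespeedAnalyticsID UA-XXXXXXXX-1
    ModPagespeedExperimentSpec "id=1;percent=45;default"
    ModPagespeedExperimentSpec "id=2;percent=45;enabled=defer_javascript"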

To figure out which optimizations make the most sense to enable, testing your pages with PageSpeed Insights is a very good place to start. It will give you suggestions about which optimizations would be good to enable in the PageSpeed module, and help you comply with best practices.
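As a rough, non-exhaustive sketch of how such suggestions translate to module filters (audit names and filter choices should be double-checked against the current Insights and PageSpeed documentation):

    Defer offscreen images              ->  lazyload_images
    Eliminate render-blocking resources ->  defer_javascript, inline_css
    Efficiently encode images           ->  recompress_images
    Minify CSS / JavaScript             ->  rewrite_css, rewrite_javascript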