
Easy Fixes to Reduce Page Weight

30 January, 2014

Total page weight increased by 32% in 2013 to reach a ludicrous 1.7MB and 96 individual HTTP requests. That’s an average figure; half of all sites will be larger. Website obesity has become an epidemic and we web developers are to blame. There are no excuses.

An overweight site will adversely affect your bottom line:

  1. The larger the download, the slower the experience. Not everyone has a 20Mbps connection, and this is especially true in developed western countries with aging copper infrastructures. It doesn’t matter how good your site is: users will not wait.
  2. Mobile web access has increased rapidly to reach almost one in four users. On a typical 3G connection, a 1.7MB page will take almost a minute to appear. Is there any point adopting Responsive Web Design techniques when your site won’t work effectively on those devices?
  3. Google’s page speed algorithms will downgrade your site and harm Search Engine Optimization efforts.
  4. The more code you have, the longer it takes to update and maintain.
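
The 3G figure in point 2 is easy to check with back-of-envelope arithmetic. A quick sketch, assuming a sustained throughput of roughly 250kbit/s (an assumption; real 3G rates vary widely) and ignoring latency:

```shell
# time to transfer a 1.7MB (1,700KB) page at an assumed 250kbit/s 3G throughput
PAGE_KB=1700
LINK_KBPS=250
echo "$(( PAGE_KB * 8 / LINK_KBPS )) seconds"
```

which prints “54 seconds” before any connection overhead — almost a minute, as claimed.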

I predicted page weight would drop this year — and I hope not to be proved wrong. Fortunately, there are a number of quick fixes which will have an instant effect on site performance. All these techniques are well known, use today’s technologies, do not take considerable time, and can be implemented on an existing codebase without the need for redevelopment.

The first three don’t actually slim your website, but put it in a corset and flattering clothing…

1. Activate GZIP compression

According to W3Techs.com, almost half of all websites do not enable compression. This is normally a server setting which should be enabled by your web host, although it may be possible to configure it yourself.
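
If your server is Apache with mod_deflate available (an assumption — nginx, IIS and other servers have equivalent settings), a minimal sketch looks like this:

```apache
<IfModule mod_deflate.c>
# compress text-based responses before they leave the server
AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

Images and other already-compressed files gain little from GZIP, so limit it to text formats.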

2. Encourage browser caching

If the browser can easily cache a file, it won’t necessarily need to download it again. Simple solutions include setting an appropriate Expires header, a Last-Modified date, or adopting ETags in the HTTP header.

You may be able to configure your server to handle this automatically, e.g. here is an Apache .htaccess setting to cache all images for one month:

<IfModule mod_expires.c>
ExpiresActive On
<FilesMatch "\.(jpg|jpeg|png|gif|svg)$">
ExpiresDefault "access plus 1 month"
</FilesMatch>
</IfModule>

3. Use a Content Delivery Network (CDN)

Browsers set a limit of between four and eight simultaneous HTTP requests per domain. If your page has 96 assets loaded from your domain, at best it will take twelve sets of concurrent requests before all appear. (In reality, file sizes differ so it doesn’t happen exactly like that, but the limitation still applies.)

Requesting static files from another domain effectively doubles the number of HTTP requests your browser can make. In addition, the user is more likely to have that file pre-cached because it’s been used on another site elsewhere. Easy options are JavaScript libraries such as jQuery and font repositories, but you could also consider dedicated image hosting.
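
For example, a page could reference jQuery from its public CDN rather than a local copy (the version shown was current at the time of writing):

```html
<script src="https://code.jquery.com/jquery-1.11.0.min.js"></script>
```

Anyone who has visited another site using the same URL will already have the file cached.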

These first three options help improve page speed but we’ll need to examine your code before we can actively reduce page weight…

4. Remove unused assets

Websites evolve. If you’re no longer using a widget, you can remove the associated CSS and JavaScript. If they’re contained in separate files that’s a simple job. If not, you may need to use tools such as Chrome’s Audit Developer Tool, JSLint, Dust-Me Selectors, CSS Usage, unused-css.com or build tools such as grunt-uncss.

5. Concatenate and minify CSS

Ideally, you require one CSS file (although a couple may be necessary if you’re using RWD to support old versions of IE). While it may be sensible to build and maintain separate CSS files, you should join them and remove unnecessary whitespace prior to hosting on your production server.

Pre-processors such as Sass, LESS and Stylus can do the hard work for you. Build tools including Grunt.js or Gulp can automate your workflow or, if you’d prefer a GUI, Koala provides a free cross-platform application.

If that sounds like too much effort, you can manually concatenate files in your text editor or from the command line, e.g. in Windows:

copy file1.css+file2.css file.css

or Mac/Linux:

cat file1.css file2.css > file.css

The resulting file can be run through an online CSS minifier such as cssminifier.com, CSS Compressor & Minifier or CSS Compressor.
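
If you’d rather stay on the command line, a very crude minification can be scripted too. A rough sketch with hypothetical file names — sed works line by line, so multi-line comments and comment markers inside strings are not handled; a real minifier does much more:

```shell
# demo input files (names hypothetical)
printf 'body { color: red; }\n' > file1.css
printf '/* build note */ p  { margin: 0; }\n' > file2.css

# concatenate, strip /* ... */ comments, then collapse runs of whitespace
cat file1.css file2.css > file.css
sed 's|/\*[^*]*\*/||g' file.css | tr -s ' \t\n' ' ' > file.min.css
```

For production work, prefer one of the dedicated minifiers above.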

Finally, remember to load all CSS in the head so the browser knows how to style the HTML that follows and doesn’t need to redraw the page again.

6. Concatenate and minify JavaScript

The average page loads 18 individual script files. While it’s practical to keep libraries such as jQuery as separate files, your own JavaScript code should be concatenated and minified on your production server. Again, build tools can help or you can use online tools such as the YUI Compressor, Closure Compiler or, my personal favorite, The JavaScript CompressorRater, which passes your code to multiple engines so you can choose the best.

Admittedly, you need to be slightly more careful since a JavaScript compressor can fail if you have bad code — even a missing semi-colon. However, simply concatenating the files will provide a performance boost because you’re making fewer HTTP requests.

Finally, it’s best to load JavaScript just before the closing HTML body tag. This ensures the scripts don’t block loading of other content, and page content is readable before scripts are downloaded and executed.
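
Put together with the CSS advice above, a typical page skeleton looks like this (file names hypothetical):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- styles load first so the page renders correctly as it arrives -->
  <link rel="stylesheet" href="styles.min.css">
</head>
<body>
  <p>Page content appears before any script downloads.</p>
  <!-- scripts load last so they cannot block rendering -->
  <script src="scripts.min.js"></script>
</body>
</html>
```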

7. Use the correct image format

Using the wrong image format can bulk up your pages. In general:

  1. use JPG for photographs
  2. use PNG for everything else.

GIF may compress better when you have small graphics with limited color sets — although it’s rare. Some images are also more appropriate as vectors, but we’ll discuss that in a later article.

You’ll need a decent graphics package to convert images but there are plenty of free options available and some, such as XnView, allow you to batch process files. Remember to play with the settings:

  • JPG is a lossy format with a quality scale from 0 (poor, smaller file) to 100 (best, larger file). The majority of images will look fine somewhere between 30 and 70, but experiment to find the lowest acceptable value.
  • PNG is available in 256-color and 24-bit color varieties. If you don’t need transparency and can limit the color palette, the 256-color version may compress better.

8. Resize large images

An entry-level smartphone with a 3 megapixel camera will produce an image that is too large to display on a web page. Unfortunately, content editors will upload images directly from their camera, so a little education and an automated resizing system are recommended.

Image dimensions should never exceed the maximum size of their container. If your template has a maximum space of 800 horizontal pixels, the image will not need a greater width. That said, those using high-density/Retina displays may appreciate a double 1,600 pixel width image, but that’s still smaller than typical camera output.

Resizing images has a significant effect on page weight. Shrinking the image dimensions by 50% reduces the total area by 75% and should considerably reduce the file size.
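
The arithmetic behind that claim is worth spelling out (dimensions hypothetical):

```shell
# halving both dimensions leaves only a quarter of the pixels
WIDTH=1600; HEIGHT=1200
BEFORE=$(( WIDTH * HEIGHT ))
AFTER=$(( (WIDTH / 2) * (HEIGHT / 2) ))
echo "$(( (BEFORE - AFTER) * 100 / BEFORE ))% fewer pixels"
```

which prints “75% fewer pixels” — and file size falls roughly in proportion to pixel count.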

9. Compress images further

Even if you’ve switched to the correct format and resized the dimensions, it’s possible to shrink image files further using tools that analyze and optimize the graphic. These include OptiPNG, PNGOUT, jpegtran and jpegoptim. Most can be installed as standalone executables or integrated into your build process. Alternatively, online tools such as Smush.it can do the work in the cloud.

10. Remove unnecessary fonts

Web fonts have revolutionized design and reduced the need for graphic-based text. However, custom fonts have a cost and may add several hundred kilobytes to your page. If you’re using more than two or three fonts, you’re possibly overdoing it. Your client/boss may love awful handwriting typefaces but, if it’s only used for one title, is it worth downloading a 200KB font file?
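
If you use a hosted service such as Google Fonts, request only the families and weights you actually use — every extra weight is a further download. For example (family and weights here are illustrative):

```html
<link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Open+Sans:400,700">
```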

I suspect many sites can reduce their weight by 30–50% with a few hours of effort from a non-developer. For the average site, that’s a saving of more than 800KB and it’ll become noticeably faster. In my next article we’ll discuss more complex optimizations which involve rewriting code.

Author acknowledgement and thanks

Craig Buckler

Contributing Editor

Craig is a Director of OptimalWorks Ltd, a UK consultancy dedicated to building award-winning websites implementing standards, accessibility, SEO, and best-practice techniques.
