Updated on Mar 15, 2018
Optimizing images - this one is pretty easy to understand: you reduce the file size of your images with little to no loss of quality, and smaller files mean lower bandwidth. However, if you have a lot of images, you might still need a better solution, which comes in the next point.
Activating the Cloudflare CDN via cPanel for your entire domain, or moving the images to a subdomain which has Cloudflare active, will reduce your bandwidth substantially. With Cloudflare's edge nodes serving most of your website's content to your visitors, your hosting account will be offloaded and will generate far less bandwidth than before. Since Cloudflare does not charge for bandwidth, you benefit from both the fast loading speed and caching of the edge nodes and the lack of a price tag. Alternatively, if your domain is managed elsewhere, you can activate Cloudflare via their website.
Offload your downloadable files to a third-party storage solution and only link them from download buttons on your website. Amazon S3 is a great service which can provide your users with direct downloads from your website without any additional redirects. Downloads can also be restricted in case you want to provide purchasable content on demand, so only customers who have paid can download the files.
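As a minimal sketch of what such a download link looks like, the helper below builds the standard virtual-hosted-style URL for an object stored in S3. The bucket name, object key, and region are placeholders - substitute your own:

```python
# Sketch: building a direct download link for a file stored in Amazon S3.
# The bucket name ("example-downloads") and key ("files/manual.pdf") are
# placeholders - replace them with your own bucket and object key.

def s3_download_url(bucket: str, key: str, region: str = "us-east-1") -> str:
    """Return the public virtual-hosted-style URL for an S3 object."""
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

url = s3_download_url("example-downloads", "files/manual.pdf")
print(url)
```

You would point your download button at a URL like this. For the restricted, paid-content scenario mentioned above, S3 also supports time-limited presigned URLs (generated server-side, for example with the AWS SDK), so the plain public URL is only suitable for freely downloadable files.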
Offload your videos to a video-sharing platform and then embed them on your website. Video-sharing platforms like YouTube and Vimeo are perfect for offloading video bandwidth. Just create an account and start uploading. Afterward, you can easily embed your videos using the embed codes provided by the platform.
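The embed code YouTube's Share dialog gives you is a small iframe snippet keyed on the video ID. As a sketch, the helper below generates one for a given ID ("dQw4w9WgXcQ" is only a placeholder - use the ID of your own upload):

```python
# Sketch: generating a YouTube iframe embed snippet for a given video ID.
# The ID "dQw4w9WgXcQ" is a placeholder - use your own video's ID.

def youtube_embed(video_id: str, width: int = 560, height: int = 315) -> str:
    """Return an iframe embed code like the one from YouTube's Share dialog."""
    return (
        f'<iframe width="{width}" height="{height}" '
        f'src="https://www.youtube.com/embed/{video_id}" '
        f'frameborder="0" allowfullscreen></iframe>'
    )

print(youtube_embed("dQw4w9WgXcQ"))
```

Pasting the resulting iframe into your page serves the video player and the video stream itself from YouTube's servers, so none of that traffic counts against your hosting bandwidth.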
Offload your images to an external media service provider. This will drastically decrease the bandwidth consumption of your website, as the images will be loaded from the external provider instead of directly from your domain. Both Imgur and Flickr are great choices for this.
Combining images into a single sprite sheet instead of serving them one by one. Optimizing your graphical elements this way is a must on call-to-action websites with a lot of clickable buttons and icons, since one combined image means a single HTTP request with less overhead than many small ones.
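As a rough sketch of the sprite-sheet idea, the snippet below uses the Pillow imaging library (a third-party package, `pip install Pillow`) to lay several small icons side by side in one image; the solid-color squares stand in for real icon files. On the page, CSS background-position rules would then crop the combined image back into individual icons:

```python
# Sketch: combining several small icons into one sprite sheet with Pillow.
# The solid-color squares are stand-ins for real 32x32 icon files.
from PIL import Image

icon_size = 32
icons = [Image.new("RGB", (icon_size, icon_size), color)
         for color in ("red", "green", "blue")]

# Lay the icons out horizontally in a single image.
sprite = Image.new("RGB", (icon_size * len(icons), icon_size))
for i, icon in enumerate(icons):
    sprite.paste(icon, (i * icon_size, 0))

sprite.save("sprite.png")  # one HTTP request instead of three
```

The file names and layout here are illustrative only; in practice a build tool usually generates both the sprite sheet and the matching CSS for you.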
Additional ways of lowering bandwidth usage - improving the performance of your website.
If you have already reduced the bandwidth consumption by activating a CDN service and offloading the media content to an external provider, but you are still not satisfied with the results, you may want to try the following methods:
Lower home page size. Since this is the main page your visitors see when they access your website, it is recommended to keep it as lean as possible. For example, if you have a store website, you might want to avoid adding a lot of products to the home page and instead split them into separate categories on different pages. Once this is done, you might consider reducing the size of your internal pages as well, as this will additionally decrease the bandwidth consumption of your domain. You can check the intro of this tutorial section, where we have provided a simple comparison of the monthly bandwidth generated by two websites with the very same amount of visitors, the only difference being the size of their pages.
Limit search engine bots with a robots.txt file. If you have determined that most of the bandwidth consumption comes from aggressive online crawlers/bots, you might consider blocking them from your website using the robots.txt file. Whenever a robot visits your website, it will first check the rules you have set in the robots.txt file and will proceed only if you allow it to access your website's pages. Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but truly abusive bots may ignore it. By default, this file is located inside the web root directory of your domain name, but if you cannot find it, you can simply create it yourself.
If you would like to block all crawlers/bots from your website, you will have to add the following lines to the file in question:
User-agent: *
Disallow: /
The first line means that the rule applies to all robots, while the second line disallows access to all of the pages on your website.
Alternatively, if you would like to block a single robot, you should use the following rules instead:
User-agent: BadBot
Disallow: /
where you will have to replace "BadBot" with the name of the robot you would like to block from your website.
There are also rules you can apply in your robots.txt file if you would like to allow only a single robot and block the rest. For example, if you wish to allow access only to Google's crawler (whose user-agent name is Googlebot), you will have to use the following lines:
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
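Before deploying rules like these, you can verify them locally with Python's standard-library robots.txt parser; the rules string and example URL below are placeholders matching the example above:

```python
# Sketch: testing robots.txt rules locally with Python's standard library.
# The rules below allow only Googlebot and block every other crawler.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot matches the first group (empty Disallow = allow everything);
# any other agent falls through to the "*" group and is blocked.
print(parser.can_fetch("Googlebot", "https://example.com/page.html"))  # True
print(parser.can_fetch("BadBot", "https://example.com/page.html"))     # False
```

This is a quick sanity check that the group ordering and the empty Disallow line behave as intended before you upload the file to your web root.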
To summarize the information above, you should always look to keep the size of your pages as small as possible and to disallow access for aggressive robots. This way, you will not only reduce the bandwidth consumption but also increase the overall performance of your website.
All of these solutions are easy to follow and require little work while yielding excellent results. Combining some or all of them, depending on your website, may reduce your bandwidth tremendously.