I wanted to start off this post by letting you know that this is NOT just another generic “How To Speed Up WordPress” article.
I’m not going to regurgitate anything that can already be found on the web. I’m not going to tell you that you should install a caching plugin, enable compression, minify your CSS/JS, and so on.
After all, you should know how to do these things already. And if you don’t, you can find this generic information on hundreds of other blogs.
Instead, this article contains everything custom/semi-custom that I’ve implemented in the past month or so to speed up my own WordPress blog and basically prevent rogue accesses from bringing down my server.
And the reason it’s “little known” is because the techniques I’m about to describe will be very specific to your own blog depending on the traffic patterns you are seeing.
Note: If you have a slow blog and you really don’t want to deal with any of the technical aspects of speeding up a website, then you should probably sign up for a service like WP Engine.
These guys specialize in WordPress hosting and will make sure your blog runs as fast as possible. But naturally, this comes at a price. You should check them out if this post goes over your head. :)
Anyways, before I can explain to you how and why I’ve done what I’ve done to my blog, you have to understand a few fundamental concepts about WordPress and caching which I’ll describe below.
Some Interesting WordPress Facts
Let’s say you’ve followed all of the guidelines on how to speed up WordPress already. Your blog feels zippy. Webpagetest.org tells you your blog is fast as hell. Everything is all good right? Not necessarily.
I used to feel the exact same way about my blog. After all, I follow most of the standard speedup protocols. I run very few plugins and my blog feels pretty fast under normal usage from the perspective of a human reader (The ad networks are what slow my blog down so I load ads last).
But then I started analyzing my CPU usage graphs and often noticed periods of high server load despite low to moderate traffic levels. Occasionally, my server would even become unusable or unresponsive for extended periods.
Note, the only reason why I started paying attention to these stats was because a while ago I was running my online store on the same server as my blog. And periodically I would have customers complain that the store was extremely slow. When I finally did some digging, I found that my optimized, fully cached WordPress blog was bringing my server to its knees!
Moral of the story? Just because a speed test tells you your blog is fast doesn’t necessarily mean everything is all good.
Here are some fun facts about WordPress and caching plugins like WP Super Cache and W3 Total Cache that you should be aware of.
- WordPress 404 responses are slow. Anytime your blog receives an access to a page that does not exist, your poor server has to load up WordPress, run a bunch of PHP code, do a bunch of MySQL queries and then spit out your custom WordPress 404 page. This is a very resource intensive task, and caching isn’t going to help.
- Your caching plugin does not work very well when there are GET parameters in the URL. For example, I used to notice that my blog would slow to a crawl whenever I sent a blast out to my email list. In theory, with static file caching, my server should be near invincible.
But because Aweber inserts tracking parameters in the URL, none of those requests hit the super cached files. As a result, WordPress has to generate a brand new cache file (even though a cached copy of the page already exists), zip it up and send it out each time. The worst part is that these cache files are only ever used once, which makes them a waste of server resources. (I’ll come back to a workaround for bot traffic later in this post.)
- Rogue accesses are slow. Because rogue accesses load up WordPress by default, a bad bot or crawler that decides to spam your site with bad requests can take down your blog pretty easily.
- Your caching plugin might have a bug or an incompatibility with the way you’ve set up your blog. For example, for 3 years I thought WP Super Cache was doing the right thing until I started watching my logs and noticed a bug in the code. Because of the way I set things up, I had a problem where WP Super Cache was flushing my cache way too often.
The key point I’m trying to make with the bullet points above is that whenever a non-standard access happens on your blog, it uses up a lot of server resources no matter how you have your WordPress blog set up. And it’s these types of accesses that will bring your server to its knees, not regular traffic.
Detecting Rogue Accesses
The first key to optimizing your WordPress blog is understanding the traffic patterns that your blog receives. I’m willing to bet that 99% of WordPress users do not do this. Instead, they blindly follow and install various plugins and assume everything is working properly. Don’t feel bad, I was the same way.
So the first step is finding out what the heck is going on. There are many ways to do this, but the easiest way is to use the debug mode that the WP Super Cache plugin provides. What does this mode do? Basically, every time an access is handled directly by WordPress (resource intensive), it will show up in the WP Super Cache log. Here’s how you enable this mode.
Under the debug tab of your WP Super Cache Plugin, simply click on the “Debugging Enabled” check box and you are good to go!
Once logging is enabled, you can click on the “logfile” link which will point you to a file detailing your WordPress traffic. It will look similar to the text below.
15:03:46 /?utm_source=fwisp.com supercache dir:
15:03:46 /?utm_source=fwisp.com No wp-cache file exists. Must generate a new one.
15:03:46 /?utm_source=fwisp.com In WP Cache Phase 2
15:03:46 /?utm_source=fwisp.com Setting up WordPress actions
15:03:46 /?utm_source=fwisp.com Supercache caching disabled. Only using wp-cache. Non empty GET request.
15:03:46 /?utm_source=fwisp.com USER AGENT (Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)) rejected. Not Caching
The important thing to note is that any time you see something in this log, it is a bad thing because it means that WordPress has to do work. And because WordPress is a resource hog, you want it to do as little work as possible.
What To Look For In The Logs
The WP Super Cache logs are great because they tell you everything that is going on. But the sheer amount of information can be overwhelming unless you know what to look for. Here’s what you should be paying attention to in these logs.
- 404 Errors – Basically these are accesses to pages that don’t exist on your blog. Each 404 access handled by WordPress takes up a lot of server resources so you want to nip these in the bud if possible
- One Time Accesses – These are requests that cause your server to cache and compress pages that are only going to be used once (more on this later)
- Strange Bursts Of Traffic – These are usually bots that hammer your blog all at once
- Weird Caching Behavior – Are your caches getting flushed when they should? Is everything getting cached properly under all circumstances?
After keeping a log of my blog traffic for a period of 2 weeks, I discovered a lot of inefficiencies which I’ll describe below.
Bots Were Hammering My Archives
The first thing I did was look at my server logs for periods of high CPU load. Then, I analyzed my super cache logs during these exact same periods to see if there was any funny business going on. Turns out that once every other day or so, a group of bots would come and hammer all of my blog’s archive pages at the same time!
Because these archive pages aren’t cached by default, a bunch of simultaneous accesses was enough to spike my server load and make things extremely slow. And occasionally, it even caused my server to crash.
So I took a closer look at my HTML output and noticed that I had a bunch of archive links as part of my WordPress header. Therefore, bots and other web crawlers would come by and try to browse the archives.
After a bit of Googling, I found that a few other webmasters were having similar issues.
When we see loading issues with your site, many times there are a large number of hits from proxy server IPs, and the theory is that these proxies are caching your site and hitting all of those archive links. Many of the IPs we see when your site is causing loading issues are corporate, educational and Gov’t/Military proxy IPs which appear to prefetch content when someone accesses a site.
Solution: Check to see if you have the PHP call “wp_get_archives” in your blog’s header template and remove it. After deleting this little snippet, all of my archive accesses disappeared.
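If you can’t or don’t want to touch your theme files, a rough alternative (and it’s only a sketch, not the fix I actually used) is to turn away bots that request the date-based archive pages at the .htaccess level. It assumes your archive permalinks follow the default /YYYY/ and /YYYY/MM/ pattern, and you’ll want to think twice if you care about search engines crawling those archive pages.
# Alternative sketch only: return a 403 to bots that request date archives.
# Assumes default /YYYY/ and /YYYY/MM/ archive permalinks; adjust to your setup.
RewriteCond %{HTTP_USER_AGENT} (bot|crawl|spider|slurp) [NC]
RewriteRule ^[0-9]{4}(/[0-9]{2})?/?$ - [F,L]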
Bots Were Accessing Non-Existent Files
The second major thing I noticed was that there were a bunch of bots accessing files on my server that did not exist. The problem is that each time one of these requests comes in, your server has to load up WordPress, execute a bunch of PHP code and then issue a 404 page.
The solution to this problem is to edit the .htaccess file (Google this if you don’t know what it is) and add the following lines.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} !(robots\.txt|sitemap\.xml(\.gz)?)
RewriteCond %{REQUEST_FILENAME} \.(css|js|html|htm|rtf|rtx|svg|svgz|txt|xsd|xsl|xml|asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc|docx|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|m4a|mp4|m4v|mpeg|mpg|mpe|mpp|odb|odc|odf|odg|odp|ods|odt|ogg|pdf|png|pot|pps|ppt|pptx|ra|ram|rar|swf|tar|tif|tiff|wav|wma|wri|xla|xls|xlsx|xlt|xlw|zip)$ [NC]
RewriteRule .* - [L]
Once these lines are inserted in your .htaccess file, your web server will first check whether the requested static file actually exists. If it doesn’t, Apache will issue the 404 response itself WITHOUT loading up WordPress.
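One side effect to be aware of: these 404s will now be served by Apache’s bare default error page instead of your WordPress 404 template. If you’d rather they still look like part of your site without WordPress loading, you can point Apache at a static error page. Here’s a minimal sketch, assuming you save a plain static 404.html in your web root (the filename is just an example).
# Optional: serve a static page for the 404s Apache now handles on its own.
# Assumes a static /404.html file exists in your web root.
ErrorDocument 404 /404.html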
Bots Were Accessing URLs That Didn’t Exist
What’s unfortunate about the way WordPress is written is that if there’s an access to an article that does not exist in your database, your server will have to load up WordPress, execute a bunch of PHP code and do a bunch of MySQL lookups before issuing a 404 response.
If you look at your logs carefully, you will probably notice certain patterns of accesses that you can exclude before they ever reach WordPress.
For example, I noticed that a bunch of bots were accessing my site with the URL www.mywifequitherjob.com/some-article/www.facebook.com/like.php/…. And each time, my server would load up WordPress and issue a 404 response.
So instead of having WordPress handle these requests, I added the following lines to my .htaccess file.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} www\.facebook\.com/plugins [NC]
RewriteRule .* 404.html [L,R=404]
In the lines above, I’m having my webserver look for “www.facebook.com/plugins” in the URL and immediately issue a 404 response without loading up WordPress. As you peruse your logs, you’ll find many rogue accesses like the one I described above. Write a .htaccess rule for each one and these accesses will no longer load down your server.
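If you end up with a handful of these patterns, you don’t need a separate block of conditions for each one. Here’s a hypothetical sketch that folds several patterns into a single rule using alternation; the patterns below are placeholders for illustration, so substitute whatever junk actually shows up in your own logs.
# Hypothetical example: several junk URL patterns folded into one rule.
# The patterns are placeholders; use the ones from your own logs.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} (www\.facebook\.com/plugins|twitter\.com/share|plus\.google\.com) [NC]
# The "-" means no substitution; with R=404 Apache just returns the 404 directly.
RewriteRule .* - [L,R=404]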
Bots Were Hammering My Comment Links
Remember when I told you that URLs with GET parameters are handled differently by your caching plugin? I discovered that there were a ton of bots hammering away at my articles with the “reply to comment” parameter set.
When this happens (depending on your caching settings), WordPress gets loaded up and a cached, gzipped file is generated and served even though it will probably never be requested again. This is a waste of resources.
Example Taken From My Log
12:01:11 /how-to-get-more-facebook-fans-with-a-facebook-reveal-tab-or-fan-gate/?replytocom=5972 No wp-cache file exists. Must generate a new one.
12:01:11 /how-to-get-more-facebook-fans-with-a-facebook-reveal-tab-or-fan-gate/?replytocom=5972 Gzipping buffer.
The solution is to redirect all bots and crawlers with these GET parameters to the main article page. Here’s the code I added to my .htaccess file.
RewriteCond %{QUERY_STRING} replytocom
RewriteCond %{HTTP_USER_AGENT} ^(.*)(bot|crawl|spider|slurp) [NC]
RewriteRule .* https://mywifequitherjob.com%{REQUEST_URI}? [L]
This code looks for the “replytocom” GET parameter on requests from bots and crawlers and redirects them to the same URL with the parameter stripped, so the request can be served from the existing cache instead of hitting WordPress again.
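The same idea extends to other one-time GET parameters, like the utm_ tracking parameters that email and analytics tools tack onto links (the Aweber issue I mentioned earlier in this post). Here’s a rough sketch along the same lines; the parameter prefix and bot strings are only examples, so match them against what actually shows up in your own logs.
# Sketch: redirect bots requesting URLs with tracking parameters (utm_*)
# back to the clean URL so the supercached copy gets served instead.
# The parameter prefix and user agent strings are examples only.
RewriteCond %{QUERY_STRING} (^|&)utm_ [NC]
RewriteCond %{HTTP_USER_AGENT} (bot|crawl|spider|slurp) [NC]
RewriteRule .* http://yourblog.com%{REQUEST_URI}? [L,R=301]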
Crawlers Were Accessing Posts Without A Trailing Slash
I’m not sure why this is the case, but I noticed a bunch of webcrawlers that seemed to be legitimately trying to access the articles on my blog without a trailing slash in the URL.
As you may or may not be aware, a URL that is written like http://yourblog.com/article/ is different than a URL written like http://yourblog.com/article.
As a result, when a URL without a trailing slash is encountered, WordPress has to step in, run a bunch of PHP code and then issue a 301 redirect to the article with the trailing slash. In order to take WordPress out of the picture, you can simply insert the following rule in your .htaccess file.
# Add trailing slash
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteCond %{QUERY_STRING} !.*=.*
RewriteRule ^(.*)$ $1/ [L,R=301]
By adding the trailing slash in .htaccess, you are issuing the 301 redirect to the correct URL before the request ever makes it to WordPress, which saves CPU resources.
I Found A Bug In WP Super Cache
Unlike most people, I don’t have my WordPress blog installed in the root of my public_html directory. In addition, the front page of my blog is not my “posts” page either. When you have your blog configured the same way I do, there’s a bug in WP Super Cache where all of your cache files could get deleted prematurely even if you preload your cache.
I’m not sure if the author of the plugin is aware of this problem, but basically I found that whenever a spam comment was issued to my blog, my entire cache would get erroneously flushed. Since I get comment and trackback spam all of the time, my cache was constantly being emptied which made caching much less efficient.
So I spent a weekend debugging this issue and finally developed a workaround. If any of you are having the same problems, let me know and I’ll show you my fix. The moral of the story here is to never assume that your caching plugin just works. You need to look at the logs!
Disable CPU Intensive Plugins
Even if you’ve followed all of my steps above, it is impossible to filter out all of the rogue accesses before they reach your WordPress blog.
Therefore, you are always going to receive hits to your site that will not be handled efficiently. There’s no way around it.
But the important thing to realize is that you most likely will not have a bandwidth problem, you will have a CPU problem.
As a result (and this may be counterintuitive), you actually don’t want to be doing anything CPU intensive like “minifying” or “compressing” of your pages on the fly. Minifying and compression helps with bandwidth at the expense of CPU usage.
The last thing you want to be doing is minifying and caching rogue accesses. In fact, you should probably consider not minifying or compressing URLs with GET parameters either.
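If on-the-fly compression on your server is handled by Apache’s mod_deflate (that’s an assumption; your host may compress responses somewhere else entirely), one rough way to act on this is to set the standard no-gzip environment variable for requests that carry a query string. Whether it actually takes effect depends on your host’s setup and on how Apache carries environment variables through WordPress’s own rewrite to index.php, so treat it as a sketch and test it rather than trusting it blindly.
# Sketch: ask mod_deflate to skip on-the-fly gzip for requests with a query
# string, since those usually bypass the cache anyway. Assumes mod_deflate
# is what compresses responses on your host and that it honors no-gzip.
RewriteCond %{QUERY_STRING} .
RewriteRule .* - [E=no-gzip:1]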
Most importantly, you should not be running any CPU intensive plugins on your site. WP Engine has a great list of CPU intensive plugins that you should probably try to avoid.
As I was perusing my plugin list, I noticed that I was using a “similar posts” plugin. And when I took a look at the source code, I was appalled to discover that the plugin was doing full text compares to find similar articles, which is a major CPU drain and doesn’t scale.
As soon as I find a suitable replacement, this plugin is definitely going in the trash.
Moral Of The Story
Just because an online speed test tells you your blog is fast doesn’t really mean much in the grand scheme of things.
Don’t get me wrong. Page load speed is important for the search engines and for your regular customers, but you also have to account for rogue accesses that may bypass your cache and bring your server to its knees.
So don’t just install your caching and other speedup plugins blindly. Take some time to analyze your traffic and write as many .htaccess rules as possible to minimize the work that WordPress has to perform. Good luck!
There are always so many things to do and improve on. As little as six months ago, I didn’t even have a caching plugin at all, and the first time I went to install one, it was incompatible with my host provider. I finally saw my grades (most were C-F’s) and decided I had to do something, so I found one that’s compatible. My scores are now largely A’s and B’s, though I still have some work to do. I don’t have a CDN so maybe some basic tips on approaching that would be helpful.
In any case, I’m going to bookmark this article because I think it could be helpful to tweak my site.
There’s always some improvement to be made!
Hey Money Beagle,
Once you have everything settled down with your plugins, you should analyze your traffic patterns using the methods described above. That way, you can filter out as much rogue traffic as possible.
Wow, great post with a lot of actionable advice on here. Looks like I have some things to add to my list. I know some of these will definitely help me, including the tips for redirecting bots via .htaccess before they even hit my site. I know I have those spikes here and there where my server slows to a crawl, and usually it’s because of a bot of one kind or another.
Great post, thanks for sharing all your tips and tricks!
You would be surprised how much bot traffic there is out there. Chances are that it’s the bots that are slowing down your site and not human traffic.
Great post Steve, very detailed. I use the W3 Total Cache plugin, do you have any advanced tips for that? Also, that speed test screenshot, is that from your own website? How can I improve my time to first byte and image compression? http://www.webpagetest.org/result/121204_RC_FM3/
Nope that speed test is not from my website. I diverge from the recommended settings for my own personal reasons. You can’t really assign a letter grade to your website speed without looking at the “non-ideal traffic” which was one of the points of my article. I don’t use W3 Total Cache, but I would imagine that it has a logging option of some sort.
Hi Veronica,
Your time to first byte is largely dependent on your hosting. If you are running fully cached, you should be able to get a decent time to first byte. In terms of image quality, I tend to use a setting of 60 in Photoshop.
I have a dedicated host that’s supposed to be top of the line… who knows. I’ve been terrible at keeping images smaller on my own blog but keep them at 50 quality and compressed for clients all the time.
Great tips, Steve. I haven’t seen most of the .htaccess tips anywhere else, so those are definitely worth checking out! I just wish I knew more about how all this works. I’m not very tech savvy!
Thanks Ryan,
The reason you haven’t seen them is because I wrote them specifically for my blog. The thing is that your blog will probably require a different set of rules depending on your traffic patterns and what you need to exclude.
Wow you know your stuff. I’m not that savvy with this type of technical analysis, especially when it comes to speed. My site loads really slow and I think a decent chunk of that is probably from the images. I’ll have to try and dig into some of your suggestions above. I’ve wondered about 404s and really need to look into that.
Hey Sydney,
If your site is slow under normal blog traffic, then I’d go to webpagetest.org and get the page tested. It could be your hosting or your images. Hard to tell but the webpagetest will tell you what’s up.
Thanks for putting in those wp optimization tips.
I am using w3 total cache instead of wp super cache and it works fine for me.
Will try wp super cache too and will see if it improves the performance.
Hey Sam,
W3 Total Cache is a superset of WP Super Cache. Both plugins will work fine for normal traffic. It’s the bots and the rogue accesses that you have to worry about. You will likely not see much benefit from switching plugins.
Do you just add these to the .htaccess in your root directory? http://www.domain.com/.htaccess?
Great tips, thanks Steve!
I was especially looking forward to implementing the rewrite rule for non-existent files, but it clearly doesn’t work for a WP multisite network. The main site throws an internal server error after adding the code you provided to our .htaccess file. :-\
Don’t suppose you have any suggestions for making that 404 snippet play nice across all sites in a WPMU environment?
This is confusing, since when I add the code to the wp-admin, the site stays up, but accessing the WordPress platform on the back end I constantly get “page not found” errors that direct me back to my site.
However, editing the wp-content .htaccess file throws the site into a mess so that you can barely navigate, but leaves the WordPress dashboard and such intact so that I can still edit my site.
Any idea what I am doing wrong here? Thanx to any replies…
So I stopped editing the wp-content .htaccess and now only edit the wp-admin one. It seems that these lines were causing problems:
RewriteCond %{REQUEST_FILENAME} \.(css|js|html|htm|rtf|rtx|svg
|svgz|txt|xsd|xsl|xml|asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc
|docx|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|
m4a|mp4|m4v|mpeg|mpg|mpe|mpp|odb|odc|odf|odg|odp|ods|odt|ogg|pdf
|png|pot|pps|ppt|pptx|ra|ram|rar|swf|tar|tif|tiff|wav|wma|wri|
xla|xls|xlsx|xlt|xlw|zip)$ [NC]
RewriteRule .* – [L]
This one created a redirect loop
RewriteRule .* https://mywifequitherjob.com%{REQUEST_URI}? [L]
This one directed me to a 404 lost page
RewriteRule .* 404.html [L,R=404]
Otherwise every other line of code is in my .htaccess file and my site is working fine. Any clues as to what may be causing issues? (Also I have the latest version of WP.)
Hey Buds,
The reason that line is not working when you cut and paste it is that I had to insert line breaks in order to format it correctly for the post.
Try cut and pasting this.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} !(robots\.txt|sitemap\.xml(\.gz)?)
RewriteCond %{REQUEST_FILENAME} \.(css|js|html|htm|rtf|rtx|svg|svgz|txt|xsd|xsl|xml|asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc|docx|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|m4a|mp4|m4v|mpeg|mpg|mpe|mpp|odb|odc|odf|odg|odp|ods|odt|ogg|pdf|png|pot|pps|ppt|pptx|ra|ram|rar|swf|tar|tif|tiff|wav|wma|wri|xla|xls|xlsx|xlt|xlw|zip)$ [NC]
RewriteRule .* - [L]
Ah… the line breaks.
That was it Steve!
Thank you for taking the time to reply.
What about removing the JS files which WordPress automatically adds to the footer and the header? I just read an article which recommended it (http://www.giftofspeed.com/speed-up-wordpress/) and was wondering what you’d think?
Great post – gave my site just a little more speed. Thanks