Kevin Ohashi does the best WordPress hosting technical analysis out there. He's just come out with his 2016 benchmarks, the culmination of months of testing twenty-six companies across six price tiers, with five different methodologies.
In addition to his main post, which links out to various tier-based tests, he also has a Google Sheet that lists the tiers, as well as the type of hosting plan, server, and other features that were included for each plan tested.
What the tests represent
The methodology is incredibly important. His “top tier” awards for each price bracket are based on a mixture of the methodologies. And keep in mind these methodologies measure performance, not the other components of choosing a host.
It's worth considering these methodologies, and how they apply to your own website and application, when analyzing a host. Two big examples:
- LoadStorm tests logged in users. The test starts with X users, and works up to Y users, over Z amount of time. So, for instance, it may start with 500 users and work up to 1,000 users over a thirty minute time frame, sustaining the 1,000 user peak for 10 minutes.
- Blitz tests cached home pages. The test starts with X users, and works up to Y users, over Z amount of time. So, for instance, it may start with 1 user and work up to 1,000 users over the period of one minute.
LoadStorm's testing is meant to mimic real users. The concurrency isn't based on concurrent requests, but on user sessions. So LoadStorm mimics visiting the site, visiting the login page, going back to the home page, browsing, etc. The requests of those thousand users are like lightning strikes in a thunderstorm. The size of the thunderstorm is based on the number of users. Bigger thunderstorms have more lightning strikes.
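As a rough sketch of how a ramp-and-hold profile like this behaves (an illustrative model only, not LoadStorm's actual implementation), the concurrent-user count at any point in the test can be described as a linear ramp followed by a sustained peak:

```python
def concurrent_users(minutes_elapsed, start_users, peak_users, ramp_minutes, hold_minutes):
    """Illustrative model of a ramp-and-hold load test profile.

    Users ramp linearly from start_users to peak_users over ramp_minutes,
    then hold at peak_users for hold_minutes. Returns 0 outside the test window.
    """
    if minutes_elapsed < 0 or minutes_elapsed > ramp_minutes + hold_minutes:
        return 0
    if minutes_elapsed >= ramp_minutes:
        return peak_users
    fraction = minutes_elapsed / ramp_minutes
    return round(start_users + fraction * (peak_users - start_users))

# The entry-level LoadStorm profile: 500 -> 1,000 users over 30 minutes,
# then 10 minutes sustained at peak.
print(concurrent_users(0, 500, 1000, 30, 10))   # 500
print(concurrent_users(15, 500, 1000, 30, 10))  # 750
print(concurrent_users(35, 500, 1000, 30, 10))  # 1000
```

The Blitz profile is the same shape compressed into sixty seconds with no hold, which is why it behaves more like a burst than a storm.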
Both LoadStorm and Blitz factor significantly into the results, but obviously if a host fails on either then it can massively impact overall performance. In a few cases, one of the other metrics is the reason for failure, but most often it's LoadStorm, and sometimes Blitz.
I do agree with his methodology, and I believe it was quite fair.
If he tested only cached pages with Blitz, it would give hosts a free pass to hide a lot of ugly little flaws behind full page caching. On the other hand, most pageviews for most websites will utilize a full page cache. If your website serves 99%+ of its pageviews to non-logged-in users with static content, you may put more weight on the Blitz tests and less on LoadStorm.
Testing with LoadStorm is a great probe of a site's underbelly. It's a really important test for sites that can't often utilize a full page cache, like eCommerce checkout pages or social and membership sites with high rates of logged-in visitors. Logged-in testing can also be a truer test of the host's raw capabilities, whereas full page caching can often act like lipstick on a pig.
In addition to these (I believe) most important metrics, he also tests three others:
- Uptime, using UptimeRobot and StatusCake: This is good for detecting dramatic failures, but most top hosts today have pretty good uptime. And uptime is best measured over years, in my opinion, not a few months. I'd rather have a host with a couple minutes of downtime every month than a few hours on an irregular basis. Also, there's plenty of reason to be suspicious of short reported failures, as these tools are far from perfect.
- Location-based testing with WebPageTest.org: He tested first connections from 11 locations across North America (USA), Europe (UK and Germany), Africa (South Africa), South America (Brazil), Asia (Japan, China, and Singapore), and Australia.
- CPU, MySQL, and database metrics using his WPPerformanceTester plugin: This is a handy test for measuring raw computing power, as well as the host's ability to handle WordPress queries.
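For a sense of what a raw-compute benchmark measures, here is a rough analogue in Python (WPPerformanceTester itself is a PHP plugin; the loop counts and operations below are illustrative, not its actual test suite):

```python
import time

def bench_math(iterations=250_000):
    """Time a tight loop of integer arithmetic, a simple CPU probe."""
    start = time.perf_counter()
    total = 0
    for i in range(iterations):
        total += (i * 3) % 7 + i // 5
    return time.perf_counter() - start

def bench_strings(iterations=50_000):
    """Time repeated string manipulation, another common raw-compute probe."""
    start = time.perf_counter()
    for i in range(iterations):
        s = f"benchmark-{i}"
        s = s.upper().replace("-", "_").lower()
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"math:    {bench_math():.4f} s")
    print(f"strings: {bench_strings():.4f} s")
```

The point of benchmarks like these is comparability: the same fixed workload run on different hosts exposes differences in the underlying hardware, independent of caching layers.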
Kevin's tests are, by far, the most thorough and rigorous testing available for WordPress hosts.
The six price categories he tested cover most budgets. I'm going to also list the LoadStorm (logged in) and Blitz (logged out) test metrics he used to compare these hosts.
It's worth noting that the levels he threw at them are like slamming a hammer against a window to see if it breaks. Breaking is part of the game if you throw enough force at it. The top tiers of his tests would equate to millions of visitors per day in real life, which for most of us is not realistic.
So it's worth treating these as spike tests, meaning even the hosts that didn't perform well here could still theoretically be good hosts for websites with sustained traffic.
- Less than $25 per month
  - LoadStorm: 500 – 1,000 users over 30 minutes. 10 minutes at peak level.
  - Blitz: 1 – 1,000 users over 60 seconds
- $25 – $50 per month
  - LoadStorm: 500 – 2,000 users over 30 minutes. 10 minutes at peak level.
  - Blitz: 1 – 1,000 users over 60 seconds
- $51 – $100 per month
  - LoadStorm: 500 – 3,000 users over 30 minutes. 10 minutes at peak level.
  - Blitz: 1 – 2,000 users over 60 seconds
- $101 – $200 per month
  - LoadStorm: 500 – 4,000 users over 30 minutes. 10 minutes at peak level.
  - Blitz: 1 – 3,000 users over 60 seconds
- $201 – $500 per month
  - LoadStorm: 500 – 5,000 users over 30 minutes. 10 minutes at peak level.
  - Blitz: 1 – 3,000 users over 60 seconds
- $500+ per month (enterprise)
  - LoadStorm: 500 – 10,000 users over 30 minutes. 10 minutes at peak level.
  - Blitz: 1 – 5,000 users over 60 seconds
The enterprise tests were set up differently than the others: the hosts were allowed to fine-tune the site as they would for an enterprise client, meaning there was an onboarding process. All other tests were off the shelf, per his requirements for the participating hosts.
What the tests do not represent
The tests Kevin performs are incredibly helpful performance tests. That does not mean they should be your overall analysis for what hosting partner to choose. They don't measure overall experience (support, UX, UI), or overall value (what you get, performance aside, from a particular plan).
These are not hosting experience tests
Support and user experience can be just as important as performance, or more so. Hosting relationships should be long term, and you need to be comfortable with your host's offerings.
You may want phone support, or 24/7 support, or highly technical tier 1 support, or direct chat capability, or who knows what.
You may feel more comfortable hosting with a small company that knows their customers personally. You may feel more comfortable hosting with a big company. You may feel more comfortable with a company that's been around a while.
It all depends.
Even beyond support, you need to consider the experience of the site and platform itself. Are traditional tools like cPanel important to you? What about built-in WP-CLI? Do you prefer a custom administration dashboard? Is support for free SSL certificates via Let's Encrypt important? How easy is it to create and manage staging websites, deployment workflows, and other tools?
Choosing your host should include experience factors.
These are not hosting value tests
What you get for the price may not be fully apples to apples. As Kevin notes several times, some of his categories represent different tiers for that particular company.
The $51-100 category may be one host's top tier and another host's entry-level tier. That can affect how each host handles those customers. A host that caters to the higher end may not offer the same flexibility at this level as a host for whom this tier is their high end.
For example, $100 may buy a high-tier shared plan allowing 25 sites from one host, and a single cloud instance allowing only one site from another.
Value can also be determined based on the onboarding experience a host provides, or additional services like site profiling or even code review.
It would be a mistake to compare these hosts only using Kevin's tests, even though Kevin's tests are awesome.
All of my qualifications help get to the crux of the matter: comparing hosts is extremely hard.
That said, let's get into the results a bit and find out who consistently won out, who surprised, and make a few additional notes to Kevin's analysis.
Per tier performances
I'll list what Kevin calls “top tier” performers for each tier. He also highlights honorable mentions for each, which I'll also recognize. Usually an honorable mention is basically a top-tier listing with an asterisk, either because of test inconsistencies or something that's solvable and doesn't reflect poorly overall on the host. Other times an honorable mention is just because a host was close, but not quite close enough for “top tier”.
It's worth reiterating that the price and resources are not linear between these offerings. Kevin had to segment by something, and he chose price. That's a logical choice, but just keep in mind this is a performance comparison based on price, not based on product type.
For example, compare it to real estate. In these tests, we're comparing $250,000 houses, whether they are in the country or the city, California or Alabama, one thousand square feet or five thousand square feet. That said, it's still interesting to compare.
Less than $25 per month
The entry-level category is where the largest number of customers is affected. For reference, Kevin used “real pricing”, excluding plans that would bump above this range in renewal years. Thirteen hosts participated.
Top tier: SiteGround, LightningBase, DreamHost
Honorable mentions: Pressed, WP.Land, GoDaddy/MediaTemple, Traffic Planet Hosting
Other participants: A2 Hosting, Bluehost, Flywheel, Incendia, Hosting Agency
$25 – $50 per month
Twelve hosts participated in the $25-$50 per month category.
Top tier: SiteGround, LightningBase, Pressable, Pantheon, WPOven
Honorable mentions: None
Other participants: A2 Hosting, Conetix, Pressjitsu, WP Engine, WP.Land, Cloudways (DigitalOcean), Cloudways (Vultr)
$51 – $100 per month
Twelve hosts participated in the $51 – $100 per month category.
Top tier: SiteGround, LightningBase, Pressable, LiquidWeb, Kinsta, Pressidium
Honorable mentions: MediaTemple, Pagely
Other participants: Bluehost, Cloudways (AWS EC2), Cloudways (Google), Pantheon
$101 – $200 per month
Eight hosts participated in the $101 – $200 per month category.
Top tier: LiquidWeb, Kinsta, Pressable, Pressidium
Honorable mentions: None
Other participants: A2 Hosting, Bluehost, Conetix, Pressjitsu
$201 – $500 per month
Eight hosts participated in the $201 – $500 per month category.
Top tier: Kinsta, Pressidium
Honorable mentions: PressLabs
Other participants: MediaTemple, Pagely, Pantheon, Pressable, SiteGround
$500+ per month (enterprise)
Seven hosts participated in the enterprise category.
Top tier: WordPress.com VIP, Pagely, Pressable, Pantheon, Kinsta, Pressidium
Honorable mentions: None
Other participants: WP Engine
Kevin did an outstanding job describing why, on certain tests, hosts fell outside the top tier or honorable mentions.
Across the board, it's worth noting that host performance was pretty darn good. Using our window analogy again, I think I can safely say that most of these hosts would've done well, had Kevin thrown a shoe, or maybe a basketball, and not a hammer.
99.948% average uptime for < $25/mo hosts is about 22.5 minutes of downtime in 30 days on average. Throwing out the top two and bottom two hosts, the average uptime rises to 99.964%, or about 15.5 minutes of average downtime in 30 days.
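The conversion from an uptime percentage to minutes of downtime is simple arithmetic, and a small helper makes it explicit:

```python
def downtime_minutes(uptime_percent, days=30):
    """Minutes of downtime implied by an uptime percentage over a period."""
    total_minutes = days * 24 * 60  # 43,200 minutes in 30 days
    return (1 - uptime_percent / 100) * total_minutes

print(f"{downtime_minutes(99.948):.1f}")  # 22.5
print(f"{downtime_minutes(99.964):.1f}")  # 15.6
```

Run the other way, even a "good" 99.9% uptime allows more than 43 minutes of monthly downtime, which is why fractions of a percent matter here.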
While the tests aren't perfect, passing them with flying colors is certainly a sign of a host doing something right. On that note, let's identify the hosts that hit notable top tiers:
- Pressidium and Kinsta were top tier in each test they participated in. They were near flawless across the board — very impressive for two upstart and hungry managed hosting companies.
- LightningBase is basically unknown, and performed spectacularly. Kevin noted in the post that this isn't the first time they've done so well. Their marketing is clearly holding them back, but their product seems on point, and their plans are quite affordable, maxing out at $100 per month for listed prices, though custom quotes can go higher.
- LiquidWeb was a rock solid VPS performer, top tier in both mid-level tests they participated in. I'm surprised they didn't enter the enterprise category. They've been building out their WordPress-centric tools and they are a fascinating company right now, as they also have the most available support structure.
- SiteGround was top tier for shared and cloud plans. They were bumped out of the top tier with their dedicated server, which didn't handle the spike as well as their more scalable cloud options. SiteGround remains possibly the best bang-for-your-buck host out there, and their entry-level prices are so low that it makes them an attractive hosting partner to grow with a site.
- Pressable was in every test above $25, and was top tier in all but $201-$500, where they had what looks to be an isolated uptime incident. They definitely win “most improved” to me, and it seems Automattic's investment into completely rebuilding that company is paying off.
- GoDaddy and MediaTemple appear to be sharing infrastructure these days, for the most part. They both performed admirably, and I wish GoDaddy hadn't had the security snag because it would've allowed a better comparison. But it seems they're clearly reaping the benefits from their investments in managed WordPress.
- Pantheon and Pagely lost a little ground this year purely based on tests, though it's hard to pin down exactly why. Pantheon's results seem a bit hit-and-miss but were overall positive. Pagely performed well, but it appears their high price-to-resource ratio put them in an awkward position for price-based segmentation, as they didn't bring enough firepower to the gunfights until it came to enterprise, where they shone brightly.
- Flywheel and WP Engine didn't perform very well in the categories they participated in, which is a bit surprising considering their strong reputations and passionate customer bases. Both did much better with cached visits and lost high placement due to logged in tests.
- DreamHost only entered the most affordable category, but it paid off, as they were a top performer. They don't market higher level offerings for managed WordPress hosting, but I don't know if that means they don't exist.
Of course my takeaways aren't comprehensive, and Kevin gave at least a paragraph of analysis for every host that entered each category, but these were the things that were most noticeable to me, as someone who has long been following the hosting market.
What to do with this information?
I've spent some 2,500 words putting Kevin's excellent research into an even larger context.
Tests like these are very valuable, but they are not a silver bullet. It's important to take these results and compare them to other criteria that matter to you.
This landscape is always changing, and hosts are iterating like crazy lately. Additionally, WordPress is a more attractive market than ever before, so we'll continue to see more investment in this space.
Even in these tests, there were several hosts I knew absolutely nothing about. Some did surprisingly well, others didn't do so well. Well-established players often performed basically to their reputations (good or bad), but others broke ranks with what you typically hear about them.
Hosting is so hard to analyze, and I hope Kevin's technical analysis and my more editorialized one help show that. It's an industry shrouded in marketing madness and technical mumbo jumbo. Too often, a minority of disgruntled customers can shape the reputation of a good company; likewise, a high-paying affiliate program can help cover for a poor performing product.
Sifting through it all and choosing well is tough. Most of us like to find a company we like and stick with it until it's too unbearable and we have to switch. If you have a host you like, stick with them! If you have to find a new one, I hope you'll utilize tools like Kevin's to help you choose.
You may have noticed I have not linked to any of the hosts. That's because Kevin makes money from each host you buy through on his website, and he plays the affiliate game as straight as anyone I know. If this post or his helps you choose, I'd prefer you use his links to buy.