Performance benchmarks for managed WordPress hosts compared

Written By Brian Krogsgard

7 thoughts on “Performance benchmarks for managed WordPress hosts compared”

  1. I agree that they could have added a few more hosts to the list. I would have liked to see Nexcess on there as well.

  2. We’ve been collecting similar data for quite a while, and figuring out how to present it in a useful way is something that pretty much keeps me up at night. There are so many things to consider, and personal experience varies from customer to customer as well.

    All that to say, we’ll be releasing some data soon as well, hopefully presented in a way that actually helps people make decisions rather than overwhelming them.

    (Also, 100% agree on the support point. That metric seems to be the hardest to measure, but the most important.)

    • Ryan,

      We actually measure the ‘hard to measure’ stuff, such as support, at Review Signal. It wasn’t included in the article because I don’t have enough data for every company tested. Review Signal needs a large volume of reviews to produce reliable scores, so it really only works for larger companies, but of the ones listed in this article we have data on: A Small Orange, Digital Ocean, GoDaddy, SiteGround, WebSynthesis, and WPEngine.

      Take a look at http://reviewsignal.com/webhosting/compare/; some of them are under the Cloud tab as well.

      • Yeah, I hope that didn’t come across as critical. It wasn’t meant to be at all. I know how much time and money it takes to do this level of testing and benchmarking, and you did a great job with it.

        I do think that data on its own can be really overwhelming for a lot of people, so figuring out a solid way to present it is really important too. I’m not saying you didn’t, just that it’s easy to overwhelm people.

        I also think it’s nearly impossible to develop an algorithm for support, because everyone’s standard of what’s “good” is very different. Sure, with enough data you get closer to whether or not a host sucks, but it still doesn’t feel like a true metric, because there are no hard-set criteria for what makes an experience better than average.

        I may LOVE Big Macs, but for most people that’s garbage food. So while my experience with special sauce and extra cheese on a sesame seed bun may be absolutely stellar, most people are going to talk about how awful it is until their throats are sore.

        Human expectation is just a variable that I have a really hard time trying to quantify in any kind of scientific way.

        • Ryan,

          I didn’t take your post as criticism of my other work. I just wanted to point out that I did quantify it.

          As far as your point about Big Macs: if most people thought Big Macs were awful and a few people found them irresistible, with enough data points you could approximate what percentage of people enjoy Big Macs. Same with support. If 7/10 people have a positive experience with a company’s support staff versus a company where 4/10 people have a positive experience, you suddenly have a way to compare what the ‘average’ experience is. No company is perfect, and the scores do vary quite a bit. It’s generally pretty easy to tell which companies people think have good support and which don’t. That doesn’t mean everyone will have a good experience, of course. It just means X% of people do (or don’t).
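
          To make that arithmetic concrete, here is a minimal sketch of the comparison (the counts are hypothetical, and this is an illustration, not Review Signal’s actual scoring method):

          ```python
          # With enough sampled support experiences, the share that were
          # positive becomes a comparable "average experience" metric.
          # All counts here are hypothetical.

          def positive_rate(positive: int, total: int) -> float:
              """Fraction of reviewers reporting a positive support experience."""
              return positive / total

          host_a = positive_rate(7, 10)   # 7 of 10 reviewers happy
          host_b = positive_rate(4, 10)   # 4 of 10 reviewers happy

          print(f"Host A: {host_a:.0%} positive")  # Host A: 70% positive
          print(f"Host B: {host_b:.0%} positive")  # Host B: 40% positive
          ```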

          • Yep. I don’t think you said anything I didn’t. I just don’t think “Good” or “Bad” are very good benchmarks. There has to be a better way to quantify it.

            Simply saying “good” is 100% and “bad” is 0% and then finding the average of good vs. bad seems very inexact, especially since there’s such a huge human element when it comes to support.

            “Perfect” support can be perceived as subpar by people with unreasonable expectations. And vice versa.

            There’s a better way to measure quality support. I just don’t know what it is yet.

  3. Ryan Sullivan wrote: “Yep. I don’t think you said anything I didn’t. I just don’t think ‘Good’ or ‘Bad’ are very good benchmarks. There has to be a better way to quantify it. […] There’s a better way to measure quality support. I just don’t know what it is yet.”

    Reminds me of Churchill’s line: “Democracy is the worst form of government, except for all those other forms that have been tried from time to time.”

    Support is a soft measure, and it’s the most human part of the web hosting experience. I think the share of people who walk away happy versus unhappy is about as good a metric as it gets. You will always have unreasonable people (on both the support side and the customer side; people have bad days), but most companies seem to deliver a generally consistent level of service.
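
    For what it’s worth, there is a standard statistical way to express how much confidence a happy-versus-unhappy share deserves: attach a confidence interval to it. The sketch below (not from the article; counts are hypothetical) uses the Wilson score interval to show why a percentage from ten reviews is far less exact than the same percentage from a thousand:

    ```python
    # The Wilson score interval quantifies how "inexact" a raw
    # happy-vs-unhappy percentage is, and tightens as reviews accumulate.

    from math import sqrt

    def wilson_interval(positive: int, n: int, z: float = 1.96) -> tuple[float, float]:
        """Approximate 95% confidence interval for a positive-review rate."""
        p = positive / n
        denom = 1 + z * z / n
        center = (p + z * z / (2 * n)) / denom
        half = (z / denom) * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
        return center - half, center + half

    # The same 70% rate at different (hypothetical) sample sizes:
    print(wilson_interval(7, 10))     # wide:   roughly (0.40, 0.89)
    print(wilson_interval(700, 1000)) # narrow: roughly (0.67, 0.73)
    ```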

