The Golden Rules of Website Performance (#1 and #2)

In my estimation, about 70 percent of performance optimizations are a waste of time and effort. In the web hosting market, I constantly see a race to the bottom: a race to provide FEWER RESOURCES on a shared hosting plan, and to provide subpar tech support when it comes to troubleshooting the resulting problems.

When does a performance problem merit attention? When it can be demonstrated to be approximately the same problem in both of these environments:

  • Shared Hosting – generally a 512 MB RAM limitation, and a storage governor limiting throughput to 1 Mbit/s. I mean, what planet do these people live on?
  • A VPS with a professional-level configuration using nginx and php-fpm.

The first thing to understand before deciding to chase a performance issue is that the Shared Hosting companies just RIP PEOPLE OFF, and then charge an arm and a leg for a decent set of resources.

I mean, this client I was mentioning here clearly had a Shared Server environment with 512 MB max memory. And then, in the php.ini, the PER PROCESS memory_limit was configured to be 256 MB of RAM. That means the webserver can only do two things at once. That is NOT going to lead to optimal results.
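To put numbers on that "two things at once" claim, here is the worst-case capacity arithmetic, sketched in Python. The numbers come from the example above (512 MB box, 256 MB php.ini memory_limit); the helper name is mine:

```python
def max_php_workers(total_mb, per_process_mb):
    """Worst case: how many PHP processes fit before the box starts swapping,
    assuming each process actually hits its php.ini memory_limit."""
    return total_mb // per_process_mb

print(max_php_workers(512, 256))  # 2 concurrent requests, worst case
print(max_php_workers(512, 128))  # 4
print(max_php_workers(512, 64))   # 8
```

In reality the OS and the webserver itself also need memory, so the true ceiling is even lower than this division suggests.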

If your Shared Hosting only gives you 512 MB, you can't make the per-process limit any higher than 64 MB, or MAYBE 128 MB. I mean, if you look at the tests that something like Pingdom or Google Lighthouse runs, they are parsing, downloading, and measuring the speed of HUNDREDS of files in a timeframe that FOR ME averages about 1.2 seconds. I shudder to think about a webserver actually trying to serve 100 different images (like a different Redditor I was helping the week before last). Or my ultra-high-end marketing firm that uses WordPress VIP: they actually have 65 different redirects set up on their homepage. It makes my skin crawl. Every one of those redirects is risky, slow, and obnoxious. And when you're paying between $5k and $25k a month for hosting between 1 and 5 websites, I think things like redirects should be observed and managed for you.

I’m still trying to find the ultimate tool for surfacing information about redirects. I have found a few Chrome plugins, but I can’t rely on those. I’ve got a half dozen browser profiles at any one time, each serving a distinct purpose, and with Chrome profiles an extension has to be installed separately in each one, with no real way to sync its configuration. I’m not a guy that seeks out new Chrome extensions as my FIRST CHOICE in website monitoring tools.
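Until I find that ultimate tool, a few lines of Python can trace a redirect chain hop by hop, no browser extension required. This is a sketch against a throwaway local server so it runs self-contained; the toy route table and the function names are mine, and a real audit would point it at your live homepage instead:

```python
import http.client
import http.server
import threading

class ChainHandler(http.server.BaseHTTPRequestHandler):
    """Toy server exposing a two-hop redirect chain: /a -> /b -> /c."""
    routes = {"/a": "/b", "/b": "/c"}

    def do_GET(self):
        if self.path in self.routes:
            self.send_response(301)
            self.send_header("Location", self.routes[self.path])
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"landed")

    def log_message(self, *args):  # keep the demo quiet
        pass

def trace_redirects(host, port, path, limit=20):
    """Follow Location headers manually, recording every hop
    (status code, path) instead of letting the client hide them."""
    hops = []
    while len(hops) < limit:
        conn = http.client.HTTPConnection(host, port)
        conn.request("GET", path)
        resp = conn.getresponse()
        hops.append((resp.status, path))
        location = resp.getheader("Location")
        conn.close()
        if resp.status in (301, 302, 307, 308) and location:
            path = location
        else:
            break
    return hops

server = http.server.HTTPServer(("127.0.0.1", 0), ChainHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

hops = trace_redirects("127.0.0.1", server.server_address[1], "/a")
for status, path in hops:
    print(status, path)  # 301 /a, then 301 /b, then 200 /c
server.shutdown()
```

The same idea is what `curl -sIL` gives you on the command line; the point is that every hop is visible and countable, so a 65-redirect homepage has nowhere to hide.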

I strongly believe that EVERY website needs about 4 or 5 distinct types of monitoring to ensure that things are going as expected. The sitemap should be visually inspected once a month: to make sure new content is still trickling through, and to make sure that there isn’t anything being exposed that shouldn’t be.
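That monthly sitemap inspection can be partly automated. Here is a sketch that parses a sitemap and reports what is in it; the sample XML payload and function name are mine (a real check would fetch your live /sitemap.xml instead of using an inline string):

```python
import xml.etree.ElementTree as ET
from datetime import date

# Sample payload standing in for a live fetch of https://example.com/sitemap.xml
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><lastmod>2020-11-01</lastmod></url>
  <url><loc>https://example.com/blog/post-1/</loc><lastmod>2020-10-15</lastmod></url>
  <url><loc>https://example.com/login-fragment/</loc><lastmod>2020-02-02</lastmod></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_entries(xml_text):
    """Return (url, lastmod date) pairs from a sitemap document."""
    root = ET.fromstring(xml_text)
    entries = []
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", namespaces=NS)
        lastmod = url.findtext("sm:lastmod", namespaces=NS)
        entries.append((loc, date.fromisoformat(lastmod)))
    return entries

entries = sitemap_entries(SITEMAP_XML)
newest = max(d for _, d in entries)
print(len(entries), "URLs, newest lastmod:", newest)
```

Two things fall out of this for free: if the newest lastmod is stale, content has stopped trickling through; and eyeballing the URL list catches things like that login fragment being exposed as a real page.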

I had a client the week before last, and I’m inclined to blame many of their performance problems on the LARGE amount of sitemap traffic that was rendering inappropriate pages. For example, the ‘Static Content’ Custom Post Type was being rendered dynamically and included in the sitemap. Of COURSE the theme had a ton of unnecessary Static Content entries for things like BuddyPress and custom login and logout forms. I mean, WHO KNOWS what crawlers made of those HTML fragments that were being advertised as actual PAGES and POSTS.

But first, the NUMBER ONE RULE for managing your website performance:


#1. A $10 VPS has 4 times the RAM of a $20 Premium Shared Hosting plan from GoDaddy / MediaTemple.

I was actually pleasantly surprised by the expertise and patience of MediaTemple. I hadn’t googled their heritage before I called, so I hadn’t deduced that they had been assimilated into the Borg many years back.

And I’ll go ahead and make the SECOND rule of improving website performance:

#2. GTFO of Apache2 / HTTPD. I just don’t think that it’s possible to properly tune that albatross in this day and age.
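For reference, here is roughly what the other side of that rule looks like: a minimal nginx server block handing PHP off to php-fpm, the pairing recommended above. This is a sketch only; the document root, PHP version, and socket path are assumptions that vary by distro and control panel:

```nginx
server {
    listen 80;
    server_name example.com;
    root /var/www/example.com/public;
    index index.php index.html;

    location / {
        # Serve static files directly; fall back to the front controller
        try_files $uri $uri/ /index.php?$args;
    }

    location ~ \.php$ {
        # Hand PHP off to the php-fpm pool over a unix socket
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php7.4-fpm.sock;
    }
}
```

The key design point versus a typical shared-hosting Apache setup: nginx serves static files itself and only spends a PHP worker on requests that actually need one.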

Here is a chart to back up my assertion:

[Chart: nginx growing from 0 percent market share to parity with Apache over the last 10 years – historical yearly trends in the usage statistics of web servers]

Here is a more detailed, quantified, and more recent analysis, from November 2020: nginx has a lead over Apache2 of about 80 million websites.


I have always trusted the Netcraft figures over anyone else’s. I used to cheer on Internet Information Services (IIS) when I lived on the other side of the planet 15 years ago.

Although I am a HUGE fan of nginx, I was kind of saddened to see that they were purchased last year by F5 Networks, a Seattle-based firm, for about $670 million. That is a lot of coin. Today, I’ve read about a half dozen articles about how a previous owner of the nginx software, or maybe I should describe them as a claimant, believes that they still own the rights to nginx, and somehow they are trying to extract a comparable sum from F5 Networks. My gut feeling is that this is just a brazen Russian extortion attempt to monetize what they had somehow released as open source. I’m no attorney, but I am worried that this copyright confusion is going to stunt the growth of nginx.

But my most pressing matter is to look at the newer releases of nginx and try to figure out how far away they are from arriving in my stack. That part worries me, because the control panel that I’ve been using this entire year has yet to give me a single hiccup, while its predecessor, VestaCP, let me down on a quarterly basis. I still love VestaCP, but not enough to use it in production anytime soon. When I get new features like 2-factor authentication cooked ENTIRELY into my panel, I’m just ecstatic, because the prospect of Russian snoops in my confidential hosting files gives me a case of the shakes.

I don’t dislike Russians. I wish I’d had my cybersecurity training paid for by the Kremlin, and I wish that Edward Snowden would be pardoned and would return to America. I thought that Trump was considering it, but Snowden called for Trump to INSTEAD pardon the WikiLeaks guy, and I don’t see that happening. Entirely different questions of scope.

Cheers! We hope that you’ve enjoyed this article about the web hosting market. We love web hosting software of ALL types and sizes. We honestly think the MOST LOGICAL arrangement for most of our clients is for us to manage their web hosting control panel, to help with troubleshooting, setup, and security, and to let everyone set up their own websites of any shape and size without artificial restrictions on things like RAM allocation. There is PLENTY of RAM to go around, and when there isn’t, the move from VPS to Dedicated is a LESS steep climb than the LEAP from Shared Hosting to VPS.

Remember that, people. If you’re on Shared Hosting, stop trying to squeeze soup from a rock. Your site cannot be properly optimized without an adequate webserver (namely nginx, in a fully configurable setup).
