How I Lost 2,000 Website Visitors Per Day by Ignoring an E-mail

I compulsively check our website stats. I keep Analyst (a small Mac program) running in my menu bar, and I probably look at it 30 times a day.

Mostly I want to know how many visitors are coming to our main site, North Carolina Divorce. Based on 19 years of data, I know that lots of visitors equals lots of revenue.

More website traffic means more money. I like money a lot; thus, I check our stats.

On Tuesday of last week, something weird happened.

I woke up and promptly checked our stats. (I do it while still in bed. Probably TMI, huh?)

We’d had fewer than 81 visitors, and it was already 6 AM. I panicked. I “shit you not” (I have no idea what that phrase actually means, but I really like it and don’t often have the chance to use it). I really panicked: rapid breathing, elevated heart rate, etc.

Normally by that time, we’d have several hundred visitors. Traffic heats up and stays strong during the workday, and then we get another burst in the evening. Our typical day easily exceeds a few thousand visitors.

What Went Wrong?

How could we have had so few visitors? What’s wrong?

I had to quell my anxiety until 8:30 AM when I could call our operations guy. It was too early to start drinking, so I managed my stress by eating a big bowl of Fiber One cereal (maybe a bad idea).

Ned arrived at 8:30, and we started digging.

That’s when we realized we’d received a message from Google on Saturday, and another (via e-mail) on Sunday. We hadn’t yet acted on the information Google sent. (We’re busy, you know?)

The messages were warnings we’d signed up to receive from Google Webmaster Tools. Google is happy to alert us when something is wrong. It’s a free service.

The warnings alerted us to an “Increase in authorization permission errors.” Who knows WTF that means? We didn’t.

Now we do.

After some investigating, we figured out that something had gone wrong with our site’s “robots.txt” file. It had become corrupted and was telling Google not to index our site.

A “robots.txt” file gives instructions to robots (like Googlebot) so they know where to go and where not to go. It’s a very helpful little website file, unless it gets screwed up.
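To make that concrete, here’s what a healthy “robots.txt” looks like next to the kind of broken one that shuts a whole site down. (This is a hypothetical reconstruction for illustration, not our actual file.)

    # Healthy: keep crawlers out of the admin area, allow everything else
    User-agent: *
    Disallow: /admin/

    # Broken: a lone slash tells every crawler (including Googlebot)
    # to stay away from the entire site
    User-agent: *
    Disallow: /

One character is all it takes: “Disallow: /admin/” hides a single folder, while “Disallow: /” hides everything.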

Google did exactly what we had inadvertently told it to do. The search engine stopped indexing us, and between Saturday and Tuesday, our pages started dropping from indexes around the world. Oh, crap!

We have tens of thousands of pages of content indexed by Google. Those pages evaporated from the index.

By Tuesday morning, we had clearly suffered a big drop in traffic. That’s when we got busy and figured out the problem.

Thankfully, Google gives pretty good guidance in Webmaster Tools. It directed us to the problem. It had done that in the warning e-mails as well. If we’d been paying closer attention, we could have avoided the entire problem.

How to Avoid Losing Your Web Traffic

The message of the day is: pay attention. If Google sends an e-mail, it’s not screwing around. It’s trying to help. Google wants to keep your excellent content in its index, and it doesn’t want to delete your pages. It’s giving you a heads-up, but it can’t make you take action. Only you can prevent the deletion of your listings.

Here’s your three-step action plan for avoiding the loss of your traffic:

  1. Sign up for Google Webmaster Tools so you’ll get warnings.
  2. Set up a rule in your inbox so the warnings get highlighted. Do whatever it takes to be sure you pay attention (color them red, forward them to your website guru, anything that works).
  3. Take action to correct the problem—quickly!

Looking back on the entire mini-disaster, we’ve tried to figure out what happened to our “robots.txt” file. Was it hacked? Was there a problem with our server? Was it corrupted by something we did?

We’re not sure. We’ve studied the logs and can’t find any evidence of hacking. Our best guess is that a rogue plugin did something unexpected, but we’re not absolutely certain.

While we haven’t figured out the exact cause of our troubles, you can be certain we’re watching our “robots.txt” file carefully. We’re watching the e-mails from Google even more closely. We’re making sure we don’t let this happen again.
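If you’d rather not rely on eyeballs alone, a small script can do the watching for you. Here’s a minimal sketch in Python (the URL is a hypothetical placeholder; swap in your own site). It fetches your “robots.txt” and complains if the rules would block Googlebot from the site:

    import sys
    from urllib import robotparser

    # Hypothetical placeholder; point this at your own site's robots.txt
    ROBOTS_URL = "https://www.example.com/robots.txt"

    def googlebot_allowed(url):
        """Return True if robots.txt at `url` lets Googlebot crawl the homepage."""
        parser = robotparser.RobotFileParser(url)
        parser.read()  # fetch and parse the live robots.txt
        return parser.can_fetch("Googlebot", "/")

    if __name__ == "__main__":
        if googlebot_allowed(ROBOTS_URL):
            print("robots.txt looks fine: Googlebot may crawl the site.")
        else:
            print("WARNING: robots.txt is blocking Googlebot!")
            sys.exit(1)  # nonzero exit so a cron job or monitor can alert

Run something like that from a daily scheduled job, and you’ll hear about a bad “robots.txt” within a day instead of finding out from your traffic stats.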
