Traffic light - is your site up, down, or about to fall over?

Who is watching your web site to make sure it is up?

Customers expect that your web site is always open for business. Even the best hosting companies cannot promise 100% uptime for their environment.

Watcherwatcher was built because we needed to know when any of a collection of web sites went down. The data center had a server, called "watcher", that could check sites for up and down status, but it lacked reporting to the business.

Unlike the proverbial tree falling in the forest with no one around to hear it, we needed to know the moment a site went down. So we built a script called "watcherwatcher" that ran a basic set of checks against a set of web sites and sent email if there was an issue.

24 hours a day, 7 days a week, 365 days a year monitoring. If you have an old desktop in the corner of your office, you can have round the clock monitoring in less than an hour for free.

Technology - Keep it Simple, Stupid

K.I.S.S. is a design principle coined by the US Navy around 1960. There are many web site monitoring tools, ranging in cost from free to hundreds of dollars per month per URL. They have lots of features, but watcherwatcher has outperformed many of them for years by keeping it simple.

Perl, Curl, Wget

The basic idea behind a web monitoring script is simple. You need a scripting language that can loop through a set of URLs, request each page, and check the response time, the response code, and the content.

Perl and the LWP Library

Perl is a programming language available on almost every operating system. Its LWP library simplifies the above steps into a few lines of code. There are other libraries and tools depending upon your environment, but everyone has an old computer in the corner of the basement. Linux and Perl will run on a computer with as little as 512 MB of memory. In fact, we just retired a watcher station that was running a light version of Linux on a 16-year-old Dell computer with 256 MB of memory.

Secure sites

Your web site needs an HTTPS certificate if you want it to be ranked by Google. Make sure that your tool of choice can request a secure web site. It may take some effort to configure the web request to validate the site's HTTPS certificate, but any recent installation of Linux and Perl has that capability.
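As a sketch, here is how LWP::UserAgent can be told to verify the server certificate. This assumes the LWP::Protocol::https module is installed, and the URL is a placeholder, not a real monitoring target:

```perl
use strict;
use warnings;
use LWP::UserAgent;

# Verify the certificate chain and hostname; requires LWP::Protocol::https.
my $ua = LWP::UserAgent->new(
    ssl_opts => { verify_hostname => 1 },   # reject mismatched or invalid certs
    timeout  => 10,
);
my $response = $ua->get("https://www.example.com/");   # placeholder URL
print $response->code, "\n";
```

If the certificate is expired or does not match the hostname, the request fails with a 500-series internal error, which the watcher script will report just like any other outage.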

Curl and Wget

Curl and wget are common command-line tools that can be used instead of LWP, and they too are available on almost every operating system. You will still need a scripting language to loop through the URLs, process the header results, and check the content.
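For example, a Perl wrapper that shells out to curl instead of using LWP might look like this. The URL list is a placeholder:

```perl
use strict;
use warnings;

# curl -s silences progress output, -o /dev/null discards the body,
# and -w prints the HTTP status code and total request time.
my @urls = ("https://www.example.com/");   # placeholder list of sites
for my $url (@urls) {
    my $out = `curl -s -o /dev/null -w '%{http_code} %{time_total}' $url`;
    my ($code, $time) = split / /, $out;
    print "$url -> code $code in $time seconds\n";
}
```

To check the page content as well, you would capture curl's stdout instead of discarding it with -o /dev/null.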

Write your own script

The basic watcher script is less than a page of code. For example, here is a script that could monitor a single domain:

# Program:  watcherwatcher
# Version:  1.00 10Mar2018
# Author:   Jim Canan

use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request;

my $check_url = "";                 # set this to the URL you want to watch

my $ua       = LWP::UserAgent->new;
my $start    = time();
my $request  = HTTP::Request->new('GET', $check_url);
my $response = $ua->request($request);
my $content  = $response->content;
my $stop     = time();

unless (($response->code eq '200')       &&
        (length($content) > 10400)       &&
        (length($content) < 10900)       &&
        (($stop - $start) < 2)           &&
        ($content =~ /keep it simple/)) {

	print "Watcherwatcher, we have a web site problem\n";
	print "Response code was ", $response->code, "\n";
	print "Response size was ", length($content), " bytes\n";
	print "Response time was ", ($stop - $start), " seconds\n";
}

open(my $logfile, ">>", "watcher.log") or die "cannot open watcher.log: $!";
my $logline = join("|", scalar(localtime($start)), $check_url, $response->code,
                   length($content), ($stop - $start), "\n");
print $logfile $logline;
close $logfile;

Log file of performance of the site over time

This script records the web site's size, response code, and response time over time:
cat watcher.log

Sat Mar 10 15:59:02 2018||200|8427|1|
Sat Mar 10 16:09:03 2018||200|8427|0|
Sat Mar 10 16:19:01 2018||200|8427|1|
Sat Mar 10 16:29:02 2018||200|8427|0|
Sat Mar 10 16:39:00 2018||200|8427|0|
Sat Mar 10 16:49:00 2018||200|8702|0|

Use crontab and .forward

Linux has a cron scheduler that can run the script for you every ten minutes from a crontab entry.

The script only produces output on error. You can add reporting that emails or texts you upon errors, or use a .forward file so that cron mails any output to your email address.
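As a sketch, the crontab entry and .forward file might look like this; the script path and email address are placeholders:

```
# crontab entry: run the watcher every ten minutes; cron mails any output
*/10 * * * * /usr/bin/perl /home/watcher/watcherwatcher.pl

# ~/.forward in the watcher account's home directory:
# forward any mail (including cron output) to your address
you@example.com
```

Because the script is silent when everything passes, the only mail you receive from cron is an alert.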

Add a loop to the above script

The above script can be modified with a loop that reads a configuration file and checks a list of web sites for size, response time, and content.
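A minimal sketch of that loop, assuming a pipe-delimited watcher.conf with one line per site in the form url|min_size|max_size|pattern (a format invented here for illustration):

```perl
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new(timeout => 10);

open(my $conf, "<", "watcher.conf") or die "cannot open watcher.conf: $!";
while (my $line = <$conf>) {
    chomp $line;
    next if $line =~ /^\s*(#|$)/;          # skip comments and blank lines
    my ($url, $min, $max, $pattern) = split /\|/, $line;

    my $start    = time();
    my $response = $ua->get($url);
    my $secs     = time() - $start;
    my $size     = length($response->content);

    unless ($response->code eq '200'
        && $size > $min && $size < $max
        && $secs < 2
        && $response->content =~ /$pattern/) {
        print "Watcherwatcher: problem with $url ",
              "(code ", $response->code, ", $size bytes, $secs seconds)\n";
    }
}
close $conf;
```

Each site gets its own size range and content pattern, so one config file can describe every page you watch.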

Watcherwatcher Monitoring

This simple script is monitoring all of the following... That allows you, as the business owner, to be notified if any of the above changes and to take appropriate action.

Watcher watcherwatcher

Over time, you can extend the watcherwatcher model with some changes.

Watch from more than one location

While we wanted to keep it simple, we realized that only having an old retired Linux server might not be sufficient. By adding a second server under your desk or in your basement that watches a small subset of the sites watched by the primary watcher, we can increase the reliability of the watcherwatcher process.

But you don't need to watch from everywhere

Some web site monitoring tools watch sites from ten or more places around the world, and others have added "real user monitoring" (RUM) to measure from actual customers. Those are great tools, and you may want to investigate them for your web site. But watcherwatcher will likely tell you that you have a problem before those tools do.

Synthetic monitoring

The technical term for watcherwatcher is a "synthetic monitor", because it watches from a known location on a regular schedule. That has been a more reliable indicator of issues than a monitor that checks from many locations and averages them over time.

Unless you have lots of checks from every location, adding up multiple monitors will mostly prove that a far-away web site is slower than a nearby one; a web site monitored from China will always look slower than from many other locations around the world. While some circumstances can only be detected from certain locations, most web site owners don't have the ability to re-route their traffic around a broken fiber optic cable under the ocean or a denial of service attack against the web.

Daily Reporting

The Perl script above creates a log file that tracks your web site's performance over time. The log is also easy to parse with Perl, so you can add a daily report that shows page performance over time and serves as a reminder that the computer in the corner is watching your web site 24x7x365.
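One way to sketch that daily report, parsing the pipe-delimited watcher.log format shown earlier (timestamp|url|code|size|seconds|):

```perl
use strict;
use warnings;

# Summarize watcher.log: count checks, count non-200 responses,
# and compute the average response time.
my ($checks, $errors, $total_time) = (0, 0, 0);
open(my $log, "<", "watcher.log") or die "cannot open watcher.log: $!";
while (my $line = <$log>) {
    chomp $line;
    my ($when, $url, $code, $size, $secs) = split /\|/, $line;
    $checks++;
    $errors++ if $code ne '200';
    $total_time += $secs;
}
close $log;

printf "%d checks, %d errors, %.1f seconds average response\n",
       $checks, $errors, ($checks ? $total_time / $checks : 0);
```

Run from cron once a day, its output arrives by the same .forward mechanism as the alerts.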

Network Troubleshooting

Over time, I have added watcherwatcher stations both inside and outside the company, each watching a small set of web pages. That allows triangulation of web issues versus network issues, so you know whether to contact your internet service provider or your hosting provider when there is an outage, an upgrade, or a network device about to fall over.

Add warnings to the reporting

With a couple of additional lines of code, you can create warnings and provide more fine-grained monitoring of page size. If your web site never changes, you can alert on any change; if your web site changes frequently, you can widen the alert thresholds.
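A sketch of one such warning, comparing the current page size to the previous one within a tolerance; the sizes below are taken from the sample log, and the helper name is invented here:

```perl
use strict;
use warnings;

# Warn if the page size drifts more than the given fraction from the
# last recorded size (which would come from the previous watcher.log line).
sub size_warning {
    my ($size, $last_size, $tolerance) = @_;
    return 0 unless $last_size;               # nothing to compare against yet
    my $drift = abs($size - $last_size) / $last_size;
    return $drift > $tolerance;
}

# 8427 -> 8702 bytes is a drift of about 3.3%, so a 3% tolerance warns:
print "Warning: page size changed\n" if size_warning(8702, 8427, 0.03);
```

A site that never changes would use a tolerance of zero; a news site might tolerate 20% or more.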

Add escalation

Depending upon the quality of your local network and your internet provider, you may want an escalation scheme: the initial alert goes to one list of people, and if the alert continues for more than 10 minutes, it escalates to another group.
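One way to sketch that escalation is a small state file counting consecutive failures; with a ten-minute cron interval, two failures in a row means the problem has lasted at least ten minutes. The file name and messages here are placeholders:

```perl
use strict;
use warnings;

my $state_file = "watcher.fail";   # placeholder path for the failure counter

# Increment and return the count of consecutive failed checks.
sub record_failure {
    my $count = 0;
    if (open(my $in, "<", $state_file)) { $count = <$in> || 0; close $in; }
    $count++;
    open(my $out, ">", $state_file) or die "cannot write $state_file: $!";
    print $out $count;
    close $out;
    return $count;
}

my $failures = record_failure();
print "alert: notify first group\n"    if $failures == 1;
print "escalate: notify second group\n" if $failures >= 2;
# On a successful check, the main script would delete $state_file
# to reset the count: unlink $state_file;
```

The same counter can also suppress duplicate alerts, so a long outage does not send an email every ten minutes.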

Website Monitoring

Since 1996, we have been using simple hardware and software to monitor web sites.

Put that old computer in the corner to use.

You can have 24x7x365 monitoring of your web site.

Watch your web site for a price that you can afford.

Jim Canan
Chief Technology Officer