I plan to hire a programmer to write code that measures the overall download speed of a web page over the internet. (I.e., not the speed at which the server emits it, or the raw speed of the connection, but the total download time including every possible obstacle and bottleneck between the web server and the browser.)
I am not well versed in the nuances of what is being measured, but I believe I want the time between the initial request being sent and the last byte of the response — whether it arrives with a "code 200" or some error code — being received by the browser.
I'd prefer to sound like I know what I'm talking about when I write the software specs. Is there anything I should be aware of?
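For what it's worth, the end-to-end timing you describe (request sent to last byte received) can be sketched in a few lines. This is a minimal illustration using only Python's standard library, not a substitute for a real load-testing tool; the URL in the usage comment is just a placeholder:

```python
import time
import urllib.request

def timed_download(url: str) -> tuple[int, int, float]:
    """Fetch a URL and return (status_code, bytes_received, elapsed_seconds).

    The clock starts before the request is sent and stops only after the
    final byte of the body has arrived, so DNS lookup, TCP setup, server
    time, and network transfer are all included in the measurement.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        body = response.read()   # blocks until the last byte arrives
        status = response.status
    elapsed = time.perf_counter() - start
    return status, len(body), elapsed

# Example usage (placeholder URL):
#   status, size, secs = timed_download("http://example.com/")
#   print(f"HTTP {status}: {size} bytes in {secs:.3f}s")
```

Note this measures only one request for the page's HTML; a browser also fetches images, scripts, and stylesheets, so "total page download time" in the browser sense is larger.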
"ab", or "ab2" that comes with apache (or apache2 ) does exactly that. You can set it to do a set number of tests, say 1000, and also set it to have say 2 threads or whatever running at the same time to simulate load. You can use it to test any web server, doesn't have to be apache.
When done, you get some stats on how fast everything went.
This is the simplest kind of load testing, but it's good enough for benchmarking.
You can make it write its results to a simple flat text file (the -g flag emits gnuplot-friendly per-request data) for automated processing. That's what I do to monitor performance.
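To illustrate the kind of automatic processing this enables: ab's -g output is a header row followed by one tab-separated row per request, with a ttime column holding each request's total time in milliseconds. A rough summarizing sketch might look like this — the sample rows below are made up, and the exact column layout should be verified against your ab version's output:

```python
import statistics

# Made-up sample in the tab-separated layout `ab -g results.tsv` produces:
# a header row, then one row per request (ttime = total time in ms).
SAMPLE = """\
starttime\tseconds\tctime\tdtime\tttime\twait
Sat Jan 01 12:00:00 2022\t1641038400\t12\t130\t142\t30
Sat Jan 01 12:00:00 2022\t1641038400\t10\t160\t170\t25
Sat Jan 01 12:00:01 2022\t1641038401\t15\t110\t125\t28
"""

def summarize(tsv_text: str) -> dict:
    """Return min/mean/max of the total per-request time (ttime, in ms)."""
    lines = tsv_text.strip().splitlines()
    header = lines[0].split("\t")
    ttime_col = header.index("ttime")   # locate the column by name
    times = [int(line.split("\t")[ttime_col]) for line in lines[1:]]
    return {
        "requests": len(times),
        "min_ms": min(times),
        "mean_ms": statistics.mean(times),
        "max_ms": max(times),
    }
```

Reading the file is then just `summarize(open("results.tsv").read())`, which is easy to log over time for monitoring.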