Basically, he's asking why web hosts don't build BitTorrent-like apps to distribute their content geographically: in effect, renting people's excess computer cycles, bandwidth, and whatnot to host content, and in exchange the web host coordinates storage of the user's content in a distributed fashion across many machines.
Benefits: your web host is never down, and a single client failure won't take out much information, though I could see how several client failures at once could pull your site down along with many others.
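That "several failures at once" worry is just probability. As a back-of-the-envelope sketch (the replication factor and failure rate here are made-up numbers, not anything from the post): if each chunk of a site lives on r volunteer machines, and each machine is independently offline with probability p, a chunk only vanishes when all r replicas are down at the same time.

```python
def chunk_unavailable(p: float, r: int) -> float:
    """Probability that every one of r replicas is offline at once,
    assuming independent failures with per-machine probability p."""
    return p ** r

# Flaky home machines (say, offline 30% of the time) need several
# replicas before a chunk is reliably reachable:
for r in (1, 2, 3, 5):
    print(r, chunk_unavailable(0.3, r))
```

The catch, of course, is that home-machine failures aren't independent (power outages, evening shutdowns), so the real risk is worse than this sketch suggests.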
The answer: it's hard to do anything useful that way with small files. BitTorrent works because it provides an economy of scale on large files that aren't otherwise hosted effectively and would take a long time to download anyway -- but a user isn't going to wait 10 minutes for the network to find a non-busy host that has his 3 KB email from Aunt Marge on it.
In addition, it's impossible to do this with dynamic content, because server-side applications simply don't work that way. And if you update your website, the change takes a while to propagate -- potentially letting old, crufty information out onto the network. Same with content hosted on a user machine that was shut off for a week and then powered back on: it suddenly serves up a week-old copy of your website. It's impractical to do content 'freshness' checking at an appropriate speed without eating too many server resources on the central coordinator or too much of the user's connection. (Yeah, like I'm going to generate a 2 MB file of MD5 hashes for the 30,000 2 KB files the web host has stored on my client -- and a large text-based wiki can easily hit 30,000 files -- push it to the host, wait for it to analyze all 30,000 hashes, and then get back to me with fresh copies... heck, no!)
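To make the objection concrete, here's a sketch of the kind of freshness check being mocked above -- build an MD5 manifest of every cached file, ship the whole thing over, and diff it against the host's copy. (This is my own illustrative code, not any real protocol; the function names are hypothetical.)

```python
import hashlib
import os

def build_manifest(root: str) -> dict:
    """Map each file's relative path under root to the MD5 of its contents.
    For 30,000 files this means 30,000 full file reads per check."""
    manifest = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            with open(path, "rb") as f:
                manifest[rel] = hashlib.md5(f.read()).hexdigest()
    return manifest

def stale_files(mine: dict, theirs: dict) -> list:
    """Paths in the host's manifest that the client lacks or holds
    an out-of-date copy of."""
    return [p for p, h in theirs.items() if mine.get(p) != h]
```

Every check re-hashes every file and moves the whole manifest across the wire, which is exactly why doing it often enough to matter would chew up the client's disk, CPU, and connection.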
And adding to that, a lot of web content, like streaming media files, needs its bits and bytes to arrive in the right order (i.e., as a stream), which BitTorrent is specifically NOT good at -- the first two pieces you pull down may be the first piece of the stream and the third-to-last piece of the stream, which doesn't do you any good.
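The problem is that a player can only consume the contiguous prefix it has so far, no matter how many scattered pieces have arrived. A toy illustration (not real BitTorrent code, just the arithmetic of the complaint):

```python
def playable_prefix(arrived_pieces):
    """Count how many pieces can be played back in order, given the
    set of piece indices that have arrived so far."""
    have = set(arrived_pieces)
    n = 0
    while n in have:
        n += 1
    return n

# Pieces 0 and 97 of a 100-piece stream arrive first -- the player
# gets through piece 0 and then stalls:
print(playable_prefix([0, 97]))  # -> 1
```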
So in other words, P2P works great for large files but sucks for the kinds of files web hosts most commonly deal with. Most of the problems I listed above aren't insurmountable, but they add up to way too much hassle to ever be truly effective. Seriously -- web hosting is NOT that complex an operation at small to medium scale, even when you're talking geographic distribution and mirroring. No need to make it more complex and less reliable.