It sounds to me like this network of sites was not thought out very well, if you're having to update separate files manually in order to affect common elements. At the very least, all common elements should be kept in a separate file area and called into every site (there are ways to accomplish this even from multiple servers, though a single server is DEFINITELY preferable). At best, a CMS that can drive updates, files, preferences, etc. from a single location and single database sounds like it would make your life easier.
Of course none of this helps your immediate needs, but maybe it's time to talk with the owners of this network about rebuilding the backend so it is more efficient?
Short term, you might be able to set up software to map to the various accounts on the dedicated and ancillary servers, open up all the pages that need the same change, and do a global search/replace. The danger is that if you do it wrong, the change is now live across the entire network and becomes a real PITA to revert. My software choice for this sort of work is HomeSite, but you can't purchase it anymore (Adobe announced the end of life for HomeSite in May 2009). I'm sure Dreamweaver can do the same thing, though I would only use it in code view - it (like all WYSIWYG editors) is nearly useless and/or dangerous in any other mode.
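If you can get those accounts mounted or mapped locally, the same global search/replace can be done from the command line. Here's a minimal sketch, assuming each account shows up under a hypothetical /mnt/sites/ path (via sshfs, NFS, a mapped drive, whatever) and that the two strings are just placeholders:

```
#!/bin/bash
# Sketch only: assumes every account is mapped locally under /mnt/sites/<name>.
# The path and both strings are placeholders - adjust to your setup.
OLD='Copyright 2009'
NEW='Copyright 2010'

# Dry run first: list every file that would be touched, before changing anything.
grep -rl "$OLD" /mnt/sites/

# The actual replace. sed keeps a .bak copy of each file so a bad pattern
# can still be reverted.
grep -rl "$OLD" /mnt/sites/ | while read -r f; do
    sed -i.bak "s/$OLD/$NEW/g" "$f"
done
```

The dry-run line and the .bak copies are there precisely because of the "live across the entire network" danger above.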
Dreamweaver has a very comprehensive search/replace function, including the ability to use regular expressions, but it cannot operate on multiple FTP accounts at the same time.
One thing that comes to mind: make all the changes on one website, keep a record of the files you edited, then copy/overwrite those files on the other accounts.
You will need to use wget or a similar tool to do it cross-server though. I am not a Linux expert, but there must be a way to automate it via shell scripting. PuTTY Connection Manager might be useful in this case.
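A pull-style version of that idea might look something like the sketch below. Everything in it is an assumption - the host names, the web root, the update URL - and it presumes SSH keys are set up so the loop doesn't stop to ask for passwords:

```
#!/bin/bash
# Sketch of the "pull" approach: SSH into each server and have it fetch the
# updated files from one master location with wget. Hosts, web root, and the
# update URL are all hypothetical.
HOSTS="site2.example.com site3.example.com site4.example.com"

for host in $HOSTS; do
    ssh user@"$host" "cd /var/www/html &&
        wget -q http://master.example.com/updates/update.tar.gz &&
        tar xzf update.tar.gz &&
        rm update.tar.gz"
done
```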
Here is my question: how should I go about uploading the newly modified files to all 40 websites in a streamlined process, instead of doing it one by one?
Develop a batch file to do FTP uploads to each site.
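Something along these lines, for example - a shell version using curl, which can upload over FTP with -T. The file list and the sites.txt format (host, user, password per line) are just assumptions for the sketch; on Windows, the stock ftp.exe with -s:scriptfile can play the same role:

```
#!/bin/bash
# Sketch of scripted FTP uploads to every site. The file list and the
# sites.txt format (host user password, one site per line) are assumptions.
FILES="core/functions.php core/header.php"

while read -r host user pass; do
    for f in $FILES; do
        curl -sS -T "$f" "ftp://$host/public_html/$f" --user "$user:$pass"
    done
done < sites.txt
```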
What you should do is upload the files under new names, then rename them ... that way the running web sites will not see partially-uploaded files.
Sometimes you may need more elaborate control; for example, if the uploads can take a long time, you might want to create a duplicate directory structure, then rename that at the top level. That means you can separate the delivery of the new files from their activation.
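Done over SSH, the directory-level version of that rename trick could look like this sketch (the host and paths are placeholders):

```
#!/bin/bash
# Sketch of the rename trick at directory level. Upload the whole new tree
# next to the live one, then swap the two with renames - the swap is
# near-instant, so visitors never see a half-updated site.
scp -r ./newsite user@site2.example.com:/var/www/html.new

ssh user@site2.example.com \
    "mv /var/www/html /var/www/html.old && mv /var/www/html.new /var/www/html"
```

Keeping the old tree around as html.old also gives you an instant rollback: just swap the renames back.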
Wow, wasn't expecting so many replies so soon. Thanks to all of you.
The network of websites is not your usual setup. It's basically a bunch of domain owners who joined together to create a network. Each website has a different owner, so the websites have different content but use the same core scripts. I can't really create a central location that every website pulls its information from, because not all the websites are hosted on the same server... and any domain owner can decide at any time that they no longer want to use our services and start administering the website themselves, hire someone else to do it, or switch to a different host.
The website owners were initially members of another network and hosted on a cloud server. The updates at that time were done over SSH using some type of script. However, the other network stole a lot of their money, so they switched over to our services and we migrated all their websites.
To be completely honest, I've never written a bash script or batch file, which is why I was hoping there might be a program to accomplish this, or maybe a method similar to SVN.
I might be able to find the script they used to roll out the updates before. However, their old server was a completely different setup, so I'm not sure how useful it would be.