If you have access to the SQL server via Terminal Services, you can do it easily using Enterprise Manager.
Right-click on the database name and choose Backup from the menu. It will ask for a device (file) and some other options, then it will back up the database to the specified file.
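For reference, the same full backup can be done from Query Analyzer with a `BACKUP DATABASE` statement. The database name and file path below are placeholders, not anything from this thread:

```sql
-- Full backup of a database to a disk file (name and path are examples)
BACKUP DATABASE MyDatabase
TO DISK = 'C:\Backups\MyDatabase.bak'
WITH INIT,  -- overwrite any existing backup sets in the file
     NAME = 'MyDatabase full backup'
```

The resulting .bak file is what you copy down to your local machine and restore with `RESTORE DATABASE`.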
You can also do this using an Enterprise Manager installed on your local client computer. Just install SQL Server in client-tools mode (do not install the server). Connect to the remote server, treat it as if it were your local server, and make backups the same way.
Dumping databases will not work in all cases. A full backup is the most reliable way.
By the way, it is better to shrink the log files before the backup, as they take up a lot of space.
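A minimal T-SQL sketch of that step: back up the transaction log and then shrink the log file. The database name, path, and logical log-file name here are assumptions; check yours with `sp_helpfile`:

```sql
-- Back up the transaction log so its space can be reclaimed
BACKUP LOG MyDatabase TO DISK = 'C:\Backups\MyDatabase_log.trn'

-- Shrink the log file down toward a target size in MB
-- (the logical file name 'MyDatabase_Log' is an example)
DBCC SHRINKFILE (MyDatabase_Log, 50)
```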
Go to Enterprise Manager of your SQL server after logging in to your online database with your username and password. Then right-click on the database name and select the "All Tasks" option. Under it there will be a "Backup" option. Give it a path on your local machine.
Yes, this (the way anantatman describes) is another way to back up a database, but it only applies if you have full access to the server's files. Also, the .ldf and .mdf files are sometimes much larger than backups.
I have seen a backup file of only 10 MB produce an .ldf/.mdf pair of 1 GB.
Agreed, but sometimes when you have to move 50-plus databases and don't feel like writing all the T-SQL scripts to do the job, copy/detach/attach is fast.
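The detach/copy/attach route, sketched in T-SQL. The database name and paths are examples; run the detach on the source server, copy the files, then attach on the target:

```sql
-- On the source server: detach the database so its files can be copied
EXEC sp_detach_db 'MyDatabase'

-- Copy MyDatabase.mdf and MyDatabase_Log.ldf to the new server, then
-- on the target server: attach the copied files
EXEC sp_attach_db @dbname    = 'MyDatabase',
                  @filename1 = 'D:\Data\MyDatabase.mdf',
                  @filename2 = 'D:\Data\MyDatabase_Log.ldf'
```

Note the database is offline between detach and attach, so this is for a maintenance window, not a live copy.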
Also note that if you have any system-wide users that own the database, you should assign dbo or sa as the owner and delete the old user.
This can free you from a lot of hassle when restoring the database. Sometimes it will think there are users with permissions on objects in the db, like tables, stored procs, etc., when those users don't exist.
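A sketch of that cleanup in T-SQL (the database, user, and login names are examples): change the owner to sa, then detect and re-map any orphaned users after a restore:

```sql
USE MyDatabase

-- Make sa the database owner
EXEC sp_changedbowner 'sa'

-- List users in this database whose server logins no longer exist
EXEC sp_change_users_login 'Report'

-- Re-map an orphaned user to an existing login (names are examples)
EXEC sp_change_users_login 'Update_One', 'olduser', 'olduser'
```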
Originally posted by dreamrae.com: Ahh, if only you were using Unix instead... it's so easy in Unix. People should be using Linux; just say no when it comes to Windows servers... unless they are running Apache.