Results 1 to 8 of 8
Thread: Source code or compiling
-
01-04-2007, 12:53 PM #1 - New Member
- Join Date
- Jan 2007
- Posts
- 2
Source code or compiling
Hi
I would like to know which is safer when setting up a server: downloading all the source code and compiling it myself, or installing the packages directly with apt-get?
Thanks
-
01-04-2007, 02:35 PM #2 - LORD OF THE RINGS
- Join Date
- Dec 2005
- Location
- Internet
- Posts
- 1,352
There are no safety problems with either approach. If you build from source, you can define the paths of the binaries and configuration files yourself. If you use apt-get, you can install the necessary packages very easily.
-
01-04-2007, 04:29 PM #3 - Junior Guru Wannabe
- Join Date
- Dec 2006
- Location
- Missoula,MT
- Posts
- 46
Really, it's a personal preference.
It comes down to a couple of different options when you want to install custom-compiled software.
1. You can build from source and install, typically using the configure/make/make install paradigm.
2. You can install a source RPM (in my case, because I use RedHat), modify it to reflect the paths and build options you want, then simply rebuild the RPM and install it. You can also place your custom-built RPMs of apache/php/mysql and others into a system like apt-get or yum or up2date; these can act as repositories of binaries you have built yourself, served from a central build server, which comes in very handy if you have multiple machines.
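A quick sketch of what the two methods look like on the command line. The package name, version, and paths are examples only (not from this thread), and the `run` wrapper just prints each command so the script is safe to read or run anywhere; swap it for the real commands on a build box.

```shell
# Dry-run sketch of the two install methods. All names/paths illustrative.
run() { echo "+ $*"; }

# Method 1: classic source build (configure / make / make install)
run tar xzf httpd-2.0.59.tar.gz
run ./configure --prefix=/usr/local/apache2 --enable-so
run make
run make install                          # usually as root

# Method 2: install a source RPM, tweak the spec, rebuild, install
run rpm -ivh httpd-2.0.59-1.src.rpm       # unpacks under /usr/src/redhat
run vi /usr/src/redhat/SPECS/httpd.spec   # adjust paths / build options
run rpmbuild -ba /usr/src/redhat/SPECS/httpd.spec
run rpm -Uvh /usr/src/redhat/RPMS/i386/httpd-2.0.59-1.i386.rpm
```

The resulting binary RPM from method 2 is what you would drop into a yum/apt repository for your other machines.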
Method 2 is nice if you have several machines, all with identical builds/architectures, and you like maintaining RPMs. Otherwise, if you prefer very fine-grained control over how you build and where you install the software, go with a custom compile.
There are certain Unix/Linux distros that cater to these different preferences. A typical BSD system uses the ports system, an automated means of pulling down source and compiling it, while RPM-based systems like RedHat/CentOS/Fedora use prebuilt binaries. I also know some folks who really like Gentoo Linux, as it has very nice controls for automating builds, but does them all in a BSD-style, from-source way.
As far as your question about "safer": if you rely on a vendor RPM update for a security flaw and don't have the means to patch and/or recompile and reinstall yourself, it *may* be less secure, simply in that your boxes have compromised software hanging out there for longer than they otherwise would.
SupportLayer - Enterprise Linux Server Management
-
01-04-2007, 04:53 PM #4 - Junior Guru
- Join Date
- Nov 2006
- Location
- College Station, TX
- Posts
- 185
Vendor RPMs are typically signed with a GPG key. (I'm not sure how apt-get does things, as I haven't used a Debian distro in years...) The repository you download those packages from also has a key. First the repository key is verified, and then the signatures on the individual RPMs can be verified. This means the releases you get from a repository are just as safe as compiling and installing the source code... and it's 100x easier to deal with RPMs than with upgrading source installations.
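For reference, these are the manual checks that yum/up2date automate. The key path and package name are made up for illustration, and the `run` wrapper only prints the commands, so nothing here touches your system:

```shell
# Dry-run sketch of verifying a vendor RPM by hand. Paths illustrative.
run() { echo "+ $*"; }

run rpm --import /usr/share/rhn/RPM-GPG-KEY   # trust the vendor key once
run rpm --checksig httpd-2.0.59-1.i386.rpm    # verify before installing
# yum does this automatically when gpgcheck=1 is set in /etc/yum.conf
```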
-
01-04-2007, 07:27 PM #5 - Web Hosting Master
- Join Date
- Nov 2005
- Location
- Minneapolis, MN
- Posts
- 1,648
Most everyone would agree that keeping current on software patches and updates is a vital step for overall server security. That said, which are you most likely to do:
Option 1: Grab the source code for all of the packages you want, define all your compile-time options (and record them so that you can recompile later), and track the project status for each program for when new critical patches and bug fixes are released so you can feed the changes back into the source and recompile.
Option 2: Occasionally type "apt-get update" and then "apt-get upgrade" to keep your packages up to date.
Eric Spaeth
Enterprise Network Engineer :: Hosting Hobbyist :: Master of Procrastination
"The really cool thing about facts is they remain true regardless of who states them."
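One way to take some of the sting out of Option 1 is to record your exact configure flags in a file next to the build tree, so a later security rebuild uses identical options. The filename and flags below are my own examples, not anything from this thread:

```shell
# Record the configure flags used for a source build so the next
# rebuild is reproducible. Flags and filename are illustrative.
CONFIGURE_OPTS="--prefix=/usr/local/apache2 --enable-so --enable-ssl"
printf '%s\n' "$CONFIGURE_OPTS" > configure-opts.txt

# Later, when a patched release comes out:
#   ./configure $(cat configure-opts.txt) && make && make install
cat configure-opts.txt
```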
-
01-04-2007, 10:20 PM #6 - Predatory Poster
- Join Date
- Jul 2003
- Location
- Goleta, CA
- Posts
- 5,566
It depends on what your goals are. Packages are generally better than source in a production environment because security fixes are backported into them. This means stability for your system. For example, Apache has from time to time rewritten its interface and configuration so that newer versions conflict with configurations used for the older versions. Backporting lets production systems keep the old interfaces but incorporate the new security updates (effectively patching the old version of the software).
I understand why they don't increment the version number since it's not officially the new version. However, this can cause false positives on security scanners that check based on version number.
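When a scanner flags a package purely by version string, the package changelog is how you confirm the fix was actually backported. The package name is an example, and the `run` wrapper only prints the commands:

```shell
# Dry-run sketch: confirm a backported fix a version-based scanner
# would miss. Package name illustrative.
run() { echo "+ $*"; }

run rpm -q httpd              # reports the old version string
run rpm -q --changelog httpd  # grep this output for the CVE in question
```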
Conversely, source code compiling can break dependencies with other applications.
-
01-05-2007, 11:19 AM #7 - New Member
- Join Date
- Jan 2007
- Posts
- 2
Thanks, my friends.
I got another server, with Red Hat.
I'm going to use up2date and yum. It looks like everything is working very well.
Once again, thanks for the tips.
I don't think I'll have any problems using the packages.
Thanks
-
01-05-2007, 05:40 PM #8 - Junior Guru Wannabe
- Join Date
- Jun 2006
- Posts
- 67
I normally read all of the changelogs and the vendors' forums (if they have one) for any problems people may be running into. I'll only upgrade the crucial things, and only once I know what the source code changes are.
Using apt-get or yum is actually relatively fast and efficient, so I'll stick with that unless you know exactly what you're doing.
ServerTweak Networks, LLC >> ServerTweak.com
Experience the fastest network and superior servers, feel the power of ServerTweak!
Fremont, CA DataCenter | Dedicated Servers | Colocation | Cross Connects HE.net | 1/4 - Full Cab Sales