01-21-2006, 10:45 PM   #1
Newbie
Join Date: Dec 2005
Posts: 26
sort -fu not working on large files
I have text files of about 120MB each that I am trying to sort for unique entries.
The text files have one entry per line.
The end result is usually a seemingly never-ending process, or, when it does finish, the destination file it writes the sorted uniques to is 0 bytes and empty.
The machine I am trying it on is a P4 3.0, 1GB RAM.
The command I use is: sort -fu sourcefile > destfile
What is causing this to fail? Do I need more RAM? Why does it apparently finish sorting and then leave a 0-byte file?
The final file should be in the neighborhood of 2-10MB in size.
Is there a better solution for sorting out uniques?
I'm here to spew the truth.
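One way to check whether sort is silently failing partway through, rather than finishing with an empty result (a minimal sketch, assuming GNU sort on Linux; the file names are the ones from the post):

sort -fu sourcefile > destfile
echo "sort exit status: $?"   # non-zero means sort aborted (often out of temp space), not that it finished
ls -lh destfile               # a 0-byte file plus a non-zero status points at a failed run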
01-21-2006, 11:14 PM   #2
Web Hosting Master
Join Date: Jul 2003
Location: Texas
Posts: 787
Try this.
cat foo.txt | sort -n | uniq >> foosorted.txt
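Note that -n sorts numerically, while the original command used -f for case-insensitive sorting. If case folding is what you're after, something closer to the original intent might be (a sketch, assuming GNU sort and uniq; file names are placeholders):

sort -f sourcefile | uniq -i > destfile   # -f folds case while sorting, uniq -i ignores case when removing duplicates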
Also make sure your /tmp slice is not full, as sort/uniq will use it for scratch space during the process.
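To confirm whether temp space is the culprit, and to point sort at a roomier scratch directory if it is (a sketch assuming GNU coreutils; /home/you/tmp is only an example path):

df -h /tmp                                        # check free space on the partition sort uses for its scratch files
mkdir -p /home/you/tmp                            # any filesystem with a few hundred MB free will do
sort -fu -T /home/you/tmp sourcefile > destfile   # -T (--temporary-directory) moves sort's temp files there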
Thanks,
Jeremy