What do you do if you have a long list of information that you need to de-dupe or sort? You could use a program like Excel, which is not free, has a large footprint on your PC, and cannot handle lists much longer than a million rows.
You could fire up a database, which, unless you already use one, is overkill. Or you could use ‘Text Deduplicator Plus’.
‘Text Deduplicator Plus’ is a small, free program that removes duplicates from lists and performs two other related functions: sorting your lists, and counting the number of instances of each unique value. It is easy to use and can handle extremely large lists.
We tested it on a file with about 1.8 million rows (too many to fit on a single worksheet in Excel). All you need to do is point to the file you want to process, then choose ‘Deduplicate’, ‘Sort Only’, or ‘Count Dupes’. Processing was lightning fast.
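If you are curious what each of the three modes amounts to, here is a small Python sketch of the equivalent operations. This is purely illustrative and assumes a simple in-memory list; it is not the program's own code, which is a standalone Windows tool:

```python
from collections import Counter

# A small sample list standing in for the lines of a text file
lines = ["apple", "banana", "apple", "cherry", "banana", "apple"]

# 'Deduplicate': remove duplicates (the tool also sorts the result by default)
deduped = sorted(set(lines))

# 'Sort Only': sort the list, keeping duplicates
sorted_lines = sorted(lines)

# 'Count Dupes': count the instances of each unique value
counts = Counter(lines)

print(deduped)           # ['apple', 'banana', 'cherry']
print(counts["apple"])   # 3
```

The program does the same kind of thing, but streams through files far too large to paste into a spreadsheet.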
A terrific little program that does the job quickly and without fuss. It is simple and straightforward, and can even run portably without installation.
One thing I wish it did is sort in descending as well as ascending order. Note that the dedupe operation sorts the results for you by default.
Get Text Deduplicator Plus here (Windows).