 Deleting duplicate lines

The output from a program gives me a file with some duplicate entries. Some
of the entries are one line, others are two lines. All end with the equal
sign (=) character.
Ex:

BTV UA /OV PLB /FL UNK /TP C172 /TURB LGT TO MDT=
MPV UA /OV MPV /FL 085 /TP PA31 /TURB NIL
        /ICG MDT /RM VSBY 4=
MPV UA /OV MPV /FL 085 /TP PA31 /TURB NIL
        /ICG MDT /RM VSBY 4=
BOS UA /OV BOS /FL LNDG /TP B737 /RM WND SHR 20KTS ON FINAL=

I need to eliminate all the duplicates so that I have only one of each
entry. I've tried with sort and sed and can't come up with an answer.
Please help. Thanks.



 Thu, 18 Mar 1999 03:00:00 GMT   
 Deleting duplicate lines

In article <01bbae4a$be0915e0$cace1...@axess.axess.com>,
[snip]

Pipe your output through uniq. That will remove all duplicate lines.
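
For example (the program and file names here are only placeholders):

# remove adjacent duplicate lines from the program's output
someprog | uniq > output.uniq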

-Mort



 Fri, 19 Mar 1999 03:00:00 GMT   
 Deleting duplicate lines

In article <DyK11o....@midway.uchicago.edu>,
David 'Mort' Mortman <m...@rainbow.uchicago.edu> wrote:

     uniq only removes duplicate lines that directly follow
     one another. uniq will not remove a duplicate of line 3
     that might be at line 6.
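
A quick illustration:

# uniq only compares adjacent lines, so the second "a" is kept
printf 'a\nb\na\n' | uniq

prints "a", "b", "a".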

         richk



 Fri, 19 Mar 1999 03:00:00 GMT   
 Deleting duplicate lines

: From: ri...@panix.com (Rich Kus)
: Newsgroups: comp.unix.shell
: Date: 30 Sep 1996 23:05:39 -0400
: Organization: Panix
: Lines: 30
:

: >>I need to eliminate all the duplicates so that I have one only of each
: >>entry. I've tried with sort and sed and can't come up with an answer.
: >>Please help. Merci
: >>
: >
: >pipe your output through uniq. that will remove all duplicate lines.
: >

That's where sort comes in.  (Assuming that whatever depends on that
output doesn't mind a little asciibetical sorting.)
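
For instance (the file names are only examples):

# sort first so duplicates become adjacent, then let uniq drop them
sort pireps.txt | uniq > pireps.uniq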

-fil
___________________________________________________________________________
Fil Krohnengold      |  Only two rules you need to know about life.  
f...@amnh.org         |  Rule #1: Don't sweat the small stuff.  
f...@paul.rutgers.edu |  Rule #2: Everything's small stuff.



 Sat, 20 Mar 1999 03:00:00 GMT   
 Deleting duplicate lines

In article <FIL.96Oct1151...@john.rutgers.edu>, f...@john.rutgers.edu (FiL)
wrote:

Actually, you don't even need "uniq". You can use the "-u" option of sort:

sort -u -o myfile myfile

This will take "myfile", sort the lines, and remove duplicates. If you need
to keep the lines in the order they appeared in the file, then you will be
best off using Perl. Two lines must match exactly for this to work. If you
have extra spaces at the end of one line (or some other problem like that),
you may have to pipe the file through sed or tr and then pipe the output to
"sort -u".
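
For example, a rough sketch along those lines (the output file name is made
up):

# strip trailing blanks first, then sort and dedupe (output name is just an example)
sed 's/ *$//' myfile | sort -u > myfile.dedup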

--
David Weintraub                    _/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/
Deutsche Morgen Grenfell          _/                                      _/
dw...@dbna.com                   _/    I AM THE GREAT AND POWERFUL OZ*   _/
dav...@cnj.digex.net            _/                                      _/
                               _/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/
                          *(Pay no attention to the man behind the curtains)



 Sun, 21 Mar 1999 03:00:00 GMT   
 Deleting duplicate lines

In article <dhw-0210960028250...@192.0.2.1>, d...@telerate.com (David and Rachel Weintraub) writes:
|> Actually, you don't even need "uniq". You can use the "-u" option of sort:
|>
|> sort -u -o myfile myfile
|>
|> This will take "myfile", sort the lines, and remove duplicates. If you need
|> to keep the lines in the order they appeared in the file, then you will be
|> best off using Perl.
[snip]

With Perl, it's very simple:

#! /usr/local/bin/perl
# Print each input line only the first time it is seen, preserving order.
while (<>)
{
   print unless $seen{$_}++;
}

Otherwise you may use the following shell script:

#! /bin/sh
# Dedupe while keeping the original order: number the lines, sort by text,
# drop duplicates, restore the order, then strip the numbers.
[ $# -lt 1 ] && set -- -
cat -n "$@" | sort -b +1 | uniq -1 | sort -n | cut -f2-
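
For example, if you save the shell version as nodups.sh (the script and file
names here are only examples), you can run:

sh nodups.sh report.txt > report.uniq   # names are examples

and the Perl version can be used the same way.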

kf

--



 Sun, 21 Mar 1999 03:00:00 GMT   
 
