Hello,

I would like to get ideas on how to remove duplicates from an extremely large Excel file. The file is updated regularly, and cleaning it of duplicates after each update is arduous.

There are over 229,000 rows. I have tried various macros, and have also parsed out the data, but I keep hitting memory errors because of the file's size.

Here is an example of what the data consists of:

PLB,TOL,ZMM,DAT,LD2,FID,NC0,REC,UMT
NC0,REC,ZMM,UMT,DAT,PLB,FID,TOL,LD2
NC0,PLB,DAT,REC,TOL,CB5,LD2,FID
CB5,PLB,DAT,NC0,LD2,REC,TOL,FID
REC,PLB,TOL,NC0,FID,CB5,LD2,DAT
REC,NC0,LD2,PLB,DAT,TOL,FID
TOL,REC,NC0,LD2,PLB,UD2,DAT,CB5,FID
NC0,FID,LD2,MC5,REC,TOL,PLB,TMS,DAT
LD2,CB5,TOL,FID,REC,DAT,NC0

Each row above starts out as a single comma-separated string in Column A.

Any ideas on how to make this cleanup manageable would be great.
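
For instance, would something along these lines be viable? This is only a rough Python sketch: "data.csv" stands in for a CSV export of the sheet, and I am assuming that rows with the same codes in a different order count as duplicates, though the key could just as well be the exact line.

seen = set()

with open("data.csv") as src, open("deduped.csv", "w") as dst:
    for line in src:
        codes = line.strip().split(",")
        # Treat rows with the same codes in any order as duplicates;
        # use tuple(codes) instead if only identical rows should match.
        key = tuple(sorted(codes))
        if key not in seen:
            seen.add(key)
            dst.write(line)

A set of 229,000 short tuples fits comfortably in memory, so I would hope this sidesteps the memory errors the macros run into, but perhaps there is a way to do it inside Excel that I am missing.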

Thanks,

Larry