-2107390455M130007925125|1107740|DATA - WITHHELD||Sample, Text|(B)|Includes: Data.|Redacted|0|.8|.9|V|FILE-049-06|M12-2107390455MO2156|3/13/2012 12:10:13 AM|470
I've replaced the sensitive information with meaningless placeholder values.

This is one row of my most troublesome file: a 120 MB text file. It's pipe-delimited, as you can see, but the single comma gets treated as a delimiter on load and splits the line. Not every line has one, though.

I'm thinking of having the script check each line for a pre-split before processing it, join the pieces back together with the comma restored, and then run the text-to-columns. I think I have to do the text-to-columns in the script anyway at this point, since anything over roughly a 4 MB file appears to randomly drop data when doing the text-to-columns (on a per-line basis). Really weird.
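Here's a minimal VBA sketch of that idea. It assumes the loaded data sits in column A and that a pre-split row shows up with the remainder of the line pushed into column B; the sub name and column positions are my assumptions, not anything from the file itself:

    Sub RejoinAndSplit()
        Dim ws As Worksheet
        Dim lastRow As Long, r As Long

        Set ws = ActiveSheet
        lastRow = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row

        ' Pass 1: rejoin any row that the comma pre-split across columns A and B.
        For r = 1 To lastRow
            If Len(ws.Cells(r, 2).Value) > 0 Then
                ' Restore the original line: left piece & comma & right piece.
                ws.Cells(r, 1).Value = ws.Cells(r, 1).Value & "," & ws.Cells(r, 2).Value
                ws.Cells(r, 2).ClearContents
            End If
        Next r

        ' Pass 2: split the repaired lines on the pipe only;
        ' comma is explicitly NOT a delimiter here.
        ws.Range(ws.Cells(1, 1), ws.Cells(lastRow, 1)).TextToColumns _
            Destination:=ws.Cells(1, 1), _
            DataType:=xlDelimited, _
            TextQualifier:=xlTextQualifierNone, _
            ConsecutiveDelimiter:=False, _
            Tab:=False, Semicolon:=False, Comma:=False, Space:=False, _
            Other:=True, OtherChar:="|"
    End Sub

If a comma can ever split a line into more than two pieces, the rejoin pass would need to walk across columns B, C, etc. instead of only checking B.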


I'm still kind of interested whether anyone knows a way to change the on-load default delimiter, though it might not matter for this approach.
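On that last point: I don't know of a setting that changes the sticky on-load delimiter itself, but you can sidestep it entirely by opening the file through Workbooks.OpenText and declaring every delimiter yourself. A sketch, with a hypothetical path:

    Sub OpenPipeFile()
        Dim path As String
        path = "C:\data\troublesome.txt" ' hypothetical path - point at the real file

        ' Every delimiter is declared explicitly, so the sticky on-load
        ' default never applies and the lone comma stays inside its field.
        Workbooks.OpenText Filename:=path, _
            DataType:=xlDelimited, _
            TextQualifier:=xlTextQualifierNone, _
            ConsecutiveDelimiter:=False, _
            Tab:=False, Semicolon:=False, Comma:=False, Space:=False, _
            Other:=True, OtherChar:="|"
    End Sub

If that parses the file reliably, it would also make the rejoin pass above unnecessary, since the comma would never be treated as a delimiter in the first place.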