Last week I worked with over 10,000 files of tabular data, each with about 50,000 rows and
10 columns separated by a '|'. This is the kind of problem that falls in the uncanny
valley between small and biggish data. My goal was to make some quick checks and, if
possible, concatenate the files into a single CSV
file that I could load into Python.
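For reference, here is a minimal sketch of what that concatenation step could look like in Python, assuming the files match a hypothetical `data/*.txt` glob and that each one starts with the same header row (both assumptions, not details from the actual dataset). It streams rows with the standard `csv` module rather than loading everything into memory at once:

```python
import csv
import glob

# Hypothetical input pattern and output path -- adjust to the real layout.
input_files = sorted(glob.glob("data/*.txt"))

with open("combined.csv", "w", newline="") as out:
    writer = csv.writer(out)
    header_written = False
    for path in input_files:
        with open(path, newline="") as f:
            reader = csv.reader(f, delimiter="|")   # pipe-delimited input
            header = next(reader)                   # assumed repeated header row
            if not header_written:
                writer.writerow(header)
                header_written = True
            writer.writerows(reader)                # stream the remaining rows
```

With roughly 10,000 files of 50,000 rows each, the appeal of streaming like this is that memory use stays flat no matter how large the combined output grows.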