Processing huge files - Printable Version

+- QB64 Phoenix Edition (https://staging.qb64phoenix.com)
+-- Forum: QB64 Rising (https://staging.qb64phoenix.com/forumdisplay.php?fid=1)
+--- Forum: Code and Stuff (https://staging.qb64phoenix.com/forumdisplay.php?fid=3)
+---- Forum: Utilities (https://staging.qb64phoenix.com/forumdisplay.php?fid=8)
+---- Thread: Processing huge files (/showthread.php?tid=798)
Processing huge files - mdijkens - 08-22-2022

I saw the discussion in MemFile System about fast file reading. That all works fine up to 2 GB files, but I sometimes have to process huge (100GB+) csv files. Therefore I created this reader (2x slower, but unlimited size):

Code:
t! = Timer

RE: Processing huge files - mnrvovrfc - 08-24-2022

(08-22-2022, 05:08 PM)mdijkens Wrote: I sometimes have to process huge (100GB+) csv files.

Seriously, I give up. The thing is that when I try to copy a 4 GB file onto a USB 3.0 external disk, it takes more than half an hour! It seems more onerous on Linux than on Windows. How long does it take you to copy a file that large from one disk to another? Or do you even need to copy it? Thank you for this program, at any rate. :tu:

RE: Processing huge files - mdijkens - 08-24-2022

Yes, processing log files of server clusters. By the way, the code above also provides line- and field-based processing. The csv.field function (also usable in other line-based processing) splits a line into its fields only on the first call with that line, which works pretty fast.

RE: Processing huge files - mdijkens - 08-24-2022

(08-24-2022, 10:13 PM)mnrvovrfc Wrote:
(08-22-2022, 05:08 PM)mdijkens Wrote: I sometimes have to process huge (100GB+) csv files.
Seriously, I give up.

Did not see the yellow text at first... Just copying with this block-based approach is mostly limited by the write speed of the USB stick. I have a relatively fast one that reaches around 250 MB/s, but over PCIe/Thunderbolt this same routine reaches 3 GB/s.
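
The code listing in the opening post is cut off in this export; only its first line (t! = Timer) survives. As a rough sketch of the block-based approach described there, not the thread's actual code: open the file in BINARY mode and read it in fixed-size chunks, so memory use stays bounded and the 2 GB limit of whole-file reads never comes into play. The file name and block size below are placeholders.

Code:
' Minimal sketch of block-based reading (not the thread's original code):
' read the file chunk by chunk so files of any size can be handled.
Const BLOCKSIZE = 4194304 ' 4 MB per chunk; tune as needed

Dim block As String
Dim bytesLeft As _Integer64

t! = Timer
Open "huge.csv" For Binary Access Read As #1 ' placeholder file name
bytesLeft = LOF(1)
Do While bytesLeft > 0
    If bytesLeft < BLOCKSIZE Then block = Space$(bytesLeft) Else block = Space$(BLOCKSIZE)
    Get #1, , block ' fills the string: reads Len(block) bytes sequentially
    bytesLeft = bytesLeft - Len(block)
    ' ... process the chunk here; note a chunk may end mid-line, so a
    ' real reader carries the partial last line over to the next chunk ...
Loop
Close #1
Print Using "####.## seconds"; Timer - t!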
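
The csv.field function itself is likewise not shown in this export, but the behavior described (fields are extracted only on the first call with a given line) maps naturally onto a cached split. A hypothetical stand-in, assuming a plain comma separator and no quoted fields:

Code:
' Hypothetical stand-in for the thread's csv.field: a line is split into
' fields only on the first call that passes it in; later calls for other
' fields of the same line are served from the cached array.
Dim Shared csvFields$(1 To 1024) ' assumed maximum of 1024 fields per line
Dim Shared csvLastLine$
Dim Shared csvFieldCount As Integer

Function csv.field$ (lin$, idx As Integer)
    If lin$ <> csvLastLine$ Then ' first call with this line: parse it once
        csvLastLine$ = lin$
        csvFieldCount = 0
        Dim p As Long, q As Long
        p = 1
        Do
            q = InStr(p, lin$, ",") ' assumption: comma separator, no quoting
            csvFieldCount = csvFieldCount + 1
            If q = 0 Then
                csvFields$(csvFieldCount) = Mid$(lin$, p) ' last field on the line
                Exit Do
            End If
            csvFields$(csvFieldCount) = Mid$(lin$, p, q - p)
            p = q + 1
        Loop
    End If
    If idx >= 1 And idx <= csvFieldCount Then
        csv.field$ = csvFields$(idx)
    Else
        csv.field$ = "" ' out-of-range field index
    End If
End Function

Calling csv.field$(l$, 3) and then csv.field$(l$, 7) on the same l$ parses the line just once; the second call is a plain array lookup, which is why repeated field access stays fast.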
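
For the plain copy case in the last post, the same chunked loop can write each block straight to a destination file, which is why the device's write speed becomes the bottleneck. A sketch under the same assumptions (both paths are placeholders):

Code:
' Sketch of a block-based copy: the same chunked read, with each block
' written straight to the destination file. Throughput is bounded by the
' slower of the two devices; on a USB stick that is usually write speed.
Const COPYBLOCK = 4194304 ' 4 MB chunks

Dim block As String
Dim remaining As _Integer64

Open "huge.csv" For Binary Access Read As #1 ' placeholder source path
Open "E:\huge.csv" For Binary Access Write As #2 ' placeholder destination path
remaining = LOF(1)
Do While remaining > 0
    If remaining < COPYBLOCK Then block = Space$(remaining) Else block = Space$(COPYBLOCK)
    Get #1, , block ' read one chunk from the source
    Put #2, , block ' write it to the destination
    remaining = remaining - Len(block)
Loop
Close #2
Close #1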