I have a data logger based on a PyBoard 1.1 where data is written in CSV format to a FAT32 SD card at intervals ranging from 0.1 s (10 Hz) up to 24 hours.
Here is what I have noticed when trying different ways to manage the log files:
1. If you create a new file, say every minute, hour or day, then as the number of files on the SD card grows, file creation and access times start increasing, which becomes a problem when writing at 10 Hz: you start to see data missing between closing the old file and creating the new one. I gave up on this and moved to option 2 below, using just a single file;
2. If you use just a single CSV file (which works up to 4 GB on FAT32), then as this file grows in size (at around 500 MB) the file access time starts increasing, which again becomes a problem when writing at 10 Hz.
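For context, the single-file approach in option 2 looks roughly like this (a minimal sketch, not my actual code; `datalog.csv`, `log_samples` and the flush interval are illustrative names, and it runs on CPython as well as MicroPython):

```python
import os

LOG_PATH = "datalog.csv"   # illustrative file name
FLUSH_EVERY = 10           # flush once per second when sampling at 10 Hz

def log_samples(samples, path=LOG_PATH, flush_every=FLUSH_EVERY):
    """Append CSV rows to a single log file (the option-2 approach)."""
    with open(path, "a") as f:
        for i, row in enumerate(samples, 1):
            f.write(",".join(str(v) for v in row) + "\n")
            if i % flush_every == 0:
                f.flush()          # push Python's buffer down to the VFS
                if hasattr(os, "sync"):
                    os.sync()      # force data out to the card (uos.sync() on MicroPython)
```

Flushing every N rows rather than every row keeps the number of card writes down, but it is exactly these appends that slow down once the file passes a few hundred MB.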
The file system info for one of the Samsung 64GB EVO SD cards is as follows:
Bytes per sector: 512
Sectors per cluster: 64
Sectors of root directory: unlimited
Reserved sectors: 32
Sectors per FAT: 15259
First FAT sector: 32
First data sector: 30550
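From those numbers the derived geometry can be sanity-checked with plain arithmetic (assuming the standard two FAT copies, which the card itself does not report above):

```python
# FAT32 geometry reported above, as constants.
BYTES_PER_SECTOR = 512
SECTORS_PER_CLUSTER = 64
RESERVED_SECTORS = 32
SECTORS_PER_FAT = 15259
NUM_FATS = 2  # assumption: the usual two FAT copies

# Each cluster is 512 * 64 bytes.
cluster_bytes = BYTES_PER_SECTOR * SECTORS_PER_CLUSTER             # 32768 bytes = 32 KiB

# Data area starts after the reserved sectors and both FATs.
first_data_sector = RESERVED_SECTORS + NUM_FATS * SECTORS_PER_FAT  # 32 + 2*15259 = 30550
```

The computed first data sector (30550) matches the value the card reports, which is consistent with the two-FAT assumption, and shows the card is formatted with 32 KiB clusters.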
What I would like to find out is the best way to manage these problems and the data-logger files on FAT32.
Any comments on this would be appreciated.
