Long delays in handling "large" datasets
Posted: Thu Mar 26, 2020 7:12 pm
Hello everyone,
I'm using an ESP32 to transmit several kB worth of sensor data via WiFi. I store these in a text file where one line represents a sample.
I need to push a fresh data vector onto the file while popping the oldest one. My solution for this works, but it is really slow: adding one new element takes about 10 s.
Does anyone have suggestions on how to improve the performance of the script? Or is there a better way of doing this task?
Sorry for the dumb question; I'm really not used to thinking much about system resources when it comes to regular Python.
Thanks in advance for your help!
The log file looks like this, one line per sample:
Code:
...
22.5,42.0,1002.2,8.3
22.5,42.3,1002.2,8.3
22.5,42.5,1002.2,128.3
...
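Each line is four comma-separated readings. For illustration, a new sample line could be built like this (the field names are my guesses; the post does not label the columns):

```python
def format_sample(temp, hum, pressure, extra):
    # One CSV line per sample, matching the log format above.
    # Field names are hypothetical -- the columns are unlabeled in the post.
    return "{:.1f},{:.1f},{:.1f},{:.1f}".format(temp, hum, pressure, extra)

# format_sample(22.5, 42.0, 1002.2, 8.3) -> "22.5,42.0,1002.2,8.3"
```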
I basically copy the lines over to a new file and destroy the "donor" file at the end:
Code:
import os

def HandleMeasBuffer(newData):
    print("start transfer")
    foundFiles = os.listdir()
    # Alternate between two log files: read the existing one,
    # write the rotated copy to the other.
    if "log_A.txt" in foundFiles:
        log_A = True
        file1 = open("log_A.txt", 'r')
        file2 = open("log_B.txt", 'w')
    else:
        log_A = False
        file1 = open("log_B.txt", 'r')
        file2 = open("log_A.txt", 'w')
    for num, line in enumerate(file1):
        if num > 0:  # skip the first, i.e. oldest, line
            file2.write(line)
    file2.write(newData + "\n")  # copied lines keep their trailing newline
    file1.close()
    file2.close()
    # Remove the donor file so the next call finds only one log
    if log_A:
        os.remove("log_A.txt")
    else:
        os.remove("log_B.txt")
    print("end transfer")