Help writing and reading files
I am trying to save an array to a file so I can read it in later and have the same data available. I have been trying to do it with the struct library.
I am doing the following:
from struct import *
f = open('raw-dat', 'r')
for i in range(0, 220):
    b = unpack('f', f.readline())
    c = b[0]
    print(c)
It will print several values and then I keep getting the following error:
>>> read_raw()
1.384608
2.123945
1.913058
2.104977
1.69894
3.47634
9.593949
6.8901
9.207446
10.51072
9.278038
8.956342
9.954795
4.594289
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 6, in read_raw
ValueError: buffer too small
What am I missing?
I wrote the file with the following:
def write_raw():
    global raw
    f = open('raw.dat', 'w')
    for i in range(0, len(raw)):
        a = raw[i]
        b = pack('f', a)
        f.write(b)
        f.write('\n')
    f.close()
Re: Help writing and reading files
Could it be that you reached the end of the file, in which case readline() returns an empty string? Since unpack of "f" requires 4 bytes, the buffer, having a length of 0, is too small.
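A minimal demonstration of that failure mode, using CPython's struct module (MicroPython's ustruct behaves the same way, though the exact error message differs):

```python
import struct

# unpack('f') requires a buffer of exactly 4 bytes.
print(len(struct.pack('f', 1.5)))  # 4 bytes per packed float

# An empty buffer -- which is what readline() returns at end of file --
# is too small, so unpack raises an error.
try:
    struct.unpack('f', b'')
except Exception as e:
    print('unpack failed:', e)
```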
- pythoncoder
- Posts: 5956
- Joined: Fri Jul 18, 2014 8:01 am
- Location: UK
- Contact:
There are easier ways
Have you looked at the Python pickle module, or alternatively ujson? These offer much simpler object serialisation. This example creates an array, saves it to a file, reads it back and prints it:
Code: Select all
from array import array
import pickle

a = array('f', range(10))
with open('test', 'w') as f:
    pickle.dump(tuple(a), f)
with open('test', 'r') as f:
    z = array('f', pickle.load(f))
print(z)
Python objects of arbitrary complexity can be serialised this way. Alas, the MicroPython implementation of pickle doesn't seem to accept an array instance, hence the conversion to a tuple. If you were using a list, dict, set or some convoluted combination of these, no conversion would be required:
Code: Select all
import pickle

a = {'one': [1, 2, 3], 'two': [4, 5, 6], 'three': {7, 8, 9}}
with open('test', 'w') as f:
    pickle.dump(a, f)
with open('test', 'r') as f:
    z = pickle.load(f)
print(z)
Peter Hinch
Index to my micropython libraries.
-
- Posts: 847
- Joined: Mon Nov 20, 2017 10:18 am
Re: Help writing and reading files
First thing that I noticed is that you're opening the file to read in text format, but then trying to unpack binary data.
your code: f = open('raw-dat', 'r')
for reading binary data: f = open('raw-dat', 'rb')
I am new to programming, but it is my understanding that binary data doesn't have carriage returns; only text files have those.
So I am not sure if this is possible: b = unpack('f', f.readline())
I think it needs to be more like this: b = unpack('f', f.read(4))[0]
Then you should be able to just print(b)
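A sketch of that approach: write each float as 4 packed bytes with no newline separators, then read the file back 4 bytes at a time in binary mode (the file name floats.bin is just for illustration):

```python
from struct import pack, unpack

vals = [1.5, 2.25, 3.125]  # exactly representable as 32-bit floats

# Write each float as 4 packed bytes -- note 'wb', and no '\n' separators.
with open('floats.bin', 'wb') as f:
    for v in vals:
        f.write(pack('f', v))

# Read back in binary mode, 4 bytes per float, stopping at end of file.
out = []
with open('floats.bin', 'rb') as f:
    while True:
        chunk = f.read(4)
        if len(chunk) < 4:
            break
        out.append(unpack('f', chunk)[0])

print(out)  # [1.5, 2.25, 3.125]
```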
Re: Help writing and reading files
Thanks for all the replies. I tried the pickle solution, but the response I get when I try to import pickle:
>>> import pickle
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: no module named 'pickle'
>>>
pickling doesn't seem to be part of MicroPython. I know I am not hitting the end of the file, because the file has 221 floating-point values in it and I am getting the error on the 14th conversion. One time it made 132 conversions before failing. It does convert some of the values OK, but something is tripping it up.
I don't understand the JSON module at all. Any help there would also be helpful.
Thanks again. I will next try the 'rb' suggestion.
Re: Help writing and reading files
With your approach, the hint of @OutoftheBOTS_ seems more appropriate: use plain text files. You can write your float to the file with f.write("%f\n" % float_number), read it back with s = f.readline(), and convert that back to a float with float(s).
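A minimal sketch of that text round trip (the file name raw.txt is just for illustration; note that "%f" keeps six decimal places, so very precise values may lose digits):

```python
vals = [1.5, 2.25, 10.51072]

# Write one float per line as text.
with open('raw.txt', 'w') as f:
    for v in vals:
        f.write("%f\n" % v)

# Read the lines back and convert each one to a float.
back = []
with open('raw.txt', 'r') as f:
    for line in f:
        back.append(float(line))

print(back)  # [1.5, 2.25, 10.51072]
```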
Re: Help writing and reading files
Thanks, roberthh! Here is what I ended up doing that works well.
Code: Select all
def write_raw():
    global raw
    f = open('raw.dat', 'w')
    for i in range(0, len(raw)):
        a = raw[i]
        a = str(a)
        a = a + '\n'
        f.write(a)
    f.close()

import array
raw1 = array.array('f', range(222))

def read_raw():
    global raw1
    f = open('raw.dat', 'r')
    for i in range(0, 222):
        a = f.readline()
        a = float(a)
        raw1[i] = a
        print(a)
    f.close()
Re: Help writing and reading files
Glad to see that you got it working.
There are two ways to store data in a file: text and binary. Text is human-readable but uses more space in the file to store the same information; also, converting data to a string and back is slower, with more potential for error, than just packing it into binary data (dumping it from memory to a byte array).
If you're interested in using binary data, something like this should work; it is untested and just typed out:
Code: Select all
from ustruct import pack, unpack

def write_raw():
    global raw
    f = open('raw.dat', 'wb')
    formatting = 'f' * len(raw)
    f.write(pack(formatting, *raw))
    f.close()

import array
raw1 = array.array('f', range(222))

def read_raw():
    global raw1
    f = open('raw.dat', 'rb')
    formatting = 'f' * 222
    raw1 = unpack(formatting, f.read(222 * 4))
    print(raw1)
    f.close()
- pythoncoder
JSON and pickle
The pickle module is part of micropython-lib and needs to be installed on your hardware. The code is tiny.
ujson is part of MicroPython and I'd look at the Python docs for guidance.
In contrast to the other people who've commented in this thread I think pickle and JSON are much simpler ways of saving data to disk than using binary files. JSON and pickle are very similar and are easy to use; if you're unsure about library files use JSON. Both convert arbitrary objects to and from text and take care of all the conversion. When I first learned Python and discovered pickle I was astounded that the language made simple something which was such a hassle in every language I'd used previously. If you have some data object which you want to save and restore, and you make a change to its structure, the file handling requires no changes whatsoever. This is good.
The similarity is such that to use ujson in place of pickle I merely did a search/replace:
Code: Select all
from array import array
import ujson
a = array('f', range(10))
with open('test', 'w') as f:
ujson.dump(tuple(a), f)
with open('test', 'r') as f:
z = array('f', ujson.load(f))
print(z)
JSON supports a more limited range of Python data types whereas pickle supports all native data types. JSON has the benefit of being a standard, so JSON files may easily be accessed by other languages. Note that arrays are not a native data type so need to be converted to tuples when using either module as in the sample above.
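One caveat worth knowing, shown here with CPython's json module (ujson mirrors this behaviour): JSON has no tuple type, so a tuple serialised this way comes back as a list, which is why the load side rebuilds the array from whatever load returns:

```python
import json

a = (1.0, 2.5, 3.25)
s = json.dumps(a)        # tuples are encoded as JSON arrays: "[1.0, 2.5, 3.25]"
z = json.loads(s)
print(type(z).__name__)  # list -- not tuple
print(z)                 # [1.0, 2.5, 3.25]
```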
Peter Hinch
Index to my micropython libraries.
Re: Help writing and reading files
@pythoncoder
It has been on my list to play with pickle; I just haven't used it yet. I am comfortable with struct, as I had to learn it to pack data to be sent via serial to other chips' registers.
If you write to a file with pickle, can you open the file with a text editor and is it human-readable? Obviously binary isn't.