I have run this code on an ESP8266 and an ESP32 and observe the same behaviour. I have also created a test file in Python 3 using the exact same method, and it produces the same incorrect image. I will attach examples of these images below: the first is the original image and the second is the reconstructed image.
You will also notice a small black line in the top left of the reconstructed image which is not present in the original. This may be related to the number of received bytes being 62 more than the original file, but even discarding the first 62 bytes does not change anything.
I will now attach the code which I have loaded onto my boards.
Code: Select all
url = 'http://192.168.1.103/hello/'

import socket
from time import sleep_ms            # utime.sleep_ms on some ports
from micropython import const

# e is the e-paper display driver instance, initialised elsewhere
_, _, host, path = url.split('/', 3)
addr = socket.getaddrinfo(host, 105)[0][-1]   # note: 80 is the default HTTP port
s = socket.socket()
s.connect(addr)
s.send(bytes('GET /%s\r\n\r\n' % path, 'utf8'))

e._command(const(0x13))              # start data transmission
while True:
    data = s.recv(1000)
    if data:
        for i in range(len(data)):
            e._data(~data[i])        # invert each byte for the display
    else:
        break
e._command(const(0x12))              # refresh the display
sleep_ms(100)
e.wait_until_idle()
I do not believe the problems are due to the display, as I have been able to reproduce these results independently in Python 3, using Pillow to generate the second image above. So it must be due to the way I am programming the socket (or the API).
When running my test in Python 3, I have observed that if I receive all the bytes into a BytesIO object first, the generated file appears exactly as the original (first image above). Due to RAM limitations this is not possible on the ESP8266, and I am also having RAM issues on the ESP32 (see my other post viewtopic.php?f=18&t=11816).
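As a quick sanity check, byte-wise inversion is positionally independent, so processing the stream in chunks must give the same output as buffering everything first; if the results differ, the discrepancy has to come from the bytes received, not the chunking. A pure Python 3 sketch (no hardware, `invert_stream` is a hypothetical helper name; note `b ^ 0xFF` rather than `~b`, which goes negative in Python):

```python
def invert_stream(chunks):
    """Invert every byte of an iterable of byte chunks, as the recv loop does."""
    out = bytearray()
    for data in chunks:
        for b in data:
            out.append(b ^ 0xFF)     # 8-bit inversion, same low byte as ~b
    return bytes(out)

img = bytes(range(256)) * 4          # stand-in image payload
whole = invert_stream([img])
chunked = invert_stream([img[i:i + 100] for i in range(0, len(img), 100)])
```

If `whole == chunked` holds (it should), the streaming loop itself is not what corrupts the image.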
If anyone has any recommendations I would greatly appreciate them; I would like to see this code working. I am also curious how the image would look if I used a FrameBuffer and sent the bytes from that instead, but I do not have enough RAM to initialise the 48000-byte FrameBuffer needed for the 800x480 display.
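For reference, the 48000-byte figure follows from 1 bit per pixel in a monochrome framebuffer format (e.g. framebuf.MONO_HLSB); a tiny sketch of the arithmetic, with a hypothetical helper name:

```python
def mono_fb_size(width, height):
    """Bytes needed for a 1-bit-per-pixel framebuffer (8 pixels per byte)."""
    return width * height // 8

size = mono_fb_size(800, 480)
```

Anything less than that much free contiguous RAM rules out a full-screen FrameBuffer, though a partial buffer covering a band of rows would shrink proportionally.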