ssd1306 using I2C on the esp8266

All ESP8266 boards running MicroPython.
Official boards are the Adafruit Huzzah and Feather boards.
Target audience: MicroPython users with an ESP8266 board.
mcauser
Posts: 507
Joined: Mon Jun 15, 2015 8:03 am

Re: ssd1306 using I2C on the esp8266

Post by mcauser » Wed Jun 15, 2016 12:10 pm

@deshipu I'd like to test your modifications to the ssd1306 driver on my WeMos OLED shield (64x48).
I think I am missing a step.

Code:

>>> import ssd1306
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: no module named 'ssd1306'
Do I need to copy/symlink the driver from /drivers/display/ssd1306.py into /esp8266/scripts and make/flash again to make it work?
Is that the same for all scripts in /drivers?

deshipu
Posts: 1388
Joined: Thu May 28, 2015 5:54 pm

Re: ssd1306 using I2C on the esp8266

Post by deshipu » Wed Jun 15, 2016 1:33 pm

mcauser wrote:Do I need to copy/symlink the driver from /drivers/display/ssd1306.py into /esp8266/scripts and make/flash again to make it work?
Is that the same for all scripts in /drivers?
Yes, precisely.

It's the same for some of the stuff from the micropython-lib repository that was included in the official release -- you have to copy or symlink it into scripts.

mcauser
Posts: 507
Joined: Mon Jun 15, 2015 8:03 am

Re: ssd1306 using I2C on the esp8266

Post by mcauser » Wed Jun 15, 2016 2:30 pm

It works!!
I added an extra method for outputting a bitmap.
Feels like it doesn't belong in the SSD1306 class, but rather in a higher-level graphics library that can output shapes, lines, solid blocks, etc.
As mentioned on your PR: https://github.com/micropython/micropython/pull/2187

Code:

def draw_bitmap(self, x, y, bitmap, w, h, col=1):
    # Bitmap is packed row-major, 8 pixels per byte, most significant bit
    # first, with each row padded out to a whole number of bytes.
    byteWidth = (w + 7) // 8
    for j in range(h):
        for i in range(w):
            if i & 7:
                byte <<= 1
            else:
                byte = bitmap[byteWidth * j + i // 8]
            if byte & 0x80:
                self.framebuf.pixel(x + i, y + j, col)
e.g.

Code:

import ssd1306
from machine import I2C, Pin

i2c = I2C(sda=Pin(4), scl=Pin(5))
# 64x48 WeMos OLED shield at I2C address 60 (0x3C)
display = ssd1306.SSD1306_I2C(64, 48, i2c, 60)
display.fill(0)
# mark the four corners
display.pixel(0, 0, 1)
display.pixel(63, 0, 1)
display.pixel(0, 47, 1)
display.pixel(63, 47, 1)
display.text('SSD1306', 4, 2, 1)
display.text('64x48', 12, 11, 1)
display.text('Awesome!', 1, 36, 1)
display.show()
display.invert(True)
# 15x15 smiley, two bytes per row
smiley = [7,192,24,48,32,8,64,4,64,4,255,254,167,154,175,186,156,114,128,2,64,36,67,196,32,8,24,48,7,192]
display.draw_bitmap(24, 20, smiley, 15, 15, 1)
display.show()
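
A quick note on the packing, inferred from the draw_bitmap code above rather than spelled out in the original post: each byte holds 8 pixels, most significant bit first, and each 15-pixel row is padded to (15 + 7) // 8 = 2 bytes, so the smiley array is 15 x 2 = 30 bytes. For example:

Code:

# Decode the first row of the smiley to see which pixels are set (MSB first).
row = [7, 192]                                   # first two bytes = first row
bits = ''.join('{:08b}'.format(b) for b in row)  # '0000011111000000'
print(bits[:15])                                 # pixels 5..9 of row 0 are lit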

deshipu
Posts: 1388
Joined: Thu May 28, 2015 5:54 pm

Re: ssd1306 using I2C on the esp8266

Post by deshipu » Wed Jun 15, 2016 2:47 pm

Awesome, thank you very much for testing!

I replied to your comment on the pull request about the bitmap method -- I think it's better to have that discussion there.

warren
Posts: 74
Joined: Tue Jul 12, 2016 5:47 pm

Re: ssd1306 using I2C on the esp8266

Post by warren » Mon Jul 18, 2016 12:32 pm

Hi Deshipu

I just tried your code sample from this thread on an SSD1306 OLED. I have a file called "ssd1306.py" (the driver from the repository). I pasted your code into a function called oled() using the REPL's paste mode.

This is what I got when I tried to run it:

#######################################

paste mode; Ctrl-C to cancel, Ctrl-D to finish
=== def oled():
===     import ssd1306
===     from machine import I2C, Pin
===     import math
===
===     i2c = I2C(sda=Pin(4), scl=Pin(5))
===     display = ssd1306.SSD1306_I2C(32, i2c, 60)
===     display.fill(0)
===     for x in range(0, 96):
===         display.pixel(x, 16+int(math.sin(x/32*math.pi)*7 + 8), 1)
===     display.show()
>>>
>>> oled()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 7, in oled
  File "ssd1306.py", line 108, in __init__
  File "ssd1306.py", line 32, in __init__
TypeError: unsupported types for : 'I2C', 'int'

####################################

What did I do wrong?

Many thanks

deshipu
Posts: 1388
Joined: Thu May 28, 2015 5:54 pm

Re: ssd1306 using I2C on the esp8266

Post by deshipu » Mon Jul 18, 2016 9:20 pm

There was a change in that driver recently; you now have to specify both the width and the height, not just the height.
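
In other words, against warren's snippet the constructor call would now look something like this (a sketch assuming a 128x32 panel at I2C address 60, i.e. 0x3C; substitute your display's actual values):

Code:

import ssd1306
from machine import I2C, Pin

i2c = I2C(sda=Pin(4), scl=Pin(5))
# Both width and height are required now; 128x32 and address 60 are assumptions.
display = ssd1306.SSD1306_I2C(128, 32, i2c, 60)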

mflmartin
Posts: 43
Joined: Sat Jul 23, 2016 7:30 pm

Re: ssd1306 using I2C on the esp8266

Post by mflmartin » Wed Jul 27, 2016 9:08 pm

[quote="mcauser"]It works!!
I added an extra method for outputting a bitmap.
Feels like it should not belong in class SSD1306, instead a higher graphics library, which can output shapes, lines, solid blocks etc.
As mentioned on your PR: https://github.com/micropython/micropython/pull/2187

[code]
def draw_bitmap(self, x, y, bitmap, w, h, col=1):
byteWidth = (w + 7) // 8
for j in range(h):
for i in range(w):
if i & 7:
byte <<= 1
else:
byte = bitmap[byteWidth * j + i // 8]
if byte & 0x80:
self.framebuf.pixel(x + i, y + j, col)
[/code]

eg.
[code]
import ssd1306
from machine import I2C, Pin
import math
i2c = I2C(sda=Pin(4), scl=Pin(5))
display = ssd1306.SSD1306_I2C(64, 48, i2c, 60)
display.fill(0)
display.pixel(0,0,1)
display.pixel(63,0,1)
display.pixel(0,47,1)
display.pixel(63,47,1)
display.text('SSD1306',4,2,1)
display.text('64x48',12,11,1)
display.text('Awesome!',1,36,1)
display.show()
display.invert(True)
smiley = [7,192,24,48,32,8,64,4,64,4,255,254,167,154,175,186,156,114,128,2,64,36,67,196,32,8,24,48,7,192]
display.draw_bitmap(24, 20, smiley, 15, 15, 1)
display.show()
[/code][/quote]


Nice bitmap function, thanks!

What library or code do you use to convert a PNG into that formatted array?

Thanks,

deshipu
Posts: 1388
Joined: Thu May 28, 2015 5:54 pm

Re: ssd1306 using I2C on the esp8266

Post by deshipu » Thu Jul 28, 2016 7:54 pm

Personally, I just write a short program in pygame that reads the image, iterates over all the pixels, and generates whatever I need.
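
For example, a minimal sketch along those lines (not the exact script; "smiley.png" and the non-black-means-on rule are assumptions) that packs a monochrome PNG into the row-major, MSB-first byte list that draw_bitmap above expects:

Code:

import pygame

# Convert a monochrome PNG into the byte list used by draw_bitmap():
# row-major, 8 pixels per byte, most significant bit first, rows padded
# to whole bytes. "smiley.png" is a placeholder filename.
image = pygame.image.load("smiley.png")
w, h = image.get_size()
byte_width = (w + 7) // 8

data = []
for y in range(h):
    row = [0] * byte_width
    for x in range(w):
        r, g, b, a = image.get_at((x, y))
        if (r, g, b) != (0, 0, 0):          # any non-black pixel counts as "on"
            row[x // 8] |= 0x80 >> (x & 7)  # most significant bit first
    data.extend(row)

print(data)  # paste the resulting list into your MicroPython script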

mflmartin
Posts: 43
Joined: Sat Jul 23, 2016 7:30 pm

Re: ssd1306 using I2C on the esp8266

Post by mflmartin » Fri Jul 29, 2016 10:16 pm

deshipu wrote:Personally I just write a short program in PyGame that reads that image and then iterates over all the pixels and generates whatever I need.
Thanks,

I am looking into pygame, but I am a bit lost; I don't understand all the structures yet.

Could you please share the bit of code that takes an image and converts it into the data format your added function uses to represent the pixels?

How come the values can go from 0 to 255 in each position? I did a bit of research, but I don't understand the data-structure logic of this array containing the smiley:

[7,192,24,48,32,8,64,4,64,4,255,254,167,154,175,186,156,114,128,2,64,36,67,196,32,8,24,48,7,192]

I can see it takes 30 elements for a 15x15 image, but I don't understand exactly how they describe the position of each pixel, and my ignorance is driving me crazy.

Thanks a lot :).

deshipu
Posts: 1388
Joined: Thu May 28, 2015 5:54 pm

Re: ssd1306 using I2C on the esp8266

Post by deshipu » Sat Jul 30, 2016 8:17 am

I didn't save the snippets that I normally use, but here's one that I recently used to generate data for a 2-bit font:

Code:


# Python 2 script: slices a 2-bit greyscale font strip (font.png) into 4x6
# glyph tiles and prints each tile as a tuple of row values, 2 bits per pixel.
import pygame

# Map the four grey levels in the source image to 2-bit values.
colors = {
    (0, 0, 0, 255): 0,
    (102, 102, 102, 255): 1,
    (204, 204, 204, 255): 2,
    (255, 255, 255, 255): 3,
}
image = pygame.image.load("font.png")
images = []

# Cut the strip into 4-pixel-wide, 6-pixel-tall glyph tiles.
for tile_x in range(0, image.get_size()[0] // 4):
    rect = (tile_x * 4, 0, 4, 6)
    images.append(image.subsurface(rect))

# Pack each row of a tile into one integer, 2 bits per pixel.
for image in images:
    print '(%s),' % ', '.join('%d' %
        sum(colors[tuple(image.get_at((x, y)))] << (x * 2)
            for x in range(4))
        for y in range(6))
