Bitbanging DVI only possible on RP2040?
Posted: Wed Feb 10, 2021 7:49 am
Hi everyone,
I have recently seen this post (https://github.com/Wren6991/picodvi) about bitbanging a DVI signal using the new RP2040 chip and had some questions about it.
Firstly, is this something that would be possible on any device (I'm particularly interested in the ESP32), or does it appear to be tied directly to the PIO features on the RP2040?
Secondly, I am assuming that if it uses the PIO, this code would have to be written outside of MicroPython. If so, is it possible to interface that code with MicroPython (I assume yes), and would there be any speed implications?
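For what it's worth, here is a minimal sketch of how MicroPython's rp2 port exposes PIO from Python: the PIO assembly is written with the @rp2.asm_pio decorator and run on a StateMachine, so at least simple PIO programs don't have to leave Python at all. This is just the standard blink-style example, nothing from picodvi itself, and the pin and frequency are placeholders.

```python
import rp2
from machine import Pin

# A trivial PIO program: toggle the 'set' pin high then low, with a
# one-cycle delay on each instruction. Picodvi's serializer is far more
# involved, but the same decorator/StateMachine mechanism applies.
@rp2.asm_pio(set_init=rp2.PIO.OUT_LOW)
def blink():
    set(pins, 1) [1]
    set(pins, 0) [1]

# Run the program on state machine 0 at 2 kHz, driving the onboard LED pin.
sm = rp2.StateMachine(0, blink, freq=2000, set_base=Pin(25))
sm.active(1)
```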
As a final point, I assume the images are generated on the fly, since a display of that size would not appear to fit a full framebuffer in the 256 KB of RAM available.
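Rough arithmetic behind that assumption, assuming a 640x480 mode (the RP2040 actually has about 264 KB of SRAM, which doesn't change the conclusion):

```python
# Back-of-the-envelope framebuffer sizes for an assumed 640x480 mode.
WIDTH, HEIGHT = 640, 480
SRAM_KIB = 264  # RP2040 on-chip SRAM in KiB

rgb565   = WIDTH * HEIGHT * 2 / 1024            # 16 bpp: 600 KiB, does not fit
paletted = WIDTH * HEIGHT * 1 / 1024            # 8 bpp: 300 KiB, still does not fit
quarter  = (WIDTH // 2) * (HEIGHT // 2) / 1024  # 320x240 at 8 bpp: 75 KiB, fits

for name, kib in [("640x480 RGB565", rgb565),
                  ("640x480 8-bit", paletted),
                  ("320x240 8-bit", quarter)]:
    print(f"{name}: {kib:.0f} KiB of {SRAM_KIB} KiB SRAM")
```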