https://docs.micropython.org/en/latest/ ... conversion
On the other hand, Random Nerd Tutorials lists a different set of ranges for the same attenuation levels:
https://randomnerdtutorials.com/esp32-e ... cropython/
ESP32 MicroPython docs:
ADC.ATTN_11DB: 11dB attenuation (150mV - 2450mV)
Random Nerd Tutorials:
ADC.ATTN_11DB — the full range voltage: 3.3V
My tests suggest Random Nerd Tutorials is closer to correct, though the results are still a bit off. My supply measures 3.29V, only 0.3% below 3.3V, but my ADC readout is 0.16V lower than what my Fluke meter shows. I don't think that discrepancy comes from the 2.45V vs. 3.3V full-range question.
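To make that comparison concrete, here is a minimal test sketch (my assumptions: the v1.17-era ESP32 MicroPython API, input on GPIO34; adjust the pin to your wiring). It prints the same raw reading scaled by both candidate full-range voltages so each can be checked against the Fluke:

from machine import ADC, Pin

adc = ADC(Pin(34))            # GPIO34 is an input-only ADC1 pin
adc.atten(ADC.ATTN_11DB)      # 11dB attenuation for the widest input range
adc.width(ADC.WIDTH_12BIT)    # raw counts 0..4095

raw = adc.read()
print("raw:", raw)
print("scaled by 2.45V full range: %.3f V" % (raw / 4095 * 2.45))
print("scaled by 3.3V full range:  %.3f V" % (raw / 4095 * 3.3))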
So I read further in the ESP32 MicroPython docs, which describe ADC.read_uv():

"This method uses the known characteristics of the ADC and per-package eFuse values - set during manufacture - to return a calibrated input voltage (before attenuation) in microvolts. The returned value has only millivolt resolution (i.e., will always be a multiple of 1000 microvolts).

The calibration is only valid across the linear range of the ADC. In particular, an input tied to ground will read as a value above 0 microvolts. Within the linear range, however, more accurate and consistent results will be obtained than using read_u16() and scaling the result with a constant."

But ADC.read_uv() doesn't exist in the 1.17 firmware. I tried 1.18; it's not there either. I wonder if this is just a documentation issue, maybe copy-pasted from the pyboard docs. The constructor class ADC(pin, *, atten) also doesn't work; it only accepts one parameter. So, under the current scenario, is there any way to read voltages more accurately? Thanks.
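In the meantime, a defensive sketch along these lines (the pin and the 3.3V scaling constant are my assumptions) probes for read_uv() at runtime and falls back to constant scaling on firmware that lacks it:

from machine import ADC, Pin

adc = ADC(Pin(34))
adc.atten(ADC.ATTN_11DB)

if hasattr(adc, "read_uv"):
    # Calibrated path: firmware with eFuse-based correction
    volts = adc.read_uv() / 1000000
else:
    # Fallback: uncalibrated constant scaling of the raw 12-bit reading
    volts = adc.read() / 4095 * 3.3
print("input: %.3f V" % volts)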
Here is another reference from SparkFun mentioning a 0.2V discrepancy. My issue isn't the 0.2V itself. I have a 510K/100K voltage divider to read 12V battery voltages, and that scaling multiplies the error by the divider ratio: 0.2V x (510 + 100)/100 = 0.2V x 6.1, which is over 1V of difference at the battery. I wish I could make more accurate readings using the factory calibration.
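Lacking access to the factory calibration, one workaround is a DIY two-point calibration: apply two known battery voltages, record the raw counts each produces with the meter as reference, and fit a line through the two points. A sketch under those assumptions (the pin number and the RAW_*/V_* values are placeholders for your own measurements):

from machine import ADC, Pin

adc = ADC(Pin(34))            # use whichever pin your divider feeds
adc.atten(ADC.ATTN_11DB)

# Two calibration points: raw ADC counts vs. battery volts from the Fluke.
# These numbers are placeholders; substitute your own measurements.
RAW_LO, V_LO = 2240, 11.00
RAW_HI, V_HI = 2650, 13.00

def battery_volts(raw):
    # Linear fit through the two points; this absorbs the ADC gain/offset
    # error and the 510K/100K divider tolerance in one step.
    return V_LO + (raw - RAW_LO) * (V_HI - V_LO) / (RAW_HI - RAW_LO)

raw = sum(adc.read() for _ in range(16)) // 16   # average 16 samples for noise
print("battery: %.2f V" % battery_volts(raw))

Since the calibration points bracket the battery's normal range, this stays within the ADC's linear region and sidesteps the absolute accuracy question entirely.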