First of all, regarding "new" and "old" versions. I would like to start by resetting the baseline: new versions of most software don't bring any useful changes. Mostly, they bring flip-flop style changes and features not really needed by users. There are purely marketing reasons behind this, and certainly, there's nothing terribly wrong with it, except that some software should be developed in a different way. Regarding MicroPython specifically, it just implements the "Python" language, so there's only so much you can add to it (regarding the core language). And some argue that since 1.5.2, nothing unavoidably useful was added to the core Python language either (if you disagree, then just make sure that you use, or at least know, all the features which were in all the various versions). The standard library is another matter - CPython showed that it can grow uncontrollably. But that also points at the obvious solution - keep the "standard library" at an absolute minimum, and let all the rest be 3rd-party, community modules.
Well, I agree that it is a noble aim to make and keep MicroPython as slim as possible. However, other projects have taught me that there is a trade-off between size, speed and convenience. Sure, you can try to make much of the functionality modular, and you can try to offer smaller and bigger implementations of a certain feature (like string operations). Meanwhile, I simply guess that many of the later users of MicroPython will be starters (Arduino-like starters), and they might prefer an overall out-of-the-box experience. It would be hard to explain to them that they need to compile MicroPython first with a certain set of options to get support for feature A (or to copy some files somewhere), and to recompile (or delete some files and copy over some others) to get feature B. Most users simply want it to work.
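For what it's worth, this kind of compile-time feature selection is exactly how MicroPython ports are configured today: preprocessor flags in a port's `mpconfigport.h`. A rough sketch of what such a configuration looks like (macro names as found in MicroPython's `mpconfig.h`; exact availability varies by port and version, so treat this as illustrative):

```c
/* mpconfigport.h (excerpt) - build-time feature selection.
 * Setting a macro to 0 removes that feature from the firmware,
 * saving flash and RAM at the cost of functionality. */
#define MICROPY_ENABLE_GC               (1)  /* keep the garbage collector */
#define MICROPY_PY_BUILTINS_COMPLEX     (0)  /* drop complex numbers to save flash */
#define MICROPY_PY_BUILTINS_STR_UNICODE (0)  /* bytes-oriented strings, smaller code */
#define MICROPY_FLOAT_IMPL (MICROPY_FLOAT_IMPL_NONE)  /* no floating point at all */
```

This is exactly the recompile-per-feature workflow described above: powerful for developers, but a real hurdle for the Arduino-style beginner.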
It would not be my first time trying to squeeze a certain (growing over time) firmware into a flash memory, trying to shave off bits and bytes wherever I can and continuously thinking... "why didn't I just use the next larger version?"
In some way it might be comparable to the Linux kernel. Sure, you can get a very slim kernel with a small set of modules specially tailored towards your system (or no modules at all). But I guess most users and distros just use the "one-size-fits-all" kernels. They are bigger, they need more resources, but they work from the first second without much hassle.
That was the reason I suggested leaving some headroom and not trying to get MicroPython running on too-small systems, even if it would (from a developer, researcher, academic, etc. point of view) be very interesting and certainly work. As you are a dev, I can see that you do not see a problem in parametrisation, configuration, recompiling, etc. I was simply looking at it from another point of view, that of those starters and beginners, people who even want to learn Python and uC programming in one go...
How about a feature like an auto-routine which analyses the user's Python code and then compiles a customized version of MicroPython containing only the modules necessary for the particular task? This might allow people to easily save some (kilo)bytes of RAM and ROM after they have finalized their software designs.
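The first step of such an auto-routine would be figuring out which modules the user's code actually imports; a build script could then include only those in the firmware. A minimal sketch of that analysis step, using CPython's standard `ast` module (the function name `modules_used` and the surrounding build-script idea are my own, not an existing MicroPython tool):

```python
import ast

def modules_used(source):
    """Collect the top-level module names imported by a piece of user code.

    A hypothetical build script could feed this list into the firmware
    configuration, freezing in only the modules that are actually used.
    """
    used = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            for alias in node.names:
                used.add(alias.name.split(".")[0])   # "a.b" counts as "a"
        elif isinstance(node, ast.ImportFrom) and node.module:
            used.add(node.module.split(".")[0])
    return sorted(used)

user_code = """
import machine
from utime import sleep_ms
"""
print(modules_used(user_code))  # ['machine', 'utime']
```

Of course, dynamic imports and C-level dependencies make the full problem harder, but for typical beginner scripts a static scan like this would already go a long way.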