Memory usage is an interesting problem, because the failure mode (crashes and complete lock-ups) is so much more painful than with high CPU usage.

And even if you return memory to the OS, that doesn't actually solve the underlying problem of using too much memory in the first place.

Some things you can do:

The article mentions `__slots__` for reducing object memory use; another approach is simply having fewer objects: for example, a dict of lists uses far less memory than a list of dicts with repeating fields. In many cases you can also switch to a Pandas DataFrame, saving even more memory (https://pythonspeed.com/articles/python-object-memory/ covers all of these).
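As a rough sketch of the first two ideas (the class and field names here are just illustrative, and exact byte counts vary by CPython version):

```python
import sys

# A plain object carries a per-instance __dict__; __slots__ replaces it
# with fixed storage for the named attributes.
class PointPlain:
    def __init__(self, x, y):
        self.x = x
        self.y = y

class PointSlots:
    __slots__ = ("x", "y")
    def __init__(self, x, y):
        self.x = x
        self.y = y

p, q = PointPlain(1, 2), PointSlots(1, 2)
# sys.getsizeof doesn't count the attribute values themselves,
# just the object (and its __dict__ for the plain version).
print(sys.getsizeof(p) + sys.getsizeof(p.__dict__))  # plain object + its dict
print(sys.getsizeof(q))                               # slotted object, no __dict__

# "Fewer objects": one dict of lists instead of one dict per row.
rows_as_dicts = [{"x": i, "y": i * 2} for i in range(1000)]  # 1000 small dicts
columns = {"x": list(range(1000)),                            # 2 lists total
           "y": [i * 2 for i in range(1000)]}
```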

For numeric data, a NumPy array gets rid of the per-integer overhead of Python objects, so a Python list of numbers uses way more memory than an equivalent NumPy array (https://pythonspeed.com/articles/python-integers-memory/).
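A rough back-of-the-envelope comparison (numbers are approximate and depend on the Python/NumPy versions):

```python
import sys
import numpy as np

n = 1_000_000
py_list = list(range(n))
np_array = np.arange(n, dtype=np.int64)

# The list stores n pointers plus n separate int objects (~28 bytes each
# on CPython); the array stores raw 8-byte int64s in one contiguous block.
list_bytes = sys.getsizeof(py_list) + sum(sys.getsizeof(i) for i in py_list)
print(f"list:  ~{list_bytes / 1e6:.0f} MB")
print(f"array: ~{np_array.nbytes / 1e6:.0f} MB")
```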

I use https://github.com/hakavlad/nohang and it's a game changer.

Also take a look at these newer tools from the same author:

https://github.com/hakavlad/prelockd

https://github.com/hakavlad/memavaild

They can greatly improve responsiveness under memory pressure. Demos:

https://youtu.be/QquulJ06dAo - playing SuperTux with 12 `tail /dev/zero` processes running in the background

https://youtu.be/DsXEWvq60Rw - `tail /dev/zero` with swap on an HDD and memavaild running: no freezes