Measuring and reducing memory usage in data-intensive Python applications

Hi,

Based on my experience with image processing in a different domain, I’ve found that the current tools for figuring out where your Python application is spending its memory aren’t very good. So I’m working on a new and better tool that can tell you exactly which functions allocated the memory in use at peak allocation time, since peak memory is the bottleneck you want to optimize.
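For context, the closest thing in the standard library today is tracemalloc, which can report the peak traced memory of a run but won’t automatically tell you which call path was live at that peak — you only get the aggregate number. A minimal sketch (the `build_big_list` function is just an illustrative stand-in for a memory-hungry workload):

```python
import tracemalloc

def build_big_list():
    # Allocate a million ints to create a clear, temporary memory peak.
    data = [i for i in range(1_000_000)]
    return sum(data)

tracemalloc.start()
result = build_big_list()

# current: memory still traced now; peak: high-water mark during the run.
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"current: {current / 1e6:.1f} MB, peak: {peak / 1e6:.1f} MB")
```

The list is freed when `build_big_list` returns, so `current` drops well below `peak` — and tracemalloc can’t tell you after the fact which functions held that peak memory, which is exactly the gap a peak-allocation profiler is meant to fill.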

This is probably going to be a commercial product, but for now I’m just looking for some people who have applications with high memory usage they want to optimize, and who might be interested in trying it out for free.

Pangeo people seemed like good candidates; if you’re interested, email me at itamar@pythonspeed.com or just reply here.

You can see an example of the output here: https://pythonspeed.com/products/filmemoryprofiler/memory-graph.svg — bar width is the percentage of peak memory used, so notice how it’s clear that register_translation is the largest allocation path, and that a function it calls (c2cn) accounts for a hefty chunk of that.