Community
Maya Programming
Welcome to Autodesk’s Maya Forums. Share your knowledge, ask questions, and explore popular Maya SDK topics.
Memory leaks in MEL

Message 1 of 6
Anonymous
1241 Views, 5 Replies


I've run into a huge problem with just about every large MEL script I've written in the last couple of years. I tend to build tools in Maya to help me with tasks such as laying out navigation meshes in a game level or exporting vertex colors from hundreds of meshes to a binary file. When running these large MEL scripts, I always see Maya's memory usage grow as the script runs until Maya eventually runs out of memory and crashes. I was using the 32-bit version, but had to move to the 64-bit version because these scripts eventually push Maya beyond the 4 GB limit. Well, even the 64-bit version isn't a big enough band-aid to fix this problem. Maya is currently at 9.4 GB after having loaded 43,185 path nodes from a 3.2 MB binary file. Maya never releases the memory until it's quit and re-opened.

I really don't know where these memory leaks are coming from. I've tried repeatedly to debug my code and find the sources, but I always end up with leaks galore anyway. The most common culprit seemed to be not clearing arrays after using them, but I've been very careful to always do that. I also have the undo buffer disabled and go to the extra effort of flushing it anyway, and construction history is off as well. I'm out of ideas on where to look.
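For reference, the housekeeping steps described above look roughly like this in MEL (a sketch; $bigArray is just a placeholder name):

--

undoInfo -state off;              // disable the undo queue entirely
flushUndo;                        // drop anything already recorded
constructionHistory -toggle off;  // stop recording construction history

string $bigArray[];
// ... fill and use the array ...
clear $bigArray;                  // empty it once finished

--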

It's really becoming a productivity killer.
5 REPLIES 5
Message 2 of 6
lee.dunham
in reply to: Anonymous

I always thought the memory was handled well, but to be honest I haven't written/used any truly memory-hungry procs.

Apologies, as I won't be of much help, but perhaps this would be a prime example for using C++? I believe you'd have far more control over memory handling and far fewer leaks.
Message 3 of 6
Anonymous
in reply to: Anonymous

I always thought the memory was handled well, but to be honest I haven't written/used any truly memory-hungry procs.

Apologies, as I won't be of much help, but perhaps this would be a prime example for using C++? I believe you'd have far more control over memory handling and far fewer leaks.


It's not that I'm doing anything terribly complex, but the leak grows huge depending on the number of items that need to be processed. As for C++, I hadn't thought of that, and it may well be the solution in the future. Unfortunately, reworking all the tools I need to use right now would take too much time. I'm hoping there's something obvious I'm overlooking or unaware of, because I find it hard to believe there are so few complaints about this issue from MEL users on the web.
Message 4 of 6
Anonymous
in reply to: Anonymous

Are you sure you're not declaring any variables in the global scope that remain there until overwritten? If defining those variables in the global scope is required, you could try overwriting them after the script finishes.

How are you clearing the arrays?
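To illustrate the point about globals: a variable declared in the global scope persists for the whole Maya session unless it is explicitly emptied (the name here is hypothetical):

--

global string $gResults[];
// ... script populates $gResults ...

// After the script finishes, release the memory:
clear $gResults;      // or equivalently: $gResults = {};

--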
Message 5 of 6
Anonymous
in reply to: Anonymous

This is an old thread, but I know the cause of this problem and a limited, somewhat complex workaround. It's also somewhat redundant, as most people have moved over to Python for these kinds of things. But if you're like me, you just continue to do most things in MEL and call Python functions from within MEL when/if required.

 

The cause is pretty simple: every time you pass variables to a function (global or local), Maya never releases the memory allocated for the variable inputs. That's it.

 

E.g.:

--

global proc myFunction (float $poop[]) {}

myFunction({0,0,0});

--

 

Right there, is the memory leak.

 

If you iterate through a huge list of things (like points) and call myFunction every time, your RAM will get chewed up and never released. This gets worse the more variables you pass and the more functions you call (god forbid you need to do something recursive!). If you're stupid like me and decided to write a Monte Carlo ray tracer in MEL (yes, I have), then you've found this problem to be a show-stopper.

 

Here's a working example. Be warned though, it will consume around 8GB of RAM!

 

--

// Example which exposes the issue (be warned, this consumes around 8GB of RAM!)

global proc myFunction ( float $array[] )
{
    // Don't do anything at all!
}

{
    for ($i = 0; $i < 10000000; $i++)
        myFunction({0,0,0,0,0,0,0,0});
}

--

 

What's the workaround?

 

Initially, I thought 'pass by reference' would solve the issue, but it doesn't.

 

The only way to deal with it is to NEVER pass variables to procedures. Instead, you set up GLOBAL variables to pass data around. This means Maya allocates the memory for each variable only ONCE, so even large arrays aren't a big deal. And it's not as limiting as it might sound: you can create 'container' globals which you simply reuse many times, constantly setting and retrieving them. The code only slows down by a few percent (tested inside a scene which calls functions around 0.5 billion times; yes, 500 million).

 

Here's a working example.

 

--

global proc myFunction ()
{
    // Declare the input 'container'
    global float $INPUT3_1[3];
    // Declare the output 'container'
    global float $OUTPUT3_1[3];

    // Set the output container to something derived from the input container.
    $OUTPUT3_1[0] = ($INPUT3_1[0] * rand(1));
    $OUTPUT3_1[1] = ($INPUT3_1[1] * rand(1));
    $OUTPUT3_1[2] = ($INPUT3_1[2] * rand(1));
}

{
    // Declare the input 'container'
    global float $INPUT3_1[3];
    // Declare the output 'container'
    global float $OUTPUT3_1[3];

    for ($i = 0; $i < 10000000; $i++)
    {
        $INPUT3_1[0] = rand(0.0,1.0);
        $INPUT3_1[1] = rand(0.0,1.0);
        $INPUT3_1[2] = rand(0.0,1.0);
        myFunction();

        // Do something with the result:
        float $tmp[3] = $OUTPUT3_1;

        // print $OUTPUT3_1;
    }
}

--

 

In this example, we've also effectively turned the procedure into one that returns a value, all without relying on the mechanisms that leak memory. That's what the $OUTPUT3_1 container is for.

 

When running the above code, you will probably not notice any RAM increase at all: memory only grows by the amount required for the global variables themselves (and you can always 'clear' them to free that memory), which in this example is so small as to be undetectable.
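For completeness, the 'clear' step mentioned above would look like this, using the container names from the example:

--

global float $INPUT3_1[];
global float $OUTPUT3_1[];
clear $INPUT3_1;    // releases the array's memory
clear $OUTPUT3_1;

--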

 

If you need proof that the variables are being changed without the RAM leak, run the proc with FAR FEWER iterations and print the result.

 

There you go 🙂
Message 6 of 6
rflannery
in reply to: Anonymous

Wow, you're right!  I just tried your code in Maya 2016 and Maya 2018, and the memory usage jumped up in both.  (It doesn't actually affect me, since I use Python.  But I was curious.  It's a fascinating find.)
