Installation & Licensing
Linux - Memory Allocation?

Message 1 of 6
Anonymous
756 Views, 5 Replies

Hi Everyone,

System Info:
- Dell Precision T3600
- Nvidia Quadro 4000
- 30GB RAM
- CentOS 6.2 64bit
- Maya 2013 64bit SP2

When opening any large scene in CentOS, RAM usage climbs to around 12GB and then Maya crashes with a memory error.
Troubleshooting:
- The same scenes work in Windows 7 64-bit
- Tested memory allocation on the Linux machine; other apps can allocate well over 12GB with no problem

It seems to be an "application specific" bug with Maya under CentOS.
There has to be a memory limit somewhere that I can edit...

Thank you for any help or guidance !

mental ray for Maya 2013
mental ray: version 3.10.1.11, Jul 26 2012, revision 177995
mmap: Cannot allocate memory
====================================
Cause of memory exception
====================================
2048.000 Mb Free Memory
2048.000 Mb Free Swap
0.105 Mb Size of alloc
10.000 Mb Low Memory Threshold
====================================
Memory use when exception was thrown
====================================
====================================
1 Page faults
11988.738 Mb Max resident size
9377.577 Mb Peak total size(Estimated)
8845.070 Mb Peak arena size
====================================
468.507 Mb Heap
2621.846 Mb POLY_DRAW_CACHE_DATA
6266.184 Mb Arrays
60.938 Mb MEL
118.598 Mb Data Blocks
26.625 Mb Transforms
0.135 Mb arguments
0.516 Mb Pixel Map
1.000 Mb NURBS AG
2.793 Mb POLY_DRAW_CACHE_STATIC_DATA
0.648 Mb Object Arrays
0.125 Mb Keys
0.125 Mb NURBS Surface Shapes
0.332 Mb NURBS Geometry Cache
Message 2 of 6
warnold1
in reply to: Anonymous

You seem to be using a lot of memory, with many large memory blocks; there may be something there to optimize memory usage.

The out-of-memory error is from mmap.
There is a system limit on the number of segments mmap can use per process; see:

man mmap

To avoid using mmap, you can set the environment variable

MAYA_USE_MALLOC=1

before running Maya.
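As a concrete sketch of that suggestion (assuming a `maya` launcher is on your PATH; adjust the launch line for your install):

```shell
# Tell Maya to use malloc() instead of mmap() for its allocations.
export MAYA_USE_MALLOC=1
echo "MAYA_USE_MALLOC=$MAYA_USE_MALLOC"
# maya &   # launch Maya from this same shell so it inherits the variable
```

Launching Maya from the shell where the variable was exported is what matters; setting it in a different terminal has no effect.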


Or you could increase the system limits.


Wayne Arnold

Developer

Message 3 of 6
Anonymous
in reply to: Anonymous

Thank you for the info !

That environment variable fixed the crash and memory problem.
As for the system limit, I looked everywhere before posting on this forum and it seemed to be "Unlimited".
Maybe something somewhere is limiting it in some way. That is a vague sentence... 😉

Now the question is...
Is it good practice to run Maya with that variable set...?
I'm waiting on feedback from Autodesk.
I'll keep you posted

Thanks
Message 4 of 6
warnold1
in reply to: Anonymous

One of the main differences is "releasable" memory. Memory allocated via mmap is backed by the file system and can be returned to the system.
malloc memory remains with the process and is not "normally" returned to the system when freed; it is retained in the process and can be re-used by the memory allocator.

Windows has a similar feature and limit on memory handles.

So long as everything stays in memory, and you don't mind the size of the process, it should be OK.


Wayne Arnold

Developer

Message 5 of 6
warnold1
in reply to: Anonymous

/proc/sys/vm/max_map_count

http://www.novell.com/support/kb/doc.php?id=7000830
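For reference, inspecting and raising that limit might look like the following (the value shown is only an example; pick one large enough for your scenes):

```shell
# Show the current per-process limit on memory-mapped regions.
cat /proc/sys/vm/max_map_count
# To raise it for the running system (needs root), you would run:
# sysctl -w vm.max_map_count=1965900
```

A change made with `sysctl -w` lasts only until reboot.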


Wayne Arnold

Developer

Message 6 of 6
Anonymous
in reply to: Anonymous

Thanks for the "Novell" memory info.
It worked !!!

I put "vm.max_map_count=1965900"

I think for now I'll go with this, and see what Autodesk has to say about it 😉
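To keep that setting across reboots on CentOS, the usual approach is to append the line to /etc/sysctl.conf and reload with `sysctl -p` (both need root). Sketched here against a temporary file so it runs without root:

```shell
# Stand-in for /etc/sysctl.conf; the real change appends the same line
# there and then runs `sysctl -p` as root.
conf=$(mktemp)
echo "vm.max_map_count=1965900" >> "$conf"
cat "$conf"
rm -f "$conf"
```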

Thanks again
