Memory Allocation - FATAL ERROR

2,456 Views
6 Replies

Message 1 of 7

Anonymous
Not applicable

Hello All,

 

I've written a simple ARX utility that does nothing but allocate and free a block of memory off the heap.

 

It uses the C++ new and delete operators, and successively increases the size of the block by 1MB each iteration, running until AutoCAD crashes.

AutoCAD crashes after allocating 83MB of memory. "FATAL ERROR: Out of Memory - Shutting Down"

 

Test environment: AutoCAD 2011 on XP (Win32) with 3GB of memory (1.85GB physical RAM available).

 

Here is the C++ code:

 

for (int i = 0; i < 1000; i++)
{
    size_t len = (size_t)i * 1000000;
    // The throwing form of new never returns null; use the nothrow
    // form (from <new>) so the null check below is meaningful.
    char* buffer = new(std::nothrow) char[len];
    if (buffer) delete[] buffer;
}

Any insights?

 

Best,

Roger

Replies (6)
Message 2 of 7

Anonymous
Not applicable

I should clarify the point of my post.

 

The crash occurs at 83MB on a machine with 1.83GB RAM available.

 

That's a very small amount of memory nowadays. A wee bit of RAM.

 

I would not expect the entire program to abort and shut down.

Any insights?

Message 3 of 7

owenwengerd
Advisor
You are assuming that every new allocation re-uses the previously freed memory. Your results prove that the assumption is incorrect.
--
Owen Wengerd
ManuSoft
Message 4 of 7

Anonymous
Not applicable

Unrelated to the problem. Thanks for posting though.

 

A single memory allocation fails at the same threshold.

 

Any other ideas?

 

My guess is that AutoCAD is doing its own memory management under the hood. The way the application behaves on a failed allocation is not the standard operating-system response.

 

Also, I'm surprised by the failure to allocate and return a valid pointer. 100MB is not much memory these days.

 

Thanks,

Roger

 

Message 5 of 7

Anonymous
Not applicable

Testing allocation limits under ARX:

  1MB / alloc => Up to 900MB
 10MB / alloc => Up to 700MB
 20MB / alloc => Up to 520MB
 30MB / alloc => Up to 420MB
 40MB / alloc => Up to 320MB
 50MB / alloc => Up to 350MB
 60MB / alloc => Up to 300MB
 70MB / alloc => Up to 140MB
 80MB / alloc => Up to 160MB
 90MB / alloc => FAIL

These are small numbers.

 

Contiguous memory available for heap requests is ample.

On the same machine, heap allocation limits outside ARX:

100MB / alloc => Up to 1.8GB
200MB / alloc => Up to 1.8GB
300MB / alloc => Up to 1.8GB
...
600MB / alloc => Up to 1.8GB

- Why does the heap behave differently under ARX?

- Does AutoCAD limit the size of the process heap?

- Is it using private heaps for memory management?

- Does it override the new operator / malloc function?

- Why is there a custom new handler on malloc failure?

Feedback welcome. Thanks.

Best,
Roger

Message 6 of 7

Anonymous
Not applicable

It appears that your original loop is a good approximation of a memory fragmenter. However, the allocation limit testing is interesting, and I'm glad you brought it up, since I'm curious to know why this is the case as well.

 

Message 7 of 7

Anonymous
Not applicable

Yes, the original problem surfaced in a very simple heap context, no fragmentation issues.

 

On modern hardware with 1.8GB of physical memory available, allocating 85MB shouldn't be a problem, IMHO.

 

We test for malloc failure in our code and handle it gracefully with a dialog box to the user, but we never actually get to that point, because AutoCAD pops up its own dialog box on malloc failure: FATAL ERROR - Out of Memory, and aborts the program.

 

Appreciate the interest in this topic.
