Build Times

Message 1 of 11

EMD1954
Advocate

I have a library file, "EMD_Data.lib", that contains a significant amount of raw data: an array of about 4,000 structs, each with 74 members of assorted data types.  I have a handful of main .arx projects that #include its header and link to that library file.
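
Roughly, the header declares a struct and an extern array along these lines (the names here are just placeholders, not my real ones):

// EMD_Data.h (placeholder names) -- shared by the .lib and the .arx projects
#include <cstddef>

struct EmdRecord
{
    int     id;
    double  diameter;
    wchar_t description[64];
    // ...about 74 members of assorted types in the real struct
};

// Defined once inside the .lib; the .arx projects only see this declaration.
extern const EmdRecord   g_EmdTable[];
extern const std::size_t g_EmdTableCount;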

 

In VS2008, it used to take about 55 minutes to compile the Win32 and x64 versions of the .lib file.  That wasn't a problem because I only need to rebuild those files very infrequently, about once a year.  The main projects compiled (even a full Rebuild All) in about a minute.

 

In VS2008, the project-level setting for both the .lib and the .arx projects was "No Whole Program Optimization", and the C/C++ "Whole Program Optimization" option was set to No.  For the .arx projects, the linker's "Link Time Code Generation" was set to Default.

 

I just migrated my files to VS2010 and I maintained the same settings.

 

Now the .lib file takes about 40 minutes to build, and the .arx files each take about 30 minutes.  The .arx files are the programs I'm always working on (say, once a week), so I can't possibly be productive when it takes this long to build the application every time I want to test a new routine or improve an old one.

 

Any suggestions or advice would be greatly appreciated.

 

Thanks,

Ed

 

 

Replies (10)
Message 2 of 11

owenwengerd
Advisor

It sounds like you're either not using precompiled headers at all, or you're not using them effectively.

--
Owen Wengerd
ManuSoft
Message 3 of 11

EMD1954
Advocate

My settings are:

 

Create/Build Precompiled Headers -> Use Precompiled Header (/Yu)

Precompiled Header File -> StdArx.h

Precompiled Header Output File -> $(IntDir)\$(TargetName).pch

 

Everything is the same in both VC2008 and VC2010.
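
For what it's worth, the files follow the usual /Yc and /Yu pattern (trimmed down here; "MyCommands.h" is just a placeholder):

// --- StdArx.cpp: the only file compiled with Create (/Yc"StdArx.h") ---
#include "StdArx.h"

// --- every other .cpp in the project, compiled with Use (/Yu"StdArx.h") ---
#include "StdArx.h"       // must be the first include in the file
#include "MyCommands.h"   // project-specific headers come after it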

 

Ed

Message 4 of 11

owenwengerd
Advisor

Are you creating a precompiled header?

--
Owen Wengerd
ManuSoft
Message 5 of 11

EMD1954
Advocate

Yes.  And I delete it (and everything else in the "Release" directory) before I rebuild just to make sure.

 

I even tried the Create (/Yc) option and it makes no difference.

 

I can understand the library file taking time to compile and build (it always did), but the .arx files shouldn't take this long (they never did).  I'm using the .lib file just like any other library (as I always have), my settings are the same in VC2008 and VC2010, and there are no other programs open.

 

I know I'm ranting, but it seems to me that it shouldn't be this difficult to upgrade to a newer version of software that is supposed to make things faster and easier - sorry.

 

Maybe I should try a different approach.  Maybe I shouldn't be compiling an enormous amount of data into a library file at all.  Is there a way I could include the info (as *.xls, *.txt, etc.) as a resource?  It's the equivalent of an Excel spreadsheet with 4,000 rows and 75 columns of data of all types.  If anybody has a sample of how to do this as a user-defined resource, I would really appreciate it.
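
Something along these lines is what I'm picturing - just a rough sketch using the Win32 resource functions, where IDR_EMD_DATA, the file name, and the helper are all placeholders:

// In the .rc file (IDR_EMD_DATA would be #defined in resource.h):
//   IDR_EMD_DATA  RCDATA  "EMD_Data.bin"

#include <windows.h>

// Returns a read-only pointer to the embedded data and fills in its size.
// hModule should be the .arx module's own handle.
const void* GetEmbeddedData(HMODULE hModule, DWORD& size)
{
    HRSRC hRes = FindResource(hModule, MAKEINTRESOURCE(IDR_EMD_DATA), RT_RCDATA);
    if (hRes == NULL)
        return NULL;

    HGLOBAL hData = LoadResource(hModule, hRes);
    if (hData == NULL)
        return NULL;

    size = SizeofResource(hModule, hRes);
    return LockResource(hData);   // resource memory stays valid; no unlock needed
}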

 

Thanks,

Ed

 

Message 6 of 11

owenwengerd
Advisor

I use separately built libraries in several of my projects and I haven't noticed any problems with build times in VS 2010. It may be something in VS 2010 that's unique to your library, but it's apparent from your response that you're not sure how to set up precompiled headers, so I'm still suspicious about that.

--
Owen Wengerd
ManuSoft
Message 7 of 11

EMD1954
Advocate

I have a handful of other library files that I use and none of them have this issue nor do the projects that use them.  It's only this one large library file.

 

I think I'm using the .pch correctly but I'll do some more research and see if I can improve.

 

The more I think about it, the more I believe there must be a better way to give my programs access to data that I don't want to distribute.  Should I be looking into ObjectDBX?

 

Thanks,

Ed

 

Message 8 of 11

owenwengerd
Advisor

Unless you've implemented telepathy in native code, I don't think there's any way for your program to access data that isn't distributed with the program. In any case, since there's a lot of it and it rarely changes, it should be included in the precompiled header. Maybe you're asking about another way to package it; if so, you could look into using an SQLite database. That can easily be built as part of the build process and packaged with your app (or accessed via telepathy if you prefer).
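
If you go that route, the lookup side is only a handful of calls.  A rough sketch (the table and column names here are made up):

#include "sqlite3.h"

// Open the packaged database read-only and pull one value for a given id.
bool LoadDiameter(const char* dbPath, int id, double& diameter)
{
    sqlite3* db = NULL;
    if (sqlite3_open_v2(dbPath, &db, SQLITE_OPEN_READONLY, NULL) != SQLITE_OK)
        return false;

    sqlite3_stmt* stmt = NULL;
    bool ok = false;
    if (sqlite3_prepare_v2(db, "SELECT diameter FROM emd_data WHERE id = ?",
                           -1, &stmt, NULL) == SQLITE_OK)
    {
        sqlite3_bind_int(stmt, 1, id);
        if (sqlite3_step(stmt) == SQLITE_ROW)
        {
            diameter = sqlite3_column_double(stmt, 0);
            ok = true;
        }
        sqlite3_finalize(stmt);
    }
    sqlite3_close(db);
    return ok;
}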

--
Owen Wengerd
ManuSoft
Message 9 of 11

EMD1954
Advocate

Owen,

 

Right now, the library file contains a lot of data.  I wrote a routine that reads a CDF file and writes a .cpp file in the proper format for the struct array that goes into the .lib, so I only need to distribute the .arx file.  I could have the project read the CDF data file directly, but then I would have to distribute that file with the application.  That's risky because I don't want the user to be able to accidentally (or purposely) change anything, and it takes longer for the program to read from the file.
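
The generated .cpp is essentially one giant initializer, roughly like this (with placeholder names again, not my real ones):

// EMD_Data.cpp (generated from the CDF file)
#include "EMD_Data.h"

const EmdRecord g_EmdTable[] =
{
    { 1, 0.250, L"First item"  },
    { 2, 0.375, L"Second item" },
    // ...about 4,000 rows and 74 fields per row in the real file
};

const std::size_t g_EmdTableCount = sizeof(g_EmdTable) / sizeof(g_EmdTable[0]);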

 

I thought that maybe there was a way that I could include the data as a resource (or other?) file that would become part of my application.  If anyone has come across this before, I would really appreciate a sample of how to go about it (unless you think that that's the wrong approach as well).

 

Thanks,

Ed

Message 10 of 11

owenwengerd
Advisor

Designing an optimum data architecture is pretty involved for a free support forum with limited information to work with, but it should be a simple task for someone with experience, so I think your best bet is to look for an experienced consultant to help you out. This isn't really an AutoCAD problem, but if you're an ADN member, you may be able to get some help from ADN developer support.

--
Owen Wengerd
ManuSoft
Message 11 of 11

EMD1954
Advocate

I understand.  I just thought my situation was like that of people who write programs that create manufacturing drawings from parametric equations and a lot of data, or someone in civil engineering who builds maps from survey data and topology.  I thought surely someone has done an .arx application that uses data in some form.

 

I am not an ADN member because this programming is not my primary business (although I write a lot of software used by my firm).  I am looking in other forums as well.

 

Thanks for your help and your interest in replying,

Ed

 
