Optimization of the code

Anonymous
Not applicable
487 Views
3 Replies
Message 1 of 4


Hello 

 

I code using C#, ObjectARX, and the AutoCAD .NET API.

 

Many of my functions need to process large amounts of data, so they take a long time to run (sometimes hours). I thought about optimizing the code; is there any method to do that using these technologies?

 

I found that using a lot of transactions in the same function can contribute to that. Is there any other advice in this direction for writing optimal code with minimal execution time?

 

Thank you.

Replies (3)
Message 2 of 4

mhillis
Advocate

This is an extremely broad and general development question that has no definitive answer, as it completely depends on your individual code base. The only thing I can think of is debugging your software and finding the areas that take the longest to complete their operations. You can also use the Stopwatch class in .NET (System.Diagnostics.Stopwatch) to set timers around certain operations that you feel are suspect.
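For example, a minimal sketch of the Stopwatch approach (SuspectOperation here is a placeholder for whatever slow routine in your own code base you want to measure):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class TimingDemo
{
    // Placeholder for a suspect operation in your own code base.
    static void SuspectOperation() => Thread.Sleep(100);

    static void Main()
    {
        var sw = Stopwatch.StartNew();
        SuspectOperation();
        sw.Stop();

        // ElapsedMilliseconds tells you how long the wrapped section took.
        Console.WriteLine($"SuspectOperation took {sw.ElapsedMilliseconds} ms");
    }
}
```

Wrap each suspect section the same way and compare the numbers to find where the hours are actually going.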

 

Once you find those areas, it will be up to you to determine how they can be reorganized/restructured to operate more efficiently, if that is even possible. Again, this is completely dependent on YOUR code base, and there really isn't a 'one-size-fits-all' solution.

Message 3 of 4

dgorsman
Consultant

Generic question, generic answers: the two most common problems I run into are loops and data management. Don't loop when you don't have to; skip iterations, or even stop looping entirely, wherever possible. Don't work with "big data": just as you don't put all your work in a single file and instead use smaller XREFs, work with lots of "small data", and structure it to support short-circuit looping. Also choose an appropriate data format and location: a CSV file is OK for a couple hundred lines of data, but for a few million you're going to need a database.
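As a trivial sketch of short-circuit looping (the handle list and the predicate are made up for the example):

```csharp
using System;
using System.Collections.Generic;

class ShortCircuitDemo
{
    static void Main()
    {
        var handles = new List<string> { "A1", "B2", "C3", "D4" };

        // Wasteful: keeps scanning the whole list even after a match is found.
        string found = null;
        foreach (var h in handles)
        {
            if (h.StartsWith("C")) found = h;
        }

        // Better: stop as soon as the answer is known.
        foreach (var h in handles)
        {
            if (h.StartsWith("C")) { found = h; break; }
        }

        Console.WriteLine(found); // C3
    }
}
```

With four items the difference is invisible, but inside a loop over a million database records the early `break` is what keeps the run time sane.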

----------------------------------
If you are going to fly by the seat of your pants, expect friction burns.
"I don't know" is the beginning of knowledge, not the end.


Message 4 of 4

hgasty1001
Advisor

Hi,

 

There are a lot of questions you should answer first: Where is your "big data"? In a file, in a DB server (which one?), in the cloud, etc.? How big is it? How big are the average result sets? What kind of data is it (numbers, strings, spatial, etc.)? Depending on the answers, you can check for some things to do:

 

1.- If files: you can split/merge the data into smaller or larger files depending on read frequency, read speed, memory access, etc.

 

2.- If a database: check that the indexes are well designed; poor indexing is the primary factor in data access performance. If they are OK, then changing to an in-memory DB engine may be an alternative (see the H2 database engine as an example of an in-memory DB engine).

 

3.- De-normalization: sometimes you need to de-normalize a structure in order to achieve better performance; usually third normal form (3NF) is sufficient.

 

4.- Using the wrong type of cursor in SQL Server may change everything; in other words, check that you are using the correct cursor type in a stored procedure.

 

5.- Stored procedures with bad filtering: if your data is in SQL Server, check the execution plan; it gives you some hints on performance. That applies to indexing too.

 

6.- If you are dealing with spatial data, check that you are using the proper structure to store and retrieve the data. Also, SQL Server uses a B-tree to index spatial data, which is slower than a quadtree.

 

7.- String management: the string concatenation operator is slower than the StringBuilder class's Append method in .NET, because String objects are immutable.
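A small sketch of the difference (the loop count and separator are arbitrary):

```csharp
using System;
using System.Text;

class StringBuildDemo
{
    static void Main()
    {
        // Slow for many iterations: each += allocates a brand-new string,
        // because System.String is immutable.
        string s = "";
        for (int i = 0; i < 1000; i++)
            s += i + ";";

        // Faster: StringBuilder appends into a growable internal buffer
        // and only materializes the final string once.
        var sb = new StringBuilder();
        for (int i = 0; i < 1000; i++)
            sb.Append(i).Append(';');
        string result = sb.ToString();

        Console.WriteLine(s == result); // True
    }
}
```

The concatenation version does on the order of n allocations and copies for n appends; the StringBuilder version amortizes that away, which matters a lot at "big data" sizes.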

 

8.- String search using regular expressions can be very slow.

 

9.- Look for the correct data structure in your code: in .NET there are many different kinds of lists, arrays, dictionaries, etc., each performing differently for the same operation, so do some research on that topic.
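For example, a membership test against a List&lt;T&gt; is a linear scan, while a HashSet&lt;T&gt; hashes the key; a small sketch (the ID range is arbitrary):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class LookupDemo
{
    static void Main()
    {
        var ids = Enumerable.Range(0, 100_000).ToList();

        // List<T>.Contains walks the list: O(n) per lookup.
        bool inList = ids.Contains(99_999);

        // HashSet<T>.Contains hashes the key: O(1) on average,
        // a big win when you do many membership tests.
        var idSet = new HashSet<int>(ids);
        bool inSet = idSet.Contains(99_999);

        Console.WriteLine(inList && inSet); // True
    }
}
```

The same principle applies to Dictionary&lt;TKey, TValue&gt; versus searching a list of pairs: pick the structure that matches the operation you repeat most.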

 

 

There is a lot that can be done, but we need more info.

 

Gaston Nunez
