Hi,
There are a lot of questions about this that you should answer first: Where is your "big data" (in a file, in a database server, and if so which one, in the cloud, etc.)? How big is it? How big are the average result sets? What kind of data is it (numbers, strings, spatial, etc.)? Depending on the answers, here are some things you can check:
1.-If it is in files, you can split or merge the data into smaller or bigger files depending on read frequency, read speed, memory access, etc. (see the file-splitting sketch after this list).
2.-If it is in a database, check whether the indexes are well designed; poor indexing is the primary factor in data access performance. If they are OK, then switching to an in-memory DB engine may be an alternative (see the H2 database engine as an example of an in-memory DB engine, and the sketch after this list).
3.-De-normalization: sometimes you need to de-normalize a structure in order to achieve better performance; usually the third normal form (3NF) is enough.
4.-Using the wrong type of cursor in SQL Server can change everything; in other words, check that you are using the correct cursor type in your stored procedures.
5.-Stored procedures with bad filtering: if your data is in SQL Server, check the execution plan; it gives you some hints on performance. That applies to indexing too.
6.-If you are dealing with spatial data, check that you are using the proper structure to store and retrieve it. Also, SQL Server uses a B-tree to index spatial data, which is slower than a quad tree.
7.-String management: the string concatenation operator is slower than using the StringBuilder class in .NET, because string objects are immutable (see the sketch after this list).
8.-String search using regular expressions can be very slow (see the sketch after this list).
9.-Look for the correct data structure in your code: in .NET there are lots of different kinds of lists, arrays, dictionaries, etc., each one performing differently for the same operation, so do some research on that topic (see the sketch after this list).
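To make a few of these points concrete, here are some quick sketches. They are in Java (the .NET equivalents are analogous, and H2 itself is a Java database); all file names, table layouts, and sizes are made up for illustration. For point 1, a minimal sketch of splitting a large file into fixed-size chunks so that readers only have to load the parts they touch often (chunks are cut at buffer boundaries, not line boundaries):

```java
import java.io.*;

public class FileSplitter {
    // Split a large file into roughly chunkBytes-sized parts: file.part0, file.part1, ...
    public static void split(File input, long chunkBytes) throws IOException {
        byte[] buffer = new byte[64 * 1024];
        try (InputStream in = new BufferedInputStream(new FileInputStream(input))) {
            int part = 0;
            long writtenInPart = 0;
            int read;
            OutputStream out = newPart(input, part);
            while ((read = in.read(buffer)) != -1) {
                if (writtenInPart + read > chunkBytes) {
                    out.close();              // current chunk is full, start the next one
                    part++;
                    writtenInPart = 0;
                    out = newPart(input, part);
                }
                out.write(buffer, 0, read);
                writtenInPart += read;
            }
            out.close();
        }
    }

    private static OutputStream newPart(File input, int part) throws IOException {
        return new BufferedOutputStream(new FileOutputStream(input.getPath() + ".part" + part));
    }

    public static void main(String[] args) throws IOException {
        // "bigdata.csv" and the 64 MB chunk size are placeholders
        split(new File("bigdata.csv"), 64L * 1024 * 1024);
    }
}
```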
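For point 2, a minimal sketch of using H2 as an in-memory engine through JDBC, assuming the H2 jar is on the classpath (the table, columns, and index are invented for the example; the index is what actually makes the lookup fast, in memory or not):

```java
import java.sql.*;

public class InMemoryExample {
    public static void main(String[] args) throws SQLException {
        // "jdbc:h2:mem:..." keeps the whole database in RAM; it disappears when the JVM exits
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:bigdata", "sa", "")) {
            try (Statement st = conn.createStatement()) {
                st.execute("CREATE TABLE measurements (id BIGINT PRIMARY KEY, sensor VARCHAR(32), value DOUBLE)");
                st.execute("CREATE INDEX idx_measurements_sensor ON measurements(sensor)");
                st.execute("INSERT INTO measurements VALUES (1, 'temp-01', 21.5), (2, 'temp-02', 19.8)");
            }
            // Filter on the indexed column through a prepared statement
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT value FROM measurements WHERE sensor = ?")) {
                ps.setString(1, "temp-01");
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getDouble("value"));
                    }
                }
            }
        }
    }
}
```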
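For point 7, the same advice holds in Java as in .NET: strings are immutable, so concatenating in a loop copies the whole string every time, while StringBuilder appends into a growing buffer. A small timing sketch:

```java
public class ConcatVsBuilder {
    public static void main(String[] args) {
        int n = 50_000;

        // O(n^2): every += allocates a new String and copies all previous characters
        long t0 = System.nanoTime();
        String s = "";
        for (int i = 0; i < n; i++) {
            s += i;
        }
        long slow = System.nanoTime() - t0;

        // Roughly O(n): StringBuilder appends in place and converts to String once at the end
        t0 = System.nanoTime();
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            sb.append(i);
        }
        String s2 = sb.toString();
        long fast = System.nanoTime() - t0;

        System.out.printf("concat: %d ms, builder: %d ms (same result: %b)%n",
                slow / 1_000_000, fast / 1_000_000, s.equals(s2));
    }
}
```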
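For point 8, if you do need regular expressions, at least compile the pattern once and reuse it instead of recompiling on every call, and fall back to a plain substring search when you are only looking for a fixed literal (.NET's Regex class has the same trade-off):

```java
import java.util.regex.Pattern;

public class RegexCost {
    // Compile once and reuse; recompiling the pattern on every call is the usual hidden cost
    private static final Pattern ERROR_LINE = Pattern.compile("\\bERROR\\b");

    public static boolean hasErrorRegex(String line) {
        return ERROR_LINE.matcher(line).find();
    }

    // For a fixed literal, a plain substring search is much cheaper than any regex
    public static boolean hasErrorPlain(String line) {
        return line.contains("ERROR");
    }

    public static void main(String[] args) {
        String line = "2024-05-01 12:00:00 ERROR something went wrong";
        System.out.println(hasErrorRegex(line)); // true
        System.out.println(hasErrorPlain(line)); // true
    }
}
```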
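For point 9, the choice of structure usually matters more than the call itself: the same membership test is a full scan on a list but a hash lookup on a set (Java collections shown; .NET's List<T> and HashSet<T> behave the same way):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class StructureChoice {
    public static void main(String[] args) {
        int n = 1_000_000;
        List<Integer> list = new ArrayList<>();
        Set<Integer> set = new HashSet<>();
        for (int i = 0; i < n; i++) {
            list.add(i);
            set.add(i);
        }

        int probes = 1_000;

        // ArrayList.contains scans the list: O(n) per lookup
        long t0 = System.nanoTime();
        for (int i = 0; i < probes; i++) {
            list.contains(n - 1 - i);
        }
        long listNs = System.nanoTime() - t0;

        // HashSet.contains hashes the key: O(1) per lookup on average
        t0 = System.nanoTime();
        for (int i = 0; i < probes; i++) {
            set.contains(n - 1 - i);
        }
        long setNs = System.nanoTime() - t0;

        System.out.printf("list: %d ms, set: %d ms%n",
                listNs / 1_000_000, setNs / 1_000_000);
    }
}
```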
There is a lot that can be done, but we need more info.
Gaston Nunez