I have the same problem as many others. I have read through similar articles and SO questions, but never clearly understood how to resolve it.
I am using a SqlDataReader to run a SELECT query against a DB, and the query returns 180k records. I loop through reader.Read(), create a new instance of an object on each iteration, assign its properties from the reader, and at the end of each iteration add the object to a list; I do this for all 180k rows of data. Unsurprisingly, I am running into an OutOfMemoryException. Can someone suggest a workaround?
Relevant code:
List<MyObject> collection = new List<MyObject>();
var reader = cmd.ExecuteReader();
while (reader.Read())
{
    var obj = new MyObject();
    obj.Property1 = Convert.ToInt32(reader["Column1"]);
    obj.Property2 = Convert.ToString(reader["Column2"]);
    ...
    obj.Property178 = Convert.ToString(reader["Column178"]);
    collection.Add(obj);
}
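One common way around this is to stream rows with `yield return` instead of materializing the whole list: the calling method consumes one `MyObject` at a time, so only a single row's object needs to be alive at once. A minimal sketch of the pattern, using an in-memory row source to stand in for `while (reader.Read())` over a `SqlDataReader` (the `MyObject` shape and property names are taken from the question; the row counts here are illustrative):

```csharp
using System;
using System.Collections.Generic;

public class MyObject
{
    public int Property1 { get; set; }
    public string Property2 { get; set; }
}

public class Program
{
    // Yields one MyObject per "row" instead of building a List.
    // With a real SqlDataReader the loop body would read from the
    // reader, e.g. Property1 = Convert.ToInt32(reader["Column1"]).
    public static IEnumerable<MyObject> ReadObjects(int rowCount)
    {
        for (int i = 0; i < rowCount; i++)
        {
            yield return new MyObject { Property1 = i, Property2 = "row" + i };
        }
    }

    public static void Main()
    {
        // The caller processes rows as they arrive; the 180k objects
        // never coexist in memory unless the caller stores them all.
        long sum = 0;
        foreach (var obj in ReadObjects(180000))
            sum += obj.Property1;
        Console.WriteLine(sum);
    }
}
```

Note this only helps if the further processing can be done per row (or in batches); if the calling method genuinely needs all 180k objects at once, streaming just moves the allocation to the caller.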
At the end of the loop this collection is returned to a calling method, which does further processing on the records. Alternatives I tried:
- Use a DataTable and dump records from the reader into it instead of looping.
  - This showed no change in memory usage.
- Use a single instance of MyObject and reassign its values on every iteration.
  - This significantly reduced the memory being used, but at the end of the loop all 180k entries in the collection had the exact same values, since MyObject is a reference type and the list ends up holding 180k references to the one mutated instance.
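The aliasing problem in that second alternative can be reproduced in a few lines: adding the same instance repeatedly stores many references to one object, so the last assignment "wins" everywhere. A small self-contained illustration (three rows instead of 180k):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class MyObject { public int Property1 { get; set; } }

public class AliasDemo
{
    public static string Run()
    {
        var collection = new List<MyObject>();
        var obj = new MyObject();      // one shared instance
        for (int i = 0; i < 3; i++)
        {
            obj.Property1 = i;         // mutates the same object each time
            collection.Add(obj);       // adds another reference to it
        }
        // All three list entries alias one object, so all show the
        // final value that was assigned.
        return string.Join(",", collection.Select(o => o.Property1));
    }

    public static void Main() => Console.WriteLine(Run()); // prints "2,2,2"
}
```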
Any other suggestions?
`new List<MyObject>(190000);` — but that probably won't be enough. You need to rethink why you need so much data in the first place, and whether all of those columns need to be strings.
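To expand on the comment's suggestion: presizing the list only avoids the copy-and-double reallocations of the list's backing array as it grows; it does nothing about the 180k objects themselves, which is why it "probably won't be enough". A sketch of what presizing actually changes (the counts are illustrative):

```csharp
using System;
using System.Collections.Generic;

public class CapacityDemo
{
    public static void Main()
    {
        // Default list: the backing array doubles (4, 8, 16, ...) as
        // items arrive, briefly holding both the old and new arrays
        // during each regrowth.
        var grown = new List<int>();
        for (int i = 0; i < 190000; i++) grown.Add(i);

        // Presized list: one allocation up front, no regrowth copies.
        var presized = new List<int>(190000);
        for (int i = 0; i < 190000; i++) presized.Add(i);

        Console.WriteLine(grown.Capacity >= 190000);  // True (overshoots to a power-of-two step)
        Console.WriteLine(presized.Capacity);         // 190000
    }
}
```

Either way, the per-row objects dominate memory here; reducing the number of rows fetched per trip (batching/paging) or the per-row footprint (fewer of the 178 properties, narrower types than string) attacks the actual problem.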