Exception of type 'System.OutOfMemoryException' was thrown. Why?

I have a dynamic query that returns around 590,000 records. It runs successfully the first time, but if I run it again, I keep getting a System.OutOfMemoryException. What are some reasons this could be happening?

The error is happening here:

public static DataSet GetDataSet(string databaseName, string storedProcedureName,
                                 params object[] parameters)
{
    // Creates blank dataset
    DataSet ds = null;

    try
    {
        // Creates database
        Database db = DatabaseFactory.CreateDatabase(databaseName);
        // Creates command to execute
        DbCommand dbCommand = db.GetStoredProcCommand(storedProcedureName);
        dbCommand.CommandTimeout = COMMAND_TIMEOUT;
        // Returns the list of SQL parameters associated with that stored procedure
        db.DiscoverParameters(dbCommand);

        // Loop through the list of parameters and set the values
        // (parameter 0 is the stored procedure's return value, so start at 1)
        int i = 1;
        foreach (object parameter in parameters)
        {
            dbCommand.Parameters[i++].Value = parameter;
        }
        // Retrieve dataset and set to ds
        ds = db.ExecuteDataSet(dbCommand);
    }
    // Check for exceptions
    catch (SqlException sqle)
    {
        throw sqle;
    }
    catch (Exception e)
    {
        throw e; // Error is thrown here.
    }
    // Returns dataset
    return ds;
}

Here is the code that runs on the button click:

protected void btnSearchSBIDatabase_Click(object sender, EventArgs e)
{
    LicenseSearch ls = new LicenseSearch();

    DataTable dtSearchResults = ls.Search();

    // The opening "if" line was missing from the post; a results check
    // along these lines is implied by the trailing "else" branch.
    if (dtSearchResults != null && dtSearchResults.Rows.Count > 0)
    {
        Session["dtSearchResults"] = dtSearchResults;

        Response.Redirect("~/FCCSearch/SearchResults.aspx");
    }
    else
        lblResults.Visible = true;
}
Tingly answered 10/12, 2008 at 16:22 Comment(2)
590K rows is a little excessive, don't you think? – Ratafia
Isn't the problem merely that you are storing the DataTable in Session, and then when re-querying, you have both the Session variable and your original DataTable in memory? BTW, do not call throw e;, call throw; instead, as otherwise your stack trace will say the catch handler generated the exception, when actually it was the containing code. – Crossbeam
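
A minimal sketch of the rethrow difference that comment describes, applied to the catch block above:

try
{
    ds = db.ExecuteDataSet(dbCommand);
}
catch (SqlException)
{
    // "throw;" rethrows the original exception and keeps its stack trace.
    // "throw sqle;" would make the trace point at this catch block instead.
    throw;
}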

It runs successfully the first time, but if I run it again, I keep getting a System.OutOfMemoryException. What are some reasons this could be happening?

Regardless of what the others have said, the error has nothing to do with forgetting to dispose your DbCommand or DbConnection, and you will not fix your error by disposing of either of them.

The error has everything to do with your dataset which contains nearly 600,000 rows of data. Apparently your dataset consumes more than 50% of the available memory on your machine. Clearly, you'll run out of memory when you return another dataset of the same size before the first one has been garbage collected. Simple as that.

You can remedy this problem in a few ways:

  • Consider returning fewer records. I personally can't imagine a time when returning 600K records has ever been useful to a user. To minimize the records returned, try:

    • Limiting your query to the first 1000 records. If there are more than 1000 results returned from the query, inform the user to narrow their search results.

    • If your users really insist on seeing that much data at once, try paging the data. Remember: Google never shows you all 22 bajillion results of a search at once, it shows you 20 or so records at a time. Google probably doesn't hold all 22 bajillion results in memory at once; it probably finds it more memory-efficient to requery its database to generate a new page.

  • If you just need to iterate through the data and you don't need random access, try returning a datareader instead. A datareader only loads one record into memory at a time.
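
For example, here's a rough sketch of that approach using the same Enterprise Library helpers as the question's GetDataSet method (the column name is hypothetical):

Database db = DatabaseFactory.CreateDatabase(databaseName);
DbCommand dbCommand = db.GetStoredProcCommand(storedProcedureName);

// ExecuteReader streams results instead of buffering all ~600K rows.
using (IDataReader reader = db.ExecuteReader(dbCommand))
{
    while (reader.Read())
    {
        // Only the current row is held in memory at this point.
        string name = reader["LicenseName"].ToString(); // hypothetical column
        // ...process or write out the row here...
    }
}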

If none of those are an option, then you need to force .NET to release the memory used by the old dataset before calling your method again, using one of these approaches:

  • Remove all references to your old dataset. Anything holding on to a reference to your dataset will prevent it from being reclaimed by the garbage collector.

  • If you can't null all the references to your dataset, clear all of the rows from the dataset and any objects bound to those rows instead. This removes references to the datarows and allows them to be eaten by the garbage collector.
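
As a sketch (the Session key comes from the question's button handler; the variable names are otherwise hypothetical):

// Either drop every reference to the old results before fetching new ones...
Session.Remove("dtSearchResults");
dtSearchResults = null;

// ...or, where a reference can't be nulled, empty the rows instead:
// dtSearchResults.Rows.Clear();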

I don't believe you'll need to call GC.Collect() to force a collection cycle. Not only is it generally a bad idea to call GC.Collect(), it shouldn't be necessary here: sufficient memory pressure will cause .NET to invoke the garbage collector on its own.

Note: calling Dispose on your dataset does not free any memory, nor does it invoke the garbage collector, nor does it remove a reference to your dataset. Dispose is used to clean up unmanaged resources, but the DataSet does not have any unmanaged resources. It only implements IDisposable because it inherits from MarshalByValueComponent, so the Dispose method on the dataset is pretty much useless.

Videlicet answered 10/12, 2008 at 17:8 Comment(2)
Hello, 754,600 rows to be exact are returned if the user does not select any extra criteria and runs the query. I do have paging turned on, but all the results are still returned into the dataset. – Tingly
I've run into this problem, and I find it MUCH more efficient to requery the database each time a user wants to view a new page. It requires more code, but there's no other workaround. Use this code to help you page results in SQL: davidhayden.com/blog/dave/archive/2005/12/30/2652.aspx – Videlicet

Perhaps you're not disposing of the connection/result objects from the previous run, which means they're still hanging around in memory.

Derivation answered 10/12, 2008 at 16:25 Comment(7)
You can use various memory profilers... This is the best one I've come across: memprofiler.com – Derivation
On that example you've just posted, make sure the command and connection have been closed/disposed. – Derivation
Thanks, I am going to give it a shot. I have never used a memory profiler before. – Tingly
You are referring to DbCommand, right? Where in the sample would I close and dispose of it? I am also using the Enterprise Library; does this automatically close the connection? – Tingly
Every time the code above is executed, DbCommand is always null. Doesn't this mean it is automatically being closed? – Tingly
using (DbCommand ...) { ... do stuff ... get dataset ... } As Quarrelsome pointed out, if you use the 'using' statement, it'll call Dispose for you. – Derivation
I still got the error, even after calling Dispose on DbCommand. – Tingly

You're obviously not disposing of things.

Consider the "using" command when temporarily using objects that implement IDisposable.

Rawdan answered 10/12, 2008 at 16:32 Comment(0)

Try to break your large result set into smaller chunks wherever possible. I have faced this type of problem a number of times myself, with over a million (10 lakh) records of 15 columns each.
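
A rough sketch of the idea, reusing the question's GetDataSet helper (the stored procedure name and its paging parameters are hypothetical):

const int batchSize = 10000;   // hypothetical chunk size
int pageIndex = 0;
DataSet batch;

do
{
    // Fetch one chunk at a time instead of the whole result set.
    batch = GetDataSet("MyDatabase", "usp_GetLicensesPage",
                       pageIndex, batchSize);

    // ...process this chunk, then let it go out of scope...
    pageIndex++;
}
while (batch.Tables[0].Rows.Count == batchSize);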

Drinkable answered 13/4, 2012 at 6:39 Comment(0)

Where does it fail?

I agree that your issue is probably that your dataset of 600,000 rows is just too large. I see that you are then adding it to Session; if you are using SQL Server session state, it will have to serialize that data as well.

Even if you dispose of your objects properly, you will always have at least two copies of this dataset in memory if you run it twice, once in Session and once in procedural code. This will never scale in a web application.

Do the math: 600,000 rows, at even one 128-bit GUID per row, would yield 9.6 megabytes (600,000 × 16 bytes) of data alone, not to mention the DataSet overhead.

Trim down your results.

Leban answered 10/12, 2008 at 17:54 Comment(0)
