Pros and Cons of using SqlCommand.Prepare in C#?

When I was reading books to learn C# (probably some old Visual Studio 2005 books) I came across advice to always call SqlCommand.Prepare every time I execute a SQL call (whether it's a SELECT, UPDATE, or INSERT on SQL Server 2005/2008) and pass parameters to it. But is that really so?

  1. Should it be done every time? Or just sometimes?

  2. Does it matter whether it's one parameter being passed or five or twenty?

  3. What performance boost should it give, if any? Would it be noticeable at all? (I've been using SqlCommand.Prepare in some places and skipping it in others, and I've never had any problems or noticed any differences.)

For the sake of the question, this is my usual code, but it is more of a general question.

public static decimal pobierzBenchmarkKolejny(string varPortfelID, DateTime data, decimal varBenchmarkPoprzedni, decimal varStopaOdniesienia) {
    const string preparedCommand = @"SELECT [dbo].[ufn_BenchmarkKolejny](@varPortfelID, @data, @varBenchmarkPoprzedni,  @varStopaOdniesienia) AS 'Benchmark'";
    using (var varConnection = Locale.sqlConnectOneTime(Locale.sqlDataConnectionDetailsDZP)) //if (varConnection != null) {
    using (var sqlQuery = new SqlCommand(preparedCommand, varConnection)) {
        sqlQuery.Prepare();
        sqlQuery.Parameters.AddWithValue("@varPortfelID", varPortfelID);
        sqlQuery.Parameters.AddWithValue("@varStopaOdniesienia", varStopaOdniesienia);
        sqlQuery.Parameters.AddWithValue("@data", data);
        sqlQuery.Parameters.AddWithValue("@varBenchmarkPoprzedni", varBenchmarkPoprzedni);
        using (var sqlQueryResult = sqlQuery.ExecuteReader())
            if (sqlQueryResult != null) {
                while (sqlQueryResult.Read()) {

                }
            }
    }
}

Additional clarification:

If I move sqlQuery.Prepare() as in the code below, an exception is thrown saying that the size has to be explicitly declared, which basically leads me to think that having sqlQuery.Prepare() first makes it useless. Can someone show the proper usage using my example?

public static decimal pobierzBenchmarkKolejny(string varPortfelID, DateTime data, decimal varBenchmarkPoprzedni, decimal varStopaOdniesienia) {
    const string preparedCommand = @"SELECT [dbo].[ufn_BenchmarkKolejny](@varPortfelID, @data, @varBenchmarkPoprzedni,  @varStopaOdniesienia) AS 'Benchmark'";
    using (var varConnection = Locale.sqlConnectOneTime(Locale.sqlDataConnectionDetailsDZP)) //if (varConnection != null) {
    using (var sqlQuery = new SqlCommand(preparedCommand, varConnection)) {

        sqlQuery.Parameters.AddWithValue("@varPortfelID", varPortfelID);
        sqlQuery.Parameters.AddWithValue("@varStopaOdniesienia", varStopaOdniesienia);
        sqlQuery.Parameters.AddWithValue("@data", data);
        sqlQuery.Parameters.AddWithValue("@varBenchmarkPoprzedni", varBenchmarkPoprzedni);
        sqlQuery.Prepare();
        using (var sqlQueryResult = sqlQuery.ExecuteReader())
            if (sqlQueryResult != null) {
                while (sqlQueryResult.Read()) {

                }
            }
    }
}

How would I do that? By setting .Size on the parameters and using varPortfelID.Length if it's a string, etc.?

Daughtry answered 15/3, 2010 at 19:23 Comment(1)
Partly why you're not noticing a difference is that you're changing the schema of the query by declaring the parameters after running Prepare. Also, partially declared parameters (such as those made via AddWithValue) have significant issues, as SQL Server guesses the type, size, and nullability of those values from context (leading to unnecessary casts and conversions). I'd recommend making an extension method of AddWithValue which takes the missing info, so that you can stop treating your SQL parameters like JavaScript variables but retain the one-line declarations. It also prevents exceptions.Triforium
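A minimal sketch of the extension method suggested in the comment above; the class name, overload shape, and size argument are just one possible design, not part of ADO.NET:

// Requires: using System; using System.Data; using System.Data.SqlClient;
public static class SqlParameterCollectionExtensions {
    // Hypothetical AddWithValue overload that also takes the type and size the provider
    // would otherwise have to guess from the value.
    public static SqlParameter AddWithValue(this SqlParameterCollection parameters,
                                            string name, object value, SqlDbType type, int size) {
        var parameter = new SqlParameter(name, type, size);
        parameter.Value = value ?? DBNull.Value;
        parameters.Add(parameter);
        return parameter;
    }
}

// Usage keeps the one-line declarations, e.g.:
// sqlQuery.Parameters.AddWithValue("@varPortfelID", varPortfelID, SqlDbType.VarChar, 50);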

From the MSDN Documentation:

"Before you call Prepare, specify the data type of each parameter in the statement to be prepared. For each parameter that has a variable length data type, you must set the Size property to the maximum size needed. Prepare returns an error if these conditions are not met.

If you call an Execute method after calling Prepare, any parameter value that is larger than the value specified by the Size property is automatically truncated to the original specified size of the parameter, and no truncation errors are returned.

Output parameters (whether prepared or not) must have a user-specified data type. If you specify a variable length data type, you must also specify the maximum Size."

Furthermore, "If the CommandType property is set to TableDirect, Prepare does nothing. If CommandType is set to StoredProcedure, the call to Prepare should succeed, ..."

Using parameters like this (whether or not you call Prepare) also helps make sure that an end user cannot use a SQL injection technique to add or remove information from the database that you do not want them to.

I looked into it; check out this article: http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlcommand.prepare.aspx. Your issue is that you need to define your parameters, with their types (and sizes for variable-length types), before you call .Prepare(); the values can be set afterwards. Right now you are doing both in one step with AddWithValue. I would try something like this (note: I didn't test it, so my syntax might be a bit off).

public static decimal pobierzBenchmarkKolejny(string varPortfelID, DateTime data, decimal varBenchmarkPoprzedni, decimal varStopaOdniesienia) {
    const string preparedCommand = @"SELECT [dbo].[ufn_BenchmarkKolejny](@varPortfelID, @data, @varBenchmarkPoprzedni,  @varStopaOdniesienia) AS 'Benchmark'";
    using (var varConnection = Locale.sqlConnectOneTime(Locale.sqlDataConnectionDetailsDZP)) //if (varConnection != null) {
    using (var sqlQuery = new SqlCommand(preparedCommand, varConnection)) {

        sqlQuery.Parameters.Add("@varPortfelID");
        sqlQuery.Parameters.Add("@varStopaOdniesienia");
        sqlQuery.Parameters.Add("@data");
        sqlQuery.Parameters.Add("@varBenchmarkPoprzedni");

        sqlQuery.Prepare();
        sqlQuery.ExecuteNonQuery();//This might need to be ExecuteReader()

        sqlQuery.Parameters[0].Value = varPortfelID;
        sqlQuery.Parameters[1].Value = varStopaOdniesienia;
        sqlQuery.Parameters[2].Value = data;
        sqlQuery.Parameters[3].Value = varBenchmarkPoprzedni;

        using (var sqlQueryResult = sqlQuery.ExecuteReader())
            if (sqlQueryResult != null) {
                while (sqlQueryResult.Read()) {

                }
            }
    }
}
Luigi answered 15/3, 2010 at 19:26 Comment(8)
What about performance and other things? Is it only to prevent SQL injection? I read the MSDN documentation myself prior to asking this question, and the quoted part isn't really useful regarding my questions about whether to use it or not, and whether it has any performance benefits, which some websites seem to suggest.Daughtry
My understanding is by making everything defined it can cache and reuse the query plan. I would guess that it doesn't give you an extreme amount of benefit.Luigi
Is the usage I have in the code correct? I tried moving Prepare to after all the parameters are set and the error is thrown. So should sqlQuery.Prepare always be called before the parameters, or?Daughtry
Well, partially. You didn't notice in the link: SqlParameter idParam = new SqlParameter("@id", SqlDbType.Int, 0); SqlParameter descParam = new SqlParameter("@desc", SqlDbType.Text, 100); idParam.Value = 20; which is what it is all about, I guess? But wow, if the solution is to add 4+ new lines just to use SqlCommand.Prepare(), I will probably be better off without it.Daughtry
Surely there has to be a better way?Daughtry
The better way is not to use it. The biggest benefit it gives you is caching the compiled query plan, which matters for complex queries that take a long time to compile. The benefit for stopping SQL injection is not huge, but it helps a little. So in this specific case it probably isn't worth it.Luigi
Okay, as for this being worth it: for any query that you are executing more than 10 times, IT IS WORTH IT. The speed-up alone can be around a factor of 4-5. However, the call to Prepare itself does take some processing time. Between that and the "extra lines" to set the parameter types, the overhead is negligible. I've got a query I'm executing tens of thousands of times, and this is a HUGE time-saver for me. The other benefit of protecting against SQL injection is huge as well for anyone who doesn't want their users breaking their security.Abscise
As soon as you are using SqlParameters with your SqlCommand, the SQL statements are wrapped in sp_executesql calls with parameters. This results in an execution plan that can be reused regardless of the values you pass for your parameters. So Prepare has no additional performance effect from plan caching, because the plans are cached and reused anyway.Extensible
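To illustrate the comment above, a plain parameterized command (no Prepare involved) already goes to the server as an sp_executesql call. The table, value, and trace text below are made up and approximate, and varConnection is assumed to be an open connection:

// Requires: using System.Data; using System.Data.SqlClient;
using (var cmd = new SqlCommand(
        "SELECT PortfelID FROM dbo.Portfele WHERE PortfelID = @id", varConnection)) {
    cmd.Parameters.Add("@id", SqlDbType.VarChar, 50).Value = "ABC";

    // Roughly what a SQL Server Profiler trace shows for the execution below - already a
    // parameterized call whose plan can be cached and reused for other @id values:
    //   exec sp_executesql N'SELECT PortfelID FROM dbo.Portfele WHERE PortfelID = @id',
    //                      N'@id varchar(50)', @id = 'ABC'
    using (var reader = cmd.ExecuteReader()) {
        while (reader.Read()) { /* ... */ }
    }
}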

The other benefit is that by doing this the SQL query plan is compiled, cached, and reused. This is not a big deal for a low volume of calls to your query, but if you execute it a lot there really are significant performance advantages to doing this.
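For illustration, a rough sketch of the prepare-once, execute-many pattern this refers to; the method, table, and column names are made up and the SqlDbType/size choices are guesses:

// Requires: using System.Collections.Generic; using System.Data; using System.Data.SqlClient;
public static void zapiszNotowania(string connectionString, IDictionary<string, decimal> rows) {
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
            "INSERT INTO dbo.Notowania (PortfelID, Wartosc) VALUES (@id, @value)", connection)) {
        connection.Open();

        // Declare the parameter types once; variable-length types also need a Size,
        // and for decimals you may want to set Precision/Scale explicitly as well.
        command.Parameters.Add("@id", SqlDbType.VarChar, 50);
        command.Parameters.Add("@value", SqlDbType.Decimal);

        command.Prepare(); // compiled once for the whole batch

        foreach (var row in rows) {
            // Only the values change between executions; the prepared plan is reused.
            command.Parameters["@id"].Value = row.Key;
            command.Parameters["@value"].Value = row.Value;
            command.ExecuteNonQuery();
        }
    }
}

The benefit comes from the loop: one compilation, many executions. For a one-off query there is little or nothing to gain.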

Unload answered 15/3, 2010 at 20:39 Comment(2)
Can you add any supporting links?Monocoque
The plan is reused anyway because of the SqlParameters used with the SqlCommand object: .NET wraps the SQL statement in an sp_executesql call that generates a reusable plan.Extensible

From my own experience: the performance boost is VERY significant. Some time ago I worked on a project where we used our own object-relational mapping. We used a huge database as the persistent store of a complex object model, with on-demand object loading and weak-referenced object lifetimes.

Using prepared commands was crucial to the success of that application; it alone made the system actually usable.

In other words: if you execute many SQL commands that are exactly the same, or differ only in parameter values, you'll see a huge performance boost.

I don't have exact numbers or links, but I can testify from my own experience.

Shrub answered 30/11, 2013 at 11:7 Comment(1)
I absolutely believe what you are saying - but @OP: please also consider the opposite. Depending on the DB, prepared statements might give the query optimizer a hard time and make it pick the wrong indexes. We saw that especially on DB2 with columns that have a quite uneven value distribution. So, as with all performance advice: you have to measure it for your scenario.Therefor

According to the IDbCommand.Prepare method documentation:

The server automatically caches plans for reuse as necessary; therefore, there is no need to call this method directly in your client application.

I also found this Database Administrators answer, which gives a lot of detail about the Prepare method and why I didn't see any significant improvement.

As for SQL injection, you're protected against it because you used a parameterized SqlCommand, not because you called the Prepare method.
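For example, the difference is whether the value ends up inside the SQL text or travels as a separate typed parameter; the table name below is made up, and varConnection and varPortfelID are borrowed from the question:

// Vulnerable: the input is concatenated into the SQL text, so crafted input can change the query.
var bad = new SqlCommand(
    "SELECT * FROM dbo.Portfele WHERE PortfelID = '" + varPortfelID + "'", varConnection);

// Protected: the value is sent as a typed parameter and is never parsed as SQL,
// with or without a call to Prepare().
var good = new SqlCommand(
    "SELECT * FROM dbo.Portfele WHERE PortfelID = @varPortfelID", varConnection);
good.Parameters.Add("@varPortfelID", SqlDbType.VarChar, 50).Value = varPortfelID;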

Hertha answered 20/2, 2018 at 14:2 Comment(0)
