Performance differences calling sp_executesql with dynamic SQL vs parameters

Given:

CREATE PROCEDURE [dbo].[my_storedproc]
  @param1 int, @param2 varchar(100)
AS 
<<whatever>>
GO

Are there any known performance differences between these execution methods?

-- Method #1:
declare @param1 int = 1
declare @param2 varchar(100) = 'hello'  
exec my_storedproc @param1, @param2

-- Method #2:  
exec my_storedproc @param1=1, @param2='hello'

-- Method #3:  
declare @param1 int = 1
declare @param2 varchar(100) = 'hello'  
declare @procname nvarchar(100) = N'my_storedproc @param1, @param2'
declare @params nvarchar(4000) = N'@param1 int, @param2 varchar(100)'  
exec sp_executesql @procname, @params, @param1, @param2

-- Method #4:  
declare @procname nvarchar(4000) = N'my_storedproc @param1=1, @param2=''hello'''
exec sp_executesql @procname

-- Method #5:  
declare @procname nvarchar(4000) = N'my_storedproc 1, ''hello'''
exec sp_executesql @procname

-- Method #6:  
declare @procname nvarchar(4000) = N'my_storedproc 1, ''hello'''
exec (@procname)

"Why do you ask?" you ask? I am trying to find a way to generically execute stored procedures entirely based upon metadata, the controlling stored procedure that will physically execute all the other configured (in metadata) stored procedures knows nothing about them other than what is defined in the metadata. Within this controller SP, I cannot (in any practical sense) know and declare the specific physical parameters (with their required data types) required for every possible stored proc that might have to be called - I am trying to find a way to execute them entirely generically, while still hopefully maintaining decent performance (reusing query plans, etc).

Lindquist asked 5/3, 2015 at 18:47 Comment(12)
I don't know why so many people think this is a good idea. What you are describing is a single stored procedure that can execute any other stored procedure. This is like creating a single method in .NET that can do anything. Sure it can be done, but the cost is going to be performance. Every time you need to do anything with a stored procedure, it will first have to parse through a bunch of stuff to figure out what to actually do. - Nicknickel
It's interesting how SQL people literally can't conceive of the underlying principle here - not only is this not crazy, it is the very principle that ORM tools are based upon. "This is like creating a single method in .NET that can do anything" - no, it isn't; you still have individual stored procs. "The cost is going to be performance" - is it? That's the question. That's what detractors said about dynamic SQL with ORMs, and they were proven to be incorrect. - Lindquist
Right, but an ORM generates the SQL dynamically, and I have not seen an ORM that produces great SQL yet. It is incredibly complicated, which is why the SQL is always so unmanageable. I think the SQL coming out of something with this additional layer of abstraction would be even more crazy. I would be very leery of using method 3 or 4 that you described, because it would be difficult to prevent SQL injection without using parameters. And method 1 isn't going to be generic enough for what you are trying to do. - Nicknickel
As far as I can tell, 1 and 2 are identical in that only the outer query must be compiled (assuming the procedure has a cached plan); 3, 4 and 5 are also identical in that you must compile two lots of SQL, the outer and inner queries (for want of a better term) - probably not a significant detriment for individual queries, but you may get a bloated plan cache. In addition, 3 and 4 look very difficult to make type safe. The 6th method will fail, as you are trying to execute a stored procedure called [my_storedproc 1, ''hello'']. - Krucik
If 6 should be exec (N'my_storedproc 1, ''hello''') then it is just the same as 5. How do you intend on calling the generic stored procedure that calls all other procedures? Why are you trying to reinvent the wheel? If you want ORM-like behaviour, then why not just use an ORM? - Krucik
Method 4 is BY FAR the easiest for me to implement, but the concerns are performance and SQL injection. This is an entirely internal system though, appropriate permissions will be set, and I will be doing some rudimentary checking for injection, so I'm not overly concerned. I'm somewhat more worried about performance with heavier queries. - Lindquist
@Krucik This will be called on demand (proxy data server scenario) or by a scheduled process (off-hours data extraction scenario) - it's not an ORM, it is just based on the same principles. - Lindquist
Most people use SSIS for nightly SQL jobs. Is that not an option here? Will the sprocs you call really vary so much day to day that you can't set something like this up? - Diverticulitis
I suggest that if you are going to do something that has the potential to affect the performance of the entire system, you do some in-depth research first. You should not attempt this until you have read a book on SQL Server internals, one on SQL Server performance tuning, and one on how to read execution plans. You need to understand how SQL Server creates the execution plan and the implications of that for the way you are intending to do business. It makes a difference which SQL Server backend you are using too, as they have changed how these things are interpreted from version to version. - Isoclinal
To get you started: amazon.com/Microsoft-Server-Internals-Developer-Reference/dp/… - Isoclinal
There are different versions of the book above for different versions of SQL Server, but always look for the one written by Kalen Delaney. - Isoclinal
@Diverticulitis The design of this will be such that relatively low-skilled admin staff can register new data sources, avoiding the need for sophisticated skills like those of an SSIS professional. - Lindquist

There really shouldn't be a performance difference between the 6 options since they are all executing the stored procedure and not any SQL statements directly.

However, there is no better indication of performance than testing this on your own system. You already have the 6 test cases so it shouldn't be hard to try each one.
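
For example, one quick way to see whether plans are actually being reused while you run those tests is to look at the plan cache after executing each method a few times (the LIKE filter on the proc name is only for convenience):

-- Check how often each cached plan has been reused after running the tests:
SELECT cp.usecounts, cp.objtype, st.[text]
FROM   sys.dm_exec_cached_plans cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) st
WHERE  st.[text] LIKE N'%my_storedproc%';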

"Within this controller SP, I cannot (in any practical sense) know and declare the specific physical parameters (with their required data types) required for every possible stored proc that might have to be called"

Why not? I don't see why you couldn't dynamically generate the SQL for Methods 2 and 3 based on the output of either of the following queries:

SELECT OBJECT_NAME(sp.[object_id]), *
FROM   sys.parameters sp
WHERE  sp.[object_id] = OBJECT_ID(N'dbo.my_storedproc');

SELECT isp.*
FROM   INFORMATION_SCHEMA.PARAMETERS isp
WHERE  isp.[SPECIFIC_NAME] = N'my_storedproc'
AND    isp.[SPECIFIC_SCHEMA] = N'dbo';

And with that info, you could create a table to contain the various parameter values for each parameter of each proc. In fact, you could even set it up so that some parameters have "global" values used across all variations, while other parameter values are specific to a particular proc.
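
As a rough sketch of that idea (only a sketch: the string-length handling below is simplified, e.g. sys.parameters reports nvarchar lengths in bytes, and the final call assumes you already know which values to pass), the output of sys.parameters can be stitched into a Method 3 style parameter list and call text:

DECLARE @proc sysname = N'dbo.my_storedproc';
DECLARE @params nvarchar(MAX), @call nvarchar(MAX);

-- Build N'@param1 int, @param2 varchar(100)' (string lengths handled naively)
SELECT @params = STUFF((
           SELECT N', ' + sp.[name] + N' ' + TYPE_NAME(sp.user_type_id)
                  + CASE WHEN TYPE_NAME(sp.user_type_id) LIKE N'%char%'
                         THEN N'(' + CASE WHEN sp.max_length = -1 THEN N'max'
                                          ELSE CAST(sp.max_length AS nvarchar(10)) END + N')'
                         ELSE N'' END
           FROM   sys.parameters sp
           WHERE  sp.[object_id] = OBJECT_ID(@proc)
           ORDER BY sp.parameter_id
           FOR XML PATH(''), TYPE).value('.', 'nvarchar(MAX)'), 1, 2, N''),
       -- and N'dbo.my_storedproc @param1, @param2' for the call itself
       @call = @proc + N' ' + STUFF((
           SELECT N', ' + sp.[name]
           FROM   sys.parameters sp
           WHERE  sp.[object_id] = OBJECT_ID(@proc)
           ORDER BY sp.parameter_id
           FOR XML PATH(''), TYPE).value('.', 'nvarchar(MAX)'), 1, 2, N'');

-- @params is now e.g. N'@param1 int, @param2 varchar(100)'
-- @call   is now e.g. N'dbo.my_storedproc @param1, @param2'
-- which can then be executed just like Method 3:
EXEC sp_executesql @call, @params, @param1 = 1, @param2 = 'hello';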

Farrison answered 11/3, 2015 at 19:40 Comment(6)
All the SQL is stored as metadata; it is a mixture of plain SQL and stored procedure calls, and some of it references remote SQL servers. - Lindquist
@Lindquist The question text and examples only refer to stored procs being called by this setup. If the setup will also be calling dynamic SQL then there can be performance differences between some of those 6 options. And then of course even if there are parameters, those won't be discoverable via the two queries I listed. - Farrison
I think my concerns behind this problem (performance of dynamic SQL including parameters via sp_executesql) might be eliminated by moving this to SQLCLR. You make a good point with your two query examples, but I also have to consider non-SP-based query text. With properly formed queries via SQLCLR, though, I would now expect the performance characteristics of what I'm doing to be essentially the same as any standard ORM (which most DBAs seem to hate, but I don't think there's much justification for that, provided they're used correctly). - Lindquist
@Lindquist That is not how this stuff actually works. An execution plan is tied to the full text of the query (using a binary comparison: everything-sensitive). Non-parameterized SQL (EXEC, and SqlCommand with no parameters) will generate a new plan for any change in any parameter value, since that changes the query text. Parameterized SQL (sp_executesql and SqlCommand with params) can reuse a plan, since changing param values doesn't change the query text. SQLCLR doesn't change this. ORMs usually use sp_executesql but don't allow for fine-tuning the queries, hence why they're disliked. - Farrison
I think as long as I render my dynamic SQL as parameterized queries, I should get plan reuse in many cases (at least from what I read, that is the case). BUT, I think there are some quite strict guidelines you have to follow. I think in my case this is premature optimization, but I'd like to have a handle on it. - Lindquist
@Lindquist Yes, what I said in my previous comment was: "Parameterized SQL (sp_executesql with params and SqlCommand with params via SqlParameter) can reuse a plan since changing param values doesn't change the query text". I am not sure what other guidelines you think need to be in place. But there is no difference between using sp_executesql with params, using an ORM (so long as it uses sp_executesql with params), or using SQLCLR (which will be a SqlCommand with SqlParameters); these are all parameterized queries. - Farrison
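
To illustrate the distinction drawn in these comments (dbo.some_table is purely a made-up example table):

-- Non-parameterized: the literal value is part of the query text, so each
-- distinct value produces distinct text and, in general, a separate plan:
EXEC (N'SELECT * FROM dbo.some_table WHERE id = 1');
EXEC (N'SELECT * FROM dbo.some_table WHERE id = 2');

-- Parameterized: the query text never changes, so the plan can be reused:
EXEC sp_executesql N'SELECT * FROM dbo.some_table WHERE id = @id',
                   N'@id int', @id = 1;
EXEC sp_executesql N'SELECT * FROM dbo.some_table WHERE id = @id',
                   N'@id int', @id = 2;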
