Insert Into Table performance questions
carriann last edited by
I am writing a stored procedure whose main purpose is inserting rows into a table.
The procedure will be signed by a certificate and will send email notifications.
It will accept a user-defined table type (a table-valued parameter) and insert its contents into a persistent table.
A couple of questions:
Can insertion into the persistent table cause lock escalation to a table lock on that table?
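For reference: a single INSERT that accumulates enough locks (roughly 5,000 in one statement) can trigger escalation to a table lock. The setting can be inspected and, if necessary, escalation disabled per table; a sketch (the target table name is assumed):

```sql
-- check the current escalation setting for the target table
select name, lock_escalation_desc from sys.tables where name = 'Table';

-- DISABLE prevents escalation to a table-level lock (the other options are
-- AUTO and TABLE, the default); the trade-off is more memory spent tracking
-- many individual row/page locks
alter table dbo.[Table] set (lock_escalation = disable);
```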
Does it make any sense to limit the number of rows allowed to be inserted per procedure execution to a certain limit, say 100, 200, 500 rows etc., to avoid lock escalation?
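SQL Server attempts escalation once a single statement holds about 5,000 locks on one object, so keeping each batch below that threshold is one common motivation for limiting row counts. A hedged sketch of batched inserts from the parameter (it assumes the table type has an id column to page on, which the original post does not show):

```sql
declare @BatchSize int = 500, @Offset int = 0, @Rows int = 1;
while @Rows > 0
begin
    -- each iteration inserts at most @BatchSize rows, keeping the
    -- per-statement lock count well under the escalation threshold
    insert into Database..Table (column1, column2)
    select column1, column2
    from @_TableData
    order by id offset @Offset rows fetch next @BatchSize rows only;

    set @Rows = @@rowcount;
    set @Offset += @BatchSize;
end
```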
Temp tables have some benefits compared to table variables (like statistics).
Inside the stored procedure, does it make any sense to first insert the contents of the table-valued parameter into a #temp table, and only then into the target table?
Or can we skip the #temp table and insert into the persistent table directly from the table variable, without any performance drawbacks?
What locks are taken during an insert? And are there any other performance considerations for inserts?
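For a plain INSERT you would typically see IX (intent-exclusive) locks on the table and its pages, and X locks on the inserted rows or keys. One way to observe this in a test, as a sketch (the sample row and column are assumptions):

```sql
begin transaction;

insert into Database..Table (column1) values (1);  -- sample row

-- inspect the locks held by the current session before rolling back
select resource_type, request_mode, request_status
from sys.dm_tran_locks
where request_session_id = @@spid;

rollback;
```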
update: added simplified procedure code:
create procedure InsertIntoTable (@_TableData TableData readonly) as
begin
    -- let's use an intermediate temp table
    drop table if exists #InsertIntoTable;
    create table #InsertIntoTable (column1 int, column2 int, column3 int,
        column4 int, column5 int, column6 int, column7 int); -- types are placeholders
    -- insert from the table-valued parameter into the temp table
    insert into #InsertIntoTable (column1, column2, column3, column4, column5, column6, column7)
    select column1, column2, column3, column4, column5, column6, column7 from @_TableData;
    -- let's do the actual insert
    insert into Database..Table (column1, column2, column3, column4, column5, column6, column7)
    select column1, column2, column3, column4, column5, column6, column7 from #InsertIntoTable;
end
Since something needs to first populate the table variable that is then passed to the stored procedure, unless the procedure is doing more than what you posted, why not populate the target table directly, skipping both the table variable and the sproc?
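In that case, if the rows originate from a query, the whole pipeline can collapse into one statement; a sketch, where SourceQuery is a hypothetical stand-in for whatever currently fills the table variable:

```sql
-- insert straight into the target, with no table variable or temp table in between
insert into Database..Table (column1, column2, column3, column4, column5, column6, column7)
select column1, column2, column3, column4, column5, column6, column7
from SourceQuery;  -- placeholder for the real source of the rows
```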