Pushing inserts into SQL Server with C#?

blasto_max
Old Hand

Group: General Forum Members
Points: 375 Visits: 480
I want to insert about 5,000 rows (typically; it can be 10K+ at times) into SQL Server at a time. My INSERT statements are generated by a C# program.

One way would be to generate a big string containing all of these SQL statements and then have SQL Server execute it using `SqlCommand.ExecuteNonQuery();`. But there is a risk that SQL Server might not be able to execute a batch that large.

Is there a way I could sort of push the SQL statements to some kind of "SQL Server buffer" (if there is one) and then finally execute them all?

Thanks.
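
For reference, here is a minimal sketch of the "one big string" approach described above. The table name, columns, and the rows/connectionString variables are made-up placeholders, not anything from my actual program:

// requires: using System.Text; using System.Data.SqlClient;
// Build one large batch of generated INSERT statements and send it in a single round trip.
// In real code the values should be parameterized or escaped rather than concatenated.
var sql = new StringBuilder();
foreach (var row in rows)   // rows: the 5,000-10,000 items produced by the C# program
{
    sql.AppendFormat("INSERT INTO dbo.MyTable (Id, Name) VALUES ({0}, '{1}');", row.Id, row.Name);
    sql.AppendLine();
}

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(sql.ToString(), conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();   // one very large ad-hoc batch; this is the risk mentioned above
}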
burfos
Valued Member

Group: General Forum Members
Points: 67 Visits: 290
Perhaps look into doing a bulk insert call?
jcb
SSCrazy

Group: General Forum Members
Points: 2874 Visits: 994
To make a decision you must answer some questions. First, are there dependencies between rows in that big chunk of data?
Second, what do you want to do if an insert fails? Roll back all the inserts, or keep going and create some sort of alert/log entry?
Third, are there performance constraints? Can the batch run overnight, or must the data be persisted ASAP?

I don't think a big blob/string is a good option, because you will need to parse/deserialize it on the SQL side, and in general putting that kind of logic in an SP (or whatever) is not good for performance, reliability, or maintenance.

Search for transaction control and (maybe) stage tables and (maybe) integration strategies.
You can end up doing something like the outline below (a C# sketch of the first procedure follows it):

procedure insert raw data
1- begin transaction
2- loop (insert into stage table)
3- commit tran

procedure process stage table
1- select unprocessed data from stage
2- loop over it
3- begin transaction
4- process and persist the data
5- if ok commit, else rollback
6- log results for that "row"
7- end loop

But of course that depends on your requirements.
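
A minimal C# sketch of the first procedure (insert the raw data into a stage table inside one transaction). The stage table and column names, and the rows/connectionString variables, are hypothetical placeholders:

// requires: using System.Data; using System.Data.SqlClient;
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var tran = conn.BeginTransaction())
    using (var cmd = new SqlCommand(
        "INSERT INTO dbo.StageRows (Payload) VALUES (@payload);", conn, tran))
    {
        var payload = cmd.Parameters.Add("@payload", SqlDbType.NVarChar, -1);
        foreach (string row in rows)      // rows: the generated values to load
        {
            payload.Value = row;
            cmd.ExecuteNonQuery();        // one parameterized insert per row, all in one transaction
        }
        tran.Commit();                    // nothing is persisted if any insert throws before this line
    }
}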
Lowell
One Orange Chip

Group: General Forum Members
Points: 28347 Visits: 39959
Bulk insert into a single table is incredibly fast. Switching from generating INSERT statements to putting the data into a local DataTable, and then either synchronizing the changes or using bulk methods, is much better, and it also maintains data type integrity.

Here's a C# code example, where previous processes have put the data I want into a DataTable:


//now use SqlBulkCopy to get the data into the server, instead of RBAR:
//note my connection string is based on String.Replace of this:
//Private Const SqlConnectionFormat As String = "data source={0};initial catalog={1};user id={2};password={3};Trusted_Connection=False;Application Name=ReportGrid.exe;"
//Private Const SqlTrustedConnectionFormat As String = "data source={0};initial catalog={1};Trusted_Connection=True;Application Name=ReportGrid.exe;"

try
{
    // dispose the connection when done; dt is the DataTable filled by the earlier steps
    using (SqlConnection myConn = new SqlConnection(this.ConnectionString))
    {
        myConn.Open();
        using (SqlBulkCopy myBulkCopy = new SqlBulkCopy(myConn))
        {
            myBulkCopy.DestinationTableName = "[" + DestinationTable + "]";
            myBulkCopy.WriteToServer(dt);
        }
    }
}
catch (Exception ex)
{
    Debug.Print(ex.Message);
}
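
The DataTable that WriteToServer consumes just needs columns that line up with the destination table. A minimal sketch, with made-up column names and a hypothetical generatedItems source:

// requires: using System.Data;
DataTable dt = new DataTable();
dt.Columns.Add("Id", typeof(int));        // one column per destination column,
dt.Columns.Add("Name", typeof(string));   // with matching names/types (or use ColumnMappings)
foreach (var item in generatedItems)      // generatedItems: whatever the C# program produced
{
    dt.Rows.Add(item.Id, item.Name);
}
// dt is then passed to myBulkCopy.WriteToServer(dt) as in the snippet above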




Lowell

--
Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement for that table to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

jcb
SSCrazy

Group: General Forum Members
Points: 2874 Visits: 994
I learned something today, thanks!
(A bit of a shame for me, as a C# developer.)

The last time I needed to do something like that (pumping data from a terabyte text file), Framework 2.0 had not been released yet.
But I'll be sure to keep this in mind the next time I need to implement that kind of application.
Jeff Moden
SSC Guru

Group: General Forum Members
Points: 86754 Visits: 41103
I solved a similar problem before. Some developers for a 3rd party developed front-end code that passed 250,000 individual inserts as part of an "initialization" for a new "project" in the code. After looking at their code, it turned out that the front-end code was doing all of that work based on just 4 parameters. I wrote a proc to do the same thing the front-end code was doing, and it ran in sub-second times, not to mention preventing a meltdown of the copper between the front end and the server. :-P
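
A minimal sketch of that pattern (the procedure name, parameter names, and variables are all made up for illustration): one parameterized call instead of thousands of individual inserts.

// requires: using System.Data; using System.Data.SqlClient;
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.usp_InitializeProject", conn))  // hypothetical proc name
{
    cmd.CommandType = CommandType.StoredProcedure;
    // the four parameters are placeholders; the real proc generated all 250,000 rows set-based from them
    cmd.Parameters.AddWithValue("@ProjectId", projectId);
    cmd.Parameters.AddWithValue("@TemplateId", templateId);
    cmd.Parameters.AddWithValue("@StartDate", startDate);
    cmd.Parameters.AddWithValue("@RowCount", rowCount);
    conn.Open();
    cmd.ExecuteNonQuery();   // one round trip; the heavy lifting happens inside the proc
}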

--Jeff Moden

RBAR is pronounced ree-bar and is a Modenism for Row-By-Agonizing-Row.
First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column.
If you think it's expensive to hire a professional to do the job, wait until you hire an amateur. -- Red Adair

Helpful Links:
How to post code problems
How to post performance problems
Forum FAQs
blasto_max
Old Hand

Group: General Forum Members
Points: 375 Visits: 480
burfos (11/19/2013)
Perhaps look into doing a bulk insert call?


Thanks. How do I do that?
blasto_max
Old Hand

Group: General Forum Members
Points: 375 Visits: 480
jcb (11/20/2013)
To make a decision you must answer some questions. First, are there dependencies between rows in that big chunk of data?
Second, what do you want to do if an insert fails? Roll back all the inserts, or keep going and create some sort of alert/log entry?
Third, are there performance constraints? Can the batch run overnight, or must the data be persisted ASAP?

I don't think a big blob/string is a good option, because you will need to parse/deserialize it on the SQL side, and in general putting that kind of logic in an SP (or whatever) is not good for performance, reliability, or maintenance.

Search for transaction control and (maybe) stage tables and (maybe) integration strategies.
You can end up doing something like:

procedure insert raw data
1- begin transaction
2- loop (insert into stage table)
3- commit tran

procedure process stage table
1- select unprocessed data from stage
2- loop over it
3- begin transaction
4- process and persist the data
5- if ok commit, else rollback
6- log results for that "row"
7- end loop

But of course that depends on your requirements.


Wow, that's quite a lot of tips for a beginner like me to digest. Thanks for all those points.

There are no dependencies between them. But can you give me an example of cases where dependencies might arise?

If an insert fails, then roll back.

Thank you very much.
blasto_max
Old Hand

Group: General Forum Members
Points: 375 Visits: 480
Lowell (11/20/2013)
Bulk insert into a single table is incredibly fast. Switching from generating INSERT statements to putting the data into a local DataTable, and then either synchronizing the changes or using bulk methods, is much better, and it also maintains data type integrity.

Here's a C# code example, where previous processes have put the data I want into a DataTable:


//now use SqlBulkCopy to get the data into the server, instead of RBAR:
//note my connection string is based on String.Replace of this:
//Private Const SqlConnectionFormat As String = "data source={0};initial catalog={1};user id={2};password={3};Trusted_Connection=False;Application Name=ReportGrid.exe;"
//Private Const SqlTrustedConnectionFormat As String = "data source={0};initial catalog={1};Trusted_Connection=True;Application Name=ReportGrid.exe;"

try
{
    using (SqlConnection myConn = new SqlConnection(this.ConnectionString))
    {
        myConn.Open();
        using (SqlBulkCopy myBulkCopy = new SqlBulkCopy(myConn))
        {
            myBulkCopy.DestinationTableName = "[" + DestinationTable + "]";
            myBulkCopy.WriteToServer(dt);
        }
    }
}
catch (Exception ex)
{
    Debug.Print(ex.Message);
}




Thanks. I will try this. It seems closer to my "dump into a SQL Server buffer" requirement.
blasto_max
Old Hand

Group: General Forum Members
Points: 375 Visits: 480
Jeff Moden (11/21/2013)
I solved a similar problem before. Some developers for a 3rd party developed front-end code that passed 250,000 individual inserts as part of an "initialization" for a new "project" in the code. After looking at their code, it turned out that the front-end code was doing all of that work based on just 4 parameters. I wrote a proc to do the same thing the front-end code was doing, and it ran in sub-second times, not to mention preventing a meltdown of the copper between the front end and the server. :-P


Thanks for sharing your experience. I am new, so it's hard for me to fully understand why the 3rd-party solution was slow. Can you answer some questions for me?

1 - Individual inserts, as in row by agonizing row? That's what I was ordered to do a while ago.
2 - Can you tell me the differences between the front end and the SP, and why the latter was faster?

Thanks.