
Pushing inserts into SQL Server with C#?
Posted Tuesday, November 19, 2013 3:27 PM
I want to insert about 5,000 rows (typically; it can be 10K+ at times) into SQL Server at a time. My INSERT statements are generated by a C# program.

One way would be to simply generate a big string full of these SQL statements and then have SQL Server execute it using `SqlCommand.ExecuteNonQuery();`. But there is a risk that SQL Server might not be able to execute a batch that big.
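Roughly, what I mean is something like this minimal sketch (the table, columns, and connection string here are made up for illustration):

// minimal sketch of the "one big string" approach; names are illustrative
using System.Data.SqlClient;
using System.Text;

class BatchInsertSketch
{
    static void Main()
    {
        const string connectionString =
            "data source=.;initial catalog=MyDb;Trusted_Connection=True;";

        var sql = new StringBuilder();
        for (int i = 0; i < 5000; i++)
        {
            // each generated INSERT gets appended to one big batch
            sql.AppendLine("INSERT INTO dbo.MyTable (Id, Payload) VALUES ("
                           + i + ", 'row " + i + "');");
        }

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql.ToString(), conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery(); // one round trip, one big batch
        }
    }
}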

Is there a way I could push the SQL statements to some kind of "SQL Server buffer" (if there is one) and then finally execute them all?

Thanks.
Post #1515817
Posted Tuesday, November 19, 2013 9:29 PM
Perhaps look into doing a bulk insert call?
Post #1515876
Posted Wednesday, November 20, 2013 3:26 AM


To make a decision you must answer some questions. First, are there dependencies between rows in that big chunk of data?
Second, what do you want to do if an insert fails? Roll back all the inserts, or just keep going and create some sort of alert/logging?
Third, are there performance constraints? Can the batch run overnight, or must that data be persisted ASAP?

I don't think a big blob/string is a good option, because you will need to parse/deserialize it (SQL side) and, in general, putting that kind of logic in an SP (or whatever) is not good for performance, reliability, or maintenance.

Search for transaction control and (maybe) stage tables and (maybe) integration strategies.
You can end up doing something like this:

procedure: insert raw data
1 - begin transaction
2 - loop (insert into the stage table)
3 - commit tran

procedure: process stage table
1 - select unprocessed data from the stage table
2 - loop over it:
3 - begin transaction
4 - process and persist the data
5 - if ok, commit; else, rollback
6 - log the results for that "row"
7 - end loop

But of course, that depends on your requirements.
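For illustration, a minimal C# sketch of the client side of this pattern; the stage table dbo.StageTable, its columns, and the procedure dbo.ProcessStageTable are all hypothetical names:

// hedged sketch of the staging pattern above; table/proc names are hypothetical
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class StagedLoader
{
    public static void LoadAndProcess(
        string connectionString, IEnumerable<KeyValuePair<int, string>> rows)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // procedure "insert raw data": stage everything in one transaction
            using (var tran = conn.BeginTransaction())
            {
                foreach (var row in rows)
                {
                    using (var cmd = new SqlCommand(
                        "INSERT INTO dbo.StageTable (Id, Payload) VALUES (@id, @payload);",
                        conn, tran))
                    {
                        cmd.Parameters.AddWithValue("@id", row.Key);
                        cmd.Parameters.AddWithValue("@payload", row.Value);
                        cmd.ExecuteNonQuery();
                    }
                }
                tran.Commit(); // all staged rows land together, or none do
            }

            // procedure "process stage table": row-by-row processing;
            // per-row commit/rollback and logging happen inside the proc
            using (var cmd = new SqlCommand("dbo.ProcessStageTable", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.ExecuteNonQuery();
            }
        }
    }
}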
Post #1515971
Posted Wednesday, November 20, 2013 6:05 AM


Bulk insert into a single table is incredibly fast. Switching away from generating INSERT statements to sticking the data into a local DataTable, and then either synchronizing the changes or using bulk methods, is much better, and it maintains data type integrity as well.

Here's a C# code example, where previous processes have put the data I want into a DataTable:

// now use SqlBulkCopy to get the data into the server, instead of RBAR
// note: my connection string is built with String.Format from one of these
// (constants from the same project, in VB):
// Private Const SqlConnectionFormat As String = "data source={0};initial catalog={1};user id={2};password={3};Trusted_Connection=False;Application Name=ReportGrid.exe;"
// Private Const SqlTrustedConnectionFormat As String = "data source={0};initial catalog={1};Trusted_Connection=True;Application Name=ReportGrid.exe;"

try {
    using (SqlConnection myConn = new SqlConnection(this.ConnectionString)) {
        myConn.Open();
        using (SqlBulkCopy myBulkCopy = new SqlBulkCopy(myConn)) {
            // bracket the table name in case it needs quoting
            myBulkCopy.DestinationTableName = "[" + DestinationTable + "]";
            myBulkCopy.WriteToServer(dt); // dt is the DataTable built earlier
        }
    }
} catch (Exception ex) {
    Debug.Print(ex.Message);
}
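To show what feeds WriteToServer, a hedged sketch of building such a DataTable; the column names and types here are made up and need to line up with the destination table:

// requires: using System.Data;
DataTable dt = new DataTable();
dt.Columns.Add("Id", typeof(int));
dt.Columns.Add("Payload", typeof(string));

for (int i = 0; i < 5000; i++) {
    DataRow row = dt.NewRow();
    row["Id"] = i;
    row["Payload"] = "row " + i;
    dt.Rows.Add(row);
}
// without explicit ColumnMappings, SqlBulkCopy maps columns by ordinal,
// so the DataTable's column order should match the destination table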




Lowell

--There is no spoon, and there's no default ORDER BY in sql server either.
Actually, Common Sense is so rare, it should be considered a Superpower. --my son
Post #1516004
Posted Thursday, November 21, 2013 5:57 AM


I learned something today, thanks!
(A bit of a shame for me, as a C# developer.)

The last time I needed to do something like that (pumping data out of a terabyte text file), Framework 2.0 was not released yet.
But for sure I'll keep this in mind the next time I need to implement that kind of application.
Post #1516388
Posted Thursday, November 21, 2013 8:44 PM


I solved a similar problem before. Some developers for a 3rd party wrote front-end code that passed 250,000 individual inserts as part of an "initialization" for a new "project" in the code. After looking at their code, it turned out that the front end was doing all of that work based on just 4 parameters. I wrote a proc to do the same thing the front-end code was doing, and it ran in sub-second times, not to mention that it prevented a meltdown of the copper between the front end and the server.

--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1516634
Posted Friday, November 22, 2013 4:49 PM
burfos (11/19/2013)
Perhaps look into doing a bulk insert call?


Thanks. How do I do that?
Post #1516967
Posted Friday, November 22, 2013 4:52 PM
jcb (11/20/2013)
To make a decision you must answer some questions. First, are there dependencies between rows in that big chunk of data? Second, what do you want to do if an insert fails? Third, are there performance constraints? ...


Wow, that's quite a lot of tips for a beginner like me to digest. Thanks for all those points.

There are no dependencies between the rows. But can you give me an example of cases where dependencies might arise?

If an insert fails, then roll back everything.
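Concretely, a hedged sketch of what I'm after: SqlBulkCopy enlisted in a transaction, so any failure rolls the whole batch back. The table name and connection string are made up, and dt is a DataTable as in Lowell's example:

// hedged sketch: roll back the whole bulk load if anything fails
using (SqlConnection conn = new SqlConnection(connectionString)) {
    conn.Open();
    using (SqlTransaction tran = conn.BeginTransaction()) {
        try {
            using (SqlBulkCopy bulk = new SqlBulkCopy(
                       conn, SqlBulkCopyOptions.Default, tran)) {
                bulk.DestinationTableName = "dbo.MyTable";
                bulk.WriteToServer(dt); // all 5,000+ rows in one bulk operation
            }
            tran.Commit();   // everything lands together
        } catch {
            tran.Rollback(); // any failure undoes the whole batch
            throw;
        }
    }
}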

Thank you very much.
Post #1516968
Posted Friday, November 22, 2013 4:54 PM
Lowell (11/20/2013)
Bulk insert into a single table is incredibly fast. Switching away from generating INSERT statements to sticking the data into a local DataTable, and then either synchronizing the changes or using bulk methods, is much better...

Thanks. I will try this. It seems closest to my "dump into a SQL Server buffer" requirement.
Post #1516969
Posted Friday, November 22, 2013 5:00 PM
Jeff Moden (11/21/2013)
I solved a similar problem before. Some developers for a 3rd party wrote front-end code that passed 250,000 individual inserts... I wrote a proc to do the same thing, and it ran in sub-second times.


Thanks for sharing your experience. I am new, so it's hard for me to fully understand why the 3rd-party solution was slow. Can you answer some questions for me?

1 - Individual inserts, as in row by agonizing row? That's what I was ordered to do a while ago.
2 - Can you tell me the differences between the front end and the stored procedure, and why the latter was faster?

Thanks.
Post #1516972