I am migrating a program from Microsoft SQL Server to MySQL. Everything works except for one issue with bulk copy.
In the MS SQL solution, the code looks like this:

connection.Open();
SqlBulkCopy bulkCopy = new SqlBulkCopy(connection);
bulkCopy.DestinationTableName = "testTable";
bulkCopy.WriteToServer(rawData);
Now I am trying to do something similar for MySQL. Because I think there would be bad performance, I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class.
Any help would be highly appreciated.
"Because I think there would be bad performance I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class."
Don't rule out a possible solution based on unfounded assumptions. I just tested the insertion of 100,000 rows from a System.Data.DataTable into a MySQL table using a standard MySqlDataAdapter#Update() inside a Transaction. It consistently took about 30 seconds to run:
using (MySqlTransaction tran = conn.BeginTransaction(System.Data.IsolationLevel.Serializable))
{
    using (MySqlCommand cmd = new MySqlCommand())
    {
        cmd.Connection = conn;
        cmd.Transaction = tran;
        cmd.CommandText = "SELECT * FROM testtable";
        using (MySqlDataAdapter da = new MySqlDataAdapter(cmd))
        {
            da.UpdateBatchSize = 1000;
            using (MySqlCommandBuilder cb = new MySqlCommandBuilder(da))
            {
                da.Update(rawData);
                tran.Commit();
            }
        }
    }
}
(I tried a couple of different values for UpdateBatchSize, but they didn't seem to have a significant impact on the elapsed time.)
By contrast, the following code using MySqlBulkLoader took only 5 or 6 seconds to run ...
string tempCsvFileSpec = @"C:\Users\Gord\Desktop\dump.csv";
using (StreamWriter writer = new StreamWriter(tempCsvFileSpec))
{
    Rfc4180Writer.WriteDataTable(rawData, writer, false);
}
var msbl = new MySqlBulkLoader(conn);
msbl.TableName = "testtable";
msbl.FileName = tempCsvFileSpec;
msbl.FieldTerminator = ",";
msbl.FieldQuotationCharacter = '"';
msbl.Load();
System.IO.File.Delete(tempCsvFileSpec);
... including the time to dump the 100,000 rows from the DataTable to a temporary CSV file (using code similar to this), bulk-loading that file, and deleting the file afterwards.
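The Rfc4180Writer used above is not shown here, but the idea is straightforward: write each DataTable row as a CSV line, quoting any field that contains a comma, a double quote, or a line break, and doubling embedded double quotes, as RFC 4180 requires. The following is a minimal sketch of such a writer; the class name SimpleCsvWriter and its signature are my own illustration, not the actual Rfc4180Writer referenced above:

```csharp
using System;
using System.Data;
using System.IO;

public static class SimpleCsvWriter
{
    // Writes a DataTable as RFC 4180-style CSV to the given TextWriter.
    public static void WriteDataTable(DataTable table, TextWriter writer, bool includeHeaders)
    {
        if (includeHeaders)
        {
            var headers = new string[table.Columns.Count];
            for (int i = 0; i < table.Columns.Count; i++)
                headers[i] = Quote(table.Columns[i].ColumnName);
            writer.WriteLine(string.Join(",", headers));
        }
        foreach (DataRow row in table.Rows)
        {
            var fields = new string[table.Columns.Count];
            for (int i = 0; i < table.Columns.Count; i++)
                fields[i] = Quote(Convert.ToString(row[i]));
            writer.WriteLine(string.Join(",", fields));
        }
    }

    // RFC 4180: fields containing commas, double quotes, or line breaks
    // must be wrapped in double quotes, with embedded quotes doubled.
    private static string Quote(string field)
    {
        if (field.IndexOfAny(new[] { ',', '"', '\r', '\n' }) >= 0)
            return "\"" + field.Replace("\"", "\"\"") + "\"";
        return field;
    }
}
```

Note that the field quotation character chosen here must match the FieldQuotationCharacter configured on the MySqlBulkLoader, or quoted fields will not be parsed correctly during the load.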