Go: trying to improve insert speed with MySQL












Hi, I need to upload an enormous amount of small text records into MySQL.
Unfortunately there is no bulk operation available to me with MySQL, so I am trying to use goroutines to parallelize the transactions.
The problem is that all this concurrency and data-race business drives me a bit crazy.



And I am not sure whether what I have come up with is any good.



Simplified, the code looks like this: the huge file is scanned line by line, and the lines are appended to a slice; when the slice reaches 1000 entries, a goroutine takes it away:



sem := make(chan int, 10) // transaction pool: at most 10 concurrent transactions
sem2 := make(chan int)    // auxiliary blocking semaphore
for scanner.Scan() {
    line := scanner.Text()
    lines = append(lines, line)
    if len(lines) > 1000 {
        sem <- 1 // keep at most 10 transactions in flight
        go func(mylines ...lineT) {
            // I use a variadic parameter to avoid pointer issues;
            // I want to pass the data by value.
            <-sem2 // all lines of the slice are handed over; release the lock
            gopher(mylines...) // gopher runs the transaction, iterating
            // over every line; here I could probably
            // use a plain slice instead.
            <-sem // transaction done; release the slot
        }(lines...)
        sem2 <- 1 // this ensures the slice is only reset after the
        // values are passed to the func; otherwise lines
        // could be nil before the goroutine fires
        lines = nil // reset slice
    }
}
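
For comparison, the same batching can be written as a plain worker pool, which avoids the second semaphore entirely. This is only a sketch under assumptions: insertBatch stands in for gopher, huge.txt is a made-up file name, and error handling is minimal. The key point is that sending the slice on a channel hands its backing array to exactly one worker, and setting lines = nil makes the next append allocate fresh storage, so no copy and no handshake is needed.

package main

import (
    "bufio"
    "log"
    "os"
    "sync"
)

// insertBatch stands in for gopher: it would run one transaction
// inserting every line of the batch.
func insertBatch(batch []string) {
    _ = batch
}

func main() {
    f, err := os.Open("huge.txt") // made-up file name
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()

    batches := make(chan []string)
    var wg sync.WaitGroup

    // fixed pool of 10 workers; one transaction in flight per worker
    for i := 0; i < 10; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            for batch := range batches {
                insertBatch(batch)
            }
        }()
    }

    scanner := bufio.NewScanner(f)
    var lines []string
    for scanner.Scan() {
        lines = append(lines, scanner.Text())
        if len(lines) >= 1000 {
            batches <- lines // the receiving worker now owns the backing array
            lines = nil      // next append allocates fresh storage
        }
    }
    if len(lines) > 0 {
        batches <- lines // flush the final partial batch
    }
    close(batches)
    wg.Wait()
}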


How can I solve this better?
I know I could do a bulk import via the MySQL utilities, but that is not possible here. Nor can I hand-build an INSERT with many values, like VALUES ("1", "2"), ("3", "4"), because the values are not properly escaped and I just get errors.



This way looks a tad weird, but not as weird as my first approach:



func gopher2(lines []lineT) {
    q := "INSERT INTO main_data(text) VALUES "
    var aur []string        // one "(?)" group per line
    var inter []interface{} // the values bound to the placeholders
    for _, l := range lines {
        aur = append(aur, "(?)")
        inter = append(inter, l)
    }
    q = q + strings.Join(aur, ", ")

    if _, err := db.Exec(q, inter...); err != nil {
        log.Println(err)
    }
}
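
To make the placeholder expansion concrete, for a three-line batch gopher2 builds the following (illustrative values; this assumes lineT is a string type):

lines := []lineT{"a", "it's got 'quotes'", "c"}
gopher2(lines)
// executes: INSERT INTO main_data(text) VALUES (?), (?), (?)
// with args "a", "it's got 'quotes'", "c"; the driver binds each
// argument, so quoting inside the data cannot break the statement
// the way the hand-built VALUES strings did.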









mysql go






edited Nov 22 at 20:28

























asked Nov 22 at 17:47









MikeKlemin

  • I would recommend figuring out why things aren't escaping properly during your mass insert. It's probably an indication that your data isn't properly sanitized for your database.
    – Snake14
    Nov 22 at 17:56

  • Well, but db.Exec(q, value) has no problems with the same data.
    – MikeKlemin
    Nov 22 at 18:00

  • That looks like a prepared-statement method, which means it most likely escapes any values you pass it before it inserts them into your query string.
    – Snake14
    Nov 22 at 18:08

  • Well, I guess it's worth a try to build one long prepared statement and see. Thanks!
    – MikeKlemin
    Nov 22 at 18:31
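
For reference, the "one long prepared statement" idea from the last comment might look like the sketch below. This is an assumption-heavy illustration: prepareBulk, bulkStmt and gopherPrepared are invented names, it reuses db, lineT, strings and log from the code above (plus database/sql imported as sql), and it only handles full batches; a final short batch would still need a one-off Exec like gopher2. Note that MySQL caps a single prepared statement at 65,535 placeholders, so the batch size cannot grow without bound.

const batchSize = 1000

var bulkStmt *sql.Stmt // prepared once, reused for every full batch

func prepareBulk(db *sql.DB) (err error) {
    // builds "INSERT INTO main_data(text) VALUES (?), (?), ... (?)"
    // with exactly batchSize placeholder groups
    q := "INSERT INTO main_data(text) VALUES " +
        strings.TrimSuffix(strings.Repeat("(?), ", batchSize), ", ")
    bulkStmt, err = db.Prepare(q)
    return err
}

func gopherPrepared(lines []lineT) {
    args := make([]interface{}, len(lines))
    for i, l := range lines {
        args[i] = l
    }
    // the driver binds every argument, so no manual escaping is needed
    if _, err := bulkStmt.Exec(args...); err != nil {
        log.Println(err)
    }
}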













