PDO prepared inserts: multiple rows in a single query

I am currently using this type of SQL on MySQL to insert multiple rows of values in a single query:

INSERT INTO `tbl` (`key1`,`key2`) VALUES ('r1v1','r1v2'),('r2v1','r2v2'),...

From my reading on PDO, using prepared statements should give me better security than static queries.

I would therefore like to know whether it is possible to insert multiple rows of values with a single query using prepared statements.

If so, how can I implement it?

Callosity answered 24/7, 2009 at 8:11 Comment(1)
Be careful with a lot of the answers that use $stmt->execute($data); php.net/manual/en/… Basically all of the params are passed as strings. Just loop through the data after building the query, and manually bindValue or bindParam, passing the type as the third argument.Harlotry
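
A minimal sketch of what that comment suggests: bind each value explicitly with its type instead of passing the whole array to execute(). The $sql, $data and column types here are assumptions:

// $sql already holds INSERT INTO ... VALUES (?,?),(?,?),... built beforehand
$stmt = $pdo->prepare($sql);
$i = 1;
foreach ($data as $row) {                             // $data: array of rows, e.g. [[1, 'foo'], [2, 'bar']]
    $stmt->bindValue($i++, $row[0], PDO::PARAM_INT);  // first column assumed to be an integer
    $stmt->bindValue($i++, $row[1], PDO::PARAM_STR);  // second column assumed to be a string
}
$stmt->execute();
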
179

Multiple Values Insert with PDO Prepared Statements

Inserting multiple values in one execute statement. Why? Because according to this page it is faster than regular inserts.

$data[] = ['valueA1', 'valueB1'];
$data[] = ['valueA2', 'valueB2'];

...and so on: more data values, or you probably have a loop that populates the $data array.

This is basically how we want the insert statement to look:

insert into table (fielda, fieldb, ... ) values (?,?...), (?,?...)....

So with prepared inserts you need to know the number of fields to create a single VALUES part and the number of rows in order to know how many times to repeat it.

Now, the code:

// create the ?,? sequence for a single row
$values = str_repeat('?,', count($data[0]) - 1) . '?';
// construct the entire query
$sql = "INSERT INTO table (columnA, columnB) VALUES " .
    // repeat the (?,?) sequence for each row
    str_repeat("($values),", count($data) - 1) . "($values)";    

$stmt = $pdo->prepare ($sql);
// execute with all values from $data
$stmt->execute(array_merge(...$data));
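
For the two sample rows above, the generated query and the flattened parameter list look like this (shown for illustration only):

// $sql:               INSERT INTO table (columnA, columnB) VALUES (?,?),(?,?)
// execute() receives: ['valueA1', 'valueB1', 'valueA2', 'valueB2']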

Note that this approach is 100% secure, as the query is constructed entirely of constant parts explicitly written in the code, especially the column names.

Mcdevitt answered 20/1, 2010 at 2:30 Comment(10)
I agree and have supported your suggestion with some test data below for reference.Gild
A typo, in the explanation above it mentions $datafields although $datafield is used in $sql. Thus copy paste would result in error. Please do rectify. Thanks for this solution though.Jacktar
Used this for a while then noticed that values with single quotes in them aren't escaped properly. Using double quotes on implosion works like a charm for me: $a[] = '("' . implode(",", $question_marks) . '", NOW())';Scapula
array_merge seems more expensive than just using a array_push.Tetragrammaton
If any one likes : there is also a library github.com/auraphp/Aura.SqlQuery/tree/…Fardel
When you say "there was only a 1 sec difference", how many rows a data were you inserting? 1 sec is pretty significant depending on the context.Mink
The difference might be more noticeable if you are running the query against a remote database (where each individual insert would need to be sent to the remote db).Rabelais
Optimization: No point in calling placeholders() over and over again. Call it once before the loop with sizeof($datafields) and append the result string to $question_marks[] inside the loop.Inchoation
This is not the best way. Because none of the rows get inserted if one of the rows have any issues such as duplicate entry or whatever.Tackling
This significantly reduced the time of execution for 1500 inserts from ~5 min to ~5 seconds. Thanks.Mcwherter
86

Same answer as Mr. Balagtas, slightly clearer...

Recent versions of MySQL and PHP PDO support multi-row INSERT statements.

SQL Overview

The SQL will look something like this, assuming a 3-column table you'd like to INSERT into.

INSERT INTO tbl_name
            (colA, colB, colC)
     VALUES (?, ?, ?), (?, ?, ?), (?, ?, ?) [,...]

ON DUPLICATE KEY UPDATE works as expected even with a multi-row INSERT; append this:

ON DUPLICATE KEY UPDATE colA = VALUES(colA), colB = VALUES(colB), colC = VALUES(colC)

PHP Overview

Your PHP code will follow the usual $pdo->prepare($qry) and $stmt->execute($params) PDO calls.

$params will be a 1-dimensional array of all the values to pass to the INSERT.

In the above example, it should contain 9 elements; PDO will use every set of 3 as a single row of values. (Inserting 3 rows of 3 columns each = 9 element array.)
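
For illustration, such a $params array might look like this (the values are made up):

$params = array(
    'r1a', 'r1b', 'r1c',   // row 1: colA, colB, colC
    'r2a', 'r2b', 'r2c',   // row 2
    'r3a', 'r3b', 'r3c',   // row 3
);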

Implementation

The code below is written for clarity, not efficiency. Use PHP's array_*() functions for better ways to map or walk through your data if you'd like. Since a single query is executed, and each query is a transaction on its own, no explicit transaction is required.

Assuming:

  • $dataVals - multi-dimensional array, where each element is a 1-d array of a row of values to INSERT
  • $colNames - 1-dimensional array of the column names being INSERTed into (the code below assumes it exists)

Sample Code

// setup data values for PDO. No memory overhead thanks to copy-on-write
$dataToInsert = array();
foreach ($dataVals as $row) {
    foreach($row as $val) {
        $dataToInsert[] = $val;
    }
}

$onDup = "ON DUPLICATE KEY UPDATE colA=VALUES(colA)"; // optional

// setup the placeholders - a fancy way to make the long "(?, ?, ?)..." string
$rowPlaces = '(' . implode(', ', array_fill(0, count($colNames), '?')) . ')';
$allPlaces = implode(', ', array_fill(0, count($dataVals), $rowPlaces));

$sql = "INSERT INTO `tblName` (`colA`, `colB, colC)" . 
    " VALUES $allPlaces ON DUPLICATE KEY UPDATE $onDup";

// and then the PHP PDO boilerplate
$stmt = $pdo->prepare ($sql);
$stmt->execute($dataToInsert);
Mat answered 30/12, 2010 at 1:24 Comment(4)
That is really too bad that PDO handles it this way, there are some very elegant ways to do this in other DB drivers.Halfbaked
This setups the placeholders even more tersely, making $rowPlaces no longer necessary: $allPlaces = implode(',', array_fill(0, count($dataVals), '('.str_pad('', (count($colNames)*2)-1, '?,').')'));Jim
Works perfect. I would add to this answer the need to ensure the uniqueness of the (combination of) indexes in the table. Like in ALTER TABLE votes ADD UNIQUE unique_index(user, email, address);Marchand
Awesome! BTW, using array_push($dataToInsert, ...array_values($dataVals)); will be much faster than foreach ($dataVals as $row => $data) {}Slouch
47

For what it is worth, I have seen a lot of users recommend iterating through INSERT statements instead of building a single string query as the selected answer did. I decided to run a simple test with just two fields and a very basic insert statement:

<?php
require('conn.php');

$fname = 'J';
$lname = 'M';

$time_start = microtime(true);
$stmt = $db->prepare('INSERT INTO table (FirstName, LastName) VALUES (:fname, :lname)');

for($i = 1; $i <= 10; $i++ )  {
    $stmt->bindParam(':fname', $fname);
    $stmt->bindParam(':lname', $lname);
    $stmt->execute();

    $fname .= 'O';
    $lname .= 'A';
}


$time_end = microtime(true);
$time = $time_end - $time_start;

echo "Completed in ". $time ." seconds <hr>";

$fname2 = 'J';
$lname2 = 'M';

$time_start2 = microtime(true);
$qry = 'INSERT INTO table (FirstName, LastName) VALUES ';
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?)";

$stmt2 = $db->prepare($qry);
$values = array();

for($j = 1; $j<=10; $j++) {
    $values2 = array($fname2, $lname2);
    $values = array_merge($values,$values2);

    $fname2 .= 'O';
    $lname2 .= 'A';
}

$stmt2->execute($values);

$time_end2 = microtime(true);
$time2 = $time_end2 - $time_start2;

echo "Completed in ". $time2 ." seconds <hr>";
?>

While the overall query itself took milliseconds or less, the latter (single string) query was consistently 8 times faster or more. If this were built out to, say, reflect an import of thousands of rows with many more columns, the difference could be enormous.

Gild answered 31/1, 2012 at 23:28 Comment(5)
@Gild - great idea to put 10 rows directly in one execution. But how can I insert thousands of rows when they are stored in an object like JSON? My code below works perfectly. But how can I adjust it to insert 10 rows in one execution? ` foreach($json_content as $datarow) { $id = $datarow[id]; $date = $datarow[date]; $row3 = $datarow[row3]; $row4 = $datarow[row4]; $row5 = $datarow[row5]; $row6 = $datarow[row6]; $row7= $datarow[row7]; // now execute $databaseinsert->execute(); } // end of foreach `Quickman
@Gild - ... and my second question is: "why is there no bind_param statement in the second import routine"?Quickman
Wouldn't you have to loop twice? You would also have to dynamically generate the (?,?), right?Worl
@Worl Yes, you can use the same for/foreach to generate both.Oilstone
You should have wrapped the multiple statements in a transaction. TOO BAD you didn'tCosenza
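
As the comments above suggest, both the placeholder groups and the flat value list can be built in the same loop. A minimal sketch based on this answer's table and columns ($people is a hypothetical array of first/last name pairs, e.g. decoded from JSON):

$qry = 'INSERT INTO table (FirstName, LastName) VALUES ';
$groups = array();
$values = array();
foreach ($people as $person) {
    $groups[] = '(?,?)';      // one placeholder group per row
    $values[] = $person[0];   // FirstName
    $values[] = $person[1];   // LastName
}
$qry .= implode(', ', $groups);

$stmt = $db->prepare($qry);
$stmt->execute($values);
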
41

Please note that this answer is outdated. Nowadays PHP can merge all arrays in one go using array_merge(...$data) without any loops.

The Accepted Answer by Herbert Balagtas works well when the $data array is small. With larger $data arrays the array_merge function becomes prohibitively slow. My test file to create the $data array has 28 cols and is about 80,000 lines. The final script took 41s to complete.

Using array_push() to create $insert_values instead of array_merge() resulted in a 100X speed up with execution time of 0.41s.

The problematic array_merge():

$insert_values = array();

foreach($data as $d){
 $question_marks[] = '('  . placeholders('?', sizeof($d)) . ')';
 $insert_values = array_merge($insert_values, array_values($d));
}

To eliminate the need for array_merge(), you can build the following two arrays instead:

//Note that these fields are empty, but the field count should match the fields in $datafields.
$data[] = array('','','','',... n ); 

//getting rid of array_merge()
array_push($insert_values, $value1, $value2, $value3 ... n ); 

These arrays can then be used as follows:

function placeholders($text, $count=0, $separator=","){
    $result = array();
    if($count > 0){
        for($x=0; $x<$count; $x++){
            $result[] = $text;
        }
    }

    return implode($separator, $result);
}

$pdo->beginTransaction();

foreach($data as $d){
 $question_marks[] = '('  . placeholders('?', sizeof($d)) . ')';
}

$sql = "INSERT INTO table (" . implode(",", array_keys($datafield) ) . ") VALUES " . implode(',', $question_marks);

$stmt = $pdo->prepare($sql);
$stmt->execute($insert_values);
$pdo->commit();
Earle answered 19/5, 2012 at 21:18 Comment(6)
In PHP 5.6 you can do array_push($data, ...array_values($row)) instead of $data = array_merge($data, array_values($row));. Much faster.Grindery
Why 5.6 ? Documentation doesn't say anything about 5.6, array_push() is available even in php 4.Gaskin
@Piero it is PHP 5.6+ only code not because of the use of array_push(), but because @Mark is using argument unpacking. Notice the ...array_values() call there?Lottielotto
@Lottielotto array_values() is as well available in php 4. Not sure if that's what you mean by argument unpacking.Gaskin
i wish to plus 1 you more than one. great explanationMeyeroff
@Piero, Argument unpacking is a feature introduced in PHP 5.6. It's a way to provide multiple arguments as an array. Check here - php.net/manual/en/…Slouch
18

Two possible approaches:

$stmt = $pdo->prepare('INSERT INTO foo VALUES(:v1_1, :v1_2, :v1_3),
    (:v2_1, :v2_2, :v2_3),
    (:v3_1, :v3_2, :v3_3)');
$stmt->bindValue(':v1_1', $data[0][0]);
$stmt->bindValue(':v1_2', $data[0][1]);
$stmt->bindValue(':v1_3', $data[0][2]);
// etc...
$stmt->execute();

Or:

$pdo->beginTransaction();
$stmt = $pdo->prepare('INSERT INTO foo VALUES(:a, :b, :c)');
foreach($data as $item)
{
    $stmt->bindValue(':a', $item[0]);
    $stmt->bindValue(':b', $item[1]);
    $stmt->bindValue(':c', $item[2]);
    $stmt->execute();
}
$pdo->commit();

If the data for all the rows is in a single array, I would use the second solution. Thanks to the transaction, it's almost as fast as the first solution.

Etoile answered 24/7, 2009 at 9:28 Comment(3)
in the latter aren't you then making several (possibly thousands) of separate execute calls instead of combining into one statement?Gild
@JM4, are you suggesting $stmt->execute(); should be outside the foreach loop?Conversable
@Conversable - Yes I am. See my answer above with upvotes. On a pure insert statement there is no reason i can logically come up with that it can't be a single statement. One call, one execute. In fact, my answer from early 2012 could be improved even further - something I will do later on when I have some more time. If you start throwing in Insert / update/ delete combinations, that is a different story.Gild
11

That's simply not the way you use prepared statements.

It is perfectly okay to insert one row per query, because you can execute one prepared statement multiple times with different parameters. In fact, that is one of the greatest advantages, as it allows you to insert a great number of rows in an efficient, secure and comfortable manner.

So it may be possible to implement the scheme you are proposing, at least for a fixed number of rows, but it is almost guaranteed that this is not really what you want.
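
A minimal sketch of the reuse this answer describes, using the question's table and columns ($rows here is a hypothetical array of [key1, key2] pairs):

$stmt = $pdo->prepare('INSERT INTO `tbl` (`key1`, `key2`) VALUES (?, ?)');
foreach ($rows as $row) {
    $stmt->execute($row);
}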

Coupe answered 24/7, 2009 at 9:4 Comment(5)
Can you suggest a better way to insert multiple rows into a table?Shipload
@Crashthatch: Just do it the naive way: Setup the prepared statement once, then execute it for each row with different values for the bound parameters. That's the second approach in Zyk's answer.Coupe
The purpose you mentioned for prepared statements is right. But using multi-insert is another technique to improve insert speed, and it can be used with prepared statements too. In my experience, while migrating 30 million rows of data using PDO prepared statements, I saw that multi-insert was 7-10 times faster than grouped single inserts in transactions.Slouch
Absolutely agree with Anis. I have 100k rows and get a huge speed increase with multi-row inserts.Mantis
Claiming that calling a relational database in a loop once per row is generally a good thing is something I cannot agree with. Downvote for that. Granted, sometimes it's ok. I don't believe in absolutes with engineering. But this is an anti-pattern that should be used only in select cases.Watersick
8

A shorter answer: flatten the array of data, ordered by columns, then:

//$array = array( '1','2','3','4','5', '1','2','3','4','5'); // flattened rows, 5 columns each
$colCount = 5;
$rowCount = (int) (count($array) / $colCount);
$rCount = ($rowCount ? $rowCount - 1 : 0);
$criteria = sprintf("(?,?,?,?,?)%s", str_repeat(",(?,?,?,?,?)", $rCount));
$sql = "INSERT INTO table(c1,c2,c3,c4,c5) VALUES$criteria";

When inserting 1,000 or so records, you don't want to have to loop through every record to bind it when all you need is a count of the values.
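
To finish the sketch, the flattened $array is then passed straight to execute() (assuming a PDO connection in $pdo):

$stmt = $pdo->prepare($sql);
$stmt->execute($array);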

Heer answered 9/12, 2011 at 21:9 Comment(0)
4

Here is my simple approach.

    $values = array();
    foreach($workouts_id as $value){
      $_value = "(".$value.",".$plan_id.")";
      array_push($values,$_value);
    }
    $values_ = implode(",",$values);

    $sql = "INSERT INTO plan_days(id,name) VALUES" . $values_."";
    $stmt = $this->conn->prepare($sql);
    $stmt->execute();
Adenosine answered 28/3, 2017 at 18:19 Comment(3)
you're defeating the point of using prepared statements. The OP is concerned about security in the question: From my reading on PDO, using prepared statements should give me better security than static queries.Gout
Just imagine that you have a non-validated $workouts_id, which can contain quite unexpected data in $values. You can't guarantee that, maybe not now but in the future, another developer won't make this data insecure. So I think it is much more correct to have the query prepared by PDO.Thay
This answer should be deleted.Cosenza
2

Here's a class I wrote to do multiple inserts with a purge option:

<?php

/**
 * $pdo->beginTransaction();
 * $pmi = new PDOMultiLineInserter($pdo, "foo", array("a","b","c","e"), 10);
 * $pmi->insertRow($data);
 * ....
 * $pmi->insertRow($data);
 * $pmi->purgeRemainingInserts();
 * $pdo->commit();
 *
 */
class PDOMultiLineInserter {
    private $_purgeAtCount;
    private $_bigInsertQuery, $_singleInsertQuery;
    private $_currentlyInsertingRows  = array();
    private $_currentlyInsertingCount = 0;
    private $_numberOfFields;
    private $_error;
    private $_insertCount = 0;

    function __construct(\PDO $pdo, $tableName, $fieldsAsArray, $bigInsertCount = 100) {
        $this->_numberOfFields = count($fieldsAsArray);
        $insertIntoPortion = "INSERT INTO `$tableName` (`".implode("`,`", $fieldsAsArray)."`) VALUES";
        $questionMarks  = " (?".str_repeat(",?", $this->_numberOfFields - 1).")";

        $this->_purgeAtCount = $bigInsertCount;
        $this->_bigInsertQuery    = $pdo->prepare($insertIntoPortion.$questionMarks.str_repeat(", ".$questionMarks, $bigInsertCount - 1));
        $this->_singleInsertQuery = $pdo->prepare($insertIntoPortion.$questionMarks);
    }

    function insertRow($rowData) {
        // @todo Compare speed
        // $this->_currentlyInsertingRows = array_merge($this->_currentlyInsertingRows, $rowData);
        foreach($rowData as $v) array_push($this->_currentlyInsertingRows, $v);
        //
        if (++$this->_currentlyInsertingCount == $this->_purgeAtCount) {
            if ($this->_bigInsertQuery->execute($this->_currentlyInsertingRows) === FALSE) {
                $this->_error = "Failed to perform a multi-insert (after {$this->_insertCount} inserts), the following errors occurred:".implode('<br/>', $this->_bigInsertQuery->errorInfo());
                return false;
            }
            $this->_insertCount++;

            $this->_currentlyInsertingCount = 0;
            $this->_currentlyInsertingRows = array();
        }
        return true;
    }

    function purgeRemainingInserts() {
        while ($this->_currentlyInsertingCount > 0) {
            $singleInsertData = array();
            // @todo Compare speed - http://www.evardsson.com/blog/2010/02/05/comparing-php-array_shift-to-array_pop/
            // for ($i = 0; $i < $this->_numberOfFields; $i++) $singleInsertData[] = array_pop($this->_currentlyInsertingRows); array_reverse($singleInsertData);
            for ($i = 0; $i < $this->_numberOfFields; $i++) array_unshift($singleInsertData, array_pop($this->_currentlyInsertingRows));

            if ($this->_singleInsertQuery->execute($singleInsertData) === FALSE) {
                $this->_error = "Failed to perform a small-insert (whilst purging the remaining rows; the following errors occurred:".implode('<br/>', $this->_singleInsertQuery->errorInfo());
                return false;
            }
            $this->_currentlyInsertingCount--;
        }
    }

    public function getError() {
        return $this->_error;
    }
}
Spectroscopy answered 28/6, 2012 at 3:31 Comment(1)
Hello Pierre. Maybe you're not active around here anymore. Nevertheless, I just wanted to point out that my idea for this issue looks nearly identical to yours. Pure coincidence, as I guess there isn't much more to this. I added classes for DELETE- AND UPDATE-Operations, too and involved some ideas from here, afterwards. I just didn't see your class. Please excuse my shameless self promotion here, but I guess it will be of help for someone. Hope this isn't against SO-Rules. Find it here.Filtration
2

Based on my experiments, I found out that a MySQL insert statement with multiple value rows in a single transaction is the fastest approach.

However, if there is too much data, then MySQL's max_allowed_packet setting might restrict the single-transaction insert with multiple value rows. Hence, the following functions will fail when the data is greater than MySQL's max_allowed_packet size:

  1. singleTransactionInsertWithRollback
  2. singleTransactionInsertWithPlaceholders
  3. singleTransactionInsert

The most successful one in the huge-data scenario is the transactionSpeed method, but it takes more time than the methods mentioned above. So, to handle this problem, you can either split your data into smaller chunks and call the single-transaction insert multiple times (as sketched below), or give up execution speed by using the transactionSpeed method.
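
A minimal sketch of that chunking idea, assuming $data holds rows of [first_name, last_name] pairs, $pdo is the connection, and the batch size of 1000 is an arbitrary choice:

foreach (array_chunk($data, 1000) as $chunk) {
    // one "(?, ?)" group per row in this batch
    $placeholders = implode(', ', array_fill(0, count($chunk), '(?, ?)'));
    $sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES ' . $placeholders;
    $stmt = $pdo->prepare($sql);
    $stmt->execute(array_merge(...$chunk));
}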

Here's my research

<?php

class SpeedTestClass
{
    private $data;

    private $pdo;

    public function __construct()
    {
        $this->data = [];
        $this->pdo = new \PDO('mysql:dbname=test_data', 'admin', 'admin');
        if (!$this->pdo) {
            die('Failed to connect to database');
        }
    }

    public function createData()
    {
        $prefix = 'test';
        $postfix = 'unicourt.com';
        $salutations = ['Mr.', 'Ms.', 'Dr.', 'Mrs.'];

        $csv[] = ['Salutation', 'First Name', 'Last Name', 'Email Address'];
        for ($i = 0; $i < 100000; ++$i) {
            $csv[] = [
                $salutations[$i % \count($salutations)],
                $prefix.$i,
                $prefix.$i,
                $prefix.$i.'@'.$postfix,
            ];
        }

        $this->data = $csv;
    }

    public function truncateTable()
    {
        $this->pdo->query('TRUNCATE TABLE `name`');
    }

    public function transactionSpeed()
    {
        $timer1 = microtime(true);
        $this->pdo->beginTransaction();
        $sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES (:first_name, :last_name)';
        $sth = $this->pdo->prepare($sql);

        foreach (\array_slice($this->data, 1) as $values) {
            $sth->execute([
                ':first_name' => $values[1],
                ':last_name' => $values[2],
            ]);
        }

        // $timer2 = microtime(true);
        // echo 'Prepare Time: '.($timer2 - $timer1).PHP_EOL;
        // $timer3 = microtime(true);

        if (!$this->pdo->commit()) {
            echo "Commit failed\n";
        }
        $timer4 = microtime(true);
        // echo 'Commit Time: '.($timer4 - $timer3).PHP_EOL;

        return $timer4 - $timer1;
    }

    public function autoCommitSpeed()
    {
        $timer1 = microtime(true);
        $sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES (:first_name, :last_name)';
        $sth = $this->pdo->prepare($sql);
        foreach (\array_slice($this->data, 1) as $values) {
            $sth->execute([
                ':first_name' => $values[1],
                ':last_name' => $values[2],
            ]);
        }
        $timer2 = microtime(true);

        return $timer2 - $timer1;
    }

    public function noBindAutoCommitSpeed()
    {
        $timer1 = microtime(true);

        foreach (\array_slice($this->data, 1) as $values) {
            $sth = $this->pdo->prepare("INSERT INTO `name` (`first_name`, `last_name`) VALUES ('{$values[1]}', '{$values[2]}')");
            $sth->execute();
        }
        $timer2 = microtime(true);

        return $timer2 - $timer1;
    }

    public function singleTransactionInsert()
    {
        $timer1 = microtime(true);
        foreach (\array_slice($this->data, 1) as $values) {
            $arr[] = "('{$values[1]}', '{$values[2]}')";
        }
        $sth = $this->pdo->prepare('INSERT INTO `name` (`first_name`, `last_name`) VALUES '.implode(', ', $arr));
        $sth->execute();
        $timer2 = microtime(true);

        return $timer2 - $timer1;
    }

    public function singleTransactionInsertWithPlaceholders()
    {
        $placeholders = [];
        $timer1 = microtime(true);
        $sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES ';
        foreach (\array_slice($this->data, 1) as $values) {
            $placeholders[] = '(?, ?)';
            $arr[] = $values[1];
            $arr[] = $values[2];
        }
        $sql .= implode(', ', $placeholders);
        $sth = $this->pdo->prepare($sql);
        $sth->execute($arr);
        $timer2 = microtime(true);

        return $timer2 - $timer1;
    }

    public function singleTransactionInsertWithRollback()
    {
        $placeholders = [];
        $timer1 = microtime(true);
        $sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES ';
        foreach (\array_slice($this->data, 1) as $values) {
            $placeholders[] = '(?, ?)';
            $arr[] = $values[1];
            $arr[] = $values[2];
        }
        $sql .= implode(', ', $placeholders);
        $this->pdo->beginTransaction();
        $sth = $this->pdo->prepare($sql);
        $sth->execute($arr);
        $this->pdo->commit();
        $timer2 = microtime(true);

        return $timer2 - $timer1;
    }
}

$s = new SpeedTestClass();
$s->createData();
$s->truncateTable();
echo "Time Spent for singleTransactionInsertWithRollback: {$s->singleTransactionInsertWithRollback()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for single Transaction Insert: {$s->singleTransactionInsert()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for single Transaction Insert With Placeholders: {$s->singleTransactionInsertWithPlaceholders()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for transaction: {$s->transactionSpeed()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for AutoCommit: {$s->noBindAutoCommitSpeed()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for autocommit with bind: {$s->autoCommitSpeed()}".PHP_EOL;
$s->truncateTable();

The results for 100,000 entries, for a table containing only two columns, are as below:

$ php data.php
Time Spent for singleTransactionInsertWithRollback: 0.75147604942322
Time Spent for single Transaction Insert: 0.67445182800293
Time Spent for single Transaction Insert With Placeholders: 0.71131205558777
Time Spent for transaction: 8.0056409835815
Time Spent for AutoCommit: 35.4979159832
Time Spent for autocommit with bind: 33.303519010544
Cochabamba answered 11/8, 2019 at 17:55 Comment(0)
2

Here is another (slim) solution for this issue:

First you need to count the rows of the source array (here: $aData) with count(). Then you use array_fill() to generate a new array with as many entries as the source array has, each with the value "(?,?)" (the number of placeholders depends on the fields you use; here: 2). Then the generated array is imploded, using a comma as the glue. Within the foreach loop, you generate another index based on the number of placeholders you use (number of placeholders * current array index + 1), and you add 1 to the generated index after each bound value.

$do = $db->prepare("INSERT INTO table (id, name) VALUES ".implode(',', array_fill(0, count($aData), '(?,?)')));

foreach($aData as $iIndex => $aValues){
 $iRealIndex = 2 * $iIndex + 1;
 $do->bindValue($iRealIndex, $aValues['id'], PDO::PARAM_INT);
 $iRealIndex = $iRealIndex + 1;
 $do->bindValue($iRealIndex, $aValues['name'], PDO::PARAM_STR);
}

$do->execute();
Heathheathberry answered 7/2, 2020 at 22:18 Comment(0)
1

Here is my solution: https://github.com/sasha-ch/Aura.Sql based on auraphp/Aura.Sql library.

Usage example:

$q = "insert into t2(id,name) values (?,?), ... on duplicate key update name=name"; 
$bind_values = [ [[1,'str1'],[2,'str2']] ];
$pdo->perform($q, $bind_values);

Bug reports are welcome.

Craftsman answered 7/2, 2015 at 20:30 Comment(1)
As of 2.4 you can create multi insert with github.com/auraphp/Aura.SqlQuery/tree/… and make use of ExtendedPdo to execute :) .Fardel
1

This is how I did it:

First define the column names you'll use, or leave the column list blank and MySQL will assume you want to use all the columns of the table, in which case you'll need to supply the row values in the exact order they appear in the table.

$cols = '(name, middleName, eMail)';
$table = 'people';

Now, suppose you have a two dimensional array already prepared. Iterate it, and construct a string with your row values, as such:

$rowVals = '';
foreach ($people as $person) {
    if (!$rowVals) {
        $rowVals = "('" . $person['name'] . "','" . $person['middleName'] . "','" . $person['eMail'] . "')";
    } else {
        $rowVals .= ",('" . $person['name'] . "','" . $person['middleName'] . "','" . $person['eMail'] . "')";
    }
}

Now, what you just did was check whether $rowVals was already defined and, if not, create it, storing the row values with the necessary SQL syntax so the result will be a valid statement. Note that string values end up inside single quotes in the SQL, so they will be recognized as such.

All that's left to do is prepare the statement and execute it, as such:

$stmt = $db->prepare("INSERT INTO $table $cols VALUES $rowVals");
$stmt->execute ();

Tested with up to 2000 rows so far, and the execution time is dismal. Will run some more tests and will get back here in case I have something further to contribute.

Regards.

Epicrisis answered 17/3, 2015 at 19:26 Comment(0)
1

Since it has not been suggested yet, I'm pretty sure LOAD DATA INFILE is still the fastest way to load data as it disables indexing, inserts all data, and then re-enables the indexes - all in a single request.

Saving the data as a CSV should be fairly trivial, keeping in mind fputcsv. MyISAM is fastest, but you still get big performance in InnoDB. There are other disadvantages, though, so I would only go this route if you are inserting a lot of data, and not bother for fewer than 100 rows.
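
A minimal sketch of that route, reusing the question's table and columns. It assumes $rows holds the data, the connection was created with PDO::MYSQL_ATTR_LOCAL_INFILE enabled, and the server permits local_infile:

$csvFile = tempnam(sys_get_temp_dir(), 'bulk');
$fh = fopen($csvFile, 'w');
foreach ($rows as $row) {
    fputcsv($fh, $row);   // one CSV line per row
}
fclose($fh);

$sql = "LOAD DATA LOCAL INFILE " . $pdo->quote($csvFile)
     . " INTO TABLE `tbl`"
     . " FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'"
     . " LINES TERMINATED BY '\\n'"
     . " (`key1`, `key2`)";
$pdo->exec($sql);
unlink($csvFile);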

Barnes answered 26/1, 2017 at 2:11 Comment(0)
1

Although this is an old question, all the contributions helped me a lot, so here's my solution, which works within my own DbContext class. The $rows parameter is simply an array of associative arrays representing rows or models: field name => insert value.

If you use a pattern that uses models, this fits in nicely when passed model data as an array, say from a ToRowArray method within the model class.

Note: it should go without saying, but never allow the arguments passed to this method to be exposed to the user or reliant on any user input, other than the insert values, which should have been validated and sanitised. The $tableName argument and the column names should be defined by the calling logic; for instance, a User model could be mapped to the user table, which has its column list mapped to the model's member fields.

public function InsertRange($tableName, $rows)
{
    // Get column list
    $columnList = array_keys($rows[0]);
    $numColumns = count($columnList);
    $columnListString = implode(",", $columnList);

    // Generate pdo param placeholders
    $placeHolders = array();

    foreach($rows as $row)
    {
        $temp = array();

        for($i = 0; $i < count($row); $i++)
            $temp[] = "?";

        $placeHolders[] = "(" . implode(",", $temp) . ")";
    }

    $placeHolders = implode(",", $placeHolders);

    // Construct the query
    $sql = "insert into $tableName ($columnListString) values $placeHolders";
    $stmt = $this->pdo->prepare($sql);

    $j = 1;
    foreach($rows as $row)
    {
        for($i = 0; $i < $numColumns; $i++)
        {
            $stmt->bindParam($j, $row[$columnList[$i]]);
            $j++;
        }
    }

    $stmt->execute();
}
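
A hypothetical usage sketch, assuming a User model exposing the ToRowArray() method mentioned above, a DbContext instance in $dbContext, and a user table whose columns match the model's fields:

$rows = array();
foreach ($users as $user) {
    $rows[] = $user->ToRowArray();   // e.g. ['name' => 'Ann', 'email' => 'ann@example.com']
}
$dbContext->InsertRange('user', $rows);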
Asyllabic answered 19/2, 2017 at 9:50 Comment(11)
get rid of a transaction, as it makes no sense to use one for a single query. and as usual, this code is vulnerable to SQL injection or query error.Cosenza
You're right about the redundant use of transactions for this case, but I don't see how this is vulnerable to SQL injection. It's parameterised so I can only presume you're assuming $tableName is exposed to the user, which it's not, it's in the DAL. Can you expand on your claims? It's not helpful to just say things.Asyllabic
well, it's not only a table name but anyway: how can you know whether it will be exposed or not by anyone who would use the code you posted here?Cosenza
So it's a poster's responsibility to outline every potential use of the code or every source for arguments? Maybe I have higher expectations of people. Would it make you happier if I added a note not to allow the user to have access to $tableName?Asyllabic
It's a poster's responsibility to post a reliable code, if their intent is to help someone, not just to show off.Cosenza
Not to mention that column names are much more likely to be exposed. And to be a cause for a mere syntax error even being a part of DAL.Cosenza
Who says the column names are more likely to be exposed? What's your reasoning? It seems like you're the one who's trying to show off and not help. In my particular setup the column names are taken from the model member fields. I'll take your advice on board and make sure it's clear but you need to back up your statements if you're going to be helpful.Asyllabic
Name one of your "model member fields" as "group" and see what happens.Cosenza
The answer was to address the specific question the OP asked. You're being pedantic considering this consideration is outside the scope of the question.Asyllabic
If you didn't notice, the answer to address the specific question the OP asked has been posted almost a decade ago.Cosenza
Yes, if you'd read my answer you would have noticed I'd mentioned that in the first line. Are you having a bad day? You seem to have really taken offence to my post.Asyllabic
1

As evidenced by the answers preceding mine, it's cumbersome to do it with vanilla PDO. Database abstraction layers such as Nette make the task much easier without significant impact on speed:

$params = [
    'host' => 'sandbox-db',
    'database' => 'test',
    'user' => 'root',
    'pass' => '123',
];

$database = new Nette\Database\Connection("mysql:host={$params['host']};dbname={$params['database']};charset=utf8", $params['user'], $params['pass']);

$staff = [
    [
        'birthday' => new DateTime('1995-05-01'),
        'name' => 'Sharon',
        'salary' => '200',
        'boss' => true,
    ],
    [
        'birthday' => new DateTime('2000-01-01'),
        'name' => 'John',
        'salary' => '140',
        'boss' => false,
    ],
    [
        'birthday' => new DateTime('1985-08-01'),
        'name' => 'Oliver',
        'salary' => '120',
        'boss' => false,
    ],
];

$database->beginTransaction();

$database->query('INSERT INTO test', $staff);

$database->commit();
Pitt answered 31/3, 2023 at 15:16 Comment(0)
0

My real-world example inserts all German postcodes into an empty table (to add town names later):

// obtain column template
$stmt = $db->prepare('SHOW COLUMNS FROM towns');
$stmt->execute();
$columns = array_fill_keys(array_values($stmt->fetchAll(PDO::FETCH_COLUMN)), null);
// multiple INSERT
$postcode = '01000';// smallest german postcode
while ($postcode <= 99999) {// highest german postcode
    $values = array();
    while ($postcode <= 99999) {
        // reset row
        $row = $columns;
        // now fill our row with data
        $row['postcode'] = sprintf('%05d', $postcode);
        // build INSERT array
        foreach ($row as $value) {
            $values[] = $value;
        }
        $postcode++;
        // avoid memory kill
        if (!($postcode % 10000)) {
            break;
        }
    }
    // build query
    $count_columns = count($columns);
    $placeholder = ',(' . substr(str_repeat(',?', $count_columns), 1) . ')';//,(?,?,?)
    $placeholder_group = substr(str_repeat($placeholder, count($values) / $count_columns), 1);//(?,?,?),(?,?,?)...
    $into_columns = implode(',', array_keys($columns));//col1,col2,col3
    // this part is optional:
    $on_duplicate = array();
    foreach ($columns as $column => $row) {
        $on_duplicate[] = $column;
        $on_duplicate[] = $column;
    }
    $on_duplicate = ' ON DUPLICATE KEY UPDATE' . vsprintf(substr(str_repeat(', %s = VALUES(%s)', $count_columns), 1), $on_duplicate);
    // execute query
    $stmt = $db->prepare('INSERT INTO towns (' . $into_columns . ') VALUES' . $placeholder_group . $on_duplicate);//INSERT INTO towns (col1,col2,col3) VALUES(?,?,?),(?,?,?)... {ON DUPLICATE...}
    $stmt->execute($values);
}

As you can see, it's fully flexible. You don't need to check the number of columns or the position of each column. You only need to set the insert data:

    $row['postcode'] = sprintf('%05d', $postcode);

I'm proud of some of the query-string constructors, as they work without heavy array functions like array_merge. Especially vsprintf() was a good find.

Finally I needed to add the two while() loops to avoid exceeding the memory limit. This depends on your memory limit, but overall it's a good general solution to avoid problems (and having 10 queries is still much better than 10,000).

Coreligionist answered 5/3, 2015 at 22:27 Comment(0)
0

For people who might have issues understanding long rows of codes:

  1. Make sure to have one array for each row to be inserted. Eg:

     $data = [ 
               ['name'=>'Sam', 'job'=>'fullstack', 'device'=>'hp'],
               ['name'=>'Joey', 'job'=>'ui', 'device'=>'apple']
             ];
     //the subarrays in $data can also be indexed arrays
    

The idea is to be able to easily form the SQL Insert query for multiple rows, which is supposed to look like this:

insert into table_name (col1, col2, col3) values ('valA1', 'valA2', 'valA3'), ('valB1', 'valB2', 'valB3'); //normal sql, insecure
insert into table_name (col1, col2, col3) values (?, ?, ?), (?, ?, ?); //prepared statement, secure

Hope I'm still making sense. If you'll be doing this batch insert with dynamic column names, you must always have an array of the expected column names and assign default values to them. Eg:

$expectedcols = ['name'=>'', 'job'=>'', 'device'=>''];

Then;

    $validcols = array_intersect_key($data[0], $expectedcols); //Takes only valid column names and throws away unexpected column names
    $allvalidcols = array_merge($expectedcols, $validcols); //assigns the values in $validcols to the values in $expectedcols. Other columns in $expectedcols required to be inserted will also be captured, but their values will be those default values you assigned to them earlier.
    $col_arr = array_keys($allvalidcols);            //extracts the safe column names.
    $columns = implode(', ', $col_arr);         //name, job, device
    $cols = count($col_arr);                    //number of columns, 3
    $temparr = array_fill(0, $cols, '?');       //first set of placeholders ['?', '?', '?']
    $tempstr = '('.implode(', ', $temparr).')'; //(?, ?, ?)
    $rows = count($data);                  //number of rows, 2
    $totalarr = array_fill(0, $rows, $tempstr); //full set of placeholders ['(?, ?, ?)', '(?, ?, ?)']
    $totalstr = implode(', ', $totalarr);       //(?, ?, ?), (?, ?, ?)
    
    $allarr = [];
    foreach($data as $ind=>$val) {
        $values = array_values($val);           
        $allarr = array_merge($allarr, $values); //['Sam', 'fullstack', 'hp', 'Joey', 'ui', 'apple']
    }

    $sql = "insert into table_name ($columns) values $totalstr";

But if you are not doing the batch insert with dynamic column names, skip $validcols, $allvalidcols, $col_arr and $columns (they're not really necessary), set $cols to the number of columns (3 in this example), and adjust the $sql query to:

$sql = "insert into table_name (name, job, device) values $totalstr";

And finally;

$stmt = $conn->prepare($sql);
$done = $stmt->execute($allarr);
Vegetarian answered 29/6, 2023 at 0:28 Comment(4)
WARNING: This function is prone to SQL injection through column namesCosenza
@YourCommonSense Thanks for pointing this out. I was trying to write an easy function and I was banking on the developer's ability to validate the column names before the query gets to the db. Seems like the safest way is to hard-code the column names into the query but how do you make the column names dynamic and safe on large applications where you have to insert into different tables with different column set? Thanks. We learn everyday.Vegetarian
You never "bank" on a developer. This is not how security works. the only approach is secure that's secure by itself, no matter what a developer does. Regarding dynamic column names. First of all, nobody asked, in this particular question, how to make it dynamic. the question is just how to create a multi insert using prepared statements. And answers should generally keep to this route. But if you want to make it dynamic, you should always filter the column names against a list, explicitly written in your code. See github.com/colshrapnel/BasicTableGatewayCosenza
array_merge in a loop proved to be inefficient. Look at the code in the accepted answer which is both concise and efficientCosenza
0

Using a single prepared statement and a single bind_param call for multiple rows is possible, changing only the argument values between executions. My project had better results using this method (0.398s) than building one long statement with a long parameter list (2.508s).

Important note: following @jamesvi's example, I used this method for my project.

$dataVals = [['a', 'b', 'c'], ['d', 'e', 'f']];

$colAIn = '';
$colBIn = '';
$colCIn = '';

// Depending on the keys you have on your table, disabling some more checks might increase performance
// mysqli_query($con, "SET AUTOCOMMIT = 0;"); // this should be unecessary since we're using transactions
// mysqli_query($con, "SET FOREIGN_KEY_CHECKS = 0;");
// mysqli_query($con, "SET UNIQUE_CHECKS = 0;");
mysqli_begin_transaction($con);

$stmt = mysqli_prepare($con, "INSERT INTO tblName (colA, colB, colC) VALUES (?, ?, ?)");
mysqli_stmt_bind_param($stmt, "sss", $colAIn, $colBIn, $colCIn);

for ($i = 0; $i < count($dataVals); $i++) {
    $colAIn = $dataVals[$i][0];
    $colBIn = $dataVals[$i][1];
    $colCIn = $dataVals[$i][2];
    mysqli_stmt_execute($stmt);
}
mysqli_commit($con);

// mysqli_query($con, "SET AUTOCOMMIT = 0;"); // this should be unecessary since we're using transactions
// mysqli_query($con, "SET FOREIGN_KEY_CHECKS = 0;");
// mysqli_query($con, "SET UNIQUE_CHECKS = 0;");

Sorry for the procedural style, I blame my project's predecessors for this.

Spatterdash answered 26/10, 2023 at 5:21 Comment(0)
-1

You can insert multiple rows in a single query with this function:

function insertMultiple($PDO, $query, $rows) {
    if (count($rows)>0) {
        $args = array_fill(0, count($rows[0]), '?');

        $params = array();
        $values = array();
        foreach($rows as $row)
        {
            $values[] = "(".implode(',', $args).")";
            foreach($row as $value)
            {
                $params[] = $value;
            }
        }

        $query = $query." VALUES ".implode(',', $values);
        $stmt = $PDO->prepare($query);
        $stmt->execute($params);
    }
}

$rows is an array of arrays of values. In your case you would call the function with:

insertMultiple("INSERT INTO tbl (`key1`,`key2`)",array(array('r1v1','r1v2'),array('r2v1','r2v2')));

This has the benefit that you use prepared statements, while inserting multiple rows with a single query. Security!

Teter answered 7/3, 2014 at 9:58 Comment(0)
-1

This worked for me

$sql = 'INSERT INTO table(pk_pk1,pk_pk2,date,pk_3) VALUES '; 
$qPart = array_fill(0, count($array), "(?, ?,UTC_TIMESTAMP(),?)");
$sql .= implode(",", $qPart);
$stmt =    DB::prepare('base', $sql);
$i = 1;
foreach ($array as $value) { 
  $stmt->bindValue($i++, $pk_pk1);
  $stmt->bindValue($i++, $pk_pk2);
  $stmt->bindValue($i++, $value); // pk_3, the value that changes per row
} 
$stmt->execute();
Mcwhirter answered 28/10, 2014 at 19:5 Comment(1)
Nobody can use this code as nobody knows what DB::prepare() isCosenza
-1

Most of the solutions given here to create the prepared query are more complex than they need to be. Using PHP's built-in functions you can easily create the SQL statement without significant overhead.

Given $records, an array of records where each record is itself an associative array (in the form of field => value), the following function will insert the records into the given table $table, on a PDO connection $connection, using only a single prepared statement. Note that this is a PHP 5.6+ solution because of the use of argument unpacking in the call to array_push:

private function import(PDO $connection, $table, array $records)
{
    $fields = array_keys($records[0]);
    $placeHolders = substr(str_repeat(',?', count($fields)), 1);
    $values = [];
    foreach ($records as $record) {
        array_push($values, ...array_values($record));
    }

    $query = 'INSERT INTO ' . $table . ' (';
    $query .= implode(',', $fields);
    $query .= ') VALUES (';
    $query .= implode('),(', array_fill(0, count($records), $placeHolders));
    $query .= ')';

    $statement = $connection->prepare($query);
    $statement->execute($values);
}
Lottielotto answered 30/9, 2015 at 13:15 Comment(4)
This code should never be used as it is vulnerable to SQL injectionCosenza
@Your I don't see what you see. Compel me to vtd this answer.Cleodal
@Cleodal I am implying thisCosenza
Ah It would have been good for the poster and researchers to understand this specific concern regarding field name injections. I know you were a grumpier lad back in these earlier days, but please be sure to blast that injection page on relevant SO pages in the future, so that people learn the vulnerability. Telling folks that a query is vulnerable but not explaining how -- is unhelpfully cryptic. VingTD.Cleodal
-1

What about something like this:

        if(count($types_of_values)>0){
         $uid = 1;
         $x = 0;
         $sql = "";
         $values = array();
          foreach($types_of_values as $k=>$v){
            $sql .= "(:id_$k,:kind_of_val_$k), ";
            $values[":id_$k"] = $uid;
            $values[":kind_of_val_$k"] = $v;
          }
         $sql = substr($sql,0,-2);
         $query = "INSERT INTO table (id,value_type) VALUES $sql";
         $res = $this->db->prepare($query);
         $res->execute($values);            
        }

The idea behind this is to cycle through your array values, adding "id numbers" to each loop for your prepared statement placeholders while, at the same time, adding the values to your array for the binding parameters. If you don't like using the "key" index from the array, you could add $i = 0 and $i++ inside the loop. Either works in this example; even if you have associative arrays with named keys, it would still work provided the keys are unique. With a little work it would be fine for nested arrays too.

**Note that substr strips the $sql variable's last space and comma; if you don't have a space you'd need to change this to -1 rather than -2.

Difference answered 26/5, 2020 at 3:38 Comment(0)
-1

Here is a clean solution to insert multiple rows using PDO.

It should work with PHP 7.1+, because of array destructuring, but this part could be changed easily I guess.

function insertMultipleQuery(string $table, array $columns, array $items): array
{
    $placeholders = '';
    foreach ($items as $item) {
        if (!empty($placeholders)) {
            $placeholders .= ', ';
        }
        $placeholders .= '(' . implode(',', array_fill(0, count($item), '?')) . ')';
    }

    $names = '`' . implode("`,`", $columns) . '`';
    $query = "INSERT INTO {$table} ({$names}) VALUES {$placeholders}";

    $values = [];
    foreach ($items as $item) {
        foreach ($item as $value) {
            $values[] = $value;
        }
    }

    return [
        'query' => $query,
        'values' => $values,
    ];
}

Sample data:

// $pdo = new PDO(...........); // Fill in your connection config
$tableName = 'table_name';
// It should be obvious - these are the actual column names in your table in the database.
$columns = [
    'name',
    'value',
    'created',
];
$dataToInsert = [
    [
        'name' => 'some name 1',
        'value' => 'some value 1',
        'created' => 'created datetime 1',
    ],
    [
        'name' => 'some name 2',
        'value' => 'some value 2',
        'created' => 'created datetime 2',
    ]
];

[
    'query' => $query,
    'values' => $values,
] = $this->insertMultipleQuery($tableName, $columns, $dataToInsert);
$stmt = $pdo->prepare($query);
$stmt->execute($values);
Handmade answered 13/7, 2023 at 17:46 Comment(1)
WARNING: This function is prone to SQL injection through column namesCosenza
-2

I had the same problem, and this is how I accomplished it for myself; I made a function for it (and you can use it if it helps you).

Example:

INSERT INTO countries (country, city) VALUES ('Germany', 'Berlin'), ('France', 'Paris');

$arr1 = Array("Germany", "Berlin");
$arr2 = Array("France", "Paris");

insertMultipleData("countries", Array($arr1, $arr2));


// Inserting multiple data to the Database.
public function insertMultipleData($table, $multi_params){
    try{
        $db = $this->connect();

        $beforeParams = "";
        $paramsStr = "";
        $valuesStr = "";

        for ($i=0; $i < count($multi_params); $i++) { 

            foreach ($multi_params[$i] as $j => $value) {                   

                if ($i == 0) {
                    $beforeParams .=  " " . $j . ",";
                }

                $paramsStr .= " :"  . $j . "_" . $i .",";                                       
            }

            $paramsStr = substr_replace($paramsStr, "", -1);
            $valuesStr .=  "(" . $paramsStr . "),"; 
            $paramsStr = "";
        }


        $beforeParams = substr_replace($beforeParams, "", -1);
        $valuesStr = substr_replace($valuesStr, "", -1);


        $sql = "INSERT INTO " . $table . " (" . $beforeParams . ") VALUES " . $valuesStr . ";";

        $stmt = $db->prepare($sql);


        for ($i=0; $i < count($multi_params); $i++) { 
            foreach ($multi_params[$i] as $j => &$value) {
                $stmt->bindParam(":" . $j . "_" . $i, $value);                                      
            }
        }

        $stmt->execute();
        $this->close($db);

        return true;

    }catch(PDOException $e){            
        return false;
    }

    return false;
}

// Making connection to the Database 
    public function connect(){
        $host = Constants::DB_HOST;
        $dbname = Constants::DB_NAME;
        $user = Constants::DB_USER;
        $pass = Constants::DB_PASS;

        $mysql_connect_str = 'mysql:host='. $host . ';dbname=' .$dbname;

        $dbConnection = new PDO($mysql_connect_str, $user, $pass);
        $dbConnection->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

        return $dbConnection;
    }

    // Closing the connection
    public function close($db){
        $db = null;
    }

If insertMultipleData($table, $multi_params) returns TRUE, your data has been inserted to your database.

Bushido answered 12/9, 2018 at 14:3 Comment(1)
Invalid syntax..Crematory
