How to get rid of MySQL error 'Prepared statement needs to be re-prepared'

I've rewritten my site's PHP code and added MySQL stored procedures.

In my local version everything works fine, but after I uploaded the site to the hosting server I constantly get the fatal error 'Prepared statement needs to be re-prepared'.

Sometimes the page loads, sometimes loading fails and I see this error. What does it mean?

Bandurria answered 7/12, 2010 at 19:36 Comment(2)
The answers below don't get to the core of the problem. More answers in this duplicate. – Thay
Your comment is no longer true. More answers came over the years, and now the answers below seem to add a lot to the duplicate (which is of little help, indeed). – Montagnard

This is a possibility: MySQL bug #42041

They suggest upping the value of table_definition_cache.

You can read about statement caching in the MySQL docs.
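
If your MySQL account is allowed to change global variables (often not the case on shared hosting), one way to check and raise the value at runtime is sketched below. This is a minimal sketch, assuming an existing PDO connection $pdo with sufficient privileges; a SET GLOBAL change does not survive a server restart, so for a permanent fix set table_definition_cache in my.cnf instead.

// Sketch: inspect and raise table_definition_cache at runtime (PHP/PDO).
// Assumes $pdo is an existing PDO connection for an account that may
// change global variables; adjust 4000 to suit the size of your schema.
$row = $pdo->query("SHOW GLOBAL VARIABLES LIKE 'table_definition_cache'")
           ->fetch(PDO::FETCH_ASSOC);
echo $row['Variable_name'] . ' = ' . $row['Value'] . PHP_EOL;

$pdo->exec('SET GLOBAL table_definition_cache = 4000');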

Hedge answered 7/12, 2010 at 20:18 Comment(5)
How do you go about upping the value of table_definition_cache? – Bouie
The error went away for me when I added table_definition_cache=5000 in my /etc/my.cnf file (under the [mysqld] section) and restarted mysqld. Note that my.cnf may be in a different location for you. – Incongruent
There also seems to be an equivalent bug in MariaDB 10.0, 10.1, 10.2, 10.3, 10.4, 10.1.34: jira.mariadb.org/browse/MDEV-17124, with similar workarounds (increase table_definition_cache to be larger than or equal to table_open_cache). – Yarn
This didn't work for me, but the answer below from @rodrigo did, using FLUSH TABLES; – Fount
I can't believe the question I asked 14 years ago still gets so much attention :) – Bandurria

@docwhat's answer seems nice, but on a shared hosting server, not everyone is allowed to touch the table_open_cache or table_definition_cache options.

Since this error is related to prepared statements, I have tried to 'emulate' those with PDO by providing the following option:

$dbh = new PDO('mysql:host=localhost;dbname=test', $user, $pass, [
    PDO::ATTR_EMULATE_PREPARES => true
]);

Note: this is actually in a Laravel 5.6 project, and I added the option in config/database.php:

'connections' => [
    'mysql' => [
        'driver' => 'mysql',
        'host' => env('DB_HOST', '127.0.0.1'),
        'port' => env('DB_PORT', '3306'),
        'database' => env('DB_DATABASE', 'forge'),
        'username' => env('DB_USERNAME', 'forge'),
        'password' => env('DB_PASSWORD', ''),
        'unix_socket' => env('DB_SOCKET', ''),
        'charset' => 'utf8mb4',
        'collation' => 'utf8mb4_unicode_ci',
        'prefix' => '',
        'strict' => true,
        'engine' => null,
        'options' => [
            PDO::ATTR_EMULATE_PREPARES => true,
        ],
    ],
    (...)
],

I have not tested the impact of emulating prepared statements on page load time, but it does get rid of the error SQLSTATE[HY000]: General error: 1615 Prepared statement needs to be re-prepared that I was getting.

Update on the performance: the emulated version seems to be slightly faster (32.7±1.4ms emulated, 35.0±2.3ms normal, n=10, p-value=0.027 for two-tailed Student's T-test).

Byre answered 5/4, 2018 at 14:20 Comment(4)
I am also facing this issue and this fix works fine. But for security reasons it is not advised (citing David's comment). If you are willing to take the risk, or have no other choice, then so be it :D – Latimer
So far, my search for security issues related to native vs emulated prepared statements yields no differences. Here is also a nice starting point. What reason do you have to consider emulation a risk? – Byre
I admit I have limited knowledge about this and relied on some brief info I found. I believed increasing table_definition_cache was the most appropriate solution here, but I'm starting to doubt that after reading your link. Actually, I'm still facing this issue, and I've observed that it occurs on PHP 7.0 but not on PHP 5.6 (this is specific to my setup). – Latimer
Be aware that emulation causes integer types to be fetched as strings. table_definition_cache is more reliable. – Gerah

In short: Don't use VIEWS in prepared statements.

This seems to be an ongoing issue.

Views are messy to handle with dynamic SQL.

The earliest bug report was Cannot create VIEWs in prepared statements, from 11 years ago. A patch was put in to address it.

Another bug report, Prepared-Statement fails when MySQL-Server under load, states that error 1615 is not a bug when the underlying tables are busy. (Really?)

While there is some merit to increasing the table cache size (See MySql error when working with a mysql view), it does not always work (See General error: 1615 Prepared statement needs to be re-prepared (selecting mysql view))

ALTERNATIVES

Over a year ago, someone mentioned this in the MySQL Forum (MySql “view”, “prepared statement” and “Prepared statement needs to be re-prepared”).

Someone came up with the simple idea of not using the view in the prepared statement, but using the SQL of the view as a subquery instead, as sketched below. Another idea would be to take the SQL used by the view and execute it in your client code.
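
For example, here is a minimal sketch of that first idea in PHP/PDO. The view name, its defining SELECT, and the $pdo connection are hypothetical placeholders; the point is simply that the prepared statement references a derived table instead of the view:

// Hypothetical view, for illustration only:
//   CREATE VIEW v_paid_orders AS
//     SELECT id, total FROM orders WHERE status = 'paid';

// Instead of preparing against the view
// ($pdo->prepare('SELECT * FROM v_paid_orders WHERE id = ?')),
// inline the view's own SELECT as a derived table:
$stmt = $pdo->prepare(
    "SELECT t.*
       FROM (SELECT id, total FROM orders WHERE status = 'paid') AS t
      WHERE t.id = ?"
);
$stmt->execute([$orderId]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);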

These would seem to be better workarounds than just bumping up the table cache size.

Fou answered 27/6, 2020 at 11:2 Comment(4)
Actually, I think this is the correct answer, as it points to the reason for the error instead of just handling the effect by changing server configuration! Thanks a lot - I'll change my code according to your suggestions and linked resources and hope this will finally solve this weird error! – Macaulay
This is a really poor answer. Using prepared statements with views does work for MySQL/MariaDB databases (I know from experience working with TB of data across millions of DB tables, with views for most of them). The better solution would be to determine appropriate values for the table_xxx_cache settings (allowed for your environment), or to limit SQL concurrency to stay within those limits. – Denominate
@JJorgenson you didn't specify what "appropriate value" means - how to find it, or why this doesn't work. – Fou
(This needs a dedicated article), but as a short answer, the "appropriate values" are likely the maximum number of tables in use at any given point in time. My server has >5M tables, but we only have about 100,000 tables in use at any given point, and that is where our cache limits are set. Obviously a larger cache requires more server RAM/resources, which may impose limits on the cache sizes. – Denominate

First, open the MySQL shell:

mysql 

Check the value of the table_definition_cache:

show global variables like '%table_definition_cache%';

It might be 400 or 1400.

Enlarge it:

set global table_definition_cache = 4000;

Good to go!

Solidstate answered 6/5, 2021 at 6:21 Comment(0)

Running the FLUSH TABLES; command on the database solved it for me. I was using Doctrine ORM.

Runstadler answered 8/4, 2021 at 18:15 Comment(2)
This worked for me without modifying any project code. Increasing the table_definition_cache to 1024 worked in dev, but not in production. I'm assuming 1024 simply wasn't sufficient in production as we have many more databases and tables on that system. – Abbreviate
This answer needs more up votes honestly. I have scheduled this into a task to run every so often. No issues since. – Hysterics

Issue: 'Prepared statement needs to be re-prepared'

This issue generally occurs when calling a stored procedure, whether from application code (e.g. Java) or directly from the back end.

Solution: Increase the size of the cache by executing the script below.

Script: SET GLOBAL table_definition_cache = 4000;

Harangue answered 21/9, 2018 at 10:13 Comment(1)
Where exactly do you run the above script? – Bouie

just do this:

SET GLOBAL table_definition_cache = 4096;
SET GLOBAL table_open_cache = 4096;

4096 may be too low, so set it to a higher value if needed. But make sure that both values are the same.

Juristic answered 2/9, 2021 at 13:37 Comment(0)

My solution is to create a routine like this:

DELIMITER $$
--
-- Procedures
--
DROP PROCEDURE IF EXISTS `dch_content_class_content`$$

CREATE DEFINER=`renuecod`@`localhost` PROCEDURE `dch_content_class_content`(IN $classId INTEGER)
BEGIN
-- vw_content_class_contents is a VIEW (UNIONS)
  select * from vw_content_class_contents;
END$$

I hope this helps someone.

Scalpel answered 6/9, 2016 at 18:22 Comment(2)
Why do you need that bold and big font? – Outsize
Sorry, the font style is not necessary. I hope that my contribution has been useful for someone. – Scalpel

This is a workaround for people who are on shared hosting and don't want to risk SQL injection. According to this post: Laravel Fluent Query Builder Join with subquery, you could store the definition of your view in a function:

private function myView(){
    return DB::raw('(**definition of the view**) my_view_name');
}

and then use it like this:

public function scopeMyTable(Builder $query)
{
    return $query->join($this->myView(),'my_view_name.id','=','some_table.id');
}

This is a Laravel approach, but I'm sure it could be applied in most cases and doesn't need huge code refactoring or architecture changes. Plus, it's relatively secure, as your statements stay prepared.

Lyndsaylyndsey answered 5/1, 2020 at 23:3 Comment(0)

In my case this error was caused by a very large group_concat_max_len setting (SET SESSION group_concat_max_len = 1000000000000000000;). Removing that statement made the error go away.

Dangelo answered 26/7, 2021 at 1:34 Comment(0)

Okay, we were stuck on a laptop that would not allow Tableau to update or retrieve data from views on our MariaDB 10.3.31 databases. We did the usual Google and Stack Overflow searching and found lots of solutions that just would not work. However, we had another laptop that did connect and ran just fine. So after many head scratches, a eureka moment came: the offending laptop had both the V5.3.14 and V8.0.26 ODBC connectors installed. We removed the V8.0.26 ODBC connector and bang, all the view issues disappeared. Hope someone finds this solution useful.

Intraatomic answered 21/9, 2021 at 11:16 Comment(0)

I tried to run an UPDATE query which joined a view and a table, and got the same error. Since I had no user input, I decided to use DB::unprepared, which fixed the problem.

You can read more about it in the Laravel Documentation.
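
A minimal sketch of that approach is below. The table and view names are made up for illustration; DB::unprepared() sends the statement to MySQL without preparing it, so only use it when no user input is interpolated into the SQL:

use Illuminate\Support\Facades\DB;

// Hypothetical names: an orders table joined against a v_paid_orders view.
// The statement is sent as-is (not prepared), so never build it from user input.
DB::unprepared("
    UPDATE orders o
    JOIN v_paid_orders v ON v.id = o.id
    SET o.is_paid = 1
");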

Gompers answered 30/12, 2021 at 17:5 Comment(0)

If you're using Sequelize, and your DB is on shared hosting or any other environment where you can't change the server cache settings suggested above:

Run raw queries instead, like this:

const { DatabaseError } = require("sequelize");

// ...

const updateQueryData = { first_name: "New First name value", age: 34 }

try {
    // code that throws the ER_NEED_REPREPARE error
  } catch (error) {
    if (
      error instanceof DatabaseError && // optional check
      error?.original?.code === "ER_NEED_REPREPARE"
    ) {
      try {
        // https://mcmap.net/q/472090/-sequelize-updating-updatedat-manually :)
        const sql = db.ModelName.queryGenerator.updateQuery(
          db.ModelName.getTableName(),
          { ...updateQueryData, updated_at: db.sequelize.fn("NOW") },
          { id: 57 }, // where option, update row where id is 57
          /**
           * this option is an important part of this fix
           * without it, the query will still use a PREPARED statement
           */
          { bindParam: (bind) => db.sequelize.escape(bind) },
        );

        // execute query
        const results = await db.sequelize.query(sql, {
          /**
           * these are all optional options - no effect on UPDATE queries.
           * added them because you can use with other query types.
           */
          type: db.Sequelize.QueryTypes.UPDATE,
          model: db.ModelName,
          mapToModel: true,
        });

        // return "Success" (after checking results :))
      } catch (error) {
        console.error(error);
      }
    }

    console.error(error);

  }

Petition answered 4/6 at 3:1 Comment(0)

We had the same problem in our Java application. In our case, upping the value of table_definition_cache did not solve the issue.

What worked for us was changing some attributes in the database connection string.

We were using useServerPrepStmts=true. We changed that to false and added the following two attributes, among others:

prepStmtCacheSize=500
prepStmtCacheSqlLimit=1024

So the connection string looked something like this in the end:

jdbc:mysql://localhost:3306/db_name?useServerPrepStmts=false& 
prepStmtCacheSize=500&
prepStmtCacheSqlLimit=1024&
otherAttributesSpecificToOurProject

This did the trick for us. Might be worth giving it a try if nothing else worked for you.

Full answered 18/7 at 8:1 Comment(0)

I was getting this same error in Ruby on Rails in a shared hosting environment. It may not be the most secure solution, but disabling prepared statements got rid of the error message for me.

This can be done by adding the "prepared_statements: false" setting to your database.yml file:

production:
  prepared_statements: false

This seems like a reasonable solution when you don't have control over the configuration settings on the MySQL server.

Flagg answered 27/10, 2016 at 12:46 Comment(0)
