sequelize: Can't create more than max_prepared_stmt_count statements
After my server has been running for a while, this error appears:
"can't create more than max_prepared_stmt_count statements".
I checked the status of MySQL and got the following results:
mysql> SHOW GLOBAL STATUS LIKE 'com_stmt%';
+-------------------------+---------+
| Variable_name           | Value   |
+-------------------------+---------+
| Com_stmt_execute        | 6512872 |
| Com_stmt_close          | 0       |
| Com_stmt_fetch          | 0       |
| Com_stmt_prepare        | 18313   |
| Com_stmt_reset          | 0       |
| Com_stmt_send_long_data | 0       |
| Com_stmt_reprepare      | 0       |
+-------------------------+---------+
mysql> show variables like "%prepared%";
+------------------------------------------------------+-------+
| Variable_name                                        | Value |
+------------------------------------------------------+-------+
| max_prepared_stmt_count                              | 16382 |
| performance_schema_max_prepared_statements_instances | -1    |
+------------------------------------------------------+-------+
My friend suggested that I raise MySQL's 'max_prepared_stmt_count' to a higher value. I checked the Sequelize code and found that the connection pool only 'closes' connections; it does not close prepared statements. Is there an API that can solve this, or is raising 'max_prepared_stmt_count' the only option? Many thanks.
Environment
- Dialect: mysql
- Sequelize version: 5.3.5
- Database version: 8.0.15
- Node version: v10.3.0
- OS: CentOS 7.5
- TypeScript version: 3.4.3
About this issue
- State: open
- Created 5 years ago
- Reactions: 13
- Comments: 21 (10 by maintainers)
Any updates on this? I had the exact same problem on a production server.
This bug is still happening and it took down an entire production server due to high load. It's kinda hard to track down which queries are actually generating this problem in order to apply workarounds such as binding the parameters 😕
I think this should be a high priority issue.
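For anyone unfamiliar with that workaround, "binding the parameters" presumably refers to Sequelize's bind option on raw queries: when values are bound rather than concatenated into the SQL string, the statement text stays identical across calls, so the driver can reuse a single prepared statement instead of preparing a new one for every distinct value. A rough sketch (the connection URI and the users table are placeholders, not from this thread):

```ts
import { QueryTypes, Sequelize } from 'sequelize';

// Placeholder connection URI for illustration only.
const sequelize = new Sequelize('mysql://app_user:secret@localhost:3306/app_db');

// Because the SQL text never changes, mysql2 can prepare it once per connection
// and reuse it, instead of accumulating a new prepared statement for every
// distinct inlined value.
async function findUserById(id: number) {
  return sequelize.query('SELECT * FROM users WHERE id = $id', {
    bind: { id },
    type: QueryTypes.SELECT,
  });
}
```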
Yes, this seems to have fixed the issue the first person was having, which is the problem we had. We deployed it to production earlier today and it is working well.
You'd need to divide max_prepared_stmt_count by how many instances you expect to have running at the same time, but otherwise yes.

@senthilbalajir hope this can help you (and others, like me) fix the issue and avoid it on v6 (I just spent a few days migrating my project to v7, but needed to get a v6 deployment stable):
- max_prepared_stmt_count is a server-side limit; it applies to ALL connections
- setting maxPreparedStatements lower tells mysql2 to reset the prepared statements it has cached when it hits that value

Hopefully this helps you; this was crashing our project in production a fair bit when load got high, requiring a MySQL server restart.
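As a back-of-the-envelope illustration of that sizing rule (the instance and pool counts below are made-up numbers, not values from this thread):

```ts
// Hypothetical sizing: keep the whole fleet under the server-wide
// max_prepared_stmt_count, since every pooled connection keeps its own
// statement cache.
const maxPreparedStmtCount = 16382; // MySQL default, as shown above
const appInstances = 4;             // assumed: app processes sharing the server
const poolMaxPerInstance = 10;      // assumed: Sequelize pool.max per instance

// Divide the server budget across all connections and keep ~20% headroom
// for other clients of the same MySQL server.
const perConnectionLimit = Math.floor(
  (maxPreparedStmtCount / (appInstances * poolMaxPerInstance)) * 0.8,
);
console.log(perConnectionLimit); // 327 -> a reasonable maxPreparedStatements value
```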
This issue over on the mysql2 project helped me wrap my head around all the variables to fully understand why it was happening: https://github.com/sidorares/node-mysql2/issues/702#issuecomment-356005017
Thanks @ephys. Since the issue affects version 6, is there any chance of merging the fix into the version 6 package?
We ran into the original issue mentioned by @play2010dk as well and solved it. The number of prepared statements is controlled by the mysql2 library, which uses an LRU cache to cache prepared statements. The default number of cached statements is 16,000 per connection.
See the documentation on prepared statements.
If you have a connection pool with more than one connection, you may end up creating:
number of connections * 16,000 prepared statements.
To solve this, you can set maxPreparedStatements to a much lower value in the Sequelize connection config.
Here is an example; it is set in the dialect options.
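A minimal sketch of that configuration, assuming Sequelize v5/v6 with the mysql dialect backed by mysql2 (the credentials, pool size, and the value of 100 are placeholders):

```ts
import { Sequelize } from 'sequelize';

const sequelize = new Sequelize('app_db', 'app_user', 'secret', {
  host: 'localhost',
  dialect: 'mysql',
  pool: { max: 10, min: 0 },
  dialectOptions: {
    // Passed through to the mysql2 driver: caps the per-connection LRU cache
    // of prepared statements, so statements get reset once the cap is hit.
    // Keep pool.max * maxPreparedStatements well below the server's
    // max_prepared_stmt_count.
    maxPreparedStatements: 100,
  },
});
```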
Side note:
There may be a bug in the LRU cache mysql2 uses, or in how the mysql2 lib uses it. We only have about 100 parameterized queries (updates, inserts), so we should never have more than 100 prepared statements per connection. But we are seeing thousands in production, and we can replicate it in our development environment.
Environment
- mysql 5.7
- Sequelize 5.15.0
- Node v10.16.0