Error: sorry, too many clients already
I am using PostgreSQL in my Node.js backend service. All of a sudden, when I start the service I get the error below:

 connection error error: sorry, too many clients already.

PostgreSQL connection config:

 const pg = require("pg");
 const client = new pg.Client({
        host: "txslmxxxda6z",
        user: "mom",
        password: "mom",
        database: "mom", // note: the pg option is "database", not "db"
        port: 5025
 });

Because of this error I am unable to query the database, and I have not been able to fix it. Can you please suggest a solution?

Haddix answered 20/6, 2018 at 11:22 Comment(3)
Close some database connections so that you drop below max_connections.Monition
Close the connections that have been idle for a long time.Oshinski
Can you please suggest how to do that?Haddix

The query below will help you:

select max_conn,used,res_for_super,max_conn-used-res_for_super res_for_normal 
from 
  (select count(*) used from pg_stat_activity) t1,
  (select setting::int res_for_super from pg_settings where name=$$superuser_reserved_connections$$) t2,
  (select setting::int max_conn from pg_settings where name=$$max_connections$$) t3
Oshinski answered 20/6, 2018 at 11:28 Comment(3)
Thanks. I am able to see details of connections but please suggest me how to release these connections to run the service againHaddix
On PostgreSQL 9.2+ the column is pid (not procpid) and idle sessions have state = 'idle': SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE state = 'idle' AND now() - query_start > interval '10 minutes';Oshinski
Use this query to find the idle connections: select * from pg_stat_activity where state = 'idle';Oshinski

I experienced the same issue ("sorry, too many clients already. Request timeout") and managed to solve it. There are a few things you can try:

1) Increase the maximum number of connections and tune the other pool settings in Sequelize, for example:

  {
    max: 95,        // maximum number of connections the pool will open
    min: 0,         // minimum number of connections the pool keeps open
    acquire: 20000, // ms to wait for a connection before throwing an error
    idle: 20000,    // ms a connection may be idle before it can be evicted
    evict: 10000    // interval (ms) at which idle connections are actually removed
   }

2) The second option is to use TimescaleDB: with complex queries on time-series data we sometimes get timeout errors, and TimescaleDB lets you run those complex queries. You can also set the maximum number of clients for Postgres according to your server's requirements:

https://github.com/timescale/timescaledb-tune
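For reference, here is a minimal sketch of where these options go when constructing the Sequelize instance. The surrounding options object and the commented-out connection string are placeholders; as noted in the comments below, the settings must sit under the pool key:

```javascript
// Pool options from the snippet above, wrapped under the `pool` key the way
// Sequelize expects them (the rest of this options object is a sketch):
const sequelizeOptions = {
  dialect: "postgres",
  pool: {
    max: 95,        // maximum number of connections in the pool
    min: 0,         // minimum number of connections kept open
    acquire: 20000, // ms to wait for a connection before throwing
    idle: 20000,    // ms a connection may be idle before eviction
    evict: 10000    // interval (ms) of the idle-connection eviction run
  }
};

// Usage sketch (assumes the "sequelize" package is installed):
//   const { Sequelize } = require("sequelize");
//   const sequelize = new Sequelize("postgres://user:pass@host:5432/db", sequelizeOptions);
console.log(`pool will open at most ${sequelizeOptions.pool.max} connections`);
```

Keep max well below the server's max_connections, especially if several instances of the service run at once.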

Parenteau answered 23/10, 2019 at 8:10 Comment(3)
I just improved my post above. Use "evict" in the connection pool, because "idle" only marks connections as idle; "evict" actually removes them.Parenteau
By default evict is 1000 ms, and this value is the interval at which idle connections are removed, so I think increasing it will allow idle connections to live longer. Of course, these are configuration options and their values depend on the application; still, it is important to know how to use them to improve the system.Prehensible
You need to put pool: in front of the object in 1).Selenaselenate

If you would like to use pgAdmin (for me it is more than convenient), you can follow these simple steps. More than that, its dashboard will show you all your connections.

In the documentation, https://node-postgres.com/api/pool, you can see the "max" and "idleTimeoutMillis" settings, so play with your config, starting from this:

const poolCredentials = {
    user: process.env.DB_USER,
    host: process.env.DB_HOST,
    database: process.env.DB_NAME,
    password: process.env.DB_PW,
    port: process.env.DB_PORT,
    idleTimeoutMillis: 1,
    max: 10,
    connectionTimeoutMillis: 2000,
    ...ssl
};

Simply check your server constraints and adjust the config accordingly. Good luck, and I'm glad if this helps. To see the current connections, run:

SELECT * FROM pg_stat_activity;
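The numbers matter because every instance of your service holds its own pool, and they all count against the same server-wide limit. A quick back-of-the-envelope check (all values below are hypothetical; read the real ones from pg_settings) looks like this:

```javascript
// Hypothetical values -- substitute your own from:
//   SHOW max_connections;  SHOW superuser_reserved_connections;
const maxConnections = 100;      // server-wide connection cap
const reservedForSuperuser = 3;  // connections only superusers may use
const serviceInstances = 8;      // copies of the Node service running
const poolMaxPerInstance = 10;   // `max` in each instance's pool config

const worstCase = serviceInstances * poolMaxPerInstance; // 80
const available = maxConnections - reservedForSuperuser; // 97

console.log(worstCase <= available
  ? `ok: worst case ${worstCase} of ${available} available connections`
  : `expect "too many clients": ${worstCase} > ${available}`);
```

If the worst case exceeds what is available, lower max per instance or raise max_connections on the server.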

Deluca answered 8/8, 2022 at 12:14 Comment(0)

Do you have an idea how your program managed to open so many connections to the DB? What is your connection-management strategy:

  1. Open a short-lived connection for each query/command/batch/call, or
  2. Open a single connection (or a few) with the lifespan of the entire service?

Are you sure you are closing every connection that you open? To avoid concurrency issues, I use the first strategy in my Node.js applications: I open a dedicated connection to handle each request and make absolutely sure to close it when the request has been handled. That way, the maximum number of open connections equals the maximum number of simultaneously processed requests.
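The checkout/release discipline described above can be sketched as follows. The pool here is only a stub standing in for a real connection pool such as pg.Pool; the point is the pattern of always releasing in a finally block:

```javascript
// Stub pool standing in for a real connection pool (e.g. pg.Pool).
// It only tracks how many clients are currently checked out.
const pool = {
  inUse: 0,
  async connect() {
    pool.inUse += 1;
    return {
      query: async (sql) => ({ rows: [] }), // placeholder query
      release: () => { pool.inUse -= 1; }
    };
  }
};

// One connection per request, released no matter what happens.
async function handleRequest(sql) {
  const client = await pool.connect();
  try {
    const result = await client.query(sql);
    return result.rows;
  } finally {
    client.release(); // without this line, connections leak until the
                      // server answers "too many clients already"
  }
}
```

With a real pg.Pool the shape is identical: pool.connect(), query, and client.release() in a finally block (or simply pool.query(), which checks out and releases a client for you).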

Concrete answered 21/9, 2022 at 14:35 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.