
I am using a PostgreSQL 9.1 database into which entries arrive continuously from another program. Every 6 seconds I send an Ajax request to fetch the latest entries. The Tomcat output window shows this exception:

Arval SQLException: FATAL: sorry, too many clients already

and the program keeps working correctly after this. When I check Postgres with the query

select count(*) from pg_stat_activity;

it shows that the number of connections is increasing continuously, even though I close the connection after each request. I am using NetBeans and Struts 1.3.

    long previousSNO = Long.parseLong(request.getParameter("previousSNO"));
    if (previousSNO == 0) {
        sb.append("SELECT sno,search_type,search_value,search_date FROM log_temp ORDER BY search_date DESC LIMIT 20");
        prest = cb.executeSQLQuery(sb.toString());
        rs = prest.executeQuery();
    } else {
        sb.append("SELECT sno,search_type,search_value,search_date FROM log_temp WHERE sno > ? ORDER BY search_date DESC");
        prest = cb.executeSQLQuery(sb.toString());
        prest.setLong(1, previousSNO);
        rs = prest.executeQuery();
    }
    rs.last();
    int c = rs.getRow();
    rs.beforeFirst();

    if (rs != null && c > 0) {
        // code for making json resultsb from resultset here
        rs.close();
    }
    cb.closeConnection();
    response.setContentType("text/plain");
    response.getWriter().print(resultsb.toString());

And the close method in the connection bean is:

    public void closeConnection() {
        try {
            // st.close();
            conn.close();
            System.out.println("con is closed");
            conn = null;
        } catch (SQLException e) {
            System.out.println(e.getMessage());
            System.out.println("con is not closed");
        }
    }

Every time it prints "con is closed" on the console.

vikas malik

2 Answers


To increase the connection limit, follow the steps below.

This solution was tested on Ubuntu 12.04.

1. Make the following changes in the postgresql.conf file:

Open /etc/postgresql/9.1/main/postgresql.conf and set:

max_connections = 200
shared_buffers = 100MB
max_files_per_process = 100

Note: the shared_buffers size must be smaller than the kernel's shmmax size (here 100MB is below shmmax = 134217728 bytes = 128MB).

2. Commands to check shmmax:

$ sysctl -e kernel.shmmax
$ ipcs -l

Reference: Adjusting shmmax and shmall

3. Increase the size of shmmax:

Run the following commands:

$ sysctl -w kernel.shmmax=134217728
$ sysctl -w kernel.shmall=2097152

and, to make the change persist across reboots, add these lines at the top of the /etc/sysctl.conf file:

kernel.shmmax=134217728
kernel.shmall=2097152

Reference: SHMMAX in Ubuntu

4. Restart PostgreSQL:

$ service postgresql restart

Links:

http://www.varlena.com/GeneralBits/Tidbits/perf.html

http://www.postgresql.org/docs/9.1/static/runtime-config-resource.html

Sachin M Sharma

You can increase max_connections in Postgres, but that is not the real solution. You have a resource leak: it could be anything, a connection not closed or a result set not closed on some code path. Go back and check the code, especially the error paths.
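Since Java 7, try-with-resources guarantees that the connection, statement, and result set are closed even when an exception is thrown mid-request, which is exactly the kind of leak described above. A minimal sketch of how the servlet code in the question could be restructured (the `buildQuery` helper and the `countRows` signature are hypothetical, not part of the original code; the SQL is taken from the question):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

class LogTempDao {

    // Hypothetical helper: choose the SQL depending on previousSNO,
    // mirroring the if/else in the question.
    static String buildQuery(long previousSNO) {
        if (previousSNO == 0) {
            return "SELECT sno,search_type,search_value,search_date "
                 + "FROM log_temp ORDER BY search_date DESC LIMIT 20";
        }
        return "SELECT sno,search_type,search_value,search_date "
             + "FROM log_temp WHERE sno > ? ORDER BY search_date DESC";
    }

    // try-with-resources closes rs, prest, and conn in reverse order,
    // even if an exception is thrown while reading rows.
    static int countRows(String url, String user, String password,
                         long previousSNO) throws SQLException {
        try (Connection conn = DriverManager.getConnection(url, user, password);
             PreparedStatement prest = conn.prepareStatement(buildQuery(previousSNO))) {
            if (previousSNO != 0) {
                prest.setLong(1, previousSNO);
            }
            try (ResultSet rs = prest.executeQuery()) {
                int c = 0;
                while (rs.next()) {
                    c++; // build the JSON result here instead of using rs.last()/getRow()
                }
                return c;
            }
        }
    }
}
```

Counting rows with `rs.next()` also removes the need for `rs.last()`/`rs.beforeFirst()`, which fail on the default forward-only result set type.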

Consider using a connection-pooling library such as c3p0 or BoneCP.

A general discussion of connection pooling is here (thanks to @sinisa229 mihajlovski).
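The core idea behind libraries like c3p0 and BoneCP is that a fixed set of connections is opened once and then borrowed and returned, instead of opening a new physical connection on every Ajax request. This is only a toy sketch of that idea, not a substitute for a real pool (which adds validation, timeouts, and leak detection); the generic type stands in for java.sql.Connection:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Supplier;

// Toy pool: at most `size` resources ever exist; borrow() blocks when empty.
class SimplePool<T> {
    private final BlockingQueue<T> idle;

    SimplePool(int size, Supplier<T> factory) {
        idle = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            idle.add(factory.get()); // open all "connections" up front
        }
    }

    T borrow() {
        try {
            return idle.take(); // waits instead of opening connection #101
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IllegalStateException("interrupted while waiting for a connection", e);
        }
    }

    void release(T resource) {
        idle.offer(resource); // return to the pool instead of closing
    }
}
```

With a pool of, say, 20, pg_stat_activity would never show more than 20 backends for this application, no matter how often the Ajax requests arrive.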

Jayan
  • It also depends on the way resources are handled; it's good to use a connection pool for reusing connections. If a new connection is opened on each call, it can still lead to too many connections being opened. – Sinisha Mihajlovski Mar 21 '12 at 08:42
  • I am making a new connection for each call and closing it, but I think Postgres is not destroying that connection. I have set the connection limit to 100. – vikas malik Mar 21 '12 at 09:48
  • @vikas malik: I doubt it. Please post the output of select * from pg_stat_activity; It will show the current query and could give you a hint on where the leak is happening. – Jayan Mar 21 '12 at 10:05
  • Jayan, I checked. In the current query field the entry appears 99 times. Before 99 connections it doesn't show the error, but after this it shows the error. After 6 sec there is a new entry in pg_stat_activity. – vikas malik Mar 21 '12 at 12:48
  • @vikasmalik: Please add the relevant code to the question. We use PostgreSQL; I am sure it does not have trivial bugs. – Jayan Mar 21 '12 at 12:53
  • @vikasmalik: Please post it to the question. Comments become unreadable with such large code. Where are you closing the connection, result set, etc.? – Jayan Mar 21 '12 at 13:17
  • @vikas malik: You could give more details. Without much detail, it is safe to assume the error is in your code. – Jayan Apr 10 '12 at 04:23