I'm developing a Rails 3 app that uses Postgres as its database. I've got the table shown below:
     Table "public.test"
    Column     |  Type   | Modifiers
---------------+---------+-----------
 id            | integer | not null
 some_other_id | integer |
Indexes:
    "test_pkey" PRIMARY KEY, btree (id)
    "some_other_id_key" UNIQUE CONSTRAINT, btree (some_other_id)
This has two columns:
- id, which is the primary key (automatically created by Rails)
- some_other_id, which contains keys generated by another system. This ID needs to be unique, so I've added a unique constraint to the table (roughly the DDL shown below).
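For reference, I believe the constraint amounts to something like the following DDL (the constraint name matches what psql reports above):

-- Unique constraint on the externally generated key:
ALTER TABLE test ADD CONSTRAINT some_other_id_key UNIQUE (some_other_id);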
Now if I try to insert a row with a duplicate some_other_id, it fails (good) and I get the following output in my Postgres logs:
ERROR: duplicate key value violates unique constraint "some_other_id_key"
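For concreteness, two identical inserts are enough to reproduce it:

INSERT INTO test (some_other_id) VALUES (123);
INSERT INTO test (some_other_id) VALUES (123);  -- fails: some_other_id = 123 already exists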
The problem is that it's entirely normal for my app to try to add the same ID twice, so my logs are being spammed with this "ERROR" message. That causes various problems: the log files take up a lot of disk space, real diagnostics get lost in the noise, and Postgres has to discard diagnostics to keep the log files within their size limits.
Does anyone know how I can either:
- Suppress the log, either by suppressing all logging about this key, or perhaps by specifying something on the transaction that tries to do the INSERT.
- Use some other Postgres feature to spot the duplicate key and not attempt the INSERT at all. I've heard of rules and triggers but I can't get either to work (though I'm no Postgres expert); a sketch of what I've tried is below.
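For what it's worth, this is roughly the kind of rule I've been experimenting with. The rule name is made up, and I haven't managed to get it working alongside the RETURNING id that Rails tacks onto its inserts:

-- A sketch only: skip the INSERT when the key is already present.
CREATE RULE test_skip_duplicate_some_other_id AS
    ON INSERT TO test
    WHERE EXISTS (SELECT 1 FROM test WHERE some_other_id = NEW.some_other_id)
    DO INSTEAD NOTHING;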
Note that any solution needs to work with Rails, which does its inserts like this:
INSERT INTO test (some_other_id) VALUES (123) RETURNING id;
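For completeness: I realise I could hand-write a conditional insert along these lines, but since Rails generates the statement above, I can't simply swap in my own SQL.

-- Only inserts when the key is absent; Rails won't emit this form by itself.
INSERT INTO test (some_other_id)
SELECT 123
WHERE NOT EXISTS (SELECT 1 FROM test WHERE some_other_id = 123)
RETURNING id;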