9

I have two servers running PostgreSQL 8.4:

server1:5432
server2:5432

Now I want to copy a table from server1 into a database on server2.
Is that possible?

UPDATE

I tried doing it as in @Valery Viktorovsky's answer:

pg_dump --server1:5432 --encoding=utf8 --no-owner --username=postgres --123456 np_point > D:\np_point.sql
psql --server2:5432 --username=postgres mrsk -f D:\np_point.sql

and got this error:

ERROR:  syntax error at or near "pg_dump"
LINE 1: pg_dump --server1:5432 --encoding=utf8 --no-owner --use...
a_horse_with_no_name
Kliver Max
  • Possible duplicate of [How to import CSV file data into a PostgreSQL table?](https://stackoverflow.com/questions/2987433/how-to-import-csv-file-data-into-a-postgresql-table) – Eugen Konkov Apr 02 '18 at 18:18

6 Answers

18

The safest way is to use pg_dump.

pg_dump --host server1 --encoding=utf8 --no-owner --username=foo --password -t table_name db_name > server1_db.sql
psql --host server2 --username=foo db_name -f server1_db.sql
a_horse_with_no_name
Valery Viktorovsky
4

Try using the COPY command from psql: connect to server1 and export the table to CSV, then connect to server2 and import it from the CSV.
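A minimal sketch of that workflow, assuming a table named `np_point` (the path and names are placeholders):

```sql
-- On server1: export the table to a CSV file on the server's filesystem.
COPY np_point TO '/tmp/np_point.csv' DELIMITER ',' CSV HEADER;

-- On server2: import the CSV into a table with the same structure.
COPY np_point FROM '/tmp/np_point.csv' DELIMITER ',' CSV HEADER;
```

Note that server-side `COPY` reads and writes files as the server process, on the server's machine; psql's `\copy` meta-command does the same job but reads and writes files on the client instead, which also sidesteps server filesystem permissions.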

A4L
  • In `COPY products_273 TO '/tmp/products_199.csv' DELIMITER ',' CSV HEADER;` what does DELIMITER mean? – Kliver Max Mar 11 '13 at 09:54
  • 1
    The delimiter is a char that separates each column from the next one in an exported table row. Read more about [CSV](http://en.wikipedia.org/wiki/Comma-separated_values); it's a very common and easy format for importing/exporting stuff, not only from/to a database but from any other application that supports it. – A4L Mar 11 '13 at 10:01
  • I tried `COPY np_point TO 'C:/np_point.csv' DELIMITER ',' CSV HEADER;` PostgreSQL says: `Query returned successfully: 6004 rows affected, 31 ms execution time.` But I can't find the CSV file. – Kliver Max Mar 11 '13 at 10:10
  • Are you using Windows 7? Maybe the psql process does not have write access to `C:\`; try exporting to some other location. – A4L Mar 11 '13 at 10:35
  • I use WinXP. I tried it on disk D: but get errors like: `ERROR: could not open file "D:/np_point.sql/liferay-develop/111.csv" for writing: No such file or directory ` – Kliver Max Mar 11 '13 at 10:37
  • Hmmm... psql seems to have some problems with permissions. You can try to save the CSV file in the installation directory of your PostgreSQL server; there it should have the required permissions. – A4L Mar 11 '13 at 10:49
  • Could this problem be related to my running it in pgAdmin? – Kliver Max Mar 11 '13 at 10:56
  • Anyway, if I run COPY in the console I get no errors, and no CSV file. – Kliver Max Mar 11 '13 at 11:10
  • pgAdmin is just a client, just like psql, except that it has a graphical user interface. Clients (psql, pgAdmin or whatever) send requests to the server for processing; the server sends the results back to them. So if the server cannot fulfill a request, it doesn't matter who or what the client is. AFAIK installation packages for DB systems create a separate user (even on Windows) which is then used for running the services; this is done for security reasons. These users usually have very limited access outside of their home directory on the machine they are on. – A4L Mar 11 '13 at 11:17
4

Exact commands:

1. Export via pg_dump into a file:

    pg_dump --host "source hostname" --port 5432 --username "username" --no-password --verbose --file "filename" --table "source schema.tablename" "source db name"

This creates a file called "filename" in the directory where you ran the command, containing the schema and data for the source table. You can also give an absolute path.

2. Import via psql:

    psql --host "target hostname" --port 5432 --username "username" --password --verbose --file "file name" "target db name"  

This will prompt for the password.

Before running the import, drop the target table if it exists.

Worked like a charm and finished 10M rows within 2 minutes.

Sangam Belose
KRB
2

I've written a small script in Python that can help. Note: it will only work with small tables, since the whole table is loaded into memory. You first need to pip install pandas, sqlalchemy and psycopg2. Save the file as pycopy.py and run it with `python pycopy.py tablename`. You can change the source and destination to work with any other database type; just change the two connection strings.

The script:

import sys

import pandas as pd
import sqlalchemy as sa

# Connection strings for the source and destination databases.
strengine_source = 'postgresql://user:password@db_ip:5432/dbsourcename'
strengine_dest = 'postgresql://user:password@db_ip:5432/dbdestinationname'

if len(sys.argv) > 1:
    tblname = sys.argv[1]
    # Read the whole table into memory, then write it to the destination.
    df = pd.read_sql("select * from " + tblname, sa.create_engine(strengine_source))
    df.to_sql(tblname, sa.create_engine(strengine_dest), index=False)
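Since the script above needs two live PostgreSQL servers, here is the same read-everything-then-write-everything idea sketched with the standard library's sqlite3 so it can be tried locally; the table name and rows are made up for the demo:

```python
import sqlite3

# Two throwaway databases standing in for the source and destination servers.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")

# Set up a small source table.
src.execute("CREATE TABLE np_point (id INTEGER, name TEXT)")
src.executemany("INSERT INTO np_point VALUES (?, ?)", [(1, "a"), (2, "b")])
src.commit()

# Recreate the table on the destination from the source's stored DDL...
schema = src.execute(
    "SELECT sql FROM sqlite_master WHERE type = 'table' AND name = 'np_point'"
).fetchone()[0]
dst.execute(schema)

# ...then copy every row across, like the read_sql/to_sql pair does.
rows = src.execute("SELECT * FROM np_point").fetchall()
dst.executemany("INSERT INTO np_point VALUES (?, ?)", rows)
dst.commit()

print(dst.execute("SELECT COUNT(*) FROM np_point").fetchone()[0])  # 2
```

As with the pandas version, everything is held in memory between the read and the write, which is why this approach only suits small tables.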
Roee Anuar
1

This will copy only a particular table from a particular database to the destination database:

 pg_dump -h localhost -U postgres -p 5432 -C -t table_name source_db_name | ssh -C username@ip "psql -h localhost -U postgres -p 5432 destination_db_name"
Dhruvil Thaker
1

You can also use a Linux shell script to migrate your table data from one PostgreSQL server to another.

I just posted an answer on a similar Stack Overflow question; please refer to it: Copying PostgreSQL database to another server

Anvesh