
I'm having a problem exporting my PostgreSQL output from a shell to a CSV file.
My SQL script is called script.sql.

I typed the following command in my shell:

psql congress -af script.sql &> filename.csv

But when I opened the filename.csv file in Excel, the values of all the columns were squeezed into a single column (see the attached screenshot).

Then I tried another way. I edited my script.sql to be:

Copy (Select * From ...) To '/tmp/filename.csv' With CSV;

Then I typed the following in a psql session connected to the database:

\i script.sql

The output is:

COPY 162

Well, my output query has 162 rows.

So the 162 rows of my output table have been copied in the shell. How can I paste or move them to a csv file?

Or, if I'm going to use the filename.csv (screenshot is attached), how can I fix the format of that csv/Excel file?

Screenshot of filename.csv

mflowww
  • When you open the file in Excel or a similar program, change the delimiter to '|'. – cur4so Mar 22 '15 at 06:01
  • http://stackoverflow.com/questions/8119297/postgresql-export-resulting-data-from-sql-query-to-excel-csv – a_horse_with_no_name Mar 22 '15 at 09:39
  • See the more comprehensive answer here, which distinguishes client-side `\copy` from server-side `COPY` queries in `psql`: https://stackoverflow.com/questions/1517635/save-pl-pgsql-output-from-postgresql-to-a-csv-file – nealmcb Sep 28 '17 at 17:21

4 Answers


Modern syntax:

COPY (SELECT * FROM ...) TO '/tmp/filename.csv' (format CSV);

So the 162 rows of my output table have been copied in the shell. How can I paste or move them to a csv file?

The result is the CSV file: COPY ... TO writes it server-side, so look for /tmp/filename.csv on the machine running the database server. Open it with any spreadsheet program, using matching delimiters. Per documentation:

The default is a tab character in text format, a comma in CSV format

Like Patrick commented, you can use the corresponding psql meta command \copy in a similar fashion. It writes (and reads) files local to the client and does not require superuser privileges.
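For instance, a client-side export can be run in one shot from the OS shell. This is a sketch: `congress` is the database from the question, and `some_table` stands in for the real query; it also assumes a reachable PostgreSQL server.

```shell
# \copy runs the query server-side but writes the CSV on the client,
# so it needs no superuser rights and no access to the server's /tmp.
# Database name is from the question; the table name is illustrative.
psql congress -c "\copy (SELECT * FROM some_table) TO 'filename.csv' WITH (FORMAT csv, HEADER)"
```

The resulting filename.csv lands in the client's current working directory, ready to open in Excel.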

More explanation in these related answers:

Erwin Brandstetter

First, store your connection info in ~/.pgpass (one line in the format hostname:port:database:username:password), lock down its permissions, and then run the query script:

echo 'ip:port:dbname:user:pass' > ~/.pgpass
chmod 0600 ~/.pgpass
psql -h serverip -U userid dbname -af test.sql | gzip > result.txt.gz
Ash

Going off on a bit of a tangent, there is another way too.

I use the following in a Windows batch script:

psql.exe -U %cUser% -h %cHost% -p %cPort% -d %cDB% -t -o "%dumpDir%\tables.txt" -A -c "SELECT table_schema, table_name FROM information_schema.tables WHERE table_schema = '%qSchema%';"

The trick is to remember the -A option. It suppresses the whitespace padding of the aligned output format.

I do this to avoid permission errors from the COPY command shown above, since the account running postgres does not have the same permissions as the account running my scheduled batch file.

This gets me a list of tables in a given schema, with results like:

myschema|mytable1

myschema|mytable2

I then use the FOR batch command to process each line. If you REALLY wanted a CSV file, all you would need to do would be this:

ECHO table_schema,table_name > %dumpDir%\tables.csv
FOR /F "tokens=1,2 delims=|" %%p IN (%dumpDir%\tables.txt) DO ECHO %%p,%%q >> %dumpDir%\tables.csv

Probably not the most efficient of mechanisms, but works fine on small numbers of output rows.
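For readers not on Windows, the same pipe-to-comma conversion can be sketched in a POSIX shell (file paths and sample rows are illustrative, standing in for the psql -A output above):

```shell
# Simulate the psql -A output from above: one 'schema|table' pair per line.
printf 'myschema|mytable1\nmyschema|mytable2\n' > /tmp/tables.txt

# Prepend a header row and swap the '|' delimiter for a comma.
{ echo 'table_schema,table_name'; tr '|' ',' < /tmp/tables.txt; } > /tmp/tables.csv

cat /tmp/tables.csv
# → table_schema,table_name
#   myschema,mytable1
#   myschema,mytable2
```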


Most of the previous answers are correct. Here are more details, for Linux only.

1. Create a file like my_query.sql:

\copy (SELECT * FROM some_table) TO '/tmp/some_file.csv' WITH (FORMAT csv)

2. Run the command below after updating the connection variables for your DB and the file path:

psql "host=YOUR_HOST port=YOUR_PORT dbname=YOUR_DB user=YOUR_USER password=YOUR_PASS" -af my_query.sql

Tip: be cautious about how much free disk space you have available. If you do a SELECT * FROM your_table, you might run out of space if the table is large.
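One way to act on that tip is to compress the CSV instead of keeping it uncompressed, in the spirit of the gzip pipeline in the earlier answer. The sketch below simulates it with a tiny sample file, since no live server is assumed; with a real server you could pipe `\copy ... TO STDOUT WITH (FORMAT csv)` straight into gzip instead.

```shell
# Stand-in for a COPY result; a real export would come from psql.
printf 'id,name\n1,alpha\n2,beta\n' > /tmp/some_file.csv

# Compress in place to save disk space; leaves /tmp/some_file.csv.gz.
gzip -f /tmp/some_file.csv

# The rows survive the round trip.
gunzip -c /tmp/some_file.csv.gz
```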

grepit