Wednesday, 24 October 2018

How to output MySQL query results in CSV format?

Is there an easy way to run a MySQL query from the Linux command line and output the results 
in CSV format?
Here's what I'm doing now:
mysql -u uid -ppwd -D dbname << EOQ | sed -e 's/\t/,/g' | tee list.csv
select id, concat("\"",name,"\"") as name
from students
EOQ
It gets messy when there are a lot of columns that need to be surrounded by quotes, 
or if there are quotes in the results that need to be escaped.

Answers


SELECT order_id,product_name,qty
FROM orders
WHERE foo = 'bar'
INTO OUTFILE '/var/lib/mysql-files/orders.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
Note that column names will not be exported with this command.
Also note that /var/lib/mysql-files/orders.csv will be created on the server that is
running MySQL. The user that the MySQL process runs as must have
permission to write to the chosen directory, or the command will fail.
If you want to write the output to your local machine from a remote server
(especially a hosted or virtualized machine such as Heroku or Amazon RDS), this solution is not suitable.
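If you are not sure where the server is allowed to write, the secure_file_priv system variable reports the directory INTO OUTFILE may use. A quick check (a hedged sketch reusing the connection options from the question; on MySQL 5.7+ the default is often the /var/lib/mysql-files directory used above):
# Empty value: no restriction. NULL: file export is disabled.
# A directory name: exports must be written inside that directory.
mysql -u uid -p -e "SHOW VARIABLES LIKE 'secure_file_priv';"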






mysql --batch, -B

Print results using tab as the column separator, with each row on a new line. 
With this option, mysql does not use the history file. Batch mode results in non-tabular 
output format and escaping of special characters. Escaping may be disabled by using raw mode; 
see the description for the --raw option.

This will give you a tab-separated file. Since commas (and strings containing commas) are not escaped, it is not straightforward to change the delimiter to a comma.
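For example, a minimal batch-mode invocation might look like this (a sketch reusing the connection options and table from the question):
# --batch (-B) prints tab-separated rows; redirect them to a file.
mysql -u uid -ppwd -D dbname --batch -e "select id, name from students" > students.tsv
# Add -N (--skip-column-names) if you do not want the header row.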



Unix/Cygwin only, pipe it through 'tr':
mysql <database> -e "<query here>" | tr '\t' ',' > data.csv
N.B.: This handles neither embedded commas, nor embedded tabs.



How about:
mysql your_database -p < my_requests.sql | awk -F'\t' '{print $1","$2}' > out.csv



MySQL Workbench can export recordsets to CSV, and it seems to handle commas in fields very well. The CSV opens up in OpenOffice fine.



From your command line, you can do this:
mysql -h *hostname* -P *port number* --database=*database_name* -u *username* -p -e "*your SQL query*" | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > *output_file_name.csv*



Many of the answers on this page are weak because they don't handle the general case of what can occur in CSV format, e.g. commas and quotes embedded in fields and other conditions that always come up eventually. We need a general solution that works for all valid CSV input data.
Here's a simple and robust solution in Python:
#!/usr/bin/env python

import csv
import sys

tab_in = csv.reader(sys.stdin, dialect=csv.excel_tab)
comma_out = csv.writer(sys.stdout, dialect=csv.excel)

for row in tab_in:
    comma_out.writerow(row)
Name that file tab2csv, put it on your path, give it execute permissions, then use it like this:
mysql OTHER_OPTIONS --batch --execute='select * from whatever;' | tab2csv >outfile.csv
The Python CSV-handling functions cover corner cases for CSV input format(s).
This could be improved to handle very large files via a streaming approach.



This is simple, and it works on anything without needing batch mode or output files:
select concat_ws(',',
    concat('"', replace(field1, '"', '""'), '"'),
    concat('"', replace(field2, '"', '""'), '"'),
    concat('"', replace(field3, '"', '""'), '"'))

from your_table where etc;
Explanation:
  1. Replace " with "" in each field --> replace(field1, '"', '""')
  2. Surround each result in quotation marks --> concat('"', result1, '"')
  3. Place a comma between each quoted result --> concat_ws(',', quoted1, quoted2, ...)
That's it!
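A hedged usage sketch, reusing the heredoc style from the question (field1/field2/your_table are the placeholders from the query above):
# -B selects batch mode; -N drops the header line, leaving plain CSV rows.
mysql -u uid -ppwd -D dbname -B -N <<'EOQ' > out.csv
select concat_ws(',',
    concat('"', replace(field1, '"', '""'), '"'),
    concat('"', replace(field2, '"', '""'), '"'))
from your_table;
EOQ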



Here's what I do:
echo $QUERY | \
  mysql -B  $MYSQL_OPTS | \
  perl -F"\t" -lane 'print join ",", map {s/"/""/g; /^[\d.]+$/ ? $_ : qq("$_")} @F ' | \
  mail -s 'report' person@address
The perl script (snipped from elsewhere) does a nice job of converting the tab-separated fields to CSV.



Not exactly CSV format, but the tee command from the MySQL client can be used to save the output into a local file:
tee foobar.txt
SELECT foo FROM bar;
You can disable it using notee.
The problem with SELECT … INTO OUTFILE … is that it requires permission to write files on the server.
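The client also accepts a --tee option on the command line; as far as I recall from the manual it only applies to interactive sessions, so this is a hedged sketch rather than a drop-in CSV exporter:
# Logs the whole interactive session (statements and results) to foobar.txt.
mysql -u uid -ppwd -D dbname --tee=foobar.txt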



Using the solution posted by Tim, I created this bash script to facilitate the process (the root password is requested, but you can easily modify the script to ask for any other user):
#!/bin/bash

# Usage: DATABASE TABLE [MYSQL EXTRA COMMANDS]
if [ "$1" == "" ] || [ "$2" == "" ]; then
    echo "Usage: $0 DATABASE TABLE [MYSQL EXTRA COMMANDS]"
    exit 1
fi

DBNAME=$1
TABLE=$2
FNAME=$1.$2.csv
MCOMM=$3

# Read the password without echoing it to the terminal.
echo "MySQL password:"
stty -echo
read PASS
stty echo

# Dump the table in batch (tab-separated) mode and convert it to quoted CSV.
mysql -uroot -p"$PASS" $MCOMM "$DBNAME" -B -e "SELECT * FROM $TABLE;" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > "$FNAME"
It will create a file named: database.table.csv
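For example, assuming you save the script as dump_table.sh (the file name is hypothetical) and make it executable:
chmod +x dump_table.sh
./dump_table.sh mydb students   # prompts for the password, then writes mydb.students.csv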



Try this code:
SELECT 'Column1', 'Column2', 'Column3', 'Column4', 'Column5'
UNION ALL
SELECT column1, column2,
column3 , column4, column5 FROM demo
INTO OUTFILE '/tmp/demo.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';

For more, see: http://dev.mysql.com/doc/refman/5.1/en/select-into.html



A tiny bash script for doing simple query-to-CSV dumps, inspired by https://stackoverflow.com/a/5395421/2841607.
#!/bin/bash

# $1 = query to execute
# $2 = outfile
# $3 = mysql database name
# $4 = mysql username

if [ -z "$1" ]; then
    echo "Query not given"
    exit 1
fi

if [ -z "$2" ]; then
    echo "Outfile not given"
    exit 1
fi

MYSQL_DB=""
MYSQL_USER="root"

if [ ! -z "$3" ]; then
    MYSQL_DB=$3
fi

if [ ! -z "$4" ]; then
    MYSQL_USER=$4
fi

if [ -z "$MYSQL_DB" ]; then
    echo "Database name not given"
    exit 1
fi

if [ -z "$MYSQL_USER" ]; then
    echo "Database user not given"
    exit 1
fi

mysql -u $MYSQL_USER -p -D $MYSQL_DB -B -s -e "$1" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > $2
echo "Written to $2"
