Include headers when using SELECT INTO OUTFILE?
E

25

148

Is it possible to somehow include the column headers when using MySQL's SELECT ... INTO OUTFILE?

Exemplificative answered 9/5, 2011 at 19:55 Comment(0)
C
199

You'd have to hard code those headers yourself. Something like:

SELECT 'ColName1', 'ColName2', 'ColName3'
UNION ALL
SELECT ColName1, ColName2, ColName3
    FROM YourTable
    INTO OUTFILE '/path/outfile'
Chang answered 9/5, 2011 at 20:5 Comment(9)
But it will not work if there is an ORDER BY in the SELECT: the header line can end up anywhere in the generated file, depending on the sort order. – Mendive
See a few answers below for ideas on using ORDER BY, and also matt's answer for a way to quickly get all the ColName1, ColName2, etc. Very useful add-ons to this great answer! – Nougat
This answer seems about right; hell, I even used it almost blindly on my dev server... Without column headers, it takes about 50 seconds to dump 240 million lines. With this UNION ALL, the server gets into big trouble trying to build a temporary table before dumping everything; it has been over 10 minutes now and I'm still waiting for the temporary table to get written to disk! Be aware of that! You'd certainly prefer adding the column names another way, even if it means opening the file afterwards with another programming language. – Advocate
This only appears to work if all the columns of YourTable are a character data type, which does make sense. Otherwise you get the unhelpful error: "The used SELECT statements have a different number of columns". – Subsidence
This is one of the reasons why I'm considering switching to another DBMS. – Wilks
If my query has a SUM() in it, this makes the entire column incorrect. – Atalayah
Actually, I was able to get around that issue of it not working with SUM() by doing something like: SELECT * FROM (SELECT ColName1, ColName2, SUM(ColName3)) t. – Atalayah
@donaldwagner's answer is more generic, because it is compatible with complex selections with JOIN, GROUP BY and ORDER BY clauses. – Singhalese
If you have a huge table, you may run into this: Errcode: 28 "No space left on device". You can literally watch your disk space go down at a fast pace, in GBs ("df -h /tmp"). Without "UNION ALL" there is no issue, so the only solution that works for me is to add the headers afterwards with a bash command ("sed -i '1s/^/your text\n/' file"). – Newish
C
93

The solution provided by Joe Steanelli works, but making a list of columns by hand is inconvenient when dozens or hundreds of columns are involved. Here's how to get the column list of table my_table in my_schema:

-- override the GROUP_CONCAT limit of 1024 characters to avoid a truncated result
SET SESSION group_concat_max_len = 1000000;

SELECT GROUP_CONCAT(CONCAT("'", COLUMN_NAME, "'"))
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'my_table'
AND TABLE_SCHEMA = 'my_schema'
ORDER BY ORDINAL_POSITION

Now you can copy & paste the resulting row as the first SELECT in Joe's method.
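
For example, suppose the query above returns 'id','name','email' for a hypothetical table my_table; pasted into Joe's method, the full statement becomes:

SELECT 'id', 'name', 'email'
UNION ALL
SELECT id, name, email
    FROM my_table
    INTO OUTFILE '/path/outfile'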

Cappuccino answered 28/11, 2012 at 11:13 Comment(10)
This returns all the columns concatenated together in a single field. I'm unable to union that with another select statement that returns multiple fields. It is really useful for getting a line I can copy and paste as the header of my outfile, though. – Bartels
The idea is to copy the resulting single field and paste it into your UNION statement, instead of manually typing column1, column2, etc. when using Joe's method (the accepted answer), so you can get the list of columns quicker! – Nougat
I was looking to export my whole table/field schema. This answer combined with the accepted one did the trick! – Hammered
@Chris ORDER BY ORDINAL_POSITION handles that. – Cappuccino
You are right, matt, it works. I thought that what I wanted was my table's columns, but in fact I want the order of the indexes, as that is how my data is being presented. How would you modify your code to get the indexes instead? – Ethical
@Ethical I think this deserves to be answered in another question, not in comments. We have already steered far enough from the original question. – Cappuccino
ORDER BY ORDINAL_POSITION should be part of the GROUP_CONCAT() call, as in GROUP_CONCAT(CONCAT('"',COLUMN_NAME,'"') ORDER BY ORDINAL_POSITION). – Manwell
@Cappuccino What does "Now you can copy & paste the resulting row as the first SELECT in Joe's method" mean in terms of code? – Tollmann
Could this work if I have a JOIN in my query? – Caballero
@Tollmann See Andrew's comment. – Stcyr
S
17

For a complex SELECT with an ORDER BY, I use the following:

SELECT * FROM (
    SELECT 'Column name #1', 'Column name #2', 'Column name ##'
    UNION ALL
    (
        -- complex SELECT statement with WHERE, ORDER BY, GROUP BY etc.
    )
) resulting_set
INTO OUTFILE '/path/to/file';
Shive answered 8/8, 2014 at 14:25 Comment(2)
This solution works well when ordering the second (complex) query; if you don't do it this way, you will end up sorting the header row along with the data, which is undesirable. Nice suggestion, @evilguc! – Micra
Did not work for me: after doing UNION ALL, the order of the id column is messed up. – Metrology
D
12

This will allow you to have ordered columns and/or a limit:

SELECT 'ColName1', 'ColName2', 'ColName3'
UNION ALL
SELECT * from (SELECT ColName1, ColName2, ColName3
    FROM YourTable order by ColName1 limit 3) a
    INTO OUTFILE '/path/outfile';
Descender answered 1/9, 2016 at 17:58 Comment(4)
Just a note that this query layout worked for me in MariaDB 10.1 as well; other layouts suggested in this thread did not. – Gravid
Pops the header onto the bottom for some reason, but it works fine to pop it back on top in a spreadsheet app. Strange, but cheers! – Marylnmarylou
I'm stuck using MariaDB as well. With an extraction of just 100 leads, this was over 14 seconds faster than running the accepted answer. First pass, with the accepted answer: Query OK, 100 rows affected (14.72 sec). Second pass, with yours: Query OK, 101 rows affected (0.00 sec). – Upthrow
This is the most generic solution, because it is compatible with complex selections with JOIN, GROUP BY and ORDER BY clauses. It works well and performs well on my data. – Singhalese
S
9

You can use a prepared statement with lucek's answer to dynamically export a table, with column names, to CSV:

-- If your table has too many columns
SET GLOBAL group_concat_max_len = 100000000;

-- Prepared statement
SET @SQL = (
  SELECT CONCAT(
    'SELECT * INTO OUTFILE \'YOUR_PATH\' FIELDS TERMINATED BY \',\' OPTIONALLY ENCLOSED BY \'"\' ESCAPED BY \'\' LINES TERMINATED BY \'\\n\' FROM (SELECT ',
    GROUP_CONCAT(CONCAT("'", COLUMN_NAME, "'")),
    ' UNION SELECT * FROM YOUR_TABLE) AS tmp'
  )
  FROM INFORMATION_SCHEMA.COLUMNS
  WHERE TABLE_NAME = 'YOUR_TABLE' AND TABLE_SCHEMA = 'YOUR_SCHEMA'
  ORDER BY ORDINAL_POSITION
);

-- Execute it
PREPARE stmt FROM @SQL;
EXECUTE stmt;

Thanks, lucek.

Supplementary answered 3/10, 2013 at 10:3 Comment(0)
S
7

I simply run two queries: the first gets the query output with column names (LIMIT 1, no hardcoding, no problems with joins, ORDER BY, custom column names, etc.), and the second runs the query itself; then I combine the two files into one CSV file:

# Get the column names: run the query with LIMIT 1, keep the header line,
# and turn the tab-separated names into a quoted, ';'-separated list
CSVHEAD=`/usr/bin/mysql $CONNECTION_STRING -e "$QUERY limit 1;"|head -n1|xargs|sed -e "s/ /'\;'/g"`
echo "'$CSVHEAD'" > $TMP/head.txt
# Export the data itself
/usr/bin/mysql $CONNECTION_STRING -e "$QUERY into outfile '${TMP}/data.txt' fields terminated by ';' optionally enclosed by '\"' escaped by '' lines terminated by '\r\n';"
# Combine header and data into a single CSV file
cat $TMP/head.txt $TMP/data.txt > $TMP/data.csv
Sextant answered 23/3, 2017 at 12:7 Comment(0)
S
6

I faced a similar problem while executing MySQL queries on large tables in Node.js. The approach I followed to include headers in my CSV file is as follows:

  1. Use OUTFILE query to prepare file without headers

        SELECT * INTO OUTFILE [FILE_NAME] FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED
        BY '"' LINES TERMINATED BY '\n' FROM [TABLE_NAME]
    
  2. Fetch column headers for the table used in point 1

        SELECT GROUP_CONCAT(COLUMN_NAME) AS col_names FROM
        INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = [TABLE_NAME] AND TABLE_SCHEMA
        = [DATABASE_NAME] ORDER BY ORDINAL_POSITION
    
  3. Append the column headers to the file created in step 1 using prepend-file npm package

Execution of each step was controlled using promises in NodeJS.

Sines answered 4/9, 2018 at 6:51 Comment(0)
E
5

This is an alternative cheat if you are familiar with Python or R, and your table can fit into memory.

Import the SQL table into Python or R and then export from there as a CSV and you'll get the column names as well as the data.

Here's how I do it using R; it requires the RMySQL library:

library(RMySQL)

db <- dbConnect(MySQL(), user='user', password='password', dbname='myschema', host='localhost')

query <- dbSendQuery(db, "select * from mytable")
dataset <- fetch(query, n=-1)

write.csv(dataset, 'mytable_backup.csv')

It's a bit of a cheat, but I found it a quick workaround when my number of columns was too large for the concat method above. Note: R will add a 'row.names' column at the start of the CSV, so you'll want to drop that (or pass row.names = FALSE to write.csv) if you need to rely on the CSV to recreate the table.

Ers answered 25/7, 2015 at 11:51 Comment(1)
In the end, this is one of the best approaches. SQL overly complicates things that should be way simpler. – Latt
A
3

The easiest way is to hard code the columns yourself to better control the output file:

SELECT 'ColName1', 'ColName2', 'ColName3'
UNION ALL
SELECT ColName1, ColName2, ColName3
    FROM YourTable
    INTO OUTFILE '/path/outfile'
Acquittance answered 1/1, 2021 at 10:12 Comment(0)
K
2

I think if you use a UNION it will work:

select 'header 1', 'header 2', ...
union
select col1, col2, ... from ...

I don't know of a way to specify the headers with the INTO OUTFILE syntax directly.

Kizzykjersti answered 9/5, 2011 at 20:7 Comment(1)
UNION ALL would be safer and faster. – Bottrop
C
2

Since the 'include headers' functionality doesn't seem to be built in yet, and most "solutions" here require typing the column names manually and/or don't even take joins into account, I'd recommend getting around the problem:

  • The best alternative I have found so far is using a decent tool (I use HeidiSQL).
    Put in your request, select the grid, then just right-click and export to a file. It has all the necessary options for a clean export and should handle most needs.

  • Along the same lines, user3037511's approach works fine and can be automated easily.
    Just launch your request from the command line to get your headers. You can get the data with a SELECT INTO OUTFILE... or by running your query without the limit; yours to choose.

    Note that output redirection to a file works like a charm on both Linux AND Windows.


This makes me want to highlight that 80% of the time, when I want to use SELECT FROM INFILE or SELECT INTO OUTFILE, I end up using something else due to some limitation (here, the absence of a 'headers' option; on an AWS RDS, the missing rights; and so on).

Hence, I don't exactly answer the OP's question... but it should answer his needs :)
EDIT: and to actually answer his question: no.
As of 2017-09-07, you just can't include headers if you stick with the SELECT INTO OUTFILE command.
:|

Candelabrum answered 7/9, 2017 at 9:58 Comment(1)
phpMyAdmin is also a good alternative. It exports the table to an outfile (in different formats) and also includes the header at the top. But if you don't want to use third-party software instead of a SQL command (mainly because it takes more time to use external software than a SQL command), then @Cappuccino's answer is the best. – Regina
M
1

Actually, you can make it work even with an ORDER BY.

It just needs some trickery in the ORDER BY clause: we use a CASE statement and replace the header value with some other value that is guaranteed to sort first in the list (obviously this depends on the type of the field and on whether you are sorting ASC or DESC).

Let's say you have three fields, name (varchar), is_active (bool) and date_something_happens (date), and you want to sort the last two descending:

select
        'name'
      , 'is_active' as is_active
      , 'date_something_happens' as date_something_happens

 union all

 select name, is_active, date_something_happens

 from
    my_table

 order by
     (case is_active when 'is_active' then 0 else is_active end) desc
   , (case date_something_happens when 'date_something_happens' then '9999-12-30' else date_something_happens end) desc
Mccreary answered 30/7, 2014 at 14:52 Comment(0)
A
1

Here is a way to get the header titles from the column names dynamically.

/* Change table_name and database_name */
SET @table_name = 'table_name';
SET @table_schema = 'database_name';
SET @default_group_concat_max_len = (SELECT @@group_concat_max_len);

/* Sets Group Concat Max Limit larger for tables with a lot of columns */
SET SESSION group_concat_max_len = 1000000;

SET @col_names = (
  SELECT GROUP_CONCAT(QUOTE(`column_name`)) AS columns
  FROM information_schema.columns
  WHERE table_schema = @table_schema
  AND table_name = @table_name);

SET @cols = CONCAT('(SELECT ', @col_names, ')');

SET @query = CONCAT('(SELECT * FROM ', @table_schema, '.', @table_name,
  ' INTO OUTFILE \'/tmp/your_csv_file.csv\'
  FIELDS ENCLOSED BY \'\\\'\' TERMINATED BY \'\t\' ESCAPED BY \'\'
  LINES TERMINATED BY \'\n\')');

/* Concatenates column names to query */
SET @sql = CONCAT(@cols, ' UNION ALL ', @query);

/* Resets Group Contact Max Limit back to original value */
SET SESSION group_concat_max_len = @default_group_concat_max_len;

PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
Ashleeashleigh answered 2/11, 2015 at 20:28 Comment(1)
Quite a solution, I should say. I adopted it for my purposes. Thanks! – Tabulator
B
1

An example from my database: table sensor, with columns (id, time, unit):

select ('id') as id, ('time') as time, ('unit') as unit
UNION ALL
SELECT * INTO OUTFILE 'C:/Users/User/Downloads/data.csv'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
  FROM sensor
Barite answered 5/9, 2018 at 6:21 Comment(0)
Z
1

If you are using MySQL Workbench:

  1. Select all the columns from the SCHEMAS tab -> Right Click -> Copy to Clipboard -> Name

  2. Paste it into any text editor and replace " ` " with " ' "

  3. Copy it back and use it in your UNION query (as mentioned in the accepted answer):

    SELECT [Paste your text here]
    UNION ALL
    SELECT *
    FROM table_name
    INTO OUTFILE 'file_path'
    
Zak answered 27/8, 2021 at 23:39 Comment(0)
I
1

Inspired by the pivot table example from Rick James.

SET @CSVTABLE = 'myTableName',
    @CSVBASE = 'databaseName',
    @CSVFILE = '/tmp/filename.csv';

SET @sql = (SELECT CONCAT("SELECT ", GROUP_CONCAT(CONCAT('"', COLUMN_NAME, '"')), " UNION SELECT * FROM ", @CSVBASE, ".", @CSVTABLE) FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME=@CSVTABLE AND TABLE_SCHEMA=@CSVBASE);
SET @sql = CONCAT(@sql, " INTO OUTFILE '", @CSVFILE, "' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' LINES TERMINATED BY '\\n';");
prepare stmt from @sql;
execute stmt;

It gets the list of columns from the INFORMATION_SCHEMA.COLUMNS table and uses GROUP_CONCAT to prepare a SELECT statement whose select list is the column names as strings.

Next, a UNION is added with SELECT * FROM the specified database.table; this creates query text that will output both the column names and the column values in the result.

Finally, the CSV-specific output clauses are appended to the previously created query text (stored in the @sql variable), the statement is prepared from it, and it is executed with execute stmt.
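
For illustration, here is a sketch of the statement this generates, assuming a hypothetical table mydb.sensor with columns id and unit:

SELECT "id", "unit"
UNION
SELECT * FROM mydb.sensor
INTO OUTFILE '/tmp/filename.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';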

Impute answered 14/12, 2022 at 21:55 Comment(0)
G
0

I was writing my code in PHP, and I had a bit of trouble using the CONCAT and UNION functions; I also did not use SQL variables. Anyway, I got it to work. Here is my code:

// First I connected to the information_schema DB

$headercon=mysqli_connect("localhost", "USERNAME", "PASSWORD", "information_schema");

// Took the headers out into a string (I could not get the CONCAT function to work, so I wrote a loop for it)

    $headers = '';
    $sql = "SELECT column_name AS columns FROM `COLUMNS` WHERE table_schema = 'YOUR_DB_NAME' AND table_name = 'YOUR_TABLE_NAME'";
    $result = $headercon->query($sql);
    while($row = $result->fetch_row())
    {
        $headers = $headers . "'" . $row[0] . "', ";
    }
$headers = substr("$headers", 0, -2);

// connect to the DB of interest

$con=mysqli_connect("localhost", "USERNAME", "PASSWORD", "YOUR_DB_NAME");

// export the results to csv
$sql4 = "SELECT $headers UNION SELECT * FROM YOUR_TABLE_NAME WHERE ... INTO OUTFILE '/output.csv' FIELDS TERMINATED BY ','";
$result4 = $con->query($sql4);
Gumdrop answered 12/12, 2015 at 11:53 Comment(0)
S
0

So, if all the columns in my_table are a character data type, we can combine the top answers (by Joe, matt and evilguc) to get the header added automatically in one 'simple' SQL query, e.g.

select * from (
  (select column_name
    from information_schema.columns
    where table_name = 'my_table'
    and table_schema = 'my_schema'
    order by ordinal_position)
  union all
  (select *  -- potentially complex SELECT statement with WHERE, ORDER BY, GROUP BY etc.
  from my_table)) as tbl
into outfile '/path/outfile'
fields terminated by ',' optionally enclosed by '"' escaped by '\\'
lines terminated by '\n';

where the last couple of lines make the output csv.

Note that this may be slow if my_table is very large.

Subsidence answered 8/7, 2016 at 10:58 Comment(2)
It gives the "The used SELECT statements have a different number of columns" error, sir. – Trisect
This answer is wrong, because the first SELECT from information_schema returns multiple records of one column (as many records as the table has fields, each containing a column name). So the column count of the first SELECT and that of the second will never be equal (except perhaps when the second SELECT returns only one column). – Regina
C
0

I would like to add to the answer provided by Sangam Belose. Here's his code:

select ('id') as id, ('time') as time, ('unit') as unit
UNION ALL
SELECT * INTO OUTFILE 'C:/Users/User/Downloads/data.csv'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
  FROM sensor

However, if you have not set up "secure_file_priv" among the variables, it may not work. In that case, check the folder set in that variable:

SHOW VARIABLES LIKE "secure_file_priv"

The output should look like this:

mysql> show variables like "%secure_file_priv%";
+------------------+------------------------------------------------+
| Variable_name    | Value                                          |
+------------------+------------------------------------------------+
| secure_file_priv | C:\ProgramData\MySQL\MySQL Server 8.0\Uploads\ |
+------------------+------------------------------------------------+
1 row in set, 1 warning (0.00 sec)

You can either change this variable or change the query so that it outputs the file to the default path shown.
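
For example, here is a sketch that reuses Sangam's sensor query with the default secure_file_priv path shown above:

select ('id') as id, ('time') as time, ('unit') as unit
UNION ALL
SELECT * INTO OUTFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/data.csv'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
  FROM sensor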

Calamity answered 16/11, 2018 at 8:54 Comment(0)
C
0

MySQL alone isn't enough to do this simply. Below is a PHP script that will output columns and data to CSV.

Enter your database name and tables near the top.

<?php

set_time_limit( 24192000 );
ini_set( 'memory_limit', '-1' );
setlocale( LC_CTYPE, 'en_US.UTF-8' );
mb_regex_encoding( 'UTF-8' );

$dbn = 'DB_NAME';
$tbls = array(
'TABLE1',
'TABLE2',
'TABLE3'
);

$db = new PDO( 'mysql:host=localhost;dbname=' . $dbn . ';charset=UTF8', 'root', 'pass' );

foreach( $tbls as $tbl )
{
    echo $tbl . "\n";
    $path = '/var/lib/mysql/' . $tbl . '.csv';

    $colStr = '';
    $cols = $db->query( 'SELECT COLUMN_NAME AS `column` FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = "' . $tbl . '" AND TABLE_SCHEMA = "' . $dbn . '"' )->fetchAll( PDO::FETCH_COLUMN );
    foreach( $cols as $col )
    {
        if( $colStr ) $colStr .= ', ';
        $colStr .= '"' . $col . '"';
    }

    $db->query(
    'SELECT *
    FROM
    (
        SELECT ' . $colStr . '
        UNION ALL
        SELECT * FROM ' . $tbl . '
    ) AS sub
    INTO OUTFILE "' . $path . '"
    FIELDS TERMINATED BY ","
    ENCLOSED BY "\""
    LINES TERMINATED BY "\n"'
    );

    exec( 'gzip ' . $path );

    print_r( $db->errorInfo() );
}

?>

You'll need this to be the directory you'd like to output to. MySQL needs to have the ability to write to the directory.

$path = '/var/lib/mysql/' . $tbl . '.csv';

You can edit the CSV export options in the query:

INTO OUTFILE "' . $path . '"
FIELDS TERMINATED BY ","
ENCLOSED BY "\""
LINES TERMINATED BY "\n"'

At the end there is an exec call to GZip the CSV.

Commodious answered 21/7, 2019 at 12:57 Comment(0)
D
0

I had no luck with any of these, so after finding a solution I wanted to add it to the prior answers. Python==3.8.6, MySQL==8.0.19.

(Forgive my lack of SO formatting foo. Somebody please clean up.)

Note a couple of things:

First, the query to return column names is unforgiving of punctuation. Using backticks, or leaving out the ' quotes around 'schema_name' and 'table_name', will throw an "unknown column" error.

WHERE TABLE_SCHEMA = 'schema' AND TABLE_NAME = 'table'

Second, the column header names are returned as a single-entity tuple, with all the column names concatenated in one quoted string. Converting it to a list was easy, but not intuitive (for me at least).

headers_list = headers_result[0].split(",")

Third, the cursor must be buffered, or the "lazy" thing will not fetch your results as you need them. For very large tables, memory could be an issue; perhaps chunking would solve that problem.

cur = db.cursor(buffered=True)

Last, all types of UNION attempts yielded errors for me. By zipping the whole mess into a list of dicts, it became trivial to write to a CSV using csv.DictWriter.

import csv
import mysql.connector  # mysql-connector-python

# Placeholder connection details -- substitute your own
db = mysql.connector.connect(host="localhost", user="user",
                             password="password", database="schema")

headers_sql = """
SELECT
GROUP_CONCAT(CONCAT(COLUMN_NAME) ORDER BY ORDINAL_POSITION)
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'schema' AND TABLE_NAME = 'table';
"""

cur = db.cursor(buffered=True)
cur.execute(headers_sql)
headers_result = cur.fetchone()
headers_list = headers_result[0].split(",")

rows_sql = """SELECT * FROM schema.table;"""

cur.execute(rows_sql)
data_rows = cur.fetchall()
data_as_list_of_dicts = [dict(zip(headers_list, row)) for row in data_rows]

csv_destination_file = 'output.csv'  # wherever you want the CSV written

with open(csv_destination_file, 'w', encoding='utf-8') as destination_file_opened:
    dict_writer = csv.DictWriter(destination_file_opened, fieldnames=headers_list)
    dict_writer.writeheader()
    for row_dict in data_as_list_of_dicts:
        dict_writer.writerow(row_dict)
Dandelion answered 2/11, 2020 at 0:42 Comment(0)
P
0

A solution using Python, with no need to install extra Python packages to read SQL files if you already use another tool. If you are not familiar with Python, you can run the Python code in a Colab notebook, where all the required packages are already installed. It automates matt's and Joe's solutions.

First, execute this SQL script to get a CSV with all the table names:

SELECT TABLE_NAME 
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_SCHEMA='your_schema'
INTO OUTFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/tables.csv';

Then move tables.csv to a suitable directory and execute this Python code, after replacing 'path_to_tables' and 'your_schema'. It will generate a SQL script to export all the tables' headers:

import pandas as pd
import os

tables = pd.read_csv('tables.csv',header = None)[0]
text_file = open("export_headers.sql", "w")
schema = 'your_schema'

sql_output_path = 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/'
for table in tables :
    path = os.path.join(sql_output_path,'{}_header.csv'.format(table))
    string = "(select GROUP_CONCAT(COLUMN_NAME)\nfrom INFORMATION_SCHEMA.COLUMNS\nWHERE TABLE_NAME = '{}'\nAND TABLE_SCHEMA = '{}'\norder BY ORDINAL_POSITION)\nINTO OUTFILE '{}';".format(table,schema,path)
    n = text_file.write(string)
    n = text_file.write('\n\n')
text_file.close()

Then execute this Python code, which will generate a SQL script to export the values of all tables:

text_file = open("export_values.sql", "w")
for table in tables :
    path = os.path.join(sql_output_path,'{}.csv'.format(table))
    string = "SELECT * FROM {}.{}\nINTO OUTFILE '{}';".format(schema,table,path)
    n = text_file.write(string)
    n = text_file.write('\n\n')
text_file.close()

Execute the two generated SQL scripts and move the header CSVs and the value CSVs into directories of your choice.

Then execute this last python code :

# Respectively: the path to the header CSVs, the path to the value CSVs, and the path where you want the combined CSVs (headers + values)
headers_path, values_path, tables_path = '', '', '' 

for table in tables :
    header = pd.read_csv(os.path.join(headers_path,'{}_header.csv'.format(table)))
    df = pd.read_csv(os.path.join(values_path,'{}.csv'.format(table)),names = header.columns,sep = '\t')
    df.to_csv(os.path.join(tables_path,'{}.csv'.format(table)),index = False)

Then you have all your tables exported to CSV with headers, without having to write or copy-paste all the table and column names.

Pegram answered 12/5, 2022 at 15:3 Comment(0)
S
0

You can always add the columns afterwards using Bash:

# Get the column names from the DB and replace the tabs with commas to make them CSV
columns=$(mysql -u root -D database_name -e "SELECT * from users LIMIT 1" | head -1 | tr -s '[:blank:]' ',')
# Add the columns on a new line at the beginning of the CSV file
sed -i "1s/^/$columns\n/" "$database_name.csv"

This might take a while for large CSV files, and it requires free space of double the size of the file so that sed can create its temporary working copy, but it does the job.

Stcyr answered 22/11, 2023 at 1:53 Comment(0)
N
0

You could use FORMAT TEXT HEADER, as per the documentation: https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Integrating.SaveIntoS3.html (note that this is part of Aurora MySQL's SELECT INTO OUTFILE S3 feature, not stock MySQL).

I use something like:

SELECT * FROM table INTO OUTFILE S3 's3://bucket/folder' FORMAT TEXT HEADER FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n' OVERWRITE ON

Try it and let me know.

Norbertonorbie answered 11/3 at 13:50 Comment(0)
T
-1
SELECT 'ColName1', 'ColName2', 'ColName3'
UNION ALL
SELECT ColName1, ColName2, ColName3
    FROM YourTable
    INTO OUTFILE 'c:\\datasheet.csv' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n' 
Tetrarch answered 9/12, 2014 at 15:33 Comment(0)
