Importing JSON into MySQL
Asked Answered
S

8

16

The problem that I am facing is importing JSON into MySQL. I have searched the interwebs, I have searched Stack Overflow and probably even more sources, but could not find a single proper solution. Full disclosure here - I am not a PHP, SQL, or JSON professional.

What I am trying to accomplish is fairly simple: import this JSON file https://developers.facebook.com/tools/explorer/method=GET&path=245226895508982%2Ffeed%3Faccess_token%3D%3Caccess_token%3E= into a MySQL database. I understand that I need to somehow map the JSON data onto columns and then populate the rows accordingly with the content. I do not need all of the data, but most of it. The data should later be echoed back through a search query from a user.

My question seems fairly simple - how can I do it?

P.S.: I tried converting it into XML, but I do not have sufficient rights to import the XML into the SQL database. The converter also did not do the best job.

Sodden answered 28/6, 2013 at 14:39 Comment(7)
What code do you have so far? Where are you stuck? This task involves multiple steps, so it's hard to help without knowing where you're at with your code. Do you have the database set up already? Do you know how to read a JSON file and parse it with PHP? Are you having trouble with the database query itself? This question is only fairly simple once you understand all these things. If you don't, it won't be easy and will require some learning.Catt
It feels like you're trying to find a solution without actually understanding how things work. Why are you trying to insert into a database before understanding databases as a whole?Melba
@DanyCaissy thanks for the quick reply. I understand the basics of databases - by all means I do not have extensive knowledge of them. What code I have so far: I used the code provided here #12165956, but that did not get me very far. Where am I stuck: basically how to tackle the problem - so at the very beginning. Do I have a database already: yes. Do I know how to read a JSON file and parse it with PHP: probably not.Sodden
Well, you need to decide whether you want a different column for each JSON attribute, or if you want everything in the same column. You have to decide what you want to do with the data, we can't do that for you.Melba
{ "id": "24522982_585115708186764", "from": { "name": "Ma Mei", "id": "100005547752625" }, "to": { "data": [ { "name": "Wohnung/WG in München gesucht!", "id": "245226895508982" } ] }, "message": "Hallo,\n\nsuchen eine 3 oder 4 Zimmer Wohnung, die als WG für 3 oder 4 Personen, je nachdem wieviele Zimmer, geeignet ist. Ab sofort. Können jederzeit eine Besichtigung wahrnehmen. 3 Studenten, 1 Arbeiter. Freue mich über jeden Hinweis." }Sodden
the columns should be ID, NAME, MESSAGESodden
OK. Step by step. First, understand your JSON; then understand MySQL, which is a relational database. A relational database can only have rows and columns. What are the column names? Will the JSON above be only one row in the database? I guess so. Since to.data is an array, it could be another table - it may well have more than one item in future JSON data. So you need a mapping between the JSON and the database. After solving those two questions, you can go on to parse the JSON and insert it into the database with code.Loupe
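
A minimal sketch of such a mapping, assuming the three columns named in the comments above (all table and column names here are made up for illustration):

-- One row per feed post
CREATE TABLE posts (
    id      VARCHAR(64) PRIMARY KEY,  -- JSON "id"
    name    VARCHAR(255),             -- JSON "from.name"
    message TEXT                      -- JSON "message"
);

-- to.data is an array, so it maps to a second table, one row per recipient
CREATE TABLE post_recipients (
    post_id        VARCHAR(64),   -- references posts.id
    recipient_id   VARCHAR(64),   -- JSON "to.data[*].id"
    recipient_name VARCHAR(255)   -- JSON "to.data[*].name"
);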
C
14

You can export your JSON to CSV: http://www.danmandle.com/blog/json-to-csv-conversion-utility/ or https://github.com/danmandle/JSON2CSV

Then:

LOAD DATA INFILE 'filepath/your_csv_file.csv' INTO TABLE tablename;

It should be okay for a one-shot.

More info on LOAD DATA INFILE.
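
The one-liner above relies on the default tab-separated format. If the converter produces a comma-separated file with a header row and quoted values, a more explicit variant would look something like this (the file name and column list are assumptions, not from the original post):

LOAD DATA INFILE 'filepath/your_csv_file.csv'
  INTO TABLE tablename
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
  IGNORE 1 LINES
  (id, name, message);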

Cawthon answered 28/6, 2013 at 14:50 Comment(5)
Thanks! I tried that one too. This is what is happening: Warning: array_keys(): The first argument should be an array in /home/mandledp/danmandle.com/projects/JSON2CSV/json2csv.class.php on line 20; Warning: Invalid argument supplied for foreach() in /home/mandledp/danmandle.com/projects/JSON2CSV/json2csv.class.php on line 20 ... I found this to be an excellent solution: json-csv.com/Default.aspx. However they charge 4 USD a copy, which is obviously not feasible.Sodden
Show us the first lines of your JSON file.Cawthon
{ "id": "245226895508982_585115708186764", "from": { "name": "Ma Mei", "id": "100005547752625" }, "to": { "data": [ { "name": "Wohnung/WG in München gesucht!", "id": "245226895508982" } ] }, "message": "Hallo,\n\nsuchen eine 3 oder 4 Zimmer Wohnung", "privacy": { "value": "" }, "type": "status", "created_time": "2013-06-26T21:44:27+0000", "updated_time": "2013-06-26T21:44:27+0000" }Sodden
Ok, so I think that with this JSON file, it won't be easy to insert the data into your database. It is not a flat array of {key1: value1, key2: value2} cells. You'll have to extract, transform and then load your data ==> search for ETL on Google to see if it can help you (Pentaho is a free ETL tool)Cawthon
JSON2CSV failed for me with the error "JSON is either invalid or has nested elements.", but it was just a plain flat JSON object with key-values. I found out that Excel (for Office 365) can convert JSON to CSV as well. I ran the command as suggested, but found out the file has to be in the directory given by SHOW VARIABLES LIKE "secure_file_priv";. Then lastly, the file should not have been comma-separated but tab-separated (some users reported even semicolon-separated).Hague
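
Putting the points from that last comment together, a quick check-and-load sequence might look like this (the file name is a placeholder; adjust the field terminator to whatever your converter actually produced):

-- Find the directory the server is allowed to read files from
SHOW VARIABLES LIKE "secure_file_priv";

-- Load a tab-separated export placed in that directory
LOAD DATA INFILE '/var/lib/mysql-files/your_file.tsv'
  INTO TABLE tablename
  FIELDS TERMINATED BY '\t';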
S
9

Like others have said, you are going to have to do a little bit of conversion to do what you're asking. Fortunately, it's not too difficult to have PHP iterate through the data and do most of the work for you. Here's a quick-and-dirty attempt:

// MySQL table's name
$tableName = 'my_table';
// Get JSON file and decode contents into PHP arrays/values
$jsonFile = '/path/to/file.json';
$jsonData = json_decode(file_get_contents($jsonFile), true);

// Iterate through JSON and build INSERT statements
foreach ($jsonData as $id=>$row) {
    $insertPairs = array();
    foreach ($row as $key=>$val) {
        // Nested objects/arrays (e.g. "from", "to") aren't scalars; re-encode
        // them as JSON strings so addslashes() gets something it can handle
        if (is_array($val)) {
            $val = json_encode($val);
        }
        $insertPairs[addslashes($key)] = addslashes($val);
    }
    $insertKeys = '`' . implode('`,`', array_keys($insertPairs)) . '`';
    $insertVals = '"' . implode('","', array_values($insertPairs)) . '"';

    echo "INSERT INTO `{$tableName}` ({$insertKeys}) VALUES ({$insertVals});" . "\n";
}

This will spit out a series of INSERT INTO my_table (...) VALUES (...); SQL statements that you can save to a .sql file and import into MySQL. With this method each INSERT can have different columns without causing problems.

Note: addslashes() isn't a very secure way of escaping, and something like mysqli_real_escape_string() would be preferable, but that requires a DB connection; this works without one, assuming your JSON data is trustworthy.
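
For the sample post quoted in the comments above, one emitted statement would look roughly like this (shown wrapped for readability; my_table and the JSON-encoded nested from value are illustrative, not verified output):

INSERT INTO `my_table` (`id`,`from`,`message`)
VALUES ("245226895508982_585115708186764",
        "{\"name\":\"Ma Mei\",\"id\":\"100005547752625\"}",
        "Hallo, suchen eine 3 oder 4 Zimmer Wohnung ...");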

Selden answered 5/11, 2015 at 20:17 Comment(1)
Add this to your code; it will insert NULL for empty values: $insertVals = str_replace("''", 'NULL', $insertVals);Bonspiel
C
6

Using MySQL Shell 8.0.13, you can import JSON documents straight into MySQL Server using the util.importJson shell API function or the --import command-line option. For example, to import from a file:

mysqlsh user@host:port/database --import /tmp/facebook.json collection_name

or pipe the JSON document to stdin:

cat /tmp/facebook.json | mysqlsh user@host:port/database --import - collection_name

where collection_name is the target collection.

More about MySQL Shell's JSON Import utility: https://dev.mysql.com/doc/mysql-shell/8.0/en/mysql-shell-utilities-json.html

More about document store: https://dev.mysql.com/doc/refman/8.0/en/document-store.html

More about documents and collections: https://dev.mysql.com/doc/refman/8.0/en/mysql-shell-tutorial-javascript-documents-collections.html
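
Under the hood, a document-store collection is just an InnoDB table with a JSON doc column, so once the feed is imported you can also query it with plain SQL. A sketch, assuming the field paths from the sample JSON in the question's comments:

SELECT doc->>'$.id'      AS id,
       doc->>'$.message' AS message
FROM collection_name
WHERE doc->>'$.message' LIKE '%Wohnung%';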

Crucifix answered 22/10, 2018 at 16:18 Comment(2)
Too bad X Dev protocol doesn't work on RDS.Acanthaceous
If your input file has one JSON document per line, you can try util.importTable with the dialect option set to json.Crucifix
C
1

The following was tested and works on MySQL 8.0.18. I expect it to work on any version that supports the JSON data type and the LOAD DATA statement.

The JSON file can be loaded using a LOAD DATA statement from any SQL client, assuming the user has the proper permission to run the statement and proper access to the file.

create table jsonTableName( jsonColumnName json );

load data infile '/var/lib/mysql-files/someFile.json'
  into table jsonTableName
  fields terminated by '\0' escaped by ''
  lines terminated by '\0';

fields terminated by '\0' and lines terminated by '\0' disable field and line parsing, so the entire file is treated as a single cell.

escaped by '' stops the loader from interpreting \ itself, allowing \" to be sent directly to the JSON parser. Without it, any escape sequence would be loaded improperly, potentially causing an exception.

Please use LOAD DATA LOCAL INFILE (as @Vijay suggested) when the JSON file is located on your local machine and the MySQL server is not (i.e. for remote transfers). Keep in mind that LOCAL works only if both the server and your client have been configured to permit it. Use select @@local_infile; to check the server setting (0 means not allowed, 1 means allowed). For the MySQL Workbench client, add OPT_LOCAL_INFILE=1 under Connection -> Advanced -> Others. For the IntelliJ/DataGrip client, set allowLoadLocalInfile to true in the Advanced connection settings.
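
Once loaded, the document can be queried with MySQL's JSON path operators. For instance, picking fields out of the sample feed post would look something like this (the paths are assumed from the JSON shown in the question's comments):

SELECT jsonColumnName->>'$.id'        AS id,
       jsonColumnName->>'$.from.name' AS from_name,
       jsonColumnName->>'$.message'   AS message
FROM jsonTableName;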

Crt answered 25/11, 2019 at 21:49 Comment(2)
This works and helps to store the data as a JSON data type column... The command has to start with LOAD DATA LOCAL INFILE.Ileanaileane
You are correct, @Vijay. While testing, the JSON file was on the same machine as the MySQL server, hence LOCAL was not required. Moreover, LOCAL is a rather messy option with a number of potential pitfalls, which made me want to steer clear of it. It is a very useful option though, so I added a paragraph with a bit more info about it.Crt
G
1

I got the idea from a video on YouTube about importing JSON into MySQL: Json_import_into_MySQL. So you can use MySQL Workbench to import the JSON file into a table.

  1. Create a table with the columns that would hold the dump file data.

Update

It is not necessary to have a table already created beforehand. Workbench will create it from the dump file.

  1. Open Workbench.
  2. Select the Schema tab on the left. (I am using MySQL Workbench 8.0 on Ubuntu.) If the sidebar is not visible, show it from View -> Panels -> Show Sidebar.

Only if you want to populate an already existing table:

  3. Select the schema -> table into which you want to import the dump file.
  4. Right-click the table into which you want to import.
  5. Then select Table Data Import Wizard (the 5th one from the top).

  6. Click, or paste the path of, the file you want to import. Remember to change the file type from csv to json, in the bottom right.

If no table is selected, enter a table name, or Workbench will use the dump file name as the table name.

  7. Workbench will automatically match the column headers with the table fields.
  8. Click Finish to complete the import.

Please remember to change the column types to whatever is relevant, because Workbench will default to TEXT. First find the maximum character length in each field and then change the type.

For example:

SELECT 
  max(length(province)) `province`,  
  max(length(cnic)) `cnic`,  
  max(length(`name`)) `name`,  
  max(length(district)) `district`,  
  max(length(fatherName)) `fatherName`
from tab1;

ALTER TABLE tab1
  CHANGE province province CHAR(16), 
  CHANGE cnic cnic CHAR(20), 
  CHANGE `name` `name` CHAR(70), 
  CHANGE district district CHAR(34), 
  CHANGE fatherName fatherName CHAR(43);
Gastritis answered 2/10, 2020 at 9:47 Comment(0)
H
1

This short python script should solve the problem:

import pandas as pd
from sqlalchemy import create_engine

# Connect to the target MySQL database (replace <user>, <password> and myDB)
engine = create_engine('mysql+pymysql://<user>:<password>@localhost/myDB')

# Read the JSON file into a DataFrame, then write it to a new MySQL table
df = pd.read_json("mydataset.json")
df.to_sql("newTable", con=engine, if_exists='replace', index=False)

Field types will be allocated appropriately upon table creation, but they can also be changed once the data is loaded in, through ordinary SQL queries (see the sketch after the install commands below). To support the script you will need the pandas, sqlalchemy and pymysql modules, which can easily be installed by running the following commands:

pip install pandas
pip install sqlalchemy
pip install pymysql
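
As a sketch of the post-load type cleanup mentioned above (newTable is the table the script creates; the message column is an assumed example):

DESCRIBE newTable;                     -- inspect the types pandas/SQLAlchemy inferred
ALTER TABLE newTable
  MODIFY COLUMN message VARCHAR(500);  -- tighten a column that came out too loose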

If your server does not require a password, simply remove the :<password> section from the connection string. Hope this helps.

Herl answered 7/8, 2022 at 10:44 Comment(0)
D
0

I know this is an old post, but maybe it can help someone. I wanted to create the same database in MySQL that I have in MongoDB.

I had a valid JSON file with data (not structure), so I created a table in MySQL with the appropriate structure (according to the JSON data) and imported it through MySQL Workbench.

Here is the example with CSV, but you can load your JSON file the same way. Just follow the wizard for importing data into a table and make sure the structure is valid, otherwise nothing will be imported.

Differentia answered 11/7, 2016 at 9:20 Comment(0)
I
0

I arrived at this problem because I was trying to do the same thing: import a JSON object into MySQL so that I could query it.

Perhaps I have simplified the original question (I can't actually see the desired input file), but my solution was to simply insert the JSON data as one single JSON object. Then I was able to use MySQL's JSON functions to expand the JSON object directly in MySQL and complete the task.

In Python I created an endpoint class that:

  1. Took in a JSON object
  2. "Dumped" it into a variable called jsonBOM
  3. Ran a query to insert the JSON object into an existing table.

Once I had it in an SQL table, I could use all the regular JSON functions to extract the data into a table. (I've given my endpoint class below for reference, but everyone creates endpoints differently and I have not included all the helper functions; I just wanted to show how the query is wrapped within Python.)

class Allocation(Resource):
    def post(self):
        response = {}
        items = {}

        try:
            conn = connect()

            # Take in the JSON object and dump it into a string variable
            allocation_data = request.get_json(force=True)
            jsonBOM = json.dumps(allocation_data)

            new_allocation_uid = get_new_allocationUID(conn)
            today = getNow()

            # Run query to enter the JSON object into the allocation table.
            # (String concatenation like this is injection-prone; a
            # parameterized query would be safer in production.)
            addAllocation = """
                INSERT INTO pmctb.allocation
                SET allocation_uid =  '""" + new_allocation_uid + """',
                    allocation_date = '""" + today + """',
                    allocation_json = '""" + jsonBOM + """';
                """

            items = execute(addAllocation, "post", conn)
            response['allocation'] = "Successful"
            return response

        except:
            raise BadRequest('Add Part failed')
        finally:
            disconnect(conn)

Now I could POST a JSON input like this:

[
{"product_uid":"310-000594", "inv_uid":"870-000016", "assembly":"C", "allocated":3}, 
{"product_uid":"310-000594", "inv_uid":"870-000016", "assembly":"C", "allocated":2}
] 

And then expand the JSON object into rows and columns with a query like this (again, this code is just to show that the JSON object can be read as a table):

SELECT BOM.*
FROM pmctb.allocation p,
     JSON_TABLE (p.allocation_json, '$[*]'
         COLUMNS (
             Allocation_id FOR ORDINALITY,
             Allocation_product_uid VARCHAR(255) PATH '$.product_uid',
             Allocation_inventory_uid VARCHAR(255) PATH '$.inv_uid',
             Allocation_assy_name VARCHAR(255) PATH '$.assembly',
             Allocation_allocated_qty INT PATH '$.allocated'
         )
     ) AS BOM;

I hope that helps someone!

Interloper answered 25/8, 2022 at 5:54 Comment(0)
