MySQL 5.7.12 import cannot create a JSON value from a string with CHARACTER SET 'binary'

I exported my database with JSON columns in it. After I migrated to a new server, my import crashed every time with an error like:

cannot create a JSON value from a string with CHARACTER SET 'binary'

On Stack Overflow I found this post, but it didn't work for me: mysqlimport issues "set @@character_set_database=binary" which prevents loading json values

The file is 2 GB, so it isn't possible to open it in an editor.

Does anyone have an idea how to import my database file?

Milliliter answered 28/6, 2016 at 13:51 Comment(0)

You can apply a regex to the SQL text you exported to convert the binary strings into an insertable format. This was my quick-and-dirty fix when I faced this issue:

Find:    (X'[^,\)]*')
Replace: CONVERT($1 using utf8mb4)

Applying this regex means

INSERT INTO json_table (json_column) VALUES (X'7B22666F6F223A2022626172227D');

will now become

INSERT INTO json_table (json_column) VALUES (CONVERT(X'7B22666F6F223A2022626172227D' using utf8mb4));
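To sanity-check the conversion, you can decode one of the hex literals by hand; this reuses the value from the statement above and should print valid JSON:

SELECT CONVERT(X'7B22666F6F223A2022626172227D' USING utf8mb4) AS decoded;
-- returns: {"foo": "bar"}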
Billon answered 3/2, 2017 at 17:35 Comment(4)
For grins, here's the non-inplace sed version: sed -E "s/(X'[^,\)]*')/CONVERT(\1 using utf8mb4)/g" infile.sql > outfile.sql – Winser
The regex in Ray's comment did not work for me so I had to use the following regex: sed -E "s/(0x'[^,\)]*')/CONVERT(\1 using utf8mb4)/g" infile.sql > outfile.sql – Ungrounded
this works <3, how should I handle 40K rows? I can't use multiselect – Cosher
@Cosher use the sed version in Ray's comment to process the entire dump file at once. – Trella

I had this problem dealing with exports made by Sequel Pro. I unchecked the "Output BLOB fields as hex" option and the problem went away. Visually inspecting the export showed legible JSON instead of binary.

Switchman answered 28/3, 2018 at 21:1 Comment(3)
Tried this, but some fields had a quote. Sequel Pro export escapes this as \", but it should be \\". So basically: if you have quotes in your JSON data, you're out of luck with Sequel Pro. – Valerianaceous
That's what I needed. Thanks! – Signe
Thank you. I just wish Sequel Pro would get an update. – Streaky

I faced the same issue today. Below are the findings for my case.

I asked one of my friends to generate an SQL dump for me to import. He used Sequel Pro to generate the dump (export database). When I ran the import, it threw the error:

Cannot create a JSON value from a string with CHARACTER SET 'binary'

So there was an issue with the generated dump: all the JSON fields were converted to a raw hex format, i.e. instead of the value being

{"key1": "value1", "key2": "value2"}

it was

X'nfdsklsdsklnfjkbvkjsdbvkjhdfsbvkjdsbnvljkdsbvkjhdfbvkjdfbvjkdfb'

So when importing the dump, i.e. running the INSERT statements, MySQL could not process the data, as it was not of JSON type.

Here is a link to the reported bug:
https://github.com/sequelpro/sequelpro/issues/2397

You need to uncheck the "Output BLOB fields as hex" option.

Warlike answered 28/1, 2019 at 15:8 Comment(0)

A vim version of Lorcan O'Neill's answer:

vi xxxx.sql
:%s/\(X'[^,\)]*'\)/CONVERT(\1 using utf8mb4)/g
Marshamarshal answered 5/7, 2019 at 2:25 Comment(1)
simple yet powerful. – Haskel

This worked for me (I had control of the export to the SQL file as well). There are lots of caveats; e.g., I knew that the fields would never be bigger than 1000 characters and wouldn't contain any non-ASCII characters. Please do comment and tell me all the reasons this is so bad, though :)

Before export

alter table <table> modify <json_column> varchar(1000);

Then after import

alter table <table> modify <json_column> json;
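Before converting back, a quick check can catch rows that were truncated or otherwise mangled in transit (using the same placeholders as above); any row returned here would fail the conversion back to JSON:

SELECT * FROM <table> WHERE JSON_VALID(<json_column>) = 0;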
Hartfield answered 12/6, 2017 at 14:7 Comment(2)
I would recommend using LONGTEXT as the temporary type. If the size is smaller, the JSON data will be truncated, and when converting back to a JSON column, errors will be thrown for the affected records; plus your data is only partially saved. – Unorganized
yup :) hence the caveats – Hartfield

For those using Sequel Pro around June 2019: in addition to unchecking the "Output BLOB fields as hex" option (as mentioned above), you also need to use the nightly build, which added support for JSON types two years ago. This support has not yet made it into the official release.

Var answered 4/6, 2019 at 21:11 Comment(2)
Can you provide the link to the specific nightly build? I am facing the same issue in spite of unchecking the "Output BLOB fields as hex" option. – Noreen
Sequel Pro was abandoned and is now continued as Sequel Ace on the App Store. – Schoolmistress

I had this problem with a dump. I was able to fix it by changing this line in the dump file from:

/*!40101 SET NAMES binary*/;

to

/*!40101 SET NAMES utf8mb4*/;
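Since a multi-gigabyte dump won't open in most editors, the same edit can be made in place with sed (the file name here is a placeholder):

sed -i 's/SET NAMES binary/SET NAMES utf8mb4/' dump.sql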
Sousa answered 11/1, 2019 at 20:44 Comment(1)
My dump had /*!40101 SET NAMES utf8 */;. Replacing that utf8 with utf8mb4 did not fix or change the issue. – Debunk

Lorcan's answer helped me as a start, but converting all binary values created a bunch of other error messages, like Duplicate entry [...] for key 'PRIMARY'. Finally I figured out that the JSON entries all started with 5B or 7B and closed with 5D or 7D, which of course means they start with [ or { and end with ] or }. So what worked for me was to regex-replace only those entries:

Find:    (X'5B[^,\)]*5D')
Replace: CONVERT($1 using utf8mb4)

then

Find:    (X'7B[^,\)]*7D')
Replace: CONVERT($1 using utf8mb4)

Et voilà, all import errors gone! (At least in my case.)
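For a dump too large to open in an editor, both replacements can also be applied in one pass, in the style of the sed one-liners above (file names are placeholders):

sed -E "s/(X'5B[^,\)]*5D')/CONVERT(\1 using utf8mb4)/g; s/(X'7B[^,\)]*7D')/CONVERT(\1 using utf8mb4)/g" infile.sql > outfile.sql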

Schoolmistress answered 9/11, 2020 at 19:41 Comment(0)

Changing the collation to utf8_general_ci worked for me.
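Presumably that means something like the following (the table name is a placeholder; note that MySQL's JSON type itself always uses utf8mb4):

ALTER TABLE my_table CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;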

Chalky answered 26/7, 2018 at 10:39 Comment(1)
Instead of a one-liner, add some relevant explanation to your answer. – Hochheimer

For those who, like me, arrived here using Symfony 4 / Doctrine: for some reason the same entity can be resolved to a longtext MySQL type storing JSON, or to a json MySQL type storing JSON. Manually setting the longtext MySQL type resolved the problem in my particular case.
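In SQL terms the workaround amounts to something like this (table and column names are placeholders):

ALTER TABLE my_entity MODIFY payload LONGTEXT;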

Honeysweet answered 21/1, 2019 at 12:57 Comment(0)

This odd issue was occurring when running a simple UPDATE query:

update some_table set json_attr = '{"test":168}' where id = 123456;

Restarting MySQL fixed it. I was not able to pinpoint the cause.

Edit: we are using Aurora. It looks like it was related to a weird configuration where the same instance handled both master and slave/reader connections.

Naos answered 10/6, 2019 at 21:10 Comment(0)

All MySQL JSON data must use the utf8mb4 character set, not binary.
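A minimal illustration, reusing the json_table example from the accepted answer (X'7B7D' is the hex encoding of {}):

INSERT INTO json_table (json_column) VALUES (X'7B7D');
-- fails: cannot create a JSON value from a string with CHARACTER SET 'binary'
INSERT INTO json_table (json_column) VALUES (CONVERT(X'7B7D' USING utf8mb4));
-- succeeds, inserting {}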

Encampment answered 28/6, 2016 at 14:12 Comment(6)
Is there a possible way to force this with the current SQL dump, to fix this? – Milliliter
How do I convert it to utf8mb4? I get an error when I use ALTER TABLE table_name CHANGE col_name col_name JSON CHARACTER SET utf8mb4; – Southernmost
Not sure this is an answer; it's more like a statement. How does one get the import past this issue? – Winser
Where is the answer? – Balls
This is not an answer. – Aviatrix
@DannyBevers please change the accepted answer. This answer is definitely wrong. – Ungrounded
