"Incorrect string value" when trying to insert UTF-8 into MySQL via JDBC?

24

328

This is how my connection is set:
Connection conn = DriverManager.getConnection(url + dbName + "?useUnicode=true&characterEncoding=utf-8", userName, password);

And I'm getting the following error when trying to add a row to a table:
Incorrect string value: '\xF0\x90\x8D\x83\xF0\x90...' for column 'content' at row 1

I'm inserting thousands of records, and I always get this error when the text contains \xF0 (i.e. the incorrect string value always starts with \xF0).

The column's collation is utf8_general_ci.

What could be the problem?

Extenuation answered 8/6, 2012 at 23:46 Comment(4)
That would be LATIN SMALL LETTER N WITH TILDE (ñ).Armorer
For others encountering this issue, you could try, on the database: ALTER DATABASE database_name CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci; That will fix tables created from now on, NOT existing tables. For those you need to do: ALTER TABLE table_name CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci; Source - digitalocean.com/community/questions/…Sept
Tried the approach by @Sept. Just to make the reader aware: this screws up the performance of the table very badly. So badly that I had to revert the table back to utf8. Looking for another approach.Jalapa
The performance hit was most likely from creating a variance in charsets across indexes. If one index is latin1 and the other utf8mb4, the index will not work correctly. You will need to set any joined columns to the same charset.Understandable
406

MySQL's utf8 permits only the Unicode characters that can be represented with 3 bytes in UTF-8. Here you have a character that needs 4 bytes: \xF0\x90\x8D\x83 (U+10343 GOTHIC LETTER SAUIL).

If you have MySQL 5.5 or later you can change the column encoding from utf8 to utf8mb4. This encoding allows storage of characters that occupy 4 bytes in UTF-8.

You may also have to set the server property character_set_server to utf8mb4 in the MySQL configuration file. It seems that Connector/J defaults to 3-byte Unicode otherwise:

For example, to use 4-byte UTF-8 character sets with Connector/J, configure the MySQL server with character_set_server=utf8mb4, and leave characterEncoding out of the Connector/J connection string. Connector/J will then autodetect the UTF-8 setting.
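If you want to detect such strings from the Java side before inserting, the check is small. Here is an illustrative sketch (the class and method names are mine, not from Connector/J): any code point above U+FFFF needs 4 bytes in UTF-8 and will be rejected by a 3-byte utf8 column.

```java
import java.nio.charset.StandardCharsets;

public class Utf8Mb3Check {

    // True if the string contains a code point above U+FFFF, i.e. one that
    // needs 4 bytes in UTF-8 and cannot be stored in MySQL's 3-byte
    // "utf8" (utf8mb3) encoding.
    static boolean needsUtf8mb4(String s) {
        return s.codePoints().anyMatch(cp -> cp > 0xFFFF);
    }

    public static void main(String[] args) {
        // U+10343 GOTHIC LETTER SAUIL, the character from the error message
        String gothic = "\uD800\uDF43";
        System.out.println(needsUtf8mb4("España")); // false: ñ is only 2 bytes in UTF-8
        System.out.println(needsUtf8mb4(gothic));   // true: needs utf8mb4
        // Its UTF-8 encoding is exactly the \xF0\x90\x8D\x83 from the error:
        System.out.println(gothic.getBytes(StandardCharsets.UTF_8).length); // 4
    }
}
```

Such a check only tells you which strings would fail; the actual fix is still the utf8mb4 column change described above.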

Vixen answered 9/6, 2012 at 9:16 Comment(13)
What an odd choice to have utf8 really mean "the subset of UTF8 that can be represented in 3 bytes".Vulgarian
@EricJ. Exactly! This is very confusing: I knew my encoding and collation were utf8_general_ci but was still getting a mysql error. I'm glad @Vixen had the answer above or else I would have never guessed what was going on.Robson
Warning: You should upgrade to the newest version of the Connector/J driver when doing this since older versions don't seem to autodetect the utf8mb4 setting correctly!Kaiserism
character_encoding_server is not a valid MySQL config variable name. I have tried to set character_set_server to utf8mb4 instead, in addition to individual columns, but it didn't change anything.Nonobservance
There is one more step that I had to do when I changed my database to utf8mb4 and that was to STOP specifying characterEncoding=UTF-8&characterSetResults=UTF-8. This was actually preventing the proper handling of utf8mb4.Sukin
# For each database: ALTER DATABASE database_name CHARACTER SET = utf8mb4 COLLATE = utf8mb4_unicode_ci; # For each table: ALTER TABLE table_name CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci; # For each column: ALTER TABLE table_name CHANGE column_name column_name VARCHAR(191) CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;Watchword
Bizarre that UTF-8 isn't UTF-8 until it's updated to be UTF-8Tamar
So you are suggesting that UTF-8 with 3 (three) bytes is unable to store LATIN SMALL LETTER N WITH TILDE (ñ), and we need 4 (four) bytes to spell "España" correctly? Really? Could it be more inefficient than this? What can we store besides A-Z and 0-9 with 3 bytes then..Armorer
@Alex ñ takes up 2 bytes in both utf8 and utf8mb4 as long as you use varchar or text. Only characters above U+FFFF take up 4 bytes. If you don't need to store characters from different languages you can always use 1-byte encodings like latin1Vixen
Please at least stop calling utf8mb3 utf8Vestment
Running SET NAMES utf8mb4; before the LOAD DATA LOCAL INFILE statement was super helpful for me. https://mcmap.net/q/66178/-load-data-infile-invalid-ut8mb4-character-stringBailiwick
Thanks a lot, changing only the table setting didn't work. I had to update the my.cnf fileCanticle
For those using mysql Workbench - hover above the table, click the configure button, select requested field and you will see at the bottom Charset/Collation options. For my use case it was a single field that was problematic, not the entire DB, so workbench was great for thisEver
151

The strings that contain \xF0 are simply characters encoded as multiple bytes using UTF-8.

Although your collation is set to utf8_general_ci, I suspect that the character encoding of the database, table or even column may be different. They are independent settings. Try:

ALTER TABLE database.table MODIFY COLUMN col VARCHAR(255)  
    CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NOT NULL;

Substitute whatever your actual data type is for VARCHAR(255)

Vulgarian answered 9/6, 2012 at 0:7 Comment(5)
Actually tried that, didn't work. The column's data type is LONGTEXT btw, if this matters.Extenuation
Your app is in Java I take it? Try invoking Java with the file-encoding parameter specifying UTF-8, e.g. java -Dfile.encoding=UTF-8 or add an appropriate switch in your Tomcat (etc.) config file.Vulgarian
I suggest you put an emphasis on "character encoding of the database, table or even column may be different". That is the most important thing.Resor
You will have to alter the table as well with CHARACTER SET utf8 COLLATE utf8_general_ci, and then afterwards change the column to CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ciAlerion
It worked! I wanted to store korean translations in the column. Thank you so much!Aurum
86

I got the same problem. To save data as utf8mb4, you need to make sure that:

  1. character_set_client, character_set_connection and character_set_results are utf8mb4: character_set_client and character_set_connection indicate the character set in which statements are sent by the client; character_set_results indicates the character set in which the server returns query results to the client.
    See charset-connection.

  2. the table and column encoding is utf8mb4

For JDBC, there are two solutions:

Solution 1 (requires restarting MySQL):

  1. modify my.cnf like the following and restart MySQL:

     [mysql]
     default-character-set=utf8mb4
    
     [mysqld]
     character-set-server=utf8mb4
     collation-server=utf8mb4_unicode_ci
    

This makes sure that the database and the character_set_client, character_set_connection and character_set_results variables are utf8mb4 by default.

  2. restart MySQL

  3. change the table and column encoding to utf8mb4

  4. STOP specifying characterEncoding=UTF-8 and characterSetResults=UTF-8 in the JDBC connection string, because this overrides character_set_client, character_set_connection and character_set_results to utf8

Solution 2 (no need to restart MySQL):

  1. change the table and column encoding to utf8mb4

  2. specify characterEncoding=UTF-8 in the JDBC connection string, because the JDBC connector doesn't support utf8mb4 directly.

  3. write your SQL statement like this (you need to add allowMultiQueries=true to the JDBC connection string):

     'SET NAMES utf8mb4;INSERT INTO Mytable ...';
    

This makes sure that for each connection to the server, character_set_client, character_set_connection and character_set_results are utf8mb4.
Also see charset-connection.

Croner answered 2/2, 2016 at 15:3 Comment(8)
Point 3 was the clincher for me in conjunction with changing the db, table & field encodings: 'SET NAMES utf8mb4;INSERT INTO Mytable ...';Bartholomeus
Point 3 did the trick for me too, my table encoding already set to utf8mb4.Crofton
The table encoding is just a default. It is sufficient to change the column encoding to utf8mb4.Nolita
The second approach should be used selectively, i.e. never applied to SELECT queries, as set names utf8mb4; select ... from ... will never produce a ResultSet and will instead result in a "ResultSet is from UPDATE. No Data." error.Snare
solution 2, just par. 1 helped me when I was trying to insert Cyrillic text through my contact form.Jacobean
The Solution 2 worked for me but i am unable to get newly inserted id from my second insert query. what should i do ?Bushwa
default-character-set=utf8mb4 for the mysql client fixed it for me. character-set-server is optional. Quickest way to test it: select CONVERT('😃' USING utf8mb4); (edit ~/.my.cnf or /etc/my.cnf on the mysql client machine). To test only character_set_results (server-to-client part): select CONVERT(UNHEX('F09F9883') USING utf8mb4);. "character-set-server" only sets the default char set applied when create-database commands execute. Source: "indicate the character set and collation of the default database" dev.mysql.com/doc/refman/8.0/en/charset-connection.htmlMissive
For anyone using Connector/J. The document said that: "Do not issue the query SET NAMES with Connector/J, as the driver will not detect that the character set has been changed by the query, and will continue to use the character set configured when the connection was first set up.". Please see dev.mysql.com/doc/connector-j/5.1/en/…Herbalist
23

I wanted to combine a couple of posts to make a full answer of this since it does appear to be a few steps.

  1. Above advice by @madtracey

/etc/mysql/my.cnf or /etc/mysql/mysql.conf.d/mysqld.cnf

[mysql]
default-character-set=utf8mb4

[mysqld_safe]
socket          = /var/run/mysqld/mysqld.sock
nice            = 0

[mysqld]
##
character-set-server=utf8mb4
collation-server=utf8mb4_unicode_ci
init_connect='SET NAMES utf8mb4'
sql_mode=STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION

Again, per the advice above, all JDBC connections had characterEncoding=UTF-8 and characterSetResults=UTF-8 removed from them.

With this set, -Dfile.encoding=UTF-8 appeared to make no difference.

I still could not write international text into the db, getting the same failure as above.

Now using this how-to-convert-an-entire-mysql-database-characterset-and-collation-to-utf-8

Update all your db to use utf8mb4

ALTER DATABASE YOURDB CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;

Run this query; it generates the statements that need to be run:

SELECT CONCAT(
'ALTER TABLE ',  table_name, ' CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;  ', 
'ALTER TABLE ',  table_name, ' CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;  ')
FROM information_schema.TABLES AS T, information_schema.`COLLATION_CHARACTER_SET_APPLICABILITY` AS C
WHERE C.collation_name = T.table_collation
AND T.table_schema = 'YOURDB'
AND
(C.CHARACTER_SET_NAME != 'utf8mb4'
    OR
 C.COLLATION_NAME not like 'utf8mb4%')

Copy and paste the output into an editor, replace all | characters with nothing, and paste the result back into mysql while connected to the correct db.

That is all that had to be done, and it all seems to work for me. Note that -Dfile.encoding=UTF-8 is not enabled, and it appears to work as expected.

Edit to add: still having an issue? I certainly was in production, so it turns out you do need to check over what was done by the above, since it sometimes does not work. Here is the reason, and the fix, in this scenario:

show create table user

  `password` varchar(255) CHARACTER SET latin1 NOT NULL,
  `username` varchar(255) CHARACTER SET latin1 NOT NULL,

You can see some columns are still latin1. Attempting to manually update the table:

ALTER TABLE user CONVERT TO CHARACTER SET utf8mb4;
ERROR 1071 (42000): Specified key was too long; max key length is 767 bytes

So let's narrow it down:

mysql> ALTER TABLE user change username username varchar(255) CHARACTER SET utf8mb4 not NULL;
ERROR 1071 (42000): Specified key was too long; max key length is 767 bytes
mysql> ALTER TABLE user change username username varchar(100) CHARACTER SET utf8mb4 not NULL;
Query OK, 5 rows affected (0.01 sec)

In short I had to reduce the size of that field in order to get the update to work.

Now when I run:

mysql> ALTER TABLE user CONVERT TO CHARACTER SET utf8mb4;
Query OK, 5 rows affected (0.01 sec)
Records: 5  Duplicates: 0  Warnings: 0

It all works

Expertize answered 15/4, 2017 at 11:42 Comment(2)
Question: the last ALTER TABLE command will convert the contents of all VARCHAR fields into valid, genuine UTF8 encoded string? I ask because I'm having problems converting my LATIN1 fields to UTF8, specifically when the ñ character is found, conversion directly fails due to an incorrect string value (error 1366).Armorer
If you mean ALTER TABLE user CONVERT TO CHARACTER SET utf8mb4;: strangely enough, when I ran this the final time, the fields no longer had a character set defined. So password from above became password varchar(255) NOT NULL, (nothing about encoding). This means the last command must have made MySQL look up the actual table definition, and because the table default was now utf8mb4, the fields no longer needed it. I presume they had kept the explicit character set simply because during the table-wide conversion it couldn't update them, and they were left in that stateExpertize
12

In my case, I tried everything above and nothing worked. My database status looked like this:

mysql  Ver 14.14 Distrib 5.7.17, for Linux (x86_64) using  EditLine wrapper

Connection id:      12
Current database:   xxx
Current user:       yo@localhost
SSL:            Not in use
Current pager:      stdout
Using outfile:      ''
Using delimiter:    ;
Server version:     5.7.17-0ubuntu0.16.04.1 (Ubuntu)
Protocol version:   10
Connection:     Localhost via UNIX socket
Server characterset:    utf8
Db     characterset:    utf8
Client characterset:    utf8
Conn.  characterset:    utf8
UNIX socket:        /var/run/mysqld/mysqld.sock
Uptime:         42 min 49 sec

Threads: 1  Questions: 372  Slow queries: 0  Opens: 166  Flush tables: 1  Open tables: 30  Queries per second avg: 0.144

So I looked up the column charset in every table:

show create table company;

It turned out the column charset was latin1. That's why I could not insert Chinese into the database.

 ALTER TABLE company CONVERT TO CHARACTER SET utf8;

That might help you. :)

Priam answered 31/3, 2017 at 3:1 Comment(0)
7

I had the same problem in my rails project:

Incorrect string value: '\xF0\xA9\xB8\xBDs ...' for column 'subject' at row 1

Solution 1: before saving to the db, convert the string to Base64 with Base64.encode64(subject); after fetching from the db, use Base64.decode64(subject)
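For reference, the same Base64 round-trip can be sketched in Java (a hypothetical helper of mine, using java.util.Base64 from the JDK); the stored value is plain ASCII, so it survives any column charset:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64Workaround {

    // Encode before saving so only ASCII reaches the column.
    static String encode(String subject) {
        return Base64.getEncoder()
                     .encodeToString(subject.getBytes(StandardCharsets.UTF_8));
    }

    // Decode after fetching to get the original string back.
    static String decode(String stored) {
        return new String(Base64.getDecoder().decode(stored), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String original = "subject with emoji \uD83D\uDE00";
        String stored = encode(original);
        System.out.println(decode(stored).equals(original)); // true
    }
}
```

The trade-off is that the column contents are no longer readable or searchable in SQL.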

Solution 2:

Step 1: Change the character set (and collation) for subject column by

ALTER TABLE t1 MODIFY
subject VARCHAR(255)
  CHARACTER SET utf8mb4
  COLLATE utf8mb4_unicode_ci;

Step 2: In database.yml use

encoding: utf8mb4
Actinouranium answered 24/5, 2017 at 18:11 Comment(0)
5

just do

ALTER TABLE `some_table` 
CHARACTER SET = utf8 , COLLATE = utf8_general_ci ;

ALTER TABLE `some_table` 
CHANGE COLUMN `description_with_latin_or_something` `description` TEXT CHARACTER SET 'utf8' NOT NULL ;
Piane answered 23/5, 2017 at 15:17 Comment(1)
what if I have a bunch of tables I want to change in database though? and what if all have different storage engine (innodb, etc)?Lugworm
5

Assuming you are using phpMyAdmin, follow these steps to solve this error:

  1. phpMyAdmin
  2. your_table
  3. "Structure tab"
  4. change the Collation of your field from latin1_swedish_ci (or whatever it is) to utf8_general_ci
Standardize answered 2/3, 2018 at 12:4 Comment(4)
Not valid, you are supposing he uses phpMyAdmin.Oler
Doesn't work... and the collation is changed under 'Operations', not in 'Structure'Husky
@OlorunfemiAjibulu yes, you can change it in "structure" as well. For some people here, it workedStandardize
@TeoMihaila Perhaps, it's versioning.Husky
5

This is not the recommended solution, but it is worth sharing. My project upgraded the DBMS from an old MySQL to the newest (8), but I can't change the table structure, only the DBMS config (mysql). This solution is for the MySQL server.

Tested on Windows, MySQL 8.0.15. In the MySQL config, search for

sql-mode="....."

Uncomment it, or in my case just type/add

sql-mode="NO_ENGINE_SUBSTITUTION"

Why is this not the recommended solution? Because if you use latin1 (my case), the insert succeeds but the content is mangled (MySQL does not respond with an error!). For example, if you type info like this

bla \x12

it saves

bla [] (box)

Okay, for my problem I could change the field to UTF8, but there is a small problem: as the answers above explain, the other solution fails because the word is not inserted when it contains more than 2 bytes (cmiiw), while this solution makes your inserted data become boxes. The reasonable option is to use a blob, and then you can skip my answer.

Another test related to this: using utf8_encode in your code before saving. I used it on latin1 and it was a success (without using sql-mode), the same as the answer above using base64_encode.

My suggestion is to analyse your table requirements and try to convert from the other format to UTF8.

Stilbite answered 19/7, 2019 at 9:15 Comment(1)
In my settings.py (Django Project), I changed to sql-mode="NO_ENGINE_SUBSTITUTION". It's working.Jehovah
4

It's mostly caused by some Unicode characters. In my case it was the Rupee currency symbol.

To quickly fix this, I had to spot the character causing the error. I copy-pasted the entire text into a text editor like vi and replaced the troubling character with a textual one.
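With thousands of records, hand-editing doesn't scale. For the characters in this question (those outside the Basic Multilingual Plane, which need 4 bytes in UTF-8), the replacement can be automated; a minimal Java sketch (my own helper, not from this answer):

```java
public class BmpSanitizer {

    // Replace every code point above U+FFFF (4 bytes in UTF-8, rejected by
    // MySQL's 3-byte utf8 columns) with '?', leaving everything else intact.
    static String stripSupplementary(String s) {
        return s.codePoints()
                .map(cp -> cp > 0xFFFF ? '?' : cp)
                .collect(StringBuilder::new,
                         StringBuilder::appendCodePoint,
                         StringBuilder::append)
                .toString();
    }

    public static void main(String[] args) {
        System.out.println(stripSupplementary("price \uD83D\uDE00 100")); // "price ? 100"
    }
}
```

Only do this if losing the character is acceptable; switching the column to utf8mb4 preserves the data.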

Modernistic answered 12/4, 2016 at 7:9 Comment(1)
The OP mentioned that there are a thousand records being inserted....Resor
4

If you only want to apply the change to one field, you could try serializing the field:

class MyModel < ActiveRecord::Base
  serialize :content

  attr_accessible :content, :title
end
Homiletic answered 3/10, 2017 at 21:58 Comment(0)
4

I had this problem with my Play (Java) application. This is my stack trace for that exception:

javax.persistence.PersistenceException: Error[Incorrect string value: '\xE0\xA6\xAC\xE0\xA6\xBE...' for column 'product_name' at row 1]
  at io.ebean.config.dbplatform.SqlCodeTranslator.translate(SqlCodeTranslator.java:52)
  at io.ebean.config.dbplatform.DatabasePlatform.translate(DatabasePlatform.java:192)
  at io.ebeaninternal.server.persist.dml.DmlBeanPersister.execute(DmlBeanPersister.java:83)
  at io.ebeaninternal.server.persist.dml.DmlBeanPersister.insert(DmlBeanPersister.java:49)
  at io.ebeaninternal.server.core.PersistRequestBean.executeInsert(PersistRequestBean.java:1136)
  at io.ebeaninternal.server.core.PersistRequestBean.executeNow(PersistRequestBean.java:723)
  at io.ebeaninternal.server.core.PersistRequestBean.executeNoBatch(PersistRequestBean.java:778)
  at io.ebeaninternal.server.core.PersistRequestBean.executeOrQueue(PersistRequestBean.java:769)
  at io.ebeaninternal.server.persist.DefaultPersister.insert(DefaultPersister.java:456)
  at io.ebeaninternal.server.persist.DefaultPersister.insert(DefaultPersister.java:406)
  at io.ebeaninternal.server.persist.DefaultPersister.save(DefaultPersister.java:393)
  at io.ebeaninternal.server.core.DefaultServer.save(DefaultServer.java:1602)
  at io.ebeaninternal.server.core.DefaultServer.save(DefaultServer.java:1594)
  at io.ebean.Model.save(Model.java:190)
  at models.Product.create(Product.java:147)
  at controllers.PushData.xlsupload(PushData.java:67)
  at router.Routes$$anonfun$routes$1.$anonfun$applyOrElse$40(Routes.scala:690)
  at play.core.routing.HandlerInvokerFactory$$anon$3.resultCall(HandlerInvoker.scala:134)
  at play.core.routing.HandlerInvokerFactory$$anon$3.resultCall(HandlerInvoker.scala:133)
  at play.core.routing.HandlerInvokerFactory$JavaActionInvokerFactory$$anon$8$$anon$2$$anon$1.invocation(HandlerInvoker.scala:108)
  at play.core.j.JavaAction$$anon$1.call(JavaAction.scala:88)
  at play.http.DefaultActionCreator$1.call(DefaultActionCreator.java:31)
  at play.core.j.JavaAction.$anonfun$apply$8(JavaAction.scala:138)
  at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:655)
  at scala.util.Success.$anonfun$map$1(Try.scala:251)
  at scala.util.Success.map(Try.scala:209)
  at scala.concurrent.Future.$anonfun$map$1(Future.scala:289)
  at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
  at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
  at scala.concurrent.impl.CallbackRunnable.run$$$capture(Promise.scala:60)
  at scala.concurrent.impl.CallbackRunnable.run(Promise.scala)
  at play.core.j.HttpExecutionContext$$anon$2.run(HttpExecutionContext.scala:56)
  at play.api.libs.streams.Execution$trampoline$.execute(Execution.scala:70)
  at play.core.j.HttpExecutionContext.execute(HttpExecutionContext.scala:48)
  at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:68)
  at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete(Promise.scala:368)
  at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete$(Promise.scala:367)
  at scala.concurrent.impl.Promise$KeptPromise$Successful.onComplete(Promise.scala:375)
  at scala.concurrent.impl.Promise.transform(Promise.scala:29)
  at scala.concurrent.impl.Promise.transform$(Promise.scala:27)
  at scala.concurrent.impl.Promise$KeptPromise$Successful.transform(Promise.scala:375)
  at scala.concurrent.Future.map(Future.scala:289)
  at scala.concurrent.Future.map$(Future.scala:289)
  at scala.concurrent.impl.Promise$KeptPromise$Successful.map(Promise.scala:375)
  at scala.concurrent.Future$.apply(Future.scala:655)
  at play.core.j.JavaAction.apply(JavaAction.scala:138)
  at play.api.mvc.Action.$anonfun$apply$2(Action.scala:96)
  at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:304)
  at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:37)
  at scala.concurrent.impl.CallbackRunnable.run$$$capture(Promise.scala:60)
  at scala.concurrent.impl.CallbackRunnable.run(Promise.scala)
  at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
  at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:91)
  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
  at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
  at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:91)
  at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
  at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:43)
  at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
  at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
  at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
  at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.sql.SQLException: Incorrect string value: '\xE0\xA6\xAC\xE0\xA6\xBE...' for column 'product_name' at row 1
  at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1074)
  at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4096)
  at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4028)
  at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2490)
  at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2651)
  at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2734)
  at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2155)
  at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2458)
  at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2375)
  at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2359)
  at com.zaxxer.hikari.pool.ProxyPreparedStatement.executeUpdate(ProxyPreparedStatement.java:61)
  at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeUpdate(HikariProxyPreparedStatement.java)
  at io.ebeaninternal.server.type.DataBind.executeUpdate(DataBind.java:82)
  at io.ebeaninternal.server.persist.dml.InsertHandler.execute(InsertHandler.java:122)
  at io.ebeaninternal.server.persist.dml.DmlBeanPersister.execute(DmlBeanPersister.java:73)
  ... 59 more

I was trying to save a record using io.ebean. I fixed it by re-creating my database with the utf8mb4 collation and applying Play evolutions to re-create all tables, so that all tables were recreated with the utf8mb4 collation.

CREATE DATABASE inventory CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
Hyaloid answered 26/7, 2018 at 10:3 Comment(0)
4

Hint: on AWS RDS you need a new Parameter Group for your MySQL DB with these params (instead of editing my.cnf):

  • collation_connection: utf8mb4_unicode_ci
  • collation_database: utf8mb4_unicode_ci
  • collation_server: utf8mb4_unicode_ci
  • character_set_client: utf8mb4
  • character_set_connection: utf8mb4
  • character_set_database: utf8mb4
  • character_set_results: utf8mb4
  • character_set_server: utf8mb4

Note: character_set_system stays "utf8"

These SQL commands do NOT WORK PERMANENTLY - only in a session:

set character_set_server = utf8mb4;
set collation_server = utf8mb4_unicode_ci;
Picturize answered 25/2, 2020 at 0:45 Comment(0)
4

After trying many queries, this finally worked:

ALTER TABLE
table_name
CHANGE column_name column_name 
varchar(256)
CHARACTER SET utf8mb4
COLLATE utf8mb4_unicode_ci;
Dronski answered 1/4, 2022 at 4:18 Comment(0)
3

If you are creating a new MySQL table, you can specify the charset of all columns upon creation, and that fixed the issue for me.

CREATE TABLE tablename (
<list-of-columns>
)
CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;

You can read more details: https://dev.mysql.com/doc/refman/8.0/en/charset-column.html

Hanahanae answered 7/7, 2019 at 6:25 Comment(0)
2

My solution was to change the column type from varchar(255) to blob.

Lethargy answered 7/7, 2017 at 6:44 Comment(0)
2

You need to set utf8mb4 in the HTML meta tag, and also on your server: alter the table and set its collation to utf8mb4.

Fairyfairyland answered 25/10, 2018 at 7:21 Comment(0)
2

Dropping the schema and recreating it with the utf8mb4 character set solved my issue.

Balladry answered 19/10, 2020 at 9:2 Comment(0)
1

I also had to drop and re-create all the database’s stored procedures (and functions too) in order that they execute within the new character set of utf8mb4.

Run:

SHOW PROCEDURE STATUS;

…to see which procedures have not been updated to the server’s new character_set_client, collation_connection and Database Collation values.

Veld answered 9/6, 2020 at 21:27 Comment(0)
1

However, it is important to note that the MySQL Connector driver version must not be older than 5.1.47.

Borderer answered 22/11, 2020 at 17:50 Comment(0)
1

You need to consider both sides of client-server architecture:

If you are using Java or PHP, you also need to match the charset in the JDBC driver or php.ini. In my case, using PHP:

default_charset = "UTF-8"

In Mysql 8.0:

character_set_server = utf8mb4
collation_connection = utf8mb4_0900_ai_ci

Moonstruck answered 10/10, 2023 at 23:37 Comment(0)
0

If you are facing a similar issue in Java and don't have the flexibility to change the charset and collation of the database, then this answer is for you.

You can use the emoji-java library (or something similar if you are not using Java) to achieve the same. Convert to aliases before saving/updating to the database, and convert back to Unicode after save/update/load from the database. The main benefit is the readability of the text even after encoding, because this library only aliases the emojis rather than the whole string.

Example code snippet:

String/Unicode to Alias (Before Save/Update to DB)

String str = "An 😀awesome 😃string with a few 😉emojis!";
String result = EmojiParser.parseToAliases(str);
System.out.println(result);
// Prints:
// "An :grinning:awesome :smiley:string with a few :wink:emojis!"

Alias to Unicode/String (After Save/Update/Load from DB)

String str = "An :grinning:awesome :smiley:string &#128516;with a few :wink:emojis!";
String result = EmojiParser.parseToUnicode(str);
System.out.println(result);
// Prints:
// "An 😀awesome 😃string 😄with a few 😉emojis!"

Note: You can use @PrePersist, @PreUpdate, @PostPersist, @PostUpdate, @PostLoad in the entity itself to do the alias and Unicode conversion if you are using Spring Boot.

Hejira answered 12/8, 2022 at 6:41 Comment(0)
0

I was converting a CSV file to an SQL file in Python. The data was from the 90s and some rows were corrupted. For example, where readable ASCII characters would normally be, you'd see control characters 0-31 and 127. Other times, you'd come across weird Unicode characters like U+FFFD.

Importing the resulting SQL file on the command line via mysql database_name < file.sql gave "ERROR 1366 (HY000) at line 123: Incorrect string value: 'ABC' for column 'XYZ' at row 456".

SELECT VERSION(); showed 8.0.32-0ubuntu0.20.04.2.

SHOW VARIABLES WHERE Variable_name LIKE 'character\_set\_%' OR Variable_name LIKE 'collation%'; showed:

Variable_name Value
character_set_client utf8mb4
character_set_connection utf8mb4
character_set_database utf8mb4
character_set_filesystem binary
character_set_results utf8mb4
character_set_server utf8mb4
character_set_system utf8mb3
collation_connection utf8mb4_0900_ai_ci
collation_database utf8mb4_0900_ai_ci
collation_server utf8mb4_0900_ai_ci

Using SET NAMES 'UTF8MB4'; and SET CHARACTER SET 'UTF8MB4'; didn't work. Rewriting the command to use mysql -e "SOURCE file.sql" didn't work. Using character set command line flags such as mysql --default-character-set=utf8mb4 didn't work.

The only thing that worked was changing:

with open('/foo/bar/baz.sql', 'w') as sql_file:

to:

with open('/foo/bar/baz.sql', 'w', encoding='utf-8') as sql_file:

The problem appears to have been the encoding of the file itself.

Showy answered 5/4, 2023 at 23:58 Comment(0)
0

When you run Java on Windows, use java -jar -Dfile.encoding=utf-8; it could solve it! Caused by:

java.sql.SQLException: Incorrect string value: '\xC0\xEB\xBF\xAA\xBC\xD2...' for column 'dsc1' at row 1
Ewers answered 9/8, 2023 at 2:34 Comment(0)
