Text was truncated or one or more characters had no match in the target code page
Asked Answered
D

13

79

I'm trying to import a flat file into an OLE DB destination in a SQL Server database.

Here's the field that's giving me trouble:

[screenshot]

Here are the properties of that flat file connection, specifically for that field:

[screenshot]

Here's the error message:

[Source - 18942979103_txt [424]] Error: Data conversion failed. The data conversion for column "recipient-name" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".

What am I doing wrong?

Dichloride answered 28/8, 2014 at 17:14 Comment(8)
"Text was truncated..." seems to infer that the imported file might have a value in that column with more than 100 characters... worth a look, at least.Onstage
Nope, that's not the case; I did check.Dichloride
If you go into the Advanced Properties section of each component in the data flow, can you confirm that all of them correctly use the 100-character length?Gismo
That column in your flat file contains some weird characters that aren't recognized by the code page that you are using for your SQL destination. See if this helps: #18361056Uptodate
I would try nvarchar(max) & see what happensCenser
Redirect the rows to an error output and send it to a derived column or something that you do not need to configure. Then add a data viewer and try to find the rows with errors. It could be that some other column has a tab in it, which is throwing things off. Or maybe the previous line was not ended properlyWapentake
@MarkWojciechowicz thank you. What do you mean by data viewer? Is that a control?Dichloride
@yuck A data viewer is a tool that allows you to view data during debugging. Right-click the arrow between two components and select Add Data Viewer (in this case, on the error flow). Then run the package. A window will pop up when rows are sent to that buffer.Wapentake
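Before changing any settings, it can help to confirm which of the two failure modes you actually have: values longer than 100 characters, or characters with no mapping in the destination code page. A minimal T-SQL sketch, assuming the file has already been bulk-loaded into a hypothetical staging table dbo.Staging with a wide NVARCHAR(4000) copy of the column:

    -- Truncation check: longest value actually present in the column
    SELECT MAX(LEN([recipient-name])) AS max_len
    FROM dbo.Staging;

    -- Code-page check: rows whose characters do not survive a round trip
    -- through a single-byte VARCHAR (unmappable characters become '?')
    SELECT [recipient-name]
    FROM dbo.Staging
    WHERE [recipient-name] <> CAST(CAST([recipient-name] AS VARCHAR(4000)) AS NVARCHAR(4000));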
L
35

After increasing the length and even changing the data type to text without success, I solved this by creating an XLSX file and importing that instead. The import then detected the data types accurately instead of setting every column to varchar(50). It turns out nvarchar(255) for that column would have done it too.

Loreenlorelei answered 26/11, 2014 at 0:37 Comment(3)
What do you do when your data is too big for Excel?Busy
Converting the data type to nvarchar(255) didn't work; I am still getting the exception. Inserting with plain SQL works like a charm, and the string is only 51 characters long.Hoedown
Instead of using Import Data, I chose Import Flat File and it worked without any issue. One thing I noticed: for the problem column, the Import Data option set the column size to 50 (the content was 51 characters), while Import Flat File set it to 100.Hoedown
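If the destination table already exists, the nvarchar(255) idea from this answer can also be applied directly with DDL rather than by re-importing; a hedged sketch, where dbo.Recipients is a hypothetical table name and the column name is taken from the error message:

    -- Widen the destination column so values over 50/100 characters fit;
    -- NVARCHAR also avoids single-byte code-page mismatches.
    ALTER TABLE dbo.Recipients
        ALTER COLUMN [recipient-name] NVARCHAR(255) NULL;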
P
103

Here is what fixed the problem for me; I did not have to convert to Excel. I just modified the DataType to "text stream" when choosing the data source (Figure 1). You can also check the "Edit Mappings" dialog to verify the change to the size (Figure 2).

Figure 1

[screenshot]

Figure 2

[screenshot]

Postpositive answered 10/10, 2017 at 20:11 Comment(2)
NOTE: I had to both update the data type to VARCHAR(MAX) and also go to the Advanced screen and increase the import width to 5000 characters.Snooty
Worked for me, but I also had to change the Code page from 1252 (ANSI - Latin I) to 65001 (UTF-8) on the Flat File Source step, otherwise it was still failing with the same error. After trying a range of solutions to import over 1 million rows (so I couldn't use Excel), this worked fine in combination with the switch to text stream above.Mccutchen
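As a rough equivalent of this answer's combination (a MAX-length column plus a UTF-8 code page) outside of SSIS, the same idea can be expressed with BULK INSERT; a sketch with a hypothetical file path and table name, noting that CODEPAGE = '65001' is only accepted on newer SQL Server versions:

    -- Load the flat file directly, declaring UTF-8 as the source code page.
    BULK INSERT dbo.Recipients
    FROM 'C:\import\recipients.txt'        -- placeholder path
    WITH (
        FIRSTROW = 2,                      -- skip the header row
        FIELDTERMINATOR = ',',             -- adjust to the file's delimiter
        ROWTERMINATOR = '\n',
        CODEPAGE = '65001'                 -- UTF-8 source encoding
    );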
N
20

I solved this problem by ORDERING my source data (xls, csv, whatever) so that the longest text values are at the top of the file. Excel makes this easy: use the LEN() function on the problem column, sort by that length with the longest value at the top of your dataset, save, and try the import again.

Nan answered 4/12, 2015 at 16:5 Comment(1)
My import was failing with data around 3000 characters, and this method fixed the import error.Temikatemp
A
13

I resolved it by checking the 'Unicode' checkbox:

[screenshot]
Amid answered 13/6, 2019 at 6:55 Comment(0)
D
8

While the approach proposed above (@chookoos: convert to an Excel workbook and import that) resolves these kinds of issues, this solution in another Q&A is excellent because you can stay with your csv, tsv, or txt file and perform the necessary fine-tuning without building a Microsoft-product-dependent workaround.

Dario answered 17/12, 2018 at 11:53 Comment(0)
A
7

SQL Server may be able to suggest the right data type for you (even when it does not choose the right type by default): clicking the "Suggest Types" button (shown in your screenshot above) lets SQL Server scan the source and suggest a data type for the field that's throwing the error. In my case, choosing to scan 20,000 rows to generate the suggestions and using the resulting suggested data type fixed the issue.

Anguiano answered 9/12, 2014 at 16:13 Comment(0)
W
6

You need to increase the column length for that particular column while importing the data.

Choose a data source >> Advanced >> increase the column width from the default 50 to 200 or more.

Wimsatt answered 3/7, 2020 at 3:28 Comment(0)
L
5

Not really a technical solution, but the SQL Server 2017 flat file import wizard is totally revamped; it imported my large-ish file in 5 clicks and handled the encoding and field-length issues without any input from me.

[screenshot]

Lelialelith answered 12/3, 2019 at 13:44 Comment(2)
I imported a file with 3 million records, and it doesn't have an identity column. I want to add an identity column; can you show me how?Gerianne
Does this help? #1049710Lelialelith
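For the identity-column question in the comments, one common approach is to add the column after the import; a minimal sketch with a hypothetical table name (on millions of rows this rewrites the table, so expect it to take a while):

    -- Add an auto-numbering surrogate key to the already-imported table.
    ALTER TABLE dbo.ImportedRecords
        ADD Id INT IDENTITY(1, 1) NOT NULL;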
D
3

SQL Server Management Studio's data import looks at the first few rows to determine the source data specs.

Shift your records around so that the longest text is at the top.

Donavon answered 7/2, 2016 at 14:36 Comment(1)
Yes, put the maximum-length value in the first row of the Excel column and import the Excel file; it will work.Chrotoem
D
2

I had a similar problem against two different databases (DB2 and SQL Server); I finally solved it by using CAST in the DB2 source query. Using a query also let me adapt the source column to varchar and trim the useless blank spaces:

CAST(RTRIM(LTRIM(COLUMN_NAME)) AS VARCHAR(60) CCSID UNICODE 
   FOR SBCS DATA)  COLUMN_NAME

The important issue here is the CCSID conversion.

Directive answered 24/7, 2018 at 8:27 Comment(0)
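For context, here is a sketch of how that CAST might sit inside the SSIS source query; the schema, table, and column names are placeholders, and the CCSID/FOR SBCS DATA clause is DB2 syntax, not valid on SQL Server:

    -- DB2 source query: trim the value and declare a Unicode VARCHAR(60)
    -- so SSIS receives a predictable code page and length.
    SELECT
        CAST(RTRIM(LTRIM(COLUMN_NAME)) AS VARCHAR(60) CCSID UNICODE
             FOR SBCS DATA) AS COLUMN_NAME
    FROM SCHEMA_NAME.TABLE_NAME;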
S
2

It's usually because the column in the Connection Manager is still set to 50 characters. I resolved the problem by going to Connection Manager --> Advanced and changing the width to 100, or maybe 1000 if the data is big enough.

Synapsis answered 16/7, 2019 at 11:22 Comment(0)
L
1

None of the above worked for me. I SOLVED my problem by saving my source data (Save As) as a single-worksheet Excel 5.0/95 .xls file and importing it without column headings. Also, I created the table in advance and mapped the columns manually instead of letting SQL Server create the table.

Lykins answered 12/9, 2017 at 16:11 Comment(0)
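A minimal sketch of the "create the table in advance" step from this answer, with hypothetical table and column names and sizes deliberately wider than the source data:

    -- Pre-create the destination so the wizard maps into known-good types
    -- instead of guessing varchar(50) for every column.
    CREATE TABLE dbo.RecipientImport (
        [recipient-name]  NVARCHAR(255) NULL,
        [recipient-email] NVARCHAR(255) NULL   -- add remaining columns as needed
    );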
P
1

In my case, the problem was that my column had values longer than 50 characters while the OutputColumnWidth property was set to 50, so anything exceeding 50 characters caused an error. The fix was increasing the value to a maximum that covered all values in my dataset:

[screenshot]

Pavis answered 13/12, 2023 at 8:47 Comment(0)
