Field in data file exceeds maximum length - error

When I try to load data into a VARCHAR2(4000) column of a table, I get the error "Field in data file exceeds maximum length".

The data:

1,2,3,4,5,6,7,8,9,10,11,12,13,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,31,33,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,111,112,121,654,666,667,1001,1100,1102,1103,1104,1105,1106,1107,1108,1109,1110,1111,1112,1113,1114,1115,1116,1117,1118,1119,1120,1121,1123,1124,1211,2001,2002,2003,2004,2028,2101,2102,2201,2202,2301,2302,2303,2401,2402,3001,3002,3003,3004,3010,3011,3012,3013,3020,3021,3022,3023,3024,3025,3030,3031,3032,3040,3041,3042,3043,3044,3045,3046,3047,3050,3051,3052,3053,3054,3055,3060,3061,3062,3070,3071,3072,3080,3081,3082,3083,3084,3090,3091,3092,3100,3102,3103,3110,3111,3112,3113,3120,3121,3122,3123,3130,3131,3132,5656,8040,9000

Please help me to resolve this error.

Control file:

LOAD DATA
INFILE '$IN_DIR/$FILENAME'
BADFILE '$REJ_DIR/$FILENAME.bad'
APPEND
INTO TABLE test
FIELDS TERMINATED BY ","
TRAILING NULLCOLS
  (
   FILE_RECORD_DESCRIPTOR               POSITION(1),
   LINE_NO                              FILLER,
   DEPT                                 TERMINATED BY "\n",
   SEQ_NO                               CONSTANT "1",
   DEPT_NO                              CONSTANT "0",
   STATUS                               CONSTANT "U",
   PROCESS_ID                           CONSTANT "2"
  )

Table structure:

  CREATE TABLE TEST
   (
   FILE_RECORD_DESCRIPTOR VARCHAR2(5)    NOT  NULL, 
   DEPT                   VARCHAR2(4000) NOT NULL, 
   SEQ_NO                 NUMBER         NOT NULL, 
   DEPT_NO                NUMBER(4,0),
   STATUS                 VARCHAR2(10), 
   ERROR_DETAIL           VARCHAR2(3000), 
   PROCESS_ID             NUMBER(8,0)
   );

File:

FILE_RECORD_DESCRIPTOR,LINE_NO,DEPT
EXDEP,2,1,2,3,4,5,6,7,8,9,10,11,12,13,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,31,33,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,111,112,121,654,666,667,1001,1100,1102,1103,1104,1105,1106,1107,1108,1109,1110,1111,1112,1113,1114,1115,1116,1117,1118,1119,1120,1121,1123,1124,1211,2001,2002,2003,2004,2028,2101,2102,2201,2202,2301,2302,2303,2401,2402,3001,3002,3003,3004,3010,3011,3012,3013,3020,3021,3022,3023,3024,3025,3030,3031,3032,3040,3041,3042,3043,3044,3045,3046,3047,3050,3051,3052,3053,3054,3055,3060,3061,3062,3070,3071,3072,3080,3081,3082,3083,3084,3090,3091,3092,3100,3102,3103,3110,3111,3112,3113,3120,3121,3122,3123,3130,3131,3132,5656,8040,9000
Flashing asked 23/7, 2015 at 5:24

Comments:
Use a CLOB data type instead of VARCHAR2(4000). – Fishbolt
Thanks for the suggestion. However, I would like to know why I am getting the error when the data length is less than 1000 bytes. – Flashing
Please post your .ctl file. – Protagonist
Please share your SQL*Loader control file. – Fishbolt
Added the control file with the original data. – Flashing
Refer to #10386520. – Fishbolt
How can I tell SQL*Loader to treat a particular column as VARCHAR2(4000) instead of the default CHAR(255) while loading into the table? – Flashing
Welcome to Stack Overflow! I indented your code sample by 4 spaces so that it renders properly; please see the editing help for more information on formatting. Good luck! – Rosellaroselle

The error occurs because the data read from the data file is larger than sqlldr's default character buffer of 255 bytes, which is used when no CHAR and size is specified for a field. Note this is different from the size of the column that the field corresponds to. For example, if I have a table column of VARCHAR2(4000), but do not explicitly give a size in the control file

cola  not null,

and the data in the data file exceeds 255 characters but is less than 4000 in length, you'll get the error.

However, if the control file states the buffer size like this:

cola char(4000) not null,

all will be good, as it creates a larger buffer (here it matches the column size). So get in the habit of always including the column sizes. Save yourself some hassle and create a function to generate a default control file for you... wait, I posted mine for you, give it a try: https://mcmap.net/q/1476311/-sqlldr-to-accept-1-type-of-date-format
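For the control file in the question, that means giving the DEPT field an explicit buffer size; a minimal sketch of the field list, sized to match the VARCHAR2(4000) column (only the DEPT line changes from what was posted):

  (
   FILE_RECORD_DESCRIPTOR               POSITION(1),
   LINE_NO                              FILLER,
   DEPT                                 CHAR(4000) TERMINATED BY "\n",
   SEQ_NO                               CONSTANT "1",
   DEPT_NO                              CONSTANT "0",
   STATUS                               CONSTANT "U",
   PROCESS_ID                           CONSTANT "2"
  )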

Counterbalance answered 29/12, 2016 at 15:31

Use CHAR(nn) in your control file for a field whose data exceeds the default length of 255. Your control file should look like this:

LOAD DATA
INFILE '$IN_DIR/$FILENAME'
BADFILE '$REJ_DIR/$FILENAME.bad'
APPEND
INTO TABLE test
FIELDS TERMINATED BY ","
TRAILING NULLCOLS
  (
   FILE_RECORD_DESCRIPTOR               POSITION(1),
   LINE_NO                              FILLER,
   DEPT                                 CHAR(4000) TERMINATED BY "\n",
   SEQ_NO                               CONSTANT "1",
   DEPT_NO                              CONSTANT "0",
   STATUS                               CONSTANT "U",
   PROCESS_ID                           CONSTANT "2"
  )

The CONSTANT fields do not need a buffer size; only the DEPT field, which receives the long data, needs CHAR(4000). This should solve the problem.

Apocope answered 29/12, 2016 at 7:31

As @Gary_W's answer correctly states, the default buffer size must be overridden with a greater one using the CHAR(<n>) directive in the field description. I just want to note that <n> may not be enough even if it is set to the all_tab_cols.data_length value of the particular VARCHAR2 column. The value <n> apparently means the length of the internal SQL*Loader buffer used for processing the value before inserting it into the database. If the field description contains a preprocessing function which accepts a string longer than the column width (even if the result finally fits into the buffer), the "Field in data file exceeds maximum length" error occurs anyway.

Example:

I was importing a CSV file with EOLs escaped as \n (real EOLs are hard to process with SQL*Loader, but that's a different story). The csv.unescapize function converts the 11-character string 123456789\n into the 10-character string 123456789<the CHR(10) character>. To import such a string into a VARCHAR2(10) column, this can't be used:

load data
...
fields terminated by "," optionally enclosed by '"' trailing nullcols
(
  ID,
  VALUE CHAR(10) "csv.unescapize(:VALUE)"
)

since the original value does not fit into the CHAR(10) buffer. (Knowing how the unescapize function works, the buffer must be dimensioned for the worst case, CHAR(20) in this case.)
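Dimensioned for that worst case, the field definition would look something like this (a sketch reusing the author's csv.unescapize call):

(
  ID,
  VALUE CHAR(20) "csv.unescapize(:VALUE)"
)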

Ulysses answered 8/12, 2021 at 16:25
