"Invalid parameter type" (numpy.int64) when inserting rows with executemany()

I am trying to insert a batch of data into a database:

insert_list = [(1,1,1,1,1,1), (2,2,2,2,2,2), (3,3,3,3,3,3), ...]  # up to 10000 tuples in this list

conn = pyodbc.connect('DRIVER={FreeTDS};SERVER=xxxxx;DATABASE=xxxx;UID=xx;PWD=xx;TDS_Version=7.0')
cursor = conn.cursor()

sql = "insert into ScanEMAxEMAHistoryDay(SecurityNumber, EMA1, EMA2, CrossType, DayCross, IsLocalMinMax) values (?, ?, ?, ?, ?, ?)"

cursor.executemany(sql, insert_list)

pyodbc.ProgrammingError: ('Invalid parameter type. param-index=4 param-type=numpy.int64', 'HY105')

Reducing to 100 tuples:

cursor.executemany(sql, insert_list[:100])

pyodbc.ProgrammingError: ('Invalid parameter type. param-index=4 param-type=numpy.int64', 'HY105')

Reducing to 5 tuples:

cursor.executemany(sql, insert_list[:5])
conn.commit()

This inserts into the database successfully.

I have tried running:

sql = 'SET GLOBAL max_allowed_packet=50*1024*1024'
cursor.execute(sql)

before executemany(), but it raised an error:

pyodbc.ProgrammingError: ('42000', "[42000] [FreeTDS][SQL Server]'GLOBAL' is not a recognized SET option. (195) (SQLExecDirectW)")

How can I solve this?

Thank you.

Precursor answered 1/2, 2017 at 7:13 Comment(0)

Your problem is not with the volume of data per se; it is that some of your tuples contain numpy.int64 values that cannot be used directly as parameter values for your SQL statement. For example,

import numpy

a = numpy.array([10, 11, 12], dtype=numpy.int64)
params = (1, 1, a[1], 1, 1, 1)  # a[1] is a numpy.int64, not a Python int
crsr.execute(sql, params)

will throw

ProgrammingError: ('Invalid parameter type. param-index=2 param-type=numpy.int64', 'HY105')

because the third parameter value is a numpy.int64 element from your numpy array a. Converting that value with int() will avoid the issue:

a = numpy.array([10, 11, 12], dtype=numpy.int64)
params = (1, 1, int(a[1]), 1, 1, 1)
crsr.execute(sql, params)
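
Applied to the insert_list from the question, a minimal sketch (assuming all six columns hold numeric values, as in the post) is to cast each element to a plain Python int before calling executemany():

# Cast every numpy scalar in every row to a native Python int.
clean_list = [tuple(int(value) for value in row) for row in insert_list]
cursor.executemany(sql, clean_list)
conn.commit()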

By the way, the reason that

sql = 'SET GLOBAL max_allowed_packet=50*1024*1024'
cursor.execute(sql)

didn't work is that max_allowed_packet is a MySQL setting that does not have any meaning for Microsoft SQL Server.

Amathist answered 1/2, 2017 at 22:54 Comment(4)
How can I do this with a Pandas DataFrame? I've tried df[col].astype(int) for every col but still no joy. – Cassidycassie
@MattCamp - I just tried a simple test using df.values.tolist() and it worked for me. If you require further assistance you should ask a new question. – Amathist
Yeah, I got it to work. There was an issue where it would not let me commit the DataFrame because there were duplicates in a field where there shouldn't be, and when I tried fewer records it threw the above error. Once I changed the duplicate records to test things out, it worked. Thanks. – Cassidycassie
When getting the "Invalid parameter type" error, is there a good way to tell which of the values is causing it? With a large number of columns it isn't always obvious which one is at fault. – Electrolier

I did the same as Robert; I converted everything to string. In my case, it was a pandas DataFrame that I cast to string type:

data = pandas.read_json(...)
data.astype(str).to_sql(...)

https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.astype.html
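
For context, here is a minimal sketch of the full round trip; the connection string and table name are placeholders (not from the original post), and it assumes a SQLAlchemy engine for to_sql():

import pandas
from sqlalchemy import create_engine

# Placeholder DSN-style connection string and table name; substitute your own.
engine = create_engine("mssql+pyodbc://user:password@my_dsn")
df = pandas.DataFrame({"SecurityNumber": [1, 2], "EMA1": [1.5, 2.5]})
df.astype(str).to_sql("SomeTable", engine, if_exists="append", index=False)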

If the data you are retrieving includes URLs, you may get an "unknown protocol" error (or something similar). If you get this error even after casting to string type, try using StringIO instead:

import requests
from io import StringIO
...
data = pandas.read_json(StringIO(response.text))

where response is a Response object from the requests library and its text attribute contains the JSON text data.
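
For completeness, a sketch of how response might be obtained; the URL here is a placeholder, not something from the original post:

import requests
from io import StringIO
import pandas

response = requests.get("https://example.com/data.json")  # placeholder URL
data = pandas.read_json(StringIO(response.text))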

Jalopy answered 23/7, 2021 at 20:22 Comment(0)

For anyone out there reading this, it was driving me up the wall.

My eventual solution was to convert all variables to 'str' and it worked fine.
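
A minimal sketch of that approach, reusing the sql, insert_list, cursor and conn names from the question; it relies on SQL Server implicitly converting the strings back to the column types on insert:

# Convert every value in every row to str before binding the parameters.
insert_list_str = [tuple(str(value) for value in row) for row in insert_list]
cursor.executemany(sql, insert_list_str)
conn.commit()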

Phyllis answered 16/11, 2020 at 13:7 Comment(0)

I had the same error when passing parameters into a SQL INSERT statement using pyodbc. As mentioned by Robert, just converting the variables to strings worked.

Example using str():

cursor.execute("INSERT INTO Table ([col1], [col2], [col3], ...) VALUES (?, 'test', ?)", str(value1), str(value3))
Transportation answered 15/10, 2021 at 7:10 Comment(0)
