pyodbc: Inserting into SQL server with fast_executemany results in MemoryError
Environment
- Python: 3.6.8
- pyodbc: 4.0.26
- OS: Alpine 3.8
- DB: Azure SQL Database
- driver: Microsoft ODBC Driver 17 for SQL Server
Issue
I’m loading data from a SQL Server 2016 instance into an Azure SQL Database. When inserting rows with a parameterized INSERT statement and fast_executemany=False, it works perfectly fine. When I turn fast_executemany on, a very brief error message is displayed:
```
  in bulk_insert_rows
    cursor.executemany(sql, row_chunk)
MemoryError
```
This is all I get. I’ve tried setting different encodings on the connection, as described here: https://github.com/mkleehammer/pyodbc/wiki/Unicode. It fails every single time with fast_executemany set to True and succeeds every single time with it turned off.
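For reference, a minimal sketch of the pattern in use (the connection string, table, columns, and sample rows are placeholders; the setdecoding/setencoding calls are the documented pyodbc API from the wiki page above, not the exact combinations that were tried):

```python
import pyodbc

# Placeholder connection string; real server/database/credentials are not shown here.
cnxn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;UID=user;PWD=secret"
)

# Encoding settings along the lines of the pyodbc Unicode wiki page.
cnxn.setdecoding(pyodbc.SQL_CHAR, encoding="utf-8")
cnxn.setdecoding(pyodbc.SQL_WCHAR, encoding="utf-16le")
cnxn.setencoding(encoding="utf-16le")

sql = "INSERT INTO dbo.target_table (col1, col2) VALUES (?, ?)"
row_chunk = [("some text", 1), ("more text", 2)]

cursor = cnxn.cursor()
cursor.fast_executemany = True   # works when False, raises MemoryError when True
cursor.executemany(sql, row_chunk)
cnxn.commit()
```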
Looking for other ideas to troubleshoot. Thanks.
About this issue
- State: closed
- Created 5 years ago
- Comments: 15 (6 by maintainers)
@gordthompson yes, using SQL_WVARCHAR works:
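A minimal sketch of that call (cursor, sql, and row_chunk are illustrative names; pyodbc’s Cursor.setinputsizes takes one (sql_type, size, precision) tuple per parameter marker):

```python
# Tell the driver how to bind the string parameter:
# SQL_WVARCHAR with size 0 and precision 0 binds as nvarchar(max).
cursor.setinputsizes([(pyodbc.SQL_WVARCHAR, 0, 0)])
cursor.fast_executemany = True
cursor.executemany(sql, row_chunk)
```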
(The (0, 0) for size and precision instructs the driver to bind as nvarchar(max) instead of regular nvarchar — and is needed if you want to insert >4000 characters.)