The error you’re seeing occurs because sys.maxsize on your system is larger than what fits in a C long integer, the underlying data type used by the csv.field_size_limit function. On 64-bit Windows, for example, sys.maxsize is 2**63 - 1 while a C long is only 32 bits, so passing sys.maxsize raises an OverflowError. Instead of using sys.maxsize directly, you can set a large fixed value or incrementally increase the limit until the error no longer occurs.
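If you do want the largest value your platform will accept, one common workaround (a sketch, not part of the csv module itself) is to probe downward from sys.maxsize until the call stops overflowing:

import csv
import sys

# Probe for the largest limit this platform's C long accepts:
# shrink the candidate until field_size_limit stops raising
# OverflowError. On 64-bit Linux/macOS the first attempt with
# sys.maxsize usually succeeds immediately.
candidate = sys.maxsize
while True:
    try:
        csv.field_size_limit(candidate)
        break
    except OverflowError:
        candidate //= 10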
Here are two approaches:
Set a large fixed value:
import csv

# Set a large fixed value that still fits in a C long
csv.field_size_limit(2**30)  # 2**30 characters, roughly 1 GiB

with open('filename.csv', 'r') as file:
    reader = csv.reader(file)
    # Iterate over each row in the CSV
    for row in reader:
        print(row)
Incrementally increase the limit:
If you’re unsure about how big the field might be, you can increase the limit incrementally (for example, double it each time) until the error no longer occurs. This approach may involve a bit more trial and error:
import csv

limit = 131072  # the csv module's default field size limit
while True:
    try:
        csv.field_size_limit(limit)
        with open('filename.csv', 'r') as file:
            reader = csv.reader(file)
            # Iterate over each row in the CSV
            for row in reader:
                print(row)
        break  # exit the loop once the file is read successfully
    except csv.Error as e:  # csv.Error, not _csv.Error, since only csv is imported
        if "field larger than field limit" in str(e):
            limit *= 2  # double the limit and retry
        else:
            raise  # re-raise unrelated CSV errors
This will keep doubling the limit until the file can be read without issues or until some other, unrelated CSV error occurs. Keep in mind that each retry reopens the file and starts over, so rows printed before a failure will be printed again.
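Once reading succeeds, you can confirm where the limit ended up: calling csv.field_size_limit() with no argument returns the current limit without changing it.

import csv

# field_size_limit() with no argument reports the current limit,
# which is handy for logging the value the loop above settled on.
print(csv.field_size_limit())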