I wrote a Python script that takes a date argument and extracts data from a file (4.2 MB) to populate a table. When I run it, I get this error:
  File "./insert_pru_data.py", line 136, in <module>
    importYear(year)
  File "./insert_pru_data.py", line 124, in importYear
    SQLrequest += "(" + ", ".join(data_to_insert[i]) + "),\n"
MemoryError
My Code:
def importYear(year):
    go = True
    if isAlreadyInserted(year):
        if replace == False:
            print("data for year " + year + " already inserted, action cancelled")
            go = False
        else:
            print("data for year " + year + " already inserted, the data will be replaced")
            deleteData(year)
    if go:
        data_to_insert = getDataToInsert(data)
        SQLrequest = "INSERT INTO my_table (date_h, day, area, h_type, act, dir, ach) VALUES\n"
        i = 0
        print(data_to_insert)
        while i < len(data_to_insert) - 1:
            data_to_insert[i] = ["None" if element == None else element for element in data_to_insert[i]]
            SQLrequest += "(" + ", ".join(data_to_insert[i]) + "),\n"
            i += 1  # without this increment the loop never ends and SQLrequest grows until memory runs out
        SQLrequest += "(" + ", ".join(data_to_insert[len(data_to_insert) - 1]) + ");"
        with psycopg2.connect(connString) as conn:  # open the database connection
            with conn.cursor() as cur:
                cur.execute(SQLrequest)
                cur.execute("COMMIT")
                cur.close()

importYear(year)
Can someone help me work out how to solve this problem?
You're building one enormous SQL string out of data_to_insert, and you're running out of memory (possibly on a 32-bit build of Python, where you're limited to ~2 GB of virtual address space no matter how much RAM you have). We have no idea where data_to_insert came from, so that's all that can be said.

Comment from the asker: data_to_insert is a list that came from a CSV file.
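One way to avoid the giant string entirely is to let the database driver insert the rows in fixed-size parameterized batches, so only one batch of rows is ever formatted at a time. Below is a minimal sketch of the idea using the stdlib sqlite3 module as a runnable stand-in; with psycopg2 the analogous call is cur.executemany(...) with %s placeholders, or psycopg2.extras.execute_values for better performance. The table and column names are taken from the question; everything else is illustrative.

```python
import sqlite3

def insert_in_batches(conn, rows, batch_size=1000):
    # Parameterized INSERT: the driver formats each row itself, so no
    # giant SQL string is ever built in Python memory.
    sql = ("INSERT INTO my_table (date_h, day, area, h_type, act, dir, ach) "
           "VALUES (?, ?, ?, ?, ?, ?, ?)")
    cur = conn.cursor()
    for start in range(0, len(rows), batch_size):
        # Only batch_size rows are handed to the driver per call.
        cur.executemany(sql, rows[start:start + batch_size])
    conn.commit()

# Usage: an in-memory database standing in for the real table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (date_h, day, area, h_type, act, dir, ach)")
rows = [("2023-01-01 %02d:00" % h, "Mon", "A", "t", 1.0, "in", None)
        for h in range(24)]
insert_in_batches(conn, rows, batch_size=10)
print(conn.execute("SELECT COUNT(*) FROM my_table").fetchone()[0])  # prints 24
```

A side benefit of parameterized inserts is that None is passed through as SQL NULL by the driver, so the "None if element == None" string substitution in the question becomes unnecessary.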