I've written a script to load some objects into a Django database using the Django ORM. The underlying database is Postgres.
After running happily for a while, the script fails with this error:

```
django.db.utils.DatabaseError: out of shared memory
HINT: You might need to increase max_locks_per_transaction.
```
I'm guessing this is a problem with my script's efficiency, rather than the database settings.
The script iterates over a CSV file and creates a database object for every row; typically there are a couple of thousand objects to create. I've read some background material on database efficiency in Django, and I can rule out a few common mistakes: I'm not iterating over a queryset, using `__in` queries, or using OFFSET.
But I do have quite a lot of indexes on fields in my database, and I guess that each time I create and save an object, Postgres has to update all of those indexes. I have six indexes on the StoreItem fields, for example.
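For context, the model is shaped roughly like this (the field names match the loop below, but which fields are indexed, and the exact field types, are my own illustration):

```python
from django.db import models

class StoreItem(models.Model):
    # Which fields carry db_index=True is illustrative -- the real model has
    # six indexes in total, including the implicit index on the ForeignKey.
    store = models.ForeignKey('Store', on_delete=models.CASCADE)
    display_url = models.URLField(db_index=True)
    retailer_img_url = models.URLField(db_index=True)
    name = models.CharField(max_length=255, db_index=True)
    description = models.TextField()
    affiliate = models.CharField(max_length=100, db_index=True)
```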
```python
for item in csv_rows:
    s, created = StoreItem.objects.get_or_create(
        display_url=item['display_url'],
        retailer_img_url=item['retailer_img_url'],
        store=store_obj,
    )
    s.name = item['name']
    s.description = item['description']
    s.affiliate = item['affiliate']
    # ... more field assignments
    s.save()
```
Two questions:
- Is it possible that updating the database indexes could cause this error?
- How can I debug whether this is what's happening?
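For the second question, one idea I had (assuming it even makes sense to query Postgres's pg_locks view from inside the same process) is to log a rough lock count every few hundred rows, something like:

```python
from django.db import connection

def server_lock_count():
    # pg_locks is a Postgres system view listing every lock the server is
    # currently tracking, across all sessions -- a crude but easy signal.
    with connection.cursor() as cursor:
        cursor.execute("SELECT count(*) FROM pg_locks")
        return cursor.fetchone()[0]

for i, item in enumerate(csv_rows):
    # ... the existing get_or_create / save logic goes here ...
    if i % 500 == 0:
        print("row %d: %d locks held server-wide" % (i, server_lock_count()))
```

If the count climbs steadily over the run, I'd take that as a sign that locks are accumulating inside one long-running transaction rather than being released row by row. Is that a reasonable way to check, or is there a better approach?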