I have a Python script which downloads data from an API source and transforms it into a list of dictionaries, which I save as a JSON file.
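For reference, the standalone script boils down to something like this (the field names here are made up for illustration; the real ones come from the API):

```python
import json

def transform(api_records):
    """Reshape raw API records into the list of dicts I save as JSON.

    "id", "name", and "value" are hypothetical field names standing in
    for whatever the real API returns.
    """
    return [
        {
            "external_id": rec["id"],  # stable key from the API
            "name": rec["name"],
            "value": rec["value"],
        }
        for rec in api_records
    ]

def save_snapshot(records, path="snapshot.json"):
    """Write the transformed records to a JSON file on disk."""
    with open(path, "w") as fh:
        json.dump(records, fh)
```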
Separately, I have a Django project that serves this data on a web page. I am currently using the default SQLite DB but plan to use PostgreSQL in production.
The data is updated frequently, so the script needs to download the new data on a daily basis and update the data in the Django project's database.
The issue is that I can make the Django project "work" with dummy sample data, and the script works independently of Django.
How do I integrate this download script with Django so it can "push" new records and updates into the Django DB? Users of the Django project will only read the data; they won't otherwise write or update it.
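To be concrete about what I mean by "push": I assume the daily update is essentially an upsert keyed on some stable ID from the API. From the docs it looks like `Model.objects.update_or_create` does this in Django (perhaps inside a custom management command run by cron), but since I don't have that wired up, here is a plain-Python sketch of the logic with a dict standing in for the table (all names hypothetical):

```python
def sync(table, records):
    """Upsert downloaded records into `table`, keyed on external_id.

    `table` is a dict standing in for the DB table; in Django I assume
    each loop iteration would instead call
    Model.objects.update_or_create(external_id=key, defaults=rec).
    Returns (created, updated) counts.
    """
    created, updated = 0, 0
    for rec in records:
        key = rec["external_id"]
        if key in table:
            table[key].update(rec)  # existing row: apply new values
            updated += 1
        else:
            table[key] = dict(rec)  # new row: insert a copy
            created += 1
    return created, updated
```

Is that the right shape for the sync step, and if so, where does this code live so it can see Django's models?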
I have reviewed a variety of tutorials, such as the Django polls app, but am lost on how to marry these two pieces together. Is this a case of using fixtures over and over again to reload data into the DB? It appears that this is not the best method, as fixtures are meant for test data and reloading them would not be automatic.
What am I missing? Or is Django simply not the best way to serve read-only data from a DB? If so, what would be a better alternative?