
I'm using Python 3.6 and have written a script that takes every CSV file in a folder and creates a table for each one in a database.db file using sqlite3. Then, by running sqlite3 database.db, I can execute SQLite commands and query the tables. It works okay, but for large CSV files it's a bit slow (inserting the data into the tables, that is). SQLite has its own command for importing a CSV file into a table, which looks like:

.mode csv
.import FILE_PATH TABLE_NAME

This seems to be much faster at creating the tables and inserting the data than what I have written. Is there any way to write Python code that sends the above commands directly to the sqlite3 command line, so the import could be done for multiple CSV files automatically, without having to type the .import command for each one? Or something to that effect?
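One way to do this (as the comments below suggest) is to drive the sqlite3 command-line tool from Python with subprocess, piping the dot-commands to its stdin. Here's a minimal sketch; it assumes sqlite3 is on your PATH, that the CSVs live in a folder called csv_folder, and that each table should be named after its file (all of these are illustrative assumptions, not part of the original setup):

import pathlib
import subprocess

DB_FILE = "database.db"
CSV_DIR = pathlib.Path("csv_folder")  # hypothetical folder name, for illustration

for csv_path in sorted(CSV_DIR.glob("*.csv")):
    table_name = csv_path.stem  # assumption: name each table after its file
    # Pipe the dot-commands to the sqlite3 CLI's stdin, one run per file.
    commands = f".mode csv\n.import {csv_path} {table_name}\n"
    subprocess.run(
        ["sqlite3", DB_FILE],
        input=commands,
        universal_newlines=True,  # text=True on Python 3.7+; 3.6 needs this spelling
        check=True,               # raise if sqlite3 exits with an error
    )

Since .import itself creates each table here, the sqlite3 shell uses the first row of each CSV as the column names; if a table already exists, every row (including the header) is inserted as data. Paths containing spaces would need quoting inside the .import line.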

  • You can use Python to send commands to the terminal using subprocess or os.system, see this thread. This might work for you. Commented Jul 30, 2021 at 15:16
  • Specifically this answer: stackoverflow.com/a/59671652/8451814 Commented Jul 30, 2021 at 15:51
  • If it is a very large dataset, you also have the option of using bulk copy: pypi.org/project/bcpy Commented Jul 31, 2021 at 6:33
  • @Woodford Yes, perfect, that's exactly what I was looking for, thank you :) Commented Aug 2, 2021 at 15:40
