I am looking to import around 100 CSV files that all contain the same kind of data. I am using Oracle SQL Developer.
-
Please add details of what you have tried and at what point you're encountering a problem. – devlin carnate, Feb 6, 2020 at 16:13
-
Just to make sure: your database is MS SQL Server, and the tool you use is Oracle SQL Developer. Is that correct? – Littlefoot, Feb 6, 2020 at 16:13
-
I have tried to search for methods to do this and can't seem to find anything that does it in one bulk import. Any tips? – Callum Jones, Feb 6, 2020 at 16:14
-
stackoverflow.com/questions/6198863/oracle-import-csv-file – Monofuse, Feb 6, 2020 at 16:15
-
@Littlefoot Yes, I believe so. – Callum Jones, Feb 6, 2020 at 16:15
1 Answer
Is this SQL Server or Oracle? Either way, if I were you, I would merge all 100 files into one single file, and load that into any database you are working with. Python will easily do the merge task for you. Then, load the consolidated file into your DB.
import glob
import os
import pandas as pd

# Uncomment to change the working directory first, if needed:
# os.chdir("C:\\your_path\\")

filelist = glob.glob("C:\\your_path\\test\\*.csv")

# Read each CSV into a DataFrame, then concatenate them all in one step.
# (DataFrame.append was removed in pandas 2.0; pd.concat replaces it.)
dfList = []
for filename in filelist:
    print(filename)
    namedf = pd.read_csv(filename, skiprows=0, index_col=0)
    dfList.append(namedf)

results = pd.concat(dfList)
results.to_csv('C:\\your_path\\CombinedFile.csv')
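For the final step of loading the consolidated file into your DB, here is a minimal sketch using pandas `to_sql` with an in-memory SQLite database as a stand-in (the table name `combined_data` is an assumption; for Oracle you would pass a SQLAlchemy engine built from an `oracle+cx_oracle://` URL instead of the sqlite3 connection):

```python
import sqlite3

import pandas as pd

# Small DataFrame standing in for the merged CombinedFile.csv.
combined = pd.DataFrame({"id": [1, 2, 3], "value": ["a", "b", "c"]})

# to_sql creates the table and inserts all rows in one call.
conn = sqlite3.connect(":memory:")
combined.to_sql("combined_data", conn, if_exists="replace", index=False)

# Verify the load by counting the inserted rows.
rows = conn.execute("SELECT COUNT(*) FROM combined_data").fetchone()[0]
print(rows)
```

With 100 real files, `pd.read_csv` on each file followed by one `to_sql` call keeps the whole pipeline in a single script instead of 100 manual imports through the SQL Developer wizard.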