Most of the answers above are correct and revolve around uploading the data from the terminal with local_infile, but the problem with this approach is that if you are on shared hosting with a phpMyAdmin instance, you might get stuck with the following, because your shared hosting provider won't let you change the local_infile setting.
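You can check the setting yourself by running this query (from phpMyAdmin's SQL tab, for example):

SHOW VARIABLES LIKE 'local_infile';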
+---------------+-------+
| Variable_name | Value |
+---------------+-------+
| local_infile  | OFF   |
+---------------+-------+
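Attempting to change it yourself will typically fail, because enabling local_infile globally requires admin privileges that shared hosting providers rarely grant (the exact error message varies by MySQL version):

SET GLOBAL local_infile = 1;
-- ERROR 1227 (42000): Access denied; you need (at least one of) the SUPER or SYSTEM_VARIABLES_ADMIN privilege(s) for this operation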
As a workaround, since I had to insert about 200,000 rows into the database, I wrote the shell script below, which did the job. You can increase or decrease BATCH_SIZE to suit your use case (if a batch grows past the server's max_allowed_packet, the INSERT will fail, so lower it in that case).
#!/bin/bash
# MySQL credentials
DB_HOST="host"
DB_USER="db_user"
DB_PASS="db_pass"
DB_NAME="db_name"
TABLE_NAME="table_name"
# Path to the CSV file
CSV_FILE="data.csv"
# Field Separator (comma in this case)
IFS=','
# Batch size
BATCH_SIZE=1000
counter=0
SQL_BATCH="INSERT INTO $TABLE_NAME (sub_category, product_name, product_composition, product_price, product_manufactured, product_desc, product_usp, product_interactions) VALUES "
# Read CSV file line by line
while read -r sub_category product_name product_composition product_price product_manufactured product_desc product_usp product_interactions; do
    # Escape single quotes to prevent SQL syntax errors
    sub_category=$(echo "$sub_category" | sed "s/'/''/g")
    product_name=$(echo "$product_name" | sed "s/'/''/g")
    product_composition=$(echo "$product_composition" | sed "s/'/''/g")
    product_price=$(echo "$product_price" | sed "s/'/''/g")
    product_manufactured=$(echo "$product_manufactured" | sed "s/'/''/g")
    product_desc=$(echo "$product_desc" | sed "s/'/''/g")
    product_usp=$(echo "$product_usp" | sed "s/'/''/g")
    product_interactions=$(echo "$product_interactions" | sed "s/'/''/g")
    # Append the current row values to the SQL batch
    SQL_BATCH="$SQL_BATCH ('$sub_category', '$product_name', '$product_composition', '$product_price', '$product_manufactured', '$product_desc', '$product_usp', '$product_interactions'),"
    # Increment the counter
    ((counter++))
    # If we have reached the batch size, execute the SQL
    if [[ $counter -eq $BATCH_SIZE ]]; then
        # Remove the last comma and add a semicolon to complete the SQL statement
        SQL_BATCH="${SQL_BATCH%,};"
        # Execute the batch insert
        mysql -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASS" -D "$DB_NAME" -e "$SQL_BATCH"
        # Reset the batch and counter
        SQL_BATCH="INSERT INTO $TABLE_NAME (sub_category, product_name, product_composition, product_price, product_manufactured, product_desc, product_usp, product_interactions) VALUES "
        counter=0
    fi
done < "$CSV_FILE"
# Execute the remaining records if there are any
if [[ $counter -gt 0 ]]; then
    # Remove the last comma and add a semicolon
    SQL_BATCH="${SQL_BATCH%,};"
    # Execute the remaining batch
    mysql -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASS" -D "$DB_NAME" -e "$SQL_BATCH"
fi
echo "Data import complete."
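To use it, save the script under any name (import_products.sh is just an example here), make it executable, and run it:

chmod +x import_products.sh
./import_products.sh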
This workaround might take some time for large datasets, but it gets the job done. Note that the CSV parsing is naive: it assumes no quoted fields with embedded commas and no embedded newlines, and while the sed calls escape single quotes ('' is a valid escape inside a single-quoted MySQL string), backslashes in the data could still break a statement.
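One small tweak worth knowing: if your CSV has a header row, replace the redirection at the bottom of the loop with a process substitution. It skips the first line while keeping the while loop in the current shell, so $counter is still visible for the final flush:

# Skip the header line; process substitution avoids running the loop in a subshell
done < <(tail -n +2 "$CSV_FILE")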