
I struggled for quite a while trying to save a table containing JSON columns into MySQL, using SQLAlchemy and pandas' to_sql.

I got this error

sqlalchemy.exc.ProgrammingError: (mysql.connector.errors.ProgrammingError) Failed processing pyformat-parameters; Python 'list' cannot be converted to a MySQL type

when trying to do it the standard way shown below. I have a pandas DataFrame (df) with two columns containing JSON data ('videos' and 'newsitems').

import sqlalchemy
connection = sqlalchemy.create_engine(f"mysql+mysqlconnector://{user}:{pw}@{host}/{db}")
table_name = 'TABLENAME'
df.to_sql(table_name, 
          connection, 
          if_exists='append', 
          index=False)
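
For illustration, the DataFrame looks roughly like the sketch below (the contents are placeholders, not my real data): the 'videos' and 'newsitems' cells hold Python lists/dicts, which mysql-connector cannot convert to a MySQL type on its own, hence the error above.

import pandas as pd

# Hypothetical sample data; the real 'videos' and 'newsitems' columns
# hold similar nested list/dict structures parsed from JSON.
df = pd.DataFrame({
    'id': [1, 2],
    'videos': [[{'url': 'https://example.com/a'}],
               [{'url': 'https://example.com/b'}]],
    'newsitems': [[{'title': 'first'}],
                  [{'title': 'second'}]],
})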
  • Comment: I'm getting an error with my JSON column "scans": ValueError: scans (<class 'sqlalchemy.sql.sqltypes.JSON'>) not a string (Jul 31, 2023)

1 Answer


I found a response elsewhere, written for PostgreSQL, that works just as well with MySQL. Adding the dtype={...} argument (the last two lines below) fixed the issue:

import sqlalchemy
connection = sqlalchemy.create_engine(f"mysql+mysqlconnector://{user}:{pw}@{host}/{db}")
table_name = 'TABLENAME'
df.to_sql(table_name, 
          connection, 
          if_exists='append', 
          index=False,
          dtype={'videos': sqlalchemy.types.JSON,
                 'newsitems': sqlalchemy.types.JSON})
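
The dtype mapping makes pandas create and bind those columns as SQLAlchemy's JSON type, so the Python lists/dicts are serialized to JSON before they reach the connector; the rest of the to_sql call is unchanged. As a quick sanity check (a sketch reusing the engine and table name from above), you can read the table back:

import pandas as pd

# Read the table back to confirm the JSON columns round-trip.
# Depending on the driver, the values may come back as JSON strings
# rather than Python lists/dicts; json.loads would restore them.
df_check = pd.read_sql(f"SELECT * FROM {table_name}", connection)
print(df_check[['videos', 'newsitems']].head())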

The original Q&A that helped me solve the issue was this one:

Writing JSON column to Postgres using Pandas .to_sql
