
I want to upload a dataframe to MySQL where one column contains data in multiple languages: English, Russian, Chinese, etc. I am using SQLAlchemy's create_engine to create the engine and pandas' to_sql to upload the dataframe to MySQL. Is there any way I can upload this data as-is? I am not sure which parameters to pass when creating the engine, or what else is needed. Thanks in advance.
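
For concreteness, a minimal sketch of the setup described above; the connection details, database name `mydb`, and table name `translations` are placeholders, and the pymysql driver is only one option:

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection details -- adjust driver, user, password, host, and database.
engine = create_engine("mysql+pymysql://user:password@localhost:3306/mydb")

# One column contains several languages (English, Russian, Chinese).
df = pd.DataFrame(
    {
        "lang": ["en", "ru", "zh"],
        "text": ["hello", "привет", "你好"],
    }
)

# Upload the dataframe; "translations" is a placeholder table name.
df.to_sql("translations", con=engine, if_exists="replace", index=False)
```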

  • As long as you have utf8 all the way, and all your languages are covered by utf8mb4 (which you have to check), it should work, so simply try it – nbk Jun 23 '20 at 20:57
  • Does your SQLAlchemy connection URI include `?charset=utf8mb4` ...? – Gord Thompson Jun 23 '20 at 20:57
  • @nbk I tried both utf8 and utf8mb4. Neither seems to work. It does let me upload to SQL, but in SQL the text is replaced with question marks ("???"). – aashay shah Jun 24 '20 at 00:05
  • @GordThompson I tried both utf8 and utf8mb4. Neither seems to work. It does let me upload to SQL, but in SQL the text is replaced with question marks ("???"). – aashay shah Jun 24 '20 at 00:06
  • As already said, everything has to be utf8: table, database, and connection; see https://stackoverflow.com/questions/45279863/how-to-use-charset-and-encoding-in-create-engine-of-sqlalchemy-to-create and the sketch below – nbk Jun 24 '20 at 01:45
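
Putting the comments together, a sketch of the "utf8 all the way" approach, using the same hypothetical database `mydb` and table `translations` as above: `charset=utf8mb4` goes on the connection URI, and the database (and hence table) default character set is set to utf8mb4 as well.

```python
import pandas as pd
from sqlalchemy import create_engine, text

# charset=utf8mb4 makes the client connection itself speak utf8mb4.
engine = create_engine(
    "mysql+pymysql://user:password@localhost:3306/mydb?charset=utf8mb4"
)

with engine.begin() as conn:
    # Make the database default utf8mb4 so tables created by to_sql inherit it;
    # otherwise MySQL silently replaces unsupported characters with "?" on insert.
    conn.execute(
        text("ALTER DATABASE mydb CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci")
    )

df = pd.DataFrame({"lang": ["en", "ru", "zh"], "text": ["hello", "привет", "你好"]})
df.to_sql("translations", con=engine, if_exists="replace", index=False)

# Verify what the table was actually created with.
with engine.connect() as conn:
    print(conn.execute(text("SHOW CREATE TABLE translations")).fetchone())
```

If the table already exists with a latin1 text column, converting it with `ALTER TABLE translations CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci` before re-inserting is the usual fix.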

0 Answers