
Using Python, I've generated an SQLite database (500k rows over 5 tables) containing grammatical and lexical information from a Hebrew (UTF-8!) corpus.

I want to import this database into MySQL via phpMyAdmin. An earlier, much smaller proof-of-concept CSV table was imported successfully.


I've tried:

  • Wholesale import of the SQLite db. This fails with an error, as the SQL dialects of SQLite and MySQL differ.
  • Import of a grep'ed SQLite dump, in which I replaced all SQLite-specific syntax with MySQL-compatible equivalents. The import was done with phpMyAdmin's import function, with the input encoding set to utf8.

    The import succeeds, but the data is garbled (×להי×) in phpMyAdmin.

    • Interestingly, the data is not garbled when viewed via the mysql CLI.
    • When fetched via PHP, however, the data is garbled again. The same PHP routine works fine on the smaller set, so the garbling does not originate in that part.
    • I've tried to fix this with CHARACTER SET utf8 and by changing the column data type to BLOB and back; neither remedy fixes the garbled output online. (A sketch of a more systematic conversion follows this list.)
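
For concreteness, here is the kind of conversion I mean, done programmatically rather than with grep. This is only a sketch: the file names are placeholders, a real schema may need further dialect fixes, and the SET NAMES header reflects my unverified suspicion that the import connection read the UTF-8 bytes in a different character set (which would explain the mojibake):

    import sqlite3

    # Minimal sketch; "corpus.sqlite" and "corpus_mysql.sql" are placeholders.
    src = sqlite3.connect("corpus.sqlite")

    with open("corpus_mysql.sql", "w", encoding="utf-8") as out:
        # Make the importing connection read the file as UTF-8, accept the
        # double-quoted identifiers that iterdump() emits, and leave
        # backslashes in the data untouched.
        out.write("SET NAMES utf8mb4;\n")  # use utf8 on older servers
        out.write("SET sql_mode = 'ANSI_QUOTES,NO_BACKSLASH_ESCAPES';\n")
        for stmt in src.iterdump():
            if "sqlite_sequence" in stmt:
                continue  # SQLite bookkeeping table; MySQL has no equivalent
            # Crude token replaces; fine as long as the corpus data itself
            # never contains these strings.
            stmt = stmt.replace("BEGIN TRANSACTION;", "START TRANSACTION;")
            stmt = stmt.replace("AUTOINCREMENT", "AUTO_INCREMENT")
            out.write(stmt + "\n")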

The secondary output is one CSV file per SQLite table.

  • Wholesale upload of these files fails: phpMyAdmin displays a blank screen. I assume the files are too big to parse at once.
  • CSV import via LOAD DATA results in an error that leads me to believe my host disables this statement for my user account (I forgot to note the exact error). I can't change hosting at the moment.

For now, I can update the database (the data is rather static, but the SQLite file cannot be shared with end users directly) by splitting the CSV files and uploading the chunks individually. I'd rather have a more automated solution, of course; a sketch of what I have in mind follows below.
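
What I have in mind is a script that inserts the CSV rows in batches over a direct MySQL connection, sidestepping both phpMyAdmin's upload limit and LOAD DATA. This assumes the host permits such connections at all (perhaps only from a script running on the server itself), and every name below (credentials, file, table, columns) is a placeholder:

    import csv
    import pymysql  # third-party driver: pip install pymysql

    # All connection details and the table/column names are placeholders.
    conn = pymysql.connect(host="localhost", user="me", password="secret",
                           database="corpus", charset="utf8mb4")
    cur = conn.cursor()

    with open("words.csv", encoding="utf-8", newline="") as f:
        batch = []
        for row in csv.reader(f):
            batch.append(row)
            if len(batch) == 1000:  # insert in chunks so no statement gets huge
                cur.executemany(
                    "INSERT INTO words (lemma, gloss) VALUES (%s, %s)", batch)
                batch = []
        if batch:
            cur.executemany(
                "INSERT INTO words (lemma, gloss) VALUES (%s, %s)", batch)

    conn.commit()
    conn.close()

Since the connection charset is set explicitly here, this route would also take the encoding question out of phpMyAdmin's hands.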

I'm not very experienced in database design, so I hope I'm just overlooking one glaring option. Thanks for your help.
