
I have a Django deployment in production that uses MySQL.

I would like to do further development with SQLite, so I would like to import my existing data into an SQLite database.

There is a shell script here to convert a general MySQL dump to SQLite, but it didn't work for me (apparently the general problem isn't easy).

I figured doing this using the Django models must be much easier. How would you do this? Does anyone have any script to do this?

daphshez

4 Answers


Use

manage.py dumpdata > your_file.json

to export your data from the production system (docs).

Then move the file on the development system and run

manage.py loaddata your_file.json

You can also put the file in the your_app/fixtures folder under the name "initial_data.json"; it will then be loaded automatically when you run "manage.py syncdb" (docs).
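On newer Django versions, the most common failure when loading the dump into a fresh database is an integrity error on rows that `migrate` has already recreated (content types and permissions). A hedged sketch of the full round trip, with those tables excluded (the fixture filename is a placeholder):

```shell
# On the production system (MySQL configured in settings.py):
python manage.py dumpdata --natural-foreign \
    --exclude contenttypes --exclude auth.permission \
    --indent 2 > your_file.json

# On the development system (SQLite configured in settings.py):
python manage.py migrate          # create the schema first
python manage.py loaddata your_file.json
```

`--natural-foreign` makes the dump reference content types by name rather than by primary key, which keeps the fixture portable between databases.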

Sergio Morstabilini
  • in my scenario I'm using Mezzanine and django in a python virtualenv. In order for these commands to work for me, I had to add "python" at the start. For example `python manage.py dumpdata > db_mezzanine_bak.json` – Adrián Jaramillo Dec 17 '19 at 09:48

Have you tried using manage.py dumpdata > datadump and then, once the new database is set up correctly, python manage.py loaddata datadump?

Uku Loskit

If you have django.contrib.contenttypes in your installed apps:

INSTALLED_APPS = (
    'django.contrib.contenttypes',
)

Use a script like the following to copy your entries to the new database:

from django.contrib.contenttypes.models import ContentType

def run():

    def do(model):
        # Copy every row of this model into the 'slave' database.
        if model is not None:
            for obj in model.objects.all():
                obj.save(using='slave')

    # syncdb/migrate pre-populates content types on the new database;
    # delete them so the copied rows keep their original primary keys.
    ContentType.objects.using('slave').all().delete()

    # Walk every registered model and copy its rows.
    for content_type in ContentType.objects.all():
        do(content_type.model_class())

See the full manual here
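For the script above to work, settings.py needs a second database alias named 'slave' pointing at the target SQLite file. A minimal sketch; the database names and file path below are placeholders, only the 'slave' alias name is what the script assumes:

```python
# settings.py -- 'default' stays on MySQL, 'slave' is the new SQLite target.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'production_db',
        # user/password/host as in your existing configuration
    },
    'slave': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'dev.sqlite3',
    },
}
```

Create the schema on the new database first (`python manage.py syncdb --database slave`, or `migrate --database slave` on newer Django) before calling run().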

Sergey Lyapustin
  • I think you forgot to mention something. Here is what worked for me: 1) Ensure correct settings (i.e. add contenttypes if it is missing), then `python manage.py syncdb --database slave` or `python manage.py migrate --database slave`. 2) Write the function `run()` somewhere. 3) Call it from `python manage.py shell`, or create a Django custom command (https://docs.djangoproject.com/en/1.9/howto/custom-management-commands/) and use it. – Ajeeb.K.P Mar 01 '16 at 10:29
  • That's a pretty outdated answer. Recent Django has a better version of `dumpdata`, so you don't need a custom solution for switching between DBs. – Sergey Lyapustin Mar 01 '16 at 15:28
  • IMHO, and based on my experience with my current project, I don't think so. I tried dumpdata and loaddata in Django 1.8 to convert from PostgreSQL to SQLite, which ended in an error. Your solution works like a charm. Anyway, users should try dumpdata and loaddata first, then they might try this too (just like me). – Ajeeb.K.P Mar 02 '16 at 08:54
  • Neither dumpdata/loaddata nor this little routine works for me. The problem is caused by foreign keys that link to table entries that do not yet exist; obviously the order of tables matters. It would be great to add to this routine a function that postpones tables with foreign keys, but I have no idea how to test tables for fields with foreign keys. – user1491229 May 01 '16 at 20:31
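Regarding the last comment, the save order can be derived with a plain topological sort once you know which tables reference which; in Django the edges can be discovered via each model's `_meta.get_fields()` (foreign-key fields have `many_to_one` set). A framework-free sketch; the table names and dependency map below are made up for illustration:

```python
from collections import deque

def topo_order(deps):
    """Order tables so every table comes after the tables it references.

    deps maps a table name to the set of tables its foreign keys point at.
    """
    # Count how many not-yet-saved dependencies each table still has.
    pending = {t: len(d) for t, d in deps.items()}
    # Reverse map: which tables are waiting on t?
    waiting = {t: set() for t in deps}
    for t, d in deps.items():
        for parent in d:
            waiting[parent].add(t)

    queue = deque(t for t, n in pending.items() if n == 0)
    order = []
    while queue:
        t = queue.popleft()
        order.append(t)
        for child in waiting[t]:
            pending[child] -= 1
            if pending[child] == 0:
                queue.append(child)
    if len(order) != len(deps):
        raise ValueError("dependency cycle: needs nullable FKs and a second pass")
    return order

# Hypothetical dependency map: Comment references Article and User, etc.
deps = {
    'User': set(),
    'Article': {'User'},
    'Comment': {'Article', 'User'},
}
print(topo_order(deps))  # 'User' first, 'Comment' last
```

Saving tables in this order avoids the "foreign key target does not yet exist" failures; self-referencing tables still need special handling, as the raised error notes.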

Maybe give South a try:

http://south.aeracode.org/docs/index.html

mkelley33