Opened 4 years ago

Last modified 5 weeks ago

#31804 assigned New feature

Parallelize database cloning process — at Version 2

Reported by: Ahmad A. Hussein
Owned by: Ahmad A. Hussein
Component: Database layer (models, ORM)
Version: dev
Severity: Normal
Keywords: parallel, mysqlpump
Cc:
Triage Stage: Accepted
Has patch: yes
Needs documentation: no
Needs tests: yes
Patch needs improvement: no
Easy pickings: no
UI/UX: no

Description (last modified by Ahmad A. Hussein)

Parallelizing the database cloning process would yield a nice speed-up when running Django's own test suite (and for all Django projects that use the default test runner).

So far, I see three main ways we could implement this:

  • Use a multiprocessing pool at the setup_databases level whose workers each run clone_test_db
  • Use a pool at the clone_test_db level to parallelize the internal _clone_test_db call
  • Scrap parallelizing the cloning in general, and instead parallelize the internals of specific backends (at least MySQL fits here)
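The first option could look roughly like the following. This is only a sketch: clone_test_db here is a runnable stand-in for the real connection.creation.clone_test_db (which actually copies the test database), and the setup_cloned_databases wrapper is hypothetical, not Django's current code.

```python
from multiprocessing import Pool


def clone_test_db(suffix):
    # Stand-in for connection.creation.clone_test_db(suffix=suffix).
    # The real method copies the test database; here we only return
    # the name the clone would receive, so the sketch stays runnable.
    return f"test_default_{suffix}"


def setup_cloned_databases(parallel):
    # Option 1: fan the per-clone work out to a pool of worker
    # processes at the setup_databases level, one task per clone.
    with Pool(processes=parallel) as pool:
        return pool.map(clone_test_db, range(1, parallel + 1))


if __name__ == "__main__":
    print(setup_cloned_databases(4))
```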

With the first two options, we'd have to refactor MySQL's cloning process, since it makes another call to _clone_db. We have to because otherwise a dump would be created inside each parallel process, slowing the workers greatly.
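One way that refactor could go is to split dump and restore, so the expensive dump happens exactly once and only the restores run in workers. A minimal sketch, where create_dump and restore_clone are hypothetical stand-ins rather than Django's API:

```python
from functools import partial
from multiprocessing import Pool


def create_dump(source_db):
    # Hypothetical stand-in: would shell out to mysqldump/mysqlpump
    # once and return the path of the resulting dump file.
    return f"/tmp/{source_db}.sql"


def restore_clone(dump_path, suffix):
    # Hypothetical stand-in: would load dump_path into clone `suffix`.
    return f"restored clone {suffix} from {dump_path}"


def clone_all(source_db, clones):
    dump = create_dump(source_db)           # expensive step, done once
    with Pool(processes=clones) as pool:    # workers only restore
        return pool.map(partial(restore_clone, dump), range(1, clones + 1))
```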

With the last option, we could consider using mysqlpump instead of mysqldump to both export and restore the database. The con of this approach is that it isn't general enough to apply to the other backends.
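For reference, mysqlpump's --default-parallelism option controls the number of dump threads, which is what makes it attractive here. A sketch of how the backend might assemble such a command (the function name and argument layout are illustrative, not Django's code; only the mysqlpump flag itself is real):

```python
def mysqlpump_dump_args(database, parallelism=4):
    # --default-parallelism is a real mysqlpump option: it sets the
    # number of threads used per parallel-processing queue.
    return [
        "mysqlpump",
        f"--default-parallelism={parallelism}",
        "--databases",
        database,
    ]
```

The resulting argument list could then be handed to subprocess.run, mirroring how the MySQL backend already shells out to its client tools.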

Oracle's cloning process (although not yet merged into current master) has internal support for option 3 (users can specify a PARALLEL parameter to speed up the expdp/impdp utilities), and it can also use the first two options.
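PARALLEL is a real expdp/impdp command-line parameter, so threading a user-supplied value through could look like this sketch (the argument construction is illustrative and not taken from the pending Oracle patch):

```python
def expdp_args(schema, dumpfile, parallel=None):
    # Build an expdp invocation; PARALLEL (a real Data Pump
    # parameter) caps the number of active worker processes.
    args = ["expdp", f"schemas={schema}", f"dumpfile={dumpfile}"]
    if parallel:
        args.append(f"parallel={parallel}")
    return args
```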

The major con with the first two options, though, is that they force parallelization.

Change History (2)

comment:1 by Ahmad A. Hussein, 4 years ago

Owner: changed from nobody to Ahmad A. Hussein

comment:2 by Ahmad A. Hussein, 4 years ago

Description: modified (diff)