
Fixed #1142 -- Added multiple database support.

This monster of a patch is the result of Alex Gaynor's 2009 Google Summer of Code project.
Congratulations to Alex for a job well done.

Big thanks also go to:
 * Justin Bronn for keeping GIS in line with the changes,
 * Karen Tracey and Jani Tiainen for their help testing Oracle support,
 * Brett Hoerner, Jon Loyens, and Craig Kimmerer for their feedback,
 * Malcolm Tredinnick for his guidance during the GSoC submission process,
 * Simon Willison for driving the original design process,
 * Cal Henderson for complaining about ponies he wanted.

... and everyone else too numerous to mention who helped bring this feature to fruition.

git-svn-id: http://code.djangoproject.com/svn/django/trunk@11952 bcc190cf-cafb-0310-a4f2-bffc1f526a37
Russell Keith-Magee
2009-12-22 15:18:51 +00:00
parent 7ef212af14
commit ff60c5f9de
231 changed files with 7860 additions and 5668 deletions


@@ -272,9 +272,9 @@ Documenting your Custom Field
As always, you should document your field type, so users will know what it is.
In addition to providing a docstring for it, which is useful for developers,
you can also allow users of the admin app to see a short description of the
field type via the ``django.contrib.admindocs`` application. To do this simply
provide descriptive text in a ``description`` class attribute of your custom field.
In the above example, the type description displayed by the ``admindocs`` application
for a ``HandField`` will be 'A hand of cards (bridge style)'.
Useful methods
@@ -288,10 +288,13 @@ approximately decreasing order of importance, so start from the top.
Custom database types
~~~~~~~~~~~~~~~~~~~~~
.. method:: db_type(self)
.. method:: db_type(self, connection)
.. versionadded:: 1.2
The ``connection`` argument was added to support multiple databases.
Returns the database column data type for the :class:`~django.db.models.Field`,
taking into account the current :setting:`DATABASE_ENGINE` setting.
taking into account the connection object, and the settings associated with it.
Say you've created a PostgreSQL custom type called ``mytype``. You can use this
field with Django by subclassing ``Field`` and implementing the :meth:`db_type`
@@ -300,7 +303,7 @@ method, like so::
from django.db import models
class MytypeField(models.Field):
def db_type(self):
def db_type(self, connection):
return 'mytype'
Once you have ``MytypeField``, you can use it in any model, just like any other
@@ -315,13 +318,13 @@ If you aim to build a database-agnostic application, you should account for
differences in database column types. For example, the date/time column type
in PostgreSQL is called ``timestamp``, while the same column in MySQL is called
``datetime``. The simplest way to handle this in a ``db_type()`` method is to
import the Django settings module and check the :setting:`DATABASE_ENGINE` setting.
check the ``connection.settings_dict['ENGINE']`` attribute.
For example::
class MyDateField(models.Field):
def db_type(self):
from django.conf import settings
if settings.DATABASE_ENGINE == 'mysql':
def db_type(self, connection):
if connection.settings_dict['ENGINE'] == 'django.db.backends.mysql':
return 'datetime'
else:
return 'timestamp'
@@ -329,7 +332,7 @@ For example::
The :meth:`db_type` method is only called by Django when the framework
constructs the ``CREATE TABLE`` statements for your application -- that is, when
you first create your tables. It's not called at any other time, so it can
afford to execute slightly complex code, such as the :setting:`DATABASE_ENGINE`
afford to execute slightly complex code, such as the ``connection.settings_dict``
check in the above example.
Some database column types accept parameters, such as ``CHAR(25)``, where the
@@ -340,7 +343,7 @@ sense to have a ``CharMaxlength25Field``, shown here::
# This is a silly example of hard-coded parameters.
class CharMaxlength25Field(models.Field):
def db_type(self):
def db_type(self, connection):
return 'char(25)'
# In the model:
@@ -358,7 +361,7 @@ time -- i.e., when the class is instantiated. To do that, just implement
self.max_length = max_length
super(BetterCharField, self).__init__(*args, **kwargs)
def db_type(self):
def db_type(self, connection):
return 'char(%s)' % self.max_length
# In the model:
@@ -421,33 +424,67 @@ Python object type we want to store in the model's attribute.
called when it is created, you should be using `The SubfieldBase metaclass`_
mentioned earlier. Otherwise :meth:`to_python` won't be called automatically.
Converting Python objects to database values
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Converting Python objects to query values
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. method:: get_db_prep_value(self, value)
.. method:: get_prep_value(self, value)
This is the reverse of :meth:`to_python` when working with the database backends
(as opposed to serialization). The ``value`` parameter is the current value of
the model's attribute (a field has no reference to its containing model, so it
cannot retrieve the value itself), and the method should return data in a format
that can be used as a parameter in a query for the database backend.
.. versionadded:: 1.2
This method was factored out of ``get_db_prep_value()``.
This is the reverse of :meth:`to_python` when working with the
database backends (as opposed to serialization). The ``value``
parameter is the current value of the model's attribute (a field has
no reference to its containing model, so it cannot retrieve the value
itself), and the method should return data in a format that has been
prepared for use as a parameter in a query.
This conversion should *not* include any database-specific
conversions. If database-specific conversions are required, they
should be made in the call to :meth:`get_db_prep_value`.
For example::
class HandField(models.Field):
# ...
def get_db_prep_value(self, value):
def get_prep_value(self, value):
return ''.join([''.join(l) for l in (value.north,
value.east, value.south, value.west)])
.. method:: get_db_prep_save(self, value)
Converting query values to database values
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Same as the above, but called when the Field value must be *saved* to the
database. As the default implementation just calls ``get_db_prep_value``, you
shouldn't need to implement this method unless your custom field needs a
special conversion when being saved that is not the same as the conversion used
for normal query parameters (which is implemented by ``get_db_prep_value``).
.. method:: get_db_prep_value(self, value, connection, prepared=False)
.. versionadded:: 1.2
The ``connection`` and ``prepared`` arguments were added to support multiple databases.
Some data types (for example, dates) need to be in a specific format
before they can be used by a database backend.
:meth:`get_db_prep_value` is the method where those conversions should
be made. The specific connection that will be used for the query is
passed as the ``connection`` parameter. This allows you to use
backend-specific conversion logic if it is required.
The ``prepared`` argument describes whether or not the value has
already been passed through :meth:`get_prep_value` conversions. When
``prepared`` is False, the default implementation of
:meth:`get_db_prep_value` will call :meth:`get_prep_value` to do
initial data conversions before performing any database-specific
processing.
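
For example, a minimal sketch of a backend-aware ``HandField`` might look
like this (the Oracle-specific branch is purely illustrative)::

    class HandField(models.Field):
        # ...

        def get_db_prep_value(self, value, connection, prepared=False):
            # Run the generic preparation first, unless the caller has
            # already done it.
            if not prepared:
                value = self.get_prep_value(value)
            # Illustrative backend-specific tweak: uppercase the card
            # string on Oracle only.
            if connection.settings_dict['ENGINE'] == 'django.db.backends.oracle':
                return value.upper()
            return value
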
.. method:: get_db_prep_save(self, value, connection)
.. versionadded:: 1.2
The ``connection`` argument was added to support multiple databases.
Same as the above, but called when the Field value must be *saved* to
the database. As the default implementation just calls
``get_db_prep_value``, you shouldn't need to implement this method
unless your custom field needs a special conversion when being saved
that is not the same as the conversion used for normal query
parameters (which is implemented by ``get_db_prep_value``).
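
The default implementation is roughly equivalent to the following sketch,
so most custom fields can leave it untouched::

    def get_db_prep_save(self, value, connection):
        # Roughly what the default does: reuse the generic
        # query-parameter preparation path when saving.
        return self.get_db_prep_value(value, connection)
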
Preprocessing values before saving
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -475,7 +512,16 @@ correct value.
Preparing values for use in database lookups
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. method:: get_db_prep_lookup(self, lookup_type, value)
As with value conversions, preparing a value for database lookups is a
two-phase process.
.. method:: get_prep_lookup(self, lookup_type, value)
.. versionadded:: 1.2
This method was factored out of ``get_db_prep_lookup()``.
:meth:`get_prep_lookup` performs the first phase of lookup preparation,
carrying out the generic data validity checks.
Prepares the ``value`` for passing to the database when used in a lookup (a
``WHERE`` constraint in SQL). The ``lookup_type`` will be one of the valid
@@ -492,34 +538,45 @@ by with handling the lookup types that need special handling for your field
and pass the rest to the :meth:`get_db_prep_lookup` method of the parent class.
If you needed to implement ``get_db_prep_save()``, you will usually need to
implement ``get_db_prep_lookup()``. If you don't, ``get_db_prep_value`` will be
implement ``get_prep_lookup()``. If you don't, ``get_prep_value`` will be
called by the default implementation, to manage ``exact``, ``gt``, ``gte``,
``lt``, ``lte``, ``in`` and ``range`` lookups.
You may also want to implement this method to limit the lookup types that could
be used with your custom field type.
Note that, for ``range`` and ``in`` lookups, ``get_db_prep_lookup`` will receive
Note that, for ``range`` and ``in`` lookups, ``get_prep_lookup`` will receive
a list of objects (presumably of the right type) and will need to convert them
to a list of things of the right type for passing to the database. Most of the
time, you can reuse ``get_db_prep_value()``, or at least factor out some common
time, you can reuse ``get_prep_value()``, or at least factor out some common
pieces.
For example, the following code implements ``get_db_prep_lookup`` to limit the
For example, the following code implements ``get_prep_lookup`` to limit the
accepted lookup types to ``exact`` and ``in``::
class HandField(models.Field):
# ...
def get_db_prep_lookup(self, lookup_type, value):
def get_prep_lookup(self, lookup_type, value):
# We only handle 'exact' and 'in'. All others are errors.
if lookup_type == 'exact':
return [self.get_db_prep_value(value)]
return [self.get_prep_value(value)]
elif lookup_type == 'in':
return [self.get_db_prep_value(v) for v in value]
return [self.get_prep_value(v) for v in value]
else:
raise TypeError('Lookup type %r not supported.' % lookup_type)
.. method:: get_db_prep_lookup(self, lookup_type, value, connection, prepared=False)
.. versionadded:: 1.2
The ``connection`` and ``prepared`` arguments were added to support multiple databases.
Performs any database-specific data conversions required by a lookup.
As with :meth:`get_db_prep_value`, the specific connection that will
be used for the query is passed as the ``connection`` parameter.
The ``prepared`` argument describes whether the value has already been
prepared with :meth:`get_prep_lookup`.
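
Continuing the ``HandField`` example, a minimal sketch that performs no real
backend-specific work might look like::

    class HandField(models.Field):
        # ...

        def get_db_prep_lookup(self, lookup_type, value, connection, prepared=False):
            # Apply the generic lookup preparation first, if it hasn't
            # been done already.
            if not prepared:
                value = self.get_prep_lookup(lookup_type, value)
            # Any backend-specific conversion would go here; this sketch
            # returns the generically prepared value unchanged.
            return value
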
Specifying the form field for a model field
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


@@ -118,23 +118,27 @@ The SQL files are read by the :djadmin:`sqlcustom`, :djadmin:`sqlreset`,
<ref-django-admin>`. Refer to the :ref:`manage.py documentation
<ref-django-admin>` for more information.
Note that if you have multiple SQL data files, there's no guarantee of the order
in which they're executed. The only thing you can assume is that, by the time
your custom data files are executed, all the database tables already will have
been created.
Database-backend-specific SQL data
----------------------------------
There's also a hook for backend-specific SQL data. For example, you can have
separate initial-data files for PostgreSQL and MySQL. For each app, Django
looks for a file called ``<appname>/sql/<modelname>.<backend>.sql``, where
``<appname>`` is your app directory, ``<modelname>`` is the model's name in
lowercase and ``<backend>`` is the value of :setting:`DATABASE_ENGINE` in your
settings file (e.g., ``postgresql``, ``mysql``).
There's also a hook for backend-specific SQL data. For example, you
can have separate initial-data files for PostgreSQL and MySQL. For
each app, Django looks for a file called
``<appname>/sql/<modelname>.<backend>.sql``, where ``<appname>`` is
your app directory, ``<modelname>`` is the model's name in lowercase
and ``<backend>`` is the last part of the module name provided for the
:setting:`ENGINE` in your settings file (e.g., if you have defined a
database with an :setting:`ENGINE` value of
``django.db.backends.postgresql``, Django will look for
``<appname>/sql/<modelname>.postgresql.sql``).
Backend-specific SQL data is executed before non-backend-specific SQL data. For
example, if your app contains the files ``sql/person.sql`` and
``sql/person.postgresql.sql`` and you're installing the app on PostgreSQL,
Django will execute the contents of ``sql/person.postgresql.sql`` first, then
``sql/person.sql``.


@@ -18,15 +18,16 @@ Give Django your database parameters
====================================
You'll need to tell Django what your database connection parameters are, and
what the name of the database is. Do that by editing these settings in your
:ref:`settings file <topics-settings>`:
what the name of the database is. Do that by editing the :setting:`DATABASES`
setting and assigning values to the following keys for the ``'default'``
connection:
* :setting:`DATABASE_NAME`
* :setting:`DATABASE_ENGINE`
* :setting:`DATABASE_USER`
* :setting:`DATABASE_PASSWORD`
* :setting:`DATABASE_HOST`
* :setting:`DATABASE_PORT`
* :setting:`NAME`
* :setting:`ENGINE`
* :setting:`USER`
* :setting:`PASSWORD`
* :setting:`HOST`
* :setting:`PORT`
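
For example, a filled-in configuration might look something like this (all
of the values shown are placeholders for your existing database)::

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'legacy_db',
            'USER': 'legacy_user',
            'PASSWORD': 'secret',
            'HOST': '',
            'PORT': '',
        }
    }
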
Auto-generate the models
========================


@@ -65,7 +65,8 @@ The model layer
:ref:`Raw SQL <topics-db-sql>` |
:ref:`Transactions <topics-db-transactions>` |
:ref:`Aggregation <topics-db-aggregation>` |
:ref:`Custom fields <howto-custom-model-fields>`
:ref:`Custom fields <howto-custom-model-fields>` |
:ref:`Multiple databases <topics-db-multi-db>`
* **Other:**
:ref:`Supported databases <ref-databases>` |


@@ -793,27 +793,53 @@ To run the tests, ``cd`` to the ``tests/`` directory and type:
./runtests.py --settings=path.to.django.settings
Yes, the unit tests need a settings module, but only for database connection
info, with the ``DATABASE_ENGINE`` setting.
info. Your :setting:`DATABASES` setting needs to define two databases:
If you're using the ``sqlite3`` database backend, no further settings are
needed. A temporary database will be created in memory when running the tests.
* A ``default`` database. This database should use the backend that
you want to use for primary testing.
If you're using another backend:
* A database with the alias ``other``. The ``other`` database is
used to establish that queries can be directed to different
databases. As a result, this database can use any backend you
want. It doesn't need to use the same backend as the ``default``
database (although it can use the same backend if you want to).
* Your :setting:`DATABASE_USER` setting needs to specify an existing user account
for the database engine.
If you're using the SQLite database backend, you need to define
:setting:`ENGINE` for both databases, plus a
:setting:`TEST_NAME` for the ``other`` database. The
following is a minimal settings file that can be used to test SQLite::
* The :setting:`DATABASE_NAME` setting must be the name of an existing database to
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3'
},
'other': {
'ENGINE': 'django.db.backends.sqlite3',
'TEST_NAME': 'other_db'
}
}
If you're using another backend, you will need to provide other details for
each database:
* The :setting:`USER` option for each of your databases needs to
specify an existing user account for the database.
* The :setting:`PASSWORD` option needs to provide the password for
the :setting:`USER` that has been specified.
* The :setting:`NAME` option must be the name of an existing database to
which the given user has permission to connect. The unit tests will not
touch this database; the test runner creates a new database whose name is
:setting:`DATABASE_NAME` prefixed with ``test_``, and this test database is
:setting:`NAME` prefixed with ``test_``, and this test database is
deleted when the tests are finished. This means your user account needs
permission to execute ``CREATE DATABASE``.
You will also need to ensure that your database uses UTF-8 as the default
character set. If your database server doesn't use UTF-8 as a default charset,
you will need to include a value for ``TEST_DATABASE_CHARSET`` in your settings
file.
you will need to include a value for ``TEST_CHARSET`` in the settings
dictionary for the applicable database.
If you want to run the full suite of tests, you'll need to install a number of
dependencies:


@@ -30,7 +30,19 @@ their deprecation, as per the :ref:`Django deprecation policy
class in favor of a generic E-mail backend API.
* The many to many SQL generation functions on the database backends
will be removed. These have been deprecated since the 1.2 release.
will be removed.
* The ability to use the ``DATABASE_*`` family of top-level settings to
define database connections will be removed.
* The ability to use shorthand notation to specify a database backend
(i.e., ``sqlite3`` instead of ``django.db.backends.sqlite3``) will be
removed.
* The ``get_db_prep_save``, ``get_db_prep_value`` and
``get_db_prep_lookup`` methods on Field were modified in 1.2 to support
multiple databases. In 1.4, the support functions that allow methods
with the old prototype to continue working will be removed.
* The ``Message`` model (in ``django.contrib.auth``), its related
manager in the ``User`` model (``user.message_set``), and the


@@ -158,34 +158,40 @@ It worked!
Database setup
--------------
Now, edit :file:`settings.py`. It's a normal Python module with module-level
variables representing Django settings. Change these settings to match your
database's connection parameters:
Now, edit :file:`settings.py`. It's a normal Python module with
module-level variables representing Django settings. Change the
following keys in the :setting:`DATABASES` ``'default'`` item to match
your database's connection settings.
* :setting:`DATABASE_ENGINE` -- Either 'postgresql_psycopg2', 'mysql' or
'sqlite3'. Other backends are :setting:`also available <DATABASE_ENGINE>`.
* :setting:`ENGINE` -- Either
``'django.db.backends.postgresql_psycopg2'``,
``'django.db.backends.mysql'`` or
``'django.db.backends.sqlite3'``. Other backends are
:setting:`also available <ENGINE>`.
* :setting:`DATABASE_NAME` -- The name of your database. If you're using
SQLite, the database will be a file on your computer; in that case,
``DATABASE_NAME`` should be the full absolute path, including filename, of
that file. If the file doesn't exist, it will automatically be created
when you synchronize the database for the first time (see below).
* :setting:`NAME` -- The name of your database. If you're using
SQLite, the database will be a file on your computer; in that
case, :setting:`NAME` should be the full absolute path,
including filename, of that file. If the file doesn't exist, it
will automatically be created when you synchronize the database
for the first time (see below).
When specifying the path, always use forward slashes, even on Windows
(e.g. ``C:/homes/user/mysite/sqlite3.db``).
* :setting:`DATABASE_USER` -- Your database username (not used for SQLite).
* :setting:`USER` -- Your database username (not used for SQLite).
* :setting:`DATABASE_PASSWORD` -- Your database password (not used for
* :setting:`PASSWORD` -- Your database password (not used for
SQLite).
* :setting:`DATABASE_HOST` -- The host your database is on. Leave this as an
empty string if your database server is on the same physical machine (not
used for SQLite).
* :setting:`HOST` -- The host your database is on. Leave this as
an empty string if your database server is on the same physical
machine (not used for SQLite).
If you're new to databases, we recommend simply using SQLite (by setting
:setting:`DATABASE_ENGINE` to ``'sqlite3'``). SQLite is included as part of
Python 2.5 and later, so you won't need to install anything else.
If you're new to databases, we recommend simply using SQLite (by
setting :setting:`ENGINE` to ``'django.db.backends.sqlite3'``). SQLite
is included as part of Python 2.5 and later, so you won't need to
install anything else.
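
For example, a SQLite configuration along these lines might look like this
(the path shown is a placeholder)::

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            # Full absolute path to the database file; use forward
            # slashes, even on Windows.
            'NAME': '/home/user/mysite/sqlite3.db',
        }
    }
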
.. note::


@@ -28,8 +28,8 @@ Compiles .po files to .mo files for use with builtin gettext support.
Creates the table needed to use the SQL cache backend
.TP
.B dbshell
Runs the command\-line client for the current
.BI DATABASE_ENGINE.
Runs the command\-line client for the specified
.BI database ENGINE.
.TP
.B diffsettings
Displays differences between the current


@@ -44,18 +44,20 @@ Autocommit mode
.. versionadded:: 1.1
If your application is particularly read-heavy and doesn't make many database
writes, the overhead of a constantly open transaction can sometimes be
noticeable. For those situations, if you're using the ``postgresql_psycopg2``
backend, you can configure Django to use *"autocommit"* behavior for the
connection, meaning that each database operation will normally be in its own
transaction, rather than having the transaction extend over multiple
operations. In this case, you can still manually start a transaction if you're
doing something that requires consistency across multiple database operations.
The autocommit behavior is enabled by setting the ``autocommit`` key in the
:setting:`DATABASE_OPTIONS` setting::
If your application is particularly read-heavy and doesn't make many
database writes, the overhead of a constantly open transaction can
sometimes be noticeable. For those situations, if you're using the
``postgresql_psycopg2`` backend, you can configure Django to use
*"autocommit"* behavior for the connection, meaning that each database
operation will normally be in its own transaction, rather than having
the transaction extend over multiple operations. In this case, you can
still manually start a transaction if you're doing something that
requires consistency across multiple database operations. The
autocommit behavior is enabled by setting the ``autocommit`` key in
the :setting:`OPTIONS` part of your database configuration in
:setting:`DATABASES`::
DATABASE_OPTIONS = {
OPTIONS = {
"autocommit": True,
}
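
In a full settings file, that key lives inside the connection's
:setting:`OPTIONS` dictionary; a sketch (the other values are placeholders)
might look like::

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'mydatabase',
            'OPTIONS': {
                'autocommit': True,
            },
        }
    }
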
@@ -67,11 +69,11 @@ objects are changed or none of them are.
.. admonition:: This is database-level autocommit
This functionality is not the same as the
:ref:`topics-db-transactions-autocommit` decorator. That decorator is a
Django-level implementation that commits automatically after data changing
operations. The feature enabled using the :setting:`DATABASE_OPTIONS`
settings provides autocommit behavior at the database adapter level. It
commits after *every* operation.
:ref:`topics-db-transactions-autocommit` decorator. That decorator
is a Django-level implementation that commits automatically after
data changing operations. The feature enabled using the
:setting:`OPTIONS` option provides autocommit behavior at the
database adapter level. It commits after *every* operation.
If you are using this feature and performing an operation akin to deleting or
updating that requires multiple operations, you are strongly recommended to
@@ -249,29 +251,33 @@ Refer to the :ref:`settings documentation <ref-settings>`.
Connection settings are used in this order:
1. :setting:`DATABASE_OPTIONS`.
2. :setting:`DATABASE_NAME`, :setting:`DATABASE_USER`,
:setting:`DATABASE_PASSWORD`, :setting:`DATABASE_HOST`,
:setting:`DATABASE_PORT`
1. :setting:`OPTIONS`.
2. :setting:`NAME`, :setting:`USER`, :setting:`PASSWORD`,
:setting:`HOST`, :setting:`PORT`
3. MySQL option files.
In other words, if you set the name of the database in ``DATABASE_OPTIONS``,
this will take precedence over ``DATABASE_NAME``, which would override
In other words, if you set the name of the database in ``OPTIONS``,
this will take precedence over ``NAME``, which would override
anything in a `MySQL option file`_.
Here's a sample configuration which uses a MySQL option file::
# settings.py
DATABASE_ENGINE = "mysql"
DATABASE_OPTIONS = {
'read_default_file': '/path/to/my.cnf',
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.mysql',
'OPTIONS': {
'read_default_file': '/path/to/my.cnf',
},
}
}
# my.cnf
[client]
database = DATABASE_NAME
user = DATABASE_USER
password = DATABASE_PASSWORD
database = NAME
user = USER
password = PASSWORD
default-character-set = utf8
Several other MySQLdb connection options may be useful, such as ``ssl``,
@@ -302,7 +308,7 @@ storage engine, you have a couple of options.
* Another option is to use the ``init_command`` option for MySQLdb prior to
creating your tables::
DATABASE_OPTIONS = {
OPTIONS = {
"init_command": "SET storage_engine=INNODB",
}
@@ -467,7 +473,7 @@ If you're getting this error, you can solve it by:
* Increase the default timeout value by setting the ``timeout`` database
option option::
DATABASE_OPTIONS = {
OPTIONS = {
# ...
"timeout": 20,
# ...
@@ -520,25 +526,34 @@ Connecting to the database
Your Django settings.py file should look something like this for Oracle::
DATABASE_ENGINE = 'oracle'
DATABASE_NAME = 'xe'
DATABASE_USER = 'a_user'
DATABASE_PASSWORD = 'a_password'
DATABASE_HOST = ''
DATABASE_PORT = ''
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.oracle',
'NAME': 'xe',
'USER': 'a_user',
'PASSWORD': 'a_password',
'HOST': '',
'PORT': '',
}
}
If you don't use a ``tnsnames.ora`` file or a similar naming method that
recognizes the SID ("xe" in this example), then fill in both
:setting:`DATABASE_HOST` and :setting:`DATABASE_PORT` like so::
``HOST`` and ``PORT`` like so::
DATABASE_ENGINE = 'oracle'
DATABASE_NAME = 'xe'
DATABASE_USER = 'a_user'
DATABASE_PASSWORD = 'a_password'
DATABASE_HOST = 'dbprod01ned.mycompany.com'
DATABASE_PORT = '1540'
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.oracle',
'NAME': 'xe',
'USER': 'a_user',
'PASSWORD': 'a_password',
'HOST': 'dbprod01ned.mycompany.com',
'PORT': '1540',
}
}
You should supply both :setting:`DATABASE_HOST` and :setting:`DATABASE_PORT`, or leave both
You should supply both ``HOST`` and ``PORT``, or leave both
as empty strings.
Tablespace options


@@ -121,6 +121,11 @@ createcachetable
Creates a cache table named ``tablename`` for use with the database cache
backend. See :ref:`topics-cache` for more information.
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database
in which the cache table will be created.
createsuperuser
---------------
@@ -155,8 +160,8 @@ dbshell
.. django-admin:: dbshell
Runs the command-line client for the database engine specified in your
``DATABASE_ENGINE`` setting, with the connection parameters specified in your
``DATABASE_USER``, ``DATABASE_PASSWORD``, etc., settings.
``ENGINE`` setting, with the connection parameters specified in your
``USER``, ``PASSWORD``, etc., settings.
* For PostgreSQL, this runs the ``psql`` command-line client.
* For MySQL, this runs the ``mysql`` command-line client.
@@ -167,6 +172,12 @@ the program name (``psql``, ``mysql``, ``sqlite3``) will find the program in
the right place. There's no way to specify the location of the program
manually.
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database
onto which to open a shell.
diffsettings
------------
@@ -199,21 +210,6 @@ records to dump. If you're using a :ref:`custom manager <custom-managers>` as
the default manager and it filters some of the available records, not all of the
objects will be dumped.
.. django-admin-option:: --exclude
.. versionadded:: 1.0
Exclude a specific application from the applications whose contents is
output. For example, to specifically exclude the `auth` application from
the output, you would call::
django-admin.py dumpdata --exclude=auth
If you want to exclude multiple applications, use multiple ``--exclude``
directives::
django-admin.py dumpdata --exclude=auth --exclude=contenttypes
.. django-admin-option:: --format <fmt>
By default, ``dumpdata`` will format its output in JSON, but you can use the
@@ -226,6 +222,11 @@ By default, ``dumpdata`` will output all data on a single line. This isn't
easy for humans to read, so you can use the ``--indent`` option to
pretty-print the output with a number of indentation spaces.
.. versionadded:: 1.0
The :djadminopt:`--exclude` option may be provided to prevent specific
applications from being dumped.
.. versionadded:: 1.1
In addition to specifying application names, you can provide a list of
@@ -234,6 +235,11 @@ name to ``dumpdata``, the dumped output will be restricted to that model,
rather than the entire application. You can also mix application names and
model names.
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database
from which data will be dumped.
.. django-admin-option:: --natural
.. versionadded:: 1.2
@@ -244,7 +250,6 @@ a natural key definition. If you are dumping ``contrib.auth`` ``Permission``
objects or ``contrib.contenttypes`` ``ContentType`` objects, you should
probably be using this flag.
flush
-----
@@ -258,13 +263,19 @@ fixture will be re-installed.
The :djadminopt:`--noinput` option may be provided to suppress all user
prompts.
.. versionadded:: 1.2
The :djadminopt:`--database` option may be used to specify the database
to flush.
inspectdb
---------
.. django-admin:: inspectdb
Introspects the database tables in the database pointed-to by the
``DATABASE_NAME`` setting and outputs a Django model module (a ``models.py``
``NAME`` setting and outputs a Django model module (a ``models.py``
file) to standard output.
Use this if you have a legacy database with which you'd like to use Django.
@@ -301,6 +312,12 @@ needed.
``inspectdb`` works with PostgreSQL, MySQL and SQLite. Foreign-key detection
only works in PostgreSQL and with certain types of MySQL tables.
.. versionadded:: 1.2
The :djadminopt:`--database` option may be used to specify the
database to introspect.
loaddata <fixture fixture ...>
------------------------------
@@ -308,6 +325,11 @@ loaddata <fixture fixture ...>
Searches for and loads the contents of the named fixture into the database.
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database
onto which the data will be loaded.
What's a "fixture"?
~~~~~~~~~~~~~~~~~~~
@@ -389,6 +411,37 @@ installation will be aborted, and any data installed in the call to
references in your data files - MySQL doesn't provide a mechanism to
defer checking of row constraints until a transaction is committed.
Database-specific fixtures
~~~~~~~~~~~~~~~~~~~~~~~~~~
If you are in a multi-database setup, you may have fixture data that
you want to load onto one database, but not onto another. In this
situation, you can add a database identifier to the names of your fixtures. If your
:setting:`DATABASES` setting has a 'master' database defined, you can
define the fixture ``mydata.master.json`` or
``mydata.master.json.gz``. This fixture will only be loaded if you
have specified that you want to load data onto the ``master``
database.
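
For example, assuming a ``master`` alias as above, the database-specific
fixture would only be picked up by a command along these lines::

    django-admin.py loaddata mydata --database=master
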
Excluding applications from loading
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. versionadded:: 1.2
The :djadminopt:`--exclude` option may be provided to prevent specific
applications from being loaded.
For example, if you wanted to exclude models from ``django.contrib.auth``
from being loaded into your database, you would call::
django-admin.py loaddata mydata.json --exclude auth
This will look for a JSON fixture called ``mydata`` in all the
usual locations -- including the ``fixtures`` directory of the
``django.contrib.auth`` application. However, any fixture object that
identifies itself as belonging to the ``auth`` application (e.g., an
instance of ``auth.User``) will be ignored by ``loaddata``.
makemessages
------------
@@ -450,6 +503,11 @@ Executes the equivalent of ``sqlreset`` for the given app name(s).
The :djadminopt:`--noinput` option may be provided to suppress all user
prompts.
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the alias
of the database to reset.
runfcgi [options]
-----------------
@@ -567,6 +625,11 @@ sql <appname appname ...>
Prints the CREATE TABLE SQL statements for the given app name(s).
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database for
which to print the SQL.
sqlall <appname appname ...>
----------------------------
@@ -577,6 +640,11 @@ Prints the CREATE TABLE and initial-data SQL statements for the given app name(s
Refer to the description of ``sqlcustom`` for an explanation of how to
specify initial data.
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database for
which to print the SQL.
sqlclear <appname appname ...>
------------------------------
@@ -584,6 +652,11 @@ sqlclear <appname appname ...>
Prints the DROP TABLE SQL statements for the given app name(s).
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database for
which to print the SQL.
sqlcustom <appname appname ...>
-------------------------------
@@ -605,6 +678,11 @@ table modifications, or insert any SQL functions into the database.
Note that the order in which the SQL files are processed is undefined.
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database for
which to print the SQL.
sqlflush
--------
@@ -613,6 +691,11 @@ sqlflush
Prints the SQL statements that would be executed for the :djadmin:`flush`
command.
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database for
which to print the SQL.
sqlindexes <appname appname ...>
--------------------------------
@@ -620,6 +703,11 @@ sqlindexes <appname appname ...>
Prints the CREATE INDEX SQL statements for the given app name(s).
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database for
which to print the SQL.
sqlreset <appname appname ...>
------------------------------
@@ -627,6 +715,11 @@ sqlreset <appname appname ...>
Prints the DROP TABLE SQL, then the CREATE TABLE SQL, for the given app name(s).
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database for
which to print the SQL.
sqlsequencereset <appname appname ...>
--------------------------------------
@@ -640,6 +733,11 @@ number for automatically incremented fields.
Use this command to generate SQL which will fix cases where a sequence is out
of sync with its automatically incremented field data.
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database for
which to print the SQL.
startapp <appname>
------------------
@@ -696,9 +794,16 @@ with an appropriate extension (e.g. ``json`` or ``xml``). See the
documentation for ``loaddata`` for details on the specification of fixture
data files.
--noinput
~~~~~~~~~
The :djadminopt:`--noinput` option may be provided to suppress all user
prompts.
.. versionadded:: 1.2
The :djadminopt:`--database` option can be used to specify the database to
synchronize.
test <app or test identifier>
-----------------------------
@@ -848,6 +953,30 @@ Common options
The following options are not available on every command, but they are
common to a number of commands.
.. django-admin-option:: --database
.. versionadded:: 1.2
Used to specify the database on which a command will operate. If not
specified, this option will default to an alias of ``default``.
For example, to dump data from the database with the alias ``master``::
django-admin.py dumpdata --database=master
.. django-admin-option:: --exclude
Exclude a specific application from the applications whose contents are
output. For example, to specifically exclude the ``auth`` application from
the output of dumpdata, you would call::
django-admin.py dumpdata --exclude=auth
If you want to exclude multiple applications, use multiple ``--exclude``
directives::
django-admin.py dumpdata --exclude=auth --exclude=contenttypes
.. django-admin-option:: --locale
Use the ``--locale`` or ``-l`` option to specify the locale to process.


@@ -4,8 +4,9 @@
Model ``Meta`` options
======================
This document explains all the possible :ref:`metadata options <meta-options>` that you can give your model in its internal
``class Meta``.
Available ``Meta`` options
==========================
@@ -242,4 +243,3 @@ The plural name for the object::
verbose_name_plural = "stories"
If this isn't given, Django will use :attr:`~Options.verbose_name` + ``"s"``.


@@ -874,6 +874,24 @@ logically::
# existing set of fields).
Entry.objects.defer("body").only("headline", "body")
``using(alias)``
~~~~~~~~~~~~~~~~~~
.. versionadded:: 1.2
This method is for controlling which database the ``QuerySet`` will be
evaluated against if you are using more than one database. The only argument
this method takes is the alias of a database, as defined in
:setting:`DATABASES`.
For example::
# queries the database with the 'default' alias.
>>> Entry.objects.all()
# queries the database with the 'backup' alias
>>> Entry.objects.using('backup')
QuerySet methods that do not return QuerySets
---------------------------------------------


@@ -189,30 +189,67 @@ where ``reason`` is a short message (intended for developers or logging, not for
end users) indicating the reason the request was rejected. See
:ref:`ref-contrib-csrf`.
.. setting:: DATABASE_ENGINE
DATABASE_ENGINE
---------------
.. setting:: DATABASES
DATABASES
---------
.. versionadded:: 1.2
Default: ``{}`` (Empty dictionary)
A dictionary containing the settings for all databases to be used with
Django. It is a nested dictionary whose contents map database aliases
to a dictionary containing the options for an individual database.
The :setting:`DATABASES` setting must configure a ``default`` database;
any number of additional databases may also be specified.
The simplest possible settings file is for a single-database setup using
SQLite. This can be configured using the following::
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': 'mydatabase'
}
}
For other database backends, or more complex SQLite configurations, other options
will be required. The following inner options are available.
.. setting:: ENGINE
ENGINE
~~~~~~
Default: ``''`` (Empty string)
The database backend to use. The built-in database backends are
``'postgresql_psycopg2'``, ``'postgresql'``, ``'mysql'``, ``'sqlite3'``, and
``'oracle'``.
The database backend to use. The built-in database backends are:
* ``'django.db.backends.postgresql_psycopg2'``
* ``'django.db.backends.postgresql'``
* ``'django.db.backends.mysql'``
* ``'django.db.backends.sqlite3'``
* ``'django.db.backends.oracle'``
You can use a database backend that doesn't ship with Django by setting
``DATABASE_ENGINE`` to a fully-qualified path (i.e.
``ENGINE`` to a fully-qualified path (i.e.
``mypackage.backends.whatever``). Writing a whole new database backend from
scratch is left as an exercise to the reader; see the other backends for
examples.
.. versionadded:: 1.0
Support for external database backends is new in 1.0.
.. note::
Prior to Django 1.2, you could use a short version of the backend name
to reference the built-in database backends (e.g., you could use
``'sqlite3'`` to refer to the SQLite backend). This format has been
deprecated, and will be removed in Django 1.4.
.. setting:: DATABASE_HOST
.. setting:: HOST
DATABASE_HOST
-------------
HOST
~~~~
Default: ``''`` (Empty string)
@@ -222,7 +259,7 @@ localhost. Not used with SQLite.
If this value starts with a forward slash (``'/'``) and you're using MySQL,
MySQL will connect via a Unix socket to the specified socket. For example::
DATABASE_HOST = '/var/run/mysql'
"HOST": '/var/run/mysql'
If you're using MySQL and this value *doesn't* start with a forward slash, then
this value is assumed to be the host.
@@ -232,10 +269,10 @@ for the connection, rather than a network connection to localhost. If you
explicitly need to use a TCP/IP connection on the local machine with
PostgreSQL, specify ``localhost`` here.
.. setting:: DATABASE_NAME
.. setting:: NAME
DATABASE_NAME
-------------
NAME
~~~~
Default: ``''`` (Empty string)
@@ -243,44 +280,91 @@ The name of the database to use. For SQLite, it's the full path to the database
file. When specifying the path, always use forward slashes, even on Windows
(e.g. ``C:/homes/user/mysite/sqlite3.db``).
.. setting:: DATABASE_OPTIONS
.. setting:: OPTIONS
DATABASE_OPTIONS
----------------
OPTIONS
~~~~~~~
Default: ``{}`` (Empty dictionary)
Extra parameters to use when connecting to the database. Consult backend
module's document for available keywords.
.. setting:: DATABASE_PASSWORD
.. setting:: PASSWORD
DATABASE_PASSWORD
-----------------
PASSWORD
~~~~~~~~
Default: ``''`` (Empty string)
The password to use when connecting to the database. Not used with SQLite.
.. setting:: DATABASE_PORT
.. setting:: PORT
DATABASE_PORT
-------------
PORT
~~~~
Default: ``''`` (Empty string)
The port to use when connecting to the database. An empty string means the
default port. Not used with SQLite.
.. setting:: DATABASE_USER
.. setting:: USER
DATABASE_USER
-------------
USER
~~~~
Default: ``''`` (Empty string)
The username to use when connecting to the database. Not used with SQLite.
.. setting:: TEST_CHARSET
TEST_CHARSET
~~~~~~~~~~~~
Default: ``None``
The character set encoding used to create the test database. The value of this
string is passed directly through to the database, so its format is
backend-specific.
Supported for the PostgreSQL_ (``postgresql``, ``postgresql_psycopg2``) and
MySQL_ (``mysql``) backends.
.. _PostgreSQL: http://www.postgresql.org/docs/8.2/static/multibyte.html
.. _MySQL: http://www.mysql.org/doc/refman/5.0/en/charset-database.html
.. setting:: TEST_COLLATION
TEST_COLLATION
~~~~~~~~~~~~~~
Default: ``None``
The collation order to use when creating the test database. This value is
passed directly to the backend, so its format is backend-specific.
Only supported for the ``mysql`` backend (see `section 10.3.2`_ of the MySQL
manual for details).
.. _section 10.3.2: http://www.mysql.org/doc/refman/5.0/en/charset-database.html
.. setting:: TEST_NAME
TEST_NAME
~~~~~~~~~
Default: ``None``
The name of database to use when running the test suite.
If the default value (``None``) is used with the SQLite database engine, the
tests will use a memory-resident database. For all other database engines the
test database will use the name ``'test_' + NAME``.
See :ref:`topics-testing`.
.. setting:: DATE_FORMAT
DATE_FORMAT
@@ -1045,6 +1129,18 @@ See the :ref:`topics-http-sessions`.
.. setting:: SESSION_EXPIRE_AT_BROWSER_CLOSE
SESSION_DB_ALIAS
----------------
.. versionadded:: 1.2
Default: ``None``
If you're using database-backed session storage, this selects the database
alias that will be used to store session data. By default, Django will use
the ``default`` database, but you can store session data on any database
you choose.
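
For example, to keep database-backed sessions on a separate connection
(assuming a ``sessions`` alias is defined in :setting:`DATABASES`)::

    # Use the database session backend, storing sessions on the
    # 'sessions' connection rather than 'default'.
    SESSION_ENGINE = 'django.contrib.sessions.backends.db'
    SESSION_DB_ALIAS = 'sessions'
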
SESSION_EXPIRE_AT_BROWSER_CLOSE
-------------------------------
@@ -1169,56 +1265,6 @@ Default: ``''`` (Empty string)
Output, as a string, that the template system should use for invalid (e.g.
misspelled) variables. See :ref:`invalid-template-variables`.
.. setting:: TEST_DATABASE_CHARSET
TEST_DATABASE_CHARSET
---------------------
.. versionadded:: 1.0
Default: ``None``
The character set encoding used to create the test database. The value of this
string is passed directly through to the database, so its format is
backend-specific.
Supported for the PostgreSQL_ (``postgresql``, ``postgresql_psycopg2``) and MySQL_ (``mysql``) backends.
.. _PostgreSQL: http://www.postgresql.org/docs/8.2/static/multibyte.html
.. _MySQL: http://www.mysql.org/doc/refman/5.0/en/charset-database.html
.. setting:: TEST_DATABASE_COLLATION
TEST_DATABASE_COLLATION
------------------------
.. versionadded:: 1.0
Default: ``None``
The collation order to use when creating the test database. This value is
passed directly to the backend, so its format is backend-specific.
Only supported for the ``mysql`` backend (see `section 10.3.2`_ of the MySQL
manual for details).
.. _section 10.3.2: http://www.mysql.org/doc/refman/5.0/en/charset-database.html
.. setting:: TEST_DATABASE_NAME
TEST_DATABASE_NAME
------------------
Default: ``None``
The name of database to use when running the test suite.
If the default value (``None``) is used with the SQLite database engine, the
tests will use a memory resident database. For all other database engines the
test database will use the name ``'test_' + settings.DATABASE_NAME``.
See :ref:`topics-testing`.
.. setting:: TEST_RUNNER
TEST_RUNNER
@@ -1328,3 +1374,97 @@ Different locales have different formats. For example, U.S. English would say
See :ttag:`allowed date format strings <now>`. See also ``DATE_FORMAT``,
``DATETIME_FORMAT``, ``TIME_FORMAT`` and ``MONTH_DAY_FORMAT``.
Deprecated settings
===================
.. setting:: DATABASE_ENGINE
DATABASE_ENGINE
---------------
.. deprecated:: 1.2
This setting has been replaced by :setting:`ENGINE` in
:setting:`DATABASES`.
.. setting:: DATABASE_HOST
DATABASE_HOST
-------------
.. deprecated:: 1.2
This setting has been replaced by :setting:`HOST` in
:setting:`DATABASES`.
.. setting:: DATABASE_NAME
DATABASE_NAME
-------------
.. deprecated:: 1.2
This setting has been replaced by :setting:`NAME` in
:setting:`DATABASES`.
.. setting:: DATABASE_OPTIONS
DATABASE_OPTIONS
----------------
.. deprecated:: 1.2
This setting has been replaced by :setting:`OPTIONS` in
:setting:`DATABASES`.
.. setting:: DATABASE_PASSWORD
DATABASE_PASSWORD
-----------------
.. deprecated:: 1.2
This setting has been replaced by :setting:`PASSWORD` in
:setting:`DATABASES`.
.. setting:: DATABASE_PORT
DATABASE_PORT
-------------
.. deprecated:: 1.2
This setting has been replaced by :setting:`PORT` in
:setting:`DATABASES`.
.. setting:: DATABASE_USER
DATABASE_USER
-------------
.. deprecated:: 1.2
This setting has been replaced by :setting:`USER` in
:setting:`DATABASES`.
.. setting:: TEST_DATABASE_CHARSET
TEST_DATABASE_CHARSET
---------------------
.. deprecated:: 1.2
This setting has been replaced by :setting:`TEST_CHARSET` in
:setting:`DATABASES`.
.. setting:: TEST_DATABASE_COLLATION
TEST_DATABASE_COLLATION
-----------------------
.. deprecated:: 1.2
This setting has been replaced by :setting:`TEST_COLLATION` in
:setting:`DATABASES`.
.. setting:: TEST_DATABASE_NAME
TEST_DATABASE_NAME
------------------
.. deprecated:: 1.2
This setting has been replaced by :setting:`TEST_NAME` in
:setting:`DATABASES`.


@@ -96,7 +96,7 @@ support:
* The ``sqlinitialdata`` command has been renamed to ``sqlcustom`` to
emphasize that ``loaddata`` should be used for data (and ``sqlcustom`` for
other custom SQL -- views, stored procedures, etc.).
* The vestigial ``install`` command has been removed. Use ``syncdb``.
Backslash escaping changed
@@ -179,20 +179,20 @@ strings that referred to callables were allowed). This allows a much more
natural use of URLconfs. For example, this URLconf::
from django.conf.urls.defaults import *
urlpatterns = patterns('',
('^myview/$', 'mysite.myapp.views.myview')
)
can now be rewritten as::
from django.conf.urls.defaults import *
from mysite.myapp.views import myview
urlpatterns = patterns('',
('^myview/$', myview)
)
One useful application of this can be seen when using decorators; this
change allows you to apply decorators to views *in your
URLconf*. Thus, you can make a generic view require login very
@@ -202,17 +202,17 @@ easily::
from django.contrib.auth.decorators import login_required
from django.views.generic.list_detail import object_list
from mysite.myapp.models import MyModel
info = {
"queryset" : MyModel.objects.all(),
}
urlpatterns = patterns('',
('^myview/$', login_required(object_list), info)
)
Note that both syntaxes (strings and callables) are valid, and will continue to
be valid for the foreseeable future.
The test framework
------------------
@@ -248,19 +248,19 @@ all their hard work:
* Russell Keith-Magee and Malcolm Tredinnick for their major code
contributions. This release wouldn't have been possible without them.
* Our new release manager, James Bennett, for his work in getting out
0.95.1, 0.96, and (hopefully) future releases.
* Our ticket managers Chris Beaven (aka SmileyChris), Simon Greenhill,
Michael Radziej, and Gary Wilson. They agreed to take on the monumental
task of wrangling our tickets into nicely cataloged submission. Figuring
out what to work on is now about a million times easier; thanks again,
guys.
* Everyone who submitted a bug report, patch or ticket comment. We can't
possibly thank everyone by name -- over 200 developers submitted patches
that went into 0.96 -- but everyone who's contributed to Django is listed
in AUTHORS_.
.. _AUTHORS: http://code.djangoproject.com/browser/django/trunk/AUTHORS


@@ -76,6 +76,148 @@ changes:
__members__ = property(lambda self: self.__dir__())
Specifying databases
--------------------
Prior to Django 1.2, Django used a number of settings to control access to a
single database. Django 1.2 introduces support for multiple databases, and as
a result, the way you define database settings has changed.
Any existing Django settings file will continue to work as expected until
Django 1.4. Old-style database settings will be automatically translated to
the new-style format.
In the old-style (pre 1.2) format, there were a number of
``DATABASE_`` settings at the top level of your settings file. For
example::
DATABASE_NAME = 'test_db'
DATABASE_ENGINE = 'postgresql_psycopg2'
DATABASE_USER = 'myusername'
DATABASE_PASSWORD = 's3krit'
These settings are now contained inside a dictionary named
:setting:`DATABASES`. Each item in the dictionary corresponds to a
single database connection, with the name ``'default'`` describing the
default database connection. The setting names have also been
shortened to reflect the fact that they are stored in a dictionary.
The sample settings given previously would now be stored using::
DATABASES = {
'default': {
'NAME': 'test_db',
'ENGINE': 'django.db.backends.postgresql_psycopg2',
'USER': 'myusername',
'PASSWORD': 's3krit',
}
}
This affects the following settings:
========================================= ==========================
Old setting New Setting
========================================= ==========================
:setting:`DATABASE_ENGINE` :setting:`ENGINE`
:setting:`DATABASE_HOST` :setting:`HOST`
:setting:`DATABASE_NAME` :setting:`NAME`
:setting:`DATABASE_OPTIONS` :setting:`OPTIONS`
:setting:`DATABASE_PASSWORD` :setting:`PASSWORD`
:setting:`DATABASE_PORT` :setting:`PORT`
:setting:`DATABASE_USER` :setting:`USER`
:setting:`TEST_DATABASE_CHARSET` :setting:`TEST_CHARSET`
:setting:`TEST_DATABASE_COLLATION` :setting:`TEST_COLLATION`
:setting:`TEST_DATABASE_NAME` :setting:`TEST_NAME`
========================================= ==========================
These changes are also required if you have manually created a database
connection using ``DatabaseWrapper()`` from your database backend of choice.
In addition to the change in structure, Django 1.2 removes the special
handling for the built-in database backends. All database backends
must now be specified by a fully qualified module name (i.e.,
``django.db.backends.postgresql_psycopg2``, rather than just
``postgresql_psycopg2``).
``__dict__`` on Model instances
-------------------------------
Historically, the ``__dict__`` attribute of a model instance has only contained
attributes corresponding to the fields on a model.
In order to support multiple database configurations, Django 1.2 has
added a ``_state`` attribute to object instances. This attribute will
appear in ``__dict__`` for a model instance. If your code relies on
iterating over ``__dict__`` to obtain a list of fields, you must now
filter the ``_state`` attribute out of ``__dict__``.
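
A minimal sketch of that kind of filtering (``instance`` stands in for any
model instance)::

    # Collect field values from the instance, skipping the _state
    # attribute added for multi-database support.
    field_values = dict(
        (name, value)
        for name, value in instance.__dict__.items()
        if name != '_state'
    )
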
``get_db_prep_*()`` methods on Field
------------------------------------
Prior to v1.2, a custom field had the option of defining several
functions to support conversion of Python values into
database-compatible values. A custom field might look something like::
class CustomModelField(models.Field):
# ...
def get_db_prep_save(self, value):
# ...
def get_db_prep_value(self, value):
# ...
def get_db_prep_lookup(self, lookup_type, value):
# ...
In 1.2, these three methods have undergone a change in prototype, and
two extra methods have been introduced::
class CustomModelField(models.Field):
# ...
def get_prep_value(self, value):
# ...
def get_prep_lookup(self, lookup_type, value):
# ...
def get_db_prep_save(self, value, connection):
# ...
def get_db_prep_value(self, value, connection, prepared=False):
# ...
def get_db_prep_lookup(self, lookup_type, value, connection, prepared=False):
# ...
These changes are required to support multiple databases -
``get_db_prep_*`` can no longer make any assumptions regarding the
database for which it is preparing. The ``connection`` argument now
provides the preparation methods with the specific connection for
which the value is being prepared.
The two new methods exist to differentiate general data preparation
requirements, and requirements that are database-specific. The
``prepared`` argument is used to indicate to the database preparation
methods whether generic value preparation has been performed. If
an unprepared (i.e., ``prepared=False``) value is provided to the
``get_db_prep_*()`` calls, they should invoke the corresponding
``get_prep_*()`` calls to perform generic data preparation.
A conversion function has been provided which will transparently
convert functions adhering to the old prototype into functions
compatible with the new prototype. However, this conversion function
will be removed in Django 1.4, so you should upgrade your ``Field``
definitions to use the new prototype.
If your ``get_db_prep_*()`` methods made no use of the database
connection, you should be able to upgrade by renaming
``get_db_prep_value()`` to ``get_prep_value()`` and
``get_db_prep_lookup()`` to ``get_prep_lookup()``. If you require
database-specific conversions, then you will need to provide an
implementation of ``get_db_prep_*()`` that uses the ``connection``
argument to resolve database-specific values.
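
For example, a field whose old ``get_db_prep_value()`` never used the
database connection could be upgraded with a simple rename; the method
bodies below are illustrative only::

    class CustomModelField(models.Field):
        # ...

        # Renamed from the pre-1.2 get_db_prep_value(self, value).
        def get_prep_value(self, value):
            # Generic, backend-independent conversion only.
            return unicode(value)

        def get_db_prep_value(self, value, connection, prepared=False):
            if not prepared:
                value = self.get_prep_value(value)
            # Anything that depends on the specific connection would
            # go here.
            return value
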
Stateful template tags
----------------------
@@ -219,6 +361,15 @@ replaces the deprecated user message API and allows you to temporarily store
messages in one request and retrieve them for display in a subsequent request
(usually the next one).
Support for multiple databases
------------------------------
Django 1.2 adds the ability to use :ref:`more than one database
<topics-db-multi-db>` in your Django project. Queries can be
issued against a specific database with the ``using()`` method on
querysets; individual objects can be saved to a specific database
by providing a ``using`` argument when you save the instance.
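
A quick taste (``Author`` is an illustrative model, and the ``backup`` alias
is assumed to be configured in :setting:`DATABASES`)::

    # Run a query against the 'backup' database ...
    Author.objects.using('backup').all()

    # ... and save a new object to that database explicitly.
    Author(name='Alice').save(using='backup')
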
'Smart' if tag
--------------


@@ -16,3 +16,4 @@ model maps to a single database table.
managers
sql
transactions
multi-db

docs/topics/db/multi-db.txt (new file, 265 lines)

@@ -0,0 +1,265 @@
.. _topics-db-multi-db:
==================
Multiple Databases
==================
.. versionadded:: 1.2
This topic guide describes Django's support for interacting with multiple
databases. Most of the rest of Django's documentation assumes you are
interacting with a single database. If you want to interact with multiple
databases, some additional steps must be taken.
Defining your databases
=======================
The first step to using more than one database with Django is to tell
Django about the database servers you'll be using. This is done using
the :setting:`DATABASES` setting. This setting maps database aliases,
which are a way to refer to a specific database throughout Django, to
a dictionary of settings for that specific connection. The settings in
the inner dictionaries are described fully in the :setting:`DATABASES`
documentation.
Regardless of how many databases you have, you *must* have a database
named ``'default'``. Any additional databases you have can have
whatever alias you choose.
The following is an example ``settings.py`` snippet defining two
databases -- a default PostgreSQL database and a MySQL database called
``users``::

    DATABASES = {
        'default': {
            'NAME': 'app_data',
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'USER': 'postgres_user',
            'PASSWORD': 's3krit'
        },
        'users': {
            'NAME': 'user_data',
            'ENGINE': 'django.db.backends.mysql',
            'USER': 'mysql_user',
            'PASSWORD': 'priv4te'
        }
    }
If you attempt to access a database that you haven't defined in your
:setting:`DATABASES` setting, Django will raise a
``django.db.utils.ConnectionDoesNotExist`` exception.
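For example, assuming no alias named ``'nonexistent'`` has been defined,
something like the following would raise that exception::

    >>> from django.db import connections
    >>> connections['nonexistent']      # not defined in DATABASES
    Traceback (most recent call last):
        ...
    ConnectionDoesNotExist: ...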
Selecting a database for a ``QuerySet``
=======================================
It is possible to select the database for a ``QuerySet`` at any point
during its construction. To choose the database that a query will be
performed against, simply call the ``using()`` method on the
``QuerySet``. ``using()`` takes a single argument: the alias of the
database on which you want to run the query. For example::
# This will run on the 'default' database...
>>> Author.objects.all()
# So will this...
>>> Author.objects.using('default').all()
# This will run on the 'other' database
>>> Author.objects.using('other').all()
Select a database to save to
============================
To choose which database to save a model to, provide a ``using``
keyword argument to ``Model.save()``. For example, if you had a user
model that you wanted to save to the ``'legacy_users'`` database, you
would save the user by calling::
>>> user_obj.save(using='legacy_users')
Moving an object from one database to another
---------------------------------------------
If you have saved an instance to one database, it might be tempting to use
``save(using=...)`` as a way to migrate the instance to a new database. However,
if you don't take appropriate steps, this could have some unexpected consequences.
Consider the following example::
>>> p = Person(name='Fred')
>>> p.save(using='first') # (1)
# some other processing ...
>>> p.save(using='second') # (2)
In statement 1, a new Person object is saved to the ``first``
database. At this time, ``p`` doesn't have a primary key, so Django
issues a SQL ``INSERT`` statement. This creates a primary key, and
Django assigns that primary key to ``p``.
When the save occurs in statement 2, ``p`` already has a primary key
value, and Django will attempt to use that primary key on the new
database. If the primary key value isn't in use in the ``second``
database, then you won't have any problems -- the object will be
copied to the new database.
However, if the primary key of ``p`` is already in use on the
``second`` database, the existing object on the ``second`` database
will be lost when ``p`` is saved.
There are two ways to avoid this outcome. Firstly, you can clear the
primary key of the instance. If an object has no primary key, Django
will treat it as a new object, avoiding any loss of data on the
``second`` database::
>>> p = Person(name='Fred')
>>> p.save(using='first')
# some other processing ...
>>> p.pk = None # Clear the PK
>>> p.save(using='second') # Write a completely new object
Secondly, you can use the ``force_insert`` option to ``save()`` to ensure that
Django does a SQL ``INSERT``::
>>> p = Person(name='Fred')
>>> p.save(using='first')
# some other processing ...
>>> p.save(using='second', force_insert=True)
This will ensure that the person named ``Fred`` will have the same
primary key on both databases. If that primary key is already in use
when you try to save onto the ``second`` database, an error will be
raised.
Select a database to delete from
================================
By default, a call to delete an existing object will be executed on the
same database that was used to retrieve the object in the first place::
>>> user_obj = User.objects.using('legacy_users').get(username='fred')
>>> user_obj.delete() # will delete from the `legacy_users` database
If you want to specify the database from which a model will be
deleted, you can use a ``using`` keyword argument to the
``Model.delete()`` method. This argument is analogous to the ``using``
keyword argument to ``save()``. For example, if you were migrating a
user from the ``'legacy_users'`` database to the ``'new_users'``
database, you might use these commands::
>>> user_obj.save(using='new_users')
>>> user_obj.delete(using='legacy_users')
Using ``Managers`` with multiple databases
==========================================
When you call ``using()``, Django returns a ``QuerySet`` that will be
evaluated against that database. However, sometimes you want to direct
a *manager*, rather than a ``QuerySet``, at a specific database; you
can't do that by chaining ``using()``, because once you have called
``using()`` you no longer have access to any of the methods defined on
the manager.
To overcome this limitation, managers provide a ``db_manager()``
method. This method returns a copy of the *manager* bound to the
database you specify. So, if you want to load an object using its
natural key (using the ``get_by_natural_key()`` method on the
manager), you can call::

    >>> Book.objects.db_manager("other").get_by_natural_key(...)
If you are overriding ``get_query_set()`` on your manager, be sure to
either call the method on the parent (using ``super()``) or handle the
``_db`` attribute on the manager appropriately. For example, if you
wanted to return a custom ``QuerySet`` class from the
``get_query_set()`` method, you could do this::
class MyManager(models.Manager):
...
def get_query_set(self):
qs = CustomQuerySet(self.model)
if self._db is not None:
qs = qs.using(self._db)
return qs
Exposing multiple databases in Django's admin interface
=======================================================
Django's admin doesn't have any explicit support for multiple
databases. If you want to provide an admin interface for a model on a
database other than ``default``, you need to write custom
:class:`~django.contrib.admin.ModelAdmin` classes that will direct the
admin to use a specific database for content.
There are four methods that require customization on a ``ModelAdmin``
object::
class MultiDBModelAdmin(admin.ModelAdmin):
# A handy constant for the name of the alternate database
using = 'other'
def save_model(self, request, obj, form, change):
# Tell Django to save objects to the 'other' database
obj.save(using=self.using)
def queryset(self, request):
# Tell Django to look for objects on the 'other' database
return super(MultiDBModelAdmin, self).queryset(request).using(self.using)
def formfield_for_foreignkey(self, db_field, request=None, **kwargs):
# Tell Django to populate ForeignKey widgets using a query
# on the 'other' database
return super(MultiDBModelAdmin, self).formfield_for_foreignkey(db_field, request=request, using=self.using, **kwargs)
def formfield_for_manytomany(self, db_field, request=None, **kwargs):
# Tell Django to populate ManyToMany widgets using a query
# on the 'other' database
return super(MultiDBModelAdmin, self).formfield_for_manytomany(db_field, request=request, using=self.using, **kwargs)
The implementation provided here implements a multi-db strategy where
all objects of a given type are stored on a specific database (e.g.,
all ``User`` objects are on the ``other`` database). If your usage of
multi-db is more complex, your ``ModelAdmin`` will need to reflect
that strategy.
Inlines can be handled in a similar fashion -- they require just three
customized methods::
class MultiDBTabularInline(admin.TabularInline):
using = 'other'
def queryset(self, request):
# Tell Django to look for inline objects on the 'other' database
return super(MultiDBTabularInline, self).queryset(request).using(self.using)
def formfield_for_foreignkey(self, db_field, request=None, **kwargs):
# Tell Django to populate ForeignKey widgets using a query
# on the 'other' database
return super(MultiDBTabularInline, self).formfield_for_foreignkey(db_field, request=request, using=self.using, **kwargs)
def formfield_for_manytomany(self, db_field, request=None, **kwargs):
# Tell Django to populate ManyToMany widgets using a query
# on the 'other' database
return super(MultiDBTabularInline, self).formfield_for_manytomany(db_field, request=request, using=self.using, **kwargs)
Once you have written your model admin definitions, they can be
registered with any Admin instance::
    from django.contrib import admin

    # Specialize the multi-db admin objects for use with specific models.
    class BookInline(MultiDBTabularInline):
        model = Book

    class PublisherAdmin(MultiDBModelAdmin):
        inlines = [BookInline]

    admin.site.register(Author, MultiDBModelAdmin)
    admin.site.register(Publisher, PublisherAdmin)

    othersite = admin.Site('othersite')
    othersite.register(Publisher, MultiDBModelAdmin)
This example sets up two admin sites. On the first site, the
``Author`` and ``Publisher`` objects are exposed; ``Publisher``
objects have a tabular inline showing books published by that
publisher. The second site exposes just publishers, without the
inlines.


@@ -23,7 +23,7 @@ Performing raw queries
The ``raw()`` manager method can be used to perform raw SQL queries that
return model instances:
.. method:: Manager.raw(query, params=None, translations=None)
.. method:: Manager.raw(raw_query, params=None, translations=None)
This method takes a raw SQL query, executes it, and returns model
instances.


@@ -56,7 +56,10 @@ Controlling transaction management in views
For most people, implicit request-based transactions work wonderfully. However,
if you need more fine-grained control over how transactions are managed, you
can use Python decorators to change the way transactions are handled by a
particular view function.
particular view function. All of the decorators take an optional
``using`` parameter, which should be the alias of the database
connection that the behavior applies to. If no alias is specified,
the ``"default"`` database is used.
.. note::
@@ -79,9 +82,14 @@ Example::
def viewfunc(request):
....
@transaction.autocommit(using="my_other_database")
def viewfunc2(request):
....
Within ``viewfunc()``, transactions will be committed as soon as you call
``model.save()``, ``model.delete()``, or any other function that writes to the
database.
database. ``viewfunc2()`` will have this same behavior, but for the
``"my_other_database"`` connection.
``django.db.transaction.commit_on_success``
-------------------------------------------
@@ -95,6 +103,10 @@ all the work done in a function::
def viewfunc(request):
....
@transaction.commit_on_success(using="my_other_database")
def viewfunc2(request):
....
If the function returns successfully, then Django will commit all work done
within the function at that point. If the function raises an exception, though,
Django will roll back the transaction.
@@ -127,6 +139,10 @@ Manual transaction management looks like this::
else:
transaction.commit()
@transaction.commit_manually(using="my_other_database")
def viewfunc2(request):
....
.. admonition:: An important note to users of earlier Django releases:
The database ``connection.commit()`` and ``connection.rollback()`` methods
@@ -169,21 +185,25 @@ issue a rollback, the entire transaction is rolled back. Savepoints provide
the ability to perform a fine-grained rollback, rather than the full rollback
that would be performed by ``transaction.rollback()``.
Each of these functions takes a ``using`` argument, which should be
the alias of the database that the behavior applies to. If no
``using`` argument is provided, the ``"default"`` database is used.
Savepoints are controlled by three methods on the transaction object:
.. method:: transaction.savepoint()
.. method:: transaction.savepoint(using=None)
Creates a new savepoint. This marks a point in the transaction that
is known to be in a "good" state.
Returns the savepoint ID (sid).
.. method:: transaction.savepoint_commit(sid)
.. method:: transaction.savepoint_commit(sid, using=None)
Updates the savepoint to include any operations that have been performed
since the savepoint was created, or since the last commit.
.. method:: transaction.savepoint_rollback(sid)
.. method:: transaction.savepoint_rollback(sid, using=None)
Rolls the transaction back to the last point at which the savepoint was
committed.
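To tie these together, here is a sketch of savepoints used against a
non-default connection (the ``'my_other_database'`` alias and the
``Author`` model are only examples)::

    from django.db import transaction

    @transaction.commit_manually(using='my_other_database')
    def viewfunc(request):
        a = Author(name='Alice')
        a.save(using='my_other_database')

        # Mark a known-good point in the open transaction.
        sid = transaction.savepoint(using='my_other_database')

        try:
            b = Author(name='Bob')
            b.save(using='my_other_database')
        except Exception:
            # Discard the failed save; a.save() is still in the transaction.
            transaction.savepoint_rollback(sid, using='my_other_database')
        else:
            # Keep both saves.
            transaction.savepoint_commit(sid, using='my_other_database')

        transaction.commit(using='my_other_database')
        # ... return a response here ...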


@@ -24,14 +24,6 @@ To enable session functionality, do the following:
The default ``settings.py`` created by ``django-admin.py startproject`` has
``SessionMiddleware`` activated.
* Add ``'django.contrib.sessions'`` to your ``INSTALLED_APPS`` setting,
and run ``manage.py syncdb`` to install the single database table
that stores session data.
.. versionchanged:: 1.0
This step is optional if you're not using the database session backend;
see `configuring the session engine`_.
If you don't want to use sessions, you might as well remove the
``SessionMiddleware`` line from ``MIDDLEWARE_CLASSES`` and ``'django.contrib.sessions'``
from your ``INSTALLED_APPS``. It'll save you a small bit of overhead.
@@ -46,6 +38,22 @@ By default, Django stores sessions in your database (using the model
some setups it's faster to store session data elsewhere, so Django can be
configured to store session data on your filesystem or in your cache.
Using database-backed sessions
------------------------------
If you want to use a database-backed session, you need to add
``'django.contrib.sessions'`` to your ``INSTALLED_APPS`` setting.
If you want to store your session data in a database other than the
one aliased ``default``, set the :setting:`SESSION_DB_ALIAS` setting
to the alias of that database.
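For example, assuming your :setting:`DATABASES` setting defines an alias
named ``'sessions_db'`` (the name is only an example), you would add::

    SESSION_DB_ALIAS = 'sessions_db'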
Once you have configured your installation, run ``manage.py syncdb``
to install the single database table that stores session data.
.. versionadded:: 1.2
The :setting:`SESSION_DB_ALIAS` setting was added in Django 1.2. It
is not required in earlier versions.
Using cached sessions
---------------------
@@ -86,6 +94,9 @@ disregards persistence. In most cases, the ``cached_db`` backend will be fast
enough, but if you need that last bit of performance, and are willing to let
session data be expunged from time to time, the ``cache`` backend is for you.
If you use the ``cached_db`` session backend, you also need to follow
the configuration instructions for `using database-backed sessions`_.
Using file-based sessions
-------------------------
@@ -97,6 +108,7 @@ to output from ``tempfile.gettempdir()``, most likely ``/tmp``) to control
where Django stores session files. Be sure to check that your Web server has
permissions to read and write to this location.
Using sessions in views
=======================


@@ -278,33 +278,35 @@ The test database
-----------------
Tests that require a database (namely, model tests) will not use your "real"
(production) database. A separate, blank database is created for the tests.
(production) database. Separate, blank databases are created for the tests.
Regardless of whether the tests pass or fail, the test database is destroyed
Regardless of whether the tests pass or fail, the test databases are destroyed
when all the tests have been executed.
By default this test database gets its name by prepending ``test_`` to the
value of the :setting:`DATABASE_NAME` setting. When using the SQLite database
engine the tests will by default use an in-memory database (i.e., the database
will be created in memory, bypassing the filesystem entirely!). If you want to
use a different database name, specify the :setting:`TEST_DATABASE_NAME`
setting.
By default the test databases get their names by prepending ``test_``
to the value of the :setting:`NAME` setting for each database
defined in :setting:`DATABASES`. When using the SQLite database engine
the tests will by default use an in-memory database (i.e., the
database will be created in memory, bypassing the filesystem
entirely!). If you want to use a different database name, specify
``TEST_NAME`` in the dictionary for any given database in
:setting:`DATABASES`.
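For example, a :setting:`DATABASES` entry with an explicit test database
name might look like this (all of the names shown are placeholders)::

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'app_data',
            'USER': 'postgres_user',
            'PASSWORD': 's3krit',
            'TEST_NAME': 'test_app_data_custom',
        },
    }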
Aside from using a separate database, the test runner will otherwise use all of
the same database settings you have in your settings file:
:setting:`DATABASE_ENGINE`, :setting:`DATABASE_USER`, :setting:`DATABASE_HOST`,
etc. The test database is created by the user specified by
:setting:`DATABASE_USER`, so you'll need to make sure that the given user
account has sufficient privileges to create a new database on the system.
Aside from using a separate database, the test runner will otherwise
use all of the same database settings you have in your settings file:
:setting:`ENGINE`, :setting:`USER`, :setting:`HOST`, etc. The test
database is created by the user specified by ``USER``, so you'll need
to make sure that the given user account has sufficient privileges to
create a new database on the system.
.. versionadded:: 1.0
For fine-grained control over the
character encoding of your test database, use the
:setting:`TEST_DATABASE_CHARSET` setting. If you're using MySQL, you can also
use the :setting:`TEST_DATABASE_COLLATION` setting to control the particular
collation used by the test database. See the :ref:`settings documentation
<ref-settings>` for details of these advanced settings.
For fine-grained control over the character encoding of your test
database, use the :setting:`TEST_CHARSET` option. If you're using
MySQL, you can also use the :setting:`TEST_COLLATION` option to
control the particular collation used by the test database. See the
:ref:`settings documentation <ref-settings>` for details of these
advanced settings.
Other test conditions
---------------------
@@ -1037,6 +1039,39 @@ URLconf for the duration of the test case.
.. _emptying-test-outbox:
Multi-database support
~~~~~~~~~~~~~~~~~~~~~~
.. attribute:: TestCase.multi_db
.. versionadded:: 1.2
Django sets up a test database corresponding to every database that is
defined in the :setting:`DATABASES` definition in your settings
file. However, a big part of the time taken to run a Django
``TestCase`` is consumed by the call to ``flush`` that ensures that
you have a clean database at the start of each test run. If you have
multiple databases, multiple flushes are required (one for each
database), which can be a time-consuming activity -- especially if
your tests don't need to test multi-database activity.
As an optimization, Django only flushes the ``default`` database at
the start of each test run. If your setup contains multiple databases,
and you have a test that requires every database to be clean, you can
use the ``multi_db`` attribute on the test suite to request a full
flush.
For example::
class TestMyViews(TestCase):
multi_db = True
def testIndexPageView(self):
call_some_test_code()
This test case will flush *all* the test databases before running
``testIndexPageView``.
Emptying the test outbox
~~~~~~~~~~~~~~~~~~~~~~~~
@@ -1251,16 +1286,17 @@ utility methods in the ``django.test.utils`` module.
.. function:: setup_test_environment()
Performs any global pre-test setup, such as installing the
instrumentation of the template rendering system and setting up the dummy
``SMTPConnection``.
instrumentation of the template rendering system and setting up
the dummy ``SMTPConnection``.
.. function:: teardown_test_environment()
Performs any global post-test teardown, such as removing the black magic
hooks into the template system and restoring normal e-mail services.
Performs any global post-test teardown, such as removing the black
magic hooks into the template system and restoring normal e-mail
services.
The creation module of the database backend (``connection.creation``) also
provides some utilities that can be useful during testing.
The creation module of the database backend (``connection.creation``)
also provides some utilities that can be useful during testing.
.. function:: create_test_db(verbosity=1, autoclobber=False)
@@ -1268,27 +1304,29 @@ provides some utilities that can be useful during testing.
``verbosity`` has the same behavior as in ``run_tests()``.
``autoclobber`` describes the behavior that will occur if a database with
the same name as the test database is discovered:
``autoclobber`` describes the behavior that will occur if a
database with the same name as the test database is discovered:
* If ``autoclobber`` is ``False``, the user will be asked to approve
destroying the existing database. ``sys.exit`` is called if the user
does not approve.
* If ``autoclobber`` is ``False``, the user will be asked to
approve destroying the existing database. ``sys.exit`` is
called if the user does not approve.
* If autoclobber is ``True``, the database will be destroyed without
consulting the user.
* If autoclobber is ``True``, the database will be destroyed
without consulting the user.
Returns the name of the test database that it created.
``create_test_db()`` has the side effect of modifying
``settings.DATABASE_NAME`` to match the name of the test database.
``create_test_db()`` has the side effect of modifying the value of
:setting:`NAME` in :setting:`DATABASES` to match the name of the test
database.
.. versionchanged:: 1.0
``create_test_db()`` now returns the name of the test database.
.. function:: destroy_test_db(old_database_name, verbosity=1)
Destroys the database whose name is in the :setting:`DATABASE_NAME` setting
and restores the value of :setting:`DATABASE_NAME` to the provided name.
Destroys the database whose name is stored in :setting:`NAME` in
:setting:`DATABASES`, and sets :setting:`NAME` back to the
provided name.
``verbosity`` has the same behavior as in ``run_tests()``.
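As a sketch of how these utilities fit together for a single connection
(error handling omitted; the ``'default'`` alias is just the usual one)::

    from django.db import connections
    from django.test.utils import setup_test_environment, teardown_test_environment

    setup_test_environment()
    connection = connections['default']

    # Remember the real database name so it can be restored later.
    old_name = connection.settings_dict['NAME']

    # Create (and switch to) the test database.
    connection.creation.create_test_db(verbosity=1, autoclobber=False)

    # ... run whatever needs the test database here ...

    # Destroy the test database and restore the original NAME setting.
    connection.creation.destroy_test_db(old_name, verbosity=1)
    teardown_test_environment()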