
Hue Aquascan CVEs (#1095)

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding Django-1.11.29

* HUE-8905 [core] Apply HUE-8772 to Django-1.11.22 for fixing 'user is missing in mako context'

(cherry picked from commit cf717de5cd857701535c754cb242a97f08bd9214)

* HUE-8905 [core] Apply HUE-8836 to Django-1.11.22 to handle HTTP_X_FORWARDED_HOST containing multiple hosts

(cherry picked from commit 26e5f7a096fd1b05590ad5a33e43b4e252dc8c57)
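
For context, HTTP_X_FORWARDED_HOST can carry a comma-separated list of hosts when a request passes through several proxies, which breaks naive host handling. A minimal sketch of splitting such a header follows; taking the first entry and the helper name effective_host are illustrative assumptions, not necessarily what HUE-8836 does.

    # Hedged sketch: pick one host out of a comma-separated X-Forwarded-Host
    # value. Taking the first (client-facing) entry is an assumption made for
    # illustration only.
    def effective_host(meta):
        forwarded = meta.get('HTTP_X_FORWARDED_HOST', '')
        if forwarded:
            # 'hue.example.com, lb.internal' -> 'hue.example.com'
            return forwarded.split(',')[0].strip()
        return meta.get('HTTP_HOST', '')

    # effective_host({'HTTP_X_FORWARDED_HOST': 'hue.example.com, lb.internal'})
    # returns 'hue.example.com'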

* Revert "HUE-8905 [core] Apply HUE-8772 to Django-1.11.22 for fixing 'user is missing in mako context'"

This reverts commit cf717de5cd857701535c754cb242a97f08bd9214.

(cherry picked from commit 9014bfa9b9c4773a4186f3f64fc8e2616ad7eeee)

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Removing Django-1.11.22

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding cryptography-2.9
- Removing cryptography-2.1.4

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding MarkupSafe-1.1.1
- Removing MarkupSafe-0.9.3

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding PyYAML-5.3.1
- Removing PyYAML-3.12

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding requests-2.23.0
- Removing requests-2.18.4

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding urllib3-1.25.8
- Removing urllib3-1.22

* HUE-5095 [backend] Python requests library should put port information in log message

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding requests-kerberos-0.12.0
- Removing requests-kerberos-0.6.1

* HUE-7127 [backend] Fix Python requests Kerberos/GSSAPI authentication library failing to authenticate Kerberos requests to the same destination

The current python requests Kerberos library fails to generate the GSSAPI authentication token for a host which runs multiple kerberized services.

Testing Done:
- Manual testing
  - using multiple load generator scripts
- Tested on different clusters with Python 2.6 and Python 2.7

* HUE-8202 [jb] Fix mutual authentication failure with Isilon (#675)

https://review.cloudera.org/r/12820/
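
The entry above only names the fix, so here is a hedged illustration of the requests-kerberos setting that governs mutual authentication, the mechanism that fails against servers such as Isilon. The URL is a placeholder and this is generic library usage, not the actual HUE-8202 patch.

    # Illustration only: relax Kerberos mutual authentication so a response
    # without a server GSSAPI ticket is tolerated instead of rejected outright.
    import requests
    from requests_kerberos import HTTPKerberosAuth, OPTIONAL

    # OPTIONAL verifies the server's response ticket when it is present but
    # does not fail hard when the server skips mutual authentication.
    auth = HTTPKerberosAuth(mutual_authentication=OPTIONAL)
    resp = requests.get('https://isilon.example.com:8082/namespace/', auth=auth)
    print(resp.status_code)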

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding tablib-0.14.0
- Removing tablib-0.10.0

* HUE-2523 [core] Remove xlwt and xlrd from tablib

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding MarkupPy-1.14

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding openpyxl-2.6.4
- Removing openpyxl-2.3.0-b2
- Removing openpyxl-2.5.3

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding odfpy-1.4.1

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding pysaml2-4.9.0
- Removing pysaml2-4.4.0

* HUE-3102 [libsaml] Add support for private key passwords

This is also tracked by IdentityPython/pysaml2#278.

This patch adds support for SAML certificates that are protected
with a password. It does so with a bit of trickery, because
`xmlsec1`, the external program that pysaml2 uses to sign the
XML requests, does not have great support for password-protected
certificates: it only supports passing the password on the
command line (which is not safe, since someone else on the
machine could see it) or entering it through an interactive
prompt.

The proper way to fix this would be to update pysaml2 to use
another xmlsec library, but implementing that may take some
time. In the short/medium term, this patch instead decrypts
the certificate in memory and passes the decrypted certificate
to xmlsec1 through a named pipe. This keeps the decrypted
certificate from ever hitting the disk.

Unfortunately, this solution is only portable to
POSIX-compatible platforms. That is fine for Hue, but it
probably means we cannot push this patch to the upstream
pysaml2 repository. This patch will tide us over until
the upstream project switches to a better xmlsec library.
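
A minimal, POSIX-only sketch of the named-pipe approach described above follows. It assumes the cryptography package (upgraded to 2.9 elsewhere in this change); the function name, paths and xmlsec1 arguments are illustrative, not Hue's actual implementation.

    # Sketch: decrypt a password-protected PEM key in memory and hand the
    # plaintext to xmlsec1 through a named pipe so it never touches disk.
    import os
    import subprocess
    import tempfile
    import threading

    from cryptography.hazmat.backends import default_backend
    from cryptography.hazmat.primitives import serialization

    def sign_with_protected_key(encrypted_key_path, password, xml_path):
        # password is the key passphrase as bytes, or None.
        with open(encrypted_key_path, 'rb') as f:
            key = serialization.load_pem_private_key(f.read(), password, default_backend())
        plaintext_pem = key.private_bytes(
            serialization.Encoding.PEM,
            serialization.PrivateFormat.TraditionalOpenSSL,
            serialization.NoEncryption(),
        )

        fifo_dir = tempfile.mkdtemp()
        fifo_path = os.path.join(fifo_dir, 'key.pem')
        os.mkfifo(fifo_path, 0o600)  # named pipe; the plaintext key is never written to disk

        def feed():
            # Opening the FIFO for writing blocks until xmlsec1 opens it for
            # reading, then the decrypted key is streamed across.
            with open(fifo_path, 'wb') as pipe:
                pipe.write(plaintext_pem)

        writer = threading.Thread(target=feed)
        writer.start()
        # Hypothetical invocation; real callers pass the full xmlsec1 option
        # set and add error handling around the pipe writer.
        subprocess.check_call(['xmlsec1', '--sign', '--privkey-pem', fifo_path, xml_path])
        writer.join()
        os.remove(fifo_path)
        os.rmdir(fifo_dir)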

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding chardet-3.0.4

* HUE-9225 [core] Upgrade certain third party python libraries with identified vulnerabilities
- Adding pycryptodomex-3.9.7
- Removing pycrypto-2.6.1
- Removing pycryptodomex-3.4.7

Co-authored-by: Ying Chen <yingchen@cloudera.com>
Prakash Ranade 5 years ago
parent
commit
b99879b7a0
100 changed files with 58 additions and 3845 deletions
  1. 0 31
      desktop/core/ext-py/Django-1.11.22/PKG-INFO
  2. 0 27
      desktop/core/ext-py/Django-1.11.22/django/__init__.py
  3. 0 483
      desktop/core/ext-py/Django-1.11.22/django/contrib/gis/db/models/functions.py
  4. 0 451
      desktop/core/ext-py/Django-1.11.22/django/utils/text.py
  5. 0 707
      desktop/core/ext-py/Django-1.11.22/tests/gis_tests/distapp/tests.py
  6. 0 905
      desktop/core/ext-py/Django-1.11.22/tests/gis_tests/geoapp/tests.py
  7. 0 1241
      desktop/core/ext-py/Django-1.11.22/tests/timezones/tests.py
  8. 0 0
      desktop/core/ext-py/Django-1.11.29/AUTHORS
  9. 0 0
      desktop/core/ext-py/Django-1.11.29/CONTRIBUTING.rst
  10. 0 0
      desktop/core/ext-py/Django-1.11.29/Gruntfile.js
  11. 0 0
      desktop/core/ext-py/Django-1.11.29/INSTALL
  12. 0 0
      desktop/core/ext-py/Django-1.11.29/LICENSE
  13. 0 0
      desktop/core/ext-py/Django-1.11.29/LICENSE.python
  14. 0 0
      desktop/core/ext-py/Django-1.11.29/MANIFEST.in
  15. 31 0
      desktop/core/ext-py/Django-1.11.29/PKG-INFO
  16. 0 0
      desktop/core/ext-py/Django-1.11.29/README.rst
  17. 27 0
      desktop/core/ext-py/Django-1.11.29/django/__init__.py
  18. 0 0
      desktop/core/ext-py/Django-1.11.29/django/__main__.py
  19. 0 0
      desktop/core/ext-py/Django-1.11.29/django/apps/__init__.py
  20. 0 0
      desktop/core/ext-py/Django-1.11.29/django/apps/config.py
  21. 0 0
      desktop/core/ext-py/Django-1.11.29/django/apps/registry.py
  22. 0 0
      desktop/core/ext-py/Django-1.11.29/django/bin/django-admin.py
  23. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/__init__.py
  24. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/app_template/__init__.py-tpl
  25. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/app_template/admin.py-tpl
  26. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/app_template/apps.py-tpl
  27. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/app_template/migrations/__init__.py-tpl
  28. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/app_template/models.py-tpl
  29. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/app_template/tests.py-tpl
  30. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/app_template/views.py-tpl
  31. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/global_settings.py
  32. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/__init__.py
  33. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/af/LC_MESSAGES/django.mo
  34. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/af/LC_MESSAGES/django.po
  35. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/ar/LC_MESSAGES/django.mo
  36. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/ar/LC_MESSAGES/django.po
  37. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/ar/__init__.py
  38. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/ar/formats.py
  39. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/ast/LC_MESSAGES/django.mo
  40. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/ast/LC_MESSAGES/django.po
  41. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/az/LC_MESSAGES/django.mo
  42. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/az/LC_MESSAGES/django.po
  43. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/az/__init__.py
  44. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/az/formats.py
  45. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/be/LC_MESSAGES/django.mo
  46. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/be/LC_MESSAGES/django.po
  47. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/bg/LC_MESSAGES/django.mo
  48. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/bg/LC_MESSAGES/django.po
  49. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/bg/__init__.py
  50. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/bg/formats.py
  51. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/bn/LC_MESSAGES/django.mo
  52. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/bn/LC_MESSAGES/django.po
  53. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/bn/__init__.py
  54. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/bn/formats.py
  55. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/br/LC_MESSAGES/django.mo
  56. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/br/LC_MESSAGES/django.po
  57. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/bs/LC_MESSAGES/django.mo
  58. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/bs/LC_MESSAGES/django.po
  59. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/bs/__init__.py
  60. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/bs/formats.py
  61. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/ca/LC_MESSAGES/django.mo
  62. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/ca/LC_MESSAGES/django.po
  63. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/ca/__init__.py
  64. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/ca/formats.py
  65. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/cs/LC_MESSAGES/django.mo
  66. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/cs/LC_MESSAGES/django.po
  67. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/cs/__init__.py
  68. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/cs/formats.py
  69. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/cy/LC_MESSAGES/django.mo
  70. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/cy/LC_MESSAGES/django.po
  71. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/cy/__init__.py
  72. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/cy/formats.py
  73. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/da/LC_MESSAGES/django.mo
  74. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/da/LC_MESSAGES/django.po
  75. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/da/__init__.py
  76. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/da/formats.py
  77. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/de/LC_MESSAGES/django.mo
  78. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/de/LC_MESSAGES/django.po
  79. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/de/__init__.py
  80. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/de/formats.py
  81. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/de_CH/__init__.py
  82. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/de_CH/formats.py
  83. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/dsb/LC_MESSAGES/django.mo
  84. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/dsb/LC_MESSAGES/django.po
  85. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/el/LC_MESSAGES/django.mo
  86. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/el/LC_MESSAGES/django.po
  87. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/el/__init__.py
  88. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/el/formats.py
  89. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/en/LC_MESSAGES/django.mo
  90. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/en/LC_MESSAGES/django.po
  91. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/en/__init__.py
  92. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/en/formats.py
  93. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_AU/LC_MESSAGES/django.mo
  94. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_AU/LC_MESSAGES/django.po
  95. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_AU/__init__.py
  96. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_AU/formats.py
  97. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_GB/LC_MESSAGES/django.mo
  98. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_GB/LC_MESSAGES/django.po
  99. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_GB/__init__.py
  100. 0 0
      desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_GB/formats.py

+ 0 - 31
desktop/core/ext-py/Django-1.11.22/PKG-INFO

@@ -1,31 +0,0 @@
-Metadata-Version: 2.1
-Name: Django
-Version: 1.11.22
-Summary: A high-level Python Web framework that encourages rapid development and clean, pragmatic design.
-Home-page: https://www.djangoproject.com/
-Author: Django Software Foundation
-Author-email: foundation@djangoproject.com
-License: BSD
-Description: UNKNOWN
-Platform: UNKNOWN
-Classifier: Development Status :: 5 - Production/Stable
-Classifier: Environment :: Web Environment
-Classifier: Framework :: Django
-Classifier: Intended Audience :: Developers
-Classifier: License :: OSI Approved :: BSD License
-Classifier: Operating System :: OS Independent
-Classifier: Programming Language :: Python
-Classifier: Programming Language :: Python :: 2
-Classifier: Programming Language :: Python :: 2.7
-Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.4
-Classifier: Programming Language :: Python :: 3.5
-Classifier: Programming Language :: Python :: 3.6
-Classifier: Programming Language :: Python :: 3.7
-Classifier: Topic :: Internet :: WWW/HTTP
-Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
-Classifier: Topic :: Internet :: WWW/HTTP :: WSGI
-Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
-Classifier: Topic :: Software Development :: Libraries :: Python Modules
-Provides-Extra: argon2
-Provides-Extra: bcrypt

+ 0 - 27
desktop/core/ext-py/Django-1.11.22/django/__init__.py

@@ -1,27 +0,0 @@
-from __future__ import unicode_literals
-
-from django.utils.version import get_version
-
-VERSION = (1, 11, 22, 'final', 0)
-
-__version__ = get_version(VERSION)
-
-
-def setup(set_prefix=True):
-    """
-    Configure the settings (this happens as a side effect of accessing the
-    first setting), configure logging and populate the app registry.
-    Set the thread-local urlresolvers script prefix if `set_prefix` is True.
-    """
-    from django.apps import apps
-    from django.conf import settings
-    from django.urls import set_script_prefix
-    from django.utils.encoding import force_text
-    from django.utils.log import configure_logging
-
-    configure_logging(settings.LOGGING_CONFIG, settings.LOGGING)
-    if set_prefix:
-        set_script_prefix(
-            '/' if settings.FORCE_SCRIPT_NAME is None else force_text(settings.FORCE_SCRIPT_NAME)
-        )
-    apps.populate(settings.INSTALLED_APPS)

+ 0 - 483
desktop/core/ext-py/Django-1.11.22/django/contrib/gis/db/models/functions.py

@@ -1,483 +0,0 @@
-from decimal import Decimal
-
-from django.contrib.gis.db.models.fields import GeometryField, RasterField
-from django.contrib.gis.db.models.sql import AreaField
-from django.contrib.gis.geometry.backend import Geometry
-from django.contrib.gis.measure import (
-    Area as AreaMeasure, Distance as DistanceMeasure,
-)
-from django.core.exceptions import FieldError
-from django.db.models import BooleanField, FloatField, IntegerField, TextField
-from django.db.models.expressions import Func, Value
-from django.utils import six
-
-NUMERIC_TYPES = six.integer_types + (float, Decimal)
-
-
-class GeoFunc(Func):
-    function = None
-    output_field_class = None
-    geom_param_pos = 0
-
-    def __init__(self, *expressions, **extra):
-        if 'output_field' not in extra and self.output_field_class:
-            extra['output_field'] = self.output_field_class()
-        super(GeoFunc, self).__init__(*expressions, **extra)
-
-    @property
-    def name(self):
-        return self.__class__.__name__
-
-    @property
-    def srid(self):
-        expr = self.source_expressions[self.geom_param_pos]
-        if hasattr(expr, 'srid'):
-            return expr.srid
-        try:
-            return expr.field.srid
-        except (AttributeError, FieldError):
-            return None
-
-    @property
-    def geo_field(self):
-        return GeometryField(srid=self.srid) if self.srid else None
-
-    def as_sql(self, compiler, connection, **extra_context):
-        if self.function is None:
-            self.function = connection.ops.spatial_function_name(self.name)
-        if any(isinstance(field, RasterField) for field in self.get_source_fields()):
-            raise TypeError("Geometry functions not supported for raster fields.")
-        return super(GeoFunc, self).as_sql(compiler, connection, **extra_context)
-
-    def resolve_expression(self, *args, **kwargs):
-        res = super(GeoFunc, self).resolve_expression(*args, **kwargs)
-        base_srid = res.srid
-        if not base_srid:
-            raise TypeError("Geometry functions can only operate on geometric content.")
-
-        for pos, expr in enumerate(res.source_expressions[1:], start=1):
-            if isinstance(expr, GeomValue) and expr.srid != base_srid:
-                # Automatic SRID conversion so objects are comparable
-                res.source_expressions[pos] = Transform(expr, base_srid).resolve_expression(*args, **kwargs)
-        return res
-
-    def _handle_param(self, value, param_name='', check_types=None):
-        if not hasattr(value, 'resolve_expression'):
-            if check_types and not isinstance(value, check_types):
-                raise TypeError(
-                    "The %s parameter has the wrong type: should be %s." % (
-                        param_name, str(check_types))
-                )
-        return value
-
-
-class GeomValue(Value):
-    geography = False
-
-    @property
-    def srid(self):
-        return self.value.srid
-
-    def as_sql(self, compiler, connection):
-        return '%s(%%s, %s)' % (connection.ops.from_text, self.srid), [connection.ops.Adapter(self.value)]
-
-    def as_mysql(self, compiler, connection):
-        return '%s(%%s)' % (connection.ops.from_text), [connection.ops.Adapter(self.value)]
-
-    def as_postgresql(self, compiler, connection):
-        if self.geography:
-            self.value = connection.ops.Adapter(self.value, geography=self.geography)
-        else:
-            self.value = connection.ops.Adapter(self.value)
-        return super(GeomValue, self).as_sql(compiler, connection)
-
-
-class GeoFuncWithGeoParam(GeoFunc):
-    def __init__(self, expression, geom, *expressions, **extra):
-        if not isinstance(geom, Geometry):
-            raise TypeError("Please provide a geometry object.")
-        if not hasattr(geom, 'srid') or not geom.srid:
-            raise ValueError("Please provide a geometry attribute with a defined SRID.")
-        super(GeoFuncWithGeoParam, self).__init__(expression, GeomValue(geom), *expressions, **extra)
-
-
-class SQLiteDecimalToFloatMixin(object):
-    """
-    By default, Decimal values are converted to str by the SQLite backend, which
-    is not acceptable by the GIS functions expecting numeric values.
-    """
-    def as_sqlite(self, compiler, connection):
-        for expr in self.get_source_expressions():
-            if hasattr(expr, 'value') and isinstance(expr.value, Decimal):
-                expr.value = float(expr.value)
-        return super(SQLiteDecimalToFloatMixin, self).as_sql(compiler, connection)
-
-
-class OracleToleranceMixin(object):
-    tolerance = 0.05
-
-    def as_oracle(self, compiler, connection):
-        tol = self.extra.get('tolerance', self.tolerance)
-        self.template = "%%(function)s(%%(expressions)s, %s)" % tol
-        return super(OracleToleranceMixin, self).as_sql(compiler, connection)
-
-
-class Area(OracleToleranceMixin, GeoFunc):
-    output_field_class = AreaField
-    arity = 1
-
-    def as_sql(self, compiler, connection, **extra_context):
-        if connection.ops.geography:
-            self.output_field.area_att = 'sq_m'
-        else:
-            # Getting the area units of the geographic field.
-            geo_field = self.geo_field
-            if geo_field.geodetic(connection):
-                if connection.features.supports_area_geodetic:
-                    self.output_field.area_att = 'sq_m'
-                else:
-                    # TODO: Do we want to support raw number areas for geodetic fields?
-                    raise NotImplementedError('Area on geodetic coordinate systems not supported.')
-            else:
-                units_name = geo_field.units_name(connection)
-                if units_name:
-                    self.output_field.area_att = AreaMeasure.unit_attname(units_name)
-        return super(Area, self).as_sql(compiler, connection, **extra_context)
-
-    def as_oracle(self, compiler, connection):
-        self.output_field = AreaField('sq_m')  # Oracle returns area in units of meters.
-        return super(Area, self).as_oracle(compiler, connection)
-
-    def as_sqlite(self, compiler, connection, **extra_context):
-        if self.geo_field.geodetic(connection):
-            extra_context['template'] = '%(function)s(%(expressions)s, %(spheroid)d)'
-            extra_context['spheroid'] = True
-        return self.as_sql(compiler, connection, **extra_context)
-
-
-class AsGeoJSON(GeoFunc):
-    output_field_class = TextField
-
-    def __init__(self, expression, bbox=False, crs=False, precision=8, **extra):
-        expressions = [expression]
-        if precision is not None:
-            expressions.append(self._handle_param(precision, 'precision', six.integer_types))
-        options = 0
-        if crs and bbox:
-            options = 3
-        elif bbox:
-            options = 1
-        elif crs:
-            options = 2
-        if options:
-            expressions.append(options)
-        super(AsGeoJSON, self).__init__(*expressions, **extra)
-
-
-class AsGML(GeoFunc):
-    geom_param_pos = 1
-    output_field_class = TextField
-
-    def __init__(self, expression, version=2, precision=8, **extra):
-        expressions = [version, expression]
-        if precision is not None:
-            expressions.append(self._handle_param(precision, 'precision', six.integer_types))
-        super(AsGML, self).__init__(*expressions, **extra)
-
-    def as_oracle(self, compiler, connection, **extra_context):
-        source_expressions = self.get_source_expressions()
-        version = source_expressions[0]
-        clone = self.copy()
-        clone.set_source_expressions([source_expressions[1]])
-        extra_context['function'] = 'SDO_UTIL.TO_GML311GEOMETRY' if version.value == 3 else 'SDO_UTIL.TO_GMLGEOMETRY'
-        return super(AsGML, clone).as_sql(compiler, connection, **extra_context)
-
-
-class AsKML(AsGML):
-    def as_sqlite(self, compiler, connection):
-        # No version parameter
-        clone = self.copy()
-        clone.set_source_expressions(self.get_source_expressions()[1:])
-        return clone.as_sql(compiler, connection)
-
-
-class AsSVG(GeoFunc):
-    output_field_class = TextField
-
-    def __init__(self, expression, relative=False, precision=8, **extra):
-        relative = relative if hasattr(relative, 'resolve_expression') else int(relative)
-        expressions = [
-            expression,
-            relative,
-            self._handle_param(precision, 'precision', six.integer_types),
-        ]
-        super(AsSVG, self).__init__(*expressions, **extra)
-
-
-class BoundingCircle(OracleToleranceMixin, GeoFunc):
-    def __init__(self, expression, num_seg=48, **extra):
-        super(BoundingCircle, self).__init__(*[expression, num_seg], **extra)
-
-    def as_oracle(self, compiler, connection):
-        clone = self.copy()
-        clone.set_source_expressions([self.get_source_expressions()[0]])
-        return super(BoundingCircle, clone).as_oracle(compiler, connection)
-
-
-class Centroid(OracleToleranceMixin, GeoFunc):
-    arity = 1
-
-
-class Difference(OracleToleranceMixin, GeoFuncWithGeoParam):
-    arity = 2
-
-
-class DistanceResultMixin(object):
-    def source_is_geography(self):
-        return self.get_source_fields()[0].geography and self.srid == 4326
-
-    def convert_value(self, value, expression, connection, context):
-        if value is None:
-            return None
-        geo_field = self.geo_field
-        if geo_field.geodetic(connection):
-            dist_att = 'm'
-        else:
-            units = geo_field.units_name(connection)
-            if units:
-                dist_att = DistanceMeasure.unit_attname(units)
-            else:
-                dist_att = None
-        if dist_att:
-            return DistanceMeasure(**{dist_att: value})
-        return value
-
-
-class Distance(DistanceResultMixin, OracleToleranceMixin, GeoFuncWithGeoParam):
-    output_field_class = FloatField
-    spheroid = None
-
-    def __init__(self, expr1, expr2, spheroid=None, **extra):
-        expressions = [expr1, expr2]
-        if spheroid is not None:
-            self.spheroid = spheroid
-            expressions += (self._handle_param(spheroid, 'spheroid', bool),)
-        super(Distance, self).__init__(*expressions, **extra)
-
-    def as_postgresql(self, compiler, connection):
-        geo_field = GeometryField(srid=self.srid)  # Fake field to get SRID info
-        if self.source_is_geography():
-            # Set parameters as geography if base field is geography
-            for pos, expr in enumerate(
-                    self.source_expressions[self.geom_param_pos + 1:], start=self.geom_param_pos + 1):
-                if isinstance(expr, GeomValue):
-                    expr.geography = True
-        elif geo_field.geodetic(connection):
-            # Geometry fields with geodetic (lon/lat) coordinates need special distance functions
-            if self.spheroid:
-                # DistanceSpheroid is more accurate and resource intensive than DistanceSphere
-                self.function = connection.ops.spatial_function_name('DistanceSpheroid')
-                # Replace boolean param by the real spheroid of the base field
-                self.source_expressions[2] = Value(geo_field._spheroid)
-            else:
-                self.function = connection.ops.spatial_function_name('DistanceSphere')
-        return super(Distance, self).as_sql(compiler, connection)
-
-    def as_oracle(self, compiler, connection):
-        if self.spheroid:
-            self.source_expressions.pop(2)
-        return super(Distance, self).as_oracle(compiler, connection)
-
-    def as_sqlite(self, compiler, connection, **extra_context):
-        if self.spheroid:
-            self.source_expressions.pop(2)
-        if self.geo_field.geodetic(connection):
-            # SpatiaLite returns NULL instead of zero on geodetic coordinates
-            extra_context['template'] = 'COALESCE(%(function)s(%(expressions)s, %(spheroid)s), 0)'
-            extra_context['spheroid'] = int(bool(self.spheroid))
-        return super(Distance, self).as_sql(compiler, connection, **extra_context)
-
-
-class Envelope(GeoFunc):
-    arity = 1
-
-
-class ForceRHR(GeoFunc):
-    arity = 1
-
-
-class GeoHash(GeoFunc):
-    output_field_class = TextField
-
-    def __init__(self, expression, precision=None, **extra):
-        expressions = [expression]
-        if precision is not None:
-            expressions.append(self._handle_param(precision, 'precision', six.integer_types))
-        super(GeoHash, self).__init__(*expressions, **extra)
-
-
-class Intersection(OracleToleranceMixin, GeoFuncWithGeoParam):
-    arity = 2
-
-
-class IsValid(OracleToleranceMixin, GeoFunc):
-    output_field_class = BooleanField
-
-    def as_oracle(self, compiler, connection, **extra_context):
-        sql, params = super(IsValid, self).as_oracle(compiler, connection, **extra_context)
-        return "CASE %s WHEN 'TRUE' THEN 1 ELSE 0 END" % sql, params
-
-
-class Length(DistanceResultMixin, OracleToleranceMixin, GeoFunc):
-    output_field_class = FloatField
-
-    def __init__(self, expr1, spheroid=True, **extra):
-        self.spheroid = spheroid
-        super(Length, self).__init__(expr1, **extra)
-
-    def as_sql(self, compiler, connection):
-        geo_field = GeometryField(srid=self.srid)  # Fake field to get SRID info
-        if geo_field.geodetic(connection) and not connection.features.supports_length_geodetic:
-            raise NotImplementedError("This backend doesn't support Length on geodetic fields")
-        return super(Length, self).as_sql(compiler, connection)
-
-    def as_postgresql(self, compiler, connection):
-        geo_field = GeometryField(srid=self.srid)  # Fake field to get SRID info
-        if self.source_is_geography():
-            self.source_expressions.append(Value(self.spheroid))
-        elif geo_field.geodetic(connection):
-            # Geometry fields with geodetic (lon/lat) coordinates need length_spheroid
-            self.function = connection.ops.spatial_function_name('LengthSpheroid')
-            self.source_expressions.append(Value(geo_field._spheroid))
-        else:
-            dim = min(f.dim for f in self.get_source_fields() if f)
-            if dim > 2:
-                self.function = connection.ops.length3d
-        return super(Length, self).as_sql(compiler, connection)
-
-    def as_sqlite(self, compiler, connection):
-        geo_field = GeometryField(srid=self.srid)
-        if geo_field.geodetic(connection):
-            if self.spheroid:
-                self.function = 'GeodesicLength'
-            else:
-                self.function = 'GreatCircleLength'
-        return super(Length, self).as_sql(compiler, connection)
-
-
-class MakeValid(GeoFunc):
-    pass
-
-
-class MemSize(GeoFunc):
-    output_field_class = IntegerField
-    arity = 1
-
-
-class NumGeometries(GeoFunc):
-    output_field_class = IntegerField
-    arity = 1
-
-
-class NumPoints(GeoFunc):
-    output_field_class = IntegerField
-    arity = 1
-
-    def as_sql(self, compiler, connection):
-        if self.source_expressions[self.geom_param_pos].output_field.geom_type != 'LINESTRING':
-            if not connection.features.supports_num_points_poly:
-                raise TypeError('NumPoints can only operate on LineString content on this database.')
-        return super(NumPoints, self).as_sql(compiler, connection)
-
-
-class Perimeter(DistanceResultMixin, OracleToleranceMixin, GeoFunc):
-    output_field_class = FloatField
-    arity = 1
-
-    def as_postgresql(self, compiler, connection):
-        geo_field = GeometryField(srid=self.srid)  # Fake field to get SRID info
-        if geo_field.geodetic(connection) and not self.source_is_geography():
-            raise NotImplementedError("ST_Perimeter cannot use a non-projected non-geography field.")
-        dim = min(f.dim for f in self.get_source_fields())
-        if dim > 2:
-            self.function = connection.ops.perimeter3d
-        return super(Perimeter, self).as_sql(compiler, connection)
-
-    def as_sqlite(self, compiler, connection):
-        geo_field = GeometryField(srid=self.srid)  # Fake field to get SRID info
-        if geo_field.geodetic(connection):
-            raise NotImplementedError("Perimeter cannot use a non-projected field.")
-        return super(Perimeter, self).as_sql(compiler, connection)
-
-
-class PointOnSurface(OracleToleranceMixin, GeoFunc):
-    arity = 1
-
-
-class Reverse(GeoFunc):
-    arity = 1
-
-
-class Scale(SQLiteDecimalToFloatMixin, GeoFunc):
-    def __init__(self, expression, x, y, z=0.0, **extra):
-        expressions = [
-            expression,
-            self._handle_param(x, 'x', NUMERIC_TYPES),
-            self._handle_param(y, 'y', NUMERIC_TYPES),
-        ]
-        if z != 0.0:
-            expressions.append(self._handle_param(z, 'z', NUMERIC_TYPES))
-        super(Scale, self).__init__(*expressions, **extra)
-
-
-class SnapToGrid(SQLiteDecimalToFloatMixin, GeoFunc):
-    def __init__(self, expression, *args, **extra):
-        nargs = len(args)
-        expressions = [expression]
-        if nargs in (1, 2):
-            expressions.extend(
-                [self._handle_param(arg, '', NUMERIC_TYPES) for arg in args]
-            )
-        elif nargs == 4:
-            # Reverse origin and size param ordering
-            expressions.extend(
-                [self._handle_param(arg, '', NUMERIC_TYPES) for arg in args[2:]]
-            )
-            expressions.extend(
-                [self._handle_param(arg, '', NUMERIC_TYPES) for arg in args[0:2]]
-            )
-        else:
-            raise ValueError('Must provide 1, 2, or 4 arguments to `SnapToGrid`.')
-        super(SnapToGrid, self).__init__(*expressions, **extra)
-
-
-class SymDifference(OracleToleranceMixin, GeoFuncWithGeoParam):
-    arity = 2
-
-
-class Transform(GeoFunc):
-    def __init__(self, expression, srid, **extra):
-        expressions = [
-            expression,
-            self._handle_param(srid, 'srid', six.integer_types),
-        ]
-        if 'output_field' not in extra:
-            extra['output_field'] = GeometryField(srid=srid)
-        super(Transform, self).__init__(*expressions, **extra)
-
-    @property
-    def srid(self):
-        # Make srid the resulting srid of the transformation
-        return self.source_expressions[self.geom_param_pos + 1].value
-
-
-class Translate(Scale):
-    def as_sqlite(self, compiler, connection):
-        if len(self.source_expressions) < 4:
-            # Always provide the z parameter for ST_Translate
-            self.source_expressions.append(Value(0))
-        return super(Translate, self).as_sqlite(compiler, connection)
-
-
-class Union(OracleToleranceMixin, GeoFuncWithGeoParam):
-    arity = 2

+ 0 - 451
desktop/core/ext-py/Django-1.11.22/django/utils/text.py

@@ -1,451 +0,0 @@
-from __future__ import unicode_literals
-
-import re
-import unicodedata
-from gzip import GzipFile
-from io import BytesIO
-
-from django.utils import six
-from django.utils.encoding import force_text
-from django.utils.functional import (
-    SimpleLazyObject, keep_lazy, keep_lazy_text, lazy,
-)
-from django.utils.safestring import SafeText, mark_safe
-from django.utils.six.moves import html_entities
-from django.utils.translation import pgettext, ugettext as _, ugettext_lazy
-
-if six.PY2:
-    # Import force_unicode even though this module doesn't use it, because some
-    # people rely on it being here.
-    from django.utils.encoding import force_unicode  # NOQA
-
-
-@keep_lazy_text
-def capfirst(x):
-    """Capitalize the first letter of a string."""
-    return x and force_text(x)[0].upper() + force_text(x)[1:]
-
-
-# Set up regular expressions
-re_words = re.compile(r'<.*?>|((?:\w[-\w]*|&.*?;)+)', re.U | re.S)
-re_chars = re.compile(r'<.*?>|(.)', re.U | re.S)
-re_tag = re.compile(r'<(/)?(\S+?)(?:(\s*/)|\s.*?)?>', re.S)
-re_newlines = re.compile(r'\r\n|\r')  # Used in normalize_newlines
-re_camel_case = re.compile(r'(((?<=[a-z])[A-Z])|([A-Z](?![A-Z]|$)))')
-
-
-@keep_lazy_text
-def wrap(text, width):
-    """
-    A word-wrap function that preserves existing line breaks. Expects that
-    existing line breaks are posix newlines.
-
-    All white space is preserved except added line breaks consume the space on
-    which they break the line.
-
-    Long words are not wrapped, so the output text may have lines longer than
-    ``width``.
-    """
-    text = force_text(text)
-
-    def _generator():
-        for line in text.splitlines(True):  # True keeps trailing linebreaks
-            max_width = min((line.endswith('\n') and width + 1 or width), width)
-            while len(line) > max_width:
-                space = line[:max_width + 1].rfind(' ') + 1
-                if space == 0:
-                    space = line.find(' ') + 1
-                    if space == 0:
-                        yield line
-                        line = ''
-                        break
-                yield '%s\n' % line[:space - 1]
-                line = line[space:]
-                max_width = min((line.endswith('\n') and width + 1 or width), width)
-            if line:
-                yield line
-    return ''.join(_generator())
-
-
-class Truncator(SimpleLazyObject):
-    """
-    An object used to truncate text, either by characters or words.
-    """
-    def __init__(self, text):
-        super(Truncator, self).__init__(lambda: force_text(text))
-
-    def add_truncation_text(self, text, truncate=None):
-        if truncate is None:
-            truncate = pgettext(
-                'String to return when truncating text',
-                '%(truncated_text)s...')
-        truncate = force_text(truncate)
-        if '%(truncated_text)s' in truncate:
-            return truncate % {'truncated_text': text}
-        # The truncation text didn't contain the %(truncated_text)s string
-        # replacement argument so just append it to the text.
-        if text.endswith(truncate):
-            # But don't append the truncation text if the current text already
-            # ends in this.
-            return text
-        return '%s%s' % (text, truncate)
-
-    def chars(self, num, truncate=None, html=False):
-        """
-        Returns the text truncated to be no longer than the specified number
-        of characters.
-
-        Takes an optional argument of what should be used to notify that the
-        string has been truncated, defaulting to a translatable string of an
-        ellipsis (...).
-        """
-        self._setup()
-        length = int(num)
-        text = unicodedata.normalize('NFC', self._wrapped)
-
-        # Calculate the length to truncate to (max length - end_text length)
-        truncate_len = length
-        for char in self.add_truncation_text('', truncate):
-            if not unicodedata.combining(char):
-                truncate_len -= 1
-                if truncate_len == 0:
-                    break
-        if html:
-            return self._truncate_html(length, truncate, text, truncate_len, False)
-        return self._text_chars(length, truncate, text, truncate_len)
-
-    def _text_chars(self, length, truncate, text, truncate_len):
-        """
-        Truncates a string after a certain number of chars.
-        """
-        s_len = 0
-        end_index = None
-        for i, char in enumerate(text):
-            if unicodedata.combining(char):
-                # Don't consider combining characters
-                # as adding to the string length
-                continue
-            s_len += 1
-            if end_index is None and s_len > truncate_len:
-                end_index = i
-            if s_len > length:
-                # Return the truncated string
-                return self.add_truncation_text(text[:end_index or 0],
-                                                truncate)
-
-        # Return the original string since no truncation was necessary
-        return text
-
-    def words(self, num, truncate=None, html=False):
-        """
-        Truncates a string after a certain number of words. Takes an optional
-        argument of what should be used to notify that the string has been
-        truncated, defaulting to ellipsis (...).
-        """
-        self._setup()
-        length = int(num)
-        if html:
-            return self._truncate_html(length, truncate, self._wrapped, length, True)
-        return self._text_words(length, truncate)
-
-    def _text_words(self, length, truncate):
-        """
-        Truncates a string after a certain number of words.
-
-        Newlines in the string will be stripped.
-        """
-        words = self._wrapped.split()
-        if len(words) > length:
-            words = words[:length]
-            return self.add_truncation_text(' '.join(words), truncate)
-        return ' '.join(words)
-
-    def _truncate_html(self, length, truncate, text, truncate_len, words):
-        """
-        Truncates HTML to a certain number of chars (not counting tags and
-        comments), or, if words is True, then to a certain number of words.
-        Closes opened tags if they were correctly closed in the given HTML.
-
-        Newlines in the HTML are preserved.
-        """
-        if words and length <= 0:
-            return ''
-
-        html4_singlets = (
-            'br', 'col', 'link', 'base', 'img',
-            'param', 'area', 'hr', 'input'
-        )
-
-        # Count non-HTML chars/words and keep note of open tags
-        pos = 0
-        end_text_pos = 0
-        current_len = 0
-        open_tags = []
-
-        regex = re_words if words else re_chars
-
-        while current_len <= length:
-            m = regex.search(text, pos)
-            if not m:
-                # Checked through whole string
-                break
-            pos = m.end(0)
-            if m.group(1):
-                # It's an actual non-HTML word or char
-                current_len += 1
-                if current_len == truncate_len:
-                    end_text_pos = pos
-                continue
-            # Check for tag
-            tag = re_tag.match(m.group(0))
-            if not tag or current_len >= truncate_len:
-                # Don't worry about non tags or tags after our truncate point
-                continue
-            closing_tag, tagname, self_closing = tag.groups()
-            # Element names are always case-insensitive
-            tagname = tagname.lower()
-            if self_closing or tagname in html4_singlets:
-                pass
-            elif closing_tag:
-                # Check for match in open tags list
-                try:
-                    i = open_tags.index(tagname)
-                except ValueError:
-                    pass
-                else:
-                    # SGML: An end tag closes, back to the matching start tag,
-                    # all unclosed intervening start tags with omitted end tags
-                    open_tags = open_tags[i + 1:]
-            else:
-                # Add it to the start of the open tags list
-                open_tags.insert(0, tagname)
-
-        if current_len <= length:
-            return text
-        out = text[:end_text_pos]
-        truncate_text = self.add_truncation_text('', truncate)
-        if truncate_text:
-            out += truncate_text
-        # Close any tags still open
-        for tag in open_tags:
-            out += '</%s>' % tag
-        # Return string
-        return out
-
-
-@keep_lazy_text
-def get_valid_filename(s):
-    """
-    Returns the given string converted to a string that can be used for a clean
-    filename. Specifically, leading and trailing spaces are removed; other
-    spaces are converted to underscores; and anything that is not a unicode
-    alphanumeric, dash, underscore, or dot, is removed.
-    >>> get_valid_filename("john's portrait in 2004.jpg")
-    'johns_portrait_in_2004.jpg'
-    """
-    s = force_text(s).strip().replace(' ', '_')
-    return re.sub(r'(?u)[^-\w.]', '', s)
-
-
-@keep_lazy_text
-def get_text_list(list_, last_word=ugettext_lazy('or')):
-    """
-    >>> get_text_list(['a', 'b', 'c', 'd'])
-    'a, b, c or d'
-    >>> get_text_list(['a', 'b', 'c'], 'and')
-    'a, b and c'
-    >>> get_text_list(['a', 'b'], 'and')
-    'a and b'
-    >>> get_text_list(['a'])
-    'a'
-    >>> get_text_list([])
-    ''
-    """
-    if len(list_) == 0:
-        return ''
-    if len(list_) == 1:
-        return force_text(list_[0])
-    return '%s %s %s' % (
-        # Translators: This string is used as a separator between list elements
-        _(', ').join(force_text(i) for i in list_[:-1]),
-        force_text(last_word), force_text(list_[-1]))
-
-
-@keep_lazy_text
-def normalize_newlines(text):
-    """Normalizes CRLF and CR newlines to just LF."""
-    text = force_text(text)
-    return re_newlines.sub('\n', text)
-
-
-@keep_lazy_text
-def phone2numeric(phone):
-    """Converts a phone number with letters into its numeric equivalent."""
-    char2number = {
-        'a': '2', 'b': '2', 'c': '2', 'd': '3', 'e': '3', 'f': '3', 'g': '4',
-        'h': '4', 'i': '4', 'j': '5', 'k': '5', 'l': '5', 'm': '6', 'n': '6',
-        'o': '6', 'p': '7', 'q': '7', 'r': '7', 's': '7', 't': '8', 'u': '8',
-        'v': '8', 'w': '9', 'x': '9', 'y': '9', 'z': '9',
-    }
-    return ''.join(char2number.get(c, c) for c in phone.lower())
-
-
-# From http://www.xhaus.com/alan/python/httpcomp.html#gzip
-# Used with permission.
-def compress_string(s):
-    zbuf = BytesIO()
-    with GzipFile(mode='wb', compresslevel=6, fileobj=zbuf, mtime=0) as zfile:
-        zfile.write(s)
-    return zbuf.getvalue()
-
-
-class StreamingBuffer(object):
-    def __init__(self):
-        self.vals = []
-
-    def write(self, val):
-        self.vals.append(val)
-
-    def read(self):
-        if not self.vals:
-            return b''
-        ret = b''.join(self.vals)
-        self.vals = []
-        return ret
-
-    def flush(self):
-        return
-
-    def close(self):
-        return
-
-
-# Like compress_string, but for iterators of strings.
-def compress_sequence(sequence):
-    buf = StreamingBuffer()
-    with GzipFile(mode='wb', compresslevel=6, fileobj=buf, mtime=0) as zfile:
-        # Output headers...
-        yield buf.read()
-        for item in sequence:
-            zfile.write(item)
-            data = buf.read()
-            if data:
-                yield data
-    yield buf.read()
-
-
-# Expression to match some_token and some_token="with spaces" (and similarly
-# for single-quoted strings).
-smart_split_re = re.compile(r"""
-    ((?:
-        [^\s'"]*
-        (?:
-            (?:"(?:[^"\\]|\\.)*" | '(?:[^'\\]|\\.)*')
-            [^\s'"]*
-        )+
-    ) | \S+)
-""", re.VERBOSE)
-
-
-def smart_split(text):
-    r"""
-    Generator that splits a string by spaces, leaving quoted phrases together.
-    Supports both single and double quotes, and supports escaping quotes with
-    backslashes. In the output, strings will keep their initial and trailing
-    quote marks and escaped quotes will remain escaped (the results can then
-    be further processed with unescape_string_literal()).
-
-    >>> list(smart_split(r'This is "a person\'s" test.'))
-    ['This', 'is', '"a person\\\'s"', 'test.']
-    >>> list(smart_split(r"Another 'person\'s' test."))
-    ['Another', "'person\\'s'", 'test.']
-    >>> list(smart_split(r'A "\"funky\" style" test.'))
-    ['A', '"\\"funky\\" style"', 'test.']
-    """
-    text = force_text(text)
-    for bit in smart_split_re.finditer(text):
-        yield bit.group(0)
-
-
-def _replace_entity(match):
-    text = match.group(1)
-    if text[0] == '#':
-        text = text[1:]
-        try:
-            if text[0] in 'xX':
-                c = int(text[1:], 16)
-            else:
-                c = int(text)
-            return six.unichr(c)
-        except ValueError:
-            return match.group(0)
-    else:
-        try:
-            return six.unichr(html_entities.name2codepoint[text])
-        except (ValueError, KeyError):
-            return match.group(0)
-
-
-_entity_re = re.compile(r"&(#?[xX]?(?:[0-9a-fA-F]+|\w{1,8}));")
-
-
-@keep_lazy_text
-def unescape_entities(text):
-    return _entity_re.sub(_replace_entity, force_text(text))
-
-
-@keep_lazy_text
-def unescape_string_literal(s):
-    r"""
-    Convert quoted string literals to unquoted strings with escaped quotes and
-    backslashes unquoted::
-
-        >>> unescape_string_literal('"abc"')
-        'abc'
-        >>> unescape_string_literal("'abc'")
-        'abc'
-        >>> unescape_string_literal('"a \"bc\""')
-        'a "bc"'
-        >>> unescape_string_literal("'\'ab\' c'")
-        "'ab' c"
-    """
-    if s[0] not in "\"'" or s[-1] != s[0]:
-        raise ValueError("Not a string literal: %r" % s)
-    quote = s[0]
-    return s[1:-1].replace(r'\%s' % quote, quote).replace(r'\\', '\\')
-
-
-@keep_lazy(six.text_type, SafeText)
-def slugify(value, allow_unicode=False):
-    """
-    Convert to ASCII if 'allow_unicode' is False. Convert spaces to hyphens.
-    Remove characters that aren't alphanumerics, underscores, or hyphens.
-    Convert to lowercase. Also strip leading and trailing whitespace.
-    """
-    value = force_text(value)
-    if allow_unicode:
-        value = unicodedata.normalize('NFKC', value)
-        value = re.sub(r'[^\w\s-]', '', value, flags=re.U).strip().lower()
-        return mark_safe(re.sub(r'[-\s]+', '-', value, flags=re.U))
-    value = unicodedata.normalize('NFKD', value).encode('ascii', 'ignore').decode('ascii')
-    value = re.sub(r'[^\w\s-]', '', value).strip().lower()
-    return mark_safe(re.sub(r'[-\s]+', '-', value))
-
-
-def camel_case_to_spaces(value):
-    """
-    Splits CamelCase and converts to lower case. Also strips leading and
-    trailing whitespace.
-    """
-    return re_camel_case.sub(r' \1', value).strip().lower()
-
-
-def _format_lazy(format_string, *args, **kwargs):
-    """
-    Apply str.format() on 'format_string' where format_string, args,
-    and/or kwargs might be lazy.
-    """
-    return format_string.format(*args, **kwargs)
-
-
-format_lazy = lazy(_format_lazy, six.text_type)

+ 0 - 707
desktop/core/ext-py/Django-1.11.22/tests/gis_tests/distapp/tests.py

@@ -1,707 +0,0 @@
-from __future__ import unicode_literals
-
-from unittest import skipIf
-
-from django.contrib.gis.db.models.functions import (
-    Area, Distance, Length, Perimeter, Transform,
-)
-from django.contrib.gis.geos import GEOSGeometry, LineString, Point
-from django.contrib.gis.measure import D  # alias for Distance
-from django.db import connection
-from django.db.models import F, Q
-from django.test import TestCase, ignore_warnings, skipUnlessDBFeature
-from django.utils.deprecation import RemovedInDjango20Warning
-
-from ..utils import no_oracle, oracle, postgis, spatialite
-from .models import (
-    AustraliaCity, CensusZipcode, Interstate, SouthTexasCity, SouthTexasCityFt,
-    SouthTexasInterstate, SouthTexasZipcode,
-)
-
-
-class DistanceTest(TestCase):
-    fixtures = ['initial']
-
-    def setUp(self):
-        # A point we are testing distances with -- using a WGS84
-        # coordinate that'll be implicitly transformed to that to
-        # the coordinate system of the field, EPSG:32140 (Texas South Central
-        # w/units in meters)
-        self.stx_pnt = GEOSGeometry('POINT (-95.370401017314293 29.704867409475465)', 4326)
-        # Another one for Australia
-        self.au_pnt = GEOSGeometry('POINT (150.791 -34.4919)', 4326)
-
-    def get_names(self, qs):
-        cities = [c.name for c in qs]
-        cities.sort()
-        return cities
-
-    def test_init(self):
-        """
-        Test initialization of distance models.
-        """
-        self.assertEqual(9, SouthTexasCity.objects.count())
-        self.assertEqual(9, SouthTexasCityFt.objects.count())
-        self.assertEqual(11, AustraliaCity.objects.count())
-        self.assertEqual(4, SouthTexasZipcode.objects.count())
-        self.assertEqual(4, CensusZipcode.objects.count())
-        self.assertEqual(1, Interstate.objects.count())
-        self.assertEqual(1, SouthTexasInterstate.objects.count())
-
-    @skipUnlessDBFeature("supports_dwithin_lookup")
-    def test_dwithin(self):
-        """
-        Test the `dwithin` lookup type.
-        """
-        # Distances -- all should be equal (except for the
-        # degree/meter pair in au_cities, that's somewhat
-        # approximate).
-        tx_dists = [(7000, 22965.83), D(km=7), D(mi=4.349)]
-        au_dists = [(0.5, 32000), D(km=32), D(mi=19.884)]
-
-        # Expected cities for Australia and Texas.
-        tx_cities = ['Downtown Houston', 'Southside Place']
-        au_cities = ['Mittagong', 'Shellharbour', 'Thirroul', 'Wollongong']
-
-        # Performing distance queries on two projected coordinate systems one
-        # with units in meters and the other in units of U.S. survey feet.
-        for dist in tx_dists:
-            if isinstance(dist, tuple):
-                dist1, dist2 = dist
-            else:
-                dist1 = dist2 = dist
-            qs1 = SouthTexasCity.objects.filter(point__dwithin=(self.stx_pnt, dist1))
-            qs2 = SouthTexasCityFt.objects.filter(point__dwithin=(self.stx_pnt, dist2))
-            for qs in qs1, qs2:
-                self.assertEqual(tx_cities, self.get_names(qs))
-
-        # Now performing the `dwithin` queries on a geodetic coordinate system.
-        for dist in au_dists:
-            if isinstance(dist, D) and not oracle:
-                type_error = True
-            else:
-                type_error = False
-
-            if isinstance(dist, tuple):
-                if oracle or spatialite:
-                    # Result in meters
-                    dist = dist[1]
-                else:
-                    # Result in units of the field
-                    dist = dist[0]
-
-            # Creating the query set.
-            qs = AustraliaCity.objects.order_by('name')
-            if type_error:
-                # A ValueError should be raised on PostGIS when trying to pass
-                # Distance objects into a DWithin query using a geodetic field.
-                with self.assertRaises(ValueError):
-                    AustraliaCity.objects.filter(point__dwithin=(self.au_pnt, dist)).count()
-            else:
-                self.assertListEqual(au_cities, self.get_names(qs.filter(point__dwithin=(self.au_pnt, dist))))
-
-    @skipUnlessDBFeature("has_distance_method")
-    @ignore_warnings(category=RemovedInDjango20Warning)
-    def test_distance_projected(self):
-        """
-        Test the `distance` GeoQuerySet method on projected coordinate systems.
-        """
-        # The point for La Grange, TX
-        lagrange = GEOSGeometry('POINT(-96.876369 29.905320)', 4326)
-        # Reference distances in feet and in meters. Got these values from
-        # using the provided raw SQL statements.
-        #  SELECT ST_Distance(point, ST_Transform(ST_GeomFromText('POINT(-96.876369 29.905320)', 4326), 32140))
-        #  FROM distapp_southtexascity;
-        m_distances = [147075.069813, 139630.198056, 140888.552826,
-                       138809.684197, 158309.246259, 212183.594374,
-                       70870.188967, 165337.758878, 139196.085105]
-        #  SELECT ST_Distance(point, ST_Transform(ST_GeomFromText('POINT(-96.876369 29.905320)', 4326), 2278))
-        #  FROM distapp_southtexascityft;
-        # Oracle 11 thinks this is not a projected coordinate system, so it's
-        # not tested.
-        ft_distances = [482528.79154625, 458103.408123001, 462231.860397575,
-                        455411.438904354, 519386.252102563, 696139.009211594,
-                        232513.278304279, 542445.630586414, 456679.155883207]
-
-        # Testing using different variations of parameters and using models
-        # with different projected coordinate systems.
-        dist1 = SouthTexasCity.objects.distance(lagrange, field_name='point').order_by('id')
-        dist2 = SouthTexasCity.objects.distance(lagrange).order_by('id')  # Using GEOSGeometry parameter
-        if oracle:
-            dist_qs = [dist1, dist2]
-        else:
-            dist3 = SouthTexasCityFt.objects.distance(lagrange.ewkt).order_by('id')  # Using EWKT string parameter.
-            dist4 = SouthTexasCityFt.objects.distance(lagrange).order_by('id')
-            dist_qs = [dist1, dist2, dist3, dist4]
-
-        # Original query done on PostGIS, have to adjust AlmostEqual tolerance
-        # for Oracle.
-        tol = 2 if oracle else 5
-
-        # Ensuring expected distances are returned for each distance queryset.
-        for qs in dist_qs:
-            for i, c in enumerate(qs):
-                self.assertAlmostEqual(m_distances[i], c.distance.m, tol)
-                self.assertAlmostEqual(ft_distances[i], c.distance.survey_ft, tol)
-
-    @skipIf(spatialite, "distance method doesn't support geodetic coordinates on SpatiaLite.")
-    @skipUnlessDBFeature("has_distance_method", "supports_distance_geodetic")
-    @ignore_warnings(category=RemovedInDjango20Warning)
-    def test_distance_geodetic(self):
-        """
-        Test the `distance` GeoQuerySet method on geodetic coordinate systems.
-        """
-        tol = 2 if oracle else 4
-
-        # Testing geodetic distance calculation with a non-point geometry
-        # (a LineString of Wollongong and Shellharbour coords).
-        ls = LineString(((150.902, -34.4245), (150.87, -34.5789)))
-
-        # Reference query:
-        #  SELECT ST_distance_sphere(point, ST_GeomFromText('LINESTRING(150.9020 -34.4245,150.8700 -34.5789)', 4326))
-        #  FROM distapp_australiacity ORDER BY name;
-        distances = [1120954.92533513, 140575.720018241, 640396.662906304,
-                     60580.9693849269, 972807.955955075, 568451.8357838,
-                     40435.4335201384, 0, 68272.3896586844, 12375.0643697706, 0]
-        qs = AustraliaCity.objects.distance(ls).order_by('name')
-        for city, distance in zip(qs, distances):
-            # Testing equivalence to within a meter.
-            self.assertAlmostEqual(distance, city.distance.m, 0)
-
-        # Got the reference distances using the raw SQL statements:
-        #  SELECT ST_distance_spheroid(point, ST_GeomFromText('POINT(151.231341 -33.952685)', 4326),
-        #    'SPHEROID["WGS 84",6378137.0,298.257223563]') FROM distapp_australiacity WHERE (NOT (id = 11));
-        #  SELECT ST_distance_sphere(point, ST_GeomFromText('POINT(151.231341 -33.952685)', 4326))
-        #  FROM distapp_australiacity WHERE (NOT (id = 11));  st_distance_sphere
-        spheroid_distances = [
-            60504.0628957201, 77023.9489850262, 49154.8867574404,
-            90847.4358768573, 217402.811919332, 709599.234564757,
-            640011.483550888, 7772.00667991925, 1047861.78619339,
-            1165126.55236034,
-        ]
-        sphere_distances = [
-            60580.9693849267, 77144.0435286473, 49199.4415344719,
-            90804.7533823494, 217713.384600405, 709134.127242793,
-            639828.157159169, 7786.82949717788, 1049204.06569028,
-            1162623.7238134,
-        ]
-        # Testing with spheroid distances first.
-        hillsdale = AustraliaCity.objects.get(name='Hillsdale')
-        qs = AustraliaCity.objects.exclude(id=hillsdale.id).distance(hillsdale.point, spheroid=True).order_by('id')
-        for i, c in enumerate(qs):
-            self.assertAlmostEqual(spheroid_distances[i], c.distance.m, tol)
-        if postgis:
-            # PostGIS uses sphere-only distances by default, testing these as well.
-            qs = AustraliaCity.objects.exclude(id=hillsdale.id).distance(hillsdale.point).order_by('id')
-            for i, c in enumerate(qs):
-                self.assertAlmostEqual(sphere_distances[i], c.distance.m, tol)
-
-    @no_oracle  # Oracle already handles geographic distance calculation.
-    @skipUnlessDBFeature("has_distance_method")
-    @ignore_warnings(category=RemovedInDjango20Warning)
-    def test_distance_transform(self):
-        """
-        Test the `distance` GeoQuerySet method used with `transform` on a geographic field.
-        """
-        # We'll be using a Polygon (created by buffering the centroid
-        # of 77005 to 100m) -- which isn't allowed in geographic distance
-        # queries normally; however, our field has been transformed to
-        # a non-geographic system.
-        z = SouthTexasZipcode.objects.get(name='77005')
-
-        # Reference query:
-        # SELECT ST_Distance(ST_Transform("distapp_censuszipcode"."poly", 32140),
-        #   ST_GeomFromText('<buffer_wkt>', 32140))
-        # FROM "distapp_censuszipcode";
-        dists_m = [3553.30384972258, 1243.18391525602, 2186.15439472242]
-
-        # Having the buffer in the SRID of the transformation and in the SRID of
-        # the field should give the same results. The first buffer needs no
-        # transformation SQL because its SRID is the same as the one given to
-        # `transform()`; the second buffer, however, will need to be transformed.
-        buf1 = z.poly.centroid.buffer(100)
-        buf2 = buf1.transform(4269, clone=True)
-        ref_zips = ['77002', '77025', '77401']
-
-        for buf in [buf1, buf2]:
-            qs = CensusZipcode.objects.exclude(name='77005').transform(32140).distance(buf).order_by('name')
-            self.assertListEqual(ref_zips, self.get_names(qs))
-            for i, z in enumerate(qs):
-                self.assertAlmostEqual(z.distance.m, dists_m[i], 5)
-
-    @skipUnlessDBFeature("supports_distances_lookups")
-    def test_distance_lookups(self):
-        """
-        Test the `distance_lt`, `distance_gt`, `distance_lte`, and `distance_gte` lookup types.
-        """
-        # Retrieving the cities within a 20km 'donut' w/a 7km radius 'hole'
-        # (thus, Houston and Southside Place will be excluded, as tested in
-        # the `dwithin` test above).
-        qs1 = SouthTexasCity.objects.filter(point__distance_gte=(self.stx_pnt, D(km=7))).filter(
-            point__distance_lte=(self.stx_pnt, D(km=20)),
-        )
-
-        # Oracle 11 incorrectly thinks it is not projected.
-        if oracle:
-            dist_qs = (qs1,)
-        else:
-            qs2 = SouthTexasCityFt.objects.filter(point__distance_gte=(self.stx_pnt, D(km=7))).filter(
-                point__distance_lte=(self.stx_pnt, D(km=20)),
-            )
-            dist_qs = (qs1, qs2)
-
-        for qs in dist_qs:
-            cities = self.get_names(qs)
-            self.assertEqual(cities, ['Bellaire', 'Pearland', 'West University Place'])
-
-        # Doing a distance query using Polygons instead of a Point.
-        z = SouthTexasZipcode.objects.get(name='77005')
-        qs = SouthTexasZipcode.objects.exclude(name='77005').filter(poly__distance_lte=(z.poly, D(m=275)))
-        self.assertEqual(['77025', '77401'], self.get_names(qs))
-        # If we add a little more distance 77002 should be included.
-        qs = SouthTexasZipcode.objects.exclude(name='77005').filter(poly__distance_lte=(z.poly, D(m=300)))
-        self.assertEqual(['77002', '77025', '77401'], self.get_names(qs))
-
-    @skipUnlessDBFeature("supports_distances_lookups", "supports_distance_geodetic")
-    def test_geodetic_distance_lookups(self):
-        """
-        Test distance lookups on geodetic coordinate systems.
-        """
-        # The line is from Melbourne to Sydney. The query is for all other cities
-        # within 100km of that line (which should exclude only Hobart & Adelaide).
-        line = GEOSGeometry('LINESTRING(144.9630 -37.8143,151.2607 -33.8870)', 4326)
-        dist_qs = AustraliaCity.objects.filter(point__distance_lte=(line, D(km=100)))
-        expected_cities = [
-            'Batemans Bay', 'Canberra', 'Hillsdale',
-            'Melbourne', 'Mittagong', 'Shellharbour',
-            'Sydney', 'Thirroul', 'Wollongong',
-        ]
-        if spatialite:
-            # SpatiaLite is less accurate and returns 102.8km for Batemans Bay.
-            expected_cities.pop(0)
-        self.assertEqual(expected_cities, self.get_names(dist_qs))
-
-        # Too many params (4 in this case) should raise a ValueError.
-        queryset = AustraliaCity.objects.filter(point__distance_lte=('POINT(5 23)', D(km=100), 'spheroid', '4'))
-        with self.assertRaises(ValueError):
-            len(queryset)
-
-        # Not enough params should raise a ValueError.
-        with self.assertRaises(ValueError):
-            len(AustraliaCity.objects.filter(point__distance_lte=('POINT(5 23)',)))
-
-        # Getting all cities w/in 550 miles of Hobart.
-        hobart = AustraliaCity.objects.get(name='Hobart')
-        qs = AustraliaCity.objects.exclude(name='Hobart').filter(point__distance_lte=(hobart.point, D(mi=550)))
-        cities = self.get_names(qs)
-        self.assertEqual(cities, ['Batemans Bay', 'Canberra', 'Melbourne'])
-
-        # Cities that are either really close to or really far from Wollongong,
-        # using different units of distance.
-        wollongong = AustraliaCity.objects.get(name='Wollongong')
-        d1, d2 = D(yd=19500), D(nm=400)  # Yards (~17km) & Nautical miles.
-
-        # Normal geodetic distance lookup (uses `distance_sphere` on PostGIS).
-        gq1 = Q(point__distance_lte=(wollongong.point, d1))
-        gq2 = Q(point__distance_gte=(wollongong.point, d2))
-        qs1 = AustraliaCity.objects.exclude(name='Wollongong').filter(gq1 | gq2)
-
-        # The same geodetic distance lookup, but telling GeoDjango to use
-        # `distance_spheroid` instead (the results should be the same because the
-        # accuracy variance won't matter in this test case).
-        querysets = [qs1]
-        if connection.features.has_DistanceSpheroid_function:
-            gq3 = Q(point__distance_lte=(wollongong.point, d1, 'spheroid'))
-            gq4 = Q(point__distance_gte=(wollongong.point, d2, 'spheroid'))
-            qs2 = AustraliaCity.objects.exclude(name='Wollongong').filter(gq3 | gq4)
-            querysets.append(qs2)
-
-        for qs in querysets:
-            cities = self.get_names(qs)
-            self.assertEqual(cities, ['Adelaide', 'Hobart', 'Shellharbour', 'Thirroul'])
-
-    @skipUnlessDBFeature("supports_distances_lookups")
-    def test_distance_lookups_with_expression_rhs(self):
-        qs = SouthTexasCity.objects.filter(
-            point__distance_lte=(self.stx_pnt, F('radius')),
-        ).order_by('name')
-        self.assertEqual(
-            self.get_names(qs),
-            ['Bellaire', 'Downtown Houston', 'Southside Place', 'West University Place']
-        )
-
-        # With a combined expression
-        qs = SouthTexasCity.objects.filter(
-            point__distance_lte=(self.stx_pnt, F('radius') * 2),
-        ).order_by('name')
-        self.assertEqual(len(qs), 5)
-        self.assertIn('Pearland', self.get_names(qs))
-
-        # With spheroid param
-        if connection.features.supports_distance_geodetic:
-            hobart = AustraliaCity.objects.get(name='Hobart')
-            qs = AustraliaCity.objects.filter(
-                point__distance_lte=(hobart.point, F('radius') * 70, 'spheroid'),
-            ).order_by('name')
-            self.assertEqual(self.get_names(qs), ['Canberra', 'Hobart', 'Melbourne'])
-
-    @skipUnlessDBFeature("has_area_method")
-    @ignore_warnings(category=RemovedInDjango20Warning)
-    def test_area(self):
-        """
-        Test the `area` GeoQuerySet method.
-        """
-        # Reference queries:
-        # SELECT ST_Area(poly) FROM distapp_southtexaszipcode;
-        area_sq_m = [5437908.90234375, 10183031.4389648, 11254471.0073242, 9881708.91772461]
-        # Tolerance has to be lower for Oracle
-        tol = 2
-        for i, z in enumerate(SouthTexasZipcode.objects.order_by('name').area()):
-            self.assertAlmostEqual(area_sq_m[i], z.area.sq_m, tol)
-
-    @skipIf(spatialite, "length method doesn't support geodetic coordinates on SpatiaLite.")
-    @skipUnlessDBFeature("has_length_method")
-    @ignore_warnings(category=RemovedInDjango20Warning)
-    def test_length(self):
-        """
-        Test the `length` GeoQuerySet method.
-        """
-        # Reference query (should use `length_spheroid`).
-        # SELECT ST_length_spheroid(ST_GeomFromText('<wkt>', 4326), 'SPHEROID["WGS 84",6378137,298.257223563,
-        #   AUTHORITY["EPSG","7030"]]');
-        len_m1 = 473504.769553813
-        len_m2 = 4617.668
-
-        if connection.features.supports_distance_geodetic:
-            qs = Interstate.objects.length()
-            tol = 2 if oracle else 3
-            self.assertAlmostEqual(len_m1, qs[0].length.m, tol)
-        else:
-            # Does not support geodetic coordinate systems.
-            with self.assertRaises(ValueError):
-                Interstate.objects.length()
-
-        # Now doing length on a projected coordinate system.
-        i10 = SouthTexasInterstate.objects.length().get(name='I-10')
-        self.assertAlmostEqual(len_m2, i10.length.m, 2)
-
-    @skipUnlessDBFeature("has_perimeter_method")
-    @ignore_warnings(category=RemovedInDjango20Warning)
-    def test_perimeter(self):
-        """
-        Test the `perimeter` GeoQuerySet method.
-        """
-        # Reference query:
-        # SELECT ST_Perimeter(distapp_southtexaszipcode.poly) FROM distapp_southtexaszipcode;
-        perim_m = [18404.3550889361, 15627.2108551001, 20632.5588368978, 17094.5996143697]
-        tol = 2 if oracle else 7
-        for i, z in enumerate(SouthTexasZipcode.objects.order_by('name').perimeter()):
-            self.assertAlmostEqual(perim_m[i], z.perimeter.m, tol)
-
-        # Running on points; should return 0.
-        for i, c in enumerate(SouthTexasCity.objects.perimeter(model_att='perim')):
-            self.assertEqual(0, c.perim.m)
-
-    @skipUnlessDBFeature("has_area_method", "has_distance_method")
-    @ignore_warnings(category=RemovedInDjango20Warning)
-    def test_measurement_null_fields(self):
-        """
-        Test the measurement GeoQuerySet methods on fields with NULL values.
-        """
-        # Creating SouthTexasZipcode w/NULL value.
-        SouthTexasZipcode.objects.create(name='78212')
-        # Performing distance/area queries against the NULL PolygonField,
-        # and ensuring the result of the operations is None.
-        htown = SouthTexasCity.objects.get(name='Downtown Houston')
-        z = SouthTexasZipcode.objects.distance(htown.point).area().get(name='78212')
-        self.assertIsNone(z.distance)
-        self.assertIsNone(z.area)
-
-    @skipUnlessDBFeature("has_distance_method")
-    @ignore_warnings(category=RemovedInDjango20Warning)
-    def test_distance_order_by(self):
-        qs = SouthTexasCity.objects.distance(Point(3, 3)).order_by(
-            'distance'
-        ).values_list('name', flat=True).filter(name__in=('San Antonio', 'Pearland'))
-        self.assertSequenceEqual(qs, ['San Antonio', 'Pearland'])
-
-
-'''
-=============================
-Distance functions on PostGIS
-=============================
-
-                                              | Projected Geometry | Lon/lat Geometry | Geography (4326)
-
-ST_Distance(geom1, geom2)                     |    OK (meters)     |   :-( (degrees)  |    OK (meters)
-
-ST_Distance(geom1, geom2, use_spheroid=False) |    N/A             |   N/A            |    OK (meters), less accurate, quick
-
-Distance_Sphere(geom1, geom2)                 |    N/A             |   OK (meters)    |    N/A
-
-Distance_Spheroid(geom1, geom2, spheroid)     |    N/A             |   OK (meters)    |    N/A
-
-ST_Perimeter(geom1)                           |    OK              |   :-( (degrees)  |    OK
-
-
-================================
-Distance functions on SpatiaLite
-================================
-
-                                                | Projected Geometry | Lon/lat Geometry
-
-ST_Distance(geom1, geom2)                       |    OK (meters)     |      N/A
-
-ST_Distance(geom1, geom2, use_ellipsoid=True)   |    N/A             |      OK (meters)
-
-ST_Distance(geom1, geom2, use_ellipsoid=False)  |    N/A             |      OK (meters), less accurate, quick
-
-Perimeter(geom1)                                |    OK              |      :-( (degrees)
-
-'''  # NOQA
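The docstring above summarizes which distance function each backend ends up using. As an illustrative sketch (not taken from the deleted file; it assumes the same distapp test models used throughout this file and a PostGIS backend), the `Distance` annotation picks the appropriate variant per field type:

    from django.contrib.gis.db.models.functions import Distance
    from django.contrib.gis.geos import GEOSGeometry

    # Projected field (SouthTexasCity.point, SRID 32140): plain ST_Distance,
    # and the result is already in meters.
    lagrange = GEOSGeometry('POINT(805066.295722839 4231496.29461335)', 32140)
    SouthTexasCity.objects.annotate(d=Distance('point', lagrange))

    # Geodetic field (AustraliaCity.point, SRID 4326): GeoDjango falls back to the
    # sphere/spheroid variants from the table, so the result comes back in meters
    # rather than degrees; spheroid=True selects Distance_Spheroid.
    sydney = GEOSGeometry('POINT(151.231341 -33.952685)', 4326)
    AustraliaCity.objects.annotate(d=Distance('point', sydney))
    AustraliaCity.objects.annotate(d=Distance('point', sydney, spheroid=True))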
-
-
-class DistanceFunctionsTests(TestCase):
-    fixtures = ['initial']
-
-    @skipUnlessDBFeature("has_Area_function")
-    def test_area(self):
-        # Reference queries:
-        # SELECT ST_Area(poly) FROM distapp_southtexaszipcode;
-        area_sq_m = [5437908.90234375, 10183031.4389648, 11254471.0073242, 9881708.91772461]
-        # Tolerance has to be lower for Oracle
-        tol = 2
-        for i, z in enumerate(SouthTexasZipcode.objects.annotate(area=Area('poly')).order_by('name')):
-            self.assertAlmostEqual(area_sq_m[i], z.area.sq_m, tol)
-
-    @skipUnlessDBFeature("has_Distance_function")
-    def test_distance_simple(self):
-        """
-        Test a simple distance query, with projected coordinates and without
-        transformation.
-        """
-        lagrange = GEOSGeometry('POINT(805066.295722839 4231496.29461335)', 32140)
-        houston = SouthTexasCity.objects.annotate(dist=Distance('point', lagrange)).order_by('id').first()
-        tol = 2 if oracle else 5
-        self.assertAlmostEqual(
-            houston.dist.m,
-            147075.069813,
-            tol
-        )
-
-    @skipUnlessDBFeature("has_Distance_function", "has_Transform_function")
-    def test_distance_projected(self):
-        """
-        Test the `Distance` function on projected coordinate systems.
-        """
-        # The point for La Grange, TX
-        lagrange = GEOSGeometry('POINT(-96.876369 29.905320)', 4326)
-        # Reference distances in feet and in meters. These values were obtained
-        # using the raw SQL statements below.
-        #  SELECT ST_Distance(point, ST_Transform(ST_GeomFromText('POINT(-96.876369 29.905320)', 4326), 32140))
-        #  FROM distapp_southtexascity;
-        m_distances = [147075.069813, 139630.198056, 140888.552826,
-                       138809.684197, 158309.246259, 212183.594374,
-                       70870.188967, 165337.758878, 139196.085105]
-        #  SELECT ST_Distance(point, ST_Transform(ST_GeomFromText('POINT(-96.876369 29.905320)', 4326), 2278))
-        #  FROM distapp_southtexascityft;
-        # Oracle 11 thinks this is not a projected coordinate system, so it's
-        # not tested.
-        ft_distances = [482528.79154625, 458103.408123001, 462231.860397575,
-                        455411.438904354, 519386.252102563, 696139.009211594,
-                        232513.278304279, 542445.630586414, 456679.155883207]
-
-        # Testing using different variations of parameters and using models
-        # with different projected coordinate systems.
-        dist1 = SouthTexasCity.objects.annotate(distance=Distance('point', lagrange)).order_by('id')
-        if oracle:
-            dist_qs = [dist1]
-        else:
-            dist2 = SouthTexasCityFt.objects.annotate(distance=Distance('point', lagrange)).order_by('id')
-            dist_qs = [dist1, dist2]
-
-        # The original query was done on PostGIS; the AlmostEqual tolerance has
-        # to be adjusted for Oracle.
-        tol = 2 if oracle else 5
-
-        # Ensuring expected distances are returned for each distance queryset.
-        for qs in dist_qs:
-            for i, c in enumerate(qs):
-                self.assertAlmostEqual(m_distances[i], c.distance.m, tol)
-                self.assertAlmostEqual(ft_distances[i], c.distance.survey_ft, tol)
-
-    @skipUnlessDBFeature("has_Distance_function", "supports_distance_geodetic")
-    def test_distance_geodetic(self):
-        """
-        Test the `Distance` function on geodetic coordinate systems.
-        """
-        # Testing geodetic distance calculation with a non-point geometry
-        # (a LineString of Wollongong and Shellharbour coords).
-        ls = LineString(((150.902, -34.4245), (150.87, -34.5789)), srid=4326)
-
-        # Reference query:
-        #  SELECT ST_distance_sphere(point, ST_GeomFromText('LINESTRING(150.9020 -34.4245,150.8700 -34.5789)', 4326))
-        #  FROM distapp_australiacity ORDER BY name;
-        distances = [1120954.92533513, 140575.720018241, 640396.662906304,
-                     60580.9693849269, 972807.955955075, 568451.8357838,
-                     40435.4335201384, 0, 68272.3896586844, 12375.0643697706, 0]
-        qs = AustraliaCity.objects.annotate(distance=Distance('point', ls)).order_by('name')
-        for city, distance in zip(qs, distances):
-            # Testing equivalence to within a meter (kilometer on SpatiaLite).
-            tol = -3 if spatialite else 0
-            self.assertAlmostEqual(distance, city.distance.m, tol)
-
-    @skipUnlessDBFeature("has_Distance_function", "supports_distance_geodetic")
-    def test_distance_geodetic_spheroid(self):
-        tol = 2 if oracle else 4
-
-        # Got the reference distances using the raw SQL statements:
-        #  SELECT ST_distance_spheroid(point, ST_GeomFromText('POINT(151.231341 -33.952685)', 4326),
-        #    'SPHEROID["WGS 84",6378137.0,298.257223563]') FROM distapp_australiacity WHERE (NOT (id = 11));
-        #  SELECT ST_distance_sphere(point, ST_GeomFromText('POINT(151.231341 -33.952685)', 4326))
-        #  FROM distapp_australiacity WHERE (NOT (id = 11));  st_distance_sphere
-        spheroid_distances = [
-            60504.0628957201, 77023.9489850262, 49154.8867574404,
-            90847.4358768573, 217402.811919332, 709599.234564757,
-            640011.483550888, 7772.00667991925, 1047861.78619339,
-            1165126.55236034,
-        ]
-        sphere_distances = [
-            60580.9693849267, 77144.0435286473, 49199.4415344719,
-            90804.7533823494, 217713.384600405, 709134.127242793,
-            639828.157159169, 7786.82949717788, 1049204.06569028,
-            1162623.7238134,
-        ]
-        # Testing with spheroid distances first.
-        hillsdale = AustraliaCity.objects.get(name='Hillsdale')
-        qs = AustraliaCity.objects.exclude(id=hillsdale.id).annotate(
-            distance=Distance('point', hillsdale.point, spheroid=True)
-        ).order_by('id')
-        for i, c in enumerate(qs):
-            self.assertAlmostEqual(spheroid_distances[i], c.distance.m, tol)
-        if postgis or spatialite:
-            # PostGIS and SpatiaLite use sphere-only distances by default; test these as well.
-            qs = AustraliaCity.objects.exclude(id=hillsdale.id).annotate(
-                distance=Distance('point', hillsdale.point)
-            ).order_by('id')
-            for i, c in enumerate(qs):
-                self.assertAlmostEqual(sphere_distances[i], c.distance.m, tol)
-
-    @no_oracle  # Oracle already handles geographic distance calculation.
-    @skipUnlessDBFeature("has_Distance_function", 'has_Transform_function')
-    def test_distance_transform(self):
-        """
-        Test the `Distance` function used with `Transform` on a geographic field.
-        """
-        # We'll be using a Polygon (created by buffering the centroid
-        # of 77005 to 100m) -- which isn't allowed in geographic distance
-        # queries normally; however, our field has been transformed to
-        # a non-geographic system.
-        z = SouthTexasZipcode.objects.get(name='77005')
-
-        # Reference query:
-        # SELECT ST_Distance(ST_Transform("distapp_censuszipcode"."poly", 32140),
-        #   ST_GeomFromText('<buffer_wkt>', 32140))
-        # FROM "distapp_censuszipcode";
-        dists_m = [3553.30384972258, 1243.18391525602, 2186.15439472242]
-
-        # Having the buffer in the SRID of the transformation and in the SRID of
-        # the field should give the same results. The first buffer needs no
-        # transformation SQL because its SRID is the same as the one given to
-        # `transform()`; the second buffer, however, will need to be transformed.
-        buf1 = z.poly.centroid.buffer(100)
-        buf2 = buf1.transform(4269, clone=True)
-        ref_zips = ['77002', '77025', '77401']
-
-        for buf in [buf1, buf2]:
-            qs = CensusZipcode.objects.exclude(name='77005').annotate(
-                distance=Distance(Transform('poly', 32140), buf)
-            ).order_by('name')
-            self.assertEqual(ref_zips, sorted([c.name for c in qs]))
-            for i, z in enumerate(qs):
-                self.assertAlmostEqual(z.distance.m, dists_m[i], 5)
-
-    @skipUnlessDBFeature("has_Distance_function")
-    def test_distance_order_by(self):
-        qs = SouthTexasCity.objects.annotate(distance=Distance('point', Point(3, 3, srid=32140))).order_by(
-            'distance'
-        ).values_list('name', flat=True).filter(name__in=('San Antonio', 'Pearland'))
-        self.assertSequenceEqual(qs, ['San Antonio', 'Pearland'])
-
-    @skipUnlessDBFeature("has_Length_function")
-    def test_length(self):
-        """
-        Test the `Length` function.
-        """
-        # Reference query (should use `length_spheroid`).
-        # SELECT ST_length_spheroid(ST_GeomFromText('<wkt>', 4326), 'SPHEROID["WGS 84",6378137,298.257223563,
-        #   AUTHORITY["EPSG","7030"]]');
-        len_m1 = 473504.769553813
-        len_m2 = 4617.668
-
-        if connection.features.supports_length_geodetic:
-            qs = Interstate.objects.annotate(length=Length('path'))
-            tol = 2 if oracle else 3
-            self.assertAlmostEqual(len_m1, qs[0].length.m, tol)
-            # TODO: test with spheroid argument (True and False)
-        else:
-            # Does not support geodetic coordinate systems.
-            with self.assertRaises(NotImplementedError):
-                list(Interstate.objects.annotate(length=Length('path')))
-
-        # Now doing length on a projected coordinate system.
-        i10 = SouthTexasInterstate.objects.annotate(length=Length('path')).get(name='I-10')
-        self.assertAlmostEqual(len_m2, i10.length.m, 2)
-        self.assertTrue(
-            SouthTexasInterstate.objects.annotate(length=Length('path')).filter(length__gt=4000).exists()
-        )
-
-    @skipUnlessDBFeature("has_Perimeter_function")
-    def test_perimeter(self):
-        """
-        Test the `Perimeter` function.
-        """
-        # Reference query:
-        # SELECT ST_Perimeter(distapp_southtexaszipcode.poly) FROM distapp_southtexaszipcode;
-        perim_m = [18404.3550889361, 15627.2108551001, 20632.5588368978, 17094.5996143697]
-        tol = 2 if oracle else 7
-        qs = SouthTexasZipcode.objects.annotate(perimeter=Perimeter('poly')).order_by('name')
-        for i, z in enumerate(qs):
-            self.assertAlmostEqual(perim_m[i], z.perimeter.m, tol)
-
-        # Running on points; should return 0.
-        qs = SouthTexasCity.objects.annotate(perim=Perimeter('point'))
-        for city in qs:
-            self.assertEqual(0, city.perim.m)
-
-    @skipUnlessDBFeature("has_Perimeter_function")
-    def test_perimeter_geodetic(self):
-        # Currently only Oracle supports calculating the perimeter on geodetic
-        # geometries (without being transformed).
-        qs1 = CensusZipcode.objects.annotate(perim=Perimeter('poly'))
-        if connection.features.supports_perimeter_geodetic:
-            self.assertAlmostEqual(qs1[0].perim.m, 18406.3818954314, 3)
-        else:
-            with self.assertRaises(NotImplementedError):
-                list(qs1)
-        # But should work fine when transformed to projected coordinates
-        qs2 = CensusZipcode.objects.annotate(perim=Perimeter(Transform('poly', 32140))).filter(name='77002')
-        self.assertAlmostEqual(qs2[0].perim.m, 18404.355, 3)
-
-    @skipUnlessDBFeature("supports_null_geometries", "has_Area_function", "has_Distance_function")
-    def test_measurement_null_fields(self):
-        """
-        Test the measurement functions on fields with NULL values.
-        """
-        # Creating SouthTexasZipcode w/NULL value.
-        SouthTexasZipcode.objects.create(name='78212')
-        # Performing distance/area queries against the NULL PolygonField,
-        # and ensuring the result of the operations is None.
-        htown = SouthTexasCity.objects.get(name='Downtown Houston')
-        z = SouthTexasZipcode.objects.annotate(
-            distance=Distance('poly', htown.point), area=Area('poly')
-        ).get(name='78212')
-        self.assertIsNone(z.distance)
-        self.assertIsNone(z.area)
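This deleted file exercises the same measurements through two APIs: the first test class goes through the pre-2.0 `GeoQuerySet` methods (silenced above with `ignore_warnings(category=RemovedInDjango20Warning)`), while `DistanceFunctionsTests` uses the annotation-based functions that replace them. A minimal sketch of the correspondence, assuming the same `distapp` models are importable as they are in these tests:

    from django.contrib.gis.db.models.functions import Area, Distance

    htown = SouthTexasCity.objects.get(name='Downtown Houston')

    # Pre-2.0 GeoQuerySet methods (deprecated): the measure objects are attached
    # as attributes on each returned instance.
    SouthTexasZipcode.objects.area()                   # each z carries z.area
    SouthTexasZipcode.objects.distance(htown.point)    # each z carries z.distance

    # Function-based replacements: the same measures as explicit annotations.
    SouthTexasZipcode.objects.annotate(area=Area('poly'))
    SouthTexasZipcode.objects.annotate(distance=Distance('poly', htown.point))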

+ 0 - 905
desktop/core/ext-py/Django-1.11.22/tests/gis_tests/geoapp/tests.py

@@ -1,905 +0,0 @@
-from __future__ import unicode_literals
-
-import re
-import tempfile
-
-from django.contrib.gis import gdal
-from django.contrib.gis.db.models import Extent, MakeLine, Union
-from django.contrib.gis.geos import (
-    GeometryCollection, GEOSGeometry, LinearRing, LineString, MultiLineString,
-    MultiPoint, MultiPolygon, Point, Polygon, fromstr,
-)
-from django.core.management import call_command
-from django.db import connection
-from django.test import TestCase, ignore_warnings, skipUnlessDBFeature
-from django.utils import six
-from django.utils.deprecation import RemovedInDjango20Warning
-
-from ..utils import oracle, postgis, skipUnlessGISLookup, spatialite
-from .models import (
-    City, Country, Feature, MinusOneSRID, NonConcreteModel, PennsylvaniaCity,
-    State, Track,
-)
-
-
-class GeoModelTest(TestCase):
-    fixtures = ['initial']
-
-    def test_fixtures(self):
-        "Testing geographic model initialization from fixtures."
-        # Ensuring that data was loaded from initial data fixtures.
-        self.assertEqual(2, Country.objects.count())
-        self.assertEqual(8, City.objects.count())
-        self.assertEqual(2, State.objects.count())
-
-    def test_proxy(self):
-        "Testing Lazy-Geometry support (using the GeometryProxy)."
-        # Testing on a Point
-        pnt = Point(0, 0)
-        nullcity = City(name='NullCity', point=pnt)
-        nullcity.save()
-
-        # Making sure TypeError is thrown when trying to set with an
-        #  incompatible type.
-        for bad in [5, 2.0, LineString((0, 0), (1, 1))]:
-            with self.assertRaisesMessage(TypeError, 'Cannot set'):
-                nullcity.point = bad
-
-        # Now setting with a compatible GEOS Geometry, saving, and ensuring
-        #  the save took; note that no SRID is explicitly set.
-        new = Point(5, 23)
-        nullcity.point = new
-
-        # Ensuring that the SRID is automatically set to that of the
-        #  field after assignment, but before saving.
-        self.assertEqual(4326, nullcity.point.srid)
-        nullcity.save()
-
-        # Ensuring the point was saved correctly after saving
-        self.assertEqual(new, City.objects.get(name='NullCity').point)
-
-        # Setting the X and Y of the Point
-        nullcity.point.x = 23
-        nullcity.point.y = 5
-        # Checking assignments pre & post-save.
-        self.assertNotEqual(Point(23, 5, srid=4326), City.objects.get(name='NullCity').point)
-        nullcity.save()
-        self.assertEqual(Point(23, 5, srid=4326), City.objects.get(name='NullCity').point)
-        nullcity.delete()
-
-        # Testing on a Polygon
-        shell = LinearRing((0, 0), (0, 90), (100, 90), (100, 0), (0, 0))
-        inner = LinearRing((40, 40), (40, 60), (60, 60), (60, 40), (40, 40))
-
-        # Creating a State object using a built Polygon
-        ply = Polygon(shell, inner)
-        nullstate = State(name='NullState', poly=ply)
-        self.assertEqual(4326, nullstate.poly.srid)  # SRID auto-set from None
-        nullstate.save()
-
-        ns = State.objects.get(name='NullState')
-        self.assertEqual(ply, ns.poly)
-
-        # Testing the `ogr` and `srs` lazy-geometry properties.
-        self.assertIsInstance(ns.poly.ogr, gdal.OGRGeometry)
-        self.assertEqual(ns.poly.wkb, ns.poly.ogr.wkb)
-        self.assertIsInstance(ns.poly.srs, gdal.SpatialReference)
-        self.assertEqual('WGS 84', ns.poly.srs.name)
-
-        # Changing the interior ring on the poly attribute.
-        new_inner = LinearRing((30, 30), (30, 70), (70, 70), (70, 30), (30, 30))
-        ns.poly[1] = new_inner
-        ply[1] = new_inner
-        self.assertEqual(4326, ns.poly.srid)
-        ns.save()
-        self.assertEqual(ply, State.objects.get(name='NullState').poly)
-        ns.delete()
-
-    @skipUnlessDBFeature("supports_transform")
-    def test_lookup_insert_transform(self):
-        "Testing automatic transform for lookups and inserts."
-        # San Antonio in 'WGS84' (SRID 4326)
-        sa_4326 = 'POINT (-98.493183 29.424170)'
-        wgs_pnt = fromstr(sa_4326, srid=4326)  # Our reference point in WGS84
-        # San Antonio in 'WGS 84 / Pseudo-Mercator' (SRID 3857)
-        other_srid_pnt = wgs_pnt.transform(3857, clone=True)
-        # Constructing & querying with a point from a different SRID. Oracle
-        # `SDO_OVERLAPBDYINTERSECT` operates differently from
-        # `ST_Intersects`, so contains is used instead.
-        if oracle:
-            tx = Country.objects.get(mpoly__contains=other_srid_pnt)
-        else:
-            tx = Country.objects.get(mpoly__intersects=other_srid_pnt)
-        self.assertEqual('Texas', tx.name)
-
-        # Creating San Antonio.  Remember the Alamo.
-        sa = City.objects.create(name='San Antonio', point=other_srid_pnt)
-
-        # Now verifying that San Antonio was transformed correctly
-        sa = City.objects.get(name='San Antonio')
-        self.assertAlmostEqual(wgs_pnt.x, sa.point.x, 6)
-        self.assertAlmostEqual(wgs_pnt.y, sa.point.y, 6)
-
-        # If the GeometryField SRID is -1, then we shouldn't perform any
-        # transformation if the SRID of the input geometry is different.
-        m1 = MinusOneSRID(geom=Point(17, 23, srid=4326))
-        m1.save()
-        self.assertEqual(-1, m1.geom.srid)
-
-    def test_createnull(self):
-        "Testing creating a model instance and the geometry being None"
-        c = City()
-        self.assertIsNone(c.point)
-
-    def test_geometryfield(self):
-        "Testing the general GeometryField."
-        Feature(name='Point', geom=Point(1, 1)).save()
-        Feature(name='LineString', geom=LineString((0, 0), (1, 1), (5, 5))).save()
-        Feature(name='Polygon', geom=Polygon(LinearRing((0, 0), (0, 5), (5, 5), (5, 0), (0, 0)))).save()
-        Feature(name='GeometryCollection',
-                geom=GeometryCollection(Point(2, 2), LineString((0, 0), (2, 2)),
-                                        Polygon(LinearRing((0, 0), (0, 5), (5, 5), (5, 0), (0, 0))))).save()
-
-        f_1 = Feature.objects.get(name='Point')
-        self.assertIsInstance(f_1.geom, Point)
-        self.assertEqual((1.0, 1.0), f_1.geom.tuple)
-        f_2 = Feature.objects.get(name='LineString')
-        self.assertIsInstance(f_2.geom, LineString)
-        self.assertEqual(((0.0, 0.0), (1.0, 1.0), (5.0, 5.0)), f_2.geom.tuple)
-
-        f_3 = Feature.objects.get(name='Polygon')
-        self.assertIsInstance(f_3.geom, Polygon)
-        f_4 = Feature.objects.get(name='GeometryCollection')
-        self.assertIsInstance(f_4.geom, GeometryCollection)
-        self.assertEqual(f_3.geom, f_4.geom[2])
-
-    @skipUnlessDBFeature("supports_transform")
-    def test_inherited_geofields(self):
-        "Test GeoQuerySet methods on inherited Geometry fields."
-        # Creating a Pennsylvanian city.
-        PennsylvaniaCity.objects.create(name='Mansfield', county='Tioga', point='POINT(-77.071445 41.823881)')
-
-        # All transformation SQL will need to be performed on the
-        # _parent_ table.
-        qs = PennsylvaniaCity.objects.transform(32128)
-
-        self.assertEqual(1, qs.count())
-        for pc in qs:
-            self.assertEqual(32128, pc.point.srid)
-
-    def test_raw_sql_query(self):
-        "Testing raw SQL query."
-        cities1 = City.objects.all()
-        # Only PostGIS would support a 'select *' query because of its recognized
-        # HEXEWKB format for geometry fields
-        as_text = 'ST_AsText(%s)' if postgis else connection.ops.select
-        cities2 = City.objects.raw(
-            'select id, name, %s from geoapp_city' % as_text % 'point'
-        )
-        self.assertEqual(len(cities1), len(list(cities2)))
-        self.assertIsInstance(cities2[0].point, Point)
-
-    def test_dumpdata_loaddata_cycle(self):
-        """
-        Test a dumpdata/loaddata cycle with geographic data.
-        """
-        out = six.StringIO()
-        original_data = list(City.objects.all().order_by('name'))
-        call_command('dumpdata', 'geoapp.City', stdout=out)
-        result = out.getvalue()
-        houston = City.objects.get(name='Houston')
-        self.assertIn('"point": "%s"' % houston.point.ewkt, result)
-
-        # Reload the data that was just dumped.
-        with tempfile.NamedTemporaryFile(mode='w', suffix='.json') as tmp:
-            tmp.write(result)
-            tmp.seek(0)
-            call_command('loaddata', tmp.name, verbosity=0)
-        self.assertListEqual(original_data, list(City.objects.all().order_by('name')))
-
-    @skipUnlessDBFeature("supports_empty_geometries")
-    def test_empty_geometries(self):
-        geometry_classes = [
-            Point,
-            LineString,
-            LinearRing,
-            Polygon,
-            MultiPoint,
-            MultiLineString,
-            MultiPolygon,
-            GeometryCollection,
-        ]
-        for klass in geometry_classes:
-            g = klass(srid=4326)
-            feature = Feature.objects.create(name='Empty %s' % klass.__name__, geom=g)
-            feature.refresh_from_db()
-            if klass is LinearRing:
-                # LinearRing isn't representable in WKB, so GEOSGeometry.wkb
-                # uses LineString instead.
-                g = LineString(srid=4326)
-            self.assertEqual(feature.geom, g)
-            self.assertEqual(feature.geom.srid, g.srid)
-
-
-class GeoLookupTest(TestCase):
-    fixtures = ['initial']
-
-    def test_disjoint_lookup(self):
-        "Testing the `disjoint` lookup type."
-        ptown = City.objects.get(name='Pueblo')
-        qs1 = City.objects.filter(point__disjoint=ptown.point)
-        self.assertEqual(7, qs1.count())
-
-        if connection.features.supports_real_shape_operations:
-            qs2 = State.objects.filter(poly__disjoint=ptown.point)
-            self.assertEqual(1, qs2.count())
-            self.assertEqual('Kansas', qs2[0].name)
-
-    def test_contains_contained_lookups(self):
-        "Testing the 'contained', 'contains', and 'bbcontains' lookup types."
-        # Getting Texas, yes we were a country -- once ;)
-        texas = Country.objects.get(name='Texas')
-
-        # Seeing what cities are in Texas, should get Houston and Dallas,
-        #  and Oklahoma City because 'contained' only checks on the
-        #  _bounding box_ of the Geometries.
-        if connection.features.supports_contained_lookup:
-            qs = City.objects.filter(point__contained=texas.mpoly)
-            self.assertEqual(3, qs.count())
-            cities = ['Houston', 'Dallas', 'Oklahoma City']
-            for c in qs:
-                self.assertIn(c.name, cities)
-
-        # Pulling out some cities.
-        houston = City.objects.get(name='Houston')
-        wellington = City.objects.get(name='Wellington')
-        pueblo = City.objects.get(name='Pueblo')
-        okcity = City.objects.get(name='Oklahoma City')
-        lawrence = City.objects.get(name='Lawrence')
-
-        # Now testing contains on the countries using the points for
-        #  Houston and Wellington.
-        tx = Country.objects.get(mpoly__contains=houston.point)  # Query w/GEOSGeometry
-        nz = Country.objects.get(mpoly__contains=wellington.point.hex)  # Query w/EWKBHEX
-        self.assertEqual('Texas', tx.name)
-        self.assertEqual('New Zealand', nz.name)
-
-        # Testing `contains` on the states using the point for Lawrence.
-        ks = State.objects.get(poly__contains=lawrence.point)
-        self.assertEqual('Kansas', ks.name)
-
-        # Pueblo and Oklahoma City (even though OK City is within the bounding box of Texas)
-        # are not contained in Texas or New Zealand.
-        self.assertEqual(len(Country.objects.filter(mpoly__contains=pueblo.point)), 0)  # Query w/GEOSGeometry object
-        self.assertEqual(len(Country.objects.filter(mpoly__contains=okcity.point.wkt)),
-                         0 if connection.features.supports_real_shape_operations else 1)  # Query w/WKT
-
-        # OK City is contained w/in bounding box of Texas.
-        if connection.features.supports_bbcontains_lookup:
-            qs = Country.objects.filter(mpoly__bbcontains=okcity.point)
-            self.assertEqual(1, len(qs))
-            self.assertEqual('Texas', qs[0].name)
-
-    @skipUnlessDBFeature("supports_crosses_lookup")
-    def test_crosses_lookup(self):
-        Track.objects.create(
-            name='Line1',
-            line=LineString([(-95, 29), (-60, 0)])
-        )
-        self.assertEqual(
-            Track.objects.filter(line__crosses=LineString([(-95, 0), (-60, 29)])).count(),
-            1
-        )
-        self.assertEqual(
-            Track.objects.filter(line__crosses=LineString([(-95, 30), (0, 30)])).count(),
-            0
-        )
-
-    @skipUnlessDBFeature("supports_isvalid_lookup")
-    def test_isvalid_lookup(self):
-        invalid_geom = fromstr('POLYGON((0 0, 0 1, 1 1, 1 0, 1 1, 1 0, 0 0))')
-        State.objects.create(name='invalid', poly=invalid_geom)
-        qs = State.objects.all()
-        if oracle:
-            # Kansas has adjacent vertices with distance 6.99244813842e-12
-            # which is smaller than the default Oracle tolerance.
-            qs = qs.exclude(name='Kansas')
-            self.assertEqual(State.objects.filter(name='Kansas', poly__isvalid=False).count(), 1)
-        self.assertEqual(qs.filter(poly__isvalid=False).count(), 1)
-        self.assertEqual(qs.filter(poly__isvalid=True).count(), qs.count() - 1)
-
-    @skipUnlessDBFeature("supports_left_right_lookups")
-    def test_left_right_lookups(self):
-        "Testing the 'left' and 'right' lookup types."
-        # Left: A << B => true if xmax(A) < xmin(B)
-        # Right: A >> B => true if xmin(A) > xmax(B)
-        # See: BOX2D_left() and BOX2D_right() in lwgeom_box2dfloat4.c in PostGIS source.
-
-        # Getting the borders for Colorado & Kansas
-        co_border = State.objects.get(name='Colorado').poly
-        ks_border = State.objects.get(name='Kansas').poly
-
-        # Note: Wellington has an 'X' value of 174, so it will not be considered
-        # to the left of CO.
-
-        # These cities should be strictly to the right of the CO border.
-        cities = ['Houston', 'Dallas', 'Oklahoma City',
-                  'Lawrence', 'Chicago', 'Wellington']
-        qs = City.objects.filter(point__right=co_border)
-        self.assertEqual(6, len(qs))
-        for c in qs:
-            self.assertIn(c.name, cities)
-
-        # These cities should be strictly to the right of the KS border.
-        cities = ['Chicago', 'Wellington']
-        qs = City.objects.filter(point__right=ks_border)
-        self.assertEqual(2, len(qs))
-        for c in qs:
-            self.assertIn(c.name, cities)
-
-        # Note: Wellington has an 'X' value of 174, so it will not be considered
-        #  to the left of CO.
-        vic = City.objects.get(point__left=co_border)
-        self.assertEqual('Victoria', vic.name)
-
-        cities = ['Pueblo', 'Victoria']
-        qs = City.objects.filter(point__left=ks_border)
-        self.assertEqual(2, len(qs))
-        for c in qs:
-            self.assertIn(c.name, cities)
-
-    @skipUnlessGISLookup("strictly_above", "strictly_below")
-    def test_strictly_above_below_lookups(self):
-        dallas = City.objects.get(name='Dallas')
-        self.assertQuerysetEqual(
-            City.objects.filter(point__strictly_above=dallas.point).order_by('name'),
-            ['Chicago', 'Lawrence', 'Oklahoma City', 'Pueblo', 'Victoria'],
-            lambda b: b.name
-        )
-        self.assertQuerysetEqual(
-            City.objects.filter(point__strictly_below=dallas.point).order_by('name'),
-            ['Houston', 'Wellington'],
-            lambda b: b.name
-        )
-
-    def test_equals_lookups(self):
-        "Testing the 'same_as' and 'equals' lookup types."
-        pnt = fromstr('POINT (-95.363151 29.763374)', srid=4326)
-        c1 = City.objects.get(point=pnt)
-        c2 = City.objects.get(point__same_as=pnt)
-        c3 = City.objects.get(point__equals=pnt)
-        for c in [c1, c2, c3]:
-            self.assertEqual('Houston', c.name)
-
-    @skipUnlessDBFeature("supports_null_geometries")
-    def test_null_geometries(self):
-        "Testing NULL geometry support, and the `isnull` lookup type."
-        # Creating a state with a NULL boundary.
-        State.objects.create(name='Puerto Rico')
-
-        # Querying for both NULL and Non-NULL values.
-        nullqs = State.objects.filter(poly__isnull=True)
-        validqs = State.objects.filter(poly__isnull=False)
-
-        # Puerto Rico should be NULL (it's a commonwealth unincorporated territory)
-        self.assertEqual(1, len(nullqs))
-        self.assertEqual('Puerto Rico', nullqs[0].name)
-
-        # The valid states should be Colorado & Kansas
-        self.assertEqual(2, len(validqs))
-        state_names = [s.name for s in validqs]
-        self.assertIn('Colorado', state_names)
-        self.assertIn('Kansas', state_names)
-
-        # Saving another commonwealth w/a NULL geometry.
-        nmi = State.objects.create(name='Northern Mariana Islands', poly=None)
-        self.assertIsNone(nmi.poly)
-
-        # Assigning a geometry and saving -- then UPDATE back to NULL.
-        nmi.poly = 'POLYGON((0 0,1 0,1 1,1 0,0 0))'
-        nmi.save()
-        State.objects.filter(name='Northern Mariana Islands').update(poly=None)
-        self.assertIsNone(State.objects.get(name='Northern Mariana Islands').poly)
-
-    @skipUnlessDBFeature("supports_relate_lookup")
-    def test_relate_lookup(self):
-        "Testing the 'relate' lookup type."
-        # To make things more interesting, we will have our Texas reference point in
-        # different SRIDs.
-        pnt1 = fromstr('POINT (649287.0363174 4177429.4494686)', srid=2847)
-        pnt2 = fromstr('POINT(-98.4919715741052 29.4333344025053)', srid=4326)
-
-        # Not passing in a geometry as the first param should
-        # raise a ValueError when initializing the GeoQuerySet.
-        with self.assertRaises(ValueError):
-            Country.objects.filter(mpoly__relate=(23, 'foo'))
-
-        # Making sure the right exception is raised for the given
-        # bad arguments.
-        for bad_args, e in [((pnt1, 0), ValueError), ((pnt2, 'T*T***FF*', 0), ValueError)]:
-            qs = Country.objects.filter(mpoly__relate=bad_args)
-            with self.assertRaises(e):
-                qs.count()
-
-        # Relate works differently for the different backends.
-        if postgis or spatialite:
-            contains_mask = 'T*T***FF*'
-            within_mask = 'T*F**F***'
-            intersects_mask = 'T********'
-        elif oracle:
-            contains_mask = 'contains'
-            within_mask = 'inside'
-            # TODO: This is not quite the same as the PostGIS mask above
-            intersects_mask = 'overlapbdyintersect'
-
-        # Testing contains relation mask.
-        self.assertEqual('Texas', Country.objects.get(mpoly__relate=(pnt1, contains_mask)).name)
-        self.assertEqual('Texas', Country.objects.get(mpoly__relate=(pnt2, contains_mask)).name)
-
-        # Testing within relation mask.
-        ks = State.objects.get(name='Kansas')
-        self.assertEqual('Lawrence', City.objects.get(point__relate=(ks.poly, within_mask)).name)
-
-        # Testing intersection relation mask.
-        if not oracle:
-            self.assertEqual('Texas', Country.objects.get(mpoly__relate=(pnt1, intersects_mask)).name)
-            self.assertEqual('Texas', Country.objects.get(mpoly__relate=(pnt2, intersects_mask)).name)
-            self.assertEqual('Lawrence', City.objects.get(point__relate=(ks.poly, intersects_mask)).name)
-
-
-@ignore_warnings(category=RemovedInDjango20Warning)
-class GeoQuerySetTest(TestCase):
-    fixtures = ['initial']
-
-    # Please keep the tests in alphabetical order of the GeoQuerySet methods.
-
-    @skipUnlessDBFeature("has_centroid_method")
-    def test_centroid(self):
-        "Testing the `centroid` GeoQuerySet method."
-        qs = State.objects.exclude(poly__isnull=True).centroid()
-        if oracle:
-            tol = 0.1
-        elif spatialite:
-            tol = 0.000001
-        else:
-            tol = 0.000000001
-        for s in qs:
-            self.assertTrue(s.poly.centroid.equals_exact(s.centroid, tol))
-
-    @skipUnlessDBFeature(
-        "has_difference_method", "has_intersection_method",
-        "has_sym_difference_method", "has_union_method")
-    def test_diff_intersection_union(self):
-        "Testing the `difference`, `intersection`, `sym_difference`, and `union` GeoQuerySet methods."
-        geom = Point(5, 23)
-        qs = Country.objects.all().difference(geom).sym_difference(geom).union(geom)
-
-        # XXX For some reason SpatiaLite does something screwy with the Texas geometry here.  Also,
-        # XXX it doesn't like the null intersection.
-        if spatialite:
-            qs = qs.exclude(name='Texas')
-        else:
-            qs = qs.intersection(geom)
-
-        for c in qs:
-            if oracle:
-                # Should be able to execute the queries; however, they won't be the same
-                # as GEOS (because Oracle doesn't use GEOS internally like PostGIS or
-                # SpatiaLite).
-                pass
-            else:
-                if spatialite:
-                    # Spatialite `difference` doesn't have an SRID
-                    self.assertEqual(c.mpoly.difference(geom).wkt, c.difference.wkt)
-                else:
-                    self.assertEqual(c.mpoly.difference(geom), c.difference)
-                    self.assertEqual(c.mpoly.intersection(geom), c.intersection)
-                # Ordering might differ in collections
-                self.assertSetEqual(set(g.wkt for g in c.mpoly.sym_difference(geom)),
-                                    set(g.wkt for g in c.sym_difference))
-                self.assertSetEqual(set(g.wkt for g in c.mpoly.union(geom)),
-                                    set(g.wkt for g in c.union))
-
-    @skipUnlessDBFeature("has_envelope_method")
-    def test_envelope(self):
-        "Testing the `envelope` GeoQuerySet method."
-        countries = Country.objects.all().envelope()
-        for country in countries:
-            self.assertIsInstance(country.envelope, Polygon)
-
-    @skipUnlessDBFeature("supports_extent_aggr")
-    def test_extent(self):
-        """
-        Testing the `Extent` aggregate.
-        """
-        # Reference query:
-        # `SELECT ST_extent(point) FROM geoapp_city WHERE (name='Houston' or name='Dallas');`
-        #   =>  BOX(-96.8016128540039 29.7633724212646,-95.3631439208984 32.7820587158203)
-        expected = (-96.8016128540039, 29.7633724212646, -95.3631439208984, 32.782058715820)
-
-        qs = City.objects.filter(name__in=('Houston', 'Dallas'))
-        extent = qs.aggregate(Extent('point'))['point__extent']
-        for val, exp in zip(extent, expected):
-            self.assertAlmostEqual(exp, val, 4)
-        self.assertIsNone(City.objects.filter(name=('Smalltown')).aggregate(Extent('point'))['point__extent'])
-
-    @skipUnlessDBFeature("supports_extent_aggr")
-    def test_extent_with_limit(self):
-        """
-        Testing if extent supports limit.
-        """
-        extent1 = City.objects.all().aggregate(Extent('point'))['point__extent']
-        extent2 = City.objects.all()[:3].aggregate(Extent('point'))['point__extent']
-        self.assertNotEqual(extent1, extent2)
-
-    @skipUnlessDBFeature("has_force_rhr_method")
-    def test_force_rhr(self):
-        "Testing GeoQuerySet.force_rhr()."
-        rings = (
-            ((0, 0), (5, 0), (0, 5), (0, 0)),
-            ((1, 1), (1, 3), (3, 1), (1, 1)),
-        )
-        rhr_rings = (
-            ((0, 0), (0, 5), (5, 0), (0, 0)),
-            ((1, 1), (3, 1), (1, 3), (1, 1)),
-        )
-        State.objects.create(name='Foo', poly=Polygon(*rings))
-        s = State.objects.force_rhr().get(name='Foo')
-        self.assertEqual(rhr_rings, s.force_rhr.coords)
-
-    @skipUnlessDBFeature("has_geohash_method")
-    def test_geohash(self):
-        "Testing GeoQuerySet.geohash()."
-        # Reference query:
-        # SELECT ST_GeoHash(point) FROM geoapp_city WHERE name='Houston';
-        # SELECT ST_GeoHash(point, 5) FROM geoapp_city WHERE name='Houston';
-        ref_hash = '9vk1mfq8jx0c8e0386z6'
-        h1 = City.objects.geohash().get(name='Houston')
-        h2 = City.objects.geohash(precision=5).get(name='Houston')
-        self.assertEqual(ref_hash, h1.geohash)
-        self.assertEqual(ref_hash[:5], h2.geohash)
-
-    def test_geojson(self):
-        "Testing GeoJSON output from the database using GeoQuerySet.geojson()."
-        # Only PostGIS and SpatiaLite support GeoJSON.
-        if not connection.ops.geojson:
-            with self.assertRaises(NotImplementedError):
-                Country.objects.all().geojson(field_name='mpoly')
-            return
-
-        pueblo_json = '{"type":"Point","coordinates":[-104.609252,38.255001]}'
-        houston_json = (
-            '{"type":"Point","crs":{"type":"name","properties":'
-            '{"name":"EPSG:4326"}},"coordinates":[-95.363151,29.763374]}'
-        )
-        victoria_json = (
-            '{"type":"Point","bbox":[-123.30519600,48.46261100,-123.30519600,48.46261100],'
-            '"coordinates":[-123.305196,48.462611]}'
-        )
-        chicago_json = (
-            '{"type":"Point","crs":{"type":"name","properties":{"name":"EPSG:4326"}},'
-            '"bbox":[-87.65018,41.85039,-87.65018,41.85039],"coordinates":[-87.65018,41.85039]}'
-        )
-        if spatialite:
-            victoria_json = (
-                '{"type":"Point","bbox":[-123.305196,48.462611,-123.305196,48.462611],'
-                '"coordinates":[-123.305196,48.462611]}'
-            )
-
-        # Precision argument should only be an integer
-        with self.assertRaises(TypeError):
-            City.objects.geojson(precision='foo')
-
-        # Reference queries and values.
-        # SELECT ST_AsGeoJson("geoapp_city"."point", 8, 0)
-        # FROM "geoapp_city" WHERE "geoapp_city"."name" = 'Pueblo';
-        self.assertEqual(pueblo_json, City.objects.geojson().get(name='Pueblo').geojson)
-
-        # SELECT ST_AsGeoJson("geoapp_city"."point", 8, 2) FROM "geoapp_city"
-        # WHERE "geoapp_city"."name" = 'Houston';
-        # This time we want to include the CRS by using the `crs` keyword.
-        self.assertEqual(houston_json, City.objects.geojson(crs=True, model_att='json').get(name='Houston').json)
-
-        # SELECT ST_AsGeoJson("geoapp_city"."point", 8, 1) FROM "geoapp_city"
-        # WHERE "geoapp_city"."name" = 'Houston';
-        # This time we include the bounding box by using the `bbox` keyword.
-        self.assertEqual(victoria_json, City.objects.geojson(bbox=True).get(name='Victoria').geojson)
-
-        # SELECT ST_AsGeoJson("geoapp_city"."point", 5, 3) FROM "geoapp_city"
-        # WHERE "geoapp_city"."name" = 'Chicago';
-        # Finally, we set every available keyword.
-        self.assertEqual(
-            chicago_json,
-            City.objects.geojson(bbox=True, crs=True, precision=5).get(name='Chicago').geojson
-        )
-
-    @skipUnlessDBFeature("has_gml_method")
-    def test_gml(self):
-        "Testing GML output from the database using GeoQuerySet.gml()."
-        # Should throw a TypeError when trying to obtain GML from a
-        # non-geometry field.
-        qs = City.objects.all()
-        with self.assertRaises(TypeError):
-            qs.gml(field_name='name')
-        ptown1 = City.objects.gml(field_name='point', precision=9).get(name='Pueblo')
-        ptown2 = City.objects.gml(precision=9).get(name='Pueblo')
-
-        if oracle:
-            # No precision parameter for Oracle :-/
-            gml_regex = re.compile(
-                r'^<gml:Point srsName="EPSG:4326" xmlns:gml="http://www.opengis.net/gml">'
-                r'<gml:coordinates decimal="\." cs="," ts=" ">-104.60925\d+,38.25500\d+ '
-                r'</gml:coordinates></gml:Point>'
-            )
-        else:
-            gml_regex = re.compile(
-                r'^<gml:Point srsName="EPSG:4326"><gml:coordinates>'
-                r'-104\.60925\d+,38\.255001</gml:coordinates></gml:Point>'
-            )
-
-        for ptown in [ptown1, ptown2]:
-            self.assertTrue(gml_regex.match(ptown.gml))
-
-        if postgis:
-            self.assertIn('<gml:pos srsDimension="2">', City.objects.gml(version=3).get(name='Pueblo').gml)
-
-    @skipUnlessDBFeature("has_kml_method")
-    def test_kml(self):
-        "Testing KML output from the database using GeoQuerySet.kml()."
-        # Should throw a TypeError when trying to obtain KML from a
-        # non-geometry field.
-        qs = City.objects.all()
-        with self.assertRaises(TypeError):
-            qs.kml('name')
-
-        # Ensuring the KML is as expected.
-        ptown1 = City.objects.kml(field_name='point', precision=9).get(name='Pueblo')
-        ptown2 = City.objects.kml(precision=9).get(name='Pueblo')
-        for ptown in [ptown1, ptown2]:
-            self.assertEqual('<Point><coordinates>-104.609252,38.255001</coordinates></Point>', ptown.kml)
-
-    def test_make_line(self):
-        """
-        Testing the `MakeLine` aggregate.
-        """
-        if not connection.features.supports_make_line_aggr:
-            with self.assertRaises(NotImplementedError):
-                City.objects.all().aggregate(MakeLine('point'))
-            return
-
-        # MakeLine on an inappropriate field simply returns None.
-        self.assertIsNone(State.objects.aggregate(MakeLine('poly'))['poly__makeline'])
-        # Reference query:
-        # SELECT AsText(ST_MakeLine(geoapp_city.point)) FROM geoapp_city;
-        ref_line = GEOSGeometry(
-            'LINESTRING(-95.363151 29.763374,-96.801611 32.782057,'
-            '-97.521157 34.464642,174.783117 -41.315268,-104.609252 38.255001,'
-            '-95.23506 38.971823,-87.650175 41.850385,-123.305196 48.462611)',
-            srid=4326
-        )
-        # Check for equality with a tolerance of 10e-5, which is a lower bound
-        # on the precision of the ref_line coordinates.
-        line = City.objects.aggregate(MakeLine('point'))['point__makeline']
-        self.assertTrue(
-            ref_line.equals_exact(line, tolerance=10e-5),
-            "%s != %s" % (ref_line, line)
-        )
-
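The MakeLine check above compares geometries with `equals_exact` rather than `==`; a minimal standalone sketch of that tolerance-based comparison (assuming a working GEOS install):

    from django.contrib.gis.geos import GEOSGeometry

    a = GEOSGeometry('POINT(-95.363151 29.763374)', srid=4326)
    b = GEOSGeometry('POINT(-95.363149 29.763376)', srid=4326)
    # The coordinates differ by about 2e-6, well under the 10e-5 tolerance used above.
    a.equals_exact(b, tolerance=10e-5)   # True
    a == b                               # False -- exact equality is stricter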
-    @skipUnlessDBFeature("has_num_geom_method")
-    def test_num_geom(self):
-        "Testing the `num_geom` GeoQuerySet method."
-        # Both 'countries' have only two geometries.
-        for c in Country.objects.num_geom():
-            self.assertEqual(2, c.num_geom)
-
-        for c in City.objects.filter(point__isnull=False).num_geom():
-            # Oracle and PostGIS 2.0+ will return 1 for the number of
-            # geometries on non-collections.
-            self.assertEqual(1, c.num_geom)
-
-    @skipUnlessDBFeature("supports_num_points_poly")
-    def test_num_points(self):
-        "Testing the `num_points` GeoQuerySet method."
-        for c in Country.objects.num_points():
-            self.assertEqual(c.mpoly.num_points, c.num_points)
-
-        if not oracle:
-            # Oracle cannot count vertices in Point geometries.
-            for c in City.objects.num_points():
-                self.assertEqual(1, c.num_points)
-
-    @skipUnlessDBFeature("has_point_on_surface_method")
-    def test_point_on_surface(self):
-        "Testing the `point_on_surface` GeoQuerySet method."
-        # Reference values.
-        if oracle:
-            # SELECT SDO_UTIL.TO_WKTGEOMETRY(SDO_GEOM.SDO_POINTONSURFACE(GEOAPP_COUNTRY.MPOLY, 0.05))
-            # FROM GEOAPP_COUNTRY;
-            ref = {'New Zealand': fromstr('POINT (174.616364 -36.100861)', srid=4326),
-                   'Texas': fromstr('POINT (-103.002434 36.500397)', srid=4326),
-                   }
-
-        else:
-            # Using GEOSGeometry to compute the reference point on surface values
-            # -- since PostGIS also uses GEOS these should be the same.
-            ref = {'New Zealand': Country.objects.get(name='New Zealand').mpoly.point_on_surface,
-                   'Texas': Country.objects.get(name='Texas').mpoly.point_on_surface
-                   }
-
-        for c in Country.objects.point_on_surface():
-            if spatialite:
-                # XXX This seems to be a WKT-translation-related precision issue?
-                tol = 0.00001
-            else:
-                tol = 0.000000001
-            self.assertTrue(ref[c.name].equals_exact(c.point_on_surface, tol))
-
-    @skipUnlessDBFeature("has_reverse_method")
-    def test_reverse_geom(self):
-        "Testing GeoQuerySet.reverse_geom()."
-        coords = [(-95.363151, 29.763374), (-95.448601, 29.713803)]
-        Track.objects.create(name='Foo', line=LineString(coords))
-        t = Track.objects.reverse_geom().get(name='Foo')
-        coords.reverse()
-        self.assertEqual(tuple(coords), t.reverse_geom.coords)
-        if oracle:
-            with self.assertRaises(TypeError):
-                State.objects.reverse_geom()
-
-    @skipUnlessDBFeature("has_scale_method")
-    def test_scale(self):
-        "Testing the `scale` GeoQuerySet method."
-        xfac, yfac = 2, 3
-        tol = 5  # XXX The low precision tolerance is for SpatiaLite
-        qs = Country.objects.scale(xfac, yfac, model_att='scaled')
-        for c in qs:
-            for p1, p2 in zip(c.mpoly, c.scaled):
-                for r1, r2 in zip(p1, p2):
-                    for c1, c2 in zip(r1.coords, r2.coords):
-                        self.assertAlmostEqual(c1[0] * xfac, c2[0], tol)
-                        self.assertAlmostEqual(c1[1] * yfac, c2[1], tol)
-
-    @skipUnlessDBFeature("has_snap_to_grid_method")
-    def test_snap_to_grid(self):
-        "Testing GeoQuerySet.snap_to_grid()."
-        # Let's try and break snap_to_grid() with bad combinations of arguments.
-        for bad_args in ((), range(3), range(5)):
-            with self.assertRaises(ValueError):
-                Country.objects.snap_to_grid(*bad_args)
-        for bad_args in (('1.0',), (1.0, None), tuple(map(six.text_type, range(4)))):
-            with self.assertRaises(TypeError):
-                Country.objects.snap_to_grid(*bad_args)
-
-        # Boundary for San Marino, courtesy of Bjorn Sandvik of thematicmapping.org
-        # from the world borders dataset he provides.
-        wkt = ('MULTIPOLYGON(((12.41580 43.95795,12.45055 43.97972,12.45389 43.98167,'
-               '12.46250 43.98472,12.47167 43.98694,12.49278 43.98917,'
-               '12.50555 43.98861,12.51000 43.98694,12.51028 43.98277,'
-               '12.51167 43.94333,12.51056 43.93916,12.49639 43.92333,'
-               '12.49500 43.91472,12.48778 43.90583,12.47444 43.89722,'
-               '12.46472 43.89555,12.45917 43.89611,12.41639 43.90472,'
-               '12.41222 43.90610,12.40782 43.91366,12.40389 43.92667,'
-               '12.40500 43.94833,12.40889 43.95499,12.41580 43.95795)))')
-        Country.objects.create(name='San Marino', mpoly=fromstr(wkt))
-
-        # Because floating-point arithmetic isn't exact, we set a tolerance
-        # to pass into GEOS `equals_exact`.
-        tol = 0.000000001
-
-        # SELECT AsText(ST_SnapToGrid("geoapp_country"."mpoly", 0.1)) FROM "geoapp_country"
-        # WHERE "geoapp_country"."name" = 'San Marino';
-        ref = fromstr('MULTIPOLYGON(((12.4 44,12.5 44,12.5 43.9,12.4 43.9,12.4 44)))')
-        self.assertTrue(ref.equals_exact(Country.objects.snap_to_grid(0.1).get(name='San Marino').snap_to_grid, tol))
-
-        # SELECT AsText(ST_SnapToGrid("geoapp_country"."mpoly", 0.05, 0.23)) FROM "geoapp_country"
-        # WHERE "geoapp_country"."name" = 'San Marino';
-        ref = fromstr('MULTIPOLYGON(((12.4 43.93,12.45 43.93,12.5 43.93,12.45 43.93,12.4 43.93)))')
-        self.assertTrue(
-            ref.equals_exact(Country.objects.snap_to_grid(0.05, 0.23).get(name='San Marino').snap_to_grid, tol)
-        )
-
-        # SELECT AsText(ST_SnapToGrid("geoapp_country"."mpoly", 0.5, 0.17, 0.05, 0.23)) FROM "geoapp_country"
-        # WHERE "geoapp_country"."name" = 'San Marino';
-        ref = fromstr(
-            'MULTIPOLYGON(((12.4 43.87,12.45 43.87,12.45 44.1,12.5 44.1,12.5 43.87,12.45 43.87,12.4 43.87)))'
-        )
-        self.assertTrue(
-            ref.equals_exact(
-                Country.objects.snap_to_grid(0.05, 0.23, 0.5, 0.17).get(name='San Marino').snap_to_grid,
-                tol
-            )
-        )
-
-    @skipUnlessDBFeature("has_svg_method")
-    def test_svg(self):
-        "Testing SVG output using GeoQuerySet.svg()."
-
-        with self.assertRaises(TypeError):
-            City.objects.svg(precision='foo')
-        # SELECT AsSVG(geoapp_city.point, 0, 8) FROM geoapp_city WHERE name = 'Pueblo';
-        svg1 = 'cx="-104.609252" cy="-38.255001"'
-        # Even in relative mode there is only one point, so the output is practically
-        # the same except for the 'c' prefix on the x,y values.
-        svg2 = svg1.replace('c', '')
-        self.assertEqual(svg1, City.objects.svg().get(name='Pueblo').svg)
-        self.assertEqual(svg2, City.objects.svg(relative=5).get(name='Pueblo').svg)
-
-    @skipUnlessDBFeature("has_transform_method")
-    def test_transform(self):
-        "Testing the transform() GeoQuerySet method."
-        # Pre-transformed points for Houston and Pueblo.
-        htown = fromstr('POINT(1947516.83115183 6322297.06040572)', srid=3084)
-        ptown = fromstr('POINT(992363.390841912 481455.395105533)', srid=2774)
-        prec = 3  # Precision is low due to version variations in PROJ and GDAL.
-
-        # Compare the result of the transform operation with the pre-transformed
-        # reference points. Oracle does not have the 3084 SRID.
-        if not oracle:
-            h = City.objects.transform(htown.srid).get(name='Houston')
-            self.assertEqual(3084, h.point.srid)
-            self.assertAlmostEqual(htown.x, h.point.x, prec)
-            self.assertAlmostEqual(htown.y, h.point.y, prec)
-
-        p1 = City.objects.transform(ptown.srid, field_name='point').get(name='Pueblo')
-        p2 = City.objects.transform(srid=ptown.srid).get(name='Pueblo')
-        for p in [p1, p2]:
-            self.assertEqual(2774, p.point.srid)
-            self.assertAlmostEqual(ptown.x, p.point.x, prec)
-            self.assertAlmostEqual(ptown.y, p.point.y, prec)
-
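The database-side `transform()` exercised above has an in-memory counterpart on GEOSGeometry (it also needs GDAL for the reprojection); a rough sketch:

    from django.contrib.gis.geos import GEOSGeometry

    pnt = GEOSGeometry('POINT(-95.363151 29.763374)', srid=4326)   # Houston, WGS84
    pnt.transform(3084)     # reproject in place to SRID 3084, as in the test above
    # pnt.x and pnt.y are now projected coordinates and should land close to the
    # pre-transformed reference values, give or take PROJ/GDAL version differences.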
-    @skipUnlessDBFeature("has_translate_method")
-    def test_translate(self):
-        "Testing the `translate` GeoQuerySet method."
-        xfac, yfac = 5, -23
-        qs = Country.objects.translate(xfac, yfac, model_att='translated')
-        for c in qs:
-            for p1, p2 in zip(c.mpoly, c.translated):
-                for r1, r2 in zip(p1, p2):
-                    for c1, c2 in zip(r1.coords, r2.coords):
-                        # XXX The low precision is for SpatiaLite
-                        self.assertAlmostEqual(c1[0] + xfac, c2[0], 5)
-                        self.assertAlmostEqual(c1[1] + yfac, c2[1], 5)
-
-    @skipUnlessDBFeature('supports_union_aggr')
-    def test_unionagg(self):
-        """
-        Testing the `Union` aggregate.
-        """
-        tx = Country.objects.get(name='Texas').mpoly
-        # Houston, Dallas -- Ordering may differ depending on backend or GEOS version.
-        union = GEOSGeometry('MULTIPOINT(-96.801611 32.782057,-95.363151 29.763374)')
-        qs = City.objects.filter(point__within=tx)
-        with self.assertRaises(ValueError):
-            qs.aggregate(Union('name'))
-        # Aggregate the union in one query and add an explicit ordering in the
-        # other; the ordering should have no effect on the result of an
-        # aggregate over a spatial column.
-        u1 = qs.aggregate(Union('point'))['point__union']
-        u2 = qs.order_by('name').aggregate(Union('point'))['point__union']
-        self.assertTrue(union.equals(u1))
-        self.assertTrue(union.equals(u2))
-        qs = City.objects.filter(name='NotACity')
-        self.assertIsNone(qs.aggregate(Union('point'))['point__union'])
-
-    def test_within_subquery(self):
-        """
-        Using a queryset inside a geo lookup works (via a subquery)
-        (#14483).
-        """
-        tex_cities = City.objects.filter(
-            point__within=Country.objects.filter(name='Texas').values('mpoly')).order_by('name')
-        expected = ['Dallas', 'Houston']
-        if not connection.features.supports_real_shape_operations:
-            expected.append('Oklahoma City')
-        self.assertEqual(
-            list(tex_cities.values_list('name', flat=True)),
-            expected
-        )
-
-    def test_non_concrete_field(self):
-        NonConcreteModel.objects.create(point=Point(0, 0), name='name')
-        list(NonConcreteModel.objects.all())
-
-    def test_values_srid(self):
-        for c, v in zip(City.objects.all(), City.objects.values()):
-            self.assertEqual(c.point.srid, v['point'].srid)

+ 0 - 1241
desktop/core/ext-py/Django-1.11.22/tests/timezones/tests.py

@@ -1,1241 +0,0 @@
-from __future__ import unicode_literals
-
-import datetime
-import re
-import sys
-import warnings
-from contextlib import contextmanager
-from unittest import SkipTest, skipIf
-from xml.dom.minidom import parseString
-
-import pytz
-
-from django.contrib.auth.models import User
-from django.core import serializers
-from django.core.exceptions import ImproperlyConfigured
-from django.db import connection, connections
-from django.db.models import F, Max, Min
-from django.http import HttpRequest
-from django.template import (
-    Context, RequestContext, Template, TemplateSyntaxError, context_processors,
-)
-from django.test import (
-    SimpleTestCase, TestCase, TransactionTestCase, override_settings,
-    skipIfDBFeature, skipUnlessDBFeature,
-)
-from django.test.utils import requires_tz_support
-from django.urls import reverse
-from django.utils import six, timezone
-from django.utils.timezone import timedelta
-
-from .forms import (
-    EventForm, EventLocalizedForm, EventLocalizedModelForm, EventModelForm,
-    EventSplitForm,
-)
-from .models import (
-    AllDayEvent, Event, MaybeEvent, Session, SessionEvent, Timestamp,
-)
-
-# These tests use the EAT (Eastern Africa Time) and ICT (Indochina Time)
-# time zones, which don't have Daylight Saving Time, so we can represent them
-# easily with FixedOffset and use them directly as tzinfo in the constructors.
-
-# settings.TIME_ZONE is forced to EAT. Most tests use a variant of
-# datetime.datetime(2011, 9, 1, 13, 20, 30), which translates to
-# 10:20:30 in UTC and 17:20:30 in ICT.
-
-UTC = timezone.utc
-EAT = timezone.get_fixed_timezone(180)      # Africa/Nairobi
-ICT = timezone.get_fixed_timezone(420)      # Asia/Bangkok
-
-
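A quick sketch of the offsets described in the comment above, using the same helpers (13:20:30 at UTC+3 is 10:20:30 in UTC and 17:20:30 at UTC+7):

    import datetime

    from django.utils import timezone

    EAT = timezone.get_fixed_timezone(180)   # UTC+03:00
    ICT = timezone.get_fixed_timezone(420)   # UTC+07:00

    dt = datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)
    dt.astimezone(timezone.utc)   # 2011-09-01 10:20:30+00:00
    dt.astimezone(ICT)            # 2011-09-01 17:20:30+07:00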
-@override_settings(TIME_ZONE='Africa/Nairobi', USE_TZ=False)
-class LegacyDatabaseTests(TestCase):
-
-    def test_naive_datetime(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        self.assertEqual(event.dt, dt)
-
-    @skipUnlessDBFeature('supports_microsecond_precision')
-    def test_naive_datetime_with_microsecond(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, 405060)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        self.assertEqual(event.dt, dt)
-
-    @skipIfDBFeature('supports_microsecond_precision')
-    def test_naive_datetime_with_microsecond_unsupported(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, 405060)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        # microseconds are lost during a round-trip in the database
-        self.assertEqual(event.dt, dt.replace(microsecond=0))
-
-    @skipUnlessDBFeature('supports_timezones')
-    def test_aware_datetime_in_local_timezone(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        self.assertIsNone(event.dt.tzinfo)
-        # interpret the naive datetime in local time to get the correct value
-        self.assertEqual(event.dt.replace(tzinfo=EAT), dt)
-
-    @skipUnlessDBFeature('supports_timezones')
-    @skipUnlessDBFeature('supports_microsecond_precision')
-    def test_aware_datetime_in_local_timezone_with_microsecond(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, 405060, tzinfo=EAT)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        self.assertIsNone(event.dt.tzinfo)
-        # interpret the naive datetime in local time to get the correct value
-        self.assertEqual(event.dt.replace(tzinfo=EAT), dt)
-
-    # This combination actually never happens.
-    @skipUnlessDBFeature('supports_timezones')
-    @skipIfDBFeature('supports_microsecond_precision')
-    def test_aware_datetime_in_local_timezone_with_microsecond_unsupported(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, 405060, tzinfo=EAT)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        self.assertIsNone(event.dt.tzinfo)
-        # interpret the naive datetime in local time to get the correct value
-        # microseconds are lost during a round-trip in the database
-        self.assertEqual(event.dt.replace(tzinfo=EAT), dt.replace(microsecond=0))
-
-    @skipUnlessDBFeature('supports_timezones')
-    def test_aware_datetime_in_utc(self):
-        dt = datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        self.assertIsNone(event.dt.tzinfo)
-        # interpret the naive datetime in local time to get the correct value
-        self.assertEqual(event.dt.replace(tzinfo=EAT), dt)
-
-    @skipUnlessDBFeature('supports_timezones')
-    def test_aware_datetime_in_other_timezone(self):
-        dt = datetime.datetime(2011, 9, 1, 17, 20, 30, tzinfo=ICT)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        self.assertIsNone(event.dt.tzinfo)
-        # interpret the naive datetime in local time to get the correct value
-        self.assertEqual(event.dt.replace(tzinfo=EAT), dt)
-
-    @skipIfDBFeature('supports_timezones')
-    def test_aware_datetime_unsupported(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)
-        with self.assertRaises(ValueError):
-            Event.objects.create(dt=dt)
-
-    def test_auto_now_and_auto_now_add(self):
-        now = datetime.datetime.now()
-        past = now - datetime.timedelta(seconds=2)
-        future = now + datetime.timedelta(seconds=2)
-        Timestamp.objects.create()
-        ts = Timestamp.objects.get()
-        self.assertLess(past, ts.created)
-        self.assertLess(past, ts.updated)
-        self.assertGreater(future, ts.created)
-        self.assertGreater(future, ts.updated)
-
-    def test_query_filter(self):
-        dt1 = datetime.datetime(2011, 9, 1, 12, 20, 30)
-        dt2 = datetime.datetime(2011, 9, 1, 14, 20, 30)
-        Event.objects.create(dt=dt1)
-        Event.objects.create(dt=dt2)
-        self.assertEqual(Event.objects.filter(dt__gte=dt1).count(), 2)
-        self.assertEqual(Event.objects.filter(dt__gt=dt1).count(), 1)
-        self.assertEqual(Event.objects.filter(dt__gte=dt2).count(), 1)
-        self.assertEqual(Event.objects.filter(dt__gt=dt2).count(), 0)
-
-    def test_query_datetime_lookups(self):
-        Event.objects.create(dt=datetime.datetime(2011, 1, 1, 1, 30, 0))
-        Event.objects.create(dt=datetime.datetime(2011, 1, 1, 4, 30, 0))
-        self.assertEqual(Event.objects.filter(dt__year=2011).count(), 2)
-        self.assertEqual(Event.objects.filter(dt__month=1).count(), 2)
-        self.assertEqual(Event.objects.filter(dt__day=1).count(), 2)
-        self.assertEqual(Event.objects.filter(dt__week_day=7).count(), 2)
-        self.assertEqual(Event.objects.filter(dt__hour=1).count(), 1)
-        self.assertEqual(Event.objects.filter(dt__minute=30).count(), 2)
-        self.assertEqual(Event.objects.filter(dt__second=0).count(), 2)
-
-    def test_query_aggregation(self):
-        # Only min and max make sense for datetimes.
-        Event.objects.create(dt=datetime.datetime(2011, 9, 1, 23, 20, 20))
-        Event.objects.create(dt=datetime.datetime(2011, 9, 1, 13, 20, 30))
-        Event.objects.create(dt=datetime.datetime(2011, 9, 1, 3, 20, 40))
-        result = Event.objects.all().aggregate(Min('dt'), Max('dt'))
-        self.assertEqual(result, {
-            'dt__min': datetime.datetime(2011, 9, 1, 3, 20, 40),
-            'dt__max': datetime.datetime(2011, 9, 1, 23, 20, 20),
-        })
-
-    def test_query_annotation(self):
-        # Only min and max make sense for datetimes.
-        morning = Session.objects.create(name='morning')
-        afternoon = Session.objects.create(name='afternoon')
-        SessionEvent.objects.create(dt=datetime.datetime(2011, 9, 1, 23, 20, 20), session=afternoon)
-        SessionEvent.objects.create(dt=datetime.datetime(2011, 9, 1, 13, 20, 30), session=afternoon)
-        SessionEvent.objects.create(dt=datetime.datetime(2011, 9, 1, 3, 20, 40), session=morning)
-        morning_min_dt = datetime.datetime(2011, 9, 1, 3, 20, 40)
-        afternoon_min_dt = datetime.datetime(2011, 9, 1, 13, 20, 30)
-        self.assertQuerysetEqual(
-            Session.objects.annotate(dt=Min('events__dt')).order_by('dt'),
-            [morning_min_dt, afternoon_min_dt],
-            transform=lambda d: d.dt)
-        self.assertQuerysetEqual(
-            Session.objects.annotate(dt=Min('events__dt')).filter(dt__lt=afternoon_min_dt),
-            [morning_min_dt],
-            transform=lambda d: d.dt)
-        self.assertQuerysetEqual(
-            Session.objects.annotate(dt=Min('events__dt')).filter(dt__gte=afternoon_min_dt),
-            [afternoon_min_dt],
-            transform=lambda d: d.dt)
-
-    def test_query_datetimes(self):
-        Event.objects.create(dt=datetime.datetime(2011, 1, 1, 1, 30, 0))
-        Event.objects.create(dt=datetime.datetime(2011, 1, 1, 4, 30, 0))
-        self.assertSequenceEqual(Event.objects.datetimes('dt', 'year'), [datetime.datetime(2011, 1, 1, 0, 0, 0)])
-        self.assertSequenceEqual(Event.objects.datetimes('dt', 'month'), [datetime.datetime(2011, 1, 1, 0, 0, 0)])
-        self.assertSequenceEqual(Event.objects.datetimes('dt', 'day'), [datetime.datetime(2011, 1, 1, 0, 0, 0)])
-        self.assertSequenceEqual(
-            Event.objects.datetimes('dt', 'hour'),
-            [datetime.datetime(2011, 1, 1, 1, 0, 0),
-             datetime.datetime(2011, 1, 1, 4, 0, 0)]
-        )
-        self.assertSequenceEqual(
-            Event.objects.datetimes('dt', 'minute'),
-            [datetime.datetime(2011, 1, 1, 1, 30, 0),
-             datetime.datetime(2011, 1, 1, 4, 30, 0)]
-        )
-        self.assertSequenceEqual(
-            Event.objects.datetimes('dt', 'second'),
-            [datetime.datetime(2011, 1, 1, 1, 30, 0),
-             datetime.datetime(2011, 1, 1, 4, 30, 0)]
-        )
-
-    def test_raw_sql(self):
-        # Regression test for #17755
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30)
-        event = Event.objects.create(dt=dt)
-        self.assertSequenceEqual(list(Event.objects.raw('SELECT * FROM timezones_event WHERE dt = %s', [dt])), [event])
-
-    def test_cursor_execute_accepts_naive_datetime(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30)
-        with connection.cursor() as cursor:
-            cursor.execute('INSERT INTO timezones_event (dt) VALUES (%s)', [dt])
-        event = Event.objects.get()
-        self.assertEqual(event.dt, dt)
-
-    def test_cursor_execute_returns_naive_datetime(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30)
-        Event.objects.create(dt=dt)
-        with connection.cursor() as cursor:
-            cursor.execute('SELECT dt FROM timezones_event WHERE dt = %s', [dt])
-            self.assertEqual(cursor.fetchall()[0][0], dt)
-
-    def test_filter_date_field_with_aware_datetime(self):
-        # Regression test for #17742
-        day = datetime.date(2011, 9, 1)
-        AllDayEvent.objects.create(day=day)
-        # This is 2011-09-02T01:30:00+03:00 in EAT
-        dt = datetime.datetime(2011, 9, 1, 22, 30, 0, tzinfo=UTC)
-        self.assertTrue(AllDayEvent.objects.filter(day__gte=dt).exists())
-
-
-@override_settings(TIME_ZONE='Africa/Nairobi', USE_TZ=True)
-class NewDatabaseTests(TestCase):
-
-    @requires_tz_support
-    def test_naive_datetime(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30)
-        with warnings.catch_warnings(record=True) as recorded:
-            warnings.simplefilter('always')
-            Event.objects.create(dt=dt)
-            self.assertEqual(len(recorded), 1)
-            msg = str(recorded[0].message)
-            self.assertTrue(msg.startswith("DateTimeField Event.dt received "
-                                           "a naive datetime"))
-        event = Event.objects.get()
-        # naive datetimes are interpreted in local time
-        self.assertEqual(event.dt, dt.replace(tzinfo=EAT))
-
-    @requires_tz_support
-    def test_datetime_from_date(self):
-        dt = datetime.date(2011, 9, 1)
-        with warnings.catch_warnings(record=True) as recorded:
-            warnings.simplefilter('always')
-            Event.objects.create(dt=dt)
-            self.assertEqual(len(recorded), 1)
-            msg = str(recorded[0].message)
-            self.assertTrue(msg.startswith("DateTimeField Event.dt received "
-                                           "a naive datetime"))
-        event = Event.objects.get()
-        self.assertEqual(event.dt, datetime.datetime(2011, 9, 1, tzinfo=EAT))
-
-    @requires_tz_support
-    @skipUnlessDBFeature('supports_microsecond_precision')
-    def test_naive_datetime_with_microsecond(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, 405060)
-        with warnings.catch_warnings(record=True) as recorded:
-            warnings.simplefilter('always')
-            Event.objects.create(dt=dt)
-            self.assertEqual(len(recorded), 1)
-            msg = str(recorded[0].message)
-            self.assertTrue(msg.startswith("DateTimeField Event.dt received "
-                                           "a naive datetime"))
-        event = Event.objects.get()
-        # naive datetimes are interpreted in local time
-        self.assertEqual(event.dt, dt.replace(tzinfo=EAT))
-
-    @requires_tz_support
-    @skipIfDBFeature('supports_microsecond_precision')
-    def test_naive_datetime_with_microsecond_unsupported(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, 405060)
-        with warnings.catch_warnings(record=True) as recorded:
-            warnings.simplefilter('always')
-            Event.objects.create(dt=dt)
-            self.assertEqual(len(recorded), 1)
-            msg = str(recorded[0].message)
-            self.assertTrue(msg.startswith("DateTimeField Event.dt received "
-                                           "a naive datetime"))
-        event = Event.objects.get()
-        # microseconds are lost during a round-trip in the database
-        # naive datetimes are interpreted in local time
-        self.assertEqual(event.dt, dt.replace(microsecond=0, tzinfo=EAT))
-
-    def test_aware_datetime_in_local_timezone(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        self.assertEqual(event.dt, dt)
-
-    @skipUnlessDBFeature('supports_microsecond_precision')
-    def test_aware_datetime_in_local_timezone_with_microsecond(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, 405060, tzinfo=EAT)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        self.assertEqual(event.dt, dt)
-
-    @skipIfDBFeature('supports_microsecond_precision')
-    def test_aware_datetime_in_local_timezone_with_microsecond_unsupported(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, 405060, tzinfo=EAT)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        # microseconds are lost during a round-trip in the database
-        self.assertEqual(event.dt, dt.replace(microsecond=0))
-
-    def test_aware_datetime_in_utc(self):
-        dt = datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        self.assertEqual(event.dt, dt)
-
-    def test_aware_datetime_in_other_timezone(self):
-        dt = datetime.datetime(2011, 9, 1, 17, 20, 30, tzinfo=ICT)
-        Event.objects.create(dt=dt)
-        event = Event.objects.get()
-        self.assertEqual(event.dt, dt)
-
-    def test_auto_now_and_auto_now_add(self):
-        now = timezone.now()
-        past = now - datetime.timedelta(seconds=2)
-        future = now + datetime.timedelta(seconds=2)
-        Timestamp.objects.create()
-        ts = Timestamp.objects.get()
-        self.assertLess(past, ts.created)
-        self.assertLess(past, ts.updated)
-        self.assertGreater(future, ts.created)
-        self.assertGreater(future, ts.updated)
-
-    def test_query_filter(self):
-        dt1 = datetime.datetime(2011, 9, 1, 12, 20, 30, tzinfo=EAT)
-        dt2 = datetime.datetime(2011, 9, 1, 14, 20, 30, tzinfo=EAT)
-        Event.objects.create(dt=dt1)
-        Event.objects.create(dt=dt2)
-        self.assertEqual(Event.objects.filter(dt__gte=dt1).count(), 2)
-        self.assertEqual(Event.objects.filter(dt__gt=dt1).count(), 1)
-        self.assertEqual(Event.objects.filter(dt__gte=dt2).count(), 1)
-        self.assertEqual(Event.objects.filter(dt__gt=dt2).count(), 0)
-
-    def test_query_filter_with_pytz_timezones(self):
-        tz = pytz.timezone('Europe/Paris')
-        dt = datetime.datetime(2011, 9, 1, 12, 20, 30, tzinfo=tz)
-        Event.objects.create(dt=dt)
-        next = dt + datetime.timedelta(seconds=3)
-        prev = dt - datetime.timedelta(seconds=3)
-        self.assertEqual(Event.objects.filter(dt__exact=dt).count(), 1)
-        self.assertEqual(Event.objects.filter(dt__exact=next).count(), 0)
-        self.assertEqual(Event.objects.filter(dt__in=(prev, next)).count(), 0)
-        self.assertEqual(Event.objects.filter(dt__in=(prev, dt, next)).count(), 1)
-        self.assertEqual(Event.objects.filter(dt__range=(prev, next)).count(), 1)
-
-    @requires_tz_support
-    def test_query_filter_with_naive_datetime(self):
-        dt = datetime.datetime(2011, 9, 1, 12, 20, 30, tzinfo=EAT)
-        Event.objects.create(dt=dt)
-        dt = dt.replace(tzinfo=None)
-        with warnings.catch_warnings(record=True) as recorded:
-            warnings.simplefilter('always')
-            # naive datetimes are interpreted in local time
-            self.assertEqual(Event.objects.filter(dt__exact=dt).count(), 1)
-            self.assertEqual(Event.objects.filter(dt__lte=dt).count(), 1)
-            self.assertEqual(Event.objects.filter(dt__gt=dt).count(), 0)
-            self.assertEqual(len(recorded), 3)
-            for warning in recorded:
-                msg = str(warning.message)
-                self.assertTrue(msg.startswith("DateTimeField Event.dt "
-                                               "received a naive datetime"))
-
-    @skipUnlessDBFeature('has_zoneinfo_database')
-    def test_query_datetime_lookups(self):
-        Event.objects.create(dt=datetime.datetime(2011, 1, 1, 1, 30, 0, tzinfo=EAT))
-        Event.objects.create(dt=datetime.datetime(2011, 1, 1, 4, 30, 0, tzinfo=EAT))
-        self.assertEqual(Event.objects.filter(dt__year=2011).count(), 2)
-        self.assertEqual(Event.objects.filter(dt__month=1).count(), 2)
-        self.assertEqual(Event.objects.filter(dt__day=1).count(), 2)
-        self.assertEqual(Event.objects.filter(dt__week_day=7).count(), 2)
-        self.assertEqual(Event.objects.filter(dt__hour=1).count(), 1)
-        self.assertEqual(Event.objects.filter(dt__minute=30).count(), 2)
-        self.assertEqual(Event.objects.filter(dt__second=0).count(), 2)
-
-    @skipUnlessDBFeature('has_zoneinfo_database')
-    def test_query_datetime_lookups_in_other_timezone(self):
-        Event.objects.create(dt=datetime.datetime(2011, 1, 1, 1, 30, 0, tzinfo=EAT))
-        Event.objects.create(dt=datetime.datetime(2011, 1, 1, 4, 30, 0, tzinfo=EAT))
-        with timezone.override(UTC):
-            # These two dates fall in the same day in EAT, but in different days,
-            # years and months in UTC.
-            self.assertEqual(Event.objects.filter(dt__year=2011).count(), 1)
-            self.assertEqual(Event.objects.filter(dt__month=1).count(), 1)
-            self.assertEqual(Event.objects.filter(dt__day=1).count(), 1)
-            self.assertEqual(Event.objects.filter(dt__week_day=7).count(), 1)
-            self.assertEqual(Event.objects.filter(dt__hour=22).count(), 1)
-            self.assertEqual(Event.objects.filter(dt__minute=30).count(), 2)
-            self.assertEqual(Event.objects.filter(dt__second=0).count(), 2)
-
-    def test_query_aggregation(self):
-        # Only min and max make sense for datetimes.
-        Event.objects.create(dt=datetime.datetime(2011, 9, 1, 23, 20, 20, tzinfo=EAT))
-        Event.objects.create(dt=datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT))
-        Event.objects.create(dt=datetime.datetime(2011, 9, 1, 3, 20, 40, tzinfo=EAT))
-        result = Event.objects.all().aggregate(Min('dt'), Max('dt'))
-        self.assertEqual(result, {
-            'dt__min': datetime.datetime(2011, 9, 1, 3, 20, 40, tzinfo=EAT),
-            'dt__max': datetime.datetime(2011, 9, 1, 23, 20, 20, tzinfo=EAT),
-        })
-
-    def test_query_annotation(self):
-        # Only min and max make sense for datetimes.
-        morning = Session.objects.create(name='morning')
-        afternoon = Session.objects.create(name='afternoon')
-        SessionEvent.objects.create(dt=datetime.datetime(2011, 9, 1, 23, 20, 20, tzinfo=EAT), session=afternoon)
-        SessionEvent.objects.create(dt=datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT), session=afternoon)
-        SessionEvent.objects.create(dt=datetime.datetime(2011, 9, 1, 3, 20, 40, tzinfo=EAT), session=morning)
-        morning_min_dt = datetime.datetime(2011, 9, 1, 3, 20, 40, tzinfo=EAT)
-        afternoon_min_dt = datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)
-        self.assertQuerysetEqual(
-            Session.objects.annotate(dt=Min('events__dt')).order_by('dt'),
-            [morning_min_dt, afternoon_min_dt],
-            transform=lambda d: d.dt)
-        self.assertQuerysetEqual(
-            Session.objects.annotate(dt=Min('events__dt')).filter(dt__lt=afternoon_min_dt),
-            [morning_min_dt],
-            transform=lambda d: d.dt)
-        self.assertQuerysetEqual(
-            Session.objects.annotate(dt=Min('events__dt')).filter(dt__gte=afternoon_min_dt),
-            [afternoon_min_dt],
-            transform=lambda d: d.dt)
-
-    @skipUnlessDBFeature('has_zoneinfo_database')
-    def test_query_datetimes(self):
-        Event.objects.create(dt=datetime.datetime(2011, 1, 1, 1, 30, 0, tzinfo=EAT))
-        Event.objects.create(dt=datetime.datetime(2011, 1, 1, 4, 30, 0, tzinfo=EAT))
-        self.assertSequenceEqual(
-            Event.objects.datetimes('dt', 'year'),
-            [datetime.datetime(2011, 1, 1, 0, 0, 0, tzinfo=EAT)]
-        )
-        self.assertSequenceEqual(
-            Event.objects.datetimes('dt', 'month'),
-            [datetime.datetime(2011, 1, 1, 0, 0, 0, tzinfo=EAT)]
-        )
-        self.assertSequenceEqual(
-            Event.objects.datetimes('dt', 'day'),
-            [datetime.datetime(2011, 1, 1, 0, 0, 0, tzinfo=EAT)]
-        )
-        self.assertSequenceEqual(
-            Event.objects.datetimes('dt', 'hour'),
-            [datetime.datetime(2011, 1, 1, 1, 0, 0, tzinfo=EAT),
-             datetime.datetime(2011, 1, 1, 4, 0, 0, tzinfo=EAT)]
-        )
-        self.assertSequenceEqual(
-            Event.objects.datetimes('dt', 'minute'),
-            [datetime.datetime(2011, 1, 1, 1, 30, 0, tzinfo=EAT),
-             datetime.datetime(2011, 1, 1, 4, 30, 0, tzinfo=EAT)]
-        )
-        self.assertSequenceEqual(
-            Event.objects.datetimes('dt', 'second'),
-            [datetime.datetime(2011, 1, 1, 1, 30, 0, tzinfo=EAT),
-             datetime.datetime(2011, 1, 1, 4, 30, 0, tzinfo=EAT)]
-        )
-
-    @skipUnlessDBFeature('has_zoneinfo_database')
-    def test_query_datetimes_in_other_timezone(self):
-        Event.objects.create(dt=datetime.datetime(2011, 1, 1, 1, 30, 0, tzinfo=EAT))
-        Event.objects.create(dt=datetime.datetime(2011, 1, 1, 4, 30, 0, tzinfo=EAT))
-        with timezone.override(UTC):
-            self.assertSequenceEqual(
-                Event.objects.datetimes('dt', 'year'),
-                [datetime.datetime(2010, 1, 1, 0, 0, 0, tzinfo=UTC),
-                 datetime.datetime(2011, 1, 1, 0, 0, 0, tzinfo=UTC)]
-            )
-            self.assertSequenceEqual(
-                Event.objects.datetimes('dt', 'month'),
-                [datetime.datetime(2010, 12, 1, 0, 0, 0, tzinfo=UTC),
-                 datetime.datetime(2011, 1, 1, 0, 0, 0, tzinfo=UTC)]
-            )
-            self.assertSequenceEqual(
-                Event.objects.datetimes('dt', 'day'),
-                [datetime.datetime(2010, 12, 31, 0, 0, 0, tzinfo=UTC),
-                 datetime.datetime(2011, 1, 1, 0, 0, 0, tzinfo=UTC)]
-            )
-            self.assertSequenceEqual(
-                Event.objects.datetimes('dt', 'hour'),
-                [datetime.datetime(2010, 12, 31, 22, 0, 0, tzinfo=UTC),
-                 datetime.datetime(2011, 1, 1, 1, 0, 0, tzinfo=UTC)]
-            )
-            self.assertSequenceEqual(
-                Event.objects.datetimes('dt', 'minute'),
-                [datetime.datetime(2010, 12, 31, 22, 30, 0, tzinfo=UTC),
-                 datetime.datetime(2011, 1, 1, 1, 30, 0, tzinfo=UTC)]
-            )
-            self.assertSequenceEqual(
-                Event.objects.datetimes('dt', 'second'),
-                [datetime.datetime(2010, 12, 31, 22, 30, 0, tzinfo=UTC),
-                 datetime.datetime(2011, 1, 1, 1, 30, 0, tzinfo=UTC)]
-            )
-
-    def test_raw_sql(self):
-        # Regression test for #17755
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)
-        event = Event.objects.create(dt=dt)
-        self.assertSequenceEqual(list(Event.objects.raw('SELECT * FROM timezones_event WHERE dt = %s', [dt])), [event])
-
-    @skipUnlessDBFeature('supports_timezones')
-    def test_cursor_execute_accepts_aware_datetime(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)
-        with connection.cursor() as cursor:
-            cursor.execute('INSERT INTO timezones_event (dt) VALUES (%s)', [dt])
-        event = Event.objects.get()
-        self.assertEqual(event.dt, dt)
-
-    @skipIfDBFeature('supports_timezones')
-    def test_cursor_execute_accepts_naive_datetime(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)
-        utc_naive_dt = timezone.make_naive(dt, timezone.utc)
-        with connection.cursor() as cursor:
-            cursor.execute('INSERT INTO timezones_event (dt) VALUES (%s)', [utc_naive_dt])
-        event = Event.objects.get()
-        self.assertEqual(event.dt, dt)
-
-    @skipUnlessDBFeature('supports_timezones')
-    def test_cursor_execute_returns_aware_datetime(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)
-        Event.objects.create(dt=dt)
-        with connection.cursor() as cursor:
-            cursor.execute('SELECT dt FROM timezones_event WHERE dt = %s', [dt])
-            self.assertEqual(cursor.fetchall()[0][0], dt)
-
-    @skipIfDBFeature('supports_timezones')
-    def test_cursor_execute_returns_naive_datetime(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)
-        utc_naive_dt = timezone.make_naive(dt, timezone.utc)
-        Event.objects.create(dt=dt)
-        with connection.cursor() as cursor:
-            cursor.execute('SELECT dt FROM timezones_event WHERE dt = %s', [utc_naive_dt])
-            self.assertEqual(cursor.fetchall()[0][0], utc_naive_dt)
-
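The two cursor tests above shift aware values to naive UTC by hand before talking to backends without time zone support; a minimal sketch of the helpers involved:

    import datetime

    from django.utils import timezone

    EAT = timezone.get_fixed_timezone(180)
    dt = datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)

    # Shift to UTC and drop tzinfo -- this is the value sent to (and read back
    # from) the raw cursor in the tests above.
    naive_utc = timezone.make_naive(dt, timezone.utc)   # datetime.datetime(2011, 9, 1, 10, 20, 30)
    # The inverse re-attaches a zone to the naive value.
    timezone.make_aware(naive_utc, timezone.utc) == dt  # True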
-    @requires_tz_support
-    def test_filter_date_field_with_aware_datetime(self):
-        # Regression test for #17742
-        day = datetime.date(2011, 9, 1)
-        AllDayEvent.objects.create(day=day)
-        # This is 2011-09-02T01:30:00+03:00 in EAT
-        dt = datetime.datetime(2011, 9, 1, 22, 30, 0, tzinfo=UTC)
-        self.assertFalse(AllDayEvent.objects.filter(day__gte=dt).exists())
-
-    def test_null_datetime(self):
-        # Regression test for #17294
-        e = MaybeEvent.objects.create()
-        self.assertIsNone(e.dt)
-
-    def test_update_with_timedelta(self):
-        initial_dt = timezone.now().replace(microsecond=0)
-        event = Event.objects.create(dt=initial_dt)
-        Event.objects.update(dt=F('dt') + timedelta(hours=2))
-        event.refresh_from_db()
-        self.assertEqual(event.dt, initial_dt + timedelta(hours=2))
-
-
-@override_settings(TIME_ZONE='Africa/Nairobi', USE_TZ=True)
-class ForcedTimeZoneDatabaseTests(TransactionTestCase):
-    """
-    Test the TIME_ZONE database configuration parameter.
-
-    Since this involves reading and writing to the same database through two
-    connections, this is a TransactionTestCase.
-    """
-
-    available_apps = ['timezones']
-
-    @classmethod
-    def setUpClass(cls):
-        # @skipIfDBFeature and @skipUnlessDBFeature cannot be chained. The
-        # outermost takes precedence. Handle skipping manually instead.
-        if connection.features.supports_timezones:
-            raise SkipTest("Database has feature(s) supports_timezones")
-        if not connection.features.test_db_allows_multiple_connections:
-            raise SkipTest("Database doesn't support feature(s): test_db_allows_multiple_connections")
-
-        super(ForcedTimeZoneDatabaseTests, cls).setUpClass()
-
-    @contextmanager
-    def override_database_connection_timezone(self, timezone):
-        try:
-            orig_timezone = connection.settings_dict['TIME_ZONE']
-            connection.settings_dict['TIME_ZONE'] = timezone
-            # Clear cached properties, after first accessing them to ensure they exist.
-            connection.timezone
-            del connection.timezone
-            connection.timezone_name
-            del connection.timezone_name
-
-            yield
-
-        finally:
-            connection.settings_dict['TIME_ZONE'] = orig_timezone
-            # Clear cached properties, after first accessing them to ensure they exist.
-            connection.timezone
-            del connection.timezone
-            connection.timezone_name
-            del connection.timezone_name
-
-    def test_read_datetime(self):
-        fake_dt = datetime.datetime(2011, 9, 1, 17, 20, 30, tzinfo=UTC)
-        Event.objects.create(dt=fake_dt)
-
-        with self.override_database_connection_timezone('Asia/Bangkok'):
-            event = Event.objects.get()
-            dt = datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC)
-        self.assertEqual(event.dt, dt)
-
-    def test_write_datetime(self):
-        dt = datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC)
-        with self.override_database_connection_timezone('Asia/Bangkok'):
-            Event.objects.create(dt=dt)
-
-        event = Event.objects.get()
-        fake_dt = datetime.datetime(2011, 9, 1, 17, 20, 30, tzinfo=UTC)
-        self.assertEqual(event.dt, fake_dt)
-
-
-@skipUnlessDBFeature('supports_timezones')
-@override_settings(TIME_ZONE='Africa/Nairobi', USE_TZ=True)
-class UnsupportedTimeZoneDatabaseTests(TestCase):
-
-    def test_time_zone_parameter_not_supported_if_database_supports_timezone(self):
-        connections.databases['tz'] = connections.databases['default'].copy()
-        connections.databases['tz']['TIME_ZONE'] = 'Asia/Bangkok'
-        tz_conn = connections['tz']
-        try:
-            with self.assertRaises(ImproperlyConfigured):
-                tz_conn.cursor()
-        finally:
-            connections['tz'].close()       # in case the test fails
-            del connections['tz']
-            del connections.databases['tz']
-
-
-@override_settings(TIME_ZONE='Africa/Nairobi')
-class SerializationTests(SimpleTestCase):
-
-    # Backend-specific notes:
-    # - JSON supports only milliseconds; microseconds will be truncated.
-    # - PyYAML dumps the UTC offset correctly for timezone-aware datetimes,
-    #   but when it loads this representation, it subtracts the offset and
-    #   returns a naive datetime object in UTC. See ticket #18867.
-    # Tests are adapted to take these quirks into account.
-
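A small illustration of the first quirk, using DjangoJSONEncoder (the encoder the JSON serializer relies on) directly:

    import datetime
    import json

    from django.core.serializers.json import DjangoJSONEncoder

    dt = datetime.datetime(2011, 9, 1, 13, 20, 30, 405060)
    json.dumps(dt, cls=DjangoJSONEncoder)
    # '"2011-09-01T13:20:30.405"' -- the 405060 microseconds are truncated to milliseconds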
-    def assert_python_contains_datetime(self, objects, dt):
-        self.assertEqual(objects[0]['fields']['dt'], dt)
-
-    def assert_json_contains_datetime(self, json, dt):
-        self.assertIn('"fields": {"dt": "%s"}' % dt, json)
-
-    def assert_xml_contains_datetime(self, xml, dt):
-        field = parseString(xml).getElementsByTagName('field')[0]
-        self.assertXMLEqual(field.childNodes[0].wholeText, dt)
-
-    def assert_yaml_contains_datetime(self, yaml, dt):
-        # Depending on the yaml dumper, '!timestamp' might be absent
-        self.assertRegex(yaml, r"\n  fields: {dt: !(!timestamp)? '%s'}" % re.escape(dt))
-
-    def test_naive_datetime(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30)
-
-        data = serializers.serialize('python', [Event(dt=dt)])
-        self.assert_python_contains_datetime(data, dt)
-        obj = next(serializers.deserialize('python', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        data = serializers.serialize('json', [Event(dt=dt)])
-        self.assert_json_contains_datetime(data, "2011-09-01T13:20:30")
-        obj = next(serializers.deserialize('json', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        data = serializers.serialize('xml', [Event(dt=dt)])
-        self.assert_xml_contains_datetime(data, "2011-09-01T13:20:30")
-        obj = next(serializers.deserialize('xml', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        if not isinstance(serializers.get_serializer('yaml'), serializers.BadSerializer):
-            data = serializers.serialize('yaml', [Event(dt=dt)], default_flow_style=None)
-            self.assert_yaml_contains_datetime(data, "2011-09-01 13:20:30")
-            obj = next(serializers.deserialize('yaml', data)).object
-            self.assertEqual(obj.dt, dt)
-
-    def test_naive_datetime_with_microsecond(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, 405060)
-
-        data = serializers.serialize('python', [Event(dt=dt)])
-        self.assert_python_contains_datetime(data, dt)
-        obj = next(serializers.deserialize('python', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        data = serializers.serialize('json', [Event(dt=dt)])
-        self.assert_json_contains_datetime(data, "2011-09-01T13:20:30.405")
-        obj = next(serializers.deserialize('json', data)).object
-        self.assertEqual(obj.dt, dt.replace(microsecond=405000))
-
-        data = serializers.serialize('xml', [Event(dt=dt)])
-        self.assert_xml_contains_datetime(data, "2011-09-01T13:20:30.405060")
-        obj = next(serializers.deserialize('xml', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        if not isinstance(serializers.get_serializer('yaml'), serializers.BadSerializer):
-            data = serializers.serialize('yaml', [Event(dt=dt)], default_flow_style=None)
-            self.assert_yaml_contains_datetime(data, "2011-09-01 13:20:30.405060")
-            obj = next(serializers.deserialize('yaml', data)).object
-            self.assertEqual(obj.dt, dt)
-
-    def test_aware_datetime_with_microsecond(self):
-        dt = datetime.datetime(2011, 9, 1, 17, 20, 30, 405060, tzinfo=ICT)
-
-        data = serializers.serialize('python', [Event(dt=dt)])
-        self.assert_python_contains_datetime(data, dt)
-        obj = next(serializers.deserialize('python', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        data = serializers.serialize('json', [Event(dt=dt)])
-        self.assert_json_contains_datetime(data, "2011-09-01T17:20:30.405+07:00")
-        obj = next(serializers.deserialize('json', data)).object
-        self.assertEqual(obj.dt, dt.replace(microsecond=405000))
-
-        data = serializers.serialize('xml', [Event(dt=dt)])
-        self.assert_xml_contains_datetime(data, "2011-09-01T17:20:30.405060+07:00")
-        obj = next(serializers.deserialize('xml', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        if not isinstance(serializers.get_serializer('yaml'), serializers.BadSerializer):
-            data = serializers.serialize('yaml', [Event(dt=dt)], default_flow_style=None)
-            self.assert_yaml_contains_datetime(data, "2011-09-01 17:20:30.405060+07:00")
-            obj = next(serializers.deserialize('yaml', data)).object
-            self.assertEqual(obj.dt.replace(tzinfo=UTC), dt)
-
-    def test_aware_datetime_in_utc(self):
-        dt = datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC)
-
-        data = serializers.serialize('python', [Event(dt=dt)])
-        self.assert_python_contains_datetime(data, dt)
-        obj = next(serializers.deserialize('python', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        data = serializers.serialize('json', [Event(dt=dt)])
-        self.assert_json_contains_datetime(data, "2011-09-01T10:20:30Z")
-        obj = next(serializers.deserialize('json', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        data = serializers.serialize('xml', [Event(dt=dt)])
-        self.assert_xml_contains_datetime(data, "2011-09-01T10:20:30+00:00")
-        obj = next(serializers.deserialize('xml', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        if not isinstance(serializers.get_serializer('yaml'), serializers.BadSerializer):
-            data = serializers.serialize('yaml', [Event(dt=dt)], default_flow_style=None)
-            self.assert_yaml_contains_datetime(data, "2011-09-01 10:20:30+00:00")
-            obj = next(serializers.deserialize('yaml', data)).object
-            self.assertEqual(obj.dt.replace(tzinfo=UTC), dt)
-
-    def test_aware_datetime_in_local_timezone(self):
-        dt = datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)
-
-        data = serializers.serialize('python', [Event(dt=dt)])
-        self.assert_python_contains_datetime(data, dt)
-        obj = next(serializers.deserialize('python', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        data = serializers.serialize('json', [Event(dt=dt)])
-        self.assert_json_contains_datetime(data, "2011-09-01T13:20:30+03:00")
-        obj = next(serializers.deserialize('json', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        data = serializers.serialize('xml', [Event(dt=dt)])
-        self.assert_xml_contains_datetime(data, "2011-09-01T13:20:30+03:00")
-        obj = next(serializers.deserialize('xml', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        if not isinstance(serializers.get_serializer('yaml'), serializers.BadSerializer):
-            data = serializers.serialize('yaml', [Event(dt=dt)], default_flow_style=None)
-            self.assert_yaml_contains_datetime(data, "2011-09-01 13:20:30+03:00")
-            obj = next(serializers.deserialize('yaml', data)).object
-            self.assertEqual(obj.dt.replace(tzinfo=UTC), dt)
-
-    def test_aware_datetime_in_other_timezone(self):
-        dt = datetime.datetime(2011, 9, 1, 17, 20, 30, tzinfo=ICT)
-
-        data = serializers.serialize('python', [Event(dt=dt)])
-        self.assert_python_contains_datetime(data, dt)
-        obj = next(serializers.deserialize('python', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        data = serializers.serialize('json', [Event(dt=dt)])
-        self.assert_json_contains_datetime(data, "2011-09-01T17:20:30+07:00")
-        obj = next(serializers.deserialize('json', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        data = serializers.serialize('xml', [Event(dt=dt)])
-        self.assert_xml_contains_datetime(data, "2011-09-01T17:20:30+07:00")
-        obj = next(serializers.deserialize('xml', data)).object
-        self.assertEqual(obj.dt, dt)
-
-        if not isinstance(serializers.get_serializer('yaml'), serializers.BadSerializer):
-            data = serializers.serialize('yaml', [Event(dt=dt)], default_flow_style=None)
-            self.assert_yaml_contains_datetime(data, "2011-09-01 17:20:30+07:00")
-            obj = next(serializers.deserialize('yaml', data)).object
-            self.assertEqual(obj.dt.replace(tzinfo=UTC), dt)
-
-
-@override_settings(DATETIME_FORMAT='c', TIME_ZONE='Africa/Nairobi', USE_L10N=False, USE_TZ=True)
-class TemplateTests(SimpleTestCase):
-
-    @requires_tz_support
-    def test_localtime_templatetag_and_filters(self):
-        """
-        Test the {% localtime %} templatetag and related filters.
-        """
-        datetimes = {
-            'utc': datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC),
-            'eat': datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT),
-            'ict': datetime.datetime(2011, 9, 1, 17, 20, 30, tzinfo=ICT),
-            'naive': datetime.datetime(2011, 9, 1, 13, 20, 30),
-        }
-        templates = {
-            'notag': Template("{% load tz %}{{ dt }}|{{ dt|localtime }}|{{ dt|utc }}|{{ dt|timezone:ICT }}"),
-            'noarg': Template(
-                "{% load tz %}{% localtime %}{{ dt }}|{{ dt|localtime }}|"
-                "{{ dt|utc }}|{{ dt|timezone:ICT }}{% endlocaltime %}"
-            ),
-            'on': Template(
-                "{% load tz %}{% localtime on %}{{ dt }}|{{ dt|localtime }}|"
-                "{{ dt|utc }}|{{ dt|timezone:ICT }}{% endlocaltime %}"
-            ),
-            'off': Template(
-                "{% load tz %}{% localtime off %}{{ dt }}|{{ dt|localtime }}|"
-                "{{ dt|utc }}|{{ dt|timezone:ICT }}{% endlocaltime %}"
-            ),
-        }
-
-        # Transform a list of keys in 'datetimes' to the expected template
-        # output. This makes the definition of 'results' more readable.
-        def t(*result):
-            return '|'.join(datetimes[key].isoformat() for key in result)
-
-        # Results for USE_TZ = True
-
-        results = {
-            'utc': {
-                'notag': t('eat', 'eat', 'utc', 'ict'),
-                'noarg': t('eat', 'eat', 'utc', 'ict'),
-                'on': t('eat', 'eat', 'utc', 'ict'),
-                'off': t('utc', 'eat', 'utc', 'ict'),
-            },
-            'eat': {
-                'notag': t('eat', 'eat', 'utc', 'ict'),
-                'noarg': t('eat', 'eat', 'utc', 'ict'),
-                'on': t('eat', 'eat', 'utc', 'ict'),
-                'off': t('eat', 'eat', 'utc', 'ict'),
-            },
-            'ict': {
-                'notag': t('eat', 'eat', 'utc', 'ict'),
-                'noarg': t('eat', 'eat', 'utc', 'ict'),
-                'on': t('eat', 'eat', 'utc', 'ict'),
-                'off': t('ict', 'eat', 'utc', 'ict'),
-            },
-            'naive': {
-                'notag': t('naive', 'eat', 'utc', 'ict'),
-                'noarg': t('naive', 'eat', 'utc', 'ict'),
-                'on': t('naive', 'eat', 'utc', 'ict'),
-                'off': t('naive', 'eat', 'utc', 'ict'),
-            }
-        }
-
-        for k1, dt in six.iteritems(datetimes):
-            for k2, tpl in six.iteritems(templates):
-                ctx = Context({'dt': dt, 'ICT': ICT})
-                actual = tpl.render(ctx)
-                expected = results[k1][k2]
-                self.assertEqual(actual, expected, '%s / %s: %r != %r' % (k1, k2, actual, expected))
-
-        # Changes for USE_TZ = False
-
-        results['utc']['notag'] = t('utc', 'eat', 'utc', 'ict')
-        results['ict']['notag'] = t('ict', 'eat', 'utc', 'ict')
-
-        with self.settings(USE_TZ=False):
-            for k1, dt in six.iteritems(datetimes):
-                for k2, tpl in six.iteritems(templates):
-                    ctx = Context({'dt': dt, 'ICT': ICT})
-                    actual = tpl.render(ctx)
-                    expected = results[k1][k2]
-                    self.assertEqual(actual, expected, '%s / %s: %r != %r' % (k1, k2, actual, expected))
-
-    def test_localtime_filters_with_pytz(self):
-        """
-        Test the |localtime, |utc, and |timezone filters with pytz.
-        """
-        # Use a pytz timezone as local time
-        tpl = Template("{% load tz %}{{ dt|localtime }}|{{ dt|utc }}")
-        ctx = Context({'dt': datetime.datetime(2011, 9, 1, 12, 20, 30)})
-
-        with self.settings(TIME_ZONE='Europe/Paris'):
-            self.assertEqual(tpl.render(ctx), "2011-09-01T12:20:30+02:00|2011-09-01T10:20:30+00:00")
-
-        # Use a pytz timezone as argument
-        tpl = Template("{% load tz %}{{ dt|timezone:tz }}")
-        ctx = Context({'dt': datetime.datetime(2011, 9, 1, 13, 20, 30),
-                       'tz': pytz.timezone('Europe/Paris')})
-        self.assertEqual(tpl.render(ctx), "2011-09-01T12:20:30+02:00")
-
-        # Use a pytz timezone name as argument
-        tpl = Template("{% load tz %}{{ dt|timezone:'Europe/Paris' }}")
-        ctx = Context({'dt': datetime.datetime(2011, 9, 1, 13, 20, 30),
-                       'tz': pytz.timezone('Europe/Paris')})
-        self.assertEqual(tpl.render(ctx), "2011-09-01T12:20:30+02:00")
-
-    def test_localtime_templatetag_invalid_argument(self):
-        with self.assertRaises(TemplateSyntaxError):
-            Template("{% load tz %}{% localtime foo %}{% endlocaltime %}").render()
-
-    def test_localtime_filters_do_not_raise_exceptions(self):
-        """
-        Test the |localtime, |utc, and |timezone filters on bad inputs.
-        """
-        tpl = Template("{% load tz %}{{ dt }}|{{ dt|localtime }}|{{ dt|utc }}|{{ dt|timezone:tz }}")
-        with self.settings(USE_TZ=True):
-            # bad datetime value
-            ctx = Context({'dt': None, 'tz': ICT})
-            self.assertEqual(tpl.render(ctx), "None|||")
-            ctx = Context({'dt': 'not a date', 'tz': ICT})
-            self.assertEqual(tpl.render(ctx), "not a date|||")
-            # bad timezone value
-            tpl = Template("{% load tz %}{{ dt|timezone:tz }}")
-            ctx = Context({'dt': datetime.datetime(2011, 9, 1, 13, 20, 30), 'tz': None})
-            self.assertEqual(tpl.render(ctx), "")
-            ctx = Context({'dt': datetime.datetime(2011, 9, 1, 13, 20, 30), 'tz': 'not a tz'})
-            self.assertEqual(tpl.render(ctx), "")
-
-    @requires_tz_support
-    def test_timezone_templatetag(self):
-        """
-        Test the {% timezone %} templatetag.
-        """
-        tpl = Template(
-            "{% load tz %}"
-            "{{ dt }}|"
-            "{% timezone tz1 %}"
-            "{{ dt }}|"
-            "{% timezone tz2 %}"
-            "{{ dt }}"
-            "{% endtimezone %}"
-            "{% endtimezone %}"
-        )
-        ctx = Context({'dt': datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC),
-                       'tz1': ICT, 'tz2': None})
-        self.assertEqual(
-            tpl.render(ctx),
-            "2011-09-01T13:20:30+03:00|2011-09-01T17:20:30+07:00|2011-09-01T13:20:30+03:00"
-        )
-
-    def test_timezone_templatetag_with_pytz(self):
-        """
-        Test the {% timezone %} templatetag with pytz.
-        """
-        tpl = Template("{% load tz %}{% timezone tz %}{{ dt }}{% endtimezone %}")
-
-        # Use a pytz timezone as argument
-        ctx = Context({'dt': datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT),
-                       'tz': pytz.timezone('Europe/Paris')})
-        self.assertEqual(tpl.render(ctx), "2011-09-01T12:20:30+02:00")
-
-        # Use a pytz timezone name as argument
-        ctx = Context({'dt': datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT),
-                       'tz': 'Europe/Paris'})
-        self.assertEqual(tpl.render(ctx), "2011-09-01T12:20:30+02:00")
-
-    def test_timezone_templatetag_invalid_argument(self):
-        with self.assertRaises(TemplateSyntaxError):
-            Template("{% load tz %}{% timezone %}{% endtimezone %}").render()
-        with self.assertRaises(pytz.UnknownTimeZoneError):
-            Template("{% load tz %}{% timezone tz %}{% endtimezone %}").render(Context({'tz': 'foobar'}))
-
-    @skipIf(sys.platform.startswith('win'), "Windows uses non-standard time zone names")
-    def test_get_current_timezone_templatetag(self):
-        """
-        Test the {% get_current_timezone %} templatetag.
-        """
-        tpl = Template("{% load tz %}{% get_current_timezone as time_zone %}{{ time_zone }}")
-
-        self.assertEqual(tpl.render(Context()), "Africa/Nairobi")
-        with timezone.override(UTC):
-            self.assertEqual(tpl.render(Context()), "UTC")
-
-        tpl = Template(
-            "{% load tz %}{% timezone tz %}{% get_current_timezone as time_zone %}"
-            "{% endtimezone %}{{ time_zone }}"
-        )
-
-        self.assertEqual(tpl.render(Context({'tz': ICT})), "+0700")
-        with timezone.override(UTC):
-            self.assertEqual(tpl.render(Context({'tz': ICT})), "+0700")
-
-    def test_get_current_timezone_templatetag_with_pytz(self):
-        """
-        Test the {% get_current_timezone %} templatetag with pytz.
-        """
-        tpl = Template("{% load tz %}{% get_current_timezone as time_zone %}{{ time_zone }}")
-        with timezone.override(pytz.timezone('Europe/Paris')):
-            self.assertEqual(tpl.render(Context()), "Europe/Paris")
-
-        tpl = Template(
-            "{% load tz %}{% timezone 'Europe/Paris' %}"
-            "{% get_current_timezone as time_zone %}{% endtimezone %}"
-            "{{ time_zone }}"
-        )
-        self.assertEqual(tpl.render(Context()), "Europe/Paris")
-
-    def test_get_current_timezone_templatetag_invalid_argument(self):
-        with self.assertRaises(TemplateSyntaxError):
-            Template("{% load tz %}{% get_current_timezone %}").render()
-
-    @skipIf(sys.platform.startswith('win'), "Windows uses non-standard time zone names")
-    def test_tz_template_context_processor(self):
-        """
-        Test the django.template.context_processors.tz template context processor.
-        """
-        tpl = Template("{{ TIME_ZONE }}")
-        context = Context()
-        self.assertEqual(tpl.render(context), "")
-        request_context = RequestContext(HttpRequest(), processors=[context_processors.tz])
-        self.assertEqual(tpl.render(request_context), "Africa/Nairobi")
-
-    @requires_tz_support
-    def test_date_and_time_template_filters(self):
-        tpl = Template("{{ dt|date:'Y-m-d' }} at {{ dt|time:'H:i:s' }}")
-        ctx = Context({'dt': datetime.datetime(2011, 9, 1, 20, 20, 20, tzinfo=UTC)})
-        self.assertEqual(tpl.render(ctx), "2011-09-01 at 23:20:20")
-        with timezone.override(ICT):
-            self.assertEqual(tpl.render(ctx), "2011-09-02 at 03:20:20")
-
-    def test_date_and_time_template_filters_honor_localtime(self):
-        tpl = Template(
-            "{% load tz %}{% localtime off %}{{ dt|date:'Y-m-d' }} at "
-            "{{ dt|time:'H:i:s' }}{% endlocaltime %}"
-        )
-        ctx = Context({'dt': datetime.datetime(2011, 9, 1, 20, 20, 20, tzinfo=UTC)})
-        self.assertEqual(tpl.render(ctx), "2011-09-01 at 20:20:20")
-        with timezone.override(ICT):
-            self.assertEqual(tpl.render(ctx), "2011-09-01 at 20:20:20")
-
-    @requires_tz_support
-    def test_now_template_tag_uses_current_time_zone(self):
-        # Regression for #17343
-        tpl = Template("{% now \"O\" %}")
-        self.assertEqual(tpl.render(Context({})), "+0300")
-        with timezone.override(ICT):
-            self.assertEqual(tpl.render(Context({})), "+0700")
-
-
-@override_settings(DATETIME_FORMAT='c', TIME_ZONE='Africa/Nairobi', USE_L10N=False, USE_TZ=False)
-class LegacyFormsTests(TestCase):
-
-    def test_form(self):
-        form = EventForm({'dt': '2011-09-01 13:20:30'})
-        self.assertTrue(form.is_valid())
-        self.assertEqual(form.cleaned_data['dt'], datetime.datetime(2011, 9, 1, 13, 20, 30))
-
-    def test_form_with_non_existent_time(self):
-        form = EventForm({'dt': '2011-03-27 02:30:00'})
-        with timezone.override(pytz.timezone('Europe/Paris')):
-            # this is obviously a bug
-            self.assertTrue(form.is_valid())
-            self.assertEqual(form.cleaned_data['dt'], datetime.datetime(2011, 3, 27, 2, 30, 0))
-
-    def test_form_with_ambiguous_time(self):
-        form = EventForm({'dt': '2011-10-30 02:30:00'})
-        with timezone.override(pytz.timezone('Europe/Paris')):
-            # this is obviously a bug
-            self.assertTrue(form.is_valid())
-            self.assertEqual(form.cleaned_data['dt'], datetime.datetime(2011, 10, 30, 2, 30, 0))
-
-    def test_split_form(self):
-        form = EventSplitForm({'dt_0': '2011-09-01', 'dt_1': '13:20:30'})
-        self.assertTrue(form.is_valid())
-        self.assertEqual(form.cleaned_data['dt'], datetime.datetime(2011, 9, 1, 13, 20, 30))
-
-    def test_model_form(self):
-        EventModelForm({'dt': '2011-09-01 13:20:30'}).save()
-        e = Event.objects.get()
-        self.assertEqual(e.dt, datetime.datetime(2011, 9, 1, 13, 20, 30))
-
-
-@override_settings(DATETIME_FORMAT='c', TIME_ZONE='Africa/Nairobi', USE_L10N=False, USE_TZ=True)
-class NewFormsTests(TestCase):
-
-    @requires_tz_support
-    def test_form(self):
-        form = EventForm({'dt': '2011-09-01 13:20:30'})
-        self.assertTrue(form.is_valid())
-        self.assertEqual(form.cleaned_data['dt'], datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC))
-
-    def test_form_with_other_timezone(self):
-        form = EventForm({'dt': '2011-09-01 17:20:30'})
-        with timezone.override(ICT):
-            self.assertTrue(form.is_valid())
-            self.assertEqual(form.cleaned_data['dt'], datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC))
-
-    def test_form_with_explicit_timezone(self):
-        form = EventForm({'dt': '2011-09-01 17:20:30+07:00'})
-        # Datetime inputs formats don't allow providing a time zone.
-        self.assertFalse(form.is_valid())
-
-    def test_form_with_non_existent_time(self):
-        with timezone.override(pytz.timezone('Europe/Paris')):
-            form = EventForm({'dt': '2011-03-27 02:30:00'})
-            self.assertFalse(form.is_valid())
-            self.assertEqual(
-                form.errors['dt'], [
-                    "2011-03-27 02:30:00 couldn't be interpreted in time zone "
-                    "Europe/Paris; it may be ambiguous or it may not exist."
-                ]
-            )
-
-    def test_form_with_ambiguous_time(self):
-        with timezone.override(pytz.timezone('Europe/Paris')):
-            form = EventForm({'dt': '2011-10-30 02:30:00'})
-            self.assertFalse(form.is_valid())
-            self.assertEqual(
-                form.errors['dt'], [
-                    "2011-10-30 02:30:00 couldn't be interpreted in time zone "
-                    "Europe/Paris; it may be ambiguous or it may not exist."
-                ]
-            )
-
-    @requires_tz_support
-    def test_split_form(self):
-        form = EventSplitForm({'dt_0': '2011-09-01', 'dt_1': '13:20:30'})
-        self.assertTrue(form.is_valid())
-        self.assertEqual(form.cleaned_data['dt'], datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC))
-
-    @requires_tz_support
-    def test_localized_form(self):
-        form = EventLocalizedForm(initial={'dt': datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)})
-        with timezone.override(ICT):
-            self.assertIn("2011-09-01 17:20:30", str(form))
-
-    @requires_tz_support
-    def test_model_form(self):
-        EventModelForm({'dt': '2011-09-01 13:20:30'}).save()
-        e = Event.objects.get()
-        self.assertEqual(e.dt, datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC))
-
-    @requires_tz_support
-    def test_localized_model_form(self):
-        form = EventLocalizedModelForm(instance=Event(dt=datetime.datetime(2011, 9, 1, 13, 20, 30, tzinfo=EAT)))
-        with timezone.override(ICT):
-            self.assertIn("2011-09-01 17:20:30", str(form))
-
-
-@override_settings(
-    DATETIME_FORMAT='c',
-    TIME_ZONE='Africa/Nairobi',
-    USE_L10N=False,
-    USE_TZ=True,
-    ROOT_URLCONF='timezones.urls',
-)
-class AdminTests(TestCase):
-
-    @classmethod
-    def setUpTestData(cls):
-        cls.u1 = User.objects.create_user(
-            password='secret',
-            last_login=datetime.datetime(2007, 5, 30, 13, 20, 10, tzinfo=UTC),
-            is_superuser=True, username='super', first_name='Super', last_name='User',
-            email='super@example.com', is_staff=True, is_active=True,
-            date_joined=datetime.datetime(2007, 5, 30, 13, 20, 10, tzinfo=UTC),
-        )
-
-    def setUp(self):
-        self.client.force_login(self.u1)
-
-    @requires_tz_support
-    def test_changelist(self):
-        e = Event.objects.create(dt=datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC))
-        response = self.client.get(reverse('admin_tz:timezones_event_changelist'))
-        self.assertContains(response, e.dt.astimezone(EAT).isoformat())
-
-    def test_changelist_in_other_timezone(self):
-        e = Event.objects.create(dt=datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC))
-        with timezone.override(ICT):
-            response = self.client.get(reverse('admin_tz:timezones_event_changelist'))
-        self.assertContains(response, e.dt.astimezone(ICT).isoformat())
-
-    @requires_tz_support
-    def test_change_editable(self):
-        e = Event.objects.create(dt=datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC))
-        response = self.client.get(reverse('admin_tz:timezones_event_change', args=(e.pk,)))
-        self.assertContains(response, e.dt.astimezone(EAT).date().isoformat())
-        self.assertContains(response, e.dt.astimezone(EAT).time().isoformat())
-
-    def test_change_editable_in_other_timezone(self):
-        e = Event.objects.create(dt=datetime.datetime(2011, 9, 1, 10, 20, 30, tzinfo=UTC))
-        with timezone.override(ICT):
-            response = self.client.get(reverse('admin_tz:timezones_event_change', args=(e.pk,)))
-        self.assertContains(response, e.dt.astimezone(ICT).date().isoformat())
-        self.assertContains(response, e.dt.astimezone(ICT).time().isoformat())
-
-    @requires_tz_support
-    def test_change_readonly(self):
-        Timestamp.objects.create()
-        # re-fetch the object for backends that lose microseconds (MySQL)
-        t = Timestamp.objects.get()
-        response = self.client.get(reverse('admin_tz:timezones_timestamp_change', args=(t.pk,)))
-        self.assertContains(response, t.created.astimezone(EAT).isoformat())
-
-    def test_change_readonly_in_other_timezone(self):
-        Timestamp.objects.create()
-        # re-fetch the object for backends that lose microseconds (MySQL)
-        t = Timestamp.objects.get()
-        with timezone.override(ICT):
-            response = self.client.get(reverse('admin_tz:timezones_timestamp_change', args=(t.pk,)))
-        self.assertContains(response, t.created.astimezone(ICT).isoformat())

+ 0 - 0
desktop/core/ext-py/Django-1.11.22/AUTHORS → desktop/core/ext-py/Django-1.11.29/AUTHORS


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/CONTRIBUTING.rst → desktop/core/ext-py/Django-1.11.29/CONTRIBUTING.rst


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/Gruntfile.js → desktop/core/ext-py/Django-1.11.29/Gruntfile.js


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/INSTALL → desktop/core/ext-py/Django-1.11.29/INSTALL


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/LICENSE → desktop/core/ext-py/Django-1.11.29/LICENSE


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/LICENSE.python → desktop/core/ext-py/Django-1.11.29/LICENSE.python


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/MANIFEST.in → desktop/core/ext-py/Django-1.11.29/MANIFEST.in


+ 31 - 0
desktop/core/ext-py/Django-1.11.29/PKG-INFO

@@ -0,0 +1,31 @@
+Metadata-Version: 2.1
+Name: Django
+Version: 1.11.29
+Summary: A high-level Python Web framework that encourages rapid development and clean, pragmatic design.
+Home-page: https://www.djangoproject.com/
+Author: Django Software Foundation
+Author-email: foundation@djangoproject.com
+License: BSD
+Description: UNKNOWN
+Platform: UNKNOWN
+Classifier: Development Status :: 5 - Production/Stable
+Classifier: Environment :: Web Environment
+Classifier: Framework :: Django
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: BSD License
+Classifier: Operating System :: OS Independent
+Classifier: Programming Language :: Python
+Classifier: Programming Language :: Python :: 2
+Classifier: Programming Language :: Python :: 2.7
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.4
+Classifier: Programming Language :: Python :: 3.5
+Classifier: Programming Language :: Python :: 3.6
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Topic :: Internet :: WWW/HTTP
+Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
+Classifier: Topic :: Internet :: WWW/HTTP :: WSGI
+Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
+Classifier: Topic :: Software Development :: Libraries :: Python Modules
+Provides-Extra: argon2
+Provides-Extra: bcrypt

+ 0 - 0
desktop/core/ext-py/Django-1.11.22/README.rst → desktop/core/ext-py/Django-1.11.29/README.rst


+ 27 - 0
desktop/core/ext-py/Django-1.11.29/django/__init__.py

@@ -0,0 +1,27 @@
+from __future__ import unicode_literals
+
+from django.utils.version import get_version
+
+VERSION = (1, 11, 29, 'final', 0)
+
+__version__ = get_version(VERSION)
+
+
+def setup(set_prefix=True):
+    """
+    Configure the settings (this happens as a side effect of accessing the
+    first setting), configure logging and populate the app registry.
+    Set the thread-local urlresolvers script prefix if `set_prefix` is True.
+    """
+    from django.apps import apps
+    from django.conf import settings
+    from django.urls import set_script_prefix
+    from django.utils.encoding import force_text
+    from django.utils.log import configure_logging
+
+    configure_logging(settings.LOGGING_CONFIG, settings.LOGGING)
+    if set_prefix:
+        set_script_prefix(
+            '/' if settings.FORCE_SCRIPT_NAME is None else force_text(settings.FORCE_SCRIPT_NAME)
+        )
+    apps.populate(settings.INSTALLED_APPS)

+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/__main__.py → desktop/core/ext-py/Django-1.11.29/django/__main__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/apps/__init__.py → desktop/core/ext-py/Django-1.11.29/django/apps/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/apps/config.py → desktop/core/ext-py/Django-1.11.29/django/apps/config.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/apps/registry.py → desktop/core/ext-py/Django-1.11.29/django/apps/registry.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/bin/django-admin.py → desktop/core/ext-py/Django-1.11.29/django/bin/django-admin.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/app_template/__init__.py-tpl → desktop/core/ext-py/Django-1.11.29/django/conf/app_template/__init__.py-tpl


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/app_template/admin.py-tpl → desktop/core/ext-py/Django-1.11.29/django/conf/app_template/admin.py-tpl


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/app_template/apps.py-tpl → desktop/core/ext-py/Django-1.11.29/django/conf/app_template/apps.py-tpl


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/app_template/migrations/__init__.py-tpl → desktop/core/ext-py/Django-1.11.29/django/conf/app_template/migrations/__init__.py-tpl


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/app_template/models.py-tpl → desktop/core/ext-py/Django-1.11.29/django/conf/app_template/models.py-tpl


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/app_template/tests.py-tpl → desktop/core/ext-py/Django-1.11.29/django/conf/app_template/tests.py-tpl


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/app_template/views.py-tpl → desktop/core/ext-py/Django-1.11.29/django/conf/app_template/views.py-tpl


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/global_settings.py → desktop/core/ext-py/Django-1.11.29/django/conf/global_settings.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/af/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/af/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/af/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/af/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/ar/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/ar/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/ar/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/ar/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/ar/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/ar/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/ar/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/ar/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/ast/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/ast/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/ast/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/ast/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/az/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/az/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/az/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/az/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/az/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/az/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/az/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/az/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/be/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/be/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/be/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/be/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/bg/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/bg/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/bg/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/bg/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/bg/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/bg/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/bg/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/bg/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/bn/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/bn/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/bn/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/bn/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/bn/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/bn/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/bn/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/bn/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/br/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/br/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/br/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/br/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/bs/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/bs/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/bs/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/bs/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/bs/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/bs/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/bs/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/bs/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/ca/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/ca/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/ca/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/ca/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/ca/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/ca/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/ca/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/ca/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/cs/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/cs/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/cs/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/cs/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/cs/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/cs/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/cs/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/cs/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/cy/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/cy/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/cy/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/cy/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/cy/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/cy/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/cy/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/cy/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/da/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/da/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/da/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/da/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/da/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/da/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/da/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/da/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/de/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/de/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/de/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/de/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/de/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/de/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/de/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/de/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/de_CH/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/de_CH/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/de_CH/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/de_CH/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/dsb/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/dsb/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/dsb/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/dsb/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/el/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/el/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/el/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/el/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/el/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/el/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/el/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/el/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/en/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/en/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/en/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/en/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/en/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/en/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/en/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/en/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/en_AU/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_AU/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/en_AU/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_AU/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/en_AU/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_AU/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/en_AU/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_AU/formats.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/en_GB/LC_MESSAGES/django.mo → desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_GB/LC_MESSAGES/django.mo


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/en_GB/LC_MESSAGES/django.po → desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_GB/LC_MESSAGES/django.po


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/en_GB/__init__.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_GB/__init__.py


+ 0 - 0
desktop/core/ext-py/Django-1.11.22/django/conf/locale/en_GB/formats.py → desktop/core/ext-py/Django-1.11.29/django/conf/locale/en_GB/formats.py


Some files were not shown because too many files were changed in this diff