Commit 4b53e5c

Merge pull request #888 from datajoint/release013
Add release 0.13 features and bug fixes
2 parents 3df05c0 + 8417c66 commit 4b53e5c

84 files changed, +10483 / -1439 lines changed


.coveragerc

Lines changed: 1 addition & 0 deletions
@@ -3,3 +3,4 @@ branch = False
 source = datajoint
 
 [report]
+show_missing = True

.github/workflows/development.yaml

Lines changed: 2 additions & 4 deletions
@@ -21,8 +21,6 @@ jobs:
             mysql_ver: "5.7"
           - py_ver: "3.6"
             mysql_ver: "5.7"
-          - py_ver: "3.5"
-            mysql_ver: "5.7"
     steps:
       - uses: actions/checkout@v2
       - name: Set up Python ${{matrix.py_ver}}
@@ -38,7 +36,7 @@ jobs:
       - name: Run primary tests
         env:
           UID: "1001"
-          GID: "116"
+          GID: "121"
           PY_VER: ${{matrix.py_ver}}
           MYSQL_VER: ${{matrix.mysql_ver}}
           ALPINE_VER: "3.10"
@@ -50,4 +48,4 @@ jobs:
       - name: Run style tests
         run: |
           flake8 --ignore=E121,E123,E126,E226,E24,E704,W503,W504,E722,F401,W605 datajoint \
-                 --count --max-complexity=62 --max-line-length=127 --statistics
+                 --count --max-complexity=62 --max-line-length=127 --statistics

.gitignore

Lines changed: 2 additions & 1 deletion
@@ -24,4 +24,5 @@ notebook
 .vscode
 __main__.py
 jupyter_custom.js
-apk_requirements.txt
+apk_requirements.txt
+.eggs

CHANGELOG.md

Lines changed: 17 additions & 0 deletions
@@ -1,5 +1,22 @@
 ## Release notes
 
+### 0.13.0 -- Mar 24, 2021
+* Re-implement query transpilation into SQL, fixing issues (#386, #449, #450, #484). PR #754
+* Re-implement cascading deletes for better performance. PR #839.
+* Add table method `.update1` to update a row in the table with new values PR #763
+* Python datatypes are now enabled by default in blobs (#761). PR #785
+* Added permissive join and restriction operators `@` and `^` (#785) PR #754
+* Support DataJoint datatype and connection plugins (#715, #729) PR #730, #735
+* Add `dj.key_hash` alias to `dj.hash.key_hash`
+* Default enable_python_native_blobs to True
+* Bugfix - Regression error on joins with same attribute name (#857) PR #878
+* Bugfix - Error when `fetch1('KEY')` when `dj.config['fetch_format']='frame'` set (#876) PR #880, #878
+* Bugfix - Error when cascading deletes in tables with many, complex keys (#883, #886) PR #839
+* Add deprecation warning for `_update`. PR #889
+* Add `purge_query_cache` utility. PR #889
+* Add tests for query caching and permissive join and restriction. PR #889
+* Drop support for Python 3.5
+
 ### 0.12.9 -- Mar 12, 2021
 * Fix bug with fetch1 with `dj.config['fetch_format']="frame"`. (#876) PR #880

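The changelog entries above introduce several user-facing additions. As a quick illustration (a minimal sketch, not part of this commit), the new `update1` method updates the secondary attributes of one existing row identified by its full primary key; the `Subject` table and schema name below are hypothetical, and a configured database connection is assumed.

```python
import datajoint as dj

schema = dj.schema('tutorial')  # hypothetical schema name; assumes a working connection


@schema
class Subject(dj.Manual):
    definition = """
    subject_id   : int
    ---
    subject_name : varchar(30)
    """


Subject.insert1({'subject_id': 1, 'subject_name': 'A101'})
# update1 (new in 0.13) replaces the secondary attributes of the existing row
# whose primary key matches the one given in the dict.
Subject.update1({'subject_id': 1, 'subject_name': 'A101-renamed'})
```

The permissive join `@` and restriction `^` operators listed above follow the same operator syntax as `*` and `&`; see PR #754 for their exact semantics.
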
LNX-docker-compose.yml

Lines changed: 13 additions & 10 deletions
@@ -1,4 +1,4 @@
-# docker-compose -f LNX-docker-compose.yml --env-file LNX.env up --build --exit-code-from app
+# docker-compose -f LNX-docker-compose.yml --env-file LNX.env up --exit-code-from app --build
 version: '2.2'
 x-net: &net
   networks:
@@ -32,7 +32,7 @@ services:
       interval: 1s
   fakeservices.datajoint.io:
     <<: *net
-    image: raphaelguzman/nginx:v0.0.13
+    image: datajoint/nginx:v0.0.16
     environment:
       - ADD_db_TYPE=DATABASE
       - ADD_db_ENDPOINT=db:3306
@@ -72,14 +72,17 @@ services:
       - COVERALLS_SERVICE_NAME
       - COVERALLS_REPO_TOKEN
     working_dir: /src
-    command: >
-      /bin/sh -c
-      "
-      pip install --user nose nose-cov coveralls .;
-      pip freeze | grep datajoint;
-      nosetests -vsw tests --with-coverage --cover-package=datajoint && coveralls;
-      # jupyter notebook;
-      "
+    command:
+      - sh
+      - -c
+      - |
+        set -e
+        pip install --user -r test_requirements.txt
+        pip install -e .
+        pip freeze | grep datajoint
+        nosetests -vsw tests --with-coverage --cover-package=datajoint
+        coveralls
+        # jupyter notebook
     # ports:
     #   - "8888:8888"
     user: ${UID}:${GID}

README.md

Lines changed: 4 additions & 3 deletions
@@ -32,9 +32,10 @@ Some Python datatypes such as dicts were coerced into numpy recarrays and then f
 However, since some Python types were coerced into MATLAB types, old blobs and new blobs may now be fetched as different types of objects even if they were inserted the same way.
 For example, new `dict` objects will be returned as `dict` while the same types of objects inserted with `datajoint 0.11` will be recarrays.
 
-Since this is a big change, we chose to disable full blob support by default as a temporary precaution, which will be removed in version 0.13.
+Since this is a big change, we chose to temporarily disable this feature by default in DataJoint for Python 0.12.x, allowing users to adjust their code if necessary.
+From 0.13.x, the flag will default to True (on), and will ultimately be removed when corresponding decode support for the new format is added to datajoint-matlab (see: datajoint-matlab #222, datajoint-python #765).
 
-You may enable it by setting the `enable_python_native_blobs` flag in `dj.config`.
+The flag is configured by setting the `enable_python_native_blobs` flag in `dj.config`.
 
 ```python
 import datajoint as dj
@@ -68,7 +69,7 @@ as structured arrays, whereas new record inserted in DataJoint 0.12 with
 appropriate native python type (dict, etc).
 Furthermore, DataJoint for MATLAB does not yet support unpacking native Python datatypes.
 
-With `dj.config["enable_python_native_blobs"]` set to `False` (default),
+With `dj.config["enable_python_native_blobs"]` set to `False`,
 any attempt to insert any datatype other than a numpy array will result in an exception.
 This is meant to get users to read this message in order to allow proper testing
 and migration of pre-0.12 pipelines to 0.12 in a safe manner.

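As a usage note for the README change above (a minimal sketch, not part of this commit), the flag is simply a `dj.config` key:

```python
import datajoint as dj

# Native Python blob support; from 0.13 this defaults to True.
dj.config["enable_python_native_blobs"] = True
```
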
datajoint.pub

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
+-----BEGIN PUBLIC KEY-----
+MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDUMOo2U7YQ1uOrKU/IreM3AQP2
+AXJC3au+S9W+dilxHcJ3e98bRVqrFeOofcGeRPoNc38fiLmLDUiBskJeVrpm29Wo
+AkH6yhZWk1o8NvGMhK4DLsJYlsH6tZuOx9NITKzJuOOH6X1I5Ucs7NOSKnmu7g5g
+WTT5kCgF5QAe5JN8WQIDAQAB
+-----END PUBLIC KEY-----

datajoint/__init__.py

Lines changed: 3 additions & 2 deletions
@@ -15,7 +15,7 @@
 """
 
 __author__ = "DataJoint Contributors"
-__date__ = "February 7, 2019"
+__date__ = "November 7, 2020"
 __all__ = ['__author__', '__version__',
            'config', 'conn', 'Connection',
            'Schema', 'schema', 'VirtualModule', 'create_virtual_module',
@@ -24,7 +24,7 @@
            'Not', 'AndList', 'U', 'Diagram', 'Di', 'ERD',
            'set_password', 'kill',
            'MatCell', 'MatStruct', 'AttributeAdapter',
-           'errors', 'DataJointError', 'key']
+           'errors', 'DataJointError', 'key', 'key_hash']
 
 from .version import __version__
 from .settings import config
@@ -38,6 +38,7 @@
 from .admin import set_password, kill
 from .blob import MatCell, MatStruct
 from .fetch import key
+from .hash import key_hash
 from .attribute_adapter import AttributeAdapter
 from . import errors
 from .errors import DataJointError

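A brief illustration (not part of this commit) of the new top-level alias exported above; the key dict is hypothetical:

```python
import datajoint as dj

key = {'subject_id': 5, 'session_id': 2}
# dj.key_hash is now an alias for dj.hash.key_hash; both return the same hex digest.
assert dj.key_hash(key) == dj.hash.key_hash(key)
print(dj.key_hash(key))
```
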
datajoint/attribute_adapter.py

Lines changed: 4 additions & 2 deletions
@@ -1,5 +1,6 @@
 import re
 from .errors import DataJointError, _support_adapted_types
+from .plugin import type_plugins
 
 
 class AttributeAdapter:
@@ -38,10 +39,11 @@ def get_adapter(context, adapter_name):
         raise DataJointError('Support for Adapted Attribute types is disabled.')
     adapter_name = adapter_name.lstrip('<').rstrip('>')
     try:
-        adapter = context[adapter_name]
+        adapter = (context[adapter_name] if adapter_name in context
+                   else type_plugins[adapter_name]['object'].load())
     except KeyError:
         raise DataJointError(
-            "Attribute adapter '{adapter_name}' is not defined.".format(adapter_name=adapter_name)) from None
+            "Attribute adapter '{adapter_name}' is not defined.".format(adapter_name=adapter_name))
     if not isinstance(adapter, AttributeAdapter):
         raise DataJointError(
             "Attribute adapter '{adapter_name}' must be an instance of datajoint.AttributeAdapter".format(

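For context on the change above: `get_adapter` now falls back to registered type plugins when the adapter name is not found in the schema context. A minimal adapter might look like the sketch below (not part of this commit); the `SetAdapter` name and behavior are hypothetical, and adapted-type support must be enabled, as the error message above indicates.

```python
import datajoint as dj
import numpy as np


class SetAdapter(dj.AttributeAdapter):
    attribute_type = 'longblob'  # how adapted values are stored in the table

    def put(self, obj):
        # object -> storable value
        return np.array(sorted(obj))

    def get(self, value):
        # stored value -> object
        return set(value.tolist())


# A table attribute declared as `members : <set_adapter>` is resolved either from
# the schema context (e.g. set_adapter = SetAdapter()) or, with this change,
# from a registered type plugin.
```
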
datajoint/autopopulate.py

Lines changed: 5 additions & 5 deletions
@@ -7,7 +7,6 @@
 from tqdm import tqdm
 from .expression import QueryExpression, AndList
 from .errors import DataJointError, LostConnectionError
-from .table import FreeTable
 import signal
 
 # noinspection PyExceptionInherit,PyCallingNonCallable
@@ -41,7 +40,7 @@ def _rename_attributes(table, props):
         parents = self.target.parents(primary=True, as_objects=True, foreign_key_info=True)
         if not parents:
             raise DataJointError(
-                'A relation must have primary dependencies for auto-populate to work') from None
+                'A relation must have primary dependencies for auto-populate to work')
         self._key_source = _rename_attributes(*parents[0])
         for q in parents[1:]:
             self._key_source *= _rename_attributes(*q)
@@ -58,15 +57,16 @@ def make(self, key):
     @property
     def target(self):
         """
-        relation to be populated.
-        Typically, AutoPopulate are mixed into a Relation object and the target is self.
+        :return: table to be populated.
+        In the typical case, dj.AutoPopulate is mixed into a dj.Table class by inheritance and the target is self.
         """
         return self
 
     def _job_key(self, key):
         """
         :param key: they key returned for the job from the key source
         :return: the dict to use to generate the job reservation hash
+        This method allows subclasses to control the job reservation granularity.
         """
         return key
 
@@ -136,7 +136,7 @@ def handler(signum, frame):
 
         make = self._make_tuples if hasattr(self, '_make_tuples') else self.make
 
-        for key in (tqdm(keys) if display_progress else keys):
+        for key in (tqdm(keys, desc=self.__class__.__name__) if display_progress else keys):
             if max_calls is not None and call_count >= max_calls:
                 break
             if not reserve_jobs or jobs.reserve(self.target.table_name, self._job_key(key)):

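For context on the docstring added to `_job_key` above: subclasses may override it to control job reservation granularity. A hypothetical sketch (not part of this commit), assuming a configured connection and the made-up tables below:

```python
import datajoint as dj

schema = dj.schema('tutorial')  # hypothetical schema name


@schema
class Session(dj.Manual):
    definition = """
    subject_id : int
    session_id : int
    """


@schema
class SessionAnalysis(dj.Computed):
    definition = """
    -> Session
    ---
    result : float
    """

    def make(self, key):
        self.insert1(dict(key, result=0.0))

    def _job_key(self, key):
        # hypothetical override: all sessions of one subject share a single job reservation
        return {'subject_id': key['subject_id']}


# With reserve_jobs=True, the dict returned by _job_key is hashed for the reservation;
# per the change above, display_progress now labels the tqdm bar with the class name.
SessionAnalysis.populate(reserve_jobs=True, display_progress=True)
```
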
datajoint/blob.py

Lines changed: 3 additions & 4 deletions
@@ -11,11 +11,10 @@
 import uuid
 import numpy as np
 from .errors import DataJointError
-from .utils import OrderedDict
 from .settings import config
 
 
-mxClassID = OrderedDict((
+mxClassID = dict((
     # see http://www.mathworks.com/help/techdoc/apiref/mxclassid.html
     ('mxUNKNOWN_CLASS', None),
     ('mxCELL_CLASS', None),
@@ -346,8 +345,8 @@ def pack_set(self, t):
             len_u64(it) + it for it in (self.pack_blob(i) for i in t))
 
     def read_dict(self):
-        return OrderedDict((self.read_blob(self.read_value()), self.read_blob(self.read_value()))
-                           for _ in range(self.read_value()))
+        return dict((self.read_blob(self.read_value()), self.read_blob(self.read_value()))
+                    for _ in range(self.read_value()))
 
     def pack_dict(self, d):
         return b"\4" + len_u64(d) + b"".join(

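Background for the `OrderedDict` to `dict` change above: dicts preserve insertion order in CPython 3.6 and by language guarantee from Python 3.7, so with Python 3.5 support dropped, `OrderedDict` is no longer needed here. A one-line check (not part of this commit):

```python
# Plain dicts keep insertion order on the Python versions still supported.
d = dict((('mxUNKNOWN_CLASS', None), ('mxCELL_CLASS', None)))
assert list(d) == ['mxUNKNOWN_CLASS', 'mxCELL_CLASS']
```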