The Descartes Labs Platform is designed to answer some of the world’s most complex and pressing geospatial analytics questions. Our customers use the platform to build algorithms and models that transform their businesses quickly, efficiently, and cost-effectively.
By giving data scientists and their line-of-business colleagues the best geospatial data and modeling tools in one package, we help turn AI into a core competency.
Data science teams can use our scaling infrastructure to design models faster than ever, using our massive data archive or their own.
Please visit https://descarteslabs.com for more information about the Descartes Labs Platform and to request access.
The `descarteslabs` Python package, available at https://pypi.org/project/descarteslabs/, provides client-side access to the Descartes Labs Platform for our customers. You must be a registered customer with access to our Descartes Labs Platform before you can make use of this package with our platform.
The documentation for the latest release can be found at https://docs.descarteslabs.com. For any issues, please contact Customer Support at https://support.descarteslabs.com.
- Fixed a bug where some geometries weren't supported by blob geometry properties
- Fixed a problem with unpickling Catalog objects pickled with an earlier version. Please be aware that we do not support the pickling of any Catalog objects, so if in doubt, don't do it!
- The links for login and token generation have been updated to refer to https://app.descarteslabs.com.
- All `CatalogObject` classes which support the `owners`, `writers`, and `readers` fields have been refactored to derive this support from the new `AuthCatalogObject`. This change does not affect the behavior of any of these classes. The methods `AuthCatalogObject.user_is_owner()`, `AuthCatalogObject.user_can_write()`, and `AuthCatalogObject.user_can_read()` have been added to allow testing of permissions prior to attempting an operation such as updating or deleting the object.
- `EventSchedule` now has a read-only `expires` attribute which indicates when the schedule will expire and be deleted.
- `EventSubscription` now has a read-only `owner_role_arn` field which contains the AWS role which will be used by certain `EventRule` targets that reside in an external AWS account.
- `EventRule` has been enhanced to support SQS Queue targets.
- Several new helper classes for use with `EventSubscription` are now supported: `EventSubscriptionSqsTarget`, `NewImageEventSubscription`, `NewStorageEventSubscription`, `NewVectorEventSubscription`, and `ComputeFunctionCompletedEventSubscription`. The latter supports events generated by the Compute service as described below.
- The Compute service now generates a `compute-function-completed` event every time the number of outstanding (pending or running) jobs transitions to 0, akin to the `Function.wait_for_completion()` method. These events can be used with the Catalog service events support to trigger other operations.
- Support for Python 3.8 has been removed
- Support for Python 3.12 has been added
- Some dependencies have been updated due to security flaws
- The dependency on `pytz` has been removed in favor of the standard `zoneinfo` package.
- Minor changes and additions to the client exception hierarchy so that `ClientError`s and `ServerError`s are not conflated in the retry support.
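For code that used `pytz` alongside this client, the migration to `zoneinfo` usually just changes how a timezone is attached; a minimal sketch using only the standard library:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9, replacing pytz

# pytz style (removed):     pytz.timezone("America/Denver").localize(naive)
# zoneinfo style (current): attach the tzinfo directly in the constructor
dt = datetime(2024, 6, 1, 12, 0, tzinfo=ZoneInfo("America/Denver"))
print(dt.utcoffset().total_seconds() / 3600)  # -6.0 (Mountain Daylight Time)
```

Unlike `pytz`, `zoneinfo` tzinfo objects handle DST transitions correctly without a `localize()` step.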
- The Catalog now provides support for Events, essentially notifications of new or updated assets in the Catalog, including images and storage blobs. Additionally, scheduled calendar-based events can be defined. You can subscribe to these events to trigger running a Compute function of your choice. This makes it possible to set up automated processing of new imagery. See the [Catalog Guide](https://docs.descarteslabs.com/guides/catalog.html) and API documentation for more information.
- Due to declining support for Python 3.8 across the ecosystem, we have discontinued our support for Python 3.8. It is expected that the client will continue to function until Python 3.8 is End of Life (October 2024), but we can no longer test new releases against this version.
- The Catalog Storage Blob deletion methods have been enhanced to support waiting for completion of the operation. When a blob is deleted, it is removed immediately from the catalog and a background asynchronous task is launched to clean up the contents of the blob from the backing storage. If a blob is deleted and then a new blob with the identical id is immediately created and uploaded before this background task completes, it is possible for the background task to end up deleting the new blob contents. As of this release the `Blob` instance and class delete methods return a `BlobDeletionTaskStatus` object which provides a `wait_for_completion` method which can be used to wait until the background task completes and it is safe to create a new blob with the same id. For the `Blob.delete_many` method, the `wait_for_completion=True` parameter can be used to wait for all the supplied blobs to be completely deleted. Note that in the case of the `Blob.delete` class method this is a very slight breaking change: it used to return `True` or `False`, and now instead returns a `BlobDeletionTaskStatus` or `None`, which have the same truthiness and hence are very likely to behave identically in practical use.
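The truthiness compatibility described above can be illustrated with a minimal stand-in; this stub only models the return-value change, not the real Catalog client class:

```python
class BlobDeletionTaskStatus:
    """Illustrative stand-in for the Catalog client's return object."""

    def wait_for_completion(self):
        pass  # the real method blocks until backing storage is cleaned up


def delete(found: bool):
    # Old behavior: returned True/False. New behavior: status object or None.
    return BlobDeletionTaskStatus() if found else None


# Code written against the old boolean API keeps working unchanged:
if delete(True):
    print("deleted")    # a status object is truthy, like the old True
if not delete(False):
    print("not found")  # None is falsy, like the old False
```

Code that tested the return value with `is True` or `is False` (rather than plain truthiness) would be affected by this change.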
Bugfix only
- The `descarteslabs` client CLI script generated by the installation process was broken. Now it works!
A very minor release with some obscure bug fixes.
- The `descarteslabs` client CLI has had an overhaul. Gone is the obsolete support for the Raster client; added is support for querying Catalog Products, Bands, and Blobs and managing sharing for the same.
- Minor fixes to the authorization flow on login.
- Added testing of Blobs.
- Corrected regular expressions used to parse the `memory` argument to the `Function` constructor.
- Improved documentation of the `cpus` and `memory` arguments to the `Function` constructor.
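A human-friendly `memory` argument typically accepts either a bare number of megabytes or a string with a unit suffix. The exact grammar the `Function` constructor accepts is not shown here; the sketch below is a hypothetical parser for illustration only:

```python
import re

# Hypothetical grammar: an int (MB), or a string like "512MB" / "2GB".
_MEMORY_RE = re.compile(r"^\s*(\d+(?:\.\d+)?)\s*(MB|GB)?\s*$", re.IGNORECASE)


def parse_memory(value) -> int:
    """Return memory in megabytes (illustrative, not the client's parser)."""
    if isinstance(value, int):
        return value
    match = _MEMORY_RE.match(value)
    if not match:
        raise ValueError(f"invalid memory specification: {value!r}")
    number = float(match.group(1))
    unit = (match.group(2) or "MB").upper()
    return int(number * 1024) if unit == "GB" else int(number)


print(parse_memory("2GB"))    # 2048
print(parse_memory("512MB"))  # 512
```

Consult the `Function` API documentation for the forms the real constructor accepts.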
- Fixed a bug in seldom-used code to clear client state causing an import failure.
- Fixed a bug in `Table.visualize()` which could cause Authorization (401) errors when rendering tiles into an `ipyleaflet.Map`.
- Bumped some minimum dependency version constraints to avoid security vulnerabilities.
- Fixed a bug in `Table.visualize()` that was causing it to fail.
Due to a number of breaking changes, the version has been bumped to 3.0.0. However, the vast majority of typical user code will not require changes. Please review the specifics below.
- The `tags` attribute on Catalog objects can now contain up to 32 elements, each up to 1000 characters long. But why would you even want to go there?
- Breaking Change: Derived bands, never supported in the AWS environment and catalog products, have been removed.
- The new `Blob.delete_many` method may be used to delete large numbers of blobs efficiently.
- The `Blob.get_or_create` method didn't allow supplying `storage_type`, `namespace`, or `name` parameters. Now it works as expected, either returning a saved Blob from the Catalog, or an unsaved blob that you can use to upload and save its data.
- Image methods `ndarray` and `download` no longer pass the image's default geocontext geometry as a cutline. This is to avoid problems when trying to raster a complete single image in its native CRS and resolution, where imperfect geometries (due to a simplistic projection to EPSG:4326) can cause some boundary pixels to be masked. When passing an explicit `GeoContext` to these methods, consider whether any cutline geometry is required, to avoid these issues.
- `Function` and `Job` objects now have a new `environment` attribute which can be used to define environment variables for the jobs when they are run.
- Breaking Change: The `Function.map` method previously had no bound on how many jobs could be created at one time. This led to operational problems with very large numbers of jobs. Now it submits jobs in batches (up to 1000 jobs per batch) to avoid request timeouts, and is more robust on retryable errors so that duplicate jobs are not submitted accidentally. There is still no bound on how many jobs you may create with a single call to `Function.map`. Additionally, since it is possible that some jobs may be successfully submitted and others not, the return value, while still behaving as a list of `Job`s, is now a `JobBulkCreateResult` object which has an `is_success` and an `error` property which can be used to determine whether all submissions were successful, what errors may have occurred, and what jobs have actually been created. Only if the first batch fails hard will the method raise an exception.
- The `Job.statistics` member is now typed as a `JobStatistics` object.
- The efficiency of deleting many jobs at once has been significantly improved using `Function.delete` and `Function.delete_jobs`. It is still possible to encounter request timeouts with very large numbers of jobs; workarounds are now documented in the API documentation for the `Function.delete_jobs` method.
- The `ComputeClient.check_credentials` method has been added, so that the client can determine whether valid user credentials have already been registered with the Compute service.
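The batched submission behavior described for `Function.map` can be sketched generically. This is an illustration of the batching pattern only; the names and error handling are not the client's actual internals:

```python
from itertools import islice

BATCH_SIZE = 1000  # the release notes state up to 1000 jobs per batch


def batched(iterable, size):
    """Yield successive lists of at most `size` items."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch


def submit_all(args, submit_batch):
    """Submit args in batches; collect created jobs and any per-batch errors."""
    jobs, errors = [], []
    for batch in batched(args, BATCH_SIZE):
        try:
            jobs.extend(submit_batch(batch))
        except Exception as exc:  # one batch failing does not discard earlier jobs
            errors.append(exc)
    return jobs, errors


jobs, errors = submit_all(range(2500), lambda batch: [f"job-{a}" for a in batch])
print(len(jobs))  # 2500, submitted in 3 batches of 1000, 1000, and 500
```

This mirrors why the return value must report partial success: some batches may land while a later one fails.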
- The Vector client library, previously available as the `descarteslabs-vector` package on PyPI, has now been integrated into the Descartes Labs Python Client (this package). It should no longer be installed separately.
- Visualization support (`ipyleaflet.Map`) is enabled when `ipyleaflet` is available. It is not installed by default, but can be installed manually, or by installing the `descarteslabs` python client with the `viz` extra (e.g. `pip install 'descarteslabs[viz]'`). Note that in order to be compatible with JupyterLab notebooks, the `visualize()` method no longer returns the layer; it just adds it to the supplied map.
- The Vector package now has a `VectorClient` API client, with the usual support for `get_default_client()` and `set_default_client()`. Most constructors and methods now accept an optional `client=` parameter if you need to use something other than the default client.
- Configuration is now accomplished using the standard `descarteslabs.config` package. In particular, the `vector_url` setting is used to specify the default Vector host. The `VECTOR_API_HOST` environment variable is no longer consulted.
- Vector client methods now raise standard `descarteslabs.exceptions` exception classes rather than the `descarteslabs.vector.vector_exceptions` classes of the old client.
- The `is_spatial=` parameter previously accepted by many methods and functions is now deprecated and ignored. It is not required because existing type information always determines whether an operation is spatial. Warnings will be generated if it is used.
- Be advised that feature upload and download (query) do not currently support or impose any limits, and thus allow operations so large and slow that timeouts or other failures may occur. A future version will implement limits and batching, so that large operations can be supported reliably. Until then, users may wish to implement their own batching where possible to avoid encountering network limits and timeouts.
- The old client version v1.12.1 is reaching end of life and will no longer be supported as of February 2024. You can expect that version to stop working at any point after that as legacy backend support is turned off.
- Breaking Change: The deprecated `Scenes` client API has been removed.
- Breaking Change: The deprecated `Metadata` client API has been removed.
- The minimum required version of `urllib3` has been bumped to 1.26.18 to address a security vulnerability.
- The minimum required version of `shapely` has been bumped to 2.0.0 to address thread safety issues.
- Python 3.7, formerly deprecated, is no longer supported.
- Python 3.12 is not yet officially supported due to the lack of support from `blosc`. However, if you are able to provide a functional `blosc` on your own, then 3.12 should work.
- Urllib3 2.X is now supported.
- Geopandas, Pydantic, and PyArrow have been added as core dependencies to support the Vector client.
- For those users of the `clear_client_state` function (not common), the bands cache for the Catalog client is now also cleared.
- `Function.delete_jobs` was failing to implement the `delete_results` parameter, so job result blobs were not being deleted. This has been fixed.
- Added the `delete_results` parameter to `Function.delete` for consistency.
- `Job.statistics` field added which contains statistics (CPU, memory, and network utilization) for the job. This can be used to determine the minimal resources necessary for the `Function` after some representative runs.
- Filtering on datetime attributes (such as `Function.created_at`) didn't previously work with anything but `datetime` instances. Now it also handles ISO format strings and Unix timestamps (int or float).
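Accepting all three forms typically means normalizing the value to a `datetime` before comparison; a minimal sketch of that normalization (illustrative, not the client's actual code):

```python
from datetime import datetime, timezone


def to_datetime(value) -> datetime:
    """Normalize a datetime, ISO 8601 string, or unix timestamp to a datetime."""
    if isinstance(value, datetime):
        return value
    if isinstance(value, (int, float)):
        return datetime.fromtimestamp(value, tz=timezone.utc)
    return datetime.fromisoformat(value)


# Both spellings name the same instant:
print(to_datetime("2024-01-15T00:00:00+00:00"))
print(to_datetime(1705276800.0))
```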
- Following our lifecycle policy, client versions v1.11.0 and earlier are no longer supported. They may cease to work with the Platform at any time.
- The Catalog `Blob` class now has a `get_data()` method which can be used to retrieve the blob data directly given the id, without having to first retrieve the `Blob` metadata.
- Breaking Change: The status values for `Function` and `Job` objects have changed, to provide a better experience managing the flow of jobs. Please see the updated Compute guide for a full explanation. Because of the required changes to the back end, older clients (i.e. v2.0.3) are supported in a best-effort manner. Upgrading to this new client release is strongly advised for all users of the Compute service.
- Breaking Change: The base images for Compute have been put on a diet. They are now themselves built from "slim" Python images, and they no longer include the wide variety of extra Python packages that were formerly included (e.g. TensorFlow, SciKit Learn, PyTorch). This has reduced the base image size by an order of magnitude, making function build times and job startup overhead commensurately faster. Any functions which require such additional packages can add them in as needed via the `requirements=` parameter. While doing so will increase image size, it will generally still be much smaller and faster than the prior "everything and the kitchen sink" approach. Existing Functions with older images will continue to work as always, but any newly minted `Function` using the new client will use one of the new slim images.
- Base images are now available for Python 3.10 and Python 3.11, in addition to Python 3.8 and Python 3.9.
- Job results and logs are now integrated with Catalog Storage, so that results and logs can be searched and retrieved directly using the Catalog client as well as using the methods in the Compute client. Results are organized under `storage_type=StorageType.COMPUTE`, while logs are organized under `storage_type=StorageType.LOGS`.
- The new `ComputeResult` class can be used to wrap results from a `Function`, allowing the user to specify additional attributes for the result which will be stored in the Catalog `Blob` metadata for the result. This allows the function to specify properties such as `geometry`, `description`, `expires`, `extra_attributes`, `writers`, and `readers` for the result `Blob`. The use of `ComputeResult` is not required.
- A `Job` can now be assigned arbitrary tags (strings), and searched based on them.
- A `Job` can now be retried on errors, and jobs track error reasons, exit codes, and execution counts.
- `Function` and `Job` objects can now be filtered by class attributes (e.g. `Job.search().filter(Job.status == JobStatus.PENDING).collect()`).
- The `Job.cancel()` method can now be used to cancel the execution of a job which is currently pending or running. Pending jobs will immediately transition to `JobStatus.CANCELED` status, while running jobs will pass through `JobStatus.CANCEL` (waiting for the cancelation to be signaled to the execution engine), `JobStatus.CANCELING` (waiting for the execution to terminate), and `JobStatus.CANCELED` (once the job is no longer executing). Cancelation of running jobs is not guaranteed; a job may terminate successfully, or with a failure or timeout, before it can be canceled.
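The cancelation flow is a small state machine; a schematic sketch using the status names from these notes (the transition logic here is illustrative, not the service's implementation):

```python
from enum import Enum


class JobStatus(str, Enum):
    PENDING = "pending"
    RUNNING = "running"
    CANCEL = "cancel"        # cancelation requested, not yet signaled
    CANCELING = "canceling"  # signaled, waiting for execution to terminate
    CANCELED = "canceled"


def request_cancel(status: JobStatus) -> JobStatus:
    """Illustrative: pending jobs cancel at once; running jobs start a handshake."""
    if status == JobStatus.PENDING:
        return JobStatus.CANCELED
    if status == JobStatus.RUNNING:
        return JobStatus.CANCEL  # then CANCELING, then CANCELED
    return status


print(request_cancel(JobStatus.PENDING).value)  # canceled
print(request_cancel(JobStatus.RUNNING).value)  # cancel
```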
- The `Job.result()` method will raise an exception if the job does not have a status of `JobStatus.SUCCESS`. If `Job.result()` yields a `None` value, this means that there was no result (i.e. the execution returned `None`).
- The `Job.result_blob()` method will return the Catalog Storage Blob holding the result, if any.
- The `Job.delete()` method will delete any job logs, but will not delete the job result unless the `delete_results` parameter is supplied.
- The `Function` object now has `namespace` and `owner` attributes.
- The `Function.wait_for_completion()` and new `Function.as_completed()` methods provide a richer set of functionality for waiting on and handling job completion.
- The `Function.build_log()` method now returns the log contents as a string, rather than printing them.
- The `Job.log()` method now returns the log contents as a list of strings, rather than printing them. Because logs can be unbounded in size, there is also a new `Job.iter_log()` method which returns an iterator over the log lines.
- The `requirements=` parameter to `Function` objects now supports more `pip` magic, allowing the use of special `pip` controls such as `-f`. Also, parsing of package versions has been loosened to allow some more unusual version designators.
- Changes to the `Function.map()` method: the `iterargs` parameter has been renamed to `kwargs` (the old name is still honored but deprecated), the documentation has been corrected, and enhancements support more general iterators and mappings, allowing for a more functional programming style.
- The compute package was restructured to make all the useful and relevant classes available at the top level.
- Property filters can now be deserialized as well as serialized.
- Allow deletion of `Function` objects.
  - Deleting a Function will delete all associated Jobs.
- Allow deletion of `Job` objects.
  - Deleting a Job will delete all associated resources (logs, results, etc.).
- Added attribute filtering to `Function` and `Job` objects.
  - Attributes marked `filterable=True` can be used to filter objects on the compute backend API.
- Minor optimization to `Job.iter_results`, which now uses backend filters to load successful jobs.
- `Function` bundling has been enhanced.
  - New `include_modules` and `include_data` parameters allow multiple other modules, non-code data files, etc. to be added to the code bundle.
  - The `requirements` parameter has been improved to allow a user to pass a path to their own `requirements.txt` file instead of a list of strings.
- Allow data type `int32` in geotiff downloads.
- `BlobCollection` is now importable from `descarteslabs.catalog`.
- Added API documentation for dynamic compute and vector
- Due to recent changes in `urllib3`, rastering operations were failing to retry certain errors which ought to be retried, causing more failures to propagate to the user than was desirable. This is now fixed.
(Release notes from all the 2.0.0 release candidates are summarized here for completeness.)
- Deprecated support for Python 3.7 (will reach end of life in July).
- Added support for Python 3.10 and Python 3.11
- AWS-only client. For the time being, the AWS client can be used to communicate with the legacy GCP platform (e.g. `DESCARTESLABS_ENV=gcp-production`) but only supports those services that are supported on AWS (`catalog` and `scenes`). This support may break at any point in the future, so it is strictly transitional.
- Removed many dependencies no longer required due to the removal of GCP-only features.
- Added support for Shapely 2.X. Note that user code may also be affected by breaking changes in Shapely 2.X. Use of Shapely 1.8 is still supported.
- Updated requirements to avoid `urllib3>=2.0.0`, which breaks all kinds of things.
- Major overhaul of the internals of the config process. To support other clients using namespaced packages within the `descarteslabs` package, the top level has been cleaned up, and almost all the real code now lives inside `descarteslabs.core`. End users should never have to import anything from `descarteslabs.core`. No more magic packages means that `pylint` will work well with code using `descarteslabs`.
- Configuration no longer depends upon the authorized user.
- Added support for data storage. The `Blob` class provides a mechanism to upload, index, share, and retrieve arbitrary byte sequences (e.g. files). `Blob`s can be searched by namespace and name, geospatial coordinates (points, polygons, etc.), and tags. `Blob`s can be downloaded to a local file, or retrieved directly as a Python `bytes` object. `Blob`s support the same sharing mechanisms as `Product`s, with `owners`, `writers`, and `readers` attributes.
- Added support to `Property` for `prefix` filtering.
- The default `geocontext` for image objects no longer specifies a `resolution` but rather a `shape`, to ensure that default rastering preserves the original data and alignment (i.e. no warping of the source image).
- As with `resolution`, you can now pass a `crs` parameter to the rastering methods (e.g. `Image.ndarray`, `ImageCollection.stack`, etc.) to override the `crs` of the default geocontext.
- A bug in the code handling the default context for image collections when working with a product with a CRS based on degrees rather than meters has been fixed. Resolutions should always be specified in the units used by the CRS.
- Added support for managed batch compute under the `compute` module.
- Fixed a bug in the handling of small blocks (less than 512 x 512) that caused rasterio to generate bad download files (the desired image block would appear as a smaller sub-block rather than filling the resulting raster).
- The defaulting of `align_pixels` has changed slightly for the `AOI` class. Previously it always defaulted to `True`. Now the default is `True` if `resolution` is set, `False` otherwise. This ensures that when specifying a `shape` and a `bounds` rather than a resolution, the `shape` is actually honored.
- When assigning a `resolution` to an `AOI`, any existing `shape` attribute is automatically unset, since the two attributes are mutually exclusive.
- The validation of bounds for a geographic CRS has been slightly modified to account for some of the irregularities of whole-globe image products, correcting unintended failures in the past.
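The `align_pixels` defaulting rule described above can be written out as a one-line decision; this sketch is illustrative, not the `AOI` implementation:

```python
def default_align_pixels(resolution=None, align_pixels=None) -> bool:
    """align_pixels defaults to True only when a resolution is given."""
    if align_pixels is not None:
        return align_pixels          # an explicit value always wins
    return resolution is not None    # True if resolution set, else False


print(default_align_pixels(resolution=10.0))  # True
print(default_align_pixels())                 # False: a shape/bounds AOI is honored exactly
```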
- Fixed problem handling MultiPolygon and GeometryCollection when using Shapely 2.0.
- Loosened the restrictions on the allowed alphabet for Blob names. Now almost any printable character is accepted, save for newlines and commas.
- Added new storage types for Blobs: `StorageType.COMPUTE` (for Compute job results) and `StorageType.DYNCOMP` (for saved `dynamic-compute` operations).
- Added testing of the client.
- The defaulting of the `namespace` value for `Blob`s has changed slightly. If no namespace is specified, it will default to `<org>:<hash>` with the user's org name and unique user hash. Otherwise, any other value, as before, will be prefixed with the user's org name if it isn't already.
- `Blob.get` no longer requires a full id. Alternatively, you can give it a `name` and optionally a `namespace` and a `storage_type`, and it will retrieve the `Blob`.
- Fixed a bug causing summaries of `Blob` searches to fail.
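A hypothetical sketch of that namespace defaulting rule (the org and hash values here are made up for illustration):

```python
def default_namespace(namespace, org, user_hash):
    """No namespace -> '<org>:<hash>'; otherwise prefix with org unless present."""
    if namespace is None:
        return f"{org}:{user_hash}"
    if namespace.startswith(f"{org}:"):
        return namespace
    return f"{org}:{namespace}"


print(default_namespace(None, "myorg", "0123abcd"))         # myorg:0123abcd
print(default_namespace("project-x", "myorg", "0123abcd"))  # myorg:project-x
```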
- `Function.map` and `Function.rerun` now save the created `Job`s before returning.
- `Job.get` return values fixed, and an extraneous debug print removed.
- Updated requirements to avoid `urllib3>=2.0.0`, which breaks all kinds of things.
- The defaulting of `align_pixels` has changed slightly for the `AOI` class. Previously it always defaulted to `True`. Now the default is `True` if `resolution` is set, `False` otherwise. This ensures that when specifying a `shape` and a `bounds` rather than a resolution, the `shape` is actually honored.
- When assigning a `resolution` to an `AOI`, any existing `shape` attribute is automatically unset, since the two attributes are mutually exclusive.
- The validation of bounds for a geographic CRS has been slightly modified to account for some of the irregularities of whole-globe image products, correcting unintended failures in the past.
- The default `geocontext` for image objects no longer specifies a `resolution` but rather a `shape`, to ensure that default rastering preserves the original data and alignment (i.e. no warping of the source image).
- The `Blob.upload` and `Blob.upload_data` methods now return `self`, so they can be used in a fluent style.
- As with `resolution`, you can now pass a `crs` parameter to the rastering methods (e.g. `Image.ndarray`, `ImageCollection.stack`, etc.) to override the `crs` of the default geocontext.
- A bevy of fixes to the client.
- Added support for data storage. The `Blob` class provides a mechanism to upload, index, share, and retrieve arbitrary byte sequences (e.g. files). `Blob`s can be searched by namespace and name, geospatial coordinates (points, polygons, etc.), and tags. `Blob`s can be downloaded to a local file, or retrieved directly as a Python `bytes` object. `Blob`s support the same sharing mechanisms as `Product`s, with `owners`, `writers`, and `readers` attributes.
- Added support to `Property` for `prefix` filtering.
- Added a method to update user credentials for a `Function`.
- Added methods to retrieve build and job logs.
- Added support for Shapely 2.X.
- This is an internal-only release. There is as of yet no updated documentation. However, the user-facing client APIs remain fully compatible with v1.12.1.
- Added support for managed batch compute under the `compute` module.
- Removed the check on the Auth for configuration, since it is all AWS all the time.
- Fixed a bug in the handling of small blocks (less than 512 x 512) that caused rasterio to generate bad download files (the desired image block would appear as a smaller sub-block rather than filling the resulting raster).
- This is an internal-only release. There is as of yet no updated documentation. However, the user-facing client APIs remain fully compatible with v1.12.1.
- Deprecated support for Python 3.7 (will reach end of life in July).
- Added support for Python 3.10 and Python 3.11
- AWS-only client. For the time being, the AWS client can be used to communicate with the legacy GCP platform (e.g. `DESCARTESLABS_ENV=gcp-production`) but only supports those services that are supported on AWS (`catalog` and `scenes`). This support may break at any point in the future, so it is strictly transitional.
- Removed many dependencies no longer required due to the removal of GCP-only features.
- Major overhaul of the internals of the config process. To prepare for supporting other clients using namespaced packages within the `descarteslabs` package, the top level has been cleaned up, and almost all the real code now lives inside `descarteslabs.core`. However, end users should never have to import anything from `descarteslabs.core`. No more magic packages means that `pylint` will work well with code using `descarteslabs`.
- GCP environments only support `catalog` and `scenes`. All other GCP-only features have been removed.
- A bug in the code handling the default context for image collections when working with a product with a CRS based on degrees rather than meters has been fixed. Resolutions should always be specified in the units used by the CRS.
- Fixed a bug causing `descarteslabs.workflows.map.geocontext()` to fail with an import error. This problem also affected the autoscaling feature of workflows map layers.
- Fixed a bug causing downloads of single-band images to fail when utilizing rasterio.
- Catalog V2 is now fully supported on the AWS platform, including user ingest.
- Catalog V2 has been enhanced to provide substantially all the functionality of the Scenes API. The `Image` class now includes methods such as `ndarray` and `download`. A new `ImageCollection` class has been added, mirroring `SceneCollection`. The various `Search` objects now support a new `collect` method which will return appropriate `Collection` types (e.g. `ProductCollection`, `BandCollection`, and of course `ImageCollection`). Please see the updated Catalog V2 guide and API documentation for more details.
- Previously, the internal implementation of the `physical_range` attribute on various band types was inconsistent with that of `data_range` and `display_range`. It has now been made consistent, which means it will either not be set, or will contain a 2-tuple of float values. It is no longer possible to explicitly set it to `None`.
- Access permissions for bands and images are now managed directly by the product. The `readers`, `writers`, and `owners` attributes have been removed from all the `*Band` classes as well as the `Image` class. Also, the `Product.update_related_objects_permissions` and `Product.get_update_permissions_status` methods have been removed as these are no longer necessary or supported.
- All searches for bands (other than derived bands) and images must specify one or more product ids in the filtering. This requirement can be met by using the `bands()` and `images()` methods of a product to limit the search to that product, or through a `filter(properties.product_id == ...)` clause on the search.
- Products have a new `product_tier` attribute, which can only be set or modified by privileged users.
- The `Image.upload_ndarray` method will now accept either an ndarray or a list of ndarrays, allowing multiple files per image. The band definitions for the product must correspond to the order and properties of the multiple ndarrays.
- With the addition of the Scenes functionality to Catalog V2, you are strongly encouraged to migrate your Scenes-based code to use Catalog V2 instead. Scenes will be deprecated in a future release. Some examples of migrating from Scenes to Catalog V2 are included in the Catalog V2 guide. In the meantime the Scenes API has been completely reimplemented to use Catalog V2 under the hood. From a user perspective, existing code using the Scenes API should continue to function as normal, with the exception of a few differences around some little-used dark corners of the API.
- The Scenes `search_bands` now enforces the use of a non-empty `products=` parameter value. This was previously documented but not enforced.
- With the addition of the Scenes functionality to Catalog V2, you are strongly encouraged to migrate your Metadata-based code to use Catalog V2 instead. Metadata will be deprecated in a future release.
- As with Catalog and Scenes, one or more products must now be specified when searching for bands or images.
- The Raster client API now requires a `bands=` parameter for all rastering operations, such as `raster`, `ndarray`, and `stack`. It no longer defaults to all bands defined on the product.
- An off-by-1/2-pixel problem was identified in the coordinate transforms underlying `DLTile.rowcol_to_latlon` and `DLTile.latlon_to_rowcol`. The problem has been corrected, and you can expect to see slight differences in the results of these two methods.
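Off-by-half-pixel errors of this kind typically come from conflating a pixel's corner with its center in a geotransform. A generic illustration of the half-pixel offset (not the DLTile code itself):

```python
def pixel_corner(origin, res, index):
    """Coordinate of a pixel's upper-left corner along one axis."""
    return origin + index * res


def pixel_center(origin, res, index):
    """Coordinate of a pixel's center: the corner shifted by half a pixel."""
    return origin + (index + 0.5) * res


# Using the corner where the center is meant shifts everything by res / 2.
print(pixel_center(0.0, 10.0, 3) - pixel_corner(0.0, 10.0, 3))  # 5.0 == res / 2
```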
- All the REST client types, such as `Metadata` and `Raster`, now support the `get_default_client()` and `set_default_client()` methods. This functionality was previously limited to the Catalog V2 `CatalogClient`. Whenever such a client is required, the client libraries use `get_default_client()` rather than the default constructor. This makes it easy to comprehensively redirect the library to use a specially configured client when necessary.
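The `get_default_client()` / `set_default_client()` pattern is essentially a per-class singleton registry; a generic sketch of the idea (illustrative, not the library's code):

```python
class Client:
    """Generic sketch of the default-client pattern."""

    _default = None

    @classmethod
    def get_default_client(cls):
        # Lazily create and cache a shared default instance.
        if cls._default is None:
            cls._default = cls()
        return cls._default

    @classmethod
    def set_default_client(cls, client):
        # Redirect all code that asks for the default client.
        cls._default = client


shared = Client.get_default_client()
print(shared is Client.get_default_client())  # True: the same instance is reused

special = Client()
Client.set_default_client(special)
print(Client.get_default_client() is special)  # True: library code now uses it
```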
- The `GeoContext` types that originally were part of the Scenes package are now available in the new `descarteslabs.geo` package, with no dependencies on Scenes. This is the preferred location from which to import these classes.
- The `descarteslabs.utils` package, added in the previous release for the AWS client only, now exists in the GCP client as well, and is the preferred location to pick up the `DotDict` and `DotList` classes, the `display` and `save_image` functions, and the `Properties` class for property filtering in Catalog V2.
- The `display` method now has added support for multi-image plots; see the API documentation for the `figsize`, `nrows`, `ncols`, and `layout_direction` parameters.
- The `property_filtering.GenericProperties` class has been replaced with `property_filtering.Properties`, but remains for backwards compatibility.
- Property filters now support `isnull` and `isnotnull` operations. This can be very useful for properties which may or may not be present, e.g. `properties.cloud_fraction.isnull | properties.cloud_fraction <= 0.2`.
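The semantics of that filter can be checked locally with plain Python (a simulation on dicts, not the Catalog V2 client): `isnull` matches items where the property is absent, so the combined filter keeps images that either lack `cloud_fraction` or have a small value.

```python
# Plain-Python simulation of the filter
#   properties.cloud_fraction.isnull | properties.cloud_fraction <= 0.2
# The image dicts below are illustrative, not real catalog records.

def matches(props):
    cf = props.get("cloud_fraction")
    return cf is None or cf <= 0.2

images = [
    {"id": "a"},                         # no cloud_fraction at all -> kept
    {"id": "b", "cloud_fraction": 0.1},  # low cloud cover -> kept
    {"id": "c", "cloud_fraction": 0.9},  # too cloudy -> dropped
]
kept = [img["id"] for img in images if matches(img)]
# kept == ["a", "b"]
```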
- The `Config` exceptions `RuntimeError` and `KeyError` were changed to `ConfigError` exceptions from `descarteslabs.exceptions`.
- `Auth` now retrieves its URL from the `Config` settings. If no valid configuration can be found, it reverts to the commercial service (`https://app.descarteslabs.com`).
- Dependencies for the descarteslabs library have been updated, but remain constrained to continue to support Python 3.7.
- Numerous bug fixes.
- The extra requirement options have changed. There are now four extra requirement options: `visualization`, `tables`, `complete`, and `tests`. `visualization` pulls in extra requirements to support operating in a Jupyter notebook or environment, enabling interactive maps and graphical displays; it is not required for operating in a "headless" manner. `tables` pulls in extra requirements to support the `Tables` client. `complete` is the combination of `visualization` and `tables`. `tests` pulls in extra requirements for running the tests. As always, `pip install 'descarteslabs[complete]'` will install a fully enabled client.
- The Descartes Labs client now supports configuration for operating in different environments. By default, the client will configure itself for standard usage against the GCP platform (`"gcp-production"`), except in the case of AWS Marketplace users, for whom the client will configure itself against the AWS platform (`"aws-production"`). Alternate environments can be configured by setting the `DESCARTESLABS_ENV` environment variable before starting python, or by using a prelude like `from descarteslabs.config import Settings; Settings.select_env("environment-name")` before any other imports of any part of the descarteslabs client package.
- The new AWS Enterprise Accelerator release currently includes only Auth, Configuration and the Scenes client.
- The `descarteslabs.client.auth` package has moved to `descarteslabs.auth`. It is still imported into the original location at `descarteslabs.client.auth` so existing code continues to work, but new code should use the new location.
- The `descarteslabs.client.exceptions` module has moved to `descarteslabs.exceptions`. It is still imported into the original location at `descarteslabs.client.exceptions` so existing code continues to work, but new code should use the new location.
- Fixed an issue in `scenes.DLTile.from_shape` where there would be incomplete coverage of certain geometries. The function may now return more tiles than before.
- Added support for the new `all_touched` parameter to the different `GeoContext` types. Default behavior remains the same as always, but if you set `all_touched=True` this communicates to the raster service that you want the image(s) rastered using GDAL's `CUTLINE_ALL_TOUCHED` option, which changes how source pixels are mapped to output pixels. This mode is only recommended when using an AOI which is smaller than the source imagery pixel resolution.
- The DLTile support has been fixed to avoid generating gaps when tiling regions that span a large distance north-to-south and straddle meridians which are boundaries between UTM zones. Methods such as `DLTile.from_shape` may therefore return more tiles than previously, but will properly cover the region.
- Added support for retrieving products and bands.
- Methods added: `get_product`, `get_band`, `get_derived_band`, `search_products`, `search_bands`, `search_derived_bands`.
- Disallows search without a `products` parameter.
- Scaling support has been enhanced to understand processing levels for newer products. The `Scene.scaling_parameters` and `SceneCollection.scaling_parameters` methods now accept a `processing_level` argument, and this will be factored into the determination of the default result data type and scaling for all rastering operations such as `Scene.ndarray` and `SceneCollection.mosaic`.
rasterio
package (which implies providing GDAL), then rasterio will be used to save any downloaded images as GeoTIFF, allowing for the use of compression. Otherwise, by default thetifffile
support will be used to generate the GeoTIFF files but compression is not supported in this mode. - As the Places client has been deprecated, so has any use of the
place=
parameter supported by several of the Scenes functions and methods.
- (Core users only) Added support for specifying the image index to use when creating a new `Product`.
- Added support for defining per-processing-level `data_type`, `data_range`, `display_range`, and `physical_range` properties on processing level steps.
- Added support for filtering `Assets` by type and name fields.
  - Supported filter types: `blob`, `folder`, `namespace`, `sym_link`, `sts_model`, and `vector`. Specifying multiple types will find assets matching any given type.
  - The name field supports the following wildcards:
    - `*` matches 0 or more of any character.
    - `?` matches 1 of any character.
  - Find assets matching type `blob` and having a display name of `file name.json` or `file2name.txt` but not `filename.json`:
    - `Discover().list_assets("asset/namespace/org:some_org", filters="type=blob&name=file?name.*")`
    - `Discover().list_assets("asset/namespace/org:some_org", filters=AssetListFilter(type=AssetType.BLOB, name="file?name.*"))`
  - Find assets of type `blob` or `vector`:
    - `Discover().list_assets("asset/namespace/org:some_org", filters="type=blob,vector")`
    - `Discover().list_assets("asset/namespace/org:some_org", filters=AssetListFilter(type=[AssetType.BLOB, AssetType.VECTOR], name="file?name.*"))`
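These name wildcards follow shell-style matching, which Python's standard-library `fnmatch` module reproduces, so you can sanity-check a pattern locally before sending it to the service:

```python
# Local check of the wildcard semantics described above using the
# stdlib fnmatch module ("*" = any run of characters, "?" = exactly one).
from fnmatch import fnmatchcase

pattern = "file?name.*"
assert fnmatchcase("file name.json", pattern)      # "?" matches the space
assert fnmatchcase("file2name.txt", pattern)       # "?" matches "2"
assert not fnmatchcase("filename.json", pattern)   # "?" must match one char
```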
- `Metadata.products` and `Metadata.available_products` now properly implement paging, so that by default a `DotList` containing every matching product accessible to the user is returned.
- If the user provides the `rasterio` package (which implies providing GDAL), then rasterio will be used to save any downloaded images as GeoTIFF, allowing for the use of compression. Otherwise, by default the `tifffile` support will be used to generate the GeoTIFF files, but compression is not supported in this mode.
- Fixed an issue that caused a user's schema to be overwritten if they didn't provide a primary key on table creation.
- Now uses Discover backend filtering for `list_tables()` instead of filtering on the client, to improve performance.
- `list_tables()` now supports filtering tables by name, e.g. `Tables.list_tables(name="Test*.json")`.
- New Tasks images for this release bump the versions of several dependencies; please see the Tasks guide for detailed lists of dependencies.
- The new Workbench release bumps the versions of several dependencies.
- Added support for the new `all_touched` parameter to the different `GeoContext` types. See the description above under Scenes.
- The Places client has been deprecated, and use thereof will generate a deprecation warning.
- The older Catalog V1 client has been deprecated, and use thereof will generate a deprecation warning. Please use the Catalog V2 client in its place.
- Documentation has been updated to include the "AWS Enterprise Accelerator" release.
- With Python 2 far in the rearview mirror, the dependencies on the `six` python package have been removed throughout the library, the distribution, and all tasks images.
- Added support for Python 3.9.
- Removed support for Python 3.6 which is now officially End Of Life.
- Added support for organizational sharing. You can now share using the `Organization` type: `workflows.add_reader(Organization("some_org"))`
- Added support for organizational sharing. You can now share using the `Organization` type: `asset.share(with_=Organization("some_org"), as_="Viewer")`
- Allow users to list their organization's namespace: `Discover().list_asset("asset/namespace/org:some_org")`
- Allow users to list their organization's users: `Discover().list_org_users()`
- Added an alpha Tables client. The Tables module lets you organize, upload, and query tabular data and vector geometries. As an alpha release, we reserve the right to modify the Tables client API without any guarantees about backwards compatibility. See the Tables API and Tables Guide documentation for more details.
- Added the `progress=` parameter to the various rastering methods such as `Scene.ndarray`, `Scene.download`, `SceneCollection.mosaic`, `SceneCollection.stack`, `SceneCollection.download`, and `SceneCollection.download_mosaic`. This can be used to enable or disable the display of progress bars.
- Support for Python 3.9 images has been added, and support for Python 3.6 images has been removed.
- Many of the add-on packages have been upgraded to more recently released versions. In particular, `tensorflow` was updated from version 2.3 to version 2.7.
- GPU support was bumped up from CUDA 10 to CUDA 11.2.
- Fixed a bug preventing retry-able errors (such as a 429) from being retried.
- Retrieving an `Attribute` as a class attribute is now allowed; it used to raise an exception.
- Fixed a bug preventing the user from writing JPEG files with smaller than 256x256 tiles.
- Allow specifying a `NoData` value for non-JPEG GeoTIFF files.
- Include band description metadata in created GeoTIFF files.
- Support scaling parameters as lists as well as tuples.
- Add caching of band metadata to drastically reduce the number of metadata queries when creating `SceneCollection`s.
- `DLTiles.from_shape` was failing to handle shape objects implementing the `__geo_interface__` API, most notably several of the Workflows `GeoContext` types. These now work as expected.
- Certain kinds of network issues could lead to rastering operations raising an `IncompleteRead` exception. This is now correctly caught and retried within the client library.
- Users can now use `descarteslabs.tasks.update_credentials()` to update their task credentials in case they have become outdated.
- We have introduced a hard limit of 120 outstanding Workflows compute jobs per user. This limit exists to minimize situations in which a user is unable to complete jobs in a timely manner, by ensuring resources cannot be monopolized by any individual user. The API that backs the calls to `compute` will return a `descarteslabs.client.grpc.exceptions.ResourceExhausted` error if the caller has too many outstanding jobs. Prior to this release (1.9.0), these failures would be retried up to some small retry limit. With the latest client release, however, the client will fail without retrying on an HTTP 429 (rate limit exceeded) error. For users with large (non-interactive) workloads who don't mind waiting, we added a new `num_retries` parameter to the `compute` function; when specified, the client will handle any 429 errors and retry up to `num_retries` times.
- Workflows is currently optimized for interactive use cases. If you are submitting large numbers of long-running Workflows compute jobs with `block=False`, you should consider using Tasks and Scenes rather than the Workflows API.
- Removed `ResourceExhausted` exceptions from the list of exceptions we automatically catch and retry on for `compute` calls.
- Lots of improvements, additions, and clarifications in the API documentation.
- The Workflows client no longer validates `processing_level` parameter values, as these have been enhanced to support new products and can only be validated server-side.
- Catalog V2 bands now support the `vendor_band_name` field (known as `name_vendor` in Metadata/Catalog V1).
- Scenes support for masking in version 1.8.1 had some regressions which have been fixed. For this reason, version 1.8.1 has been pulled from PyPI.
- New task groups now default to a `maximum_concurrency` value of 5, rather than the previous 500. This avoids the common problem of deploying a task group with newly developed code and having it scale up, turning small problems into big problems! You may still set values as large as 500.
- The Tasks client now provides an `update_group()` method which can be used to update many properties of an existing task group, including but not limited to `name`, `image`, `minimum_concurrency`, and `maximum_concurrency`.
- Improved testing across several sub-packages.
- Various documentation fixes.
**Version Deprecated**: Due to some regressions in the Scenes API, this version has been removed from PyPI.
- Added a new `common.dltile` library that performs geospatial transforms and tiling operations.
- Upgraded various dependencies: `requests[security]>=2.25.1,<3`, `six>=1.15.0`, `blosc==1.10.2`, `mercantile>=1.1.3`, `Pillow>=8.1.1`, `protobuf>=3.14.0,<4`, `shapely>=1.7.1,<2`, `tqdm>=4.32.1`, `traitlets>=4.3.3,<6;python_version<'3.7'`, `traitlets==5.0.5,<6;python_version>='3.7'`, `markdown2>=2.4.0,<3`, `responses==0.12.1`, `freezegun==0.3.12`, `imagecodecs>=2020.5.30;python_version<'3.7'`, `imagecodecs>=2021.5.20;python_version>='3.7'`, `tifffile==2020.9.3;python_version<'3.7'`, `tifffile==2021.4.8;python_version>='3.7'`
- Added an alpha Discover client. Discover allows users to organize and share assets with other users. As an alpha release, we reserve the right to modify the Discover client API without any guarantees about backwards compatibility. See the Discover API documentation for more details.
- [breaking] Image (Scene) metadata now accepts and returns the `bucket` and `directory` fields as lists of strings, of a length equal to that of the `files` field. This allows the file assets making up an image to live in different locations. When creating new images, a simple string can still be provided for these fields; it will automatically be converted to a list of (duplicated) strings as necessary. As most users will never interact with these fields, the change should not affect user code.
- The `derived_params` field for Image (Scene) metadata is now supported for product-specific, service-implemented "native derived bands", which may only be created for core products.
- Scenes now uses the client-side `dltile` library to make DLTiles. This improves performance when creating a large number of DLTile objects.
- Scenes `DLTile.from_shape` now has a parameter to return tile keys only, instead of full tile objects. Usage details can be found in the docs.
- Scenes DLTile now has new methods: `iter_from_shape`, which takes the same arguments as `from_shape` but returns an iterator; `subtile`, which adds the ability to subdivide tiles; and `rowcol_to_latlon` and `latlon_to_rowcol`, which convert pixel coordinates to spatial coordinates and vice versa.
- Scenes DLTile now has a new parameter `tile_extent`, which is the total size of the tile in pixels including padding. Usage details can be found in the docs.
- [breaking] Removed the dependence on `Raster` for tiling. The `raster_client` parameter has been removed from the `from_latlon`, `from_key`, `from_shape`, and `assign` DLTile methods.
- Tiling using `from_shape` may return a different number of tiles compared to previous versions under certain conditions. These tiles are usually found in overlapping areas between UTM zones and should not affect the overall coverage.
- DLTile geospatial transformations are guaranteed to be within eight decimal points of the past implementation.
- DLTile errors now come from the `dltile` library, and error messages should now be more informative.
- When specifying output bounds in a spatial reference system different from the underlying raster, a densified representation of the bounding box is used internally to ensure that the returned image fully covers the bounds. For certain methods (like `mosaic`) this may change the returned image dimensions, depending on the SRSs involved.
- [breaking] As with the Metadata v1 client changes, the `bucket` and `directory` fields of the Scene properties are now multi-valued lists.
- Scenes does not support writing GeoTIFFs to file-like objects. Non-JPEG GeoTIFFs are always uncompressed.
- `dltiles_from_shape`, `dltiles_from_latlon`, and `dltile` have been removed. It is strongly recommended to test any existing code which uses the Raster API when upgrading to this release.
- Fully masked arrays are now supported and are the default. Usage details can be found in the docs.
- Added support for drawing a progress bar. Usage details can be found in the docs.
- The signature and return value of `Raster.raster()` have changed. The `save=` parameter has been removed, as the resulting download is always saved to disk, to a file named by the `outfile_basename=` parameter. The method returns a tuple containing the name of the resulting file and the metadata for the retrieval, which is now an ordinary Python dictionary.
- As with Scenes, when specifying output bounds in a spatial reference system different from the underlying raster, a densified representation of the bounding box is used internally to ensure that the returned image fully covers the bounds. For certain methods (like `mosaic`) this may change the returned image dimensions, depending on the SRSs involved.
Internal release only. See 1.8.1 above.
- Upgraded various dependencies: `blosc==1.10.2`, `cachetools>=3.1.1`, `grpcio>=1.35.0,<2`, `ipyleaflet>=0.13.3,<1`, `protobuf>=3.14.0,<4`, `pyarrow>=3.0.0`, `pytz>=2021.1`
- Upgraded from using Travis to GitHub Actions for CI.
- Added support for the `physical_range` property on `SpectralBand` and `MicrowaveBand`.
- Workflows sharing. Support has been added for managing the sharing of `Workflow` objects with other authorized users. The `public` option for publishing workflows has been removed now that `Workflow.add_public_reader()` provides the equivalent capability. See the Workflows Guide.
- Lots of improvements to the API documentation and the Workflows Guide.
- Allow constructing `Float` instances from literal Python integers.
Fixes a few buglets which slipped through. This release continues to use the Workflows channel `v0-18`.
- Fixed a problem with the defaulting of the visual options when generating tile URLs, making it possible to toggle the checkerboard option on a layer and see the difference.
- Support `axis=list(...)` for `Image`.
- Corrected the results of doing arithmetic on two widgets (e.g. adding two `IntSlider`s together should yield an `Int`).
- For single-band imagery, `VizOption` will accept a single two-tuple for the `scales=` argument.
- Python 3.6 is now deprecated, and support will be removed in the next version.
- Added support to Bands for new processing levels and processing step specifications to support Landsat Collection 2.
- The new channel `v0-18` utilizes a new and improved backend infrastructure. Any previously saved workflows and jobs from earlier channels are not accessible from the new infrastructure, so you will need to recreate and persist (e.g. publish) new versions using `v0-18`. Older releases and older channels can continue to access your originals if needed.
- `wf.widgets` lets you quickly explore data interactively. Add widgets anywhere in your code just like normal values, and the widgets will display automatically when you call `.visualize`.
- View shared Workflows and XYZs in GIS applications using WMTS. Get the URL with `wf.wmts_url()`, `XYZ.wmts_url()`, or `Workflow.wmts_url()`.
- Create publicly-accessible tiles and WMTS endpoints with `wf.XYZ(..., public=True)`. Anyone with the URL (which is a cryptographically random ID) can view the data, no login required. Set `days_to_expiration` to control how long the URL lasts.
- `wf.XYZ.list()` to iterate through all XYZ objects you've created, and `XYZ.delete` to delete them.
- Set default visualization options (scales, colormap, bands, etc.) in `.publish` or `wf.XYZ` with `wf.VizOption`. These `viz_options` are used when displaying the published object in a GIS application, or with `wf.flows`.
- `ImageCollection.visualize()`: display ImageCollections on `wf.map`, and select the reduction operation (mean, median, mosaic, etc.) interactively.
- `Image.reduction()` and `ImageCollection.reduction()` (like `ic.reduction("median", axis="images")`) to reduce an Image/ImageCollection with an operation provided by name.
- `wf.map.controls` is accessible (you had to do `wf.map.map.controls` before).
- Access the parameters used in a Job with `Job.arguments` and `Job.geoctx`.
- Errors like `In 'or': : operand type(s) all returned NotImplemented from __array_ufunc__` when using the bitwise-or operator `|` are resolved.
- Errors when using computed values in the `wf.Datetime` constructor (like `wf.Datetime(wf.Int(2019) + 1)`) are resolved.
- `wf.Timedelta` can be constructed from floats, and supports all binary operations that Python does (support for `/`, `//`, `%`, `*` added).
- In `.rename_bands`, prohibit renaming a band to a name that already exists in the Image/ImageCollection. Previously, this would succeed, but cause downstream errors.
- `.bandinfo.get("bandname", {})` now works; before, providing `{}` would fail with a `TypeError`.
- Indexing an `Any` object (like `wf.Any({"foo": 1})["foo"]`) behaves correctly.
- `wf.Datetime`s constructed from strings containing timezone information are handled correctly.
- `.mask(new_mask)` ignores masked pixels in `new_mask`. Previously, masked pixels in `new_mask` were considered True, not False. Note that this is the opposite of NumPy's behavior.
- If you `.publish` an object that depends on `wf.parameter`s or `wf.widgets`, it's automatically converted into a `wf.Function`.
- [breaking] `.compute` and `.inspect` no longer accept extra arguments that aren't required for the computation. If the object doesn't depend on any `wf.parameter`s or `wf.widgets`, passing extra keyword arguments will raise an error. Similarly, not providing keyword arguments for all parameters the object depends on will raise an error.
- [breaking] The `wf.XYZ` interface has changed; construct an XYZ with `wf.XYZ(...)` instead of `wf.XYZ.build(...).save()`.
- Set `days_to_expiration` on `XYZ` objects. After this many days, the object is deleted.
- `Job` metadata is deleted after 10 days; `wf.Job.get(...)` on a job ID more than 10 days old will fail. Note that Job results have always been deleted after 10 days; now the metadata expires as well.
- `wf.Function` has better support for named arguments. Now, `f = wf.Function[{'x': wf.Int, 'y': wf.Str}, wf.Int]` requires two arguments `x` and `y`, and they can be given positionally (`f(1, "hi")`), by name in any order (`f(x=1, y="hi")` or `f(y="hi", x=1)`), or both (`f(1, y="hi")`).
- `wf.Function.from_callable` will generate a Function with the same names as the Python function you decorate or pass in. Therefore, when using `@wf.publish` as a decorator, the published Function will automatically have the same argument names as your Python function.
- Python 3.8 is now supported in the client.
- As Python 3.5 has reached End Of Life, it is no longer supported by the descarteslabs client.
- Altered the behavior of Task function creation. Deprecation warnings will be issued when attempting to create a Task function for which support will be removed in the near future. It is strongly recommended to test any existing code which uses the Tasks client when upgrading to this release.
- New tasks public images for use with Python 3.8 are available.
- `.pick_bands` supports proxy `wf.Str` objects; `.unpack_bands` supports `wf.Str` and `wf.Tuple[wf.Str, ...]`.
- Better performance constructing a `wf.Array` from a `List` of numbers (like `wf.Array(ic.sum(["pixels", "bands"]))`).
- No more error using `@wf.publish` as a decorator on a function without a docstring.
- No more irrelevant `DeprecationWarning`s when importing the `descarteslabs` package (#235). Deprecated functionality in the package will now show `FutureWarning`s instead.
- `wf.map.geocontext` doesn't raise an error about the CRS of the map.
- `wf.flows` doesn't raise an error about versions from incompatible channels.
- Example code has been cleaned up.
- Sharing of any Workflows object as a `Workflow` with version and access control. Browse through shared `Workflow`s with the `wf.flows` browser widget.
- Upload images to the DL catalog from Workflows jobs. Usage details can be found in the docs.
- `wf.np.median`
- `Job.cancel()` to cancel running jobs.
- Transient failures in Jobs are automatically retried, resulting in fewer errors.
- Search widget on `wf.map` by default.
- Bitwise operations on imagery no longer fail.
- `wf.np.linspace` no longer fails when being called correctly.
- `.median` is slightly less prone to OOM errors.
- Breaking: Workflows sharing: `wf.publish()` and `wf.use()` have new signatures, `wf.retrieve()` has been removed in favor of `wf.Workflow.get()` and `wf.VersionedGraft.get_version()`, and the `wf.Workflow` object has been completely refactored. Detailed information is in the docs.
- `Array.to_imagery` now accepts `KnownDict` for bandinfo and properties.
- `Number`s can now be constructed from `Str`s.
- Output formats for `.compute` including GeoTIFF, JSON, PyArrow, and MessagePack. Usage details can be found in the docs.
- Destinations for Job results: download and email. Usage details can be found in the docs.
- Save `.compute` outputs to a file with the `file=` argument.
- Pixel value inspector: click in the map widget to view pixel values.
- `wf.ifelse` for simple conditional logic.
- NumPy functions including `hypot`, `bitwise_and`, `bitwise_or`, `bitwise_xor`, `bitwise_not`, `invert`, and `ldexp`.
- Bitwise `Array` and `MaskedArray` operations.
- `size` attribute on `Array` and `MaskedArray`.
- `astype` function on `Array` and `MaskedArray` for changing the dtype.
- `flatten` function on `Array` and `MaskedArray` for flattening into a 1D array.
- `MaskedArray.compressed` for getting all unmasked data as a 1D array.
- `get` function on `Dict` and `KnownDict` for providing a default value if a key does not exist.
- `nbands` attribute on `Image` and `ImageCollection`.
- `proxify` can handle `scenes.GeoContext`s.
- `Dict.contains`, `Dict.length`
- Fewer failures and hanging calls when connecting to the Workflows backend (like `.compute`, `.visualize`, `Job.get`, etc.).
- `wf.numpy.histogram` works correctly with computed values for `range` and `bins` (such as `range=[arr.min(), arr.max()]`).
- More consistent throughput when a large number of jobs are submitted.
- `Array`s can now be constructed from proxy `List`s.
- `MaskedArray.filled` works correctly when passed Python values.
- Long-running sessions (like Jupyter kernels) refresh credentials instead of failing with auth errors after many hours of use.
- `wf.numpy.dot` and `wf.numpy.einsum` no longer fail when being called correctly.
- Occasional errors like `('array-89199362e9a5d598fb5c82805136834d', 0, 0)` when calling `wf.compute()` with multiple values are resolved.
- `pick_bands` accepts duplicate band names. Enjoy easier Sentinel-1 `"vv vh vv"` visualizations!
- `ImageCollection.from_id` is always ordered by date.
- `wf.numpy.percentile` no longer accepts an `axis` argument.
- [breaking] `wf.Job` construction and interface changes:
  - Use a single `wf.Job(...)` call instead of `wf.Job.build(...).execute()` to create and launch a Job.
  - New `Job.result_to_file` method.
  - `Job.status` is removed in favor of a single `Job.stage`.
  - `wf.TimeoutError` renamed to `wf.JobTimeoutError`.
- 191 functions from NumPy are available for Workflows `Array`s, including parts of the `numpy.linalg` and `numpy.ma` submodules. See the full list in the docs.
- `index_to_coords` and `coords_to_index` methods on `Image`/`ImageCollection`/`GeoContext` for converting between geospatial and array coordinates.
- `value_at` function on `Image` and `ImageCollection` for extracting single pixel values at spatial coordinates.
- Using datetimes as parameters to `visualize` behaves correctly.
- Fixed a bug that prevented uploading ndarrays of type `uint8`.
- Array support for `argmin`, `argmax`, `any`, `all`.
- `pick_bands` supports an `allow_missing` kwarg to drop band names that may be missing from the data without an error.
- `wf.compute` supports passing lists or tuples of items to compute at the same time. Passing multiple items to `wf.compute`, rather than calling `obj.compute` for each separately, is usually faster.
- Casting from `Bool` to `Int`: `wf.Int(True)`.
- Experimental `.inspect()` method for small computations during interactive use.
- [breaking] Array no longer uses type parameters: now you construct an Array with `wf.Array([1, 2, 3])`, not `wf.Array[wf.Int, 1]([1, 2, 3])`. Remember, Array is an experimental API and will continue to make frequent breaking changes!
- Workflows now reuses the same gRPC client by default, so repeated or parallel calls to `.compute`, etc. will be faster. Calling `.compute` within a thread pool will also be significantly more efficient.
- `wf.numpy.histogram` correctly accepts a `List[Float]` as the `range` argument.
1.1.2 fixes a bug which caused Workflows map layers to behave erratically when changing colormaps.

1.1.1 fixes a packaging issue that caused `import descarteslabs.workflows` to fail. It also makes NumPy an explicit dependency. NumPy was already a transitive dependency, so this shouldn't cause any changes.

You should NOT install version 1.1.0; 1.1.1 should be used instead in all circumstances.
- `Image.upload()` now emits a deprecation warning if the image has a `cs_code` or `projection` property. The projection defined in the uploaded file is always used and applied to the resulting image in the Catalog.
- `Image.upload_ndarray()` now emits a deprecation warning if the image has both a `cs_code` and a `projection` property. Only one of them may be supplied, and `cs_code` is given preference.
- `SceneCollection.download_mosaic` has new default behavior for `mask_alpha`, wherein the `alpha` band will be used as a mask by default if it is available for all scenes in the collection, even if it is not specified in the list of bands.
- Experimental Array API following the same syntax as NumPy arrays. It supports vectorized operations, broadcasting, and multidimensional indexing.
  - The `ndarray` attribute of `Image` and `ImageCollection` will return a `MaskedArray`.
  - Over 60 NumPy ufuncs are now callable with Workflows `Array`.
  - Includes other useful `Array` functions like `min()`, `median()`, `transpose()`, `concatenate()`, `stack()`, `histogram()`, and `reshape()`.
- `ImageCollection.sortby_composite()` for creating an argmin/argmax composite of an `ImageCollection`.
- Slicing of `List`, `Tuple`, `Str`, and `ImageCollection`.
- `wf.range` for generating a sequence of numbers between start and stop values.
- `ImageCollectionGroupby.mosaic()` for applying `ImageCollection.mosaic` to each group.
- `wf.exp()`, `wf.square()`, `wf.log1p()`, `wf.arcsin()`, `wf.arccos()`, and `wf.arctan()`.
- `Datetime.is_between()` for checking if a `Datetime` falls within a specified date range.
- `FeatureCollection.contains()`.
- Container operations on `GeometryCollection` including:
  - `GeometryCollection.contains()`
  - `GeometryCollection.sorted()`
  - `GeometryCollection.map()`
  - `GeometryCollection.filter()`
  - `GeometryCollection.reduce()`
- `List` and `Tuple` can now be compared with other instances of their type via `__lt__()`, `__eq__()`, etc.
- `List.__add__()` and `List.__mul__()` for concatenating and duplicating `List`s.
- Products without an alpha band and `nodata` value are rejected, instead of silently producing unwanted behavior.
- `ImageCollection.concat_bands` now throws a better error when trying to concatenate bands from another `ImageCollection` that is not the same length.
- `Any` is now promotable to all other types automatically.
- Better error when trying to iterate over Proxytypes.
- Interactive map: calls to `visualize` now clear layer errors.
- Interactive map: when setting scales, invalid values are highlighted in red.
- Interactive map: a scalebar is shown on the bottom-left by default.
- `ImageCollection.mosaic()` is now in "last-on-top" order, which matches GDAL and `dl.raster`. Use `mosaic(reverse=True)` for the same ordering as in v1.0.0.
- Better errors when specifying invalid type parameters for Proxytypes that require them.
- Field access on `Feature`, `FeatureCollection`, `Geometry`, and `GeometryCollection` no longer fails.
- In `from_id`, processing level 'cubespline' no longer fails.
As of January 1st, 2020, the client library no longer supports Python 2. For more information, please contact [email protected]. For help with porting to Python 3, please visit https://docs.python.org/3/howto/pyporting.html.
- There is an entirely new backend supporting asynchronous uploads of image files and ndarrays with the catalog client. There are minor changes to the `ImageUpload` class (a new `events` field has subsumed `errors`, and the `job_id` field has been removed), but the basic interface is unchanged, so most code will keep functioning without any changes.
- It is now possible to cancel image uploads.
- Error messages are now easier to read.
- Many improvements to the documentation.
- You can now create or retrieve an existing object using the `get_or_create` method.
- Retrieving a `Band` or `Image` by name is now possible by calling `get_band` or `get_image` on the `Product` instance. You can also use the Product's `named_id` function to get a complete id for images and bands.
- A new convenience function `make_valid_name` on the `Image` and `Band` classes will return a sanitized name without invalid characters.
- A new property `ATTRIBUTES` enumerates which attributes are available for a specific catalog object.
- Trying to set an attribute that does not exist will now raise `AttributeError`.
- `update_related_objects_permissions()` should no longer fail with a JSON serialization error.
- Setting a read-only attribute will now raise an `AttributeValidationError`.
- Saving a new object while one with the same id already exists will now raise a `ConflictError` instead of `BadRequestError`.
- If a retrieved object has since been deleted from the catalog, saving any changes or trying to reload it will now raise a `DeletedObjectError`.
- Resolution fields now accept string values such as "10m" or "0.008 degrees". If the value cannot be parsed, an `AttributeValidationError` will be raised.
- Changes to the `extra_properties` attribute are now tracked correctly.
- This release no longer supports Python 2.
- This package is now distributed as a Python 3 wheel which will speed up installation.
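Resolution strings like "10m" or "0.008 degrees", now accepted by catalog resolution fields as noted above, can be understood as a (value, unit) pair. A small illustrative parser is sketched below; this is a hypothetical helper, and the client's actual parsing logic and accepted unit spellings may differ:

```python
import re

# Hypothetical pattern mirroring the documented examples: a number
# optionally followed by a unit ("m" or "degrees").
_RESOLUTION_RE = re.compile(r"^\s*(\d+(?:\.\d+)?)\s*(m|meters|degrees)\s*$")

def parse_resolution(text):
    """Parse '10m' or '0.008 degrees' into (value, unit); reject anything else."""
    match = _RESOLUTION_RE.match(text)
    if match is None:
        # The client raises AttributeValidationError; a plain ValueError here.
        raise ValueError(f"cannot parse resolution: {text!r}")
    value, unit = match.groups()
    return float(value), {"m": "meters"}.get(unit, unit)

print(parse_resolution("10m"))            # (10.0, 'meters')
print(parse_resolution("0.008 degrees"))  # (0.008, 'degrees')
```

Unparseable input raises, which matches the documented behavior of raising a validation error rather than silently guessing.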
- Handling of missing data via empty ImageCollections:
  - `ImageCollection.from_id` returns an empty ImageCollection if no data exist for the given time/place, rather than an error
  - `ImageCollection.filter` returns an empty ImageCollection if the predicate is False for every Image, rather than an error
  - `Image.replace_empty_with` and `ImageCollection.replace_empty_with` for explicitly filling in missing data
  - See the Workflows guide for more information
- Docstrings and examples on every class and function!
- Assigning new metadata to Image properties & bandinfo: `Image.with_properties()`, `Image.with_bandinfo()`
- Interactive map: colorbar legends on layers with colormaps (requires matplotlib)
- `Dict.from_pairs`: construct a Dict from a sequence of key-value pairs
- Map displays a fullscreen button by default ([breaking] if your code adds one, you'll now get two)
- `wf.concat` for concatenating `Image` and `ImageCollection` objects
- `ImageCollection.concat` now accepts `Image` objects; new `Image.concat` accepts `Image` or `ImageCollection`
- `ImageCollection.mosaic()`
- `FeatureCollection.sorted()`, `FeatureCollection.length()`, `FeatureCollection.__reversed__()`
- `GeometryCollection.length()`, `GeometryCollection.__reversed__()`
- `wf.zip` now supports `ImageCollection`, `FeatureCollection`, `GeometryCollection` as well as `List` and `Str`
- Get a GeoContext for the current bounds of the map in any resolution, shape, or CRS (including `"utm"`, which automatically picks the right UTM zone for you) with `wf.map.geocontext`. Also now returns a Scenes GeoContext for better introspection and use with Raster.
- Better backend type-checking displays the possible arguments for most functions if called incorrectly
- `arr_shape` included when calling `wf.GeoContext.compute()`
- More readable errors when communication with the backend fails
- Interactive map: layout handles being resized, for example setting `wf.map.layout.height = '1000px'`
- `Any` is no longer callable; `Any.cast` encouraged
- `remove_layer` and `clear_layers` moved from `wf.interactive.MapApp` class to `wf.interactive.Map` (non-breaking change)
- [possibly breaking] band renaming in binary operators only occurs when broadcasting: `red + red` is just `red`, rather than `red_add_red`. `red + blue` is still `red_add_blue`. Code which depends on accessing bands by name may need to change.
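The band-renaming rule above (same names preserved, differing names combined) can be expressed as a tiny naming function. This is a sketch of the convention only, using a hypothetical `result_band_name` helper that is not part of the client:

```python
def result_band_name(left, right, op="add"):
    """Name of the band produced by a binary op between two single bands.

    When both operands have the same band name, the name is preserved
    (red + red -> red); only when names differ is a combined name
    produced (red + blue -> red_add_blue).
    """
    if left == right:
        return left
    return f"{left}_{op}_{right}"

print(result_band_name("red", "red"))   # red
print(result_band_name("red", "blue"))  # red_add_blue
```

Code that previously looked up `red_add_red` after adding a band to itself would need to look up `red` instead.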
- `wf.where` propagates masks correctly, and handles metadata correctly with multi-band inputs
- `processing_level="surface"` actually returns surface-reflectance-processed imagery
- `ImageCollection.sorted()` works properly
- Viewing global-extent WGS84 images on the Workflows map no longer causes errors
- `List` proxytype no longer infinitely iterable in Python
- Repeated use of `axis="bands"` works correctly
- `ImageCollection.from_images` correctly aligns the bands of the inputs
- Numeric casting (`wf.Int(wf.Float(2.2))`) works as expected
- More descriptive error when constructing an invalid `wf.Datetime`
- Computing a single `Bool` value derived from imagery works correctly
- Update workflows client channel
- Workflows map UI is more stable: errors and layers won't fill the screen
- Catalog client: Added an `update()` method that allows you to update multiple attributes at once.
- Catalog client: Images and Bands no longer reload the Product after calling `save`
- Catalog client: Various attributes that are lists now correctly track changes when modifying them with list methods (e.g. `Product.owners.append("foo")`)
- Catalog client: Error messages generated by the server have a nicer format
- Catalog client: Fix a bug that caused waiting for tasks to never complete
- The minimum `numpy` version has been bumped to 1.17.14 for Python version > 3.5, which addresses a bug with `scenes.display`.
- `compute()` is noticeably faster
- Most of the Python string API is now available on `workflows.Str`
- Interactive map: more descriptive error when not logged in to iam.descarteslabs.com
- Passing the wrong types into functions causes more descriptive and reliable errors
- `RST_STREAM` errors when calling `.compute()` have been eliminated
- `Image/ImageCollection.count()` is much faster
- `.buffer()` on vector types now works correctly
- Calling `.compute()` on a `GeometryCollection` works
- Catalog client: Added a `MaskBand.is_alpha` attribute to declare alpha channel behavior for a band.
- The maximum number of `extra_properties` allowed for Catalog objects has been increased from 10 to 50.
- Fixed bug causing `SceneCollection.download` to fail.
- When you call `.compute()` on an `Image` or `ImageCollection`, the `GeoContext` is included on the result object (`ImageResult.geocontext`, `ImageCollectionResult.geocontext`)
- Passing a Workflows `Timedelta` object (instead of a `datetime.timedelta`) into functions expecting it now behaves correctly
- Arguments to the reducer function for `reduce` are now in the correct order
- A new catalog client in `descarteslabs.catalog` makes searching and managing products, bands and images easier. This client encompasses functionality previously split between the `descarteslabs.Metadata` and `descarteslabs.Catalog` clients, which are now deprecated. Learn how to use the new API in the Catalog guide.
- Property filtering expressions such as used in `scenes.search()` and `FeatureCollection.filter()` now support an `in_()` method.
- `SceneCollection.download` previously always returned successfully even if one or more of the downloads failed. Now if any of the downloads fail, a `RuntimeError` is raised, which will detail which destination files failed and why.
- Fixed a bug where geometries used with the Scenes client had coordinates with reduced precision.
- Interactive parameters: add parameters to map layers and interactively control them using widgets
- Spatial convolution with `wf.conv2d`
- Result containers have helpful `repr`s when displayed
- `Datetime` and `Timedelta` are unpacked into `datetime.datetime` and `datetime.timedelta` objects when computed.
- [breaking] Result containers moved to `descarteslabs/workflows/results` and renamed, appending "Result" to disambiguate (e.g. ImageResult and ImageCollectionResult)
- [breaking] `.bands` and `.images` attributes of ImageResult and ImageCollectionResult renamed to `.ndarray`
- [breaking] When `compute`-ing an `Image` or `ImageCollection`, the order of `bandinfo` is only correct for Python >= 3.6
- Interactive maps: coordinates are displayed in lat, lon order instead of lon, lat for easier copy-pasting
- Interactive maps: each layer now has an associated output that is populated when running autoscale and deleted when the layer is removed
- Interactive maps: `Image.visualize` returns a `Layer` object, making it easier to adjust `Layer.parameters` or integrate with other widgets
- Composing operations onto imported Workflows no longer causes nondeterministic errors when computed
- Interactive maps: `remove_layer` doesn't cause an error
- No more errors when creating a `wf.parameter` for `Datetime` and other complex types
- `.where` no longer causes a backend error
- Calling `wf.map.geocontext()` when the map is not fully initialized raises an informative error
- Operations on numbers computed from raster data (like `img_collection.mean(axis=None)`) no longer fail when computed
- Colormap succeeds when the Image contains only 1 value
- `Raster.stack` `max_workers` is limited to 25 workers, and will raise a warning and set the value to 25 if a value of more than 25 is specified.
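The `max_workers` cap described above follows a common clamp-and-warn pattern; a generic sketch of that pattern (a hypothetical `clamp_max_workers` helper, not the client's actual code):

```python
import warnings

MAX_WORKERS = 25  # documented ceiling for Raster.stack

def clamp_max_workers(max_workers):
    """Return max_workers, clamped to MAX_WORKERS with a warning if exceeded."""
    if max_workers > MAX_WORKERS:
        warnings.warn(
            f"max_workers={max_workers} exceeds the limit of {MAX_WORKERS}; "
            f"using {MAX_WORKERS} instead."
        )
        return MAX_WORKERS
    return max_workers

print(clamp_max_workers(10))  # 10
print(clamp_max_workers(40))  # 25, after emitting a UserWarning
```

Clamping with a warning (rather than raising) keeps existing callers working while surfacing the new limit.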
- Interactive maps: `clear_layers` and `remove_layer` methods
- ImageCollections: `reversed` operator
- ImageCollections: `concat` and `sorted` methods
- ImageCollections: `head`, `tail`, and `partition` methods for slicing
- ImageCollections: `where` method for filtering by condition
- ImageCollections: `map_window` method for applying sliding windows
- ImageCollections: indexing into ImageCollections is supported (`imgs[1]`)
- [breaking] Statistics functions are now applied to named axes
- DateTime, Timedelta, Geocontext, Bool, and Geometry are now computable
- ImageCollectionGroupby ProxyObject for grouping ImageCollection by properties, and applying functions over groups
- ImageCollections: `groupby` method
- `parameter` constructor
- Interactive maps: autoscaling is now done in the background
- Tiles requests can now include parameters
- `median` is noticeably faster
- `count` no longer breaks colormaps
- `map`, `filter`, and `reduce` are 2x faster in the "PREPARING" stage
- Significantly better performance for functions that reference variables outside their scope, like:
  `overall_comp = ndvi.mean(axis="images")`
  `deltas = ndvi.map(lambda img: img - overall_comp)`
- Full support for floor-division (`//`) between Datetimes and Timedeltas (`imgs.filter(lambda img: img.properties['date'] // wf.Timedelta(days=14))`)
- [breaking] `ImageCollection.one` (in favor of indexing)
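The floor-division support noted above mirrors Python's own `datetime` semantics, so the 14-day bucketing idiom can be sketched locally with plain `datetime` objects (not Workflows proxytypes):

```python
from datetime import datetime, timedelta

epoch = datetime(2020, 1, 1)
bucket = timedelta(days=14)

def bucket_index(date):
    """Which 14-day window (counted from epoch) a date falls into.

    timedelta // timedelta yields an integer, so consecutive dates in the
    same window share an index -- the basis for grouping images by period.
    """
    return (date - epoch) // bucket

print(bucket_index(datetime(2020, 1, 10)))  # 0: within the first 14 days
print(bucket_index(datetime(2020, 1, 20)))  # 1: second 14-day window
```

In a Workflows filter, the analogous expression would divide an image's date property by a `wf.Timedelta`, as in the changelog example.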
- `scenes.DLTile.assign(pad=...)` method added to ease creation of a tile identical in all ways except for the padding.
- The parameter `nbits` has been deprecated for catalog bands.
- New interactive map, with GUI controls for multiple layers, scaling, and colormaps.
- Colormaps for single-band images.
- Map interface displays errors that occur while the backend is rendering images.
- ImageCollection compositing no longer changes band names (`red` does not become `red_mean`, for example).
- `.clip()` and `.scale()` methods for Image/ImageCollection.
- Support specifying raster resampler method.
- Support specifying raster processing level: `toa` (top-of-atmosphere) or `surface` (surface reflectance).
- No more tiles 400s for missing data; missing/masked pixels can optionally be filled with a checkerboard pattern.
- Workflows `Image.concat` renamed `Image.concat_bands`.
- Data are left in `data_range` values if `physical_range` is not set, instead of scaling to the range `0..1`.
- Selecting the same band name twice (`img.pick_bands("vv vv")`) properly raises an error.
- Reduced `DeprecationWarning`s in Python 3.7.
- Alpha Workflows API client has been added. Access to the Workflows backend is restricted; contact support for more information.
- Workflows support for Python 3 added in channel v0-5.
- Scenes API now supports band scaling and output type specification for rastering methods.
- Methods in the Metadata, Raster, and Vector service clients that accepted GeoJSON geometries now also accept Shapely geometries.
- Add support for user cython modules in tasks.
- Tasks webhook methods no longer require a `group_id` if a webhook id is provided.
- `catalog_id` property on images is no longer supported by the API
- Fix `scenes.display` handling of single band masked arrays with scalar masks
- Fix problems with incomplete `UploadTask` instances returned by `vectors.FeatureCollection.list_uploads`
- Metadata, Catalog, and Scenes now support a new `storage_state` property for managing image metadata and filtering search results. `storage_state="available"` is the default for new images and indicates that the raster data for that scene is available on the Descartes Labs Platform. `storage_state="remote"` indicates that the raster data has not yet been processed and made available to client users.
- The following additional colormaps are now supported for bands: 'cool', 'coolwarm', 'hot', 'bwr', 'gist_earth', 'terrain'. Find more details about the colormaps here.
- `Scene.ndarray`, `SceneCollection.stack`, and `SceneCollection.mosaic` now support passing a string as the `mask_alpha` argument to allow users to specify an alternate band name to use for masking.
- Scenes now supports a new `save_image` function that allows a user to save a visualization given a filename and extension.
- Tasks now allows you to unambiguously get a function by group id using `get_function_by_id`.
- All Client APIs now accept a `retries` argument to override the default retry configuration. The default remains the same as the prior behavior, which is to attempt 3 retries on errors which can be retried.
- Bands of different but compatible types can now be rastered together in `Scene.ndarray()` and `Scene.download()` as well as across multiple scenes in `SceneCollection.mosaic()`, `SceneCollection.stack()` and `SceneCollection.download()`. The result will have the most general data type.
- Vector client functions that accept a `geometry` argument now support passing Shapely shapes in addition to GeoJSON.
- Removed deprecated method `Metadata.sources()`
- `FeatureCollection.filter(geometry)` will now raise an `InvalidQueryException` if you try to overwrite an existing geometry in the filter chain. You can only set the geometry once.
- Many old and obsolete examples were removed from the package.
- `Scene.ndarray`, `SceneCollection.stack`, and `SceneCollection.mosaic` now will automatically mask alpha if the alpha band is available in the relevant scene(s), and will set `mask_alpha` to `False` if the alpha band does not exist.
- `FeatureCollection.add`, `FeatureCollection.upload`, `Vector.create_feature`, `Vector.create_features`, and `Vector.upload_features` all accept a `fix_geometry` string argument that determines how to handle certain problem geometries, including those which do not follow counter-clockwise winding order (which is required by the GeoJSON spec but not by many popular tools). Allowed values are `reject` (reject invalid geometries with an error), `fix` (correct invalid geometries if possible and use this corrected value when creating the feature), and `accept` (the default), which will correct the geometry for internal use but retain the original geometry in the results.
- `Vector.get_upload_results` and `Vector.get_upload_result` now accept a `pending` parameter to include pending uploads in the results. Such pending results will have `status: PENDING` and, in lieu of a task id, the `id` attribute will contain the upload id as returned by `Vector.upload_features`.
- `UploadTask.status` no longer blocks until the upload task is completed, but rather returns the current status of the upload job, which may be `PENDING`, `RUNNING`, `SUCCESS`, or `FAILURE`.
- The `FutureTask.ready` and `UploadTask.ready` property has been added to test whether the task has completed. A return value of `True` means that if `get_result(wait=True)` were to be called, it would return without blocking.
- You can now export features to a storage `data` blob. To export from the `vector` client, use `Vector.export_product_from_query()` with a storage key and an optional query. This returns the task id of the export task. You can ask for status using `Vector.get_export_results()` for all export tasks or `Vector.get_export_result()` for a specific task by task id.
- FeatureCollection has been extended with this functionality with a `FeatureCollection.export()` method that takes a storage key. This operates on the filter chain that FeatureCollection represents, or the full product if there is no filter chain. It returns an `ExportTask` which behaves similarly to the `FutureTask`.
- `Catalog.upload_image()` and `Catalog.upload_ndarray()` now will return an `upload_id` that can be used to query the status of that upload using `Catalog.upload_result()`. Note that the upload id is the image id, and if you use identical image ids `Catalog.upload_result()` will only show the result of the most recent upload.
- Several typical kinds of non-conforming GeoJSON which previously caused errors can now be accepted or fixed by the `FeatureCollection` and `Vector` methods for adding or uploading new vector geometries.
- Fixed issues with `Catalog.upload_ndarray()` under Windows
- Added header to client requests to better debug retries
- Improved error messages for Catalog client upload methods
- Tasks methods `create_function`, `create_or_get_function`, and `new_group` now have image as a required parameter
- The `name` parameter is renamed to `product_id` in `Vector.create_product`, and `FeatureCollection.create` and `FeatureCollection.copy`. The `name` parameter is renamed to `new_product_id` in `Vector.create_product_from_query`. Using `name` will continue to work, but will be removed completely in future versions.
- The `name` parameter is no longer required, and is ignored for `Vector.replace_product`, `Vector.update_product`, `FeatureCollection.update` and `FeatureCollection.replace`. This parameter will be removed completely in future versions.
- `Metadata.paged_search` has been added and essentially supports the original behavior of `Metadata.search` prior to release 0.16.0. This method should generally be avoided in favor of `Metadata.features` (or `Metadata.search`).
- Fixed typo in `UploadTask.status` which caused an exception when handling certain failure conditions
- `FeatureCollection.upload` parameter `max_errors` was not being passed to the Vector client.
- Ensure `cloudpickle==0.4.0` is the version used when creating `Tasks`.
- Eliminate redundant queries from `FeatureCollection.list`.
- `FeatureCollection.upload` and `Vector.upload_features` now accept an optional `max_errors` parameter to control how many errors are acceptable before declaring an upload a failure.
- `UploadTask` (as returned by `FeatureCollection.upload` and `Vector.list_uploads`) now has added attributes to better identify what was processed and what errors occurred.
- `Storage` now has added methods `set_file` and `get_file` to allow for better uploading and downloading, respectively, of large files.
- The `Storage` class now has an `exists()` method that checks whether an object exists in storage at the location of a given `key` and returns a boolean.
- `Scenes.search` allows `limit=None`
- `FeatureCollection.delete_features` added to support deleting `Feature`s that match a `filter`
- `FeatureCollection.delete_features` and `FeatureCollection.wait_for_copy` now use `AsyncJob` to poll for asynchronous job completion.
- `Vector.delete_features_from_query` and `Vector.get_delete_features_status` added to support new `FeatureCollection` and `AsyncJob` methods.
- Fixed tasks bugs when including modules with relative paths in `sys.path`
- Tasks now support passing modules, data and requirements along with the function code, allowing for a more complex and customized execution environment.
- Vector search query results now report their total number of results by means of the standard `len()` function.
- `Metadata.search` no longer has a 10,000-item limit, and the number of items returned will be closer to `limit`. This method no longer accepts the `continuation_token` parameter.
- Raster client can now handle arbitrarily large numbers of tiles generated from a shape using the new `iter_dltiles_from_shape()` method, which allows you to iterate over large numbers of tiles in a time- and memory-efficient manner. Similarly, the existing `dltiles_from_shape()` method can now handle arbitrarily large numbers of tiles, although it can be very slow.
- Vector client `upload_features()` can now upload the contents of a stream (e.g. an `io.IOBase` derivative such as `io.StringIO`) as well as the contents of a named file.
- Vector FeatureCollection `add()` method can now handle an arbitrary number of Features. Use of the `upload_features()` method is still encouraged for large collections.
- Vector client now supports creating a new product from the results of a query against an existing product with the `create_product_from_query()` method. This support is also accessible via the new `FeatureCollection.copy()` method.
- XYZTile GeoContext class, helpful for rendering to web maps that use XYZ-style tiles in a spherical Mercator CRS.
- Tasks client FutureTask now instantiates a client if none is provided (the default).
- Catalog client methods now properly handle the `add_namespace` parameter.
- Vector Feature now includes valid geojson type 'Feature'.
- Tasks client now raises new GroupTerminalException if a task group stops accepting tasks.
- General documentation fixes.
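The XYZTile GeoContext mentioned above targets the standard spherical-Mercator XYZ tiling scheme used by web maps. The well-known lon/lat-to-tile-index math can be sketched independently of the client (a hypothetical `lonlat_to_xyz_tile` helper, not the XYZTile API itself):

```python
import math

def lonlat_to_xyz_tile(lon, lat, zoom):
    """Convert WGS84 lon/lat to XYZ tile indices (Web Mercator scheme)."""
    n = 2 ** zoom  # number of tiles per axis at this zoom level
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    # Mercator projection of latitude, scaled into [0, n)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y, zoom

print(lonlat_to_xyz_tile(0.0, 0.0, 2))  # (2, 2, 2): center of the map
```

Each (x, y, zoom) triple addresses one square tile, which is what makes the scheme convenient for rendering rasters into slippy-map tile servers.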
- Scenes and raster clients have a `processing_level` parameter that can be used to turn on surface reflectance processing for products that support it
- `scenes.GeoContext`: better defaults and `bounds_crs` parameter
- `bounds` are no longer limited to WGS84, but can be expressed in any `bounds_crs`
- New `Scene.default_ctx` uses a Scene's `geotrans` to more accurately determine a `GeoContext` that will result in no warping of the original data, better handling sinusoidal and other non-rectilinear coordinate reference systems.
- Important: the default GeoContexts will now return differently-sized rasters than before! They will now be more accurate to the original, unwarped data, but if you were relying on the old defaults, you should now explicitly set the `bounds` to `geometry.bounds`, `bounds_crs` to `"EPSG:4326"`, and `align_pixels` to `True`.
- `Scene.coverage` and `SceneCollection.filter_coverage` accept any geometry-like object, not just a `GeoContext`.
- `FutureTask` inheritance changed from `dict` to `object`.
- Can now specify a GPU parameter for tasks.
- `Vectors.upload` allows you to upload a JSON newline-delimited file.
- `Vectors.list_uploads` allows you to list all uploads for a vector product.
- `UploadTask` contains the information about an upload and is returned by both methods.
- `Vector.list_products` and `Vector.search_features` get `query_limit` and `page_size` parameters.
- `Vector.upload_features` handles the new response format.
- Vector client support for retrieving status information about upload jobs. Added methods `Vector.get_upload_results` and `Vector.get_upload_result`.
- Shapely is now a full requirement of this package. Note: Windows users should visit https://docs.descarteslabs.com/installation.html#windows-users for installation guidance.
- Reduced the number of retries for some failure types.
- Resolved intermittent `SceneCollection.stack` bug that manifested as `AttributeError: 'NoneType' object has no attribute 'coords'` due to Shapely thread-unsafety.
- Tracking system environment to improve installation and support of different systems.
- The vector service is now part of the public package. See `descarteslabs.vectors` and `descarteslabs.client.services.vector`.
- Fixed SSL problems when copying clients to forked processes or sharing them among threads
- Removed extra keyword arguments from places client
- Added deprecation warnings for parameters that have been renamed in the Metadata client
- Scenes now exposes more parameters from raster and metadata
- Scenes `descarteslabs.scenes.search` will take a Python datetime object in addition to a string
- Scenes will now allow Feature and FeatureCollection in addition to GeoJSON geometry types
- Fixed Scenes issue preventing access to products with multi-byte data but single-byte alpha bands
- `Scene.download`, `SceneCollection.download`, and `SceneCollection.download_mosaic` methods
- Colormaps supported in `descarteslabs.scenes.display`
- Task namespaces are automatically created with the first task group
- Moved metadata property filtering to common
- Deprecated `create_or_get_function` in tasks
- Renamed some examples
- Namespaced auth environment variables: `DESCARTESLABS_CLIENT_SECRET` and `DESCARTESLABS_CLIENT_ID`. `CLIENT_SECRET` and `CLIENT_ID` will continue to work.
- Tasks runtime check for Python version.
- Documentation updates
- Example updates
- Scenes package
- More examples
- Deprecated `add_namespace` argument in catalog client (defaults to `False` now, formerly `True`)
- Added org to token scope
- Removed deprecated key usage
- Tasks service
- Patched bug in catalog service for py3
- Catalog service
- Storage service
- Switched to `start_datetime` argument pattern instead of `start_date`
- Fixed minor regression with `descarteslabs.ext` clients
- Deprecated token param for `Service` class
- Raster stack method
- Removed deprecated searching by `const_id`
- Removed deprecated raster band methods
- Deprecated `sat_id` parameter for metadata searches
- Changed documentation from readthedocs to https://docs.descarteslabs.com
- Dot notation access to dictionaries returned by services
- Reorganization into a client submodule
- Fix regression for `NotFoundError`
- Reverted `descarteslabs.services.base` to `descarteslabs.services.service`
- Reorganization of services
- Places updated to v2 backend, provides units interface to statistics, which carries some backwards incompatibility.
- Blosc Support for raster array compression transport
- Scrolling support for large metadata searches
- Offset keyword argument in metadata.search has been deprecated. Please use metadata.features for iterating over large search results
- Complex filtering expressions for image attributes
- Raise explicitly on 409 response
- Keep retrying token refresh until token fully expired
- Fixed race condition when creating `.descarteslabs` directory
- Added ext namespace
- Metadata multi-get
- Fix OpenSSL install on OSX
- Automatic retry on 504
- Internal API refactoring / improvements for Auth
- Add raster bands methods to metadata service.
- Deprecate raster band methods.
- Add `require_bands` param to derived bands search method.
- Test suite replaces original token when finished running script tests.
- Support for derived bands endpoints.
- Direct access to `const_id` to `product` translation.
- `descarteslabs` scripts on Windows OS.
- Fix auth login
- Add metadata.bands and metadata.products search/get capabilities.
- Add bands/products descriptions
- Additional Placetypes
- Better error messages with timeouts
- Update to latest version of `requests`
- Major refactor of metadata.search
  - Introduction of "Products" through `Metadata.products()`
  - Metadata entry ids now concatenate the product id and the old metadata keys. The original metadata keys are available through `entry['key']`.
  - Additional sorting available.
- Search & Raster using DLTile Feature GeoJSON or key. Uses output bounds, resolution, and srs to ease searching and rasterizing imagery over tiles.
- Better Error messaging
- DLTile notebook
- `save` and `outfile_basename` in `Raster.raster()`
- Fix metadata.features
- Strict "requests" versions needed due to upstream instability.
- Fix python 3 command line compatibility
- API Change: `descarteslabs`, `raster`, and `metadata` have all been merged into `descarteslabs`. `descarteslabs login` is now `descarteslabs auth login`, `raster` is now `descarteslabs raster`, etc.
- A Changelog
- Testing around command-line scripts
- Searching with cloud_fraction = 0
- dltile API documentation
- Fix login bug
- Installation of "requests[security]" for python < 2.7.9
- Doctests
- Python 3 login bug
- Search by Fractions
- Initial release of client library