Coverage for python/lsst/daf/butler/datastores/chainedDatastore.py: 96%

# This file is part of daf_butler.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
# (http://www.lsst.org).
# See the COPYRIGHT file at the top-level directory of this distribution
# for details of code ownership.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
"""Chained Datastores to allow read and writes from multiple datastores.
A ChainedDatastore is configured with multiple datastore configurations. A ``put()`` is always sent to each datastore. A ``get()`` operation is sent to each datastore in turn and the first datastore to return a valid dataset is used.
Attributes ---------- config : `DatastoreConfig` Configuration used to create Datastore. storageClassFactory : `StorageClassFactory` Factory for creating storage class instances from name. name : `str` Label associated with this Datastore.
Parameters ---------- config : `DatastoreConfig` or `str` Configuration. This configuration must include a ``datastores`` field as a sequence of datastore configurations. The order in this sequence indicates the order to use for read operations. """
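# A hedged sketch of the kind of configuration the class docstring
# describes: a ``datastores`` field holding an ordered sequence of child
# datastore configurations, searched in order on reads. The exact layout
# and the child class paths below are illustrative assumptions, not taken
# from this file.
from lsst.daf.butler import DatastoreConfig

exampleConfig = DatastoreConfig({
    "cls": "lsst.daf.butler.datastores.chainedDatastore.ChainedDatastore",
    "datastores": [
        {"cls": "lsst.daf.butler.datastores.inMemoryDatastore.InMemoryDatastore"},
        {"cls": "lsst.daf.butler.datastores.posixDatastore.PosixDatastore"},
    ],
})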
"""Path to configuration defaults. Relative to $DAF_BUTLER_DIR/config or absolute path. Can be None if no defaults specified. """
"""Key to specify where child datastores are configured."""
def setConfigRoot(cls, root, config, full):
    """Set any filesystem-dependent config options for child Datastores
    to be appropriate for a new empty repository with the given root.

    Parameters
    ----------
    root : `str`
        Filesystem path to the root of the data repository.
    config : `Config`
        A `Config` to update. Only the subset understood by this
        component will be updated. Will not expand defaults.
    full : `Config`
        A complete config with all defaults expanded that can be
        converted to a `DatastoreConfig`. Read-only and will not be
        modified by this method. Repository-specific options that
        should not be obtained from defaults when Butler instances are
        constructed should be copied from ``full`` to ``config``.
    """
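# A hedged usage sketch: repository-creation code would call this hook
# with the new repository root plus the partial and the fully-defaulted
# configs; the variable names here are illustrative.
#
#     ChainedDatastore.setConfigRoot("/path/to/new/repo", config, fullConfig)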
# Extract the part of the config we care about updating
# And the subset of the full config that we can use for reference.
# Do not bother with defaults because we are told this already has
# them.

# Loop over each datastore config and pass the subsets to the
# child datastores to process.
for idx, (child, fullChild) in enumerate(zip(datastoreConfig[containerKey],
                                             fullDatastoreConfig[containerKey])):
# Reattach to parent
# Reattach modified datastore config to parent
# If this has a datastore key we attach there, otherwise we assume
# this information goes at the top of the config hierarchy.
# Scan for child datastores and instantiate them with the same registry
# Name ourself based on our children
else:
    childNames = "(empty@{})".format(time.time())
# We declare we are ephemeral if all our child datastores declare
# they are ephemeral
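# A minimal sketch of that rule, assuming each child exposes the
# ``isEphemeral`` flag used below (an assumption of this sketch):
self.isEphemeral = all(datastore.isEphemeral for datastore in self.datastores)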
"""Check if the dataset exists in one of the datastores.

Parameters
----------
ref : `DatasetRef`
    Reference to the required dataset.

Returns
-------
exists : `bool`
    `True` if the entity exists in one of the child datastores.
"""
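# A minimal sketch of the check described above, not necessarily the
# verbatim body: the dataset exists if any child datastore reports it.
return any(datastore.exists(ref) for datastore in self.datastores)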
"""Load an InMemoryDataset from the store.
The dataset is returned from the first datastore that has the dataset.

Parameters
----------
ref : `DatasetRef`
    Reference to the required Dataset.
parameters : `dict`
    `StorageClass`-specific parameters that specify, for example,
    a slice of the Dataset to be loaded.

Returns
-------
inMemoryDataset : `object`
    Requested Dataset or slice thereof as an InMemoryDataset.

Raises
------
FileNotFoundError
    Requested dataset cannot be retrieved.
TypeError
    Return value from formatter has unexpected type.
ValueError
    Formatter failed to process the dataset.
"""
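# A hedged sketch of the first-match read described above; the actual
# body may differ in details such as logging.
for datastore in self.datastores:
    try:
        return datastore.get(ref, parameters=parameters)
    except FileNotFoundError:
        pass
raise FileNotFoundError(f"Dataset {ref} could not be loaded from any child datastore")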
"""Write a InMemoryDataset with a given `DatasetRef` to each datastore.
The put() to child datastores can fail with `DatasetTypeNotSupportedError`. The put() for this datastore will be deemed to have succeeded so long as at least one child datastore accepted the inMemoryDataset.

Parameters
----------
inMemoryDataset : `object`
    The Dataset to store.
ref : `DatasetRef`
    Reference to the associated Dataset.

Raises
------
TypeError
    Supplied object and storage class are inconsistent.
DatasetTypeNotSupportedError
    All datastores reported `DatasetTypeNotSupportedError`.
"""
except DatasetTypeNotSupportedError:
    pass
raise DatasetTypeNotSupportedError(f"None of the chained datastores supported ref {ref}")
warnings.warn(f"Put of {ref} only succeeded in ephemeral datastores", stacklevel=2)
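# A hedged reconstruction of how the fragments above fit together; the
# counter and flag names are illustrative assumptions, not the verbatim
# source.
nsuccess = 0
isPermanent = False
for datastore in self.datastores:
    try:
        datastore.put(inMemoryDataset, ref)
        nsuccess += 1
        if not datastore.isEphemeral:
            isPermanent = True
    except DatasetTypeNotSupportedError:
        pass
if nsuccess == 0:
    raise DatasetTypeNotSupportedError(f"None of the chained datastores supported ref {ref}")
if not isPermanent:
    warnings.warn(f"Put of {ref} only succeeded in ephemeral datastores", stacklevel=2)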
"""Add an on-disk file with the given `DatasetRef` to the store, possibly transferring it.
This method is forwarded to each of the chained datastores, trapping cases where a datastore has not implemented file ingest and ignoring them.
Notes
-----
If an absolute path is given and "move" mode is specified, then we
tell the child datastore to use "copy" mode and unlink it at the end.
If a relative path is given then it is assumed the file is already
inside the child datastore.

A transfer mode of None implies that the file is already within each
of the (relevant) child datastores.

Parameters
----------
path : `str`
    File path. Treated as relative to the repository root of each
    child datastore if not absolute.
ref : `DatasetRef`
    Reference to the associated Dataset.
formatter : `Formatter` (optional)
    Formatter that should be used to retrieve the Dataset. If not
    provided, the formatter will be constructed according to
    Datastore configuration.
transfer : `str` (optional)
    If not None, must be one of 'move', 'copy', 'hardlink', or
    'symlink' indicating how to transfer the file. The new filename
    and location will be determined via template substitution, as
    with ``put``. If the file is outside the datastore root, it must
    be transferred somehow.

Raises
------
NotImplementedError
    If all chained datastores have no ingest implemented or if a
    transfer mode of `None` is specified.
"""
# A "move" is sometimes a "copy"
# Each child datastore must copy the file for a move operation
# if the file was meant to be moved then we have to delete it
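# A hedged sketch of the move-as-copy behaviour described above; the
# child ``ingest`` signature is taken from this docstring, the rest is
# assumed.
import os

if transfer == "move":
    for datastore in self.datastores:
        # Each child copies, so the original survives until all have it
        datastore.ingest(path, ref, formatter=formatter, transfer="copy")
    # Complete the "move" by unlinking the original ourselves
    os.unlink(path)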
"""URI to the Dataset.
The returned URI is from the first datastore in the list that has the dataset with preference given to the first dataset coming from a permanent datastore. If no datastores have the dataset and prediction is allowed, the predicted URI for the first datastore in the list will be returned.

Parameters
----------
ref : `DatasetRef`
    Reference to the required Dataset.
predict : `bool`
    If `True`, allow URIs to be returned of datasets that have not
    been written.

Returns
-------
uri : `str`
    URI string pointing to the Dataset within the datastore. If the
    Dataset does not exist in the datastore, and if ``predict`` is
    `True`, the URI will be a prediction and will include a URI
    fragment "#predicted".

Notes
-----
If the datastore does not have entities that relate well to the
concept of a URI the returned URI string will be descriptive. The
returned URI is not guaranteed to be obtainable.

Raises
------
FileNotFoundError
    A URI has been requested for a dataset that does not exist and
    guessing is not allowed.
"""
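# A hedged sketch of the preference order described above: first
# permanent hit, then first ephemeral hit, then a prediction. Names are
# illustrative.
ephemeralUri = None
for datastore in self.datastores:
    if datastore.exists(ref):
        uri = datastore.getUri(ref)
        if not datastore.isEphemeral:
            return uri
        if ephemeralUri is None:
            ephemeralUri = uri
if ephemeralUri is not None:
    return ephemeralUri
if predict:
    return self.datastores[0].getUri(ref, predict=True)
raise FileNotFoundError(f"Dataset {ref} not in this datastore")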
"""Indicate to the Datastore that a Dataset can be removed.
The dataset will be removed from each datastore. The dataset is not required to exist in every child datastore.

Parameters
----------
ref : `DatasetRef`
    Reference to the required Dataset.

Raises
------
FileNotFoundError
    Attempt to remove a dataset that does not exist. Raised if none
    of the child datastores removed the dataset.
"""
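# A hedged sketch of the removal loop implied above; the counter name is
# illustrative.
nremoved = 0
for datastore in self.datastores:
    try:
        datastore.remove(ref)
        nremoved += 1
    except FileNotFoundError:
        pass
if nremoved == 0:
    raise FileNotFoundError(f"Dataset {ref} could not be removed from any child datastore")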
"""Retrieve a Dataset from an input `Datastore`, and store the result in this `Datastore`.

Parameters
----------
inputDatastore : `Datastore`
    The external `Datastore` from which to retrieve the Dataset.
ref : `DatasetRef`
    Reference to the required Dataset in the input data store.

Returns
-------
results : `list`
    List containing the return value from the ``put()`` to each
    child datastore.
"""
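# A hedged sketch matching this contract: fetch once from the input
# datastore, then forward to every child and collect the results.
inMemoryDataset = inputDatastore.get(ref)
return [datastore.put(inMemoryDataset, ref) for datastore in self.datastores]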