Coverage for python/lsst/daf/butler/datastores/inMemoryDatastore.py : 94%

# This file is part of daf_butler.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
# (http://www.lsst.org).
# See the COPYRIGHT file at the top-level directory of this distribution
# for details of code ownership.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
"""Internal InMemoryDatastore Metadata associated with a stored DatasetRef. """
"""Unix timestamp indicating the time the dataset was stored."""
"""StorageClass associated with the dataset."""
"""ID of the parent `DatasetRef` if this entry is a component of a concrete
composite. Not used if the dataset being stored is not a virtual component
of a composite.
"""
"""Basic Datastore for writing to an in memory cache.
This datastore is ephemeral in that the contents of the datastore disappear
when the Python process completes. This also means that other processes
cannot access this datastore.
Parameters
----------
config : `DatastoreConfig` or `str`
    Configuration.
registry : `Registry`, optional
    Unused parameter.
butlerRoot : `str`, optional
    Unused parameter.

Notes
-----
InMemoryDatastore does not support any file-based ingest.
"""
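The class description above amounts to a process-local dictionary keyed by dataset ID. A minimal sketch of that behaviour, with illustrative names rather than the real `lsst.daf.butler` API:

```python
import time


class TinyInMemoryDatastore:
    """Sketch of a process-local datastore: everything stored here
    disappears when the Python process exits, and no other process
    can see it."""

    def __init__(self):
        # Name ourselves with the timestamp at which the datastore
        # was created, as the real implementation does.
        self.name = f"InMemoryDatastore@{time.time()}"
        self.datasets = {}  # dataset_id -> stored Python object

    def put(self, obj, dataset_id):
        # Storage of datasets, keyed by dataset_id.
        self.datasets[dataset_id] = obj

    def get(self, dataset_id):
        if dataset_id not in self.datasets:
            raise FileNotFoundError(
                f"No such dataset in memory: {dataset_id}")
        return self.datasets[dataset_id]
```

Because the backing store is an ordinary `dict`, there is nothing to ingest from the filesystem, which is why file-based ingest is unsupported.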
"""Path to configuration defaults. Relative to $DAF_BUTLER_DIR/config or absolute path. Can be None if no defaults specified. """
"""A new datastore is created every time and datasets disappear when the process shuts down."""
"""Internal storage of datasets indexed by dataset ID."""
"""Internal records about stored datasets."""
# Name ourselves with the timestamp at which the datastore was created.
# Storage of datasets, keyed by dataset_id
# Records is distinct in order to track concrete composite components
# where we register multiple components for a single dataset.
# Related records that share the same parent
"""Set any filesystem-dependent config options for this Datastore to be appropriate for a new empty repository with the given root.
Does nothing in this implementation.
Parameters
----------
root : `str`
    Filesystem path to the root of the data repository.
config : `Config`
    A `Config` to update. Only the subset understood by this component
    will be updated. Will not expand defaults.
full : `Config`
    A complete config with all defaults expanded that can be converted
    to a `DatastoreConfig`. Read-only and will not be modified by this
    method. Repository-specific options that should not be obtained
    from defaults when Butler instances are constructed should be
    copied from ``full`` to ``config``.
overwrite : `bool`, optional
    If `False`, do not modify a value in ``config`` if the value
    already exists. Default is always to overwrite with the provided
    ``root``.

Notes
-----
If a keyword is explicitly defined in the supplied ``config`` it will
not be overridden by this method if ``overwrite`` is `False`. This
allows explicit values set in external configs to be retained.
"""
# Docstring inherited from GenericBaseDatastore.
# Docstring inherited from GenericBaseDatastore.
# Docstring inherited from GenericBaseDatastore.
# If a component has been removed previously then we can sometimes
# be asked to remove it again. Other datastores ignore this,
# so also ignore it here.
"""Check if the dataset exists in the datastore.
Parameters
----------
ref : `DatasetRef`
    Reference to the required dataset.

Returns
-------
exists : `bool`
    `True` if the entity exists in the `Datastore`.
"""
# Get the stored information (this will fail if no dataset)
# The actual ID for the requested dataset might be that of a parent
# if this is a composite
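The parent-ID fallback described in the comments above can be sketched as follows; the record shapes here are hypothetical stand-ins for the datastore's internal bookkeeping:

```python
from types import SimpleNamespace

# Hypothetical records: each stored-item info carries a parentID naming
# the concrete composite that actually holds the Python object.
records = {
    "comp.a": SimpleNamespace(parentID="parent"),  # component entry
    "parent": SimpleNamespace(parentID="parent"),  # the composite itself
}
datasets = {"parent": object()}  # only the composite holds a real object


def exists(dataset_id):
    # Get the stored information (this fails if the dataset is unknown).
    info = records.get(dataset_id)
    if info is None:
        return False
    # The actual ID holding the object might be that of a parent
    # if this is a composite component.
    return info.parentID in datasets
```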
"""Load an InMemoryDataset from the store.
Parameters
----------
ref : `DatasetRef`
    Reference to the required Dataset.
parameters : `dict`
    `StorageClass`-specific parameters that specify, for example,
    a slice of the Dataset to be loaded.

Returns
-------
inMemoryDataset : `object`
    Requested Dataset or slice thereof as an InMemoryDataset.

Raises
------
FileNotFoundError
    Requested dataset cannot be retrieved.
TypeError
    Return value from formatter has unexpected type.
ValueError
    Formatter failed to process the dataset.
"""
# We have a write storage class and a read storage class and they
# can be different for concrete composites.
# Check that the supplied parameters are suitable for the type read
# We might need a parent if we are being asked for a component
# of a concrete composite
# Different storage classes implies a component request
raise ValueError("Storage class inconsistency ({} vs {}) but no" " component requested".format(readStorageClass.name, writeStorageClass.name))
# Concrete composite written as a single object (we hope)
# Since there is no formatter to process parameters, they all must be
# passed to the assembler.
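The storage-class check that produces the error above can be sketched in isolation; real storage classes are rich objects, so plain name strings here are a simplification:

```python
def check_component_request(read_name, write_name, component):
    """Different read and write storage classes imply a component
    request; a mismatch without a component name is an error.

    Sketch only: the real code compares `StorageClass` objects."""
    if read_name != write_name and component is None:
        raise ValueError(
            f"Storage class inconsistency ({read_name} vs "
            f"{write_name}) but no component requested"
        )
```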
"""Write a InMemoryDataset with a given `DatasetRef` to the store.
Parameters
----------
inMemoryDataset : `object`
    The Dataset to store.
ref : `DatasetRef`
    Reference to the associated Dataset.

Raises
------
TypeError
    Supplied object and storage class are inconsistent.
DatasetTypeNotSupportedError
    The associated `DatasetType` is not handled by this datastore.

Notes
-----
If the datastore is configured to reject certain dataset types it is
possible that the put will fail and raise a
`DatasetTypeNotSupportedError`. The main use case for this is to allow
`ChainedDatastore` to put to multiple datastores without requiring
that every datastore accepts the dataset.
"""
# Store the time we received this content, to allow us to optionally
# expire it. Instead of storing a filename here, we include the
# ID of this datasetRef so we can find it from components.
# We have to register this content with registry.
# Currently this assumes we have a file so we need to use stub entries
# TODO: Add to ephemeral part of registry
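The bookkeeping described above, a receipt timestamp plus the ref's own ID recorded as `parentID` in place of a filename, might look like the following. The dataclass fields are inferred from the docstrings earlier in this file; this is not the real implementation:

```python
import time
from dataclasses import dataclass


@dataclass(frozen=True)
class StoredMemoryItemInfo:
    """Sketch of the per-dataset record (fields inferred from the
    docstrings above, not the real class)."""
    timestamp: float    # Unix time the dataset was stored
    storageClass: str   # storage class associated with the dataset
    parentID: int       # ID of the parent ref, for composite components


def put(datasets, records, obj, dataset_id, storage_class):
    # Store the time we received this content so it could optionally
    # be expired later; record the dataset's own ID as parentID so
    # components can find it.
    datasets[dataset_id] = obj
    records[dataset_id] = StoredMemoryItemInfo(
        timestamp=time.time(),
        storageClass=storage_class,
        parentID=dataset_id,
    )
```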
"""URI to the Dataset.
Always uses "mem://" URI prefix.
Parameters
----------
ref : `DatasetRef`
    Reference to the required Dataset.
predict : `bool`
    If `True`, allow URIs to be returned of datasets that have not
    been written.

Returns
-------
uri : `str`
    URI string pointing to the Dataset within the datastore. If the
    Dataset does not exist in the datastore, and if ``predict`` is
    `True`, the URI will be a prediction and will include a URI
    fragment "#predicted". If the datastore does not have entities
    that relate well to the concept of a URI the returned URI string
    will be descriptive. The returned URI is not guaranteed to be
    obtainable.

Raises
------
FileNotFoundError
    A URI has been requested for a dataset that does not exist and
    guessing is not allowed.
"""
# If this has never been written then we have to guess
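Putting the docstring together, the URI logic can be sketched as follows. The `mem://` prefix and `#predicted` fragment follow the docstring; the exact path layout after the prefix is an assumption:

```python
def get_uri(datasets, dataset_id, predict=False):
    """Return a descriptive "mem://" URI for a stored dataset.

    Sketch only: assumes the URI body is simply the dataset ID."""
    if dataset_id not in datasets:
        # Never written: we have to guess, and only when allowed.
        if not predict:
            raise FileNotFoundError(
                f"Dataset {dataset_id} not in this datastore")
        # Predictions carry the "#predicted" URI fragment.
        return f"mem://{dataset_id}#predicted"
    return f"mem://{dataset_id}"
```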
"""Indicate to the Datastore that a Dataset can be removed.
Parameters
----------
ref : `DatasetRef`
    Reference to the required Dataset.

Raises
------
FileNotFoundError
    Attempt to remove a dataset that does not exist.
"""
raise FileNotFoundError("No such file dataset in memory: {}".format(ref))
# Only delete if this is the only dataset associated with this data
# Remove rows from registries
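The deletion rule in the comments above, drop the stored object only when no remaining record points at the same parent, can be sketched like this (plain dicts stand in for the real records):

```python
def remove(datasets, records, dataset_id):
    """Remove one dataset record; delete the underlying object only
    when it was the last record sharing that parent. Sketch only."""
    if dataset_id not in records:
        raise FileNotFoundError(f"No such dataset in memory: {dataset_id}")
    parent = records[dataset_id]["parentID"]
    # Remove this row from the registry-side bookkeeping.
    del records[dataset_id]
    # Only delete if this was the only dataset associated with this data.
    if not any(r["parentID"] == parent for r in records.values()):
        datasets.pop(parent, None)
```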
"""Validate some of the configuration for this datastore.
Parameters
----------
entities : iterable of `DatasetRef`, `DatasetType`, or `StorageClass`
    Entities to test against this configuration. Can be differing
    types.
logFailures : `bool`, optional
    If `True`, output a log message for every validation error
    detected.

Raises
------
DatastoreValidationError
    Raised if there is a validation problem with a configuration.
    All the problems are reported in a single exception.

Notes
-----
This method is a no-op.
"""
# Docstring is inherited from base class
return
# Docstring is inherited from base class