Coverage for python/lsst/daf/butler/datastores/genericDatastore.py : 88%

# This file is part of daf_butler.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
# (http://www.lsst.org).
# See the COPYRIGHT file at the top-level directory of this distribution
# for details of code ownership.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
"""Methods useful for most implementations of a `Datastore`.
Should always be sub-classed since key abstract methods are missing. """
"""Place to store internal records about datasets."""
"""Convert a `StoredDatastoreItemInfo` to a suitable database record.
Parameters ---------- info : `StoredDatastoreItemInfo` Metadata associated with the stored Dataset.
Returns ------- record : `MutableMapping` Record to be stored. """ raise NotImplementedError("Must be implemented by subclass")
"""Convert a record associated with this dataset to a `StoredDatastoreItemInfo`
Parameters ---------- record : `MutableMapping` Object stored in the record table.
Returns ------- info : `StoredDatastoreItemInfo` The information associated with this dataset record as a Python class. """ raise NotImplementedError("Must be implemented by subclass")
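The conversion pair above can be sketched as follows. `ItemInfo` is a hypothetical stand-in for `StoredDatastoreItemInfo` (the real class lives in `lsst.daf.butler` and carries more fields); the field names here are illustrative assumptions, not the real schema.

```python
from dataclasses import dataclass
from typing import MutableMapping


@dataclass(frozen=True)
class ItemInfo:
    """Toy stand-in for StoredDatastoreItemInfo (illustrative fields)."""
    path: str
    formatter: str


def info_to_record(info: ItemInfo) -> MutableMapping:
    # Flatten the info object into a plain dict suitable for a DB row.
    return {"path": info.path, "formatter": info.formatter}


def record_to_info(record: MutableMapping) -> ItemInfo:
    # Rebuild the Python object from the stored record.
    return ItemInfo(path=record["path"], formatter=record["formatter"])
```

A subclass implementing both directions should guarantee that the pair round-trips: `record_to_info(info_to_record(info)) == info`.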
"""Record internal storage information associated with this `DatasetRef`
Parameters ---------- ref : `DatasetRef` The Dataset that has been stored. info : `StoredDatastoreItemInfo` Metadata associated with the stored Dataset. """ raise KeyError("Attempt to store item info with ID {}" " when that ID exists as '{}'".format(ref.id, self.records[ref.id]))
"""Retrieve information associated with file stored in this `Datastore`.
Parameters ---------- ref : `DatasetRef` The Dataset that is to be queried.
Returns ------- info : `StoredFilenfo` Stored information about this file and its formatter.
Raises ------ KeyError Dataset with that id can not be found. """
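A minimal in-memory sketch of the store/retrieve pair described above, assuming `self.records` is a plain mapping keyed by dataset ID (the real implementation may back this with a database table):

```python
class RecordSketch:
    """Toy stand-in showing the KeyError behavior described above."""

    def __init__(self):
        # Place to store internal records about datasets.
        self.records = {}

    def addStoredItemInfo(self, dataset_id, info):
        # Refuse to overwrite: storing under an existing ID is an error.
        if dataset_id in self.records:
            raise KeyError("Attempt to store item info with ID {}"
                           " when that ID exists as '{}'".format(
                               dataset_id, self.records[dataset_id]))
        self.records[dataset_id] = info

    def getStoredItemInfo(self, dataset_id):
        # Plain mapping lookup; raises KeyError for unknown datasets.
        return self.records[dataset_id]
```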
"""Remove information about the file associated with this dataset.
Parameters ---------- ref : `DatasetRef` The Dataset that has been removed. """
"""Update registry to indicate that this dataset has been stored.
Parameters ---------- ref : `DatasetRef` Dataset to register. itemInfo : `StoredDatastoreItemInfo` Internal datastore metadata associated with this dataset. """
# TODO: this is only transactional if the DatabaseDict uses # self.registry internally. Probably need to add # transactions to DatabaseDict to do better than that.
# Register all components with same information
"""Remove rows from registry.
Parameters ---------- ref : `DatasetRef` Dataset to remove from registry. """
"""Given the Python object read from the datastore, manipulate it based on the supplied parameters and ensure the Python type is correct.
Parameters ---------- inMemoryDataset : `object` Dataset to check. readStorageClass: `StorageClass` The `StorageClass` used to obtain the assembler and to check the python type. assemblerParams : `dict` Parameters to pass to the assembler. Can be `None`. """ # Process any left over parameters
# Validate the returned data type matches the expected data type raise TypeError("Got Python type {} from datastore but expected {}".format(type(inMemoryDataset), pytype))
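The two steps above (apply any leftover assembler parameters, then check the Python type) can be sketched like this. The `assembler` callable and its signature are simplifying assumptions for illustration, not the real `StorageClass` assembler API:

```python
def post_process_get(inMemoryDataset, pytype, assembler=None,
                     assemblerParams=None):
    """Sketch of the generic post-get processing described above."""
    # Process any left over parameters with the assembler, if one is
    # given.  NOTE: this callable signature is an assumption.
    if assembler is not None and assemblerParams:
        inMemoryDataset = assembler(inMemoryDataset, assemblerParams)
    # Validate the returned data type matches the expected data type.
    if not isinstance(inMemoryDataset, pytype):
        raise TypeError("Got Python type {} from datastore but"
                        " expected {}".format(type(inMemoryDataset), pytype))
    return inMemoryDataset
```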
"""Validate the supplied arguments for put.
Parameters ---------- inMemoryDataset : `object` The Dataset to store. ref : `DatasetRef` Reference to the associated Dataset. """
# Sanity check raise TypeError("Inconsistency between supplied object ({}) " "and storage class type ({})".format(type(inMemoryDataset), storageClass.pytype))
# Confirm that we can accept this dataset # Raise rather than use boolean return value. " configuration.")
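A hedged sketch of the put validation above. `is_acceptable` stands in for the datastore's constraints check, and `ValueError` stands in for the datastore-specific rejection exception; both substitutions are assumptions made to keep the sketch self-contained:

```python
def validate_put_parameters(inMemoryDataset, pytype, is_acceptable=True):
    """Sketch of the put-argument validation described above."""
    # Sanity check: the object must match the storage class Python type.
    if not isinstance(inMemoryDataset, pytype):
        raise TypeError("Inconsistency between supplied object ({}) "
                        "and storage class type ({})".format(
                            type(inMemoryDataset), pytype))
    # Confirm that we can accept this dataset.
    # Raise rather than use boolean return value.
    if not is_acceptable:
        raise ValueError("Dataset has been rejected by this datastore"
                         " via configuration.")
```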
"""Retrieve a Dataset from an input `Datastore`, and store the result in this `Datastore`.
Parameters ---------- inputDatastore : `Datastore` The external `Datastore` from which to retreive the Dataset. ref : `DatasetRef` Reference to the required Dataset in the input data store.
""" |