Coverage for python/lsst/meas/deblender/plugins.py : 66%

#!/usr/bin/env python
#
# LSST Data Management System
# See COPYRIGHT file.
#
# This product includes software developed by the
# LSST Project (http://www.lsst.org/).
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the LSST License Statement and
# the GNU General Public License along with this program.  If not,
# see <https://www.lsstcorp.org/LegalNotices/>.
#
# Import C++ routines
'''Clips the given *Footprint* to the region in the *Image* containing non-zero values.

The clipping drops spans that are totally zero, and moves endpoints to non-zero;
it does not split spans that have internal zeros.
'''
            continue
        # Time to update the SpanSet
"""Class to define plugins for the deblender.
    The new deblender executes a series of plugins specified by the user.
    Each plugin defines the function to be executed, the keyword arguments
    required by the function, and whether or not certain portions of the
    deblender might need to be rerun as a result of the function.
    """
        """Initialize a deblender plugin

        Parameters
        ----------
        func: `function`
            Function to run when the plugin is executed. The function should
            always take `debResult`, a `DeblenderResult` that stores the
            deblender result, and `log`, an `lsst.log`, as the first two
            arguments, as well as any additional keyword arguments (which
            must be specified in ``kwargs``). The function should also
            return ``modified``, a `bool` that tells the deblender whether
            or not any templates have been modified by the function.
            If ``modified==True``, the deblender will go back to step
            ``onReset``, unless the plugin has already been run
            ``maxIterations`` times.
        onReset: `int`
            Index of the deblender plugin to return to if ``func`` modifies
            any templates. The default is ``None``, which does not re-run
            any plugins.
        maxIterations: `int`
            Maximum number of times the deblender will reset when the
            current plugin returns ``True``.
        """
"""Execute the current plugin
Once the plugin has finished, check to see if part of the deblender must be executed again. """
return ("<Deblender Plugin: func={0}, kwargs={1}".format(self.func.__name__, self.kwargs)) return self.__str__()
"""Update the peak in each band with an error
This function logs an error that occurs during deblending and sets the relevant flag.
    Parameters
    ----------
    debResult: `lsst.meas.deblender.baseline.DeblenderResult`
        Container for the final deblender results.
    log: `log.Log`
        LSST logger for logging purposes.
    pk: `int`
        Number of the peak that failed.
    cx: `float`
        x coordinate of the peak.
    cy: `float`
        y coordinate of the peak.
    filters: list of `str`
        List of filter names for the exposures.
    msg: `str`
        Message to display in the log traceback.
    flag: `str`
        Name of the flag to set.
Returns ------- None """ log.trace("Peak {0} at ({1},{2}):{3}".format(pk, cx, cy, msg)) for fidx, f in enumerate(filters): pkResult = debResult.deblendedParents[f].peaks[pk] getattr(pkResult, flag)()
sources=None, constraints=None, config=None, maxIter=100, bgScale=0.5, relativeError=1e-2, badMask=None): """Run the Multiband Deblender to build templates
    Parameters
    ----------
    debResult: `lsst.meas.deblender.baseline.DeblenderResult`
        Container for the final deblender results.
    log: `log.Log`
        LSST logger for logging purposes.
    useWeights: `bool`, default=False
        Whether or not to use the variance map in each filter for the fit.
    usePsf: `bool`, default=False
        Whether or not to convolve the image with the PSF in each band.
        This is not yet implemented in an optimized algorithm, so it is
        recommended to leave this option off for now.
    sources: list of `scarlet.source.Source` objects, default=None
        List of sources to use in the blend. By default the
        `scarlet.source.ExtendedSource` class is used, which initializes
        each source as symmetric and monotonic about a peak in the
        footprint peak catalog.
    constraints: `scarlet.constraint.Constraint`, default=None
        Constraint to be applied to each source. If sources require
        different constraints, a list of `sources` must be created instead,
        which ignores the `constraints` parameter. When `constraints` is
        `None` the default constraints are used.
    config: `scarlet.config.Config`, default=None
        Configuration for the blend. If `config` is `None` then the default
        `Config` is used.
    maxIter: `int`, default=100
        Maximum number of iterations for a single blend.
    bgScale: `float`
        Amount to scale the background RMS to set the floor for deblender
        model sizes.
    relativeError: `float`, default=1e-2
        Relative error to reach for convergence.
    badMask: list of `str`, default=None
        List of mask plane names used to mark bad pixels. If `badMask` is
        `None`, the default planes are
        ``["BAD", "CR", "NO_DATA", "SAT", "SUSPECT"]``.
Returns ------- modified: `bool` If any templates have been created then ``modified`` is ``True``, otherwise it is ``False`` (meaning all of the peaks were skipped). """ import scarlet
# Extract coordinates from each MultiColorPeak bbox = debResult.footprint.getBBox() peakSchema = debResult.footprint.peaks.getSchema() xmin = bbox.getMinX() ymin = bbox.getMinY() peaks = [[pk.y-ymin, pk.x-xmin] for pk in debResult.peaks]
# Create the data array from the masked images maskedImages = [mimg.Factory(mimg, debResult.footprint.getBBox(), PARENT) for mimg in debResult.maskedImages] data = np.array([mimg.image.array for mimg in maskedImages])
    # Use the inverse variance as the weights
    if useWeights:
        weights = 1/np.array([mimg.variance.array for mimg in maskedImages])
    else:
        weights = np.ones_like(data)
# Use the mask plane to mask bad pixels and # the footprint to mask out pixels outside the footprint if badMask is None: badMask = ["BAD", "CR", "NO_DATA", "SAT", "SUSPECT"] fpMask = afwImage.Mask(bbox) debResult.footprint.spans.setMask(fpMask, 1) fpMask = ~fpMask.getArray().astype(bool) mask = np.zeros(weights.shape, dtype=bool) for fidx, mimg in enumerate(maskedImages): badPixels = mimg.mask.getPlaneBitMask(badMask) mask[fidx] = (mimg.getMask().array & badPixels) | fpMask weights[mask] = 0
# Extract the PSF from each band for PSF convolution if usePsf: psfs = [] for psf in debResult.psfs: psfs.append(psf.computeKernelImage().array) psf = np.array(psfs) else: psf = None
    bg_rms = np.array([debResult.deblendedParents[f].avgNoise for f in debResult.filters])*bgScale
    if sources is None:
        # If only a single constraint was given, use it for all of the sources
        if isinstance(constraints, (scarlet.constraints.Constraint, scarlet.constraints.ConstraintList)):
            constraints = [constraints.copy() for peak in peaks]
        elif constraints is None:
            constraints = [None]*len(peaks)
        sources = [
            scarlet.source.ExtendedSource(center=peak,
                                          img=data,
                                          bg_rms=bg_rms,
                                          constraints=constraints[pk],
                                          psf=psf,
                                          symmetric=True,
                                          monotonic=True,
                                          thresh=1.0,
                                          config=config)
            for pk, peak in enumerate(peaks)
        ]
# When a footprint includes only non-detections # (peaks in the noise too low to deblend as a source) # the deblender currently fails. try: blend = scarlet.blend.Blend(sources=sources, img=data, weights=weights, bg_rms=bg_rms, config=config) blend.fit(steps=maxIter, e_rel=relativeError) except np.linalg.LinAlgError as e: log.warn("Deblend failed catastrophically, most likely due to no signal in the footprint") debResult.failed = True return False debResult.blend = blend
modified = False # Create the Templates for each peak in each filter for pk, src in enumerate(blend.sources): _cx = src.Nx >> 1 _cy = src.Ny >> 1
if debResult.peaks[pk].skip: continue modified = True cx = src.center[1]+xmin cy = src.center[0]+ymin imbb = debResult.deblendedParents[debResult.filters[0]].img.getBBox()
# Footprint must be inside the image if not imbb.contains(afwGeom.Point2I(cx, cy)): _setPeakError(debResult, log, pk, cx, cy, debResult.filters, "peak center is not inside image", "setOutOfBounds") continue # Only save templates that have nonzero flux if np.sum(src.morph)==0: _setPeakError(debResult, log, pk, cx, cy, debResult.filters, "had no flux", "setFailedSymmetricTemplate") continue
# Temporary for initial testing: combine multiple components model = blend.get_model(m=pk, flat=False) model = model.astype(np.float32)
# The peak in each band will have the same SpanSet mask = afwImage.Mask(np.array(np.sum(model, axis=0)>0, dtype=np.int32), xy0=debResult.footprint.getBBox().getBegin()) ss = afwGeom.SpanSet.fromMask(mask)
if len(ss) == 0: log.warn("No flux in parent footprint") debResult.failed = True return False
# Add the template footprint and image to the deblender result for each peak for fidx, f in enumerate(debResult.filters): pkResult = debResult.deblendedParents[f].peaks[pk] tfoot = afwDet.Footprint(ss, peakSchema=peakSchema) # Add the peak with the intensity of the centered model, # which might be slightly larger than the shifted model peakFlux = np.sum(src.sed[:,fidx]*src.morph[:,_cy, _cx]) tfoot.addPeak(cx, cy, peakFlux) timg = afwImage.ImageF(model[fidx], xy0=debResult.footprint.getBBox().getBegin()) timg = timg.Factory(timg, tfoot.getBBox(), PARENT) pkResult.setOrigTemplate(timg, tfoot) pkResult.setTemplate(timg, tfoot) pkResult.setFluxPortion(afwImage.MaskedImageF(timg)) pkResult.multiColorPeak.x = cx pkResult.multiColorPeak.y = cy pkResult.peak.setFx(cx) pkResult.peak.setFy(cy) pkResult.peak.setIx(int(np.round(cx))) pkResult.peak.setIy(int(np.round(cy))) return modified
"""Fit a PSF + smooth background model (linear) to a small region around each peak
This function will iterate over all filters in deblender result but does not compare results across filters. DeblendedPeaks that pass the cuts have their templates modified to the PSF + background model and their ``deblendedAsPsf`` property set to ``True``.
This will likely be replaced in the future with a function that compares the psf chi-squared cuts so that peaks flagged as point sources will be considered point sources in all bands.
    Parameters
    ----------
    debResult: `lsst.meas.deblender.baseline.DeblenderResult`
        Container for the final deblender results.
    log: `log.Log`
        LSST logger for logging purposes.
    psfChisqCut*: `float`, optional
        ``psfChisqCut1`` is the maximum chi-squared-per-degree-of-freedom
        allowed for a peak to be considered a PSF match without
        recentering. A fit is also made that includes terms to recenter the
        PSF; ``psfChisqCut2`` is the same as ``psfChisqCut1`` except that
        it restricts the fit that includes the recentering terms. If the
        peak is a match for a re-centered PSF, the PSF is repositioned at
        the new center and the peak footprint is fit again, this time to
        the new PSF. If the resulting chi-squared-per-degree-of-freedom is
        less than ``psfChisqCut2b`` then it passes the re-centering
        algorithm. If the peak passes both the re-centered and fixed
        position cuts, the better of the two is accepted, but parameters
        for all three PSF fits are stored in the ``DeblendedPeak``. The
        default for ``psfChisqCut1``, ``psfChisqCut2``, and
        ``psfChisqCut2b`` is ``1.5``.
    tinyFootprintSize: `float`, optional
        The PSF model is shrunk to the size that contains the original
        footprint. If the bbox of the clipped PSF model for a peak is
        smaller than ``max(tinyFootprintSize, 2)`` then ``tinyFootprint``
        for the peak is set to ``True`` and the peak is not fit. The
        default is 2.
Returns ------- modified: `bool` If any templates have been assigned to PSF point sources then ``modified`` is ``True``, otherwise it is ``False``. """ # Loop over all of the filters to build the PSF
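# [Editor's sketch, not part of the original module]
# Simplified illustration of the chi-squared-per-degree-of-freedom cuts
# described above. The real fit solves a weighted linear system, but the
# acceptance criterion reduces to the comparison below; all names here
# (model, data, variance, nparams) are hypothetical, and np is the
# module-level numpy import.
def _exampleIsPsf(model, data, variance, nparams, psfChisqCut=1.5):
    """Return True if the PSF model is an acceptable fit to the data."""
    chisq = np.sum((model - data)**2 / variance)
    dof = data.size - nparams
    return dof > 0 and chisq/dof < psfChisqCut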
# create mask image for pixels within the footprint
# pk.getF() -- retrieving the floating-point location of the peak # -- actually shows up in the profile if we do it in the loop, so # grab them all here.
dp.img, dp.varimg, psfChisqCut1, psfChisqCut2, psfChisqCut2b, tinyFootprintSize)
img, varimg, psfChisqCut1, psfChisqCut2, psfChisqCut2b, tinyFootprintSize=2, ): """Fit a PSF + smooth background model (linear) to a small region around a peak.
See fitPsfs for a more thorough description, including all parameters not described below.
    Parameters
    ----------
    fp: `afw.detection.Footprint`
        Footprint containing the Peaks to model.
    fmask: `afw.image.Mask`
        The Mask plane for pixels in the Footprint.
    pk: `afw.detection.PeakRecord`
        The peak within the Footprint that we are going to fit with a PSF model.
    pkF: `afw.geom.Point2D`
        Floating point coordinates of the peak.
    pkres: `meas.deblender.DeblendedPeak`
        Peak results object that will hold the results.
    fbb: `afw.geom.Box2I`
        Bounding box of ``fp``.
    peaks: `afw.detection.PeakCatalog`
        Catalog of peaks contained in the parent footprint.
    peaksF: list of `afw.geom.Point2D`
        List of floating point coordinates of all of the peaks.
    psf: list of `afw.detection.Psf`
        Psf of the ``maskedImage`` for each band.
    psffwhm: list of `float`
        FWHM of the ``maskedImage``'s ``psf`` in each band.
    img: `afw.image.ImageF`
        The image that contains the footprint.
    varimg: `afw.image.ImageF`
        The variance of the image that contains the footprint.

    Returns
    -------
    ispsf: `bool`
        Whether or not the peak matches a PSF model.
    """
# my __name__ is lsst.meas.deblender.baseline
# The small region is a disk out to R0, plus a ramp with # decreasing weight down to R1. # ramp down to zero weight at this radius... # R2: distance to neighbouring peak in order to put it into the model
# Make sure we haven't been given a substitute PSF that's nowhere near where we want, as may occur if # "Cannot compute CoaddPsf at point (xx,yy); no input images at that point." pkres.setOutOfBounds() return
# The bounding-box of the local region we are going to fit ("stamp") log.trace('Skipping this peak: out of bounds') pkres.setOutOfBounds() return
# drop tiny footprints too? # Minimum size limit of 2 comes from the "PSF dx" calculation, which involves shifting the PSF # by one pixel to the left and right. log.trace('Skipping this peak: tiny footprint / close to edge') pkres.setTinyFootprint() return
# find other peaks within range... continue
# Now we are going to do a least-squares fit for the flux in this # PSF, plus a decenter term, a linear sky, and fluxes of nearby # sources (assumed point sources). Build up the matrix... # Number of terms -- PSF flux, constant sky, X, Y, + other PSF fluxes # + PSF dx, dy # Number of pixels -- at most # indices of columns in the "A" matrix. # offset of other psf fluxes:
# Build the matrix "A", rhs "b" and weight "w".
# Clip the PSF image to match its bbox pbb.getMinX()-px0: 1+pbb.getMaxX()-px0]
# Compute the "valid" pixels within our region-of-interest
log.warn('Skipping peak at (%.1f, %.1f): no unmasked pixels nearby', cx, cy) pkres.setNoValidPixels() return
# pixel coords of valid pixels
(xlo <= xhi) and (xmin <= xmax))
# Constant term # Sky slope terms: dx, dy
# whew, grab the valid overlapping PSF pixels
# PSF dx -- by taking the half-difference of shifted-by-one and # shifted-by-minus-one. psfarr[psf_y_slice, sx3 - dpx0 - 1: sx4 - dpx0 - 1])/2. # revert x indices...
# PSF dy psfarr[sy3 - dpy0 - 1: sy4 - dpy0 - 1, psf_x_slice])/2.
# other PSFs... (xx >= obb.getMinX())*(xx <= obb.getMaxX()))
# Weights -- from ramp and image variance map. # Ramp weights -- from 1 at R0 down to 0 at R1. # save the effective number of pixels
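# [Editor's sketch, not part of the original module]
# The ramp weight described above falls linearly from 1 at radius R0 to 0
# at radius R1 and is multiplied by the inverse-variance weight of each
# pixel. A stand-alone version (np is the module-level numpy import):
def _exampleRampWeight(r, R0, R1):
    """Weight of 1 inside R0, ramping linearly to 0 at R1, 0 beyond."""
    return np.clip((R1 - r)/(R1 - R0), 0.0, 1.0)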
            import pylab as plt
            plt.clf()
            N = NT2 + 2
            R, C = 2, (N + 1)//2
            for i in range(NT2):
                im1 = np.zeros((1 + yhi - ylo, 1 + xhi - xlo))
                im1[ipixes[:, 1], ipixes[:, 0]] = A[:, i]
                plt.subplot(R, C, i + 1)
                plt.imshow(im1, interpolation='nearest', origin='lower')
            plt.subplot(R, C, NT2 + 1)
            im1 = np.zeros((1 + yhi - ylo, 1 + xhi - xlo))
            im1[ipixes[:, 1], ipixes[:, 0]] = b
            plt.imshow(im1, interpolation='nearest', origin='lower')
            plt.subplot(R, C, NT2 + 2)
            im1 = np.zeros((1 + yhi - ylo, 1 + xhi - xlo))
            im1[ipixes[:, 1], ipixes[:, 0]] = w
            plt.imshow(im1, interpolation='nearest', origin='lower')
            plt.savefig('A.png')
# We do fits with and without the decenter (dx,dy) terms. # Since the dx,dy terms are at the end of the matrix, # we can do that just by trimming off those elements. # # The SVD can fail if there are NaNs in the matrices; this should # really be handled upstream # NT1 is number of terms without dx,dy; # X1 is the result without decenter # X2 is with decenter except np.linalg.LinAlgError as e: log.warn("Failed to fit PSF to child: %s", e) pkres.setPsfFitFailed() return
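# [Editor's sketch, not part of the original module]
# Illustration of the "with and without decenter" fits described above:
# because the dx,dy columns sit at the end of the design matrix, the
# no-decenter solution is obtained by trimming those columns. A, b, w and
# NT1 are hypothetical stand-ins for the matrix, data vector, weights and
# number of terms without decenter.
def _exampleTwoFits(A, b, w, NT1):
    """Solve the weighted system without (X1) and with (X2) decenter terms."""
    Aw = A*w[:, np.newaxis]
    bw = b*w
    X1 = np.linalg.lstsq(Aw[:, :NT1], bw, rcond=None)[0]
    X2 = np.linalg.lstsq(Aw, bw, rcond=None)[0]
    return X1, X2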
# r is weighted chi-squared = sum over pixels: ramp * (model - # data)**2/sigma**2 else: chisq1 = 1e30 else: chisq2 = 1e30
# This can happen if we're very close to the edge (?) log.trace('Skipping this peak: bad DOF %g, %g', dof1, dof2) pkres.setBadPsfDof() return
# check that the fit PSF spatial derivative terms aren't too big # as a fraction of the PSF flux pkres.psfFitBigDecenter = True
# Looks like a shifted PSF: try actually shifting the PSF by that amount # and re-evaluate the fit. # clip
# Make sure we haven't been given a substitute PSF that's nowhere near where we want, as may occur if # "Cannot compute CoaddPsf at point (xx,yy); no input images at that point." ispsf2 = False else: # clip image to bbox pbb2.getMinX()-px0:1+pbb2.getMaxX()-px0]
# yuck! Update the PSF terms in the least-squares fit matrix.
# re-solve... else: chisqb = 1e30
# Which one do we keep? (ispsf2 and not ispsf1)): Xpsf = X2 chisq = chisq2 dof = dof2 log.debug('dof %g', dof) log.trace('Keeping shifted-PSF model') cx += dx cy += dy pkres.psfFitWithDecenter = True else: # (arbitrarily set to X1 when neither fits well)
# Save the PSF models in images for posterity. SW, SH = 1+xhi-xlo, 1+yhi-ylo psfmod = afwImage.ImageF(SW, SH) psfmod.setXY0(xlo, ylo) psfderivmodm = afwImage.MaskedImageF(SW, SH) psfderivmod = psfderivmodm.getImage() psfderivmod.setXY0(xlo, ylo) model = afwImage.ImageF(SW, SH) model.setXY0(xlo, ylo) for i in range(len(Xpsf)): for (x, y), v in zip(ipixes, A[:, i]*Xpsf[i]): ix, iy = int(x), int(y) model.set(ix, iy, model.get(ix, iy) + float(v)) if i in [I_psf, I_dx, I_dy]: psfderivmod.set(ix, iy, psfderivmod.get(ix, iy) + float(v)) for ii in range(NP): x, y = ipixes[ii, :] psfmod.set(int(x), int(y), float(A[ii, I_psf]*Xpsf[I_psf])) modelfp = afwDet.Footprint(fp.getPeaks().getSchema()) for (x, y) in ipixes: modelfp.addSpan(int(y+ylo), int(x+xlo), int(x+xlo)) modelfp.normalize()
pkres.psfFitDebugPsf0Img = psfimg pkres.psfFitDebugPsfImg = psfmod pkres.psfFitDebugPsfDerivImg = psfderivmod pkres.psfFitDebugPsfModel = model pkres.psfFitDebugStamp = img.Factory(img, stampbb, True) pkres.psfFitDebugValidPix = valid # numpy array pkres.psfFitDebugVar = varimg.Factory(varimg, stampbb, True) ww = np.zeros(valid.shape, np.float) ww[valid] = w pkres.psfFitDebugWeight = ww # numpy pkres.psfFitDebugRampWeight = rw
# Save things we learned about this peak for posterity...
# replace the template image by the PSF + derivatives # image.
# Instantiate the PSF model and clip it to the footprint # Scale by fit flux.
# Clip the Footprint to the PSF model image bbox.
# Copy the part of the PSF model within the clipped footprint. # Save it as our template.
# DEBUG
"""Build a symmetric template for each peak in each filter
Given ``maskedImageF``, ``footprint``, and a ``DeblendedPeak``, creates a symmetric template (``templateImage`` and ``templateFootprint``) around the peak for all peaks not flagged as ``skip`` or ``deblendedAsPsf``.
Parameters ---------- debResult: `lsst.meas.deblender.baseline.DeblenderResult` Container for the final deblender results. log: `log.Log` LSST logger for logging purposes. patchEdges: `bool`, optional If True and if the parent Footprint touches pixels with the ``EDGE`` bit set, then grow the parent Footprint to include all symmetric templates.
    Returns
    -------
    modified: `bool`
        If any peaks are not skipped or marked as point sources,
        ``modified`` is ``True``. Otherwise ``modified`` is ``False``.
    """
    # Create the Templates for each peak in each filter
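# [Editor's sketch, not part of the original module]
# The symmetric template is conceptually the pixelwise minimum of the
# image and its 180-degree rotation about the peak (the production code
# builds it span-by-span in compiled code). A simplified numpy version on
# a full array, with the peak at index (py, px); np.roll wraps at the
# array edges, which the real code avoids by clipping:
def _exampleSymmetricTemplate(img, py, px):
    """Pixelwise min of the image and its reflection through (py, px)."""
    flipped = img[::-1, ::-1]
    # Shift the rotated image so the peak pixel maps onto itself
    shifted = np.roll(flipped,
                      (2*py - img.shape[0] + 1, 2*px - img.shape[1] + 1),
                      axis=(0, 1))
    return np.minimum(img, shifted)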
# TODO: Check debResult to see if the peak is deblended as a point source # when comparing all bands, not just a single band continue log.trace('Peak center is not inside image; skipping %i', pkres.pki) pkres.setOutOfBounds() continue True, patchEdges) log.trace('Peak %i at (%i, %i): failed to build symmetric template', pkres.pki, cx, cy) pkres.setFailedSymmetricTemplate() continue
# possibly save the original symmetric template
"""Adjust flux on the edges of the template footprints.
Using the PSF, a peak ``Footprint`` with pixels on the edge of ``footprint`` is grown by 1.5 * ``psffwhm`` and filled in with ramped pixels. The result is a new symmetric footprint template for the peaks near the edge.
Parameters ---------- debResult: `lsst.meas.deblender.baseline.DeblenderResult` Container for the final deblender results. log: `log.Log` LSST logger for logging purposes. patchEdges: `bool`, optional If True and if the parent Footprint touches pixels with the ``EDGE`` bit set, then grow the parent Footprint to include all symmetric templates.
Returns ------- modified: `bool` If any peaks have their templates modified to include flux at the edges, ``modified`` is ``True``. """ # Loop over all filters
continue dp.maskedImage, dp.x0, dp.x1, dp.y0, dp.y1, dp.psf, pkres.peak, dp.avgNoise, patchEdges) except lsst.pex.exceptions.Exception as exc: if (isinstance(exc, lsst.pex.exceptions.InvalidParameterError) and "CoaddPsf" in str(exc)): pkres.setOutOfBounds() continue raise pkres.setPatched()
x0, x1, y0, y1, psf, pk, sigma1, patchEdges ): """Extend a template by the PSF to fill in the footprint.
Using the PSF, a footprint that touches the edge is passed to the function, grown by 1.5 * ``psffwhm``, and filled in with ramped pixels.
    Parameters
    ----------
    log: `log.Log`
        LSST logger for logging purposes.
    psffwhm: `float`
        PSF FWHM in pixels.
    t1: `afw.image.ImageF`
        The image template that contains the footprint to extend.
    tfoot: `afw.detection.Footprint`
        Symmetric Footprint to extend.
    fp: `afw.detection.Footprint`
        Parent Footprint that is being deblended.
    maskedImage: `afw.image.MaskedImageF`
        Full MaskedImage containing the parent footprint ``fp``.
    x0, y0: `int`
        Minimum x, y for the bounding box of the footprint ``fp``.
    x1, y1: `int`
        Maximum x, y for the bounding box of the footprint ``fp``.
    psf: `afw.detection.Psf`
        PSF of the image.
    pk: `afw.detection.PeakRecord`
        The peak within the Footprint whose footprint is being extended.
    sigma1: `float`
        Estimated noise level in the image.
    patchEdges: `bool`
        If ``patchEdges==True`` and if the footprint touches pixels with
        the ``EDGE`` bit set, then for spans whose symmetric mirror are
        outside the image, the symmetric footprint is grown to include
        them and their pixel values are stored.

    Returns
    -------
    t2: `afw.image.ImageF`
        Image of the extended footprint.
    tfoot2: `afw.detection.Footprint`
        Extended Footprint.
    patched: `bool`
        If the footprint touches an edge pixel, ``patched`` will be set to
        ``True``. Otherwise ``patched`` is ``False``.
    """
    # Compute the max of:
    #  - the symmetric-template-clipped image * PSF
    #  - the footprint-clipped image
    # i.e., extend the template by the PSF and "fill in" the footprint.
    # Then find the symmetric template of that image.
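# [Editor's sketch, not part of the original module]
# Illustration of the edge ramp described above: each template pixel on
# the footprint edge contributes its value times a PSF stamp, and the
# output keeps the maximum, so the template rolls off smoothly past the
# footprint boundary. Names are hypothetical and image-boundary clipping
# is ignored.
def _exampleRampEdges(template, edgeYX, psfStamp):
    """For each (y, x) edge pixel take max(template, pixel value * psfStamp)."""
    out = template.copy()
    r = psfStamp.shape[0]//2  # assumes a square, odd-sized PSF stamp
    for y, x in edgeYX:
        sub = out[y - r:y + r + 1, x - r:x + r + 1]
        sub[:] = np.maximum(sub, template[y, x]*psfStamp)
    return out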
# The size we'll grow by # make it an odd integer
# (footprint+margin)-clipped image; # we need the pixels OUTSIDE the footprint to be 0.
# find pixels on the edge of the template
# instantiate PSF image # shift PSF image to be centered on zero # clip PSF to S, if necessary # clip PSF image
# Compute the ramped-down edge pixels # For each edge pixel, Tout = max(Tout, edgepix * PSF) slice(x+px0 - ox0, x+px1+1 - ox0))
# Fill in the "padim" (which has the right variance and # mask planes) with the ramped pixels, outside the footprint
# This template footprint may extend outside the parent # footprint -- or the image. Clip it. # NOTE that this may make it asymmetric, unlike normal templates. # clip template image to bbox
"""Applying median smoothing filter to the template images for every peak in every filter.
Parameters ---------- debResult: `lsst.meas.deblender.baseline.DeblenderResult` Container for the final deblender results. log: `log.Log` LSST logger for logging purposes. medianFilterHalfSize: `int`, optional Half the box size of the median filter, i.e. a ``medianFilterHalfSize`` of 50 means that each output pixel will be the median of the pixels in a 101 x 101-pixel box in the input image. This parameter is only used when ``medianSmoothTemplate==True``, otherwise it is ignored.
    Returns
    -------
    modified: `bool`
        Whether or not any templates were modified. This will be ``True``
        as long as there is at least one source that is not flagged as a
        PSF.
    """
    # Loop over all filters
            continue
        # We want the output to go in "t1", so copy it into "inimg" for input
        # possibly save this median-filtered template
        else:
                      pkres.pki, timg.getWidth(), timg.getHeight(), filtsize, filtsize)
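# [Editor's sketch, not part of the original module]
# The smoothing above is applied through afw image utilities; its effect
# on the template array is that of a standard median filter with a box of
# side 2*medianFilterHalfSize + 1, e.g. using scipy:
from scipy.ndimage import median_filter

def _exampleMedianSmooth(template, medianFilterHalfSize=2):
    """Median-smooth a template array with a (2*half+1) x (2*half+1) box."""
    return median_filter(template, size=2*medianFilterHalfSize + 1)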
"""Make the templates monotonic.
The pixels in the templates are modified such that pixels further from the peak will have values smaller than those closer to the peak.
Parameters ---------- debResult: `lsst.meas.deblender.baseline.DeblenderResult` Container for the final deblender results. log: `log.Log` LSST logger for logging purposes.
Returns ------- modified: `bool` Whether or not any templates were modified. This will be ``True`` as long as there is at least one source that is not flagged as a PSF. """ # Loop over all filters continue
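# [Editor's sketch, not part of the original module]
# The monotonicity constraint is enforced by a compiled routine; in one
# dimension the idea reduces to a running minimum moving outward from the
# peak, so that no pixel exceeds any pixel between it and the peak:
def _exampleMonotonic1d(profile, peakIndex):
    """Clamp a 1-D profile so values never increase away from the peak."""
    out = np.array(profile, dtype=float)
    out[peakIndex:] = np.minimum.accumulate(out[peakIndex:])
    out[:peakIndex + 1] = np.minimum.accumulate(out[:peakIndex + 1][::-1])[::-1]
    return out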
"""Clip non-zero spans in the template footprints for every peak in each filter.
Peak ``Footprint``s are clipped to the region in the image containing non-zero values by dropping spans that are completely zero and moving endpoints to non-zero pixels (spans that have internal zeros are not split).
Parameters ---------- debResult: `lsst.meas.deblender.baseline.DeblenderResult` Container for the final deblender results. log: `log.Log` LSST logger for logging purposes.
Returns ------- modified: `bool` Whether or not any templates were modified. This will be ``True`` as long as there is at least one source that is not flagged as a PSF. """ # Loop over all filters continue
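# [Editor's sketch, not part of the original module]
# Illustration of the span clipping described above on a single row of
# template pixel values: a fully zero span is dropped, otherwise its
# endpoints are moved inward to the first and last non-zero pixels while
# internal zeros are kept.
def _exampleClipSpan(row, x0):
    """Return the (xmin, xmax) non-zero extent of a row starting at x0, or None."""
    nonzero = np.flatnonzero(row)
    if nonzero.size == 0:
        return None  # the whole span is zero, so it would be dropped
    return x0 + nonzero[0], x0 + nonzero[-1]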
"""Weight the templates to best fit the observed image in each filter
This function re-weights the templates so that their linear combination best represents the observed image in that filter. In the future it may be useful to simultaneously weight all of the filters together.
Parameters ---------- debResult: `lsst.meas.deblender.baseline.DeblenderResult` Container for the final deblender results. log: `log.Log` LSST logger for logging purposes.
Returns ------- modified: `bool` ``weightTemplates`` does not actually modify the ``Footprint`` templates other than to add a weight to them, so ``modified`` is always ``False``. """ # Weight the templates by doing a least-squares fit to the image log.trace('Weighting templates') for fidx in debResult.filters: _weightTemplates(debResult.deblendedParents[fidx]) return False
"""Weight the templates to best match the parent Footprint in a single filter
This includes weighting both regular templates and point source templates
    Parameters
    ----------
    dp: `DeblendedParent`
        The deblended parent to re-weight
Returns ------- None """ nchild = np.sum([pkres.skip is False for pkres in dp.peaks]) A = np.zeros((dp.W*dp.H, nchild)) parentImage = afwImage.ImageF(dp.bb) afwDet.copyWithinFootprintImage(dp.fp, dp.img, parentImage) b = parentImage.getArray().ravel()
index = 0 for pkres in dp.peaks: if pkres.skip: continue childImage = afwImage.ImageF(dp.bb) afwDet.copyWithinFootprintImage(dp.fp, pkres.templateImage, childImage) A[:, index] = childImage.getArray().ravel() index += 1
X1, r1, rank1, s1 = np.linalg.lstsq(A, b) del A del b
index = 0 for pkres in dp.peaks: if pkres.skip: continue pkres.templateImage *= X1[index] pkres.setTemplateWeight(X1[index]) index += 1
"""Remove "degenerate templates"
If galaxies have substructure, such as face-on spirals, the process of identifying peaks can "shred" the galaxy into many pieces. The templates of shredded galaxies are typically quite similar because they represent the same galaxy, so we try to identify these "degenerate" peaks by looking at the inner product (in pixel space) of pairs of templates. If they are nearly parallel, we only keep one of the peaks and reject the other. If only one of the peaks is a PSF template, the other template is used, otherwise the one with the maximum template value is kept.
Parameters ---------- debResult: `lsst.meas.deblender.baseline.DeblenderResult` Container for the final deblender results. log: `log.Log` LSST logger for logging purposes. maxTempDotProd: `float`, optional All dot products between templates greater than ``maxTempDotProd`` will result in one of the templates removed.
Returns ------- modified: `bool` If any degenerate templates are found, ``modified`` is ``True``. """
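# [Editor's sketch, not part of the original module]
# The implementation computes the dot products through HeavyFootprints
# (see below); the degeneracy test itself is the cosine of the angle
# between two templates viewed as pixel vectors:
def _exampleTemplateCosine(template1, template2):
    """Cosine similarity of two template arrays of the same shape."""
    t1 = template1.ravel()
    t2 = template2.ravel()
    norm = np.linalg.norm(t1)*np.linalg.norm(t2)
    return 0.0 if norm == 0 else float(np.dot(t1, t2)/norm)
# A pair with _exampleTemplateCosine(...) > maxTempDotProd would be flagged
# as degenerate and one of the two templates rejected.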
# We build a matrix that stores the dot product between templates. # We convert the template images to HeavyFootprints because they already have a method # to compute the dot product. afwImage.MaskedImageF(pkres.templateImage)))
# Normalize the dot products to get the cosine of the angle between templates A[i, j] = 0 else:
# Iterate over pairs of objects and find the maximum non-diagonal element of the matrix. # Exit the loop once we find a single degenerate pair greater than the threshold.
# If one of the objects is identified as a PSF keep the other one, otherwise keep the one # with the maximum template value keep = indexes[rejectedIndex] reject = indexes[i] reject = indexes[rejectedIndex] keep = indexes[i] else: keep))
strayFluxToPointSources='necessary', clipStrayFluxFraction=0.001, getTemplateSum=False): """Apportion flux to all of the peak templates in each filter
Divide the ``maskedImage`` flux amongst all of the templates based on the fraction of flux assigned to each ``template``. Leftover "stray flux" is assigned to peaks based on the other parameters.
    Parameters
    ----------
    debResult: `lsst.meas.deblender.baseline.DeblenderResult`
        Container for the final deblender results.
    log: `log.Log`
        LSST logger for logging purposes.
    assignStrayFlux: `bool`, optional
        If True then flux in the parent footprint that is not covered by
        any of the template footprints is assigned to templates based on
        their 1/(1+r^2) distance. How the flux is apportioned is determined
        by ``strayFluxAssignment``.
    strayFluxAssignment: `string`, optional
        Determines how stray flux is apportioned.
        * ``trim``: Trim stray flux and do not include it in any footprints.
        * ``r-to-peak`` (default): Stray flux is assigned based on
          1/(1+r^2) from the peaks.
        * ``r-to-footprint``: Stray flux is distributed to the footprints
          based on 1/(1+r^2) of the minimum distance from the stray flux
          to the footprint.
        * ``nearest-footprint``: Stray flux is assigned to the footprint
          with the lowest L-1 (Manhattan) distance to the stray flux.
    strayFluxToPointSources: `string`, optional
        Determines how stray flux is apportioned to point sources.
        * ``never``: Never apportion stray flux to point sources.
        * ``necessary`` (default): Point sources are included only if there
          are no extended sources nearby.
        * ``always``: Point sources are always included in the 1/(1+r^2)
          splitting.
    clipStrayFluxFraction: `float`, optional
        Minimum stray-flux portion. Any stray-flux portion less than
        ``clipStrayFluxFraction`` is clipped to zero.
    getTemplateSum: `bool`, optional
        As part of the flux calculation, the sum of the templates is
        calculated. If ``getTemplateSum==True`` then the sum of the
        templates is stored in the result (a `DeblendedFootprint`).
Returns ------- modified: `bool` Apportion flux always modifies the templates, so ``modified`` is always ``True``. However, this should likely be the final step and it is unlikely that any deblender plugins will be re-run. """ raise ValueError((('strayFluxToPointSources: value \"%s\" not in the set of allowed values: ') % strayFluxToPointSources) + str(validStrayPtSrc)) raise ValueError((('strayFluxAssignment: value \"%s\" not in the set of allowed values: ') % strayFluxAssignment) + str(validStrayAssign))
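# [Editor's sketch, not part of the original module]
# Illustration of the default ``r-to-peak`` stray-flux weighting described
# above: a stray-flux pixel is split among peaks in proportion to
# 1/(1 + r^2), with portions below clipStrayFluxFraction zeroed. The real
# apportionment is done in compiled code; names here are hypothetical.
def _exampleStrayFluxWeights(pixelXY, peaksXY, clipStrayFluxFraction=0.001):
    """Fractional stray-flux weights of one pixel with respect to each peak."""
    px, py = pixelXY
    r2 = np.array([(px - x)**2 + (py - y)**2 for x, y in peaksXY], dtype=float)
    w = 1.0/(1.0 + r2)
    w /= w.sum()
    w[w < clipStrayFluxFraction] = 0.0
    return w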
# Prepare inputs to "apportionFlux" call. # template maskedImages # template footprints # deblended as psf # peak x,y # indices of valid templates
# for stray flux...
# Now apportion flux according to the templates # .getDimensions()) # sumimg.setXY0(bb.getMinX(), bb.getMinY())
elif strayFluxToPointSources == 'always': strayopts |= butils.STRAYFLUX_TO_POINT_SOURCES_ALWAYS
# this is the default elif strayFluxAssignment == 'r-to-footprint': strayopts |= butils.STRAYFLUX_R_TO_FOOTPRINT elif strayFluxAssignment == 'nearest-footprint': strayopts |= butils.STRAYFLUX_NEAREST_FOOTPRINT
pkx, pky, strayopts, clipStrayFluxFraction)
# Shrink parent to union of children
# Store the template sum in the deblender result debResult.setTemplateSums(sumimg, fidx)
# Save the apportioned fluxes
# NOTE that due to a swig bug (https://github.com/swig/swig/issues/59) # we CANNOT iterate over "strayflux", but must index into it. else:
# Set child footprints to contain the right number of peaks.
(pkres.strayFlux, False)]: |