Overview of Single Epoch Processing for the Dark Energy Survey

Systems for processing Dark Energy Survey telescope data are being developed mainly by the DESDM project.

This page gives a high-level overview of the processing needed for a single exposure ("epoch").

Goals

The goal of this project is to study how practical it is to run the single exposure processing on Grid worker nodes.

The main steps of this project are:

  1. Building and adapting the image processing applications for standard Grid platforms.
  2. Adapting the image processing applications to work in single exposure mode. This includes modifying the remap step.
  3. Adapting the prototype orchestration scripts from NCSA to properly pass information across computational steps.
  4. Porting the single exposure pipeline to run on a Grid cluster, and scaling up Grid usage to 300 images at a time, with input data either pre-fetched or transferred on request.
  5. Measuring standard computational metrics when running on the Grid. These metrics include application run time, time delays for data transfers, performance of local peripherals, etc.
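
As a rough illustration of step 5, the sketch below wraps each pipeline stage in a timer and records its wall-clock time and exit status. The stage names and placeholder commands are assumptions for illustration, not the actual DESDM invocations.

    import json
    import subprocess
    import time

    def run_and_time(label, cmd):
        """Run one pipeline stage and report wall-clock time and exit status."""
        start = time.time()
        result = subprocess.run(cmd)   # placeholder command list
        return {"step": label,
                "wall_seconds": round(time.time() - start, 1),
                "exit_code": result.returncode}

    if __name__ == "__main__":
        # Placeholder commands; a real run would invoke the actual applications.
        metrics = [run_and_time("imcorrect", ["echo", "imcorrect placeholder"]),
                   run_and_time("remap", ["echo", "remap placeholder"])]
        print(json.dumps(metrics, indent=2))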

High-level Description of Processing

Inputs

  1. An image from the telescope -- about 2 GB.
  2. A bias image -- 2 GB.
  3. A flat image -- 2 GB.

Items 2 and 3 change only once every night or two.

Details:

  • One CCD image is 4k x 4k pixels at 2 or 4 bytes per pixel, which works out to 32 or 64 MB. One exposure consists of 62 CCD images, for a total of roughly 2 GB or 4 GB (see the back-of-the-envelope sketch after this list).

  • There are 5 different bands (grizY), but each band is done as a separate exposure.

  • Taking an exposure (with DECam) takes ~100 seconds. The filter has to be swapped for the next exposure.
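
A minimal sketch of the size arithmetic above, assuming 62 CCDs of 4k x 4k pixels (real raw files may be compressed and carry extra header overhead):

    # Back-of-the-envelope data volume for one DECam exposure.
    CCDS_PER_EXPOSURE = 62          # science CCDs in the focal plane
    PIXELS_PER_CCD = 4096 * 4096    # "4k x 4k"

    for bytes_per_pixel in (2, 4):  # raw integers vs. 4-byte float reds
        ccd_mb = PIXELS_PER_CCD * bytes_per_pixel / 1024**2
        exposure_gb = CCDS_PER_EXPOSURE * ccd_mb / 1024
        print(f"{bytes_per_pixel} bytes/pixel: {ccd_mb:.0f} MB per CCD, "
              f"{exposure_gb:.1f} GB per exposure")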

Outputs

  1. A reduced image with SCAMP headers (4 GB). This is kept to help check the processing.
  2. A remapped image (4 GB).
  3. A small table from PCM (~100 MB).

Time Estimates

The processing for one exposure should be on the order of 10 hours (?).

About 300 exposures need to be processed every day when the sky is clear enough for the telescope to be in Standard Survey Mode. If a few images take longer, that is acceptable.

If there are a few thin clouds, the telescope switches to Supernova Mode, since Standard Survey Mode needs excellent conditions. Supernova Mode generates only about 100 images a night, but it is important that these be processed before the next night so that any supernovae can be followed up on.
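
A rough capacity estimate under the assumptions above (300 exposures per night, ~10 hours of processing each, a goal of finishing within about 24 hours, and one job per exposure):

    # Rough sizing of the Grid capacity needed for one night of survey data.
    exposures_per_night = 300
    hours_per_exposure = 10       # order-of-magnitude estimate from above
    deadline_hours = 24           # finish before the next night

    total_cpu_hours = exposures_per_night * hours_per_exposure
    concurrent_slots = total_cpu_hours / deadline_hours
    print(f"{total_cpu_hours} CPU-hours per night -> about "
          f"{concurrent_slots:.0f} worker slots busy around the clock")
    # Splitting an exposure per chip changes the job granularity,
    # not the total CPU-hours.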

Processing

The steps run in this order; the calibration data each step needs is noted in parentheses.

  1. imcorrect and crosstalk (needs the bias and flat images)
       These process the raw data into "reduced images", or "reds" for short.
       Each pixel becomes a 4-byte float.

  2. SExtractor pass #1 -- detects the bright objects.

  3. SCAMP (needs a catalog of known objects: USNO-B)
       Matches the bright objects against the known objects, so we know what
       each CCD is looking at. SCAMP writes this information into the FITS
       headers (it does not change the pixels). The output is "reduced +
       SCAMP headers", plus a few numbers that say how to rotate and stretch
       the image to match the fixed coordinate system.

  4. remap
       Aligns the chips onto the fixed coordinate system:
       RA is the x-axis, DEC is the y-axis.

  5. Full SExtractor pass -- detects and measures all the objects in the images.
       Computationally intensive: ~5 minutes per chip (?), so ~5 hours per exposure.

  6. PSF calculation

  7. PCM -- Photometric Calibration Module (needs the photometric solution)
       Produces a table of objects with columns like:
       run id, object id, ra, dec, magnitude
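
A minimal sketch of how these steps might be chained for one exposure. The script names and the driver function are placeholders, not the actual DESDM applications or the NCSA orchestration scripts:

    # Hypothetical single-exposure driver: each stage is a placeholder shell
    # command; a real run would call the DESDM applications with their own
    # arguments and check their outputs.
    import subprocess
    import sys

    STAGES = [
        ("crosstalk+imcorrect", "run_detrend.sh"),    # raw -> reduced ("reds")
        ("sextractor_bright",   "run_sex_bright.sh"), # detect bright objects
        ("scamp",               "run_scamp.sh"),      # astrometric match vs. USNO-B
        ("remap",               "run_remap.sh"),      # put chips on the RA/DEC grid
        ("sextractor_full",     "run_sex_full.sh"),   # detect and measure everything
        ("psf",                 "run_psf.sh"),        # PSF calculation
        ("pcm",                 "run_pcm.sh"),        # photometric calibration table
    ]

    def process_exposure(exposure_id):
        for label, script in STAGES:
            print(f"[{exposure_id}] running {label}")
            rc = subprocess.call(["bash", script, exposure_id])
            if rc != 0:
                # e.g. SCAMP failures need human attention (see Notes below)
                sys.exit(f"{label} failed for {exposure_id} with exit code {rc}")

    if __name__ == "__main__":
        process_exposure(sys.argv[1])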

Notes

Human intervention is needed to fix at least SCAMP failures. It is an open question whether processing should continue if SCAMP fails on only a few chips.

The job logs have to be returned to help with debugging.

After single epoch processing comes coadd processing. Science code, such as the weak lensing processing, also needs to run.

Processing cannot be done one chip at a time, since some steps need several chips at once; for example, crosstalk correction needs 12. The full SExtractor pass, however, can be done separately for each chip, and DAGMan could be useful for splitting up that processing if it takes a long time (see the sketch below).
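
A minimal sketch of that idea: a short script that writes a Condor DAGMan file with one full-SExtractor node per chip and a final node that runs once all chips finish. The submit-file names and exposure name are made up for illustration:

    # Write a DAGMan file with one full-SExtractor job per chip and a final
    # job that collects the results once every chip has finished.
    N_CHIPS = 62  # science CCDs per exposure

    def write_dag(exposure_id, path="sextractor_full.dag"):
        with open(path, "w") as dag:
            for chip in range(1, N_CHIPS + 1):
                dag.write(f"JOB sex_{chip:02d} sex_full.sub\n")
                dag.write(f'VARS sex_{chip:02d} exposure="{exposure_id}" chip="{chip}"\n')
            dag.write("JOB collect collect.sub\n")
            parents = " ".join(f"sex_{c:02d}" for c in range(1, N_CHIPS + 1))
            dag.write(f"PARENT {parents} CHILD collect\n")

    if __name__ == "__main__":
        write_dag("DECam_00123456")  # hypothetical exposure name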

Small scripts could be written to interact with the database when needed.
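
For example, a small helper along these lines could record the status of a processing step; the table name, columns, and the choice of a PostgreSQL driver (psycopg2) are assumptions, since this page does not describe the actual DESDM database or schema:

    # Hypothetical helper: record the status of one processing step.
    # Table and column names are made up for illustration.
    import psycopg2

    def record_status(dsn, exposure_id, step, status):
        conn = psycopg2.connect(dsn)
        try:
            with conn:  # commits on success, rolls back on error
                with conn.cursor() as cur:
                    cur.execute(
                        "INSERT INTO se_processing_status "
                        "(exposure_id, step, status) VALUES (%s, %s, %s)",
                        (exposure_id, step, status),
                    )
        finally:
            conn.close()

    # record_status("dbname=des user=des", "DECam_00123456", "scamp", "ok")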

Running on the Grid

There is a preliminary system for running many of these steps on the Grid.

Limited Public Access to DES SV data

Limited Access to DES Data for November, 2012 SV run

-- The Dark Energy Survey and OSG/FNAL User Support
