Calibration Train

Calibration Challenge

We propose to run a "Calibration Challenge" starting the first week of December 2015.

The goal will be to process a run with zeroed-out calibrations and see how many calibration constants can be extracted automatically.
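As a rough sketch (not an official recipe), such a pass could point the reconstruction at a CCDB variation holding the zeroed constants through JANA's calibration context. The variation name "calib_challenge" and the input path are hypothetical placeholders; only JANA_CALIB_CONTEXT itself is a standard parameter:

hd_root -PJANA_CALIB_CONTEXT="variation=calib_challenge" \
        -PPLUGINS=HLDetectorTiming /path/to/file.evio

Here HLDetectorTiming is just one of the plugins listed under "What is Being Run" below.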

Test runs will be performed in the weeks leading up to this challenge.

More details to follow.

Organization

  • The jobs will be submitted every Tuesday at noon, JLab time.
  • The jobs will be run from the gxproj3 account (used in parallel with the EventStore jobs)
  • The output of the jobs will be stored in ...

Run Ranges

The following runs will be processed:

RunPeriod-2015-03

  • 2931, 3079, 3179, 3180, 3183, 3185

Calibrations

Job Requirements

Each calibration process should include the following:

What is Being Run

The following plugins are currently being run (a sample invocation follows the list):

  • RF_online (RF signal)
  • BCAL_TDC_Timing
  • HLDetectorTiming
  • PSC_TW
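As a sketch of how these plugins would be attached to a standard reconstruction pass, assuming a placeholder input path and thread count (each plugin writes its histograms to the hd_root ROOT output file):

hd_root -PPLUGINS=RF_online,BCAL_TDC_Timing,HLDetectorTiming,PSC_TW \
        -PNTHREADS=4 /path/to/runNNNNNN.evio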

Working on adding:

Code

The current code can be checked out with:

svn co https://halldsvn.jlab.org/repos/trunk/home/sdobbs/calibration_train/

The ROOT library directory can be found at: ...

Older Information

Proposal

  • Use case: streamline tape library usage, provide a common environment for production and development
    • Example: users develop plugins on their favorite files/runs, then use the train to run over larger data sets
  • Run every week (Wednesday to avoid conflict with monitoring?)
  • Uses subset of runs
  • Users provide:
    • DANA plugin
    • Optional post-processing scripts to be run after every/all runs
      • Curated in SVN
  • Results stored in standard location(s)
    • Possible results: ROOT files, images, calibration constants, web pages
  • Uses SWIF? (buzzword compliance; a sketch follows this list)
  • Uses GlueX project account (gxproj3? 4? 5?)
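A minimal sketch of what one weekly submission could look like under SWIF, using the spring 2015 run list. The workflow name, project, and track are assumptions, not settled choices; check swif help for the exact add-job options:

swif create -workflow calib_train
for RUN in 2931 3079 3179 3180 3183 3185 ; do
    swif add-job -workflow calib_train -project gluex -track reconstruction \
        hd_root -PPLUGINS=RF_online,BCAL_TDC_Timing,HLDetectorTiming,PSC_TW \
        /path/to/run${RUN}.evio
done
swif run -workflow calib_train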

Runs to Use

Several possibilities:

  1. Large/Popular Runs
    • Fall 2014: 1514, 1769, 1777, 1787, 1803, 1807, 1810, 1825, 1847, 1852, 1854, 1871, 1872, 2138, 2206, 2207, 2209, 2223, 2397
    • Spring 2015: 2931, 3079, 3179, 3180, 3183, 3185
  2. Other big sets of runs:
    • Fall 2014: 1516-1520, 1769-1777, 1900, 2179-2184, 2205, 2228, 2407-2409, 2416-2420
    • Look for groups of runs with similar conditions
  3. Others


This also raises the question: what are the good runs?

One proposal:

  • Tag each run in monitoring database with one of three values:
    • 0 - non-production quality
    • 1 - production quality
    • 2 - production quality, used for calibrations
  • The idea is that a tag ≥ 1 means the run is good to use for physics
    • Finer grained information can be stored in RCDB
  • To determine production quality, develop quality metrics for each subdetector and use a combination of those metrics and a by-eye check