December 3, 2014 Calibration


GlueX Calibration Meeting
Wednesday, December 3, 2014
11:00 am, EST
JLab: CEBAF Center, F326

Connection Using Bluejeans

  1. To join via a Polycom room system, go to the IP address 199.48.152.152 (bjn.vc) and enter the meeting ID: 630804895.
  2. To join via a web browser, go to https://bluejeans.com/630804895.
  3. To join via phone, use one of the following numbers and the Conference ID: 630804895
    • US or Canada: +1 408 740 7256 or
    • US or Canada: +1 888 240 2560
  4. Upon connection all microphones are automatically muted. To unmute your microphone on a Polycom or equivalent unit, enter *4.
  5. More information on connecting to BlueJeans is available.

Agenda

  1. Announcements
  2. Status Update
  3. Calibration status/updates
    1. Calorimetry
      1. π0s/electrons in FCAL - GlueX-Doc-2609, GlueX-Doc-2610
      2. Calibration of BCAL with cosmic data (Andrei & Irina)
    2. Tracking
      1. dE/dx measurements (Justin)
      2. beta v. p measurements
      3. Vertex Kinematic Fitting - Paul
    3. TOF / Start Counter
  4. Data Monitoring
    1. Commissioning Software
    2. Run Browser, Plot Browser, and Time Series webpages
  5. Physics Analysis
    1. ω Peak Hunt: Mike?
    2. Δ Peak Hunt: Justin?
    3. ρ Peak Hunt: Sean?
    4. γ p → p π+ π-: Kei?
    5. K0 Peak Hunt: Naomi
  6. AOB

Minutes

Attending: Sean (NU); Simon, Beni, Kei, Nathan, Mark I., Will M., Sasha S., Elton, Will L., David L., Sergey F., Mike S. (JLab); Matt S. (IU); Curtis, Paul, Naomi (CMU); Eric (FIU); Justin (MIT)

Status Update

Activity in the CCDB has been slowly ramping up. Nathan set the tagger endpoint energy to enable the calculation of photon energies in the tagger. Simon added some preliminary inter-detector time offsets; these depend on the trigger timing. He is going to update these numbers for the FCAL trigger today, and numbers for the BCAL trigger will follow after some further analysis. Some run-dependent constants have also been added, including target conditions and FCAL gains.

FCAL

  • Matt reviewed the current status of the FCAL energy scale. The base energy scale in the CCDB was increased by a factor of 4. A factor of 2 comes from an increase in the high-voltage settings that had not been propagated to the reconstruction. The remaining factor of 2 was determined using electrons hitting the inner rings of the FCAL, and could have several origins: the previous calibration was only a rough estimate from the FCAL beam test, and the observed π0s were of low energy, so high DAC thresholds could have excluded a significant number of cells on the edges of the showers.
  • The next step is a finer calibration using an algorithm that balances relative gains between blocks by minimizing the width of the π0 peak. John Z. has written a plugin to do this, but it is an iterative procedure, so Matt is working on a plugin that skims the π0-containing events to speed the procedure up (a sketch of the gain-balancing idea follows this list).
  • Using these improved calibrations, they can determine increased HV settings and lower DAC thresholds to improve efficiency.
  • After some discussion with Sasha, the per-block energy threshold was estimated to be ~5 MeV, which was agreed to be on the high side.
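
For reference, a minimal sketch in Python of the gain-balancing iteration discussed above. The Pi0Candidate data model, the energy-fraction weighting, and the relaxation scheme are illustrative assumptions and are not taken from John Z.'s plugin.

  from dataclasses import dataclass
  from math import sqrt

  PI0_MASS = 0.1349768  # GeV, nominal pi0 mass

  @dataclass
  class Pi0Candidate:
      # Hypothetical data model (not the actual reconstruction objects):
      # raw shower energies, opening angle, and (block_id, energy_fraction)
      # lists describing which blocks contribute to each shower.
      e1_raw: float
      e2_raw: float
      cos_theta12: float
      blocks1: list
      blocks2: list

      def shower_energies(self, gains):
          # Re-weight each shower energy by the gains of its contributing blocks.
          e1 = self.e1_raw * sum(gains[b] * f for b, f in self.blocks1)
          e2 = self.e2_raw * sum(gains[b] * f for b, f in self.blocks2)
          return e1, e2

      def mass(self, gains):
          # Two-photon invariant mass: m^2 = 2 E1 E2 (1 - cos(theta12)).
          e1, e2 = self.shower_energies(gains)
          return sqrt(2.0 * e1 * e2 * (1.0 - self.cos_theta12))

  def calibrate_gains(candidates, n_blocks, n_iterations=10):
      # Iteratively scale each block's gain so that pi0s whose showers use
      # that block reconstruct, on average, at the nominal mass; equalizing
      # the per-block mass scale is what narrows the pi0 peak.
      gains = [1.0] * n_blocks
      for _ in range(n_iterations):
          num = [0.0] * n_blocks
          den = [0.0] * n_blocks
          for c in candidates:
              # m^2 scales linearly with each shower energy, hence with gains.
              r2 = (PI0_MASS / c.mass(gains)) ** 2
              for b, f in c.blocks1 + c.blocks2:
                  num[b] += f * r2
                  den[b] += f
          for b in range(n_blocks):
              if den[b] > 0.0:
                  gains[b] *= num[b] / den[b]
      return gains

Skimming out the π0-containing events, as Matt is doing, mainly shortens each pass of this loop, since the full data set would otherwise have to be re-read on every iteration.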

BCAL

  • Andrei gave a comprehensive review of the current status of the BCAL calibration using cosmic ray data taken with the external scintillator trigger and simulations using a detailed BCAL geometry.
  • The waveforms of BCAL hits were studied in order to determine the fADC-to-MeV conversion factors and the hardware thresholds; these values could be determined simultaneously. Non-trivial pedestal instabilities were noticed, so only hits with low pedestals were used.
  • The observed pulse-integral distributions are the product of two contributions: the energy deposited by the charged particle, which can be obtained from the simulations, and a threshold function reflecting the effect of the readout algorithms. The modules fall into three classes: (1) the minimum-ionizing-particle (MIP) peak is clearly separated from the threshold, (2) the MIP peak is close to the threshold, and (3) the MIP peak is partially cut off by the threshold. The first two classes are usable; the third is difficult but perhaps not impossible. He showed initial calibration results, but work is still ongoing.
  • Some more notes: the energy plots are in attenuated MeV; divide by ~0.095 to get physical MeV (a sketch of this conversion follows this list). The energy deposited by a muon that passes all the way through a module is ~200 MeV, which does not happen for all modules due to the geometry of the BCAL. Readout thresholds were generally higher than expected. The 0.095 factor (9.5%) comes from simulations of the sampling fraction for the entire calorimeter; the sampling fraction has also been simulated versus energy and angle and the results entered into lookup tables, which have not yet been implemented in the reconstruction.
  • Running the BCAL at lower temperatures might reduce detector noise and the pedestals, which would allow for lower readout thresholds. Testing this hypothesis is a goal of this month's running.
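
For reference, a minimal sketch in Python of the attenuated-to-physical conversion quoted above, assuming a constant sampling fraction of 0.095; the function name and the numerical example are illustrative only.

  # Sampling fraction for the entire calorimeter, from simulation (see above).
  SAMPLING_FRACTION = 0.095

  def attenuated_to_physical_mev(e_attenuated_mev,
                                 sampling_fraction=SAMPLING_FRACTION):
      # Convert the "attenuated MeV" scale used in the plots to physical MeV
      # by dividing out the sampling fraction. A constant is used here; the
      # simulated energy- and angle-dependent lookup tables mentioned above
      # would replace this constant once implemented in the reconstruction.
      return e_attenuated_mev / sampling_fraction

  # Purely as an arithmetic example: 19 attenuated MeV / 0.095 = 200 physical MeV.
  print(attenuated_to_physical_mev(19.0))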

Tracking

  • Justin reviewed a couple of studies he is working on.
  • First, he reviewed a study of dE/dx in the CDC and FDC, which showed evidence for protons. New bggen simulations were needed for these comparisons because of the updated FDC geometry. He is working on improving the truncated-mean calculation used for the dE/dx measurement (a sketch follows this list), and is following up on suggestions from Lubomir on how to improve the pulse-integral calculation for the FDC.
  • He then showed a first look at using time-of-flight information to identify protons. Using SC/TOF timing, he saw clear evidence of protons with both the FCAL and BCAL triggers. Using SC/BCAL timing was more difficult, as the timing structures seen were more complicated; selecting tracks with large dE/dx in the CDC cleaned up the data enough to give some evidence for protons.
  • Paul discussed the status of the standard analysis PID code. By default, the code prefers TDC times, which are not yet available. There are some beta vs. p bands in the monitoring plots, but these are artificial: the code relies on timing information from the photon tagger to determine the event time, and that information is not yet reliable.
  • Paul also showed results from the DVertex factory, which gave reasonable event vertices and good vertex kinematic fits, considering the very early stage of the detector calibrations.
  • Paul offered to help turn any other PID methods that are developed into actions that can be called by the standard analysis code.
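
For reference, a minimal sketch in Python of a truncated-mean dE/dx and of the expected beta(p) relation behind the beta vs. p comparisons above; the truncation fraction and function names are illustrative assumptions, not the actual reconstruction code.

  from math import sqrt

  PROTON_MASS = 0.938272  # GeV
  PION_MASS = 0.139570    # GeV

  def truncated_mean_dedx(dedx_hits, keep_fraction=0.6):
      # Truncated mean of the per-hit dE/dx values: sort, drop the highest
      # (Landau-tail) hits, and average the rest. The 60% keep fraction is
      # an illustrative assumption, not the value used in the GlueX code.
      if not dedx_hits:
          return 0.0
      ordered = sorted(dedx_hits)
      n_keep = max(1, int(round(keep_fraction * len(ordered))))
      return sum(ordered[:n_keep]) / n_keep

  def expected_beta(p_gev, mass_gev):
      # beta = p / sqrt(p^2 + m^2); e.g. at p = 0.8 GeV/c a proton has
      # beta ~ 0.65 while a pion has beta ~ 0.98, which is what makes the
      # beta-vs-p bands useful for picking out protons.
      return p_gev / sqrt(p_gev * p_gev + mass_gev * mass_gev)

Dropping the upper tail before averaging is what keeps the truncated mean stable against Landau fluctuations in the individual hits.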

Other Detectors

  • Mark reported that the start counter TDCs are currently OK. A method for using the TOF TDCs has also been developed; it uses the TI counter to disentangle the TOF TDC times. For more details, see Mark's talk at the last online meeting and this report by Kei.
  • Simon reported seeing good coincidences between the ADC and TDC times for the start counter, but more TDC hits than ADC hits.
  • The CCDB table structures for the TOF and start counter are in the process of being updated.

Data Analysis

  • Paul reminded the detector groups to check the offline monitoring data. Mark remarked that the message had been received to some extent by people stationed at JLab, but less so by the off-site university groups, and that it might be helpful to discuss these plots regularly in the detector group meetings.
  • The data look to be in good enough shape to start some initial bump-hunting analyses. The analyses listed in the agenda above will be started, with reports hoped for at the next physics meeting.
  • It was also pointed out that quicker feedback from the data monitoring would increase its visibility and usefulness. One bottleneck is that we have been taking data with the full pulse shapes, which is slow to analyze; mode 7 data will be much faster to analyze. There have also been more serious delays caused by the batch system. Mark is working on a solution in which we will create multiple user accounts and tune their priority levels appropriately.