Topics for the 2015 Software Review

Overall Theme of the Presentation(s)

Preferred Format

Given that this is the 3rd software review, we expect that the entire committee would like to see where things stand. As such, we would prefer to have only a plenary presentation.

Topics to be Covered

  1. Report on Successful Data Challenges
    • DC1 - December 2012/ January 2013
      • 5 billion events - OSG, JLab, CMU
      • 1200 Concurrent Jobs at JLab.
    • DC2 - March/April 2014
      • 10 billion events with EM backgrounds included - OSG, JLab, MIT, CMU, FSU
      • 4500 Concurrent Jobs at JLab
      • Well under 0.1% failure rate
    • DC3 - January/February 2015
      • Read data in raw-event format from tape and produce DST (REST) files.
      • Load up as many JLab cores as possible.
      • Run Multi-threaded jobs
      • Already doing full reprocessing of the Fall 2014 data from tape every two weeks.
  2. Data Acquisition Successes - Running Fall 2014 (stealth data challenge).
    • Exceeded the experiment's 300 MB/s transfer-to-tape bandwidth (see the back-of-the-envelope check after this list).
      • ~500 million events.
      • 7000 files, 120 TB of data
    • Most data were taken in full pulse mode of the Flash ADCs
      • Need to get final processing algorithms on the FPGAs in the FADCs
      • Need to clean raw data of massive unused headers.
    • Event rates of 2 kHz for the full experiment, much higher for individual components.
      • Need to move to block mode.
      • Need to move to FPGA processing to compress data.
    • Full DAQ chain to local RAID disk, transfer to tape, and automatic processing from tape.
    • Robustness issues with the system
      • Handle corrupted evio data
      • Problems with some FADCs getting out of sync.
    • Stealth Online Data Challenge
  3. Revisit data and computing spreadsheets
    • Update based on current software performance.
    • Update with best estimates of raw data footprint.
  4. Offline monitoring
    • Monitoring browser
    • Analyze data as it appears on the silo (see the polling sketch after this list)
    • Reconstruction results
  5. Calibration committee
    • Bi-weekly meeting
    • Preliminary list of constants compiled
    • Calibration still needs to be regularized
    • Calibration database training
  6. CCDB successes
    • Command-line interface
    • SQLite form of the database (see the sqlite3 sketch after this list)
  7. Analysis results
    • Electron identification in the FCAL
    • pi0 peak
    • Proton ID with the TOF
    • Proton ID with dE/dx
    • rho meson in pi+ pi-
    • omega meson in pi+ pi- pi0
  8. Data transfer to CMU via Globus Online
  9. Data management: event store, etc.
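
A back-of-the-envelope check of the Fall 2014 numbers in item 2, which can also feed the raw-data footprint estimates in item 3. This is only an illustrative sketch: the event count, data volume, file count, and 2 kHz rate are taken from the list above, while the 1e7-second running period used for the projection is an assumed figure, not something quoted on this page.

  # Back-of-the-envelope check of the Fall 2014 DAQ numbers listed above.
  events     = 500e6   # events recorded in Fall 2014
  data_tb    = 120.0   # raw data volume written to tape, in TB
  files      = 7000    # number of raw data files
  event_rate = 2e3     # sustained event rate in Hz (2 kHz)

  event_size_kb = data_tb * 1e9 / events            # average event size, kB
  file_size_gb  = data_tb * 1e3 / files             # average file size, GB
  tape_rate_mbs = event_rate * event_size_kb / 1e3  # implied MB/s at 2 kHz

  print(f"average event size     : {event_size_kb:.0f} kB")
  print(f"average file size      : {file_size_gb:.1f} GB")
  print(f"implied rate to tape   : {tape_rate_mbs:.0f} MB/s")  # compare to the 300 MB/s budget

  # Assumed projection: 1e7 s of running at 2 kHz (the running period is an assumption).
  footprint_pb = event_rate * 1e7 * event_size_kb / 1e12
  print(f"projected raw footprint: {footprint_pb:.1f} PB")

With these inputs the average event is roughly 240 kB, so a sustained 2 kHz uncompressed would imply close to 480 MB/s, consistent with the items above about moving processing into the FPGAs and stripping unused headers.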
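
For item 4, one simple pattern for analyzing data as it appears on the silo is a polling loop that notices newly staged raw-data files and hands each one to a monitoring job. The sketch below is illustrative only: the watch directory, the file pattern, and run_monitoring.sh are hypothetical placeholders, not the actual GlueX offline monitoring machinery.

  # Illustrative polling loop: submit a monitoring job for each newly staged file.
  # WATCH_DIR and run_monitoring.sh are hypothetical placeholders.
  import glob
  import subprocess
  import time

  WATCH_DIR = "/cache/halld/rawdata"   # assumed staging area for files pulled from tape
  POLL_SECONDS = 600                   # look for new files every 10 minutes

  seen = set()
  while True:
      for path in sorted(glob.glob(WATCH_DIR + "/*.evio")):
          if path not in seen:
              seen.add(path)
              subprocess.Popen(["./run_monitoring.sh", path])  # hand the file to a monitoring job
      time.sleep(POLL_SECONDS)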
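
For item 6, the SQLite form of the calibration database can be inspected with completely standard tools, for example Python's sqlite3 module. The sketch below only lists the tables in the file via SQLite's own catalog, so it assumes nothing about the CCDB schema; the file name ccdb.sqlite is a placeholder.

  # List the tables in a local SQLite copy of the calibration database.
  # "ccdb.sqlite" is a placeholder file name.
  import sqlite3

  conn = sqlite3.connect("ccdb.sqlite")
  cur = conn.cursor()
  # sqlite_master is SQLite's built-in catalog, so this query works on any SQLite file.
  cur.execute("SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")
  for (name,) in cur.fetchall():
      print(name)
  conn.close()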