GlueX Offline Meeting, June 1, 2018

From GlueXWiki

GlueX Offline Software Meeting
Friday, June 1, 2018
10:00 am EDT
JLab: CEBAF Center A110
BlueJeans: 968 592 007


Agenda

  1. Announcements
    1. new version file (version_2.35_jlab.xml) with new sim-recon tag for bggen simulation
    2. New release of build_scripts: version 1.31
    3. scratch disk policy change
    4. Software Review, Summer 2018
  2. Review of minutes from the May 18 meeting (all)
  3. request for university compute resources (Richard)
  4. Splitting up sim-recon into sim and recon (Sean, Mark)
  5. g3, g4, and hdds, ccdb (Mark)
  6. Review of recent pull requests (all)
  7. Review of recent discussion on the GlueX Software Help List (all)
  8. Action Item Review (all)

Communication Information

Remote Connection


Talks can be deposited in the directory /group/halld/www/halldweb/html/talks/2018 on the JLab CUE. This directory is also accessible from the web.



Attendees

  • FSU: Sean Dobbs
  • JLab: Amber Boehnlein, Thomas Britton, Hovanes Egiyan, Mark Ito (chair), David Lawrence, Simon Taylor

There is a recording of this meeting on the BlueJeans site. Use your JLab credentials to access it.


Announcements

  1. Simulation Launch. Thomas has started a large-scale bggen simulation run on the OSG. John Hardin's plug-ins are included.
  2. new version file (version_2.35_jlab.xml) with new sim-recon tag for bggen simulation. This will be used for the simulation mentioned above. It was re-tagged last night.
  3. New release of build_scripts: version 1.31. The default source of geometry information is now the CCDB.
  4. scratch disk policy change. The life of unread files has been increased from 14 days to 60 days.
  5. Software Review, Summer 2018. No word yet.

Review of Minutes from the May 18 Meeting

We went over the minutes.

  • David reported that our NERSC allocation for this year is 23 million core hours. Chris Larrieu of Scientific Computing is working on the system for staging data to NERSC. For comparison, one reconstruction pass (ver01) over the 2017 data used 3.3 million core hours, and Richard Jones reported getting one million core hours for simulation on the OSG over a recent 10-day period.

Splitting up sim-recon into sim and recon

Sean went over a proposal on why and how to split sim-recon. He and Mark had discussed the issues last week. The major points in the proposal are:

  • The directories targeted for relocation to the "sim" repository are programs/simulation, plugins/Simulation, and libraries/AMPTOOLS_*.
  • Although there are other parts of the current sim-recon that could be split off into separate repositories, we would start with just a "sim" split off.

Sean has already done a proof-of-principle split and build.

We endorsed the proposal. Details to come.

Transitioning to HDGeant4

We discussed the problem of continuing to rely on GEANT 3, with only a very few projects using Geant4. Mark asked if we should be taking active steps to encourage/support/cajole collaborators into starting to use HDGeant4. He is concerned that we are in a chicken-and-egg situation: no one will use it until it can be trusted, and no one will trust it until everyone is using it.

Amber is concerned that the effort on Geant4 may be ramping down and told us that she has advocated for more funding to go to the Geant4 team. She is worried that the main user base in high-energy physics (i.e., the LHC experiments) may not be pushing development in a direction that helps us.

Two ideas:

  1. Recruit someone to devote a large fraction of their time to tuning up Geant4 to work well for GlueX, and perhaps for the other Halls as well.
  2. Encourage/help people who have significant simulation tasks to devote part of their effort to running HDGeant4 and running the same post-detector-simulation software on the HDGeant4 generated sample.

Thomas agreed to do some simulation with HDGeant4 as part of the current bggen campaign. MCwrapper already has an option to do this.
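As a concrete illustration of that option (the key names below are assumptions from memory and should be checked against the MCwrapper documentation), switching an MCwrapper-driven bggen job from GEANT 3 to HDGeant4 would amount to a one-line change in the job's config file:

```shell
# Hypothetical fragment of an MCwrapper config file -- option names are
# illustrative only; verify them against the MCwrapper documentation.
GENERATOR=bggen        # same event generator as the GEANT 3 campaign
GEANT_VERSION=4        # select HDGeant4 instead of HDGeant (GEANT 3)
# The job is then submitted the usual way, e.g.:
#   gluex_MC.py MC.config <run_number> <num_events>
```

Because the same post-detector-simulation steps (mcsmear, reconstruction) run in either case, the GEANT 3 and HDGeant4 samples can then be compared directly.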

Request for University Compute Resources

We discussed Richard's recent email asking for collaborators to identify local resources that might be made available for use by GlueX.

Sean explained that the idea hinges on leveraging tools developed by OSG, in particular BOSCO, that allow jobs submitted to the OSG to run on a variety of batch systems including those not configured as OSG Compute Elements. This effort is enabled by our deployment of Singularity containers and our development of infrastructure to support their use. It could result in a substantial increase in the amount of computing available to us at very little cost.

Work on Geometry Classes

Sean reported that he is rewriting some of our geometry classes, adding the ability to read alignment constants (geometry tweaks) from the CCDB for subsystems that would benefit from it.

Review of Recent Pull Requests

We looked at the list without comment.

Review of Recent Discussion on the GlueX Software Help List

We pulled up the Help List.

  • Sean asked about persistent classes in JANA and refreshing calibration info on run number changes. David is working on it.
  • David noted that the launch parameter page still needs updating. Alex Austregesilo is aware of this.