GlueX Offline Meeting, May 18, 2018


GlueX Offline Software Meeting
Friday, May 18, 2018
10:00 am EDT
JLab: CEBAF Center A110
BlueJeans: 968 592 007

Agenda

  1. Announcements
    1. Software Review, Summer 2018
    2. LineShape_Library
  2. Review of minutes from the May 4 meeting (all)
  3. xrootd and cvmfs (Sean)
  4. Use cases for the GlueX singularity container, with examples
  5. Splitting up sim-recon into sim and recon (all)
  6. Review of recent pull requests (all)
  7. Review of recent discussion on the GlueX Software Help List (all)
  8. Action Item Review (all)

Communication Information

Remote Connection

Slides

Talks can be deposited in the directory /group/halld/www/halldweb/html/talks/2018 on the JLab CUE. This directory is accessible from the web at https://halldweb.jlab.org/talks/2018/ .
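
For example, a talk can be copied into that directory from any machine with access to the CUE file systems; the hostname and file name below are illustrative only:

    scp my_talk.pdf login.jlab.org:/group/halld/www/halldweb/html/talks/2018/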

Minutes

Present:

  • JLab: Alex Austregesilo, Thomas Britton, Mark Ito (chair), David Lawrence, Justin Stevens, Simon Taylor, Beni Zihlmann
  • UConn: Richard Jones
  • Yerevan: Hrach Marukyan

There is a recording of this meeting on the BlueJeans site. Use your JLab credentials to access it.

Announcements

  1. Software Review, Summer 2018: still no word on the schedule.
  2. LineShape_Library: Thomas announced that he and Simon have started work on a ROOT-based lineshape library that extends the repertoire provided natively by ROOT, including interference effects between resonances with identical final states. See his talk at Workfest 2018 for details.
  3. David mentioned that there will be an experimental readiness review for GlueX II at the end of June. As part of that review, we will likely be asked to update our estimates for computing resources.
  4. David also reported that the OSG has lost some of its funding and that no new Virtual Organizations (VOs) will be admitted. Richard told us that his understanding is that the cut-back affects Helpdesk personnel at Indiana University and that the freeze on new VOs is intended to limit growth of the Helpdesk load.
  5. David reported that he has succeeded in running jobs at NERSC using our standard container for the system code and our Oasis CVMFS share for the user code. No serious difficulties with the scheme were encountered.

Review of minutes from the May 4 meeting

We reviewed the minutes without a lot of new discussion.

XROOTD and CVMFS

Sean has suggested that, in addition to running an XROOTD server, we also run our own CVMFS server and use it instead of Oasis for OSG jobs. This would eliminate the concern about using too much space on Oasis.

We discussed the idea. It is not needed in the near term; Oasis is working well for us at present. There was some concern that the infrastructure behind the CVMFS Oasis share would be hard to match (backup, multiple redundant servers). We could start exploring the idea on the current Submit Host (scosg16.jlab.org), measure requirements, and see if dedicated hardware might be needed.
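
For orientation, the basic workflow for hosting our own repository would look roughly like the sketch below. The repository name gluex.example.org is a placeholder, key distribution and Stratum-1 replication are omitted, and none of this reflects an actual plan yet:

    # On the server (Stratum 0): create the repository once
    cvmfs_server mkfs gluex.example.org

    # Publish new or updated software inside a transaction
    cvmfs_server transaction gluex.example.org
    cp -r /path/to/built/software /cvmfs/gluex.example.org/
    cvmfs_server publish gluex.example.org

    # On a client (e.g., an OSG worker or scosg16): point CVMFS at the repository
    cat > /etc/cvmfs/config.d/gluex.example.org.conf <<EOF
    CVMFS_SERVER_URL=http://gluex.example.org/cvmfs/gluex.example.org
    CVMFS_HTTP_PROXY=DIRECT
    EOF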

Running on the OSG, with and without Singularity

Richard walked us through his new wiki page, HOWTO run jobs on the osg in the GlueX singularity container. There are other, non-Grid use cases where one would like to access the GlueX software stack without having to build it in its entirety, for example on one's local desktop for code development. Other wiki pages dealing with these use cases are in preparation.

One innovation Richard has developed is the ability to run inside a Singularity container on a host that does not have Singularity installed. It does require modifications to the script that invokes any binary executables. This significantly expands the universe of OSG compute nodes that we can use. Also, by scripting distribution of the tar-file image of our Oasis share, jobs can be run on nodes that do not support CVMFS, a further expansion of the pool of available nodes.
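
As a rough illustration of the two modes, the sketch below shows a job-script fragment. The image path, tarball names, library paths, and executable arguments are all illustrative, not the actual content of the HOWTO page; launching executables through the container's dynamic loader is also why the invoking script has to be modified:

    # Worker node with Singularity installed: run inside the container and
    # bind-mount the Oasis CVMFS share
    singularity exec -B /cvmfs \
        /cvmfs/singularity.opensciencegrid.org/some_org/gluex_image:latest \
        ./run_gluex_job.sh

    # Worker node without Singularity: unpack tar images of the container
    # and of the Oasis share, then start the executable through the
    # container's dynamic loader so its system libraries are used
    # (GlueX library directories would be appended to --library-path as well)
    mkdir -p container oasis
    tar -xzf gluex_container.tar.gz -C container
    tar -xzf oasis_share.tar.gz -C oasis
    container/lib64/ld-linux-x86-64.so.2 \
        --library-path container/lib64:container/usr/lib64 \
        oasis/gluex/bin/hd_root input.evio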

Splitting up sim-recon into sim and recon

We have discussed this idea before. Sean has done some work developing a technical approach to splitting the Git repository. The problem is that development of reconstruction and of simulation often proceeds separately; in particular, we may want to improve simulation code against a fixed version of the reconstruction, namely the version used in a large reconstruction launch. Maintaining a development branch with both simulation and reconstruction in the same repository has proved difficult.

Justin pointed out that there are some libraries which are used in both simulation and analysis, for example the GlueX AmpTools library, where the post-split location of the library is not clear. Further, there are some libraries, like HDDM, that could be split off by themselves and stand apart from both simulation and reconstruction.

Mark and Sean will put together a proposal on how this should be done and present it to the group.
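
For reference, one standard way to carve a directory out of an existing Git repository while keeping its history is git subtree split. The sketch below uses HDDM as a hypothetical example and is not necessarily the approach Sean has been developing; the directory path is illustrative:

    # In the existing sim-recon clone: create a branch holding only the
    # history of the HDDM subdirectory
    cd sim-recon
    git subtree split --prefix=src/libraries/HDDM -b hddm-only

    # Seed a new, stand-alone repository from that branch
    mkdir ../hddm && cd ../hddm
    git init
    git pull ../sim-recon hddm-only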

Review of recent pull requests

We went over the list without significant comment.

Review of recent discussion on the GlueX Software Help List

We went through the list without significant comment as well.