HOWTO use the GlueX Singularity Container
Install Singularity
See the instructions on the Apptainer/Singularity site.
Alternatively, Red Hat Enterprise Linux 7 has an RPM:
yum install singularity
Or on Ubuntu 16.04 and earlier:
Go here and follow the instructions.
Or on Ubuntu 16.10 and later:
sudo apt-get install singularity-container
The last command does not work on Pop!_OS, which is based on Ubuntu 22.04. Here is what worked for me [J.R.]: there seems to be an issue with at least some Ubuntu versions. If Singularity is not yet installed,
singularity --version
will respond that it can be installed with 'sudo apt install singularity'. DON'T DO THAT! It will install a game called 'Endgame: Singularity'. Instead, do the following:
sudo apt update
wget https://github.com/sylabs/singularity/releases/download/v3.11.0/singularity-ce_3.11.0-jammy_amd64.deb
sudo apt install ./singularity-ce_3.11.0-jammy_amd64.deb
I pieced this together from the Sylabs GitHub page and the instructions here.
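Afterwards you can confirm that the real Singularity (and not the game) was installed; the exact version string depends on the release you downloaded:
singularity --version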
Get the Container
Download gluex_centos-7.9.2009_gxi2.35.sif, the default container, which is based on CentOS 7.
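For example, from the command line (wget is just one option; any download tool works):
wget https://halldweb.jlab.org/dist/gluex_centos-7.9.2009_gxi2.35.sif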
At JLab there is no need to download the file; it is already on the group disk:
/group/halld/www/halldweb/html/dist/gluex_centos-7.9.2009_gxi2.35.sif
The container can also be accessed via CVMFS (see HOWTO Install and Use the CVMFS Client):
/cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_devel:latest
JLab currently also hosts a container built on AlmaLinux 9: gluex_almalinux9_gxi2.37.sif. On the group disk it is at:
/group/halld/www/halldweb/html/dist/gluex_almalinux9_gxi2.37.sif
This container is also available on CVMFS:
/cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_almalinux_9:latest
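As a sketch, the shell command described below can be pointed directly at this CVMFS image; the bind path assumes you have the "group" tree available as described in the next section:
singularity shell --cleanenv --bind <directory that contains "group">/group/halld:/group/halld /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_almalinux_9:latest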
Get the Software and Support Files
Use one of the four methods below.
1. tarball
This method is not supported at present (as of December 13, 2018). If you would like to see it revived, contact the Software Working Group.
- Download the tarball: group_halld.tar.gz. It's 18 GB.
- cd <directory that will contain "group">
- tar zxvf <directory containing tarball>/group_halld.tar.gz
2. rsync with direct ssh
rsync -ruvt --delete --links scosg16.jlab.org:/cvmfs/oasis.opensciencegrid.org/gluex/group/ <directory that contains "group">/group/
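For example, if you want the tree under a hypothetical $HOME/gluex/group on your local machine (substitute your own target directory):
mkdir -p $HOME/gluex/group
rsync -ruvt --delete --links scosg16.jlab.org:/cvmfs/oasis.opensciencegrid.org/gluex/group/ $HOME/gluex/group/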
3. rsync through ssh tunnel
- Establish the tunnel
ssh -t -L9001:localhost:9001 login.jlab.org ssh -t -L9001:localhost:22 scosg16
- In a separate shell instance, do the rsync
rsync -ruvt --delete --links -e 'ssh -p9001' localhost:/cvmfs/oasis.opensciencegrid.org/gluex/group/ <directory that contains "group">/group/
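Keep the tunnel from the first step open while the rsync runs. With the same hypothetical $HOME/gluex/group target as above, the second command becomes:
rsync -ruvt --delete --links -e 'ssh -p9001' localhost:/cvmfs/oasis.opensciencegrid.org/gluex/group/ $HOME/gluex/group/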
4. Run CVMFS
There are two options here:
- Run CVMFS in user space (cvmfsexec): see HOWTO use prebuilt GlueX software from any linux user account using cvmfsexec
- Run CVMFS as root: see HOWTO Install and Use the CVMFS Client
Depending on choices made during installation, the directory that contains "group" will be something like
/path/to/cvmfs/oasis.opensciencegrid.org/gluex
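For instance, with CVMFS mounted at the usual /cvmfs mount point, the bind option used in the next section would read (adjust the prefix to your own mount point):
--bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld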
Get a Shell Inside the Container
singularity shell --cleanenv --bind <directory that contains "group">/group/halld:/group/halld <directory with container>/gluex_centos-7.9.2009_gxi2.35.sif
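As a concrete example at JLab, using the container on the group disk and the CVMFS copy of the "group" tree (both paths are given above), one possible invocation is:
singularity shell --cleanenv --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld /group/halld/www/halldweb/html/dist/gluex_centos-7.9.2009_gxi2.35.sif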
Set Up the Developer Toolset Package (optional)
The standard GCC version is 4.8.5. Perform the following step if you would like to use the GCC 8.3.1 compiler.
scl enable devtoolset-8 bash
or
scl enable devtoolset-8 tcsh
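A quick check that the toolset is active (the reported version should be 8.3.1 rather than 4.8.5):
gcc --version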
Set Up the GlueX Environment
For bash:
source /group/halld/Software/build_scripts/gluex_env_jlab.sh
or for tcsh:
source /group/halld/Software/build_scripts/gluex_env_jlab.csh
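As a minimal sanity check after sourcing the script, list the GlueX-related variables that were set; the exact names depend on the build_scripts version:
env | grep -i halld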
gxshell
We provide a tool, gxshell, to make it easier to set up and start a shell session inside the container. It requires Singularity and CVMFS to be installed; for instructions, consult HOWTO Install and Use the CVMFS Client.
Once the software is installed, create an alias like
alias gxshell singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/home /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_devel:latest gxshell
where you bind all directories that you want to take into the container. Binding /cvmfs/oasis.opensciencegrid.org/gluex/group/halld to /group/halld ensures that the prebuilt software is available inside the container. Important: if you don't list a directory after the "--bind" option, you might not find it within your container. Please customise this line as appropriate.
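The alias above is written in tcsh syntax. A bash equivalent of the same command, with the same assumptions about which directories you want bound, would be:
alias gxshell='singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/home /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_devel:latest gxshell'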
If everything is set up correctly, all you have to do is type
gxshell
to start a bash shell within the GlueX container with the software environment set up.