SCons Build System



10/30/2013 David Lawrence

The first general build system used by Hall-D was called the Build Management System (BMS) and was based on the standard "make" utility. It was chosen over a few competitors (including the then-new SCons tool) for a couple of reasons. Despite the lack of creativity in the name, the BMS system was used successfully for a number of years to build the sim-recon software for the GlueX experiment in Hall-D. As the amount of source code in sim-recon grew, so did the compile time. By 2013, it took 15-20 minutes to build sim-recon from a fresh checkout of the code. This was not a problem for everyone, since the most common use-case was to build sim-recon once and then work on only a small part of the software, rebuilding that as needed. However, whenever the sim-recon directory was updated from the repository after a number of changes had been made, it was good practice to do a "make clean" and rebuild everything. In addition, certain power users would routinely need to check out the source and build from scratch on different platforms or to test different revisions.

The solution to the slow-build issue is to do parallel builds, taking advantage of multiple cores on a single computer by compiling, linking, and installing several things at the same time. The difficulty is that some files must be built before others. For example, a library must be built before it can be linked into a program. Good old "make" can handle this, but only if the Makefiles are written in such a way that it knows the entire dependency tree and can work out the sequencing. Alas, the BMS system was written to work recursively: the Makefile in the libraries directory would simply run another instance of make in its sub-directories. Thus, when the parallel feature of make was turned on, BMS would choke at some point and fail to complete the build. Two options were available (well, OK, probably a lot more than 2, but these two were the only ones that made sense for our narrow section of the code-niverse). The first was to re-write the Makefiles to use includes, allowing make to form the entire dependency tree and thus efficiently use parallel builds. The second was to do something similar with SCons. Both would really need to be written from scratch, so the amount of work was assumed to be more or less equivalent. An SCons-based system had a few things that were appealing though:

  1. The CLAS12 group in Hall-B had been using scons successfully for a few years
  2. SCons is based on Python which continues to become more popular
    • (the online group in Hall-D had plans to use it as the default scripting language for online systems)
  3. I would have had to spend the next 10 years trying to defend "make" to those in the "Nuevo esta Grande" Gang

So, an effort to create a build system based on SCons was started. The name SBMS just stands for "SCons Build Management System" which pays homage to the old BMS color-challenged name.


SBMS implements the following features/conventions:

  1. Supports parallel builds! Just invoke with the "-j" argument. e.g. to run with 16 threads do "scons -j16" (or "scons -u -j16" if not in src)
  2. Centralized configuration. Most SConscript files are very simple and often duplicate others. This allows almost all of the details on how to build to be kept in a central location so that they apply to the entire source tree. For example, the list of DANA libraries exists in only one place so adding a new library will not require one to edit 50 SConscript files!
  3. In most cases, all source files in a given directory are compiled. For programs and plugins, the compiled objects are all linked into the final executable. For libraries, all objects are added to the library. (This is to try and force better housekeeping.)
  4. Final library file names are based on the directory name. For example, a library whose source is contained in a directory named "TOF" will have all of the source in that directory compiled and placed in a library named "libTOF.a". This is to make it easier for someone trying to find the code for a library, or the library that is made from some code.
  5. Final program file names are based either on the source file name defining "main()" or on the directory name holding the code (sbms.executables vs. sbms.executable). For example, the hdgeant program will have its "main()" routine defined in a file named after the program (possibly hdgeant.F or hdgeant.c, ... you get the idea)
    • Multiple programs in a single source directory are supported, but all source files that do not define "main()" will have their object files linked into all programs made from source in that directory.
  6. Final plugin names are based on the directory name just as with libraries.
  7. Cleaning via "scons -c" will only clean the current directory and its direct descendants. Note that this contradicts the default SCons behavior which is to clean all dependencies even ones in sibling and cousin directories (e.g. when doing a clean in a program directory that depends on libraries, the library directories would also be cleaned if the default SCons behavior were used). The entire source tree is cleaned if one invokes scons -c from the top-level src directory.
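
To make conventions 4-6 concrete, the directory-to-target naming rule can be sketched in a few lines of plain Python. The helper below is purely illustrative and is not part of SBMS (the ".so" plugin suffix is an assumption about the platform):

```python
import os

def target_name(source_dir, kind):
    """Illustrative sketch of the SBMS naming conventions (not SBMS code):
    libraries and plugins are named after their source directory, while a
    program built with sbms.executable is named after its directory."""
    base = os.path.basename(os.path.normpath(source_dir))
    if kind == "library":
        return "lib%s.a" % base   # e.g. "TOF" -> "libTOF.a"
    if kind == "plugin":
        return "%s.so" % base     # plugins also follow the directory name
    return base                   # program named after its directory

print(target_name("src/libraries/TOF", "library"))   # -> libTOF.a
```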


Simple build with one thread

If running from the top-most directory (this should be named src and should contain the SConstruct file)

scons install

If running in a subdirectory, you need to add the "-u" option to tell scons to search parent directories until it finds the SConstruct file.

scons -u install

Simple build with multiple threads

If running from the top-most directory with 4 threads (this should be named src and should contain the SConstruct file)

scons -j4 install

If running in a subdirectory with 4 threads, you need to add the "-u" option to tell scons to search parent directories until it finds the SConstruct file.

scons -u -j4 install

You can (and probably should) specify the number of threads so that it equals the number of cores on the computer you are using. Note that I did see issues on my MacBook Pro where compiling with 4 threads eventually used up all of the RAM and caused it to go into swap mode, which slowed the whole machine to a crawl. You should keep an eye on the RAM usage to know how many threads it's safe to use.


Cleaning

Run this from any directory, but only files associated with that directory or its direct descendants will be cleaned.

scons install -c

You can also use the "-j" option here to do this multi-threaded, but I don't think any speedup will be very noticeable.

Show the build commands

By default, SBMS hides the sometimes quite verbose compile and link commands. To show these commands as they are executed, set the SHOWBUILD variable to "1" (note that this convention was adopted from the gemc build system used by CLAS12).

scons -u -j4 install SHOWBUILD=1

The above example would be used for running in a subdirectory and with 4 threads. SCons is smart enough not to intermix commands from multiple threads when printing to the screen so this is safe.
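
For the curious, this kind of switch is typically implemented in SCons by replacing the *COMSTR construction variables, which control what is echoed in place of each command. The fragment below is a sketch of the general technique, not the literal SBMS code:

```python
# Sketch: quiet build output unless SHOWBUILD=1 is given on the command line.
# ARGUMENTS holds the VAR=value pairs passed to scons.
if ARGUMENTS.get('SHOWBUILD', '0') == '0':
    env.Replace(CCCOMSTR   = 'Compiling  [$SOURCE]',
                CXXCOMSTR  = 'Compiling  [$SOURCE]',
                LINKCOMSTR = 'Linking    [$TARGET]',
                ARCOMSTR   = 'Archiving  [$TARGET]')
```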

Install Location

SBMS uses the Variant Directory feature to place built binary files outside of the source tree. This helps keep the source tree less cluttered. The variant directory is really a mirror directory structure of the source directory tree and should be considered a temporary staging area for binaries, including the object files. The installation directory is different and contains only the final libraries+headers, programs, and plugins. The topmost directory of the variant directory is a hidden directory in src whose name is simply "src/.$BMS_OSNAME" where the BMS_OSNAME environment variable is used to provide a platform dependent directory name. The installation directory is parallel to the src directory and does not contain the "." in the name so that it is not hidden.

variant directory:  src/.${BMS_OSNAME}     (aka  sim-recon/src/.${BMS_OSNAME} )

install directory:  src/../${BMS_OSNAME}   (aka  sim-recon/${BMS_OSNAME} )

You'll notice that in all of the commands given above, the "install" target is specified explicitly. Without it, the binaries will still be built, but only in the variant dir. This is a useful feature if you are modifying the source in a distribution and want to make sure it builds cleanly before the binaries are deployed to the install directory. This follows the standard form used when distributing software for unix systems, which typically uses separate "make" and "make install" phases.
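
In SCons terms, this layout comes from the variant_dir argument to SConscript(). A minimal sketch of the idea, assuming BMS_OSNAME is set in the shell environment (the real SBMS SConstruct does considerably more than this):

```python
import os

# Minimal sketch of a variant-directory setup (not the literal SBMS code):
osname = os.environ['BMS_OSNAME']    # platform-dependent name
SConscript('SConscript',
           variant_dir='.' + osname, # hidden staging area: src/.$BMS_OSNAME
           duplicate=0)              # compile from the original sources in place
```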

Inside the install directory a standard directory structure exists:

├── Linux_CentOS6-x86_64-clang3.2
│   ├── bin
│   ├── include
│   ├── plugins
│   └── lib

Adding to the sim-recon source tree

If you wish to add a library, program, or plugin to the sim-recon source tree, the easiest thing to do is to find an existing item and duplicate it. Create a directory at the appropriate place for your source; you can probably copy the SConscript file from a sibling directory without modification. You will, though, need to modify the SConscript file in the parent directory to include your new directory in the list of places it reads SConscript files from.
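
For reference, a leaf SConscript copied from a sibling directory is typically just a few lines of boilerplate. The sketch below shows the general shape for a library directory; the final call varies by target type, and sbms.library is assumed here by analogy with the sbms.executable/sbms.executables functions mentioned above, so check a real sibling file for the exact name:

```python
# Typical shape of a leaf SConscript (sketch -- copy a real sibling file):
import sbms

# Boilerplate shared by essentially all sim-recon SConscript files:
Import('*')          # pull in the environment exported by the parent
env = env.Clone()    # clone so local changes don't leak to sibling builds

sbms.library(env)    # compile everything here into lib<dirname>.a
```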

There are two options when specifying your new directory in its parent's SConscript file. The first will automatically build your package whenever the rest of the source tree is built. The second will only build it if scons is invoked while in your directory or, alternatively, in its parent's, but with your directory explicitly listed on the command line. Note that if you go for the first option (and assuming you committed this to the subversion repository), then you are declaring this directory a required part of the sim-recon build. This will then be required for all sim-recon users so be prepared to maintain this package and respond to issues that may arise on various platforms people build and run sim-recon on.

To add the package as a required part of sim-recon, add the directory to the subdirs list:

subdirs = ['acceptance_hists', 'b1pi_hists', 'MyNewPackage']
SConscript(dirs=subdirs, exports='env osname', duplicate=0)

To add the package as an optional one that doesn't get built unless the user explicitly requests it:

sbms.OptionallyBuild(env, ['phys_tree', 'mc_tree', 'MyNewPackage'])

Extending SBMS

If you need to add support for a new package to SBMS, there are two ways to do it. The first is to modify the existing file in the "src/SBMS" directory to add support for the desired package (there are multiple examples already in there). The second is to put this support in a new file in the "src/SBMS" directory. This directory is added to the PYTHONPATH automatically in the top-level SConstruct file, so anything you put there can easily be brought into an SConscript file using the standard Python "import" syntax.
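
As a sketch of the second approach, a new file in "src/SBMS" usually just exposes a function that appends the package's flags to a construction environment. Everything below (the file name sbms_mypackage.py, MYPACKAGE_HOME, and AddMyPackage) is invented for illustration; real examples live in src/SBMS:

```python
# Hypothetical src/SBMS/sbms_mypackage.py -- all names here are invented
# for illustration; see the existing files in src/SBMS for real examples.
import os

def mypackage_flags():
    """Compute the build flags for the (fictional) MyPackage package."""
    home = os.environ.get('MYPACKAGE_HOME', '/usr/local/mypackage')
    return {'CPPPATH': [os.path.join(home, 'include')],
            'LIBPATH': [os.path.join(home, 'lib')],
            'LIBS':    ['MyPackage']}

def AddMyPackage(env):
    """Add MyPackage's include/library paths to an SCons environment."""
    env.AppendUnique(**mypackage_flags())
```

An SConscript could then do "import sbms_mypackage" followed by "sbms_mypackage.AddMyPackage(env)", mirroring how the built-in sbms module is used.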