
Access to ECCO

The Economics Compute Cluster (ECCO) has migrated to the BioHPC environment, and accounts are not handled on this website anymore. See the BioHPC User Guide.


Advanced qsub

In order to go beyond our convenience scripts, you may want to write your own qsub script. Keep a separate window open with this full description of qsub options. There are many other (better-funded) sites running Torque/Maui out there, so searching the web for instructions is often helpful as well. All of the scripts below can be found on Lars' GitHub Gist for easy download; if you see anything that doesn't work, or could work better, please contribute improvements there.

Basic steps

  1. Using a regular text or program editor, write a small script (e.g., my.qsub) with all the necessary information:
    the #PBS resource parameters, followed by the line where SAS (or Stata, or R) should be called as you would normally call the program when submitting a batch job (refer to your software documentation for details). For the PBS parameters, several examples of which are noted below, refer to the qsub manual. It is preferable to use full paths wherever appropriate.
  2. Open a terminal shell, if logged in through NX.
  3. Submit the job with
    qsub my.qsub
  4. One subtlety: you can also add, as a first line, a standard UNIX shell invocation to run the qsub script under the shell of your choice. The main example invokes the bash shell, the same shell you have on the head node by default; a first line of #!/bin/csh instead forces the qsub script to run under the alternate csh login shell.

    If you don't know the difference between csh and bash, though, you probably don't need to worry about this.
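For reference, a minimal version of the script described in step 1 might look like the following (all resource values and the program name are placeholders, not site defaults):

```bash
#!/bin/bash
#PBS -N myjob
#PBS -l ncpus=1
#PBS -l mem=4096mb
#PBS -l walltime=01:00:00

# Run from the directory the job was submitted from
cd $PBS_O_WORKDIR
sas my-program.sas
```

The csh variant mentioned in step 4 would differ only in its first line (#!/bin/csh instead of #!/bin/bash).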

Resource usage and its limits

The most important resource usage requests are likely to be

  • memory
  • number of CPUs
  • length of the job

Memory is requested in megabytes, and the length of the job (in "wallclock time", i.e., not just the time the CPU spends on your job) is measured in hours/minutes/seconds. For instance, a job that requests 8 CPUs (in one node) with 32GB of memory, running for 10 minutes would have

#PBS -l ncpus=8
#PBS -l mem=32768mb
#PBS -l walltime=00:10:00

Note that there are limits on these resource requests, depending on which queue you are allowed to submit to; see this page.

Application-specific notes

SAS
You can limit SAS to the amount of memory and CPU requested in the qsub parameters. The 'qsas' script does this automatically (in units of 'chunks' = 2 CPUs and 8GB). The SAS command line parameters for this are
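As a sketch, the parameters for a single chunk (2 CPUs, 8GB) would look roughly like the following; the exact values 'qsas' passes are an assumption here:

```
-memsize 8G -cpucount 2
```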


An example SAS job requesting about 32GB and 4 CPUs might be
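One possible sketch, assuming the SAS program is named my-program.sas (the walltime is a placeholder):

```bash
#!/bin/bash
#PBS -l ncpus=4
#PBS -l mem=32768mb
#PBS -l walltime=04:00:00

cd $PBS_O_WORKDIR
# Give SAS slightly less memory than requested from the scheduler
sas -memsize 30G -cpucount 4 -noterminal my-program.sas
```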

We've given SAS a bit less memory than we requested from the job scheduler to avoid problems. Note that there are many more SAS optimization options. Some that are used in 'qsas' are

-noterminal  (makes some SAS Procs run better when not running graphically)
-sumsize     (make it a bit smaller than memsize)
-sortsize    (same)

Stata
Since Stata 12, it is not possible to limit the amount of memory Stata uses. It is, however, possible to limit the number of CPUs used in Stata-MP by adding the following command to the Stata program:

set processors 1

In fact, by default, ECCO adds that command to the preamble that gets executed by every Stata job. If you have a job that needs more CPUs, you need to request them explicitly both in your qsub script and in your Stata program. It is not possible to modify the requested number of CPUs on the Stata command line. An example Stata-oriented qsub might be
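One possible sketch, assuming the do-file is named my-program.do (resource values are placeholders):

```bash
#!/bin/bash
#PBS -l ncpus=1
#PBS -l mem=8192mb
#PBS -l walltime=02:00:00

cd $PBS_O_WORKDIR
# -b runs Stata in batch (non-interactive) mode
stata-mp -b do my-program.do
```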

Note the use of the command-line parameter "-b" which instructs Stata to run in 'batch' = non-interactive mode. A log file will be generated.

Matlab
The recommended command line for Matlab is

matlab -nojvm -nodesktop -nosplash < program.m > program.out

Thus, a complete custom qsub would look like this:
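A sketch, assuming the program file is named program.m (resource values are placeholders):

```bash
#!/bin/bash
#PBS -l ncpus=1
#PBS -l mem=8192mb
#PBS -l walltime=02:00:00

cd $PBS_O_WORKDIR
module load matlab
matlab -nojvm -nodesktop -nosplash < program.m > program.out
```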

Please note that there are generally multiple Matlab versions installed. It is good reproducibility practice to request a specific version, e.g.,

module load matlab/R2014b

Users can list the available Matlab versions from the command line in an interactive shell by typing

module avail

Users need to take care with how many resources Matlab uses, in particular the amount of memory and the number of CPUs. By default, and wherever possible, Matlab will use all the processors in a system, ignoring any qsub-based restrictions (this is a "feature"). Depending on how you run Matlab, it may well use all the processors on a single node, and if that is more than the number of processors requested, the job will be killed.

Matlab states: "You can set the -singleCompThread option when starting MATLAB to limit MATLAB to a single computational thread." However, this does not guarantee that Matlab won't "leak" into a second CPU. This means that you should probably (unless running something fundamentally multithreaded) add that option to the matlab command line:

matlab -nojvm -nodesktop -nosplash -singleCompThread

Caution: Depending on the version of Matlab you are using, you may need to remove the "-nojvm" switch, as there are more and more components of Matlab that use a Java Virtual Machine (JVM). This is true in particular for Parallel Computing since R2015b.

If you intend to use the Parallel Computing package in Matlab, you might need to explicitly specify

parpool open 8 (in R2014b)
parpool(8) (since R2015b)

(the command matlabpool has been deprecated since R2014a), where 8 is the number of cores you want to use, depending on how many cores you have requested in your qsub. Note that the master Matlab job counts as a CPU, so if you requested 8 CPUs (-l ncpus=8), you should set

parpool open 7

If you intend to use Knitro in conjunction with Matlab, please see our page on that software.

R
R is single-threaded by default. However, multi-threaded builds are available as the HPC-R versions. To control how many threads are used, you need to set an environment variable prior to running R:
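Which variable applies depends on the BLAS/OpenMP library the HPC-R build links against; OMP_NUM_THREADS is a common choice and is shown here as an assumption:

```shell
# Assumed variable name: OMP_NUM_THREADS controls OpenMP-based threading.
# Set it to match the number of CPUs requested in your qsub script.
export OMP_NUM_THREADS=4
```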

Other qsub parameters of interest

The following parameters are often used and useful:

-N (jobname)  
-M (desired email address)
-m abe  sends emails on (a)bort, (b)egin, (e)nd 
-j oe   merges output and error files. 
        Results will be in file (jobname).oNNNN 
        where NNNN is the job number.

(jobname) should be an informative job name - it will show up when typing 'qstat'.

You can specify any of the #PBS-prefaced parameters on the command line, overriding what is specified in the script; if the parameters in the script are fine, you need not specify anything on the command line.
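For instance, to reuse an existing script with a longer run-time limit (script name and value hypothetical):

```bash
qsub -l walltime=04:00:00 my.qsub
```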