## Table of Contents
- Table of Contents
- Introduction
- Definitions
- Data Location
- Toolbox Licenses
- Running MATLAB with the GUI for Code Development
- Submitting Interactive Jobs
- Submitting Jobs with PBS *qsub*
- Submitting Task Parallel and Data Parallel Jobs with PCT
- Running a Job on a GPGPU
- Running Locally on a Desktop
- PCT Resources and Demos

## Introduction

There are several ways to submit MATLAB jobs to a cluster. This document covers the ways to run MATLAB compute jobs on the Shared Research Compute clusters, including using the Parallel Computing Toolbox (PCT) and the MATLAB Distributed Compute Server (MDCS) to submit many independent tasks as well as a single task that has parallel components. Examples are included.

## Definitions

**Task Parallel Application** - The same application that runs independently on several nodes, possibly with different input parameters. There is no communication, shared data, or synchronization points between the nodes.

**Data Parallel Application** - The same application that runs on several labs simultaneously, with communication, shared data, or synchronization points between the labs.

**Lab** - A MATLAB worker in a multicore (Data Parallel) job. One lab is assigned to one worker (core). Thus, a job with eight labs has eight processor cores allocated to it and will have eight workers each working together as peers.

**MDCS** - MATLAB Distributed Compute Server. This is a component of MATLAB that allows our clusters to run MATLAB jobs that exceed the size of a single compute node (multinode parallel jobs). It also allows jobs to run even if there are not enough toolbox licenses available for a particular toolbox, so long as the university owns at least one license for the particular toolbox.

**PCT** - Parallel Computing Toolbox.

**MATLAB Task** - One segment of a job to be evaluated by a worker.

**MATLAB Job** - The complete large-scale operation to perform in MATLAB, composed of a set of tasks.

**MATLAB Worker** - The MATLAB session that performs the task computations. If a job needs eight processor cores, then it must have eight workers.

**Job** - Job submitted via the PBS job scheduler (also called PBS Job).

## Data Location

This document assumes that all of your input data files, output files, and MATLAB code files are stored on the cluster's filesystem, not on your desktop. When you launch the MATLAB GUI and look at your home folder, you will be looking at your home folder on the cluster, not on your desktop. If your data or code files are stored on your desktop, you will need to transfer them to the cluster first.

## Toolbox Licenses

The version of MATLAB installed on our clusters shares a site license with the rest of the campus. To see which Toolboxes our campus is licensed to use, launch MATLAB interactively on your desktop or on one of our clusters and run the *ver* command.

## Running MATLAB with the GUI for Code Development

If you need to run MATLAB on one of the clusters to develop your code and run short (30 minutes or less) tests of the code, please follow these instructions.

**1. Log in to the Cluster**

Log in to the cluster using our published instructions.

**2. Load the MATLAB Environment**

Load the MATLAB module with the following command:
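The module name below is typical, but module trees vary by cluster; run `module avail matlab` if you are unsure which versions are installed:

```shell
module load matlab
```
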

**3. Run MATLAB**

Run MATLAB with or without the GUI as follows:
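Assuming a standard MATLAB installation, the two invocations look like this (the GUI form requires X11 forwarding to your desktop):

```shell
matlab              # with the GUI
matlab -nodisplay   # without the GUI
```
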

These commands start MATLAB with and without the GUI, respectively.

At this point MATLAB will be running interactively on one of the login nodes, not one of the compute nodes. When you get the *matlab* prompt, start writing your code as you normally would. This method of running MATLAB is intended for code development and for executing short test runs of your code (30 minutes or less).

## Submitting Interactive Jobs

If you need to run a MATLAB **compute** job interactively, please follow these instructions.

**1. Log in to the Cluster**

Log in to the cluster using our published instructions.

**2. Load the MATLAB Environment**

Load the MATLAB module with the following command:

**3. Submit an Interactive PBS Job**

Submit an interactive PBS job as follows:
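A minimal interactive request might look like the following; the resource list (one node, one core, one hour) is only an example and should be adjusted to your needs:

```shell
qsub -I -l nodes=1:ppn=1,walltime=01:00:00
```
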

When this job starts executing you will receive a prompt on a compute node. The output will look something like the following:

You will notice that the command prompt has changed from a login node designation to a compute node designation.

**4. Launch MATLAB**

After you have obtained a command line prompt on a compute node, launch MATLAB with or without the GUI. An example without the GUI is shown below.
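For example (the MATLAB module may need to be loaded again in the new shell on the compute node):

```shell
module load matlab
matlab -nodisplay
```
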

From the prompt you can run any MATLAB command that you wish.

## Submitting Jobs with PBS *qsub*

The recommended method of submitting Task Parallel jobs is to use PCT as described below. However, if you need to submit one or more **single core jobs**, and are not encountering problems with the lack of specific toolbox licenses, then you might wish to submit your jobs directly with PBS as described in the following steps.

**1. Build a MATLAB .m code file**

Include all of the MATLAB commands that you need to execute in a MATLAB .m file, such as *sample.m*, and place it somewhere in your home directory or a subdirectory. The creation and contents of a .m file are beyond the scope of this document; consult the MATLAB documentation for information on .m files.
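As a minimal illustration (not a required format), a *sample.m* might look like:

```matlab
% sample.m - minimal illustrative script
A = magic(4);
disp(sum(A(:)))
exit   % quit MATLAB so the batch job terminates
```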

**2. Create a PBS batch script**

You will need to execute MATLAB from within a PBS batch script, such as *sample.pbs* as follows. In this example, this file is saved in the same directory as *sample.m*.

**sample.pbs**
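The original script is not reproduced here; a sketch of such a script is shown below, where the job name and resource requests are assumptions to adapt to your job:

```shell
#!/bin/bash
#PBS -N sample
#PBS -l nodes=1:ppn=1,walltime=00:30:00
#PBS -j oe

cd $PBS_O_WORKDIR            # run from the directory the job was submitted from
matlab -nodisplay -r sample  # note: no trailing .m on the script name
```
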

In this example the *sample.pbs* script calls *matlab* with the -r option followed by the MATLAB script name that was created in step #1. Leave off the trailing .m from the script name when calling MATLAB this way.

For more information about PBS job scripts, please see our FAQ.

**3. Submit the Job**

After you have created *sample.m* and *sample.pbs*, go to the directory where *sample.pbs* resides, load the MATLAB module (if you have not already done so) and submit the job to the job scheduler:
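From the directory containing *sample.pbs*:

```shell
module load matlab
qsub sample.pbs
```
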

You should now be able to see your job in the queue by going to your Linux terminal window and using the *showq* command.

### Submitting multiple MATLAB jobs via PBS

In order to submit multiple MATLAB batch jobs, simply repeat this section for each job.

## Submitting Task Parallel and Data Parallel Jobs with PCT

The Parallel Computing Toolbox (PCT) provides an API that allows you to submit a job that has multiple independent tasks (Task Parallel) or a job that has a single multiprocessor, and possibly multinode, task (Data Parallel). In order to run this type of job you must first configure MATLAB for this type of job submission by following these steps:

### Configuring MATLAB

1. Enable passwordless ssh for your cluster account.

After you have logged into the cluster, follow these instructions to enable passwordless ssh.

2. In your home directory create the MdcsDataLocation subdirectory.

3. Load the MATLAB 2011a environment:

4. Run MATLAB on the login node:
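Steps 2 through 4 can be performed from a login-node shell; the module name is an assumption and may differ on your cluster:

```shell
mkdir -p ~/MdcsDataLocation   # step 2: create the job-staging directory
module load matlab/2011a      # step 3: load the MATLAB 2011a environment
matlab                        # step 4: start MATLAB on the login node
```
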

5. In MATLAB, add the */opt/apps/matlab/2011a/local* folder to your MATLAB path so that MATLAB will be able to find the scripts necessary to submit and schedule jobs.

- Click **File > Set Path**
- Click **Add Folder**
- Specify the following folder: */opt/apps/matlab/2011a/local*
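Equivalently, the folder can be added from the MATLAB command line:

```matlab
addpath('/opt/apps/matlab/2011a/local')
savepath   % persist the change for future MATLAB sessions
```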

6. Import the cluster configuration for the cluster you are running on:

- On the MATLAB Desktop menu bar, click **Parallel > Manage Configurations**.
- Click **File > Import**.
- In the Import Configuration dialog box, browse to find the MAT-file for the configuration that you want to import. Navigate to */opt/apps/matlab/2011a/local* and select the configuration for the system you are using, such as *sugar.mat*, *davinci.mat*, *stic.mat*, and so forth. Select the file and click **Import**.
- Select the configuration you have imported and click **Start Validation**. All four stages should pass: Find Resources, Distributed Job, Parallel Job, Matlabpool.
- Select this configuration to be the default configuration.
- Close the Configuration Manager window.

If all validation stages succeed, then you are ready to submit jobs to MDCS.

### Submitting Task Parallel Jobs

The following is an example of a Task Parallel job. The task-parallel example code, *frontDemo*, calculates the risk and return based on historical data from a collection of stock prices. The core of the code, *calcFrontier*, minimizes the equations for a set of returns. In order to parallelize the code, the *for* loop is converted into a *parfor* loop, with each iteration of the loop becoming its own independent task.
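The conversion itself is mechanical. Using a hypothetical helper *calcOneReturn* in place of the real *calcFrontier* internals, it looks like:

```matlab
n = 100;
risk = zeros(1, n);

% Serial version: each iteration is independent of the others
for i = 1:n
    risk(i) = calcOneReturn(i);   % hypothetical per-iteration computation
end

% Parallel version: each iteration becomes its own independent task
parfor i = 1:n
    risk(i) = calcOneReturn(i);
end
```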

To submit the job, copy *submitParJobToCluster.m* into your working directory, make the necessary modifications for your job environment, and then run the code from within MATLAB. This will submit the job. An explanation of the code follows:

**submitParJobToCluster.m**

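The original listing is not reproduced here; a hedged sketch of such a submission script, using the R2011a-era *batch* API with the imported default configuration, might look like:

```matlab
% Sketch only: file names and worker count are assumptions
job = batch('frontDemo', 'matlabpool', 7, ...   % 7 pool workers + 1 client = 8 cores
            'FileDependencies', {'frontDemo.m'});
job.wait        % optional: block until the job completes
load(job)       % load the job's workspace variables into the client
job.destroy     % remove temporary data under ~/MdcsDataLocation
```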

When you run *submitParJobToCluster* within MATLAB, the *frontDemo* code will be submitted to the PBS job scheduler. Use the *showq* command from a cluster terminal window to look for your job in the job queue.

### Submitting Data Parallel Jobs

The data-parallel example code approximates pi by calculating the area under a curve. The non-parallel version, *calcPiSerial*, computes it with a *for* loop over discrete points. The parallel version, *calcPiSpmd*, uses the *spmd* construct to evaluate a part of the curve on each MATLAB instance. Each MATLAB instance uses its *labindex* (i.e., rank) to determine which portion of the curve to calculate. The partial results are then globally summed together and broadcast back out. The code uses higher-level routines rather than lower-level MPI calls. Once the sum has been calculated, it is indexed into and communicated back to the local client MATLAB to produce the total area.
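An illustrative version of this approach (not the actual *calcPiSpmd*), run locally for clarity, integrates 4/(1+x^2) over [0,1] with each lab taking one slice:

```matlab
matlabpool open 4            % labs for the spmd block (example count)
spmd
    % each lab integrates its own slice of [0,1]
    a = (labindex - 1) / numlabs;
    b = labindex / numlabs;
    localArea = quadl(@(x) 4 ./ (1 + x.^2), a, b);
    totalArea = gplus(localArea);   % global sum, broadcast to every lab
end
piApprox = totalArea{1};     % index into the Composite from the client
matlabpool close
```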

To submit the job, copy *submitSpmdJobToCluster.m* into your working directory, make the necessary modifications for your job environment, and then run the code from within MATLAB. This will submit the job. An explanation of the code follows:

**submitSpmdJobToCluster.m**

When you run *submitSpmdJobToCluster* within MATLAB, the *calcPiSpmd* code will be submitted to the PBS job scheduler. Use the *showq* command from a cluster terminal window to look for your job in the job queue.


### Job Dependencies

In order to run code on the cluster, a job may depend on several MATLAB or data files. The *batch()* function accepts two relevant parameters: *PathDependencies* and *FileDependencies*. Either can be assigned a cell array of file names and/or folder names. If the MATLAB client shares a file system with the compute nodes, then typically the user will specify the dependencies on the local path (i.e., *PathDependencies*). For example:
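A sketch of such a call (the script and folder names are hypothetical):

```matlab
job = batch('myScript', 'PathDependencies', ...
            {'/users/you/mcode', '/users/you/mdata'});
```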

If the MATLAB client does not share a file system with the compute nodes, then the user will specify the dependencies as files (i.e., *FileDependencies*), which in turn will be zipped up as part of the job. For example:
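A sketch of such a call (the batched function itself is included automatically):

```matlab
job = batch('myrand', 'FileDependencies', {'random.m'});
```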

In this example, the files myrand.m and random.m will be zipped up (the caller function, *myrand*, is assumed to be needed and is always included in the ZIP file).

### Configuring Cluster Parameters with *ClusterInfo*

As previously mentioned, configurations describe the scheme of the cluster. However, some properties may need to be set often, or at runtime. *ClusterInfo* provides a mechanism for the user to set these additional properties. For example, the user may want to specify that a job should run no longer than 30 minutes, or set the email address to be notified when a job is running. The entire collection of properties can be displayed with the *state* method. The values persist between jobs as well as between MATLAB sessions, until cleared. They can be cleared by setting a single property to empty, or by clearing the entire set.
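Assuming the *ClusterInfo* helper class shipped with the MathWorks scheduler integration scripts, these operations look like:

```matlab
ClusterInfo.setWallTime('00:30:00')               % limit the job to 30 minutes
ClusterInfo.setEmailAddress('user@example.edu')   % hypothetical address
ClusterInfo.state()                               % display all set properties
ClusterInfo.setWallTime([])                       % clear a single property
ClusterInfo.clear()                               % clear the entire set
```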

### Destroying a Job

When you submit a job with *batch*, you will notice that each submission is labeled *Job1*, *Job2*, and so forth. Temporary directories associated with each job can be found in ~/MdcsDataLocation while the jobs are running. When *job.destroy* is called, these temporary directories are deleted. The above examples call *job.destroy*. If you close your MATLAB session before executing *job.destroy*, which is likely unless you are using the full example with *job.wait*, you will need to manually clean up the temporary directories in ~/MdcsDataLocation.

## Running a Job on a GPGPU

MATLAB 2011a and higher versions are capable of running jobs on GPGPUs as well as on CPUs. See our FAQ for details.

## Running Locally on a Desktop

In order to run MATLAB code with a parallel component locally on your desktop, you must first start a MATLAB pool, as follows:
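With the R2011a-era syntax, a local pool is opened as (the worker count is an example):

```matlab
matlabpool open local 8
```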

where 8 is the number of MATLAB processes to attach to the job. At this point you will have access to eight MATLAB workers for use with parallel code, such as code with *parfor* loops, and so on.

After running the code, close the MATLAB Pool:
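Again with the R2011a-era syntax:

```matlab
matlabpool close
```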

## PCT Resources and Demos

Here are a few resources for getting started with the Parallel Computing Toolbox:

- Online demos: http://www.mathworks.com/products/parallel-computing/demos.html?show=demo
- Recorded webinars: http://www.mathworks.com/products/parallel-computing/demos.html?show=recorded
- Parallel Computing Toolbox documentation: http://www.mathworks.com/access/helpdesk/help/toolbox/distcomp
- Self-paced tutorial: http://www.mathworks.com/matlabcentral/fileexchange/22809
- Blog: http://blogs.mathworks.com/loren/2009/10/02/using-parfor-loops-getting-up-and-running