The Center for Research Computing (CRC) manages shared computational clusters for a wide range of research projects affiliated with Rice University and the Texas Medical Center.
Because our clusters are complex environments, we expect users to acquire a basic set of skills before getting started. We understand that computing experience varies widely among researchers, and in any cross-disciplinary field there is a need for introductory material to bring newcomers up to speed on fundamental concepts. We have gathered the following list of tutorials on a variety of computing topics that we think will be useful.
At the least, a cluster user should understand:
- Basic file and directory manipulation under Linux
- Creating and editing files under Linux
- Writing and submitting job scripts
- How to log in to the clusters
- How to copy data in and out of the clusters
- Whether your jobs are serial or parallel
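Several of the skills above come together in a batch job script. The sketch below assumes a SLURM scheduler and generic resource values; the actual scheduler, partition names, and sensible resource requests for our clusters are covered in the tutorials, so treat this only as an illustration of the shape of a job script.

```shell
# Write a minimal serial job script. The #SBATCH lines are directives read
# by the scheduler; to the shell they are just comments, so the script can
# also be run directly for a quick local check.
cat > demo_job.sh <<'EOF'
#!/bin/bash
#SBATCH --job-name=demo        # name shown in the queue
#SBATCH --ntasks=1             # a serial job needs only one task
#SBATCH --time=00:10:00        # wall-clock time limit
#SBATCH --mem=1G               # memory request

echo "Running on $(hostname)"
EOF

# On a cluster you would submit it with:  sbatch demo_job.sh
# Locally, the script simply runs as ordinary bash:
bash demo_job.sh
```

A parallel job would instead request multiple tasks (e.g. `--ntasks=16`) and launch its program through the scheduler's MPI launcher; knowing which category your job falls into determines which directives you need.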
Start here: CRC Tutorials
The CRC administers and maintains infrastructure in five core service areas: Shared Computing Clusters, the Research Data Facility, the High-Speed Data Transfer infrastructure, High-Performance Visualization, and Virtual Machines. While these services often work together, each provides unique capabilities designed for specific research tasks.
This page gives an overview of these services, with links to more detailed information on each of them. If you have any questions or find that the services below do not meet your research computing needs, contact our CRC facilitators via ticket.
High-powered computing resources for big datasets.
- Part of the CRC's mission is to serve as an on-ramp for researchers scaling their problems up to the minimum size requirements of large national supercomputing resources. One such national resource is the Extreme Science and Engineering Discovery Environment (XSEDE).
Research Data Facility (RDF)
When your project won’t fit on, or shouldn’t be on, an external hard disk.
High-speed data transfer
When your project is too big for a regular FTP transfer: specialized, NSF-funded infrastructure for moving large-scale data sets.
DTN/DTN2/DTN HA (Globus data transfer nodes): Getting Started
Science DMZ (documentation coming soon)
Help with visualization of data.
Help when your project needs virtual machines.
- Commercial options currently available through Amazon AWS and Google Cloud Platform. The CRC can help you get a quote and get started. Google currently offers $300 in credits to new users.
- Owl Research Infrastructure Open Nebula (ORION) VM Pool on Rice Campus. Getting started
- Decommissioned Resources
Shared Pool of Integrated Computing Environments (SPICE) Decommissioned: March 27th, 2018