
University of Arizona High Performance Computing




Our Mission

UA High Performance Computing (HPC) is an interdisciplinary research center focused on facilitating research and discoveries that advance science and technology. We deploy and operate advanced computing and data resources for the research activities of students, faculty, and staff at the University of Arizona. We also provide consulting, technical documentation, and training to support our users.

This site is divided into sections that describe the available HPC resources, how to use them, and the rules governing their use.





Contents

  • User Guide: Introduces the available resources and provides information on account registration, system access, how to run jobs, and how to request help.
  • Resources: Detailed information on compute, storage, software, grant, data center, and external (XSEDE, CyVerse, etc.) resources.
  • Policies: Policies covering acceptable use, access, acknowledgements, buy-in, instruction, maintenance, and special projects.
  • Results: A list of research publications that used UArizona's HPC system resources.
  • FAQ: A collection of frequently asked questions and their answers.
  • Secure HPC


Quick Links
  • User Portal  —  Manage and create groups, request rental storage, manage delegates, delete your account, and submit special project requests.
  • Open OnDemand —  Graphical interface for accessing HPC and applications.
  • Getting Help —  Request help from our team.




Highlighted Research

Faster Speeds Need Faster Computation - Hypersonic Travel






Quick News

Do you like using Ocelote? Good news! On November 9th, the standard allocation on Ocelote was increased from 35,000 to 70,000 CPU hours.

Singularity has been renamed Apptainer as the project moves into the Linux Foundation. An alias exists so that you can continue to invoke singularity. Local builds are now possible in many cases, and remote builds with Sylabs are no longer supported.

We keep only a reasonably current version of Apptainer; prior versions are removed since only the latest one is considered secure. Apptainer is installed on all of the system's compute nodes and can be used without loading a module.
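
For example, a minimal session on a compute node might look like the following sketch. The image and definition file names are placeholders, not files we provide, and some local builds may still need extra privileges depending on the definition.

    # Apptainer is already on the PATH of every compute node; no module load is needed
    apptainer --version
    singularity --version                   # the singularity alias still works

    # Build an image locally from a definition file, then run a command inside it
    apptainer build myimage.sif myimage.def
    apptainer exec myimage.sif cat /etc/os-release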

Anaconda is very popular and is available as a module. It expands Jupyter with JupyterLab and includes RStudio and the Conda ecosystem. To access GUI interfaces available through Conda (e.g., JupyterLab), we recommend using an Open OnDemand Desktop session. See these instructions.

As a note, Anaconda likes to take over your entire environment. Review those instructions to see what problems this can cause and how to address them.
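
A rough sketch of getting started from a terminal is below. The exact module name and any extra activation steps may differ on our systems, so check "module avail anaconda" and the instructions linked above; the environment and package names are only examples.

    # Load the Anaconda module and work in an isolated Conda environment
    module load anaconda
    conda create --name my-env python=3.9
    conda activate my-env                   # may require one-time setup such as conda init
    conda install -c conda-forge jupyterlab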

Have you tried Puma yet? Our latest supercomputer is larger, faster, and has bigger teeth than Ocelote (ok, maybe not the last bit). See the Puma Quick Start.

Since its upgrade, Ocelote has the same software suite as Puma and is generally not as busy. If your work does not need Puma's capabilities, consider using Ocelote instead. This also applies to GPUs, if Ocelote's P100s will work for you.
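
As an illustration, a batch script that asks for one of Ocelote's P100 GPUs might look roughly like this; the partition, account, and module names below are placeholders, so check the Puma Quick Start and the cluster documentation for the exact options to use. Submit the script with sbatch.

    #!/bin/bash
    #SBATCH --job-name=gpu-test
    #SBATCH --partition=standard          # placeholder partition name
    #SBATCH --account=your_group          # replace with your PI's group name
    #SBATCH --nodes=1
    #SBATCH --ntasks=1
    #SBATCH --gres=gpu:1                  # request a single GPU (a P100 on Ocelote)
    #SBATCH --time=01:00:00

    module load cuda11                    # placeholder CUDA module name
    nvidia-smi                            # confirm the GPU is visible to the job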

Now that we are in the second year of use, we have determined that we can increase the standard allocation. From the end of April 2022, the standard allocation increases from 70,000 to 100,000 CPU hours.




Calendars

System Calendar

  • Maintenance downtime is scheduled from 6AM to 6PM on October 26 for ALL HPC services.
  • Maintenance downtime is scheduled from 6AM to 6PM on July 20 for ALL HPC services.
  • Maintenance downtime is scheduled from 6AM to 6PM on April 27 for ALL HPC services.
  • Maintenance downtime is scheduled from 6AM to 6PM on January 26 for ALL HPC services.
  • Maintenance downtime is scheduled from 6AM to 6PM on July 28 for ALL HPC services.
  • El Gato will be taken down for scheduled maintenance from July 12th through August 1st. Following maintenance, it will use SLURM as its scheduling software and have the same software image and modules as Ocelote and Puma.
  • Ocelote will be taken down for scheduled maintenance from June 1st through June 30th. During that time, its OS will be updated to CentOS 7 and its scheduler will be migrated to SLURM.
  • Maintenance downtime is scheduled from 6AM on January 27th through 6PM on January 28th for ALL HPC services.

Training Calendar


Introduction to HPC

Click here for more detailed information

Upcoming Workshops

Date Time Location Registration
TBD


Past Workshops

Date Time Location Registration
9:00 - 10:00am Main Library B254
9:00 - 10:00am Main Library, Data Studio CATalyst
9:00 - 10:00am Room 130A UITS Building
9:00 - 10:00am Room 130A UITS Building
9:00 - 10:00am Room 130A UITS Building

Introduction to Machine Learning

Click here for more detailed information

Upcoming Workshops

Date Time Location Registration
TBD


Past Workshops

Date Time Location Registration
9:00 - 10:00am Main Library, Data Studio CATalyst
9:00 - 10:00am Main Library, Data Studio CATalyst
9:00 - 10:00am Room 130A UITS Building
9:00 - 10:00am Room 130A UITS Building
9:00 - 10:00am Room 130A UITS Building

Introduction to Parallel Computing

Click here for more detailed information

Upcoming Workshops

Date Time Location Registration
TBD


Past Workshops

Date Time Location Registration
10:30 - 11:30am Main Library, Data Studio CATalyst
10:30 - 11:30am Main Library, Data Studio CATalyst

Introduction to Containers

Click here for more detailed information

Upcoming Workshops

Date Time Location Registration
TBD


Past Workshops

Date Time Location Registration
9:00 - 10:00am Main Library, Data Studio CATalyst
9:00 - 10:00am Main Library, Data Studio CATalyst

Data Management Workshops

Click here for more detailed information

Upcoming Workshops

Date Time Location Registration
Data Management Part 1: TBD
Data Management Part 2: TBD

Past Workshops

Date Time Location Registration
Data Management Part 1
1:00 - 2:00pm Online
Online
Data Management Part 2
2:00 - 3:00pm Online
Online
