
Central Computing Facilities

The University of Arizona (UA) has two data center facilities available to assist researchers on campus:

  1. Research Data Center (RDC): 1,200 ft² of raised-floor data center space designed for water-cooled racks and dedicated to centrally managed research computing systems
  2. Co-location Data Center: 1,900 ft² of raised-floor data center space for air-cooled, co-located research equipment

These campus data centers are managed by the UA’s central computing organization, University Information Technology Services (UITS). Other than installation costs, no bandwidth or other recurring charges are levied for co-location of research systems in these facilities.

Power and Cooling

The UITS data centers are both located in the Computer Center, which provides 1,192 kW of battery backup and a 1,750 kW generator for backup power.

The RDC is cooled by both in-rack chilled-water heat exchangers and Computer Room Air Conditioning (CRAC) units. The Co-location Data Center is cooled by chilled-water CRAC units and dual-cool CRACs. Both data centers are equipped with 18″ raised floors that provide full cooling coverage to all equipment, with leak-detection systems in the subfloor.

Fire Suppression

The fire suppression system is a multi-tiered defense with clean-agent compressed gas, dry-pipe pre-action sprinklers, and Emergency Power Off (EPO) systems zoned to deploy only in affected areas. For prevention, storage of combustible materials such as cardboard, flammable liquids, and other hazardous materials is prohibited within the data centers.

Security

UITS data centers use badge-swipe access with two-factor authentication, and video surveillance covers the data centers and the surrounding building. The data centers are monitored by a co-located 24/7 Operations team and a dedicated infrastructure team, with automated environmental and system monitoring to assist with issue triage and escalation. All personnel with swipe access to the data centers have undergone background checks and are required to be US citizens.

Network and Connectivity

In addition to direct connections to commodity Internet carriers, the UA connects to Internet2 through the Sun Corridor Network – an Arizona regional network established through a collaborative effort sponsored by the Arizona Board of Regents’ (ABOR) three state universities: Arizona State University (ASU), Northern Arizona University (NAU), and the University of Arizona (UA). The Sun Corridor Network provides advanced networking services beyond those available from the individual universities and builds an environment essential to leading-edge education, research, and the sharing of digital communications resources, network services, and applications among eligible members.

The UA manages and operates the Sun Corridor Network. The current connection from UA to Sun Corridor is dual 10G, while Sun Corridor connects to Internet2 via dual 100G links in Tucson and Phoenix. Traffic destined for Internet2 is routed automatically over that infrastructure; no action or configuration by the user is required to take advantage of Internet2 connectivity.

The UA’s Research Data Center has 40 Gb/s connections to the UA core, with all servers connected at 1 Gb/s or 10 Gb/s. In-rack switching in both data centers uses Cisco FEX switches in a top-of-rack configuration, with each server connected to two different switches for N+1 redundancy.
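The practical difference between the 1 Gb/s and 10 Gb/s server connections can be put in perspective with a quick back-of-the-envelope estimate; the 90% protocol-efficiency factor below is an assumption for illustration, not a measured figure for these networks.

```python
def transfer_time_seconds(num_bytes, link_gbps, efficiency=0.9):
    """Estimate wall-clock time to move num_bytes over a link of
    link_gbps gigabits/s, assuming the given protocol efficiency."""
    bits = num_bytes * 8
    return bits / (link_gbps * 1e9 * efficiency)

one_tb = 1e12  # a 1 TB dataset
print(round(transfer_time_seconds(one_tb, 10) / 60, 1))  # ~14.8 minutes at 10 Gb/s
print(round(transfer_time_seconds(one_tb, 1) / 60, 1))   # ~148.1 minutes at 1 Gb/s
```

In other words, a server on a 10 Gb/s connection can move a terabyte-scale dataset roughly ten times faster than one on a 1 Gb/s connection, all else being equal.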

In addition to direct connectivity to the campus network at the building level, researchers have the opportunity to use a Science DMZ for fast, high-volume data transfers to outside collaborating institutions. The Science DMZ is deployed at the University of Arizona network perimeter, outside the border firewalls, and is directly connected to Sun Corridor via a 10G link. It is secured via static access lists deployed at the Sun Corridor router, with no impact on performance. Two high-performance Data Transfer Nodes (DTNs) are deployed in the Science DMZ; DTNs are dedicated servers with hardware and operating systems optimized for high-speed transfer.
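The static-access-list approach described above can be illustrated with a brief configuration sketch in Cisco IOS syntax. The list name, addresses (drawn from the RFC 5737 documentation ranges), and port numbers are all hypothetical, not UA’s actual configuration.

```
! Hypothetical sketch of a Science DMZ ingress ACL at the upstream router.
! Addresses and ports are illustrative placeholders only.
ip access-list extended SCIDMZ-IN
 ! permit high-port data channels inbound to the two DTNs
 permit tcp any host 192.0.2.10 range 50000 51000
 permit tcp any host 192.0.2.11 range 50000 51000
 ! permit SSH only from a collaborating institution's address block
 permit tcp 198.51.100.0 0.0.0.255 host 192.0.2.10 eq 22
 ! drop and log everything else
 deny ip any any log
```

Because such stateless filters are evaluated in router hardware, they impose no throughput penalty on large flows, which is why this design avoids placing the DTNs behind a stateful border firewall.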
