The Pinnacle I and II clusters, from 2019 and 2022 respectively, are the newest resources at AHPCC. Pinnacle II consists of 71 AMD-based compute nodes with a total of 74 NVIDIA Ampere-class GPUs. Pinnacle I consists of 106 mostly Intel-based compute nodes with a total of 26 NVIDIA Volta-class GPUs. Total floating-point capacity is more than one petaFLOPS (one quadrillion 64-bit floating-point operations per second), mostly contributed by the GPUs. A significant fraction of the compute nodes were funded by research projects and are reserved for those projects' priority use; these include specialty nodes such as 4-terabyte large-memory nodes and quad-GPU nodes.
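As a rough sanity check on the petaFLOPS figure, here is a back-of-envelope sketch assuming the Ampere GPUs are A100s (19.5 TFLOPS peak FP64 using tensor cores) and the Volta GPUs are V100s (7.8 TFLOPS peak FP64); the specific GPU models are an assumption, not stated above:

\[
74 \times 19.5\ \text{TFLOPS} \;+\; 26 \times 7.8\ \text{TFLOPS} \;\approx\; 1443 + 203\ \text{TFLOPS} \;\approx\; 1.6\ \text{PFLOPS},
\]

which is consistent with a combined capacity above one petaFLOPS even before counting any CPU contribution.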
These systems are interconnected by a 100 Gb/s InfiniBand fabric and are also connected to about 2.5 petabytes of high-speed parallel storage and about 4 petabytes of nearline storage.
About 250 older machines are connected as the “Trestles” cluster and are used for less demanding tasks.
A “Science DMZ” connects these systems at 100 Gb/s to the statewide ARE-ON research network, the national Internet2 research network, and the UAMS HPC “Grace” system. Also on the Science DMZ are elements of regional and national grids such as PRP Nautilus, the Great Plains Network, and the Open Science Grid. A project is underway to make AHPCC and UAMS HPC function as a single system available to all Arkansas research users.