AHPCC is available for research and instructional use to faculty and students of any Arkansas university and their research collaborators. There is no charge for use of our computing resources. (Priority access to our resources is available through our condo program.)
Email HPC-SUPPORT@listserv.uark.edu with a description of your research for AHPCC reporting. You will be sent a name to enter in the project leader field of the account request form.
Connect to the HPC clusters using an SSH client. Linux and Mac computers have SSH built in as ssh (terminal connection) and scp (file transfer). On Windows, you may use PuTTY for a simple ready-to-run executable. From the PuTTY website you will want putty.exe, pscp.exe, possibly psftp.exe, and, for graphical connections, plink.exe. There is also an installable program, sshwinsecureshell-3.2.9.exe, with an Explorer-like file transfer GUI. You may also install Cygwin for a clone of the Linux environment in Windows (large in disk usage); SSH is not installed automatically and must be selected as an option in Cygwin setup.
Login hosts are razor.uark.edu and trestles.uark.edu, which connect to separate clusters and filesystems. Each login host is a load balancer of two or three hosts with a single interface to the outside internet, so the hostname reported by the system after you log in will be similar to razor-l2 or tres-l1. To log in with ssh, pick a cluster and use your UArk or AHPCC-assigned login name:
ssh trestles.uark.edu -l loginname
ssh razor.uark.edu -l loginname
The system will prompt for your password. PuTTY and ssh.com SSH have GUIs in which to enter the host name, login name, and password.
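On Linux, Mac, or Cygwin you can optionally shorten these commands with an entry in ~/.ssh/config; this is a standard OpenSSH feature rather than anything AHPCC-specific, and loginname below is a placeholder for your own account name:
Host razor
    HostName razor.uark.edu
    User loginname
Host trestles
    HostName trestles.uark.edu
    User loginname
With this in place, ssh razor or ssh trestles prompts for your password as before.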
To change your password, use the passwd-ahpcc command on the razor-l1, razor-l2, or razor-l3 login nodes. This command changes your password on all AHPCC clusters:
razor-l3:user-ext1:$ passwd-ahpcc
You have a non-UAF account (external user account)
Changing password for user user-ext1.
Changing password for user-ext1.
(current) UNIX password:
New password:
Retype new password:
passwd: all authentication tokens updated successfully.
Propagating to other nodes...
razor-l3:user-ext1:$
AHPCC staff will never ask for your password, and no password will ever be sent by email.
Usually you will want to copy some data files to the cluster to run a job, and copy results back after the job runs. SCP is a file transfer implementation built on SSH; it is scp in Linux/Mac/Cygwin and pscp in PuTTY. To copy program.c from your workstation to your cluster home area (note that if you are using PuTTY, these commands are run from your computer's command line, not inside PuTTY; see the moving data page for more information):
scp program.c loginname@razor.uark.edu:/home/loginname/
To copy an entire directory tree src using SCP, use -r for recursive:
scp -r src loginname@razor.uark.edu:/home/loginname/
In SCP, a string containing : is a remote destination. To bring a file back from the cluster, just reverse the remote and local destinations. If you don't specify a full path for the local destination, use ./ for the current local directory:
scp -r loginname@razor.uark.edu:/home/loginname/src ./
SFTP is an implementation of the FTP command set over SSH. Use sftp in Linux/Mac/Cygwin and psftp in PuTTY:
sftp loginname@trestles.uark.edu
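Once connected, sftp accepts FTP-style commands at its prompt. A typical session might look like the following sketch, where program.c and output.dat are placeholder file names:
sftp> ls
sftp> put program.c
sftp> get output.dat
sftp> quit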
RSYNC runs over SSH and has advantages such as preserving modification times and checking that the files actually show up at the destination. RSYNC is available in Linux/Mac/Cygwin but is not part of PuTTY or ssh.com SSH; cwRsync is one free version for Windows. RSYNC is most useful for copying or updating entire directory trees:
rsync -av src loginname@razor.uark.edu:/home/loginname/
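Reversing the arguments pulls files back from the cluster, and adding -n makes rsync do a dry run that lists what would be transferred without copying anything; results below is only a placeholder directory name:
rsync -avn loginname@razor.uark.edu:/home/loginname/results ./
rsync -av loginname@razor.uark.edu:/home/loginname/results ./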
Very large data files should be sent to or from tgv.uark.edu if on campus, or dm.uark.edu if off campus, since these nodes have faster network connections and faster connections to parallel storage.
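For example, assuming your home directory is visible at the same path from those nodes, a large directory tree could be staged through tgv.uark.edu with the same tools described above; bigdata is a placeholder name:
rsync -av bigdata loginname@tgv.uark.edu:/home/loginname/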