A dedicated external data mover node is available, called tgv.uark.edu from campus and dm.uark.edu from the outside world. It should be used for moving data to and from the clusters and the Razor parallel file systems. tgv/dm is configured with a 10Gb/s network connection and a dedicated 21TB storage system mounted at /local_storage. Regular login shells are blocked; only file transfer protocols such as secure copy (scp) and secure FTP (sftp) are allowed.
To upload a data file from the current directory on your local desktop machine to your /storage directory on razor:
pawel@localdesktop$ scp localfile.dat pwolinsk@tgv.uark.edu:/storage/pwolinsk/
To download a data file from your /storage directory on razor to the current directory on your local desktop machine:
pawel@localdesktop$ scp pwolinsk@tgv.uark.edu:/storage/pwolinsk/remotefile.dat .
You will also have a staging directory on tgv/dm called /local_storage/$USER/. A new Globus Online instance on tgv/dm is in preparation. Login shells are available for special situations, such as a batch wget from an http server; please contact HPC support if you need that.
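For example, a batch download into the staging area might look like the following session (the URL and file names here are hypothetical, and a login shell on tgv/dm must be enabled for your account first):

```
pawel@localdesktop$ ssh pwolinsk@dm.uark.edu
tgv:pwolinsk:$ cd /local_storage/pwolinsk
tgv:pwolinsk:$ wget http://example.org/dataset.tar.gz
tgv:pwolinsk:$ exit
```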
Windows OS does not include Secure Copy or Secure FTP tools, but multiple file transfer clients are available for download. The most popular command line client is pscp.exe, part of the PuTTY suite.
To transfer files using pscp.exe, download and save pscp.exe to your Windows machine. Then open a terminal (i.e. Command Prompt, under “Start→All Programs→Accessories→Command Prompt”) and specify the full path to the downloaded pscp.exe file followed by two arguments, <source> and <destination>. Either one or both may specify a file on a remote machine (user@host:path_to_file) or a local file (path_to_file). For example:
C:\Users\Pawel> c:\Users\Pawel\Downloads\pscp.exe filetoupload.txt pwolinsk@razor.uark.edu:
The command above uses the secure copy protocol to upload the file “filetoupload.txt” to the home directory of user pwolinsk on razor.uark.edu.
Another popular Windows transfer client with a GUI is WinSCP.
|NOTE: The data mover node tgv.uark.edu can also be accessed as dm.uark.edu from outside of the UofA network. This domain name is assigned an IP address in the network DMZ (demilitarized zone) on a 10Gb/s ethernet network and is the preferred domain name to use for UofA external transfers.|
A dedicated internal node named bridge is set aside for the purpose of moving data between storage systems of the Razor and Trestles clusters. The bridge node has home, storage and scratch file systems from both Razor and Trestles nodes mounted under these directories:
Trestles file systems
/trestles/home/
/trestles/storage/
/trestles/scratch/

These are also mounted at /home, /storage, and /scratch. There are also multiple privately owned storage areas, /storage[x].
When you log in to bridge, you will start in your Trestles home directory. Also please recall that the persistent Trestles /scratch/$USER partition is being phased out; in the future there will be only a per-job /scratch/$PBS_JOBID for each batch job.
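Batch jobs should therefore do their scratch work in the per-job directory. A minimal PBS job script sketch, assuming a hypothetical program name, input file, and resource request:

```
#!/bin/bash
#PBS -N myjob
#PBS -l nodes=1:ppn=1,walltime=1:00:00

# Work in the per-job scratch directory created for this job.
cd /scratch/$PBS_JOBID
cp $HOME/input.dat .
./myprogram input.dat > output.dat

# Copy results home before the job ends and its scratch is removed.
cp output.dat $HOME/
```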
Razor file systems
/razor/home/
/razor/storage/
/razor/scratch/
Although scp to the bridge node is possible from either cluster, we recommend logging in to the bridge node directly and using cp or mv commands to move files, which go over the 40Gb/s Infiniband network instead of scp over 1Gb/s ethernet:
tres-l1:pwolinsk:$ ssh bridge
Last login: Fri Feb 19 14:03:31 2016 from tres-l1
No Modulefiles Currently Loaded.
bridge:pwolinsk:$ cp /trestles/home/pwolinsk/memusage /razor/home/pwolinsk/
bridge:pwolinsk:$ exit
logout
Connection to bridge closed.
tres-l1:pwolinsk:$
rsync is better for copying whole directories:
bridge:pwolinsk:$ cd /razor/home/pwolinsk
bridge:pwolinsk:/razor$ rsync -av XSEDE /home/pwolinsk/
sending incremental file list
XSEDE/
XSEDE/xsede13/
...omitted: every file listed with the -v option...
XSEDE/xsede14/OpenMP_Exercise/heat.c
sent 185047 bytes  received 1258 bytes  124203.33 bytes/sec
total size is 180459  speedup is 0.97
Since /home and /trestles/home are the same file system, the destinations /home/username and /trestles/home/username are equivalent.
bridge is not reachable from outside the clusters; please use tgv/dm instead. A data transfer node for the Trestles file systems is planned.