===== Data Transfer to and from AHPCC Clusters =====
  
A dedicated external data mover node is available, called **tgv.uark.edu** from campus and **dm.uark.edu** from the world.  It should be used for moving data to and from the clusters and the Razor parallel file systems.  **tgv**/**dm** is configured with a 10Gb/s network connection and a dedicated 21TB storage system mounted at **''/local_storage''**.  Regular login shells are blocked.  The allowed protocols are
  
  * **''scp''** (secure copy)
  * **''rsync''**
  
=== Linux and MacOS ===

To upload a data file from the current directory on your local desktop machine to your /storage directory on **razor**:
<code>
pawel@localdesktop$ scp localfile.dat pwolinsk@tgv.uark.edu:/storage/pwolinsk/
</code>
To download a data file from your /storage directory on **razor** to the current directory on your local desktop machine:
<code>
pawel@localdesktop$ scp pwolinsk@tgv.uark.edu:/storage/pwolinsk/remotefile.dat .
</code>
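**''rsync''** is also allowed through the same node and can resume interrupted transfers.  A sketch of both directions, assuming an illustrative directory name ''myproject'' (replace the user name and paths with your own):

```shell
# Upload a directory tree to /storage on razor via the data mover;
# -a preserves permissions and timestamps, -v lists files, -z compresses.
rsync -avz ./myproject/ pwolinsk@tgv.uark.edu:/storage/pwolinsk/myproject/

# Download in the other direction:
rsync -avz pwolinsk@tgv.uark.edu:/storage/pwolinsk/results/ ./results/
```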
You will also have a staging directory on **tgv**/**dm** called **/local_storage/$USER/**.  A new Globus Online instance on **tgv**/**dm** is in preparation, and login shells are available for special situations such as batch **''wget''** from an HTTP server.  Please contact hpc-support@listserv.uark.edu if you need that.

=== Windows ===

Windows does not include secure copy or secure FTP tools, but several file transfer clients are available for download.  The most popular command-line client is pscp.exe, available here:

https://the.earth.li/~sgtatham/putty/latest/x86/pscp.exe

To transfer files with pscp.exe, download and save it to your Windows machine.  Then open a terminal (e.g. Command Prompt, under "Start->All Programs->Accessories->Command Prompt") and give the full path to the downloaded pscp.exe followed by two arguments, <source> and <destination>.  Either one or both can name a file on a remote machine (user@host:path_to_file) or a local file (path_to_file).  For example:
 +
<code>
C:\Users\Pawel> c:\Users\Pawel\Downloads\pscp.exe filetoupload.txt pwolinsk@razor.uark.edu:
</code>
 +
The command above uses the secure copy protocol to upload the file "filetoupload.txt" to the home directory of user pwolinsk on razor.uark.edu.
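Downloads use the same two-argument form with the remote side as the source; an illustrative example (the remote file name is assumed):

```shell
:: Run in Command Prompt: fetch remotefile.dat from the home directory of
:: pwolinsk on razor into the current folder ("." means the current directory).
c:\Users\Pawel\Downloads\pscp.exe pwolinsk@razor.uark.edu:remotefile.dat .
```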

Another popular Windows transfer client (GUI) is WinSCP:

https://winscp.net/eng/download.php

<html>
<table border=0><tr><td bgcolor=#aaaaaa><b>NOTE:</b> The data mover node <b><tt>tgv.uark.edu</tt></b> can also be accessed as <b><tt>dm.uark.edu</tt></b> from outside of the UofA network.  This domain name is assigned an IP address in the network DMZ (demilitarized zone) on a 10Gb/s ethernet network and is the preferred domain name for external transfers to and from UofA.</td></tr></table>
</html>
  
===== Data Transfer between Razor & Trestles Clusters =====
  
A dedicated internal node named **bridge** is set aside for moving data between the storage systems of the Razor and Trestles clusters.  The **bridge** node has **//home//**, **//storage//** and **//scratch//** file systems from both Razor and Trestles mounted under these directories:
  
__Trestles file systems__
<code>
/trestles/home/
/trestles/storage/
/trestles/scratch/
</code>
These are also mounted at **/home**, **/storage** and **/scratch**.  There are also multiple privately owned storage areas, **/storage[x]**.
When you log in to **bridge**, you will be located in your Trestles **//home//**.  Also recall that the persistent Trestles **/scratch/$USER** partition is being phased out and will in the future be only a per-job **/scratch/$PBS_JOBID** for each batch job.
  
__Razor file systems__
<code>
/razor/home/
/razor/storage/
/razor/scratch/
</code>
  
Although **''scp''** to the **bridge** node is possible from either cluster, we recommend logging into **bridge** directly and using the **''cp''** or **''mv''** commands to move files, thus using 40Gb/s InfiniBand instead of **''scp''** over 1Gb/s ethernet:
  
<code>
tres-l1:pwolinsk:$ ssh bridge
tres-l1:pwolinsk:$ 
</code>
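The copy step on **bridge** is an ordinary **''cp''** between the two mount trees.  A minimal local sketch, using /tmp stand-ins for the mounts (on bridge the real paths would be e.g. /razor/home/$USER and /trestles/home/$USER):

```shell
# Stand-ins for the bridge mounts (illustrative; on bridge these would be
# the real /razor/... and /trestles/... directories).
RAZOR=/tmp/bridgedemo/razor/home/pwolinsk
TRESTLES=/tmp/bridgedemo/trestles/home/pwolinsk
mkdir -p "$RAZOR" "$TRESTLES"
echo "results" > "$RAZOR/results.dat"

# The transfer itself is a plain local copy across the mounted file systems.
cp "$RAZOR/results.dat" "$TRESTLES/"
```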
**''rsync''** is better for copying whole directories:
<code>
bridge:pwolinsk:$ cd /razor/home/pwolinsk
bridge:pwolinsk:/razor$ rsync -av XSEDE /home/pwolinsk/
sending incremental file list
XSEDE/
XSEDE/xsede13/
... (remaining files listed by the -v option omitted) ...
XSEDE/xsede14/OpenMP_Exercise/heat.c

sent 185047 bytes  received 1258 bytes  124203.33 bytes/sec
total size is 180459  speedup is 0.97
</code>

On **bridge**, **''/home/username''** and **''/trestles/home/username''** are equivalent.

**bridge** is not reachable from outside the clusters; please use **tgv**/**dm** instead.  A data transfer node for the Trestles file systems is planned.
moving_data.1456180450.txt.gz · Last modified: 2016/02/22 22:34 by pwolinsk