Connecting to the HPC infrastructure#

Before you can really start using the HPC clusters, there are several things you need to do or know:

  1. You need to log on to the cluster using an SSH client to one of the login nodes, or by using the HPC web portal. This will give you command-line access. For the web portal, a standard web browser such as Firefox or Chrome will suffice.

  2. Before you can do some work, you'll have to transfer the files that you need from your desktop computer to the cluster. At the end of a job, you might want to transfer some files back.

  3. Optionally, if you wish to use programs with a graphical user interface, you will need an X server on your client system and you must log in to the login nodes with X-forwarding enabled.

  4. Often several versions of software packages and libraries are installed, so you need to select the ones you require. The VSC clusters use so-called modules to manage different versions efficiently, so you will need to select and load the modules that you need (see the brief example just after this list).
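As a first taste of items 3 and 4, the commands below show X-forwarding and basic module usage. The module name Python is only an illustration; the modules that are actually available differ per cluster.

$ ssh -X vsc40000@login.hpc.ugent.be   # log in with X-forwarding enabled
$ module avail                         # list the software modules that are available
$ module load Python                   # load a particular module
$ module list                          # show which modules are currently loaded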

Connection restrictions#

Since March 20th 2020, restrictions are in place that limit from where you can connect to the VSC HPC infrastructure, in response to security incidents involving several European HPC centres.

VSC login nodes are only directly accessible from within university networks, and from (most) Belgian commercial internet providers.

All other IP domains are blocked by default. If you are connecting from an IP address that is not allowed direct access, you have the following options to get access to VSC login nodes:

  • Use a VPN connection to connect to the UGent network (recommended). See https://helpdesk.ugent.be/vpn/en/ for more information.

  • Whitelist your IP address automatically by accessing https://firewall.vscentrum.be and logging in with your UGent account.

    • While this web connection is active, new SSH sessions can be started.

    • Active SSH sessions will remain active even when this web page is closed.

  • Contact your HPC support team (via hpc@ugent.be) and ask them to whitelist your IP range (e.g., for industry access, automated processes).

Trying to establish an SSH connection from an IP address that does not adhere to these restrictions will result in an immediate failure to connect, with an error message like:

ssh_exchange_identification: read: Connection reset by peer

First-time connection to the HPC infrastructure#

The remaining content in this chapter is primarily aimed at people using a terminal with SSH. If you are using the web portal instead, the corresponding chapter might be more helpful: Using the HPC-UGent web portal.

If you have any issues connecting to the HPC after you've followed these steps, see Issues connecting to login node to troubleshoot.

Connect#

Open up a terminal and enter the following command to connect to the HPC.

$ ssh vsc40000@login.hpc.ugent.be

Here, user vsc40000 wants to make a connection to the "hpcugent" cluster at UGent via the login node "login.hpc.ugent.be", so replace vsc40000 with your own VSC id in the above command.

The first time you make a connection to the login node, you will be asked to verify the authenticity of the login node. Please check Warning message when first connecting to new host on how to do this.
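The prompt typically looks along these lines (the exact wording and fingerprint depend on your OpenSSH version and the host; only answer yes after you have verified the fingerprint):

The authenticity of host 'login.hpc.ugent.be (<IP>)' can't be established.
RSA key fingerprint is SHA256:<fingerprint>.
Are you sure you want to continue connecting (yes/no)? yes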

A possible error message you can get if you previously saved your private key somewhere other than the default location ($HOME/.ssh/id_rsa):

Permission denied (publickey,gssapi-keyex,gssapi-with-mic).

In this case, use the -i option for the ssh command to specify the location of your private key. For example:

$ ssh -i /home/example/my_keys vsc40000@login.hpc.ugent.be
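If you connect often, you can avoid retyping these options by adding an entry to the SSH configuration file on your local machine. A minimal sketch of ~/.ssh/config, reusing the example key location and VSC id from above (the alias hpcugent is arbitrary):

Host hpcugent
    HostName login.hpc.ugent.be
    User vsc40000
    IdentityFile /home/example/my_keys

With this in place, ssh hpcugent is enough to connect.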

Congratulations, you're on the HPC infrastructure now! To find out where you have landed you can print the current working directory:

$ pwd
/user/home/gent/vsc400/vsc40000

Your new private home directory is "/user/home/gent/vsc400/vsc40000". Here you can create your own subdirectory structure, copy and prepare your applications, compile and test them and submit your jobs on the HPC.

$ cd /apps/gent/tutorials
$ ls
Intro-HPC/

This directory currently contains all training material for the Introduction to the HPC. More training material for working with the HPC may be added to this directory later.

You can now explore the content of this directory with the "ls -l" (list long) and the "cd" (change directory) commands:

As we are interested in the use of the HPC, move further to Intro-HPC and explore the contents up to 2 levels deep:

$ cd Intro-HPC
$ tree -L 2
.
'-- examples
    |-- Compiling-and-testing-your-software-on-the-HPC
    |-- Fine-tuning-Job-Specifications
    |-- Multi-core-jobs-Parallel-Computing
    |-- Multi-job-submission
    |-- Program-examples
    |-- Running-batch-jobs
    |-- Running-jobs-with-input
    |-- Running-jobs-with-input-output-data
    |-- example.pbs
    '-- example.sh
9 directories, 5 files

This directory contains:

  1. This HPC Tutorial (in either a Mac, Linux or Windows version).

  2. An examples subdirectory, containing all the examples that you need in this Tutorial, as well as examples that might be useful for your specific applications.

$ cd examples

Tip

Typing cd ex followed by Tab (the Tab-key) will generate the cd examples command. Command-line completion (also tab completion) is a common feature of the bash command line interpreter, in which the program automatically fills in partially typed commands.

Tip

For more exhaustive tutorials about Linux usage, see the appendix Useful Linux Commands.

The first action is to copy the contents of the HPC examples directory to your home directory, so that you have your own personal copy and that you can start using the examples. The "-r" option of the copy command will also copy the contents of the sub-directories "recursively".

$ cp -r /apps/gent/tutorials/Intro-HPC/examples ~/

Go to your home directory, check your own private examples directory, ... and start working.

$ cd
$ ls -l

Upon connecting you will see a login message containing your last login time stamp and a basic overview of the current cluster utilisation.

Last login: Thu Mar 18 13:15:09 2021 from gligarha02.gastly.os

 STEVIN HPC-UGent infrastructure status on Mon, 19 Feb 2024 10:00:01
      cluster         - full - free -  part - total - running - queued
                        nodes  nodes   free   nodes   jobs      jobs
 -------------------------------------------------------------------------
           skitty          39      0     26      68      1839     5588
           joltik           6      0      1      10        29       18
            doduo          22      0     75     128      1397    11933
         accelgor           4      3      2       9        18        1
          donphan           0      0     16      16        16       13
          gallade           2      0      5      16        19      136


For a full view of the current loads and queues see:
https://hpc.ugent.be/clusterstate/
Updates on current system status and planned maintenance can be found on https://www.ugent.be/hpc/en/infrastructure/status

You can exit the connection at any time by entering:

$ exit
logout
Connection to login.hpc.ugent.be closed.

Tip: Setting your language right

You may encounter a warning message similar to the following one when connecting:

perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
LANGUAGE = (unset),
LC_ALL = (unset),
LC_CTYPE = "UTF-8",
LANG = (unset)
    are supported and installed on your system.
perl: warning: Falling back to the standard locale ("C").
or any other error message complaining about the locale.

This means that the correct "locale" has not yet been properly specified on your local machine. You can inspect your current settings with the locale command:

$ locale
LANG=
LC_COLLATE="C"
LC_CTYPE="UTF-8"
LC_MESSAGES="C"
LC_MONETARY="C"
LC_NUMERIC="C"
LC_TIME="C"
LC_ALL=

A locale is a set of parameters that defines the user's language, country and any special variant preferences that the user wants to see in their user interface. Usually a locale identifier consists of at least a language identifier and a region identifier; for example, en_US.UTF-8 combines the English language, the US region and UTF-8 character encoding.

Note

If you try to set a non-supported locale, then it will be automatically set to the default. Currently the default is en_US.UTF-8 or en_US, depending on whether your original (non-supported) locale was UTF-8 or not.

Open the .bashrc on your local machine with your favourite editor and add the following lines:

$ nano ~/.bashrc
...
export LANGUAGE="en_US.UTF-8"
export LC_ALL="en_US.UTF-8"
export LC_CTYPE="en_US.UTF-8"
export LANG="en_US.UTF-8"
...

Tip: vi

To start entering text in vi, move to the place where you want to start entering text with the arrow keys and type "i" to switch to insert mode. You can exit vi and save your changes by entering "ESC :wq". To exit vi without saving your changes, enter "ESC :q!".

or alternatively (if you are not comfortable with the Linux editors), again on your local machine:

$ echo "export LANGUAGE=\"en_US.UTF-8\"" >> ~/.profile
$ echo "export LC_ALL=\"en_US.UTF-8\"" >> ~/.profile
$ echo "export LC_CTYPE=\"en_US.UTF-8\"" >> ~/.profile
$ echo "export LANG=\"en_US.UTF-8\"" >> ~/.profile

You can now log out, open a new terminal/shell on your local machine and reconnect to the HPC, and you should not get these warnings anymore.

Transfer Files to/from the HPC#

Before you can do some work, you'll have to transfer the files you need from your desktop or department to the cluster. At the end of a job, you might want to transfer some files back. The preferred way to transfer files is using scp or sftp via the secure OpenSSH protocol. Linux ships with an implementation of OpenSSH, so you don't need to install any third-party software to use it. Just open a terminal window and jump in!

Using scp#

Secure copy or SCP is a tool (command) for securely transferring files between a local host (= your computer) and a remote host (the HPC). It is based on the Secure Shell (SSH) protocol. The scp command is the equivalent of the cp (i.e., copy) command, but can copy files to or from remote machines.

It's easier to copy files directly to $VSC_DATA and $VSC_SCRATCH if you have symlinks to them in your home directory. See the chapter titled "Uploading/downloading/editing files", section "Symlinks for data/scratch" in the intro to Linux for how to do this.

Open an additional terminal window and check that you're working on your local machine.

$ hostname

If you're still using the terminal that is connected to the HPC, close the connection by typing "exit" in the terminal window.

For example, we will copy the (local) file "localfile.txt" to your home directory on the HPC cluster. We first generate a small dummy "localfile.txt", which contains the word "Hello". Use your own VSC account, which is something like "vsc40000". Don't forget the colon (:) at the end: if you forget it, it will just create a file named vsc40000@login.hpc.ugent.be on your local filesystem. You can even specify where to save the file on the remote filesystem by putting a path after the colon.

$ echo "Hello" > localfile.txt
$ ls -l 
...
-rw-r--r-- 1 user  staff   6 Sep 18 09:37 localfile.txt
$ scp localfile.txt vsc40000@login.hpc.ugent.be:
localfile.txt     100%   6     0.0KB/s     00:00

Connect to the HPC via another terminal, print the working directory (to make sure you're in the home directory) and check whether the file has arrived:

$ pwd
/user/home/gent/vsc400/vsc40000
$ ls -l 
total 1536
drwxrwxr-x 2
drwxrwxr-x 2
drwxrwxr-x 10
-rw-r--r-- 1
$ cat localfile.txt
Hello
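As mentioned above, you can also give a destination path after the colon. For example, to put the file directly in the docs subdirectory of your home directory on the HPC (assuming that subdirectory already exists):

$ scp localfile.txt vsc40000@login.hpc.ugent.be:docs/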

The scp command can also be used to copy files from the cluster to your local machine. Let us copy the remote file "intro-HPC-Linux-Gent.pdf" from your "docs" subdirectory on the cluster to your local computer.

First, we will confirm that the file is indeed in the "docs" subdirectory. On the terminal connected to the HPC, enter:

$ cd ~/docs
$ ls -l
total 1536
-rw-r--r-- 1 vsc40000 Sep 11 09:53 intro-HPC-Linux-Gent.pdf

Now we will copy the file to the local machine. On the terminal on your own local computer, enter:

$ scp vsc40000@login.hpc.ugent.be:./docs/intro-HPC-Linux-Gent.pdf .
intro-HPC-Linux-Gent.pdf 100% 725KB 724.6KB/s 00:01
$ ls -l
total 899
-rw-r--r-- 1 user staff 741995 Sep 18 09:53 intro-HPC-Linux-Gent.pdf
-rw-r--r-- 1 user staff      6 Sep 18 09:37 localfile.txt

The file has been copied from the HPC to your local computer.

It's also possible to copy entire directories (and their contents) with the -r flag. For example, if we want to copy the local directory dataset to $VSC_SCRATCH, we can use the following command (assuming you've created the scratch symlink):

$ scp -r dataset vsc40000@login.hpc.ugent.be:scratch

If you don't use the -r option when copying a directory, you will run into the following error:

$ scp dataset vsc40000@login.hpc.ugent.be:scratch
dataset: not a regular file

Using sftp#

The SSH File Transfer Protocol (also Secure File Transfer Protocol, or SFTP) is a network protocol that provides file access, file transfer and file management functionalities over any reliable data stream. It was designed as an extension of the Secure Shell protocol (SSH) version 2.0. This protocol assumes that it is run over a secure channel, such as SSH, that the server has already authenticated the client, and that the identity of the client user is available to the protocol.

The sftp command is the equivalent of the ftp command, with the difference that it uses the secure SSH protocol to connect to the clusters.

One easy way of starting an sftp session is:

$ sftp vsc40000@login.hpc.ugent.be

Typical and popular commands inside an sftp session are:

cd ~/examples/fibo      Move to the examples/fibo subdirectory on the HPC (i.e., the remote machine).
ls                      Get a list of the files in the current directory on the HPC.
get fibo.py             Copy the file "fibo.py" from the HPC.
get tutorial/HPC.pdf    Copy the file "HPC.pdf" from the HPC, which is in the "tutorial" subdirectory.
lcd test                Move to the "test" subdirectory on your local machine.
lcd ..                  Move up one level in the local directory.
lls                     Get a local directory listing.
put test.py             Copy the local file test.py to the HPC.
put test1.py test2.py   Copy the local file test1.py to the HPC and rename it to test2.py.
bye                     Quit the sftp session.
mget *.cc               Copy all the remote files with extension ".cc" to the local directory.
mput *.h                Copy all the local files with extension ".h" to the HPC.
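For illustration, a short sftp session might look like this (the file name localfile.txt is just an example):

$ sftp vsc40000@login.hpc.ugent.be
Connected to login.hpc.ugent.be.
sftp> put localfile.txt
sftp> ls
sftp> get localfile.txt
sftp> bye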

Using a GUI#

If you prefer a GUI to transfer files back and forth to the HPC, you can use your file browser. Open your file browser and press Ctrl+L.

This should open up an address bar where you can enter a URL. Alternatively, look for the "connect to server" option in your file browser's menu.

Enter sftp://vsc40000@login.hpc.ugent.be/ and press Enter.

You should now be able to browse files on the HPC in your file browser.

Fast file transfer for large datasets#

See the section on rsync in chapter 5 of the Linux intro manual.
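As a minimal sketch, an rsync transfer of a local directory dataset (just an example name) to your scratch space could look like this, assuming the scratch symlink mentioned earlier; -a preserves file attributes and recurses into subdirectories, -v reports each transferred file, and rsync only retransfers data that changed, which makes it well suited to large datasets:

$ rsync -av dataset vsc40000@login.hpc.ugent.be:scratch/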

Changing login nodes#

It can be useful to have control over which login node you are on. However, when you connect to the HPC, you are directed to a random login node, which might not be the one where you already have an active session. To address this, there is a way to manually switch your active login node.

For instance, if you want to switch to the login node named gligar07.gastly.os, you can use the following command while you are connected to the gligar08.gastly.os login node on the HPC:

$ ssh gligar07.gastly.os

This is also possible the other way around.

If you want to find out which login host you are connected to, you can use the hostname command.

$ hostname
gligar07.gastly.os
$ ssh gligar08.gastly.os

$ hostname
gligar08.gastly.os

Rather than always starting a new session on the HPC, you can also use a terminal multiplexer like screen or tmux. These can create sessions that 'survive' across disconnects. You can find more information on how to use these tools in their manual pages or other online sources.
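A minimal tmux sketch (the session name mysession is arbitrary; note that you must reconnect to the same login node where the session was started, as described in the previous section):

$ tmux new -s mysession        # start a named session; detach with Ctrl-b d
$ tmux attach -t mysession     # reattach after reconnecting to the same login node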