The Virtual Wall2 testbed offers permanent shared storage for each project.

All users of a project have access to the same shared storage.

This storage can be accessed from:

  • All bare metal and VM nodes in the project's experiments on Virtual Wall2
  • GPULab jobs started within the project
  • iLab.t Jupyterhub notebooks started within the project

The data is stored permanently: it is not deleted when your experiment is terminated. Changes are visible instantly in all locations, because it is the same NFS share mounted everywhere.


There are no automatic backups for this storage! You need to keep backups of important files yourself!

Access from GPULab

You can access the shared storage from within GPULab jobs by adding a /project mount point to the jobDefinition:

"jobDataLocations": [
       "mountPoint": "/project"

This will cause the directory /project in your job to be bound to the shared storage.

You can also mount only subdirectories of /project this way:

"jobDataLocations": [
       "sharePath": "/project/mycode/",
       "mountPoint": "/work/code/"

Access from Virtual Wall2

When you start an experiment with Virtual Wall2 resources in the project MyProject, you can find the shared storage on all your nodes in this directory:


Access from Jupyterhub

The iLab.t Jupyterhub uses the shared storage as its default working directory.

Access from your own PC

To access this data share from your own machine, there are two options:

  • Start a notebook at the iLab.t Jupyterhub. The notebook will open with the shared storage as its working directory. You can easily upload and download files using the Jupyter notebook interface.
  • Use the jFed experimenter GUI to reserve a resource, and access the data from that resource. You can find a detailed tutorial on how to do this in the Fed4Fire first experiment tutorial. Note that jFed has basic scp functionality to make transferring files easier.
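Once a node is reserved via jFed, the second option comes down to a standard scp copy. The sketch below is a local demo so it runs without a testbed account; the commented line shows the real shape, where the username, node hostname, and target directory are placeholders you must fill in from jFed:

```shell
#!/bin/sh
# Sketch: copying a file to a reserved node with scp.
# The real command would look something like (all names are placeholders):
#   scp ./results.csv myuser@<node-hostname-from-jfed>:/<shared-storage-dir>/
# Local demo: scp with two local paths simply performs a plain copy.
set -e

src=$(mktemp)
echo "demo data" > "$src"
destdir=$(mktemp -d)

scp "$src" "$destdir/results.csv"

ls "$destdir"   # results.csv
```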

Note: GPULab jobs offer ssh access, but they do not offer scp access.