
AI Cloud

Researchers: Indicates if the platform is accessible to researchers (e.g., PhD students, postdocs, faculty) for research purposes.
Students: Indicates if the platform is accessible to students for educational purposes (e.g., coursework, projects, theses).
Sensitive Data: Indicates if the platform supports processing and storing sensitive or confidential data.
CPU processing: Indicates if the platform supports computational tasks that require only CPU resources.
GPU processing: Indicates if the platform supports computational tasks that require GPU resources for acceleration (e.g., deep learning).
Unlimited compute: Indicates if the platform allows unrestricted compute usage, without limits on usage time.
Terminal interface: Indicates if the platform is accessed through a terminal (command-line) interface.
Pre-installed apps: Indicates if the platform comes with pre-installed applications or frameworks for convenience (e.g., Ansys, PyTorch, TensorFlow).
Collaboration friendly: Indicates if the platform supports collaborative work (e.g., sharing resources, co-editing, team projects).
Working interactively: Indicates if the platform supports interactive workflows where users can interact with running processes (e.g., Jupyter notebooks).
Possible to add GUI: Indicates if it is possible to run graphical user interfaces (GUIs) on the platform (e.g., remote desktops, JupyterLab).

Introduction

AI Cloud is a GPU cluster made up of a collection of NVIDIA GPUs, designed for GPU-demanding machine learning workloads. The platform is accessed through a terminal application on the user's local machine. From there, the user logs in to a front-end node, where file management and job submission to the compute nodes take place.

Getting Started

Key Features

High-Performance GPU Cluster

Harness powerful NVIDIA GPUs for efficient processing of large datasets and complex models.
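
As an illustration of the kind of workload the cluster targets, here is a minimal sketch that checks for an available NVIDIA GPU and runs a matrix multiplication on it. It assumes PyTorch (mentioned above as an example of a pre-installed framework) is available in your environment; the matrix sizes are arbitrary placeholders.

```python
# Minimal sketch: run a computation on an NVIDIA GPU if one is available.
# Assumes PyTorch is available in your environment.
import torch

# Prefer the GPU when CUDA is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")
if device.type == "cuda":
    print(f"GPU: {torch.cuda.get_device_name(device)}")

# A simple GPU-friendly workload: a large matrix multiplication.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b
print(f"Result shape: {tuple(c.shape)}")
```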

Containerization for Flexibility

Ensure consistent software environments across nodes, supporting diverse and customizable computational workflows.

Efficient Batch Processing

AI Cloud uses Slurm for seamless job scheduling, enabling easy batch processing and background task management.
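
In practice, you typically write a batch script and hand it to Slurm with sbatch from the front-end node. The sketch below shows one way to generate and submit such a script from Python; it assumes the standard Slurm command-line tools are on your path, and the job name, time limit, memory, and the `train.py` command are hypothetical placeholders to adjust for your own job.

```python
# Minimal sketch: generate a Slurm batch script and submit it with sbatch.
# The resource requests (time limit, GPU count, memory) are placeholders;
# adjust them to your actual job and the cluster's policies.
import subprocess
from pathlib import Path

batch_script = """#!/bin/bash
#SBATCH --job-name=example-job
#SBATCH --time=01:00:00
#SBATCH --gres=gpu:1
#SBATCH --mem=16G
#SBATCH --output=example-job-%j.out

# Replace with the command you actually want to run on the compute node.
python train.py
"""

script_path = Path("example-job.sbatch")
script_path.write_text(batch_script)

# Submit the job; sbatch prints something like "Submitted batch job 12345".
result = subprocess.run(
    ["sbatch", str(script_path)],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
```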

Common Use Cases

Training deep learning models

GPU access for AI projects

Fine-tuning large language models

Training speech models for PhD research

AI research with CT images

Drug discovery acceleration

ML models for question answering

Knowledge graph embedding models

Machine vision system development

Important Information

Not for confidential or sensitive data

With AI Cloud you are only allowed to work with public or internal information according to AAU’s data classification model (classified as levels 0 and 1, respectively).

If you would like to work with confidential or sensitive data (classified as levels 2 and 3), we support another HPC platform, UCloud, for that purpose.

Not suitable for CPU-only computational tasks

The powerful GPUs allow users to process large datasets much more efficiently than pure CPU processing would, provided that your application can be parallelised in a GPU-compatible manner. At the same time, the AI Cloud platform is not designed for CPU-only computational tasks; for those needs we recommend alternative platforms such as UCloud or Strato.
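
To give a sense of what "parallelised in a GPU-compatible manner" means, the sketch below contrasts an element-wise, data-parallel operation (which maps well onto a GPU) with a strictly sequential computation where each step depends on the previous one (which gains nothing from GPU acceleration). It again assumes PyTorch is available; the sizes and constants are arbitrary.

```python
# Sketch: the first computation is data-parallel and suits a GPU; the second is
# inherently sequential (each step depends on the previous one) and stays CPU-bound.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# GPU-friendly: the same independent operation applied to millions of elements at once.
x = torch.randn(10_000_000, device=device)
y = torch.sin(x) * 2.0 + 1.0  # element-wise, fully parallelisable

# Not GPU-friendly: a dependency chain that must run one step at a time.
value = 0.0
for i in range(10_000):
    value = value * 0.5 + i  # each iteration needs the previous result

print(f"Parallel result on {device}: mean = {y.mean().item():.4f}")
print(f"Sequential result on CPU: {value:.4f}")
```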

Review the terms and conditions

Before getting started, take a few moments to review the terms and conditions of using AI Cloud, and don't hesitate to reach out to our support team if you have any questions or concerns.