DGX-Ready Software

NVIDIA DGX System

NVIDIA DGX servers and workstations use GPGPUs to accelerate deep learning applications. They are built around a rackmount chassis with high-performance x86 server CPUs (typically Intel Xeon, except for the DGX A100 and DGX Station A100, which use AMD EPYC CPUs). The main component is a set of 4 to 16 NVIDIA Tesla GPU modules on an independent system board, integrated through a version of the SXM socket. DGX systems use powerful cooling to dissipate thousands of watts of thermal output.

Backend.AI is validated as the first NVIDIA DGX-Ready Software in the Asia-Pacific region.

Backend.AI on the DGX Family

Complements to NVIDIA Container Runtime

  • GPU sharing for multi-user support (see the sketch after this list)
  • Features for machine learning pipelines
  • Scheduling with CPU/GPU topology
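
As a rough illustration of GPU sharing and topology-aware scheduling, the sketch below shows how fractional GPU shares requested by multiple users could be placed on the device closest to their CPUs. It is a conceptual example only: the names (GpuSlot, pick_gpu) and the share units are assumptions for illustration, not the Backend.AI implementation or its API.

    # Conceptual sketch only: fractional GPU sharing with a simple
    # topology-aware placement rule. The names (GpuSlot, pick_gpu) are
    # hypothetical and do not represent the Backend.AI API.
    from dataclasses import dataclass


    @dataclass
    class GpuSlot:
        device_id: int
        numa_node: int          # CPU/GPU topology hint (NUMA locality)
        capacity: float = 1.0   # 1.0 means one whole GPU
        used: float = 0.0

        @property
        def free(self) -> float:
            return self.capacity - self.used


    def pick_gpu(slots: list[GpuSlot], share: float, preferred_numa: int) -> GpuSlot:
        """Allocate a fractional GPU share, preferring the requested NUMA node."""
        candidates = [s for s in slots if s.free >= share]
        if not candidates:
            raise RuntimeError("no GPU has enough free capacity")
        # Devices on the same NUMA node as the session's CPUs come first,
        # then the device with the most free capacity.
        candidates.sort(key=lambda s: (s.numa_node != preferred_numa, -s.free))
        chosen = candidates[0]
        chosen.used += share
        return chosen


    if __name__ == "__main__":
        gpus = [GpuSlot(0, numa_node=0), GpuSlot(1, numa_node=1)]
        # Two half-GPU sessions from different users end up packed on the
        # device closest to their CPUs.
        print(pick_gpu(gpus, share=0.5, preferred_numa=0).device_id)  # 0
        print(pick_gpu(gpus, share=0.5, preferred_numa=0).device_id)  # 0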

