Alauda AI

# Navigation

- Overview
  - Introduction
  - Quick Start
  - Release Notes
- Install
  - Pre-installation Configuration
  - Install Alauda AI Essentials
  - Install Alauda AI
- Upgrade
  - Upgrade from AI 1.3
- Uninstall
  - Uninstall
- Infrastructure Management
  - Device Management
    - About Alauda Build of Hami
    - About Alauda Build of NVIDIA GPU Device Plugin
  - Multi-Tenant
    - Guides
      - Namespace Management
- Workbench
  - Overview
    - Introduction
  - Install
  - Upgrade
  - How To
    - Create WorkspaceKind
    - Create Workbench
- Model Deployment & Inference
  - Overview
    - Introduction
    - Features
  - Inference Service
    - Introduction
    - Guides
      - Inference Service
    - How To
      - Extend Inference Runtimes
      - Configure External Access for Inference Services
      - Configure Scaling for Inference Services
    - Troubleshooting
      - Experiencing Inference Service Timeouts with MLServer Runtime
      - Inference Service Fails to Enter Running State
  - Model Management
    - Introduction
    - Guides
      - Model Repository
- Monitoring & Ops
  - Overview
    - Introduction
    - Features Overview
  - Logging & Tracing
    - Introduction
    - Guides
      - Logging
  - Resource Monitoring
    - Introduction
    - Guides
      - Resource Monitoring
- API Reference
  - Introduction
  - Kubernetes APIs
    - Inference Service APIs
      - ClusterServingRuntime [serving.kserve.io/v1alpha1]
      - InferenceService [serving.kserve.io/v1beta1]
    - Workbench APIs
      - Workspace Kind [kubeflow.org/v1beta1]
      - Workspace [kubeflow.org/v1beta1]
    - Manage APIs
      - AmlNamespace [manage.aml.dev/v1alpha1]
    - Operator APIs
      - AmlCluster [amlclusters.aml.dev/v1alpha1]
- Glossary
