Alauda AI

Navigation

Overview

Introduction
Features Overview
Quick Start
Release Notes

Install

Pre-configuration
Alauda AI Essentials Cluster Plugin
Alauda AI Cluster Components

Upgrade

Upgrade from AML 1.2

Infrastructure Management

Device Management

Introduction

Multi-Tenant

Guides

Namespace Management
Permissions

Model Deployment & Inference

Overview

Introduction
Features

Inference Service

Introduction

Guides

Feature Introduction

Troubleshooting

Experiencing Inference Service Timeouts with MLServer Runtime
Permissions

Model Management

Introduction

Guides

Feature Introduction
Permissions

Monitoring & Ops

Overview

Introduction
Features Overview

Logging & Tracing

Introduction

Guides

Logging

Resource Monitoring

Introduction

Guides

Resource Monitoring
Permissions

API Reference

Introduction

Kubernetes APIs

Manage APIs

AmlNamespace [manage.aml.dev/v1alpha1]

Operator APIs

AmlCluster [amlclusters.aml.dev/v1alpha1]

Inference Service APIs

ClusterServingRuntime [serving.kserve.io/v1alpha1]
InferenceService [serving.kserve.io/v1beta1]
Glossary

#Inference Service

Introduction

  • Core Advantages
  • Application Scenarios

Guides

Feature Introduction

  • Advantages
  • Applicable Scenarios
  • Value Brought
  • Main Features
  • Create inference service
  • Experience

Troubleshooting

Experiencing Inference Service Timeouts with MLServer Runtime

  • Problem Description
  • Root Cause Analysis
  • Solutions
  • Summary

Permissions
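The "Create inference service" guide above is backed by the InferenceService [serving.kserve.io/v1beta1] resource listed in the API Reference. As a rough sketch only (the service name, model format, and storageUri below are illustrative placeholders, not values from this documentation), a minimal manifest could look like:

```yaml
# Hypothetical minimal InferenceService manifest (KServe v1beta1).
# "sklearn-iris" and the storageUri are example values, not Alauda AI defaults.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn
      storageUri: "gs://kfserving-examples/models/sklearn/1.0/model"
```

Such a manifest would typically be applied with `kubectl apply -f` to a namespace managed through the platform's Namespace Management; the actual fields supported in your environment are documented under Inference Service APIs.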
