High Memory Usage in Redis

The memory usage rate of a Redis instance is a critical monitoring metric. Insufficient memory can slow down database responses and, in severe cases, cause data write errors.

Configuration Description: Once the maxmemory parameter is set, it defines the memory limit available to Redis. When memory usage reaches this limit, Redis by default continues to accept read commands but returns errors for write commands. Alternatively, Redis can be configured to evict keys when the limit is reached so that the instance keeps operating normally.
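
This behavior can be illustrated from a client. The block below is a minimal sketch, assuming a test instance where you are allowed to change maxmemory at runtime; the 100mb limit is a placeholder value, while the commands and the OOM error text are standard open-source Redis.

    # Check the current memory limit (0 means no limit in open-source Redis)
    127.0.0.1:6379> CONFIG GET maxmemory
    1) "maxmemory"
    2) "0"

    # Set an example limit of 100 MB (placeholder value for illustration)
    127.0.0.1:6379> CONFIG SET maxmemory 100mb
    OK

    # With the default noeviction policy, writes fail once the limit is reached
    127.0.0.1:6379> SET some:key "value"
    (error) OOM command not allowed when used memory > 'maxmemory'.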

View Memory Usage

Memory usage can be monitored from either the instance itself or the Pods.

  • Instance-level metrics are taken from Redis's INFO output; you can view the corresponding memory statistics under Redis Instance Details > Monitoring (see the INFO memory sketch after this list).

    Metric Name           | Meaning
    MemoryUsed            | Total memory allocated by the Redis allocator.
    MemFragmentationRatio | The fragmentation ratio: the ratio of used_memory_rss to used_memory.
    MemoryUsed,Peak       | Peak memory consumption of Redis.
    MemoryMax             | The maximum usable memory value set in configuration.
  • Pod-level metrics report memory usage for each Pod backing the Redis instance, broken down by container within the Pod. You can find Pod memory monitoring under Redis Instance Details > Replica Count > StatefulSet > Pods.

    Metric Name       | Meaning
    Memory Usage Rate | Memory usage rate of each Pod.
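
The console metrics above are derived from fields that Redis itself reports. The excerpt below is a sketch of the corresponding INFO memory output; the field names are standard, the values are illustrative, and the trailing annotations are added for this guide rather than part of the real output.

    127.0.0.1:6379> INFO memory
    # Memory
    used_memory:1032264               <- MemoryUsed
    used_memory_peak:2191568          <- MemoryUsed,Peak
    mem_fragmentation_ratio:1.23      <- MemFragmentationRatio
    maxmemory:1073741824              <- MemoryMax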

Memory Usage Analysis

In most scenarios, memory is consumed primarily by the data you actually store; in that case, consider scaling up the instance specifications as needed.

In scenarios where Redis's memory usage does not match expectations, you can use Redis's MEMORY commands for further analysis.

Click Redis Instance Details > Terminal Console, connect to the Redis instance, and run the MEMORY STATS command to view a breakdown of the server's memory usage. For details, refer to MEMORY STATS.
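
The excerpt below sketches what the MEMORY STATS output looks like in redis-cli; only a few of the fields are shown, the values are illustrative, and the inline notes are added for this guide.

    127.0.0.1:6379> MEMORY STATS
     1) "peak.allocated"        # peak memory ever allocated by the process
     2) (integer) 2191568
     3) "total.allocated"       # memory currently allocated by the allocator
     4) (integer) 1032264
     5) "startup.allocated"     # memory consumed at process initialization
     6) (integer) 512960
     7) "replication.backlog"   # master-slave replication backlog buffer
     8) (integer) 1048576
    ... (remaining fields omitted, including dataset.bytes and overhead.total)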

Typically, the dataset consumes the majority of a Redis instance's memory, but there are other overheads as well, such as the master-slave replication backlog buffer, memory consumed during Redis process initialization, and the internal structures Redis maintains to track keys and values.

  • If the memory consumed by non-dataset elements is small, it is usually safe to ignore it and focus on whether the dataset's memory usage meets expectations.

  • If non-dataset elements occupy a large amount of memory, analyze the source of the consumption item by item using the MEMORY STATS output (see the sketch after this list).
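
A sketch of this item-by-item check from the terminal console is shown below. It assumes the standard MEMORY commands are available; the key name is a placeholder, and redis-cli --bigkeys requires shell access to a host with redis-cli installed rather than the web console.

    # Non-dataset overhead: inspect individual MEMORY STATS fields such as
    # replication.backlog, clients.normal, startup.allocated, and overhead.total
    127.0.0.1:6379> MEMORY STATS

    # Dataset usage: estimate the memory held by a specific key (placeholder name)
    127.0.0.1:6379> MEMORY USAGE user:1001:profile
    (integer) 312

    # From a client host, scan the keyspace for unusually large keys
    $ redis-cli --bigkeys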

Memory Policy Configuration

You can also manage in-memory data by configuring a data eviction policy or an expiration policy.

Data Eviction Policy

When Redis reaches its memory limit and a data eviction policy has been configured, data is evicted automatically so that the Redis instance keeps running normally. You can update the maxmemory-policy parameter through Redis Instance Details > Parameter Configuration; a CLI sketch follows the parameter table below.

Parameter value description:

Parameter Value  | Description
volatile-lru     | Evicts the least recently used data from the set of keys that have an expiration time (including key-values and fields in hashes).
volatile-lfu     | Evicts the least frequently used data from the set of keys that have an expiration time (including key-values and fields in hashes).
volatile-ttl     | Evicts only data that has an expiration time, in ascending order of TTL.
volatile-random  | Randomly selects data from the set of keys that have an expiration time for eviction.
allkeys-lru      | Evicts the least recently used data from all data sets (including key-values, hashes, lists, sets, and zsets).
allkeys-lfu      | Evicts the least frequently used data from all data sets (including key-values, hashes, lists, sets, and zsets).
allkeys-random   | Randomly selects data from all data sets (including key-values, hashes, lists, sets, and zsets) for eviction.
noeviction       | Does not evict data; write operations return errors directly when memory is insufficient.
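
If you manage parameters from a CLI instead of the Parameter Configuration page, the sketch below shows the standard Redis commands; whether runtime CONFIG SET is permitted depends on your hosting setup, so treat that as an assumption.

    # Inspect the current eviction policy (noeviction is the open-source default)
    127.0.0.1:6379> CONFIG GET maxmemory-policy
    1) "maxmemory-policy"
    2) "noeviction"

    # Switch to allkeys-lru so the least recently used keys are evicted at the limit
    127.0.0.1:6379> CONFIG SET maxmemory-policy allkeys-lru
    OK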

Expire Policy

Redis provides a configurable expiration ("expire") policy that deletes keys automatically. You set an expiration time on a key, and Redis deletes the key once that time is reached. This is particularly useful for cache data, as it prevents memory from being wasted on cached entries that would otherwise be retained indefinitely.
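
A brief sketch of the expiration commands this policy relies on is shown below; key names and TTL values are placeholders.

    # Write a cache entry that expires automatically after one hour (3600 seconds)
    127.0.0.1:6379> SET cache:homepage "<html>...</html>" EX 3600
    OK

    # Add an expiration to an existing key (30 minutes)
    127.0.0.1:6379> EXPIRE session:abc123 1800
    (integer) 1

    # Check how many seconds remain before Redis deletes the key
    127.0.0.1:6379> TTL session:abc123
    (integer) 1794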

Memory Best Practice Recommendations

Generally, the dataset occupies the vast majority of memory in Redis, so memory optimization primarily focuses on this part. You can refer to the best practices for configuration below:

  • Set a reasonable maximum memory limit (maxmemory) for Redis; this parameter controls the maximum amount of memory the instance can use. When Redis reaches the limit, it applies the configured cleanup strategy, such as deleting expired keys or using the LFU (least frequently used) algorithm to remove rarely used keys. It is recommended to set maxmemory to at most 80% of the memory in the instance specification.
  • Carefully consider the data structures and key design in Redis. For example, using smaller data types (such as short strings) or adopting more compact data structures (such as Redis hashes) can reduce memory usage; see the sketch after this list.
  • Set reasonable expiration times and cleanup strategies for keys.
  • Avoid creating big keys.
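
As an illustration of the key-design point above, the sketch below compares separate string keys with a single hash holding the same fields. Small hashes use Redis's compact listpack encoding (ziplist on older versions), which usually costs less memory than the equivalent set of string keys; key names and the reported size are illustrative.

    # Separate string keys: each key carries its own per-key overhead
    127.0.0.1:6379> SET user:1001:name "Alice"
    127.0.0.1:6379> SET user:1001:email "alice@example.com"

    # A single hash holding the same fields stays in a compact encoding while small
    127.0.0.1:6379> HSET user:1001 name "Alice" email "alice@example.com"
    (integer) 2
    127.0.0.1:6379> OBJECT ENCODING user:1001
    "listpack"

    # Compare the memory footprint of each layout (value is illustrative)
    127.0.0.1:6379> MEMORY USAGE user:1001
    (integer) 106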

Best practices differ across application scenarios, so you may need to adapt these recommendations for Redis memory usage to your specific context and requirements.