Diffstat (limited to 'health/guides/memcached')
-rw-r--r--  health/guides/memcached/memcached_cache_fill_rate.md         | 41
-rw-r--r--  health/guides/memcached/memcached_cache_memory_usage.md      | 35
-rw-r--r--  health/guides/memcached/memcached_out_of_cache_space_time.md | 19
3 files changed, 0 insertions, 95 deletions
diff --git a/health/guides/memcached/memcached_cache_fill_rate.md b/health/guides/memcached/memcached_cache_fill_rate.md
deleted file mode 100644
index ec276b3a7..000000000
--- a/health/guides/memcached/memcached_cache_fill_rate.md
+++ /dev/null
@@ -1,41 +0,0 @@
-### Understand the alert
-
-This alert, `memcached_cache_fill_rate`, measures the rate at which the Memcached cache fills up (positive value) or frees up (negative value) space, averaged over the last hour and expressed in `KB/hour`. If you receive this alert, your Memcached cache is filling up or freeing up space at a noticeable rate.
-
-### What is Memcached?
-
-Memcached is a high-performance, distributed memory object caching system used to speed up web applications by temporarily storing frequently-used data in RAM. It reduces the load on the database and improves performance by minimizing the need for repeated costly database queries.
-
-### Troubleshoot the alert
-
-1. Check the current cache usage:
-
-You can view the current cache usage using the following command, where `IP` and `PORT` are the Memcached server's IP address and port number:
-
-```
-echo "stats" | nc IP PORT
-```
-
-Look for the `bytes` and `limit_maxbytes` fields in the output to see the current cache usage and the maximum cache size allowed, respectively.
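-
-To gauge how full the cache is as a single percentage, you can combine the two values. This is a minimal sketch using `awk`, with `IP` and `PORT` as placeholders; depending on your `nc` variant you may need an option such as `-q 1` so it exits once the response has been received:
-
-```
-echo "stats" | nc IP PORT | awk '/STAT bytes /{used=$3} /STAT limit_maxbytes /{max=$3} END{if (max) printf "cache usage: %.1f%%\n", used*100/max}'
-```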
-
-2. Identify heavy cache users:
-
-Find out which applications or services are generating a significant number of requests to Memcached; you may be able to optimize them to reduce cache usage. Note that Memcached does not log individual requests by default, so to see more detail about requests and operations you can run it in verbose mode (`-v`/`-vv`) or temporarily enable per-key statistics collection, as shown below.
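-
-A possible way to do this uses the text protocol's detailed stats collection (it adds overhead, so switch it off when you are done); `IP` and `PORT` are placeholders:
-
-```
-# enable per-key statistics collection
-echo "stats detail on" | nc IP PORT
-# ...let traffic flow for a while, then dump the collected counters
-echo "stats detail dump" | nc IP PORT
-# disable collection again to avoid the extra overhead
-echo "stats detail off" | nc IP PORT
-```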
-
-3. Optimize cache storage:
-
-If the cache is filling up too quickly, consider optimizing your cache storage policies. For example, you can adjust the expiration times of stored items, prioritize essential data, or use a more efficient caching strategy.
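-
-Expiration times are set by the client when an item is stored. As an illustration only, using the plain text protocol with a hypothetical key `session:123`: the third argument of `set` is the TTL in seconds (here 300) and the fourth is the payload length in bytes:
-
-```
-printf 'set session:123 0 300 5\r\nhello\r\nquit\r\n' | nc IP PORT
-```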
-
-4. Increase the cache size:
-
-If needed, you can increase the cache size to accommodate a higher fill rate. To do this, stop the Memcached service and restart it with the `-m` option, specifying the desired memory size in megabytes:
-
-```
-memcached -d -u memcached -m NEW_SIZE -l IP -p PORT
-```
-
-Replace `NEW_SIZE` with the desired cache size in MB.
-
-### Useful resources
-
-1. [Memcached Official Site](https://memcached.org/)
diff --git a/health/guides/memcached/memcached_cache_memory_usage.md b/health/guides/memcached/memcached_cache_memory_usage.md
deleted file mode 100644
index 2a14f01fc..000000000
--- a/health/guides/memcached/memcached_cache_memory_usage.md
+++ /dev/null
@@ -1,35 +0,0 @@
-### Understand the alert
-
-This alert indicates the percentage of used cached memory in your Memcached instance. High cache memory utilization can lead to evictions and performance degradation. The warning state is triggered when the cache memory utilization is between 70-80%, and the critical state is triggered when it's between 80-90%.
-
-### What does cache memory utilization mean?
-
-Cache memory utilization refers to the percentage of memory used by Memcached for caching data. A high cache memory utilization indicates that your Memcached instance is close to its maximum capacity, and it may start evicting data to accommodate new entries, which can negatively impact performance.
-
-### Troubleshoot the alert
-
-1. **Monitor cache usage and evictions**: Use the following command to display the current cache usage and evictions metrics:
-
- ```
- echo "stats" | nc localhost 11211
- ```
- Look for the `bytes` and `evictions` metrics in the output. High evictions indicate that your cache size is insufficient for the current workload, and you may need to increase it.
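-
-   To see whether evictions are still growing, you could poll the two counters periodically; a minimal sketch, assuming a local instance on the default port:
-
-   ```
-   while true; do
-     echo "stats" | nc localhost 11211 | grep -E 'STAT (bytes|evictions) '
-     sleep 10
-   done
-   ```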
-
-2. **Increase cache size**: To increase the cache size, edit the Memcached configuration file (usually `/etc/memcached.conf`) and update the value of the `-m` option. For example, to set the cache size to 2048 megabytes, update the configuration as follows:
-
- ```
- -m 2048
- ```
- Save the file and restart the Memcached service for the changes to take effect.
-
- ```
- sudo systemctl restart memcached
- ```
-
-3. **Optimize your caching strategy**: Review your caching strategy to ensure that you only cache necessary data and use appropriate expiration times. Reducing the amount of data you cache helps keep memory usage under control.
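-
-   One way to check whether expiration times are doing their job is to look at how old the items in each slab are; a short sketch, assuming a local instance on the default port (`age` is reported in seconds):
-
-   ```
-   echo "stats items" | nc localhost 11211 | grep ':age'
-   ```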
-
-4. **Consider cache sharding or partitioning**: If increasing the cache size or optimizing your caching strategy does not resolve the issue, consider cache sharding or partitioning. This approach runs multiple Memcached instances and divides the data across them, which distributes the load and reduces the memory pressure on each instance.
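-
-   As a rough illustration, you could start a second instance on another port and let your client library hash keys across both servers (most Memcached client libraries accept a list of servers); the user, memory size, and ports below are assumptions, not a recommendation:
-
-   ```
-   memcached -d -u memcached -m 1024 -p 11211
-   memcached -d -u memcached -m 1024 -p 11212
-   ```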
-
-### Useful resources
-
-1. [Memcached Official Documentation](https://memcached.org/)
diff --git a/health/guides/memcached/memcached_out_of_cache_space_time.md b/health/guides/memcached/memcached_out_of_cache_space_time.md
deleted file mode 100644
index 5f546553c..000000000
--- a/health/guides/memcached/memcached_out_of_cache_space_time.md
+++ /dev/null
@@ -1,19 +0,0 @@
-### Understand the alert
-
-This alert indicates that the Memcached cache is running out of space and will likely become full soon, based on the data addition rate over the past hour. If the cache reaches 100% capacity, evictions may occur, resulting in a loss of cached data and decreased performance.
-
-### Troubleshoot the alert
-
-1. **Monitor cache usage**: Use the `stats` command in Memcached to check the current cache usage and the number of evictions. This will help you understand the severity of the issue and whether evictions are already happening.
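-
-   For example, assuming a local instance on the default port:
-
-   ```
-   echo "stats" | nc localhost 11211 | grep -E 'STAT (bytes|limit_maxbytes|evictions) '
-   ```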
-
-2. **Evaluate cache settings**: Review your Memcached configuration file (`/etc/memcached.conf` or `/etc/sysconfig/memcached`) and check the cache size setting (`-m` parameter). Ensure that the cache size is set appropriately based on your system's available memory and workload requirements.
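-
-   A quick way to check the configured limit (the file and format depend on your distribution, so treat these paths as assumptions):
-
-   ```
-   # Debian/Ubuntu style configuration
-   grep -E '^-m' /etc/memcached.conf
-   # RHEL/CentOS style configuration
-   grep CACHESIZE /etc/sysconfig/memcached
-   ```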
-
-3. **Increase cache size**: If the cache is consistently running out of space, consider increasing the cache size by adjusting the `-m` parameter in the Memcached configuration file. Be careful not to allocate too much memory, as this can starve other processes on the system.
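-
-   After raising the limit, restart the service and confirm the new value (a sketch, assuming systemd and a local instance; `limit_maxbytes` is reported in bytes):
-
-   ```
-   sudo systemctl restart memcached
-   echo "stats" | nc localhost 11211 | grep limit_maxbytes
-   ```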
-
-4. **Optimize cache usage**: Analyze the cache usage patterns of your applications and optimize their caching strategies. This may involve adjusting the cache TTL (time-to-live) settings, using different cache eviction policies, or implementing a more efficient caching mechanism.
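-
-   One useful signal is how many items are evicted or expire without ever being read again; if these counters grow quickly, much of what you cache is never reused. A minimal check, assuming a local instance:
-
-   ```
-   echo "stats" | nc localhost 11211 | grep -E '(evicted|expired)_unfetched'
-   ```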
-
-5. **Monitor application performance**: Check the performance of your applications that use Memcached to identify any issues or bottlenecks. If performance is degrading due to cache evictions, consider optimizing the applications or increasing cache capacity.
-
-### Useful resources
-
-1. [Memcached Configuration Options](https://github.com/memcached/memcached/wiki/ConfiguringServer)