SLAP: Segmented Reuse-Time-Label Based Admission Policy for Content Delivery Network Caching
ACM Transactions on Architecture and Code Optimization (IF 1.6), Pub Date: 2024-03-23, DOI: 10.1145/3646550
Ke Liu, Kan Wu, Hua Wang, Ke Zhou, Peng Wang, Ji Zhang, Cong Li

"Learned" admission policies have shown promise in improving Content Delivery Network (CDN) cache performance and lowering operational costs. Unfortunately, existing learned policies are optimized for a few fixed cache sizes, while in reality cache sizes often vary over time in an unpredictable manner. As a result, existing solutions cannot provide consistent benefits in production settings.

We present SLAP, a learned CDN cache admission approach based on segmented object reuse-time prediction. SLAP predicts an object's reuse-time range using a Long Short-Term Memory (LSTM) model and admits objects that will be reused (before eviction) given the current cache size. SLAP decouples model training from cache size, allowing it to adapt to arbitrary sizes. The key to our solution is a novel segmented labeling scheme that makes SLAP effective without requiring precise prediction of object reuse times. To make SLAP practical and efficient, we optimize model training through aggressive reuse of computation and training on sampled traces, and optimize model inference with a specialized predictor architecture that overlaps prediction computation with the fetching of missed objects. Our experiments using production CDN traces show that SLAP achieves significantly lower write traffic (38%-59%), longer SSD lifetimes (104%-178%), and a consistently higher hit rate (3.2%-11.7%), all while requiring no effort to adapt to changing cache sizes, outperforming existing policies.
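The core idea of segmented labeling can be illustrated with a minimal sketch. The segment boundaries, the label encoding, and the notion of an "eviction horizon" derived from the current cache size are all assumptions for illustration; the paper's actual segmentation and admission logic are not specified in this abstract.

```python
import bisect

# Hypothetical log-scale segment boundaries (in seconds) for reuse-time labels.
# The model is trained to predict a coarse segment, not an exact reuse time.
SEGMENT_BOUNDARIES = [60, 600, 3600, 6 * 3600, 24 * 3600]  # 1 min ... 1 day


def reuse_time_label(reuse_time_s: float) -> int:
    """Map an exact reuse time to a coarse segment label in 0..len(boundaries).
    Labels past the last boundary mean 'reused very late or never'."""
    return bisect.bisect_left(SEGMENT_BOUNDARIES, reuse_time_s)


def admit(predicted_label: int, eviction_horizon_s: float) -> bool:
    """Admit an object only if its predicted reuse-time segment ends before
    the cache's current eviction horizon (roughly, how long an admitted
    object survives before eviction at the current cache size). Because the
    horizon is computed at decision time, the same trained model can serve
    arbitrary and changing cache sizes."""
    if predicted_label >= len(SEGMENT_BOUNDARIES):
        return False  # reuse is too far away (or never): skip admission
    return SEGMENT_BOUNDARIES[predicted_label] <= eviction_horizon_s
```

A larger cache implies a longer eviction horizon, so more segments qualify for admission; shrinking the cache simply tightens the horizon with no retraining.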




Updated: 2024-03-24