PACE: Penalty aware cache modeling with enhanced AET
Document Type
Conference Proceeding
Publication Date
8-27-2018
Abstract
Past cache modeling techniques are typically limited to cache systems with a fixed cache line/block size. This limitation is not a problem for a hardware cache, where the cache line size is uniform, but modern in-memory software caches, such as Memcached and Redis, store variable-size data objects. A software cache also supports update and delete operations in addition to the reads and writes of a hardware cache. Moreover, existing cache models often assume that every cache miss carries an identical penalty, which rarely holds for software caches serving web workloads, so cache management policies that aim only to improve the hit rate are no longer sufficient. We propose a more general cache model that handles variable cache block sizes, nonuniform miss penalties, and diverse cache operations. In this paper, we first extend a state-of-the-art cache model to accurately predict cache miss ratios across cache sizes when object sizes, updates, and deletions are taken into account. We then apply this model to drive cache management when miss penalty is brought into consideration. Our approach delivers better results than a recent penalty-aware cache management scheme, Hyperbolic Caching, especially when the cache budget is tight. Another advantage of our approach is that it provides predictable and controllable cache space allocation, especially when multiple applications share the cache.
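The abstract does not reproduce the enhanced AET model itself, but the classic AET construction it builds on can be sketched: sample reuse times from the trace, derive the survival function P(t) (the fraction of references whose reuse time exceeds t), and find the average eviction time AET(c) at which the running integral of P(t) reaches the cache size c; the predicted miss ratio at size c is then P(AET(c)). The Python sketch below illustrates only that uniform-block baseline; the function and variable names are illustrative, and the paper's extensions for object sizes, updates, deletions, and miss penalties are not shown.

    def miss_ratio_curve(reuse_time_hist, num_refs, max_cache_size):
        """Approximate miss ratios for cache sizes 1..max_cache_size via AET.

        reuse_time_hist[t] is the number of sampled references whose reuse
        time is exactly t. Cold misses never appear in the histogram, so
        they stay in `survivors` at every t, as the AET model requires.
        """
        survivors = num_refs   # references with reuse time > t (t starts at 0)
        filled = 0.0           # running integral of P(t) dt
        t = 0
        mrc = []
        while len(mrc) < max_cache_size:
            p = survivors / num_refs          # survival function P(t)
            if p == 0.0:                      # every reference is reused: no misses
                mrc.extend([0.0] * (max_cache_size - len(mrc)))
                break
            filled += p                       # advance the integral one unit step
            t += 1
            survivors -= reuse_time_hist.get(t, 0)
            # When the integral first reaches c, AET(c) = t, and the
            # predicted miss ratio at cache size c is P(AET(c)).
            while len(mrc) < filled and len(mrc) < max_cache_size:
                mrc.append(survivors / num_refs)
        return mrc

For a toy trace a b a b, the histogram is {2: 2} with num_refs = 4, and the sketch yields miss ratios 1.0, 0.5, 0.5 for cache sizes 1 through 3, matching LRU on that trace (the residual 0.5 is the two cold misses).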
Publication Title
Proceedings of the 9th Asia-Pacific Workshop on Systems, APSys 2018
Recommended Citation
Pan, C., Hu, X., Zhou, L., Luo, Y., Wang, X., & Wang, Z. (2018). PACE: Penalty aware cache modeling with enhanced AET. Proceedings of the 9th Asia-Pacific Workshop on Systems, APSys 2018. http://doi.org/10.1145/3265723.3265736
Retrieved from: https://digitalcommons.mtu.edu/michigantech-p/12580