We recently open-sourced a better Python cache. It works like Python’s lru_cache, but it stores values in files instead of in memory. It has saved us hours of runtime on our LLM benchmarks, and we’d like to share it as a standalone Python module.
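The core idea can be sketched in a few lines. This is a minimal illustration, not the library’s actual implementation: a decorator (here hypothetically named `file_cache`) that hashes a function’s arguments into a filename and pickles the result to disk, so repeated calls with the same arguments skip the slow work.

```python
import functools
import hashlib
import os
import pickle

def file_cache(cache_dir=".cache"):
    """Memoize a function's results to disk, keyed on its arguments."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            os.makedirs(cache_dir, exist_ok=True)
            # Build a stable cache key from the function name and arguments.
            key = hashlib.sha256(
                pickle.dumps((func.__name__, args, sorted(kwargs.items())))
            ).hexdigest()
            path = os.path.join(cache_dir, f"{key}.pkl")
            # Cache hit: read the pickled result back from disk.
            if os.path.exists(path):
                with open(path, "rb") as f:
                    return pickle.load(f)
            # Cache miss: run the function and persist the result.
            result = func(*args, **kwargs)
            with open(path, "wb") as f:
                pickle.dump(result, f)
            return result
        return wrapper
    return decorator

@file_cache()
def slow_square(x):
    # Imagine an expensive LLM call or benchmark step here.
    return x * x
```

Because the cache lives on disk rather than in memory, results survive across process restarts, which is exactly what makes it useful for long benchmark runs.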
Thanks to Lucas Jaggernauth (former Sweep engineer) for building the initial version of this!
Here’s the blog: A better Python cache for slow function calls.