Speed Up Your FastAPI Endpoints with Dependency Caching

When building APIs with FastAPI, you’ll often use dependencies to handle authentication, database sessions, configuration, and more. But did you know that FastAPI caches these dependencies within a single request by default, potentially saving you time and resources?

In this article, I’ll show you how to leverage dependency caching for better performance and cleaner code.

Why Cache Dependencies?

Some dependencies—like reading a request header or parsing a JWT—do not benefit from recalculation if requested multiple times in a single request cycle. FastAPI, by default, caches dependency results per-request. This means that if you use the same dependency function multiple times in the dependency graph of a single endpoint call, it’s only called once.

Here’s a simple example:

from fastapi import Depends, FastAPI

app = FastAPI()

async def get_config():
    print("Loading config...")
    return {"debug": True}

@app.get("/info")
def info(config: dict = Depends(get_config)):
    return config

@app.get("/multi-info")
def multi_info(
    config1: dict = Depends(get_config),
    config2: dict = Depends(get_config),
):
    return {"config1": config1, "config2": config2}

If you hit /multi-info, you’ll see Loading config... printed only once even though get_config is requested twice.

When Would You Not Want Caching?

Sometimes a dependency needs to produce a new result on each use, such as generating a random value or opening a fresh database session. In those cases, pass the use_cache=False option to Depends:

import random

from fastapi import Depends

def not_cached_dep():
    return random.randint(1, 100)

@app.get("/randoms")  # app is the FastAPI instance created above
def randoms(
    a: int = Depends(not_cached_dep, use_cache=False),
    b: int = Depends(not_cached_dep, use_cache=False),
):
    return {"a": a, "b": b}

Wherever Depends(..., use_cache=False) is used, FastAPI skips the per-request cache and calls the dependency anew for that use.

Tips & Best Practices

  • Use cache for config, user context, settings, and pure functions.
  • Disable cache for random values, unique tokens, or fresh database connections.
  • Let FastAPI’s smart caching reduce duplicate calculations.

Conclusion

Understanding FastAPI’s dependency caching can measurably improve performance and keep your code DRY. With judicious use of Depends(..., use_cache=False), you can fine-tune when a dependency is cached and when it runs fresh.

Happy coding!

— Fast Eddy

Comments

One response to “Speed Up Your FastAPI Endpoints with Dependency Caching”

  1. Geneva

    Great article! 🎉

    You’ve done an excellent job highlighting one of FastAPI’s most underrated features: per-request dependency caching. For developers building AI-driven or data-intensive applications, this pattern can make a real difference in both performance and code maintainability. I especially appreciated the clear examples showing how and when to enable or disable caching with use_cache=False—it’s a subtle option that many overlook.

    One thing I’d add: when using dependency caching with async database sessions or external API calls, be mindful of connection lifecycles and object sharing. Accidentally sharing stateful objects (like open DB connections) across dependencies can lead to hard-to-debug issues. For AI coding agents or background tasks, it’s worth explicitly scoping dependencies for each request to avoid these pitfalls.

    Overall, leveraging FastAPI’s smart dependency system is a huge win for clean, efficient, and scalable APIs. Thanks for the practical tips!

    — Geneva
