Handling Async and Generator Callables
The dependency injection system in this codebase relies on the Dependant class to introspect callables and determine the appropriate execution strategy. By distinguishing between standard functions, coroutines, and generators, the system can manage complex lifecycles, such as database connections or file handles, ensuring resources are properly cleaned up after a request is processed.
The Dependant Metadata Store
The Dependant class (found in fastapi/dependencies/models.py) serves as the central repository for metadata about a dependency. It uses several cached properties to categorize the dependency's call attribute:
- is_coroutine_callable: Identifies standard async def functions.
- is_gen_callable: Identifies synchronous generator functions that use yield.
- is_async_gen_callable: Identifies asynchronous generator functions (async def functions that use yield).
The implementation of these properties is robust, accounting for various Python callable types. It checks the callable itself, unwraps decorators using _unwrapped_call, and even inspects the __call__ method for class-based dependencies.
@cached_property
def is_gen_callable(self) -> bool:
    if self.call is None:
        return False
    if inspect.isgeneratorfunction(
        _impartial(self.call)
    ) or inspect.isgeneratorfunction(_unwrapped_call(self.call)):
        return True
    # ... checks for __call__ on instances ...
Execution Strategies in Dependency Resolution
When solve_dependencies (in fastapi/dependencies/utils.py) processes a dependency graph, it uses these flags to decide how to invoke each callable.
Coroutines and Standard Functions
If a dependency is a coroutine (is_coroutine_callable), it is simply awaited. If it is a standard synchronous function (and not a generator), it is executed in a threadpool using run_in_threadpool to prevent blocking the main event loop.
# From fastapi/dependencies/utils.py
elif use_sub_dependant.is_coroutine_callable:
    solved = await call(**solved_result.values)
else:
    solved = await run_in_threadpool(call, **solved_result.values)
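A hedged sketch of this dispatch logic in isolation, using asyncio.to_thread from the standard library as a stand-in for run_in_threadpool (the invoke name is illustrative, not the codebase's API):

```python
import asyncio
import inspect


async def invoke(call, **values):
    # Coroutines are awaited directly; plain sync functions are pushed
    # to a worker thread so they cannot block the event loop.
    if inspect.iscoroutinefunction(call):
        return await call(**values)
    return await asyncio.to_thread(call, **values)


async def async_dep(x: int) -> int:
    return x * 2


def blocking_dep(x: int) -> int:
    # Imagine a blocking call here, e.g. a synchronous DB driver.
    return x + 1


async def main():
    print(await invoke(async_dep, x=21))     # 42
    print(await invoke(blocking_dep, x=41))  # 42


asyncio.run(main())
```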
Generators and Resource Lifecycle
Generators are treated differently because they represent a multi-stage lifecycle: code before the yield is the setup, and code after the yield is the teardown. The system uses _solve_generator to wrap these callables into an AsyncExitStack.
- Async Generators: Wrapped using contextlib.asynccontextmanager.
- Sync Generators: Wrapped using contextlib.contextmanager and then adapted for async execution via contextmanager_in_threadpool.
# From fastapi/dependencies/utils.py
async def _solve_generator(
    *, dependant: Dependant, stack: AsyncExitStack, sub_values: dict[str, Any]
) -> Any:
    if dependant.is_async_gen_callable:
        cm = asynccontextmanager(dependant.call)(**sub_values)
    elif dependant.is_gen_callable:
        cm = contextmanager_in_threadpool(contextmanager(dependant.call)(**sub_values))
    return await stack.enter_async_context(cm)
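The effect of this wrapping can be shown end to end with only the standard library: an async generator wrapped by asynccontextmanager and entered on an AsyncExitStack runs its setup immediately, but defers teardown until the stack closes. The db_session name below is purely illustrative:

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager

events = []


async def db_session():
    # Code before yield is setup; code after yield is teardown.
    events.append("setup")
    yield "session"
    events.append("teardown")


async def handle_request():
    async with AsyncExitStack() as stack:
        cm = asynccontextmanager(db_session)()
        session = await stack.enter_async_context(cm)
        events.append(f"use:{session}")
    # Leaving the stack resumes the generator past its yield.


asyncio.run(handle_request())
print(events)  # ['setup', 'use:session', 'teardown']
```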
Threading and Concurrency for Sync Generators
A key design choice is how synchronous generators are handled. Since they might perform blocking I/O (like a synchronous database driver), they are executed in a threadpool. The contextmanager_in_threadpool function in fastapi/concurrency.py ensures that both the __enter__ and __exit__ phases of the context manager run in a separate thread, while providing an async interface to the rest of the system.
To prevent deadlocks in scenarios where the context manager itself manages a pool (e.g., a DB connection pool), the __exit__ phase is executed with a dedicated CapacityLimiter.
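A minimal sketch of the contextmanager_in_threadpool idea using only the standard library: both phases of a sync context manager run in worker threads via asyncio.to_thread, behind an async interface. This omits the real implementation's dedicated CapacityLimiter for the exit phase, and the cm_in_threadpool name is illustrative:

```python
import asyncio
from contextlib import asynccontextmanager, contextmanager

log = []


@contextmanager
def sync_resource():
    log.append("enter")  # this phase may block (e.g. acquiring a connection)
    try:
        yield "conn"
    finally:
        log.append("exit")  # this phase may block too (releasing it)


@asynccontextmanager
async def cm_in_threadpool(cm):
    # __enter__ runs in a worker thread so blocking setup is off the loop.
    value = await asyncio.to_thread(cm.__enter__)
    try:
        yield value
    except Exception as exc:
        # Forward the exception to __exit__ in a thread; re-raise unless
        # the context manager suppresses it by returning a truthy value.
        if not await asyncio.to_thread(
            cm.__exit__, type(exc), exc, exc.__traceback__
        ):
            raise
    else:
        await asyncio.to_thread(cm.__exit__, None, None, None)


async def main():
    async with cm_in_threadpool(sync_resource()) as conn:
        log.append(f"use:{conn}")


asyncio.run(main())
print(log)  # ['enter', 'use:conn', 'exit']
```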
Scope and Constraints
The presence of a generator influences the dependency's scope. In Dependant.computed_scope, any callable identified as a generator (sync or async) defaults to a "request" scope. This is because the teardown logic (the code after yield) must wait until the request is finished to execute.
@cached_property
def computed_scope(self) -> str | None:
    if self.scope:
        return self.scope
    if self.is_gen_callable or self.is_async_gen_callable:
        return "request"
    return None
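The same precedence can be exercised as a plain function; this standalone sketch (names here are illustrative, not the codebase's API) reproduces the three branches: an explicit scope wins, generator dependencies default to "request", and everything else gets no scope:

```python
import inspect


def computed_scope(call, explicit_scope=None):
    # Illustrative reimplementation of the precedence described above.
    if explicit_scope:
        return explicit_scope
    if inspect.isgeneratorfunction(call) or inspect.isasyncgenfunction(call):
        return "request"
    return None


def plain_dep():
    return 1


def gen_dep():
    yield 1


print(computed_scope(plain_dep))        # None
print(computed_scope(gen_dep))          # request
print(computed_scope(gen_dep, "app"))   # app (explicit scope wins)
```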
This design ensures that developers can use simple yield statements for resource management without worrying about the underlying async machinery, while the system guarantees that cleanup code runs even if exceptions occur during the request.
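The cleanup-on-exception guarantee follows from how exceptions propagate back into the generator: when the request fails, the exit stack throws the exception into the generator at its yield point, so a try/finally around the yield still runs. A minimal standard-library demonstration:

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager

cleaned_up = False


async def resource():
    try:
        yield "handle"
    finally:
        # Runs even when the request raises mid-flight, because the
        # exception is thrown into the generator at the yield point.
        global cleaned_up
        cleaned_up = True


async def failing_request():
    async with AsyncExitStack() as stack:
        await stack.enter_async_context(asynccontextmanager(resource)())
        raise RuntimeError("handler failed")


try:
    asyncio.run(failing_request())
except RuntimeError:
    pass

print(cleaned_up)  # True
```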