FastAPI Async Support
Introduction
FastAPI is built from the ground up with asynchronous programming in mind, making it one of the fastest Python web frameworks available. Asynchronous programming allows your application to handle multiple operations concurrently without blocking the execution flow, which is particularly useful for I/O-bound operations like database queries, API calls, or file operations.
In this tutorial, we'll explore how FastAPI integrates with Python's native `async`/`await` syntax and how you can leverage these capabilities to build high-performance APIs that handle many concurrent requests efficiently.
Understanding Asynchronous Programming
Before diving into FastAPI's async features, let's briefly understand what asynchronous programming is:
- Synchronous (blocking) code executes line by line and waits for each operation to complete before moving to the next line
- Asynchronous (non-blocking) code can pause execution of one task to work on another while waiting for I/O operations to complete
Python's `async`/`await` syntax provides a way to write asynchronous code that reads much like synchronous code, making it easier to understand and maintain.
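To make this concrete, here's a minimal sketch outside FastAPI (the coroutine names are illustrative, not part of any framework) showing two simulated I/O waits overlapping on the event loop:

```python
import asyncio

async def make_coffee():
    # While the "kettle boils" (simulated I/O wait), control is handed
    # back to the event loop so other tasks can run.
    await asyncio.sleep(0.1)
    return "coffee"

async def make_toast():
    await asyncio.sleep(0.1)
    return "toast"

async def main():
    # Both waits overlap, so the total time is ~0.1s, not ~0.2s
    return await asyncio.gather(make_coffee(), make_toast())

results = asyncio.run(main())
print(results)  # ['coffee', 'toast']
```

`asyncio.gather` returns the results in the order the coroutines were passed in, regardless of which one finished first.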
Basic Async Route in FastAPI
Creating an asynchronous endpoint in FastAPI is as simple as adding the `async` keyword to your route function:

```python
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
async def read_root():
    return {"message": "Hello World"}
```
This simple change allows FastAPI to handle this route asynchronously, meaning it can process other requests while waiting for any I/O operations in this route.
When to Use Async Routes
Not all routes need to be asynchronous. Here's when you should consider using `async` routes:
Use `async def` when:
- Making network calls (API requests, database queries)
- Performing file I/O operations
- Waiting for external processes
- Needing to handle many concurrent requests efficiently
Use regular functions when:
- Performing CPU-bound operations
- Using libraries that don't support asynchronous operations
- Not performing I/O operations
Note that FastAPI runs plain `def` path operations in an external threadpool, so a blocking call there won't stall the event loop, though each one still occupies a worker thread.
Practical Example: Async vs. Sync Performance
Let's create an example that demonstrates the performance difference between synchronous and asynchronous routes:
```python
import asyncio
import time

from fastapi import FastAPI

app = FastAPI()

# Synchronous endpoint
@app.get("/sync")
def sync_operation():
    # Simulate an I/O operation (e.g., a database query)
    time.sleep(1)
    return {"operation": "sync", "completed": True}

# Asynchronous endpoint
@app.get("/async")
async def async_operation():
    # Simulate the same I/O operation asynchronously
    await asyncio.sleep(1)
    return {"operation": "async", "completed": True}
```
If you make a single request to either endpoint, they'll both take about 1 second to respond. However, the difference becomes apparent when handling multiple concurrent requests:
- The synchronous endpoint ties up a worker thread for the full second of each request; once the threadpool is saturated, additional requests queue up and are served one after another
- The asynchronous endpoint can interleave many requests on a single event loop, so all of them finish in just over 1 second total
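You can observe this effect without even starting a server. The sketch below uses shorter 0.1-second sleeps to stand in for the 1-second I/O waits, timing five sequential blocking sleeps against five concurrent awaited ones:

```python
import asyncio
import time

def run_sync(n: int, delay: float) -> float:
    # Each simulated request blocks until the previous one finishes
    start = time.perf_counter()
    for _ in range(n):
        time.sleep(delay)
    return time.perf_counter() - start

async def run_async(n: int, delay: float) -> float:
    # All simulated requests wait concurrently on the event loop
    start = time.perf_counter()
    await asyncio.gather(*(asyncio.sleep(delay) for _ in range(n)))
    return time.perf_counter() - start

sync_elapsed = run_sync(5, 0.1)                  # roughly 0.5s: 5 x 0.1s, back to back
async_elapsed = asyncio.run(run_async(5, 0.1))   # roughly 0.1s: all waits overlap
print(f"sync: {sync_elapsed:.2f}s, async: {async_elapsed:.2f}s")
```

The gap widens as the number of concurrent "requests" grows: sequential time scales linearly, while the concurrent version stays close to the duration of a single wait.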
Working with Async Libraries
To fully leverage FastAPI's async capabilities, you'll need to use async-compatible libraries for your I/O operations. Here are some popular options:
- Databases: SQLAlchemy 1.4+ (with `AsyncSession`), asyncpg, Motor (for MongoDB)
- HTTP requests: httpx, aiohttp
- File operations: aiofiles
Let's see an example using `httpx` to make asynchronous HTTP requests:
```python
import asyncio

import httpx
from fastapi import FastAPI

app = FastAPI()

@app.get("/fetch-data")
async def fetch_data():
    async with httpx.AsyncClient() as client:
        # Start both requests without awaiting them individually...
        request1 = client.get("https://jsonplaceholder.typicode.com/todos/1")
        request2 = client.get("https://jsonplaceholder.typicode.com/todos/2")
        # ...then await both concurrently
        results = await asyncio.gather(request1, request2)
        return {
            "todo1": results[0].json(),
            "todo2": results[1].json(),
        }
```
Advanced Example: Background Tasks with Async Support
FastAPI allows you to run background tasks asynchronously. This is useful when you need to perform operations after returning a response to the client:
```python
import asyncio

from fastapi import BackgroundTasks, FastAPI

app = FastAPI()

async def process_data_async(data: str):
    # This would be some time-consuming process
    await asyncio.sleep(5)
    # A quick synchronous write; for heavy file I/O consider aiofiles
    with open("log.txt", "a") as f:
        f.write(f"Processed data: {data}\n")

@app.post("/submit")
async def submit(background_tasks: BackgroundTasks, data: str = "default"):
    # Schedule the task to run after the response is sent
    background_tasks.add_task(process_data_async, data)
    # Return immediately while the background task runs
    return {"message": "Data processing started in the background"}
```
Dependency Injection with Async Support
FastAPI's dependency injection system fully supports asynchronous dependencies:
```python
import asyncio

from fastapi import Depends, FastAPI

app = FastAPI()

async def get_user_from_db(user_id: int):
    # In a real app, this would query a database
    await asyncio.sleep(0.1)  # Simulate a db query
    return {"id": user_id, "name": f"User {user_id}"}

@app.get("/users/{user_id}")
async def read_user(user: dict = Depends(get_user_from_db)):
    return user
```
Common Pitfalls and Best Practices
Mixing Sync and Async Code
Be careful when mixing synchronous and asynchronous code. If you call synchronous, blocking functions from an async route, you'll lose the benefits of asynchronous execution:
```python
import asyncio
import time

from fastapi import FastAPI

app = FastAPI()

@app.get("/bad-practice")
async def bad_practice():
    # This blocks the entire event loop!
    time.sleep(1)  # ❌ Don't do this in an async function
    return {"message": "This defeats the purpose of async"}

@app.get("/good-practice")
async def good_practice():
    # This lets other requests be processed while waiting
    await asyncio.sleep(1)  # ✓ Do this instead
    return {"message": "This properly uses async"}
```
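If you're stuck with a blocking call from a library that has no async support, one option (assuming Python 3.9+) is `asyncio.to_thread`, which moves the call to a worker thread so the event loop stays responsive. Here `blocking_io` is just a stand-in for such a library call:

```python
import asyncio
import time

def blocking_io() -> str:
    # Stand-in for a synchronous library call you can't make async
    time.sleep(0.1)
    return "done"

async def wrapped() -> str:
    # Runs the blocking call in a worker thread; the event loop
    # remains free to serve other requests in the meantime
    return await asyncio.to_thread(blocking_io)

result = asyncio.run(wrapped())
print(result)  # done
```

On older Python versions, `loop.run_in_executor(None, blocking_io)` achieves the same thing with the default thread pool.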
CPU-Bound Tasks
For CPU-bound tasks in async routes, consider offloading the work to a separate process:
```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

from fastapi import FastAPI

app = FastAPI()
process_pool = ProcessPoolExecutor()

def cpu_bound_task(x: int) -> int:
    # Simulate CPU-intensive work
    result = 0
    for i in range(10_000_000):
        result += i
    return result + x

@app.get("/cpu-task/{number}")
async def handle_cpu_task(number: int):
    # Run the CPU-bound task in a separate process so it
    # doesn't block the event loop
    loop = asyncio.get_running_loop()
    result = await loop.run_in_executor(process_pool, cpu_bound_task, number)
    return {"result": result}
```
Practical Real-World Example
Let's create a more comprehensive example that simulates a real-world scenario where async operations provide significant benefits:
```python
import asyncio
import time

import httpx
from fastapi import FastAPI

app = FastAPI(title="Weather Aggregator API")

# These would typically be environment variables
WEATHER_API_KEY = "demo_key"
CITIES = ["London", "New York", "Tokyo", "Sydney", "Paris"]

async def fetch_weather(city: str, client: httpx.AsyncClient):
    """Fetch weather data for a specific city"""
    try:
        response = await client.get(
            "https://api.weatherapi.com/v1/current.json",
            params={"key": WEATHER_API_KEY, "q": city},
        )
        response.raise_for_status()
        data = response.json()
        return {
            "city": city,
            "temperature": data["current"]["temp_c"],
            "condition": data["current"]["condition"]["text"],
        }
    except Exception as e:
        return {"city": city, "error": str(e)}

@app.get("/weather/sync")
def get_weather_sync():
    """Get weather for multiple cities synchronously"""
    start_time = time.time()
    results = []
    # Synchronous implementation: one request at a time
    with httpx.Client() as client:
        for city in CITIES:
            try:
                response = client.get(
                    "https://api.weatherapi.com/v1/current.json",
                    params={"key": WEATHER_API_KEY, "q": city},
                )
                response.raise_for_status()
                data = response.json()
                results.append({
                    "city": city,
                    "temperature": data["current"]["temp_c"],
                    "condition": data["current"]["condition"]["text"],
                })
            except Exception as e:
                results.append({"city": city, "error": str(e)})
    elapsed = time.time() - start_time
    return {"elapsed_seconds": elapsed, "weather": results}

@app.get("/weather/async")
async def get_weather_async():
    """Get weather for multiple cities asynchronously"""
    start_time = time.time()
    # Asynchronous implementation: all requests in flight at once
    async with httpx.AsyncClient() as client:
        tasks = [fetch_weather(city, client) for city in CITIES]
        results = await asyncio.gather(*tasks)
    elapsed = time.time() - start_time
    return {"elapsed_seconds": elapsed, "weather": results}
```
In this example, the async version would complete significantly faster than the synchronous version when fetching weather data for multiple cities, especially as the number of cities increases.
Summary
FastAPI's async support provides a powerful way to build high-performance web APIs:
- Use `async def` for routes that perform I/O-bound operations
- Pair FastAPI with async-compatible libraries for databases, HTTP requests, etc.
- Leverage background tasks for operations that don't need to block the response
- Be careful not to block the event loop with synchronous operations in async routes
- Use separate processes for CPU-bound tasks
By understanding when and how to use FastAPI's asynchronous features, you can create applications that efficiently handle large numbers of concurrent requests, providing a better experience for your users.
Additional Resources
- FastAPI Official Documentation on Async
- Python's asyncio Documentation
- HTTPX - Async HTTP Client
- Database: SQLAlchemy Async
Exercises
- Create an async FastAPI endpoint that fetches data from multiple public APIs concurrently and aggregates the results
- Implement an endpoint that performs a simulated database query using `asyncio.sleep()` and returns progressively streamed responses
- Compare the performance of synchronous vs. asynchronous routes by creating a simple benchmark test with multiple concurrent requests
- Build a FastAPI application that uses background tasks to process uploaded files asynchronously