Discover how yield from makes Python generators cleaner and more powerful, plus a real-world example with FastAPI log streaming.
Python generators are one of the best tools in your toolbox when you want to write efficient, readable, and memory-friendly code. But there's a hidden gem inside Python's generator system that makes things even smoother: the yield from statement.
In this post, we'll walk through:
- What yield from is and how it works.
- Practical, real-world uses of yield from.
Let's get started!
A generator is a function that allows you to return a sequence of values one at a time, using the yield keyword.
Example:
def count_up_to(n):
    i = 1
    while i <= n:
        yield i
        i += 1
This doesn't return a list; it gives back a generator object that you can loop through:
for number in count_up_to(3):
    print(number)
Output:
1
2
3
This kind of lazy evaluation is perfect when working with large files, data streams, or pipelines: you only load what you need, when you need it.
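For instance, here's a minimal sketch of reading a large file lazily with a generator (the filename big.log is just a placeholder):

def read_lines(filepath):
    """Yield one line at a time instead of loading the whole file into memory."""
    with open(filepath, "r") as f:
        for line in f:
            yield line.rstrip("\n")

# Only one line is held in memory at a time.
for line in read_lines("big.log"):
    if "ERROR" in line:
        print(line)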
Now imagine you want to delegate part of your generator logic to another generator. You could do this:
def wrapper():
    for value in count_up_to(3):
        yield value
But Python offers a cleaner, smarter way:
def wrapper():
    yield from count_up_to(3)
Boom. Just one line and it works exactly the same. yield from basically forwards all values from another iterable or generator into your current one.
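And it isn't limited to generators; any iterable works. A quick illustration:

def mixed():
    yield from range(3)            # works with any iterable, not just generators
    yield from "ab"                # strings are iterables of characters
    yield from {"k1": 1, "k2": 2}  # iterating a dict yields its keys

print(list(mixed()))  # [0, 1, 2, 'a', 'b', 'k1', 'k2']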
Now for a recursive example: let's flatten this nested list:
data = [1, [2, 3], [4, [5, 6]], 7]
We want: 1, 2, 3, 4, 5, 6, 7
Here’s how:
def flatten(items):
    for item in items:
        if isinstance(item, list):
            yield from flatten(item)
        else:
            yield item
Usage:
for val in flatten([1, [2, 3], [4, [5, 6]], 7]):
    print(val)
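Output:
1
2
3
4
5
6
7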
yield from makes recursive generators a breeze!
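The same pattern works for any recursive structure. For example, here's a small sketch of an in-order traversal of a binary tree (the Node class is just for illustration, not part of the original example):

class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def in_order(node):
    """Yield the values of a binary tree in sorted (in-order) sequence."""
    if node is None:
        return
    yield from in_order(node.left)
    yield node.value
    yield from in_order(node.right)

tree = Node(2, Node(1), Node(3))
print(list(in_order(tree)))  # [1, 2, 3]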
Imagine you’re pulling data from different sources:
def fruits():
    yield from ["apple", "banana"]

def veggies():
    yield from ["carrot", "daikon"]

def everything():
    yield from fruits()
    yield from veggies()
everything() will yield all items from both functions, without any manual loops. This makes your code super modular.
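For example:

print(list(everything()))  # ['apple', 'banana', 'carrot', 'daikon']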
Let’s now take what we’ve learned and apply it to a real-world backend use case: serving live logs via an HTTP endpoint.
In backend systems, you often need to stream logs to the frontend, for example for live debugging or monitoring. But log files can be huge, and loading them all at once is a bad idea.
That's where generators (and yield from) come in. First, install FastAPI and Uvicorn:

pip install fastapi uvicorn

Then create the app:
# main.py
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
import time
import os

app = FastAPI()

def tail_log_file(filepath):
    """A generator that yields new lines from a log file, like `tail -f`."""
    with open(filepath, "r") as f:
        # Move to the end of the file
        f.seek(0, os.SEEK_END)
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.5)  # Wait for new lines
                continue
            yield line

def log_streamer():
    """Wrapper that could combine multiple sources using `yield from`."""
    # In the future, you could yield from other sources here.
    yield from tail_log_file("app.log")

@app.get("/logs")
def stream_logs():
    return StreamingResponse(log_streamer(), media_type="text/plain")
Run the server:

uvicorn main:app --reload

Then visit http://localhost:8000/logs and watch the logs stream in real time!
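To see something appear, you can append lines to app.log from a second terminal. Here's a tiny hypothetical helper script (the filename and interval are arbitrary):

# write_logs.py -- appends a timestamped line every second, just for testing
import time

while True:
    with open("app.log", "a") as f:
        f.write(f"log entry at {time.strftime('%H:%M:%S')}\n")
    time.sleep(1)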
You might just stream from one source today, but tomorrow you might want to stream logs from multiple files, or from an external process (for example via subprocess.Popen). With yield from, you can combine and delegate streams without breaking your structure.
Example:
def log_streamer():
    yield from tail_log_file("app.log")
    yield from tail_log_file("error.log")
This modular design makes it easy to grow your app in the future.
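One caveat: since tail_log_file loops forever, the second yield from above only gets a turn if the first one ever finishes. If you want to follow several live files at once, one option is a small round-robin merge. A rough sketch (follow and merge_logs are hypothetical helpers, not part of the example above):

import os
import time

def follow(filepath):
    """Like tail_log_file, but yields None when a file has no new line
    instead of blocking, so several files can be polled together."""
    with open(filepath, "r") as f:
        f.seek(0, os.SEEK_END)
        while True:
            yield f.readline() or None

def merge_logs(*filepaths):
    """Round-robin over several live log files."""
    sources = [follow(path) for path in filepaths]
    while True:
        got_line = False
        for source in sources:
            line = next(source)
            if line:
                got_line = True
                yield line
        if not got_line:
            time.sleep(0.5)  # every file is quiet, back off briefly

def log_streamer():
    yield from merge_logs("app.log", "error.log")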
Python's yield from is more than a syntax shortcut: it's a powerful tool for writing clean, modular, and memory-efficient generators.
We saw how it helps simplify:
- delegating to a sub-generator instead of looping manually,
- recursive generators, like flattening nested lists,
- combining multiple data sources into one stream.
And with frameworks like FastAPI, you can use it to build elegant, high-performance data streaming endpoints with just a few lines of code.