Both generators and iterators in Python are used to traverse data lazily, producing items one at a time instead of storing them in memory.
- Iterator: Any object that implements `__iter__()` and `__next__()`. You can traverse its elements manually with `next()` or with a loop.
- Generator: A simpler way to create an iterator, using a function with `yield`. It implements `__iter__()` and `__next__()` automatically.
Generators are essentially a subset of iterators but provide a cleaner, more memory-efficient syntax.
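Since every generator is itself an iterator, this relationship is easy to verify at runtime. A minimal check using `collections.abc.Iterator`:

```python
from collections.abc import Iterator

def gen():
    yield 1

g = gen()
# A generator object satisfies the iterator protocol automatically
print(isinstance(g, Iterator))  # True
print(iter(g) is g)             # True: __iter__() returns the generator itself
```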
Comparison Table: Generators vs Iterators
| Feature | Iterator | Generator |
|---|---|---|
| Creation | Use class with __iter__() and __next__() | Use function with yield |
| Memory | Can consume more memory for large datasets | Memory-efficient, generates items on-the-fly |
| Syntax | Verbose, needs custom class | Concise, simple function |
| Use Case | Traversing existing sequences | Lazy computation, streaming, infinite sequences |
| Reusability | A class-based iterator can be recreated, or implement its own reset logic | Exhausted after one pass; call the function again for a fresh generator |
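The reusability row can be demonstrated directly: once a generator is exhausted, iterating it again yields nothing, and a fresh generator must be created. A small sketch using the `count_gen` function from Example 2 below:

```python
def count_gen(n):
    i = 1
    while i <= n:
        yield i
        i += 1

g = count_gen(3)
print(list(g))           # [1, 2, 3]
print(list(g))           # []  -- exhausted; the generator cannot be restarted
print(list(count_gen(3)))  # [1, 2, 3]  -- a new generator starts over
```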
Example 1: Using an Iterator
```python
class Count:
    def __init__(self, n):
        self.i = 1
        self.n = n

    def __iter__(self):
        return self

    def __next__(self):
        if self.i > self.n:
            raise StopIteration
        val = self.i
        self.i += 1
        return val

counter = Count(5)
for num in counter:
    print(num)
```
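The for-loop above drives the iterator protocol implicitly; the same traversal can be done by hand with `iter()` and `next()`. A minimal sketch (repeating the `Count` class so the snippet is self-contained):

```python
class Count:
    def __init__(self, n):
        self.i = 1
        self.n = n

    def __iter__(self):
        return self

    def __next__(self):
        if self.i > self.n:
            raise StopIteration
        val = self.i
        self.i += 1
        return val

it = iter(Count(3))   # __iter__() returns the object itself
print(next(it))  # 1
print(next(it))  # 2
print(next(it))  # 3
# A fourth next(it) raises StopIteration, which for-loops catch automatically
```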
Example 2: Using a Generator
```python
def count_gen(n):
    i = 1
    while i <= n:
        yield i
        i += 1

for num in count_gen(5):
    print(num)
```
- Both examples produce the same output: 1, 2, 3, 4, 5
- The generator is simpler and more memory-efficient
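A third option, not shown above, is a generator expression, which builds the same kind of lazy iterator in a single line. A small illustrative sketch:

```python
# Generator expression: a lazy sequence of squares, computed on demand
squares = (i * i for i in range(1, 6))
print(next(squares))  # 1
print(list(squares))  # [4, 9, 16, 25] -- the remaining items
```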
Example 3: Real-World Scenario – Large Data Processing
Using Iterator:
```python
class FileIterator:
    def __init__(self, filename):
        self.file = open(filename)

    def __iter__(self):
        return self

    def __next__(self):
        line = self.file.readline()
        if not line:
            self.file.close()  # only runs if iteration reaches the end
            raise StopIteration
        return line.strip()

for line in FileIterator("large_file.txt"):
    print(line)
```
Using Generator:
```python
def file_generator(filename):
    with open(filename) as f:
        for line in f:
            yield line.strip()

for line in file_generator("large_file.txt"):
    print(line)
```
- The generator version is cleaner and more concise; the `with` statement closes the file once the generator is exhausted (or garbage-collected)
Advantages of Generators Over Iterators
- Less code, easier to read
- Automatically implements iterator protocol
- Memory-efficient for large or infinite datasets
- Handles lazy evaluation gracefully
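The "infinite datasets" point deserves a concrete sketch: an infinite generator costs almost no memory, because values are produced only on demand. Here `itertools.islice` takes a finite prefix:

```python
import itertools

def naturals():
    """Infinite generator: yields 1, 2, 3, ... lazily."""
    n = 1
    while True:
        yield n
        n += 1

# Take only the first five values; the full sequence never materializes
print(list(itertools.islice(naturals(), 5)))  # [1, 2, 3, 4, 5]
```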
Best Practices
✔ Use generators for large datasets or streaming data
✔ Use iterators for custom traversal logic requiring state
✔ Avoid overcomplicating generators; keep them simple and modular
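One way to keep generators simple and modular, as suggested above, is to chain small single-purpose generators into a pipeline. A hypothetical sketch (`stripped` and `non_empty` are illustrative names, not from the examples above):

```python
def stripped(lines):
    # Stage 1: strip whitespace from each line, lazily
    for line in lines:
        yield line.strip()

def non_empty(lines):
    # Stage 2: drop blank lines, lazily
    for line in lines:
        if line:
            yield line

data = ["  alpha  ", "", "beta\n", "   "]
pipeline = non_empty(stripped(data))  # nothing runs until iteration starts
print(list(pipeline))  # ['alpha', 'beta']
```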
Conclusion
Generators and iterators are essential tools for lazy evaluation in Python. Generators simplify code, save memory, and handle infinite sequences, while iterators provide more control and flexibility for custom traversals.