Python Generators vs Iterators: What's the Difference? (With Examples)

Both generators and iterators in Python are used to traverse data lazily, producing items one at a time instead of building the entire sequence in memory at once.

  • Iterator: Any object that implements __iter__() and __next__(). You can traverse elements manually or with loops.
  • Generator: A simpler way to create an iterator using a function with yield. It automatically implements __iter__() and __next__().

Every generator is an iterator, but generators give you a cleaner way to write one, without the boilerplate of a hand-written class.
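
As a quick check of that relationship, the sketch below (using a made-up squares generator) shows that a generator object already satisfies the iterator protocol: iter() returns the generator itself and next() advances it.

def squares(n):
    for i in range(n):
        yield i * i

gen = squares(3)
print(iter(gen) is gen)  # True -- a generator is its own iterator
print(next(gen))         # 0 -- generators respond to next() directly
print(next(gen))         # 1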


Comparison Table: Generators vs Iterators

Feature     | Iterator                                                                 | Generator
Creation    | Class that implements __iter__() and __next__()                          | Function that uses yield
Memory      | Lazy only if written that way; the instance holds its own state          | Lazy by design; items are produced on demand
Syntax      | Verbose; needs a custom class                                            | Concise; an ordinary function
Use case    | Custom traversal of existing sequences                                   | Lazy computation, streaming, infinite sequences
Reusability | Exhausted after one pass; create a new instance (or add a reset method)  | Exhausted after one pass; call the function again for a fresh generator

Example 1: Using an Iterator

class Count:
    def __init__(self, n):
        self.i = 1
        self.n = n

    def __iter__(self):
        return self  # the object is its own iterator

    def __next__(self):
        if self.i > self.n:
            raise StopIteration  # signals the for loop to stop
        val = self.i
        self.i += 1
        return val

counter = Count(5)
for num in counter:
    print(num)
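
The same iterator can also be stepped through by hand with iter() and next(), which is exactly what the for loop above does behind the scenes:

counter = Count(3)
it = iter(counter)  # calls __iter__(), which returns the object itself
print(next(it))     # 1 -- each call runs __next__()
print(next(it))     # 2
print(next(it))     # 3
# one more next(it) would raise StopIteration, which a for loop catches silently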

Example 2: Using a Generator

def count_gen(n):
    i = 1
    while i <= n:
        yield i
        i += 1

for num in count_gen(5):
    print(num)

  • Both examples produce the same output: 1, 2, 3, 4, 5
  • The generator achieves the same result with less code and no explicit state management
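
For a sequence this simple, a generator expression is an even shorter alternative; this small sketch assumes the same 1-to-5 output:

count_expr = (i for i in range(1, 6))  # generator expression: no def or yield needed
for num in count_expr:
    print(num)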

Example 3: Real-World Scenario – Large Data Processing

Using Iterator:

class FileIterator:
    def __init__(self, filename):
        self.file = open(filename)
    
    def __iter__(self):
        return self
    
    def __next__(self):
        line = self.file.readline()
        if not line:  # empty string means end of file
            self.file.close()  # closed only once the whole file has been read
            raise StopIteration
        return line.strip()

for line in FileIterator("large_file.txt"):
    print(line)

Using Generator:

def file_generator(filename):
    with open(filename) as f:
        for line in f:
            yield line.strip()

for line in file_generator("large_file.txt"):
    print(line)

  • The generator version is cleaner, more concise, and the with statement handles closing the file automatically
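
Because generators compose naturally, the same file can feed a streaming pipeline without ever being loaded fully into memory. The sketch below reuses file_generator from above and adds a hypothetical non_empty helper:

def non_empty(lines):
    for line in lines:
        if line:  # skip blank lines
            yield line

# Count non-empty lines lazily; only one line is held in memory at a time
total = sum(1 for _ in non_empty(file_generator("large_file.txt")))
print(total)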

Advantages of Generators Over Iterators

  • Less code, easier to read
  • Automatically implements the iterator protocol (__iter__() and __next__())
  • Memory-efficient for large or infinite datasets
  • Handles lazy evaluation gracefully
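
To illustrate the infinite-dataset point, here is a minimal sketch of an unbounded generator consumed safely with itertools.islice (naturals is an invented name for this example):

import itertools

def naturals():
    n = 1
    while True:  # never terminates on its own -- safe only because values are produced lazily
        yield n
        n += 1

print(list(itertools.islice(naturals(), 5)))  # [1, 2, 3, 4, 5]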

Best Practices

✔ Use generators for large datasets or streaming data
✔ Use iterators for custom traversal logic requiring state
✔ Avoid overcomplicating generators; keep them simple and modular


Conclusion

Generators and iterators are essential tools for lazy evaluation in Python. Generators simplify code, save memory, and handle infinite sequences, while iterators provide more control and flexibility for custom traversals.

