You're Already Using This Library (You Just Don't Know It)
22 million Composer installs. 72 GitHub stars.
That ratio tells you something: this library is everywhere, but nobody talks about it. If you use Infection for mutation testing, you’re already using sanmai/pipeline. It’s what made Infection as memory-efficient as it is.
But you might want to use it directly. Here’s why.
The Problem
You’ve hit this wall before. You need to process a log file, filter some data, maybe aggregate results. Standard PHP approach:
$lines = file('access.log');
$errors = array_filter($lines, fn($l) => str_contains($l, 'ERROR'));
Clean, readable, works great. Until your log file is 500MB and your server has 512MB of RAM: file() loads every line into memory at once, and PHP dies with an "Allowed memory size exhausted" fatal error.
Laravel Collections won’t save you here either - it still loads everything into an array first. Memory limits are real: shared hosting, containers, that cheap VPS you’re trying to keep under budget. Sometimes you can’t just throw more RAM at the problem.
The Solution
Same task, constant memory:
use function Pipeline\take;
$file = new SplFileObject('access.log');
$errors = take($file)
    ->filter(fn($l) => str_contains($l, 'ERROR'))
    ->toList();
That’s it. SplFileObject is an iterator - it reads one line at a time. Pipeline keeps it that way. File size doesn’t matter anymore.
The API is what you’d expect. map() transforms values (and can yield multiple items). filter() keeps what matches. reduce() collapses to a single value. Works with arrays, iterators, generators - anything iterable.
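That "yield multiple items" bit deserves a quick sketch. Pass map() a callback that yields, and each yielded value travels down the pipeline on its own - one input, many outputs. The sample data here is arbitrary:
use function Pipeline\take;
// A yielding callback fans one value out into several
$words = take(['error in module', 'timeout reached'])
    ->map(function (string $line) {
        foreach (explode(' ', $line) as $word) {
            yield $word;
        }
    })
    ->toList();
// ['error', 'in', 'module', 'timeout', 'reached']
cast(), which shows up in the next example, is the one-for-one counterpart.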
use function Pipeline\take;
// Chain operations, nothing executes until you consume it
$result = take($hugeDataset)
    ->stream()
    ->map(fn($row) => parseRow($row))
    ->filter(fn($item) => $item->isValid())
    ->cast(fn($item) => $item->value)
    ->reduce(fn($sum, $val) => $sum + $val, 0);
Lazy evaluation means you can build complex pipelines without worrying about intermediate arrays eating your memory.
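Another consequence of laziness: you don't have to consume the whole pipeline at once. The pipeline object is itself iterable, so you can foreach over it and bail out early. A rough sketch - slowRows(), process() and enough() are stand-ins for your own code:
use function Pipeline\take;
// slowRows(), process() and enough() are placeholders, not part of the library
$pipeline = take(slowRows())
    ->map(fn($row) => parseRow($row))
    ->filter(fn($item) => $item->isValid());
// Nothing has executed yet; each loop iteration pulls one item through the chain
foreach ($pipeline as $item) {
    process($item);
    if (enough()) {
        break; // stopping here stops the upstream work too
    }
}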
When You Need It
You probably don’t need this for most PHP work. Request comes in, you fetch some records, transform them, send a response. Arrays are fine.
But sometimes:
- Log files that won’t fit in memory
- CSV imports with millions of rows
- Database cursors streaming large result sets
- Paginated APIs where you want to process all pages without loading everything first
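That last case is a nice fit. Wrap the paging loop in a generator and the pipeline never holds more than one record at a time. A rough sketch - the $client and its getUsers() method are made up, the shape of the pattern is what matters:
use function Pipeline\take;
// Hypothetical API client: one request per page, records yielded one by one
function allUsers($client): \Generator
{
    $page = 1;
    do {
        $response = $client->getUsers($page);
        yield from $response['data'];
        $page++;
    } while ($response['has_more']);
}

$activeEmails = take(allUsers($client))
    ->filter(fn($user) => $user['active'])
    ->cast(fn($user) => $user['email'])
    ->toList();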
This isn’t replacing Laravel Collections. It’s for when Collections can’t help you - when the data is too big, or you don’t know how big it’ll be.
Try It
composer require sanmai/pipeline
The full documentation has more examples and the complete API.
Next time you hit a memory limit while processing data, remember: there’s a library for that. You might already have it installed.