Streams, AsyncIterable, TypeScript... sounds complex, doesn't it? But fear not! In this article, we're taking a leisurely stroll through the world of stream processing with a handy tool called strict-stream that helps you manage async iterators.
Just simple explanations and examples. Let's dive in!
Streams are like digital rivers flowing with data. Imagine a conveyor belt carrying items past you, one at a time.
Like that...
      ┌─┐   ┌─┐    ┌─┐     ┌─┐     ┌─┐     ┌─┐
──────┼┼┼───┼┼┼────┼┼┼─────┼┼┼─────┼┼┼─────┼┼┼─────────►
      └─┘   └─┘    └─┘     └─┘     └─┘     └─┘
That's a stream!
We're going to use a tool that helps us do clever things with these digital conveyor belts.
Let's say we have a stream of data that you want to manage and run some operations on along the way.
        │                │            │         │
────────┼────────────────┼────────────┼─────────┼─────────►
  Fetch │       Validate │  Transform │  Handle │ Upload
You could fetch the data, validate it, transform it, and upload it somewhere, for example, to a database.
At some point, for example after the validate stage, your data becomes strictly typed: you know that your transformations are applied to the right datasets and your flow is type-safe. (Read more about AsyncIterable and Data Pipelines in TypeScript and type safety here.)
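To make that idea concrete, here is a minimal sketch of a hand-written validate stage using plain AsyncIterable (no library yet); the User type and the field checks are just illustrative assumptions:
type User = { id: number; name: string };

// A validate stage: it takes untyped input and yields only items that match
// the expected shape, so every stage after it sees a strictly typed User.
// The User shape here is hypothetical; adapt the checks to your real data.
async function* validate(input: AsyncIterable<unknown>): AsyncIterable<User> {
  for await (const item of input) {
    if (
      typeof item === "object" &&
      item !== null &&
      "id" in item &&
      "name" in item
    ) {
      yield item as User;
    }
  }
}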
But how do you manage a whole pipeline of such stages? You could write a bunch of glue code yourself, but that's a lot of work if you want to keep things clean. And what if you want to reuse the code? Compose it? Test it?
That's where strict-stream helps: it organizes your code and makes it reusable as well as strictly typed.
Suppose we have an array of numbers, and we want to double each number. Here's how you'd do it:
import { of } from "strict-stream";
import { map } from "strict-stream/map";
async function example() {
  const stream = of([1, 2, 3])
    .pipe(map((value) => value * 2));
  for await (const value of stream) {
    console.log(value);
  }
}
await example();

See? Looks quite easy.
of([1, 2, 3]) just makes an AsyncIterable<number> from an array, and map doubles each value. The value is, of course, strictly typed as number.
The .pipe(map(...)) part is where the magic happens.
It takes the stream and applies the map function to it to double each value.
Then, we can iterate over the stream and log each value to the console.
pipe can be chained as many times as you need to compose your stream processing safely.
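For example, here is a small sketch that chains a couple of map stages, building only on the same of and map used above; the type flows through each pipe call:
import { of } from "strict-stream";
import { map } from "strict-stream/map";

async function composed() {
  const stream = of([1, 2, 3])
    .pipe(map((value) => value * 2))        // 2, 4, 6
    .pipe(map((value) => `item-${value}`)); // "item-2", "item-4", "item-6"

  for await (const label of stream) {
    console.log(label); // label is inferred as string
  }
}

await composed();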
It gets along well with Node.js streams.
You can easily convert between strict-stream and Node.js streams using functions like
nodeReadable, nodeWritable, and nodeTransform.
Basically, Node.js streams are AsyncIterable by default,
so there is no overhead; these helpers are just some "sugar" to make them more comfortable to use.
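That claim is easy to check without any library at all; the following sketch uses only Node's built-in stream module to show that a Readable can be consumed with for await:
import { Readable } from "node:stream";

async function nodeInterop() {
  // Node.js Readable streams implement Symbol.asyncIterator out of the box,
  // so they are AsyncIterable and can be consumed like any other stream here.
  const readable = Readable.from(["a", "b", "c"]);

  for await (const chunk of readable) {
    console.log(chunk);
  }
}

await nodeInterop();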
It lets you build complex operations by chaining simple ones. You can filter, map, and even aggregate data with just a few lines of code. It's like building with LEGO bricks—simple pieces come together to make something impressive.
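As a small aggregation sketch (again using only the of and map shown earlier), you can fold a stream into a single value right in the consuming loop:
import { of } from "strict-stream";
import { map } from "strict-stream/map";

async function total() {
  const doubled = of([1, 2, 3, 4])
    .pipe(map((value) => value * 2)); // 2, 4, 6, 8

  // Aggregate by folding the stream in the consuming loop.
  let sum = 0;
  for await (const value of doubled) {
    sum += value;
  }
  console.log(sum); // 20
}

await total();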
Imagine drinking from a fire hose. Ouch! Backpressure handling prevents data from overwhelming your code. It's like sipping water through a straw: you take only what you need, when you need it.
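Here is a minimal sketch of why pull-based iteration gives you that behavior; the producer below only runs when the consumer asks for the next value, so a slow consumer naturally slows production down:
async function* produce() {
  for (let i = 1; i <= 3; i++) {
    console.log(`producing ${i}`);
    yield i;
  }
}

async function consumeSlowly() {
  for await (const value of produce()) {
    // Simulate slow handling; the next value is not produced
    // until this iteration finishes.
    await new Promise((resolve) => setTimeout(resolve, 100));
    console.log(`consumed ${value}`);
  }
}

await consumeSlowly();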
In this approach, you write less code that does more. And the best part? It's fast and efficient.
Streams might sound intimidating, but strict-stream makes them friendly.
It simplifies data processing and turns complex operations into manageable steps.
Whether you're dealing with real-time data or wrangling big datasets,
strict-stream has your back. Give it a try and simplify your code!
Note: You can find strict-stream on npm: npmjs.com/package/strict-stream.
Happy coding!
PS: Here is one more article you might like: an introduction to composable streams for AsyncIterable.