Questions tagged [scalaz-stream]

scalaz-stream is a streaming I/O library. The design goals are compositionality, expressiveness, resource safety, and speed. The design is meant to supersede or replace older iteratee or iteratee-style libraries.

The library supports a number of other interesting use cases:

  • Zipping and merging of streams: A streaming computation may read from multiple sources in a streaming fashion, zipping or merging their elements using an arbitrary Tee. In general, clients have a great deal of flexibility in what sort of topologies they can define--sources, sinks, and effectful channels are all first-class concepts in the library.
  • Dynamic resource allocation: A streaming computation may allocate resources dynamically (for instance, reading a list of files to process from a stream built off a network socket), and the library will ensure these resources get released in the event of normal termination or when errors occur.
  • Nondeterministic and concurrent processing: A computation may read from multiple input streams simultaneously, using whichever result comes back first, and a pipeline of transformations can allow for nondeterminism and queueing at each stage.
  • Streaming parsing (UPCOMING): A separate layer handles constructing streaming parsers, for instance, for streaming JSON, XML, or binary parsing. See the roadmap for more information on this and other upcoming work.
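As a small illustration of the zipping described above, here is a minimal sketch (assuming scalaz-stream on the classpath; the stream names and values are invented for the example). `zip` pairs elements from two sources and is built on the library's `tee.zip` Tee:

```scala
import scalaz.concurrent.Task
import scalaz.stream._

// Two pure streams lifted into Task-based sources (names are illustrative).
val names: Process[Task, String] = Process.emitAll(Seq("a", "b", "c")).toSource
val nums:  Process[Task, Int]    = Process.emitAll(Seq(1, 2, 3)).toSource

// zip pairs elements from both sources; it terminates when either side does.
val zipped: Process[Task, (String, Int)] = names.zip(nums)

// Run the stream and collect its output.
println(zipped.runLog.run)
```

The same shape works with `merge` or `wye` combinators when you want nondeterministic interleaving instead of lockstep pairing.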
83 questions
3
votes
1 answer

How do you combine multiple Scalaz-Streams such that order of completion is preserved but interleaving isn't enforced?

var num = 0
var num2 = 3333
val p2 = Process.eval {
  Thread.sleep(10000)
  Task.delay {
    Thread.sleep(10000)
    num2 = num2 + 1
    s"hi ${num2}"
  }
}.repeat.take(15)
// p2: scalaz.stream.Process[[x]scalaz.concurrent.Task[x],String] = //…
jordan3
  • 827
  • 4
  • 13
3
votes
1 answer

Performance of line counting with scalaz-stream

I've translated the imperative line counting code (see linesGt1) from the beginning of chapter 15 of Functional Programming in Scala to a solution that uses scalaz-stream (see linesGt2). The performance of linesGt2, however, is not that great. The…
Frank S. Thomas
  • 4,569
  • 2
  • 25
  • 47
2
votes
0 answers

How do I nondeterministically flatten infinite FS2 streams

I'm using Scala's FS2 stream library. I have a Stream[F, Stream[F, A]] where both the inner streams and the outer stream are infinite (with appropriate Async instances for F). I want to end up with a Stream[F, A] that concurrently pulls from the…
badcook
  • 3,619
  • 11
  • 25
2
votes
1 answer

Controlling the throughput of a Process

I'm trying to control the throughput of a Process[F, A] with a timer Process:

val p: Process[List, Int] = Process.iterateEval(0)(i => List(i + 1))
val timer: Process[Task, Duration] = time.awakeEvery(1 second)(Strategy.DefaultStrategy,…
synapski
  • 313
  • 2
  • 12
2
votes
1 answer

How to kill the console input when process interrupted

I'm playing with the Scalaz Stream library and trying to create a simple console app. I followed the scalaz streams tutorial, which has an example with console read and write. But I faced a strange problem which I'm not sure how to sort…
kikulikov
  • 2,302
  • 3
  • 24
  • 37
2
votes
1 answer

continuously fetch database results with scalaz.stream

I'm new to Scala and extremely new to scalaz. Through a different Stack Overflow answer and some handholding, I was able to use scalaz.stream to implement a Process that would continuously fetch Twitter API results. Now I'd like to do the same thing…
plambre
  • 5,568
  • 2
  • 15
  • 30
2
votes
1 answer

How do you write to and read from an external process using scalaz streams

I would like to be able to send data from a scalaz stream into an external program and then get the result of that item back in about 100ms in the future. Although I was able to do this with the code below by zipping the output stream Sink with the…
Chris Balogh
  • 358
  • 3
  • 8
2
votes
0 answers

scalaz-stream: combining queues based on one queue's size

In my application I have up to N consumers working in parallel and a producer. Consumers grab resources from the producer, do their work, append results to an updateQueue, and ask for more resources. The producer has some resources available initially…
Pyetras
  • 1,402
  • 15
  • 21
2
votes
1 answer

How to merge adjacent lines with scalaz-stream without losing the splitting line

Suppose that my input file myInput.txt looks as follows: ~~~ text1 bla bla some more text ~~~ text2 lorem ipsum ~~~ othertext the wikipedia entry is not up to date That is, there are documents separated by ~~~. The desired output is as…
mitchus
  • 3,732
  • 3
  • 26
  • 67
2
votes
2 answers

How do I cleanly log to io.stdOutLines and respond to the client with a scalaz.stream.tcp server

I'm very new to both scalaz-stream and specifically scalaz.stream.tcp. I'm trying to do a very simple server for my own educational purposes. I parse the requests into commands, execute them to produce responses, and write the responses back to…
Integrator
  • 509
  • 4
  • 14
2
votes
2 answers

scalaz-stream queue without hanging

I have a two-part question, so let me give some background first. I know that it is possible to do something similar to what I want like this:

import scalaz.concurrent._
import scalaz.stream._

val q = async.unboundedQueue[Int]
val p: Process[Task,…
2
votes
1 answer

How to send a process to multiple sinks in scalaz-stream

If I have a simple process which is emitting values of type String and I wish to send these to multiple sinks (i.e. each sink gets sent the String), how do I do this? For example, running this program: object Play extends App { def prepend(s:…
oxbow_lakes
  • 129,207
  • 53
  • 306
  • 443
2
votes
1 answer

Using Scalaz stream, how to convert A => Task[B] to Process1[A,B]

I am encoding an HTTP request to a remote server as a function which takes an id and yields a Task[JValue]. I would like to convert that function into a Process1, to simplify my program (by simplify, I mean use Processes as building blocks as much as…
Atle
  • 308
  • 1
  • 11
2
votes
1 answer

Puzzling behavior in scalaz-stream with chunk and zipWithIndex

I am trying to process a stream of data using scalaz-stream with an expensive operation※.

scala> :paste
// Entering paste mode (ctrl-D to finish)

def expensive[T](x: T): T = {
  println(s"EXPENSIVE! $x")
  x
}
^D
// Exiting paste…
underspecified
  • 939
  • 1
  • 7
  • 15
1
vote
1 answer

FS2 join Cannot prove that Seq[fs2.Stream[cats.effect.IO,Int]] <:< fs2.Stream[cats.effect.IO,O2]

I'm trying to use fs2 streams 0.10.0-M9 and doobie version 0.5.0-M9 for getting a sequence of objects from an http call which I want to then insert into a postgres database but I'm having issues structuring this code, getting the following…