Chunked response handling with spray example

Documentation states that spray is able to handle chunked responses, but I can't find any example to start with. Here is my naive implementation:

// imports assumed for spray 1.1-M7 (package locations changed between milestones)
import akka.actor.{ ActorSystem, Props }
import akka.event.Logging
import spray.can.client.HttpClient
import spray.client.HttpConduit
import spray.http.HttpRequest
import spray.http.HttpMethods.GET
import spray.io.IOExtension
import scala.util.{ Failure, Success }

object Main extends App {

  implicit val system = ActorSystem()
  import system.dispatcher
  val log = Logging(system, getClass)
  val ioBridge = IOExtension(system).ioBridge()
  val httpClient = system.actorOf(Props(new HttpClient(ioBridge)))

  val conduit = system.actorOf(
    props = Props(new HttpConduit(httpClient, "localhost", 3000)),
    name = "http-conduit"
  )

  val pipeline = HttpConduit.sendReceive(conduit)
  val response = pipeline(
    HttpRequest(
      method = GET,
      uri = "/output.cgi.xml"
    )
  )

  response onComplete {
    case Success(a) =>
      log.info("Success: " + a)
      system.shutdown()

    case Failure(error) =>
      log.error(error, "Failure")
      system.shutdown()
  }

}

I've set response-chunk-aggregation-limit = 0, but still nothing happens.

Can you provide me with an example of reading a chunked response?

Update

I've rewritten my code as follows:

// imports assumed for spray 1.1-M7; the Connect/Connected/Closed connection
// messages come from the spray-io client layer and their import path varies
// between milestones, so that import is left out here
import java.net.InetSocketAddress
import akka.actor.{ ActorSystem, Props }
import akka.actor.ActorDSL._
import akka.event.Logging
import spray.can.client.HttpClient
import spray.http._
import spray.http.HttpMethods.GET
import spray.io.IOExtension

object Main extends App {

  implicit val system = ActorSystem()
  import system.dispatcher
  val log = Logging(system, getClass)
  val ioBridge = IOExtension(system).ioBridge()
  val httpClient = system.actorOf(Props(new HttpClient(ioBridge)))

  actor(new Act {
    httpClient ! Connect(new InetSocketAddress("localhost", 3000))

    become {
      case Connected(_) =>
        log.info("connected")
        sender ! HttpRequest(GET, "/output.cgi.xml")
      case Closed(handle, reason) =>
        log.info("closed: " + reason)
        system.shutdown()
      case ChunkedResponseStart(res) =>
        log.info("start: " + res)
      case MessageChunk(body, ext) =>
        log.info("chunk: " + body)
      case ChunkedMessageEnd(ext, trailer) =>
        log.info("end: " + ext)
      case m =>
        log.info("received unknown message " + m)
        system.shutdown()
    }
  })

}

And now I'm receiving closed: ProtocolError(Aggregated response entity greater than configured limit of 1048576 bytes) just after the connection is established.

My application.conf

spray.can {
  client {
    response-chunk-aggregation-limit = 0
  }
}
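
As a side note, one way to rule out a configuration problem is to check that this override is actually visible on the classpath. The sketch below uses the Typesafe Config library that Akka already depends on; ConfigCheck is just an illustrative name.

// Sanity check (sketch): print the effective client-side limit.
// A value of 0 disables aggregation, so chunks are delivered individually.
import com.typesafe.config.ConfigFactory

object ConfigCheck extends App {
  val config = ConfigFactory.load() // reads application.conf from the classpath
  println(config.getString("spray.can.client.response-chunk-aggregation-limit"))
}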
Tko asked 2/5, 2013 at 10:07. Comments (3):
What do you mean by "nothing still happens"? Are you saying you never get a result for the query? – Tananarive
Which version of spray do you use? – Tananarive
I'm using 1.1-M7. I mean I don't know how to handle the response in a chunked way. Currently onComplete receives the aggregated response. – Tko

As you have noticed, HttpConduit only works on aggregated responses. You have to drop down to the spray-can layer to handle single chunks.

Unfortunately, we currently have no example showing how you would do this. Roughly, it works like this (in M7):

  1. Set response-chunk-aggregation-limit = 0
  2. Send Connect to the httpClient actor and wait for Connected
  3. Send the HttpRequest to the sender of the Connected message
  4. Handle the chunked response messages ChunkedResponseStart, MessageChunk, and ChunkedMessageEnd (a sketch follows below).
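
To make these steps concrete, here is a minimal sketch of a chunk-consuming actor, assuming spray 1.1-M7 and the server from the question. ChunkConsumer is just an illustrative name, and the import for the Connect/Connected/Closed connection messages is omitted because its package differs between milestones, so treat this as a sketch rather than a verified build.

// Sketch only: assumes response-chunk-aggregation-limit = 0 so the client
// delivers ChunkedResponseStart / MessageChunk / ChunkedMessageEnd (spray.http)
// instead of one aggregated HttpResponse.
import java.net.InetSocketAddress
import akka.actor.{ Actor, ActorLogging, ActorRef }
import spray.http._
import spray.http.HttpMethods.GET

class ChunkConsumer(httpClient: ActorRef) extends Actor with ActorLogging {
  var chunkCount = 0

  // step 2: open the connection and wait for Connected
  override def preStart(): Unit =
    httpClient ! Connect(new InetSocketAddress("localhost", 3000))

  def receive = {
    case Connected(_) =>
      // step 3: the connection actor is the sender; send the request back to it
      sender ! HttpRequest(GET, "/output.cgi.xml")

    case ChunkedResponseStart(response) =>
      log.info("response started with status {}", response.status)

    case MessageChunk(body, _) =>
      // step 4: process each chunk as it arrives instead of buffering the entity
      chunkCount += 1
      log.info("chunk {}: {} bytes", chunkCount, body.length)

    case ChunkedMessageEnd(_, _) =>
      log.info("response finished after {} chunks", chunkCount)
      context.system.shutdown()

    case Closed(_, reason) =>
      log.info("connection closed: {}", reason)
      context.system.shutdown()
  }
}

Started with something like system.actorOf(Props(new ChunkConsumer(httpClient))), this logs every chunk individually; the updated snippet in the question does essentially the same thing with the ActorDSL.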

For more info see http://spray.io/documentation/spray-can/http-client/#chunked-responses

In contrast to using HttpConduit, this means you have to manage your connections yourself (if that's why you were using HttpConduit in the first place). In recent nightlies this has become easier, because the new HttpClient automatically supports connection pools, etc. If you need that, you can also get more info on the mailing list.

Tananarive answered 2/5, 2013 at 14:11. Comments (2):
Does this mean that my client will receive every sufficiently large response as chunks, or is some special server-side support needed? – Tko
You currently get chunks only if the server sends chunks. I added a ticket to support a chunk API even if the actual response isn't chunked over the wire: github.com/spray/spray/issues/281 – Tananarive
