beast: bug when reading chunked on "slow" stream

Version of Beast

master (v49) and v50

compiler

OS: Ubuntu 17.04; compiler: gcc (Ubuntu 6.3.0-12ubuntu2) 6.3.0 20170406

code for reproducing the error

#include <boost/asio.hpp>
#include <beast/core.hpp>  // umbrella headers of the standalone (pre-Boost) Beast used here
#include <beast/http.hpp>
#include <algorithm>
#include <iostream>
#include <string>

// SyncReadStream that hands out the canned chunked response one byte per
// read_some call, simulating a very slow connection.
struct very_slow_stream {
  boost::asio::io_service& io_service;
  const std::string input =
      "HTTP/1.1 200 OK\r\n"
      "Transfer-Encoding: chunked\r\n"
      "Content-Type: application/octet-stream\r\n"
      "\r\n"
      "4\r\n"
      "abcd\r\n"
      "0\r\n"
      "\r\n";
  std::size_t index{};  // how many bytes of `input` have been delivered so far

  template <typename MutableBufferSequence>
  std::size_t read_some(const MutableBufferSequence& buffer,
                        boost::system::error_code& /*error*/) {
    auto write_to = boost::asio::buffer_cast<char*>(*buffer.begin());
    // std::size_t copy_count{input.size()}; // WORKS!
    std::size_t copy_count{1};  // FAILS, with bad chunk!
    copy_count = std::min(copy_count, input.size() - index);  // never run past the end
    for (std::size_t index_copy = 0; index_copy < copy_count; ++index_copy)
      *(write_to + index_copy) = input[index_copy + index];
    index += copy_count;
    return copy_count;
  }

  template <typename MutableBufferSequence>
  std::size_t read_some(const MutableBufferSequence& buffer) {
    boost::system::error_code error;
    const auto size = read_some(buffer, error);
    if (error) throw boost::system::system_error{error};
    return size;
  }
  boost::asio::io_service& get_io_service() { return io_service; }
};

void test_bug() {
  boost::asio::io_service service;
  very_slow_stream stream_{service};
  beast::multi_buffer stream_buffer;
  beast::http::parser<false, beast::http::dynamic_body, beast::http::fields>
      parser;

  // Read and parse one increment at a time until the whole message,
  // including the final chunk, has been seen.
  while (!parser.is_done()) {
    const auto read_count =
        beast::http::read_some(stream_, stream_buffer, parser);
    std::cout << parser.get() << std::endl;
    stream_buffer.consume(read_count);
  }
  std::cout << beast::buffers(parser.get().body.data()) << std::endl;
}

the issue

I’m trying to read a chunked HTTP stream chunk by chunk, so I’m using beast::http::read_some. Sometimes I get the “bad chunk” error when Beast tries to parse the chunk header (basic_parser.ipp:546). After some hours of debugging, I found that the error occurs when the stream happens to be very slow, delivering as little as one byte per read. There seems to be a logic error in handling the \r\n that terminates the previous chunk body.
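For reference, here is a minimal sketch that reproduces the same byte-at-a-time scenario without a stream at all, by feeding the payload to the parser incrementally through basic_parser::put. It uses current Boost.Beast spellings (boost::beast::http::response_parser, http::error::need_more, eager()); the standalone Beast used in this report may name things slightly differently.

// Minimal sketch (not from the original report): feed the chunked response
// to the parser one byte at a time, the way a very slow socket would.
#include <boost/beast/core.hpp>
#include <boost/beast/http.hpp>
#include <boost/asio/buffer.hpp>
#include <iostream>
#include <string>

int main() {
  namespace http = boost::beast::http;
  const std::string input =
      "HTTP/1.1 200 OK\r\n"
      "Transfer-Encoding: chunked\r\n"
      "Content-Type: application/octet-stream\r\n"
      "\r\n"
      "4\r\nabcd\r\n"
      "0\r\n\r\n";

  http::response_parser<http::string_body> p;
  p.eager(true);        // parse header and body in the same loop
  boost::beast::error_code ec;
  std::string pending;  // bytes "received" but not yet consumed by the parser
  for (std::size_t i = 0; i < input.size() && !p.is_done(); ++i) {
    pending += input[i];  // the network delivers exactly one byte
    const auto used = p.put(boost::asio::buffer(pending), ec);
    pending.erase(0, used);  // keep whatever the parser did not consume
    if (ec == http::error::need_more)
      ec = {};  // not an error here: just wait for the next byte
    else if (ec) {
      std::cerr << "parse error: " << ec.message() << "\n";  // e.g. "bad chunk"
      return 1;
    }
  }
  std::cout << p.get().body() << "\n";  // expect "abcd"
}

With the fix that the comments below report as working in master v50, a loop like this should complete and print abcd.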

About this issue

  • Original URL
  • State: closed
  • Created 7 years ago
  • Comments: 22 (12 by maintainers)

Most upvoted comments

Massive bug in the test pipe: it uses write_size instead of read_size! That means none of my test code that thought it was reading a byte at a time was actually doing so: https://github.com/vinniefalco/Beast/commit/e4a4725838dd827b3af09b28cc2eaf0d173e8e20

I can reproduce your issue with this test function:

// https://github.com/vinniefalco/Beast/issues/430
void
testRegression430()
{
    test::pipe c{ios_};
    c.server.read_size(1); // deliver at most one byte per read
    ostream(c.server.buffer) <<
        "HTTP/1.1 200 OK\r\n"
        "Transfer-Encoding: chunked\r\n"
        "Content-Type: application/octet-stream\r\n"
        "\r\n"
        "4\r\nabcd\r\n"
        "0\r\n\r\n";
    flat_buffer fb;
    parser<false, dynamic_body> p;
    read(c.server, fb, p);
}

Master v50 is working for me too. Thanks!

Phew, that’s a relief!!! Thanks!

I was able to retest and everything works as expected (master v50), thank you!

I’ve identified a defect in the parsing of chunk headers, working on a fix

Hey, I think you’re right about removing the flag! But keeping it is not so easy because of the transition from a regular chunk to the final chunk… working on a fix.
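For context, here is a sketch of the framing involved (my annotation, not something from the thread): in chunked transfer coding the CRLF that closes a chunk’s data belongs to that chunk, so a parser resumed between two reads has to remember that it still owes that CRLF before it may read the next chunk-size line, including the size line of the final zero-length chunk.

// Illustration only: the chunked payload from the repro, split at the byte
// boundaries that matter when the stream delivers one byte per read.
const char* frames[] = {
    "4\r\n",  // chunk-size line of the first chunk
    "abcd",   // four bytes of chunk data
    "\r\n",   // CRLF terminating the previous chunk's data
    "0\r\n",  // chunk-size line of the last (zero-length) chunk
    "\r\n",   // final CRLF that ends the message
};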