jackson-dataformats-binary: Eager allocation of byte buffer can cause `java.lang.OutOfMemoryError` exception (CVE-2020-28491)
`CBORParser.java`'s `_finishBytes()` accepts an unchecked field/string length value read during parsing and uses it to allocate a buffer. A malicious payload can be fabricated to exploit this and (at least) cause a `java.lang.OutOfMemoryError`.
```java
@SuppressWarnings("resource")
protected byte[] _finishBytes(int len) throws IOException
{
    // First, simple: non-chunked
    if (len >= 0) {
        if (len == 0) {
            return NO_BYTES;
        }
        byte[] b = new byte[len]; // <-- OutOfMemoryError here if len is large
        // ... (rest of method elided)
```
I am not sure how serious this is in Java; with an unmanaged runtime this would be a critical security vulnerability.
For example, the following CBOR data (discovered by a fuzzer) leads to `len = 2147483647` and triggers this exception on my laptop:

`d9d9f7a35a7ffffffff7d9f7f759f7f7f7`
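For reference, here is a minimal reproduction sketch. It assumes an affected version of `jackson-dataformat-cbor` is on the classpath; the class name and the hex-decoding helper are just for illustration.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.cbor.CBORFactory;

public class CborOomRepro {
    public static void main(String[] args) throws Exception {
        // Payload from the report: a byte string whose header claims
        // a length of 2147483647 bytes.
        byte[] doc = hexToBytes("d9d9f7a35a7ffffffff7d9f7f759f7f7f7");
        ObjectMapper mapper = new ObjectMapper(new CBORFactory());
        // Fully materializing the document forces decoding of the byte string,
        // which should exercise the eager allocation on affected versions.
        mapper.readTree(doc);
    }

    // Illustrative helper, not part of Jackson
    private static byte[] hexToBytes(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }
}
```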
This can probably be addressed by simple sanity checking of the `len` value (non-negative, below some maximum limit).
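A rough sketch of such a check; the class, method, and limit below are purely illustrative and not the parser's actual fields or API:

```java
import java.io.IOException;

// Illustrative guard: reject negative lengths outright, cap the declared
// length at a configurable maximum, and (when the remaining input size is
// known) refuse lengths that exceed what the input can possibly contain.
final class DeclaredLengthCheck {
    static final int MAX_BYTE_STRING_LENGTH = 10 * 1024 * 1024; // example cap: 10 MiB

    static void check(int len, long bytesAvailable) throws IOException {
        if (len < 0) {
            throw new IOException("Invalid (negative) byte string length: " + len);
        }
        if (len > MAX_BYTE_STRING_LENGTH) {
            throw new IOException("Declared byte string length " + len
                    + " exceeds maximum allowed " + MAX_BYTE_STRING_LENGTH);
        }
        if (bytesAvailable >= 0 && len > bytesAvailable) {
            throw new IOException("Declared byte string length " + len
                    + " exceeds remaining input (" + bytesAvailable + " bytes)");
        }
    }
}
```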
About this issue
- State: closed
- Created 5 years ago
- Comments: 17 (14 by maintainers)
Commits related to this issue
- Add test for (cbor) #186 — committed to FasterXML/jackson-dataformats-binary by cowtowncoder 4 years ago
- Fix eager allocation aspect of #186 — committed to FasterXML/jackson-dataformats-binary by cowtowncoder 4 years ago
- Add CVE id (CVE-2020-28491) for #186 — committed to FasterXML/jackson-dataformats-binary by cowtowncoder 3 years ago
What I mean is that with this problem, an attacker can use a very small payload to cause this big allocation, possibly causing denial of service with little effort. If the buffer were to grow incrementally while reading, the attacker would need to follow up the length header they sent with actual data to make an impact on memory consumption.
Another possibility would be not pre-allocating the buffer for large sizes and instead using a ByteArrayOutputStream-like growing scheme. It’s slightly less efficient if the full data does come in, but an attacker would have to actually send the data she is claiming in order to cause a DoS, which makes an attack more difficult.
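A minimal sketch of that idea, assuming the data arrives via an InputStream; the class name and chunk size are illustrative, not the library's actual implementation:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Grow-as-you-read: memory use tracks the bytes actually received rather
// than the length claimed in the header, so a tiny payload cannot force a
// huge up-front allocation.
final class BoundedBytesReader {
    private static final int CHUNK = 64 * 1024; // illustrative chunk size

    static byte[] readDeclared(InputStream in, int declaredLen) throws IOException {
        if (declaredLen < 0) {
            throw new IOException("Invalid declared length: " + declaredLen);
        }
        ByteArrayOutputStream out = new ByteArrayOutputStream(Math.min(declaredLen, CHUNK));
        byte[] buf = new byte[Math.min(declaredLen, CHUNK)];
        int remaining = declaredLen;
        while (remaining > 0) {
            int n = in.read(buf, 0, Math.min(buf.length, remaining));
            if (n < 0) {
                throw new IOException("Unexpected end-of-input with "
                        + remaining + " bytes still expected");
            }
            out.write(buf, 0, n);
            remaining -= n;
        }
        return out.toByteArray();
    }
}
```

With this scheme an attacker has to actually transmit the claimed number of bytes before the reader's memory footprint approaches the declared size.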
Ah. Yes, a maximum length for names seems perfectly reasonable. And I am not against a configurable limit for values, but that limit may need to start out quite high (perhaps even at max int).
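A hypothetical shape for such configurable limits; this is not an existing Jackson API, just an illustration of separate caps for names and values, with the value cap defaulting to max int as discussed:

```java
// Hypothetical settings holder; names and defaults are illustrative only.
final class CborReadLimits {
    static final int DEFAULT_MAX_NAME_LENGTH = 64 * 1024;          // 64 KiB for field names
    static final int DEFAULT_MAX_VALUE_LENGTH = Integer.MAX_VALUE; // effectively unlimited

    final int maxNameLength;
    final int maxValueLength;

    CborReadLimits(int maxNameLength, int maxValueLength) {
        this.maxNameLength = maxNameLength;
        this.maxValueLength = maxValueLength;
    }

    static CborReadLimits defaults() {
        return new CborReadLimits(DEFAULT_MAX_NAME_LENGTH, DEFAULT_MAX_VALUE_LENGTH);
    }
}
```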
Eventually a similar approach should probably be added for the JSON (and Smile, maybe Protobuf and Avro) decoders, with various limits. I have done something like this with XML before (the Woodstox Stax parser has a reasonable set of limits for all kinds of things: max attributes, nesting levels, attribute values, CDATA segments, etc.), but so far there hasn’t been as much demand for JSON for some reason. Maybe it’s just that XML has had more time to mature and be exposed to DoS attacks. 😃
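For comparison, the Woodstox limits mentioned above are configured via input-factory properties, roughly along these lines (the exact property constants and their value types should be double-checked against the `WstxInputProperties` javadoc; the numbers are arbitrary examples):

```java
import javax.xml.stream.XMLInputFactory;
import com.ctc.wstx.api.WstxInputProperties;
import com.ctc.wstx.stax.WstxInputFactory;

// Example only: cap a few structural dimensions of the XML input so a
// malicious document cannot blow up memory or processing time.
public class WoodstoxLimitsExample {
    public static void main(String[] args) {
        XMLInputFactory f = new WstxInputFactory();
        f.setProperty(WstxInputProperties.P_MAX_ATTRIBUTES_PER_ELEMENT, 100);
        f.setProperty(WstxInputProperties.P_MAX_ELEMENT_DEPTH, 200);
        f.setProperty(WstxInputProperties.P_MAX_CHARACTERS, 10_000_000);
    }
}
```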