Little-known fact: azure is also a colour and python is a snake. But back to RxJava. This file is tiny, but we'll use it to learn some principles. If you follow them, you'll be able to load and continually process arbitrarily large, even infinitely long, JSON files. First of all, the standard "Jackson" way is similar to JAXB: load the whole file into memory and map it to Java beans. However, if your file is megabytes or gigabytes in size (because somehow you found JSON to be the best format for storing gigabytes of data...), this technique won't work. Luckily, Jackson also provides a streaming mode, similar to StAX.

Loading JSON files token-by-token using Jackson

There is nothing wrong with the standard ObjectMapper that takes JSON and turns it into a collection of objects. But in order to avoid loading everything into memory, we must use the lower-level API that ObjectMapper itself uses underneath. Let's look again at the JSON example:
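(The colour document itself is not reproduced in this excerpt; a representative shape, with field names assumed purely for illustration, might be:)

```json
{
  "colors": [
    { "name": "black", "value": "#000" },
    { "name": "white", "value": "#fff" },
    { "name": "red",   "value": "#f00" }
  ]
}
```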

You get the idea. If you are familiar with compiler theory, this is one of the first steps of compilation: the compiler transforms source code from a stream of characters into a stream of tokens.
But if you know compiler theory, you are probably not parsing JSON for a living. Anyway! The Jackson library works this way, and we can use it directly, without the transparent object mapping:
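The original listing is not reproduced in this excerpt, but the raw token loop can be sketched as follows (the method and sample names are mine; only `jackson-core` is needed):

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class TokenLoop {

    // Walks the document token by token; at no point is the
    // whole JSON tree materialised in memory
    public static List<String> tokens(String json) throws IOException {
        List<String> names = new ArrayList<>();
        try (JsonParser parser = new JsonFactory().createParser(json)) {
            JsonToken token;
            while ((token = parser.nextToken()) != null) {
                names.add(token.name());   // e.g. START_OBJECT, FIELD_NAME, ...
            }
        }
        return names;
    }
}
```

Running it over a one-field object yields the token sequence START_OBJECT, FIELD_NAME, VALUE_STRING, END_OBJECT — exactly the character-to-token step described above.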

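That token loop can then be wrapped in RxJava 2's `Flowable.generate()`. The following is a sketch under assumptions, not necessarily the article's exact listing: the `Colour` class, the `"colors"` field name, and parsing from a `String` (rather than a `File`) are all illustrative choices.

```java
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.ObjectMapper;
import io.reactivex.Flowable;

public class JsonColours {

    // Hypothetical value object for one element of the "colors" array
    public static class Colour {
        public String name;
        public String value;
    }

    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static Flowable<Colour> colours(String json) {
        return Flowable.generate(
                // 1. open a fresh parser for every subscriber
                () -> MAPPER.getFactory().createParser(json),
                // 2. on every downstream request, emit exactly one event
                (parser, emitter) -> {
                    JsonToken token;
                    // skip structural tokens (root brace, field name, array
                    // bracket) until we find an element or the end of input
                    do {
                        token = parser.nextToken();
                    } while (token != null
                            && token != JsonToken.END_OBJECT
                            && !(token == JsonToken.START_OBJECT
                                 && parser.getParsingContext().getParent().inArray()));
                    if (token == JsonToken.START_OBJECT) {
                        // maps a single element, consuming its closing brace too
                        emitter.onNext(MAPPER.readValue(parser, Colour.class));
                    } else {
                        emitter.onComplete();   // root END_OBJECT or end of input
                    }
                },
                // 3. clean up once the stream completes or is cancelled
                parser -> parser.close());
    }
}
```

Note how `ObjectMapper.readValue(parser, ...)` consumes one element's tokens, including its closing brace, so the only `END_OBJECT` the generator ever sees with `nextToken()` is the one closing the root document.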
Of course, if we reach END_OBJECT (closing the whole JSON document), we signal that the stream is over. The last lambda expression simply cleans up the state, for example by closing the JsonParser and the underlying file. Now imagine that this JSON file is hundreds of gigabytes in size. With a Flowable<Colour> we can consume it safely, at arbitrary speed, without risking memory overload.
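To see why even unbounded input is safe, here is a tiny self-contained sketch (the class and names are mine): `Flowable.generate()` is pull-based, so each item is produced only when a downstream consumer actually requests it.

```java
import io.reactivex.Flowable;

import java.util.List;

public class Naturals {

    // A literally infinite stream of 0, 1, 2, ... — generate() is
    // pull-based, so each number is produced only on request
    static final Flowable<Long> NATURALS = Flowable.generate(
            () -> 0L,                      // initial state
            (state, emitter) -> {
                emitter.onNext(state);     // emit the current number
                return state + 1;          // state for the next turn
            });

    public static List<Long> first(int n) {
        // take(n) requests just n items, so even an infinite
        // source never overwhelms memory
        return NATURALS.take(n).toList().blockingGet();
    }
}
```

The same backpressure mechanism is what keeps a hundred-gigabyte JSON file from ever being buffered in full.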