I came across this GigaOm article which, among some legitimate points (such as the lack of innovation in hiding the complexity of the MR framework), also says that “Hadoop is the talk of the town when it comes to big data, but it’s not without faults…”.

Apparently folks @GigaOm have been looking into Hadoop very closely to notice that Hadoop has faults. It has a lot of them. In fact, it has a special fault injection framework in it, whose sole purpose is to inject faults into Hadoop like there’s no tomorrow.
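For readers unfamiliar with the idea: Hadoop's fault injection framework weaves simulated failures into the code for testing purposes. A minimal sketch of the underlying concept, probabilistic fault injection, might look like the following (all class and method names here are hypothetical illustrations, not Hadoop's actual API):

```java
import java.util.Random;

// A toy fault injector: with some configured probability, a call to
// maybeFail() throws a simulated fault at the named injection site.
// This only illustrates the general idea; Hadoop's real framework
// weaves such checks into production code paths with AspectJ.
public class FaultInjector {
    private final Random rng;
    private final double probability;

    public FaultInjector(double probability, long seed) {
        this.probability = probability;
        this.rng = new Random(seed);
    }

    // Throws a simulated fault with the configured probability.
    public void maybeFail(String site) {
        if (rng.nextDouble() < probability) {
            throw new RuntimeException("injected fault at " + site);
        }
    }

    public static void main(String[] args) {
        // Probability 1.0: the fault always fires.
        FaultInjector always = new FaultInjector(1.0, 42L);
        try {
            always.maybeFail("DataNode.writeBlock");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }

        // Probability 0.0: the code path runs untouched.
        FaultInjector never = new FaultInjector(0.0, 42L);
        never.maybeFail("DataNode.writeBlock");
        System.out.println("clean run completed");
    }
}
```

The point of deliberately injecting faults like this is to exercise the error-handling paths that rarely fire in normal operation, which is exactly why a distributed system the size of Hadoop ships with such machinery in its test infrastructure.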