The Hawaii facility didn't have modern security updates to prevent this from happening

One would think that former NSA contractor Edward Snowden's attempt to access confidential agency files would not be easy, but it reportedly didn't take much.

According to The New York Times, Snowden used a cheap, widely available Web crawler to delve deep into the NSA's classified networks and scrape files.

A Web crawler is software used to index and back up websites. It can be programmed with various search phrases, and then jumps automatically from Web page to Web page by following links, traveling far and wide in search of relevant documents.

Some examples of Web crawlers are Googlebot, which indexes pages for Google's search engine, and wget, a command-line tool that can recursively download entire sites.
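To make the idea concrete, here is a minimal sketch of how a crawler follows links from page to page while matching a search phrase. The pages, URLs, and keyword below are hypothetical and kept in memory so the example runs without network access; a real crawler would fetch each URL over HTTP instead.

```python
from html.parser import HTMLParser

# Hypothetical in-memory "site": page path -> HTML body.
# A real crawler would fetch these over HTTP (e.g. with urllib).
PAGES = {
    "/index": '<a href="/reports">Reports</a> <a href="/about">About</a>',
    "/reports": '<p>quarterly figures</p> <a href="/index">Home</a>',
    "/about": "<p>About us</p>",
}

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def crawl(start, keyword):
    """Breadth-first crawl from `start`, returning every page
    whose content contains `keyword` (the 'search phrase')."""
    seen, queue, hits = set(), [start], []
    while queue:
        url = queue.pop(0)
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        html = PAGES[url]
        if keyword in html:
            hits.append(url)
        # Follow every link found on this page.
        parser = LinkExtractor()
        parser.feed(html)
        queue.extend(parser.links)
    return hits

print(crawl("/index", "quarterly"))  # -> ['/reports']
```

The core loop is all there is to it: visit a page, record it if it matches, queue up every link on it, and repeat until no unvisited pages remain, which is how such software can travel far across an internal network from a single starting point.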

The Web crawler used by Snowden has not been named, but Snowden reportedly programmed his search to find certain subjects and see how deep the search would take him into the NSA’s internal networks.

This raises some major questions, such as why a simple Web crawler was able to pull such information from supposedly tightly protected government networks.

The answer lies in Snowden's location. Back when the WikiLeaks incident occurred in 2010, government facilities were required to install updated anti-leak software. But a facility in Hawaii never received the update because the outpost's network lacked the capacity to run it.

Edward Snowden [SOURCE: Wired]

When Snowden downloaded the 1.7 million NSA files, he was working at that government facility in Hawaii.

It's currently unclear if Snowden just happened to be placed at that facility or if he made a request, according to reports.

Nevertheless, this is just one more example of how Snowden outwitted the NSA. During his month at the agency's regional operations center in Hawaii last spring, Snowden conned 20 to 25 NSA employees into giving him their login credentials and passwords. Snowden reportedly told the employees that he needed their passwords in order to do his job, and after downloading secret NSA documents, he leaked the information to the media.

Since the leaks, the floodgates have been opened. In August 2013, reports said that the NSA admitted to touching 1.6 percent of total global Web traffic. Its technique was to filter data after harvesting it, which led to over-collection on a major scale.

Many top tech leaders, like Facebook CEO Mark Zuckerberg and Google Executive Chairman Eric Schmidt, have spoken out against the NSA's programs along with civil-liberties advocates, U.S. citizens and even other countries that had the NSA peeping in their window.

A presidential review panel last month made 46 recommendations urging greater restraint in the NSA's surveillance programs, including an end to the bulk collection of data.