
As some of you may already know, a couple of weeks back @HiddenIllusion and I gave a talk on memory forensics titled "Mo' Memory No' Problems" at BSidesNola. While the talk wasn't recorded, we did want to put the slides out for the folks who were not able to attend. I hope you all enjoy.

*Be sure to visit HiddenIllusion's blog. Also, for the analysis walk-through at the end of the deck, we used a memory dump of 1337 hacker activity generated by Tony Lee of SecuritySynapse.

I must admit, I am in a bit of shock here. Never did I think I would actually get to have something I developed included in the most popular distribution for security professionals. For those who do not know, Kali is the successor to BackTrack, this time fully Debian-based.

As of 2013/03/22, Automater has been added to the Kali repos. What does that mean for you? If you are on Kali or have their repos added as a source, you can now apt-get install automater.

Installation of automater on Kali:

In order to ensure you are able to pull Automater from Kali, you must first update your repos. To do this, simply run:

apt-get update

Once updated you can install automater with the following command:

apt-get install automater

You can roll both of those commands into one if you like, as seen in the screenshot below:

apt-get update && apt-get install automater

It is that simple; installation complete. Of course, if you are not on Kali, you can still grab the latest version of Automater from GitHub.

Automater usage:

Once installed via apt-get, automater is placed in /usr/bin/, which is in the PATH, so you can run automater from any directory.

Single target is an IP:

automater -t 66.249.23.64

Single target is a URL/domain:

automater -t lovedacha.com

Single target is a shortened URL:

automater -e bit.ly/XDlV1q

Target is several IPs and URLs/Domains listed in a file called hosts, outputting to another file called hosts.out:

automater -f hosts -o hosts.out
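For the -f switch, the input file is simply a plain-text list of targets, one per line. As an illustration (reusing the targets from above), a hosts file might look like:

```
66.249.23.64
lovedacha.com
```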

I have some really fun things planned for automater, such as adding a hash search function. I hope you enjoy.

Before getting into Moloch, I wanted to take a moment to say thank you to everyone who has been putting the word out there on Automater lately. Automater has gotten a lot of recognition recently (thanks, Reddit), which has been very motivating.

In this episode of Tektip, we take a closer look at one of the most exciting projects shown at ShmooCon 2013: Moloch.

"Moloch is an open source, large scale IPv4 packet capturing (PCAP), indexing and database system. A simple web interface is provided for PCAP browsing, searching, and exporting. APIs are exposed that allow PCAP data and JSON-formatted session data to be downloaded directly. Simple security is implemented by using HTTPS and HTTP digest password support or by using apache in front. Moloch is not meant to replace IDS engines but instead work along side them to store and index all the network traffic in standard PCAP format, providing fast access. Moloch is built to be deployed across many systems and can scale to handle multiple gigabits/sec of traffic."

"Andy Wick and Eoin Miller are members of AOL’s Computer Emergency Response Team. Andy Wick has more than 15 years of development experience at AOL. He has recently come into the CERT group and has begun developing tools for defense and forensics. Eoin Miller specializes in using IDS and full packet capture systems to identify drive by exploit kits and the traffic that feeds them (malvertising in particular). He regularly contributes the developed signatures to EmergingThreats/OISF and other groups."

Now, I have put a lot of time into MASTIFF lately and haven't had a chance to get Moloch installed and configured properly quite yet. Luckily, the securabit.com team has given me access to their lab, where they have Moloch built out along with many other products. A huge thank you to them, especially Mike Bailey (@mpbailey1911), who took the time to get Moloch installed and configured with a decent amount of traffic pumping through it.

The version of Moloch I am using for this video is 0.7.3. Moloch gives the user an efficient method of browsing, querying, exporting, and visualizing packet data. Commercial products I would say are similar in function include NetScout, NetWitness, and Cascade.

The power of Moloch, at least for what I will be using it for, is the ability to have immediate access to traffic data and pcaps that match custom filters built on fields that are not normally queryable, such as HTTP header information. As Moloch uses a filter syntax very similar to Wireshark's, network analysts will quickly adapt to the product. On the visualization side, there is a Maltego-like feel: it shows how IP addresses and ports relate to each other based on the data you have filtered on.

As Moloch is still early in development, I expect the product will evolve to incorporate even more features. My current Moloch wish list is:

Groups: Have the ability to create groups of IPs, services, and tags, and then query on those groups. An example: create a group for all of your DNS servers and then write a filter to the effect of "IP Source of Not 'DNS Servers' to External on UDP/53".

Save Filters: It would be nice to be able to save filters for future use.

Share saved filters: Share filters with other users.

Enjoy the screenshots, and check out the video for a more in-depth look.

Automater, as most of you know, pulls IP and URL information from various sources in order to make analysis easier on the analyst. Recently, IPVoid changed up their site a bit; because of this, I needed to make some modifications to Automater to get it to function appropriately. To be specific, I needed to modify the regular expressions to match the format of the new site.
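To give a feel for the kind of change a site redesign forces, here is a minimal sketch. The patterns and the sample HTML below are made up for illustration; they are not Automater's actual regexes or IPVoid's real markup.

```python
import re

# Old pattern, written against the previous (hypothetical) table layout.
old_pattern = re.compile(r"Blacklist Status.*?<td>(\w+)</td>")

# New pattern, rewritten to match the (hypothetical) redesigned markup.
new_pattern = re.compile(r'<span class="label">BLACKLISTED (\d+)/(\d+)</span>')

# Sample of the redesigned page: the old pattern no longer matches it.
new_html = '<tr><span class="label">BLACKLISTED 4/38</span></tr>'

match = new_pattern.search(new_html)
if match:
    hits, total = match.groups()
    print(f"Blacklist hits: {hits}/{total}")  # Blacklist hits: 4/38
```

The data on the page never changed; only the surrounding HTML did, which is exactly why a scraper built on regexes is brittle (and why a parser-based rewrite, as mentioned below, is appealing).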

I have some other changes that I will be making soon as well:

Extra export options (csv, html)

Malware Domain List checker

Source engine selection

Re-write to utilize BeautifulSoup

If anyone has any other feature requests please let me know.

To see more about what Automater is and how it functions check out the tutorial.

If the IP or URL has not been previously scanned at IPVoid or URLVoid, the script is supposed to submit the IP or URL and then pull the results. This works most of the time, but on occasion it will not wait long enough to pull the appropriate result. Running the command a second time will work, though.
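Until the wait logic is fixed, the workaround amounts to a simple retry. Here is a hedged sketch of the idea; fetch_with_retry and its timings are illustrative, not Automater's actual code:

```python
import time

def fetch_with_retry(fetch, retries=2, delay=10):
    """Call fetch() until it returns a non-empty result.

    `fetch` stands in for whatever submits the target to IPVoid/URLVoid
    and parses the response; an empty result means the scan has not
    finished yet, so we wait and pull again.
    """
    for _ in range(retries + 1):
        result = fetch()
        if result:
            return result
        time.sleep(delay)  # give the scan time to finish before re-pulling
    return None

# Simulated: the first pull comes back empty (scan still running),
# the second pull succeeds.
responses = iter(["", "4/38 blacklisted"])
print(fetch_with_retry(lambda: next(responses), delay=0))  # 4/38 blacklisted
```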

The -e and -f switches cannot be used together.

URLs that include http:// cannot be scanned; you must take the http:// out for them to work.
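Until that is fixed, stripping the scheme first works around it. A minimal sketch using the standard library; strip_scheme is a hypothetical helper, not something Automater ships:

```python
from urllib.parse import urlparse

def strip_scheme(target):
    """Drop a leading http:// or https:// so the bare host/path can be
    passed to automater; anything without a scheme is returned as-is."""
    parsed = urlparse(target)
    if parsed.scheme in ("http", "https"):
        return parsed.netloc + parsed.path
    return target

print(strip_scheme("http://lovedacha.com/index.php"))  # lovedacha.com/index.php
print(strip_scheme("bit.ly/XDlV1q"))                   # bit.ly/XDlV1q
```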

Please submit any other bugs to 1aN0rmus@tekdefense.com

Upcoming Features:

For those who would like to be able to query just a specific engine or source, such as Robtex, we will be creating an option to do so.

Check IP and/or URL against Malwaredomainlist

Check IP and/or URL against malware sandboxes such as ThreatExpert.

A summary report that will give statistics on the targets, highlighting known-bad information such as blacklists and malicious URL categories.