[ar:int0x80]
[al:Def Con 24 Hacking Conference]
[ti:Anti-Forensics AF]
[au:int0x80]
[by: DEF CON Communications (https://www.defcon.org)]
[00:00:00.50]
>> Alright this is my first time
onstage at Def Con not rapping,
so I am a little bit nervous.
[00:00:05.93]
This is um, Anti-Forensics AF.
Uh, I'm.. when I would see
internet memes that were like
[00:00:11.87]
that's silly AF or that's stupid
AF, I would read it as like
that's silly Anti-Forensics.
[00:00:17.43]
And... [Laughter] My name is Int
Eighty. I'm the rapper in Dual
Core and um this is the fourth
[00:00:22.43]
[00:00:26.90]
Anti-Forensics talk that I've
ever made. Uh so the other
three, they're online, I've
[00:00:31.10]
presented at like DerbyCon and
stuff like that, so you can
watch them on YouTube. And uh,
[00:00:36.93]
here's, here's some stuff that
we're gonna talk about today.
Um, we are uh, we're gonna start
[00:00:42.37]
kind of where I left off with my
previous Anti-Forensics talk and
we'll talk about some self
[00:00:46.30]
modifying code in memory for
Windows executables. Um, this is
like coming from the context of
[00:00:51.30]
being an operator and being on a
compromised system and you have
malware in place running and
[00:00:56.60]
you're trying to prevent your
malware from being acquired and
analyzed. Uh then we'll talk
[00:01:00.80]
about some Android stuff. Um so
I've got some Android Forensics
research that I've done with my
[00:01:05.27]
girlfriend and then also some
Anti-Forensics. And uh I'll
probably win the award for worst
[00:01:10.33]
Def Con demo ever. And then uh,
I have some stuff with SD Cards
but I'm having, uh display
[00:01:15.70]
issues with my Linux laptop. So
if we can't get it to work, uh
we will, uh I don't know, I'll
[00:01:22.13]
be in the vendor area afterwards
so I'd be happy to show you guys
some cool SD card tricks.
[00:01:26.70]
[00:01:30.43]
[Baaahhh] [Laughter] Uh so I
currently work on a red team and
I get shells and write malware
[00:01:35.43]
[00:01:38.17]
uh for, for a living and I come
from a reverse engineering
background so I've never done
[00:01:43.27]
forensics professionally so um
you know, don't, don't really
take my word for all of this. I
[00:01:48.63]
mean it works, but I'm not a
forensics professional. UH, I've
had some really interesting run
[00:01:53.47]
ins with uh law enforcement and
three letter agencies and I'd be
happy to talk about my trolling
[00:01:59.80]
of certain agencies like the FBI
and the NSA. Um in some other
time when I have a drink in my
[00:02:05.23]
hand or something like that so
if you want to find me after the
talk, I can tell you some fun
[00:02:08.83]
stories. Again I'm not an
expert, uh I just like to try
stupid things on computers and
[00:02:13.93]
see what happens and so your
mileage may vary but I'm sure you
guys are all way smarter than I
[00:02:18.67]
am and can come up with way
cooler and more technical
things. And then the last part
[00:02:22.57]
is um kind of a commentary on
our current legal sys, uh system
and um every cool thing I know
[00:02:27.57]
[00:02:29.97]
how to do with computers is
illegal and so I've learned to
do cool things with computers by
[00:02:34.33]
doing illegal things. So I
highly recommend doing illegal
things. [Laughter] [Baaaahhh!!]
[00:02:39.33]
[00:02:45.90]
[Laughter] Alright cool so, uh
we've all known for the past
like decade or so Memory
[00:02:50.13]
Forensics is the new hotness
right? You can get all of the
good stuff out of RAM. And so
[00:02:55.00]
again the context here is you're
an operator, you're an attacker,
you're on a system, you've got
[00:02:58.87]
malware running, you've got an
implant, you've got you know,
persistence. But you know, it's
[00:03:03.47]
expensive to build a tool chain
and so you don't want to get
your malware caught. You don't
[00:03:08.13]
want it to be analyzed, you
don't want IOCs to be coming out
about your malware, you don't
[00:03:12.10]
want it to be detected. So the
whole goal is to keep our
malware running on a system and
[00:03:16.37]
thwart acquisition and/or
analysis. And so in the
previous talk, um I've done some
[00:03:23.20]
tampering, uh for the
acquisition side, and in this
one we'll look at the analysis
[00:03:26.73]
side. With a little bit of
acquisition. Uh like I said,
cool stuff happens in memory, um
[00:03:32.63]
but we kind of take advantage of
this. Right? The analysis tools,
they need all of the data about
[00:03:38.33]
your malware in order to be
successful in analyzing uh what
you, what you've coded. And so
[00:03:43.60]
we don't need all of those
sections, uh once we're loaded
in memory. Um so we can tamper
[00:03:49.27]
them, or remove them or whatever
and therefore our execution still
continues, we persist as
[00:03:54.43]
operators on our target
environment but analysts lose
and can't analyze our malware.
[00:03:59.43]
[00:04:01.90]
Uh and so it's great, we can
basically just, like I said,
either modify or remove bytes
[00:04:06.87]
and the analysis tools
can't do their job. [Baaahhhh!!]
[Laughter] So this first demo is
[00:04:11.87]
[00:04:17.40]
um, a POC that I wrote called
"The keys are like right next to
each other." It's kind of a
[00:04:22.40]
[00:04:26.87]
throwback to the bash.org
quote. It's a common typo, the
keys are like right next to each
[00:04:31.87]
[00:04:35.37]
other. Alright, here we go.
Okay, everybody see that okay?
My text is big enough? >> All
[00:04:40.37]
[00:04:58.10]
good. >> Cool, alright awesome.
Alright so uh I have a malware
sample that I wrote called the
[00:05:03.10]
[00:05:05.93]
keys are like right next to each
other. So it's running. Alright
we can see that it's executing,
[00:05:11.20]
everything's good let's do an
acquisition of memory. So I'm
using uh Rekall, which is an open
[00:05:16.20]
[00:05:19.90]
source framework published by
Google. You can grab it off of
their GitHub. So first what
[00:05:24.30]
we're gonna do is acquire memory
and we're gonna put it in this
file called lol.aff4.
[00:05:28.87]
And you can, you can name
it whatever you want. I just
think it's funny so. [Laughter]
[00:05:33.87]
[00:05:38.73]
I should have like a how to
basic video in here somewhere
probably. How many people are
[00:05:43.20]
coming to the Def Con party
tonight? [Cheer] Awesome. I'll
be rapping at one thirty
[00:05:47.70]
headlining so I'll see you all
there. [Woo!] How many people
have heard the song, " Drink all
[00:05:52.73]
the Booze, Hack all the Things?"
[Woo!] >> Many times. >> That is
exactly how many album sales we
[00:05:58.33]
have. Wow. [Laughter] Thank you
guys. [Laughter] Alright. I
thought RAM was supposed to be
[00:06:03.33]
[00:06:10.30]
fast. How many people first year
at Def Con? [Clapping] Nice!
Thank you for coming. How many
[00:06:15.30]
[00:06:23.07]
people twenty fourth year at Def
Con? [Laughter] Nice! Why did I
do a sixty four bit VM.
[00:06:28.07]
[00:06:37.93]
[Laughter] Alright well. While
that's going, this is what the,
the malware looks like loaded
[00:06:43.67]
from disk. Right? This is
statically loaded, the keys are
like right next to each other in
[00:06:47.90]
IDA. You get a nice call graph,
you see all the subs. You can do
strings. Alright, we can see all
[00:06:52.90]
[00:06:59.37]
this cool stuff. Everything
looks good, right? There's,
there's no real complaints here.
[00:07:03.03]
But, yes okay cool. We finish
our acquisition, so now lets get
that malware. Again, malware's
[00:07:08.03]
[00:07:16.97]
still running, so... Alright so,
uh this basically gives you like
an IPython notebook for your
[00:07:21.97]
[00:07:37.40]
workspace, um in Rekall, and we,
it's really nice, it's basically
Python and you can get tab
[00:07:43.73]
completions. So let's uh let's
knock out this binary. We don't
even have to know the PID, we
[00:07:50.03]
can just say the keys and, let's
give it a dump directory. I'll
just put it right on the
[00:07:55.03]
[00:08:01.70]
desktop. So it's dumping it out
of the acquired ram image now.
Cool. And so there we go, we got
[00:08:06.70]
[00:08:31.23]
a dumped PID forty seven
ninety two. Here's our
executable, forty seven ninety
[00:08:35.30]
two, load it in IDA. Hmm
doesn't, doesn't know what to
make of it. Guesses it's an MS-
[00:08:40.43]
DOS COM file. Let's see what it
says. That does not look the
same. And so, you notice it
[00:08:45.43]
[00:08:49.50]
doesn't even know the segment
names, these instructions don't
look legit and we have just a
[00:08:54.10]
bunch of bytes. So, but
Malware's still running. Right?
So, what does this look like on
[00:08:59.10]
[00:09:04.03]
like for the code? Uh, this is a
C++ file, sixty three
lines, not including comments,
[00:09:09.03]
[00:09:15.53]
and/or including comments. So
all we do is resolve our way
into finding the header in
[00:09:20.53]
[00:09:23.03]
memory and once we've recognized
like yup we've found our header,
our executable header, uh we
[00:09:28.03]
[00:09:30.90]
call VirtualProtect so we can
uh set the right bit for
permissions on the page for the
[00:09:35.60]
header and then we zero off the
memory with this call to
RtlZeroMemory. It's basically just
[00:09:40.30]
memset underneath the hood, but
what you end up with is losing
all the data structure about
[00:09:45.30]
your portable executable at that
point, right? The whole thing is
gone. Uh we reset the
[00:09:50.40]
permissions back with
VirtualProtect, and just in this case,
the POC just loops and, and
[00:09:55.87]
that's it. So you know pretty
simple, but it defeats the analysis tools.
Our malware persists, hasn't been
[00:10:00.87]
[00:10:03.73]
analyzed. Uh, you know you
could, um you could go through
here and try to like, you know
[00:10:09.57]
hit, hit C in IDA and see if
you can figure out like you
know, is this code, is this
[00:10:13.73]
code? Right? It's not data, is
it code? And maybe get some
disassembly of the instructions
[00:10:18.57]
but it's going to be a pain in the
butt and I also don't know any
forensic investigators that even
[00:10:21.87]
have IDA installed to begin
with. So, [Chuckle] I don't
know. I'm hedging my bets. My
[00:10:26.83]
threat model looks like that.
Cool. Uh let's see, I can
actually show you guys the hex
[00:10:31.83]
[00:10:37.27]
editor too. This is, this is
what it looks like in the hex
editor. Just a bunch of zeros.
[00:10:43.90]
Right, whereas in the original
in hex editor looks like a
normal windows executable file.
[00:10:49.00]
So, pretty neat. But if you want
to take this further, you can do
some interesting things to throw
[00:10:53.87]
off the analysis tools right,
like uh two of my, two of my
friends, Richard Wartell from
[00:10:58.33]
Palo Alto Networks and Craig
Smith from Open Garages
independently suggested to me
[00:11:02.23]
like rewriting a new PE Header.
So I don't know if you guys have
ever seen something like tiny PE
[00:11:06.80]
where you could write like a new
executable into like part of the
memory space and you know it
[00:11:11.63]
might throw the analyst off by a
lot. They'd be like oh, it's
just Tiny PE even though it's, you
[00:11:15.80]
know, a ten K file. Or something
like that. Um you can do uh
other things like uh modifying
[00:11:21.87]
values in the header and uh
you'd get some interesting
results aka crashes with the
[00:11:27.10]
analysis tools if uh if any of
you are so inclined to research
that particular vector. Alright.
[00:11:32.10]
[00:11:36.23]
Any, any questions on what we
did with the keys are like right
next to each other for Windows?
[00:11:39.97]
Cool. So the whole reason this
works is, in order to get from
on disk into memory, you need
[00:11:44.97]
[00:11:57.33]
your PE header so you can load
and navigate all the data
structures and get everything
[00:12:01.20]
loaded. Um but once we're in
memory, we don't need the PE
header anymore. Right? We don't
[00:12:05.77]
need to reference back into
there uh into that section, we,
our code is executing, we're
[00:12:09.93]
good to go. So in this case, we
just zero the memory, and we're
still running, we don't, we
[00:12:14.23]
don't need to reference the
section header, but the analysis
tools failed. And so in the
[00:12:19.77]
previous Anti-Forensics talk,
I demoed this in Windows XP uh
before it was end of life and
[00:12:24.00]
here I demoed it in Windows ten,
it still works. [Baaaaahhh!!]
[Laughter] Uh for completeness
[00:12:29.00]
[00:12:32.90]
in case you guys wanna take
pictures of the slides, or
anything like that, I mean I
[00:12:35.77]
guess they're on the CD too but
I don't know anybody that has a
CD drive. Buy Dual Core CDs.
[00:12:40.30]
[Laughter] Alright, this is,
this is what we ran.
[Baaaaaahhh!!] [Laughter]
[00:12:45.30]
[00:12:54.00]
Alright, so then uh you know,
the next question is, well will
it run in Linux? Usually the
[00:12:59.00]
[00:13:02.63]
question is, will it run Linux
but it's an intel chip so it'll
run Linux fine. Alright. Alright
[00:13:07.63]
[00:13:19.50]
cool. So uh here I've got uh you
know Linux port of the code, uh
the keys are like, right next to
[00:13:25.77]
each other, running. This is
just some debugging output.
Alright cool so let's get ram in
[00:13:30.77]
[00:13:35.20]
Linux. Sorry I made this talk a
bit ago and I'm really forgetful
because of context switching so I
[00:13:40.20]
[00:13:45.37]
have all my notes here. Alright.
So uh to acquire ram in Linux,
uh you know in Windows, we use
[00:13:50.37]
[00:13:54.67]
Rekall but we're going to use
LiME, which is an open source
tool published by the
[00:13:58.13]
504ensics Labs guys. You can grab
this off of GitHub as well. And
once you build it, it um it
[00:14:04.20]
shows up as a kernel module so
in this case it's this uh
lime-generic.ko, which matches
[00:14:10.53]
the version number of your Linux
kernel. And we'll specify a
path to dump it out to. So
[00:14:15.53]
[00:14:23.03]
great, all we're doing is just
loading, loading this kernel
module and it's gonna acquire
[00:14:28.93]
ram for us. Are you kidding me?
[Laughter] Oh maybe I didn't
build it. Let's build it.
[00:14:33.93]
[00:14:36.50]
[Laughter] Uh. Woo! [Laughter] I
really didn't know what was
gonna happen there. [Laughter]
[00:14:41.50]
[00:14:53.53]
Alright, that one worked. Good.
Alright so. Great, we have a
memory acquisition. Let's check
[00:14:58.53]
[00:15:11.30]
it out with volatility. Oh God,
I hope this works too.
[Laughter] Alright, so. This is
[00:15:16.30]
[00:15:22.40]
uh Volatility, um, by the
Volatility Foundation.
Open source framework written in
[00:15:26.57]
Python. Does a lot of cool
stuff. Uh one thing I'll go over
with more verbosity in the
[00:15:31.23]
slides is um building Volatility
for uh your Linux setup. Um by
default it doesn't come ready to
[00:15:36.23]
[00:15:38.40]
run on Linux uh but it's pretty
straightforward to get it, to
get it going. So let's... let's
[00:15:43.40]
[00:15:46.30]
take a look at our memory
acquisition here. Alright first
let's find our PID. These are
[00:15:51.30]
[00:16:03.93]
just modules that weren't
installed. Oh God. Oh there it
is. Phew. [laughter] Alright.
[00:16:09.50]
Ninety eight oh one. And just to
confirm. Yup, we're still, still
running. Still doing evil stuff.
[00:16:14.50]
[00:16:24.13]
[Laughter] It's false
advertising by the way. It's
just printing. >> It's pretty
[00:16:27.43]
evil. >> Okay. So, looks, looks
like it dumped, right? Outside
of the module. Module said it
[00:16:32.43]
[00:16:37.70]
tried to load. Uh everything
else fine, no complaints, right?
We're good to go. Four, four
[00:16:42.77]
hundred thousand hex looks like
a solid base address for
standard Linux image. So...
[00:16:48.40]
let's take a look. Uh, there it
is. Okay cool. So here's our
process dumped out in memory. Oh
[00:16:53.40]
[00:16:57.00]
that's weird, it's empty. Ah ha
sweet, we won. Zero file size.
They couldn't... Forensics
[00:17:02.00]
[00:17:06.23]
investigator can't get our
malware out in memory. Good job
us. [Yay, laughter] [Applause]
[00:17:11.23]
[00:17:18.83]
Uh, let's take a look at what
this one looks like. So it's
pretty much the same thing. A
[00:17:23.90]
little extra debugging output.
Uh.. This code here, lines
sixteen through twenty is bad.
[00:17:29.30]
And I should feel bad.
[Laughter] Uh, I'm literally
like taking a short cut to, to,
[00:17:34.23]
I'm using fscanf and reading
out of the proc maps to find the
header in memory. Um there's
[00:17:39.70]
probably an actual legit way to
do this but I'm lazy and this
worked so. Uh Cool. So we find
[00:17:45.17]
the header, just like we did in
the Windows version. Uh in this
case instead of VirtualProtect,
[00:17:50.40]
we call mprotect, uh
again setting permissions for
all three, read,
write and execute. Uh then we
right and execute. Uh then we
call memset, zero out the
header in memory and call
[00:18:02.17]
mprotect again to restore the
original permissions. So if you,
if you caught this right at the
[00:18:07.53]
moment that the overwrite
was happening, it might look
weird seeing the header with all
[00:18:11.13]
three read, write, execute
permissions but if you catch it
afterwards, which you will,
[00:18:15.30]
maybe. Um, then you'll see it
look like normal with only read
and execute. And then uh this is
[00:18:20.30]
[00:18:22.77]
just more debug output and here
we go. Just infinitely looping,
doing evil stuff. So, that's all
[00:18:28.00]
it takes. Hack the planet.
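The Linux PoC just described can be sketched as one self-contained C function. Two assumptions worth flagging: this uses the same lazy fscanf-on-proc-maps shortcut from the talk to find the image base, and it wipes with plain read/write protections rather than the PoC's read/write/execute.

```c
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

/* Find our own load address from the first line of /proc/self/maps
 * (the fscanf shortcut from the talk), then zero the page holding the
 * ELF header. Execution continues fine without it; only the analysis
 * tools care. Returns the first header byte after the wipe (0 on
 * success), or -1 on error. */
int wipe_own_elf_header(void) {
    unsigned long base = 0;
    FILE *maps = fopen("/proc/self/maps", "r");
    if (!maps)
        return -1;
    int ok = fscanf(maps, "%lx-", &base);   /* start of the first mapping */
    fclose(maps);
    if (ok != 1)
        return -1;

    unsigned char *hdr = (unsigned char *)base;
    if (memcmp(hdr, "\x7f" "ELF", 4) != 0)
        return -1;                          /* sanity-check the magic */

    long pagesz = sysconf(_SC_PAGESIZE);
    /* The talk's PoC sets read/write/execute here; read/write is enough
     * to wipe, and is less likely to trip W^X policies. */
    if (mprotect(hdr, (size_t)pagesz, PROT_READ | PROT_WRITE) != 0)
        return -1;
    memset(hdr, 0, (size_t)pagesz);         /* ELF header is gone */
    mprotect(hdr, (size_t)pagesz, PROT_READ);
    return hdr[0];
}
```

Call it from any running process: the process keeps executing, but a dumper that expects an ELF header at the image base, like the Volatility procdump in the demo, gets zeros.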
[Laughter] Cool. So you know we
did, we did get an acquisition
[00:18:33.00]
[00:18:41.53]
with Lime, um we didn't tamper
the acquisition too much but we
ended up not being able to
[00:18:46.40]
extract the actual binary
with Volatility. So, good job.
[Baaaahhhh!!] [Laughter] This
[00:18:51.40]
[00:18:55.80]
works for the same reason that
the Windows one works. Uh we
don't need the executable
[00:18:59.03]
header, right so, Windows uses
portable executable, or PE. And
Linux uses ELF which I don't
[00:19:04.27]
know what it stands for but who
cares? And uh, we do the same
thing. We zero the header, and
[00:19:09.30]
since we don't need it once
we're in memory, it continues to
run but the analysis tools fail.
[00:19:13.03]
[00:19:15.67]
So this was my uh this was my
win on the fly: I successfully
built LiME with a make command.
[00:19:21.30]
Good job, me. Uh the real bread
and butter here is the insmod,
so inserting the kernel module.
[00:19:26.30]
[00:19:28.90]
And um uh pointing it at the
output path and giving it the
format. And this is the
[00:19:34.23]
volatility stuff, so if you want
to install it straight out of
their GitHub, just do a Git
[00:19:40.97]
clone um, python setup.py
install, pretty normal. Although
I always forget that if I don't
[00:19:46.30]
do it, like, if I always use pip
for a while and then I'm like
wait, how do I do this manually?
[00:19:50.20]
Uh that's why I put that line in
there. And then um, this is
building the Linux profile.
[00:19:55.10]
Right, like I said, volatility
out of the box, is not gonna
work on a Linux VM or a Linux
[00:19:58.33]
system. So you need to build a
profile for the actual system
that you're gonna be um, uh, um
[00:20:03.67]
acquiring the ram from. So uh,
you'll go into the tools/linux
subdirectory in the volatility
directory, run make and it will
directory, run make and it will
create this module.dwarf
file. And if you run head on it,
[00:20:15.07]
the first line you should see
should be .debug_info and
that will tell you that you did
[00:20:18.43]
a good job. And I think all of
this is in their GitHub anyways.
[Bahh!] [Laughter] Uh, and then,
[00:20:23.43]
[00:20:28.80]
um once you built the module for
that dwarf file, you copy it
into the volatility file system
[00:20:33.80]
where it wants it and then you
can verify that you've got it by
running that first volatility
[00:20:37.57]
command that I ran and grepping
for Linux and that will tell
you, uh that you've got a
[00:20:41.63]
profile for Linux now. Then we
did pslist to find our process
ID and then uh we did procdump
[00:20:46.63]
[00:20:51.70]
and that tries to dump out the
image of our malware and fails.
[Baaahhh!] [Laughter] Alright,
[00:20:56.70]
[00:21:02.13]
uh Android stuff. Um before I
get started on the Android
stuff, I will say uh that none
[00:21:07.13]
[00:21:11.90]
of this addresses any of the
Qualcomm TrustZone issues or
like the deriving of the hardware
[00:21:16.27]
encryption key from that. The
premise is in my threat model,
I'm not going up against
[00:21:20.53]
somebody that... or um the
people that I might be up against
don't care about me enough that
[00:21:24.23]
they're gonna send my Android
device off to Israel for
cracking. [Woo!] Alright, cool.
[00:21:29.23]
[00:21:32.80]
So uh, also: use Tor, use
Signal. For this, I followed the
grugq on Twitter. Planning
[00:21:37.80]
[00:21:42.47]
Thanksgiving, use Tor, use
Signal. [Laughter] I like that
one because I'm allergic to
[00:21:47.47]
[00:21:58.73]
nuts. My tree nerd credibility.
The retweet in here is I want to
eat a pint of Cherry Garcia ice
[00:22:03.73]
[00:22:07.17]
cream should I use a bowl or
not? Use Tor, use Signal.
[Laughter] Selling social
[00:22:12.17]
[00:22:15.23]
security numbers for bitcoin
please contact me on my XMPP and
we'll discuss further. Use Tor,
[00:22:20.23]
[00:22:29.10]
Use Signal. [Laughter] [Baahhh]
[Baaaaahhhhh!!!!] [Laughter]
Okay, I'm really sorry. I'm just
[00:22:34.30]
gonna say it, I've had like this
research for awhile and really
the only reason I made this talk
[00:22:39.50]
'cause I thought the format for
screaming goats in slide
transitions was hilarious. And
[00:22:43.70]
so I'm so sorry to everybody.
[Laughter] Okay so this whole
premise of Android stuff is all
[00:22:48.70]
[00:22:53.77]
about using encryption right?
And so to understand why using
uh, encryption is so
[00:22:57.77]
important for your Android
device, uh we're gonna talk
about the acquisition and
[00:23:03.17]
analysis process of Android
Forensics first. TLDR it sucks.
[Bahhh!] [Laughter] Okay, so
[00:23:08.17]
[00:23:13.10]
Android Forensics, is not the
easiest thing. Um anybody here
done Android Forensics? Few
[00:23:19.37]
people. Nice. Alright so, this
is the way that my girlfriend
and I figured out how to do it
[00:23:24.53]
if you've got like flashy
hardware and budget and stuff
like that, it's probably way
[00:23:28.43]
easier. But uh, you have all
these, these like um blockers in
place that you need to work through in
[00:23:35.07]
order for your kill chain to be
successful. So if you want ram,
you have to be running a, uh, an
[00:23:40.80]
Android kernel or a Linux
kernel that allows loadable
kernel modules. Because you can
[00:23:46.00]
use LiME to acquire RAM on an
Android as well but out of all
of the Android devices I looked
[00:23:51.37]
at none of them have a kernel
that allows loadable kernel
modules. Which I mean is good,
[00:23:55.53]
for my personal usage I wouldn't
want that, right? Um so, you're
probably going to lose at memory
[00:24:00.20]
forensics right from, right off
the, right off the bat. Uh the
way we did acquisition, was by
[00:24:05.83]
cross compiling netcat um for
arm and then placing it on to
the device which already your,
[00:24:11.40]
you know if you do uh
traditional forensics, you're
only read only. Right? Your
[00:24:15.93]
method is read only. And so
you're already writing on to the
device that you're gonna be
[00:24:20.13]
doing read only from. And
there's a bunch of different
interfaces that get exposed
[00:24:23.80]
based on your build of Android.
So, in order for successful
acquisition of RAM, you need the
[00:24:28.80]
[00:24:30.90]
device to be powered on, the
device to be decrypted,
unlocked, rooted, and USB
[00:24:35.33]
debugging enabled. That's if you want a
full physical acquisition of the
nand storage. So guess what, if
[00:24:41.97]
you're encrypted, then you
already killed it at the
second, second step. Then if you
[00:24:46.83]
want ram acquisition, then you
need the loadable, loadable
kernel modules. [bahhh] [bahhh]
[00:24:51.83]
[00:24:58.77]
[bahhh] [Laughter] Okay, uh so,
um this is like kind of the
verbose notes of um doing
[00:25:03.77]
[00:25:11.13]
Android forensics. Uh once
you've attached your device to
your acquisition machine, uh you
[00:25:15.60]
can run ADB devices and that
will show you a list of Android
devices that are attached. Then
[00:25:20.80]
once you've got your cross
compiled netcat binary, you
push it up onto the Android
[00:25:24.93]
device. So that's just adb push.
Then you set up a netcat listener.
This is like the weirdest thing
[00:25:29.90]
I've ever seen in forensics. So
you're forwarding all of the
acquired data over a TCP
[00:25:35.17]
listener. In this case port four
four four four. And if you're
playing a drinking game and you
[00:25:40.10]
have to drink on the word four,
you'd be in a lot of trouble
right there. Uh then you spawn a
[00:25:46.67]
shell, call su. Uh that's the
part of the device being rooted.
Copy the net cat binary over
[00:25:53.10]
into /dev/nc and chmod it
for executable bits. And then
you would DD. And we're all
[00:25:59.27]
familiar with DD, so that's
fine, nothing crazy there. Uh
but you pipe it into netcat
[00:26:05.23]
over your listener, [Laughter]
And then you acquire it over the
netcat connection. It's all USB
[00:26:10.30]
right, it's not like going out
over the air or anything like
that. It just, it just seems
[00:26:13.37]
like such a weird set up. And
then uh SHA256 sum it and make a
back up copy and SHA256 on the
[00:26:18.60]
back up copy. [Goat Simulation
Video] [Laughter] Goat
simulator. Uh okay so one of the
[00:26:23.60]
[00:26:33.43]
weird things that I found uh, um
my girlfriend and I found during
our research on Android
[00:26:37.77]
forensics was you'll find NAND
exposed by different interfaces.
Um so if you look in proc
[00:26:42.73]
partitions that'll tell you
where you can acquire from but
um I've seen it under these,
[00:26:47.73]
[00:26:49.83]
these ones that are listed:
/dev/block/mmcblk, /dev/mtd,
/dev/mtdblock, /dev/emmc, and I
[00:26:56.10]
thought I was being clever there
with that C++ no
comment. [Laughter] Dad jokes.
[00:27:01.10]
[00:27:13.60]
[Goat simulation video.]
[Laughter] Uh if you wanted,
that's if you want physical
[00:27:15.60]
acquisition, if you want logical
acquisition, it's way easier
right? You don't need root
[00:27:18.20]
necessarily, you can just plug
the device up. You'll need USB
debugging enabled but you can
[00:27:22.80]
just do an adb pull. There's like
a bunch of facilities that are
available via ADB and the
[00:27:27.13]
Android SDK that you can, you
can acquire data off of a target
Android device. I thought ADB
[00:27:33.80]
back up one was kind of
interesting because it creates
this like jar file and you have
[00:27:38.10]
to put a pin in for it um it
throws like a pin on the device
I think and that then exposes
[00:27:43.70]
your PIN in the bash history. If
you were maybe targeting a
forensic investigator. I don't
[00:27:48.87]
know. [Goat simulation video]
[Laughter] Uh, more stuff dump
state. Like all these things get
[00:27:53.87]
[00:27:58.83]
you different pieces of data
about the device like radio
history, location history. Stuff
[00:28:03.97]
like that. There's also a tool
out there called AFLogical OSE
and it's an open source edition.
[00:28:09.20]
Um it's pretty nice, it can, it
can get you a good acquisition
of data as well. I think there's
[00:28:13.07]
a law enforcement edition. So if
you wanted to impersonate a law
officer you could try to get a
[00:28:17.00]
copy of that. Or be a law
enforcement officer. [Laughter]
Didn't even think about that
[00:28:23.70]
second part. [Laughter] Uh so
yeah. So basically like you know
it takes all that stuff just
[00:28:30.10]
to get an acquisition. It sucks.
And you know, you're writing a
netcat binary onto the device
[00:28:35.73]
you have to be able to justify
all the changes that have
happened to the device. You're
[00:28:38.50]
violating the traditional
forensics methodology. Um and
yeah all this stuff is easy to
[00:28:40.50]
dispute. We've got that super
long kill chain. If the device
powers off, it's over. If it's
[00:28:42.50]
encrypted, it's over. If the USB
debugging's not enabled and you
can't get it unlocked, it's
[00:28:47.50]
[00:28:55.73]
over. [Giraffe simulation video]
It's not even the goat.
[Laughter] Alright so, use
[00:29:01.30]
encryption, right? And um here's
like some examples, in areas
that I thought were applicable.
[00:29:06.13]
So number one, if you're out
operating, or you're
freedom fighting as the
[00:29:10.00]
grugq calls it. Uh, you're not
going to bring your personal
phone with you. You're not going
[00:29:14.47]
to bring your personal device
right? So you leave it at home
but what if you're out freedom
[00:29:17.47]
fighting and you get raided by
law enforcement while you're
out. You don't want them to
[00:29:21.07]
acquire all the evidence off of
your device. Uh also like um I
may or may not know people that
[00:29:26.07]
[00:29:28.10]
build hardware implants for
Android devices and deploy them
while operating. And so uh you
[00:29:33.90]
know if your implant gets found
by a blue team or somebody else
that's not meant to find it. You
[00:29:40.07]
don't want them to acquire
whatever evidence you've
acquired while operating. And
[00:29:44.87]
then um how many people here
knows, know Koz? So Koz has a
penchant for smashing cell
[00:29:50.73]
phones and initially I put this
is as a shrug but when I looked
at it in the context of Android
[00:29:54.43]
it looks like [inaudible 29:58]
throwing his cell phone on the
ground. [Laughter] Alright,
[00:29:59.43]
[00:30:01.70]
cool. So my thought was if I'm
leaving an Android device
somewhere and it's got full disk
[00:30:06.50]
encryption and it's running, and
it gets acquired by somebody
that I don't want to acquire it.
[00:30:11.53]
All I really want to do is just
turn off the device. It's
powered off, everything
[00:30:17.20]
encrypted at rest. I win. Kill
chain's broken. Forensics
Investigator gets nothing. And
[00:30:23.70]
then of course, lawyer up. And
that's after you've deleted
Facebook and hit the gym.
[00:30:28.50]
[00:30:30.60]
[Laughter] Oh before actually.
Lawyer up first. Okay cool. So
you have all these awesome
[00:30:34.97]
sensors available to you in um,
in your Android device, right?
You've got Bluetooth, Cellular,
[00:30:40.27]
etcetera. So my thought was, you
know, you could pair to a
Bluetooth device in your house,
[00:30:46.73]
and if all of a sudden the
device becomes unpaired. Turn
the phone off. Now it's encrypted
[00:30:51.87]
right so, the FBI comes they put
your device in a Faraday bag.
You become unpaired or you go
[00:30:57.33]
off the cell network. Right?
Turn the phone off. Encrypt it.
Uh set the geofence, if it
[00:31:02.57]
wanders outside the geofence
using the GPS, turn it off. Must
have walked away on its own. So
[00:31:08.43]
um, so yeah, so I'm just
leveraging the facilities
available to me on the Android
[00:31:12.10]
device. [Baaahhh] [Laughter] So
I wrote this tool called Duck
the Police. [Laughter] And it's
[00:31:17.10]
[00:31:27.20]
just an Android app and I and I
suck at programming as you could
probably tell from the other
[00:31:31.00]
source code so. Um it's not,
it's not amazing but it does
turn the phone off. [Laughter]
[00:31:36.07]
So um, so yeah so, here here's
my entry for the worst Def Con
demo ever. So... [Laughter] So
[00:31:42.20]
you're just kind of gonna have
to take my word for it.
[Laughter] Uh, so I've got my
[00:31:47.80]
Duck the Police app here with
the, uh, Dolan icon. And I select
movement, and I Duck the Police.
[00:31:52.80]
[00:31:55.27]
There you go. And set my phone
down and I pick it up and it's
off. And it's encrypted now. So
[00:32:00.27]
[00:32:04.47]
now Law Enforcement gets no
evidence off of my device. That
was the easiest solution I could
[00:32:08.80]
come up with and like right at
the top of my coding capacity.
[Applause] And uh this is the
[00:32:13.80]
[00:32:22.10]
only meme that I could find that
said Duck the Police on it. And
these were my next best ones.
[00:32:27.43]
[Laughter] That one I love
because a former coworker of
mine he was like oh yeah, you
[00:32:32.43]
[00:32:38.60]
know like I can pick the boots
on cars. He ca.. I can pick the
lock on those. He's like the
[00:32:43.60]
[00:32:49.33]
police charge a hundred and
twenty five dollars to take them
off, I charge seventy five.
[00:32:53.13]
[Laughter] So much chaos on the
internet. [Laughter] So yeah, so
that's uh that's like my Android
[00:32:58.13]
[00:33:16.60]
uh Android stuff and uh again
like those are my ideas for
scenarios but the device turns
[00:33:22.43]
off and now all of your evidence
is encrypted and again not
taking into consideration the
[00:33:27.60]
TrustZone and, uh, Qualcomm
TrustZone stuff. [Baaahhh]
Alright so I was gonna play CTF
[00:33:32.60]
[00:33:36.83]
with you guys but unfortunately
the Demo Gods are not with me on
the display from my Linux
[00:33:41.90]
laptop. So I'm happy to demo
this, I'm gonna be, I'm vending
with Hak5, they're the bros
[00:33:46.53]
in the vendor area that have the
huge pineapple. So if you wanna
come by or find some other time
[00:33:50.50]
during the conference, um, I can
show you this in person. But, uh
basically this is like some cool
[00:33:56.33]
stuff that Craig Smith from Open
Garages put me on to with SD
cards. And this was to prevent
[00:34:02.70]
me from going too far into the
slides before showing the
answers. [Zebra screaming]
[00:34:07.70]
[00:34:10.67]
[Laughter] But basically, the
CTF setup is I have an
SD card in my laptop and it's got
[00:34:14.33]
a text file on it and you cat
the text file and it says the
rules are simple, just add your
[00:34:17.83]
name. Unmount the text, or uh
unmount the SD card. Uh remove
it from the laptop, plug it back
[00:34:23.70]
in and verify that your name is
still there. And what ends up
happening is, uh you, you can
[00:34:28.47]
you know mount the SD card,
write your name on to the um text
file, you unmount it, no
[00:34:35.00]
complaints at all, you remove
the SD card and you put it back
in, and your name's not there.
[00:34:39.00]
And so, the CTF is like we go
back and forth, like what would
you try. Like people are like
[00:34:43.47]
try like chmod, zero, zero,
zero, zero. You know like all
zeros, no read, no write, no
[00:34:48.67]
execute bits. That doesn't work.
Try chattr. You make
it plus a or plus i, and that
[00:34:54.17]
doesn't work. Um and so what it
is is actually uh modification
uh in working with the firmware
[00:35:00.37]
on the Sd card so there's an
open source tool called SD tool
and um what it can do is lock
[00:35:06.40]
and unlock the device. And this
is aside from the physical lock
that's on the device. And so the
[00:35:11.33]
writes all happen in memory so
to the underlying OS, everything
looks good, everything's fine, but
[00:35:15.93]
then below that the firmware on
the SD card is preventing, or not
[00:35:20.70]
allowing, the writes on to the
actual storage. So as we all
say, no logs, no crime.
[00:35:25.70]
[00:35:29.93]
[Laughter] [From audience: So
say we all.] And so it's um,
it's pretty neat. Uh there's a
[00:35:32.37]
couple caveats uh so like if
you've got like a USB hub, it
may not work. It might expose it
[00:35:37.40]
as like um like a mass storage
device. Uh but if you have a
direct MMC device, it should
[00:35:43.37]
work. You might need to go
through different SD card um
cages but one of them should
[00:35:48.37]
[00:35:54.20]
work. [Bahhh] [Laughter] Uh and
in the example scenarios, where
I thought this would be good
[00:35:58.00]
also like building your own
hardware implants running
off of an SD card, uh anybody that
[00:36:03.47]
does like PORTAL of Pi or
anything similar to that um you
can do the same since it's a
Raspberry Pi running off an SD
Raspberry Pi running off an SD
card and if you make your like
attack infrastructure, attack
[00:36:12.40]
VMs running off of SD cards,
none of your logs get written to
storage. That's kind of neat.
[00:36:17.40]
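The CTF behavior above, writes look fine to the OS but vanish on remount, can be modeled with a toy Python class (this is only a simulation of the observable behavior, not how sdtool or the card firmware actually work; all names here are made up):

```python
class LockedSDCard:
    """Toy model of the SD-card lock trick: while the card is
    locked, writes land in a RAM-backed view, so the OS sees them
    succeed, but the flash underneath never changes. Illustrative
    only; real cards do this in firmware, driven by sdtool."""

    def __init__(self, contents):
        self.flash = dict(contents)  # what is really on storage
        self.locked = False
        self.mounted_view = {}

    def lock(self):       # what `sdtool lock` asks the firmware for
        self.locked = True

    def unlock(self):
        self.locked = False

    def mount(self):
        self.mounted_view = dict(self.flash)

    def write(self, path, data):
        # Succeeds from the OS's point of view either way.
        self.mounted_view[path] = data
        if not self.locked:
            self.flash[path] = data  # only persists when unlocked

    def remount(self):
        # Pull the card and plug it back in: the view is
        # rebuilt from flash, and locked-era writes are gone.
        self.mount()
        return self.mounted_view

card = LockedSDCard({"ctf.txt": "rules: just add your name\n"})
card.lock()
card.mount()
card.write("ctf.txt", "rules: just add your name\nint0x80\n")
print(card.mounted_view["ctf.txt"])  # your name appears to be there
print(card.remount()["ctf.txt"])     # and it's gone after remount
```

Same story for the attack-VM scenario: everything the OS logs "writes" while locked only ever exists in volatile memory.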
>>[Video: I still don't know
what this. What is that?] >> How
many people have seen this
[00:36:22.97]
Ladder Goat? [Laughter]
[Screaming sounds] >> Get up
there. [Laughter] [Video: Oh
[00:36:27.97]
[00:36:33.13]
you, Ladder Goat. You're so
random. ] [Laughter] >> This is
how you use sdtool, um, uh you
[00:36:38.13]
[00:36:42.27]
just basically point, once you
build it, uh point it at the um
MMC device and then you can get
[00:36:47.70]
the status or lock and unlock
it. Uh quick note, I switch my
make file to use clang instead
[00:36:53.00]
of uh GCC. Built, uh GCC gave me
errors, but clang was able to
build okay. [So hello from the
[00:36:58.00]
[00:37:02.10]
other side. I must have called a
thousand ti.. [bahhhh] ...mes. ]
[Laughter] [To tell you, I'm
[00:37:07.10]
[00:37:19.13]
sorry. ] [Laughter] [Applause]
[00:37:23.13]