[ar:The Bob Ross Fan Club]
[al:Def Con 24 Hacking Conference]
[ti:Propaganda and you (and your devices)…]
[au:The Bob Ross Fan Club]
[by: DEF CON Communications (https://www.defcon.org)]
[00:00:00.77]
>> How's everyone else doing?
It's the last talk of the day.
[cheering] Still got a lot of
[00:00:05.70]
energy? You know, I was, ah, the
first day I was streaming the
"101" track and there was a lot
[00:00:10.70]
[00:00:15.70]
of people that were, uh, you
know, coming up here and, and
encouraging people not to go to
[00:00:19.50]
the talks. They were like "Oh,
just... it's always uploaded on
YouTube, why would you go to the
[00:00:23.00]
talks?" And I'm usually one of
those people, I'm usually the
person that thinks that you're
[00:00:27.37]
crazy if you're gonna come do a
talk and stand in all those
lines. Mostly, it's the lines!
[00:00:32.23]
Being from Wyoming, lines aren't
a thing, so, like, it makes my
heart hurt when I see people
[00:00:37.23]
standing in the line. [pause]
Uh, but, I dunno, to encourage
people not to come to a talk,
[00:00:42.23]
[00:00:46.27]
that's, uhm, that's not, uh...
I'd say it's misguided cause
there's something about being a
[00:00:52.80]
part of an audience that's just
electric! And, you know, this is
no surprise, you know, you think
[00:00:57.53]
about any sort of sitcom that's
out there, what do they do? They
pipe an audience into your living
[00:01:03.23]
room, they're piping, uh,
soundtracks into you, or, uh,
uh, canned laughter into you
[00:01:07.73]
just to make you feel that
you're doing the right thing. To
make you feel like "Oh, I
[00:01:10.90]
shouldn't have to go make real
friends, uhm, I've, I've got all
the friends I need right here in
[00:01:14.83]
my living room." [audience
noise] So, I mean, it kinda
makes you wonder, it's like...
[00:01:19.40]
Oh, so first off - yea, thank
you, thank you for, uhm, braving
the, uh, colds and sinus
[00:01:24.90]
infections and whatnot and,
uh... [sniff] Being here with me
today. Uhm... uh, yea so, uh,
[00:01:29.90]
[00:01:33.57]
yea.. There's something electric
about being in the audience and
it, it makes you think, it's
[00:01:36.57]
like... What is that? What is
that that is programming us to
like, to yearn to be a part of
[00:01:41.57]
these social, uh, interactions,
and, you know, if, if things
like laughter, canned laughter
[00:01:46.80]
can program us to act
differently then just how open,
how vulnerable are we? [pause]
[00:01:52.80]
And I thought, yea, well, let's
just examine one of the hardest
cases that we know about and I'm
[00:01:57.90]
talking about kernel version
Helen Keller. She was blind,
deaf and mute and look at
[00:02:03.80]
that... she's still reading.
You're still piping input into
her - teaching her who knows
[00:02:08.30]
what. [chuckles] [laughter] So
it's like if she's, she's the
most locked down system I could
[00:02:15.17]
think of and if she's not locked
down, how wide open are we?!
And, you know, I'm joking but at
[00:02:19.53]
the same time I'm really not
because we can't stop stuff from
coming into our eyes; we can't
[00:02:23.07]
st... stop stuff from coming
into our ears, mostly. You walk
around Vegas, you're bombarded
[00:02:26.73]
with stuff. [sniff] [pause] So,
yea... What's my motivation? I'm
a software engineer by day but
[00:02:31.73]
[00:02:33.83]
I'm a frustrated consumer by
night, a media consumer. You
know, it's one of those things
[00:02:38.77]
that, that you just feel icky,
you feel that, like, propaganda
is, is all over you, you know,
[00:02:42.53]
and you can't escape it. Uhm, so
what can you do about it? You
figure that the people who can do
[00:02:48.47]
something about it are the ones
that are doing the propaganda to
begin with. You know, you can't
[00:02:52.53]
expect anybody else to, you
know, come to your rescue for
you so I thought "Well, Python
[00:02:57.63]
solves a lot of my other
problems, could Python solve
this problem for me as well?".
[00:03:01.50]
So, I guess it's time to take a
journey. It's time to fight your
friends in alleyways...
[00:03:06.50]
[00:03:09.30]
[laughter] ... tell them to put
on the glasses. And if you don't
get this, you haven't watched
[00:03:13.20]
"They Live" and that's your
fault, you're wrong - not me.
[laughter] So let's define
[00:03:19.33]
propaganda, propaganda's defined
in this talk, it's, uh, real
simple, you have node A, you
[00:03:21.63]
have node B and you have a
communication stream or data
stream. This data stream is an
[00:03:23.63]
attempt from node A to get node
B to behave in such a way that
directly benefits, or get node B
[00:03:28.63]
[00:03:31.93]
to act in a way that directly
benefits node A. It's real
simple... [pause] And it goes by
[00:03:36.93]
[00:03:43.33]
many names, you know,
advertising, lobbying, social
engineering - all of Las Vegas
[00:03:49.30]
essentially, it's all one big
propaganda. [chatter] Ugh, go
away... [laughter] [pause] Yea,
[00:03:54.30]
[00:04:05.57]
so propaganda is defined by this
talk, it goes by many different
names and, uhm, you're probably
[00:04:09.40]
thinking "Oh, I do some of
these... am I a propagandist?"
Yea, you are. We all are, we all
[00:04:12.53]
propagandize a little bit, for
sure. [sniff] And, it's, uh,
it's a level like what we're
[00:04:16.63]
willing to tolerate, what we as
an individual, what we as a
society, what we're willing to
[00:04:20.17]
tolerate as far as what we're
able to do with, or we're, what
we find acceptable in
[00:04:24.67]
propaganda. But what about the
word? I'd like to give a little
history of the word, uhm...
[00:04:31.13]
because none of these, none of
these words are as scary as
propaganda - "propaganda" is the
[00:04:34.73]
only one that really is scary.
Maybe "social engineering" is a
little bit scary. But propaganda
[00:04:39.47]
is the only one that's really
scary, you know, the one that's
always used as a pejorative and
[00:04:43.40]
you gotta ask "Well, what's...
what's behind that?" there's
definitely reason behind that.
[00:04:47.90]
Uh, propaganda you know, you
know, first came into the vernacular
in like, you know, the mid, uh,
[00:04:53.37]
in 1622. It was, it stood for
propagating the faith, uh, the
world was getting larger and,
[00:04:58.37]
[00:05:01.03]
like, our ships could sail
further, the world was also
getting smaller and kingdoms and
[00:05:05.37]
factions started to bump into
each other. And so you had,
like, this pope that's like
[00:05:09.50]
"God, I don't wanna deal with
all this sh** that's on the
outer reaches of my kingdom, I'm
[00:05:13.50]
just gonna...". He didn't say
that, he speaks Latin, and uh...
[laughter] He, uhm, he, uh, he,
[00:05:19.00]
you know, he.. he stood up this,
uh, you know, propagation, this,
uh, propagation of faith and it
[00:05:22.83]
became effective almost
immediately. In fact, like, the
guy that was in, you know, the
[00:05:28.10]
cardinal that was in charge of
the propagation, you know,
committee, you know, he was, uh,
[00:05:32.23]
he was known as "The Red Pope",
he was essentially an ad hoc
pope for all the external
[00:05:35.77]
kingdoms out there - very
powerful. But even then, I mean,
it, it, it wasn't a pejorative,
[00:05:40.87]
it didn't become a pejorative
until, like, as in a negative
connotation until World War I,
[00:05:45.37]
after World War I. What's going
on there? [audience noise] Well,
World War I is, if, you know, if
[00:05:50.50]
you.. I've listened to, uh,
Dan Carlin's,
uhm, World War I history, uhm,
[00:05:56.80]
if I, I highly recommend it.
But, uh, you know, he, he brings
out World War I was the first
[00:06:02.10]
war, you know, major war in the,
in, uh, uhm, since like the
Napoleonic era. So you had this
[00:06:08.47]
group of people, who, you know,
who weren't used to fighting,
and you had to get them spun-up,
[00:06:13.40]
you had to get them hating the
other people. And Hitler wrote
in Mein Kampf, he wrote a whole
[00:06:17.10]
chapter on war propaganda and
he's like "This is where we
failed, the English were, the
[00:06:20.63]
English and the Russians they
were way better at this and
we've got to do a lot better".
[00:06:23.67]
It was at this time also, and,
and... Hitler called it
propaganda, you know, he, he
[00:06:27.90]
loved the word. And so that's
why it became a bad word here.
[sniff] Edward Bernays, uhm,
[00:06:33.03]
uh... you know he's a father of
modern day propaganda, he wrote,
he looked at the same time
[00:06:38.73]
period and he actually wrote in
his book "Propaganda", he
lamented, he goes "Propaganda is
[00:06:42.50]
such a fine word and the Nazis
went and ruined it..." he's like
"Well, we've gotta come up with
[00:06:46.90]
a different word." And the word
he came up with was "Public
Relations", you know, that's the
[00:06:50.67]
term... [laughter] Did the
Germans learn their lesson?
Well, after World War II... was
[00:06:55.67]
[00:06:59.00]
over, the, the liberators were,
however you wanna look at them,
were coming through, uh,
[00:07:03.00]
Germany, and they came across,
the, this, uh... A man named
Reinhard Gehlen... And
[00:07:08.00]
[00:07:10.30]
they looked at his, uh,
anti-communism propaganda and
they were like "Man, this is
[00:07:14.17]
good! Can you do this in
English?" And he was like "Ja...
I mean Yes...". [laughter] "I
[00:07:19.20]
can totally do that". And, uhm,
it was very effective, we have
this, uh, retired CIA officer
[00:07:25.47]
saying that we fed his
propaganda, you know, his anti-,
uhm, uh, anti-communism
[00:07:30.13]
propaganda - we fed it to the
Pentagon, the president's
office, you know, it was just,
[00:07:34.07]
he called it, it was basically
bullshit; it was boogeyman
bullshit. [sniff] [pause] So
[00:07:39.50]
yea, the Germans did lose, did
learn their lesson. And they
still lost the war but their
[00:07:43.43]
propaganda machine was much
better, uhm, yea... Thanks for
sitting through that, uh, brief
[00:07:50.30]
history... [audience noise]
Let's fast forward to, like, we
have the rise of the machines
[00:07:54.37]
with propaganda - what has that
done? Now, you've probably heard
a lot of stuff about, like, how
[00:07:59.40]
much it costs Google to power,
you know, to run their, you
know, just to pay their power
[00:08:03.70]
bill every day. Probably heard
stuff about, you know, Target,
uh, you know, could predict
[00:08:08.57]
the... predict when one of their
customers was pregnant - all
those kinds of stories. You
[00:08:13.83]
know, there's these big numbers,
we're not really quantifying it
with, with our human brains. But
[00:08:18.33]
then I saw this, this, uh, this,
uhm, this really neat article
"The heterotopia of Facebook" -
[00:08:23.33]
[00:08:25.87]
the thing is, Facebook, is a,
uh, doesn't really matter, this
could be any social, uh, uh,
[00:08:31.30]
network out there; on,
especial.. especially online.
And, yea, go ahead and, go ahead
[00:08:36.50]
and... I highly recommend
reading the "The heterotopia of
Facebook" and it, it describes
[00:08:41.10]
what is going on with the human
brain very succinctly - we
participate in these social
[00:08:47.47]
media groups. You know, these
"other spaces" - that's what
"heterotopia" stands for.
[00:08:53.87]
Heterotopia is where you can go,
it's a space that exists in real
life, but it's a space where you
[00:08:58.30]
can go and kind of project and
live a sort of, you know, your
own utopia. The thing is, everyone
[00:09:04.20]
else is doing the same thing, so
you can get curbstomped if you
go out there and project your
[00:09:08.70]
idea; you can also get your
boots licked by a sycophant.
What Foucault argues is that,
[00:09:14.30]
like, whatever happens in these
heterotopias, if you're
investing real, uh, uh, parts of
[00:09:20.00]
yourself into it every action
that happens inside of this
heterotopia will have an effect
[00:09:26.33]
on you in the real world.
Because your brain really can't
distinguish between the two.
[00:09:30.47]
It's like "If I was a dumbass in
there, why wouldn't I be a dumbass
outside?" And the answer is...
[00:09:35.13]
[laughter] ... there's no
reason. There's nothing stopping
you, you are a dumbass.
[00:09:38.73]
[laughter] That post? That was a
sh** post... [laughter] Your
whole life's been a sh** post.
[00:09:43.73]
[00:09:47.57]
[laughter] You know?... So yea,
so let's say you have these
heterotopia, you have these
[00:09:51.17]
spaces and you're a propagandist
and you, like, you know, a
salesman comes up to you and
[00:09:55.10]
says "Look, I have these spaces
where people just come and live
their lives and interact with
[00:09:58.53]
all these other people. And it's
like, it's like a direct
pipeline to how they act in the
[00:10:02.10]
real world. Also, we've had
machines and we've recorded
everything, could this be of use
[00:10:06.67]
to you?". "I dunno...
[chuckle]... I dunno if it'll be
worth my time..." says the
[00:10:12.17]
propagandist. Obviously that's
not what they say! We know this
because, uhm, these industries
[00:10:16.83]
are titans - uh, their owners
are billionaires, uh, the people,
you know, are trying to wine and
[00:10:22.80]
dine the owners all over the
place to try and get their
services on, you know, to be
[00:10:26.73]
beneficial to them. So, yea,
these heterotopias, or these,
uh, social media groups, that
[00:10:31.50]
collect all of our data that we
invest very real parts of our
lives in - it's a very
[00:10:36.20]
important, it's very important.
[sniff][deep breath] Let's go
back to 1993, Al Gore was
[00:10:42.57]
talking about, you know, this,
this idea of, like, the gap
between the "information haves"
[00:10:47.47]
and the "information have-nots".
He's warning, you know, if we
don't provide technology to,
[00:10:52.77]
you know, uh, poorer populations
they'll just get left behind.
Now, in his book... I mean,
[00:10:57.77]
[00:11:00.37]
that's fine! That's a noble,
that's a noble thing to say. In
his book "Davis Mod... David
[00:11:05.83]
Shank". He made a, he made a
really good point cause it... he
goes "It's one thing if you push
[00:11:10.83]
[00:11:13.07]
technology on to a people but if
you don't upgrade the people, if
you don't teach the people, if
[00:11:17.83]
you don't give them the proper
education of what it takes to,
you know, uh, to understand what
[00:11:23.30]
these devices are and what data
they're producing, then you're
not helping the information gap
[00:11:28.77]
at all. In fact, you might even
be widening it". So what's it
take to be "information have"
[00:11:33.67]
and "information have-not"?
What's that gap? Well, the
information, to be an
[00:11:37.40]
information have is not that
hard - if you know what a
tracking cookie is, if you...
[00:11:41.53]
you know, you, you know why the
internet is making these people
billions of dollars, if you know
[00:11:47.37]
those things; if you know what
an adblocker is, you're already
in the information haves. If
[00:11:50.73]
you're, you know, one of these
fine outstanding citizens that
need stuff to just work out of
[00:11:55.57]
the box - you're an information
have not. [pause] And, uhm, to,
uh... Whoops. [pause] Yea...
[00:12:00.57]
[00:12:19.73]
[pause] Huh, I guess I can go
to, uh... Do we have any
examples of information gaps
[00:12:23.03]
today? Yea, sorry... I think so!
I think one came up not too long
ago, in the aftermath of the,
[00:12:29.53]
uh, the Orlando shootings, we
had a talk, uh, uh statement
from Hillary Clinton. [pause]
[00:12:35.70]
She promised, you know, as
president "I will work with
great tech companies from
[00:12:39.63]
Silicon Valley to Boston..." She
then goes on to uh, uh, give the
normal stump speech about
[00:12:45.70]
intercepting ISIS's
communication, tracking and
analyzing their media posts,
[00:12:49.17]
blah, blah, blah. We already
know that, there's already been
plenty of discussion about that.
[00:12:52.80]
But, what I find interesting is,
as well as promoting credible
voices who can provide
[00:12:57.63]
alternatives to radicalization.
She's asking Silicon Valley to
team up with them to promote
[00:13:02.63]
credible.. credible voices.
Promote them, you know, whatever
voice you deem necessary. Why do
[00:13:08.70]
I think this is, uh, an example
of the information gap? Because
I don't think she could have got
[00:13:15.57]
away with saying something like
this: "As president I will work
with our great net, uh, New York
[00:13:21.30]
Times' bestseller list,
promoting credible books and
authors who can provide
[00:13:26.43]
alternatives to radicalization...".
And, so, yea, that's the
question, it's, like, why can,
[00:13:32.83]
you know, why can, a, uhm, uhm,
what's the difference between,
you know, Silicon Valley
[00:13:39.10]
promoting stuff on their product
and the New York Times promoting
books on their bestseller... uh,
[00:13:43.70]
promoting books on their
bestseller list? It's one of
those things where, uh, you
[00:13:48.17]
know, some people read this they
read about the government
messing with their books and
[00:13:51.00]
they go "Hmmm, I don't think so.
You stop right there,
government!" [chuckles] Uhm, uh,
[00:13:55.40]
uhm... uh, why can, with social
media, you know, the government,
uh, you know, you
[00:14:00.40]
[00:14:03.00]
know, the presidential candidate
can openly talk about how she
would like, uh, you know,
[00:14:08.07]
private companies to help
promote propaganda. [pause] And
this wasn't a one-off thing
[00:14:15.00]
either, we have this one story
you might have seen, uh, this
was in, uh, "Facebook and
[00:14:19.40]
Twitter pledge to remove hate
speech within 24 hours" - this
was a story that was in Europe.
[00:14:25.33]
And inside this article we see
that companies that also agree
to promote independent
[00:14:30.27]
counter-narratives to fight hate
speech, including content promoting
non-discrimination, tolerance
[00:14:35.50]
and respect. So, we see this, you
know, we see movements where
private, uh, companies are asked
[00:14:41.60]
to, you know, produce
propaganda, you know, and uh,
are we okay with that? [pause]
[00:14:46.60]
[00:14:49.87]
Alright... so we have, uh, we
have, like, we have this gap -
information haves and
[00:14:55.57]
information have nots. The best
way to get information have nots
into the information haves class
[00:15:00.77]
is education. Just educate
them, just educate an
information have-not into an
information have and it's done!
information haves and it's done!
The problem is that this runs
counter to what a lot
[00:15:10.53]
of, you know, there's a lot of
people that, you know, whose
best interests are in keeping
[00:15:14.47]
information have-nots. You know,
the more information have-nots
there are, the more ads, the
[00:15:19.57]
better ads you can sell; or
target the people. And, uhm, you
know, the more surveillance you
[00:15:24.57]
[00:15:26.87]
can collect on people, so, you
know, there's interests in
keeping the information have
[00:15:30.83]
nots. [audience noise] Now you
say, "Well, if we were to
educate these people, what would
[00:15:36.90]
it be?" Thank you very much,
that was a good transition to
the next slide... [laughter] So,
[00:15:40.90]
I'm, uh, I'm pulling this from,
uh, a, uh, a lot of this is, uh,
uh, propaganda information from
[00:15:45.90]
[00:15:48.30]
this book called "Firewall",
"Firewall: A Propagandist's Guide
to self-defense". Jack Nolan
[00:15:53.57]
was, uh, a PSYOP officer
himself, in the military, and,
uhm, if you don't know anything
[00:15:58.57]
about the military - they have
these things called field manuals
and they are like man pages
[00:16:02.90]
for war. If, you, you ever find
yourself in an apocalypse,
you'll wanna secure a volume.
[00:16:08.53]
[sniff] Now, uhm, he's basically
going through the PSYOP, uh,
uhm, basically his training and
[00:16:14.40]
he puts it all in this book. So,
first off, let's go and, like,
what's the propaganda steps? And
[00:16:20.47]
we see here it's mostly like a
normal, uh, like a pen testing
step. [chatter] You know, you
[00:16:25.53]
start off, like, ID the audience,
you know, you wanna do some
footprinting, you know what I mean?
[00:16:29.33]
You wanna, you know, determine
what the audience's, uh, limits
are, what you CAN get them to
[00:16:35.60]
do; what you CAN'T get them to..
to do. [sniff] You also wanna
determine who their ringleaders
[00:16:40.13]
are. [pause] In today's, you
know, you know, in today's, you
know, social networks we have
[00:16:47.00]
all these likes, we have
followers - it's easy to tell
who the ring leaders are - the
[00:16:51.47]
end-users. But, also, you gotta
rem.. remember that you can also
target moderators, you can
[00:16:57.13]
target sysadmins. A propagandist
may just, I mean, it's way
easier, what are you gonna do?
[00:17:01.90]
You gonna target a group of
people or you can just target a
mod who can censor everything
[00:17:06.23]
for you? If you want, I mean, if
you can get them onto your
side... Uh, the next thing a
[00:17:11.07]
propagandist wants to do after
ID-ing his target audience - he
needs to, you know, uh,
[00:17:17.20]
determine a goal state. Most of
the time goal state is to make
money, you know, but, we've got
[00:17:22.40]
a lot of other things that that
goal state could be, you know,
to, uh, get people to hate the
[00:17:26.00]
other people so you can go to
war. But mostly it's to make
money. The next thing you wanna
[00:17:30.90]
do is scan for vulnerabilities,
on a society, on a group of
people that you'd like to, uh,
[00:17:36.83]
you'd like to, uh, change their
behavior on. These are... you
can tell what a society's
[00:17:42.07]
vulnerabilities are - these are
known as hot-button issues. If
ever you're like "OOH, that's a
[00:17:46.37]
hot-button issue" - that's a
societal vulnerability. That is
something that someone could
[00:17:49.73]
always wedge in there, you know,
wedge a wrench in there and just
try to exploit. This can be
[00:17:53.90]
race, religion, gender,
wealth... you know, these are,
uh, you know things that people
[00:17:59.83]
are real sensitive about. And,
you know, it's up to the society,
it's up to the society to really
[00:18:04.87]
patch these. You can't, you
know, if you're always gonna
leave this vulnerability open
[00:18:08.33]
it's, you know, it's, it, uhm...
It'll always be exploited.
[pause] Choose the theme of the
[00:18:15.07]
message - this is essentially
the exploit of the, how do you,
how are you gonna exploit the
[00:18:20.17]
vulnerability? Are you gonna
appear.. appeal to fear? Are you
gonna appeal to greed? Ingroup/
[00:18:25.53]
outgroup that's very powerful.
Like I said about people
yearning to be a part of a
[00:18:29.47]
crowd. [pause] Uh, if you can
get people feeling like they're
an outgroup, you know, you can
[00:18:34.47]
[00:18:36.57]
get people feeling like they're
an ingroup cause they continue
to do what you want them to do.
[00:18:41.80]
[sniff] These are all different,
uh, this is your theme, this is
the message that you wanna run
[00:18:45.40]
with, you wanna appeal to
something of your audience -
fear, greed, whatever. [pause]
[00:18:50.27]
Next, you actually wanna plan to
grab attention, this is
essentially like your payload
[00:18:55.27]
[00:18:57.33]
that you're going to be
delivering. [pause] You know,
you're gonna have your TV
[00:19:02.70]
commercials; you're gonna do
internet advertising [pause]
There are many to choose from...
[00:19:08.30]
Next thing, you wanna test the
message, measure its impact and
adjust the message as needed,
[00:19:13.17]
repeat. In today's day and age
this is very easy to do.
Obviously, this is uhm, you
[00:19:17.80]
know, we get immediate feedback
on the message and its impact
and, uh, we can adjust it as
[00:19:22.80]
[00:19:25.07]
needed very quickly, repeat it,
recoup, repeat it. So that's
what a propagandist is gonna do,
[00:19:29.97]
that's his steps. What can a
propaganda, uh, what can a
person do to counter the
[00:19:35.63]
propaganda? What kind of
checklist can they, can they,
can they use for themselves?
[00:19:40.73]
[pause] A series of questions
that you can ask is can the
speaker gain anything from
[00:19:46.60]
having me listen to them? You
know, if you're talking to
somebody and they, you know, you
[00:19:53.23]
feel, you know, they f... they
feel that, like, "Yes, you know,
if I listen to them they have
[00:19:58.30]
something that they can gain if,
you know, if I, if I listen to
them and, you know, believe what
[00:20:02.90]
they say..." [sniff] [pause] Ask
if the source is verified.
Alright, so, when you're reading
[00:20:07.90]
[00:20:10.67]
articles, uhm, pay attention to
what, uh, uhm, you know, who's,
you know, who's saying what.
[00:20:17.40]
Sometimes you'll have an
unidentified source, uh, uh, you
know, these are good clues, so
[00:20:23.87]
you know, white source. Is the
source identified? This is like
you know, you know who the
[00:20:28.87]
[00:20:31.10]
author is. You know I'm the
journalist, you know, "Glenn
Greenwald, I'm publishing this
[00:20:34.70]
article.". Uhm, uh, you know,
he's putting his name out there.
If a source is unidentified
[00:20:39.70]
[00:20:41.80]
you'll see sometimes, like, uh,
for instance in, uh, the coup
that happened in Turkey, it was
[00:20:47.27]
an unidentified source that
released, uh, you know, that,
uh, the president or the prime
[00:20:51.70]
minister, uh, was, you know,
he had uh, he had sought asylum
in Germany. [sniff] That came
[00:20:57.13]
from iden.. that came from an
unidentified source and that's
something that you can look
[00:21:00.90]
into, I mean, that's something,
that, uh, it's a very good clue.
If an iden... if an unidentified
[00:21:04.53]
leak, you know, ha.. happens,
and the government tries to
chase them off to Russia, you
[00:21:10.50]
can tell that that was not a
government, uh, uhm, sponsored
leak. Obviously they did not
[00:21:14.97]
like that. However, if the source
in an article is not
identified and it doesn't seem to
[00:21:20.77]
be a big deal, you bet that that
was a sponsored leak. Like they,
you know, the government was
[00:21:26.30]
okay with that one. [sniff]
[pause] And black is when a source is
falsified, you know, some
[00:21:31.30]
people, like Donald Trump, say
stuff that's never happened
before that is a black, uh, uh,
[00:21:37.47]
verification. The source is uh,
a falsified source, he said
something that no one else has
[00:21:42.07]
even said. [pause] You also
wanna ask if the source is
credible. Has the source lied to
[00:21:48.13]
you before? Usually when a
journalist does something, like,
say that they're, they, you
[00:21:51.70]
know... "they took, you know,
took rocket fire or they saw
combat in the area" you know
[00:21:55.90]
they're done. They're, they're
no longer credible, there's no
reason for you to believe them.
[00:21:59.57]
So asking them, keep track of
your source, uhm, uhm....
[pause] You know, have they lied
[00:22:04.57]
[00:22:09.60]
to you before? Have they always
been truthful? You know, that
kind of thing. [pause] Ask if
[00:22:15.20]
you've heard the message
repeatedly. If the me..
repetition is key to propaganda.
[00:22:21.77]
It's, like, their best friend.
I, uh, I kind of feel, uhm, uh,
kinda reminds me of, like, the
[00:22:27.80]
Unix touch command - when you run
"touch" first it creates a new
file, right? What happens if you
[00:22:33.57]
run it again on a file? It
modifies the mtime, right? It
modifies that, that modified
[00:22:38.10]
time. And the way our brain
works is that we rely heavily on
modified time - when we go to
[00:22:43.80]
search for something we like to
go to stuff that's received activity
recently. So, when you build a,
[00:22:50.20]
when you send a message out
repeatedly... [pause] You're
building familiarity. We're, uh,
[00:22:55.20]
[00:22:57.77]
we're uh, humans we love
familiarity. It means that
you're not a tiger that's here
[00:23:03.53]
to eat me... [chuckles] You
know, you know it means that
you're, uh, uh, a friend, you
[00:23:08.97]
know... And that's why
everyone's trying to push their
label or their commercials or
[00:23:14.03]
their propaganda on you all the
time, repeatedly. That's why
it's a good sign, it's a good
[00:23:18.33]
sign, that's why it's a good
clue - you're seeing the message
repeatedly, like, "Why am I
[00:23:21.80]
having to see this repeatedly?".
Think about every time you go
into McDonalds - how many golden
[00:23:25.07]
arches you see, they're trying
to build a familiarity presence.
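The touch analogy can be sketched in a few lines of Python (the speaker's own go-to language); the file name and the one-second pause here are just for illustration:

```python
import os
import tempfile
import time

# First "touch" creates the file; touching it again only bumps
# its modified time (mtime) - nothing new is created.
path = os.path.join(tempfile.mkdtemp(), "message.txt")

with open(path, "a"):        # first touch: the file now exists
    pass
os.utime(path)               # set atime/mtime to "now"
first_mtime = os.path.getmtime(path)

time.sleep(1)

with open(path, "a"):        # second touch: same file...
    pass
os.utime(path)               # ...but a fresher mtime
second_mtime = os.path.getmtime(path)

print(second_mtime > first_mtime)  # prints True
```

Repetition doesn't create anything after the first run; it just refreshes the timestamp, the way a repeated message refreshes familiarity.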
[sniff] [pause] Alright,
[00:23:30.07]
[00:23:32.47]
"Manufacturing consent". Uh,
this is a book written by a man,
uh, Noam Chomsky and Edward
[00:23:37.47]
[00:23:40.30]
Herman, and it's coming from the
hypothesis that "The media
mobilizes support for special
[00:23:46.00]
interests that dominate private
and state activity". Basically,
the media is not the fourth
[00:23:51.13]
estate is what they're deciding.
They're, uh, what they're
hypothesizing, they are
[00:23:55.03]
hypothesizing that the media is
not this neutral thing - that
they will perform, you know, to
[00:23:59.97]
the, to the needs of the special
interests of the society. And to
test this hypothesis, they is...
[00:24:04.97]
[00:24:07.20]
they developed a propaganda
model. And what's in this
propaganda model? This...
[00:24:12.73]
Alright, one of the things, it,
first propaganda filters the
size, concentration, ownership,
[00:24:19.30]
uh, alright... God damnit...
[chatter] [pause] Size,
concentration of ownership,
[00:24:25.20]
owner wealth, profit orientation
of dominant mass media firms.
So, in our beloved capitalist
[00:24:30.20]
[00:24:33.40]
systems, uh, these media firms
and these businesses in general
they're allowed to grow, they're
[00:24:38.40]
allowed to buy more... uhm, uh,
just everything around them. So
you can buy more media
[00:24:44.30]
companies, you can, you know,
that's, uhm, you can buy more
things that aren't media
[00:24:49.13]
companies and now you have, now
you have, like, uh, you know,
maybe something that's making
[00:24:52.97]
money and you don't want your
media firm to be, you know, too,
uh, you know, to, uh, it's.. You
[00:24:58.37]
have a business that's shady,
you don't want your media firm
bringing attention to it, you
[00:25:02.30]
own both of them - that's great!
You can just tell the media firm
"Hey, shut up about this." and
[00:25:06.13]
the media firm will be like
"Alright, you're the boss.".
[sniff] Advertising, as inc...
[00:25:11.37]
as a main source of income -
this is nothing. So... when,
when media was just sold as a
[00:25:18.27]
subscription-only kind of
thing.... [pause] Uh, the
government, uh, publications
[00:25:23.27]
[00:25:26.13]
were allowed to talk a lot of
sh** about governments and a lot
of people in power. But, uhm, it
[00:25:32.17]
was hypothesized, like, "Well,
if we just turn it over to the
market... and so maybe the media
[00:25:38.00]
will sort itself out. Maybe
it'll get back in line, you
know, people won't be as, uh,
[00:25:42.93]
anti-government." And it really
worked! And one of the ways, you
know, as soon as the advertising
[00:25:48.43]
came in, it allowed for, you
know, richer people than, you
know, the rest of the population
[00:25:53.03]
to come in and, and, you know,
uh, kind of, sort of invest in
these media companies. Allowing
[00:25:58.27]
them to grow bigger. [pause] And
so that's another way, if you
can't own the media what you can
[00:26:03.27]
[00:26:05.93]
do is own part of the
advertising - that's a way that
you can get your own, kind of,
[00:26:10.77]
check and balance and go like
"What message is the media
putting out there?". [sniff]
[00:26:17.13]
[pause] And then we have this
idea of like reliance of media
on information provided by
[00:26:20.77]
government experts funded by
primary sources. This is where
we get the idea of press
[00:26:25.30]
conferences from. So, we have 24
hour news, and, you know, we
have uh, or at least news - we
[00:26:31.00]
have a newspaper every day, we
have a news hour every day and
yea... we do have 24 hour news.
[00:26:35.40]
How do they have news to give?
Well! This is where you know,
pre, you know, very powerful
[00:26:40.40]
[00:26:43.20]
entities such as governments and
corporations can set up a press
conference. Like, "Oh, you want
[00:26:48.17]
news? You can just come to this
location at this time, there'll
be coffee and donuts, we.. all
[00:26:54.23]
you have to do is just listen to
us tell our side of the story".
And that's very important
[00:26:58.50]
because now you have the
government and the media kinda
working together and, you know,
[00:27:02.37]
you kinda hear about it, you
know, when, uh, you know,
someone, you know, some
[00:27:05.47]
journalist, "goes", you know,
"goes rogue" and, uh, you know,
gets their, gets their
[00:27:09.27]
credentials pulled for being
loud in the, the, you know,
the press conference. You know,
[00:27:15.67]
so it gets, you know,
journalists, uh, to kind of stay
within bounds, so they can stay
[00:27:20.70]
getting news from these press
conferences. And, also, think
about what this does, it's like
[00:27:25.07]
you're on the other side of the
world and one of these entities
harms you in some way... You
[00:27:30.57]
have to wait for journalists to,
like, fly over and stick a
microphone in your face. Where
[00:27:36.17]
as in, with press conferences by
the time journalists gets across
sees to interview the victim,
[00:27:42.93]
the, you know, on the home side,
you know, the, the big entities
they'll have had their opinions
[00:27:49.30]
in, uh, in, uh, you know, the
public's sphere for who knows
how long? [sniff] [pause] You
[00:27:55.67]
know, so, uh, so press
conferences are very beneficial
- they're very beneficial to
[00:27:59.93]
both the media and, you know,
government and high power
entities that can, you know,
[00:28:04.40]
afford to do these press
conferences for journalists.
Flak... flak is a way of
[00:28:08.50]
disciplining the media, if ever
the media does something bad you
have to have ways that you can,
[00:28:14.43]
uh, uh, tell the media, you
don't like that. Uh, one of the
ways you can do this, is, you
[00:28:20.33]
know, you can have counter
stories run, you can withdraw
your advertising - this kind of
[00:28:26.07]
thing. You wanna make life
difficult for any media that
dare say something bad about
[00:28:30.37]
you. Another thing that the
propaganda model has, it's got
this thing called "boogeyman
[00:28:35.63]
e..", "boogeyman enemies". When,
uh, manufacturing consent first
came out, you know, the
[00:28:41.17]
communist was the boogeyman. And
the way they wrote it is like
they assumed that communism
[00:28:44.90]
would be the boogeyman forever.
We have terrorists as the
boogeyman today, another
[00:28:49.07]
boogeyman that we have, like
we've gone over before, is hate
speech. You know, hate speech is
[00:28:53.83]
being used, you know, to get
social media companies to, uh,
help propagandize.... its users.
[00:28:58.83]
[00:29:03.90]
[pause] [deep breath] Alright,
so we've learned some things
about propaganda, what can we
[00:29:07.93]
do? Well, I decided to make a
Reddit clone, why did I make a
Reddit clone? Well because I'm
[00:29:13.50]
addicted to Reddit, alright?
[chuckles] [audience noise] And,
uhm, you know, my friend told me,
[00:29:17.13]
he was like "That's doing weird
things to your brain...". I
didn't listen and so now it's
[00:29:21.53]
the only way I can solve
problems... [laughter] Becau....
well I mean, the thing about Reddit
[00:29:25.40]
though - it's a very organised
website and so if you can get
rid of the people it's very
[00:29:30.13]
pleasant. [laughter] And, you
know, if you think about, like,
"I wanna better, my, my media
[00:29:36.27]
consumption..." you know we're
all familiar with this, you
know, this, concept of
[00:29:41.43]
end-to-end encryption. You don't
want things in the middle, you
know, we want things "end" to
[00:29:46.40]
"end"... So could we do the same
thing to, uh, to, uh, our media
consumption? Can we get as close
[00:29:51.40]
[00:29:53.67]
as possible to the knowledge
source without having, you know,
all these grubby paws in there
[00:29:58.87]
trying to change the message?
And the other thing that Reddit
does... yea, it gets rid of the
[00:30:05.13]
mods and admins, this is huge!
Like... uh, I, don't... Reddit
has a lot of drama that gets
[00:30:10.07]
kicked up because the mods and
admins did something that was
seen as censorship. Uhm, luckily
[00:30:15.27]
I have not had to do any of that
because I don't go to those
weird subreddits, but, you know,
[00:30:20.20]
they've got a point. Mods and
admins can just choose to do
stuff, uh, against your, uh,
[00:30:26.10]
against your will, is... do you
want that in a news source?
[pause] Also, it's open source
[00:30:32.47]
with a very easy install script.
And it's got this, you know,
Python, the Python PRAW module
[00:30:38.67]
is a, is a very decent, uh, uh,
module for, you know, scripting
your Reddit bots. I dunno why it
[00:30:45.30]
keeps coming on... [laughter]
So, I just stood up this Digital
Ocean droplet, it's running
[00:30:50.67]
Ubuntu. This is, uh... the
Reddit install script and it,
uh, you can, you know, can run
[00:30:56.60]
it on like whatever version, you
know, Ubuntu, or whatever that
can support it if you're brave
[00:31:01.67]
enough, you can tweak it to your
heart's desires. But, like, if
you want life easy and, you
[00:31:06.07]
know, something like Digital
Ocean and, uh, will make it that
easy, you can just select
[00:31:10.53]
whatever the Reddit install script's
looking for and BAM! Within 15
minutes, uh, uh, Reddit clone
[00:31:16.07]
up and running. [pause] I wanted
to leave the Reddit code alone,
you know, once I got up and
[00:31:23.00]
running, I was like "Alright,
that's fine.". So then I just
used PRAW and Bottle, any..
[00:31:27.00]
anyone here use Bottle? It's
very lightweight Python, uh,
uhm, framework. It's like, it's
[00:31:33.97]
very lightweight, it's stupid
simple API, I can just send
commands at it and it'll just
[00:31:39.50]
send commands at my Reddit
clone. [pause] Also, I use this
thing called the "SMMRY API".
[00:31:46.37]
SMMRY API was a lifesaver
cause I had only scrapers, you
know, stuff like the natural
[00:31:52.13]
language, you know, toolkit out.
And you know, Parsey
McParseface... [audience noise]
[00:31:55.60]
Came out not too long ago,
right? So I was like "Wow, I can
do all this stuff". But then!
[00:32:00.83]
Luckily, SMMRY comes out and
was like "I do all that for
you". Plus... [pause] Uhm... you
[00:32:05.83]
[00:32:08.53]
know, I don't have to... uh, I
mean, what it'll do, it'll go
out, you'd send it a link or
[00:32:13.90]
even send a text and it will run
an analysis on it and pull out
the important sentences. You
[00:32:20.53]
know, this is literally, if
you've ever seen, like, a "Too
Long; Didn't Read" bot it's
[00:32:24.43]
probably using this exact site.
It's, it's great at just taking
the article pshh and munching it
[00:32:31.03]
down to however many sentences
you want - you can munch it down
to a tweet, you know, if you
[00:32:34.77]
wanted to. [sniff][pause] So
yea, I was very happy to uh, uh,
know about these guys I relied
[00:32:40.70]
on heavily. Also, look, I'm a
slow reader, so if I can just
shorten articles - look, I'm
[00:32:46.70]
either gonna run a 10 sentence
synopsis of this article or I'm
not gonna read it at all.
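[Editor's note: SMMRY is a hosted service, so what follows is not its actual algorithm, just a toy local sketch of the idea the speaker describes, extractive summarization that keeps the N most "important" sentences. The function name and the score-by-word-frequency heuristic are assumptions, not from the talk.]

```python
import re
from collections import Counter

def summarize(text: str, n: int = 10) -> str:
    """Toy extractive summarizer: keep the n highest-scoring
    sentences, in their original order."""
    # split into sentences on terminal punctuation followed by whitespace
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    # score each sentence by how frequent its words are in the whole text
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    def score(i):
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentences[i].lower()))
    # rank sentences by score, take the top n, restore document order
    top = sorted(sorted(range(len(sentences)), key=score, reverse=True)[:n])
    return " ".join(sentences[i] for i in top)

print(summarize("Cats are great. Cats eat fish. Dogs bark.", 1))  # Cats are great.
```

A real service like SMMRY also strips boilerplate, handles headlines, and breaks ties more carefully; this only shows the shape of "munch an article down to however many sentences you want".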
[00:32:51.93]
[chuckle] [audience noise] So,
thanks SMMRY! [pause] Alright,
yea... Everybody ready for an
[00:32:57.80]
exciting demo? Look, it's
Reddit, don't get too excited.
[chuckle] [laughter] So, this
[00:33:04.00]
is, yea, so when I said it was
very simple - that was a lie.
Where are you guys now? Uh...
[00:33:09.80]
cause I killed some of my bots,
but I have one bot up and
running. [pause] Oh, God damnit.
[00:33:14.80]
[00:33:18.47]
[pause] So, yea, it's uh... it's
very simple, like, whatever, you
know, whatever, uhm... [pause]
[00:33:23.47]
[00:33:30.00]
Whatever domain, uh, the a, the,
the article came from that's
gonna be the "sub-Reddit".
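[Editor's note: that domain-to-subreddit rule is easy to sketch in Python, the language the speaker uses for his bots. The function name and the dot-stripping choice below are assumptions for illustration, not his code.]

```python
from urllib.parse import urlparse

def subreddit_for(url: str) -> str:
    """Map an article URL to a subreddit name the way the talk
    describes: whatever domain the article came from becomes the sub."""
    host = urlparse(url).netloc.lower()
    # collapse "www.example.com" and "example.com" into one sub
    if host.startswith("www."):
        host = host[4:]
    # subreddit names can't contain dots, so drop them
    return host.replace(".", "")

print(subreddit_for("https://www.example.com/story/123"))  # examplecom
```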
[00:33:35.00]
[00:33:37.00]
Whatever author wrote it,
that'll be the "user". So, you
know, Reddit's a great way, you
[00:33:42.40]
know, you can select, you know,
all of the things that were
published by ... all the things
[00:33:45.00]
that were published by that
author and, you know, that part
of the organization is done
[00:33:49.87]
automatically. [deep breath] So
the next thing you can do is,
like, let's take a look at this,
[00:33:54.17]
uh, uhm... [pause] SMMRY
because that's what we're
looking at here. So, I dec, I
[00:34:01.07]
did away with titles, you know,
the, the journalists aren't the
ones that decide the article's
[00:34:07.00]
title, that goes to the editor.
The editor wants to, uhm, you
know, he wants people to click
[00:34:13.10]
on it. He's gonna make it
"clickbaity". [laughter] So, get rid
of it. Like I said, if we want
[00:34:18.40]
end-to-end, why not just get rid
of that? He's gonna try to get
you to click on it, so get rid
[00:34:22.77]
of it. And so... what I have is
just the top, the, the first 300
integers, or 300 characters, I
[00:34:27.77]
[00:34:31.30]
mean... is for its limit. So
that's, you just, that's my
title. So I read the first, you
[00:34:37.40]
know, the first couple of
sentences. And we're gonna load
it... Or not. [audience noise]
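[Editor's note: the 300-character titling rule described above can be sketched as follows. The helper name and the back-off-to-a-word-boundary detail are assumptions, not from the talk.]

```python
def title_from_summary(summary: str, limit: int = 300) -> str:
    """No-clickbait titling: skip the editor's headline and use the
    first `limit` characters of the summarized article as the title."""
    if len(summary) <= limit:
        return summary
    # cut at the limit, then back off to the last space so the
    # title doesn't end mid-word
    cut = summary[:limit]
    return cut.rsplit(" ", 1)[0] + "..."
```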
[00:34:42.23]
Oh, shoot, oh well. [pause] I
had one loaded up here. [pause]
>> If you reset the various
[00:34:47.23]
[00:34:51.73]
wireless prompts, you may not be
on the right one. >> I thought I
was on "DefCon". Oh, wow.... no,
[00:34:56.73]
[00:35:06.37]
no, no. [pause] Ugh, oh okay...
Well I realize I'm all... Geez,
you all are, are so kind, I'm
[00:35:11.37]
[00:35:25.07]
gonna be over, waaaay over.
Alright, real quick... [chatter]
Let me just go over this, tips
[00:35:30.07]
[00:35:32.83]
for reading internet comments:
DON'T read internet comments.
[laughter] Alright... alright,
[00:35:36.60]
[00:35:40.70]
first off, it's not do, doing
good stuff for your health,
alright? You get upset, you
[00:35:44.83]
know, it's raising your blood
pressure, you're using your free
time, or work time, you know...
[00:35:49.47]
[laughter] Probably someone on
the other end is getting paid
for that, alright? So you're
[00:35:53.90]
hurting yourself. Not only that
but we're seeing more of this,
you know, first off [chuckles].
[00:35:58.90]
[00:36:01.27]
If you're Russian you're a
troll, you know, but you're, you
know, you're uh, Facebook
[00:36:05.87]
warrior if you're British.
That's some interesting
propaganda in and of itself.
[00:36:10.33]
But, we're seeing more of these
things called "Astroturfers" or
"Trolls", or whatever you want
[00:36:16.03]
to call them, uhm, very many names. Uhm...
[deep breath] And that's very
powerful, it's like, you know,
[00:36:21.53]
you're talking to a somebody,
you know, like we talked about,
in your heterotopia. You're
[00:36:25.50]
talking with someone and you
think it's real but they're not
real, and so... Also, I like
[00:36:30.33]
this, it's, like, you know,
Russia has the troll army but,
you know, we're gonna go look
[00:36:36.30]
at, uhm... We're gonna have an
internet stranger tell me who I
should vote for. That's above
[00:36:41.50]
board... So, you know. Even if
you think you're the best
internet debater in the world,
[00:36:46.50]
[00:36:48.67]
you don't have an incentive, you
don't have a money incentive on
the line... But that's not, I
[00:36:54.30]
would argue that that's not the
main reason you shouldn't argue
with internet trolls and the
[00:36:59.20]
main reason I think comes from
the "Correct the Record". This was
Hillary Clinton's super PAC,
[00:37:04.17]
they were talking about like
"Yea, we astroturf people." And,
in the statement "Lessons learned
[00:37:10.83]
from online engagement with Bernie
Bros during the Democratic
primary will be applied to the
[00:37:15.00]
rest of the primary season and
the general election". I mean,
even if you think that,
[00:37:19.50]
like, you're, you're, you're
getting good ideas out there and
you're throwing them out
[00:37:22.47]
there... you're really not.
You're just helping them, you're
building their knowledge base,
[00:37:26.00]
uh, and you're helping them get
better. It kinda reminds me of,
uh, uh, like, Rick and Morty
[00:37:31.00]
[00:37:33.33]
when it comes to like those
little microverses, you know...
[laughter] You, know, they're
[00:37:36.60]
able, you know, you have like
this internet and you're in
your, uh, your little thread and
[00:37:40.07]
you're just... but really all
you're doing is speeding up
their test and you're speeding
[00:37:43.27]
up their, their knowledge of how
to, uh, uh, propagandize the
rest of the population. Because
[00:37:48.50]
as they, you know, come out of
that, uh, out of your thread
they come into the real world
[00:37:53.53]
and they're like "We have just
the commercial for you...".
[laughter] Alright, so just
[00:37:57.93]
don't, however... Alright, if
you do insist on reading
internet comments or if you
[00:38:02.80]
wanna comment with somebody,
take out the public part of it.
Take out the vanity part of it,
[00:38:09.57]
you know, with, whether it comes
to upvotes or, uh, retweets or
any of that, just take all that
[00:38:15.73]
and go to DM or PM, you know,
see about that. You know, I
think most people you know
[00:38:20.67]
they've been mostly, uh...
accepting. They'll talk to you
on DM, on the DL. [audience
[00:38:25.67]
[00:38:30.20]
noise] So, and if you want, if
you're interested some
documentaries I recommend "The
[00:38:33.77]
manufacturing of consent", it's
about three hours long. Uh,
that, this is a documentary on,
[00:38:37.83]
uh, uh, Noam Chomsky, it goes
over a lot of his theories on
propaganda. "Century of Self" is
[00:38:43.83]
about Ed Bernays, this is the
movement from, like, alright "We
just kicked a** in a war, now
[00:38:48.97]
what will we do? We can make a lot of
money, that's what we can do".
And that's what Century of Self
[00:38:54.20]
is about, it's about turning,
like, propaganda on to the, uh,
uh, private sector. Uh, some
[00:38:59.20]
[00:39:01.40]
books, again, "Manufacturing
Consent" and "Propaganda" by the
same people; Firewall, uh, that
[00:39:06.37]
was a novel by Jack Nolan, I've,
I dunno why I didn't put his
name. Now, some sites, okay so I
[00:39:11.00]
didn't mention fair dot org and
I didn't mention the On the Media
podcast... uh, fair dot org, you
[00:39:16.07]
know, they, they're the Fairness
and Accuracy In Reporting, they
do a lot of newsletters on how
[00:39:20.43]
to, uhm... [pause] On just some
bullsh** that the media does. On
the media podcast, uh, you know,
[00:39:26.43]
it's, uh, if you like Radiolab
it's kind of the same, kind of
the same kind of production
[00:39:31.30]
value. They give like 15 to 50
minute podcasts and, uh, they do
some interesting topics on how
[00:39:37.97]
the media is treating various
different subjects. [pause]
Alright, thank you for coming.
[00:39:42.97]
[00:39:46.93]
[chuckles] You've been too kind.
[applause] Alright, alright,
hold up, hold up. So if you
[00:39:51.93]
[00:39:56.57]
wanted to see. [pause] [audience
noise] Yea, this is SMMRY,
I've got like little 10 sentence
[00:40:03.47]
summaries. You can come through
and you can read the article
very quickly if you wanted to
[00:40:07.93]
and if you're, you know, and,
you know, I always include the
link so you can go see the real
[00:40:11.90]
thing if you wanted to
anyways... Alright, now I'm
done. [laughter] [applause]
[00:40:17.00]
Fuck...
[00:40:17.47]