Category Archives: Privacy

CloudNS is a DNS host that supports a few cool security features. I’ve set it up, and it’s working for me on Ubuntu Linux 13.04. I think its security features give it the potential to be the preferred choice for anyone looking for a higher level of security and privacy.

* DNSCrypt Support
We only allow connections to our service using DNSCrypt. This provides confidentiality and message integrity to our DNS resolver, and makes it harder for an adversary watching the traffic of our resolver to identify the origin of a DNS query, as all the traffic is mixed together.
* DNSSEC Validation
Our server does complete trust validation of DNSSEC-enabled names, protecting you from upstream DNS poisoning attacks or other DNS tampering.
* Namecoin resolution
Namecoin is an alternative, decentralized DNS system that is able to prevent domain name censorship. Our DNS server does local Namecoin resolution of .bit domain names, making it an easy way to start exploring Namecoin websites.
* Hosted in Australia
Our DNS server is hosted in Australia, making it a faster alternative to other open public DNS resolvers for Australian residents.
* No domain manipulation or logging
We will not tamper with any domain queries, unlike some public providers who hijack domain resolution for domains that fail to resolve. Our servers do not log any data from connecting users, including DNS queries and the IP addresses that make connections.

I think those are some really interesting features. For one thing, it forces DNSCrypt and validates with DNSSEC, and it appears to be the only resolver to do both of these things. And it’s also hosted outside of the US, which has its own implications for security.

So I went ahead and set up CloudNS using the following command (and setting this in rc.local) after configuring DNSCrypt from this guide. You can check Cloudns.com.au for the updated information, but as of today (Aug 8th, 2013) this command works for me.
dnscrypt-proxy --user=dnscrypt --daemonize --resolver-address=113.20.6.2:443 --provider-name=2.dnscrypt-cert.cloudns.com.au --provider-key=1971:7C1A:C550:6C09:F09B:ACB1:1AF7:C349:6425:2676:247F:B738:1C5A:243A:C1CC:89F4

So the three big improvements for me are DNSSEC, DNSCrypt, and Australia hosting.

DNSSEC

DNSSEC is an extension of DNS that aims to provide authentication and integrity of DNS results; it ensures that you know who the result is from and that no one else has tampered with it. DNS responses are authenticated but they are not encrypted, so DNSSEC does not prevent someone between you and the resolver from viewing the request.

DNSCrypt

DNSCrypt provides encryption of DNS requests, which provides confidentiality of the requests, meaning that an attacker between you and the resolver can not view the traffic between you and your DNS resolver.

Stacking DNSSEC and DNSCrypt works out very well, as you end up covering your bases and achieving confidentiality, integrity, and authentication.

Hosting In Australia

While I’m not particularly familiar with Australia’s laws, hosting outside of the US definitely provides a bit more peace of mind. Just yesterday we learned that Lavabit (the email provider chosen by Edward Snowden) has shut down due to the US government trying to compromise their ability to protect their users. The truth is that hosting in the US just makes a service less trustworthy at this point, and hosting outside is a big plus. This, combined with Namecoin and their pledge to not log, is really somewhat comforting.

So, while I can’t absolutely recommend it at this point (I haven’t been using it long enough) I think there’s a lot of potential here.

DNSCrypt is a DNS proxy that encrypts the DNS requests between you and the first-level DNS resolver. I have a guide for setting it up here. This guide is about restricting the process and user account, making DNSCrypt more resilient to attack. I will continue to update this guide – I have a few more ideas.

One of the nice features of DNSCrypt is that it actually takes security into account. I wish this weren’t something to be shocked by, but, *gasp* it actually uses compiler security flags. Specifically, it uses the following flags:

-fPIC -fPIE -fstack-protector-all -fno-strict-overflow -fwrapv **

-fPIC and -fPIE tell the compiler to create a relocatable binary, completing the implementation of ASLR. It’s a mitigation technique we rarely see used, despite being somewhat critical and having been around for years. So right off the bat they’re doing more than most.

-fstack-protector-all (unlike the oft-used -fstack-protector, which only protects functions using char arrays/strings) tells the compiler to protect every function with a stack canary. If an overflow occurs the canary is overwritten, and the process aborts before the corrupted return address can be used.

-fno-strict-overflow and -fwrapv are closely related: both tell GCC not to assume that signed overflows can’t occur. Compilers normally make that assumption when generating optimized assembly, and build optimizations around it – these flags prevent that (-fwrapv goes a step further and defines signed overflow to wrap around), which is safer.

So these are nice, and we like them. But DNSCrypt also does a bit more.

You can create a new DNSCrypt user with no write rights, and the program will chroot itself into that user’s home directory and drop privileges. This is great, since a chroot’d process with no ability to write is difficult to break out of. And running as a separate user means no X11 access, its own home folder, and general isolation from the rest of the system – all good things!

But it means some other stuff too. Because it does all of the above we as users can take that protection further – beyond where typical programs allow us to. I think this demonstrates what a strong security model really can do when built from the ground up.

So, on to what we can do.

First things first: we’re going to want some information on our DNSCrypt user.

run ‘id dnscrypt’

You should get something similar to:

id dnscrypt
uid=109(dnscrypt) gid=123(dnscrypt) groups=123(dnscrypt)

We’re going to need this.

IPTables On User

Note that if you’re using UFW this may cause issues – mixing UFW/GUFW with manual iptables rules isn’t recommended, and your mileage may vary. To remove your UFW rules run ‘iptables -F’ (note that this flushes all iptables rules, not just UFW’s).

Normally I’m not fond of outbound filtering, but because DNSCrypt separates itself into another user, it’s actually not such a bad idea. It means that DNSCrypt can’t just switch its outbound connection to another program under the same user account, and it means that the ports we limit will be limited to that user account specifically. This assumes you are using DNSCrypt under a user called ‘dnscrypt’.

So it’s a lot more worthwhile to set up outbound filtering here.

DNSCrypt should only need outbound access to port 443, with UDP. So we can restrict it to just port 443 and UDP with the following IPTables rules:
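A sketch of the two rules, using iptables’ owner match (this assumes the user really is named ‘dnscrypt’; run as root):

```shell
# Allow the dnscrypt user outbound UDP to port 443 only...
iptables -A OUTPUT -m owner --uid-owner dnscrypt -p udp --dport 443 -j ACCEPT
# ...and drop everything else that user tries to send.
iptables -A OUTPUT -m owner --uid-owner dnscrypt -j DROP
```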

Basically, the first rule allows outbound access to the DNSCrypt user over port 443 and UDP, and the second rule denies everything. If the first rule is hit, and it passes, the second rule doesn’t have to come into play.

***

DNSCrypt is now restricted to UDP over port 443, and all processes running under the dnscrypt user are as well. If you followed the tip then no new inbound connections can be made to your system except over port 53. (You can have dnscrypt listen on another port, in which case substitute that port in the rules. I have yet to work out the details of this; I’ll edit them in when I do.)

Trusted Path Execution

If you care about security you’re already running Grsecurity, but if not, see my guide here.

Grsecurity has an option called Trusted Path Execution that allows us to restrict a group so that it can only execute files that are owned by root and writable only by root. Since our program doesn’t run as root and can’t write anywhere, it can never introduce a file that would pass that check – and its empty chroot contains nothing approved to execute.

So check the TPE box and add the GID for untrusted users, in this case 123.
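If your kernel was built with grsecurity’s sysctl support, the same settings can also be applied at runtime – the sysctl names below are from grsecurity’s documentation and only exist on grsec kernels, and the GID is whatever ‘id dnscrypt’ reported on your system:

```shell
# Enable Trusted Path Execution and mark the dnscrypt group (GID 123 here)
# as untrusted. Requires a grsecurity kernel with GRKERNSEC_SYSCTL enabled.
sysctl -w kernel.grsecurity.tpe=1
sysctl -w kernel.grsecurity.tpe_gid=123
```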

Now, this protection is somewhat superfluous – DNSCrypt shouldn’t be able to write to the filesystem, so it shouldn’t be able to execute any payloads off of the filesystem anyway – but it’s still good to have, as the protection is now enforced on the user account itself, and doesn’t rely on the program dropping rights properly or on a perfect implementation of chroot.

Chroot Restrictions

While you’re compiling your Grsecurity kernel, you can also go ahead and turn on every single chroot restriction without worry – DNSCrypt works fine with them all. DNSCrypt already can’t write to its chroot, so as far as I know there’s no bypass as-is, but you can safely enable all of these restrictions. Some of them are a bit redundant given the aforementioned write restrictions, but a few are quite nice, such as the restrictions on mounting, mknod, and double-chrooting from inside a chroot.

AppArmor

AppArmor is an LSM (Linux Security Module) that restricts what a process can do. If AppArmor is the LSM used on your distribution (Ubuntu and its derivatives), you can find my profile here. AppArmor will restrict file access, which programs can be executed, which libraries can be loaded, etc. An attacker who winds up in a program confined by AppArmor must find a flaw in AppArmor itself, or in the profile, or use a local escalation attack. If you’re using everything listed above, that’s going to be a lot of work for them.

Users of other LSMs, such as SELinux, will need to build their own profiles. This shouldn’t be hard – DNSCrypt needs very little file access to work.
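For anyone writing a profile from scratch, the skeleton really is small – every path below is an assumption to adjust for your install, and the rule set is a sketch, not my exact profile:

```shell
# Sketch: write and load a minimal AppArmor profile for dnscrypt-proxy.
# The binary path and chroot home are assumptions; adjust for your system.
cat > /etc/apparmor.d/usr.local.sbin.dnscrypt-proxy <<'EOF'
#include <tunables/global>

/usr/local/sbin/dnscrypt-proxy {
  #include <abstractions/base>

  # Needed to chroot and drop to the dnscrypt user.
  capability setuid,
  capability setgid,
  capability sys_chroot,

  # DNS over UDP, v4 and v6.
  network inet dgram,
  network inet6 dgram,

  /usr/local/sbin/dnscrypt-proxy mr,
  /home/dnscrypt/ r,
}
EOF

# Parse and (re)load the profile into the kernel.
apparmor_parser -r /etc/apparmor.d/usr.local.sbin.dnscrypt-proxy
```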

Conclusion

Given the situation where an attacker finds himself compromising the DNSCrypt proxy on a system that has done all of the above, he’s going to be pretty pissed off. There is still room for improvement (seccomp filters), but right now an attacker is going to have to do a lot to get a reliable exploit.

For a program like DNSCrypt this level of security is great. It already chroots itself to a directory that it can’t write to, and they use compiler security, so you know they’re taking this stuff seriously. That’s what allows us to spend our time securing it further. If DNSCrypt did not so gracefully run as another user, and if it weren’t built to drop its rights to the extent that it does, then our apparmor profile would be more convoluted, TPE may not be possible, and an outbound Firewall would have been a useless attempt at security through obscurity. But because it’s built from the ground up to be this way we can reinforce it well.

Notes/ Tips

Much of this can be done to any process/service with a bit of tweaking, but it’s nice to be able to do it for a process like DNSCrypt.

**

Keep in mind that you can add your own flags to the makefile, such as “-march=native”, optimizing for your CPU. I can’t guarantee that this will play nice, or that it won’t add in unsafe compiler optimizations! But since this deals with crypto and math, you may end up making use of things like AES instructions, which could speed things up.

***

Tip: The following commands will set your firewall so that:

1) If a connection is new, comes over the loopback interface, is UDP, and uses port 53, we accept it (this allows DNS resolution).
2) If a connection is already established from an outbound connection, we allow the inbound traffic.
3) All other connections that do not meet the above criteria are blocked.
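A sketch of commands matching that description (run as root; this assumes dnscrypt is listening on the default port 53):

```shell
# 1) Accept new UDP connections to port 53 on loopback (local DNS resolution).
iptables -A INPUT -i lo -p udp --dport 53 -m state --state NEW -j ACCEPT
# 2) Accept inbound packets belonging to connections we initiated.
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
# 3) Drop everything else inbound.
iptables -A INPUT -j DROP
```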

This is an essay I wrote for a class. It’s obviously very specific to that class – and it’s written like a paper for a class. It’s not a research paper, it’s not anything fun or legitimate, and you’ll probably be bored reading it because… why would you care what Socrates thinks about TOR?

Anyways, enjoy the paper. It’s actually edited for once.

TOR is a program originally funded by the US Navy, and later supported by the Electronic Frontier Foundation (EFF), designed to allow for near-perfect anonymity when accessing the internet. Since its inception the program has been adopted by developers around the world, and it’s free to be used by anyone. By routing internet traffic through various ‘nodes’, which can be hosted by anyone with a computer, TOR prevents anyone from being able to tell where traffic is going to or coming from. There is, in fact, an entire ‘hidden network’ called ‘The Deep Web’, which can only be accessed by using TOR.

The implications of a program that can prevent any and all tracking are considerable. Users of TOR can access ‘hidden services’ on the network, which can provide highly illegal content: there are websites dedicated to selling drugs, hacking websites, and even sites where people offer to murder for money. Because the TOR system is set up so well, it is virtually impossible to track users accessing these services. Even governments with significant resources are unable to track users of the system–leaving criminals an open forum to discuss and propagate illegal activity.

One site that has garnered significant attention is Silk Road, a website that sells illegal drugs, such as marijuana or cocaine. Silk Road is only accessible through the TOR network, and all payments are made using an anonymous currency called ‘bitcoins’. Anyone can access the site via TOR and have a wide variety of illegal substances delivered to them, without any law enforcement agencies being able to track them through their transaction. The Silk Road is an example of clearly illegal activity happening in a fairly public forum with the state having virtually no power to stop it.

In contrast, TOR is also heavily used by activists in countries where speaking out against the government carries serious penalties, sometimes including death. Totalitarian countries such as Iran and China block users from accessing websites that question the government, and it’s a punishable offense to try to get around these bans. TOR has enabled citizens of these countries to bypass the censor and have free, unhindered access to information. Without fear of being tracked down, activists can report on their government’s corruption.

The network is highly controversial due to its clear applications for ethically questionable actions. The tool itself, as a method for individuals to bypass the laws set down by a sovereign entity, is interesting in the context of social contract theory. The question of whether rising up against one’s government in any way can be ethically justified has piqued the interest of philosophers as far back as Socrates. Socrates, Hobbes, and Rousseau would all have very interesting ideas about a technology such as TOR.

In its purest form the social contract is a mutual concession of rights in order to achieve a mutually beneficial system. We, as individuals, give up our animalistic and natural rights, such as the right to kill each other, in order to form a mutually beneficial society. The question of where our obligations lie in the system, and whether this system is objectively beneficial, is at the heart of the issue.

Socrates was one of the earliest philosophers to deal with these issues. In the Apology, Socrates is put on trial for his views – he is accused of, essentially, sacrilege. The details of the trial aside, Socrates is found guilty and sentenced to death. In the Crito, while waiting for his sentence to be carried out, Socrates is visited by his friends and students, who offer to break him out and hide him. Socrates responds by stating that he has no right to leave: although he feels that he is not guilty, he believes that he would not even be alive if not for the state his accusers represent; he is a product of the state, and therefore is justly at its mercy.

Socrates’ opinions on the relationship between citizens of a state and the state itself were so strong that he felt justified dying for them. He believed that he would not have been alive had the state not provided a system for him to be born into; he existed only because the state existed. For Socrates the only power above the state is God, and individuals are products of both.

But that is not to say that Socrates would not break the law; he compared himself to a ‘gadfly’, persistently buzzing around those whom he considered unjust. While he was willing to let the state kill him, he did not feel that he had done anything wrong.

Applying this to a modern context, where users of a product like TOR are bypassing government restrictions, it seems likely that Socrates would have been entirely supportive of the users–or at least of the users’ activism–but would feel that those who are caught should accept their punishment. Socrates would have no moral qualms about the tool itself, so long as users accepted the consequences of the tool.

Another philosopher who worked more directly with the concept of a social contract is Thomas Hobbes. Hobbes is credited as the creator of the social contract theory and is an influential writer on political philosophy. The question that Hobbes was trying to answer was whether our ethical obligations should be to society, our government, or to ourselves. He was writing during the English Civil War, and this was the question that many people wanted an answer to; they wanted to know if they should continue to follow the current political entity, the monarchy, or if instead they should revolt. This tenuous environment gave Hobbes a platform for his views–social contract theory, after all, was directly relevant to everyone’s life, and to the questions that everyone was asking.

Socrates and Hobbes had much in common when it came to their opinions on the state: both considered it to be the sovereign entity that should have authority over citizens. Hobbes, however, would differ from Socrates in his opinions about circumventing government control; whereas Socrates would accept the state’s control, Hobbes would be against it.

Hobbes felt that a single person, when left to their own devices, would fall back to a natural state of chaos. This natural state would lead to death or significant unhappiness. Because of this, the social contract is formed, to move beyond the natural state. Humans relinquish some of their individual rights in order to form a society. If one person or group of people begins to move against the societal rules, the laws handed down by the sovereign entity, then it undermines the system, and will lead to the primitive natural state. By circumventing the law, one is refusing to forfeit their rights, and the sovereign entity can not function properly. Hobbes described the state of nature as “bellum omnium contra omnes”, a Latin phrase meaning “the war of all against all”. He considered it to be the ultimate struggle, “a perpetuall warre of every man against his neighbour”. In Hobbes’ Leviathan he makes his views explicit:

Hereby it is manifest that during the time men live without a common power to keep them all in awe, they are in that condition which is called war; and such a war as is of every man against every man. – Chapter XIII, Leviathan

It is clear that Hobbes believes that a sovereign power, above the individual, is necessary to keep the animalistic state of humanity in check.

As TOR is a tool created almost explicitly for the purpose of evading government restrictions and tracking, it is unlikely that Hobbes would have supported such a project. Most internet users forfeit rights just by opening their browsers – browsers track usage statistics, websites track user IP information, and so on. All of this tracking lets websites analyze that data; we sacrifice a measure of privacy in exchange for a beneficial service. Likewise, governments use the data to track criminals, thereby providing another service. By removing their ability to track, we rescind our sacrifice of rights and ‘void’ our social contract. We fall back to an animalistic system, and this is evident in TOR’s usage – there are websites about murder, rape, drug use, and terabytes of disturbing content. The TOR network is truly an animalistic subnet within the confines of our greater network. There is no social contract because the community is built around individuals retaining their power, regardless of their socio-political or moral obligations – a free-for-all of unregulated rights.

Rousseau differs from both Socrates and Hobbes. A French philosopher of the 18th century, his work was influenced heavily by the enlightenment period. Rousseau took a very different view of the world, and of the social contract: where Socrates and Hobbes nearly worshiped the state, Rousseau felt that humanity had left its ‘noble self’ behind in its struggle for progress. For humanity to regain itself, and for societies to actually serve a beneficial purpose, there must be a strict form of democracy in which the people do not serve the state, but in fact form it directly.

Rousseau’s opinions on democracy were integral to his philosophical theories surrounding the social contract. Rousseau felt strongly that society had formed in such a way that a class system was inevitable–and, naturally, the people who were at the top were the ones deciding how things worked. It was because of this class system that Rousseau felt a pure democracy had to be the best form of government. He did not advocate a return to the animal state, but he did believe that it was the role of the government to provide the same type of environment, one which provided freedom.

TOR is, among other things, a tool that allows citizens to be free. It allows for uninhibited communication and access to information. Rousseau advocates a society in which all citizens directly make up the government; therefore it’s possible that he would appreciate the access to information that TOR provides – access that could bridge the problem of a direct democracy in a country as large as China or the US. One of the largest issues with modern politics is an uninformed electorate; often, people are unable to get the information they need, relying on at most a few minutes of news from a specific channel with its own litany of biases. A program such as TOR would allow for full access to information, regardless of government consent; in a country like China, where information is often censored, it would allow for a more informed common citizen, and therefore a society that is more conducive to Rousseau’s image of a pure democracy. The other implications of TOR’s use would not factor into Rousseau’s opinion of it, as they don’t directly bear on his view of the social contract.

The case of TOR reminds us that there is no item that is inherently good or evil. But the ethical implications of any technology will be considerable. For a program as complicated and controversial as TOR, there is no limit to the ethical questions that can be asked. But the program is most relevant to political philosophical theory, and social contract theory is key to answering the questions surrounding TOR. Looking at the past we can see how some of the most influential philosophers in the history of social contract theory would view a technology that emerged years after their deaths. Socrates, Hobbes, and Rousseau all attempted to answer similar questions–questions of loyalty and obligation, of individual and government rights–and a system like TOR would have given rise to significant discourse.

This is a short guide with pictures that will hopefully explain how to set up Bitlocker drive encryption for your Windows system, and get you on your way towards a more secure computer.

Bitlocker allows for full system encryption or encryption of just a partition. It uses 128-bit AES by default, but we can move it to 256-bit AES. Let me just say that 128-bit is entirely sufficient, and there is very little reason to use 256-bit, as it can cause performance issues.

Setting AES 256bit

If you’re dealing with highly classified information, or your system’s performance is not a concern, you can change Bitlocker’s settings to use 256-bit AES. 256-bit mode also increases the rounds used from 10 to 14. To change the setting, type “gpedit.msc” into the search box, then navigate to Computer Configuration > Administrative Templates > Windows Components > BitLocker Drive Encryption and set “Choose drive encryption method and cipher strength” to 256-bit AES.

Many systems don’t have a TPM, so we can disable the requirement for one. After doing the above, go to “Operating System Drives”, enable the “Require additional authentication at startup” policy, and check the box to allow BitLocker without a compatible TPM.

Set Up Of Bitlocker

Restart the system if you’ve done either of the above. Now we get to the setup of Bitlocker itself. It’s very simple:

First we choose the drive to encrypt.

You can encrypt your OS partition or any other partition. For the most security you’ll want the OS encrypted; this prevents attackers from manipulating the machine while it’s powered off.

Once you choose the drive it’s time to set a password.

Remember, a good password will have at least one of every character type: lower case, upper case, symbol, number (aB#4). Do not use characters outside the standard ASCII set – Bitlocker lets you enter them when setting the password, but won’t let you type them at boot, and you’ll be locked out.

You will be asked to save a recovery key. This key can unlock your drive, so it’s very dangerous if mishandled. You have three options if you want to be secure:

1) Print the key and hide it well. My least favorite option.

2) Save the key to a file and keep it on a separate USB, which you can hide or encrypt.

3) Save the key to a file and delete that file after a reboot.

Now you choose whether to encrypt the full partition or just the used space. If you skip encrypting the free space, an attacker may be able to recover previously deleted data from it. I highly suggest you encrypt the full partition.

Bitlocker Should Be Set Up

If you’ve followed these steps, Bitlocker should be set up properly. For the average user the performance hit (with default settings) should be negligible.

It’s important to state that Bitlocker is only good for preventing access to your information while the system is off. If the system is on, you are vulnerable. It will not prevent keyloggers, viruses, or any other type of malware – all it prevents is offline access to, and tampering with, the data on the device.
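One last tip: if you want to confirm which cipher your drive actually ended up with, Windows ships a BitLocker command line, manage-bde; the drive letter here is an assumption:

```batch
:: Run from an elevated Command Prompt once encryption has started.
:: Shows conversion status and the cipher in use (e.g. "AES 256"
:: if the Group Policy change took effect). C: is an assumption.
manage-bde -status C:
```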

There’s been a lot of talk about a recent feature in Ubuntu 12.10 – when you type into the Dash you receive information back from Amazon based on your input. There’s been a massive outcry that this is a privacy violation or even a security issue, and the media has been fueling it as much as it can. I’m going to try to explain what’s going on here and where the issues actually lie.

How It Works

The feature activates when you type into the Dash – a feature of Unity that takes your text input and returns relevant information. When you type into your Dash, the text is sent to Canonical, Canonical sends it to Amazon, and the results are sent from Amazon back to Canonical and finally land on your system. What is sent is only what you type in, nothing more.

The idea here is that I can type in “Vacuum” and now I get books on vacuum cleaners or some such thing. The Dash is meant to be a ‘conduit’ of information, you type a word and it responds with everything related to that word. Amazon is just one more way to provide information to you.

The Problem?

Users seem to think this is a privacy issue. I think people hear “OMG Ubuntu has Amazon ads now! And it can see what you type!” – no. Amazon can not see everything you type, and they’re not ads. Amazon can see the words you put into the Dash, with Canonical acting as a proxy – so really, it’s Canonical that “sees” what you’ve typed into the Dash.

So this isn’t some full system keylogger or some such thing, it’s Canonical (the company behind Ubuntu, that packages your system components for you) seeing what you type into the Dash.

So ask yourself – what do I type into the Dash? For me it’s simple – I would type “Pidgin”, “Chrome”, or “Homework” and open those files/programs via the Dash. Not exactly personal information.

Unless you’re typing in “porn” or your social security number perhaps you should question how sensitive the information in your Dash really is. Really, what is it that you enter that’s scary?

And then remember that Canonical doesn’t need some clever Dash keylogger to steal your information… they’ve “got root” as Mark Shuttleworth put it. If you don’t trust Canonical you shouldn’t be using their Operating System because they could easily patch up a kernel to spy on you or any other system component that they build on your behalf.

I’ve heard people claim “But what if someone accidentally puts a password in?” well, uh, yeah, that sucks! Canonical then sees your password… not that they need it since, again, they have root. And all of this information is sent to Canonical via secure encrypted connection.

Even beyond all of this users seem to have missed that it’s always been this way. Yes, your Dash has always communicated via internet – how do you think it gets ‘recommended apps’ from the software center? Or music? It’s done this for a long long time and nothing has changed.

And, of course, you can easily disable this by typing “Privacy” into the Dash and disabling the feature.
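If you’d rather remove the feature entirely than just toggle it off, the Amazon results come from a separate lens package that can simply be uninstalled (package name as shipped in Ubuntu 12.10):

```shell
# Removes the shopping lens that forwards Dash queries to Amazon.
sudo apt-get remove unity-lens-shopping
```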

So I got two referral hits from the discussion page for Iron Browser. It seems that someone there wants a “conflict” section explaining that Iron doesn’t actually provide anything that Chromium doesn’t, and nothing of substance compared to Chrome. A really great endeavor.

I’m not optimistic it will be let through. Why? The discussion page is clearly biased.

UPDATE: I posted on the discussion page and talked to someone there about getting proper sources up. I’m obviously not reputable, but my sources are – the issue is that because none of my sources explicitly mention the Iron browser, they can’t be used to discredit it. Essentially, Iron says X and the reputable source says “X is false”, but because the reputable source doesn’t say “X is false and therefore Iron is making a false claim”, it can’t be linked – Wikipedia doesn’t allow that type of connection, which they refer to as ‘synthesis’. I think that’s idiotic, but I don’t really care – I’ve had thousands of hits on my Iron page, so even though users who go to Wikipedia are essentially being lied to by proxy, many have come to my blog and gotten the facts.

To quote:

“Scam” is not accurate

I’ll be reverting some of User:98.207.42.24‘s edits that basically littered this article with the same statement over and over, about how a self-published source compared Chromes and Irons source code and concluded that Iron is a “scam” because of the fashion in which Iron sets privacy values (hard-coded instead of through a user interface).

I have several problems with this:

For someone that doesn’t want to do research about Google Chrome’s privacy faults before starting up the browser (even for the first time), Iron is helpful. And according to Irons website, that is what it was created for.

As far as I can tell Iron was made for comfort and it’s not ment to fool anyone into thinking otherwise.

That’s from user Bitbit. Taking a look at Bitbit’s user page, we see he’s obviously anti-Google (there’s a banner dedicated to disliking Google) – I checked his page because his post made the bias seem clear, and I’ll explain why in this post. I want it clear that, unlike my post about the Iron developer, I’m not trying to be insulting here. The Iron developer is putting out crapware/scareware, whereas this is just one user with his opinions. I’m not going to attack him or his opinions, only explain why his post on the matter is invalid. So many people have fallen for the Iron browser – I don’t blame any of them, and I’ve seen many users on other forums read the information and immediately state that they’re moving to another browser.

Let’s take this one step at a time.

For someone that doesn’t want to do research about Google Chrome’s privacy faults before starting up the browser (even for the first time), Iron is helpful. And according to Irons website, that is what it was created for.

For someone who doesn’t want to do research? Well… if you’re a user who winds up on Iron’s page, it should be obvious that you were looking for a more private alternative to Chromium or Chrome. So the defense that Iron isn’t purporting itself as a more private alternative – only one with a more private default configuration – is fairly weak. Furthermore, the Iron developer’s claims are disingenuous. One really clear example is the “URL Tracker” feature – a poor choice of name, but the Iron developer makes it out to be a privacy issue when it is obviously not (you can read all about this in my original post). Therefore it is far more than a claim of “default configuration”, because the developer has claimed that removed features are privacy violations when they are not.

The source is not timestamped so we have no idea when it was created and what has changed since then on Iron and Chrome.

There are multiple sources available. I found many in my original blog post on Iron. Bitbit didn’t look for any, so naturally he didn’t find them. The information is out there, and I’ve put it all in one place to make things simpler.

As I explained in my original post, RLZ is not a privacy issue. It also does not exist in Chromium.

As far as I can tell Iron was made for comfort and it’s not ment to fool anyone into thinking otherwise.

If you look at the facts from my first post it should be clear that Iron was made for money. It is absolutely designed to fool people – the page is filled with lies.

[…], these are flat-out dry facts, not an opinion

Here are some absolute sourced facts that can be externally verified, and I didn’t go into in my original post. I suggest reading the first post for a full tear down of Iron.

The default installation of Iron contains a bookmark to the Iron forum. On this forum? Ads.

And the default home page? Ads as well!

On its own this is hardly damning; ad-supported software is perfectly fine. The issue is that this is software designed to make a profit, and it does so by playing on users’ fears. Adware, scareware, scamware: all fit easily. In every case a user is tricked, or scared, into using software that makes money off of them while providing no actual benefit in return.

So it’s my hope that Wikipedia includes a section in their article that puts all of the available information in one place. I’ve made use of many sources, and they’re free to verify it all. When you search “Iron Browser” on DuckDuckGo, Wikipedia is in the result box, and it’s the first thing many users will see; providing those users with all of this information is important, and it’s what Wikipedia is for. I’ve sourced my original post thoroughly and redundantly, and I hope the information in that post is put to good use.

Thanks to whoever said the nice words about the post on that Wikipedia article (no username). I appreciate that.

You can find my original post here: https://insanitybit.wordpress.com/2012/06/23/srware-iron-browser-a-real-private-alternative-to-chrome-21/

If you’re asking the question “How do I securely do my banking online?” you’re one of many. Banking is something we used to do up front and in person (or so I’m told, before my time), but now that the web allows access to our accounts from any location, we have to ask how to do something so sensitive in a secure manner. This article is a short guide to secure online banking.

Normally I say that Chrome is a secure browser for the average user, but it’s a different kind of secure: its sandbox aims at preventing system infection rather than web-based attacks. In terms of web security, preventing CSRF, XSS, and the like (the types of attacks most directly related to online banking), I think Firefox with NoScript takes the cake. NoScript is the program proven to prevent XSS in the most situations, it’s the only one with ClickJacking prevention that’s worth anything, it protects against SVG keylogging, and it does much more. For banking you want to isolate and restrict the website you’re interacting with as much as possible.

There are a few other things you’ll want to do before setting up Firefox if you’re planning on banking online:

1) Make sure you are on a secure network. Assuming wireless, a secure network is one using WPA2 encryption with a strong password of at least 12 characters that only you know.

2) Make sure your system is completely up to date. Keeping intruders out starts with patching. The browser, operating system, and your plugins are key here.

3) If you’re using Ubuntu, enable AppArmor for Firefox (sudo aa-enforce /etc/apparmor.d/*firefox*); other distros may use other LSMs.
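The AppArmor step above can be checked without changing anything first. This is a sketch that assumes an Ubuntu-style layout (a profile under /etc/apparmor.d and the apparmor-utils package for aa-enforce); adjust the paths for your distro:

```shell
#!/bin/sh
# Sketch: locate the Firefox AppArmor profile without modifying the system
# (assumes Ubuntu paths; other distros may differ).
profile=$(ls /etc/apparmor.d/*firefox* 2>/dev/null | head -n 1)

if [ -n "$profile" ]; then
    # aa-enforce comes from the apparmor-utils package.
    echo "Found profile: $profile"
    echo "Enable enforcement with: sudo aa-enforce $profile"
else
    echo "No Firefox AppArmor profile found under /etc/apparmor.d"
fi
```

Once enforced, `sudo aa-status` should list the Firefox profile under “enforce mode”.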

Chrome now includes a “Script Bubble” feature that shows you how many scripts are running on a specific page and which ones. The feature could potentially allow users to spot a malicious extension more easily.

There’s no way to stop an extension from running on a page through the Script Bubble, which is something I’d really like to see. Further, I’d like to see a whitelist for scripts on pages, along with options similar to the other content settings.

Chromium – the open source project that Google Chrome is based on – has gained preliminary support for ‘Do Not Track’. Do Not Track is a “signal” the browser sends to a website saying that the user doesn’t want advertisers to track them; essentially, it’s a way to ask to stay private.
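Under the hood the signal is just a single HTTP request header. A minimal sketch of what a DNT-enabled browser adds to each request (example.com is only a placeholder host here):

```shell
# Build the request a DNT-enabled browser would send; the only difference
# from a normal request is the one extra "DNT: 1" header line.
dnt_header="DNT: 1"
printf 'GET / HTTP/1.1\r\nHost: example.com\r\n%s\r\nConnection: close\r\n\r\n' "$dnt_header"
```

A site or advertiser that honors the header is expected to skip tracking for that request; nothing technically forces it to, which is the crux of the debate below.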

Recently Do Not Track has gotten a lot of press due to Internet Explorer 10 enabling it by default – a very poor publicity move on Microsoft’s part as Do Not Track can only be enforced when a user opts into it. In other words anyone using Internet Explorer 10 won’t benefit from Do Not Track at all.

Do Not Track has long been built into Mozilla’s Firefox browser and it’s nice to see Chrome adopting it as well.

Though the Do Not Track header is not yet legally or technologically enforced some advertisers will adhere to it. With both Firefox and Chrome supporting it there’s a lot of weight behind the idea.

Unfortunately, due to Microsoft’s idiotic move, adoption may not pick up as quickly.

We’ll see how it all plays out. We can likely expect Do Not Track to hit Stable Chrome in the next two months.

So I’ve just spent some time getting DNSCrypt working on my system. It was a bit of a pain, but now that it’s done it shouldn’t be hard to recreate, so I thought I’d write up a short guide explaining how to do it.

Note that all double “-“s are turned into single ones. This is a WordPress issue. You’ll have to manually type them in, sorry.
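Since the dashes get mangled, here is the CloudNS resolver command from earlier in this post with the double dashes restored, stored as a shell string so you can inspect it before running anything. The values are the ones published here (Aug 8th, 2013); check cloudns.com.au for the current ones before relying on them:

```shell
# The dnscrypt-proxy invocation with "--" restored (WordPress collapses
# double dashes into single ones). Kept as a string for inspection; run it
# (e.g. from rc.local) only after verifying the current CloudNS values.
cmd='dnscrypt-proxy --user=dnscrypt --daemonize --resolver-address=113.20.6.2:443 --provider-name=2.dnscrypt-cert.cloudns.com.au --provider-key=1971:7C1A:C550:6C09:F09B:ACB1:1AF7:C349:6425:2676:247F:B738:1C5A:243A:C1CC:89F4'
echo "$cmd"
```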