The idea of big data makes a lot of us uneasy. It sounds a lot like Orwell's Big Brother, and with ads from companies that seem to know what we're doing and the recent NSA domestic spying revelations, it is understandable that some people find the massive amount of information out there about all of us disturbing.

People can tell lots about you from this data, including your age, gender, sexual orientation, marital status, income level, health status, tastes, hobbies, habits and a whole host of other things that you may or may not want to be public knowledge. They need only have the means and the will to gather and analyze it. And whether they mean well or ill, it can have unintended consequences.

We give up more information than we realize to companies with whom we do business, especially if we use loyalty cards or pay with credit or debit cards. Someone can learn a lot about you just from analyzing your purchases. Target received some press when it was discovered that they could pinpoint which customers were pregnant and even how close they were to their due dates from things like the types of supplements and lotions they were buying. In one case, Target began mailing coupons for baby products directly to a teenage girl, sparking her father's ire against the company for sending her what he considered age-inappropriate ads -- until he found out about her pregnancy [sources: Datoo, Duhigg, Economist].

Governments and privacy advocates have made attempts to regulate the way people's personally identifiable information (PII) is used or disclosed in order to give individuals some amount of control over what becomes public knowledge. But predictive analytics can bypass many existing laws (which mainly deal with specific types of data like your financial, medical or educational records) by letting companies conclude things about you indirectly, and likely without your knowledge, using disparate pieces of information gathered from digital sources. Some companies are using the information to do things like check potential customers' credit worthiness using data other than the typical credit score, which can be good or bad for you, depending upon what they find and how they interpret it. One worry, though, is that this type of personal information can lead to hard-to-detect employment, housing or lending discrimination. And worse yet, it may not always be entirely accurate.

It's also possible for patterns seen in big data to be misinterpreted and lead to bad decisions. Like any tool, it yields results only as good as how it's used. Even though math is involved, big data analytics is not an exact science, and human planning and decision-making have to come in somewhere. With huge data sets, judgment calls must be made about what is important and what can be ignored. Still, companies that perform big data analytics well can gain a competitive advantage.

Such analysis can be used for things that are obviously good, such as fighting fraud. Banks, credit card providers and other companies that deal in money now increasingly use big data analytics to spot unusual patterns that point to criminal activity. On an individual account, they can quickly be alerted to red flags like purchases of unusual items, amounts the customer normally wouldn't spend, an odd geographical location or a small test purchase followed by a very large purchase. Patterns across multiple accounts, like similar charges on different cards from the same area, can also alert a company to possible fraudulent behavior.
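To give a sense of how such red flags might be checked in practice, here is a minimal, purely illustrative sketch in Python. Real fraud-detection systems use statistical models trained on millions of transactions; the thresholds, field names and rules below are invented assumptions, chosen only to mirror the patterns described above (an unusually large amount, an odd location, and a small test purchase followed by a big one).

```python
# Hypothetical rule-based fraud flags. All thresholds and field names
# are invented for illustration; real systems use trained models.

def flag_transaction(txn, history):
    """Return a list of red flags for a transaction dict, given the
    customer's recent transaction history (a list of similar dicts)."""
    flags = []
    amounts = [t["amount"] for t in history]
    avg = sum(amounts) / len(amounts) if amounts else 0

    # Amount far above what this customer normally spends
    if avg and txn["amount"] > 5 * avg:
        flags.append("unusual amount")

    # Purchase from a location the customer has never used before
    if txn["country"] not in {t["country"] for t in history}:
        flags.append("odd location")

    # A small "test" purchase followed by a very large one
    if history and history[-1]["amount"] < 2 and txn["amount"] > 500:
        flags.append("test purchase then large purchase")

    return flags

history = [{"amount": 40.00, "country": "US"},
           {"amount": 1.50, "country": "US"}]
txn = {"amount": 900.00, "country": "RU"}
print(flag_transaction(txn, history))
# → ['unusual amount', 'odd location', 'test purchase then large purchase']
```

A real system would of course weigh many more signals at once and score them probabilistically rather than with hard cutoffs, which is exactly why cross-account patterns (like similar charges on different cards from the same area) are detectable at all.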

Huge data sets can aid in scientific and sociological research, election predictions, weather forecasts and other worthwhile pursuits. Social media posts and Google searches have even been used to quickly find out where disease outbreaks are occurring. So it's not all bad news. It'll just take a while to work out all the potential problems and to implement laws that would protect us from potential harm. Until then, if you're worried, you might want to revert to cash purchases and watch what you put out there about yourself. Still, we're probably too far down the rabbit hole for any of us to be entirely off the radar.