Data scientist Cathy O’Neil said drawing conclusions about health risks from such data could lead to a bias against some poor people. It would be easy to infer they are prone to costly illnesses based on their backgrounds and living conditions, said O’Neil, author of the book “Weapons of Math Destruction,” which looked at how algorithms can increase inequality. That could lead to poor people being charged more, making it harder for them to get the care they need, she said. Employers, she said, could even decide not to hire people with data points that could indicate high medical costs in the future.

Although the data brokers say these data are intended only as a reference for offering clients health advice, and are not recommended for use in pricing, industry insiders reveal that in practice they are still sometimes used to set prices.

McCulley and others at LexisNexis insist the scores are only used to help patients get the care they need and not to determine how much someone would pay for their health insurance. The company cited three different federal laws that it said restricted it and its clients from using the scores in that way. But privacy experts said none of the laws cited by the company bar the practice. The company backed off the assertions when I pointed out that the laws did not seem to apply.

In Europe, a law that took effect in May (the GDPR) bans the sale of personal data outright, so the problem there is much smaller than in the United States, where the law protects only patients' medical records.

The brokers also cannot guarantee that every person's data is completely accurate:

Ruane acknowledged that the information his company gathers may not be accurate for every person. “We talk to our clients and tell them to be careful about this,” he said. “Use it as a data insight. But it’s not necessarily a fact.”

In a separate conversation, a salesman from a different company joked about the potential for error. “God forbid you live on the wrong street these days,” he said. “You’re going to get lumped in with a lot of bad things.”

The author asked the broker for a copy of his own data, and the result ran to a startling 182 pages:

I filled out a request on the LexisNexis website for the company to send me some of the personal information it has on me. A week later a somewhat creepy, 182-page walk down memory lane arrived in the mail.