Facebook may be dominating the headlines, but it certainly isn’t the only platform taking a long, hard look in the mirror. After all, the data-breach scandal comes hot on the heels of concerns over hate speech on micro-blogging channels, the politicization of fake news and accusations about ideological echo chambers.

There is no doubt that privacy is the game-changer. Good faith has been the watchword in tech. Platforms were essentially saying to users, “Yes, we have your data, but don’t worry: We’ll use it responsibly.” To which users replied, “We don’t understand what goes on behind the scenes data-wise. It’s confusing and boring, but we like your services, so we’ll trust you’re doing the right thing.”

The good faith arrangement has come unstuck. The trust has gone. It’s no longer good enough for tech companies to talk about improving transparency: We have to back that up with concrete steps to win trust back. Businesses that don’t engage with this collective responsibility will pay a hefty price.

One simple solution would be to ensure that every single user is comprehensively informed about the ways their data could be used, and given the chance to opt out. But don’t platforms already do that? Sure they do: It’s called “agreeing to the terms and conditions,” and “comprehensively” is right: A 2016 social science study concluded that, on average, a consumer would need to spend 40 minutes per day, every day, to read all the policies of the services they sign up for.

In fact, the same researchers asked a group of volunteers to sign up for a fictitious social media platform, and 98 percent of them failed to spot a clause in the terms and conditions stipulating that they surrender their first-born child as payment for services.

The terms-and-conditions fiction that all parties were happy to go along with—“We give you impossible things to read; you pretend you’ve read them”—has been exposed as just that: an unworkable falsehood. Users are beginning to understand what they agreed to, and a lot of them don’t like it very much.

Platforms cannot afford to sit around and wait for regulators to solve these problems. The big players are turning their tankers around, but it could take a while. Smaller brands have a golden opportunity to put consumer privacy at the heart of their marketing strategies right now.

This doesn’t mean that data will somehow cease to flow. One consequence of the increased media coverage is that people are now far less naive about how our industry works, and that’s not a bad thing. In fact, brand marketers have a responsibility to continue this process of education. It is also in their best interest.

Consumers now realize that the “free” services they enjoy are essentially paid for in data, and that their favorite social media platforms can legitimately be described as ad networks, too.

However, they are also beginning to understand that safe and responsible use of their data can be directly beneficial to them. If advertising is an essential ingredient in the process, would they rather be bombarded by content that is irrelevant to them, or carefully targeted with products that match their interests and needs?

Tech companies have the power to bring people more of what they want and less of what they don’t want. As chief technology officer of a mobile-first ad-tech company, I devote our energy and expertise to creating technology that does precisely that.

In today’s on-demand world, most companies are driven by data in some way. The ability to access unprecedented levels of information from multiple sources is providing organizations with the insights they need to make advancements in healthcare, education, insurance, customer service, retail and so on. Most people don’t want to turn back the clock on these improvements, just as they don’t want to give up social media.

Businesses that wholeheartedly embrace the need to protect their users’ data from misuse will see a rise in customer loyalty. This might even lead to a legitimate upswing in data collection, as more and more people come to realize which brands use information responsibly. Conversely, platforms that fail to take privacy seriously could face a real and deeply damaging user backlash.

So, what will change look like? Time will tell. Certainly, there are technical problems that need addressing and loopholes that need shutting down so that abuse is minimized. But that alone won’t cut it: After all, Facebook shut down a huge loophole in 2015, and it did little to stem the tide of data misuse.

In today’s marketplace, users want to know how companies are responding to the data issues currently under the microscope. Here are a few suggestions:

Platforms should openly acknowledge that it is incredibly difficult to protect users’ data from misuse once it gets into the murkier parts of the developer ecosystem.

Platforms should demonstrate what they are doing to make sure that doesn’t happen. This means, among other things, weeding out low-quality content.

Platforms should empower users to take privacy into their own hands in ways that they can understand (as Facebook is now doing).

Platforms need to educate users about how their data can be used in positive ways.

Platforms are right to be fighting ad fraud with everything they’ve got; now they must take user privacy every bit as seriously. The days of collecting and monetizing more and more personal data are over. From now on, it’s about entering into a genuine partnership with consumers—using data they are comfortable sharing in the most intelligent and transparent way we can.

Platforms that aren’t seen to be operating this way, and aren’t prioritizing privacy, will pay the price.