LAS VEGAS — A senior Facebook executive on Tuesday defended the social media giant’s data-protection practices, saying the platform is “committed to privacy and building privacy by design.”

“(Critics see something) inherently wrong with the business model, but there isn’t. You can have a privacy-protected ad business model and we do,” said Facebook’s chief privacy officer Erin Egan, adding that the platform has a number of privacy practices in place, such as data minimization, that protect users’ information.

Egan’s comments were part of a far-reaching discussion at CES 2020 on the range of privacy issues facing tech and consumer companies, and the steps they are taking to protect user data in the wake of major breaches like Facebook’s Cambridge Analytica scandal, which rocked the company’s reputation. Jane Horvath and Susan Shook, the chief privacy officers at Apple and P&G respectively, and FTC Commissioner Rebecca Slaughter also participated.

Egan, along with her fellow panelists, defended the basic premise of data collection as a means of providing consumers with products, services and information they want or need. “We collect the data we need to serve people and serve underlying advertisers,” Egan said. “Are people deriving value from ad biz models? Yes, we believe they are. We are able to provide a service for free.”

Shook agreed in principle, saying data enables P&G to connect consumers with products of specific interest to them based on, say, where they live, and is critical in enabling the company to foster relationships with customers who could very easily buy their goods from other companies.

Shook said, however, that for P&G the key is using that data in ways that don’t breach consumers’ privacy or extend beyond the company’s core purpose of providing them relevant information; otherwise, the company risks losing their trust. “We understand that customers can easily exit the equation,” she said.

Like her fellow panelists, Shook said it’s incumbent on companies to institute, and adhere to, practices like data minimization (using only as much data as necessary to accomplish a task), differential privacy (sharing aggregate information about groups without revealing data about individuals) and de-identifying data after it’s been harvested.
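To make the differential-privacy idea concrete, here is a minimal, hypothetical sketch in Python; it is not drawn from any of the companies’ actual systems. It publishes a count over a group after adding calibrated Laplace noise, so the released figure reveals little about any single individual.

```python
import numpy as np

def dp_count(records, predicate, epsilon=1.0):
    """Differentially private count of records matching `predicate`.

    Adding or removing one record changes the true count by at most 1
    (sensitivity = 1), so Laplace noise with scale 1/epsilon gives
    epsilon-differential privacy for this single query.
    """
    true_count = sum(1 for r in records if predicate(r))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical example: report roughly how many users enabled a feature,
# without the published number exposing any one user's choice.
users = [{"enabled": True}, {"enabled": False}, {"enabled": True}]
print(dp_count(users, lambda u: u["enabled"], epsilon=0.5))
```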

Egan said it’s also “absolutely important” that tech companies take responsibility for ensuring privacy is protected rather than putting the onus on consumers through complex opt-out agreements. Horvath said Apple minimizes the data it collects and transmits to what is necessary to provide consumers with services like weather forecasts.
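As a rough illustration of that data-minimization principle (a hypothetical sketch, not a description of Apple’s implementation), a weather request can be built from a coarsened, city-level location instead of exact coordinates:

```python
def coarsen_location(lat: float, lon: float, decimals: int = 1):
    """Round coordinates to roughly 11 km granularity: enough for a local
    forecast, far less revealing than precise GPS coordinates."""
    return round(lat, decimals), round(lon, decimals)

def build_forecast_request(lat: float, lon: float) -> dict:
    # In this sketch, only the coarsened location and the fields actually
    # needed for the forecast leave the device.
    coarse_lat, coarse_lon = coarsen_location(lat, lon)
    return {"lat": coarse_lat, "lon": coarse_lon,
            "fields": ["temperature", "precipitation"]}

print(build_forecast_request(37.774929, -122.419416))
# {'lat': 37.8, 'lon': -122.4, 'fields': ['temperature', 'precipitation']}
```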

Slaughter, however, said protecting consumer privacy involves complexities that aren’t necessarily addressed even by stringent practices, as a host of unknowns and risks are inherent in data collection.

Location data, for instance, is highly sensitive information, as it could put individuals’ personal safety at risk. Consumers don’t necessarily understand the complexities of opt-in and opt-out agreements, or what companies actually do with their data once it’s collected. And just because companies have the ability to de-identify data doesn’t mean they will, Slaughter said.

The industry is also wrestling with problems surrounding encryption, which, in Apple’s case, can protect sensitive information like users’ health and financial data but can also shield terrorist or abusive content, panelists said.

Slaughter said she believes “the necessary conditions are in place,” including having tech companies on board, for the passage of federal legislation that would address privacy rights and how businesses could better ensure such protections.

However, she said, “It has to be meaningful and it has to be strong. And there has to be real consequences for violating (it).”