Facebook’s top security chief said he is still with the company following reports that his tenure may soon be over. The discussion about his status continues as it becomes clear that Facebook user information is Facebook’s currency. It’s bought and sold every day.

Alex Stamos, Facebook’s chief security officer, says he’s still “fully engaged” in his work at Facebook. In a tweet, Stamos acknowledged that his role had changed but added, “I’m currently spending more time exploring emerging security risks and working on election security.”

His remarks follow a New York Times report published late Monday that said Stamos plans to leave Facebook by August. According to the Times, his expected departure stems partly from internal disputes over how much to disclose about nation-states manipulating Facebook, and from restructuring ahead of the 2018 midterm elections. The debate that erupted over the weekend about Cambridge Analytica’s use of Facebook user data has heightened interest in just how user information is handled.

In a statement to CNN, a Facebook spokesperson said Stamos is still the company’s chief security officer.

“He has held this position for nearly three years and leads our security efforts especially around emerging security risks,” the company said in a statement. “He is a valued member of the team and we are grateful for all he does each and every day.”

However, while both Facebook and Stamos said he is currently still with the company, neither denied he would be leaving in the near future.

Stamos joined Facebook in 2015 after previously serving as the chief information security officer at Yahoo.

About your data

Now, about just how safe your personal information is at Facebook …

When advertisers want to target a specific group of customers who, for example, are a particular age and have a certain political affiliation or interest, Facebook makes that possible. The stuff you share and the inferences Facebook makes about you are packaged together with similar people’s data, stripped of names, and sold to companies. That allows businesses to put ads in front of people they believe they can influence.

On Facebook, you are the product. Advertisers are the customer.

Facebook’s not alone. Most advertiser-supported networks sell some of your information to third parties. Google, Microsoft, Yahoo, AOL, Amazon, Twitter and Yelp do the same.

Giving up our privacy is the price we pay for getting to use Facebook for free. Most of the time, that tradeoff works: People take advantage of free services by posting, searching and sharing. Most companies that collect our data use it for legitimate purposes and within the bounds that companies like Facebook permit.

That arrangement has turned Facebook and Google into online advertising juggernauts. They have built massive audiences of billions of users, and advertisers flock to them. Together, Facebook and Google control three-quarters of the $83 billion digital advertising market in the United States, according to eMarketer.

But the customer-is-the-product deal doesn’t always work to the user’s advantage. This weekend, the public learned that data firm Cambridge Analytica improperly accessed the personal information of 50 million Facebook users in an effort to influence the 2016 election.

Internet companies have a financial disincentive to give users more control over their data. If people share less, social networks will earn less money.

‘Meaningful transparency’

Most companies offer privacy settings, and some even let you leave and take your data with you. But they don’t make it easy, and critics say social networks and internet companies should give users far more say about which data ends up in advertisers’ hands — and when.

“Tech companies can and should do more to protect users, including giving users far more control over what data is collected and how that data is used,” said the Electronic Frontier Foundation in a statement. “That starts with meaningful transparency.”

Once you share something on any digital service, your personal information leaves your control. Cambridge Analytica serves as a stark reminder of that.

The data company broke Facebook’s rules when it obtained the data from a researcher without users’ authorization. That’s the tricky thing with data: Once it’s out there, it’s hard to put boundaries around it. Facebook trusts companies and researchers who obtain your data to use it properly. If they break the rules, Facebook can punish them (it suspended Cambridge Analytica, for example), but only after your data has already been used illicitly.

“It’s difficult to police after it’s left your secure perimeter,” said Rik Ferguson, vice president of security research at Trend Micro. “Cambridge took advantage of the porousness of Facebook.”