
Facebook probably knows you better than you do – should you be worried?

Written by
Evander Pierre
Published on
February 3, 2021

In the first week of 2021, messaging giant WhatsApp announced that it would begin siphoning user data to parent company Facebook – and that anyone who refused the new terms would be booted from the service. Plenty of people did refuse, and loudly, which prompted WhatsApp to have a rethink.

The privacy policy changes have now been postponed until May 2021, and nobody will lose access to WhatsApp in the meantime. Millions of users, however, aren’t waiting around and have already jumped ship for rivals such as Telegram and Signal (the latter endorsed by Elon Musk). The figures bear this out: comparing December 21 – January 3 with January 4 – January 17, Signal downloads rose 9,493 per cent, while WhatsApp’s fell 16 per cent.

What did WhatsApp want to do?

The company said its policy update included new options for people to message businesses on WhatsApp and did not “expand our ability to share data with Facebook.” So is this all just Twittersphere histrionics? Well, not exactly. The information scheduled to be shared with Facebook included data that could be linked to your identity: your device ID, usage and advertising data, purchase history, financial information, physical location, phone number – the list goes on. Signal doesn’t require access to any of this data, which raises the question: does WhatsApp really need it?

Source: Zac Doffman, Forbes

Why people pushed back

Trust in Facebook has plummeted since the Cambridge Analytica scandal. People are also wising up to the value of their data: terms such as ‘cookies’ and ‘GDPR’ are now commonplace, and people think much harder before ticking boxes. Tech vendors must therefore consider how they manage customer data and communicate any changes transparently. (They should have been doing this anyway, but benefited from the public’s prior naivety.) WhatsApp has tried to explain itself since its first announcement, but for many people it’s too little, too late – the damage is done.

Our thoughts

While security and privacy are not the same thing, our cyber pros are clued up on both. Some of the team share their opinions below.

Robert Klentzeris,
Junior Application Security Engineer

I’m switching to Signal and persuading those in my WhatsApp groups to join it too. I won’t be trying to convert people to Telegram, as it doesn’t have the best track record (e.g., bugs being introduced to its cryptographic code that would serve no other purpose than to offer a backdoor).

The “security” risk of Facebook/WhatsApp knowing so much about individuals is smaller than the risk of that knowledge being used to discriminate against people, or control aspects of their lives, without their knowledge. Facebook already collects 52,000 unique attributes about its users. Do we really need to feed it even more information to eventually leak through scandals like Cambridge Analytica?

Some people say, “if you have nothing to hide, you've got nothing to worry about”. Let me answer this with another question: would you send me screenshots of all the messages you've sent through WhatsApp? Despite the logistical nightmare of taking that many screenshots, I assume your answer would be no. What if I look over your shoulder while you type them out? Would you behave or talk differently? All of a sudden you’re self-censoring – this is the slippery slope we're on. It's not by chance that privacy is recognized as a human right.

Mat Rollings,
Senior Application Security Engineer

I have switched to Signal and convinced others to do the same. I don't think the new privacy policy is something to panic about, but moving towards privacy-focused applications reduces the power and control that private businesses have over our communications.

Facebook has proven time and again that it operates unethically, tracking your location, your contacts and your interactions so that it can sell this data on. This data is useful not only to advertisers but also to parties looking to spread misinformation to vulnerable groups.

Everyone has things they wish to remain private: medical records, search history, bank account details, etc. Would you hand your unlocked phone to a stranger? You may not oppose government and law enforcement having access to this information, but sharing it with private companies is another matter.

Ben McCarthy,
Lead Cyber Security Engineer

People in the cybersecurity community have been leaving WhatsApp, many of them for Signal. Family and friends have asked me about switching and I’ve recommended they do it. I wouldn't say I’m worried about the situation – much of this data is already collected by other enterprises; however, where you can, you should protect your privacy and data.

WhatsApp’s data farming isn’t a traditional “security risk”, and using the app doesn't necessarily increase your likelihood of getting hacked (though there is still a risk). The issue is that your data is used to target you. If you allow WhatsApp and Facebook to use and share your data, the internet becomes tailored to you – whether in relation to politics, advertising or anything else – and this can enable third parties to manipulate your views. When Facebook and WhatsApp control what you see, how do you know what is “true”?

WhatsApp and Facebook aren't looking at you for illegal activity but rather to profile you – to understand the way you view the internet and target you better with adverts and campaigns. This is fine so long as it isn't used to slowly change your mind. If such companies control what you see and like, can they form your opinions for you?

What next?

We haven't heard the last of this story, and we're all curious to see how it'll unfold in the coming weeks. If you'd like to find out more, tune in to our latest episode of Cyber Humanity.
