2020 saw an unprecedented increase in the importance and value of digital services and infrastructure. From the rise of remote working and the worldwide shift in customer habits to huge profits booked by online entertainment services, we are witnessing how overwhelmingly important connected infrastructure has become for the daily functioning of society.

What does all this mean for privacy? With privacy more often than not being traded for convenience, we are confident that for many, 2020 has fundamentally altered how much privacy people are willing to sacrifice in exchange for security (especially from the COVID-19 menace) and access to digital services. How are governments and enterprises going to react to this in 2021? Here are some of our thoughts on what the coming year may look like from the privacy perspective, and which diverse and sometimes contrary forces are going to shape it.

Smart health device vendors are going to collect increasingly diverse data, and use it in increasingly diverse ways.

Heart rate monitors and step counters are already standard in even the cheapest smart fitness band models. More and more wearables, however, now come with an oximeter and even an ECG, allowing you to detect possible heart rate issues before they cause you any trouble. We believe more sensors are on the way, with body temperature among the most likely candidates. And with your body temperature being an actual public health concern nowadays, how long before health officials want to tap into this pool of data? Remember, heart rate and activity tracker data, as well as consumer gene sequencing, have already been used as evidence in courts of law. Add in more smart health devices, such as smart body scales, glucose monitors, blood pressure monitors and even toothbrushes, and you have huge amounts of data that is invaluable to marketers and insurers.

Consumer privacy is going to be a value proposition, and in most cases it will cost money.

Public awareness of the perils of unfettered data collection is growing, and the free market is taking notice. Apple has publicly clashed with Facebook, claiming it has to protect its customers’ privacy, while the latter is wrestling with regulators to implement end-to-end encryption in its messaging apps. People are more and more willing to choose services that offer at least a promise of privacy, and even to pay for them. Security vendors are promoting privacy awareness and backing it with privacy-oriented products; established privacy-oriented services like DuckDuckGo show they can have a sustainable business model while leaving you in control of your data; and startups like You.com claim you can have a Google-like experience without the Google-like tracking.

Governments are going to be increasingly jealous of big tech’s data hoards, and increasingly active in regulation.

The data that the big tech corporations have on people is a gold mine for governments, democratic and oppressive alike.
It can be used in a variety of ways: from using geodata to build more efficient transportation, to sifting through cloud photos to fight child abuse, to peeking into private conversations to silence dissent. Private corporations, however, are not too keen on sharing it. We have already seen governments around the world resist companies’ plans to end-to-end encrypt messaging and cloud backups, pass legislation forcing developers to build backdoors into their software, voice concerns over DNS-over-HTTPS, legislate more and more laws regulating cryptocurrency, and so on. But big tech is called big for a reason, and it will be interesting to see how this showdown develops.

Data corporations are going to find ever more creative, and sometimes more intrusive, sources of data to fuel the behavioral analytics machine.

Some sources of behavioral analytics data are so common we can call them conventional, such as using your recent purchases to recommend new goods, or employing your income and spending data to calculate credit default risk. But what about using data from your web camera to track your engagement in work meetings and decide on your yearly bonus? Using online quizzes that you take on social media to determine what kind of ad will make you buy a coffee machine? The mood of your music playlist to choose which goods to market to you? How often you charge your phone to determine your credit rating? We have already seen these scenarios in the wild, but we expect marketers to get even more creative with what some data experts call AI snake oil. The main implication of this is the chilling effect of people having to weigh every move before acting. Imagine knowing that choosing your Cyberpunk 2077 hero’s gender, romance line and play style (stealth or open assault) will somehow be taken into account in your real life down the line. Would that change how you play the game?
Multi-party computation, differential privacy and federated learning are going to become more widely adopted, as is edge computing.

It is not all bad news. As corporations become increasingly conscious of what data they really need, and consumers push back against unchecked data collection, more advanced privacy tools are emerging and becoming more widely adopted. On the hardware side, we will see more powerful smartphones and more specialized data-processing hardware, such as Google Coral, Nvidia Jetson and Intel NCS, enter the market at affordable prices. This will allow developers to create tools that run heavy computation, such as neural networks, on-device instead of in the cloud, dramatically limiting the amount of data that is transferred from you to the company. On the software side, more companies like Apple, Google and Microsoft are adopting differential privacy techniques to give people strict (in the mathematical sense) privacy guarantees while continuing to make use of their data. Federated learning is going to become the go-to method for dealing with data deemed too private for customers to share and for companies to store. With more educational and non-commercial initiatives, such as OpenMined, supporting these methods, they might lead to groundbreaking collaborations and new results in privacy-heavy areas such as healthcare.
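To make the idea of a strict, mathematical privacy guarantee concrete, here is a minimal sketch of the classic Laplace mechanism for differential privacy (an illustration of the general technique, not any particular vendor’s implementation). A counting query is answered with calibrated random noise, so that the presence or absence of any single person in the dataset changes the distribution of answers only slightly:

```python
import math
import random


def laplace_noise(scale):
    # Draw a sample from the Laplace(0, scale) distribution
    # using inverse CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)


def dp_count(values, predicate, epsilon=1.0):
    """Count items matching predicate, with epsilon-DP noise.

    A counting query has sensitivity 1 (adding or removing one
    person changes the true count by at most 1), so Laplace noise
    with scale 1/epsilon yields an epsilon-differentially-private
    answer. Smaller epsilon means stronger privacy, noisier results.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)
```

The noise averages out over many queries in aggregate statistics, which is why companies can still learn population-level trends: a single noisy count is imprecise, but the mean of many such answers converges on the true value while each individual remains protected.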

We have learned over the past several decades, and the last few years in particular, how privacy has become a hot-button issue at the intersection of governmental, corporate and personal interests, and how it has been subject to diverse and sometimes even conflicting tendencies. In more general terms, we hope this year helps us, as societies, move closer to a balance where the use of data by governments and companies rests on privacy guarantees and respect for individual rights.

Read more: securelist.com