DIGITAL PRIVACY, PART TWO: WHAT CAN WE DO ABOUT OUR DATA’S PRIVACY?
As I indicated in Part One of these reports on digital privacy, digital tools such as facial recognition are used for many beneficial purposes. However, as I demonstrated, those tools are also extremely easy to abuse, particularly in the hands of governments and the law enforcement community.
One film in the blockbuster Marvel Cinematic Universe series demonstrated the threat quite capably.
Reel life reflecting real life
In “Captain America: The Winter Soldier,” Steve Rogers, the titular super-soldier, finds himself in a race against time to stop a deadly conspiracy that is fueled by abuse of digital surveillance. It’s discovered that the government security agency SHIELD has been infiltrated by a terrorist group known as Hydra. As Hydra scientist Arnim Zola explains, “Hydra was founded on the belief that humanity could not be trusted with its own freedom. What we did not realize is that if you try to take that freedom, they resist. [World War II] taught us much. Humanity needed to surrender its freedom willingly. After the war … the new Hydra grew. For 70 years, Hydra has been secretly feeding crises, reaping war. … Hydra created a world so chaotic that humanity is finally ready to sacrifice its freedom to gain its security.”
Hydra infiltrator Jasper Sitwell explains how digital information is being used to determine the targets for the imminent lethal uprising. “The 21st century is a digital book. Zola taught Hydra how to read it. Your bank records, medical histories, voting patterns, emails, phone calls, your damn SAT scores. Zola’s algorithm evaluates people’s past to predict their future. … And then the Insight helicarriers [heavily armed aerial transports] scratch people off the list a few million at a time.”
Yeah, that’s a frightening scenario: “Big Brother” writ large. Depending on your age and education, you might wonder what the hit CBS television show has to do with digital privacy. After all, the reality TV show is designed for entertainment. But the phrase “Big Brother” debuted in George Orwell’s 1949 novel “1984,” in which a totalitarian government maintains control through constant electronic surveillance of its citizens. Today, the phrase “Big Brother” is “a synonym for abuse of government power, particularly in respect to civil liberties, often specifically related to mass surveillance.”
And, as I demonstrated in the previous essay, digital information, particularly facial recognition, can easily be misused and abused … as demonstrated by these most recent examples, which were made public after the last essay was published:
- In Michigan, Robert Williams, a Black man, was arrested by Detroit police in his driveway. Police thought Williams was a suspect in a shoplifting case. However, the inciting factor for the arrest was a facial recognition scan, which had incorrectly suggested that Williams was the suspect. And while the charges were later dropped, the damage was done: Williams’ “DNA sample, mugshot, and fingerprints — all of which were taken when he arrived at the detention center — are now on file. His arrest is on the record,” says the American Civil Liberties Union, which has filed a complaint with the Detroit Police Department.
- In May, Harrisburg University announced that two of its professors and a graduate student had “developed automated computer facial recognition software capable of predicting whether someone is likely going to be a criminal. With 80 percent accuracy and with no racial bias, the software can predict if someone is a criminal based solely on a picture of their face.” On June 23, over 1,500 academics condemned the research paper in a public letter. In response, Springer Nature will not be publishing the research, which the academics blasted as having been “based on unsound scientific premises, research, and methods which … have [been] debunked over the years.” The academics also warn that it is not possible to predict criminal activity without racial bias, “because the category of ‘criminality’ itself is racially biased.”
Today, we’ll explore another aspect of digital privacy — namely, how much of a threat your own digital footprint can pose to your security. Because, as has become readily apparent, humanity doesn’t need “to surrender its freedom willingly.” It’s already done it.
Not too long ago, my wife and I were relaxing. I was reading a book; she was watching television. She asked me a question about the drug commercial that had just aired, and we laughed while discussing how the “minor” side effects of the drug didn’t sound all that minor. In the midst of our laughter, her phone started talking, telling us all about the drug.
She hadn’t touched the phone, which was sitting beside her on the couch.
I said, “Skynet is real.” She picked up the phone, put it into sleep mode, and set it back down, all without directly looking at it. And then, of course, for the next week or so, her supposed interest in the drug influenced what kind of ads popped up on her phone and laptop.
The thing is, both of us have voice recognition software on our phones. With it, we can instruct our phones to open a specific app, call a person, search Google for information, and various other tasks. It just never occurred to us that the phones could listen in without our specifically activating the voice-recognition software.
But we’re not the only ones creeped out by our devices’ antics. A few years ago, users of Amazon’s Alexa reported that the AI assistant would abruptly laugh for no apparent reason. Sometimes, the laughter would come in response to a user query. Other times, the user would be sitting silently when Alexa would suddenly chuckle. The laughter disturbed a lot of Alexa users. One user tweeted, “Lying in bed about to fall asleep when Alexa on my Amazon Echo Dot lets out a very loud and creepy laugh… there’s a good chance I get murdered tonight.”
Admittedly, we know that it’s highly unlikely that our phones, or Alexa, are going to pick up a weapon and come after us. While some say the age of Skynet is inching ever closer, most of us realize that it’s not all that likely that all of our machines are going to rise up and wipe us out … the repeated attempts of GPS notwithstanding. (Ever had your GPS tell you to “Turn right” … while you’re in the middle of a bridge over a very wide and deep river? I have!)
They say it’s not being paranoid if people really are out to get you. And they are out to get you … or your data, anyway. Corporations want your data, which fuels their marketing; social media platforms want your data, which fuels their interconnectivity, as well as the demographic data they can use for targeting their ads; the government wants your data, to fuel their research, voting, and criminal justice databases; and hackers want your data so they can steal your money.
The ACLU breaks down its concerns over privacy and technology into the categories of Internet privacy; cybersecurity; location tracking; privacy at borders and checkpoints; medical and genetic privacy; consumer privacy; and workplace privacy.
In 2019, the Pew Research Center published the results of a survey in which a majority of Americans said they believed their activities, both online and offline, were being monitored by the government and by companies. “Roughly six-in-ten U.S. adults say they do not think it is possible to go through daily life without having data collected about them by companies or the government,” the report warned.
Although the report acknowledges that “data-driven products and services are often marketed with the potential to save users time and money or even lead to better health and well-being,” 81 percent of respondents said that “the potential risks they face because of data collection by companies outweigh the benefits, and 66% say the same about government data collection. At the same time, a majority of Americans report being concerned about the way their data is being used by companies (79%) or the government (64%). Most also feel they have little or no control over how these entities use their personal information.”
There are various aspects of Internet privacy:
- Consumer Online Privacy: One of the concerns that many consumers have is how their data is collected online, and what happens to it afterward. Ever Googled a retail website, only to have ads for that site start popping up shortly afterward in the margins of every other website you visit? Your Internet service provider (ISP) likely sold your information, or the website left “cookies” on your computer that allowed it to target you with ads. Because ISPs are in a perfect position to see everything we do online, Maine regulates how they operate by requiring that they obtain permission from users before using their data. Others are turning to virtual private networks (VPNs) to put a barrier between their devices and the ever-watching eyes of their ISPs. Maine is an outlier, though, and VPNs can be inconvenient and costly.
- Social Networking Privacy: When you’re not at work, you would think that what you post on social media is your own private business. But increasingly, employers, schools and the federal government are requiring access to our digital lives. U.S. border enforcement agents are demanding that travelers unlock their devices and provide passwords. Schools are utilizing services that allow them to access students’ devices and social media accounts. The concern about overreach has become so widespread that some states have taken steps to prevent employers from researching the habits and postings of job applicants on social media, or from trying to require that employees surrender passwords to their accounts.
- Cell Phone Privacy: You know from movies, television shows and the news that your digital devices also act as reliable tracking devices. Indeed, recent events alone have shown that the tech can track stolen devices; allow advocacy and voting rights groups to track the movements of protesters (and communicate with them); and allow companies like Venntel to collect and then sell data from citizens’ phones to government agencies, which can lead to warrantless tracking of their activities.
- Email Privacy: One of the biggest stories in 2016 was the hacking of emails that belonged to the Democratic National Committee and then-presidential candidate Hillary Clinton. Today, concerns remain over just how private our emails really are, creating opportunities for services that give you more control. Mozilla is now offering Firefox Relay, which effectively acts as call forwarding for email: aliases that are connected to your real account but don’t give others access to it. Google is also offering new features to make Gmail safer.
- Cybersecurity: As our reliance on our digital devices continues to grow, and the technology that connects those devices continues to improve, it can be argued that we are increasing the opportunities for hackers to exploit that same dependence. As the prevalence of hacker attacks grows, so does the need to protect computers, databases, electronic systems, mobile devices, networks and servers. A recent poll showed that most companies’ electronic security breaches were the result of poorly planned security infrastructure. Yet, even though U.S. business losses in cybersecurity attacks averaged $1.41 million in 2018, with over 68 million sensitive records exposed in 2019, the United States faces a coming shortage of cybersecurity experts. At a time when 5G is fast becoming the go-to connectivity resource, it is worth considering that its expansion will also mean a massive expansion of internet-connected devices, raising the stakes for cybersecurity even higher. One such concern, from Consumer Watchdog, is that “all the top 2020 cars have Internet connections to safety critical systems that leave them vulnerable to fleet wide hacks,” which could lead to “a 9-11 scale catastrophe.”
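The cookie-based ad targeting described under Consumer Online Privacy above can be illustrated with a toy model. Everything here is invented for illustration (the site names, the class design); real ad networks are vastly more elaborate, but the core mechanism is this simple: many unrelated sites embed the same third-party tracker, the browser sends that tracker’s cookie on every visit, and the tracker stitches those visits into one browsing profile.

```python
# Toy model of third-party cookie tracking (illustrative only).
import uuid


class Tracker:
    """Stands in for an ad network embedded on many unrelated sites."""

    def __init__(self):
        self.profiles = {}  # cookie id -> list of sites visited

    def log_visit(self, cookie, site):
        """Called whenever a page embedding this tracker is loaded."""
        if cookie is None:                 # first-ever visit:
            cookie = uuid.uuid4().hex      # issue a fresh cookie
        self.profiles.setdefault(cookie, []).append(site)
        return cookie                      # the browser stores it


class Browser:
    def __init__(self, tracker):
        self.tracker = tracker
        self.cookie = None  # what a "clear cookies" button would reset

    def visit(self, site):
        # The page embeds the tracker, so the tracker sees the visit.
        self.cookie = self.tracker.log_visit(self.cookie, site)


tracker = Tracker()
browser = Browser(tracker)
for site in ["shoes-shop.example", "news.example", "clinic.example"]:
    browser.visit(site)

# The tracker now holds a cross-site profile for this one browser:
profile = tracker.profiles[browser.cookie]
print(profile)  # ['shoes-shop.example', 'news.example', 'clinic.example']
```

Note that clearing the cookie breaks the chain, which is one reason trackers have increasingly shifted to harder-to-reset techniques such as browser fingerprinting.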
Location Tracking

The very nature of a cell phone requires that the device be tracked from tower to tower to maintain the integrity of calls. Since mobile companies store that location data, the government can obtain a great deal of information about you from your movements. Similarly, the mobile company might sell that information. By triangulating cell towers, your location data can reveal where you live, your doctor’s office, your school, your workplace, your place of worship, your friends’ homes … the list goes on.
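The triangulation mentioned above is simple geometry. Here is a minimal, idealized two-dimensional sketch (the tower coordinates and distances are invented for illustration; real positioning also deals with signal noise, terrain, and timing error): given three towers’ known positions and the phone’s estimated distance from each, two linear equations pin down the phone’s location.

```python
# Locating a phone from three cell towers (idealized 2-D sketch).
# Subtracting the circle equation of tower 1 from those of towers 2
# and 3 yields two linear equations in the phone's (x, y), solved
# here with Cramer's rule.

def trilaterate(towers, dists):
    """towers: three (x, y) pairs; dists: distance to each tower."""
    (x1, y1), (x2, y2), (x3, y3) = towers
    d1, d2, d3 = dists
    # Linear system: a1*x + b1*y = c1  and  a2*x + b2*y = c2
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero only if the towers are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)


# Hypothetical towers at known positions; the phone sits at (3, 4).
towers = [(0, 0), (10, 0), (0, 10)]
dists = [5.0, 65 ** 0.5, 45 ** 0.5]
print(trilaterate(towers, dists))  # -> approximately (3.0, 4.0)
```

In practice a carrier needs nothing this explicit: every call and data session already logs which towers the phone touched, and a sequence of those fixes over days traces the routines described above.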
Back in 2013, whistleblower Edward Snowden revealed that the National Security Agency was obtaining almost 5 billion records a day from cellphones around the world. This collection effort allowed U.S. intelligence officials to track phones, the phones’ users, and map out the relationships of the phones’ users to other users and their phones. This meant that people around the world, Americans among them, were caught up in the NSA’s web, where that data was stored … all without a warrant.
However, the pace at which technology evolves means that, for every advance that can be used for oppressive purposes, a counter will shortly follow. In September 2019, protesters in Hong Kong were well aware of the fact that Chinese authorities monitored WiFi and the Internet. They turned to Bridgefy, a Bluetooth-based app that does not use the Internet and, as a result, is more difficult for the Chinese authorities to trace.
Privacy at Borders and Checkpoints
People carry a great deal of information, personal and professional, on their phones and other devices. Typically, they take steps to ensure that said information doesn’t fall into the wrong hands. For example, my phone contains my emails, as well as access to my various social media accounts. My communications with my clients are accessible via my laptop and my cell phone. As a result, I keep my devices pass-coded. However, a growing problem is that of government agents at border crossings demanding access to travelers’ devices without a warrant, putting the Constitution’s prohibition on unreasonable search and seizure under assault.
Medical and Genetic Privacy
What would you do if your employer, acting upon a question from your health insurance company, asked you to submit the results of that DNA testing you had performed via Ancestry.com or 23andMe? Right now, your medical and genetic information is protected under the provisions of the 1996 Health Insurance Portability and Accountability Act (signed by President Bill Clinton) and the 2008 Genetic Information Nondiscrimination Act (signed by President George W. Bush). Meanwhile, you are protected from being denied health insurance because of pre-existing conditions by the 2010 Affordable Care Act (signed by President Barack Obama). But these laws, intended to protect Americans from predatory health care, may not be as strong as we think they are:
- The ACA, which is more commonly known as “Obamacare,” has been under relentless attack by the Trump administration. Indeed, during the writing of this report, the Trump administration has asked the U.S. Supreme Court to “invalidate” the ACA. If it falls, insurance companies could again deny coverage to those with pre-existing conditions. It should be noted that a 2017 U.S. Department of Health and Human Services analysis estimated that between 61 million and 133 million Americans have a preexisting condition.
- In 2017, a committee in the Republican-controlled House of Representatives approved HR 1313, “a bill that would let companies make employees get genetic testing and share that information with their employer — or pay thousands of dollars as a penalty.” The bill died from inaction after Democrats won the House in the 2018 midterm elections.
- In 2018, police in California identified a serial killer by matching his DNA with DNA from members of his family who had signed up for a genealogy website called GEDmatch. Since then, questions have been raised about the accessibility of such data to law enforcement agencies.
- Consumer genetics testing companies frequently sell your data, often to pharmaceutical companies. This is a modern spin on the case of Henrietta Lacks, a poor Black woman who was treated for cervical cancer in 1951. Researchers discovered that her cells were incredibly resilient, and where other cancer patients’ cell samples would die, Lacks’ cells would continue to live and thrive. Without Lacks’ permission, researchers used her cells to conduct research into “the effects of toxins, drugs, hormones and viruses on the growth of cancer cells without experimenting on humans. They have been used to test the effects of radiation and poisons, to study the human genome, to learn more about how viruses work, and played a crucial role in the development of the polio vaccine.” Although the discovery was a boon to the worlds of science and medicine, the fact remains that Lacks’ medical and genetic information was used without her permission in what is now an industry worth more than $1 billion annually, as of 2019.
- The GINA law bans employers and health care companies from using genetic data to deny you coverage or employment. However, “companies with fewer than 15 people are exempt from this rule, as are life insurance, disability insurance, and long-term care insurance companies — all of which can request genetic testing as part of their application process.”
- What happens if companies make your medical or genetic information part of the interview process? Granted, some state and federal laws protect against genetic discrimination, but those laws do not cover everything.
I mentioned earlier that companies want your data for their marketing. Indeed, it’s been shown that Facebook, for example, views its users less as people and more as products to be sold to advertisers. A Vermont law forced some transparency on companies that sell our information, but very little is actually known about who has access to our information and what happens to it after it is sold. California has taken steps to address that gap, but it is too early to see what kind of impact its law is having.
Workplace Privacy

When you’re at work, you likely don’t expect to have privacy. A 2018 study showed that 50 percent of companies monitor their workers’ emails and social media accounts, “along with who they met with and how they utilized their workspaces.” Fast forward a year, and 62 percent of companies were “leveraging new tools to collect data on their employees.” And even in the current work-from-home reality of COVID-19, employers are still keeping an eye on their workers. For example, random screenshots of workers’ screens show employers what those workers are actually doing. Indeed, “monitoring software can track keystrokes, email, file transfers, applications used and how much time the employee spends on each task.” And if you ever use your personal phone for business purposes, you could be risking all of your personal data should your employment be terminated.
A hopeless situation?
Earlier, I mentioned a 2019 Pew Research Center study showing that “roughly six-in-ten U.S. adults” believed they were being monitored by the government and companies, and that they “do not think it is possible to go through daily life without having data collected about them by companies or the government.” The report also noted that most Americans feel “they have little or no control over how these entities use their personal information.”
With that information as a backdrop, it is important to realize that what may seem like a hopeless situation actually is an opportunity. Granted, U.S. laws on privacy have fallen behind the pace of technology, and it has been shown that those in charge of regulating technology and social media platforms often do not understand the very technologies they’ve been charged with monitoring. As a 2018 Brookings Institution paper warns:
“This is where we are with data privacy in America today. More and more data about each of us is being generated faster and faster from more and more devices, and we can’t keep up. It’s a losing game both for individuals and for our legal system. If we don’t change the rules of the game soon, it will turn into a losing game for our economy and society.”
Indeed, news reports seem to bear this out. From the Snowden revelations in 2013, to the 2017 Equifax breach that exposed the data of nearly 146 million Americans, to the 2018 Cambridge Analytica scandal and beyond, more and more attention is being paid to the subject of data privacy. But that just means that we’re more aware. What can we actually do about it?
First of all, I strongly recommend that you make use of the one aspect of your digital data that you do control: your device privacy settings. Some of the data sharing our devices perform is within our ability to control. However, it is important to remember that many aspects of it are not.
Second, take the time to actually read the privacy policies of the digital devices and online services that you use. As the Brookings Institution has noted, perhaps the concept of “informed consent was practical two decades ago, but it is a fantasy today. In a constant stream of online interactions, especially on the small screens that now account for the majority of usage, it is unrealistic to read through privacy policies. And people simply don’t.” The fact that so many people don’t bother to read those policies effectively makes them complicit when their own data is used against them.
Third, take a closer look at the various apps you keep on your devices. How much access do those apps have to your private information? For example, do you really need to grant “Candy Crush” access to your microphone, camera, and location?
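That kind of audit can be sketched as a toy script. Every name below is hypothetical (the app names, their permission lists, and which permissions count as “sensitive” are all invented for illustration); on a real phone you would do this check through the device’s settings screens rather than code.

```python
# Toy permission audit: flag apps requesting access that seems
# unrelated to what the app does. All names here are hypothetical.

SENSITIVE = {"microphone", "camera", "location", "contacts"}


def audit(apps):
    """apps: mapping of app name -> set of requested permissions.
    Returns the sensitive permissions each app asks for."""
    return {name: sorted(perms & SENSITIVE)
            for name, perms in apps.items()
            if perms & SENSITIVE}


installed = {
    "match-three-game": {"microphone", "location", "network"},
    "flashlight": {"camera", "contacts"},
    "notes": {"storage"},
}

for app, flagged in audit(installed).items():
    print(f"{app}: why does it need {', '.join(flagged)}?")
```

The point of the exercise is the question in the output: a match-three game has no obvious need for your microphone or location, and a flashlight has none for your camera roll or contacts.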
And then, of course, there are the changes you could demand from digital service companies. Pitch these ideas to lawmakers, activists, and journalists to force companies to address them:
- Demand that companies establish easier ways to manage your devices’ privacy settings.
- Demand that companies be transparent about how they store, and whether they share, your information.
- Demand “real name” requirements for social media, in which the accounts can only be opened with a photocopy of a government-issued ID card. Admittedly, there would be a loss of privacy here, but it would lead to a decrease in the frequent “mob” mentality we see online, and an increase in the accountability of account users for their content.
- Demand a standardized form by which consumers grant or deny companies permission to sell or share their personal data.
- Question lawmakers to ensure that they understand the technologies they’re attempting to regulate.
- Demand that, if someone is cleared of criminal charges, any biometric data that was gathered as a result of the arrest be deleted within 30 days.
- Require that greater scrutiny be given to digital applications coming from foreign countries that have a history of intellectual property theft.
- Demand that lawmakers reveal whether they have any financial ties to technology companies.
- Require technology companies to create more secure privacy settings for minors using social media.
Admittedly, some of the above suggestions may be long shots, particularly given how much money technology companies and their lobbyists have at their disposal. Taking back control of our digital data can seem an impossible task.
However, to put it into a historical context: Women’s right to vote, the end of slavery, and same-sex marriage were once considered impossible tasks, too.
About the author
Melvin Bankhead III is the founder of MB Ink Media Relations, a boutique public relations firm based in Buffalo, New York. An experienced journalist, he is a former syndicated columnist for Cox Media Group, and a former editor at The Buffalo News.
Note from MTN Consulting
MTN Consulting is an industry analysis and research firm, not a company that typically comments on politics. We remain focused on companies who build and operate networks, and the vendors who supply them. That isn’t changing. However, we are going to dig into some of the technology issues related to these networks and networking platforms which are having (or will have) negative societal effects.