A group of early employees at large Silicon Valley companies like Facebook and Google has recently formed a coalition to raise awareness about the addictive qualities of social media and smartphone technology.
The group will oversee the creation of a new nonprofit dubbed the Center for Humane Technology. CHT will work alongside the media advisory group Common Sense Media to deliver awareness programs about tech addiction to public schools. The educational campaign is backed by an advertising campaign called The Truth About Tech, which will highlight the problems associated with overuse of technology and social media.
Design, Exploitation, And Addiction
The advocacy group and awareness campaign came from growing concerns about the negative effects that social media and digital technology can have on our psychology. While 2017 may have been the year of fake news and hacking, it was just a part of a larger, growing trend that aims to reappraise the nature of our relationship with social media.
While the use of social media and digital technology confers many benefits, there are also drawbacks, and these are receiving more attention. One is the ability of social media networks to rapidly disseminate misinformation. Another concern is that the algorithms social media companies use to keep users on their sites may have harmful effects on our brains, particularly the brains of children.
The Center for Humane Technology is led largely by Tristan Harris, a former design ethicist at Google. Harris says his experiences at Google have given him insight into how tech companies use algorithms that exploit human psychology to maximize the amount of time users spend on their sites and services.
“We were on the inside. We know what the companies measure. We know how they talk, and we know how the engineering works,” said Harris.
An example of how design can be used to exploit human psychology is A/B testing: making a subtle change to the design of a site or app and showing each version to a different group of users. The test subjects are usually random users, some of whom see Version A while others see Version B. The goal is to determine whether the change elicits a desired behavior, such as spending more time on the site or clicking on certain links. Changes that elicit the desired behavior are often rolled out site-wide, and those that don't are dropped. Over time, this means A/B testing produces a design catered to our psychology and increasingly difficult to resist.
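The mechanics of A/B testing described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the function names, the "button-color" experiment, and the minutes-on-site metric are invented for the example, not drawn from any real company's system): users are deterministically bucketed into variant A or B by hashing their ID, and each variant's average engagement is then compared.

```python
import hashlib


def assign_variant(user_id: str, experiment: str = "button-color") -> str:
    """Deterministically bucket a user into variant A or B.

    Hashing (experiment, user_id) keeps each user in the same bucket
    across sessions without storing any assignment state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"


def mean_engagement(sessions):
    """Average minutes-on-site per variant.

    `sessions` is a list of (user_id, minutes) records; the variant with
    the higher mean would typically be the one rolled out site-wide.
    """
    totals = {"A": [0.0, 0], "B": [0.0, 0]}  # variant -> [sum, count]
    for user_id, minutes in sessions:
        bucket = totals[assign_variant(user_id)]
        bucket[0] += minutes
        bucket[1] += 1
    return {v: (s / n if n else 0.0) for v, (s, n) in totals.items()}
```

Run repeatedly with a new tweak each cycle, this loop is what incrementally tunes a design toward whatever maximizes the chosen engagement metric.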
Ethics And Research
CHT’s members include a wide roster of former employees from major tech companies. CHT’s board includes Sandy Parakilas, former operations manager at Facebook; Lynn Fox, a former communications executive with Apple and Facebook; Justin Rosenstein, creator of Facebook’s Like button; and Roger McNamee, one of Facebook’s largest early investors.
CHT is currently planning lobbying campaigns to place limits on the influence of major tech companies. The lobbying campaigns will be targeted at two specific pieces of legislation. The first is a bill to commission major research on the impact of technology on children’s psychological well-being, and the second is a bill that would prohibit the utilization of internet bots without some form of identification.
McNamee says his decision to join CHT was motivated by a desire to help keep Facebook, the company he had invested in, accountable.
McNamee says that Facebook appeals to your “lizard brain”, to the base part of your brain driven by primal emotions, primarily fear and anger. With smartphones, Facebook has you dialed in for every waking moment, McNamee says.
The exact effects of digital technology and social media on people's brains, particularly the developing brains of children, remain a mystery in many ways. The technology is relatively new, and there have been few long-term studies tracking sustained effects on the development of children's brains. Nonetheless, some research does suggest harmful, non-normative effects from digital addiction and the overuse of social media.
A recent study of 8th-graders conducted by Jean Twenge (author of iGen) suggests that heavy users of social media are approximately 56% more likely to report being unhappy, 27% more likely to be depressed, and 35% more likely to have risk factors associated with suicidal ideation. Harris says that the symptoms of social media dependence look much like classical addiction, including a dependence on something that makes you generally unhappy.
Other studies have found positive effects of social media use, including increased self-esteem. These seemingly contradictory findings, and the general uncertainty about the effects of tech addiction and overuse, are among the reasons that CHT and Common Sense Media want to support large-scale research on the topic.
Changing Direction For Time Well Spent
Harris himself says that there are no doubt benefits to the use of social media, but that there are clear downsides as well. Harris says part of the reason so little progress has been made is that people can’t accept that both things can be true. Harris says we do “derive lots of value from Facebook”, and that there are simultaneously “many manipulative design techniques” being employed by it.
For their part, many Silicon Valley companies are receptive to the ideas espoused by Harris and his colleagues, and want to practice ethical persuasion. Tech giants like Facebook and Google have made recent commitments to fighting some of the dangerous and harmful impacts that unethical and unexamined algorithms can have. Venture capitalists have recently criticized the negative impacts social media often has on us, and YouTube has committed itself to tackling issues like hate speech and self-harm.
This isn’t Harris’ first attempt at improving people’s relationship with social media and smartphones. Harris co-founded Time Well Spent alongside James Williams. The goal of Time Well Spent was to reform how digital devices and services are designed, so that users genuinely want to use them and are glad they did, rather than being manipulated into doing so. In other words, Harris and Williams don’t want to derail companies like Facebook and Google; they just want them to change direction.
“You see a train [going the wrong direction on the tracks]. OK, we upgrade the brakes … but it’s still on the wrong tracks, going in the wrong direction. The only solution is having a different business model … anything else is just updating the shock absorbers,” Williams said.