Six Acts of Skullduggery in the Technology Debate
Or how to abuse science (and how science is abused)
Posted May 18, 2019
Note: This post is coauthored with Rune K.L. Nielsen and originally appeared in the Danish Communication Forum, in Danish. This is the English translation.
There is no strong scientific evidence that modern digital technology such as cell phones, computer games, or the Internet is inherently harmful to humans. At the same time, it is also not possible to prove that these technologies do not have a harmful effect. One can never prove a negative claim. For example, it is impossible to prove that the abominable snowman is not out there somewhere. The closest science can come is to establish how unlikely it is that he or she exists.
We are therefore in the slightly unfortunate situation that it is up to us as humans, parents, psychologists, doctors, politicians, etc., to assess how to best deal with technology. But if you are concerned about the role of technology in people's mental health, you can easily use science to trick your opponents anyway. You just have to be willing to commit a little bit of skullduggery. Here are six of the most popular ways that science is abused:
Tell half the truth and nothing but half the truth
If the best lie is a half-truth, then dopamine is the perfect neurotransmitter for telling tall tales. In pop culture and pop science, dopamine is often referred to as the brain's happiness hormone. The notion that the amount of dopamine floating around in a brain is a direct measure of how much pleasure a person experiences stems from science, but in scientific circles there is virtually no one still subscribing to this simple view of dopamine and pleasure[i]. Not even Roy Wise, who first proposed the “dopamine pleasure hypothesis” in 1980. By the mid-1990s, Wise had already retracted the theory, saying: “I no longer believe that the amount of pleasure felt is proportional to the amount of dopamine floating around in the brain.”[ii]
Dopamine is perhaps best known for its role in learning and motivation and most notorious for its role in addiction. However, its activity during routine activities, whether sex, food, exercise, or technology use, looks nothing like dopamine activity during the use of cocaine or methamphetamine. But it sounds scary. So, if you know someone who does things that you don't like, you can rightly accuse them of being a slave to dopamine, and thus frame their behavior as pathological. Is your friend in love with someone you don't like? You can easily argue that it is not true love; your friend has just become "partner-addicted" in a storm of dopamine. Does your girlfriend spend too much time working? It does not have to be because the work is interesting or meaningful; with the dopamine argument you can claim that it is a morbid addiction. The point is that all behavior can be made out to be pathological by claiming that it is just an expression of dopamine dependence.
The boring scientific truth is that dopamine itself is neither dangerous nor pathological, it is natural and necessary.
Come up with a really unpleasant name
The best way to conjure up something from nothing is by giving it a name. If the name sounds really horrible, it might be used to deter people from doing something and to stigmatize those who cannot be deterred. In the 70s and 80s, one could read for the first time how dangerous it was to write computer code. There were some who became so pathologically concerned with programming that they no longer just tried to write computer programs that could solve tasks. Instead, they continued to work on the code and make it more complex without a specific purpose in mind; they would code for the sake of coding. It can be difficult to understand why some people throw themselves into programming and give themselves over entirely to the computer, but with words like 'microholics' (a merging of microchip and alcoholic) or 'machine-code junkie', complex social and psychological processes all of a sudden become very simple. Some people have simply gone and contracted a chronic disease of the brain, inflicted on them by the allure of the personal computer.
Language is well suited to persuading other people to see things from your point of view. If you want to be allowed to practice torture, call it enhanced interrogation techniques. If you are concerned about technology, then compare it to passive smoking or other forms of pollution. The important thing is to use metaphors that are undoubtedly negative.
The boring scientific truth seems to be that the effect of technology is extremely context-dependent. That which has positive effects for one user in one context might have negative effects for another user in another context. This is as true for “good” media and technology (reading culturally important books like Catcher in the Rye, for instance) as for “naughty” media such as playing Grand Theft Auto. When several studies of ‘computer game addiction’ have not been able to find negative effects for the ‘addicts’, it may, of course, be because the studies are poorly executed. However, it is probably because computer games do not have unavoidable effects; what they have instead are highly context-dependent effects, kind of like just about anything else in the world.
Pretend as if changes in the brain are necessarily harmful
Once, it was thought that when a human's brain was fully developed, that was it; there was no more change. It was thought that the brain only changed if it was damaged in some way. Now we know that the brain is constantly changing and that everything you learn causes physical changes in the brain. In other words, one cannot learn anything without the brain physically changing. This can be used to make anything appear highly suspect. If you know someone who does something you dislike, for example playing golf, then you can use science to warn them that golf not only causes changes in brain chemistry but also in the structure of the brain itself! And who would want to incur golf brain? That sounds unpleasant. Actually, golfers would probably want that. Without learning and the corresponding changes in the brain, it would be as if you had never touched a club every time you picked one up out of the bag.
The boring scientific truth is that it can be very difficult to figure out if a change in the brain is an expression of an erosion or an optimization and streamlining of brain function.
Pretend as if correlation is the same as causality
Or in other words: ignore the fact that just because A is followed by B, it is not necessarily the case that A caused B. When the earth's temperature rises as the number of pirates falls, it is not necessarily because pirates have a cooling effect on the planet (or because warmer temperatures make life as a pirate difficult). When people who often have a lighter in their pocket die earlier than those who rarely have a lighter in their pocket, it is not necessarily because lighters are dangerous (or because people who are close to death like to have a lighter in their pocket); it is more likely because people who like to smoke also like to have a lighter in their pocket. If children who sit a lot in front of a screen have more attention problems than other children, we cannot claim that screens cause children to have attention problems. Viewed through the lens of science, we have to consider whether children who have attention problems are more often parked in front of a screen. In many studies, the only source of knowledge about the children and their lives is the children's parents. In such cases, we also need to consider whether we can rely on the parents' judgment, or whether there may be differences between people that cause them to exaggerate or understate the amount of time their kids spend in front of screens. Psychologists always warn us that “correlation does not equal causation” until they come across a correlation that confirms their own personal beliefs, theories, or moral soapboxes. Too often, psychological researchers are just as prone as the general public to mistaking correlation for causation.
A boring insight from the philosophy of science is that we cannot speak to the effects of technology, or anything else, without rather cumbersome and intrusive experiments. Observation alone is never proof of causality.
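The lighter example can be made concrete with a quick simulation. This is a minimal sketch with invented numbers, purely for illustration: smoking drives both lighter ownership and shorter lifespan, yet lighter ownership ends up correlated with earlier death even though it causes nothing.

```python
import random

random.seed(0)
n = 10_000

# Invented numbers, for illustration only.
smoker = [random.random() < 0.3 for _ in range(n)]
# Smokers usually carry a lighter; non-smokers rarely do.
lighter = [(random.random() < 0.9) if s else (random.random() < 0.05) for s in smoker]
# Lifespan depends on smoking only; the lighter plays no causal role.
lifespan = [random.gauss(64.0 if s else 72.0, 10.0) for s in smoker]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Lighter ownership "predicts" earlier death, despite being causally inert.
print(pearson([float(x) for x in lighter], lifespan))
```

Running this yields a clearly negative correlation between lighter ownership and lifespan, produced entirely by the confounder (smoking), with no causation anywhere in sight.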
Exaggerate small effects from individual studies
In a large dataset, researchers found that there was an association between depression symptoms among American teenagers and their exposure to social media. Of course, this study created headlines in the media. We have previously criticized this kind of research for assuming, at least rhetorically, that media is something people are exposed to passively. We argue instead that people's media consumption is at least a little bit conscious and driven by users’ personal motivations.
However, rhetoric was not the biggest problem with the aforementioned study. When other researchers went through the data, they found that the relationship between depressive symptoms and social media only applied to girls and not to boys. In addition, the relationship was incredibly small: social media use explained only about 0.36% of the variance, roughly the same association as was found between eating potatoes and depression symptoms. For comparison, the association between listening to music and depression symptoms was 13 times as large. Again, we do not know on this basis whether teenagers get depression symptoms because they listen to music, or whether teens with depression symptoms are more likely to listen to music. It might, of course, also be the case that there is something uniquely depressing in the music that teenagers listen to. So, if you don’t mind playing a little fast and loose with science, the next time you are exposed to a teenager's annoying music you can appropriately inform them that they are gambling with their mental health.
The boring scientific truth is that one study cannot stand alone, we have to look at the total amount of research if we want to say anything meaningful. At the same time, we should be careful about using the author of a study to assess the quality and importance of the study. Even researchers are just people.
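To see just how small “0.36% of the variance” is, it helps to convert variance explained back into a correlation coefficient (r is the square root of variance explained). A minimal sketch, assuming the figures quoted above are both stated as variance explained:

```python
# Effect sizes from the text: social media explained 0.36% of the variance
# in depression symptoms, and the music association was 13 times as large.
social_media_r2 = 0.0036
music_r2 = 13 * social_media_r2

# A correlation coefficient r is the square root of variance explained (R^2).
r_social = social_media_r2 ** 0.5
r_music = music_r2 ** 0.5

print(f"r(social media, depression) = {r_social:.2f}")  # 0.06
print(f"r(music, depression)        = {r_music:.2f}")   # 0.22
```

Even the "13 times larger" music association corresponds to a correlation of only about 0.22, which is a useful sense of scale for headlines built on such numbers.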
Pretend that technology is particularly addictive
It is probably not possible to come up with a human activity that is not engaged with excessively by someone somewhere. At the same time, there are probably few people who do not feel that there is something in their life that they do too much. Activities that involve technology are, of course, among the things we do too much. But we also overdo old school activities such as exercise (or lack thereof), eating (or lack thereof), sex, love, work, religion or shopping and none of these have addiction diagnoses (even if they do figure in other disorders).
Since the World Health Organization (WHO) introduced "gaming disorder" as a behavioral addiction, it has become very easy to argue that computer games are particularly addictive. In the United States, the American Psychiatric Association has been more cautious and has stated that more research is required before video game addiction can be recognized as the second behavioral addiction. Gambling was reclassified as an addiction in 2013; before that time it was not possible to be addicted to anything not involving a substance. It is still relatively new and controversial in psychiatry to posit that one can be addicted to a behavior. There is research into a myriad of behavioral dependencies, even dance addiction, but the WHO has officially recognized only gambling addiction and computer game addiction. When we asked the WHO why, despite great criticism from the research community, it has chosen to recognize only computer games as a new addiction, we got the somewhat mysterious answer that it has been under great pressure, especially from Asian countries, to make it happen.
The slightly boring scientific truth is that the research community has not yet been able to demonstrate that technology is unique in its ability to cause addiction in the same way as drug addiction.
The body of scientific evidence that exists right now does not strongly support the idea that technology is inherently harmful, but that does not mean that one cannot use skullduggery to argue that such strong evidence exists. One could also choose a different tack and use common sense, or one's values and beliefs about what constitutes a well-lived life, to decide how much and when it is appropriate to use technology.
[i] Berridge, K. C., & Kringelbach, M. L. (2015). Pleasure Systems in the Brain. Neuron, 86(3), 646–664.
[ii] Wickelgren, I. (1997). Getting the Brain’s Attention. Science, 278(5335), 35–37.