By Criscillia

Why Smartphones Are Like Lead Pipes



I bought my first cell phone in 2003 when I was about 30 years old. It was the tiniest, no-contract flip phone I could find.


Although I hardly used it, it didn’t take long before I started feeling weird about leaving my phone at home. My phone, the talisman. I couldn’t shake thoughts about how useful it would be if I were to have car trouble at night or face some unimaginable catastrophe. I wanted my phone with me at all times. You never know, right? Anything could happen . . .


Fast forward to 2010. My husband’s company gifted him a state-of-the-art smartphone. Without much thought, we decided to give our not-so-old smartphones to our kids and buy a twin of the gift phone for myself. Around that time we also decided that everyone in the family should have laptops. Our kids were 13 and 15.


We regret the timing of those decisions, and the fact that we did not prepare our kids for the burden of owning ultra-portable, tiny computers. We didn’t know to do this, because we hadn’t felt the burden ourselves. We hadn’t yet noticed how deeply our phones were affecting us and our relationships. We didn’t know that smartphones were poised to knife the very fabric of society. And even if we had known these things, we wouldn’t have known how to communicate this kind of message to our children.


Giving our kids smartphones and laptops in 2010 without any preparation wasn’t a disaster. We were lucky. Smartphones were different then, far less invasive. The business models and algorithmic know-how that have since developed around them were in their infancy.

These days, however, more and more humans (myself included) are finding that using their smartphones when and how they want takes work. More work than it should. That’s because smartphone operating systems and apps are designed to exploit our cognitive biases, reflexes, and reward circuitry. It isn't easy to put your phone away, and that's no accident.


The research and writing my husband and I do today is inspired by the desire to raise awareness of the powerful design strategies that keep people glued to their mobile devices, and to understand fully and communicate broadly how digital information and communication technology (ICT) impacts human relationships, autonomy, and well-being. We review industry publications and peer-reviewed research, and canvass information from sources as varied as media measurement firms like Nielsen, investor and analyst firms like Deloitte, government sources like the CDC, and non-profit organizations and foundations like Pew Research.


We wish we had prepared our kids for the burden of owning ultra-portable, tiny computers.

A few years ago, we built an information-theoretical model of how sensorimotor information flows through the circuitry responsible for calibrating the brain’s internal representations of lived experience. This circuit-level model predicts how digital inputs degrade a brain’s powers of prediction, causing humans to lose trust in their senses, themselves, their understanding of the world, and each other. We describe this model in a peer-reviewed article, "Sensory Metrics of Neuromechanical Trust," published in the journal Neural Computation. "Sensory Metrics" offers the most comprehensive account of digital dependencies to date. You can hear a lay-friendly summary on episode 52 of Douglas Rushkoff's Team Human podcast, "Recalibrating for Trust."


Our research is ongoing.


This is what we know for sure: 1) it's all too easy to spend too much time online; 2) more and more people cross this murky threshold each year; 3) the more time people spend online, the more likely they are to feel lonely, anxious, and depressed; and 4) the human brain needs sensorimotor inputs that originate from the real world to remain in calibration.


The human brain needs sensorimotor inputs that originate from the real world to remain in calibration.

How much screen time is too much? We’re not certain. Luckily, precise numbers don't matter.


What matters is this: humans have as much physiological need for online “experiences” as they have for water stored in lead. Neither will kill you right away; both do their damage slowly, both are toxic, and both are especially dangerous to children.


Humans began mining lead over 6,000 years ago. It was once used to make water pipes, food storage vessels (it imparted a sweet taste to food), and white makeup. Lead’s toxicity wasn’t widely recognized until the nineteenth century, and it wasn’t banned from most interior uses until the 1930s.


We now know what lead is good for: roofing, ammunition, car batteries, and stained glass windows. We also know what lead is not good for: water pipes, food storage and cooking containers, makeup, and paint. Ethical companies make business model, product design, and product manufacturing decisions accordingly.


When my husband and I began researching digital dependencies, we felt like the people who decried lead in the nineteenth century. No longer. Evidence of the toxicity of today's ICT continues to mount. It's time for the technology industry to acknowledge that evidence and adapt.


Even Google has recognized that mobile devices are hurting us. Researchers at Google interviewed and shadowed Android and iOS users with a range of attitudes toward their smartphones to understand how those users experienced excessive mobile device use and non-use. Google found that "mobile devices loaded with social media, email and news apps, were creating a constant sense of obligation, generating unintended personal stress." Basically, Google finally learned what most of us already knew in our bones: people "want to be able to set aside their phone sometimes, not worry about missing anything absolutely urgent, and feel in control of their phone use."


Our smartphones have gone from helping us feel safe to threatening our mental health, productivity, social relationships, and even our democracy.


The time for pointing fingers or indulging in melodramatic rhetoric about "evil" tech is over. As a global culture and as individuals, we must learn to protect ourselves from, and recover from, the various toxins emitted by ICT while ensuring equal access to ICT as a tool for knowledge access, creative self-expression, entrepreneurship, and more. Moreover, we must align social and professional expectations about how and when to use ICT with human needs rather than role-based and revenue-based ones.


But we can’t stop there.


It’s time to understand, in psychosocial terms, in computer science terms, and in economic terms, the short-term and long-term impact of ICT use on humans and our lived experience. We must also characterize, in computer science terms, the information humans need to thrive, so that this knowledge can be baked into the design of new ICT products and systems. The paper my husband and I wrote begins this work.


Yet, there's still more work to be done.


The time has come for the tech industry to design and make available ICTs that function as useful tools for humans, rather than as toxins, prison guards, and spies.


The time has come for ICT to treat every human being on this planet with dignity.


The time has come for the tech industry to support (rather than thwart) our human capacity to choose freely, know what's true, feel safe, optimistic, and loved.


If this means that it's also time for the tech industry to invent new business models, so be it.

