On Monday, Apple unveiled a series of updates to iOS. Part of that update was Screen Time – a combination of new and existing features designed to help people “understand and manage their tech use.” While the press lavishes Apple with praise for taking a brave step, the risk that Apple will do more harm than good only grows. Their announcement directly “Sherlocks” my startup, Onward, but I am first and foremost committed to the mission of helping people with Tech-Life Balance, and I had hoped Apple would do the right thing here.
The biggest problem with Apple’s Screen Time solution – like so many others before it – is that it’s not based on any science. Tech Addiction is a psychological condition that demands expert care and treatment. The belief that anyone in Silicon Valley can “innovate” a solution to mental health is outrageously self-aggrandizing. Where is Apple’s Clinical Advisory Board? What studies did they use to decide on these features? Who at Apple has a qualified degree in behavioral design?
I know they didn’t have any researchers work on this problem, because just about everything they decided to build has been proven not to work in behavior change. A year ago we presented Apple with detailed data on what actually does work, and they still chose to build this near carbon copy of Google’s offering. Their conflicted position, and pressure from investors and competitors, clearly took their toll.
Let’s break this down a bit:
Knowing how much you consume (which Tim Cook repeatedly harped on in this CNN interview as a breakthrough for behavior change) has not been shown in any way to reduce consumption. Most famously, analysis by NYU Langone of NYC’s calorie labeling law found zero statistical evidence that knowing how many calories you consume makes you eat less. In fact, some similar studies have found that it may cause some people to eat more, or to adjust their caloric intake to maximize efficiency. Moreover, we presented very clear data from our cohort of 75,000 self-described tech addicts showing zero correlation between usage time, perceived severity of the problem, and willingness to act on it. None. This means that knowing how much time you spend doing something, and then being shown that data, has – at best – no effect. Worse, it may cause you to act out even more. I’m not arguing that knowledge isn’t power, but knowledge alone is only mildly useful. Ask Tim Cook if he’s changed his behavior since seeing his data in Screen Time – and insist on seeing the receipts.
Setting time limits on your usage of specific apps sounds nice in theory, but the majority of people have no way of knowing what those limits should be. Even with their own personal reporting data, should they keep the number the same, lower it or increase it? Further, our data from Onward has shown that restricting times of day (rather than total time) is more likely to help people use less. So saying “No Facebook during work” is a much more effective way of intervening.
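The time-of-day intervention described above amounts to simple scheduling logic rather than a usage budget. A minimal sketch, assuming hypothetical types (nothing here is a real iOS or Onward API):

```swift
// Hypothetical sketch of a time-of-day restriction ("No Facebook during work"),
// as opposed to a total-time budget. All types and names are illustrative.
struct BlockedWindow {
    let startHour: Int  // inclusive, 24-hour clock
    let endHour: Int    // exclusive
}

// An app is unavailable whenever the current hour falls inside any window.
func isBlocked(hour: Int, windows: [BlockedWindow]) -> Bool {
    return windows.contains { hour >= $0.startHour && hour < $0.endHour }
}

let workHours = [BlockedWindow(startHour: 9, endHour: 17)]
print(isBlocked(hour: 11, windows: workHours))  // mid-workday: blocked
print(isBlocked(hour: 20, windows: workHours))  // evening: allowed
```

Note that the user never has to pick a “right” number of minutes – they only have to name the contexts (work, dinner, bedtime) where the app shouldn’t be available.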
When you reach your allotted time on a given app, Apple’s new feature will pop up a giant warning screen that tells you your time is up. Cook talked this up as “giving people an important moment to make a conscious decision about their actions.” While this makes a lot of sense on the surface, there is minimal evidence that people are not conscious of their choices here. Moreover, when you get the warning, it’s easy to dismiss with a single tap (like Waze’s “I’m a Passenger” or Apple’s driving DND), and you can go right back to unfettered usage. The Onward data we presented to Apple demonstrated that hard blocking is necessary to help people reduce their usage. This means that when your time is up, the apps should stop working until the next day – not simply warn you and let you continue with a minor speedbump. As any addiction researcher will tell you, when an addict wants to use, it will be hard to stop them. So a solution must take this into consideration.
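The gap between a dismissible warning and a hard block can be made concrete. A minimal sketch, with entirely hypothetical types (none of these names are real APIs):

```swift
// Hypothetical contrast between a dismissible warning and a hard block once a
// daily limit is reached. Types and names are illustrative, not real iOS API.
enum LimitPolicy {
    case softWarning  // a screen the user can wave away with one tap
    case hardBlock    // the app stops working until the limit resets tomorrow
}

func canContinue(minutesUsed: Int,
                 dailyLimit: Int,
                 policy: LimitPolicy,
                 dismissedWarning: Bool) -> Bool {
    guard minutesUsed >= dailyLimit else { return true }  // still under budget
    switch policy {
    case .softWarning:
        // A single tap ("Ignore Limit") restores unfettered access.
        return dismissedWarning
    case .hardBlock:
        // No override path exists while the limit is active.
        return false
    }
}
```

Under the soft-warning policy the enforcement reduces to one extra tap; under the hard block there is simply no code path back into the app until the limit resets.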
As we have also heard from Onward users over and over, restrictions are not useful if you can change the settings in a couple of seconds. So we are delivering the ability for individuals to hand settings control to another person, so that changes can’t be made in the throes of an urge to overuse. Setting a PIN in parental controls is nice, but what you really need is a mechanism that helps people lean on each other for support and guidance. Plus, I’ve never met a kid who couldn’t get around the PIN using “social engineering” tactics.
A feature Apple omitted entirely – but is often considered a highly effective way of creating behavior change – is the idea of accountability sharing. This works by giving someone else (a therapist, advisor, spouse, clergy, etc.) an automated / regular view into your behavior. That way if you overuse beyond your limits, you will have a trusted party you’ll be disappointing as well as someone to lean on. Whether it’s the Sponsor in AA or your personal trainer – having someone else there to help is super useful in changing behavior.
Every shared service in iOS can be universally disabled by a single tap in the Settings app, except notifications. If you want to limit the number of messages sent by apps, you must disable each app’s notifications individually. Apple’s new “Notification Tuning” feature is a welcome step forward, but instead of building convoluted Do Not Disturb functions, Apple could easily give people the ability to universally disable notifications with a single tap (a feature many users have requested). I believe they punted on this because it would cut into developer satisfaction and revenue.
Evidence-based treatments for addiction also include therapeutic approaches, like Cognitive Behavioral Therapy and Mindfulness Meditation, both of which we included in Onward because they work. By omitting this content and preventing third-party developers from accessing Screen Time data via an API, Apple has all but ensured there will be few scientifically validated approaches on iOS for the foreseeable future.
Over the past 2.5 years, we’ve helped 75,000 people change their lives for the better – and along the way have learned a lot about what works and what doesn’t in combatting tech addiction. We’ve made it part of our mission to share our data so that everyone can improve their treatment modalities.
Apple demonstrates tremendous hubris in its decision to ignore the advice of experts – including by not having a scientific advisory board – and to deliver a solution to a serious mental health problem with typical Silicon Valley bravado. They wouldn’t attempt to make a heart pacemaker; why, then, do they think mental health is any less deserving of expert opinion?
As much as everyone wants to heap praise on Apple for taking a “brave stand” on this issue, I believe that they have instead demonstrated a disregard for their users. With a rushed and unfocused solution such as this, devoid of any data or research, they send the strong message that they don’t actually care about helping people, but rather getting regulators, the media and consumers off their backs. Apple profits more from tech addiction than just about any other company, selling expensive hardware that both gives us great potential and robs us of our agency.
That’s why I believe that the tech industry needs to be regulated as a hazard to public health. They have demonstrated that they will use diversionary tactics to claim self-regulation, and it’s only a matter of time before they push back on this issue by saying they gave us the data and tools without any appreciable change. Why should tech companies be allowed to shirk responsibility for distracted driving, ruined relationships or an inability to focus on work/school? They literally engineered their products to get you to pay attention at all costs.
Time to start holding them to account.