“What you’re describing is grief.”
Those words were spoken by a middle-aged female psychologist who reclined into her chair and crossed her fingers. I recall this specific detail because the time before, I had been assigned a male therapist (against my preference) with only three fingers on each hand. He had said the same thing. Grief: an apparent explanation for the nauseating pain in my lower stomach, the same pain that drove me to leave my Chinese lecture early, walk three blocks down Bancroft Way, and check myself into student mental health services. “There’s no point,” I would say to myself in class, “being physically present but mentally unavailable.”
This was the prevailing mindset of my first semester at Berkeley. I began to operate out of insecurity, triggered by the prospect of taking responsibility for my own choices. The negative beliefs I had long held to be true began to materialize as unrighteous anger, exacerbated by a tendency to surround myself with people who seemed objectively important by virtue of inherited looks and money. Selfies were staged, alcohol was in abundance, and as a teenager I somehow fancied myself a socialite in the mold of Zelda Fitzgerald, determined to be seen with the “right people” at the “right parties.” In short, I was a severely unhappy and superficial person.
It is a way of functioning that does not strike me as foreign, even now. Like many curious children, I held onto little snippets of my parents’ dialogue that unnerved me and tasked myself with rationalizing those words. Over time, I came to suspect a genetic component: that they, too, suffered from obsessive thoughts, despair, and possibly that same immobilizing grief. I began to take note of such symptoms as they came to hamper even my most basic endeavors, such as getting out of bed or eating. In January of the next year I sought treatment from a psychiatrist in San Francisco and was formally diagnosed with Generalized Anxiety Disorder, an affliction that affects roughly 3% of the US population each year.
I tell you this not as an aimless revelation but because visibility is important, particularly among an audience that suppresses every perceptible sign of weakness. Pathophysiological research suggests that “GAD” is linked to disrupted functional connectivity of the amygdala, a set of neurons in the brain’s medial temporal lobe integral to processing human emotion. To dismiss this diagnosis with the common notion that teenagers nowadays are entirely too fragile would be to miss the point entirely. More than anything, the news indicated to me that more introspective work was necessary.
Although now, some time later, I can scarcely pinpoint all of the non-biological stressors that contributed to my anxiety, in the moment I was certain about one thing: I needed to limit my social media use. Before even arriving on campus, I had gained hundreds of mutual followers on Instagram via the class page on Facebook. Excited by the idea of meeting people unlike those from my rural hometown, I saw nothing unusual in these online connections. Walking to class, however, I was taken aback when people I ran into referred to me by my username. Friendships would develop through real-life interactions, only for me to learn later that those same people already had an idea of me through social media. “I didn’t want it to be weird,” one girl later conceded. All judgment suspended, it still felt weird.
These awkward situations are symptomatic of my generation, “Generation Z,” the age group that market research suggests prefers emojis to text and entertains the impulse to consume information as quickly as possible. Patience and rumination are qualities which, although approved of in the abstract, lose ground to more immediate gratifications. To swiftly navigate the digital era is potentially to have everything: the world at your fingertips, validation at a whim.
“We capture a contextualizing, affective moment through trust in technology,” contends digital rhetoric professor Aaron Hess. “The intersection of body and machine, of analog and digital, enables users to generate new perceptions of both the self and the device.” But what happens when the digital mindset is imposed on the physical world? What happens when young adults become preoccupied with manipulating photographs to fit an unattainable image? We begin to neglect the people we are in real life.
This was the case for me, and so I adopted a policy: if I had spoken to someone only a handful of times, or no longer saw them regularly, I was not obligated to follow them. Learning about these people’s lives from social media posts rather than in-person interaction felt disingenuous, especially considering most of them lived within five blocks of me. By the end of the school year, I had unfollowed over 200 students.
It was a relief to no longer be constantly confronted with visuals of white teeth and Gucci belts and vacations to remote Grecian islands. It was a relief to accept myself without Facetune or filters, to not conceal the cystic acne scars which remind me that I am indeed still a teenager. It was a relief to be exempt from the calculated portrayals that are fundamentally at odds with who I am as a writer. No longer did I have to sacrifice authenticity for digital approval, or even subdue my healthy sense of self-deprecation. Documenting each tragic mistake and act of charming naivety allows for reflection, a virtue that social media use does not currently seem to encourage. Take, for example, Hopelab’s sponsored national survey of more than 1,300 U.S. teens and young adults, which found that “social media users are somewhat more likely to agree than disagree that they feel like they always have to show the best version of themselves on social media, with 53% agreeing. A majority (57%) reported feeling like other people are doing better than they are (15% often feel that way when using social media).”
The aim here is not to catalog Generation Z’s shortcomings, nor to suggest that I am by any means absolved from contemporary technological culture. It is simply to draw attention to the growing role of social media in young adults’ identity formation, and to the different ways different age groups use it. Is “social media” indeed being used for social purposes, or is it a term that masks the behavior of teenagers obsessed with self-image? Should we reevaluate the perceived legitimacy of modern online interactions? Are social media’s benefits fleeting or lasting? These are the kinds of questions we at Hopelab are asking. It is my hope that older generations, particularly those who build technologies for young audiences, confront these uncertainties head on.