The Psychology of Internet Fame

Our brains are wired to seek approval but may not be equipped to handle our modern existence.

In his New Yorker essay “On the Internet, We’re Always Famous,” Chris Hayes ties together some interesting threads. It’s an intricate piece deserving of a read in its entirety, but here are some salient chunks:

It seems distant now, but once upon a time the Internet was going to save us from the menace of TV. Since the late fifties, TV has had a special role, both as the country’s dominant medium, in audience and influence, and as a bête noire for a certain strain of American intellectuals, who view it as the root of all evil. In “Amusing Ourselves to Death,” from 1985, Neil Postman argues that, for its first hundred and fifty years, the U.S. was a culture of readers and writers, and that the print medium—in the form of pamphlets, broadsheets, newspapers, and written speeches and sermons—structured not only public discourse but also modes of thought and the institutions of democracy itself. According to Postman, TV destroyed all that, replacing our written culture with a culture of images that was, in a very literal sense, meaningless. “Americans no longer talk to each other, they entertain each other,” he writes. “They do not exchange ideas; they exchange images. They do not argue with propositions; they argue with good looks, celebrities and commercials.”

This revulsion against the tyranny of TV seemed particularly acute in the early years of the George W. Bush Administration. In 2007, George Saunders wrote an essay about the bleating idiocy of American mass media in the era after 9/11 and the run-up to the Iraq War. In it, he offers a thought experiment that has stuck with me. Imagine, he says, being at a party, with the normal give and take of conversation between generally genial, informed people. And then “a guy walks in with a megaphone. He’s not the smartest person at the party, or the most experienced, or the most articulate. But he’s got that megaphone.”

The man begins to offer his opinions and soon creates his own conversational gravity: everyone is reacting to whatever he’s saying. This, Saunders contends, quickly ruins the party. And if you have a particularly empty-minded Megaphone Guy, you get a discourse that’s not just stupid but that makes everyone in the room stupider as well . . .

[…]

Yes, he wrote that in 2007, and yes, the degree to which it anticipates the brain-goring stupidity of Donald Trump’s pronouncements is uncanny. Trump is the brain-dead megaphone made real: the dumbest, most obnoxious guy in the entire room given the biggest platform. And our national experiment with putting a D-level cable-news pundit in charge of the nuclear arsenal went about as horribly as Saunders might have predicted.

That’s hardly a novel argument and, frankly, I disagree with it. It’s doubtless true that there was a lot of dumb stuff on television in the early 2000s. Then again, there was a lot of dumb stuff on television in the 1950s. And the 1960s. And the 1970s. And . . . well, you get the point. Yet we mostly elected really bright people to the Presidency.

And, of course, there have always been some truly excellent shows. Not only on the entertainment side but on the news and information side. Further, by 2007, we were well into an amazing renaissance in television that began in the 1990s and really kicked off with HBO’s The Sopranos in 1999. For those who wanted it, there had never been more brilliant programming available.

Still, the notion that television is a passive medium that impacts us differently than radio, let alone books and magazines, certainly resonates.

But Saunders’s critique runs deeper than the insidious triviality and loudness of major TV news, both before and after 9/11. He’s making the case that forms of discourse actually shape our conceptual architecture, that the sophistication of our thinking is determined to a large degree by the sophistication of the language we hear used to describe our world.

This is, of course, not a new contention: the idea that dumb media make us all dumber echoes from the very first critiques of newspapers, pamphlets, and the tabloid press in America, in the late eighteenth century, to the 1961 speech by then-Federal Communications Commission Chair Newton Minow, in which he told the National Association of Broadcasters that, basically, their product sucked and that TV amounted to a “vast wasteland.”

The medium, one might say, is the message.

Which, at long last, gets us to Hayes’ central argument:

I thought, and many of us thought, that the Internet was going to solve this problem. The rise of the liberal blogs, during the run-up to Barack Obama’s election, brought us the headiest days of Internet Discourse Triumphalism. We were going to remake the world through radically democratized global conversations.

That’s not what happened. To oversimplify, here’s where we ended up. The Internet really did bring new voices into a national discourse that, for too long, had been controlled by far too narrow a group. But it did not return our democratic culture and modes of thinking to pre-TV logocentrism. The brief renaissance of long blog arguments was short-lived (and, honestly, it was a bit insufferable while it was happening). The writing got shorter and the images and video more plentiful until the Internet birthed a new form of discourse that was a combination of word and image: meme culture. A meme can be clever, even revelatory, but it is not discourse in the mode that Postman pined for.

As for the guy with the megaphone prattling on about the cheese cubes? Well, rather than take that one dumb guy’s megaphone away, we added a bunch of megaphones to the party. And guess what: that didn’t much improve things! Everyone had to shout to be heard, and the conversation morphed into a game of telephone, of everyone shouting variations of the same snippets of language, phrases, slogans—an endless, aural hall of mirrors. The effect is so disorienting that after a long period of scrolling through social media you’re likely to feel a profound sense of vertigo.

Not only that: the people screaming the loudest still get the most attention, partly because they stand out against the backdrop of an undulating wall of sound that is now the room tone of our collective mental lives. Suffice it to say: the end result was not really a better party, nor the conversation of equals that many of us had hoped for.

So, there’s no doubt a lot of truth in this. Looking at my Facebook feed, which for a variety of reasons is much less curated than my Twitter feed, I see a lot of dumb stuff that, were I less trained and experienced in the consumption of content, might well impact me differently than it does. A goodly number of people whom I know to be quite intelligent think some really dumb things because they’re bombarded with this crap.

While it’s not an everyday thing, because of the vagaries of work and family obligations, I consume a lot of Twitter. But, because of the way I curate it (short version: a handful of lists of experts in a handful of fields that I care about) I get very little of the vertigo Hayes seems to experience. And I tend to mute and unfollow the annoying shouters. It can be a huge time suck because the information never stops. And, of course, the curation itself can certainly create blind spots and reinforce a “bubble.” But it’s not psychologically overwhelming.

Next, Hayes pivots to his titular point—and it’s quite the pivot, indeed:

The most radical change to our shared social lives isn’t who gets to speak, it’s what we can hear. True, everyone has access to their own little megaphone, and there is endless debate about whether that’s good or bad, but the vast majority of people aren’t reaching a huge audience. And yet at any single moment just about anyone with a smartphone has the ability to surveil millions of people across the globe.

The ability to surveil was, for years, almost exclusively the province of governments. In the legal tradition of the U.S., it was seen as an awesome power, one that was subject to constraints, such as warrants and due process (though often those constraints were more honored in the breach). And not only that, freedom from ubiquitous surveillance, we were taught in the West, was a defining feature of Free Society. In totalitarian states, someone or something was always listening, and the weight of that bore down on every moment of one’s life, suffocating the soul.

Well, guess what? We have now all been granted a power once reserved for totalitarian governments. A not particularly industrious fourteen-year-old can learn more about a person in a shorter amount of time than a team of K.G.B. agents could have done sixty years ago. The teen could see who you know, where you’ve been, which TV shows you like and don’t like; the gossip that you pass along and your political opinions and bad jokes and feuds; your pets’ names, your cousins’ faces, and your crushes and their favorite haunts. With a bit more work, that teen could get your home address and your current employer. But it’s the ability to access the texture of everyday life that makes this power so awesome. It’s possible to get inside the head of just about anyone who has a presence on the social Web, because chances are they are broadcasting their emotional states in real time to the entire world.

In a relative sense, I’m Internet Famous and have been for going on two decades now. Although I have a fairly common name, most of the results of a Google search for “James Joyner” will be about me. And, while I mostly write about national and international news and policy, there’s certainly plenty of personal information available for those willing to spend the time to dig into it all. (And it would take no work at all to figure out my current employer.) Thus far, at least, it hasn’t been particularly problematic for me.

So total is the public presence of our private lives that even those whose jobs depend on total privacy cannot escape its reach. The open-source intelligence outfit Bellingcat has used this fact to track down a wide array of global malefactors, including the two Russian agents who appear to have poisoned a Russian defector in the U.K., Sergei Skripal, with a rare nerve agent, in 2018. Bellingcat was able to identify both men through data it purchased on the gray market, obtaining their aliases and photos of each. But the breakthrough came when it was discovered that one suspect had attended the wedding of the daughter of their G.R.U. unit’s commander. In a video—posted on Instagram, of course—the commander walks his daughter down the aisle on a lovely dock, to the sounds of a bossa nova cover of “Every Breath You Take.”

The young couple didn’t just post clips of their wedding (which was gorgeous, by the way) to Instagram. They also uploaded a highly stylized video, set to upbeat music, that shows them in bathrobes getting ready for the ceremony as well as the big moments of the wedding itself. To establish the suspect’s attendance at the ceremony, Bellingcat scanned other posted snapshots of the wedding and compared them with images in the video. Sure enough, the identity of the man in question, Anatoliy Chepiga, matched that of the alias he’d used to travel to the U.K. for the attempted murder.

Bellingcat published its findings, and, presumably, a whole host of Russian military and intelligence officials—maybe all the way up the chain to Vladimir Putin—realized that the utterly innocuous social media posts of a happy young couple had tripped off the identification of someone indicted for attempted murder and wanted by the British authorities.

So, yes, this sort of thing is profoundly creepy. Ditto recent news of sophisticated hacking groups revealing all of the personal data of QAnon and various Republican Party groups. Whether or not one thinks those particular outcomes are a good thing, the fact that it’s so easy to do should concern us.

Alas, aside from better cyber security protocols and stiffer penalties for cybercrimes, I don’t know what the hell to do about any of this. The benefits that the Internet, mobile phones, GPS, and the like have brought over the last thirty or so years have been enormous. That they also compromise our privacy is not new news.

Ditto this:

This is an extreme example of a common phenomenon. Someone happens upon a social-media artifact of a person with a tiny number of followers and sends it shooting like a firework into the Internet, where it very briefly burns white-hot in infamy. There are some who find the sudden attention thrilling and addictive: this will be their first taste of a peculiar experience they then crave and chase. And there are others, like our newlyweds, who very much do not want the attention. They belatedly try to delete the post or make it “private,” but by then it’s too late for privacy. A message they intended for friends and family, people they have relationships with, ended up in the hands of strangers, people who don’t know them at all.

I see this happen all the time and it really annoys me. The other day, for example, a very prominent national sports broadcaster with over a million Twitter followers quote tweeted some poor rando with 37 followers who, probably several alcoholic beverages in, “publicly” expressed his frustration with his college team’s performance in a big game. Said sportscaster lambasted him for being a disloyal fan, generating a predictable pile-on. There’s simply no reason for that to happen.

And, obviously, there are cases that are far more consequential than embarrassing some otherwise anonymous person who thinks he’s just talking to his buddies via a medium that happens to be public. People literally lose their jobs over these kinds of things.

It’s perfectly understandable from the standpoint of employers. If someone publishes something that’s racist, sexist, or otherwise inflammatory, it’s seldom worth the internal strife to rehabilitate their reputation. And, indeed, in many industries, it’s simply not sustainable and even opens the employer up to legal liability. Still, it’s something that wouldn’t have happened even 20 years ago. Random off-color comments made between friends would simply be private unless they took place at the workplace. Now, there is essentially no “private.”

And let’s not even get started on people in their teens and twenties who post photos and videos of themselves on Instagram, TikTok, and other venues, content that will almost certainly come back to haunt them for decades to come.

Never before in history have so many people been under the gaze of so many strangers. Humans evolved in small groups, defined by kinship: those we knew, knew us. And our imaginative capabilities allowed us to know strangers—kings and queens, heroes of legend, gods above—all manner of at least partly mythic personalities to whom we may have felt as intimately close as kin. For the vast majority of our species’ history, those were the two principal categories of human relations: kin and gods. Those we know who know us, grounded in mutual social interaction, and those we know who don’t know us, grounded in our imaginative powers.

But now consider a third category: people we don’t know and who somehow know us. They pop up in mentions, comments, and replies; on subreddits, message boards, or dating apps. Most times, it doesn’t even seem noteworthy: you look down at your phone and there’s a notification that someone you don’t know has liked a post. You might feel a little squirt of endorphin in the brain, an extremely faint sense of achievement. Yet each instance of it represents something new as a common human experience, for their attention renders us tiny gods. The Era of Mass Fame is upon us.

Some variant of that has long existed but not at this scale. Whether because I moved a lot or for some other reason, I often had the experience that people who attended my high school or college knew who I was while I had no idea who they were. But, yes, a Twitter conversation between me and someone I know (say, Dan Drezner, who is significantly more “famous” than me) will be joined by people I theretofore didn’t know existed. Whether this somehow renders me a “tiny god” is not, alas, something that I’ve previously contemplated.

If we define fame as being known to many people one doesn’t know, then it is an experience as old as human civilization. Stretching back to the first written epic, Gilgamesh (whose protagonist was, in fact, an actual king), history, particularly as it is traditionally taught, is composed almost entirely of the exploits of the famous: Nefertiti, Alexander the Great, Julius Caesar, Muhammad, and Joan of Arc.

But as the critic Leo Braudy notes, in his 1987 study, “The Frenzy of Renown,” “As each new medium of fame appears, the human image it conveys is intensified and the number of individuals celebrated expands.” Industrial technology—newspapers and telegraphs, followed by radio, film, and TV—created an ever-larger category of people who might be known by millions the world over: politicians, film stars, singers, authors. This category was orders of magnitude larger than it had been in the pre-industrial age, but still a nearly infinitesimal portion of the population at large.

Indeed. And, ostensibly, all of those people sought—indeed, worked very hard—to become famous.

All that has changed in the past decade. In the same way that electricity went from a luxury enjoyed by the American élite to something just about everyone had, so, too, has fame, or at least being known by strangers, gone from a novelty to a core human experience. The Western intellectual tradition spent millennia maintaining a conceptual boundary between public and private—embedding it in law and politics, norms and etiquette, theorizing and reinscribing it. With the help of a few tech firms, we basically tore it down in about a decade.

It’s been closer to two decades now. Facebook expanded beyond the college set in 2006 and Twitter launched that same year; I’ve been on both in some form almost the entire time. And MySpace and other precursors, going back at least to LiveJournal, were around a few years before that.

Still, Hayes and I are relatively unusual. While social media is damn near ubiquitous at this point, most people simply aren’t all that online. But, for the younger generation, there’s very little distinction, indeed, between social media and “real life.”

That’s not to say the experience of being known, paid attention to, commented on by strangers, is in any sense universal. It’s still foreign to most people, online and off. But now the possibility of it haunts online life, which increasingly is just life. The previous limiting conditions on what’s private and what’s public, on who can know you, have been lifted. In the case of our young Russian lovebirds, one might safely assume that, until Bellingcat started snooping around their wedding videos, they had been spared the experience of the sudden burst of Internet fame. But, like them, just about everyone is always dancing at the edge of that cliff, oblivious or not.

This has been entirely internalized by the generation who’ve come of age with social media. A clever TikTok video can end up with forty million views. With the possibility of this level of exposure so proximate, it’s not surprising that poll after poll over the past decade indicates that fame is increasingly a prime objective of people twenty-five and younger. Fame itself, in the older, more enduring sense of the term, is still elusive, but the possibility of a brush with it functions as a kind of pyramid scheme.

My youngest stepdaughter, who just went off to college, has been doing YouTube and TikTok videos for a niche hobby for years and is a modest “influencer” in that space.

This all worries Hayes considerably:

This, perhaps, is the most obviously pernicious part of the expansion of celebrity: ever since there have been famous people, there have been people driven mad by fame. In the modern era, it’s a cliché: the rock star, comedian, or starlet who succumbs to addiction, alienation, depression, and self-destruction under the glare of the spotlight. Being known by strangers, and, even more dangerously, seeking their approval, is an existential trap. And right now, the condition of contemporary life is to shepherd entire generations into this spiritual quicksand.

Again, this strikes me as both legitimate and overwrought.

On the one hand, yes, this “online” existence is very performative and almost certainly adds stressors that didn’t previously exist, for people emotionally unequipped to handle them. As if teenagers and young adults needed more of that. On the other, these same media make it much easier to form meaningful connections, especially for those who are introverted or unusual.

Back to the aforementioned stepdaughter: her two best friends are people she met online, who live in two different states not adjacent to any she has ever lived in, and whom she has seen in person only very occasionally. It seems perfectly normal to her. And those relationships doubtless made it far easier to get through the last two years of high school in a new school—the last eighteen months of which were entirely online because of a global pandemic.

As I’ve tried to answer the question of why we seek out the likes and replies and approval of strangers, and why this so often drives both ordinary and celebrated people toward breakdowns, I’ve found myself returning to the work of a Russian émigré philosopher named Alexandre Kojève, whose writing I first encountered as an undergraduate.

[…]

In his lectures, Kojève takes up Hegel’s famous meditation on the master-slave relationship, recasting it in terms of what Kojève sees as the fundamental human drive: the desire for recognition—to be seen, in other words, as human by other humans. “Man can appear on earth only within a herd,” Kojève writes. “That is why the human reality can only be social.”

That rings true to me. Indeed, I’ve spent an inordinate amount of my free time since kicking off this blog in January 2003 writing for and interacting with strangers.

Understanding the centrality of the desire for recognition is quite helpful in understanding the power and ubiquity of social media. We have developed a technology that can create a synthetic version of our most fundamental desire. Why did the Russian couple post those wedding photos? Why do any of us post anything? Because we want other humans to see us, to recognize us.

But We Who Post are trapped in the same paradox that Kojève identifies in Hegel’s treatment of the Master and Slave. The Master desires recognition from the Slave, but because he does not recognize the Slave’s humanity, he cannot actually have it. “And this is what is insufficient—what is tragic—in his situation,” Kojève writes. “For he can be satisfied only by recognition from one whom he recognizes as worthy of recognizing him.”

I just don’t see this as true. While I’ve met a handful of OTB commenters “in real life,” the distinction really has little meaning anymore. Conversations with regulars are pretty similar in all but form to those I’d have with colleagues, and losing Teve or Doug isn’t really any different from losing an acquaintance from school or work.

I’ve found that this simple formulation unlocks a lot about our current situation. It articulates the paradox of what we might call not the Master and the Slave but, rather, the Star and the Fan. The Star seeks recognition from the Fan, but the Fan is a stranger, who cannot be known by the Star. Because the Star cannot recognize the Fan, the Fan’s recognition of the Star doesn’t satisfy the core existential desire. There is no way to bridge the inherent asymmetry of the relationship, short of actual friendship and correspondence, but that, of course, cannot be undertaken at the same scale. And so the Star seeks recognition and gets, instead, attention.

The Star and the Fan are prototypes, and the Internet allows us to be both in different contexts. In fact this is the core, transformative innovation of social media, the ability to be both at once. You can interact with strangers, not just view them from afar, and they can interact with you. Those of us who have a degree of fame have experienced the lack of mutuality in these relationships quite acutely: the strangeness of encountering a person who knows you, who sees you, whom you cannot see in the same way.

This is true but only in degree. If someone I’ve never interacted with tells me they read OTB or follow my work on Twitter or have read a piece that I’ve published somewhere, it’s good to hear but can be a bit awkward. If it’s a one-off encounter, it’s an inherently unequal exchange. But I’ve developed relationships with OTB readers and Twitter followers such that I care what their opinions are.

And, frankly, being able to not care what some rando on Twitter or some one-off commenter on the blog thinks is a valuable antidote to the problem Hayes identifies. Do I get a dopamine hit when something I tweeted goes “viral” for a day or two? Sure. Ditto the olden days of the blog, when I got an “Instalanch” or otherwise saw a surge in traffic because a post got linked somewhere prominent. (It’s been years since I even tracked traffic so, absent a huge influx of new commenters, I wouldn’t even know at this point.) But if someone with 17 followers and no personal information in their bio starts spewing nonsense, I’ll quickly mute them and move on. Conversely, if someone with a third of my followers makes a salient point, I’ll engage with them and likely follow them back. Star/Fan doesn’t have to be a static, one-sided relationship.

We are conditioned to care about kin, to take life’s meaning from the relationships with those we know and love. But the psychological experience of fame, like a virus invading a cell, takes all of the mechanisms for human relations and puts them to work seeking more fame. In fact, this fundamental paradox—the pursuit through fame of a thing that fame cannot provide—is more or less the story of Donald Trump’s life: wanting recognition, instead getting attention, and then becoming addicted to attention itself, because he can’t quite understand the difference, even though deep in his psyche there’s a howling vortex that fame can never fill.

This is why famous people as a rule are obsessed with what people say about them and stew and rage and rant about it. I can tell you that a thousand kind words from strangers will bounce off you, while a single harsh criticism will linger. And, if you pay attention, you’ll find all kinds of people—but particularly, quite often, famous people—having public fits on social media, at any time of the day or night. You might find Kevin Durant, one of the greatest basketball players on the planet, possibly in the history of the game—a multimillionaire who is better at the thing he does than almost any other person will ever be at anything—in the D.M.s of some twentysomething fan who’s talking trash about his free-agency decisions. Not just once—routinely! And he’s not the only one at all.

Sure! But this is actually the opposite phenomenon. Fifteen years ago, it would have been nearly impossible for a twentysomething fan to have any relationship at all with a sports superstar. Now, superstars who want to engage with their fans can. And those who don’t want to can limit their engagement to those they follow.

There’s no reason, really, for anyone to care about the inner turmoil of the famous. But I’ve come to believe that, in the Internet age, the psychologically destabilizing experience of fame is coming for everyone. Everyone is losing their minds online because the combination of mass fame and mass surveillance increasingly channels our most basic impulses—toward loving and being loved, caring for and being cared for, getting the people we know to laugh at our jokes—into the project of impressing strangers, a project that cannot, by definition, sate our desires but feels close enough to real human connection that we cannot but pursue it in ever more compulsive ways.

So, again, Hayes is unusual. He has a national television program. He’s not George Clooney famous or LeBron James famous. But he’s way more than Internet Famous. Yet, like many in the media, he is also incredibly online. He has more than 2.5 million Twitter followers, tweets dozens of times a day, and regularly interacts with followers. (Indeed, in a thread about the article I’m dissecting here, he rejects a suggestion he simply close off comments from those he doesn’t follow on the absolutely correct premise that it “utterly transforms the platform.”)

By the standards of normal people, I’m pretty damn online. I have a little under 8,000 Twitter followers, which is decidedly not at a celebrity level but is at least an order of magnitude more than most. Yet, aside from the propensity to be a time suck, I’ve never found it overwhelming. But I can certainly see where it would be for Hayes. Or for a teenager who suddenly gets “famous” in a bad way.

FILED UNDER: Democracy, Science & Technology, Society
About James Joyner
James Joyner is Professor of Security Studies. He's a former Army officer and Desert Storm veteran. Views expressed here are his own. Follow James on Twitter @DrJJoyner.

Comments

  1. Michael Reynolds says:

    I would be an outlier in Chris Hayes’s world.

    I have a degree of ‘fame,’ at least my pseudonym does. Google Michael Grant and there I am in all my (ten-year-old photo) fabulousness. I have 21K Twitter followers, which I could easily increase if I spent my time pandering rather than shoving politics down the throats of people who just want to talk about some book I wrote. I deleted Facebook. I never used Instagram. I used to interact on Goodreads til I forgot my sign-in and I took the opportunity to do nothing about it. I’ve taken Twitter breaks a few times and found life goes chugging right along. I’d delete Twitter but it’s the only venue left for fans who want to talk to me and that seems unnecessarily rude.

    IOW I do the bare minimum by modern standards, and I do it for marketing purposes and fan consideration, not because I want to be famous or to achieve recognition. I dread recognition. My line has always been: money, good; fame, stupid. Possibly the fugitive years helped me reach that realization, but since then, IRL, I’ve still never seen the upside to fame. I don’t ever want to be buying laxative at CVS and have someone come up and say, ‘hey, aren’t you. . .?’

    Fame is a thing you don’t own or control, it’s a power held by others, not by you. Money on the other hand, that’s yours. (Well, yours and the IRS’s. And the California FTB.) But that’s me and I’m not normal. Most people do seem to have some desperate need to be seen, some need to be recognized, which is probably a core simian thing – you want the other monkeys in the tribe to know you’re part of the tribe, otherwise they might eat your face.

    All that aside, yes, it’s true that the internet is not a thing the brains of homo sapiens were evolved to manage. OTOH, our brains also were not evolved to direct a 3000 pound hunk of steel going seventy miles an hour amidst a lot of other fast-moving hunks of steel while eating a cheeseburger, singing along to music and taking occasional pauses to speak in finger language to other drivers. And yet. . . yep, in the still of the morning, I hear the traffic on the 5.

    The species will adapt, it’s what we do. We are adapting. The process of adapting is messy to put it mildly, and various disasters may occur before we adapt, but in the end we’ll adapt.

    There! An optimistic ending. Later I’ll write the spin-off with the gloomy ending.

  2. Andy says:

    Bellingcat has really been remarkable in showing the power of open source intelligence.

    Internet fame (avoiding it) is a big reason I never stopped being pseudonymous online after I left the intel community, although Andy is my real name. Plus the world is full of assholes. But my online presence is really limited: commenting here and there and a Facebook account with the privacy settings maxed that I post on only a couple of times a year. I still have a twitter account but really only check it for breaking news or for research purposes. I still consider the platform to be mind cancer in terms of its effect on elites and political discourse.

    But I also think it’s important to point out that not everyone is online. A small percentage of the population has a twitter account, a smaller percentage is active on the platform, and the 80/20 rule applies there: the vast majority of content comes from a small minority of users. I still have a lot of friends and associates who have no online presence at all.

    But I have tried to teach my kids to not be stupid and constantly remind them that the internet is forever and there are tons of assholes out there who will destroy your reputation if you let them.

  3. DAllenABQ says:

    This post is a good example of why I come to this site pretty much every day. I would not have seen the Hayes article had Dr. Joyner not read it and written about it. I freely confess to feeding my intellectual curiosity from the effort of another intellectually curious person.

    I agree that Hayes’s premise and flow of logic are sound, but he does get a bit overwrought in his conclusions. Seems he is turning up the volume of his own megaphone.

  4. EddieInCA says:

    @Michael Reynolds:

    With the exception of this blog, and a twitter account, I have no social media.

    With the exception of a video interview I did last year which is on a certain site, I avoid media at all costs.

    I’ve been around fame since my first job at ABC Television in 1982. When I was 24, in one week alone, I met with Meryl Streep, Kurt Russell, Craig T. Nelson, and Cher. The job wasn’t glamorous. I was a photo editor and had to show them photos from the feature “Silkwood” they had recently finished. I got to go to Jack Nicholson’s house to do the same job for “Prizzi’s Honor”. I worked with Martin Sheen and Emilio Estevez on “40 Days for Danny”.

    I’ve seen fame, up close and personal. Most people have no idea how exhausting fame is/can be. Most people who want fame regret it when they get it. And the sad thing is, as MR states, once you’re famous, you lose control. I have actress acquaintances who are active on Instagram and TikTok who have to live with deepfakes of themselves doing porn. And if you call attention to it, it just exacerbates the problem, creating even more. It’s just not right. The law hasn’t caught up with the technology.

    My desire to stay as anonymous as possible – given my job – has been a detriment. I don’t do press tours. I don’t do publicity for the shows I work on. I turn down all the publicity and marketing invites that WB still tries to set up for me. I don’t need the people of WWTF radio in Seattle knowing what I think about John Ridley as a boss. I don’t care to tell the people who listen to WGTFO radio in Minneapolis what Marc Maron is like. I’d have a much more lucrative career, much more – because these publicity and marketing people LOVE to spend money – if I played that game, which is why so many do it.

    For me, no thanks. No instagram. No facebook. No snapchat. No tiktok. No signal. No parler. Nope. Nope. Nope.

  5. Mimai says:

    [Disclosure: I have not read the entire piece.]

    From the excerpts, it appears that he moves back and forth between “fame” and “recognition.” If (big if…see disclosure) he does indeed conflate these, I think this is problematic.

    While they may be kissing cousins, fame and recognition are not interchangeable. Indeed, fame is often used as a pejorative….when the intent is to downgrade the seeker. And often to upgrade oneself by comparison.

    When I noodle this, I can start to put some separation between the two concepts, but the boundaries are fuzzy. I haven’t really thought that deeply about it….naturally, this is because, like the other fine commenters, I do not seek fame (upgrade complete).

    Perhaps others have a different read of this (having, ahem, actually read the entire piece).

  6. CSK says:

    As somebody once said, “A celebrity is someone who’s famous for being well-known.”

  7. Modulo Myself says:

    Chris Hayes has a very earnest idea of the internets that’s post-crash and post-9/11 and basically about defeating Fox News/dumb tv with good blogging. It doesn’t really have much resonance. Josh Harris did “Quiet: We Live in Public” in 1999 on the internet. It was basically 24/7 surveillance of 100 or so artists/whatever living in a bunker in Tribeca. They slept in pods, could access cameras everywhere, and everything was in public–sex, eating, drugs, going to the bathroom, etc.

    Harris wasn’t coming from left field. I lived in SF in the late 90s before the crash. There were so many people talking about Guy Debord and the Situationists and the power of the spectacle and our consumer culture and how the internet was going to defeat it–whatever it was. Harris was just the dark side. He was an utterly f—ed up guy, but he was able to predict a lot of what has ended up consuming people.

  8. Stormy Dragon says:

    One important thing is to differentiate between the Internet itself and the actions of specific companies operating on the Internet.

    Twitter, Facebook, etc. make money through selling ads, which means they want people clicking around as much as possible. Their business models are built on deliberately stirring shit between their users, like a high school girl going to tell Amy that she heard Beth tell Susy that Amy is a slut. They don’t actually care about Amy, they just want Amy’s friends to attack Beth, and then Beth’s friends will attack back, etc. and more clicks!

    Social media could easily be made more healthy, but it won’t be as long as it’s controlled by corporations whose interests are in direct conflict with what’s best for their users.

  9. Lounsbury says:

    @Andy: This reminds me of the lesson I learned in the days of blogging (Aqoul) on Iraq, and a proper financial Journo tracking me down due to clever reading between the lines. Rang me at the office having figured out the oblique funds references. A wee lesson.

  10. Just nutha ignint cracker says:

    @Modulo Myself: I think that the difficulty is that those who seek

    …to remake the world through radically democratized global conversations.

    fail to realize that “radically (or even mundanely, truth be told) democratized” anything intellectual falls prey to gravitation toward the mean. Elevating society is simply hard work. There are no “few easy steps” solutions.

  11. Just nutha ignint cracker says:

    @Stormy Dragon: “It’s not a superhighway; it’s a shopping mall. And it’s not about information; it’s about commerce.” That’s a quote (at least approximately) from an article that I read back some 30-mumble years ago. The lay of the land hasn’t changed because the basic rules of the society haven’t changed. In a capitalist society, everything either starts out or eventually becomes monetized. For the “Information Superhighway,” some imagined the rules would change midstream, but they didn’t.
