Toward Disengagement

From Person to User and Back Again

by Bix Frankonis

1.

Early in 2020, I found it interesting that I had to pause my reading of Richard Sennett’s Building and Dwelling: Ethics in the City in order to read Joanne McNeil’s Lurking: How a Person Became a User when it became available, in that a significant part of why I’d wanted to read the former was my casual interest in ways to apply urban planning’s lessons, or at least its language, to online communications and communities.

McNeil’s book was a useful history refresher for me. I don’t know how old she is in offline years, but in internet-self time she’s been online since right around the same time as me (I logged in for the first time in the fall of 1993). Lurking, then, in many ways told the story of the very internet that developed while my own online self did.

(In general arc if not always specific sites and services, Lurking is the story of my internet, too. For her, it was AOL and Geocities; for me: gopherspace, MindVox, IRC, and Usenet.)

Mostly I was struck by McNeil’s recounting of what, for lack of a better phrase, I’ll call the social networking era, as opposed to the social media one which grew out of it. Once upon a time, we had user profiles with all manner of information about us, as provided by us (or as permitted by us, in the case of things such as testimonials from others or posts to our “wall”). To actually find each other, and connect with each other, we had to instant message, group chat, or visit a forum.

Messages sent user to user and public testimonials were how people communicated, but the promise of the social network was realized in the observable and intuited. There were no alerts when changes were made; a person had to look over the same profiles again and again to see their latest updates.

(Emphasis mine.)

It was an internet of place. Once services like Friendster or MySpace gave way to services like Twitter and Instagram, which rely predominantly upon the notion of the social media feed, place went away in favor of a more amorphous and identity-flattening space. Profiles as they once existed truly defined and denoted our personhood, or at least our personahood, and chat rooms and bulletin boards felt like places to visit. The feed, though, did away with all (or at least most) of that.

Add in the rise of the smartphone, which was far better suited to quick-hit, bite-sized, on-the-go consumption, and out go the blogs and discussion forums and real-time chats which were so intimately tied to larger, more fixed-in-place devices.

The internet had a station before, like a shoebox full of recipes on a countertop, like the kitchen itself. As smartphones blurred organizational boundaries of online and offline worlds, spatial metaphors lost favor. How could we talk about the internet as a place when we’re checking it on the go, with mobile hardware offering turn-by-turn directions from a car cupholder or stuffed in a jacket pocket?

This transition from place to space also fundamentally transformed the nature of the activity for which McNeil titled her book.

Once upon a time, lurking frequently was how you learned the shape of an online place, how you learned its rules and came to understand the dynamic of its residents. There might have been common points of (n)etiquette, but each place also had its own flavors and its own boundaries. In the borderless expanses of “platforms” such as Twitter, there’s no there there, and so no real opportunity to lurk around its edges to observe and learn its ways.

A testimonial was always a one-off, and there was no space for someone to respond to another person’s testimonial. And if it was no good, the recipient would delete it (mortifying). Unlike email (private) or forums (within a community), the testimonial widened online communication within set parameters: user to user in public, or user to an audience (friends and onlookers). A testimonial was written with the expectation that lurkers would see it.

I’ve sometimes expressed the changes in our internet experience as a move from interaction to indication, from expression to excitation. “Social media on mobile,” writes McNeil, “had a different tempo and friction as users documented in the moment, rather than retrospectively.” Mostly, how we began to behave on mobile became how we behaved on other devices, as well, because it’s how the new crop of sites to which we all gravitated were designed to be used.

McNeil herself thinks that lurking remains, just in a somewhat redefined and restricted sense.

Friendster users found themselves liberally adopting the word “friend” to describe various relationships. Instagram and Twitter used language that accounted for the potential of a mass of strangers watching another user’s activity. Instead of friends, users “followed” users and were “followers.” Lurkers weren’t just a possibility now, but an expectation.

As suggested above, I’m not sure I agree with this. What’s been forced upon us, I think, is the inaction of consumption as opposed to the participation (passive though it might have been) that was lurking.

“Earlier social networks and social digital environments,” writes McNeil, “benefited from smaller, segmented communities: no obligation to participate, IRL intervals between logged-in sessions, and more flexible online identities.” It was those smaller, segmented communities that drove both the sense of place and the action of lurking. What we have now is a cognitive state more akin to a coiled spring, where we consume “content” with the expectation of engagement rather than of participation.

McNeil properly defends her use of “lurking” only as a positive thing: “Lurking is listening and witnessing on the internet, rather than opining and capturing the attention of others.” More than anything else, that does capture what we’ve lost as the frictionless, placeless spaces of social media have taken over.

Richard Sennett, in Building and Dwelling, connects the question of place versus space to the matter of speed.

At a walking pace, the spotlit objects are ‘round’, in the sense that we can dwell on them, studying their contours and context, whereas at a speeding pace the single spotlit object appears neurologically as ‘flat’—a fleeting image with no depth or context. In this sense, walking slowly produces a deeper lateral consciousness than moving fast. Lateral accounting is one of the criteria for distinguishing place—a site in which you dwell—from space—a site you move through. It establishes the basic cognitive claim for privileging cyclists over motorists—the cyclist knows more, neurologically, about the city than the motorist.

This, too, describes what happened to the internet in its “progression” from boards and rooms and walls, to social networking, to social media. We no longer dwell online; rather, we move through it.

McNeil and I both started off in the internet of places, and witnessed a sort of gamification of what it meant to be online. There are plenty of remaining spaces, but few to consider “ours”, or, really, anyone’s. Most of these internet spaces are like Facebook as McNeil specifically describes it: “an infinite ant farm”.

It’s not that place no longer is possible on the internet, but that as commerce took over, everything else online became just as transactional. Which is not to say that commerce never should have come to the internet; it just should not have imposed its ethic and its view of human behavior upon everything else that was here.

It’s not that boards and forums and chats no longer exist, and there’s nothing stopping us from maintaining profile websites of our own, divorced from any particular platform’s designs upon us.

It’s just that the dominant ethos of the internet right now is one that maneuvers us into being users rather than people. One of the ways we get back to being people is to learn (or perhaps relearn) how it used to be—by reading accounts such as Joanne McNeil’s of our one-time lurking life.

2.

L. M. Sacasas in January 2021:

Last summer I argued that, in the context of information superabundance, the Database now precedes the Narrative. Digitization has made possible the dissemination and storage of information at unprecedented scale and speed. To the degree that your view of the world is mediated by digitized information, to that same degree your encounter with the world will be more like an encounter with a Database of unfathomable size than with a coherent narrative of what has happened. The freedom, if we wish to call it that, of confronting the world in this way also implies the possibility that any two people will make their way through the Database along wildly divergent paths.

L. M. Sacasas in June 2020:

In other words, when you read a narrative, for example, you are encountering the product of a series of choices that have already been made for you by the author out of a myriad of possibilities from the database of language. The countless other choices that were possible are present only to the imagination. You see the words the author chose, not the ones she could’ve chosen. You see the path marked out for you as a reader, not the multiple paths that were rejected. When you encounter a database, however, you see the opposite. You see the field of possibility and any number of paths through the database remain hypothetical and potential.

This contrast between narrative and database struck me like a bolt of lightning. Not just because it nicely describes the insurmountable difficulty I have with social media’s organizing principle of the feed, but because it describes a fundamental part of my autistic experience.

Years ago, before my diagnosis, a friend of mine asked me to help her pack up her house before she left the country. Tasks such as wrapping up items and packing up boxes were easy; there’s a simple progression of steps: grab an item, wrap it up, find a place for it in the box. But I’d also been charged with something else. I was pointed at the cluttered piles of her garage and asked to determine what’d be good to keep, what’d be good to donate, and what’d be good to throw away.

This brought only a total mental paralysis, and a sense of the walls closing in on me. Wrapping and packing items was a simple narrative task, with clear and sequential “story” beats. Sorting the garage was a complicated database task that overwhelmed my cognitive capacity and executive function.

It’s just as Sacasas says: faced with a narrative, much of the cognitive work already has been done for you; faced with a database, almost all of that work falls to you.

That “field of possibility” Sacasas mentions? It’s the paralyzing aspect of being asked open-ended questions. It’s the social pressures of small talk. It’s the dramatic failure of jobs where I’d been asked to create new processes out of whole cloth. Sacasas wasn’t writing about having an autistic brain, but this construction of narrative vs. database has given me new language with which to tell people about my experience—and it’s become fundamental to understanding why I cannot, after all, cognitively manage social media.
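For readers who think in code, the contrast can be made concrete. Here is a minimal, entirely hypothetical sketch in Python (mine, not Sacasas’s or McNeil’s; the items and decisions are invented): a narrative is a sequence whose ordering the author already chose, so consuming it is one obvious loop, while a database is an unordered pile for which every question (keep, donate, discard, even the order of encounter) falls to you.

```python
# A hypothetical illustration of the narrative-vs.-database contrast;
# the data and decisions below are invented for this example.

narrative = [
    "grab an item",
    "wrap it up",
    "find a place for it in the box",
]

# Narrative: the author already chose the path. Reading it is one loop.
for step in narrative:
    print(step)

garage = {"photo albums", "paint cans", "ski boots", "broken lamp"}

# Database: a field of possibility. Before anything can happen, the
# reader has to invent the questions; none of these predicates were
# chosen for us by an author.
def triage(item: str) -> str:
    if "photo" in item:    # sentimental? keep it
        return "keep"
    if "broken" in item:   # unusable? throw it away
        return "throw away"
    return "donate"        # everything else: a judgment call

for item in sorted(garage):  # even the order of encounter is our burden
    print(item, "->", triage(item))
```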

3.

Gone from social media since late 2020, I’ve nonetheless kept an eye out for concepts which help me explain why. Enter an Aeon dissection of predictive processing from Sally Davies.

Predictive processing casts the brain as a ‘prediction engine’ – something that’s constantly attempting to predict the sensory signals it encounters in the world, and to minimise the discrepancy (called the ‘prediction error’) between those predictions and the incoming signal. Over time, such systems build up a ‘generative model’, a structured understanding of the statistical regularities in our environment that’s used to generate predictions. This generative model is essentially a mental model of our world, including both immediate, task-specific information, as well as longer-term information that constitutes our narrative sense of self. […]

[…] According to the emerging picture from predictive processing, cognition and affect are tightly interwoven aspects of the same predictive system. Prediction errors aren’t merely data points within a computational system. Rather, rising prediction errors feel bad to us, while resolving errors in line with expectation feels good. This means that, as predictive organisms, we actively seek out waves of manageable prediction error – manageable uncertainty – because resolving it results in our feeling good.

Davies goes deep into the reward system at play here, but it seems to me that predictive processing also more simply explains my troubles with what I’ve called the cognitive violence of the social media feed, in and of itself, as an organizing principle.

Whether we are talking about an algorithmic feed that makes judgments about what it thinks you will want to see the most, or simply the context collapse of a single feed into which all manner of people and subject matter are dumped willy-nilly, we’re talking about a profound loss of any sort of predictability for one’s own actions or one’s own thinking.

That can be especially problematic for an autistic person, for whom the need for a predictable environment can be almost a sort of prime directive. The greater the rise in prediction errors, the greater the sense of anxiety, if not a general sense of overwhelm.

Predictive processing also gets a bit at the distinction between a database and a narrative. Predictions in the offline world necessarily follow, for the most part, a basic course of cause-and-effect—in other words, a narrative course. Algorithmic feeds and context collapse thwart any real sense of narrative; without clear cause-and-effect—without an obvious causal relationship between a first thing, a second thing, and a third thing—cognitive struggles can abound.
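To make that concrete, here is a toy sketch (my own invention, not Davies’s, and a drastic simplification of predictive processing): a trivial predictor that guesses each next item from the one before it. On a stream with narrative structure, most of its prediction errors resolve as it learns the sequence; on a shuffled, feed-like stream of the very same items, they never do.

```python
# A toy predictor, invented for this essay's argument: it learns which
# item usually follows which, and counts how often it is surprised.
# This drastically simplifies predictive processing; it models only the
# narrative/feed difference in predictability, not the brain.
import random
from collections import defaultdict

def surprise_rate(stream):
    """Fraction of items the predictor failed to anticipate."""
    follows = defaultdict(lambda: defaultdict(int))
    errors, prev = 0, None
    for item in stream:
        if prev is not None:
            seen = follows[prev]
            guess = max(seen, key=seen.get) if seen else None
            if guess != item:      # a prediction error
                errors += 1
            seen[item] += 1        # update the "generative model"
        prev = item
    return errors / (len(stream) - 1)

day = ["wake", "coffee", "work", "lunch", "work", "dinner", "sleep"]
narrative = day * 50               # the same causal sequence, repeated
feed = narrative[:]
random.shuffle(feed)               # same items, context collapsed

print(f"narrative surprise: {surprise_rate(narrative):.2f}")  # low: errors mostly resolve
print(f"feed surprise:      {surprise_rate(feed):.2f}")       # high: they never do
```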

4.

Currently, then (the project I’ll describe below), can be thought of as the “where” upon which I’ve landed as these sorts of ideas have played around with each other in the back of my mind since well before I’d read Lurking; the book helped both catalyze and crystallize that thinking.

To be clear about something: I’m not, per se, “against Twitter”. My twelve years on the platform are directly responsible for whatever understanding I’ve gained about matters such as racial inequity, white privilege, and the trans experience (let alone, say, dog mushing). I’d be at a substantial loss and disadvantage in my perspectives on the world without my years on Twitter.

What I came, finally, to realize is that Twitter—and to a lesser but not at all inconsequential extent Instagram (I’d already quit Facebook itself years ago)—had doggedly committed to a principle for the organization of communication and connection that is fundamentally at odds with, at least, my own cognitive capacities: the feed.

(As they say: if you’ve met one autistic person, you’ve met one autistic person. In my case, it’s evident that my own particular autistic feature set simply is almost completely incompatible with the feed as a day-in, day-out information structure and, in retrospect, for years had been increasingly impeding my own ability to think straight and regulate my behavior, both online and off.)

Friendster and Myspace were simple, and use of either was straightforward: add people, message people, click around. There were no algorithmic filters ranking and prioritizing what content a user would most wish to see.

(Emphasis mine.)

There’s an adage that if you’re not paying for a product, you are the actual product. I see the feed similarly, in that you are the thing that’s being fed to something else. In the move from the passive participation of lurking (and all that went with it) to the comparative inaction of consumption (and all that goes with it), we become the thing that is being consumed.

In the earlier era of social networking, there was neither a way for you to “broadcast a message across contexts” (from McNeil’s discussion of hashtag activism on Twitter) nor for the site to give you what it wanted rather than what you wanted. In a sense, the social networking era, as contrasted with the modern social media one, was an era of greater agency on the part of the people doing the actual networking.

Prior to the advent of the relentless database that is the feed, we were left in many ways to explore individual people one by one, through the individual profile. Left to our own devices in a place of our own, we naturally tend to present ourselves in a more narrative fashion.

What I’m after, personally, is a way to keep up to date with people I care about, or am interested in, without having to subject myself to those cognitively violent vagaries of the feed. I want a way back to those profile pages that I feel, as I say above, “denoted our personhood, or at least our personahood”.

To have a sense of place online—and to dwell there—is not about being lured and captured into never logging off. It’s not about being always online. The subtly coercive ways in which social media (and its attendant devices) keeps us more or less constantly on the internet, by denying our agency or at least making it extraordinarily difficult to exercise, bastardize the idea of place: as McNeil writes, “all worthwhile communities have this in common: participants are always free to leave”.

Gone for many of us have been McNeil’s “no obligation to participate” and “IRL intervals between logged-in sessions”. Captivity is not a state in which you dwell, but one in which you suffer. For many of us, even when we are offline, cognitively we aren’t, not really. To dwell is to live in, and one cannot live in a database. One can only live in a narrative.

The final sticking point for me in the decision to delete my Twitter account was simply the fact that it was the only convenient way to keep up with people I know both from other times in my life (online and off) and from Twitter itself. In the end, though, given the stresses of having those connections buried in the torrent of the unending, infinite-scream-scroll of the feed, I just couldn’t any longer justify the trade-off.

(I’ve realized more recently, and more than once because apparently this is something I need to learn over and over again, that the reason I never quite managed to find a Mastodon server that seemed to suit me was that Mastodon remained a feed. At its essence, with its streaming micro-posts, favorites, and boosts, it’s just a distributed Twitter with no algorithm. The point, at least for me, can’t possibly be just to make a “better” Twitter.)

Currently is not intended to replace or compete with Twitter or Instagram or Mastodon, although I sort of suspect that many people would be able to offload some of their Twitter connections to it, or at least to something very like it. It’s meant as an alternative to being always part of the feed.

Right now, Currently is vaporware (or maybe it’s more aptly considered thinkware), existing only as a series of mockups; I am very much not a programmer.

What I hope for it, should Currently come to be as a kind of shared network for disengagement, is that it become a place where users can become people again, and where people can feel a sense of ownership over themselves, through the generated personahood of the profile page.

What I want after more than a decade of social media spaces, and what I think other people might want, is a place to go where you can catch up, then log off.
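Because Currently exists only as mockups, any code is necessarily speculative, but a sketch may show how little machinery the idea needs. The following is purely hypothetical Python of my own (no real design or API is implied): profiles are places you visit deliberately, nothing is pushed or ranked, and noticing an update is the visitor’s own act of lurking.

```python
# A purely hypothetical sketch of the idea behind Currently; the project
# is "thinkware," so none of these names reflect any real design or API.
from dataclasses import dataclass, field

@dataclass
class Profile:
    """A person's self-authored page: the unit of the network."""
    handle: str
    bio: str
    currently: str                 # a single self-written status line

@dataclass
class Visitor:
    """What a person remembers from their last deliberate look."""
    seen: dict = field(default_factory=dict)  # handle -> last status seen

    def catch_up(self, profiles):
        """Visit each profile on purpose, in an order the visitor chooses.
        There is no feed: nothing interleaved, ranked, or streaming."""
        for p in sorted(profiles, key=lambda p: p.handle):
            changed = self.seen.get(p.handle) != p.currently
            marker = "updated" if changed else "unchanged"
            print(f"{p.handle}: {p.currently} [{marker}]")
            self.seen[p.handle] = p.currently
        # ...and then, the whole point: log off.

friends = [
    Profile("joanne", "writer", "reading in the park"),
    Profile("bix", "blogger", "thinking about place"),
]
me = Visitor()
me.catch_up(friends)   # catch up...
me.catch_up(friends)   # ...later: everything shows as unchanged. Log off.
```

The one design choice the sketch tries to honor is the social networking era’s habit McNeil describes, in which “a person had to look over the same profiles again and again to see their latest updates”: change detection lives with the person, not the platform.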

5.

They weren’t cows inside.
They were waiting to be, but they forgot.
Now they see sky, and they remember what they are.
—Drew Z. Greenberg, Firefly, “Safe”