Everything We Do These Days Is Measured and Informed by Data. Does This Really Help?


One recent Tuesday, at two thirty-seven in the afternoon, I received an email from UPS letting me know that a package had been delivered to my home. Attached, as evidence, was a blurry, off-kilter photograph of a small, slightly dented but otherwise nondescript cardboard box that had been placed on my driveway, next to the garage door. A minute later, at thirty-eight minutes past two, I received a second email announcing the package’s arrival, this one from the online merchant that had shipped the box and sold me the shirt it contained. The company congratulated me on the purchase, praised my good taste in menswear, and offered a few suggestions of other articles of clothing I might be interested in buying.

The two emails capped a fusillade of messages. It began five days earlier, when, as I tapped the Place Order button for the shirt, a banking app on my phone notified me that my credit card was being charged $79.95. (It was a nice shirt.) Seconds later, I received both an email and a text from the merchant, confirming the purchase and letting me know that I would receive further communications when the shirt shipped. Which I did, the very next day, when both the merchant and UPS emailed me a shipment confirmation with a tracking link. (When I clicked the link, I learned that the package had been picked up and had arrived at a UPS facility in Tacoma, Washington.) I also received emails from the two companies, as well as another text from the merchant, the day before the delivery, informing me the shirt would arrive the following day—“Get ready!” the retailer brayed—and yet another UPS email, early on Tuesday morning, confirming that the shirt had been loaded onto a truck at a local warehouse and was officially “out for delivery.” There was a coda, too: The day after the shirt arrived, the merchant sent an email expressing its hope that I liked the garment and suggesting I post a review on its website.

I find myself in possession of a lot of information these days. I’m in the loop. I’m in many loops, all spinning simultaneously. It’s not just the minutiae of commerce—orders, shipments, deliveries—that are richly documented. When I’m driving, my car’s dashboard, linked to my iPhone through CarPlay, shows me exactly where I am, tells me the posted speed limit and the current traffic conditions, and lets me know both the distance I have to go before I reach my destination and the estimated time of my arrival. (There’s also a readout available on the town or city I’m visiting: population, elevation, square footage, GPS coordinates.) My phone’s weather app gives me a bespoke meteorological report of remarkable thoroughness. Right this second, the app tells me it’s eighty-four degrees and cloudy outside. A light rain will begin in seventeen minutes and will end forty-eight minutes after that, at which point it will become partly cloudy. The wind is blowing west-southwest at six miles per hour, the relative humidity is 58 percent, and the barometric pressure is 30.18 inHg. The UV index is six, which is High, and the air quality index is fifty-one, which is Moderate. The sun will set this evening at 8:11 p.m., and in four days the moon will be full. I’ve taken 4,325 steps today. My refrigerator’s water filter has only 10 percent of its useful life left. My credit rating just dropped eight points. I have 4,307 unread emails, two more than I had five minutes ago.

Even my consumption of cultural goods—an ugly phrase, yes, but it seems apt—is shadowed by metadata. When the graphical user interface was introduced to personal computers in the early 1980s, the scroll bar habituated us to a visual indicator of our progress through a document. Now, pretty much all viewing, listening, and reading is tracked, visually or numerically, in real time. When I’m listening to a song, a glance at the progress bar tells me, to the second, how much time has elapsed since the tune began and how much remains before it ends. The same goes for TV shows and movies and videos. When I’m reading an e-book, I’m kept apprised of the percentage of the text I’ve made it through. When I’m looking over the homepage of a newspaper or magazine site, I’m told how long it will take to read each article. Here’s a “3 min read.” There’s a “7 min read.” (This essay, for the record, is a thirteen-minute read, and you have nine minutes to go.) Every photo on my phone offers its own little data dump: where and when it was taken, the aperture and ISO settings, the exposure time, the image’s size in pixels and bits. My pictures tend to be amateurish, but the data always looks professional.

We talk a lot these days about Big Data, those heaping stores of digitized information that, fueling search and recommendation engines, social media feeds, and, now, artificial intelligence models, govern so much of our lives today. But we don’t give much notice to what might be called little data—all those fleeting, discrete bits of information that swarm around us like gnats on a humid summer evening. Measurements and readings. Forecasts and estimates. Facts and statistics. Yet it’s the little data, at least as much as the big stuff, that shapes our sense of ourselves and the world around us as we click and scroll through our days. Our apps have recruited us all into the arcane fraternity of the logistics manager and the process-control engineer, the meteorologist and the lab tech, and what we’re monitoring and measuring, in such exquisite detail, is our own existence. “Software is eating the world,” the venture capitalist Marc Andreessen declared in a famous Wall Street Journal op-ed more than a decade ago. It’s also eating us.

In Minima Moralia, his 1951 book of aphoristic musings, the German philosopher Theodor Adorno made a trenchant observation about the intimate relationship he saw developing between humanity and its ever more elaborate and encompassing technology. People were growing attuned to and protective of “the functioning of the apparatus, in which they are not only objectively incorporated but with which they proudly identify themselves.” Adorno wasn’t just rehashing the trope about laborers becoming cogs in the industrial machine, so memorably expressed fifteen years earlier by Charlie Chaplin in Modern Times. His point was subtler. Machines aren’t our masters. They’re not even separate from us. As their makers, we imbue them with our own will and desire. They’re our familiars, and we’re theirs. As we form tighter bonds, our intentions merge. We vibrate to the same rhythms, adopt the same posture toward the world.

The mechanical apparatuses of Adorno’s time, from machine tools in factories to vacuum cleaners in homes, emphasized the industrial ethos of routinization, standardization, and repetition. They oriented people toward the efficient production of outputs. They turned everyone into a machinist. But the apparatuses were not constant presences in people’s lives. Workers walked away from their machines at the end of their shifts. Vacuum cleaners went back into the closet once the rugs were clean. The Internet is different. Thanks to the omnipresence of the smartphone, it’s always there. The network is less a tool than a habitation, less an apparatus than an environment. We don’t just use it to get things done. We are, as Adorno foresaw, incorporated into it as components. We’re nodes, continuously receiving and transmitting signals. The ethos of the system is one of documentation and representation. We’re all jointly engaged in the production of a facsimile of the world—a “mirror world,” to borrow a term from the computer scientist David Gelernter, created purely of information—and in that facsimile, we have taken up residence.

Clean and tidy, the mirror world has practical value. It makes life run more smoothly. If I know I’m going to have to sign for a package, it’s useful to be told when it will arrive. If I’m on a highway and I’m alerted to an accident ahead, I can take an exit before I get stuck in a traffic jam. If I know rain is going to start falling in seventeen minutes, I can put off the walk I was about to take. But the view of reality that little data give us is narrow and distorted. The image in the mirror has low resolution. It obscures more than it reveals. Data can show us only what can be made explicit. Anything that can’t be reduced to the zeroes and ones that run through computers gets pruned away. What we don’t see when we see the world as information are qualities of being—ambiguity, contingency, mystery, beauty—that demand perceptual and emotional depth and the full engagement of the senses and the imagination. It hardly seems a coincidence that we find ourselves uncomfortable discussing or even acknowledging such qualities today. In their open-endedness, they defy datafication.

Still, little data’s simplifications are reassuring. By shrinking the world to the well-defined and the measurable, they lend a sense of order and predictability to our disjointed lives. Social situations used to be bounded in space and time. You’d be in one place, with one group of people, and then, sometime later, you’d be somewhere else, with another group. Such “situation segregation” served as “a psycho-social shock absorber,” the communication professor Joshua Meyrowitz explained in his 1985 book, No Sense of Place. “By selectively exposing ourselves to events and other people, we control the flow of our actions and emotions.” Social media eliminates the spatiotemporal boundaries. Social settings blur together. We’re everywhere, with everyone, all at once. The shock absorber gone, a welter of overlapping events and conversations buffets the nervous system. Time stamps, progress bars, location mappings, and other such informational indicators help temper the anxiousness bred by the flux. They give us a feeling that we’re still situated in time and space, that we exist in a solid world of things rather than a vaporous one of symbols. The feeling may be an illusion—the information offers only a sterile representation of the real—but it’s comforting nonetheless. My shirt is in Tacoma, and all is right in the world.

The comfort is welcome. It’s one reason the data exert such a pull on us. But there’s a bigger reason. Little data tell us little stories in which we play starring roles. When I track a package as it hopscotches across the country from depot to depot, I know that I’m the prime mover in the process—the one who set it in motion and the one who, when I tear open the box, will bring it to a close. That little white arrowhead traveling so confidently across the map on the dashboard? That’s me. I’m going somewhere. I’m worth watching. When I monitor the advance of a song’s progress bar, I know I can stop the music anytime, purely at my whim. I’m the DJ. I’m the tastemaker. I say when one tune ends and the next begins. So lovingly personalized, so indulgent, little data put us at the center of things. They tell us that we have power, that we matter.

And yet, as we rely on the data to get our bearings and exercise our agency, we lose definition as individuals. The self, always hazy, dissolves into abstraction. We begin to exist symbolically, a pattern of information within a broader pattern of information. We feel this most acutely when we shape an identity to fit the parameters of social media. Everything we do on platforms like Facebook, Instagram, and X is logged, and the resulting data are often immediately visible to us (and others) in the form of like and view tallies, friend and follower counts, comment and retweet scores, and other quantitative measures of activity and affect. Even the number of seconds that elapse between a post and a response becomes laden with meaning. Social status and personal character take numerical forms and, like other measurements, demand to be monitored, managed, and optimized. Just as today’s airline pilots, surrounded by data displays in their “glass cockpits,” fly their planes more by number than by sight and feel, so we seem fated to navigate our lives more through recorded signals than through direct experience. Events become real only when they’re rendered after the fact as information. Pics or it didn’t happen, as the Instagrammers used to say.

When social relations are conducted through data, they come to resemble economic relations. They turn transactional. Before my Uber driver sees me as a person, she sees me as an assemblage of information—a location on a map, a rating on a five-point scale, a first name—and I see her the same way. The gig economy, like the social media system, is constructed of little data. It works by turning people and their activities into abstractions, digital signals that can be processed by computers. It’s only logical that, in cities like San Francisco, Phoenix, and Austin, the drivers are now being automated out of existence. Self-driving algorithms can carry out the necessary transactions with even more precision and efficiency. To really perfect the system, though, you’d need to turn the passengers into automatons, too. The trip would take place not on asphalt but entirely on screen, a flow of data through the mirror world. We may not want to admit it, but when we communicate using little data, we’re speaking the language of robots.

Back in 2004, in an interview with Playboy magazine, Sergey Brin, one of Google’s founders, said something that has stuck with me. “The entirety of the world’s information,” he suggested, might one day become “just one of our thoughts.” He was speculating about the possibility that Google would invent some sort of electronic implant to connect an individual’s nervous system to the Internet. The idea seemed far-fetched to me at the time, and it still does. But as I think about how my mind works these days, I’m coming to realize that Brin may have been more prescient than either he or I realized. We don’t need dongles hanging out of our skulls. The stream of little data is already a stream of consciousness. It’s running through our heads all the time. In coming years, as digital sensors proliferate, as more and more objects turn into computer interfaces, and as AI gets better at reading our interests and intentions, the ever-swelling data stream may become our dominant train of thought, our all-purpose apparatus for the work of sense-making and self-making.

A few months ago, as part of my annual physical exam, I had blood drawn for a routine panel of tests. Late the next day, my phone vibrated to let me know the results were available through my doctor’s “patient portal” app. I signed in (entering a six-digit code to authenticate myself), clicked on the Results tab, and was greeted by a long list of numbers. There must have been two dozen of them, each a measure of some important metabolic function, each occupying a point within a range of points. Blood, that most vital and visceral of substances, had been turned into an array of data on a computer screen. Blood had been rendered bloodless. Maybe I was in a morbid mood—medical tests will do that to you—but as I scrolled through the numbers, I couldn’t help feeling I was looking at a metaphor for something larger, something central to the human condition today. What is datafication but a process for transforming the living into the dead?

I returned the shirt. It didn’t fit.

Reprinted from The Hedgehog Review 26.3 (Fall 2024). This essay may not be resold, reprinted, or redistributed for compensation of any kind without prior written permission. Please contact The Hedgehog Review for further details.


