Filterworld: How Algorithms Flattened Culture
Filterworld: How Algorithms Flattened Culture by Kyle Chayka examines how algorithms shape our cultural experiences and choices, leading to a homogenized society where personal freedom and authentic expression are increasingly compromised. The book critiques the pervasive influence of algorithm-driven recommendations on our digital and physical worlds, arguing for the necessity of understanding and transcending these systems to reclaim our individuality.
Highlights
- Over the two centuries since its invention, the device has become a prevalent metaphor for technological manipulation. It represents the human lurking behind the facade of seemingly advanced technology as well as the ability of such devices to deceive us about the way they work. (In 2005, Amazon named its service for accomplishing digital tasks, like tagging photos or cleaning data, using an invisible marketplace of outsourced human labor “Mechanical Turk.”) The Mechanical Turk is like The Wizard of Oz’s man behind the curtain—an all-knowing, uncanny entity that is ultimately revealed as something much more mundane and comprehensible. (Location 77)
- Algorithmic recommendations shape the vast majority of our experiences in digital spaces by considering our previous actions and selecting the pieces of content that will most suit our patterns of behavior. They are supposed to interpret and then show us what we want to see. (Location 89)
- Algorithmic recommendations are the latest iteration of the Mechanical Turk: a series of human decisions that have been dressed up and automated as technological ones, at an inhuman scale and speed. Designed and maintained by the engineers of monopolistic tech companies, and running on data that we users continuously provide by logging in each day, the technology is both constructed by us and dominates us, manipulating our perceptions and attention. The algorithm always wins. (Location 104)
- Each platform develops its own stylistic archetype, which is informed not just by aesthetic preferences but by biases of race, gender, and politics as well as by the fundamental business model of the corporation that owns it. (Location 122)
- The culture that thrives in Filterworld tends to be accessible, replicable, participatory, and ambient. It can be shared across wide audiences and retain its meaning across different groups, who tweak it slightly to their own ends. (Location 124)
- Filterworld culture is ultimately homogenous, marked by a pervasive sense of sameness even when its artifacts aren’t literally the same. It perpetuates itself to the point of boredom. (Location 130)
New highlights added October 20, 2024 at 2:58 PM
- described the phenomenon to me as an international “harmonization of tastes.” Through algorithmic digital platforms like Instagram, Yelp, and Foursquare, more people around the world are learning to enjoy and seek out similar products and experiences in their physical lives. (Location 140)
- Through their feeds, they are consuming similar kinds of digital content, no matter where they live, and so their preferences are shaped in that image. Algorithms are manipulative; the apps guide them through physical space to places that have adopted digitally popular aesthetics, winning attention and ratings from other users. With higher ratings come yet more algorithmic promotion and thus more visitors. Yet as international as these effects are, the platforms that undergird them are Western, largely based in the tiny American locus of Silicon Valley and controlled by a handful of unfathomably wealthy white men—the opposite of diversity. (Location 142)
- As digital platforms have expanded, the homogeneity they cause has spread, too. (Location 151)
- This imbalance induces a state of passivity: We consume what the feeds recommend to us without engaging too deeply with the material. We also adapt the way we present ourselves online to its incentives. We write tweets, post on Facebook, and take Instagram photos in forms we know will grab attention and attract likes or clicks, which drive revenue for the tech companies. Scientific studies have shown that those likes trigger rushes of dopamine in our brains, meaning that chasing them, and complying with the feed, is addictive. (Location 157)
- On the other side of our algorithmic anxiety is a state of numbness. The dopamine rushes become inadequate, and the noise and speed of the feeds overwhelming. Our natural reaction is to seek out culture that embraces nothingness, that blankets and soothes rather than challenges or surprises, as powerful artwork is meant to do. Our capacity to be moved, or even to be interested and curious, is depleted. (Location 161)
- In place of the human gatekeepers and curators of culture, the editors and DJs, we now have a set of algorithmic gatekeepers. (Location 181)
- Attention becomes the only metric by which culture is judged, and what gets attention is dictated by equations developed by Silicon Valley engineers. The outcome of such algorithmic gatekeeping is the pervasive flattening that has been happening across culture. By flatness I mean homogenization but also a reduction into simplicity: the least ambiguous, least disruptive, and perhaps least meaningful pieces of culture are promoted the most. Flatness is the lowest common denominator, an averageness that has never been the marker of humanity’s proudest cultural creations. (Location 183)
- She pushes a tuner button preset to make the radio jump to FEN, a station that plays American rock music. In a footnote, the book meditates on the technology of that button: It’s “a nice feature where you can set the frequency for the station you want in advance,” but “a little bit of the maniac fun of manual tuning has been lost.” The author observes the difference between hitting a button to instantly tune into the station and wiggling a knob back and forth, navigating through static, and eventually finding the perfect analog position. The latter might be less precise and less convenient, but it’s slightly more magical and humane. (Location 191)
- Still, no matter how complex, an algorithm remains in its essence an equation: a method to arrive at a desired conclusion, whether it’s a Sumerian diagram to divide an amount of grain equally among several men or the Facebook feed determining which post to show you first when you open the website. All algorithms are engines of automation, and, as Ada Lovelace predicted, automation has now moved into many facets of our lives beyond pure mathematics. (Location 293)
- applying the practice of “cybernetics” to business management. Beer described cybernetics as “the science of control.” It involves analyzing complex systems, whether corporations or biology, and determining how they work to better model or create such intelligent, self-correcting systems. (Location 303)
- “We enshrine in steel, glass, and semiconductors those very limitations of hand, eye, and brain that the computer was invented precisely to transcend,” Beer wrote in his 1968 book Management Sciences, pinpointing the paradox. As with the Mechanical Turk, the human persists within the machine. (Location 323)
- Algorithms have become both invisible and omnipresent, contained in the apps we carry around with us on our phones even as their data are hosted physically somewhere distant, within vast air-conditioned server farms set into obscure locations in the natural landscape. Where Project Cybersyn suggested that the world run by data might be coherent and graspable, contained within a room, we now know that it is abstract and diffuse, everywhere and nowhere at once. We’re encouraged to forget the presence of algorithms. (Location 328)
- New technologies inevitably create new forms of behavior, but the behaviors are rarely those that the inventors expect. (Location 332)
- The technology has an inherent meaning of its own that eventually comes to the fore. Marshall McLuhan wrote his famous dictum “the medium is the message” in his 1964 book Understanding Media: The Extensions of Man. He meant that the structure of a new medium—electric light, the telephone, television—is more important than the content that travels through it. The telephone’s ability to connect people exceeds any particular conversation. “The ‘message’ of any medium or technology is the change of scale or pace or pattern that it introduces into human affairs,” McLuhan wrote. (Location 332)
- An executive at the music cataloging and recommendation service Pandora once described the company’s system to me as an “orchestra” of algorithms, complete with a “conductor” algorithm. Each algorithm used different strategies to come up with a recommendation, and then the conductor algorithm dictated which suggestions were used at a given moment. (The only output was the next song to play in a playlist.) Different moments called for different algorithmic recommendation techniques. (Location 359)
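The "orchestra" metaphor maps onto a common ensemble pattern in recommender design: several specialized recommenders each propose a candidate, and a meta-level policy decides whose proposal to use for the current moment. A minimal sketch, with invented recommenders and a toy conductor rule (none of this reflects Pandora's actual system):

```python
# Hypothetical sketch of the "orchestra of algorithms" pattern:
# individual recommenders propose a next song, and a "conductor"
# chooses among them based on listening context. All names and
# rules here are illustrative assumptions, not Pandora's code.

def by_same_artist(history):
    # Stay close to the current listening session.
    return f"another track by {history[-1].split(' - ')[0]}"

def by_collaborative_filter(history):
    return "track liked by listeners with similar history"

def by_novelty(history):
    return "track from an artist the listener has never heard"

# The full "orchestra"; a real conductor might score every proposal.
RECOMMENDERS = [by_same_artist, by_collaborative_filter, by_novelty]

def conductor(history, skips_in_a_row):
    # Toy policy: repeated skips signal boredom, so switch strategies;
    # otherwise keep recommending near what the listener just played.
    chosen = by_novelty if skips_in_a_row >= 2 else by_same_artist
    return chosen(history)

print(conductor(["Radiohead - Reckoner"], skips_in_a_row=0))
print(conductor(["Radiohead - Reckoner"], skips_in_a_row=3))
```

The key design idea is that the conductor isolates "which strategy fits this moment" from "what each strategy would play," so strategies can be added or tuned independently.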
- Algorithms also change over time, refining themselves using machine learning. The data they take in is used for gradual self-improvement to encourage even more engagement; the machine adapts to users and users adapt to the machine. (Location 364)
- But the latter, more innovative technique was based on the actions of other users. Who opened a particular email and how they responded to it would be factored into how much the system prioritized the email. As the paper described it: People collaborate to help one another perform filtering by recording their reactions to documents they read. Such reactions may be that a document was particularly interesting (or particularly uninteresting). These reactions, more generally called annotations, can be accessed by others’ filters. (Location 380)
- Quality is subjective; data alone, in the absence of human judgment, can go only so far in gauging it. Social information filtering bypasses those problems because it is instead driven by the actions of human users, who evaluate content on their own—using judgments both quantitative and qualitative. (Location 403)
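The mechanism these highlights describe can be sketched compactly: users record reactions ("annotations") to documents, and a filter recommends a user the unseen documents that like-minded users reacted to positively. This is a minimal, illustrative user-based collaborative filter; the data and the similarity rule are assumptions, not the actual 1990s systems:

```python
# Minimal sketch of "social information filtering": users record
# reactions to documents (+1 interesting, -1 uninteresting), and a
# user's filter weighs other users' reactions by taste similarity.
# The ratings data and similarity measure are invented for illustration.

ratings = {
    "alice": {"doc1": 1, "doc2": 1, "doc3": -1},
    "bob":   {"doc1": 1, "doc2": 1, "doc4": 1},
    "carol": {"doc1": -1, "doc3": 1, "doc5": 1},
}

def similarity(a, b):
    # Agreement on co-rated documents: +1 when reactions match, -1 otherwise,
    # averaged over everything both users have reacted to.
    shared = set(ratings[a]) & set(ratings[b])
    if not shared:
        return 0.0
    return sum(1 if ratings[a][d] == ratings[b][d] else -1 for d in shared) / len(shared)

def recommend(user):
    # Score each document the user hasn't seen by other users' reactions,
    # weighted by how similar their tastes are to the user's.
    scores = {}
    for other in ratings:
        if other == user:
            continue
        w = similarity(user, other)
        for doc, reaction in ratings[other].items():
            if doc not in ratings[user]:
                scores[doc] = scores.get(doc, 0.0) + w * reaction
    return max(scores, key=scores.get) if scores else None

print(recommend("alice"))
```

Here "alice" agrees with "bob" and disagrees with "carol," so bob's liked-but-unseen document wins, which is the quantitative-plus-qualitative judgment the highlight points to: the numbers only aggregate human reactions.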
- Early Internet algorithms were designed to sift through a vast body of material for whatever was important to a user, and then present it in a coherent way. Recommendations were the goal: recommending a piece of information, a song, an image, or a social media update. Algorithmic feeds are sometimes more formally and literally labeled “recommender systems,” for the simple act of choosing a piece of content. (Location 420)
- advertising now provides the vast majority of Google’s revenue—more than 80 percent in 2020. As PageRank attracted billions of users to Google Search, the company could also track what the users were searching for and could thus sell advertisers space on particular search queries. The ads a user sees were just as informed by the algorithm as the search results were. And advertising, built on the search algorithm, turned Google into a behemoth. (Location 446)
- Amazon website began using collaborative filtering as early as 1998 to recommend products for customers to buy. Rather than attempting to measure similar profiles of users to approximate taste, as Ringo did, the system worked by determining which items were likely to be purchased in tandem—a rattle with a baby bottle, for example. (Location 450)
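The shift described here, from matching similar users to matching items purchased together, can be sketched as counting co-occurrences across orders. A toy version with invented purchase data (not Amazon's actual item-to-item algorithm, which also normalizes for item popularity):

```python
from collections import Counter
from itertools import combinations

# Illustrative sketch of item-to-item filtering: count which items
# appear in the same order, then recommend an item's frequent
# companions. The order data below is invented for illustration.

orders = [
    {"rattle", "baby bottle", "bib"},
    {"rattle", "baby bottle"},
    {"rattle", "novel"},
]

# Count co-purchases in both directions for easy lookup by item.
pair_counts = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        pair_counts[(a, b)] += 1
        pair_counts[(b, a)] += 1

def also_bought(item):
    # Items ranked by how often they were purchased alongside `item`.
    companions = Counter({b: n for (a, b), n in pair_counts.items() if a == item})
    return [i for i, _ in companions.most_common()]

print(also_bought("rattle"))
```

The advantage of the item-to-item approach is that the co-purchase table can be precomputed offline, so serving a recommendation is just a lookup, which scales to catalogs far larger than any per-user similarity computation would.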
- These early algorithms sorted individual emails, musicians (as opposed to specific songs), web pages, and commercial products. As digital platforms expanded, recommender systems moved into more complex areas of culture and operated at much faster speeds and higher volumes, sorting millions of tweets, films, user-uploaded videos, and even potential romantic partners. Filtering became the default online experience. (Location 462)
- The negative aspects of Filterworld might have emerged because the technology has been applied too widely, without enough consideration for the experience of the user, rather than for the advertisers targeting them. The recommendations, such as they are, don’t work for us anymore; rather, we are increasingly alienated by them. (Location 473)
- The News Feed patent application’s longer description suggests a system of collaborative filtering, acting on a much larger scale than the email systems of the 1990s. It’s worth quoting in full because it predicts what much of life online, from social networks to streaming and e-commerce, became in the decade that followed: so many automated feeds dictated by corporations more so than users, gradually forming a more passive relationship between users and the content feed. Items of media content are selected for the user based on his or her relationships with one or more other users. The user’s relationships with other users are reflected in the selected media content and its format. An order is assigned to the items of media content, for example, based on their anticipated importance to the user, and the items of media content are displayed to the user in the assigned order. The user may change the order of the items of media content. The user’s interactions with media content available in the social network environment are monitored, and those interactions are used to select additional items of media content for the user. (Location 497)
- It was like designing a website for Google search-engine optimization: journalists optimized content for the metrics of the algorithm, or at least what we perceived them to be. The process felt manipulative and at times Kafkaesque; we contended with an unseen, incomprehensible, ever-changing opponent. (Location 548)
- Culture is meant to be communal and requires a certain degree of consistency across audiences; without communality, it loses some of its essential impact. (Location 577)
- Technology often appears to belong to the distant future right up until the moment the switch flips, and the leap forward becomes totally mundane, a simple fact of daily life. (Location 584)
- Tanizaki mourned the unique forms of Japanese culture that the old dimness of candlelight had inspired, from the gleam of gold leaf on a home’s interior sliding door to the murky appearance of miso soup in a darkened restaurant: “Our cooking depends upon shadows and is inseparable from darkness.” (Location 597)
- the algorithm is often unconsidered, part of the furniture, noticed only when it doesn’t function in the way it’s supposed to, like traffic lights or running water. (Location 606)
- For all the people on these platforms, every interaction, every moment of passive consumption, is mediated by algorithmic recommendations. Even if some users can opt out of an algorithmic feed, their participation contributes to the data that fuels other users’ recommendations. The dragnet is inescapable. Social networks and streaming services have become the primary way a significant percentage of the global population metabolizes information, whether it’s music, entertainment, or art. We now live in an era of algorithmic culture. (Location 612)
- So we connect more people. That can be bad if they make it negative. Maybe it costs someone a life by exposing them to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de-facto good. (Location 621)