
A Title Made for You

Personalized Lies

Sometimes you need to fear the algorithm.

Often when you hear people bemoaning the dangers of social media or the problems of music streaming, they point to one thing: "the algorithm." This makes sense -- algorithms do power our online experiences -- but it is also somewhat silly. An algorithm is just a very specific set of instructions. Here's an example.

[Device dictating how tall you must be to go on a specific ride in Disneyland. Credit: MickeyVisit]

Last week, I was in Disneyland with my girlfriend and our two friends. At every ride, we'd see some measuring stick indicating how tall you had to be to get on. This is a very basic example of an algorithm, namely one that decides if a person is able to get on the ride. If you are greater than or equal to the indicated height, you may get on. If you are shorter than that, you may not. Here's how you would render that as a function in Python, a popular programming language:

def can_i_ride(height):
    # Compare the rider's height to the ride's posted minimum (42 inches here)
    if height >= 42:
        print("Yes")
    else:
        print("No")

"Algorithms are tools," Spotify's former Data Alchemist Glenn McDonald told me a few months ago. "Tools have no moral stature on their own. They're good if they increase our ability to do interesting, humane things. They're bad if they don't do that." Let's conjure an example of a music algorithm -- albeit without code -- that I think would increase our ability to do "interesting, humane things."

Let's say you're becoming a big fan of Ella Fitzgerald. According to her website, "During Ella's 50-plus year career she recorded over 200 albums and around 2,000 songs." That's a lot of music to sift through even if you've got a world of time. Imagine a tool that showed you the most popular Ella Fitzgerald songs that you haven't heard yet. That would be very useful. It would be even more useful if it were generalized.

Imagine a tool called the "Discovery Machine" that allowed you to enter any artist's name. After you did, it would run an algorithm that returned a list of that artist's songs you'd never heard, sorted from most to least popular. I would love a tool like that. It seems plainly beneficial. (Frankly, it sounds like something I should mention to my bosses.)
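If you did want to put it in code, a bare-bones sketch might look something like this. The catalog and its play counts are invented for illustration; a real version would pull them from a streaming service's data.

def discovery_machine(catalog, songs_i_have_heard):
    # catalog: an artist's songs mapped to global play counts
    # songs_i_have_heard: the songs by that artist you already know
    unheard = {song: plays for song, plays in catalog.items()
               if song not in songs_i_have_heard}
    # Return the unheard songs, most popular first
    return sorted(unheard, key=unheard.get, reverse=True)

ella_catalog = {"Summertime": 90_000_000, "Misty": 40_000_000, "Blue Skies": 15_000_000}
print(discovery_machine(ella_catalog, {"Summertime"}))  # ['Misty', 'Blue Skies']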

But algorithms aren't all plainly beneficial. Imagine I wrote one that was explicitly designed to tell you that you were a worthless pile of garbage every time you opened your phone. Though the algorithm itself might be computationally perfect, I think we would agree that it was morally wrong.

The moral rightness (or wrongness) of an algorithm is not always that clear cut, especially when you consider that all modern software uses scores of algorithms of varying degrees of complexity. Nevertheless, let's go back to our "Discovery Machine" algorithm from a moment ago and tweak it to build a new tool called the "Familiarity Machine." This still allows you to enter any artist's name, but instead of returning a list of that artist's songs you've never heard, sorted from most to least popular, it will return a list of that artist's songs that you have heard, sorted from most to least listened to by you. If you've never listened to that artist before, it will return nothing.
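In the same sketchy spirit, with the numbers again invented, the Familiarity Machine flips the filter and sorts by your own play counts instead of the world's:

def familiarity_machine(catalog, my_play_counts):
    # catalog: an artist's songs; my_play_counts: how many times you've played each one
    heard = {song: my_play_counts[song] for song in catalog
             if song in my_play_counts}
    # Return the songs you've heard, most listened to first (empty if you've never heard the artist)
    return sorted(heard, key=heard.get, reverse=True)

ella_catalog = {"Summertime": 90_000_000, "Misty": 40_000_000, "Blue Skies": 15_000_000}
print(familiarity_machine(ella_catalog, {"Summertime": 31, "Misty": 4}))  # ['Summertime', 'Misty']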

Is either of these algorithms, or tools, morally better than the other? It's hard to say. Both discovery and familiarity are vital to the music listening process. If all you did was discover new things, you'd never get to deeply engage with a piece of music. At the same time, if you only listened to things you'd heard over and over, you wouldn't get to expand your horizons and have new experiences. If you look at the various personalized, or algorithmic, playlists that Spotify recommends to you, you'll notice that each weights discovery and familiarity differently.

Here are four algorithmic playlists that Spotify recently recommended to me. In this case, algorithmic means that a human did not make them. They were curated by giving a computer a complex yet specific set of instructions. The first two mixes -- Daily Mix 1 and Daily Mix 2 -- lean into familiarity. In fact, I've heard every song on each of those, some scores of times. Discover Weekly goes in the other direction, meaning I haven't heard most of the songs being recommended. Release Radar sits in the middle: it contains new songs from artists that I have listened to.

I really don't have a problem with any of this. These playlists are listed in a part of the app that specifically highlights algorithmic curation. It's also a good mix of discovery and familiarity. But I've become more suspicious of places where personalization lurks less obviously and where that personalization can be misleading. Let me show you why.

The other day I clicked on Spotify's "Punk Essentials" playlist. Billed as "All the punk rock that you need in your life," it looked like a serviceable collection of songs. Rancid. Bad Religion. NOFX. blink-182. Makes sense. But then I kept scrolling and noticed some oddities. First, I saw "Great Balls of Fire" by Jerry Lee Lewis. "Maybe," I thought to myself, "you could call 'Great Balls of Fire' proto-punk, but that would really be stretching the definition." Then I saw "I'm a Man" by The Spencer Davis Group. Great song, but certainly not a punk song. When I got to "Mystery Train" by Elvis Presley, I figured something must have been broken. "Mystery Train" is an absolute classic, but it's not even in the punk universe.



I grabbed my girlfriend's phone to compare. Her "Punk Essentials" playlist started out the same. "Time Bomb" by Rancid. "American Jesus" by Bad Religion. "Fuck Authority" by Pennywise. "Linoleum" by NOFX. "Still Waiting" by Sum 41. "I'm Shipping Up To Boston" by Dropkick Murphys. "Dammit" by blink-182. And "Come Out and Play" by The Offspring. Then things started to diverge. The ninth song on my playlist was "All Day and All of the Night" by The Kinks. The ninth song on her playlist was "1985" by Bowling For Soup.

A quick search confirmed that both of our playlists had each of those songs. "1985", for example, was the 26th song on my playlist. "All Day and All of the Night" was the 23rd on hers. That was the case with many other songs on our playlists. "Should I Stay or Should I Go" by The Clash was 13th on her playlist and 29th on mine.

There were other songs that were exclusive to one of our playlists, though. For example, Elvis Presley's "Mystery Train" -- the assuredly non-punk song on my playlist -- was not found on hers. Green Day's "American Idiot", by contrast, was on her playlist but not on mine. What's going on?

Punk Essentials is what Spotify has dubbed an "algotorial playlist." This means that both man (i.e., editorial) and machine (i.e., algorithm) come together to make a playlist. In broad strokes, curators select a pool of songs for the playlist. They can specify certain songs that must be on it. That's why the first eight songs on our playlists are the same. For the rest of the playlist, an algorithm -- or fleet of algorithms -- decides which of the songs from the human-curated pool should appear on each of our playlists based on our individual listening history.
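Spotify hasn't published the exact recipe, but here's a toy version of how I picture that assembly working. The pins, the pool, and the listening history below are all made up for the sake of the example.

def build_algotorial(pinned, pool, my_artist_plays, length=6):
    # pinned: the editorial picks that open the playlist for everyone
    # pool: the human-curated candidate songs, mapped to their artists
    # my_artist_plays: how often a given listener plays each artist
    fill = [song for song in pool if song not in pinned]
    # Rank the rest of the pool by that listener's personal history
    fill.sort(key=lambda song: my_artist_plays.get(pool[song], 0), reverse=True)
    return (pinned + fill)[:length]

pinned = ["Time Bomb", "American Jesus", "Fuck Authority", "Linoleum"]
pool = {"1985": "Bowling For Soup", "All Day and All of the Night": "The Kinks",
        "Mystery Train": "Elvis Presley", "American Idiot": "Green Day"}
my_artist_plays = {"The Kinks": 120, "Elvis Presley": 45, "Green Day": 2}
print(build_algotorial(pinned, pool, my_artist_plays))

Run on those made-up numbers, the pinned songs come back first, followed by The Kinks and Elvis Presley, while "American Idiot" gets squeezed out, which is roughly the kind of divergence my girlfriend and I saw.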

In many situations, I think this technology is great. Consider a Spotify playlist like Mood Booster. This playlist is largely just a mix of contemporary, upbeat songs. And it's quite popular. It's been saved almost 8 million times. If you had a curator pick 500 happy-ish songs and an algorithm choose the 50 that best match each listener's tastes, you'd likely have more listeners satisfied than if it were one static playlist of 50 peppy tunes. What boosts one person's mood is the bane of another's existence. A playlist like Punk Essentials is different, though.

Punk is an identifiable genre. It has specific sonic qualities (i.e., stripped-down, in-your-face, guitar-driven). It also has a known historical arc, growing from the garage rock sound of the 1960s into its own scene a decade later. You can argue about what constitutes true punk music. Is Good Charlotte punk? Are The Stooges punk? These categories are flexible and the edges are fuzzy. But we can agree that certain things are not punk. "Call Me Maybe" by Carly Rae Jepsen is not punk. "Hello" by Lionel Richie is not punk. "Mystery Train" by Elvis Presley is not punk.

[Good Charlotte]

It's possible that Spotify's "Punk Essentials" playlist is a true algotorial playlist. It's possible that there is a punk expert selecting a pool of songs that one of Spotify's algorithms is using to choose what the best punk essentials are for me. It's possible that that person thinks that Elvis Presley's "Mystery Train" is a proto-punk anthem. (It's also possible that that person added the song to the selection pool by mistake.) But I worry that, with the layoffs that occurred at Spotify in 2023, more playlists are being taken out of the hands of curators. While algorithmic curation works well for mood-based playlists, it can lead to distortions and outright falsehoods if used to curate playlists that are based on actual scenes and historical movements.

"Does it matter?" you might say to me. "Nobody is losing their life if Elvis Presley accidentally ends up on a punk playlist." But it does matter. Over the last decade, we've seen countless conflicts erupt because social and news feeds curate versions of reality that people want to believe exists rather than what actually exists. And the truth can be fuzzy. But if everything your algorithmically-curated Facebook feed spits back at you is meant to reenforce what you already believe, if every Spotify playlist spoon feeds you music you already like, then we end up in a worse world, one where we can't challenge our minds, where we can't agree on even basic truths. Sometimes, as they say, it's only rock n' roll. Sometimes it's much more than that.

