The Problem with Media Recommender Systems Today

Bill Seaward

Overview

A fundamental flaw in the way media recommendation systems currently operate lies at the heart of why digital experiences often leave us dissatisfied and sometimes unwell. A new approach to indexing and discovering content is the solution.

It's well established that self-determination, sometimes called free will depending on the context, is contingent not just on having the freedom of choice, but also on the freedom of one's desires. If our desires are not aligned with our choices, or if some external force constrains our desires, then the freedom to act alone may not be sufficient for genuine freedom of choice. This insight about how our desires relate to our freedom is relevant today for the lives we inhabit inside media recommendation systems, because these systems often ignore our deepest desires.

Our digital lives are immersed in recommender systems, which generally work by algorithmically selecting content to suggest to the user based on information about that user. This information is often drawn from the user's previous content selections, but may also include other data, such as psychological or marketing profiles of the user, and large language models may also be used to predict recommendations. Importantly, the recommendations made are also dependent on any other business goals the platform pursues. For instance, the business may rank recommendations by how likely the content is to keep users on the platform longer.
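To make the mechanism concrete, here is a minimal sketch, with entirely hypothetical names, weights, and scoring rule, of the kind of ranking step described above: a model's estimate of relevance to the user is blended with a separate estimate of how long the item will hold their attention, the business objective.

```python
# Hypothetical sketch of a platform-side ranking step. The names, weights,
# and scoring rule are illustrative assumptions, not any platform's real code.

def score_item(predicted_relevance: float,
               predicted_engagement: float,
               engagement_weight: float = 0.6) -> float:
    """Blend relevance to the user with a business objective (expected time on platform)."""
    return (1 - engagement_weight) * predicted_relevance + engagement_weight * predicted_engagement


def rank(candidates: list[dict]) -> list[dict]:
    # Each candidate carries model outputs inferred from the user's history.
    return sorted(candidates,
                  key=lambda c: score_item(c["relevance"], c["engagement"]),
                  reverse=True)


if __name__ == "__main__":
    feed = rank([
        {"id": "calm-documentary", "relevance": 0.9, "engagement": 0.2},
        {"id": "outrage-clip", "relevance": 0.5, "engagement": 0.9},
    ])
    print([c["id"] for c in feed])  # the engagement term can outrank plain relevance
```

With the weights above, the less relevant but more engaging item ranks first, which is exactly the trade-off the business objective introduces.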

To make these recommendations, existing systems rely almost entirely on the implicit actions we take, like clicks and watch time, rather than explicit inputs, like a user-reported preference or emotion. This is a problem because our emotions are fundamental to the development of our desires and the execution of our choices. If systems do not provide a means of easily reflecting our feelings and desires in our choices, then our actual desires and our explicit actions may start to diverge. For example, we might find that despite spending considerable time on a platform, we also quite dislike it.
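As a concrete illustration of that asymmetry, here is a hypothetical sketch of the kind of feature record such a ranker might consume: every field is inferred from behavior, and the commented-out fields stand in for the explicit inputs that are missing today.

```python
from dataclasses import dataclass

# Hypothetical feature record for one user-item pair. Every field is inferred
# from behavior; none is stated by the user. Field names are invented.
@dataclass
class ImplicitSignals:
    clicks_last_7d: int
    avg_watch_seconds: float
    rewatch_rate: float
    scroll_dwell_ms: float
    # reported_mood: str       # <- explicit input with no equivalent today
    # desired_outcome: str     # <- e.g. "unwind", "learn something"
```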

This disconnection between our choices and desires is possible because behavior doesn't mirror desires in a perfect one-to-one way. In a broad sense, choices are best viewed as epiphenomena that emerge in association with other psychological states, such as competing and conflicting desires and feelings that are in a constant state of experiment and change. Another way to view this is that we often have desires about desires. For instance, we can desire something while at the same time possessing the desire not to desire that thing. Such second-order desires are often expressed and processed internally as identity- and goal-oriented self-reflection, and as emotional responses to experience. We act, get feedback from the world, weigh our competing desires, change our feelings and attitudes, and act again. And it's in the continuous unfolding of this process that our freedom of choice emerges.

But current recommendation systems don't consider our internal mental states when making their content selections, so as we engage with these platforms we're unknowingly defining our recommendations through a limited set of inputs that poorly reflect who we are. These inputs narrow our future recommendations, which increases the probability that we'll click on more of what we clicked on earlier. And when we do click on more of the same, that in turn strengthens existing predictions about what we get to see, our future recommendations are narrowed again, and so on.

This is a feedback loop in which our initial choices become unnaturally magnified focal points of interest over time, so the probability that a fleeting interest becomes an entrenched habit can increase dramatically. This kind of choice recursion is new in communications technology because of the degree of personal immersion and the immediacy of recommendation adjustment that digital experiences enable.
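A toy simulation, with invented numbers and no claim to match any real system, illustrates the recursion: weights over topics start nearly uniform, the system always shows the currently highest-weighted topic, and each click reinforces that weight, so a slight early preference ends up dominating exposure.

```python
import random

# Toy model of choice recursion. Weights start nearly uniform, the system
# recommends the top-weighted topic, and each click feeds back into the
# weights. All numbers are invented for illustration.
random.seed(0)
topics = ["news", "music", "cooking", "gaming", "fitness"]
weights = {t: 1.0 for t in topics}
weights["gaming"] += 0.1            # one fleeting early interest

for step in range(200):
    recommended = max(weights, key=weights.get)   # exploit the current belief
    if random.random() < 0.5:                     # user clicks about half the time
        weights[recommended] += 0.2               # the click reinforces the belief

total = sum(weights.values())
shares = {t: round(w / total, 2) for t, w in weights.items()}
print(shares)   # the slightly favored topic ends up dominating exposure
```

After a couple of hundred rounds the one topic that started a fraction ahead accounts for the large majority of the weight, while the others never get a chance to be shown at all.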

Failing to appreciate this relationship between our deepest thoughts and desires and content recommendation design is part of why social media, while fun at times, makes some people miserable and many of us dissatisfied, even as we fail to reduce our usage. There are many contexts where recommender systems are less concerning, such as the curation of specific media, like music or movie recommendations. But our online environments are becoming more than just entertainment delivery systems; they are increasingly immersive proxies for the real world. As more of our analog lives move online, these concerns become more significant.

The solution lies in creating digital experiences that better reflect our deepest desires. First, this means adapting recommendation systems to better account for user outcomes, satisfaction, and overall psychological experience. Second, it requires a new layer of content tagging that connects the moods, goals, and other more meaningful internal experiences of users with the content itself. Third, we must design low-friction interfaces that let users easily report their deepest, ever-changing desires and filter content based on the moods, desires, and outcomes they want.
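As a sketch of what that could look like, assuming an invented tag vocabulary and field names, content items carry mood and outcome tags, the user explicitly states how they feel and what they want, and filtering runs on those explicit inputs rather than on behavioral inference.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a mood/goal tagging layer. The tag vocabulary,
# field names, and matching rule are assumptions, not an existing standard.

@dataclass
class ContentItem:
    title: str
    moods: set[str] = field(default_factory=set)     # e.g. {"calm", "curious"}
    outcomes: set[str] = field(default_factory=set)  # e.g. {"learn", "unwind"}

@dataclass
class UserState:
    current_mood: str        # explicitly reported, not inferred from clicks
    desired_outcome: str     # e.g. "unwind", "learn", "connect"

def filter_by_state(items: list[ContentItem], state: UserState) -> list[ContentItem]:
    """Keep only content whose tags match what the user says they want right now."""
    return [i for i in items
            if state.current_mood in i.moods
            and state.desired_outcome in i.outcomes]

if __name__ == "__main__":
    catalog = [
        ContentItem("Slow piano mix", {"calm"}, {"unwind"}),
        ContentItem("Breaking outrage thread", {"agitated"}, {"react"}),
    ]
    tonight = UserState(current_mood="calm", desired_outcome="unwind")
    print([i.title for i in filter_by_state(catalog, tonight)])
```

The point of the sketch is the ordering: the user's stated state is consulted before any engagement-based ranking, rather than being absent from the pipeline entirely.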

As a matter of digital wellbeing, choice recursion in content recommendation should be minimized. The algorithms that determine digital design should not also determine who we are, who we become, or define the terms of our own satisfaction. Rather, they should enable us to find digital experiences that reflect our changing internal desires as accurately, and with as little latency, as possible.
