Remember the people on the tram? Or, more precisely, my pondering on the way we consume information? Well, those thoughts are still happily bouncing around my head, like a random "Tombola" synthesiser, and I find myself returning to my keyboard.

"Of course!" I thought, "It's not just what we consume, it's how we consume it."

In today's age of ubiquitous digital connectivity, the dreaded algorithm plays an increasingly powerful role in shaping our cultural and artistic experiences. From the content we consume on streaming platforms to the news and opinions we encounter on social media, much of what we engage with online is curated by complex recommendation systems designed to maximise engagement and profit.

And that can't be good, right?

My suspicion is that whilst all this apparent personalisation offers us convenience and tailored recommendations, it also raises profound questions about the impact on our knowledge, creativity, self-expression, and ability to make authentic choices.

As we outsource more and more of our cultural decision-making to AI-driven platforms, we risk homogenising our collective imagination, reinforcing echo-chambers, and eroding the common ground necessary for a thriving, diverse public discourse.

So what do these algorithms actually do? What are the implications of algorithmic curation on society, and our fundamental sense of human agency?

I had a think and came up with some ideas.

Humour me!

To me, much of the online world we inhabit feels carefully curated to suit our tastes, preferences and attention spans. Streaming services like Netflix and Spotify use machine learning to anticipate what films, series and songs we're likely to enjoy. Facebook, Twitter (and the loathsome TikTok) flood our feeds with posts calculated to maximise engagement time and interaction. Google and YouTube serve up search results and recommendations designed to keep us clicking and consuming.

We've all been there, right?

The girl on the floor does not exist.

On one level, the promise of personalisation holds some obvious allure. It seemingly puts our interests front and centre, promising to save time wading through content that doesn't resonate.

However, it's likely that "algorithmic curation" fundamentally limits and shapes individual choice and collective creativity in worrying ways.

Worrying to me at least.

These "unseen" logical structures directing our digital lives must raise some pretty profound questions around authenticity, innovation, diversity and human agency itself.

What are the consequences when dispassionate mathematics increasingly mediates our access to culture and the scope of creative possibility?

The mind boggles. Let's dig a little deeper here.

Algorithms are Invisible Curators.

At their core, content recommendation algorithms used by streaming platforms, social networks and search engines aim to reverse engineer and codify taste, in order to maximise time spent on a given service.

Uh oh, I can hear Pierre Bourdieu's footsteps coming down the garden path...

Using a combination of behavioural data (e.g. viewing histories, likes, comments), metadata tagging and filtering techniques, they build complex mathematical models for anticipating which films, songs, books, posts or products a given user (you know, me and you) or demographic is statistically likely to consume and enjoy.
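To make that a bit more concrete, here's a toy sketch in Python of item-based collaborative filtering, one of the classic techniques behind "people who watched X also watched Y". Everything here is invented for illustration (the users, the films, the tiny ratings table), and real platforms operate at vastly greater scale with far richer signals, but the core move is the same: infer what you'll like from what people like you already consumed.

```python
# Toy item-based collaborative filtering: recommend the films that
# people with similar viewing histories also liked. All data invented.
from math import sqrt

# Rows = users, inner keys = films; 1 = watched and liked.
ratings = {
    "you":   {"A": 1, "B": 1},
    "user2": {"A": 1, "B": 1, "C": 1},
    "user3": {"B": 1, "C": 1, "D": 1},
}

def cosine(item_x, item_y):
    """Similarity between two films, based on overlap in who liked them."""
    fans_x = {u for u, r in ratings.items() if item_x in r}
    fans_y = {u for u, r in ratings.items() if item_y in r}
    if not fans_x or not fans_y:
        return 0.0
    return len(fans_x & fans_y) / sqrt(len(fans_x) * len(fans_y))

def recommend(user):
    """Rank unseen films by their similarity to the user's history."""
    seen = set(ratings[user])
    catalogue = {i for r in ratings.values() for i in r}
    scores = {i: sum(cosine(i, s) for s in seen) for i in catalogue - seen}
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("you"))  # ["C", "D"]: C overlaps more with your history
```

Note that nothing in there asks whether film C is any *good*, only whether it resembles what you already watched. That's the whole worldview of the machine.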

Don't you just love the idea of being your very own demographic? Sounds cool, right?

However, despite their pervasive influence, algorithms' role as cultural curators often goes unnoticed and unexamined. We just don't think we are being framed. User experience design psychologically habituates us to accept top recommendations as authoritative, while statistical personalisation provides us with an "illusion of choice" obscuring how parameters are being set.

Which is one way of saying that we increasingly "outsource" taste-making to proprietary mathematical models that we can't audit or even understand.

Good for us! Not.

As a result, just as traditional museum and gallery curation shapes art history's canon based on particular institutional values, the "algorithmic canon" of culture centred around Silicon Valley definitely (and some would say profoundly) delimits the creative works and ideas we're exposed to.

Algorithmic curation clearly serves profit-driven corporate interests rather than public cultural enrichment.

I had to read into all this, but it turns out that recommendation and filtering systems are inherently engineered to maximise key engagement metrics - like watch time, listens, likes, shares and conversion rates. Unsurprisingly I suppose, this leads to design choices prioritising compulsively clickable, easily digestible and ideologically comforting content over more complex, challenging or nuanced work.

The result is an attention-economy incentive structure systematically biased toward uncritical consumption and virality* over reflective engagement and aesthetic diversity.

(*Please don't mistake this for "virility". LOL.)

She exists purely as an algorithm.

Share-ability vs. Innovation.

The algorithmic emphasis on maximising share-ability must have profound impacts on the commercial viability of risky, experimental and boundary-pushing creative projects. Yeah?

Because AI-based recommendations extrapolate future behaviour from past data, they inherently privilege familiarity over novelty and what's popular over what's niche.
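This "rich get richer" dynamic is easy to show in miniature. The little simulation below (all numbers made up, obviously) ranks a two-item catalogue purely by past engagement, and assumes most users simply take the top slot. Watch what happens to an early ten-play lead.

```python
# A minimal illustration of popularity bias: rank purely by past
# engagement, let most users click the top recommendation, and an
# early lead compounds while the niche item never surfaces.
import random

random.seed(42)
plays = {"proven_hit": 100, "niche_gem": 90}  # hypothetical catalogue

for _ in range(1000):
    ranked = sorted(plays, key=plays.get, reverse=True)
    # Assume 90% of users take the top slot; 10% explore at random.
    choice = ranked[0] if random.random() < 0.9 else random.choice(ranked)
    plays[choice] += 1

print(plays)  # the initial 10-play gap has ballooned into the hundreds
```

No malice required, and no editorial decree that the niche gem is worse. The loop alone does the burying.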

I'd call this "algorithmic shaping", for want of a better term.

Example: In order to boost subscribers, Netflix has clearly shifted investment away from auteur-driven indie dramas, toward blockbusters, sequels and reboots with proven templates. A brief Google search (oh, the irony) confirms that Netflix's top performing original films are mostly star-driven action movies and rom-coms specifically optimised based on viewing data models.

How desperately sad is that?

This lower-risk, imitative pipeline logic filters upward, with recommendation-driven mega-hits like Bird Box and Extraction spawning scores of derivative copycats.

Rinse, repeat...

While recycling proven properties minimises risk, it surely lowers the ceiling for artistic innovation. We end up drowning in what cultural critics call a "monoculture" of endlessly iterated narrative franchises and stylistic formulas engineered to feed the algorithm's penchant for recognisability and virality.

Experiments in form and theme struggle to surface amid content designed to mirror and validate prior consumption patterns.

I'd argue that the (unfortunate) result is pervasive cultural repetition rather than paradigm-shifting novelty.

She knows you better than you think.

On Echo-Chambers and Shrinking Common Ground.

The thing is, this share-ability driven calcification of style and subject matter often intersects problematically with political polarisation and the construction of online echo-chambers.

Social media algorithm design amplifies simplistic sloganeering, extreme partisanship and conspiracy theories over and above nuanced discussion because such content reliably sparks quick interaction (e.g. upvotes, argumentative comments, links).

Over time, positive feedback loops emerge where people are disproportionately exposed to emotive, ideologically-charged content catering to their existing beliefs.
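The narrowing effect of such a loop is brutally quick. Here's a deliberately crude sketch (topics, weights and the boost factor are all invented) of a feed that nudges its weighting toward whatever the user reliably clicks.

```python
# Sketch of an engagement feedback loop: the feed reweights toward
# whatever the user interacts with, so exposure narrows session by
# session. Topics and the 1.3x boost are invented for illustration.
weights = {"politics_left": 1.0, "politics_right": 1.0,
           "arts": 1.0, "science": 1.0}

def share(topic):
    """Fraction of the feed currently devoted to a topic."""
    return weights[topic] / sum(weights.values())

# Suppose the user reliably engages with one partisan topic, and the
# system boosts its weight a little after every session.
for _ in range(20):
    weights["politics_left"] *= 1.3

print(f"{share('politics_left'):.0%}")  # climbs from 25% toward 100%
```

Twenty sessions of modest boosts and the feed is effectively a monologue. Nobody chose the bubble; it accreted.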

This not only pushes partisan thinkers further apart, but dramatically shrinks common cultural and informational touchstones between groups. With less shared media exposure, collective sense-making erodes, and mutual distrust metastasises.

These "filter bubbles" don't just impact factual news and political discourse, but the stories and symbols societies use to understand themselves.

Time for another hastily plucked example. As the Left / Right wing divide widens, it's exacerbated by algorithms increasingly serving up entirely distinct streams of information, films, TV shows, books and music to Left and Right audiences. Whereas the mass hits of previous eras created connective cultural tissue and opportunities for mutual understanding, we now occupy increasingly disjointed aesthetic worlds reflecting particular subcultural values - whether the social-justice inflected content pushed to liberals or the more masculine, laissez-faire messaging targeted at conservatives.

I don't know, I may be wrong.

But over time these divergent exposure patterns calcify distinct aesthetic norms and narrative templates largely legible only within ideological in-groups. Left and Right consumers don't just disagree on government policy, but inhabit different universes of cultural meaning-making and expression, with creative works on both sides optimised to affirm, rather than complicate, existing worldviews.

The common stories and emotional bonds vital for bridging divides slowly erode, diminishing empathy across lines of difference.

Oh dear.

Addicted to information.

The Challenge of Choice.

Stepping back for a moment, and getting a bit more abstract on yo asses, this rise in digital curation must be shifting core notions of individual agency and choice itself.

I mean, how do we know our cultural preferences are truly our own and not artefacts of algorithmic influence?

In a world of ubiquitous recommendation systems shaping default options and perceptions of possibility, "free choice" is already constrained by parameters we don't set ourselves.

Every time Netflix autoplays a new series after a binge session, or YouTube's sidebar tempts us with an endless stream of similar clips, we're reminded of the algorithms' growing power to delimit the horizons of subjective experience - often without us noticing.

On a deeper, philosophical level, algorithmic micro-targeting and the automatic personalisation of information fundamentally undermines the notion of the autonomous, rationally self-directed individual.

We are all doomed.

If the media we consume is always already curated to suit our impulses by automated systems beyond our control or understanding, in what sense can aesthetic choice be fully "free"?

Do we discover new content, or does it discover us?

Despite definitely not being a parent, I dare say we should be especially concerned about the impact of all this on our children and "digital natives."

(My niece, Jenny, by the way, can remember when the internet had wires!)

Whereas previous generations' early cultural experiences and exposure to art was primarily mediated by parents, teachers, librarians and other gatekeepers, for many kids today those initial encounters are heavily driven by automated recommendations on YouTube Kids, Netflix Kids, and kid-focused video games.

For me, this raises troubling questions about tech platforms' outsize influence over the formative development of taste and creative sensibilities.

To what extent will the next generation's cultural consciousness reflect the incentive structures of advertising-driven algorithms rather than diverse adult guidance and curation?

Mull that one over!

Feed Addict.

Put it this way - the growing power of algorithms to imperceptibly shape production and cultural consumption raises some rather pressing social questions.

Unaudited AI curation engines designed to maximise engagement metrics, like virality and watch time, inevitably privilege simple, emotively-charged and ideologically validating content over more complex, nuanced or challenging creative works.

The result is a lowest-common-denominator "monoculture" of aesthetic repetition and political polarisation rather than bold experimentation and good faith dialogue.

At the same time, the manipulation of automatic recommendations fuelled by vast data collection erodes individual agency to make free, self-directed choices in cultivating personal taste. Default options set by "Big Tech" threaten to constrain the very horizons of expressive possibility, especially among digital native youth increasingly reliant on algorithmic suggestions from birth.

Like I said, we're doomed!

Contending with this disempowering dynamic calls for new forms of algorithmic literacy and pressure for transparent, accountable and pro-social recommendation system design.

We should all think more deeply about the values and outcomes we want our curatorial infrastructure to promote - from meaningful innovation to privacy to inclusive representation to child wellbeing.

And we need models for empowered human choice and intentional engagement that don't just accept AI's frictionless defaults.

Cue drumroll.

Ultimately, cultural production has always been vital for helping us imagine and communicate alternative possibilities for living together. IMO, ensuring this function endures in an era of algorithmic everything is essential.

The recommendation systems we build today to discover, distribute and monetise human creativity will shape the stories and ideas we're able to think and feel with tomorrow.

We owe it to ourselves to make them not just responsive, but responsible and equitable from the ground up.

Dug that? Dig this:

The Semiotic Significance of Everyday Life.
Is being good at maths essential? Can semiotics provide us with a structured approach to understanding the world that maths cannot? Or do the two work hand in hand? Let’s find out in The Semiotic Significance of Everyday Life: By someone who was never good at maths.