The project takes an ethical approach, emphasizing how the complex mechanisms that influence our information choices are linked to our privacy and social lives. The focus lies in using design artifacts and storytelling to raise awareness and spark discussion about how the public, regulation, technology, and the market could engage in adjusting the direction of a future platform.
Nowadays, social media dominates our digital lives and information sources. Taking Facebook as an example, over 2 billion people use it every month, and over 50 percent of people use social media as a news source, making the newsfeed the most viewed and most influential part of the news industry.
With information structured like a waterfall, the newsfeed creates a seamless reading experience. This continuous, non-stop exposure has taken control of information away from us, yet how the platform interprets data and how the algorithm makes decisions for users remains a vague concept to the public. The complex mechanism behind the newsfeed is like a "black box": on the one hand, disciplinary barriers make it difficult to understand; on the other hand, since the algorithm does not exist in isolation, even professionals cannot form a holistic interpretation.
In the face of today's political climate and trust challenges, there are urgent demands on platforms for algorithmic transparency, yet there is a gap between the emerging claims and a form of transparency that is actually meaningful to users.
Take cookie consent as an example: how many of us ever read it carefully before clicking accept? Even as more explanations and controls are devised, they remain too complex or vague to understand, and users cannot relate their actions to the consequences.
It's an easy claim to say that we want to make the algorithm visible. But as media professor Taina Bucher asks, "When we ask for transparency, what exactly is our goal. And what are we hoping for once it is made visible?"
How can we take the discussion a step further and explore the transparency that really makes sense for users?
The first challenge of the project is to present the abstract, invisible platform algorithms to the public in a perceptible and intuitive way.
1. Media Diet Visualizations
To understand the gap between reality and people's perceptions of their media diets, I conducted 8 user interviews and color-coded 3 users' newsfeed content, revealing three patterns on social media: content ingestion, friends' influence on newsfeeds, and unconscious ad reception. These visual maps reveal the complex dynamics of digital systems and how they relate to us. They provided a basis for scrutiny and reflection among the interviewees and stimulated discussion around awareness of information consumption.
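The color coding itself was done manually; the sketch below, with hypothetical categories and item counts, only illustrates how such a color-coded media-diet map could be assembled in Python with matplotlib.

```python
# A toy media-diet map: each user's feed sample is tagged with a content
# category, then the composition is rendered as a color-coded stacked bar.
# Categories and counts here are illustrative, not the interview data.
import matplotlib.pyplot as plt

categories = ["Friends' posts", "Pages followed", "Recommended", "Ads"]
colors = ["#4c72b0", "#55a868", "#c44e52", "#8172b2"]

# Hypothetical counts of items per category for three sampled feeds.
feeds = {
    "User A": [34, 21, 28, 17],
    "User B": [12, 40, 30, 18],
    "User C": [25, 18, 35, 22],
}

fig, ax = plt.subplots(figsize=(6, 3))
for i, (user, counts) in enumerate(feeds.items()):
    left = 0
    for count, color in zip(counts, colors):
        ax.barh(i, count, left=left, color=color)
        left += count

ax.set_yticks(range(len(feeds)))
ax.set_yticklabels(feeds.keys())
ax.set_xlabel("Items in a sampled newsfeed session")
ax.legend(categories, loc="lower right", fontsize=7)
plt.tight_layout()
plt.show()
```

Laid side by side, bars like these make the share of ads and recommended content visible at a glance, which is what prompted reflection in the interviews.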
2. Library of Control and Review
After benchmarking six mainstream platforms on content, layout, target users, algorithmic mechanisms, and business models, I found a lack of academic research at the user-experience level. To clearly delineate what is controlled by us and what is controlled by the platform, I mapped out a concrete UI library of the control and review functions. These features are then deconstructed along three dimensions (their targets, immediacy, and visibility) and grouped into seven classifications. I made the library an open resource for revealing and tracking how the control the platform provides changes over time.
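As a rough illustration of how a library entry could be structured, the Python sketch below encodes the three dimensions as fields of a small data class. The enum values, the example feature, and the classification label are assumptions for illustration, not the actual taxonomy.

```python
# A sketch of one entry in the control-and-review UI library.
# The dimensions mirror the mapping described above (target, immediacy,
# visibility, classification); the concrete values are illustrative.
from dataclasses import dataclass
from enum import Enum


class Target(Enum):
    CONTENT = "content"        # acts on a single post or ad
    SOURCE = "source"          # acts on a friend, page, or advertiser
    ALGORITHM = "algorithm"    # acts on the ranking system itself


class Immediacy(Enum):
    INSTANT = "instant"        # effect is applied right away
    DELAYED = "delayed"        # effect accumulates over time


class Visibility(Enum):
    SURFACED = "surfaced"      # offered in the feed itself
    BURIED = "buried"          # hidden in settings menus


@dataclass
class ControlFeature:
    platform: str
    name: str
    target: Target
    immediacy: Immediacy
    visibility: Visibility
    classification: str        # one of the seven classes in the library


# Example entry (values are an assumption, for illustration only).
hide_ad = ControlFeature(
    platform="Facebook",
    name="Hide ad",
    target=Target.CONTENT,
    immediacy=Immediacy.INSTANT,
    visibility=Visibility.SURFACED,
    classification="negative feedback",
)
print(hide_ad)
```

Structuring entries this way makes it easy to filter the library, for example to surface every control that is buried in settings rather than offered in the feed.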
3. Expert and Stakeholder Involvement
I involved 5 experts from social media, technology, journalism, and design to bring in insights and understand the distribution of power across the system from different perspectives. I also conducted trend and related academic research by participating in webinars.
#1 Uneven Power Dynamics
While the platform provides little control to users, the newsfeed algorithms change all the time and have become extremely powerful. "As soon as the algorithm behind the recommendation changes, all of our publisher's content undergoes a change." As these changes are opaque and made without notice, the individual user is left in a passive and powerless position within these dynamics.
#2 Hidden Business Agenda
Social media platforms run on an advertising business model that revolves around harvesting personal data to predict and manipulate what people do. The platform opens up far more controls for advertisers to target users with specific content. Individuals, however, are excluded from the trade, even though the data is produced by and about them.
#3 The Trap of Social Relationships
To maximize interaction on the platform, social media continuously scores users' networks and prioritizes content based on intimacy. However, theory and research show that weak ties play a vital role in helping individuals access a more diverse range of information, and that the majority of content on the platform is actually generated by a small number of people.
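The real ranking models are proprietary, but a simplified, hypothetical sketch of intimacy-based scoring shows the mechanism at work: posts from close ties receive higher scores, so weak ties sink to the bottom of the feed.

```python
# A simplified, hypothetical intimacy-based ranking: posts from friends the
# user interacts with most are scored higher, so weak ties sink in the feed.
# This is a sketch of the mechanism discussed above, not the real algorithm.
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    interactions_with_author: int  # proxy for "intimacy" / tie strength
    hours_old: float


def intimacy_score(post: Post) -> float:
    affinity = post.interactions_with_author  # closer ties -> higher affinity
    recency = 1.0 / (1.0 + post.hours_old)    # newer posts decay less
    return affinity * recency


feed = [
    Post("close friend A", interactions_with_author=120, hours_old=5),
    Post("weak tie", interactions_with_author=2, hours_old=1),
    Post("close friend B", interactions_with_author=90, hours_old=12),
]

for post in sorted(feed, key=intimacy_score, reverse=True):
    print(f"{post.author:15s} score={intimacy_score(post):.2f}")
```

Even in this toy version, the very recent post from the weak tie ranks last, which is exactly the dynamic that narrows the diversity of what we read.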
What value can design bring under this theme? Rather than picturing "more" or "better" controls for the current system, I chose to use design as a medium to create room for revealing and discussing different transparency interventions and the systemic changes they cause.
I conducted trend scanning, what-if futuring, and ideation workshops. Over 50 signals were sorted, and three algorithmic mechanisms were selected as typical examples of the system's complex dynamics.
Future Wheel Co-creation Workshop
An important part of the project is speculating on consequences in systems with complex dynamics. I chose the Future Wheel as a tool to co-imagine the consequences of events with users and stakeholders. Participants are invited to take on roles and speculate on the subsequent reactions that would unfold in each scene.
The user interface and user stories serve as artifacts for dialogue and are iterated through feedback. Thirteen users participated in the process of scenario building and testing.
Potential Newsfeed presents three speculative scenarios for Facebook in 2024. It shows the potential of a power shift from the platform to people, but it also challenges users to face the consequences.
1. "Pay to Control"
What if the business trade on platforms were laid bare and people were included in it?
In this future, social media reforms its business model and charges users for data protection. A new divide is built on who can afford to block targeting.
This scenario reveals a new divide based on the ability to decide what to read. Would we be willing to pay for more control? How do we choose between freedom of information, the comfort of the digital experience, and the financial burden?
2. "Content Influencer"
What if your tracked reading-behaviour data were stored locally and freely editable?
In a future where social media platforms no longer own users' behavioural data, new software and free-trade markets emerge that enable users to regulate content recommendations by locally editing their data.
By opening up control over data input, the editorial role is opened to the market. While we transact directly with content producers, are we willing to assume the responsibility of being our own data-security gatekeepers?
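A toy sketch of this speculative mechanism, with an assumed log format and weighting rule, shows what locally editing one's reading-behaviour data could look like:

```python
# A toy sketch of the "locally editable reading data" idea: behaviour logs
# live on the user's device, and editing them changes what is recommended.
# Field names and the weighting rule are assumptions for illustration.
import json

# Locally stored reading-behaviour log (hypothetical format).
local_log = {
    "topic_weights": {"politics": 0.7, "sports": 0.1, "climate": 0.2},
}


def edit_topic(log: dict, topic: str, weight: float) -> dict:
    """User-side edit: directly rewrite how strongly a topic is recommended."""
    log["topic_weights"][topic] = weight
    return log


# The user dials politics down and climate up before sharing the log
# with a recommender of their choice.
edit_topic(local_log, "politics", 0.2)
edit_topic(local_log, "climate", 0.6)
print(json.dumps(local_log, indent=2))
```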
3. "Your Friend is Your News"
What if your friends' influence on your information consumption were revealed?
In this future, a friend's worth also lies in their value as an information influencer. New features are launched for allocating newsfeed content by intervening in the friend-scoring system.
Can we really afford the impact of knowing how our friends, their interests, and their attitudes affect our newsfeeds?
As a speculative project, Potential Newsfeed aims to confront diverse audiences with the complex and deeply interconnected nature of the challenges we face today. The work examines how design can visualize complex systems and facilitate ethical discussion in a concrete and intuitive way. The results have been presented to different communities and professional groups in order to remain open to multiple possibilities and to navigate uncertainty with active hope.
This project does not try to critique social media, nor does it simply describe a utopian future. Social media is still young compared with traditional media. The project attempts to influence its continuous change by raising essential questions about its future role and advocating for the public's part in shaping the platform.