Addictive Algorithms
Jhonatan Jimenez
One major issue with addictive algorithms in social media is that they are designed to maximize user engagement by prioritizing emotionally charged or highly stimulating content. This can lead to excessive screen time, reduced attention spans, and negative mental health effects such as anxiety or low self-esteem. These algorithms often reward content that keeps users scrolling rather than content that is informative or beneficial.
Some best practices to mitigate these issues include increasing algorithmic transparency, implementing time-use reminders or usage limits, and designing algorithms that prioritize well-being over engagement metrics.
Samuel Emile
One major concern is the impact on children and teenagers. Spending excessive time on social media has been linked to anxiety, depression, and low self-esteem. Algorithms also tend to push emotionally charged or extreme content because it increases engagement, even if that content is harmful. In some cases, young users may be exposed to unsafe recommendations, including content related to self-harm or eating disorders. To help mitigate these issues, there should be stronger protections in place. For example, platforms could disable addictive features like autoplay and endless scroll by default for minors. There should also be clearer transparency about how recommendation algorithms work. Parents should have better control tools, such as customizable content filters and screen time limits. Legislation like the Kids Online Safety Act (KOSA) is a step in the right direction because it shifts responsibility back to tech companies to design safer platforms. Overall, addictive algorithms prioritize engagement and profit over user well-being, especially for younger users. Addressing this issue will require a combination of policy changes, ethical design practices, and increased public awareness.
Mehnaz Barsha
Addictive algorithms are designed to maximize engagement by repeatedly showing highly engaging content, such as trending posts that users are already likely to watch or interact with. Because this content is shown over and over, it can keep users scrolling longer than they intend to, forming unhealthy usage habits. For young people, this repeated exposure can affect overall mental well-being over time.
One good practice to mitigate these issues is implementing screen-time limits that cap how long users spend on an app, helping reduce overexposure to these platforms. Establishing clear boundaries is important, as it increases awareness of time spent online and supports more intentional social media use.
Dylan Moutinho
Most social platforms use algorithms designed to maximize engagement through variable reward schedules, infinite scroll, and personalized feeds that trigger dopamine responses. This often leads to excessive screen time, reduced productivity, sleep issues, and increased anxiety, especially among younger users. These systems also create filter bubbles and amplify divisive content.
Mitigation Strategies:
1. Set time limits using built-in screen time controls
2. Disable non-essential notifications
3. Use apps intentionally with specific purposes, not mindless scrolling
4. Curate your feed—unfollow negative triggers
5. Create phone-free zones (meals, bedrooms, first/last hour)
6. Use browser versions instead of apps when possible
The goal is conscious usage rather than being controlled by the platform.
Jun Li Lin
I don't really use social media a lot, but I do watch YouTube, and I have found their recommendation algorithm to be extremely aggressive about pushing videos. One time a new video popped up in my recommendations, so I clicked on it to see what it was about. After watching halfway through, I decided to check out the channel the video came from; finding that I wasn't really interested in their content and topics, I clicked off to find other videos to watch. Immediately, my YouTube home screen was covered with videos on that one topic from several other channels. Judging from this, if someone watches videos that promote a certain view or opinion, they can easily fall into an algorithmic echo chamber where they are only fed content on that topic or view.
A lot of modern video games employ FOMO through things like battle passes or limited-time events to keep players playing, or else they might miss out on big events. Most games also have loot boxes or other reward systems, which amount to gambling marketed toward children. I believe it's not the internet's job to babysit kids and keep them safe; parents should be responsible for their own kids rather than shoving a tablet in their face and calling it a day. Still, I do think getting rid of things like loot boxes or addictive algorithms would be good for everyone involved (except maybe the companies, but they make enough money already).
Charles Murphy
Issues on addictive algorithms in social media include variable reward systems, FOMO, and emotional amplification.
Variable reward systems closely mimic the same behaviors and experience of gambling, except this is an issue for users while scrolling on social media. Rewards come in the form of favorable clips, images, and videos and foster a heightened reliance on technology by triggering dopamine hits. Taking a break from social media can be key to mitigating this issue because if you are prone to gambling, then social media might take a lot of your time away from you. Just like abstaining from gambling can be a cure, abstaining from social media can also be another remedy for time management issues.
Fear of Missing Out, or FOMO, is a form of anxiety about being left out, which causes users to continually seek content that makes them feel part of the group or the mainstream status quo. FOMO can be mitigated by first recognizing that you may be being manipulated by an algorithm that plays on your emotional fears, and then by boosting your self-esteem and declining to follow the main trends that shape your habits online, which in turn makes the FOMO fade.
Emotional amplification is the observation that negative emotions like anger, outrage, and sadness start to take priority over positive ones while using social media, because this type of content provokes more of a reaction than benign, peaceful content. It is part of the algorithm's attempt to maximize your time by playing on these strong underlying human emotions. One way to mitigate this issue is to practice not reacting to content you consume, even when it is designed to be emotionally sensational. Another is to understand that the more sensationalized a post is, the more likely you are to get hooked into scrolling for an exorbitant amount of time.
Elsa Shaikh
Social media apps have a lot of addictive components that contribute to keeping children and teens constantly using social media. Some examples include infinite scrolling and constant notifications. These features remove natural stopping points and lead to continuous use. They are also intentionally designed to trigger dopamine rewards and create habits that cause users to keep coming back. As a result, users may lose track of time, develop compulsive behaviors, and experience increased anxiety. Instead of prioritizing well-being, many platforms design their apps to maximize engagement and profit, raising serious concerns about their impact on users, especially minors. To help with these issues, social media platforms should implement some sort of usage reminder/summary to make users aware of how long they are spending on the app. Also, these platforms should be more transparent about how their algorithms work to protect younger users and help them develop better habits.
David Flores
One of the issues I see with addictive algorithms in social media is targeted ads. While scrolling, a user might encounter an ad for something they like or want; they could ignore it or sit through the whole thing, but in the cases where the user gets hooked by the ad, they fall into another addictive loop: online shopping. The way these ads appear in one's FYP is a concern, as is how much user data is collected on their activity.
Returning to the loop of online shopping: much like the addictive algorithms on social media, it tracks which items the user clicked on and for how long, and the user later sees similar products while browsing. This makes things worse, since online shopping now lets you purchase a product in three clicks or fewer, removing the deliberation over whether to buy the item at all.
William Socci
The addictive patterns in technology are evident when you scroll. After scrolling for an hour, you find yourself mentally drained and foggy, taking a couple of seconds to snap back to reality. Looking away from your phone almost feels like dragging your face out of a black hole.
I think an easy example is the generation of "iPad kids" being brought up now. A child is bored or too energetic, so an iPad gets stuck in front of them, and they turn into a zombie; then, when you try to take the iPad away, they scream and get violent. That reaction is scary when you think about the mind games and addictive behavior the technology produces. Children show humans at their bare bones because they haven't learned to hide their emotions yet, which means this is how we are all reacting at our core when using technology.
There is no easy way to mitigate this because it truly is addictive. The most effective way would be downgrading your phone to just the bare essentials and scrapping social media and games altogether. But a lot of people aren't willing to do that, so instead you could use limiters for specific apps. I tried that, but I could simply cancel the limit, and it became a habit where I canceled it in a second and moved on as if it were never there.
I genuinely believe the best way to mitigate it is not to start at a young age: don't allow technology until age 15, and from there, have laws that actually restrict social media usage and lock those apps after 3 hours in a day, or less.
Jude Duperval
Social media’s addictive algorithms present a paradox that’s easy to overlook. These systems are contrived, leveraging principles long used in marketing, economics, and behavior analysis. Human tendencies are well-documented, and various industries have used such understandings to forecast and influence behavior. Social media platforms have merely been further proof of their consistency in terms of application. Algorithms are intentionally designed to align with the human psyche in ways that attempt to maximize engagement, often for the producer’s 'benefit' rather than the consumer’s 'well-being'.
Mitigating these issues is challenging precisely because the platforms are continuously optimized to keep users hooked. While strategies such as setting screen-time limits, reducing recommendations, disabling notifications, and taking deliberate breaks can assist, they don’t change the underlying processes. The system, as it currently is, will always push to reclaim your attention. Ultimately, it becomes a personal decision about what one’s willing to compromise, and how much autonomy one wants to reclaim from platforms designed to minimize it.
Dante Pruitt
I found that most users get stuck in a content loop that causes them to constantly check social media, either out of fear of missing out or simply out of addiction, because content is constantly hand-fed to them by algorithms. This is especially prevalent among kids who have access to the internet. There are acts like KOSA that aim to require opt-outs for algorithmic feeds and autoplay, as well as parental controls.
Olivier Jean Pierre
One issue is that AI-powered algorithms push harmful content onto users. These systems are designed to keep the user on the app as long as possible and to maximize engagement. As mentioned in the article above, the algorithm may push targeted content such as harmful diet strategies, extremist material, or mental health content. Users are fed this content over and over, which eventually leads to anxiety and depression. One way to reduce this problem is to make algorithms transparent and understandable for the common user, add warnings for sensitive content, etc.
Another issue is the link between social media and the increased risk of depression. The article states that among adolescents, there was a 13% increase in the incidence of depression for every additional hour spent on social media. This is important because AI-powered algorithms, which are made to keep you on the app, play a role in indirectly increasing your risk of depression. One solution to this problem is to place screen-time limits or reminders.
The last issue is that addiction strengthens the brain's reward and habit systems, such as the basal ganglia and the amygdala, while weakening the prefrontal cortex. AI social media algorithms take advantage of these systems by constantly pushing content that will keep users hooked on the app. To mitigate this, I think setting limits on social media, as in the previous issue, would be a great option. Once the addiction is there, it's best to de-load yourself and prevent it from becoming worse.
A bit more detail on the last issue, quoted from the article (optional reading):
"Structurally, the basal ganglia, amygdala, and prefrontal cortex are critical in developing and maintaining addictive behaviors. Dysregulation in these three regions can enhance the focus on incentives and weaken executive controls such as decision-making and regulation of actions, emotions, and impulses [30]. Internet addiction has increased grey matter volume in the bilateral putamen and right nucleus accumbens and decreased grey matter volume in the orbitofrontal cortex, a part of the prefrontal cortex [31]"
Marlon Yaucan
Addictive algorithms have many negative effects. They can shorten attention spans, reduce productivity, and trigger addictive dopamine releases. These algorithms are designed to keep users glued to their screens to make money, while creating serious mental and physical health effects. Some good practices to mitigate the issue are to set a screen-time limit on social media apps and, if you find yourself still racking up high screen time, to delete your social media apps.
Cole Reynolds
The major issue I see with social media algorithms is that they prioritize engagement over accurate or meaningful information. Because these systems are designed to maximize clicks, comments, and shares, they tend to amplify emotionally charged content. Nothing drives engagement faster than anger, which is why rage bait has become so common.
When we reward lowest-common-denominator outrage content, it encourages creators to post increasingly extreme or misleading material just to provoke reactions, leading to a cycle I'm sure we've all seen: first outrage, then backlash, then backlash to the backlash. It's genuinely exhausting to see it happen again and again.
To address this, platforms should shift away from pure engagement metrics and incorporate quality-based ranking signals, increase transparency about how content is promoted, and reduce the constant algorithmic amplification of low-information content that exists simply to drive a backlash cycle.
Unfortunately, the current political climate has led to platforms deprioritizing or outright removing objective fact-checking because reality has a well-known liberal bias.
The problem is that it's far easier for platforms to pander to power and learn to like the taste of leather than to attempt to prevent the erosion of a shared objective reality, and I don't see any solutions with the current incentive structures in place.
Matthew Fletcher
I think there are several serious issues with addictive algorithms. The first issue that comes to mind is the spread of harmful content. For example, a few months ago, there was a trend where parents recorded videos of themselves trying to “teach” their child to crack an egg, but instead they would smack the egg on the child’s head. This could be harmful to the child. Because some people thought these videos were “funny,” the algorithms pushed them to more users, which encouraged others to participate in the trend. As a result, harmful behavior spread quickly simply because it generated engagement.
Another issue with addictive algorithms is the spread of misinformation. For example, some people posted “life hack” videos claiming that if you freeze an egg overnight, you can peel it in the morning, slice it, and make small fried egg snacks. However, when eggs freeze, they expand and often crack, which can allow bacteria to enter. This can potentially make people sick. When algorithms promote these videos to hundreds or even thousands of users, misinformation spreads rapidly and can lead to real-world consequences.
The final issue I can think of is the effect addictive algorithms have on mental health. These algorithms curate content that often promotes idealized lifestyles. Constant exposure to this type of content can cause users to compare themselves to unrealistic standards, which may lead to lower self-esteem, envy, and even depression.
I think one possible solution would be to reform these addictive algorithms. For example, social media platforms could be required to reset users’ algorithms on a regular basis so that people are not continuously pushed deeper into a specific content “rabbit hole.” This might help reduce the spread of harmful trends, misinformation, and negative mental health effects.
Jean Lara
Addictive algorithms used by platforms like TikTok, Instagram, YouTube, and Facebook are designed to maximize user engagement. They analyze behavior such as watch time, clicks, pauses, and interactions to continuously personalize content in ways that keep users scrolling, often by leveraging psychological mechanisms like variable reward schedules and dopamine-driven feedback loops. While effective for revenue generation, this design can negatively impact mental health by increasing anxiety, depression, and reduced attention span, especially among young users. It can also amplify misinformation, polarization, and emotionally extreme content, because algorithms optimize for engagement rather than truth or well-being. To reduce harm, best practices include implementing transparent recommendation systems, allowing chronological feed options, limiting infinite scroll and autoplay features, incorporating time-use reminders and default screen-time caps for minors, auditing algorithms for bias and harmful amplification, prioritizing well-being metrics alongside engagement metrics, strengthening digital literacy education, and establishing regulatory oversight that requires accountability and data transparency from social media companies.
Fabricio Miranda
I have just learned that many modern algorithms have predictive intent. This means they can tell what state of mind you are in based on things most people don't tend to think about, like how fast you're scrolling, how long the screen has been paused, and in some cases the videos you share with your friends. Using this data, the algorithm feeds you videos that match your mood or emotional state, which makes it extremely difficult to get off the app. The best way to prevent this is simply to stay away from your phone: out of sight, out of mind.
Soo Hee Min
According to a January 27, 2026 ABC7NY report regarding social media’s addictive algorithms, TikTok settled just before the representative trial for youth addiction claims was set to begin. The lawsuit alleged that the companies were “borrowing heavily from the behavioral and neurobiological techniques used by slot machines and exploited by the cigarette industry, with the Defendants deliberately embedding an array of design features in their products aimed at maximizing youth engagement to drive advertising revenue.”
The fact that the settlement was reached just before trial demonstrates the seriousness of the issue, showing that it extends beyond mere social debate and requires the examination of actual legal liability. Since an unfavorable ruling in the representative trial could have had repercussions across similar litigation, the settlement may be interpreted as a strategic move to minimize legal risk and reputational damage.
To mitigate this problem, I believe we need fundamental design changes that limit addictive features such as infinite scrolling and autoplay. Second, there should be greater transparency and clearer choices regarding how recommendation algorithms operate. Third, policies should provide users with practical tools to monitor and control their usage time.
Ishtiaq Alam
Addictive social media algorithms are designed to maximize engagement by promoting emotionally charged content and using features like infinite scroll and constant notifications, which can encourage compulsive use and negatively impact mental health, especially for younger users. Best practices to mitigate these issues include more ethical platform design (such as limiting autoplay and notifications), increased transparency and user control over feeds, stronger policy oversight, and encouraging individuals to set screen-time limits and practice intentional online use.
Nadia Brown
One issue caused by addictive algorithms, particularly those employed by social media companies like TikTok and Meta, is the impact on mental health. Social media has been linked to increased risk of depression, anxiety, loneliness, self-harm, and suicidal thoughts. This is because social media can promote negative experiences. For example, users might compare their lives to the lives of others online and fear that they are not having rewarding experiences. This is commonly referred to as Fear of Missing Out (FoMO).
FoMO can trigger feelings of inadequacy or anxiety, and it can lower levels of life satisfaction. Surprisingly, FoMO has also led to increased social media usage, because users become addicted to the occasional dopamine boost; social media activates the reward center in the brain.
To reduce the negative impact of social media, users can start by turning off nonessential notifications. They can also set screen time limits and uninstall apps from mobile devices. Uninstalling apps from devices will introduce friction between users and social media sites. This will help reduce social media usage because of the extra steps required to sign on.
Zahra Qureshi
Social media platforms like TikTok, Instagram, and YouTube use powerful recommendation algorithms designed to maximize user engagement. These systems track behavior, such as watch time, likes, and clicks to personalize content and keep users online longer. However, this often leads to the amplification of emotionally intense, misleading, or harmful material because it generates stronger reactions. For children and teens, features like infinite scrolling, autoplay, and push notifications encourage compulsive use, increasing risks of anxiety, depression, poor sleep, and unhealthy social comparison.
To mitigate these issues, platforms should implement stronger default protections for minors, such as disabling autoplay and limiting algorithm-driven recommendations. Greater transparency about how content is curated would also help users understand how their feeds are shaped. In addition, promoting digital literacy, setting screen time limits, and encouraging open conversations about online experiences can reduce harm. Creating safer digital spaces requires responsible technology design alongside active involvement from parents, educators, and policymakers.
Elvis Chen
This week, I am focusing specifically on the auto-scroll and infinite scrolling feature on Instagram and how it contributes to addictive algorithmic design. Unlike traditional media where content has a clear stopping point (such as the end of a page or episode), Instagram’s feed continuously loads new posts as users scroll. Because there is no natural break, users often lose track of time and continue consuming content longer than they initially intended.
The platform algorithm further strengthens this effect by prioritizing content that aligns closely with a user’s past behavior—such as posts they liked, shared, or watched longer. When paired with auto-playing videos and personalized recommendations on the Explore page and Reels, the infinite scroll creates a seamless stream of highly tailored content. This design encourages passive consumption and can make disengagement more difficult.
A best practice to address this issue would be implementing optional “natural stopping cues,” such as reminders after extended scrolling or periodic pauses that prompt users to decide whether they want to continue. Instagram could also make chronological feed options more accessible and provide clearer controls to disable autoplay features. Encouraging transparency about how the algorithm curates content would allow users to better understand how their engagement patterns shape what they see.
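The kind of "natural stopping cue" proposed above can be pictured as a simple session timer: once scrolling has continued for some stretch without a real break, the app surfaces a "do you want to keep going?" prompt. Below is a minimal, hypothetical Python sketch of that logic (the thresholds and the `ScrollSessionTracker` name are my own assumptions; Instagram's actual implementation is not public):

```python
REMINDER_AFTER = 20 * 60   # prompt after 20 minutes of continuous scrolling (arbitrary choice)
IDLE_RESET = 5 * 60        # a 5-minute pause counts as a break and resets the session (arbitrary)

class ScrollSessionTracker:
    """Tracks continuous scrolling and decides when to show a stopping cue."""

    def __init__(self):
        self.session_start = None  # when the current unbroken session began
        self.last_event = None     # timestamp of the most recent scroll event

    def on_scroll(self, now: float) -> bool:
        """Record a scroll event at time `now` (seconds); return True when a
        reminder should be shown because the session has run long enough."""
        if self.last_event is None or now - self.last_event > IDLE_RESET:
            self.session_start = now  # long pause: start a fresh session
        self.last_event = now
        return now - self.session_start >= REMINDER_AFTER
```

The key design point this illustrates is that the cue keys off *uninterrupted* use, so a user who puts the phone down for a few minutes starts fresh, while a continuous scroller gets a nudge.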
Overall, while infinite scrolling improves convenience and personalization, its design can promote excessive use. Ethical design changes that prioritize user well-being would help reduce the addictive impact of the auto-scroll feature.
David Lin
Algorithms control our day-to-day lives when it comes to social media consumption, online shopping, and a wide array of content that seeks to capture the user's attention for profit. In the 1970s, Herbert A. Simon, a computer science and economics researcher, discussed the issues surrounding the "attention economy" in a paper titled Designing Organizations for an Information-Rich World. He observed that "a wealth of information creates a poverty of attention and a need to allocate that attention efficiently". Over 50 years later, his message is more relevant than ever. With the transformation of data in the 2000s and 2010s from structured data into the big data we are more familiar with on social media platforms, the explosion of information has far exceeded what Simon and others may have predicted back in the 1970s. According to a Rivery article discussing big data statistics, the amount of data in our "datasphere" likely exceeds 175-200 zettabytes, with a single zettabyte equating to 1 sextillion bytes (1,000,000,000,000,000,000,000). Researchers suggest this volume of data is projected to nearly double every four years. Although the global population has also grown sharply thanks to the development of technology, that growth pales in comparison to the amount of data we have created. With so little "digital real estate" to capture and so much data demanding your attention, algorithms are necessary to ensure the precious resource of attention isn't wasted. Addictive algorithms mainly do two jobs: filtering the useless garbage data generated on the internet every day (especially with the rise of A.I.), and creating a feedback loop that keeps the user addicted.
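To put those figures in perspective, here is a small back-of-the-envelope projection using the numbers quoted above (roughly 200 ZB today, doubling about every four years; both are ballpark estimates from the cited article, so the outputs are illustrative only):

```python
ZETTABYTE = 10 ** 21  # 1 zettabyte = 1 sextillion bytes

def projected_zettabytes(current_zb: float, years: float, doubling_years: float = 4) -> float:
    """Compound growth: the data volume doubles every `doubling_years` years."""
    return current_zb * 2 ** (years / doubling_years)

if __name__ == "__main__":
    # Starting from ~200 ZB, project the datasphere out over the next 12 years.
    for years in (0, 4, 8, 12):
        zb = projected_zettabytes(200, years)
        print(f"in {years:2d} years: ~{zb:,.0f} ZB ({zb * ZETTABYTE:.2e} bytes)")
```

Under that doubling assumption, the datasphere would roughly quadruple within eight years, which is the scale of growth that makes attention-allocating algorithms unavoidable.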
Metrics such as timeliness, virality, and user engagement, along with personal identifiers, are fed into refined algorithms with the capacity and reach of those owned by big data companies, creating a dystopian feedback loop that is necessarily predatory in order to generate profit for shareholders and investors. Nowadays, many social media giants are being scrutinized for the effectiveness of their algorithms, which have been refined to the point where they are causing obvious harm to an increasing share of their users. The problem may be more widespread than we initially assumed: look no further than the CEOs of large tech companies testifying before Congress in 2020 and 2021 to defend against antitrust and misinformation allegations. Top tech companies are alleged to be actively trying to create a digital monopoly by swallowing smaller competition while maximizing misinformation and disinformation to drive traffic to their websites and keep users addicted, so they can use the precious data generated by users' attention to keep them locked in. In a New York Times article titled TikTok’s ‘Addictive Design’ Found to Be Illegal in Europe, Satariano describes how the E.U. is pushing for widespread social media reform, backed by the threat of fines and bans if its demands aren't taken seriously. These concerns particularly target teenagers and kids, who are more susceptible to the time and attention sink that these algorithms create. The article details a Pew Research Center report that found "16 percent of American teenagers said they were on TikTok 'almost constantly'". The protection, and sometimes outright banning, of children from social media is not an interest specific to the E.U.; the article discusses how many other countries are also looking to criminalize the spread of harmful information and the addictiveness of algorithms. Of course, these algorithms are not only harmful to younger users but to older ones as well.
A paper published in Frontiers in Psychology, titled Regulating addictive algorithms and designs: protecting older adults from digital exploitation beyond a youth-centric approach, suggests that blanket reform may not work the way we think when these laws are applied to an underrepresented portion of the community that doesn't have as strong a voice. Clearly, widespread social media reform is needed, but it has become increasingly difficult to achieve. We are going through a period of change in which governments want to hijack the user's privacy in order to gain complete social media control in a one-off fix, while tech companies push back as much as possible. Either way, the consumer gets harmed in the process. External applications can control screen time and app usage, and certain schools are banning phone use during classes altogether, but there doesn't seem to be a cookie-cutter solution to the problem at large. In my opinion, we need to be increasingly careful about how we advocate legislation in the social media age, where we must balance safety and privacy. As mentioned last week, education on select topics pertaining to these algorithms is a necessary step to shed light on their potential harm and on ways you can use social media in a healthy way.
Habib Habib
I looked at the Addictive Algorithms page on the Social Media Reform website. It says social media apps use algorithms made to keep people scrolling and watching for a long time. These algorithms often push emotional or extreme posts because they get more clicks and reactions, which can be harmful for kids. It also says children who spend more than 3 hours a day on social media are twice as likely to have anxiety and depression. It also explains that features like infinite scroll, autoplay, streaks, and push notifications can make people feel hooked. Sometimes the algorithm can also suggest harmful content, like self-harm or eating-disorder content.
Best practices to reduce the problem:
Set a daily time limit (1–2 hours).
Turn off push notifications and autoplay.
Keep the phone away at night.
For kids, use parental controls and safer settings.