Whistleblowers have come forward to reveal that social media companies, including TikTok and Meta, have compromised user safety in their pursuit of engagement and profit. Internal research at these companies has shown that their recommendation algorithms favor outrage and sensational content, which can accelerate the spread of harmful and violent material. Despite this knowledge, the companies have failed to build adequate safeguards, leaving vulnerable users, particularly teenagers and children, exposed. Intense competition between the platforms has fostered a culture that prizes growth over safety, with potentially devastating consequences: users encounter ever-greater volumes of harmful content, and that exposure can carry serious real-world harms.