Product Liability in a Digital Age
In today’s world, many of the products we use every day don’t come in boxes. They live on our phones, tablets, and computers. Social media platforms, AI chatbots, and mobile apps now shape how we communicate, learn, and spend our time. As these digital tools become more powerful and pervasive, courts are increasingly asking an important question: Should tech companies be held to the same safety standards as traditional product manufacturers?
According to a recent article in Tech Policy, the emerging answer appears to be yes -- at least in some circumstances.
Traditional product liability law holds manufacturers and sellers responsible when their products cause harm. Claims generally fall into three categories: manufacturing defects, design defects, and failure to warn about known risks.
For decades, this framework applied mainly to physical goods. But courts are now recognizing that digital products can also cause real-world harm -- and that those harms may fit squarely within existing product liability principles. Claims involving defective design and failure to warn are increasingly being allowed to move forward against digital platforms, including social media companies, AI developers, and app providers.
Courts Are Treating Certain Digital Features Like Products
In Lemmon v. Snap, parents sued Snapchat after two teenage boys were killed in a high-speed car crash while using the app's Speed Filter, which recorded and displayed the user's speed and which, the plaintiffs alleged, encouraged users to drive over 100 miles per hour in hopes of an in-app reward. The Ninth Circuit allowed a negligent design claim to proceed, focusing on whether the feature's design encouraged dangerous behavior.
In Garcia v. Character Technologies, Inc., a teenager died by suicide after extensive interactions with AI-generated characters. The court ruled that the family could pursue claims not only against the company that created the chatbot, but also against Google, which provided the cloud infrastructure that powered it.
Federal courts have also examined hundreds of lawsuits brought by children and teens against major social media platforms, including Facebook, Instagram, YouTube, TikTok, and Snapchat. The central issue in these cases is whether social media platforms can be treated as products for purposes of liability. Courts have closely analyzed specific platform features, such as parental controls, barriers to deactivation, and filtering tools, to determine whether they function more like a product or like protected speech.
What This Shift Means
From a public policy perspective, this evolution mirrors the origins of product liability law itself. Modern product liability developed during the rise of mass manufacturing, when consumers no longer had the ability to inspect or fully understand the products they were buying. Courts recognized that responsibility for injuries should rest with companies that had the knowledge, control, and resources to design safer products and spread costs through pricing and insurance.
A recent California case, Neville v. Snap, drew this historical parallel directly. Families sued Snapchat after children suffered fatal and near-fatal fentanyl overdoses from drugs purchased through the app. The court allowed the claims to proceed, comparing today's digital economy to the rise of mass manufacturing in the early 20th century. Just as consumers once lacked the power to protect themselves from hidden dangers in mass-produced goods, today's users -- especially parents -- are largely powerless to understand or mitigate the risks embedded in complex digital platforms.
Courts and Lawmakers Are Both Paying Attention
Judges remain careful to consider free speech protections and laws like Section 230, and many claims are still dismissed on those grounds. Importantly, most courts have not yet ruled that social media or AI companies are ultimately liable for specific harms. What they have done is allow families to make their case -- to seek discovery and hold companies accountable through the legal process.
At the same time, state legislatures are exploring ways to strengthen accountability. In California, proposed legislation would increase penalties for large social media companies when negligent platform features harm minors. These efforts do not create new duties; rather, they reinforce the longstanding principle that all companies, including technology firms, owe a duty of reasonable care.
As society moves deeper into the digital era and toward widespread use of artificial intelligence, product liability law may need to continue evolving -- especially to protect vulnerable users like children. While new regulations have struggled to gain momentum at the federal level, court decisions recognizing that digital products can be unreasonably dangerous offer meaningful hope to families who have been harmed.
No family should bear the burden of preventable injuries or loss without answers or recourse. If you or a loved one has suffered a physical injury because of what you believe to be a defective product or a negligent manufacturer, consider contacting a top Philadelphia attorney for product liability cases.