RISE Act 2025: Lummis Seeks AI Liability Shield

Planck

Image source: CoinToday
- The RISE Act aims to clarify legal liability for professionals using AI systems.
- AI developers must disclose model specifications to qualify for liability immunity.

On June 12, 2025, Senator Cynthia Lummis unveiled the Responsible Innovation and Safe Expertise (RISE) Act of 2025 via her official website. The Act aims to clarify liability for AI systems used by professionals such as physicians, attorneys, engineers, and financial advisors. The legislation holds these professionals legally responsible for exercising due diligence, verifying AI outputs, and standing behind their advice.

The RISE Act proposes a "safe harbor" from civil liability for AI developers under certain conditions. To qualify for this immunity, developers must publicly disclose model specifications, including "model cards" and key design details. These disclosures allow professionals to understand an AI system's capabilities and limitations before relying on it, enabling informed decisions. The Act stipulates that model cards must detail an AI system's training data sources, intended uses, performance metrics, known limitations, and potential failure modes. Developers must also update AI documentation and specifications within 30 days of deploying new versions or discovering significant failure modes.

The RISE Act does not offer blanket immunity to AI developers: immunity will not apply in cases of recklessness, willful misconduct, fraud, or knowing misrepresentation, nor when AI is used outside the defined scope of professional usage. This targeted liability reform aims to create predictable federal standards in place of the current patchwork of state-by-state liability rules, which can discourage investment and innovation in AI.

Reactions to the RISE Act have varied. Proponents see it as a way to protect innovation and provide legal clarity, believing it balances innovation with transparency in sectors like finance, law, and healthcare. Critics argue the bill may place too much burden on professionals who use AI tools and that the transparency requirements may not go far enough, noting that developers could opt out of transparency simply by accepting liability. Others worry that transparency demands could stifle creativity and slow innovation, particularly if developers hesitate to disclose intellectual property. Some view the RISE Act as a work in progress that may require modifications; it has received measured support from groups on different sides of the debate over state-level legislation with similar aims.

If passed and signed into law, the bill would take effect on December 1, 2025, and would apply to acts or omissions occurring on or after that date. The legislation aligns with international efforts toward AI governance, such as the European Union's AI Act, though it takes a different approach to liability.
Article Info
Category: Market
Published: 2025-06-22 15:18