Healthcare AI at a Crossroads: How OSTP and FDA’s New Calls for Comment Could Shape Regulation

Background

Two major federal developments last week signal where U.S. AI policy is headed — and both carry big implications for healthcare innovators.

  • The Office of Science and Technology Policy (OSTP) published a Request for Information (RFI) on reforming regulations that may hinder AI development. Comment deadline: October 27, 2025

  • The Food and Drug Administration’s (FDA) Digital Health Center of Excellence released a Request for Public Comment (RPC) on how to measure and monitor the real-world performance of AI-enabled medical devices. Comment deadline: December 1, 2025

Taken together, these initiatives show that federal agencies are moving past the question of whether to regulate AI and focusing instead on how.

The OSTP RFI: Removing Barriers to AI

What is OSTP Looking For?

OSTP wants stakeholders to 1) identify laws, regulations, or processes that unnecessarily hinder AI technologies; and 2) propose tailored reforms.

The RFI characterizes regulatory obstacles as falling into five categories:

  • Regulatory Mismatches (e.g., outdated laws written for traditional devices).

  • Structural Incompatibility (requirements that don’t align with AI’s continuous learning).

  • Lack of Clarity (uncertainty around how AI fits into existing rules).

  • Direct Hindrance (rules that flatly prohibit AI deployment).

  • Organizational Factors (agency silos or processes that slow down approvals).

Why It Matters for Digital Health Companies

While the RFI applies across industries, healthcare is a major focal point. The RFI advances the directive set by America’s AI Action Plan, released by the current Administration on July 23, 2025, and signals a continued focus by the executive branch on the intersection of AI regulation and innovation. OSTP notes that “regulations for medical devices, telehealth, and patient privacy were designed for human clinicians and discrete medical device updates” rather than continuously evolving AI tools.

This creates opportunities for reforms such as:

  • Safe harbors or pilot programs for AI in clinical use.

  • Conditional approvals that allow innovation while monitoring safety.

  • Alternative regulatory pathways that balance speed with accountability.

Action item for digital health companies and AI innovators: Submitting comments backed by data, examples, and statutory references could help shape the AI framework that will later impact FDA, CMS, FTC, and FCC policies.

Comment deadline: October 27, 2025

The FDA RPC: Monitoring AI in the Real World

What is FDA Looking For?

FDA wants feedback on practical approaches to evaluating AI-enabled devices once deployed. For example, this may include:

  • How to detect performance drift (i.e., when AI tools lose accuracy over time due to changes in input data or user behavior).

  • How to structure mitigation response protocols when drift is detected.

  • What performance metrics and monitoring triggers should be used.

Why It Matters for Digital Health and Healthcare AI Companies

FDA’s emphasis is clear: launch is not the finish line. AI products will be judged on performance across their full lifecycle.

Practical strategies could include:

  • Ongoing algorithm audits tied to clinical outcomes.

  • Quarterly bias testing to ensure fair performance across patient populations.

  • Building drift-detection dashboards to flag anomalies for clinicians.
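To make the drift-detection concept concrete, here is a minimal, illustrative sketch of the kind of check a monitoring dashboard might run. The function name, accuracy figures, and the 5-point tolerance threshold are all hypothetical assumptions for illustration; they are not an FDA-prescribed method or metric.

```python
# Hypothetical performance-drift check: compares the rolling mean of
# recent accuracy scores against the accuracy validated at clearance.
# Thresholds and metrics here are illustrative assumptions only.
from statistics import mean

def detect_drift(baseline_accuracy: float,
                 recent_accuracies: list[float],
                 tolerance: float = 0.05) -> tuple[bool, float]:
    """Flag drift when the rolling mean of recent accuracy scores
    falls more than `tolerance` below the validated baseline."""
    rolling = mean(recent_accuracies)
    drifted = (baseline_accuracy - rolling) > tolerance
    return drifted, rolling

# Example: a model validated at 92% accuracy, monitored monthly.
drifted, rolling = detect_drift(0.92, [0.90, 0.86, 0.82])
if drifted:
    # In practice, this would trigger a predefined mitigation
    # response protocol (e.g., clinician notification, model review).
    print(f"Drift detected: rolling accuracy {rolling:.2%}")
```

In a production setting, the choice of metric (accuracy, sensitivity, calibration) and the trigger threshold are exactly the kinds of questions FDA is asking commenters to weigh in on.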

Action item for healthcare AI companies: Stakeholders should be ready to show that postmarket monitoring and risk mitigation are built into their AI strategy — not an afterthought. This will be crucial for patient trust, investor confidence, and regulatory approval.

Comment deadline: December 1, 2025

Key Takeaways for Digital Health Innovators and Investors

  1. Concrete, evidence-based input is critical. Regulators are asking for real-world use cases, not abstract theory.

  2. Real-world monitoring is the new norm. Expect oversight not just at launch but throughout the AI product lifecycle.

  3. Cross-agency consistency matters. Push for alignment across FDA, CMS, FTC, FCC, and other regulators to reduce conflicting requirements.

What Should Healthcare AI Stakeholders Do Now?

  • Submit comments: If your company has direct experience with regulatory mismatches or performance monitoring, contribute before the October 27 (OSTP) and December 1 (FDA) deadlines.

  • Engage with investors and partners: Highlight how your compliance strategy reduces regulatory risk — an increasingly important factor in valuations.

  • Build compliance into your roadmap: Demonstrate that your team is not only innovating but also planning for long-term safety and accountability.

Conclusion

These dual initiatives mark a pivotal moment for healthcare AI. For digital health companies, providers, and investors, this is both a responsibility and an opportunity: shaping the guardrails now will help ensure a more predictable and innovation-friendly environment in the future.

At Nixon Law Group, we work with digital health and healthcare AI companies to navigate the complex intersection of AI, regulation, and healthcare delivery. If your organization is interested in submitting comments or developing compliance strategies around AI-enabled products, our team can help. Please Contact Us for more information.
