GxP Insights: The Purolea Warning Letter: What It Means for AI in GxP Manufacturing


By David Dolman

Executive Director / EVP

Welcome to our monthly industry insights newsletter, tailored for professionals in Medical Devices, Biologics, Cell & Gene Therapy, and Sterile/Aseptic Manufacturing. Each edition delves into a key industry theme, offering expert perspectives and career insights to help you stay ahead.

This month, we examine the FDA’s first-ever enforcement action citing AI use in pharmaceutical manufacturing: what it means, what the rules now look like, and what it signals for careers in quality, validation, and regulatory affairs.


A First in Enforcement: The Purolea Warning Letter

On 2 April 2026, the FDA issued a landmark warning letter to Purolea Cosmetics Lab, one that caught significant attention across the industry. It may be the first enforcement action in which the FDA cited the use of AI itself as a source of cGMP non-compliance.

The facts are stark. During the inspection, investigators found that the company had used AI agents to generate drug product specifications, standard operating procedures, and master production records, and had deployed those outputs without adequate review by qualified personnel. When asked about certain legal requirements, company personnel reportedly told investigators they were unaware of them because “the AI agent didn’t tell them.”

The FDA’s response was clear. AI-generated outputs must be reviewed and approved by an authorized Quality Unit representative before use. Responsibility for cGMP compliance cannot be delegated to an AI tool. Full stop.

This case is a signal to the entire industry: FDA expectations around AI are no longer theoretical, and they are being enforced under existing cGMP frameworks.


The Regulatory Framework Taking Shape

The Purolea letter did not emerge in a vacuum. The FDA has been building formal expectations around AI in drug development for several years, and the pace has accelerated.

In January 2026, the FDA, in collaboration with the EMA, published its Guiding Principles of Good AI Practice in Drug Development, setting out ten principles for responsible AI use. The document responds to a significant increase in drug application submissions that include AI components, spanning the nonclinical, clinical, post-marketing, and manufacturing phases.

On the manufacturing side specifically, the FDA has been developing its position through its FRAME initiative (Framework for Regulatory Advanced Manufacturing Evaluation), under which it published a dedicated discussion paper on AI in drug manufacturing in 2023 and has continued to gather stakeholder input since. Formal manufacturing-specific AI guidance has not yet been issued, but the direction of travel is clear.

However, the Purolea case demonstrates that the FDA does not need to wait for new rules. The letter’s findings were tied to two existing cGMP regulations: 21 CFR 211.22(c), which places responsibility for approving procedures and specifications squarely with the Quality Unit, and 21 CFR 211.100, which requires written production and process controls to ensure product quality.

The message to manufacturers is unambiguous: AI governance in manufacturing is not a future compliance requirement; it is already being tested against standards that exist today. And when dedicated manufacturing guidance does arrive, the bar is only likely to rise.


What “Responsible AI” Looks Like in a GxP Environment

The life sciences industry is not anti-AI. AI is already delivering real value in pharmaceutical manufacturing, from predictive maintenance and environmental monitoring through to process optimization and deviation detection. The question is not whether to use AI, but how to govern it.

Several principles are emerging from the Purolea warning letter and recent FDA guidance:

  • Human-in-the-loop is the baseline: the Purolea case makes this explicit. Under 21 CFR 211.22(c), the Quality Unit is responsible for approving all procedures and specifications that affect product quality. That responsibility cannot be delegated to an AI tool.
  • Validation frameworks need to catch up: traditional Computer System Validation was designed for deterministic software. AI/ML systems behave differently: they adapt, retrain, and can drift over time. The FDA's final Computer Software Assurance guidance, issued in February 2026, signals a shift toward risk-based software assurance, and the expectation is that manufacturers apply the same thinking to AI systems in production and quality environments.
  • AI governance needs to be inspection-ready: the Purolea case is a reminder that inspectors will ask how AI systems are controlled, what they are authorized to do, and who is accountable for their outputs. Companies that cannot answer those questions clearly are exposed.



Skills in Demand

As AI becomes more embedded in manufacturing operations, demand is growing for professionals who can sit at the intersection of quality, digital systems, and regulatory strategy:

  • QA Professionals with AI Governance Expertise: The “human-in-the-loop” requirement means QA teams need to understand AI systems well enough to review and approve their outputs critically, and to build the oversight frameworks that make that possible at scale.
  • Validation Engineers (CSV → CSA): The FDA's Computer Software Assurance guidance shifts from rigid CSV to risk-based software assurance. Validation engineers who can apply these updated approaches to AI/ML systems are in high demand.
  • Regulatory Affairs Specialists: These specialists navigate dual FDA/EU AI Act compliance, support AI-related manufacturing sections of submissions, and advise on lifecycle management for AI components in drug and device products.
  • Quality Systems Leaders: These roles involve building and maintaining AI governance policies, CAPA frameworks for AI-related deviations, and inspection-ready documentation across digitally enabled manufacturing environments.

These roles are seeing growing demand in established and emerging life sciences hubs across the United States, as companies integrate AI tools into both new facilities and existing manufacturing operations.


Live Opportunities

As AI becomes more embedded in manufacturing operations across the US, we are actively supporting projects where digital governance, quality, and regulatory expertise are critical requirements.

If you’re exploring your next move, would like to see our current live W2 contract requirements, or want to become part of our network for upcoming projects, we’d be happy to connect. To see all live requirements, click here.


Final Thoughts

The Purolea warning letter is a line in the sand. AI is not going to disappear from pharmaceutical manufacturing. The efficiency gains are too significant, and the investments too large. But the industry now has its first concrete example of what non-compliant AI use looks like under existing cGMP, and the FDA has made its expectations clear.

The professionals who thrive in this environment are not those who avoid AI, but those who understand how to govern it, and who can build the human oversight frameworks that make AI a genuine asset rather than a compliance liability.


i-Pharm GxP

At i-Pharm GxP, we work alongside the life sciences organizations navigating this shift, helping them scope, staff, and deliver the GxP talent they need to succeed in a changing landscape.

If you’re exploring your next step, we’d love to connect.


Found these insights helpful? Be sure to share them with your network and subscribe here for monthly updates and exclusive industry insights straight to your inbox.