L7 | CHATS
Cell Therapy Cannot Scale Without Digital Continuity Across R&D, CMC, and Manufacturing
by Kevin McMahon | posted on February 23, 2026
Cell therapy programs move fast, and they have to. Science is advancing, timelines are compressed, and the operational bar is high. Cell therapies are complex to develop, difficult to transfer, and demanding to manufacture consistently.
But here is the hard truth: most of the programs I see are not failing because of bad science. They are struggling because of broken information flows. Data exists. Insights exist. But context does not travel. And in cell therapy, lost context is not a minor inconvenience; it is a direct threat to program timelines, regulatory credibility, and ultimately, patient access.
What cell therapy organizations need, and what most are missing, is not more data. It is continuity. Continuity of definitions, continuity of process intent, continuity of material lineage, and continuity of execution. Because the moment a program crosses a phase boundary, changes a site, adds a partner, or evolves a process, everything that is not unified becomes friction.
And friction in cell therapy is never just inconvenient. It shows up as delayed tech transfers, comparability studies that drag far longer than they should, batch investigations that consume weeks instead of days, and teams operating at the outer edge of what legacy systems can support.
Why CMC Is the Backbone, Not the Bottleneck
CMC (Chemistry, Manufacturing, and Controls) is the regulatory and operational discipline that ensures cell therapies can be manufactured consistently, safely, and at scale. But I want to push back on how most organizations treat it.
CMC is not paperwork. It is not a gate you face at the end of development. It is the backbone that makes a therapy repeatable, defensible, and commercially viable. CMC is where you prove you can manufacture within defined controls, preserve product and process integrity through change, maintain end-to-end traceability, generate records that support confident release decisions, and defend your product story under regulatory scrutiny.
When CMC is treated as a downstream burden, it becomes exactly that: a delay, a cost, and a recurring source of rework. When it is treated as foundational infrastructure from day one, it becomes an accelerator; the thing that lets you move faster because you are not constantly reconstructing what you already knew.
The problem is that most cell therapy organizations are trying to build that backbone on top of fragmented digital systems, systems that were never designed to carry context across the full product lifecycle. The result is a CMC story that exists in pieces: scattered across notebooks, LIMS, MES, and quality systems that do not speak to each other in any meaningful way.
Think about what it would mean to have your full CMC overview available at the push of a button, not compiled from a dozen sources the week before an inspection or submission, but maintained continuously as a natural byproduct of how your organization executes.
That is not a fantasy. But getting there requires rethinking the digital foundation, starting with how you define and govern your processes.
The Continuity Problem: Fragmented Systems Cannot Carry Context
Most cell therapy organizations have assembled their digital infrastructure over time: a system for notebooks, a system for sample tracking, a system for manufacturing execution, a system for quality management, a system for analytics. Each was adopted for valid reasons, usually by different teams solving immediate needs, with little thought for the bigger picture.
The problem is not the individual tools. The problem is what happens at the boundaries between them.
When digital infrastructure is fragmented, teams end up spending enormous time on work that does not advance the program: reconciling inconsistent terminology across sites, manually compiling batch context during investigations, chasing material lineage through disconnected records, translating process definitions during tech transfer, and rebuilding the story of what happened every time a deviation occurs.
This is what I call the continuity problem. In cell therapy, the product is inseparable from the process, and the process cannot be governed if its context is scattered across disconnected systems.
One of the most powerful solutions to the continuity problem is the use of ontologies: formal, structured vocabularies that create a shared language across your systems, departments, sites, and teams. Instead of each system using its own terminology for the same concepts (a reagent, a process step, a critical quality attribute), an ontology enforces consistent, machine-readable definitions that travel with the data.
Ontologies do not just improve searchability. They enable your systems to reason about relationships: that this batch of starting material is connected to that process step, which was governed by this version of the protocol, which produced a result that was evaluated against these acceptance criteria. That chain of meaning is what continuity actually looks like at the data level, and without it, every handoff between systems, phases, or sites introduces ambiguity and risk.
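To make that chain of meaning concrete, here is a minimal sketch of what ontology-grounded linkage could look like: canonical terms that absorb each system's local vocabulary, plus typed relationships that can be traversed as a query. All term names, IDs, and relation types are invented for illustration, not drawn from any real standard.

```python
# Minimal sketch: an ontology as canonical terms plus typed relationships.
# Every term, ID, and relation type below is illustrative.

SYNONYMS = {
    "raw material": "starting_material",
    "input lot": "starting_material",
    "cqa": "critical_quality_attribute",
}

def canonical(term: str) -> str:
    """Map free-text vocabulary from any one system to the shared term."""
    key = term.lower()
    return SYNONYMS.get(key, key.replace(" ", "_"))

# Typed edges (subject, relation, object): the "chain of meaning".
EDGES = [
    ("batch_042", "uses_material", "lot_SM_7781"),
    ("batch_042", "executed_step", "expansion_step_v3"),
    ("expansion_step_v3", "governed_by", "protocol_P12_v3"),
    ("expansion_step_v3", "produced_result", "viability_result_91pct"),
    ("viability_result_91pct", "evaluated_against", "cqa_viability_ge_85pct"),
]

def related(subject: str, relation: str):
    return [o for s, r, o in EDGES if s == subject and r == relation]

def chain(start: str):
    """Walk every outgoing relationship from a node, depth-first."""
    out, stack = [], [start]
    while stack:
        node = stack.pop()
        for s, r, o in EDGES:
            if s == node:
                out.append((s, r, o))
                stack.append(o)
    return out

# Which protocol version governed the expansion step of batch_042?
step = related("batch_042", "executed_step")[0]
print(related(step, "governed_by"))  # ['protocol_P12_v3']
```

The point of the sketch is the shape, not the implementation: once relationships are explicit and machine-readable, "which criteria was this result evaluated against?" is a traversal, not an email thread.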
What Digital Continuity Actually Requires: An Execution Layer
The missing component in most cell and gene therapy technology stacks is not another point solution. It is an execution layer: infrastructure that carries context and governance across the full lifecycle rather than capturing data in isolated silos.
An execution layer is how you move from ‘we have tools’ to ‘we have a controlled program.’ It standardizes and versions your process definitions, orchestrates work across teams and sites, preserves lineage automatically rather than requiring manual reconstruction, maintains consistent change control, integrates quality oversight with manufacturing reality, and grounds analytics in a trustworthy operational context.
A critical component of building this execution layer effectively is how you model your processes in the first place. Frameworks like the FDA's KASA (Knowledge-aided Assessment and Structured Application) initiative point in this direction: defining processes in structured, knowledge-grounded terms rather than narrative text, so that regulatory submissions and lifecycle regulatory checkpoints move faster.
The logic is straightforward: if your process is defined from the beginning using structured, machine-readable knowledge (capturing not just what you do, but why, under what conditions, with what materials, and governed by what criteria), then regulatory submissions become faster because the evidence is already organized. Lifecycle changes become easier to defend because the baseline is clearly documented. And comparability across sites or process versions becomes a tractable exercise rather than an exercise in reconstruction.
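As a sketch of why this works, consider a process step defined as structured data rather than narrative, with the what, why, materials, and criteria as explicit fields. Comparability between two process versions then reduces to a diff of structured definitions. Field names and values here are hypothetical.

```python
# Sketch: a process step as structured, machine-readable data.
# All field names and values are illustrative.
from dataclasses import dataclass

@dataclass
class ProcessStep:
    name: str
    intent: str               # the "why" behind the step
    parameters: dict          # set points and operating conditions
    materials: tuple          # required input materials
    acceptance_criteria: dict # what "good" means for this step

def compare(a: ProcessStep, b: ProcessStep) -> dict:
    """Comparability as a diff of definitions, not a re-read of documents."""
    return {
        key: (a.parameters.get(key), b.parameters.get(key))
        for key in sorted(set(a.parameters) | set(b.parameters))
        if a.parameters.get(key) != b.parameters.get(key)
    }

v2 = ProcessStep(
    name="expansion", intent="expand T cells to target dose",
    parameters={"temperature_C": 37.0, "duration_days": 9, "media": "M1"},
    materials=("IL-2", "culture_media_M1"),
    acceptance_criteria={"viability_pct": ">=85", "fold_expansion": ">=50"},
)
v3 = ProcessStep(
    name="expansion", intent="expand T cells to target dose",
    parameters={"temperature_C": 37.0, "duration_days": 7, "media": "M2"},
    materials=("IL-2", "culture_media_M2"),
    acceptance_criteria={"viability_pct": ">=85", "fold_expansion": ">=50"},
)

print(compare(v2, v3))  # {'duration_days': (9, 7), 'media': ('M1', 'M2')}
```

When the baseline is structured like this, "what changed between v2 and v3, and did the acceptance criteria move?" is answerable mechanically, which is exactly what makes lifecycle changes easier to defend.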
The byproducts of this kind of structured process modeling are significant:
- Stability data becomes better controlled and more searchable because it is linked to the specific process versions, materials, and conditions that generated it, not stored as a disconnected dataset that someone has to manually interpret during submissions or investigations.
- Continuous Process Verification (CPV) becomes genuinely continuous, not a periodic retrospective exercise, but an ongoing capability grounded in data that was captured with consistent context from the moment of execution.
- Tech transfers become faster because the process definition is already structured, versioned, and interpretable, not a narrative document that the receiving site has to translate into their own operational reality. Critical process parameters (CPPs) and critical quality attributes (CQAs) follow the product lifecycle through scale-up, with additional ones added to capture the impact of new equipment on the product.
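The CPV point above can be sketched in a few lines: if each batch result lands with consistent context, evaluating it against control limits is something that happens at the moment of execution, not in a periodic retrospective. The limits logic below is a simple 3-sigma check, and all values are invented for illustration.

```python
# Sketch: CPV as an ongoing check on each new batch result.
# Baseline values and limits are illustrative only.
from statistics import mean, stdev

def control_limits(baseline):
    """3-sigma limits derived from an established baseline of batch results."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def check_batch(value, limits):
    """Evaluate a single result the moment it is captured."""
    low, high = limits
    return "in_control" if low <= value <= high else "out_of_control"

# Hypothetical viability (%) results from qualified baseline batches:
baseline_viability = [91.0, 89.5, 90.2, 92.1, 90.8, 89.9, 91.4, 90.5]
limits = control_limits(baseline_viability)

print(check_batch(90.7, limits))  # in_control
print(check_batch(84.0, limits))  # out_of_control
```

Real CPV programs use richer statistics than this, but the structural point holds: the check is only continuous if the data arrives with consistent context from execution onward.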
This is the difference between a program that scales and a program that reintroduces risk as it grows.
Why AI-Ready Is Not Enough: Context Must Travel with the Data
AI is showing up everywhere in life sciences right now, and the enthusiasm is understandable. Cell therapy generates complex, high-dimensional data across R&D, process development, manufacturing, and quality. The opportunity to extract more signal from that data is real.
But I want to be direct about something: data volume does not create value. Context creates value.
In cell therapy, useful AI depends entirely on whether the data is connected to its meaning. If your organization cannot reliably answer basic questions (which process version produced this result, which materials and lots were used, what conditions were in effect, what decisions and approvals shaped the outcome), then AI is forced to operate on incomplete information. In regulated environments, that is not a risk you can manage away with better algorithms.
This is why AI-readiness for cell therapy needs to be defined differently. It is not just about data cleanliness or access. It is about whether your enterprise can consistently preserve and reuse context, including relationships between:
- Patient or donor lineage and associated materials
- Process steps, parameters, and versions
- Methods, results, and acceptance criteria
- Deviations, investigations, and corrective actions
- Approvals, sign-offs, and audit history
When that context is captured at the point of execution and carried forward across phases and sites, underpinned by ontologies that make the relationships explicit and queryable, AI stops being a reporting layer and becomes something far more powerful: decision support that can operate with guardrails, because the underlying record is complete, interpretable, and governed.
This is the path from AI-ready to AI-actionable. It is not about adding another model on top of disconnected systems. It is about building the execution layer that makes the data trustworthy enough, and the workflow governed enough, for AI to be used responsibly in operations.
Imagine a World Where the CMC Backbone Is Always Available
Let me paint a picture of what this looks like when it works.
Imagine a world where your CMC backbone is not something you assemble before a regulatory submission; it is something that exists continuously as a living record of your program. Where you can pull a complete picture of your product and process history at the push of a button. Where context is not lost at phase boundaries, not reconstructed during site transfers, and not manually compiled before an inspection.
In this world, process definitions are modeled using structured knowledge from day one; not written in narrative documents that each site interprets differently, but encoded in a way that is machine-readable, versioned, and consistent across your organization. Deviations are linked to the specific process step, material lot, and environmental conditions that were in effect. Stability data is searchable because it is anchored to the process context that generated it. CPV is continuous because the data has always been captured with the right structure to support it.
Tech transfers happen faster because the receiving site inherits the structured process definition; not a document to translate, but a model to execute. Regulatory submissions are more confident because the evidence is already organized in the structure the reviewer needs. And when an investigator asks about the history of a batch, the answer is not a three-week reconstruction effort; it is a query.
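"The answer is a query" can be made literal with a toy sketch: if context is written at the point of execution into one governed record, batch history is a filter over that record rather than a reconstruction project. Record shapes, IDs, and field names below are hypothetical.

```python
# Sketch: batch history as a query over records captured at execution time.
# All IDs and field names are illustrative.

EVENT_LOG = []

def record(batch_id, event, **context):
    """Capture context (lot, process version, approver...) as work happens."""
    EVENT_LOG.append({"batch": batch_id, "event": event, **context})

# Context written during execution, not compiled afterwards:
record("B-1042", "material_issued", lot="SM-7781", material="starting_material")
record("B-1042", "step_executed", step="expansion", process_version="v3")
record("B-1042", "deviation_opened", deviation="DEV-88", step="expansion")
record("B-1042", "qa_approval", approver="j.doe", decision="release")

def history(batch_id):
    """An investigator's question, answered as a one-line filter."""
    return [e for e in EVENT_LOG if e["batch"] == batch_id]

for event in history("B-1042"):
    print(event["event"])
```

A production system would back this with a governed database and audit trail rather than an in-memory list, but the contrast with a three-week reconstruction effort is the structural point.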
This is not theoretical. The organizations that are building this kind of digital continuity now, investing in unified platforms, ontology-driven data standards, and structured process modeling, are the ones that will be able to scale without reintroducing the risk and rework that kills timelines in late-stage development and commercialization.
The question is not whether to build this foundational infrastructure. It is whether you build it before scale becomes the constraint, or after it already has.
How L7|ESP Enables Digital Continuity for Cell Therapy Programs
L7|ESP (Enterprise Science Platform) is a unified digital platform designed to provide exactly this execution layer, enabling cell therapy organizations to manage data, workflow context, orchestration, and governance from R&D through CMC and commercial manufacturing.
L7|ESP addresses the continuity problem across three foundational dimensions:
- Establishing and carrying forward consistent standards. L7|ESP helps organizations define process models, ontology-grounded data standards, and governance rules early, then carry those standards through tech transfer, manufacturing scale-up, and site expansion. This reduces downstream rework and supports defensible comparability because the organization maintains consistent definitions rather than reinterpreting its own program at every transition.
- Orchestrating workflows while preserving governance. Cell therapy work is distributed across research, process development, manufacturing, quality, and external partners. L7|ESP functions as an execution layer that connects workflows across these boundaries while maintaining approvals, traceability, and audit trails. Continuity becomes operational, not aspirational.
- Preserving lineage automatically to accelerate investigations and decisions. When lineage is captured as a natural outcome of execution, not reconstructed manually, teams spend less time compiling batch history and more time addressing root causes. This directly improves investigation cycle times, strengthens inspection readiness, and increases confidence in release decisions.
By providing a unified platform rather than a collection of point solutions, L7|ESP enables stability data to be controlled and searchable, CPV to be genuinely continuous, and AI-assisted decision making to move from pattern detection to actionable insight, because the underlying data has the context and governance structure to support it.
Build the Foundation Before Scale Becomes the Constraint
The cell therapy programs that scale successfully are not necessarily the ones with the best science. They are the ones that built operational continuity into the program itself; consistent standards from the start, governed execution across every phase, preserved lineage as work happens, and decision-making grounded in a trustworthy context.
The digital foundation for that continuity requires more than tools. It requires ontologies that enforce a shared language. It requires process modeling approaches like KASA that structure knowledge for regulatory utility. It requires an execution layer that carries context across every handoff so that the CMC backbone is not something you build at the end; it is something you have maintained all along.
Cell therapies are too complex and too important to be run on disconnected systems that cannot carry context across the development and manufacturing lifecycle. The organizations that get this right will not just move faster; they will move more confidently, with a record that is always ready for the scrutiny that life sciences demand.
That is what digital continuity makes possible. And the time to build it is before you need it.