Data-driven biomanufacturing excellence.

Hercule overview

What parts of your production margins are being nibbled away by batch losses or unplanned downtime? What are your key process drivers? How can you improve process yield or quality? How can you detect issues before they arise? How can you make process knowledge robust to staff turnover? How can you get a 360° view of your plant performance? How can you develop new production processes faster? How can you keep distributed production under control?


A methodology to turn data into value


We provide expertise in data-driven strategies for biomanufacturing (including ATMPs). Whether for a one-shot troubleshooting analysis, process development or improvement (yield or quality increase), or simply for consolidating data from multiple silos, we tailor our actions to your needs. Software implementation is not a goal in itself, only one piece of this process, where relevant.

Whatever your organization's stage of maturity with respect to data in biomanufacturing, we can help.


When appropriate, on top of strategic guidance and training, we provide our customers with the complete Hercule software platform. It integrates smoothly into any infrastructure and can address all your data-science needs in a biomanufacturing context. It is quickly and completely tailored to your environment, and comes fully documented and validated.

The Hercule software platform comes for free, with source code and a lifelong license of use! See below.

Hands on

We do not only solve your data-related issues in biomanufacturing; we help your organization get a better grasp of these topics. We advise on integrating data-driven strategies across the various roles in your organization, provide technical and scientific training when relevant, and train users of the Hercule software platform.

Do not just integrate new tools: empower your organization to work with data in biomanufacturing.

How it goes

  • 1
    Needs analysis

    Understanding your actual needs is crucial: is your objective to start data integration in your company in general? Is there a particular issue with a given process? Do you intend to convince your management of the added value of data analysis in your manufacturing department? These are all very relevant goals, but they trigger different roadmaps. We'll help you sort this out.

  • 2
    Process mapping

    Almost all of our engagements start by mapping your process with you: what data are generated, when, and where (in which systems)? Who can access what information? What are the regulatory constraints at each step involved? This allows us to design a specific strategy based on what you have and what you need.

  • 3
    Strategy deployment

    Depending on your needs, from a one-shot ad-hoc data analysis to the complete deployment of our Hercule software platform or other tools, let's put the strategy into action.

  • 4
    Empowerment

    At the end of our mission, your organization has additional tools to increase the value generated by the manufacturing department. Your colleagues have also been trained to use them properly. A return on investment can be expected shortly.


Hercule software platform


Hercule offers a series of modules for data integration, data entry, continuous process verification, and industrial R&D (process development and improvement).

Learn more about them below.


The deployment and use of the Hercule software platform does not replace your existing systems (LIMS, ERP, EMS, SCADA, ...) and procedures. Rather, it complements and integrates them. It is also very lean: integration can be done step by step, "à la carte".


The Hercule software platform is process-centric. It is aware of every regulatory constraint you face, every existing piece of data, and every step of your production process. It integrates this knowledge to provide insights that lead to concrete actions.


One module, one mission.


Consolidate your data silos

Get a 360° view of your production data.

QC data live in your LIMS, raw material data and batch metadata in your ERP, automation data in a historian database (which is not even aware of batch start and stop times), and environment data in an EMS. Microbiology data sit in yet another system. How are you going to make these data silos talk to each other? How are you going to pool these data for joint analyses?

The Data Lake module of Hercule interfaces with all these different systems to reconstitute a unified electronic batch record. And it does so gently: your data silos are left untouched, but push new data into the data lake through a standardized API. This data lake can then be used by the other Hercule modules for all their data analyses and visualizations. The API can also be used to export datasets for other uses, for example to feed your favorite in-house statistical software.
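To give a concrete feel for what "reconstituting a unified batch record" means, here is a minimal, purely illustrative sketch: per-silo records keyed by batch ID are merged into one consolidated record. All system, field, and batch names are made-up assumptions; in the real platform, each silo would push its data through the REST API rather than live in in-memory dictionaries.

```python
# Illustrative sketch of batch-record consolidation across silos.
# Silo contents and field names are hypothetical.

def consolidate(batch_id, *silos):
    """Merge the per-silo records for one batch into a single dict."""
    record = {"batch_id": batch_id}
    for silo in silos:
        # Each silo maps batch IDs to partial records; missing IDs are skipped.
        record.update(silo.get(batch_id, {}))
    return record

# Hypothetical extracts from three separate systems:
lims = {"B042": {"qc_titer_g_per_l": 3.1}}                      # QC results
erp = {"B042": {"raw_material_lot": "RM-17"}}                   # batch metadata
historian = {"B042": {"mean_ph": 7.02, "mean_do_pct": 38.5}}    # automation data

unified = consolidate("B042", lims, erp, historian)
# unified now holds one 360° record for batch B042
```

The point of the real module is that this join happens continuously and automatically, while each source system keeps full ownership of its own data.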


Fill gaps in electronic batch records

Get rid of paper-only records and unintegrated xls files.

Either because you have a long legacy, or because you are starting to transfer a new process from R&D to production, parts of your production activities may still generate unstructured data (paper forms or free-text Word reports) or structured but uncaptured data (stand-alone xls files). How can the information contained in these files become part of your data-driven strategy?

The Data Entry module of Hercule lets you progressively (at your own pace) replace these legacy elements with structured data that can feed your data analyses just like any other piece of data.

Your procedures rely on paper-based forms at certain points and you do not want to start modifying them? No worries: the Data Entry module has workarounds to fit this frequently encountered situation.


Monitor what is going on, good or bad.

Monitor all aspects of your production in real-time.

As a production manager, you want to ensure that operations go smoothly, to speed up batch release, to avoid unplanned maintenance, and to stop production of a batch that is doomed to fail. You also need to perform a series of recurring computations, for example for process stability.

All these questions can be addressed by the Continuous Process Verification module of Hercule. It gathers all available data in real time and produces useful insights and calls to action for every single batch. It also lets you track data management progress: if part of the data is not yet available to the system, you will know, and can take corrective action with the team involved.

All the information and insights generated by this module can be exported into editable reports, in order to support batch release activities.
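As an example of the recurring stability computations mentioned above, here is a simplified sketch of one classic building block: Shewhart-style control limits (mean ± 3σ) on a batch-level attribute, used to flag a new batch that falls outside its historical behavior. The yield values are made-up, and the actual module's statistics may well be more sophisticated.

```python
# Simplified control-limit check for continuous process verification.
# Historical yields are hypothetical example data.
from statistics import mean, stdev

def control_limits(history):
    """Return (lower, upper) Shewhart-style limits: mean +/- 3 sigma."""
    m, s = mean(history), stdev(history)
    return m - 3 * s, m + 3 * s

def out_of_control(history, new_value):
    """True if the new batch value falls outside the control limits."""
    lcl, ucl = control_limits(history)
    return not (lcl <= new_value <= ucl)

yields = [92.1, 93.4, 91.8, 92.7, 93.0, 92.5, 91.9, 92.8]  # % yield, per batch
```

A real CPV workflow would run checks like this automatically as each batch's data arrives, which is exactly the kind of "call to action" the module surfaces.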


Produce better.

Develop new processes faster, improve existing processes.

If you are developing a new process, you will want guidance. The Industrial R&D module of Hercule provides standard Design of Experiments functionality. But as Hercule has data in its DNA, it also helps you integrate the early-stage data you produce in order to strengthen these recommendations.
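To illustrate the simplest Design of Experiments building block, here is a sketch that enumerates a two-level full-factorial design over three process factors. The factor names and levels are purely hypothetical, and real DoE tooling (including, presumably, Hercule's) offers far richer designs (fractional factorial, response surface, etc.).

```python
# Two-level full-factorial design enumeration.
# Factors and levels are illustrative assumptions only.
from itertools import product

factors = {
    "temperature_c": [30, 37],
    "ph": [6.8, 7.2],
    "feed_rate_ml_h": [5, 10],
}

# Every combination of one level per factor: 2^3 = 8 experimental runs.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

for run in runs:
    print(run)
```

Each generated run is one experiment to execute; the measured outcomes then feed a model of how the factors drive the response.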

If you are dealing with one or more processes that have been running for a while, you are sitting on a gold mine of data. The Industrial R&D module of Hercule uses advanced machine learning and optimization to mine it. It discovers the key drivers of the outcomes you care about (such as yield, production lead time, or quality variability). It also makes recommendations on how to reconfigure production lines to improve these outcomes, automatically taking regulatory constraints into account. It measures the innovativeness and potential gain of each recommendation. Finally, it suggests the next round of experiments you could run to improve its predictions.


Frequently Asked Questions

I have plans to change my LIMS, implement a new ERP, ... Should I wait before working on my data strategy?

Not in the least. These different systems are there to address regulatory and operational constraints, and can support the data strategy, but should not at all define it. The sooner your data-driven goals are set, the easier it is to harness all the data-related assets of the company, regardless of the rest of the IT roadmap.

I feel these data-driven initiatives are only worthwhile when there is already plenty of data. When is the best time to start?

The sooner the better. Only the process improvement part needs a long data history to deliver its full value. Aggregating all your data sources, putting good data practices in place, and monitoring your production on a day-to-day basis deliver value on their own. And by starting early, without a long legacy, deploying such a strategy is even simpler.

My colleagues are used to Minitab for QC data and SIMCA for automation data analysis. Is the Hercule software platform appropriate for us?

For sure! First, a central benefit of the Hercule methodology and software platform is to confront data from different silos, whereas these tools generally work on one silo at a time. Second, Hercule allows consolidated data export: clean, consolidated data that can then be imported into any other analysis software able to handle them.

My data are already consolidated and I don't want an additional piece of software. Can you still work for us?

In most cases, yes. We can elaborate data strategies in any case, and guide you in establishing the associated procedures regardless of your installed infrastructure. We can also run analyses on data extracted from your infrastructure. We do that a lot.

My production is spread over multiple sites. Or I subcontract my production to several CDMOs. Is this data-related initiative still relevant?

Even more relevant. How can you make sure quality and yield are consistent from site to site if you are blind to what each site actually does? How can you measure the impact on your final product of different equipment, different raw material providers, different environmental conditions, and so on?

What kind of technology is involved?

The Hercule software platform builds on industry-standard elements. The data lake runs on a PostgreSQL database with a REST API on top of it. The computational elements of Hercule are coded in R, one of the most widely used environments for data science. The user interface and application logic use R Shiny. Visualization elements use Plotly to make them attractive and interactive.

Do I need to purchase a license to use the Hercule software platform?

No, you don't. We provide a free use license to all customers benefiting from our expertise.

Is the Hercule software platform a black box with limited features?

On the contrary. The software comes with its source code, and the license allows for extension development. We can help you customize the software and develop extra modules, or do it completely for you.

What kinds of processes is Hercule suited to?

Any manufacturing process that puts living systems to work: production of biological therapeutics, vaccines, and Advanced Therapy Medicinal Products (ATMPs) such as gene or cell therapies. That does not mean the methodology and tools cannot apply to more traditional products like small-molecule therapeutics, but those processes are generally already under tighter control. Even so, the data consolidation part remains fully relevant to these products as well.

I am a C(D)MO, I run a lot of different production processes for my customers. Do I need a data strategy?

Definitely. In this case, there may be processes you run very frequently and others you run only once or twice a year. For the most frequent ones, treat the data-related initiative as if the processes were your own. For the least frequent ones, wouldn't you feel safer being able to compare what you produce from campaign to campaign, especially when campaigns are far apart in time?

I understand my data is the new gold. I want to keep them in-house. Can you work in these conditions?

The Hercule software platform is intended to be installed locally, so your data stay where they are. For the ad-hoc analyses we may run, we can work on your premises or remotely, on our machines or yours, as you wish.

Who we are

Hercule is a solution by DNAlytics.

DNAlytics has provided expertise in data science for the healthcare industry since 2012, successfully working for small, medium, and large biopharma and medtech companies, academia, hospitals, and government agencies.

Centre Monnet, Rue Jean Monnet, 1348 Louvain-la-Neuve, Belgium.