Systems and Process Engineering

July 27, 2022

Any large-scale construction project is an undertaking of immense complexity – so you can imagine what it takes to design and build a new nuclear reactor.


The Petten High-Flux Reactor (HFR) is a globally significant ‘non-power’ research reactor that uses small amounts of low-enriched uranium to produce a variety of essential medical isotopes. Each day, the current reactor supplies 30,000 patients across Europe with life-saving diagnostic and therapeutic isotopes. But, after more than 60 years in operation, it is now due for replacement.

PALLAS is the name of the new HFR destined to replace the current aging facility, and the project is being headed by the Stichting Voorbereiding Pallas-Reactor (Foundation Preparation PALLAS-Reactor). Since 2013, this foundation has been carrying out the preparatory work necessary for commissioning the new facility.

Unsurprisingly, this is a complicated affair, with systems engineering and information management requirements like no other.


Careful planning

Once built, the reactor will need to operate safely and continuously for 50 years. This is no mean feat; the configuration of each individual part of the facility - and of the whole - must be carefully planned and synchronized to guarantee the reactor's safety and longevity.

Furthermore, each aspect of the design and configuration is heavily regulated, with rigorous standards that must be met.

A project of this scale and scope needs a rigid data model that meets these requirements. Specifically, PALLAS uses the international standard ISO 15926 Part 11, which stipulates how all reference data relating to the facility is managed during the project. PALLAS had to either develop a suitable data model itself or find an existing solution that met all of these stringent requirements.
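The core idea behind ISO 15926-style reference data is that every physical item in the facility carries a unique tag and is linked to a class in a shared reference data library, so every contractor describes equipment in the same vocabulary. A minimal sketch of that idea, with illustrative class names and identifiers that are not taken from the actual standard or the PALLAS data model:

```python
from dataclasses import dataclass, field

# Hedged sketch of reference-data tagging: a plant item is identified by a
# unique tag and classified against a shared reference data library (RDL).
# All identifiers and attributes below are invented for illustration.

@dataclass(frozen=True)
class ReferenceClass:
    rdl_id: str   # identifier in the reference data library
    label: str    # human-readable class name, e.g. "CENTRIFUGAL PUMP"

@dataclass
class TaggedItem:
    tag: str                    # unique plant tag, e.g. "P-101"
    ref_class: ReferenceClass   # what kind of thing this item is
    properties: dict = field(default_factory=dict)

pump_class = ReferenceClass("RDL-0001", "CENTRIFUGAL PUMP")
primary_pump = TaggedItem("P-101", pump_class, {"rated_flow_m3_h": 450.0})

print(primary_pump.ref_class.label)  # CENTRIFUGAL PUMP
```

Because every party references the same library entry rather than free-text descriptions, data from hundreds of subcontractors can be merged without ambiguity.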

The International Atomic Energy Agency (IAEA) requires a simple but stable common data environment (CDE). This is used for storing, accessing, and evaluating data relating to the configuration and planned operation of the facility. The CDE is essential to ensure total visibility of every aspect of the design. 

As the design evolves, it also needs to be apparent how each new or adjusted element affects all other parts.

Such a long-term project must account for the high probability that specifications, designs, and requirements may change over the course of the entire lifetime of the project – from concept to the many decades of operation. It requires a rigorous data management system that is also versatile enough to handle and homogenize all the different types of data being used.

Dynamic virtual configuration

A complete ‘virtual’ configuration of the whole facility, built from raw data and advanced modeling, is needed long before construction begins. Every component of the design needs to be configured with respect to all the other parts of the puzzle, so that it can be maintained easily. Any design adjustment also needs to trigger revisions in related design elements: if the tank volume is increased, for example, the walls and floors must be checked to ensure they can support the new weight; if the thermal capacity increases, the coolant flow must be increased accordingly.
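The change-propagation described above can be pictured as a dependency graph: when one design parameter changes, every element that depends on it, directly or transitively, is flagged for re-verification. A simple sketch under that assumption, with invented element names (this is not the platform's actual mechanism):

```python
from collections import deque

# Hedged sketch: each design element lists the parameters it depends on.
# Changing a parameter flags all downstream elements for re-verification.
DEPENDS_ON = {
    "floor_load_check": ["tank_volume"],
    "wall_strength_check": ["tank_volume"],
    "coolant_flow": ["thermal_capacity"],
    "pump_sizing": ["coolant_flow"],
}

def affected_by(changed: str) -> set:
    """Return every element that directly or transitively depends on `changed`."""
    affected, queue = set(), deque([changed])
    while queue:
        current = queue.popleft()
        for element, inputs in DEPENDS_ON.items():
            if current in inputs and element not in affected:
                affected.add(element)
                queue.append(element)
    return affected

print(sorted(affected_by("tank_volume")))      # ['floor_load_check', 'wall_strength_check']
print(sorted(affected_by("thermal_capacity"))) # ['coolant_flow', 'pump_sizing']
```

Note that increasing the thermal capacity flags not only the coolant flow but also the pump sizing that depends on it, which is exactly the transitive effect a CDE has to surface.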

Adaptations and alterations like these are relatively commonplace. In the late 1960s, the current HFR had its thermal capacity increased to 45 MW. Another change in design was needed in 2006, when the switch was made to low-enriched uranium. And, in 2008, the coolant pipes needed to be replaced due to corrosion.

To maintain safety and visibility, every change in the design - and in the finished reactor - needs to be mapped out and maintained in a reliable, integrated CDE.

A solution for a safe future

To solve this, the Stichting Voorbereiding Pallas-Reactor is using the platform to create and complete the initial ‘design research’ phase. Before progressing to detailed final designs, the data is processed, added, and integrated into the platform, so that each element is fully mapped out and modeled.

This preliminary stage is probably the most complex, because all the variables are still in a state of flux. New data is constantly being added to the model, and adjusted, as updates and alterations are made. More than 300 subcontractors are involved, each with independent access via a secure portal to upload concept drawings, schematics, 3D designs, calculations, and raw data.

Long before a final design is created, the entire facility is ‘virtually configured’ within the platform. This complete and verified configuration can then be used to create detailed plans and chronologies. 

Over time, the ‘data layer’ formed in the platform can be used to create a ‘digital twin’ of the whole facility. By combining the model with real operational data and 3D modeling, it becomes possible to keep a constant eye on the facility - even to ‘step inside’ a virtual model or use AR to ‘see’ additional layers of detail.

Using the platform, all stakeholders can see the connections between the billions of data points in the design; but the platform also has an important part to play in the long-term operation of the facility. Once operational, real-time data is easily compared with the predicted raw data, which remains fully accessible. Any variance can then be investigated in a meaningful way, without losing granularity due to a lack of the original raw data. With a sturdy and reliable data model, PALLAS can make data-driven decisions on a sound basis - confidently.
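Comparing live readings against a predicted baseline reduces to a simple check: flag any sensor whose reading drifts beyond a tolerance, then investigate the flagged variance against the original raw data. A minimal sketch of that comparison; the sensor names, baseline values, and tolerance are all made up for illustration:

```python
# Hedged sketch: compare live operational readings against the predicted
# baseline stored in the data model; flag relative deviations beyond a
# tolerance. All values here are invented, not real reactor parameters.

BASELINE = {"coolant_flow_m3_h": 3200.0, "outlet_temp_c": 46.0}
TOLERANCE = 0.05  # 5% allowed deviation before a variance is investigated

def flag_variances(readings: dict, baseline: dict, tol: float) -> dict:
    """Return {sensor: relative_deviation} for readings outside tolerance."""
    flags = {}
    for sensor, expected in baseline.items():
        actual = readings.get(sensor)
        if actual is None:
            continue  # no live reading for this sensor yet
        deviation = abs(actual - expected) / expected
        if deviation > tol:
            flags[sensor] = round(deviation, 3)
    return flags

live = {"coolant_flow_m3_h": 3190.0, "outlet_temp_c": 49.5}
print(flag_variances(live, BASELINE, TOLERANCE))  # {'outlet_temp_c': 0.076}
```

The point of keeping the raw predicted data accessible is that a flagged deviation like this can be traced back to the original calculations rather than to a summarized figure.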


Total visibility. Legal compliance. Peace of mind.

Using the platform as the common data environment for the PALLAS project, every contractor can easily contribute the most up-to-date information, models, and projections. These are compiled into a comprehensive overview within the platform, in a clear and transparent way that meets the strict legal requirements set by the IAEA and other authorities. As the design evolves over time, different versions are easily managed within the platform, and a baseline is created for all structural and engineering data. The real-world implications and connections of every data point are clearly visible, trackable, and usable.

In the future, AI will play a bigger role in the configuration and maintenance of complex facilities like PALLAS, but it can only do so when the data is available. To begin with, pattern recognition is done manually, but over time larger datasets can be fed to smart algorithms. With enough data, facilities like this can become even safer through predictive maintenance. The platform enables this safer future, and makes it far easier to meet the required standards.

Using the platform is an investment in the future of the new reactor. The baseline formed by the data model opens the door to predictive maintenance, which makes the facility a safer and happier workplace. And when maintenance is needed but wasn't predicted, it's possible to go back and see why.

This level of visibility provides a source of reassurance for decades to come.

