5 myths: What everybody gets wrong about interoperability
You hear it everywhere – in conference rooms, during supplier meetings, in Industry 4.0 roadmaps. Interoperability has become a buzzword of industrial transformation. But do you truly understand what it means? Many companies treat it as a nice-to-have – something to consider eventually. That’s a mistake that can cost you a fortune.
Regardless of the type of manufacturing automation you’re planning, interoperability should be a core pillar of your strategy. It’s the foundation on which every system component depends. Without it, you risk limited scalability, difficult integration, and – above all – rising costs.
Here are five common misconceptions about interoperability you’ll want to avoid.
Myth 1: “We’ll add interoperability later”
Imagine your new electric motor production monitoring system is working perfectly – sensors are collecting data, machine learning algorithms are predicting failures, and dashboards are displaying sleek visualisations. Then, three months in, a problem arises. You discover that the advanced IoT platform cannot be integrated with your DCS via industrial standards like OPC UA or MQTT. The system only communicates through proprietary APIs provided by the vendor.
The lack of a common protocol, data model, or service structure means costly, time-consuming integration. The consequences are always the same:
- Higher costs – retrofitting interoperability significantly increases project expenditure.
- Delayed timelines – designing and testing additional integrations causes delays.
- Lost scalability – systems built without interoperability become fragile; every extension adds complexity.
All of these issues could have been avoided by following RAMI 4.0 principles from the start. Interoperability should be designed into every layer of the architecture – from the asset and communication layers through to the information, functional and business layers.
It’s not something that can be easily added later; it’s an integral part of the solution that needs to be considered in advance.
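The retrofit cost described above can be made concrete with a small sketch. The snippet below is plain Python with entirely hypothetical class, tag and metric names – it is not any real vendor’s API. It wraps a proprietary client behind a standards-style tag interface: every entry in the hand-written mapping table represents integration work that a design built on open standards from day one would have avoided.

```python
from dataclasses import dataclass


class VendorClient:
    """Hypothetical proprietary vendor API: nonstandard IDs, raw unitless values."""

    def fetch_metric(self, metric_id: str) -> float:
        # Stand-in for a real vendor call; returns a bare number with no context.
        return {"MTR_TEMP_01": 74.2, "MTR_VIB_01": 0.8}[metric_id]


@dataclass
class Tag:
    """A standards-style tag: name, value and engineering unit travel together."""
    name: str
    value: float
    unit: str


class VendorAdapter:
    """Retrofitted adapter mapping proprietary metric IDs to named tags.

    Each mapping entry is manual integration work – the retrofit cost
    the article describes when interoperability is bolted on later.
    """

    _MAPPING = {
        "MotorTemperature": ("MTR_TEMP_01", "degC"),
        "MotorVibration": ("MTR_VIB_01", "mm/s"),
    }

    def __init__(self, client: VendorClient):
        self._client = client

    def read(self, tag_name: str) -> Tag:
        metric_id, unit = self._MAPPING[tag_name]
        return Tag(tag_name, self._client.fetch_metric(metric_id), unit)


adapter = VendorAdapter(VendorClient())
print(adapter.read("MotorTemperature"))  # Tag(name='MotorTemperature', value=74.2, unit='degC')
```

The adapter works, but it scales linearly with every new metric, firmware revision and vendor quirk – which is exactly why this pattern is a symptom of late interoperability rather than a cure.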
At Spyrosoft, interoperability is never an afterthought. We design systems that integrate seamlessly – from day one – with both legacy environments and new infrastructure.
Myth 2: “If it’s all from one supplier, we’re safe”
At first glance, relying on a single supplier appears to be a safe bet. One ecosystem should mean built-in compatibility, minimal integration effort and fewer points of failure. However, a factory isn’t a laboratory and digital transformation never starts with a blank page.
Even in companies where most components come from the same vendor, hybrid environments are the rule – not the exception. Think production lines with older-generation robots, sensors from a new supplier, a client-specified MES, and ERP rolled out from headquarters.
In reality, vendor lock-in doesn’t solve interoperability. It only delays the problem – until you need to add a new component, integrate external data or onboard a customer-specific solution.
That’s why at Spyrosoft, we use middleware that connects systems across different suppliers. We build universal interfaces designed for intelligent manufacturing in line with Industry 4.0 architecture.
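The middleware idea can be sketched in a few lines. The example below (plain Python; the connector classes, device names and tags are illustrative assumptions, not a real product) shows the pattern: every vendor-specific connector implements one universal interface, and upstream code such as an MES talks only to the middleware, never to vendor APIs directly.

```python
from typing import Protocol


class DeviceConnector(Protocol):
    """The one universal interface every vendor-specific connector implements."""

    def read(self, tag: str) -> float: ...


class LegacyRobotConnector:
    """Hypothetical connector for an older-generation robot controller."""

    def read(self, tag: str) -> float:
        return {"cycle_time_s": 12.4}[tag]


class NewSensorConnector:
    """Hypothetical connector for a newly added sensor gateway."""

    def read(self, tag: str) -> float:
        return {"temperature_c": 61.0}[tag]


class Middleware:
    """Routes tag reads to the right connector behind a single API,
    so MES/ERP code never depends on any one supplier's interface."""

    def __init__(self) -> None:
        self._connectors: dict[str, DeviceConnector] = {}

    def register(self, device: str, connector: DeviceConnector) -> None:
        self._connectors[device] = connector

    def read(self, device: str, tag: str) -> float:
        return self._connectors[device].read(tag)


mw = Middleware()
mw.register("robot_7", LegacyRobotConnector())
mw.register("line3_sensor", NewSensorConnector())
print(mw.read("robot_7", "cycle_time_s"))        # 12.4
print(mw.read("line3_sensor", "temperature_c"))  # 61.0
```

Swapping a supplier then means writing one new connector class, not rewriting every consumer of the data.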
Myth 3: “Standards are for engineers”
It’s tempting for operational leaders to delegate protocol compatibility and data models to technical teams. However, treating standards as an engineering-only concern is no longer a responsible approach. Communication and interoperability are now strategic architectural decisions that shape the entire business environment.
They influence:
- How easily systems scale across locations.
- How quickly you can integrate partners or client ecosystems.
- How fast you deploy new technology.
- The total cost of ownership for each implementation.
In large-scale operations, any change in protocol or data model can trigger a domino effect – revising documentation, rebuilding integrations or restructuring entire data flows.
We follow a simple rule: today’s flexibility is tomorrow’s savings. That’s why we adopt open standards from the start. They accelerate time-to-market, increase supplier independence and enable long-term evolution.
Myth 4: “Interoperability is just about communication protocols”
Project teams often equate interoperability with a checklist of supported protocols: OPC UA, MQTT, Modbus, etc. But seamless connectivity requires more than a shared protocol – it demands shared understanding.
True interoperability relies not only on physical integration but also on:
- Semantics – do systems interpret data in the same way?
- Data models – are values structured and described using consistent, standardised formats?
- Services – can functions be offered and reused across components?
- Business logic – can data flow meaningfully and securely between MES, ERP or EAM systems?
Data interoperability means that two machines from different manufacturers can not only exchange messages but also understand the context of the information transmitted – and act on it autonomously.
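The difference between sharing a protocol and sharing semantics is easy to show. In the sketch below (plain Python; the vendor payloads and field names are invented for illustration), two machines report the same physical quantity with different keys and units. Without a common data model the values are incomparable; a thin normalisation layer maps both into one shared representation.

```python
# Hypothetical payloads from two vendors describing the same physical value.
vendor_a = {"temp_f": 158.0}      # Fahrenheit, vendor-specific key
vendor_b = {"temperature": 70.0}  # Celsius, different key

COMMON_UNIT = "degC"


def normalise_a(payload: dict) -> dict:
    """Map vendor A's Fahrenheit reading into the shared data model."""
    return {"measurement": "temperature",
            "value": (payload["temp_f"] - 32.0) * 5.0 / 9.0,
            "unit": COMMON_UNIT}


def normalise_b(payload: dict) -> dict:
    """Vendor B already reports Celsius; only the key and structure change."""
    return {"measurement": "temperature",
            "value": payload["temperature"],
            "unit": COMMON_UNIT}


a = normalise_a(vendor_a)
b = normalise_b(vendor_b)
print(a["value"], b["value"])  # 70.0 70.0 – the same physical value, now comparable
```

Both payloads were valid JSON travelling over a perfectly good protocol; only the shared semantics make them interoperable.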
Myth 5: “Legacy systems can’t be integrated”
Mature industrial organisations often assume that their legacy systems can’t coexist with contemporary platforms. They may claim, for example, that their robots do not support any modern protocols or that Profibus or Modbus-based environments are incompatible with cloud analytics.
That was exactly the case for one of our clients in the processing industry. Their ageing DCS couldn’t communicate with modern automation components. We implemented an OPC UA-based integration layer, enabling seamless, real-time data exchange and complete process visibility. The results? 15% increase in production efficiency, 30% reduction in downtime, and noticeable improvement in product quality.
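An integration layer like this essentially turns anonymous register values into named, scaled tags that a modern platform can consume. The sketch below is a simplified, hypothetical illustration of that mapping step in plain Python – the register addresses, scale factors and tag names are invented, and a real deployment would read the registers over Modbus and expose the tags via an OPC UA server rather than a dictionary.

```python
# Hypothetical snapshot of holding registers read from a legacy DCS.
registers = {40001: 743, 40002: 512}  # raw 16-bit values, no semantics attached

# Tag map: register address -> (tag name, scale factor, engineering unit).
TAG_MAP = {
    40001: ("reactor_temperature", 0.1, "degC"),
    40002: ("feed_pressure", 0.01, "bar"),
}


def decode(registers: dict[int, int]) -> dict[str, dict]:
    """Turn anonymous register values into named, scaled tags that an
    integration layer could expose upstream to modern analytics."""
    tags = {}
    for address, raw in registers.items():
        name, scale, unit = TAG_MAP[address]
        tags[name] = {"value": round(raw * scale, 2), "unit": unit}
    return tags


print(decode(registers))
# {'reactor_temperature': {'value': 74.3, 'unit': 'degC'},
#  'feed_pressure': {'value': 5.12, 'unit': 'bar'}}
```

The legacy equipment stays in place; only the semantic layer on top is new – which is what makes this approach far cheaper than a full equipment overhaul.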
Interoperability underpins intelligent automation in manufacturing
Interoperability isn’t a technical detail to bolt onto an Internet of Things (IoT) project at its final stage. It’s a mindset for designing industrial systems – one rooted in coherence, adaptability and long-term resilience. The earlier you bring in an experienced technology partner and plan for interoperability across your IoT environment, the more friction, risk and expense you’ll avoid along the way.
FAQ

What is interoperability in manufacturing, and why does it matter?
Interoperability in manufacturing refers to the ability of different systems, machines, and software to seamlessly communicate, exchange, and interpret data across various levels of production. It is essential for ensuring scalability, reducing integration costs, and enabling Industry 4.0 innovations like predictive maintenance and real-time analytics. Without it, manufacturers face higher costs, delays, and limited system adaptability.

Can interoperability be added after a system is implemented?
No. Retrofitting interoperability after implementation is expensive and inefficient. Delaying its integration leads to costly rework, project delays, and fragile systems that lack flexibility. Interoperability should be part of the foundational design, following frameworks like RAMI 4.0, to ensure future-proof, scalable industrial automation.

Does buying everything from one supplier guarantee interoperability?
Relying on a single supplier may simplify initial integration but doesn’t guarantee long-term interoperability. Most manufacturing environments are hybrid systems, mixing legacy equipment with new technologies. Vendor lock-in can hinder future integrations, especially when adding third-party tools or customer-specific solutions. A universal middleware approach supports true cross-vendor compatibility.

Is interoperability just about communication protocols?
Not at all. True interoperability extends beyond protocols like OPC UA or MQTT. It includes shared data semantics, consistent data models, compatible services, and aligned business logic. This ensures machines and systems not only exchange data but also understand and act on it contextually – a critical element for smart manufacturing.

Can legacy systems be integrated with modern platforms?
Yes. Even older systems using protocols like Profibus or Modbus can be integrated with modern automation platforms. Solutions such as OPC UA-based integration layers allow legacy equipment to participate in real-time data exchange and analytics, improving efficiency, reducing downtime, and enabling digital transformation without a full equipment overhaul.