    by Ergin Tuganay, Head of Data & AI, Nortal Finland

    A faster path to real-time industrial data: Insights from the Industrial Data & AI Kickstart approach

    Many industrial organizations lose months — even years — exploring theoretical data strategies. By the time they are ready to implement, the business context has already changed. Our practical, four-step approach helps companies move from intention to implementation quickly, without overengineering or unnecessary investments.

    Service: Data and AI

    Industry: Manufacturing Industry

    In every industrial organization, hundreds of small decisions are made before lunchtime: operators adjust speeds, maintenance teams decide whether to intervene, planners shift loads to ensure customers receive products on time, and managers prioritize production lines, capacity, and overtime. Individually, the decisions are small, but collectively, they determine the organization’s throughput, quality, energy use, customer satisfaction, and – ultimately – the bottom line.

    The real difference for any industrial business is whether those decisions are based on intuition or on data and insights. This article outlines a practical four‑step approach that has repeatedly proven effective in helping industrial organizations move from intention to implementation. This approach enables organizations to kick-start the development of a modern, real-time data platform without overengineering or unnecessary investments. In doing so, it helps avoid the analysis paralysis seen in some companies, where months and even years are wasted exploring theoretical approaches. By the time these organizations are ready to implement, the initial business assumptions may have already changed.

    Uncovering the repeatable patterns behind unique industrial data projects

    Experience across industrial organizations shows that data-related challenges vary greatly, and no two IT or OT landscapes are alike. The underlying drivers may include technology harmonization, technical risk reduction, cost efficiency, shared business understanding, or preparing data for future AI use. Additional factors can involve maintaining a complete history of raw data, reducing reliance on system-specific expertise, accelerating application development, enabling self-service data analytics, or any combination of these.

    Despite differences in driving forces and baseline organizational readiness, common data approaches recur across large industrial organizations. Identifying these common elements helps create a structured framework for initiating work on a modern industrial data platform. Importantly, these shared patterns provide the foundation for translating business needs into a practical, actionable plan.

    Turning business needs into a practical plan

    As industrial companies increasingly recognize the strategic need for a robust data foundation, we have found that early decisions about data architecture and ways of working shape everything that follows. Organizations that make efficient progress typically start not with technology but with clarity: What role should operational data play in improving decision-making, operations, and long‑term competitiveness?

    Drawing on our customer experiences, a structured approach – the Industrial Data & AI Platform Kickstart developed at Nortal – has proven effective in helping organizations build this foundation in a practical, achievable way. It enables organizations to create a clear and actionable basis for their data and AI initiatives. Instead of focusing on tools or platforms, the goal is to establish three essentials:

    • A shared understanding of which data matters for your business and why,

    • An initial architecture that reflects both IT standards and operational realities, and

    • Clear next steps that move the work forward with conviction rather than complexity.

    Focused, time-boxed workshops are an effective way to advance this work. They strengthen participants’ understanding of industrial data and analytics, enabling them to evaluate and apply insights in practical contexts. A four‑step, systematic approach provides an efficient, practical starting point for modernizing the underlying data capabilities.

    #1 The first step typically establishes a shared understanding of the business context: what role operational data should play in improving decision-making and operations, and what success would look like.

    #2 The next step usually involves identifying and prioritizing the first 1–3 high-impact use cases where better insight would quickly influence behavior and outcomes. Examples include stabilizing production, optimizing energy usage, or providing operators with real-time visibility into anomalies. Co‑creation techniques and structured templates help capture requirements and identify recurring use case architecture patterns. These patterns help determine which foundational platform capabilities to prioritize to enable scaling beyond the initial use cases.
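    To illustrate what a structured prioritization template can boil down to, the following sketch scores hypothetical candidate use cases by expected impact, implementation effort, and data readiness. The use case names, weights, and scoring heuristic are assumptions for illustration only, not part of the Kickstart method itself.

```python
from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    impact: int       # 1 (low) .. 5 (high) expected business impact
    effort: int       # 1 (low) .. 5 (high) implementation effort
    data_ready: bool  # are the required data sources already accessible?

    @property
    def score(self) -> float:
        # Illustrative heuristic: favor high impact and low effort,
        # with a bonus when the needed data is already available.
        bonus = 1.5 if self.data_ready else 1.0
        return self.impact / self.effort * bonus

candidates = [
    UseCase("Stabilize production line 3", impact=5, effort=3, data_ready=True),
    UseCase("Optimize energy usage", impact=4, effort=4, data_ready=False),
    UseCase("Real-time anomaly visibility", impact=4, effort=2, data_ready=True),
]

# Shortlist the top 1-3 use cases for the kickstart phase.
shortlist = sorted(candidates, key=lambda u: u.score, reverse=True)[:3]
for uc in shortlist:
    print(f"{uc.score:.2f}  {uc.name}")
```

    In practice the scoring criteria are agreed in the workshop; the value of the template is that it makes the prioritization discussion explicit and comparable rather than intuitive.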

    #3 A common third step is to assess the existing OT/IT landscape, including data sources, integration mechanisms, network topology, and access patterns. Based on this analysis and the identified use cases, future requirements are defined and an initial target architecture for the edge-to-cloud platform and its integrations is designed. This architecture addresses security, scalability, data product delivery, and operational constraints, leveraging both edge and cloud capabilities where appropriate.
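    A landscape assessment of this kind often starts as a simple structured inventory of data sources. The minimal sketch below is hypothetical (source names, protocols, and the 1 Hz threshold are assumptions) and shows how such an inventory can already inform edge-versus-cloud placement decisions:

```python
from dataclasses import dataclass

@dataclass
class DataSource:
    name: str
    protocol: str        # e.g. "OPC UA", "MQTT", "REST"
    location: str        # "edge" or "cloud"
    update_rate_hz: float

sources = [
    DataSource("Line PLCs", "OPC UA", "edge", 10.0),
    DataSource("Energy meters", "MQTT", "edge", 1.0),
    DataSource("ERP orders", "REST", "cloud", 1 / 3600),
]

# High-frequency plant-floor sources are typically buffered and
# pre-aggregated at the edge before forwarding to the cloud, while
# slow-changing business data can be integrated cloud-side directly.
edge_buffered = [
    s.name for s in sources
    if s.location == "edge" and s.update_rate_hz >= 1.0
]
print(edge_buffered)
```

    Even a rough inventory like this surfaces the integration mechanisms and access patterns that the target architecture has to support.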

    #4 The final step focuses on defining the governance and operating model that aligns with the target architecture. This involves clarifying platform and data ownership, roles and responsibilities, decision-making and prioritization processes, and data quality and lifecycle management. The goal is to ensure the platform can be operated, governed, and evolved sustainably across organizational boundaries.
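    One way governance decisions such as ownership, consumers, and quality expectations become concrete is a per-data-product contract. The record below is a hypothetical sketch (field names, the product, and the SLA values are assumptions, not a Kickstart deliverable format):

```python
from dataclasses import dataclass

@dataclass
class DataProductContract:
    name: str
    owner: str                  # accountable team, not an individual
    consumers: list             # known downstream applications
    freshness_sla_minutes: int  # max allowed data age for consumers
    quality_checks: list        # named validations run on each delivery

contract = DataProductContract(
    name="line3-oee-metrics",
    owner="production-analytics-team",
    consumers=["shift-dashboard", "maintenance-planning"],
    freshness_sla_minutes=5,
    quality_checks=["no-null-timestamps", "oee-in-range-0-100"],
)
print(contract.owner)
```

    Writing such contracts down, in whatever form, is what makes ownership and lifecycle responsibilities enforceable across organizational boundaries.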

    The workshop outcomes are consolidated into a coherent starting package for execution. This includes the initial platform and integration architecture, governance and operating model, prioritized use cases, and a practical roadmap. The result is a clear, actionable foundation ready to inform a pilot implementation, scaling initiative, or RFP process.

    Shared principles, adaptable execution

    Across industries, organizations are increasingly seeking ways to clarify their data ambitions without committing to rigid solutions too early. Experience shows that a structured, step-by-step approach helps surface the right questions and guide conversations productively. While common templates and frameworks provide consistency, each case ultimately differs according to the organization’s context and evolving market needs.

    To move efficiently, up-to-date documentation of the organization’s data, governance, and integration setup is essential. Appointing a coordinator who understands the current state is crucial to steer the review process. Assembling the right mix of team members is also important: an executive sponsor, project manager, security specialist, data architect, engineers, data scientists, and domain experts make up the ideal “dream team.” However, if not all of these roles are available, success is still possible with a leaner team, supported by external expertise where needed.

    In many organizations, similar early‑phase efforts usually span several months, depending on alignment and available resources. With strong alignment, the duration can be reduced to as little as one month. Regardless of pace, the goal remains the same: a practical path to value, with clear timelines, resourcing, and a prioritized backlog. By producing material that supports both internal momentum and external implementation tendering, organizations are typically well-positioned to move forward.

    Enabling more consistent decisions, shift after shift

    Experience across industrial settings shows that clear structures help link data initiatives directly to daily operations. When development begins with genuine operational needs rather than technology choices, organizations are better positioned to avoid overdesign and fragmented tool landscapes.

    A systematic early-phase approach can help build greater confidence in decision-making. As the data foundation strengthens, teams increasingly rely on shared insights rather than assumptions. This shift often results in more predictable performance: downtime and scrap may decrease, energy use becomes easier to manage, and new use cases can be developed more efficiently through repeatable patterns. Meanwhile, leadership gains a clearer view of the scope, expected benefits, and the work’s progression.

    Establishing alignment among executives, architects, security, data teams, and data consumers early on reduces later rework and streamlines approvals. It also prepares organizations for the increasing regulatory emphasis on transparency and compliance.

    Ultimately, when reliable insights reach the right people at the right time, the countless micro‑decisions made each day become more consistent, informed, and safe. The goal is simply better decisions – shift after shift.

    Talk to our industry experts

    Tell us how we can help. Our experts will be in touch.