
MRO TECHNOLOGIES Cognitive insights to boost fleet availability

Mar 24, 2021

The airline industry was witnessing steady growth until COVID-19 brought it to a halt. As the industry restarts, it is important to overcome roadblocks effectively. This white paper from Dassault Systèmes, whose 3DEXPERIENCE® platform connects the dots to enable agile production rates alongside quality goals, investigates the beneficial impact of data analytics used to correct and optimise maintenance quality, safety and performance.

Over the past ten years, there was good news for the airline industry: Until 2019, airlines grew steadily while becoming consistently profitable, significantly safer and more efficient at managing capacity & fuel consumption. Airlines made money every year, with operating margins above 5%, as reflected in these key indicators:

  • Passenger traffic expanded by almost 80%, growing from 2.5 billion passengers in 2009 to 4.4 billion in 2018, as air travel became more affordable and accessible

  • Passenger load factors rose by 5.8 percentage points, from 76.1% to 81.9%, with airlines becoming better at managing capacity to meet demand growth

  • Flights increased by 47%, but average fuel consumption per available seat kilometre dropped by 20.9% thanks to fuel efficiency programmes and more fuel-efficient, new-generation aircraft

Nevertheless, substantial operational challenges remain. They cost airlines approximately $75 billion per annum, or roughly 1.5 times the industry's operating profit in 2018. These challenges hold airlines back from higher profits and safer, more punctual operations, and they pose a major threat in the event of a future economic downturn. In recent years they have touched the full spectrum of airline operations: quality issues causing unplanned maintenance, millions of hours wasted by mechanics on unproductive tasks, high costs from safety incidents, and fleet availability and on-time performance stuck a little above 80%. These inefficiencies will only intensify due to the current COVID-19 pandemic and its aftermath, once the industry begins growing again.

Information roadblocks to effective quality management

Information silos

Quality-related data lives in many diverse systems. Of course, it can be found in dedicated quality and compliance systems (which often track only limited, easily quantifiable costs, like scrap, rework and warranty claims). But important quality information can also be found in broader design, engineering and manufacturing systems, such as design files, production databases, field engineer notebooks and supplier spreadsheets. It can also be found in machine log data, Customer Relationship Management (CRM) databases, social media posts & commentary and many other sources.

Given this diversity, knowing where pertinent quality data lives and achieving a unified view of it across systems are formidable challenges. Currently, most quality control strategies rely on a single quality database or data aggregated from only a few select sources. This leaves important quality data isolated within discrete organisational units and information systems. And the situation is even worse for analytics. Quality-related analytics are run on inherited data by teams with different responsibilities working in different locations around the globe.

Consequently, the analytics (if they exist at all) can only deliver partial, often outdated, even misleading insights. And even when analytics are performed on consolidated data, the analysis is often bespoke in nature and too rarely industrialised (or productised) for enterprise-wide integration and reuse. (For example, it is rather common for data scientists to spend most of their time working on ad hoc exploratory projects for internal customers.)

Weak-signal intelligence: Hidden meanings & relationships in data

In addition to the challenges of silos, some of the information captured – like a comment in a field notebook, a question in an email message or a temperature reading in a sensor log – provides only weak-signal intelligence about quality issues. The signal is ‘weak’ because its meaning can only be understood once connected with other data and a pattern revealed. For example, how could a supervisor know that a single defect – a specific circuit that overheats – is behind multiple, scattered reports of ‘bridges’ or ‘capacitors’ or ‘switches’ ‘overheating’ or ‘smoking’ or ‘burning’ or ‘sparking’?

The supervisor could not know unless semantic classifications were applied to the textual data & the notes cross-referenced with other data, like location, customer, product or part ID numbers. Sometimes, however, signals are buried in data sets so large that the conventional tools normally used to detect such connections or patterns simply cannot be used. This is a challenge that must be overcome in order to detect and address issues as early as possible, ideally through design changes, preventive maintenance or revised usage guidelines, rather than through warranty claims, recalls and lawsuits. Fortunately, it is now easier to access & use dedicated tools designed for large, heterogeneous data environments to perform advanced analytics (possibly using Machine Learning (ML)), so that silos of information can be bridged and weak-signal data transformed into clear and timely intelligence.
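The idea of connecting scattered weak signals can be sketched in a few lines of Python. This is a toy illustration with invented reports and a hand-built synonym map, not how any particular product implements semantic classification:

```python
# Toy sketch: grouping scattered free-text reports by normalising symptom
# vocabulary, so that many weak signals surface as one cluster.
from collections import defaultdict

# Map synonym variants to one canonical symptom class (illustrative only).
SYMPTOM_CLASSES = {
    "overheating": "thermal", "smoking": "thermal", "burning": "thermal",
    "sparking": "thermal", "leaking": "fluid", "dripping": "fluid",
}

def classify(report: str):
    """Return the canonical symptom class mentioned in a report, if any."""
    for word in report.lower().split():
        cls = SYMPTOM_CLASSES.get(word.strip(".,!"))
        if cls:
            return cls
    return None

reports = [
    ("bridge B-17", "unit smoking after 20 min"),
    ("capacitor C4", "found burning smell near housing"),
    ("switch S2", "contacts sparking intermittently"),
    ("pump P1", "seal dripping slightly"),
]

clusters = defaultdict(list)
for part, text in reports:
    cls = classify(text)
    if cls:
        clusters[cls].append(part)

# Three seemingly unrelated parts now share one 'thermal' signal.
print(clusters["thermal"])
```

A real system would use trained text classifiers rather than a fixed synonym table, but the principle is the same: once variants map to one class, a single underlying defect becomes visible across reports.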

Cognitive insight engines

In this context, analytics systems known as ‘insight engines’ use big data processing tools, search engine indexing and advanced analytics to enable users to collect, organise, enhance, explore and analyse quality-related data across large and diverse data collections. Such insight engines are designed to augment the cognitive processes that human beings follow when exploring or analysing information. Some insight engines have functions specifically tailored for issue or event detection and investigation, while others are more geared to semantic search and discovery. What they have in common, though, is that all of them can provide unified access to diverse internal and external data collections; those that can aggregate unstructured content (like customer complaints in online forums) and structured or semi-structured data (like database records or sensor log files) are of greatest value to quality and reliability analytics.

To better understand how insight engines work and what kinds of analytic processes can be used to address quality issues, let’s look at the EXALEAD insight engine and its use of advanced analytics to deliver quality and reliability intelligence.

360 operations excellence – EXALEAD asset quality intelligence solution

Adapted specifically for the needs of the aerospace & defence industry, 360 operations excellence includes the EXALEAD Asset Quality Intelligence (AQI) solution designed to help companies achieve four primary objectives:

  1. Detecting and understanding current or potential quality issues more rapidly and accurately.

  2. Supporting engineers in correcting existing problems.

  3. Helping quality managers and engineers develop & deploy preventive maintenance measures to avoid potential problems.

  4. Minimising future quality issues by providing design, engineering and manufacturing teams (including project and programme managers) with lessons-learned intelligence.

To fulfil these objectives, AQI consolidates data from all identified sources of quality-related information. It uses ML to mine this important information and reveal potential similarities in quality issues. Menus and graphs help users refine search and analytics options in order to investigate issues and causes. Once issues are analysed and the right action is determined, the solution enables these actions to be integrated into a task management framework for rapid resolution and full traceability.

Step one: Collect data

As an insight engine, AQI uses advanced search engine technology to collect and index a wide variety of internal sources, like databases & data lakes, and external sources, like websites and open data repositories. This includes data from the virtual world of digital design and simulation and real-world data from manufacturing, usage and maintenance activities. The data can be free-form, unstructured content, like service notes, consumer forum comments, company emails, 2D & 3D drawings, call centre recordings and CRM notes. Or it can be semi-structured data, like IoT sensor log files and warranty claims databases, or highly structured data, like that managed in Product Lifecycle Management (PLM), Enterprise Resource Planning (ERP) and Manufacturing Execution Systems (MES) and more.
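As a toy illustration of the indexing idea (not the EXALEAD implementation), a minimal inverted index can unify keyword search across documents that originate in different systems; the document ids and texts below are invented:

```python
# Minimal sketch: an inverted index that unifies search across
# heterogeneous quality-related documents from different source systems.
from collections import defaultdict

documents = {
    "service_note_041": "pump seal replaced after overheating report",
    "forum_post_9921":  "my unit keeps overheating during climb",
    "sensor_log_ab3":   "temp 118C threshold exceeded pump bay",
}

# Build the index: each token points at every document containing it.
index = defaultdict(set)
for doc_id, text in documents.items():
    for token in text.lower().split():
        index[token].add(doc_id)

def search(term: str):
    """Return the ids of every document mentioning the term."""
    return index.get(term.lower(), set())

# A service note and a forum post surface together despite living
# in entirely different systems.
print(sorted(search("overheating")))
```

Production insight engines add connectors, entity extraction and relevance ranking on top of this basic structure, but unified retrieval across silos starts with exactly this kind of shared index.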

Step two: Pre-process data

In order to facilitate data retrieval and analysis, AQI uses ML algorithms to pre-process the data. During this phase, raw data is converted into clean data suitable for analysis. The tasks executed include data cleansing (e.g., checking data validity and converting formats), possibly replacing missing values with estimates, and baseline normalisation (e.g., regularising text cases, measurement units and ranges). Algorithms are also used to identify and remove irrelevant or redundant attributes that could impact the accuracy of the analyses.

Step three: Investigate issues

After initial processing, ML techniques are used to mine the data for hidden relationships, patterns, trends & anomalies and to reveal these insights through automatically generated graphs & charts.

These dynamic visualisations can be used to zoom in on the relationships between objects, events, people, places and documents – greatly facilitating an investigator’s ability to detect and understand significant issues. Investigators can also refine the options and parameters used by the AQI algorithms to create personalised views of the information. This facilitates individual understanding while still enabling collaborative teams to access and work from a shared result set (a ‘single version of the truth’). Going a step further, users also have the option to execute custom or off-the-shelf algorithms. This is enabled via the AQI solution’s built-in ML studio, which provides an interface for developing or importing custom algorithms to complement the platform’s native algorithms and search-based techniques for rendering, contextualising and exploring data.

No advanced data science skills are required, however, to use the AQI solution. It makes the work of data scientists easier and more effective but is designed to enhance any user’s ability to automatically:

  • Detect links or similarities between incidents (clustering)

  • Identify trends

  • Reveal abnormal (anomalous) behaviour

  • Contextualise discrete pieces of information

  • Analyse incident causes

  • Make predictions about future product behaviour

  • Recommend corrective or preventive actions

  • Measure the real or anticipated impact of issues

In addition, the workflows developed for the action items above can be shared and reused, boosting collaboration and enabling continuous improvement in analytic techniques and data processing workflows.
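One of the capabilities listed above, revealing abnormal behaviour, can be illustrated with a very simple statistical rule: flag any reading more than two standard deviations from the mean. The sensor values are invented, and real anomaly detection would typically use more robust methods:

```python
# Illustrative anomaly detection: flag sensor readings more than two
# sample standard deviations from the mean of the series.
import statistics

readings = [71.2, 70.8, 71.5, 70.9, 71.1, 88.4, 71.0]  # one abnormal spike

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)

anomalies = [x for x in readings if abs(x - mean) > 2 * stdev]
print(anomalies)
```

Even this crude rule isolates the spike; in practice such flags would feed the visualisations and investigation workflows described above rather than stand alone.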

Step four: Ensure digital continuity

This continuous improvement in workflows, collaboration and data models is key to enabling continuous improvement in product quality and asset performance, outcomes the Dassault Systèmes 3DEXPERIENCE platform is designed to deliver.

Thanks to its integration into the platform, the EXALEAD AQI solution enables users to access essential governance and project management tools from ENOVIA seamlessly, thus better supporting the execution and monitoring of their analytics workflows. The solution also enriches the digital replica – the 3DEXPERIENCE twin – with real-world information from each physical product in operation, providing a ‘single source of truth’ referential maintained throughout the respective product’s overall lifecycle. As a complementary tool for understanding and improving quality, AQI is of essential value in this virtual environment and an important enabler of full digital continuity for products and assets.

The high rewards of continuous quality improvement

Quality issues cost manufacturers enormous sums of money every year. Sometimes, they even cost them their business. But there is hope for more effective management of quality risks and costs thanks to advances in big data management technologies and advanced analytics, including Artificial Intelligence and ML.

What’s more, these technologies, as represented by the EXALEAD Asset Quality Intelligence solution, have the potential to transform quality management from a simple risk mitigation activity into a strategic tool for driving operational excellence and product innovation. The usage and performance data and insights can also be channelled to service partners to optimise their support offerings, or shared with distributors so they can make timely offers of maintenance supplies and pitch product upgrades or replacements at an opportune moment. As the body of insights grows, new and better approaches to design and value chain management can be developed, and inventory levels, service level agreements and service offerings can be optimised for parts and brands.

Courtesy: Dassault Systèmes

Image Gallery

  • View and anticipate maintenance

  • Refine searches to identify and resolve issues

  • Benefits for manufacturers
