Toshiba Develops AI for Automatically Correcting Time Lag
Among Time-Series Data from Multiple Sensors
-Achieving higher accuracy in anomaly prediction/detection and motion analysis for infrastructure or manufacturing equipment by identifying the timing of changes with less than one-tenth the error of conventional methods-

June 2, 2020
Toshiba Corporation

Tokyo—Toshiba Corporation (TOKYO: 6502) has developed “Lag-aware Multivariate Time-series Segmentation” (LAMTSS)(Note 1), an AI that improves the accuracy of anomaly prediction/detection and motion analysis for infrastructure and manufacturing equipment. This technology automatically corrects time lag among multiple time-series datasets obtained from sensors installed in equipment or devices. Testing confirmed that the timing of changes in data due to anomalies and other problems can be detected with less than one-tenth the error of conventional methods(Note 2).
Up to now, time lag has required manual correction, but Toshiba's new technology automatically corrects time lag between multiple datasets, even when many sensors are installed in large-scale infrastructure or manufacturing systems. In addition to enabling real-time anomaly prediction/detection, this technology is expected to contribute to future productivity improvements by helping identify the root causes of anomalies and through application to motion analysis.
The technical paper on this technology was accepted for the prestigious SIAM International Conference on Data Mining (SDM) 2020(Note 3), a leading conference in the field of AI and data mining.

The increased adoption of Internet of Things (IoT) technologies has made it possible to collect huge amounts of multivariate time-series data recording moment-by-moment changes detected by multiple sensors attached to infrastructure and manufacturing equipment. There are three primary methods for anomaly prediction and detection using time-series data: outlier detection, which flags data points that do not normally occur; anomaly detection, which identifies partial time-series in which an anomaly occurs; and change-point detection, which identifies points where time-series data patterns change rapidly. Anomaly detection and change-point detection involve temporal changes, making them more difficult to handle. However, these methods are indispensable for improving detection accuracy, and in recent years increased attention has been given to analyses that incorporate elements of temporal change. LAMTSS is a technology that focuses on change-point detection, allowing prediction and detection of anomalies in equipment and devices requiring high accuracy, as well as extraction of device operation patterns.
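To make the change-point detection idea concrete, the following is a minimal illustrative sketch, not Toshiba's LAMTSS algorithm: it finds the single split point in a one-dimensional series that minimizes the within-segment squared error, which locates a jump in the mean.

```python
# Illustrative single change-point detection by least-squares segmentation.
# This is a textbook-style sketch, not the method described in the release.
def segment_cost(xs):
    """Sum of squared deviations from the segment mean."""
    if not xs:
        return 0.0
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def find_change_point(series):
    """Return the split index t minimizing total within-segment cost."""
    best_t, best_cost = None, float("inf")
    for t in range(1, len(series)):
        cost = segment_cost(series[:t]) + segment_cost(series[t:])
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t

# A series whose mean jumps from 0 to 5 at index 10:
series = [0.0] * 10 + [5.0] * 10
print(find_change_point(series))  # -> 10
```

Real systems extend this to many change points and many sensor channels, which is exactly where the time-lag problem discussed below arises.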

One problem with change-point detection is that time lag can arise between time-series data from multiple sensors. In pressure and speed control for infrastructure equipment, for example, it takes some time for sensor data to reflect measurement and control results, causing a time delay in behavior between the datasets. Vital components of power plants and factories can contain thousands of sensors for performing maintenance and management work while monitoring sensor values. When an anomaly occurs in such a facility, workers investigate its cause by using sensor values to understand what happened and when. If the detection times of thousands of sensors are not synchronized, however, time lags between them can make it difficult to investigate the cause of an anomaly and to devise countermeasures. Time lag can also occur in motion capture used to improve worker productivity at manufacturing sites, for example, when right- and left-hand movements are not perfectly synchronized. When there is time lag between data, it becomes difficult to identify the exact times when changes occur in equipment or devices, making it difficult to correctly identify the timing of anomalies. For this reason, change-point detection technology is needed that can account for time lag between data from multiple sensors from the start.
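The lag problem described above can be illustrated with a small sketch (hypothetical data and a simple cross-correlation-style score, not Toshiba's method): two channels record the same event, but one is delayed, and the integer lag can be estimated by maximizing the overlap score.

```python
# Hypothetical illustration of estimating a fixed integer lag between two
# sensor channels by maximizing a cross-correlation-style score.
def best_lag(a, b, max_lag):
    """Estimate the lag of b relative to a over [-max_lag, max_lag]."""
    def score(lag):
        pairs = [(a[i], b[i + lag]) for i in range(len(a))
                 if 0 <= i + lag < len(b)]
        return sum(x * y for x, y in pairs)
    return max(range(-max_lag, max_lag + 1), key=score)

sensor_a = [0, 0, 0, 1, 1, 1, 0, 0, 0, 0]
sensor_b = [0, 0, 0, 0, 0, 1, 1, 1, 0, 0]  # same event, delayed by 2
print(best_lag(sensor_a, sensor_b, max_lag=3))  # -> 2
```

A single global shift like this is easy to correct; the harder case the release addresses is when the lag varies locally across many channels, which is why a warping-based approach is needed.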

Toshiba has developed LAMTSS as a technology for detecting change times in equipment and devices from multivariate time-series data while accounting for time lags between multiple time-series datasets (Fig. 1). To tackle this problem, LAMTSS uses a dynamic time warping method(Note 4) based on dynamic programming(Note 5) to automatically correct time lag by aligning the shifted peaks of small local waveforms. Performance testing using artificially generated data showed that this method can detect change points with less than one-tenth the error of conventional technologies(Note 6).
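Dynamic time warping itself is a standard technique, and the dynamic-programming recurrence behind it can be sketched as follows. This is only the classic DTW distance for illustration; LAMTSS builds on DTW, but its actual algorithm is the one described in the SDM20 paper.

```python
# Classic dynamic time warping (DTW) via dynamic programming.
# D[i][j] = local cost |a[i-1] - b[j-1]| plus the cheapest of the three
# predecessor cells, so shifted-but-similar waveforms align cheaply.
def dtw_distance(a, b):
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # skip a sample of a
                                 D[i][j - 1],      # skip a sample of b
                                 D[i - 1][j - 1])  # match the samples
    return D[n][m]

# The shifted peak aligns perfectly under warping, so the DTW distance
# is 0 even though a pointwise comparison would report a difference:
a = [0, 0, 1, 0, 0]
b = [0, 1, 0, 0, 0]
print(dtw_distance(a, b))  # -> 0.0
```

The DP table makes the alignment cost O(nm), which is why warping-based correction can scale to long sensor recordings.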

Figure 1: Overview of LAMTSS.

This technology allows more accurate anomaly detection by incorporating time lags associated with measurements and control, and it will contribute to reduced system downtime due to anomalies, especially in the fields of infrastructure and manufacturing.

Toward its goal of becoming a cyber-physical systems technology company, Toshiba has been promoting research and development of AI applications that solve problems in the manufacturing and infrastructure fields by utilizing time-series waveform data, which advances in IoT technology now allow to be collected in large quantities from various sensors(Note 7).

Toshiba will continue to improve the accuracy of LAMTSS and verify its effectiveness on various time-series data in order to develop anomaly detection technologies for the infrastructure and manufacturing fields.

(Note 1)
Shigeru Maya et al. Lag-Aware Multivariate Time-Series Segmentation, SIAM International Conference on Data Mining (SDM20), pp. 622-630, 2020/5. https://epubs.siam.org/doi/pdf/10.1137/1.9781611976236.70
(Note 2)
According to in-house investigations.
(Note 3)
Society for Industrial and Applied Mathematics (SIAM) International Conference on Data Mining (SDM).
https://www.siam.org/conferences/cm/conference/sdm20
(Note 4)
An existing technology for partially expanding and contracting time-series data.
(Note 5)
An optimization method that reduces computation times by decomposing problems into smaller, equivalent problems.
(Note 6)
Toeplitz Inverse Covariance-Based Clustering of Multivariate Time Series Data (Hallac et al., ACM SIGKDD KDD 2017).
(Note 7)
http://www.toshiba.co.jp/rdc/rd/detail_e/e1812_02.html
http://www.toshiba.co.jp/rdc/rd/detail_e/e1812_01.html

Media Inquiry

Itaru Kobayashi, Takashi Ebina, Ayana Yamamoto
Toshiba Corporation Media Relations Group: +81 3 3457 2100