News Release Archives
Note that the releases are accurate at the time of publication but may be subject to change without notice.
TOKYO, February 17, 2016 - Mitsubishi Electric Corporation (TOKYO: 6503) announced today that it has developed a compact, small-memory AI that can be easily implemented in vehicle equipment, industrial robots and other machines by reducing the computational cost of inference, the process of identifying, recognizing and predicting unknown facts based on known facts. The technology will enable low-cost AI systems that perform high-level, high-speed inference in a highly secure environment. The compact AI is expected to be implemented in commercial products from around 2017.

A machine-learning algorithm known as deep learning can perform high-level inference, but it requires significant computation and memory because it employs a deep neural network. Mitsubishi Electric has developed a novel algorithm that uses a more efficient network structure and computational model to realize a more compact AI with the same inference performance as a conventional AI. For image recognition, for example, Mitsubishi Electric estimates that computational cost and memory requirements can be reduced by 90 percent.
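The release does not disclose how the reduction is achieved. As a rough illustration only, the Python sketch below assumes one common way of shrinking a convolutional network, replacing standard convolutions with depthwise-separable ones, and compares parameter counts for an invented stack of layers; this is not Mitsubishi Electric's actual method, and the layer sizes are hypothetical.

```python
# Hypothetical illustration of how a more parameter-efficient network structure
# can cut memory on the order of 90 percent. Not Mitsubishi Electric's method;
# the layer configuration below is invented for this example.

def conv_params(in_ch, out_ch, k):
    """Parameters in a standard k x k convolution (weights + biases)."""
    return in_ch * out_ch * k * k + out_ch

def separable_params(in_ch, out_ch, k):
    """Parameters in a depthwise-separable convolution (depthwise + pointwise)."""
    depthwise = in_ch * k * k + in_ch
    pointwise = in_ch * out_ch + out_ch
    return depthwise + pointwise

# Toy layer stack for an image-recognition network: (input channels, output channels, kernel size)
layers = [(3, 64, 3), (64, 128, 3), (128, 256, 3), (256, 512, 3)]

standard = sum(conv_params(i, o, k) for i, o, k in layers)
compact = sum(separable_params(i, o, k) for i, o, k in layers)

print(f"standard convolution parameters : {standard:,}")
print(f"separable convolution parameters: {compact:,}")
print(f"reduction                       : {100 * (1 - compact / standard):.1f}%")
```

Running the sketch prints a parameter reduction of roughly 88 percent for this toy configuration, which is only meant to show how structural choices translate into memory savings of the magnitude the release describes.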

This compactness means the AI can perform high-level inference even on embedded systems. On a vehicle system, for example, it could detect when a driver is distracted*, and on an industrial machine it could analyze the actions of factory workers. The new technology realizes AI systems at a much lower cost and with smaller server and network requirements than conventional systems, which rely on a server to gather enormous amounts of data. It also establishes a highly secure computational environment by eliminating the need to upload confidential information to servers. In addition, because the compact AI runs on an embedded system without a network connection, the inference process can be optimized for each system according to its environment.

The AI market is expected to expand from US$31 billion in 2015 to US$200 billion in 2020, according to a study by Ernst & Young Institute Co., Ltd., and a compact AI that provides greater security and speed at lower cost will be well positioned to meet that demand.


* | "Mitsubishi Electric Develops Machine-learning Technology That Detects Cognitive Distractions in Drivers"Oct. 27, 2015 |

Inquiry
Customer Inquiries
Mitsubishi Electric Corporation