Transformer Fault Diagnosis Method Based on Multi-Level Sparse MobileNetV2
Author:
Affiliation:

Fund Project:

    Abstract:

    [Objective] Deep learning models are widely used in transformer fault diagnosis because they can learn underlying data patterns and construct hierarchical feature representations. However, their massive number of parameters, complex network topology, and high computational and storage costs limit their practical application in fault diagnosis of power transformers. [Methods] To address these issues, this study proposed a transformer fault diagnosis method based on a multi-level sparse MobileNetV2. First, spindle-shaped and hourglass-shaped blocks were used to compactly improve the inverted residual blocks of the MobileNetV2 model, reducing the number of parameters and the computational complexity at the level of the model structure itself and achieving preliminary model sparsity. Second, a group-level pruning method based on a dependency graph was proposed. The coupled parameters in the model were grouped, and a group-level pruning optimization strategy based on the L2 norm was designed to perform sparse training and pruning fine-tuning. This process removed redundant structures and parameters, further reducing the number of parameters and the computational complexity and enhancing model sparsity. Finally, an 8-bit symmetric uniform quantization and quantization-aware training method was proposed. The 32-bit high-resolution floating-point parameters in the model were quantized into 8-bit low-resolution integer parameters, and inference was then performed on the quantized model, further reducing the computational complexity and achieving multi-level model sparsity. [Results] Numerical experiments and performance evaluations showed that, compared with the original MobileNetV2 model, the improved multi-level sparse model achieved a fault identification accuracy of 95.2% while reducing the number of parameters, the computational complexity, and the model size by approximately 73.5%, 96.9%, and 68.8%, respectively. Moreover, the inference time for identifying 1 000 images was only 0.66 seconds. [Conclusion] The method proposed in this study effectively combines three individual sparsification techniques: compact model improvement, model pruning, and parameter quantization. It achieves multi-level sparsity of deep learning models while maintaining high accuracy, effectively addressing the over-parameterization caused by limited sample data in power transformer fault diagnosis and eliminating its adverse effects.
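    The effect of the compact-block improvement can be illustrated with a rough parameter count. The sketch below is only an illustration under assumed definitions: it counts the convolution weights of a MobileNetV2-style inverted residual block (1×1 expand, 3×3 depthwise, 1×1 project) and shows how contracting the bottleneck width, as an hourglass-shaped block does, cuts the parameter count; the paper's exact spindle- and hourglass-shaped block designs are not reproduced here.

```python
def inverted_residual_params(c_in, c_out, t):
    """Weight count (ignoring BN and bias) of an inverted-residual block:
    1x1 expand -> 3x3 depthwise -> 1x1 project.

    t is the expansion factor. t > 1 gives the standard MobileNetV2
    spindle-like widening; t < 1 contracts the middle channels into an
    hourglass shape, shrinking both 1x1 convolutions.
    """
    c_mid = int(round(c_in * t))   # bottleneck width
    expand = c_in * c_mid          # 1x1 pointwise expansion conv
    depthwise = 9 * c_mid          # 3x3 depthwise conv (9 weights/channel)
    project = c_mid * c_out        # 1x1 pointwise projection conv
    return expand + depthwise + project

baseline = inverted_residual_params(32, 32, t=6)     # MobileNetV2 default
hourglass = inverted_residual_params(32, 32, t=0.5)  # contracted variant
```

    With 32 input and output channels, the contracted block needs roughly a twelfth of the weights of the standard block, which is the structural source of the "preliminary sparsity" described above.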
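    The group-level pruning step ranks coupled parameter groups by their L2 norm and removes the weakest groups. The following minimal sketch assumes, for illustration, that each output channel of a weight matrix forms one group; the paper's dependency-graph analysis, which decides how parameters across layers are coupled into groups, is not modeled here.

```python
import numpy as np

def group_l2_prune_mask(weight, prune_ratio=0.5):
    """Rank output-channel groups by L2 norm and mask the weakest ones.

    `weight` has shape (out_channels, in_channels); each row is treated
    as one coupled parameter group. Returns a boolean mask where True
    means the group is kept.
    """
    norms = np.linalg.norm(weight.reshape(weight.shape[0], -1), axis=1)
    n_prune = int(len(norms) * prune_ratio)
    # Indices of the groups with the smallest L2 norms are pruned
    prune_idx = np.argsort(norms)[:n_prune]
    mask = np.ones(weight.shape[0], dtype=bool)
    mask[prune_idx] = False
    return mask

w = np.array([[3.0, 4.0],    # L2 norm 5.0  -> kept
              [0.1, 0.1],    # L2 norm 0.14 -> pruned
              [1.0, 0.0],    # L2 norm 1.0  -> kept
              [0.0, 0.2]])   # L2 norm 0.2  -> pruned
keep = group_l2_prune_mask(w, prune_ratio=0.5)
```

    In practice such a mask is applied during sparse training, and the pruned channels are physically removed before fine-tuning, which is what reduces both the parameter count and the computational complexity.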
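    The final sparsification stage maps 32-bit floating-point weights onto an 8-bit integer grid. A minimal sketch of per-tensor 8-bit symmetric uniform quantization follows; the paper's calibration and quantization-aware training details (e.g. how the scale is learned during training) may differ from this assumption.

```python
import numpy as np

def symmetric_quantize(w, num_bits=8):
    """Symmetric uniform quantization of a float tensor to signed integers.

    Maps values in [-max|w|, +max|w|] onto the integer grid
    [-(2^(b-1)-1), +(2^(b-1)-1)], i.e. [-127, 127] for 8 bits,
    using a single scale per tensor.
    """
    qmax = 2 ** (num_bits - 1) - 1          # 127 for 8-bit
    scale = np.max(np.abs(w)) / qmax        # per-tensor scale factor
    q = np.clip(np.round(w / scale), -qmax, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float tensor from the integer codes."""
    return q.astype(np.float32) * scale

w = np.array([0.31, -1.27, 0.05, 0.9], dtype=np.float32)
q, s = symmetric_quantize(w)
w_hat = dequantize(q, s)
```

    The quantization error of this scheme is bounded by half the scale step, which is why quantization-aware training, where the network learns under simulated quantization, can keep accuracy close to the full-precision model while inference runs on 8-bit integers.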


LIU Hang, SHI Zhiyu, LIU Zhijian, LUO Linglin, LI Ming, NIU Ben. Transformer Fault Diagnosis Method Based on Multi-Level Sparse MobileNetV2[J]. Electric Machines & Control Application,2025,52(5):513-526.

History
  • Received: December 18, 2024
  • Revised: February 26, 2025
  • Accepted:
  • Online: May 27, 2025
  • Published: May 10, 2025