- Reference 1:
X. Li et al., "Temporal Modeling for Power Converters With Physics-in-Architecture Recurrent Neural Network," in IEEE Transactions on Industrial Electronics, vol. 71, no. 11, pp. 14111-14123, Nov. 2024.
- Reference 2:
X. Li, F. Lin, X. Zhang, H. Ma and F. Blaabjerg, "Data-Light Physics-Informed Modeling for the Modulation Optimization of a Dual-Active-Bridge Converter," in IEEE Transactions on Power Electronics, vol. 39, no. 7, pp. 8770-8785, July 2024.
- Reference 3:
F. Lin, X. Li, X. Zhang and H. Ma, "STAR: One-Stop Optimization for Dual-Active-Bridge Converter With Robustness to Operational Diversity," in IEEE Journal of Emerging and Selected Topics in Power Electronics, vol. 12, no. 3, pp. 2758-2773, June 2024.
- Reference 4:
X. Li et al., "A Generic Modeling Approach for Dual-Active-Bridge Converter Family via Topology Transferrable Networks," in IEEE Transactions on Industrial Electronics.
PANN, the physics-in-architecture neural network, is a physics-informed neural network designed specifically for modeling power electronics systems. It uses a physics-crafted recurrent structure to embed the discretized state-space circuit equations directly into the network architecture, introducing inductive biases that make the invariant physics underlying the data explicit. The neural architecture of PANN is shown in Fig. 1.
Fig. 1. Structure of PANN.
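To make the architecture concrete, below is a minimal PyTorch sketch of a PANN cell for a buck converter (one of the repository's notebook examples). The class name `BuckPANNCell`, the forward-Euler discretization, and the initial values are illustrative assumptions rather than the repository's exact implementation; the key idea is that the only learnable parameters are the physical circuit parameters L, C, and R.

```python
import torch
import torch.nn as nn

class BuckPANNCell(nn.Module):
    """Illustrative PANN cell: one recurrent step applies the
    forward-Euler discretized state-space equations of a buck
    converter. The only learnable parameters are physical ones."""
    def __init__(self, dt, L0=1e-4, C0=1e-4, R0=5.0):
        super().__init__()
        self.dt = dt                                 # sampling step
        self.L = nn.Parameter(torch.tensor(L0))      # inductance
        self.C = nn.Parameter(torch.tensor(C0))      # capacitance
        self.R = nn.Parameter(torch.tensor(R0))      # load resistance

    def forward(self, x, u):
        # x = (iL, vC): state at step k; u: switched-node voltage at step k
        iL, vC = x[..., 0], x[..., 1]
        iL_next = iL + self.dt * (u - vC) / self.L            # L diL/dt = u - vC
        vC_next = vC + self.dt * (iL - vC / self.R) / self.C  # C dvC/dt = iL - vC/R
        return torch.stack((iL_next, vC_next), dim=-1)
```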
PANN inference is conducted by recurrently predicting the next state variables from the precomputed input variables and the state variables inferred in the previous iteration. The PANN inference unfolded over time is shown in Fig. 2.
Fig. 2. PANN inference unfolded over time.
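The unfolded inference in Fig. 2 is then just a loop that feeds each predicted state back as the input state of the next step. A sketch using the hypothetical `BuckPANNCell` above:

```python
def pann_rollout(cell, x0, u_seq):
    """Recurrent PANN inference: the state inferred at step k
    becomes the input state at step k+1 (cf. Fig. 2)."""
    x, states = x0, []
    for u in u_seq:               # u_seq: precomputed input variables
        x = cell(x, u)
        states.append(x)
    return torch.stack(states)

# usage sketch: a 50 % duty square-wave input over 200 steps
cell = BuckPANNCell(dt=1e-6)
u_seq = 48.0 * (torch.arange(200) % 20 < 10).float()
traj = pann_rollout(cell, torch.zeros(2), u_seq)
```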
The PANN model is explainable in power electronics terms, revealing circuit physical principles, switching behaviors, commutation loops, and more. The power electronics insights discovered by PANN for an exemplary non-resonant dual-active-bridge (DAB) converter are shown in Fig. 3.
Fig. 3. PANN's physical explainability.
The training workflow of PANN is shown in Fig. 4, and one training epoch for PANN is shown in Fig. 5.
Fig. 4. PANN's training workflow.
Fig. 5. One training epoch for PANN.
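As a sketch of one such epoch (names and hyperparameters are illustrative): roll the model out over the measured horizon, compute a waveform loss against the measured state trajectories, and backpropagate through the unrolled steps to update the learnable circuit parameters.

```python
def train_epoch(cell, x0, u_seq, x_meas, opt):
    """One PANN training epoch on a measured waveform segment."""
    opt.zero_grad()
    x_pred = pann_rollout(cell, x0, u_seq)      # unfold over time
    loss = ((x_pred - x_meas) ** 2).mean()      # waveform MSE
    loss.backward()                             # backprop through time
    opt.step()                                  # update L, C, R, ...
    return loss.item()

# note: the physical parameters span orders of magnitude, so in
# practice they are normalized or given per-parameter learning rates
opt = torch.optim.Adam(cell.parameters(), lr=1e-6)
```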
PANN is data-light: because it embeds circuit physical principles directly into its neural architecture, it ensures stringent physical consistency and requires only a few data samples for training, reducing data requirements by roughly three orders of magnitude compared with conventional data-driven models. In theory, PANN needs a dataset containing no fewer time-series points than the number of converter parameters to be identified. Additionally, PANN resembles a single-layer recurrent neural network with only a handful of learnable parameters, making it lightweight as well. Both advantages are illustrated in Fig. 6.
Fig. 6. Data-light and lightweight advantages of PANN.
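The lightweight claim is easy to check: the sketch cell above has three learnable parameters, versus roughly a thousand for even a small conventional RNN.

```python
pann_params = sum(p.numel() for p in cell.parameters())
rnn = nn.RNN(input_size=1, hidden_size=32)
rnn_params = sum(p.numel() for p in rnn.parameters())
print(pann_params, rnn_params)   # 3 vs. 1120
```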
PANN is flexible in four main aspects: operating conditions, modulation strategies, performance metrics, and circuit parameters and topological variants, as summarized in Fig. 7.
Fig. 7. Flexibility of PANN.
The steps shown in Fig. 8 can be followed to customize PANN for your specific application or converter of interest; Fig. 8 showcases the modeling of non-resonant DAB converters.
Fig. 8. Case study: Design PANN for DAB converters.
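For the non-resonant DAB of Fig. 8, the state variable is the leakage-inductor current and the precomputed inputs are the two bridge voltages set by the modulation. A minimal sketch along the same lines as above (the class name and initial values are assumptions):

```python
import torch
import torch.nn as nn

class DABPANNCell(nn.Module):
    """Illustrative PANN cell for a non-resonant DAB converter:
    discretized inductor equation L diL/dt = vp - n*vs."""
    def __init__(self, dt, n, L0=6e-5):
        super().__init__()
        self.dt, self.n = dt, n                   # step size, turns ratio
        self.L = nn.Parameter(torch.tensor(L0))   # learnable leakage inductance

    def forward(self, iL, vp, vs):
        # vp, vs: primary/secondary bridge voltages from the modulation
        return iL + self.dt * (vp - self.n * vs) / self.L
```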
A comprehensive PANN tutorial is provided in PANN_Tutorial.pdf, under the theme "The Next Generation of AI for Power Electronics: Explainable, Light, and Flexible". The slides were prepared by Xinze Li and Fanfan Lin.
It covers an introduction to AI applications in power electronics, a brief discussion of physics-informed machine learning methods, PANN inference and its explainability, PANN training and its light characteristics, and PANN's flexibility across diverse conditions and topologies (out-of-domain transfer capability).
To deploy PE-GPT on your PC, first set up your API call to OpenAI models; see core/llm/llm.py for details.
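For example, with the official openai Python client the key is typically read from the environment (the exact wiring in this repository lives in core/llm/llm.py):

```python
import os
from openai import OpenAI

# set OPENAI_API_KEY in your environment before launching PE-GPT
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
```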
If you want to interact with the Plecs software to simulate the designed modulation for DAB converters, enable the XML-RPC interface in the Plecs settings.
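Once the interface is enabled, Plecs can be driven over XML-RPC from Python; the sketch below assumes the default port 1080 and a hypothetical model file name:

```python
import xmlrpc.client

# Plecs serves XML-RPC on http://localhost:1080/RPC2 by default
plecs = xmlrpc.client.Server("http://localhost:1080/RPC2")
plecs.plecs.load("path/to/DAB_model.plecs")     # hypothetical model path
results = plecs.plecs.simulate("DAB_model")     # returns time and signal traces
```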
# clone the github repository
git clone https://github.com/XinzeLee/PANN
# change the current working directory
cd PANN
# install all required dependencies
pip install -r requirements.txt
# Now you can import the customized PANN models.
# It is recommended to go through the notebooks before implementing your own.
Although it is strongly recommended to try these notebooks on your local machine (the graphical plots are easier to view and interact with), we have also provided a few Google Colab notebooks:
- Google Colab (Pytorch) PANN-Buck
- Google Colab (Pytorch) PANN-DAB
- Google Colab (Pytorch) PANN-Operational-Diversity
- Google Colab (Pytorch) PANN-Topology-Transfer
@code-author:
- Xinze Li (email: [email protected])
- Fanfan Lin (email: [email protected])
This repository provides a simplified version of the PE-GPT methodology presented in our journal paper. Despite the simplifications, the released code preserves the core architecture of the proposed PE-GPT.
This repository currently includes the following functional blocks: retrieval-augmented generation (RAG), LLM agents, a Model Zoo (with a physics-in-architecture neural network deployed in an ONNX engine for modeling DAB converters), a metaheuristic algorithm for optimization, simulation verification, a graphical user interface, and a knowledge base. Please note that the current knowledge base is a simplified version for illustration.
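As an illustration of the Model Zoo block, a PANN exported to ONNX can be served with onnxruntime; the file name and input shape below are placeholders, not the repository's actual signature:

```python
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("pann_dab.onnx")    # placeholder model name
name = sess.get_inputs()[0].name
outputs = sess.run(None, {name: np.zeros((1, 200, 2), np.float32)})
```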
This code is licensed under the Apache License Version 2.0.