
Deep Reinforcement Learning-based Energy Management System for Electric Vehicles with Hybrid Energy Storage System

Author
김진훈
Advisor(s)
조인휘
Issue Date
2023. 2
Publisher
한양대학교
Degree
Master
Abstract
Battery electric vehicles (BEVs) run the risk of having their battery fully depleted before reaching the destination. Additionally, because of the battery's low power density, they are poorly suited to sudden power peaks involving high speeds and accelerations. Hybrid energy storage systems (HESSs) consisting of multiple power sources have therefore been developed to extend the driving range of EVs and the life of the battery. Combining an ultracapacitor, a battery, and a fuel cell, for instance, has a strong synergistic effect. The ultracapacitor, although it has low energy density, has high power density and can supply enough power for sudden demand peaks. The fuel cell, on the other hand, depends on fuel and can be used more aggressively when the states of charge (SOC) of the battery and the ultracapacitor are low. Despite these benefits, however, developing an optimal power-split method for a HESS with multiple power sources is a challenging task. It requires accurate modeling of each mechanical component of the EV, in order to compute the load power and the propulsion machine efficiency, as well as accurate modeling of each power source. It is also essential to choose the right power-split algorithm depending on the type of system, which can be either a real-time energy management system or a global one. In this work, we present a real-time deep reinforcement learning (DRL)-based energy management system (EMS) for a dual-motor EV with a HESS consisting of three power sources: a battery, an ultracapacitor, and a fuel cell. Moreover, the proposed EMS predicts the future speed and load power, so that it can plan the power split in advance and better manage the resources of the HESS. The DRL model was trained and tested using shaped reward functions and both standard and real driving cycles. The experimental results confirm that the dual-motor architecture improves efficiency by more than 10% compared with a single-motor architecture, that the proposed DRL-based EMS is robust to different initial SOCs of the battery and the ultracapacitor, and that the prediction step of the model gives useful insight into how to better manage the resources of the HESS.
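
The thesis text itself is not available here, so the following is only a minimal, illustrative sketch of the kind of power-split formulation the abstract describes: a gym-style HESS environment whose state holds the battery and ultracapacitor SOCs plus the current and one-step-ahead load power, whose action assigns load shares to the fuel cell and the ultracapacitor (the battery covering the remainder), and whose shaped reward penalizes battery stress, SOC limit violations, and hydrogen use. Every class name, parameter, and numeric constant below is an assumption made for illustration, not taken from the thesis.

import numpy as np


class HESSPowerSplitEnv:
    """Toy environment: split a demanded load among fuel cell, ultracapacitor, battery."""

    def __init__(self, load_profile_kw, dt_s=1.0):
        self.load = np.asarray(load_profile_kw, dtype=float)  # demanded power per step [kW]
        self.dt_h = dt_s / 3600.0                             # step length [h]
        self.batt_capacity_kwh = 50.0   # illustrative battery capacity
        self.uc_capacity_kwh = 0.5      # illustrative ultracapacitor capacity
        self.fc_max_kw = 30.0           # illustrative fuel-cell power limit
        self.reset()

    def reset(self, soc_batt=0.8, soc_uc=0.8):
        self.t = 0
        self.soc_batt = soc_batt
        self.soc_uc = soc_uc
        return self._state()

    def _state(self):
        p_now = self.load[self.t]
        # crude one-step "load prediction", standing in for the abstract's
        # speed/load-power prediction stage
        p_next = self.load[min(self.t + 1, len(self.load) - 1)]
        return np.array([self.soc_batt, self.soc_uc, p_now, p_next])

    def step(self, action):
        # action = (fuel-cell share, ultracapacitor share), each clipped to [0, 1]
        fc_share, uc_share = np.clip(action, 0.0, 1.0)
        p_demand = self.load[self.t]
        p_fc = min(fc_share * p_demand, self.fc_max_kw)
        p_uc = uc_share * p_demand if self.soc_uc > 0.05 else 0.0
        p_batt = p_demand - p_fc - p_uc        # battery supplies the remainder

        # simple SOC bookkeeping; converter and internal losses are ignored
        self.soc_batt -= p_batt * self.dt_h / self.batt_capacity_kwh
        self.soc_uc -= p_uc * self.dt_h / self.uc_capacity_kwh

        # shaped reward: battery stress, hydrogen cost, SOC limit violations
        reward = -(abs(p_batt) / self.fc_max_kw) ** 2
        reward -= 0.1 * p_fc * self.dt_h
        if self.soc_batt < 0.2 or self.soc_uc < 0.05:
            reward -= 10.0

        self.t += 1
        done = self.t >= len(self.load) - 1 or self.soc_batt <= 0.0
        return self._state(), reward, done


if __name__ == "__main__":
    # toy "driving cycle": 200 random demand samples between 5 and 60 kW
    profile = np.random.default_rng(1).uniform(5.0, 60.0, 200)
    env = HESSPowerSplitEnv(profile)
    state, total, done = env.reset(), 0.0, False
    while not done:
        state, reward, done = env.step((0.4, 0.3))  # fixed split as a stand-in for a DRL policy
        total += reward
    print("episode return:", round(total, 2))

In the thesis, a DRL agent would replace the fixed (0.4, 0.3) split and learn the shares from the state; the fixed policy above only exercises the environment.
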
URI
http://hanyang.dcollection.net/common/orgView/200000654346
https://repository.hanyang.ac.kr/handle/20.500.11754/179416
Appears in Collections:
GRADUATE SCHOOL[S](대학원) > COMPUTER SCIENCE(컴퓨터·소프트웨어학과) > Theses (Master)