Volume 58 Issue 24
Dec 2022
Citation: TANG Xiaolin, CHEN Jiaxin, GAO Bolin, YANG Kai, HU Xiaosong, LI Keqiang. Deep Reinforcement Learning-based Integrated Control of Hybrid Electric Vehicles Driven by High Definition Map in Cloud Control System[J]. JOURNAL OF MECHANICAL ENGINEERING, 2022, 58(24): 163-177. doi: 10.3901/JME.2022.24.163

Deep Reinforcement Learning-based Integrated Control of Hybrid Electric Vehicles Driven by High Definition Map in Cloud Control System

doi: 10.3901/JME.2022.24.163
  • Received Date: 09 Mar 2022
  • Revised Date: 15 Jul 2022
  • Available Online: 07 Mar 2024
  • Issue Publication Date: 20 Dec 2022

Abstract: In the context of intelligence, connectivity, and new energy, the automotive industry is integrating computing, information and communication, and artificial intelligence (AI) technologies. Building on the cloud control system (CCS) for intelligent and connected vehicles (ICVs), a new generation of information and communication technology, cloud-level automated driving of new energy vehicles is realized and driven by connected data, providing new planning and control approaches for both the driving and power systems. First, based on the CCS resource platform, the latitude, longitude, altitude, and weather of the target road are obtained, and a high-definition (HD) path model containing slope, curvature, and steering angle is established. Second, a deep reinforcement learning (DRL)-based integrated control method for a hybrid electric vehicle (HEV), driven by the HD path model, is proposed. Two DRL algorithms control the vehicle's speed and steering as well as the engine and transmission in the powertrain, so that four control strategies are learned synchronously. Finally, processor-in-the-loop (PIL) tests are performed on the high-performance edge computing device NVIDIA Jetson AGX Xavier. The results show that, with a variable space of 14 states and 4 actions, the DRL-based integrated control strategy achieves precise vehicle-level speed and steering control over a 172 km high-speed driving cycle, with a fuel consumption of 5.53 L/100 km. It takes only 104.14 s in the PIL test, verifying both the optimality and the real-time performance of the learning-based multi-objective integrated control strategy.
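
The reported variable space (14 states, 4 actions) spans both the vehicle layer (speed, steering) and the powertrain layer (engine, transmission). As a rough illustration of how such an integrated control problem can be exposed to a DRL agent, the Python sketch below lays out a minimal environment with a 14-dimensional observation and a 4-dimensional action. The specific state components, dynamics, reward weights, and class/parameter names are assumptions for illustration only, not the models or variables used in the paper.

    import numpy as np

    class IntegratedHevControlEnv:
        """Minimal sketch of an HEV integrated-control environment.

        Mirrors only the reported dimensions (14 states, 4 actions); the
        state layout, dynamics, and reward below are illustrative
        placeholders, not the models used in the paper.
        """

        STATE_DIM = 14
        ACTION_DIM = 4

        def __init__(self, hd_path):
            # hd_path: hypothetical per-metre samples of
            # (slope, curvature, reference steering angle) from the HD path model.
            self.hd_path = np.asarray(hd_path, dtype=float)
            self.reset()

        def reset(self):
            self.distance = 0.0   # travelled distance along the route, m
            self.speed = 20.0     # vehicle speed, m/s
            self.soc = 0.6        # battery state of charge
            self.gear = 1         # current gear
            return self._observe()

        def _lookup_path(self, distance):
            idx = min(int(distance), len(self.hd_path) - 1)
            return self.hd_path[idx]          # (slope, curvature, ref_steer)

        def _observe(self):
            slope, curvature, ref_steer = self._lookup_path(self.distance)
            # Seven concrete features plus zero padding up to 14 dimensions;
            # a real implementation would use the paper's actual state variables.
            obs = np.zeros(self.STATE_DIM)
            obs[:7] = [self.speed, self.distance, slope, curvature,
                       ref_steer, self.soc, self.gear]
            return obs

        def step(self, action):
            # action = [acceleration cmd, steering cmd, engine torque cmd, gear shift cmd]
            accel, steer, eng_torque, shift = action
            dt = 1.0                                          # control step, s
            self.speed = max(0.0, self.speed + accel * dt)
            self.distance += self.speed * dt
            self.gear = int(np.clip(self.gear + int(round(shift)), 1, 8))
            fuel = 1e-3 * abs(eng_torque) * self.speed * dt   # placeholder fuel model
            self.soc = float(np.clip(self.soc - 1e-5 * abs(eng_torque), 0.0, 1.0))
            slope, curvature, ref_steer = self._lookup_path(self.distance)
            # Reward trades steering-tracking error against fuel use (weights arbitrary).
            reward = -abs(steer - ref_steer) - 0.1 * fuel
            done = self.distance >= 172_000.0                 # 172 km driving cycle
            return self._observe(), reward, done, {}

    # Hypothetical usage (agent omitted):
    # env = IntegratedHevControlEnv(hd_path=np.zeros((172_000, 3)))
    # obs = env.reset()
    # obs, reward, done, info = env.step(np.array([0.3, 0.0, 80.0, 0.0]))

Either family of DRL algorithms used in the paper could, in principle, be trained against such a reset/step interface; the placeholder fuel and battery updates above would be replaced by the HEV powertrain model and the HD path data obtained from the CCS resource platform.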

     
