A crucial challenge in future smart energy grids is the large-scale coordination of distributed energy generation and demand. In recent years, several Demand Side Management approaches have been developed. A major drawback of these approaches is that they focus mainly on real-time control rather than on planning, and hence cannot fully exploit the flexibility of, e.g., electric vehicles over longer periods of time.
In this chapter we investigate the optimization of charging an electric vehicle (EV). More precisely, the problem of charging an EV overnight is formulated as a Stochastic Dynamic Programming (SDP) problem. We derive an analytic solution for this SDP problem which in turn leads to a simple short-term bidding strategy. From an MDP point of view this solution has a number of special features:
• It leads to analytic optimal results based on order statistics.
• It allows for a more practical rule which can be shown to be nearly optimal.
• It is robust with respect to the modeling assumptions, showing little room for further improvement even when compared to a solution with perfect foresight.
Numerical results with real-world data from the Belgian network show a substantial performance improvement over standard demand side management strategies, without significant additional complexity. (This chapter is based on Kempker et al. (Proceedings of the 9th EAI international conference on performance evaluation methodologies and tools, Valuetools 2015, Berlin, 14–16 December 2015, pp 1–8, 2016).)
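The overnight-charging SDP described above can be sketched in a few lines of Python. This is an illustrative simplification, not the chapter's exact model: it assumes the EV charges one unit per hour, that hourly prices are i.i.d. draws from a hypothetical discrete distribution, and that the battery must be full by morning. The state is (hours left, units still needed), and backward induction yields both the expected cost and the price threshold that underlies the simple bidding rule.

```python
from functools import lru_cache

# Hypothetical discrete price levels and probabilities (assumptions,
# not data from the chapter).
PRICES = [10.0, 20.0, 30.0, 40.0]
PROBS  = [0.25, 0.25, 0.25, 0.25]
MEAN_P = sum(p * q for p, q in zip(PRICES, PROBS))

@lru_cache(maxsize=None)
def V(t, k):
    """Expected cost-to-go with t hours left and k charging units required."""
    if k == 0:
        return 0.0
    if k == t:
        # No slack left: the EV must charge in every remaining hour,
        # so it pays the mean price k times in expectation.
        return k * MEAN_P
    # Observe the hourly price, then take the cheaper of charging now
    # (pay p, one unit fewer needed) or waiting (same units, one hour fewer).
    return sum(q * min(p + V(t - 1, k - 1), V(t - 1, k))
               for p, q in zip(PRICES, PROBS))

def threshold(t, k):
    """Charge this hour iff the observed price is at most this value.

    This threshold is the short-term bid: the marginal value of keeping
    one unit of flexibility for the remaining t - 1 hours.
    """
    return V(t - 1, k) - V(t - 1, k - 1)

print(V(8, 3))         # expected cost of an optimal 8-hour, 3-unit plan
print(threshold(8, 3)) # price bid to submit for the first hour
```

Because the hour-by-hour decision reduces to comparing the observed price against a threshold, the optimal expected cost can equivalently be written in terms of the order statistics of the nightly price sample, which is the structure the chapter's analytic solution exploits.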
|Title of host publication||Markov Decision Processes in Practice|
|Editors||Richard J. Boucherie, Nico M. van Dijk|
|Publisher||Springer International Publishing|
|Place of publication||Cham|
|Series||International Series in Operations Research & Management Science|
|Number of pages||18|
|Publication status||Published - 2017|
- Approximate Dynamic Programming
- Computational Research
- Markov Decision Processes
- Neuro Dynamic Programming
- Probabilistic Modeling
- Stochastic Operations Research