Bidding Games on Markov Decision Processes

Authors

AVNI Guy, HENZINGER Thomas A., IBSEN-JENSEN Rasmus, NOVOTNÝ Petr

Year of publication 2019
Type Article in Proceedings
Conference Reachability Problems - 13th International Conference, RP 2019, Brussels, Belgium, September 11-13, 2019, Proceedings.
MU Faculty or unit

Faculty of Informatics

Citation
Web https://link.springer.com/chapter/10.1007/978-3-030-30806-3_1
DOI http://dx.doi.org/10.1007/978-3-030-30806-3_1
Keywords Game theory; Markov processes; Stochastic systems; Bidding mechanism
Description In two-player games on graphs, the players move a token through a graph to produce an infinite path, which determines the qualitative winner or quantitative payoff of the game. In bidding games, an auction is held in each turn between the two players to determine which player moves the token. Bidding games have largely been studied with concrete bidding mechanisms that are variants of a first-price auction: in each turn both players simultaneously submit bids and the higher bidder moves the token, paying his bid to the lower bidder in Richman bidding, to the bank in poorman bidding, and, in taxman bidding, partly to each, with the bid split between the other player and the bank according to a predefined constant factor. Bidding games are deterministic games, yet they have an intriguing connection with a fragment of stochastic games called random-turn games. We study, for the first time, a combination of bidding games with probabilistic behavior; namely, we study bidding games that are played on Markov decision processes (MDPs), where the players bid for the right to choose the next action, which in turn determines the probability distribution according to which the next vertex is chosen. We study parity and mean-payoff bidding games on MDPs and extend results from the deterministic bidding setting to the probabilistic one.
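
To make the bidding mechanism concrete, the following Python sketch simulates a single turn of a bidding game on an MDP under taxman bidding, with Richman and poorman bidding arising as the two extreme values of the tax parameter. It is an illustration only: the MDP encoding, the strategy interface, and the convention that a fraction tau of the winning bid goes to the bank (and the rest to the losing player) are assumptions made for this sketch, not definitions taken from the paper.

import random

def settle_auction(budgets, bids, winner, tau):
    # Taxman bidding with tax parameter tau in [0, 1]: the winner pays
    # a fraction tau of the winning bid to the bank and the rest to the
    # loser. tau = 0 recovers Richman bidding (the whole bid goes to the
    # opponent) and tau = 1 recovers poorman bidding (the whole bid goes
    # to the bank). This split convention is an assumption of this sketch.
    loser = 1 - winner
    budgets[winner] -= bids[winner]
    budgets[loser] += (1 - tau) * bids[winner]
    return budgets

def bidding_turn(mdp, vertex, budgets, strategies, tau, rng=random):
    # One turn of a bidding game on an MDP. `mdp` maps each vertex to its
    # available actions; each action is a list of (successor, probability)
    # pairs. `strategies[i]` returns player i's (bid, action) choice at the
    # current vertex. All names here are hypothetical scaffolding.
    choices = [strategies[i](vertex, budgets) for i in (0, 1)]
    bids = [bid for bid, _ in choices]
    winner = 0 if bids[0] >= bids[1] else 1  # ties favor player 0 here
    budgets = settle_auction(budgets, bids, winner, tau)
    # The winner's chosen action fixes a probability distribution over
    # successor vertices; the next vertex is sampled from it.
    action = choices[winner][1]
    successors, probs = zip(*mdp[vertex][action])
    next_vertex = rng.choices(successors, weights=probs, k=1)[0]
    return next_vertex, budgets

# Hypothetical usage: a one-vertex MDP with two actions and fixed bids.
mdp = {"v": {"a": [("u", 0.5), ("w", 0.5)], "b": [("u", 1.0)]}}
strategies = [lambda v, b: (0.3, "a"), lambda v, b: (0.2, "b")]
print(bidding_turn(mdp, "v", [1.0, 1.0], strategies, tau=0.5))

Note that with tau = 0 the sum of the two budgets is preserved across turns, while with tau = 1 the winning bid leaves the game entirely, which is the essential difference between the Richman and poorman settings.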