
Behavioral Economics for Human-in-the-loop Control Systems Design: Overconfidence and the hot hand fallacy

Document pages: 17

Abstract: Successful design of human-in-the-loop control systems requires appropriate models for human decision makers. Whilst most paradigms adopted in the control systems literature hide the (limited) decision capability of humans, in behavioral economics individual decision making and optimization processes are well known to be affected by perceptual and behavioral biases. Our goal is to enrich control engineering with insights from behavioral economics research by exposing such biases in control-relevant settings. This paper addresses the following two key questions: 1) How do behavioral biases affect decision making? 2) What is the role played by feedback in human-in-the-loop control systems? Our experimental framework shows how individuals behave when faced with the task of piloting a UAV under risk and uncertainty, paralleling a real-world decision-making scenario. Our findings support the notion that humans in Cyberphysical Systems are subject to behavioral biases regardless of -- or even because of -- receiving immediate outcome feedback. We observe that substantial shares of drone controllers act inefficiently, either flying excessively (overconfident) or overly conservatively (underconfident). Furthermore, we observe that human controllers self-servingly misinterpret random sequences, falling prey to a "hot hand fallacy". We advise control engineers to mind the human component so as not to compromise technological accomplishments through human issues.
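
For concreteness, the following is a minimal illustrative sketch, not the authors' experimental design: it assumes a stylized version of the drone-piloting task in which each additional flight segment pays a fixed reward but carries an independent crash probability that forfeits all accumulated reward. All parameter names and values (REWARD_PER_SEGMENT, CRASH_PROB, the streak length, and so on) are hypothetical choices used only to show how over- and underconfident flight lengths fall short of a risk-neutral benchmark, and why streaks in an i.i.d. outcome sequence carry no predictive information (the "hot hand fallacy").

```python
"""Illustrative sketch only -- the abstract does not specify the task parameters.

Stylized assumption: each flight segment pays REWARD_PER_SEGMENT but crashes
independently with probability CRASH_PROB, forfeiting all accumulated reward.
"""
import random

REWARD_PER_SEGMENT = 1.0   # payoff per completed segment (assumed value)
CRASH_PROB = 0.10          # independent crash probability per segment (assumed value)

def expected_payoff(n_segments: int) -> float:
    """Expected payoff of planning to fly n_segments, losing everything on a crash."""
    survive_all = (1.0 - CRASH_PROB) ** n_segments
    return survive_all * REWARD_PER_SEGMENT * n_segments

# Risk-neutral benchmark: the planned flight length that maximizes expected payoff.
optimum = max(range(51), key=expected_payoff)
print(f"risk-neutral optimum: {optimum} segments, "
      f"expected payoff {expected_payoff(optimum):.2f}")

# Over- and underconfident controllers deviate from the benchmark in opposite directions.
for label, n in [("underconfident", optimum // 2),
                 ("benchmark", optimum),
                 ("overconfident", optimum * 2)]:
    print(f"{label:>14}: fly {n:2d} segments -> expected payoff {expected_payoff(n):.2f}")

# Hot hand check: in an i.i.d. Bernoulli sequence, success after a streak of
# successes is no more likely than the base rate, so reading "momentum" into
# such streaks is a misinterpretation of randomness.
random.seed(0)
outcomes = [random.random() > CRASH_PROB for _ in range(200_000)]
after_streak = [outcomes[i] for i in range(3, len(outcomes))
                if outcomes[i - 3] and outcomes[i - 2] and outcomes[i - 1]]
print(f"P(success)               = {sum(outcomes) / len(outcomes):.3f}")
print(f"P(success | 3 successes) = {sum(after_streak) / len(after_streak):.3f}")
```

With the assumed numbers, both the overconfident and the underconfident controller earn a lower expected payoff than the risk-neutral benchmark, while the conditional success rate after a streak matches the base rate, illustrating why perceived "hot hands" in a random sequence are a misreading of the data.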
