Tag | Ind1 | Ind2 | Content
---|---|---|---
000 | | | 03776cam a22004577a 4500
001 | | | w30981
003 | | | NBER
005 | | | 20230322103720.0
006 | | | m o d
007 | | | cr cnu\|\|\|\|\|\|\|\|
008 | | | 230322s2023 mau fo 000 0 eng d
040 | | | _aMaCbNBER _beng _cMaCbNBER
100 | 1 | | _aAgan, Amanda Y.
245 | 1 | 0 | _aAutomating Automaticity: _bHow the Context of Human Choice Affects the Extent of Algorithmic Bias / _cAmanda Y. Agan, Diag Davenport, Jens Ludwig, Sendhil Mullainathan.
260 | | | _aCambridge, Mass. _bNational Bureau of Economic Research _c2023.
300 | | | _a1 online resource: _billustrations (black and white);
490 | 1 | | _aNBER working paper series _vno. w30981
500 | | | _aFebruary 2023.
520 | 3 | | _aConsumer choices are increasingly mediated by algorithms, which use data on those past choices to infer consumer preferences and then curate future choice sets. Behavioral economics suggests one reason these algorithms so often fail: choices can systematically deviate from preferences. For example, research shows that prejudice can arise not just from preferences and beliefs, but also from the context in which people choose. When people behave automatically, biases creep in; snap decisions are typically more prejudiced than slow, deliberate ones, and can lead to behaviors that users themselves do not consciously want or intend. As a result, algorithms trained on automatic behaviors can misunderstand the prejudice of users: the more automatic the behavior, the greater the error. We empirically test these ideas in a lab experiment, and find that more automatic behavior does indeed seem to lead to more biased algorithms. We then explore the large-scale consequences of this idea by carrying out algorithmic audits of Facebook in its two biggest markets, the US and India, focusing on two algorithms that differ in how users engage with them: News Feed (people interact with friends' posts fairly automatically) and People You May Know (people choose friends fairly deliberately). We find significant out-group bias in the News Feed algorithm (e.g., whites are less likely to be shown Black friends' posts, and Muslims less likely to be shown Hindu friends' posts), but no detectable bias in the PYMK algorithm. Together, these results suggest a need to rethink how large-scale algorithms use data on human behavior, especially in online contexts where so much of the measured behavior might be quite automatic.
530 | | | _aHardcopy version available to institutional subscribers
538 | | | _aSystem requirements: Adobe [Acrobat] Reader required for PDF files.
538 | | | _aMode of access: World Wide Web.
588 | 0 | | _aPrint version record
690 | | 7 | _aRelation of Economics to Other Disciplines _2jelc
650 | | 7 | _aRelation of Economics to Other Disciplines _2jelc
084 | | | _aA12 _2jelc
690 | | 7 | _aEquity, Justice, Inequality, and Other Normative Criteria and Measurement _2jelc
650 | | 7 | _aEquity, Justice, Inequality, and Other Normative Criteria and Measurement _2jelc
084 | | | _aD63 _2jelc
690 | | 7 | _aSearch • Learning • Information and Knowledge • Communication • Belief • Unawareness _2jelc
650 | | 7 | _aSearch • Learning • Information and Knowledge • Communication • Belief • Unawareness _2jelc
084 | | | _aD83 _2jelc
700 | 1 | | _aDavenport, Diag.
700 | 1 | | _aLudwig, Jens. _915660
700 | 1 | | _aMullainathan, Sendhil. _917242
710 | 2 | | _aNational Bureau of Economic Research.
830 | | 0 | _aWorking Paper Series (National Bureau of Economic Research) _vno. w30981.
856 | 4 | 0 | _uhttps://www.nber.org/papers/w30981 |
856 | | | _yOnline access to the DOI _uhttp://dx.doi.org/10.3386/w30981
942 | | | _2ddc _cW-PAPER
999 | | | _c390698 _d349260