
Publication

Insights from an experiment crowdsourcing data from thousands of US Amazon users: The importance of transparency, money, and data use


Alex Berke, Robert Mahari, Sandy Pentland, Kent Larson, and Dana Calacci. 2024. Insights from an experiment crowdsourcing data from thousands of US Amazon users: The importance of transparency, money, and data use. Proc. ACM Hum.-Comput. Interact. 8, CSCW2, Article 466 (November 2024), 48 pages. https://doi.org/10.1145/3687005

Abstract

Data generated by users on digital platforms are a crucial resource for advocates and researchers interested in uncovering digital inequities, auditing algorithms, and understanding human behavior. Yet data access is often restricted. How can researchers both effectively and ethically collect user data? This paper shares an innovative approach to crowdsourcing user data to collect otherwise inaccessible Amazon purchase histories, spanning 5 years, from more than 5,000 U.S. users. We developed a data collection tool that prioritizes participant consent and includes an experimental study design. The design allows us to study multiple important aspects of privacy perception and user data sharing behavior, including how socio-demographics, monetary incentives, and transparency can impact share rates. Experiment results (N=6,325) reveal that both monetary incentives and transparency can significantly increase data sharing. Age, race, education, and gender also played a role: female and less-educated participants were more likely to share. Our study design enables a unique empirical evaluation of the “privacy paradox”, where users claim to value their privacy more than they do in practice. We set up both real and hypothetical data sharing scenarios and find measurable similarities and differences in share rates across these contexts. For example, increasing monetary incentives had a 6 times higher impact on share rates in real scenarios than in hypothetical ones. In addition, we study participants’ opinions on how data should be used by various third parties, again finding that gender, age, education, and race have a significant impact. Notably, the majority of participants disapproved of government agencies using purchase data, yet the majority approved of use by researchers. Overall, our findings highlight the critical role that transparency, incentive design, and user demographics play in ethical data collection practices, and provide guidance for future researchers seeking to crowdsource user-generated data.

Fig. 1. Flowchart representing the survey. Each box represents a discrete section of the survey. Arrows represent movement from one section to another. “No share” indicates the flow if a participant declined to share within any treatment arm. Boxes within “Real share request”, such as “Control” or “Bonus $0.05”, correspond to experimental treatment arms that participants were randomly assigned to. “Transparent” and “Non-transparent” boxes represent random assignment into either the transparent condition, where participants were shown their data before choosing to share, or the non-transparent condition, where participants were shown only column names.
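
Fig. 1 implies that each participant is routed at random into one of the ten arms of the 2x5 design. The paper's actual assignment code is not shown on this page; the following is a minimal TypeScript sketch of uniform random assignment, where the arm labels follow the figure captions where given and "incentive-5" is a placeholder for the fifth incentive treatment, which the captions do not name.

```typescript
// Sketch: uniform random assignment into the 2x5 factorial arms.
// Labels are illustrative; "incentive-5" stands in for the unnamed fifth arm.
const transparencyArms = ["transparent", "non-transparent"] as const;
const incentiveArms = ["control", "bonus-$0.05", "bonus-$0.20", "bonus-$0.50", "incentive-5"] as const;

type Assignment = {
  transparency: (typeof transparencyArms)[number];
  incentive: (typeof incentiveArms)[number];
};

// Pick one element of an arm list uniformly at random.
function pick<T>(arms: readonly T[]): T {
  return arms[Math.floor(Math.random() * arms.length)];
}

// Draw an arm for a newly enrolled participant.
function assignArm(): Assignment {
  return { transparency: pick(transparencyArms), incentive: pick(incentiveArms) };
}

console.log(assignArm()); // e.g. { transparency: "transparent", incentive: "bonus-$0.20" }
```

Because the two factors are drawn independently and uniformly, each of the ten arms is equally likely, which is what a balanced 2x5 factorial design requires.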

Fig. 2. Screenshots from the survey’s share request section. The experiment had a 2x5 factorial design, with 2 “transparency” treatments and 5 “incentive” treatments, for 10 total experiment arms. Shown is the experiment arm with the “transparent” and “$0.20 bonus” treatments. Left: the interface before the participant inserts their Amazon data file. Right: the interface after the file is inserted. Software running within the browser stripped the data file to include only the data columns the survey text described to participants. No data left participant machines unless they clicked “Consent to share”. The interface for the “transparent” treatment presented participants with all rows and columns of data that would be collected, within a scrollable interface, before they chose to consent or decline to share. The “non-transparent” treatment showed only the data columns.
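
The caption describes in-browser data minimization: the uploaded file is reduced to the described columns before any consent step, so undescribed data never leaves the participant's machine. This is not the authors' code; a minimal sketch of the idea in TypeScript, with assumed column names and deliberately naive CSV parsing (it does not handle quoted fields containing commas), might look like:

```typescript
// Sketch of in-browser column stripping: keep only the columns that the
// survey text described to the participant. Column names are placeholders.
const ALLOWED_COLUMNS = ["Order Date", "Category", "Item Total"];

function stripToAllowedColumns(csvText: string, allowed: string[]): string {
  const rows = csvText.trim().split("\n").map((line) => line.split(","));
  const header = rows[0];
  // Indexes of the columns described to participants.
  const keep = header
    .map((name, i) => (allowed.includes(name.trim()) ? i : -1))
    .filter((i) => i >= 0);
  return rows.map((row) => keep.map((i) => row[i]).join(",")).join("\n");
}

// The stripped text stays in the page; only an explicit "Consent to share"
// action would transmit it to a server.
```

Doing the stripping client-side is the key design choice: the transparent condition can render exactly the rows and columns that would be shared, because that reduced table is the only artifact that exists to transmit.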

Fig. 3. Share rates by experiment arm. Participants were randomly assigned to an experiment arm in the 2x5 experimental design, with the 5 “incentive” treatments shown on the x-axis. In the transparent condition, when prompted to share their data, participants were shown a table with all data that would be shared, while in the non-transparent condition, participants were shown only the data column headers.

Fig. 4. Share rates for participants offered real versus hypothetical bonuses to incentivize sharing. Transparent versus non-transparent indicates whether participants were in a treatment group where they were shown their data when prompted to share (transparent). Left: hypothetical share rates are computed from the control group as the cumulative portion of participants who agreed to share their data for less than or equal to a given bonus amount. Bonus amounts ($0.05, $0.20, $0.50) are spaced on the x-axis corresponding to their values, illustrating a linear relationship between the dollar amount offered and share rates. “Real” share rates reflect data from experiment groups offered real monetary incentives. Note that hypothetical shares were added to real shares in the control, resulting in a visibly higher intercept for hypothetical share rates; only the slope should be interpreted. Right: the change in share rate when comparing a given bonus incentive amount to the next smaller amount. The x-axis represents the change in incentive. For instance, “$0.05 to $0.20” labels the measured change in share rate for participants who were presented with a $0.20 bonus incentive, using the $0.05 bonus incentive as a baseline.
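
The left panel's hypothetical curve is a cumulative proportion: a control-group participant counts as a sharer at bonus b if they agreed to share for some amount less than or equal to b. A minimal sketch of that computation, with illustrative data and names that are not from the paper:

```typescript
// Sketch of the cumulative hypothetical share rate from Fig. 4 (left).
// statedMinBonus[i] is the smallest bonus at which control participant i
// said they would share; 0 marks a real (unpaid) share in the control, and
// null marks a participant who declined at every asked amount.
function hypotheticalShareRate(statedMinBonus: (number | null)[], bonus: number): number {
  const sharers = statedMinBonus.filter((b) => b !== null && b <= bonus).length;
  return sharers / statedMinBonus.length;
}

// Illustrative data only: rates at the bonus amounts plotted on the x-axis.
const stated: (number | null)[] = [0, 0.05, 0.05, 0.2, 0.5, null, null];
for (const b of [0.05, 0.2, 0.5]) {
  console.log(`$${b.toFixed(2)}: ${(100 * hypotheticalShareRate(stated, b)).toFixed(0)}%`);
}
```

Including the 0 entries mirrors the caption's note that hypothetical shares were added on top of real shares in the control, which lifts the curve's intercept; the comparison to the real arms is therefore meaningful only in slope.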

Fig. 5. Data use survey questions and summary of results. Question identifiers (Q1-Q5) are used for clarity and for reference in the analyses. Questions were presented to participants in random order.
