Cybersecurity Badge Day 2020: Girl Scouts of Eastern MA focuses on data privacy and tech policy

Daniella DiPaola

Last year, the Federal Trade Commission issued a historic $170 million fine to YouTube for alleged violations of the Children’s Online Privacy Protection Rule (COPPA). In January 2020, YouTube responded by launching a series of changes to better protect children and their privacy. One of the biggest changes is that content creators must label their videos as “kids content,” and data collection on those videos is limited to protect minors.

Going from policy to practice raises herculean questions that policymakers, practitioners, and researchers are grappling with. Homemade slime with Doctor Squish, 5-minute crafts, and Minecraft videos: What constitutes “kids content,” and how do content creators convey this effectively? How does YouTube limit and minimize data collection in practice? What does a reasonable parent-child consent process look like?

These conversations are often led by legal experts, technocrats, and academic researchers. To better understand children’s perspectives on, and intuitions about, data privacy and collection, we organized a workshop with the end-user experts themselves: kids.

The Media Lab’s Personal Robots group, directed by Cynthia Breazeal, collaborated with the Girl Scouts of Eastern Massachusetts and the Edward M. Kennedy Institute to gather over 60 Girl Scouts and their parents for Cybersecurity Badge Day. The Girl Scouts participated in workshops and then convened to vote on a draft data privacy bill in a full-scale replica of the US Senate Chamber.

“We’ve noticed that students and girls are almost entirely online today and they may not be thinking about the effect they have as they are posting on Instagram and TikTok,” explained Eileen Koury, Girl Program Specialist for the Girl Scouts of Eastern Massachusetts.

Today, when 81 percent of the world’s children under two have a digital footprint and privacy breaches regularly inundate news headlines, this convening was a timely initiative for the girls to think critically about the platforms they use every day. This Media Lab curriculum on data privacy and governance is being developed as part of a larger initiative at MIT, led by Cynthia Breazeal, to democratize AI through K-12 AI education that includes important ethical design considerations on topics like algorithmic bias and recommendation systems.

We built on efforts to improve youth data literacy from Blakeley Payne, Erica Deahl, Berkman Klein Center’s Youth and Media, LSE's Media and Communications team, and existing Girl Scouts curricula. Our two workshops explored topics focused on data privacy and security through the lens of popular technologies such as YouTube and VSCO.

The first workshop focused on data collection and the impacts of predictive technologies that categorize viewers. Analyzing the YouTube recommendation page (see Figure A), the Girl Scouts (ages 7-9) described the person who might be shown these recommendations through questions such as “How old is this person?” “What are their interests?” and “Where do they live?” Many were keenly aware of the personal and societal impacts of sharing data through platforms. “YouTube doesn’t know everything about me,” said one participant, explaining the limitations of platform predictions. “It knows your likes, knows what videos you watch, [...] and might know what you want to watch next but not always.” Another participant highlighted the impacts of data collection, sharing that YouTube’s video recommendations may depend on whether a viewer is logged in and on previous click behavior from another connected device, such as a smart TV.

The second workshop explored how kids aged 9-11 can better recognize helpful versus harmful advertisements through visual user interface indicators. Students created targeted advertisements for products like a Harry Potter wand or a Tasty cookbook, and faced the tension of trying to sell a product while remaining transparent about the ad through design features. They pointed out that VSCO girl, a popular subculture among Gen Z, subtly advertises products like Polaroid cameras, hydro flasks, and Fjällräven backpacks. They noted the ubiquity of advertisements in their lives, from school math programs to gas stations and community sports sponsors. Students highlighted design features such as price, countdown timers, and the words “ad” or “sponsored” that make people aware a company is trying to sell something. Increasing ad awareness is an area of further exploration for academics, community library programs, and industry design practitioners.

The last session emulated a Senate vote on Senator Hawley’s SMART Act, which would ban infinite scroll and limit screen time. The Girl Scouts ultimately voted to reject the bill and debated how privacy practices vary by context and culture. “I’m only allowed to use [YouTube] one hour a night and only on the weekends— I don’t know if that is a good rule for everyone,” said one Girl Scout during the mock Senate floor vote.

“We want young people to learn politicians are giving us more than a soundbite on TV,” explained Amy Munslow, Education Manager at the EMK Institute. “Helping people understand what a fair and effective bill is for the entire country and what we should expect from legislators is really important.”

Working directly with youth helps close the gap between user needs, product design, and policy creation. As data collection becomes a ubiquitous part of growing up, topics like data bias and designing platforms that empower users will be key issues to incorporate into educational resources. Answering these questions will take time and more intentional collaboration across sectors. Events like these are one step in that direction.

“When we do programs like this, I am continually reminded not to underestimate young people,” reflected Sarah Yezzi, Director of Education, Family & Youth Programming at the EMK Institute. One Scout’s comment in particular deeply resonated with the program staff: “I vote to support the bill—my mom says we might as well go outside while we still can.”

----------------

Stephanie Thien Hang Nguyen is a research scientist at the MIT Media Lab focusing on data privacy, design, and tech policies that impact marginalized populations. She previously led privacy and user experience design projects with the National Institutes of Health and Johns Hopkins’ Precision Medicine team.

Daniella DiPaola is a first-year graduate student at the Personal Robots Group at the MIT Media Lab, currently working on an AI + Ethics curriculum for middle school students. She has worked as a researcher in the consumer robotics industry, focusing on human-robot interaction paradigms in the home, long-term livability, and robots in the lives of both children and older adults.

Special thanks to workshop and event coordinators: 
@EMKInstitute, @GirlScoutsEMass, @Operation_250, @CryptoLass, @AdriannaCyberSN
