Project Overview

For the course project, you and a team of five students will serve as the Trust and Safety team at a major social media, consumer cloud, or AI platform. Your team will focus on a particular type of abuse, proposing policies to the executives at your company as well as researching and implementing a real technological solution.

The project is split into three milestones, which will be completed over the course of the quarter. Milestone 1 is an individual assignment. Milestones 2 and 3 will be completed as a group once teams are formed after the add/drop deadline. The final milestone will culminate in a presentation that your group will give to the teaching team and guest judges from industry at a poster session on Tuesday, June 2nd from 5-7pm.

How Milestone 1 connects to the group project: After teams are formed, each team member will present their Milestone 1 pitch to the group. The team will then select one abuse type to pursue for Milestones 2 and 3. This is your chance to find an interesting topic for your team to dive into more deeply and solve during the last two-thirds of the quarter.

Milestone 1: Abuse Study & Topic Pitch

Percent of Final Grade: 20%

Due Date: Friday, April 17 at 11:59 PM PT

Deliverables:

  1. A slide deck (8–15 slides) covering your abuse research, policy comparison, and topic pitch
  2. A 5–8 minute recorded video of you presenting the slides with your voiceover narration

Submission: Upload your PDF and video to Canvas.

Choosing Your Abuse Type

You will select one abuse type to research in depth. This is the topic you will pitch to your future team, so choose something you care about and would want to work on for the rest of the quarter. Please ensure you are comfortable working with the chosen abuse type. We encourage you to skim the reading syllabus for ideas.

The abuse can involve a human abuser and a human victim interacting online, a human using AI to cause harm, or a human using AI and then coming to harm. All three scenarios are supported in this year’s project. The scenario must be specific enough, however, for your team to create a policy and an intervention. “AI systems give false information” would not be specific. “An AI chatbot, when asked to provide voting information, confidently provides incorrect information on how to vote” would be. For now, you do not need to pick a specific platform on which the abuse occurs; that will be a requirement for your simulation in Milestone 2.

Milestones 2 and 3 will require testing your solution with test data. You can use stand-ins for illegal and/or very harmful material (e.g. pictures of nude kittens instead of child sexual abuse material). Don’t let the potential technical challenge of a topic scare you away from tackling it; your milestones will be graded on effort and thoughtfulness, not on whether you’re able to fully solve these problems. These remain problems in the real world precisely because they are extremely challenging!

Example Abuse Types: You are welcome to research any of these abuse types or one of equivalent importance. As we said in lecture, you are encouraged to build off these ideas to find an emerging type of abuse that has not been tackled by a team in the past. You are welcome to ask your TAs or the instructors for advice in section, office hours, or via a private post on Ed.

  • Suicides driven by bullying
  • Live streaming of murder-suicides
  • Live streaming of terrorist attacks
  • Government propaganda against domestic minorities
  • Disinformation in online ads
  • Coordinated harassment of journalists
  • Sextortion
  • Trading of child sexual abuse materials
  • Online cryptocurrency scams
  • Hate speech on a streaming platform
  • Terrorist recruitment
  • Grooming
  • Catfishing
  • Virtual kidnappings
  • Mass reporting of content by fake accounts
  • Pig-butchering cryptocurrency scams
  • CSAM creation by visual AI generative models
  • “Nudify apps” created by prompt injection of legitimate AI image manipulation systems
  • AI voice cloning for impersonation and fraud
  • Deepfake non-consensual intimate imagery (NCII)
  • AI-powered romance scams and social engineering
  • LLM-assisted phishing and spear-phishing at scale
  • Synthetic identity fraud using AI-generated documents and faces
  • AI-enabled academic fraud and ghost-writing services
  • Autonomous AI agents used for harassment or stalking
  • AI-generated influence operations and propaganda
  • AI-powered tools that assist child predator grooming
  • Algorithmic amplification of self-harm and eating disorder content
  • Adults or children (pick one) have increasingly sexual conversations with a chatbot
  • AI systems provide false information, like medical advice, that leads to harm

Part 1: Slide Deck (60% of assignment grade)

Your slide deck should be structured as a professional briefing for the leadership team at your company. You are making the case that this abuse type is a serious problem that the company needs to address, backed by research and a concrete plan. Use the following structure, but you are welcome to add additional information in whatever sections you see fit.

In writing your slides, please use citations in the APA style for factual information and include your own original analysis and recommendations. You are welcome to use graphics, charts, or diagrams. Stanford provides good advice on academic writing and citations.

Required Sections in Your Slide Deck:

  1. Title Slide — Your name, the abuse type, the date, your course listing, and the quarter (Spring 2026).

  2. Description of the Abuse Type — Provide a summary of the kind of abuse, including citations to known examples. This can include reporting in the media, talks by professionals, or legal documents like indictments. Provide context for prevalence and impact.

  3. Actor and Victim Profiles — What do we know about the people behind this kind of abuse? How about the victims? Is this something anybody can experience, or is the abuse tied to a specific part of the victim’s identity? Are there forums or other platforms where these kinds of abusers congregate?

  4. Details of the Abuse — Describe the immutable aspects of this kind of abuse and how they could be detected. What might differ between attackers and victims? Dive into at least one real-world example and pull out specific moments at which the abuse could have been detected or mitigated.

  5. Relevant Social Science Research — What can we learn about this type of abuse from academic research (e.g. peer-reviewed articles in journals or books by academics)? This could include research into the “offline” version of this abuse.

  6. Relevant Technologies — What technologies currently exist that are or can be used to combat this kind of abuse? What are their strengths and weaknesses? On what platforms are they used and to what levels of success?

  7. Policy Comparison Table — Create a table outlining what platform policies are currently in effect that relate to your chosen abuse type. Compare the policies of at least three platforms. For each platform, quote relevant policy language and hyperlink to the exact source. Think critically about what columns are theoretically important. You can see examples in Figure 1 here, Table 1 here, and here.

  8. Specific Recommendations — What policy, product, engineering, or operational changes do you recommend? Why would this be a compelling topic for the team to pursue in Milestones 2 and 3? What is your proposed approach?

  9. Citations — Detailed citations in APA format.

Part 2: Pitch Video (40% of assignment grade)

Record a 5–8 minute video of yourself presenting your slide deck with voiceover narration. This is your pitch to your future team — make it clear, compelling, and well-rehearsed.

Requirements:

  • Target length: 5–8 minutes. Maximum length: 10 minutes. Videos over 10 minutes will be penalized.
  • You must narrate the presentation yourself using your own voice. The use of AI-generated voices (e.g. ElevenLabs, text-to-speech) is not permitted.
  • Your face does not need to be on camera, but your voice must be clearly audible.
  • You must upload the video and PDF files themselves; links are not permitted.

How to Record:

Both Google Slides and Microsoft PowerPoint (both licensed by Stanford for all students) have built-in features for recording a presentation with voiceover:

  • Google Slides: Use the “Record” feature in the top-right corner of the editor. Your recording is saved to Google Drive. Google Slides recording instructions
  • Microsoft PowerPoint: Go to the Slide Show tab and click “Record Slide Show.” Your narration is embedded in the file and can be exported as a video. PowerPoint recording instructions

You may also use Zoom (record yourself screen-sharing), Loom, or any other screen recording tool. If you want to get fancy, OBS Studio is free software used by streamers to record themselves and could be used to put yourself on screen in front of your slides.

AI Usage Policy

You may use AI tools (ChatGPT, Claude, Gemini, etc.) for research purposes — finding sources, summarizing background reading, brainstorming angles, etc. However, you must write the slides yourself and record the pitch in your own voice. The slide content, analysis, and recommendations should reflect your own thinking and synthesis. The pitch video is specifically designed to verify your understanding of the material — you should be able to speak fluently about your topic without reading a script verbatim.

Grading Criteria

Part 1 — Slide Deck (60%)

Description of the Abuse Type

  • Cites at least two specific sources (media coverage, policy reports, academic papers, legal documents, etc.)
  • Provides context for the abuse type and its prevalence
  • Description is concise and cohesive while covering all relevant details

Actor and Victim Profiles

  • Thoroughly overviews the relevant characteristics of actors and victims

Details of the Abuse

  • Thoroughly examines at least one specific real-world example
  • Identifies key moments at which the abuse could have been detected or mitigated
  • Identifies which of those moments are immutable across all instances of this abuse type

Relevant Social Science Research

  • Cites at least two peer-reviewed journal articles or academic books that address the abuse type

Relevant Technologies

  • Examines at least one specific technology
  • Cites where this technology is accessible to developers (open source packages, APIs, etc.)
  • Discusses the technology’s use by a specific organization or explains why it hasn’t been adopted
  • Clearly evaluates the strengths and weaknesses of the technology

Policy Comparison Table

  • Examines policies from at least three platforms, quoting language from them and hyperlinking where appropriate
  • Contains thoughtfully developed columns that illuminate differences across platforms
  • Includes proper citations or hyperlinks to original policy documents

Specific Recommendations

  • Provides at least 2 specific, applicable, and clear recommendations
  • Recommendations are well-justified using evidence presented in earlier slides
  • Connects recommendations to the identified detection/mitigation moments
  • Makes a clear case for why this topic is worth pursuing in Milestones 2 and 3

General Criteria (applied across all sections):

  • Quality of citations: pull from a variety of sources — media coverage, policy reports, academic papers, firsthand accounts, legal documents, talks by professionals, platform policies, etc.
  • Accuracy: all claims should be specific and accurate
  • Depth of analysis: topics handled thoughtfully, with nuance and multiple perspectives considered
  • Slide design: clear, readable, well-organized. Not overloaded with text.

Part 2 — Pitch Video (40%)

  • Clarity: Presenter speaks clearly and the audio quality is adequate
  • Command of material: Presenter demonstrates genuine understanding of the topic, not just reading slides
  • Persuasiveness: The pitch makes a compelling case for why this abuse type is important and worth pursuing
  • Feasibility: The proposed direction for Milestones 2 and 3 is realistic and well-considered
  • Time management: Presentation is 5–8 minutes, covering all key points without rushing or padding. You do not have to read every bullet! This is a key skill for presenting in professional contexts: the slide deck backs up what you are saying, and your audience can read the bullets to supplement your talk.