When AI Mistakes a Snack for a Gun at Kenwood High
When a school’s AI system mistakes a snack for a weapon, it’s more than a glitch; it’s a wake-up call. This post investigates the recent incident at Kenwood High School, where an AI gun detection system triggered a police response over a student’s empty Doritos bag. You’ll learn what happened, why the system misfired, and what this means for the future of AI surveillance in schools. We’ll explore the technical dimensions of this event and what communities should demand moving forward.
What Happened at Kenwood High?
On October 20, 2025, Taki Allen, a 16-year-old student at Kenwood
High School in Baltimore County, was handcuffed at gunpoint by police after the
school’s AI-powered gun detection system flagged his empty Doritos bag as a
weapon. Allen had just finished football practice and was waiting with friends
when the system, developed by Omnilert, analyzed the surveillance footage and triggered an alert.
Within 20 minutes, eight police cars arrived. Officers ordered Allen to the ground, cuffed him, and searched him. The image that prompted the alert showed Allen holding the chip bag with “two hands and one finger out,” which the system interpreted as a gun-like posture.
Why Did the AI Make This Mistake?
Omnilert’s system uses real-time video analysis to detect potential
firearms. According to the company, the alert was triggered due to:
- Lighting conditions
- Coloration and reflectivity of the Doritos bag
- The way Allen was holding it
These factors created a visual pattern that resembled a firearm in the
system’s algorithm. Although the system includes a human verification step,
miscommunication between school officials and law enforcement led to the full
police response before the alert was canceled.
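Omnilert’s actual model and thresholds are proprietary, so the pipeline described above can only be sketched in outline. The snippet below is a hypothetical illustration, not Omnilert’s code: the `Detection` class, the `ALERT_THRESHOLD` value, and the `dispatch` function are all invented for this post. It shows how a confidence-threshold detector paired with a human-verification step can still send a full police response when that verification step breaks down.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # what the model believes it sees
    confidence: float  # model score between 0 and 1

# Hypothetical threshold: set it too low and harmless objects
# (a reflective chip bag held a certain way) trigger alerts.
ALERT_THRESHOLD = 0.6

def should_alert(detection: Detection) -> bool:
    """Flag for human review when the weapon score crosses the threshold."""
    return detection.label == "firearm" and detection.confidence >= ALERT_THRESHOLD

def dispatch(detection: Detection, verified_by_human: bool) -> str:
    """An alert should reach police only after human verification;
    if that step is skipped or miscommunicated, a false positive
    produces the full armed response."""
    if not should_alert(detection):
        return "no-alert"
    if not verified_by_human:
        return "hold-for-review"
    return "notify-police"

# Lighting, reflectivity, and posture can push a shiny snack bag
# above threshold even though no weapon is present.
bag = Detection(label="firearm", confidence=0.72)  # false positive
print(dispatch(bag, verified_by_human=False))  # hold-for-review
print(dispatch(bag, verified_by_human=True))   # notify-police
```

The point of the sketch is the last line: once a false positive clears (or bypasses) the human check, the system is "functioning as intended" while still sending officers after a teenager holding chips.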
Systemic Issues and Community Response
The incident has ignited concerns about:
- False positives in AI surveillance
- Over-policing in schools
- Potential racial bias in AI systems
- Lack of accountability and transparency
Allen’s family and local leaders, including Baltimore County Councilman
Izzy Patoka, have called for a full review of the system and its protocols.
Allen himself reported feeling traumatized and unsafe returning to school,
saying, “If I eat another bag of chips or drink something, I feel like they’re
going to come again.”
What’s Next for AI in Schools?
Omnilert maintains that its system “functioned as intended,” emphasizing
rapid human verification. However, critics argue that a system “functioning as intended” should not result in armed officers confronting a student over a snack. The Baltimore County school district has offered counseling and promised
to reassess its procedures.
This case underscores the urgent need for:
- Rigorous testing of AI systems in real-world environments
- Clear communication protocols between schools and law enforcement
- Ethical oversight and community involvement in tech adoption
Disclaimer
This post was written by Susang6 and her AI assistant, as part of an
ongoing collaboration to document, research, and reflect on emerging
technologies and their impact on real lives. All insights are grounded in
publicly available sources and shaped by our shared commitment to clarity,
truth, and resonance.
Read advocacy articles by Susang6 here
📚 Sources of Research
- CBS News: AI gun detection system mistakes Doritos bag for weapon at Baltimore County school
- The Baltimore Banner: Kenwood High student handcuffed after AI system misidentifies Doritos bag as gun
- Washington Post: AI system in school triggers police response over snack bag
- Omnilert: Official site and product documentation
- Baltimore County Council: Public statements and press releases
