During the pandemic, an alarming new trend has emerged: Students are sharing more sexually explicit content than they were before. In the first few months of the 2020–21 school year, Gaggle noted a 135% increase in incidents flagged as nudity and sexual content. This doesn’t just include pornographic content students have found online—it also includes nude or otherwise sexually explicit photos and videos of the students themselves.
What happens when your students share too much? And how can educators and law enforcement work together to protect students in vulnerable situations? Join us on Wednesday, April 7 at 1:00 PM ET for our Student Wellness Series: Selfies and Explicit Content webinar. Hear from:
Matt Joy began his career at the Wisconsin Department of Justice (DOJ) in 1999, when he was hired as a special agent in the Division of Criminal Investigation (DCI) assigned to drug enforcement investigations in the Milwaukee area. Matt transferred to the Appleton office in 2002 and transitioned to an assignment with the Internet Crimes Against Children (ICAC) Task Force in 2008. As an ICAC special agent, Matt investigated technology-facilitated crimes against children, including CyberTips, peer-to-peer, and proactive investigations. In 2013, Matt was promoted to special agent in charge in the Madison Regional Office, where he supervised ICAC, public integrity, and white collar crime investigations. Matt took an assignment as special agent in charge of the ICAC Task Force in 2014, which included working with DCI’s Digital Forensics Unit (DFU) and the statewide ICAC Task Force. That same year, Matt was promoted to director. As a director, Matt has worked with the division’s support services section, managed the division’s training and background programs, and served as the state fire marshal. Currently, Matt works with the division’s ICAC Task Force, DFU, and human trafficking section and serves as Wisconsin’s ICAC Task Force commander. Matt is a proud St. Ambrose University graduate (go Bees!) and earned his Master of Science degree in Criminal Justice Sciences from Illinois State University.
A psychologist, attorney, author, and mother, Dr. Lisa Strohman established Digital Citizen Academy to help keep families safe from online dangers. Her background working as a visiting scholar with the profiling unit at the FBI during one of the most tragic school shootings in the U.S. sparked her passion for proactively protecting and educating students, educators, and parents on issues related to technology.
With a focus on providing support and guidance for Gaggle educational partners, Heather Durkac oversees an award-winning Customer Service team, the vital Safety Management team, and the superb Implementation, Professional Development, and Account Management teams. Heather brings nearly 20 years of experience in management, customer relations, and training from both the corporate sector and the K-12 environment. Passionate about keeping students safe, she values the partnership with customers to ensure school leaders and staff are equipped with the means to effectively respond to student safety concerns.
Ryan works at a public school in Phoenix, AZ, that is recognized as the longest-running Mandarin Immersion program in Arizona. During his 17 years in public education, Ryan has served as an administrator, taught at the elementary, middle, and high school levels, and taken on additional roles such as coach, club sponsor, and curriculum developer. Ryan graduated from Arizona State University with a B.S. in marketing and worked in the financial industry before entering the teaching profession in 2004. Ryan holds an M.A. in Athletic Coaching Education from Ball State University and an M.Ed. in Education Administration from Grand Canyon University. He is currently a doctoral candidate at Grand Canyon University.
Machine learning technology flags potentially harmful content and images in students’ school-issued email and online file storage accounts (G Suite, Office 365, and Canvas).
An in-house team of trained safety professionals works 24/7/365 to evaluate flagged content for false positives, categorize incidents, and determine their severity.
Gaggle intercepts harmful content and alerts administrators based on severity. In imminent situations, district-appointed contacts are notified immediately, even after standard business hours.
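The workflow described above — machine learning flags content, trained reviewers categorize it and assign a severity, and alerts are routed accordingly, with imminent situations escalated to district-appointed contacts even after hours — can be sketched in pseudocode. This is a minimal illustrative sketch only; the severity names, categories, and routing rules below are assumptions for the example, not Gaggle’s actual product API or terminology.

```python
# Hypothetical sketch of severity-based alert routing, as described in the
# workflow above. All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum


class Severity(Enum):
    VIOLATION = 1    # lower-severity policy violation: batched into a digest
    QUESTIONABLE = 2 # concerning content: administrators alerted
    IMMINENT = 3     # possible immediate danger: contacts notified at once


@dataclass
class Incident:
    student_id: str
    category: str  # e.g. "nudity_sexual_content", as flagged by the classifier
    severity: Severity  # assigned by a human reviewer, not the classifier


def route_incident(incident: Incident) -> str:
    """Return the notification channel for a reviewed incident."""
    if incident.severity is Severity.IMMINENT:
        # District-appointed contacts are reached immediately,
        # even outside standard business hours.
        return "call_emergency_contacts"
    if incident.severity is Severity.QUESTIONABLE:
        return "email_administrators"
    return "daily_digest"


if __name__ == "__main__":
    urgent = Incident("s-001", "nudity_sexual_content", Severity.IMMINENT)
    print(route_incident(urgent))  # prints "call_emergency_contacts"
```

The key design point the sketch captures is that severity is assigned by a human reviewer after the machine-learning flag, and only the routing step is automatic.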