Students explore AI by turning facial expressions into lights and sounds using a camera and CodyNick. It’s a playful introduction to how computers detect simple emotions.
They test an emotion model with their faces (or emoji cards) and map each mood to a light color or short sound. Using block coding, they connect camera output to CodyNick actions.
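The mood-to-action mapping students build in blocks can be sketched in Python. Everything here is illustrative: the emotion labels, colors, sounds, and the `action_for` helper are assumptions for the sketch, not CodyNick's actual API.

```python
# Illustrative sketch of an emotion-to-action lookup table.
# Labels, colors, and sounds are example choices, not a real device API.
EMOTION_ACTIONS = {
    "happy":   {"light": "warm yellow", "sound": "cheerful chime"},
    "sad":     {"light": "cool blue",   "sound": "soft tone"},
    "neutral": {"light": "dim white",   "sound": "quiet click"},
}

def action_for(emotion: str) -> dict:
    """Return the light/sound pair for a detected emotion.

    Unknown labels fall back to 'neutral', so the device always
    has something safe to do.
    """
    return EMOTION_ACTIONS.get(emotion, EMOTION_ACTIONS["neutral"])

if __name__ == "__main__":
    # Simulate the model emitting labels and the device reacting.
    for mood in ["happy", "sad", "confused"]:
        act = action_for(mood)
        print(f"{mood}: light={act['light']}, sound={act['sound']}")
```

The fallback entry mirrors good classroom practice: the model will sometimes mislabel a face, so the program should handle labels it doesn't recognize rather than stop.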
In a live demo, when the model detects “happy,” a warm light glows and a cheerful chime plays; “sad” triggers a cool light, and so on. Students then discuss respectful, fair use of AI with people’s images.