What if machines could understand how you feel?
1) Enable your camera
2) Make a face like your favourite emoji
3) Wait a moment and enjoy!
This experiment was inspired by Alan Turing's Imitation Game (or Turing Test). He originally proposed it as a way to judge a machine's ability to show human-like behaviour. The original game used text-based responses, whereas this experiment generates emotional ones.
The graph shows, for each labelled emotion, the probability that the video feed currently matches it. Try smiling or frowning to see the graph move.
When a change in emotion is detected, the output clip is updated to imitate the viewer's current emotional state.
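Under the hood, this kind of loop can be built with face-api.js: detect a face, classify its expression, and react when the dominant emotion changes. Here is a minimal sketch; the '/models' path, the polling interval and the updateOutputClip helper are illustrative assumptions, not this site's actual code.

// Load the face detector and the expression classifier
// ('/models' is an assumed path to the hosted weights).
await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
await faceapi.nets.faceExpressionNet.loadFromUri('/models');

const video = document.getElementById('webcam'); // assumed element id
let currentEmotion = null;

setInterval(async () => {
  // Detect the most prominent face and score each labelled emotion.
  const result = await faceapi
    .detectSingleFace(video, new faceapi.TinyFaceDetectorOptions())
    .withFaceExpressions();
  if (!result) return; // no face in frame this tick

  // result.expressions maps each emotion label to a probability;
  // these are the values the graph plots. Pick the most likely one.
  const [emotion] = Object.entries(result.expressions)
    .reduce((best, cur) => (cur[1] > best[1] ? cur : best));

  // Only swap the output clip when the dominant emotion changes.
  if (emotion !== currentEmotion) {
    currentEmotion = emotion;
    updateOutputClip(emotion); // hypothetical helper that swaps the clip
  }
}, 200);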
If you like this, I'd be really grateful if you'd consider becoming a sponsor to help keep the site running.
Technical Info: The neural network used (tinyFaceDetector) is faster, smaller and more efficient than a standard convolutional face detector. This is achieved by replacing regular convolutions with depthwise separable convolutions, which factor each convolution into a per-channel (depthwise) filter followed by a 1×1 (pointwise) convolution. You can read about how this works here.
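To see why this shrinks the model, compare parameter counts: a regular k×k convolution from C_in to C_out channels needs k·k·C_in·C_out weights, while the depthwise separable version needs only k·k·C_in (depthwise) plus C_in·C_out (pointwise). A quick sketch, with illustrative channel sizes rather than the model's actual configuration:

// Weight count for a regular k x k convolution: every output
// channel filters every input channel.
const regularParams = (k, cIn, cOut) => k * k * cIn * cOut;

// Depthwise separable: one k x k filter per input channel,
// then a 1 x 1 pointwise convolution to mix channels.
const separableParams = (k, cIn, cOut) => k * k * cIn + cIn * cOut;

// Illustrative layer: 3 x 3 kernel, 64 -> 128 channels.
console.log(regularParams(3, 64, 128));   // 73728
console.log(separableParams(3, 64, 128)); // 8768 (~8x fewer weights)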
The resulting quantized model (trained on a dataset of ~14K labelled images) is only 190KB in size. N.B. The tradeoff is that it performs less well when detecting very small faces, but the small footprint allows it to run smoothly against webcam feeds in web browsers and on comparatively resource-limited mobile devices.
I built this with lots of information, inspiration and infrastructure provided by...
Towards Data Science,
EmguCV,
face-api.js,
TensorFlow,
Google Domains,
Microsoft Azure,
loading.io,
Emojipedia,
Bootstrap and
ChartJS.
Created by Chris V. Any feedback welcome -