Facial Coding

Affect Lab uses facial coding to quantify how users feel by analyzing facial expressions


Facial Expressions Identified

Facial expressions of users are identified and landmarked using Affect Lab's proprietary technology


Algorithms Calculate Emotions

Our deep learning algorithms then calculate human emotions with an accuracy of ~85%
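For illustration only, here is a minimal sketch of such a two-step pipeline, using OpenCV's stock Haar-cascade face detector and a placeholder classifier. Affect Lab's landmarking and emotion models are proprietary, so the label set and classifier below are assumptions, not the actual implementation.

```python
# Minimal sketch of a facial-coding pipeline: detect a face, then score its
# expression with a (hypothetical) pretrained model. Affect Lab's technology
# is proprietary; every model and label name here is a stand-in.
import cv2
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral"]  # illustrative label set

# Standard OpenCV Haar cascade for face detection (ships with opencv-python).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_patch: np.ndarray) -> dict:
    """Placeholder for a deep-learning emotion classifier.

    A real system would landmark the face and feed normalized features to a
    trained network; uniform scores are returned here to keep the sketch runnable.
    """
    scores = np.full(len(EMOTIONS), 1.0 / len(EMOTIONS))
    return dict(zip(EMOTIONS, scores.tolist()))

def analyze_frame(frame: np.ndarray) -> list[dict]:
    """Detect faces in a video frame and score each one for emotion."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        face_patch = gray[y:y + h, x:x + w]
        results.append({"bbox": (x, y, w, h), "emotions": classify_emotion(face_patch)})
    return results
```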

Eye Tracking

Affect Lab uses remote eye tracking to monitor where, when, and what people are looking at, and for how long


Tracking Eye Movements

Visual activity is monitored using a standard webcam & screen-based eye-tracking software


Algorithms Calculate Visual Metrics

Image processing algorithms calculate the point of gaze to create Heat Maps & Gaze Plots in real time
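As an illustration of this second step, the sketch below turns a set of gaze points into a normalized heat map using a 2-D histogram and a Gaussian blur. The gaze coordinates, screen size, and smoothing parameters are assumptions for the example, not Affect Lab's actual algorithm.

```python
# Minimal sketch of turning gaze points into a heat map. In a real system the
# gaze coordinates would come from webcam-based eye tracking; here they are synthetic.
import numpy as np

def gaze_heatmap(gaze_xy: np.ndarray, screen_w: int, screen_h: int,
                 bins: int = 64, sigma: float = 2.0) -> np.ndarray:
    """Accumulate gaze points into a 2-D histogram and blur it into a heat map."""
    hist, _, _ = np.histogram2d(
        gaze_xy[:, 0], gaze_xy[:, 1],
        bins=bins, range=[[0, screen_w], [0, screen_h]],
    )
    # Separable Gaussian blur so the map reads as smooth attention regions.
    radius = int(3 * sigma)
    xs = np.arange(-radius, radius + 1)
    kernel = np.exp(-xs ** 2 / (2 * sigma ** 2))
    kernel /= kernel.sum()
    blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 0, hist)
    blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, blurred)
    return blurred / blurred.max() if blurred.max() > 0 else blurred

# Example: 500 synthetic fixations clustered near the centre of a 1920x1080 screen.
points = np.random.normal(loc=[960, 540], scale=[200, 120], size=(500, 2))
heatmap = gaze_heatmap(points, screen_w=1920, screen_h=1080)
```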

Brainwave Mapping

Affect Lab uses EEG (electroencephalography) to measure user emotion by analyzing brainwave data


Electrical Activity Monitored

Electrical activity in the user's brain is monitored using Affect Lab's EEG headset


Algorithms Define Metrics

The brain activity data collected by the EEG headset is then interpreted by algorithms to define behavioral and cognitive metrics
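As a simplified illustration, the sketch below derives one common cognitive metric from a single EEG channel: an engagement index based on beta / (alpha + theta) band power. The band definitions and the index itself are assumptions for the example and not necessarily the metrics Affect Lab reports.

```python
# Minimal sketch of deriving a cognitive metric from EEG: band power via FFT and a
# simple engagement index (beta / (alpha + theta)). This is a generic heuristic,
# not Affect Lab's proprietary metric set.
import numpy as np

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # frequency bands in Hz

def band_power(signal: np.ndarray, fs: float, band: tuple[float, float]) -> float:
    """Average spectral power of the signal within a frequency band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return float(psd[mask].mean())

def engagement_index(signal: np.ndarray, fs: float) -> float:
    """One common engagement heuristic: beta power relative to alpha + theta."""
    theta = band_power(signal, fs, BANDS["theta"])
    alpha = band_power(signal, fs, BANDS["alpha"])
    beta = band_power(signal, fs, BANDS["beta"])
    return beta / (alpha + theta)

# Example: 10 seconds of synthetic single-channel EEG sampled at 256 Hz.
fs = 256
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))  # alpha-dominated signal
print(f"Engagement index: {engagement_index(eeg, fs):.2f}")
```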

EmotionAI

An AI platform that uses deep learning algorithms to predict emotion metrics, eliminating the dependency on test users

- Coming Soon -

Metrics And Reports

Why Choose Us?

While most market research techniques rely on what users say, give your research a competitive edge by uncovering customers' unarticulated, unspoken feelings

DON'T DELIVER A PRODUCT, DELIVER AN EXPERIENCE
Impact

100%
Impact on Topline

120 sec
Setup Time

20+
Emotional Parameters Measured