Capabilities

What our technology can do for you

Apply our technology to your needs and integrate it into your products.

Our team has decades of experience in computational face analysis, apparent emotion estimation, machine learning and computer vision.

Discover, measure and track novel biomarkers

Medically Relevant Expressed Behaviour

An observable action, gesture or vocalisation that can provide insight into a person's physical or mental health, generating new and better data to improve patient engagement, experience and reported outcomes.

For example, depression can manifest as psychomotor retardation, which causes changes to the face and voice, whilst delayed facial muscle responses or puzzled or exaggerated facial expressions may be early indicators of developing dementia.

We use machine learning to objectively and automatically analyse face and voice data to interpret medically relevant expressed behaviour and help our customers improve people’s quality of life.

Our technology is clinical grade. It continues to be tested in a clinical context, and we are currently working with NHS Trusts in Nottinghamshire on clinical trials to evidence the clinical safety and efficacy of the technology in an app designed to assist in health assessments of pregnant women.


Understand the emotional drivers of behaviour every second

Continuous Emotion

Measure emotion continuously to reflect how people actually experience it, allowing changes in a user's expressed emotion to be linked to events or changes in their local environment at a high level of granularity.

This continuous approach, where appropriate, can be mapped back to a much less exact categorical representation, for example excited, calm, or angry.

We use cameras to monitor the facial muscle movements underpinning facial expression, identifying how strongly those muscles are activated. We also determine the direction of eye gaze and the pose of the head.

This brings objective measures such as the frequency and intensity of facial muscle actions, head actions, and social gaze to areas which have traditionally been dominated by subjective interpretations.
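As an illustration of what such an objective measure might look like, the minimal sketch below summarises the frequency and mean intensity of a single facial action from a per-frame activation trace. The threshold, frame rate and data are illustrative assumptions, not a description of our production pipeline.

```python
import numpy as np

def au_activation_stats(intensities, fps=30.0, threshold=0.5):
    """Summarise one facial action's per-frame intensity trace (0 to 1).

    Returns activation events per minute and mean intensity while active.
    The threshold and frame rate are illustrative assumptions.
    """
    x = np.asarray(intensities, dtype=float)
    active = x >= threshold
    # Count rising edges: frames where the action switches off -> on.
    onsets = int(active[0]) + int(np.count_nonzero(active[1:] & ~active[:-1]))
    minutes = len(x) / fps / 60.0
    freq_per_min = onsets / minutes if minutes > 0 else 0.0
    mean_intensity = float(x[active].mean()) if active.any() else 0.0
    return freq_per_min, mean_intensity

# Illustrative trace: two brief activations of a single action unit.
trace = [0.0, 0.1, 0.6, 0.9, 0.8, 0.4, 0.1, 0.0, 0.7, 0.6, 0.2, 0.0]
print(au_activation_stats(trace))
```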

Our continuous approach to measuring expressed emotion (VAD) uses machine learning to analyse face and voice data during a number of predetermined tasks. Over time, our software identifies how actively engaged the user is.

We call this Arousal.

We use the same approach to assess how positive or negative the user is feeling (Valence).

We are currently deploying a third dimension, Dominance: how able an individual feels to deal with the cause of the emotion.

By plotting these three values, with Valence on the x-axis, Arousal on the y-axis and Dominance as the depth of the plot, we can pick a point or collection of points within the three-dimensional space and give it a label.

These valence and arousal scores are continuous and take the temporal dynamics of expressed emotion into account.

To help understand and communicate Valence and Arousal, Russell's Circumplex Model of Emotions is sometimes used to translate the continuous representation into a discrete emotion or state label, for example excited, calm, or angry. By performing this analysis over time and in a continuous three-dimensional space we can accommodate many more labels.
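As a minimal sketch of that translation, a continuous VAD reading can be snapped to its nearest discrete label. The prototype coordinates below, on a [-1, 1] scale per dimension, are assumptions for demonstration, not calibrated values.

```python
import math

# Hypothetical (valence, arousal, dominance) prototypes on a [-1, 1] scale.
LABEL_PROTOTYPES = {
    "excited": ( 0.7,  0.8,  0.5),
    "calm":    ( 0.6, -0.6,  0.4),
    "angry":   (-0.7,  0.7,  0.6),
    "sad":     (-0.6, -0.5, -0.5),
}

def nearest_label(valence, arousal, dominance):
    """Return the discrete label whose prototype is closest in VAD space."""
    point = (valence, arousal, dominance)
    return min(
        LABEL_PROTOTYPES,
        key=lambda name: math.dist(point, LABEL_PROTOTYPES[name]),
    )

print(nearest_label(0.65, 0.75, 0.4))  # -> "excited"
```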

Focus on the people that matter


Face Detection and Tracking

Rapidly determine whether or not faces are present in an image and establish their location and 3D head pose / direction.
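By way of illustration, the sketch below uses OpenCV's bundled Haar cascade, a common open-source baseline for face detection. It is not our detector, and the image path is a placeholder.

```python
import cv2

# Load OpenCV's bundled frontal-face Haar cascade (a classic baseline detector).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("frame.jpg")  # placeholder path
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Returns one (x, y, width, height) rectangle per detected face.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    print(f"face at ({x}, {y}), size {w}x{h}")
```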

Objectively measure every visible facial movement


Facial Action Coding System

A comprehensive, anatomically-based system for describing all visible facial movements. FACS provides a standardised way to objectively measure and analyse facial expressions.

FACS identifies 44 distinct facial muscle movements, called Action Units (AUs), that produce the full range of human facial expressions. Each AU is assigned a numerical code and a detailed description of the appearance changes caused by that muscle movement.

By carefully observing and coding the specific AUs present in a facial expression, researchers can objectively analyse the underlying emotional state.
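A few well-known Action Units, sketched as a small lookup (the codes and names follow the published FACS; AU6 plus AU12 is the textbook Duchenne smile):

```python
# A few of the 44 FACS Action Units (numerical code -> muscle action).
ACTION_UNITS = {
    1:  "Inner Brow Raiser",
    4:  "Brow Lowerer",
    6:  "Cheek Raiser",
    12: "Lip Corner Puller",
    15: "Lip Corner Depressor",
}

def describe(observed_aus):
    """List the muscle actions behind a coded expression."""
    return [f"AU{code}: {ACTION_UNITS[code]}" for code in observed_aus]

# Textbook example: AU6 + AU12 together form a Duchenne (felt) smile.
print(describe([6, 12]))
```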

FACS offers several key advantages:
  1. Objectivity: FACS provides an unbiased, systematic way to measure facial expressions, reducing the subjectivity inherent in more qualitative approaches.
  2. Comprehensiveness: The 44 AUs can describe the entire range of visible facial movements, allowing for nuanced analysis of complex expressions.
  3. Cross-Cultural Validity: FACS-based systems have been validated across diverse cultures, making it a reliable tool for global research.
  4. Clinical Applications: FACS has been used to study emotional disorders, autism, stroke, and other conditions involving facial expression impairments.

Know what is being looked at and the emotions being experienced

Eye Gaze Tracking

Rapidly determine where a person is looking by combining the direction of their eye gaze with their 3D head pose, revealing what holds their attention.
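A minimal geometric sketch of one common approach: assuming a head rotation matrix and an eye-in-head gaze direction have already been estimated, the world-frame gaze is their composition. The values below are illustrative.

```python
import numpy as np

def world_gaze(head_rotation, eye_gaze_in_head):
    """Rotate an eye-in-head gaze vector into the camera/world frame.

    head_rotation: 3x3 rotation matrix from head to world coordinates.
    eye_gaze_in_head: gaze direction vector relative to the head.
    """
    g = head_rotation @ np.asarray(eye_gaze_in_head, dtype=float)
    return g / np.linalg.norm(g)

# Illustrative values: head turned 20 degrees, eyes looking straight ahead.
theta = np.deg2rad(20.0)
R = np.array([
    [ np.cos(theta), 0.0, np.sin(theta)],
    [ 0.0,           1.0, 0.0          ],
    [-np.sin(theta), 0.0, np.cos(theta)],
])
print(world_gaze(R, [0.0, 0.0, 1.0]))
```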

Hardware agnostic, software only


Easily Integrated

A lightweight, software-only solution designed for easy integration and customisation, with data privacy by design.

Scientific Rigour

Why BLUESKEYE AI?

There are very few people in the world with the knowledge necessary to design our AI solutions. It takes a PhD and several years of subsequent work in the field to even attempt to build an emotion recognition system that actually works, and it takes a whole team to detect medically relevant behaviour from face and voice analysis.

Blueskeye AI has:

  • A dedicated data team of 14 people

  • 10 people with relevant PhDs

  • A dedicated R&D team of 6 people with over 110 peer-reviewed publications in relevant areas

  • Over 100 years of combined research experience in machine learning, face and voice analysis, and medicine

  • Our Co-founder and Chief Scientific Officer, Michel Valstar, is the second most cited person in the world in social signal processing, with an h-index of 54 and over 18k citations