Thought Leadership Blogs
Our takes on the world today
Deus in Machina - the end of the beginning of conversational AI?
A Swiss church has taken AI to a new level by creating an AI avatar of Jesus! This interactive avatar, placed in a confessional booth, has already been used by over 1,000 people, two-thirds of whom reported that their interaction with this JAIsus resulted in ‘a spiritual experience’.
This groundbreaking experiment sparks fascinating questions about humanity, the future of AI and its impact on society.
Read the full blog post to dive deeper into the implications of this technological advancement and the philosophical discussions it inspires.
The Oximeter Oxymoron
Yesterday, the Independent Review of Equity in Medical Devices published its report on bias in medical devices. Set up in 2022 by the then Secretary of State for Health and Social Care, Sajid Javid, the review sought to establish the extent and impact of ethnic and other unfair biases in the performance of medical devices commonly used in the NHS.
Ethics in Affective Computing
Chief Scientific Officer Prof Michel Valstar's latest blog post charts the development of an ethical approach to affective computing, culminating in the publication of the IEEE Transactions on Affective Computing's Special Issue on Ethics in Affective Computing.
BLUESKEYE AI's Explainable, Robust, and Adaptable Approach to Face and Voice AI
Many people are rightly excited about the opportunities AI offers for preventing, detecting, and treating poor mental health. Doing so requires an ethical approach to AI, which includes making AI that is explainable and accurate regardless of who you are. And to achieve the greatest impact, it should be adaptable for use across as many medical conditions as possible.
BLUESKEYE AI uses a unique approach to achieve explainable, robust, and adaptable analysis of medically relevant face and voice behaviour.
Dimensional Affect: an Explainer of Valence, Arousal, and Dominance (VAD)
Have you ever asked yourself why humans spend such an enormous amount of time and energy on faces? Taking pictures, applying make-up, and writing elaborate descriptions of them in books and stories?
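The dimensional-affect model named in the title represents an emotional state not as a category but as a point on the continuous valence, arousal, and dominance axes. As a minimal illustration of that idea (the coordinates and emotion labels below are illustrative placements chosen for this sketch, not values from the post), one could model a VAD state like this:

```python
from dataclasses import dataclass
import math

@dataclass
class VAD:
    """An emotional state as valence, arousal, dominance, each in [-1, 1]."""
    valence: float    # unpleasant (-1) .. pleasant (+1)
    arousal: float    # calm (-1) .. excited (+1)
    dominance: float  # submissive (-1) .. in control (+1)

    def distance(self, other: "VAD") -> float:
        """Euclidean distance between two states in VAD space."""
        return math.sqrt(
            (self.valence - other.valence) ** 2
            + (self.arousal - other.arousal) ** 2
            + (self.dominance - other.dominance) ** 2
        )

# Illustrative coordinates only: rough, hand-picked placements.
EXAMPLES = {
    "content":   VAD(0.8, -0.4, 0.3),
    "angry":     VAD(-0.7, 0.7, 0.4),
    "depressed": VAD(-0.7, -0.6, -0.5),
}

def nearest_label(state: VAD) -> str:
    """Map a continuous VAD estimate to the closest labelled example."""
    return min(EXAMPLES, key=lambda name: state.distance(EXAMPLES[name]))
```

For example, `nearest_label(VAD(-0.6, -0.5, -0.4))` lands closest to the "depressed" point. The appeal of the dimensional view is exactly this: a model can output continuous estimates, and discrete labels become an optional, interpretable layer on top.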
Supporting Clinical Practice with Ethical Emotion AI
Two weeks ago I attended CES 2024, a massive trade fair drawing over 130,000 people to see the latest in consumer electronics. BLUESKEYE AI had a stand there, in the Digital Health Zone, with the aim of finding new (business) customers. For four days, the team and I explained what we do. Invariably, I’d start with ‘BLUESKEYE specialises in recognising medically relevant face and voice behaviour, to help pharmaceutical companies and the automotive industry detect conditions such as depression, fatigue, pain, and many others'.
There are always a fair few well-connected clinicians there, often acting as technology scouts for companies in the health sector, and as you can imagine they got really excited about the possibilities for healthcare! We’d have a good conversation about how BLUESKEYE AI could help them, and I came away genuinely inspired about how we can support clinical practice with our Ethical Emotion AI. To give everyone working in healthcare the benefit of that conversation without having to fly to CES, in this article I set out some of the ways in which BLUESKEYE AI can support clinical practice.
What does the EU AI Act mean for Affective Computing and Emotion AI?
The EU AI Act will heavily affect the innovation and commercialisation of Affective Computing in the EU. Despite talk of measures to support innovation, the act will seriously stifle innovation in this area, hitting SMEs and startups particularly hard. It has been adopted by the EU and is now being implemented by its member states, which means that very soon providers of Affective Computing and Emotion AI systems will have to comply with its stipulations in the EU. The act defines prohibited, high-risk, and low-risk AI systems, with fairly onerous obligations for providers of high-risk systems and relatively few obligations for low-risk systems.
Data are Oompa Loompas
In 2006, British mathematician and entrepreneur Clive Humby coined the phrase “Data is the new oil”. The analogy has often been repeated since, including by myself, but it doesn’t stand up to much scrutiny. Sure, as with oil, there’s big money involved, and controlling data gives you power much as controlling oil does, so it is worth putting thought and effort into.