Ethical Technology

Fair

At BlueSkeye we want our software to work for everyone, irrespective of apparent gender, age, ethnicity and circumstance.

This is not always the case. There have been a number of widely publicised missteps where algorithmic bias resulted from training data skewed towards male, Caucasian or English-speaking subjects.

To guard against this, we have completed a model bias evaluation of our face tracker across images of people of different ages, genders and ethnicities.

Our analysis concludes that the face tracker remains effective irrespective of age, apparent gender and ethnicity.
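As a rough illustration of what such a subgroup evaluation can look like (a minimal sketch: the sample data, group labels and detect_face stub below are hypothetical, not our actual pipeline), one can compare the tracker's detection rate across demographic groups and flag any gap between the best- and worst-served group:

```python
from collections import defaultdict

def evaluate_bias(samples, detect_face):
    """Compare detection rates across demographic groups.

    `samples` is an iterable of (image, group_label) pairs and
    `detect_face` returns True when a face is successfully tracked.
    Both are hypothetical stand-ins for a real evaluation pipeline.
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for image, group in samples:
        totals[group] += 1
        if detect_face(image):
            hits[group] += 1
    rates = {group: hits[group] / totals[group] for group in totals}
    # A large gap between groups signals bias worth investigating.
    gap = max(rates.values()) - min(rates.values())
    return rates, gap

# Hypothetical usage with toy samples and a stub detector:
samples = [("img1", "20-35"), ("img2", "65+"), ("img3", "65+")]
rates, gap = evaluate_bias(samples, detect_face=lambda img: True)
print(rates, gap)  # {'20-35': 1.0, '65+': 1.0} 0.0
```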

Private

We create our technology with privacy by design.

Data collection and storage are minimised wherever practical, and we process all data on people's own devices, without using the cloud. Users choose who they share their data with and when, and sharing always happens with end-to-end encryption.
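To illustrate the end-to-end principle (a minimal sketch using the PyNaCl library; the function name and data below are hypothetical, not our actual implementation), data can be encrypted on the device for one chosen recipient, so that no server relaying it ever sees the plaintext:

```python
from nacl.public import PrivateKey, SealedBox  # PyNaCl

def share_result(result: bytes, recipient_public_key) -> bytes:
    """Encrypt an on-device analysis result for one chosen recipient.

    Only the recipient's private key can decrypt the ciphertext, so the
    data stays unreadable to anything that relays it (end-to-end).
    """
    return SealedBox(recipient_public_key).encrypt(result)

# Hypothetical usage: the recipient generates a keypair once...
recipient_key = PrivateKey.generate()
# ...the sender encrypts locally, before anything leaves the device...
ciphertext = share_result(b"behaviour summary", recipient_key.public_key)
# ...and only the recipient can decrypt.
plaintext = SealedBox(recipient_key).decrypt(ciphertext)
assert plaintext == b"behaviour summary"
```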

Transparent and Interpretable

Many advanced AI systems, such as deep neural networks, are complex black boxes, making it impossible to fully explain the internal logic behind their outputs. This lack of transparency prevents humans from verifying that a system is sound.

Our AI models are designed to be interpretable and transparent, with predictions based on readily verifiable data, so that outputs can be checked independently.

Greater transparency not only enables auditing AIs for bias, but also keeps humans meaningfully in the loop so we don't fully surrender agency to machines.
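As a toy example of what interpretability buys (the feature names and data below are illustrative, not drawn from our models), consider a linear classifier: every prediction is a weighted sum of measurable features, so each learned weight states exactly how one feature moves the output and can be inspected by a human.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical, directly measurable features; random data for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] - 0.5 * X[:, 2] > 0).astype(int)

model = LogisticRegression().fit(X, y)

# Each coefficient states how much one named feature moves the prediction,
# so a reviewer can audit the model's reasoning feature by feature.
for name, coef in zip(["brow_raise", "smile", "gaze_aversion"], model.coef_[0]):
    print(f"{name}: weight {coef:+.2f}")
```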

Lightweight

We have designed our technology to run at the edge, including on mobile devices.

However, mobile hardware has limitations on memory, storage space, and computational power. That's why our team has engineered our machine learning software to be lightweight and efficient from the start.
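One widely used technique for this kind of slimming (a generic sketch using PyTorch's dynamic quantization, not a description of our actual toolchain) is to store trained weights as 8-bit integers, cutting model size roughly fourfold and speeding up inference on CPU-bound devices:

```python
import torch
import torch.nn as nn

# A small illustrative network; real production models are not shown here.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 8))

# Post-training dynamic quantization stores Linear-layer weights as
# 8-bit integers, shrinking the model and reducing compute at inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 128)
print(quantized(x).shape)  # torch.Size([1, 8])
```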