Unbiased Coding & Algorithms

24 October 2017 · Posted by Ina Danova · Insights

At a time when many of the systems that manage our lives depend on artificial intelligence and machine learning, any bias built into the algorithms behind them can also skew the outcomes we rely on in every realm of life: healthcare, education, employment, banking, investments, etc. Accountability in coding is therefore becoming paramount in the development of any system that affects people.

How does bias creep into algorithms?

Bias occurs in algorithms when certain parts of the population (e.g. women, the unemployed, disabled people, people of colour, LGBT people and other minority groups) are negatively or unfairly impacted by decisions or outcomes produced by software tools, predetermined by algorithms that don’t take their needs or particular circumstances into account. This can happen without anyone having a preconceived malicious agenda, simply because these circumstances are treated as ‘exceptions’ or haven’t been fully thought through when the typical rules of the system were defined.

In most cases, this so-called coding bias is the unintentional result of development practices designed with the average or typical user, beneficiary or event in mind. Being conscious of the possibility of unfairness is a good first step towards actively avoiding it while the algorithm is being defined.
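To make this concrete, here is a minimal, purely illustrative sketch (hypothetical data, not any real product) of how a model trained mostly on the ‘average’ user can look accurate overall while quietly failing an under-represented group:

```python
# Illustrative sketch: a classifier fitted mostly to the majority group
# performs well for that group but poorly for the under-represented one.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # One feature whose relationship to the label differs per group.
    x = rng.normal(loc=shift, scale=1.0, size=(n, 1))
    y = (x[:, 0] > shift).astype(int)
    return x, y

# 95% of the training data comes from group A, only 5% from group B.
xa, ya = make_group(950, shift=0.0)
xb, yb = make_group(50, shift=2.0)
model = LogisticRegression().fit(np.vstack([xa, xb]), np.concatenate([ya, yb]))

# Evaluated on balanced held-out sets, the accuracy gap becomes visible.
for name, shift in [("group A", 0.0), ("group B", 2.0)]:
    xt, yt = make_group(500, shift)
    print(name, "accuracy:", round(model.score(xt, yt), 2))
```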

Real-life examples of code bias

MIT grad student Joy Buolamwini was working with facial analysis software when she ran into an issue: the tool didn’t detect her face, because its underlying algorithm hadn’t been trained to recognise a wide array of skin colours and facial structures (it assumed most users would be Caucasian). Buolamwini has since been determined to fight bias in machine learning, which she has dubbed ‘the coded gaze’ and describes in further detail in her TED talk. Other examples of the coded gaze include flawed and misrepresentative systems for ranking school teachers and a gender-biased natural language processing (NLP) model, among others.
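One practical way to surface a gap like the one Buolamwini hit is to report accuracy per demographic group rather than a single overall number. The snippet below is an illustrative sketch with made-up labels and predictions, not data from any real system:

```python
# Break accuracy down per group instead of reporting one overall figure.
from collections import defaultdict

# Each record: (group, true_label, predicted_label) -- illustrative data only.
results = [
    ("lighter-skinned", 1, 1), ("lighter-skinned", 0, 0), ("lighter-skinned", 1, 1),
    ("darker-skinned", 1, 0), ("darker-skinned", 1, 1), ("darker-skinned", 0, 1),
]

correct = defaultdict(int)
total = defaultdict(int)
for group, truth, prediction in results:
    total[group] += 1
    correct[group] += int(truth == prediction)

for group in total:
    print(f"{group}: {correct[group] / total[group]:.0%} accuracy "
          f"({total[group]} samples)")
```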

What is the developer community doing about this?

Although we are still in the early days of detecting algorithmic bias and dealing with its consequences, thought leaders are realising the global impact this trend could have on already disadvantaged populations if it isn’t addressed in a responsible and timely fashion.

Earlier this year, an initiative called AI Now was formed by representatives of Microsoft, MIT and Google to study and combat what researchers and practitioners alike are beginning to recognise as an important issue. The founders of AI Now say that, although it is hard to detect, bias may already be present in many of the products and services we use daily.

Preventing bias when developing apps

It’s easy to leave the responsibility of algorithm building and code writing to the programmers, those mystical creatures who can rarely be spotted in daylight, and let them deal with any bias they might be unwittingly or unknowingly contributing to. However, it is the ethical responsibility of every organisation, product owner and development company to be aware of the potential for bias when creating software systems, whether it affects something as seemingly innocuous as recognising skin tone or as serious as predicting unemployment rates. Addressing this potentially devastating trend should start with awareness and with conversations involving all stakeholders.

Thus, business analysts, decision makers on the client side and project leads on the agency side all need to join the discussion of potential bias threats with those responsible for building the algorithm itself. Ideally, the issue should be flagged at the very beginning of a project, pointing out potential pitfalls and agreeing on an action plan for how they will be resolved down the road. To make bias consideration and avoidance a genuine part of software development, it needs to become a formal part of the planning and validation processes, be addressed in project documentation and be referenced during the maintenance phases of the product or service lifecycle.
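As an illustration of how such a check could become a formal validation step, the sketch below compares the rate of positive outcomes across groups and flags the release if the gap is too large; the 0.8 ratio used here, like the data, is only an assumed example:

```python
# A bias check that could run alongside other validation or test steps.
def selection_rates(predictions, groups):
    """Fraction of positive predictions per group."""
    rates = {}
    for g in set(groups):
        picks = [p for p, gr in zip(predictions, groups) if gr == g]
        rates[g] = sum(picks) / len(picks)
    return rates

def check_disparate_impact(predictions, groups, min_ratio=0.8):
    """Raise if the lowest selection rate is below min_ratio times the highest."""
    rates = selection_rates(predictions, groups)
    ratio = min(rates.values()) / max(rates.values())
    if ratio < min_ratio:
        raise AssertionError(f"Possible bias: selection rates {rates}, ratio {ratio:.2f}")
    return rates

# Example run with illustrative data: 1 = approved, 0 = rejected.
predictions = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]
groups =      ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]
try:
    check_disparate_impact(predictions, groups)
except AssertionError as warning:
    print(warning)  # group B is approved far less often than group A
```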

The development team at PegusApps actively fights algorithm bias, first by preventing it from occurring wherever possible and, in later phases, by revisiting code to ensure it isn’t unjustly affecting any group of users or subjects.

Copywriter: Ina Danova

Tags: Artificial Intelligence, Digital Transformation, Internet of Things