Fighting bias in the machines
Jonathan Murphy
Technology & Creativity Blog editor

Ethical coding has become a buzzword recently around the BBC's technical community. So it was timely that at this year's developers conference, run jointly by Design & Engineering and the Academy, one of the keynote speeches focused on Bias in the Machine.
Rob Harrop, CEO of software agency Skipjaq, which specialises in machine learning, looked at the pros and cons of bias. He explained that bias is sometimes necessary: "good bias", as he calls it, underpins car insurance that rewards good drivers, or sports services that offer stories to a particular section of football fans.
But he mainly focused on "bad bias", which can lead to incorrect or discriminatory results. Having previously worked in the financial loans industry, he gave examples of how some groups would be denied approvals as a result of demographic assumptions and poor data.
Skipjaq CEO Rob Harrop counters the claim that machines are unbiased.
Rob explained that organisations can guard against machine bias by testing models thoroughly across different scenarios. He also gave examples of algorithmic tools that coders can use to counterbalance either pre-existing data biases or learned biases. In addition, he said, there's a need to create a technical culture that challenges assumptions and human bias.
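Rob didn't walk through code on stage, but one widely used counterbalancing technique of the kind he described is inverse-frequency reweighting: giving samples from under-represented groups more weight so a model doesn't simply learn the skew in its training data. A minimal sketch in Python, using entirely hypothetical group labels:

```python
from collections import Counter

def reweight(groups):
    """Assign each sample a weight inversely proportional to its
    group's frequency, so every group carries equal total weight
    during training - a simple counter to pre-existing data bias."""
    counts = Counter(groups)
    n, k = len(groups), len(counts)
    # n / (k * count) makes each group's weights sum to n / k
    return [n / (k * counts[g]) for g in groups]

# A hypothetical applicant pool heavily skewed towards group "A":
groups = ["A"] * 8 + ["B"] * 2
weights = reweight(groups)
# Both groups now contribute equal total weight (5.0 each),
# despite "A" having four times as many samples.
```

Most machine-learning libraries accept weights like these directly (for example via a `sample_weight` argument at fit time), which is what makes this such a low-friction first step before deeper fixes to the data itself.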
Rob explains how classification is open to both "good" and "bad" bias.
