US politicians have drafted a law mandating that tech companies test prototype algorithms for bias.
It is now commonplace for organisations to use algorithms, or coded instructions, to carry out everyday activities: showing users relevant advertisements, analysing user behaviour, and sorting data. Any bias written into that code could therefore affect marginalised social groups in ways that are not immediately detectable.
The statement supporting this bill gave an example, citing a recruitment algorithm used by Amazon that was found to be biased against women. In addition, in March of this year, the US Department of Housing and Urban Development sued Facebook for failing to police how advertisers used algorithms to target specific audiences: advertisers selling homes were able to pinpoint their audience based on race, religion and nationality.
The bill only applies to companies with annual revenues of £38 million or those that hold data on more than one million people. Law applicants might be interested in the wording of the bill itself, found here, and in what the possible effects of, or reasons for, these conditions might be.
The bill does not, however, come without its critics, who argue that such legislation may limit the possible advantages of artificial intelligence. Daniel Castro of the Information Technology and Innovation Foundation has also suggested that this ‘hold[s] algorithms to a higher standard than human decision’, resting on what he believes to be an inaccurate assumption that decisions made by machines are more dangerous than those made by humans.
Engineering and computer science applicants may use this as an impetus to reflect on the ethical considerations around artificial intelligence. The debate around this bill highlights the balance – and potential tension – between the push for technological advancement and the pull of political regulation. Those interested in studying politics might evaluate this proposed law and the role of politicians in regulating technology and business.
Our Oxbridge-graduate consultants are available between 9.00 am and 5.00 pm from Monday to Friday, with additional evening availability when requested.
Oxbridge Applications, 14 – 16 Waterloo Place, London, SW1Y 4AR