Text 4
The news that the Home Office is sorting applications for visas with secret algorithms applied to online applications is a reminder of one of Theresa May's more toxic and long-lasting legacies: her immigration policies as home secretary.

Yet even if the government's aims in immigration policy were fair and balanced, there would still be serious issues of principle involved in digitizing the process. Handing over life-changing decisions to machine-learning algorithms is always risky. Small biases in the data become large biases in the outcome, but these are difficult to challenge because the use of software covers them in clouds of confusion and supposed objectivity, especially when its workings are described as "artificial intelligence". This is not to say they are always harmful, or never any use: with careful training, well-understood and clearly defined problems, and good data to operate on, software systems can perform much better than humans ever could.

This isn't just a problem of software. There is a sense in which the whole of the civil service, like any other bureaucracy, is an algorithmic machine: it deals with problems according to a set of determinate rules. The good civil servant, like a computer program, executes their instructions faithfully and does exactly what they are told.