AI and the ironies of automation - Part 2

In the previous post, we discussed several observations Lisanne Bainbridge made in her much-noticed 1983 paper “The ironies of automation” and what they mean for today’s attempts to automate “white-collar” work with LLMs and LLM-based AI agents, which still require humans in the loop. We stopped at the end of the paper’s first chapter, “Introduction”. In this post, we will continue with the second chapter, “Approaches to solutions”, and see what we can learn there.

Comparing apples and oranges?

However, before we start: some of the observations and recommendations made in the paper must be taken with a grain of salt when applied to today’s AI-based automation attempts.

When monitoring an industrial production plant, a human operator often has only seconds to act if something goes wrong, to avoid severe or even catastrophic accidents. It is therefore of the highest importance to design industrial control stations in a way that lets a human operator recognize deviations and malfunctions as easily as possible and immediately trigger countermeasures. A lot of work goes into the design of all the displays and controls, e.g., the well-known emergency stop switch: painted a screaming red and big enough to be hit with a flat hand or fist within a fraction of a second if needed.

When it comes to AI-based solutions automating white-collar work, we usually do not face such critical conditions. However, this is not a reason to dismiss the observations and recommendations in the paper too easily, because, e.g.:

- Most companies are efficiency-obsessed. Hence, they also expect AI solutions to increase “productivity”, i.e., efficiency, to a superhuman level. If a human is meant to monitor the output of the AI and intervene if needed, this requires the human to comprehend what the AI solution produced at superhuman speed; otherwise we are down to human ...