In the teaser video for the performance 'Robin, a theatrical talk about the future of work', by Stichting Nieuwe Helden, actress Isil Vos presents us, in the guise of tech-guru Robin Sharp, with the possibilities of a chip implant. The device will articulate your thoughts for you, even before you are able to do so yourself, and then share them with the outside world. It can soothe negative emotions with small doses of hormones. Less depression is the result and productivity increases by 50%. Fifty percent!
Mental health is a hot issue in the business world. Many taboos are being broken in this area and prominent thought leaders are talking about it. But is the mental health of employees just another production factor? A new metric we can track as managers to help our people perform better?
Isil: "We have actually been doing this for a long time. Our first inclination has always been not to see the sick or under-functioning parts of a human being as something that belongs there, but to tackle them, to get them out. But the technical acceleration has made us much better at it. There are many more possibilities. And the ethical boundaries are shifting."
The technologies in the teaser may sound futuristic, but they are on the verge of becoming possible, or already are. A year of research preceded the performance. Isil: "I don't mention anything crazy in that film. It is all possible; we have had the figures calculated and the proposed technological possibilities checked. The question is above all: what are our ethical boundaries? The corona crisis has shifted many boundaries. For example, we always thought it was very important to have freedom of choice about what happens to our bodies. Now that is different. Look at how easily we dealt with the corona app. Privacy was hardly even raised as an issue anymore."
According to Isil, this shifting of ethical boundaries makes many innovations possible that do not necessarily help us feel better as workers, but mostly help companies to increase their profits.
Stefan Leijnen, lecturer in Artificial Intelligence at Utrecht University of Applied Sciences and advisor to Node1, naturally follows these developments closely, although he has a slightly different view of ethics: "I think ethics is a last resort. When you see no other options, you appeal to morality. We will really have to do more than just make that appeal. Because what Isil says about the acceleration is of course true. This era of converging technologies - AI and quantum computing, data and biotechnology - is producing a tremendous acceleration, threatening to leave us lagging behind. Google and Facebook have had two decades of pretty free rein to develop, because ad companies were not strictly regulated. For too long, we saw advertising as a safe technology that had little or no negative impact on society. Now that technology is making the transition to other sectors, we need instruments to control this acceleration. We do not have them, or only to a limited extent. That is what we need to talk about now."
But before we get there, we have another conversation to have. Because what does the future of work look like if technology touches and changes every aspect of our work? It's a question we deal with on a daily basis at Node1. But the technology we work with - RPA, task mining, process mining, low-code software development and analytics - doesn't just impact how work gets done. It is shaping the future of work and changing what we mean by 'work'. At the same time, the Internet has changed our entire understanding of 'working together'. This development is unstructured and has been extremely accelerated by the corona crisis. Isil also noticed, in the many conversations she had around this show, that companies and managers do not have enough background knowledge and insight to think about this:
"These conversations are not taking place. The questions being asked are of a very different kind. If you are a bank, the questions are: where should we limit risks? What is the most efficient process? There are a lot of people in a lot of places who really have no idea what the implications of these kinds of choices are. What fascinated me most in those conversations was that people are genuinely concerned with how people are doing. I didn't speak to any nasty people. But the boundaries move as the technical possibilities move. And in the end, efficiency, growth and profit are the core of a business. In itself, that's not a problem, but growth has almost become the only goal. That is very difficult to change. Because even that sweet HR manager who wants to take care of everyone has to justify her budget."
Stefan: "When economic growth becomes a religion, the government must step in and make adjustments."
Isil: "The idea of growth is actually weird. Every year, the profit percentages have to go up. Why is that? Alternative economic models, other ways of looking at growth, have existed for a long time. Because this infinite growth is now taking its toll: we are heading for burnout en masse. We are asking more and more of ourselves. We are working in a system without end, in a world where everything is finite."
Everything is finite. And we're not just talking about the earth and its resources, but also about human resilience. Because even with the best intentions in the world, it is difficult to work on the well-being of employees at scale. Especially when everyone also works from home. And that is reflected in the figures.
"How do you take care of the welfare of 40,000 people?" asks Isil. "Then you get culture programmes and people-oriented technology, such as bracelets that are supposed to measure your well-being. Boundaries then become increasingly blurred. A badge that was intended for security also turns out to be useful for registering how often you leave the department to go to the toilet. And if you have to scan your pass so often, wouldn't a chip in your hand be more convenient? People go along with this kind of control technology to a remarkable extent. Monitoring software for home workers has been booming business since corona. And people accept it. Our own ethical boundaries have shifted enormously. But meanwhile, burn-out rates at large international tech companies are skyrocketing."
"For Robin, we compared many studies, and they all use a different definition of 'burnout' and therefore report different figures. But you can say that about one in five employees suffers from burnout, and that does not include the self-employed, who are a category of their own. The corona crisis has exacerbated this in many sectors. Sectors such as education and care are outliers in this respect."
The reality, apparently, is that companies find it extremely difficult to include welfare in their technology decisions. Stefan sees a pattern in this: "What I learnt from the discussion that evening is that 'helping' always means helping from the perspective of the company. I think the high burnout rates are also due to the fact that people have begun to see themselves as a means to an end rather than an end in themselves. The ultimate goal, for everyone, is always a business goal. That's how you keep lagging behind reality. You always end up with a kind of tech-solutionism or tech-optimism when you ask more questions."
Tech-optimism: the conviction that technology, as long as it is given free rein, will eventually solve all our problems. Isil and Stefan agree that this is a dangerous pitfall. Isil is particularly annoyed by the assumption that many human characteristics are 'suboptimal' in a system context: "We mustn't fall over anymore; we see that as weak and unnecessary, because we can prevent it, can't we? Well, no. A human being fails, falls ill, mourns, or sometimes has just had enough for a while. As human beings we are not endlessly malleable. There are simply limits."
Stefan: "By being tech-optimistic, you run the risk of ignoring the rest of the organization and forgetting about people. If you see automation as a goal, it can lead to a crusade to automate everything as much as possible. Technology is a tool. Something that offers possibilities. For example, technology allows us to have this conversation online. But the combination of automation and human work is a complicated story; we actually have very little knowledge about it."
All this raises fundamental questions about what we actually want to achieve with technology. Stefan illustrates this with an example: "Take a self-driving car. It needs to be able to tell the difference between a child crossing the road and a plastic bag blowing into the road. Before such a car decides to brake, it has to calculate a certain probability that it is indeed a child and not a plastic bag. What percentage should that be? Ninety per cent? Fifty? Three? And who am I, the programmer, to determine that? The programmer's manager doesn't have the answer either. And neither does the CEO. Nobody is authorised to make a judgement on that. Because the assumption is wrong. The question should not be: how do we replace the driver of the car? The question should be: what do people do and how can we best help them?"
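Stefan's dilemma can be made concrete in a few lines of code. The sketch below is illustrative only: the function names and the threshold value are invented for this example, not taken from any real system. The point is that the ethical question ends up hidden inside a single constant that someone, somewhere, had to choose.

```python
# A minimal sketch of Stefan's example: a detector reports a probability
# that the object ahead is a child, and the car brakes if that probability
# exceeds a threshold. The value 0.03 is an arbitrary illustration -
# precisely the number nobody feels authorised to set.

BRAKE_THRESHOLD = 0.03  # 3%? 50%? 90%? Who decides?

def should_brake(p_child: float) -> bool:
    """Brake if the estimated probability that the object is a child
    meets or exceeds the chosen threshold."""
    return p_child >= BRAKE_THRESHOLD

# The logic is trivial; the moral weight sits entirely in the constant.
print(should_brake(0.90))  # high confidence it's a child -> True
print(should_brake(0.02))  # likely a plastic bag -> False
```

Notice that no amount of engineering skill answers the question of what `BRAKE_THRESHOLD` should be, which is exactly why Stefan argues the question itself is wrongly framed.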
Working more efficiently may still be part of the answer. It all depends on how you deal with the time that becomes available. Isil: "Automation was supposed to give us more time. But in practice, we only end up working more. So ask yourself this question: if you work more efficiently and you save time as a result, whose time is that? Does it belong to you or to your employer? Do they pay you for working 40 hours, or for achieving certain results? And when you've achieved those results, can you go for a walk?"
Stefan likes this idea: "If people can make themselves more efficient with low-code and RPA and if they get some of that time back, that is a nice incentive to democratise automation. The freedom to organise your own work also gives room for creativity."
So, by critically examining our assumptions rather than simplifying them away, we arrive at a way of automating that actually does work. A model that allows us, within the context of a commercial company, to make decisions about technology that benefit people. According to Stefan, commercial and human interests do not have to contradict each other: "What is best for the company? Is it always efficiency? Automation is a means, not an end in itself, and you need to have a nuanced conversation about it. There are two external factors that play a role here. The first is the changing social framework. Society is changing and expects more from companies. The government is now playing its part and is introducing new rules and laws. As a company, you can sit back and wait, or you can ensure that you are prepared for this. Social credit systems and keeping an eye on employees, no matter how good your intentions, will be banned within a few years. You will have to come up with alternative solutions. In addition, the labour shortage is growing and employees are demanding more from their organization. It helps companies to start thinking about what is important for their employees. If you understand more about the nature of work, you can better help people to take care of themselves and bind them to you."
Isil: "When you introduce technological tools into your organization, when you replace work and people with automation, it irrevocably changes the organization itself. And in the end, a business is not just input and output. It is also a mini-society. An employer also bears responsibility for that."
For many organisations, managers and IT professionals, this means above all: acquiring knowledge. Learning to look with new eyes at technology and at the processes within companies. It also means that we all have to start learning from experience and previous mistakes, so that we can use the current wave of new technology in a way that is sensible and responsible for the long-term prospects of our companies. But above all, it means that there will always be questions we cannot yet answer, and that we must always, at any point in the process, be prepared to revisit and adjust decisions.
Or, as Isil puts it: "They are complicated questions. And we don't have the solution. And yet we have to ask them."