Invisibles – exploitation in the digital world of work DW Documentary

Go get ready. It's 7:30. I'm 42.

I have 2 children, teenagers. I stopped working for 12 years to raise my children. Now, my husband and I are separated. Going back to work after a 12-year hiatus isn't easy. One day, while searching on the net, I came across a site called Lionbridge. They offered various translation jobs.

They also worked with Google, doing micro-working. That's how I got started, 3 and a half years ago. I connect via a Gmail address created for this job. Then, I go to a specific website where jobs are posted.

The jobs involve things like comparing a real person's search on a website with a keyword. Here, we have "Dijon mustard" and "mustard of Dijon." I'm asked if they mean the same thing, for Google searches. So, I say, yes, they do mean the same thing. Here's another similar search.

"Hang painting without drilling" and "Hanging paintings without drilling." I, of course, say they mean the same thing. So, I'm helping the algorithm to improve. With a lot of these tasks, I think almost anyone could do them. After 3 and a half years, it becomes automatic. Working and doing the same thing for a couple of hours at a time, it becomes rather robotic.

It's like you become a robot. I don't really have a contract. I can stop working at any time without notice, which I would never do.

And they can fire me whenever they want, without notice. There's no official contract. I've never met the people I work for. It's all done over the internet.

I've never even spoken to anyone in 3 and a half years' work. I wish to remain anonymous because Google, in what is a semblance of a contract, refuses to let us talk about this work. In fact, I'm not allowed to talk about it or to speak out in public or even work in public places. No one can see what we do. My name is Nomena. I'm 32 years old, I'm married, and I have 3 children who are 12, almost 5, and almost 3 years old.

The work takes place on a platform called Alcmeon. We have a login and a password. At 2 PM every day, I log onto the platform.

On this platform, we deal with messages from Disneyland Paris on Facebook and Twitter. There are reservations, complaints, thanks, comments that we can delete, that aren't important to the department. I have to wait. As soon as there's a message, I process it. We have model answers that you just have to personalize a little bit: change the first name, or adapt it a bit, according to the client’s request. I work 6 days a week.

For this work, I earn a monthly flat rate of 200 euros. So it's 200 euros for 48 hours a week. Apart from that, I found a job on another platform. The selection process was very tough, difficult and long.

That platform says it wants to help young women from Africa and Asia earn a stable income. To help them achieve their goals. But they didn’t pay what they promised.

After a while, some of us decided to boycott certain tasks and demand a rate increase. We sent a group email to the people in charge, asking them to apply the 2 euros 86 cents per hour they had promised when we started. Only then did they explain that the 2 euros 86 cents was just an example of what you could earn, in theory.

They said that by working for an hour, doing multiple tasks, we should be able to make 2 euros 86 cents. How can you make 2 euros 86 cents in one hour if a task is paid 27 cents, and each task takes at least 15 minutes to complete? That is simply not possible. We don't know who owns the platform. But on their website, there are several companies listed: the SNCF, which is the French national railway, and French banks.

Nothing but big companies. I don’t know if these companies are aware of what’s happening. But they benefit from the exploitation carried out.

That's undeniable. In the end, they pay very little for high-quality work. The company profits. It's all about money. It's the rich who exploit the poor to make more profit for themselves, without caring about the welfare of those who do the work for them. That’s the way it is.

They can pay more, but they prefer to make bigger profits and get richer and richer. So, the operators and moderators remain operators and moderators until they die. These platforms open branches all over the world. They get more customers, they make a lot of money. But we remain moderators forever. Every month, we work just to keep going the next month.

If you get paid on the 15th, you have to juggle the bills and really manage your money just to make it last a month, when you get your next salary. That's how it works. It's a vicious circle.

After college, I traveled. I stopped in Barcelona. I found odd jobs before coming to work for Facebook. I am not allowed to say anything about the company. Nothing about how it works, our working conditions, our salary, our contract, what we are doing, nothing.

So, this is very risky. I'm taking a big risk. It could ruin my life. It might sound like a movie, but you never know. You just never know.

They have bottomless resources whereas I have none. So, it's not easy. Facebook partnered with an outsourcing firm. That firm found the employees and trained them according to Facebook's guidelines. But it was this outsourcing firm that actually handled the employees. I was a content moderator.

I analyzed the content to see if it was appropriate for the platform. On a typical day, you arrive at the office, go upstairs, leaving all your personal items in a locker, and sit at a computer which isn't yours. Then, you spend some time reviewing the decisions made in previous days.

The decisions are always monitored. After that, you start moderating content all day long. When you arrive, you clock in. Not with a badge, but with your fingerprints.

The amount of moderating people do at Facebook depends on their pace and the time they've worked there. From one person to the next, it varies between 300 and 600 reviews a day. Murder, suicide, rape, domestic violence, racism, discrimination, bullying...

For me, the worst thing is... the beheadings and the rape of babies. I think they're the worst. If you're still a bit naive or innocent, please don't take this job.

Don't do it. Don't spoil that. There are things you don't want to see.

It's not worth it. Once you've lost your innocence, you can't get it back. My name is Chris. I'm from England originally and I lived a long time in Asia and then I came back to Ireland with my new wife and we needed to find work.

And this was the easiest job that I could find. The job title is Community Operations Analyst. It doesn't say anything about Facebook.

It doesn't really explain the job. They tell you it's about analyzing trends and recognizing what's happening in the world and implementing standards and so forth. It sounds very exciting, but we're working in a Facebook building. We're using the Facebook systems. I have a Facebook I.D. I have documents for Facebook that I have to sign.

It's... you're working for Facebook, but there's a gap, I think, and it's just to protect Facebook legally, I think. Most of my work was related to content moderation. So I was just looking at pornography, or content that people had reported as pornography. So mostly naked ladies. It was a very nice... My first month was a very pleasant month of work.

But then you would see some disgusting animal sex or children or something occasionally. So it was a little bit of a shock. And then later, the priority for the U.K. team is hate speech, bullying, threats of violence, you know, nasty, nasty stuff. And I would spend six hours reading arguments between people or complaints about Muslims or Black people or English people or French people. Just nasty, awful stuff all day. So when I started the job, I was, I was kind of excited. You know, I'm saving the world. I'm here to protect the people who use Facebook from bad actors.

This is the name we have for people whose actions are bad. So you would just review the content and think about the rules. And after some time, you notice maybe you’re agreeing with the content or disagreeing and getting angry. And then after a little bit more time, you’re responding more to something bad that you see. Oh, God, not this again. I hate this guy.

Why do people do this? And you start to... It's not immediate, because you're ready, but over time, it starts to just slap, slap, slap, slap. You start to feel the pain. It's there. I can see every detail, and it comes back and it's deep in my mind. It's buried in my head. Yeah, it's really, really hard.

I have several friends that I worked with before who are now unable to work, or out of work and can't find a job. They are taking medication, anxiety medication. They're taking antidepressants.

My doctor told me I'm depressed and I need to take an SSRI, Prozac. Because she thinks I'm depressed and I am... I don't believe in these diseases. I don't, I don't want to take medicines.

But I'm not doing very well either. I've had three jobs since I left Facebook. I can't keep a job.

I get into arguments with people about nothing. So... we need help.

We need... a professional person to spend some time with us and help us to understand what problems we have and what we can do. The NDA means I can't have this conversation.

I can't talk to you. I can't talk to my wife about the work that I do or did. I'm not...

You're not allowed to discuss any detail of anything related to Facebook's operations or Facebook's secret information or your working conditions. It covers everything. It's like a gagging order. For life, it's a lifetime agreement. Somebody has to speak because we're hidden, we're silent. We are not allowed to have a voice because Facebook wants to protect Facebook.

Facebook doesn't want to protect the people that may be harmed. And it was like a light in your head. You just go: "Oh, yeah. This is the truth." I should ask a lawyer if it's OK.

Initially, I was contacted by Chris, who came to me and explained his work circumstances. It was very easy for me to do the research, to see what he was complaining about and to understand and empathize with his position. It was also easy for me to understand his legal rights and how they were infringed by what has happened.

And that, of course, will form the basis of the case against Facebook. Through Chris, many more people have come forward, both male and female, and it's quite clear that a pattern emerged straight away: people who were hurt, people who were injured as a result of what happened to them in their workplace and the many hours they spent doing what they had to do, which unfortunately has led to their damage and will now lead to the vindication of their rights through the courts.

I think if you listen to politicians, nobody is aware of what's being done, what's happening, because politicians are all complaining about fake news and election integrity and the spread of extremism, etc. There's this problem that they see and they just shout: "We want a solution." But they don't know anything about the work.

They're not talking about, you know: "How can we do this?" They're just pointing to the social media companies and saying this is your fault. But they're not, they're not engaged with... "How do you do this?" Because to write the rules is really, really difficult. To create a system that enables us to protect you, your children. Somebody is bullying your kid? I have to deal with that. Somebody is spreading hateful ideology? I have to deal with it.

I tried to think about this as if I'm Mark Zuckerberg. "How do you solve this problem?" And it's not his problem. His problem is that he has to make money for his shareholders. It's a legal requirement in America.

He has to make the maximum profit. So he's not interested in content moderation unless it makes more money. And I never heard anybody talk about how Facebook makes more money because of this.

I think these platforms' biggest trick is to make consumers, or users in general, believe that there are automatic processes and algorithms at work everywhere, when very often these tasks are done by hand. And to make workers believe that what they are doing is not real work. They don't call it a job; they call it a gig. So, it is transitory, ephemeral, and will eventually disappear.

Thanks to this approach, they avoid paying for work at its true value. And they avoid providing any kind of social security. If we look only at the workers, we overlook the fact that these workers also produce data. This data is used to produce automation.

For example, workers are geolocated. All this geolocation data is collected by the Uber platform. The Uber platform uses this data, on the one hand, to make its service work. But also, and this is the most worrying thing, in the long run Uber is using the data to create automatic processes. For drivers, it is a question of using this data to train autonomous vehicles.

For delivery drivers – we see it on platforms like Amazon – to train delivery robots. So, we're preparing for automation by using this mass of data produced by these clickworkers. To outsource today – to put people to work who are paid less, whose rights are less respected – there's no need to open a factory in another country. It is quite simply a matter of attracting people from other countries, where average wages are lower than in the platform's countries of origin, to work for your platform.

This type of logic, of economic and political asymmetry — some would even call it neo-colonialism — is also applied by platforms locally. In their own country. The American platforms ruthlessly exploit micro-workers and clickworkers in the U.S.

In France, French platforms don't hesitate to pay very little to French micro-workers. This is something that points to a kind of... global impoverishment and exploitation, both in the north and south. Faced with the most extreme situations of subordination, exploitation and surveillance, we need workers who will help raise collective awareness.

We need whistleblowers who will speak out against a situation that is unjust and must be protested.

I realized I'd be working for Apple when I signed the contract and read the clause saying I'm criminally responsible under Irish law if I divulge anything to anyone about this work. Siri is not artificial intelligence.

All the so-called machine learning is done by humans. All this so-called artificial intelligence is just people feeding the machine continuously. That was my job: feeding the machine. On our first day in Cork, what they call “induction day,” we were told we were going to work to improve Siri's precision.

So, they explained, we'd wear headphones and listen to recordings all day. I'd suspected for some time there was some kind of espionage element embedded in this type of device. When you accept the terms and conditions of use, you don't know what data is being collected or why. But I had no idea that we'd listen to people all the time.

The microphones on Apple devices are always on, and recordings are triggered randomly. So, all day long, we listened to people talking about their private lives, discussing very intimate things. The idea was to listen to people dictating a message or talking to Siri. That was the principle.

But in fact, there were lots and lots of recordings when the voice assistant triggered itself. The recordings were stored either on the iPhone or on some server. I decided to talk about it because it makes me so mad to see that we are completely unaware of the degree to which these companies interfere in our lives. Even people who want nothing to do with them.

I thought we should keep track of all this. So, when I decided to leave, I risked everything and started taking screenshots of whatever I could. Then I plugged a memory stick into the computer. And every day, I got new screenshots. And frankly, I was scared. One of the strictest instructions was not telling anyone.

Even among ourselves, in our open-plan office, we weren't supposed to discuss what we heard. And we were really not supposed to talk to journalists or people outside Apple. By taking these screenshots, I had proof that there were millions and millions of recordings out there. For example, for the iPads, when I took the screenshots, there were 14 different countries involved. Each of these work projects contained 600,000 to 1.2 million recordings, for a total of 1,000 hours per country. On average, there are 8 to 12 million recordings for each of these projects, which make up one tab on a very long scrolling menu. I feel overwhelmed by the magnitude of it. By the size of these companies, and also by the economic and political stakes behind them.

But it's not that complicated to refuse to take part in it. I'd like to show my face, as I hope others will follow suit. It's already a first step... to defend oneself. And then, to counter-attack. Despite what we're led to believe, it's not normal. It's not normal and it's not a given.

It shouldn't be. Today, resistance is starting to take shape. Clickworkers are organizing. Trade unions are reaching out to these new populations of workers to defend their rights, to organize them.

It's too much. So we voted to get paid by the hour, plus the orders.

If you don't get an order, you still get paid. It's work. We're on call like firefighters, but firefighters get paid for that. We're not paid when we're on call. We're connected and ready to deliver, but we're not paid. So, I think it's only logical to be paid once you're connected to the app. I'm Ludovic.

I've been a Deliveroo driver for 7 months. I work part time. I'm also a union rep in the CGT's job insecurity committee. For the past few weeks, we've been trying to get organized against the companies we work for.

We're trying to raise awareness and show Uber and Deliveroo that they can't do whatever they want. They've tried to set up this status quo in recent years, and it’s become widespread. But the workers want to have a say in things. The platforms invite their workers to be self-employed. This is an extremely individualistic and capitalist vision. Paradoxically enough, more and more workers are saying: "I'll be responsible for myself, I'll take charge of my own destiny, but I'll do it collectively."

We see, for example, workers for Uber creating a cooperative. There were demonstrations against Deliveroo. I worked for them with other colleagues. Over the course of time, we started to realize that the working conditions they proposed were catastrophic. We asked to improve certain things. All together, we decided to start a project.

We set up a cooperative. We sought to refute the idea that you must go with the flow and that Uber is a model for the future. We want to prove that technology can also be ethical, that it's not incompatible with the way a cooperative works. It can be democratic. At Mensakas, we have contracts, sick pay, paid vacations, pay slips... Our organization is horizontal, not vertical or pyramidal.

There's no hierarchy. All the workers without exception have exactly the same rights and the same salary. Our company is a non-profit cooperative. All the money we make is reinvested in the cooperative. At the moment, there are about 30 of us.

I'm very confident as far as the future of Mensakas is concerned. We have to keep moving forward. For us, on the one hand, but also to prove on a more general level that there is an alternative. Our model is starting to be exported.

Similar projects are emerging in other cities. In Madrid, our colleagues founded La Pájara Ciclomensajería, which works as a cooperative. It's also the case elsewhere in Europe. People are starting cooperatives that were born out of a similar struggle.

There are some in Belgium, in France... I think that little by little, it's a model that will grow. What we need to generate is a large-scale social awareness of what's going on. It's true that technology makes our lives easier.

These new ways of consuming won't disappear. But we have to be aware of the consequences that they can have. Every time we consume a product, we have to ask ourselves where it comes from.

It's time for the working class to step forward and propose alternatives. If we don't get organized to defend our rights, no one will do it for us. We have to make our voices heard so things can change.

2021-05-05
