Why this video of a woman on the toilet recorded by a Roomba won’t be the last we see | Technology



A few days ago, a series of images recorded in 2020 by Roomba vacuum cleaners came to light. The most striking shows a young woman sitting on a toilet with her underwear down; in others, a child lies on the floor looking at the cleaning robot, or a woman walks through her home. In total, 15 stills from videos recorded in the US, Japan, France, Germany and Spain, found on Facebook and Discord by a reporter from MIT Technology Review.

The interesting thing about this case is how the videos spread on social networks. They were uploaded to the Internet by Venezuelan micro-workers whose job was to tag images to train the vacuum cleaner’s algorithm. And that tells us two things: that AI is less automatic than advertised, and that the platform economy (what was once called the sharing economy) has reached new heights.

We live surrounded by devices that rely on machine learning algorithms. This technology basically consists of gathering large amounts of data and developing algorithms that extract patterns from it. Machine learning powers, for example, the computer vision systems found in self-driving cars and robotic vacuum cleaners. For a computer to recognize a chair, it must be given (trained on) thousands or millions of example images of chairs, so that it extracts a pattern and can recognize a chair when shown one.
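A minimal sketch of that idea, in Python with scikit-learn and synthetic stand-in data rather than any real vacuum-cleaner footage, looks roughly like this: a model is fed examples that a human has already labeled, extracts a statistical pattern from them, and is then tested on examples it has never seen.

```python
# Minimal illustrative sketch: labeled examples become a pattern the machine can apply.
# The "images" here are random stand-ins; a real system would use millions of photos
# and a far larger model, but the principle is the same.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend dataset: 1,000 tiny 8x8 "images", flattened to 64 numbers each.
images = rng.normal(size=(1000, 64))
# Labels a human tagger would normally supply: 1 = "chair", 0 = "not a chair".
# Here they follow a hidden rule so there is actually a pattern to learn.
labels = (images[:, :8].mean(axis=1) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(images, labels, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)  # "training": extract the pattern from the labeled examples
print("accuracy on unseen images:", model.score(X_test, y_test))
```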

But someone must associate the thousands of images fed to the machine with the word chair. That is where taggers come in, a part of AI as basic as it is silent. They are workers who connect to certain platforms (Amazon Mechanical Turk was the pioneer) where they manually write down what appears in an image, identify and flag potentially problematic content, or help improve speech recognition technology by interpreting and transcribing fragments of speech that automatic systems cannot handle.
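To make that work concrete, a single labeling micro-task might produce something like the record below. This is a hypothetical example: the field names are invented for illustration, and each platform defines its own format.

```python
import json

# Hypothetical shape of one image-labeling micro-task (invented field names,
# for illustration only; real platforms each use their own schema).
annotation = {
    "image_id": "frame_000123.jpg",        # a frame captured by the device
    "labels": ["chair", "rug", "person"],  # objects the tagger says appear in it
    "boxes": [                             # where in the frame each object sits
        {"label": "chair", "x": 40, "y": 60, "width": 120, "height": 150},
    ],
    "worker_id": "anon-7f3a",              # the otherwise invisible human in the loop
}

print(json.dumps(annotation, indent=2))
```

Millions of records like this, produced one micro-task at a time, are what turn raw footage into training data.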

This all happens in real time, in an instant auction of small tasks paid for in cents. Mary L. Gray and Siddharth Suri described this industry in Ghost Work, the book that shook up the sector in 2019 by showing that AI works thanks to a large number of ghost workers, mostly in developing countries, who carry out very simple, low-paying micro-tasks. All they need is a computer with internet access, and they respond instantly, just like the riders who crisscross our cities on bicycles with other people’s dinner in square backpacks.

The picture these two Microsoft researchers paint of AI runs head-on against the promises of progress and less drudgery that the technology’s champions have been making for decades. Automation brings great improvements to our lives, yes, but at the cost of creating menial jobs in the service of AI. These invisible jobs are concentrated in non-Western countries, but they also employ people in the United States and in European countries. And, like the riders, these workers put in long hours for little pay. AI, too, runs on pedal power.

Gray and Suri argue that “great technological developments always require cheap and expendable labor.” In the 19th century, Massachusetts textile mill owners hired farming families to sew garments too delicate to be made on their machines. In the 1950s, human computers checked the calculations of the first electronic computers. Today, people are paid to improve search engines and help train algorithms.

The Roomba case shows just how true that is. The Venezuelans who posted the videos gained access to them through Scale AI, one of the companies hired by iRobot, maker of the Roomba vacuum cleaners, to “train” its systems. The workers were labeling objects the vacuum cleaners came across in order to improve those systems.

Last summer, Amazon announced its intention to acquire iRobot for $1.7 billion. The deal is awaiting a decision from the US regulator on whether it would harm free competition in the smart home sector.

As iRobot told MIT Technology Review, the leaked images come from modified robot models. The company claims Scale AI breached the terms of its contract, while the micro-work platform shifts the blame onto the workers who shared the images. The fact remains that highly sensitive user data was shared for the sole purpose of training the algorithms. And it is not unreasonable to think that this will happen with smart products other than Roomba vacuums.

Maintaining some privacy in the digital age is a pipe dream. From the moment we upload a document to the internet, it can be hacked or stolen. The involvement of ghost workers in AI operations adds a new vector for potential data leaks. And it exposes the seams of a technology, artificial intelligence, that was supposed to be more automatic and less analog. Every time we see a rider walking down the street, we might remember that in some room in Caracas, Bombay or Detroit, there may be a colleague of theirs helping make the Uber app or Roomba vacuum cleaners work a little better.



