Selling people: there’s an app for that

Based on recent revelations, it’s clear to see that there’s little progress being made to stop bad and evil behaviour online. Picture: Markus Spiske/Unsplash

Published Jun 19, 2022

When Steve Jobs and the Apple team imagined the App Store, they were responding to business needs for custom apps. Little did they know that one day those apps would bring back evils of the past.

If the claims in a 2019 BBC documentary, Silicon Valley's Online Slave Market, about an app used to buy and sell helpers (people often referred to as maids) are true, then both Apple's App Store and Google's Play Store have indirectly brought back slavery.

It’s a classic case of the unintended consequences of developing tech that people can use to create whatever they like. How can society prevent technology from being turned from good to harmful ends?

On the surface, the app appears as innocent as just another marketplace selling cars.

Further scrutiny, as the BBC documentary shows, reveals a dedicated section for selling helpers.

The app is used in the Middle East, particularly in Kuwait.

In the documentary, a buyer and a seller appear to be discussing the helpers on the market and, in particular, how they ought to be treated.

They discuss whether the helpers may be given mobile phones, whether they should be allowed to keep their passports and whether they should be permitted to speak to their families. It is one of the most inhumane practices you will find online, and it is enabled by modern technology.

Further research shows that this is a problem that companies like Meta (Facebook) have struggled to stop from being carried out on their platforms.

This, however, shows that it is possible for something this illegal to occur online without the authorities detecting it.

At the heart of this is the lack of oversight of what is happening online, at both the global and the local level.

The task of policing the internet and monitoring bad behaviour is left to the technology companies themselves.

Based on recent revelations, it is clear that more needs to be done to stop bad and evil behaviour online.

In the absence of effective oversight, it is high time that a global police force was developed for the internet.

It is important that this is implemented before the metaverse dreams are realised. If the internet has enabled slavery 2.0, we will very soon see slavery 3.0 emerging, with terrible consequences for society at large. There are laws against slavery on the internet; however, stopping it has proved to be a serious challenge.

The digital police force is necessary to enforce existing internet laws, which are now being flouted with impunity. To be effective, it will have to exist at the global level to ensure that no company or country is exempt. At the local level, such a force would have to apply both local and global internet laws to ensure they are enforced on the ground.

Establishing the digital police force at the local and global level will take some time, and in the meantime global tech companies need to review their monitoring mechanisms for apps on their platforms. How is it possible that Apple and Alphabet (Google) did not know about such terrible practices on their platforms? If they knew, why did they allow their platforms to be spaces to sell human beings?

For a very long time, technology companies have knowingly and unknowingly enabled some of the worst behaviours to be conducted on their platforms. It’s about time that countries took these issues seriously and held big tech companies accountable for their operations.

Currently, due to the global nature of big tech platforms, what gets created in one country can quickly spread to others.

Although this global structure works well for economies, it is hurting the social fabric of society.

Is it not time that the tech architecture was localised to avoid harm to human beings? Is it not time to have neutral bodies approving apps, instead of big tech companies approving apps themselves?

Lastly, if slavery 2.0 has taken place online, enabled by Apple, Google and other big tech companies, what should the punishment be? Is financial punishment sufficient to stop the behaviour? These are questions that should occupy the minds of technology regulators as they consider what should be done about Africans who are exploited through global tech platforms.

Wesley Diphoko is the Editor-In-Chief of FastCompany (SA) magazine. He hosts weekly Twitter Spaces on technology and innovation. You can follow him on Twitter: @WesleyDiphoko

IOL Business