Parasitical Bots

If we think back to the urban farms in Taipei, what set them apart was their ad-hoc, collective space-making process. However, let's not forget what happened when this anarchist pocket, functioning by its own rules, was appropriated by the city council and marketed to a young, creative crowd. Once the space was legitimised, the original inhabitants were driven away.

The problem we face when using bots to make space for groups of resistance on existing platforms is that every intervention generates revenue for those platforms, which therefore have little incentive to intervene in processes that cause “stress”. This was the case with Facebook and Cambridge Analytica. As you have probably noticed, Twitter is one of the main websites on which such research is being done, especially in relation to propaganda bots. Yet it took Twitter almost a year from being alerted to the Agenda of Evil account to actually suspending it, and only a few months ago did it make a public statement about the platform being misused by deviant actors.

Most of the examples given above are not in direct conflict with the host platform and so could be said to function symbiotically within its ecology.

But how could one design an automatic intervention that creates conflict within the host body? An automatism that punctures the skin of the network?

APIs

“Among other things, web APIs encompass: a physicality in terms of the corporeal landscape of infrastructure and technology, through to the economic logics at work (i.e. business models, ownership, licencing of the APIs), functions and services (i.e. access to data), practices of users (i.e. forms of labor, play and collaboration), discursive formations (i.e. statements, knowledge, ideas), rules and norms (i.e. design principles, terms of service, technical standards), as well as social imaginaries and desires.”
(Taina Bucher, Objects of Intense Feeling: The Case of the Twitter API)

If we talk about the relationship between platforms and bots, we cannot escape talking about the API. An API, which stands for Application Programming Interface, is a set of rules that applications use to communicate with each other. This apparently neutral conception is in reality a complex conglomerate of technological, economic and societal imaginaries. Essentially, an API encodes how the platform's developers imagine it will be used by other developers, revealing a relationship of imbalanced power. It welcomes interventions as opportunities to expand the platform's functionality beyond what the original developers might have imagined, and thus to add more economic value to the platform.
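The asymmetry becomes concrete when one writes even the smallest API client. In the sketch below, everything — the endpoint address, the parameter names, the authentication scheme — is dictated by the platform; the bot merely fills in the blanks it is given. The URL, parameter names and token here are hypothetical, loosely modelled on a Twitter-style REST API rather than taken from any real documentation.

```python
import urllib.parse

# Hypothetical endpoint: the platform decides the address, the
# parameters, the auth scheme and the rate limits. The developer
# only supplies values inside the grid the API lays out.
BASE_URL = "https://api.example-platform.com/1.1/search.json"

def build_api_request(query, token, count=10):
    """Build the URL and headers for a sanctioned, rule-bound API call."""
    params = urllib.parse.urlencode({"q": query, "count": count})
    url = f"{BASE_URL}?{params}"
    # The bearer token is issued by the platform and can be revoked
    # at any moment -- the material form of the power imbalance.
    headers = {"Authorization": f"Bearer {token}"}
    return url, headers

url, headers = build_api_request("propaganda bots", token="HYPOTHETICAL-TOKEN")
print(url)
```

Nothing in this exchange can surprise the platform: the request is legible, attributable and revocable by design.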

Scraping, on the other hand, is a rogue way of interacting with a platform, and although less warmly regarded, it offers more possibilities.
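A scraper ignores the sanctioned channel and reads the public HTML directly, as a minimal sketch can show. The markup below is an invented stand-in for a platform page — real pages change their structure without notice, which is exactly what makes scraping fragile as well as rogue.

```python
from html.parser import HTMLParser

# Invented sample markup standing in for a fetched platform page.
SAMPLE_PAGE = """
<html><body>
  <p class="tweet-text">First message</p>
  <p class="tweet-text">Second message</p>
  <p class="sidebar">Trending now</p>
</body></html>
"""

class TweetScraper(HTMLParser):
    """Collect the text of every <p> whose class is 'tweet-text'."""

    def __init__(self):
        super().__init__()
        self._in_tweet = False
        self.tweets = []

    def handle_starttag(self, tag, attrs):
        # The scraper depends on markup details the platform never
        # promised to keep stable.
        self._in_tweet = tag == "p" and ("class", "tweet-text") in attrs

    def handle_data(self, data):
        if self._in_tweet and data.strip():
            self.tweets.append(data.strip())

    def handle_endtag(self, tag):
        self._in_tweet = False

scraper = TweetScraper()
scraper.feed(SAMPLE_PAGE)
print(scraper.tweets)  # ['First message', 'Second message']
```

Unlike the API client, nothing here was offered by the platform: the scraper takes whatever is publicly rendered, on its own terms.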

Politwoops

Politwoops saves tweets deleted by politicians and then publishes them on its own website.
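The core move of a Politwoops-style archive can be sketched as a diff between successive snapshots of an account: whatever was visible on the last pass but is gone now has been deleted. This is a hypothetical illustration, not Politwoops' actual code; the fetching step is replaced by two hard-coded snapshots with invented IDs and texts.

```python
def find_deleted(previous, current):
    """Return the tweets present in the previous snapshot but gone now."""
    deleted_ids = set(previous) - set(current)
    return {tid: previous[tid] for tid in deleted_ids}

# Snapshot taken on the first pass over a politician's timeline
# (invented IDs and texts).
yesterday = {
    101: "Proud of our new policy.",
    102: "This statement was a mistake.",
    103: "Meeting constituents today.",
}
# Snapshot taken on the next pass: tweet 102 has quietly disappeared.
today = {
    101: "Proud of our new policy.",
    103: "Meeting constituents today.",
}

archive = find_deleted(yesterday, today)
print(archive)  # {102: 'This statement was a mistake.'}
```

The parasitical gesture lies in keeping what the platform and its users agreed to forget: the archive grows precisely out of acts of deletion.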

Google Will Eat Itself

One example that is not necessarily a bot, but that functions on a bot logic, is the well-known work by Alessandro Ludovico and Paolo Cirio, in which they serve Google text advertisements on multiple websites and use the revenue this generates to purchase Google shares.

While this example may not be a bot as the textbook would define it, it does employ bot logic, in that it follows the logic of the platform and interferes at a critical point.