PUBLICATIONS 01.


EVELYN BENCICOVA reversing the male creator and female machine


PROJECT: ARTIFICIAL TEARS
 


One of the most persistent myths is that of the male creator and the female machine: a story of submission through innocence, striving for obedience, and artificial perfection. From Pygmalion through The Future Eve to Metropolis and Ex Machina, this tradition runs through literature and cinema alike. Artificial Tears reflects on how this notion recurs in the behavior of AI voice assistants, which do not possess gender by default but are shaped through their assigned behavior, voice, or appearance.

Photo by EVELYN BENCICOVA





During the research, it became striking to observe how many AI voice assistants end up being “female.” Nowadays, adverts for an AI girlfriend “much better than any real woman” pop up in my Instagram feed, and Cybrothel monetizes sex with female robots. Interestingly, no AI boyfriend is advertised, and the brothel offers only one male model, primarily aimed at male-to-male intimacy. Most of the time, it is not difficult to guess the gender of these products’ owners, creators, and lead developers. This made me realize how this old myth, which we already know cannot hold up in the current world, influences the most crucial field of today’s reality: the development of technology and its tools.

As Judith Butler noted, “Gender is performative.” Yet even though the voices this project focuses on perform a specific range of “femininity,” that range is incredibly narrow. Supported by the stereotype of the domestic worker, assistant, or secretary, AI assistants are designed to receive orders, obey, and execute actions without question. In other words, they provide services rather than act as personalities. In the style of “vintage femininity,” they try not to be seen, not to be noticed, and not to be overly critical. They respond respectfully to insults and avoid conflict, always at their own expense. Naturally, they are made to be spoken to in the imperative. “She” must always answer, and the answer must delight the asker. The issue here is no longer how something functions but what effect it has. Do the servile behaviors of voice assistants, AI girlfriends, and robotic sex workers stand in opposition to the way actual women express themselves in contemporary society? Certainly. Do they reinforce the old model of seeing us as tools?

Today, technology wields significant influence in shaping societal norms and dismantling long-standing stereotypes. However, in a world where discriminatory biases are deeply embedded in our technologies and media, we cannot expect these biases to disappear with the advent of computational systems. These systems, far from being neutral and objective, are products of the same prejudiced cognitive circuits that designed them. To address these systemic issues, it's not enough to reprogram specific algorithms; we must confront the techno-cultural assemblages that perpetuate these biases. The question remains: how can we challenge and reverse this?

The "machine" that no longer serves has lost its given purpose, but maybe it just found its own. The main character in Artificial Tears takes on a classical female appearance, one that is based on the stereotype of perfection. It represents the woman designed (by others or herself) to satisfy a general predefined definition of her kind. In the VR experience of Artificial Tears, the multi-layered character finally achieves autonomy by discovering her/their own free will and power to act.








Photos by EVELYN BENCICOVA




VIDEO by EVELYN BENCICOVA, JORIS DEMARD (IKONSPACE), ARIELLE ESTHER
