It’s all over the news – generative AI models are advancing at a rapid pace. According to Legacy Russell, author of Glitch Feminism: A Manifesto, the output these models generate replicates our current social biases and ‘reinforces the violence of systemic discrimination.’
In this month’s Rows and Columns, we’ve selected a data visualisation from Bloomberg, whose analysis of Stable Diffusion’s text-to-image model found that it amplified both gender and racial stereotypes. Bloomberg warns that we need to examine these biases before models like these ‘rapidly morph from fun, creative outlets for personal expression into the platforms on which the future economy will be built.’
So how do we make sure automated systems are tools of empowerment rather than instruments of oppression? A series of talks at this year’s London Design Biennale, Remapping our Nature in the Digital Age, provided us with valuable insights: by training AI models on inclusive data, fostering a diverse AI community, and employing transparent algorithms, we can work towards a future in which technology is inclusive and uplifts everyone.
Latest from Applied Works
A world first: How Chatham House made data on the trade of resources accessible to all
Before the launch of resourcetrade.earth, there was nowhere to explore global resource trade in one place. Chatham House’s ambitious project visualised the trade of our planet’s natural resources on a world map, making this data accessible to everyone.
This newsletter is brought to you by award-winning design studio Applied Works. Want to submit something to Rows & Columns? Share it with us here