The rising paradox between tech and gender: what does it mean for brands?
An event hosted by Landor and chaired by Nick Foley, Regional President of Landor SEAPJ, shed light on the role that brands can play in weeding out gender stereotypes
It is well established that advances in digital technology and robotics are occurring faster than ever before. Artificial intelligence (AI) is influencing and transforming lives and behaviors in myriad ways. While many of these changes are exciting and help propel us forward, this new wave of pervasive technology has also created an unavoidable tension when viewed through the lens of gender. An event hosted by Landor shed light on this issue and on the role that brands can play in weeding out such stereotypes.
Deepali Naair, Director of Marketing, IBM India & South Asia; Falguni Nayar, CEO, Nykaa; Garth Viegas, Director of Insights, TATA Global Beverages; and Jaimit Doshi, EVP, Kotak Securities came together to discuss tech, gender, and brands. The panel was chaired by Nick Foley, Regional President, Landor SEAPJ. Landor Mumbai’s Lulu Raghavan set the stage for the discussion by raising provocative questions, such as whether we risk making AI sexist.
Raghavan pointed out how Apple’s Siri—a name meaning “beautiful woman who leads you to victory”—and Amazon’s Alexa are not only digital personifications of women; they embody old-world female roles: women as domestic servants available to submit to every command they are given.
However, she noted that some brands have already begun to move away from gender stereotypes. “Take Google’s Assistant, for instance, which has no human name and now has gender options available. Or consider Accenture, a company that plays an active part in increasing women’s involvement in the fields of science, technology, engineering, and mathematics (STEM). It runs key initiatives to increase investment in STEM and to attract, recruit, develop, and retain women in these roles,” Raghavan said.
The panelists pointed out that gender stereotypes are embedded in society through its most basic aspects. Giving an example, Naair said, "AI systems are only as good as the data we put into them. Bad data can contain implicit racial, gender, or ideological biases. Many AI systems will continue to be trained using bad data, making this an ongoing problem. At IBM, we believe that bias can be tamed and that the AI systems that tackle bias will be the most successful. Research shows that more than 180 human biases have been defined and classified, any one of which can affect how we make decisions."
According to Viegas, we have lived with gender stereotypes for so long that dispelling them in a short period of time is a challenge. He advised, "As brand marketers, we live to build brands, and brands with personalities. What I can do is understand the nuances of gender stereotypes and do away with them for my brand."
Naair observed that a smart woman sitting at home is viewed very differently from a man sitting at home, and that such attitudes need to change. Doshi was quick to point out that young women are coming to Kotak's platform and becoming financially independent. "When their incomes rise and they have a surplus, they will be independent, which is a welcome change," he said.
On the risk of AI becoming sexist, the panelists concluded that brands must train their AI more responsibly. They contended that advertising decisions can be biased by unconscious, preconceived notions about gender roles, and that brands need to be upfront with their consumers, steer away from these stereotypes, and stay honest and relevant.