



We’re excited to announce that, as of today, masculine and feminine alternative translations are available when translating from English to Spanish, French, or Italian. You can try out this new feature in both Bing Search and Bing Translator verticals.

Over the last few years, the field of Machine Translation (MT) has been revolutionized by the advent of transformer models, leading to tremendous improvements in quality. However, models optimized to capture the statistical properties of data collected from the real world inadvertently learn or even amplify the social biases found in that data. Our latest release is a step towards reducing one of these biases, specifically the gender bias that is prevalent in MT systems. Bing Translator has always produced a single translation for an input sentence, even when the translation could have other gender variations, including feminine and masculine variants.

Gender is expressed differently across different languages. In the absence of information about the gender of a noun like ‘lawyer’ in a source sentence, MT models may resort to selecting an arbitrary gender for the noun in the target language. For example, in English, the word lawyer could refer to either a male or female individual, but in Spanish, abogada would refer to a female lawyer, while abogado would refer to a male one. Often, these arbitrary gender assignments align with stereotypes, perpetuating harmful societal bias (Stanovsky et al., 2019; Ciora et al., 2021) and leading to translations that are not fully accurate. In the example below, you can see that when translating a gender-neutral sentence from English to Spanish, the translated text follows the stereotypical gender role, i.e., lawyer is translated as being male.

In accordance with the Microsoft responsible AI principles, we want to ensure we provide correct alternative translations and are more inclusive to all genders. As part of this journey, our first step is to provide feminine and masculine translation variants.
