
Satya Nadella talked to us about confronting a sexist bias he didn't know he had - and making sure AI doesn't make the same mistake

Microsoft CEO Satya Nadella sat down with Business Insider and revealed what he learned in the wake of making a controversial comment about women's pay. Nadella is the author of Hit Refresh. The following is a transcript of the video.

Satya Nadella: The context of the question at a women's conference was: what are you as a CEO doing to help women make progress in their careers? Understand the bias in the system, as opposed to talking about trusting the system. And that's the realization. Coming back, even talking to all of the amazingly accomplished women at Microsoft on my own leadership team, I was able to really get in touch with how we can do a lot more.

One of the things that we definitely look to do is to make sure that things like unconscious bias training are not just checkbox items for people but are really learning moments, because, in some sense, life's experience is what helps us develop increasing levels of empathy. And empathy is not just a nice thing to have; it's an existential thing for us, because we are in the business of meeting the unmet, unarticulated needs of customers.

Matt Stuart: Do you worry about any of those biases being preprogrammed into any of the AI apps or other things that are being developed?

Nadella: I think it's one of the more important issues for us to make sure that things like training data are not biased. And one of the best ways to ensure that what you do, whether it's the programs, the algorithms, or the training regimen, is not biased, is to make sure you have a diversity of engineers who are designing them. That's one of the great ways we, in fact, make sure that we're testing these products for that diversity and lack of bias.