Sunday, June 28, 2015

HOW SHOULD HIRING DECISIONS BE MADE?

You are competing for a job. You are one of many candidates qualified for the position. How do you become the company's candidate of choice? Will your "human skills," like being able to carry on an intelligible conversation and being adept at reading social cues, make a difference in whether you are selected?

Interviewers are people too--complete with implicit biases that can affect the hiring decisions they make. Perceiving you to share similarities with them--perhaps you attended the same school, have a friend in common, or enjoy playing the same sport--could incline an interviewer to view you more favorably than other candidates.

But what happens when the decision is taken out of human hands and automated instead? Might an algorithm encoded into software produce better hiring decisions than a human interviewer? Could the use of technology actually help to increase workplace diversity rather than workplace homogeneity? Or are some of the human qualities necessary to succeed in the workplace uncodable--like gut feelings and personal chemistry? And what about the biases we worry about in interviewers--might they be just as likely to exist in the software, since humans wrote the code?

In your opinion, what really determines your cultural suitability for a job, and who should decide whether you have what's needed: a person or a computer?

(See Claire Cain Miller, "Can an Algorithm Hire Better Than a Human?" The New York Times, June 25, 2015.)
