We predict that bad things can happen, but nobody will accurately predict what they will be.
A few AI projects have been shut down simply because they were offering accurate, rather than politically correct, advice.
Amazon’s recruitment AI favored men: “Reuters was told by people on the team working on it that the system effectively taught itself that male candidates were preferable… They literally wanted it to be an engine where I’m going to give you 100 resumes, it’ll spit out the top five, and we’ll hire those… The system began to penalise CVs that included the word ‘women’. The program was edited to make it neutral to the term, but it became clear that the system could not be relied upon, Reuters was told.”