Your smartphone’s AI algorithms could tell if you are depressed

Depression is a huge problem for millions of people, and it is often compounded by poor mental-health support and stigma. Early diagnosis can help, but many mental disorders are difficult to detect. The machine-learning algorithms that let smartphones identify faces or respond to our voices could provide a universal, low-cost way of spotting the early signs and getting treatment where it's needed.

In a study carried out at Stanford University, scientists found that face and speech software can identify signals of depression with reasonable accuracy. The researchers fed video footage of depressed and non-depressed people into a machine-learning model trained to learn from a combination of signals: facial expressions, voice tone, and spoken words. The data was collected from interviews in which a patient spoke to an avatar controlled by a physician. In testing, the model detected whether someone was depressed more than 80% of the time. The research was led by Fei-Fei Li, a prominent AI expert.
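The general idea of combining several signal types into one prediction can be sketched as a toy "feature fusion" classifier. This is not the Stanford team's actual architecture (which the article does not detail); the function names, feature values, and weights below are all hypothetical, chosen only to illustrate how per-modality features might be concatenated and scored.

```python
# Hypothetical sketch of multimodal fusion -- not the study's real model.
# Feature vectors and weights are illustrative placeholders.

def fuse_features(face, voice, text):
    """Concatenate per-modality feature vectors into one combined input."""
    return face + voice + text

def predict_depression(features, weights, bias=0.0):
    """Toy linear classifier: a positive score flags possible depression."""
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return score > 0

# Illustrative inputs (e.g., facial-expression, prosody, and word features)
face = [0.2, 0.7]
voice = [0.5]
text = [0.9, 0.1]

weights = [1.0, 1.0, 1.0, 1.0, -0.5]  # hypothetical learned weights
print(predict_depression(fuse_features(face, voice, text), weights, bias=-2.0))
```

In practice each modality would be processed by its own learned feature extractor before fusion, and the classifier would be trained on labeled interview data rather than hand-set weights.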