Mindstrong, he says, will first focus on assessment, spending the next year or so testing phone-based data-collection-and-analysis systems; and then explore ways to partner with others to provide intervention through what Insel calls “learning-based mental-health care.” Continuous assessment and feedback would drive the interventions. Likewise, all therapies would use measurement-based practices, which give clinicians and patients steady feedback on what’s working and what isn’t—an approach shown to sharply improve outcomes.

To preserve users’ privacy, Insel says, Mindstrong will collect information only on an opt-in basis, and all data will be strongly encrypted. For most services, Mindstrong will save not actual data streams, such as what is said in spoken or typed conversations, but only metadata that reflect state of mind without revealing actual conversation. This might include semantic structures or the repeated use of key words or phrases that can reveal emotional or cognitive states such as depression, mania, psychosis, and cognitive confusion. All data will be firewalled according to strict patient-privacy practices.
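To make the distinction between content and metadata concrete, here is a minimal sketch, in Python, of how keyword-frequency metadata might be derived from a typed message while the message itself is discarded. The marker-word list, scoring, and function names are illustrative assumptions for this article, not a description of Mindstrong's actual pipeline.

```python
# Illustrative sketch only: a hypothetical way to derive keyword-based
# metadata from typed text without retaining the text itself. The keyword
# list and scoring are toy assumptions, not Mindstrong's actual system.
import re
from collections import Counter

# Hypothetical marker words loosely associated with low mood; a real system
# would use validated linguistic features rather than this toy list.
MOOD_MARKERS = {"tired", "hopeless", "alone", "worthless", "can't"}

def extract_metadata(message: str) -> dict:
    """Return aggregate counts for a message, discarding the raw text."""
    words = re.findall(r"[a-z']+", message.lower())
    counts = Counter(w for w in words if w in MOOD_MARKERS)
    return {
        "word_count": len(words),             # overall verbosity
        "marker_hits": sum(counts.values()),  # how often marker words appear
        "distinct_markers": len(counts),      # variety of marker words used
    }

# Only the summary numbers would be stored, never the sentence itself.
print(extract_metadata("I'm so tired and I feel alone tonight"))
# -> {'word_count': 8, 'marker_hits': 2, 'distinct_markers': 2}
```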

Even such metadata, of course, might make an attractive target for people who’d want to exploit it. (Picture a digital-era version of the Nixon administration’s bungled attempt to steal the psychiatric records of Daniel Ellsberg, who leaked the Pentagon Papers, so that it could blackmail or smear him.) The only thing stopping such an effort in digital form might be the strength of Mindstrong’s firewall or its willingness to defy a government request for data. This danger is real. Google, for instance, has steadily reduced the proportion of government data requests it responds to, but in the second half of 2016, the company reports, it still produced data for 60 percent of such requests overall, and 79 percent of such requests from the U.S. The company reveals little to the public about which requests it honors or why. For these reasons, some digital-privacy experts think it’s dangerous for any company to collect and keep something as sensitive as psychiatric-patient data. For his part, Allen Frances feels it’s naive to trust any commercial entity to permanently protect such data from other commercial interests, hackers, or a government bent on getting the information. Others argue that this worry is itself naive, since most of us already leave enough footprints with smartphones, computers, phone calls, and credit-card purchases to forfeit the privacy Frances wants to protect. The question may not be whether a Mindstrong firewall would be perfect, but whether it would be stronger than the many porous containers already holding our personal and medical information.

I once asked Insel how he saw his move to Silicon Valley in relation to the rest of his career. I expected he’d say it was a complete departure.

Instead he said it felt to him like a return to his first concerns—“a return to behavior.” He meant the voles.

The fundamental assumption behind the vole work, and behind Insel’s career at the NIMH, was that beneath behavior lay biological mechanisms you could discern and then tweak to change that behavior. The crux of the biological model, in other words, was that you could and should address mental illness from within. Otherwise, why bother with the nearly impossible job of figuring out how it all worked? You looked for mechanisms so you could fix the machine.

Now, however, Insel means to address mental disorders not from the inside, but from the outside; and not with something new, but with things at hand. He’s shifting from mechanistic discovery to practical application. He is acting on the epiphany he had when the man at his talk complained that Insel was discussing paint chemistry when he should have been putting out fires. It was then, Insel says, that he began “to realize that the really urgent issue isn’t that our treatments get better, but that we don’t use what we have today.”

Insel will always believe in the value of research, of figuring out how things work. But our most pressing problem, he says—what keeps psychiatry from making the huge strides that have been made in disciplines like infectious disease and cardiology—“is not what we don’t know. We know well enough what works. Our problem is that we’re not doing it.”

The other big development in Insel’s work today is his embrace of social contact as a basic health necessity. For this he credits Schlosser’s work. “She convinced me that people with psychotic illness really crave social connections,” he said. “This was a great wake-up call for me: to see they want to connect on their own terms, sometimes anonymously, on their own schedule, in a way that they feel they can control”—often with others like them, in relationships that feel equal, rather than only with clinicians who may seem to hold too much power.

Why didn’t he come to all this sooner? Why now? I asked.

“I have always believed,” he said, “that to get the most impact, you should go where you get the most traction.” Even five years ago, he said, he could not have gotten traction on the ground that he and Mindstrong are working today. Smartphones weren’t ubiquitous enough; the data weren’t rich enough.

“But now,” he said to me, “now we can do this.” He was leaning forward and smiling and holding both hands up in front of him as if he were fixing to catch something—as if he were a basketball player who’d just shaken his defender and was calling for the ball to take an open shot. His eyes had the look of someone who felt he couldn’t miss.