About the author
Christopher Endemann is a research intern in Professor Matthew Banks’ research lab in the Department of Anesthesiology, which focuses on understanding how the brain changes when we lose and regain consciousness. As an aspiring computational neuroscientist, Chris specializes in extracting insights from high-dimensional electrophysiological datasets to better understand how different regions of the brain work in concert to produce a seamless conscious experience. He believes that understanding the brain as a computational system will help pave the way for new treatments for psychiatric disorders as well as novel artificial intelligence models. Outside the lab, Chris leads AI@UW, an artificial intelligence club he co-founded in 2017 that offers AI-related workshops, tutorials, and special projects to the campus community.
About this story
Data science is changing the face of biology. But the challenge isn’t just gathering the data — it’s making sense of the billions of data points we are capable of generating.
The Center for High-Throughput Computing (CHTC), led by Morgridge Institute Investigator Miron Livny, provides UW-Madison scientists with the computing power to vastly increase the size and complexity of their research problems. In 2019, the center worked with 290 research groups and processed more than 390 million hours of computing time.
This is one in a periodic series on CHTC projects facilitated at Morgridge.
Our lab focuses on how general anesthesia induces loss of consciousness, and on how the brain changes under anesthesia and during sleep. We are trying to find signatures of consciousness by looking at the brain and how it processes sensory information. Different parts of the brain communicate with each other under different states of awareness, so we are looking for the neural correlates of conscious experience.
This requires analyzing data that shows how the brain changes across different states of awareness. It’s a hot topic, because people don’t really understand how anesthesia works. It remains one of the biggest medical mysteries in neuroscience.
The data we use comes from high-dimensional recordings of patients with epilepsy. In the course of surgery to help prevent seizures, our partners at the University of Iowa placed electrodes directly on the cortical surface of the brains of five patients and recorded from them through multiple stages of sleep (wake, N1, N2, N3, REM) and multiple arousal levels under anesthesia (wake, sedated, unresponsive). There are 100 to 200 channels per patient, spanning roughly 30 to 40 regions of interest in the brain, and we want to estimate how all of these channels causally influence one another.
No one else has had such a large data set with this many connectivity models. We have multiple subjects, multiple awareness states, multiple brain regions, and multiple channels, and every minute of that data has its own separate connectivity model. We then combine all of those models to get what we call a connectivity network. The multiplier is just ridiculous.
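To give a rough sense of that multiplier, here is a back-of-envelope sketch. The patient, state, and channel counts come from the figures above; the minutes of data per state is a placeholder I assumed for illustration, not a figure from the article.

```python
# Rough scale of the connectivity-modeling problem described above.
n_patients = 5
n_states = 8            # 5 sleep stages + 3 anesthesia arousal levels
n_channels = 150        # midpoint of the 100-200 channels per patient
minutes_per_state = 10  # hypothetical recording length per state

# One connectivity model per minute of data, per patient, per state.
models = n_patients * n_states * minutes_per_state

# Each model estimates a directed influence for every ordered channel pair.
pairwise = n_channels * (n_channels - 1)

print(models, "one-minute connectivity models")
print(models * pairwise, "directed channel-pair estimates overall")
```

Even with a modest assumed recording length, the count of pairwise estimates runs into the millions, which is why each minute's model becomes its own compute job.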
One practical implication of this research is the ability to non-invasively assess someone’s awareness state. A patient could be in a coma, but there’s no way to probe them and ask if they’re having any sort of conscious experience. Furthermore, patients who undergo general anesthesia are typically given a paralytic to prevent any movement, meaning that anesthesiologists must rely on various biomarkers to assess whether someone is truly unconscious. With the results from our analysis, clinicians could use neuroimaging tools such as EEG or ECoG to compare a new patient’s brain connectivity profile to the connectivity profiles we find are associated with conscious or unconscious brain activity, increasing confidence in the clinical assessment of a patient’s awareness state.
We could not do this work without CHTC. I do computational neuroscience, so I write the programs that specify which steps to run in which order, but they take care of almost everything else. I stop by whenever I have a question, and they have helped me countless times when I need to debug a program or figure out how to run a model through their system. I always have jobs running. I also use Condor to estimate which connections are significant, and that’s a computationally intensive thing to run.
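Workloads like this are typically driven through an HTCondor submit file that queues one job per parameter combination, so each minute-long connectivity model runs as its own job. The sketch below is a hypothetical illustration of that pattern, not the lab's actual configuration; the executable and parameter file names are invented.

```
# job.sub -- hypothetical HTCondor submit file: one job per
# (subject, state, minute) combination listed in params.txt
universe   = vanilla
executable = estimate_connectivity.sh
arguments  = $(subject) $(state) $(minute)

log    = logs/job_$(Cluster)_$(Process).log
output = out/$(subject)_$(state)_$(minute).out
error  = err/$(subject)_$(state)_$(minute).err

request_cpus   = 1
request_memory = 4GB

queue subject, state, minute from params.txt
```

Submitting with `condor_submit job.sub` lets the scheduler fan the jobs out across whatever machines are available, which is what makes the per-minute modeling tractable.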
IT is such a fast-moving field that you have to learn to rely on other people’s expertise in order to get a project done. I was a sophomore when I had to learn all these new tools from scratch, but they made it easy for an undergrad, even though their system is incredibly powerful. It can take a week to run all of our data at once, but that’s still a crazy short time compared to what a single computer would take. They schedule jobs for me that run on thousands of computers across the nation. Analyses that would not have been feasible to run in a lifetime are now possible.