When describing the future of microscopy, Jan Huisken imagines an alien landing for the first time on Earth, trying to size up the human species.
After seeing a few dozen subjects, the alien might get thrown off by some people wearing glasses, others with long or short hair, or differences in stature. But it would eventually figure out the patterns: this species has two arms, two legs, two eyes, a mouth, and two ears. It would define the baseline, then focus on features such as height, gender, pigmentation, eye color, and other interesting traits.
Huisken wants research microscopes capable of doing the same thing.
“In an attempt to understand how diverse development is, we don’t want to image specimens hundreds of times blindly,” says Huisken. “After the first handful, we should figure out what an organism looks like, then go out and find the defining features and the peculiarities that tell us more.”
As the medical engineering lead at Morgridge, Huisken will continue his innovations in “smart microscopy” by building custom devices both for his own lab and for the campus research community. The concept: microscopes that are customized, self-learning, and to some degree self-directed, able to separate the meaningful from the mundane.
Huisken pioneered light sheet microscopy, which captures the sensitive biology of live specimens in an almost entirely unaltered environment. He fielded a few questions recently about smart microscopy and his plans at Morgridge and UW–Madison, where he is a professor of biomedical engineering.
What will be transformative about the next wave of microscopes?
With the smart microscope, of course, it will be intelligent in that it knows what to record and what not to record. It is rather picky, pulling out the information you’re actually interested in rather than blindly recording the entire image. But all of that has to do with customizing the microscope to the sample and to the application. I think it’s something we can do particularly well here at Morgridge and the Fab Lab, where we can 3D print a custom chamber for one sample, and yet another chamber for the next. In that way, every microscope we produce would be very different from the previous one. It would have common parts, but arranged differently, controlled by different code, and fed with different instructions on how to record the information.
Are automation and self-learning a big part of your future goals?
Oh, absolutely. I think scientists can only teach the microscope so much, and there are a lot of things we can’t describe to it. Say you look at an image and you want to count the cells. The computer will have a hard time distinguishing a cell from dirt or other background noise. It will be looking for something round and will misidentify many things. Those are the kinds of questions image analysis people have worked on for the last few decades, perfecting the computer so it can mimic what humans have been doing manually. In a sense we can do that now through high-throughput imaging, where we can quantify information in those images. But what we can’t do, and what nobody has really worked on so far, is presenting the computer with a bunch of images and having it automatically make sense of them. You could present photographs of birds or dogs, and after 100 images, you throw in a picture of a cat. The computer would say, wait a minute, there’s something wrong here, right? Without being told what a dog looks like, it would automatically recognize that all of those 100 images have something in common.
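In machine-learning terms, the behavior Huisken describes is unsupervised novelty detection: learn what a batch of images has in common without labels, then flag the one that does not fit. The sketch below is a rough illustration of that idea only; the choice of scikit-learn’s IsolationForest and the toy summary-statistic features are assumptions for the example, not part of any microscope software described here.

```python
# Minimal sketch of unsupervised novelty detection: fit on an unlabeled batch
# of "normal" images, then flag an image whose statistics do not fit.
# (Toy data and feature choice are illustrative assumptions.)
import numpy as np
from sklearn.ensemble import IsolationForest

def image_features(img):
    # Collapse an image into a few summary statistics: brightness, contrast,
    # and mean local variation along each axis (a crude texture proxy).
    return np.array([
        img.mean(),
        img.std(),
        np.abs(np.diff(img, axis=0)).mean(),
        np.abs(np.diff(img, axis=1)).mean(),
    ])

rng = np.random.default_rng(0)

# 100 "dog" images: all drawn from the same statistics, no labels attached.
normal_images = [rng.normal(0.5, 0.1, size=(64, 64)) for _ in range(100)]
# One "cat" image: noticeably brighter and noisier.
odd_image = rng.normal(0.9, 0.3, size=(64, 64))

X = np.stack([image_features(im) for im in normal_images])

# Fit on the unlabeled batch; the model learns what "normal" looks like.
detector = IsolationForest(random_state=0).fit(X)

# predict() returns 1 for inliers and -1 for outliers.
print(detector.predict(image_features(odd_image).reshape(1, -1)))  # expected: [-1]
print(detector.predict(X[:3]))                                      # expected: [1 1 1]
```

The point of the toy example is that the model is never told what a “dog” is; it only learns the shared statistics of the batch and reports the image that breaks the pattern.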
So it should be able to detect differences and anomalies without you having to tell the machine what to look for?
Even before that, just describing the wild type is important to understand how much variance there is in nature. We study zebrafish, and we want to understand how far the fish is allowed to deviate from whatever the average may be and still develop into a healthy fish. So there is some tolerance where it can deviate a little bit this way and a little bit that way. But if it goes too far, it will die. That’s something we’d like to understand: how far can you go out on that ridge and not fall down the cliff … and find your way and still develop into a healthy animal or human being. And of course, you can also change the environment and put stress on the organism. You can starve it or increase the temperature, or you can add a drug, to see if you can make this ridge wider or narrower. That is something no one has really done in a living organism. We can try to characterize organisms more precisely, so that the phenotype is not just, ‘oh, this one looks a little sick,’ but we know precisely what is wrong.
What are you looking for in potential collaborations with UW–Madison biologists?
In order for this to be really successful, we will need to identify the right people, and also change their mindset a little bit. We want to find people who are creative enough, and maybe adventurous enough, to take the risk and dive into a new technology that is not simply handed to them to use, but where they also have to do a little bit on their side. We can meet halfway. It takes some time in these discussions to get people to lean back and think again about what their dream is. You need people who dream and think, ‘Oh, it would be wonderful if we could just have our jellyfish swim in their tanks, and on my computer I can see a 3D representation of the forces involved in their swimming movements as they propel through the water.’ Or whatever the case may be. It won’t work if people say, ‘I wish I had a microscope with higher resolution,’ or ‘I wish I had the same microscope my neighboring lab has.’
So you want people to focus less on all the technology specs, and more on imaging something never seen before?
It’s a little bit science fiction, and a little bit of forgetting all the technology you already have in your lab and thinking about the ideal setting. Maybe you want to ask a kid: ‘There are all these fish out there in the lake. How would you want to image them?’ One might say, ‘I’ll just scoop them out and put them on a microscope.’ But another might say, ‘I’ll use my remote-controlled helicopter, fly across the lake, drop a camera in the lake, and as the fish swim by I record a 3D image of them. Then a week later I fly in again, pick up the camera, and have all the data.’ I’ll be looking for scientists here on campus who have these crazy ideas.
It seems like your goal is to move away from slicing and dicing and pushing for ever-greater detail, and toward capturing the big picture of actual environment, behavior, and biological function?
Yes, because I think we’ve been a little too obsessed with this idea of extracting every last bit out of our already dead and sliced sample, while ignoring that what we have under the microscope used to be a living animal. After all the procedures done on the way to the microscope, we are probably not looking at true, live biology anymore, but at lots of artifacts. That’s something we want to move away from: go back to the roots and look at something where we know it’s behaving normally.
You have suggested that research microscopes could take a cue from the iPhone?
It’s often a discrepancy you notice. You have your iPhone in your pocket, and you’re used to having this intelligent device with you at all times. Your Google Mail application will see an email from your airline about your flight booking, automatically put it in your calendar, and tell you when you’re supposed to leave to get to the airport on time. All of that is so intelligent, and we just get used to it. But at the same time, we work in high-level science with all these sophisticated instruments, and they turn out to be rather dumb, not as intelligent as some of the devices we have in our pockets. We’re in the type of industry where it just takes time to implement all of these new ideas, to put more intuitive features in a device that may cost half a million dollars.