
In 1998, University of Oregon researcher Avinash Singh Bala was working with barn owls in an Institute of Neuroscience lab when the birds’ eyes caught his attention.

The usual research done in the lab, led by Terry Takahashi, explores, at a fundamental level, how barn owls process sounds, with the idea that such knowledge could lead to improved hearing devices for people.

But those eyes. Every time the owls heard an unexpected sound, their pupils dilated.

“So, we asked, might this work in humans?” Bala said. “We thought, if so, it would be a great way to assess hearing in people who cannot respond by pushing a button, raising a hand or talking, such as babies, older children with developmental deficits and adults who are suffering from a debilitating disorder or are too sick to respond.”

Over the next decade, as free time outside their primary research allowed, Bala and Takahashi pursued ideas for using the eyes as a window onto hearing. They experimented and found a similar involuntary dilation in humans. They refined a possible approach, aiming for sensitivity that could match traditional tone-and-response testing.

“We presented early data analyses at conferences, and there was a lot of resistance to the idea that by looking at an involuntary response we could get results as good as button-press data,” Bala said.

Last month, the two UO neuroscientists published a freely accessible paper in the Journal of the Association for Research in Otolaryngology that solidifies their case. They used eye-tracking technology while simultaneously conducting traditional hearing exams with 31 adults in a quiet room.

Dilation was monitored for about three seconds as participants stared at a dot on a monitor while a tone was played. To avoid contamination from pupil reactions triggered by pressing a response button, subjects’ responses were delayed until the dot was replaced by a question mark, at which point eye tracking stopped.
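The trial structure described here (fixation dot, tone, a roughly three-second pupil-recording window, then a question mark that unlocks the button response) can be illustrated with a short sketch. The Python below is a hypothetical simulation of that sequence, not the researchers’ software; the stubbed play_tone and read_pupil_diameter functions stand in for real audio and eye-tracking hardware.

```python
import random
import time

# Hypothetical stand-ins for audio and eye-tracker hardware (not the study's actual code).
def play_tone(level_db: float) -> None:
    """Placeholder: would present a pure tone at the given level in dB."""
    pass

def read_pupil_diameter() -> float:
    """Placeholder: would return the current pupil diameter in mm from an eye tracker."""
    return 3.0 + random.gauss(0, 0.1)

def run_trial(tone_level_db: float, record_seconds: float = 3.0, sample_hz: float = 60.0):
    """One trial: show a fixation dot, play a tone, record the pupil, then ask for a response."""
    samples = []
    print("Fixate on the dot.")          # participant stares at a dot on the monitor
    play_tone(tone_level_db)             # tone is presented while the pupil is recorded
    for _ in range(int(record_seconds * sample_hz)):
        samples.append(read_pupil_diameter())
        time.sleep(1.0 / sample_hz)
    # Only after recording stops is the question mark shown and a response allowed,
    # so pupil reactions tied to the button press cannot contaminate the recording.
    print("?")
    heard = input("Did you hear a tone? (y/n): ").strip().lower() == "y"
    return samples, heard

if __name__ == "__main__":
    pupil_trace, button_response = run_trial(tone_level_db=20.0)
    mean_diameter = sum(pupil_trace) / len(pupil_trace)
    print(f"Mean pupil diameter: {mean_diameter:.2f} mm, reported heard: {button_response}")
```

In this sketch, the delayed question mark mirrors the paper’s design choice: separating the recording window from the motor response keeps the dilation measure tied to hearing the tone rather than to pressing the button.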

Levels of dilation seen throughout the testing directly reflected the participants’ subsequent push-button responses on whether or not a tone was heard. That, Bala said, allowed his team, which also included former doctoral student and co-author Elizabeth Whitchurch, “to see and establish causality.”

“This study is a proof of concept that this is possible,” Bala said. “The first time we tested a human subject’s pupil response was in 1999. We knew it could work, but we had to optimize the approach for capturing the detection of the quietest sounds.”

Takahashi said the initial discovery was completely accidental.

“If we hadn’t been working with owls, we wouldn’t have known about this possible human diagnostic technique,” he said. “This is a really good example of how animal-based research can benefit advances in human diagnostics.”

The testing in the newly published research, funded initially by internal grants, was done using conventional, commercially available hearing and eye-tracking technologies.

Bala and Takahashi are now collaborating with Dare Baldwin, a professor of psychology, on developing their own technology for testing with babies. The effort is being supported by a 2015 Incubating Interdisciplinary Initiatives award from the Office of the Vice President for Research and Innovation and a recent grant from the University Venture Development Fund.
