If there ever was a dynamic duo, recent Haas grad Stanford Stickney and his younger brother, Daniel, are it. The two collaborated with the team that won the Big Ideas@Berkeley prize this past May in the Information Technology for Society category.
Led by UC Berkeley undergraduate students Tomás Vega and Pierre Karashchuk, with Stephen Frey, Kelly Peng, and John Naulty, the team won first place for creating a brain-computer interface API (BCAPI). Stanford, BS 15, lent his business development skills as a team member, and Daniel, 21, who has cerebral palsy and is visually impaired, tested the technology and provided feedback.
(l-r: Stanford and Daniel Stickney, with Tomás Vega, Pierre Karashchuk, and Stephen Frey. Photo: Roman Decca)
“Growing up, both Daniel and I had a belief that you can do anything,” says Stanford, a Los Gatos native and one of four children raised by a single dad. “Technology is one platform that’s enabling us to do that together. My mission in life is to help my brother, and I am with him every step of the way.”
For their project, the Big Ideas team – whose members have backgrounds in software engineering, cognitive neuroscience, signal processing, and machine learning – equipped a helmet with electrodes that record electroencephalography (EEG) signals, which computer algorithms then interpret. The device, which connects to a laptop in Daniel’s backpack, enables him to steer his wheelchair to the left or right with his thoughts.
A Perfect Match
Stanford met Vega, now a senior studying computer science and cognitive science, last semester in a graduate-level New Media class taught by electrical engineering and computer science Professor Eric Paulos. The class focused on rapid prototyping at the CITRIS Invention Lab at UC Berkeley.
The two became friends, and Stanford shared background on the work he and his brother were doing to help Silicon Valley companies improve technology for people with disabilities. Vega, who was on the team that won Cal Hacks last year for building MindDrone, a flying drone maneuvered by neurological signals, described his BCAPI project to Stanford, along with his interest in human-computer interfaces. Suddenly, everything clicked. “It was very exciting,” Stanford says. “I said, ‘This is a perfect match,’ and we were able to put the two together.”
They set up a meeting at Karashchuk’s apartment, where Daniel tried on the helmet for the first time (photo below). Trouble was, the program was designed for someone in a wheelchair who could see a computer screen. Stanford instead touched Daniel’s right or left arm to prompt him to think “left” or “right.”
(Photo: Daniel tries on the helmet, which is connected to the team’s laptop.)
Observing Daniel, the team decided to change the design of its prototype. In its next upgrade, they will add arm vibrations to alert a visually impaired person to think “left” or “right.”
The $13,000 Big Ideas grant the team won will be used to improve the BCAPI technology and to conduct a long-term study of its effectiveness.
Daniel currently achieves 40 percent accuracy in controlling the functions of his chair with the brain-computer interface. As he continues his work with the device, neuroplasticity is expected to strengthen the relevant neural pathways in his brain. “Learning to use the device is like learning a new language, and as Daniel gets more proficient, it gets easier,” Stanford says.
The team’s vision is to provide an open-source platform that fosters a community of software and product developers who contribute to the independence of millions of technology users with disabilities.
“It’s been so exciting to be one of Daniel’s advocates in this journey,” Stanford says. “I hope that we’ll be questioning the status quo for a long time to come.”
By Kate Madden Yee and Kim Girard