Thursday, February 28, 2013

Brain-Machine Interfaces (BMI)

Designing better interfaces: skin-like electronics

Capabilities for non-invasive measurement of neural signals are important because they support many critical biomedical applications, including brain-machine interface paradigms in mobile settings. Recording neural signals in mobile environments is currently a challenge because conventional measurement devices have rigid or only mildly flexible construction and require bulky cables for signal conduction. Future technologies must address these drawbacks through ultrathin, conformal designs that offer high-fidelity, non-invasive measurement. Our research group, in conjunction with the research group of John Rogers at UIUC, is developing foldable, stretchable electrode arrays that can non-invasively measure neural signals (i.e., EEG) without the need for conductive gel. The electrodes rely on layouts recently developed for silicon electronics that offer linear elastic responses to applied force, with the capacity to fold, twist, and deform into various curved shapes. Stretchable electronics have the key advantage that they can wrap arbitrary, curvilinear surfaces while achieving mechanical properties that approach those of tissues of the human body (e.g., skin). These capabilities are especially significant for skin-mounted devices for electroencephalography (EEG) in mobile environments.


Here, intimate contact enables efficient electrical coupling for high-fidelity measurement. In particular, the signal-to-noise ratios of the recorded signals benefit from the low electrode-skin contact impedance enabled by the conformal interface.
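As a rough idealization (our illustrative simplification, not a circuit model taken from the publication below), the electrode-skin contact impedance Z_c forms a voltage divider with the amplifier input impedance Z_in, which makes explicit why lowering the contact impedance preserves signal amplitude and hence fidelity:

```latex
% Idealized electrode-amplifier voltage divider (illustrative assumption):
% V_rec : recorded voltage,  V_eeg : underlying scalp potential,
% Z_c   : electrode-skin contact impedance,  Z_in : amplifier input impedance.
\[
  V_{\mathrm{rec}} \;=\; V_{\mathrm{eeg}}\,\frac{Z_{\mathrm{in}}}{Z_{\mathrm{in}} + Z_{c}},
  \qquad Z_{c} \ll Z_{\mathrm{in}} \;\Longrightarrow\; V_{\mathrm{rec}} \approx V_{\mathrm{eeg}}.
\]
```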
Related publications
  • D. H. Kim, N. Lu, R. Ma, Y. S. Kim, R. H. Kim, S. Wang, S. M. Won, H. Tao, A. Islam, K. J. Yu, T. Kim, R. Chowdhury, M. Ying, L. Xu, J. Wu, M. Li, H. J. Chung, F. G. Omenetto, Y. Huang, T. P. Coleman, J. A. Rogers, “Epidermal Electronics”, Science, August 12, 2011.

A Team Decision-Theory Approach: “agents cooperating to achieve a common goal”

A brain-machine interface is a system that provides a direct communication pathway between the brain and an external device.  Our research group has developed an interpretation of the BMI as a system of multiple agents cooperating to achieve a common goal.  This “team decision theory” viewpoint has led us to leverage insights from feedback information theory and control theory to develop direct brain control systems that are easy to use, theoretically optimal, and attain previously unattainable performance.
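As an illustration of the feedback viewpoint, the sketch below simulates the posterior matching idea that underlies such interfaces: the interface repeatedly poses a binary query at the median of its belief about the user's intended target and updates that belief from a noisy decoded response. The function name, grid-based posterior, and binary symmetric noise model are our own simplifying assumptions, not the published implementation.

```python
import numpy as np

# Minimal sketch (our illustration) of posterior matching: the user holds a
# target point in [0, 1); at each step the interface asks whether the target
# lies to the right of the median of its current posterior, receives that
# answer through a binary symmetric channel with crossover probability eps,
# and performs a Bayes update over a discrete grid of candidate targets.

def posterior_matching(target, eps=0.1, n_steps=30, grid_size=4096, seed=0):
    rng = np.random.default_rng(seed)
    grid = (np.arange(grid_size) + 0.5) / grid_size   # candidate targets in [0, 1)
    post = np.full(grid_size, 1.0 / grid_size)        # uniform prior

    for _ in range(n_steps):
        # Query point: median of the current posterior.
        median = grid[np.searchsorted(np.cumsum(post), 0.5)]
        intended = target >= median                                  # noiseless intended answer
        answer = bool(intended) ^ bool(rng.random() < eps)           # decoded answer after noise

        # Bayes update: likelihood of the observed answer under each candidate target.
        right_of_median = grid >= median
        likelihood = np.where(right_of_median == answer, 1.0 - eps, eps)
        post *= likelihood
        post /= post.sum()

    return grid[np.argmax(post)]                      # point estimate of the intended target

if __name__ == "__main__":
    print(posterior_matching(target=0.731))           # typically converges near 0.731
```

Because each query targets the point of maximal remaining uncertainty, the posterior concentrates on the intended target at a rate governed by the capacity of the underlying noisy binary channel.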
Related publications
  • C. Omar, A. Akce, M. Johnson, T. Bretl, R. Ma, E. Maclin, M. McCormick, and T. P. Coleman, “A Feedback Information-Theoretic Approach to the Design of Brain-Computer Interfaces”, International Journal of Human-Computer Interaction, January 2011.
  • R. Ma, N. Aghasadeghi, J. A. Jarzebowski, T. W. Bretl, and T. P. Coleman, “A Stochastic Control Approach to Optimally Designing Variable-Sized Menus in P300 Neural Communication Prostheses”, IEEE Transactions on Neural Systems and Rehabilitation Engineering (TNSRE), January 2012.
  • S. K. Gorantla and T. P. Coleman, “Equivalence Between Reliable Feedback Communication and Nonlinear Filter Stability”, IEEE International Symposium on Information Theory (ISIT), August 2011.
  • R. Ma and T. P. Coleman, “Generalizing the Posterior Matching Scheme to Higher Dimensions via Optimal Transportation”, Allerton Conference on Communication, Control, and Computing, September 2011.
  • A. Kulkarni and T. P. Coleman, “An Optimizer’s Approach to Stochastic Control Problems with Non-classical Information Structure”, IEEE International Conference on Decision and Control (CDC), to appear, December 2012.

Machine Learning in Dynamic Interacting Networks: Uncovering Causal Influences

Many current viewpoints about how neural processes integrate to elicit complex brain function posit that populations of neurons in the human brain are connected to form functionally specialized assemblies.  With the increasing ability to record multiple neural signals from different brain areas simultaneously, a core issue in neuroscience research is how to analyze these ensemble recordings and infer the structure of the underlying mechanisms.  One approach to understanding these mechanisms is to use a statistical measure of causality.
The directed information is an information-theoretic quantity, analogous to mutual information, that encodes the fundamental limits of communication with feedback.  Unlike mutual information, it is directional and non-symmetric.  From the viewpoint of sequential prediction under log loss, it is philosophically consistent with “Granger causality”: it measures the direction of causality (e.g., X causing Y) by assessing whether past values of X and Y predict the future of Y better than past values of Y alone. We have extended the directed information to the appropriate notion of causality when more than two time series are recorded simultaneously.  By adopting a coupled dynamical systems and generative model viewpoint, we show that the “right” measure of causality within a network of many interacting processes is the “causally conditioned directed information”.
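For concreteness, the standard definitions are as follows (the notation, with X^i = (X_1, ..., X_i), is ours and may differ superficially from that of the papers below):

```latex
% Massey's directed information from X^n to Y^n, and its causally conditioned
% version given a third process Z^n; here X^i denotes (X_1, ..., X_i).
\[
  I(X^n \to Y^n) = \sum_{i=1}^{n} I\left(X^i;\, Y_i \,\middle|\, Y^{i-1}\right),
\]
\[
  I(X^n \to Y^n \,\|\, Z^n) = \sum_{i=1}^{n} I\left(X^i;\, Y_i \,\middle|\, Y^{i-1},\, Z^i\right).
\]
```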
We have developed provably good algorithms to estimate these quantities from data and have demonstrated how the network causal dynamics represent information processing in the brain.  In the primary visual cortex of an awake, behaving monkey, we analyzed simultaneous spiking and field potential recordings and demonstrated a consistent change in causal interactions between cells before, during, and after a visual stimulus evokes a motor response. Our procedure identifies strong structure in the estimated causal relationships in the spike trains, whose directionality and speed are consistent with predictions made from the wave propagation of simultaneously recorded local field potentials.
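For intuition only, the following is a toy plug-in estimate of the directed information rate between two binary spike trains under a first-order, strictly delayed Markov simplification (in which case it reduces to lag-1 transfer entropy). The function name and the synthetic data are our own illustration; the estimators in the publications below are far more general and come with consistency guarantees.

```python
import numpy as np
from collections import Counter

# Toy plug-in estimate of the directed information *rate* from binary spike
# train x to binary spike train y, under a first-order, strictly delayed
# Markov approximation (equivalently, lag-1 transfer entropy).

def di_rate_order1(x, y):
    x, y = np.asarray(x, dtype=int), np.asarray(y, dtype=int)
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # counts of (y_i, y_{i-1}, x_{i-1})
    total = len(y) - 1

    di = 0.0
    for (yi, yp, xp), c in triples.items():
        p_joint = c / total
        # Empirical conditionals p(y_i | y_{i-1}, x_{i-1}) and p(y_i | y_{i-1}).
        p_y_given_yx = c / sum(v for (a, b, d), v in triples.items() if b == yp and d == xp)
        p_y_given_y = (sum(v for (a, b, d), v in triples.items() if a == yi and b == yp)
                       / sum(v for (a, b, d), v in triples.items() if b == yp))
        di += p_joint * np.log2(p_y_given_yx / p_y_given_y)
    return di  # bits per time bin

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.integers(0, 2, 20000)
    y = np.roll(x, 1) ^ (rng.random(20000) < 0.1)   # y copies x with a one-bin delay, plus noise
    print(di_rate_order1(x, y))   # noticeably positive: x causally influences y
    print(di_rate_order1(y, x))   # near zero: no influence in the reverse direction
```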
Our approach applies to arbitrary recording modalities and thus extends to a variety of applications, including social networks, economics, and network security.
Related publications
  • C. Quinn, T. P. Coleman, N. Kiyavash, and N. G. Hatsopoulos, “Estimating the directed information to infer causal relationships in ensemble neural spike train recordings”, J. Computational Neuroscience, January 2011.
  • S. Kim, K. Takahashi, N. Hatsopoulos, and T. P. Coleman, “Information Transfer Between Neurons in the Motor Cortex Triggered by Visual Cues”, IEEE Engineering in Medicine and Biology Society Annual Conference, September 2011.
  • J. Etesami, N. Kiyavash, and T. P. Coleman, “Learning Minimal Latent Directed Information Trees”, IEEE International Symposium on Information Theory (ISIT), July 2012.
  • C. Quinn, N. Kiyavash, and T. P. Coleman, “Efficient Methods to Compute Optimal Tree Approximations of Directed Information Graphs”, IEEE Transactions on Signal Processing, to appear.

3 thoughts on “Research”

  1. Pingback: Finally, Tattoos That Let You Control Objects with Your Mind | TIME.com
  2. Pingback: Scientists Are Working On Telekinetic Tattoos! - UberFacts
  3. Pingback: Psychical Research in the 21st Century - University Blog » Electronic Telepathy & Psychokinesis: For the General Public?