Publications

(last update: 26/08/2016)
For any additional information or to request a copy of the articles, please contact me directly.

Update (09/08/2016): A paper on the feasibility of BCI control in a realistic smart home environment has been accepted for publication in Frontiers in Human Neuroscience.

  • Kosmyna N, Tarpin-Bernard F, Bonnefond N and Rivet B (2016). Feasibility of BCI Control in a Realistic Smart Home Environment. Front. Hum. Neurosci. 10:416. doi: 10.3389/fnhum.2016.00416
  • N. Kosmyna, F. Tarpin-Bernard and B. Rivet. Conceptual Priming for In-game BCI Training. ACM Trans. Comput.-Hum. Interact. 2015. 5-year Impact Factor: 1.37. Presented at CHI 2016.
  • N. Kosmyna, F. Tarpin-Bernard and B. Rivet. Operationalization of Conceptual Imagery for BCIs. In Proceedings of the 23rd European Signal Processing Conference (EUSIPCO’2015). Aug. 2015. Presented.
    Abstract: We present a Brain Computer Interface (BCI) system in an asynchronous setting that allows classifying objects into their semantic categories (e.g. a hammer is a tool). For training, we use visual cues that are representative of the concepts (e.g. a hammer image for the concept of hammer). We evaluate the system in an offline synchronous setting and in an online asynchronous setting. We consider two scenarios: the first where the concepts come from close semantic families (10 subjects), and the second where the concepts come from distinctly different categories (10 subjects). Both achieve classification accuracies of 70% and above, with the more distant conceptual categories yielding about 5% higher accuracy.
  • Nataliya Kosmyna, Franck Tarpin-Bernard, and Bertrand Rivet. 2015. Brains, computers, and drones: think and control! ACM Interactions 22, 4 (June 2015), 44-47. DOI=10.1145/2782758 http://doi.acm.org/10.1145/2782758
    Abstract: Imagine you could control the world with your thoughts. Sounds appealing, doesn’t it? There is a technology that can capture your brain activity and issue commands to computer systems, such as robots, prosthetics, and games. Indeed, brain-computer interfaces (BCIs) have been around since the 1970s, and have improved with each passing decade. You might wonder: “Wait! If this technology has been around all this time, how come we’re not all using it? I mean, we hear about great applications sometimes in the press—controlling a drone, for instance—but then nothing seems to come of it. Why is that?”
  • N. Kosmyna, F. Tarpin-Bernard and B. Rivet. Towards Brain Computer Interfaces for Recreational Activities: Piloting a Drone. In 15th IFIP TC.13 International Conference on Human-Computer Interaction – INTERACT 2015. Springer Berlin Heidelberg. 2015.
    Conference of A level (CORE Ranking) / 2014 acceptance rate: 29%. Presented.
    Abstract: Active Brain Computer Interfaces (BCIs) allow people to exert voluntary control over a computer system: brain signals are captured and imagined actions (movements, concepts) are recognized after a training phase (from 10 minutes to 2 months). BCIs are confined to labs, with only a few dozen people using them regularly outside the lab (e.g. assistance for impairments). We propose a “Co-learning BCI” (CLBCI) that reduces the amount of training and makes BCIs more suitable for recreational applications. We replicate an existing experiment where the BCI controls a drone and compare CLBCI to their Operant Conditioning (OC) protocol over three durations of practice (1 day, 1 week, 1 month). We find that OC works at 80% after a month of practice, but performance is between 60% and 70% any earlier. Within a week of practice, CLBCI reaches a performance of around 75%. We conclude that CLBCI is better suited for recreational use; OC should be reserved for users for whom performance is the main concern.
  • N. Kosmyna, F. Tarpin-Bernard, and B. Rivet. 2015. Adding Human Learning in Brain–Computer Interfaces (BCIs): Towards a Practical Control Modality. ACM Trans. Comput.-Hum. Interact. 22, 3, Article 12 (May 2015), 37 pages. DOI=10.1145/2723162 http://doi.acm.org/10.1145/2723162
    5-year Impact Factor: 1.37. Presented at CHI 2016.
    Abstract: In this article we introduce CLBCI (Co-Learning for Brain Computer Interfaces), a BCI architecture based on co-learning, where users can give explicit feedback to the system rather than just receiving feedback. CLBCI is based on minimum distance classification with Independent Component Analysis (ICA) and allows for shorter training times compared to classical BCIs, as well as faster learning in users and a good performance progression. We further propose a new scheme for real-time two-dimensional visualization of classification outcomes using Wachspress coordinate interpolation, which lets us represent classification outcomes for n classes in simple regular polygons (see the illustrative sketch after this list). Our objective is to devise a BCI system that constitutes a practical interaction modality that can be deployed rapidly and used on a regular basis. We apply our system to an event-based control task in the form of a simple shooter game, where we evaluate the learning effect induced by our architecture compared to a classical approach. We also evaluate how much user feedback and our visualization method contribute to the performance of the system.
  • N. Kosmyna, F. Tarpin-Bernard and B. Rivet. Drone, Your Brain, Ring Course: Accept the Challenge and Prevail! UBICOMP’14 ADJUNCT. 2014. 243-246. DOI 10.1145/2638728.2638785
    Conference of A+ level (CORE Ranking). Presented.
    Abstract: Brain Computer Interface systems (BCIs) rely on lengthy training phases that can last up to months due to the inherent variability in brainwave activity between users. We propose a BCI architecture based on the co-learning between the user and the system through different feedback strategies. Thus, we achieve an operational BCI within minutes. We apply our system to the piloting of an AR.Drone 2.0 quadricopter with a series of hoops delimiting an exciting circuit. We show that our architecture provides better task performance than traditional BCI paradigms within a shorter time frame. We further demonstrate the enthusiasm of users towards our BCI-based interaction modality and how they find it much more enjoyable than traditional interaction modalities.
  • N. Kosmyna, F. Tarpin-Bernard and B. Rivet. Bidirectional Feedback in Motor Imagery BCIs: Learn to Control a Drone within 5 Minutes. CHI’14 Extended Abstracts on Human Factors in Computing Systems. 2014. 479-482. DOI 10.1145/2559206.2574820
    Conference of A+ level (CORE Ranking). Presented over 3 days.
    Abstract: Brain Computer Interface systems rely on lengthy training phases that can last up to months due to the inherent variability in brainwave activity between users. We propose a BCI architecture based on the co-learning between the user and the system through different feedback strategies. Thus, we achieve an operational BCI within minutes. We apply our system to the piloting of an AR.Drone 2.0 quadricopter. We show that our architecture provides better task performance than traditional BCI paradigms within a shorter time frame. We further demonstrate the enthusiasm of users towards our BCI-based interaction modality and how they find it much more enjoyable than traditional interaction modalities.
  • N. Kos’myna, F. Tarpin-Bernard and B. Rivet. Towards a General Architecture for a Co-Learning of Brain Computer Interfaces. In Proceedings of the 6th International IEEE EMBS Conference on Neural Engineering, San Diego, USA, November 2013. DOI 10.1109/NER.2013.6696118
    Conference of A level (CORE Ranking). Presented.
    Abstract: In this article we propose a software architecture for asynchronous BCIs based on co-learning, where both the system and the user jointly learn by providing feedback to one another. We propose the use of recent filtering techniques such as Riemann Geometry and ICA followed by multiple classifications, by both incremental supervised classifiers and minimally supervised classifiers. The classifier outputs are then combined adaptively according to the feedback using recursive neural networks.
  • N. Kos’myna and F. Tarpin-Bernard. Evaluation and comparison of a multimodal combination of BCI paradigms and Eye tracking with affordable consumer-grade hardware in a gaming context. 2013. In IEEE Transactions on Computational Intelligence and AI in Games. Volume PP. Issue 99. DOI http://dx.doi.org/10.1109/TCIAIG.2012.2230003
    5-year Impact Factor: 1.167.
    Abstract: This paper evaluates the usability and efficiency of three multimodal combinations of brain–computer interface (BCI) and eye tracking in the context of a simple puzzle game involving tile selection and rotations using affordable consumer-grade hardware. It presents preliminary results indicating that the BCI interaction is interesting but very tiring and imprecise, and may be better suited as an optional and complementary modality to other interaction techniques.
  • N. Kos’myna and F. Tarpin-Bernard. Une combinaison de paradigmes d’interaction cerveau-ordinateur et suivi du regard pour des interactions multimodales [A combination of brain-computer interaction paradigms and gaze tracking for multimodal interactions]. In Ergonomie et Interaction Homme-Machine (ErgoIHM’2012). 2012. DOI 10.1145/2671470.2671486
    Abstract: This study evaluates the usability and efficiency of three multimodal combinations of Brain-Computer Interfaces (BCI) and Eye-tracking in the context of a simple puzzle game involving tile selections and rotations. Results indicate that although BCI interaction raises interest, it is still very tiring and imprecise. However, BCIs based on SSVEP are efficient in cooperation with gaze.
  • N. Kos’myna. A Multimodal Combination of Brain Computer Interfaces and Eye Tracking. Master’s Thesis. 2012.
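
An illustrative sketch (Python) of the regular-polygon visualization idea mentioned in the TOCHI 2015 entry above. This is not the code from the article: it only shows the simple forward direction of the generalized barycentric (Wachspress-style) interpolation, mapping n non-negative class scores to a point inside a regular n-gon as the score-weighted convex combination of the polygon's vertices. All function names here are hypothetical.

import numpy as np

def polygon_vertices(n, radius=1.0):
    # Vertices of a regular n-gon centered at the origin, first vertex at the top.
    angles = 2 * np.pi * np.arange(n) / n + np.pi / 2
    return np.stack([radius * np.cos(angles), radius * np.sin(angles)], axis=1)

def scores_to_point(scores):
    # Each class owns one vertex; the point is the score-weighted convex
    # combination of the vertices, so a confident single-class output lands
    # near "its" corner and an ambiguous output stays near the center.
    w = np.clip(np.asarray(scores, dtype=float), 0.0, None)
    w = w / w.sum()
    return w @ polygon_vertices(len(w))

print(scores_to_point([0.7, 0.2, 0.1]))   # near the first (top) vertex
print(scores_to_point([1/3, 1/3, 1/3]))   # approximately (0, 0), the center

For softmax-style probabilities the weights already sum to one, so the normalization is a no-op. The Wachspress coordinates used in the article also provide the inverse mapping, from a point inside the polygon back to per-class weights, which this sketch does not attempt.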

© Copyright 2015 Nataliya Kosmyna. All Rights Reserved.