Image-only Lipreading of Vowels – 2016

This work presents a practical example of using a vowel speech generator to voice vowels that were detected and classified using only visemic information.

This work was published and can be accessed here for more detail.

Real-time vowel detection and classification using Matlab and the IntraFace tracker – 2016

The video presents an overall view of the detection and classification of vowels. It shows real-time use by multiple users, with plots alongside to better illustrate what is happening.

The method uses only visual information, showing that vowels can be extracted and classified from visual features alone. It also shows that a user can adapt to a model trained on another person.
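As a minimal sketch of this kind of visual-only classification, the snippet below assigns a vowel to a frame by nearest-centroid matching on lip-shape features. The features (mouth width/height ratio and lip-opening area) and the centroid values are illustrative assumptions; the published work uses IntraFace landmark tracking and its own classifier.

```python
import numpy as np

def classify_vowel(features, centroids, labels):
    """Nearest-centroid classification of a vowel from per-frame lip features."""
    dists = np.linalg.norm(centroids - features, axis=1)
    return labels[int(np.argmin(dists))]

# Toy centroids for three vowels over hypothetical features
# [mouth width/height ratio, lip-opening area] (illustrative values only).
centroids = np.array([
    [3.0, 0.2],   # /i/ : wide, nearly closed mouth
    [1.2, 0.8],   # /a/ : open mouth
    [1.8, 0.4],   # /o/ : rounded mouth
])
labels = ["i", "a", "o"]

print(classify_vowel(np.array([1.1, 0.75]), centroids, labels))  # prints "a"
```

Adapting to another person's model, as shown in the video, amounts to the new user's features landing close enough to the trained centroids.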

This work was published and can be accessed here for more detail.

News Reporter Recognition to Create Lipreading Speech Recognition Database – 2016

Knowing how hard it is to create a lipreading research database, we created an algorithm that trims the news-reporter segments of a broadcast for use in visemic feature extraction.

This overcomes the difficulty of finding available users for video acquisition, as there are hundreds of news videos online. The classified reporter parts are highlighted with a green border.

Multiple News Reporter Segmentation – 2016

This video is the result of an algorithm to identify and recognise the news reporter in the video. For that, we feed the model a few positive frames of the person and extract features based on HSV and YCbCr histograms to discriminate the reporter. Edges are also used in the classification.
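A minimal sketch of the histogram-based part of such a discriminator is shown below. It assumes frames already converted to the desired colour space (HSV or YCbCr), builds a per-channel histogram descriptor, and flags a frame as the reporter when it is close to a positive exemplar under the chi-square distance. The bin count, distance choice, and threshold are assumptions for illustration; the original work also adds edge features.

```python
import numpy as np

def channel_histogram(channel, bins=16):
    """Normalised histogram of one colour channel (values in [0, 1])."""
    hist, _ = np.histogram(channel, bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def frame_descriptor(frame):
    """Concatenate per-channel histograms of an H x W x 3 frame that is
    assumed to be already converted to the chosen colour space."""
    return np.concatenate([channel_histogram(frame[..., c]) for c in range(3)])

def is_reporter(frame, positive_descriptors, threshold=0.5):
    """Flag the frame as 'reporter' if its descriptor is close to any
    positive exemplar (chi-square histogram distance)."""
    d = frame_descriptor(frame)
    for p in positive_descriptors:
        chi2 = 0.5 * np.sum((d - p) ** 2 / (d + p + 1e-9))
        if chi2 < threshold:
            return True
    return False
```

Frames flagged positive would then be the green-bordered reporter segments kept for the database.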

The video shows the trimmed positive parts.

Teeth segmentation using Matlab – 2015

This is the result of an algorithm implemented in Matlab to track the teeth area while a user is speaking. The work is intended to be integrated into the lipreading project as a new feature.

The algorithm is based on image processing in the Lab and Luv colour spaces.
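As a rough illustration of why these colour spaces help, the sketch below thresholds an image already converted to CIE Lab: teeth pixels tend to be bright (high L*) and weakly coloured (a* and b* near zero), unlike lips and skin. The threshold values are illustrative assumptions, and the original Matlab algorithm combines Lab with Luv rather than using Lab alone.

```python
import numpy as np

def teeth_mask(lab_image, l_min=60.0, a_max=10.0, b_max=15.0):
    """Boolean mask of candidate teeth pixels in an H x W x 3 CIE Lab image:
    bright (L* above l_min) and nearly achromatic (|a*|, |b*| small).
    Thresholds here are illustrative, not the published ones."""
    L, a, b = lab_image[..., 0], lab_image[..., 1], lab_image[..., 2]
    return (L > l_min) & (np.abs(a) < a_max) & (np.abs(b) < b_max)
```

In practice the mask would be restricted to the tracked mouth region and cleaned up with morphological operations before being used as a feature.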

Extreme face stabilisation using Matlab – 2015

In this work, we take a low-resolution webcam video stream and, by tracking the user’s face, implement a method to stabilise the person’s mouth region. The method used in the video is documented by Matlab at this link.

This algorithm was implemented to track and straighten the user’s mouth for lipreading feature extraction.
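The core of this kind of stabilisation can be sketched as follows: from two tracked landmarks per frame (here the eye centres, an assumption for illustration), compute the similarity transform (rotation, scale, and translation) that maps them onto fixed reference positions, so the mouth region stays put across frames. This is a generic alignment sketch, not the specific Matlab implementation linked above.

```python
import numpy as np

def stabilising_transform(left_eye, right_eye, ref_left, ref_right):
    """Return (R, t) of the 2-D similarity transform mapping the tracked eye
    positions onto fixed reference positions: p_stable = R @ p + t."""
    src = np.asarray(right_eye, float) - np.asarray(left_eye, float)
    dst = np.asarray(ref_right, float) - np.asarray(ref_left, float)
    scale = np.linalg.norm(dst) / np.linalg.norm(src)
    angle = np.arctan2(dst[1], dst[0]) - np.arctan2(src[1], src[0])
    c, s = scale * np.cos(angle), scale * np.sin(angle)
    R = np.array([[c, -s], [s, c]])
    t = np.asarray(ref_left, float) - R @ np.asarray(left_eye, float)
    return R, t
```

Applying the returned transform to every pixel (or just to the tracked mouth landmarks) straightens a tilted face back to the reference pose before feature extraction.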

How to Deal with Motion Sickness in Virtual Reality – 2015

With the increase in popularity of VR, we studied one of its biggest drawbacks so far, simulation sickness. To do so, we placed users in a virtual scenario with different voluntary and involuntary movements and studied the relationship between their biosignals and the simulation sickness they reported.

We identified clear relationships between several EDA and ECG features and simulation-sickness events. For more in-depth information, please take a look at the published paper. This work won the Honourable Mention Award for best demonstration at the Sciences and Technologies of Interaction Conference, 2015.

Demonstration of the Influence of Human Emotions in Physiological Signals – 2015

In this work, we show how human emotions are reflected in our biosignals. Four situations are studied: exercise, a stressful environment, a peaceful environment, and a driving simulation.

More information can be found in either the conference paper or the journal paper. This work won the Honourable Mention Award for best demonstration at the Exp.at’2015 International Conference, 2015.

Sensor Based Detection of Alzheimer’s Disease-Related Behaviours – 2013

This work presents the use of wearable sensors to detect one of the symptoms of Alzheimer’s disease with a large impact on the quality of life of both patients and caregivers: psychomotor agitation. Using signal processing, pattern recognition, and machine learning techniques, it is possible to detect psychomotor agitation from the sensor data.
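As a minimal stand-in for such a pipeline, the sketch below slices a wearable accelerometer-magnitude signal into fixed windows and flags those whose variance exceeds a threshold as candidate agitation episodes. Window length, sampling rate, the variance feature, and the threshold are all illustrative assumptions; the thesis uses richer features and learned classifiers rather than a single threshold.

```python
import numpy as np

def agitation_windows(acc_magnitude, fs=100, win_s=5.0, var_threshold=0.5):
    """Split an accelerometer-magnitude signal into non-overlapping windows
    of win_s seconds and flag high-variance windows as candidate agitation.
    A deliberately simplified stand-in for a feature + classifier pipeline."""
    win = int(win_s * fs)
    n = len(acc_magnitude) // win
    flags = []
    for i in range(n):
        segment = acc_magnitude[i * win:(i + 1) * win]
        flags.append(float(np.var(segment)) > var_threshold)
    return flags
```

In the real system, flagged windows would feed a trained classifier instead of being reported directly, reducing false alarms from ordinary movement.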

More detail can be found in my thesis or in the associated published papers (here or here).

BioPlux Biosignals Acquisition and Player Software using Qt and BlueZ in Ubuntu – 2013

Software developed during my Master’s thesis to integrate the bioPLUX hardware device on a Linux platform. The program is written in C++ using Qt and BlueZ.

Using the BioPlux Bluetooth connection protocol, we are able to collect biosignals from both the device and a prototype band.

You can take a more in-depth look in my thesis.

VTK interface to explore retinal tissue, using Qt and C++ – 2012

Software developed with VTK and Qt in C++, plus Matlab. The main purpose was to create software for a Linux-based platform to visualise a 3D volume of retinal tissue from exam slices along the three main planes.

We used Matlab to process the images coming from the examination machine into BMP slices, which are the input to the Qt interface.

This is the result of my two-month internship.