Research Projects

COST Action 2102

The main objective of the Action is to develop an advanced acoustic, perceptual and psychological analysis of the verbal and non-verbal communication signals that arise in spontaneous face-to-face interaction, in order to identify algorithms and automatic procedures capable of recognising human emotional states. Several key aspects will be considered, such as integrating the developed algorithms and procedures into telecommunication applications and into the recognition of emotional states, gestures, speech and facial expressions, in anticipation of intelligent avatars and interactive dialogue systems that could be exploited to improve user access to future telecommunication services.

NOMCO

The purpose of the project is to carry out collaborative research on the development and analysis of multimodal spoken language corpora in the Nordic countries. The project will
  1. further develop research building on the earlier results we have obtained in this field,
  2. start up and pursue a closer cooperation with the purpose of establishing multimodal corpora for Danish, Swedish, Finnish and Estonian with a number of standardized coding features which will make comparative studies possible,
  3. carry out a number of specified studies testing hypotheses on multimodal communicative interaction,
  4. develop, extend and adapt models of multimodal interactive communication management that can serve as a basis for interactive systems, and
  5. apply machine learning techniques to test the possibilities for automatic recognition of manual gestures, head movements and facial expressions with different interactive communication functions.

Data page

This page includes conversational data collected in collaboration with Nick Campbell during my visit to ATR/NICT in Kyoto, and eye-tracking data collected with the help of Prof. Seiichi Yamamoto during my stay as a NICT Visiting Scholar at Doshisha University in Kyoto. It also contains links and associated papers. (Password protected)

DUMAS

DUMAS is a collaborative project within the European 5th Framework Programme, and I am the technical coordinator of the project. The goal of the project is to furnish electronic systems with intelligent spoken interaction capabilities, and to investigate adaptive multilingual interaction techniques that handle both spoken and text input. The project has constructed AthosMail, an e-mail application that deals with multilingual issues in several forms and environments, and whose functionality can be adapted to different users, situations and tasks.

MUMIN

MUMIN is a Nordic network that aims at stimulating Nordic research in the area of multimodal interfaces. It encourages joint activities on multimodal research and application development, and organises PhD courses and workshops on issues related to multimodal interaction. The first MUMIN workshop was organised in Helsinki, joined with a PhD course in Tampere, in November 2002. The 1st Nordic Symposium on Multimodal Interfaces was held in Copenhagen in September 2003, and a workshop on multimodal annotation in Stockholm in June 2004.

4M

This is a continuation of the Interact project (see below). The project investigates the use of ontologies to enrich the system’s interaction and reasoning capabilities.

PUMS

This, too, is a continuation of the Interact project, focusing on speech applications. The aim is to combine and coordinate speech, text and pointing gestures in a multimodal interaction framework in order to provide navigation help and travel information for users in the Helsinki area.

Interact (2001-2003)

I was the project manager of the Interact project (2001-2003), a collaborative project between four Finnish universities, supported by the Finnish Technology Agency TEKES, by leading IT companies, and by the Arla Institute and the Finnish Association for the Deaf. The project aimed at developing models for human-computer interaction and at building a spoken dialogue system that would allow users to interact with the computer in a natural and robust way. Within the project, we developed the first Finnish dialogue system that provides information on Helsinki area bus timetables.

An overview of the project was also published in Elsnews 10:2:10, and in our paper at the SIGDial'02 Workshop.

MUMMI

MUMMI is a study project at UIAH, realised in collaboration with Marjo Mäenpää and Antti Raike. We explore the Design for All concept in designing multimodal museum interfaces and other types of cultural interfaces, focusing especially on how to provide artistic content in an effective way. We collaborate with the Finnish National Gallery (Marjatta Levanto, Riikka Haapalainen). Check also Antti's widely awarded web portal Elokuvantaju (CinemaSense), which provides net-based study material on film production and also explores how the concepts of film making can be made accessible to those using sign language.


This page was created by Kristiina Jokinen and was last modified on 03/03/2006.