This is a series of projects aimed at identifying affect from speech. Recognizing paralinguistic cues from speech has applications across varied domains of speech processing. In this portal we present approaches to identifying expressed intent from acoustics in the context of the INTERSPEECH 2018 ComParE challenge.
Trello Board
Slack Group
Google Sheet to track experiments
Overleaf Doc to track progress
Models, log files, and outputs in submission format for each of the experiments
There are two sub-challenges we are focusing on: