Methods Events

Informal events highlighting a range of methods used in affective science, hosted by expert users. Come along to ask all your burning questions!

Clicking on each speaker’s name will take you to their personal website.

The events are organized under the following three major themes:

Theme #1: New Opportunities in Remote Data Collection

<a href="https://findanexpert.unimelb.edu.au/profile/194893-peter-koval">Pete Koval</a>

Pete Koval

University of Melbourne, Australia
<a href="https://elisekalokerinos.com/">Elise Kalokerinos</a>

Elise Kalokerinos

University of Melbourne, Australia

Title: Mobile assessment of momentary emotions and emotion regulation in daily life

Date: Thursday, April 15th, 2021

Time: 1:15pm – 2:15pm JST (4:15am – 5:15am UTC)

Description: This workshop will provide an introduction to assessing momentary emotions in everyday life using the experience sampling method/ecological momentary assessment (ESM/EMA). The workshop will focus on selecting/developing survey items to assess emotion in daily life. We will also briefly cover other design issues, including ESM/EMA sampling frequency, study duration, questionnaire length, online participant recruitment and onboarding, participant incentives, and software options. Finally, we will discuss some common challenges and pitfalls of using ESM/EMA to study emotion in daily life.
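By way of illustration, here is a minimal sketch (not taken from the workshop materials) of one common ESM/EMA design decision mentioned above: generating a stratified-random prompt schedule. The sampling window, number of prompts per day, and minimum gap are hypothetical placeholders.

    import random
    from datetime import datetime, timedelta

    def daily_prompt_times(date, n_prompts=10, start_hour=9, end_hour=21, min_gap_min=30):
        """Stratified-random ESM schedule: one prompt per equal-width block of the waking day,
        with at least min_gap_min minutes between consecutive prompts."""
        day_start = datetime(date.year, date.month, date.day, start_hour)
        block = (end_hour - start_hour) * 60 / n_prompts   # block width in minutes
        times = []
        for i in range(n_prompts):
            lo = i * block
            hi = (i + 1) * block - min_gap_min             # buffer before the next block starts
            times.append(day_start + timedelta(minutes=random.uniform(lo, max(lo, hi))))
        return times

    # Example: a hypothetical 7-day study with 10 prompts/day between 9:00 and 21:00
    schedule = [daily_prompt_times(datetime(2021, 4, 15) + timedelta(days=d)) for d in range(7)]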

<a href="https://cos.northeastern.edu/people/derek-isaacowitz">Derek Isaacowitz</a>

Derek Isaacowitz

Northeastern University, USA

Title: Eye tracking (affective science from the webcam)

Date: Thursday, April 15th, 2021

Time: 1:15pm – 2:15pm BST (12:15pm – 1:15pm UTC)

Description: In this session, we will discuss eye tracking methods that may be of use to affective science researchers. We will consider in-lab stationary eye tracking, mobile eye tracking and at-home eye tracking, in terms of both their research potential and their challenges.

<a href="https://mehr.cz/">Samuel Mehr</a>

Samuel Mehr

Harvard University, USA

Title: Best practices in conducting experimental research online

Date: Friday, April 16th, 2021

Time: 9:00am – 10:30am PDT (4:00pm – 5:30pm UTC)

  • Due to popular demand, this methods event will now be repeated at the following times:
  • Session 1: 9:00am – 9:45am PDT (4:00pm – 4:45pm UTC)
  • Session 2: 9:45am – 10:30am PDT (4:45pm – 5:30pm UTC)

Description: Many labs conduct research online using paid participant pools like MTurk or Prolific but relatively few have taken the leap to larger-scale, volunteer-based citizen science. The approach has a few key advantages, including massive numbers of participants (tens or hundreds of thousands) from every corner of the world, low cost, ongoing recruitment “self-propelled” by social media and organic sharing, flexible experiment design, facile cross-lab collaborations, and so on. In this workshop I will present a bit about how we designed and built themusiclab.org and discuss some of what we’ve learned from our first 3.5 million participants, with an eye toward discussion of the pitfalls of expanding laboratory research into the citizen science space.

<a href="http://manostsakiris.com/index.php/project/gianluca-finotti/">Gianluca Finotti</a>

Gianluca Finotti

Royal Holloway, University of London, UK

Title: Measuring cardiac activity via webcam

Date: Friday, April 16th, 2021

Time: 9:00am – 10:30am PDT (4:00pm – 5:30pm UTC)

  • Due to popular demand, this methods event will now be repeated at the following times:
  • Session 1: 9:00am – 9:45am PDT (4:00pm – 4:45pm UTC)
  • Session 2: 9:45am – 10:30am PDT (4:45pm – 5:30pm UTC)

Description: Recording physiological measures, such as cardiac activity, is necessary in many research fields. However, as psychophysiological research increasingly moves to online platforms, there is an urgent need for methods to acquire such physiological measures remotely. Remote photoplethysmography (rPPG) could offer a low-cost, non-invasive solution to this problem. First explored in the 1930s, contact-PPG is used in clinical settings to detect blood volume changes in peripheral circulation by illuminating the skin and detecting changes in light absorption as blood perfuses the tissue. Here, we present the validation of a photoplethysmography-based method that allows cardiac activity to be measured remotely using commercial webcams. We validated this approach in four separate studies. In experiment one, we recorded participants’ videos via webcam (N = 48) and compared the heart rate (HR) measured with the rPPG algorithm with the HR measured with a smartphone-based PPG app (r = 0.69). In experiment two, we compared the rPPG HR extracted from professional videos recorded in the lab (N = 32) with the real HR calculated from participants’ electrocardiogram (ECG) (r = 0.96). In experiment three, we tested the rPPG algorithm on the CohFace dataset (N = 142), a database of videos of faces recorded with a webcam in a controlled setting, with different lighting conditions and ECG recordings, and compared the rPPG HR with the real HR (r = 0.80). In experiment four, we recorded videos of faces in our lab via webcam while recording participants’ ECG (N = 27), and again compared the rPPG HR with the real HR (r = 0.96). Overall, the results show that this is a viable and reliable technique for remote recording of participants’ HR, adding a valuable tool to online research.
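As a rough illustration of the general rPPG principle described above (not the specific algorithm validated in these studies), the sketch below estimates heart rate from the average green-channel intensity of a pre-cropped face video; the file name, frequency band, and single-channel approach are illustrative assumptions.

    import cv2
    import numpy as np
    from scipy.signal import butter, filtfilt

    # Illustrative rPPG sketch, not the authors' pipeline.
    # Assumes 'face.mp4' is a video already cropped to the participant's face.
    cap = cv2.VideoCapture("face.mp4")
    fps = cap.get(cv2.CAP_PROP_FPS)

    green = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        green.append(frame[:, :, 1].mean())   # mean green-channel intensity per frame (BGR order)
    cap.release()

    signal = np.asarray(green) - np.mean(green)

    # Band-pass around plausible heart rates (0.7-3.0 Hz, i.e. roughly 42-180 bpm)
    b, a = butter(3, [0.7, 3.0], btype="band", fs=fps)
    filtered = filtfilt(b, a, signal)

    # Heart rate = dominant frequency of the filtered signal
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    print(f"Estimated heart rate: {60 * freqs[np.argmax(spectrum)]:.1f} bpm")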

<a href="https://gabryant.scholar.ss.ucla.edu/">Greg Bryant</a>

Greg Bryant

UCLA, USA

Title: Methodological and conceptual issues in multicultural voice perception research

Date: Friday, April 16th, 2021

Time: 9:00am – 10:30am JST (12:00am – 1:30am UTC)

Description: Interest in large-scale cross-cultural research projects has increased dramatically, both online and through collaboration with networks of researchers situated in different locations around the world. In this event, I will address some issues that voice (and other) researchers face moving forward, especially those wishing to work with small-scale societies. Various methodological problems must be solved related to language, the use of computers, rating scales, generating stimuli, overall experimental design, and the nature of experimental tasks.

<a href="https://researchers.mq.edu.au/en/persons/kirk-olsen">Kirk Olsen</a>

Kirk Olsen

Macquarie University, Australia
<a href="https://researchers.mq.edu.au/en/persons/bill-thompson">Bill Thompson</a>

Bill Thompson

Macquarie University, Australia

Title: Enhancing diversity and inclusion in the psychology of music

Date: Thursday, April 15th, 2021

Time: 1:15pm – 2:15pm JST (4:15am – 5:15am UTC)

Description: This workshop will describe recent strategies for nurturing diversity and inclusion in emotion research, using as a case study the methods used to examine emotional responses to music. Music psychology has traditionally focused on a restricted range of genres within a Western tonal framework. This emphasis has led to an understanding of the psychology of music that reflects a narrow range of human experiences, yet purports to shed light on general principles and mechanisms. Strategies are urgently needed to ensure that the field considers the experiences of non-Western and under-studied listeners and genres of music, ranging from intercultural community groups, to Eastern chanting rituals, to extreme metal subcultures.

Theme #2: Computational Social Science

<a href="https://clavel.wp.imt.fr/">Chloé Clavel</a>

Chloé Clavel

Télécom Paris, France

Title: Natural language processing for sentiment analysis and social media behavior

Date: Thursday, April 15th, 2021

Time: 9:00am – 10:00am BST (8:00am – 9:00am UTC)

Description: We will examine the role of natural language processing in human-agent interaction, illustrated by research on the analysis of user utterances using both symbolic and deep-learning methods. We will describe the specific structure of human-agent interaction and investigate future directions based on recent trends in neural architectures for integrating interaction context and spontaneous speech features.
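As a minimal, hedged illustration of the kind of utterance-level analysis mentioned above (not the speaker's own models), the sketch below runs a generic pre-trained transformer sentiment classifier over a couple of user utterances; the default model and the example sentences are assumptions.

    from transformers import pipeline

    # Illustrative only: a generic pre-trained sentiment classifier,
    # not the symbolic or neural models discussed in the talk.
    classifier = pipeline("sentiment-analysis")  # downloads a default English sentiment model

    utterances = [
        "I really enjoyed talking with the virtual assistant today.",
        "This conversation is going nowhere and I'm getting frustrated.",
    ]

    for text, result in zip(utterances, classifier(utterances)):
        print(f"{result['label']:>8}  ({result['score']:.2f})  {text}")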

<a href="https://scholar.google.com/citations?user=6jMFwJQAAAAJ&hl=en">Pablo Arias</a>

Pablo Arias

CNRS/IRCAM, France

Title: Computational techniques to synthesize emotional speech

Date: Thursday, April 15th, 2021

Time: 9:00am – 10:30am BST (8:00am – 9:30am UTC)

  • Due to popular demand, this methods event will now be repeated at the following times:
  • Session 1: 9:00am – 9:45am BST (8:00am – 8:45am UTC)
  • Session 2: 9:45am – 10:30am BST (8:45am – 9:30am UTC)

Description: While computational analysis methods have become a commodity in emotion research, experiments that attempt not only to describe but to computationally manipulate expressive cues in emotional stimuli have remained relatively rare. In this methods event, we will present the methodological advantages of using stimulus manipulation techniques for the experimental study of emotions. Specifically, we will argue that stimulus manipulation techniques can allow researchers to make causal inferences between stimulus features and participants’ behavioral, physiological and neural responses. To illustrate our point, we will present recent studies that use manipulation algorithms to control emotions in faces and voices, both in classic “stimulus-response” paradigms and during real-time social interactions.
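To give a concrete, purely illustrative sense of what manipulating an expressive cue can look like, the sketch below shifts the pitch of a recorded voice upward, a cue often associated with more positive- or aroused-sounding speech; the file name, shift size, and use of librosa are assumptions, not the algorithms presented in this event.

    import librosa
    import soundfile as sf

    # Illustrative sketch: parametrically manipulate one expressive cue (pitch) in a voice recording.
    # 'neutral_voice.wav' and the +2 semitone shift are hypothetical choices.
    y, sr = librosa.load("neutral_voice.wav", sr=None)

    shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=2.0)  # raise pitch by 2 semitones

    sf.write("voice_pitch_up.wav", shifted, sr)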

<a href="http://lpvs-uqo.ca/en/directeurs/caroline-blais/">Caroline Blais</a>

Caroline Blais

University of Quebec in Outaouais, Canada

Title: Data-driven computational methods in affective science

Date: Wednesday, April 14th, 2021

Time: 1:15pm – 2:15pm PDT (8:15pm – 9:15pm UTC)

Description: This workshop will provide an introduction to two data-driven methods: reverse correlation and bubbles. These methods were first developed in the fields of auditory and visual psychophysics and have become powerful tools for affective science. Reverse correlation allows researchers to capture an individual’s mental representation of an object (e.g. an emotional face), and the bubbles technique allows researchers to measure the information on which an individual relies to identify an object. I will present the conceptual basis of the methods, discuss the questions they can address, and provide a brief overview of how to use them.
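As a minimal sketch of the reverse-correlation idea described above (not the exact procedure covered in the workshop), the code below computes a classification image from simulated two-alternative responses: the average of the noise fields on trials judged “happy” minus the average on the remaining trials. The image size, trial count, and simulated observer are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical reverse-correlation experiment: a base face plus random noise on every trial,
    # and a binary judgement ("does this face look happy?") for each noisy stimulus.
    n_trials, size = 2000, 64
    noise = rng.normal(size=(n_trials, size, size))

    # Simulated observer: responds "happy" when the noise happens to brighten the mouth region.
    mouth = np.zeros((size, size))
    mouth[44:50, 22:42] = 1.0
    responses = (noise * mouth).sum(axis=(1, 2)) + rng.normal(scale=5.0, size=n_trials) > 0

    # Classification image: mean noise on "happy" trials minus mean noise on the other trials.
    # Bright regions show where added luminance pushed judgements toward "happy".
    classification_image = noise[responses].mean(axis=0) - noise[~responses].mean(axis=0)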

Theme #3: Emerging Methods in Imaging/Physiology

<a href="https://scholar.google.com/citations?user=BDQu3IIAAAAJ&hl=en">Vincent Taschereau-Dumouchel</a>

Vincent Taschereau-Dumouchel

UCLA, USA

Title: Decoded fMRI neurofeedback

Date: Wednesday, April 14th, 2021

Time: 1:15pm – 2:15pm PDT (8:15pm – 9:15pm UTC)

Description: Thanks to new advances in machine learning and real-time brain imaging, it is now possible to train participants to modulate their own “decoded” brain activity using decoded neurofeedback. In this Methods event, we will discuss how the most recent developments in this field can be leveraged in order to study and modulate affective brain processes.
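The sketch below is a toy, offline illustration of the decoding step underlying this kind of neurofeedback (not the real-time fMRI pipeline discussed in the event): a classifier is trained on multivoxel patterns, and its probability for a target state is turned into a feedback value. The data shapes and the logistic-regression decoder are assumptions.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Hypothetical training data: 200 fMRI patterns over 500 voxels,
    # labelled 1 when the target affective state was present, 0 otherwise.
    X_train = rng.normal(size=(200, 500))
    y_train = rng.integers(0, 2, size=200)

    decoder = LogisticRegression(penalty="l2", max_iter=1000).fit(X_train, y_train)

    # During (simulated) neurofeedback, each new pattern is decoded and the probability
    # of the target state is mapped to a feedback value shown to the participant.
    new_pattern = rng.normal(size=(1, 500))
    likelihood = decoder.predict_proba(new_pattern)[0, 1]
    feedback_size = likelihood  # e.g. the radius of a feedback disc, scaled 0-1
    print(f"Decoded likelihood of target state: {likelihood:.2f}")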

<a href="https://www.uel.ac.uk/research/developmental-psychology/baby-dev-lab/our-team">Marta Perapoch Amadó</a>

Marta Perapoch Amadó

University of East London, UK
<a href="https://www.uel.ac.uk/research/developmental-psychology/baby-dev-lab/our-team">Louise Goupil</a>

Louise Goupil

University of East London, UK

Title: Dual-person EEG for emotion research

Date: Thursday, April 15th, 2021

Time: 1:15pm – 2:15pm BST (12:15pm – 1:15pm UTC)

Description: Currently, most of our understanding of how young children’s brains and bodies respond to social signals comes from studies that presented these signals to children while they were alone, viewing a screen. This leads to a paradox: most of our knowledge of how the young brain functions during social interaction comes from studies that examine individual humans in isolation. In this session, we consider research that aims to move beyond studying how social signals affect the influence of one partner on the other (a unidirectional influence) in favour of studying how social signals affect the relationship between the two partners in the dyad (a bidirectional influence). We discuss findings from studies that used EEG to record brain function in adult-child dyads during free-flowing interactions, and studies that used wireless autonomic monitors and cameras to collect day-long naturalistic recordings from adults and children at home.

<a href="https://researchmap.jp/nakai.tomoya?lang=en">Tomoya Nakai</a>

Tomoya Nakai

University of Tokyo, Japan

Title: Quantitative modeling of emotion representation in the brain

Date: Friday, April 16th, 2021

Time: 1:15pm – 2:15pm JST (4:15am – 5:15am UTC)

Description: We experience a wide variety of emotions in daily life, yet most affective neuroscience studies have used only a small number of controlled stimuli. In this methods event, we will introduce the voxel-wise encoding model approach, which combines neuroimaging and machine-learning techniques to construct a quantitative model of diverse emotions in the human brain. This approach also enables us to visualize a continuous space of emotion categories and its mapping onto the cortical surface.
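For a rough sense of what a voxel-wise encoding model involves (a generic sketch, not the speaker's implementation), the code below fits a ridge regression from stimulus emotion features to each voxel’s response and scores prediction accuracy on held-out data; the feature and voxel counts, simulated data, and single shared ridge penalty are simplifying assumptions.

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Hypothetical data: 300 stimuli x 30 emotion-rating features, and 1000 voxel responses.
    X = rng.normal(size=(300, 30))
    Y = X @ rng.normal(size=(30, 1000)) + rng.normal(scale=2.0, size=(300, 1000))

    X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.25, random_state=0)

    # Voxel-wise encoding model: one regularized linear mapping from features to every voxel.
    model = Ridge(alpha=10.0).fit(X_train, Y_train)
    Y_pred = model.predict(X_test)

    # Prediction accuracy per voxel: correlation between predicted and measured held-out responses.
    accuracy = np.array([np.corrcoef(Y_pred[:, v], Y_test[:, v])[0, 1] for v in range(Y.shape[1])])
    print(f"Median held-out prediction accuracy (r): {np.median(accuracy):.2f}")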