Rhythm Adaptation Experiment Instruction

Introduction:

Goal: Sensorimotor synchronization (SMS) is the rhythmic synchronization of a motor response with a timed sensory stimulus. In this experiment you will perform an SMS task: hearing/watching a tempo-changing recurrent auditory/visual stimulus and responding to it by generating a corresponding sequence of click sounds.
Task: You are asked to generate a one-to-one response in synchronization with the auditory/visual sequence of clicks/flashes. Your sensory task is to listen to or watch the sequence generated by a computer application, and your motor response will be hand-clapping, finger-tapping or hitting a given weight against a pad. Your results will be analyzed to discover how humans synchronize with a perceived changing rhythm under different circumstances. We are particularly interested in whether different subjects behave considerably differently depending on their individual characteristics (musical background, age, gender, etc.).

Anonymity: Your results will remain anonymous and will not be reported or published anywhere with your name; however, you can ask the experimenters for your own analyzed plots if you want to know more about your rhythm-adaptation characteristics.
Safety: Essentially no harm is caused by such tests; however, if you have a history of epileptic seizures or any neurological disorder, please inform the experimenters. In addition, the arm/hand/fingers you use during the test must be in a healthy condition.

Compensation: The experiment takes place over two days. On each day, a subject performs a 45-minute trial; on the second day, the subject receives two cinema tickets for completing both trials.

Instructions for the first day experiment (Auditory Rhythm Adaptation):

  1. The test is done in an anechoic chamber (an acoustically dead room) that absorbs sound almost perfectly. The pyramid-shaped pieces on the walls and floor are covered with carbon and iron, so please take care that your clothes do not get dirty if you lean on them.
  2. Please switch off your phone and take off your watch, wristband, etc. Piercings are fine. The idea is that you should feel comfortable during the test and not have considerable weight attached to your hand. Since the room’s floor is a grid, it will be very hard to recover small accessories if they drop from your hand or the experiment table, so please hand them to the experimenters before you start the required task.
  3. Your task is to generate a one-to-one response in synchronization with the auditory sequence of clicks you hear.
  4. To make the click sound, you may choose your dominant hand, whether you are left-handed or right-handed, but you must use the same hand during the whole session.
  5. You will be blindfolded. Please relax, sit comfortably and focus on the stimuli during each session. The test needs your 100% attention.
  6. There are three different motor tasks, and the experimenters will tell you by which means you generate the click sounds:
    • For the finger-tapping sessions, you are asked to use your wrist to lead an abrupt, pulsed action, suddenly releasing the downward force on the laptop keyboard’s space bar.
    • For the clapping sessions, you take a fingers-to-palm position of the right hand relative to the left hand (or vice versa), forming a right angle with a natural curvature.
    • For the weight-hitting sessions, you will be given metal cylinders (a group of them taped together). Your task is to hold them vertically in your preferred hand and to make the click sound by hitting the weight against a metal pad. Please keep your elbow fixed on the experiment table and do not twist your wrist to hit the pad.
  7. Each trial will take up to 45 minutes and has three sessions.
  8. Each session lasts 13 minutes and is divided into three stimulus sequences of 4 minutes each, with two short breaks. You need to remain seated during a session, but you can take a longer break once you finish a session, when the experimenter gets back to you.
  9. In each 4-minute sequence of auditory stimuli you will listen to a series of repetitions of a single piano note played back to you. You are asked to follow the changing rhythm and to generate a click sound in synchronization with each click you hear. This means that if the tempo changes, you should adapt to it as soon as you can.
  10. If your task is not clear, please feel free to ask the experimenter before you start.

Instructions for the second day experiment (Visual Rhythm Adaptation):

  1. The test is done in an anechoic chamber (an acoustically dead room) that absorbs sound almost perfectly. The pyramid-shaped pieces on the walls and floor are covered with carbon and iron, so please take care that your clothes do not get dirty if you lean on them.
  2. Please switch off your phone and take off your watch, wristband, etc. Piercings are fine. The idea is that you should feel comfortable during the test and not have considerable weight attached to your hand. Since the room’s floor is a grid, it will be very hard to recover small accessories if they drop from your hand or the experiment table, so please hand them to the experimenters before you start the required task.
  3. Your task is to generate a one-to-one response in synchronization with the visual sequence of flashes you observe.
  4. To make the click sound, you may choose your dominant hand, whether you are left-handed or right-handed, but you must use the same hand during the whole session.
  5. You need to stare at the flashing screen during a session. Sit comfortably and be prepared to focus on the stimuli. The test needs your 100% attention.
  6. There are three different motor tasks, and the experimenters will tell you by which means you generate the click sounds:
    • For the finger-tapping sessions, you are asked to use your wrist to lead an abrupt, pulsed action, suddenly releasing the downward force on the laptop keyboard’s space bar.
    • For the clapping sessions, you take a fingers-to-palm position of the right hand relative to the left hand (or vice versa), forming a right angle with a natural curvature.
    • For the weight-hitting sessions, you will be given metal cylinders (a group of them taped together). Your task is to hold them vertically in your preferred hand and to make the click sound by hitting the weight against a metal pad. Please keep your elbow fixed on the experiment table and do not twist your wrist to hit the pad.
  7. This trial will take up to 45 minutes and has three sessions.
  8. Each session lasts 13 minutes and is divided into three stimulus sequences of 4 minutes each, with two short breaks. You need to remain seated during a session, but you can take a longer break once you finish a session, when the experimenter gets back to you.
  9. In each 4-minute sequence of visual stimuli you will watch a black screen that repeatedly flashes white. The flashing light changes its frequency, and you are asked to follow the visual rhythm and generate a click sound in synchronization with each white flash you observe. Each white flash lasts 300 ms, which is long enough to perceive, but try to make the sound immediately when the screen turns white, not when it turns back to black.
  10. If your task is not clear, please feel free to ask the experimenter before you start.

Please contact i@nim.ir in case you are interested in taking part in the experiment. Thanks for your consideration! 🙂

The Others

Could these virtual people at some point jam together? Or could I ask the avatar if he knows the answer to this? If he knows, can he also research how we can jam with him? And could he at some point own a TV and design another avatar of his own?

This actually reminded me of an idea I had in childhood: a movie about people living great lives who, at the end, sadly realize that they are game avatars created by more “real” creatures. Fiction writers should write more on this topic.

Internal self-consciousness?

Read this real story and tell me if you also believe that it could be more than a coincidence:

In short, it is about a neuroscientist who had studied the criminal brain for 20 years and had shown that the orbital cortex (or whatever) is inactive in those rare people. And then, when he scans his own brain, he turns out to be one of them himself!

But what, other than chance, can be behind this? I bring up this question:

Is it possible for different parts of the brain to directly communicate “inside the skull” and inform each other about their condition? We know that different brain circuits “inform” each other about signals and such, but I am asking about a higher level of informing.

Today Mr. Fallon knows that his orbital cortex is abnormally inactive. He knows this by looking at his brain scans, provided through the world outside (scanning devices and the rest, outside of his skull). Now his brain has externally revealed something about itself. Wouldn’t it then be possible that his brain already “knew” it internally, but not consciously?

Let’s map it from the physical brain domain to the mind domain: a part of his mind (call it the researcher part) is now externally aware of a disorder in another part (the criminal part). Now, is it imaginable that the “conscious researcher part” already had some internal clues about the “unconscious criminal part”?

This is a philosopher sending a query to the experimental scientists: is there such an internal awareness? Back in the hard-wired brain domain, it could be the result of some internal neural connections between such brain regions. Or I don’t know. Any sort of connection that has in some way inspired, motivated and driven him to perform such a study, by the means available in the “outside” world.

Related on brain and mind: Symmetric mind, bilateral brain.

Definition of the Master Thesis (2) – Draft

Title: Modeling human auditory synchronization behavior based on EEG data

Student:
Supervisor:
Co-supervisors: Nima Darabi, PhD at Q2S / Peter Svensson, Professor, IET and Q2S
Semester: Spring 2010

An overall view: While dancing, performing or listening to rhythmic music, we synchronize our reactions to the auditory stimuli. The characteristics of such an action depend on how we use our short-term auditory memory, i.e. the ability to recall something heard very recently. Every proposed synchronization model should take the role of this memory into account. The traditional way of understanding this memory function is analysis of the recorded behavior of synchronously cooperating subjects, such as processing of their produced sound signals. In addition to this, measurement of the electrical brain activity might be a very useful source, aligned with the corresponding sound signals and enabled by EEG’s good temporal resolution.
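As a toy illustration of how a synchronization model can incorporate such a short-term memory term, here is a minimal linear phase-correction sketch; the correction gain `alpha` and the timing values are illustrative assumptions, not part of the proposed experiment:

```python
# Minimal linear phase-correction model of sensorimotor synchronization.
# Assumption: on every tap, the tapper corrects a fraction `alpha` of the
# last remembered asynchrony (a crude stand-in for short-term auditory memory).

def simulate_taps(onsets, alpha=0.5, initial_asynchrony=0.05):
    """Return simulated tap times for a list of stimulus onset times (seconds)."""
    taps = [onsets[0] + initial_asynchrony]   # first tap slightly off the beat
    for k in range(1, len(onsets)):
        period = onsets[k] - onsets[k - 1]
        asynchrony = taps[-1] - onsets[k - 1]
        # Next tap: last tap plus the period, minus a partial correction.
        taps.append(taps[-1] + period - alpha * asynchrony)
    return taps

# For an isochronous 600 ms sequence the asynchrony decays geometrically
# by a factor (1 - alpha) on every tap.
onsets = [0.6 * k for k in range(10)]
taps = simulate_taps(onsets, alpha=0.5)
```

With `alpha = 0` the model never corrects and the initial offset persists; with `alpha = 1` the offset is removed in a single tap.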

The Assignment: In this suggested project, we will do experiments in which subjects passively listen to hand claps, in order to find out how this translates to EEG activity. We will set up some well-defined and simple subjective experiments with auditory stimuli and use quantitative EEG methods (mathematical measurement of aspects of the EEG signal) to analyze the recorded information. The process should be passive given circumstances such as the EEG’s sensitivity to body movements, and its results are supposed to address these questions:

  • How is the brain’s electrical activity influenced by passive rhythm perception?
  • Which temporal structures are perceivable by humans as rhythm?
  • How fast can the auditory stimuli be traced in the electrical brain activity?
  • How can this be changed by different patterns and tempos?
  • How much is this dependent on the individual or on training?
  • How can we explore memory aspects of human rhythmic and musical behavior using EEG?
  • When experimenting with delays between hand-clapping performers who are asked to keep the tempo stable, the delay sounds bad to them, but they can keep the tempo. On average, though, people do not compensate enough, so the tempo always decreases. Why don’t they compensate completely?

We also need to figure out the extent to which asynchrony is perceived. We will come up with a model that describes the behavior of a clapper. One aspect is the perception of the other clapper and the perception of oneself: EEG might be able to show when a situation is unusual, which might be related to asynchrony.
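The tempo-decrease question above can be illustrated with a toy model; this is a sketch under assumed numbers (500 ms period, 30 ms one-way delay), not the model we will develop:

```python
# Toy model of a clapper hearing the partner through a one-way delay `d`.
# Each next clap is scheduled one nominal period after the *heard* (delayed)
# clap, minus a compensated fraction `c` of the delay.
# c = 0: lazy, ear-synchronized strategy; c = 1: full compensation.

def clap_times(n_claps, period=0.5, d=0.03, c=0.0):
    times = [0.0]
    for _ in range(n_claps - 1):
        times.append(times[-1] + period + (1.0 - c) * d)
    return times

lazy = clap_times(20, c=0.0)     # each interval stretches to period + d
full = clap_times(20, c=1.0)     # tempo stays stable
half = clap_times(20, c=0.5)     # incomplete compensation: tempo still drifts
```

With incomplete compensation (`c < 1`) every inter-clap interval exceeds the nominal period, so the tempo monotonically decreases, matching the observation above.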

In this case we just have the subject listen to a set of pre-recorded hand claps. We will then do an experiment with active clapping and subsequently compare the EEG data to find any differences. Here E(t), B(t) and A(t) stand for Ear(t), Brain(t) and Arm(t): the perceived stimuli, the brain activity, and the movement of the arms, respectively. B(t) comes from EEG data related to hearing the stimuli and will help us understand more about the processing of the perceived stimuli (the synchrony check / prediction performed by the subject).

This study finally aims at defining some memory/inertia-related parameters as a measure of strategy taken by the performers in a musical collaboration/synchronization process.

Jordi and Nima @ Q2S Colloquium

E-228 presents:

Implementing computer clapper as a tool for subjective rhythmic experiment (+)

In this presentation, Jordi will show some of his latest work as a developer in the field of audiovisual interactivity. Furthermore, the Computer-Clapper software will be presented, an application developed for Nima Darabi which is a programmable sequencer and impulse-response detector for obtaining rhythm-performance metrics. Jordi will introduce MAX/MSP as an interface for developing subjective auditory and visual tests. This interface is not only used to implement human-computer subjective tests, but will also later be used to implement the Computer-Clapper as a serious game.

At the end, Nima Darabi will conclude Jordi’s talk by showing how the step responses gathered from the clapper software are used to model the human reaction to tempo changes by a second-order damped harmonic oscillator, like a damped mass-spring system. Some objective quality-assessment metrics will also be discussed, specifically for musical interaction.
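As a sketch of the kind of model mentioned above, the unit step response of an underdamped second-order system can be written in closed form; the damping ratio `zeta` and natural frequency `wn` below are illustrative assumptions, not fitted values from the clapper data:

```python
import math

def step_response(t, wn=2.0, zeta=0.4):
    """Unit step response of an underdamped second-order system (0 < zeta < 1)."""
    wd = wn * math.sqrt(1.0 - zeta ** 2)      # damped natural frequency
    phi = math.acos(zeta)                     # phase offset
    return 1.0 - math.exp(-zeta * wn * t) / math.sqrt(1.0 - zeta ** 2) \
        * math.sin(wd * t + phi)

# A subject tracking a tempo step from 100 to 120 BPM would, in this model,
# overshoot the new tempo and then settle on it:
tempo = [100.0 + 20.0 * step_response(0.1 * k) for k in range(200)]
```

The overshoot and settling of such a response are what make the mass-spring analogy attractive for describing adaptation to a sudden tempo change.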

An asymmetric challenge

Andrew Perkis has posted to all:

Here is a challenge! A friend of mine’s daughter took this picture of herself lying next to a mirror. As you see, her eyes are open; however, in the mirror they are closed. How can this be explained? All comments appreciated!

Together with Jordi, we tried to examine our theory and ended up with these two results. We tried, not so patiently though. Eventually it worked out, but not as well as the original:

Another day we gave it a try with the newcomer colleague:

Here is another try. It still doesn’t look as extreme as the girl in the photo, but it seems to go in the right direction. 🙂

Said Dirk:

Jordi mentioned that it might also be an issue with the release time of the camera, which sounds quite reasonable. If the CCD sensor is quite slow (old cameras had this issue) and writes the data from left to right, the girl might have started with closed eyes and opened them quickly while taking the picture. So maybe it is a mixture of both: perspective and a crappy camera.

Jordi found some links supporting the possibility that more complicated stuff than static optics might contribute:

I guess the strange camera issue is somehow related to the known “slow scan” or “photon gating” behavior in phone cameras. Here are some links talking about this issue or new feature.

  • Photon gating makes for interesting cameraphone pictures
  • Take Distorted and Psychedelic iPhone Photos
CF in SF

or

Why am I going to the US?

CF stands for something called “Compensation Factor”. I am going to present such a thing at the 125th AES convention.

Based on mathematical induction, San Francisco offers something new about the “influence of delay on musical collaboration” in early October of every even year. I kid you not!

  • October 2004, San Francisco: Professor Chris Chafe, a multidisciplinary musician and researcher who heads CCRMA (the Stanford University Center for Computer Research in Music and Acoustics), published an important paper entitled “Network time delay and ensemble accuracy: effects of latency, asymmetry” at the 117th AES convention. It was in San Francisco, not far from his office, that he proposed his counter-intuitive “Chafe Effect”:

    In a musical interaction, short delays may produce a modest but surprising acceleration. So moderate amounts of delay are beneficial to the collaboration, keeping the tempo stable.

  • October 2006, the same place: Exactly two years later, Snorre Farner confirmed this observed effect and tested it in various situations with his “Ensemble hand-clapping experiments under the influence of delay and various acoustic environments” at the 121st AES. He was a postdoc at our center of excellence (Q2S) and is now a researcher and developer in the Analysis/Synthesis group at IRCAM in Paris (and also a materials and electrochemistry scientist, given his background!). This work was done under the supervision of Peter Svensson, an acoustician who is a professor both in electroacoustics and at our Q2S center. He is now my supervisor.
  • October 2008, again San Francisco: It’s my turn! I am going to the States to try my chance to add another brick in the wall ;) I continued along that line and published a paper at the same convention, but more focused on the strategy people take while collaborating. I used Snorre’s dataset for the analysis, but I chose another approach. This recent paper is called “Quantifying the Strategy Taken by a Pair of Ensemble Hand-Clappers under the Influence of Delay” and is going to be presented as a poster in the Listening Tests & Psychoacoustics session of the 125th AES, on Friday, October 3, 2:30 pm – 4:00 pm. In this work Peter, Snorre and I present a quantifiable strategy factor called the “Compensation Factor (CF)”:
  • While two persons are clapping a rhythm together with a certain delay, each can take a strategy between two extreme cases: feeling free and letting the arms synchronize with the ears (the lazy strategy, which decreases the tempo), or bothering to clap earlier, as much as needed, than what is expected to arrive from the other side (the less synchronized strategy, which keeps the tempo stable). At each moment of a clapping trial, the strategy taken by the performers is somewhere between these two extreme cases. We call this trade-off the “Compensation Factor”.
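As a rough numerical illustration of this trade-off (a sketch only, not the exact definition used in the paper), one can estimate a compensation-factor-like quantity from the observed drift of the mean inter-clap interval:

```python
def compensation_factor(mean_interval, nominal_period, delay):
    """1.0 = full compensation (stable tempo); 0.0 = lazy ear-sync strategy.

    All arguments are in seconds; `delay` is the one-way transmission delay.
    """
    return 1.0 - (mean_interval - nominal_period) / delay

# Nominal period 500 ms, one-way delay 30 ms (illustrative numbers):
cf_lazy = compensation_factor(0.530, 0.500, 0.030)   # intervals stretched by the delay
cf_full = compensation_factor(0.500, 0.500, 0.030)   # tempo kept stable
```

A value near 0 indicates the lazy, ear-synchronized extreme (tempo drifts down by the delay per clap); a value near 1 indicates full compensation.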

3D Pasargadae

The picture on top is an autostereogram I made twelve years ago and recently used for the Project 300.

This 3D “magic eye” picture shows the tomb of Cyrus the Great (576 BC – December 530 BC), founder of the Achaemenid Empire in Persia, who is also known to have written the first declaration of human rights 25 centuries ago.

Can you see the three-dimensional tomb (located at Pasargadae) hidden behind the cuneiform texture in the top picture? It is recommended to try the picture at a higher resolution by clicking on it. The black-and-white image at the bottom is the key.

Tomb of Cyrus the Great

In case you are not familiar with magic-eye techniques (autostereograms), these are pictures that can give you the visual illusion of a 3D sculpture from a 2D picture if you know how to look at them. You should try to change the angle at which your eyes focus on the image: position your eyes as if you are looking at an object behind the screen. Enjoy!
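For the curious, the repetition trick behind such images can be sketched in a few lines of code; the sizes and the simple raised-rectangle depth map below are made up for illustration:

```python
import random

WIDTH, HEIGHT, SEP = 120, 60, 20   # image size and base pattern period (chars)

def depth(x, y):
    """Toy depth map: a rectangle raised 4 levels above the background."""
    return 4 if (30 <= x < 90 and 15 <= y < 45) else 0

def autostereogram():
    """Random-dot autostereogram: each pixel repeats at a depth-dependent offset."""
    rows = []
    for y in range(HEIGHT):
        row = []
        for x in range(WIDTH):
            shift = SEP - depth(x, y)          # closer points repeat sooner
            if x < shift:
                row.append(random.choice("#."))
            else:
                row.append(row[x - shift])
        rows.append("".join(row))
    return "\n".join(rows)

print(autostereogram())
```

Viewed with the eyes diverged as described above, the smaller repetition period inside the rectangle makes it appear to float in front of the background.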

p.s. Here is the book I wrote back then in Persian, plus its software Jarf-Negar. It unfortunately runs under DOS, as it was written before the release of Windows 95. It was, as far as I remember, the best graphics DOS could ever afford: “Super” VGA, 1024×768 pixels with only 16 colors (a selected palette)!