I officially joined the Association Of Magicians In Trøndelag (FAMIT), as a mathemagician.
Goal: Sensorimotor synchronization (SMS) is the rhythmic synchronization between a timed sensory stimulus and a motor response. In this experiment you will perform an SMS task: hearing/watching a tempo-changing recurrent auditory/visual stimulus and responding to it by generating a corresponding sequence of click sounds.
Task: You are supposed to generate a one-to-one response in synchronization with the auditory/visual sequence of clicks/flashes. Your sensory act is to listen to or watch a sequence generated by a computer application, and your motor response will be hand-clapping, finger-tapping, or hitting a given weight against a pad. Your results will be analyzed to discover how humans synchronize with a perceived changing rhythm under different circumstances. We are particularly interested in whether different subjects behave considerably differently given their individual characteristics (musical background, age, gender, etc.).
Anonymity: Your results will remain anonymous and will not be reported or published anywhere with your name; however, you can ask the test holders to give you your own analyzed plots if you want to know more about your rhythm-adaptation characteristics.
Safety: No harm will be caused by these tests; however, if you have a history of epileptic seizures or any neural disorder, please inform the experimenters. In addition, the arm/hand/fingers you will use during the test must be in a healthy condition.
Compensation: The experiment takes place over two days. On each day, a subject performs a 45-minute trial, and on the second day receives two cinema tickets for completing both trials.
Please contact email@example.com if you are interested in taking part in the experiment. Thanks for your consideration! 🙂
Could these virtual people at some point jam together? Or could I ask the avatar if he knows the answer to this? If he knows, can he also research how we can jam with him? And then can he at some point own a TV and design another avatar on his own?
This actually reminded me of an idea I had in childhood: a movie about people living great lives who, at the end, sadly realize that they are game avatars created by more “real” creatures. Fiction writers should explore this topic more.
Read this real story and tell me if you also believe that it could be more than a coincidence:
In short, it is about a neuroscientist who had studied the criminal brain for 20 years and had shown that the orbital cortex (or whatever) is inactive in those rare people. And then, when he scans his own brain, it turns out that he is one of them himself!
But what, other than chance, can be behind this? I bring up this question:
Is it possible for different parts of the brain to directly communicate “inside the skull” and inform each other about their condition? We know that different brain circuits “inform” each other about signals and such, but I am asking about a higher level of informing.
Today Mr. Fallon knows that his orbital cortex is abnormally inactive. He knows this by looking at his brain scans, provided through the world outside (scanning devices and the rest, outside of his skull). Now his brain has externally revealed something about itself. Wouldn’t it then be possible that his brain already “knew” it internally, but not consciously?
Let’s map it from the physical brain domain to the mind domain: a part of his mind (call it the researcher part) is now externally aware of a disorder in another part (the criminal part). Now, is it imaginable that the “conscious researcher part” internally had some clues about the “unconscious criminal part”?
This is a philosopher sending a query to the experimental scientists: Is there such an internal awareness? Back to the hard-wired brain domain, it could be a result of some internal nervous connections between such brain regions. Or I don’t know. Any sort of connection that has in some way inspired, motivated and driven him to perform such study, by the means available in the “outside” world.
Related on brain and mind: Symmetric mind, bilateral brain.
Title: Modeling human auditory synchronization behavior based on EEG data
An overall view: While dancing, performing, or listening to rhythmic music, we synchronize our reactions to the auditory stimuli. The characteristics of such an action depend on how we use our short-term auditory memory, that is, the ability to recall something heard very recently. Every proposed synchronization model should take the role of this memory into account. The traditional way of understanding this memory function is to analyze the recorded behavior of synchronously cooperating subjects, for example by processing the sound signals they produce. In addition, measurements of electrical brain activity, aligned with the corresponding sound signals and benefiting from EEG’s good temporal resolution, may be a very useful source.
The Assignment: In this proposed project, we will conduct experiments in which subjects passively listen to hand claps, in order to find out how that translates to EEG activity. We will set up some well-defined and simple subjective experiments with auditory stimuli and use quantitative EEG methods (mathematical measurement of aspects of the EEG signal) to analyze the resulting information. The process should be passive, given constraints such as the EEG’s sensitivity to body movements, and the results are supposed to address these questions:
We also need to figure out the extent to which asynchrony is perceived. We will come up with a model that describes the behavior of a clapper. One aspect is the perception of the other clapper and the perception of one’s self: EEG might be able to show when a situation is unusual, which might be related to asynchrony.
In this case we simply have the subject listen to a set of pre-recorded hand claps. We then do an experiment with active clapping and subsequently compare the EEG data to find any differences. Here E(t), B(t), and A(t) stand for Ear(t), Brain(t), and Arm(t): the perceived stimulus, the brain activity, and the arm movement, respectively. B(t) comes from EEG data related to hearing the stimuli and will help us understand more about the processing of the perceived stimuli (the synchrony check / prediction performed by the subject).
This study finally aims to define some memory/inertia-related parameters as a measure of the strategy taken by performers in a musical collaboration/synchronization process.
Implementing computer clapper as a tool for subjective rhythmic experiment (+)
In this presentation, Jordi will show some of his latest work as a developer in the field of audiovisual interactivity. Furthermore, the Computer-Clapper software will be presented: an application developed for Nima Darabi that is a programmable sequencer and impulse-response detector for obtaining rhythm-performance metrics. Jordi will introduce MAX/MSP as an interface for developing subjective auditory and visual tests. This interface is not only used to implement human-computer subjective tests, but will also later be used to implement the Computer-Clapper as a serious game.
At the end, Nima Darabi will conclude Jordi’s talk by showing how the observed step responses gathered with the clapper software are used to model the human reaction to tempo change as a second-order damped harmonic oscillator, like a damped mass-spring system. Some objective quality-assessment metrics will also be discussed, specifically for musical interaction.
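To give an idea of what the damped-oscillator model looks like, here is a minimal numerical sketch of an underdamped second-order step response, interpreted as a subject's tempo adapting to a sudden tempo change. The natural frequency and damping ratio below are hypothetical illustration values, not parameters fitted from the clapper data:

```python
import numpy as np

def tempo_step_response(t, tempo_0, tempo_1, omega_n, zeta):
    """Underdamped (0 < zeta < 1) second-order step response:
    the performed tempo adapting from tempo_0 to tempo_1 BPM."""
    omega_d = omega_n * np.sqrt(1 - zeta**2)          # damped natural frequency
    decay = np.exp(-zeta * omega_n * t)               # exponential envelope
    phase = np.arctan(np.sqrt(1 - zeta**2) / zeta)    # phase offset, arccos(zeta)
    y = 1 - decay * np.sin(omega_d * t + phase) / np.sqrt(1 - zeta**2)
    return tempo_0 + (tempo_1 - tempo_0) * y

t = np.linspace(0, 10, 1000)                          # seconds
resp = tempo_step_response(t, 120.0, 140.0, omega_n=2.0, zeta=0.4)
# resp starts at 120 BPM, overshoots past 140, then settles at 140 BPM
```

With zeta < 1 the response overshoots the new tempo before settling, which is the kind of transient the observed step responses would be fitted against.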
Andrew Perkis has posted to all:
Here is a challenge! The daughter of a friend of mine took this picture of herself lying next to a mirror. As you can see, her eyes are open; however, in the mirror they are closed. How can this be explained? All comments appreciated!
Together with Jordi, we tried to test our theory and ended up with these two results. We tried, not so patiently though. Eventually it worked out, but not as well as the original one:
Another day we gave it a try with the newcomer colleague:
Here is another try; it still doesn’t look as extreme as the girl in the photo, but it seems to go in the right direction. 🙂
Jordi mentioned that it might also be an issue with the release time of the camera, which sounds quite reasonable. If the CCD sensor is quite slow (old cameras had this issue) and writes the data from left to right, the girl might have started with closed eyes and opened them quickly while taking the picture. So maybe it is a mixture of both: perspective and a crappy camera.
Jordi found some links supporting the possibility that more complicated stuff than static optics might contribute:
I guess the strange camera issue is somehow related to the known “slow scan” or “photon gating” behavior in phone cameras. Here are some links talking about this issue, or new feature.
Photon gating makes for interesting cameraphone pictures
Take Distorted and Psychedelic iPhone Photos
Why am I going to the US?
CF stands for something called “Compensation Factor”. I am going to present such a thing at the 125th AES convention.
By mathematical induction, San Francisco offers something new about “Influence of delay on musical collaboration” in early October of each even year. I kid you not!
In a musical interaction, short delays may produce a modest but surprising acceleration. So moderate amounts of delay are beneficial, improving the collaboration while keeping the tempo stable.
While two persons are clapping a rhythm together with a certain delay, each can take a strategy between two extreme cases: feeling free and letting the arms synchronize with the ears (the lazy strategy, which decreases the tempo), or making the effort to clap earlier, as much as needed, than what is expected to arrive from the other side (the less synchronized strategy, which keeps the tempo stable). At each moment of a clapping trial, the strategy taken by the performers is somewhere between these two extreme cases. We call this trade-off the “Compensation Factor”.
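The trade-off above can be sketched with a toy simulation. This is a deliberately simplified model under the assumption that two players alternate claps over a one-way delay d, each clapping a nominal period T after hearing the partner, shifted earlier by a compensation factor c in [0, 1]; the function name and parameter values are hypothetical, not the formulation presented at the convention:

```python
def simulate_duet(T, d, c, n_claps=50):
    """Toy model: two players alternate claps over a one-way delay d.
    Each player claps T seconds after hearing the partner's clap,
    compensated by clapping c*d seconds earlier (c in [0, 1]).
    Returns the clap times, alternating between the two players."""
    times = [0.0]
    for _ in range(n_claps - 1):
        heard = times[-1] + d              # partner's clap arrives after the delay
        times.append(heard + T - c * d)    # clap T later, minus the compensation
    return times

# The inter-clap interval works out to T + (1 - c) * d:
lazy = simulate_duet(T=0.5, d=0.05, c=0.0)   # lazy strategy: tempo slows by d per clap
full = simulate_duet(T=0.5, d=0.05, c=1.0)   # full compensation: tempo stays at T
```

In this sketch c = 0 reproduces the lazy strategy (every interval stretches to T + d, so the tempo drops), while c = 1 reproduces the fully compensating strategy (intervals stay at T), with real performances falling in between.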