Sara Smith '08
SUMMER WORK 2006

May 30
June 1 2 5 8 9 12 13 14 15 16 19 20 21 22 27
July 10 13 14 17 18 19 20

Work Completed:

I have gone through the entire Gem tutorial. In doing that, I have seen all the geos that are offered through Gem. Geos are Gem geometry objects, like a square, a sphere, or even a teapot!


Link to Gem Tutorial

The first step in creating a geo is to create the Gem window. The gemwin object creates the actual window where the graphics will be shown. Once the window is created, geos can be drawn. To create a geo, you just put down an object box and type the name of the geo that you want to draw. For example, if you wanted to add a square you would write "square" in the object box. This is an example of a simple Gem patch that creates a blue square:


Tuesday, May 30


Today was mostly spent trying to install pixelTANGO on both the iMac and the Dell. For some reason, pixelTANGO continues to crash on the Dell. Finally, after some help, I got pixelTANGO to work on the iMac. I spent the rest of the time trying to find tutorials on how to use pixelTANGO.

It is an interesting program, designed to make Gem and Pd easier to use. It is made so that VJs can use the software to create installations and do live video production with their performances. Constance has been learning the program as well.

Back to Top

Thursday, June 1

The pixelTANGO software is a useful way to use Gem. We were worried, however, that it couldn't be directly controlled by audio. The only patch that allowed that was pt.animate. This patch had an inlet that Constance attached audio input to. I was able to animate various layers and really start to get into the possibilities that pixelTANGO offers.

I found a way to attach Gem objects, like a teapot, to the actual Gem GUI so that I could manipulate the object through pixelTANGO without having to reprogram pixelTANGO's controls through Gem and Pd. This is useful if we want that object to be controlled by audio. In theory, the audio input could be put through the Gem object and we could use that to control it. I have not been able to fully test this because the iMac has been crashing whenever I add the Gem object.

I also spent my time looking at the pmpd tutorial. It looks like this could be useful in the future. I want to make a hurricane-like simulation, which I think would look very cool when controlled by the music.

Back to Top

Friday, June 2

Yesterday, I explained to Constance how Gem objects could be added to the pixelTANGO GUI. She tried it out on her computer to see if it was only crashing on my computer or if pixelTANGO just could not handle that object. It works on the G5, so I am assuming that it is the iMac and not pixelTANGO.

I brought my screwdriver in and we installed the updated chip in the I-CubeX digitizer. However, when we connected it to the computer, it was not being read. We tried all three computers. The Dell did not work because I needed administrator privileges to download the software. After we downloaded the software, the computers were still not reading the actual device. We are trying different things to make the device work. As of noon we are still working on this.

We finally got the I-CubeX digitizer to work. We hooked up the gloves to the digitizer and assigned each sensor in the glove a note. We recorded the sounds we made with GarageBand.

Back to Top

Monday, June 5

I finished the reading on Bayes nets. I have also been looking online for information about the installations that we are reading about. These are the links that I have found so far.

Installation Links

Experiential Extremism
Dancing in the Streets
Moving Boundary Problem
InfoBreath


Back to Top

Thursday, June 8

Over the past two days that I have been sick, I have been reading up on HMMs and Bayes nets so I don't fall behind.

I spent today catching up on the computer work. I installed the camera on the Dell and it worked well with Gem. We then hooked up the projector to the Apple; however, the projector was not working well with the G5. We then hooked it up to the Dell. At first it did not work, but then we attached the projector to the monitor plug on the Dell and it projected the Dell's monitor. At the time, I had a Gem window open that was using the camera as input. The projector was showing the whole computer screen instead of just the Gem window. To show just the Gem window, I just clicked the full-screen button in the window. We did not really try to connect the projector to the Dell using just the USB cord. If this works, it would probably allow us to project the monitor without having to disconnect it.

I spent the rest of the day playing around with the floor sensor. When I first hooked up the digitizer to the computer and the sensor, the computer was not detecting it. To fix this, it only had to be reconfigured. However, once the digitizer was being read, I was still not able to get a reaction to the sensor from the I-CubeX editor. I had to read through the editor manual and follow the steps that it gave. Apparently, I just had to reconfigure the digitizer a couple of times before it would work. Once I got it to work with the editor, the next step was to see if I could get the sensor to talk to Pd. I used the stripnote object and it was working fine. When the sensor was stepped on, Pd would show the velocity of the note being played. The velocity was higher when the sensor was stepped on harder. I then used the synth patch offered in Pd so that I could hear the sound. At some point while playing around with this, Pd stopped reacting to the sensor. The editor was still reacting, but Pd was not. I still have not figured out what happened.

Back to Top

Friday, June 9

After the meeting that we had with Judy, I spent my day working on trying to modify the motion detector that is already in Gem. In their motion detector patch, they use different Gem objects, all in the "pix" class. For example, they use an object called pix_movement that detects the change in pixels between frames. They connect this to a pix_blob object that normally detects the center of gravity of a still picture, but in the case of video it picks up the center of movement. The pix_blob object is used like a typical geo in Gem, such as a cube.

I spent today trying to find a way to use this motion detection capability to detect whether something is moving horizontally or vertically. To do this, I made a simple program in Python to calculate slope. The program works correctly; however, I still have to learn how to use Python programs in Pd.
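The original script is not reproduced here, but at its core it is just the rise-over-run calculation; a minimal sketch (the function name is mine, not the original script's):

    # Minimal sketch of the slope calculation; the function name is an
    # assumption, not from the original script. Note that with integer
    # inputs, "/" in Python 2 (which py/pyext uses) truncates to an int.
    def slope(x1, y1, x2, y2):
        return (y2 - y1) / (x2 - x1)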

Right now the slope program takes in four arguments and then calculates the slope. I made a simple Pd program to test the code that just prints out the slope. However, the only way I was able to pass the arguments was by making a list of float numbers. Pd automatically converts these numbers into the list object, so that when the slope is calculated and printed, the number shown is not a float but an int. A slope of 0.5 prints as 0. I am still trying to work this issue out before I use it for motion detection.

Back to Top

Monday, June 12

Today I worked more on the slope program. I realized that I had to do the float casting in Python and not in Pd; once I did, it worked correctly. My next issue was figuring out how to input the four variables that I needed. I decided that the best way of doing this was to use pyext. Pyext allows you to create classes in Python and then use them as Pd patches. With pyext you can define how many inlets and outlets you would like.
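As a reference for the structure, here is a hedged sketch of what a four-inlet pyext slope object could look like. Only the pyext API itself (subclassing pyext._class, the _inlets/_outlets counts, the float_<inlet> handlers, and self._outlet) is real; the names and inlet assignments are my own choices, not necessarily the original code:

    import pyext

    class slope(pyext._class):
        """Create in Pd as [pyext slope slope]: four float inlets, one outlet."""
        _inlets = 4
        _outlets = 1

        def __init__(self):
            self.x1 = self.y1 = self.x2 = self.y2 = 0.0

        def float_1(self, f):
            # leftmost (hot) inlet: store x1 and output the slope
            self.x1 = float(f)   # float casting done in Python, as noted above
            self._compute()

        def float_2(self, f):
            self.y1 = float(f)

        def float_3(self, f):
            self.x2 = float(f)

        def float_4(self, f):
            self.y2 = float(f)

        def _compute(self):
            dx = self.x2 - self.x1
            if dx != 0:
                self._outlet(1, (self.y2 - self.y1) / dx)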

I ran into problems actually getting pyext to work in Pd. When I opened the example of pyext that is offered through the browser, Pd gave an error message that it couldn't create the pyext object. I tried installing it through flext but was having a lot of difficulties with that approach. I looked online for more information about Python and Pd, and I found this website that offered an example patch that could be downloaded, as well as the Python code to go with it. The website notes that these two files should be in the same directory. So I downloaded the files and put them into C:\program files\pd. When I opened the Pd file, the pyext object was created and the program worked. I looked at the examples in the browser and realized that the Pd files and the Python scripts that they called were not in the same directory. So I copied a Pd example of pyext into the same directory I downloaded the other files into. I also put the py script there. However, when I opened the Pd file, I got the same error message that pyext could not be created. However, if I opened the Pd file that I downloaded from the internet and then opened the example from the browser, the pyext object was able to be created. After playing around with it some more, we observed that the pyext object would only work if I opened the example downloaded from the internet first and then created pyext.

Back to Top

Tuesday, June 13

I finished creating the slope program using pyext. I still have to have that other window open before I can use it, but it works! So I spent time today integrating it into the movement detection patch that is already offered in Gem. So far I am trying to get the circle that tracks the motion to turn red when the movement is horizontal and blue when it is vertical. As of now it only somewhat works. I need to find a better way to detect the different slopes. Here is a screenshot of what I added to the movement detection patch.


This is the Python code that I wrote for the [pyext slope] object.

Constance downloaded the Bayes net program that she wrote onto the Dell. To get it to work, I first had to download the Numeric module; once that was done, it was pretty easy to just make a [py majornet main] object in Pd. I passed it three arguments: current pitch, previous pitch, and key. The output was a MIDI pitch value. It was not hooked up to the keyboard and I did not have it actually make a note, but the py Bayes net program works on the Dell!

I have also copied Constance's orient and code Pd patches and will try to use them in the motion detection patch as well.

Back to Top

Wednesday, June 14

I finally figured out why pyext was only working when I opened the example patch I downloaded from the internet. When I open Pd from the desktop icon, the py/pyext library is not loaded; however, it loads when I open the example program. So, I have to load the py/pyext library at startup using the command line. If I open Pd with "pd -lib py", the library loads and I can create the pyext patch without ever having to open the other window.

I added the orient patch to the movement detection patch I was creating. I am still figuring out how to make it work in the manner I want, but using the orient patch that Constance wrote, I have been able to detect movement in different sections of the screen. I have tested this by changing the color of the circle (which shows the movement) when it goes into different quadrants of the Gem window. The next step is to figure out how to use this so that it can detect up and down motion.

I programmed the smoothing equation that Judy gave me. I am not sure if I did it correctly. It takes in a previous x value and the current x value, and they are plugged into the equation. I wrote it so that I smooth the x and y values separately. It doesn't seem to really smooth things out. The circle gets dragged, and it still gets jumpy when something moves in and out of the foreground. Here is the code I wrote:
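In outline, and assuming the equation Judy gave is the standard exponential-smoothing form, the step looks like this (the weighting constant here is a placeholder, not the actual value used):

    # Sketch of the smoothing step, assuming the standard exponential form
    # x_s = a*x + (1 - a)*x_prev; the constant a is a placeholder, and the
    # same function is applied separately to the x and y values.
    ALPHA = 0.5

    def smooth(prev, current, alpha=ALPHA):
        return alpha * current + (1.0 - alpha) * prev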

Back to Top

Thursday, June 15

The major problem that I am running into in motion detection is figuring out how to send the translation information to the different pyext objects. For example, the first slope program I wrote takes in four arguments: x1, y1, x2, y2. I had to figure out a way to send it x2 and y2 after it sends x1 and y1. This is difficult because the x and y positions come from the current position of the [pix_blob] on the screen. I had to use a [pipe] object to delay x2 relative to x1 and then calculate the slope. The problem with this was that x1 continued to change, so the x values were equal and the y values were equal, and I got constant zero-division errors.

I used the [spigot] object to stop the flow of the x1 and y1 values, which makes it work a little better, but I need to turn them back on to update the value and then turn them off again. I also have to figure out how to deal with an undefined slope. Horizontal motion should have a slope around 0, but vertical motion is undefined, and I do not yet know how to account for that in Pd.
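One way to sidestep the zero-division on the Python side would be a guard before the division, returning a large sentinel slope for vertical motion; a minimal sketch (the epsilon and sentinel values are arbitrary choices, not from the patch):

    # Minimal sketch: treat a near-zero run as vertical motion and return
    # a large sentinel slope instead of dividing by zero. The epsilon and
    # sentinel values are arbitrary choices.
    def safe_slope(x1, y1, x2, y2, eps=1e-6):
        dx = x2 - x1
        if abs(dx) < eps:
            return 1e6  # stands in for an undefined (vertical) slope
        return (y2 - y1) / dx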

I went looking online to see if there was any information about using Pd as a motion detector. I found that there was another external to use called pdp. Pdp has a pdp_mgrid object that is a grid-based motion detector. I thought that this might be more useful than what Gem was using, so I tried to use it, but Pd wasn't creating it. So I searched for information about pdp, and from what I read it only works on Linux. It might work on Macs as well, but when I tried to use it on the iMac I was getting the same error that I was getting on the Dell. I decided that this would be a good time to download Pd on the Linux OS. I was having trouble installing it and found out, through the Pd documentation, that for Red Hat I needed a .rpm file. I found that type of file on this website, but when I tried to download it, it said that I needed administrator privileges.

I tried pdp on the iMac again; some of the objects work, and some do not. The pdp_mgrid object that I want to use works, but for some reason it says that it can't create the window to view the detector. I wonder if I need to hook up the camera to the iMac.

Back to Top

Friday, June 16

I was looking online for more about the pdp extension package. Apparently, it is currently only available for Linux, according to this site. This site also has a link to download pdp.

Back to Top

Monday, June 19

For motion detection, I decided to leave the slope approach alone, because it was not working. I decided to go back to using the orient program that Constance wrote. With this, I can detect movement in the different quadrants of the screen. I wrote a Python program that will detect movement between the quadrants. For example, if someone moved from the 1st quadrant to the 2nd quadrant, that is considered a horizontal movement.

The code I wrote doesn't seem to have any errors, but I am having trouble creating the pyext patch in Pd. It keeps on giving me the error message that it can't create it. I put the Python file in the same place that I put the other Python files that work fine. The other Python files are still working, but for some reason any new Python program that I write will not. I put the code for the movement detection in a file that was working and that I no longer needed. I had to keep the name of the file the same, and then it was able to create the patch. When I changed the name of the file, it stopped working again. I need to fix this problem before I can move forward.

I am starting to have the same problems as before. I wrote a simple program called testprog.py to figure out what was going on. Pd will only create the [pyext testprog] patch when I open the example that I downloaded from the internet. When I load py through the command line, it will not create the patch. I tried loading py through the command line, and then opening the example I downloaded, but that still did not work. I don't understand this behavior.

So, before I can create a new pyext patch, I have to save the Pd file that I am going to use it in, and that file has to be in the same directory as the py file.

I finally got the Python program I wrote to label horizontal, vertical, and diagonal movements to work in Pd using pyext. The detect.py program takes in two numbers, a first position and a second position. The gemwin is divided into four sections, which I labeled 1-4. The Python program takes in, as the first position, one of the numbers from 1-4, representing the section of the screen the movement is in. After a time delay, it takes another number from 1-4, representing the section of the screen that the movement finishes in. The program then outputs a number: 0 for horizontal movement, 1 for vertical movement, and 2 for diagonal movement. For example, if we started in section 1 and then went to section 2, the program would output a 0 for horizontal movement.
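The classification boils down to comparing the row and column of the two sections. Here is a sketch of that logic, assuming sections 1 and 2 are on top and 3 and 4 are on the bottom (a layout consistent with the 1-to-2 horizontal example above, but still an assumption):

    # Hedged reconstruction of detect.py's classification logic. The
    # section layout (1, 2 on top; 3, 4 on the bottom) is an assumption.
    COL = {1: 0, 2: 1, 3: 0, 4: 1}
    ROW = {1: 0, 2: 0, 3: 1, 4: 1}

    def classify(first, second):
        """Return 0 for horizontal, 1 for vertical, 2 for diagonal."""
        col_changed = COL[first] != COL[second]
        row_changed = ROW[first] != ROW[second]
        if col_changed and row_changed:
            return 2
        if row_changed:
            return 1
        # movement within the same section also falls through to 0 here;
        # the original program may handle that case differently
        return 0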

It seems to be working just fine. Right now I have it hooked up so that if there is a horizontal movement, the circle turns red. If there is vertical movement, the circle turns green. Finally, if there is diagonal movement, the circle turns blue.

Back to Top

Tuesday, June 20

Today we put up the Bayes net installation. We taped the microphone to the door and put the speakers on the ledge next to the door. It is working fine; however, I am not sure that people are actually noticing it. I do not know if that is because the volume is low, or because they aren't paying attention. But it is cool to hear the different sounds that people make as they walk by and to hear the reaction from the program.

Back to Top

Wednesday, June 21

I found this site that has information about downloading pdp on OS X.

I am still determined to get pdp to work. Apparently it will work on OS X 10.3, which is what we have on the iMac. In terms of the window display not working, I found a site that said that for video capture and video output, you should use different pdp objects than are used in Linux (site). So I think all I need is a camera that will work on the iMac. The Logitech only works on 10.1 and 10.2.

I am looking online for information about video and motion detection in Pd and Gem. I found an interesting article about gestures and sound here. The authors look into recognizing emotions based on gestures and sound. This is another article that I found about the different technologies that you can use to create a musical instrument with Pd.

I found another Pd external that I could use called GridFlow. As usual, I am having trouble installing it, but if I could get it working, it has some video processing and motion detection capabilities that would be useful. Here is the homepage for GridFlow.

Back to Top

Thursday, June 22

I changed the movement program so that it creates MIDI notes with the horizontal, vertical, and diagonal movements. It is pretty accurate. But I am going to spend some time today seeing if I can split the screen into more than four sections (one possible approach is sketched below). I am also going to spend time trying to download GridFlow.
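Splitting the screen further would mostly mean replacing the quadrant lookup with a general grid computation; one possible sketch (the grid size and numbering scheme are placeholders, not anything from the patch):

    # Sketch: map an (x, y) blob position to a section number on a
    # cols-by-rows grid. The numbering (left to right, top to bottom,
    # starting at 1) is a placeholder scheme.
    def section(x, y, width, height, cols=4, rows=4):
        col = min(int(x * cols / float(width)), cols - 1)
        row = min(int(y * rows / float(height)), rows - 1)
        return row * cols + col + 1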

After spending all day trying to download GridFlow, I could not get it to work. Apparently, it doesn't work on Windows. I tried to download it on the iMac, but it wasn't working. It needs the programming language Ruby to run, and I was having trouble finding and installing the correct version of Ruby.

Back to Top

Tuesday, June 27

On Friday I got the second camera. I installed it on the Dell and then opened Gem. To change between cameras, all you have to do is change the "device" message box to 1 for one camera and 0 for the other. I am having bandwidth issues trying to get them to run at the same time.

I decided to download the EyesWeb software. I downloaded it from here. They also have a user's manual that you can download.

Back to Top

Monday, July 10

I made a patch for the floor sensors. First, I programmed the sensors in the I-CubeX editor. One of the sensors sends a note message and the other sends a control message (for example, volume control). The Pd patch gets the note and control messages and then makes notes using the numbers it gets.

At first, I could only get the sensors to send one note. So, when the sensor was pressed, only one note played. We wanted it to be able to play different notes depending on how hard the sensor was pressed. The [notein] object in Pd gets the incoming MIDI note and then outputs the note and the velocity of the note. When I first started, I connected the note output (from [notein]) to the [noteout] object. This only played one note. All I had to do to change the sensor from playing one note to multiple notes was connect the velocity output (from [notein]) to the note input of [noteout]. When the sensor was pressed, the MIDI note corresponding to the velocity number was played. When pressed lightly, low notes were played, and when pressed hard, high notes were played. The patch simply looked like this:

The notes that it was playing were all the notes from 1 to 128. I changed it to play a specific set of notes. To do this, I had to split the range of 0 to 128 into equal parts. I wrote a pyext object to do this. I divided it up into 8 parts, then 16 parts, and then 12 parts. This is the Python code for that. Right now it plays the C scale. As you press harder on the sensor, you go up the scale.
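The code itself is linked above; the idea is just to divide the velocity range into as many equal bins as there are scale notes and index into the scale. A hedged sketch (the octave choice, starting at middle C, is an assumption):

    # Hedged sketch of the range-splitting logic: divide the 128 velocity
    # values into equal bins, one per scale note. The C major scale
    # starting at middle C (MIDI 60) is an assumed octave choice.
    C_SCALE = [60, 62, 64, 65, 67, 69, 71, 72]

    def velocity_to_note(vel, scale=C_SCALE):
        bin_size = 128.0 / len(scale)
        index = min(int(vel / bin_size), len(scale) - 1)
        return scale[index]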

In addition to playing the notes, we wanted the sensors to control some graphics. Using Gem, I played around with some possible graphics to go along with the floor sensors. Right now, as one of the sensors is pressed, the background of the gemwin gradually turns from blue to green and a teapot rotates. As the other sensor is pressed, the color of the teapot changes. There is also text that shows which note each sensor is outputting at the time. The color of the text changes with the teapot.

As of right now, one of the sensors plays the notes of the C scale. The other sensor plays the same notes, except each note is a pitch below the pitch of the first sensor. Below is the Pd patch and a picture of a gemwin display.

The next step is to do more with the floor sensors and audio. I am having a tough time coming up with some ideas. I think I am going to try to get one of the sensors to harmonize with the other.

Back to Top

Thursday, July 13

Everything that I updated between July 10 and today has been lost. I do not remember exactly what I wrote on those two days, but I will write about what I have been doing.

I wanted to do more with the sensors in terms of audio. I thought that it would be a good idea to have one sensor play a note and the other sensor harmonize with that note. At first I was having trouble figuring out a way to get the second sensor to play a note according to what the other sensor was playing. In the I-CubeX editor, I could assign a note to be outputted by the sensor, but I could not send information from Pd to the editor. So I could not recognize what note was being played by the first sensor and then, through Pd, send the harmony note to the second sensor.

I found the reading about chords and harmonizing with major scales next to my computer. I skimmed through it and used the information about chord progressions and harmony in the floor sensor patch. I changed the pyext code that I had written (to divide the incoming numbers into 12 sections) so that a third output gave the high note of the triad built off of each note in the C major scale. For example, if a C was being played, a G was outputted. I then changed the patch so that the second sensor turned the playing of this note on and off. Because the notes were outputted at the same time, when the harmony note was turned on and off, it always harmonized with the bass note being played. The patch looked like this:
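Aside from the patch itself, the change to the pyext code amounts to a small lookup table; a sketch (the MIDI octave choices are assumptions):

    # Sketch of the harmony mapping: for each degree of the C major scale,
    # the top note of the triad built on that degree (C->G, D->A, E->B,
    # F->C, G->D, A->E, B->F). The octave, starting at middle C, is assumed.
    TRIAD_TOP = {60: 67, 62: 69, 64: 71, 65: 72, 67: 74, 69: 76, 71: 77}

    def high_note(bass):
        return TRIAD_TOP.get(bass)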

I then added a third harmony note using the midnote of the triad mentioned in the reading. To do this, I had to change the pyext object (mentioned above) to have another output, which outputted the midnote. I wanted the second sensor to turn the midnote on when pressed lightly, and the high note on when pressed harder. The input coming from the sensor was control values, which ranged from 1 to 128. I divided this into two parts and then tested which part the incoming value from the sensor was in. So if the incoming value was between 1 and 64, the midnote turned on. If the value was between 64 and 128, the high note was turned on. Consequently, if the value was between 1 and 64, the high note would be turned off. If there was no input coming from the sensor, i.e., no one was standing on it, then neither the midnote nor the high note was played.

I wrote two little pyext objects, one for the midnote and one for the high note. They output a 1 if the note should be turned on, or a 0 if the note should be turned off. The output is connected to a [spigot] that controls whether the MIDI value is sent to a [noteout] object. Here is the patch and the subpatch that has the pyext objects in it:

This is the code for the two pyext objects:

Mid note
High note
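The actual code is linked above; in outline, each object just thresholds the control value and outputs a 1 or 0 to open or close the [spigot]. A sketch:

    # Hedged outline of the two gate objects linked above: each takes the
    # control value (1-128) from the second sensor and outputs 1 or 0,
    # which opens or closes a [spigot] in front of [noteout].
    def midnote_gate(ctl):
        if 1 <= ctl <= 64:
            return 1
        return 0

    def highnote_gate(ctl):
        if ctl > 64:
            return 1
        return 0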

I wrote up instructions for each patch that I made. I put them all directly in each of the patches. I will also write a README for using Pd/Gem and all of my patches on the Dell.

Back to Top

Friday, July 14

I finished writing all of the instructions and the README for the patches I wrote on the Dell. I wrote a subpatch in each of the patches, called "help", that has step-by-step instructions on how to use that patch. The README is more general. Most of the material in the README is redundant with the help subpatches I wrote. The README says where you can find all of the patches, what they are used for, and what they are named. I put the README in the Pd folder.

Here is the README:

README

Back to Top

Monday, July 17

I cannot decide what type of figure to put in my one-page write-up. I made a diagram of the screen split into four quadrants. I am going to make up some other ones.

I am posting this here because I made it in Paint and I am writing on my laptop.

diagram

After putting it in the document, I decided that I like it there, so I will leave it.

We put up the three different lamps. We turned off all of the lights and used the installation, and the cameras were picking up movement. I think that when we did it before and the cameras got stuck, it was a cloudy day and therefore it was darker in the room. Still, the lights are good to have, because they offer more light and give the room a nice feel. The graphics still show nicely on the wall with the lamps on.

I changed around some of the graphics with the floor sensor. The sensor that turns the harmony on and off now controls the background color. I don't have any more ideas about what to do with the graphics.

I have an idea that would allow us to use the motion detection patch that I made. We could have the second camera showing on the Dell. As a person stands still in front of that camera, the HMM will play a chord. We could have the three notes that are played by the motion detection harmonize with that chord. So, someone could stand in front of the camera and move their arm up and down to create notes that go with the chord that is playing as a result of them standing in front of that camera.



Back to Top

Tuesday, July 18

I read over Judy's rewrite. I had only a few comments. It looks like everything is really starting to come together.

Back to Top

Wednesday, July 19

Yesterday, we were hooking up the sensor patch to the iMac. We ran into problems with the MIDI output. Apparently there is no synth on Macs. We downloaded a program called SimpleSynth and it outputted the MIDI notes from the sensors. After stepping on the sensor for some time, the MIDI notes stopped being outputted. Today, I read in the SimpleSynth help menu that if you play a lot of notes at once, it can cause the program to crash. Since that's what we were doing, we couldn't use SimpleSynth. We set up the floor sensor patch on the iMac to convert the MIDI notes to audio signals. I thought that it sounded better using MIDI, so now the Dell does the audio output and the iMac does the graphics for the floor sensor.

I updated the README file. I also made the changes to my python code that Judy made. Here are links to both of them.

README
Floor.py

I changed both the motion detection patch and the floor sensor patch so that the notes played are in the C minor pentatonic.



Back to Top

Thursday, July 20

Today we are showing the installation. It is about 10:40 and the installation is all ready to be shown.

The installation has been shown! It was very interesting to see people's reactions to everything. There were many people in at once. Though it was hard to tell who was making what react, the overall sounds went very nicely together. It was very cool to hear everything together and to see it being used by people who aren't familiar with it.

Back to Top