Monday, November 3, 2008

EXPO...

So the big question... how'd the expo go...

So, not one bad review, and there was a queue up most of the time... that's right... people actually wanted to play the game :D

I noticed a fair few reactions and quirks people tried on their first attempts at playing... which is good, because as I continue to develop I now have an even better idea of what interactions to include, what new hardware designs to consider, and the fact that left-handers do exist.


Reflective report coming soon

RE DO....lol

PS. sorry for the late post...this post has been sitting on my desktop...

Okies... so three days before the due date, and a new idea for the scope... think Elite Beat Agents and Guitar Hero... only for conducting... HAHAHAHA what fun


Seriously, I realised I am going to need a hell of a lot more time and research to create an accurate conducting simulator... solution: make a game.

Essentially, what I have now done is add beat points the user has to reach in tempo, or the orchestra begins to slow down or stop playing.

This is how I went about making this little doozy...

Step one - rough interface concept...

this is what I came up with...



As you can see... or maybe not... a nice, simple interface which is kinda self-explanatory...


Anywho... I made the graphics... and then coded it.

The coding was surprisingly easy. Essentially, the IR still followed the conductor, but the beats would fade in on their relative timers, and if a hit test between the IR and a beat point occurred, the accuracy was checked... if the user was going too fast or too slow, the orchestra would change accordingly...
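The hit-test-plus-accuracy check could be sketched roughly like this. This is a TypeScript sketch for illustration only (the project itself is ActionScript 3), and the names `BeatPoint`, `hitTest`, `judgeHit`, the hit radius and the slack window are all invented, not taken from the actual code:

```typescript
// Hypothetical sketch of the beat-point accuracy check described above.
// A beat point fades in at its scheduled time; when the IR cursor's
// hit test succeeds, the hit time is compared against the schedule so
// the orchestra can speed up or slow down with the conductor.

interface BeatPoint {
  x: number;
  y: number;
  dueMs: number; // when this beat should be hit, relative to song start
}

const HIT_RADIUS = 25; // px tolerance for the IR-vs-beat hit test
const SLACK_MS = 150;  // hits inside this window count as "on tempo"

type Judgement = "early" | "late" | "on-time";

function hitTest(irX: number, irY: number, beat: BeatPoint): boolean {
  const dx = irX - beat.x;
  const dy = irY - beat.y;
  return dx * dx + dy * dy <= HIT_RADIUS * HIT_RADIUS;
}

function judgeHit(hitMs: number, beat: BeatPoint): Judgement {
  const delta = hitMs - beat.dueMs;
  if (delta < -SLACK_MS) return "early"; // conducting too fast
  if (delta > SLACK_MS) return "late";   // too slow: orchestra drags
  return "on-time";
}
```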

this was the finished product...



All in all... I think it went quite well... it worked fine and was fun to play...

Saturday, October 11, 2008

New gesture recognition... IT'S A KEEPER

deep breath....


After a brainstorming session....a new gesture recognition idea was developed...with inbuilt volume and tempo control.

How, you ask... it can't be done, you say?... pffft... with some coffee, no social life and no sleep, anything can be done...

Okies... so here's how the new one works...

When the app detects the IR point, it begins to track it... and then the fun begins

While it's tracking, a beat timer runs in the background, counting in seconds up to 8... and simultaneously, between each beat, the distance covered is measured with a specific tolerance in pixels so that the recognition is not too sensitive.

Next... the current and previous coordinates of the IR position are constantly being updated... therefore, I can test whether the user is conducting up, down, left or right.

Using this, I did the following...

From the initial IR point, I ignored the X values and concentrated on the Y... as soon as the IR point's Y was less than the previous coordinate, I knew that the direction had changed, so I triggered that as a beat.

Next, I had to determine which direction the user was conducting in.
To check whether it was left or right, I did the opposite: ignored the Y value and concentrated on the X value.
Including the tolerance, I checked the current X value against the previous X value and could then tell the direction.

If the user conducts straight back upwards, the recogniser goes back to ignoring the X value...

Next... essentially, I mimicked the X checking to determine when the user hit the 3rd and 4th beats.

Then, when the user releases the button and the IR is lost...the final beat is recorded and the whole thing resets.
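The per-frame direction test described above could be sketched like this. It's a TypeScript sketch with invented names (the project is ActionScript 3, and the real code tracks the IR point with the two-tick buffer shown below), but the tolerance-band logic is the same:

```typescript
// Hypothetical sketch of the direction check with a pixel dead zone.
// Vertical motion wins; only when the baton is vertically steady do we
// look at horizontal motion, mirroring the X/Y "ignore" trick above.

const TOLERANCE = 10; // px dead zone so tiny jitters are ignored

type Direction = "up" | "down" | "left" | "right";

function detectDirection(
  prevX: number, prevY: number,
  curX: number, curY: number
): Direction | null {
  // Screen Y grows downwards, so a smaller Y means the baton moved up.
  if (curY < prevY - TOLERANCE) return "up";
  if (curY > prevY + TOLERANCE) return "down";
  // Within the vertical dead zone: check horizontal motion instead.
  if (curX < prevX - TOLERANCE) return "left";
  if (curX > prevX + TOLERANCE) return "right";
  return null; // inside tolerance on both axes: no change yet
}
```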

here is a code snippet...

tracking...
public function track(e:TimerEvent):void{

    newX = p1x;
    newY = p1y;

    if(i == 0){
        // save the older pair first, so prevX2/prevY2 always hold
        // the position from two ticks ago
        prevX2 = prevX;
        prevY2 = prevY;
        prevX = newX;
        prevY = newY;
        i = 1;
    }
    else if(i == 1){
        prevX = newX;
        prevY = newY;
        i = 0;
    }

}

And here is the bit that checks, from beat 2, what the next beat is...


public function leftorright():void{

    beat2 = true;

    // moved left past the tolerance band: heading to beat 4
    if(p1x < prevX2 - tolerance){
        directionfound = true;
        //trace("Heading to beat 4");
        goingleft = true;
        beat1to2.stop();
        beat2to4.start();
        goingup = false;
        goingdown = false;
        squarecount = 0;
        numofbeats += 1;
    }

    // moved right past the tolerance band: heading to beat 3
    else if (p1x > prevX2 + tolerance){
        directionfound = true;
        //trace("Heading to beat 3");
        beat1to2.stop();
        beat2to3.start();
        goingup = false;
        goingdown = false;
        numofbeats += 1;
    }

    // still inside the horizontal band but moving sharply upwards:
    // the user is conducting straight back up to beat 1
    else if(p1x > prevX2 - tolerance && p1x < prevX2 + tolerance && p1y < prevY2 - (3 * tolerance)){
        directionfound = true;
        //trace("Heads back to beat 1");
        beat1to2.stop();
        backto1.start();
        numofbeats += 1;
    }

}


A little bit hard to digest...however...essentially...

this allows the user to freely conduct and control tempo plus time signatures...

So it's really good.

:D

Wednesday, October 1, 2008

Gesture Recognition Rethink...

WOW!....what a couple of days...

Okies, so essentially I did switch the gesture recognition class previously talked about over to the IR input. Boy oh boy, was I in over my head a little bit. I forgot to take into account that not many people can hold their hand completely still, and even if the human eye could not detect the one-pixel movements in the IR... the wiimote sure could. So essentially... every time the IR moved a pixel off the user's desired course... it would register as a direction change and the gesture would be unrecognisable...

Oh well, I did learn a hell of a lot more about class structures and the ActionScript 3 language during the last couple of days... now for a different approach...

I want the user to be able to see where they can conduct, but at the same time I still want freedom... so after a bit of brainstorming I came up with this idea...



The final implementation changed a bit from the original design...but here is a screen shot of what the game looks like at the moment...

(Obviously needs polishing) but the set up is there and the gestures work :D...


Okay, here is how it works...

The tempo timer counts through 4 beats before resetting itself. Your standard 4 beat tempo.

As you can see, there is an array of conducting points...
During this time, the user can conduct however they want, but when the tempo counter hits 4 beats it checks to see what points have been hit and in what order. It then checks against the possible beat patterns and changes the tempo accordingly...

If the user does not conduct, or makes an unrecognisable pattern (or if they cheat and just wave the IR manically :P), the tempo will randomly decrease in different parts of the orchestra... thus creating chaos, and the music will sound bad.
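The end-of-bar check could be sketched like this. It's a TypeScript sketch purely for illustration (the app is ActionScript 3), and the point names and patterns here are made up, not the app's actual conducting points:

```typescript
// Hypothetical sketch of the per-bar pattern check described above:
// once the tempo counter reaches 4 beats, compare the order in which
// conducting points were hit against the known beat patterns.

const PATTERNS: Record<string, string[]> = {
  // order the (invented) conducting points must be hit in
  "4beat": ["down", "left", "right", "up"],
  "2beat": ["down", "up"],
};

function matchPattern(hits: string[]): string | null {
  for (const [name, pattern] of Object.entries(PATTERNS)) {
    if (hits.length === pattern.length &&
        pattern.every((p, i) => p === hits[i])) {
      return name;
    }
  }
  return null; // unrecognised: let the tempo drift and the music decay
}
```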

So, to conclude, great progress these last couple of days. Although I do miss sleep, it was worth it.

Here is a screen-capture video of the app in action... a little laggy and no audio... but the IR dot moving around is controlled by the IR conducting baton... and the tempo changes accordingly... as does the music...



My next step is to polish up the orchestra, separate the orchestra into different elements and finally set up my projection surface.

Cheers For Reading...

Saturday, September 27, 2008

More Gesture Recognition

Okay, so calling the gesture recognition function took five minutes...learning, recording and creating the gestures took a lot longer.

To not bamboozle you, once again have a look at the mouse direction diagram on this page here.

Essentially, as each direction is assigned a number, you can use this to see what beat pattern the user is trying to make...

For a simple 2/4 beat, the user has to move the stick down then up... so the gesture code is 26.

To call this in ActionScript 3, it was simply...

mg.addGesture("2beat","26");

Great... well, that's great if you want to limit the user to going only directly up and down... so I added some curves and flicks...

Essentially, the final 2/4 beat gestures looked like this...

mg.addGesture("2beat","26");
mg.addGesture("2beat","206");
mg.addGesture("2beat","2106");
mg.addGesture("2beat","21076");

that allowed for the user to make a u-shape in the air, add flicks to the movement, and it would still register as a 2/4 beat gesture.
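The idea of several codes mapping to one gesture could be sketched like this. This TypeScript sketch is my own illustration, not how the MouseGesture class actually matches internally; the `normalize` trick (collapsing repeated direction digits) is invented here to show why a handful of variants covers many wobbly strokes:

```typescript
// Hypothetical sketch: register several direction-code variants under
// one gesture name, then look up a recorded code after collapsing
// consecutive repeats (so "222666" and "26" compare equal).

const gestures = new Map<string, string>(); // code -> gesture name

function addGesture(name: string, code: string): void {
  gestures.set(code, name);
}

function normalize(code: string): string {
  // collapse runs of the same digit: "2200666" -> "206"
  return code.replace(/(.)\1+/g, "$1");
}

function recognise(recorded: string): string | null {
  return gestures.get(normalize(recorded)) ?? null;
}

// the variants from the post above
addGesture("2beat", "26");
addGesture("2beat", "206");
addGesture("2beat", "2106");
addGesture("2beat", "21076");
```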

Now, 4 entries looks nice... but by the time I got to the 6/4 gestures, I was going insane trying to think of extra flicks and movements someone might make...

mg.addGesture("6beat","246246246205");
mg.addGesture("6beat","2462462462405");
mg.addGesture("6beat","24624624624605");
mg.addGesture("6beat","24624624624675");
mg.addGesture("6beat","24535353475");
mg.addGesture("6beat","245353543475");
mg.addGesture("6beat","2546346346065");
mg.addGesture("6beat","2546346346205");
mg.addGesture("6beat","2546346346305");
mg.addGesture("6beat","253463463405");
mg.addGesture("6beat","253453462405");
mg.addGesture("6beat","253453453405");
mg.addGesture("6beat","263463463405");
mg.addGesture("6beat","26346346305");
mg.addGesture("6beat","253535305");
mg.addGesture("6beat","25353505");

It looks like a lot of work, and it did take a couple of hours of testing and recording to get to this point. But I wanted to give whoever uses this as much freedom as possible; as long as they stick to the basic beat patterns (which will be in the next post), they will be fine.

Next was to change the tempo of the music...
As I previously posted, I have the six tracks pre-recorded and ready to play.

Essentially, one sound channel plays the tracks, but only one at a time. If the user conducts a 5/4 beat pattern, the faster track plays; then, if they change to, say, a 3/4 pattern, the channel stops the faster track and plays the 3/4-speed track from the same point the music was up to, only slower.
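The offset maths for that switch could be sketched like this. A TypeScript sketch with invented names (the real app uses an AS3 SoundChannel), under the assumption that all six tracks are the same piece time-stretched to different tempi, so the same musical point sits at the same fraction of each track's length:

```typescript
// Hypothetical sketch: when swapping tempo tracks, resume the new
// track at the same *fraction* of its length, so the music continues
// from the same musical point, just slower or faster.

function resumeOffsetMs(
  posMs: number,    // how far into the old track playback was
  oldLenMs: number, // old track's total length
  newLenMs: number  // new track's total length
): number {
  const fraction = posMs / oldLenMs;
  return fraction * newLenMs;
}
```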

After getting the above working, I found a little problem: the gesture recognition was far, far too sensitive. If someone had a tremor or an unsteady arm, they would not find it easy to stick to the patterns. Luckily, within the mouse gesture class there was a pixel-level sensitivity setting, which I turned way, way down to reduce adverse effects... after a bit of testing, it seems to be working well.

Obviously what I have made is not perfect, but I think it's working damn well considering gesture recognition is a tedious process.

My next step is to apply the above to the IR conducting stick.

Cheers

Monday, September 22, 2008

Changing Tempo on the fly: Gesture Recognition

Well, this is the bit I was dreading: gesture recognition... recognising the beat patterns the user is making and changing the music accordingly... *shudders*.

Well actually it turned out to take five minutes thanks to this little doozy of a recognition class. Gesture Recognition

Thank you, Didier Brun aka Foxy, for making that open source; you just saved me a week's work :D

Okay, so to understand how it works have a read through that blog link above...but essentially... it tracks the starting point, direction and ending point of a gesture.

In my case, where the user is moving the conductor's baton...
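The core trick can be sketched like this. This is a TypeScript sketch of the general technique only: the direction numbering here is illustrative, and the real MouseGesture class defines its own numbering and matching:

```typescript
// Hypothetical sketch: quantise each movement of the cursor (or IR
// point) into one of 8 compass directions, so a whole gesture becomes
// a string of direction numbers that can be matched against patterns.

function direction8(dx: number, dy: number): number {
  // atan2 gives the movement angle; split the circle into 8 sectors
  // of 45 degrees and return the sector index 0..7.
  // (dy grows downwards in screen coordinates.)
  const angle = Math.atan2(dy, dx); // -PI..PI
  const sector = Math.round(angle / (Math.PI / 4));
  return ((sector % 8) + 8) % 8;
}
```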

I have not got it functioning the way I want yet, as I am still designing the conductor's baton, but essentially I am using three basic patterns, and when using the mouse it worked fine.

To switch it over to the IR position, simply put, I made the mouse cursor hidden and matched its coordinates to the IR position... it worked, so I plan to have a more stable version running in the next couple of days :D

Sunday, September 21, 2008

New Panorama

Because using a cube panorama was getting rather annoying, what with having to change each individual face's image every time the user concentrates on a different part, I decided to revert to using a spherical panorama.

The only problem I have with the spherical panorama is that it can warp a little bit depending on screen resolutions. So a little more testing, but more stable than the cube.

To make the panorama, I just used 3D Studio Max 9's inbuilt panorama exporter...

First, I made the scene using some free Google SketchUp models I found at the Warehouse.

Then I positioned a starting point for the camera and exported.

Essentially I got this...



As you can see, it looks quite distorted and weird, but when you chuck it on a sphere in Papervision you get this...




And then, with the tracking, as you look at different spots the camera changes accordingly, you get feedback, etc...

So, all in all, not a bad choice to use the sphere; I will just have to watch the warping.