Monday, November 3, 2008

EXPO...

So the big question... how'd the expo go...

So, not one bad review, and there was a queue up most of the time... that's right... people actually wanted to play the game :D

I noticed a fair few reactions and quirks people tried on their first attempts at playing... which is good, because as I continue to develop I now have an even better idea of what interactions to include, new hardware designs to try, and the need to take into account that left-handers do exist.


Reflective report coming soon

RE DO....lol

PS. sorry for the late post...this post has been sitting on my desktop...

Okies... so three days before the due date, and a new idea for the scope... think Elite Beat Agents and Guitar Hero... only for conducting... HAHAHAHA what fun


Seriously though, I realised I am going to need a hell of a lot more time and research to create an accurate conducting simulator... solution: make a game.

Essentially, what I have now done is add beat points the user has to reach in tempo, or the orchestra begins to slow down or stops playing.

This is how I went about making this little doozy...

Step one - rough interface concept...

this is what I came up with...



As you can see... or maybe not... a nice simple interface which is kinda self-explanatory...


Anywho... I made the graphics... and then coded it.

The coding was surprisingly easy. Essentially, the IR still followed the conductor, but the beats would fade in on their relative timers, and if a hit test between the IR point and a beat point occurred, the accuracy was checked... if the user was going too fast or too slow, the orchestra would change accordingly...
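For the curious, the hit-test-plus-accuracy idea can be sketched in TypeScript (this is my own rough sketch, not the actual game code; every name and threshold here is made up):

```typescript
interface Point { x: number; y: number; }

// Simple circular hit test between the IR cursor and a beat point.
function hitTest(ir: Point, beat: Point, radius: number): boolean {
  const dx = ir.x - beat.x;
  const dy = ir.y - beat.y;
  return dx * dx + dy * dy <= radius * radius;
}

// Compare when a beat point was hit against when it was due.
// Inside the tolerance window the orchestra keeps tempo; outside it,
// the game would slow down or speed up the orchestra accordingly.
function judgeTiming(hitMs: number, dueMs: number, toleranceMs: number):
    "early" | "onTempo" | "late" {
  const diff = hitMs - dueMs;
  if (diff < -toleranceMs) return "early";
  if (diff > toleranceMs) return "late";
  return "onTempo";
}
```

The actual app does this in ActionScript against the fading beat graphics, but the timing idea is the same: a tolerance window around each beat's due time.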

this was the finished product...



All in all... I think it went quite well... it worked fine and was fun to play...

Saturday, October 11, 2008

New gesture recognition... IT'S A KEEPER

deep breath....


After a brainstorming session....a new gesture recognition idea was developed...with inbuilt volume and tempo control.

How, you ask... it can't be done, you say?... pffft... with some coffee, no social life and no sleep, anything can be done...

Okies... so here's how the new one works...

When the app detects the IR point, it begins to track it... and then the fun begins

While it's tracking, a beat timer runs in the background... counting in seconds up to 8... and simultaneously, between each beat, the distance covered is measured with a specific pixel tolerance so that the recognition is not too sensitive.

Next... the current and previous coordinates of the IR position are constantly being updated... therefore, I can test whether the user is conducting up, down, left or right.

Using this, I did the following...

From the initial IR point, I ignored the X values and concentrated on the Y... as soon as the IR point was less than the previous coordinate, I knew that the direction had changed, so I triggered that as a beat.

Next, I had to determine which direction the user was now conducting in...
To check if it was left or right, I did the reverse: ignored the Y value and concentrated on the X value.
Including the tolerance, I checked the current X value against the previous X value and could then tell the direction.

If the user conducts straight back upwards, the recogniser goes back to ignoring the X value...

Next... essentially, I mimicked the X checking to determine when the user hit the 3rd and 4th beats.

Then, when the user releases the button and the IR is lost...the final beat is recorded and the whole thing resets.
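As a rough illustration of that first direction-change check (a TypeScript sketch with made-up names, not the app's actual code, and assuming screen coordinates where Y grows downward):

```typescript
// Detects the downbeat: the baton travels down, then the Y direction
// reverses past the tolerance band, which counts as a beat.
class DownbeatDetector {
  private prevY: number | null = null;
  private movingDown = false;

  // Feed each new IR y-coordinate; returns true when a down-to-up
  // direction change is detected, i.e. a downbeat.
  update(y: number, tolerance: number): boolean {
    let beat = false;
    if (this.prevY !== null) {
      if (y > this.prevY + tolerance) {
        this.movingDown = true;            // baton travelling down
      } else if (this.movingDown && y < this.prevY - tolerance) {
        beat = true;                       // bounced back up: trigger the beat
        this.movingDown = false;
      }
    }
    this.prevY = y;
    return beat;
  }
}
```

The tolerance stops a shaky hand from firing false beats, the same idea as the pixel tolerance mentioned above.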

here is a code snippet...

tracking...
public function track(e:TimerEvent):void {

    newX = p1x;
    newY = p1y;

    if (i == 0) {
        // every second tick, snapshot the older coordinates first,
        // so prevX2/prevY2 always lag one step behind prevX/prevY
        prevX2 = prevX;
        prevY2 = prevY;
        prevX = newX;
        prevY = newY;
        i = 1;
    }
    else if (i == 1) {
        prevX = newX;
        prevY = newY;
        i = 0;
    }

}

And here is the bit that checks, from beat 2, what the next beat is...


public function leftorright():void {

    beat2 = true;

    // moved left past the tolerance band: heading to beat 4
    if (p1x < prevX2 - tolerance) {
        directionfound = true;
        //trace("Heading to beat 4");
        goingleft = true;
        beat1to2.stop();
        beat2to4.start();
        goingup = false;
        goingdown = false;
        squarecount = 0;
        numofbeats += 1;
    }
    // moved right past the tolerance band: heading to beat 3
    else if (p1x > prevX2 + tolerance) {
        directionfound = true;
        //trace("Heading to beat 3");
        beat1to2.stop();
        beat2to3.start();
        goingup = false;
        goingdown = false;
        numofbeats += 1;
    }
    // still inside the band horizontally but moving sharply upwards:
    // the user is conducting straight back to beat 1
    else if (p1x > prevX2 - tolerance && p1x < prevX2 + tolerance
             && p1y < prevY2 - (3 * tolerance)) {
        directionfound = true;
        //trace("Heads back to beat 1");
        beat1to2.stop();
        backto1.start();
        numofbeats += 1;
    }

}


A little bit hard to digest...however...essentially...

this allows the user to freely conduct and control tempo plus time signatures...

so it's really good

:D

Wednesday, October 1, 2008

Gesture Recognition Rethink...

WOW!....what a couple of days...

Okies, so essentially I did switch the gesture recognition class previously talked about over to the IR input. Boy oh boy, was I in over my head a little bit. I forgot to take into account that not many people can hold their hand completely still, and even if the human eye could not detect the one-pixel movements in the IR... the wiimote sure could. So essentially... every time the IR moved a pixel off the user's desired course... it would register as a direction change and the gesture would be unrecognizable...

Oh well, I did learn a hell of a lot more about class structures and the ActionScript 3 language during the last couple of days... now for a different approach...

I want the user to be able to see where they can conduct, but at the same time I still want freedom... so after a bit of brainstorming I came up with this idea...



The final implementation changed a bit from the original design...but here is a screen shot of what the game looks like at the moment...

(Obviously needs polishing) but the set up is there and the gestures work :D...


Okay, here is how it works...

The tempo timer counts through 4 beats before resetting itself. Your standard 4-beat bar.

As you can see, there is an array of conducting points...
During this time, the user can conduct however they want, but when the tempo counter hits 4 beats it checks to see what points have been hit and in what order. It then checks against the possible beat patterns and changes the tempo accordingly...

If the user does not conduct, or makes an unrecognisable pattern (or if they cheat and just wave the IR manically :P), the tempo will randomly decrease in different parts of the orchestra... thus creating chaos, and the music will sound bad.
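The hit-order check could be sketched like this (TypeScript, with point IDs and patterns invented purely for illustration; the real app has its own set of beat patterns):

```typescript
// Known beat patterns, expressed as ordered lists of conducting-point IDs.
// These IDs and patterns are hypothetical, not the game's actual data.
const patterns: Record<string, number[]> = {
  "4/4": [0, 1, 2, 3],   // e.g. down, left, right, up
  "3/4": [0, 2, 3],      // e.g. down, right, up
};

// Called when the tempo counter completes its beats: match the points
// the user hit, in order, against the known patterns.
function matchPattern(hits: number[]): string | null {
  for (const [name, pattern] of Object.entries(patterns)) {
    if (hits.length === pattern.length &&
        pattern.every((p, i) => hits[i] === p)) {
      return name;
    }
  }
  return null; // unrecognised: the orchestra starts to fall apart
}
```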

So, to conclude, great progress these last couple of days. Although I do miss sleep, it was worth it.

Here is a screen-capture video of the app in action... a little laggy and with no audio... but the IR dot moving around is controlled by the IR conducting baton... and the tempo changes accordingly... as does the music...



My next step is to polish up the orchestra, separate the orchestra into different elements and finally set up my projection surface.

Cheers For Reading...

Saturday, September 27, 2008

More Gesture Recognition

Okay, so calling the gesture recognition function took five minutes...learning, recording and creating the gestures took a lot longer.

To not bamboozle you, once again have a look at the mouse direction diagram on this page here.

Essentially, as each direction is assigned a number, you can use this to see what beat pattern the user is trying to make...

For a simple 2/4 beat, the user has to move the stick down then up... so the gesture code is 26.

To call this in ActionScript 3, it was simply...

mg.addGesture("2beat","26");

Great... well, that's great if you want to limit the user to going only directly up and down... so I added some curves and flicks...

Essentially, the final 2/4 beat gestures looked like this...

mg.addGesture("2beat","26");
mg.addGesture("2beat","206");
mg.addGesture("2beat","2106");
mg.addGesture("2beat","21076");

That allowed the user to make a U-shape in the air, add flicks to the movement, and it would still register as a 2/4 beat gesture.

Now, 4 entries looks nice... but by the time I got to the 6/4 gestures I was going insane trying to think of extra flicks and movements someone might make...

mg.addGesture("6beat","246246246205");
mg.addGesture("6beat","2462462462405");
mg.addGesture("6beat","24624624624605");
mg.addGesture("6beat","24624624624675");
mg.addGesture("6beat","24535353475");
mg.addGesture("6beat","245353543475");
mg.addGesture("6beat","2546346346065");
mg.addGesture("6beat","2546346346205");
mg.addGesture("6beat","2546346346305");
mg.addGesture("6beat","253463463405");
mg.addGesture("6beat","253453462405");
mg.addGesture("6beat","253453453405");
mg.addGesture("6beat","263463463405");
mg.addGesture("6beat","26346346305");
mg.addGesture("6beat","253535305");
mg.addGesture("6beat","25353505");

It looks like a lot of work, and it did take a couple of hours of testing and recording to get to this point. But I wanted to give whoever uses this as much freedom as possible; as long as they stick to the basic beat patterns (which will be in the next post), they will be fine.

Next was to change the tempo of the music...
As I previously posted, I have the 6 tracks pre-recorded and ready to play.

Essentially, one sound channel plays the tracks, but only one at a time. If the user conducts a 5/4 beat pattern, the faster track will play; then if they change to, say, a 3/4 pattern, the channel stops playing the faster track and plays the 3/4-speed track from the same point the music was up to, only slower.
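Picking up from the same musical point in a track with a different tempo just means scaling the playhead position; a quick TypeScript sketch of the idea (names are mine, not from the app):

```typescript
interface Track { lengthMs: number; }

// Given the current playhead in the old track, return where to start
// the new track so the music continues from the same bar. The tracks
// have different lengths because they were rendered at different tempos.
function swapOffset(oldTrack: Track, positionMs: number, newTrack: Track): number {
  const fraction = positionMs / oldTrack.lengthMs; // musical progress, 0..1
  return fraction * newTrack.lengthMs;
}
```

In ActionScript this offset would then be handed to the new track's play call as its start time.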

After getting the above working, I found a little problem: the gesture recognition was far, far too sensitive. If someone had a tremor or an unsteady arm, they would not find it very easy to stick to the patterns. Luckily, within the mouse gesture class there is a pixel-level sensitivity setting, which I turned way, way down to reduce adverse effects... after a bit of testing it seems to be working well.

Obviously what I have made is not perfect, but I think it's working damn well considering gesture recognition is a tedious process.

My next step is to apply the above to the IR conducting stick.

Cheers

Monday, September 22, 2008

Changing Tempo on the fly: Gesture Recognition

Well, this is the bit I was dreading: gesture recognition... recognizing the beat patterns the user is making and changing the music accordingly... *shudders*.

Well, actually it turned out to take five minutes, thanks to this little doozy of a recognition class: Gesture Recognition

Thank you, Didier Brun aka Foxy, for making that open source; you just saved me a week's work :D

Okay, so to understand how it works have a read through that blog link above...but essentially... it tracks the starting point, direction and ending point of a gesture.

In my case, where the user is moving the conductor's baton...

I have not got it functioning the way I want yet, as I am still designing the conductor's baton, but essentially I am using three basic patterns, and when using the mouse it worked fine.

To switch it over to the IR position, simply put, I made the mouse cursor hidden and matched its coordinates to the IR position... it worked, so I plan to have a more stable version running in the next couple of days :D

Sunday, September 21, 2008

New Panorama

Because using a cube panorama was getting rather annoying, what with having to change each individual face's image each time the user concentrates on a different part, I decided to revert to using a spherical panorama.

The only problem I have with the spherical panorama is that it can warp a little bit depending on screen resolution. So a little more testing, but it is more stable than the cube.

To make the panorama, I just used 3D Studio Max 9's inbuilt panorama exporter...

First, I made the scene using some free Google SketchUp models I found in the 3D Warehouse.

Then I positioned a starting point for the camera and exported.

Essentially I got this...



As you can see, it looks quite distorted and weird, but when you chuck it on a sphere in Papervision you get this...




And then with the tracking, as you look at different spots the camera changes accordingly, you get feedback, etc...

So all in all, not a bad choice to use the sphere, I will just have to watch the warping.

New head tracking

Thanks to the old formula for head-tracking crashing and burning... here's my new one, which may or may not be included in the final prototype...

essentially, trigonometry...

I get the distance between the two points, work out the angle of rotation, and hey presto, you have head-tracking (subject to tweaking, of course).
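The trigonometry above boils down to two small functions; here's a TypeScript sketch (the names and the exact use of each value are my own, not the project code):

```typescript
interface Point { x: number; y: number; }

// Distance between the two IR LEDs as seen by the wiimote camera:
// the further away the head, the smaller this gets, so it can stand
// in for depth when scaled by a calibration constant.
function irDistance(a: Point, b: Point): number {
  return Math.hypot(b.x - a.x, b.y - a.y);
}

// Roll angle of the head, from the slope of the line joining the LEDs.
function headAngle(a: Point, b: Point): number {
  return Math.atan2(b.y - a.y, b.x - a.x); // radians
}
```

The midpoint of the two LEDs would then give the horizontal and vertical head position, with `irDistance` driving the zoom.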

Tempo tantrums :D

sorry for that terrible post title...

So, starting to work with the tempo of the audio for the game...

Originally I planned to do it in real time using the one audio track, but this was not too great, as ActionScript 3 can't control playback speed directly, and converting the audio into a byte array, altering it and converting it back was lagging up the entire app.

In comes audacity...

Using Audacity, I will just take the tracks and make individual versions of them, all with different tempos...


"It's actually really easy to change the tempo...also I recommend the latest version of Audacity :D"

Then, after importing these into the app, I can transition between them depending on where the user is pointing, what movements they are making, etc.

A little bit of extra work, what with importing the files, but all in all I believe it to be the better choice, as the alternative could take months :P.

will have some screen shots soon...

Friday, September 19, 2008

New C# server

Been away for a bit... turns out iiNet really sucks at activating internet accounts :D

but now that it's connected....

Due to IR point management issues, and some serious laaaagggg :D... I changed the initial app that sends over the IR info.

Now I'm using the default wiimote monitor app that comes with the wiimote library.

The server sends out the raw IR data rather than the manipulated string the other app spewed out; that got rid of the lag and made it easier to work with.

Here's what the interface is like... as you can see, it has everything related to the wiimote, including tracking of up to 4 IR points...



more updates coming soon...

Wednesday, August 27, 2008

Headtracking change...

So after heading into uni today... I talked to the tutors about the bugs I had with my headtracking.

The bugs were nothing too drastic, but after a few ifs and buts, and some brainstorming, I will be using headtracking with a panorama of a 3D orchestra rather than a real-time 3D environment.
This was due to three main points...

1. A panorama is easier to set up, and because a conductor does not walk through his orchestra, it would still seem real, as the camera would still change depending on where you look.

2. By using a panorama I get more time to work on the interaction and feedback aspect of the project. Which will be good for the critique in two weeks...

3. Using real time 3D constructed graphics from papervision was lagging up the application too much.

So thus decided to use a panorama...

How to go about it...

Well, originally I had a panoramic picture of an orchestra; however, it was relatively hard to set up in Papervision properly, as it was a cylinder-based panorama. Which would be alright, except Papervision is still in heavy development, and the cylinder still needs some work...

One thing that was stable was a sphere in Papervision... once again, I would need a spherical panorama rather than a cylindrical one... It was then I noticed that 3DS Max has an export-to-panorama feature...

However, after finding it too tedious to set up a spherical panorama within Flash, I decided to go with something more stable... a cube.

The cube was a smart choice: it was simple to set up, easy to apply the panorama to, and when I did a test run with the camera looking over it... IT LOOKED REALLY GOOD!...

I will be posting some pictures on Friday, as well as some 3D renders of my mock up orchestra...

Tuesday, August 26, 2008

Getting the IR info into my app

Okay, after connecting the wiimotes, I needed them to tell my Flash app where the IR points are...

I originally intended to use an open-source Flash library called WiiFlash (Google Code link).

At first glance it seemed awesome, which it is; however, I had far too many troubles getting it to read the infra-red input. The buttons, rumble, LEDs and sound functions worked fine... just not the IR, which is unfortunately essential. So after seeing many examples of working wiimote apps, I decided to go with what they were using...

They were all using the WiimoteLib library developed for C#.

Not knowing much about C# was a small downer... but there was far more support, plus pre-made GUIs for it, so it turned out to be the more stable option.

The problem was that I was making the application in Flash, not C#, and had to somehow get the info from C# over to Flash in real time so I could utilize it properly... along comes Dekker who, and I quote... "Wow, I can't believe how fun this is"... I'm glad he had fun coding it... because it pretty much broke my brain...

Essentially, the solution was a client/server setup in which C# sends over the IR info via a socket and the Flash app reads it...
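The post doesn't show the wire format, so purely for illustration, here is a TypeScript sketch of the Flash-side parsing, assuming an invented format of semicolon-separated points with comma-separated coordinates (the real server's format will differ):

```typescript
interface IRPoint { x: number; y: number; }

// Parse one message received over the socket into IR points.
// Assumed format (invented for this sketch): "x1,y1;x2,y2\n"
function parseIRMessage(msg: string): IRPoint[] {
  return msg
    .trim()
    .split(";")
    .filter(s => s.length > 0)
    .map(pair => {
      const [x, y] = pair.split(",").map(Number);
      return { x, y };
    });
}
```

In the actual app this would sit in the Flash socket's data handler, feeding the tracking code each frame.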

This is the C# app I ended up using; it is from this site, and although it reads all input, I just have it sending the IR info over to Flash...

A picture of it in action (the colored dots are the IR points from a Wii sensor bar, plus a calculated middle point)...


And this is a picture of the Flash app in action with headtracking; at the moment it is just a bunch of planes I threw in using Papervision3D...



It is still quite buggy, but then again, that is just a quick and dirty "see if there are any communication errors" app.
Essentially there are two main bugs on my Flash side: one being that it zooms in as you walk away... easily fixed by reversing the zoom maths in the code.

The other bug is that I have not synced it to my stage properly; also easily fixed by some simple maths relative to the stage size...

All in all, this week has been ridiculously productive. I will have the headtracking working properly by Friday the 29th of August, so watch this spot :D

Syncing the wiimotes

Okay, so my first main task is to implement headtracking.

For those who don't know, head-tracking uses two IR sources and a wiimote. The wiimote tells the computer the location of the IR points, and the view perspective on screen changes accordingly.

I will get into the code side later, but first, to set up the Bluetooth connection with my laptop...

Although I hear great things about BlueSoleil, due to complications with Vista, I went back to a standard Toshiba stack.

After installing the stack, it was really quite simple to connect the wiimote.

Essentially, I opened the battery case, pressed the red sync button on the controller and hit the search-for-devices button on my laptop. It was found and registered in less than 30 seconds...





All connection and syncing went almost too smoothly for Vista... but hey, it works.

Friday, August 8, 2008

Actual Sketches Coming Soon...

I will scan my original concept sketches on Monday, so watch this spot :P

Google Sketch-Up shots

Here are a few digital mock-ups made in Google SketchUp; not perfect, but hopefully you can see what my setup will be like...





The Idea...

Ok,

So I wanted to make something cool for Studio III that I can later develop further and perhaps commercialize... or, more importantly, get a job with. Anyways, I fell upon the idea of "Conductor Hero", a fully interactive and accurate conducting game where you can conduct your favorite compositions, from Beethoven to Rachmaninoff.

The first idea I wanted to include was head-tracking: using a wiimote and two IR LEDs to track the player's head. This way the player can look at different parts of their orchestra, and it also adds more realism to the game.

The second major idea was to include a curved screen for projection; this way it will complement the head-tracking component.

Finally, the gesture recognition for the conducting itself: using the same wiimote as for the head-tracking, I can track a modified conductor's baton's movements and record and analyze the user's gestures.

I do not expect to have a 100% complete, graphically rich game. But I do intend to have the curved screen with head-tracking working, as well as some basic gesture recognition; this will get my concept across and hopefully lead to some special topics or similar.

and it begins...