Believe it or not, designing a True-Music game is incredibly difficult and complex. I have the full B.E.S. team working on developing small musical mechanics that can be layered into a game engine.
One aspect of design that we've recently focused on is how playing music, particularly from memory, can be reverse engineered to expose and examine the associative elements of the learning process. In other words, by returning to what I sometimes refer to as the ultimate (videogame) "controller," the piano, and treating the action of playing music as a game mechanic, I can imagine how I learned to play piano like a game tutorial and apply the results accordingly.
So what have I uncovered? I found out that I've memorized piano music as a series of muscle movements that are coupled with one another. The right hand is generally dominant in music because it usually carries the melody of a song. Because of this, my left hand takes cues from my right hand in order to figure out what muscles to move to play the right notes and how to move my hand/arm to be prepared for what's next. Despite my piano teachers advising me to learn my pieces so well that I could play either hand's part alone, I could never do it.
In order to decouple my hands, I invented an exercise that starts with both hands playing together, then gradually works out one of the hands. By pretending to play with one of the hands, the playing hand can take just enough cues to keep playing. Eventually, I can work both hands independently because I've essentially created a new set of cues for my hands that don't depend on each other. With a little practice, I've reached a new level of control and freedom with my piano playing.
For GuitaRPG, I will design a system where the player associates playing certain colored keys with images/cues on the game screen. And by using the same system of decoupling, I can gradually work the player into comfortably playing music from cues they generate themselves.
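To make the idea concrete, here's a minimal sketch of how that decoupling loop might look in code. Everything here is hypothetical (the class, the `FADE_STEP` tuning value, the reset-on-miss rule are my assumptions, not the actual GuitaRPG implementation): each colored key starts with a fully visible on-screen cue, the cue fades a little each time the player hits that key correctly, and a miss brings the cue back, so the player is gradually worked off the screen and onto their own internal cues.

```python
FADE_STEP = 0.25  # how much one correct hit dims a cue (assumed tuning value)

class CueTracker:
    """Hypothetical tracker for the gradual cue-fading ('decoupling') idea."""

    def __init__(self, keys):
        # opacity 1.0 = cue fully shown, 0.0 = player plays from memory
        self.opacity = {key: 1.0 for key in keys}

    def on_note(self, key, correct):
        if correct:
            # gradually work the on-screen cue out, like phasing out one hand
            self.opacity[key] = max(0.0, self.opacity[key] - FADE_STEP)
        else:
            # a miss restores the cue, re-coupling the player to the screen
            self.opacity[key] = 1.0

    def decoupled(self):
        # True once every cue has faded and the player is independent
        return all(o == 0.0 for o in self.opacity.values())
```

With a fade step of 0.25, four correct hits on a key retire its cue entirely; any miss starts that key's fade over, mirroring how the piano exercise falls back to playing both hands together when one hand loses the thread.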
Keeping the B.E.S. engine cranking.