The more I think about it, the more it seems like music and sound effects require drastically different approaches to get working in code. Sound effects often require several simultaneous instances of the same sound, tend to pan left and right frequently as the character moves in relation to the sound source, and rarely need to sync up or fade in or out. Music should only be playing one or two overlapping tracks at a time, and will probably never need to pan or be processed in any way beyond managing the cross-fading of tracks.
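To make the panning idea concrete, here is a minimal sketch (in Python, with hypothetical names — the post doesn't show any code) of how a sound's pan value might be derived from the horizontal offset between the character and the sound source:

```python
def compute_pan(listener_x: float, source_x: float,
                max_distance: float = 500.0) -> float:
    """Map the horizontal offset between the listener and a sound source
    to a pan value in [-1.0, 1.0]: -1.0 is full left, 1.0 is full right.
    `max_distance` is an assumed tuning constant, not from the post."""
    offset = source_x - listener_x
    # Clamp to the audible range, then normalize to [-1, 1].
    return max(-1.0, min(1.0, offset / max_distance))
```

A sound 300 units to the character's right would then play mostly out of the right speaker (`compute_pan(100.0, 400.0)` gives 0.6), and the value can be recomputed each frame as the character moves.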
The SoundManager class I’ve been working on is rather inadequate.
I’m thinking that while there may yet be some role for a centralized sound effect manager like that, the majority of the heavy lifting for sound effects could be handled by a SoundEmitter class, which has a position in the game world and pans and attenuates its sound effects according to that position. The music side is being split off into a MusicPlayer class, which handles all the management of music files, transitions, and syncing of the different tracks.
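The MusicPlayer's main job, per the split described above, is managing transitions between tracks. As a rough sketch of what that cross-fade bookkeeping might look like (in Python, with all names hypothetical — this is one way to structure it, not the post's actual implementation):

```python
class MusicPlayer:
    """Minimal sketch: tracks one outgoing and one incoming song,
    linearly cross-fading over a fixed duration."""

    def __init__(self, fade_seconds: float = 2.0):
        self.fade_seconds = fade_seconds
        self.current = None    # [track_name, volume] of the playing track
        self.incoming = None   # [track_name, volume] of the track fading in
        self.fade_elapsed = 0.0

    def play(self, track_name: str) -> None:
        if self.current is None:
            # Nothing playing yet: start at full volume, no fade needed.
            self.current = [track_name, 1.0]
        else:
            # Begin cross-fading from the current track to the new one.
            self.incoming = [track_name, 0.0]
            self.fade_elapsed = 0.0

    def update(self, dt: float) -> None:
        """Advance the cross-fade by dt seconds; call once per frame."""
        if self.incoming is None:
            return
        self.fade_elapsed += dt
        progress = min(1.0, self.fade_elapsed / self.fade_seconds)
        self.current[1] = 1.0 - progress
        self.incoming[1] = progress
        if progress >= 1.0:
            # Fade complete: the incoming track becomes the current one.
            self.current, self.incoming = self.incoming, None
```

Keeping only `current` and `incoming` slots enforces the "one or two overlapping tracks at a time" constraint by construction, rather than policing it at call sites.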
I’ve started in on programming the MusicPlayer, which isn’t too hard since in a lot of ways it’s like the SoundManager I was working on yesterday. It’s in a rough but functional state now. I’ll probably start on the SoundEmitter class sometime tomorrow.