Re: game mixing
Oct 18, 2008 --- In firstname.lastname@example.org, Andy Farnell <padawan12@...> wrote:
> That truly interactive real-time worlds will always present
> you with unforeseen scenarios, so you will always need some
> measure of a rule-based approach, and try to define that
> as exhaustively as possible.

Yup, I agree. I just personally prefer having specific rules attached
to specific buses, rather than specific mixes attached to specific
Also cool was dynamically assigning mix groups at runtime...
i.e. AI weapon sounds were assigned to different groups depending on
whether or not they were currently targeting the player.
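A minimal sketch of that kind of runtime group assignment; the group
names and function are hypothetical, not from the post:

```python
def mix_group_for(targeting_player: bool) -> str:
    """Route an AI weapon's sounds to a more prominent 'focus' group
    while that AI is targeting the player, and to a quieter 'ambient'
    group otherwise. Group names are illustrative assumptions."""
    return "weapons_focus" if targeting_player else "weapons_ambient"
```

The point being that the routing decision is re-evaluated at runtime
per sound source, rather than baked into a static mix.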
Another cool thing I forgot to mention about our real-time editor <->
game connection: we were able to do vid cap and run-time param cap at
the same time.
3.a. actual case:
i. Recorded a couple minutes of gameplay, and recorded (via the PC
editor) what the adaptive music system was doing with the various
stems' volumes (as MIDI volume controllers).
ii. In Logic, synced the MIDI param cap with the video clip,
mapped the MIDI volume controllers to Logic buses, and assigned the
appropriate stems to the appropriate buses.
This let composers audition and/or compose their adaptive stems to
"gameplay" (albeit one iteration). They could move their music
content around relative to the action to check that various parts of
the score worked. Also, they could tweak the timing and shape of the
MIDI curves and give the coders precise feedback about what fades
they wanted.
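The param-cap side of step i could be sketched as follows. This is a
minimal illustration, assuming one MIDI channel per stem and volume
logged as CC7; the class and field names are my own, not from the
post:

```python
from dataclasses import dataclass, field

@dataclass
class CCEvent:
    time_s: float   # seconds since capture start
    channel: int    # assumed: one MIDI channel per music stem
    value: int      # MIDI CC7 (volume), 0-127

@dataclass
class ParamCapture:
    """Log each stem's volume as timestamped MIDI CC7 events, so the
    log can later be synced against a video capture in a DAW."""
    events: list = field(default_factory=list)

    def record(self, time_s: float, stem_channel: int, volume: float):
        # Map a 0.0-1.0 engine volume to a clamped 0-127 CC value.
        cc = max(0, min(127, round(volume * 127)))
        self.events.append(CCEvent(time_s, stem_channel, cc))
```

Exporting such a log as a Standard MIDI File is what would let a DAW
like Logic line it up against the video clip.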
3.b. hypothetical case:
I always wished our QA team had been dumping A/V while testing... then
they could have attached reference vid to audio bug reports. (Trying
to decipher text descriptions of audio issues was as ridiculous as it
sounds.)
I envisioned the next stage of this being dumping _all_ audio params
along with the vid, in a format that could be played back and
troubleshot in our PC audio content manager / mixer tool.