
Meeting with the team, further studies, research structure

This week I met some members of the team. They would like a creepy soundtrack like the one in Dead Space (a mixture of scary ambient drones and 20th-century orchestral music). I need to spend some serious time listening to this kind of music, since I'm not very familiar with writing for orchestra, and start going through some libraries too, to produce proper work.

They also want sparser music for the menu, like the one in the menu of The Last of Us.

I've sorted out the structure of my research report (due the 28th of February); it will consist mainly of three parts.

1 - history of dynamic music in video games (intro)

An overview of the most important advances in the use of dynamic music in video games, looking at some studies about the impact dynamic music has had on players (did they feel the difference? is it really that important?), followed by examples from commercial video games.

2 - the evolution of audio middleware and how it changed the way audio people work

Some important points I have found so far:

- Miles Sound System -> gave PC audio a common language

- iMUSE -> a system to synchronize the music with the action on screen more tightly

- 1995: Microsoft DirectMusic engine -> lots of tools for creating adaptive music for games, but a difficult UI; it later evolved into XACT

- 2000: XAudio 2 -> no more tools for composers. Why?

- Sony SCREAM

- usually companies built their own tools

- 2006: modern middleware, FMOD and Wwise

- today: with engines getting simpler to use and more advanced on the audio side, will middleware still be needed?

3 - primary research

Interviews with people in the industry, asking how they approach teamwork and if/how audio middleware has changed the way they work.

Building the same music system both with and without FMOD to look at the differences. If I can do this in the level I will be working on with the people from the game design course, it will be useful to have some of the work on triggers already done (a rough sketch of the "without middleware" side is below).
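Just to give an idea of what the "without FMOD" half of that comparison might look like, here is a minimal, hypothetical C++ sketch (every class, state and parameter name is made up for illustration; this isn't code from the project or from FMOD): level triggers set a target music state, and the system crossfades between looping stems by nudging each stem's gain toward its target every frame.

```cpp
#include <algorithm>
#include <iostream>
#include <map>

// Hypothetical hand-rolled music system (no middleware): trigger volumes in
// the level set a target music state, and update() crossfades between looping
// stems by moving each stem's gain toward its target every frame.
// All names here are placeholders, not taken from FMOD or any engine.
enum class MusicState { Explore, Tension, Combat };

class SimpleMusicSystem {
public:
    // Called by a trigger in the level (e.g. the player enters a combat area).
    void setState(MusicState next) { target_ = next; }

    // Called once per frame with the frame time in seconds.
    void update(float dt) {
        for (auto& [state, gain] : gains_) {
            const float target = (state == target_) ? 1.0f : 0.0f;
            const float step = fadeSpeed_ * dt;
            gain = (gain < target) ? std::min(gain + step, target)
                                   : std::max(gain - step, target);
            // In a real game, each gain would be applied to that stem's voice.
        }
    }

    float gain(MusicState s) const { return gains_.at(s); }

private:
    MusicState target_ = MusicState::Explore;
    float fadeSpeed_ = 0.5f;  // full crossfade in about two seconds
    std::map<MusicState, float> gains_{{MusicState::Explore, 1.0f},
                                       {MusicState::Tension, 0.0f},
                                       {MusicState::Combat, 0.0f}};
};

int main() {
    SimpleMusicSystem music;
    music.setState(MusicState::Combat);        // a trigger fires
    for (int frame = 0; frame < 120; ++frame)  // simulate ~2 s at 60 fps
        music.update(1.0f / 60.0f);
    std::cout << "explore gain: " << music.gain(MusicState::Explore) << "\n"
              << "combat gain:  " << music.gain(MusicState::Combat) << "\n";
}
```

In the FMOD version most of this logic would presumably live in the authoring tool instead, with the game just setting a parameter, which is exactly the kind of workflow difference I want to document.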

