Sound Manager for RTS Games - actionscript-3

I am creating an RTS game in Flash (AS3) for the Epic Flash Game Design Contest: http://www.youtube.com/watch?v=bpFBraUbHyo&list=UUfkxvxrvpNxXvdKusYS0NfQ&index=1
I'm almost done, except that writing the class which manages all the sounds is proving quite a pain.
Basically, AS3 only provides 32 available SoundChannels before the buffer overflows. Unfortunately, my RTS handles several dozen units fighting at the same time, and each unit (especially rifle soldiers) fires multiple shots at a time.
If I let every sound effect play, the buffer would overflow, and even if it did not, the result would sound very noisy and messy.
So the question is: I have seen games like StarCraft where there are hundreds of units on screen, yet the sound stays clean and organised. How do those developers achieve this effect? Which sounds do they accept, and which do they filter out?
Currently I have three possible models:
1) First-in, first-out model: accept all sounds, but as soon as the buffer limit is reached, the earliest sound in the buffer is silenced.
2) Accept-or-reject model: accept all sounds until the buffer is full, then reject all further plays until sounds end and the buffer empties.
3) Loudest-only model: my game has a variety of sounds of different loudness; for example, explosions are louder than gunfire. In this model only the loudest 32 sounds play; if a new sound arrives that ranks among the top 32, the quietest of the current 32 is "kicked" and replaced.
Which model is best, or perhaps you can suggest your own model =p.

Maybe also consider using different sound files for "single" vs. "mass" events:
1 space ship - play "single spaceship sound"
2 space ships - play "single spaceship sound" twice
3 or more space ships - play "many space ships sound"
Grouping the sounds in the buffer by type might be a good idea anyway, as you could easily silence one "space ship sound" if there are too many of those, without silencing other elements.
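If it helps, here is a minimal sketch of that grouping idea in AS3. The class name, the per-type cap of 4, and the string type labels are my own assumptions for illustration, not anything from the question; it simply caps how many sounds of one type can play at once and drops the rest, leaving the SoundChannels for other types.

```actionscript
package {
    import flash.events.Event;
    import flash.media.Sound;
    import flash.media.SoundChannel;
    import flash.media.SoundTransform;
    import flash.utils.Dictionary;

    // Hypothetical helper: caps concurrent sounds per "type" (e.g. "gunfire", "explosion").
    public class SoundGroupManager {
        private var channelsByType:Dictionary = new Dictionary();
        private var maxPerType:int;

        public function SoundGroupManager(maxPerType:int = 4) {
            this.maxPerType = maxPerType;
        }

        // Plays 'sound' under the given type label unless that type is already at its cap.
        public function play(type:String, sound:Sound, volume:Number = 1.0):void {
            var channels:Array = channelsByType[type];
            if (channels == null) {
                channels = channelsByType[type] = [];
            }
            if (channels.length >= maxPerType) {
                return; // this type is already noisy enough; drop the new request
            }
            var channel:SoundChannel = sound.play(0, 0, new SoundTransform(volume));
            if (channel == null) {
                return; // no free channels at all (the global 32-channel limit)
            }
            channels.push(channel);
            channel.addEventListener(Event.SOUND_COMPLETE, function(e:Event):void {
                channels.splice(channels.indexOf(channel), 1);
            });
        }
    }
}
```

On top of a per-type cap like this you could still apply any of the three models from the question within each group.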

Related

Deconstructing slot machine reels?

I have a movie of slot machine gameplay. How can I extract only the frames where the reels are stopped? During spinning the game shows fake symbols, which are not part of the game mathematics. Until now I have been doing it manually (by screenshots), which takes too much time, and it would be nice to automate it.
I know how to do image processing of single images, and I get segments of symbols for each reel. Can you suggest an algorithm to connect the different segments and reconstruct the original strips? It is like puzzle solving, but without clear information about the number of pieces or how exactly they match.

Flash Embedded FLV Memory Leak

I am making a game where I have several small character MovieClips which appear on screen randomly. There can be several characters of the same type, and when they are removed from the stage I store them in a memory pool to reuse them.
These characters have several different keyframes which I call to make them do specific things, like fly, land, etc. To improve performance, FLVs were made for their different actions and embedded in the timeline.
I am having a problem where the amount of memory assigned to Video is constantly increasing as the game is played, even though I am not making more instances of the characters. I have been researching into garbage collecting video but all the stuff I find is for when using the FLVPlayback component and I haven't found anything helpful.
Does anyone have any ideas?
Thanks!
How much is your memory increasing? If it starts at, say, 80 MB and slowly increases to 140, and then either stays there or decreases to 120 and creeps slightly up again, then there's no need to worry. Unfortunately that's how the Flash GC works: even if you're not leaking any memory it will slowly show memory increase (and then a sudden drop as the GC collects the trash, and then slowly up again).
However, it could also be that you have a real memory leak, but to assess that you'd need to post some code. By the way, using memory pools is a great idea in games; good that you're doing it already.
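For reference, a pool of the kind praised above usually looks something like this rough AS3 sketch; the CharacterPool name and the Class-based factory are my own assumptions for illustration, not code from the question.

```actionscript
package {
    import flash.display.MovieClip;

    // Hypothetical reuse pool so character clips are not re-instantiated each time.
    public class CharacterPool {
        private var free:Vector.<MovieClip> = new Vector.<MovieClip>();
        private var clipClass:Class;

        // clipClass is a library symbol exported for ActionScript (a MovieClip subclass).
        public function CharacterPool(clipClass:Class) {
            this.clipClass = clipClass;
        }

        public function acquire():MovieClip {
            return free.length > 0 ? MovieClip(free.pop()) : new clipClass();
        }

        public function release(character:MovieClip):void {
            character.stop();                 // stop timeline (and embedded video) playback
            if (character.parent) {
                character.parent.removeChild(character);
            }
            free.push(character);             // keep for reuse instead of recreating it
        }
    }
}
```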

Actionscript: Playing sound from microphone on speakers (through buffer) with constant delay

I'm looking for an example of code that samples a signal from the microphone and plays it on the speakers. I need a solution with a reasonably constant delay on different platforms (PC, Android, iPhone). A delay of around 1-2 s is OK for me, and I don't mind if it varies every time the application starts.
I tried using the SampleDataEvent.SAMPLE_DATA event on the Sound and Microphone classes: one event would put data into a buffer, the other would read data out.
But it seems impossible to maintain a constant delay; either the delay grows constantly, or it shrinks to the point where I have fewer than 2048 samples to put out and the Sound class stops generating SampleDataEvent.SAMPLE_DATA events.
I want to process each incoming frame, so using setLoopBack(true) is not an option.
P.S. This is more of a problem on Android devices than on the PC, although when I start resizing the application window on the PC the delay starts to grow as well.
Please help.
Unfortunately, this won't be possible... at least not directly.
Some sound devices use different clocks for recording and playback. This is especially true for cell phones, where the hardware running the microphone may well be different from the hardware driving the headphone audio output.
Basically, if you record at 44.1kHz and play back at 44.1kHz, but those clocks are not in sync, you may be recording at 44.099kHz and play back at 44.101kHz. Over time, this drift will mean that you won't have enough data in the buffer to send to the output.
Another complication (and more than likely your problem) is that your record and playback sample rates may be different. If you record from the microphone at 11kHz and play back at 48kHz, you will note that 11 doesn't divide evenly into 48. Software is often used to up-sample the recording. Sometimes this is done with a nice algorithm which is guaranteed to give you the necessary output; other times, that 11kHz just gets pushed to 44kHz and is deemed "close enough".
In short, you cannot rely on recording and playback devices being in sync, and will need to synchronize yourself. There are many algorithms out there for handling this. The easiest method is to add a sample here and there that averages the sample before and after it. If you do this with just a few samples, it will be inaudible. Depending on the kind of drift problem you are having, this may be sufficient.
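To make the buffering pattern from the question concrete, here is a minimal frame-script-style sketch of the mic-to-Sound pipeline with a crude version of that padding/dropping idea. It assumes mic.rate = 44 so no resampling is needed; the buffer sizes and thresholds are arbitrary choices of mine, and real code would pre-fill the buffer to establish the latency it wants before starting playback.

```actionscript
import flash.events.SampleDataEvent;
import flash.media.Microphone;
import flash.media.Sound;

var mic:Microphone = Microphone.getMicrophone();
mic.rate = 44;                       // 44.1 kHz, same nominal rate as Sound output
mic.setSilenceLevel(0);              // keep capturing even during silence
var buffer:Vector.<Number> = new Vector.<Number>();

mic.addEventListener(SampleDataEvent.SAMPLE_DATA, onMicData);
function onMicData(event:SampleDataEvent):void {
    while (event.data.bytesAvailable) {
        buffer.push(event.data.readFloat());   // mono samples from the mic
    }
}

var output:Sound = new Sound();
output.addEventListener(SampleDataEvent.SAMPLE_DATA, onSoundData);
function onSoundData(event:SampleDataEvent):void {
    // The Sound needs 2048-8192 sample pairs per event; pad by repeating the last
    // sample when the mic clock falls behind, and drop backlog when it runs ahead.
    var last:Number = 0;
    for (var i:int = 0; i < 4096; i++) {
        var sample:Number = buffer.length > 0 ? buffer.shift() : last;
        last = sample;
        event.data.writeFloat(sample);  // left
        event.data.writeFloat(sample);  // right
    }
    // Crude drift control: if the backlog keeps growing, throw away the excess.
    if (buffer.length > 44100) {        // more than roughly 1 s queued up
        buffer.splice(0, buffer.length - 44100);
    }
}
output.play();
```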

Float and Math precision on different systems

I want to implement a gameplay recording feature in a project, which would only record player input and seed of the RNG at the beginning of the level. Then I could take such record and play it on my computer in order to test it for validity.
I'm only concerned with numerical differences which might appear between different Flash Player versions, operating systems, or CPUs (or whatever else might be affected). The project would be written for Flash Player 10.0.0+. The things I am concerned with:
Operations on Numbers: Multiplying, dividing; bit operations (possibly bit shifting too); addition and subtraction; modulo
Math class: sin, cos and atan2; rounding
localToGlobal/globalToLocal with rotations and scaling
I won't be using things like hitTest, getObjectsUnderPoint, hitTestPoint, getBounds and so on; all collisions will be geometrical.
So, is there any chance that using any of the things listed above will yield different results on different systems? If so, what can I do to avoid that?
That's an interesting question...
It's not a "will this game play the same on multiple platforms", it's "will a recording of user inputs produce the exact same output when simulated" question.
My gut says "don't worry about it, the Flash VM abstracts the differences away", but the more I think about it, the more areas look like potential problems.
First, I wouldn't record anything time-based. A user hitting a key at 1.21 seconds in might be tough to predict whether that happens before or after a frame's worth of computation, especially if either the recording or playback computer was under load. Trying to time tweens with user input is probably a recipe for failure.
Accuracy of floating point should be OK. The algorithms that define when to round are well documented in IEEE 754, and all VMs use 64-bit Numbers regardless of the OS they're running on. I'm guessing the math operations are equally well specified.
I think it's good to avoid hitTest and whatnot. I imagine they theoretically could be influenced by whether or not hardware acceleration is being used. But I'm not an expert there, so maybe not.
Now localToGlobal/globalToLocal... I just don't know. They might have that theoretical hardware acceleration problem, but I tend to doubt it.
So I guess I didn't give any real answers.
Trig functions WILL NOT WORK! You must create custom implementations of the following: acos, asin, atan, atan2, cos, exp, log, pow, sin, and sqrt. And obviously, random().
I'm still in the process of testing the Number class. I can't say for sure whether addition/subtraction/etc. will be consistent on every machine.
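As an illustration of what replacing random() can look like, here is a tiny seeded generator; the class name and the LCG constants (the common Numerical Recipes ones) are my own choice, not something from this thread. Because it only uses integer arithmetic that fits exactly in a Number, it replays identically on every machine.

```actionscript
package {
    // Hypothetical deterministic replacement for Math.random(): a 32-bit linear
    // congruential generator. Same seed + same call sequence = same values everywhere.
    public class SeededRandom {
        private var seed:uint;

        public function SeededRandom(seed:uint) {
            this.seed = seed;
        }

        // Returns a Number in [0, 1).
        public function next():Number {
            seed = seed * 1664525 + 1013904223;   // uint assignment wraps modulo 2^32
            return seed / 4294967296;             // divide by 2^32
        }
    }
}
```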
It is very unlikely (although possible) that things will behave in a noticeably different way on different computers. Even if they did, it would be a very rare event and not something I would recommend worrying about unless it is absolutely crucial to gameplay.

On what is game time based? Real time or frames?

I'm designing a game for the first time, but I wonder what game time is based on. Is it based on the clock, or does it rely on frames? (Note: I'm not sure if 'game time' is the right term here; correct me if it isn't.)
To be more clear, imagine these scenarios:
Computer 1 is fast, up to 60fps
Computer 2 is slow, not more than 30fps
On both computers the same game is played, in which a character walks at the same speed.
If game time is based on frames, the character would move twice as fast on computer 1. On the other hand, if game time were based on actual time, computer 1 would show twice as many frames, but the character would move just as fast as on computer 2.
My question is, what is the best way to deal with game time and what are advantages and disadvantages?
In general, commercial games have two things running - a "simulation" loop and a "rendering" loop. These need to be decoupled as much as possible.
You want to fix your simulation time step to some value (so that the simulation rate is greater than or equal to your maximum frame rate). Complex physics doesn't like variable time steps. I'm surprised no one has mentioned this, but fixed time steps versus variable time steps are a big deal if you have any kind of interesting physics. Here's a good link:
http://gafferongames.com/game-physics/fix-your-timestep/
Then your rendering loop can run as fast as possible, and render the output of the current simulation step.
So, referring to your example:
You would run your simulation at 60fps, that is, a 16.67ms time step. Computer 1 would render at 60fps, i.e. it would render every simulation frame. Computer 2 would render every second simulation frame. Thus the character would move the same distance in the same time on both, just not as smoothly.
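As a rough illustration (not code from the linked article), a fixed-timestep loop in AS3 frame-script style could look like the sketch below; update() and render() are hypothetical game functions that I'm assuming exist.

```actionscript
import flash.events.Event;
import flash.utils.getTimer;

const STEP_MS:Number = 1000 / 60;    // fixed 16.67 ms simulation step
var accumulator:Number = 0;
var lastTime:int = getTimer();

stage.addEventListener(Event.ENTER_FRAME, onFrame);
function onFrame(e:Event):void {
    var now:int = getTimer();
    accumulator += now - lastTime;
    lastTime = now;

    // Advance the simulation in fixed steps until it has caught up with real time...
    while (accumulator >= STEP_MS) {
        update(STEP_MS);             // the simulation always advances by the same amount
        accumulator -= STEP_MS;
    }
    // ...then render once per displayed frame, however fast the display runs.
    render();
}
```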
Really old games used a frame count. It quickly became obvious that this was a poor idea, since machines keep getting faster, and thus the games ran faster too.
Thus, base it on the system clock. Generally this is done by knowing how long the last frame took, and using that number to know how much 'real time' to advance during this frame.
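In AS3 terms, that delta-time idea boils down to something like this sketch ('player' and 'speed' are placeholders of mine, not from the answer):

```actionscript
import flash.events.Event;
import flash.utils.getTimer;

var speed:Number = 100;              // pixels per second, independent of frame rate
var lastTime:int = getTimer();

stage.addEventListener(Event.ENTER_FRAME, function(e:Event):void {
    var now:int = getTimer();
    var dt:Number = (now - lastTime) / 1000;   // seconds the last frame took
    lastTime = now;
    player.x += speed * dt;          // same distance per real second on any machine
});
```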
It should rely on the system clock, not on the number of frames. You've made your own case for this.
The FPS is simply how many frames the computer can render per second.
Game time is YOUR game time; you define it. It is usually driven by what is called the "game loop", and frame rendering is one part of that loop. Also look into FSMs (finite state machines) as they relate to game programming.
I highly suggest you read a couple of books on game programming. The question you are asking is what those books explain in their first chapters.
For users on each machine to have the same experience you'll want to use actual time; otherwise different users will have advantages or disadvantages depending on their hardware.
Games should all use the clock, not frames, to provide the same gameplay on every platform. This is obvious when you look at MMOs or online shooters: no player should be faster than the others.
It depends on what you're processing, what part of the game is in question.
For example, animations, physics and AI need to be framerate-independent to function properly. If you have an FPS-dependent animation or physics step, then the physics system will slow down or the character will move more slowly on slower systems, and will go incredibly fast on very fast systems. Not good.
For some other elements, like scripting and rendering, you obviously need it to be per-frame and so, framerate-dependent. You would want to process each script and render each object once per frame, regardless of the time difference between frames.
The game must rely on the system clock; you don't want your game to be played through in no time on fast computers!
Games typically use the highest-resolution timer available, like QueryPerformanceCounter on Windows, to time things. Old games used frames, but after it turned out you could literally run faster in Quake by changing your FPS, we learned not to do that any more.