Timers in AS3 cannot tick fast enough - actionscript-3

I'm making a game in AS3 that requires a huge number of bullets to be fired sequentially in an extremely short amount of time. For example, at a certain point, I need to fire one bullet every 1-5 milliseconds, for about 1 second. The game runs (smoothly) at 60 FPS with around 800+ objects on screen, but the timers don't seem to be able to tick faster than my framerate (around once every 16 milliseconds). I only have one enterFrame handler going, which everything else updates from.
Any tips?

The 16 milliseconds sounds about right... According to the docs, the Timer has a resolution no smaller than 16.6 milliseconds:
delay:Number — The delay between timer events, in milliseconds. A delay lower than 20 milliseconds is not recommended. Timer frequency is limited to 60 frames per second, meaning a delay lower than 16.6 milliseconds causes runtime problems.
I would recommend that you create x objects (bullets) off-screen, at different offsets, on each tick to get the required number of objects within 1 second. This assumes that your context allows for enemies off-screen to shoot.
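For example, a single ENTER_FRAME handler could spawn every bullet that has become "due" since the last frame, each with its own sub-frame offset. A rough sketch (the 3 ms interval and the fireBullet() function are placeholders, not from the question):

    import flash.events.Event;
    import flash.utils.getTimer;

    var fireIntervalMs:Number = 3;          // desired time between bullets (placeholder)
    var lastSpawnTime:Number = getTimer();

    addEventListener(Event.ENTER_FRAME, spawnDueBullets);

    function spawnDueBullets(e:Event):void {
        var now:Number = getTimer();
        // Spawn every bullet whose scheduled time has already passed.
        while (lastSpawnTime + fireIntervalMs <= now) {
            lastSpawnTime += fireIntervalMs;
            var age:Number = now - lastSpawnTime;   // how long ago this bullet was "fired"
            fireBullet(age);   // hypothetical: position the bullet as if it left the gun 'age' ms ago
        }
    }

At 60 FPS this spawns roughly 5-6 bullets per frame for a 3 ms interval, which looks the same on screen as sub-frame timer ticks would.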

How can you possibly have 800+ objects on screen? Is each object a single pixel, or is the entire screen just filled? To be fair, I have a 1920x1080 screen in front of me, so each object could be 2 pixels wide and 2 pixels tall and, laid out in a row, 800 of them would still only span 1600 pixels, not even the full screen width. I'm just curious why you would have such a scenario, as I've been toying with game development a bit.
As for the technical question: a Timer is not guaranteed to be triggered at the moment the delay expires (just some time after); it depends on how quickly the player can get around to processing the code for the timer tick. My guess is that having so many objects is exhausting the CPU (on *NIX systems use top in the console, on Windows use Task Manager: is it pegging a core of the CPU?). You can also confirm or rule this out by turning off the creation/updating of your objects and checking whether the timer then ticks at the correct rate. If either test points that way, the CPU is maxing out.
Consider using Stage3D to offload the object drawing to the GPU and free up the CPU to run your Timer. You may also want to consider a "game framework" like Flixel to help manage your resources, though I don't know whether it takes advantage of the GPU... actually, I just Googled and found an interesting post discussing it:
http://forums.flixel.org/index.php?topic=6101.0

Related

Flex Mobile: changing default application frameRate

I'm developing a flex game which is really jerky and not smooth at all on mobile devices.
I changed the application frameRate to 60 in my mxml file and it seems to run smoother (but not as it should). Does this have any impact on performance?
Is there any other way to do this? I don't have long or complex operations; I mention this because I found some open-source libraries through which I can use async threads, but I read that this also has downsides.
I'm really confused because the only objects I have on stage are:
15 Image objects, each one with a Move object attached and an OnClick listener.
4 timers that repeat every 500 ms, 1 second, 2 seconds and 5 seconds.
The longest operation in the listeners and timers is O(n) where n = image count = 15, but most of them are O(1)
All the objects are created on view creationComplete event and I reuse them throughout the entire time.
Memory is managed correctly, I checked using memory profiler.
Can you point me in some directions?
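For reference, the frame rate mentioned above can also be changed at runtime from ActionScript instead of the mxml attribute (a minimal sketch; it assumes the code runs in a display object that is already on the display list):

    import flash.events.Event;

    // e.g. in a creationComplete or ADDED_TO_STAGE handler:
    if (stage) {
        stage.frameRate = 60;   // same effect as frameRate="60" in the application tag
    }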

Actionscript: Playing sound from microphone on speakers (through buffer) with constant delay

I'm looking for an example of code that samples the signal from the microphone and plays it on the speakers. I need a solution that has a reasonably constant delay on different platforms (PC, Android, iPhone). A delay of around 1-2 s is OK for me, and I don't mind if it varies every time the application starts.
I tried using the SampleDataEvent.SAMPLE_DATA event on the Sound and Microphone classes: one event would put data into a buffer, the other would read data from it.
But it seems impossible to maintain a constant delay; either the delay grows steadily, or it shrinks to the point where I have fewer than 2048 samples to write out and the Sound class stops generating SampleDataEvent.SAMPLE_DATA events.
I want to process each incoming frame, so using setLoopBack(true) is not an option.
PS: This is more of a problem on Android devices than on PC, although when I start to resize the application window on PC the delay starts to grow as well.
Please help.
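For reference, the buffering scheme described above (microphone SAMPLE_DATA filling a buffer, Sound SAMPLE_DATA draining it) might look roughly like the sketch below. The ~1 second pre-buffer and the silence padding on underrun are assumptions for illustration; this sketch does not by itself solve the drift problem discussed in the answer that follows.

    import flash.events.SampleDataEvent;
    import flash.media.Microphone;
    import flash.media.Sound;

    var mic:Microphone = Microphone.getMicrophone();
    mic.rate = 44;              // ask for 44 kHz so mic and Sound nominally match
    mic.setSilenceLevel(0);     // keep SAMPLE_DATA firing even when the input is quiet
    mic.addEventListener(SampleDataEvent.SAMPLE_DATA, onMicData);

    var out:Sound = new Sound();
    out.addEventListener(SampleDataEvent.SAMPLE_DATA, onPlayback);

    var buffer:Vector.<Number> = new Vector.<Number>();   // grows forever in this sketch; use a ring buffer in practice
    var readPos:int = 0;
    var started:Boolean = false;

    function onMicData(e:SampleDataEvent):void {
        while (e.data.bytesAvailable) {
            buffer.push(e.data.readFloat());               // mono samples from the mic
        }
        // Start playback only after roughly 1 second is buffered, so small
        // timing wobbles don't immediately cause an underrun.
        if (!started && buffer.length >= 44100) {
            started = true;
            out.play();
        }
    }

    function onPlayback(e:SampleDataEvent):void {
        // Write 4096 stereo sample pairs; fewer than 2048 would stop the Sound.
        for (var i:int = 0; i < 4096; i++) {
            var s:Number = (readPos < buffer.length) ? buffer[readPos++] : 0;   // pad with silence on underrun
            e.data.writeFloat(s);   // left channel
            e.data.writeFloat(s);   // right channel
        }
    }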
Unfortunately, this won't be possible... at least not directly.
Some sound devices will use a different clock between recording and playback. This will be especially true for cell phones where what is running the microphone may very well be different hardware than the headphone audio output.
Basically, if you record at 44.1kHz and play back at 44.1kHz, but those clocks are not in sync, you may be recording at 44.099kHz and playing back at 44.101kHz. Over time, this drift will mean that you won't have enough data in the buffer to send to the output.
Another complication (and more than likely your problem) is that your record and playback sample rates may be different. If you record from the microphone at 11kHz and playback at 48kHz, you will note that 11 doesn't evenly fit into 48. Software is often used to up-sample the recording. Sometimes this is done with a nice algorithm which is guaranteed to give you the necessary output. Other times, that 11kHz will get pushed to 44kHz and is deemed "close enough".
In short, you cannot rely on recording and playback devices being in sync, and will need to synchronize yourself. There are many algorithms out there for handling this. The easiest method is to add a sample here and there that averages the sample before and after it. If you do this with just a few samples, it will be inaudible. Depending on the kind of drift problem you are having, this may be sufficient.
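A minimal sketch of that averaging trick (the stretchSamples() helper and the fixed insertion interval are illustrations, not a standard API):

    // Stretch a block of mono samples by inserting one interpolated sample
    // every 'every' samples; with a large interval the change is inaudible.
    function stretchSamples(src:Vector.<Number>, every:int):Vector.<Number> {
        var out:Vector.<Number> = new Vector.<Number>();
        for (var i:int = 0; i < src.length; i++) {
            out.push(src[i]);
            if (i % every == every - 1 && i + 1 < src.length) {
                out.push((src[i] + src[i + 1]) * 0.5);   // average of the neighbouring samples
            }
        }
        return out;
    }

Dropping a sample every so often works the same way in the other direction when the buffer keeps growing.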

Does AS3 Event.ENTER_FRAME run on every frame, always? Even on slow computers?

I have a script that relies on ENTER_FRAME event to run every time. I have noticed on some slower computers there can be some lag when a flash movie is playing.
Does ENTER_FRAME run on every frame, even if it's on a slow computer?
If the flash movie lags, does the ENTER_FRAME event still run and the rendering just try to catch up?
Is running code on ENTER_FRAME a reliable way to execute code every time a frame is entered?
Yep. Every frame, no exceptions. If something is slowing a movie down (either heavy scripts or heavy graphics), Event.ENTER_FRAME handlers are still executed before each frame is rendered.
Hence, it's generally a good idea to use a Timer instance with TimerEvent.TIMER, even if its delay is set equal to the 'ideal' frame duration for your movie's fps, because an ENTER_FRAME handler is not bound to be triggered at an exactly uniform rate.
See the following link for more in-depth explanation: The Elastic Racetrack
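A minimal sketch of driving game logic from a Timer at the 'ideal' frame duration (updateGame() is a placeholder; stage.frameRate is assumed to hold the movie's intended fps):

    import flash.events.TimerEvent;
    import flash.utils.Timer;

    var logicTimer:Timer = new Timer(1000 / stage.frameRate);   // e.g. 33.3 ms at 30 fps
    logicTimer.addEventListener(TimerEvent.TIMER, onLogicTick);
    logicTimer.start();

    function onLogicTick(e:TimerEvent):void {
        updateGame();   // placeholder: game logic runs here, rendering still happens per frame
    }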
If you have the framerate set to 30 fps, then the event will fire 30 times per second, as long as you don't put a load on the processor that makes the frame rate drop. Therefore, if the framerate is fluctuating, you might get more consistent results with a Timer event.
On a side note, be aware that using too many event handlers can create performance issues of its own.
Every time an event fires, Flash has to create an event object at the very least. That means memory has to be allocated on every dispatch; that memory then needs to be garbage collected later, and the garbage collection will also use resources.
If you have many movie clips or sprites, it can be worthwhile to have one controller that manages all of them, rather than giving each one its own ENTER_FRAME handler.
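A minimal sketch of that single-controller idea (moving plain Sprites here just to keep it self-contained):

    import flash.display.Sprite;
    import flash.events.Event;

    var movers:Vector.<Sprite> = new Vector.<Sprite>();   // register objects here instead of adding listeners to each one

    addEventListener(Event.ENTER_FRAME, updateAll);

    function updateAll(e:Event):void {
        // One event object per frame for the whole group,
        // instead of one per managed object.
        for (var i:int = 0; i < movers.length; i++) {
            movers[i].x += 1;   // whatever per-object update you need
        }
    }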
The general answer to a general question: if you want to improve the performance of Flash Player, consider the following points.
Do not use strokes unless they are required (strokes are more CPU intensive).
Use fewer gradient colors if possible.
Use optimized bitmaps if any.
Make effective use of addChild(yourObject), addChildAt(yourObject, index), removeChild(yourObject), removeChildAt(index).
Listen to Event.ADDED_TO_STAGE and Event.REMOVED_FROM_STAGE respectively (see the sketch after this list).
Pair every addEventListener(somelistener, somefunction) with a matching removeEventListener(somelistener, somefunction).
Listen to Event.ACTIVATE and Event.DEACTIVATE.
If objects are loaded externally, make sure to use unloadAndStop() to completely remove unnecessary objects from the stage.
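A small sketch of the ADDED_TO_STAGE / REMOVED_FROM_STAGE pairing from the list above (the Enemy class name is just an example):

    package {
        import flash.display.Sprite;
        import flash.events.Event;

        public class Enemy extends Sprite {
            public function Enemy() {
                addEventListener(Event.ADDED_TO_STAGE, onAdded);
                addEventListener(Event.REMOVED_FROM_STAGE, onRemoved);
            }

            private function onAdded(e:Event):void {
                addEventListener(Event.ENTER_FRAME, onTick);      // only do work while on stage
            }

            private function onRemoved(e:Event):void {
                removeEventListener(Event.ENTER_FRAME, onTick);   // stop work so the object can be collected
            }

            private function onTick(e:Event):void {
                // per-frame work for this object
            }
        }
    }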
For anyone looking for a framerate-independent solution, check this out... this guy's really smart and has a technique both for animating consistently across multiple framerates (slower devices, desktops, etc.) and for keeping your object's framerate independent of your timeline's framerate. Check it out here. Tips 4 & 5. Hope that helps!
I found that the Timer class is actually very inconsistent when mashing buttons; sometimes the timer just fails to complete a cycle and the TimerEvent.TIMER_COMPLETE event never gets reached. If I had 5 cycles of 100 ms, it would just stop after 3 cycles... Also, ENTER_FRAME will fire every frame, but it is not consistent! If the CPU lags, your framerate will drop, and then nothing updates at regular intervals, only at whatever the current framerate is. Check out that link; there's even some framerate code you can use in your projects to check it.

On what is game time based? Real time or frames?

I'm designing a game for the first time, but I wonder on what game time is based. Is it based on the clock or does it rely on frames? (Note: I'm not sure if 'game time' is the right word here, correct me if it isn't)
To be more clear, imagine these scenarios:
Computer 1 is fast, up to 60fps
Computer 2 is slow, not more than 30fps
On both computers the same game is played, in which a character walks at the same speed.
If game time is based on frames, the character would move twice as fast on computer 1. On the other hand, if game time were based on actual time, computer 1 would show twice as many frames, but the character would move just as fast as on computer 2.
My question is, what is the best way to deal with game time and what are advantages and disadvantages?
In general, commercial games have two things running - a "simulation" loop and a "rendering" loop. These need to be decoupled as much as possible.
You want to fix your simulation time-step to some value (so that the simulation rate is greater than or equal to your maximum framerate). Complex physics doesn't like variable time steps. I'm surprised no-one has mentioned this, but fixed time steps versus variable time steps are a big deal if you have any kind of interesting physics. Here's a good link:
http://gafferongames.com/game-physics/fix-your-timestep/
Then your rendering loop can run as fast as possible, and render the output of the current simulation step.
So, referring to your example:
You would run your simulation at 60 fps, i.e. a 16.67 ms time step. Computer 1 would render at 60 fps, i.e. it would render every simulation frame. Computer 2 would render every second simulation frame. Thus the character would move the same distance in the same time, but not as smoothly.
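A sketch of that decoupling, in the spirit of the article linked above (updateSimulation() and render() are placeholders):

    import flash.events.Event;
    import flash.utils.getTimer;

    const STEP_MS:Number = 1000 / 60;   // fixed 16.67 ms simulation step
    var accumulator:Number = 0;
    var lastTime:int = getTimer();

    addEventListener(Event.ENTER_FRAME, onFrame);

    function onFrame(e:Event):void {
        var now:int = getTimer();
        accumulator += now - lastTime;
        lastTime = now;

        // Run as many fixed steps as real time demands; a 30 fps machine
        // simply runs two simulation steps per rendered frame.
        while (accumulator >= STEP_MS) {
            updateSimulation(STEP_MS);   // placeholder: advance physics/game state
            accumulator -= STEP_MS;
        }

        render();   // placeholder: draw the current state once per frame
    }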
Really old games used a frame count. It became fairly obvious quickly that this was a poor idea, since machines keep getting newer, and thus the games run faster.
Thus, base it on the system clock. Generally this is done by knowing how long the last frame took, and using that number to decide how much 'real time' to advance this frame.
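A minimal sketch of that approach (the 'player' object and its speed are placeholders):

    import flash.events.Event;
    import flash.utils.getTimer;

    var speedPerSecond:Number = 200;     // pixels per second, same on every machine
    var lastFrameTime:int = getTimer();

    addEventListener(Event.ENTER_FRAME, onFrame);

    function onFrame(e:Event):void {
        var now:int = getTimer();
        var deltaSeconds:Number = (now - lastFrameTime) / 1000;
        lastFrameTime = now;
        player.x += speedPerSecond * deltaSeconds;   // placeholder display object
    }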
It should rely on the system clock, not on the number of frames. You've made your own case for this.
The FPS is simply how many frames the computer can render per second.
The game time is YOUR game time. You define it. It is often called the "Game Loop". The frame rendering is a part of the game loop. Also look into FSMs (finite state machines) as they relate to game programming.
I highly suggest you read a couple of books on game programming. The question you are asking is what those books explain in their first chapters.
For the users of each machine to have the same experience, you'll want to use actual time; otherwise different users will have advantages or disadvantages depending on their hardware.
Games should all use the clock, not the frames, to provide the same gameplay whatever the platform. It is obvious when you look at MMO or online shooter games: no player should be faster than others.
It depends on what you're processing, what part of the game is in question.
For example, animations, physics and AI need to be framerate-independent to function properly. If you have an FPS-dependent animation or physics thread, then the physics system will slow down, or the character will move slower on slower systems and incredibly fast on very fast systems. Not good.
For some other elements, like scripting and rendering, you obviously need it to be per-frame and so, framerate-dependent. You would want to process each script and render each object once per frame, regardless of the time difference between frames.
The game must rely on the system clock, since you don't want your game to be played through in no time on decent computers!
Games typically use the highest resolution timer available like QueryPerformanceCounter on Windows to time things. Old games used to use frames, but after you could literally run faster in Quake by changing your FPS, we learned not to do that anymore.

What is the best way to make a game timer in Actionscript 3?

I have built an online game system that depends on a timer that records how long it took a player to complete a challenge. It needs to be accurate to the millisecond. Their time is stored in a SQL database.
The problem is that when I use the Timer class, some players are ending up getting scores in the database of less than a second. (which is impossible, as most challenges would take at least 11 seconds to complete even in a perfect situation.)
What I have found is that if a player has too many browser windows open, and/or a slow computer, the flash game slows down actually affecting the timer speed itself. The timer is 'spinning' on screen so you can physically see the numbers slowing down.
It is frustrating that I cannot just open a second thread or do something to allow flash to keep accurate time regardless of whatever else is going on in the program. Any ideas?
Another way of tracking time is to call getTimer() at the start of the game and store the result in a variable. Another getTimer() call at the end of the game will let you calculate the number of milliseconds the game lasted.
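A minimal sketch (submitScore() is a placeholder for whatever sends the result to your database):

    import flash.utils.getTimer;

    var startTime:int = getTimer();   // record this when the challenge starts

    function onChallengeComplete():void {
        // getTimer() keeps counting in real milliseconds even if the
        // on-screen timer display is lagging behind.
        var elapsedMs:int = getTimer() - startTime;
        submitScore(elapsedMs);   // placeholder: store elapsedMs in the SQL database
    }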
Timer is (in theory) independent of framerate, so it should hopefully execute "on time" even if the player slows the framerate (in heavy display update cases). Of course, Timer is still somewhat dependent on CPU load and will have some marginal inaccuracies. 10+ second inaccuracies? I doubt it. I think either
a) You are using an incorrect number for the timer counts (the timer runs on milliseconds, not seconds). If you ran it every 50 ms, you could calculate the total time based on the number of ticks it got through * 50 ms
b) Your users are using something like Tamper Data to pause the request and change their "score"
c) You've got another bug in the game that's causing the issue.
Impossible to tell without sample code. Got any?