I am loading all my assets using the AssetManager in my splash screen, and I dispose of the AssetManager properly. I am also using a tiled map and various actors as well as a stage. When I first create the stage, initialize all the actors, and call setScreen() on that screen, the game's frame rate is always something like 45, 47, 50, 52, 47, 55, and then it recovers to 60 and never drops below 59 again. So on startup the game always lags slightly for about 5-10 seconds, and then it recovers and maintains 60 FPS. Has anyone experienced something like this? Is it normal?
When should a sound effect be disposed of?
I'm a bit confused about sound effects vs. music. With music, I just dispose of it when a level is complete, since the music repeats until the end of the level and is then no longer used. With sound effects, they play for a short time, so I'm not sure what happens when one is done and a new instance of the same sound plays.
For example, the player character can shoot a gun and each time the shoot sound plays. So if the player shoots 6 times, is that sound effect handled like 6 separate sounds which need to be disposed of or is a sound file only to be disposed of once when no longer needed regardless of how many times it is used?
You don't need to dispose of it each time it's played. Just dispose of it when you dispose of your other game assets (i.e. call it from the Game's dispose() method when the game is being disposed).
When the player shoots 6 times, the sound will be played 6 times concurrently.
From the javadoc:
Plays the sound. If the sound is already playing, it will be played again, concurrently.
But there aren't 6 different sounds you should be disposing or anything like that.
So to answer your question: is a sound file only to be disposed of once when no longer needed regardless of how many times it is used?
Yes.
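To make that concrete, here is a minimal libGDX sketch of the pattern; the class name and asset path (MyGame, "data/shoot.mp3") are just examples, not taken from the question:

import com.badlogic.gdx.Game;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.audio.Sound;

public class MyGame extends Game {
    private Sound shootSound;

    @Override
    public void create() {
        shootSound = Gdx.audio.newSound(Gdx.files.internal("data/shoot.mp3"));
    }

    // Call this as often as you like; each call plays another concurrent instance.
    public void fireGun() {
        shootSound.play();
    }

    @Override
    public void dispose() {
        super.dispose();        // let Game hide the current screen
        shootSound.dispose();   // dispose of the Sound exactly once, when the game shuts down
    }
}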
For small projects/games:
Any Sound that you use in your play screen, create once in the Game's create() method and reuse throughout your game without disposing of it:
Sound sound = Gdx.audio.newSound(Gdx.files.internal("data/mysound.mp3"));
Dispose of the sound in the dispose() method of ApplicationListener, which is called when you exit your game.
Alternatively, create your Sound in the show() method of the Screen interface and dispose of it in the Screen's dispose(). Note that a Screen's dispose() is not called by the framework, so you need to call it yourself, for example from the Screen's hide() method.
Or use an AssetManager, loading and unloading resources as your requirements change, and dispose of the AssetManager in the ApplicationListener's dispose().
For big projects/games with a lot of resources:
You must use AssetManager, because you have a lot of resources to manage.
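For the AssetManager route, a minimal sketch might look like this (ManagedGame and the asset path are example names, and finishLoading() is used here for brevity; a real loading screen would poll update() instead):

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.assets.AssetManager;
import com.badlogic.gdx.audio.Sound;

public class ManagedGame extends ApplicationAdapter {
    private AssetManager manager;

    @Override
    public void create() {
        manager = new AssetManager();
        manager.load("data/mysound.mp3", Sound.class);
        manager.finishLoading();   // blocks until everything queued is loaded
    }

    @Override
    public void render() {
        // Fetch from the manager whenever needed; the manager owns the resource.
        if (Gdx.input.justTouched()) {
            manager.get("data/mysound.mp3", Sound.class).play();
        }
    }

    @Override
    public void dispose() {
        // Disposing the manager disposes everything it loaded.
        manager.dispose();
    }
}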
I have an interactive animation that consists of a button and some lights that are lit in a sequence. The animation also has a timer that shows the passage of time to illustrate the interval between light transitions, so the user knows how much time has passed between each transition in the sequence (it took x amount of time to go from light 'a' to light 'b'). The button controls the speed.
I implemented the timer using the Timer class in AS3:
var MyTimer:Timer = new Timer(1);
Which means the timer should fire every millisecond.
Now, I am aware that this is nowhere near accurate, due to the frame rate (24 fps) and the fact that it has to run the code inside the handler function. But I am not going for an accurate timer; in fact I need it to go slower so the user can see the time between transitions, a scaled/slowed timer if you will. The current time is displayed in a dynamic text field.
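For context, the setup roughly looks like this (a sketch, not the actual project code; timeField is an assumed dynamic TextField on the stage):

import flash.events.TimerEvent;
import flash.utils.Timer;

var myTimer:Timer = new Timer(1);   // nominally one tick per millisecond
var ticks:int = 0;

myTimer.addEventListener(TimerEvent.TIMER, onTick);
myTimer.start();

function onTick(e:TimerEvent):void
{
    ticks++;                                    // in practice ticks arrive far slower than 1 ms
    timeField.text = (ticks / 1000).toFixed(2); // displayed as elapsed "seconds"
}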
As is, when using the debug player in Adobe Animate, the speed of the timer runs at about 1/10th normal speed, taking about 10 seconds to show that a second has passed, which incidentally is great for my purposes. However, this animation will be used in a PDF, in which case, a release player of Flash will be used; and in the release version, the timer appears to be about half as fast, about 1/20th the speed, taking about 20 or more seconds to show that a second has passed.
I'm not sure why this is. How do I get the swf file to be played so that the timer behaves the same in both the debug and release versions of the Flash Player?
I have tried unticking 'Permit debugging' and ticking 'Omit trace statements' (and every combination of un/tick) in the Publish Settings as well as all the Hardware acceleration options.
I've also implemented a FPS counter as suggested here by user 'kglad':
published swf file is a lot slower than the 'test' mode?
But both debug and release versions show the FPS at 23-24. 'kglad' gives the last reply in that thread as "the debug players are slower than the non-debug players". I'm not sure what he meant by this, as it seems to be the opposite of the problem that the OP of that thread and I are having, unless s/he meant using the players in a browser.
TL;DR
How do I get a timer to behave the same in both the debug and release version of the Flash player?
Thanks in advance for any suggestions.
I'm trying to write a simple game in AS3 / Flex 4, and I'm trying to find the best way from the get-go to handle timed execution of code.
The most natural approach would be to use a bunch of Timer objects throughout the program. However that's supposedly pretty expensive on CPU cycles, compared to ENTER_FRAME.
However I'm not sure how natural it would be to base all of a program's timed execution off of ENTER_FRAME, and I have read on stackoverflow that it's got issues - that Timer doesn't - with regards to dragging down animation and framerate when the complexity of the program increases.
Another option that comes to mind is to simply use one Timer object in the program, whose event handler will go through and check on everything in one go around - kind of like blending the usual Timer and ENTER_FRAME approaches.
My question is this: what's really the best approach for a video game in AS3/Flex 4, and why? Thanks!
It depends entirely on when you want to update values, check collisions, manage input and so on vs when you want to actually see something happen on the screen.
ENTER_FRAME will make your update logic and the rendering of your game synchronous.
Every time an ENTER_FRAME event is dispatched, the scene is redrawn. What this means is that all your game update logic is always immediately followed by the screen being rendered. If the frame rate in your game drops due to complex graphics on the screen taking a long time to render, so will the rate at which your game updates. ENTER_FRAME dispatches are erratic, meaning they're not good for update tasks that you need to perform with even intervals between them.
Timers will cause your update logic and the rendering of your game to become asynchronous.
Timers can trigger far more or far less frequently than an ENTER_FRAME handler would. This means that your update loop could run multiple times before the scene is redrawn, or that the scene could be redrawn multiple times with no changes. Timers are less erratic than ENTER_FRAME handlers, making them better at doing something at set intervals. With that said though, there will still be a little bit of offset between updates:
Depending on the SWF file's framerate or the runtime environment (available memory and other factors), the runtime may dispatch events at slightly offset intervals. For example, if a SWF file is set to play at 10 frames per second (fps), which is 100 millisecond intervals, but your timer is set to fire an event at 80 milliseconds, the event will be dispatched close to the 100 millisecond interval. Memory-intensive scripts may also offset the events. (help.adobe.com | flash.utils.Timer)
Personally I have always used ENTER_FRAME rather than Timers. To me it is logical that if I make changes to the objects within a game, those changes should be immediately represented on-screen.
Timers are good if you want to be able to update components within your game faster than the frame-rate is able to manage. Timers are also good if you expect a given amount of updates to be completed within a certain timeframe, because they're not bound by the rate at which the screen is able to be redrawn like ENTER_FRAME is.
As for actual implementation, you preferably want to pick one and implement one handler. This means that you should only have a single function in the entire game that will be triggered by either a Timer or ENTER_FRAME. You do not want to be creating individual handlers in every single class that should be updated. Instead, you want to have your top level class (or a close relative of that class) define the handler. You also want to create a list within that class which will represent everything in the game that needs to be updated.
From there, you will create a small collection of methods that will deal with listing and de-listing updatable instances from that class. Each updatable instance will either implement an interface or extend a class that defines an update() method. It could look like this:
public interface IUpdatable
{
    function update():void;
}
From there, the update handler within the updater class will simply iterate over all of the updatables in the list and call their update() method, like this:
for each(var i:IUpdatable in updateList)
{
    i.update();
}
A final note is that this approach means that if you do decide to switch from using an ENTER_FRAME handler to a Timer or vice versa, it's a simple switch in the updater class and doesn't require you to change any other part of the game code. If you go around creating handlers in each class that needs to be updated, your change of heart will mean making changes to every single class.
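As a concrete sketch of that updater class (the names UpdateManager, register, unregister and updateList are mine, not prescribed above, and an ENTER_FRAME handler is used, though a Timer would slot in the same way):

package
{
    import flash.display.Sprite;
    import flash.events.Event;

    public class UpdateManager extends Sprite
    {
        private var updateList:Vector.<IUpdatable> = new Vector.<IUpdatable>();

        public function UpdateManager()
        {
            // One handler for the whole game; swap this for a Timer if preferred.
            addEventListener(Event.ENTER_FRAME, onEnterFrame);
        }

        public function register(item:IUpdatable):void
        {
            if (updateList.indexOf(item) == -1)
                updateList.push(item);
        }

        public function unregister(item:IUpdatable):void
        {
            var index:int = updateList.indexOf(item);
            if (index != -1)
                updateList.splice(index, 1);
        }

        private function onEnterFrame(e:Event):void
        {
            // Iterate over everything that needs updating and update it.
            for each (var i:IUpdatable in updateList)
            {
                i.update();
            }
        }
    }
}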
I'm making a game in AS3 that requires a huge amount of bullets to be fired sequentially in an extremely short amount of time. For example, at a certain point, I need to fire one bullet, every 1-5 millisecond, for about 1 second. The game runs (smoothly) at 60 FPS with around 800+ objects on screen, but the timers don't seem to be able to tick faster than my framerate (around once every 16 milliseconds). I only have one enterFrame going, which everything else updates from.
Any tips?
The 16 milliseconds sounds about right... According to the docs, the Timer has a resolution no smaller than 16.6 milliseconds.
delay:Number — The delay between timer events, in milliseconds. A delay lower than 20 milliseconds is not recommended. Timer frequency is limited to 60 frames per second, meaning a delay lower than 16.6 milliseconds causes runtime problems.
I would recommend that you create x objects (bullets) off-screen, at different offsets, on each tick to get the required amount of objects you want in 1 second. This assumes that your context allows for enemies off-screen to shoot.
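One way to sketch that idea (not the asker's code; spawnBullet(), gunX, gunY and bulletSpeedPerMs are assumed names) is to fire all the bullets that are "due" on each ENTER_FRAME and give each one a positional head start, as if it had been fired partway through the previous frame:

import flash.events.Event;
import flash.utils.getTimer;

// spawnBullet(), gunX, gunY and bulletSpeedPerMs are assumed helpers/values
var fireIntervalMs:Number = 2;       // desired rate: one bullet every 2 ms
var lastFireTime:int = getTimer();

addEventListener(Event.ENTER_FRAME, onFireTick);

function onFireTick(e:Event):void
{
    var now:int = getTimer();
    var bulletsDue:int = int((now - lastFireTime) / fireIntervalMs);
    lastFireTime += bulletsDue * fireIntervalMs;   // keep the remainder for next frame

    for (var n:int = 0; n < bulletsDue; n++)
    {
        // Offset each bullet as though it had been fired n * fireIntervalMs earlier.
        var headStart:Number = n * fireIntervalMs * bulletSpeedPerMs;
        spawnBullet(gunX + headStart, gunY);
    }
}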
How can you possibly have 800+ objects on screen? Is each object a single pixel, or is the entire screen just filled? To be fair, I have a 1920x1080 screen in front of me, so each object could be 2 pixels wide and 2 pixels tall and 800 of them side by side would still only span 1600 pixels, not even the full screen width. I'm just curious why you would have such a scenario, as I've been toying with game development a bit.
As for the technical question: a Timer is not guaranteed to be triggered the moment its delay has expired (just some time after); it depends on how quickly the runtime is able to get around to processing the code for the timer tick. My guess is that having so many objects is exhausting the CPU (on *NIX systems, use top in the console; on Windows, use Task Manager: is it pegging a core of the CPU?). That can confirm or deny it, or you can turn off the creation/updating of your objects and see whether the timer then ticks at the correct rate. If either test points that way, the CPU is maxing out.
Consider using Stage3D to offload the object drawing to the GPU to free up the CPU to run your Timer. You may also want to consider a "game framework" like flixel to help manage your resources though I don't know that it takes advantage of the GPU... actually just Googled and found an interesting post discussing it:
http://forums.flixel.org/index.php?topic=6101.0
I have a script that relies on ENTER_FRAME event to run every time. I have noticed on some slower computers there can be some lag when a flash movie is playing.
Does ENTER_FRAME run on every frame, even if its on a slow computer?
If the flash movie lags, does the ENTER_FRAME event still run and the rendering just try to catch up?
Is running code on ENTER_FRAME a reliable way to execute code every time a frame is entered?
Yep. Every frame, no exceptions. If something is slowing a movie down (either heavy scripts or heavy graphics), Event.ENTER_FRAME handlers are still executed before each frame is rendered.
Hence, it's generally a good idea to use a Timer instance with TimerEvent.TIMER, even if its delay is set equal to the 'ideal' frame duration for your movie's fps, because a frame-based handler is not guaranteed to be triggered at an exactly uniform rate.
See the following link for more in-depth explanation: The Elastic Racetrack
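As a small illustration of that suggestion (a 30 fps movie is assumed, and updateGame() is a placeholder for your own update logic):

import flash.events.TimerEvent;
import flash.utils.Timer;

var frameDuration:Number = 1000 / 30;              // ~33.3 ms, the 'ideal' frame time at 30 fps
var updateTimer:Timer = new Timer(frameDuration);  // delay matches the frame duration

updateTimer.addEventListener(TimerEvent.TIMER, onUpdateTick);
updateTimer.start();

function onUpdateTick(e:TimerEvent):void
{
    updateGame();   // update logic runs on the timer, decoupled from rendering
}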
If you have the frame rate set to 30 fps, then the event will fire 30 times per second, as long as you don't put enough load on the processor to make the frame rate drop. Therefore, if the frame rate is fluctuating, you might get more consistent results with a Timer event.
On a side note, be aware that using too many event handlers can create performance issues of its own.
Every time a handler is called, Flash has to create an event object at the very least. That means memory has to be allocated every time the event fires, and that memory then needs to be garbage collected later, which also uses resources.
If you have many movie clips or sprites, it could be worthwhile to have one controller that manages all of them, rather than each one having its own ENTER_FRAME handler.
A general answer to a general question.
If you want to improve the performance of the Flash Player, consider the following points:
Do not use strokes unless they are required (strokes are more CPU intensive).
Use fewer gradient colors where possible.
Use optimized bitmaps, if any.
Make effective use of addChild(yourObject), addChildAt(yourObject, index), removeChild(yourObject) and removeChildAt(index).
Listen for Event.ADDED_TO_STAGE and Event.REMOVED_FROM_STAGE respectively.
Pair every addEventListener(someEvent, someHandler) with a matching removeEventListener(someEvent, someHandler).
Listen for Event.ACTIVATE and Event.DEACTIVATE.
If objects are loaded externally, make sure to use unloadAndStop() to completely remove unnecessary objects from the stage.
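As a small illustration of the listener points above (GameObject is just an example name): add per-frame work only while the object is on the display list, and remove the listener again when it leaves so the object can be garbage collected:

package
{
    import flash.display.Sprite;
    import flash.events.Event;

    public class GameObject extends Sprite
    {
        public function GameObject()
        {
            addEventListener(Event.ADDED_TO_STAGE, onAdded);
            addEventListener(Event.REMOVED_FROM_STAGE, onRemoved);
        }

        private function onAdded(e:Event):void
        {
            // Only do per-frame work while on the display list.
            addEventListener(Event.ENTER_FRAME, onEnterFrame);
        }

        private function onRemoved(e:Event):void
        {
            // Remove the handler so this object can be collected.
            removeEventListener(Event.ENTER_FRAME, onEnterFrame);
        }

        private function onEnterFrame(e:Event):void
        {
            // per-frame work goes here
        }
    }
}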
Check this out for anyone looking for a framerate-independent solution... this guy's really smart and has a technique both for animating consistently across multiple framerates (slower devices, desktops, etc.) and for keeping your object's framerate independent of your timeline's framerate. Check it out here, Tips 4 & 5. Hope that helps!
I found that the Timer class is actually very inconsistent when mashing buttons; sometimes the timer just fails to complete a cycle and the TimerEvent.TIMER_COMPLETE event never fires. If I had 5 cycles of 100 ms, it would just stop after 3 cycles... Also, ENTER_FRAME will fire every frame, but it is not consistent! If the CPU is under load, your frame rate will drop, and then nothing updates at regular intervals, only at whatever the current frame rate happens to be. Check out that link; there's even some framerate code you can use in your projects to measure it.
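A generic sketch of that kind of frame-rate-independent update (not necessarily the exact technique from the linked tips; 'player' and the speed value are assumed): scale each update by the real time elapsed, so objects cover the same distance per second at 24 fps or 60 fps:

import flash.events.Event;
import flash.utils.getTimer;

var speedPerSecond:Number = 200;   // pixels per second, an example value
var lastTime:int = getTimer();

addEventListener(Event.ENTER_FRAME, onEnterFrame);

function onEnterFrame(e:Event):void
{
    var now:int = getTimer();
    var deltaSeconds:Number = (now - lastTime) / 1000;
    lastTime = now;

    // 'player' is an assumed display object already on the stage
    player.x += speedPerSecond * deltaSeconds;
}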