Libgdx - how to properly dispose sound files

When should a sound effect be disposed of?
I'm a bit confused about sound effects vs. music. With music, I just dispose of it when a level is complete, as I have the music repeat until the end of the level and then that music is no longer used. Sound effects, though, play for a short time, so I'm not sure what happens when one is done and a new instance of the same sound plays.
For example, the player character can shoot a gun and each time the shoot sound plays. So if the player shoots 6 times, is that sound effect handled like 6 separate sounds which need to be disposed of or is a sound file only to be disposed of once when no longer needed regardless of how many times it is used?

You don't need to dispose of it each time it's played. Just dispose of it when you dispose of your other game assets (i.e., call it from the Game's dispose() method when the game itself is being disposed).
When the player shoots 6 times, the sound will be played 6 times concurrently.
From the javadoc:
Plays the sound. If the sound is already playing, it will be played again, concurrently.
But there aren't 6 different sounds you should be disposing or anything like that.
So to answer your question: is a sound file only to be disposed of once when no longer needed regardless of how many times it is used?
Yes.
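To make the lifecycle concrete, here is a minimal plain-Java sketch of the idea. GunSound is a mock stand-in for com.badlogic.gdx.audio.Sound (not the real libgdx class), so the example runs standalone; the point is that any number of play() calls pairs with exactly one dispose():

```java
// GunSound mocks libgdx's Sound lifecycle: it is NOT the real API,
// just an illustration of "load once, play many times, dispose once".
class GunSound {
    int plays = 0;
    boolean disposed = false;

    // Each call starts another concurrent playback of the same loaded sample.
    long play() {
        if (disposed) throw new IllegalStateException("sound already disposed");
        return ++plays;
    }

    // Frees the underlying audio resources; called exactly once.
    void dispose() {
        disposed = true;
    }
}

public class SoundLifecycle {
    public static void main(String[] args) {
        GunSound shoot = new GunSound();  // load once, e.g. in create()
        for (int i = 0; i < 6; i++) {
            shoot.play();                 // player shoots 6 times
        }
        shoot.dispose();                  // one dispose when the game shuts down
        System.out.println(shoot.plays + " plays, disposed=" + shoot.disposed);
    }
}
```

With the real libgdx Sound, the shape is the same: `Gdx.audio.newSound(...)` once, `sound.play()` per shot, `sound.dispose()` once at shutdown.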

For Small Projects/Games:
Create any Sound you use on your play screens once, in the Game's create() method, and reuse it throughout the game rather than disposing of it after each play.
Sound sound = Gdx.audio.newSound(Gdx.files.internal("data/mysound.mp3"));
Dispose of the sound in the dispose() method of your ApplicationListener, which is called when you exit your game.
Alternatively, create your Sound in the show() method of the Screen interface and dispose of it in the Screen's dispose(). Note that Screen.dispose() is not called by the framework, so you need to call it yourself, e.g. from the Screen's hide() method.
Or use an AssetManager: load and unload resources as your requirements change, and dispose of the AssetManager itself in the ApplicationListener's dispose().
For Big Projects/Games (resource-wise):
You should use an AssetManager, because you have to manage lots of resources.
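The idea behind AssetManager is reference counting: each load() of a path bumps a count, each unload() decrements it, and the asset is only actually disposed when the count reaches zero. Here is a rough standalone sketch of that idea (TinyAssetManager is illustrative, not the real libgdx API, which also handles async loading, dependencies, and typed loaders):

```java
import java.util.HashMap;
import java.util.Map;

// Toy reference-counting manager sketching what libgdx's AssetManager does.
class TinyAssetManager {
    private final Map<String, Integer> refCounts = new HashMap<>();

    // Each load() of the same path increments its reference count.
    void load(String path) {
        refCounts.merge(path, 1, Integer::sum);
    }

    // Decrement the count; drop (and "dispose") the asset when it hits zero.
    void unload(String path) {
        Integer count = refCounts.get(path);
        if (count == null) return;
        if (count <= 1) refCounts.remove(path);
        else refCounts.put(path, count - 1);
    }

    boolean isLoaded(String path) {
        return refCounts.containsKey(path);
    }

    // Mirrors AssetManager.dispose(): everything goes at once.
    void dispose() {
        refCounts.clear();
    }
}
```

Two screens can each load() the same sound; the first unload() leaves it alive for the other screen, and only the second one actually frees it.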

Related

frame drop on game initial start

I am loading all my assets using the AssetManager in my splash screen, and I dispose of the AssetManager properly. I am also using a tiled map and various actors, as well as a stage. When I create the stage, initialize all the actors, and set the screen, the game's frame rate is always something like
45, 47, 50, 52, 47, 55, and then it recovers to 60 and never drops below 59 again. So on start the game always lags slightly for about 5-10 seconds, then it recovers and maintains 60 FPS. Has anyone experienced something like this? Is it normal?

Does a SpriteBatch instance need to call dispose() once it is no longer used?

According to this article, a SpriteBatch instance needs to call dispose() once it is no longer needed. However, when examining some of libgdx's official examples, like Pax Britannica and Super Jumper, I found that they never call SpriteBatch.dispose(). Why is that?
SpriteBatch must always be disposed.
Internally, it creates and manages several Mesh objects. These objects allocate vertex/index arrays on the GPU. Those are only deallocated if you explicitly call Mesh#dispose(), which will be triggered by calling dispose() on your SpriteBatch object.
It will also, by default, create its own ShaderProgram, which would similarly be leaked if you didn't call dispose().
If the demos aren't doing this, perhaps it's time to send a pull request!
I think the given demo games try to keep things simple. They are supposed to show how the basics of libgdx work in a minimalistic way, and thus abstract away some details. That's useful for beginners, keeping the examples from being bloated with a lot of very specific code.
In a real-world project I think SpriteBatch.dispose() has to be called in the dispose() method of the GameScreen in SuperJumper, for example. And GameScreen.dispose() also has to be called when switching back to the MainMenuScreen, because that doesn't happen automatically either.
When you create a SpriteBatch with new SpriteBatch(), it creates one internal Mesh. If you never call SpriteBatch.dispose(), that mesh is never disposed either, so SuperJumper has a memory leak there.
I've created games with multiple screens that each have their own SpriteBatch. I removed all the dispose() calls for the batches and noticed no immediate effect, so keep in mind to check for this before releasing your product. Even if you can't feel any downside to not disposing batches, there's no reason not to dispose them. Just do it in the Screen implementations' dispose() methods; it takes about one nanosecond to do :)
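A common way to avoid per-screen leaks is to let the Game own a single SpriteBatch, hand it to each Screen, and dispose of it exactly once. Below is a standalone sketch of that ownership pattern; MockBatch, BorrowingScreen, and OwnerGame are illustrative names standing in for the real libgdx classes:

```java
import java.util.ArrayList;
import java.util.List;

// Mock of SpriteBatch: the real one holds GPU meshes and a ShaderProgram.
class MockBatch {
    boolean disposed = false;
    void dispose() { disposed = true; }
}

// Screens borrow the batch; they never dispose of it themselves.
class BorrowingScreen {
    final MockBatch batch;
    BorrowingScreen(MockBatch batch) { this.batch = batch; }
}

// The Game is the single owner and the single place that calls dispose().
class OwnerGame {
    final MockBatch batch = new MockBatch();
    final List<BorrowingScreen> screens = new ArrayList<>();

    BorrowingScreen newScreen() {
        BorrowingScreen s = new BorrowingScreen(batch); // share, don't create
        screens.add(s);
        return s;
    }

    void dispose() {
        batch.dispose(); // one dispose, no matter how many screens used it
    }
}
```

This also sidesteps the "Screen.dispose() is never called automatically" problem for the batch itself, since its lifetime is tied to the Game, not to any one screen.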

TimerEvent vs. ENTER_FRAME in AS3/Flex games?

I'm trying to write a simple game in AS3 / Flex 4, and I'm trying to find the best way from the get-go to handle timed execution of code.
The most natural approach would be to use a bunch of Timer objects throughout the program. However, that's supposedly pretty expensive in CPU cycles compared to ENTER_FRAME.
On the other hand, I'm not sure how natural it would be to base all of a program's timed execution on ENTER_FRAME, and I have read on Stack Overflow that it has issues that Timer doesn't when it comes to dragging down animation and frame rate as the complexity of the program increases.
Another option that comes to mind is to simply use one Timer object in the program, whose event handler will go through and check on everything in one go around - kind of like blending the usual Timer and ENTER_FRAME approaches.
My question is this: what's really the best approach for a video game in AS3/Flex 4, and why? Thanks!
It depends entirely on when you want to update values, check collisions, manage input and so on vs when you want to actually see something happen on the screen.
ENTER_FRAME will make your update logic and the rendering of your game synchronous.
Every time an ENTER_FRAME event is dispatched, the scene is redrawn. What this means is that all your game update logic is always immediately followed by the screen being rendered. If the frame rate in your game drops due to complex graphics on the screen taking a long time to render, so will the rate at which your game updates. ENTER_FRAME dispatches are erratic, meaning they're not good for update tasks that you need to perform with even intervals between them.
Timers will cause your update logic and the rendering of your game to become asynchronous.
Timers can trigger far more or far less frequently than an ENTER_FRAME handler would. This means that your update loop could run multiple times before the scene is redrawn, or that the scene could be redrawn multiple times with no changes. Timers are less erratic than ENTER_FRAME handlers, making them better at doing something at set intervals. With that said though, there will still be a little bit of offset between updates:
Depending on the SWF file's framerate or the runtime environment (available memory and other factors), the runtime may dispatch events at slightly offset intervals. For example, if a SWF file is set to play at 10 frames per second (fps), which is 100 millisecond intervals, but your timer is set to fire an event at 80 milliseconds, the event will be dispatched close to the 100 millisecond interval. Memory-intensive scripts may also offset the events. (help.adobe.com | flash.utils.Timer)
Personally, I have always used ENTER_FRAME rather than Timers. To me it is logical that if I make changes to the objects within a game, those changes should be immediately represented on-screen.
Timers are good if you want to be able to update components within your game faster than the frame-rate is able to manage. Timers are also good if you expect a given amount of updates to be completed within a certain timeframe, because they're not bound by the rate at which the screen is able to be redrawn like ENTER_FRAME is.
As for actual implementation, you preferably want to pick one and implement one handler. This means that you should only have a single function in the entire game that will be triggered by either a Timer or ENTER_FRAME. You do not want to be creating individual handlers in every single class that should be updated. Instead, you want to have your top level class (or a close relative of that class) define the handler. You also want to create a list within that class which will represent everything in the game that needs to be updated.
From there, you will create a small collection of methods that will deal with listing and de-listing updatable instances from that class. Each updatable instance will either implement an interface or extend a class that defines an update() method. It could look like this:
public interface IUpdatable
{
    function update():void;
}
From there, the update handler within the updater class will simply iterate over all of the updatables in the list and call their update() method, like this:
for each(var i:IUpdatable in updateList)
{
    i.update();
}
A final note is that this approach means that if you do decide to switch from using an ENTER_FRAME handler to a Timer or vice versa, it's a simple switch in the updater class and doesn't require you to change any other part of the game code. If you go around creating handlers in each class that needs to be updated, your change of heart will mean making changes to every single class.
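The same single-handler pattern can be sketched in runnable form; it's shown here in Java rather than AS3 purely so it can execute standalone, and the names Updater and IUpdatable are illustrative, not part of any framework:

```java
import java.util.ArrayList;
import java.util.List;

interface IUpdatable {
    void update();
}

// One central updater: the only place an ENTER_FRAME or Timer tick lands.
class Updater {
    private final List<IUpdatable> updateList = new ArrayList<>();

    void register(IUpdatable u)   { updateList.add(u); }
    void deregister(IUpdatable u) { updateList.remove(u); }

    // Called once per tick; iterates over a copy so updatables may
    // deregister themselves safely mid-iteration.
    void tick() {
        for (IUpdatable u : new ArrayList<>(updateList)) {
            u.update();
        }
    }
}
```

Swapping ENTER_FRAME for a Timer (or vice versa) then only changes what calls tick(); no game object ever needs to know which one is driving it.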

Adobe AIR - Garbage collection and system.gc()

I'm building an Adobe AIR desktop app with Flash CS5 that makes a lot of use of bitmapdata, bytearrays and base64 strings. After a while the memory usage of the app doubles.
Is it recommended to use system.gc() to free memory at that point or is that bad practice?
Thanks.
system.gc() is debug-only functionality in AIR and Flash Player. I think the better approach is to recycle BitmapData and other objects if you can, to avoid GC altogether; if not, call bitmapData.dispose() and set bitmapData = null as soon as you are done using them.
If you have bitmap objects of the same size at various times in your project, you can reuse the same BitmapData instance to operate on them. This is similar to how ItemRenderers recycle items, or how other platforms do it, e.g. iOS's UITableViewController recycling/reusing UITableViewCell. Garbage collection is not a panacea; it should be used when ease of programming is more important than performance.
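The recycling idea above is just object pooling. A minimal pool sketch follows (in Java so it runs standalone; in AS3 the pooled object would be a fixed-size BitmapData instead of a byte array, and BufferPool is a made-up name for illustration):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy buffer pool illustrating "recycle instead of letting GC churn".
class BufferPool {
    private final Deque<byte[]> free = new ArrayDeque<>();
    private final int size;
    int allocations = 0; // counts real allocations, for demonstration

    BufferPool(int size) { this.size = size; }

    // Reuse a returned buffer when one is available; allocate otherwise.
    byte[] acquire() {
        byte[] b = free.poll();
        if (b == null) {
            allocations++;
            return new byte[size];
        }
        return b;
    }

    // Return a buffer to the pool instead of dropping it for the GC.
    void release(byte[] b) {
        free.push(b);
    }
}
```

Acquiring and releasing in a loop allocates only once, which is exactly why pooling keeps GC pauses out of a render loop.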
You don't need to call system.gc as it will be called automatically on idle cycles by the Flash runtime. If you call it yourself you might end up slowing down your application for no real gain.
When you don't need a BitmapData or a ByteArray anymore, just call BitmapData.dispose() or ByteArray.clear().

Does AS3 Event.ENTER_FRAME run on every frame, always? Even on slow computers?

I have a script that relies on the ENTER_FRAME event firing every frame. I have noticed that on some slower computers there can be some lag when a Flash movie is playing.
Does ENTER_FRAME run on every frame, even if its on a slow computer?
If the flash movie lags, does the ENTER_FRAME event still run and the rendering just try to catch up?
Is running code on ENTER_FRAME a reliable way to execute code every time a frame is entered?
Yep. Every frame, no exceptions. If something is slowing a movie down (either heavy scripts or heavy graphics), Event.ENTER_FRAME handlers are still executed before each frame is rendered.
Hence, it's generally a good idea to use a Timer instance with TimerEvent.TIMER instead, even if its delay is set equal to the 'ideal' frame duration for your movie's fps, because an ENTER_FRAME handler is not guaranteed to fire at a uniform rate.
See the following link for more in-depth explanation: The Elastic Racetrack
If you have the frame rate set to 30 fps, then the event will fire 30 times per second, as long as you don't put a load on the processor that makes the frame rate drop. Therefore, if the frame rate is fluctuating, you might get more consistent results with a Timer event.
on a side note, be aware that...
Using many event handlers can create performance issues too.
Every time the event fires, Flash has to create an event object at the very least. That means memory has to be allocated on every dispatch; that memory then needs to be garbage collected later, and the garbage collection will also use resources to execute.
If you have many movie clips or sprites, it can be worthwhile to have one controller that manages all of them, rather than each one having its own ENTER_FRAME handler.
A general answer to a general question:
If you want to improve the performance of Flash Player, consider the following points.
Do not use strokes unless required (strokes are more CPU-intensive).
Use fewer gradient colors if possible.
Use optimized bitmaps, if any.
Make effective use of addChild(yourObject), addChildAt(yourObject, index), removeChild(yourObject), and removeChildAt(index).
Listen to Event.ADDED_TO_STAGE and Event.REMOVED_FROM_STAGE respectively.
Pair every addEventListener(somelistener, somefunction) with a matching removeEventListener(somelistener, somefunction).
Listen to Event.ACTIVATE and Event.DEACTIVATE.
If objects are loaded externally, make sure to use unloadAndStop() to completely remove unnecessary objects from the stage.
Check this out for anyone looking for a frame-rate-independent solution... this guy's really smart and has a technique both for animating consistently across multiple frame rates (slower devices, desktops, etc.) and for keeping your object's frame rate independent of your timeline's frame rate. Check it out here, Tips 4 & 5. Hope that helps!
I found that the Timer class is actually very inconsistent when mashing buttons; sometimes the timer just fails to complete a cycle and the TimerEvent.TIMER_COMPLETE event never gets reached. If I had 5 cycles of 100 ms, it would just stop after 3 cycles. Also, ENTER_FRAME will fire every frame, but it is not consistent! If the CPU is under load, your frame rate will drop, and hence nothing will update at regular intervals, but rather at whatever the current frame rate is. Check out that link; there's even some frame-rate code you can use in your projects to check it.