AS3 setInterval double, triple issue

I'm having a setInterval problem in AS3. Let me explain: I'm making a game with a timer object; let's call its instance timer1. Every 500 milliseconds timer1 moves 25 pixels to the left (timer1.x -= 25), and when timer1 hit-tests finish1 (if (timer1.hitTestObject(finish1))) the game goes to the "you lose" scene and you have to replay the level. When I hit replay and re-enter the scene, the speed set by setInterval doubles, and if I lose again it triples, and so on. How do I fix this? It's very important that I have it fixed soon. Thanks.

Sounds like multiple instances of timer1 are continuing to run.
1 instance of timer1 runs at the original speed.
2 instances of timer1 run at double speed.
etc.
Ensure that the original timer1 is stopped, deleted or killed off before changing scene.
You may want to reference the function clearInterval.
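A minimal sketch of that idea, assuming the movement is started with setInterval in a function such as startLevel() (the function and scene names here are hypothetical):

import flash.utils.setInterval;
import flash.utils.clearInterval;

var moveIntervalID:uint = 0;

function startLevel():void {
    clearInterval(moveIntervalID);                 // kill any interval left over from a previous attempt
    moveIntervalID = setInterval(moveTimer1, 500);
}

function moveTimer1():void {
    timer1.x -= 25;
    if (timer1.hitTestObject(finish1)) {
        clearInterval(moveIntervalID);             // stop the interval before leaving the scene
        gotoAndStop(1, "youLose");                 // hypothetical scene navigation
    }
}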

Moving blocks down y axis in pygame tetris clone

I am writing a Tetris program with PyGame, and came across a funny problem.
Before I ask the question, here is the pseudo-code:
while True:
    # In this part, the human controls the block to go left, right, or speed down
    if a key is pressed and the block isn't touching the floor:
        if the key is K-left:
            move piece left one step
        if the key is K-right:
            move piece right one step
        if the key is K-down:
            move piece down one step

    # This part of the code makes the piece fall by itself
    if the block isn't touching the floor:
        move block down one step

    # This part makes the while loop wait 0.4 seconds so that the block
    # does not move down so quickly
    wait 0.4 seconds
The problem is that, because of the "wait 0.4 seconds" part of the code, the part that the human controls can only move every 0.4 seconds. I would like the block to move as fast as the human can press the keys, while at the same time dropping on its own every 0.4 seconds. How could I arrange the code so that it will do that? Thanks!
The main problem I see here is that you are limiting your framerate using a wait of 0.4 seconds.
You should not limit framerate, but instead, you should limit how fast your block falls.
If I remember correctly, there is a formula you can use to do just that. It is based on the amount of time elapsed since the last frame. It looks like:
fraction of a second elapsed since last frame * distance you want your block to move in a second
This way, you can keep your mainloop intact, and the move processing will happen at every frame.
You could also do...
...
# This part of the code makes the piece fall by itself
if the block isn't touching the floor and \
        the block hasn't automatically moved in the last 0.4 seconds:
    move block down one step
...
Just realize you'll be doing a lot of polling if the user hasn't struck any keys.
You may try asking gamedev.stackexchange.com instead. Check the site for Game Loops, and check out other example pygame projects to see how they're doing it. Having a good game loop is essential and will take care of things for you such as user inputs and a consistent frame rate.
Edit: https://gamedev.stackexchange.com/questions/651/tips-for-writing-the-main-game-loop
When doing games you should always try to do something like this:
while not finished:
    events = get_events()  # get the user input
    # update the world based on the time that elapsed and the events
    world.update(events, dt)
    world.draw()           # render the world
    sleep(1/30s)           # go to next frame
The sleep time should be variable so that it takes into consideration the amount of time spent drawing and calculating the world updates.
The world update method would look something like this:
def update(self, events, dt):
    self.move(events)  # interpret user action
    self.elapsed += dt
    if self.elapsed > ADVANCE_TIME:
        self.piece.advance()
        self.elapsed = 0
The other way of implementing this (so you don't redraw too much) is to have events fired when the user orders a piece to be moved or when the ADVANCE_TIME interval passes. In each event handler you would then update the world and redraw.
This is assuming you want the pieces to move one step at a time rather than continuously. In any case, the change for continuous movement is pretty trivial.

Increase Timer interval

I have a timer that calls the function 'bottleCreate' every 500 milliseconds. But I want that time to increase during the game (so the bottles get created faster and the game gets more difficult). I don't know how to change that variable once it has been passed to new Timer. Thanks
var interval:Number = 500;
var my_timer:Timer = new Timer(interval);
my_timer.addEventListener(TimerEvent.TIMER, bottleCreate);
my_timer.start();
You want the game to get faster, so the variable needs to decrease, because less time between function calls will make it faster.
According to the documentation of the Timer class, you can use the delay property to change the interval speed.
So, to make it faster, you could simply write
my_timer.delay -= 50;
Each time you do this, the function will be called 50 ms sooner.
Be aware though, going beneath 20ms will cause problems, according to the Documentation.
Furthermore, each time you change the delay property, the timer restarts completely, with the same repeat count you set at initialization.
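Putting that together, a rough sketch (the 100 ms floor and the bottle-creation body are assumptions, not part of the original code):

import flash.utils.Timer;
import flash.events.TimerEvent;

var interval:Number = 500;
var my_timer:Timer = new Timer(interval);
my_timer.addEventListener(TimerEvent.TIMER, bottleCreate);
my_timer.start();

function bottleCreate(e:TimerEvent):void {
    // ... create the bottle here ...

    // Speed the game up a little on every tick, but keep a floor
    // well above the ~20 ms limit mentioned in the documentation.
    if (my_timer.delay > 100) {
        my_timer.delay -= 50;  // note: changing delay restarts the timer
    }
}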

AS3 - Slow Render Speeds

I've written a basic velocity-based animation engine which iterates over a list of custom Shape objects which hold direction and velocity. When animating more than 500 objects, my application sees significant slowdown, but the actual time it takes to change the positions of the objects is very low.
Results are roughly as follows:
100 objects - <1ms modification time, 60 FPS
500 objects - 2ms modification time, 40 FPS
1000 objects - 4ms modification time, 10 FPS
I am currently using Timer-based animation, and my timer is set to 15ms intervals. The timer is the ONLY thing executing in my program, and the modification times I have listed measure the entirety of the timer's event function which contains only synchronous code. This means (as far as I can tell) that the only thing that could be responsible for the delay between timer events is screen rendering.
All my objects are tightly clustered. Would screen rendering really take four times as long for 1000 objects as 500? There is no opacity, and the only properties being edited are x and y values.
Specifically, I am wondering if there is a more efficient way to re-render content than changing the positions and then calling event.updateAfterEvent(). Any help is appreciated!
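For reference, the setup described above looks roughly like this; the MyShape class and its vx/vy fields are assumptions used purely for illustration:

import flash.utils.Timer;
import flash.events.TimerEvent;

var shapes:Vector.<MyShape> = new Vector.<MyShape>();  // custom shapes holding direction and velocity
var animTimer:Timer = new Timer(15);
animTimer.addEventListener(TimerEvent.TIMER, onTick);
animTimer.start();

function onTick(e:TimerEvent):void {
    // The modification step itself is cheap, even for 1000 objects...
    for each (var s:MyShape in shapes) {
        s.x += s.vx;
        s.y += s.vy;
    }
    // ...the slowdown shows up when the stage is re-rendered.
    e.updateAfterEvent();
}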

Smooth 60fps frame rate independent motion in AS3

I'm having trouble achieving frame rate independent motion in AS3 at 60fps. Every frame I measure the time since the previous frame, add it to an accumulator, and if the accumulator is greater than my target time, which is 16.666ms (60fps), a frame is simulated.
The problem is that the AS3 getTimer() only returns a time in milliseconds.
The delta times I get are often 16ms for the first frame, 16ms for the second, then 18ms for the third, and this pattern repeats. This averages out to 16.666. But in the first frame it is lower than the target time (16 < 16.666), so no frame is simulated. In the second frame the accumulator is higher than the target time, but slightly less than double it, so one frame is simulated. For the third frame 18ms pushes the accumulator over double the target time, so two frames are simulated.
So I'm getting this very jerky motion where no frames are rendered, then one, then two, then none, then one, then two, and this continues.
How would I get around this?
Wow... I thought I was the only one who had found that out.
Yes, the timer class in AS3 is not accurate. It will only trigger every ~16ms which causes MAJOR issues at times.
If you want to see 2048 actions in 1000 ms, see: FlashText Markup Language
(To test this, you'll need a method which takes 1ms to execute - just for even stats)
Notice the difference:
CORRECT: 1000ms | timer=0 == 1000 actions
AS3: 1000ms | timer=0 == 62.5 actions
I was able to write a class that works like this:
CORRECT: 1000ms | timer=0 == 1000 actions
RESHAPE TIMER: 1000ms | timer=0 == 1024 actions
NOTE:
it DOES NOT fire an action every ms
it DOES catch up to actions between the 16ms interval
an ACTION (the way I use it) is a method call, each method call can have its own body
The methodology I used to create this was catch-up... basically the first timer event will fire at 16 ms... we know we have a full 16 ms worth of code time to fire our own actions before the timer class fires again - that's where you inject sub-actions...
The highest I was able to produce was 2048 actions in 1000ms... more than 2 actions per ms.
Now... back to your problem
There is NO WAY to trigger a 0ms timer event. Based on my solution, if you want to by-pass the first 16ms of lag before the timer fires... you can dispatch an event which will fire within 2ms depending on the current system processes.
Possible Solution
If you take my approach for throwing your own actions within the 16ms, then you can build your own timer class. Use events for times under 16ms, when fired... fire 15 more - lol. Basically, you can create your own deltas between the 16ms.
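A rough sketch of that catch-up idea, assuming an action() method of your own that you want to run roughly once per millisecond:

import flash.utils.Timer;
import flash.utils.getTimer;
import flash.events.TimerEvent;

var lastTick:int = getTimer();
var tick:Timer = new Timer(16);  // fires roughly every ~16 ms
tick.addEventListener(TimerEvent.TIMER, onTick);
tick.start();

function onTick(e:TimerEvent):void {
    var now:int = getTimer();
    var elapsed:int = now - lastTick;
    lastTick = now;

    // Catch up: run one action per elapsed millisecond instead of
    // one action per timer tick.
    for (var i:int = 0; i < elapsed; i++) {
        action();
    }
}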

Actionscript 3.0: What is the difference between using the ENTER_FRAME event and a TIMER event for the update method?

I'm looking for a comparison between the ENTER_FRAME and TIMER events when using them for an update method. I have looked around the internet for some answers but I'm still finding it hard to understand.
Would anyone be able to explain the difference between them?
Timer events can be dispatched independently of the framerate of the swf (to a point). They can happen more often or less often than an ENTER_FRAME event, and should be used if you care about the precision of calculations that happen within the span of time covered by ENTER_FRAME. The most common use case for this is a physics engine, where you may want to be as precise as possible and therefore wish to perform your simulation at a rate faster than Flash's fps.
Also, timers can be useful if you want a specific action to occur after a given delay. For example, a Timer lets you perform an action after 10 seconds easily. You simply pass 10000 milliseconds into your Timer's constructor and then the Timer event will be dispatched 10 seconds later. If you were to use ENTER_FRAME you would need to manually track the time elapsed on every frame update if you wanted to know when 10 seconds had passed.
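For example, a one-shot Timer for a 10-second delay looks roughly like this (doSomething is a hypothetical handler of your own):

import flash.utils.Timer;
import flash.events.TimerEvent;

var delayTimer:Timer = new Timer(10000, 1);  // 10 000 ms, fire exactly once
delayTimer.addEventListener(TimerEvent.TIMER, doSomething);
delayTimer.start();

function doSomething(e:TimerEvent):void {
    trace("10 seconds have passed");
}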
ENTER_FRAME events are tied to the rendering cycle of the timeline and more or less match the framerate you've specified. For instance, if you have a framerate of 30fps then you'll receive approximately 30 ENTER_FRAME events per second. You may receive fewer if you have a particularly complex display list, or if your logic takes a particularly long time to execute.
"enterFrame" is dispatched on every frame.
Suppose your SWF is 24fps: "enterFrame" will be dispatched up to 24 times every second.
"timer" is dispatched at a set interval.
Suppose you start a Timer with a delay of 50 milliseconds: "timer" will be dispatched up to 20 times every second.
The actual frequency of these events will depend on the host environment as well as what's going on inside your application. For example, if you have a for loop inside your "timer" handler where you're iterating over a 1,000-element array and performing some string manipulation on each element, then you'll likely get fewer "timer" events than if your array contained only 10 elements. Likewise, if the user's system is low on free memory, then Flash Player may have trouble executing your SWF and it might slow down the rate at which these events are dispatched.
"enterFrame" depends directly on the frame rate. "timer" depends somewhat indirectly on the frame rate.
Because you (or someone else) will invariably ask what I mean by "somewhat indirectly," here's a small AS3 app that tests both events:
package
{
    import flash.display.*;
    import flash.events.*;
    import flash.utils.*;

    public class Test extends Sprite
    {
        private var timer:Timer = null;
        private var timerEventCount:int = 0;
        private var enterFrameEventCount:int = 0;
        private var startTime:Number = 0;

        public function Test()
        {
            timer = new Timer(20, 0);
            timer.addEventListener("timer", timerHandler);
            timer.start();
            addEventListener("enterFrame", enterFrameHandler);
            startTime = new Date().time;
        }

        private function timerHandler(event:Event):void
        {
            timerEventCount++;
            var timeElapsed:Number = new Date().time - startTime;
            //for (var i:int = 0; i < 4000; i++)
            //    trace("i", i);
            if (timeElapsed >= 1000) {
                // Stop timer after 1 second.
                timer.stop();
                removeEventListener("enterFrame", enterFrameHandler);
                trace(timerEventCount + " timer events and "
                    + enterFrameEventCount + " enterFrame events in "
                    + timeElapsed + " milliseconds.");
            }
        }

        private function enterFrameHandler(event:Event):void
        {
            enterFrameEventCount++;
        }
    }
}
Compile at 12fps:
mxmlc Test.as -default-frame-rate=12
Output:
45 timer events and 12 enterFrame events in 1001 milliseconds.
Compile at 60fps:
mxmlc Test.as -default-frame-rate=60
Output:
29 timer events and 58 enterFrame events in 1010 milliseconds.
As you can see, a higher frame rate actually slows down the timer. I'm running this in Flash Player Debugger 10.3.181.34 (10.3); your mileage may vary.
Finally, if you uncomment the for loop and run it again with 60fps, you'll see what I'm talking about.
Output:
3 timer events and 3 enterFrame events in 1145 milliseconds.
ENTER_FRAME is an event that is triggered every time the render loop of the virtual machine runs and this is relative to the framerate of the movie. For example, in the Flash CS IDE if you set the framerate to 30, then from the root display object or stage, 30 ENTER_FRAME events will be fired every second.
A timer on the other hand is just that, a timer. It runs solely based on the system clock time. For example, if you set a timer with a delay of 1 millisecond, then that timer will fire one millisecond after being started, and will continue to fire once every millisecond if you let it repeat. What I think camus was trying to say in his answer is that this process runs independently of the framerate. It's based solely on checking the system clock and triggering events for timers that have had their requested delay satisfied. This is verified internally by storing the system time at which the timer was started and then checking the current system time repeatedly until it is greater than or equal to the saved time PLUS the timer's delay. Example:
timer.start()  // Let's say the current system time is 1000.
// The timer delay is 1000, so we need to trigger this timer when the
// system time is greater than or equal to 2000.
checkTimers()  // Loops, gets the current system time.
// If the system time is greater than or equal to 2000, trigger the timer with an event:
dispatchEvent(TimerEvent.TIMER, etc, etc);
Note that the above "code" is just pseudo code to demonstrate the basic principles of the system.
ENTER_FRAME is relative to the movie's frame rate. TIMER events should be absolute.