How can something be delayed based off of FPS? [duplicate] - pygame

I am practicing with pygame and I was wondering how to make the frame rate not affect the speed of the game.
I would like the FPS not to be locked and the game to always run at the same speed.
Until now I have used the pygame.time.Clock.tick function, but the speed of the character changed depending on the FPS, which I don't want.

You have to calculate the movement per frame depending on the frame rate.
pygame.time.Clock.tick returns the number of milliseconds since the last call. When you call it in the application loop, this is the number of milliseconds that have passed since the last frame. Multiply the object's speed by the elapsed time per frame to get constant movement regardless of FPS.
For instance, define the distance in pixels that the player should move per second (move_per_second), then compute the distance per frame in the application loop:
move_per_second = 500
FPS = 60

run = True
clock = pygame.time.Clock()
while run:
    ms_frame = clock.tick(FPS)
    move_per_frame = move_per_second * ms_frame / 1000
    # [...]
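For a fuller picture, here is a minimal self-contained sketch of that idea; the window size, speed, and key handling are illustrative choices, not part of the original question:

import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()

move_per_second = 500          # pixels per second, independent of the frame rate
x = 100.0                      # keep the position as a float so sub-pixel moves add up
FPS = 60

run = True
while run:
    ms_frame = clock.tick(FPS)                        # milliseconds since the last frame
    move_per_frame = move_per_second * ms_frame / 1000

    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            run = False

    keys = pygame.key.get_pressed()
    if keys[pygame.K_RIGHT]:
        x += move_per_frame                           # same speed at any FPS
    if keys[pygame.K_LEFT]:
        x -= move_per_frame

    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (255, 0, 0), (round(x), 200, 40, 40))
    pygame.display.flip()

pygame.quit()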

Related

Increase Timer interval

I have a timer that calls the function 'bottleCreate' every 500 milliseconds. But I want that time to increase during the game (so the bottles are created faster and the game gets more difficult), and I don't know how to change that variable inside new Timer. Thanks
var interval:int = 500;
var my_timer=new Timer(interval);
my_timer.addEventListener(TimerEvent.TIMER, bottleCreate);
my_timer.start();
You want the game to get faster, so the variable needs to decrease, because less time between function calls will make it faster.
According to the documentation of the Timer class, you can use the delay property to change the interval.
So, to make it faster, you could simply write
my_timer.delay -= 50;
Each time you do this, the timer will fire 50 ms sooner.
Be aware, though, that going below 20 ms will cause problems, according to the documentation.
Furthermore, each time you change the delay property, the timer restarts completely, with the same repeat count you used at initialization.
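For comparison with the pygame question at the top of this page, the same "speed up a repeating timer" idea could be sketched like this in Python; the SPAWN_BOTTLE event name and the concrete numbers are assumptions made for illustration:

import pygame

pygame.init()
screen = pygame.display.set_mode((400, 300))
clock = pygame.time.Clock()

SPAWN_BOTTLE = pygame.USEREVENT + 1    # hypothetical custom event for bottle creation
interval = 500                         # start with 500 ms between spawns
pygame.time.set_timer(SPAWN_BOTTLE, interval)

run = True
while run:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            run = False
        elif event.type == SPAWN_BOTTLE:
            # create a bottle here, then shorten the interval (but not below 20 ms)
            interval = max(20, interval - 50)
            pygame.time.set_timer(SPAWN_BOTTLE, interval)
    clock.tick(60)

pygame.quit()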

STM32F4 nanosecs delay

I've been playing with SysTick for a couple of days and I cannot reach nanosecond delays. Is it possible to reach such small values with SysTick, or do I have to use timers and interrupts? The LEDs won't work below a 350 ns delay, though. Here is an image from my USB oscilloscope:
In general I want to make a project (I am just experimenting with LEDs and SysTick above) which will look like this:
where Δt = 250 ns (the other parameters will be determined somehow). The question is, can I make these pulses by using SysTick?
The STM32F407VG has a 24-bit SysTick timer and its maximum clock speed is 168 MHz (the core clock speed). That means that even if you set your SysTick reload register to 0x000001 (one cycle), the shortest period you can get is about 5.95 ns.
I found this in section 6.2 Clocks of the RM0368 reference manual:
The RCC feeds the external clock of the Cortex System Timer (SysTick) with the AHB clock (HCLK) divided by 8. The SysTick can work either with this clock or with the Cortex clock (HCLK), configurable in the SysTick control and status register.
So the maximum tick rate may be limited by the clock divisions. Check Figure 12 (Clock tree) to see which clock configuration you should use to get the maximum speed.
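As a quick sanity check of those numbers, here is a small Python calculation based on the 168 MHz figure quoted in this answer:

HCLK_HZ = 168_000_000                       # maximum core clock of the STM32F407VG

tick_ns = 1e9 / HCLK_HZ                     # duration of one SysTick clock cycle
print(f"one cycle = {tick_ns:.2f} ns")      # ~5.95 ns: the finest granularity available

target_ns = 250                             # the 250 ns pulse width from the question
print(f"250 ns = {target_ns / tick_ns:.1f} cycles")   # 42.0 cycles, so 250 ns is representable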

AS3 - Slow Render Speeds

I've written a basic velocity-based animation engine which iterates over a list of custom Shape objects which hold direction and velocity. When animating more than 500 objects, my application sees significant slowdown, but the actual time it takes to change the positions of the objects is very low.
Results are roughly as follows:
100 objects - <1ms modification time, 60 FPS
500 objects - 2ms modification time, 40 FPS
1000 objects - 4ms modification time, 10 FPS
I am currently using Timer-based animation, and my timer is set to 15ms intervals. The timer is the ONLY thing executing in my program, and the modification times I have listed measure the entirety of the timer's event function which contains only synchronous code. This means (as far as I can tell) that the only thing that could be responsible for the delay between timer events is screen rendering.
All my objects are tightly clustered. Would screen rendering really take four times as long for 1000 objects as for 500? There is no opacity, and the only properties being edited are x and y values.
Specifically, I am wondering if there is a more efficient way to re-render content than changing the positions and then calling event.updateAfterEvent(). Any help is appreciated!

Smooth 60fps frame rate independent motion in AS3

I'm having trouble achieving frame rate independent motion in AS3 at 60fps. Every frame I measure the time since the previous frame, add it to an accumulator, and if the accumulator is greater than my target time, which is 16.666ms (60fps), a frame is simulated.
The problem is that the AS3 getTimer() only returns a time in milliseconds.
The delta times I get are often 16ms for the first frame, 16ms for the second, then 18ms for the third, and this pattern repeats. This averages out to 16.666. But in the first frame it is lower than the target time (16 < 16.666), so no frame is simulated. In the second frame the accumulator is higher than the target time, but slightly less than double it, so one frame is simulated. For the third frame 18ms pushes the accumulator over double the target time, so two frames are simulated.
So I'm getting this very jerky motion where no frames are rendered, then one, then two, then none, then one, then two, and this continues.
How would I get around this?
Wow... I thought I was the only one who found that out.
Yes, the Timer class in AS3 is not accurate. It will only trigger every ~16ms, which causes MAJOR issues at times.
If you want to see 2048 actions in 1000ms: FlashText Markup Language
(To test this, you'll need a method which takes 1ms to execute - just for even stats)
Notice the difference:
CORRECT: 1000ms | timer=0 == 1000 actions
AS3: 1000ms | timer=0 == 62.5 actions
I was able to write a class that works like this:
CORRECT: 1000ms | timer=0 == 1000 actions
RESHAPE TIMER: 1000ms | timer=0 == 1024 actions
NOTE:
it DOES NOT fire an action every ms
it DOES catch up to actions between the 16ms interval
an ACTION (the way I use it) is a method call, each method call can have its own body
The methodology I used to create this was catch-up... basically the first timer event will fire at 16ms... we know we have a full 16ms worth of code time to fire our own actions before the Timer class fires again - that's where you inject sub-actions...
The highest I was able to produce was 2048 actions in 1000ms... more than 2 actions per ms.
Now... back to your problem
There is NO WAY to trigger a 0ms timer event. Based on my solution, if you want to bypass the first 16ms of lag before the timer fires... you can dispatch an event which will fire within 2ms, depending on the current system processes.
Possible Solution
If you take my approach for throwing your own actions within the 16ms, then you can build your own timer class. Use events for times under 16ms, when fired... fire 15 more - lol. Basically, you can create your own deltas between the 16ms.
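Stepping back from the AS3 Timer workaround: the question itself describes the fixed-timestep accumulator pattern, and the main obstacle is getTimer()'s 1 ms resolution. Below is a minimal Python sketch of that same pattern using a sub-millisecond clock; the function names (simulate, render, running) are placeholders, not anything from the question:

import time

TIMESTEP = 1 / 60                          # target simulation step (~16.666 ms)

def run_loop(simulate, render, running):
    """Fixed-timestep loop: simulate in constant steps, render once per pass."""
    accumulator = 0.0
    previous = time.perf_counter()
    while running():
        now = time.perf_counter()
        accumulator += now - previous      # real elapsed time, sub-millisecond resolution
        previous = now
        while accumulator >= TIMESTEP:     # catch up in whole fixed steps
            simulate(TIMESTEP)
            accumulator -= TIMESTEP
        render()

With a clock of this resolution, the measured deltas no longer quantize to 16/16/18 ms, so the number of simulated steps per rendered frame stays much more even.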

Actionscript 3.0: What is the difference between using the ENTER_FRAME event and a TIMER event for the update method?

I'm looking for a comparison between the ENTER_FRAME and TIMER methods when using them for an update method. I have looked around the internet for some answers but I'm still finding it hard to understand.
Would anyone be able to help by simplifying the difference between them?
Timer events can dispatch independently of the framerate of the SWF (to a point). They can happen more often or less often than an ENTER_FRAME event, and should be used if you care about the precision of calculations within the span of time covered by ENTER_FRAME. The most common use case for this is a physics engine, where you may want to be as precise as possible and therefore wish to perform your simulation at a rate faster than Flash's fps.
Also, timers can be useful if you want a specific action to occur after a given delay. For example, a Timer lets you perform an action after 10 seconds easily. You simply pass 10000 milliseconds into your Timer's constructor and then the Timer event will be dispatched 10 seconds later. If you were to use ENTER_FRAME you would need to manually track the time elapsed on every frame update if you wanted to know when 10 seconds had passed.
ENTER_FRAME events are tied to the rendering cycle of the timeline and more or less match the framerate you've specified. For instance, if you have a framerate of 30fps then you'll receive approximately 30 ENTER_FRAME events per second. You may receive fewer if you have a particularly complex display list, or if your logic takes a particularly long time to execute.
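To make the 10-second example concrete, here is a small Python/pygame sketch of the "track the elapsed time on every frame yourself" alternative described above; the 10-second figure comes from the answer, everything else is illustrative:

import pygame

pygame.init()
screen = pygame.display.set_mode((320, 240))
clock = pygame.time.Clock()

DELAY_MS = 10_000          # perform the action 10 seconds after startup
elapsed_ms = 0
done = False

run = True
while run:
    elapsed_ms += clock.tick(60)          # add the time spent on this frame
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            run = False
    if not done and elapsed_ms >= DELAY_MS:
        print("10 seconds have passed")   # the delayed action
        done = True
    pygame.display.flip()

pygame.quit()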
"enterFrame" is dispatched on every frame.
Suppose your SWF is 24fps: "enterFrame" will be dispatched up to 24 times every second.
"timer" is dispatched at a set interval.
Suppose you start a Timer with a delay of 50 milliseconds: "timer" will be dispatched up to 20 times every second.
The actual frequency of these events will depend on the host environment as well as what's going on inside your application. For example, if you have a for loop inside your "timer" handler where you're iterating over a 1,000-element array and performing some string manipulation on each element, then you'll likely get fewer "timer" events than if your array contained only 10 elements. Likewise, if the user's system is low on free memory, then Flash Player may have trouble executing your SWF and it might slow down the rate at which these events are dispatched.
"enterFrame" depends directly on the frame rate. "timer" depends somewhat indirectly on the frame rate.
Because you (or someone else) will invariably ask what I mean by "somewhat indirectly," here's a small AS3 app that tests both events:
package
{
    import flash.display.*;
    import flash.events.*;
    import flash.utils.*;

    public class Test extends Sprite
    {
        private var timer:Timer = null;
        private var timerEventCount:int = 0;
        private var enterFrameEventCount:int = 0;
        private var startTime:Number = 0;

        public function Test()
        {
            timer = new Timer(20, 0);
            timer.addEventListener("timer", timerHandler);
            timer.start();
            addEventListener("enterFrame", enterFrameHandler);
            startTime = new Date().time;
        }

        private function timerHandler(event:Event):void
        {
            timerEventCount++;
            var timeElapsed:Number = new Date().time - startTime;
            //for (var i:int = 0; i < 4000; i++)
            //    trace("i", i);
            if (timeElapsed >= 1000) {
                // Stop timer after 1 second.
                timer.stop();
                removeEventListener("enterFrame", enterFrameHandler);
                trace(timerEventCount + " timer events and "
                    + enterFrameEventCount + " enterFrame events in "
                    + timeElapsed + " milliseconds.");
            }
        }

        private function enterFrameHandler(event:Event):void
        {
            enterFrameEventCount++;
        }
    }
}
Compile at 12fps:
mxmlc Test.as -default-frame-rate=12
Output:
45 timer events and 12 enterFrame events in 1001 milliseconds.
Compile at 60fps:
mxmlc Test.as -default-frame-rate=60
Output:
29 timer events and 58 enterFrame events in 1010 milliseconds.
As you can see, a higher frame rate actually slows down the timer. I'm running this in Flash Player Debugger 10.3.181.34 (10.3); your mileage may vary.
Finally, if you uncomment the for loop and run it again with 60fps, you'll see what I'm talking about.
Output:
3 timer events and 3 enterFrame events in 1145 milliseconds.
ENTER_FRAME is an event that is triggered every time the render loop of the virtual machine runs and this is relative to the framerate of the movie. For example, in the Flash CS IDE if you set the framerate to 30, then from the root display object or stage, 30 ENTER_FRAME events will be fired every second.
A timer, on the other hand, is just that: a timer. It runs solely based on the system clock time. For example, if you set a timer with a delay of 1 millisecond, then that timer will fire one millisecond after being started, and will continue to fire once every millisecond if you let it repeat. What I think camus was trying to say in his answer is that this process runs independently of the framerate. It is based solely on checking the system clock and triggering events for timers whose requested delay has been satisfied. Internally this works by storing the system time at which the timer was started and then repeatedly checking the current system time until it is greater than or equal to the saved time plus the timer's delay. Example:
timer.start();      // Let's say the current system time is 1000
// The timer's delay is 1000, so we need to trigger this timer when the system time is greater than or equal to 2000.
checkTimers();      // Loops, gets the current system time
// If the system time is greater than or equal to 2000, trigger the timer with an event:
dispatchEvent(TimerEvent.TIMER, etc, etc);
Note that the above "code" is just pseudo code to demonstrate the basic principles of the system.
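In Python terms, the same check-against-the-system-clock principle could be sketched as follows; this is purely illustrative and not how Flash Player is actually implemented:

import time

class PollingTimer:
    """Illustrative timer that fires its callback once the delay (in ms) has elapsed."""
    def __init__(self, delay_ms, callback):
        self.deadline = time.monotonic() + delay_ms / 1000
        self.callback = callback
        self.fired = False

    def check(self):
        # Called repeatedly; triggers the callback once the deadline has passed.
        if not self.fired and time.monotonic() >= self.deadline:
            self.fired = True
            self.callback()

t = PollingTimer(1000, lambda: print("timer fired"))
while not t.fired:
    t.check()          # in a real event loop this would run alongside other work
    time.sleep(0.001)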
ENTER_FRAME is relative to the movie's frame rate. TIMER events should be absolute.