C#/XAML - DispatcherTimer resolution

I'm writing a Windows 10 Universal App in C#/XAML.
I make use of the DispatcherTimer class, and I'm wondering what its resolution is. I'm testing in a desktop environment: when I set the Interval property to a value less than 33 milliseconds, it seems not to affect the timer; the Tick event still fires as often as if Interval were set to 33 milliseconds (around 30 times per second).
Is this behaviour by design (you can't fire a DispatcherTimer more often than about 30 times per second), or does it depend on the app's environment - device, processor, memory, etc. - and my environment simply doesn't support a higher timer resolution?

Although I cannot find official documentation anywhere, what you are observing seems to make sense: a 33 ms interval works out to roughly 30 ticks per second, which matches the default 30 fps frame rate on Windows Phone 8.1.
It would be great to know why you need such a high resolution. Also check: WPF Timer problem… Cannot get correct millisecond tick

DispatcherTimer just queues callbacks on the UI thread. Even if it used a high-resolution timer under the hood, it would suffer unavoidable jitter from unrelated UI activity.
If the intended use is continuous drawing, you should use a SwapChainPanel instead - https://msdn.microsoft.com/en-us/library/windows/apps/windows.ui.xaml.controls.swapchainpanel.aspx
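A quick way to verify this yourself (a minimal sketch, assumed to run in a UWP page's constructor): request a tiny Interval and log the interval actually observed between ticks.

// Request 1 ms; the dispatcher makes no guarantee this is honored.
var stopwatch = System.Diagnostics.Stopwatch.StartNew();
long lastMs = 0;

var timer = new DispatcherTimer
{
    Interval = TimeSpan.FromMilliseconds(1)
};
timer.Tick += (s, e) =>
{
    long nowMs = stopwatch.ElapsedMilliseconds;
    System.Diagnostics.Debug.WriteLine($"Tick after {nowMs - lastMs} ms");
    lastMs = nowMs;
};
timer.Start();

If the logged interval never drops below ~33 ms regardless of what you request, the floor is being imposed by the dispatcher/frame loop rather than by your code.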

Related

YouTube HTML5 API - Is it possible to get better time resolution when polling the player for current time?

I've been looking for a solution to this for a while now and I still haven't found one. Our app needs to poll a YouTube video object using player.getCurrentTime() to drive some screen animations. With the Flash API this was great, because we could poll the player at 40 ms intervals (25 FPS) and get very accurate current-time values. We have now started to use the iFrame API, which unfortunately does not allow anything near that level of accuracy. I did some research, and it seems that because it's an iFrame, a postMessage is used to expose the player's state to the player.getCurrentTime() call. Unfortunately this postMessage event fires very infrequently - sometimes as low as 4 times a second. Worse, the actual rate at which the message fires seems to depend on the browser's rendering engine.
Does anybody know if it is possible to force the rendering engine to fire those messages more frequently, so that greater time resolution can be achieved when polling the player? I tried requestAnimationFrame and it doesn't solve the problem. Has anybody successfully managed to get the iFrame player to report times more accurately and more frequently?
I've come up with a workaround for my original problem. I wrote a simple tween function that polls the iFrame player at the frequency I desire and interpolates the time instants in between. The player itself only updates the current time every 250 ms or so, depending on the rendering engine and platform; if you poll it more often than that, it returns the same current-time value on several consecutive polls. However, with some logic you can detect when the player returns a new current time and update your own timer accordingly. I run the function below on a timer with a 25 ms interval. On each iteration I add 25 ms to the current time, except when I detect a change in the current time reported by the player; in that case I update my own timer with the new "actual" current time. There may be a small jump or non-linearity in the time when you do this, but if you poll the player at a high enough rate it should be imperceptible.
// Poll the player every 25 ms; interpolate between the (infrequent)
// times the player actually reports a new value.
var current_time_msec = 0;
var last_time_update = 0;

window.setInterval(tween_time, 25);

function tween_time() {
    var time_update = ytplayer.getCurrentTime() * 1000; // player time in ms
    var playing = ytplayer.getPlayerState();            // 1 == playing
    if (playing == 1) {
        if (last_time_update == time_update) {
            // Player reported the same time again: interpolate forward.
            current_time_msec += 25;
        } else {
            // Player reported a fresh time: resynchronize to it.
            current_time_msec = time_update;
        }
    }
    do_my_animations();
    last_time_update = time_update;
}
In HTML5, it is likely that the getCurrentTime() function and the postMessage event in the YouTube API are linked to the currentTime property and the timeupdate event of the HTML5 media element specification.
The rate at which the timeupdate event fires varies between browsers and, as of today, cannot be tuned to the level of precision you are looking for (Flash is still a bit ahead on this one). According to the specification:
If the time was reached through the usual monotonic increase of the current playback position during normal playback, and if the user agent has not fired a timeupdate event at the element in the past 15 to 250ms and is not still running event handlers for such an event, then the user agent must queue a task to fire a simple event named timeupdate at the element. (In the other cases, such as explicit seeks, relevant events get fired as part of the overall process of changing the current playback position.)
The event thus is not to be fired faster than about 66 Hz or slower than 4 Hz (assuming the event handlers don't take longer than 250 ms to run). User agents are encouraged to vary the frequency of the event based on the system load and the average cost of processing the event each time, so that the UI updates are not any more frequent than the user agent can comfortably handle while decoding the video.
For the currentTime property, the precision is expressed in seconds. Any precision below one second is a browser-specific implementation detail and should not be taken for granted (in practice you will get sub-second precision in modern browsers like Chrome, but with fluctuating accuracy).
On top of that, the YouTube API could be throttling all of this to reach the common ground of 250 ms precision and keep all browsers happy (hence 4 events per second, and what you noticed in your tests). For your scenario, you are better off scaling your animations to this 250 ms precision and allowing for some margin of error, for a better user experience. In the future, browsers and HTML5 media will get better, and hopefully we will get true millisecond precision.
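For reference, the underlying timeupdate behaviour is easy to observe on a plain HTML5 video element; this minimal sketch bypasses the YouTube iFrame API entirely and just logs how often the browser fires the event (it assumes the page contains a <video> element):

var video = document.querySelector('video');
var last = performance.now();
video.addEventListener('timeupdate', function () {
    var now = performance.now();
    // Per the spec quoted above, expect something in the 15-250 ms range.
    console.log('timeupdate after ' + (now - last).toFixed(1) +
                ' ms, currentTime = ' + video.currentTime);
    last = now;
});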

Flex Mobile: changing default application frameRate

I'm developing a Flex game which is really jerky and not smooth at all on mobile devices.
I changed the application frameRate to 60 in my MXML file, and the game seems to run smoother (though still not as smooth as it should); the snippet after this question shows the change. Does this have any impact on performance?
Is there any other way to do this? I don't have long or complex operations. I mention this because I found some open-source libraries through which I can use async threads, but I read that this has downsides as well.
I'm really confused because the only objects I have on stage are:
15 Image objects, each one with a Move object attached and an OnClick listener.
4 timers that repeat every 500 ms, 1 second, 2 seconds and 5 seconds.
The longest operation in the listeners and timers is O(n), where n = image count = 15, but most of them are O(1).
All the objects are created on the view's creationComplete event and I reuse them throughout.
Memory is managed correctly; I checked using the memory profiler.
Can you point me in the right direction?
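For reference, the frameRate change mentioned in the question is a single attribute on the root MXML tag (a minimal sketch; s:ViewNavigatorApplication is assumed as the root tag, so adjust to whatever your project uses):

<?xml version="1.0" encoding="utf-8"?>
<!-- frameRate is a request; the runtime drops frames when it can't keep up -->
<s:ViewNavigatorApplication xmlns:fx="http://ns.adobe.com/mxml/2009"
                            xmlns:s="library://ns.adobe.com/flex/spark"
                            frameRate="60">
</s:ViewNavigatorApplication>

A higher frame rate does cost CPU (and battery on mobile), since the player attempts more screen updates per second; it only looks smoother if the per-frame work is cheap enough to keep up.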

Decrease CPU load during a long calculation in an AIR mobile application?

I am facing the following problem: a complex mathematical calculation runs during the loading of the application, and it takes considerably long (about 20 seconds), during which the CPU is used at nearly 100% and the application looks frozen.
Since it is a mobile application, this must be prevented, even at the cost of extending the initial loading time, but there is no direct access to the calculating code, since it is inside a 3rd-party library.
Is there a general way to keep an AIR application from hogging most of the CPU?
On desktop, you would use the Workers API. It's pretty new; I'd recommend it for AS3-only projects. If you use Flex, it's better to wait a few months.
Workers is a multi-threading API which allows you to have a UI thread and a working thread. This will still use 100% of the CPU, but the UI won't get stuck. Here are some links to get you started:
Thibault Imbert - sneak peek,
Intro to as 3 workers,
AS3 Workers livedocs
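A minimal sketch of the Workers pattern (Flash Player 11.4+ / AIR 3.4+): the same SWF is started a second time as the background worker; runHeavyCalculation is a hypothetical stand-in for the real work.

// In the document class: the same SWF runs twice, once as UI, once as worker.
import flash.system.Worker;
import flash.system.WorkerDomain;

if (Worker.current.isPrimordial) {
    // UI thread: launch a second copy of this SWF as a background worker.
    var bg:Worker = WorkerDomain.current.createWorker(this.loaderInfo.bytes);
    bg.start();
} else {
    // Worker thread: heavy computation runs here without freezing the UI.
    runHeavyCalculation(); // hypothetical
}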
However, on mobile you can't use Workers, so you'd have to break your function apart and insert some delays, like callLater or setTimeout. It's hard to restructure a function like that, but if it has a loop you can insert a callLater after every x iterations. You can parametrize both x and the delay to achieve the perfect balance. After callLater is called, the UI will be rendered and events will be generated and caught. If you don't need them, remove their listeners or stop their propagation with a higher-priority handler. A source example of the callLater/setTimeout-in-a-loop idea follows.
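A sketch of that loop-splitting idea using flash.utils.setTimeout (inside a Flex UIComponent you could use callLater instead); CHUNK_SIZE, CHUNK_DELAY and processItem are illustrative names, not part of any library:

import flash.utils.setTimeout;

private const CHUNK_SIZE:int = 100; // the "x" iterations per slice - tune it
private const CHUNK_DELAY:int = 20; // ms yielded to the UI between slices

private function processChunk(start:int, total:int):void {
    var end:int = Math.min(start + CHUNK_SIZE, total);
    for (var i:int = start; i < end; i++) {
        processItem(i); // hypothetical per-iteration work
    }
    if (end < total) {
        // Yield so the screen can render and pending events can be handled.
        setTimeout(processChunk, CHUNK_DELAY, end, total);
    }
}

// Kick it off with, e.g., processChunk(0, itemCount);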

Timers in AS3 can not tick fast enough

I'm making a game in AS3 that requires a huge number of bullets to be fired sequentially in an extremely short amount of time. For example, at a certain point I need to fire one bullet every 1-5 milliseconds for about 1 second. The game runs (smoothly) at 60 FPS with around 800+ objects on screen, but timers don't seem to be able to tick faster than my frame rate (around once every 16 milliseconds). I only have one enterFrame handler going, which everything else updates from.
Any tips?
The 16 milliseconds sounds about right... According to the docs, the Timer class has a resolution no smaller than 16.6 milliseconds:
delay:Number — The delay between timer events, in milliseconds. A delay lower than 20 milliseconds is not recommended. Timer frequency is limited to 60 frames per second, meaning a delay lower than 16.6 milliseconds causes runtime problems.
I would recommend that you create x objects (bullets) off-screen, at different offsets, on each tick, to get the required number of objects within your 1 second; see the sketch below. This assumes that your context allows enemies off-screen to shoot.
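A sketch of that idea (BULLETS_PER_SECOND and spawnBullet are illustrative names): since nothing can tick faster than the frame loop, spawn a whole frame's worth of bullets on each enterFrame and stagger their initial offsets to fake sub-frame timing.

import flash.events.Event;

private const BULLETS_PER_SECOND:int = 400;
private var spawnDebt:Number = 0; // fractional bullets carried between frames

private function onEnterFrame(e:Event):void {
    // At 60 fps this adds ~6.7 per frame; the debt keeps the average exact.
    spawnDebt += BULLETS_PER_SECOND / stage.frameRate;
    while (spawnDebt >= 1) {
        spawnBullet(); // hypothetical: offset each bullet's position slightly
        spawnDebt--;
    }
}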
How can you possibly have 800+ objects on screen? Is each object a single pixel, or is the entire screen just filled? To be fair, I have a 1920x1080 screen in front of me, so even if each object were 2 pixels wide and 2 pixels tall, 800 of them side by side would only span 1600 pixels, not quite the full width. I'm just curious why you would have such a scenario, as I've been toying with game development a bit.
As for the technical question: a Timer is not guaranteed to trigger at the moment its delay expires (just some time after); it depends on how quickly the runtime can get around to processing the code for the timer tick. My guess is that having so many objects is exhausting the CPU (on *NIX systems use top in the console, or on Windows use Task Manager - is it pegging a core of the CPU?). To confirm or deny this, turn off the creation/updating of your objects and see whether the timer then ticks at the correct rate; if it does, the CPU is the bottleneck.
Consider using Stage3D to offload the object drawing to the GPU and free up the CPU to run your Timer. You may also want to consider a game framework like Flixel to help manage your resources, though I don't know whether it takes advantage of the GPU... I just Googled and found an interesting post discussing it:
http://forums.flixel.org/index.php?topic=6101.0

Running the Flash Player over a long time period

Hi, I'm looking into the issues to expect if the Flash Player (version 10) is run over a long period of time, say 24+ hours.
I know that the player has issues with not performing garbage collection properly, and that the weak-listener system is buggy.
I plan on having the Flash app started/monitored by a watchdog/sentinel app written in C/C++/C#, so I plan on refreshing the app periodically.
Does anyone have recommended practices for running the Flash Player over that sort of time scale?
Memory leaks are probably the worst problem. If you manage to make an app run at a 100% stable memory level for 1 hour, it should also run for 24 hours and more (if its activity is stable as well).
Flex's profiler is useful for tracking down leaks and uncollected items.
Making an application run a long time is neither easy nor impossible. One thing you must do is free memory regularly, e.g. at a 300 ms interval, by forcing garbage collection with LocalConnection.
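The LocalConnection trick referred to here is the well-known (and officially unsupported) hack of connecting twice under the same name: the second connect always throws, and in older Flash Players the failed attempt forces a garbage-collection sweep as a side effect.

import flash.net.LocalConnection;

function forceGC():void {
    try {
        new LocalConnection().connect("forceGC");
        new LocalConnection().connect("forceGC"); // always throws ArgumentError
    } catch (e:Error) {
        // Expected; the throw is the point - GC runs as a side effect.
    }
}

// e.g. run it on an interval: setInterval(forceGC, 300);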