I am working on an AIR mobile game and want to build a leaderboard that shows scores. For this I am using the GoViral ANE version 3.0.11 from Milkman Games. To fetch the scores I call
facebookGraphRequest("fql", function1, {q:"SELECT value,user_id FROM score WHERE user_id = me() AND app_id = 12345"}, "GET");
but the GVFacebookEvent.FB_REQUEST_FAILED event fires with the error "fql is deprecated for versions v2.1 and higher".
You should upgrade your ANE to the current version (4.6.0).
Milkman Games regularly has to update the ANE to keep up with changes in the Facebook SDK.
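After upgrading you will also need to replace the FQL call itself, since FQL is gone from Graph API v2.1 onward; the same data was exposed through the Graph API scores edge (/{app-id}/scores). A JavaScript-style sketch of handling such a response, where the {data: [{user, score}]} shape is an assumption based on the old Scores API and may differ from what your ANE actually returns:

```javascript
// Sketch: turn a Graph API /{app-id}/scores-style response into a sorted
// leaderboard. The response shape ({data: [{user: {id, name}, score}]})
// is an assumption; adapt the field names to what your ANE delivers.
function buildLeaderboard(response) {
  return response.data
    .map(function (entry) {
      return { userId: entry.user.id, name: entry.user.name, score: entry.score };
    })
    .sort(function (a, b) { return b.score - a.score; }); // highest score first
}
```

The sorting happens client-side here, so the leaderboard order no longer depends on how the server returns the entries.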
I have a simple Cordova app with an HTML5 video element, and I am trying to cast the video to a Chromecast. The Chromecast icon is not shown by default (even when a Chromecast is available), so I am using a plugin, cordova-plugin-chromecast. When I call chrome.cast.initialize, the following error is logged:
"For some reason, not attempting to join route Chromecast, null, false".
I also cannot start a session.
The Chromecast itself is connected; the Google Home app on the device confirms it.
Why is this error logged?
If you're integrating Chromecast/Google Cast into Cordova, the most commonly suggested plugin is cordova-plugin-connectsdk. Check this related SO post for more details.
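Whichever plugin you use, the initialization flow mirrors the Chrome Sender v1 API that these plugins expose: build a SessionRequest from your receiver application ID, wrap it in an ApiConfig, call initialize, and only call requestSession from a user gesture. A sketch with the cast namespace passed in as a parameter (APP_ID and the listener bodies are placeholders, not part of any plugin):

```javascript
// Sketch of the Chrome Sender v1 initialization flow that the Cordova
// cast plugins mirror. `cast` is the chrome.cast namespace; `appId` is a
// placeholder for your registered receiver application ID.
function initCast(cast, appId, onSession, onError) {
  var sessionRequest = new cast.SessionRequest(appId);
  var apiConfig = new cast.ApiConfig(
    sessionRequest,
    onSession,                            // fires when a session is joined
    function (availability) {             // receiver availability listener
      console.log('receiver availability: ' + availability);
    }
  );
  cast.initialize(apiConfig, function () {
    // Initialized. requestSession must be triggered by a user gesture,
    // e.g. a cast-button click handler calling cast.requestSession(...).
  }, onError);
  return apiConfig;
}
```

Calling requestSession outside a user gesture, or before initialize has succeeded, is a common reason no session can be started.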
I need to set up night mode for Google Maps so that it switches automatically based on the phone's light sensor, and manually via a switch. I need an example or some documentation.
Install the Xamarin.Forms.GoogleMaps NuGet package (source code available on GitHub), which already implements this for Xamarin.Forms.
You can refer to the MapStylePage sample available here, which shows how to create custom map styles with the Google Maps styling wizard. Select the Night theme in the wizard and export the corresponding JSON style, which you then use in your Xamarin app.
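For reference, the wizard's Night theme emits a JSON array of style rules along these lines (a truncated fragment for illustration; generate the full set from the wizard rather than copying this):

```json
[
  { "elementType": "geometry", "stylers": [ { "color": "#242f3e" } ] },
  { "elementType": "labels.text.fill", "stylers": [ { "color": "#746855" } ] },
  {
    "featureType": "water",
    "elementType": "geometry",
    "stylers": [ { "color": "#17263c" } ]
  }
]
```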
To detect whether the device is currently in night mode, check the configuration's uiMode:
int currentNightMode = getResources().getConfiguration().uiMode
        & Configuration.UI_MODE_NIGHT_MASK;
switch (currentNightMode) {
    case Configuration.UI_MODE_NIGHT_NO:
        // Night mode is not active; we're in day time
        break;
    case Configuration.UI_MODE_NIGHT_YES:
        // Night mode is active; we're at night!
        break;
    case Configuration.UI_MODE_NIGHT_UNDEFINED:
        // We don't know which mode we're in; assume notnight
        break;
}
Maybe I'm going crazy...
I got a crash log in the Google Dev Console:
5 Feb 02:42 on app version 10
Google Emulator (generic_x86), 4096MB RAM, Android 8.0
Report 1 of 1
java.lang.IllegalArgumentException:
at business.dots.android.collection.ui.detail.redeem.states.RxStateMachine.register (RxStateMachine.kt)
at business.dots.android.collection.ui.detail.redeem.states.UIState.<init> (UIState.kt:15)
at business.dots.android.collection.ui.detail.redeem.states.UIStateIncomplete.<init> (UIStateIncomplete.kt:9)
at business.dots.android.collection.ui.detail.redeem.RedeemFragment.onCreateView (RedeemFragment.kt:58)
at android.support.v4.app.Fragment.performCreateView (Fragment.java:2261)
at android.support.v4.app.FragmentManagerImpl.moveToState (FragmentManager.java:1419)
at android.support.v4.app.FragmentManagerImpl.moveFragmentToExpectedState (FragmentManager.java:1750)
at android.support.v4.app.FragmentManagerImpl.moveToState (FragmentManager.java:1819)
at android.support.v4.app.BackStackRecord.executeOps (BackStackRecord.java:797)
at android.support.v4.app.FragmentManagerImpl.executeOps (FragmentManager.java:2590)
at android.support.v4.app.FragmentManagerImpl.executeOpsTogether (FragmentManager.java:2377)
at android.support.v4.app.FragmentManagerImpl.removeRedundantOperationsAndExecute (FragmentManager.java:2332)
at android.support.v4.app.FragmentManagerImpl.execPendingActions (FragmentManager.java:2239)
at android.support.v4.app.FragmentManagerImpl$1.run (FragmentManager.java:700)
at android.os.Handler.handleCallback (Handler.java:789)
at android.os.Handler.dispatchMessage (Handler.java:98)
at android.os.Looper.loop (Looper.java:164)
at android.app.ActivityThread.main (ActivityThread.java:6541)
at java.lang.reflect.Method.invoke (Native Method)
at com.android.internal.os.Zygote$MethodAndArgsCaller.run (Zygote.java:240)
at com.android.internal.os.ZygoteInit.main (ZygoteInit.java:767)
The thing is, app version 10 was released back in 2017.
As of now, the class RxStateMachine does not exist in any released version.
I introduced the class around the 5th of February, but I never released a version that includes it.
Even if Google had a time machine and this exception came from the future, that still would not be possible, since I would have to use the next version code (11) to release a new version^^
So my question is: how is this possible?
But maybe I'm going crazy...
I am creating an app on iOS in Flash Builder, with as3.
The app uses the Starling plugin: http://wiki.starling-framework.org/start
My app allows users to take photos and customise them. When attempting to access the camera or camera roll on iOS 8, I get the error message "The application lost the device context!".
On Android, I can get around this problem with this line:
Starling.handleLostContext = true;
But I am told that iOS should never lose context (and I haven't seen it lose context on iOS 7 or below).
If I include that line on iOS 8, the application crashes at around the same point, but in this case the app crashes completely and returns me to the home screen instead of displaying the previous message.
I have heard there are restrictions on iOS 8 with regards to the use of 64 bit/32 bit plugins and extensions, but I am not using any ANEs in this particular app. Are there any other areas where 32-bit could be causing problems or is that strictly related to ANEs?
I don't get this error on iOS 7 or below or Android, unless I set handleLostContext to false.
Adobe Scout provides no error message.
Any help would be appreciated.
UPDATE:
This code invokes the camera functionality:
var cameraRoll:CameraRoll = new CameraRoll();
if (CameraRoll.supportsBrowseForImage) {
    trace("camera rolling");
    cameraRoll.addEventListener(MediaEvent.SELECT, imageSelected);
    cameraRoll.addEventListener(flash.events.Event.CANCEL, browseCanceled);
    cameraRoll.addEventListener(flash.events.ErrorEvent.ERROR, galleryMediaError);
    cameraRoll.browseForImage();
} else {
    var alert:Alert = Alert.show("Image browsing is not supported on this device.", "Error", new ListCollection([{label:"OK"}]));
}
UPDATE 2:
I've switched from AIR SDK 17 to 16; it is now more stable but has similar issues.
There is a known issue with the camera roll on iOS which causes Stage3D to lose its context. Your options:
Set Starling.handleLostContext = true; it is possible to lose context on iOS.
Find an ANE (supposedly these exist) that handles the camera roll without losing context.
More Information:
http://forum.starling-framework.org/topic/starling-and-cameraui#post-77339
I can confirm that on iOS 8.3 a Context3D is not lost when opening the CameraRoll using AIR 18.
Make sure you are using AIR 18.
Make sure you are using the latest Starling version.
If the problem persists, Starling is likely the cause. In that case you can:
Report the problem and wait for an update.
Avoid using Starling while requesting the CameraRoll (turn Starling off and display normal Bitmaps).
Drop Starling and use another engine, or write your own.
I've installed the Samsung Smart TV SDK and I'm trying to get the current frame/time of the playing video.
The video is already running and everything seems to work just fine, but when I try to get the current time using $('#video1')[0].currentTime, I only get the value in whole seconds, without the milliseconds.
I'm developing a program where I need the milliseconds, but it seems impossible to get them through the SDK. What am I doing wrong? Any tips? Or is it an SDK limitation, and I'll never be able to get the current "REAL" time?
I found some solutions online and read a lot about the HTML5 video tag, but the SDK documentation seems to lack a lot of information, and some of it is even wrong...
I followed this example: http://jsfiddle[dot]net/893aM/1/
It works just fine in any browser, but when I run it on the Smart TV I only get the time in seconds; no luck with my precious milliseconds...
Thank you.
Please take a look at the following links:
Using SEF Plugin and SEF Plugin API
SEF Plugin Player
"OnCurrentPlaybackTime" event from the SEF Plugin Player.
EDIT:
Player.OnCurrentPlayTime = function (milliseconds) {
    // use the 'milliseconds' parameter
};
or, if your player hands you a "hh:mm:ss" string instead, you can convert it yourself:
Player.OnCurrentPlayTime = function (time) {
    var hms = time.toString().split(":");
    var milliseconds = (parseInt(hms[0], 10) * 60 * 60
                      + parseInt(hms[1], 10) * 60
                      + parseInt(hms[2], 10)) * 1000;
    alert(milliseconds);
};
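Going the other way, once you have a millisecond position from the player you may want a display string that keeps the precision video.currentTime loses on the TV. A small helper in plain JavaScript (nothing Samsung-specific):

```javascript
// Format a millisecond playback position as "hh:mm:ss.mmm".
function formatPlayTime(ms) {
  function pad(n, width) {
    var s = String(n);
    while (s.length < width) { s = '0' + s; }
    return s;
  }
  var hours = Math.floor(ms / 3600000);
  var minutes = Math.floor(ms / 60000) % 60;
  var seconds = Math.floor(ms / 1000) % 60;
  var millis = ms % 1000;
  return pad(hours, 2) + ':' + pad(minutes, 2) + ':' +
         pad(seconds, 2) + '.' + pad(millis, 3);
}
```

For example, formatPlayTime(3723456) yields "01:02:03.456".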