How does TVJS detect swipes on the Apple TV remote?

I am building an app using TVJS and TVML for Apple TV. Is there an event to help me detect when the user has swiped the Siri Remote's touchpad? When I'm showing an image full-screen and the user swipes, I want to move to the next or previous image. But I can't seem to find any event for detecting these swipes. Thanks!

I don't think you can detect these swipes and react to them manually. TVML and TVJS use standard templates with particular interactive elements that can be laid out on screen; the Apple TV then handles moving between those elements by interpreting the commands from the remote.
TVJS only lets you listen for a limited set of events:
enum TVElementEventType : Int {
    case Play
    case Select
    case HoldSelect
    case Highlight
    case Change
}
You should be able to get the functionality you described by using the right template or elements, such as the oneupTemplate, which turns left/right swipes into next/previous navigation for you.
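To make that concrete, here is a minimal, untested sketch of a oneupTemplate document with listeners for the events above, in TypeScript-flavored TVJS. navigationDocument and DOMParser are the globals TVMLKit provides at runtime, and the image URLs are placeholders:

declare const navigationDocument: any; // TVJS global provided by TVMLKit

// A oneupTemplate shows one item full-screen; the system itself turns
// remote swipes into movement between the lockups below.
const markup = `<?xml version="1.0" encoding="UTF-8" ?>
<document>
  <oneupTemplate>
    <section>
      <lockup><img src="https://example.com/photo1.jpg" width="1920" height="1080" /></lockup>
      <lockup><img src="https://example.com/photo2.jpg" width="1920" height="1080" /></lockup>
    </section>
  </oneupTemplate>
</document>`;

const doc = new DOMParser().parseFromString(markup, "application/xml");

// The enum cases above arrive in TVJS as lowercase DOM-style events:
doc.addEventListener("select", (event: Event) => {
  console.log("select on", (event.target as Element).tagName);
});
doc.addEventListener("highlight", (event: Event) => {
  console.log("highlight on", (event.target as Element).tagName);
});

navigationDocument.pushDocument(doc);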
Without knowing how you've implemented this so far, there isn't much else I can recommend. Please show the code you are using for more specific advice.

Related

Adobe AIR - Starling/Feathers | Controls and device simulator

I have a few questions regarding creating an app in Adobe AIR using Starling and Feathers.
So far I have created a very simple app that has a Feathers List control with static data provided to its dataProvider. According to the code it should work fine, but there are three major issues I am facing.
1: Touch/Click Positions
I am using:
list.addEventListener( Event.CHANGE, list_changeHandler );
The problem is that the click coordinates are not correct: clicking the 3rd item triggers the 4th; to trigger the 3rd, the 2nd item has to be clicked about halfway down, and so on.
2: Nothing, without Theme
I am using a custom theme that came along with a tutorial. If I don't use the theme, I am somehow unable to see anything on the screen.
3: Resolution (Device Simulator) Problem
Buggy as it is, the app works with the theme, but it doesn't fit the resolution of any device simulator, whether iPad, iPhone 4, or any Android simulator.
Also, can anyone please explain the significance and use of the Context3D render mode in the Starling class?
Any help is appreciated. Thanks in advance.
Starling is a Stage3D framework that renders content directly on the graphics card using Context3D. Everything displayed by Starling sits underneath the regular display list. Feathers is a component framework based on Starling.
Stage3D cannot handle any mouse operations, so Starling and Feathers simulate all their mouse events (those events never actually happen anywhere; they are synthesized by calculating the mouse position on the stage).
Not sure; I've never used Feathers' theming.
Starling does not handle screen density and DPI calculations; if you want your app to fit any screen, you'll have to handle that yourself.
I think you should study the example carefully. If you want to use any Feathers component, you have to use either a Feathers theme or a custom theme.
If you use a Feathers theme, you need to provide the theme path and initialize the theme before using any component; after that you can use components anywhere. Without a theme you will not see anything.
1: Touch/Click Positions
Provide minTouchHeight (and the related minimums) for the DefaultListItemRenderer in your theme class, like this:
renderer.minWidth = this.gridSize;       // visual size of each row
renderer.minHeight = this.gridSize;
renderer.minTouchWidth = this.gridSize;  // hit-test size; keeps the touch
renderer.minTouchHeight = this.gridSize; // target aligned with the visible row
2: Nothing, without Theme
3: Resolution (Device Simulator) Problem
For both of these, follow the example theme shipped with the Feathers library:
feathers-2.1.1\themes\MetalWorksMobileTheme\source\feathers\themes

Vimeo player custom controls

I'm using Froogaloop to work with Vimeo in order to use it on my sites. My problem is that I'd like to customize the player, as shown in the "Customize to your taste" section here: http://vimeo.com/player
The last item there is a simplified control interface that I would like to use, but I haven't been able to achieve it. These are the parameters we seem to be able to tweak: http://developer.vimeo.com/player/embedding#universal-parameters
Any idea about how to work with these features?
Thanks!
Plus and PRO members can choose to customize their embeds (https://vimeo.com/s/tce). Those options are not available as embed code parameters.
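What everyone does get are the universal parameters, which go in the iframe URL's query string, plus the Froogaloop JavaScript API once api=1 is set. A rough TypeScript sketch, assuming Froogaloop is loaded on the page; the element ID and video ID are placeholders:

declare function $f(el: HTMLIFrameElement): any; // Froogaloop global

// Universal parameters are plain query-string options on the embed URL.
const src = "https://player.vimeo.com/video/76979871" // placeholder video ID
  + "?api=1&player_id=player1"     // enable the JS API so Froogaloop can attach
  + "&title=0&byline=0&portrait=0" // hide the title, byline, and portrait chrome
  + "&color=ff9933";               // accent color for the player controls

const iframe = document.getElementById("player1") as HTMLIFrameElement;
iframe.src = src;

// Froogaloop wraps the iframe; method calls go through api().
const player = $f(iframe);
player.addEvent("ready", () => {
  player.api("setVolume", 0.5);
});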

WinRT barcode scanner component

I have a Windows Store (Metro) application. I need to add support for scanning barcodes.
I tried using ZXing first. From what I was able to get working, you actually need to click and save an image for it to do the processing. There's no nice red-line "scanner" overlay, nor does it process a live feed. This isn't a very elegant solution; it works far better on Android. Basically, this won't work, as I need constant video and a constant search for a barcode in focus.
This blog (http://www.soulier.ch/?p=1275&lang=en) mentions that extrapolating a frame out of a WinRT video stream is not allowed in managed code which means I'd need to use C++.
So, are there any components out there that do this? Anything free or paid that I can get that would be written in C++ and can find and extrapolate a barcode? Learning C++ is not on my bucket list.
You can capture frames while displaying a preview with C# only. Here's an example control that does it:
https://winrtxamltoolkit.codeplex.com/SourceControl/latest#WinRTXamlToolkit/Controls/CameraCaptureControl/CameraCaptureControl.cs
Basically, you need to create a MediaCapture object and associate it with a CaptureElement control to display the preview. Then you can use CapturePhotoToStreamAsync() to capture a frame to a stream in your chosen encoding format, and have a go at it with your barcode-reading code.
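The toolkit control above is C#/XAML, but the same WinRT classes are projected into JavaScript for Windows Store apps, so the flow can be sketched without C++. A rough, untested TypeScript sketch; the Windows namespace is declared loosely and decodeBarcode stands in for whatever ZXing call you use:

declare const Windows: any; // WinRT JavaScript projection (Windows Store JS apps)
declare function decodeBarcode(stream: any): void; // placeholder for your ZXing call

async function scanLoop(video: HTMLVideoElement): Promise<void> {
  const capture = new Windows.Media.Capture.MediaCapture();
  await capture.initializeAsync();

  // JS analogue of binding MediaCapture to a XAML CaptureElement:
  // point a <video> element at the capture object for the live preview.
  video.src = URL.createObjectURL(capture);
  video.play();

  // Repeatedly grab a frame as a JPEG and hand it to the decoder.
  const jpeg = Windows.Media.MediaProperties.ImageEncodingProperties.createJpeg();
  while (true) {
    const stream = new Windows.Storage.Streams.InMemoryRandomAccessStream();
    await capture.capturePhotoToStreamAsync(jpeg, stream);
    decodeBarcode(stream);
  }
}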
I made a lib for WinRT using ZXing & Imaging SDK.
It works well (but does not include any additional focus feature).
There is a lib and a sample app that you can try.
It works for barcodes and QR codes (barcodes by default, but just change the optional parameter of the scan function to use QR codes).

How to build iOS7 Style Audio Recorder App

I am trying to build an audio recorder app similar to the one built into iOS 7, and I'm looking for guidance on which controls to use. I understand I will use a table view for the list of previous recordings and a UIView for the recording view at the top; on tapping Record, the table view adjusts and the black recording view moves down.
How should I implement the endless horizontally scrolling view? Should I use a collection view and keep adding elements to the model array as time increments? Also, what should I use for the timer? Is there something like JavaScript's setInterval in Objective-C that I can use to keep updating the UI at a regular interval?
If someone also knows of a cocoa pod or sample code that would be greatly appreciated.
For recording, the simplest option is AVAudioRecorder. Here is a simple implementation of an audio recording app: https://github.com/calmez/Recorder. AVAudioRecorder has simple metering methods that let you read the output volume of each channel.
Honestly, though, Apple probably uses Core Audio to capture the audio because it is more optimized. Novocaine is a good Core Audio engine that could get you started: https://github.com/alexbw/novocaine
For rendering the waveform, I would guess Apple uses OpenGL; I don't see how to do it easily and efficiently otherwise. You could draw it using the standard UIView drawing APIs, as this project does (https://github.com/fulldecent/FDWaveformView), but I don't see that animating well.
For the timer, there is NSTimer, which is the closest analogue of JavaScript's setInterval.

How to know when the Map control was first manipulated?

I am using the Map control in Windows Phone 8.
I need to implement a page where the user can select their location using the map control.
I am trying to detect when the map is first manipulated by the user.
Some background info:
I saw that when the control is shown, it automatically centers the world map, and the CenterChanged event is raised.
I am not able to understand how ManipulationStarted, ManipulationDelta, and ManipulationCompleted work:
the first time I drag, ManipulationStarted is not called, only ManipulationCompleted.
I could treat the first manipulation by the user as the second time CenterChanged is fired,
but this is a hack, or a guess; I am not happy not having a good understanding of how it works.
The Map control intercepts and handles manipulation events, and as such you don't get all of them. Remember, once routed events are marked with e.Handled = true, they no longer bubble up.
Depending on your scenario, the UseOptimizedManipulationRouting property that WP8 exposes might prove useful. Setting UseOptimizedManipulationRouting = false causes Map, Pivot, and other controls not to swallow events for nested controls.
If that doesn't help, have a look at the following Nokia Wiki article, where the author ran into the same problem as you and used Touch.FrameReported to get out of it: http://www.developer.nokia.com/Community/Wiki/Real-time_rotation_of_the_Windows_Phone_8_Map_Control