iOS 8 vs iOS 7 autorotation - UIViewController

Here is a simple single view controller app:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    self.view.backgroundColor = [UIColor greenColor];
}

- (BOOL)shouldAutorotate
{
    return YES;
}

- (NSUInteger)supportedInterfaceOrientations
{
    return UIInterfaceOrientationMaskLandscapeRight;
}
The output is very different on iOS 8.
It has to do with the difference in UIWindow bounds on iOS 8 vs iOS 7. How do I get iOS 7-like behavior?
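For illustration, one way to see the difference is to log the bounds after rotating to landscape; a minimal sketch:
// On iOS 7 the screen/window bounds are always reported in portrait;
// on iOS 8 they are interface-oriented, so in landscape width > height.
NSLog(@"screen bounds: %@", NSStringFromCGRect([UIScreen mainScreen].bounds));
NSLog(@"window bounds: %@", NSStringFromCGRect(self.view.window.bounds));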

This appears to be a bug in Xcode 6 or iOS 8. After switching from a xib to a storyboard, the problem disappeared.

In iOS 8 the list of supported orientations should be in the Info.plist file; the shouldAutorotate method returns YES by default.
Take a look at the discussion and documentation below:
https://stackoverflow.com/a/24467576/3330421
UIKit Reference:
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIViewController_Class/index.html#//apple_ref/occ/instm/UIViewController/supportedInterfaceOrientations
When the user changes the device orientation, the system calls this
method on the root view controller or the topmost presented view
controller that fills the window. If the view controller supports the
new orientation, the window and view controller are rotated to the new
orientation. This method is only called if the view controller's
shouldAutorotate method returns YES.
Override this method to report all of the orientations that the view
controller supports. The default value for a view controller's
supported interface orientations is
UIInterfaceOrientationMaskAll for the iPad idiom and
UIInterfaceOrientationMaskAllButUpsideDown for the iPhone idiom.
The system intersects the view controller's supported orientations
with the app's supported orientations (as determined by the Info.plist
file or the app delegate's
application:supportedInterfaceOrientationsForWindow: method) to
determine whether to rotate.
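For reference, here is a minimal sketch of that app-delegate hook, assuming you want the whole app limited to landscape right (the mask is only an example):
- (NSUInteger)application:(UIApplication *)application supportedInterfaceOrientationsForWindow:(UIWindow *)window
{
    // The system intersects this mask with the presented view controller's
    // supportedInterfaceOrientations to decide whether to rotate.
    return UIInterfaceOrientationMaskLandscapeRight;
}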


Is it possible to launch a mobile sensor with HTML5, but only with an Android WebView?

I mean without Cordova or another framework. I'm pretty sure I need to write Java code and link it somehow to the HTML5 page through the Android WebView.
If it is possible, could I get a small example of how to connect to the camera or another sensor?
Some of the sensors have a JavaScript API, such as geolocation, orientation (gyroscope) and the battery. To access the camera you could use MediaDevices.getUserMedia; however, this is still at an experimental stage and is not supported by all Android devices. For more information refer to this link.
Look into JavascriptInterface
https://developer.android.com/reference/android/webkit/WebView.html
https://developer.android.com/guide/webapps/webview.html
Specifically, addJavascriptInterface(java.lang.Object, java.lang.String)
class JsInterface {
    @JavascriptInterface
    public void startCamera() { ... }
}
WebView myWebView = (WebView) findViewById(R.id.webview);
WebSettings webSettings = myWebView.getSettings();
webSettings.setJavaScriptEnabled(true);
myWebView.addJavascriptInterface(new JsInterface(), "androidInterface");
Basically, add the JavascriptInterface and enable JavaScript on the web view. Then in your JavaScript you can detect whether the interface exists like so:
if ("undefined" != typeof androidInterface) {
androidInterface.startCamera();
}
Now in the Java code for startCamera, you can do whatever native stuff you need done.

presentControllerWithName in tvOS

In watchOS I used presentControllerWithName to show a view controller and to pass the context this way:
presentControllerWithName("NameOfTheViewController", context:"PassedContext")
What is the equivalent in tvOS?
Best Regards
As noted in other answers, the way to programmatically show another view controller in tvOS (or iOS) is performSegueWithIdentifier:sender:. (Or presentViewController:animated:completion: if you're not getting your VCs from a storyboard flow.)
But you might not need to do it programmatically. In watchOS it's sometimes easiest to do it that way, but in iOS & tvOS, it's common to make controls directly perform storyboard transitions entirely from Interface Builder. Just control-drag (right-click-drag) from the button to another view controller. (More step-by-step instructions in Xcode Help.)
Unlike watchOS, the view controller transitions in iOS & tvOS don't include a way to pass context information. Not as part of the API, at least — you have to include a bit of glue code yourself to do that. How to do that is a pretty common question.
If you're using storyboard segues (generally, you should), the prepareForSegue:sender: method is typically where you do this — you get a reference to the new view controller that's about to be shown, and use some function or property you've defined on that view controller to pass it some context. It often looks something like this:
override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
    if segue.identifier == mySegueIdentifier {
        guard let destination = segue.destinationViewController as? MyViewControllerClass
            else { fatalError("unexpected storyboard segue") }
        destination.someProperty = someValue
    }
}
You can find good examples of this when you create a new Xcode project with the Master-Detail App template.
tvOS is more similar to iOS than it is to watchOS, although they all have some similarities. In tvOS (like in iOS) you can use both performSegueWithIdentifier:sender: or presentViewController:animated:completion: depending on your situation.
For more on this, you can check out the UIViewController class reference.

Using the Google Maps SDK in views other than the main view

I am trying to use the Google Maps SDK for iOS in a subview of the main view which I created in the storyboard and linked to the view controller via an IBOutlet (I called it extraView, subclassed from UIView). When I follow the steps in the SDK getting started guide, the SDK works just fine, but it uses the uppermost view in the hierarchy (the main view), which I don't want. I want my map to be in a smaller portion of the screen and use the rest of the screen for something else. When I attempt to assign the mapView_ object (see the getting started guide) to self.extraView instead of self.view, the whole screen is black and I get an error in the console output:
"Application windows are expected to have a root view controller at the end of application launch"
Has anyone else figured this out? I can't find anything in the documentation and the sample code Google provides does not use a storyboard.
Here's how:
Add a UIView to the view controller where you're working.
Set its class to GMSMapView in the Identity Inspector.
Then control-drag it to your code as you would for any other outlet.
You can lazily instantiate it in its setter...
- (void)setMapView:(GMSMapView *)mapView {
    if (!mapView) {
        // No view supplied (e.g. the outlet is not connected), so create one.
        mapView = [[GMSMapView alloc] initWithFrame:CGRectZero];
    }
    _mapView = mapView;
}
To display a map, Google's sample code becomes...
GMSCameraPosition *camera = [GMSCameraPosition cameraWithLatitude:1.285
                                                        longitude:103.848
                                                             zoom:12];
self.mapView = [GMSMapView mapWithFrame:CGRectZero camera:camera];
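If the map view comes from the storyboard outlet described above, an alternative (a rough sketch reusing the mapView outlet and the example coordinates) is to keep that instance and just move its camera in viewDidLoad instead of replacing it:
- (void)viewDidLoad
{
    [super viewDidLoad];
    GMSCameraPosition *camera = [GMSCameraPosition cameraWithLatitude:1.285
                                                            longitude:103.848
                                                                 zoom:12];
    // Reposition the existing map view so it stays inside the smaller subview
    // laid out in the storyboard, leaving the rest of the screen free.
    self.mapView.camera = camera;
}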
I solved my problem by just removing the loadView code that I took from the example.
Just adding a view as sberley said should work.
Just one thing more: in the Identity Inspector, the attribute that you have to change is Class, at least it is in Xcode 4.5.

supportedInterfaceOrientations method not working on iOS 6 UIViewController

In iOS 5, my application used this method to control its orientation:
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    return (interfaceOrientation == UIInterfaceOrientationLandscapeRight);
}
In iOS 6 I think I'm supposed to do this instead, but it does nothing! My app does not rotate the way I want it to.
- (BOOL)shouldAutorotate
{
    return YES;
}

- (NSUInteger)supportedInterfaceOrientations
{
    return UIInterfaceOrientationMaskLandscapeRight;
}
The problem was how I was adding my view controller.
I replaced this line of code:
[window addSubview:viewController.view];
with this line:
[window setRootViewController:viewController];
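For context, a minimal sketch of where that change lives in the app delegate (MyViewController and the nib name are placeholders for your own root view controller):
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    // Placeholder view controller; use your own class and nib here.
    self.viewController = [[MyViewController alloc] initWithNibName:@"MyViewController" bundle:nil];
    // Set the root view controller instead of adding its view as a subview.
    self.window.rootViewController = self.viewController;
    [self.window makeKeyAndVisible];
    return YES;
}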
When the device changes orientation in iOS 6, the system asks the application which orientations it supports, and the application returns the set of orientations it accepts.
How does the application determine its orientation?
First, the application checks its Info.plist orientations (this is very important for determining which orientation to use at launch).
Secondly, it asks its application delegate for its orientations; these can be specified via the - (NSUInteger)application:supportedInterfaceOrientationsForWindow: method. This effectively overrides the Info.plist setting, and the implementation is optional. By default, iPad apps rotate to all orientations and iPhone apps to all but upside down.
Lastly, the application queries its topmost view controller; this can be a UINavigationController, a UIViewController, etc. That controller specifies how it wants to be presented and whether it wants to autorotate. These view controllers can use the shouldAutorotate and supportedInterfaceOrientations methods to tell the system how to present them.
You must make sure you set the root view controller of your window.
Also, if you are presenting any other view controller full screen, such as a modal view controller, that controller is the one responsible for determining whether the orientation changes.
The solution worked for me too; my case was a bit different because I had a UINavigationController.
I needed every screen in portrait except one. I had to enable both landscape orientations and portrait in the target settings (otherwise it crashes on the only landscape view).
So in this case:
create a subclass of UINavigationController,
put the rotation-specific code there (see the sketch below this list),
and use the subclass instead of UINavigationController.
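A minimal sketch of such a subclass; the class name is my own, and forwarding to topViewController is one common way to write the rotation-specific part:
@interface RotatingNavigationController : UINavigationController
@end

@implementation RotatingNavigationController

- (BOOL)shouldAutorotate
{
    // Let the visible view controller decide.
    return [self.topViewController shouldAutorotate];
}

- (NSUInteger)supportedInterfaceOrientations
{
    return [self.topViewController supportedInterfaceOrientations];
}

@end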
If you are using a UINavigationController, the system sends the supportedInterfaceOrientations message to the UINavigationController itself rather than to the topmost view controller (i.e. the one you would actually want it to ask).
Instead of having to subclass UINavigationController to fix it, you can simply add a category on UINavigationController.
I created a new variable in my singleton, canRotate, and I set it to YES when I want the view controller to support additional orientations.
In the app delegate I added this:
- (NSUInteger)application:(UIApplication *)application supportedInterfaceOrientationsForWindow:(UIWindow *)window
{
    if ([[SharedData sharedInstance] canRotate])
    {
        return (UIInterfaceOrientationMaskPortrait | UIInterfaceOrientationMaskLandscapeLeft | UIInterfaceOrientationMaskLandscapeRight);
    }
    return UIInterfaceOrientationMaskPortrait;
}
It works in my case: I have a custom tab bar, and UINavigationControllers are added as subviews of the rootViewController.
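For completeness, a rough sketch of the singleton flag referenced above; SharedData and canRotate come from the answer, while the viewWillAppear:/viewWillDisappear: placement is an assumption:
@interface SharedData : NSObject
@property (nonatomic) BOOL canRotate;
+ (instancetype)sharedInstance;
@end

@implementation SharedData
+ (instancetype)sharedInstance
{
    static SharedData *instance;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{ instance = [[SharedData alloc] init]; });
    return instance;
}
@end

// In the one view controller that is allowed to rotate:
- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    [SharedData sharedInstance].canRotate = YES;
}

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    [SharedData sharedInstance].canRotate = NO;
}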
I know this sounds pretty elementary, but I was racking my brain trying to figure out orientation issues while testing on my iPhone - I had the physical rotation lock set to portrait - so nothing I changed programmatically mattered. This should be troubleshooting step number 1!

UIViewController device rotation delegate methods are not getting called in iOS5

I am developing an iPhone application. In this application, UIViewController (vc1) presents another UIViewController (vc2). vc1 supports both Portrait and Landscape orientations; vc2 supports only Portrait orientation.
When vc2 is presented, the system asks vc1 shouldAutorotateToInterfaceOrientation: and it returns YES.
In iOS 5 (Beta 7), willRotateToInterfaceOrientation: and didRotateFromInterfaceOrientation: are not called for this sequence, but this works fine in iOS 4. Is this a bug in iOS 5?
I had reported a bug to Apple and I got the following reply:
"Engineering has determined that this issue behaves as intended based on the following information:
The presentation behavior is correct - if it behaved differently in previous versions, that was a bug. The arguably unexpected change in behavior regards the dismiss of VC1 which no longer gets rotation callbacks but it will layout in portrait.
There are other ways to determine what your orientation is when a view controller lays itself out. For various reasons, relying on the rotation callbacks proved to be problematic.
In general, viewController rotation callbacks occur in two cases:
The device orientation changes for view controllers in the window hierarchy
Mixed interface orientation presentations. (Bottom controller only supports portrait, device is in landscape, and a view controller that supports landscape is presented.) However this is arguably a misfeature.
Try using viewWillLayoutSubviews: in iOS 5."
I faced a similar issue when testing my app on iOS 5. The layout of subviews in the main view controller would get messed up if the orientation changed while a modal view controller was active.
What I did was store the current orientation as a flag in the main controller. This flag is updated in two places in the main controller:
willAnimateRotationToInterfaceOrientation:
viewWillLayoutSubviews (this exists on iOS 5 only)
I write all the logic to adjust the subviews by comparing the current orientation with the stored value. If they are different, I update the stored orientation and then update the subviews.
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    // Any orientation is OK
    return YES;
}

- (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
    portrait = UIInterfaceOrientationIsPortrait(toInterfaceOrientation);
    // Code to update subview layout goes here
}

- (void)viewWillLayoutSubviews
{
    BOOL isPortraitNow = UIInterfaceOrientationIsPortrait(self.interfaceOrientation);
    if (isPortraitNow != portrait)
    {
        // DLog is a debug-logging macro defined elsewhere in the project.
        DLog(@"Interface orientation mismatch, correcting");
        portrait = isPortraitNow;
        // Code to update subview layout goes here
    }
}