I know you can fake full screen by expanding a window and eliminating the title bar, status bar, and other chrome. I'm not interested in that; I want to know about "real" full screen mode (I don't know what else to call it), like in games.
What exactly is full screen mode?
Which win-api should I use to achieve it?
Can it be used to play movies in full screen? I know Windows Media Player uses a fake full screen, because I can "cut" through it and see the desktop (using the regions win-api).
Can I "cut" through "real" full screen the way I can through a window (using the regions win-api), or does it write directly to video memory so there is nothing "under" it?
Thanks!
If you want to make games on Windows in full screen, the best option is XNA. It uses DirectX underneath, but hides a lot of the implementation details and plumbing so the developer can get straight to work on the game itself.
XNA is freely downloadable, and has good documentation.
XNA Game Studio 4.0 can be downloaded here.
...and you might want to support the "fake" fullscreen mode in addition to "real" fullscreen - it's very nice for those of us who run multi-monitor systems.
If you don't want to use DirectX, create a window and call ChangeDisplaySettings with the CDS_FULLSCREEN flag. OpenGL applications use this approach to go fullscreen.
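A minimal sketch of that route (Win32/C++), assuming you have already created a plain window; the mode values here are placeholders, so enumerate what the hardware actually supports with EnumDisplaySettings first:

```cpp
#include <windows.h>

// Switch the primary display into a dedicated fullscreen mode.
bool EnterFullscreen(int width, int height, int bitsPerPixel)
{
    DEVMODE dm = {0};
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = width;
    dm.dmPelsHeight = height;
    dm.dmBitsPerPel = bitsPerPixel;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;

    // CDS_FULLSCREEN marks the change as temporary: Windows restores
    // the desktop mode when the application exits.
    return ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL;
}

void LeaveFullscreen()
{
    // Passing NULL reverts to the mode stored in the registry.
    ChangeDisplaySettings(NULL, 0);
}
```

Pair this with a borderless window covering the screen and you have the "real" fullscreen that games use.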
As far as a user is concerned, full screen is just when a window takes up the entire screen such that they no longer see any window borders or other desktop stuff.
As you know, not all full screens are created equal.
'Proper' full screen is where the program takes control of the screen. When a game uses this mode, it can change the resolution of your screen. If you have ever quit an old game and found your desktop icons all messed up, this is why: for the duration of the game, your desktop was running at a lower resolution.
With 'borderless full screen', the program window is stripped of any borders, the title bar, the frame and so on, and is just a rectangle of pure rendering. If you then make this rendering context the same size as your desktop, you get the effect of full screen.
Doing borderless is usually the more user-friendly way these days, as it is easier to 'tab out' since the other programs are still around graphically. 'Proper' full screen gives you full control of the hardware, so in theory you have more power available for your program, but it means you have to wait for things to reinitialise when you tab out.
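For comparison, a minimal borderless sketch (Win32/C++); "MyWndClass" is a hypothetical window class you have already registered:

```cpp
#include <windows.h>

// A bare WS_POPUP rectangle the size of the primary monitor:
// no frame, no title bar, just a surface to render into.
HWND CreateBorderlessFullscreen(HINSTANCE hInstance)
{
    int w = GetSystemMetrics(SM_CXSCREEN);
    int h = GetSystemMetrics(SM_CYSCREEN);

    return CreateWindowEx(0, TEXT("MyWndClass"), TEXT("Game"),
                          WS_POPUP | WS_VISIBLE,
                          0, 0, w, h,
                          NULL, NULL, hInstance, NULL);
}
```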
What you do with your rendering context is up to you, so yes, you can use it to play videos. It would not matter whether you are in 'proper' full screen or not; the rendering code would be the same.
As for cutting through proper full screen windows, I am not sure, but I think there would be nothing else to see: there is only your program.
As for which win-api: there is only one Windows API, but I think you mean which windowing library, and as this is getting to be a long answer already, I shall just say it depends a lot on what you want from it.
Please feel free to leave comments if you need me to clarify or expand on any points.
My MS Access (2010) database uses forms to display and manipulate data. Recently, when presenting these, I found out that using the forms with a video projector leads to a severe problem: the forms appear 'zoomed in'.
Therefore I have two questions; thanks in advance for any answers!
1) How are twips on a video projector calculated? Theoretically this should depend on the distance between projector and screen, which wouldn't make much sense. (I need this information to be able to explain the problem, thank you very much.)
2) How can this be configured? Is it possible to use VBA or the Win32 API to achieve this?
I don't think this problem has anything to do with twips or with the video projector. Any monitor (analogue, digital, projector, etc.) shows the same picture if it has the same resolution. If you set the output of your computer to e.g. 1024 * 768 pixels and the output device (analogue or digital monitor, projector, etc.) uses the same native resolution, then the picture will look the same on each device.
Access works in pixels. If you have a form optimized for a specific number of pixels, then this is what Access uses. If you have a higher resolution, the form will not fill the screen, and if you have a smaller resolution, the form will not be completely shown on the screen.
I guess what happens is that your PC uses an output of e.g. 1600 * 1200 pixels but your projector can't show this natively. So the projector tries to scale 1600 * 1200 down to e.g. 1024 * 768, and this will never look good.
I think you have two options: check the resolution your projector expects and set your PC to the same resolution, or change your application (or the projector).
In line with what Edgar has suggested, regardless of the display type (projector, monitor, etc.) the issue will remain the same. In this scenario, the problem is that the form is designed for a given screen size, say 1600x900 (16:9) or 1920x1200 (16:10) or whatever you have chosen to design the form for.
The projector is likely not the same resolution. Many smaller projectors are either 1024x768 or 1280x720, both of which are likely smaller than your computer monitor in terms of resolution. While it is true that you could redesign the forms to the projector's exact pixel dimensions, an easier way, which wouldn't require editing any content, would be to send the projector the same resolution that you designed the forms for.
For example, if your forms fit nicely on a 1920x1080 pixel space but your projector is 1024x768 then you could open display preferences on your computer and set the output to the projector to be 1920x1080. The projector will then scale the image to fit onto its 1024x768 panel.
There are many variables here, and you may run into equipment limitations with this approach, such as the projector not being capable of ingesting and scaling a given resolution that you are forcing into it. In that case you could put a hardware video scaler inline between the computer and the projector to perform the scaling for you. An example of a device capable of this is a Barco ImagePro, though there are many more cost-effective solutions on the market as well.
I take several photos using an iPad, in different orientations (rotating the iPad by 90 degrees each time).
Then I download them to my Windows laptop, and what do I see? They don't look the way they did on the iPad's screen. Actually, only one image is correct; the others are rotated.
I found this problem in browsers (FF & Chrome). When you display an image using the img HTML tag, it is rotated. But if you display it by entering the image's full URL, it's totally OK.
I checked the pictures via Safari on the iPad and they look fine (in an img tag), but they don't on Windows.
Is there some metadata that says the image should be rotated, or something like that?
As you know, the iPad has a hardware device in it that tells it the device orientation, which is how it determines how to display the screen to the user. While the hardware instantly knows how it's positioned at any given time, they seem to have engineered a lag into the software registering this change to improve the user experience (so the screen doesn't flip back and forth several times in a single second). However, this lag might lead to some unexpected results when taking a photo.
I have found that the orientation is most often unexpected with the iPhone / iPad when I am taking photos with my screen facing downward (i.e. taking a picture of something on a tabletop, for example). I assume landscape but get portrait, and vice versa. In that scenario (downward / flat), it is more difficult for the device to know what my intended orientation is.
I find the best way to resolve this is to hold the device in the clear orientation that I want for a second before I take the photo, then point the camera downward and snap.
The orientation data is included in an image's metadata (a.k.a. EXIF data). You can take a look here for more information:
http://www.daveperrett.com/articles/2012/07/28/exif-orientation-handling-is-a-ghetto/
It is relatively easy to retrieve (and modify) the EXIF data in software. If you are doing lots of batch processing in some custom way, libraries are available to help with this for a variety of frameworks. But for small jobs, the absolute simplest way is to click the little "rotate" icon in the Windows image viewer, which will make the update for you.
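If you do need to read it yourself, here is a hedged sketch in C using libexif (an existing open-source library; link with -lexif). Values 1, 3, 6, and 8 mean normal, 180 degrees, and the two 90-degree rotations:

```c
#include <libexif/exif-data.h>
#include <stdio.h>

// Returns the EXIF Orientation value of an image, or -1 if absent.
int get_exif_orientation(const char *path)
{
    ExifData *ed = exif_data_new_from_file(path);
    if (!ed)
        return -1; // no EXIF block at all

    int orientation = -1;
    ExifEntry *entry =
        exif_content_get_entry(ed->ifd[EXIF_IFD_0], EXIF_TAG_ORIENTATION);
    if (entry)
        orientation = exif_get_short(entry->data, exif_data_get_byte_order(ed));

    exif_data_unref(ed);
    return orientation;
}

int main(int argc, char **argv)
{
    if (argc > 1)
        printf("orientation = %d\n", get_exif_orientation(argv[1]));
    return 0;
}
```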
I'm working on a little Flash game that has a couple of GUI components. I'm having a bit of trouble coming up with a good design that can support a min spec of 768x1024 and a max spec of 1200 x 1920.
An example of my design:
I have a List component that hugs the top right corner. When the browser is resized I slide the component over, keeping it always 5 pixels from the edge of the browser. Once the stage has hit a minimum size (1024 for width) I stop sliding the GUI over, otherwise it would start overlapping with other GUI elements on the left side (see the sketch below). This seems to be similar to how Farmville and other popular Flash games handle their GUIs: they keep them at the same scale for all resolutions and only translate them to keep them centered and so on.
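A rough sketch of that behaviour as an ActionScript 3 frame script; the Sprite stands in for the actual List component:

```actionscript
import flash.display.Sprite;
import flash.display.StageAlign;
import flash.display.StageScaleMode;
import flash.events.Event;

const MIN_WIDTH:int = 1024;  // stop sliding below this stage width
const MARGIN:int = 5;        // gap from the browser edge

var list:Sprite = new Sprite();  // stands in for the List component
list.graphics.beginFill(0x333333);
list.graphics.drawRect(0, 0, 200, 300);
addChild(list);

stage.scaleMode = StageScaleMode.NO_SCALE; // we reposition; Flash never scales
stage.align = StageAlign.TOP_LEFT;
stage.addEventListener(Event.RESIZE, positionList);

function positionList(e:Event):void
{
    // Clamp so the component never slides left of the minimum layout
    // width and overlaps the GUI anchored to the left side.
    var effectiveWidth:Number = Math.max(stage.stageWidth, MIN_WIDTH);
    list.x = effectiveWidth - list.width - MARGIN;
    list.y = MARGIN;
}
positionList(null); // position once at startup
```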
If I knew the min spec would always be 768x1024 then I could deal with that, but the problem is that is rarely the case. The URL bar and other browser menus cut into your height space. Also, when I'm running Ubuntu (my version has that menu bar on the side) I lose some width from my stage. I'm guessing the Windows taskbar at the bottom also cuts into a browser's height space. This ends up cutting parts of my GUI off and can make the game unplayable.
I believe I can only truly expect 768x1024 when in fullscreen mode.
My question is:
What is a safe minimum resolution, accounting for what Windows/Ubuntu menu bars and browser toolbars will take up?
The safe area varies a lot depending on the browser/OS:
http://designerstoolbox.com/designresources/safearea/
Back in the day (when full-Flash sites were popular) the usual way to deal with this was, as bokan says, to make your SWF always fill the browser's viewport unless the viewport was smaller than a certain limit, in which case you showed scrollbars. swffit was a quite popular script to help with this. The typical minimum safe area used to be around 990x600 (to support 1024x768).
Being a game, though, I'd recommend defining the minimum area where the game is playable, and if you detect a smaller viewport, requiring the user to go fullscreen (I wouldn't rely on scrollbars for any game, unless it's a super-cool-parallax-html5 game, of course :P).
There is no safe minimum size. Some users have 6 bars (favorites, yahoo, google, stumble, +2 spyware).
What you can do is resize your Flash animation to 100% of a div and set a min-width and min-height on that div (a sketch follows below). This way, users will see that they need to make the window bigger.
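A minimal sketch of that embed, assuming the SWF is called game.swf and the playable minimum is 1000x600:

```html
<!-- The SWF fills a div that refuses to shrink below the game's
     minimum playable size, so the browser shows scrollbars instead
     of cutting the GUI off. -->
<div style="width:100%; height:100%; min-width:1000px; min-height:600px;">
  <object data="game.swf" type="application/x-shockwave-flash"
          width="100%" height="100%"></object>
</div>
```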
The resolution depends on your target audience. Do you want users to play on netbooks? Just check your game with a typical netbook resolution.
You should take a look at http://www.greensock.com/liquidstage/ as it's what you need. The GreenSock APIs are awesome.
I think 1000x600 is a good choice.
We use width x height, so your 768x1024 would be tall.
When I put my Flash game into full screen interactive display mode and set stage.fullScreenSourceRect so that it uses hardware scaling, the performance at any resolution seems to be much better than if I do it without the fullScreenSourceRect. I'd really like to use this feature, but the problem is that it seems to be using a 4x blur or some similar algorithm for scaling that leaves everything looking very blurry.
It seems like an odd choice to have a blur as the only available scale mode. I would be much happier with a simple nearest-neighbor. I can't find anything about changing the scale algorithm in the documentation. Is there any way to do this while still using hardware acceleration?
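For reference, this is the kind of setup being described (the 640x480 values are just examples):

```actionscript
import flash.display.StageDisplayState;
import flash.events.MouseEvent;
import flash.geom.Rectangle;

// Browsers only allow entering full screen from a user gesture,
// so do it in a click handler.
stage.addEventListener(MouseEvent.CLICK, goFullScreen);

function goFullScreen(e:MouseEvent):void
{
    // Render internally at 640x480; the rect must be set before
    // changing displayState, and the GPU then stretches the frame to
    // the screen with its built-in smoothing (the blur in question).
    stage.fullScreenSourceRect = new Rectangle(0, 0, 640, 480);
    stage.displayState = StageDisplayState.FULL_SCREEN_INTERACTIVE;
}
```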
What is the intended platform for this game? If it is mobile, then there are standards that can be implemented to ensure the ideal resolutions. If it is for the web, then I would recommend defining rigid dimensions. Otherwise, in my experience, it's best to develop for your display's native resolution. Unless your code dynamically draws objects to the stage, there will always be some kind of rasterizing/interpolation. You can also get the screen's resolution and have the code make adjustments accordingly (see the snippet below): How do I get the user's screen resolution in ActionScript 3?
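The linked approach boils down to the Capabilities class, e.g.:

```actionscript
import flash.system.Capabilities;

// The user's full screen resolution, independent of the stage size.
var screenW:Number = Capabilities.screenResolutionX;
var screenH:Number = Capabilities.screenResolutionY;
trace("display is " + screenW + "x" + screenH);
```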
I've developed a lot of touch screen applications that span multiple displays with different resolutions, and AIR has some great options in its 'Screen' class to make the process easier.
I would like to create a Flex desktop application that is always in front of other applications (appWindow.alwaysInFront = true). It should look like a tiny bar at the top of the screen (e.g. width = screenWidth, height = 50px). I know how to do that. But I have a problem with other applications: when I maximize them, they end up under my application. Is there any way to tell the system that the maximized area for other apps should be smaller than the default?
Thanks for your answer.
You cannot do this with Flash code, because you're interfering with external applications. At best, you could write some native code in another language and use AIR to execute that code, but I can't really see that working out well, or at least being anything less than a massive undertaking.
If you do want to attempt this, however, you can find some info about executing native code from AIR here: How can i on button press execute a command in the command prompt and get back the output in ActionScript?
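For what it's worth, the native mechanism Windows provides for exactly this is the app bar API. A hedged C++ sketch of the kind of helper you would have to ship alongside the AIR app (hwnd is assumed to be your 50 px bar's window):

```cpp
#include <windows.h>
#include <shellapi.h>

// Registering a window as an "app bar" reserves a strip of the screen,
// so maximized windows stop underneath it instead of going behind it.
void ReserveTopStrip(HWND hwnd, int barHeight)
{
    APPBARDATA abd = {0};
    abd.cbSize           = sizeof(abd);
    abd.hWnd             = hwnd;
    abd.uCallbackMessage = WM_APP + 1; // app-defined notification message
    abd.uEdge            = ABE_TOP;

    SHAppBarMessage(ABM_NEW, &abd);      // register the app bar

    abd.rc.left   = 0;
    abd.rc.top    = 0;
    abd.rc.right  = GetSystemMetrics(SM_CXSCREEN);
    abd.rc.bottom = barHeight;

    SHAppBarMessage(ABM_QUERYPOS, &abd); // let the shell adjust the rect
    abd.rc.bottom = abd.rc.top + barHeight;
    SHAppBarMessage(ABM_SETPOS, &abd);   // commit the reserved area

    MoveWindow(hwnd, abd.rc.left, abd.rc.top,
               abd.rc.right - abd.rc.left,
               abd.rc.bottom - abd.rc.top, TRUE);
}
```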