Google Chrome kiosk mode compatibility on a touch screen monitor - google-chrome

I'm trying to develop a kiosk web application that runs in Google Chrome's kiosk mode and loads automatically after start-up.
http://www.sitepoint.com/google-chrome-kiosk-mode/
The kiosk web application also uses a virtual keyboard plugin for Google Chrome for the text inputs.
http://xontab.com/Apps/VirtualKeyboard
I'm planning to set up a computer unit with a touch screen monitor for the kiosk.
Note: it's my first time developing a web application that uses Google Chrome's kiosk mode, and I don't have a touch screen monitor for testing. I wanted to ask this question of developers who have experience with this.
My questions are:
Does Google Chrome in kiosk mode automatically detect my touch screen monitor?
Does Google Chrome automatically enable touch features, such as swiping to scroll up and down, when my web application is in kiosk mode?

A touch screen is an input device just like a computer mouse: Google Chrome receives touch events the same way it receives mouse events (although the events themselves are different).
The annoying thing when you start working with touch screens is that the standard click event we are used to is triggered after a delay compared to a mouse click. You should listen for the tap event or use a library such as https://github.com/ftlabs/fastclick
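For example, a minimal FastClick sketch (assuming the library linked above is already loaded on the page) looks like this:

```javascript
// Minimal FastClick sketch, assuming the library above is already loaded.
// Attaching it to the body removes the ~300 ms delay before click fires.
if ('addEventListener' in document) {
  document.addEventListener('DOMContentLoaded', function () {
    FastClick.attach(document.body);
  }, false);
}
```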
To make a long story short, developing a kiosk application integrated with a touch screen is similar to developing a mobile website. You should probably use a JavaScript library to support all kinds of touch events such as tap, swipe, etc. See http://hammerjs.github.io/
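As a rough illustration, a Hammer.js sketch for reacting to vertical swipes on a scrollable element might look like this (the #content id and the 200 px scroll step are placeholders):

```javascript
// Rough Hammer.js sketch; '#content' and the 200 px step are placeholders.
var el = document.getElementById('content');
var hammer = new Hammer(el);

// Swipe recognition is horizontal-only by default; enable vertical swipes.
hammer.get('swipe').set({ direction: Hammer.DIRECTION_VERTICAL });

hammer.on('swipeup swipedown', function (ev) {
  // Swiping up moves the content down the page, and vice versa.
  el.scrollTop += (ev.type === 'swipeup' ? 200 : -200);
});
```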
You may also find this page useful: http://peter.sh/experiments/chromium-command-line-switches/

I know this answer is six years late, but for anyone reading this: you can run Google Chrome in kiosk mode with custom options. This can be done by creating a new account and right-clicking (in Windows 10) on the icon for this account.
Under Properties, add the following flags to optimize the application for touchscreen use:
--touch-events --enable-viewport
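For example, the shortcut's Target field could end up looking roughly like this (the Chrome install path and the URL are placeholders for your own setup):

```
"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" --kiosk --touch-events --enable-viewport https://your-kiosk-app.example/
```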

Related

How can I track the user's choice in Chrome when the native app install banner shows, using Adobe Analytics

My mobile website, when viewed in Chrome, pops up a banner asking the user whether he/she wants to install the native app. I am able to enable this by using the manifest.json. Since this pop-up behavior is driven by Chrome, I am unsure how to track it using Adobe Analytics. I can use beforeinstallprompt to get a callback into my application before the app install banner is shown (Chrome lets me listen to this event), but how can I propagate this to Adobe DTM for analytics purposes? I want to know how many people saw this banner, how many dismissed it, etc.
Thanks
K
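For what it's worth, here is a rough, untested sketch of one way this could be wired up, assuming Adobe DTM is on the page; the direct-call rule names passed to _satellite.track are placeholders you would define yourself:

```javascript
// Untested sketch: 'banner shown' / 'banner accepted' / 'banner dismissed'
// are placeholder direct-call rule names to be defined in Adobe DTM.
window.addEventListener('beforeinstallprompt', function (e) {
  _satellite.track('banner shown');

  // userChoice resolves once the user accepts or dismisses the banner.
  e.userChoice.then(function (choice) {
    if (choice.outcome === 'accepted') {
      _satellite.track('banner accepted');
    } else {
      _satellite.track('banner dismissed');
    }
  });
});
```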

Electron steals focus to Windows 10 Touch Keyboard and makes it unusable

I'm trying to run the attached project on Windows 10 Pro (latest version available without Windows Insider Program).
Basically it is a fullscreen browser window that navigates to http://www.google.com.
I configured Windows in Tablet mode, in order to let the touch keyboard pop up whenever any text field in the page (in this case the query field) gets focused.
Then I packaged the application with electron-windows-store so that Electron runs as a Windows Store application.
When I start the application and the Google home page is loaded, I'm not able to use the touch keyboard: it pops up but immediately disappears, as if Electron tried to reacquire focus and caused the touch keyboard to be dismissed.
I also tried disabling fullscreen mode and setting the frame coordinates so the window covers the screen as if it were fullscreen, but with no success.
Any suggestion?
TestApp.zip
GitHub Repo
This seems to be related to an open issue on the Electron GitHub repository. You might have to wait for the Electron team to introduce this improvement.
I've managed to fix it. The issue was caused by an old Electron dependency. Once I updated it to the latest version I know of (1.4.7), everything started working.
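For reference, a minimal sketch of the setup described in the question (a fullscreen window that loads google.com), which should behave normally once the electron dependency is updated:

```javascript
// Minimal sketch of the setup described above: a fullscreen BrowserWindow
// that loads google.com. The fix was simply bumping the 'electron'
// dependency in package.json to a recent version (1.4.7 at the time).
const { app, BrowserWindow } = require('electron');

let win;

app.on('ready', function () {
  win = new BrowserWindow({ fullscreen: true });
  win.loadURL('http://www.google.com');
});
```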

Toolbar in Chrome app mode

I'm trying to build a kiosk based on the Chrome browser. I just installed the Kiosk app extension and everything is fine, but in Chrome's application mode there is no toolbar. I basically need "Back", "Forward" and "Home" buttons. Is there any way to enable the toolbar in app mode, or to add one when the extension creates the window?
Navigation is not meaningful for Chrome Apps. It is, in fact, disabled.
So if your app changes state, you need custom controls for that state anyway.
If you have embedded web content in the app in a <webview>, then you need to make your own custom controls for that. See the browser app sample.
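A rough sketch of such custom controls around a <webview> (the element ids and the home URL are placeholders):

```javascript
// Rough sketch: wire "Back", "Forward" and "Home" buttons to a <webview>.
// The element ids and the home URL are placeholders.
var webview = document.querySelector('webview');
var homeUrl = 'https://your-kiosk-start-page.example/';

document.getElementById('back').addEventListener('click', function () {
  if (webview.canGoBack()) { webview.back(); }
});
document.getElementById('forward').addEventListener('click', function () {
  if (webview.canGoForward()) { webview.forward(); }
});
document.getElementById('home').addEventListener('click', function () {
  webview.src = homeUrl;
});
```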

How to run touch screen applications in a non-touch environment?

I need to run a touch-based browser app on my desktop and modify some code written around touch events. Can someone tell me the best ways to run touch screen apps in a non-touch environment and inspect their events?
If your app is running inside a browser, you can emulate touch events on non-touch systems by turning on touch emulation in Chrome's developer tools.
In Chrome 26+ this can be found by clicking the "gear wheel" icon in the bottom right, and then selecting the "Overrides" panel. See https://developers.google.com/chrome-developer-tools/docs/mobile-emulation#emulate-touch-events
NB: this will only work while the DevTools are open, so it is fine for testing but not for end users.
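Since the emulation only works with the DevTools open, another option is to make the handlers themselves tolerate a non-touch environment; a rough sketch (the element id is a placeholder):

```javascript
// Rough sketch: fall back to mouse events on systems without touch support,
// so the same handler can be exercised on a desktop without emulation.
var hasTouch = 'ontouchstart' in window;
var el = document.getElementById('app'); // placeholder element id

el.addEventListener(hasTouch ? 'touchstart' : 'mousedown', function (ev) {
  // Normalize coordinates so the rest of the code is input-agnostic.
  var point = hasTouch ? ev.touches[0] : ev;
  console.log('pressed at', point.clientX, point.clientY);
});
```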

A universal cross-platform way (mobile) to show alerts to a user

I have a task to create a client application which can show notifications to a user with a high probability of notifications being noticed.
The application should work on Android (2.0+)/iOS/WP.
Here is the use case:
The user starts the Application and performs some Action. Then he switches to the home screen/another application.
The response to the Action makes the Application issue a notification. The notification is noticed by the user regardless of which other application (or home screen) he is using on his mobile device at the moment.
There is no requirement for the application to be a native app or a web browser-based mobile app. The notification could be a sound or a vibration on the device, but I know that triggering vibration from within a browser is still tricky.
Here are my research results so far on building a universal sound/vibration notification mechanism:
it seems that making a mobile device vibrate from a browser works only in mobile Firefox (no iOS, no WP);
support for the HTML5 audio tag is still experimental; it doesn't work on every browser/device;
the sound alert from this example works only in mobile Firefox (which asks for a plugin to play an mp3 sound); the Android browser just remains silent.
So, the question is:
Is there any way to force a user of a mobile device (Android 2.0+/iOS/WP) to view a notification from a mobile application? Is the only way to do this to write a native app for each mobile platform?
I would propose PhoneGap for that particular problem.
Among other things it features cross-platform alert, sound and vibrator notifications.
The only quirk for Windows Phone 7 is that the Cordova library includes a generic beep file that is used. You should consult the Notification reference page to check whether it covers your needs.
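A quick sketch of the PhoneGap/Cordova notification API mentioned above (the message text is a placeholder, and it must only run after deviceready has fired):

```javascript
// Sketch of the Cordova/PhoneGap notification API; call it only after the
// 'deviceready' event has fired. The message text is a placeholder.
document.addEventListener('deviceready', function () {
  // Visual alert with a callback for when the user dismisses it.
  navigator.notification.alert(
    'Your action has completed',   // message (placeholder)
    function () {},                // callback
    'Notification',                // title
    'OK'                           // button label
  );

  // Audible and tactile cues.
  navigator.notification.beep(1);      // beep once
  navigator.notification.vibrate(500); // vibrate for 500 ms
}, false);
```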