OpenShift haproxy - web service hibernated

I'm running a small webapp hosted on OpenShift, and that app is actually a web service. Since my app is scalable, it is fronted by the haproxy load balancer. But I noticed that my app gets hibernated after some period of time.
Why does this happen?
Is haproxy able to keep a web-service application running?
As it turned out, this was just a terminology problem: scalable does not mean that the app will never idle. With that cleared up, the issue has been resolved.

As said here "Openshift suspends and serializes apps without much activity after a given period, and the first time they 'wake' they deserialize and this takes time."

Related

How does open.spotify.com work?

I can go to the webpage https://open.spotify.com and select songs that are then immediately played on my desktop spotify application. How does this work? I could imagine a scenario where the webpage sends a request to a server which then tells my desktop application to play, but the website and my application seem too in sync. Sure the web is fast, but the song time counters are perfectly in sync and there is no lag when I click play.
I guess they could do something clever with syncing the song time counter, but I'm wondering if they're doing something even more clever: not using a server at all.
So the real question: Is there a way to have direct communication between a webpage and an application running on the client?
The mechanism is described in How does the Spotify web browser button interact with the Spotify app?. When you install Spotify's desktop application, a process called SpotifyWebHelper runs in the background. This process acts as a local server and receives requests from open.spotify.com to interact with the current playback. As you can see, there is a way for a web site and a local application to communicate.
It's worth noting that there is increasing concern among browser vendors about this mechanism (see https://bugs.chromium.org/p/chromium/issues/detail?id=378566) and it will stop working at some point in the future. A more future-proof solution could be based on a proxy service that gets requests from the web page and updates your application, and vice versa. Web sockets are a good candidate for this. Although the proxy, acting as a state-management service, introduces some delay, it also enables other use cases: e.g., you don't need to have the application installed on the same machine from which the web page is browsed, and one could, for instance, control a mobile client.
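Purely as an illustration of the local-helper idea (this is not Spotify's actual protocol; the port, the /play path, and the CORS header below are made up), a desktop app could embed a tiny HTTP server on localhost that the web page calls:

import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class LocalHelper {
    public static void main(String[] args) throws Exception {
        // Hypothetical local endpoint a web page could call, e.g. http://127.0.0.1:4380/play?track=123
        HttpServer server = HttpServer.create(new InetSocketAddress("127.0.0.1", 4380), 0);
        server.createContext("/play", exchange -> {
            String track = exchange.getRequestURI().getQuery(); // e.g. "track=123"
            // ...here the helper would tell the desktop player to start this track...
            byte[] body = ("ok: " + track).getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Access-Control-Allow-Origin", "*"); // so the page can read the response
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.start();
    }
}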

Sharing session variables between Vaadin and embedded applications

I have a Vaadin v6 application that uses the embed component to show a JSP page from another application (JPivot, in this case). Both applications are running in the same application server (Tomcat). I need both applications to communicate, and I'm trying to do this by using session attributes. However, each application has its own session, and so one is ignorant of the other's attributes. My question is: how can I make these applications communicate without using a database or an external file? The solution doesn't have to use session attributes.
What you want to do is either IPC between two web apps, or sharing some information between them.
If you have a cache available (memcached or similar) you could
store/retrieve the information there.
If no cache is available, then Tomcat's crossContext="true" setting might help you.
With this you can call "the other" webapp from inside the servlet/request.
Here is a simple explanation of how this works:
http://lukaszbudnik.blogspot.ch/2009/06/session-sharing-in-apache-tomcat.html
If you google for "tomcat session sharing" you will get many more results.
Please note that this crossContext approach only works as long as both apps are in the same Tomcat instance.
As soon as you add another Tomcat instance for load balancing or high availability, this will break. In that case you should use some sort of message bus or message queue.
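As a rough sketch of the crossContext approach (note that it shares application-scope ServletContext attributes rather than session attributes; the /jpivot context path and the currentUser attribute name are made up, and crossContext="true" must be enabled for the apps in Tomcat's context configuration):

// Vaadin side: publish a value into the JPivot webapp's ServletContext.
import javax.servlet.ServletContext;

public class CrossContextPublisher {
    public static void publish(ServletContext ownContext, String user) {
        ServletContext jpivot = ownContext.getContext("/jpivot"); // returns null unless crossContext="true"
        if (jpivot != null) {
            jpivot.setAttribute("currentUser", user);
        }
    }
}

// JPivot side (inside the JSP, where "application" is its own ServletContext):
// <% String user = (String) application.getAttribute("currentUser"); %>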

openshift application goes idle and halts its cron jobs

I use OpenShift to run a script from time to time with the cron cartridge.
However, as my application has no web activity (yet), it goes idle and my process doesn't run.
One could think of an ugly solution: generate fake web load by using another service (such as IFTTT) to retrieve a page constantly, but this sounds wrong.
Could there be a better solution?
Apparently the only way is to trick OpenShift into seeing incoming traffic; I used a free account at https://www.site24x7.com for that.
OpenShift will idle your application after 24 hours of inactivity [1], but you can add an hourly cron job to your app to keep itself alive.
.openshift/cron/hourly/ping.sh
#!/bin/bash
# Request the app's own public URL so OpenShift sees traffic on it.
PATH=/bin:/usr/bin:/usr/sbin
app_url=http://$OPENSHIFT_APP_DNS/
curl --insecure --location --silent --fail "$app_url" >/dev/null
Assuming your app isn't already idled and won't run the cron job :-)
[1] Apparently the idle period used to be 48 hours, but it is now 24 hours according to the OpenShift pricing table. In other words, a daily pinger cron job won't do it for you.
An OpenShift cartridge goes idle after 24 hours of inactivity.
Activity means receiving a GET request in your application that originates outside your cartridge (so pinging your app from your own cartridge doesn't work).
You can use any free pinging service to ping your application at a specific interval (< 24 hrs).
You may use Pingdom. I have found success using it. It provides a nice dashboard and graphs of response time as well. You will be notified if there are any issues connecting to your app or if it's down. You can manage your account with their mobile apps.
There are other free pinging services too. Feel free to Google and try out other services. Do comment if you find a good one, might be a great help for some :-).
It's not really a "trick" per se, but as long as you have consistent incoming traffic, your gear will not idle.
Bronze is free. All plans retain the free stuff, e.g. 3 free gears; you only pay $0.02/hr for gears above 3.
So if you are on the free tier and not using more than 3 gears, you should be safe to upgrade to Bronze and remain free.
In your Node.js app:
Create an HTTP server able to serve a GET request.
Include a cron job in your job list that, every hour, sends a POST to an external page (PHP, JSP, or any kind of "page" that can make a curl request).
In the external page:
Execute the logic of a job (optional, because you can use job2, job3, ..., jobn and leave this job just to keep your app awake).
Somewhere in the code, make a request back to the Node.js server page, e.g. using the PHP curl library.
In this case:
Every hour the idle timeout is reset and your application stays awake.
You can decide to put jobs in external pages and/or in Node.js.
Hope it helps someone.
EDIT: I'm sorry, this isn't working anymore. No matter which strategy you use, they can detect systematic requests coming from a specific IP and rule them out, idling your app anyway so they can earn money, because everybody knows that Bronze is not free: it costs at least $0.02/hour.

HTML5 Stress Testing

We have a webapp that's backed by a Java Spring web application. Apparently the Rational Robot tool that we would normally use to stress test a browser application doesn't deal with the HTML5-ness of our app. We are considering simply scripting the raw HTTP requests to beat on the application, but we're wondering if anyone knows of any tools out there that do the same without having to craft all of the HTTP requests by hand.
Apache JMeter allows you to record all HTTP requests by setting it up as a proxy. You can then save those requests and play them back in multiple threads to simulate users.
Fiddler with our free stress testing add-on called StresStimulus is another proxy that records HTTP requests and replays them with configurable user ramping.
If you have anything a bit complex, it would be easier to record a script in one of the load testing tools. I have used WebLOAD in the past, which handles things like fetching all the sub-components, script correlation, AJAX recording, etc.
You can record straight from an Android device, or use any desktop browser you like.
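If you do decide to script the raw requests by hand, a minimal multi-threaded sketch in Java (the target URL, thread count, and request count are placeholders; a real tool adds ramp-up, assertions, and reporting) might look like this:

import java.net.HttpURLConnection;
import java.net.URL;
import java.util.concurrent.*;

public class SimpleLoadTest {
    public static void main(String[] args) throws Exception {
        final String target = "https://example.com/api/endpoint"; // placeholder URL
        final int threads = 20, requestsPerThread = 100;          // arbitrary load profile
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) {
            pool.submit(() -> {
                for (int i = 0; i < requestsPerThread; i++) {
                    try {
                        HttpURLConnection con = (HttpURLConnection) new URL(target).openConnection();
                        con.setRequestMethod("GET");
                        long start = System.nanoTime();
                        int code = con.getResponseCode();          // fires the request
                        long ms = (System.nanoTime() - start) / 1_000_000;
                        System.out.println(code + " in " + ms + " ms");
                        con.disconnect();
                    } catch (Exception e) {
                        System.out.println("error: " + e.getMessage());
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.MINUTES);
    }
}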

What's the best way to notify a non-web application about a change on a web page?

Let's say I have two applications which have to work together to a certain extent.
A web application (PHP, Ruby on Rails, ...)
A desktop application (Java, C++, ...)
The desktop application has to be notified from the web application and the delay between sending and receiving the notification must be short. (< 10 seconds)
What are possible ways to do this? I can think of polling at a 10-second interval, but that would produce a lot of traffic if many desktop applications have to be notified. On a LAN I'd use a UDP broadcast, but unfortunately that's not possible here...
I appreciate any ideas you could give me.
I think the "best practice" here will depend on the number of desktop clients you expect to serve. If there's just one desktop to be notified, then polling may well be a fine approach -- yes, polling is much more overhead than an event-based notification, but it'll certainly be the easiest solution to implement.
If the overhead of polling is truly unacceptable, then I see two basic alternatives:
Keep a persistent connection open between the desktop and web-server (could be a "comet"-style web request, or a raw socket connection)
Expose a service from within the desktop app, and register the address of the service with the web-server. This way, the web-server can call out to the desktop as needed.
Be warned, though -- both alternatives are chock full of gotchas. A few highlights:
Keeping a connection open can be tricky, since you want your web-servers to be hot-swappable
Calling out to an external service (e.g., your desktop) from a web-server is dangerous, because this request could hang. You'd want to move this notification onto a separate thread to avoid tying up the webserver.
To mitigate some of the concerns, you might decouple the unreliable desktop from the web-server by introducing an intermediary notification server -- the web-server could post an update somewhere, and the desktop could poll/connect/register there to be notified. To avoid reinventing the wheel here, this could involve some sort of MessageQueue system... This, of course, adds the complexity of needing to maintain the new intermediary.
Again, all of these approaches are probably quite complex, so I'd say polling is probably the best bet.
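For reference, here is a minimal sketch of the first alternative, a "comet"-style persistent request from the desktop. The /notifications/wait endpoint is hypothetical, and the assumption is that the web server holds the request open until a change happens or a timeout expires:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class LongPollClient {
    public static void main(String[] args) throws Exception {
        String endpoint = "https://example.com/notifications/wait"; // hypothetical long-poll endpoint
        while (true) {
            try {
                HttpURLConnection con = (HttpURLConnection) new URL(endpoint).openConnection();
                con.setReadTimeout(60_000); // server holds the request open until a change or this timeout
                try (BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()))) {
                    String event = in.readLine();
                    if (event != null) {
                        System.out.println("change notified: " + event);
                    }
                }
            } catch (java.net.SocketTimeoutException e) {
                // no change within the timeout; simply reconnect
            } catch (Exception e) {
                Thread.sleep(5_000); // back off before retrying on other errors
            }
        }
    }
}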
I can see two ways:
Your desktop application polls the web app
Your web app notifies the desktop application
Your web app could publish an RSS feed, but your desktop app will still have to poll the feed every 10 s.
The traffic need not be huge: if you use an HTTP HEAD request, you'll get a small packet with the date of the last modification (conveniently named Last-Modified).
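A minimal sketch of that lightweight poll (the URL is a placeholder), using a HEAD request and the Last-Modified header:

import java.net.HttpURLConnection;
import java.net.URL;

public class HeadPoller {
    public static void main(String[] args) throws Exception {
        String pageUrl = "https://example.com/status"; // placeholder for the web app's page
        long lastSeen = 0;
        while (true) {
            HttpURLConnection con = (HttpURLConnection) new URL(pageUrl).openConnection();
            con.setRequestMethod("HEAD");          // headers only, no body
            long modified = con.getLastModified(); // parsed Last-Modified header, 0 if absent
            if (modified > lastSeen) {
                lastSeen = modified;
                System.out.println("page changed at " + new java.util.Date(modified));
            }
            con.disconnect();
            Thread.sleep(10_000);                  // the 10-second interval from the question
        }
    }
}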
I don't know exactly how to achieve your task, but I can suggest creating a Windows service on the desktop application's PC.
This service checks the web application at a regular interval for new changes, and if changes have occurred it can launch the desktop application with a notification that something has changed in the web application; in the web application, whenever a change occurs, you can respond with an acknowledgment.
I hope this may be useful. I haven't tried it exactly, but I'm suggesting something along these lines.
A layer of syndication would help to scale out the system.
The desktop app can register itself with a "publisher" service (running on one of several/many machines). This publisher service receives the "notice" from your web app that something has changed, and immediately starts notifying all of its registered subscribers.
The number of publishers you need will increase with the number of users.
Edit: Forgot to mention that the desktop app will need to listen on a socket.
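A rough sketch of the subscriber side of that idea (the port and the one-line-per-notification format are made up): the desktop app listens on a socket, and the publisher connects and writes a line whenever the web app reports a change:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;

public class DesktopSubscriber {
    public static void main(String[] args) throws Exception {
        // Hypothetical port the desktop app would register with the publisher service.
        try (ServerSocket listener = new ServerSocket(9090)) {
            while (true) {
                try (Socket publisher = listener.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(publisher.getInputStream()))) {
                    String notice = in.readLine(); // one line per notification, e.g. "page X changed"
                    if (notice != null) {
                        System.out.println("notified: " + notice);
                    }
                }
            }
        }
    }
}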