Docker setup for Wirecloud - fiware

I am trying to set up a local version of Wirecloud with Docker.
Is there any template already cooked for that?
Otherwise I was considering following a path similar to the one in this tutorial:
https://realpython.com/blog/python/django-development-with-docker-compose-and-machine/

Currently, there are no Docker templates for deploying WireCloud. The link you have posted looks like a good starting point.
I suggest you open a ticket in WireCloud's issue tracker, or make a pull request if you are going to create a Docker template and don't mind sharing it. You can get support for creating such a template from the WireCloud team in the issue tracker.
Update: WireCloud is now available on Docker Hub.
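For anyone arriving here now, a minimal sketch of running that image locally follows. The image name fiware/wirecloud, the exposed port 8000, and the default standalone configuration are assumptions; check the image's Docker Hub page for the current instructions.
# Pull and run the WireCloud image with its default (standalone) setup.
docker pull fiware/wirecloud
docker run -d --name wirecloud -p 8000:8000 fiware/wirecloud
# If the assumptions above hold, WireCloud should answer on http://localhost:8000/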

Related

Legacy GCE and GKE metadata requests from google_daemon/manage_addresses.py

I have an old Debian Compute Engine instance (created and running since December 2013) and got an email warning about the turndown of Legacy GCE and GKE metadata server endpoints (more details at https://cloud.google.com/compute/docs/migrating-to-v1-metadata-server).
I followed the directions for locating the process and found that the requests were coming from /usr/share/google/google_daemon/manage_addresses.py. The script seems to be the same as what's at https://github.com/gtt116/gce/blob/master/google_daemon/manage_addresses.py (also with what's in that directory).
I don't recall installing this, so I'm imagining it came with the provided Debian image I used in 2013.
Does anyone know what this manage_addresses.py script is, what it does, and what I should do with it now that the legacy metadata server endpoints are turning down? Is it safe to just stop running it? Or is there a new script I should replace it with? Or should I just try to update it myself to use the new endpoint?
I dug around and was able to trace /usr/share/google/google_daemon/manage_addresses.py as being installed by a package called google-compute-daemon. A search for that brought me to https://github.com/GoogleCloudPlatform/compute-image-packages#troubleshooting which explains that google-compute-daemon has been replaced with python-google-compute-engine. That led me to https://cloud.google.com/compute/docs/images/install-guest-environment . I followed the instructions there and manually installed the guest environment.
I noticed during installation that it said it was removing the google-compute-daemon package (and a package called google-startup-scripts), so this seems like the right thing. And I'm no longer seeing any requests to the legacy endpoints. So it seems like at some point the old guest environment failed to update.
TL;DR: If you have this problem, follow the instructions at https://cloud.google.com/compute/docs/images/install-guest-environment#installing_guest_environment to manually update the guest environment.
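If you want to double-check the result, here is a hedged sketch of two quick checks run from inside the instance; the package names are the ones mentioned above and may differ between distro releases.
# Confirm the old daemon packages are gone and the new guest environment is present.
dpkg -l | grep -E "google-compute-daemon|google-startup-scripts|python-google-compute-engine"
# Confirm the current (v1) metadata endpoint is reachable; it requires the Metadata-Flavor header.
curl -H "Metadata-Flavor: Google" "http://metadata.google.internal/computeMetadata/v1/instance/hostname"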

Openshift Login Plugin Jenkins - Invalid Request

I tried to set up a custom Jenkins image based on the Red Hat Jenkins image. The Red Hat Jenkins image already has the OpenShift Login Plugin installed.
After the image started up properly, I tried to log in with my OpenShift credentials, but it didn't work.
I just saw the following error message:
"error":"invalid_request","error_description":"The request is missing a required parameter, includes an invalid parameter value, includes a parameter more than once, or is otherwise malformed.","state":"xxxxxxxxxxxxxxxxxxx"
and there was another message in the OpenShift terminal of the running pod.
I read about several other issues with the OpenShift Login Plugin, but even an update to version 1.0.12 didn't fix my problem.
My problem was that I didn't know that each OpenShift service account has a redirect reference, specifically configured for one deployment.
I had already used the service account from the above-mentioned Jenkins for another Jenkins deployment, so OpenShift added a redirect reference configured for that "older" deployment.
In our OpenShift setup (3.11), the redirect configuration is not visible in the service account settings under Resources --> Membership --> Service Accounts. Instead, you have to view and edit the YAML of the service account under Resources --> Other Resources --> Service Account and look for the annotation:
serviceaccounts.openshift.io/oauth-redirectreference.jenkins
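For reference, here is a sketch of how that annotation can be pointed at the right route from the command line; the service account name and the route name jenkins are assumptions, so adjust them to your own deployment.
# Point the OAuth redirect reference of the "jenkins" service account at the "jenkins" route.
oc annotate serviceaccount jenkins --overwrite \
  'serviceaccounts.openshift.io/oauth-redirectreference.jenkins={"kind":"OAuthRedirectReference","apiVersion":"v1","reference":{"kind":"Route","name":"jenkins"}}'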
Since this is quite tricky to find, I hope this saves somebody a few hours of searching.

How to test openshift action_hooks prior to git push to Openshift server

I have been looking at Openshift docs and on Stack Overflow for a while now and I can't seem to get any answers.
I want to know what the standard pattern is for developing applications for deployment on OpenShift. I am especially concerned with testing of action_hooks prior to deployment. I found this particularly troublesome when I was using a DIY cartridge recently, where I had to download dependencies in my build script prior to starting my application, because my application kept failing to start every time I made a change and pushed it (I only did this as an initial test of the OpenShift service; I would never develop like this). I ended up having to SSH onto my instance and resolve the issue by trial and error (not really ideal).
I appreciate any help anyone can offer.
Thanks
The only way that I am aware of to test action hooks on OpenShift is to SSH into an application and run them manually from the command line. This way you can quickly debug and update them. Then copy them to your git repository and do a git push to deploy the new code.
The only other way I can think of would be to run OpenShift Origin (v2) locally and use that to test with.
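A rough sketch of that manual workflow, assuming the OpenShift v2 gear layout and an application called myapp (both assumptions):
# SSH into the gear that runs the application.
rhc ssh myapp
# Inside the gear, the pushed hooks live under the repo directory.
cd "$OPENSHIFT_REPO_DIR/.openshift/action_hooks"
./build   # run a hook directly and watch its output
Once a hook behaves as expected, copy the working version into your local git repository and push it as usual.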

click to deploy Hadoop on GCE not working

I'm trying "Click-to-deploy Hadoop on Google Compute Engine" here
Unfortunately this doesn't seem to work: either the process stops almost immediately, or it appears to be frozen.
The message displayed is:
Deployment may take 3 to 10 minutes to complete, depending on the size of your cluster
Creating deployment
In any case, I can't get a cluster. I tried several zones and Hadoop versions; nothing.
Any thoughts?
The problem is occurring because your Cloud project does not have a project id associated with it, but only a project number, which is true for some long-standing Cloud projects.
https://developers.google.com/console/help/new/#projectnumber
You can fix this by going into Developers Console, selecting your project from the project list, selecting Billing & settings from the left-hand navigation, and adding the project id there.
The following URL should take you there directly:
https://console.developers.google.com/project/_/settings
Thanks,
-Matt
A few items to help diagnose the problem:
Go to the Compute Engine instance list and check if there are any instances created for the deployment.
Check if there are any errors in the JavaScript console of your browser.
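As a command-line alternative to the first check, something like the following should list any instances in the project (assuming the Cloud SDK is installed; substitute your own project id):
gcloud compute instances list --project=<projectid>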
BTW, what browser and version are you using?
Thanks.
No instance deployed (however I can deploy, and have deployed, Compute Engine VM instances).
I have a 404 in the console:
POST https://console.developers.google.com/m/deploy?pid=1090158225078&cmd=custom…ion=europe-west1&app=hadoop&xsrf=R5Ezthkrr1L8xU1STye3sXUiHiA:1414055456964 404 (Not Found)
on Chrome, Windows 7.
I tried on Firefox too: no 404 in the console, but the same effect: no deployment at all.
The "customdeploy" command should not be returning a 404, so let's check if there's something going on with your Cloud project.
Click to Deploy uses the preview version of Deployment Manager on the backend. Let's check the objects (if any) that Deployment Manager has created for the Hadoop deployment.
To do this, you will need to:
Install the Google Cloud SDK (if you have not already)
Add the preview component
Query for Deployment Manager templates
Query for Deployment Manager deployments
Install the Google Cloud SDK:
Instructions are here: https://cloud.google.com/sdk/
Add the preview component:
gcloud components update preview
Query for Deployment Manager templates:
gcloud preview --project=<projectid> deployment-manager templates list
Query for Deployment Manager deployments:
gcloud preview --project=<projectid> deployment-manager deployments --region europe-west1 list
One last question. Is this a relatively "new" or "old" Google Cloud project? Sometimes old projects need a feature to be enabled that is automatically enabled on new projects.
Thanks.

What can I do as OpenShift user?

I'm currently using a virtual server and want to try OpenShift out. But I don't really get yet how it works. Do I get root access to my "webspace"? Can I set up the server OS (e.g. Debian 7)? Can I install/uninstall software (nginx, PHP 5.5, the PHP CodeSniffer PEAR package, etc.)? Can I use one gear for multiple websites?
It's not clear from your line of questioning which part of OpenShift you are not understanding, so I will try to lay out the architecture and provide documentation to get you started.
OpenShift is a Red Hat developed product (so it's going to be easiest to get started on RHEL or Fedora), but it can also run on other Linux systems (you may need to piece the components together yourself, but it can be done).
This is talked about in building your own live CD on the community site; however, it has not been done for you by the OpenShift community.
There are two starting places for OpenShift, and they depend on what you are trying to use OpenShift for: as a PaaS hosting solution, or as a PaaS hosted solution.
For a PaaS hosting solution, a good starting point is the Origin page, as it provides VMs and install instructions for OpenShift's community product.
Because OpenShift is a PaaS solution, these components (see the architecture links below), when put together, provide users with an application space (which they do not have root access to).
https://www.openshift.com/products/architecture
https://www.openshift.com/wiki/architecture-overview
As the administrator of the box you would have root access, but your end users would not.
For a PaaS hosted solution, a good starting place is OpenShift Online, which is Red Hat's hosted solution for the OpenShift Origin project.
Get started by creating an account.
With an online account you can get started using the hosted solution very quickly by trying some of the quickstarts. Be sure to read the full set of OpenShift documentation, as well as install the client tools.
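To give a feel for the hosted workflow, here is a minimal sketch using the rhc client tools; the application name myapp and the php-5.4 cartridge are just examples, so check the cartridge list for what is actually available to you.
# Install and configure the client tools, then create a first application.
gem install rhc
rhc setup                      # log in and upload an SSH key
rhc cartridge list             # see which cartridges (runtimes) you can use
rhc app create myapp php-5.4   # create an application on a gear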