How do you create a DAG with JSON in Apache Apex?

I've been trying to find the documentation for populating a DAG instance using JSON. Is there formal documentation to the format somewhere?

The JSON application specification is documented at http://apex.apache.org/docs/apex/application_development/#json-file-dag-specification
Currently, JSON application creation is supported through dtAssemble, a beta feature of DataTorrent RTS. The documentation for using dtAssemble to create JSON apps is available at http://docs.datatorrent.com/dtassemble/
You can experiment with dtAssemble and inspect the PUT operations against the /ws/v2/appPackages/[user]/[package]/[version]/applications/[application] service to see the JSON structure generated for the application you're building. You can also download the application package where the app is being saved, unzip it, and inspect the app/*.json files added there.
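For orientation, an app/*.json file describes the DAG as a list of operators plus the streams connecting their ports, roughly like the sketch below. The operator classes and property names here are invented for illustration; the linked specification is the authoritative reference for the exact format.

{
  "description": "Application assembled from JSON",
  "operators": [
    {
      "name": "wordGenerator",
      "class": "com.example.WordGenerator",
      "properties": { "tuplesPerWindow": 100 }
    },
    {
      "name": "wordCounter",
      "class": "com.example.WordCounter"
    }
  ],
  "streams": [
    {
      "name": "words",
      "source": { "operatorName": "wordGenerator", "portName": "output" },
      "sinks": [ { "operatorName": "wordCounter", "portName": "input" } ]
    }
  ]
}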

Related

Is it possible to generate swagger.json without deploying the application?

I'm a beginner with Swagger. By following the documentation I'm able to generate swagger.json (the Swagger definition in JSON Schema format) and view it in swagger-ui.
But currently I can only access my JSON schema file after deploying my application (it is generated under the base path I configured in web.xml).
Is there any way to get it without deploying the application? Each time I want to re-check the documentation after making a change, I have to redeploy my application. An offline method would be better.
Thanks.

MarkLogic Java API batch upload files (.csv)

I'm trying out the MarkLogic Java API and want to bulk upload some files with the .csv extension.
I'm not sure what to use, since the Java API only supports JSON, XML, and TXT files.
How do I batch upload files using the MarkLogic Java API? Do I convert everything to JSON?
Do I convert everything to JSON?
Yes, that is a common way to do it.
If you would like additional examples of how you can wrangle CSV with the Java Client API, check out OpenCSVBatcherExample and JacksonDatabindTest.testDatabindingThirdPartyPojoWithMixinAnnotations. The first demonstrates converting the CSV to XML and using a custom REST extension. The second example (well, unit test...) demonstrates converting the CSV to JSON and using the Bulk Writes capabilities Justin linked to.
If you have CSV files on your filesystem, I’d start with mlcp, as suggested above. It will handle all of the parsing and splitting into multiple transactions/batches for you. Take a look at the mlcp documentation for more details and some example configurations.
If you’d like more control over the parsing and splitting logic than mlcp gives you out-of-the-box or you’re getting CSV from some other source (i.e. not files on the filesystem), you can use the Java Client API. The Java Client API allows you to efficiently write batches using a WriteSet. Take a look at the “Bulk Writes” example.
According to your reply to Justin, you cannot use MLCP because it is a command-line tool and you need to integrate it into a web portal.
Well, MLCP is released as open source software under the Apache 2 license, so if you are happy with that license, you have the source to integrate.
But what I see as your main problem statement is more specific:
How can I create multiple XML or JSON documents from a CSV file [allowing the use of the Java API to then upload them as documents in MarkLogic]?
With that specific problem statement:
1) Have a look at SplitDelimitedTextReader.java from the mlcp source.
2) Try one of the Java libraries built for this purpose, such as http://jsefa.sourceforge.net/quick-tutorial.html

Custom dynamic inventory scripts/plugins in Ansible

Ansible allows devs to write programs (in any language) that return JSON describing the dynamic "snapshot" of current hosts. I'm using vSphere, which is currently not supported by Ansible OSS, so I need to write such a "custom inventory plugin".
I can handle the querying of vSphere for a list of hosts, as well as constructing the JSON that is compatible with what Ansible is expecting.
Where the documentation completely (seemingly) falls flat is:
(1) How do I “connect” Ansible with my inventory app? That is, say my inventory app is a simple bash script (inventory.sh)... how do I configure Ansible to call bash inventory.sh and obtain JSON from it? In reality the app will likely be a Java executable (inventory.jar), but I figure that if I can get it working with bash, I can extrapolate to Java; and
(2) How does Ansible actually capture/fetch the JSON back from the app? STDOUT? Is this all supposed to happen over an HTTP connection? Examples? How does inventory.sh or inventory.jar communicate that JSON back to Ansible?
The inventory script has to be located on the same machine where Ansible runs. It does not communicate over HTTP; Ansible simply parses the STDOUT of your program. The exact location of the script does not matter at all; you just pass its path to Ansible when you invoke it:
ansible-playbook ... -i /path/to/your/inventory.sh
To avoid passing the inventory location every time, you can add this to your ansible.cfg:
[defaults]
inventory = /path/to/your/inventory.sh
You could also copy the script to /etc/ansible/hosts, which is the default location where Ansible looks for inventory files/scripts, but I prefer to keep things together, so I suggest placing it close to your playbooks/roles, etc.
And (3) Is any of this documented anywhere? I don't see anything in the Ansible docs...
It is not mentioned on the page Developing Dynamic Inventory Sources, but there are some examples on the page Dynamic Inventory. The docs are community managed and at times a little unstructured and lacking important information.
BTW, there is a VMware inventory script included. Looking at the source, I can see it imports some vSphere modules. I have little experience with VMware, so I can't judge whether it already does what you need and would save you from writing your own.
This is completely user defined. Typically you would write your dynamic inventory in Python and print a JSON dump of the output to create the inventory.
Here is an example for the use case you mentioned (vSphere): https://github.com/RaymiiOrg/ansible-vmware/blob/master/query.py
In a nutshell, you write it as a normal Python script, define command-line options (as that example does in main), and selectively execute functions based on which options are passed. Those functions make REST calls and print their output as a JSON dump, which Ansible parses to build the inventory.
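As a minimal sketch of that pattern: group names, host names, and variables below are made up for illustration, and a real script would query vSphere (or any other source) inside list_inventory() instead of returning hard-coded data.

#!/usr/bin/env python
# Minimal dynamic inventory sketch. Ansible invokes the script with --list
# (and optionally --host <hostname>) and parses the JSON printed to STDOUT.
import argparse
import json


def list_inventory():
    return {
        "vsphere_vms": {
            "hosts": ["vm01.example.com", "vm02.example.com"],
            "vars": {"ansible_user": "deploy"},
        },
        # Including _meta.hostvars lets Ansible skip calling --host per host.
        "_meta": {
            "hostvars": {
                "vm01.example.com": {"datacenter": "dc1"},
                "vm02.example.com": {"datacenter": "dc2"},
            }
        },
    }


def host_vars(name):
    return list_inventory()["_meta"]["hostvars"].get(name, {})


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--list", action="store_true")
    parser.add_argument("--host")
    args = parser.parse_args()
    if args.host:
        print(json.dumps(host_vars(args.host)))
    else:
        print(json.dumps(list_inventory()))

Make the script executable and point Ansible at it exactly as with the bash example above, e.g. ansible-playbook -i /path/to/inventory.py your_playbook.yml.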

Save JSON to online database GAE or Firebase

I am new to JSON and online databases. I have learned the basics of using .js files and manipulating them. But I have no clue how to save them to GAE or Firebase databases.
1) My question is: does every online database store JSON differently?
I have no idea what storing data in an online database looks like, so
2) Can you give me an example of JSON stored in Firebase or GAE? Links to tutorials are also helpful.
Firebase is a true "online database" in the sense that you can save/retrieve/query data without actually writing any code on the server. As such, it is close to Backend-as-a-Service offerings such as Parse, Kinvey, etc. Search the web to find more services and compare the features you need.
OTOH, GAE is an application platform - you will need to write server-side code to create any functionality.
As for examples: please RTFM.
GAE's ndb datastore API has a JsonProperty:
https://cloud.google.com/appengine/docs/python/ndb/properties
It's also easy to store a JSON object in a StringProperty, using json.dumps / json.loads to serialize and parse it. For a simple list, you can use a StringProperty with the repeated=True flag:
https://cloud.google.com/appengine/docs/python/ndb/properties#repeated
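A minimal sketch of both approaches, assuming the Python ndb API linked above; the Employee model and its fields are made up for illustration.

import json
from google.appengine.ext import ndb


class Employee(ndb.Model):
    # JsonProperty serializes the dict/list to JSON for you.
    profile = ndb.JsonProperty()
    # Or serialize the JSON yourself into a StringProperty.
    profile_text = ndb.StringProperty()
    # A simple list can be a repeated StringProperty.
    skills = ndb.StringProperty(repeated=True)


emp = Employee(
    profile={"name": "Alice", "role": "engineer"},
    profile_text=json.dumps({"name": "Alice", "role": "engineer"}),
    skills=["python", "gae"],
)
key = emp.put()

loaded = key.get()
profile = json.loads(loaded.profile_text)  # back to a dict

Note that an indexed ndb StringProperty is limited to short strings (1500 bytes), so for larger JSON blobs JsonProperty (or TextProperty) is the safer choice.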

Is it possible to save data to a JSON file on local disk using $resource.save without any server-side implementation?

I am trying to build an Employment Management app in AngularJS where a basic operation is adding an employee.
I am displaying details using a service that gets JSON data from the mock JSON file I am using.
Similarly, can I add form data to a text file on the hard disk?
I have seen it done in a tutorial using $resource.save.
If it is possible without any server-side code, please share an example; it would be helpful.
Thanks.
You can make use of HTML5 local browser storage, as this does not require folder access. More details here: http://diveintohtml5.info/storage.html
AngularJS has modules for local storage that you can use to access such storage, for example https://github.com/grevory/angular-local-storage