I need to initialize from ~/myConfig.json, which looks like:
{
  "databaseActive": "production",
  "databases": [
    {
      "name": "localhost",
      "PGDB": "asdf",
      "PGHOST": "localhost",
      "PGPASSWORD": "asdf",
      "PGPORT": "5432",
      "PGUSER": "asdf"
    },
    {
      "name": "production",
      "PGDB": "asdf",
      "PGHOST": "asdf.rds.amazonaws.com",
      "PGPASSWORD": "asdf",
      "PGPORT": "5432",
      "PGUSER": "asdf"
    }
  ]
}
This means I cannot call scalikejdbc.config.DBs.setupAll(). How might I use this JSON file to initialize ScalikeJDBC with the appropriate database settings, according to the value of databaseActive?
ScalikeJDBC only ships with a HOCON (Typesafe Config) reader. If you go with your own JSON config file, you need to parse it yourself, pick the entry that matches databaseActive, and hand those settings to ScalikeJDBC.
Parsing your config and binding it to ScalikeJDBC's config classes would be simple:
https://github.com/scalikejdbc/scalikejdbc/blob/3.3.5/scalikejdbc-config/src/main/scala/scalikejdbc/config/DBs.scala#L10-L17
https://github.com/scalikejdbc/scalikejdbc/blob/3.3.5/scalikejdbc-config/src/main/scala/scalikejdbc/config/TypesafeConfigReader.scala#L59-L124
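For illustration, here is a minimal sketch of that approach. It assumes the ujson library for parsing (any JSON library will do) and the PostgreSQL JDBC driver on the classpath; the object name, helper name and JDBC URL format below are my own, not something ScalikeJDBC provides:

import scala.io.Source
import scalikejdbc._

object JsonDbInit {
  // Reads ~/myConfig.json, finds the "databases" entry whose "name" matches
  // "databaseActive", and initializes ScalikeJDBC's default connection pool from it.
  def setupFromJson(path: String = sys.props("user.home") + "/myConfig.json"): Unit = {
    val source = Source.fromFile(path)
    val json   = try ujson.read(source.mkString) finally source.close()

    val active = json("databaseActive").str
    val db = json("databases").arr
      .find(_("name").str == active)
      .getOrElse(sys.error(s"no database named '$active' in $path"))

    val url = s"jdbc:postgresql://${db("PGHOST").str}:${db("PGPORT").str}/${db("PGDB").str}"

    Class.forName("org.postgresql.Driver")
    ConnectionPool.singleton(url, db("PGUSER").str, db("PGPASSWORD").str)
  }
}

Calling JsonDbInit.setupFromJson() once at startup then takes the place of DBs.setupAll(), and the usual DB readOnly { ... } / DB autoCommit { ... } blocks run against whichever database databaseActive points to.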
I am able to get a single JSON object in Kibana:
By having this in the filebeat.yml file:
output.elasticsearch:
  hosts: ["localhost:9200"]
How can I get at the individual elements in the JSON string? Say I wanted to compare all the "pseudorange" fields of all my JSON objects. How would I:
Select "pseudorange" field from all my JSON messages to compare them.
Compare them visually in kibana. At the moment I can't even find the message let alone the individual fields in the visualisation tab...
I have heard of people using Logstash to parse the string somehow, but is there no way of doing this simply with Filebeat? If there isn't, what do I do with Logstash to filter out the individual fields of the JSON, instead of having my message be just one big JSON string that I cannot interact with?
I get the following output from output.console (note that I am putting some information in <> to hide it):
"#timestamp": "2021-03-23T09:37:21.941Z",
"#metadata": {
"beat": "filebeat",
"type": "doc",
"version": "6.8.14",
"truncated": false
},
"message": "{\n\t\"Signal_data\" : \n\t{\n\t\t\"antenna type:\" : \"GPS\",\n\t\t\"frequency type:\" : \"GPS\",\n\t\t\"position x:\" : 0.0,\n\t\t\"position y:\" : 0.0,\n\t\t\"position z:\" : 0.0,\n\t\t\"pseudorange:\" : 20280317.359730639,\n\t\t\"pseudorange_error:\" : 0.0,\n\t\t\"pseudorange_rate:\" : -152.02620448094211,\n\t\t\"svid\" : 18\n\t}\n}\u0000",
"source": <ip address>,
"log": {
"source": {
"address": <ip address>
}
},
"input": {
"type": "udp"
},
"prospector": {
"type": "udp"
},
"beat": {
"name": <ip address>,
"hostname": "ip-<ip address>",
"version": "6.8.14"
},
"host": {
"name": "ip-<ip address>",
"os": {
<ubuntu info>
},
"id": <id>,
"containerized": false,
"architecture": "x86_64"
},
"meta": {
"cloud": {
<cloud info>
}
}
}
In Filebeat, you can leverage the decode_json_fields processor in order to decode a JSON string and add the decoded fields into the root object:
processors:
  - decode_json_fields:
      fields: ["message"]
      process_array: false
      max_depth: 2
      target: ""
      overwrite_keys: true
      add_error_key: false
Credit to Val for this. His answer worked; however, as he suggested, my JSON string had a \u0000 (null terminator) at the end, which stops it being valid JSON and prevented the decode_json_fields processor from working as it should.
Upgrading to version 7.12 of Filebeat (also make sure Elasticsearch and Kibana are on 7.12, because mismatched versions between them can cause issues) allows us to use the script processor: https://www.elastic.co/guide/en/beats/filebeat/current/processor-script.html.
Credit to Val here again; this script removes the null terminator:
- script:
    lang: javascript
    id: trim
    source: >
      function process(event) {
        event.Put("message", event.Get("message").trim());
      }
After the null terminator was removed, the decode_json_fields processor did its job as Val suggested (processors run in the order they are listed, so the script processor has to come before decode_json_fields), and I was able to extract the individual elements of the JSON field, which let the Kibana visualisations work with the elements I wanted!
I'm trying to find out whether a JSON file supports defining variables and using them within that same JSON file.
{
  "artifactory_repo": "toplevel_virtual_NonSnapshot",
  "definedVariable1": "INSTANCE1",
  "passedVariable2": "${passedFromOutside}",
  "products": [
    {
      "name": "product_${definedVariable1}_common",
      "version": "1.1.0"
    },
    {
      "name": "product_{{passedVariable2}}_common",
      "version": 1.5.1
    }
  ]
}
I know YAML files allow this, but I'm not sure whether a JSON file allows this behavior or not. My plan is that a user will pass the "definedVariable" value from Jenkins and I'll create a target JSON file (after substitution).
This might help you:
{
  "artifactory_repo": "toplevel_virtual_NonSnapshot",
  "definedVariable1": "INSTANCE1",
  "passedVariable2": `${passedFromOutside}`,
  "products": [
    {
      "name": `product_${definedVariable1}_common`,
      "version": "1.1.0"
    },
    {
      "name": `product_${passedVariable2}_common`,
      "version": 1.5.1
    }
  ]
}
*Note the use of backticks (``) instead of quotes. Be aware, though, that backticks are JavaScript template-literal syntax, so the result is no longer strict JSON; it has to be evaluated or templated before it can be parsed as JSON.
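Since standard JSON itself has no variable syntax, another option is the one your plan already describes: treat the file as a template and substitute the values (for example, the ones passed in from Jenkins) before writing the final JSON out. A minimal sketch in Scala, where the ${name} placeholder convention, the example value and the ujson validity check are just illustrative assumptions:

// Replaces ${name} placeholders in a JSON template with the supplied values,
// then parses the result to confirm it is still valid JSON before returning it.
def renderTemplate(template: String, vars: Map[String, String]): String = {
  val rendered = vars.foldLeft(template) { case (acc, (key, value)) =>
    acc.replace("${" + key + "}", value)
  }
  ujson.read(rendered) // throws if the substituted text is not valid JSON
  rendered
}

val template = """{ "name": "product_${passedFromOutside}_common", "version": "1.5.1" }"""
println(renderTemplate(template, Map("passedFromOutside" -> "INSTANCE2")))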
I have a packer json like:
"builders": [{...}],
"provisioners": [
{
"type": "file",
"source": "packer/myfile.json",
"destination": "/tmp/myfile.json"
}
],
"variables": {
"myvariablename": "value"
}
and myfile.json is:
{
"var" : "{{ user `myvariablename`}}"
}
The variable in the file does not get replaced; is a sed replacement with a shell provisioner after the file provisioner the only option available here?
Using packer version 0.12.0
You have to pass these as environment variables. For example:
"provisioners": [
{
"type": "shell"
"environment_vars": [
"http_proxy={{user `proxy`}}",
],
"scripts": [
"some_script.sh"
],
}
],
"variables": {
"proxy": null
}
And in the script you can use $http_proxy
So far the only solution I've come up with is to use the file and shell provisioners together: upload the file, then replace the variables in it via the shell provisioner, which can be fed from template variables provided by e.g. HashiCorp Vault.
You may use the OS export function to set an environment variable and pass it to Packer.
Here is a config that uses the OS ENV_NAME value to choose which local folder to copy from.
Running export ENV_NAME=dev will set the local folder to dev:
{
  "variables": {
    ...
    "env_folder": "{{env `ENV_NAME`}}"
  },
  "builders": [{...}],
  "provisioners": [
    {
      "type": "file",
      "source": "files/{{user `env_folder`}}/",
      "destination": "/tmp/"
    },
    {...}
  ]
}
User variables must first be defined in a variables section within your template. Even if you want a user variable to default to an empty string, it must be defined. This explicitness helps reduce the time it takes for newcomers to understand what can be modified using variables in your template.
The variables section is a key/value mapping of the user variable name to a default value. A default value can be the empty string. An example is shown below:
{
  "variables": {
    "aws_access_key": "",
    "aws_secret_key": ""
  },
  "builders": [{
    "type": "amazon-ebs",
    "access_key": "{{user `aws_access_key`}}",
    "secret_key": "{{user `aws_secret_key`}}",
    // ...
  }]
}
Check the Packer documentation on user variables for more information.
Below is my JSON file. I want to use variables for the DB password and DB username. How can I add a variable in JSON?
{
  "name": "mydb3",
  "storage": {
    "binaryStorage": {
      "type": "database",
      "driverClass": "com.mysql.jdbc.Driver",
      "url": "$jdbcdburl",
      "username": "$jdbcusername",
      "password": "$jdbcpassword"
    }
  },
  "workspaces": {
    "default": "default",
    "allowCreation": true
  }
}
You can either build the JSON up using something like JSON Variables
https://www.npmjs.com/package/json-variables
Or you can load the JSON into memory, look for the key, and update it; an example of doing that with Node.js is linked below:
How to update a value in a json file and save it through node.js
Either way you wouldn't have to store the username and passwords in clear text, which is what I am guessing you are trying to avoid?
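If you go the load-and-update route, the idea is not tied to Node.js. Here is roughly the same read-update-write flow sketched in Scala with the ujson library; the file name, environment-variable names and fallback values are only placeholders for illustration:

import java.nio.file.{Files, Paths}
import java.nio.charset.StandardCharsets

// Load the config, overwrite the connection fields, and write it back out.
val path = Paths.get("config.json")
val json = ujson.read(new String(Files.readAllBytes(path), StandardCharsets.UTF_8))

val storage = json("storage")("binaryStorage")
storage("url")      = ujson.Str(sys.env.getOrElse("JDBC_DB_URL", "jdbc:mysql://localhost/mydb3"))
storage("username") = ujson.Str(sys.env.getOrElse("JDBC_USERNAME", "user"))
storage("password") = ujson.Str(sys.env.getOrElse("JDBC_PASSWORD", "secret"))

Files.write(path, ujson.write(json, indent = 2).getBytes(StandardCharsets.UTF_8))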
You may want to try Jsonnet, a data templating language that is an extension of JSON and can export plain JSON files. E.g.
{
  person1: {
    username: "Alice",
    password: "abc",
    welcome: "Hello " + self.username + "!",
  },
  person2: self.person1 { username: "Bob", password: "123" },
}
would produce
{
  "person1": {
    "password": "abc",
    "username": "Alice",
    "welcome": "Hello Alice!"
  },
  "person2": {
    "password": "123",
    "username": "Bob",
    "welcome": "Hello Bob!"
  }
}
Other than fields you can also declare variables using local, e.g.
local pi = 3.14;
You can use string interpolation plus backslash-escaped quotes.
See "{{testUrl}}" in this Postman-exported JSON for example:
Variable value:
"info":{
"_postman_id": "YOUR-ID-IN-POSTMAN",
"name": "YOU-EXPORTED-COLLECTION-FILE-NAME",
"schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json",
"testUrl": "https://www.techeader.com"},
Variable implemented in raw:
"url":{
"raw": "\"{{testUrl}}\"",
"protocol": "http",
"host": ["test","com"]},
JSON is a standard format for representing objects in a textual, human-readable (most of the time :-/ ) form. The concept of a variable is not applicable in this context. Variables exist in memory/code only, and a variable can be written out to JSON format in a file, for example, but at the risk of sounding "blunt", IMHO your question doesn't make much sense at this point.
If you elaborate a little more on what you want to achieve in your application, I might be able to help you better.
In the app we're developing, we create all the JSON at the server side using dynamically generated configs (JSON objects). We use that for stores (and other stuff, like GUIs), with a dynamically generated list of its data fields.
With a JSON like this:
{
  "proxy": {
    "type": "rest",
    "url": "/feature/163",
    "timeout": 600000
  },
  "baseParams": {
    "node": "163"
  },
  "fields": [
    {"name": "id", "type": "int"},
    {"name": "iconCls", "type": "auto"},
    {"name": "text", "type": "string"},
    {"name": "name", "type": "auto"}
  ],
  "xtype": "jsonstore",
  "autoLoad": true,
  "autoDestroy": true
}, ...
Ext will gently create an "implicit model" that I'll be able to work with, load into forms, save, delete, etc.
What I want is to specify through a JSON config not the fields, but the model itself. Is this possible?
Something like:
{
  model: {
    name: 'MiClass',
    extends: 'Ext.data.Model',
    "proxy": {
      "type": "rest",
      "url": "/feature/163",
      "timeout": 600000
    },
    etc...
  },
  "autoLoad": true,
  "autoDestroy": true
}, ...
That way I would be able to create a whole JSON from the server without having to glue stuff using JS statements on the client side.
Best regards,
I don't see why not. The syntax to create a model class is similar to that of stores and components:
Ext.define('MyApp.model.MyClass', {
    extend: 'Ext.data.Model',
    fields: [..]
});
So if you take this apart, you could call Ext.define(className, config); where className is a string and config is a JSON object, and both are generated on the server.
There's no way to achieve what I want.
The only way you can do it is by defining the fields of the Ext.data.Store and letting it generate the implicit model from that fields configuration.