How to convert an HCL .tf file to JSON

I want to read Terraform (.tf) files from a Node.js app, which means reading and parsing HCL variable declarations. Is there any way to convert a .tf file to a .tf.json file via the Terraform CLI? I've tried terraform show -json, but that outputs my state without the variable declarations.

I think you should use a dedicated tool for this.
This one comes up in a Google search for "nodejs hcl Terraform library":
https://github.com/NewSpring/node-hcl
(I have never seen it before, to be honest.)
A general comment: parsing a language like HCL is not a trivial thing.

Related

Modify MXNet nodes in a JSON file in C++

I have a generic MXNet model saved to a JSON file that I load in a C++ program. Depending on the specific case, I need to modify the attributes of many nodes.
Is there a way to do so after (or before) calling LoadJson on that JSON file?
Thank you.

How to convert a binary protobuf file to a JSON file in Scala?

In my project I have a proto file, and the respective class files as well. I have protobuf binary data in a blob file format. How can I convert this file to a JSON file? I am very new to Scala.
I came across the https://scalapb.github.io/docs/ site, but it is not very clear to me. I do not want to install the whole toolchain, but rather just make use of Json4s to convert the protobuf data to JSON.
Any pointers? Which library can I use, and how can I use it?
The .proto files are your source of truth. There should be no "respective class files".
ScalaPB should be used as an SBT plugin. It will generate "managed" code when you compile your project. That managed code will consist of case classes that mirror the definitions in the proto files and companion objects that can serialize the protobufs.
Managed code is code you do not edit as a user. You won't even see it unless you look for it in the target directory. If your proto files change, the compiler will re-create the proper code to reflect those changes. That is why you should not hand-create these case class files.
Then apply their Json4s helper library, which cuts down on a few steps.
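For illustration, here is the same round trip sketched with the plain Java protobuf runtime (protobuf-java-util) rather than ScalaPB itself, which I am naming plainly as a swap; Person is a placeholder for whatever class the proto compiler generates from your .proto file, and the file path is made up:

    import com.google.protobuf.util.JsonFormat;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class ProtoToJson {
        public static void main(String[] args) throws Exception {
            // read the binary protobuf blob from disk (path is hypothetical)
            byte[] blob = Files.readAllBytes(Paths.get("data/message.bin"));

            // Person stands in for the generated message class; parseFrom
            // deserializes the binary wire format
            Person person = Person.parseFrom(blob);

            // JsonFormat renders any protobuf message as JSON text;
            // scalapb-json4s plays the equivalent role on the Scala side
            String json = JsonFormat.printer().print(person);
            System.out.println(json);
        }
    }

The point either way: the generated code does the binary decoding, and a small helper library handles the JSON rendering, so you never hand-write the message classes.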

Best data processing software to parse a CSV file and make an API call per row

I'm looking for ideas for an open-source ETL or data processing tool that can monitor a folder for CSV files, then open and parse them.
For each CSV row, the software should transform the row into a JSON format and make an API call to start a Camunda BPM process, passing the cell data as variables into the process.
Looking for ideas,
Thanks
You can use a Java WatchService or Spring's FileSystemWatcher, as discussed with examples here:
How to monitor folder/directory in spring?
referencing also:
https://www.baeldung.com/java-nio2-watchservice
Once you have picked up the CSV, you can use my example here as inspiration or extend it: https://github.com/rob2universe/csv-process-starter, specifically
https://github.com/rob2universe/csv-process-starter/blob/main/src/main/java/com/camunda/example/service/CsvConverter.java#L48
The example starts a configurable process for every row in the CSV and includes the content of the row as JSON process data.
I wanted to limit the dependencies of this example, so the CSV parsing logic is very simple: commas inside field values may break it, and special characters may not be handled correctly. A more robust implementation would replace the simple Java String.split(",") with an existing CSV parser library such as OpenCSV.
The file watcher would actually be a nice extension to the example. I may add it when I get around to it, but I would also accept a pull request in case you fork my project.
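To give an idea of how the two pieces fit together, here is a minimal sketch combining a WatchService with the same naive row parsing; the folder name is made up, and the Camunda REST call is left as a comment rather than a definitive implementation:

    import java.nio.file.*;

    public class CsvFolderWatcher {
        public static void main(String[] args) throws Exception {
            Path dir = Paths.get("input-csv"); // folder to monitor (made up)
            WatchService watcher = FileSystems.getDefault().newWatchService();
            dir.register(watcher, StandardWatchEventKinds.ENTRY_CREATE);

            while (true) {
                WatchKey key = watcher.take(); // blocks until an event arrives
                for (WatchEvent<?> event : key.pollEvents()) {
                    Path file = dir.resolve((Path) event.context());
                    for (String line : Files.readAllLines(file)) {
                        // same naive split as in the example; it breaks on
                        // quoted commas, so swap in OpenCSV for real use
                        String[] cells = line.split(",");
                        // TODO: build a JSON payload from cells and POST it
                        // to Camunda's start-process REST endpoint
                    }
                }
                key.reset(); // re-arm the key so further events are delivered
            }
        }
    }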

How can we read a JSON file only once for an entire feature file in Karate

I am calling multiple JSON and JS files in the Background of my feature file, which is required for every scenario in the feature file.
def test = read('classpath:testData/responseFiles/test.json')
The problem is that this runs/reads for each scenario. Is there something I can do so that it is read only once per feature file and can be used for all scenarios? I am using Karate version 0.9.0.
callonce only works for calling a feature file, not a JSON file.
Read the JSON file in a called feature and then use callonce.
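For example (file names are illustrative): move the read into a small feature of its own, callonce it from the Background, and pick the variables off the call result:

    # common.feature
    Feature: shared test data
    Scenario:
        * def test = read('classpath:testData/responseFiles/test.json')

    # Background of each feature that needs the data
    Background:
        * def common = callonce read('classpath:common.feature')
        * def test = common.test

Since callonce caches the result of the call for the whole feature, the JSON is read only once no matter how many scenarios follow.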

Read a properties file in JSON format

I have Java code using Selenium, with a properties file in JSON format that holds multiple values, and I want to use that file in Jenkins. For that I am using the "This project is parameterized" option, where I am selecting the "File parameter" option.
So my question is: how do I use the JSON file in Jenkins? Is what I am doing correct, and what changes do I have to make in the code for that?
Can anyone help with this?
The "File parameters" is not working in the way you think, it is not like Jenkins will parse file and give you something like key/value map - no.
What is it doing is follwoing , you basically upload file and then how you use it is up to you, so in other words, if that file is for you java code, set the path for that file using the JVM params (e.g. -DpropertiesFilePath = ${abc.xyz}) and then Jenkins will parse the ${abc.xyz} for you and you java code will have proper path to file.
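On the Java side that only means reading the system property back and loading the file; a minimal sketch, assuming the property name from the example above:

    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class PropertiesFileLoader {
        public static void main(String[] args) throws Exception {
            // Jenkins expands ${abc.xyz} to the uploaded file's path and the
            // job passes it on, e.g. java -DpropertiesFilePath=... -jar tests.jar
            String path = System.getProperty("propertiesFilePath");
            String json = new String(Files.readAllBytes(Paths.get(path)));
            // hand the raw JSON to whatever parser the test code already uses
            System.out.println(json);
        }
    }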
Otherwise, if you want to use the properties inside that JSON file for the Jenkins job configuration itself, then you have to write the Jenkins job using either the Job DSL or a Jenkinsfile. There you have full access to the file and can, for example, use JsonSlurper to parse the JSON and assign properties to stages or whatever else you need in the job flow.