Parsing translatable messages from a JSON file

I have a project that I want to be localizable. While most strings are in the source code, where xgettext/Poedit can easily find them when wrapped in the localization function call, some are in pure JSON files that I'm using for data storage. Since it's just JSON, and not actual JS, I can't use function calls. For example, a little database:
somedb.txt
[
{ "id": 1, "name": "Xyz", "local": "AxWhyZzz", /*...*/ },
/*...*/
]
Is there a way to extract the "local" values from the JSON files with xgettext? And if there isn't, what are my options? Creating a source file that has all local values, wrapped with calls to _?
Alternatively I could write my own parser of course, or modify gettext, but I'd much rather use existing solutions if available.

No, there isn't a way. JSON is just a generic container format; the actual meaning of the values is domain/application specific, and xgettext would have to understand that meaning to know what to extract. How could it understand your homegrown format?
For XML files, this is solved by ITS (v2), which gettext (and thus Poedit) has supported since 0.19.7. But for JSON, nothing exists… yet. There's some work being done (see here and here and here), though.
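That said, the workaround mentioned in the question is a practical option: generate a throwaway source file that wraps every "local" value in a call to _() and let xgettext scan that file alongside the real sources. Here is a minimal sketch in Python; the file names and the flat list-of-objects layout are taken from the example above and are otherwise assumptions.

import json

# Read the data file; this assumes it is plain JSON (no /* ... */ comments).
with open("somedb.txt", encoding="utf-8") as f:
    entries = json.load(f)

# Write a dummy "source" file whose only purpose is to be scanned by xgettext.
with open("somedb_strings.py", "w", encoding="utf-8") as out:
    out.write("# Auto-generated from somedb.txt for xgettext; do not edit.\n")
    for entry in entries:
        out.write("_(%r)\n" % entry["local"])

Regenerate somedb_strings.py whenever somedb.txt changes and pass it to xgettext (adding --keyword=_ if your setup does not already treat _ as a keyword), or add it to Poedit's source paths.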

Here is a way to get them as JS arrays through XMLHttpRequest: http://codepen.io/KryptoniteDove/post/load-json-file-locally-using-pure-javascript
Also, there is a way to include somedb.txt as valid JS if you modify it by adding a variable name, e.g. somevar, to provide further access:
somevar = [
{ "id": 1, "name": "Xyz", "local": "AxWhyZzz", /*...*/ },
/*...*/
]

Related

How to use a regular expression in JSON as a key

Is it even possible to do so? I didn't find a single solution on the internet. I'm working on an Angular 6 project. I have to translate a few strings, and for that I'm maintaining an en-US.json file in the i18n folder. The JSON is like this:
"words": {
"Car": "Wagen",
"Ship": "Schiff",
"Flight": "Flug",
"*": "", // <------------ HELP NEEDED
}
I don't want to translate the other strings at all. Can I use a regular expression to ignore them? Leaving them unattended gives me strings like:
words.train
words.helicopter
Please correct me where I am wrong.

Is there a way to get/call the current working directory from inside a JSON file?

I am not sure if I am asking this properly so I am going to try to elaborate.
If I were using a batch file, I could use the command
echo %cd%
which would tell me the current directory I am in, even if the file is in C:\Random\path.
If I opened the command prompt from the desktop and called C:\Random\path\directory.cmd, it would "echo" C:\Users\user\Desktop.
I want to do something similar with a JSON file.
I haven't tried much. I am new to JSON so this is an adventure.
Right now my JSON looks like:
"search_paths": [{
"recurse": false,
"path": "C:\\Random\\path"
}]
I want it to look something like:
"search_paths": [{
"recurse": false,
"path": "%CD%"
}]
Where %CD% is whatever JSON's equivalence is to it.
Thanks, and sorry if I'm not making as much sense as I want.
JSON is a text representation of some data structure. It does not contain variables or placeholders, only data. There is no such thing as "whatever JSON's equivalence is to it".
You can leave the path out of your JSON and modify the code that reads the JSON to add the current path into the path property.
Or you can use placeholders that have special meaning for the code that reads the JSON, and write that code to find such placeholders and replace them with values computed at runtime. For example, %CD% could be such a placeholder, and at runtime the code could replace it with the current directory.
The second approach is more flexible (and extendable). It allows you to have JSON that looks like this:
"search_paths": [
{
"recurse": false,
"path": "%CD%"
},
{
"recurse": true,
"path": "C:\\Windows"
}
]
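A minimal sketch of that second approach in Python, assuming the fragment above lives inside a top-level object in a file called config.json (both the file name and the wrapping object are assumptions):

import json
import os

# Load the configuration file.
with open("config.json", encoding="utf-8") as f:
    config = json.load(f)

# Replace the %CD% placeholder with the current working directory at runtime.
for entry in config["search_paths"]:
    entry["path"] = entry["path"].replace("%CD%", os.getcwd())

print(config["search_paths"])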

Manipulating (nested) JSON keys and their values using NiFi

I am currently facing an issue where I have to read a JSON file that has mostly the same structure, is about 10k+ lines long, and is nested.
I thought about creating my own custom processor which reads the JSON and replaces several matching keys/values with the ones needed. As I am trying to use NiFi, I assume there should be a more convenient way, since the JSON structure itself is mostly consistent.
I already tried using the ReplaceText processor as well as the JoltTransformJson processor, but I could not figure it out. How can I transform both keys and values, if needed? For example, if there is something like this:
{
"id": "test"
},
{
"id": "14"
}
It might be necessary to turn the "id" into "Number" and map "test" to "3", as I am using different keys/values in my JSON files/database, so they need to fit those. Is there a way of doing so without having to create my own processor?
Regards,
Steve

Is it possible to use an if-else statement inside a JSON file? [duplicate]

I want to include an if-else condition in JSON based on which I need to set an attribute in the JSON file.
For example like this:
"identifier": "navTag",
"items": [{
"label": "abc",
"url": "yxz.com",
},
{
"label": "abc1",
"url": "yxz1.com",
},
{
"label": "abc2",
"url": "yxz2.com",/*I need to change this value on certain
condition like if condition is true then
"url": xyz2.com if false "url":xyz3.com*/
}
]
Is this possible?
JSON is a structure for storing data so that it can be retrieved easily, so you cannot put conditions in it. If you want to retrieve data according to some if-else condition, there are two possible ways:
1. Create different JSON files for different conditions.
2. Create two fields in your JSON structure called if and else. If the condition is satisfied, fetch the if field's value; otherwise, fetch the else field's value.
For example:
{
"if":"if-value",
"else":"else-value"
}
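The consuming code then picks one of the two fields itself; a tiny Python sketch (the condition is invented here for illustration):

import json

doc = json.loads('{"if": "if-value", "else": "else-value"}')

# The condition still lives in the consuming code, not in the JSON.
condition = True   # hypothetical; comes from your application logic
value = doc["if"] if condition else doc["else"]
print(value)       # prints "if-value"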
JSON is only a data representation (unrelated to any programming language, even if early JavaScript implementations remotely inspired it). There is no notion of "execution" or "conditional" (or of "behavior" or of "semantics") in it.
Read the (short) JSON definition carefully. It simply defines what sequence of characters (e.g. the content of a file) is valid JSON. It does not define the "meaning" of JSON data.
JSON data is parsed by some program, and emitted by some program (often different ones, but could be the same).
The program handling JSON can of course use conditions and give some "meaning" (whatever is the definition of that word) to it. But JSON is only "data syntax".
You could (easily) write your own JSON transformer (using some existing JSON library, and there are many of them), and that is really simple. Some programs (notably jq) claim to be more or less generic JSON processors.
Since JSON is a textual format, you could even use some editor (such as emacs, vim or many others) to manually change parts of it. You had better validate the result with some existing JSON parser (to be sure you did not introduce any mistakes).
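To make that concrete, here is a small Python sketch in which the condition lives in the program that emits the JSON, using the labels and URLs from the question (the some_condition flag is invented for illustration):

import json

some_condition = True  # hypothetical flag supplied by your application

items = [
    {"label": "abc",  "url": "yxz.com"},
    {"label": "abc1", "url": "yxz1.com"},
    # The choice happens here, in code, before the JSON is written out.
    {"label": "abc2", "url": "yxz2.com" if some_condition else "yxz3.com"},
]

print(json.dumps({"identifier": "navTag", "items": items}, indent=2))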

Efficient Portable Database for a Hierarchical Dataset - JSON, SQLite, or something else?

I need to make a file that contains a hierarchical dataset. The dataset in question is a file-system listing (directory names, file name/sizes in each directory, sub-directories, ...).
My first instinct was to use JSON and flatten the hierarchy using paths so the parser doesn't have to recurse so much. As seen in the example below, each entry is a path ("/", "/child01", "/child01/gchild01", ...) and its files.
{
"entries":
[
{
"path":"/",
"files":
[
{"name":"File1", "size":1024},
{"name":"File2", "size":1024}
]
},
{
"path":"/child01",
"files":
[
{"name":"File1", "size":1024},
{"name":"File2", "size":1024}
]
},
{
"path":"/child01/gchild01",
"files":
[
{"name":"File1", "size":1024},
{"name":"File2", "size":1024}
]
},
{
"path":"/child02",
"files":
[
{"name":"File1", "size":1024},
{"name":"File2", "size":1024}
]
}
]
}
Then I thought that repeating the keys over and over ("name", "size") for each file kind of sucks. So I found this article about how to use JSON as if it were a database - http://peter.michaux.ca/articles/json-db-a-compressed-json-format
Using that technique I'd have a JSON table like "Entry" with columns "Id", "ParentId", "EntryType", "Name", "FileSize", where "EntryType" would be 0 for Directory and 1 for File.
So, at this point, I'm wondering if SQLite would be a better choice. I'm thinking that the file size would be a LOT smaller than a JSON file, but it might only be negligible if I use the JSON-DB-compressed format from the article. Besides size, are there any other advantages that you can think of?
A JavaScript object as the data source, loaded as a file stream into the browser and then used in JavaScript logic in the browser, would take the least time and perform well, but only up to a limited hierarchy size.
Also, not storing the hierarchy anywhere else and keeping it only as a JSON file limits your data source's use in your project to client-side technologies, or forces conversions to other technologies.
If you are building a pure JavaScript application (an HTML/JS/CSS-only app), then you could keep it as a JSON object alone and limit your hierarchy sizes; you could split bigger hierarchies into multiple files of linked JSON objects.
If you will have server-side code such as PHP in your project, then for maintainability and scaling you should ideally store the data in an SQLite database and, at runtime, create your JSON hierarchies for a limited number of levels as AJAX loads from your page.
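As a rough illustration of that direction, the "Entry" table described in the question (Id, ParentId, EntryType, Name, FileSize) could be created and filled with Python's built-in sqlite3 module like this (the database file name and the sample rows are assumptions):

import sqlite3

con = sqlite3.connect("listing.db")  # hypothetical file name
con.execute("""
    CREATE TABLE IF NOT EXISTS Entry (
        Id        INTEGER PRIMARY KEY,
        ParentId  INTEGER REFERENCES Entry(Id),
        EntryType INTEGER NOT NULL,   -- 0 = directory, 1 = file
        Name      TEXT NOT NULL,
        FileSize  INTEGER             -- NULL for directories
    )
""")

# A few rows matching the "/" entry of the JSON example above.
root_id = con.execute(
    "INSERT INTO Entry (ParentId, EntryType, Name) VALUES (NULL, 0, '/')"
).lastrowid
con.executemany(
    "INSERT INTO Entry (ParentId, EntryType, Name, FileSize) VALUES (?, 1, ?, ?)",
    [(root_id, "File1", 1024), (root_id, "File2", 1024)],
)
con.commit()

Listing one directory is then a single query on ParentId, and the JSON for any sub-hierarchy can be built from such queries on demand.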
If this is the only data your application stores, then you can do something really simple: just store the data in an easy-to-parse/read text file like this:
File1:1024
File2:1024
child01
    File1:1024
    File2:1024
    gchild01
        File1:1024
        File2:1024
child02
    File1:1024
    File2:1024
Files get File:Size and directories get just their name. Indentation gives structure. For something slightly more standard but just as easy to read, use yaml.
http://www.yaml.org/
Both can benefit from decreased file size (but decreased user readability) by gzipping the file.
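If you do go with such a plain-text listing, reading it back is only a few lines; this Python sketch (the four-space indent and the helper name are assumptions) turns the listing above into a path-to-files mapping, treating any line without a colon as a directory name:

def parse_listing(text, indent=4):
    """Parse the indented listing into a {path: [(name, size), ...]} dict."""
    tree = {"/": []}
    stack = []  # directory names leading to the current line
    for raw in text.splitlines():
        if not raw.strip():
            continue
        depth = (len(raw) - len(raw.lstrip(" "))) // indent
        line = raw.strip()
        if ":" in line:                          # file entry: Name:Size
            name, size = line.rsplit(":", 1)
            path = "/" + "/".join(stack[:depth]) if depth else "/"
            tree.setdefault(path, []).append((name, int(size)))
        else:                                    # directory entry
            stack = stack[:depth] + [line]
            tree.setdefault("/" + "/".join(stack), [])
    return tree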
And if you have more data to store, then use SQLite. SQLite is great.
Don't use JSON for data persistence. It's wasteful.