Why does Sublime Text 3 allow comments in JSON configuration files?

Using comments in JSON configuration files in Sublime Text can make the JSON impossible to decode. Here is my story.
I recently installed the SublimeREPL plugin in Sublime Text 3. I soon discovered that it runs Python 2.7 instead of 3.5 by default, so I added my own Python 3.5 configuration files, following the SublimeREPL docs, to make it support Python 3.5.
My Packages/SublimeREPL/config/Python3.5/Main.sublime-menu JSON config file looks like this:
[
    {
        "id": "tools",
        "children": [
            {
                "caption": "SublimeREPL",
                "mnemonic": "R",
                "id": "SublimeREPL",
                "children": [
                    {
                        "caption": "Python3.5",
                        "id": "Python3.5",
                        "children": [
                            {
                                "command": "repl_open",
                                "caption": "Python3.5",
                                "id": "repl_python3.5",
                                "mnemonic": "P",
                                "args": {
                                    "type": "subprocess",
                                    "encoding": "utf8",
                                    "cmd": ["python3", "-i", "-u"],
                                    "cwd": "$file_path",
                                    "syntax": "Packages/Python/Python.tmLanguage",
                                    "external_id": "python3",
                                    "extend_env": {"PYTHONIOENCODING": "utf-8"}
                                }
                            },
                            // run files
                            {
                                "command": "repl_open",
                                "caption": "Python3.5 - RUN current file",
                                "id": "repl_python3.5_run",
                                "mnemonic": "R",
                                "args": {
                                    "type": "subprocess",
                                    "encoding": "utf8",
                                    "cmd": ["python3", "-u", "$file_basename"],
                                    "cwd": "$file_path",
                                    "syntax": "Packages/Python/Python.tmLanguage",
                                    "external_id": "python3",
                                    "extend_env": {"PYTHONIOENCODING": "utf-8"}
                                }
                            }
                        ]
                    }
                ]
            }
        ]
    }
]
Note there is a // run files comment in this file. This config works fine from the menu bar (Tools -> SublimeREPL -> Python3.5). However, when I tried to bind the F5 key to repl_python3.5_run for easier access to Python 3.5, the following exception was thrown in the console:
Traceback (most recent call last):
File "./python3.3/json/decoder.py", line 367, in raw_decode
StopIteration
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/sublime_text/sublime_plugin.py", line 551, in run_
return self.run(**args)
File "/home/ubuntu/.config/sublime-text-3/Packages/SublimeREPL/run_existing_command.py", line 32, in run
json_cmd = self._find_cmd(id, path)
File "/home/ubuntu/.config/sublime-text-3/Packages/SublimeREPL/run_existing_command.py", line 41, in _find_cmd
return self._find_cmd_in_file(id, file)
File "/home/ubuntu/.config/sublime-text-3/Packages/SublimeREPL/run_existing_command.py", line 53, in _find_cmd_in_file
data = json.loads(bytes)
File "./python3.3/json/__init__.py", line 316, in loads
File "./python3.3/json/decoder.py", line 351, in decode
File "./python3.3/json/decoder.py", line 369, in raw_decode
ValueError: No JSON object could be decoded
After I removed the // run files comment, the F5 key worked fine. It is exactly the comment that causes the problem.
Sublime Text uses JSON for its config files, and lots of those config files ship with //-style comments. As we know, comments were removed from JSON by design.
So how can Sublime Text allow comments in config files? Is it stripping them in a preprocessing step? If so, why does my key binding fail?

Sublime itself (the core program, not plugins like SublimeREPL) uses an internal JSON library for parsing config files like .sublime-settings, .sublime-menu, .sublime-build, etc. This (most likely customized) parser allows comments.
However, plugins run in a version of Python (currently 3.3.6 for the dev builds) embedded in Sublime's plugin_host executable. Any plugin that imports the standard library's json module (such as run_existing_command.py) has to obey the restrictions of that module, and those include failing to recognize JavaScript-style // comments in JSON.
One workaround is to use an external module like commentjson, which strips various types of comments, including //, before passing the data on to the standard json module. Since it is a pure Python module, you can simply copy its source directory into the main SublimeREPL directory, then edit run_existing_command.py accordingly: change line 6 to import commentjson as json and you're all set.
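For illustration, here is a minimal sketch of the same idea without the external dependency, assuming (as in the menu file above) that each // comment sits on its own line; inline comments after a value would still need a real parser such as commentjson:

```python
import json

def loads_with_comments(text):
    # Drop lines that contain only a // comment, then hand the rest
    # to the strict standard-library parser. This covers the
    # "// run files" line above, but not inline comments.
    lines = [ln for ln in text.splitlines()
             if not ln.lstrip().startswith("//")]
    return json.loads("\n".join(lines))

menu = """
[
    // run files
    {"command": "repl_open", "caption": "Python3.5 - RUN current file"}
]
"""
print(loads_with_comments(menu)[0]["caption"])
```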

Related

The file ionic.config.json is not a valid format

I keep getting this weird error:
The config file (.\ionic.config.json) is not a valid format
I've looked at a few questions answering this and they've told me to delete a file in:
C:\Users\{user}\.ionic\ionic.config
but even if I do that the error is the same, which makes me think the problem is in my local config file, which looks like this:
{
    "app_id": "",
    "name": "AstootClient",
    "type": "ionic-angular",
    "typescript": true,
    "v2": true
}
What can I do to make Ionic 2 start working again?

Is it possible to define global compile tasks (e.g. file-extension-dependent) for Visual Studio Code?

I set up a tasks.json file to build my C code in VS Code, which also includes a problem matcher.
{
    "version": "0.1.0",
    "command": "gcc",
    "args": ["-Wall", "-o", "${fileBasenameNoExtension}", "${fileBasename}"],
    "problemMatcher": {
        "owner": "cpp",
        "fileLocation": ["relative", "${workspaceRoot}"],
        "pattern": {
            "regexp": "^(.*):(\\d+):(\\d+):\\s+(warning|error):\\s+(.*)$",
            "file": 1,
            "line": 2,
            "column": 3,
            "severity": 4,
            "message": 5
        }
    }
}
Since I use variable substitution for the source file and the executable, this works as a portable solution for comfortable compilation of single source files. Unfortunately, VS Code requires tasks.json to be defined separately for each project, i.e. each folder. Is there any way to derive the suitable build task from the file extension globally, for any file?

Azure customscriptextension can't find script file

I'm using Powershell and a JSON file to create new VM's. The last part of the JSON file instructs the newly-created VM to run a file. I can verify (via the Azure portal and via Powershell) that the file exists at the URL given in the JSON file.
Yet the VM finishes with result ProvisioningState: Failed because "VM has reported a failure when processing extension 'CustomScriptExtension'. Error message: 'Finished executing command'", which is frustratingly ambiguous. (We've also had problems with the domain-join CustomScriptExtension, which likewise "failed" with a success message.)
The script, which is a simple CMD file, should create the file C:\Users\Public\Documents\runonce.log, but this does not happen so I guess the file does not run.
However, the VM is created, and I can then manually log in and run the file just fine. So it should not be a matter of user privileges or such.
Edit: This comment states that "CustomScriptExtension does not run with local admin priviledges", but I don't think that's the problem here.
What am I missing?
Here's the relevant part of the JSON:
...
"resources": [
    ...
    {
        "type": "Microsoft.Compute/virtualMachines/extensions",
        "name": "[concat(parameters('vmName'),'/CustomScriptExtension')]",
        "apiVersion": "2015-05-01-preview",
        "location": "[resourceGroup().location]",
        "dependsOn": ["[concat('Microsoft.Compute/virtualMachines/', parameters('vmName'))]"],
        "properties": {
            "publisher": "Microsoft.Compute",
            "type": "CustomScriptExtension",
            "typeHandlerVersion": "1.2",
            "settings": {
                "fileUris": ["[concat(parameters('scriptFilePath'),parameters('scriptFileName'))]"],
                "commandToExecute": "[concat('powershell -ExecutionPolicy Unrestricted -file ',parameters('scriptFileName'))]"
            }
        }
    }
]
and I have verified that scriptFilePath and scriptFileName are https://euwest2766.blob.core.windows.net/script/ and run-once.cmd which corresponds to the blob URL shown in Azure. I have even tried putting that Azure Blob URL into the JSON, with the same result.
This won't work because you are trying to run a batch file through PowerShell.
You would need to change this:
"commandToExecute": "[concat('powershell -ExecutionPolicy Unrestricted -file ',parameters('scriptFileName'))]"
to this:
"commandToExecute": "[concat('cmd /c ',parameters('scriptFileName'))]"
(note the trailing space after /c, since concat() does not insert one). That should execute as expected.
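The fix hinges on the ARM template concat() function doing plain string joining with no implicit separator, so any space between the command and the file name must be part of one of the arguments. A rough Python stand-in (the parameter value here is hypothetical):

```python
def concat(*parts):
    # Rough stand-in for the ARM template concat() function:
    # plain string joining, no implicit separators.
    return "".join(parts)

script_file_name = "run-once.cmd"  # hypothetical parameter value
print(concat("cmd /c ", script_file_name))  # space included in the first part
print(concat("cmd /c", script_file_name))   # without it, flag and file name fuse
```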

How to configure LumberJack(LogStash-forwarder) on windows

I've installed ELK on my Ubuntu server using this manual.
Now I want to index some log files from a Windows server, so I installed the logstash-forwarder (Lumberjack), but I can't get it to run.
This is the logstash-forwarder.conf file:
{
    "network": {
        "servers": [ "http://XX.XX.XX.XX:5000" ],
        "ssl key": "D:/lumberjack/pki/tls/certs/logstash-forwarder.crt",
        "ssl ca": "D:/lumberjack/pki/tls/certs/logstash-forwarder.crt",
        "timeout": 15,
    },
    "files": [
        {
            "paths": [
                #single paths are fine
                "D:/bea12/Oracle/Middleware/domains/Google/servers/RT1/logs/AppLogs/RT1_APP_9_0.log",
                #globs are fine too, they will be periodically evaluated
                #to see if any new files match the wildcard.
                "/var/logauth.log"
            ],
        ]
    }
}
And this is the error I get when I try to run the lumberjack.exe that I created with go build:
2015/04/30 18:17:39.052033 Failed unmarshalling json: invalid character '}' looking for beginning of object key string
2015/04/30 18:17:39.052033 Could not load config file d:\lumberjack\logstash-forwarder.conf: invalid character '}' looking for beginning of object key string
Can anyone please tell me what I am doing wrong?
By the way, this is the command I'm using to run the forwarder:
lumberjack.exe -config="d:\lumberjack\logstash-forwarder.conf"
OK, so the problem was in the configuration file: there were two unnecessary commas, and no need for the http:// prefix at the start of the server address:
{
    "network": {
        "servers": [ "XX.XX.XX.XX:5000" ],
        "ssl key": "D:/lumberjack/pki/tls/certs/logstash-forwarder.key",
        "ssl ca": "D:/lumberjack/pki/tls/certs/logstash-forwarder.crt",
        "timeout": 15
    },
    "files": [
        {
            "paths": [
                #single paths are fine
                "D:/bea12/Oracle/Middleware/domains/google/servers/RT1/logs/AppLogs/RT1_APP_9_0.log",
                #globs are fine too, they will be periodically evaluated
                #to see if any new files match the wildcard.
                "/var/logauth.log"
            ]
        }
    ]
}
This is my suggested configuration file for LumberJack on windows.
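The Go error above ("invalid character '}' looking for beginning of object key string") is the same strictness any standards-compliant JSON parser applies to trailing commas; a quick way to see it, sketched in Python:

```python
import json

# A trailing comma like the one after "timeout": 15 in the broken config:
broken = '{"network": {"timeout": 15,}}'
try:
    json.loads(broken)
except json.JSONDecodeError as err:
    print("rejected:", err.msg)

# With the comma removed, parsing succeeds:
fixed = '{"network": {"timeout": 15}}'
print("timeout:", json.loads(fixed)["network"]["timeout"])
```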

OrientDB ETL Throws exception on loading CSV file

I am trying to load a simple CSV file into OrientDB, and it always throws the exception below.
CSV File Content
id, name, role
1, Sarath, Architect
2, Anoop, Project Manager
3, Nazeem, Lead Developer
4, Rini, Senior Developer
5, Shine, iOS Developer
6, Vishnu, iOS Developer
JSON config file
{
    "source": { "file": { "path": "./dev.csv" } },
    "extractor": { "row": {} },
    "transformers": [
        { "csv": {} },
        { "vertex": { "class": "Person" } }
    ],
    "loader": {
        "orientdb": {
            "dbURL": "remote:localhost/dev",
            "dbType": "graph",
            "dbUser": "root",
            "dbPassword": "root",
            "dbAutoCreate": true,
            "classes": [
                { "name": "Person", "extends": "V" }
            ],
            "indexes": [
                { "class": "Person", "fields": ["id:integer"], "type": "UNIQUE" }
            ]
        }
    }
}
It shows this exception every time. I tried different CSV options, but it did not work. It looks like the file reading itself throws the exception.
» oetl posts.json
OrientDB etl v.2.0.1 (build #BUILD#) www.orientechnologies.com
Exception in thread "main" com.orientechnologies.orient.core.exception.OConfigurationException: Error on loading config file: posts.json
at com.orientechnologies.orient.etl.OETLProcessor.main(OETLProcessor.java:151)
Platform Details
OS X 10.10.2
java version "1.8.0_25"
OrientDB - v.2.0.1
The stack trace gives you the cause of the problem (you showed only the first, uninteresting lines in your example).
Check:
The server is running (you specified remote:localhost/dev, so the server must be running on the default port on the same machine).
The database exists and has "root"/"root" as credentials (note that "admin"/"admin" is most commonly used).
The file dev.csv is present in the current directory.
I tried out your example and it works without modification.
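Since an "Error on loading config file" message like this often just wraps a plain JSON parse failure, a quick first sanity check (an assumption about the likely cause, not from the OrientDB docs) is to run the config through any strict JSON parser yourself, which points at the exact offending character:

```python
import json

# Trimmed stand-in for the ETL config; read your real posts.json instead.
config_text = '{"source": {"file": {"path": "./dev.csv"}}}'
try:
    cfg = json.loads(config_text)
    print("valid JSON, source path:", cfg["source"]["file"]["path"])
except json.JSONDecodeError as err:
    print(f"parse error at line {err.lineno}, column {err.colno}: {err.msg}")
```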
I had the same problem, and it was due to one of the directories in the path to the JSON config file having a space in it. The oetl script doesn't handle this well and thus can't find the config file.
@rmuller: The OP showed the entire traceback. That's all you get in this case.