Filename with string cannot be created in PowerShell - json

Here, the name of the file I want is b0:d5:cc:fc:90:98_Reader_Configuration.
"b0:d5:cc:fc:90:98" is a string.
After creating the file, the filename I get is _Reader_Configuration.
$newMAC = "b0:d5:cc:fc:90:98"
$newData | Out-File C:\Users\sams\Desktop\Reader_Test\Config_Files\${$newMAC}_Reader_Configuration.json
I get the file with the data but not the filename I want.
Any help is appreciated.

Colons (:) are not allowed in Windows filenames, so even a correctly expanded name would fail; replace them first (for example with the -replace operator).
See the Microsoft docs: Naming Conventions
There is also a second problem, and it explains the _Reader_Configuration result: ${$newMAC} is not a valid variable reference. The ${...} syntax treats everything between the braces as the variable name, so PowerShell looks up a variable literally named $newMAC, finds nothing, and expands it to an empty string. Use $($newMAC) inside a double-quoted path instead.
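For the colon part, here is a minimal sketch of the idea in Python (the '-' substitute character is an arbitrary choice, not something from the question; in PowerShell, -replace ':','-' does the same job):
# Build a Windows-safe filename: colons are not valid in Windows
# filenames, so replace them before composing the name.
new_mac = "b0:d5:cc:fc:90:98"
safe_mac = new_mac.replace(":", "-")  # arbitrary '-' substitute
filename = safe_mac + "_Reader_Configuration.json"
print(filename)  # b0-d5-cc-fc-90-98_Reader_Configuration.json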

Related

How to treat strings in ConvertFrom-Json as literals?

I'm importing a set of configuration values from a file in JSON format using the following:
$configFileContent = Get-Content -Path run.config | ConvertFrom-Json
This produces a result that, among other things, contains the following (the contents of the variable $configFileContent):
{
    "config-values": {
        "path": "..\temp-path"
    }
}
Next, I try to access the value of path from that config as follows:
$conf = $configFileContent.'config-values'
$tempPath = $conf.'path'
...but this fails because the characters \t in ..\temp-path are interpreted as an escape sequence representing a tab. This is clear from printing the contents of $conf, which is now:
path
----
.. emp-path
As you can see, the value of path is .. <tab> emp-path instead of ..\temp-path, as intended. Obviously this causes trouble later when I try to use the variable $tempPath as an actual path.
How can I make PowerShell interpret this as intended, i.e., treat strings as literals here?
I understand this may not be the answer you are looking for, but the quick fix is to use a double backslash in your JSON file.
This is standard across languages: \ is the escape character in JSON strings, so a literal backslash must be written as \\.
{
    "config-values": {
        "path": "..\\temp-path"
    }
}
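A quick way to see both behaviors is to round-trip the two variants through a JSON parser. Here is a sketch in Python (raw strings keep the backslashes intact in the Python source):
import json

# \t inside a JSON string is the escape sequence for a tab,
# so the backslash is consumed during parsing:
broken = json.loads(r'{"path": "..\temp-path"}')
print(repr(broken["path"]))  # '..\ttemp-path' -- contains a real tab

# \\ decodes to a single literal backslash, as intended:
fixed = json.loads(r'{"path": "..\\temp-path"}')
print(repr(fixed["path"]))   # '..\\temp-path' -- one literal backslash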

Convert CSV to JSON using Python

I need to convert a CSV file to a JSON file using Python. I used this:
variable = csv.DictReader(open("file.csv"))
It throws this ERROR
csv.Error: line contains NULL byte
I checked the CSV file in Excel and it shows no NUL chars, but when I printed the data from the CSV file using Python, there were values like SOHNULNULHG (SOH and NUL are control characters; the last two letters, HG, are the data that Excel displays). I need to remove these control characters from the CSV file while converting to JSON (i.e. I need only HG from the above string).
I just ran into the same issue. I converted my csv file to CSV UTF-8 and ran it again without any errors. That seemed to fix the ASCII char issue.
To convert the csv type, I just opened my file in Excel, did Save As, then selected CSV UTF-8 (Comma delimited) (*.csv) as the Save as type.
Hope that helps.
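If re-saving from Excel is not an option, the control bytes can also be stripped in the script itself. A minimal sketch, assuming the file names (the SOH/NUL bytes suggest the file was saved in a UTF-16-style encoding, which is why re-saving as UTF-8 also fixes it):
import csv
import io
import json

# Read raw bytes and drop the NUL (0x00) and SOH (0x01) control bytes
# that make csv.DictReader raise "line contains NULL byte".
with open("file.csv", "rb") as f:
    raw = f.read().replace(b"\x00", b"").replace(b"\x01", b"")

rows = list(csv.DictReader(io.StringIO(raw.decode("utf-8", errors="ignore"))))

# Write the cleaned rows out as JSON.
with open("file.json", "w") as out:
    json.dump(rows, out, indent=2)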

How do I get strings from a JSON file?

I'm writing an internationalized desktop program in Vala, in which I use an external JSON file to store a list of languages.
I'm using gettext for l10n, so if I get a string from the JSON file and do something like _(string_var), I get the translated string. The problem is that I don't know how to add the string to the POT file using xgettext or some similar tool.
Any ideas?
If the tool jq (http://stedolan.github.io/jq/) is an option for you, the below might work:
$ curl -s https://raw.githubusercontent.com/chavaone/gnomecat/master/data/languages.json | jq .languages[43].name
"English"
The solution I finally used was to modify the JSON file to use double-quoted strings only when I wanted to translate that string. For example:
{
    'code' : 'es',
    'name' : "Spanish; Castilian",
    'pluralform' : 'nplurals=2; plural=(n != 1);',
    'default-team-email': 'gnome-es-list@gnome.org'
}
In the previous piece of JSON the only string I wanted to translate was "Spanish; Castilian". Then in POTFILES.in, I just used the gettext/quoted type.
# List of source files containing translatable strings.
# Please keep this file sorted alphabetically.
[encoding: UTF-8]
[type: gettext/quoted]data/languages.json
[type: gettext/quoted]data/plurals.json
[type: gettext/glade]data/ui/appmenu.ui
[...]

How to escape quotes that passed from a JSON file to Jade template?

I have some variables stored in a JSON file which will be injected into my generated HTML later. Those variables would be put in places like:
var str = '#{content.str}';
In the JSON file, content.str might contain ' characters, and these cause a JavaScript error after the HTML is rendered.
What should I do to prevent this from happening?
Thanks!
It's simple -
"I've done it".replace("'", "\\'")
//output "I\'ve done it"
Using RegEx - replace all
"I've done it haven't you".replace(/'/g, "\\'")
//output "I\'ve done it haven\'t you"

How to use an ASCII character for quote in COPY in cqlsh

I am uploading data from a big .csv file into Cassandra using COPY in cqlsh.
I am using Cassandra 1.2 and CQL 3.0.
However, since " is part of my data, I have to use some other character as the quote character, and it needs to be an extended ASCII character. I tried various approaches, but they fail.
The following work, but I need to use an extended ASCII character for my purpose:
copy <tablename> (<columnnames>) from <filename> with delimiter='|' and quote='"';
copy <tablename> (<columnnames>) from <filename> with delimiter='|' and quote='~';
When I give quote='ß', I get the error below:
"quotechar" must be an 1-character string
Please advise on how I can use an extended ASCII character for the quote parameter.
Thanks in advance
A note on the COPY documentation page suggests that for bulk loading (as in your case), the json2sstable utility should be used. You can then load the SSTables into your cluster using sstableloader. So I suggest that you write a script/program to convert your CSV to JSON and use these tools for your big CSV, as sketched below. JSON has no problem handling any character from the ASCII table.
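A minimal sketch of the CSV-to-JSON step in Python (the file names are placeholders and the '|' delimiter is taken from the question; note that json2sstable expects Cassandra's own JSON layout, so the output below would still need to be reshaped to match the format described in its documentation):
import csv
import json

# Parse the pipe-delimited CSV into a list of dicts keyed by column name.
with open("data.csv", newline="") as f:
    rows = list(csv.DictReader(f, delimiter="|"))

# Dump to JSON; reshape as required by json2sstable before loading.
with open("data.json", "w") as out:
    json.dump(rows, out, indent=2)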
I had a similar problem, and inspected the source code of cqlsh (it's a Python script). In my case, I was generating the CSV with Python, so it was a matter of finding the right Python csv parameters.
Here's the key information from cqlsh:
csv_dialect_defaults = dict(delimiter=',', doublequote=False,
escapechar='\\', quotechar='"')
So if you are lucky enough to generate your .csv file from python, it's just a matter of using the csv module with:
writer = csv.writer(open("output.csv", 'w', newline=''), **csv_dialect_defaults)
Hope this helps, even if you are not using python.
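If you need to check the file from Python, reading it back with the same dialect parameters round-trips the quoting (a small sketch, same file name assumption as above):
import csv

csv_dialect_defaults = dict(delimiter=',', doublequote=False,
                            escapechar='\\', quotechar='"')

with open("output.csv", newline='') as f:
    for row in csv.reader(f, **csv_dialect_defaults):
        print(row)  # each row parsed with cqlsh's expected quoting rules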