JSON really is a pain to use for local configuration files: it does not support comments or functions, and its syntax is needlessly verbose (commas everywhere, and keys must always use " quotes). That makes it very error prone, and in cases where functions are required, impossible to use.
Now I know that I could just do:
require('coffee-script')
config = require('config.coffee')
However, that requires me to put module.exports = {the data} inside config.coffee, which is less than ideal. It also exposes things such as require to the configuration file, which can make configuration files insecure if we do not trust them.
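For illustration, a config.coffee in that module.exports style might look something like this (the keys here are made up):
# config.coffee -- hypothetical example of the module.exports style
module.exports =
  host: 'localhost'
  port: 8080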
Has anyone found a way to read coffeescript configuration files, but keep them secure?
It turns out CoffeeScript has support for the security part built in, via setting the sandbox option to true in the eval call. E.g.
# Prepare
fsUtil = require('fs')
coffee = require('coffee-script')
# Read
dataStr = fsUtil.readFileSync('path').toString()
data = coffee.eval(dataStr, {sandbox:true})
The above code will read in the file data, then eval it with coffeescript in sandbox mode.
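For example, a file read this way could look something like the following (a made-up config.cson; comments, unquoted keys, and indentation-based nesting all work):
# config.cson -- hypothetical example
host: 'localhost'
port: 8080
database:
  user: 'admin'
  name: 'mydb'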
I've created a nice wrapper for this called CSON, which supports coffee and js files via require, cson files via the mechanism above, and json files via the usual JSON.parse, as well as stringifying values back to CoffeeScript notation. Using this, the following API is exposed:
# Include CSON
CSON = require('cson')
# Parse a file path
CSON.parseFile 'data.cson', (err,obj) -> # async
result = CSON.parseFileSync('data.cson') # sync
# Parse a string
CSON.parse src, (err,obj) -> # async
result = CSON.parseSync(src) # sync
# Stringify an object to CSON
CSON.stringify data, (err,str) -> # async
result = CSON.stringifySync(obj) # sync
In ASP.NET Core, the JsonConfigurationProvider will load configuration from appsettings.json, and then will read in the environment version, appsettings.{Environment}.json, based on what IHostingEnvironment.EnvironmentName is. The environment version can override the values of the base appsettings.json.
Is there any reasonable way to preview what the resulting overridden configuration looks like?
Obviously, you could write unit tests that explicitly test that elements are overridden to your expectations, but that would be a laborious workaround requiring upkeep every time you change a setting. It's not a good solution if you just want to validate that you didn't misplace a bracket or misspell an element name.
Back in ASP.NET's web.config transforms, you could simply right-click on a transform in Visual Studio and choose "Preview Transform". There are also many other ways to preview an XSLT transform outside of Visual Studio. Even for web.config parameterization with Parameters.xml, you could at least execute Web Deploy and review the resulting web.config to make sure it came out right.
There does not seem to be any built-in way to preview appsettings.{Environment}.json's effects on the base file in Visual Studio. I haven't been able to find anything outside of VS to help with this either. JSON overriding doesn't appear to be all that commonplace, even though it is now an integral part of ASP.NET Core.
I've figured out you can achieve a preview with Json.NET's Merge function after loading the appsettings files into JObjects.
Here's a simple console app demonstrating this. Provide it the path to where your appsettings files are and it will emit previews of how they'll look in each environment.
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

static void Main(string[] args)
{
    string targetPath = @"C:\path\to\my\app";

    // Parse appsettings.json
    var baseConfig = ParseAppSettings($@"{targetPath}\appsettings.json");

    // Find all appsettings.{env}.json's
    var regex = new Regex(@"appsettings\..+\.json");
    var environmentConfigs = Directory.GetFiles(targetPath, "*.json")
        .Where(path => regex.IsMatch(path));

    foreach (var env in environmentConfigs)
    {
        // Parse appsettings.{env}.json
        var transform = ParseAppSettings(env);

        // Clone baseConfig since Merge is a void operation
        var result = (JObject)baseConfig.DeepClone();

        // Merge the two, making sure to overwrite arrays
        result.Merge(transform, new JsonMergeSettings
        {
            MergeArrayHandling = MergeArrayHandling.Replace
        });

        // Write the preview to file
        string dest = $@"{targetPath}\preview-{Path.GetFileName(env)}";
        File.WriteAllText(dest, result.ToString());
    }
}

private static JObject ParseAppSettings(string path)
    => JObject.Load(new JsonTextReader(new StreamReader(path)));
While this is no guarantee that some other config source won't override these values once deployed, it will at least let you validate that the interactions between these two files are handled correctly.
There's not really a way to do that, but I think understanding a bit about how this actually works will help you see why.
With config transforms, there was literal file modification, so it's easy enough to "preview" that, showing the resulting file. The config system in ASP.NET Core is completely different.
It's basically just a dictionary. During startup, each registered configuration provider is run in the order it was registered. The provider reads its configuration source, whether that be a JSON file, system environment variables, command line arguments, etc. and builds key-value pairs, which are then added to the main configuration "dictionary". An "override", such as appsettings.{environment}.json, is really just another JSON provider registered after the appsettings.json provider, which obviously uses a different source (JSON file). Since it's registered after, when an existing key is encountered, its value is overwritten, as is typical for anything being added to a dictionary.
In other words, the "preview" would be the completed configuration object (dictionary), which is composed from a number of different sources, not just these JSON files. Things like environment variables or command line arguments will override even the environment-specific JSON (since they're registered after it), so you still wouldn't technically know whether the environment-specific JSON applied or not, because the value could be coming from another source that overrode it.
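To make that ordering concrete, here is a rough sketch of how the providers get layered with Microsoft.Extensions.Configuration. It mirrors the default setup rather than reproducing the framework's exact wiring, and it assumes a console app's top-level statements where args is in scope:
using System;
using Microsoft.Extensions.Configuration;

// Later providers win whenever keys collide.
var environmentName = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") ?? "Production";
IConfigurationRoot config = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", optional: true)
    .AddJsonFile($"appsettings.{environmentName}.json", optional: true)
    .AddEnvironmentVariables()   // overrides values from both JSON files
    .AddCommandLine(args)        // overrides everything registered above
    .Build();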
You can use the GetDebugView extension method on the IConfigurationRoot with something like
app.UseEndpoints(endpoints =>
{
    if (env.IsDevelopment())
    {
        endpoints.MapGet("/config", ctx =>
        {
            var config = (Configuration as IConfigurationRoot).GetDebugView();
            return ctx.Response.WriteAsync(config);
        });
    }
});
However, doing this can pose security risks, as it will expose all of your configuration, including secrets such as connection strings, so you should enable it only in development.
You can refer to this article by Andrew Lock to understand how it works: https://andrewlock.net/debugging-configuration-values-in-aspnetcore/
FileSystemLoader loads templates from a directory. Is there any way I could pull the template from a database as a string into the loader?
env = Environment(
    # loader=FileSystemLoader(templates),
    loader=Filedb('template.j2'),   # fetch from db?
    undefined=StrictUndefined       # force variables to be defined
)
env.filters['custom_filter'] = func
t = env.get_template("template.j2")
From the Jinja docs:
If you want to create your own loader, subclass BaseLoader and override get_source.
For example:
from jinja2 import BaseLoader, TemplateNotFound

class DatabaseLoader(BaseLoader):
    def __init__(self, database_credentials):
        self.database_credentials = database_credentials

    def get_source(self, environment, template):
        # Load from database... an exercise for the reader.
        # Must return (source, filename, uptodate) or raise TemplateNotFound.
        raise NotImplementedError
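To make the database part concrete, here is a minimal sketch backed by SQLite; the templates(name, source) table and the always-stale uptodate callback are assumptions for illustration only:
import sqlite3
from jinja2 import BaseLoader, TemplateNotFound

class SQLiteLoader(BaseLoader):
    """Hypothetical loader that reads template source from a templates(name, source) table."""

    def __init__(self, db_path):
        self.db_path = db_path

    def get_source(self, environment, template):
        con = sqlite3.connect(self.db_path)
        try:
            row = con.execute(
                "SELECT source FROM templates WHERE name = ?", (template,)
            ).fetchone()
        finally:
            con.close()
        if row is None:
            raise TemplateNotFound(template)
        # (source, filename, uptodate): there is no file on disk, and an
        # uptodate() that returns False forces a recompile on each load.
        return row[0], None, lambda: False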
Because templates can depend on other templates, loading one template could require multiple database lookups. Database lookups could be minimized using bytecode caching to cache compiled templates.
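For instance, a file-system bytecode cache can be attached to the environment; the cache directory below is an arbitrary example and the loader is the sketch from above:
from jinja2 import Environment, FileSystemBytecodeCache

env = Environment(
    loader=SQLiteLoader("templates.db"),
    bytecode_cache=FileSystemBytecodeCache("/tmp/jinja_cache"),
)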
It is also possible to load all of the templates from the database into a dictionary, and then load the dictionary using Jinja's DictLoader.
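A minimal sketch of that approach, with a literal dict standing in for the result of a database query:
from jinja2 import Environment, DictLoader

# In practice this dict would be built from a single query over all templates.
templates = {"template.j2": "Hello {{ name }}!"}

env = Environment(loader=DictLoader(templates))
print(env.get_template("template.j2").render(name="world"))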
At the moment I have a custom library to read a json file and output a list. For example
import json

def get(f):
    with open(f) as fd:
        data = json.load(fd)
    return [k for k in data['d']['a']]
Then in RF, I call it like this
@{items}=    get    "f.json"
Is there a way I can do this natively in Robot Framework without my custom function? I looked through HttpLibrary but couldn't find anything relevant.
Yes, it is possible. Here is how to do it without HttpLibrary, using the OperatingSystem Robot Framework library (to open the file) and the json Python library (to load the JSON):
*** Settings ***
# Import Robot Framework Libraries
Library           OperatingSystem
# Import Python Library
Library           json

*** Test Cases ***
mytest
    # no need for double quotes around the file name; variables are strings by default
    @{item} =    get_in_robot    f.json

*** Keywords ***
get_in_robot
    [Arguments]    ${file_path}
    ${data_as_string} =    Get File    ${file_path}
    ${data_as_json} =    json.loads    ${data_as_string}
    # looking into the dict at ["d"]["a"] will return the list
    [Return]    ${data_as_json["d"]["a"]}
Hope this helps
I am trying to use the OCaml csv library. I downloaded csv-1.2.3 and followed the installation instructions after installing findlib:
Uncompress the source archive and go to the root of the package,
Run 'ocaml setup.ml -configure',
Run 'ocaml setup.ml -build',
Run 'ocaml setup.ml -install'
Now I have META, csv.a, csv.cma, csv.cmi, csv.cmx, csv.cmxa, and csv.mli files in the ~/opt/lib/ocaml/site-lib/csv directory. The shell command ocamlfind list -describe gives "csv  A pure OCaml library to read and write CSV files. (version: 1.2.3)", which I believe means that csv is installed properly.
BUT when I add
let data = Csv.load "foo.csv" in
in my compute.ml module and try to compile it within the larger program package, I get the compilation error:
File "_none_", line 1, characters 0-1:
Error: No implementations provided for the following modules:
Csv referenced from compute.cmx
and if I simply type
let data = load "foo.csv" in
I get:
File "compute.ml", line 74, characters 13-17:
Error: Unbound value load
I get the same type of error when I use Csv.load or load directly in the OCaml toplevel. Would somebody have an idea of what is wrong in my code or library installation?
My guess is that you're using ocamlfind for compilation (ocamlfind ocamlc -package csv ...), because you have a linking error, not a type-checking one (which would be the case if you had not specified where csv is at all). The solution, in this case, may be to add the -linkpkg option to the final compilation line producing the executable, to ask it to link the csv library in. Otherwise, please try to use ocamlfind, and yes, tell us what your compilation command is.
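For reference, a typical ocamlfind compile line for a single-file program would look something like this (file names are placeholders):
ocamlfind ocamlopt -package csv -linkpkg compute.ml -o compute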
For the toplevel, it is very easy to use ocamlfind from it. Watch this toplevel interaction:
% ocaml
Objective Caml version 3.12.1
# #use "topfind";;
- : unit = ()
Findlib has been successfully loaded. Additional directives:
#require "package";; to load a package
#list;; to list the available packages
#camlp4o;; to load camlp4 (standard syntax)
#camlp4r;; to load camlp4 (revised syntax)
#predicates "p,q,...";; to set these predicates
Topfind.reset();; to force that packages will be reloaded
#thread;; to enable threads
- : unit = ()
# #require "csv";;
/usr/lib/ocaml/csv: added to search path
/usr/lib/ocaml/csv/csv.cma: loaded
# Csv.load;;
- : ?separator:char -> ?excel_tricks:bool -> string -> Csv.t = <fun>
To be explicit, what I typed in the toplevel was:
#use "topfind";;
#require "csv";;
Csv.load;; (* or anything else that uses Csv *)
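Once the package is loaded (or linked at compile time), using it is straightforward. Here is a minimal sketch that assumes a foo.csv sits next to the program; Csv.t is just a string list list in this version:
let () =
  let data = Csv.load "foo.csv" in
  List.iter
    (fun row -> print_endline (String.concat "; " row))
    data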
I have a simple grails file upload app.
I am using transferTo to save the file to the file system.
To get the base path in my controller I am using
def basePath = System.properties['base.dir'] // HERE IS HOW I GET IT
println "Getting new file"
println "copying file to "+basePath+"/files"
def f = request.getFile('file')
def okcontents = ['application/zip','application/x-zip-compressed']
if (! okcontents.contains(f.getContentType())) {
flash.message = "File must be of a valid zip archive"
render(view:'create', model:[zone:create])
return;
}
if(!f.empty) {
f.transferTo( new File(basePath+"/files/"+zoneInstance.title+".zip") )
}
else
{
flash.message = 'file cannot be empty'
redirect(action:'upload')
}
println "Done getting new file"
For some reason this is always null when deployed to my WAS 6.1 server.
Why does it work when running dev but not in prod on the WAS server? Should I be accessing this information in a different way?
Thanks j,
I found the best dynamic solution possible. As a rule, I never like to hard-code absolute paths into any piece of software, property file or not.
So here is how it is done:
def basePath = grailsAttributes.getApplicationContext().getResource("/files/").getFile().toString()
grailsAttributes is available in any controller.
getResource(some relative dir) will look for anything inside of the web-app folder.
So, for example, on my dev system it will toString out to "C:\WORKSPACEFOLDER\PROJECTFOLDER\web-app\" with the relative dir concatenated to the end,
like so in my example above
C:\WORKSPACEFOLDER\PROJECTFOLDER\web-app\files
I tried it in WAS 6.1 and it worked in the container no problems.
You have to toString it or it will try to return the object.
mugafuga
There's a definitive way...
grailsApplication.parentContext.getResource("dir/or/file").file.toString()
Outside of controllers (e.g. BootStrap)? Just inject:
def grailsApplication
Best regards!
Grails, when it's run in dev mode, provides a whole host of environment properties to its Gant scripts and the app in turn, including basedir.
Take a look at the grails.bat or grails.sh script and you will find these lines:
Unix: -Dbase.dir="." \
Windows: set JAVA_OPTS=%JAVA_OPTS% -Dbase.dir="."
When these scripts start your environment in dev mode you get these thrown in for free.
When you take the WAR and deploy it, you no longer use these scripts, and therefore you need to solve the problem another way; you can either:
Specify the property yourself in the startup script for the app server, e.g. -Dbase.dir=./some/dir ... however
... it usually makes more sense to use the Grails Config object, which allows for per-environment properties (see the sketch below).
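As a sketch of that approach (the myapp.basePath key is made up for illustration), you could put per-environment values in grails-app/conf/Config.groovy:
// grails-app/conf/Config.groovy
environments {
    development {
        myapp.basePath = "./web-app/files"
    }
    production {
        myapp.basePath = "/opt/myapp/files"
    }
}
and then read it in a controller with something like def basePath = grailsApplication.config.myapp.basePath.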
Another option:
def basePath = BuildSettingsHolder.settings.baseDir