I am using a cookiecutter template with a json file like this:
{
  "full_name": "Your name",
  "email": "Your email address (e.g. you@example.com)",
  "project_name": "Project name",
  "project_slug": "{{cookiecutter.project_name.lower().replace(' ', '-')}}",
  "project_desc": "Project description",
  "git_remote": "Git repository (if known)",
  "start_date": "{% now 'local' %}"
}
I wonder if there is a way to add a variable that retrieves the path to the folder generated by cookiecutter. I guess we could use something like this:
"folder_path": "{% pwd %}"
But this does not work and I am not sure how to set it. The code in brackets is Jinja2. Thanks for the help!
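For what it's worth, Jinja2 has no pwd tag, and the output folder does not exist yet when the context JSON is rendered, so the template variables cannot resolve it. A commonly used alternative is a post-generation hook: hooks/post_gen_project.py runs with the generated project as its working directory, so os.getcwd() returns the path. A minimal sketch (what you do with the path afterwards is up to you):

# hooks/post_gen_project.py
# Runs after generation, inside the freshly created project folder.
import os

folder_path = os.getcwd()  # absolute path to the generated project
print("Project generated at: " + folder_path)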
If I had a JSON file like this:
{
  "allMyTags": {
    "owner": "john",
    "department": "HR",
    "city": "New York"
  }
}
and my AWS provider Terraform main.tf looks like this:
resource "aws_vpc" "example" {
  # ... other configuration ...
  tags = {
    owner = "john"
  }
}
How do I go about replacing everything in the tags section of main.tf with the external JSON file? The JSON file is a lot longer than shown here, and I didn't want to manually put 20 values into the tags section of main.tf. Is there a way to "loop" through the JSON file and add it in? Thanks for any help you can provide.
Assuming that your JSON is already loaded into TF (i.e. decoded into a map), you could do:
resource "aws_vpc" "example" {
  # ... other configuration ...
  tags = local.myjson["allMyTags"]
}
where local.myjson is the decoded JSON. (Note that jsondecode() belongs where the raw file content is read, not around the already-decoded value.)
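A fuller sketch of the loading step, assuming the tags live in a tags.json file next to the configuration (the file name and CIDR are placeholders):

locals {
  # Decode the external JSON once; "allMyTags" matches the top-level key above
  myjson = jsondecode(file("${path.module}/tags.json"))
}

resource "aws_vpc" "example" {
  cidr_block = "10.0.0.0/16" # placeholder
  tags       = local.myjson["allMyTags"]
}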
According to https://learn.microsoft.com/en-gb/azure/virtual-machines/windows/extensions-dsc-template, the latest method for passing credentials from an ARM template to a DSC extension is by placing the whole credential within the configurationArguments of the protectedSettings section, as shown below:
"properties": {
"publisher": "Microsoft.Powershell",
"type": "DSC",
"typeHandlerVersion": "2.24",
"autoUpgradeMinorVersion": true,
"settings": {
"wmfVersion": "latest",
"configuration": {
"url": "[concat(parameters('_artifactsLocation'), '/', variables('artifactsProjectFolder'), '/', variables('dscArchiveFolder'), '/', variables('dscSitecoreInstallArchiveFileName'))]",
"script": "[variables('dscSitecoreInstallScriptName')]",
"function": "SitecoreInstall"
},
"configurationArguments": {
"nodeName": "[parameters('CMCD VMName')]",
"sitecorePackageUrl": "[concat(parameters('sitecorePackageLocation'), '/', parameters('sitecoreRelease'), '/', parameters('sitecorePackageFilename'))]",
"sitecorePackageUrlSasToken": "[parameters('sitecorePackageLocationSasToken')]",
"sitecoreLicense": "[concat(parameters('sitecorePackageLocation'), '/', parameters('sitecoreLicenseFilename'))]",
"domainName": "[parameters('domainName')]",
"joinOU": "[parameters('domainOrgUnit')]"
},
"configurationData": {
"url": "[concat(parameters('_artifactsLocation'), '/', variables('artifactsProjectFolder'), '/', variables('dscArchiveFolder'), '/', variables('dscSitecoreInstallConfigurationName'))]"
}
},
"protectedSettings": {
"configurationUrlSasToken": "[parameters('_artifactsLocationSasToken')]",
"configurationDataUrlSasToken": "[parameters('_artifactsLocationSasToken')]",
"configurationArguments": {
"domainJoinCredential": {
"userName": "[parameters('domainJoinUsername')]",
"password": "[parameters('domainJoinPassword')]"
}
}
}
}
Azure DSC is supposed to handle the encrypting/decrypting of the protectedSettings for me. This does appear to work, as I can see that the protectedSettings are encrypted within the settings file on the VM; however, the operation ultimately fails with:
VM has reported a failure when processing extension 'dsc-sitecore-dev-install'. Error message: "The DSC Extension received an incorrect input: Compilation errors occurred while processing configuration 'SitecoreInstall'. Please review the errors reported in error stream and modify your configuration code appropriately. System.InvalidOperationException error processing property 'Credential' OF TYPE 'xComputer': Converting and storing encrypted passwords as plain text is not recommended. For more information on securing credentials in MOF file, please refer to MSDN blog: http://go.microsoft.com/fwlink/?LinkId=393729
At C:\Packages\Plugins\Microsoft.Powershell.DSC\2.24.0.0\DSCWork\dsc-sitecore-dev-install.0\dsc-sitecore-dev-install.ps1:103 char:3
+ xComputer
Converting and storing encrypted passwords as plain text is not recommended. For more information on securing credentials in MOF file, please refer to MSDN blog: http://go.microsoft.com/fwlink/?LinkId=393729
Cannot find path 'HKLM:\SOFTWARE\Microsoft\PowerShell\3\DSC' because it does not exist.
Cannot find path 'HKLM:\SOFTWARE\Microsoft\PowerShell\3\DSC' because it does not exist.
Another common error is to specify parameters of type PSCredential without an explicit type. Please be sure to use a typed parameter in DSC Configuration, for example:
configuration Example {
    param([PSCredential] $UserAccount)
    ...
}.
Please correct the input and retry executing the extension.".
The only way that I can make it work is to add PsDscAllowPlainTextPassword = $true to my configurationData, but I thought I was using the protectedSettings section to avoid using plain text passwords...
Am I doing something wrong, or is it simply that my understanding is wrong?
Proper way of doing this:
"settings": {
"configuration": {
"url": "xxx",
"script": "xxx",
"function": "xx"
},
"configurationArguments": {
"param1": xxx,
"param2": xxx
etc...
}
},
"protectedSettings": {
"configurationArguments": {
"NameOfTheCredentialsParameter": {
"userName": "USERNAME",
"password": "PASSWORD!1"
}
}
}
This way you don't need PsDscAllowPlainTextPassword = $true.
Then you can receive the parameters in your Configuration with:
Configuration MyConf {
    param (
        [PSCredential] $NameOfTheCredentialsParameter
    )
    ...
}
And use it in your resource:
Registry DoNotOpenServerManagerAtLogon {
    Ensure               = "Present"
    Key                  = "HKEY_CURRENT_USER\SOFTWARE\Microsoft\ServerManager"
    ValueName            = "DoNotOpenServerManagerAtLogon"
    ValueData            = "1"
    ValueType            = "Dword"
    PsDscRunAsCredential = $NameOfTheCredentialsParameter
}
The fact that you still need to use PsDscAllowPlainTextPassword = $true is documented.
Here is the quoted section:
However, currently you must tell PowerShell DSC it is okay for credentials to be outputted in plain text during node configuration MOF generation, because PowerShell DSC doesn’t know that Azure Automation will be encrypting the entire MOF file after its generation via a compilation job.
Based on the above, it seems that it is an order of operations issue. The MOF is generated and THEN encrypted.
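To make that concrete, a minimal sketch of the ConfigurationData workaround the question describes (the node name is a placeholder):

@{
    AllNodes = @(
        @{
            NodeName                    = 'localhost'
            # Needed because the MOF is compiled in plain text first;
            # the DSC extension only encrypts it afterwards.
            PsDscAllowPlainTextPassword = $true
        }
    )
}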
We use POST /v1/url/bulk/:branch_key for batch deep link generation for some of our items.
The response returns an array of URLs alone. The links work fine, but they are not returned in the order of the items we sent in the request.
Is there any way to identify which Branch link belongs to which item?
If the response at least included the item's ID or some other custom data, we could match each link correctly.
Any hope? Thanks.
At the most basic level, this information is available to you via the Links tab on the Branch dashboard's Liveview & Export page. You can see the last 100 links created on this tab. To see more, you can use the "Export Links" button that appears in the upper right hand corner of the page.
If you need more information than can be retrieved via "Export Links," you can have the app whitelisted for the Data Export API (see: https://dev.branch.io/methods-endpoints/data-export-api/guide/). This provides access to a daily collection of .csv files that includes links created and their metadata. To whitelist the app for the Data Export API, send a request to integrations@branch.io. Be sure to include the app's key and to send the request from an email address on the Team tab (https://dashboard.branch.io/settings/team).
You can also query links. For a single link, append "?debug=true" to it and open that URL in your browser.
You can also script the lookup of link data using the HTTP API: https://github.com/BranchMetrics/branch-deep-linking-public-api#viewing-state-of-existing-deep-linking-urls
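For example, the link-read endpoint described in that README looks roughly like this (a sketch; the key and link are placeholders):

curl "https://api2.branch.io/v1/url?url=https%3A%2F%2Fbnc.lt%2Fabcde&branch_key=key_live_xxxxxxxxxxx"

It returns the link's data dictionary as JSON, so any custom keys you set at creation time come back with it.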
The Branch API also allows you to specify a custom alias (the URL slug), so if you simply want an easy way to tie specific bulk-created URLs to the data inside without querying a second time, you could use that as a workaround.
The bulk link creation API returns the links in the same order as the request.
You can test this by creating three links and using a particular parameter to differentiate them.
E.g.:
curl -XPOST https://api2.branch.io/v1/url/bulk/key_live_xxxxxxxxxxx -H "Content-Type: application/json" \
-d '[
  {
    "channel": "facebook",
    "feature": "onboarding",
    "campaign": "new product",
    "stage": "new user",
    "tags": ["one", "two", "three"],
    "data": {
      "$canonical_identifier": "content/123",
      "$og_title": "Title1",
      "$og_description": "Description from Deep Link",
      "$og_image_url": "http://www.lorempixel.com/400/400/",
      "$desktop_url": "http://www.example.com",
      "custom_boolean": true,
      "custom_integer": 1243,
      "custom_string": "everything",
      "custom_array": [1,2,3,4,5,6],
      "custom_object": { "random": "dictionary" }
    }
  },
  {
    "channel": "facebook",
    "feature": "onboarding",
    "campaign": "new product",
    "stage": "new user",
    "tags": ["one", "two", "three"],
    "data": {
      "$canonical_identifier": "content/123",
      "$og_title": "Title2",
      "$og_description": "Description from Deep Link",
      "$og_image_url": "http://www.lorempixel.com/400/400/",
      "$desktop_url": "http://www.example.com"
    }
  },
  {
    "channel": "facebook",
    "feature": "onboarding",
    "campaign": "new product",
    "stage": "new user",
    "tags": ["one", "two", "three"],
    "data": {
      "$canonical_identifier": "content/123",
      "$og_title": "Title3",
      "$og_description": "Description from Deep Link",
      "$og_image_url": "http://www.lorempixel.com/400/400/",
      "$desktop_url": "http://www.example.com"
    }
  }
]'
As you can see, we have used $og_title as a unique parameter, and the links created for your app will be returned in the same order.
Yes, you can identify which item a link belongs to by using the data of the Branch link; you can pass Branch config parameters as well as your own custom parameters.
Every Branch link includes a dictionary of key: value pairs that you specify at the time the link is created. Branch's SDKs make this data available within your app whenever the app is opened via a Branch link click.
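So a simple way to make each bulk entry self-identifying is to embed your own identifier in the data dictionary when creating the links (item_id is a hypothetical custom key):

[
  { "channel": "facebook", "data": { "item_id": "sku-123" } },
  { "channel": "facebook", "data": { "item_id": "sku-456" } }
]

Whatever keys you place in data come back on clicks and on link reads, so the mapping survives even if you lose the request order.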
Is there a way/tool to import a JSON file containing a list of objects and have Firebase push IDs created for each one on the way in? What I'd like is for every thing in
{"Top Things" :
[{
"Thingnum": 1,
"place": "place 1"
},
{
"Thingnum": 2,
"place": "place 2"
}]
}
to have its own push ID created.
I've tried firebase-import but it doesn't create push IDs.
Or will I have to write a script?
Cheers
I ended up writing Node scripts. Some kind of DB admin tool would be nice.
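For reference, a minimal sketch of such a script using the firebase-admin SDK (the file names, database URL, and paths are placeholders):

// import-things.js: push each array entry so it gets its own push ID
const admin = require('firebase-admin');

admin.initializeApp({
  credential: admin.credential.cert(require('./serviceAccount.json')),
  databaseURL: 'https://<your-project>.firebaseio.com'
});

const things = require('./things.json')['Top Things'];
const ref = admin.database().ref('Top Things');

// push() generates a unique push ID for every item
Promise.all(things.map((thing) => ref.push(thing)))
  .then(() => {
    console.log('Imported ' + things.length + ' items');
    process.exit(0);
  });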
Is there any way to configure DocPad to generate pages starting from a JSON array saved in an external file (or an inline string) instead of from a collection of files?
To clarify: to show post details I fetch from a JSON file, see below.
instead of this:
<% for post in @getCollection("html").findAll({ relativeOutDirPath: 'posts' }).toJSON(): %>
I use this:
<% for post in JSON.parse @include("posts.json"): %>
OK. Now I would like to generate the post pages directly from this JSON rather than creating a page for each post as in the example.
For example, I would like to create a page with URL /posts/{urlname}.html whenever {urlname} exists in JSON like this:
[
  { "id": "1", "urlname": "prod1", "metadata": { "title": "val1" } },
  { "id": "2", "metadata": null },
  { "id": "3", "urlname": "prod3", "metadata": { "title": "val1b", "prop2": "val2b" } }
]
I would like to generate the /posts/prod1.html and /posts/prod3.html pages, each with the metadata from its metadata property.
Thanks for the replies.. ;)
PS: Great work!
Currently there is no official way to inject data into DocPad's in-memory database besides having it parsed from the file system in the src directory (the way we are all used to). HOWEVER, this feature (called importers) is the next big todo for DocPad; you can find the task issue in the DocPad issue tracker.
In the meantime, you could include the JSON inside your template data, which is suitable for content listings, but not suitable for providing individual documents for each entry.
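A minimal sketch of that interim approach, assuming posts.json sits in the project root (the path is hypothetical), is to expose the parsed array via docpad.coffee:

# docpad.coffee
fs = require('fs')

docpadConfig =
  templateData:
    # Parsed once at startup; available as @posts inside templates
    posts: JSON.parse(fs.readFileSync(__dirname + '/posts.json', 'utf8'))

module.exports = docpadConfig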