Serilog MSSqlServer table schema - configuration

Is there any way to change the default schema to something other than dbo?
I am developing an ASP.NET Core 2.0 Web API and using MSSqlServer as the sink. I tried adding "Schema": "Log" in the appsettings.json file, but it does not work.
Thanks

Use the schemaName property in the configuration:
"Args": {
"connectionString": ***,
"tableName": "Logs",
"schemaName": "aclibi"
}
https://github.com/serilog/serilog-sinks-mssqlserver/issues/63
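For context, that Args block sits inside a WriteTo entry. A minimal appsettings.json sketch (connection string elided; using the "Log" schema from the question) might look like this:

"Serilog": {
    "WriteTo": [
        {
            "Name": "MSSqlServer",
            "Args": {
                "connectionString": "...",
                "tableName": "Logs",
                "schemaName": "Log",
                "autoCreateSqlTable": true
            }
        }
    ]
}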

The MSSqlServer sink can take an optional parameter "schemaName":
Log.Logger = new LoggerConfiguration()
    .WriteTo.MSSqlServer(connectionString, "MyLogs", schemaName: "myschema")
    .CreateLogger();
See the MSSqlServer sink configuration options documentation for more information.

Related

ASP.NET Core 3 - Serilog how to configure Serilog.Sinks.Map in appsettings.json file?

I came across the Serilog.Sinks.Map addon today, which will solve my challenge of routing specific log events to a specific sink. In my environment, I am writing to a log file as well as using the SQL interface. I only want certain logs to be written to the SQL Server though.
Reading the instructions on GitHub by the author, I can only see an example for implementing the LoggerConfiguration through C# in Program.cs, but I am using the appsettings.json file and am unsure how to translate the provided example into the required JSON format.
Example given by Serilog on GitHub:
Log.Logger = new LoggerConfiguration()
    .WriteTo.Map("Name", "Other", (name, wt) => wt.File($"./logs/log-{name}.txt"))
    .CreateLogger();
My current configuration (note: I haven't implemented Sinks.Map in my code yet).
Program.cs file:
public static void Main(string[] args)
{
    // Build a configuration system with the path of the appsettings.json file.
    // This is because we don't yet have dependency injection available; that comes later.
    var configuration = new ConfigurationBuilder()
        .AddJsonFile("appsettings.json")
        .Build();

    Log.Logger = new LoggerConfiguration()
        .ReadFrom.Configuration(configuration)
        .CreateLogger();

    var host = CreateHostBuilder(args).Build();
}
And here is my appsettings.json file. I want to be able to configure the 'MSSqlServer' sink as the special route, and use the standard file sink for all other general logging.
"AllowedHosts": "*",
"Serilog": {
"Using": [],
"MinumumLevel": {
"Default": "Information",
"Override": {
"Microsoft": "Warning",
"System": "Warning"
}
},
"Enrich": [ "FromLogContext", "WithMachineName", "WithProcessId", "WithThreadId" ],
"WriteTo": [
{ "Name": "Console" },
{
"Name": "File",
"Args": {
//"path": "C:\\NetCoreLogs\\log.txt", // Example path to Windows Drive.
"path": ".\\Logs\\logs.txt",
//"rollingInterval": "Day", // Not currently in use.
"rollOnFileSizeLimit": true,
//"retainedFileCountLimit": null, // Not currently in use.
"fileSizeLimitBytes": 10000000,
"outputTemplate": "{Timestamp:dd-MM-yyyy HH:mm:ss.fff G} {Message}{NewLine:1}{Exception:1}"
// *Template Notes*
// Timestamp 'G' means UTC Time
}
},
{
"Name": "MSSqlServer",
"Args": {
"connectionString": "DefaultConnection",
"schemaName": "EventLogging",
"tableName": "Logs",
"autoCreateSqlTable": true,
"restrictedToMinimumLevel": "Information",
"batchPostingLimit": 1000,
"period": "0.00:00:30"
}
}
//{
// "Name": "File",
// "Args": {
// "path": "C:\\NetCoreLogs\\log.json",
// "formatter": "Serilog.Formatting.Json.JsonFormatter, Serilog"
// }
//}
]
}
Lastly, if I could squeeze in another quick question on the topic: when using the SQL sink, how do I manage the automatic purging/deletion of the oldest events? I.e. the DB should only store a maximum of 1,000,000 events, then automatically overwrite the oldest event first. Thanks in advance.
I believe it is currently impossible to configure the standard Map call in JSON, since it relies on a few types that have no serialization support right now, like Action<T1, T2>. I created an issue to discuss this in the repository itself:
Unable to configure default Map call in json? #22
However, there is a way to still get some functionality out of it in JSON, by creating a custom extension method. In your particular case, it would be something like this:
public static class SerilogSinkConfigurationExtensions
{
    public static LoggerConfiguration MapToFile(
        this LoggerSinkConfiguration loggerSinkConfiguration,
        string keyPropertyName,
        string pathFormat,
        string defaultKey)
    {
        return loggerSinkConfiguration.Map(
            keyPropertyName,
            defaultKey,
            (key, config) => config.File(string.Format(pathFormat, key)));
    }
}
Then, in your JSON file, add a section like this:
"WriteTo": [
...
{
"Name": "MapToFile",
"Args": {
"KeyPropertyName": "Name",
"DefaultKey": "Other",
"PathFormat": "./logs/log-{0}.txt"
}
}
]
To have these customizations work properly, Serilog needs to know that your assembly contains these kinds of extensions, so it can load them during the parsing stage. As per the documentation, you either need to have these extensions in a *.Serilog.* assembly, or add the Using clause in the JSON:
// Assuming the extension method is inside the "Company.Domain.MyProject" dll
"Using": [ "Company.Domain.MyProject" ]
More information on these constraints here:
https://github.com/serilog/serilog-settings-configuration#using-section-and-auto-discovery-of-configuration-assemblies

Serilog configuration via appsettings.json for Application Insights

I am trying to use https://github.com/serilog/serilog-settings-configuration to read app settings and set up Serilog for Application Insights: https://github.com/serilog/serilog-sinks-applicationinsights. The issue I am having is that I cannot set the last parameter of the ApplicationInsightsEvents call, which is a function that takes a LogEvent and returns ITelemetry. How can this be set via appsettings.json?
Basically, I want to replace the following line:
log.WriteTo.ApplicationInsightsEvents(instrumentationKey, level, CultureInfo.CurrentCulture, TelemetryConverter.ConvertLogEventsToEnerGovTelemetry);
with a line inside appsettings.json
Thanks.
Add a sink configuration to appsettings.json:
{
    "Name": "ApplicationInsights",
    "Args": {
        "instrumentationKey": "<instrumentationKey>",
        "telemetryConverter": "Serilog.Sinks.ApplicationInsights.Sinks.ApplicationInsights.TelemetryConverters.TraceTelemetryConverter, Serilog.Sinks.ApplicationInsights",
        "outputTemplate": "[{Component}|{MachineName}|{ThreadId}] {Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} [{Level:u3}] <{SourceContext}> {Message:lj}{NewLine}{Exception}"
    }
}
And add "Serilog.Sinks.ApplicationInsights" to the "Serilog:Using" array.
Answered on GitHub: https://github.com/serilog/serilog-settings-configuration/issues/165. You just need to write an assembly and embed the final code there.
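For instance, a custom converter could derive from the sink's built-in TraceTelemetryConverter and then be referenced from appsettings.json by its assembly-qualified name. A rough sketch, assuming the v3-style converter API; the namespace and class name (MyCompany.Logging.CustomTelemetryConverter) are hypothetical:

using System;
using System.Collections.Generic;
using Microsoft.ApplicationInsights.Channel;
using Serilog.Events;
using Serilog.Sinks.ApplicationInsights.Sinks.ApplicationInsights.TelemetryConverters;

namespace MyCompany.Logging
{
    public class CustomTelemetryConverter : TraceTelemetryConverter
    {
        public override IEnumerable<ITelemetry> Convert(LogEvent logEvent, IFormatProvider formatProvider)
        {
            // Start from the standard trace telemetry and adjust it as needed.
            foreach (var telemetry in base.Convert(logEvent, formatProvider))
            {
                yield return telemetry;
            }
        }
    }
}

Then point the sink at it in appsettings.json, e.g. "telemetryConverter": "MyCompany.Logging.CustomTelemetryConverter, MyCompany.Logging".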

export whole Neo4j database / cypher result to GraphJSON

I've already had a look at different posts like this and this, but nothing seems to be answered 100%.
My current problem is that I want to visualize, and ideally analyze, my Neo4j graph with a library (or software/tool).
The database server is running on a remote (virtual) server, and it seems there is no way to export the database to a format I can work with.
I've tried exporting the graph to a .graphml file to import it into Gephi, but Gephi doesn't find the properties. Gephi streaming with APOC procedures and the graph-streaming plugin also does not work, because it's a remote server (also with the tool mentioned here).
Now I'm currently testing around with Alchemy.js... So far, so good. But it seems there's no way to export the graph/query to the GraphJSON format?
Is there really no "easy" way to export the data?
Thanks for your help in advance!
This is how I would proceed:
Run this query from the post you mentioned in the Neo4j Browser or in any bolt driver:
MATCH (a)-[r]->(b)
WITH collect(
    {
        source: id(a),
        target: id(b),
        caption: type(r)
    }
) AS edges
RETURN edges
Now that you have loaded the data, you can simply download it as JSON using the download button (if you are using a bolt driver, ignore this step).
Whether you manually downloaded the JSON from the Neo4j Browser or used a bolt driver, you will end up with something like this:
{
    "columns": [
        "edges"
    ],
    "data": [
        {
            "row": [
                [
                    {
                        "source": 31288,
                        "target": 152,
                        "caption": "HAS_PAYMENT_METHOD"
                    }
                ]
            ],
            "meta": [
                null
            ],
            "graph": {
                "nodes": [],
                "relationships": []
            }
        }
    ]
}
Now all you have to do is filter out the data.row results and you are done. Using a bolt driver is probably the better choice, as you have to clean up the data anyway and it doesn't run into issues with returning lots of data to the browser (which might crash it).
Update: added a Python version.
from neo4j.v1 import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "neo4j"))
session = driver.session()
result = session.run("MATCH (a)-[r]->(b) WITH collect({source: id(a), target: id(b), caption: type(r)}) AS edges RETURN edges")

for record in result:
    print(record["edges"])
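If you'd rather persist the result as a GraphJSON-style file than print it, here is a minimal sketch under the same assumptions. It covers edges only, and "graph.json" is a hypothetical output path; full GraphJSON would also need a populated "nodes" list, built from a second query:

import json

from neo4j.v1 import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "neo4j"))
session = driver.session()

# Same edge query as above; it returns a single row holding the collected edges.
result = session.run("MATCH (a)-[r]->(b) WITH collect({source: id(a), target: id(b), caption: type(r)}) AS edges RETURN edges")
record = result.single()

# "nodes" is left empty here; you could fill it with e.g. "MATCH (n) RETURN id(n), labels(n)".
with open("graph.json", "w") as f:
    json.dump({"nodes": [], "edges": record["edges"]}, f, indent=2)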
Hope this helps

Securely pass credentials to DSC Extension from ARM Template

According to https://learn.microsoft.com/en-gb/azure/virtual-machines/windows/extensions-dsc-template, the latest method for passing credentials from an ARM template to a DSC extension is by placing the whole credential within the configurationArguments of the protectedSettings section, as shown below:
"properties": {
"publisher": "Microsoft.Powershell",
"type": "DSC",
"typeHandlerVersion": "2.24",
"autoUpgradeMinorVersion": true,
"settings": {
"wmfVersion": "latest",
"configuration": {
"url": "[concat(parameters('_artifactsLocation'), '/', variables('artifactsProjectFolder'), '/', variables('dscArchiveFolder'), '/', variables('dscSitecoreInstallArchiveFileName'))]",
"script": "[variables('dscSitecoreInstallScriptName')]",
"function": "SitecoreInstall"
},
"configurationArguments": {
"nodeName": "[parameters('CMCD VMName')]",
"sitecorePackageUrl": "[concat(parameters('sitecorePackageLocation'), '/', parameters('sitecoreRelease'), '/', parameters('sitecorePackageFilename'))]",
"sitecorePackageUrlSasToken": "[parameters('sitecorePackageLocationSasToken')]",
"sitecoreLicense": "[concat(parameters('sitecorePackageLocation'), '/', parameters('sitecoreLicenseFilename'))]",
"domainName": "[parameters('domainName')]",
"joinOU": "[parameters('domainOrgUnit')]"
},
"configurationData": {
"url": "[concat(parameters('_artifactsLocation'), '/', variables('artifactsProjectFolder'), '/', variables('dscArchiveFolder'), '/', variables('dscSitecoreInstallConfigurationName'))]"
}
},
"protectedSettings": {
"configurationUrlSasToken": "[parameters('_artifactsLocationSasToken')]",
"configurationDataUrlSasToken": "[parameters('_artifactsLocationSasToken')]",
"configurationArguments": {
"domainJoinCredential": {
"userName": "[parameters('domainJoinUsername')]",
"password": "[parameters('domainJoinPassword')]"
}
}
}
}
Azure DSC is supposed to handle the encrypting/decrypting of the protectedSettings for me. This does appear to work, as I can see that the protectedSettings are encrypted within the settings file on the VM, however the operation ultimately fails with:
VM has reported a failure when processing extension 'dsc-sitecore-dev-install'. Error message: "The DSC Extension received an incorrect input: Compilation errors occurred while processing configuration 'SitecoreInstall'. Please review the errors reported in error stream and modify your configuration code appropriately. System.InvalidOperationException error processing property 'Credential' OF TYPE 'xComputer': Converting and storing encrypted passwords as plain text is not recommended. For more information on securing credentials in MOF file, please refer to MSDN blog: http://go.microsoft.com/fwlink/?LinkId=393729
At C:\Packages\Plugins\Microsoft.Powershell.DSC\2.24.0.0\DSCWork\dsc-sitecore-dev-install.0\dsc-sitecore-dev-install.ps1:103 char:3
+ xComputer Converting and storing encrypted passwords as plain text is not recommended. For more information on securing credentials in MOF file, please refer to MSDN blog: http://go.microsoft.com/fwlink/?LinkId=393729 Cannot find path 'HKLM:\SOFTWARE\Microsoft\PowerShell\3\DSC' because it does not exist. Cannot find path 'HKLM:\SOFTWARE\Microsoft\PowerShell\3\DSC' because it does not exist.
Another common error is to specify parameters of type PSCredential without an explicit type. Please be sure to use a typed parameter in DSC Configuration, for example:

configuration Example {
    param([PSCredential] $UserAccount)
    ...
}.

Please correct the input and retry executing the extension."
The only way that I can make it work is to add PsDscAllowPlainTextPassword = $true to my configurationData, but I thought I was using the protectedSettings section to avoid using plain text passwords...
Am I doing something wrong, or is it simply that my understanding is wrong?
The proper way of doing this:
"settings": {
"configuration": {
"url": "xxx",
"script": "xxx",
"function": "xx"
},
"configurationArguments": {
"param1": xxx,
"param2": xxx
etc...
}
},
"protectedSettings": {
"configurationArguments": {
"NameOfTheCredentialsParameter": {
"userName": "USERNAME",
"password": "PASSWORD!1"
}
}
}
This way you don't need PsDscAllowPlainTextPassword = $true.
Then you can receive the parameters in your configuration with:
Configuration MyConf
{
    param (
        [PSCredential] $NameOfTheCredentialsParameter
    )
    ...
}
And use it in your resource:
Registry DoNotOpenServerManagerAtLogon {
    Ensure               = "Present"
    Key                  = "HKEY_CURRENT_USER\SOFTWARE\Microsoft\ServerManager"
    ValueName            = "DoNotOpenServerManagerAtLogon"
    ValueData            = 1
    ValueType            = "Dword"    # the Registry resource expects "Dword", not REG_DWORD
    PsDscRunAsCredential = $NameOfTheCredentialsParameter
}
The fact that you still need to use PsDscAllowPlainTextPassword = $true is documented.
Here is the quoted section:
However, currently you must tell PowerShell DSC it is okay for credentials to be outputted in plain text during node configuration MOF generation, because PowerShell DSC doesn’t know that Azure Automation will be encrypting the entire MOF file after its generation via a compilation job.
Based on the above, it seems that it is an order of operations issue. The MOF is generated and THEN encrypted.
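For illustration, the setting the compiler is asking for lives in the configuration data (the .psd1 referenced by configurationData.url in the template above). A minimal sketch, with 'localhost' as an assumed node name:

@{
    AllNodes = @(
        @{
            # Assumed node name; match it to the node your configuration targets.
            NodeName                    = 'localhost'
            # Tells DSC it is okay to emit the credential into the MOF in plain text;
            # per the quoted documentation, the MOF is encrypted after generation.
            PsDscAllowPlainTextPassword = $true
        }
    )
}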

How do I configure VS Code to enable code completion on .json files (jsonschema support)?

In the Visual Studio Code demo (minutes 28:57-29:20 and 30:20-31:10), some cool JSON code completion is shown.
Where and how do I add a schema for my JSON files to a project?
How does VS Code know which schema to use for a given .json file?
The association of JSON schemas to files is done in the settings (File, Preferences, User Settings or Workspace Settings), under the property 'json.schemas'.
This is an example of how the JSON schema for bower is associated with bower files:
"json.schemas": [
{
"fileMatch": [
"/bower.json",
"/.bower.json"
],
"url": "http://json.schemastore.org/bower"
},
...
You can also use schemas located in your workspace or define a schema right in the settings itself. Check https://code.visualstudio.com/docs/languages/json for examples.
You can reference your JSON schema in the $schema node and get IntelliSense in VS Code right away. There is no need to configure anything anywhere else.
For example,
{
    "$schema": "http://json.schemastore.org/coffeelint",
    "line_endings": "unix"
}
This is the IntelliSense I was talking about: the JSON schema supplies the list of possible JSON properties at your current cursor position, and VS Code surfaces that list as IntelliSense. Note that every official JSON format should have a concrete JSON schema to validate the data. This answer is still valid!
The three ways I've got VS Code to use a JSON schema are ...
So for something like the Azure Function schema from ... http://json.schemastore.org
"json.schemas": [
{
"fileMatch": [
"/function.json"
],
"url": "http://json.schemastore.org/function"
}
]
In "User Settings", i.e. as an element in the user's settings.json in 'C:\Users\\AppData\Roaming\Code\User'
In "Workspace Settings", i.e. in the "settings" section of the .code-workspace file ... assuming you're using a VS Code workspace
In "Folder Settings", i.e. in the "settings" section of the settings.json in the .vscode directory ... assuming you're using a VS Code workspace
Folder settings take precedence over Workspace settings, and Workspace over User.
And the Workspace and Folder work with relative paths, e.g. in the .code-workspace file ...
"settings": {
"json.schemas": [
{
"fileMatch": [
"/task.json"
],
"url": "./schema/tasks.schema.json"
}
]
}
or in the Folder Settings settings.json in \.vscode\ ...
"json.schemas": [
{
"fileMatch": [
"/task.json"
],
"url": "./schema/tasks.schema.json"
}
]
Just add the following configuration item to the settings file to fix it:
"json.validate.enable": false
Or use the GUI way, by toggling the corresponding setting in the Settings editor.