How to print a custom stacktrace with Java Logback?

Current logback.xml:
<appender name="FILEOUT3" class="ch.qos.logback.core.FileAppender">
  <file>D:/${byDay}.log</file>
  <append>true</append>
  <encoder>
    <Pattern>%d{HH:mm:ss} %-5level %msg%replace(%xException){"\n", ">> "}%nopex%n</Pattern>
  </encoder>
</appender>
Current result:
play.api.Configuration$$anon$1: Configuration error[Cannot connect to database [default]]
>> at play.api.Configuration$.play$api$Configuration$$configError(Configuration.scala:92) ~[play_2.10-2.2.0.jar:2.2.0]
>> at play.api.Configuration.reportError(Configuration.scala:570) ~[play_2.10-2.2.0.jar:2.2.0]
>> at play.api.db.BoneCPPlugin$$anonfun$onStart$1.apply(DB.scala:252) ~[play-jdbc_2.10-2.2.0.jar:2.2.0]
The result I want:
[12:43:16.454] play.api.Configuration$$anon$1: Configuration error[Cannot connect to database [default]]
[12:43:16.454] at play.api.Configuration$.play$api$Configuration$$configError(Configuration.scala:92) ~[play_2.10-2.2.0.jar:2.2.0]
[12:43:16.454] at play.api.Configuration.reportError(Configuration.scala:570) ~[play_2.10-2.2.0.jar:2.2.0]
[12:43:16.454] at play.api.db.BoneCPPlugin$$anonfun$onStart$1.apply(DB.scala:252) ~[play-jdbc_2.10-2.2.0.jar:2.2.0]
... (40 more lines) ...
[12:43:16.454] at play.api.db.BoneCPPlugin$$anonfun$onStart$1.apply(DB.scala:252) ~[play-jdbc_2.10-2.2.0.jar:2.2.0]
Important: every line of the stack trace should be printed with the same timestamp.
How do I change logback.xml to achieve this?

You cannot do this by changing only logback.xml, because ReplacingCompositeConverter, which backs %replace, accepts only static strings as replacements (no dates or any other PatternLayout conversion words).
To reach your goal you should create a custom converter (if you use mine, note that it has to live in the ch.qos.logback.core.pattern package so it can access the parent class's package-private fields):
package ch.qos.logback.core.pattern;

import ch.qos.logback.classic.PatternLayout;
import ch.qos.logback.core.pattern.parser.Node;
import ch.qos.logback.core.pattern.parser.Parser;
import ch.qos.logback.core.spi.ScanException;

public class ReplacingAndParsingCompositeConverter<E> extends ReplacingCompositeConverter<E> {

    @Override
    protected String transform(E event, String in) {
        if (!started) {
            return in;
        }
        String parsedReplacement;
        try {
            // Treat the replacement string as a pattern of its own and evaluate it
            // against the current event, so conversion words like %d are expanded.
            Parser<E> p = new Parser<E>(replacement);
            p.setContext(getContext());
            Node t = p.parse();
            Converter<E> c = p.compile(t, PatternLayout.defaultConverterMap);
            ConverterUtil.setContextForConverters(getContext(), c);
            ConverterUtil.startConverters(c);
            StringBuilder buf = new StringBuilder();
            while (c != null) {
                c.write(buf, event);
                c = c.getNext();
            }
            parsedReplacement = buf.toString();
        } catch (ScanException e) {
            e.printStackTrace();
            parsedReplacement = replacement;
        }
        return pattern.matcher(in).replaceAll(parsedReplacement);
    }
}
Then declare this converter in logback.xml with <conversionRule/> and use it instead of the old %replace:
<configuration ...>
  <conversionRule conversionWord="replaceAndParse" converterClass="ch.qos.logback.core.pattern.ReplacingAndParsingCompositeConverter" />
  <appender name="FILEOUT3" class="ch.qos.logback.core.FileAppender">
    <file>D:/${byDay}.log</file>
    <append>true</append>
    <encoder>
      <Pattern>[%d{HH:mm:ss.SSS}] %-5level %msg%replaceAndParse(%xException){"(\r?\n)", "$1[%d{HH:mm:ss.SSS}]"}%nopex%n</Pattern>
    </encoder>
  </appender>
  ....
</configuration>
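To sanity-check the configuration, a minimal sketch like the following should produce a stack trace in which every line carries the same timestamp prefix (the class name and exception message here are made up for illustration, not from the original question):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Minimal demo for exercising the pattern above.
public class StackTraceDemo {

    private static final Logger log = LoggerFactory.getLogger(StackTraceDemo.class);

    public static void main(String[] args) {
        try {
            throw new IllegalStateException("Cannot connect to database [default]");
        } catch (IllegalStateException e) {
            // %xException renders the throwable passed here; with the converter
            // above, each line of it should start with [HH:mm:ss.SSS].
            log.error("Startup failed", e);
        }
    }
}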

Related

Convert xml to json in Jenkinsfile

I have a problem with a method in my Jenkinsfile when I try to convert XML to JSON. Below are the method and the pipeline.
I tried to pass the method's result directly to echo, but it gives me an error and the pipeline fails.
Sorry, but I don't know what details I can give about the error, because I am just starting to learn and this is the first time I have seen this code.
ERROR: org.xml.sax.SAXParseException; lineNumber: 2; columnNumber: 1; Content is not allowed in prolog.
I edited my question and added a bat step in the 'OWASP dependencies testing' stage. This bat step creates the XML automatically; I ran the file through an XML validator and it reported no errors. So I don't know whether the problem is in the Jenkinsfile code or in the XML, because the error is the same. I am including only part of the XML because it is very long, but the error still points at the second line.
XML Code:
<?xml version="1.0" encoding="UTF-8"?>
<analysis xmlns="https://jeremylong.github.io/DependencyCheck/dependency-check.2.2.xsd">
  <scanInfo>
    <engineVersion>5.2.2</engineVersion>
    <dataSource>
      <name>NVD CVE Checked</name>
      <timestamp>2019-11-25T09:01:51</timestamp>
    </dataSource>
    <datasource>...</datasource>
  </scanInfo>
  ....................
</analysis>
import groovy.json.*;

def getDependencyResumeFromXML(pathReport){
    def xml = bat(script:'type ' + pathReport, returnStdout:true);
    def x = new XmlParser().parseText(xml);
    def nDep = x.dependencies.dependency.size();
    def dependencies = [:];
    for(def i=0;i<nDep;i++){
        dependencies[i] = [fileName: x.dependencies.dependency[i].fileName.text(), description: x.dependencies.dependency[i].description.text(), vulnerabilities:[:]];
        def nVul = x.dependencies.dependency[i].vulnerabilities.vulnerability.size();
        for(def j=0;j<nVul;j++){
            dependencies[i].vulnerabilities[j] = [
                name: x.dependencies.dependency[i].vulnerabilities.vulnerability[j].name.text(),
                cvssScore: x.dependencies.dependency[i].vulnerabilities.vulnerability[j].cvssScore.text(),
                severity: x.dependencies.dependency[i].vulnerabilities.vulnerability[j].severity.text(),
                cwe: x.dependencies.dependency[i].vulnerabilities.vulnerability[j].cwe.text(),
                description: x.dependencies.dependency[i].vulnerabilities.vulnerability[j].description.text(),
            ];
        }
    }
    return dependencies;
}
pipeline{
    .......
    stages{
        stage('OWASP dependencies testing'){
            steps{
                script{
                    bat 'mvn org.owasp:dependency-check-maven:check';
                    def pathReport = 'C:\\tmp\\workspace\\umbrella-pipeline-prueba\\target\\dependency-check\\dependency-check-report.xml';
                    def xml = bat(script:'type ' + pathReport, returnStdout:true);
                    echo '------------------ 1';
                    echo xml;
                    echo '------------------ 2';
                    echo '--------------------------------'
                    def dependencias = getDependencyResumeFromXML(pathReport);
                    echo '------------- 3';
                    echo dependencias;
                    echo '------------- 4';
                }
            }
        }
    }
}

How to write slf4j-over-logback logs as JSON

I have the below logging statements in my code.
import java.util.List;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MySampleClass {

    private static final Logger logger = LoggerFactory.getLogger(MySampleClass.class);

    public void mySampleMethod(List<String> userId) {
        logger.debug("userRuntimeId =" + userId);
        // ...
        // business logic
        // ...
    }
}
My log configs are available in:
logback-common.xml
logback-local.xml
This prints my logs as given below:
2019-02-25 16:27:45,460 | DEBUG | [fileTaskExecutor-2] | [a.abc.mySampleApp.handlers.userRecordHandler] | [MY_SAMPLE_APP] | [Vijay-20190225-162738.trigger] | [] | userRuntimeId = 3051aa39-2e0a-11e9-bee3-e7388cjg5le0
I want to print the logs as JSON. How do I do it?
Sample JSON format I expect:
{
  timestamp="2019-02-25 16:27:45,460",
  level="DEBUG",
  triggerName="fileTaskExecutor-2",
  className="a.abc.mySampleApp.handlers.userRecordHandler",
  appName="MY_SAMPLE_APP",
  userRuntimeId="3051aa39-2e0a-11e9-bee3-e7388cjg5le0"
}
You can use logback-contrib's JsonLayout inside any Logback appender. For example:
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<layout class="ch.qos.logback.contrib.json.classic.JsonLayout">
<jsonFormatter class="ch.qos.logback.contrib.jackson.JacksonJsonFormatter">
<prettyPrint>false</prettyPrint>
</jsonFormatter>
<timestampFormat>yyyy-MM-dd' 'HH:mm:ss.SSS</timestampFormat>
<appendLineSeparator>true</appendLineSeparator>
<includeContextName>false</includeContextName>
</layout>
</appender>
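For this to work, the logback-contrib JSON modules (plus Jackson, which logback-jackson pulls in) have to be on the classpath. With Maven that is roughly the following; the version shown is an assumption, so check for the latest logback-contrib release:

<dependency>
  <groupId>ch.qos.logback.contrib</groupId>
  <artifactId>logback-json-classic</artifactId>
  <version>0.1.5</version>
</dependency>
<dependency>
  <groupId>ch.qos.logback.contrib</groupId>
  <artifactId>logback-jackson</artifactId>
  <version>0.1.5</version>
</dependency>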
With that configuration the following log invocation ...
logger.info("hello!");
... will emit:
{
  "timestamp" : "2019-03-01 08:08:32.413",
  "level" : "INFO",
  "thread" : "main",
  "logger" : "org.glytching.sandbox.logback.LogbackTest",
  "message" : "hello!"
}
That's quite close to your desired output, and JsonLayout is extensible, so you could:
- Override toJsonMap() to change the names of the keys
- Implement addCustomDataToJsonMap() to add other key:value pairs to the log event
A sketch of such a subclass is shown below; more details are in the logback-contrib documentation on Logback JSON extensions.
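Here is a minimal sketch of that, assuming the default key names emitted by JsonLayout ("logger" and so on); the package name and the hardcoded appName value are made-up illustrations:

package com.example.logging; // hypothetical package

import java.util.Map;

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.contrib.json.classic.JsonLayout;

// Sketch of a JsonLayout subclass that renames a key and adds a custom field.
public class CustomJsonLayout extends JsonLayout {

    @Override
    protected Map toJsonMap(ILoggingEvent event) {
        Map map = super.toJsonMap(event);
        // Rename the default "logger" key to "className" (assumes default key names).
        Object logger = map.remove("logger");
        if (logger != null) {
            map.put("className", logger);
        }
        return map;
    }

    @Override
    protected void addCustomDataToJsonMap(Map<String, Object> map, ILoggingEvent event) {
        // Add extra key:value pairs here; "appName" is a made-up example.
        map.put("appName", "MY_SAMPLE_APP");
    }
}

You would then reference it from the appender with <layout class="com.example.logging.CustomJsonLayout"> in place of the stock JsonLayout.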

BIML Dataflow fails to read Configuration file

Summary
Configuration file lookup works for Execute SQL Task, but fails for Dataflow tasks.
Problem
I have 2 databases:
Source (On-premise SQL Server database)
Destination (Azure SQL Database)
I have 2 packages that I want to create from BIML code.
1) Create Staging (Works fine)
Creates tables in destination database using for each loop and metadata from source database
2) Load Staging (Does not work)
Loads created tables in destination database using for each loop and Dataflow tasks (Source to Destination)
Both of these packages need to use a Package Configuration file that I have created, which stores the Username and Password of the Destination database (Azure database, using SQL Server Authentication).
Using this configuration file works fine for Package 1), but when I try to create the SSIS package using BIML code for Package 2) I get the following error:
Could not execute Query on Connection Dest: SELECT * FROM stg.SalesTaxRate. Login failed for user ''.
I have tried taking the Biml code for Package 1) and adding in a Dataflow task, and that raises the same error. It seems that when using an Execute SQL Task it can find and use the configuration file with no problem, but when using a Dataflow task it won't find it.
Script for Package 1):
<#@ import namespace="System.Data" #>
<#@ import namespace="System.Data.SqlClient" #>
<#@ template language="C#" tier="2" #>
<#
string _source_con_string = @"Data Source=YRK-L-101098;Persist Security Info=true;Integrated Security=SSPI;Initial Catalog=AdventureWorks2016";
string _dest_con_string = @"Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False";
string _table_name_sql = "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE TABLE'";
DataTable _table_names = new DataTable();
SqlDataAdapter _table_name_da = new SqlDataAdapter(_table_name_sql, _source_con_string);
_table_name_da.Fill(_table_names);
#>
<#+
public string RowConversion(DataRow Row)
{
    string _ret = "[" + Row["COLUMN_NAME"] + "] " + Row["DATA_TYPE"];
    switch (Row["DATA_TYPE"].ToString().ToUpper())
    {
        case "NVARCHAR":
        case "VARCHAR":
        case "NCHAR":
        case "CHAR":
        case "BINARY":
        case "VARBINARY":
            if (Row["CHARACTER_MAXIMUM_LENGTH"].ToString() == "-1")
                _ret += "(max)";
            else
                _ret += "(" + Row["CHARACTER_MAXIMUM_LENGTH"] + ")";
            break;
        case "NUMERIC":
            _ret += "(" + Row["NUMERIC_PRECISION"] + "," + Row["NUMERIC_SCALE"] + ")";
            break;
        case "FLOAT":
            _ret += "(" + Row["NUMERIC_PRECISION"] + ")";
            break;
    }
    return _ret;
}
#>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <OleDbConnection Name="Dest" ConnectionString="Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False" />
  </Connections>
  <Packages>
    <Package Name="005_Create_Staging_Configuration" ConstraintMode="Linear">
      <PackageConfigurations>
        <PackageConfiguration Name="Configuration">
          <ExternalFileInput ExternalFilePath="C:\VSRepo\BIML\Configurations\AzureConfigEdit.dtsConfig">
          </ExternalFileInput>
        </PackageConfiguration>
      </PackageConfigurations>
      <Tasks>
        <Container Name="Create Staging Tables" ConstraintMode="Linear">
          <Tasks>
            <# foreach(DataRow _table in _table_names.Rows) { #>
            <ExecuteSQL Name="SQL-S_<#= _table["TABLE_NAME"] #>" ConnectionName="Dest">
              <DirectInput>
                IF OBJECT_ID('stg.<#= _table["TABLE_NAME"] #>','U') IS NOT NULL
                    DROP TABLE stg.<#= _table["TABLE_NAME"] #>;
                CREATE TABLE stg.<#= _table["TABLE_NAME"] #>
                (
                <#
                string _col_name_sql = "select COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, NUMERIC_PRECISION, NUMERIC_SCALE from INFORMATION_SCHEMA.COLUMNS where TABLE_SCHEMA='" + _table["TABLE_SCHEMA"] + "' and TABLE_NAME='"+ _table["TABLE_NAME"] + "' order by ORDINAL_POSITION ";
                DataTable _col_names = new DataTable();
                SqlDataAdapter _col_names_da = new SqlDataAdapter(_col_name_sql, _source_con_string);
                _col_names_da.Fill(_col_names);
                for (int _i=0; _i<_col_names.Rows.Count ; _i++ )
                {
                    DataRow _r = _col_names.Rows[_i];
                    if (_i == 0)
                        WriteLine(RowConversion(_r));
                    else
                        WriteLine(", " + RowConversion(_r));
                }
                #>
                , append_dt datetime
                )
              </DirectInput>
            </ExecuteSQL>
            <# } #>
          </Tasks>
        </Container>
      </Tasks>
    </Package>
  </Packages>
</Biml>
Script for Package 2)
<#@ import namespace="System.Data" #>
<#@ import namespace="System.Data.SqlClient" #>
<#@ template language="C#" tier="2" #>
<#
string _source_con_string = @"Data Source=YRK-L-101098;Persist Security Info=true;Integrated Security=SSPI;Initial Catalog=AdventureWorks2016";
string _dest_con_string = @"Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False";
string _table_name_sql = "select TABLE_SCHEMA , table_name from INFORMATION_SCHEMA.TABLES where TABLE_TYPE='BASE TABLE'";
DataTable _table_names = new DataTable();
SqlDataAdapter _table_name_da = new SqlDataAdapter(_table_name_sql, _source_con_string);
_table_name_da.Fill(_table_names);
#>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <OleDbConnection Name="Source" ConnectionString="Data Source=YRK-L-101098;Provider=SQLNCLI11.1;Persist Security Info=true;Integrated Security=SSPI;Initial Catalog=AdventureWorks2016" />
    <OleDbConnection Name="Dest" ConnectionString="Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False" />
  </Connections>
  <Packages>
    <Package Name="006_Load_Staging_Configuration" ConstraintMode="Linear">
      <PackageConfigurations>
        <PackageConfiguration Name="Configuration">
          <ExternalFileInput ExternalFilePath="C:\VSRepo\BIML\Configurations\AzureConfigDF.dtsConfig"></ExternalFileInput>
        </PackageConfiguration>
      </PackageConfigurations>
      <Tasks>
        <Container Name="Load Staging Tables" ConstraintMode="Linear">
          <Tasks>
            <# foreach(DataRow _table in _table_names.Rows) { #>
            <Dataflow Name="DFT-S_<#= _table["TABLE_NAME"] #>">
              <Transformations>
                <OleDbSource Name="SRC-<#= _table["TABLE_SCHEMA"] #>_<#= _table["TABLE_NAME"] #>" ConnectionName="Source">
                  <DirectInput>
                    SELECT *
                    FROM <#= _table["TABLE_SCHEMA"] #>.<#= _table["TABLE_NAME"] #>
                  </DirectInput>
                </OleDbSource>
                <OleDbDestination Name="DST-<#= _table["TABLE_SCHEMA"] #>_<#= _table["TABLE_NAME"] #>" ConnectionName="Dest">
                  <ExternalTableOutput Table="stg.<#= _table["TABLE_NAME"] #>"/>
                </OleDbDestination>
              </Transformations>
            </Dataflow>
            <# } #>
          </Tasks>
        </Container>
      </Tasks>
    </Package>
  </Packages>
</Biml>
Configuration file:
<?xml version="1.0"?>
<DTSConfiguration>
  <Configuration ConfiguredType="Property" Path="\Package.Connections[Source].Properties[ConnectionString]" ValueType="String">
    <ConfiguredValue>"Data Source=YRK-L-101098;Provider=SQLNCLI11.1;Persist Security Info=true;Integrated Security=SSPI;Initial Catalog=AdventureWorks2016"</ConfiguredValue>
  </Configuration>
  <Configuration ConfiguredType="Property" Path="\Package.Connections[Dest].Properties[ConnectionString]" ValueType="String">
    <ConfiguredValue>Data Source=mpl.database.windows.net;User ID=*****;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False</ConfiguredValue>
  </Configuration>
  <Configuration ConfiguredType="Property" Path="\Package.Connections[Dest].Properties[Password]" ValueType="String">
    <ConfiguredValue>******</ConfiguredValue>
  </Configuration>
  <Configuration ConfiguredType="Property" Path="\Package.Connections[Dest].Properties[UserName]" ValueType="String">
    <ConfiguredValue>******</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
Note: the values for 'UserName' and 'Password' are filled in with the correct values in the actual script.
Summary
The issue is that you are trying to use package configurations (an SSIS run-time feature) when developing the packages by generating them using Biml (a build-time feature).
What is happening?
Think of it this way. If you were to manually create the SSIS package, during development, you would have to connect to the source and destination databases by specifying user names and passwords. Without connecting to the databases, SSIS will not be able to get the metadata needed. Once the package has been developed and all the metadata has been mapped, you can use package configurations. When you open or execute a package with package configurations, all hardcoded values will be replaced by configuration values. This value replacement is the SSIS run-time feature.
Now, compare that to using Biml instead of manually creating the SSIS package: When generating the packages, you are expecting the Biml engine to get the user names and passwords from the package configuration files. Since this is an SSIS run-time feature, Biml is unable to get that data, and will use the connection strings specified in the BimlScript. Since these connection strings don't specify the user names and passwords, Biml will not be able to connect and you get the error Login failed for user ''. (This is similar to creating a connection manager in SSIS without providing a user name and password and getting an error when clicking "Test Connection".)
But it works for the Execute SQL task?
It may look like it, but it doesn't. The Execute SQL task basically just gets ignored. The SQL code in Execute SQL tasks is not checked or validated by SSIS or the Biml engine until the package is executed. You can type anything in there, and SSIS will be happy until you try to execute invalid code, at which point it will give you an error. Because this code is not validated by SSIS during development, it is not validated by Biml during package generation. The package gets generated successfully, and then when you open it, the package configurations will be applied, and you will not see any errors.
An OLE DB Destination, however, is validated by both SSIS and the Biml engine during development. They both need to connect to the database to get the metadata. This is why you get an error on this file only.
Solution
Package Configurations are an SSIS run-time feature only. You can not use them to pass connection strings, user names or passwords to the Biml engine. You can either hardcode the connection strings in your BimlScript or store the connection strings in an external metadata repository, but you will need to provide the user names and passwords to the Biml engine during package generation.
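For example, here is a minimal sketch of the hardcoding option (YOUR_USER and YOUR_PASSWORD are placeholders, not values from the question):

<!-- Sketch only: credentials are embedded so the Biml engine can connect
     at package-generation time instead of relying on the .dtsConfig file. -->
<Connections>
  <OleDbConnection Name="Dest"
      ConnectionString="Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;User ID=YOUR_USER;Password=YOUR_PASSWORD;Persist Security Info=True;Auto Translate=False" />
</Connections>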

mysql Invalid column count in CSV input while importing csv file

I have a table in the database called locations, which contains 2 columns (id, which is auto-incremented, and location), and I have a CSV file containing a single column.
When I try to import that file into the locations table, I get the error invalid column count in CSV input on line 1.
I also tried 'CSV using LOAD DATA', but then I get this: MySQL returned an empty result set (i.e. zero rows).
Maybe you should use:
$array = array_map('str_getcsv', file('file.csv'));
That gives you more options for checking the values before importing them.
If Java is available on your machine, you can use the tool below:
https://dbisweb.wordpress.com/
The simple configuration required is:
<?xml version="1.0" encoding="UTF-8"?>
<config>
  <connections>
    <jdbc name="mysql">
      <driver>com.mysql.jdbc.Driver</driver>
      <url>jdbc:mysql://localhost:3306/test1</url>
      <user>root</user>
      <password></password>
    </jdbc>
  </connections>
  <component-flow>
    <execute name="file2table" enabled="true">
      <migrate>
        <source>
          <file delimiter="," header="false" path="D:/test_source.csv"/>
        </source>
        <destination>
          <table connection="mysql">locations</table>
        </destination>
        <mapping>
          <column source="1" destination="location" />
        </mapping>
      </migrate>
    </execute>
  </component-flow>
</config>
If you are interested, the same can be achieved with Java code:
// Source, File, Table, JDBCConnection, Mapping and MigrateExecutor are
// classes provided by the library linked above, not by the JDK.
Source source = new File("D:/test_source.csv", ',', false);
Destination destination = new Table("locations", new JDBCConnection("com.mysql.jdbc.Driver", "jdbc:mysql://localhost:3306/test1", "root", ""));
List<Mapping> mapping = new ArrayList<>();
mapping.add(new Mapping("1", "location", null));
Component component = new MigrateExecutor(source, destination, mapping);
component.execute();

Supplied connections must be of type AstDbConnectionNode

I have been working on a simple BIML solution to start learning how to use it. I keep getting an error message:
Supplied connections must be of type AstDbConnectionNode for this method.
at Varigence.Biml.Extensions.ExternalDataAccess.GetDatabaseSchema in :line 0
I've been searching and trying different solutions and have not found an answer yet. So, I'm turning to everyone here. I need another set of eyes on this so I can figure out what I'm doing wrong.
My first BIML file has my connection setup to World Wide Importers on my local box.
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <#@ template language="C#" tier="0" #>
  <Connections>
    <OleDbConnection
        Name="src"
        ConnectionString="Data Source=localhost\SQL16;Initial Catalog=WorldWideImporters;Provider=SQLNCLI11.1;Integrated Security=SSPI;"
        CreateInProject="true">
    </OleDbConnection>
  </Connections>
  <Databases>
    <Database Name="src" ConnectionName="src" />
  </Databases>
</Biml>
The second BIML file is what throws the error:
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <#@ template language="C#" tier="1" #>
  <#@ import namespace="Varigence.Biml.CoreLowerer.SchemaManagement" #>
  <# var srcDB = RootNode.OleDbConnections["src"]; #>
  <# var WWIdb = srcDB.GetDatabaseSchema(ImportOptions.ExcludeViews); #>
  <Packages>
    <# foreach (var table in WWIdb.TableNodes) { #>
    <Package Name="<#=table.Schema#>_<#=table.Name#>" ConstraintMode="Linear">
      <Tasks>
        <Dataflow Name="DF Copy <#=table.Name#>">
        </Dataflow>
      </Tasks>
    </Package>
    <# } #>
  </Packages>
</Biml>
That misleading error surfaces from the call to GetDatabaseSchema. I say it's misleading because the root problem is that srcDB is null. See for yourself by using this code in your second Biml file:
<#@ import namespace="System.Windows.Forms" #>
<#@ assembly name="C:\Windows\Microsoft.NET\Framework\v4.0.30319\System.Windows.Forms.dll" #>
<# var srcDB = RootNode.OleDbConnections["ConnectionDoesNotExist"]; #>
<#
if (srcDB == null)
{
    MessageBox.Show("It's null");
}
else
{
    MessageBox.Show(string.Format("It's not null - {0}", srcDB.Name));
}
#>
Root problem
You are accessing an object in the connections collection that doesn't exist - probably because, while you have your tiering correct, you need to "include" all the files when you build.
How do you resolve this?
If you're using BimlExpress or BIDS Helper, then you simply need to select both file1.biml and file2.biml in Solution Explorer and right-click to generate the packages.
If you are using Mist/BimlStudio, then I would just right-click on file1.biml and change it to Convert to Live BimlScript.