Adding Line separator (\u2028) into Logback pattern - logback

I tried to add a custom Logback pattern in order to log exception stack traces on a single line (with the newline character replaced by the Unicode line separator \u2028), as:
%date{"yyyy-MM-dd'T'HH:mm:ss.SSSZ", UTC} %5p %t %c{5}:%L [log_framework=logback;app_name=${APP_NAME};app_version=${APP_VERSION};instance_id=${CF_INSTANCE_INDEX}] %m MULTIEXCEPTION %replace(%xException){'\n','\u2028'}%nopex%n
Note: see the spring-config.xml config file on GitHub.
In the console, \n is indeed replaced and the exception stack trace ends up on one line, but instead of the "Line separator" character (\u2028), \n is replaced with the literal string "u2028".
If I log this "Line separator" character directly (as a log message via Logback), it is printed in the console correctly.
What could be the problem?

I've managed to do this by entering the "Line separator" Unicode character itself directly (it is invisible and shows up as a line break in the pattern below):
%date{"yyyy-MM-dd'T'HH:mm:ss.SSSZ", UTC} %5p %t %c{5}:%L [log_framework=logback;app_name=${APP_NAME};app_version=${APP_VERSION};instance_id=${CF_INSTANCE_INDEX}] %m MULTIEXCEPTION %replace(%xException){'\n','
'}%nopex%n
Note: You can also make exception stack traces single-lined in a Spring Boot application by adding the following application property:
logging.exception-conversion-word: "%replace(%xException){'\\n','\u2028'}%nopex"
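A likely explanation: .properties files (loaded via java.util.Properties) decode \uXXXX escapes into real characters before Logback ever sees the value, whereas in logback.xml the pattern is passed verbatim, and Java's regex replacement only strips the backslash, leaving the literal text "u2028". The two outcomes can be sketched outside Logback (plain Python, illustrative stack-trace text):

```python
# Sketch of the two behaviors (not Logback code):
stack_trace = "java.lang.RuntimeException: boom\n\tat App.main(App.java:10)"

# Escape decoded by the loader/compiler (as in .properties or source code):
single_line = stack_trace.replace("\n", "\u2028")
assert "\n" not in single_line and "\u2028" in single_line

# Verbatim pattern text: the backslash is stripped and "u2028" remains
# as literal text, which matches the output observed in the console.
degraded = stack_trace.replace("\n", "u2028")
assert "\u2028" not in degraded and "u2028" in degraded
```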

Related

Nifi: Delimiters in Nifi

I was working on a task and got an error saying "invalid char between encapsulated token and delimiter".
Here is a screenshot of the data.
For line 08 I was using the pipe as the delimiter, and the escape character was '^'. For line 12 I was using the comma as the delimiter. The highlighted part is the issue: if I remove the caret from line 08 and the single quote from line 12, it runs successfully.
The processor is ConvertRecord, and here is a screenshot of the processor's configuration.
Actually, I am using two ConvertRecord processors. In one processor the field separator is a comma (,), whereas in the second the field separator is also a comma (,) but the escape character is the caret (^).
Assume that these are two different records.
Why is it throwing an error at that point, and how can I solve this issue?
Thanks in advance.
For the first sample data (line 08), configure CSVReader as:
Quote Character: "
Escape Character: \
Value Separator(delimiter): |
For the second sample data (line 12), configure CSVReader as:
Quote Character: "
Escape Character: \
Value Separator(delimiter): ,
The reason for the failure is that your data does not conform to the delimited-data specification, i.e. the data is invalid, so you need to add upstream cleanup logic.
For the line 08 data: you used ^ as the escape character, and the same character appears in the data itself, so when CSVReader encountered ^" it escaped the ". Because of this, the opening double quote had no corresponding closing double quote, which caused the exception. Setting the Escape Character: \ property will resolve the issue. \ is a widely used escape character, so it is very rare for \ to appear as part of the data.
For the line 12 data: it seems a single quote ' is used as the quote character and is missing its corresponding closing quote ', which caused the exception. You need to devise logic that adds the missing closing quote wherever required. A workaround is to use Quote Character: " so that ' becomes part of the data, which you can then clean downstream, e.g. if you are putting the data into a table, update the column post-ingestion to remove the '.
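The effect of the suggested settings can be reproduced outside NiFi with any escape-aware CSV parser; a minimal Python sketch (the sample row is made up, shaped like the line 08 data):

```python
import csv
import io

# Hypothetical pipe-delimited row with an embedded double quote escaped
# by backslash, matching the suggested CSVReader settings
# (Quote Character: ", Escape Character: \, Value Separator: |).
raw = 'a|"say \\"hi\\""|c\n'

rows = list(csv.reader(io.StringIO(raw),
                       delimiter="|", quotechar='"', escapechar="\\"))
# The escaped quote is consumed correctly and the field closes as expected:
# rows == [['a', 'say "hi"', 'c']]
```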

Excel - We found extra characters at the end of JSON input

I am trying to import my JSON file into Excel via the Get Data function. When doing this I get an error saying "We found extra characters at the end of JSON input".
I ran the JSON file through jsonformatter and got this additional piece of information:
Parse error on line 1:
...s":"1555615338756"}
{"created_at":"Thu A
-----------------------^
Expecting 'EOF', '}', ',', ']', got '{'
Edit: line 1
{"created_at":"Thu Apr 18 19:22:18 +0000 2019","id":1118957948263206913,"id_str":"1118957948263206913","text":"Arsenal jersey looks weird. #NapoliArsenal","source":"\u003ca href=\"https://mobile.twitter.com\" rel=\"nofollow\"\u003eTwitter Web App\u003c/a\u003e","truncated":false,"in_reply_to_status_id":null,"in_reply_to_status_id_str":null,"in_reply_to_user_id":null,"in_reply_to_user_id_str":null,"in_reply_to_screen_name":null,"user":{"id":955479430377373696,"id_str":"955479430377373696","name":"Yash Iyer","screen_name":"MesutOziI28","location":"Bengaluru South, India","url":null,"description":"RM,Nerazzurri,BFC,RCB,bcci,rafa nadal and so on! Lately into B99,superstore! Sympathetic liker of tweets!","translator_type":"none","protected":false,"verified":false,"followers_count":258,"friends_count":454,"listed_count":0,"favourites_count":47788,"statuses_count":5318,"created_at":"Mon Jan 22 16:37:02 +0000 2018","utc_offset":null,"time_zone":null,"geo_enabled":false,"lang":"en","contributors_enabled":false,"is_translator":false,"profile_background_color":"F5F8FA","profile_background_image_url":"","profile_background_image_url_https":"","profile_background_tile":false,"profile_link_color":"1DA1F2","profile_sidebar_border_color":"C0DEED","profile_sidebar_fill_color":"DDEEF6","profile_text_color":"333333","profile_use_background_image":true,"profile_image_url":"http://pbs.twimg.com/profile_images/1109886916609007616/9rAavtGh_normal.jpg","profile_image_url_https":"https://pbs.twimg.com/profile_images/1109886916609007616/9rAavtGh_normal.jpg","profile_banner_url":"https://pbs.twimg.com/profile_banners/955479430377373696/1544903252","default_profile":true,"default_profile_image":false,"following":null,"follow_request_sent":null,"notifications":null},"geo":null,"coordinates":null,"place":null,"contributors":null,"is_quote_status":false,"quote_count":0,"reply_count":0,"retweet_count":0,"favorite_count":0,"entities":{"hashtags":[{"text":"NapoliArsenal","indices":[28,42]}],"urls":[
],"user_mentions":[],"symbols":[]},"favorited":false,"retweeted":false,"filter_level":"low","lang":"en","timestamp_ms":"1555615338756"}
Forever old, but I was trying to ingest an AWS CloudCheckr JSON into Excel via Power Query and kept getting "We found extra characters at the end of JSON input".
I finally figured out, with the help of https://jsonformatter.org/, that some data was provided as True, without quotes; valid JSON booleans are lowercase (true), so Excel PQ rejected it. Simple find/replacing of :True, with :"True", did the trick.
Seriously, Microsoft, you could not recognize a Boolean when you found it? Excel itself would have had no problem.
The answer is in the error message:
Expecting 'EOF', '}', ',', ']', got '{'
Looking at where { appears, notice that directly before it is }. The JSON has no separator after the closing curly brace }, so the parser cannot continue, because it is looking for one of the following:
EOF
}
,
]
In this case, it most likely needs a ,. It could also need ], if it's an array of items. If neither of those fixes it, you will need to post the entire line 1 of your JSON.
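In this case the file is almost certainly newline-delimited JSON (one object per line, as Twitter's streaming API emits), so each line is valid on its own but the whole file is not. A sketch in Python of one way to wrap it into a single valid array (the sample objects are made up):

```python
import io
import json

# Two objects back to back with no separator between them -- the shape
# the parser complained about ("Expecting 'EOF', '}', ',', ']', got '{'").
ndjson = io.StringIO('{"id": 1, "text": "a"}\n{"id": 2, "text": "b"}\n')

# Parse each line separately, then emit one valid JSON array.
objects = [json.loads(line) for line in ndjson if line.strip()]
as_array = json.dumps(objects)
```

The resulting file (a single array) imports cleanly because it is one complete JSON document.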
I had to make two changes for this to work: I changed the JSON body to Pascal-case notation, even though it was camel case in the Chrome payload; and secondly, I had been returning a simple string to verify the service was working. Once I changed that to an object (a JSON-formatted response), the error was resolved.

neo4j throws error for "\" character

I am exporting a CSV file and need to read it line by line.
One of the lines in the CSV file contains the string "C:\Program Files\". Because of this line, it throws the error below.
At D:\workdir\Neo4j_Database\Database1\import\Data.csv:22798 - there's
a field starting with a quote and whereas it ends that quote there
seems to be characters in that field after that ending quote. That
isn't supported. This is what I read: 'CMM 10.0.1 Silent Installation
will install SW always in "C:\Program Files"",V10.0,
,,,,,,,,105111,AVASAAIS AG,E,,"G,"'
If I remove the last \ of the line, it does not throw this error.
I am not sure how to resolve this without modifying the CSV file.
Note: the CSV is loaded using LOAD CSV.
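The failure can be reproduced with any CSV reader that treats \ as an escape character; a Python sketch with a simplified stand-in for the offending row (the actual line 22798 is longer):

```python
import csv
import io

# "C:\Program Files\" ends with a backslash immediately before the closing
# quote. If the reader treats \ as an escape character, \" is read as an
# escaped quote, so the field never ends where the writer intended.
row = '1,"C:\\Program Files\\",V10.0\n'

plain = list(csv.reader(io.StringIO(row)))                     # \ is ordinary
escaped = list(csv.reader(io.StringIO(row), escapechar="\\"))  # \ swallows the quote

# Without escape handling the row parses as intended; with it, the quoted
# field runs past its intended closing quote and the result differs.
```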

How to load a csv file with header and data type in neo4j?

Am I correct that one cannot use cypher to load a csv file with header together with datatype?
(By "with datatype", I mean header with something like this:
For entities:
orderId:ID(Order) customerId:IGNORE
For relationships:
:START_ID(Order) :END_ID(Product)
)
According to these two websites: https://neo4j.com/developer/guide-import-csv/, http://jexp.de/blog/2015/04/how-to-neo4j-data-import-minimal-example/
it seems that I could import the data together with the header this way in either PowerShell or the command prompt (I am using a Windows computer):
path\to\neo4j-community-3.1.1\bin\neo4j-import --into graph.db \
--nodes:Person C:\SavedNewest\people_header.csv, C:\SavedNewest\people.csv \
--relationships:KNOWS C:\SavedNewest\friendships_header.csv,C:\SavedNewest\friendships.csv
(The csv are reconstructed according to this website: http://jexp.de/blog/2015/04/how-to-neo4j-data-import-minimal-example/)
Error from PowerShell:
At line:2 char:3
+ --nodes:Person C:\SavedNewest\people_header.csv,https://gist.githubus ...
+ ~
Missing expression after unary operator '--'.
At line:2 char:3
+ --nodes:Person C:\SavedNewest\people_header.csv,https://gist.githubus ...
+ ~~~~~~~~~~~~
Unexpected token 'nodes:Person' in expression or statement.
+ CategoryInfo : ParserError: (:) [], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : MissingExpressionAfterOperator
Error from Command prompt:
WARNING: This command does not appear to be running with administrative rights.
Some commands may fail e.g. Start/Stop
WARNING: neo4j-import is deprecated and support for it will be removed in a future
version of Neo4j; please use neo4j-admin import instead.
Input error: Expected '--relationships' to have at least 1 valid item, but had 0 []
Caused by:Expected '--relationships' to have at least 1 valid item, but had 0 []
java.lang.IllegalArgumentException: Expected '--relationships' to have at least 1 valid item, but had 0 []
at org.neo4j.kernel.impl.util.Validators.lambda$atLeast$6(Validators.java:125)
at org.neo4j.helpers.Args.validated(Args.java:640)
at org.neo4j.helpers.Args.interpretOptionsWithMetadata(Args.java:608)
at org.neo4j.tooling.ImportTool.extractInputFiles(ImportTool.java:508)
at org.neo4j.tooling.ImportTool.main(ImportTool.java:389)
at org.neo4j.tooling.ImportTool.main(ImportTool.java:334)
What is the cause of the error, and how should I load a CSV file with header and data types correctly?
Edit:
New Input for cmd:
C:\Users\tsutomu\Desktop\MSS\Bachelorarbeit\neo4j-community-3.1.1\bin\neo4j-import --into graph.db --nodes:Person "file:c:/SavedNewest/people_header.csv,file:c:/SavedNewest/people.csv" --relationships:KNOWS "file:c:/SavedNewest/friendships_header.csv,file:c:/SavedNewest/friendships.csv"
The Error:
Input error: Directory of file:c:\SavedNewest\people_header.csv doesn't exist
Caused by:Directory of file:c:\SavedNewest\people_header.csv doesn't exist
java.lang.IllegalArgumentException: Directory of file:c:\SavedNewest\people_header.csv doesn't exist
at org.neo4j.kernel.impl.util.Validators.matchingFiles(Validators.java:48)
at org.neo4j.kernel.impl.util.Converters.lambda$regexFiles$7(Converters.java:76)
at org.neo4j.kernel.impl.util.Converters.lambda$toFiles$8(Converters.java:95)
at org.neo4j.helpers.Args.interpretOptionsWithMetadata(Args.java:608)
at org.neo4j.tooling.ImportTool.extractInputFiles(ImportTool.java:508)
at org.neo4j.tooling.ImportTool.main(ImportTool.java:388)
at org.neo4j.tooling.ImportTool.main(ImportTool.java:334)
The same error occurs in PowerShell.
The path of people_header.csv is C:\SavedNewest\people_header.csv.
Is there anything I should add to the environment path?
You have a couple of issues in your command line:
You cannot have embedded spaces in your command-line arguments. For example, C:\SavedNewest\people_header.csv, C:\SavedNewest\people.csv should be a single argument, so you need to either remove the space after the comma or double-quote the entire argument.
The file-path URLs must be formatted appropriately. To quote from the developer guide:
Make sure to use the right URLs, especially file URLs. On OSX and Unix use file:///path/to/data.csv; on Windows, please use file:c:/path/to/data.csv

Doesn't PhpMyAdmin generate correct JSON export files? With what to replace ^M character on Unix?

I exported an SQL table to a JSON file using the PhpMyAdmin tools.
I got a file like this:
/**
Export to JSON plugin for PHPMyAdmin
#version 0.1
*/
// Database 'table_name'
// ...
[{"id": 7,"field1": "value1","field2": 2,"field3": "Lorem^M
ipsum^M
dolor^M
sit^M
amet","field4": "value4"}]
It's clear that the first lines (which are comments) are not valid, because JSON doesn't allow comments.
I know that ^M is caused by the DOS/Windows line-ending characters.
Running jsonlint myexportedfile.json after removing the comments, I get this:
[Error: Parse error on line 1:
dolorld2": 2,"field3": "Lorem
-----------------------^
Expecting 'STRING', 'NUMBER', 'NULL', 'TRUE', 'FALSE', '{', '[', got 'undefined']
Still, is this valid JSON format? What am I supposed to replace the ^M character with? A newline (\n), or something else?
I want to be able to do require("./myexportedfile") in NodeJS to get the data and then make some changes.
After removing the lines with comments, you can apply the following Perl command to the file:
perl -pi -e 's/\r\n/\n/g' myexportedfile.json
It changes the file in place (!), removing all the ^M characters and making a real Unix file out of it.
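As for what to replace ^M with inside string values: strict JSON parsers (including the one behind Node's require) reject raw control characters in strings, so within a string the CRLF should become the two-character escape sequence \n rather than a bare newline. A quick Python check (the sample is a shortened, made-up version of the exported record):

```python
import json

# Raw CR/LF characters inside a JSON string value, as in the export.
broken = '[{"id": 7, "field3": "Lorem\r\nipsum\r\ndolor"}]'

try:
    json.loads(broken)  # strict parsing rejects raw control characters
    parsed = True
except ValueError:
    parsed = False

# Replace the line ending with the *escape sequence* \n (two characters).
fixed = broken.replace("\r\n", "\\n")
data = json.loads(fixed)  # now parses; the value contains real newlines
```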