FeignClient with HATEOAS PagedModel: content is always empty - spring-hateoas

I have an endpoint which produces HATEOAS:
@GetMapping()
public ResponseEntity<PagedModel<ContentModel>> getContent(
        @RequestParam(required = false) final String sortBy,
        @RequestParam(defaultValue = "0") final Integer page,
        @RequestParam(defaultValue = "0") final Integer size) {
    // .. do stuff
}
If I use RestTemplate to call this endpoint I get the expected result.
But if I use a FeignClient:
@FeignClient(url = "${project.backend.url}/contents")
public interface ContentClient {

    @GetMapping
    ResponseEntity<PagedModel<ContentModel>> getContent(
            @RequestParam(required = false, name = "sortBy") final String sortBy,
            @RequestParam(defaultValue = "0", name = "page") final Integer page,
            @RequestParam(defaultValue = "0", name = "size") final Integer size);
}
I still get a response where the metadata is correct but the content is empty:
backend-services-test_1 | 2022-03-15 11:43:29.713 INFO 80 --- [ main] d.b.b.p.p.s.c.CommonQueriesAndAsserts : body: PagedResource { content: [], metadata: Metadata { number: 0, total pages: 1, total elements: 9, size: 2147483647 }, links: }
I tried adding @EnableHypermediaSupport(type = EnableHypermediaSupport.HypermediaType.HAL), but that did not solve the problem.
The suggestions from "Spring Data Rest Hateoas Resources object empty when consuming with Feign client in client service" did not solve it either.
I also have this config to support Jackson with HAL:
@Configuration
public class ServiceConfiguration {

    @Bean
    public ObjectMapper objectMapper() {
        final ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.configure(FAIL_ON_UNKNOWN_PROPERTIES, false);
        objectMapper.registerModule(new Jackson2HalModule());
        return objectMapper;
    }

    @Bean
    public MappingJackson2HttpMessageConverter converter() {
        final MappingJackson2HttpMessageConverter converter = new MappingJackson2HttpMessageConverter();
        converter.setSupportedMediaTypes(singletonList(HAL_JSON));
        converter.setObjectMapper(objectMapper());
        return converter;
    }

    @Bean
    public TestRestTemplate testRestTemplate(final RestTemplateBuilder builder) {
        return new TestRestTemplate(builder.messageConverters(converter()));
    }
}
What do I need to do so that FeignClient can parse the information it gets?
EDIT
I enabled DEBUG and Feign FULL logging, and I can see that the JSON data is correct:
backend-services-test_1 | 2022-03-15 13:24:54.947 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] <--- HTTP/1.1 200 (607ms)
backend-services-test_1 | 2022-03-15 13:24:54.947 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] cache-control: no-cache, no-store, max-age=0, must-revalidate
backend-services-test_1 | 2022-03-15 13:24:54.947 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] connection: keep-alive
backend-services-test_1 | 2022-03-15 13:24:54.947 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] content-type: application/json
backend-services-test_1 | 2022-03-15 13:24:54.947 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] date: Tue, 15 Mar 2022 12:24:54 GMT
backend-services-test_1 | 2022-03-15 13:24:54.947 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] expires: 0
backend-services-test_1 | 2022-03-15 13:24:54.948 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] keep-alive: timeout=60
backend-services-test_1 | 2022-03-15 13:24:54.948 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] pragma: no-cache
backend-services-test_1 | 2022-03-15 13:24:54.948 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] transfer-encoding: chunked
backend-services-test_1 | 2022-03-15 13:24:54.948 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] x-content-type-options: nosniff
backend-services-test_1 | 2022-03-15 13:24:54.948 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] x-frame-options: DENY
backend-services-test_1 | 2022-03-15 13:24:54.948 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] x-xss-protection: 1; mode=block
backend-services-test_1 | 2022-03-15 13:24:54.948 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent]
backend-services-test_1 | 2022-03-15 13:24:54.967 DEBUG 81 --- [ main] d.b.b.p.p.steps.content.ContentClient : [ContentClient#getContent] {
backend-services-test_1 | "links" : [ {
backend-services-test_1 | "rel" : "self",
backend-services-test_1 | "href" : "http://backend-services:8082/contents?sortBy&page=0&size=2000"
backend-services-test_1 | } ],
backend-services-test_1 | "content" : [ {
backend-services-test_1 | "key" : {
backend-services-test_1 | "id" : 1,
backend-services-test_1 | "version" : "2.0"
backend-services-test_1 | },
backend-services-test_1 | "type" : "CONTENT",
backend-services-test_1 | "title" : "Title",
backend-services-test_1 | "subtitle" : "Subtitle",
backend-services-test_1 | }, {
backend-services-test_1 | "key" : {
backend-services-test_1 | "id" : 2,
backend-services-test_1 | "version" : "2.0"
backend-services-test_1 | },
backend-services-test_1 | "type" : "CONTENT",
backend-services-test_1 | "title" : "Title",
backend-services-test_1 | "subtitle" : "Subtitle",
...
backend-services-test_1 | } ],
backend-services-test_1 | "page" : {
backend-services-test_1 | "size" : 2147483647,
backend-services-test_1 | "totalElements" : 9,
backend-services-test_1 | "totalPages" : 1,
backend-services-test_1 | "number" : 0
backend-services-test_1 | }
backend-services-test_1 | }
EDIT 2
Adding:
<dependency>
    <groupId>io.github.openfeign</groupId>
    <artifactId>feign-jackson</artifactId>
    <version>9.3.1</version>
</dependency>
and:
@Bean
public Decoder feignDecoder() {
    return new ResponseEntityDecoder(new JacksonDecoder(objectMapper()));
}
also did not solve the problem.

So the custom Feign decoder was the right path.
Here is the full config that made it work:
@Configuration
public class FeignConfiguration {

    @Bean
    Logger.Level feignLoggerLevel() {
        return Level.FULL;
    }

    @Bean
    public ObjectMapper objectMapper() {
        return new ObjectMapper()
                .configure(FAIL_ON_UNKNOWN_PROPERTIES, false)
                .registerModule(new Jackson2HalModule());
    }

    @Bean
    public MappingJackson2HttpMessageConverter converter() {
        final MappingJackson2HttpMessageConverter converter = new MappingJackson2HttpMessageConverter();
        converter.setSupportedMediaTypes(singletonList(HAL_JSON));
        converter.setObjectMapper(objectMapper());
        return converter;
    }

    @Bean
    public Decoder feignDecoder() {
        return new ResponseEntityDecoder(
                new SpringDecoder(
                        () -> new HttpMessageConverters(converter())));
    }
}

Related

jq: Unable to retrieve a key/value from Json file using jq

I have the below JSON from the curl output, and I need to retrieve the IP address from it. I tried the jq query below, but I get the error shown. I tried several other ways of doing it, with no luck:
curl -sH "X-Requested-By: ambari" -u admin:admin -i http://${AMBARI_IP}:8080/api/v1/hosts?fields=Hosts/host_name,Hosts/ip | jq '.[] | {.items.Hosts.ip}'
jq: error: syntax error, unexpected FIELD (Unix shell quoting issues?) at <top-level>, line 1:
.[] | {.items.Hosts.ip}
jq: 1 compile error
(23) Failed writing body
Below is the output of curl:
HTTP/1.1 200 OK
Date: Fri, 02 Jul 2021 21:04:27 GMT
X-Frame-Options: DENY
X-XSS-Protection: 1; mode=block
X-Content-Type-Options: nosniff
Cache-Control: no-store
Pragma: no-cache
Set-Cookie: AMBARISESSIONID=123344.node0;Path=/;HttpOnly
Expires: Thu, 01 Jan 1970 00:00:00 GMT
User: admin
Content-Type: text/plain;charset=utf-8
X-Content-Type-Options: nosniff
Vary: Accept-Encoding, User-Agent
Transfer-Encoding: chunked
{
"href" : "http://10.0.0.33:8080/api/v1/hosts?fields=Hosts/host_name,Hosts/ip",
"items" : [
{
"href" : "http://10.0.0.33:8080/api/v1/hosts/sil.dev.test.com",
"Hosts" : {
"host_name" : "test123.sil.dev.test.com",
"ip" : "10.135.3.119"
}
},
{
"href" : "http://10.0.0.33:8080/api/v1/hosts/test001.sil.dev.test.com",
"Hosts" : {
"cluster_name" : "test_cluster",
"host_name" : "test001.sil.dev.test.com",
"ip" : "10.0.0.33"
}
},
{
"href" : "http://10.0.0.33:8080/api/v1/hosts/test002.sil.dev.test.com",
"Hosts" : {
"cluster_name" : "test_cluster",
"host_name" : "test002.sil.dev.test.com",
"ip" : "10.0.0.34"
}
},
{
"href" : "http://10.0.0.33:8080/api/v1/hosts/test003.sil.dev.test.com",
"Hosts" : {
"cluster_name" : "test_cluster",
"host_name" : "test003.sil.dev.test.com",
"ip" : "10.0.0.35"
}
}
]
}
Try jq '.items[].Hosts.ip'. This grabs the .items key from the outer object, iterates the items array, then pulls the value from the path .Hosts.ip from each object.
PS > cat a.json
{
"href" : "http://10.0.0.33:8080/api/v1/hosts?fields=Hosts/host_name,Hosts/ip",
"items" : [
{
"href" : "http://10.0.0.33:8080/api/v1/hosts/sil.dev.test.com",
"Hosts" : {
"host_name" : "test123.sil.dev.test.com",
"ip" : "10.135.3.119"
}
},
{
"href" : "http://10.0.0.33:8080/api/v1/hosts/test001.sil.dev.test.com",
"Hosts" : {
"cluster_name" : "test_cluster",
"host_name" : "test001.sil.dev.test.com",
"ip" : "10.0.0.33"
}
},
{
"href" : "http://10.0.0.33:8080/api/v1/hosts/test002.sil.dev.test.com",
"Hosts" : {
"cluster_name" : "test_cluster",
"host_name" : "test002.sil.dev.test.com",
"ip" : "10.0.0.34"
}
},
{
"href" : "http://10.0.0.33:8080/api/v1/hosts/test003.sil.dev.test.com",
"Hosts" : {
"cluster_name" : "test_cluster",
"host_name" : "test003.sil.dev.test.com",
"ip" : "10.0.0.35"
}
}
]
}
PS > cat a.json | jq '.items[].Hosts.ip'
"10.135.3.119"
"10.0.0.33"
"10.0.0.34"
"10.0.0.35"
PS > cat a.json | jq -r '.items[].Hosts.ip'
10.135.3.119
10.0.0.33
10.0.0.34
10.0.0.35
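For readers without jq at hand, the same path walk can be reproduced with any JSON parser. Here is a small Python sketch (illustrative only, not part of the original answer) against a trimmed copy of the response:

```python
import json

# A trimmed copy of the curl response from the question (two hosts shown).
doc = json.loads("""
{
  "items": [
    {"Hosts": {"host_name": "test123.sil.dev.test.com", "ip": "10.135.3.119"}},
    {"Hosts": {"host_name": "test001.sil.dev.test.com", "ip": "10.0.0.33"}}
  ]
}
""")

# Equivalent of jq '.items[].Hosts.ip': iterate the "items" array and
# read the "ip" under each "Hosts" object.
ips = [item["Hosts"]["ip"] for item in doc["items"]]
print(ips)  # -> ['10.135.3.119', '10.0.0.33']
```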

Map Json to POJO class using Rest Template exchange method

This is my POST request to an API
ResponseEntity<String> result = rt.exchange(url, HttpMethod.POST, httpEntity, String.class);
The response I am getting is below:
{
"headers": {
"Cache-Control": [
"no-store"
],
"Content-Type": [
"application/json;charset=UTF-8"
],
"Date": [
"Thu, 20 Jun 2019 12:50:08 GMT"
],
"Pragma": [
"no-cache"
],
"Server": [
"val"
],
"X-Content-Type-Options": [
"nosniff"
],
"X-Frame-Options": [
"DENY"
],
"X-Xss-Protection": [
"1; mode=block"
],
"Content-Length": [
"331"
]
},
"body": {
"access_token": "token_value",
"scope": "KYC",
"token_type": "bearer",
"expires_in": 49900,
"jti": "jti_val"
},
"statusCode": "OK",
"statusCodeValue": 200
}
I need to extract:
access_token, scope, token_type, statusCodeValue
So what should the structure of my POJO class be to map the response? Or how can I get the values from the JSON for those fields?
ResponseEntity<PojoClass> result = rt.exchange(url, HttpMethod.POST, httpEntity, PojoClass.class);
The body can be obtained through the snippet posted above, and the status code can be retrieved using response.getStatusCode().
Try this (updated):
ResponseEntity<Response> response = rt.exchange(url, HttpMethod.POST, httpEntity, Response.class);
Class definitions:
Body class:
public class Body implements Serializable {

    private String access_token;
    private String scope;
    private String token_type;
    private long expires_in;
    private String jti;

    // standard getters and setters
}
Response class:
@JsonIgnoreProperties(ignoreUnknown = true)
public class Response implements Serializable {

    private String statusCode;
    private int statusCodeValue;
    private Body body;

    // standard getters and setters
}
Now you can access your desired values through their respective getter methods, for example:
String token = response.getBody().getBody().getAccess_token();
String statusCode = response.getBody().getStatusCode();
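To double-check the paths the POJOs must mirror, here is a quick Python sketch (purely illustrative; the field names and values are taken from the question's sample JSON, abbreviated):

```python
import json

# The response body from the question, abbreviated to the relevant fields.
payload = json.loads("""
{
  "body": {
    "access_token": "token_value",
    "scope": "KYC",
    "token_type": "bearer",
    "expires_in": 49900,
    "jti": "jti_val"
  },
  "statusCode": "OK",
  "statusCodeValue": 200
}
""")

# The four requested values sit at these paths, which is exactly the
# nesting the Body/Response POJO pair reflects:
access_token = payload["body"]["access_token"]
scope = payload["body"]["scope"]
token_type = payload["body"]["token_type"]
status_code_value = payload["statusCodeValue"]
print(access_token, scope, token_type, status_code_value)
```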

Can I output a property with jq based on a nested property in the input?

This follows on from Extracting selected properties from a nested JSON object with jq which lets the OP there get rid of a load of unwanted properties from a nested object.
I've got the same problem but instead of an array starting [, just have a stream of JSON objects, each like this:
{
"localHostName" : "rest-2-17ve6",
"port" : "80",
"requestHeaders" : {
"x-forwarded-port" : "443",
"x-forwarded-host" : "dummy.com",
"content-length" : "15959431",
"accept" : "*/*",
"x-forwarded-for" : "10.1.9.11",
"authorization" : "hash is present",
"expect" : "100-continue",
"forwarded" : "for=10.5.9.1;host=dummy.com;proto=https",
"content-type" : "application/json",
"host" : "dummy.com",
"x-forwarded-proto" : "https",
"user-agent" : "curl/7.51.0"
},
"uri" : "/2/data/saveList",
"protocol" : "HTTP/1.1",
"threadName" : "http-nio-8080-exec-10",
"requestBytes" : 15959431,
"applicationDuration" : 44135,
"responseStatus" : "200",
"remoteIpAddress" : "10.1.10.1",
"responseHeaders" : {
"X-XSS-Protection" : "1; mode=block",
"Content-Type" : "application/json;charset=UTF-8",
"X-Content-Type-Options" : "nosniff",
"Cache-Control" : "no-cache, no-store, max-age=0, must-revalidate",
"Date" : "Wed, 20 Jun 2018 15:53:27 GMT",
"Transfer-Encoding" : "chunked",
"Vary" : "Accept-Encoding",
"X-Frame-Options" : "DENY",
"Expires" : "0",
"Pragma" : "no-cache"
},
"isoDateTime" : "2018-06-20T15:52:42.466913985Z",
"method" : "POST",
"username" : "rd7y1",
"localIpAddress" : "10.129.9.238",
"responseBytes" : 2,
"requestContentExcerpt" : "blah",
"totalDuration" : 44869,
"responseContentExcerpt" : " [] "
}
I want to filter the stream on the command line so I only get:
{
"isoDateTime" : "2018-06-20T15:52:42.466913985Z",
"method" : "POST",
"username" : "rd7y1",
"requestHeaders.user-agent" : "Rcurl"
}
I tried cat /logs/json.log | jq -cC 'map(requestHeaders|={user-agent})' but I'm getting a syntax error.
Since jq is stream-oriented, you would just use select(...) rather than map(select(...))
It looks like you intend to use .requestHeaders."user-agent" in the criterion for selection.
It's generally recommended to avoid using cat when possible.
According to your stated requirements, you should drop the -c command-line option.
Since "Rcurl" does not appear in your sample input, I'll use the string that does appear.
So in your case, you'd end up with something like:
< /logs/json.log jq '
select(.requestHeaders."user-agent" == "curl/7.51.0")
| {isoDateTime, method, username,
"requestHeaders.user-agent": .requestHeaders."user-agent"}'
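The same select-then-project logic, rewritten in Python for readers who want to verify the behavior outside jq (a sketch only; the matching user-agent string is the one that appears in the sample input):

```python
import json

# One record from the stream, trimmed to the fields used by the filter.
record = json.loads("""
{
  "isoDateTime": "2018-06-20T15:52:42.466913985Z",
  "method": "POST",
  "username": "rd7y1",
  "requestHeaders": {"user-agent": "curl/7.51.0", "host": "dummy.com"}
}
""")

def select_and_project(rec):
    # Mirrors the jq filter: keep the record only when the nested
    # user-agent matches, then emit the reduced object.
    if rec.get("requestHeaders", {}).get("user-agent") != "curl/7.51.0":
        return None
    return {
        "isoDateTime": rec["isoDateTime"],
        "method": rec["method"],
        "username": rec["username"],
        "requestHeaders.user-agent": rec["requestHeaders"]["user-agent"],
    }

print(select_and_project(record))
```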

AWS Gateway API fails to convert DynamoDb JSON in array to regular JSON

I am attempting to create a service which outputs JSON from a DynamoDB database. However, after implementing body mapping templates in Amazon API Gateway, the service only converts parts of the DynamoDB JSON to regular JSON. There is no error visible in the logs. I have attached below the body mapping templates and the response body for the GET.
Body Mapping Template:
#set($inputRoot = $input.path('$'))
#foreach($elem in $inputRoot.Items) {
"accession" : "$elem.accession.S",
"entryName" : "$elem.entryName.S",
"sequence" : "$elem.sequence.S",
"sequenceChecksum" : "$elem.sequenceChecksum.S",
"taxid" : "$elem.taxid.N",
"features" : "$elem.features.L"
} #if($foreach.hasNext),#end
#end
Response:
{
"accession" : "P05067",
"entryName" : "A4_HUMAN",
"sequence" : "MLPGLALLLLAAWTARALEVPTDGNAGLLAEPQIAMFCGRLNMHMNVQNGKWDSDPSGTKTCIDTKEGILQYCQEVYPELQITNVVEANQPVTIQNWCKRGRKQCKTHPHFVIPYRCLVGEFVSDALLVPDKCKFLHQERMDVCETHLHWHTVAKETCSEKSTNLHDYGMLLPCGIDKFRGVEFVCCPLAEESDNVDSADAEEDDSDVWWGGADTDYADGSEDKVVEVAEEEEVAEVEEEEADDDEDDEDGDEVEEEAEEPYEEATERTTSIATTTTTTTESVEEVVREVCSEQAETGPCRAMISRWYFDVTEGKCAPFFYGGCGGNRNNFDTEEYCMAVCGSAMSQSLLKTTQEPLARDPVKLPTTAASTPDAVDKYLETPGDENEHAHFQKAKERLEAKHRERMSQVMREWEEAERQAKNLPKADKKAVIQHFQEKVESLEQEAANERQQLVETHMARVEAMLNDRRRLALENYITALQAVPPRPRHVFNMLKKYVRAEQKDRQHTLKHFEHVRMVDPKKAAQIRSQVMTHLRVIYERMNQSLSLLYNVPAVAEEIQDEVDELLQKEQNYSDDVLANMISEPRISYGNDALMPSLTETKTTVELLPVNGEFSLDDLQPWHSFGADSVPANTENEVEPVDARPAADRGLTTRPGSGLTNIKTEEISEVKMDAEFRHDSGYEVHHQKLVFFAEDVGSNKGAIIGLMVGGVVIATVIVITLVMLKKKQYTSIHHGVVEVDAAVTPEERHLSKMQQNGYENPTYKFFEQMQN",
"sequenceChecksum" : "A12EE761403740F5",
"taxid" : "9606",
"features" : "[{"M":{"category":{"S":"MOLECULE_PROCESSING"},"description":{"S":" "},"type":"category":{"S":"MOLECULE_PROCESSING"},"description":{"S":"Amyloid beta A4 protein"},"type":{"S":"CHAIN"},"end":{"S":"770"},"begin":{"S":"18"}}},{"M":{"ftId":{"S":"PRO_0000000089"},"category":{"S":"MOLECULE_PROCESSING"},"description":{"S":"Soluble APP-alpha"},"type":{"S":"CHAIN"},"end":{"S":"687"},"begin":{"S":"18"}}},{"M":{"ftId":{"S":"PRO_0000000090"},"category":{"S":"MOLECULE_PROCESSING"},"description":{"S":"Soluble APP-beta"},"type":{"S":"CHAIN"},"end":{"S":"671"},"begin":{"S":"18"}}},{"M":{"ftId":{"S":"PRO_0000381966"},"category":{"S":"MOLECULE_PROCESSING"},"description":{"S":"N-APP"},"type":{"S":"CHAIN"},"end":{"S":"286"},"begin":{"S":"18"}}},{"M":{"ftId":{"S":"PRO_0000000091"},"category":{"S":"MOLECULE_PROCESSING"},"description":{"S":"C99"},"type":{"S":"CHAIN"},"end":{"S":"770"},"begin":{"S":"672"}}},{"M":{"ftId":{"S":"PRO_0000000092"},"category":{"S":"MOLECULE_PROCESSING"},"description":{"S":"Beta-amyloid protein 42"},"type":{"S":"CHAIN"},"end":{"S":"713"},"begin":{"S":"672"}}}]"
}
Response Headers
{"Content-Type":"application/json"}
Logs
Execution log for request test-request
Mon Aug 08 15:26:05 UTC 2016 : Starting execution for request: test-invoke-request
Mon Aug 08 15:26:05 UTC 2016 : HTTP Method: GET, Resource Path: /proteins/{accession}
Mon Aug 08 15:26:05 UTC 2016 : Method request path: {accession=P05067}
Mon Aug 08 15:26:05 UTC 2016 : Method request query string: {}
Mon Aug 08 15:26:05 UTC 2016 : Method request headers: {}
Mon Aug 08 15:26:05 UTC 2016 : Method request body before transformations: null
Mon Aug 08 15:26:05 UTC 2016 : Endpoint request URI: https://dynamodb.us-west-2.amazonaws.com/?Action=Query
Mon Aug 08 15:26:05 UTC 2016 : Endpoint request headers: {Authorization=****************************************************************************************************************************************************************************************************************************************************************************************b0b302, X-Amz-Date=20160808T152605Z, x-amzn-apigateway-api-id=9x56sueb85, Accept=application/json, User-Agent=AmazonAPIGateway_9x56sueb85, X-Amz-Security-Token=AgoGb3JpZ2luEJv//////////wEaCXVzLXdlc3QtMiKAAlXlB1cz9vo5Kf2llpupTpP1fTiHMBBbZhOmQW30/jCc5Q3RV+BM9k0LtqfJXRdRpzw5DEHg1dmlA1k8Ljha+og4RGYFdpj/9wdc4u1WKnZdy/lZFUAMey0YotNc+RniWyMq+ZiVhY94Sv/zKJ+dxSGkDZbz5A6Jbfj4EfVFuMLC3kHA4tJKWp6PCXpyHJqFqQ+UuI/q0coHNQv0euBD6hNUBOEBZes2TIQdTha8f4k+avX7o1f3LcpIjfvdPN4InOXZ7ZMHDgpLEuxurOZ7taZjoXftHxpRG2GAciTNj7gQASCsxRAQL/4gujC6yydGievEE6V5Zn5prIRnHPz0Lmcq8QII8f//////////ARAAGgw5MTUzMzI4Mzc1NDAiDD4AxveOBPRjUQ28ZirFAj5mJUN8gxUfXUQc1AzD08pLgpAtrz11K1Xgax/ATvptUj//Pcy+4fS90PqdZSqMSmS8KsD0X46m7GfhNNzuQypCdY3lyyN [TRUNCATED]
Mon Aug 08 15:26:05 UTC 2016 : Endpoint request body after transformations: {
"TableName": "Protein_DB",
"IndexName": "accession-index",
"KeyConditionExpression": "accession = :v1",
"ExpressionAttributeValues": {
":v1": {
"S": "P05067"
}
}
}
Mon Aug 08 15:26:05 UTC 2016 : Endpoint response body before transformations: {"Count":1,"Items":[{"accession":{"S":"P05067"},"features":{"L":[{"M":{"category":{"S":"MOLECULE_PROCESSING"},"description":{"S":" "},"type":{"S":"SIGNAL"},"evidences":{"L":[{"M":{"source":{"M":{"id":{"S":"12665801"},"alternativeUrl":{"S":"http://europepmc.org/abstract/MED/12665801"},"name":{"S":"PubMed"},"url":{"S":"http://www.ncbi.nlm.nih.gov/pubmed/12665801"}}},"code":{"S":"ECO:0000269"}}},{"M":{"source":{"M":{"id":{"S":"2900137"},"alternativeUrl":{"S":"http://europepmc.org/abstract/MED/2900137"},"name":{"S":"PubMed"},"url":{"S":"http://www.ncbi.nlm.nih.gov/pubmed/2900137"}}},"code":{"S":"ECO:0000269"}}},{"M":{"source":{"M":{"id":{"S":"3597385"},"alternativeUrl":{"S":"http://europepmc.org/abstract/MED/3597385"},"name":{"S":"PubMed"},"url":{"S":"http://www.ncbi.nlm.nih.gov/pubmed/3597385"}}},"code":{"S":"ECO:0000269"}}}]},"end":{"S":"17"},"begin":{"S":"1"}}},{"M":{"ftId":{"S":"PRO_0000000088"},"category":{"S":"MOLECULE_PROCESSING"},"description":{"S":"Amyloid b [TRUNCATED]
Mon Aug 08 15:26:05 UTC 2016 : Endpoint response headers: {x-amzn-RequestId=J0J7K1RFDOGEK20V70T0HKJ14JVV4KQNSO5AEMVJF66Q9ASUAAJG, x-amz-crc32=273504943, Content-Length=75281, Date=Mon, 08 Aug 2016 15:26:05 GMT, Content-Type=application/x-amz-json-1.0}
Mon Aug 08 15:26:05 UTC 2016 : Method response body after transformations:
{
"accession" : "P05067",
"entryName" : "A4_HUMAN",
"sequence" : "MLPGLALLLLAAWTARALEVPTDGNAGLLAEPQIAMFCGRLNMHMNVQNGKWDSDPSGTKTCIDTKEGILQYCQEVYPELQITNVVEANQPVTIQNWCKRGRKQCKTHPHFVIPYRCLVGEFVSDALLVPDKCKFLHQERMDVCETHLHWHTVAKETCSEKSTNLHDYGMLLPCGIDKFRGVEFVCCPLAEESDNVDSADAEEDDSDVWWGGADTDYADGSEDKVVEVAEEEEVAEVEEEEADDDEDDEDGDEVEEEAEEPYEEATERTTSIATTTTTTTESVEEVVREVCSEQAETGPCRAMISRWYFDVTEGKCAPFFYGGCGGNRNNFDTEEYCMAVCGSAMSQSLLKTTQEPLARDPVKLPTTAASTPDAVDKYLETPGDENEHAHFQKAKERLEAKHRERMSQVMREWEEAERQAKNLPKADKKAVIQHFQEKVESLEQEAANERQQLVETHMARVEAMLNDRRRLALENYITALQAVPPRPRHVFNMLKKYVRAEQKDRQHTLKHFEHVRMVDPKKAAQIRSQVMTHLRVIYERMNQSLSLLYNVPAVAEEIQDEVDELLQKEQNYSDDVLANMISEPRISYGNDALMPSLTETKTTVELLPVNGEFSLDDLQPWHSFGADSVPANTENEVEPVDARPAADRGLTTRPGSGLTNIKTEEISEVKMDAEFRHDSGYEVHHQKLVFFAEDVGSNKGAIIGLMVGGVVIATVIVITLVMLKKKQYTSIHHGVVEVDAAVTPEERHLSKMQQNGYENPTYKFFEQMQN",
"sequenceChecksum" : "A12EE761403740F5",
"taxid" : "9606",
[TRUNCATED]
Mon Aug 08 15:26:05 UTC 2016 : Method response headers: {Content-Type=application/json}
Mon Aug 08 15:26:05 UTC 2016 : Successfully completed execution
Mon Aug 08 15:26:05 UTC 2016 : Method completed with status: 200
I've attempted to format the features field with little success. Here is my attempt:
#set($inputRoot = $input.path('$'))
#foreach($elem in $inputRoot.Items) {
"accession" : "$elem.accession.S",
"entryName" : "$elem.entryName.S",
"sequence" : "$elem.sequence.S",
"sequenceChecksum" : "$elem.sequenceChecksum.S",
"taxid" : "$elem.taxid.N",
"features" : "$elem.features.L"
}#if($foreach.hasNext),#end
#end
#foreach($elem in $inputRoot.Items.features)
{
"alternativeSequence": "$elem.alternativeSequence.S",
"begin": "$elem.begin.S",
"category": "$elem.category.S",
"description": "$elem.description.S",
"end": "$elem.end.S",
"evidences": "$elem.evidences.L",
"ftID": "$elem.ftId.S",
"type": "$elem.type.S"
}#if($foreach.hasNext),#end
#end
Is the problem the 'features' field? I think you just added quotes when they aren't necessary.
#set($inputRoot = $input.path('$'))
#foreach($elem in $inputRoot.Items) {
"accession" : "$elem.accession.S",
"entryName" : "$elem.entryName.S",
"sequence" : "$elem.sequence.S",
"sequenceChecksum" : "$elem.sequenceChecksum.S",
"taxid" : "$elem.taxid.N",
"features" : $elem.features.L
} #if($foreach.hasNext),#end
#end
The issue was that $elem is a variable, not a function, and cannot be reused inside the nested foreach loops; the inner loops need new variable names, such as $elem1. I have attached below the final working body mapping template.
#set($inputRoot = $input.path('$'))
#foreach($elem in $inputRoot.Items) {
"accession" : "$elem.accession.S",
"entryName" : "$elem.entryName.S",
"sequence" : "$elem.sequence.S",
"sequenceChecksum" : "$elem.sequenceChecksum.S",
"taxid" : $elem.taxid.N,
"features" : [
#foreach($elem1 in $elem.features.L){
"alternativeSequence": "$elem1.M.alternativeSequence.S",
"begin": "$elem1.M.begin.S",
"category": "$elem1.M.category.S",
"description": "$elem1.M.description.S",
"end": "$elem1.M.end.S",
"evidences": [
#foreach($elem2 in $elem1.M.evidences.L){
"code":"$elem2.M.code.S",
"source" : #foreach($elem3 in $elem2.M.source){
"alternativeUrl" : "$elem3.alternativeUrl.S",
"id":"$elem3.id.S",
"name":"$elem3.name.S",
"url":"$elem3.url.S",
}#if($foreach.hasNext),#end
#end
}#if($foreach.hasNext),#end
#end
],
"ftID": "$elem1.M.ftId.S",
"type": "$elem1.M.type.S"
}#if($foreach.hasNext),#end
#end
]
}#if($foreach.hasNext),#end
#end

Dynamic logstash json mapping in elasticsearch

I need to implement a REST service that will take data of any type in the request and put it into Elasticsearch through Logstash.
Spring controller receives request body:
public class CustomData {
private String component;
private Object data;
}
data is any custom JSON from the PUT request.
I am trying to use net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder.
Logstash config is as follows:
input {
    tcp {
        port => 5000
        codec => json
    }
}
output {
    elasticsearch {
        hosts => "elasticsearch:9200"
        index => "%{indexName}-%{+YYYY.MM.dd}"
    }
}
As you can see, there is an indexName parameter, which I set via MDC. But after the message goes through Logstash, Elasticsearch shows there is no mapping for my object:
{
"appId1-2016.03.15" : {
"aliases" : { },
"mappings" : {
"logs" : {
"properties" : {
"@timestamp" : {
"type" : "date",
"format" : "strict_date_optional_time||epoch_millis"
},
"@version" : {
"type" : "long"
},
"host" : {
"type" : "string"
},
"port" : {
"type" : "long"
}
}
},
"e1" : {
"properties" : {
"@timestamp" : {
"type" : "date",
"format" : "strict_date_optional_time||epoch_millis"
},
"@version" : {
"type" : "long"
},
"applicationId" : {
"type" : "string"
},
"host" : {
"type" : "string"
},
"level" : {
"type" : "string"
},
"level_value" : {
"type" : "long"
},
"logger_name" : {
"type" : "string"
},
"message" : {
"type" : "string"
},
"port" : {
"type" : "long"
},
"thread_name" : {
"type" : "string"
},
"type" : {
"type" : "string"
},
"userId" : {
"type" : "string"
}
}
}
},
"settings" : {
"index" : {
"creation_date" : "1458053563829",
"number_of_shards" : "5",
"number_of_replicas" : "1",
"uuid" : "w5y7GPd-Sk65gdpBQ_IKow",
"version" : {
"created" : "2020099"
}
}
},
"warmers" : { }
}
}
There's only the message field, with string type.
The stdout from Logstash is:
elasticsearch_1 | [2016-03-16 04:38:49,018][INFO ][cluster.metadata ] [Ripfire] [json_encoder-events-2016.03.16] create_mapping [ef]
elasticsearch_1 | [2016-03-16 04:38:49,108][INFO ][cluster.metadata ] [Ripfire] [json_encoder-events-2016.03.16] update_mapping [logs]
elasticsearch_1 | [2016-03-16 04:38:49,174][INFO ][cluster.metadata ] [Ripfire] [json_encoder-events-2016.03.16] update_mapping [ef]
logstash_1 | 2016-03-16T04:38:48.197Z 10.27.13.228 Initializing Spring FrameworkServlet 'dispatcherServlet'
logstash_1 | 2016-03-16T04:38:48.197Z 10.27.13.228 FrameworkServlet 'dispatcherServlet': initialization started
logstash_1 | 2016-03-16T04:38:48.219Z 10.27.13.228 FrameworkServlet 'dispatcherServlet': initialization completed in 22 ms
logstash_1 | 2016-03-16T04:38:48.412Z 10.27.13.228 {"field1":"value","field2":40000}
logstash_1 | 2016-03-16T04:38:48.423Z 10.27.13.228 {"field1":"value","field2":40000}
logstash_1 | 2016-03-16T04:38:48.457Z 10.27.13.228
logstash_1 | request payload={
logstash_1 | "type": "ef",
logstash_1 | "userId": "ASD",
logstash_1 | "data": {
logstash_1 | "field1": "value",
logstash_1 | "field2": 40000
logstash_1 | }
logstash_1 | }
logstash_1 |
logstash_1 | response payload=null
Is there any way to get the data field mapped according to its structure? Thank you for any suggestions.
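No answer is recorded here, but one avenue worth exploring (a sketch only, assuming the nested data object reaches Logstash as an escaped JSON string held in a field named data) is Logstash's json filter, which parses a string field back into structured fields so that Elasticsearch can derive a dynamic mapping from them instead of seeing a single string:

```
filter {
  # Parse the stringified JSON held in the "data" field back into
  # structured fields; "source" and "target" are standard options
  # of the json filter plugin.
  json {
    source => "data"
    target => "data"
  }
}
```

If the nested object is instead already serialized inline by the logback encoder, this filter is unnecessary and the mapping problem lies on the encoder side.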