In SSH (Struts + Spring + Hibernate), when we want to return JSON data we can just set
<result name="success" type="json"></result>
in struts.xml.
But when my data has Timestamp values, the result comes back like "2015-08-11T09:19:25Z".
So I want to use fastjson to serialize the data instead (for some other reasons too), but I don't know how to plug fastjson into SSH so that the returned JSON comes from fastjson rather than the plugin's own JSON util.
Looking forward to your answer.
Above the getter of the Timestamp value we can add:
@JSON(format = "yyyy-MM-dd HH:mm:ss")
public Timestamp getTest() {
    return this.test;
}
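As a quick check of what that pattern produces, here is a minimal sketch using only the JDK's SimpleDateFormat (no Struts or fastjson involved; the epoch value is the timestamp from the question):

```java
import java.sql.Timestamp;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class TimestampFormatDemo {
    public static void main(String[] args) {
        // The same pattern string used in the @JSON annotation above
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        // Pin the zone so the output is deterministic
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        Timestamp ts = new Timestamp(1439284765000L); // 2015-08-11T09:19:25Z
        System.out.println(fmt.format(ts)); // prints "2015-08-11 09:19:25"
    }
}
```

If you serialize with fastjson directly, the same pattern string can be passed to its date-format-aware serialization (e.g. JSON.toJSONStringWithDateFormat) to get the same result.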
I have a copy of a product in the order details table, saved as JSON-encoded text. When I retrieve it, it comes back as a string, but I want it JSON-decoded. I don't want to decode it explicitly; I want it decoded implicitly, via something in the Model like $casts etc.
Any help would be appreciated, thanks in advance.
Let's assume your column is named order_details.
You can add this accessor (full doc here: https://laravel.com/docs/8.x/eloquent-mutators#defining-an-accessor) in your Order model.
public function getOrderDetailsAttribute($value)
{
    // "$value ?? '[]'" ensures a NULL column decodes to an empty array
    // (json_decode expects a string, so we fall back to the JSON text '[]')
    return json_decode($value ?? '[]', true);
}
Every time you call $order->order_details, it will decode the JSON and return it as an array.
I'm using Flink to process data coming from some data source (such as Kafka, Pravega, etc.).
In my case, the data source is Pravega, which provides a Flink connector.
My data source is sending me some JSON data as below:
{"key": "value"}
{"key": "value2"}
{"key": "value3"}
...
...
Here is my piece of code:
PravegaDeserializationSchema<ObjectNode> adapter = new PravegaDeserializationSchema<>(ObjectNode.class, new JavaSerializer<>());
FlinkPravegaReader<ObjectNode> source = FlinkPravegaReader.<ObjectNode>builder()
        .withPravegaConfig(pravegaConfig)
        .forStream(stream)
        .withDeserializationSchema(adapter)
        .build();
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<ObjectNode> dataStream = env.addSource(source).name("Pravega Stream");
dataStream.map(new MapFunction<ObjectNode, String>() {
    @Override
    public String map(ObjectNode node) throws Exception {
        return node.toString();
    }
})
.keyBy("word") // ERROR
.timeWindow(Time.seconds(10))
.sum("count");
As you can see, I used the FlinkPravegaReader with a proper deserializer to get the JSON stream coming from Pravega.
Then I try to transform the JSON data into Strings, key them, and count them.
However, I get an error:
The program finished with the following exception:
Field expression must be equal to '*' or '_' for non-composite types.
org.apache.flink.api.common.operators.Keys$ExpressionKeys.<init>(Keys.java:342)
org.apache.flink.streaming.api.datastream.DataStream.keyBy(DataStream.java:340)
myflink.StreamingJob.main(StreamingJob.java:114)
It seems that KeyBy threw this exception.
Well, I'm not a Flink expert, so I don't know why. I've read the source code of the official WordCount example. In that example there is a custom splitter, which is used to split the String data into words.
So I'm wondering if I need to use some kind of splitter in this case too? If so, what kind of splitter should I use? Can you show me an example? If not, why did I get such an error and how do I solve it?
I guess you have read the documentation on how to specify keys:
Specify keys
The example code uses keyBy("word") because word is a field of the POJO type WC.
// some ordinary POJO (Plain old Java Object)
public class WC {
    public String word;
    public int count;
}
DataStream<WC> words = // [...]
DataStream<WC> wordCounts = words.keyBy("word").window(/*window specification*/);
In your case, you put a map operator before keyBy, and the output of that map operator is a String. So there is obviously no word field in your case. If you actually want to group this string stream, you need to write it like this: .keyBy(String::toString)
Or you can implement a customized KeySelector to generate your own key.
Customized Key Selector
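For intuition, the KeySelector contract (one function mapping each element to its grouping key) mirrors the classifier function of Collectors.groupingBy in plain java.util.stream. A minimal stdlib sketch with no Flink dependency (the "first token is the key" rule is just a made-up example):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class KeySelectorSketch {
    // Plays the role of a Flink KeySelector: element -> key
    static String selectKey(String line) {
        return line.split(" ")[0]; // hypothetical rule: first token is the key
    }

    public static void main(String[] args) {
        List<String> lines = Arrays.asList("a 1", "b 2", "a 3");
        // Group elements by the extracted key and count per group,
        // analogous to keyBy(...) followed by a count aggregation
        Map<String, Long> counts = lines.stream()
                .collect(Collectors.groupingBy(KeySelectorSketch::selectKey,
                                               Collectors.counting()));
        System.out.println(counts); // two elements keyed "a", one keyed "b"
    }
}
```

In Flink you would pass the same kind of function to keyBy(...) so the runtime knows how to partition the stream.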
I am converting a TFDMemTable to JSON via SaveToStream(). Then I use TJSONObject::ParseJSONValue() to get the JSON object. After some parsing, I return the JSON in string format via ToString().
std::unique_ptr<TStringStream> Stream(new TStringStream());
TJSONObject *Json = new TJSONObject();
fdMemTable->SaveToStream(Stream.get(), sfJSON);
TJSONObject *JsonParsed = (TJSONObject*) Json->ParseJSONValue(Stream->DataString);
...
return JsonParsed->ToString();
All through this, the dates remain in the form 20180329T013152 instead of 2018-03-29T01:31:52. I am looking to see if there is an option I can set. TJsonOptions seems close to what I am looking for, but it appears to be used only with ObjectToJsonString().
Does anyone know any such option, or do I have to do this conversion per date/time field?
There is no date/time type in JSON. Date/time values are just arbitrary string values with formatting. So, unless TFDMemTable provides an option to specify date/time formatting for its JSON output, then you will have to handle this on a per-field basis.
BTW, you don't need to create a TJSONObject object to call ParseJSONValue():
TJSONObject *JsonParsed = (TJSONObject*) TJSONObject::ParseJSONValue(Stream->DataString);
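The per-field conversion itself is just a reformat from the ISO 8601 "basic" form to the extended form. The C++Builder version would use the VCL date/time routines, but the transformation can be sketched in a few lines; here in Java's java.time purely to illustrate the logic:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class DateReformatDemo {
    public static void main(String[] args) {
        // Parse the compact ISO 8601 "basic" form emitted in the JSON...
        DateTimeFormatter basic = DateTimeFormatter.ofPattern("yyyyMMdd'T'HHmmss");
        LocalDateTime dt = LocalDateTime.parse("20180329T013152", basic);
        // ...and re-emit it in the extended form with separators
        System.out.println(dt.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME)); // 2018-03-29T01:31:52
    }
}
```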
According to:
http://search.cpan.org/~drolsky/DateTime-1.43/lib/DateTime.pm#Formatters_And_Stringification
the following will work:
use DateTime;
$dt = DateTime->new( ... );
print $dt; # as string
and it does.
However, when using the JSON module to embed the created $dt string in a JSON-encoded data structure as follows:
use JSON;
my $json = JSON::encode_json( { dt => $dt } );
The JSON module is throwing the following exception:
encountered object '2017-08-02', but neither allow_blessed nor convert_blessed settings are enabled at ...
I can work around the issue as follows:
my $json = JSON::encode_json( { dt => substr($dt,0) } );
but wonder if there's a cleaner way to get a "true" string out of DateTime?
I've looked through the DateTime docs but I don't see an explicit stringify() to create the true string as required by JSON.
EDIT: In the constructor, I specify a desired output format via 'formatter'. I'd like to get a string using the already-specified format, and not specify the format again (which would be against DRY principle...)
EDIT2: Interpolating the string is a good solution if just $dt, but doesn't work if $dt is actually a moose attribute (i.e. "$person->birth_d").
You can use "$dt", or "" . $o->dt for an attribute (concatenation, since method calls don't interpolate inside double-quoted strings). Like your workaround, it's short for
$dt->formatter ? $dt->formatter->format_datetime($dt) : $dt->iso8601
If you want a specific format, there's $dt->strftime, $dt->iso8601, or you could use a DateTime::Format:: module.
You can explicitly interpolate the object in a string ("$dt") or call the method that's actually getting called ($dt->iso8601).
use strict;
use warnings;
use 5.010;
use DateTime;
my $dt = DateTime->now;
say $dt;
say "$dt";
say $dt->iso8601;
Output:
2017-08-02T19:54:22
2017-08-02T19:54:22
2017-08-02T19:54:22
I want to write a REST API endpoint, using Spring 4.0 and Groovy, such that the @RequestBody parameter can be any generic JSON input, and it will be mapped to a Groovy JsonSlurper so that I can simply access the data via the slurper.
The benefit here being that I can send various JSON documents to my endpoint without having to define a DTO object for every format that I might send.
Currently my method looks like this (and works):
@RequestMapping(value = "/{id}", method = RequestMethod.PUT, consumes = MediaType.APPLICATION_JSON_VALUE)
ResponseEntity<String> putTest(@RequestBody ExampleDTO dto) {
    def json = new groovy.json.JsonBuilder()
    json(
        id: dto.id,
        name: dto.name
    );
    return new ResponseEntity(json.content, HttpStatus.OK);
}
But what I want, is to get rid of the "ExampleDTO" object, and just have any JSON that is passed in get mapped straight into a JsonSlurper, or something that I can input into a JsonSlurper, so that I can access the fields of the input object like so:
def json = new JsonSlurper().parseText(input);
String exampleName = json.name;
I initially thought I could just accept a String instead of ExampleDTO and then slurp the String, but I have been running into a plethora of issues in my AngularJS client when trying to send my JSON objects as strings to the API endpoint. I'm met with an annoying need to escape all of the double quotes and surround the entire JSON string with double quotes, and then I run into issues if any of my data has quotes or various special characters in it. It just doesn't seem like a clean or reliable solution.
I'm open to anything that will cleanly translate my AngularJS JSON objects into valid Strings, or anything I can do in the REST method that will allow JSON input without mapping it to a specific object.
Thanks in advance!
Tonya