Laravel 5.4 won't validate JSON

I'm using Laravel 5.4 and trying to validate JSON in my POST request, but the validator fails, stating that the JSON isn't valid even though it is. I'm assuming I'm misunderstanding the validation rules and my implementation is wrong, rather than this being a bug or something else.
I have a simple POST endpoint which has both the Accept and Content-Type headers set to application/json.
In my POST request (testing using Postman) I'm supplying raw data:
{
    "only_this_key": { "one": "two" }
}
In my controller method I have the following:
// I'm using intersect to remove any other parameters that may
// have been supplied, as this endpoint only requires one.
$requestData = $request->intersect(['only_this_key']);

$messages = [
    'only_this_key.required' => 'The :attribute is required',
    'only_this_key.json' => 'The :attribute field must be valid JSON',
];

$validator = \Validator::make($requestData, [
    'only_this_key' => 'required|json',
], $messages);

if ($validator->fails()) {
    return new APIErrorValidationResponse($request, $validator);
}

return response()->json(['all good' => 'here']);
The error I get back is The only_this_key field must be valid JSON, even though it is!
Passing in the raw data using Postman:
{
    "only-this-key": {
        "item-one": "one",
        "item-two": "two",
        "item-three": "three"
    },
    "not": "wanted"
}
When I use dd($request->all()); within the method, I get:
array:2 [
    "only-this-key" => array:3 [
        "item-one" => "one"
        "item-two" => "two"
        "item-three" => "three"
    ]
    "not" => "wanted"
]

The problem is with how Laravel is interpreting the raw data in the request. If you run dd($request->all()) in your controller you will see this output:
array:1 [
    "{"only_this_key":{"one":"two"}}" => ""
]
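(For intuition: with no JSON Content-Type, PHP parses the raw body as if it were form-encoded data, which is exactly what parse_str() does to a string containing no =. A standalone sketch, not Laravel code:)

parse_str('{"only_this_key":{"one":"two"}}', $parsed);
var_dump($parsed);
// => one key ('{"only_this_key":{"one":"two"}}') with an empty-string value,
//    matching the dd() output above.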
Your entire JSON string is getting set as a key with a value of an empty string. If you absolutely must send it as raw data, you'll have to grab that key and save it to an array under the key you actually want. This should work (in place of the intersect line):
$requestData = ['only_this_key' => key($request->all())];
Alternatively, you can just send the body as x-www-form-urlencoded with your entire JSON string as the only value for one key.
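As an aside, this is also why sending the body with a proper Content-Type: application/json header doesn't simply fix things: Laravel then decodes the body for you, so only_this_key arrives as a PHP array, and the json rule only passes for string values. A minimal standalone sketch of that behaviour (field name reused from above, not part of the original code):

$passes = \Validator::make(
    ['only_this_key' => '{"one":"two"}'],   // a JSON *string* passes the json rule
    ['only_this_key' => 'required|json']
)->passes(); // true

$passes = \Validator::make(
    ['only_this_key' => ['one' => 'two']],  // an already-decoded array fails it
    ['only_this_key' => 'required|json']
)->passes(); // false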

Related

Using Laravel, is there a way to run validation on one ajax call with data for multiple models?

Assuming one were to post multiple data sets of one model at a time through JSON, it is possible to insert these using Eloquent's Model::create() function. However, in my case I'll also need to validate this data.
As far as I've seen, the Validator only takes a Request object as input, and I can't create a new Request instance with only one model.
Assume this is the input data (JSON), where index is a value the browser uses to know which data belongs to which item (as items have no unique ID assigned at the point of creation):
[
    {
        "index" : 1,
        "name" : "Item 1",
        "value" : "Some description"
    },
    {
        "index" : 2,
        "name" : "Item 2",
        "value" : "Something to describe item 2"
    },
    (and so on)
]
Every object in the root array needs to be run through the same validator. Its rules are defined in Model::$rules (a public static array).
Would there be a way to run the validator against every item, and possibly capture the errors per item?
You can utilize Validator for manual validation:
...
use Validator;
...
$validator = Validator::make(
    json_decode($data, true), // where $data contains your JSON data string
    [
        // List your rules here using wildcard syntax.
        '*.index' => 'required|integer',
        '*.name' => 'required|min:2',
        ...
    ],
    [
        // Array of messages for validation errors.
        ...
    ],
    [
        // Array of attribute titles for validation errors.
        ...
    ]
);

if ($validator->fails()) {
    // Validation failed.
    // $validator->errors() will return MessageBag with what went wrong.
    ...
}
You can read more about validating arrays in the Laravel documentation.
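As for the "errors per item" part of the question: with wildcard rules, the resulting MessageBag keys look like 0.name, 1.index, and so on, so you can group the failures back by item index. A rough sketch (the $errorsPerItem shape is my own, not from the original answer):

$errorsPerItem = [];
foreach ($validator->errors()->toArray() as $key => $messages) {
    // Keys look like "0.name", "1.index", ... so split off the item index.
    list($index, $field) = explode('.', $key, 2);
    $errorsPerItem[$index][$field] = $messages;
}
// e.g. $errorsPerItem[1]['name'] => ["The name must be at least 2 characters."]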

Json::decode returns NULL

I have a problem with Json::decode. I'm using this code:
use Drupal\Component\Serialization\Json;
$client = \Drupal::httpClient();
$request = $client->post($rest_url, [
    'form_params' => [
        'id' => $rest_id,
    ],
]);
$response = Json::decode($request->getBody());
to get JSON from some server, but it returns NULL. Of course, this is just part of the code (without try, catch...).
The $request->getBody() return looks fine, but Json::decode still gives me NULL.
The only thing I noticed is that in Postman, when I look at the raw body content, there are some empty lines at the beginning of the JSON (as if someone had pressed Return a few times while typing), but I checked the JSON itself on JSONLint and it's valid.
Any idea what the problem is?
I'm not familiar with Drupal's JSON serializer, but try forcing the conversion of the response body to a string:
$response = Json::decode($request->getBody()->getContents());
Guzzle returns a Stream object from getBody(); that could be the issue.
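Worth noting: plain leading whitespace is actually legal before a JSON document, but an invisible UTF-8 byte order mark at the start of the body (which can look like "empty lines" in Postman) will make PHP's json_decode(), and anything built on it, return NULL. A hedged sketch that strips a BOM before decoding:

$body = (string) $request->getBody();

// Strip a UTF-8 BOM if present; json_decode() returns NULL when it sees one.
if (substr($body, 0, 3) === "\xEF\xBB\xBF") {
    $body = substr($body, 3);
}

$response = Json::decode($body);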

Parsing JSON file into logstash

Hi, I am trying to send a JSON file with multiple objects to Elasticsearch via Logstash so I can display the data using Kibana. I have researched this extensively and simply cannot work out how to get the data formatted correctly for Kibana.
I have tried different filters such as json, date, and grok.
The issue is probably how I'm using these filters, as I don't understand their setup all too well.
Here is a sample line of the input JSON file:
{"time":"2015-09-20;12:13:24","bug_code":"tr","stacktrace":"543534"},
I want to use this format for displaying the data in Kibana, sorting the many objects according to their "time" field.
The following is my current filter section:
filter {
    date {
        match => ["time", "YYYY-MM-dd;HH:mm:ss Z"]
        timezone => "America/New_York"
        locale => "en"
        target => "@timestamp"
    }
    grok {
        match => ["time", "%{TIMESTAMP_ISO8601:timestamp}"]
    }
}
At this point I know the grok is wrong, because I get "_grokparsefailure". How can I figure out the correct way to use grok? Or is there a simple way to sort the data by the timestamp in the file rather than the processing timestamp assigned when the data is sent through?
Here is what the output currently shows:
"message" => "{\"time\":\"2015-09-20;12:13:24\",\"bug_code\":\"tr\",\"stacktrace\":\"543534\"},\r",
"#version" => "1",
"#timestamp" => "2015-11-23T09:54:50:274Z",
"host" => "<my_computer>",
"path" => "<path_to_.json>",
"type" => "json",
"tags" => [
[0] "_grokparsefailure"
Any advice would be very much appreciated.
You're almost there; I got it working with a few tweaks.
First, you need to add the json{} filter in the first position. Then you need to change the date pattern to YYYY-MM-dd;HH:mm:ss (your sample line carries no timezone offset, so the trailing Z in the pattern can never match), and finally you can remove the grok filter at the end. Your filter configuration would look like this:
filter {
    json {
        source => "message"
    }
    date {
        match => ["time", "YYYY-MM-dd;HH:mm:ss"]
        timezone => "America/New_York"
        locale => "en"
        target => "@timestamp"
    }
}
The parsed event for your sample JSON line would then look like this (note that @timestamp is stored in UTC, which is why 12:13:24 America/New_York appears as 16:13:24):
{
    "message" => "{\"time\":\"2015-09-20;12:13:24\",\"bug_code\":\"tr\",\"stacktrace\":\"543534\"}",
    "@version" => "1",
    "@timestamp" => "2015-09-20T16:13:24.000Z",
    "host" => "iMac.local",
    "time" => "2015-09-20;12:13:24",
    "bug_code" => "tr",
    "stacktrace" => "543534"
}

Logstash JSON Parse - Ignore or Remove Sub-Tree

I'm sending JSON to Logstash with a config like so:
filter {
    json {
        source => "event"
        remove_field => [ "event" ]
    }
}
Here is an example JSON object I'm sending:
{
  "#timestamp": "2015-04-07T22:26:37.786Z",
  "type": "event",
  "event": {
    "activityRecord": {
      "id": 68479,
      "completeTime": 1428445597542,
      "data": {
        "2015-03-16": true,
        "2015-03-17": true,
        "2015-03-18": true,
        "2015-03-19": true
      }
    }
  }
}
Because of the arbitrary nature of the activityRecord.data object, I don't want Logstash and Elasticsearch to index all of these date fields. As it is, I see activityRecord.data.2015-03-16 as a field to filter on in Kibana.
Is there a way to ignore this sub-tree of data? Or at least delete it after it has already been parsed? I tried remove_field with wildcards and whatnot, but no luck.
Though not entirely intuitive, it is documented that subfield references are made with square brackets, e.g. [field][subfield], so that's what you'll have to use with remove_field:
mutate {
    remove_field => "[event][activityRecord][data]"
}
To delete fields using wildcard matching, you'd have to use a Ruby filter.

Is there an easy way to make GORM errors RESTful API friendly

I'm working on a simple RESTful API in Grails. I want users to be able to create an entry in one of my domain classes by hitting the entry point /rest/v1/create/event?params.
In the receiving controller, if the GORM save fails (!event.save()), I have code like this:
def result = [
    'status' : 'error',
    'data'   : event.errors.fieldErrors.toList()
]
render result as JSON
Is there a way to easily make event.errors.fieldErrors JSON friendly, i.e. something with just the failing field and its message, or will I have to write a parser method to handle this?
I ended up writing a short method to parse through the errors and build friendly ones.
If anyone finds this useful, here it is:
def gorm_errors(results) {
    results = results.fieldErrors.toList()
    def errors = []
    for (error in results) {
        errors.add([
            'type'           : 'invalid_entry',
            'field'          : error.field,
            'rejected_value' : error.rejectedValue,
            'message'        : error.defaultMessage
        ])
    }
    return errors
}
Here is a Groovier version of the above example:
def gorm_errors(errors) {
    errors.fieldErrors.toList().collect { error ->
        [
            'type'           : 'invalid_entry',
            'field'          : error.field,
            'rejected_value' : error.rejectedValue,
            'message'        : error.defaultMessage
        ]
    }
}