Accessing JSON objects in Perl and reusing them in another JSON

I have received arguments in my Mason handler in the following format:
$data = {
    'cacheParams' => 0,
    'requests' => {
        'locationId' => 1,
        'uniqueId' => [
            'ABC',
            'DEF',
            'XYZ'
        ]
    }
};
I am able to access the requests by using $data->{'requests'}. How do I access the values stored in requests, i.e. locationId and uniqueId? I need to use these values to form another JSON in the following way:
my $input = {
    stateID    => 44,
    locationId => requests.locationId,
    uniqueId   => requests.uniqueId
    .
    .
    .
}

$data->{'requests'} is a hash reference, so you can access its keys like this:
$data->{'requests'}{'locationId'}
$data->{'requests'}{'uniqueId'}
or
my $requests   = $data->{'requests'};
my $locationId = $requests->{'locationId'};
my $uniqueId   = $requests->{'uniqueId'};
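Putting it together, here is a minimal sketch of building the new structure from the received data (stateID => 44 is taken from the question; using the JSON module's encode_json for the final serialization is an assumption):

use strict;
use warnings;
use JSON;

# $data is the hash reference received by the Mason handler
my $requests = $data->{'requests'};

my $input = {
    stateID    => 44,                        # literal value from the question
    locationId => $requests->{'locationId'},
    uniqueId   => $requests->{'uniqueId'},   # stays an array reference
};

my $json_text = encode_json($input);         # serialize when the new JSON string is needed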


Getting an error while using groupBy and pagination in Eloquent

I'm trying to use Eloquent to get a grouped response and, at the same time, a paginated response (the one that gives me the link to the next page).
I'm trying to do this:
App\Eating::where('student_id', 2)->orderBy('created_at', 'DESC')->groupBy(function ($row) {
    return Carbon\Carbon::parse($row->created_at)->format('Y-m-d');
})->paginate(25);
But I'm getting this error when running it in Tinker:
PHP warning: strtolower() expects parameter 1 to be string, object given in D:\Folder\vendor\laravel\framework\src\Illuminate\Database\Grammar.php on line 58
Without the groupBy, I get the correct result:
>>> App\Eating::Where('student_id', 2)->orderBy('created_at', 'DESC')->paginate(25)->toArray();
=> [
     "total" => 1,
     "per_page" => 25,
     "current_page" => 1,
     "last_page" => 1,
     "next_page_url" => null,
     "prev_page_url" => null,
     "from" => 1,
     "to" => 3,
     "data" => [
       [
         "id" => 5,
         "status" => "Comeu Bem",
         "created_at" => "2017-07-05 13:55:25",
         "updated_at" => "2017-07-05 13:55:25",
       ],
     ],
   ]
BUT when I remove the pagination, I still get the error, though only because I added the get():
>>> App\Eating::Where('student_id', 2)->orderBy('created_at', 'DESC')->groupBy(function ($row) {
... return Carbon\Carbon::parse($row->created_at)->format('Y-m-d');
... })->get();
PHP warning: strtolower() expects parameter 1 to be string, object given in D:\Joao\git\F1Softwares\Code\Server\F1Softwares\vendor\laravel\framework\src\Illuminate\Database\Grammar.php on line 58
>>>
>>>
>>> App\Eating::Where('student_id', 2)->orderBy('created_at', 'DESC')->groupBy(function ($row) {
... return Carbon\Carbon::parse($row->created_at)->format('Y-m-d');
... });
=> Illuminate\Database\Eloquent\Builder {#855}
Any idea what I could be doing wrong? I do need to have the orderBy AND the pagination, to make it easier for the app to show the results (it is a RESTful call).
Thanks,
João
You must call the groupBy() method with a closure on a collection, not on the query builder; the query builder's groupBy() expects column names, which is why passing a closure triggers the strtolower() warning. But the collection approach won't work with paginate(). You could try using the forPage() method on the collection instead:
App\Eating::where('student_id', 2)->orderBy('created_at', 'DESC')
    ->get()
    ->groupBy(function ($eating) {
        return $eating->created_at->format('Y-m-d');
    })
    ->forPage(1, 25);
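Note that forPage() here pages over the grouped results (one entry per date), not over the individual rows, so 25 means 25 date groups per page.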
Also, just a note: you don't need to use Carbon to parse the date; Eloquent does this for you, so created_at is already a Carbon instance.
Alternatively, you could manually create your paginator once you have the grouped collection, using Illuminate\Pagination\LengthAwarePaginator:
$eatings = App\Eating::where('student_id', 2)->orderBy('created_at', 'DESC')
    ->get()
    ->groupBy(function ($eating) {
        return $eating->created_at->format('Y-m-d');
    });

$paginatedEatings = new LengthAwarePaginator($eatings, $eatings->count(), 25);

return $paginatedEatings->toArray();
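One caveat worth checking against the docs for your Laravel version: LengthAwarePaginator does not slice the items for you, and its constructor also accepts the current page as a fourth argument, so for pages beyond the first you would slice the grouped collection yourself (e.g. with forPage()) before handing it to the paginator.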

Tried to bind parameter number 0. SQL Server supports a maximum of 2100 parameters

I'm currently using a PDO class that works perfectly on MySQL. But when it comes to MSSQL, I get an error when I try to insert data via the bindValue() function.
I'm using this method for data binding:
bindValue(":param", $value)
Step 1 - Create an array for the table fields in the query
$counter = 0;
foreach ($fields as $cols) {
    $fieldBind[$counter] = ":" . $cols;
    $new_f = $new_f . $cols;
    $counter++;
    if ($counter != count($fields)) {
        $new_f = $new_f . ",";
    }
}
Output:
(
    [0] => :field1
    [1] => :field2
    [2] => :field3
)
Step 2 - Create an array for the data of the fields in the query
$counter2 = 0;
foreach ($data as $cols) {
    $dataBind[$counter2] = $cols;
    $new_d = $new_d . "'" . $cols . "'";
    $counter2++;
    if ($counter2 != count($data)) {
        $new_d = $new_d . ",";
    }
}
Output:
(
    [0] => value1
    [1] => value2
    [2] => value3
)
Step 3 - Prepare the query via the query function
parent::query("INSERT INTO $table($new_f) VALUES($new_d)");
Step 4 - Bind the Parameters and Values
for ($i = 0; $i < count($data); $i++) {
    parent::bind($fieldBind[$i], $dataBind[$i]);
}
The query looks like this:
INSERT INTO table(field1,field2,field3) values(':value1',':value2',':value3')
Step 5 - Execute the Query
try {
    parent::execute();
    return parent::rowCount();
}
catch (PDOException $e) {
    echo $e->getMessage();
}
This method works perfectly on MySQL, but when I try to execute this on SQL Server, I get this error:
SQLSTATE[IMSSP]: Tried to bind parameter number 0. SQL Server supports a maximum of 2100 parameters.
Try removing the apostrophes around the placeholders, changing:
INSERT INTO table(field1,field2,field3) VALUES(':value1',':value2',':value3')
to the following:
INSERT INTO table(field1,field2,field3) VALUES(:value1,:value2,:value3)
With the quotes, ':value1' is a string literal rather than a parameter marker, so the prepared statement contains no parameters at all, and the SQL Server driver rejects the first bindValue() call (hence "parameter number 0").
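The same principle, sketched in Perl with DBI (an analogue for illustration only, not the asker's PDO wrapper; connection details, table and values are placeholders):

use strict;
use warnings;
use DBI;

my ($dsn, $user, $pass) = ('dbi:ODBC:mydb', 'user', 'pass');   # placeholders
my $dbh = DBI->connect($dsn, $user, $pass, { RaiseError => 1 });

# Wrong: quoted markers are just string literals, so nothing is bindable:
#   INSERT INTO t (a, b) VALUES (':a', ':b')

# Right: bare placeholders, with values supplied at execute time:
my $sth = $dbh->prepare("INSERT INTO t (a, b) VALUES (?, ?)");
$sth->execute('value1', 'value2');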

Lazy-loading a SQL row upon construction of a Perl Moose object? [duplicate]

Using Moose, is it possible to create a builder that builds multiple attributes at once?
I have a project in which the object has several 'sets' of fields - if any member of the set is requested, I want to go ahead and populate them all. My assumption is that if I need the name, I'll also need the birthdate, and since they're in the same table, it's faster to get both in one query.
I'm not sure if my question is clear enough, but hopefully some sample code will make it clear.
What I have:
package WidgetPerson;
use Moose;

has id        => (is => 'ro', isa => 'Int');
has name      => (is => 'ro', lazy => 1, builder => '_build_name');
has birthdate => (is => 'ro', lazy => 1, builder => '_build_birthdate');
has address   => (is => 'ro', lazy => 1, builder => '_build_address');

sub _build_name {
    my $self = shift;
    my ($name) = $dbh->selectrow_array("SELECT name FROM people WHERE id = ?", {}, $self->id);
    return $name;
}

sub _build_birthdate {
    my $self = shift;
    my ($date) = $dbh->selectrow_array("SELECT birthdate FROM people WHERE id = ?", {}, $self->id);
    return $date;
}

sub _build_address {
    my $self = shift;
    my ($address) = $dbh->selectrow_array("SELECT address FROM addresses WHERE person_id = ?", {}, $self->id);
    return $address;
}
But what I want is:
has name      => (is => 'ro', isa => 'Str',     lazy => 1, builder => '_build_stuff');
has birthdate => (is => 'ro', isa => 'Date',    lazy => 1, builder => '_build_stuff');
has address   => (is => 'ro', isa => 'Address', lazy => 1, builder => '_build_address');

sub _build_stuff {
    my $self = shift;
    my ($name, $date) = $dbh->selectrow_array("SELECT name, birthdate FROM people WHERE id = ?", {}, $self->id);
    $self->name($name);
    $self->birthdate($date);
}

sub _build_address {
    # same as before
}
What I do in this case, when I don't want to have a separate object as in Ether's answer, is have a lazily built attribute for the intermediate state. So, for example:
has raw_row   => (is => 'ro', init_arg => undef, lazy => 1, builder => '_build_raw_row');
has birthdate => (is => 'ro', lazy => 1, builder => '_build_birthdate');

sub _build_raw_row {
    $dbh->selectrow_hashref(...);
}

sub _build_birthdate {
    my $self = shift;
    return $self->raw_row->{birthdate};
}
Repeat the same pattern as birthdate for name, etc.
Reading any of the individual attributes will get the data from raw_row, whose lazy builder will only run the SQL once. Since your attributes are all read-only, you don't have to worry about updating any object state if one of them changes.
This pattern is useful for things like XML documents, too -- the intermediate state you save can be e.g. a DOM, with individual attributes being lazily built from XPath expressions or what-have-you.
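For instance, a minimal sketch of that XML variant (XML::LibXML and the xml_source attribute are my own assumptions, not from the original answer):

use XML::LibXML;

has raw_dom => (is => 'ro', init_arg => undef, lazy => 1, builder => '_build_raw_dom');

sub _build_raw_dom {
    my $self = shift;
    # parse once, lazily; xml_source is an assumed attribute holding the document text
    return XML::LibXML->load_xml(string => $self->xml_source);
}

sub _build_birthdate {
    my $self = shift;
    return $self->raw_dom->findvalue('/person/birthdate');
}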
No, an attribute builder can only return one value at a time. You could build both by having each builder set the value of the other attribute before returning, but that gets ugly pretty quickly...
However, if you generally have two pieces of data that go together in some way (e.g. coming from the same DB query as in your case), you can store these values together in one attribute as an object:
has birth_info => (
    is => 'ro', isa => 'MyApp::Data::BirthInfo',
    lazy => 1,
    default => sub {
        MyApp::Data::BirthInfo->new(shift->some_id)
    },
    handles => [ qw(birthdate name) ],
);

package MyApp::Data::BirthInfo;
use Moose;

has some_id => (
    is => 'ro', isa => 'Int',
    trigger => sub {
        # perhaps this object self-populates from the DB when you assign its id?
        # or use some other mechanism to load the row in an ORMish way (perhaps BUILD)
    },
);

has birthdate => (
    is => 'ro', isa => 'Str',
);

has name => (
    is => 'ro', isa => 'Str',
);

How to parse this JSON object/string?

I am trying to parse the JSON at http://a0.awsstatic.com/pricing/1/ec2/sles-od.min.js
Here is a quick snippet from the above link:
{vers:0.01,config:{rate:"perhr",valueColumns:["vCPU","ECU","memoryGiB","storageGB","sles"],currencies:["USD"],regions:[{region:"us-east",instanceTypes:[{type:"generalCurrentGen",sizes:[{size:"t2.micro",vCPU:"1",ECU:"variable",
...
Please visit the aforementioned link to see the complete JSON.
As seen above, none of the keys in this JSON have double quotes around them.
This makes it a malformed JSON string, and my JSON parser fails on it. I also tried putting this JSON into http://www.jsoneditoronline.org/ and it fails as well.
Now, this is the same link Amazon uses to display the various prices of their EC2 instances, so I think I am missing something here. My Googling led me to believe that the above is not JSON but rather JSONP, which I don't understand.
Could you help me understand how to parse this? BTW, I am doing this work in Perl using the JSON module.
Some background:
Amazon Web Services does not have an API to get pricing info programmatically, hence I am parsing these links, which is what Amazon does when displaying pricing information. Besides, I am not from a programming background and Perl is all I know.
Like you said, JSONP or "JSON with padding" can't be parsed by a JSON parser because it is not JSON; it is really just JSON with a prefix (the padding). JSONP exists so browsers can load data cross-domain via a script tag: the server wraps the JSON in a call to a callback function.
The padding is typically the name of that callback function.
In this case the default callback is named 'callback', and we can take the slightly hacky route of using a regular expression to capture the JSON wrapped by 'callback()', like this:
s/callback\((.*)\);$/$1/s;
Also, if you would like to use the JSON library anyway, you can enable allow_barekey, which means you don't need quotes around the keys.
Below is my working code. I use LWP::Simple to fetch the content of the given URL and Data::Dump to print the resulting data structure.
use strict;
use warnings;
use LWP::Simple;
use JSON;

my $jsonp = get("http://a0.awsstatic.com/pricing/1/ec2/sles-od.min.js")
    or die "Couldn't get url";

( my $json = $jsonp ) =~ s/callback\((.*)\);$/$1/s;   # grab the JSON from $jsonp and store it in $json

my $hash = JSON->new->allow_barekey->decode($json);

use Data::Dump;
dd $hash;
Outputs:
{
  config => {
    currencies => ["USD"],
    rate => "perhr",
    regions => [
      {
        instanceTypes => [
          {
            sizes => [
              {
                ECU => "variable",
                memoryGiB => 1,
                size => "t2.micro",
                storageGB => "ebsonly",
                valueColumns => [{ name => "os", prices => { USD => 0.023 } }],
                vCPU => 1,
              },
              {
                ECU => "variable",
                memoryGiB => 2,
                size => "t2.small",
                storageGB => "ebsonly",
                valueColumns => [{ name => "os", prices => { USD => 0.056 } }],
                vCPU => 1,
              },
              {
                ECU => "variable",
                memoryGiB => 4,
                size => "t2.medium",
                storageGB => "ebsonly",
                valueColumns => [{ name => "os", prices => { USD => 0.152 } }],
                vCPU => 2,
              },
              {
                ECU => 3,
                memoryGiB => 3.75,
                size => "m3.medium",
                storageGB => "1 x 4 SSD",
                valueColumns => [{ name => "os", prices => { USD => "0.170" } }],
                vCPU => 1,
              },
              ....
As said in the comments above, it is not JSON, so it can't be parsed by a JSON parser... But for quick & (very) dirty work, you can try the JSON::DWIW module.
The following code:
use 5.014;
use warnings;
use WWW::Mechanize;
use Data::Dump;
use JSON::DWIW;

my $mech = WWW::Mechanize->new();
my $jsonstr = $mech->get('http://a0.awsstatic.com/pricing/1/ec2/sles-od.min.js')->content;
($jsonstr) = $jsonstr =~ /callback\((.*)\)/s;

my $json_obj = JSON::DWIW->new;
my $data = $json_obj->from_json($jsonstr);
dd $data;
prints a structure that may be what you want, e.g.:
{
  config => {
    currencies => ["USD"],
    rate => "perhr",
    regions => [
      {
        instanceTypes => [
          {
            sizes => [
              {
                ECU => "variable",
                memoryGiB => 1,
                size => "t2.micro",
                storageGB => "ebsonly",
                valueColumns => [{ name => "os", prices => { USD => 0.023 } }],
                vCPU => 1,
              },
              {
              ...

Logstash indexing JSON arrays

Logstash is awesome. I can send it JSON like this (multi-lined for readability):
{
  "a": "one",
  "b": {
    "alpha": "awesome"
  }
}
And then query for that line in Kibana using the search term b.alpha:awesome. Nice.
However I now have a JSON log line like this:
{
  "different": [
    {
      "this": "one",
      "that": "uno"
    },
    {
      "this": "two"
    }
  ]
}
And I'd like to be able to find this line with a search like different.this:two (or different.this:one, or different.that:uno)
If I was using Lucene directly I'd iterate through the different array, and generate a new search index for each hash within it, but Logstash currently seems to ingest that line like this:
different: {this: one, that: uno}, {this: two}
Which isn't going to help me searching for log lines using different.this or different.that.
Anyone got any thoughts as to a codec, filter or code change I can make to enable this?
You can write your own filter (copy & paste, rename the class name and the config_name, and rewrite the filter(event) method) or modify the current JSON filter (source on GitHub).
You can find the JSON filter (Ruby class) source code under logstash-1.x.x\lib\logstash\filters, named json.rb. The JSON filter parses the content as JSON as follows:
begin
  # TODO(sissel): Note, this will not successfully handle json lists
  # like your text is '[ 1,2,3 ]' JSON.parse gives you an array (correctly)
  # which won't merge into a hash. If someone needs this, we can fix it
  # later.
  dest.merge!(JSON.parse(source))

  # If no target, we target the root of the event object. This can allow
  # you to overwrite @timestamp. If so, let's parse it as a timestamp!
  if !@target && event[TIMESTAMP].is_a?(String)
    # This is a hack to help folks who are mucking with @timestamp during
    # their json filter. You aren't supposed to do anything with
    # "@timestamp" outside of the date filter, but nobody listens... ;)
    event[TIMESTAMP] = Time.parse(event[TIMESTAMP]).utc
  end

  filter_matched(event)
rescue => e
  event.tag("_jsonparsefailure")
  @logger.warn("Trouble parsing json", :source => @source,
               :raw => event[@source], :exception => e)
  return
end
You can modify the parsing procedure to modify the original JSON:

json = JSON.parse(source)

if json.is_a?(Hash)
  json.each do |key, value|
    if value.is_a?(Array)
      value.each_with_index do |object, index|
        # modify as you need
        object["index"] = index
      end
    end
  end
end

# save the modified json
......
dest.merge!(json)
Then you can modify your config file to use your new/modified JSON filter, placing it in \logstash-1.x.x\lib\logstash\config.
This is my elastic_with_json.conf with a modified json.rb filter:
input {
  stdin {
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    host => localhost
  }
  stdout {
  }
}
If you want to use your new filter, you can configure it with the config_name:
class LogStash::Filters::Json_index < LogStash::Filters::Base
  config_name "json_index"
  milestone 2
  ....
end
and configure it:

input {
  stdin {
  }
}
filter {
  json_index {
    source => "message"
  }
}
output {
  elasticsearch {
    host => localhost
  }
  stdout {
  }
}
Hope this helps.
For a quick and dirty hack, I used the Ruby filter and the code below; no need to use the out-of-the-box 'json' filter anymore.
input {
  stdin {}
}
filter {
  grok {
    match => ["message", "(?<json_raw>.*)"]
  }
  ruby {
    init => "
      def parse_json obj, pname=nil, event
        obj = JSON.parse(obj) unless obj.is_a? Hash
        obj = obj.to_hash unless obj.is_a? Hash
        obj.each {|k, v|
          p = pname.nil? ? k : pname
          if v.is_a? Array
            v.each_with_index {|oo, ii|
              parse_json_array(oo, ii, p, event)
            }
          elsif v.is_a? Hash
            parse_json(v, p, event)
          else
            p = pname.nil? ? k : [pname, k].join('.')
            event[p] = v
          end
        }
      end

      def parse_json_array obj, i, pname, event
        obj = JSON.parse(obj) unless obj.is_a? Hash
        pname_ = pname
        if obj.is_a? Hash
          obj.each {|k, v|
            p = [pname_, i, k].join('.')
            if v.is_a? Array
              v.each_with_index {|oo, ii|
                parse_json_array(oo, ii, p, event)
              }
            elsif v.is_a? Hash
              parse_json(v, p, event)
            else
              event[p] = v
            end
          }
        else
          n = [pname_, i].join('.')
          event[n] = obj
        end
      end
    "
    code => "parse_json(event['json_raw'].to_s, nil, event) if event['json_raw'].to_s.include? ':'"
  }
}
output {
  stdout { codec => rubydebug }
}
Test JSON structure:
{"id":123, "members":[{"i":1, "arr":[{"ii":11},{"ii":22}]},{"i":2}], "im_json":{"id":234, "members":[{"i":3},{"i":4}]}}
and this is what's output:
{
  "message" => "{\"id\":123, \"members\":[{\"i\":1, \"arr\":[{\"ii\":11},{\"ii\":22}]},{\"i\":2}], \"im_json\":{\"id\":234, \"members\":[{\"i\":3},{\"i\":4}]}}",
  "@version" => "1",
  "@timestamp" => "2014-07-25T00:06:00.814Z",
  "host" => "Leis-MacBook-Pro.local",
  "json_raw" => "{\"id\":123, \"members\":[{\"i\":1, \"arr\":[{\"ii\":11},{\"ii\":22}]},{\"i\":2}], \"im_json\":{\"id\":234, \"members\":[{\"i\":3},{\"i\":4}]}}",
  "id" => 123,
  "members.0.i" => 1,
  "members.0.arr.0.ii" => 11,
  "members.0.arr.1.ii" => 22,
  "members.1.i" => 2,
  "im_json" => 234,
  "im_json.0.i" => 3,
  "im_json.1.i" => 4
}
The solution I liked is the Ruby filter, because it doesn't require us to write another filter. However, that solution creates fields at the "root" of the JSON, and it's hard to keep track of how the original document looked.
I came up with something similar that's easier to follow and is a recursive solution, so it's cleaner.
ruby {
  init => "
    def arrays_to_hash(h)
      h.each do |k, v|
        # If v is nil, an array is being iterated and the value is k.
        # If v is not nil, a hash is being iterated and the value is v.
        value = v || k
        if value.is_a?(Array)
          # 'value' is replaced with 'value_hash' later.
          value_hash = {}
          value.each_with_index do |v, i|
            value_hash[i.to_s] = v
          end
          h[k] = value_hash
        end
        if value.is_a?(Hash) || value.is_a?(Array)
          arrays_to_hash(value)
        end
      end
    end
  "
  code => "arrays_to_hash(event.to_hash)"
}
It converts arrays to hashes, with each key being the index number as a string. More details: http://blog.abhijeetr.com/2016/11/logstashelasticsearch-best-way-to.html
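As a quick illustration (my own example, inferred from the code above, not from the original answer), the filter turns an event field like

{"members": [{"i": 1}, {"i": 2}]}

into

{"members": {"0": {"i": 1}, "1": {"i": 2}}}

so each array element is indexed under a stable key instead of being flattened.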