Datetime format is not the same as in the database in Yii2

I want the datetime format shown in my view in a Yii2 project to be different from the datetime format in my database. I use this code:
return Yii::$app->formatter->asDatetime($model->tanggal_sampai, "php:d M Y H:i");
The datetime in the database is: 2016-06-14 16:53:40
But the result of the code above in Yii2 is different: 14 Jun 2016 18:53
The date is fine, but the time is two hours off. What's the problem? I use WIB time because I'm in Indonesia.

In your main config file, put:
'components' => [
    'formatter' => [
        'class' => 'yii\i18n\Formatter',
        'dateFormat' => 'php:m/d/Y',
        'datetimeFormat' => 'php:Y-m-d H:i:s',
        'timeFormat' => 'php:H:i:s',
    ],
],
And you will get:
echo Yii::$app->formatter->asDatetime('2016-06-14 16:53:40');
2016-06-14 18:53:40
It takes the time as UTC and adds +2 hours because of my timezone. If you call date_default_timezone_set('UTC') in your index.php, or set Yii::$app->timeZone = 'UTC', it won't transform the time you've saved in your database:
2016-06-14 16:53:40
INFO: Timezones for Asia http://php.net/manual/en/timezones.asia.php
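If your database values are actually stored in local WIB time (UTC+7) rather than UTC, a minimal sketch of the relevant formatter settings would look like this (Asia/Jakarta is the standard PHP zone identifier for WIB; treat this as an illustration of the approach, not your exact config):
'components' => [
    'formatter' => [
        'class' => 'yii\i18n\Formatter',
        'datetimeFormat' => 'php:d M Y H:i',
        // time zone the stored values are assumed to be in (WIB, UTC+7)
        'defaultTimeZone' => 'Asia/Jakarta',
        // time zone used when displaying values
        'timeZone' => 'Asia/Jakarta',
    ],
],
With both time zones set to the same value, asDatetime() leaves 2016-06-14 16:53:40 unshifted and only changes the display format.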

Related

How to migrate MySQL data to Elasticsearch using Logstash

I need a brief explanation of how I can convert MySQL data to Elasticsearch using Logstash.
Can anyone explain the step-by-step process?
This is a broad question, and I don't know how familiar you are with MySQL and ES. Let's say you have a table user. You may simply dump it as CSV and load it into your ES, and that will be good. But if you have dynamic data, where MySQL acts like a pipeline, you need to write a script to handle it. Either way, check the links below to build your basic knowledge before you ask how:
How to dump MySQL?
How to load data into ES
Also, you will probably want to know how to convert your CSV to a JSON file, which is the format ES understands best:
How to convert CSV to JSON
You can do it using the jdbc input plugin for Logstash.
Here is a config example.
Let me provide you with a high-level instruction set.
Install Logstash and Elasticsearch.
Copy the ojdbc7.jar driver into the Logstash bin folder.
For Logstash, create a config file, e.g. config.yml:
input {
  # Get the data from the database; configure fields to fetch data incrementally
  jdbc {
    jdbc_driver_library => "./ojdbc7.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@db:1521:instance"
    jdbc_user => "user"
    jdbc_password => "pwd"
    id => "some_id"

    jdbc_validate_connection => true
    jdbc_validation_timeout => 1800
    connection_retry_attempts => 10
    connection_retry_attempts_wait_time => 10

    # fetch the db logs using logid
    statement => "select * from customer.table where logid > :sql_last_value order by logid asc"

    # limit how many results are pre-fetched at a time from the cursor into the client's cache
    # before retrieving more results from the result-set
    jdbc_fetch_size => 500
    jdbc_default_timezone => "America/New_York"

    use_column_value => true
    tracking_column => "logid"
    tracking_column_type => "numeric"
    record_last_run => true

    schedule => "*/2 * * * *"
    type => "log.customer.table"
    add_field => { "source" => "customer.table" }
    add_field => { "tags" => "customer.table" }
    add_field => { "logLevel" => "ERROR" }
    last_run_metadata_path => "last_run_metadata_path_table.txt"
  }
}

# Massage the data to store in the index
filter {
  if [type] == 'log.customer.table' {
    # assign values from db columns to custom fields of the index
    ruby {
      code => "event.set( 'errorid', event.get('ssoerrorid') );
               event.set( 'msg', event.get('errormessage') );
               event.set( 'logTimeStamp', event.get('date_created'));
               event.set( '@timestamp', event.get('date_created'));
              "
    }
    # remove the db columns that were mapped to custom fields of the index
    mutate {
      remove_field => ["ssoerrorid", "errormessage", "date_created"]
    }
  } # end of [type] == 'log.customer.table'
} # end of filter

# Insert into the index
output {
  if [type] == 'log.customer.table' {
    amazon_es {
      hosts => ["vpc-xxx-es-yyyyyyyyyyyy.us-east-1.es.amazonaws.com"]
      region => "us-east-1"
      aws_access_key_id => '<access key>'
      aws_secret_access_key => '<secret password>'
      index => "production-logs-table-%{+YYYY.MM.dd}"
    }
  }
}
Go to the bin folder and run:
logstash -f config.yml

Notify Logstash when new data is entered in a MySQL database without using the schedule parameter

I am working on the Elastic Stack with MySQL. Everything works fine: Logstash takes data from the MySQL database and sends it to Elasticsearch, and to update Elasticsearch automatically when new entries are added to MySQL I use the schedule parameter. But in this case Logstash keeps checking for new data continuously, and that is my main concern.
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb"
    # The user we wish to execute our statement as
    jdbc_user => "root"
    jdbc_password => ""
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "/home/Downloads/mysql-connector-java-5.1.38.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # run the query every 15 minutes
    schedule => "*/15 * * * *"
    use_column_value => true
    tracking_column => 'EVENT_TIME_OCCURRENCE_FIELD'
    # our query
    statement => "SELECT * FROM brainplay WHERE EVENT_TIME_OCCURRENCE_FIELD > :sql_last_value"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    "hosts" => "localhost:9200"
    "index" => "test-migrate"
    "document_type" => "data"
    "document_id" => "%{personid}"
  }
}
But if the data is large, Logstash will scan the entire data set for new entries without any stopping point, which reduces scalability and consumes more resources.
Is there any other method, or any webhook, so that when new data is entered into the database, MySQL notifies Logstash of only the new data, or Logstash checks only for new entries? Please help.
You can either use the sql_last_start parameter in your query with any timestamp field (assuming there is a timestamp field such as last_updated).
For example, your query could contain:
WHERE last_updated >= :sql_last_start
From this answer:
For example, the first time you run this, sql_last_start will be 1970-01-01 00:00:00 and you'll get all rows. The second run, sql_last_start will be (for example) 2015-12-03 10:55:00 and the query will return all rows with a timestamp newer than that.
Or you can read this answer on using :sql_last_value:
WHERE last_updated > :sql_last_value
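For reference, here is a minimal sketch of the question's own input block extended with column tracking. The metadata path is a made-up example, and the tracking column is written in lowercase because the jdbc input lowercases column names by default:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb"
    jdbc_user => "root"
    jdbc_password => ""
    jdbc_driver_library => "/home/Downloads/mysql-connector-java-5.1.38.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    schedule => "*/15 * * * *"
    # persist the highest value seen so far and reuse it as :sql_last_value
    use_column_value => true
    tracking_column => "event_time_occurrence_field"
    tracking_column_type => "timestamp"
    record_last_run => true
    last_run_metadata_path => "/tmp/brainplay_last_run"
    statement => "SELECT * FROM brainplay WHERE EVENT_TIME_OCCURRENCE_FIELD > :sql_last_value"
  }
}
The schedule still controls how often Logstash polls, but each poll only pulls rows newer than the persisted :sql_last_value instead of rescanning the whole table.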

Yii2 formatter language

This is my configuration:
'formatter' => [
    'class' => 'yii\i18n\Formatter',
    'dateFormat' => 'd MMMM Y',
    'locale' => 'ru-RU'
],
When I am trying this:
echo Yii::$app->formatter->asDate('2014-01-01');
I get:
01 Jan 2014
But I want the same in Russian.
Solved:
Open the file php.ini via OpenServer and uncomment ;extension=php_intl.dll.
The config you specified should output 1 января 2014.
You are most likely missing the php-intl extension.
With the formatter set to 'dateFormat' => 'php:d M Y' you can only get 01 Jan 2014, because PHP's date format character M stands for:
"A short textual representation of a month, three letters: Jan through Dec"
If you want the date to be formatted by intl for your language, you need to configure the formatter with an ICU pattern:
'dateFormat' => 'd MMM Y'
as described in the ICU User Guide.
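A quick way to confirm the missing extension is a one-line check (a small sketch, not from the original answer) before touching the formatter config:
// prints bool(false) if the intl extension is missing; enable it in php.ini
// (e.g. uncomment extension=php_intl.dll on Windows) and restart the web server
var_dump(extension_loaded('intl'));

// once intl is loaded, the ICU pattern and the ru-RU locale from the config apply
echo Yii::$app->formatter->asDate('2014-01-01');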

ActiveRecord 4.2 to 5.0.1 datetime precision behavior change

I am trying to upgrade my project to ActiveRecord 5.0.1 with MySQL 5.6.33 and the mysql2 gem 0.4.5. I have a number of spec tests that involve creating a record and then searching for records whose created_at value is <= Time.now. I have summarized the failure with the following example of a behavioral change from ActiveRecord 4.2 to ActiveRecord 5.0.1:
ActiveRecord 4
irb(main):021:0> puts Time.at(1482722443.8581448)
2016-12-26 03:20:43 +0000
=> nil
irb(main):022:0> p.updated_at = Time.at(1482722443.8581448)
=> 2016-12-26 03:20:43 +0000
irb(main):023:0> p.save
=> true
irb(main):024:0> p.updated_at
=> 2016-12-26 03:20:43 UTC
irb(main):025:0> p.reload
=> #<Profile id: 1, ...
irb(main):026:0> p.updated_at
=> 2016-12-26 03:20:43 UTC
ActiveRecord 5.0.1
> puts Time.at(1482722443.8581448)
2016-12-26 03:20:43 +0000
=> nil
> p.updated_at = Time.at(1482722443.8581448)
=> 2016-12-26 03:20:43 +0000
> p.save
=> true
> p.updated_at
=> 2016-12-26 03:20:43 UTC
> p.reload
=> #<Profile:0x0055e04486dc40
id: 1,
...
> p.updated_at
=> 2016-12-26 03:20:44 UTC
As you can see, in the first example in AR4, the datetime returned by the database is at the 43-second mark, and it is therefore doing a floor operation on the supplied timestamp of 1482722443.8581448.
In AR5, it is doing a round operation, and moving the created_at time to be the next whole second, thus making it go from second 43 to second 44.
This is causing records to be created "in the future": my pattern of creating a record and then searching for records whose created_at is <= Time.now returns no records, because the record has literally been created in the future by a sub-second margin.
Is this behavior expected, or is this a bug? Can I configure this millisecond rounding behavior in AR5?
UPDATE
It looks like in MySQL, if I do:
update alerts set created_at = '2016-12-26 04:08:19.7777' limit 1;
And then:
select created_at from alerts;
I get
2016-12-26 04:08:20
Thus MySQL is doing the "rounding up". Is it possible that, from 4.2 to 5.0.1, ActiveRecord started writing queries with datetimes as
2016-12-26 04:08:19.7777
Instead of
2016-12-26 04:08:19
with MySQL 5.6.33?
Yes, ActiveRecord 5 has changed its behavior per this blog post.
Here are the two commits that add this support in ActiveRecord 5.0.1.
To restore the old behavior, I had to add the following monkeypatch:
module ActiveRecord
  module ConnectionAdapters
    class AbstractMysqlAdapter < AbstractAdapter
    end

    class Mysql2Adapter < AbstractMysqlAdapter
      def supports_datetime_with_precision?
        false
      end
    end
  end
end
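An alternative to the monkeypatch, if you would rather keep the new behavior, is to let the columns actually store fractional seconds so nothing is rounded up into the future. A sketch, assuming MySQL 5.6.4+ and a hypothetical migration name:
class AddPrecisionToAlertsTimestamps < ActiveRecord::Migration[5.0]
  def change
    # datetime(6) keeps microseconds instead of rounding to the nearest second
    change_column :alerts, :created_at, :datetime, precision: 6
    change_column :alerts, :updated_at, :datetime, precision: 6
  end
end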

Logstash parsing csv date

I have a problem parsing a date from CSV, and I cannot find the issue with what one would assume is a simple format, dd/MM/yy. Here's the structure of my CSV file:
Date,Key-values,Line Item,Creative,Ad unit,Creative size,Ad server impressions,Ad server clicks,Ad server CTR
04/04/16,prid=DUBAP,Hilton_PostAuth 1,Stop Clicking Around - 300x250,383UKHilton_300x250,300 x 250,31,0,0.00%
04/04/16,prid=DUBAP,Hilton_PostAuth 2,16-0006_Auction_Banners_300x250_cat4,383UKHilton_300x250,300 x 250,59,0,0.00%
and my logstash.config file:
input {
  file {
    path => "/Users/User/*.csv"
    type => "core2"
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["Date","Key-values","Line Item","Creative","Ad unit","Creative size","Ad server impressions","Ad server clicks","Ad server CTR"]
    separator => ","
  }
  date {
    match => ["Date", "dd/MM/YY"]
  }
  mutate { convert => ["Ad server impressions", "float"] }
  mutate { convert => ["Ad server clicks", "float"] }
  mutate { convert => ["Ad server CTR", "float"] }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "test1"
    workers => 1
  }
  stdout {}
}
I have also tried combinations with the date format "dd/MM/yy" with no luck; Date is not indexed as a date, and I can select only #timestamp in Kibana.
I think there must be a simple thing I'm missing, but at the moment I cannot find it.
Cheers!
EDIT 1:
Here is the console output when Logstash starts and processes the data:
Settings: Default pipeline workers: 4
Pipeline main started
Failed parsing date from field {:field=>"Date", :value=>"Date", :exception=>"Invalid format: \"Date\"", :config_parsers=>"dd/MM/YY", :config_locale=>"default=en_US", :level=>:warn}
2016-05-06T20:32:48.034Z Pawels-MacBook-Air.local Date,Key-values,Line Item,Creative,Ad unit,Creative size,Ad server impressions,Ad server clicks,Ad server CTR
2016-04-03T23:00:00.000Z Pawels-MacBook-Air.local 04/04/16,prid=DUBAP,Hilton_PostAuth 1,Stop Clicking Around - 300x250,383UKHilton_300x250,300 x 250,31,0,0.00%
It still loads it into Elasticsearch but in Kibana there's no 'Date' field - I can only use #timestamp
Cheers
Actually, what the date filter does is:
The date filter is used for parsing dates from fields, and then using that date or timestamp as the Logstash timestamp for the event.
So with that configuration it reads your date and uses it as the timestamp field. If you want to keep it as a separate field as well, configure it as:
date {
  match => ["Date", "dd/MM/yy"]
  target => "Date"
}
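As a side note, the "Failed parsing date" warning in the console output comes from the CSV header line itself, whose Date column literally contains the text "Date". A small conditional, sketched here and not part of the original answer, drops that row before the date filter runs:
if [Date] == "Date" {
  # this event is the CSV header row, not real data
  drop { }
}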