JSON Feed Timestamp

I am using the FullCalendar JS plugin for a tool I am building, and I am creating a JSON feed.
"id": "'.$row['AppointmentID'].'",
"title": "'.$row['AppointmentName'].'",
"url": "'.$row['URL'].'",
"class": "event-important",
"start": "1364407286400"
},
The timestamp for the start of this event is 1364407286400, and for the life of me I cannot work out how this timestamp is formatted. I thought it was a Unix timestamp, but I generated one for today, replaced it, and the event is still not showing.
Can anyone point me in the right direction?

This is a timestamp in milliseconds. You can easily test this value using:
$test = (int)(1364407286400/1000);
var_dump((new DateTime())->setTimestamp($test));
The output will be:
object(DateTime)#1 (3) {
["date"]=>
string(26) "2013-03-27 11:01:26.000000"
["timezone_type"]=>
int(3)
["timezone"]=>
string(10) "US/Pacific"
}
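As a quick sketch of the other direction, here is one way to generate a millisecond "start" value in PHP when building the feed (the $row['StartDate'] column name is hypothetical):
// Current time in milliseconds since the Unix epoch
$startMs = (int) round(microtime(true) * 1000);
// Or, converting a date string from the database (hypothetical column name):
$startMs = strtotime($row['StartDate']) * 1000;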

Related

JSON Path according to specific date

I have the following array of JSON objects. There is one for each week.
[
  {
    "valutakode": "EUR",
    "valutabeskrivelse": "EURO",
    "valutakurs": "10,390",
    "omregningsenhet": 1,
    "fomdato": "2022-10-31",
    "tomdato": "2022-11-06"
  },
  {
    "valutakode": "EUR",
    "valutabeskrivelse": "EURO",
    "valutakurs": "10,180",
    "omregningsenhet": 1,
    "fomdato": "2022-11-07",
    "tomdato": "2022-11-13"
  }
]
I need the object that corresponds with the current week.
Is there a way to achieve this using JSONPath?
Thanks
You can try this JSONPath expression:
$[?(@.fomdato == '2022-11-07')]
or
$[?(@.fomdato == '2022-11-07' && @.tomdato == '2022-11-13')]
You will have to figure out the first or last date of the week containing the current date and convert it to a yyyy-mm-dd string.
I don't know what language you are using, but IMHO it might be much easier to parse your JSON and use date handling to find the right object, as sketched below.
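For instance, in plain JavaScript (assuming the array from the question has already been parsed into a variable named rates, which is hypothetical):
// "yyyy-mm-dd" for today; comparable lexicographically with the ISO dates above
const today = new Date().toISOString().slice(0, 10);
// Pick the entry whose fomdato..tomdato range contains today
const current = rates.find(r => r.fomdato <= today && today <= r.tomdato);
console.log(current && current.valutakurs);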

Using the key from JSON in query

I'm new to MySQL and received a task which requires a complex (for me) query. I read the documentation and a few sources, but I still cannot write it myself.
I'm selecting rows from a table where one of the columns holds JSON like this:
{
  [
    {
      "interval" : 2,
      "start": 03,
      "end": 07,
      "day_of_week": 3
    }, {
      "interval" : 8,
      "start": 22,
      "end": 23,
      "day_of_week": 6
    }
  ]
}
I want to check whether any of the "day_of_week" values equals the current day of the week and, if so, write that value and the associated "start", "end" and "day_of_week" values into variables to use in the query.
That's not valid JSON format, so none of the MySQL JSON functions will work on it regardless. Better just fetch the whole blob of not-JSON into a client application that knows how to parse it, and deal with it there.
Even if it were valid JSON, I would ask this: why would you store data in a format you don't know how to query?
The proper solution is the following:
SELECT `start`, `end`, day_of_week
FROM mytable
WHERE day_of_week = DAYOFWEEK(CURDATE());
See how easy that is when you store data in normal rows and columns? You get to use ordinary SQL expressions, instead of wondering how you can trick MySQL into giving up the data buried in your non-JSON blob.
JSON is the worst thing to happen to relational databases.
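For illustration, a hypothetical normalized table backing that query might look like this (table and column names are assumptions taken from the JSON keys; `interval` is a reserved word in MySQL, and `start`/`end` are backticked defensively, matching the aliases used later in this answer):
CREATE TABLE mytable (
  id INT AUTO_INCREMENT PRIMARY KEY,
  `interval` INT NOT NULL,
  `start` TINYINT NOT NULL,
  `end` TINYINT NOT NULL,
  day_of_week TINYINT NOT NULL -- 1 = Sunday .. 7 = Saturday, matching DAYOFWEEK()
);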
Re your comment:
If you need to query by day of week, then you could reorganize your JSON to support that type of query:
{
  "3": {
    "interval": 2,
    "start": 3,
    "end": 7,
    "day_of_week": 3
  },
  "6": {
    "interval": 8,
    "start": 22,
    "end": 23,
    "day_of_week": 6
  }
}
Then it's possible to get results for the current weekday this way:
SELECT data->>'$.start' AS `start`,
       data->>'$.end' AS `end`,
       data->>'$.day_of_week' AS `day_of_week`
FROM (
  SELECT JSON_EXTRACT(data, CONCAT('$."', DAYOFWEEK(CURDATE()), '"')) AS data
  FROM mytable
) AS d;
In general, when you store data in a non-relational manner, the way to optimize it is to organize the data to support a specific query.

Retrieve json field value with importJSON

I have a problem with Google Spreadsheets. I am trying to import a value from a link (which returns JSON), but it does not seem to work.
I tried this:
https://medium.com/@paulgambill/how-to-import-json-data-into-google-spreadsheets-in-less-than-5-minutes-a3fede1a014a#.pb26xo98x
The link returns JSON like this:
{
  "data": [
    {
      "time": "2016-10-16T07:00:00+0000",
      "value": "249.884067074"
    }
  ],
  "summary": {
    "name": "Custom Events",
    "period": "daily",
    "since": "2016-10-17T00:00:00+0000",
    "until": "2016-10-17T00:00:00+0000"
  }
}
How can I extract the value from the data field?
I tried like this:
=ImportJSON(myUrl, "/data[0]/value", "noInherit,noTruncate,rawHeaders")
According to a comment on the project page there is a fix that should be manually applied:
Chris says:
November 4, 2014 at 11:35 pm (UTC -4)
Trevor,
I was able to fix this problem by making a minor change to the
ParseData_ function. I changed line 286 in version 1.2.1 to:
if (i >= 0 && data[state.rowIndex]) {
and it seems to have addressed the issue.
Thank you!
CR
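For what it's worth, the stock ImportJSON query syntax expands arrays into rows rather than addressing elements by index, so a path without the [0] may also be worth trying (an assumption based on the library's documented query format):
=ImportJSON(myUrl, "/data/value", "noInherit,noTruncate,rawHeaders")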

Phoenix, Json and Unix Timestamps

I'm exporting data from SQL Server in json format so I can import it into my Phoenix app. What I'm not sure about is how I should deal with dates. At the moment I'm exporting dates as a Unix timestamp. Here's a sample:
[
  { "Log": { "Start": 1319734790, "End": 0, "Comment": "" } },
  { "Log": { "Start": 1319732847, "End": 1319734790, "Comment": "Had lunch today" } }
]
In Phoenix/Elixir, what's the best way to convert a Unix timestamp into a DateTime? I'm assuming that, because I'm inserting into the database through Ecto, I need to use an Ecto DateTime.
You can get an Erlang-style datetime tuple from a unix timestamp like this:
epoch = :calendar.datetime_to_gregorian_seconds({{1970, 1, 1}, {0, 0, 0}})
datetime = :calendar.gregorian_seconds_to_datetime(your_unix_timestamp + epoch)
This will give you the datetime as a tuple like {{2016, 4, 28}, {0, 50, 12}}.
You can convert that tuple to an Ecto.DateTime with Ecto.DateTime.from_erl/1
But check your assumptions. Maybe you need timezone information. And I see you have the value 0 in your example. Do you really want 1970-01-01 00:00:00 to represent "no value"?
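A minimal sketch of that caveat, mapping the exporter's 0 sentinel to nil instead of 1970-01-01 (the function name is hypothetical):
# Treat 0 as "no value" rather than the epoch
defp to_datetime(0), do: nil
defp to_datetime(unix) when is_integer(unix), do: DateTime.from_unix!(unix)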
You can use DateTime.from_unix/2 to transform a Unix timestamp into a DateTime struct, like below:
# Change the unit to millisecond; the default is second
timestamp |> DateTime.from_unix(:millisecond)
When you have a DateTime struct, you can just pass it to the Ecto model's field. Ecto supports two field types, naive_datetime and utc_datetime, which correspond to the NaiveDateTime and DateTime structs. You can choose the one you want and Ecto will do the transformation for you.
Finally, both field types are stored as the same type in the database, i.e. timestamp without time zone in PostgreSQL, so you can easily switch the model schema.
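As a sketch, an Ecto schema for the log data above might look like this (module, table and field names are assumptions; start_at/end_at avoid clashing with Elixir's end keyword):
defmodule MyApp.Log do
  use Ecto.Schema

  # Field names adapted from the JSON sample
  schema "logs" do
    field :start_at, :utc_datetime
    field :end_at, :utc_datetime
    field :comment, :string
  end
end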

"invalid char in json text" error in Couchbase view results

This is the document I store in the bucket. The ID (key) attribute is screenName.
{
  "created": null,
  "createdBy": null,
  "updated": null,
  "updatedBy": null,
  "screenName": "steelers",
  "appId": "100",
  "domain": "100$APPINFOTEAM",
  "alias": "steelers",
  "devision": "1"
}
I have multiple documents in Couchbase in this format, and I need to fetch them in descending order. This is the implementation I used for that:
Query query = new Query();
// Filter on the domain key
query.setIncludeDocs(true);
query.setDescending(true);
query.setInclusiveEnd(true);
query.setKey(domain);
List<AppInfoTeam> appInfoTeam = appinfoTeamService.getAppInfoTeamForApp(query);
This gives me the right documents, but without sorting. This is my view:
function (doc, meta) {
  if (doc._class == "com.link.twitter.pojo.AppInfoTeam") {
    emit(doc.domain, null);
  }
}
I also tried to filter results using the Couchbase server interface: I ticked the descending and inclusive_end options and put the domain in as the key. When I click the show results button, it gives me this error.
url: ?stale=false&descending=true&inclusive_end=true&key=domain&connection_timeout=60000&limit=10&skip=0
Error:
{"error":"bad_request","reason":"invalid UTF-8 JSON: {{error,{1,\"lexical error: invalid char in json text.\\n\"}},\n \"domain\"}"}
How can I fix this issue?
You need to wrap the key with double quotes:
<url>?stale=false&descending=true&inclusive_end=true&key="domain"&connection_timeout=60000&limit=10&skip=0
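The same applies from the Java SDK if the key is passed through verbatim, as the older 1.x client's setKey(String) did (a sketch under that assumption):
// The view engine expects the key to be valid JSON, so quote the string
query.setKey("\"" + domain + "\"");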