I'm exporting data from SQL Server in json format so I can import it into my Phoenix app. What I'm not sure about is how I should deal with dates. At the moment I'm exporting dates as a Unix timestamp. Here's a sample:
[
{ "Log": { "Start": 1319734790, "End": 0, "Comment": "" },
{ "Log": { "Start": 1319732847, "End": 1319734790, "Comment": "Had lunch today" }
]
In Phoenix/Elixir, what's the best way to convert a Unix timestamp into a DateTime struct? I'm assuming that because I'm inserting through Ecto, I need to use an Ecto DateTime type.
You can get an Erlang-style datetime tuple from a unix timestamp like this:
epoch = :calendar.datetime_to_gregorian_seconds({{1970, 1, 1}, {0, 0, 0}})
datetime = :calendar.gregorian_seconds_to_datetime(your_unix_timestamp + epoch)
This will give you a datetime in tuple format, like {{2016, 4, 28}, {0, 50, 12}}
You can convert that tuple to an Ecto.DateTime with Ecto.DateTime.from_erl/1
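For example, converting the first Start value from the sample (a sketch; it assumes an Ecto version that still ships the Ecto.DateTime type):
# 1319734790 is the first "Start" value from the sample above
datetime = :calendar.gregorian_seconds_to_datetime(1319734790 + epoch)
# => {{2011, 10, 27}, {16, 59, 50}}
Ecto.DateTime.from_erl(datetime)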
But check your assumptions. Maybe you need timezone information. And I see you have the value 0 in your example. Do you really want 1970-01-01 00:00:00 to represent "no value"?
You can use DateTime.from_unix/2 to transform a Unix timestamp into a DateTime struct, like below:
# Change unit to millisecond, the default is second
timestamp |> DateTime.from_unix(:millisecond)
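Note that DateTime.from_unix/2 returns an {:ok, datetime} tuple on success. If you would rather get the struct directly (and raise on bad input), use the bang variant:
{:ok, datetime} = DateTime.from_unix(1319734790)
datetime = DateTime.from_unix!(1319734790)
# => #DateTime<2011-10-27 16:59:50Z>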
When you have a DateTime struct, you can just pass it to an Ecto model's field. Ecto supports two field types, naive_datetime and utc_datetime, which correspond to the NaiveDateTime and DateTime structs. You can choose whichever you want and Ecto will do the transformation for you.
Finally, both field types map to the same type in the database, i.e. timestamp without time zone in PostgreSQL, so you can easily switch between them in the schema.
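For example, a schema using these types might look like this (a sketch; the "logs" table and field names are assumptions based on the sample JSON):
defmodule MyApp.Log do
  use Ecto.Schema

  schema "logs" do
    field :start, :utc_datetime
    # Consider storing nil instead of 0 for "no value", per the caveat above
    field :end, :utc_datetime
    field :comment, :string
  end
end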
I have the following array of JSON objects
There is one for each week.
[
{
"valutakode": "EUR",
"valutabeskrivelse": "EURO",
"valutakurs": "10,390",
"omregningsenhet": 1,
"fomdato": "2022-10-31",
"tomdato": "2022-11-06"
},
{
"valutakode": "EUR",
"valutabeskrivelse": "EURO",
"valutakurs": "10,180",
"omregningsenhet": 1,
"fomdato": "2022-11-07",
"tomdato": "2022-11-13"
}
]
I need the object that corresponds with the current week.
Is there an option to achieve this using jsonpath?
Thanks
You can try this jsonpath:
$[?(@.fomdato == '2022-11-07')]
or
$[?(@.fomdato == '2022-11-07' && @.tomdato == '2022-11-13')]
You will have to figure out the first or last date of the week containing the current date and convert it to a string in yyyy-mm-dd format.
I don't know what language you are using, but IMHO it may be much easier to parse your JSON and use a date type to find the right object, as sketched below.
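For instance, in Python (a sketch; rate_for_today is a made-up name, and the JSON is assumed to arrive as a string):
import datetime
import json

def rate_for_today(raw_json):
    """Return the entry whose fomdato..tomdato range covers today."""
    today = datetime.date.today()
    for entry in json.loads(raw_json):
        start = datetime.date.fromisoformat(entry["fomdato"])
        end = datetime.date.fromisoformat(entry["tomdato"])
        if start <= today <= end:
            return entry
    return None
This sidesteps computing week boundaries entirely: instead of matching an exact fomdato, it checks whether today falls inside each entry's date range.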
I'm new to MySQL and received a task which requires a complex (for me) query. I read the documentation and a few sources, but I still cannot write it myself.
I'm selecting rows from a table where one of the cells holds JSON like this one:
{
[
{
"interval" : 2,
"start": 03,
"end": 07,
"day_of_week": 3
}, {
"interval" : 8,
"start": 22,
"end": 23,
"day_of_week": 6
}
]
}
I want to check whether any of the "day_of_week" values equals the current day of the week and, if so, store that value and the "start", "end", and "day_of_week" values associated with it in variables to use in the query.
That's not valid JSON format, so none of the MySQL JSON functions will work on it regardless. Better just fetch the whole blob of not-JSON into a client application that knows how to parse it, and deal with it there.
Even if it were valid JSON, I would ask this: why would you store data in a format you don't know how to query?
The proper solution is the following:
SELECT `start`, `end`, day_of_week
FROM mytable
WHERE day_of_week = DAYOFWEEK(CURDATE());
See how easy that is when you store data in normal rows and columns? You get to use ordinary SQL expressions, instead of wondering how you can trick MySQL into giving up the data buried in your non-JSON blob.
JSON is the worst thing to happen to relational databases.
Re your comment:
If you need to query by day of week, then you could reorganize your JSON to support that type of query:
{
"3":{
"interval" : 2,
"start": 3,
"end": 7,
"day_of_week": 3
},
"6": {
"interval" : 8,
"start": 22,
"end": 23,
"day_of_week": 6
}
}
Then it's possible to get results for the current weekday this way:
SELECT data->>'$.start' AS `start`,
data->>'$.end' AS `end`,
data->>'$.day_of_week' AS `day_of_week`
FROM (
SELECT JSON_EXTRACT(data, CONCAT('$."', DAYOFWEEK(CURDATE()), '"')) AS data
FROM mytable
) AS d;
In general, when you store data in a non-relational manner, the way to optimize it is to organize the data to support a specific query.
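For completeness: if the blob were stored as a valid JSON array (just the [...] part, without the enclosing braces), MySQL 8.0's JSON_TABLE could unpack it into rows without reorganizing the document. A sketch, assuming a JSON column named data:
SELECT jt.`start`, jt.`end`, jt.day_of_week
FROM mytable,
     JSON_TABLE(data, '$[*]'
       COLUMNS (
         `start`     INT PATH '$.start',
         `end`       INT PATH '$.end',
         day_of_week INT PATH '$.day_of_week'
       )) AS jt
WHERE jt.day_of_week = DAYOFWEEK(CURDATE());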
I get the JSON data below from a source, with the date in "YYYY-MM-DD HH:MM:SS" format, and I want to convert it to milliseconds since the epoch before sending it to the destination. Can someone help with how to do this in a Logic App?
Source data:
{
"result":[
{
"number":"123",
"name":"ABC",
"created":"2018-09-19 09:03:03"
}
]
}
Desired data:
{
"result":[
{
"number":"123",
"name":"ABC",
"created":"1537304583000"
}
]
}
Azure Logic Apps only has the ticks() function, which converts a timestamp to a tick count.
You can then use the sub() function to turn the ticks value into an epoch value by subtraction.
Expression:
sub(ticks('2018-09-19 09:03:03'),636727908525417000)
636727908525417000 is the difference between ticks('2018-09-19 09:03:03') and the 1537304583000 you gave.
You can refer to this article:
https://devkimchi.com/2018/11/04/converting-tick-or-epoch-to-timestamp-in-logic-app/
Logic Apps expression to get current time in Epoch format:
div(sub(ticks(utcNow()),621355968000000000),10000000)
Explanation:
utcNow() gets the current datetime, by default in format "o" (yyyy-MM-ddTHH:mm:ss.fffffffK).
utcNow() can be replaced with your own timestamp, e.g. ticks("myTimestamp") in place of ticks(utcNow()), or with parseDateTime("myTimestamp") or trigger().startTime.
ticks()/ticks(utcNow()) converts the datetime to the number of 100-nanosecond intervals since midnight on January 1, 0001, aka "ticks".
Hardcoded value: 621355968000000000 is the tick value of 1970-01-01T00:00:00Z, aka the start of the Unix epoch.
sub()/sub(ticks(utcNow()),621355968000000000) subtracts the hardcoded tick value of the start of the Unix epoch from the tick count of your datetime.
div()/div(sub(ticks(utcNow()),621355968000000000),10000000) divides the tick difference by 10,000,000, converting the 100-nanosecond ticks into whole seconds (the Unix timestamp in seconds).
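The question above asks for milliseconds rather than seconds. Since one millisecond is 10,000 ticks, dividing by 10,000 instead should give epoch milliseconds (a sketch using the same functions, assuming the source timestamp is UTC):
div(sub(ticks('2018-09-19 09:03:03'),621355968000000000),10000)
Wrap the result in string() if the destination expects it as a string, as in the desired output above.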
I need to compare duplicate IPs in a JSON array by the Date field and remove the entries with older dates.
Ex:
[
{
"IP": "10.0.0.20",
"Date": "2019-09-14T20:00:11.543-03:00"
},
{
"IP": "10.0.0.10",
"Date": "2019-09-17T15:45:16.943-03:00"
},
{
"IP": "10.0.0.10",
"Date": "2019-09-18T15:45:16.943-03:00"
}
]
The output of the operation needs to look like this:
[
{
"IP": "10.0.0.20",
"Date": "2019-09-14T20:00:11.543-03:00"
},
{
"IP": "10.0.0.10",
"Date": "2019-09-18T15:45:16.943-03:00"
}
]
For simplicity's sake, I'll assume the order of the data doesn't matter.
First, if your data isn't already in Python, you can use json.load or json.loads to convert it into a Python object, following the straightforward type mappings.
Then your problem has three parts: comparing date strings as dates, finding the maximum element of a list by that date, and performing this process for each distinct IP address. For these purposes, you can use two of Python's built-in functions and two pieces of the standard library.
Python's built-in max and sorted functions (as well as list.sort) support a (keyword-only) key argument, which uses a function to determine the value to compare by. For example, max(d1, d2, key=lambda x: x[0]) compares the data by the first element of each (like d1[0] < d2[0]) and returns whichever of d1 and d2 produced the larger key.
To allow that type of comparison between dates, you can use the datetime.datetime class. If your dates are all in the format specified by datetime.datetime.fromisoformat, you can use that function to turn your date strings into datetimes, which can then be compared to each other. Using that in a function that extracts the dates from the dictionaries gives you the key function you need.
import datetime

def extract_date(item):
    return datetime.datetime.fromisoformat(item['Date'])
Those functions allow you to choose the object from the list with the largest date, but not to keep separate values for different IP addresses. To do that, you can use itertools.groupby, which takes a key function and puts the elements of the input into separate outputs based on that key. However, there are two things you might need to watch out for with groupby:
It only groups elements that are next to each other. For example, if you give it [3, 3, 2, 2, 3], it will group two 3s, then two 2s, then one 3, rather than grouping all three 3s together. Sorting the input by the same key first avoids this.
It returns an iterator of key, iterator pairs, so you have to collect the results yourself. The best way to do that may depend on your application, but a basic approach is nested iterations:
for key, values in groupby(data, key_function):
    for value in values:
        print(key, value)
With the functions I've mentioned above, it should be relatively straightforward to assemble an answer to your problem.
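For instance, putting the pieces together (a sketch; newest_per_ip is a made-up name, and the order of the output is not preserved):
import datetime
import itertools
import json

def extract_date(item):
    return datetime.datetime.fromisoformat(item['Date'])

def newest_per_ip(raw_json):
    data = json.loads(raw_json)
    # groupby only groups adjacent items, so sort by IP first
    data.sort(key=lambda item: item['IP'])
    return [
        max(group, key=extract_date)
        for _, group in itertools.groupby(data, key=lambda item: item['IP'])
    ]
Note that fromisoformat handles offsets like -03:00 on Python 3.7+, which is enough for the timestamps in your sample.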
I use a tool called Redash to query (in JSON) on MongoDB. In my collections, dates are stored in ISO format, so when my query is imported into a sheet (with Google Sheets' importdata function), I have to convert the dates to the appropriate format with a formula built in the sheet.
I would love to integrate this operation directly into my query, so that the ISO date is sent to Sheets already in the appropriate "dd-MM-yyyy HH:ss" format.
Any ideas ?
Many many thanks
You may be able to use the $dateToString aggregation operator inside a $project aggregation stage.
For example:
> db.test.find()
{ "_id": 0, "date": ISODate("2018-03-07T05:14:13.063Z"), "a": 1, "b": 2 }
> db.test.aggregate([
{$project: {
date: {$dateToString: {
format: '%d-%m-%Y %H:%M:%S',
date: '$date'
}},
a: '$a',
b: '$b'
}}
])
{ "_id": 0, "date": "07-03-2018 05:14:13", "a": 1, "b": 2 }
Note that although the $dateToString operator has been available since MongoDB 3.0, MongoDB 3.6 added the ability to output the string according to a specific timezone.
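For example, on MongoDB 3.6+ the same projection can shift the output into a named timezone (a sketch; 'Europe/Oslo' is just an illustrative zone):
> db.test.aggregate([
  {$project: {
    date: {$dateToString: {
      format: '%d-%m-%Y %H:%M:%S',
      date: '$date',
      timezone: 'Europe/Oslo'
    }},
    a: '$a',
    b: '$b'
  }}
])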