How to convert an array of numbers to a timestamp in MySQL?

I have a date value stored in a format like this:
"timestamp": [
2020,
2,
14,
15,
45,
47,
8000000
]
I have no idea what this format is, and I haven't found any way to convert this array of numbers into a MySQL timestamp using MySQL's own tools. Could anybody give me a clue?
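For what it's worth, this layout looks like date-time components in order: year, month, day, hour, minute, second, nanoseconds (Jackson, for example, serializes Java's LocalDateTime this way when the JSR-310 module is not registered). That is an assumption; verify it against whatever produced the export. Under that assumption, a minimal client-side sketch of the conversion:

```python
from datetime import datetime

# Assumed layout: [year, month, day, hour, minute, second, nanoseconds].
parts = [2020, 2, 14, 15, 45, 47, 8000000]
year, month, day, hour, minute, second, nanos = parts

# Python's datetime takes microseconds, so divide the nanoseconds by 1000.
dt = datetime(year, month, day, hour, minute, second, nanos // 1000)

# Format as a string MySQL accepts for DATETIME(6)/TIMESTAMP(6) columns.
mysql_timestamp = dt.strftime("%Y-%m-%d %H:%M:%S.%f")
print(mysql_timestamp)  # 2020-02-14 15:45:47.008000
```

You could then insert that string with an ordinary parameterized INSERT; the conversion itself is easier to do in the client than in SQL.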

Related

Using the key from JSON in query

I'm new to MySQL and received a task which requires a complex (for me) query. I read the documentation and a few sources, but I still cannot write it myself.
I'm selecting rows from a table where one of the cells contains JSON like this:
{
[
{
"interval" : 2,
"start": 03,
"end": 07,
"day_of_week": 3
}, {
"interval" : 8,
"start": 22,
"end": 23,
"day_of_week": 6
}
]
}
I want to check whether any of the "day_of_week" values equals the current day of week and, if so, to store that value and the "start", "end" and "day_of_week" values associated with it in variables to use them in the query.
That's not valid JSON format, so none of the MySQL JSON functions will work on it regardless. Better just fetch the whole blob of not-JSON into a client application that knows how to parse it, and deal with it there.
Even if it were valid JSON, I would ask this: why would you store data in a format you don't know how to query?
The proper solution is the following:
SELECT start, end, day_of_week
FROM mytable
WHERE day_of_week = DAYOFWEEK(CURDATE());
See how easy that is when you store data in normal rows and columns? You get to use ordinary SQL expressions, instead of wondering how you can trick MySQL into giving up the data buried in your non-JSON blob.
JSON is the worst thing to happen to relational databases.
Re your comment:
If you need to query by day of week, then you could reorganize your JSON to support that type of query:
{
"3":{
"interval" : 2,
"start": 03,
"end": 07,
"day_of_week": 3
},
"6": {
"interval" : 8,
"start": 22,
"end": 23,
"day_of_week": 6
}
}
Then it's possible to get results for the current weekday this way:
SELECT data->>'$.start' AS `start`,
data->>'$.end' AS `end`,
data->>'$.day_of_week' AS `day_of_week`
FROM (
SELECT JSON_EXTRACT(data, CONCAT('$."', DAYOFWEEK(CURDATE()), '"')) AS data
FROM mytable
) AS d;
In general, when you store data in a non-relational manner, the way to optimize it is to organize the data to support a specific query.
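A quick sketch, in Python for illustration, of the reorganization being suggested: re-key the array of objects by "day_of_week" so that a weekday lookup becomes a direct path access instead of a search:

```python
# The array-of-objects layout (as valid JSON-like data), then re-keyed by
# "day_of_week" -- the shape the JSON_EXTRACT query above expects.
rows = [
    {"interval": 2, "start": 3, "end": 7, "day_of_week": 3},
    {"interval": 8, "start": 22, "end": 23, "day_of_week": 6},
]

# Build an object keyed by the weekday as a string, e.g. {"3": {...}, "6": {...}}.
by_day = {str(r["day_of_week"]): r for r in rows}

# A lookup for weekday 6 is now a direct access, no searching required.
print(by_day["6"]["start"])  # 22
```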

How to sort text column that has mostly numbers and a dot

In MS Access, I have a field named "TargetDays" that has values like "0", "13", "20", "6", "1", "9", ".", "2", "28"
I want them to be sorted as
., 0, 1, 2, 6, 9, 13, 20, 28
I tried doing ORDER BY val(TargetDays)
But this sometimes sorts as ., 0, 1, 2, 6, 13, 20, 28 and other times as 0, ., 1, 2, 6, 13, 20, 28. The problem is with "." and "0".
Could someone please tell me a solution to sort in the intended order (as mentioned above)?
That happens because Val(".") and Val("0") both return 0, so your ORDER BY has no way to distinguish between those 2 characters in your [TargetDays] field ... and no way to know it should sort "." before "0".
You can include a secondary sort, based on ASCII values, to tell it what you want. An Immediate window example of the Asc() function in action ...
? Asc("."), Asc("0")
46 48
You could base your secondary sort on that function ...
ORDER BY val(TargetDays), Asc(TargetDays)
However, I don't think you should actually need to include the function because this should give you the same result ...
ORDER BY val(TargetDays), TargetDays
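A minimal sketch of that two-level sort, in Python for illustration (the val function below is a rough stand-in for Access's Val(), which parses the leading numeric portion of a string and returns 0 otherwise):

```python
def val(s):
    # Rough stand-in for Access's Val(): numeric value, or 0 for non-numeric text.
    try:
        return float(s)
    except ValueError:
        return 0.0

days = ["0", "13", "20", "6", "1", "9", ".", "2", "28"]

# Primary key: numeric value (ties "." and "0" at 0).
# Secondary key: the raw string, which orders "." (ASCII 46) before "0" (ASCII 48).
ordered = sorted(days, key=lambda s: (val(s), s))
print(ordered)  # ['.', '0', '1', '2', '6', '9', '13', '20', '28']
```

The secondary key is exactly what the second ORDER BY term provides: it only matters when the primary numeric key ties.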

Edit Parameter in JSON

I want to deploy an Azure ARM template.
In the parameters section I defined an IP range for the subnet.
"SubnetIP": {
"defaultValue": "10.0.0.0",
"type": "string"
},
"SubnetMask": {
"type": "int",
"defaultValue": 16,
"allowedValues": [
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27
]
}
When creating the private IP I used
"privateIPAddress": "[concat(parameters('SubnetIP'),copyindex(20))]",
This does not give me the expected output, because SubnetIP is 10.0.0.0 and not 10.0.0. Is there a way to edit the parameter in that function?
Regards, Stefan
You should do a bit of arithmetic if you want this to be robust:
"ipAddress32Bit": "[add(add(add(mul(int(split(parameters('ipAddress'),'.')[0]),16777216),mul(int(split(parameters('ipAddress'),'.')[1]),65536)),mul(int(split(parameters('ipAddress'),'.')[2]),256)),int(split(parameters('ipAddress'),'.')[3]))]",
"modifiedIp": "[add(variables('ipAddress32Bit'),1)]",
"ipAddressOut": "[concat(string(div(variables('modifiedIP'),16777216)), '.', string(div(mod(variables('modifiedIP'),16777216),65536)), '.', string(div(mod(variables('modifiedIP'),65536),256)), '.', string(mod(variables('modifiedIP'),256)))]"
Not going to take credit for that. source. The addition happens in the modifiedIp variable in this example. You could also combine this with the copy function.
Edit: OK, I thought this was somewhat obvious, but I'll explain how I understand what's going on (I might be wrong).
He takes the individual IP address pieces (10.1.2.3 → 10, 1, 2, 3)
He multiplies each piece by a specific number to get its decimal representation
He sums the pieces
He adds 1 (to get the next IP address in decimal representation)
He casts the decimal number back to an IP address
To illustrate the idea, use these links:
https://www.browserling.com/tools/dec-to-ip
https://www.ipaddressguide.com/ip
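The same 32-bit arithmetic those template expressions perform, sketched in Python for illustration (not part of the ARM template itself):

```python
def ip_to_int(ip):
    # Each octet is weighted by a power of 256: 16777216, 65536, 256, 1.
    a, b, c, d = (int(p) for p in ip.split("."))
    return a * 16777216 + b * 65536 + c * 256 + d

def int_to_ip(n):
    # Reverse: pull each octet back out with shifts and masks.
    return ".".join(str((n >> shift) & 255) for shift in (24, 16, 8, 0))

# Add 1 to get the next address, exactly as the modifiedIp variable does.
next_ip = int_to_ip(ip_to_int("10.0.0.0") + 1)
print(next_ip)  # 10.0.0.1
```

Because the address is treated as a single 32-bit number, the carry across octets works correctly (e.g. 10.0.0.255 + 1 becomes 10.0.1.0), which is what makes this approach robust.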
So you want only the first part of the specified subnet?
Maybe try something like this:
"variables":{
"SubnetPrefix": "[substring(parameters('SubnetIP'), 0, lastIndexOf(parameters('SubnetIP'), '.'))]"
"privateIPAddress": "[concat(variables('SubnetPrefix'),copyindex(20))]"
}
It would not be pretty for subnets larger than /24, but in the example it could work. Have a look at the ARM template string functions.

Getting the index of an element of an JSON array in mysql

I have a JSON array of numbers like [16, 9, 11, 22, 23, 12] and I would like to get the index of a number within the array. For example, if I ask for the index of 9, it should return 1.
I tried the query below in MySQL, but I get null.
SELECT JSON_SEARCH(CAST('[16, 9, 11, 22, 23, 12]' AS JSON),'one',9)
Do you have a solution for this?
CAST is not necessary here, but the array values must be quoted, because JSON_SEARCH only searches string values:
JSON_SEARCH(json_doc, one_or_all, search_str[, escape_char[, path] ...])
Returns the path to the given string within a JSON document.
SELECT json_search('["16", "9", "11", "22", "23", "12"]', 'one', '9');
returns "$[1]"
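If the numbers cannot be stored as strings, one option is to do the lookup client-side. A Python sketch of both the direct lookup and of turning the "$[1]" path that JSON_SEARCH returns back into a numeric index:

```python
import json
import re

# Direct client-side lookup: parse the JSON array and use list.index().
doc = "[16, 9, 11, 22, 23, 12]"
index = json.loads(doc).index(9)
print(index)  # 1

# If you already have the path string JSON_SEARCH returned for the quoted
# variant, the numeric index can be extracted from it:
path = '"$[1]"'
match = re.search(r"\[(\d+)\]", path)
print(int(match.group(1)))  # 1
```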

Phoenix, Json and Unix Timestamps

I'm exporting data from SQL Server in json format so I can import it into my Phoenix app. What I'm not sure about is how I should deal with dates. At the moment I'm exporting dates as a Unix timestamp. Here's a sample:
[
{ "Log": { "Start": 1319734790, "End": 0, "Comment": "" } },
{ "Log": { "Start": 1319732847, "End": 1319734790, "Comment": "Had lunch today" } }
]
In Phoenix/Elixir what's the best way to convert a Unix timestamp into a DateTime object? I'm assuming because I'm inserting into an Ecto database, that I need to use an Ecto DateTime object.
You can get an Erlang-style datetime tuple from a unix timestamp like this:
epoch = :calendar.datetime_to_gregorian_seconds({{1970, 1, 1}, {0, 0, 0}})
datetime = :calendar.gregorian_seconds_to_datetime(your_unix_timestamp + epoch)
This will give you a datetime in tuple format, like {{2016, 4, 28}, {0, 50, 12}}
You can convert that tuple to an Ecto.DateTime with Ecto.DateTime.from_erl/1
But check your assumptions. Maybe you need timezone information. And I see you have the value 0 in your example. Do you really want 1970-01-01 00:00:00 to represent "no value"?
You can use DateTime.from_unix/2 to transform unix timestamp to DateTime struct, like below
# Change unit to millisecond, the default is second
timestamps |> DateTime.from_unix(:millisecond)
Once you have a DateTime struct, you can just pass it to an Ecto model's field. Ecto supports two field types, naive_datetime and utc_datetime, which correspond to the NaiveDateTime and DateTime structs. You can choose the one you want and Ecto will do the transformation for you.
Finally, both field types are stored as the same type in the database, i.e. timestamp without time zone in PostgreSQL, so you can easily switch the model schema.
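As a cross-check of the values in the export above, the same conversion sketched in Python for illustration (a Unix timestamp is just seconds since 1970-01-01 00:00:00 UTC):

```python
from datetime import datetime, timezone

ts = 1319734790  # the "Start" value from the sample export
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.strftime("%Y-%m-%d %H:%M:%S"))  # 2011-10-27 16:59:50
```

Note the explicit UTC timezone: converting without one uses the machine's local zone, which is the kind of assumption the first answer warns about.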