I am working on an Azure Logic App that is triggered via an HTTP call and returns a JSON response. Internally, the logic app retrieves JSON data from a web API and then converts the response JSON data to a format that is acceptable to the calling client of the logic app.
The problem I'm having is that the web API returns dates in the format "/Date(1616371200000)/" and I need the date to look like "2021-03-22T00:00:00Z". I don't see any built-in Logic App function that can work with or convert the epoch timestamp as-is (unless I'm missing something).
To clarify...
Source Data:
{
"some_silly_date": "/Date(1616371200000)/"
}
Desired Data:
{
"some_silly_date": "2021-03-32T19:00:00Z"
}
The following solution would theoretically work if the source date wasn't wrapped with "/Date(...)/":
"#addToTime('1970-01-01T00:00:00Z', 1616371200000, 'Second')"
Parsing that text off the timestamp before converting it would lead to a really ugly expression. Is there a better way to convert the timestamp that I'm not aware of?
Note that using the Liquid JSON-to-JSON templates is not an option. I was using that approach and found that the action apparently has to JIT-compile before use, which causes my logic app to time out when it is called after a long period of inactivity.
Can you get the value "/Date(1616371200000)/" from the JSON into a variable? If so, a little string manipulation would do the trick:
int(replace(replace(variables('data_in'),'/Date(',''),')/',''))
Then use the variable in the addToTime function.
Result:
The following expression seems to work and returns a timestamp in UTC. Note that the substring() function uses a length of 10 instead of 13. I'm intentionally trimming off the milliseconds from the epoch timestamp because the addToTime() function only handles seconds.
{
"some_silly_date": "#addToTime('1970-01-01T00:00:00Z', int(substring(item()?['some_silly_date'],6,10)), 'Second')"
}
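For reference, here is a minimal Python sketch, outside the Logic App, just to illustrate what that expression computes: strip the "/Date(" wrapper, keep the first 10 digits so the value is in whole seconds rather than milliseconds, and convert from the Unix epoch to an ISO 8601 UTC timestamp.
from datetime import datetime, timezone

raw = "/Date(1616371200000)/"
# Characters 6..15 are the first 10 digits of the epoch value, i.e. whole seconds
# (the trailing 3 digits are milliseconds, which addToTime() cannot use).
epoch_seconds = int(raw[6:16])
converted = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
print(converted.isoformat())  # 2021-03-22T00:00:00+00:00, i.e. 2021-03-22T00:00:00Z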
As an added bonus, if the timestamp value is null in the source JSON, do the following:
{
"some_silly_date": "#if(equals(item()?['some_silly_date'], null), null, addToTime('1970-01-01T00:00:00Z', int(substring(item()?['some_silly_date'],6,10)), 'Second'))"
}
Quite possibly the ugliest code I've ever written.
Related
I want to know if there is a way to write a date in datetime format to JSON.
I have followed many links on the internet, but everywhere the date is converted to a string (str) in order to write it to a JSON file.
I used the below code:
import json
fileName = 'json_output.json'
def writeToJSONFile(data):
    with open(fileName, 'a+') as fp:
        json.dump(data, fp, indent=4, default=str)
Then I call it as:
from datetime import datetime
date_value="09-23-2019"
date_time = datetime.strptime(date_value,'%m-%d-%Y')
date_dict={"eventDate":date_time}
writeToJSONFile(date_dict)
The above code is able to write the date into the JSON file, but in string format.
I have already gone through these links:
How to overcome "datetime.datetime not JSON serializable"?
JSON datetime between Python and JavaScript
I just want to know: is it possible at all to store the date in a datetime format?
The simple answer is no: JSON does not have a native datetime representation, so the date will always appear as a string. However, if you use an agreed-upon standard, the receiving application can parse the value into a datetime object if it chooses to. If you do not know what format the receiver expects, I would recommend ISO 8601 (combined date and time in UTC); that way the value can be parsed by the receiver and retains the correct meaning regardless of time zone.
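For example, here is a minimal sketch of the writeToJSONFile function from the question, adjusted to emit ISO 8601 strings (assuming treating the naive date as UTC is acceptable for your data):
import json
from datetime import datetime, timezone

fileName = 'json_output.json'

def writeToJSONFile(data):
    # Serialize datetime values as ISO 8601 strings; fall back to str() for anything else.
    with open(fileName, 'a+') as fp:
        json.dump(data, fp, indent=4,
                  default=lambda o: o.isoformat() if isinstance(o, datetime) else str(o))

# Assume the naive date is UTC, per the ISO 8601 / UTC recommendation above.
date_time = datetime.strptime("09-23-2019", '%m-%d-%Y').replace(tzinfo=timezone.utc)
writeToJSONFile({"eventDate": date_time})  # writes "eventDate": "2019-09-23T00:00:00+00:00"
On the receiving side, datetime.fromisoformat() (Python 3.7+) or any ISO 8601 parser can turn the string back into a datetime object.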
I have an Azure Function with Blob Storage as an output binding.
My question is: how do I specify a {date}/{time} output path pattern from the Azure Function? I don't want to store all blobs flat in the container.
I tried mycontainername/{date}/{time}, but it complains, saying "No binding parameter exists for 'date'".
Thanks
You can use the datetime parameter resolver with the appropriate format string.
For example:
{datetime:yyyy} will result in 2017 (in 2017)
{datetime:hhmmss} will result in hours, minutes and seconds, with no separators.
The format strings are the ones supported by the .NET Framework (custom date and time format strings); standard format strings are also supported.
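For example (an illustrative path only, reusing the container name from the question and assuming you also want a unique file name per blob), a binding path along the lines of mycontainername/{datetime:yyyy}/{datetime:MM}/{datetime:dd}/{rand-guid}.json should produce a date-based folder hierarchy instead of a flat container.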
You can now use it in Java Azure Functions as well.
{DateTime}
example:
@BlobOutput(name = "blob", connection = "AzureWebJobsStorage", path = "samples-java/new-{DateTime}.zip") OutputBinding<byte[]> blob
I am writing pdxInstances to GemFire using the sequence: rabbitmq => springxd => gemfire.
If I put this JSON into RabbitMQ: {'ID':11,'value':5}, value appears as a byte in GemFire. If I put {'ID':11,'value':500}, value appears as a word, and if I put {'ID':11,'value':50000}, it appears as an integer.
A problem arises when I query data from GemFire and order the results. For example, a query such as select * from /my_region order by value fails, saying it cannot compare a byte with a word (or a byte with an integer).
Is there any way to declare the data type in JSON? Or any other method to get rid of this problem?
To add a bit of insight into this problem... in reviewing GemFire/Geode source code, it would seem it is not possible to configure the desired value type and override GemFire/Geode's default behavior, which can be seen in JSONFormatter.setNumberField(..).
I will not explain how GemFire/Geode involves the JSONFormatter during a Region.put(key, value) operation as it is rather involved and beyond the scope of this discussion.
However, one could argue that the problem is not necessarily with the JSONFormatter class, since storing a numeric value in a byte is more efficient than storing the value in an integer, especially when the value would indeed fit into a byte. Therefore, the problem is really that the Comparator used in the Query processor should be able to compare numeric values in the same type family (byte, short, int, long), upcasting where appropriate.
If you feel so inclined, feel free to file a JIRA ticket in the Apache Geode JIRA repository at https://issues.apache.org/jira/browse/GEODE-72?jql=project%20%3D%20GEODE
Note, Apache Geode is the open source "core" of Pivotal GemFire now. See the Apache Geode website for more details.
Cheers!
Your best bet would be to take care of this with a custom module or a Groovy script. You can either write a custom module in Java to do the conversion, upload it into Spring XD, and then reference it like any other processor; or you can write a script in Groovy and pass the incoming data through a transform processor.
http://docs.spring.io/spring-xd/docs/current/reference/html/#processors
The actual conversion probably won't be too tricky, but will vary depending on which method you use. The stream creation would look something like this when you're done.
stream create --name myRabbitStream --definition "rabbit | my-custom-module | gemfire-json-server etc....."
stream create --name myRabbitStream --definition "rabbit | transform --script=file:/transform.groovy | gemfire-json-server etc...."
It seems like you have your source and sink modules set up just fine, so all you need to do is get your processor module setup to do the conversion and you should be all set.
I have an iOS project that uses the RestKit 0.21.0 component to get, parse, and store responses from a remote server in Core Data. In one of the backend JSON responses I have something like this:
"response": [
{
"id": 1,
"start_time": "10:00:00",
"end_time": "14:00:00",
"name": "Object name"
},
.
.
.
]
In Model.xcdatamodeld I've defined an entity with fields startTime and endTime of type Date. Generally, all mappings from the JSON response to objects work correctly, but I have a problem with the JSON fields start_time and end_time.
Do you have any advice on how to correctly map the time fields to dates that can be stored in Core Data (SQLite database)?
Create an NSDateFormatter with the appropriate format to parse your time strings. Add the date formatter with [[RKValueTransformer defaultValueTransformer] insertValueTransformer:dateFormatter atIndex:0];. Now RestKit will search through all your defined date formatters as well as the default ones whenever it needs to map to an NSDate destination.
I have inherited an app that makes extensive use of RestKit, is on the 0.21 release now, and it's great. I needed to add date-to-string conversion in the YYYY-MM-DD HH:MM:SS.SSS format and followed the advice above to add the required date formatter to the default compound formatters at index 0. However, I found that calls to RKObjectMapping overrode this by adding an ISO 8601 formatter at index 0 for backward compatibility in +(void)initialize. Commenting those lines out gave me the correct result. It is possibly the way the app is structured; there are any number of calls to RKObjectMapping, and it was not possible to add the date formatter in the right place without the change to RKObjectMapping.
As the JSON format doesn't standardize a date subformat, this task is completely up to the programmer, right?
When sending dates from PHP to JavaScript and back, I sent them as a single integer in UNIX timestamp format (the number of seconds since 1970-01-01).
On server:
$now = new DateTime('now');
$now->getTimestamp();
On client:
.success : function (data)
{
var date = new Date(data * 1000);
}
What's the best format for sending dates from JSP? (I'm a JSP and Java newbie.)
Obviously, it has to be easy to encode/decode using native Java classes as well as the JavaScript Date object.
There should not be any problems with overflow (I'm afraid that after 2038 my PHP code will break).
Regards,
new Date(milliseconds) // milliseconds since 1970/01/01 - so you've solved your own problem. I personally prefer to use the yyyy-mm-dd hh:ii:ss format, as it is readable and distinct from both the UK and US formats.