Is it possible to write a date in DateTime format in JSON using Python without converting it to a string?

I want to know if there is a way to write a date in DateTime format in JSON.
I have followed many links on the internet, but everywhere the date is converted to a string (str) in order to write it to the JSON file.
I used the code below:
import json

fileName = 'json_output.json'

def writeToJSONFile(data):
    with open(fileName, 'a+') as fp:
        json.dump(data, fp, indent=4, default=str)
Then I call it like this:
from datetime import datetime
date_value="09-23-2019"
date_time = datetime.strptime(date_value,'%m-%d-%Y')
date_dict={"eventDate":date_time}
writeToJSONFile(date_dict)
The above code is able to write the date into the JSON file, but in string format.
I have already gone through these links:
How to overcome "datetime.datetime not JSON serializable"?
JSON datetime between Python and JavaScript
I just want to know: is it possible at all to store the date in a datetime format?

The simple answer is no: JSON does not have a native datetime representation, so the value will always appear as a string. However, if you use a standard that both sides agree on, the receiving application can parse the value back into a datetime object if it chooses to. If you do not know what format the receiver expects, I would recommend ISO 8601 (combined date and time in UTC); that way the value can be parsed by the receiver and retains the correct meaning regardless of time zone.
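For example, here is a minimal sketch of that approach, reusing the eventDate field from the question above (the encode_datetime helper is my own name for the default hook):

import json
from datetime import datetime, timezone

def encode_datetime(obj):
    # Emit datetimes as ISO 8601 strings; anything else is still an error.
    if isinstance(obj, datetime):
        return obj.isoformat()
    raise TypeError(f"{type(obj).__name__} is not JSON serializable")

date_dict = {"eventDate": datetime(2019, 9, 23, tzinfo=timezone.utc)}
print(json.dumps(date_dict, default=encode_datetime))
# {"eventDate": "2019-09-23T00:00:00+00:00"}

The receiver can then restore the value with datetime.fromisoformat() (Python 3.7+).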

Related

Azure Logic Apps - Convert JSON Epoch Timestamp to DateTime String

I am working on an Azure Logic App that is triggered via an HTTP call and returns a JSON response. Internally, the logic app retrieves JSON data from a web API and then converts the response JSON into a format that is acceptable to the calling client of the logic app.
The problem I'm having is that the web API returns dates in the format "/Date(1616371200000)/", and I need the dates to look like "2021-03-22T00:00:00Z". I don't see any built-in logic app function that can work with or convert the Epoch timestamp as-is (unless I'm missing something).
To clarify...
Source Data:
{
    "some_silly_date": "/Date(1616371200000)/"
}
Desired Data:
{
    "some_silly_date": "2021-03-22T00:00:00Z"
}
The following solution would theoretically work if the source date weren't wrapped in "/Date(...)/":
"@addToTime('1970-01-01T00:00:00Z', 1616371200000, 'Second')"
Parsing that text off the timestamp before converting it would lead to a really ugly expression. Is there a better way to convert the timestamp that I'm not aware of?
Note that using the Liquid JSON-to-JSON templates is not an option. I was using that and found that this action apparently has to JIT-compile before use, which was causing my logic app to time out when called after a long period of inactivity.
Can you get the value "/Date(1616371200000)/" from the JSON into a variable? If so, a little string manipulation would do the trick:
int(replace(replace(variables('data_in'),'/Date(',''),')/',''))
Then use the variable in the addToTime function.
Result:
The following expression seems to be working and returns a timestamp in UTC. Note that the substring() function uses a length of 10 instead of 13: I'm intentionally trimming off the milliseconds from the Epoch timestamp because the addToTime() function only handles seconds.
{
    "some_silly_date": "@addToTime('1970-01-01T00:00:00Z', int(substring(item()?['some_silly_date'],6,10)), 'Second')"
}
As an added bonus, if the timestamp value is null in the source JSON, do the following:
{
    "some_silly_date": "@if(equals(item()?['some_silly_date'], null), null, addToTime('1970-01-01T00:00:00Z', int(substring(item()?['some_silly_date'],6,10)), 'Second'))"
}
Quite possibly the ugliest code I've ever written.
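For comparison, the same trim-and-convert logic is straightforward outside of Logic Apps. Here is a Python sketch (the wcf_date_to_iso helper is a name of my own invention):

from datetime import datetime, timezone

def wcf_date_to_iso(value):
    # Strip the '/Date(' prefix and ')/' suffix, then convert
    # milliseconds since the Unix epoch to an ISO 8601 UTC string.
    if value is None:
        return None
    millis = int(value[6:-2])
    return datetime.fromtimestamp(millis / 1000, tz=timezone.utc).strftime('%Y-%m-%dT%H:%M:%SZ')

print(wcf_date_to_iso('/Date(1616371200000)/'))  # 2021-03-22T00:00:00Z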

Flattening Express data to a JSON string in LabVIEW

I am working in LabVIEW. I want to flatten the Express data type coming out of my DAQ Assistant into a JSON string. I am using JKI JSON, but it reports an error about an unsupported data type: Express data. Are there any suggestions?
If the JSON VI does not know how to interpret the data from the Express VI, it can't convert it into JSON. For example, LabVIEW's native JSON VIs cannot encode or decode timestamps, since JSON does not have a timestamp data type. An additional convention on how to store timestamps would be necessary, such as seconds since 1970 or a string in ISO time format.
It is even possible that the data wire just contains some references, and storing a reference gives you nothing.
If you convert the Express data to a more basic data type like a waveform, the JSON VI should be able to encode it.
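For reference, those two timestamp conventions look like this in Python (illustrative value):

from datetime import datetime, timezone

ts = datetime(2021, 3, 22, tzinfo=timezone.utc)
as_epoch = ts.timestamp()   # 1616371200.0 -- seconds since 1970
as_iso = ts.isoformat()     # '2021-03-22T00:00:00+00:00' -- ISO 8601 string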

Date field transformation from an AWS Glue table to a Redshift Spectrum external table

I am trying to transform a JSON dataset from S3, described by a Glue table schema, into a Redshift Spectrum external table for data analysis. While creating the external tables, how do I transform the DATE fields?
Note that the source data comes from MongoDB in ISODate format. Here is the Glue table format:
struct $date:string
I tried the following formats within the external table:
startDate:struct<$date:varchar(40)>
startDate:struct<date:varchar(40)>
startDate:struct<date:timestamp>
Is there a workaround within Redshift Spectrum or Glue to handle ISODate formats? Or is the recommendation to go back to the source and convert the ISODate format?
Assuming you are using Python in Glue, and assuming Python understands your field as a date, you could do something like:
from pyspark.sql.functions import date_format
from awsglue.dynamicframe import DynamicFrame
from awsglue.context import GlueContext

def out_date_format(to_format):
    """Formats the passed date column into MM/dd/yyyy format."""
    return date_format(to_format, "MM/dd/yyyy")

# If you have a DynamicFrame, you will need to convert it to a DataFrame first:
# dataframe = dynamic_frame.toDF()
dataframe = dataframe.withColumn("new_column_name", out_date_format("your_old_date_column_name"))

# Assuming you are outputting via Glue, convert the DataFrame back into a DynamicFrame:
# glue_context = GlueContext(spark_context)
# final = DynamicFrame.fromDF(dataframe, glue_context, "final")
Depending on how you are getting the data, there may be other options to use mapping or formatting.
If Python doesn't understand your field as a date object, you will need to parse it first. Since the parsing has to run per row, wrap it in a UDF, something like:

import dateutil.parser
from pyspark.sql.functions import udf, date_format
from pyspark.sql.types import TimestampType

# Parse each string value into a timestamp; runs row by row as a UDF.
parse_date = udf(lambda value: dateutil.parser.parse(value), TimestampType())

# and the conversion would change to:
def out_date_format(to_format):
    """Formats the passed date column into MM/dd/yyyy format."""
    return date_format(parse_date(to_format), "MM/dd/yyyy")
Note that if dateutil isn't built into Glue, you will need to add it to your job parameters with syntax like:
"--additional-python-modules" = "python-dateutil==2.8.1"

Custom Formatting of JSON output using Spark

I have a dataset with a bunch of BigDecimal values. I would like to output these records to a JSON file, but when I do, the BigDecimal values are often written with trailing zeros (123.4000000000000), and the spec we must conform to does not allow this (for reasons I don't understand).
I am trying to see if there is a way to override how the data is printed to JSON.
Currently, my best idea is to convert each record to a string using Jackson and then write the data using df.write().text(..) rather than JSON.
I suggest converting the Decimal type to String before writing to JSON.
The code below is in Scala, but you can adapt it to Java easily.
import org.apache.spark.sql.types.StringType

// COLUMN_NAME is your DataFrame column name.
val new_df = df.withColumn("COLUMN_NAME_TMP", df("COLUMN_NAME").cast(StringType))
  .drop("COLUMN_NAME")
  .withColumnRenamed("COLUMN_NAME_TMP", "COLUMN_NAME")
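For PySpark users, a sketch of the equivalent conversion; note that withColumn with an existing column name replaces that column in place, so no temporary column is needed:

new_df = df.withColumn("COLUMN_NAME", df["COLUMN_NAME"].cast("string"))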

Change the data type of a JSON string to Datetime in Spark

I am feeding JSON files to Spark. One value in them is of Datetime type, but it is being converted to string type. I found a solution that said to rebuild Spark after changing its InferSchema.scala file, but I don't want to do that. Is there any way I can convert the value while reading the JSON files? Alternatively, can I convert it using Spark SQL after "jsonFiles.registerTempTable('jsonFiles')"? Any help in this regard will be greatly appreciated.
With the jsonFile function you can also specify the schema at read time, so:
sqlContext.jsonFile(path, schema), or in the new API (post 1.4), sqlContext.read.schema(schema).format("json").load(path)
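A minimal PySpark sketch of the schema-at-read-time approach (the field names here are placeholders; match them to your JSON):

from pyspark.sql.types import StructType, StructField, TimestampType, StringType

# Declaring the field as TimestampType makes Spark parse the string while reading.
schema = StructType([
    StructField("eventDate", TimestampType(), True),
    StructField("name", StringType(), True),
])

df = sqlContext.read.schema(schema).format("json").load(path)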