How to cast a JSON output field to date type

I used a wrapper between MongoDB and PostgreSQL.
Now I have my tables and my data, and I can access them.
When I list my table, I see that the date field, which sits inside an object, is not of type date or timestamp.
Let me explain with a short screenshot:
when I use PostgreSQL, the object is converted to JSON and I get all the fields, but no more dates, only numbers, as in the screenshot,
and I can't alter the type of the date field.
What should I do?
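A possible approach, sketched with placeholder names (the foreign table, the json column, and the 'date' key are assumptions, and the numbers are assumed to be MongoDB's epoch milliseconds), is to extract the value from the json and convert it on the fly:

SELECT _id,
       -- 'date' key and epoch-millisecond encoding are assumptions
       to_timestamp((doc ->> 'date')::bigint / 1000.0) AS date_ts
FROM my_foreign_table;

You could also wrap this in a view so the converted timestamp is always available for queries.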

Related

SOLVED - Big Query Cannot read repeated field of type STRUCT as optional STRUCT Field

I have MongoDB data that is extracted to BigQuery using a Google Cloud Function. We use partitioned tables separated by date. There is one field (location) in the collections that changed from an object (RECORD) to an array (unexpectedly). We changed it back to an object (RECORD), and now there is an error when extracting this field. The error message is:
Cannot read repeated field of type STRUCT as optional STRUCT Field: location
I thought it was because some partitions still held the array data type, but after we backfilled all the partitions the error stayed the same. Any ideas?
UPDATE: Some of the partition tables were still ARRAY, so they had to be changed to RECORD.
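If the tables are date-sharded (one table per day), a sketch like the following can list the shards whose location column still has the ARRAY type, and show one way to re-emit the field as a single STRUCT when backfilling; the dataset and table names below are placeholders:

-- find shards where location is still ARRAY<STRUCT<...>>
SELECT table_name, data_type
FROM mydataset.INFORMATION_SCHEMA.COLUMNS
WHERE column_name = 'location'
  AND data_type LIKE 'ARRAY%';

-- for an affected shard, keep only the first array element so that
-- location comes out as a single STRUCT (RECORD)
SELECT * REPLACE (location[SAFE_OFFSET(0)] AS location)
FROM mydataset.events_20240101;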

Inserting a nifi record with date type field into PutMongo

I'm trying to add a date field to a JSON record and insert it into Mongo using the PutMongo processor (I succeeded in doing so with PutMongoRecord, but when I use PutMongo the field type is string and not date, because there is no JsonTreeReader involved)... I need PutMongo to insert this field as a date so that I can put a TTL index on it...
So far I have used JoltTransform and some variations of UpdateRecord, but I only managed to convert the date to a timestamp, which doesn't help me...
Is there a way to convert a string or a timestamp to a date field and insert it into Mongo as a date field with the PutMongo processor?
Thanks in advance :)
I would recommend you stick with PutMongoRecord. You haven't said why you prefer PutMongo, but if you need to add some fields to enable that TTL index, it's as simple as adding some optional attributes to your schema and calling UpdateRecord to populate them before doing the put.

Tableau Timestamp String to Date

I have a timestamp in my BigQuery table that looks like this: 2017.09.25 10:22:19
I want to convert this string to a date dimension. I tried the dropdown menu, calculated fields like DATETIME, DATEPARSE, DATE, ..., and a calculated field where I trimmed the string and took only parts of the date, but nothing works. I always get the error that Google BigQuery couldn't compile my task: "Invalid date: '2017.07.03 10:52:16'"
Does anyone have an idea how to solve this?
Regards
Date parts need to be separated with dashes, not dots, in order for the cast to work. For example,
'2017-09-25 10:22:19'
As a string, this is valid to cast both to a DATETIME and a TIMESTAMP type. If you want to convert your original string to one of these types, however, you can use PARSE_DATETIME, or similarly PARSE_TIMESTAMP:
SELECT
PARSE_DATETIME('%Y.%m.%d %T', timestamp_string) AS datetime
FROM YourTable;
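If a time zone needs to be attached, the same pattern works with PARSE_TIMESTAMP; the time zone argument is optional and the value below is only an example:

SELECT
  PARSE_TIMESTAMP('%Y.%m.%d %T', timestamp_string, 'UTC') AS ts
FROM YourTable;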

How to separate a JSON field in Postgres and get its fields

I'm working with MongoDB, and I used a Mongo/Postgres wrapper.
Now I can find my tables and data.
I want to do some statistics, but I can't reach the objects that have the json type in Postgres.
My problem is that I get the whole object as json, but I need to separate out its fields.
I used this :
CREATE FOREIGN TABLE rents( _id NAME, status text, "from" json )
SERVER mongo_server
OPTIONS (database 'tr', collection 'rents');
The field "from" is an object.
I found something similar and tried it, but nothing happened.
The error (why a screenshot??) means that the data are not valid JSON.
As a first step, you could define the column as type text instead of json. Then querying the foreign table will probably work, and you can see what is actually returned and why PostgreSQL thinks that this is not valid JSON.
Maybe you can create a view on top of the foreign table that converts the value to valid JSON for further processing.
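For example, a sketch along those lines, assuming the value turns out to be valid JSON once inspected (the key name inside the object is a placeholder, since the real structure isn't shown):

-- first step: declare the object column as text instead of json
CREATE FOREIGN TABLE rents (_id name, status text, "from" text)
SERVER mongo_server
OPTIONS (database 'tr', collection 'rents');

-- then expose the individual fields through a view
CREATE VIEW rents_v AS
SELECT _id,
       status,
       ("from"::json) ->> 'some_key' AS some_key  -- extract one field of the object
FROM rents;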

SSIS cannot convert text to date

I am trying to convert a text field in an XML file with a value of 2014-04-01T00:00:00-04:00 to a date via a data conversion component.
The full error message is
[Data Conversion [2]] Error: Data conversion failed while converting column "BKG_DATE" (417) to column "Converted_BKG_DATE" (26). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
The conversion is from a DT_WSTR to a database time [DT_DBTIME].
Please note I only need the date value from the XML field.
Can anyone help me resolve this?
Instead of the Data Conversion component, try the Derived Column transformation, and derive the new column from an expression that builds the date by getting the appropriate substrings from the text column.
If you need only the date value, you need to convert to DT_DBDATE, which is a date structure that consists of year, month, and day. DT_DBTIME is a time structure that consists of hour, minute, and second, and so is unsuitable.
See MSDN > Integration Services Data Types: https://msdn.microsoft.com/en-gb/library/ms141036.aspx
To convert from date-time-with-offset values in this format 2014-04-01T00:00:00-04:00 to DT_DBDATE, you can use a Derived Column transformation with the following expression. In this example, the incoming values are in a column called RawDateTime.
(DT_DBDATE)LEFT(RawDateTime,10)
We can test it using a data viewer.
In my case, the date came in this format:
2021-03-31T16:26:26Z
I found that the only date type that natively supported that format was DT_DBTIMESTAMPOFFSET. This worked directly in my input file configuration; I did not require a Derived Column transform.