Report Builder 3 - Number Format - reporting-services

From an application DB (SQL Server 2012) I run queries to get some data. I have 3 fields whose data type is numeric; the problem is the format of these numbers:
At the end of each number there is an 'E+15', but in my application I can see the same data in the correct format.
Any ideas what 'E+15' means and how I can get this data in the correct format?
@Juan, this is the result of STR().

CONVERT or CAST doesn't work in this case; you will still get scientific notation.
For display, use STR():
STR(MinValue), STR(MaxValue)
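A minimal T-SQL sketch of that suggestion, assuming MinValue and MaxValue are float columns in a hypothetical table dbo.Measurements. STR takes an optional total length and decimal count; the default length of 10 is too short for a 16-digit value, so a wider length is passed explicitly:
-- STR(value, length, decimals): the length must cover every digit,
-- otherwise STR returns asterisks; 0 decimals avoids the E+15 notation
SELECT
    STR(MinValue, 20, 0) AS MinValueText,
    STR(MaxValue, 20, 0) AS MaxValueText
FROM dbo.Measurements;  -- hypothetical table name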

Related

How to format an esriFieldTypeDate parameter in a JSON URL query to an ESRI REST server

I'm trying to query this API, filtering by the RptDt field, which is type esriFieldTypeDate.
The basic query looks like this:
https://dhsgis.wi.gov/server/rest/services/DHS_COVID19/COVID19_WI_V2/MapServer/11/query?where=1%3D1&outFields=*&outSR=4326&f=json
It's easy to filter a numeric variable like POS_CUM_SUM like so:
"https://dhsgis.wi.gov/server/rest/services/DHS_COVID19/COVID19_WI_V2/MapServer/11/query?where=%20(POS_CUM_CP%20%3D%200%20OR%20POS_CUM_CP%20%3D%2010)%20&outFields=*&outSR=4326&f=json"
I can't figure out how to format the minimum and maximum arguments for the date field RptDt. The attributes of RptDt are formatted as Unix timestamps (1642255200000, 1642428000000), but querying with those values returns error code 400.
https://dhsgis.wi.gov/server/rest/services/DHS_COVID19/COVID19_WI_V2/MapServer/11/query?where=%20(RptDt%20%3D%20'1642255200000'%20OR%20RptDt%20%3D%20'1642428000000')%20&outFields=*&outSR=4326&f=json
Then I noticed that the field length for RptDt is 8, so I tried truncating the Unix timestamp to 8 digits (16422552, 16424280), but that also gives error code 400.
I tried using the YYYYMMDD format (20210101 to 20211231); this didn't give an error, but the response has no features, even though most of the dates do fall within this period.
https://dhsgis.wi.gov/server/rest/services/DHS_COVID19/COVID19_WI_V2/MapServer/11/query?where=%20(RptDt%20%3D%20%2720210101%27%20OR%20RptDt%20%3D%20%2720211231%27)%20&outFields=*&outSR=4326&f=json
I haven't been able to find a solution in the ArcGIS REST APIs documentation. Does anyone understand what's missing from my queries?
The correct format is 'YYYY-MM-DD'. This query works. The relevant bit is where=RptDt>'2022-01-01'.
https://dhsgis.wi.gov/server/rest/services/DHS_COVID19/COVID19_WI_V2/MapServer/11/query?where=RptDt>'2022-01-01'&outFields=*&outSR=4326&f=json
Thanks to JamieKelly1 at the Esri Community Forum for answering this question.
For anyone else stumbling upon this after getting error -2147220985 (which, by the way, means a bad SQL query): the full URL I used had to contain Date before the actual date, like Date'2015-02-02', not just '2015-02-02'. Also, you cannot use an epoch timestamp in milliseconds in the query builder (at least not for Open Data DC).
Example url:
https://maps2.dcgis.dc.gov/dcgis/rest/services/DCGIS_DATA/ServiceRequests/MapServer/6/query?where=%20(ADDDATE%20%3D%20Date'2015-02-02'%20OR%20ADDDATE%20%3D%20Date'2015-02-03')%20&outFields=*&outSR=4326&f=json
To search between two dates:
https://maps2.dcgis.dc.gov/dcgis/rest/services/DCGIS_DATA/ServiceRequests/MapServer/6/query?where=ADDDATE BETWEEN DATE '2015-01-09 00:00:00' AND DATE '2015-01-09 12:00:00'&outFields=*&outSR=4326&f=json
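Decoded, the where parameters in the URLs above are plain SQL predicates. For reference, here they are before URL encoding (field names taken from the examples above):
where=RptDt > '2022-01-01'
where=ADDDATE = Date'2015-02-02' OR ADDDATE = Date'2015-02-03'
where=ADDDATE BETWEEN DATE '2015-01-09 00:00:00' AND DATE '2015-01-09 12:00:00'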

How to convert a date from a CSV file into an integer

I have to send data from a CSV file into a SQL DB.
The problem starts when I try to convert the data into Int. It wasn't my idea and I really can't do much about this datatype. When I try, this problem pops up:
Data Conversion 2: Data conversion failed while converting column
"pr_czas" (387) to column "C pr_dCz_id" (14). The conversion returned
status value 2 and status text "The value could not be converted
because of a potential loss of data.".
I already tried to ignore this problem, but then other problems came up, so there is no way around solving it.
I have to convert this data from the CSV file, which is str 50, into int 4.
It must be int4; that's one of the requirements. I don't know what to do.
This is the data I'm trying to put into int4; look at pr_czas.
This is the data's datatype.
Earlier I tried to do the same thing with just DD.MM.YYYY, but I got the same result...
Given an input column named [pr_czas] that contains string values like 31.01.2020 00:00, which appears to be a date time formatted as "DD.MM.YYYY HH:MM", the goal is to express that as the whole number DDMMYYYYHHMM.
Add a derived column to your data flow and call it new_pr_czas.
The logic I'm going to use is a series of REPLACE calls, casting the final result to an integer: replace the period, the colon and the space, all with nothing.
(DT_I8)REPLACE(REPLACE(REPLACE([pr_czas], ".", ""), ":", ""), " ", "")
This is an easy case, but there are things to note.
An integer/int32/I4 has a maximum value of about 2 billion.
310120200000 is too large to fit into that space, so you need to make it a bigint/int64/I8. If I remember your previous question, you were having trouble with a lookup task, so this data type mismatch might hurt you there.
The other thing to be aware of is that leading zeros are dropped when converting to a number, because they are not significant. If you need to retain leading zeros, then you're working with a string data type. This is an advantage of working with the ISO standard, but if your data expects DD first, then far be it from me to say otherwise.
If you need to slice your date into another format, you'll want a few derived columns. The first one generates a string column for each piece of pr_czas - year, month, day, hour and minute. You'll use SUBSTRING for this, and FINDSTRING to locate the periods, the space and the colon.
The next derived column puts those string pieces back together in the new format and casts the result to I8. Why two steps? Because you can't debug doing it all in one shot, but you can put a data viewer between two derived columns to figure out where a slice went awry. A sketch follows below.
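A hedged sketch of those two derived columns, assuming the fixed-width DD.MM.YYYY HH:MM layout shown above (with fixed positions, FINDSTRING is only needed if the values are not zero-padded; the pr_ column names are placeholders). SSIS SUBSTRING positions are 1-based:
pr_Day    : SUBSTRING(pr_czas, 1, 2)
pr_Month  : SUBSTRING(pr_czas, 4, 2)
pr_Year   : SUBSTRING(pr_czas, 7, 4)
pr_Hour   : SUBSTRING(pr_czas, 12, 2)
pr_Minute : SUBSTRING(pr_czas, 15, 2)
Then a second derived column reassembles the pieces as YYYYMMDDHHMM and casts:
(DT_I8)(pr_Year + pr_Month + pr_Day + pr_Hour + pr_Minute)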

Ensuring the same date format in JSON objects

Setup: Angular 8 + Spring Boot 2.2.3 + (Oracle 12.1 / PostgreSQL 12.3)
We are building an approval system where a user can fill in an online form (like Google Forms) and submit it for approval. Rather than normalizing the form structure, we'll be storing the metadata in JSON format in our DB.
The values filled into the form would also go into the DB as JSON.
One concern has come up: we want to store dates in the DB in a particular format, like 12-May-2020, that is consistent across all inserted data, as this data might be used to construct reports in the near future.
Based on the pros/cons of this approach, we need to decide on the DB / data model as well.
So:
Is there any way I can enforce a date format in JSON?
If this cannot be done in JSON, what options do I have at the Angular 8 / Spring Boot application level to force all developers / date components / date fields to use the same date format?
If neither can be done, how can I handle different formats when querying the JSON data for reporting or otherwise, in both Oracle and PostgreSQL?
The proper solution to your problem is to create a real, normalized date column.
If for some reason you can't or don't want to do that, I would create a check constraint that validates the date format by trying to cast it to a real date value.
The following is for Postgres, but you can create something similar for Oracle as well:
create table the_table
(
  form_data jsonb,
  constraint check_valid_date check ( (form_data ->> 'entry_date')::date is not null )
);
Obviously you will need to adjust the expression that gets the date value from the JSON to match the key and path inside your json value.
The cast to date will require that the date value is entered in the ISO standard format, yyyy-mm-dd, which is the only "consistent" way to store a date as a string.
Alternatively you can use to_date() with a format mask:
check ( to_date(form_data ->> 'entry_date', 'yyyy-mm-dd') is not null )
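A quick sanity check of the constraint with hypothetical values; note that an unparseable string makes the cast itself raise an error, which rejects the row before the constraint result is even evaluated:
-- accepted: valid ISO date
insert into the_table values ('{"entry_date": "2020-05-12"}');
-- rejected: the cast to date fails for month 13
insert into the_table values ('{"entry_date": "2020-13-01"}');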
in DB we can store date in particular format like 12-May-2020
You are mistaken: Oracle doesn't store dates in that format. A DATE is stored internally in the TYPE12/13 data type, where each byte represents a different part of the date. What you see is a human-readable rendering, displayed according to your locale-specific NLS settings or produced by TO_CHAR with a format mask.
To keep it aligned across all platforms, use the globally accepted ANSI standard date literal, which uses the fixed format 'YYYY-MM-DD'.
For example:
DATE '2020-05-21'
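To illustrate that storage and presentation are separate concerns, a small sketch that can be run against DUAL (the 'Mon' abbreviation assumes an English NLS date language):
-- the literal is unambiguous; TO_CHAR only controls how it is displayed
SELECT TO_CHAR(DATE '2020-05-21', 'DD-Mon-YYYY') AS display_value
FROM dual;
-- display_value: 21-May-2020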

Map a string date value from a CSV to a column in a MySQL database using Informatica Cloud

I am doing a data integration project to create a data warehouse in MySQL. I have a couple of CSV/flat files to be ingested into Informatica Cloud, which will populate the destination table in a MySQL database (the destination table is already created).
The CSV files have columns related to datetime, but they are in string format (MM/DD/YYYY HH:MM), and I am not able to pass them through Informatica Cloud to the MySQL database tables. The column in the MySQL database has the format YYYY-MM-DD HH:MM:SS, as its data type is datetime.
I tried different strategies, keeping the destination column's datatype as datetime and sometimes as varchar.
I got the following errors on different attempts with different changes.
Transformation [Expr_DSS_0010LY0I000000000002_1] had an error evaluating output column [SystemModstamp_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date... t:TO_DATE(u:TO_CHAR(t:TO_DATE(u:'5/30/2013 12:26',u:'mm/dd/yyyy hh:mi'),u:'yyyy-mm-dd hh:mm'),u:'MM/DD/YYYY HH24:MI:SS')].
Transformation [Expr_DSS_0010LY0I000000000002_1] had an error evaluating output column [CreatedDate_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date... t:TO_DATE(u:'6/24/2008 18:23',u:'mm/dd/yyyy hh24:mi:ss')].
Transformation [Expr_DSS_0010LY0I000000000002_1] had an error evaluating output column [CreatedDate_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date... t:TO_DATE(u:TO_CHAR(t:TO_DATE(u:'6/24/2008 18:23',u:'mm/dd/yyyy hh24:mi'),u:'YYYY-MM-DD hh:mm:ss'),u:'MM/DD/YYYY HH24:MI')].
I also consulted the Informatica Cloud documentation (CSV datetime) but didn't find any concrete solution.
I will be really glad if someone can help.
Thanks in advance.
A single TO_DATE should work:
TO_DATE(input_date_field,'mm/dd/yyyy hh24:mi')
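Applied to the output ports named in the error messages above, the expressions would look something like this (the input port names are assumptions based on the _OUT names; a sketch, not the exact mapping):
SystemModstamp_OUT: TO_DATE(SystemModstamp, 'mm/dd/yyyy hh24:mi')
CreatedDate_OUT:    TO_DATE(CreatedDate, 'mm/dd/yyyy hh24:mi')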
I just finished solving exactly the same issue myself.
There is no good way to do this; I will explain what I did.
Basically, I treated the date columns as strings in PowerCenter, i.e. I changed the target definition (only in PC) to varchar for all the date columns, since the Source Qualifier expects all of its attributes to be strings.
The physical definition is still datetime in the MySQL DB.
For me this worked; I hope it works for you too. Good luck!

SSIS how to convert string (DT_STR) to money (DT_CY) when source has more than 2 decimals

I have a source flat file with values such as 24.209991, but they need to load to SQL Server as type money. In the DTS (which I am converting from), that value comes across as 24.21. How do I convert that field in SSIS?
Right now, I am just changing the type from DT_STR to DT_CY, and it gives a runtime error of 'Data conversion failed. The data conversion for column "Col003" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".'
Do I use a Data Conversion task? And then what?
I've also tried setting the source output column to DT_NUMERIC, and then convert that to DT_CY, with the same result.
I've also tried using Derived Columns, casting the DT_STR field Col003 to (DT_NUMERIC,10,2)Col003 and then casting that to (DT_CY)Col003_Numeric. That's getting a cast error.
The flat file connection defaults to all fields being DT_STR. Use the Advanced page when editing the connection to set the numeric field to float (DT_R4). Then, in the advanced editor of the Flat File Source (on the Data Flow tab), set that output column to money (DT_CY).
The field will then convert without any additional conversions; the issue was leaving the source file definition as DT_STR.
If you don't have any null values, use Data Conversion, and make sure you don't have any funny characters (e.g. US$200 produces an error).
If you have null or empty fields and you are using a Flat File source, make sure that you tick "Retain null values from the source...".
Another trick I have used is something like (taxvalue != "" ? taxvalue : NULL(DT_WSTR,50)) in a Derived Column transformation (you can just replace the field).
Generally, SSIS doesn't convert empty strings to money properly.
For some reason in my scenario, the OLE DB Destination actually was configured to accept DT_CY. However, casting to this format (no matter the length of the input and destination data, and no matter whether the data was NULL when it arrived) always caused the same issue.
After adding data viewers, I can conclude that this has something to do with the locale. Here in Denmark, we use a comma (,) as the decimal delimiter and dots (.) as thousands delimiters, instead of the opposite.
This means that a huge number like 382,939,291,293.38 would (after the conversion to DT_CY) look like 382.939.291.293,38. Even though I highly doubted it could be the issue, I decided to do the opposite of what I had originally intended.
I went to the advanced settings of my OLE DB Destination and changed the DT_CY column's type to DT_STR instead. Then I added a Derived Column transformation and entered the following expression to transform the column before the data arrived at the destination:
REPLACE(SUBSTRING(Price, 2, 18), ",", ".") where Price was the column's name.
To my big surprise, this solved the problem: my OLE DB Destination was now sending the data as a string, which SQL Server understood perfectly fine.
I am certain that this is a bug! I was using SQL Server 2008, so it might have been solved in later editions. However, I find it quite critical that such an essential thing does not work correctly!