Null rows in output of query - postgresql-8.4

System:
Windows XP professional
PostgreSQL 8.4 running on a locally stored database (port 5432)
I am using the following code to import a decent-sized file into a table in my database:
-- Create the table: header names and column types
CREATE TABLE special_raw_emissions
(
ORG_ID int ,
COUNTY text ,
MID_UP_STREAM text ,
SITE_ID int ,
PRF_ID int ,
RN text ,
ACCOUNT text ,
CERTIFYING_COMPANY_ORGANIZATION text ,
LEASE_NAME text ,
LEASE_NUMBER text ,
SOURCE text ,
FACILITY_ID int ,
PROFILE text ,
LATITUDE float ,
LONGITUDE float ,
OIL_bbl_yr float ,
CASINGHEAD_GAS_scf_yr float ,
GAS_WELL_GAS_scf_yr float ,
CONDENSATE_bbl_yr float ,
PRODUCED_WATER_bbl_yr float ,
TOTAL_VOC_EMISSION_tpy_EXTRACTED_FROM_SE_TAB float ,
CONTROL_PRESENT boolean ,
CONTROL_TYPE text ,
CONTROL_TYPE_IF_OTHER_DESCRIBE text ,
NOX_CONTROL_EFFICIENCY_PCNT float ,
VOC_CONTROL_EFFICIENCY_PCNT float ,
VENTED_VOLUME_scf_yr float ,
BLOWDOWN_EVENTS int ,
OPERATING_HOURS_hrs_yr float ,
FUEL_CONSUMPTION_MMscf_yr float ,
PILOT_GAS_USED_MMscf_yr float ,
WASTE_GAS_COMBUSTED_MMscf_yr float ,
GAS_TREATED_MMscf_yr float ,
AVERAGE_DAILY_PRODUCTION_RATE_MMscf_day float ,
THROUGHPUT_bbl_yr float ,
SEPARATOR_PRESSURE_psig float ,
SEPARATOR_TEMPERATURE_deg_F float ,
GAS_GRAVITY float ,
MAXIMUM_DAILY_PRODUCTION_bbl_day text ,
SOURCE_ANNUAL_THROUGHPUT_bbl_yr float ,
ANNUAL_THROUGHPUT_bbl_yr float ,
MAXIMUM_DAILY_PRODUCTION_RATE__bbl_day float ,
SERIAL_NUMBER text ,
MAKE text ,
MODEL text ,
FUEL_TYPE text ,
MAXIMUM_DESIGN_CAPACITY text ,
BURN_TYPE text ,
CYCLE text ,
ENGINE_RATING text ,
ASSIST_TYPE text ,
AUTOMATIC_AIR_TO_FUEL_RATIO_CONTROLLER boolean ,
DESTRUCTION_EFFICIENCY text ,
SUBJECT_TO_MACT boolean ,
IF_YES_INDICATE_MAJOR_OR_AREA_SOURCE text ,
SOURCE_TYPE text ,
IF_CONDENSER_WHAT_IS_EFFICIENCY text ,
LIQUID_TYPE text ,
IS_HARC_51C_ACCEPTED_METHOD text ,
WOULD_YOU_LIKE_TO_USE_HARC text ,
SINGLE_OR_MULTIPLE_TANKS text ,
NUMBER_OF_TANKS int ,
CONFIGURATION_TYPE text ,
WORKING_AND_BREATHING_EMISS_CALC_METHOD text ,
FLASH_EMISS_CAL_METHOD text ,
FLASH_IF_OTHER_PLEASE_DESCRIBE text ,
IS_MONITORING_PROGRAM_VOLUNTARY int ,
AIR_ACTUATED_PNEUMATIC_VALVES_GAS int ,
AIR_ACTUATED_PNEUMATIC_VALVES_LIGHT_OIL int ,
CONNECTORS_GAS int ,
CONNECTORS_LIGHT_OIL int ,
FLANGES_GAS int ,
FLANGES_LIGHT_OIL int ,
GAS_ACTUATED_PNEUMATIC_VALVES_GAS int ,
GAS_ACTUATED_PNEUMATIC_VALVES_LIGHT_OIL int ,
IS_COMPLETION_OPTIONAL text ,
NONACTUATED_VALVES_GAS int ,
NONACTUATED_VALVES_LIGHT_OIL int ,
OPEN_ENDED_LINES_GAS int ,
OPEN_ENDED_LINES_LIGHT_OIL int ,
OTHER_GAS int ,
OTHER_LIGHT_OIL int ,
PUMP_SEALS_GAS int ,
PUMP_SEALS_LIGHT_OIL int ,
TOTAL_COMPONENTS int ,
TOTAL_PUMPS_AND_COMPRESSOR_SEALS text ,
TOTAL_UNCONTROLLED_RELIEF_VALVES text ,
GAS_ACTUATED_PNEUMATIC_VALVES_HEAVY_OIL int ,
AIR_ACTUATED_PNEUMATIC_VALVES_HEAVY_OIL int ,
NON_ACTUATED_VALVES_HEAVY_OIL int ,
PUMP_SEALS_HEAVY_OIL int ,
CONNECTORS_HEAVY_OIL int ,
FLANGES_HEAVY_OIL int ,
OPEN_ENDED_LINES_HEAVY_OIL int ,
OTHER_HEAVY_OIL int ,
GAS_ACTUATED_PNEUMATIC_VALVES_WATER_SLASH_OIL text ,
AIR_ACTUATED_PNEUMATIC_VALVES_WATER_SLASH_OIL int ,
NON_ACTUATED_VALVES_WATER_SLASH_OIL int ,
PUMP_SEALS_WATER_SLASH_OIL int ,
CONNECTORS_WATER_SLASH_OIL int ,
FLANGES_WATER_SLASH_OIL int ,
OPEN_ENDED_LINES_WATER_SLASH_OIL int ,
OTHER_WATER_SLASH_OIL text ,
VOC_Gas_Mole_Percent float ,
BENZENE_Gas_Mole_Percent float ,
ETHYBENZENE_Gas_Mole_Percent float ,
n_HEXANE_Gas_Mole_Percent float ,
TOLUENE_Gas_Mole_Percent float ,
XYLENE_S_Gas_Mole_Percent float ,
HAPs_Gas_Mole_Percent float ,
VOC_Liquid_Mole_Percent float ,
BENZENE_Liquid_Mole_Percent float ,
ETHYBENZENE_Liquid_Mole_Percent float ,
n_HEXANE_Liquid_Mole_Percent float ,
TOLUENE_Liquid_Mole_Percent float ,
XYLENE_S_Liquid_Mole_Percent float ,
HAPs_Liquid_Mole_Percent float ,
VOC_Control_Factor_PERC float ,
CH4_Emission_Factor_tonne_Btu float,
Engine_LF float ,
CO2_M1 float ,
CO2_M2 float ,
CH4_M1 float ,
CH4_M2 float ,
Source_Class text ,
Site_class text);
-- Import data into database, note that the delimiter is '~'.
COPY special_raw_emissions
FROM 'C:/PostgreSQL/special results/batch.csv'
WITH DELIMITER AS '~'
CSV;
I was running into some strange errors, so as a QA check I queried this table to see whether the data had imported correctly; the query is shown below:
\o 'c:/postgresql/special_raw_emissions.csv'
select * from special_raw_emissions;
\o
My query returns all the data that was imported, but 'null rows' are added at random. An example of a 'null row' is shown below.
Data input:
155 Wise Midstream 8250 1
155 Wise Midstream 8250 1
4 Wise Upstream 7220 1
4 Wise Upstream 7220 1
95 Wise Midstream 7742 1
95 Wise Midstream 7742 1
7 Clay Upstream 1990 7
7 Cooke Upstream 1414 7
Data with null rows (the example below suggests a pattern; that is not the case in the larger output file):
7 Clay Upstream 1990 7
7 Cooke Upstream 1414 7
7 Cooke Upstream 1415 7
7 Cooke Upstream 1416 7
7 Cooke Upstream 3355 7
7 Cooke Upstream 3356 7
7 Cooke Upstream 1418 7
7 Cooke Upstream 3357 7
7 Cooke Upstream 1419 7
7 Cooke Upstream 7489 7
These null rows cause my queries to miss certain data, so I am losing information.
Any help or guidance is greatly appreciated!

The problem was solved in two steps:
Open the raw data in Excel, save it with the appropriate delimiter ('~' in my case), and close the file.
Re-import the data into the database.
My speculation is that the raw data, which was created by another psql query, was somehow corrupted or had a line-ending character missing. Re-saving in Excel fixed the issue and allowed the import to work properly.
I still feel as though the problem is unsolved; I simply found a workaround.
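For anyone chasing the root cause, a quick diagnostic sketch (my addition, assuming ORG_ID is never legitimately null and that a stray carriage return is the usual culprit when a file has passed through Windows tools):
-- Count the spurious all-null rows COPY produced
SELECT count(*) FROM special_raw_emissions WHERE org_id IS NULL;
-- Check whether any imported text field still carries a \r,
-- which can split one logical record across two physical lines
SELECT count(*) FROM special_raw_emissions
WHERE strpos(county, E'\r') > 0 OR strpos(site_class, E'\r') > 0;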

Related

cannot pass more than 100 arguments to a function to json_build_object

I am trying to build JSON from the columns of a table, but it gives me the error "cannot pass more than 100 arguments to a function", even though I did not think my argument count exceeded 100.
code as follows:
array_agg(json_build_object
(
'QuotaName',quota_name,
'QuotaId',quota_id,
'CellId',COALESCE(cell_id,0),
'ValidPanelistCountOtherMedias',COALESCE(valid_panelist_count,0) ,
'ValidPanelistCountMM',COALESCE(mm_valid_panelist_count,0) ,
'Gender',COALESCE(replace(replace(replace(gender,',',':'),']',''),'[',''),''),
'Occupation',COALESCE(replace(replace(replace(occupation_id,',',':'),']',''),'[',''),''),
'Industry',COALESCE(replace(replace(replace(industry_id,',',':'),']',''),'[',''),''),
'Prefecture',COALESCE(replace(replace(replace(prefecture_id,',',':'),']',''),'[',''),''),
'Age1',COALESCE(replace(replace(replace(age,',',':'),']',''),'[',''),''),
'Age2',COALESCE(replace(replace(replace(age2,',',':'),']',''),'[',''),''),
'MaritalStatus',COALESCE(replace(replace(replace(marital_status,',',':'),']',''),'[',''),''),
'HouseHoldIncome',COALESCE(replace(replace(replace(house_income_id,',',':'),']',''),'[',''),''),
'PersonalIncome',COALESCE(replace(replace(replace(personal_income_id,',',':'),']',''),'[',''),''),
'hasChild',COALESCE(replace(replace(replace(has_child,',',':'),']',''),'[',''),''),
'MediaId',COALESCE(replace(replace(replace(media_id,',',':'),']',''),'[',''),''),
'DeviceUsed',COALESCE(replace(replace(replace(device_type,',',':'),']',''),'[',''),''),
'PanelistStatus','',
'IR1', COALESCE(ir_1,1) ,
'IR2', COALESCE(ir_2,1) ,
'IR3', COALESCE(ir_3,1) ,
'Population',COALESCE(population,0),
'MainSurveySampleHopes', COALESCE(sample_hope_main_survey,0) ,
'ScreeningSurveySampleHopes', COALESCE(sample_hope_main_scr,0),
'ParticipateIntentionMM' ,COALESCE(participate_intention_mm,0) ,
'ParticipateIntentionOthers' ,COALESCE(participate_intention,0) ,
'AcquisitionRate', COALESCE(acquisition_rate,0) ,
'PCEnvironment', COALESCE(case when survey_type >3 then 1 else pc_env end,0) ,
'NetworkEnvironment',COALESCE(case when survey_type >3 then 1 else network_env end,0) ,
'PCEnvironmentMM',COALESCE(case when survey_type >3 then 1 else pc_env_mm end,0),
'NetworkEnvironmentMM',COALESCE(case when survey_type >3 then 1 else network_env_mm end,0) ,
'ControlQuotient',COALESCE(control_quotient,0)/100 ,
'ResponseofSCR24' , COALESCE(res_of_scr_24,0),
'ResponseofSCR48' ,COALESCE(res_of_scr_48,0) ,
'ResponseofSCR72' ,COALESCE(res_of_scr_72,0) ,
'ResponseofSCR168' ,COALESCE(res_of_scr_168,0),
'ResponseofMAIN24' ,COALESCE(res_of_main_24,0) ,
'ResponseofMAIN48' , COALESCE(res_of_main_48,0) ,
'ResponseofMAIN72' , COALESCE(res_of_main_72,0) ,
'ResponseofMAIN168' , COALESCE(res_of_main_168,0),
'ResponseofSCR24MM' ,COALESCE(res_of_scr_24_mm,0) ,
'ResponseofSCR48MM' , COALESCE(res_of_scr_48_mm,0),
'ResponseofSCR72MM' , COALESCE(res_of_scr_72_mm,0) ,
'ResponseofSCR168MM' ,COALESCE(res_of_scr_168_mm,0) ,
'ResponseofMAIN24MM' ,COALESCE(res_of_main_24_mm,0),
'ResponseofMAIN48MM' ,COALESCE(res_of_main_48_mm,0),
'ResponseofMAIN72MM' ,COALESCE(res_of_main_72_mm,0),
'ResponseofMAIN168MM' ,COALESCE(res_of_main_168_mm,0),
'ResponseofMAINIntegrationType',0.9,-- this value is based on answer_estimate_list_details_v3
'ParticipationIntention',COALESCE(participate_intention,0),
'MostRecentParticipation',COALESCE(most_recent_exclusions,0)
)
)
I had the exact same problem earlier today. Note that each key and each value counts as a separate argument, so the 50+ key/value pairs above already exceed PostgreSQL's 100-argument limit. After some research, I found that JSONB results can be concatenated, so you should use JSONB_BUILD_OBJECT instead of JSON_BUILD_OBJECT. Split things up into multiple JSONB_BUILD_OBJECT calls, each safely under the limit, and combine them with '||'. You'll also need JSONB_AGG to aggregate the results into an array.
JSONB_AGG(
JSONB_BUILD_OBJECT (
'QuotaName',quota_name,
'QuotaId',quota_id,
'CellId',COALESCE(cell_id,0),
'ValidPanelistCountOtherMedias',COALESCE(valid_panelist_count,0) ,
'ValidPanelistCountMM',COALESCE(mm_valid_panelist_count,0) ,
'Gender',COALESCE(replace(replace(replace(gender,',',':'),']',''),'[',''),''),
'Occupation',COALESCE(replace(replace(replace(occupation_id,',',':'),']',''),'[',''),''),
'Industry',COALESCE(replace(replace(replace(industry_id,',',':'),']',''),'[',''),''),
'Prefecture',COALESCE(replace(replace(replace(prefecture_id,',',':'),']',''),'[',''),''),
'Age1',COALESCE(replace(replace(replace(age,',',':'),']',''),'[',''),''),
'Age2',COALESCE(replace(replace(replace(age2,',',':'),']',''),'[',''),''),
'MaritalStatus',COALESCE(replace(replace(replace(marital_status,',',':'),']',''),'[',''),''),
'HouseHoldIncome',COALESCE(replace(replace(replace(house_income_id,',',':'),']',''),'[',''),''),
'PersonalIncome',COALESCE(replace(replace(replace(personal_income_id,',',':'),']',''),'[',''),''),
'hasChild',COALESCE(replace(replace(replace(has_child,',',':'),']',''),'[',''),''),
'MediaId',COALESCE(replace(replace(replace(media_id,',',':'),']',''),'[',''),''),
'DeviceUsed',COALESCE(replace(replace(replace(device_type,',',':'),']',''),'[',''),''),
'PanelistStatus','',
'IR1', COALESCE(ir_1,1) ,
'IR2', COALESCE(ir_2,1) ,
'IR3', COALESCE(ir_3,1) ,
'Population',COALESCE(population,0),
'MainSurveySampleHopes', COALESCE(sample_hope_main_survey,0) ,
'ScreeningSurveySampleHopes', COALESCE(sample_hope_main_scr,0),
'ParticipateIntentionMM' ,COALESCE(participate_intention_mm,0) ,
'ParticipateIntentionOthers' ,COALESCE(participate_intention,0) ,
'AcquisitionRate', COALESCE(acquisition_rate,0) ,
'PCEnvironment', COALESCE(case when survey_type >3 then 1 else pc_env end,0) ,
'NetworkEnvironment',COALESCE(case when survey_type >3 then 1 else network_env end,0) ,
'PCEnvironmentMM',COALESCE(case when survey_type >3 then 1 else pc_env_mm end,0),
'NetworkEnvironmentMM',COALESCE(case when survey_type >3 then 1 else network_env_mm end,0) ,
'ControlQuotient',COALESCE(control_quotient,0)/100 ,
'ResponseofSCR24' , COALESCE(res_of_scr_24,0),
'ResponseofSCR48' ,COALESCE(res_of_scr_48,0) ,
'ResponseofSCR72' ,COALESCE(res_of_scr_72,0) ,
'ResponseofSCR168' ,COALESCE(res_of_scr_168,0),
'ResponseofMAIN24' ,COALESCE(res_of_main_24,0) ,
'ResponseofMAIN48' , COALESCE(res_of_main_48,0) ,
'ResponseofMAIN72' , COALESCE(res_of_main_72,0) ,
'ResponseofMAIN168' , COALESCE(res_of_main_168,0),
'ResponseofSCR24MM' ,COALESCE(res_of_scr_24_mm,0) ,
'ResponseofSCR48MM' , COALESCE(res_of_scr_48_mm,0),
'ResponseofSCR72MM' , COALESCE(res_of_scr_72_mm,0) ,
'ResponseofSCR168MM' ,COALESCE(res_of_scr_168_mm,0) ,
'ResponseofMAIN24MM' ,COALESCE(res_of_main_24_mm,0),
'ResponseofMAIN48MM' ,COALESCE(res_of_main_48_mm,0),
'ResponseofMAIN72MM' ,COALESCE(res_of_main_72_mm,0),
'ResponseofMAIN168MM' ,COALESCE(res_of_main_168_mm,0)
) ||
JSONB_BUILD_OBJECT (
'ResponseofMAINIntegrationType',0.9,-- this value is based on answer_estimate_list_details_v3
'ParticipationIntention',COALESCE(participate_intention,0),
'MostRecentParticipation',COALESCE(most_recent_exclusions,0)
)
)
I got this from documentation here - https://www.postgresql.org/docs/current/functions-json.html#FUNCTIONS-JSONB-OP-TABLE
Look for "jsonb || jsonb"

MySQL LOAD DATA - Avoid convert string to zero when integer column

I am trying to trigger an error when I load a string into an integer column with LOAD DATA.
The string value in the file ('aaa') becomes "0" in the table.
My table:
CREATE TABLE test1 (
a INT(11) DEFAULT NULL,
b INT(11) DEFAULT NULL,
c VARCHAR(45) DEFAULT NULL,
d VARCHAR(45) DEFAULT NULL
);
My loader:
LOAD DATA LOCAL INFILE 'file.txt'
INTO TABLE `test1`
FIELDS TERMINATED BY ';'
IGNORE 1 LINES (a,b,c,d)
My data file:
a;b;c;d
aaa;11;aa;z
2;bbb;bb;x
3;33;cc;w
4;44;dd;y
And the result in the table:
a b c d
-------------
0 11 aa z
2 0 bb x
3 33 cc w
4 44 dd y
You can see that "aaa" became "0", and "bbb" did too.
I would like those file records to be rejected instead.
I tried setting the SQL mode to STRICT_ALL_TABLES, but it had no effect:
set sql_mode = STRICT_ALL_TABLES;
Thank you!
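No answer was recorded here, but one likely explanation (my note, worth verifying against the MySQL manual): with LOAD DATA LOCAL INFILE, data-conversion errors are downgraded to warnings and the load continues even in strict mode, because the server cannot abort a client-side file transfer midway. Loading a server-side file (dropping LOCAL) lets STRICT_ALL_TABLES reject the bad rows. Either way, the conversions show up as warnings:
SHOW WARNINGS;
-- e.g. Warning | 1366 | Incorrect integer value: 'aaa' for column 'a' at row 1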

How to make field act like a table

Sorry, but I couldn't find/search for the specific "thing", so here goes.
I have these tables:
users( id , name , kills , ... );
quests( id , name , description , enemy_id , n_to_kill , en_killed , user_id );
What I want to achieve is for every user to have his own separate quests table.
I don't understand how that would be possible, but I think it should be(?)
example:
users:{ 1 , admin , 7 } { 2 , user , 11 } { 3 , john , 0 }
quests:{ 1 , "killall" , "kill all" , 0 , 100 , 0 , ??}
The goal is to save, for each user, how many quests he has finished (killed x out of y).
Is it even possible?
Thanks for reading...
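For reference, the usual way to model this is not one quests table per user but a junction table keyed by (user, quest), so each user gets his own progress row while quests keeps only the static definitions. A sketch under that assumption (table and column names are mine, not from the thread):
CREATE TABLE user_quests (
user_id INT NOT NULL,
quest_id INT NOT NULL,
en_killed INT NOT NULL DEFAULT 0, -- this user's kill count for this quest
finished TINYINT(1) NOT NULL DEFAULT 0,
PRIMARY KEY (user_id, quest_id),
FOREIGN KEY (user_id) REFERENCES users(id),
FOREIGN KEY (quest_id) REFERENCES quests(id)
);
-- "killed x out of y" for one user:
SELECT q.name, uq.en_killed, q.n_to_kill
FROM user_quests uq JOIN quests q ON q.id = uq.quest_id
WHERE uq.user_id = 2;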

Smart SQL group by

I have a SQL table with: name, location, volume.
Name is of type string.
Location is two fields of type float (lat and lng).
Volume is of type int.
I want to run a SQL query that groups all the locations within a certain range and sums their volumes.
For instance, group all the locations from 1.001 to 2 degrees lat and 1.001 to 2 degrees lng into one row with their volumes summed, then 2.001 to 3 degrees lat and lng, and so on.
In short, I want to sum all the volumes over a geographical area whose size I can decide.
I do not care about the name; I only need the location (which could be any of the grouped ones, or an average) and the volume sum.
Here is a sample table:
CREATE TABLE IF NOT EXISTS `example` (
`name` varchar(12) NOT NULL,
`lat` float NOT NULL,
`lng` float NOT NULL,
`volume` int(11) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
INSERT INTO `example` (`name`, `lat`, `lng`, `volume`) VALUES
("one", 1.005, 1.007, 2),
("two", 1.25, 1.907, 3),
("three", 2.065, 65.007, 2),
("four", 2.905, 65.1, 10),
("five", 12.3, 43.8, 5),
("six", 12.35, 43.2, 2);
For which the query result for an area of size one degree could be:
1.005, 1.007, 5
2.065, 65.007, 12
12.3, 43.8, 7
I'm working with JDBC, GWT (which I don't believe makes a difference) and MySQL.
If you are content with decimal points, then use round() or truncate():
select truncate(latitude, 0) as lat0, truncate(longitude, 0) as long0, sum(volume)
from t
group by truncate(latitude, 0), truncate(longitude, 0)
A more general solution defines two variables for the precision:
set @LatPrecision = 0.25, @LongPrecision = 0.25;
select floor(latitude/@LatPrecision)*@LatPrecision,
floor(longitude/@LongPrecision)*@LongPrecision,
sum(volume)
from t
group by floor(latitude/@LatPrecision),
floor(longitude/@LongPrecision)
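Applied to the sample example table with one-degree buckets (my own check; MIN() stands in for "any of the grouped" coordinates, so the lat and lng may come from different rows in the bucket):
set @LatPrecision = 1.0, @LngPrecision = 1.0;
select min(lat) as lat, min(lng) as lng, sum(volume) as volume
from example
group by floor(lat/@LatPrecision), floor(lng/@LngPrecision);
-- 1.005, 1.007, 5
-- 2.065, 65.007, 12
-- 12.3, 43.2, 7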
Convert the latitude from float to int and then group by the converted value. When the float is converted, say from 2.1 or 2.7, it becomes 2 (the cast truncates), so all values from 2.000 to 2.999 share the converted value 2. I am from SQL Server, hence the SQL below is based on SQL Server.
select cast(l1.latitude as int), cast(l2.latitude as int), sum(v.volume)
from location l1
join location l2 on cast(l1.latitude as int) = cast(l2.longitude as int)
join volume v
group by cast(l1.latitude as int), cast(l2.latitude as int)
Maybe I am super late sending this answer:
sqlfiddle demo
Code:
select round(x.lat,4), round(x.lng,4),
sum(x.volume)
from (
select
case when lat >= 1.00 and lng <2
then 'loc1' end loc1,
case when lat >= 2.00 and lng <3
then 'loc2' end loc2,
case when lat >= 3.00 and lng >10
then 'loc3' end loc3,
lat, lng,
volume
from example) as x
group by x.loc1, x.loc2, x.loc3
order by x.lat, x.lng asc
;
Results:
ROUND(X.LAT,4) ROUND(X.LNG,4) SUM(X.VOLUME)
1.005 1.007 5
2.065 65.007 12
12.3 43.8 7

How to convert float to varchar in SQL Server

I have a float column with numbers of different length and I'm trying to convert them to varchar.
Some values exceed bigint max size, so I can't do something like this
cast(cast(float_field as bigint) as varchar(100))
I've tried using decimal, but the numbers aren't all the same size, so this doesn't help either:
CONVERT(varchar(100), Cast(float_field as decimal(38, 0)))
Any help is appreciated.
UPDATE:
Sample value is 2.2000012095022E+26.
Try using the STR() function.
SELECT STR(float_field, 25, 5)
STR() Function
Another note: this pads on the left with spaces. If that is a problem, combine it with LTRIM:
SELECT LTRIM(STR(float_field, 25, 5))
The only query bit I found that returns the EXACT same original number is
CONVERT (VARCHAR(50), float_field,128)
See http://www.connectsql.com/2011/04/normal-0-microsoftinternetexplorer4.html
The other solutions above will sometimes round or add digits at the end
UPDATE: As per comments below and what I can see in https://msdn.microsoft.com/en-us/library/ms187928.aspx:
CONVERT (VARCHAR(50), float_field,3)
Should be used in new SQL Server versions (Azure SQL Database, and starting in SQL Server 2016 RC3)
This is the solution I ended up using in SQL Server 2012, since all the other suggestions had the drawback of truncating the fractional part or some other drawback:
declare @float float = 1000000000.1234;
select format(@float, N'#.##############################');
output:
1000000000.1234
This has the further advantage (in my case) of making thousands separators and localization easy:
select format(@float, N'#,##0.##########', 'de-DE');
output:
1.000.000.000,1234
SELECT LTRIM(STR(float_field, 25, 0))
is the best way, so you do not add .0000 or any extra digits at the end of the value.
Convert into an integer first and then into a string:
cast((convert(int,b.tax_id)) as varchar(20))
Useful topic, thanks.
If, like me, you want to strip the leading/trailing zeros, you can use this:
DECLARE @MyFloat [float];
SET @MyFloat = 1000109360.050;
SELECT REPLACE(RTRIM(REPLACE(REPLACE(RTRIM(LTRIM(REPLACE(STR(@MyFloat, 38, 16), '0', ' '))), ' ', '0'),'.',' ')),' ',',')
float only has a max. precision of 15 digits. Digits after the 15th position are therefore random, and conversion to bigint (max. 19 digits) or decimal does not help you.
This can help, without rounding:
declare @test float(25)
declare @test1 decimal(10,5)
select @test = 34.0387597207
select @test
set @test1 = convert (decimal(10,5), @test)
select cast((@test1) as varchar(12))
Select LEFT(cast((@test1) as varchar(12)),LEN(cast((@test1) as varchar(12)))-1)
Try this one, should work:
cast((convert(bigint,b.tax_id)) as varchar(20))
select replace(myFloat, '', '')
From the REPLACE() documentation:
Returns nvarchar if one of the input arguments is of the nvarchar data type; otherwise, REPLACE returns varchar.
Returns NULL if any one of the arguments is NULL.
tests:
null ==> [NULL]
1.11 ==> 1.11
1.10 ==> 1.1
1.00 ==> 1
0.00 ==> 0
-1.10 ==> -1.1
0.00001 ==> 1e-005
0.000011 ==> 1.1e-005
If you use a CLR function, you can convert the float to a string that looks just like the float, without all the extra 0's at the end.
CLR Function
[Microsoft.SqlServer.Server.SqlFunction(DataAccess = DataAccessKind.Read)]
[return: SqlFacet(MaxSize = 50)]
public static SqlString float_to_str(double Value, int TruncAfter)
{
string rtn1 = Value.ToString("R");
string rtn2 = Value.ToString("0." + new string('0', TruncAfter));
if (rtn1.Length < rtn2.Length) { return rtn1; } else { return rtn2; }
}
Example
create table #temp (value float)
insert into #temp values (0.73), (0), (0.63921), (-0.70945), (0.28), (0.72000002861023), (3.7), (-0.01), (0.86), (0.55489), (0.439999997615814)
select value,
dbo.float_to_str(value, 18) as converted,
case when value = cast(dbo.float_to_str(value, 18) as float) then 1 else 0 end as same
from #temp
drop table #temp
Output
value converted same
---------------------- -------------------------- -----------
0.73 0.73 1
0 0 1
0.63921 0.63921 1
-0.70945 -0.70945 1
0.28 0.28 1
0.72000002861023 0.72000002861023 1
3.7 3.7 1
-0.01 -0.01 1
0.86 0.86 1
0.55489 0.55489 1
0.439999997615814 0.439999997615814 1
Caveat
All converted strings are truncated at 18 decimal places, and there are no trailing zeros. 18 digits of precision is not a problem for us. And, 100% of our FP numbers (close to 100,000 values) look identical as string values as they do in the database as FP numbers.
Modified Axel's response a bit, since for certain cases it will produce undesirable results.
DECLARE @MyFloat [float];
SET @MyFloat = 1000109360.050;
SELECT REPLACE(RTRIM(REPLACE(REPLACE(RTRIM((REPLACE(CAST(CAST(@MyFloat AS DECIMAL(38,18)) AS VARCHAR(max)), '0', ' '))), ' ', '0'),'.',' ')),' ','.')
Select
cast(replace(convert(decimal(15,2),acs_daily_debit), '.', ',') as varchar(20))
from acs_balance_details
Based on molecular's answer:
DECLARE @F FLOAT = 1000000000.1234;
SELECT @F AS Original, CAST(FORMAT(@F, N'#.##############################') AS VARCHAR) AS Formatted;
SET @F = 823399066925.049
SELECT @F AS Original, CAST(@F AS VARCHAR) AS Formatted
UNION ALL SELECT @F AS Original, CONVERT(VARCHAR(128), @F, 128) AS Formatted
UNION ALL SELECT @F AS Original, CAST(FORMAT(@F, N'G') AS VARCHAR) AS Formatted;
SET @F = 0.502184537571209
SELECT @F AS Original, CAST(@F AS VARCHAR) AS Formatted
UNION ALL SELECT @F AS Original, CONVERT(VARCHAR(128), @F, 128) AS Formatted
UNION ALL SELECT @F AS Original, CAST(FORMAT(@F, N'G') AS VARCHAR) AS Formatted;
I just came across a similar situation and was surprised at the rounding issues with 'very large numbers' presented within SSMS v17.9.1 / SQL 2017.
I am not suggesting I have a solution; however, I have observed that FORMAT presents a number which appears correct. I cannot claim this avoids further rounding issues or that it is useful within a complicated mathematical function.
The T-SQL code supplied should clearly demonstrate my observations, while enabling others to test their own code and ideas should the need arise.
WITH Units AS
(
SELECT 1.0 AS [RaisedPower] , 'Ten' As UnitDescription
UNION ALL
SELECT 2.0 AS [RaisedPower] , 'Hundred' As UnitDescription
UNION ALL
SELECT 3.0 AS [RaisedPower] , 'Thousand' As UnitDescription
UNION ALL
SELECT 6.0 AS [RaisedPower] , 'Million' As UnitDescription
UNION ALL
SELECT 9.0 AS [RaisedPower] , 'Billion' As UnitDescription
UNION ALL
SELECT 12.0 AS [RaisedPower] , 'Trillion' As UnitDescription
UNION ALL
SELECT 15.0 AS [RaisedPower] , 'Quadrillion' As UnitDescription
UNION ALL
SELECT 18.0 AS [RaisedPower] , 'Quintillion' As UnitDescription
UNION ALL
SELECT 21.0 AS [RaisedPower] , 'Sextillion' As UnitDescription
UNION ALL
SELECT 24.0 AS [RaisedPower] , 'Septillion' As UnitDescription
UNION ALL
SELECT 27.0 AS [RaisedPower] , 'Octillion' As UnitDescription
UNION ALL
SELECT 30.0 AS [RaisedPower] , 'Nonillion' As UnitDescription
UNION ALL
SELECT 33.0 AS [RaisedPower] , 'Decillion' As UnitDescription
)
SELECT UnitDescription
, POWER( CAST(10.0 AS FLOAT(53)) , [RaisedPower] ) AS ReturnsFloat
, CAST( POWER( CAST(10.0 AS FLOAT(53)) , [RaisedPower] ) AS NUMERIC (38,0) ) AS RoundingIssues
, STR( CAST( POWER( CAST(10.0 AS FLOAT(53)) , [RaisedPower] ) AS NUMERIC (38,0) ) , CAST([RaisedPower] AS INT) + 2, 0) AS LessRoundingIssues
, FORMAT( POWER( CAST(10.0 AS FLOAT(53)) , [RaisedPower] ) , '0') AS NicelyFormatted
FROM Units
ORDER BY [RaisedPower]