JSON table encountered too many errors - json

I have to load a JSON file into BigQuery. While loading it I got the error mentioned below.
I debugged and found that it is failing on this JSON record, which seems to be valid.
{"firebaseUid":"00FKNF7x2BQhDoPk9TSzE4Ncepn1","age_range":{"min":21},"signUpApp":"stationApp","uid":"00FKNF7x2BQhDoPk9TSzE4Ncepn1","locale":"en_US","emailSha256":"501a8456ececb2a50e733eed6c64b840d63d3aad99fb9ad4a1bbd2cbc33fc1f6","loginMethod":"facebook","notificationToken":"dummy","ageRangeMin":21,"pushNotificationEnabled":true,"projectId":"triplembaas","createDate":"13/07/2018","state":"QLD","station":"TripleM 104.5","facebookId":"1021TheHotBreakfast740157586","email":"connollyharley#gmail.com","cellularNetwork":"OPTUS","suburb":"Bellara","idfa":"60A63734A27E40249331658F1AC670A1","deviceId":"BBD901JaseJuelz454E100000000000000000","firstSignUpDate":"13/07/2018","name":"Harley Connolly","gender":"male","emailVerificationFlag":false,"lastUpdateDate":"20/07/2018","link":"dummy"}
Error while reading data, error message: JSON table encountered too many errors, giving up. Rows: 1; errors: 1. Please look into the error stream for more details.

Probably I am late, but for anyone still looking for an answer, try this.
Change your date format to "YYYY-MM-DD". BigQuery detects that the field value is a date, and it won't allow any format other than "YYYY-MM-DD".
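For example, a minimal sketch (assuming the file is newline-delimited JSON, and that the input/output file names are placeholders) that rewrites the dates before loading; the three field names come from the sample record above:
import json
from datetime import datetime

# Convert "DD/MM/YYYY" dates to "YYYY-MM-DD" in a newline-delimited
# JSON file before loading it into BigQuery.
DATE_FIELDS = ["createDate", "firstSignUpDate", "lastUpdateDate"]

with open("input.json") as src, open("fixed.json", "w") as dst:
    for line in src:
        record = json.loads(line)
        for field in DATE_FIELDS:
            if field in record:
                record[field] = datetime.strptime(
                    record[field], "%d/%m/%Y").strftime("%Y-%m-%d")
        dst.write(json.dumps(record) + "\n")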

Related

bigquery error: "Could not parse '41.66666667' as INT64"

I am attempting to create a table using a .tsv file in BigQuery, but keep getting the following error:
"Failed to create table: Error while reading data, error message: Could not parse '41.66666667' as INT64 for field Team_Percentage (position 8) starting at location 14419658 with message 'Unable to parse'"
I am not sure what to do as I am completely new to this.
Here is a file with the first 100 lines of the full data:
https://wetransfer.com/downloads/25c18d56eb863bafcfdb5956a46449c920220502031838/f5ed2f
Here are the steps I am currently taking to create the table:
https://i.gyazo.com/07815cec446b5c0869d7c9323a7fdee4.mp4
Appreciate any help I can get!
As confirmed with OP (@dan), the error encountered is caused by selecting Auto detect when creating a table using a .tsv file as the source.
The fix for this is to manually create a schema and define the data type for each column properly. For more reference on using schemas in BQ, see this document.
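As an illustration, a minimal sketch with the Python BigQuery client that defines the schema explicitly instead of relying on auto-detect. The dataset, table, file name, and the "Team" column are placeholders (only Team_Percentage appears in the error above), and it assumes the first row is a header:
from google.cloud import bigquery

# Load a .tsv file with an explicit schema so Team_Percentage is treated
# as FLOAT64 rather than auto-detected as INT64.
client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,  # TSV is CSV with a tab delimiter
    field_delimiter="\t",
    skip_leading_rows=1,
    schema=[
        bigquery.SchemaField("Team", "STRING"),
        bigquery.SchemaField("Team_Percentage", "FLOAT64"),
        # ...one SchemaField per remaining column...
    ],
)
with open("data.tsv", "rb") as f:
    job = client.load_table_from_file(f, "mydataset.mytable", job_config=job_config)
job.result()  # wait for the load to finish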

Invalid property name: `errorType` on class `java.lang.String`. Validate that the correct setters is present

When I try to use json-logger in Mule 4, I'm getting this error. I'm trying to log an error object here, but it is not successful. Please find the error object below.
I sorted out the issue. The issue was that we cannot put JSON in the MESSAGE section of json-logger. When I changed it to a string, it worked.
The MESSAGE section is meant to describe what you are going to log.
It looks like you are trying to use the function stringifyNonJSON() with a Mule error and treat it like a String. Without more details of the flow and the payload it is not possible to have more insight.
You could try to create a string from the payload manually first and use that as the parameter, since the function apparently cannot handle this case.

Error while reading data, error message: CSV table references column position 15, but line starting at position:0 contains only 1 columns

I am new to BigQuery. I am trying to load data into a GCP BigQuery table that I created manually. I have a bash file which contains this bq load command:
bq load --source_format=CSV --field_delimiter=$(printf '\u0001') dataset_name.table_name gs://bucket-name/sample_file.csv
My CSV file contains multiple rows with 16 columns; a sample row is:
100563^3b9888^Buckname^https://www.settttt.ff/setlllll/buckkkkk-73d58581.html^Buckcherry^null^null^2019-12-14^23d74444^Reverb^Reading^Pennsylvania^United States^US^40.3356483^-75.9268747
Table schema -
When I execute the bash script file from Cloud Shell, I get the following error:
Waiting on bqjob_r10e3855fc60c6e88_0000016f42380943_1 ... (0s) Current status: DONE
BigQuery error in load operation: Error processing job 'project-name-staging:bqjob_r10e3855fc60c6e88_0000ug00004521': Error while reading data, error message: CSV table encountered too many errors, giving up. Rows: 1; errors: 1. Please look into the errors[] collection for more details.
Failure details:
- gs://bucket-name/sample_file.csv: Error while reading data, error message: CSV table references column position 15, but line starting at position:0 contains only 1 columns.
What would be the solution? Thanks in advance.
You are trying to insert values that do not match the schema you provided for your table.
Based on table schema and your data example I run this command:
./bq load --source_format=CSV --field_delimiter=$(printf '^') mydataset.testLoad /Users/tamirklein/data2.csv
1st error
Failure details:
- Error while reading data, error message: Could not parse '3b9888' as int for field Field2 (position 1) starting at location 0
At this point, I manually removed the b from 3b9888, and then I got this
2nd error
Failure details:
- Error while reading data, error message: Could not parse '14/12/2019' as date for field Field8 (position 7) starting at location 0
At this point, I changed 14/12/2019 to 2019-12-14, which is the BQ date format, and then everything was OK:
Upload complete.
Waiting on bqjob_r9cb3e4ef5ad596e_0000016f42abd4f6_1 ... (0s) Current status: DONE
You will need to clean your data before uploading, or load with the --max_bad_records flag (some lines will load and some will not, depending on your data quality).
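For instance, assuming up to 10 bad rows are tolerable and using the ^ delimiter from the sample row:
bq load --source_format=CSV --field_delimiter='^' --max_bad_records=10 dataset_name.table_name gs://bucket-name/sample_file.csv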
Note: unfortunately there is no way to control the date format during the upload; see this answer as a reference.
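If you would rather fix the file than skip rows, here is a minimal Python sketch; the file names are placeholders, and the column positions (1 for the integer field, 7 for the date field) come from the error messages above:
import csv
from datetime import datetime

# Rewrite the ^-delimited file so the integer and date columns match
# the table schema before running `bq load`.
with open("sample_file.csv", newline="") as src, \
        open("sample_file_clean.csv", "w", newline="") as dst:
    reader = csv.reader(src, delimiter="^")
    writer = csv.writer(dst, delimiter="^")
    for row in reader:
        # Drop stray non-digit characters such as the 'b' in '3b9888'.
        row[1] = "".join(ch for ch in row[1] if ch.isdigit())
        # Convert DD/MM/YYYY to the YYYY-MM-DD format BigQuery expects.
        try:
            row[7] = datetime.strptime(row[7], "%d/%m/%Y").strftime("%Y-%m-%d")
        except ValueError:
            pass  # already in YYYY-MM-DD
        writer.writerow(row)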
We had the same problem while importing data from local storage to BigQuery. After examining the data we saw that some values started with \r or whitespace characters.
After applying ua['ColumnName'].str.strip() and ua['District'].str.rstrip(), we could add the data to BigQuery.
Thanks
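In pandas terms, a minimal sketch of that cleanup, assuming `ua` is a DataFrame built from the source file (file names and the ^ delimiter are taken from the question above):
import pandas as pd

# Strip leading/trailing whitespace (including \r) from every column
# before handing the file to BigQuery.
ua = pd.read_csv("sample_file.csv", sep="^", header=None, dtype=str)
ua = ua.apply(lambda col: col.str.strip())
ua.to_csv("sample_file_clean.csv", sep="^", header=False, index=False)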

Streaming Analytics Query Error

When typing my query for Stream Analytics I don't receive any error messages; however, when I start my Stream Analytics job, I receive the following error:
Stream Analytics job has validation errors: The output dsleads used in the query was not defined. Activity Id: '5cc961c5-4dbd-4a63-95df-8e3b08d2121c-2016-03-28 14:56:21Z'.
I've checked the output name and have verified that it's correct. I'm not sure what I'm doing wrong. The query I'm using is as follows:
SELECT type, count(*) as count, system.timestamp as time
INTO dsleads
FROM ttvhuball timestamp by time
GROUP BY type, TumblingWindow(minute, 10)
I've scoured the internet and have not found anything helpful.
Seems like an output alias error. Is the output sink alias that you created (that is, the name you gave it during creation) "dsleads" in your case?
Also worth mentioning: INTO is the output and FROM is the input. INTO always needs to come before FROM in the query as well.
Hope this helps!
Mert
I was using the incorrect output name. One needs to specify the output alias; I was specifying the dataset name of the output instead.

Entity Framework - MySQL - Datetime format issue

I have a simple table with a few date fields.
Whenever I run following query:
var docs = (from d in base.EntityDataContext.document_reviews
            select d).ToList();
I get following exception:
Unable to convert MySQL date/time value to System.DateTime.
MySql.Data.Types.MySqlConversionException: Unable to convert MySQL date/time value to System.DateTime
The document reviews table has two date/time fields. One of them is nullable.
I have tried placing the following in the connection string:
Allow Zero Datetime=true;
But I am still getting the exception.
Anyone with a solution?
@effkay - if you solved this it would be great if you could post the answer.
Also, if anyone else has a solution, that would be great too :).
Edit:
The solution can be found in the connector documentation: http://dev.mysql.com/doc/refman/5.1/en/connector-net-connection-options.html
I needed to set "Convert Zero Datetime" to true, and now it works.
hth.
You need to set Convert Zero Datetime=True in the connection string of the running application.
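For reference, a connection string sketch with that option applied; the server, database, and credentials here are placeholders:
Server=localhost;Database=mydb;Uid=myuser;Pwd=mypassword;Convert Zero Datetime=True;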