BigQuery: parse JSON child column with special character in key name
I have loaded an entire JSON file into a STRING column of a BigQuery table. Now I am trying to access the keys using the JSON_EXTRACT_SCALAR function, but I am getting null results for child keys whose names contain a period (".").
Here's a snippet of the data:
{"server_received_time":"2019-01-17 15:00:00.482000","app":161,"device_carrier":null,"$schema":12,"city":"Caro","user_id":null,"uuid":"9018","event_time":"2019-01-17 15:00:00.045000","platform":"Web","os_version":"49","vendor_id":711,"processed_time":"2019-01-17 15:00:00.817195","user_creation_time":"2018-11-01 19:16:34.971000","version_name":null,"ip_address":null,"paying":null,"dma":null,"group_properties":{},"user_properties":{"location.radio":"ca","vendor.userTier":"free","vendor.userID":"a989","user.id":"a989","user.tier":"free","location.region":"ca"},"client_upload_time":"2019-01-17 15:00:00.424000","$insert_id":"e8410","event_type":"LOADED","library":"amp\/4.5.2","vendor_attribution_ids":null,"device_type":"Mac","device_manufacturer":null,"start_version":null,"location_lng":null,"server_upload_time":"2019-01-17 15:00:00.493000","event_id":64,"location_lat":null,"os_name":"Chrome","vendor_event_type":null,"device_brand":null,"groups":{},"event_properties":{"content.authenticated":false,"content.subsection1":"regions","custom.DNT":true,"content.subsection2":"ca","referrer.url":"","content.url":"","content.type":"index","content.title":"","custom.cookiesenabled":true,"app.pillar":"feed","content.area":"news","app.name":"oc"},"data":{},"device_id":"","language":"English","device_model":"Mac","country":"","region":"","is_attribution_event":false,"adid":null,"session_id":15,"device_family":"Mac","sample_rate":null,"idfa":null,"client_event_time":"2019-01-17 14:59:59.987000"}
{"server_received_time":"2019-01-17 15:00:00.913000","app":161,"device_carrier":null,"$schema":12,"city":"Fo","user_id":null,"uuid":"9052","event_time":"2019-01-17 15:00:00.566000","platform":"Web","os_version":"71","vendor_id":797,"processed_time":"2019-01-17 15:00:01.301936","user_creation_time":"2019-01-17 15:00:00.566000","version_name":null,"ip_address":null,"paying":null,"dma":"CO","group_properties":{},"user_properties":{"user.tier":"free"},"client_upload_time":"2019-01-17 15:00:00.157000","$insert_id":"69ae","event_type":"START WEB SESSION","library":"amp\/4.5.2","vendor_attribution_ids":null,"device_type":"Android","device_manufacturer":null,"start_version":null,"location_lng":null,"server_upload_time":"2019-01-17 15:00:00.925000","event_id":1,"location_lat":null,"os_name":"Chrome Mobile","vendor_event_type":null,"device_brand":null,"groups":{},"event_properties":{"content.subsection3":"home","content.subsection2":"archives","content.title":"","content.keywords.subject":["Lifestyle\/Recreation and leisure\/Outdoor recreation\/Boating","Lifestyle\/Relationships\/Couples","General news\/Weather","Oddities"],"content.publishedtime":154687,"app.name":"oc","referrer.url":"","content.subsection1":"archives","content.url":"","content.authenticated":false,"content.keywords.location":["Ot"],"content.originaltitle":"","content.type":"story","content.authors":["Archives"],"app.pillar":"feed","content.area":"news","content.id":"1.49","content.updatedtime":1546878600538,"content.keywords.tag":["24 1","boat house","Ot","Rockcliffe","River","m"],"content.keywords.person":["Ber","Shi","Jea","Jean\u00e9tien"]},"data":{"first_event":true},"device_id":"","language":"English","device_model":"Android","country":"","region":"","is_attribution_event":false,"adid":null,"session_id":15477,"device_family":"Android","sample_rate":null,"idfa":null,"client_event_time":"2019-01-17 14:59:59.810000"}
{"server_received_time":"2019-01-17 15:00:00.913000","app":16,"device_carrier":null,"$schema":12,"city":"","user_id":null,"uuid":"905","event_time":"2019-01-17 15:00:00.574000","platform":"Web","os_version":"71","vendor_id":7973,"processed_time":"2019-01-17 15:00:01.301957","user_creation_time":"2019-01-17 15:00:00.566000","version_name":null,"ip_address":null,"paying":null,"dma":"DCO","group_properties":{},"user_properties":{"user.tier":"free"},"client_upload_time":"2019-01-17 15:00:00.157000","$insert_id":"d045","event_type":"LOADED","library":"am-js\/4.5.2","vendor_attribution_ids":null,"device_type":"Android","device_manufacturer":null,"start_version":null,"location_lng":null,"server_upload_time":"2019-01-17 15:00:00.925000","event_id":2,"location_lat":null,"os_name":"Chrome Mobile","vendor_event_type":null,"device_brand":null,"groups":{},"event_properties":{"content.subsection3":"home","content.subsection2":"archives","content.subsection1":"archives","content.keywords.subject":["Lifestyle\/Recreation and leisure\/Outdoor recreation\/Boating","Lifestyle\/Relationships\/Couples","General news\/Weather","Oddities"],"content.type":"story","content.keywords.location":["Ot"],"app.pillar":"feed","app.name":"oc","content.authenticated":false,"custom.DNT":false,"content.id":"1.4","content.keywords.person":["Ber","Shi","Jea","Je\u00e9tien"],"content.title":"","content.url":"","content.originaltitle":"","custom.cookiesenabled":true,"content.authors":["Archives"],"content.publishedtime":1546878600538,"referrer.url":"","content.area":"news","content.updatedtime":1546878600538,"content.keywords.tag":["24 1","boat house","O","Rockcliffe","River","pr"]},"data":{},"device_id":"","language":"English","device_model":"Android","country":"","region":"","is_attribution_event":false,"adid":null,"session_id":1547737199081,"device_family":"Android","sample_rate":null,"idfa":null,"client_event_time":"2019-01-17 14:59:59.818000"}
Here's the sample query against the table:
SELECT
  CAST(JSON_EXTRACT_SCALAR(data, '$.uuid') AS INT64) AS uuid_id,
  CAST(JSON_EXTRACT_SCALAR(data, '$.event_time') AS TIMESTAMP) AS event_time,
  JSON_EXTRACT_SCALAR(data, '$[event_properties].app.name') AS app_name,
  JSON_EXTRACT_SCALAR(data, '$[user_properties].user.tier') AS user_tier
FROM
  mytable
The above query gives null results for the app_name and user_tier columns even though data exists for them.
Following the BigQuery JSON functions documentation (JSON Functions in Standard SQL), which says:
In cases where a JSON key uses invalid JSONPath characters, you can escape those characters using single quotes and brackets, [' '].
I ran the query as:
SELECT
  CAST(JSON_EXTRACT_SCALAR(data, "$.uuid_id") AS INT64) AS uuid_id,
  CAST(JSON_EXTRACT_SCALAR(data, "$.event_time") AS TIMESTAMP) AS event_time,
  JSON_EXTRACT_SCALAR(data, "$.event_properties.['app.name']") AS app_name,
  JSON_EXTRACT_SCALAR(data, "$.user_properties.['user.tier']") AS user_tier
FROM
  mytable
This results in the following error:
Invalid token in JSONPath at: .['app.name']
Please advise. What am I missing here?
You have an extra . before the [. Use
"$.event_properties['app.name']"
Related
Get value from JSON object having special character in key using SQL query
I have an outlet_details table with two columns (id, and extended_attributes as a JSON object). extended_attributes has values like:
{ "parent-0-0-id": "DS-606", "parent-0-1-id": "SD066", "secondaryOutletCode": "MG_918" }
I want to get the value of parent-0-0-id, but when I run:
SELECT extended_attributes->>'$.parent-0-0-id' AS 'parent00id' FROM outlet_details;
I get an "Invalid JSON path expression" error (3143).
You can enclose the key name in quotes inside the JSON path so that the hyphens are not parsed as path syntax:
SELECT extended_attributes->>"$.\"parent-0-0-id\"" AS 'parent00id' FROM outlet_details;
should work.
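Equivalently, a sketch against the same table: quoting the key with double quotes inside a single-quoted path avoids the backslash escaping:

-- quote the hyphenated key inside the JSON path
SELECT extended_attributes->>'$."parent-0-0-id"' AS parent00id
FROM outlet_details;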
I need to replace the character \\ of a string and read with JSON_EXTRACT
When saving the JSON object to MySQL with JSON.stringify, it puts "\" characters into the string. I am building a VIEW and need to pull the data apart with JSON_EXTRACT; for that I used the MySQL REPLACE function, but the result is null.
EDITED: JSON in field "DADOS" (LONGTEXT):
{ "pessoal":"[{\"nome\":\"Marie Luiza Novaes\",\"nascimento\":\"1994-06-20\",\"civil\":\"Casado(a)\",\"sexo\":\"F\",\"rg\":\"469326293\",\"cpf\":\"06649073504\"}]", "contato":[], "interesse":[], "adicional":[], "profissional":[], "academico":[], "anotacoes":[], "extras":"[]" }
1 - GET NOME
SELECT json_extract(REPLACE(dados,'\\"','"'), '$.pessoal[0].nome') dados FROM cadastro
2 - GET NOME
SELECT json_extract(REPLACE(dados,'\\',''), '$.pessoal[0].nome') dados FROM cadastro
I see multiple problems with your current approach. First, the JSON text in your column is malformed: the pessoal array should not be wrapped in double quotes, because it is part of the JSON structure, not a string value. Second, the JSON path syntax you are using is also off. The following exact setup works for me:
WITH cadastro AS (
  SELECT '{"pessoal":[{"nome":"Marie Luiza Novaes","nascimento":"1994-06-20","civil":"Casado(a)","sexo":"F","rg":"469326293","cpf":"06649073504"}], "contato":[], "interesse":[], "adicional":[], "profissional":[], "academico":[], "anotacoes":[], "extras":[]}' AS dados
)
SELECT JSON_EXTRACT(dados, '$.pessoal[0].nome') AS dados FROM cadastro;
The output of this query is "Marie Luiza Novaes".
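If the stored value cannot be fixed at write time and pessoal remains double-encoded as a JSON string, a sketch of an alternative (assuming MySQL 5.7+ and the cadastro table from the question) is to unquote the inner string before extracting, instead of using REPLACE:

-- JSON_EXTRACT pulls out the 'pessoal' string, JSON_UNQUOTE unescapes it,
-- and the outer JSON_EXTRACT then parses the inner array
SELECT JSON_EXTRACT(
         JSON_UNQUOTE(JSON_EXTRACT(dados, '$.pessoal')),
         '$[0].nome'
       ) AS nome
FROM cadastro;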
JSON_QUERY unable to handle NULL
I have a table. A few columns are of string or integer type, and a few hold JSON. I am writing a query to turn each row into a JSON object, and I have issues with JSON_QUERY(jsondataColumnName): if the column is NULL, JSON_QUERY fails. This is the query I have written so far:
select (
  SELECT [customerReferenceNumber] as customerReferenceNumber
    ,[customerType] as customerType
    ,[personReferenceNumber] as personReferenceNumber
    ,[organisationReferenceNumber] as organisationReferenceNumber
    ,json_query(isnull(product,'')) as product
    ,json_query(isnull([address],'')) as address
  FROM [dbo].[customer]
  FOR JSON PATH, WITHOUT_ARRAY_WRAPPER) AS customer
from [dbo].[customer] P
It fails with:
Msg 13609, Level 16, State 1, Line 2
JSON text is not properly formatted. Unexpected character '.' is found at position 0.
By default, FOR JSON does not include NULL values in the output. Use the INCLUDE_NULL_VALUES option to handle them. As an example:
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER, INCLUDE_NULL_VALUES) AS customer
Reference: https://learn.microsoft.com/en-us/sql/relational-databases/json/include-null-values-in-json-include-null-values-option?view=sql-server-ver15
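A sketch of the question's query with that option applied; the ISNULL(..., '') wrappers are dropped here as well, since feeding an empty string to JSON_QUERY appears to be what triggers the formatting error:

select (
  SELECT [customerReferenceNumber] as customerReferenceNumber
    ,[customerType] as customerType
    ,[personReferenceNumber] as personReferenceNumber
    ,[organisationReferenceNumber] as organisationReferenceNumber
    ,json_query(product) as product       -- NULL JSON columns are now emitted as null
    ,json_query([address]) as address
  FROM [dbo].[customer]
  FOR JSON PATH, WITHOUT_ARRAY_WRAPPER, INCLUDE_NULL_VALUES) AS customer
from [dbo].[customer] P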
Parse JSON in U-SQL, then convert to CSV
I'm trying to convert some telemetry data that is in JSON format into CSV format and write it out to a file, using U-SQL. The problem is that some of the JSON key names have periods in them, so U-SQL does not recognize them in the SELECT operation. When I check the output file, all I see are the values for "p1". How can I reference the JSON key names in the script so that they are recognized? Thanks in advance for any help!
Code:
REFERENCE ASSEMBLY MATSDevDB.[Newtonsoft.Json];
REFERENCE ASSEMBLY MATSDevDB.[Microsoft.Analytics.Samples.Formats];
USING Microsoft.Analytics.Samples.Formats.Json;

@jsonDocuments =
    EXTRACT jsonString string
    FROM @"adl://xxxx.azuredatalakestore.net/xxxx/{*}/{*}/{*}/telemetry_{*}.json"
    USING Extractors.Tsv(quoting:false);

@jsonify =
    SELECT Microsoft.Analytics.Samples.Formats.Json.JsonFunctions.JsonTuple(jsonString) AS json
    FROM @jsonDocuments;

@columnized =
    SELECT json["EventInfo.Source"] AS EventInfoSource,
           json["EventInfo.InitId"] AS EventInfoInitId,
           json["EventInfo.Sequence"] AS EventInfoSequence,
           json["EventInfo.Name"] AS EventInfoName,
           json["EventInfo.Time"] AS EventInfoTime,
           json["EventInfo.SdkVersion"] AS EventInfoSdkVersion,
           json["AppInfo.Language"] AS AppInfoLanguage,
           json["UserInfo.Language"] AS UserInfoLanguage,
           json["DeviceInfo.BrowserName"] AS DeviceInfoBrowswerName,
           json["DeviceInfo.BrowserVersion"] AS BrowswerVersion,
           json["DeviceInfo.OsName"] AS DeviceInfoOsName,
           json["DeviceInfo.OsVersion"] AS DeviceInfoOsVersion,
           json["DeviceInfo.Id"] AS DeviceInfoId,
           json["p1"] AS p1,
           json["PipelineInfo.AccountId"] AS PipelineInfoAccountId,
           json["PipelineInfo.IngestionTime"] AS PipelineInfoIngestionTime,
           json["PipelineInfo.ClientIp"] AS PipelineInfoClientIp,
           json["PipelineInfo.ClientCountry"] AS PipelineInfoClientCountry,
           json["PipelineInfo.IngestionPath"] AS PipelineInfoIngestionPath,
           json["AppInfo.Id"] AS AppInfoId,
           json["EventInfo.Id"] AS EventInfoId,
           json["EventInfo.BaseType"] AS EventInfoBaseType,
           json["EventINfo.IngestionTime"] AS EventINfoIngestionTime
    FROM @jsonify;

OUTPUT @columnized
TO "adl://xxxx.azuredatalakestore.net/poc/TestResult.csv"
USING Outputters.Csv(quoting : false);

JSON:
{"EventInfo.Source":"JS_default_source","EventInfo.Sequence":"1","EventInfo.Name":"daysofweek","EventInfo.Time":"2018-01-25T21:09:36.779Z","EventInfo.SdkVersion":"ACT-Web-JS-2.6.0","AppInfo.Language":"en","UserInfo.Language":"en-US","UserInfo.TimeZone":"-08:00","DeviceInfo.BrowserName":"Chrome","DeviceInfo.BrowserVersion":"63.0.3239.132","DeviceInfo.OsName":"Mac OS X","DeviceInfo.OsVersion":"10","p1":"V1","PipelineInfo.IngestionTime":"2018-01-25T21:09:33.9930000Z","PipelineInfo.ClientCountry":"CA","PipelineInfo.IngestionPath":"FastPath","EventInfo.BaseType":"custom","EventInfo.IngestionTime":"2018-01-25T21:09:33.9930000Z"}
I got this to work with single quotes and single square brackets inside the key name, e.g.
@columnized = SELECT json["['EventInfo.Source']"] AS EventInfoSource, ...
Full code:
@columnized =
    SELECT json["['EventInfo.Source']"] AS EventInfoSource,
           json["['EventInfo.InitId']"] AS EventInfoInitId,
           json["['EventInfo.Sequence']"] AS EventInfoSequence,
           json["['EventInfo.Name']"] AS EventInfoName,
           json["['EventInfo.Time']"] AS EventInfoTime,
           json["['EventInfo.SdkVersion']"] AS EventInfoSdkVersion,
           json["['AppInfo.Language']"] AS AppInfoLanguage,
           json["['UserInfo.Language']"] AS UserInfoLanguage,
           json["['DeviceInfo.BrowserName']"] AS DeviceInfoBrowswerName,
           json["['DeviceInfo.BrowserVersion']"] AS BrowswerVersion,
           json["['DeviceInfo.OsName']"] AS DeviceInfoOsName,
           json["['DeviceInfo.OsVersion']"] AS DeviceInfoOsVersion,
           json["['DeviceInfo.Id']"] AS DeviceInfoId,
           json["p1"] AS p1,
           json["['PipelineInfo.AccountId']"] AS PipelineInfoAccountId,
           json["['PipelineInfo.IngestionTime']"] AS PipelineInfoIngestionTime,
           json["['PipelineInfo.ClientIp']"] AS PipelineInfoClientIp,
           json["['PipelineInfo.ClientCountry']"] AS PipelineInfoClientCountry,
           json["['PipelineInfo.IngestionPath']"] AS PipelineInfoIngestionPath,
           json["['AppInfo.Id']"] AS AppInfoId,
           json["['EventInfo.Id']"] AS EventInfoId,
           json["['EventInfo.BaseType']"] AS EventInfoBaseType,
           json["['EventINfo.IngestionTime']"] AS EventINfoIngestionTime
    FROM @jsonify;
MariaDB COLUMN_JSON query returns binary
I've been trying to use dynamic columns with an instance of MariaDB v10.1.12. First, I send the following query:
INSERT INTO savedDisplays (user, name, body, dataSource, params) VALUES ('Marty', 'Hey', 'Hoy', 'temp', COLUMN_CREATE('type', 'tab', 'col0', 'champions', 'col1', 'averageResults'));
where the params column was defined as a BLOB, just like the documentation suggests. The query is accepted and the table updated. If I run COLUMN_CHECK on the results, it tells me they're fine. But when I try to select:
SELECT COLUMN_JSON(params) AS params FROM savedDisplays;
I get a {type: "Buffer", data: Array} containing binary returned to me, instead of the {"type":"tab", "col0":"champions", "col1":"averageResults"} I expect.
EDIT: I can use COLUMN_GET just fine, but I need every column inside the params field, and I need to check the type property first to know what kind of columns, and how many, there are in the JSON / params field. I could probably still make it work, but that would require multiple queries instead of just one. Any ideas?
Try:
SELECT CONVERT(COLUMN_JSON(params) USING utf8) AS params FROM savedDisplays;
In MariaDB 10 this works on any table:
SELECT CONVERT(COLUMN_JSON(COLUMN_CREATE('t', text, 'v', value)) USING utf8) AS json
FROM test
WHERE 1 AND value LIKE '%12345%'
LIMIT 10;
Output in node.js:
[ TextRow { json: '{"t":"test text","v":"0.5339044212345805"}' } ]
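If you also need to inspect the type property in the same round trip, as the question's EDIT describes, a sketch (assuming the savedDisplays table above; the display_type alias is only illustrative) is to combine COLUMN_GET with the converted JSON in a single query:

-- one query: the whole dynamic-column blob as readable JSON, plus the 'type' key
SELECT CONVERT(COLUMN_JSON(params) USING utf8) AS params,
       COLUMN_GET(params, 'type' AS CHAR) AS display_type
FROM savedDisplays;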