I am converting an XML file to JSON using jsonDecode(jsonEncode(file)). The problem is that when I try to fetch data from the JSON with an index number, it prints only one letter, and if I try to fetch data with a string key it shows an error. I have inserted my code below; please see if there is any problem with it.
My file is large, so I am only including a small portion of the output.
I have already tried the xml2json package and I got output from it.
But in Dart, without any package, we can convert an XML file to JSON with import 'dart:convert';. From the documentation, I understood that this only works for small files. If it also works for large files, I want to know what the problem in this code is and how to fix it.
That is what I tried in the following code. This is my code:
Future<String> decodexml() async {
  final file =
      await rootBundle.loadString('assets/xml_file/belovedskincare.xml');
  final jsonData = jsonDecode(jsonEncode(file));
  debugPrint('$jsonData');
  return jsonData;
}
When I print $jsonData, this is the output:
<rss version="2.0"
I/flutter (24058): xmlns:excerpt="http://wordpress.org/export/1.2/excerpt/"
I/flutter (24058): xmlns:content="http://purl.org/rss/1.0/modules/content/"
I/flutter (24058): xmlns:wfw="http://wellformedweb.org/CommentAPI/"
I/flutter (24058): xmlns:dc="http://purl.org/dc/elements/1.1/"
I/flutter (24058): xmlns:wp="http://wordpress.org/export/1.2/"
I/flutter (24058): >
I/flutter (24058):
I/flutter (24058): <channel>
I/flutter (24058): <title>Beloved Skincare</title>
I/flutter (24058): <link>https://belovedskincare.com.my</link>
I/flutter (24058): <description>Food for your Skin</description>
I/flutter (24058): <pubDate>Sat, 02 Oct 2021 09:17:04 +0000</pubDate>
I/flutter (24058): <language>en-US</language>
I/flutter (24058): <wp:wxr_version>1.2</wp:wxr_version>
I/flutter (24058): <wp:base_site_url>https://belovedskincare.com.my</wp:base_site_url>
I/flutter (24058): <wp:base_blog_url>https://belovedskincare.com.my</wp:base_blog_url>
When I try to print some data from jsonData, this is the output I get,
code:
var json = jsonData['rss'];
debugPrint('$json');
output:
[ERROR:flutter/lib/ui/ui_dart_state.cc(209)] Unhandled Exception: type 'String' is not a subtype of type 'int' of 'index'
code:
var json = jsonData[0];
debugPrint('$json');
output:
I/flutter (24058): <
code:
var json = jsonData[1];
debugPrint('$json');
output:
I/flutter (24058): r
To my understanding, it seems each letter from the JSON data is saved separately in the jsonData variable.
I am already using the xml2json package.
No, you don't. There is no such thing in your code. You are just calling functions that treat the XML data as if it were JSON. It isn't, and that is why you get errors.
I suggest you actually read the example for the package you are using and then write your code accordingly. A package is not magic that works just because you reference it; you have to call its functions.
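For reference, a minimal sketch of what using that package could look like, assuming the xml2json package's Xml2Json class with its parse and toParker methods (the asset path is taken from the question):

import 'dart:convert';
import 'package:flutter/services.dart' show rootBundle;
import 'package:xml2json/xml2json.dart';

Future<Map<String, dynamic>> decodeXml() async {
  // Load the raw XML text from the asset bundle.
  final xmlString =
      await rootBundle.loadString('assets/xml_file/belovedskincare.xml');

  // Let xml2json perform the actual XML -> JSON transformation.
  final transformer = Xml2Json();
  transformer.parse(xmlString);
  final jsonString = transformer.toParker();

  // Only now is jsonDecode working on real JSON text.
  return jsonDecode(jsonString) as Map<String, dynamic>;
}

With something along these lines, jsonData['rss'] returns a nested map instead of a single character, because the variable holds a decoded Map rather than the original XML string.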
As per the Google documentation, if we use a sessiontoken in the Autocomplete API followed by the Place Details API, then all of those calls will be grouped together and billed as a single request. But in practice this is not happening, and Google counts all of the requests individually.
I/flutter (12295): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=s&key=<API_KEY>&sessiontoken=d7f1f97f-621d-45cf-b8fd-38d74c976fd8&location=37.4220309%2C-122.0839848&radius=5000&strictbounds=true
I/flutter (12295): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=st&key=<API_KEY>&sessiontoken=d7f1f97f-621d-45cf-b8fd-38d74c976fd8&location=37.4220309%2C-122.0839848&radius=5000&strictbounds=true
I/flutter (12295): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=sta&key=<API_KEY>&sessiontoken=d7f1f97f-621d-45cf-b8fd-38d74c976fd8&location=37.4220309%2C-122.0839848&radius=5000&strictbounds=true
I/flutter (12295): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=stat&key=<API_KEY>&sessiontoken=d7f1f97f-621d-45cf-b8fd-38d74c976fd8&location=37.4220309%2C-122.0839848&radius=5000&strictbounds=true
I/flutter (12295): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=stati&key=<API_KEY>&sessiontoken=d7f1f97f-621d-45cf-b8fd-38d74c976fd8&location=37.4220309%2C-122.0839848&radius=5000&strictbounds=true
I/flutter (12295): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=statio&key=<API_KEY>&sessiontoken=d7f1f97f-621d-45cf-b8fd-38d74c976fd8&location=37.4220309%2C-122.0839848&radius=5000&strictbounds=true
I/flutter (12295): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=station&key=<API_KEY>&sessiontoken=d7f1f97f-621d-45cf-b8fd-38d74c976fd8&location=37.4220309%2C-122.0839848&radius=5000&strictbounds=true
I/flutter (12295): Get URI: https://maps.googleapis.com/maps/api/place/details/json?place_id=ChIJr3mngX66j4ARcUF6Ln6aAmk&key=<API_KEY>&sessiontoken=d7f1f97f-621d-45cf-b8fd-38d74c976fd8
I/flutter (12295): Instance of 'PlaceDetails'
In the above scenario, based on the documentation, it should have been counted as a single request, but in the Google Console API usage it shows as 9 separate requests.
Is this the right way to use sessiontoken? Am I missing something?
Edit > Added logs for two search groups with the Place Details API. The sessiontoken is refreshed after every Place Details call.
I/flutter (17558): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=sta&key=<API_KEY>&sessiontoken=3dd4163e-c5aa-4504-a652-0c7861704efa&location=22.863185%2C87.3552233&radius=5000&strictbounds=true
I/flutter (17558): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=stat&key=<API_KEY>&sessiontoken=3dd4163e-c5aa-4504-a652-0c7861704efa&location=22.863185%2C87.3552233&radius=5000&strictbounds=true
I/flutter (17558): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=station&key=<API_KEY>&sessiontoken=3dd4163e-c5aa-4504-a652-0c7861704efa&location=22.863185%2C87.3552233&radius=5000&strictbounds=true
I/flutter (17558): Get URI: https://maps.googleapis.com/maps/api/place/details/json?place_id=<PLACE_DETAILS>&key=<API_KEY>&sessiontoken=3dd4163e-c5aa-4504-a652-0c7861704efa
I/flutter (17558): Instance of 'PlaceDetails'
I/flutter (17558): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=sch&key=<API_KEY>&sessiontoken=a204cfe5-0824-4e2c-8fe6-5382ef542e76&location=22.863185%2C87.3552233&radius=5000&strictbounds=true
I/flutter (17558): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=scho&key=<API_KEY>&sessiontoken=a204cfe5-0824-4e2c-8fe6-5382ef542e76&location=22.863185%2C87.3552233&radius=5000&strictbounds=true
I/flutter (17558): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=schoo&key=<API_KEY>&sessiontoken=a204cfe5-0824-4e2c-8fe6-5382ef542e76&location=22.863185%2C87.3552233&radius=5000&strictbounds=true
I/flutter (17558): Get URI: https://maps.googleapis.com/maps/api/place/autocomplete/json?input=school&key=<API_KEY>&sessiontoken=a204cfe5-0824-4e2c-8fe6-5382ef542e76&location=22.863185%2C87.3552233&radius=5000&strictbounds=true
I/flutter (17558): Get URI: https://maps.googleapis.com/maps/api/place/details/json?place_id=<PLACE_DETAILS>&key=<API_KEY>&sessiontoken=a204cfe5-0824-4e2c-8fe6-5382ef542e76
I/flutter (17558): Instance of 'PlaceDetails'
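To make the documented flow concrete, here is a minimal sketch (not from the post; the uuid and http packages and the helper names are assumptions) of one search session: the same token on every autocomplete keystroke, the same token on the closing Place Details call, and a fresh token only after that. This matches the pattern visible in the logs above.

import 'package:http/http.dart' as http;
import 'package:uuid/uuid.dart';

const apiKey = '<API_KEY>';

// One token per search session.
String sessionToken = Uuid().v4();

Future<http.Response> placeAutocomplete(String input) {
  // Every keystroke in the same session reuses the same token.
  final uri = Uri.parse(
      'https://maps.googleapis.com/maps/api/place/autocomplete/json'
      '?input=$input&key=$apiKey&sessiontoken=$sessionToken');
  return http.get(uri);
}

Future<http.Response> placeDetails(String placeId) async {
  // The Place Details request closes the session; only then is a new token generated.
  final uri = Uri.parse(
      'https://maps.googleapis.com/maps/api/place/details/json'
      '?place_id=$placeId&key=$apiKey&sessiontoken=$sessionToken');
  final response = await http.get(uri);
  sessionToken = Uuid().v4();
  return response;
}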
I'm using the World Air Quality API. I need 3 values: "AQI", "PM10" and "PM25". Some cities don't have a PM10 value, and my app freezes on the request with this error:
E/flutter ( 1339): [ERROR:flutter/lib/ui/ui_dart_state.cc(209)] Unhandled Exception: NoSuchMethodError: The method '[]' was called on null.
E/flutter ( 1339): Receiver: null
E/flutter ( 1339): Tried calling: []("v")
How can I add an if statement to check whether the value [pm10][v] exists in the JSON?
My code looks like this:
AirQuality(Map<String, dynamic> jsonBody) {
  aqi = int.tryParse(jsonBody['data']['aqi'].toString()) ?? -1;
  pm25 = int.tryParse(jsonBody['data']['iaqi']['pm25']['v'].toString()) ?? -1;
  pm10 = int.tryParse(jsonBody['data']['iaqi']['pm10']['v'].toString()) ?? -1;
}
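One way to handle this (a sketch, not the original class; it assumes a Dart SDK recent enough for the null-aware index operator) is to read the optional entries defensively, so a missing pm10 or pm25 simply falls back to -1:

AirQuality(Map<String, dynamic> jsonBody) {
  final data = jsonBody['data'];
  final iaqi = data['iaqi'] as Map<String, dynamic>? ?? {};

  aqi = int.tryParse(data['aqi'].toString()) ?? -1;
  // Some cities do not report pm25/pm10, so guard the lookup before calling toString.
  pm25 = int.tryParse(iaqi['pm25']?['v']?.toString() ?? '') ?? -1;
  pm10 = int.tryParse(iaqi['pm10']?['v']?.toString() ?? '') ?? -1;
}

An explicit iaqi.containsKey('pm10') check would work just as well; the point is not to call ['v'] on a value that may be null.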
I am trying to convert a Map<DateTime, List> into JSON with the following code:
String ecoded = json.encode({
DateTime.now(): ["Sample data", 3,true,"Example", 2],
});
But it gives the following error:
E/flutter (11773): [ERROR:flutter/lib/ui/ui_dart_state.cc(209)] Unhandled
Exception: Converting object to an encodable object failed: _LinkedHashMap len:1
E/flutter (11773): #0 _JsonStringifier.writeObject
(dart:convert/json.dart:688:7)
E/flutter (11773): #1 _JsonStringStringifier.printOn
(dart:convert/json.dart:877:17)
E/flutter (11773): #2 _JsonStringStringifier.stringify
(dart:convert/json.dart:862:5)
E/flutter (11773): #3 JsonEncoder.convert (dart:convert/json.dart:262:30)
E/flutter (11773): #4 JsonCodec.encode (dart:convert/json.dart:172:45)
I have converted a Map into JSON many times, but in all those cases the Map was something like this:
Map<String, dynamic>
This error is occurring because there is a List.
If anyone knows how to convert a Map<DateTime, List> into JSON, please answer this question.
Thanks
JSON only supports Strings as keys. So if you want to use DateTimes as keys, you have to convert them manually first:
String ecoded = json.encode({
DateTime.now().toString(): ["Sample data", 3,true,"Example", 2],
});
or
Map<String, dynamic> result = datetimeMap.map((k, v) => MapEntry('$k', v));
In Dart, json.encode(DateTime.now()) is not possible. You need to convert it by using toIso8601String() or toString():
String ecoded = json.encode({
  DateTime.now().toIso8601String(): ["Sample data", 3, true, "Example", 2],
});
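Both snippets only cover the encode direction. If the map has to be read back, the string keys can be turned into DateTime again; a small sketch, assuming the keys were written with toIso8601String() or toString() (both of which DateTime.parse accepts):

import 'dart:convert';

void main() {
  final encoded = json.encode({
    DateTime.now().toIso8601String(): ['Sample data', 3, true, 'Example', 2],
  });

  // Decode back and restore the DateTime keys.
  final decoded = json.decode(encoded) as Map<String, dynamic>;
  final restored =
      decoded.map((k, v) => MapEntry(DateTime.parse(k), v as List));
  print(restored);
}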
I am trying to read a JSON stream from an MQTT broker in Apache Spark with Structured Streaming, read some properties of the incoming JSON, and output them to the console. My code looks like this:
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.get_json_object
import org.apache.spark.sql.types.StringType

val spark = SparkSession
.builder()
.appName("BahirStructuredStreaming")
.master("local[*]")
.getOrCreate()
import spark.implicits._
val topic = "temp"
val brokerUrl = "tcp://localhost:1883"
val lines = spark.readStream
.format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
.option("topic", topic).option("persistence", "memory")
.load(brokerUrl)
.toDF().withColumn("payload", $"payload".cast(StringType))
val jsonDF = lines.select(get_json_object($"payload", "$.eventDate").alias("eventDate"))
val query = jsonDF.writeStream
.format("console")
.start()
query.awaitTermination()
However, when the JSON arrives, I get the following errors:
Exception in thread "main" org.apache.spark.sql.streaming.StreamingQueryException: Writing job aborted.
=== Streaming Query ===
Identifier: [id = 14d28475-d435-49be-a303-8e47e2f907e3, runId = b5bd28bb-b247-48a9-8a58-cb990edaf139]
Current Committed Offsets: {MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496]: -1}
Current Available Offsets: {MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496]: 0}
Current State: ACTIVE
Thread State: RUNNABLE
Logical Plan:
Project [get_json_object(payload#22, $.id) AS eventDate#27]
+- Project [id#10, topic#11, cast(payload#12 as string) AS payload#22, timestamp#13]
+- StreamingExecutionRelation MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496], [id#10, topic#11, payload#12, timestamp#13]
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:300)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:189)
Caused by: org.apache.spark.SparkException: Writing job aborted.
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.doExecute(WriteToDataSourceV2Exec.scala:92)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:247)
at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:296)
at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3384)
at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2783)
at org.apache.spark.sql.Dataset.$anonfun$withAction$2(Dataset.scala:3365)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3365)
at org.apache.spark.sql.Dataset.collect(Dataset.scala:2783)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$15(MicroBatchExecution.scala:537)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$14(MicroBatchExecution.scala:533)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:351)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:349)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runBatch(MicroBatchExecution.scala:532)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$2(MicroBatchExecution.scala:198)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:351)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:349)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:166)
at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:56)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:160)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:279)
... 1 more
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 8, localhost, executor driver): java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String$(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getUTF8String(rows.scala:195)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:619)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$2(WriteToDataSourceV2Exec.scala:117)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:116)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.$anonfun$doExecute$2(WriteToDataSourceV2Exec.scala:67)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:405)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1887)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1875)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1874)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1874)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2108)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2057)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2046)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.doExecute(WriteToDataSourceV2Exec.scala:64)
... 34 more
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String$(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getUTF8String(rows.scala:195)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:619)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$2(WriteToDataSourceV2Exec.scala:117)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:116)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.$anonfun$doExecute$2(WriteToDataSourceV2Exec.scala:67)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:405)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
I am sending the JSON records using the mosquitto broker, and they look like this:
mosquitto_pub -m '{"eventDate": "2020-11-11T15:17:00.000+0200"}' -t "temp"
It seems that every string coming from the Bahir stream source provider raises this error. For instance, the following code also raises it:
spark.readStream
.format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
.option("topic", topic).option("persistence", "memory")
.load(brokerUrl)
.select("topic")
.writeStream
.format("console")
.start()
It looks like Spark does not recognize strings coming from Bahir, maybe some kind of weird string class version issue. I've tried the following actions to make the code work:
setting the Java version to 8
upgrading the Spark version from 2.4.0 to 2.4.7
setting the Scala version to 2.11.12
using the decode function with all possible encoding combinations instead of .cast(StringType) to transform the "payload" column to a String
using the substring function on the "payload" column to recreate a compatible String.
Finally, I got working code by recreating the string with the String constructor and the Dataset API:
val lines = spark.readStream
.format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
.option("topic", topic).option("persistence", "memory")
.load(brokerUrl)
.select("payload")
.as[Array[Byte]]
.map(payload => new String(payload))
.toDF("payload")
This solution is rather ugly but at least it works.
I believe that there is nothing wrong with the code provided in the question, and I suspect a bug on the Bahir or Spark side that prevents Spark from handling Strings coming from the Bahir source.
I am trying to get some values from the YouTube API. My Express code is:
https.get("https://www.googleapis.com/youtube/v3/videos?part=snippet&id=qW_SWM1wpMA&key=MYAPIKEY",function(res){
res.on("data",function(data){
const da=JSON.parse(data);
console.log(da);
})
});
Now I am console-logging the data coming from the API, but I get this error:
$ node app.js
Port is running
undefined:13
"description": "\"Bang Bhaja\" shows how Nonte and Fonte makes a trick to punish Keltu for punishing them from Superintendent Sir for eating the Ilish Maach.\n\nNonte Fonte is a Bengali comic-strip creation of Narayan Debnath which originally was serialized for the childre
SyntaxError: Unexpected end of JSON input
at JSON.parse (<anonymous>)
at IncomingMessage.<anonymous> (C:\Users\LENOVO\Desktop\youtube\app.js:13:19)
at IncomingMessage.emit (events.js:314:20)
at IncomingMessage.Readable.read (_stream_readable.js:514:10)
at flow (_stream_readable.js:987:34)
at resume_ (_stream_readable.js:968:3)
at processTicksAndRejections (internal/process/task_queues.js:80:21)
But when I paste the API link (the one mentioned above in the https.get method) into Chrome, I get the JSON data successfully. So why am I not getting the JSON data in the terminal?