Serialization error while writing JSON to file

I am reading text files and creating JSON objects (JsValues) in every iteration. I want to save them to a file at each iteration. I am using the Play Framework to create the JSON objects.
class Cleaner {
  def getDocumentData() = {
    for (i <- no_of_files) {
      // ... do something ...
      some_json = Json.obj("text" -> LARGE_TEXT)
      final_json = Json.stringify(some_json)
      // save final_json here to a file
    }
  }
}
I tried using PrintWriter to save the JSON, but I am getting Exception in thread "main" org.apache.spark.SparkException: Task not serializable as the error.
How should I correct this? Or is there another way I can save the JsValue?
UPDATE:
I read that the Serializable trait has to be used in this case. I have the following function:
class Cleaner() extends Serializable {
  def readDocumentData() {
    val conf = new SparkConf()
      .setAppName("linkin_spark")
      .setMaster("local[2]")
      .set("spark.executor.memory", "1g")
      .set("spark.rdd.compress", "true")
      .set("spark.storage.memoryFraction", "1")
    val sc = new SparkContext(conf)
    val temp = sc.wholeTextFiles("text_doc.dat")
    val docStartRegex = """<DOC>""".r
    val docEndRegex = """</DOC>""".r
    val docTextStartRegex = """<TEXT>""".r
    val docTextEndRegex = """</TEXT>""".r
    val docnoRegex = """<DOCNO>(.*?)</DOCNO>""".r
    val writer = new PrintWriter(new File("test.json"))
    for (fileData <- temp) {
      val filename = fileData._1
      val content: String = fileData._2
      println(s"For $filename, the data is:")
      var startDoc = false // This is for the
      var endDoc = false   // whole file
      var startText = false
      var endText = false
      var textChunk = new ListBuffer[String]()
      var docID: String = ""
      var es_json: JsValue = Json.obj()
      for (current_line <- content.lines) {
        current_line match {
          case docStartRegex(_*) => {
            startDoc = true
            endText = false
            endDoc = false
          }
          case docnoRegex(group) => {
            docID = group.trim
          }
          case docTextStartRegex(_*) => {
            startText = true
          }
          case docTextEndRegex(_*) => {
            endText = true
            startText = false
          }
          case docEndRegex(_*) => {
            endDoc = true
            startDoc = false
            es_json = Json.obj(
              "_id" -> docID,
              "_source" -> Json.obj(
                "text" -> textChunk.mkString(" ")
              )
            )
            writer.write(es_json.toString())
            println(es_json.toString())
            textChunk.clear()
          }
          case _ => {
            if (startDoc && !endDoc && startText) {
              textChunk += current_line.trim
            }
          }
        }
      }
    }
    writer.close()
  }
}
This is the function to which I added the trait, but I am still getting the same exception.
I rewrote a smaller version of it:
def foo() {
  val conf = new SparkConf()
    .setAppName("linkin_spark")
    .setMaster("local[2]")
    .set("spark.executor.memory", "1g")
    .set("spark.rdd.compress", "true")
    .set("spark.storage.memoryFraction", "1")
  val sc = new SparkContext(conf)
  var es_json: JsValue = Json.obj()
  val writer = new PrintWriter(new File("test.json"))
  for (i <- 1 to 10) {
    es_json = Json.obj(
      "_id" -> i,
      "_source" -> Json.obj(
        "text" -> "Eureka!"
      )
    )
    println(es_json)
    writer.write(es_json.toString() + "\n")
  }
  writer.close()
}
This function works fine both with and without Serializable. I cannot understand what's happening.

EDIT: The first version of this answer was written on a phone.
It's not your main class that needs to be serializable, but the code you use in the RDD processing loop, in this case the body of for (fileData <- temp).
It needs to be serializable because the Spark data sits on multiple partitions that may be on multiple computers. So the functions you apply to this data need to be serializable, so they can be sent to the other computers where they will be executed in parallel.
PrintWriter cannot be serialized, since it refers to a file that is only available from the original computer. Hence the serialization error.
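For illustration, here is a minimal, hypothetical sketch of that anti-pattern: the closure passed to foreach captures the writer, so Spark has to serialize it to ship the task to the executors, and fails:
import java.io.{File, PrintWriter}
import org.apache.spark.{SparkConf, SparkContext}

object NotSerializableDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("demo").setMaster("local[2]"))
    val writer = new PrintWriter(new File("out.txt")) // lives only on the driver
    // The closure below captures `writer`, so Spark tries to serialize it
    // and throws SparkException: Task not serializable.
    sc.parallelize(1 to 3).foreach(n => writer.write(n.toString))
    writer.close()
    sc.stop()
  }
}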
To write your data on the computer initializing the Spark process, you need to take the data that is spread all over the cluster and bring it to your machine, then write it.
To do that you can collect the result with rdd.collect(), which will take all the data from the cluster and put it in your driver thread's memory. Then you can write it to a file using the PrintWriter,
like this:
// (the regexes and the writer are defined as before; the writer is only
// used on the driver, after collect())
temp.flatMap { fileData =>
  val filename = fileData._1
  val content: String = fileData._2
  println(s"For $filename, the data is:")
  var startDoc = false // This is for the
  var endDoc = false   // whole file
  var startText = false
  var endText = false
  var textChunk = new ListBuffer[String]()
  var docID: String = ""
  var es_json: JsValue = Json.obj()
  var results = ArrayBuffer[String]()
  for (current_line <- content.lines) {
    current_line match {
      case docStartRegex(_*) => {
        startDoc = true
        endText = false
        endDoc = false
      }
      case docnoRegex(group) => {
        docID = group.trim
      }
      case docTextStartRegex(_*) => {
        startText = true
      }
      case docTextEndRegex(_*) => {
        endText = true
        startText = false
      }
      case docEndRegex(_*) => {
        endDoc = true
        startDoc = false
        es_json = Json.obj(
          "_id" -> docID,
          "_source" -> Json.obj(
            "text" -> textChunk.mkString(" ")
          )
        )
        results.append(es_json.toString())
        println(es_json.toString())
        textChunk.clear()
      }
      case _ => {
        if (startDoc && !endDoc && startText) {
          textChunk += current_line.trim
        }
      }
    }
  }
  results
}
.collect()
.foreach(es_json => writer.write(es_json))
If the result is too large for the driver's memory, you can use the saveAsTextFile function, which streams each partition directly to disk. In this second case, the path you give as an argument will be made into a folder, and each partition of your RDD will be written to a numbered file inside it.
like this (the flatMap body is exactly the same as in the collect() version above; only the sink changes):
temp.flatMap { fileData =>
  // ... same parsing logic as above, accumulating `results` ...
  results
}
.saveAsTextFile("test.json")

Related

Reading and initializing json data with scalatest withFixture

I am trying to use the withFixture method to initialize my var ip2GeoTestJson and use it throughout my tests. I was able to achieve the desired logic with var year. I believe the error I am getting (parsing JNothing) is caused by withFixture not initializing ip2GeoTestJson with the JSON.
I am currently getting this error:
*** RUN ABORTED ***
An exception or error caused a run to abort: java.lang.ClassCastException was thrown scenario("event.client_ip_address and event_header.client_ip_address both have values") -, construction cannot continue: "org.json4s.JsonAST$JNothing$ cannot be cast to org.json4s.JsonAST$JObject" (IP2GeoTestSuite.scala:51)
Code:
class IP2GeoTestSuite extends FeatureSpec with SparkContextFixture {
  var ip2GeoTestJson: JValue = null
  var year: String = null

  feature("feature") {
    scenario("scenario") {
      println(ip2GeoTestJson)
      assert(year != null)
      assert(ip2GeoTestJson != null)
    }
  }

  override def withFixture(test: NoArgTest): org.scalatest.Outcome = {
    year = test.configMap("year").asInstanceOf[String]
    val ip2GeoConfigFile = test.configMap("config").asInstanceOf[String]
    val ip2GeoUrl = getClass.getResourceAsStream(s"/$ip2GeoConfigFile")
    val ip2GeoJsonString = Source.fromInputStream(ip2GeoUrl).getLines.mkString("")
    System.out.println(ip2GeoJsonString)
    ip2GeoTestJson = parse(ip2GeoJsonString)
    try {
      test()
    }
  }
}
The code works fine when the lines regarding ip2GeoData are moved to the top of the class, like so, but then I need to hardcode the file name:
class IP2GeoTestSuite extends FeatureSpec with SparkContextFixture {
  val ip2GeoConfigFile = "ip2geofile.json"
  val ip2GeoUrl = getClass.getResourceAsStream(s"/$ip2GeoConfigFile")
  val ip2GeoJsonString = Source.fromInputStream(ip2GeoUrl).getLines.mkString("")
  System.out.println(ip2GeoJsonString)
  val ip2GeoTestJson = parse(ip2GeoJsonString)
  var year: String = null

  feature("feature") {
    scenario("scenario") {
      println(ip2GeoTestJson)
      assert(year != null)
      assert(ip2GeoTestJson != null)
    }
  }

  override def withFixture(test: NoArgTest): org.scalatest.Outcome = {
    year = test.configMap("year").asInstanceOf[String]
    try {
      test()
    }
  }
}
Set params before every test (see http://www.scalatest.org/user_guide/sharing_fixtures#withFixtureOneArgTest):
case class FixtureParams(year: String, ip2GeoTestJson: JValue)

class IP2GeoTestSuite extends fixture.FeatureSpec with SparkContextFixture {
  type FixtureParam = FixtureParams

  feature("feature") {
    scenario("scenario") { fixture =>
      println(fixture.ip2GeoTestJson)
      assert(fixture.year != null)
      assert(fixture.ip2GeoTestJson != null)
    }
  }

  override def withFixture(test: OneArgTest): org.scalatest.Outcome = {
    val year = test.configMap("year").asInstanceOf[String]
    val ip2GeoConfigFile = test.configMap("config").asInstanceOf[String]
    val ip2GeoUrl = getClass.getResourceAsStream(s"/$ip2GeoConfigFile")
    val ip2GeoJsonString = Source.fromInputStream(ip2GeoUrl).getLines.mkString("")
    val fixtureParam = FixtureParams(year, parse(ip2GeoJsonString))
    try {
      withFixture(test.toNoArgTest(fixtureParam))
    } finally {
      // Close resources to avoid memory leaks and unpredictable behaviour
      ip2GeoUrl.close()
    }
  }
}
Set params only once, before any test runs (http://www.scalatest.org/user_guide/sharing_fixtures#beforeAndAfter):
class IP2GeoTestSuite extends FeatureSpec with BeforeAndAfter {
  var ip2GeoTestJson: JValue = null
  var year: String = null

  before {
    // Load config manually because configMap isn't available here.
    val config = ConfigFactory.load()
    year = config.getString("year")
    val ip2GeoConfigFile = "ip2geofile.json"
    val ip2GeoUrl = getClass.getResourceAsStream(s"/$ip2GeoConfigFile")
    val ip2GeoJsonString = Source.fromInputStream(ip2GeoUrl).getLines.mkString("")
    ip2GeoUrl.close()
    System.out.println(ip2GeoJsonString)
    ip2GeoTestJson = parse(ip2GeoJsonString)
  }

  feature("feature") {
    scenario("scenario") {
      println(ip2GeoTestJson)
      assert(year != null)
      assert(ip2GeoTestJson != null)
    }
  }
}

Convert spark decision tree model debug string to nested JSON in scala

Similar to the tree JSON parsing quoted here, I am trying to implement a simple visualization of decision trees in Scala, exactly like the display method available in Databricks notebooks.
I am new to Scala and struggling to get the logic right. I understand we have to make recursive calls to build the children and break when the final prediction values are reached. I have attempted the code below, using the model debug string shown inside treeJson1 as input.
import scala.util.control.Breaks

val loop = new Breaks // provides loop.breakable / loop.break()

def getStatmentType(x: String): (String, String) = {
  val ifPattern = "If+".r
  val ifelsePattern = "Else+".r
  var t = ifPattern.findFirstIn(x.toString)
  if (t != None) {
    ("If", (x.toString).replace("If", ""))
  } else {
    var ts = ifelsePattern.findFirstIn(x.toString)
    if (ts != None) ("Else", (x.toString).replace("Else", ""))
    else ("None", (x.toString).replace("(", "").replace(")", ""))
  }
}

def delete[A](test: List[A])(i: Int) = test.take(i) ++ test.drop(i + 1)

def BuildJson(tree: List[String]): List[Map[String, Any]] = {
  var block: List[Map[String, Any]] = List()
  var lines: List[String] = tree
  loop.breakable {
    while (lines.length > 0) {
      println("here")
      var (cond, name) = getStatmentType(lines(0))
      println("initial" + cond)
      if (cond == "If") {
        println("if" + cond)
        // lines = lines.tail
        lines = delete(lines)(0)
        block = block :+ Map("if-name" -> name, "children" -> BuildJson(lines))
        println("After pop Else State" + lines(0))
        val (p_cond, p_name) = getStatmentType(lines(0))
        // println(p_cond + " = " + p_name + "\n")
        cond = p_cond
        name = p_name
        println(cond + " after=" + name + "\n")
        if (cond == "Else") {
          println("else" + cond)
          lines = lines.tail
          block = block :+ Map("else-name" -> name, "children" -> BuildJson(lines))
        }
      } else if (cond == "None") {
        println(cond + "NONE")
        lines = delete(lines)(0)
        block = block :+ Map("predict" -> name)
      } else {
        println("Finally Break")
        println("While loop--" + lines)
        loop.break()
      }
    }
  }
  block
}

def treeJson1(str: String): JsValue = {
  val str = "If (feature 0 in {1.0,2.0,3.0,4.0,5.0,6.0,7.0,8.0,10.0,11.0,12.0,13.0})\n If (feature 0 in {6.0})\n Predict: 17.0\n Else (feature 0 not in {6.0})\n Predict: 6.0\n Else (feature 0 not in {1.0,2.0,3.0,4.0,5.0,6.0,7.0,8.0,10.0,11.0,12.0,13.0})\n Predict: 20.0"
  val x = str.replace(" ", "")
  val xs = x.split("\n").toList
  var js = BuildJson(xs)
  println(MapReader.mapToJson(js))
  Json.toJson("")
}
Expected output:
[
  {
    'name': 'Root',
    'children': [
      {
        'name': 'feature 0 in {1.0,2.0,3.0,4.0,5.0,6.0,7.0,8.0,10.0,11.0,12.0,13.0}',
        'children': [
          {
            'name': 'feature 0 in {6.0}',
            'children': [
              {
                'name': 'Predict: 17.0'
              }
            ]
          },
          {
            'name': 'feature 0 not in {6.0}',
            'children': [
              {
                'name': 'Predict: 6.0'
              }
            ]
          }
        ]
      },
      {
        'name': 'feature 0 not in {1.0,2.0,3.0,4.0,5.0,6.0,7.0,8.0,10.0,11.0,12.0,13.0}',
        'children': [
          {
            'name': 'Predict: 20.0'
          }
        ]
      }
    ]
  }
]
You don't need to parse the debug string; instead, you can walk the tree starting from the root node of the model.
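For example, here is a minimal sketch of that approach, assuming the older MLlib API where DecisionTreeModel exposes the tree through model.topNode and each Node carries split, leftNode, rightNode, isLeaf and predict fields (the name strings are illustrative, not the exact Databricks display format):
import org.apache.spark.mllib.tree.model.{DecisionTreeModel, Node}
import play.api.libs.json.{JsValue, Json}

// Recursively turn an MLlib tree node into the nested {name, children} shape.
def nodeToJson(node: Node): JsValue =
  if (node.isLeaf) {
    Json.obj("name" -> s"Predict: ${node.predict.predict}")
  } else {
    Json.obj(
      "name" -> node.split.map(_.toString).getOrElse("unknown split"),
      "children" -> Json.arr(
        node.leftNode.map(nodeToJson).getOrElse(Json.obj()),
        node.rightNode.map(nodeToJson).getOrElse(Json.obj())
      )
    )
  }

def treeToJson(model: DecisionTreeModel): JsValue =
  Json.obj("name" -> "Root", "children" -> Json.arr(nodeToJson(model.topNode)))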

Play Scala - confusion about the result type of Action.async

I'm a little bit confused about the expected result of Action.async. Here is the use case: from the frontend I receive a JSON to validate (a Foo); I send this data by calling another web service, then I extract and validate the received JSON (the Bar case class) too. The problem is that when I return a result, I have the following error:
type mismatch;
found : Object
required: scala.concurrent.Future[play.api.mvc.Result]
Here is my code:
case class Foo(id: String)
case class Bar(id: String)

def create() = {
  Action.async(parse.json) { request =>
    val sessionTokenOpt: Option[String] = request.headers.get("sessionToken")
    val sessionToken: String = "Bearer " + (sessionTokenOpt match {
      case None => throw new NoSessionTokenFound
      case Some(session) => session
    })
    val user = ""
    val structureId: Option[String] = request.headers.get("structureId")
    if (sessionToken.isEmpty) {
      Future.successful(BadRequest("no token"))
    } else {
      val url = config.getString("createURL").getOrElse("")
      request.body.validate[Foo].map { f =>
        Logger.debug("sessionToken = " + sessionToken)
        Logger.debug(f.toString)
        val data = Json.toJson(f)
        val holder = WS.url(url)
        val complexHolder =
          holder.withHeaders(("Content-type", "application/json"), ("Authorization", sessionToken))
        Logger.debug("url = " + url)
        Logger.debug(complexHolder.headers.toString)
        Logger.debug(Json.prettyPrint(data))
        val futureResponse = complexHolder.put(data)
        futureResponse.map { response =>
          if (response.status == 200) {
            response.json.validate[Bar].map { b =>
              Future.successful(Ok(Json.toJson(b)))
            }.recoverTotal { e: JsError =>
              Future.successful(BadRequest("The JSON in the body is not valid."))
            }
          } else {
            Logger.debug("status from apex " + response.status)
            Future.successful(BadRequest("alo"))
          }
        }
        Await.result(futureResponse, 5.seconds)
      }.recoverTotal { e: JsError =>
        Future.successful(BadRequest("The JSON in the body is not valid."))
      }
    }
  }
}
What is wrong in my function?
Firstly, this is doing nothing:
futureResponse.map { response =>
  if (response.status == 200) {
    response.json.validate[Bar].map { b =>
      Future.successful(Ok(Json.toJson(b)))
    }.recoverTotal { e: JsError =>
      Future.successful(BadRequest("The JSON in the body is not valid."))
    }
  } else {
    Logger.debug("status from apex " + response.status)
    Future.successful(BadRequest("alo"))
  }
}
Because you're not capturing the result or assigning it to anything. It's equivalent to doing this:
val foo = "foo"
foo + " bar"
println(foo)
The foo + " bar" statement there is pointless, it achieves nothing.
Now to debug type inference problems, what you need to do is assign results to things, and annotate with the types you're expecting. So, assign the result of the map to something first:
val newFuture = futureResponse.map {
...
}
Now, what is the type of newFuture? The answer is actually Future[Future[Result]], because you're using map and then returning a future from inside it. If you want to return a future inside your map function, you have to use flatMap instead; it flattens the Future[Future[Result]] to Future[Result]. But actually in your case you don't need that: you can use map and just get rid of all those Future.successful calls, because you're not actually doing anything in that map function that needs to return a future.
And then get rid of that Await, as others have said: using Await means blocking, which negates the point of using futures in the first place.
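A tiny sketch of the difference, with hypothetical values:
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

val n: Future[Int] = Future.successful(1)

// map with a Future-returning function nests the futures...
val nested: Future[Future[Int]] = n.map(i => Future.successful(i + 1))

// ...flatMap flattens them...
val flat: Future[Int] = n.flatMap(i => Future.successful(i + 1))

// ...and map with a plain value needs no Future.successful at all.
val plain: Future[Int] = n.map(i => i + 1)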
Anyway, this should compile:
def create() = {
  Action.async(parse.json) { request =>
    val sessionTokenOpt: Option[String] = request.headers.get("sessionToken")
    val sessionToken: String = "Bearer " + (sessionTokenOpt match {
      case None => throw new NoSessionTokenFound
      case Some(session) => session
    })
    val user = ""
    val structureId: Option[String] = request.headers.get("structureId")
    if (sessionToken.isEmpty) {
      Future.successful(BadRequest("no token"))
    } else {
      val url = config.getString("createURL").getOrElse("")
      request.body.validate[Foo].map { f =>
        Logger.debug("sessionToken = " + sessionToken)
        Logger.debug(f.toString)
        val data = Json.toJson(f)
        val holder = WS.url(url)
        val complexHolder =
          holder.withHeaders(("Content-type", "application/json"), ("Authorization", sessionToken))
        Logger.debug("url = " + url)
        Logger.debug(complexHolder.headers.toString)
        Logger.debug(Json.prettyPrint(data))
        val futureResponse = complexHolder.put(data)
        futureResponse.map { response =>
          if (response.status == 200) {
            response.json.validate[Bar].map { b =>
              Ok(Json.toJson(b))
            }.recoverTotal { e: JsError =>
              BadRequest("The JSON in the body is not valid.")
            }
          } else {
            Logger.debug("status from apex " + response.status)
            BadRequest("alo")
          }
        }
      }.recoverTotal { e: JsError =>
        Future.successful(BadRequest("The JSON in the body is not valid."))
      }
    }
  }
}
Do not Await.result(futureResponse, 5.seconds). Just return the futureResponse as is. Action.async can deal with it (in fact, it wants to deal with it: it requires you to return a Future).
Note that in your various other code paths (else, recoverTotal) you are already doing that.
If you use Action.async you don't need to await the result, so try returning the future as is, without Await.result.

How to yield a JSON object from a for loop in Scala?

for (character <- content) {
  if (character == '\n') {
    val current_line = line.mkString
    line.clear()
    current_line match {
      case docStartRegex(_*) => {
        startDoc = true
        endText = false
        endDoc = false
      }
      case docnoRegex(group) => {
        docID = group.trim
      }
      case docTextStartRegex(_*) => {
        startText = true
      }
      case docTextEndRegex(_*) => {
        endText = true
        startText = false
      }
      case docEndRegex(_*) => {
        endDoc = true
        startDoc = false
        es_json = Json.obj(
          "_index" -> "ES_SPARK_AP",
          "_type" -> "document",
          "_id" -> docID,
          "_source" -> Json.obj(
            "text" -> textChunk.mkString(" ")
          )
        )
        // yield es_json
        textChunk.clear()
      }
      case _ => {
        if (startDoc && !endDoc && startText) {
          textChunk += current_line.trim
        }
      }
    }
  } else {
    line += character
  }
}
The above for loop parses a text file and creates a JSON object for each chunk parsed in the loop. The JSON will be sent to Elasticsearch for further processing. In Python, we can yield the JSON and consume the generator easily, like:
def func():
    for i in range(num):
        # ... some computations ...
        yield {
            JSON  ## JSON is yielded
        }

for json in func():  ## we parse through the generator here.
    process(json)
I cannot understand how I can use yield in a similar fashion in Scala.
If you want lazy returns, Scala does this with Iterator types. Specifically, if you want to handle values line by line, I'd split the content into lines first with .lines:
val content: String = ???

val results: Iterator[JsValue] =
  for (line <- content.lines) yield {
    line match {
      case docEndRegex(_*) => ...
    }
  }
You can also use a function directly
def toJson(line: String): JsValue =
  line match {
    case "hi" => Json.obj("line" -> "hi")
    case "bye" => Json.obj("what" -> "a jerk")
  }

val results: Iterator[JsValue] =
  for (line <- content.lines) yield toJson(line)
This is equivalent to doing
content.lines.map(line => toJson(line))
Or somewhat equivalently in Python:
lines = (line.strip() for line in content.split("\n"))
jsons = (toJson(line) for line in lines)

Compare json equality in Scala

How can I compare whether two JSON structures are the same in Scala?
For example, if I have:
{
  resultCount: 1,
  results: [
    {
      artistId: 331764459,
      collectionId: 780609005
    }
  ]
}
and
{
  results: [
    {
      collectionId: 780609005,
      artistId: 331764459
    }
  ],
  resultCount: 1
}
They should be considered equal
You should be able to simply do json1 == json2, if the json libraries are written correctly. Is that not working for you?
This is with spray-json, although I would expect the same from every json library:
import spray.json._
import DefaultJsonProtocol._
Welcome to Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_51).
Type in expressions to have them evaluated.
Type :help for more information.
scala> val json1 = """{ "a": 1, "b": [ { "c":2, "d":3 } ] }""".parseJson
json1: spray.json.JsValue = {"a":1,"b":[{"c":2,"d":3}]}
scala> val json2 = """{ "b": [ { "d":3, "c":2 } ], "a": 1 }""".parseJson
json2: spray.json.JsValue = {"b":[{"d":3,"c":2}],"a":1}
scala> json1 == json2
res1: Boolean = true
Spray-json uses an immutable Scala Map to represent a JSON object in the abstract syntax tree resulting from a parse, so it is just Map's equality semantics that make this work.
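Since the rest of this page uses play-json, here is a quick sketch showing the same behaviour there (object field order is ignored for equality; note that array element order still matters in any library):
import play.api.libs.json.Json

val a = Json.parse("""{"resultCount":1,"results":[{"artistId":331764459,"collectionId":780609005}]}""")
val b = Json.parse("""{"results":[{"collectionId":780609005,"artistId":331764459}],"resultCount":1}""")

assert(a == b) // true: JsObject equality compares fields as a map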
You can also use scalatest-json
Example:
it("should fail on slightly different json explaining why") {
val input = """{"someField": "valid json"}""".stripMargin
val expected = """{"someField": "different json"}""".stripMargin
input should matchJson(expected)
}
When the two JSONs don't match, a nice diff will be displayed, which is quite useful when working with big JSONs.
I can confirm that it also works just fine with the Jackson library, using the == operator:
import com.fasterxml.jackson.databind.ObjectMapper

val objectMapper = new ObjectMapper()

val simpleJson =
  """
    |{"field1":"value1","field2":"value2"}
  """.stripMargin

val simpleJsonNode = objectMapper.readTree(simpleJson)
val simpleJsonNodeFromString = objectMapper.readTree(simpleJsonNode.toString)

assert(simpleJsonNode == simpleJsonNodeFromString)
spray-json is definitely great, but I use Gson since I already had a dependency on the Gson library in my project. I am using these in my unit tests; it works well for simple JSON.
import com.google.gson.JsonParser
import org.scalatest.FunSuite

class LogEnricherSpec extends FunSuite {

  test("compares json to json") {
    val parser = new JsonParser()
    assert(parser.parse("""
      {
        "eventType" : "TransferItems",
        "timeMillis" : "1234567890",
        "messageXml" : {
          "TransferId" : 123456
        }
      } """.stripMargin)
      ==
      parser.parse("""
      {
        "timeMillis" : "1234567890",
        "eventType" : "TransferItems",
        "messageXml" : {
          "TransferId" : 123456
        }
      }
      """.stripMargin))
  }
}
Calling the method compare_2Json(str1, str2) will return a boolean value. Please make sure that the two string parameters are valid JSON. Feel free to use and test it.
import scala.collection.mutable.ArrayBuffer

def compare_2Json(js1: String, js2: String): Boolean = {
  var js_str1 = js1
  var js_str2 = js2
  js_str1 = js_str1.replaceAll(" ", "")
  js_str2 = js_str2.replaceAll(" ", "")
  var issame = false
  val arrbuff1 = ArrayBuffer[String]()
  val arrbuff2 = ArrayBuffer[String]()
  if (js_str1.substring(0, 1) == "{" && js_str2.substring(0, 1) == "{" || js_str1.substring(0, 1) == "[" && js_str2.substring(0, 1) == "[") {
    for (small_js1 <- split_JsonintoSmall(js_str1); small_js2 <- split_JsonintoSmall(js_str2)) {
      issame = compare_2Json(small_js1, small_js2)
      if (issame == true) {
        js_str1 = js_str1.substring(0, js_str1.indexOf(small_js1)) + js_str1.substring(js_str1.indexOf(small_js1) + small_js1.length)
        js_str2 = js_str2.substring(0, js_str2.indexOf(small_js2)) + js_str2.substring(js_str2.indexOf(small_js2) + small_js2.length)
      }
    }
    js_str1 = js_str1.substring(1, js_str1.length - 1)
    js_str2 = js_str2.substring(1, js_str2.length - 1)
    for (str_js1 <- js_str1.split(","); str_js2 <- js_str2.split(",")) {
      if (str_js1 != "" && str_js2 != "")
        if (str_js1 == str_js2) {
          js_str1 = js_str1.substring(0, js_str1.indexOf(str_js1)) + js_str1.substring(js_str1.indexOf(str_js1) + str_js1.length)
          js_str2 = js_str2.substring(0, js_str2.indexOf(str_js2)) + js_str2.substring(js_str2.indexOf(str_js2) + str_js2.length)
        }
    }
    js_str1 = js_str1.replace(",", "")
    js_str2 = js_str2.replace(",", "")
    if (js_str1 == "" && js_str2 == "") return true
    else return false
  }
  else return false
}

def split_JsonintoSmall(js_str: String): ArrayBuffer[String] = {
  val arrbuff = ArrayBuffer[String]()
  var json_str = js_str
  while (json_str.indexOf("{", 1) > 0 || json_str.indexOf("[", 1) > 0) {
    if (json_str.indexOf("{", 1) < json_str.indexOf("[", 1) && json_str.indexOf("{", 1) > 0 || json_str.indexOf("{", 1) > json_str.indexOf("[", 1) && json_str.indexOf("[", 1) < 0) {
      val right = findrealm(1, json_str, '{', '}')
      arrbuff += json_str.substring(json_str.indexOf("{", 1), right + 1)
      json_str = json_str.substring(0, json_str.indexOf("{", 1)) + json_str.substring(right + 1)
    }
    else {
      if (json_str.indexOf("[", 1) > 0) {
        val right = findrealm(1, json_str, '[', ']')
        arrbuff += json_str.substring(json_str.indexOf("[", 1), right + 1)
        json_str = json_str.substring(0, json_str.indexOf("[", 1)) + json_str.substring(right + 1)
      }
    }
  }
  arrbuff
}

def findrealm(begin_loc: Int, str: String, leftch: Char, rightch: Char): Int = {
  var left = str.indexOf(leftch, begin_loc)
  var right = str.indexOf(rightch, left)
  left = str.indexOf(leftch, left + 1)
  while (left < right && left > 0) {
    right = str.indexOf(rightch, right + 1)
    left = str.indexOf(leftch, left + 1)
  }
  right
}