I have been trying to parse JSON that has 3 different sets of data, where one element has a varying number of children and sometimes none. I am getting an error when no children are present, or only one. I declared the JSON as var data.
JSON A
{
"floorplan": [
{
"title": "plan1",
"url": "https://media.plan1.pdf"
},
{
"title": "plan2",
"url": "https://media.plan2.pdf"
}
]
}
JSON B
{"floorplan": []}
JSON C
{
"floorplan": [
{
"title": "plan1",
"url": "https://media.plan1.pdf"
}
]
}
I parsed the JSON like this:
var items = JSON.parse(data);
return {
floorplan1: items.floorplan[0].url,
floorplan2: items.floorplan[1].url
}
But it only returned data for JSON A; for the other 2 it gave TypeError: Cannot read property 'url' of undefined.
I modified the code to check whether floorplan has at least one child before reading the data.
var items = JSON.parse(data);
var plan = items.floorplan[0];
if(plan){
return {
floorplan1: items.floorplan[0].url,
floorplan2: items.floorplan[1].url
}
}
The new code returned data for JSON A and B (as an empty row), but gave the error for C. C has one child, yet it still got the error.
I also tried this code and still got the error for JSON C.
var items = JSON.parse(data);
var plan = items.floorplan[0];
var plan1;
var plan2;
if(plan){
plan1 = items.floorplan[0].url;
plan2 = items.floorplan[1].url;
}
return {
floorplan1 : plan1 ? plan1 : null,
floorplan2 : plan2 ? plan2 : null
}
Is there any method I can try to get data returned for all 3 types of JSON?
let data = `
[{"floorplan": [{
"title": "plan1",
"url": "https://media.plan1.pdf"
}, {
"title": "plan2",
"url": "https://media.plan2.pdf"
}]},
{"floorplan": []},
{"floorplan": [{
"title": "plan1",
"url": "https://media.plan1.pdf"
}]}]`;
let json = JSON.parse(data);
//console.log(json);
json.forEach(items=>{
//console.log(items);
let o = {
floorplan1: items.floorplan.length > 0 ? items.floorplan[0].url : '',
floorplan2: items.floorplan.length > 1 ? items.floorplan[1].url : ''
};
console.log(o);
o = {
floorplan1: (items.floorplan[0] || {'url':''}).url,
floorplan2: (items.floorplan[1] || {'url':''}).url
};
console.log(o);
o = {
floorplan1: items.floorplan[0]?.url,
floorplan2: items.floorplan[1]?.url
};
console.log(o);
const {floorplan: [one = {url:''}, two = {url:''}]} = items;
o = {
floorplan1: one.url,
floorplan2: two.url
};
console.log(o);
});
Sure. There are a few ways, and more than I show here. I have put all the raw data into one string, parsed it into JSON, and then iterated through that. In each loop, my variable items corresponds to one of the JSON variables you created and referenced in your question as items.
In the first example, I check that items.floorplan has at least enough elements to contain the url I'm trying to reference, then use the ternary operator ?: to output that URL if it exists or an empty string if it doesn't.
In the second example, I use the || (OR) operator to return the first operand that evaluates to truthy. If items.floorplan[x] exists, the result is that node; if it doesn't, I provide a default object with an empty url property on the right-hand side, and then just use the url from the resulting object.
In the third, I use the optional chaining operator (?.), introduced in ES2020. This method returns undefined if the element doesn't exist.
In the fourth example, I use destructuring to pull values out of the items variable, providing a default object with an empty url in case the array doesn't have a corresponding element.
But there are many more ways to go about it. These are just a few, and you can't necessarily say one approach is better; it depends on your intent and environment. With the exception of optional chaining (which yields undefined when the property doesn't exist), you can see these produce the same results.
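For instance, one more variation inside the same loop: a tiny helper that plucks a property from an array element and falls back to a default. This is only a sketch; the name pick is hypothetical, not a built-in.
const pick = (arr, i, prop, fallback = '') =>
  arr[i] && arr[i][prop] !== undefined ? arr[i][prop] : fallback;
o = {
  floorplan1: pick(items.floorplan, 0, 'url'),
  floorplan2: pick(items.floorplan, 1, 'url')
};
console.log(o);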
DOCS for optional chaining: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Optional_chaining
DOCS for destructuring: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Destructuring_assignment
An article on destructuring: https://javascript.info/destructuring-assignment
I apologize if this seems similar to other questions, but I have not been able to find any posts that resolve this issue for me. Basically, I am getting a JSON object and trying to parse it, but I can't parse it correctly, mainly the WordDetails section that I am getting from a Word API. I am able to get everything outside the results section under WordDetails. But when I get to results, I am not able to parse it correctly. Below is an example of the format.
{
"LastIndex": 133,
"SRDWords": [
{
"Domain": {
"URL": "abactinal.com",
"Available": true
},
"WordDetails": "{\"word\":\"abactinal\",\"results\":[{\"definition\":\"(of radiate animals) located on the surface or end opposite to that on which the mouth is situated\",\"partOfSpeech\":null,\"antonyms\":[\"actinal\"]}],\"syllables\":{\"count\":4,\"list\":[\"ab\",\"ac\",\"ti\",\"nal\"]}}"
},
{
"Domain": {
"URL": "aaronical.com",
"Available": true
},
"WordDetails": "{\"word\":\"aaronical\",\"syllables\":{\"count\":4,\"list\":[\"aa\",\"ron\",\"i\",\"cal\"]},\"pronunciation\":{\"all\":\"ɜ'rɑnɪkəl\"}}"
},
...
Here is my code below. I am getting to the results section of WordDetails, but if I try to parse the results section it fails, and if I try Object.entries on it, it does not return a response according to the alert messages I used. I know there must be a better way, but I'm not sure what. Most articles say to just JSON.parse and then map it, but that does not work. Any help would be appreciated!
data.SRDWords.map(word => {
//get data
for (let [key, value] of Object.entries(word)) {
if (key === "Domain") {
url = value.URL;
availability = value.Available;
} else if (key.trim() === "WordDetails") {
alert("value " + value);
wDetails = JSON.parse(value);
for (let [key2, value2] of Object.entries(wDetails)) {
if (key2 === "word") {
//store word
} else if (key2.toString().trim() === "results") {
let test = JSON.parse(value2);
test = Object.entries(value2);
test.map(t => {
alert(t.definition);
});
}
}
}
}
});
You did JSON.parse above, so there is no need to parse value2 again.
And the value for results is already an array, so there is no need for Object.entries either.
...
} else if (key2.toString().trim() === 'results') {
let test = JSON.parse(value2); // this should be removed
test = Object.entries(value2); // this should be removed; value2 is already an array
// map value2 directly
value2.map(t => {
alert(t.definition);
});
}
...
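In fact, once WordDetails has been parsed, you can skip the inner Object.entries loop entirely and read results off the parsed object. A minimal sketch, assuming value is the WordDetails string from the outer loop (note that results may be absent, as in the "aaronical" record, so it is guarded):
const wDetails = JSON.parse(value); // the only parse needed
(wDetails.results || []).forEach(t => {
  alert(t.definition);
});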
I have the URL of a JSON file and I want to get all the items with the same value.
Example:
http://sampleurl.com has this JSON
`{
"posts": [
{
"authors": [
{ "name": "John", "age": 30 },
{ "name": "John", "age": 35 }
]
}
]
}`
What I want to do is to list all those authors with the same name together with their age.
I have tried this with no success:
`var allposts = "http://sampleurl.com";
$.each(allposts.posts.authors, function(i, v) {
if (v.name == "John") {
alert("Ok");
return;
}
});`
Thanks
You need to get the data via an Ajax call - $.getJSON:
const authors = {};
$.getJSON( "http://sampleurl.com", data =>
data.posts.authors.forEach(author => {
authors[author.name] = authors[author.name] || []
authors[author.name].push(author)
});
);
At the end you have an object keyed on unique author names, with each key containing as its value an array of the authors with that name. You can do further processing to transform that to the data structure you need.
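For example, to keep only the names that actually occur more than once, you could filter the entries afterwards. A sketch, assuming the authors object built above (Object.fromEntries needs ES2019):
const duplicates = Object.fromEntries(
  Object.entries(authors).filter(([name, list]) => list.length > 1)
);
// duplicates.John => [{ name: 'John', age: 30 }, { name: 'John', age: 35 }]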
This example doesn't deal with data coming back that isn't in the shape you expect. For example, if some author records are missing a name, you will end up with a key undefined. And if there is no authors key or no posts key in the returned object you will get an exception.
So you have to decide how your program should behave in those cases. Should it explode? Or return an empty object? If you want it to continue with an empty object:
const authors = {};
$.getJSON( "http://sampleurl.com", data =>
if (data.posts && data.posts.authors) {
authors.forEach(author => {
const name = author.name || 'unknown';
authors[author.name] = authors[author.name] || []
authors[author.name].push(author)
});
} else {
console.log('Warning! Data from API did not contain posts.authors!')
}
);
Note that neither of these examples deals with the AJAX call itself failing. For that you need to chain a .fail() handler:
const authors = {};
const url = "http://sampleurl.com";
$.getJSON(url, data => {
if (data.posts && data.posts.authors) {
data.posts.authors.forEach(author => {
const name = author.name || 'unknown';
authors[name] = authors[name] || [];
authors[name].push(author);
});
} else {
console.log('Warning! Data from API did not contain posts.authors!');
}
}).fail(res => console.log(`Ajax call to ${url} failed with message ${res.responseText}!`));
10% of programming is getting it to work. The other 90% is coding for what happens when it doesn't work.
I was writing a Node.js script to combine all the JSON files in a directory and store the result as a new JSON file. I got the job done to a great extent, but it has a few flaws.
A.json
[
{
"id": "addEmoticon1",
"description": "Message to greet the user.",
"defaultMessage": "Hello, {name}!"
},
{
"id": "addPhoto1",
"description": "How are youu.",
"defaultMessage": "How are you??"
}
]
B.json
[
{
"id": "close1",
"description": "Close it.",
"defaultMessage": "Close!"
}
]
What I finally need is:
result.json
{
"addEmoticon1": "Hello, {name}!",
"addPhoto1": "How are you??",
"close1": "Close!"
}
I wrote a node.js script:
var fs = require('fs');
function readFiles(dirname, onFileContent, onError) {
fs.readdir(dirname, function(err, filenames) {
if (err) {
onError(err);
return;
}
filenames.forEach(function(filename) {
fs.readFile(dirname + filename, 'utf-8', function(err, content) {
if (err) {
onError(err);
return;
}
onFileContent(filename, content);
});
});
});
}
var data = {};
readFiles('C:/node/test/', function(filename, content) {
data[filename] = content;
var lines = content.split('\n');
lines.forEach(function(line) {
var parts = line.split('"');
if (parts[1] == 'id') {
fs.appendFile('result.json', parts[3]+': ', function (err) {});
}
if (parts[1] == 'defaultMessage') {
fs.appendFile('result.json', parts[3]+',\n', function (err) {});
}
});
}, function(err) {
throw err;
});
It extracts the 'id' and 'defaultMessage' but is not able to append correctly.
What I get:
result.json
addEmoticon1: addPhoto1: Hello, {name}!,
close1: How are you??,
Close!,
This output is different every time I run my script.
Aim 1: Surround items in double quotes
Aim 2: Add curly braces at the top and at the end
Aim 3: No comma after the last element
Aim 4: Same output every time I run my script
I'll start with the finished solution...
There's a big explanation at the end of this answer. Let's try to think big-picture for a little bit first, though.
readdirp('.')
.fmap(filter(match(/\.json$/)))
.fmap(map(readfilep))
.fmap(map(fmap(JSON.parse)))
.fmap(concatp)
.fmap(flatten)
.fmap(reduce(createMap)({}))
.fmap(data=> JSON.stringify(data, null, '\t'))
.fmap(writefilep(resolve(__dirname, 'result.json')))
.then(filename=> console.log('wrote results to %s', filename), err=>console.error(err));
Console output
wrote results to /path/to/result.json
result.json (I added a c.json with some data to show that this works with more than 2 files)
{
"addEmoticon1": "Hello, {name}!",
"addPhoto1": "How are you??",
"close1": "Close!",
"somethingelse": "Something!"
}
Implementation
I made Promise-based interfaces for readdir, readFile, and writeFile.
import {readdir, readFile, writeFile} from 'fs';
const readdirp = dir=>
new Promise((pass,fail)=>
readdir(dir, (err, filenames) =>
err ? fail(err) : pass(mapResolve (dir) (filenames))));
const readfilep = path=>
new Promise((pass,fail)=>
readFile(path, 'utf8', (err,data)=>
err ? fail(err) : pass(data)));
const writefilep = path=> data=>
new Promise((pass,fail)=>
writeFile(path, data, err=>
err ? fail(err) : pass(path)));
In order to map functions to our Promises, we needed an fmap utility. Notice how we take care to bubble errors up.
Promise.prototype.fmap = function fmap(f) {
return new Promise((pass,fail) =>
this.then(x=> pass(f(x)), fail));
};
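A quick sanity check of fmap on its own (a sketch):
// fmap maps a plain function over the resolved value
Promise.resolve(21)
  .fmap(x => x * 2)
  .then(x => console.log(x)); // 42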
And here's the rest of the utilities
const fmap = f=> x=> x.fmap(f);
const mapResolve = dir=> map(x=>resolve(dir,x));
const map = f=> xs=> xs.map(x=> f(x));
const filter = f=> xs=> xs.filter(x=> f(x));
const match = re=> s=> re.test(s);
const concatp = xs=> Promise.all(xs);
const reduce = f=> y=> xs=> xs.reduce((y,x)=> f(y)(x), y);
const flatten = reduce(y=> x=> y.concat(Array.isArray(x) ? flatten (x) : x)) ([]);
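A few of these can be sanity-checked in isolation (a sketch; expected values in the comments):
filter(match(/\.json$/))(['a.json', 'run.js']) // ['a.json']
flatten([[1, 2], [3, [4]]])                    // [1, 2, 3, 4]
reduce(y => x => y + x)(0)([1, 2, 3])          // 6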
Lastly, the one custom function that does your work
const createMap = map=> ({id, defaultMessage})=>
Object.assign(map, {[id]: defaultMessage});
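Folded through reduce, it builds the entire result object in one pass (a sketch):
reduce(createMap)({})([
  { id: 'addEmoticon1', defaultMessage: 'Hello, {name}!' },
  { id: 'close1', defaultMessage: 'Close!' }
]);
// => { addEmoticon1: 'Hello, {name}!', close1: 'Close!' }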
And here's c.json
[
{
"id": "somethingelse",
"description": "something",
"defaultMessage": "Something!"
}
]
"Why so many little functions ?"
Well, despite what you may think, you have a pretty big problem. And big problems are solved by combining several small solutions. The most prominent advantage of this code is that each function has a very distinct purpose and will always produce the same results for the same inputs. This means each function can be used in other places in your program. Another advantage is that smaller functions are easier to read, reason about, and debug.
Compare all of this to the other answers given here; #BlazeSahlen's in particular. That's over 60 lines of code that's basically only usable to solve this one particular problem. And it doesn't even filter out non-JSON files. So the next time you need to create a sequence of actions on reading/writing files, you'll have to rewrite most of those 60 lines. That creates lots of duplicated code and hard-to-find bugs because of exhausting boilerplate. And all that manual error handling... And they thought callback hell was bad? They just created yet another circle of hell all on their own.
All the code together...
Functions appear (roughly) in the order they are used
import {readdir, readFile, writeFile} from 'fs';
import {resolve} from 'path';
// logp: Promise<Value> -> Void
const logp = p=> p.then(x=> console.log(x), x=> console.error(x));
// fmap : Promise<a> -> (a->b) -> Promise<b>
Promise.prototype.fmap = function fmap(f) {
return new Promise((pass,fail) =>
this.then(x=> pass(f(x)), fail));
};
// fmap : (a->b) -> F<a> -> F<b>
const fmap = f=> x=> x.fmap(f);
// readdirp : String -> Promise<Array<String>>
const readdirp = dir=>
new Promise((pass,fail)=>
readdir(dir, (err, filenames) =>
err ? fail(err) : pass(mapResolve (dir) (filenames))));
// mapResolve : String -> Array<String> -> Array<String>
const mapResolve = dir=> map(x=>resolve(dir,x));
// map : (a->b) -> Array<a> -> Array<b>
const map = f=> xs=> xs.map(x=> f(x));
// filter : (Value -> Boolean) -> Array<Value> -> Array<Value>
const filter = f=> xs=> xs.filter(x=> f(x));
// match : RegExp -> String -> Boolean
const match = re=> s=> re.test(s);
// readfilep : String -> Promise<String>
const readfilep = path=>
new Promise((pass,fail)=>
readFile(path, 'utf8', (err,data)=>
err ? fail(err) : pass(data)));
// concatp : Array<Promise<Value>> -> Promise<Array<Value>>
const concatp = xs=> Promise.all(xs);
// reduce : (b->a->b) -> b -> Array<a> -> b
const reduce = f=> y=> xs=> xs.reduce((y,x)=> f(y)(x), y);
// flatten : Array<Array<Value>> -> Array<Value>
const flatten = reduce(y=> x=> y.concat(Array.isArray(x) ? flatten (x) : x)) ([]);
// writefilep : String -> Value -> Promise<String>
const writefilep = path=> data=>
new Promise((pass,fail)=>
writeFile(path, data, err=>
err ? fail(err) : pass(path)));
// -----------------------------------------------------------------------------
// createMap : Object -> Object -> Object
const createMap = map=> ({id, defaultMessage})=>
Object.assign(map, {[id]: defaultMessage});
// do it !
readdirp('.')
.fmap(filter(match(/\.json$/)))
.fmap(map(readfilep))
.fmap(map(fmap(JSON.parse)))
.fmap(concatp)
.fmap(flatten)
.fmap(reduce(createMap)({}))
.fmap(data=> JSON.stringify(data, null, '\t'))
.fmap(writefilep(resolve(__dirname, 'result.json')))
.then(filename=> console.log('wrote results to %s', filename), err=>console.error(err));
Still having trouble following along?
It's not easy to see how these things work at first. This is a particularly squirrely problem because the data gets nested very quickly. Thankfully that doesn't mean our code has to be a big nested mess just to solve the problem! Notice the code stays nice and flat even when we're dealing with things like a Promise of an Array of Promises of JSON...
// Here we are reading directory '.'
// We will get a Promise<Array<String>>
// Let's say the files are 'a.json', 'b.json', 'c.json', and 'run.js'
// Promise will look like this:
// Promise<['a.json', 'b.json', 'c.json', 'run.js']>
readdirp('.')
// Now we're going to strip out any non-JSON files
// Promise<['a.json', 'b.json', 'c.json']>
.fmap(filter(match(/\.json$/)))
// call `readfilep` on each of the files
// We will get Promise<Array<Promise<JSON>>>
// Don't freak out, it's not that bad!
// Promise<[Promise<JSON>, Promise<JSON>, Promise<JSON>]>
.fmap(map(readfilep))
// for each file's Promise, we want to parse the data as JSON
// JSON.parse returns an object, so the structure will be the same
// except JSON will be an object!
// Promise<[Promise<Object>, Promise<Object>, Promise<Object>]>
.fmap(map(fmap(JSON.parse)))
// Now we can start collapsing some of the structure
// `concatp` will convert Array<Promise<Value>> to Array<Value>
// We will get
// Promise<[Object, Object, Object]>
// Remember, we have 3 Objects; one for each parsed JSON file
.fmap(concatp)
// Your particular JSON structures are Arrays, which are also Objects
// so that means `concatp` will actually return Promise<[Array, Array, Array]>
// but we'd like to flatten that
// that way each parsed JSON file gets mushed into a single data set
// after flatten, we will have
// Promise<Array<Object>>
.fmap(flatten)
// Here's where it all comes together
// now that we have a single Promise of an Array containing all of your objects ...
// We can simply reduce the array and create the mapping of key:values that you wish
// `createMap` is custom tailored for the mapping you need
// we initialize the `reduce` with an empty object, {}
// after it runs, we will have Promise<Object>
// where Object is your result
.fmap(reduce(createMap)({}))
// It's all downhill from here
// We currently have Promise<Object>
// but before we write that to a file, we need to convert it to JSON
// JSON.stringify(data, null, '\t') will pretty print the JSON using tab to indent
// After this, we will have Promise<JSON>
.fmap(data=> JSON.stringify(data, null, '\t'))
// Now that we have a JSON, we can easily write this to a file
// We'll use `writefilep` to write the result to `result.json` in the current working directory
// I wrote `writefilep` to pass the filename on success
// so when this finishes, we will have
// Promise<Path>
// You could have it return Promise<Void>, like writeFile sends void to the callback; up to you.
.fmap(writefilep(resolve(__dirname, 'result.json')))
// the grand finale
// alert the user that everything is done (or if an error occurred)
// Remember `.then` is like a fork in the road:
// the code will go to the left function on success, and the right on failure
// Here, we're using a generic function to say we wrote the file out
// If a failure happens, we write that to console.error
.then(filename=> console.log('wrote results to %s', filename), err=>console.error(err));
All done!
Assuming files is a list of arrays, [a, b, ...]:
var res = {};
files.reduce((a, b) => a.concat(b), []).forEach(o => res[o.id] = o.defaultMessage);
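For instance, with the question's A.json and B.json contents inlined (a runnable sketch):
var files = [
  [{ id: 'addEmoticon1', defaultMessage: 'Hello, {name}!' },
   { id: 'addPhoto1', defaultMessage: 'How are you??' }],
  [{ id: 'close1', defaultMessage: 'Close!' }]
];
var res = {};
files.reduce((a, b) => a.concat(b), []).forEach(o => res[o.id] = o.defaultMessage);
console.log(res);
// { addEmoticon1: 'Hello, {name}!', addPhoto1: 'How are you??', close1: 'Close!' }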
But you need not read all the files at once.
Just add this code to the onFileContent callback.
JSON.parse(fileContent).forEach(o => res[o.id] = o.defaultMessage);
Also, you should add a final callback to your readFiles.
And in that callback:
fs.writeFile('result.json', JSON.stringify(res));
So, the final solution for you:
var fs = require('fs');
function task(dir, it, cb) {
fs.readdir(dir, (err, names) => {
if (err) return cb([err]);
var errors = [], c = names.length;
names.forEach(name => {
fs.readFile(dir + name, 'utf-8', (err, data) => {
if (err) {
errors.push(err);
} else {
try {
it(JSON.parse(data)); // We got the file's data!
} catch(e) {
errors.push('Invalid json in ' + name + ': ' + e.message);
}
}
if (!--c) cb(errors); // We are finished
});
});
});
}
var res = {};
task('C:/node/test/', (data) => data.forEach(o => res[o.id] = o.defaultMessage), (errors) => {
// Some files can be wrong
errors.forEach(err => console.error(err));
// But we anyway write received data
fs.writeFile('C:/node/test/result.json', JSON.stringify(res), (err) => {
if (err) console.error(err);
else console.log('Task finished. See result.json');
})
});
This should do it once you have your JSON in variables a and b:
var a = [
{
"id": "addEmoticon1",
"description": "Message to greet the user.",
"defaultMessage": "Hello, {name}!"
},
{
"id": "addPhoto1",
"description": "How are youu.",
"defaultMessage": "How are you??"
}
];
var b = [
{
"id": "close1",
"description": "Close it.",
"defaultMessage": "Close!"
}
];
var c = a.concat(b);
var res = {}; // an object, since the keys are the id strings
for (var i = 0; i < c.length; i++){
res[ c[i].id ] = c[i].defaultMessage;
}
console.log(res);
Here's my solution:
function readFiles(dirname, onFileContent, onError) {
fs.readdir(dirname, function(err, filenames) {
/**
* We'll store the parsed JSON data in this array
* @type {Array}
*/
var fileContent = [];
if (err) {
onError(err);
} else {
filenames.forEach(function(filename) {
// Reading the file (synchronously) and storing the parsed JSON output (parsing from string to JSON object)
var jsonObject = JSON.parse(fs.readFileSync(dirname + filename, 'utf-8'));
// Pushing the parsed JSON output into array
fileContent.push(jsonObject);
});
// Calling the callback
onFileContent(fileContent);
}
});
}
readFiles('./files/',function(fileContent) {
/**
* We'll store the final output object here
* @type {Object}
*/
var output = {};
// Loop over the JSON objects
fileContent.forEach(function(each) {
// Looping within each object
for (var index in each) {
// Copying the `id` as key and the `defaultMessage` as value and storing in output object
output[each[index].id] = each[index].defaultMessage;
}
});
// Writing the file (synchronously) after converting the JSON object back to string
fs.writeFileSync('result.json', JSON.stringify(output));
}, function(err) {
throw err;
});
A notable difference is that I've not used the asynchronous readFile and writeFile functions, as they'd needlessly complicate the example. This example is meant to showcase the use of JSON.parse and JSON.stringify to do what the OP wants.
UPDATE:
var fs = require('fs');
function readFiles(dirname, onEachFilename, onComplete) {
fs.readdir(dirname, function(err, filenames) {
if (err) {
throw err;
} else {
// Prepending the dirname to each filename
filenames.forEach(function(each, index, array) {
array[index] = dirname + each;
});
// Calling async.map which accepts these parameters:
// filenames <-------- array of filenames
// onEachFilename <--- function which will be applied on each filename
// onComplete <------- function to call when the all elements of filenames array have been processed
require('async').map(filenames, onEachFilename, onComplete);
}
});
}
readFiles('./files/', function(item, callback) {
// Read the file asynchronously
fs.readFile(item, function(err, data) {
if (err) {
callback(err);
} else {
callback(null, JSON.parse(data));
}
});
}, function(err, results) {
/**
* We'll store the final output object here
* @type {Object}
*/
var output = {};
if (err) {
throw err;
} else {
// Loop over the JSON objects
results.forEach(function(each) {
// Looping within each object
for (var index in each) {
// Copying the `id` as key and the `defaultMessage` as value and storing in output object
output[each[index].id] = each[index].defaultMessage;
}
});
// Writing the file (synchronously) after converting the JSON object back to string
fs.writeFileSync('result.json', JSON.stringify(output));
}
});
This is a simple asynchronous implementation of the same, using readFile. For more information, see async.map.
I was trying to make a simple LDAP client to just retrieve data from an LDAP server. I am returning an array of JSON objects from the JSP. On click of any value, I fetch some data from the online server. I am able to load the first set of the array into a tree, but the arrays fetched in the next step don't get attached to the JSTree. My code:
function getGroupsStructure(id) {
console.log("in getGroupsStructure-->");
var paramId = "";
if(id == '') {
console.log("in if-->");
paramId = "c=de";
} else {
console.log("in else-->");
paramId = id;
}
var params = {
"DN" : paramId,
};
console.log("params-->",params);
var getGroupsStructureForUserService = service(webURL + "sendingValues/getGroupsStructureForUser",params,"POST");
getGroupsStructureForUserService.success(function(data) {
console.log("in success-->dta-->",data);
if(data.errorCode == '0') {
console.log("in error code 0-->dta-->",data.treeData);
$('.treeNode').jstree({
'core': {
'data': function (obj, cb) {
cb.call(this,
data.treeData);
}
}
});
console.log("Tree Created...");
} else {
console.log("error code not 0--data-->",data);
}
$(document).off('click').on('click', '.treeNode a', function() {
console.log("on click of a-->");
var id = $(this).parent().attr('id');
console.log("id-->",id);
getGroupsStructure(id);
console.log("after getGroupsStructure");
});
});
getGroupsStructureForUserService.error(function(data) {
console.log(" empty error");
// console.log(err);
});
}
The JSP code is:
def NextLevelLDAP(String DN) {
// println "Next Level===>"
assert ldap!=null
def responseArray=[]
def results=ldap.search('objectClass=*',DN,SearchScope.ONE) //Will be triggered when + is pressed in GUI to get next level of tree
// assert results==null
if(DN.startsWith("c="))
{
JSONObject responseJson1=new JSONObject()
responseJson1.put("id", initialDN )
responseJson1.put("parent", "#")
responseJson1.put("text","Parent")
responseArray.add(responseJson1)
for(entry in results) {
// println entry
// println "In NextLevel Using InitialDN"
JSONObject responseJson=new JSONObject()
responseJson.put("id", entry.dn)
responseJson.put("parent", DN)
String tempResDN=entry.dn.toString()
def tempLength=tempResDN.length() - DN.length()
// println tempResDN
String tempName=tempResDN.substring(2,tempLength-1)
// println tempName
responseJson.put("text",tempName)
responseArray.add(responseJson)
// println entry
println responseJson.toString()
}
return responseArray
}
if(results.size!=0)
{
for(entry in results) {
println entry
JSONObject responseJson=new JSONObject()
responseJson.put("id", entry.dn)
responseJson.put("parent", DN)
String tempResDN=entry.dn.toString()
def tempLength=tempResDN.length() - DN.length()
// println tempResDN
String tempName=tempResDN.substring(2,tempLength-1)
println tempName
responseJson.put("text",tempName)
responseArray.add(responseJson)
// println entry
}
return responseArray
}
}
Please ignore the way of getting the parent ID; it's something complicated.
Please help me out: how do I get the tree nodes created dynamically? I am only getting the first level of the tree. The data on click for the other levels shows up in the console but does not get attached to the tree.
Thank you.
You have it the other way around - you need to create the tree and have it make the request for you, so instead of this:
'data': function (obj, cb) {
cb.call(this, data.treeData);
}
Use something like this:
'data': function (obj, cb) {
// you probably need to pass the obj.id as a parameter to the service
// keep in mind if obj.id is "#" you need to return the root nodes
service(...).success(function (data) {
cb.call(this, data.treeData);
});
}
This way you do not need to detach and reattach click handlers every time and it will work out of the box for opening nodes. If you want to open a node on click, you can use this:
$('#tree').on('select_node.jstree', function (e, data) {
data.instance.open_node(data.node);
});
So your whole code should look something like this:
function load(id) {
var params = {
"DN" : id && id !== '#' ? id : "c=de"
};
return service(webURL + "sendingValues/getGroupsStructureForUser", params, "POST");
}
$('#tree')
.jstree({
'core' : {
'data': function (obj, cb) {
load(obj.id).success(function (data) {
cb.call(this, data.treeData);
});
}
}
})
.on('select_node.jstree', function (e, data) {
data.instance.open_node(data.node);
});
Just make sure you mark the nodes you return as having children (set their children property to boolean true).
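For instance, a node returned by the JSP for a level that can still be expanded might look like this (a sketch; the DN values are hypothetical):
{
  "id": "ou=users,c=de",
  "parent": "c=de",
  "text": "users",
  "children": true
}
With children set to true, jsTree renders the expand arrow and calls your data function again with that node's id when it is opened.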