d3 - reading JSON data instead of CSV file

I'm trying to read data into my calendar visualisation using JSON. At
the moment it works great using a CSV file:
d3.csv("RSAtest.csv", function(csv) {
var data = d3.nest()
.key(function(d) { return d.date; })
.rollup(function(d) { return d[0].total; })
.map(csv);
rect.filter(function(d) { return d in data; })
.attr("class", function(d) { return "day q" + color(data[d]) +
"-9"; })
.select("title")
.text(function(d) { return d + ": " + data[d]; });
});
It reads the following CSV data:
date,total
2000-01-01,11
2000-01-02,13
...etc
Any pointers on how I can read the following JSON data instead:
{"2000-01-01":19,"2000-01-02":11......etc}
I tried the following, but it's not working for me (datareadCal.php spits out the JSON for me):
d3.json("datareadCal.php", function(json) {
var data = d3.nest()
.key(function(d) { return d.Key; })
.rollup(function(d) { return d[0].Value; })
.map(json);
Thanks.

You can use d3.entries() to turn an object literal into an array of key/value pairs:
var countsByDate = {'2000-01-01': 10, ...};
var dateCounts = d3.entries(countsByDate);
console.log(JSON.stringify(dateCounts[0])); // {"key": "2000-01-01", "value": 10}
One thing you'll notice, though, is that the resulting array isn't properly sorted. You can sort them by key ascending like so:
dateCounts = dateCounts.sort(function(a, b) {
return d3.ascending(a.key, b.key);
});
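A minimal sketch of how this might slot into the original calendar code, assuming datareadCal.php returns the {"date": total} object shown in the question. Note that this object already is the date-to-total lookup, so d3.entries is only needed when you want an array, and depending on your d3 version the callback may receive (error, json):
d3.json("datareadCal.php", function(json) {
    var data = json;                   // e.g. {"2000-01-01": 19, "2000-01-02": 11, ...}
    var dateCounts = d3.entries(data); // [{key: "2000-01-01", value: 19}, ...] if an array is needed

    rect.filter(function(d) { return d in data; })
        .attr("class", function(d) { return "day q" + color(data[d]) + "-9"; })
        .select("title")
        .text(function(d) { return d + ": " + data[d]; });
});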

Turn your .json file into a .js file that is included in your html file. Inside your .js file have:
var countsByDate = {'2000-01-01':10,...};
Then you can reference countsByDate directly; there's no need to read from a file per se.
And you can read it with:
var data = d3.nest()
.key(function(d) { return d.Key; })
.entries(json);
As an aside, d3.js says it's better to set your JSON up as:
var countsByDate = [
{Date: '2000-01-01', Total: '10'},
{Date: '2000-01-02', Total: '11'},
];
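If you go with that array-of-objects layout, the nesting from the CSV version carries over almost unchanged; a sketch, assuming the Date/Total field names shown above:
var data = d3.nest()
    .key(function(d) { return d.Date; })
    .rollup(function(d) { return d[0].Total; })
    .map(countsByDate);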

Related

Angular reading data from json into textarea

I'm trying to read some test data from a local json file and output the data with correct formatting into a textarea. Right now though it just outputs [object Object]. How would I go about getting it so it outputs:
Id: theIdGoesHere
Title: theTitleGoesHere
step.service.ts (the service used to fetch the JSON data):
public getJson(): Observable<any>{
return this.http.get('/assets/jsonData/MyJson.json')
.map(response => response.json());
}
MyJson.json
{
"data":[
{
"id": 1,
"title":"Test1"
},
{
"id": 2,
"title":"Test2"
}
]
}
main.component.ts
private testVar: any;
test(){
this.stepService.getJson().subscribe(data => (this.testVar = data));
}
anothermethod(){
this.test();
this.mainStepText = this.testVar; // mainStepText is bound to the textarea with [(ngModel)]="mainStepText"
}
get mainStepText2() { // Rebound this one
const text = [];
const { data } = this.testVar;
for (let item of this.testVar.data) {
Object.keys(item).forEach(key => {
text.push(key + ': ' + item[key]);
});
}
return text.join('\r\n'); // \r\n is the line break
}
You can use the json pipe to format your object into a JSON string:
[(ngModel)]="mainStepText | json"
If you want to show a specific property of your object, you can access it in your template:
[(ngModel)]="mainStepText.data[0].title"
This will display "Test1" in your field.
You could loop through your json.data and through each item's keys to extract the keys and values and generate the string you need for the textarea.
const text = [];
for (let item of this.testVar.data) {
Object.keys(item).forEach(key => {
text.push(key + ': ' + item[key]);
});
}
return text.join('\r\n'); // \r\n is the line break
Here's the running code (I put it in app.ts): http://plnkr.co/edit/3AbQYQOW0MVBqO91X9qi?p=preview
Hope this helps.

D3 getting data from multiple column CSV

I'm following this example for a multi-line graph.
I want to plot each forecast in my CSV using values for years 2010, 2011, 2012.
forecast.csv
forecast,2010,2011,2012
Outlook,87,88,88
Reform,50,20,88
Renewal,43,21,88
If my data was simple like the example link, the code to build the chart would look like this:
var priceline = d3.svg.line()
.x(function(d) { return x(d.year); })
.y(function(d) { return y(d.value); });
var dataNest = d3.nest()
.key(function(d) {return d.forecast;})
.entries(data);
dataNest.forEach(function(d) {
svg.append("path")
.attr("class", "line")
.attr("d", priceline(d.values));
console.log(dataNest)
});
However my data is coming from a multi column CSV.
I'm trying to nest the forecasts, so each forecast would have an array of year/value pairs, i.e.:
[0] Object
[key] Outlook
[values]
[0] year: 2010
value: 28
[1] year: 2011
value: 88
but dataNest currently looks like this
[0] Object
[key] Outlook
[values]
[0] 2010: 87
2011: 88
2012: 88
There are many other years in the real data so transposing is not an option. How can I draw a line from this multi column CSV data?
You can change your data format or modify your d3.nest function. However, I reckon the easiest solution is using plain JavaScript to modify your array:
dataNest.forEach(function(d) {
d.values.forEach(function(e) {
var myArr = [];
for (var key in e) {
if (e[key] != d.key) {
myArr.push({
"year": key,
"value": e[key]
});
}
}
d.values = myArr;
})
})
Here is a demo, using the data in your question:
var data = d3.csv.parse(d3.select("#csv").text());
var dataNest = d3.nest()
.key(function(d) {
return d.forecast;
})
.entries(data);
dataNest.forEach(function(d) {
d.values.forEach(function(e) {
var myArr = [];
for (var key in e) {
if (e[key] != d.key) {
myArr.push({
"year": key,
"value": e[key]
});
}
}
d.values = myArr;
})
})
console.log(dataNest)
pre {
display: none;
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/d3/3.4.11/d3.min.js"></script>
<pre id="csv">forecast,2010,2011,2012
Outlook,87,88,88
Reform,50,20,88
Renewal,43,21,88</pre>
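With the values reshaped like this, the line-drawing code from the question should work once the year and value strings are coerced to numbers; a rough sketch, assuming x and y scales have already been set up over the years and values:
var priceline = d3.svg.line()
    .x(function(d) { return x(+d.year); })
    .y(function(d) { return y(+d.value); });

dataNest.forEach(function(d) {
    svg.append("path")
        .attr("class", "line")
        .attr("d", priceline(d.values));
});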

Node JS: Make a flat json from a tree json

I was writing a node.js script to combine all the JSON files in a directory and store the result as a new JSON file. I managed to do the job to a great extent, but it has a few flaws.
A.json
[
{
"id": "addEmoticon1",
"description": "Message to greet the user.",
"defaultMessage": "Hello, {name}!"
},
{
"id": "addPhoto1",
"description": "How are youu.",
"defaultMessage": "How are you??"
}
]
B.json
[
{
"id": "close1",
"description": "Close it.",
"defaultMessage": "Close!"
}
]
What I finally need is:
result.json
{
"addEmoticon1": "Hello, {name}!",
"addPhoto1": "How are you??",
"close1": "Close!"
}
I wrote a node.js script:
var fs = require('fs');
function readFiles(dirname, onFileContent, onError) {
fs.readdir(dirname, function(err, filenames) {
if (err) {
onError(err);
return;
}
filenames.forEach(function(filename) {
fs.readFile(dirname + filename, 'utf-8', function(err, content) {
if (err) {
onError(err);
return;
}
onFileContent(filename, content);
});
});
});
}
var data = {};
readFiles('C:/node/test/', function(filename, content) {
data[filename] = content;
var lines = content.split('\n');
lines.forEach(function(line) {
var parts = line.split('"');
if (parts[1] == 'id') {
fs.appendFile('result.json', parts[3]+': ', function (err) {});
}
if (parts[1] == 'defaultMessage') {
fs.appendFile('result.json', parts[3]+',\n', function (err) {});
}
});
}, function(err) {
throw err;
});
It extracts the 'id' and 'defaultMessage' but is not able to append correctly.
What I get:
result.json
addEmoticon1: addPhoto1: Hello, {name}!,
close1: How are you??,
Close!,
This output is different every time I run my script.
Aim 1: Surround items in double quotes.
Aim 2: Add curly braces at the top and at the end.
Aim 3: No comma at the end of the last element.
Aim 4: Same output every time I run my script.
I'll start with the finished solution...
There's a big explanation at the end of this answer. Let's try to think big-picture for a little bit first, though.
readdirp('.')
.fmap(filter(match(/\.json$/)))
.fmap(map(readfilep))
.fmap(map(fmap(JSON.parse)))
.fmap(concatp)
.fmap(flatten)
.fmap(reduce(createMap)({}))
.fmap(data=> JSON.stringify(data, null, '\t'))
.fmap(writefilep(resolve(__dirname, 'result.json')))
.then(filename=> console.log('wrote results to %s', filename), err=>console.error(err));
Console output
wrote results to /path/to/result.json
result.json (I added a c.json with some data to show that this works with more than 2 files)
{
"addEmoticon1": "Hello, {name}!",
"addPhoto1": "How are you??",
"close1": "Close!",
"somethingelse": "Something!"
}
Implementation
I made Promise-based interfaces for readdir, readFile, and writeFile.
import {readdir, readFile, writeFile} from 'fs';
const readdirp = dir=>
new Promise((pass,fail)=>
readdir(dir, (err, filenames) =>
err ? fail(err) : pass(mapResolve (dir) (filenames))));
const readfilep = path=>
new Promise((pass,fail)=>
readFile(path, 'utf8', (err,data)=>
err ? fail(err) : pass(data)));
const writefilep = path=> data=>
new Promise((pass,fail)=>
writeFile(path, data, err=>
err ? fail(err) : pass(path)));
In order to map functions to our Promises, we needed an fmap utility. Notice how we take care to bubble errors up.
Promise.prototype.fmap = function fmap(f) {
return new Promise((pass,fail) =>
this.then(x=> pass(f(x)), fail));
};
And here's the rest of the utilities
const fmap = f=> x=> x.fmap(f);
const mapResolve = dir=> map(x=>resolve(dir,x));
const map = f=> xs=> xs.map(x=> f(x));
const filter = f=> xs=> xs.filter(x=> f(x));
const match = re=> s=> re.test(s);
const concatp = xs=> Promise.all(xs);
const reduce = f=> y=> xs=> xs.reduce((y,x)=> f(y)(x), y);
const flatten = reduce(y=> x=> y.concat(Array.isArray(x) ? flatten (x) : x)) ([]);
Lastly, the one custom function that does your work
const createMap = map=> ({id, defaultMessage})=>
Object.assign(map, {[id]: defaultMessage});
And here's c.json
[
{
"id": "somethingelse",
"description": "something",
"defaultMessage": "Something!"
}
]
"Why so many little functions ?"
Well despite what you may think, you have a pretty big problem. And big problems are solved by combining several small solutions. The most prominent advantage of this code is that each function has a very distinct purpose and it will always produce the same results for the same inputs. This means each function can be used other places in your program. Another advantage is that smaller functions are easier to read, reason with, and debug.
Compare all of this to the other answers given here, #BlazeSahlen's in particular. That's over 60 lines of code that are basically only usable to solve this one particular problem, and it doesn't even filter out non-JSON files. So the next time you need to create a sequence of actions on reading/writing files, you'll have to rewrite most of those 60 lines each time. That creates lots of duplicated code and hard-to-find bugs because of the exhausting boilerplate and manual error handling; it doesn't escape callback hell so much as recreate it in another form.
All the code together...
Functions appear (roughly) in the order they are used
import {readdir, readFile, writeFile} from 'fs';
import {resolve} from 'path';
// logp: Promise<Value> -> Void
const logp = p=> p.then(x=> console.log(x), x=> console.error(x));
// fmap : Promise<a> -> (a->b) -> Promise<b>
Promise.prototype.fmap = function fmap(f) {
return new Promise((pass,fail) =>
this.then(x=> pass(f(x)), fail));
};
// fmap : (a->b) -> F<a> -> F<b>
const fmap = f=> x=> x.fmap(f);
// readdirp : String -> Promise<Array<String>>
const readdirp = dir=>
new Promise((pass,fail)=>
readdir(dir, (err, filenames) =>
err ? fail(err) : pass(mapResolve (dir) (filenames))));
// mapResolve : String -> Array<String> -> Array<String>
const mapResolve = dir=> map(x=>resolve(dir,x));
// map : (a->b) -> Array<a> -> Array<b>
const map = f=> xs=> xs.map(x=> f(x));
// filter : (Value -> Boolean) -> Array<Value> -> Array<Value>
const filter = f=> xs=> xs.filter(x=> f(x));
// match : RegExp -> String -> Boolean
const match = re=> s=> re.test(s);
// readfilep : String -> Promise<String>
const readfilep = path=>
new Promise((pass,fail)=>
readFile(path, 'utf8', (err,data)=>
err ? fail(err) : pass(data)));
// concatp : Array<Promise<Value>> -> Promise<Array<Value>>
const concatp = xs=> Promise.all(xs);
// reduce : (b->a->b) -> b -> Array<a> -> b
const reduce = f=> y=> xs=> xs.reduce((y,x)=> f(y)(x), y);
// flatten : Array<Array<Value>> -> Array<Value>
const flatten = reduce(y=> x=> y.concat(Array.isArray(x) ? flatten (x) : x)) ([]);
// writefilep : String -> Value -> Promise<String>
const writefilep = path=> data=>
new Promise((pass,fail)=>
writeFile(path, data, err=>
err ? fail(err) : pass(path)));
// -----------------------------------------------------------------------------
// createMap : Object -> Object -> Object
const createMap = map=> ({id, defaultMessage})=>
Object.assign(map, {[id]: defaultMessage});
// do it !
readdirp('.')
.fmap(filter(match(/\.json$/)))
.fmap(map(readfilep))
.fmap(map(fmap(JSON.parse)))
.fmap(concatp)
.fmap(flatten)
.fmap(reduce(createMap)({}))
.fmap(data=> JSON.stringify(data, null, '\t'))
.fmap(writefilep(resolve(__dirname, 'result.json')))
.then(filename=> console.log('wrote results to %s', filename), err=>console.error(err));
Still having trouble following along?
It's not easy to see how these things work at first. This is a particularly squirrely problem because the data gets nested very quickly. Thankfully that doesn't mean our code has to be a big nested mess just to solve the problem ! Notice the code stays nice and flat even when we're dealing with things like a Promise of an Array of Promises of JSON...
// Here we are reading directory '.'
// We will get a Promise<Array<String>>
// Let's say the files are 'a.json', 'b.json', 'c.json', and 'run.js'
// Promise will look like this:
// Promise<['a.json', 'b.json', 'c.json', 'run.js']>
readdirp('.')
// Now we're going to strip out any non-JSON files
// Promise<['a.json', 'b.json', 'c.json']>
.fmap(filter(match(/\.json$/)))
// call `readfilep` on each of the files
// We will get Promise<Array<Promise<JSON>>>
// Don't freak out, it's not that bad!
// Promise<[Promise<JSON>, Promise<JSON>, Promise<JSON>]>
.fmap(map(readfilep))
// for each file's Promise, we want to parse the data as JSON
// JSON.parse returns an object, so the structure will be the same
// except JSON will be an object!
// Promise<[Promise<Object>, Promise<Object>, Promise<Object>]>
.fmap(map(fmap(JSON.parse)))
// Now we can start collapsing some of the structure
// `concatp` will convert Array<Promise<Value>> to Array<Value>
// We will get
// Promise<[Object, Object, Object]>
// Remember, we have 3 Objects; one for each parsed JSON file
.fmap(concatp)
// Your particular JSON structures are Arrays, which are also Objects
// so that means `concatp` will actually return Promise<[Array, Array, Array]>
// but we'd like to flatten that
// that way each parsed JSON file gets mushed into a single data set
// after flatten, we will have
// Promise<Array<Object>>
.fmap(flatten)
// Here's where it all comes together
// now that we have a single Promise of an Array containing all of your objects ...
// We can simply reduce the array and create the mapping of key:values that you wish
// `createMap` is custom tailored for the mapping you need
// we initialize the `reduce` with an empty object, {}
// after it runs, we will have Promise<Object>
// where Object is your result
.fmap(reduce(createMap)({}))
// It's all downhill from here
// We currently have Promise<Object>
// but before we write that to a file, we need to convert it to JSON
// JSON.stringify(data, null, '\t') will pretty print the JSON using tab to indent
// After this, we will have Promise<JSON>
.fmap(data=> JSON.stringify(data, null, '\t'))
// Now that we have a JSON, we can easily write this to a file
// We'll use `writefilep` to write the result to `result.json` in the current working directory
// I wrote `writefilep` to pass the filename on success
// so when this finishes, we will have
// Promise<Path>
// You could have it return Promise<Void> like writeFile sends void to the callback. up to you.
.fmap(writefilep(resolve(__dirname, 'result.json')))
// the grand finale
// alert the user that everything is done (or if an error occurred)
// Remember `.then` is like a fork in the road:
// the code will go to the left function on success, and the right on failure
// Here, we're using a generic function to say we wrote the file out
// If a failure happens, we write that to console.error
.then(filename=> console.log('wrote results to %s', filename), err=>console.error(err));
All done!
Assuming files is a list of arrays, [a, b, ...]:
var res = {};
files.reduce((a, b) => a.concat(b), []).forEach(o => res[o.id] = o.defaultMessage);
But you don't need to collect all the files at once. Just add this code to the onFileContent callback:
JSON.parse(fileContent).forEach(o => res[o.id] = o.defaultMessage);
Also, you should add a final callback to your readFiles. And in this callback:
fs.writeFile('result.json', JSON.stringify(res));
So, final solution for you:
var fs = require('fs');
function task(dir, it, cb) {
fs.readdir(dir, (err, names) => {
if (err) return cb([err]);
var errors = [], c = names.length;
names.forEach(name => {
fs.readFile(dir + name, 'utf-8', (err, data) => {
if (err) return errors.push(err);
try {
it(JSON.parse(data)); // We got the file's data!
} catch(e) {
errors.push('Invalid json in ' + name + ': '+e.message);
}
if (!--c) cb(errors); // We are finished
});
});
});
}
var res = {};
task('C:/node/test/', (data) => data.forEach(o => res[o.id] = o.defaultMessage), (errors) => {
// Some files can be wrong
errors.forEach(err => console.error(err));
// But we write the received data anyway
fs.writeFile('C:/node/test/result.json', JSON.stringify(res), (err) => {
if (err) console.error(err);
else console.log('Task finished. see result.json');
})
});
This should do it once you have your JSON in variables a and b:
var a = [
{
"id": "addEmoticon1",
"description": "Message to greet the user.",
"defaultMessage": "Hello, {name}!"
},
{
"id": "addPhoto1",
"description": "How are youu.",
"defaultMessage": "How are you??"
}
];
var b = [
{
"id": "close1",
"description": "Close it.",
"defaultMessage": "Close!"
}
];
var c = a.concat(b);
var res = {};
for (var i = 0; i < c.length; i++) {
    res[c[i].id] = c[i].defaultMessage;
}
console.log(res);
Here's my solution:
function readFiles(dirname, onFileContent, onError) {
fs.readdir(dirname, function(err, filenames) {
/**
* We'll store the parsed JSON data in this array
* @type {Array}
*/
var fileContent = [];
if (err) {
onError(err);
} else {
filenames.forEach(function(filename) {
// Reading the file (synchronously) and storing the parsed JSON output (parsing from string to JSON object)
var jsonObject = JSON.parse(fs.readFileSync(dirname + filename, 'utf-8'));
// Pushing the parsed JSON output into array
fileContent.push(jsonObject);
});
// Calling the callback
onFileContent(fileContent);
}
});
}
readFiles('./files/',function(fileContent) {
/**
* We'll store the final output object here
* @type {Object}
*/
var output = {};
// Loop over the JSON objects
fileContent.forEach(function(each) {
// Looping within each object
for (var index in each) {
// Copying the `id` as key and the `defaultMessage` as value and storing in output object
output[each[index].id] = each[index].defaultMessage;
}
});
// Writing the file (synchronously) after converting the JSON object back to string
fs.writeFileSync('result.json', JSON.stringify(output));
}, function(err) {
throw err;
});
A notable difference is that I've not used the asynchronous readFile and writeFile functions, as they'd needlessly complicate the example. This example is meant to showcase the use of JSON.parse and JSON.stringify to do what the OP wants.
UPDATE:
var fs = require('fs');
function readFiles(dirname, onEachFilename, onComplete) {
fs.readdir(dirname, function(err, filenames) {
if (err) {
throw err;
} else {
// Prepending the dirname to each filename
filenames.forEach(function(each, index, array) {
array[index] = dirname + each;
});
// Calling async.map, which accepts these parameters:
// filenames <-------- array of filenames
// onEachFilename <--- function which will be applied on each filename
// onComplete <------- function to call when the all elements of filenames array have been processed
require('async').map(filenames, onEachFilename, onComplete);
}
});
}
readFiles('./files/', function(item, callback) {
// Read the file asynchronously
fs.readFile(item, function(err, data) {
if (err) {
callback(err);
} else {
callback(null, JSON.parse(data));
}
});
}, function(err, results) {
/**
* We'll store the final output object here
* @type {Object}
*/
var output = {};
if (err) {
throw err;
} else {
// Loop over the JSON objects
results.forEach(function(each) {
// Looping within each object
for (var index in each) {
// Copying the `id` as key and the `defaultMessage` as value and storing in output object
output[each[index].id] = each[index].defaultMessage;
}
});
// Writing the file (synchronously) after converting the JSON object back to string
fs.writeFileSync('result.json', JSON.stringify(output));
}
});
This is a simple asynchronous implementation of the same, using readFile. For more information, see async.map.

Parsing doubly nested JSON object to MongoDB

Schema for my MongoDB model:
var resultsSchema = new mongoose.Schema({
start_date: String,
end_date: String,
matches:[{
id:Number,
match_date:String,
status:String,
timer:Number,
time:String,
hometeam_id:Number,
hometeam_name:String,
hometeam_score:Number,
awayteam_id:Number,
awayteam_name:String,
awayteam_score:Number,
ht_score:String,
ft_score:String,
et_score:String,
match_events:[{
id:Number,
type:String,
minute:Number,
team:String,
player_name:String,
player_id:Number,
result:String
}]
}]
});
Example of JSON data coming from the server:
"matches":
[
{
"match_id":"1234"
"match_date":"Aug 30"
...
...
"match_events":
[
{
"event_id":"234",
"event_minute":"38",
...,
...
},
{
"event_id":"2334",
"event_minute":"40",
...,
...
}
],
{
"match_id":"454222"
"match_date":"Aug 3"
...
...
"match_events":
[
{
"event_id":"234",
"event_minute":"38",
...,
...
},
....
My current implementation works for parsing just the matches (i.e. the first array), but I can't seem to access the inner array properly.
async.waterfall([
function(callback) {
request.get('http://football-api.com/api/?Action=fixtures&APIKey=' + apiKey + '&comp_id=' + compId +
'&&from_date=' + lastWeek_string + '&&to_date=' + today_string, function(error, response, body) {
if (error) return next(error);
var parsedJSON = JSON.parse(body);
var matches = parsedJSON.matches;
var events = parsedJSON.matches.match_events;
var results = new Results({
start_date: lastWeek_string,
end_date: today_string,
matches:[]
});
_.each(matches, function(match) {
results.matches.push({
id: match.match_id,
match_date: match.match_formatted_date,
status:match.match_status,
timer:match.match_timer,
hometeam_id:match.match_localteam_id,
hometeam_name:match.match_localteam_name,
hometeam_score:match.match_localteam_score,
awayteam_id:match.match_visitorteam_id,
awayteam_name:match.match_visitorteam_name,
awayteam_score:match.match_visitorteam_score,
ht_score:match.match_ht_score,
ft_score:match.match_ft_score,
et_score:match.match_et_score,
match_events:[]
});
});
_.each(events, function(event) {
results.matches.match_events.push({
id:event.event_id,
type:event.event_type,
minute:event.event_minute,
team:event.event_team,
player_name:event.event_player,
player_id:event.event_player_id,
result:event.event_result
});
});
I understand that the second _.each loop should be iterating for every match, since every match has its own events subarray. I'm just not sure how to structure this and have been struggling with it for a while.
I tried nesting that loop inside the _.each(matches, function(match) { ... }) loop, but that didn't work.
Thank you.
Edit: How could I get this to work?
var results = new Results({
start_date: lastWeek_string,
end_date: today_string,
matches:[
match_events: []
]
});
Because then as #zangw says I could construct the match_events array first, append it to matches, and so on.
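For what it's worth, here is a minimal sketch of that nested approach, using the field names from the feed above (underscore's _.each/_.map assumed, and only a couple of the match fields repeated for brevity):
_.each(matches, function(match) {
    // Build this match's events array first...
    var events = _.map(match.match_events, function(event) {
        return {
            id: event.event_id,
            type: event.event_type,
            minute: event.event_minute,
            team: event.event_team,
            player_name: event.event_player,
            player_id: event.event_player_id,
            result: event.event_result
        };
    });

    // ...then push the match together with its events.
    results.matches.push({
        id: match.match_id,
        match_date: match.match_formatted_date,
        // ...remaining match fields as in the question...
        match_events: events
    });
});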

Read csv to object of object for d3 [datamaps]

I'm using datamaps and would like to be able to read the data from a csv file.
The data format that datamaps is expecting is the following:
var loadeddata = {
"JPN":{Rate:17.5,fillKey:"firstCat"},
"DNK":{Rate:16.6,fillKey:"secondCat"}
};
I would like to read a csv file of the following structure and transform it into the format that datamaps is expecting:
ISO, Rate, fillKey
JPN, 17.5, firstCat
DNK, 16.6, secondCat
My 'best attempt' was using the following code:
var csvloadeddata;
d3.csv("simpledata.csv", function (error, csv) {
if (error) return console.log("there was an error loading the csv: " + error);
console.log("there are " + csv.length + " elements in my csv set");
var nestFunction = d3.nest().key(function(d){return d.ISO;});
csvloadeddata = nestFunction.entries(
csv.map(function(d){
d.Rate = +d.Rate;
d.fillKey = d.fillKey;
return d;
})
);
console.log("there are " + csvloadeddata.length + " elements in my data");
});
But this code generates a variable 'csvloadeddata' that looks like this:
var csvloadeddata = [
{"key": "JPN", "values": { 0: {Rate:17.5, fillKey:"firstCat"}} },
{"key": "DNK", values : { 1: {Rate:16.6,fillKey:"secondCat"}} }
];
What am I doing wrong?
Found the answer myself. If somebody is interested, this is what I ended up using:
<script>
d3.csv("simpledata.csv", function(error, csvdata1) {
    globalcsvdata1 = csvdata1;
    for (var i = 0; i < csvdata1.length; i++) {
        globalcsvdata1[globalcsvdata1[i].ISO] = globalcsvdata1[i];
        //console.log(globalcsvdata1[i]);
        delete globalcsvdata1[i].ISO;
        delete globalcsvdata1[i];
    }
    myMap.updateChoropleth(globalcsvdata1);
});

var myMap = new Datamap({
    element: document.getElementById('map'),
    scope: 'world',
    geographyConfig: {
        popupOnHover: true,
        highlightOnHover: false
    },
    fills: {
        'AA': '#1f77b4',
        'BB': '#9467bd',
        defaultFill: 'grey'
    }
});
</script>
The csv has the following structure:
ISO,fillKey
RUS,AA
USA,BB
Here is a working example: http://www.explainingprogress.com/wp-content/uploads/datamaps/uploaded_gdpPerCapita2011_PWTrgdpe/gdpPerCapita2011_PWTrgdpe.html
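As an aside, the same reshaping can be done without the delete calls by building a fresh lookup object; a sketch, assuming the three-column CSV (ISO, Rate, fillKey) from the question:
d3.csv("simpledata.csv", function(error, rows) {
    if (error) return console.log("there was an error loading the csv: " + error);

    // Reshape [{ISO, Rate, fillKey}, ...] into {ISO: {Rate, fillKey}, ...}
    var loadeddata = {};
    rows.forEach(function(d) {
        loadeddata[d.ISO] = { Rate: +d.Rate, fillKey: d.fillKey };
    });

    myMap.updateChoropleth(loadeddata);
});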