When I receive data through a socket in Node.js, I use the data, and before writing new data to that JSON object I need to delete some parts of it, but there are too many different things that would need to be deleted. I want to clear the JSON object completely, erasing everything before I start adding anything to it.
So I have tried something like:
JSON_object = {}; // does nothing
JSON_object = null; // error
JSON_object.empty(); // error...
and so on..
Nothing works.
Does anyone know how to make it empty?
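For what it's worth, `JSON_object = {};` does give you an empty object — it just doesn't empty the *old* object, which matters if something else still holds a reference to it. A minimal sketch of both options in plain Node.js:

```javascript
// Two ways to end up with an empty object before adding new data.

// 1) Reassignment: points the variable at a brand-new empty object.
//    Any other references still see the old, unmodified object.
let jsonObject = { a: 1, b: 2 };
jsonObject = {};

// 2) In-place clearing: deletes every key from the SAME object,
//    so every reference to it sees it emptied.
const shared = { a: 1, b: 2 };
const alias = shared;
for (const key of Object.keys(shared)) {
  delete shared[key];
}

console.log(Object.keys(jsonObject).length); // 0
console.log(Object.keys(alias).length);      // 0 — alias sees the change
```

Option 1 is the usual choice when no other code references the object; option 2 is the one that actually "empties" the existing object in place.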
I was wondering if there is a way to cast a field that comes in through JsonLoader(). This is basically what I want, but it doesn't work:
Person = LOAD 'people' USING JsonLoader() AS (name:chararray)
I haven't tried it with JsonLoader, but I have faced a similar situation with HCatLoader. There I did the casting in the second statement:
Person_tmp = LOAD 'people' USING JsonLoader();
Person = FOREACH Person_tmp GENERATE (chararray) name;
Just try it out; it might work.
How do you query MongoDB using Mongoose with Node.js? I can insert my JSON object into my DB, but every way I have tried to return the JSON object from the DB returns either null or just information about the database.
Is there some good method using mongoose to be able to query the database similar to the method:
var cursor = db.collection.find()
var JSONobject = cursor.next()
Here's what is in my code right now:
mongoose.connect('mongodb://localhost/myDB');
mongoose.connection.on('error', console.error.bind(console, 'connection error:'));
var cursor = mongoose.connection.db.contents.find();
console.log(cursor.next());
This throws an error at the line:
var cursor = mongoose....
claiming "Cannot call method 'find' of undefined".
Note that my collection 'contents' does in fact exist and contains one JSON document. I know this because I manually navigated to the collection using the mongo shell.
Edit: I am open to alternative methods of querying the database. I simply want to return JSON objects from my DB one at a time, while keeping track of where the client is in the database.
One method to query mongoDB using mongoose is as follows:
Content.findOne().exec(function(err,docs){console.log(docs)});
docs contains the JSON object; access its attributes like any other object.
Note that this method uses asynchronous callbacks, so you can't store the docs object in an outer variable in order to use the document's information outside of the function. You need to perform whatever actions depend on the document's information inside the callback.
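The timing problem is independent of Mongoose; `fakeFindOne` below is a made-up stand-in for any callback-style query such as `Content.findOne().exec(cb)`:

```javascript
// Stand-in for a callback-style query like Content.findOne().exec(cb):
// it delivers its result on a later tick, as a real query would.
function fakeFindOne(callback) {
  setImmediate(() => callback(null, { filename: 'a.mp4' }));
}

// Broken: assigning to an outer variable and reading it right away.
let leaked = null;
fakeFindOne((err, docs) => { leaked = docs; });
console.log(leaked); // null — the callback has not run yet

// Correct: use the result inside the callback...
fakeFindOne((err, docs) => {
  console.log(docs.filename); // 'a.mp4' — safe to use here
});

// ...or wrap the call in a Promise if you prefer async/await.
const findOneAsync = () =>
  new Promise((resolve, reject) =>
    fakeFindOne((err, docs) => (err ? reject(err) : resolve(docs))));
```

The Promise wrapper is just the general callback-to-Promise pattern; recent Mongoose versions already return Promises from `exec()` when no callback is passed.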
For example, I needed the query result to provide the file path for my GET API, so the result looked like this:
//get
app.get('/api/media/', function(req, res){
    Content.findOne().exec(function(err, docs){
        res.sendFile(path.join(__dirname, '/api/media/', docs.filename));
    });
});
Note that Content is the model for my schema, and one of its fields is filename.
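If you do want `next()`-style iteration as in the question, newer Mongoose versions expose it via `Content.find().cursor()`, whose `next()` resolves to the next document or null when exhausted. The iteration pattern itself looks like this — `makeCursor` is a made-up stub with the same `next()` contract, standing in for a real connection:

```javascript
// Stub with the same contract as a Mongoose QueryCursor:
// next() resolves to the next document, or null when exhausted.
function makeCursor(docs) {
  let i = 0;
  return {
    next: async () => (i < docs.length ? docs[i++] : null),
  };
}

// Pull documents one at a time until the cursor is exhausted;
// the cursor itself tracks where the client is in the collection.
async function drain(cursor) {
  const seen = [];
  for (let doc = await cursor.next(); doc != null; doc = await cursor.next()) {
    seen.push(doc.filename);
  }
  return seen;
}

const cursor = makeCursor([{ filename: 'a.mp4' }, { filename: 'b.mp4' }]);
drain(cursor).then((names) => console.log(names)); // [ 'a.mp4', 'b.mp4' ]
```

With a real connection you would write `const cursor = Content.find().cursor();` and the loop body stays the same.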
I can't figure out what's wrong with this snippet of code. I have an indexedDB instance. The keyPath is auto-generated. I can successfully add objects to the DB and get all objects in the DB, but I can't successfully search for an object in an index I created.
See my jsfiddle: http://jsfiddle.net/R5ngM/13/
I haven't nailed it down perfectly yet, but the issue seems to be that you're opening the cursor on your object store rather than on the index. With the default keyPath that works fine, but it won't when you're trying to use a secondary index.
What I think you're looking to do should look like:
var request = null;
if ( null === index || 'undefined' === typeof index ) {
    request = transaction.openCursor( keyRange );
} else {
    request = index.openCursor( keyRange );
}
request.onsuccess = on_success;
UPDATE: I spent a lot of time looking in the wrong place after finding the above issue with the index cursor. Here's the real issue: the object you're storing is an array, not an object literal.
I was looking into the object you were storing and noticed this:
var entry = '<li>'+row[0].fname+' '+row[0].lname+'</li>';
See how you're accessing the first element of the row array? You're storing an array, but your keyPath (and I) assumed an object was stored. I believe it's possible to have an index on an array, but in any case your keyPath is off.
Here's a largely similar chunk of working code. I mucked it up a bit while debugging, but you'll get the gist. (It's a nice snippet, so if you don't mind, I'll use it as the base of other StackOverflow examples later on.)
Keep the cursor on the index as explained in my answer above. Then change this line:
var entry = '<li>'+row[0].fname+' '+row[0].lname+'</li>';
to:
var entry = '<li>'+row.fname+' '+row.lname+'</li>';
And change this:
var newUser = [{fname:$('#fname').val(),lname:$('#lname').val()}];
To this:
var newUser = {fname:$('#fname').val(),lname:$('#lname').val()};
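The shape mismatch is easy to see outside IndexedDB entirely: a keyPath like 'fname' resolves a property on the stored value itself, which only works when that value is an object literal rather than an array wrapping one:

```javascript
const asObject = { fname: 'Ada', lname: 'Lovelace' }; // what the keyPath expects
const asArray = [{ fname: 'Ada', lname: 'Lovelace' }]; // what the code stored

// A keyPath of 'fname' looks up that property on the stored value:
console.log(asObject.fname);   // 'Ada'
console.log(asArray.fname);    // undefined — the array has no fname property
console.log(asArray[0].fname); // 'Ada' — why row[0].fname "worked" in the UI
```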
It should also be noted that this issue can arise even when you do open the cursor on the index. If you open the cursor on the index and still always get null, double-check the data type you created the index on against the data type of the key you use to open the index cursor. In my case, I had created the index on a string field but was opening a cursor on the index with an int key, and it was driving me crazy.
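IndexedDB compares keys without type coercion, so the situation is analogous to a JavaScript Map, used here purely as an illustration: a string key and a numeric key never match, and a cursor opened with the wrong key type matches nothing.

```javascript
// Stand-in for an index whose keys are strings. IndexedDB, like Map,
// never coerces key types: '5' and 5 are different keys.
const index = new Map();
index.set('5', { id: '5' });

console.log(index.get('5')); // { id: '5' } — matching type, record found
console.log(index.get(5));   // undefined — numeric 5 never matches string '5'
```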
I want to assign the header row of a .csv file as the data provider of a drop-down list. So far I can load the .csv with a URL request and loader. I set the data format to text and trace the output of loader.data, which shows me everything in the CSV.
protected function appendFileUploadedHandler(event:AppendFileUploaded):void
{
    userCSVRequest = new URLRequest("foo.com/myFile.csv");
    userCSVLoader.dataFormat = URLLoaderDataFormat.TEXT;
    userCSVLoader.addEventListener(Event.COMPLETE, csvLoader_complete);
    userCSVLoader.load(userCSVRequest);
}

public function csvLoader_complete(event:Event):void
{
    trace(userCSVLoader.data.toString());
}
What I can't figure out is how to determine where the first row / header row ends. Should I just convert the CSV to XML, or is there something I can do that doesn't involve that extra step? Some of the CSV files will be very large, so I don't want to waste loading time.
For anyone else who runs into this problem, the solution is pretty simple. Nothing I found through searching gave me exactly what I needed. I ended up using the .split method on userCSV.data (which is a String) and assigning the value to an array:
// will truncate everything past the first newline char; hopefully this is your header
var headerArray:Array = userCSV.data.split("\n", 1);
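The same idea in plain JavaScript, taken one step further to split the header row into the individual column names — `csvText` is made-up sample data standing in for the loader's result:

```javascript
// Made-up CSV payload, standing in for the loaded .csv text.
const csvText = 'fname,lname,email\nAda,Lovelace,ada@example.com\n';

// Everything up to the first newline is the header row; strip a
// trailing \r in case the file uses Windows (\r\n) line endings.
const headerRow = csvText.split('\n', 1)[0].replace(/\r$/, '');

// Splitting on commas gives the column names for the drop-down list.
const columnNames = headerRow.split(',');
console.log(columnNames); // [ 'fname', 'lname', 'email' ]
```

Note this naive comma split breaks on quoted fields containing commas; for simple headers it's enough.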
I have a program that adds a lot of new data to a database using Linq2SQL.
In order to avoid DuplicateKeyExceptions, I check for the existence of the key, before trying to add a new value into the database.
As of now, I can't provide an isolated test-case, but I have simplified the code as much as possible.
// newValue is created outside of this function, with data read from a file
// The code is supposed to either add new values to the database, or update existing ones
var entryWithSamePrimaryKey = db.Values.FirstOrDefault(row => row.TimestampUtc == newValue.TimestampUtc && row.MeterID == newValue.MeterID);
if (entryWithSamePrimaryKey == null)
{
    db.Values.InsertOnSubmit(newValue);
    db.SubmitChanges();
}
else if (entryWithSamePrimaryKey.VALUE != newValue.VALUE)
{
    db.Values.DeleteOnSubmit(entryWithSamePrimaryKey);
    db.SubmitChanges();
    db.Values.InsertOnSubmit(newValue);
    db.SubmitChanges();
}
Strangely enough, when I look at the exceptions in the application log to see which items cause trouble, I am unable to find ANY of them in the database.
I suspect this happens within the update code, so that the items get removed from the database, but not added again.
I will update my code to deliver more information, and then update this post accordingly.
If the error occurs in the update block, you can merge the objects instead: rather than deleting entryWithSamePrimaryKey and re-inserting, set its properties to the values from newValue and then save the changes.
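In Linq2SQL terms that means copying the changed fields from newValue onto entryWithSamePrimaryKey and calling SubmitChanges() once, so the row never disappears between a delete and a re-insert. The shape of that merge-style upsert, sketched in JavaScript with a plain Map standing in for the table:

```javascript
// In-memory stand-in for the Values table, keyed by the composite
// primary key (TimestampUtc, MeterID).
const table = new Map();
const keyOf = (v) => `${v.timestampUtc}|${v.meterId}`;

// Upsert: insert when the key is new, otherwise update the existing
// row in place — no delete + re-insert, so the row never vanishes.
function upsert(newValue) {
  const existing = table.get(keyOf(newValue));
  if (existing === undefined) {
    table.set(keyOf(newValue), { ...newValue });
  } else if (existing.value !== newValue.value) {
    existing.value = newValue.value; // merge instead of delete + insert
  }
}

upsert({ timestampUtc: 't1', meterId: 1, value: 10 });
upsert({ timestampUtc: 't1', meterId: 1, value: 42 }); // updates in place
console.log(table.get('t1|1').value); // 42
```

The delete-then-insert version fails if anything (an exception, a concurrent reader) happens between the two SubmitChanges() calls; merging keeps the operation a single change set.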