Append a value to a JSON-encoded array parameter stored in MySQL - mysql

I'm trying to work out how to append a zero to a specific JSON-encoded array value for multiple records stored in a MySQL table, according to some conditions.
For example, in the table 'menu', the column 'params' (text) holds records containing JSON-encoded objects of this format:
{"categories":["190"],"singleCatOrdering":"","menu-anchor_title":""}
and column 'id' has a numeric value of 90.
My goal is to append a zero to the 'categories' value in menu.params whenever (for example) menu.id is under 100.
For this record the result would be:
{"categories":["1900"],"singleCatOrdering":"","menu-anchor_title":""}
So I'm looking for an SQL query that will find the occurrences of "categories": ["999"] in the database and update each record by adding a zero to the end of the value.
This answer is partially helpful in suggesting mysql-udf-regexp, but it refers to REPLACE-ing a value rather than UPDATE-ing it.
Perhaps the REGEXP_REPLACE function will do the trick? I have never used this library and am not familiar with it; perhaps there is an easier way to achieve what I need?
Thanks

If I understand your question correctly, you want code that does something like this:
var data = {
    "menu": {
        "id": 90,
        "params": {
            "categories": ["190"],
            "singleCatOrdering": "",
            "menu-anchor_title": ""
        }
    }
};
var keys = Object.keys(data);
for (var ii = 0, key; key = keys[ii]; ii++) {
    var value = data[key];
    if (value.id < 100) {
        value.params.categories[0] += "0"; // append a zero: "190" -> "1900"
        alert(value.params.categories[0]);
    }
}
jsFiddle
However, I am not using a regular expression at all. Perhaps if you reword the question, the necessity of a regex will become clearer.
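If the params value is instead read out of menu.params as a raw JSON string, a minimal sketch of the same append-and-reserialize idea (the appendZero helper is just illustrative, assuming the stored text is valid JSON and the id check happens in application code):
var appendZero = function (params, id) {
    // params: the raw text from menu.params, id: the numeric menu.id
    var obj = JSON.parse(params);
    if (id < 100 && obj.categories && obj.categories.length) {
        obj.categories[0] += "0"; // "190" becomes "1900"
    }
    return JSON.stringify(obj); // write this string back to menu.params
};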

Related

Couchbase Function to query view: send parameter from Java

I have some Couchbase data in the following format
{
    "id": "12343",
    "transaction": {
        "2018-01-11": 10,
        "2017-12-01": 20
    },
    "_type": "TransactionData"
}
I would like to get the ids whose transaction object contains a key older than a given date (for example, this object would not be retrieved for a value of "2017-11-01", but it would for "2017-12-12").
I made a view, but I would like to parameterise the date String:
function (doc, meta) {
    if (doc._type == 'TransactionData') {
        for (var key in doc.transaction) {
            // I want to send the String value from Java
            if (key < "2018-02-21") {
                emit(doc.id, null);
                break;
            }
        }
    }
}
I tried writing a N1QL query, but my server doesn't allow that and I can't change this configuration.
I don't think I can use the startKey, because I return a map of (id, null) pairs.
How can I filter the ids that have transactions older than a configurable date?
Thanks.
You can do it like this:
function (doc, meta) {
    if (doc._type == 'TransactionData') {
        for (var key in doc.transaction) {
            emit(doc.id, null);
        }
    }
}
Use _count for the reduce function; then you can query using
query.range("2018-02-21", {}).reduce(true)
and take the returned value to see how many rows there are.
Views are static indexes. Documents are processed once after each change, and any emitted results are put into the index. You can't parameterize your function because it isn't rerun for every query, so you can't solve the problem the way you're approaching it. (You can do that with N1QL.)
Typically you solve this by adding a key range filter to your query. Look at the documentation for querying views. There are examples on how to select by date. You'll have to decide how you want to structure the index (view) you create.
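One way to structure such an index, sketched below by reusing the document shape from the question, is to emit the transaction date itself as the view key; the cutoff date then becomes an ordinary startkey/endkey range parameter on the query instead of a hard-coded string in the map function:
function (doc, meta) {
    if (doc._type == 'TransactionData') {
        for (var key in doc.transaction) {
            // emit the date as the key so the query can filter it with a key range
            emit(key, doc.id);
        }
    }
}
Querying that view with the given date as the end of the key range would then return the ids of documents that have at least one transaction older than it; duplicate ids can be collapsed on the client.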

Azure tables unable to store flattened JSON

I am using the npm flat package, and arrays/objects are flattened, but the object/array keys end up quoted, like in 'task_status.0.data', for the object below.
These specific fields do not get stored into Azure Tables - other fields go through, but these are silently ignored. How would I fix this?
var obj1 = {
    "studentId": "abc",
    "task_status": [
        {
            "status": "Current",
            "date": 516760078
        },
        {
            "status": "Late",
            "date": 1516414446
        }
    ],
    "student_plan": "n"
};
Here is how I am using it (simplified code example). Again, it successfully gets written to the table, but the properties that were flattened do not get written (see further below):
var flatten = require('flat')
newObj1 = flatten(obj1);
var entGen = azure.TableUtilities.entityGenerator;
newObj1.PartitionKey = entGen.String(uniqueIDFromMyDB);
newObj1.RowKey = entGen.String(uniqueStudentId);
tableService.insertEntity(myTableName, newObj1, myCallbackFunc);
In the above example, the flattened object would look like:
var obj1 = {
    studentId: "abc",
    'task_status.0.status': 'Current',
    'task_status.0.date': 516760078,
    'task_status.1.status': 'Late',
    'task_status.1.date': 1516414446,
    student_plan: "n"
}
Then I would add PartitionKey and RowKey.
All the task_status fields would silently fail to be inserted.
EDIT: This does not have anything to do with the actual flattening process. I just checked a perfectly good JSON object with keys that contained 'x.y.z'; Azure Tables doesn't seem to accept these column names, which almost completely destroys the value proposition of storing schema-less data without significant rework.
A . in a column name is not supported. You can use a custom delimiter to flatten your objects instead.
For example:
newObj1 = flatten(obj1, {delimiter: '__'});
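The flattened keys then contain no dots; assuming the example object above, and before PartitionKey and RowKey are added, the result passed to insertEntity would look roughly like this:
// newObj1, i.e. flatten(obj1, {delimiter: '__'}), now looks roughly like:
{
    studentId: 'abc',
    task_status__0__status: 'Current',
    task_status__0__date: 516760078,
    task_status__1__status: 'Late',
    task_status__1__date: 1516414446,
    student_plan: 'n'
}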

How should I search through this json structure?

I'm working with PHP, and I have a JSON structure which looks like this:
{
    "events": [
        {
            "timestamp": 1468774519,
            "id": 75964
        },
        {
            "timestamp": 1468771410,
            "id": 24891
        },
        // etc
I need to fetch 5 events in a row, starting from one specific id, so my first idea is to loop over every event from the beginning and check whether its id is the one I'm looking for; once I find it, I can loop over the next 5 events.
But is there a better way to do this? It could loop through hundreds of events, so maybe there's a better way to get there? Thanks
Since the ids aren't in numeric order, you can't use a binary search, so you need to use a sequential search. Here's an example in JavaScript. Also note this code assumes the id is present and there are at least four more events after it in the array.
var index = 0;
var id = 12345; // for example
var json = {...}; // whatever that object was
while (json.events[index].id != id) {
    index++;
}
// found the one, do something with the next five
for (var i = 0; i < 5; i++) {
    var event = json.events[index + i];
    // do something
}
In my opinion, you only need one loop over the events with a filter event.id >= theId, and then check whether the filtered array contains theId. If it does, you can sort this smaller array and take the 5 events.
First, I would make a key:value hash object (a lookup object), where the key would be the id from your structure and the value would be a reference to the event. As a result, you iterate over the structure only once and can then get any event from the lookup structure just by accessing it by its key.
You could sort it as well (ideally, you could get it already sorted by id from your source of data) and then use a binary search algorithm.
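A sketch of that lookup-object idea in JavaScript (the same structure translates directly to a PHP associative array; json.events is the structure from the question):
// build the lookup once: id -> position in the events array
var byId = {};
json.events.forEach(function (event, index) {
    byId[event.id] = index;
});

// later, for any starting id, find the offset in O(1) and take five events
var start = byId[24891];
var fiveEvents = json.events.slice(start, start + 5);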

MongoDB - Dynamically update an object in nested array

I have a document like this:
{
    Name : val,
    AnArray : [
        {
            Time : SomeTime
        },
        {
            Time : AnotherTime
        }
        ...arbitrary more elements
    ]
}
I need to update "Time" to a Date type (right now it is a string).
I would like to do something like this pseudocode:
foreach record in document.AnArray { record.Time = new Date(record.Time) }
I've read the documentation on $ and "dot" notation, as well as several similar questions here, and I tried this code:
db.collection.update({_id:doc._id},{$set : {AnArray.$.Time : new Date(AnArray.$.Time)}});
I was hoping that $ would iterate over the indexes of the "AnArray" property, as I don't know the length of it for each record. But I am getting the error:
SyntaxError: missing : after property id (shell):1
How can I perform an update on each member of the array's nested values with a dynamic value?
There's no direct way to do that, because MongoDB doesn't support an update-expression that references the document. Moreover, the $ operator only applies to the first match, so you'd have to perform this as long as there are still fields where AnArray.Time is of $type string.
You can, however, perform that update client side, in your favorite language or in the mongo console using JavaScript:
db.collection.find({}).forEach(function (doc) {
    for (var i in doc.AnArray) {
        doc.AnArray[i].Time = new Date(doc.AnArray[i].Time);
    }
    db.outcollection.save(doc);
})
Note that this will store the migrated data in a different collection. You can also update the collection in-place by replacing outcollection with collection.
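On MongoDB 4.2 or newer, pipeline-style updates can reference the existing document, so the conversion could also be done server side; this is only a sketch, assuming the field names from the question:
db.collection.updateMany({}, [
    { $set: {
        AnArray: {
            $map: {
                input: "$AnArray",
                as: "item",
                // keep the other fields of each element, replace Time with a Date
                in: { $mergeObjects: ["$$item", { Time: { $toDate: "$$item.Time" } }] }
            }
        }
    } }
]);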

Compare data against sql column

I have a collection of numbers like 11111, 12345, 12346 stored in a list in C# code. I need to compare this list against a SQL database column of similar numbers and find out whether matching numbers exist. Below is what I am doing:
foreach (var number in numbers)
{
    // get the column data through a SQL reader and iterate through it:
    foreach (var column in columnData)
    {
        if (number == column)
        {
            // do something
        }
    }
}
My question: is this the right approach? Or is there a better way to do it? It looks like this requires lots of processing.
I would do something like this:
var matches = columnData.Where(z => numbers.Contains(z.columnData)).ToList();
or
var matches = columnData.Select(z=> z.columnData).Intersect(numbers);