How do I push values into a column Google Sheets App Script - google-apps-script

I'm searching through the documentation as best I can, but I'm on a time limit here, so if someone can tell me that would be great.
I need to insert data into a column and have it push the existing data in that column down when it's inserted. For example, I need to add the word "Good" at the top of the column: "Bad" was at the top, but when I push in "Good", "Bad" moves to the number two spot, the number two spot becomes the number three spot, and so on. It needs to do this without deleting or moving the rows themselves, because I'm reading data from two columns in the sheet and then writing to a third column.
Thanks in advance!

Welcome to Stack Overflow.
From what I understood of your question, you have already been able to read data from two columns and now you just want to store some of it in a separate column. Apologies if I misunderstood your question.
If I understood it right, I would suggest building a list of requests and sending them as one batch update, which helps you stay under the write quota.
So, here is how it goes:
request = []
request.append({
    "updateCells": {
        "rows": [
            {
                "values": [
                    {
                        "userEnteredValue": {
                            "numberValue": 546564  # Assuming your value is an integer
                        },
                        "userEnteredFormat": {
                            "horizontalAlignment": "CENTER",
                            "verticalAlignment": "MIDDLE"
                        }
                    }
                ]
            }
        ],
        "fields": "*",
        "range": {
            # Replace these values with actual values
            "sheetId": sheetId,
            "startRowIndex": startRow,  # Indexing starts from 0
            "endRowIndex": endRow,
            "startColumnIndex": startColumn,
            "endColumnIndex": endColumn,
        }
    }
})
# You can add more requests like this to the list and then execute
body = {
    "requests": request
}
response = sheet.spreadsheets().batchUpdate(
    spreadsheetId=spreadsheet_id,
    body=body).execute()
# If you are using gspread, you can use this instead
sheet.batch_update({"requests": request})
This will update the cells with your given value. For detailed information and other formatting options, follow the documentation.
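Since the question is tagged google-apps-script, here is also a minimal Apps Script sketch of the "push down" behaviour itself, using Range.insertCells; the column letter and the value are assumptions for illustration:
function pushValueToTop() {
    var sheet = SpreadsheetApp.getActiveSheet();
    // Insert one empty cell at C1 and shift the existing data in column C down,
    // without moving the rows of the other columns.
    sheet.getRange('C1').insertCells(SpreadsheetApp.Dimension.ROWS);
    sheet.getRange('C1').setValue('Good');
}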

Related

Secomea DCM 3529 multiple triggers or two values in the same reading

I am currently having trouble with the Secomea Data Collection Module, and I was wondering if anyone here might be able to enlighten me.
I am collecting sensor data from the Secomea 3529 product through a portal called Secomea Sitemanager. I can't seem to find any information about my two questions below; I hope someone knows the answer.
Information about the protocol used in this project:
"Protocol": "S7/TCP",
"S7Access": {
  "S7Model": "S7-200",
  "S7Rack": 0,
  "S7Slot": 1
}
When collecting data, it is configured using JSON as seen below.
I was wondering if it is possible to somehow have more than one TriggerSample and, if so, how it is set up?
{
  "SampleName": "Sensor1",
  "SampleDescription": "Some Description",
  "SampleDataType": "bool",
  "SamplesSaved": 3600,
  "Aggregation": {
    "Function": "compute",
    "Expression": "Sensor2,1,/",
    "TriggerSample": "Sensor3"
  }
},
My other question: is it possible to have more than one S7Var?
{
  "SampleName": "ModeCheck",
  "SampleDescription": "Mode status",
  "SampleDataType": "int16",
  "SamplesSaved": 360,
  "S7Var": {
    "S7PLCVar": "LocationInMachineDB1",
    "S7SampleInterval": 5
  }
},

Mongo query to get comma separated value

I have a query which only matches in the forward direction (from the start of the value).
Example:
{
  "orderStatus": "SUBMITTED",
  "orderNumber": "785654",
  "orderLine": [
    {
      "lineNumber": "E1000",
      "trackingnumber": "12345,67890",
      "lineStatus": "IN-PROGRESS",
      "lineStatusCode": 50
    }
  ],
  "accountNumber": 9076
}
find({'orderLine.trackingNumber' : { $regex: "^12345.*"} })
When I use the above query I get the entire document, but I want to fetch the document when I search with the value 67890 as well.
At any point in time I will always be querying with a single tracking number only, either 12345 or 67890. There is also a chance the tracking number value can grow, e.g. 12345,56789,01234,56678.
I need to pull the whole document no matter which position the tracking number is in.
OUTPUT should be the whole document:
{
  "orderStatus": "SUBMITTED",
  "orderNumber": "785654",
  "orderLine": [
    {
      "lineNumber": "E1000",
      "trackingnumber": "12345,67890",
      "lineStatus": "IN-PROGRESS",
      "lineStatusCode": 50
    }
  ],
  "accountNumber": 9076
}
Also, I have created an index on the trackingNumber field. Need help here. Thanks in advance.
The following will search for either 12345 or 67890. It is similar to a LIKE condition:
find({'orderLine.trackingNumber' : { $regex: /12345/} })
find({'orderLine.trackingNumber' : { $regex: /67890/} })
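If you also want to avoid partial matches (for example /6789/ would match 67890, and /1234/ would match 12345), you can anchor the value between commas or string boundaries; a small sketch, using the lowercase trackingnumber spelling from the sample document:
// matches 67890 whether it is the first, a middle, or the last value in the list
db.order.find({ 'orderLine.trackingnumber': { $regex: /(^|,)67890(,|$)/ } })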
There's also an alternative way to do this
Create a text index
db.order.createIndex({'orderLine.trackingnumber':"text"})
You can make use of this index to search for the value in the trackingnumber field:
db.order.find({$text:{$search:'12345'}})
--
db.order.find({$text:{$search:'67890'}})
--
// Do take note that you can't search using only a few in-between characters;
// for example, the following query won't give any result:
db.order.find({$text:{$search:'6789'}}) // the trailing 0 has been purposefully removed
To further understand how $text searches work, please go through the MongoDB documentation on text search.
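If you ever need a strictly exact match on one of the comma separated values (rather than a substring or stemmed text match), an aggregation sketch is another option; this assumes MongoDB 3.6+ and the lowercase trackingnumber field name from the sample document:
db.order.aggregate([
  {
    $match: {
      $expr: {
        $anyElementTrue: [{
          $map: {
            input: "$orderLine",
            as: "line",
            // split "12345,67890" into ["12345", "67890"] and look for an exact value
            in: { $in: ["67890", { $split: ["$$line.trackingnumber", ","] }] }
          }
        }]
      }
    }
  }
])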

How to parse in Google Sheets a nested JSON structure with fallback when the data isn't available?

I'm getting Yahoo Finance data as a JSON file (via the YahooFinancials python API) and I would like to be able to parse the data in a smart way to feed my Google Sheet.
For this example, I'm interested in getting the "cash" variable under the "date" nested structure. But as you'll see, sometimes there is no "cash" variable under the first date, so I would like the script/formula to go and get the "cash" variable that's under the second date structure.
Here is sample 1 of JSON code:
{ "balanceSheetHistoryQuarterly": {
"ABBV": [
{
"2018-12-31": {
"totalStockholderEquity": -2921000000,
"netTangibleAssets": -45264000000
}
},
{
"2018-09-30": {
"intangibleAssets": 26625000000,
"capitalSurplus": 14680000000,
"totalLiab": 69085000000,
"totalStockholderEquity": -2921000000,
"otherCurrentLiab": 378000000,
"totalAssets": 66164000000,
"commonStock": 18000000,
"otherCurrentAssets": 112000000,
"retainedEarnings": 6789000000,
"otherLiab": 16511000000,
"goodWill": 15718000000,
"treasuryStock": -24408000000,
"otherAssets": 943000000,
"cash": 8015000000,
"totalCurrentLiabilities": 15387000000,
"shortLongTermDebt": 1026000000,
"otherStockholderEquity": -2559000000,
"propertyPlantEquipment": 2950000000,
"totalCurrentAssets": 18465000000,
"longTermInvestments": 1463000000,
"netTangibleAssets": -45264000000,
"shortTermInvestments": 770000000,
"netReceivables": 5780000000,
"longTermDebt": 37187000000,
"inventory": 1786000000,
"accountsPayable": 10981000000
}
},
{
"2018-06-30": {
"intangibleAssets": 26903000000,
"capitalSurplus": 14596000000,
"totalLiab": 65016000000,
"totalStockholderEquity": -3375000000,
"otherCurrentLiab": 350000000,
"totalAssets": 61641000000,
"commonStock": 18000000,
"otherCurrentAssets": 128000000,
"retainedEarnings": 5495000000,
"otherLiab": 16576000000,
"goodWill": 15692000000,
"treasuryStock": -23484000000,
"otherAssets": 909000000,
"cash": 3547000000,
"totalCurrentLiabilities": 17224000000,
"shortLongTermDebt": 3026000000,
"otherStockholderEquity": -2639000000,
"propertyPlantEquipment": 2787000000,
"totalCurrentAssets": 13845000000,
"longTermInvestments": 1505000000,
"netTangibleAssets": -45970000000,
"shortTermInvestments": 196000000,
"netReceivables": 5793000000,
"longTermDebt": 31216000000,
"inventory": 1580000000,
"accountsPayable": 10337000000
}
},
{
"2018-03-31": {
"intangibleAssets": 27230000000,
"capitalSurplus": 14519000000,
"totalLiab": 65789000000,
"totalStockholderEquity": 3553000000,
"otherCurrentLiab": 125000000,
"totalAssets": 69342000000,
"commonStock": 18000000,
"otherCurrentAssets": 17000000,
"retainedEarnings": 4977000000,
"otherLiab": 17250000000,
"goodWill": 15880000000,
"treasuryStock": -15961000000,
"otherAssets": 903000000,
"cash": 9007000000,
"totalCurrentLiabilities": 17058000000,
"shortLongTermDebt": 6024000000,
"otherStockholderEquity": -2630000000,
"propertyPlantEquipment": 2828000000,
"totalCurrentAssets": 20444000000,
"longTermInvestments": 2057000000,
"netTangibleAssets": -39557000000,
"shortTermInvestments": 467000000,
"netReceivables": 5841000000,
"longTermDebt": 31481000000,
"inventory": 1738000000,
"accountsPayable": 10542000000
}
}
]
}
}
The first date structure (under 2018-12-31) doesn't contain the cash variable. So I would like the Google sheet to go and search for the same data in 2018-09-30 and if not available go and search in 2018-06-30.
OR just scan the nested structure dates and fetch the first "cash" occurrence that will be found.
Basically, I would like to know how to skip the name of the date variable (i.e.2018-12-31) as it doesn't really matter, and just make the formula seek for the first available "cash" variable.
Main questions recap:
1. How to skip mentioning an exact nested level name and scan what's inside?
2. How to keep scanning until you find the desired variable with a value that is not "null" (this can happen)?
3. What would be the entire formula to achieve the following logic: scan the JSON file until you find the value; if no value is found, fall back to this IMPORTXML function that calls an external API.
Let me know if you need more context about the issue and thanks in advance for your help :)
EDIT: this is the IMPORTJSON formula I use in the cell of the spreadsheet right now.
=ImportJSON("https://api.myjson.com/bins/8mxvi", "/financial/balanceSheetHistoryQuarterly/ABBV/2018-12-31/cash", "noHeaders")
Obviously, this one returns an error as there is nothing under that date. The URL is also the actual JSON link I am using right now.
=REGEXEXTRACT(FILTER(
TRANSPOSE(SPLIT(SUBSTITUTE(A1, ","&CHAR(10), "×"), "×")),
ISNUMBER(SEARCH("*"&"cash"&"*",
TRANSPOSE(SPLIT(SUBSTITUTE(A1, ","&CHAR(10), "×"), "×"))))), ": (.+)")
=INDEX(ARRAYFORMULA(SUBSTITUTE(REGEXEXTRACT(FILTER(TRANSPOSE(SPLIT(SUBSTITUTE(
TRANSPOSE(IMPORTDATA("https://api.myjson.com/bins/8mxvi")), ","&CHAR(10), "×"), "×")),
ISNUMBER(SEARCH("*"&"cash"&"*", TRANSPOSE(SPLIT(SUBSTITUTE(
TRANSPOSE(IMPORTDATA("https://api.myjson.com/bins/8mxvi")), ","&CHAR(10), "×"), "×"))))),
":(.+)"), ",", "")), 1, 1)

DataTables footerCallback - conditional on another column value

I'm trying to implement a footerCallback in DataTables that does a conditional sum of some columns, based on a cell in a different column of the same row. Can anyone help me with this? I have used the code below and checked alert(cur_index), but I think it is not working as expected, and I do not get a correct sum of the column's values. My code is:
pageTotal6 = api
    .column(6, { page: 'current' })
    .data()
    .reduce(function (a, b) {
        var cur_index = api.column(6).data().indexOf(b);
        alert(cur_index);
        alert(api.column(3).data()[cur_index]);
        if (api.column(3).data()[cur_index] != "Pending review") {
            return parseInt(a) + parseInt(b);
        }
        else { return parseInt(a); }
        return intVal(a) + intVal(b);
    }, 0);
Also, the 3rd column has some repeated values and I want to sum only the distinct values from the 3rd column. How can I do these two things using DataTables and HTML?
There are two ways you can go about this.
First Method
(I will assume you are reading JSON data from a database [ViewModel] in C#, and using server-side processing.)
Using the image below as a reference for how I solved the problem:
I wanted the sum of the "Amount" column where "Measure Type" (last column) != 99. The first thing I did with the ViewModel that passes the list to my JSON object was add an extra sum column that doesn't read any MeasureType = 99 rows from the table.
So essentially my JSON object has two columns that read the Amount column data: one visible, which you see in the image with all the figures, and another invisible, which only holds the values I want to sum in my footer.
while (MyDataReader.Read())
{
    // get all other columns
    // column with amount figures where measuretype != 99
    if (reportData.q_measuretype != 99)
    {
        reportData.amountNo99 = Convert.ToDecimal(String.Format("{0:0.00}", MyDataReader["q_amount"]));
    }
    else
    {
        reportData.amountNo99 = 0;
    }
    list.Add(reportData);
}
After that step, within the footerCallback function you can keep it simple by just summing the invisible column, because the condition has already been applied when the list of rows is built:
totalNettNo99 = api
    .column(8, { page: 'current' }) // remember, this is the last, invisible column
    .data()
    .reduce(function (a, b) {
        return intVal(a) + intVal(b);
    });
You can then update your footer with that sum on the visible column 3 (index 2):
$(api.column(2).footer()).html(
    '€' + totalNettNo99.toFixed(2)
);
Remember to set the invisible column this way in "columnDefs":
"ajax": {
"url": "/Reports/loadTransactionList",
"type": "POST",
"datatype": "JSON"
},
"columnDefs": [
{
"targets": [8],
"visible": false,
"searchable": false,
"render": false
}
],
"columns": [
{
"data": "convertDateToString"
},
{
"data": "convertTimeToString"
},
{
"data": "q_receiptnumber"
},
As you can see from the image, only the rows with Guinness Pint are totalled in the footer. It's a bit more typing, but it solves the problem if you have been tearing your hair out over a script-only solution.
Second Method
You can have a look at this answer here, done purely in script and with less typing than my solution:
https://stackoverflow.com/a/42215009/7610106
credit to nkbved
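For reference, a minimal sketch of that script-only approach, assuming array-based row data with the amount in column 6 and the status text in column 3 as in the question (with object-based data you would use the field names instead):
"footerCallback": function (row, data, start, end, display) {
    var api = this.api();
    var total = 0;
    // Sum column 6 only for rows whose column 3 is not "Pending review".
    api.rows({ page: 'current' }).every(function () {
        var d = this.data();
        if (d[3] !== "Pending review") {
            total += parseFloat(d[6]) || 0;
        }
    });
    $(api.column(6).footer()).html(total.toFixed(2));
}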

How to avoid Keys with Duplicate Values in Couchbase.Lite

Is it possible to tell CB.Lite to reject documents that repeat the value of a certain key?
For instance, suppose I already have the following document in CB.Lite:
{
  "Dog": {
    "Name": "Dug",
    "Color": "Blue",
    "Age": 2
  }
}
Is it possible to tell CB.Lite to reject any document with a repeated value for the key "Name", so that if I try to add the next one:
{
  "Dog": {
    "Name": "Dug",
    "Color": "Green",
    "Age": 5
  }
}
it would reject it?
I know it would not be much hassle to implement this functionality myself, but I was wondering if CB.Lite already has something out of the box.
Currently not at commit time (this is as of 1.4.x). The closest you could get, where Couchbase would do most of the work, would be to create a View emitting the value you don't want repeated, then query it and do the enforcement yourself.
This is assuming the docs themselves have different IDs. If you had what you showed using the same document ID, there are other possibilities. For example, you could trap this and reject it in Sync Gateway.
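To illustrate the Sync Gateway route for the same-document-ID case, here is a rough sketch of a sync function that rejects a new revision repeating the previous revision's Name; the document shape follows the example above and the channel name is just a placeholder:
function (doc, oldDoc) {
    // Reject an update whose Dog.Name repeats the value from the previous revision.
    if (oldDoc && doc.Dog && oldDoc.Dog && doc.Dog.Name === oldDoc.Dog.Name) {
        throw({forbidden: "Dog.Name must not repeat the previous revision's value"});
    }
    channel("dogs"); // placeholder channel assignment
}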