Inserting NULL values into a MySQL database from Sheets using Google Apps Script

I'm currently working on a data ingestion add-on using Google Apps Script. The main idea is that users of an application can insert data from Sheets into a database. To do so, I'm using the JDBC API that Apps Script provides.
The problem I'm currently having is that when I read an empty cell from the sheet, Apps Script gives it the type undefined, which produces an error at the moment of insertion. How can I insert NULL in that case?
My current insert function:
function putData(row, tableName) {
  var connectionName = '****';
  var user = '****';
  var userPwd = '*****';
  var db = '******';
  var dbUrl = 'jdbc:google:mysql://' + connectionName + '/' + db;
  var conn = Jdbc.getCloudSqlConnection(dbUrl, user, userPwd);
  var stmt = conn.createStatement();
  var data = row;
  var query = "INSERT INTO " + db + '.' + tableName + " VALUES (";
  var i = 0;
  // The following loop is just to build the query from the row taken from the sheet;
  // if the value is a String I add quotation marks.
  for each (item in row) {
    if ((typeof item) == 'string') {
      if (i == row.length - 1) {
        query += "'" + item + "'";
      } else {
        query += "'" + item + "',";
      }
    } else {
      if (i == row.length - 1) {
        query += item;
      } else {
        query += item + ",";
      }
    }
    i++;
  }
  query += ")";
  results = stmt.executeUpdate(query);
  stmt.close();
  conn.close();
}
When I try to insert the word "NULL" instead, in some cases it is treated as a string and causes an error on other fields.

When trying to get the data from the Spreadsheet, more precisely from a cell, the value will be automatically parsed to one of these types: Number, Boolean, Date or String.
According to the Google getValues() documentation:
The values may be of type Number, Boolean, Date, or String, depending on the value of the cell. Empty cells are represented by an empty string in the array.
So essentially, the undefined type most likely comes from the way you pass the row parameter (for example, from trying to access cells which are out of bounds).
If you want to solve your issue, you should add an if statement right after the for each (item in row) { line:
if (typeof item == 'undefined')
item = null;
The if statement checks whether the cell content is of type undefined and, if so, converts it to null. That way the content will be null and you should be able to insert it into the database.
The recommended way to do this, however, is to use JDBC Prepared Statements, which are precompiled SQL statements that make it easier to insert the necessary data: you wouldn't have to manually prepare the data for insertion, as in the code you provided above. They are also the safer option, making your data less prone to SQL injection attacks.
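For illustration, here is a minimal sketch of the same insert done with a prepared statement. It assumes dbUrl, user, userPwd and db are defined as in putData() above, and it maps empty or undefined cells to SQL NULL via setNull() (where 0 corresponds to java.sql.Types.NULL):
function putDataPrepared(row, tableName) {
  var conn = Jdbc.getCloudSqlConnection(dbUrl, user, userPwd); // same connection details as putData()
  // One "?" placeholder per cell in the row.
  var placeholders = row.map(function () { return '?'; }).join(', ');
  var stmt = conn.prepareStatement('INSERT INTO ' + db + '.' + tableName + ' VALUES (' + placeholders + ')');
  for (var i = 0; i < row.length; i++) {
    if (typeof row[i] === 'undefined' || row[i] === '') {
      stmt.setNull(i + 1, 0);        // empty cell -> SQL NULL (0 = java.sql.Types.NULL)
    } else {
      stmt.setObject(i + 1, row[i]); // lets JDBC handle quoting and typing
    }
  }
  stmt.execute();
  stmt.close();
  conn.close();
}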
Also, the for each...in statement is deprecated, and you should consider using something else instead, such as a standard for loop or a while loop.
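For illustration, here is a rough sketch of the value-building loop rewritten with a plain for statement and with the undefined/empty-cell check folded in (this only covers the string-building part; prepared statements remain the safer option):
// Sketch: build the comma-separated VALUES list with a standard for loop,
// turning empty/undefined cells into SQL NULL.
function buildValuesList(row) {
  var parts = [];
  for (var i = 0; i < row.length; i++) {
    var item = row[i];
    if (typeof item === 'undefined' || item === '') {
      parts.push('NULL');               // empty cell -> SQL NULL
    } else if (typeof item === 'string') {
      parts.push("'" + item + "'");     // quote strings
    } else {
      parts.push(item);
    }
  }
  return parts.join(',');
}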
Furthermore, I suggest you take a look at these links, since they might be of help:
Class JdbcPreparedStatement;
Class Range Apps Script - getValues().


"We're sorry, a server error occurred. Please wait a bit and try again" while creating charts

We are running into sporadic errors (more often than not) on a project that generates Google Doc documents based on info entered into Google Sheet spreadsheets.
The Google Apps Script project pulls data (from sheets), caches it in-memory with standard var statements, massages the data a bit (formatting, de-duping, etc), then copies a Doc template and does a bunch of token substitution and also inserts some charts based on the in-memory variables.
We are encountering the following error during a high quantity of executions:
Exception: We're sorry, a server error occurred. Please wait a bit and try again.
at insertChart(Code:5890:10)
at processForm(Code:4540:5)
The main method creates several charts (between 5 and 20, or more) depending on parameters entered at run time by the user via a standard web form created by GAS:
function doGet() {
  return HtmlService
      .createTemplateFromFile('index')
      .evaluate();
}
The web page returned by the above is a very standard GAS HTMLService webapp. It contains a form that allows a user to select a few standard types of reporting criteria (date range, filter by certain types of data, etc).
We are duplicating a "template" Docs file which has a bunch of common text and some tokens that we use as placeholders to find and replace with data loaded from the Sheets. The Sheets are very common spreadsheets with the kind of data you'd imagine: date ranges, selections from drop-down lists, free-text columns, etc. (The filtering info specified by the user is used to query the Sheets for the data we want to "merge" into the new Doc. It's basically a much more complex version of your standard "mail merge".)
var templateId = 'asdfasdfasdfasdfasdfasdfasdfasdfasdf'; // DEV
var documentId = DriveApp.getFileById(templateId).makeCopy().getId();
DriveApp.getFileById(documentId).setName('Demo Report - Project Report');
DriveApp.getFileById(documentId).setSharing(DriveApp.Access.ANYONE_WITH_LINK, DriveApp.Permission.COMMENT);
Logger.log('templateId: ' + templateId);
Logger.log('documentId: ' + documentId);
var doc = DocumentApp.openById(documentId);
var body = doc.getBody();
We have the following method which takes the Doc body, placeholder, and formatted data from a chart builder. It then identifies the Doc element token to replace (simply a string in the Doc file), and runs a body.insertImage after that token element.
function insertChart(body, placeholder, chart) {
  if (placeholder.getParent().getType() === DocumentApp.ElementType.BODY_SECTION && typeof chart !== 'undefined' && chart !== null) {
    var offset = body.getChildIndex(placeholder);
    //Logger.log(chart);
    body.insertImage(offset + 1, chart);
  }
}
We have several helper methods like the following buildStackedColumnChart(). It's meant to wrap the Chart API commands to use a dataTable and Chart Builder to return a specific type of chart (bar, stacked bar, line, etc).
Its resulting chart gets passed into the insertChart() method above.
function buildStackedColumnChart(title, header, data) {
  Logger.log('buildStackedColumnChart');
  Logger.log('title: ' + title);
  Logger.log('header: ' + header);
  Logger.log('data: ' + data);
  try {
    var dataTable = Charts.newDataTable();
    for (var hr = 0; hr < header.length; hr++) {
      var headerRow = header[hr];
      dataTable.addColumn(headerRow[0], headerRow[1]);
    }
    for (var r = 0; r < data.length; r++) {
      var row = data[r];
      dataTable.addRow(row);
    }
    dataTable.build();
    var chart = Charts.newColumnChart()
        .setDataTable(dataTable)
        .setDimensions(600, 500)
        .setTitle(title)
        .setStacked()
        .build();
    return chart;
  } catch (ex) {
    Logger.log('ex: ' + ex);
    return null;
  }
}
In the main processForm() method (called directly by the web app mentioned above when the user selects their criteria and clicks a "Generate Report" button), we have a few calls to a method which finds the token in the template file (just text with {{}} around it, as shown below). It iterates through a hard-coded list of values in the in-memory data variables storing accumulations from the Sheets, builds chartHeader and chartData (which contain the values to be charted in a form the helper methods above can translate back into Chart API calls), and uses the insertChart() and `` helper methods to insert the chart after the bookmark token and then remove the bookmark from the document (cleanup, so the tokens aren't present in the final report Doc).
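(findPlaceholder() itself isn't shown here; a simplified sketch of roughly what it does, not the exact implementation: it finds the token text in the body and returns the top-level element that contains it.)
// Simplified sketch of the token lookup, not the exact production code.
function findPlaceholder(body, token) {
  var match = body.findText(token);
  if (match === null) return null;
  var element = match.getElement();
  // Walk up until the parent is the body section, so body.getChildIndex() works on the result.
  while (element.getParent().getType() !== DocumentApp.ElementType.BODY_SECTION) {
    element = element.getParent();
  }
  return element;
}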
var chartTrainingsSummaryBookmark = findPlaceholder(body, '{{CHART_TRAININGS_SUMMARY}}');
Logger.log('chartTrainingsSummaryBookmark: ' + chartTrainingsSummaryBookmark);
var chartCategories = [
  'Call',
  'Conference Attendee',
  'Meeting (External)',
  'Partnership',
  'Quarterly Board Meeting',
  'Alliance or Workgroup',
  'Training (Others)',
  'Training (Professional Development)',
  'Youth Training',
  'Webinar or Zoom',
  'Other',
  'Coalition',
  'Email',
  'Face-to-face',
  'Phone',
  'Site Visit',
];
// {Call=10.0, Conference Attendee=10.0, Quarterly Board Meeting=10.0, Alliance or Workgroup=10.0, Coalition=10.0, Meeting (External)=10.0}
var chartHeaderData = [];
var chartData = [];
var monthChartHeader = [[Charts.ColumnType.STRING, 'Month']];
var monthChartData = [monthName];
for (var cci = 0; cci < chartCategories.length; cci++) {
  var chartCategory = monthName + ':' + chartCategories[cci];
  if (getChartData(chartCategory, trainingsChartData) > 0) {
    monthChartHeader.push([Charts.ColumnType.NUMBER, chartCategory.split(':')[1]]);
    monthChartData.push(getChartData(chartCategory, trainingsChartData));
  }
}
if (monthChartData.length > 1) {
  if (chartHeaderData.length < 1) {
    chartHeaderData = chartHeaderData.concat(monthChartHeader);
  }
  chartData.push(monthChartData);
}
Logger.log('----- CHART - TRAININGS: SUMMARY -----');
if (chartData.length > 0 && chartData[0].length > 1) {
  insertChart(body, chartTrainingsSummaryBookmark, buildStackedColumnChart('', chartHeaderData, chartData));
}
chartTrainingsSummaryBookmark.removeFromParent();
We are also seeing slight variations of this that error out the document generation with a timeout. (The process can take a while if there is a lot of data that fits the parameters the user specified.)
We were wondering why these errors happen at seemingly random intervals. We can literally just click the button that makes the call again and it might work, or it might error out on a different chart, etc.
This script had been working for the most part in previous months (we did encounter a few specific errors); however, this "Exception: We're sorry, a server error occurred. Please wait a bit and try again." started occurring last weekend and got far worse on Monday / Tuesday (nearly every execution failed, very few succeeded). It's not happening quite as badly today, but we are still seeing a few errors.
Also, it's a bit strange, but copying the GAS project and executing the copy has thrown the error less frequently than the original project in production.
We have some ideas for workarounds, but, ideally, we would like to identify a root cause so we can correctly fix the issue.
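One of the workarounds we are considering is simply retrying the insert when the transient server error appears; a rough sketch of the idea (not what is in production) using the insertChart() helper above:
// Sketch: retry insertChart() a few times with a short backoff, assuming the error is transient.
function insertChartWithRetry(body, placeholder, chart, maxAttempts) {
  maxAttempts = maxAttempts || 3;
  for (var attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      insertChart(body, placeholder, chart);
      return;
    } catch (e) {
      if (attempt === maxAttempts) throw e; // give up after the last attempt
      Utilities.sleep(1000 * attempt);      // back off before retrying
    }
  }
}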

Stop custom function from auto refreshing/periodically calling external API

I am using Google Apps Script and a custom function to call an external API to verify phone numbers.
Below is the code for my function.
/**
 * This CUSTOM FUNCTION uses the numVerify API to validate
 * a phone number based on the input from JotForm and a
 * country code which is derived from the JotForm country
 *
 * Numverify website: https://numverify.com/dashboard (account via LastPass)
 * Numverify docs: https://numverify.com/documentation
 */
function PHONE_CHECK(number, country) {
  if (country == "")
    return [["", "country_not_set"]];
  // check the API result has already been retrieved
  var range = SpreadsheetApp.getActiveSheet().getActiveRange();
  var apires = range.offset(0, 1).getValue();
  if (apires.length > 0)
    return range.offset(0, 0, 1, 2).getValues();
  var url = 'http://apilayer.net/api/validate'
      + '?access_key=' + NUMVERIFY_KEY
      + '&number=' + encodeURIComponent(number)
      + '&country_code=' + encodeURIComponent(country)
      + '&format=1';
  var response = UrlFetchApp.fetch(url, {'muteHttpExceptions': true});
  var json = response.getContentText();
  var data = JSON.parse(json);
  if (data.valid !== undefined) {
    if (data.valid) {
      return [[data.international_format, "OK"]];
    } else {
      return [["", "invalid_number"]]; // overflows data to the next column (API Error) while keeping the phone field clear for import into TL
    }
  } else if (data.success !== undefined) {
    if (data.error.type.length > 0) {
      return [[number, data.error.type]];
    } else {
      return [[number, "no_error_type"]];
    }
  } else {
    return [[number, "unexpected_error"]]; // this generally shouldn't happen...
  }
}
This formula takes a phone number and a country code, checks the number against the numVerify API, and returns the result in the cell, overflowing into the cell to the right of it. The overflow is used to indicate whether the API was called successfully and to check whether the result was already retrieved.
Example:
=PHONE_CHECK("+32123456789", "BE")
In this example the first cell is empty because the API returns an 'invalid phone number' code. For privacy reasons I won't put any real phone numbers here; had I used a real phone number, the first cell would contain it formatted in the international number format.
Since I'm using the free plan, I don't want to rerun the function every time if I already know what the result is, as I don't want to run up against the rate limit. Unfortunately, this doesn't seem to work and periodically (it looks like once every day), it will refresh the results for each row in the sheet.
So two questions:
Is something wrong with my logic in checking the API result and then just exiting the function? (see below for the code)
If the logic is right, why does Google Sheets seem to periodically ignore (or refresh?) the values in that second column and call the external API anyhow?
var range = SpreadsheetApp.getActiveSheet().getActiveRange() // get the cell from which the function is called
var apires = range.offset(0, 1).getValue() // get the values directly to the right of the cell
if(apires.length > 0) // check if there's anything there...
return range.offset(0, 0, 1, 2).getValues() // return an array that basically just resets the same values, effectively stopping the script from running
Your Aim:
You want a custom function, AKA a formula, to only run once, or only as many times as necessary to produce a certain result.
You want the same formula to write a value to another cell, for example the adjacent cell, that will tell the formula in the future whether it should be run again or not.
Short Answer:
I'm afraid that values that are evaluated from custom functions AKA formulas are transient, and what you want to accomplish is not possible with them.
Explanation:
You can run a quick test with this custom function:
function arrayTest() {
  return [[1, 2, 3, 4, 5]]
}
If you put this formula in a cell, its values overflow into the cells to the right. You will see that if you delete the formula in the original cell, the overflow values also disappear.
Therefore something like the following code will almost always produce the same value:
function checkTest() {
  var cell = SpreadsheetApp.getActiveRange()
  var status = cell.offset(0, 1).getValue();
  if (status != "") {
    return "already executed" // in your case without calling API
  } else {
    return [["OK","executed"]] // in your case making API call - will happen ~90% of the time.
  }
}
// OUTPUT [["OK","executed"]]
Here I am inserting a row and deleting it to force re-calculation of the formulas.
The first thing that Sheets does before re-calculating a formula is clear the previous values populated by the formula. Since the conditional statement depends on the value from its previous execution, it will always evaluate to the same result. In your case, it will almost always make the API call.
Confusingly, this is not 100% reliable! You will find that sometimes it will work as you intend. Though in my tests, this only happened around 1 time out of 10, and most often when the formulas recalculated after saving changes in the script editor.
Ideally, though it is not possible, you would want to be able to write something like this:
function checkTest() {
  var cell = SpreadsheetApp.getActiveRange();
  var cellValue = cell.getValue();
  var adjacentCell = cell.offset(0, 1);
  var status = adjacentCell.getValue();
  if (status == "") {
    cell.setValue(cellValue)
    adjacentCell.setValue("executed")
  }
}
This would clear the formula once it has run; alas, setValue() is disabled for custom functions! If you wanted to use setValue(), you would need to run your script from a menu, a trigger, or the script editor, in which case it would no longer make sense as a formula.
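For completeness, a rough sketch of how the same check could run from a custom menu instead of a formula, where setValue()/setValues() is allowed. The column layout (numbers in A, countries in B, results written to C:D) and the validatePhone() wrapper are assumptions for illustration, not part of the original code:
// Hypothetical wrapper: same numVerify call as PHONE_CHECK, minus the getActiveRange() logic.
function validatePhone(number, country) {
  if (country === '') return ['', 'country_not_set'];
  var url = 'http://apilayer.net/api/validate'
      + '?access_key=' + NUMVERIFY_KEY
      + '&number=' + encodeURIComponent(number)
      + '&country_code=' + encodeURIComponent(country)
      + '&format=1';
  var data = JSON.parse(UrlFetchApp.fetch(url, {muteHttpExceptions: true}).getContentText());
  if (data.valid !== undefined) return data.valid ? [data.international_format, 'OK'] : ['', 'invalid_number'];
  return [number, (data.error && data.error.type) || 'unexpected_error'];
}

function onOpen() {
  SpreadsheetApp.getUi().createMenu('Phone check')
      .addItem('Check new numbers', 'checkNewNumbers')
      .addToUi();
}

function checkNewNumbers() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var values = sheet.getRange(1, 1, sheet.getLastRow(), 4).getValues(); // columns A:D
  for (var r = 0; r < values.length; r++) {
    var number = values[r][0], country = values[r][1], status = values[r][3];
    if (number === '' || status !== '') continue;        // skip blanks and rows already checked
    var result = validatePhone(number, country);          // [formattedNumber, status]
    sheet.getRange(r + 1, 3, 1, 2).setValues([result]);   // written values persist, unlike formula output
  }
}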
References
https://developers.google.com/apps-script/guides/sheets/functions

Google Sheets SQL Queries Timing Out [duplicate]

I am trying to fetch data from a MySQL database on Google Cloud SQL using JDBC from Google Apps Script. However, I get this error:
Exception: Statement cancelled due to timeout or client request
I can fetch some other data successfully; however, some data I can't.
I executed one of the successful queries and one of the unsuccessful queries in MySQL Workbench, and the unsuccessful query runs there with no problem.
I compared the durations.
Duration / Fetch
-------------------------------------------
Successful query: 0.140 sec / 0.016 sec
Unsuccessful query: 0.406 sec / 0.047 sec
The unsuccessful query seems to take longer. So, I set query timeout with:
stmt.setQueryTimeout(0);
intending to set no timeout (when the value is set to zero it means that the execution has no timeout limit). Then, I executed it on Google Apps Script.
However, it doesn't work and I get the same error. Could you tell me a solution for this?
This seems to be a known issue. Star ★ and comment on it to get Google developers to prioritise it. Until it is fixed, you can switch back to the Rhino runtime.
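(For reference, the runtime can also be pinned in the project manifest, appsscript.json; as far as I know the relevant field is runtimeVersion, with DEPRECATED_ES5 selecting Rhino:)
{
  "runtimeVersion": "DEPRECATED_ES5"
}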
Update to add 2nd fix
After some trial and error I figured out what solved this for me: some queries worked, others returned this error.
Fix 1
The common denominator was that the queries which had been converted to the multi-line format by the V8 engine / new editor were the ones with this issue.
As an example, switching to the new editor / V8 converted long text strings to be similar to the following:
var query = "select Street_Number, street_name, street_suffix, street_dir_prefix, postal_code, city, mls, address, unitnumber "
+"as unit, uspsid,latitude,longitude from properties.forsale where status = 'active' and zpid is null and property_type = 'residential'"
This query resulted in the error as described. The fix is changing longer queries to be continuous strings like the following:
var query = "select Street_Number, street_name, street_suffix, street_dir_prefix, postal_code, city, mls, address, unitnumber as unit, uspsid,latitude,longitude from properties.forsale where status = 'active' and zpid is null and property_type = 'residential'"
Fix 2
This one was a bit more frustrating. V8 is not as forgiving when it comes to database connections. Prior to V8, any connections that lingered were closed automatically; however, it looks like V8 does not like that. I had originally written my scripts to share as few connections as possible, but I noticed that the error occurred in the ones where a connection might be 'split.' For example:
var conn = getconnection() //this is a connection function I have written
var date = date || Utilities.formatDate(new Date(), "America/Chicago", "yyyy-MM-dd");
var keyobj = getkeys(undefined,undefined,conn);
conn = keyobj.conn;
var results = sqltojson(query, true, conn);
The function above was causing this error. I'm assuming it's because the 'conn' variable was being returned from the function, and, to make a wild assumption, the connection for whatever reason cannot live in two places at once, since it was both kept as a value in the 'keyobj' object and assigned to 'conn'. Adding delete keyobj.conn immediately after defining the conn variable did the trick:
var conn = getconnection() //this is a connection function I have written
var date = date || Utilities.formatDate(new Date(), "America/Chicago", "yyyy-MM-dd");
var keyobj = getkeys(undefined,undefined,conn);
conn = keyobj.conn;
delete keyobj.conn;
var results = sqltojson(query, true, conn);
Doing both of these fixes stopped this error and allowed the script to continue without problems.
My issue was the same as mentioned above, and it failed today, so I tried switching back to the old version and it works for me. FYI.
@TheMaster: Trying out a connection to MySQL, I hit the same timeout issue, even when I tried the example at https://developers.google.com/apps-script/guides/jdbc > Write 500 rows of data to a table in a single batch.
Even worse, when I reverted to the Rhino runtime, the result was the same. That crashes my whole development approach! :(
[Edit] FWIW, it seems to me that both Rhino and V8 don't like keeping a connection (or is it the statement?) open long enough for the above prepareStatement to complete.
So I tried inserting the 500 records from the above linked example with a single plain SQL statement instead of the prepared-statement batch (left commented out below), which worked OK:
var conn = sqlGetConnection();
var start = new Date();
// conn.setAutoCommit(false);
// var stmt = conn.prepareStatement('INSERT INTO entries '
//     + '(guestName, content) values (?, ?)');
// for (var i = 0; i < 500; i++) {
//   stmt.setString(1, 'Name ' + i);
//   stmt.setString(2, 'Hello, world ' + i);
//   stmt.addBatch();
// }
// var batch = stmt.executeBatch();
// conn.commit();
var sql = 'INSERT INTO ' +
    'entries (guestName, content) ' +
    'values ';
for (var i = 0; i < 500; i++) {
  var col1 = "'" + 'Name ' + i + "'"; // Note that the strings had to be encapsulated
  var col2 = "'" + 'Hello, world ' + i + "'"; // in quotes to work with this method
  sql = sql + '(' + col1 + ', ' + col2 + '),';
}
sql = sql.substr(0, sql.length - 1);
var stmt = conn.createStatement();
var response = stmt.executeUpdate(sql); // executeQuery is only for SELECT statements
conn.close();
var end = new Date();
Logger.log('Time elapsed: %sms, response: %s rows', end - start, response);
I got this issue when I submitted an update statement for the same records in a MySQL table.
I set a breakpoint before the update statement in my program and started two processes running it. The first process updates the MySQL table correctly, and the second process gets this exception later.
You need to add 'FOR UPDATE' to your SELECT statement; then the second process will get zero rows instead of the exception when it updates the record in the transaction.
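For illustration, a rough sketch of that pattern with the Apps Script Jdbc service; the connection variables and the orders table/columns are made up for the example.
// Sketch: lock the row inside a transaction before updating it, so a concurrent
// process waits on SELECT ... FOR UPDATE instead of failing mid-update.
var conn = Jdbc.getCloudSqlConnection(dbUrl, user, userPwd); // hypothetical connection details
conn.setAutoCommit(false);
var stmt = conn.createStatement();
stmt.executeQuery("SELECT id FROM orders WHERE id = 42 FOR UPDATE"); // acquires the row lock
stmt.executeUpdate("UPDATE orders SET status = 'done' WHERE id = 42");
conn.commit();
stmt.close();
conn.close();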

How can I call Google BigQuery delete and insert APIs synchronously?

I am maintaining a database of transaction records whose data changes periodically.
I have a cron job running every half hour that pulls the latest transactions from the main database and feeds them to my Express Node app (I am pretty new to Node). I first delete the old transactions that match the incoming transactions' order numbers, then insert the latest ones into the BigQuery table.
After running the app for a day I am getting duplicate transactions in my database. Even after checking the logs I don't see the delete API failing anywhere, so I have no idea how or where the duplicates are coming from.
I am using @google-cloud/bigquery ^2.0.2, and I am deleting and inserting data into BigQuery tables using the query API.
I have tried using streaming inserts, but they don't allow deleting the recently inserted rows for up to 90 minutes, which won't work in my case.
My index.js
let orderNumbers = '';
rows.map(function (value) {
  orderNumbers += "'" + value.Order_Number + "',";
});
orderNumbers = orderNumbers.slice(0, -1);
await functions.deleteAllWhere('Order_Number', orderNumbers);
let chunkedRowsArray = _.chunk(rows, CONSTANTS.chunkSize);
let arrSize = chunkedRowsArray.length;
for (var i = 0; i < arrSize; i++) {
  let insertString = '';
  chunkedRowsArray[i].forEach(element => {
    let values = '(';
    Object.keys(element).forEach(function (key) {
      if (typeof element[key] == 'string') {
        values += '"' + element[key] + '",';
      } else {
        values += element[key] + ",";
      }
    });
    values = values.slice(0, -1);
    values += '),';
    insertString += values;
  });
  insertString = insertString.slice(0, -1);
  let rs = await functions.bulkInsert(insertString, i);
}
delete function call
await functions.deleteAllWhere('Order_Number', orderNumbers);
module.exports.deleteAllWhere = async (conditionKey, params) => {
  const DELETEQUERY = `
    DELETE FROM
      \`${URI}\`
    WHERE ${conditionKey}
    IN
      (${params})`;
  const options = {
    query: DELETEQUERY,
    timeoutMs: 300000,
    useLegacySql: false, // Use standard SQL syntax for queries.
  };
  // Runs the query
  return await bigquery.query(options);
};
The insert function similarly builds an insert query from the values, in chunks of 200.
I need to write a synchronous Node program which deletes some rows first and, after successful deletion of those rows, inserts the new ones.
I have no idea whether this is caused by the async nature of the code, whether something is up with BigQuery, or whether the stored procedure from which I am getting the data is buggy.
Sorry for the long post; I am new to Node and Stack Overflow.
Any help is appreciated.
Regarding the BigQuery integration, you should architect your data flow so that every new row is simply appended to the BigQuery table. Then have queries that return only the newest row, which is easy to do if you have a field to order by recency.
You can schedule BigQuery queries that maintain a materialized table of this cleaned-up data. So in the end you have two tables: one that you stream all rows into, and one that is materialized to retain only the newest.
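For instance, the scheduled cleanup query could keep only the newest row per Order_Number. This is only a sketch: it assumes you add a load-timestamp column (hypothetically named ingested_at) to every streamed row, and it reuses the URI constant from deleteAllWhere above.
// Sketch: materialize only the newest row per Order_Number from the append-only table.
const DEDUP_QUERY = `
  SELECT * EXCEPT(rn)
  FROM (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY Order_Number ORDER BY ingested_at DESC) AS rn
    FROM \`${URI}\`
  )
  WHERE rn = 1`;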

Google Sheets Custom Formula Sometimes Works Sometimes Doesn't

I have a spreadsheet in which I developed a custom function called RawVal:
function RawVal(BlockName) {
  try {
    var rawVal = 1;
    var thiSheet = SpreadsheetApp.getActiveSheet();
    var BlockRow = thiSheet.getRange("C:C").getValues().map(function(row) {return row[0];}).indexOf(BlockName);
    if (BlockRow > -1) {
      var baseVal = thiSheet.getRange("B" + (BlockRow + 1)).getValue();
      var ingVal = thiSheet.getRange("D" + (BlockRow + 1)).getValue();
      rawVal = Math.max(baseVal, ingVal);
      Logger.log(BlockName + ": base=" + baseVal + "; ing=" + ingVal + "; max=" + rawVal);
    }
    return rawVal;
  }
  catch (e) {
    Logger.log('RawVal yielded an error for ' + BlockName + ': ' + e);
  }
}
While the function is long, the intent is to avoid having to type a moderately sized formula on each row, such as:
=if(sumif(C:C,"Emerald Block",D:D)=0,sumif(C:C,"Emerald Block",B:B),sumif(C:C,"Emerald Block",D:D))
The problem is that sometimes it works and sometimes it just doesn't, and it doesn't seem to be related to the content. A cell that worked previously may display #NUM with the error "Result was not a number". But if I delete it and retype it (oddly, pasting the formula doesn't help), most of the time it will calculate correctly. Note: it is NOT stuck at "Loading", it is actually throwing an error.
Debug logs haven't been useful - and the inconsistency is driving me crazy. What have I done wrong?
EDIT: I replaced the instances of console.log with Logger.log - the cells calculated correctly for 6 hours and now have the #NUM error again.
It seems that your custom function is used in many places (on each row of the sheet). This, and the fact that the cells stop working after a while, points to excessive computation time that Google eventually refuses to provide. Try to follow their optimization suggestion and replace the multiple custom functions with one function that takes an array and returns an array. Here is how it could work:
function RawVal(array) {
  var thiSheet = SpreadsheetApp.getActiveSheet();
  var valuesC = thiSheet.getRange("C:C").getValues().map(function(row) {return row[0];});
  var valuesBD = thiSheet.getRange("B:D").getValues();
  var output = array.map(function(row) {
    var rawVal = 1;
    var blockRow = valuesC.indexOf(row[0]);
    if (blockRow > -1) {
      var baseVal = valuesBD[blockRow][0];
      var ingVal = valuesBD[blockRow][2];
      rawVal = Math.max(baseVal, ingVal);
    }
    return [rawVal];
  });
  return output;
}
You'd use this function as =RawVal(E2:E100), for example. The argument is passed as a double array of values, and the output is a double array too.
Also, when using ranges like "C:C", consider whether the sheet has a lot of empty rows under the data: it's not unusual to see a sheet with 20000 empty rows that pointlessly get searched over by functions like that.
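For example, a quick sketch of reading only the populated rows instead of the whole columns (row counts come from getLastRow()):
// Sketch: read only as many rows as actually contain data, not the full "C:C"/"B:D" columns.
function readLookupData() {
  var thiSheet = SpreadsheetApp.getActiveSheet();
  var lastRow = thiSheet.getLastRow();                            // last row that has content
  var valuesC = thiSheet.getRange(1, 3, lastRow, 1).getValues()   // column C, data rows only
      .map(function(row) { return row[0]; });
  var valuesBD = thiSheet.getRange(1, 2, lastRow, 3).getValues(); // columns B:D, data rows only
  return {valuesC: valuesC, valuesBD: valuesBD};
}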
Alternative: use built-in functions
It seems that your custom function is mostly a re-implementation of existing =vlookup. For example,
=arrayformula(iferror(vlookup(H:H, {C:C, B:B}, 2, false), 1))
looks up all entries in H in column C, and returns the corresponding values in column B; one formula does this for all rows (and it returns 1 when there is no match). You could have another such for column D, and then another arrayformula to take elementwise maximum of those two columns (see Take element-wise maximum of two columns with an array formula for the latter). The intermediate columns can be hidden from the view.