Pivot Table in AdWords Script Alternative - google-apps-script

Manipulating pivot tables in Google AdWords Scripts is not yet supported. Is there a script or project alternative for this?

You can generate a new tab and insert into any cell a formula with a Google QUERY, like this:
=QUERY(IMPORTRANGE("1fSdx7f3rg_Vp_11yFuqKZmHraqFit8A", "Headline1!A1:J"), "select Col1, sum(Col4), sum(Col5), sum(Col7), sum(Col8) group by Col1", 1)
Here you specify the source of the data, the query, and a headers parameter.
Note that the columns must be referred to as Col1, Col2, etc., so the column order must stay fixed (important if you want to generate the report regularly).
The Google Visualization API Query Language (used in the query parameter) is close to SQL, but with some limitations on the commands that can be used.
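For reference, the group-by-sum aggregation that the QUERY above performs can also be sketched in plain JavaScript (the language of Apps Script); the column indexes here are illustrative, not taken from the question:

```javascript
// Group rows by the first column and sum selected numeric columns.
// rows: array of arrays, e.g. the output of range.getValues() in Apps Script.
// sumCols: zero-based indexes of the columns to total per group.
function pivotSum(rows, sumCols) {
  const groups = new Map();
  for (const row of rows) {
    const key = row[0];
    if (!groups.has(key)) {
      groups.set(key, sumCols.map(() => 0));
    }
    const totals = groups.get(key);
    sumCols.forEach((c, i) => {
      totals[i] += Number(row[c]) || 0;
    });
  }
  // One row per group: [key, total1, total2, ...]
  return [...groups.entries()].map(([key, totals]) => [key, ...totals]);
}
```

In a real script you would read the source range with getValues(), pass the rows through a function like this, and write the result back with setValues().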

Related

Google Apps Script Transpose & Cartesian Product

I have been building a solution for project resource management in Google Sheets.
The idea is that the team manager inputs the resource demand per project on a weekly level on a single row. This is a very user-friendly and easy solution for the end user. See the image below for what the input sheet looks like.
Input_Sheet
Since my organization has multiple teams, we want to have a separate sheet for each team. In order to still keep the reporting centralized, I have connected each sheet into BigQuery where I am combining the data from different sheets and finally creating reports in Power BI.
Currently, I am using a mix of Google Sheets functions such as QUERY, ARRAYFORMULA, SPLIT and FLATTEN to transpose the data into a database-suitable format, shown in the image below. To load the data into a database, I need to transpose it from a horizontal format into a vertical format.
Result_Sheet
My current issue is that there can be around 300 rows and 100 columns, and at that size a plain Google Sheets formula is getting too heavy to run. I am now looking for a way to do the processing in Apps Script, if that proves more efficient and lets me add some logic for how often the script runs.
Would something like this be possible in Apps Script, and how should one do it? I have some coding experience, but I'm new to Apps Script. I am struggling especially with producing the Cartesian product, which would allow me to link each date with its demand.
I have added a link to my example sheet below.
Link to sheet: https://docs.google.com/spreadsheets/d/1XKyt3BAo5L2RsK2vYpqrlBuEZoehIztGJ_Nl2DN-h-8/edit?usp=sharing
Use an { array expression }, like this:
=arrayformula( query(
{
Input_Sheet!B2:D2 \ "Date" \ "Demand";
flatten( iferror(Input_Sheet!E3:H / 0; Input_Sheet!B3:B) ) \
flatten( iferror(Input_Sheet!E3:H / 0; Input_Sheet!C3:C) ) \
flatten( iferror(Input_Sheet!E3:H / 0; Input_Sheet!D3:D) ) \
flatten( to_date( iferror(Input_Sheet!E3:H / 0; Input_Sheet!E2:H2) ) ) \
flatten(Input_Sheet!E3:H)
};
"where Col5 is not null
order by Col1";
1
) )
This is not a Cartesian product. It is more like an unpivot.
See the new Solution sheet in your sample spreadsheet.
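If the formula becomes too heavy at 300 rows by 100 columns, the same unpivot can be sketched in plain JavaScript (Apps Script's language). The input shape assumed here, a few key columns followed by one column per week, follows the sheet described above, but the exact column layout is an assumption:

```javascript
// Unpivot a wide layout into one row per (keys, date, demand) tuple.
// header: [keyLabel1, ..., keyLabelN, date1, date2, ...]
// rows:   data rows matching the header layout
// keyCount: number of leading key columns before the weekly columns
// Returns rows of [key1, ..., keyN, date, demand], skipping empty demands.
function unpivot(header, rows, keyCount) {
  const out = [];
  for (const row of rows) {
    const keys = row.slice(0, keyCount);
    for (let c = keyCount; c < header.length; c++) {
      const demand = row[c];
      if (demand !== "" && demand !== null && demand !== undefined) {
        out.push([...keys, header[c], demand]);
      }
    }
  }
  return out;
}
```

In Apps Script you would read the input range with getValues(), call a function like this, and write the result to the output sheet with setValues(), driven by a time-based trigger if you want to control how often it runs.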

google sheet apps script truncate column value

I have the following table:
I am simply trying to write a Google Apps Script to insert into column E everything in column B prior to '.Upload'. So in the table, column E = 20ba4a5c.
I think I should be able to use the split() function to do that, but I'm having some difficulty.
ss.getRange('E'+lastRow).setFormula('SPLIT(B'+lastRow+'.Upload')[0]');
You should use REGEXEXTRACT to build a simple regex in order to achieve your goal.
It is also a better practice to use template literals when dealing with multiple concatenations.
Solution:
Replace:
setFormula('SPLIT(B'+lastRow+'.Upload')[0]');
with:
setFormula(`=REGEXEXTRACT(B${lastRow},"^(.*?)\\.Upload")`)
(The dot is escaped so it matches a literal '.', not any character.)
Output:
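Alternatively, the extraction can be done in plain JavaScript before writing the value, instead of via a cell formula; the helper name here is hypothetical, and the '.Upload' suffix comes from the question:

```javascript
// Return everything in a string before the literal ".Upload" suffix.
// Falls back to the full string if the suffix is absent.
function beforeUpload(value) {
  const i = value.indexOf(".Upload");
  return i === -1 ? value : value.slice(0, i);
}
```

In the script this would be used as something like ss.getRange('E' + lastRow).setValue(beforeUpload(b)), where b is the value read from column B.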

How can I automate COUNTIF formulas across multiple Google sheets?

Each release, a number of my reports help with testing for our engineering team; they are assigned cases in a Google Sheet and work through them.
Is there a way I can count the number of times each name appears in a Google Sheet, from a separate sheet or app?
I know I could use COUNTIF, but that would mean each week I'd have to go in and write a bunch of COUNTIF formulas for the various people testing.
I was thinking there must be a way to write a script that finds the unique names in a column, then automatically writes the COUNTIF statements and prints them to a CSV or separate sheet. This is really the first thing I've tried to automate and I am struggling to find the right info to get started, so any direction or help would be greatly appreciated.
You can use the =QUERY() Google Sheets function along with row concatenation. As an example, take a spreadsheet with three sheets (named Part1, Part2 and Part3), each having an issue column and an agent name column:
=QUERY({Part1!A2:B;Part2!A2:B;Part3!A2:B}, "SELECT Col2, COUNT(Col2) WHERE Col2 != '' GROUP BY Col2 LABEL Col2 'Agent', COUNT(Col2) 'Cases this release'")
Example
Result
You can see a working example of this feature in this public worksheet: https://docs.google.com/spreadsheets/d/1hrIMGsvSYOxHDoDCZNdYflE7PLh8Q_MOaeJBnr52tb8/edit?usp=sharing
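If you would rather do this from Apps Script than with a formula, the counting step can be sketched in plain JavaScript; the input is assumed to be a flat array of names read from the relevant column:

```javascript
// Count occurrences of each non-empty name in a list.
// names: array of strings, e.g. a flattened column from getValues().
// Returns [[name, count], ...] sorted by name.
function countNames(names) {
  const counts = new Map();
  for (const name of names) {
    if (name) {
      counts.set(name, (counts.get(name) || 0) + 1);
    }
  }
  return [...counts.entries()].sort((a, b) => a[0].localeCompare(b[0]));
}
```

The result rows can then be written to a separate sheet with setValues(), which avoids maintaining one COUNTIF per person.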

MySQL query using Excel

I am writing a MySQL script in VBA to extract data from my database.
Is it possible to pass a list of criteria to a MySQL statement in Excel?
For example, let's say my script (using VBA) is as follows:
select * from farms where date = '20180101' and
(animals like 'cat'
or animals like 'dog'
or animals like 'horse')
Now let's say I have 200 animals and they are in an Excel list or in a spreadsheet. Is it possible to reference the list (like a lookup) instead of writing a very lengthy script?
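The usual approach is to build a single IN (...) clause from the list instead of one LIKE per value. The idea can be sketched in JavaScript (the question uses VBA, so treat this as pseudocode for the same string-building logic); the escaping here is minimal and for illustration only:

```javascript
// Build a "column IN ('a', 'b', ...)" clause from a list of values.
// Illustration only: production code should bind parameters through the
// database driver instead of concatenating strings, to avoid SQL injection.
function buildInClause(column, values) {
  const quoted = values.map((v) => "'" + String(v).replace(/'/g, "''") + "'");
  return column + " IN (" + quoted.join(", ") + ")";
}
```

In VBA the equivalent would loop over the worksheet range holding the 200 animals and join the quoted values the same way before appending the clause to the query string.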

Using Google BigQuery to run multiple queries back to back

I'm currently working on a project where I'm using Google BigQuery to pull data from spreadsheets. I'm VERY new to SQL, so I apologize. I'm currently using the following code:
Select *
From my_data
Where T1 > 1000
And T2 > 2000
So, keeping the SELECT and FROM the same, I want to run multiple queries where I just keep changing the values I'm filtering on for T1 and T2, around 50 different value pairs. I'd like BigQuery to run through these 50 variations back to back. Is there a way to do this? Thanks!
I'm VERY new to SQL
... and, I assume, new to BigQuery as well ..., so
below is one of the options for new users who are not yet familiar with the BigQuery API and/or clients other than the BigQuery Web UI.
BigQuery Mate adds a parameters feature to the BigQuery Web UI.
What you need to do is:
Save your query using the Save Query button.
Notice <var_t1> and <var_t2>: those are the parameters identifiable by BigQuery Mate.
Now you can set those parameters.
Click QB Mate and then Parameters to get to the form below.
Set the parameters to whatever values you want to run with.
Click the Replace Parameters OK button and those values will appear in the editor.
After OK is clicked, you can run your query.
To run another round with new parameters, load your saved query into the editor again by clicking the Edit Query button, then repeat setting the parameters, and so on.
You can find the BigQuery Mate Chrome extension here.
Disclaimer: I am the author and the only developer of this tool.
You may be interested in running parameterized queries. The idea would be to have a single query string, e.g.:
SELECT *
FROM YourTable
WHERE t1 > @t1_min AND
      t2 > @t2_min;
You would execute this multiple times, binding different values of the t1_min and t2_min parameters each time. The exact logic depends on the API through which you are using the client libraries, and there are language-specific examples in the first link that I provided.
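For illustration only, the effect of binding named parameters can be sketched as client-side string substitution in JavaScript; note that the real BigQuery client libraries bind @-parameters server-side rather than editing the query text, so this is just a stand-in for the idea:

```javascript
// Substitute @name placeholders in a query string with supplied values.
// Illustration only: real client libraries bind parameters server-side.
function bindParams(query, params) {
  return query.replace(/@(\w+)/g, (match, name) =>
    name in params ? String(params[name]) : match
  );
}
```

Looping such a call over your 50 value pairs would issue the 50 query variants back to back.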
If you are not concerned about sql-injection and just want to iteratively swap out parameters in queries, you might want to look into the mustache templating language (available in R as 'whisker').
If you are using R, you can iterate/automate this type of query with the condusco R package. Here's a complete R script that will accomplish this kind of iterative query using both whisker and condusco:
library(bigrquery)
library(condusco)
library(whisker)

# create a simple function that will create a query
# using {{{mustache}}} placeholders for any parameters
create_results_table <- function(params){
  destination_table <- '{{{dataset_id}}}.{{{table_prefix}}}_results_{{{year_low}}}_{{{year_high}}}'

  query <- '
    SELECT *
    FROM `bigquery-public-data.samples.gsod`
    WHERE year > {{{year_low}}}
      AND year <= {{{year_high}}}
  '

  # use whisker to swap out {{{mustache}}} placeholders with parameters
  query_exec(
    whisker.render(query, params),
    project = whisker.render('{{{project}}}', params),
    destination_table = whisker.render(destination_table, params),
    use_legacy_sql = FALSE
  )
}

# create an invocation query that provides sets of parameters to create_results_table
invocation_query <- '
  SELECT
    "<YOUR PROJECT HERE>" as project,
    "<YOUR DATASET_ID HERE>" as dataset_id,
    "<YOUR TABLE PREFIX HERE>" as table_prefix,
    num as year_low,
    num + 1 as year_high
  FROM `bigquery-public-data.common_us.num_999999`
  WHERE num BETWEEN 1992 AND 1995
'

# call condusco's run_pipeline_gbq to iteratively run
# create_results_table over invocation_query's results
run_pipeline_gbq(
  create_results_table,
  invocation_query,
  project = '<YOUR PROJECT HERE>',
  use_legacy_sql = FALSE
)