How to get the filtered Data Extension DataSource? - salesforce-marketing-cloud

I am trying to get a filtered Data Extension's Data Source using WSProxy in Salesforce Marketing Cloud. The objects I used are DataExtension and FilterDefinition, but neither of them returns the value. Is there any way I can get it? Thanks in advance.
var prox = new Script.Util.WSProxy();
var cols = ["DataSource"];
var filter = {
  Property: "CustomerKey",
  SimpleOperator: "equals",
  Value: "CustomerKey"
};
var desc = prox.retrieve("FilterDefinition", cols, filter);

Related

Pulling PubMed data into Google Sheets

I'm looking for some help. I am trying to grab an author's publications from PubMed and populate the data into Google Sheets using Apps Script. I've gotten as far as the code below and am now stuck.
Basically, what I did first was pull all the PubMed IDs for a particular author, whose name comes from the name of the sheet. Then I tried creating a loop to go through each PubMed ID's JSON summary and pull each field I want. I have been able to pull the pub date. I set it up with the idea that I would loop over each field of that PMID I want, store it in an array, and then return it to my sheet. However, I'm now stuck trying to get the second field (title) and all the subsequent fields (e.g. authors, last author, first author, etc.).
Any help would be greatly appreciated.
function IMPORTPMID() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheets()[0];
  var author = sheet.getSheetName();
  var url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?db=pubmed&term=" + author + "[author]&retmode=json&retmax=1000";
  var response = UrlFetchApp.fetch(url);
  var AllAuthorPMID = JSON.parse(response.getContentText());
  var xpath = "esearchresult/idlist";
  var patharray = xpath.split("/");
  for (var i = 0; i < patharray.length; i++) {
    AllAuthorPMID = AllAuthorPMID[patharray[i]];
  }
  var PMID = AllAuthorPMID;
  var PDparsearray = [PMID.length];
  var titleparsearray = [PMID.length];
  for (var x = 0; x < PMID.length; x++) {
    var urlsum = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi?db=pubmed&retmode=json&rettype=abstract&id=" + PMID[x];
    var ressum = UrlFetchApp.fetch(urlsum);
    var contentsum = ressum.getContentText();
    var jsonsum = JSON.parse(contentsum);
    var PDpath = "result/" + PMID[x] + "/pubdate";
    var titlepath = "result/" + PMID[x] + "/title";
    var PDpatharray = PDpath.split("/");
    var titlepatharray = titlepath.split("/");
    for (var j = 0; j < PDpatharray.length; j++) {
      jsonsum = jsonsum[PDpatharray[j]];
    }
    PDparsearray[x] = jsonsum;
  }
  var tempArr = [];
  for (var obj in AllAuthorPMID) {
    tempArr.push([obj, AllAuthorPMID[obj], PDparsearray[obj]]);
  }
  return tempArr;
}
From a PubMed JSON response for a given PubMed ID, you should be able to determine the fieldnames (and paths to them) that you want to include in your summary report. Reading them all is simpler to implement if they are all at the same level, but if some are properties of a sub-field, you can still access them if you give the right path in your setup.
Consider the "source JSON":
[
  { "pubMedId": "1234",
    "name": "Jay Sahn",
    "publications": [
      { "pubId": "abcd",
        "issn": "A1B2C3",
        "title": "Dynamic JSON Parsing: A Journey into Madness",
        "authors": [
          { "pubMedId": "1234" },
          { "pubMedId": "2345" }
        ]
      },
      { "pubId": "efgh",
        ...
      },
      ...
    ],
    ...
  },
  ...
]
The pubId and issn fields would be at the same level, while the publications and authors would not.
You can retrieve both the pubMedId and publications fields (and others you desire) in the same loop by either 1) hard-coding the field access, or 2) writing code that parses a field path and supplying field paths.
Option 1 is likely to be faster, but much less flexible if you suddenly want to get a new field, since you have to remember how to write the code to access that field, along with where to insert it, etc. God save you if the API changes.
Option 2 is harder to get right, but once right, will (should) work for any field you (properly) specify. Getting a new field is as easy as writing the path to it in the relevant config variable. There are possibly libraries that will do this for you.
To convert the above into spreadsheet rows (one per pubMedId in the outer array, e.g. the IDs you queried their API for), consider this example code:
function foo() {
  const sheet = SpreadsheetApp.getActiveSheet(); // or get a sheet reference some other way
  const resp = UrlFetchApp.fetch(...).getContentText();
  const data = JSON.parse(resp);
  // Paths relative to the outermost field, which for the imaginary source is an array of "author" objects.
  const fields = ['pubMedId', 'name', 'publications/pubId', 'publications/title', 'publications/authors/pubMedId'];
  const output = data.map(function (author) {
    var row = fields.map(function (f) {
      var desiredField = f.split('/').reduce(delve_, author);
      return JSON.stringify(desiredField);
    });
    return row;
  });
  sheet.getRange(1, 1, output.length, output[0].length).setValues(output);
}

function delve_(parentObj, property, i, fullPath) {
  // Dive into the given object to get the path. If the parent is an array, access its elements.
  if (parentObj === undefined)
    return;
  // Simple case: parentObj is an Object, and property exists.
  const child = parentObj[property];
  if (child)
    return child;
  // Not a direct property / index, so perhaps a property on an object in an Array.
  if (parentObj.constructor === Array)
    return collate_(parentObj, fullPath.splice(i));
  console.warn({message: "Unhandled case / missing property",
                args: {parent: parentObj, prop: property, index: i, pathArray: fullPath}});
  return; // property didn't exist, user error.
}

function collate_(arr, fields) {
  // Obtain the given property from all elements of the array.
  const results = arr.map(function (element) {
    return fields.slice().reduce(delve_, element);
  });
  return results;
}
Executing this logs the generated rows, which you can inspect in Stackdriver.
Obviously you probably want some different (aka real) fields, and probably have other ideas for how to report them, so I leave that portion up to the reader.
Anyone with improvements to the above is welcome to submit a PR.
Recommended Reading:
Array#reduce
Array#map
Array#splice
Array#slice
Internet references on parsing nested JSON. There are a lot.
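The path-split-and-reduce idea at the core of the answer can be tried outside Apps Script. Here is a minimal sketch; the sample object and field names are invented for illustration, not real PubMed response fields:

```javascript
// Walk an "a/b/c" style path into a nested object by splitting and reducing.
// The sample data and paths here are invented for illustration.
const sample = { result: { "1234": { pubdate: "2020 Jan", title: "An Example Title" } } };

function getByPath(obj, path) {
  return path.split("/").reduce(function (parent, key) {
    // Stop safely if an intermediate key was missing.
    return parent === undefined ? undefined : parent[key];
  }, obj);
}

console.log(getByPath(sample, "result/1234/pubdate")); // "2020 Jan"
console.log(getByPath(sample, "result/1234/missing")); // undefined
```

Adding a new field is then just adding another path string, which is the flexibility Option 2 buys you.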

Unable to get only price from WorldCoin Index API

I'm using the world coin index API, however when using the following with google scripts:
function myFunction() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var response = UrlFetchApp.fetch("https://www.worldcoinindex.com/apiservice/ticker?key=h3mWeJn5YvaCFGIVqGGXz4fuKM9EaA&label=sumobtc&fiat=btc");
  var json = response.getContentText();
  var data = JSON.parse(json);
  sheet.getRange(2, 10).setValue([data.Markets.Price]);
  Logger.log(response);
}
I'm getting the following response
`{"Markets":[{"Label":"SUMO/BTC","Name":"Sumokoin","Price":0.00028270,"Volume_24h":15.68123925,"Timestamp":1525166340}]}`
I'm trying to get the individual Price only, not the whole response.
It appears that your response contains an array of objects under the Markets key. To access the first price in that array, you must first access the first element of the array, then the Price key of that object.
You should be able to access this value with data.Markets[0].Price
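To check this outside Apps Script, here is a minimal sketch using the sample response from the question:

```javascript
// Parse the sample ticker response and pull out only the price.
var json = '{"Markets":[{"Label":"SUMO/BTC","Name":"Sumokoin","Price":0.00028270,"Volume_24h":15.68123925,"Timestamp":1525166340}]}';
var data = JSON.parse(json);
var price = data.Markets[0].Price; // first element of the Markets array, then its Price key
console.log(price); // 0.0002827
```

In Apps Script you would then write that single number with sheet.getRange(2, 10).setValue(price).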

Is GAS/google sheet an alternative option to PHP/mySQL?

I keep finding forum results that refer to the google visualiser for displaying my query results. But it just seems to dump the data in a pre-made table. I haven't found a way to write my own custom table dynamically.
In the past, when I have hacked together PHP to make use of mySQL DB, I would simply connect to my DB, run a query into an array, then cycle through the array and write the table in HTML. I could do IF statements in the middle for formatting or extra tweaks to the displayed data, etc. But I am struggling to find documentation on how to do something similar in a google script. What is the equivalent work flow? Can someone point me to a tutorial that will start me down this path?
I just want a simple HTML page with text box and submit button that runs a query on my Google sheet (back in the .gs file) and displays the results in a table back on the HTML page.
Maybe my understanding that GAS/google sheets is an alternative to PHP/mySQL is where I'm going wrong? Am I trying to make a smoothie with a toaster instead of a blender?
Any help appreciated
Welcome David. Ruben is right, it's cool to post up some code you have tried. But I spent many months getting my head around Apps Script and love to share what I know. There are several ways to get the data out of a Google sheet. There is a very well documented Spreadsheet Service for GAS. There is also a client API.
There is the approach you mention. I suggest converting the incoming data to JSON so you can do with it as you like.
Unauthenticated frontend queries are also possible. They need the spreadsheet to be published and set to "anyone with the link can view", and use the Google Visualization API:
var sql = 'SELECT A,B,C,D,E,F,G where A = true order by A DESC LIMIT 10 offset ';
var queryString = encodeURIComponent(sql);
var query = new google.visualization.Query('https://docs.google.com/spreadsheets/d/' + spreadsheetId + '/gviz/tq?tq=' + queryString);
query.send(handleSampleDataQueryResponse);

function handleSampleDataQueryResponse(response) {
  var myData = response.getDataTable();
  var myObject = JSON.parse(myData.toJSON());
  console.log(myObject);
}
Other Approaches
In your GAS back end this can get all of your data in columns A to C as an array of arrays ([[row1],[row2],[row3]]).
function getData(query) {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheetByName('Sheet1');
  var range = sheet.getRange('A1:C');
  var data = range.getValues();
  // filter `data` according to the query here, then:
  return data;
}
In your GAS frontend this can call your backend. The data arrives in the success handler, since google.script.run does not return it directly.
var success = function (e) {
  console.log(e);
};
google.script.run.withSuccessHandler(success).getData(query);
In your GAS backend this can add data to your sheet. The client-side API can also add data, but it is more complex and needs authentication.
function setData(data) {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheetByName('Sheet1');
  var range = sheet.getRange(1, 1, data.length, data[0].length);
  range.setValues(data);
  return 'success!';
}
In your GAS frontend this can send data to your backend.
var success = function (e) {
  console.log(e);
};
var updateData = [['row1'], ['row2'], ['row3']];
google.script.run.withSuccessHandler(success).setData(updateData);
Finally
Or you can get everything from the sheet as JSON and do the query in the client. This works okay if you shape the data in the sheet the way you will need it. This also needs the spreadsheet to be published and set to "anyone with the link can view".
var firstSheet = function () {
  var spreadsheetID = "SOME_ID";
  var url = "https://spreadsheets.google.com/feeds/list/" + spreadsheetID + "/1/public/values?alt=json";
  return new Promise((resolve, reject) => {
    $.getJSON(url, (data) => {
      let result = data.feed.entry;
      resolve(result);
    });
  });
};

firstSheet().then(function (data) {
  console.log(data);
});
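The entries that feed returns wrap each cell value per column. A minimal sketch of flattening them into plain row objects; the column names here (name, score) are invented for illustration:

```javascript
// The old "list feed" JSON wraps each cell as { "$t": "value" } under a
// gsx$-prefixed key per column. These sample entries are invented.
var entries = [
  { "gsx$name": { "$t": "Ada" },   "gsx$score": { "$t": "42" } },
  { "gsx$name": { "$t": "Linus" }, "gsx$score": { "$t": "7" } }
];

var rows = entries.map(function (e) {
  return {
    name: e["gsx$name"]["$t"],
    score: Number(e["gsx$score"]["$t"]) // feed values arrive as strings
  };
});

console.log(rows); // [{ name: "Ada", score: 42 }, { name: "Linus", score: 7 }]
```

Once flattened like this, the client-side "query" is just ordinary array filtering and mapping.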

Can I append data to an existing BigQuery table from a CSV file using the API?

I'm trying to use Google Apps Script to append data to a BigQuery table using the BigQuery API. The data to append is currently in CSV format. So far I've found that you can stream data into BigQuery using tabledata().insertAll(), but it looks like that requires JSON format, and I'm not even convinced it would do what I need. Is there a straightforward solution that I'm missing? I know BigQuery supports appending, yet everything I'm finding is focused on loading data into new tables.
EDIT:
Sounds like tabledata().insertAll() is indeed the right method to use (hopefully). So I converted my file to JSON instead, but now I'm stuck on how to actually use it. I'm trying to base what I'm doing on the reference page for it, but it's still really confusing to me. Currently I get a 404 error when my code hits the fetch call. I'm trying to do a URL fetch; maybe that's not how I'm supposed to do things? I'm really new to APIs and still figuring out how they work. Here's the code that's causing this:
var tableId = 'users';
var file = DriveApp.getFileById(jsonId);
// I don't know if a blob is the type that I want or not, but I'm trying it
var data = file.getBlob();
var url = 'https://www.googleapis.com/bigquery/v2/projects/PROJECT_ID/datasets/DATASET_ID/tables/tableId/insertAll'
  .replace("PROJECT_ID", params.PROJECT_ID)
  .replace("DATASET_ID", params.DATASET_ID)
  .replace("tableId", tableId);
var response = UrlFetchApp.fetch(url, {
  "kind": "bigquery#tableDataInsertAllRequest",
  "skipInvalidRows": 0,
  "ignoreUnknownValues": 0,
  "rows": [
    {
      "json": data
    }
  ],
  headers: {
    Authorization: 'Bearer ' + service.getAccessToken()
  }
});
var result = JSON.parse(response.getContentText());
Logger.log(JSON.stringify(result, null, 2));
This is not the most direct route from CSV to BigQuery JSON, but here is some code I'm using that should help you on the BigQuery side.
var PROJECT_ID = "xxx";
var DATASET_ID = "yyy";

function convertValuesToRows(data) {
  var rows = [];
  var headers = data[0];
  for (var i = 1, numRows = data.length; i < numRows; i++) {
    var row = BigQuery.newTableDataInsertAllRequestRows();
    row.json = data[i].reduce(function (obj, value, index) {
      obj[headers[index]] = value;
      return obj;
    }, {});
    rows.push(row);
  }
  return rows;
}

function bigqueryInsertData(data, tableName) {
  var insertAllRequest = BigQuery.newTableDataInsertAllRequest();
  insertAllRequest.rows = convertValuesToRows(data);
  var response = BigQuery.Tabledata.insertAll(insertAllRequest, PROJECT_ID, DATASET_ID, tableName);
  if (response.insertErrors) {
    Logger.log(response.insertErrors);
  }
}
This lets you supply any GAS-style value matrix (from getValues or indeed Utilities.parseCsv).
convertValuesToRows will take a 2D array of strings (with headers) and encode it in the format BigQuery needs, e.g.
[["H1", "H2", "H3"],
 [1   , 2   , 3   ],
 [4   , 5   , 6   ]];
will be added to the insertAll request in the form of key-value pairs, i.e.
[{H1: 1, H2: 2, H3: 3},
 {H1: 4, H2: 5, H3: 6}]
You only need to worry about the first representation, as that is what you pass into bigqueryInsertData together with the name of the table you want to feed the data into (the schema of the table needs to match what you are sending); the converter function is called from within.
Utilities.parseCsv already returns a 2D array of strings, so you can basically call bigqueryInsertData(Utilities.parseCsv(data.getDataAsString()), "myTable")
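The header-to-key conversion at the heart of convertValuesToRows can be sketched in plain JavaScript, without the BigQuery service objects, to show just the reduce step:

```javascript
// Turn [[headers], [row], ...] into one object per row, keyed by header.
var values = [["H1", "H2", "H3"],
              [1, 2, 3],
              [4, 5, 6]];
var headers = values[0];

var rows = values.slice(1).map(function (rowValues) {
  return rowValues.reduce(function (obj, value, index) {
    obj[headers[index]] = value; // pair each value with its column header
    return obj;
  }, {});
});

console.log(rows); // [{ H1: 1, H2: 2, H3: 3 }, { H1: 4, H2: 5, H3: 6 }]
```

In the real function, each of these objects becomes the json property of a newTableDataInsertAllRequestRows() entry.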

How do you query ScriptDb for partial matches?

I tried using RegEx and it did not return any results:
function findRecord() {
  var db = ScriptDb.getMyDb();
  var toFind = /Quality/i;
  var results = db.query({companyName: toFind});
  while (results.hasNext()) {
    var result = results.next();
    Logger.log(Utilities.jsonStringify(result));
  }
}
From what I can see, ScriptDb's query() will only return exact matches for strings.
The only way I can see is to return the entire database and then iterate through it. I really hope there is a way to query partial matches.
Try iterating over the results using the match method
function testQuery() {
  var db = ScriptDb.getMyDb();
  var results = db.query({});
  var start = new Date();
  while (results.hasNext()) {
    var result = results.next();
    if (result.companyName.match(/qual.*/i)) {
      Logger.log(Utilities.jsonStringify(result));
    }
  }
  var endTime = new Date();
  Logger.log("time is " + (endTime.getTime() - start.getTime()) + "ms");
}
ScriptDb currently doesn't support partial matches in strings. Depending on the data you may be able to use the anyOf method:
var results = db.query({
  companyName: db.anyOf(['Quality', 'quality'])
});
I don't think that is possible. You may open an "enhancement request" on the issue tracker.
But depending on your usage, it may be possible to achieve your goal by structuring your database differently: for example, create some kind of "tag" property on your objects that you set beforehand, i.e. when adding the object to the database, so you can query on it later.
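A minimal sketch of that tag idea. This is a hedged example: whether ScriptDb's query matches individual elements of a stored array is an assumption worth verifying before relying on it.

```javascript
// Pre-compute lowercase word "tags" at save time so later queries can use
// exact matches instead of partial string matches. Illustrative only.
var record = {
  companyName: "Quality Widgets Inc",
  tags: "Quality Widgets Inc".toLowerCase().split(" ")
};

// At save time you would call: db.save(record);
// Later, a query like db.query({tags: "quality"}) could then find it by an
// exact match on one tag (assumption: query matches elements of an array).
console.log(record.tags); // ["quality", "widgets", "inc"]
```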