Gapps returns undefined for getParents() - google-apps-script

I'm running an add-on for spreadsheets where users can create a document.
This document is placed inside a folder for the current year, at the same location as the spreadsheet.
This worked for some time, but for some reason (I can't find why, or I'm searching in the wrong places) it stopped working.
To find the folder I use this code:
const ss = SpreadsheetApp.getActiveSpreadsheet();
var documentID = ss.getId();
var file = DriveApp.getFileById(documentID);
var parents = file.getParents();
console.log(parents.hasNext());
var parent;
while (parents.hasNext()) {
  parent = parents.next();
  console.log(`next parent = ${parent.getName()}`);
}
When I run this part on my own account, parents.hasNext() returns true and I get a name in the while loop.
But when I run the same script on a test account, I get false and that's it: no while loop.
My first thought was that it was a sharing issue within Drive, but the folder, its subfolders, and the folder above it are all shared with read/write access for everyone in my company, so I'm running out of ideas about what this might be.
Can anyone tell me what I'm doing wrong here?

The most probable cause I can think of is that your test account doesn't have access to the folder this file is in. Because of that, your script cannot get it. Sharing the entire folder should solve the issue (remember that it may take a while to fully propagate after sharing).

Related

How to log the creation date of a Google Drive folder in a spreadsheet script?

This is my first Google Apps Script, and the original plan was to do something far more complicated, but I got stuck at the very beginning.
Now, I just want to log the creation date of my folder, and I can't.
When I press Ctrl+Enter, there's nothing to see in the logs window.
Why is this happening? I am not trying to build a rocket here...
Here's my code :
fetchFiles();

function fetchFiles() {
  var folder = DriveApp.getFoldersByName("Aff Comm");
  Logger.log(folder.next().getDateCreated());
}
Your code, just tested in my script, works. But:
you need to save your script (extremely important); if a red asterisk is shown next to the script name, you have not saved your code
you need to select fetchFiles in the function dropdown menu (if you did not save, the function may be unavailable in the dropdown)
on the first run you must accept the permission request from Google
You may improve your code in this way:
function fetchFiles() {
  var folders = DriveApp.getFoldersByName("Aff Comm");
  while (folders.hasNext()) {
    var folder = folders.next();
    Logger.log(folder.getDateCreated());
  }
}

Using Google App Scripts to share documents

I have so many documents to manage, due to the number of people that have access to them, that adding or removing someone can become a real pain and a time sink.
I'm curious: is there a script you can run across multiple documents that grants edit or view access?
I'm aware of addEditor(emailAddress), but from what I have gathered you have to write a script for each document, which defeats the purpose of productivity.
Basically, I need one script that gives access to a particular set of documents, so that when I need someone removed, I just delete their name, run the script, and it removes edit/view access from those documents.
An example of what I'm trying to achieve (not actually a script):
//Human Resources//
addEditor(emailAddress) to BBM1 - Membership Tracker, "https://docs.google.com/spreadsheets/d/1gb8T1K74cRR_6qSyByqrtiphujrchceLP_QsMunoras/edit#gid=0"
//Administrator//
addEditor(emailAddress) to BBM1 - Membership Tracker, "https://docs.google.com/spreadsheets/d/1gb8T1K74cRR_6qSyByqrtiphujrchceLP_QsMunoras/edit#gid=0"
//Moderator//
addViewer(emailAddress) to BBM1 - Membership Tracker, "docs.google.com/spreadsheets/d/1gb8T1K74cRR_6qSyByqrtiphujrchceLP_QsMunoras/edit#gid=0"
So basically, I can just add emails to that, run the script, and it gives them edit access. But, I just don't know how to create a script that actually works for that and also to have a script across multiple documents.
Many Thanks,
Shaun.
Take a look at the folder API. You can put all those files into a folder, and then every time you run the script you can loop through all files in the folder and set the correct permissions.
Example (untested):
var folderName = "My auto-managed documents";
var folderIterator = DriveApp.getFoldersByName(folderName);
while (folderIterator.hasNext()) {
  var folder = folderIterator.next();
  var fileIterator = folder.getFiles();
  while (fileIterator.hasNext()) {
    var file = fileIterator.next();
    // Do things with the file:
    // file.addEditor('my-email@example.com');
  }
}
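To get the "edit one list, run the script" workflow the question asks for, one approach is to keep the roster as a plain array and reconcile it against each file's current editors. The helper below is only a sketch: diffEditors and the roster/current names are mine, not part of the Drive API, and the Drive-specific calls are shown as comments because they only run inside Apps Script.

```javascript
// Given the editors a file currently has and the roster it should have,
// compute which emails to add and which to remove.
function diffEditors(current, roster) {
  var want = new Set(roster.map(function (e) { return e.toLowerCase(); }));
  var have = new Set(current.map(function (e) { return e.toLowerCase(); }));
  return {
    toAdd: roster.filter(function (e) { return !have.has(e.toLowerCase()); }),
    toRemove: current.filter(function (e) { return !want.has(e.toLowerCase()); })
  };
}

// In Apps Script you would then apply the diff to each file in the folder:
//   var current = file.getEditors().map(function (u) { return u.getEmail(); });
//   var diff = diffEditors(current, ROSTER);
//   diff.toAdd.forEach(function (e) { file.addEditor(e); });
//   diff.toRemove.forEach(function (e) { file.removeEditor(e); });
```

This way removing someone is just deleting their email from the roster and re-running the script; the diff keeps the script idempotent, so running it twice changes nothing.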

Change ownership of file which is target a parentId, also ends up in root (2 parents in file)

I have built a way to transfer file ownership within the domain, from an admin user to a destination user. This works, except that I have an issue with the parents that get added to the file when the permission is changed.
The admin user performs the sequence in this order:
Upload the file to the destination user's folder (which is shared with the admin for writing).
Insert a permission so the destination user becomes owner of the file (see code below).
Remove the permission for the admin user (so the destination user is the only owner of the file).
The problem is in step 2:
When the permission is changed to the destination user, the file's parents change.
There now exist two parents: one is the destination folder, but the file is now also in root (isRoot=true). The issue is that I didn't request it to be in root, and I actually see this as a bug.
Even if I add an extra step after step 2 to update the parents and remove the "root" parent, the change is not reflected on the file. I guess that's because the admin is no longer the owner of the file.
Is there any way to solve this, since the file shouldn't be in both the folder and root? A different sequence, or should I file a bug report on the Google Drive API?
var service = new DriveService(CreateAuthenticator());
var newPermission = new Permission();
newPermission.Value = user.email;
newPermission.Type = "user";
newPermission.Role = "owner";
try
{
    return service.Permissions.Insert(newPermission, fileId).Fetch();
}
catch (Exception e) .....
The solution was to switch to a domain-wide delegation approach instead, since I couldn't find any other way to remove this. The domain-wide solution was also better and cleaner once it eventually worked.
https://developers.google.com/drive/web/delegation

Algorithm to process all files and folders across multiple runs

I note the new DocsList token and get*ForPaging() options available now, but I am still struggling with an algorithm to process "all files and folders" for arbitrarily large file/folder trees.
Assume a Google Drive based web file system with n files and folders. Getting through it with a Google Apps Script will take multiple six-minute runs. Nightly, I need to process all files older than 30 days in the tree of subfolders beneath a starting folder. I need to process each file only once (but my functions are idempotent, so I don't mind if I run against some files again).
I have my recursive algo working but the thing that I am missing is a way to have a placeholder so that I don't have to start at the top of the folder tree each time I invoke the script. In six minutes I get through only a few hundred folders and a few thousand files.
My question is what index can I store and how do I start where I left off the next time through?
I have thought about storing tokens or the last completed folder path ("/mytop/sub4/subsub47/"), but how would that help me on another invocation? If I started there, it would just work down the tree from that point and miss sibling and ancestor folders.
I have thought about the "find" methods and using a "before:2012/10..." style search, but there's no way to limit that to files in my tree (it only applies to a single folder).
I am not pasting my code as it's just standard recursive getFolders/getFiles and not actually relevant to the core of the question.
I'd create an array of the folders that I have to work on and save it for a future run.
Since you said it's no problem to work on some files/folders repeatedly, you don't even need to put a fake stop in your function. You can let it time out every time.
Something like this:
var folders = null;

// Call this to start the process, or set the property manually.
function start() {
  folders = ['id-of-the-starting-folder'];
  work();
}

// Set this function to run on the trigger.
function work() {
  if (folders == null)
    folders = ScriptProperties.getProperty('folders').split(',');
  while (folders.length > 0) {
    workOnFolder(folders[0]);
    folders.shift(); // remove the first element
    ScriptProperties.setProperty('folders', folders.join());
  }
  // Remove the trigger here.
}

function doFolderLater(folder) {
  folders.push(folder.getId());
}

function workOnFolder(id) {
  var folder = DocsList.getFolderById(id);
  folder.getFolders().forEach(doFolderLater);
  folder.getFiles().forEach(workOnFile);
}

function workOnFile(file) {
  // Do your thing.
}
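The checkpointing idea above can be exercised outside Apps Script by abstracting the storage. In this sketch, store stands in for ScriptProperties, the folder tree is a plain object keyed by ID, and all the names (makeQueue, drain, budget) are illustrative, not part of any API:

```javascript
// A persistent queue of folder IDs. `store` is any object with
// get(key)/set(key, value) string storage (ScriptProperties in Apps Script).
function makeQueue(store) {
  return {
    load: function () {
      var raw = store.get('folders');
      return raw ? raw.split(',') : [];
    },
    save: function (ids) {
      store.set('folders', ids.join());
    }
  };
}

// Process up to `budget` folders, persisting the remaining queue after
// each one, so a timeout loses at most the folder currently in progress.
// `tree` maps folder ID -> { subfolders: [ids], files: [...] }.
function drain(tree, queue, budget, onFile) {
  var ids = queue.load();
  var processed = 0;
  while (ids.length > 0 && processed < budget) {
    var folder = tree[ids.shift()];
    ids = ids.concat(folder.subfolders); // enqueue children for a later run
    folder.files.forEach(onFile);
    queue.save(ids);                     // checkpoint before continuing
    processed++;
  }
  return ids.length === 0;               // true once the whole tree is done
}
```

Because the queue is saved after every folder, each invocation picks up exactly where the last one stopped, which is what makes siblings and deeper levels safe across runs.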

google script I get this error "Cannot find folder"

I'm using sample code from Google unchanged (except for the folder name), yet I get the error "Cannot find folder".
I've checked the spelling of the folder name and tried several different folders in "My Drive", but I get the error on all of them. The exact same code works for my friend but not for me.
Here is the line of code:
var folder = DocsList.getFolder('My Docs');
Try using the folder ID instead of the name:
var folder = DocsList.getFolderById("234asdfih-324asdf");
You can find your folder ID in the address bar of your browser when you open the folder.
If you use the ID, you can rename the folder later and it won't matter at all.
If you use the folder name, note that it is case-sensitive.
I faced a very similar but somehow different issue. Depending on the folder name I put in CHECKLIST_FOLDER (and they are all valid), the command
myFile.addToFolder(DocsList.getFolder(CHECKLIST_FOLDER));
sometimes returns the error message "Cannot find folder". The folders' details and permissions are identical in all cases, with no clear reason for the failure.
In this case as well, Thomas's workaround works great, thanks!