Dovecot: create virtual folders (IMAP search)

I have taken a look at virtual folders and have them working. The two folders I have set up so far are All Mail and Flagged, created by writing the respective dovecot-virtual files.
A dovecot-virtual file seems to be needed in each user's mailbox.
How can I auto-create the required dovecot-virtual files for every mailbox?

This is a pretty old question, but if you are still wondering, and if I understood correctly what you are trying to achieve: you have to add auto = subscribe to each virtual mailbox definition in 15-mailboxes.conf, like so:
namespace inbox {
  mailbox "Flagged messages" {
    special_use = \Flagged
    auto = subscribe
    comment = Flagged messages
  }
}
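
For reference, a dovecot-virtual file itself is just a list of mailbox patterns followed by an indented search query. A rough sketch of the two folders from the question (the patterns below are assumptions; adjust the excluded mailboxes and the virtual namespace location to your setup):

All Mail (dovecot-virtual):
*
-Trash
-Trash/*
  all

Flagged (dovecot-virtual):
*
  flagged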

Using addEditor(s) with users outside your workspace

I used Apps Script to create copies of a document and share them with a particular user. This works fine in that the code creates the document and the user receives an invitation to edit, but I have found that at least some users are not able to edit the document. When I view the sharing options on the individual docs, I can see that they are restricted to my workspace only.
I had the same issue when doing this process manually, but thought addEditor would be able to override this, as the shared Google Drive is not restricted in this way overall (i.e., we can share docs with individuals outside our organisation by adding them as editors). I do not want to have to individually share >100 docs per week manually, so any fixes/workarounds would be very appreciated.
Some limitations: each doc must be shared with exactly one external user, but still be accessible to everyone in my workspace.
All of the users I am sharing the documents with are external to the organisation.
I tried making the master copy of the file accessible to anyone who had the link, hoping that this would mean they could access the file without manually approving the request, but it looks like this doesn't transfer to the copies. I'm not sure enough of the actual issue to know whether this would solve it.
This is how the code looks:
function createCopy() {
  // Template document to copy (placeholder ID)
  let file = DriveApp.getFileById("ID of file to copy");
  // Sheet holding the copy names (column D) and recipient emails (column C)
  let sheet = SpreadsheetApp.openByUrl('Sheet with names and emails').getSheetByName("Sheet1");
  let range = sheet.getRange('D2:D5');
  let email = sheet.getRange('C2:C5').getValues().flat();
  let values = range.getValues();
  let folder = DriveApp.getFolderById("Destination Folder");
  for (let i = 0; i < values.length; i++) {
    Logger.log(values[i]);
    file.makeCopy(values[i].toString(), folder).addEditor(email[i]);
  }
}
From the question
Some limitations: the doc must be shared to one user exactly, but still be accessible to everyone in my workspace.
Try replacing this line:
file.makeCopy(values[i].toString(), folder).addEditor(email[i]);
with:
file.makeCopy(values[i].toString(), folder)
  .addEditor(email[i])
  .setSharing(DriveApp.Access.DOMAIN, DriveApp.Permission.EDIT);
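setSharing(DriveApp.Access.DOMAIN, DriveApp.Permission.EDIT) makes each copy editable by anyone in your workspace's domain, while addEditor(email[i]) still grants the single external recipient explicit edit access, which together should cover both constraints from the question.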
References
https://developers.google.com/apps-script/reference/drive/file#setsharingaccesstype,-permissiontype

Laravel 5.4: protect documents based on user permissions

I have a Laravel project where users have roles with permissions (I'm using
Zizaco/entrust) and the app is accessible only to registered users.
The application stores uploaded documents, but these documents should not be publicly viewable; instead, access to them should depend on the user's permissions.
My question: how should I approach this, i.e. how do I protect documents based on a user's permissions?
I'm not sure if this will help, but you can create a dedicated controller for downloading/showing a document, where you check the permissions of the current user.
From the Entrust documentation, you can check whether a user should be able to see the document:
$user->hasRole('owner'); //returns boolean
So you can use the code below in a controller:
$user = User::where('username', '=', 'Mark')->first();

// Absolute path to the stored file; note that Storage::get() would return the
// file contents rather than a path, so use storage_path() for response()->download()
$pathToFile = storage_path('app/file.pdf');

if ($user->hasRole('admin')) {
    return response()->download($pathToFile); // to display the file inline, change download to file
} else {
    abort(403, 'Unauthorized action.');
}
Remember to add the Entrust trait to your User model:
use Zizaco\Entrust\Traits\EntrustUserTrait;
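A sketch of where that goes, following the Entrust README (assuming the default App\User model of a Laravel 5.4 app):

// app/User.php
namespace App;

use Illuminate\Foundation\Auth\User as Authenticatable;
use Zizaco\Entrust\Traits\EntrustUserTrait;

class User extends Authenticatable
{
    use EntrustUserTrait; // adds hasRole(), can() and ability() to the model
}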
You can read more about responses here: https://laravel.com/docs/5.4/responses and files here: https://laravel.com/docs/5.4/filesystem
Look here for the short syntax, which will help you implement file downloads in routes.php without creating a new controller.
https://github.com/Zizaco/entrust#short-syntax-route-filter
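If you do go the dedicated-controller route, a minimal sketch of what it could look like (the route name, controller name, role check, and storage path below are assumptions for illustration, not part of the question):

// routes/web.php
Route::get('/documents/{filename}', 'DocumentController@download')->middleware('auth');

// app/Http/Controllers/DocumentController.php
namespace App\Http\Controllers;

use Illuminate\Support\Facades\Auth;

class DocumentController extends Controller
{
    // Serve a stored document only if the logged-in user is allowed to see it
    public function download($filename)
    {
        // hasRole('admin') mirrors the answer above; ->can('view-documents')
        // would be the permission-based variant
        if (! Auth::user()->hasRole('admin')) {
            abort(403, 'Unauthorized action.');
        }

        // Files are assumed to live in storage/app/documents;
        // basename() guards against path traversal in the URL segment
        return response()->download(storage_path('app/documents/' . basename($filename)));
    }
}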

Google Drive Folders/Files Created Using API Not Visible on Google Interface

This is rather strange. I used the Google Drive API to create a folder in Google Drive and then uploaded a file there. I can retrieve the folder and file using the same API (the code is working fine in all respects). However, when I go to the Google Drive web interface, I can't seem to find the folder or file. The file also doesn't sync to my local drive. Is there a setting in the API or elsewhere to turn the "visibility" on?
Thank you in advance.
I had the same issue. Turned out to be permissions. When the file is uploaded by the service account, the service account is set as the owner, and then you can't see the files from the Drive UI. I found the solution online (but can't seem to find it again...)
This is what I did...
It's C#; your question didn't specify a language. The code you're interested in is the permission handling after you get the response body from the upload...
FilesResource.InsertMediaUpload request = service.Files.Insert(body, stream, "text/plain");
request.Upload();
//Start here...
Google.Apis.Drive.v2.Data.File file = request.ResponseBody;
Permission newPermission = new Permission();
newPermission.Value = "yourdriveaccount@domain.com";
newPermission.Type = "user";
newPermission.Role = "reader";
service.Permissions.Insert(newPermission, file.Id).Execute();
The file was visible in the Drive UI after this. I tried specifying "owner" for the role, as the API suggests, but I got an error saying that they're working on it. I haven't played around with the other settings yet (I literally did this last night). Let me know if you have any luck with other combinations of permissions.
Hope that helps
I had the same issue, but it got solved by using a list data type for the parents parameter. E.g., if one wants to create a folder under the folder "1TBymLMZXPGkouw-lTQ0EccN0CMb_yxUB", the Python code would look something like:
drive_service = build('drive', 'v3', credentials=creds)
body = {
    'name': 'generated_folder',
    'parents': ['1TBymLMZXPGkouw-lTQ0EccN0CMb_yxUB'],
    'mimeType': 'application/vnd.google-apps.folder'
}
doc = drive_service.files().create(body=body).execute()
While the permissions issue is the main cause of this problem, what I did to make folders or files appear after uploading them with a service account was to specify the parent folder. If you upload or create folders/files without a parent folder ID, the object's owner will be the service account you are using.
By specifying a parent ID, the object will use the inherited permissions.
Here's the code I use in PHP (google/apiclient):
$driveFile = new Google\Service\Drive\DriveFile();
$driveFile->name = $req->name;
$driveFile->mimeType = 'application/vnd.google-apps.folder';
$driveFile->parents = ['17SqMne7a27sKVviHcwPn87epV7vOwLko'];
$result = $service->files->create($driveFile);
When you create the folder, you should ensure you set a parent, such as 'root'. Without this, it will not appear in 'My Drive' and will only show up in search (have you tried searching in the UI?).
Since you have already created the folder, you can update the file and give it the parent root as well.
You can test it out using the Parents insert 'Try it now' example.
Put your folder's ID in the fileId box, then, in the request body, add root in the ID field.
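If you would rather do this in code than through the 'Try it now' form, a rough sketch with the v3 Python client (reusing drive_service from the earlier answer; the folder ID is a placeholder):

# Attach the orphaned folder/file to My Drive by adding 'root' as a parent
drive_service.files().update(
    fileId='YOUR_FOLDER_ID',
    addParents='root',
    fields='id, parents'
).execute()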
// Alternative: transfer ownership of the uploaded file to a regular account
// and optionally make it readable by anyone, so it shows up in the Drive UI
private void SetFilePermission(string fileId)
{
    Permission adminPermission = new Permission
    {
        EmailAddress = "test@gmail.com", // email address of the Drive account where you want to see the files
        Type = "user",
        Role = "owner"
    };
    var permissionRequest = _driveService.Permissions.Create(adminPermission, fileId);
    permissionRequest.TransferOwnership = true; // required when assigning the "owner" role (important)
    permissionRequest.Execute();

    Permission globalPermission = new Permission
    {
        Type = "anyone",
        Role = "reader"
    };
    var globalpermissionRequest = _driveService.Permissions.Create(globalPermission, fileId);
    globalpermissionRequest.Execute();
}

EWS Item.Copy causes ErrorAccessDenied

I want to copy items between mailboxes using the EWS Managed API, and I've run into a strange situation.
When I first try to get the destination folder and then copy the item using the folder's ID, I get an ErrorAccessDenied error.
this.exchangeService.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, "test1@test.local");
var folder = Folder.Bind(this.exchangeService, WellKnownFolderName.Inbox);
item.Copy(folder.Id);
This gets an error
If I create a FolderId instance specifying a well-known folder name (Inbox) and the mailbox name, I get no problems.
var folderId = new FolderId(WellKnownFolderName.Inbox, new Mailbox("test1@test.local"));
item.Copy(folderId);
This works
Is this behavior by design? Or can I use a destination folder other than a well-known one?
I believe this behavior is by design. In the first example, the calling account (acct1) is impersonating acct2. I think that the request is being processed as acct2 trying to bind to the inbox of acct1 since the credentials of the call are associated with acct1. The context for calls are based on the identifiers which contain the SMTP address of the target mailbox.
Your second example explicitly identifies the mailbox that should be targeted. All subsequent identifiers accessed based on that folder identifier will have the context of test1@test.local.
I think you can change your first call to this to make it work:
var folder = Folder.Bind(this.exchangeService, new FolderId(WellKnownFolderName.Inbox, new Mailbox("test1@test.local")));
I know you are copying to a folder in the test1@test.local mailbox. Whose mailbox contains 'item'? I can't recall whether using impersonation to copy across mailboxes works, so I'm interested to hear how it works out for you.

Determine if Outlook Contact has been deleted using EWS 2007

I am able to retrieve lists of contacts for specified mailboxes using Exchange Web Services. My issue is that some of the contacts returned have been deleted by the Outlook user, and I need to determine which ones. How can I do this?
All the examples I've seen online use this method, but never for contacts.
I have tried setting the Traversal property of the ItemView variable to SoftDeleted, but that does not return anything.
Below is the pertinent portion of my code:
ItemView itemViewDeleted = new ItemView(100);
itemViewDeleted.Traversal = ItemTraversal.SoftDeleted;
FindItemsResults<Item> deletedItems = svc.FindItems(WellKnownFolderName.Contacts, itemViewDeleted);
You need to check the WellKnownFolderName.DeletedItems folder. That is where my contacts go when I delete them.
There are three ways to delete a contact (see TechNet for the Exchange terminology reference):
Delete (moved to Deleted Items folder - WellKnownFolderName.DeletedItems)
Soft Delete (moved to Recoverable Items folder - WellKnownFolderName.RecoverableItemsDeletions)
Hard Delete (purged from mailbox - WellKnownFolderName.RecoverableItemsPurges)
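A minimal sketch of checking the Deleted Items folder for contacts, reusing svc from the question (the ItemClass filter and the DisplayName output are assumptions; the recoverable-items folders listed above can be queried the same way on Exchange versions that expose them):

ItemView view = new ItemView(100);
// Keep only contact items; deleted mail etc. also lands in Deleted Items
SearchFilter onlyContacts = new SearchFilter.IsEqualTo(ItemSchema.ItemClass, "IPM.Contact");

FindItemsResults<Item> deleted = svc.FindItems(WellKnownFolderName.DeletedItems, onlyContacts, view);

foreach (Item item in deleted)
{
    Contact contact = item as Contact;
    if (contact != null)
    {
        Console.WriteLine(contact.DisplayName);
    }
}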