I am looking for a way to apply the same action to multiple items (for example, messages) with a single call. Say I retrieve a set of messages using a search filter and then want to delete all of the messages that matched the criteria. Currently I iterate over the results and call delete() on each item. Is there a way to call delete on the whole collection, in order to reduce the number of calls?
I saw that EWS supports this, but I was not able to find an equivalent in the (awesome) exchangelib:
https://learn.microsoft.com/en-us/exchange/client-developer/exchange-web-services/how-to-process-email-messages-in-batches-by-using-ews-in-exchange
Have a look at the bulk operations documented at https://ecederstrand.github.io/exchangelib/#bulk-operations
Specifically, you can delete all items found by a query with:
a.inbox.filter(subject__startswith='something').delete()
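As far as I can tell, exchangelib's queryset .delete() already issues bulk EWS calls under the hood, so you normally don't need to batch manually. If you do want to cap how many item IDs go into each call yourself (in the spirit of the linked EWS batching article), a plain-Python chunking helper is enough. This sketch is independent of exchangelib, and the batch size of 4 is purely illustrative:

```python
def chunked(items, size):
    """Yield successive fixed-size batches from a list of items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Example: split 10 item IDs into batches of at most 4
batches = list(chunked(list(range(10)), 4))
# Each batch could then be passed to one bulk call instead of 10 single calls
```

You would then loop over the batches and make one bulk call per batch, rather than one call per item.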
When writing to a Phonograph table, queries sometimes exceed the maximum allowed size of 190 kB. Is there any way to work around this limit?
The context to that question is that users should be able to define custom filters on a dashboard. Upon storing a filter definition I want to keep track of the data items that match the current filter, to later be able to identify newly matching data items, e.g. when new data arrives in the system. However, the list of data IDs that match the filter is sometimes too big, which causes the error.
Thanks,
Tobias
Can you try forcing the user to use a group of filters?
We had this problem and were told that the small payload size allowed by Phonograph is intentional: the idea is that filters should be based on a set of fields that collectively always yield a finite set of rows, for example a given set of geo filters or a set of product dimension filters.
I am implementing some filters for a specific Eloquent model.
Right now I fetch all available products from the database, wrapping the query in Laravel's Cache::remember() function. Then I pass this collection through a series of filter classes. In each filter class I perform a $productCollection->filter() call which returns all fitting products. At the end, I get all products matching the given filter parameters.
Now I wonder if this is good practice, or if it would be better to use scopes or to filter directly in the database query.
Some filters look for a direct match (e.g. brand must be ID 4). Others have a set of possible matches (e.g. category must be ID 3, 4 or 7).
It depends on your application's scale and the size of the collections you are filtering. With a small collection of around 100 models it will make no difference, but if you need to filter 1k+ models, it is better to use a query scope directly on your Eloquent model.
A query scope returns only the records that match the scope's condition, while the ->filter() method iterates over all your models and then keeps only those that match the rule you specify inside the filter callback.
In summary, it is always better to return only the records that you need, since that removes the need to load all the records and then discard most of them.
Besides that, query scopes let you define common sets of constraints that you can reuse throughout your app, whereas with collection filters you need to rewrite the same filter every time you want to use it.
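The same trade-off can be demonstrated outside Laravel. This sketch uses Python and an in-memory SQLite table (the table and values are made up) to contrast filtering in the database against loading everything and filtering in application code:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER, brand_id INTEGER, category_id INTEGER)")
conn.executemany("INSERT INTO products VALUES (?, ?, ?)",
                 [(1, 4, 3), (2, 4, 7), (3, 5, 3), (4, 4, 9)])

# Database-side filtering (the query-scope approach):
# only matching rows are returned by the database.
db_side = conn.execute(
    "SELECT id FROM products WHERE brand_id = ? AND category_id IN (3, 4, 7)", (4,)
).fetchall()

# Collection-side filtering (the ->filter() approach):
# every row is loaded first, then filtered in memory.
all_rows = conn.execute("SELECT id, brand_id, category_id FROM products").fetchall()
mem_side = [(r[0],) for r in all_rows if r[1] == 4 and r[2] in (3, 4, 7)]
```

Both produce the same result set, but the first never transfers the non-matching rows out of the database, which is what matters at 1k+ rows.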
The documentation about query scopes.
Regards, Mateus.
I was trying to develop U-SQL user-defined operators (UDOs) using this link. It looks like we can read one row, process it and write it out as a single row using a UDO.
In my scenario I have to read multiple consecutive rows and write multiple consecutive rows, and that does not seem possible using the guidance provided in the blog.
In another scenario, I have to process a single row, break it into multiple rows and then write those to the output.
I am wondering whether it is possible to process multiple rows using a U-SQL UDO, or whether there is another way to do it in U-SQL?
You can write a custom applier to take a single row and return several rows. You invoke it with CROSS APPLY.
You can write a custom reducer (or a user-defined aggregator) to take several rows (cells) and return a single row (cell).
What do you want to do by reading several rows, seeing them all and then returning several rows? Would that be similar to a self-join (for which you could use a combiner)?
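To make the two row shapes in the answer concrete: a custom applier is essentially a one-row-to-many-rows function, and a reducer (or aggregator) is a many-rows-to-one-row function. A language-agnostic sketch of the two, shown here in Python with made-up rows rather than U-SQL C# code:

```python
# One row -> many rows: the shape of a custom applier invoked with CROSS APPLY.
# Hypothetical row: a name plus a semicolon-separated list of phone numbers.
def apply_split(row):
    name, phones = row
    for phone in phones.split(";"):
        yield (name, phone)

# Many rows -> one row: the shape of a custom reducer / user-defined aggregator.
# Hypothetical rows: (name, amount) pairs belonging to one reduce group.
def reduce_rows(rows):
    total = sum(amount for _, amount in rows)
    return (rows[0][0], total)

exploded = list(apply_split(("alice", "555-0100;555-0101")))
collapsed = reduce_rows([("alice", 10), ("alice", 5)])
```

In U-SQL the same shapes are implemented as C# classes (IApplier, IReducer) and invoked from the script with CROSS APPLY or REDUCE.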
I have a Google Refine project with 36k rows of data. I would like to add another column by fetching JSON data from a Freebase URL. I was able to get it working on a small dataset, but when I ran it on this project it took a few hours to process and then most of the results were blank, though I did get some results with data. Is there a way to limit the number of rows the data is fetched for, or a better way of getting the data from the URL?
Thank You!
If you're adding data from Freebase, you'd probably be better off using the "Add column from Freebase" rather than "Add column by fetching URL."
Facets are one of the most powerful Google Refine features and they can be used to control all kinds of things. In this case, you could use a facet to select a subset of your data and only do the fetch on that subset (and then repeat with a different subset).
The next version of Refine will include better error reporting on the results of URL fetches to help debug problems like this, but make sure that you're respecting all the limits of the remote site as far as total number of requests, requests per second, etc.
I am trying to retrieve information from the database, but I don't know how many rows will be retrieved. The information is retrieved with the following statement:
SELECT * from pgw_alarm WHERE (((pgw_alarm.sev)='0 0 0') AND ((pgw_alarm.month)='"+mon+"'));
Now I want to display the results to the user in a table. Since I don't know how many rows there will be, I am unable to create the table up front. Is there a way to increase the number of rows in a table at runtime? If so, how? If not, what is an alternative way to display them?
Thank you.
In pure HTML this is not possible; however, a number of technologies will solve this issue for you. Depending on your server-side language, such as JSP or ASP, you may be able to dynamically create the rows on the server. Another possibility is to retrieve the rows via Ajax and use JavaScript to dynamically create DOM elements that correspond to each row and append them.
The server-side developer should define a loop in the provided HTML template; that way any number of rows can be displayed.
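As a sketch of the server-side loop idea: loop over the result rows and emit one table row per record. The question's stack looks like JSP, but the principle is the same everywhere; this example is Python, and the column names are made up:

```python
from html import escape

def rows_to_table(rows, headers):
    """Build an HTML table for an arbitrary number of result rows."""
    parts = ["<table>"]
    parts.append("<tr>" + "".join(f"<th>{escape(h)}</th>" for h in headers) + "</tr>")
    for row in rows:  # one <tr> per database row, however many there are
        parts.append("<tr>" + "".join(f"<td>{escape(str(v))}</td>" for v in row) + "</tr>")
    parts.append("</table>")
    return "\n".join(parts)

# Hypothetical query results: (id, severity) pairs
html = rows_to_table([(1, "minor"), (2, "major")], ["id", "severity"])
```

Escaping each cell value matters here: database content rendered into HTML without escaping is an XSS risk.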
Alternatively, the server-side developer can provide a gateway URL that returns JSON-formatted data, which can be accessed through an AJAX call; any number of rows can then be generated via JavaScript.