Multiple keywords for history search browser addon API - google-chrome

Is there a way to have an OR in the search query when searching the browser history?
Currently I use it like this:
browser.history.search({
  text: some_root_url,
  maxResults: max_results,
  startTime: some_date
}).then(process_items)
This is for an addon.
https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/API/history/search
What I do is search for a root URL in "text", like "https://somesite.com".
But I want to query multiple sources, i.e. multiple URLs, through "text".
However, the API doesn't seem to have an OR mode.
I know I can just make one history call per source, but that seems to multiply the running time by the number of sources.
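One workaround (an assumption on my part, not something the API documents as an OR mode): issue the per-URL searches in parallel with Promise.all, so the wall-clock time is roughly that of a single call rather than one per source, then merge the result arrays. A sketch; `searchMany`, `mergeHistoryItems`, and the example URLs are my own names, and `browser.history` is the Firefox WebExtension API from the question:

```javascript
// Merge several history.search result arrays, de-duplicating by item id,
// since the same page can match more than one root URL.
function mergeHistoryItems(resultArrays) {
  const seen = new Map();
  for (const items of resultArrays) {
    for (const item of items) {
      if (!seen.has(item.id)) seen.set(item.id, item);
    }
  }
  return [...seen.values()];
}

// Run one history.search per root URL, in parallel, and merge the results.
function searchMany(urls, maxResults, startTime) {
  return Promise.all(
    urls.map((u) =>
      browser.history.search({ text: u, maxResults, startTime })
    )
  ).then(mergeHistoryItems);
}

// Usage (hypothetical values):
// searchMany(["https://somesite.com", "https://othersite.org"], 100, some_date)
//   .then(process_items);
```

Note that maxResults then applies per call, so the merged list can be larger than a single call's cap.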

Related

Need a method for deleting/ignoring Solr index records with a particular string in a field

This is a bit hard to explain, so bear with me. We have a website that uses a built-in Solr product to index or remove content when it is added/updated/deleted. Standard web content is specifically tagged as published or private, so it is easy to exclude private content from our custom search engine. However, binary files (DOCs, PDFs, etc.) do not have a public/private workflow state. The only way we can determine if a file is private is that, for some reason, the CMS doubles up the FullURL string. So the URL will have two instances of "http" in the string. Not sure why that happens, but it's a good thing because it's the only way to tell if a file is published or private.
Because the Solr install that's packaged with the CMS is so wonky, and because we have numerous other sites in other CMSes, we have a "catalog" Solr install in AWS that aggregates content from our various web properties using a data import handler. So what I'm looking for is a way, using the DIH data-config.xml file, to exclude any index records that have "http" in the URL string twice. I'm currently using a filter query (fq) field in the tag to filter out certain records, but I don't know how to write an fq that does what I'm suggesting above, or whether that's even possible. My hunch is that I'd need a function query, but that's a level of Solr knowledge I haven't yet achieved. If anyone has any advice or knows how to write a query that would exclude a URL field with two instances of "http" in the string, I'd appreciate it!
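One possible approach (my own suggestion, not something from the thread): Lucene regular-expression queries, supported since Solr 4, can be negated in a filter query, so an fq along these lines might exclude records whose URL contains "http" twice. The field name `fullurl` is a placeholder for whatever the schema actually calls it, and the field would need to be indexed as a single token (e.g. a `string` type) for the regex to run against the whole URL rather than individual terms:

```
fq=-fullurl:/.*http.*http.*/
```

If the catalog is populated via the SolrEntityProcessor, the same filter could presumably be passed through that entity's fq attribute in data-config.xml, keeping the doubled-URL records out at import time.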

MediaWiki API: search for pages in specific namespace, containing a substring (NOT a prefix)

I want to scrape pages from a list of Wikipedia categories for which there isn't a "mother category". In this case, dishes: I want to get a list of all of the categories like Category:Vegetable Dishes and Category:Italian Dishes, then scrape and tag the pages in them. I know how to search for pages in a known category, but there are hundreds of categories containing the substring "dishes", and it feels like it should be easy to list them.
However, the MediaWiki allcategories API seems to only allow search by prefix (e.g. "from" and "to" parameters), and while the old opensearch documentation still mentions search by substring, this is no longer supported (see the updated API docs; it also doesn't work if I try it).
This is very doable in the Wikipedia web interface, to the point where I think it might be quicker to just scrape the search results, but I wonder if I'm missing something?
Thanks to @Tgr for pointing out that I'd missed the regular search API, which allows a text search, a specified namespace, and so on.
The correct query for my instance is:
curl "https://en.wikipedia.org/w/api.php?action=query&list=search&srnamespace=14&srsearch=Dishes&format=json"
thanks!
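The accepted curl query can also be built programmatically. A sketch; the function name is mine, and the `srlimit` paging parameter is my addition (the API supports it, but the answer above didn't use it):

```javascript
// Build a MediaWiki search API URL for pages in a given namespace.
// Namespace 14 is the Category namespace, as in the curl example above.
function buildCategorySearchUrl(term, namespace = 14, limit = 50) {
  const params = new URLSearchParams({
    action: "query",
    list: "search",
    srnamespace: String(namespace),
    srsearch: term,
    srlimit: String(limit),
    format: "json",
  });
  return `https://en.wikipedia.org/w/api.php?${params}`;
}

// buildCategorySearchUrl("Dishes")
//   → "https://en.wikipedia.org/w/api.php?action=query&list=search&srnamespace=14&srsearch=Dishes&srlimit=50&format=json"
```

For more than `srlimit` matches, the response includes a `continue` object whose values can be appended to the next request.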

Add AzureDevOps Search as Chrome/Chromium 'Other search engine'

I want to add Azure DevOps Search to Chrome (or other Chromium) browsers so I can do quick code searches from the browser.
I got it working to search all repositories, but I want to also be able to add a specific "Search engine" for a specific repository.
What's the Query URL to search a specific repository in Azure DevOps?
WHAT I HAVE SO FAR:
I've added a new "Other search engines":
Search engine: Azure DevOps (all)
Keyword: code
Url: https://dev.azure.com/skykick/SkyKick%201/_search?action=contents&text=%s&type=code
And that works:
In address bar, type code and press tab:
Search for test
Press enter - be taken to Azure DevOps code results
What's the URL format to include a specific Repository in my search results?
So I have a repository SkyKick.Example - I'd like to be able to create an additional "Other search engine" that will search just that repository.
I looked at the Network tab looking for what url the app uses, and I tried this configuration:
Search engine: Azure DevOps (SkyKick.Example)
Keyword: example
Url: https://dev.azure.com/skykick/SkyKick%201/_search?action=contents
&text=%s
&type=code
&lp=code-Project
&filters=ProjectFilters%7BSkyKick%201%7DRepositoryFilters%7BSkyKick.Example%7D
&pageSize=25
&__rt=fps
&__ver=2
But this doesn't load a page, just a wall of text.
Cool idea! This works for me for scoping it to just a repository
https://dev.azure.com/COLLECTION-NAME/_search?action=contents
&text=%s
&type=code
&lp=code-Project
&filters=ProjectFilters%7Besmith.dev%7DRepositoryFilters%7Besmith.dev%7D
&pageSize=25
&result=?
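Since the filter values only need percent-encoding, the working URL above can be generated for any project/repository pair. A sketch; the helper name is mine, and the literal `%s` is Chrome's search-term placeholder, which must stay unencoded so the browser can substitute the typed query:

```javascript
// Build an Azure DevOps code-search URL scoped to one project and repository,
// suitable for pasting into Chrome's "Other search engines" URL field.
function buildAdoSearchUrl(org, project, repo) {
  const base =
    `https://dev.azure.com/${encodeURIComponent(org)}/` +
    `${encodeURIComponent(project)}/_search`;
  // Filter syntax observed in the Network tab: braces and spaces get encoded.
  const filters = `ProjectFilters{${project}}RepositoryFilters{${repo}}`;
  return (
    `${base}?action=contents&text=%s&type=code` +
    `&lp=code-Project&filters=${encodeURIComponent(filters)}&pageSize=25`
  );
}

// buildAdoSearchUrl("skykick", "SkyKick 1", "SkyKick.Example")
```

Note this deliberately omits the `__rt` and `__ver` parameters from the failing attempt, matching the answer's working URL.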

DNN database search and replace tool

I have a DNN (9.3.x) website with CKEditor, 2sxc etc installed.
Now old URLs need to be changed into new URLs because the domain name changed. Does anyone know of a tool for searching and replacing URLs in a DNN database?
I tried the "DNN Search and Replace Tool" by Evotiva, but it only goes through native DNN database tables, leaving 2sxc and other plugin/module tables untouched.
Besides that, there is data in JSON format in 2sxc's database tables that also contains old URLs.
I'm pretty sure that the Evotiva tool can be configured to search and replace in ANY table in the DNN database.
"Easy configuration of the search targets (table/column pairs. Just point and click to add/remove items. The 'Available Targets' can be sorted, filtered, and by default all 'textual' columns of 250 characters or more are included as possible targets."
It's still a text search.
As a comment: you should try to use relative URLs and let DNN handle the domain-name part.
I believe the Engage F3 module will search Text/HTML modules for replacement strings, but it's open-source, so you could potentially extend it to inspect additional tables.

In the google drive search api, how to group words into a phrase?

I'm using the Google Drive search api with the Files.list - search for files.
I have a query like : fullText contains 'battle of hastings'.
I'm getting results that seem to suggest it searches for the individual words rather than the phrase as a whole. I'm not completely sure, though; I'm comparing the API's behaviour to what can be done in Google Search on the website, so please correct me if that comparison is off.
Anyway, I really only want results for the whole phrase - ie like surrounding a phrase in Google's Search web site with double quotes. For example, if you use Google's search web site to search for "no one will have written this before", then it says 'No results found for "no one will have written this before".', but if you don't use double quotes, then you get all sort of stuff.
To summarise:
Does the query API search for individual words and return files containing all of those words, even if they don't appear as a phrase, or in that order?
Is there a way to make it consider the words as a single phrase?
You can work this out by using the "Try it" section of Files: list and consulting the search-parameter documentation:
fullText - Full text of the file including title, description, and content.
contains - The content of one string is present in the other.
I tested using this
fullText contains 'daimto twitter'
It returned all of the files that contain that exact match.
By using the "Try it" facility, I found that the behaviour is similar to the search UI in Google Drive: you need to surround a phrase that should be treated as one unit with double quotes. The quotes should be encoded into the URL like this:
https://www.googleapis.com/drive/v2/files?maxResults=100&
q=fullText+contains+'%22flippity+floppity%22'
I'm not sure if the spaces need to be encoded like that, but I tried to emulate it as much as possible.
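On the encoding question: a helper can generate the q parameter instead of hand-encoding it. The function name is mine; note that encodeURIComponent produces %20 for spaces where the example URL used +, and both forms are commonly accepted in query strings:

```javascript
// Build an encoded Drive files.list q parameter that searches fullText
// for an exact phrase. The inner double quotes make Drive treat the words
// as one phrase; the whole q value is then percent-encoded for the URL.
function buildPhraseQuery(phrase) {
  // Escape any double quotes inside the phrase itself.
  const escaped = phrase.replace(/"/g, '\\"');
  return encodeURIComponent(`fullText contains '"${escaped}"'`);
}

// buildPhraseQuery("flippity floppity")
//   → "fullText%20contains%20'%22flippity%20floppity%22'"
```

The result is what goes after `q=` in the request URL, matching the hand-built example above apart from %20 versus +.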