Foursquare Explore API not providing venue tips on certain accounts - json

When using the Foursquare Developer API to perform a venue search, I receive different results depending on the account being used.
For a given authorised query (for example, https://api.foursquare.com/v2/venues/explore?ll=40.7,-74) I am given back a collection of items with "reasons" and "venue" objects. On some accounts, it also provides "tips" data; on others it does not.
Is this additional information dependent on certain account settings, or is there something I'm missing?
Thanks in advance!

The explore endpoint is tailored to each user and provides dynamic recommendations depending on social context, time of day, and user interests. Thus, two queries are likely to contain different content depending on who is querying the API.
For example, if a friend has left a tip at a nearby venue, it's more likely that you'll see that in your explore results, because it provides social reinforcement to the recommendation.
For other venues, there might be signals more important than tips, so a tip is not surfaced.
The official Foursquare apps provide good examples of this data in action: the explore result panes change format slightly from recommendation to recommendation.
Hope that helps!
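As a reference point, here is a minimal sketch of building an authorised venues/explore request, following the v2 conventions (client_id/client_secret plus the required "v" version parameter; the credential values are placeholders):

```python
from urllib.parse import urlencode

def build_explore_url(lat, lng, client_id, client_secret, version="20180323"):
    """Assemble a v2 venues/explore URL with userless auth parameters."""
    params = {
        "ll": f"{lat},{lng}",
        "client_id": client_id,
        "client_secret": client_secret,
        "v": version,  # API version date, required on all v2 calls
    }
    return "https://api.foursquare.com/v2/venues/explore?" + urlencode(params)

url = build_explore_url(40.7, -74, "CLIENT_ID", "CLIENT_SECRET")
# Fetching this URL returns groups of items; each item may carry
# "reasons", "venue", and (depending on the querying account) "tips".
```

Note that userless (client_id/client_secret) requests and oauth_token-authenticated requests can return different recommendation payloads, which is exactly the per-account variation described above.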

Related

Is it possible to return place reviews mentioning certain keywords in the text?

I know I can return a JSON array of up to 5 reviews. If I could return just 5 reviews that include certain keywords, that would be fine. Is that possible?
If not, perhaps I could generate a request that returns all the reviews for a place, then perform the keyword filter on the results.
I understand the business owner may return all reviews via My Business API, but I will not have My Business access for the places. I believe I can get more than 5 results with a premium plan, but assume that will get pretty expensive.
You are right: the only way to get all reviews is via the Google My Business API. The Places API at the moment provides only 5 reviews, regardless of whether you are on a Premium or Standard plan.
There are a couple of feature requests in Google issue tracker you might be interested in:
https://issuetracker.google.com/issues/35825957
https://issuetracker.google.com/issues/35821903
Note that these requests are pretty old and Google hasn't set a high priority on them. You can star them to express your interest and subscribe to further notifications.
Hope this clarifies things.
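Since only up to 5 reviews come back per place, the keyword filter described in the question has to run client-side on that small set. A minimal sketch (the dict shape mirrors the "reviews" array of "text" entries in a Place Details response):

```python
def filter_reviews(reviews, keywords):
    """Keep only reviews whose text mentions at least one keyword
    (case-insensitive substring match)."""
    lowered = [k.lower() for k in keywords]
    return [r for r in reviews
            if any(k in r.get("text", "").lower() for k in lowered)]

sample = [
    {"text": "Great service and friendly staff."},
    {"text": "Parking was difficult to find."},
]
print(filter_reviews(sample, ["parking"]))  # keeps only the second review
```

Because the input is capped at 5 reviews, filtering locally costs essentially nothing; the real limitation remains the 5-review cap itself.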

Visualizing chatbot structure

I have created a chatbot using SnatchBot for the purpose of a quiz. I have been asked to create a dynamic decision-tree structure for the chatbot which must be displayed on the web page, i.e. every time the user answers a question, a branch on the tree must be created according to the user's response. Is there any way to do this? Is it possible to generate the JSON for the structure of the chatbot rather than the JSON for previous conversations? Would any other platform such as Dialogflow be more suitable?
I am also using SnatchBot. You will need to use the NLP section to create all your samples and train your data; then you can add global connections. The value of this tool is that it allows the user to immediately, and at any point in the conversation, direct the bot to a particular subject.
From a technical perspective, I have some recommendations for you:
https://jorin.me/chatbots.pdf (Development and Applications)
https://www.researchgate.net/publication/325607065_Implementation_of_a_Chat_Bot_System_using_AI_and_NLP (Implementation Using AI And NLP)
From a strategy perspective, here are the crucial main criteria for enterprise chatbot implementation success:
Defining clear audience profiles for the project
Identifying a clear goal for the project
Defining clear dialog-flow key intents
Related platform's customer experience SWOT assessment
Forming coherent teams
Testing and involving the audience from early on in the validation of the project
Implementing feedback analytics to be used as a basis for continuous improvement
(Source: http://athenka.com)
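As for generating the structure JSON mentioned in the question: neither platform's export format is shown here, but a hand-rolled decision-tree representation that grows as the user answers could look like this (all node names and the format itself are illustrative, not a SnatchBot or Dialogflow export):

```python
import json

# A node is a question plus a map of possible answers to child nodes.
tree = {
    "question": "Is Python an interpreted language?",
    "answers": {},
}

def add_branch(node, answer, question):
    """Attach a new child node under the branch the user just took,
    and return it so the quiz can keep growing from there."""
    child = {"question": question, "answers": {}}
    node["answers"][answer] = child
    return child

# Simulate two user answers arriving one after another:
yes_node = add_branch(tree, "yes", "Does it also compile to bytecode?")
add_branch(yes_node, "yes", "Quiz complete!")

print(json.dumps(tree, indent=2))  # ready to feed a front-end tree renderer
```

A front-end library (e.g. a D3 tree layout) can then re-render this JSON after each answer, which gives the "branch appears as the user responds" behaviour the question describes.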

Alternatives to Deprecated Google News API

What are some alternatives to the comprehensive and now deprecated Google News API?
I'm trying to load some JSON data from two government web services, to be parsed into a query for a news API which will send me back a hefty set of relevant news.
It will cover economic and regional data, so a query could be, say, the lowest-paying job in Dallas County.
Are there any news APIs with as much functionality?
This is a relatively old question, but there have been lots of recent developments in the News API space and I thought I could shed some light.
I’m going to immediately disregard the more expensive choices like Bing ($7/1,000 requests), since most people won’t find it easy to afford them, and the results are actually often inferior to more specialized solutions.
That means you’re basically left with two options: ContextualWeb News API and newsapi.org. I really do enjoy newsapi.org; they do a fantastic job and provide high-quality search results. They also provide developers with 500 free requests per day, which is pretty great.
ContextualWeb News API does have a couple of advantages: they offer 10,000 free requests per month, and beyond that our API is significantly cheaper and more flexible (the baseline is $0.5/1,000 requests, compared with a somewhat higher $1.8/1,000 requests on newsapi.org’s basic plan). ContextualWeb also returns result keywords, allowing you to do all kinds of nifty machine learning stuff with the search results.
Having said that, newsapi.org’s documentation is currently more straightforward, and their results are almost always spot on.
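For a concrete starting point, here is a sketch of assembling a newsapi.org-style query (the /v2/everything endpoint and parameter names follow their public documentation at the time of writing; the key is a placeholder):

```python
from urllib.parse import urlencode

def build_news_query(query, api_key, page_size=20):
    """Build a full-text search URL against newsapi.org's
    /v2/everything endpoint."""
    params = {"q": query, "pageSize": page_size, "apiKey": api_key}
    return "https://newsapi.org/v2/everything?" + urlencode(params)

url = build_news_query("lowest paying jobs Dallas County", "YOUR_KEY")
# Fetch with e.g. requests.get(url).json(); matching stories arrive
# under the "articles" key of the JSON response.
```

The government data mentioned in the question would feed into the `query` string here, one request per region/metric combination.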
Check out the Social Animal NEWS API.
The news database has over 450 million articles, and about 1 million articles are added daily.
Notable features:
It is fast and provides breaking news from across the web in 39 languages.
Social engagement data is available for all news URLs, which enables surfacing top-quality articles.
It offers sentiment analysis to filter articles based on their sentiment.
For more details, please visit the documentation page.
Disclaimer: I work for Social Animal.
There's a free alternative that provides good data, but the number of queries is limited.
If you want a higher number of queries per day, you must pay for a subscription.
The site is called gnews.io; it's really simple to authenticate and get a key.
You could use a general news API site like newsapi.org, or if you want more specific niche news (e.g. stocks) you can search for niche APIs like IEX, Alpha Vantage, or stocknewsapi.com.
Aylien provides a News API that gives you access to NLP-enriched news articles from 80,000+ news sources: https://aylien.com/product/news-api/demo
The Newsdata.io API is a simple news API with which you can search over 50,000 news sources worldwide. You can use it to get live breaking news from any country in the world in your preferred language, as they provide news data in 22 languages, in 7 categories, from 88 countries.
Newsdata.io allows you to search published articles by keywords or phrases, languages, publications, and time, and you can sort the results by time, the popularity of the publication source, or location.
Newsdata.io also provides news data analysis, including topic labeling, intent detection, sentiment analysis, emotion analysis, entity extraction, and semantic similarity.
The Newsdata.io API is free for non-commercial use, with 500 API requests per day, 10 articles per request, and 99.99% SLA uptime.
Data format: the API takes simple GET HTTP requests and returns JSON results.
Ease of use: the API is simple to use, and its documentation will have you implementing the API in a matter of minutes.
Maybe you could try Google BigQuery.

Should I use Google Maps API/Geocoding to power a store finder

I'm new to geocoding so I'm not certain this is even the question I should be asking, but all of the other discussions I've seen on this topic (here and on the Google API forum) are so application specific that I feel like I might be missing a very elementary step - I don't need to know how to implement a store finder - I need to know if I should.
Here is my specific situation - I have been contracted to design an application wherein we will build a database of shops (say, independently owned bars and pubs). This list will continually grow and change as shops close and new ones open. The user can enter his/her point of origin (zip code or address) and be shown a list or map containing all the various shops within a given radius in order of proximity.
I know how to deliver these results from a static database:
One would store the longitude and latitude as columns for each row and then just use that information to check distances.
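The distance check against stored coordinates is a standard haversine (great-circle) calculation, which can be sketched as:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lng points,
    using the haversine formula on a 6371 km mean Earth radius."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

# e.g. New York City to Philadelphia is roughly 130 km:
print(haversine_km(40.7128, -74.0060, 39.9526, -75.1652))
```

In practice you would push this into the database (most engines offer a spatial extension or the equivalent SQL expression) and pre-filter with a bounding box on the lat/lng columns so you don't compute the distance for every row.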
But I have inherited an (already fairly large) database of shops which have addresses but not coordinates - so I'm not sure what the best way to get those addresses is. I could write a script to query them one at a time against google geocoding, I could have a data entry person manually look up the coordinates for each one and populate the data that way, or maybe there is a third option I'm not aware of.
Is this the right place to be asking this question? Google Maps Geocoding doesn't host a forum of its own, but refers people to Stack Overflow. Other forums on the net dealing with this topic all relate to a specific technical question, but no one seems to be talking about it from a top-down perspective (i.e. the big picture).
Google imposes a 2,500 queries per day limit on free users and a 100,000 queries a day limit on paid ones - neither of these seem to be up to the task of a site with even moderate traffic if, every time a user makes a request, the entire database (perhaps thousands of shops) are being checked against Google's data. It seems certain we must store the coords locally but even storing them locally, there will have to be checks against Google in order to plot them on a map. If I had a finite number of locations (if, for example, I had six hardware shops) and I wanted to make a store locator, there would be a wealth of discussions, tutorials, and stack overflow questions available to point the way for me, but I'm dealing with a potentially vast number of records and not sure how to proceed or where to begin.
Any advice would be welcome - Additionally, if this is not the best place to be asking this question, a helpful response would be to indicate a better place to post it. I've searched for three days but haven't found what looks like a good resource for asking such subjective questions.
The best way, of course, would be to use a geocoding service to get coordinates and store those coordinates in your DB. But that's not possible with Google's geocoding service, because it is not permitted to store geocoded data permanently.
There are free services without this restriction; some keywords to search for: MapQuest, Nominatim, GeoNames (but these services are less accurate than Google).
Another option would be to use a FusionTable. The geocoding would run automatically (but the daily limits are the same as for the geocoding service). The benefit: the geocoding is permanent (though you can't access the locations directly by e.g. downloading a DB dump), and you may use the coordinates for plotting markers (via a FusionTablesLayer) or filtering (e.g. by distance).
The number of entries shouldn't be an issue; 100k rows is no problem for a database.
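If you do go the script route for the inherited addresses (a one-time batch pass through whichever provider's terms allow storing the results), the shape of that script is simple. Here the `geocode` callback stands in for whatever provider you settle on, so the sketch stays provider-neutral:

```python
import time

def geocode_all(shops, geocode, delay=0.2):
    """Fill in missing coordinates on a list of shop records.

    `geocode(address)` should return a (lat, lng) tuple or None;
    it represents one call to your chosen geocoding service.
    The delay keeps the loop under typical per-second rate limits."""
    for shop in shops:
        if shop.get("lat") is None:
            result = geocode(shop["address"])
            if result:
                shop["lat"], shop["lng"] = result
            time.sleep(delay)
    return shops
```

Run once over the inherited table, persist the coordinates, and from then on only newly added shops need a geocoding call.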

Using the CloudMade API or some other geosearching API to get tourism spots?

I've been looking for an online geosearch API such that I can get location data for tourism spots, e.g. historic landmarks, tourism information locations, castles, etc. I'm not looking for a map image, but rather a list of location-tagged data. I've played with CloudMade, but none of the object types seems to return anything like the number of responses I expect (even with a wide bounding box). None of Bing, Yahoo, or Google Maps seems to have a geosearch API without the maps -- just geocoding. I've seen apps that find restaurants open 24 hours or apartments for rent near me -- where does that data come from? Thanks!
Where does that data come from:
Nearby restaurant searches are often performed with the Google AJAX Search API using its Local Search mode. It doesn't work for vague searches like "tourism spot" or "historical landmark"; it needs something more specific like "museum". It's fundamentally a business search (they initially bought data from the Yellow Pages, so it's heavily biased towards organisations that have a telephone), so it won't find natural features. It's also limited to about 8 results per query.
Apps that find apartments for rent tend to have their own database, and that database is built by reading things like Craigslist.
CloudMade limits the number of results returned so as not to overwhelm your browser. By default it returns 10 results. You can use the "limit=" and "offset=" parameters for paging, and the response contains a "found" field which gives you the total number of results.
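That limit/offset paging, with the "found" total, can be driven by a loop along these lines (the `fetch_page` callback stands in for the actual HTTP call, so the sketch works against any API with this response shape):

```python
def fetch_all(fetch_page, limit=10):
    """Collect every result from a paged API that accepts limit/offset
    and reports the total hit count under a "found" field."""
    results, offset = [], 0
    while True:
        page = fetch_page(limit=limit, offset=offset)
        results.extend(page["results"])
        offset += limit
        if offset >= page["found"]:
            return results
```

Each iteration asks for the next `limit`-sized slice; the loop stops once the offset has walked past the reported total.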
BTW, a couple of weeks ago the new CloudMade Geocoding API was published. It is far faster and easier to use.
More info - http://blog.cloudmade.com/2009/10/08/new-geocoding-engine-delivers-results-up-to-24-times-faster/
Documentation - http://developers.cloudmade.com/wiki/geocoding-http-api/Documentation