Background
I'm creating an app that will allow me to send meeting points to my clients by SMS. The meeting points are always physical addresses, not coordinates. When doing the process manually through Google Maps, it looks like so:
Note that the generated link is a goo.gl short link. Also note the neatly generated thumbnail, which displays the facade of the house with the address under it. Very user friendly.
My problem
Trying to recreate the step above programmatically works only partially; it creates a workable link but the SMS does not display the facade picture of the house, nor is the address displayed below the thumbnail.
Most inquiries I found on the topic point to Google's Developer Guide, which uses the following syntax:
https://www.google.com/maps/search/?api=1&query={{URL encoded address}}
or in this situation:
https://www.google.com/maps/search/?api=1&query=172+Fourth+Ave+Ottawa%2C+ON+K1S+2L6
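Example (a minimal sketch of building that query string programmatically; the address is just the example above, and only the documented api=1 and query parameters from the guide are used):

# Minimal sketch: build the Maps "search" URL from a plain address string.
from urllib.parse import urlencode

def maps_search_url(address: str) -> str:
    # urlencode handles the URL encoding of spaces and commas in the address.
    params = {"api": "1", "query": address}
    return "https://www.google.com/maps/search/?" + urlencode(params)

print(maps_search_url("172 Fourth Ave Ottawa, ON K1S 2L6"))
# -> https://www.google.com/maps/search/?api=1&query=172+Fourth+Ave+Ottawa%2C+ON+K1S+2L6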
Following the approach above yields the following SMS:
Any suggestions on how to get the first output, the one generated via the goo.gl link?
Related
We have a service where all user interaction is done via texting (iMessage) using Sendblue. We want users to be able to refer their friends to get free credits. The easiest way we can think of would be giving each user a link to send to their friends which, when opened, would auto-populate a message to our number, the content of that message being the referring user's number. Example:
I (phone number +1234) want to refer my friends. I send them a link which, when they click it, opens a text message to +4321 (the service) with the text prepopulated with my phone number, "+1234".
Right now the entire service is run through Zapier, so ideally this would work through that as well. You can also enter JavaScript blocks into Zapier to manipulate data, if that would help.
The only solution I have been able to come up with so far is to create a different webpage for every person's phone number, which would just contain a custom HTML link with the above behavior (a rough sketch follows below) that would be clicked automatically on page open.
Is there an existing service that does this (and ideally works with Zapier)? Or would I have to do what is described above? If so, what is the easiest way to integrate that data? Is there a way to automate webpage creation?
Any and all help is appreciated!
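For what it's worth, here is a rough sketch of that custom-link idea, assuming the sms: URI scheme; +4321 and +1234 are the placeholder numbers from the example, and the exact separator before the body parameter ("?", "&" or ";?&") varies by platform, so treat the URI format as an assumption to test on real devices:

# Sketch: build a per-user referral page containing an sms: link.
# +4321 (the service number) and +1234 (the referrer) are the example numbers above.
from urllib.parse import quote

def referral_page(service_number: str, referrer_number: str) -> str:
    # The body of the text message is pre-filled with the referrer's number.
    body = quote(referrer_number)
    # iOS and Android differ in how the body parameter must be joined to the
    # number ("?", "&" or ";?&"), so test this on real devices.
    sms_uri = f"sms:{service_number}?&body={body}"
    return f'<html><body><a href="{sms_uri}">Text us to claim your free credits</a></body></html>'

print(referral_page("+4321", "+1234"))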
Currently, I'm working with Beautiful Soup to extract information from websites. I've managed to scrape data from multiple pages of a certain apartment renting website with it - let's call it “Website A”. Now, I'd like to obtain data from another renting website (“Website B”). I tried to follow a similar procedure as before, but it failed because Website B requires a login.
I did manage to scrape the first page of apartments of Website B by means of Adelin's answer. His/her approach is based on the usage of Curl Trillworks (link). In principle, this approach could work for the rest of Website B as well. However, one would then need to manually repeat the procedure for the 800 or so pages on which the apartments are listed, and afterwards do the same for each of the 15 apartments per page.
This is too much work for me, so I'm trying to automate the process. For instance, I tried adapting this to my situation, but I haven't succeeded so far: the dictionary I get is empty. I also tried making a new header for each page by putting a new referer each time in the original header, and then putting these referers in the header dictionary. However, this failed - probably because Website B recognized I was using the same cookie every time I sent a request (the same one I used for the original apartment page of Website B).
So my question is:
Suppose one has a list of pages of Website B that all have the same format (www.websiteB.com/PageNumber/). How would one quickly/automatically obtain a valid header for each page, using one's own login credentials for the website, so that each request gets an appropriate response?
I could share the code I have so far, but I'm somewhat hesitant as this is a large commercial website and I suspect they aren't particularly happy with me sharing code that allows their website to be scraped and names the website itself as well.
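To keep the actual site out of it, here is a generic sketch of the kind of automation being asked about: log in once with a requests session so the cookies are reused automatically, then loop over the www.websiteB.com/PageNumber/ pages. The login URL, form field names, and CSS selector below are placeholders, not the real site's:

# Generic sketch: log in once, reuse the session cookies for every listing page.
# All URLs, form field names, and selectors below are placeholders.
import requests
from bs4 import BeautifulSoup

LOGIN_URL = "https://www.websiteB.com/login"   # placeholder
PAGE_URL = "https://www.websiteB.com/{page}/"  # matches the page format in the question

def scrape_all_pages(username: str, password: str, last_page: int = 800):
    with requests.Session() as session:
        # Logging in through the session stores the auth cookies once;
        # they are then sent automatically with every later request.
        session.post(LOGIN_URL, data={"username": username, "password": password})

        for page in range(1, last_page + 1):
            response = session.get(PAGE_URL.format(page=page))
            soup = BeautifulSoup(response.text, "html.parser")
            # Placeholder selector for the ~15 apartment entries per page.
            for listing in soup.select(".apartment-listing"):
                yield listing.get_text(strip=True)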
I am using the Google Maps API for a project whose functionality is something like below.
Functionality achieved:
When users sign in, they see a map and their current location is traced. Users can mark their favorite locations and add a comment.
Functionality required, which I need help with:
When any user passes through any such location in real time, my web application should send a notification saying "you are at someone's favorite place, do you want to add a comment and xyz..."
Can anyone help me even with a hint on how I can achieve this for the web application?
P.S: I am not using any mobile application for this purpose. This is a web application.
It sounds like you're already set up with getting the user's location through the browser geolocation API when the page loads. My suggestion is to get the user's location again on a certain interval, maybe every couple of minutes. If the new location is more than a little different from the previous location, send an AJAX request to your server, check whether the new location is near one of these favorite places, and if so, display a notification that they are near this place.
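A rough sketch of the server-side half of that suggestion follows; Flask, the in-memory favorite_places list, and the 100 m radius are all assumptions, just to make the idea concrete:

# Rough sketch of the server-side check: is the reported position near a favorite place?
import math
from flask import Flask, jsonify, request

app = Flask(__name__)

# (lat, lon, owner) tuples; in practice these would come from your database.
favorite_places = [(45.4215, -75.6972, "alice")]

def distance_m(lat1, lon1, lat2, lon2):
    # Haversine distance between two points, in metres.
    r = 6371000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

@app.route("/near-favorite")
def near_favorite():
    lat = float(request.args["lat"])
    lon = float(request.args["lon"])
    nearby = [owner for plat, plon, owner in favorite_places
              if distance_m(lat, lon, plat, plon) < 100]
    # The page's polling code shows the notification whenever this list is non-empty.
    return jsonify(nearby=nearby)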
Google Analytics shows me the report for the root of my site (/); however, there are two possible roots: one for public users (and the Google bot), and one for logged-in users (like Facebook, for example).
Can I add some code anywhere in my HTML so that in Google Analytics I can see those two pages differentiated?
I don't want to code two different links for those two pages, for simple reasons:
I don't want/need to index the inside of my site, so I don't care that google bot doesn't know about it
I don't want my public main page to be domain.com/something
I don't want my 'logged in' main page to be domain.com/something-else
You can pass a special URL for logged-in users as a parameter to your track pageview call via server-side code (or a JavaScript function).
Example (assuming you use the asynchronous code):
_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_trackPageview', '/root_loggedin']);
Even though the URL "/root_loggedin" does not physically exist, it will register in your reports. The root for "normal" users will still register as simply "/".
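Example (a minimal sketch of the server-side variant; Flask, the route, and the session check are assumptions - only the two _gaq.push lines come from the snippet above):

# Minimal sketch: render a different virtual pageview path for logged-in users.
from flask import Flask, render_template_string, session

app = Flask(__name__)
app.secret_key = "replace-me"

PAGE = """
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-12345-1']);
  _gaq.push(['_trackPageview', '{{ ga_path }}']);
</script>
"""

@app.route("/")
def root():
    # Logged-in users report the virtual "/root_loggedin" path; everyone else "/".
    ga_path = "/root_loggedin" if session.get("user_id") else "/"
    return render_template_string(PAGE, ga_path=ga_path)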
I am developing an application like http://www.reputation.com/
Users have to input URLs like https://plus.google.com/105836802564090875429/about, and I have to fetch the corresponding reviews/ratings and display them on my screen.
The whole process will be automated. The Google Place Details API is available (https://developers.google.com/places/documentation/#PlaceDetails), but there we have to provide the reference of a place.
Now I am stuck on how to find the reference from my URL (https://plus.google.com/105836802564090875429/about).
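For context, here is a sketch of the Place Details call from the documentation linked above, assuming the reference were already known; the exact query parameters should be checked against those docs, and obtaining that reference from the plus.google.com URL is exactly the missing step:

# Sketch of the Place Details request described in the linked documentation,
# assuming the place reference were already known.
import requests

DETAILS_URL = "https://maps.googleapis.com/maps/api/place/details/json"

def fetch_place_details(reference: str, api_key: str) -> dict:
    params = {"reference": reference, "sensor": "false", "key": api_key}
    return requests.get(DETAILS_URL, params=params).json()

# details = fetch_place_details("<reference from somewhere>", "<your API key>")
# details["result"].get("rating"), details["result"].get("reviews")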