Google Driving Directions Gadget Alternative - google-maps

Does anyone know what's going on with the Google Driving Directions Gadget?
If you click the "add a gadget to your website" link, you just get a 404.
I have the gadget on a page here, and if I open Chrome developer tools I get:
Failed to load resource: the server responded with a status of 404 (Not Found)
http://www.gmodules.com/ig/ifr?url=http://hosting.gmodules.com/ig/gadgets/f…&lang=en&country=US&border=%23ffffff%7C3px%2C1px+solid+%23999999&output=js Failed to load
on this page.
Does anyone know a working alternative or a workaround for this? It's strange that Google would let something like this break.
The Google Maps gadget linked above it also gives a 404 when clicked. It has been like that for the last few days that I have tried it, so I'm not sure how long it has been broken.

Those gadgets are by third-party developers. From the bottom of the page you link to:
Much of the content in this directory was developed by other companies or by Google's users, not by Google. Google makes no promises or representations about its performance, quality, or content. Google doesn't charge for inclusion in this directory or accept payment for better placement.

Related

DOMException Error: Is Google Drive still supporting Embed Link with the /preview?

If you open a PDF file from Google Drive in a new window and go to the vertical three-dots menu, you can still see the "Embed item..." option. As such, I assume that Google still allows PDF files to be embedded, as shown below:
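    <!-- FILE_ID is a placeholder for the file's ID from its Drive share link -->
    <iframe src="https://drive.google.com/file/d/FILE_ID/preview" width="640" height="480"></iframe>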
But as a result, I get the following errors being output to the Chrome browser console:
Uncaught (in promise) DOMException: Failed to execute 'getLayoutMap' on 'Keyboard': getLayoutMap()
must be called from a top-level browsing context.
I even tried with <embed> and <object>, but no luck. I'd be OK if it were just a warning rather than an error, which is quite severe in the context of SEO. I just want to confirm whether this is a bug or Google has officially discontinued the embed service; otherwise, what would be the solution to tackle this issue? I explored the Google Drive API, but it doesn't seem to offer anything for a proper HTML embed in particular.
Please advise. Thanks in advance!

How to find the HTTP request from Google Chrome inspect element?

Forgive me if I don't use the proper terminology. I have a webpage that I'm trying to scrape information from. The problem is that when I view the page source, the data I want to scrape is not there. I've encountered this problem before: the main HTTP request triggers other requests, so the information I'm looking for is actually somewhere else, which I find using Google Chrome's Inspect > Network feature. I manually search the various documents and XHR files for the one that has the correct information, which is sometimes long and tedious.
I can also use Chrome's inspect feature to inspect the element that contains the information I want, and that brings up the correct source code, but I can't seem to figure out where or how I can use that to quickly find the corresponding HTTP headers.
Restated in a short - can I use the inspect element feature of google chrome and then ask it to show me the corresponding network event (HTTP request) that produced that code?
I'll add the case study I'm working on.
http://www.flashscore.com/tennis/atp-singles/acapulco/results/
shows the different matches that took place at a tennis tournament. I'm trying to scrape the match hrefs, but if you view the source of the page you'll see they're not there.
Thanks
Restated in a short - can I use the inspect element feature of google chrome and then ask it to show me the corresponding network event (HTTP request) that produced that code?
No. This isn't something that the browser keeps track of.
In most situations, the HTTP response will pass through a good deal of JavaScript code before eventually being turned into elements on the page. Tracing which HTTP response was "responsible" for a given element would involve a great deal of data-flow analysis, and is impractical for a browser to do.
One way: open Firefox, install LiveHttpHeaders, run it, and you will see the expected headers.
There's a similar add-on for Google Chrome, but I haven't tested it.
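If you'd rather stay in Chrome, one rough workaround is to paste a small snippet into the DevTools console before the page loads its data, so every XHR the page fires gets logged with its URL. This is only a sketch, and it only catches XMLHttpRequest traffic, not fetch() calls:

    // Log every XHR the page makes, so you can spot the request
    // that carries the data you saw in the rendered element
    (function () {
        var origOpen = XMLHttpRequest.prototype.open;
        XMLHttpRequest.prototype.open = function (method, url) {
            console.log('XHR:', method, url);       // print each request as it starts
            return origOpen.apply(this, arguments); // hand off to the real open()
        };
    })();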

How can I tell Google that I have removed the .html from my URLs?

Hi, I have recently removed the '.html' from the end of my URLs to make them look more professional, which was brilliant. However, the old URLs that include the '.html' still appear when I see my site on Google, which sends people to an error page, as expected. How can I tell Google that I have new URL addresses so that people can visit my site again?
Thanks!
The best way to remove .html extensions is by adding a rule to your .htaccess file. This way search engines will "understand" it, but you will not see the change in search results immediately, since the search engine crawler will take some time to update.
Also make sure to submit your URL to Google. If you have Google Webmaster Tools, you will be able to see this process and the status of your website more clearly.
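For example, a rule set along these lines is a common approach (just a sketch, assuming Apache with mod_rewrite enabled; test it on your own setup first):

    RewriteEngine On

    # Permanently redirect the old .html addresses to the clean URLs,
    # so Google transfers the old pages' standing to the new ones
    RewriteCond %{THE_REQUEST} \s/(.+)\.html[\s?]
    RewriteRule ^ /%1 [R=301,L]

    # Internally serve the matching .html file for a clean URL
    RewriteCond %{REQUEST_FILENAME}.html -f
    RewriteRule ^(.+)$ $1.html [L]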

Using Instagram API for simple web page

So I am working on a fairly simple project: basically a web page that should list the captions from a certain Instagram account. It's all designed; it just needs to be lit up with the content. Have a look at http://evanshellborn.com/speechofthebeets/.
I found that you can see a JSON file containing all the necessary data at instagram.com/{username}/media, so in my case https://www.instagram.com/beets_are_life/media/. Before I put the page online, I made a JSON call to that URL from my local machine and it worked perfectly. So I built it all out, and my web page loaded the captions just like I wanted.
Then I went to put it online (http://evanshellborn.com/speechofthebeets), but it doesn't work. Have a look at the script at the bottom of it: on my localhost that code works and the captions get loaded, but on the live page I get an access-not-allowed error in the console. So I think Instagram doesn't allow this sort of direct access anymore; you have to go through their API.
Now I've tried looking at the API but it seems rather confusing. Basically what I'm asking for is a different JSON url that would give me the same result as https://www.instagram.com/beets_are_life/media/, but that would work from the live page.
I think https://api.instagram.com/v1/users/{user-id}/?access_token=ACCESS-TOKEN would work, just replacing {user-id} with the appropriate user_id. But where do I get an access token?
From reading https://www.instagram.com/developer/authentication/, it looks like you get one when a user puts in their user credentials. But I don't want to have anyone log in; I just want a simple web page.
Hopefully that made sense. How can I do what I want?
It looks like the URL https://www.instagram.com/beets_are_life/media/ does not support JSONP (no callback support), so you cannot make the API request from JavaScript on the client side; it will fail with an Access-Control-Allow-Origin error in the browser. You have to make this API call on the server side, as a proxy.
I guess https://www.instagram.com/<USER_NAME>/media/ is not a publicly documented API, and that's the reason it does not support JSONP. Instagram uses it for their own website, and since that is same-origin it works for them on the client side.
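If you can run something on your server, one option is a tiny proxy that fetches the JSON for you and re-serves it with a CORS header the browser will accept. A rough Node.js sketch (assumes Node is available on your host; the port is an arbitrary choice):

    var http = require('http');
    var https = require('https');

    http.createServer(function (req, res) {
        // Fetch the (undocumented) media feed server-side, where CORS doesn't apply
        https.get('https://www.instagram.com/beets_are_life/media/', function (igRes) {
            res.writeHead(igRes.statusCode, {
                'Content-Type': 'application/json',
                'Access-Control-Allow-Origin': '*'  // let your page read the response
            });
            igRes.pipe(res);                        // stream Instagram's JSON through
        }).on('error', function () {
            res.writeHead(502);
            res.end('upstream error');
        });
    }).listen(8080);

Your page would then request this proxy's URL with a plain JSON call instead of JSONP.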
Instagram's own embed feature will let you embed the post on a simple HTML web page. There is a button at the bottom of each post on Instagram; when you click it, a menu pops up. Click Embed, and a box pops up. Just copy and paste the HTML and you are done: it will fetch the post for you.
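The copied markup typically looks something like this (POST_ID here is a placeholder for the post's shortcode):

    <blockquote class="instagram-media"
        data-instgrm-permalink="https://www.instagram.com/p/POST_ID/">
        <a href="https://www.instagram.com/p/POST_ID/">View this post on Instagram</a>
    </blockquote>
    <!-- Instagram's script replaces the blockquote with the rendered post -->
    <script async src="//www.instagram.com/embed.js"></script>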

Google Crawl Errors Because Of External Links

I have tons of 404 crawl errors (my old URLs). I deleted them via Google Webmaster Tools > Remove URLs tool.
Example: www.mysite.com/page1.html
But there are some external sites that link to my old URLs on their content pages (e.g. www.anothersite.com), and because they still carry my old URLs, my URL removal always fails.
What can I do now? I cannot delete these links; I don't know who the owners of these websites are. And there are tons of external URLs like this; I cannot remove them one by one, pressing the button again and again.
Can robots.txt be enough? Or what more can I do?
You don't want to use robots.txt to block the URLs (Google does not recommend it).
404s are a perfectly normal (and in many ways desirable) part of the web. You will likely never be able to control every link to your site, or resolve every 404 error listed in Webmaster Tools. Instead, check the top-ranking issues, fix those if possible, and then move on.
https://support.google.com/webmasters/answer/2409439?hl=en
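If some of the old URLs have replacements, a redirect is the usual fix: the external links keep working and the 404s resolve themselves. A sketch in .htaccess (assumes Apache; /new-page and /old-page2.html are made-up examples to adapt):

    # Old page moved: send visitors and Google to the new address
    Redirect 301 /page1.html /new-page

    # Old page gone for good: a 410 tells Google to stop retrying sooner than a 404
    Redirect gone /old-page2.html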