We are providing an RSS feed for our blog. A couple of customers are complaining about broken image formatting. I have tried it using the Vienna RSS reader, and it is indeed broken: the images look weirdly warped. Apart from that, everything seems to work.
It seems my company always sends the whole article via RSS. I am not sure this is the right way. Would it not be smarter to send a short message, like a tweet, accompanied by a link?
Please give me your opinion on the matter. Does it seem weird to send pictures via an RSS feed at all?
Edit 1: I am wondering whether I should strive to fix the images, or rather change our whole concept from sending the complete article to sending a link only.
Edit 2: Is there a gold standard of RSS that I should follow? How is RSS meant to work?
Edit 3: Back on topic then: how can I fix my images? How should they be formatted? In my RSS reader every image seems to have the same aspect ratio.
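For concreteness, here is roughly the kind of item markup I mean - a sketch with placeholder URLs, where the image carries explicit pixel dimensions at its real aspect ratio (which, as far as I can tell, some readers need in order not to warp it):

    <item>
      <title>Example post</title>
      <link>https://example.com/blog/example-post</link>
      <description><![CDATA[
        <p>First paragraph of the article...</p>
        <!-- explicit width/height matching the image's real aspect ratio,
             no CSS-dependent sizing -->
        <img src="https://example.com/images/photo.jpg"
             width="600" height="400" alt="Photo caption" />
      ]]></description>
    </item>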
I was hoping there was a simple, turnkey way to download the metadata that Clarifai generates as a .CSV file. Ideally I would then take that information, format it, and upload it into our DAM system to round out the metadata for images.
I have looked through their documentation, and while it is interesting, it isn't pointing me in an actionable direction. I'm not a coder, nor do I have the time to experiment much, so I'm wondering if there's something out there.
There isn't currently a way to do this in bulk, but you can do it image by image if you want. When you look at an image, on the bottom right you should see the option of viewing the metadata as either JSON or TABLE.
(The image I posted shows one without metadata, but you get the idea.)
Even if that data is what you're looking for, you would probably have to scrape the site to collect it in bulk :(.
It might be worth contacting customer support to request this feature, though.
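If you ever do get a coder's help, the per-image route can at least be semi-automated. Below is a rough Node.js sketch that turns a folder of saved JSON views into one CSV file; the field names ("concepts", "name", "value") are assumptions on my part, so check them against the JSON view for your own images and adjust:

    // Hypothetical sketch: convert saved Clarifai JSON files to one CSV.
    // The "concepts"/"name"/"value" field names are assumptions - verify
    // them against the JSON view for your own images.
    const fs = require('fs');

    const rows = ['file,tag,confidence'];
    for (const file of fs.readdirSync('./json')) {
      const data = JSON.parse(fs.readFileSync('./json/' + file, 'utf8'));
      for (const c of data.concepts || []) {
        rows.push(file + ',' + c.name + ',' + c.value);
      }
    }
    fs.writeFileSync('metadata.csv', rows.join('\n'));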
I am trying to create a web comic aggregation website using HTML 5, CSS 3, and JavaScript. I would like users to be able to view comics of different dates from different websites all in one place. After some research, it seems like I'm probably going to need to use an RSS feed to accomplish this. However, I don't fully understand the capabilities and usage of an RSS feed.
First, would it be possible to pull images from comic websites in an automated and orderly fashion using an RSS feed? Or would I need to use something else? Or would it not be possible at all? If it is possible with an RSS feed, I'm also somewhat confused about the general implementation. Would the code for the feed be in HTML, or JavaScript, or both? Would I need to use existing libraries and/or APIs? Is there existing code with a similar enough function that I could use it as a starting point?
Thanks
You are headed in the right direction - RSS is a standard format used to notify users/readers of newly published content.
I'm sure you've searched it already, but its Wikipedia page is quite informative. Basically, it is a standardisation and extension of XML allowing for a uniform way to distribute and read material in an automated fashion.
There are other formats along the same lines, such as Atom.
So, for your purpose the main thing to understand is that you want to READ RSS feeds rather than write/publish one (although you could make one as well, combining the comics you've found). For example, at the bottom of xkcd there are two links - one for an RSS feed and another for an Atom feed. You need to find websites like that, which publish RSS/Atom feeds of comic strips, and write your site to read their feeds and update itself with the new content. You could even automate the way your site discovers feeds by finding (or creating) a feed of comic feeds, so your site would look up that one feed, which would contain links to other feeds that all suit your purpose.
You could also put a backend on a server that fetches the feeds and updates a database, from which the front end would then fetch the content through a single point - but let's stick with the technologies you've mentioned and keep it a client-based website for now.
To read and parse the feeds, have a look at the answer here, which recommends jFeed, a plugin for jQuery (jQuery is a very popular JavaScript library, if you don't know it).
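A minimal sketch of what that looks like (the feed URL and the proxy script are placeholders; note that the browser's same-origin policy blocks cross-domain feeds, so in practice you fetch them through a small proxy on your own server):

    // Minimal jFeed sketch - the proxy path and feed URL are placeholders.
    jQuery.getFeed({
      url: '/proxy.php?url=http://xkcd.com/rss.xml',
      success: function (feed) {
        jQuery('#comics').append('<h2>' + feed.title + '</h2>');
        jQuery.each(feed.items, function (i, item) {
          // item.description holds the item's HTML, including the <img> tag
          jQuery('#comics').append(item.description);
        });
      }
    });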
I'm pretty sure that answers your questions, but let's address them again, breaking them down and going one by one:
would it be possible to pull images from comic websites in an automated and orderly fashion using an RSS feed?
Yes! As you can see in the xkcd feed I've linked above, pulling/distributing images via RSS (and Atom) feeds is both possible and widely done.
would I need to use something else?
You can use Atom, which is just a different standard but much the same idea (also an extension of XML, and you can still use jFeed).
would it not be possible at all?
It is possible. Do not worry. Stay calm and code away.
If it is possible with an RSS feed, I'm also confused somewhat about the general implementation. Would the code for the feed be in HTML, or JavaScript, or both?
Do not confuse the feed's code with yours. Your code should READ the feed, not be it. The feed itself, as explained above, is written in a standard form of XML called RSS (or Atom, if you go with that). That is what your code reads. For your code, see the next question/answer.
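To make that concrete, a stripped-down RSS 2.0 feed looks roughly like this (all values are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <rss version="2.0">
      <channel>
        <title>Some Comic</title>
        <link>http://example.com/</link>
        <description>A web comic</description>
        <item>
          <title>Today's strip</title>
          <link>http://example.com/strip/123</link>
          <description>&lt;img src="http://example.com/img/123.png" alt="strip" /&gt;</description>
        </item>
      </channel>
    </rss>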
Would I need to use existing libraries and/or APIs? Is there existing code with a similar enough function that I could use it as a starting point?
As mentioned above, you can use jQuery and its jFeed plugin.
Hope that helps and is not confusing.
Is there a way to convert all mailto:example@example.com email links in an HTML page to images showing the same content/email address? (I didn't have the reputation to post images - an example image.) I know some websites provide this kind of service, but there it has to be done one address at a time. I have a webpage with many email links, so I want to ask for a better or smarter way to do it.
Any response would be appreciated. Thanks.
Yes. One way to do this is with PHP. Basically, what you are describing is writing text onto an image, which can be done with the PHP imagettftext() function. See http://php.net/manual/en/function.imagettftext.php for more info.
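A minimal sketch of the idea, assuming GD is enabled and a TTF font exists at the given path (both the font path and the address are placeholders):

    <?php
    // Render an email address onto a PNG instead of printing it as text.
    $email = 'user@example.com';            // placeholder address
    $font  = '/path/to/arial.ttf';          // placeholder TTF font path

    $im    = imagecreatetruecolor(220, 24);
    $white = imagecolorallocate($im, 255, 255, 255);
    $black = imagecolorallocate($im, 0, 0, 0);
    imagefilledrectangle($im, 0, 0, 219, 23, $white);

    // imagettftext(image, size, angle, x, y, color, fontfile, text)
    imagettftext($im, 12, 0, 5, 17, $black, $font, $email);

    header('Content-Type: image/png');
    imagepng($im);
    imagedestroy($im);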
I was wondering which elements are fetched when a user shares a URL on Facebook or Google+.
For example: how can I make sure that the description of the post will be the description I want shared, and that the image will be the image I want shared?
The title is pretty obvious, so I skipped that.
Facebook suggests the Open Graph protocol: http://ogp.me
It works reliably and can be checked with the Facebook URL linter: http://developers.facebook.com/tools/lint/
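The relevant tags go in the page's <head>; a short sketch with placeholder values:

    <!-- Open Graph tags - all values are placeholders -->
    <meta property="og:title"       content="My Article Title" />
    <meta property="og:description" content="The description I want shared." />
    <meta property="og:image"       content="http://example.com/share-image.jpg" />
    <meta property="og:url"         content="http://example.com/article" />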
Funny - I just wrote a blog post about this very thing this week. It seems to me that there's no reliable way of knowing how a social network will parse your web page to get the "status" version of it. Not only does each site do it differently (i.e. FB vs. LinkedIn vs. G+), but they're liable to change it on a whim.
So currently the short answer is that you can't know this for sure. You have to reverse engineer each social network's behavior and hope it doesn't change too often - that is, until the industry smartens up and agrees on some markup to convey, for example, which image from a page is considered the canonical "share" image, and so on.
I found this topic: Remove (or replace) all hyperlinks from an RSS feed? (probably with yahoo pipes)
So the idea is that, while the RSS feed otherwise remains intact, the hyperlinks would be removed,
but it simply doesn't seem to work under any circumstances.
(I also tried this module in pipes that were already tested and working; it didn't pull any content either.)
Any idea how to get this running?
Thanks,
T.T.
The link you provided to the other SO question seems to be talking about removing the RSS item links. If you want to remove links from the RSS description, that is a different matter. You will need to write some regex that looks for the <a> and </a> tags.
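Something along these lines should do for the common case (a rough sketch; "description" stands for the feed item's description string, and regex over HTML will always miss some edge cases):

    // Strip <a ...>...</a> pairs but keep the link text.
    var stripped = description.replace(/<a\b[^>]*>(.*?)<\/a>/gi, '$1');

The same pattern/replacement pair can go straight into the Pipes Regex module.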