I am a relatively new programmer. Talking with a partner, he told me that before AJAX, he used an iframe to send data and change the content (obviously with the help of JavaScript).
I understand that both are similar techniques, but I didn't find an article describing their characteristics. What are the advantages of AJAX over an iframe?
EDIT
I didn't find any explanation of the technique, but my partner told me he posts the data through a hidden iframe by submitting a form that targets it, so it sounds like only the iframe has to be refreshed. I never did that myself; something like the sketch below, I assume.
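For reference, a minimal sketch of that hidden-iframe technique; the URL, field name, and logging here are all hypothetical:

```html
<!-- Hypothetical markup: the form posts to the server, but the response
     loads into the hidden iframe, so the page itself never reloads. -->
<iframe name="hidden-target" style="display:none"></iframe>

<form action="/save.php" method="post" target="hidden-target">
  <input type="text" name="comment">
  <input type="submit" value="Send">
</form>

<script>
  // Optionally react once the server's response has loaded in the iframe.
  document.querySelector('iframe[name="hidden-target"]').onload = function () {
    console.log('Submitted; response arrived in the hidden iframe.');
  };
</script>
```

The server's response ends up in the iframe, so the main page never reloads; reading that response back, though, is much clumsier than with AJAX.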
One advantage AJAX has is being able to read the state/status of the request. You also have access to the response headers, which you don't with iframes.

AJAX can also handle multiple asynchronous requests. It's a little trickier with iframes, as you need to create an iframe per request (and keep track of all of them to delete them later) instead of recycling the same one.
Existing libraries are full of AJAX goodness and there is a larger community support base.
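A minimal sketch of the state/status point with a raw XMLHttpRequest (the URL is a placeholder):

```javascript
var xhr = new XMLHttpRequest();
xhr.open('GET', '/data.php'); // placeholder URL

xhr.onreadystatechange = function () {
  if (xhr.readyState === 4) { // request finished
    console.log(xhr.status);  // e.g. 200 or 404
    console.log(xhr.getResponseHeader('Content-Type'));
    console.log(xhr.responseText);
  }
};

xhr.send();
```

Several of these can run in parallel, and the status and header information shown here is exactly what an iframe won't give you.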
iframe
is a way to show two (or more) webpages separately within one.
ajax
is a way to merge two (or more) webpages (or new data) into one.
Key advantages of Ajax, as I see them:

CSS from the host page flows into the content pulled into it.
It's a way to retrieve data and show visitors new information without a page refresh.
A fab mention to this site for its clever use of Ajax.
'Google Instant' and suggestive searching are achieved via Ajax.
Just my two cents:
I agree with Kris above that I wouldn't say they are comparable.
There's one use case where I find iframes easier to work with than AJAX: if you need to submit a complicated form to another page but don't need any response, the iframe route is by far the easiest to code.

Beyond that, AJAX, using a metaphor, acts as a very knowledgeable go-between. It will handle multiple requests, track the status of those requests, and hand back the data in the format you need.
I just wanted to add this because I didn't see in any of the answers.
The reasons to use Ajax are mostly about control, which you get a lot of. These reasons have been mentioned above.
One serious downside of Ajax, though, is that it is a JS fix. JavaScript is a great language, but people have been throwing it at every problem for a while now, and things which could be optimized if they were built into the browser are now instead being done slowly (compared to compiled languages) in JS.
iFrames are a great example of this. They represent an incredibly common use case, wanting to include some html in some other html. Unfortunately, they aren't very amazing at it, often creating more headache than anything else.
If you want to include something and not have it mess with your site, nor your site to mess with it, iFrames are great. For the more common use case of including some random html in some other html, Ajax is better.
And here is the point I'm trying to make: this is dumb. There is no reason there shouldn't be something like an iFrame that acts more like Ajax. But, by jumping on board (as all of us did) with Ajax, we are now left with no choice.
The biggest reason this is a problem is that JS was never meant to be the absolute building blocks of the internet. Further, it's being used by pretty much every site around to violate user privacy. So, if you're looking for a good reason to use iFrames, this is mine:
It feels good to not need JS. If you can make your site enhanced by JS rather than dependent on it, that's a hard-earned accomplishment, and the site will feel less "hacky" overall.
Anyways, that's just my input.
In my experience, data loaded via AJAX is easier to manipulate than data inside an iframe. AJAX is also really good for creating a better user experience. However, I am not sure I would put iframes and AJAX in the same category, because AJAX is asynchronous content, while an iframe is really just another page being loaded from outside of your site.

I could also see iframing creating SEO barriers and a bad user experience. Honestly, though, if I had access to the content, I would prefer AJAX.
Related
I am trying to find an iframe equivalent, or an alternate method of inserting a page into another page. The page will still need to be active: when I submit a form or click a link within it, it will need to function like an iframe. I know iframes still work, but since they have been falling out of favor for several years, I would like to find a newer method. Is there one, and what kind of scripting am I looking at?
My answer: No, there is no alternative, at least if you need to embed a page from another domain.

This is logical because you shouldn't have any cross-domain access to an embedded page (for example https://americanbank.com/), which can only be guaranteed if the embedded site is captured inside a frame. And since that kind of frame is exactly the iframe, there is no room for alternatives and no reason not to use it.

EDIT: Well, I have to concede there are some tricky ways, but none without accessing a server-side dynamic site via AJAX. So it is somewhat possible, but not with comparable effort.
You can AJAX in external pages in a way that is very similar to an iframe; jQuery makes this really easy, as the sketch below shows.
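A minimal sketch with jQuery's load(); the URL and selector are placeholders:

```javascript
// Pull a fragment of another page into a container, iframe-style.
// Note: unlike an iframe, this is subject to the same-origin policy,
// so truly external domains need CORS or a server-side proxy.
$('#container').load('/external-page.html #content');
```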
I'm reading this article today. To be honest, I'm really interested in point "2. Much of your content is created by a server-side technology such as PHP or ASP.NET".

I want to check whether I have understood it correctly :)

I create a PHP script (gethtmlsnapshot.php) where I include the server-side AJAX page (getdata.php) and escape the parameters (for security). Then I add it at the end of the static HTML page (index-movies.html). Right? Now...

1 - Where do I put that gethtmlsnapshot.php? In other words, I (or rather, the crawler) need to call that page. But if I don't have a link to it on the main page, the crawler can't call it :O How can the crawler call the page with the _escaped_fragment_ parameters? It can't know them if I don't specify them somewhere :)

2 - How can the crawler call that page with the parameters? As before, I need a link to that script with the parameters, so the crawler browses each page and saves the content of the dynamic result.

Can you help me? And what do you think about this technique? Wouldn't it be better if crawler developers built their own bots in some other way? :)

Let me know what you think about it. Cheers
I think you've got something wrong, so I'll try to explain what's going on here, including the background and alternatives, as this is indeed a very important topic that most of us have stumbled upon (or at least something similar) from time to time.
Using AJAX, or rather asynchronous incremental page updating (most pages actually don't use XML but JSON), has enriched the web and provided a great user experience.

It has, however, also come at a price.

The main problem was clients that didn't support the XMLHttpRequest object, or JavaScript at all.
In the beginning you had to provide backwards compatibility.
This was usually done by providing plain links and capturing the onclick event to fire an AJAX call instead of reloading the page (if the client supported it), as in the sketch below.
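A minimal sketch of that fallback pattern in plain JavaScript; the element ids and URL are placeholders:

```javascript
// The link works as a normal navigation without JavaScript;
// with JavaScript we intercept the click and update in place.
var link = document.getElementById('news-link'); // e.g. <a href="/news.html">
link.onclick = function () {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', link.href);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById('content').innerHTML = xhr.responseText;
    }
  };
  xhr.send();
  return false; // suppress the normal page load
};
```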
Today almost every client supports the necessary functions.
So the problem today is search engines, because they don't. Well, that's not entirely true, because they partly do (especially Google), but for other purposes.
Google evaluates certain JavaScript code to prevent Blackhat SEO (for example a link pointing somewhere but with JavaScript opening some completely different webpage... Or html keyword codes that are invisible to the client because they are removed by JavaScript or the other way round).
But to keep it simple, it's best to think of a search engine crawler as a very basic browser with no CSS or JS support (it's similar with CSS: it is partly parsed, for special reasons).
So if you have "AJAX links" on your website, and the Webcrawler doesn't support following them using JavaScript, they just don't get crawled. Or do they?
Well, the answer is that plain JavaScript links (like setting document.location) do get followed; Google is often intelligent enough to guess the target.

But AJAX calls are not made, simply because they return partial content: no meaningful whole page can be constructed from it, as the context is unknown and the URI doesn't uniquely represent the location of the content.
So there are basically 3 strategies to work around that.
have an onclick handler on links with a normal href attribute as fallback (imo the best option, as it solves the problem for clients as well as search engines)
submit the content pages via your sitemap so they get indexed, but completely apart from your site links (usually pages provide a permalink to these URLs so that external pages link them for the PageRank)
use the AJAX crawling scheme
The idea of the crawling scheme is to pair your JavaScript XMLHttpRequest calls with corresponding href attributes that look like this:
www.example.com/ajax.php#!key=value
so the link looks like:

<a href="http://www.example.com/ajax.php#!page=imprint" onclick="return handleajax(this)">go to my imprint</a>
The function handleajax could evaluate the document.location variable (or the link itself) to fire the incremental asynchronous page update; it's also possible to pass an id or URL or whatever. A sketch:
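A minimal sketch of such a handler; the "partial=1" flag and the container id are assumptions, not part of the scheme itself:

```javascript
// Hypothetical handler: derive the #!key=value part from the link
// and fetch the matching partial content asynchronously.
function handleajax(anchor) {
  var fragment = anchor.href.split('#!')[1]; // e.g. "page=imprint"
  var xhr = new XMLHttpRequest();
  // "partial=1" is an assumed flag telling the server to send
  // only the fragment, not the whole page.
  xhr.open('GET', '/ajax.php?' + fragment + '&partial=1');
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById('content').innerHTML = xhr.responseText;
      document.location.hash = '#!' + fragment; // keep the URI addressable
    }
  };
  xhr.send();
  return false; // cancel the default navigation
}
```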
The crawler, however, recognises the AJAX crawling scheme format and automatically fetches http://www.example.com/ajax.php?_escaped_fragment_=page=imprint instead of http://www.example.com/ajax.php#!page=imprint.

The query string then contains the hash fragment, from which you can tell which partial content has been requested.

So you just have to make sure that http://www.example.com/ajax.php?_escaped_fragment_=page=imprint returns a full website that looks exactly as the site should look to the user after the XMLHttpRequest update has been made.
A very elegant solution is also to pass the <a> element itself to the handler function, which then fetches the same URL as the crawler would have fetched, via AJAX but with additional parameters. Your server-side script then decides whether to deliver the whole page or just the partial content.

It's a very creative approach indeed, and here is my personal pro/con analysis:
pro:
partially updated pages receive a unique identifier, at which point they are fully qualified resources in the semantic web
partially updated pages receive a unique identifier that can be presented by search engines
con:
it's just a fallback solution for search engines, not for clients without JavaScript
it provides opportunities for black-hat SEO, so Google for sure won't adopt it fully, or won't rank pages using this technique highly without proper verification of the content.
conclusion:
Ordinary links with working legacy href attributes plus an onclick handler are the better approach, because they also provide functionality for old browsers.

The main advantage of the AJAX crawling scheme is that partially updated pages get a unique URI, and you don't have to create duplicate content that somehow serves as the indexable and linkable counterpart.

You could argue that the AJAX crawling scheme is more consistent and easier to implement. I think that's a question of your application design.
I've read some related articles (like making JavaScript-generated content possible for search engines to index), but what I'd like to know is: is there a simpler option to embed content from another site, without the use of iframes?
What I'd like to achieve in the end is to create some sort of repository for content and serve that to different sites/clients.
For instance (and this is pseudo-coded):
<dl><dt>Date of birth</dt><dd><span src="http://myserver.com/get.aspx?value=dob&userid=102" /></dd></dl>
where the span src is of course not valid or working, but I'd like something similar. First and foremost, it should be "codable" for non-technical users, and second, it should be indexable by search spiders.
Now the question: is there something for this?
EDIT:
The sites that need to "receive" this data aren't mine. As I've said in a comment, Facebook is the worst example I could choose, but the principle remains: I'd like to create one source of information, kept on my server, and let other parties feed from this content, so generic information only needs to be updated once.
Now the question: is there something for this?
Only using a server-side language like PHP, or using Server Side Includes.
The downside to these methods is that the rendering of your page becomes dependent on the remote page's availability and rendering speed. If the remote page goes down, so does yours.
Therefore, some kind of caching should be used when including third-party content server-side (see the sketch below). And it gets complicated then, as well, so it doesn't match the simpler solution you are looking for.
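For illustration only, a rough sketch of that server-side include-with-cache idea, here in Node.js rather than PHP; it assumes Node 18+ (for the global fetch), and everything except the get.aspx URL from the question is hypothetical:

```javascript
// Hypothetical Node.js sketch: fetch the remote fragment, cache it,
// and fall back to the stale copy if the remote site is down.
const http = require('http');

const cache = new Map();  // url -> { body, fetchedAt }
const TTL_MS = 60 * 1000; // refresh the cached copy after one minute

async function getRemote(url) {
  const hit = cache.get(url);
  if (hit && Date.now() - hit.fetchedAt < TTL_MS) return hit.body;
  try {
    const body = await (await fetch(url)).text();
    cache.set(url, { body, fetchedAt: Date.now() });
    return body;
  } catch (err) {
    return hit ? hit.body : ''; // a stale copy beats an empty page
  }
}

http.createServer(async (req, res) => {
  // The get.aspx URL is taken from the question's pseudo-code.
  const dob = await getRemote('http://myserver.com/get.aspx?value=dob&userid=102');
  res.writeHead(200, { 'Content-Type': 'text/html' });
  res.end('<dl><dt>Date of birth</dt><dd>' + dob + '</dd></dl>');
}).listen(8080);
```

Because the content is stitched in server-side, it arrives as ordinary HTML, so it stays indexable; the cache is what keeps your page up when the remote source goes down.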
I know iframes have their disadvantages, but if you can live with them, they are still the simplest way of doing this.
um... How simple are you looking for? I mean... In your example, if you change "SPAN" to "IFRAME" you'll have working code.
I think the real problem is how to get it indexable by search spiders, but that request basically translates to "How can my site get credit for other people's work, with absolutely no effort on my part...."
In light of how AJAX is actually used by most sites today, why is AJAX embraced while frames are still regarded as a bad idea?
AJAX, from where I'm sitting, is a sort of grand tradeoff. You are breaking things in the "document" model of the interwebs so that your site can behave more like an "application." If a site is using AJAX well, it will break the document model in subtle ways that add something of value to the application. The "vote" link isn't really a link, but it gives you a cool animation and updates the question's status asynchronously.
Frames break just as much, if not more, of the document model (bookmarks, scrolling, copy-and-paste, etc) but without as much of the benefit. Frames also insert whatever decorations my OS/Window manager happens to be using, so they look pretty ugly.
AJAX, if done correctly, also degrades more gracefully for people using screen readers, text-based browsers, etc.
The big problems with frames are that it's possible to deep-link to the frames page outside of the frameset, and that bookmarking rarely works as expected. There are of course fixes for all these things, but they simply make an already not-very-nice system even clunkier and more complicated.
Ajax, as I have stated elsewhere, is more about bringing modern JavaScript to the mainstream and making it acceptable again than it is about using the XMLHttpRequest object (which is really what the term AJAX means). Once you have a site on which JavaScript use is accepted and even expected, there's a lot more interesting stuff you can do with it.
With Ajax you can put all your logic in JavaScript code. That way you can create or use a JavaScript library that does not depend on your page. If you use an iframe, you now have to deal with a hidden control, and most of your JavaScript code has to know about the iframe.

Also, search engines work better if the page doesn't have frames.
Ajax gives you more granular control. You can update an individual element in a page, where Frames give you control of blocks that aren't even really in the same document.
Here are two simple answers:
1) Just using the term AJAX is cool and makes your project sound more "Web 2.0". Frames are not sexy. In fact, in web terms, frames are the antithesis of sexy.

2) AJAX is forward-looking, even if used in non-standard or poorly supported ways. It is less likely, IMHO, to break moving forward, compared to frames, which are backward-looking even when used in the same manner.
Ajax and frames are completely different from an accessibility standpoint (they're also completely different full stop).
Frames offer very little positive effect but bring with them a host of negative issues.
Ajax on the other hand makes the user interface more dynamic without compromising usability in most cases.
I imagine there are many of you out there who have developed an application online which automates a lot of processes and saves people at your company time and money.
The question is, what are your experiences with developing that application, having it all set in place, then "spicing" it up with some Ajax, so it makes for a better user experience?
Also, what libraries would you suggest using when adding Ajax to an already-developed web application?
Lastly, what are some common processes you see in web applications that Ajax does well with? For example, auto-populating the search box as you type.
My preferred way of building Ajax-enabled applications is to build it the old-fashioned way where every button, link, etc. posts to the server, and then hijack all those button, link, etc. clicks to the Ajax functionality.
This ensures that my app is compatible with downlevel browsers, which is good; a sketch of the hijacking is below.
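A minimal jQuery sketch of that hijacking pattern; the selector, class name, and target element are placeholders:

```javascript
// Without JavaScript the form posts normally; with it, we hijack
// the submit and send the same data via AJAX instead.
$('form.ajaxify').on('submit', function (event) {
  event.preventDefault(); // cancel the full-page postback
  var form = $(this);
  $.post(form.attr('action'), form.serialize(), function (response) {
    $('#result').html(response); // placeholder target element
  });
});
```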
It doesn't really matter which you use, unless you're trying to do something very specialized.
Here's a good list: http://code.google.com/apis/ajaxlibs/.
Yes, auto-completers are a pretty handy implementation of Ajax. It's also quite useful for data-intensive activities like populating drill-down data.
A lot of what you can do with these libraries isn't Ajax-specific, there is a lot of UI interaction that can benefit the user as well. You can do things like slideshows and lightboxes quite easily with many of these libraries.
Pick the one that you're comfortable with. The syntax they all use is a little different. Give a few a spin and try to build simple examples. Stick with the one you like.
Using ASP.NET Ajax to wrap a few chunks of code is an easy way to get going. But personally I prefer to use jQuery. You can easily add some simple Ajax calls with it to make the UI more responsive without the ASP.NET Ajax overhead.
If you are using ASP.NET to write your applications, adding AJAX using ASP.NET AJAX is very straightforward and in many places will not require you to change any code at all except add two controls to the pages you want to modify.
This works using partial page loads. The controls you have to add (off the top of my head) are called something like

<asp:ScriptManager runat="server" />

and

<asp:UpdatePanel runat="server"> ... </asp:UpdatePanel>
The biggest thing I use AJAX for is lists and search forms. Why? Because of the overhead of loading an entire page when you are going through a list of, let's say, 200 records: it gets frustrating for a user to page through everything otherwise. However, it is important that if you click on a link in the page and then hit the back button, or use a link at the top, you return to the same page of the list you were on.

For search forms, as you fill out the form, I use AJAX queries to return the first few results and a number indicating how many records were returned (see the sketch below).
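A rough sketch of that incremental search in plain JavaScript; the /search endpoint, its JSON shape, and the element ids are assumptions:

```javascript
// As the user types, fetch the first few matches plus a total count.
var box = document.getElementById('search');
box.onkeyup = function () {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/search?q=' + encodeURIComponent(box.value) + '&limit=5');
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      var data = JSON.parse(xhr.responseText); // { total: ..., rows: [...] }
      document.getElementById('count').textContent = data.total + ' records found';
      document.getElementById('results').innerHTML = data.rows
        .map(function (row) { return '<li>' + row + '</li>'; })
        .join('');
    }
  };
  xhr.send();
};
```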
For AJAX frameworks, I use mootools. http://www.mootools.net.
Please ignore if not using ASP.NET. Your platform wasn't clear from your question.
Depending on when you created your web application, your web config file may need some tweaks to use ASP.NET Ajax. The easiest way to see is to create a new web site with the ASP.NET Ajax template and compare the web config, copying over configuration items as needed to bring the old one up to date.
If "spicing it up" is all you're after then develop the fully functional app without AJAX first. From here you can unobtrusively add AJAX functionality and ensure that the app degrades well for non JavaScript-enabled browsers.
I've started using jQuery for JavaScript on my site. It takes away all the worry of cross-browser JavaScript differences: things like class versus className, and getElementById. It also includes some very handy and simple functionality for AJAX postbacks. It's very easy to learn and extremely lightweight when used well.
I've seen some good use of AJAX right here on Stack Overflow, things like the tag selector and the question lookup when you type a question title. I think these simple things work best; we're just adding to the user experience with small additions to functionality that are intuitive, we're not flooding the screen with drag/drop handles etc.
I would differ from the first poster. Adding Ajax isn't always as easy as 1,2,3. It really depends on what you are after.
Adding things such as a colour animation can be made fairly easy, but if you are after things such as auto-populating a text box, that requires extra code. It's not as easy as just adding something client-side: you would also need server-side support to fetch the partial query results.
Going beyond that, it can become even more complex keeping your client-side script in sync with server-side support.
But with the spirit of simplicity in mind, there are libraries, already mentioned, that you can use to 'spice up' a website with animations and other eye candy fairly easily.
I've often had to Ajax-enable old-fashioned ASP.NET 2.0 sites. The easiest way I've found to do that is to create a new Ajax-enabled site and copy and paste the relevant sections of its web.config into your old project's web.config.
Just compare the two and see what's missing in your old one. You'll obviously also need to add references to AjaxExtensions and AjaxControlToolkit.