Just In General: JS Only Vs Page-Based Web Apps
When developing a web app, as opposed to a web site, what reasons are there to use multiple HTML pages rather than one HTML page that does everything through JavaScript?

I would expect that it depends on the application -- maybe -- but would appreciate any thoughts on the subject.

Thanks in advance.

EDIT:

Based on the responses here, and some of my own research, if you wanted to do a single-page, fully JS-powered site, some useful tools would seem to include:

jQuery Plug-ins:

jQuery History: http://balupton.com/projects/jquery-history

jQuery Address: http://plugins.jquery.com/project/jquery-address

jQuery Pagination: http://plugins.jquery.com/project/pagination

Frameworks:

Sproutcore http://www.sproutcore.com/

Cappuccino http://cappuccino.org/

Possibly, JMVC: http://www.javascriptmvc.com/

Routinize answered 28/12, 2010 at 21:55 Comment(0)

Page-based applications provide:

  • ability to work on any browser or device
  • simpler programming model

They also provide the following (although these are solvable by many JS frameworks):

  • bookmarkability
  • browser history
  • refresh or F5 to repeat action
  • indexability (in case the application is public and open)
Kandrakandy answered 28/12, 2010 at 22:4 Comment(10)
Thanks. The only things I would question are the bookmarkability and browser history aspects, as I believe you can do, as Flash/Flex apps do, a URL fragment like "www.url.com/#rs=resultset1" and then parse the URL to simulate e.g. a back button.Routinize
Yes. As I mentioned the last 4 items are 100% solvable by frameworks and techniques.Kandrakandy
My mistake. (= What would you say are the best frameworks for addressing some of these issues?Routinize
Or more specifically, is there a framework you know of that advocates an all JS approach?Routinize
@stofac: I've never really used it for anything serious, but try SproutCore.Gifferd
I disagree that page-based applications are a "simpler programming model". Maintaining session state from one page to the next was a constant burden to me until I switched to JS-only sites. Indexability is a minor issue, and fortunately Google helps a lot with Google Ajax Crawling. The other three problems are really the same problem (the relationship between the real state of the page and the URL) and every JS framework will help. For jQuery, see the Address plug-in.Yellowstone
@Malvolio: can you possibly expand on your experience with doing JS-only sites? The benefits you've encountered? Issues to overcome? More than anything, why you made the switch...Would love to hear about it.Routinize
@Malvolio: From my experience working with people on building web applications, the plain page based request/response cycle is cleaner and easier to "grasp" especially in big teams. If you have a look at articles which describe at what lengths the developers go to make JS applications such as gmail and twitter work correctly and efficiently, you'll see what I mean. It's crazy.Kandrakandy
@Cherouvim -- is it seriously your suggestion that I should disregard my experiences of the last four or five years and instead rely on third-hand accounts of what other people supposedly went through years ago?Yellowstone
@Malvolio: I didn't ask you to disregard anything (and bravo for your years of experience). I'm just trying to support my point that usually JS based webapps are more complex to build than regular page based webapps. That's my opinion anyway.Kandrakandy

One of the bigger reasons is going to be how searchable your website is.

Doing everything in JavaScript is going to make it complicated for search engines to crawl all the content of your website, and thus not fully index it. There are ways around this (with Google's recent AJAX SEO guidelines) but I'm not sure all search engines support them yet. On top of that, it's a little more complex than just making separate pages.
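For what it's worth, the scheme those guidelines describe maps "#!" URLs onto an `_escaped_fragment_` query parameter, which the server answers with a plain-HTML snapshot. A rough sketch of the URL mapping a crawler performs (the function name is made up):

```javascript
// Rewrites a "#!" (hash-bang) URL into the "_escaped_fragment_" form a
// crawler would request under Google's Ajax crawling scheme.
function escapedFragmentUrl(hashBangUrl) {
  var i = hashBangUrl.indexOf('#!');
  if (i === -1) return hashBangUrl;            // nothing to rewrite
  var base = hashBangUrl.slice(0, i);
  var fragment = hashBangUrl.slice(i + 2);     // everything after "#!"
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

// escapedFragmentUrl('http://example.com/#!/about')
//   -> 'http://example.com/?_escaped_fragment_=%2Fabout'
```

The server then only has to recognize the `_escaped_fragment_` parameter and return static HTML for that application state.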

The bigger issue, whether you build multiple HTML pages by hand or use some sort of framework or CMS to generate them for you, is that the different sections of your website should have URLs that are unique to them. E.g., an about section would have a URL like mywebsite.com/about, and that URL is used on the actual "about" link within the website.

Fatso answered 28/12, 2010 at 22:2 Comment(6)
Thanks. I strictly mean an application. So while nytimes.com is certainly an application and a web site, an application more strictly speaking would be Google Docs, I think, where SEO does not apply.Routinize
stofac: the barriers are blurred nowadays. For example twitter and 4sq are probably more of an application than a website, but you want them open and searchable. So a google query for site:twitter.com now returns 1.12 billion results. This is good.Kandrakandy
Agreed on that one. If your application is meant to be open and searchable, an all JS approach is not a good one.Routinize
@cherouvim: Twitter is more of a platform than a website/web app. Most Twitter users use desktop clients anyway. The website is only there for convenience--it's not meant to be used as a full-blown web application. I don't think that's a good example.Gifferd
@musicfreak: Whatever twitter is please have a look at engineering.twitter.com/2010/09/tech-behind-new-twittercom.html and see whether the "all on the client side" decision is something simple or complex. Sections "Page Management" and "The Rendering Stack" are relevant.Kandrakandy
@cherouvim: Okay... My point still stands. I never claimed the decision was either simple or complex. I just stated that their biggest concern is their API, and the website is just one way of getting to that API, so having an all-JavaScript website doesn't make sense for them. There just isn't a real advantage.Gifferd

One of the biggest downfalls of single-page, Ajax-ified websites is complexity. What might otherwise be spread across several pages suddenly finds its way into one huge, master page. Also, it can be difficult to coordinate the state of the page (for example, tracking if you are in Edit mode, or Preview mode, etc.) and adjusting the interface to match.

Also, one master page that is heavy on JS can be a performance drag if it has to load multiple, big JS files.

Kiva answered 28/12, 2010 at 22:14 Comment(0)

At the OP's request, I'm going to discuss my experience with JS-only sites. I've written four relevant sites: two JS-heavy (Slide and SpeedDate) and two JS-only (Yazooli and GameCrush). Keep in mind that I'm a JS-only-site bigot, so you're basically reading John Hinckley on the subject of Jodie Foster.

  1. The idea really works. It produces graceful, responsive sites at very low operational cost. My estimate is that the cost for bandwidth, CPU, and such goes to 10% of the cost of running a similar page-based site.
  2. You need fewer but better (or at least, better-trained) programmers. JavaScript is a powerful and elegant language, but it has huge problems that a more rigid and unimaginative language like Java doesn't have. If you have a whole bunch of basically mediocre guys working for you, consider JSP or Ruby instead of JS-only. If you are required to use PHP, just shoot yourself.
  3. You have to keep basic session state in the anchor tag. Users simply expect that the URL represents the state of the site: reload, bookmark, back, forward. jQuery's Address plug-in will do a lot of the work for you.
  4. If SEO is an issue for you, investigate Google Ajax Crawling. Basically, you make a very simple parallel site, just for search engines.
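A minimal sketch of point 3, using plain location.hash rather than the Address plug-in. The parameter names ("view", "id") are invented for illustration:

```javascript
// "#view=edit&id=42" -> { view: "edit", id: "42" }
function parseHashState(hash) {
  var state = {};
  hash.replace(/^#/, '').split('&').forEach(function (pair) {
    if (!pair) return;                          // tolerate an empty hash
    var parts = pair.split('=');
    state[decodeURIComponent(parts[0])] = decodeURIComponent(parts[1] || '');
  });
  return state;
}

// { view: "edit", id: "42" } -> "#view=edit&id=42"
function buildHashState(state) {
  return '#' + Object.keys(state).map(function (k) {
    return encodeURIComponent(k) + '=' + encodeURIComponent(state[k]);
  }).join('&');
}

// In the browser, back/forward and bookmarks then come for free:
//   window.onhashchange = function () {
//     render(parseHashState(location.hash));   // render() is app-specific
//   };
// and state changes are just:
//   location.hash = buildHashState({ view: 'edit', id: '42' });
```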

When would I not use JS-only? If I were producing a site that was almost entirely content, where the user did nothing but navigate from one place to another, never interacting with the site in a complicated manner. So, Wikipedia and ... well, that's about it. A big reference site, with a lot of data for the user to read.

Yellowstone answered 29/12, 2010 at 0:4 Comment(1)
Cool, thanks. I appreciate your points and it is encouraging to see some significant sites built as all-JS, for sure.Routinize

Modularization. Multiple files allow you to more cleanly break out different workflow paths and process parts. Chances are your business rules do not usually directly impact your layout rules, so multiple files make it easier to edit what needs to be edited without the risk of breaking something unrelated.

Scar answered 28/12, 2010 at 22:1 Comment(1)
You could easily have this kind of modularization with a client-side JavaScript framework. You could even have multiple files during development and just compile them all into a single JavaScript-powered web page. (This is sort of what SproutCore does.)Gifferd

I actually just developed my first application using only one page.

...it got messy.

My idea was to create an application that mimicked the desktop environment as much as possible. In particular, I wanted a detailed view of some app data to live in a popup window that would maintain its state regardless of the section of the application the user was in.

Thus my Frankenstein was born.

What ended up happening, due to budget/time constraints, was that the code got out of hand. The various sections of my JavaScript source got muddled together. Maintaining the proper state of the various views I had proved to be... difficult.

With proper planning and technique I think the 'one-page' approach is a very easy way to open up some very interesting possibilities (e.g. widgets that maintain state across application sections). But it also opens up many... many potential problem areas, including:

  • Flooding the global namespace (if you don't already have your own... make one)
  • Code organization can easily get... out of hand
  • Context - It's very easy to

I'm sure there are more...
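For the first bullet, a common remedy is a single global object with everything else kept private inside an immediately-invoked function. A minimal sketch (the name MYAPP is just a placeholder):

```javascript
// One global, everything else hidden inside a closure, so page widgets
// and third-party scripts can't collide with the app's internals.
var MYAPP = (function () {
  var viewState = {};            // private: invisible outside the closure

  function setView(name) {       // private helper
    viewState.current = name;
  }

  return {                       // the only names exposed globally
    openView: function (name) {
      setView(name);
      return viewState.current;
    }
  };
})();

// MYAPP.openView('settings') -> 'settings'; viewState itself is unreachable.
```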

In short, I would urge you to stay away from a hard JavaScript dependency, for the compatibility issues alone. What I've come to realize is that there is simply no need to rely on JavaScript for everything.


I'm actually in the process of removing JavaScript dependencies in favor of Progressive Enhancement. It just makes more sense. You can achieve the same or similar effects with properly coded JavaScript.

The idea is to...

  1. Develop a well-formatted, fully functional application without any JavaScript
  2. Style it
  3. Wrap the whole thing with JavaScript
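A minimal sketch of step 3, assuming a plain HTML form that already submits to the server without JavaScript; serializeForm is a made-up helper shown here so the logic can be exercised outside a browser:

```javascript
// Hypothetical helper: turns { name: value } pairs read from a form's
// inputs into a query string, e.g. { q: 'web apps' } -> 'q=web%20apps'.
function serializeForm(fields) {
  return Object.keys(fields)
    .map(function (k) {
      return encodeURIComponent(k) + '=' + encodeURIComponent(fields[k]);
    })
    .join('&');
}

// In the browser, the enhancement layer intercepts the submit and fetches
// the same URL over AJAX; users without JavaScript still get the normal
// full-page submit (the selector and container ids are invented):
//
// $('#search-form').submit(function (e) {
//   e.preventDefault();
//   $.get(this.action, $(this).serialize(), function (html) {
//     $('#results').html(html);   // same endpoint, enhanced experience
//   });
// });
```

The key property is that the no-JS path stays fully functional; the script only layers "sugar" on top.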

Using Progressive Enhancement, one can develop an application that delivers the best possible experience to the user.

Threat answered 28/12, 2010 at 23:9 Comment(3)
Thanks Derek. I'm really curious whether there are tools and frameworks that might have helped you avoid some of these issues. I have to say I've never heard of Progressive Enhancement but will read up on it.Routinize
Based on a google search - javascript application framework it appears there is. sproutCore is one.Threat
@stofac: Progressive enhancement is basically where you provide a static website and only use JavaScript to enhance the user experience. The idea is that all of the website's functionality should be available without JavaScript, and JavaScript should only be used to add more "sugar" to the site and make it easier to navigate.Gifferd

For some additional arguments, check out The Single Page Interface Manifesto and some (mostly) negative reaction to it on Hacker News (link at the bottom of the SPI page):

The Single Page Interface Manifesto: http://itsnat.sourceforge.net/php/spim/spi_manifesto_en.php

Routinize answered 31/12, 2010 at 17:57 Comment(0)

stofac, first of all, thanks for the link to The Single Page Interface (SPI) Manifesto (I'm the author of that boring text).

That said, SPI != doing everything through JavaScript.

Take a look at this example (server-centric): http://www.innowhere.com/insites/

The same in GAE: http://itsnatsites.appspot.com/

More info about the GAE approach: http://www.theserverside.com/news/thread.tss?thread_id=60270

In my opinion, coding a complex SPI application/web site fully in JavaScript is very complex and problematic. The best approach, I think, is "hybrid programming" for SPI: a mix of server-centric code for big state management and client-centric code (a.k.a. JavaScript by hand) for special effects.

Bonnet answered 1/1, 2011 at 11:10 Comment(0)

Doing everything on a single page using ajax everywhere would break the browser's history/back button functionality and be annoying to the user.

Inscrutable answered 28/12, 2010 at 22:4 Comment(1)
Not really, there are ways to work around that, such as the hash tags (#) in URLs. "annoying to the user" is subjective, but I've never heard of anyone complaining that Gmail is "annoying" because of its use of JavaScript.Gifferd

I utterly despise JS-only sites where they are not needed. That extra condition makes all the difference. By way of example, consider the oft-quoted Google Docs: in that case JS not only improves the experience, it is essential. But some parts of Google Help have been JS-only, and there it adds nothing to the experience; it is only showing static content.

Here are reasons for my upset:

  • Like many, I am a user of NoScript and love it. Pages load faster, I feel safer and the more distracting adverts are avoided. The last point may seem like a bad thing for webmasters but I don't want anyone to get rewarded for pushing annoying flashy things in my face, if tactless advertisers go out of business I consider it natural selection.
    Obviously this means some visitors to your site are either going to be turned away or feel hassled by the need to provide a temporary exclusion. This reduces your audience.
  • You are duplicating effort. The browser already has a perfectly good history function, and you shouldn't need to reinvent the wheel by redrawing the previous page when the back button is clicked. To make matters worse, going back a page shouldn't require re-rendering at all. I guess I am a student of the If-it-ain't-broke-don't-fix-it School (from Don't-Repeat-Yourself U.).
  • There are no HTTP headers when traversing "pages" in JS. This means no cache controls, no expiries, content cannot be adjusted for requested language nor location, no meaningful "page not found" nor "unavailable" responses. You could write error handling routines within your uber-page that respond to failed AJAX fetches but that is more complexity and reinvention, it is redundant.
  • No caching is a big deal for me, without it proxies cannot work efficiently and caching has the greatest of all load reducing effects. Again, you could mimic some caching in your JS app but that is yet more complexity and redundancy, higher memory usage and poorer user experience overall.
  • Initial load times are greater. By loading so much Javascript on the first visit you are causing a longer delay.
  • More JavaScript complexity means more debugging in various browsers. Server-side processing means debugging only once.
  • Unfuddle (a bug-tracker) left a bad taste. One of my most unpleasant web experiences was being forced to use this service by an employer. On the surface it seems well suited; the JS-heavy section is private so doesn't need to worry about search engines, only repeat visitors will be using it so have time to turn off protections and shouldn't mind the initial JS library load.
    But its use of JS is pointless: most of the content is static. "Pages" were still being fetched (via AJAX) so the delay is the same. With the benefit of AJAX it should be polling in the background to check for changes, but I wouldn't get notified when the visible page had been modified. Sections had different styles, so there was an awkward re-rendering when traversing them; loading external stylesheets by JavaScript is Bad Practice™. Ease of use was sacrificed for whizz-bang "look at our Web 2.0" features. Such a business-orientated application should concentrate on speed of retrieval, but it ended up slower.
    Eventually I had to refuse to use it as it was disrupting the team's work flow. This is not good for client-vendor relationships.
  • Dynamic pages are harder to save for offline use. Some mobile users like to download in advance and turn off their connection to save power and data usage.
  • Dynamic pages are harder for screen readers to parse. While the number of blind users is probably smaller than the number with NoScript or a mobile connection, it is inexcusable to ignore accessibility - and in some countries even illegal; see the "Disability Discrimination Act" (1999) and "Equality Act" (2010).

As mentioned in other answers, "Progressive Enhancement", née "Unobtrusive JavaScript", is the better approach. When I am required to make a JS-only site (remember, I don't object to it on principle, and there are times when it is valid) I look forward to implementing the aforementioned AJAX crawling and hope it becomes more standardised in future.

Mcripley answered 29/12, 2010 at 13:55 Comment(3)
Wow, everything about that answer is wrong, even the ands and the thes. HTML was developed to display data, not render applications, and yes, it is possible, by banging on it extensively, to make it look as if it's an app, but only by a series of hacks. The remarks about caching are particularly misguided -- it's much easier to cache JS sites, since semi-permanent things (like files of the JS itself) can be completely split off from transient things like the user's identity. As for being "browser-independent", let me just say, mwa-ha-ha-ha. Mwah-ha-ha. Ha.Yellowstone
I agree with @Malvolio, half of what was said in this answer is simply wrong. I also want to rebut the last point: why would you need a screen reader for a web application? I guess it would make sense for some, but it doesn't really make sense to support screen readers for an image editor, for example.Gifferd
In fairness, Señor Clockwork did say "where not needed" (and even put it in boldface), but that really isn't the point. Everything is unneeded where it isn't needed. My complaint is that his criticisms are factually wrong.Yellowstone

© 2022 - 2024 — McMap. All rights reserved.