How to deal with large data sets with jQuery Isotope

I am planning on using the great Isotope plugin to display a list of contacts and then allow them to be filtered. The issue I have is that it works great for a small data set, but I'm not sure of the best way to scale it up to 1000+ pieces of data.

So far the ideas I had were:

  • loading a random subset and then adding nodes to it as filters are clicked to fill in the gaps
  • loading more nodes as a user scrolls
  • paging the results
  • not displaying contacts until enough filters have been selected to bring the numbers below a predefined threshold.

I'm not sure if these will work well, and I was hoping others who have faced this situation could give me some ideas.

Terryl asked 28/1, 2012 at 19:54 Comment(2)
What is the bottleneck you are facing in particular? Transferring the information from server to client? Rendering and animating that many elements on the screen? Simply providing a useful interface? Something else?Existential
It's more about the interface and the best way to organise it. Obviously it needs to be responsive too.Terryl

The situation you describe is pretty common: how to give your user access to more data than they can possibly see in detail at once.

There are several ways to answer the question and the correct answer is completely subjective: it depends on what your user is trying to see or do with the contacts. Before you can really get a satisfactory solution, you need to know what the users are going to use the contacts for.

Just guessing (but you would know better than me!), I'd expect there are two things they're doing:

  • Lookup: Looking for a specific contact and they already know their name/handle.
  • Explore: Looking for a specific contact but they can't quite remember their name/handle. Or they're just browsing.

If you do filtering for all the solutions, then the Lookup goal is pretty much in the bag. The Explore goal is the one you want to design for:

  • Random Subset: It's not a great way to browse since you're basically left with a subset to browse and then you must explicitly filter to see anything new. Hard to filter when you don't know exactly what you're looking for.
  • Infinite Scrolling: seems like a popular solution these days. I find it cumbersome, especially if you are 'infinitely' scrolling through 1000+ contacts. Probably not great for the Explore goal.
  • Paging: Also cumbersome - but perhaps if the paging is tied to alphabetical sorting this could work well.
  • Threshold limiting: so...simply relying on the filtering? This may be bad in some corner cases in which the user applies one filter and they don't see anything because the threshold still isn't met (maybe there are a lot of people with the last name Johnson, which is what you searched for). Plus, I think the ability to browse is important when you don't know what you are looking for.

I think if I were in your shoes, I'd introduce some clustering of the contacts. I doubt that the 1000+ contacts is much of a performance problem (unless you're talking a million!), so the 1000+ is really a user constraint: they just can't view 1000 contacts at once.

I'd suggest introducing some clustering, probably by last name or last name and first name. Then present the user with a way to drill into one cluster but fold up all the other contacts so they aren't immediately visible. Something in the realm of the accordion/Rolodex paradigm. This gives your user the illusion that they are working with 'all the contacts'. Probably introduce a minimum size for each cluster so that if a cluster is sufficiently small you don't bother showing it (i.e., why show a cluster for 2 or 3 or 5 contacts - just show the contacts). As filters are applied, the clusters melt away.
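
A minimal sketch of that drill-down, assuming the standard Isotope options (itemSelector, filter) on a jQuery container; the #contacts and #collapse-all selectors, the .item/.cluster/.contact/.initial-* class names and the data-cluster attribute are illustrative markup choices of mine, not anything Isotope prescribes:

// Assumed markup (illustrative only): one header per cluster plus the contacts, e.g.
//   <div class="item cluster" data-cluster="A">A (112 contacts)</div>
//   <div class="item contact initial-A">Adams, Jane</div>
var $container = jQuery('#contacts');

$container.isotope({
    itemSelector: '.item',
    filter: '.cluster'        // start folded up: only the cluster headers are visible
});

// Clicking a header drills into that cluster: show its contacts,
// keep every other cluster folded up to its header.
$container.on('click', '.cluster', function () {
    var initial = jQuery(this).attr('data-cluster');
    $container.isotope({ filter: '.cluster, .initial-' + initial });
});

// A separate control folds everything back up again.
jQuery('#collapse-all').on('click', function () {
    $container.isotope({ filter: '.cluster' });
});

Re-calling .isotope({ filter: ... }) on an already initialised container is the usual way Isotope re-filters, so the same mechanism can combine the cluster drill-down with whatever other filters you expose.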

Asbestosis answered 13/2, 2012 at 13:47 Comment(0)

Taking the idea of a read-through cache, something like:

  • create a method that can load a batch of up to 100 (or any configurable number) elements. It would:
    • search the cache (a JS array keyed by the element's ID) for the filtered items
    • request the filtered items by AJAX
    • items returned by AJAX would be added to the cache
    • items returned by AJAX would also be added to a "loading" area at the bottom of the DOM (see below), with the IDs of the created DIVs set to the primary keys of the elements
    • the server would send up to 100 elements. If there is no filter, it would send the next elements that have not yet been sent. You would need to keep track of the loaded elements. If the size of the cached data on the server side (i.e. in the session) is critical, you can keep track of only the highest continuously sent ID (e.g. if the first batch sends IDs 1, 2, 3, 6, 9, 10, then the highest continuous ID is 3, so next time you would send from 4 onwards; you keep only one value in the session)
  • create a method that can move the cached DIVs to/from the isotope container
  • on DOM ready, load using the method above and display the first 20 elements in their natural ordering (in your case alphabetically by name). It can be 20 elements or 50 or any...
  • in the background, load all the elements by AJAX in a loop, in batches of 100.

Loading area could be simply:

<html>
  <body>
    <!-- the page stuff -->
    <div id="loader" style='display:none'>
      <!-- all elements are loaded here -->
      <div class="item">...</div>
    </div>
  </body>
</html>

This way you can load all elements step by step in the DOM, and you can display only what is needed.
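
A rough sketch of that flow, assuming a hypothetical /contacts endpoint that takes afterId and limit parameters and returns JSON items with id and html fields; the endpoint, the field names, the #contacts selector and the batch size are all assumptions of mine, only the hidden #loader div and the isotope 'insert' call come from the answer itself:

var cache = {};                          // id -> jQuery element: the client-side read-through cache
var highestSentId = 0;                   // highest id received so far, sent back so the server knows where to resume
var $loader = jQuery('#loader');         // the hidden staging area from the markup above
var $container = jQuery('#contacts');    // assumed isotope container

// Load one batch of up to 100 elements into the cache and the hidden loader div.
function loadBatch(done) {
    jQuery.getJSON('/contacts', { afterId: highestSentId, limit: 100 }, function (items) {
        jQuery.each(items, function (i, item) {
            if (!cache[item.id]) {
                var $el = jQuery(item.html).attr('id', 'contact-' + item.id);
                cache[item.id] = $el;
                $loader.append($el);     // parked in the DOM, not yet handed to isotope
            }
            highestSentId = Math.max(highestSentId, item.id);
        });
        done(items.length);
    });
}

// Move a set of cached DIVs from the loader into the isotope container.
function showCached(ids) {
    var elements = jQuery.map(ids, function (id) {
        return cache[id] ? cache[id].get(0) : null;
    });
    $container.isotope('insert', jQuery(elements));
}

// On DOM ready: display the first 20, then keep loading batches in the background.
jQuery(function () {
    loadBatch(function () {
        showCached(Object.keys(cache).slice(0, 20));
        (function loop() {
            loadBatch(function (count) {
                if (count === 100) { loop(); }   // stop once a batch comes back short
            });
        })();
    });
});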

Cormophyte answered 11/2, 2012 at 14:51 Comment(3)
How many items can I load into the DOM before it becomes an issue for the user/Isotope system?Terryl
I created a test page for that. The two actions are "Shuffle" and "Insert": enter the number of items to insert in the text box, then click Insert. Warning: adding about 1000 takes > 1 min. dev.rochefolle.net/iso/iso.htmlCormophyte
For the DOM you could check it too, but I would say the limit is much higher than for the isotope code. If you display only a limited, filtered number of elements in the isotope container, you might be able to load several 1000s in the DOM. In the test page above, once the 1000 elements are added, the shuffle responds, though not too fluidly (I am running FF 10 on Ubuntu)Cormophyte

I was experiencing poor performance when appending and arranging a large number of Isotope items, and it was because I was adding the items incrementally rather than in a batch. It should have been an obvious choice, but it was something I had overlooked.

Be sure to use an array or list of elements, as opposed to loading or removing individually.

var $container = jQuery('#container'); // assuming Isotope has been initialised on this element
var incomingData = ['<div>a</div>', '<div>b</div>'];
var elements = [];

jQuery.each(incomingData, function (ind, val) {
    var element = jQuery(val).get(0);
    //$container.isotope('insert', element); // resource heavy: one insert and layout per item
    elements.push(element);
});

$container.isotope('insert', elements); // less processing: one insert and layout for the whole batch
Pomeranian answered 11/2, 2015 at 18:30 Comment(0)
