javascript - Performance for appending large elements/datasets to the DOM
I'm appending large numbers of table row elements at a time and hitting major bottlenecks. At the moment I'm using jQuery, but I'm open to a plain JavaScript solution if it gets the job done.
I need to append anywhere from 0-100 table rows at a given time (potentially more, but I'll be paginating at 100).
Right now I'm appending each table row to the DOM individually...
    loop {
        // ...build html string for the row...
        $("#mytable").append(row);
    }
...then fade them all in at once:
$("#mytable tr").fadein();
There are a couple of things to consider here...
1) I'm binding data to each individual table row, which is why I switched from a mass append to appending individual rows in the first place.
2) The fade effect. Although not essential, the application is big on aesthetics and animations (ones that, of course, don't distract from the use of the application). There has to be a way to apply a modest fade effect to larger amounts of data.
(edit) 3) The major reason I'm approaching this in a smaller-chunk/recursive way is that I need to bind specific data to each row. Is binding data the wrong approach? Is there a better way to keep track of the data than binding it to its respective tr? (The sketch below shows roughly what I mean by binding.)
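For context, the binding looks something like this - a simplified sketch, not my exact code. The records array, the "record" key, and the cell markup are placeholders, and the delegated handler assumes jQuery 1.7+ for .on():

    // Hypothetical per-row binding using jQuery's .data() API.
    for (var i = 0; i < records.length; i++) {
        var $row = $("<tr><td>" + records[i].name + "</td></tr>");
        $row.data("record", records[i]);  // attach the object to its <tr>
        $("#mytable").append($row);
    }

    // The bound object can be read back later, e.g. in a delegated handler:
    $("#mytable").on("click", "tr", function () {
        var record = $(this).data("record");
        // ...do something with the record...
    });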
Is it better to apply effects/DOM manipulations in large chunks, or in smaller chunks inside recursive functions?
Are there situations where one is better than the other? If so, what are the indicators for choosing the appropriate method?
Take a look at this post by John Resig, which explains the benefit of using DocumentFragments when doing large additions to the DOM.
A DocumentFragment is a container you can add nodes to without altering the DOM in any way. When you're ready, you add the entire fragment to the DOM, and its contents are placed into the DOM in a single operation.
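As a rough sketch of the idea (plain DOM APIs rather than jQuery; the records array, the cell contents, and the tbody are assumptions, not your actual code):

    // Build all rows inside a detached fragment - no live-DOM work yet.
    var fragment = document.createDocumentFragment();

    for (var i = 0; i < records.length; i++) {
        var tr = document.createElement("tr");
        var td = document.createElement("td");
        td.textContent = records[i].name;  // placeholder cell content
        tr.appendChild(td);
        fragment.appendChild(tr);
    }

    // One insertion (and one reflow) instead of one per row.
    // Assumes the table markup includes a <tbody>.
    document.querySelector("#mytable tbody").appendChild(fragment);

You can still fade the result in as one batch afterwards, e.g. by hiding the rows before the append and then calling $("#mytable tr").fadeIn() once at the end.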
Also, calling $("#mytable") on each iteration is not recommended - look it up once before the loop.
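In other words (a trivial sketch; rows stands in for whatever you're appending):

    var $table = $("#mytable");  // resolve the selector once
    for (var i = 0; i < rows.length; i++) {
        $table.append(rows[i]);  // reuse the cached jQuery object
    }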