Google Chrome, do you use it?

Started by Thorin, February 24, 2010, 12:48:41 PM


Darren Dirt

Quote from: Thorin on February 26, 2010, 10:15:54 AM
What I've learned is that browsers get sent web pages in chunks of data; once all the chunks arrive, the page has been completely downloaded.  Looks like Chrome starts rendering and running JavaScript before all the chunks arrive, causing a lot of re-rendering and re-running of the JavaScript, in turn causing a horrendous slowdown (in my case about 65 times slower).

setTimeout(X, 50) is your friend -- my guess is that if you have that at the very bottom of the page, it will execute after all those "chunks" have actually loaded, and your function can do the populating or field-modifying or whatever magic it does.  (Passing the function reference directly is also cleaner than passing the string "X();".)

You could also hook an onload event that fires X() -- in my experience the end result is the same, but the end-of-document setTimeout() method is more dependable.
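
Something like this, with a made-up X() standing in for your real function (illustrative only, not code from this thread):

  <script type="text/javascript">
    function X() {
      // populate dropdowns, modify fields, etc.
    }

    // Option 1: end-of-document timer -- queues X() to run once the
    // browser finishes the current parse/render pass.
    setTimeout(X, 50);

    // Option 2 (pick one, not both): run X() after the whole page,
    // images included, has finished loading.
    // window.onload = X;
  </script>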

_____________________

Strive for progress. Not perfection.
_____________________

Thorin

My script is already using timers (via the setTimeout() method, not intervals via the setInterval() method - they are different beasts, even though most online tutorials discussing the two treat them as interchangeable).
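
A quick illustration of the difference (throwaway code, not from my script):

  // setTimeout schedules a single call after the delay; it will not
  // repeat unless the callback reschedules itself.
  setTimeout(function () { console.log('fires once'); }, 50);

  // setInterval schedules a repeating call every 50 ms until it is
  // explicitly cancelled with clearInterval().
  var count = 0;
  var id = setInterval(function () {
    console.log('fires repeatedly');
    if (++count === 3) clearInterval(id);  // stop after three runs
  }, 50);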

I've optimized the script to the point where it's still fairly responsive to the user but runs quickly.  setTimeout() injects a lot of unnecessary waiting, so I run through one-tenth of my data-loading loop, then use setTimeout() to let the UI update, then run through the next tenth.  I think they call this "green threading", where you simulate a thread interrupt using a timer.  In Firefox and Internet Explorer, this gives acceptable updates to the user while still running at a decent speed.

More UI updates = longer time to finish loading; fewer UI updates = shorter time to finish loading, but users who think it froze and click refresh.  So I've picked the arbitrary value of one-tenth and used that as my update frequency.  If I'm loading 1,000 dropdown lists, then I'm calling setTimeout() 10 times, once to kick off each batch of 100 data loads.
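
In sketch form (loadDropdown and dropdownData are made-up names, not the real script):

  var BATCHES = 10;  // one-tenth of the work per timer tick

  function loadInBatches(data, start) {
    var batchSize = Math.ceil(data.length / BATCHES);
    var end = Math.min(start + batchSize, data.length);

    // Run one batch synchronously...
    for (var i = start; i < end; i++) {
      loadDropdown(data[i]);  // hypothetical per-item loader
    }

    // ...then yield so the browser can repaint before the next batch.
    if (end < data.length) {
      setTimeout(function () { loadInBatches(data, end); }, 0);
    }
  }

  loadInBatches(dropdownData, 0);  // e.g. the 1,000 dropdown loads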

My first call to setTimeout() obviously has to come when the page has finished loading.  In ASP.NET this is done by registering a startup script, which I do; that script is emitted at the end of the page.  There'll be a few bits after it (the closing script tag, the closing body tag, stuff like that), but the script is clearly in the last chunk of the page.
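
So the tail of the rendered page looks roughly like this (simplified, using the made-up names from the sketch above):

  ...
    <script type="text/javascript">
    //<![CDATA[
    setTimeout(function () { loadInBatches(dropdownData, 0); }, 0);
    //]]>
    </script>
    </form>
  </body>
  </html>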

Thing is, packets don't travel across the internet at exactly the same speed, or even arrive in the same order.  The chunks a web page is sent in are one or more packets in size.  Browsers are supposed to be smart enough not to start rendering until all chunks are completely received (meaning all packets have arrived), but in the latest WebKit, which Chrome and Safari are based on, some bug crept into the rendering/script-running engine that causes it to restart rendering and script execution hundreds of times for operations that use more than a certain amount of memory.  See the bug reports I attached earlier.  If I go back to Chrome 3, my script runs blazingly fast, so it really is a bug in Chrome.
Prayin' for a 20!

gcc thorin.c -pedantic -o Thorin
compile successful

Tom

Quote from: Thorin on February 27, 2010, 02:47:31 PM
Thing is, packets don't travel across the internet at exactly the same speed, or even arrive in the same order.  The chunks a web page is sent in are one or more packets in size.  Browsers are supposed to be smart enough not to start rendering until all chunks are completely received (meaning all packets have arrived), but in the latest WebKit, which Chrome and Safari are based on, some bug crept into the rendering/script-running engine that causes it to restart rendering and script execution hundreds of times for operations that use more than a certain amount of memory.  See the bug reports I attached earlier.  If I go back to Chrome 3, my script runs blazingly fast, so it really is a bug in Chrome.

Well, technically most browsers try to start rendering before all of the content (images and such) arrives. As for packets arriving in odd orders, TCP makes that invisible to the application: the actual HTML content arrives as an ordered stream, and a TCP app will never see packets arrive out of order because the OS has already reassembled the TCP stream. But yes, a browser should try to be smart about when it starts rendering. If there are scripts in the head, most browsers won't do any rendering until the entire page has arrived, since the JS usually depends on the HTML content.
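
If you want to see that guarantee in action, here's a throwaway Node.js sketch -- the 'data' events always hand you bytes in order, however the underlying packets travelled:

  var net = require('net');

  // The OS reassembles the TCP stream, so the application only ever
  // sees an ordered sequence of bytes, never out-of-order packets.
  var socket = net.connect(80, 'example.com', function () {
    socket.write('GET / HTTP/1.0\r\nHost: example.com\r\n\r\n');
  });

  var received = '';
  socket.on('data', function (chunk) {
    received += chunk;  // chunks always arrive in order
  });
  socket.on('end', function () {
    console.log(received.substring(0, 200));  // start of the response
  });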
<Zapata Prime> I smell Stanley... And he smells good!!!