Cached, PHP generated Thumbnails load slowly
Question Part A (100-point bounty, awarded)
The main question was how to make this site load faster. First we needed to read these waterfalls. Thanks to everyone for your suggestions on the waterfall analysis. Evident from the various waterfall graphs shown here is the main bottleneck: the PHP-generated thumbnails. The protocol-less jQuery loading from a CDN, advised by David, got my bounty, although it made my site only 3% faster overall and did not address the site's main bottleneck. Time for a clarification of my question, and another bounty:

Question Part B (100-point bounty, awarded)
The new focus was to solve the problem with the 6 JPG images that cause most of the loading delay. These 6 images are PHP-generated thumbnails, tiny at only 3~5 KB, but they load relatively very slowly. Notice the "time to first byte" on the various graphs. The problem remained unsolved, but a bounty went to James, who fixed the header error that RedBot flagged: "An If-Modified-Since conditional request returned the full content unchanged."

Question Part C (my last bounty: 250 points)
Unfortunately, even after the RedBot.org header error was fixed, the delay caused by the PHP-generated images remained untouched. What on earth are these tiny puny 3~5 KB thumbnails doing? All that header time could send a rocket to the moon and back. Any suggestions on this bottleneck are much appreciated and will be treated as a possible answer, since I have been stuck on this bottleneck for seven months now.

[Some background info on my site: CSS is at the top, JS at the bottom (jQuery, jQuery UI, purchased menu engines awm/menu.js, a tabs JS engine, and the video swfobject.js). The black lines on the second image show what initiates what to load. The angry robot is my pet "ZAM". He is harmless and often happier.]


Load Waterfall: Chronological | http://webpagetest.org


Parallel Domains Grouped | http://webpagetest.org


Site-Perf Waterfall | http://site-perf.com


Pingdom Tools Waterfall | http://tools.pingdom.com



GTmetrix Waterfall | http://gtmetrix.com



Louanneloucks answered 26/1, 2011 at 22:29 Comment(16)
I think most browsers only make 20 connections at a time, so after 20 the first one has to finish before the next starts, hence the slowdown after 20. – Zaidazailer
I think you forgot to redact the first instance of your domain. At least you got the rest of them though :D – Cruet
@Jakub: If that relieves you: not all Dutch people have to sit, stare and wait for their laundry. – Locket
Can't you combine some of those images into sprites? – Locket
@Dagon, be aware that the HTTP 1.1 RFC asks (SHOULD) HTTP 1.1 clients to use at most 2 connections to HTTP 1.1 servers; HTTP 1.0 of course is much more open. – Anaximenes
@Dagon browsers will also only make 2 concurrent connections to any given domain. – Rigi
@Louanneloucks You need to work out why your time to first byte is so long. If that were shorter towards the top of your load waterfall, the whole thing would, as you put it, look more vertical. – Rigi
Agreed, but just HOW can I find out what's causing those prolonged time-to-first-byte delays? – Louanneloucks
@Sarnold, are you saying that an HTTP 1.0 connection can be faster (more open, more connections) than HTTP 1.1? – Louanneloucks
@Sam, yes, the HTTP 1.0 specification doesn't say how many simultaneous connections a client may establish with a server; since each request uses a new connection (section 1.3 of RFC 1945), it makes sense to use more connections, as the latency of setting up connections can be 'hidden' behind more connections. HTTP 1.1 re-uses connections, so that extra latency is gone, and the specification limits the number of simultaneous connections because they are much less useful with HTTP 1.1. Deciding which one is 'faster' depends too much on specifics. :) – Anaximenes
What code are you using to generate the thumbnails? – Te
@James, I bought some code a while ago from CodeCanyon, a seemingly neatly coded PHP file that makes a thumbnail out of <img src="thmbgen.php?src=bigimage.jpg&w=100&h=100">, where my .htaccess allows neater URLs to reach the same image, e.g. <img src="IMG-bigimage_w100_h100.jpg">. I tested with and without Apache and both had the same delay! Do you reckon anything strange besides this? I find the graphs bizarre myself too! – Louanneloucks
@Sam, well, this is only a guess, but from your RedBot.org image it seems that the thumbnail images have not been compressed using gzip and are returning a 200 status code. It's possible the code you have isn't taking the If-Modified-Since header into account and is still reprocessing the image, which is quite an intensive, time-consuming task. That could explain the long connection time. – Te
@James, thanks for the suggestion, but are you suggesting that JPEG images should be gzipped? How do I implement If-Modified-Since correctly: what line of code goes into the PHP thumbnail generator when it is creating the file? I think I have done this correctly, but apparently something fishy is going on. Looking forward to your response. – Louanneloucks
For the love of layout, please post your suggestions as answers! It is very difficult for me to read and comment on your suggestions this way. Thanks. – Louanneloucks
You are welcome. And ah, I was going to suggest server-side caching using Apache mod_cache, but well, you are on shared hosting :( – Armandoarmature

First, using those multiple domains requires several DNS lookups. You'd be better off combining many of those images into a sprite instead of spreading the requests.

Second, when I load your page, I see most of the blocking (~1.25 s) on all.js. I see that it begins with (an old version of) jQuery. You should reference that from the Google CDN instead, to not only decrease load time but potentially avoid an HTTP request for it entirely.

Specifically, the most current jQuery and jQuery UI libraries can be referenced at these URLs (see this post if you're interested why I omitted the http:):

//ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js

//ajax.googleapis.com/ajax/libs/jqueryui/1.8.9/jquery-ui.min.js
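For instance (a sketch; protocol-relative URLs inherit http: or https: from the page itself), the script tags would look like this:

```html
<!-- The browser resolves these against the page's own protocol -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/jqueryui/1.8.9/jquery-ui.min.js"></script>
```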

If you're using one of the default jQuery UI themes, you can also pull its CSS and images off the Google CDN.

With the jQuery hosting optimized, you should also combine awmlib2.js and tooltiplib.js into a single file.

If you address those things, you should see a significant improvement.

Bolide answered 27/1, 2011 at 9:12 Comment(5)
Excellent comment, Dave! The old 1.3 jQuery was much smaller, so I thought that while it's working, it might be faster. But I like your recommendations: which of the Google CDN links do you suggest I use for my jQuery? Can I use the jQuery UI JavaScript the same way? +1, thanks very much – Louanneloucks
I definitely recommend using the latest version of jQuery (1.4.4 currently). When minified and gzipped, there are only a few bytes of difference between them. I've updated the answer with a couple of links to the latest jQuery and jQuery UI versions on the Google CDN, which I would recommend using. – Bolide
Good tip with the sprite; that should reduce the number of open connections to the server – Lamarre
Currently working on reducing the open connections (went from 40 or so to 30 or so)... the final push is the most difficult, as some of the images are repeating backgrounds and cannot go into a sprite (or can they?) – Louanneloucks
Update: Page Speed grade: 96%, YSlow grade: 90% ... and still the thumbnails are as slow as ever! – Louanneloucks

I had a similar problem a few days ago and found head.js. It's a JavaScript plugin which allows you to load all JS files in parallel. Hope that helps.

Relucent answered 30/1, 2011 at 21:44 Comment(2)
Incredible! How could I have overlooked that? +1 I'm going to test this one now. Smells like a fruitful night. Thanks Schattenbaum! – Louanneloucks
May I ask whether you are the Schattenbaum from schattenbaum.net? – Neoplasm

I am far from an expert but...

In regards to this: "An If-Modified-Since conditional request returned the full content unchanged." and my comments.

The code used to generate the thumbnails should check the following:

  1. Is there a cached version of the thumbnail?
  2. Is the cached version newer than the original image?

If either of these is false, the thumbnail should be generated and returned no matter what. If both are true, then the following checks should be made:

  1. Is there an HTTP_IF_MODIFIED_SINCE header?
  2. Is the cached version's last-modified time the same as the HTTP_IF_MODIFIED_SINCE value?

If either of these is false, the cached thumbnail should be returned.

If both of these are true, then a 304 HTTP status should be returned. I'm not sure if it's required, but I also personally return the Cache-Control, Expires and Last-Modified headers along with the 304.

In regards to GZipping, I've been informed that there is no need to GZip images so ignore that part of my comment.

Edit: I didn't notice your addition to your post.

session_cache_limiter('public');
header("Content-type: " . $this->_mime);
header("Expires: " . gmdate("D, d M Y H:i:s", time() + 2419200) . " GMT");
// I'm sure Last-Modified should be a static value, not dynamic as you have it here.
header("Last-Modified: " . gmdate("D, d M Y H:i:s",time() - 404800000) . " GMT");

I'm also sure that your code needs to check for the HTTP_IF_MODIFIED_SINCE header and react to it. Just setting these headers and your .htaccess file won't provide the required result.

I think you need something like this:

$date = 'D, d M Y H:i:s T'; // DATE_RFC850
$modified = filemtime($filename);
$expires = strtotime('+1 year');

// Use gmdate() rather than date() so the header dates are in GMT, as HTTP requires.
header(sprintf('Cache-Control: %s, max-age=%s', 'public', $expires - time()));
header(sprintf('Expires: %s', gmdate($date, $expires)));
header(sprintf('Last-Modified: %s', gmdate($date, $modified)));
header(sprintf('Content-Type: %s', $mime));

if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])) {
    if (strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) === $modified) {
        header('HTTP/1.1 304 Not Modified', true, 304);
        // After sending the 304, end the script so that no content is returned.
        exit();
    }
}
// Render image data
Te answered 9/2, 2011 at 13:40 Comment(5)
James, you nailed the essence of the problem after the edit in your answer! The If-Modified-Since issue seems to work now! Yet the long header/waiting time for the tiny thumbs isn't solved yet... – Louanneloucks
@Te PS: RedBot.org says that your Expires header has an incorrect value. I think it has to be GMT and not CET? – Louanneloucks
@Louanneloucks Sorry, my server is in the UK so it generates GMT dates automatically. Just use the PHP function gmdate instead of date. This should produce a GMT date relative to your server time. – Te
@Sam, your waiting time is the script execution time. It's either taking a long time to get through your code to the point where you're sending your headers, or you're not exiting after you have sent your headers. – Te
@James, I see... But apart from that PHP thumbnail generator, there are a whole lot of other equally lengthy scripts that do various other stuff (translations, menu loading etc.), all in a fraction of the time... THEY don't seem to be bottlenecked at all... does that point the problem at the thumbnail generator PHP ONLY? – Louanneloucks

Wow, it's hard to explain things using that image... But here are some tries:

  • files 33-36 load that late because they are dynamically loaded within the SWF, and the SWF (25) is loaded completely before it loads any additional content
  • files 20 & 21 are maybe (I don't know, because I don't know your code) libraries that are loaded by all.js (11), but for 11 to execute, it waits for the whole page (and assets) to load (you should change that to domready)
  • files 22-32 are loaded by those two libraries, again only after those are completely loaded
Deianira answered 26/1, 2011 at 22:39 Comment(2)
Interesting point. I guess there is nothing around the SWF... How can I change that to domready? I have a hunch what you mean: it's about when the JavaScript is ready and tells on document ready this or that? Should that document.ready be replaced by dom.ready? – Louanneloucks
@Louanneloucks If you're using client-side caching (and you should be), you can load the resources used by the SWF in JS or hidden divs on your page, so that when the SWF requests them they are already at the client. – Rigi

Just a simple guess, because this kind of analysis requires a lot of A/B testing: your .ch domain seems to be hard to reach (long green bands before the first byte arrives).

This would mean that either the .ch website is poorly hosted or that your ISP does not have a good route to it.

Given the diagrams, this could explain a big performance hit.

On a side note, there is a cool tool, Cuzillion, that could help you sort things out depending on your ordering of resource loading.

Okhotsk answered 26/1, 2011 at 22:41 Comment(0)

Try running Y!Slow and Page Speed tests on your site/page, and follow the guidelines to sort out possible performance bottlenecks. You should be getting huge performance gains once you score higher in Y!Slow or Page Speed.

These tests will tell you what's wrong and what to change.

Cimex answered 27/1, 2011 at 8:55 Comment(2)
Thanks! Scores are: 92 on Page Speed and 93 on YSlow. What's missing: Keep-Alive = off, and not using a CDN. – Louanneloucks
UPDATE: currently 96 and 90 respectively – Louanneloucks

So your PHP script is generating the thumbnails on every page load? First off, if the images being thumbnailed are not changing that often, could you set up a cache so that they don't have to be regenerated each time the page loads? Secondly, is your PHP script using something like imagecopyresampled() to create the thumbnails? That's a non-trivial downsample, and the PHP script won't return anything until it's done shrinking things down. Using imagecopyresized() instead will reduce the quality of the image, but speed up the process. And how much of a reduction are you doing? Are these thumbnails 5% the size of the original image or 50%? A larger original image likely leads to a slowdown, since the PHP script has to load the original image into memory before it can shrink it and output a smaller thumbnail.
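A minimal sketch of that caching idea (the paths and the helper make_thumbnail() are hypothetical; the purchased script's internals are unknown):

```php
<?php
// Hypothetical cache check: the expensive GD work only runs when the
// source image is newer than the cached thumbnail (or no cache exists).
$src   = 'images/bigimage.jpg';
$cache = 'cache/bigimage_w100_h100.jpg';

if (!is_file($cache) || filemtime($src) > filemtime($cache)) {
    make_thumbnail($src, $cache, 100, 100); // assumed generator function
}

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($cache));
readfile($cache); // stream the cached file; no image processing on this path
exit;
```

On every request after the first, this path only stats two files and streams one, so the TTFB should be dominated by PHP startup rather than image work.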

Agretha answered 7/2, 2011 at 20:31 Comment(4)
Thanks MidnightLightning! There is a cache folder where thumbnail JPGs are created and reused from, though I have the feeling the problem lies in here, in the script that I bought (which seems to work fine for others) – Louanneloucks
If the thumbnails are cached, make sure the script that's pulling them from cache is using readfile() rather than file_get_contents() followed by an echo, which waits to output anything until the whole file has been moved into the PHP script's memory. – Agretha
Better yet, if the files are cached, generate the HTML in a way that pulls the cached image directly from disk without going through PHP. That's what I do in my scripts for videodb.net – Monopetalous
"There is a cache folder where..." and how quickly are they dereferenced? Does your URL point directly to the cached file or to a PHP script? Do you redirect or use readfile()? Does the same PHP script contain the thumbnail generation code, or do you defer loading the bulk of the code using include/require? – Venator

I've found the URL of your website and checked an individual jpg file from the homepage. While the loading time is reasonable now (161ms), it's waiting for 126ms, which is far too much.

Your last-modified headers are all set to Sat, 01 Jan 2011 12:00:00 GMT, which looks too "round" to be the real date of generation ;-)

Since Cache-Control is "public, max-age=14515200", arbitrary Last-Modified headers could cause problems after 168 days.

Anyway, this is not the real reason for delays.

You have to check what your thumbnail generator does when the thumbnail already exists, and what could consume so much time checking and delivering the picture.

You could install xdebug to profile the script and see where the bottlenecks are.

Maybe the whole thing uses a framework or connects to some database for nothing. I've seen very slow mysql_connect() on some servers, mostly because they were connecting using TCP and not socket, sometimes with some DNS issues.

I understand you can't post your paid generator here but I'm afraid there are too many possible issues...

Ledezma answered 23/2, 2011 at 23:14 Comment(5)
Thanks for your detective work & spot-on clues, Capsule! First things first: there's no database. Your findings are the same as mine: it's waiting 90% of the time for... crazy little thumbs? Interesting thoughts on the Last-Modified headers, because according to James's post here, I had to set those Last-Modified headers to a STATIC (fixed) time, not a dynamic/always-changing time set by PHP gmdate generators. Or perhaps you mean something else here? (Nominated for bounty) – Louanneloucks
To be perfect, it should reflect the real generation date, for example by getting the filemtime() of the cached thumbnail. What would be interesting to test is accessing an empty PHP file, or a PHP file just echoing "test", and seeing how much waiting you get on that one. Maybe the whole server is just slow and impacts every single PHP script, whatever it does. – Ledezma
I'm also seeing a relatively long delay on pure static files (for example the images linked to the thumbs), like 36 ms. On one of the servers I administer (which is not a beast... dual core with 2 GB of RAM), I get almost half that, like 20 ms on static files. – Ledezma
Interesting... 1. What software/online tool do you use to measure? 2. Are your faster 20 ms measurements consistent (by how much, ± xx%, do you find your results vary)? In my case it really varies a lot depending on what test tool I use. Some are very consistent (gtmetrix.com), some vary a lot (pingdom.com), and it's difficult to give times in XX ms since they change every time... – Louanneloucks
I'm using Firebug's Net tab. 20 ms is the fastest timing I'm getting. It varies between 20 and 28. Of course, the 36 ms I measured on your server was the fastest too. – Ledezma

If there isn't a really good reason (usually there isn't) your images shouldn't invoke the PHP interpreter.

Create a rewrite rule for your web server that serves the image directly if it is found on the file system. If it's not, redirect to your PHP script to generate the image. When you edit an image, change the image's filename to force users that have a cached version to fetch the newly edited image.

If it doesn't work, at least you will know it has nothing to do with the way the images are created and checked.
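A sketch of such a rule in .htaccess (the URL pattern loosely follows the IMG-bigimage_w100_h100.jpg example from the question's comments; paths and parameter names are illustrative, not the actual setup):

```apache
RewriteEngine On
# If the requested thumbnail already exists on disk, this rule doesn't
# match and Apache serves the static file directly, never starting PHP.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^IMG-(.+)_w([0-9]+)_h([0-9]+)\.jpg$ thmbgen.php?src=$1.jpg&w=$2&h=$3 [L]
```

With a rule like this, only the very first request for a given size pays the PHP generation cost (assuming the generator writes its output to that exact path); every later request is answered by Apache alone.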

Thursday answered 24/2, 2011 at 21:27 Comment(1)
Thanks Goran, however this is not the elegant solution I wished for: I think there is something fishy in my case, and that normally it really doesn't take this long for a PHP script to decide whether to send a 304 header or to bake the image, etc. Thanks anyway for your suggestion, as it approaches the problem from an entirely new perspective! Which is valuable by itself. +1 – Louanneloucks

Investigate PHP's usage of session data. Maybe (just maybe), the image-generating PHP script is waiting to get a lock on the session data, which is locked by the still-rendering main page or other image-rendering scripts. This would make all the JavaScript/browser optimizations almost irrelevant, since the browser's waiting for the server.

PHP locks the session data for every script running, from the moment the session handling starts, to the moment the script finishes, or when session_write_close() is called. This effectively serializes things. Check out the PHP page on sessions, especially the comments, like this one.
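A minimal sketch of releasing that lock early (assuming the generator script starts a session at all, which is a guess; the session key is a placeholder):

```php
<?php
session_start(); // PHP acquires the session-file lock here

// Read whatever session data the script actually needs, up front.
$user = isset($_SESSION['user']) ? $_SESSION['user'] : null;

// Release the lock before the slow image work, so other requests sharing
// this session (e.g. the other 5 thumbnails) aren't serialized behind it.
session_write_close();

// ... expensive thumbnail generation / output happens below ...
```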

Cancellate answered 11/2, 2011 at 23:31 Comment(2)
Thanks for the suggestion, Ricardo! It seems Alix is suggesting the same as you (right?). In practical terms, what do you suggest I add to or remove from the code, then test the graphs again and report back? Much appreciated. – Louanneloucks
Yes, I think so. I suggest you change the image-generating scripts so that they do not depend on $_SESSION data or similar (maybe they don't already). Then use session_write_close() as soon as possible, or, even better, avoid using sessions at all in those scripts. Check out php.net/manual/en/function.session-write-close.php – Cancellate

This is just a wild guess since I haven't looked at your code but I suspect sessions may be playing a role here, the following is from the PHP Manual entry on session_write_close():

Session data is usually stored after your script terminated without the need to call session_write_close(), but as session data is locked to prevent concurrent writes only one script may operate on a session at any time. When using framesets together with sessions you will experience the frames loading one by one due to this locking. You can reduce the time needed to load all the frames by ending the session as soon as all changes to session variables are done.

Like I said, I don't know what your code is doing but those graphs seem oddly suspicious. I had a similar issue when I coded a multipart file serving function and I had the same problem. When serving a large file I couldn't get the multipart functionality to work nor could I open another page until the download was completed. Calling session_write_close() fixed both my problems.

Destructible answered 4/3, 2011 at 21:38 Comment(2)
Thanks Alix for your suggestion. A question: is the exit() function along similar lines as session_write_close()? Currently, the original writer of the code is investigating the issue, but alas, it seems he is also a bit in the dark, since his generous update of the code with better If-Modified-Since handling seems to have the same delays (new waterfall tests produced the same graphs, although real-world results looked/felt like faster loading)! It's a very weird issue... – Louanneloucks
@Sam: I can't give you any sources right now, but I believe exit() first calls any destructors and/or functions registered for shutdown, and only then is the session closed. Anyhow, I bet your problem lies before your exit() call. See also: #1674814 – Destructible

Have you tried replacing the PHP-generated thumbnails with regular images to see if there is any difference? The problem could be:

  • a bug in your PHP code leading to a regeneration of the thumbnail upon each server invocation
  • a delay in your code (sleep()?) associated with a clock problem
  • a hard drive issue causing a very bad race condition, since all the thumbnails get loaded/generated at the same time

Okhotsk answered 25/2, 2011 at 2:13 Comment(1)
Something I at some point thought of trying. +1 for reading my thoughts and naming the first thing I already did. What I was HOPING was to find that normal images would also load slowly, so that it could be the download bandwidth or something physically limiting, but I found instead that normal static dump images (I saved the generated thumbs and uploaded them as static files) loaded EXTREMELY fast. So it's got to be the thumbnail generator PHP! – Louanneloucks

I think that instead of using that thumbnail-generator script, you should give tinySrc a try for rapid, cloud-hosted thumbnail generation. It has a very simple and easy-to-use API, used like:

http://i.tinysrc.mobi/ [height] / [width] /http://domain.tld/path_to_img.jpg

[width] (optional): a width in pixels (which overrides the adaptive or family sizing). If prefixed with '-' or 'x', it will subtract from, or shrink to a percentage of, the determined size.

[height] (optional): a height in pixels, if width is also present. It also overrides adaptive or family sizing and can be prefixed with '-' or 'x'.

You can check the API summary here


FAQ

What does tinySrc cost me?

Nothing.

When can I start using tinySrc?

Now.

How reliable is the service?

We make no guarantees about the tinySrc service. However, it runs on a major, distributed cloud infrastructure, so it provides high availability worldwide. It should be sufficient for all your needs.

How fast is it?

tinySrc caches resized images in memory and in our datastore for up to 24 hours, and it will not fetch your original image each time. This makes the services blazingly fast from the user’s perspective. (And reduces your server load as a nice side-effect.)


Good luck. Just a suggestion, since you aren't showing us the code :p

Armandoarmature answered 4/3, 2011 at 20:43 Comment(0)

As some browsers only make 2 parallel downloads per domain, could you not add additional domains to shard the requests over two or three different hostnames? e.g. 1.imagecdn.com, 2.imagecdn.com

Euchology answered 11/3, 2011 at 16:32 Comment(1)
+1 for your suggestion: thank you, but if you look closely at my (admittedly very chaotic) drawings you will see that some items come from .......es, some from ........com, some from ..........de. BUT perhaps that does not do the trick as well as your suggestion does? (I see you suggest subdomains instead of just different domains.) – Louanneloucks

First of all, you need to handle If-Modified-Since requests and such appropriately, as James said. That error states: "When I ask your server whether that image has been modified since last time, it sends the whole image instead of a simple yes/no."

The time between the connection and the first byte is generally the time your PHP script takes to run. It is apparent that something is happening when that script starts to run.

  1. Have you considered profiling it? It may have some issues.
  2. Combined with the above issue, your script may be running many more times than needed. Ideally, it should generate thumbs only if the original image is modified and send cached thumbs for every other request. Have you checked that the script is generating the images unnecessarily (e.g. for each request)?

Generating proper headers through the application is a bit tricky, plus they may get overwritten by the server. And you are exposed to abuse, as anyone sending no-cache request headers will cause your thumbnail generator to run continuously (and raise load). So, if possible, try to save the generated thumbs, call the saved images directly from your pages, and manage the headers from .htaccess. In that case, you wouldn't even need anything in your .htaccess if your server is configured properly.

Other than these, you can apply some of the bright optimization ideas from the performance parts of this overall nice SO question on how to do websites the right way, like splitting your resources into cookieless subdomains, etc. But at any rate, a 3 KB image shouldn't take a second to load; this is apparent compared to the other items in the graphs. You should try to spot the problem before optimizing.

Teeterboard answered 10/2, 2011 at 9:49 Comment(2)
-1: Responding to a conditional request with 'Not Modified' and no revised expiration time will make your site slower in 99.9% of cases (BTW, AFAIK, there is no way to get Apache to issue revised caching info with a 304 response) – Venator
And what does that have to do with my answer? – Trivet

Have you tried setting up several subdomains under the NGINX web server, specifically for serving static data like images and stylesheets? Something helpful can already be found in this topic.

Vetiver answered 22/2, 2011 at 0:49 Comment(4)
Thanks! After some research it seems, however, that setting up subdomains to serve static content cookie-free only makes a site faster when there are many images, at the cost of a little extra overhead. In my case I bet the 6 images will not load faster than the sub/extra domain's overhead costs. Right? – Louanneloucks
Nginx supports the sendfile syscall, which can send files straight from the HDD; please see the following doc, wiki.nginx.org/HttpCoreModule, on the directives 'sendfile' and 'aio'. This web server serves static files like images a lot faster than Apache. – Vetiver
Interesting... I did not know there could be anything better than Apache. By the way, what do you mean by straight from the HDD? Do you mean instead straight from DDR3 RAM / straight from a solid-state disk? I do know that hard disks, unlike DDR3 RAM or solid-state disks, have a very slow access time. But I feel this is not the bottleneck here... – Louanneloucks
The point is that nginx doesn't buffer static data output, as Apache does. – Vetiver

Regarding the delayed thumbnails, try putting a call to flush() immediately after the last call to header() in your thumbnail generation script. Once done, regenerate your waterfall graph and see if the delay is now on the body instead of the headers. If so you need to take a long look at the logic that generates and/or outputs the image data.

The script that handles the thumbnails should hopefully use some sort of caching so that whatever actions it takes on the images you're serving will only happen when absolutely necessary. It looks like some expensive operation is taking place every time you serve the thumbnails which is delaying any output (including the headers) from the script.
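Concretely, the diagnostic would look something like this in the generator (a sketch; the $cached path and header values are placeholders, not the actual script's code):

```php
<?php
$cached = 'cache/thumb.jpg'; // placeholder path to the cached thumbnail

header('Content-Type: image/jpeg');
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime($cached)) . ' GMT');
flush(); // push the headers out to the client immediately

// If the waterfall delay now shows up in the body ("receiving") instead of
// before the headers ("waiting"), the slow work is below this line.
readfile($cached);
```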

Learned answered 28/2, 2011 at 19:11 Comment(6)
+1 Exciting guess, gonna try it out now! Will report back when I've got the new waterfall flowing... – Louanneloucks
Unfortunately, after adding flush(); right after the headers, there seems to be no change at all! What could that mean? – Louanneloucks
Not sure. Is there any way you can link us to the PHP script in question? I know you paid for it, but it's incredibly difficult to say what might be causing the behavior without being able to see what it's doing. – Learned
Are the thumbnails being referenced in CSS or in <img> tags? – Learned
What do you mean by referenced in CSS? They are inside the body HTML, as follows: <img src="thumbprocessor.php?src=/folder/image.jpg&w=100&h=200" id="thumbnail"/> – Louanneloucks
That's what I wanted to know. Is this thumbnail processor script resizing the image every single time it loads? If so, this is exactly the reason it's so slow. If the script doesn't cache its output, your performance will suffer. Remove the &w=100&h=200 from the URL and see if the loading performance issues go away. It probably won't look like it's supposed to, but at least you can see whether the script's resizing is always happening (because the load will be faster without the resize). – Learned

The majority of the slowness is your TTFB (time to first byte) being too high. This is a hard one to tackle without getting intimate with your server config files, code and underlying hardware, but I can see it's rampant on every request. You have too many green bars (bad) and very few blue bars (good). You might want to stop optimizing the frontend for a bit, as I believe you've done much in that area. Despite the adage that "80%-90% of the end-user response time is spent on the frontend", I believe yours is occurring in the backend.

TTFB is backend stuff, server stuff, pre-processing prior to output and handshaking.

Time your code execution to find slow stuff like slow database queries, and time entering and exiting functions/methods to find slow functions. If you use PHP, try FirePHP. Sometimes it is one or two slow queries being run during startup or initialization, like pulling session info or checking authentication and whatnot. Optimizing queries can lead to some good performance gains. Sometimes code is run using php prepend or SPL autoload, so it runs on everything. Other times it can be a misconfigured Apache conf, and tweaking that saves the day.

Look for inefficient loops. Look for slow fetching calls to caches or slow I/O operations caused by faulty disk drives or high disk space usage. Look at memory usage: what's being used, and where. Run a WebPageTest repeated test of 10 runs on a single image or file, using only first view, from different locations around the world, not just one location. And read your access and error logs; too many developers ignore them and rely only on on-screen errors. If your web host has support, ask them for help; if they don't, maybe politely ask them for help anyway, it won't hurt.

You can try DNS Prefetching to combat the many domains and resources, http://html5boilerplate.com/docs/DNS-Prefetching/
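For example (a sketch, reusing hostnames that appear elsewhere on this page), the prefetch hints go in the document head:

```html
<link rel="dns-prefetch" href="//ajax.googleapis.com">
<link rel="dns-prefetch" href="//1.imagecdn.com">
```

The browser can then resolve those hostnames while the HTML is still parsing, instead of at the moment the first resource from each domain is requested.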

Is your server a good/decent one? Sometimes a better server can solve a lot of problems. I am a fan of the 'hardware is cheap, programmers are expensive' mentality; if you have the chance and the money, upgrade the server. And/or use a CDN like MaxCDN or Cloudflare or similar.

Good Luck!

(P.S. I don't work for any of these companies. Also, the Cloudflare link above argues that TTFB is not that important; I included it so you can get another take.)

Reagan answered 24/7, 2012 at 1:34 Comment(1)
Dear Anthony, thanks very much for this insightful "background" knowledge. I agree that sometimes the hardware is the bottleneck, and that is less obvious to measure, especially when the hosting company hosts the server in a shared hosting environment. I think Cloudflare is a good option to try out, in combination with Apache configuration optimization. Greetings! – Louanneloucks

Sorry to say, you provide too little data. And you have already had some good suggestions.

How are you serving those images? If you're streaming them via PHP, you're doing a very bad thing, even if they are already generated.

NEVER STREAM IMAGES WITH PHP. It will slow down your server, no matter the way you use it.

Put them in an accessible folder with a meaningful URI. Then call them directly with their real URI. If you need on-the-fly generation, you should put an .htaccess in the images directory which redirects to a generator PHP script only if the requested image is missing (this is called a cache-on-request strategy).

Doing that will fix PHP sessions, browser proxies, caching, ETags, and so on, all at once.

WP-Supercache uses this strategy, if properly configured.

I wrote this some time ago (http://code.google.com/p/cache-on-request/source/detail?r=8); the last revisions are broken, but I guess revision 8 or lower should work, and you can grab the .htaccess as an example just to test things out (although there are better ways to configure the .htaccess than the way I did).

I described that strategy in this blog post (http://www.stefanoforenza.com/need-for-cache/). It is probably badly written, but it may help clarify things.

Further reading: http://meta.wikimedia.org/wiki/404_handler_caching

Centenary answered 6/3, 2011 at 17:21 Comment(4)
Mind, ErrorDocument is not really the best thing you can do, as it generates entries in Apache's error log; a -f redirect would be better. – Centenary
Thanks for your input, tacone. Are you saying that no matter how good the PHP script is, it will slow down the server, or, as you said in your post, "It will kill your server, no matter what."? – Louanneloucks
It will slow down the server no matter how good the script is. For every image, the server will have to load PHP and have it stream the image byte by byte. Let Apache do the job without even passing through the PHP interpreter. As a side benefit, many other possible mistakes will be automatically avoided, such as sessions, Content-Length, caching, mime-type, etc. WHEN performance is critical you should not even load PHP (except at generation time). – Centenary
Vote-downers, could you explain why? – Centenary
