Comparing cold-start to warm start

Our application takes significantly more time to launch after a reboot (cold start) than when it has already been launched once (warm start).

Most (if not all) of the difference seems to come from loading DLLs: when the DLLs are in cached memory pages they load much faster. We tried using ClearMem to simulate rebooting (since it's much less time-consuming than actually rebooting) and got mixed results; on some machines it seemed to simulate a reboot very consistently, on others it did not.

To sum up my questions are:

  1. Have you experienced differences in launch time between cold and warm starts?
  2. How have you dealt with such differences?
  3. Do you know of a way to dependably simulate a reboot?

Edit:

Clarifications for comments:

  • The application is mostly native C++ with some .NET (the first .NET assembly that's loaded pays for the CLR).
  • We're looking to improve load time; obviously we did our share of profiling and improved the hotspots in our code.

Something I forgot to mention was that we got some improvement by re-basing all our binaries so the loader doesn't have to do it at load time.

Saloop answered 24/9, 2008 at 13:11 Comment(2)
Motti, have you got any new ideas on simulating a reboot? I am searching for such a way for our very large application as well but didn't find a solid one.Shadow
@Dbger, sorry I didn't make any additional progress on this, I moved on to different problems. Good luck.Saloop

How did you profile your code? Not all profiling methods are equal and some find hotspots better than others. Are you loading lots of files? If so, disk fragmentation and seek time might come into play.

Maybe even sticking basic timing information into the code, writing it out to a log file, and examining the files on cold/warm start will help identify where the app is spending time.
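
For instance, a minimal timing-log sketch along those lines might look like this in C++ (the checkpoint labels and log file name are just placeholders):

#include <chrono>
#include <cstdio>

// Append the elapsed milliseconds since the first checkpoint to a log file,
// so cold-start and warm-start runs can be compared line by line.
static void LogCheckpoint(const char* label)
{
    using namespace std::chrono;
    static const auto start = steady_clock::now();
    const auto ms = duration_cast<milliseconds>(steady_clock::now() - start).count();
    if (std::FILE* log = std::fopen("startup_timing.log", "a"))
    {
        std::fprintf(log, "%s: %lld ms\n", label, static_cast<long long>(ms));
        std::fclose(log);
    }
}

// Usage: LogCheckpoint("before LoadPlugins"); ... LogCheckpoint("after LoadPlugins");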

Without more information, I would lean towards filesystem/disk cache as the likely difference between the two environments. If that's the case, then you either need to spend less time loading files upfront, or find faster ways to load files.

Example: if you are loading lots of binary data files, speed up loading by combining them into a single file, then do a slurp of the whole file into memory in one read and parse the contents afterwards. Fewer disk seeks and less time spent reading off of disk. Again, maybe that doesn't apply.
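
A rough sketch of the single-read approach (file name and error handling are illustrative; parsing the combined contents is left out):

#include <cstddef>
#include <fstream>
#include <vector>

// Read one combined data file into memory with a single sequential read;
// individual records are then parsed out of the in-memory buffer.
static std::vector<char> SlurpFile(const char* path)
{
    std::ifstream in(path, std::ios::binary | std::ios::ate);
    std::vector<char> buffer;
    if (!in)
        return buffer;                          // empty buffer signals failure here
    const std::streamsize size = in.tellg();
    buffer.resize(static_cast<std::size_t>(size));
    in.seekg(0);
    in.read(buffer.data(), size);
    return buffer;
}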

I don't know offhand of any tools to clear the disk/filesystem cache, but you could write a quick application to read a bunch of unrelated files off of disk to cause the filesystem/disk cache to be loaded with different info.
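
A sketch of such a helper, assuming a C++17 toolchain with <filesystem> (the directory to scan and the byte budget are placeholders):

#include <cstdint>
#include <filesystem>
#include <fstream>
#include <vector>

// Read unrelated files until roughly maxBytes have been pulled through the
// OS file cache, displacing the application's own DLLs and data files.
static void ThrashFileCache(const std::filesystem::path& root, std::uintmax_t maxBytes)
{
    namespace fs = std::filesystem;
    std::vector<char> sink(1 << 20);            // 1 MiB scratch buffer
    std::uintmax_t total = 0;
    for (const auto& entry :
         fs::recursive_directory_iterator(root, fs::directory_options::skip_permission_denied))
    {
        if (!entry.is_regular_file())
            continue;
        std::ifstream in(entry.path(), std::ios::binary);
        while (in.read(sink.data(), sink.size()) || in.gcount() > 0)
            total += static_cast<std::uintmax_t>(in.gcount());
        if (total >= maxBytes)
            return;
    }
}

// Usage: ThrashFileCache("C:\\Windows", 8ull * 1024 * 1024 * 1024);  // read ~8 GiB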

Xenophanes answered 24/9, 2008 at 18:53 Comment(2)
I'll look into unifying DLLs; that sounds like a promising option.Saloop
From the symptoms, it sounds like the time consuming thing is loading the code rather than the execution time of the initialization code. In that case, a profiler won't help (except, possibly, to see if you've made progress on improving the load time). Reduce the number of DLLs that must load, make them smaller, rebase them so that there's no overlap. Registry access can also be painful on a fresh boot.Enrich

As for simulating reboots, have you considered running your app from a virtual PC? Using virtualization you can conveniently replicate a set of conditions over and over again.

I would also consider some type of profiling app to spot the bit of code causing the time lag, and then make the judgement call about how much of that code is really necessary, or whether it could be achieved in a different way.

Demos answered 24/9, 2008 at 13:12 Comment(2)
The problem with virtual machines (we use VMWare not Virtual PC) is that they load balance everything (including the CPU) and the numbers we were getting were not consistent.Saloop
Point taken, though I suspect by replicating reboots you are chasing symptoms rather than root causes. The fact remains that there is something way too time-consuming occurring in a DLL regardless of when it loads; hard reboots just exacerbate the problem. I'm confident a profiling tool will help here.Demos

It would be hard to truly simulate a reboot in software. When you reboot, all devices in your machine get their reset bit asserted, which should cause all memory system-wide to be lost.

In a modern machine you've got memory and caches everywhere: there's the VM subsystem which is storing pages of memory for the program, then you've got the OS caching the contents of files in memory, then you've got the on-disk buffer of sectors on the hard drive itself. You can probably get the OS caches to be reset, but the on-disk buffer on the drive? I don't know of a way.

Congregation answered 24/9, 2008 at 13:15 Comment(0)

@Morten Christiansen said:

One way to make apps cold-start faster (sort of) is the approach used by e.g. Adobe Reader: preload some of the application's files at system startup, thereby hiding the cold start from the users. This is only usable if the program is not supposed to start up immediately.

That makes the customer pay for initializing our app at every boot even when it isn't used; I really don't like that option (neither does Raymond).

Saloop answered 25/9, 2008 at 7:27 Comment(0)

One successful way to speed up application startup is to switch DLLs to delay-load. This is a low-cost change (some fiddling with project settings) but can make startup significantly faster. Afterwards, run depends.exe in profiling mode to figure out which DLLs load during startup anyway, and revert the delay-load on them. Remember that you may also delay-load most Windows DLLs you need.
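
For reference, on the linker command line the equivalent of that project-setting change looks roughly like this (the DLL name is made up; delayimp.lib supplies the delay-load helper):

link /DELAYLOAD:SomeHeavyDependency.dll delayimp.lib myapp.obj otherlibs.lib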

Thermomagnetic answered 25/9, 2008 at 10:53 Comment(0)

A very effective technique for improving application cold launch time is optimizing function link ordering.

The Visual Studio linker lets you pass in a file that lists all the functions in the module being linked (or just some of them; it doesn't have to be all of them), and the linker will place those functions next to each other in memory.
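
For illustration, with the Visual Studio toolchain this is the /ORDER linker option; the order file simply lists (decorated) function names, one per line. The names below are invented:

?InitRenderer@@YAXXZ
?InitAudio@@YAXXZ
?LoadSettings@@YAXXZ

link /ORDER:@startup_order.txt ...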

When your application is starting up, there are typically calls to init functions throughout your application. Many of these calls will be to a page that isn't in memory yet, resulting in a page fault and a disk seek. That's where slow startup comes from.

Optimizing your application so all these functions are together can be a big win.

Check out Profile Guided Optimization in Visual Studio 2005 or later. One of the things that PGO does for you is function link ordering.

It's a bit difficult to work into a build process, because with PGO you need to link, run your application, and then re-link with the output from the profile run. This means your build process needs to have a runtime environment and deal with cleaning up after bad builds and all that, but the payoff is typically 10+ or more faster cold launch with no code changes.
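
Roughly, the PGO cycle on a VS2005-era toolchain looks like this (file names are placeholders):

cl /c /O2 /GL myapp.cpp
link /LTCG:PGINSTRUMENT /OUT:myapp.exe myapp.obj
rem ...run myapp.exe through the cold-start scenario to collect .pgc profile data...
link /LTCG:PGOPTIMIZE /OUT:myapp.exe myapp.obj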

There's some more info on PGO here:

http://msdn.microsoft.com/en-us/library/e7k32f4k.aspx

Veratrine answered 18/12, 2008 at 20:11 Comment(1)
Thanks, I'll check it out. BTW when you say 10+ do you mean 10%?Saloop

As an alternative to a function order list, just group the code that will be called at startup within the same sections:

#pragma code_seg(".startUp")
 //...
#pragma code_seg

#pragma data_seg(".startUp")
 //...
#pragma data_seg

It should be easy to maintain as your code changes, but has the same benefit as the function order list.

I am not sure whether a function order list can specify global variables as well, but using this #pragma data_seg simply works.
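
A slightly fuller sketch of the same idea, with invented function and variable names:

#pragma code_seg(".startUp")
// Function definitions placed here are emitted into the .startUp section,
// so the linker packs them onto the same pages.
void InitLogging()  { /* ... */ }
void InitPlugins()  { /* ... */ }
#pragma code_seg     // revert to the default .text section

#pragma data_seg(".startUp")
// Initialized globals touched during startup can be grouped the same way.
int g_startupFlags = 1;
#pragma data_seg     // revert to the default .data section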

Shadow answered 17/3, 2009 at 11:38 Comment(0)

One way to make apps cold-start faster (sort of) is the approach used by e.g. Adobe Reader: preload some of the application's files at system startup, thereby hiding the cold start from the users. This is only usable if the program is not supposed to start up immediately.

Another note: .NET 3.5 SP1 supposedly has much improved cold-start speed, though how much I cannot say.

Bowling answered 24/9, 2008 at 14:8 Comment(1)
This is evil, and one of the reasons I don't have Adobe reader installed.Enrich

It could be the NICs (LAN cards): your app may depend on other services that require the network to come up. Profiling your application alone may not reveal this, so you should examine your application's dependencies.

Apospory answered 24/9, 2008 at 18:56 Comment(1)
This isn't the case, we're purely client side.Saloop

If your application is not very complicated, you can just copy all the executables to another directory; it should be similar to a reboot. (Cut and paste doesn't seem to work: Windows is smart enough to know that files moved to another folder are still cached in memory.)

Shadow answered 18/3, 2009 at 11:25 Comment(0)
