CTimeSpan always returns zero
I am trying to measure the running time of an insertion sort. According to MSDN, CTime can be used to get the elapsed time, but I have tried many times and always get zero. It can't be that running this algorithm takes zero time, so there must be an error somewhere. Could anybody help me? My code is posted below:

#include <cstdlib>
#include <iostream>
#include <atltime.h> 
using namespace std;
//function prototype
void insertion_sort(int arr[], int length);

int main() {
    //Create random array
    int arraySize=100;
    int *randomArray=new int[arraySize];
    int s;
    for (s=0;s<arraySize;s++){
        randomArray[s]=(rand()%99)+1;
    }

    CTime startTime = CTime::GetCurrentTime();

    int iter;
    for (iter=0;iter<1000;iter++){
        insertion_sort(randomArray,arraySize);
    }

    CTime endTime = CTime::GetCurrentTime();
    CTimeSpan elapsedTime = endTime - startTime;
    double nTMSeconds = elapsedTime.GetTotalSeconds()*1000;
    cout<<nTMSeconds<<" ms"<<endl;
    delete[] randomArray; //release the array allocated with new[]
    return 0;
}//end of main

//sort arr into ascending order
void insertion_sort(int arr[], int length) {
    for (int i = 1; i < length; i++) {
        int key = arr[i];
        int j = i - 1;
        //shift elements greater than key one position to the right
        while (j >= 0 && arr[j] > key) {
            arr[j + 1] = arr[j];
            j--;
        }
        arr[j + 1] = key;
    }
}
Vday answered 21/9, 2014 at 0:30 Comment(1)
CTime has a resolution of one second. Apparently, your whole test takes less than a second. – Norikonorina
CTime isn't meant to time things at a resolution finer than one second. I think what you are really after is something like GetTickCount or GetTickCount64. See this MSDN link.

GetTickCount function

Retrieves the number of milliseconds that have elapsed since the system was started, up to 49.7 days.

If using GetTickCount64, you could declare startTime, endTime, and diffTime this way:

uint64_t endTime, startTime, diffTime;

Then use GetTickCount64 to retrieve the time in milliseconds with something like:

startTime = GetTickCount64();
... do stuff ...
endTime = GetTickCount64();

diffTime = endTime - startTime;

And of course diffTime can be used however you want.

If you don't need to time things for more than about a month (49.7 days), you can simply use GetTickCount instead; the type returned will be a uint32_t (a DWORD) rather than a uint64_t.
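
Putting that together, a minimal self-contained sketch of the GetTickCount64 approach could look like this (assuming a Windows build where windows.h declares GetTickCount64, i.e. Vista or later; the placeholder comment stands in for the sorting loop you want to time):

#include <cstdint>
#include <iostream>
#include <windows.h>

int main() {
    uint64_t startTime, endTime, diffTime;

    startTime = GetTickCount64();

    // ... run the code you want to time here ...

    endTime = GetTickCount64();
    diffTime = endTime - startTime; //elapsed wall-clock time in milliseconds

    std::cout << diffTime << " ms" << std::endl;
    return 0;
}

Keep in mind that GetTickCount64 still only ticks roughly every 10-16 ms, so a very short run can still read as zero; repeat the work in a loop if that happens.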

If you need resolution finer than 1 millisecond and your computer supports a high-resolution timer, then this code may work:

LARGE_INTEGER freq;
double time_sec = 0.0;

//freq receives the timer's ticks per second; the call returns zero
//if the hardware provides no high-resolution timer
if (QueryPerformanceFrequency(&freq))
{
    LARGE_INTEGER start;
    LARGE_INTEGER stop;

    QueryPerformanceCounter(&start);

    // Do stuff to time here

    QueryPerformanceCounter(&stop);

    //convert the elapsed tick count to seconds
    time_sec = (uint64_t)(stop.QuadPart - start.QuadPart) / (double)freq.QuadPart;
}
else {
    cout << "Your computer doesn't have a high resolution timer to use";
}

Information on the high-resolution timer can be found in this MSDN entry.
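
As a portable alternative, if your compiler supports C++11 you can get the same kind of sub-millisecond measurement with std::chrono::steady_clock instead of the Windows-specific APIs; a minimal sketch:

#include <chrono>
#include <iostream>

int main() {
    auto start = std::chrono::steady_clock::now();

    // ... run the code you want to time here ...

    auto stop = std::chrono::steady_clock::now();
    //duration<double> expresses the elapsed ticks as seconds
    double time_sec = std::chrono::duration<double>(stop - start).count();
    std::cout << time_sec << " s" << std::endl;
    return 0;
}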

Boak answered 21/9, 2014 at 0:43 Comment(2)
I tried it but it still doesn't work. I checked the values of the start time and the end time; they were the same, so diffTime is still zero. – Vday
It's not that it isn't working. Whatever you are timing takes less time than the resolution of the timer being used. If GetTickCount produces 0, it means that whatever you are timing takes less than 1 millisecond (1/1000th of a second). I have provided one last mechanism (added to my answer), but it won't work on systems without a high-resolution timer. If you have such a timer, this is the best resolution you can get. – Boak