SE250:lab-1:npit006


SOFTENG 250: LAB 1


Firstly, I familiarised myself with the clock() function included in time.h. My research suggested using the clock_t datatype, but I wasn't able to use it successfully, so I ended up recording clock times in double variables instead. I then wrote some code to record the time taken to execute a huge loop containing an 'int' addition. Gradually I refined my program step by step: I added code to convert the clock ticks in the results to seconds, and nested the huge loop inside a smaller loop that executes six times, to further minimise random error in my results.

Finally, I attempted to calculate the loop overhead by using the clock() function to measure how long the same huge loop took to execute without the addition. To my surprise and dismay, the empty loop actually took more time to execute than the same loop containing an addition operation. And so, once I subtracted the loop overhead from the average times, the result was a negative number!

Nevertheless, I copied the 'int addition' block of code multiple times so that I could perform the same measurement with additions of other types: short, long, float and double. I could have reduced the size of my program by putting a switch-style conditional inside the loop, with each case performing the addition on a different datatype, but the extra branching and processing time would only have obfuscated the results. More specifically, for the int addition my code was 'sum_a = a + a;', where sum_a and a were both int variables.

Given that accounting for the loop overhead mucked things up, I chose to pay attention only to the average times that don't account for it. In order of increasing processing time: long < int < short < double < float. This was mostly as expected, with the floating-point operations taking more time than the integer ones. If there are reasons for int addition taking longer than long addition, I imagine they are buried deep in the compiler and hardware. However, I had expected double to take much longer than short.

I haven't had the chance to run my program enough times to come up with an overall set of realistic results, but I'll look into doing this and also demystifying the loop overhead problem at a later time.



Results:

      • All times are in clock ticks unless noted; each figure is an average over six runs of 10 million additions.

Type     Loop overhead   Time (overhead not subtracted)   Time (overhead subtracted)   Seconds per 10M ops   Seconds per op
int      33.500000       25.500000                        -8.000000                    0.025500              2.5500e-09
short    33.500000       32.000000                        -1.500000                    0.032000              3.2000e-09
long     33.500000       24.000000                        -9.500000                    0.024000              2.4000e-09
float    33.500000       34.166667                         0.666667                    0.034167              3.4167e-09
double   33.500000       32.333333                        -1.166667                    0.032333              3.2333e-09