Supposing that each time measurement were smaller by a tenth of a second (0.1 s), the new values of g would be as shown in the table below.
Original time (s) | Time less by 0.1 (s) | Original g (m/s^2) | New g (m/s^2) |
1.55 | 1.45 | 9.406 | 10.7491 |
1.48 | 1.38 | 10.318 | 11.8673 |
1.54 | 1.44 | 9.53 | 10.8989 |
1.52 | 1.42 | 9.8 | 11.2081 |
1.51 | 1.41 | 9.9 | 11.3676 |
Average | | 9.7908 | 11.2182 |
Every value of g increases when the times are reduced. The average of the new values of g is 11.2182 m/s^2, which gives a new ∆% error of:
∆% error = (9.80665 - 11.2182)/9.80665 × 100 = -14.394%
The new ∆% error therefore becomes -14.394%, a 14.665% difference from the previous value.
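For reference, the shifted values in the table can be reproduced from the relation g = 2d/t², where the drop height of roughly 11.3 m is inferred from the original (time, g) pairs rather than stated in the question; the following is a minimal Python sketch under that assumption.

```python
# Minimal sketch: recompute g for time readings shifted by dt seconds,
# assuming g = 2*d/t**2 with a drop height of ~11.3 m (inferred from the
# original time/g pairs; not stated explicitly in the question).

G_ACCEPTED = 9.80665   # standard gravity, m/s^2
DROP_HEIGHT = 11.3     # assumed drop height, m

def recompute_g(times, dt):
    """Shift every time reading by dt and return the new g values,
    their average, and the resulting delta-% error."""
    new_g = [2 * DROP_HEIGHT / (t + dt) ** 2 for t in times]
    avg = sum(new_g) / len(new_g)
    pct_error = (G_ACCEPTED - avg) / G_ACCEPTED * 100
    return new_g, avg, pct_error

times = [1.55, 1.48, 1.54, 1.52, 1.51]

# Times shorter by a tenth of a second (dt = -0.1 s)
new_g, avg, err = recompute_g(times, -0.1)
print(avg)  # ~11.218 m/s^2
print(err)  # ~-14.39 %
```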
Now, supposing that each time measurement were larger by a tenth of a second (0.1 s), the new values of g would be as shown in the table below.
Original time (s) | Time more by 0.1 (s) | Original g (m/s^2) | New g (m/s^2) |
1.55 | 1.65 | 9.406 | 8.30119 |
1.48 | 1.58 | 10.318 | 9.05304 |
1.54 | 1.64 | 9.53 | 8.40274 |
1.52 | 1.62 | 9.8 | 8.61149 |
1.51 | 1.61 | 9.9 | 8.7188 |
Average | | 9.7908 | 8.61745 |
Every value of g decreases when the times are increased. The average of the new values of g is 8.61745 m/s^2, which gives a new ∆% error of:
∆% error = (9.80665 - 8.61745)/9.80665 × 100 = 12.126%
The new ∆% error therefore becomes 12.126%, an 11.855% difference from the previous value.
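Continuing the sketch above (same assumed height and helper function), this case only flips the sign of the shift:

```python
# Times longer by a tenth of a second (dt = +0.1 s), reusing recompute_g
new_g, avg, err = recompute_g(times, +0.1)
print(avg)  # ~8.617 m/s^2
print(err)  # ~12.13 %
```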
As the example in question 1 shows, the result is very sensitive to the time measurement: a small variation of 0.1 seconds changes the current ∆% error by 14.665% (-0.1 s) or 11.855% (+0.1 s). The stopwatch is therefore not an adequate instrument for measuring the time; instead, infrared sensor timers could be used to reduce the possibility of systematic errors. Such timers can resolve times to the microsecond and can be triggered at the exact moment the ball is released or passes a fixed point.
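The size of these shifts also follows from first-order error propagation: since g = 2d/t² (the relation assumed in the sketch above), the relative change in g is approximately -2Δt/t, so a 0.1 s error on a fall time near 1.5 s moves g by roughly 13%, consistent with the 12-15% changes seen above. A quick numerical check:

```python
# First-order sensitivity of g = 2*d/t**2 to a timing error dt:
# dg/g ≈ -2*dt/t, so a 0.1 s error on a ~1.5 s fall time shifts g by ~13%.
t_typical = 1.52   # representative measured time, s
dt = 0.1           # timing error, s
print(2 * dt / t_typical * 100)  # ~13.2 % relative change in g
```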