Weber’s law predicts that the smallest detectable change in a stimulus (the just-noticeable difference) will increase in proportion to stimulus intensity. Does this hold for the stimulus of time, specifically for durations in the milliseconds-to-seconds range? The evidence on the relationship between temporal sensitivity and duration is conflicting. On interval timing tasks, Weber’s law predicts a linear relationship between the difference threshold and duration, whereas two alternative models predict a reverse J-shaped and a U-shaped relationship, respectively. Based on previous research, we hypothesised that temporal sensitivity in humans would follow a U-shaped function, increasing and then decreasing with increases in duration, and that this model would provide a better statistical fit to the data than either the reverse-J model or the simple Weber’s law model. In a two-alternative forced-choice interval comparison task, 24 participants made duration judgements about six groups of auditory intervals between 100 and 3,200 ms. Weber fractions were computed for each group of intervals and plotted against duration to produce a function describing sensitivity to duration. Although the sensitivity function was slightly concave and the U-shaped model gave the best fit to the data, the improvement in fit was not sufficient to justify that model’s extra free parameter. Further analysis showed that Weber’s law itself provided a better description of sensitivity to changes in duration than either of the two alternative models.
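The model-selection logic in the abstract — a more flexible model must improve the fit enough to justify its extra free parameter — can be sketched as an information-criterion comparison. The Weber-fraction values below, the use of AIC, and the polynomial model forms are illustrative assumptions for exposition, not the study’s actual data or fitting procedure.

```python
import numpy as np

# Hypothetical Weber fractions for six interval groups (100–3,200 ms);
# illustrative values only, not data from the study.
durations = np.array([100.0, 200.0, 400.0, 800.0, 1600.0, 3200.0])
weber_fractions = np.array([0.12, 0.09, 0.08, 0.08, 0.09, 0.11])
log_d = np.log(durations)

def fit_and_aic(degree):
    """Least-squares polynomial fit in log-duration.
    AIC = n*ln(RSS/n) + 2k penalises each extra free
    parameter (k = degree + 1 coefficients)."""
    coeffs = np.polyfit(log_d, weber_fractions, degree)
    resid = weber_fractions - np.polyval(coeffs, log_d)
    n, k = len(weber_fractions), degree + 1
    rss = float(np.sum(resid ** 2))
    return n * np.log(rss / n) + 2 * k

aic_flat = fit_and_aic(0)       # Weber's law: constant Weber fraction
aic_linear = fit_and_aic(1)     # monotonic change with duration
aic_quadratic = fit_and_aic(2)  # U-shaped (concave) function

# Lowest AIC wins: the quadratic's better raw fit must outweigh
# the penalty for its additional free parameter.
best = min([("flat", aic_flat), ("linear", aic_linear),
            ("quadratic", aic_quadratic)], key=lambda t: t[1])
print(best[0])
```

A likelihood-ratio test or BIC would serve the same purpose; the point is only that raw goodness of fit is traded off against model complexity.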