The dispersion of a sequence of measurements is defined as the difference between the largest and the smallest value.

Input

The first line of the input contains the number of measurements $$n \in \mathbb{N}$$, where $$n \geq 1$$. The next $$n$$ lines each contain one measurement $$m \in \mathbb{R}$$.

Output

The dispersion of the measurements.

Example

Input:

3
3.14
1.41
2.72

Output:

1.73
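A minimal reference sketch in Python, assuming one measurement per line and that two decimal places (as in the example) is acceptable output precision; the exact formatting requirement is an assumption:

```python
def dispersion(measurements):
    # Dispersion: difference between the largest and smallest value.
    return max(measurements) - min(measurements)

def main():
    n = int(input())                                  # number of measurements
    values = [float(input()) for _ in range(n)]       # one measurement per line
    # Two-decimal formatting matches the example; the judge's
    # required precision is not specified in the statement.
    print(f"{dispersion(values):.2f}")

if __name__ == "__main__":
    main()
```

For the example input (3.14, 1.41, 2.72) this prints 1.73, the difference between the largest value 3.14 and the smallest value 1.41.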