I came across this coding question where one is given as input a sequence of n numbers, say x_1, x_2, ..., x_n, and one has to output the number (it need not be one of the input values) which minimizes the squared difference from that number, summed over all numbers in the input sequence. That is, if that number is K, then the sum

S_X = (x_1 - X)^2 + (x_2 - X)^2 + ... + (x_n - X)^2,

calculated over all real X, is minimum when X = K.

My first guess was that the number would be either the mean or the median. After trying a few small examples, I could see that the median wasn't the desired answer. So I submitted a solution that computes the mean of the numbers, and it was accepted by the system.
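The small-example check above can be reproduced with a short script; the example sequences here are my own, not the judge's test data. It compares the sum of squared differences taken about the mean versus the median:

```python
# Compare the sum of squared differences about the mean vs. the median
# on a few hand-picked sequences (illustrative examples only).

def squared_sum(xs, c):
    """Sum of (x - c)^2 over the sequence xs."""
    return sum((x - c) ** 2 for x in xs)

def mean(xs):
    return sum(xs) / len(xs)

def median(xs):
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

for xs in [[1, 2, 3, 10], [0, 0, 5], [2, 4, 4, 4, 5, 5, 7, 9]]:
    m, med = mean(xs), median(xs)
    print(xs, "about mean:", squared_sum(xs, m), "about median:", squared_sum(xs, med))
```

On a skewed sequence like [1, 2, 3, 10] the gap is easy to see: the sum about the mean (4) is 50, while the sum about the median (2.5) is 59.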

But I had to convince myself that this sum of squared differences is indeed minimized only at the arithmetic mean. Hence, I worked through the following elementary proof.

Let M = (x_1 + x_2 + ... + x_n)/n be the arithmetic mean of the sequence. Let's calculate the squared sum about a number X which is a distance d away from M; so X = M + d, where d ≠ 0. Let's see that the squared sum calculated about the mean, S_M, is always less than the squared sum S_X calculated about any other number X.

S_X = Σ_{i=1}^{n} (x_i - X)^2
    = Σ_{i=1}^{n} (x_i - M - d)^2 [Since X = M + d]
    = Σ_{i=1}^{n} (x_i - M)^2 - 2d Σ_{i=1}^{n} (x_i - M) + n d^2
    = S_M - 2d · 0 + n d^2 [From the definition of the mean, Σ_{i=1}^{n} (x_i - M) = 0]
    = S_M + n d^2
    > S_M [Since d ≠ 0 implies n d^2 > 0]

Hence proved, S_X > S_M for every X ≠ M.

Hence the arithmetic mean is the right choice for minimizing the sum of squared differences of a given sequence of numbers.
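The key identity in the derivation, S_X = S_M + n·d^2 with d = X - M, can also be spot-checked numerically. This is just a sanity-check sketch with an arbitrary example sequence:

```python
# Numerical spot-check of the identity S_X = S_M + n*d^2,
# where M is the mean of xs and d = X - M.

def squared_sum(xs, c):
    """Sum of (x - c)^2 over the sequence xs."""
    return sum((x - c) ** 2 for x in xs)

xs = [3, 1, 4, 1, 5, 9, 2, 6]  # arbitrary example sequence
n = len(xs)
M = sum(xs) / n

for X in [-2.0, 0.0, M, 3.5, 10.0]:
    d = X - M
    lhs = squared_sum(xs, X)
    rhs = squared_sum(xs, M) + n * d * d
    assert abs(lhs - rhs) < 1e-9  # identity holds for every X tried
```

Because S_X = S_M + n·d^2 and n·d^2 ≥ 0 with equality only at d = 0, the check also confirms that the minimum is attained exactly at X = M.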

Observe that S_X varies smoothly with X, so we can also use high-school calculus for the minimization problem. S'_X = -2 Σ_{i=1}^{n} (x_i - X) = 2(nX - Σ_{i=1}^{n} x_i), which is 0 only when X is the mean. Since S_X is continuous and grows without bound as X → ±∞, its unique critical point must be a minimum. The proof that this X minimizes S_X is therefore complete without examining the second derivative.
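The calculus argument can be visualized with a finite-difference sketch: the derivative of S_X is negative to the left of the mean, zero at the mean, and positive to the right, so S_X descends to the mean and rises afterwards. The sequence below is an arbitrary example of my own:

```python
# Finite-difference sketch of S'_X: negative left of the mean,
# ~0 at the mean, positive right of it.

def squared_sum(xs, c):
    """Sum of (x - c)^2 over the sequence xs."""
    return sum((x - c) ** 2 for x in xs)

def deriv(xs, X, h=1e-6):
    """Central finite difference approximating S'_X at X."""
    return (squared_sum(xs, X + h) - squared_sum(xs, X - h)) / (2 * h)

xs = [2, 7, 1, 8, 2, 8]  # arbitrary example sequence
M = sum(xs) / len(xs)

print(deriv(xs, M - 1))  # negative: S_X is decreasing left of the mean
print(deriv(xs, M))      # ~0: the mean is the critical point
print(deriv(xs, M + 1))  # positive: S_X is increasing right of the mean
```

Since S_X is a quadratic in X, the central difference is essentially exact here: S'_X = 2(nX - Σ x_i), giving slopes of -2n and +2n one unit either side of the mean.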

Yes, that's an alternative proof using calculus.