There is no exact analog of the mean value theorem for vector-valued functions (see below). However, there is an inequality which can be applied to many of the same situations to which the mean value theorem is applicable in the one-dimensional case:

{{math theorem|
For a continuous vector-valued function \mathbf{f}:[a,b]\to\mathbb{R}^k differentiable on (a,b), there exists a number c\in(a,b) such that
:|\mathbf{f}(b)-\mathbf{f}(a)| \le (b-a)\left|\mathbf{f}'(c)\right|.}}

{{math proof|
Take \varphi(t) = (\mathbf{f}(b) - \mathbf{f}(a)) \cdot \mathbf{f}(t). Then \varphi is real-valued and thus, by the mean value theorem,
:\varphi(b) - \varphi(a) = \varphi'(c)(b-a)
for some c \in (a, b). Now, \varphi(b) - \varphi(a) = |\mathbf{f}(b) - \mathbf{f}(a)|^2 and \varphi'(c) = (\mathbf{f}(b) - \mathbf{f}(a)) \cdot \mathbf{f}'(c). Hence, using the Cauchy–Schwarz inequality, from the above equation we get
:|\mathbf{f}(b) - \mathbf{f}(a)|^2 \le |\mathbf{f}(b) - \mathbf{f}(a)|\,|\mathbf{f}'(c)|(b-a).
If \mathbf{f}(b) = \mathbf{f}(a), the theorem holds trivially. Otherwise, dividing both sides by |\mathbf{f}(b) - \mathbf{f}(a)| yields the theorem.}}
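The inequality can be checked numerically. The following sketch (our own illustration, not part of the article) applies the proof's construction to the sample function f(t) = (cos t, sin t) on [0, π]: it forms the scalar function φ(t) = (f(b) − f(a)) · f(t), locates a point c with φ'(c) = (φ(b) − φ(a))/(b − a) by bracketing and bisection, and verifies |f(b) − f(a)| ≤ (b − a)|f'(c)|.

```python
import math

# Hypothetical example: f(t) = (cos t, sin t) on [a, b] = [0, pi].
a, b = 0.0, math.pi

def f(t):
    return (math.cos(t), math.sin(t))

def fprime(t):
    return (-math.sin(t), math.cos(t))

fa, fb = f(a), f(b)
d = (fb[0] - fa[0], fb[1] - fa[1])      # f(b) - f(a)

def phi(t):
    # phi(t) = (f(b) - f(a)) . f(t), the scalar function from the proof
    ft = f(t)
    return d[0] * ft[0] + d[1] * ft[1]

def phiprime(t):
    fp = fprime(t)
    return d[0] * fp[0] + d[1] * fp[1]

# Scalar mean value theorem: some c in (a, b) satisfies phi'(c) = slope.
slope = (phi(b) - phi(a)) / (b - a)
g = lambda t: phiprime(t) - slope

# Bracket a sign change of g on a fine grid, then bisect to find c.
n = 1000
lo = hi = None
for i in range(n):
    t0, t1 = a + (b - a) * i / n, a + (b - a) * (i + 1) / n
    if g(t0) * g(t1) <= 0:
        lo, hi = t0, t1
        break
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if g(lo) * g(mid) <= 0:
        hi = mid
    else:
        lo = mid
c = 0.5 * (lo + hi)

lhs = math.hypot(d[0], d[1])             # |f(b) - f(a)| = 2
rhs = (b - a) * math.hypot(*fprime(c))   # (b - a)|f'(c)| = pi
```

Here |f'(t)| = 1 for every t, so the bound (b − a)|f'(c)| = π comfortably exceeds |f(b) − f(a)| = 2, as the theorem guarantees.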
== Mean value inequality ==

Jean Dieudonné, in his classic treatise Foundations of Modern Analysis, discards the mean value theorem and replaces it by the mean value inequality, since the proof of the theorem is not constructive (one cannot find the mean value) and in applications one only needs the mean inequality.
Serge Lang, in
Analysis I, uses the mean value theorem in integral form as an instant reflex, but this use requires the continuity of the derivative. If one uses the
Henstock–Kurzweil integral, one can have the mean value theorem in integral form without the additional assumption that the derivative be continuous, as every derivative is Henstock–Kurzweil integrable.

The reason why there is no analog of mean value equality is the following: If f : U \to \R^m is a differentiable function (where U \subset \R^n is open) and if x + th, with x, h \in \R^n and t \in [0, 1], is the line segment in question (lying inside U), then one can apply the above parametrization procedure to each of the component functions f_i (i = 1, \ldots, m) of f. In doing so one finds points x + t_i h on the line segment satisfying
:f_i(x+h) - f_i(x) = \nabla f_i (x + t_ih) \cdot h.
But generally there will not be a single point x + t^* h on the line segment satisfying
:f_i(x+h) - f_i(x) = \nabla f_i (x + t^* h) \cdot h
for all i
simultaneously. For example, define:
:\begin{cases} f : [0, 2 \pi] \to \R^2 \\ f(x) = (\cos(x), \sin(x)) \end{cases}
Then f(2\pi) - f(0) = \mathbf{0} \in \R^2, but f_1'(x) = -\sin (x) and f_2'(x) = \cos (x) are never simultaneously zero as x ranges over [0, 2 \pi].

The above theorem implies the following:

{{math theorem|
:|\mathbf{f}(b) - \mathbf{f}(a)| \le (b-a)\sup_{(a, b)} |\mathbf{f}'|. }}

In fact, the above statement suffices for many applications and can be proved directly as follows. (We shall write f for \mathbf{f} for readability.)

{{math proof|
First assume f is differentiable at a too. If f' is unbounded on (a, b), there is nothing to prove. Thus, assume \sup_{(a, b)} |f'| < \infty. Let M > \sup_{(a, b)} |f'| be some real number. Let
:E = \{ 0 \le t \le 1 \mid |f(a + t(b-a)) - f(a)| \le Mt(b-a) \}.
We want to show 1 \in E. By continuity of f, the set E is closed. It is also nonempty as 0 is in it. Hence, the set E has a largest element s. If s = 1, then 1 \in E and we are done. Thus suppose otherwise. For 1 > t > s,
:\begin{align} &|f(a + t(b-a)) - f(a)| \\ &\le |f(a + t(b-a)) - f(a+s(b - a)) - f'(a + s(b-a))(t-s)(b-a)| + |f'(a+s(b-a))|(t-s)(b-a) \\ &\quad +|f(a + s(b-a)) - f(a)|. \end{align}
Let \epsilon > 0 be such that M - \epsilon > \sup_{(a, b)} |f'|. By the differentiability of f at a + s(b-a) (note s may be 0), if t is sufficiently close to s, the first term is \le \epsilon (t-s)(b-a). The second term is \le (M - \epsilon) (t-s)(b-a). The third term is \le Ms(b-a), since s \in E. Hence, summing the estimates up, we get |f(a + t(b-a)) - f(a)| \le tM(b-a), i.e., t \in E, a contradiction to the maximality of s. Hence, 1 = s \in E, and that means:
:|f(b) - f(a)| \le M(b-a).
Since M > \sup_{(a, b)} |f'| is arbitrary, this implies the assertion. Finally, if f is not differentiable at a, let a' \in (a, b) and apply the first case to f restricted on [a', b], giving us:
:|f(b) - f(a')| \le (b-a')\sup_{(a, b)} |f'|
since (a', b) \subset (a, b).
Letting a' \to a finishes the proof.}}

== Cases where the theorem cannot be applied ==