Perform the same computation as in Example 14.3,

14.34 Manning's formula for a rectangular channel can be written as

Q = \frac{1}{n_m} \frac{(BH)^{5/3}}{(B + 2H)^{2/3}} \sqrt{S}

where Q = flow (m^3/s), n_m = a roughness coefficient, B = width (m), H = depth (m), and S = slope. You are applying this formula to a stream where you know that the width = 20 m and the depth = 0.3 m. Unfortunately, you know the roughness and the slope to only a ±10% precision. That is, you know that the roughness is about 0.03 with a range from 0.027 to 0.033 and the slope is 0.0003 with a range from 0.00027 to 0.00033. Assuming uniform distributions, use a Monte Carlo analysis with n = 10,000 to estimate the distribution of flow.

14.35 A Monte Carlo analysis can be used for optimization. For example, the trajectory of a ball can be computed with

y = (\tan\theta)x - \frac{g}{2v^2\cos^2\theta}x^2 + y_0    (P14.35)

where y = the height (m), \theta = the initial angle (radians), v = the initial velocity (m/s), g = the gravitational constant = 9.81 m/s^2, and y_0 = the initial height (m). Given y_0 = 1 m, v = 25 m/s, and \theta = 50°, determine the maximum height and the corresponding x distance (a) analytically with calculus and (b) numerically with Monte Carlo simulation. For the latter, develop a script that generates a vector of 10,000 uniformly distributed values of x between 0 and 60 m. Use this vector and Eq. (P14.35) to generate a vector of heights. Then, employ the max function to determine the maximum height and the associated x distance.
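The book's scripts are written in MATLAB; as a sketch of the kind of Monte Carlo analysis Prob. 14.34 asks for, here is a Python/NumPy version (the variable names and the fixed seed are my own choices, not from the text):

```python
import numpy as np

rng = np.random.default_rng(1)  # fixed seed so the sketch is reproducible

n = 10_000
B, H = 20.0, 0.3                       # width and depth (m), known exactly
nm = rng.uniform(0.027, 0.033, n)      # roughness: uniform within +/-10%
S = rng.uniform(0.00027, 0.00033, n)   # slope: uniform within +/-10%

# Manning's formula for a rectangular channel, evaluated for each sample
Q = (1.0 / nm) * (B * H) ** (5 / 3) / (B + 2 * H) ** (2 / 3) * np.sqrt(S)

# summarize the resulting distribution of flow
print(f"mean flow = {Q.mean():.3f} m^3/s, std = {Q.std():.3f} m^3/s")
```

A histogram of `Q` (e.g., via `matplotlib`) would then show the distribution of flow; the mean lands near 1.5 m^3/s for the nominal parameter values.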
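For part (b) of Prob. 14.35, a Python sketch of the Monte Carlo optimization might look as follows (again assuming NumPy in place of the book's MATLAB; `np.argmax` plays the role of MATLAB's two-output `max`):

```python
import numpy as np

g, v, y0 = 9.81, 25.0, 1.0       # gravity (m/s^2), velocity (m/s), height (m)
theta = np.radians(50)           # initial angle converted to radians

rng = np.random.default_rng(2)
x = rng.uniform(0, 60, 10_000)   # 10,000 candidate horizontal distances (m)

# Eq. (P14.35): height of the ball at each sampled x
y = np.tan(theta) * x - g / (2 * v**2 * np.cos(theta) ** 2) * x**2 + y0

imax = np.argmax(y)              # index of the largest sampled height
print(f"max height ~ {y[imax]:.2f} m at x ~ {x[imax]:.2f} m")
```

This should approach the calculus answer from part (a): setting dy/dx = 0 gives x = v^2 sin(θ)cos(θ)/g ≈ 31.4 m and y_max = y_0 + v^2 sin^2(θ)/(2g) ≈ 19.7 m.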

15 General Linear Least-Squares and Nonlinear Regression

CHAPTER OBJECTIVES

This chapter takes the concept of fitting a straight line and extends it to (a) fitting a polynomial and (b) fitting a variable that is a linear function of two or more independent variables. We will then show how such applications can be generalized and applied to a broader group of problems. Finally, we will illustrate how optimization techniques can be used to implement nonlinear regression. Specific objectives and topics covered are

• Knowing how to implement polynomial regression.
• Knowing how to implement multiple linear regression.
• Understanding the formulation of the general linear least-squares model.
• Understanding how the general linear least-squares model can be solved with MATLAB using either the normal equations or left division.
• Understanding how to implement nonlinear regression with optimization techniques.

15.1 POLYNOMIAL REGRESSION

In Chap. 14, a procedure was developed to derive the equation of a straight line using the least-squares criterion. Some data, although exhibiting a marked pattern such as seen in Fig. 15.1, are poorly represented by a straight line. For these cases, a curve would be better suited to fit the data. As discussed in Chap. 14, one method to accomplish this objective is to use transformations. Another alternative is to fit polynomials to the data using polynomial regression.

The least-squares procedure can be readily extended to fit the data to a higher-order polynomial. For example, suppose that we fit a second-order polynomial or quadratic:

y = a_0 + a_1 x + a_2 x^2 + e    (15.1)

For this case the sum of the squares of the residuals is

S_r = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i - a_2 x_i^2)^2    (15.2)

To generate the least-squares fit, we take the derivative of Eq. (15.2) with respect to each of the unknown coefficients of the polynomial, as in

\partial S_r/\partial a_0 = -2 \sum (y_i - a_0 - a_1 x_i - a_2 x_i^2)
\partial S_r/\partial a_1 = -2 \sum x_i (y_i - a_0 - a_1 x_i - a_2 x_i^2)
\partial S_r/\partial a_2 = -2 \sum x_i^2 (y_i - a_0 - a_1 x_i - a_2 x_i^2)

These equations can be set equal to zero and rearranged to develop the following set of normal equations:

n a_0 + (\sum x_i) a_1 + (\sum x_i^2) a_2 = \sum y_i
(\sum x_i) a_0 + (\sum x_i^2) a_1 + (\sum x_i^3) a_2 = \sum x_i y_i
(\sum x_i^2) a_0 + (\sum x_i^3) a_1 + (\sum x_i^4) a_2 = \sum x_i^2 y_i

FIGURE 15.1 (a) Data that are ill-suited for linear least-squares regression. (b) Indication that a parabola is preferable.
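The normal equations form a 3×3 linear system in the unknown coefficients. As an illustrative sketch (in Python/NumPy rather than the book's MATLAB, with made-up sample data), one can assemble and solve them directly, and cross-check against a library polynomial fit:

```python
import numpy as np

# made-up data lying near y = 2 + 0.5 x + 0.75 x^2 (for illustration only)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.2, 6.1, 10.2, 16.1, 23.0])

n = len(x)
# coefficient matrix and right-hand side of the normal equations
A = np.array([[n,            x.sum(),      (x**2).sum()],
              [x.sum(),      (x**2).sum(), (x**3).sum()],
              [(x**2).sum(), (x**3).sum(), (x**4).sum()]])
b = np.array([y.sum(), (x * y).sum(), (x**2 * y).sum()])

# solve for a0, a1, a2 (counterpart of MATLAB left division, A\b)
a0, a1, a2 = np.linalg.solve(A, b)

# the same fit via polyfit, which returns the highest power first
coeffs = np.polyfit(x, y, 2)
```

The two routes agree; for larger problems the normal equations become ill-conditioned, which is one reason the chapter later favors left division over forming them explicitly.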