
Optimization of Food Science in Science
and Mathematics Faculty of SWCU in Indonesia
Hanna Arini Parhusip∗

Abstract. Optimization problems in the Science and Mathematics Faculty of SWCU are
recalled here. The underlying theories are the least squares method, the properties of the
Hessian matrix of the residual functions, convexity of the objective functions, and the
Karush-Kuhn-Tucker conditions. One observes that the given data may not support the
theories used. Lagrangian functions are analysed at the optimizers; the gradients of the
obtained system are possibly nonzero.
The studied problems contain optimization of the yeast used on beans, stevioside and
Mocorin (Modification Of Bisi 2 Variety Yellow Corn (Zea Mays L.) - Rice Bran Flour),
food classification, and optimization of rice harvesting. Possible updated approaches
to these topics are proposed in the final discussion of this paper.
Mathematics Subject Classification (2010). 90.58, 90.59, 90C20.
Keywords. least squares, Hessian, Karush-Kuhn-Tucker condition, convexity and concavity

1. Introduction
Mathematical techniques in optimization are not widely known in food science,
though many problems in food production deal with optimization. One normally
uses experience to carry out an optimization. On the other hand, some effort may
be saved through additional knowledge of mathematics for solving an optimization
problem. This paper shows some examples using mathematical techniques (including
some mathematical keywords) whose conditions must be satisfied to obtain an
optimal solution. This paper focuses on optimization methods in food production.
One should be aware that, before an optimization theory is applied, one needs to
analyze whether the given data sufficiently support achieving an optimal solution
(a maximum or a minimum). There are several necessary conditions that must be
satisfied. An objective function should be convex on a convex domain when searching
for a minimum; concavity of the objective function is required when setting up
a maximization problem. Additionally, the Hessian matrix of its Lagrangian function
must also be taken into account. This paper addresses these considerations based on
∗The author is grateful to ICM 2014 for its travel grant to this congress.


some optimization problems from the Science and Mathematics Faculty of SWCU in
Indonesia in 2011-2013.
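As a minimal illustration (not taken from the paper), the convexity check mentioned above can be sketched in Python: a twice-differentiable function is convex near a point when its Hessian there is positive definite, which can be tested through the eigenvalues of the symmetric matrix. The example functions below are hypothetical, chosen only to show one convex case and one saddle case.

```python
import numpy as np

def is_positive_definite(H, tol=1e-10):
    """Check positive definiteness of a symmetric matrix via its eigenvalues."""
    eigvals = np.linalg.eigvalsh(H)  # eigenvalues of a symmetric matrix, ascending
    return bool(np.all(eigvals > tol))

# Hessian of f(x, y) = x^2 + y^2 (convex): constant matrix diag(2, 2)
H_convex = np.array([[2.0, 0.0], [0.0, 2.0]])
# Hessian of f(x, y) = x^2 - y^2 (saddle): indefinite matrix diag(2, -2)
H_saddle = np.array([[2.0, 0.0], [0.0, -2.0]])

print(is_positive_definite(H_convex))  # True: a minimizer can exist
print(is_positive_definite(H_saddle))  # False: no local minimum here
```

Negative definiteness (all eigenvalues negative), needed for a maximization problem, can be tested the same way by checking `eigvals < -tol`.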

Two objective functions are required to set up the optimization problem based
on the given data: the first objective function is due to parameter determination,
and the second is due to the optimal values of the obtained objective function after
the parameters are found.

2. Nonlinear Optimization of parameters and its
Hessian matrix
Parameters of an objective function must be determined by minimizing errors. In
the general case, one has a vector y ∈ D ⊂ ℜ^n which is modelled by a continuous
approximation ỹ such that the sum of squared distances is minimized, i.e.

min R(a) = Σ_i {y_i − ỹ_i(x, a)}²,

with x ∈ ℜ^n and the vector of parameters a ∈ ℜ^m, m
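The parameter determination described above can be sketched as a small nonlinear least squares fit. The exponential model ỹ(x, a) = a₁ exp(a₂ x) and the synthetic data below are hypothetical stand-ins for illustration, not the models used in the paper; the minimization of R(a) is delegated to `scipy.optimize.least_squares`.

```python
import numpy as np
from scipy.optimize import least_squares

def model(a, x):
    """Hypothetical model y~(x, a) = a1 * exp(a2 * x)."""
    return a[0] * np.exp(a[1] * x)

def residuals(a, x, y):
    """Residual vector y_i - y~_i(x, a) whose squared sum is R(a)."""
    return y - model(a, x)

# Synthetic noiseless data generated from known parameters a = (2.0, -0.5)
x = np.linspace(0.0, 5.0, 30)
y = model([2.0, -0.5], x)

# Minimize R(a) starting from an initial guess away from the true parameters
fit = least_squares(residuals, x0=[1.0, 0.0], args=(x, y))
print(fit.x)  # close to [2.0, -0.5]
```

Because the data here are noiseless, the fit recovers the generating parameters; with real measurements the residual at the optimum stays nonzero, which is exactly why the curvature (Hessian) of R must then be examined.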
