
Lines Matching full:cost

121    this is the case when computing cost only. If ``jacobians[i]`` is
189 To get an auto differentiated cost function, you must define a
191 cost function in terms of the template parameter ``T``. The
206 measurements, where there is an instance of the cost function for
209 The actual cost added to the total problem is :math:`e^2`, or
213 To write an auto-differentiable cost function for the above model,
239 Then given this class definition, the auto differentiated cost
261 :class:`AutoDiffCostFunction` also supports cost functions with a
277 The framework can currently accommodate cost functions of up to 10
316 slightly. The expected interface for the cost functors is:
327 also specify the sizes after creating the dynamic autodiff cost
338 Under the hood, the implementation evaluates the cost function
344 while reducing the number of passes over the cost function. The
356 In some cases, it is not possible to define a templated cost functor,
406 instance of the cost function for each measurement :math:`k`.
458 NumericDiffCostFunction also supports cost functions with a
476 The framework can currently accommodate cost functions of up to 10
481 the cost of twice as many function evaluations as forward
501 To get a numerically differentiated cost function, define a
509 concrete cost function, even though it could be implemented only in
512 The numerically differentiated version of a cost function for a
513 cost function can be constructed as follows:
546 expected interface for the cost functors is:
556 also specify the sizes after creating the dynamic numeric diff cost
576 Sometimes parts of a cost function can be differentiated
643 Now, we are ready to construct an automatically differentiated cost
737 values of a wrapped cost function. An example where this is useful is
738 where you have an existing cost function that produces N values, but you
739 want the total cost to be something other than just the sum of these
741 values, to change their contribution to the cost.
752 // Make N 1x1 cost functions (1 parameter, 1 residual)
790 Implements a cost function of the form
792 .. math:: cost(x) = ||A(x - b)||^2
795 variable. In case the user is interested in implementing a cost
798 .. math:: cost(x) = (x - \mu)^T S^{-1} (x - \mu)
837 Using a robust loss function, the cost for large residuals is
854 Here the convention is that the contribution of a term to the cost
869 so that they mimic the squared cost for small residuals.
1058 to remove the null directions of the cost. More generally, if
1068 tangent space. For a cost function defined on this sphere, given a
1260 The cost function carries with it information about the sizes of
1264 detected. ``loss_function`` can be ``NULL``, in which case the cost
1313 Add a residual block to the overall cost function. The cost
1318 NULL, in which case the cost of the term is just the squared norm
1370 block depends on are not removed. The cost and loss functions for the
1387 of the problem (similar to cost/loss functions in residual block
1496 .. function:: bool Problem::Evaluate(const Problem::EvaluateOptions& options, double* cost, vector<double>* residuals, vector<double>* gradient, CRSMatrix* jacobian)
1508 double cost = 0.0;
1509 problem.Evaluate(Problem::EvaluateOptions(), &cost, NULL, NULL, NULL);
1511 The cost is evaluated at `x = 1`. If you wish to evaluate the
1517 problem.Evaluate(Problem::EvaluateOptions(), &cost, NULL, NULL, NULL);