    Searched refs:preconditioner (Results 1 - 12 of 12)

  /external/ceres-solver/internal/ceres/
implicit_schur_complement.h 94 // preconditioner indicates whether the inverse of the matrix F'F
95 // should be computed or not as a preconditioner for the Schur
100 ImplicitSchurComplement(int num_eliminate_blocks, bool preconditioner);
iterative_schur_complement_solver.cc 83 // matrix with the block diagonal of the matrix F'F as the preconditioner.
101 // that the only method ever called on the preconditioner is the
104 cg_per_solve_options.preconditioner =
118 cg_per_solve_options.preconditioner =
122 LOG(FATAL) << "Unknown Preconditioner Type";
linear_solver.h 146 preconditioner(NULL),
185 // simplest form a preconditioner is a matrix M such that instead
194 // A null preconditioner is equivalent to an identity matrix being
195 // used as a preconditioner.
196 LinearOperator* preconditioner; member in struct:ceres::internal::LinearSolver::PerSolveOptions
cgnr_solver.cc 64 cg_per_solve_options.preconditioner = jacobi_preconditioner_.get();
conjugate_gradients_solver.cc 120 // Apply preconditioner
121 if (per_solve_options.preconditioner != NULL) {
123 per_solve_options.preconditioner->RightMultiply(r.data(), z.data());
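The conjugate_gradients_solver.cc match above is the preconditioner-application step of Ceres's CG loop: when a preconditioner is set, it computes $z = M^{-1}r$; a null preconditioner falls back to the identity. A minimal sketch of that pattern, using a hypothetical `Preconditioner` interface with `std::vector` in place of Ceres's `LinearOperator` and raw pointers:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical stand-in for Ceres's LinearOperator: applies M^{-1} to a vector.
struct Preconditioner {
  virtual ~Preconditioner() = default;
  virtual void RightMultiply(const std::vector<double>& x,
                             std::vector<double>* y) const = 0;
};

// Mirrors the step in conjugate_gradients_solver.cc: if a preconditioner is
// present, z = M^{-1} r; otherwise a null preconditioner acts as the identity,
// as linear_solver.h documents.
void ApplyPreconditioner(const Preconditioner* preconditioner,
                         const std::vector<double>& r,
                         std::vector<double>* z) {
  if (preconditioner != nullptr) {
    preconditioner->RightMultiply(r, z);
  } else {
    *z = r;  // Identity preconditioner: z = r.
  }
}
```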
implicit_schur_complement.cc 45 bool preconditioner)
47 preconditioner_(preconditioner),
system_test.cc 480 #define CONFIGURE(linear_solver, sparse_linear_algebra_library, ordering, preconditioner) \
484 preconditioner))
  /external/eigen/Eigen/src/IterativeLinearSolvers/
IterativeSolverBase.h 25 typedef typename internal::traits<Derived>::Preconditioner Preconditioner;
62 * Currently, this function mostly calls analyzePattern on the preconditioner. In the future
76 * Currently, this function mostly calls factorize on the preconditioner.
95 * Currently, this function mostly initializes/computes the preconditioner. In the future
129 /** \returns a read-write reference to the preconditioner for custom configuration. */
130 Preconditioner& preconditioner() { return m_preconditioner; } function in class:Eigen::IterativeSolverBase
132 /** \returns a read-only reference to the preconditioner. */
133 const Preconditioner& preconditioner() const { return m_preconditioner; } function in class:Eigen::IterativeSolverBase
  /external/ceres-solver/examples/
bundle_adjuster.cc 83 DEFINE_string(preconditioner, "jacobi", "Options are: "
nist.cc 61 DEFINE_string(preconditioner, "jacobi", "Options are: "
  /external/ceres-solver/docs/
solving.tex 340 Thus, we can run PCG on $S$ with the same computational effort per iteration as PCG on $H$, while reaping the benefits of a more powerful preconditioner. In fact, we do not even need to compute $H$, \eqref{eq:schurtrick1} can be implemented using just the columns of $J$.
344 \section{Preconditioner}
347 The solution to this problem is to replace~\eqref{eq:normal} with a {\em preconditioned} system. Given a linear system $Ax = b$ and a preconditioner $M$, the preconditioned system is given by $M^{-1}Ax = M^{-1}b$. The resulting algorithm is known as the Preconditioned Conjugate Gradients (PCG) algorithm, and its worst-case complexity now depends on the condition number of the {\em preconditioned} matrix, $\kappa(M^{-1}A)$.
349 The computational cost of using a preconditioner $M$ is the cost of computing $M$ and evaluating the product $M^{-1}y$ for arbitrary vectors $y$. Thus, there are two competing factors to consider: how much of $H$'s structure is captured by $M$ so that the condition number $\kappa(M^{-1}H)$ is low, and the computational cost of constructing and using $M$. The ideal preconditioner would be one for which $\kappa(M^{-1}A) = 1$. $M = A$ achieves this, but it is not a practical choice, as applying this preconditioner would require solving a linear system equivalent to the unpreconditioned problem. It is usually the case that the more information $M$ has about $H$, the more expensive it is to use. For example, incomplete Cholesky factorization based preconditioners have much better convergence behavior than the Jacobi preconditioner, but are also much more expensive.
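The PCG iteration described in the excerpt above can be sketched in a few lines. The following is a minimal dense Jacobi-preconditioned CG in plain C++, not Ceres's implementation: matrix and vector types are illustrative `std::vector` aliases, and applying $M^{-1}$ with $M = \operatorname{diag}(A)$ reduces to element-wise division by the (positive, since $A$ is SPD) diagonal.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;  // Dense SPD matrix, row-major.

Vec MatVec(const Mat& A, const Vec& x) {
  Vec y(A.size(), 0.0);
  for (std::size_t i = 0; i < A.size(); ++i)
    for (std::size_t j = 0; j < x.size(); ++j) y[i] += A[i][j] * x[j];
  return y;
}

double Dot(const Vec& a, const Vec& b) {
  double s = 0.0;
  for (std::size_t i = 0; i < a.size(); ++i) s += a[i] * b[i];
  return s;
}

// Solves A x = b by PCG with the Jacobi preconditioner M = diag(A):
// each application of M^{-1} is an element-wise division by the diagonal.
Vec JacobiPCG(const Mat& A, const Vec& b, int max_iter, double tol) {
  const std::size_t n = b.size();
  Vec x(n, 0.0), r = b;  // With x0 = 0, the initial residual is r = b.
  Vec z(n), p(n);
  for (std::size_t i = 0; i < n; ++i) z[i] = r[i] / A[i][i];  // z = M^{-1} r.
  p = z;
  double rz = Dot(r, z);
  for (int k = 0; k < max_iter; ++k) {
    Vec Ap = MatVec(A, p);
    double alpha = rz / Dot(p, Ap);
    for (std::size_t i = 0; i < n; ++i) {
      x[i] += alpha * p[i];
      r[i] -= alpha * Ap[i];
    }
    if (std::sqrt(Dot(r, r)) < tol) break;
    for (std::size_t i = 0; i < n; ++i) z[i] = r[i] / A[i][i];
    const double rz_new = Dot(r, z);
    const double beta = rz_new / rz;
    rz = rz_new;
    for (std::size_t i = 0; i < n; ++i) p[i] = z[i] + beta * p[i];
  }
  return x;
}
```

Per iteration, the only extra work over plain CG is the $O(n)$ division by the diagonal; richer preconditioners trade a more expensive apply for a smaller condition number, exactly the trade-off the text describes.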
352 The simplest of all preconditioners is the diagonal or Jacobi preconditioner, \ie, $M=\operatorname{diag}(A)$, which for block structured matrices like $H$ can be generalized to the block Jacobi preconditioner.
354 For \texttt{ITERATIVE\_SCHUR} there are two obvious choices for block diagonal preconditioners for $S$: the block diagonal of the matrix $B$~\cite{mandel1990block} and the block diagonal of $S$, \ie, the block Jacobi preconditioner for $S$. Ceres implements both of these preconditioners and refers to them as \texttt{JACOBI} a (…)
changes.tex 233 \item New iterative linear solver for general sparse problems - \texttt{CGNR} and a block Jacobi preconditioner for it.
