
Lines Matching defs:tensors

63 // Operations the tape needs to perform on tensors to do backpropagation. Named
90 // Consumes references to the tensors in the gradient_tensors list and returns
123 // functions (and hence the tensors they keep alive). Instead, everything
146 // once) and produces the gradient of the target tensors with respect to the
147 // source tensors. The output gradients are used if not empty and not
203 std::vector<TapeTensor> tensors;
204 tensors.reserve(output_tensors.size());
210 tensors.push_back(o);
213 op_type, tensors, ids, backward_function, backward_function_deleter};
233 // Do not delete watched tensors.
265 // Tensors need gradients to be computed (Tensors which are not used do not need
278 // When the stack is empty we have gradients for all tensors we're interested
291 // Maps from op ID to how many output tensors of this op still need to have