
Lines Matching defs:Load

11 // instructions.  It also performs simple dead load elimination.
69 static cl::opt<bool> EnableLoadPRE("enable-load-pre", cl::init(true));
515 LoadVal, // A value produced by a load.
524 /// Offset - The byte offset in Val that is interesting for the load query.
696 // Helper functions for redundant load elimination
847 // The store has to be at least as big as the load.
856 /// then a load from a must-aliased pointer of a different type, try to coerce
857 /// the stored value. LoadedTy is the type of the load we want to replace and
894 // Cast to pointer if the load needs a pointer type.
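A sketch of the coercion these lines describe, in the typed-pointer IR syntax the file's own comments use (%P and the stored constant are illustrative, not from the file):

    define i8 @coerce(i8* %P) {
    entry:
      %P32 = bitcast i8* %P to i32*
      store i32 305419896, i32* %P32     ; store 0x12345678 through an i32*
      %v = load i8* %P                   ; must-aliased load of a narrower type
      ; GVN can forward the store by coercing the value: on a little-endian
      ; target %v becomes trunc i32 305419896 to i8, with no reload
      ret i8 %v
    }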
941 /// memdep query of a load that ends up being a clobbering memory write (store,
943 /// by the load but we can't be sure because the pointers don't mustalias.
947 /// value of the piece that feeds the load.
964 // If the load and store are to the exact same address, they should have been
967 // to a load from the base of the memset.
970 dbgs() << "STORE/LOAD DEP WITH COMMON POINTER MISSED:\n"
974 << "Load Ptr = " << *LoadPtr << "\n";
979 // If the load and store don't overlap at all, the store doesn't provide
980 // anything to the load. In this case, they really don't alias at all, AA
998 dbgs() << "STORE LOAD DEP WITH COMMON BASE:\n"
1002 << "Load Ptr = " << *LoadPtr << "\n";
1008 // If the Load isn't completely contained within the stored bits, we don't
1010 // (issue a smaller load then merge the bits in) but this seems unlikely to be
1017 // store that the load is.
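For illustration, a minimal memset-forwarding case of the kind this clobber analysis handles (the five-operand memset intrinsic matches this era of the IR; all names are made up):

    declare void @llvm.memset.p0i8.i64(i8*, i8, i64, i32, i1)

    define i32 @forward_memset(i8* %P) {
    entry:
      call void @llvm.memset.p0i8.i64(i8* %P, i8 1, i64 16, i32 1, i1 false)
      %Q = getelementptr i8* %P, i64 4   ; load lies wholly inside the memset
      %Q32 = bitcast i8* %Q to i32*
      %v = load i32* %Q32
      ; the analysis computes Offset = 4; every byte is 1, so the load
      ; folds to 0x01010101 (16843009)
      ret i32 %v
    }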
1022 /// memdep query of a load that ends up being a clobbering store.
1038 /// memdep query of a load that ends up being clobbered by another load. See if
1039 /// the other load can feed into the second load.
1051 // If we have a load/load clobber and DepLI can be widened to cover this load,
1099 // Otherwise, see if we can constant fold a load from the constant with the
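A small example of that constant-folding path (the global @G is illustrative):

    @G = constant [4 x i8] c"\01\02\03\04"

    define i8 @fold() {
    entry:
      %p = getelementptr [4 x i8]* @G, i64 0, i64 2
      %v = load i8* %p                   ; load of constant memory
      ; with the offset applied, the load constant-folds to i8 3
      ret i8 %v
    }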
1115 /// memdep query of a load that ends up being a clobbering store. This means
1116 /// that the store provides bits used by the load but the pointers don't
1129 // Compute which bits of the stored value are being used by the load. Convert
1154 /// memdep query of a load that ends up being a clobbering load. This means
1155 /// that the clobbering load *may* provide bits used by the load but we can't be sure
1163 // widen SrcVal out to a larger load.
1167 assert(SrcVal->isSimple() && "Cannot widen volatile/atomic load!");
1168 assert(SrcVal->getType()->isIntegerTy() && "Can't widen non-integer load");
1169 // If we have a load/load clobber and DepLI can be widened to cover this
1170 // load, then we should widen it to the next power of 2 size big enough!
1177 // Insert the new load after the old load. This ensures that subsequent
1178 // memdep queries will find the new load. We can't easily remove the old
1179 // load completely because it is already in the value numbering table.
1191 DEBUG(dbgs() << "GVN WIDENED LOAD: " << *SrcVal << "\n");
1194 // Replace uses of the original load with the wider load. On a big endian
1204 // because the load is already memoized into the leader map table that GVN
1205 // tracks. It is potentially possible to remove the load from the table,
1207 // rehashed. Just leave the dead load around.
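A sketch of the widening case this block describes (names illustrative):

    define i16 @widen(i8* %P) {
    entry:
      %A = load i8* %P                   ; SrcVal: the earlier, narrower load
      %P16 = bitcast i8* %P to i16*
      %B = load i16* %P16                ; needs a byte that %A does not cover
      ; GVN widens %A to an i16 load of %P16 (the next power-of-two size big
      ; enough), inserted right after %A; %A becomes a trunc of the wide value,
      ; %B reuses it, and the dead i8 load is left in the value numbering table
      %A16 = zext i8 %A to i16
      %r = add i16 %A16, %B
      ret i16 %r
    }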
1217 /// memdep query of a load that ends up being a clobbering mem intrinsic.
1227 // provides the bits for the load.
1261 // Otherwise, see if we can constant fold a load from the constant with the
1280 // Check for the fully redundant, dominating load case. In this case, we can
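The dominating-load case is the simplest one; a sketch:

    define i32 @dominated(i32* %P, i1 %c) {
    entry:
      %A = load i32* %P                  ; dominates the second load
      br i1 %c, label %then, label %join
    then:
      br label %join
    join:
      %B = load i32* %P                  ; fully redundant with %A
      %r = add i32 %A, %B                ; %B is simply replaced by %A
      ret i32 %r
    }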
1344 LoadInst *Load = getCoercedLoadValue();
1345 if (Load->getType() == LoadTy && Offset == 0) {
1346 Res = Load;
1348 Res = GetLoadValueForLoad(Load, Offset, LoadTy, BB->getTerminator(),
1351 DEBUG(dbgs() << "GVN COERCED NONLOCAL LOAD:\nOffset: " << Offset << " "
1381 // dependencies that produce an unknown value for the load (such as a call
1382 // that could potentially clobber the load).
1390 // A dead dependent mem-op can disguise itself as a load evaluating the same value
1391 // as the load in question.
1403 // the pointer operand of the load if PHI translation occurs. Make sure
1408 // read by the load, we can extract the bits we need for the load from the
1424 // load i32* P
1425 // load i8* (P+1)
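On a little-endian target the narrower load in that pair can be rewritten as an extraction from the wider one; a sketch (names illustrative; a big-endian target shifts from the other end):

    define i8 @extract(i32* %P) {
    entry:
      %wide = load i32* %P               ; the earlier "load i32* P"
      %P8 = bitcast i32* %P to i8*
      %P1 = getelementptr i8* %P8, i64 1
      %narrow = load i8* %P1             ; the "load i8* (P+1)"
      ; %narrow is rewritten as an extraction from %wide:
      ; lshr i32 %wide, 8 followed by trunc to i8
      ret i8 %narrow
    }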
1499 // If the types mismatch and we can't handle it, reject reuse of the load.
1520 // doing PRE of this load. This will involve inserting a new load into the
1523 // that we only have to insert *one* load (which means we're basically moving
1524 // the load, not inserting a new one).
1544 // block along which the load may not be anticipated. Hoisting the load
1545 // above this block would be adding the load to execution paths along
1573 DEBUG(dbgs() << "COULD NOT PRE LOAD BECAUSE OF INDBR CRITICAL EDGE '"
1580 << "COULD NOT PRE LOAD BECAUSE OF LANDING PAD CRITICAL EDGE '"
1592 // Decide whether PRE is profitable for this load.
1597 // If this load is unavailable in multiple predecessors, reject it.
1599 // all the preds that don't have an available LI and insert a new load into
1613 // Check if the load can safely be moved to all the unavailable predecessors.
1624 // the load on the pred (?!?), so we can insert code to materialize the
1654 // Okay, we can eliminate this load by inserting a reload in the predecessor
1657 DEBUG(dbgs() << "GVN REMOVING PRE LOAD: " << *LI << '\n');
1679 // Transfer the old load's AA tags to the new load.
1688 // Add the newly created load.
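A sketch of the load-PRE shape these lines describe, with the single unavailable predecessor receiving the reload (%A.pre is a hypothetical name):

    define i32 @pre(i32* %P, i1 %c) {
    entry:
      br i1 %c, label %has, label %miss
    has:
      %A = load i32* %P                  ; value available along this path
      br label %join
    miss:                                ; the one unavailable predecessor;
      br label %join                     ; PRE inserts "%A.pre = load i32* %P" here
    join:
      %B = load i32* %P                  ; becomes phi [ %A, %has ], [ %A.pre, %miss ]
      ret i32 %B
    }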
1707 /// Attempt to eliminate a load whose dependencies are
1710 // Step 1: Find the non-local dependencies of the load.
1715 // dependencies, this load isn't worth worrying about. Optimizing
1726 dbgs() << "GVN: non-local load ";
1733 // If this load follows a GEP, see if we can PRE the indices before analyzing.
1742 // Step 2: Analyze the availability of the load
1747 // If we have no predecessors that produce a known value for this load, exit
1755 // load, then it is fully redundant and we can use PHI insertion to compute
1758 DEBUG(dbgs() << "GVN REMOVING NONLOCAL LOAD: " << *LI << '\n');
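When the value is known in every predecessor, no new load is needed; a sketch of the PHI-insertion case (names illustrative):

    define i32 @fully_avail(i32* %P, i1 %c) {
    entry:
      br i1 %c, label %a, label %b
    a:
      store i32 1, i32* %P
      br label %join
    b:
      store i32 2, i32* %P
      br label %join
    join:
      %v = load i32* %P                  ; known in every predecessor, so the
      ret i32 %v                         ; load becomes phi i32 [ 1, %a ], [ 2, %b ]
    }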
1820 /// Attempt to eliminate a load, first by eliminating it
1845 // %C = load i8* %B
1849 // completely covers this load. This sort of thing can happen in bitfield
1861 // load i32* P
1862 // load i8* (P+1)
1889 // Replace the load!
1903 dbgs() << "GVN: load ";
1918 dbgs() << "GVN: load ";
1929 // The store and load are to a must-aliased pointer, but they may not
1962 DEBUG(dbgs() << "GVN COERCED LOAD:\n" << *DepLI << "\n" << *AvailableVal
1975 // If this load really doesn't depend on anything, then we must be loading an
1985 // If this load occurs either right after a lifetime begin,
1996 // If this load follows a calloc (which zero initializes memory),
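A sketch of the calloc case (assuming nothing clobbers the memory between the call and the load; the lifetime-begin case mentioned above similarly yields undef):

    declare noalias i8* @calloc(i64, i64)

    define i32 @from_calloc() {
    entry:
      %m = call i8* @calloc(i64 4, i64 4)  ; zero-initialized allocation
      %p = bitcast i8* %m to i32*
      %v = load i32* %p                    ; nothing stored since the calloc,
      ret i32 %v                           ; so the load folds to i32 0
    }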