Commit 729f495a by Richard Sandiford, committed by Richard Sandiford

Improve canonicalisation of TARGET_MEM_REFs

A general TARGET_MEM_REF is:

    BASE + STEP * INDEX + INDEX2 + OFFSET

After classifying the address in this way, the code that builds
TARGET_MEM_REFs tries to simplify the address until it's valid
for the current target and for the mode of memory being addressed.
It does this in a fixed order:

(1) add SYMBOL to BASE
(2) add INDEX * STEP to the base, if STEP != 1
(3) add OFFSET to INDEX or BASE (reverted if unsuccessful)
(4) add INDEX to BASE
(5) add OFFSET to BASE
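
To make the fixed order concrete, here is a small standalone C sketch.
It is an illustration only: the struct, the toy_* names and the
"index or offset, not both" validity rule are invented for this
walkthrough and are not GCC internals, and steps (3) and (5) are
collapsed into a single offset-to-base fold.

    /* Standalone toy model of the fixed order above.  Illustration only:
       nothing in it is GCC code.  */
    #include <stdbool.h>
    #include <stdio.h>

    /* One slot per component of BASE + STEP * INDEX + INDEX2 + OFFSET,
       plus the SYMBOL that step (1) folds into BASE.  */
    struct toy_parts
    {
      long symbol, base, index, index2, step, offset;
    };

    /* Invented target rule: a scaled index and a nonzero offset may not
       appear in the same address.  */
    static bool
    toy_valid_p (const struct toy_parts *p)
    {
      return !(p->index != 0 && p->offset != 0);
    }

    /* Follow the fixed order, folding one component at a time into BASE
       and stopping as soon as the toy target accepts the address.
       Steps (3) and (5) are collapsed into a single offset fold.  */
    static void
    toy_simplify (struct toy_parts *p)
    {
      /* (1) add SYMBOL to BASE.  */
      p->base += p->symbol;
      p->symbol = 0;
      if (toy_valid_p (p))
        return;

      /* (2) add INDEX * STEP to BASE, if STEP != 1.  */
      if (p->step != 1)
        {
          p->base += p->index * p->step;
          p->index = 0;
          p->step = 1;
        }
      if (toy_valid_p (p))
        return;

      /* (3)/(5) add OFFSET to BASE.  */
      p->base += p->offset;
      p->offset = 0;
      if (toy_valid_p (p))
        return;

      /* (4) add INDEX and INDEX2 to BASE.  */
      p->base += p->index + p->index2;
      p->index = 0;
      p->index2 = 0;
    }

    int
    main (void)
    {
      /* The example discussed next: &symbol + offset + index * 8,
         with symbol at 0x1000, offset 16 and index 5.  */
      struct toy_parts p = { .symbol = 0x1000, .base = 0, .index = 5,
                             .index2 = 0, .step = 8, .offset = 16 };
      toy_simplify (&p);
      /* Prints base=4136 index=0 step=1 offset=16: step (2) has already
         folded the scaled index away.  */
      printf ("base=%ld index=%ld step=%ld offset=%ld\n",
              p.base, p.index, p.step, p.offset);
      return 0;
    }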

So suppose we had an address:

    &symbol + offset + index * 8

(e.g. a[i + 1] for a global "a") on a target that only allows an index or
an offset, not both.  Following the steps above, we'd first create:

    tmp = symbol
    tmp2 = tmp + index * 8

Then if the given offset value was valid for the mode being addressed,
we'd create:

    MEM[base:tmp2, offset:offset]

while if it was invalid we'd create:

    tmp3 = tmp2 + offset
    MEM[base:tmp3, offset:0]

The problem is that either of these sequences could be generated even
when ivopts had deliberately chosen a scaled index for an address that
happens to have a constant base.  The old procedure failed to give an
indexed TARGET_MEM_REF in that case, and because the offset was added
last, later passes were unable to fold the index back in.

The patch avoids this by checking at step (2) whether the offset is the
only component that makes the address invalid, and folding the offset
into the base if so, leaving the index and scale untouched.
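
The same kind of toy sketch can show the effect of the fix (again, the
toy_* names and the validity rule are invented; the real counterparts
are mem_ref_valid_without_offset_p and add_offset_to_base in the diff
below): before step (2) folds the scaled index away, check whether
dropping the offset alone would make the address valid, and if so fold
the offset into the base and keep the index and step.

    /* Toy model of the new early check at step (2).  Illustration only,
       not GCC code.  */
    #include <stdbool.h>
    #include <stdio.h>

    struct toy_parts
    {
      long base, index, step, offset;  /* SYMBOL already folded into BASE.  */
    };

    /* Same invented target rule as before: a scaled index and a nonzero
       offset may not appear in the same address.  */
    static bool
    toy_valid_p (const struct toy_parts *p)
    {
      return !(p->index != 0 && p->offset != 0);
    }

    /* Counterpart of mem_ref_valid_without_offset_p: take PARTS by value,
       pretend the offset has been folded into the base, and test whether
       the rest of the address would then be valid.  */
    static bool
    toy_valid_without_offset_p (struct toy_parts p)
    {
      p.base += p.offset;
      p.offset = 0;
      return toy_valid_p (&p);
    }

    /* Step (2) with the fix applied.  Returns true if the address was
       finalised early with its scaled index intact.  */
    static bool
    toy_step_2 (struct toy_parts *p)
    {
      if (p->offset != 0 && toy_valid_without_offset_p (*p))
        {
          /* Counterpart of add_offset_to_base: keep INDEX and STEP.  */
          p->base += p->offset;
          p->offset = 0;
          return true;
        }
      /* Old behaviour: fold INDEX * STEP into BASE, losing the index.  */
      p->base += p->index * p->step;
      p->index = 0;
      p->step = 1;
      return false;
    }

    int
    main (void)
    {
      /* &symbol + offset + index * 8 with symbol at 0x1000, offset 16.  */
      struct toy_parts p = { .base = 0x1000, .index = 5, .step = 8,
                             .offset = 16 };
      bool kept = toy_step_2 (&p);
      /* Prints: kept index: yes, base=4112 index=5 step=8 offset=0.  */
      printf ("kept index: %s, base=%ld index=%ld step=%ld offset=%ld\n",
              kept ? "yes" : "no", p.base, p.index, p.step, p.offset);
      return 0;
    }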

2018-01-13  Richard Sandiford  <richard.sandiford@linaro.org>
	    Alan Hayward  <alan.hayward@arm.com>
	    David Sherwood  <david.sherwood@arm.com>

gcc/
	* tree-ssa-address.c (mem_ref_valid_without_offset_p): New function.
	(add_offset_to_base): New function, split out from...
	(create_mem_ref): ...here.  When handling a scale other than 1,
	check first whether the address is valid without the offset.
	Add it into the base if so, leaving the index and scale as-is.

Co-Authored-By: Alan Hayward <alan.hayward@arm.com>
Co-Authored-By: David Sherwood <david.sherwood@arm.com>

From-SVN: r256609
--- a/gcc/tree-ssa-address.c
+++ b/gcc/tree-ssa-address.c
@@ -746,6 +746,35 @@ gimplify_mem_ref_parts (gimple_stmt_iterator *gsi, struct mem_address *parts)
 				     true, GSI_SAME_STMT);
 }
 
+/* Return true if the OFFSET in PARTS is the only thing that is making
+   it an invalid address for type TYPE.  */
+
+static bool
+mem_ref_valid_without_offset_p (tree type, mem_address parts)
+{
+  if (!parts.base)
+    parts.base = parts.offset;
+  parts.offset = NULL_TREE;
+  return valid_mem_ref_p (TYPE_MODE (type), TYPE_ADDR_SPACE (type), &parts);
+}
+
+/* Fold PARTS->offset into PARTS->base, so that there is no longer
+   a separate offset.  Emit any new instructions before GSI.  */
+
+static void
+add_offset_to_base (gimple_stmt_iterator *gsi, mem_address *parts)
+{
+  tree tmp = parts->offset;
+  if (parts->base)
+    {
+      tmp = fold_build_pointer_plus (parts->base, tmp);
+      tmp = force_gimple_operand_gsi_1 (gsi, tmp, is_gimple_mem_ref_addr,
+					NULL_TREE, true, GSI_SAME_STMT);
+    }
+  parts->base = tmp;
+  parts->offset = NULL_TREE;
+}
+
 /* Creates and returns a TARGET_MEM_REF for address ADDR.  If necessary
    computations are emitted in front of GSI.  TYPE is the mode
    of created memory reference. IV_CAND is the selected iv candidate in ADDR,
@@ -812,6 +841,14 @@ create_mem_ref (gimple_stmt_iterator *gsi, tree type, aff_tree *addr,
   if (parts.step && !integer_onep (parts.step))
     {
       gcc_assert (parts.index);
+      if (parts.offset && mem_ref_valid_without_offset_p (type, parts))
+	{
+	  add_offset_to_base (gsi, &parts);
+	  mem_ref = create_mem_ref_raw (type, alias_ptr_type, &parts, true);
+	  gcc_assert (mem_ref);
+	  return mem_ref;
+	}
+
       parts.index = force_gimple_operand_gsi (gsi,
					       fold_build2 (MULT_EXPR, sizetype,
							    parts.index, parts.step),
@@ -906,18 +943,7 @@ create_mem_ref (gimple_stmt_iterator *gsi, tree type, aff_tree *addr,
        [base'].  */
   if (parts.offset && !integer_zerop (parts.offset))
     {
-      tmp = parts.offset;
-      parts.offset = NULL_TREE;
-      /* Add offset to base.  */
-      if (parts.base)
-	{
-	  tmp = fold_build_pointer_plus (parts.base, tmp);
-	  tmp = force_gimple_operand_gsi_1 (gsi, tmp,
-					    is_gimple_mem_ref_addr,
-					    NULL_TREE, true, GSI_SAME_STMT);
-	}
-      parts.base = tmp;
-
+      add_offset_to_base (gsi, &parts);
       mem_ref = create_mem_ref_raw (type, alias_ptr_type, &parts, true);
       if (mem_ref)
 	return mem_ref;