Commit fef5a0d9 authored by Richard Biener, committed by Richard Biener

builtins.c (c_getstr, [...]): Export.

2014-08-08  Richard Biener  <rguenther@suse.de>

	* builtins.c (c_getstr, readonly_data_expr, init_target_chars,
	target_percent, target_percent_s): Export.
	(var_decl_component_p, fold_builtin_memory_op, fold_builtin_memset,
	fold_builtin_bzero, fold_builtin_strcpy, fold_builtin_strncpy,
	fold_builtin_strcat, fold_builtin_fputs, fold_builtin_memory_chk,
	fold_builtin_stxcpy_chk, fold_builtin_stxncpy_chk,
	fold_builtin_sprintf_chk_1, fold_builtin_snprintf_chk_1):
	Move to gimple-fold.c.
	(fold_builtin_2): Remove handling of bzero, fputs, fputs_unlocked,
	strcat and strcpy.
	(fold_builtin_3): Remove handling of memset, bcopy, memcpy,
	mempcpy, memmove, strncpy, strcpy_chk and stpcpy_chk.
	(fold_builtin_4): Remove handling of memcpy_chk, mempcpy_chk,
	memmove_chk, memset_chk, strncpy_chk and stpncpy_chk.
	(rewrite_call_expr_array): Remove.
	(fold_builtin_sprintf_chk): Likewise.
	(fold_builtin_snprintf_chk): Likewise.
	(fold_builtin_varargs): Remove handling of sprintf_chk,
	vsprintf_chk, snprintf_chk and vsnprintf_chk.
	(gimple_fold_builtin_sprintf_chk): Remove.
	(gimple_fold_builtin_snprintf_chk): Likewise.
	(gimple_fold_builtin_varargs): Likewise.
	(fold_call_stmt): Do not call gimple_fold_builtin_varargs.
	* predict.c (optimize_bb_for_size_p): Handle NULL bb.
	* gimple.c (gimple_seq_add_seq_without_update): New function.
	* gimple.h (gimple_seq_add_seq_without_update): Declare.
	* gimple-fold.c: Include output.h.
	(gsi_replace_with_seq_vops): New function, split out from ...
	(gimplify_and_update_call_from_tree): ... here.
	(replace_call_with_value): New function.
	(replace_call_with_call_and_fold): Likewise.
	(var_decl_component_p): Moved from builtins.c.
	(gimple_fold_builtin_memory_op): Moved from builtins.c
	fold_builtin_memory_op and rewritten to GIMPLE.
	(gimple_fold_builtin_memset): Likewise.
	(gimple_fold_builtin_strcpy): Likewise.
	(gimple_fold_builtin_strncpy): Likewise.
	(gimple_fold_builtin_strcat): Likewise.
	(gimple_fold_builtin_fputs): Likewise.
	(gimple_fold_builtin_memory_chk): Likewise.
	(gimple_fold_builtin_stxcpy_chk): Likewise.
	(gimple_fold_builtin_stxncpy_chk): Likewise.
	(gimple_fold_builtin_snprintf_chk): Likewise.
	(gimple_fold_builtin_sprintf_chk): Likewise.
	(gimple_fold_builtin_strlen): New function.
	(gimple_fold_builtin_with_strlen): New function split out from
	gimple_fold_builtin.
	(gimple_fold_builtin): Change signature and handle
	bzero, memset, bcopy, memcpy, mempcpy and memmove folding
	here.  Call gimple_fold_builtin_with_strlen.
	(gimple_fold_call): Adjust.

	* gcc.dg/strlenopt-8.c: Remove XFAIL.
	* gcc.dg/tree-prof/stringop-2.c: Adjust.
	* gfortran.dg/array_memcpy_4.f90: Likewise.
	* gfortran.dg/trim_optimize_1.f90: Likewise.
	* gfortran.dg/trim_optimize_2.f90: Likewise.

From-SVN: r213753
parent 322d490e
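The diff below moves the tree-level folders for the memory and string builtins into GIMPLE folders. As a reminder of what these folders do at the source level, here is a minimal hand-written C illustration of the kind of rewrite involved (an illustration of the transformations only, not GCC internal code; whether a given call is actually rewritten still depends on alignment, known lengths and -Os):

#include <string.h>
#include <strings.h>

char buf[16];

void example (void)
{
  bzero (buf, sizeof buf);   /* can become memset (buf, 0, sizeof buf) */
  strcpy (buf, "hi");        /* can become memcpy (buf, "hi", 3) */
  strcat (buf, "!");         /* can become memcpy (buf + strlen (buf), "!", 2) */
}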
@@ -86,7 +86,6 @@ builtin_info_type builtin_info;
/* Non-zero if __builtin_constant_p should be folded right away. */
bool force_folding_builtin_constant_p;
static const char *c_getstr (tree);
static rtx c_readstr (const char *, enum machine_mode);
static int target_char_cast (tree, char *);
static rtx get_memory_rtx (tree, tree);
@@ -148,7 +147,6 @@ static tree rewrite_call_expr (location_t, tree, int, tree, int, ...);
static bool validate_arg (const_tree, enum tree_code code);
static bool integer_valued_real_p (tree);
static tree fold_trunc_transparent_mathfn (location_t, tree, tree);
static bool readonly_data_expr (tree);
static rtx expand_builtin_fabs (tree, rtx, rtx);
static rtx expand_builtin_signbit (tree, rtx);
static tree fold_builtin_sqrt (location_t, tree, tree);
@@ -164,7 +162,6 @@ static tree fold_builtin_ceil (location_t, tree, tree);
static tree fold_builtin_round (location_t, tree, tree);
static tree fold_builtin_int_roundingfn (location_t, tree, tree);
static tree fold_builtin_bitop (tree, tree);
static tree fold_builtin_memory_op (location_t, tree, tree, tree, tree, bool, int);
static tree fold_builtin_strchr (location_t, tree, tree, tree);
static tree fold_builtin_memchr (location_t, tree, tree, tree, tree);
static tree fold_builtin_memcmp (location_t, tree, tree, tree);
@@ -205,18 +202,16 @@ static void maybe_emit_free_warning (tree);
static tree fold_builtin_object_size (tree, tree);
static tree fold_builtin_strcat_chk (location_t, tree, tree, tree, tree);
static tree fold_builtin_strncat_chk (location_t, tree, tree, tree, tree, tree);
static tree fold_builtin_sprintf_chk (location_t, tree, enum built_in_function);
static tree fold_builtin_printf (location_t, tree, tree, tree, bool, enum built_in_function);
static tree fold_builtin_fprintf (location_t, tree, tree, tree, tree, bool,
enum built_in_function);
static bool init_target_chars (void);
static unsigned HOST_WIDE_INT target_newline;
static unsigned HOST_WIDE_INT target_percent;
unsigned HOST_WIDE_INT target_percent;
static unsigned HOST_WIDE_INT target_c;
static unsigned HOST_WIDE_INT target_s;
static char target_percent_c[3];
static char target_percent_s[3];
char target_percent_s[3];
static char target_percent_s_newline[4];
static tree do_mpfr_arg1 (tree, tree, int (*)(mpfr_ptr, mpfr_srcptr, mp_rnd_t),
const REAL_VALUE_TYPE *, const REAL_VALUE_TYPE *, bool);
@@ -634,7 +629,7 @@ c_strlen (tree src, int only_value)
/* Return a char pointer for a C string if it is a string constant
or sum of string constant and integer constant. */
static const char *
const char *
c_getstr (tree src)
{
tree offset_node;
@@ -8496,503 +8491,6 @@ fold_builtin_exponent (location_t loc, tree fndecl, tree arg,
return NULL_TREE;
}
/* Return true if VAR is a VAR_DECL or a component thereof. */
static bool
var_decl_component_p (tree var)
{
tree inner = var;
while (handled_component_p (inner))
inner = TREE_OPERAND (inner, 0);
return SSA_VAR_P (inner);
}
/* Fold function call to builtin memset. Return
NULL_TREE if no simplification can be made. */
static tree
fold_builtin_memset (location_t loc, tree dest, tree c, tree len,
tree type, bool ignore)
{
tree var, ret, etype;
unsigned HOST_WIDE_INT length, cval;
if (! validate_arg (dest, POINTER_TYPE)
|| ! validate_arg (c, INTEGER_TYPE)
|| ! validate_arg (len, INTEGER_TYPE))
return NULL_TREE;
if (! tree_fits_uhwi_p (len))
return NULL_TREE;
/* If the LEN parameter is zero, return DEST. */
if (integer_zerop (len))
return omit_one_operand_loc (loc, type, dest, c);
if (TREE_CODE (c) != INTEGER_CST || TREE_SIDE_EFFECTS (dest))
return NULL_TREE;
var = dest;
STRIP_NOPS (var);
if (TREE_CODE (var) != ADDR_EXPR)
return NULL_TREE;
var = TREE_OPERAND (var, 0);
if (TREE_THIS_VOLATILE (var))
return NULL_TREE;
etype = TREE_TYPE (var);
if (TREE_CODE (etype) == ARRAY_TYPE)
etype = TREE_TYPE (etype);
if (!INTEGRAL_TYPE_P (etype)
&& !POINTER_TYPE_P (etype))
return NULL_TREE;
if (! var_decl_component_p (var))
return NULL_TREE;
length = tree_to_uhwi (len);
if (GET_MODE_SIZE (TYPE_MODE (etype)) != length
|| get_pointer_alignment (dest) / BITS_PER_UNIT < length)
return NULL_TREE;
if (length > HOST_BITS_PER_WIDE_INT / BITS_PER_UNIT)
return NULL_TREE;
if (integer_zerop (c))
cval = 0;
else
{
if (CHAR_BIT != 8 || BITS_PER_UNIT != 8 || HOST_BITS_PER_WIDE_INT > 64)
return NULL_TREE;
cval = TREE_INT_CST_LOW (c);
cval &= 0xff;
cval |= cval << 8;
cval |= cval << 16;
cval |= (cval << 31) << 1;
}
ret = build_int_cst_type (etype, cval);
var = build_fold_indirect_ref_loc (loc,
fold_convert_loc (loc,
build_pointer_type (etype),
dest));
ret = build2 (MODIFY_EXPR, etype, var, ret);
if (ignore)
return ret;
return omit_one_operand_loc (loc, type, dest, ret);
}
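The cval computation above replicates the stored byte across the whole object so that a small, suitably aligned memset becomes a single store. A standalone sketch of the resulting transformation, assuming 8-bit bytes (illustration only, not GCC code):

#include <stdint.h>
#include <string.h>

uint32_t x;

void before (void) { memset (&x, 0xAB, sizeof x); }
/* The fold amounts to one store of the replicated byte:
   0xAB -> 0xABABABAB for a 4-byte object.  */
void after (void)  { x = 0xABABABABu; }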
/* Fold function call to builtin bzero. Return
NULL_TREE if no simplification can be made. */
static tree
fold_builtin_bzero (location_t loc, tree dest, tree size, bool ignore)
{
if (! validate_arg (dest, POINTER_TYPE)
|| ! validate_arg (size, INTEGER_TYPE))
return NULL_TREE;
if (!ignore)
return NULL_TREE;
/* New argument list transforming bzero(ptr x, int y) to
memset(ptr x, int 0, size_t y). This is done this way
so that if it isn't expanded inline, we fallback to
calling bzero instead of memset. */
return fold_builtin_memset (loc, dest, integer_zero_node,
fold_convert_loc (loc, size_type_node, size),
void_type_node, ignore);
}
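In plain C terms the rewrite performed by fold_builtin_bzero is simply the following (illustration only; the fold is done when the call's value is ignored):

#include <string.h>
#include <strings.h>

void before (void *p, size_t n) { bzero (p, n); }
void after  (void *p, size_t n) { memset (p, 0, n); }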
/* Fold function call to builtin mem{{,p}cpy,move}. Return
NULL_TREE if no simplification can be made.
If ENDP is 0, return DEST (like memcpy).
If ENDP is 1, return DEST+LEN (like mempcpy).
If ENDP is 2, return DEST+LEN-1 (like stpcpy).
If ENDP is 3, return DEST, additionally *SRC and *DEST may overlap
(memmove). */
static tree
fold_builtin_memory_op (location_t loc, tree dest, tree src,
tree len, tree type, bool ignore, int endp)
{
tree destvar, srcvar, expr;
if (! validate_arg (dest, POINTER_TYPE)
|| ! validate_arg (src, POINTER_TYPE)
|| ! validate_arg (len, INTEGER_TYPE))
return NULL_TREE;
/* If the LEN parameter is zero, return DEST. */
if (integer_zerop (len))
return omit_one_operand_loc (loc, type, dest, src);
/* If SRC and DEST are the same (and not volatile), return
DEST{,+LEN,+LEN-1}. */
if (operand_equal_p (src, dest, 0))
expr = len;
else
{
tree srctype, desttype;
unsigned int src_align, dest_align;
tree off0;
/* Build accesses at offset zero with a ref-all character type. */
off0 = build_int_cst (build_pointer_type_for_mode (char_type_node,
ptr_mode, true), 0);
/* If we can perform the copy efficiently with first doing all loads
and then all stores inline it that way. Currently efficiently
means that we can load all the memory into a single integer
register which is what MOVE_MAX gives us. */
src_align = get_pointer_alignment (src);
dest_align = get_pointer_alignment (dest);
if (tree_fits_uhwi_p (len)
&& compare_tree_int (len, MOVE_MAX) <= 0
/* ??? Don't transform copies from strings with known length this
confuses the tree-ssa-strlen.c. This doesn't handle
the case in gcc.dg/strlenopt-8.c which is XFAILed for that
reason. */
&& !c_strlen (src, 2))
{
unsigned ilen = tree_to_uhwi (len);
if (exact_log2 (ilen) != -1)
{
tree type = lang_hooks.types.type_for_size (ilen * 8, 1);
if (type
&& TYPE_MODE (type) != BLKmode
&& (GET_MODE_SIZE (TYPE_MODE (type)) * BITS_PER_UNIT
== ilen * 8)
/* If the pointers are not aligned we must be able to
emit an unaligned load. */
&& ((src_align >= GET_MODE_ALIGNMENT (TYPE_MODE (type))
&& dest_align >= GET_MODE_ALIGNMENT (TYPE_MODE (type)))
|| !SLOW_UNALIGNED_ACCESS (TYPE_MODE (type),
MIN (src_align, dest_align))))
{
tree srctype = type;
tree desttype = type;
if (src_align < GET_MODE_ALIGNMENT (TYPE_MODE (type)))
srctype = build_aligned_type (type, src_align);
if (dest_align < GET_MODE_ALIGNMENT (TYPE_MODE (type)))
desttype = build_aligned_type (type, dest_align);
if (!ignore)
dest = builtin_save_expr (dest);
expr = build2 (MODIFY_EXPR, type,
fold_build2 (MEM_REF, desttype, dest, off0),
fold_build2 (MEM_REF, srctype, src, off0));
goto done;
}
}
}
if (endp == 3)
{
/* Both DEST and SRC must be pointer types.
??? This is what old code did. Is the testing for pointer types
really mandatory?
If either SRC is readonly or length is 1, we can use memcpy. */
if (!dest_align || !src_align)
return NULL_TREE;
if (readonly_data_expr (src)
|| (tree_fits_uhwi_p (len)
&& (MIN (src_align, dest_align) / BITS_PER_UNIT
>= tree_to_uhwi (len))))
{
tree fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
if (!fn)
return NULL_TREE;
return build_call_expr_loc (loc, fn, 3, dest, src, len);
}
/* If *src and *dest can't overlap, optimize into memcpy as well. */
if (TREE_CODE (src) == ADDR_EXPR
&& TREE_CODE (dest) == ADDR_EXPR)
{
tree src_base, dest_base, fn;
HOST_WIDE_INT src_offset = 0, dest_offset = 0;
HOST_WIDE_INT size = -1;
HOST_WIDE_INT maxsize = -1;
srcvar = TREE_OPERAND (src, 0);
src_base = get_ref_base_and_extent (srcvar, &src_offset,
&size, &maxsize);
destvar = TREE_OPERAND (dest, 0);
dest_base = get_ref_base_and_extent (destvar, &dest_offset,
&size, &maxsize);
if (tree_fits_uhwi_p (len))
maxsize = tree_to_uhwi (len);
else
maxsize = -1;
src_offset /= BITS_PER_UNIT;
dest_offset /= BITS_PER_UNIT;
if (SSA_VAR_P (src_base)
&& SSA_VAR_P (dest_base))
{
if (operand_equal_p (src_base, dest_base, 0)
&& ranges_overlap_p (src_offset, maxsize,
dest_offset, maxsize))
return NULL_TREE;
}
else if (TREE_CODE (src_base) == MEM_REF
&& TREE_CODE (dest_base) == MEM_REF)
{
if (! operand_equal_p (TREE_OPERAND (src_base, 0),
TREE_OPERAND (dest_base, 0), 0))
return NULL_TREE;
offset_int off = mem_ref_offset (src_base) + src_offset;
if (!wi::fits_shwi_p (off))
return NULL_TREE;
src_offset = off.to_shwi ();
off = mem_ref_offset (dest_base) + dest_offset;
if (!wi::fits_shwi_p (off))
return NULL_TREE;
dest_offset = off.to_shwi ();
if (ranges_overlap_p (src_offset, maxsize,
dest_offset, maxsize))
return NULL_TREE;
}
else
return NULL_TREE;
fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
if (!fn)
return NULL_TREE;
return build_call_expr_loc (loc, fn, 3, dest, src, len);
}
/* If the destination and source do not alias optimize into
memcpy as well. */
if ((is_gimple_min_invariant (dest)
|| TREE_CODE (dest) == SSA_NAME)
&& (is_gimple_min_invariant (src)
|| TREE_CODE (src) == SSA_NAME))
{
ao_ref destr, srcr;
ao_ref_init_from_ptr_and_size (&destr, dest, len);
ao_ref_init_from_ptr_and_size (&srcr, src, len);
if (!refs_may_alias_p_1 (&destr, &srcr, false))
{
tree fn;
fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
if (!fn)
return NULL_TREE;
return build_call_expr_loc (loc, fn, 3, dest, src, len);
}
}
return NULL_TREE;
}
if (!tree_fits_shwi_p (len))
return NULL_TREE;
/* FIXME:
This logic loses for arguments like (type *)malloc (sizeof (type)),
since we strip the casts of up to VOID return value from malloc.
Perhaps we ought to inherit type from non-VOID argument here? */
STRIP_NOPS (src);
STRIP_NOPS (dest);
if (!POINTER_TYPE_P (TREE_TYPE (src))
|| !POINTER_TYPE_P (TREE_TYPE (dest)))
return NULL_TREE;
/* In the following try to find a type that is most natural to be
used for the memcpy source and destination and that allows
the most optimization when memcpy is turned into a plain assignment
using that type. In theory we could always use a char[len] type
but that only gains us that the destination and source possibly
no longer will have their address taken. */
/* As we fold (void *)(p + CST) to (void *)p + CST undo this here. */
if (TREE_CODE (src) == POINTER_PLUS_EXPR)
{
tree tem = TREE_OPERAND (src, 0);
STRIP_NOPS (tem);
if (tem != TREE_OPERAND (src, 0))
src = build1 (NOP_EXPR, TREE_TYPE (tem), src);
}
if (TREE_CODE (dest) == POINTER_PLUS_EXPR)
{
tree tem = TREE_OPERAND (dest, 0);
STRIP_NOPS (tem);
if (tem != TREE_OPERAND (dest, 0))
dest = build1 (NOP_EXPR, TREE_TYPE (tem), dest);
}
srctype = TREE_TYPE (TREE_TYPE (src));
if (TREE_CODE (srctype) == ARRAY_TYPE
&& !tree_int_cst_equal (TYPE_SIZE_UNIT (srctype), len))
{
srctype = TREE_TYPE (srctype);
STRIP_NOPS (src);
src = build1 (NOP_EXPR, build_pointer_type (srctype), src);
}
desttype = TREE_TYPE (TREE_TYPE (dest));
if (TREE_CODE (desttype) == ARRAY_TYPE
&& !tree_int_cst_equal (TYPE_SIZE_UNIT (desttype), len))
{
desttype = TREE_TYPE (desttype);
STRIP_NOPS (dest);
dest = build1 (NOP_EXPR, build_pointer_type (desttype), dest);
}
if (TREE_ADDRESSABLE (srctype)
|| TREE_ADDRESSABLE (desttype))
return NULL_TREE;
/* Make sure we are not copying using a floating-point mode or
a type whose size possibly does not match its precision. */
if (FLOAT_MODE_P (TYPE_MODE (desttype))
|| TREE_CODE (desttype) == BOOLEAN_TYPE
|| TREE_CODE (desttype) == ENUMERAL_TYPE)
desttype = bitwise_type_for_mode (TYPE_MODE (desttype));
if (FLOAT_MODE_P (TYPE_MODE (srctype))
|| TREE_CODE (srctype) == BOOLEAN_TYPE
|| TREE_CODE (srctype) == ENUMERAL_TYPE)
srctype = bitwise_type_for_mode (TYPE_MODE (srctype));
if (!srctype)
srctype = desttype;
if (!desttype)
desttype = srctype;
if (!srctype)
return NULL_TREE;
src_align = get_pointer_alignment (src);
dest_align = get_pointer_alignment (dest);
if (dest_align < TYPE_ALIGN (desttype)
|| src_align < TYPE_ALIGN (srctype))
return NULL_TREE;
if (!ignore)
dest = builtin_save_expr (dest);
destvar = dest;
STRIP_NOPS (destvar);
if (TREE_CODE (destvar) == ADDR_EXPR
&& var_decl_component_p (TREE_OPERAND (destvar, 0))
&& tree_int_cst_equal (TYPE_SIZE_UNIT (desttype), len))
destvar = fold_build2 (MEM_REF, desttype, destvar, off0);
else
destvar = NULL_TREE;
srcvar = src;
STRIP_NOPS (srcvar);
if (TREE_CODE (srcvar) == ADDR_EXPR
&& var_decl_component_p (TREE_OPERAND (srcvar, 0))
&& tree_int_cst_equal (TYPE_SIZE_UNIT (srctype), len))
{
if (!destvar
|| src_align >= TYPE_ALIGN (desttype))
srcvar = fold_build2 (MEM_REF, destvar ? desttype : srctype,
srcvar, off0);
else if (!STRICT_ALIGNMENT)
{
srctype = build_aligned_type (TYPE_MAIN_VARIANT (desttype),
src_align);
srcvar = fold_build2 (MEM_REF, srctype, srcvar, off0);
}
else
srcvar = NULL_TREE;
}
else
srcvar = NULL_TREE;
if (srcvar == NULL_TREE && destvar == NULL_TREE)
return NULL_TREE;
if (srcvar == NULL_TREE)
{
STRIP_NOPS (src);
if (src_align >= TYPE_ALIGN (desttype))
srcvar = fold_build2 (MEM_REF, desttype, src, off0);
else
{
if (STRICT_ALIGNMENT)
return NULL_TREE;
srctype = build_aligned_type (TYPE_MAIN_VARIANT (desttype),
src_align);
srcvar = fold_build2 (MEM_REF, srctype, src, off0);
}
}
else if (destvar == NULL_TREE)
{
STRIP_NOPS (dest);
if (dest_align >= TYPE_ALIGN (srctype))
destvar = fold_build2 (MEM_REF, srctype, dest, off0);
else
{
if (STRICT_ALIGNMENT)
return NULL_TREE;
desttype = build_aligned_type (TYPE_MAIN_VARIANT (srctype),
dest_align);
destvar = fold_build2 (MEM_REF, desttype, dest, off0);
}
}
expr = build2 (MODIFY_EXPR, TREE_TYPE (destvar), destvar, srcvar);
}
done:
if (ignore)
return expr;
if (endp == 0 || endp == 3)
return omit_one_operand_loc (loc, type, dest, expr);
if (expr == len)
expr = NULL_TREE;
if (endp == 2)
len = fold_build2_loc (loc, MINUS_EXPR, TREE_TYPE (len), len,
ssize_int (1));
dest = fold_build_pointer_plus_loc (loc, dest, len);
dest = fold_convert_loc (loc, type, dest);
if (expr)
dest = omit_one_operand_loc (loc, type, dest, expr);
return dest;
}
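The ENDP parameter selects which pointer the caller expects back; in terms of the library functions being folded, the documented return values are as follows (illustration only; mempcpy and stpcpy shown under the usual glibc/POSIX assumptions):

#define _GNU_SOURCE
#include <string.h>

void *r0, *r1, *r2, *r3;

void endp_semantics (char *d, const char *s, size_t n)
{
  r0 = memcpy (d, s, n);    /* ENDP == 0: returns DEST */
  r1 = mempcpy (d, s, n);   /* ENDP == 1: returns DEST + LEN */
  r2 = stpcpy (d, s);       /* ENDP == 2: returns DEST + LEN - 1, i.e. a pointer
                               to the terminating NUL */
  r3 = memmove (d, s, n);   /* ENDP == 3: returns DEST, *SRC and *DEST may overlap */
}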
/* Fold function call to builtin strcpy with arguments DEST and SRC.
If LEN is not NULL, it represents the length of the string to be
copied. Return NULL_TREE if no simplification can be made. */
tree
fold_builtin_strcpy (location_t loc, tree fndecl, tree dest, tree src, tree len)
{
tree fn;
if (!validate_arg (dest, POINTER_TYPE)
|| !validate_arg (src, POINTER_TYPE))
return NULL_TREE;
/* If SRC and DEST are the same (and not volatile), return DEST. */
if (operand_equal_p (src, dest, 0))
return fold_convert_loc (loc, TREE_TYPE (TREE_TYPE (fndecl)), dest);
if (optimize_function_for_size_p (cfun))
return NULL_TREE;
fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
if (!fn)
return NULL_TREE;
if (!len)
{
len = c_strlen (src, 1);
if (! len || TREE_SIDE_EFFECTS (len))
return NULL_TREE;
}
len = fold_convert_loc (loc, size_type_node, len);
len = size_binop_loc (loc, PLUS_EXPR, len, build_int_cst (size_type_node, 1));
return fold_convert_loc (loc, TREE_TYPE (TREE_TYPE (fndecl)),
build_call_expr_loc (loc, fn, 3, dest, src, len));
}
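When c_strlen can compute the source length, the transformation above reduces to a fixed-size copy that includes the terminating NUL (illustration only):

#include <string.h>

void before (char *d) { strcpy (d, "hello"); }
/* strlen ("hello") + 1 == 6 bytes, NUL included.  */
void after  (char *d) { memcpy (d, "hello", 6); }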
/* Fold function call to builtin stpcpy with arguments DEST and SRC.
Return NULL_TREE if no simplification can be made. */
@@ -9034,55 +8532,6 @@ fold_builtin_stpcpy (location_t loc, tree fndecl, tree dest, tree src)
return dest;
}
/* Fold function call to builtin strncpy with arguments DEST, SRC, and LEN.
If SLEN is not NULL, it represents the length of the source string.
Return NULL_TREE if no simplification can be made. */
tree
fold_builtin_strncpy (location_t loc, tree fndecl, tree dest,
tree src, tree len, tree slen)
{
tree fn;
if (!validate_arg (dest, POINTER_TYPE)
|| !validate_arg (src, POINTER_TYPE)
|| !validate_arg (len, INTEGER_TYPE))
return NULL_TREE;
/* If the LEN parameter is zero, return DEST. */
if (integer_zerop (len))
return omit_one_operand_loc (loc, TREE_TYPE (TREE_TYPE (fndecl)), dest, src);
/* We can't compare slen with len as constants below if len is not a
constant. */
if (len == 0 || TREE_CODE (len) != INTEGER_CST)
return NULL_TREE;
if (!slen)
slen = c_strlen (src, 1);
/* Now, we must be passed a constant src ptr parameter. */
if (slen == 0 || TREE_CODE (slen) != INTEGER_CST)
return NULL_TREE;
slen = size_binop_loc (loc, PLUS_EXPR, slen, ssize_int (1));
/* We do not support simplification of this case, though we do
support it when expanding trees into RTL. */
/* FIXME: generate a call to __builtin_memset. */
if (tree_int_cst_lt (slen, len))
return NULL_TREE;
/* OK transform into builtin memcpy. */
fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
if (!fn)
return NULL_TREE;
len = fold_convert_loc (loc, size_type_node, len);
return fold_convert_loc (loc, TREE_TYPE (TREE_TYPE (fndecl)),
build_call_expr_loc (loc, fn, 3, dest, src, len));
}
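The guard tree_int_cst_lt (slen, len) above means the fold only fires when the source string, including its NUL, covers at least LEN bytes, so no zero padding is lost (illustration only):

#include <string.h>

void before (char *d) { strncpy (d, "hi", 3); }
/* strlen ("hi") + 1 == 3 >= 3: exactly three bytes are copied,
   so the call is equivalent to a plain memcpy.  */
void after  (char *d) { memcpy (d, "hi", 3); }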
/* Fold function call to builtin memchr. ARG1, ARG2 and LEN are the
arguments to the call, and TYPE is its return type.
Return NULL_TREE if no simplification can be made. */
@@ -10707,21 +10156,9 @@ fold_builtin_2 (location_t loc, tree fndecl, tree arg0, tree arg1, bool ignore)
CASE_FLT_FN (BUILT_IN_MODF):
return fold_builtin_modf (loc, arg0, arg1, type);
case BUILT_IN_BZERO:
return fold_builtin_bzero (loc, arg0, arg1, ignore);
case BUILT_IN_FPUTS:
return fold_builtin_fputs (loc, arg0, arg1, ignore, false, NULL_TREE);
case BUILT_IN_FPUTS_UNLOCKED:
return fold_builtin_fputs (loc, arg0, arg1, ignore, true, NULL_TREE);
case BUILT_IN_STRSTR:
return fold_builtin_strstr (loc, arg0, arg1, type);
case BUILT_IN_STRCAT:
return fold_builtin_strcat (loc, arg0, arg1, NULL_TREE);
case BUILT_IN_STRSPN:
return fold_builtin_strspn (loc, arg0, arg1);
@@ -10736,9 +10173,6 @@ fold_builtin_2 (location_t loc, tree fndecl, tree arg0, tree arg1, bool ignore)
case BUILT_IN_RINDEX:
return fold_builtin_strrchr (loc, arg0, arg1, type);
case BUILT_IN_STRCPY:
return fold_builtin_strcpy (loc, fndecl, arg0, arg1, NULL_TREE);
case BUILT_IN_STPCPY:
if (ignore)
{
@@ -10866,31 +10300,9 @@ fold_builtin_3 (location_t loc, tree fndecl,
return do_mpfr_remquo (arg0, arg1, arg2);
break;
case BUILT_IN_MEMSET:
return fold_builtin_memset (loc, arg0, arg1, arg2, type, ignore);
case BUILT_IN_BCOPY:
return fold_builtin_memory_op (loc, arg1, arg0, arg2,
void_type_node, true, /*endp=*/3);
case BUILT_IN_MEMCPY:
return fold_builtin_memory_op (loc, arg0, arg1, arg2,
type, ignore, /*endp=*/0);
case BUILT_IN_MEMPCPY:
return fold_builtin_memory_op (loc, arg0, arg1, arg2,
type, ignore, /*endp=*/1);
case BUILT_IN_MEMMOVE:
return fold_builtin_memory_op (loc, arg0, arg1, arg2,
type, ignore, /*endp=*/3);
case BUILT_IN_STRNCAT:
return fold_builtin_strncat (loc, arg0, arg1, arg2);
case BUILT_IN_STRNCPY:
return fold_builtin_strncpy (loc, fndecl, arg0, arg1, arg2, NULL_TREE);
case BUILT_IN_STRNCMP:
return fold_builtin_strncmp (loc, arg0, arg1, arg2);
@@ -10907,11 +10319,6 @@ fold_builtin_3 (location_t loc, tree fndecl,
case BUILT_IN_SNPRINTF:
return fold_builtin_snprintf (loc, arg0, arg1, arg2, NULL_TREE, ignore);
case BUILT_IN_STRCPY_CHK:
case BUILT_IN_STPCPY_CHK:
return fold_builtin_stxcpy_chk (loc, fndecl, arg0, arg1, arg2, NULL_TREE,
ignore, fcode);
case BUILT_IN_STRCAT_CHK:
return fold_builtin_strcat_chk (loc, fndecl, arg0, arg1, arg2);
@@ -10961,19 +10368,6 @@ fold_builtin_4 (location_t loc, tree fndecl,
switch (fcode)
{
case BUILT_IN_MEMCPY_CHK:
case BUILT_IN_MEMPCPY_CHK:
case BUILT_IN_MEMMOVE_CHK:
case BUILT_IN_MEMSET_CHK:
return fold_builtin_memory_chk (loc, fndecl, arg0, arg1, arg2, arg3,
NULL_TREE, ignore,
DECL_FUNCTION_CODE (fndecl));
case BUILT_IN_STRNCPY_CHK:
case BUILT_IN_STPNCPY_CHK:
return fold_builtin_stxncpy_chk (loc, arg0, arg1, arg2, arg3, NULL_TREE,
ignore, fcode);
case BUILT_IN_STRNCAT_CHK:
return fold_builtin_strncat_chk (loc, fndecl, arg0, arg1, arg2, arg3);
@@ -11070,25 +10464,6 @@ rewrite_call_expr_valist (location_t loc, int oldnargs, tree *args,
return build_call_expr_loc_array (loc, fndecl, nargs, buffer);
}
/* Construct a new CALL_EXPR to FNDECL using the tail of the argument
list ARGS along with N new arguments specified as the "..."
parameters. SKIP is the number of arguments in ARGS to be omitted.
OLDNARGS is the number of elements in ARGS. */
static tree
rewrite_call_expr_array (location_t loc, int oldnargs, tree *args,
int skip, tree fndecl, int n, ...)
{
va_list ap;
tree t;
va_start (ap, n);
t = rewrite_call_expr_valist (loc, oldnargs, args, skip, fndecl, n, ap);
va_end (ap);
return t;
}
/* Return true if FNDECL shouldn't be folded right now.
If a built-in function has an inline attribute always_inline
wrapper, defer folding it after always_inline functions have
@@ -11321,7 +10696,7 @@ default_expand_builtin (tree exp ATTRIBUTE_UNUSED,
/* Returns true if EXP represents data that would potentially reside
in a readonly section. */
static bool
bool
readonly_data_expr (tree exp)
{
STRIP_NOPS (exp);
@@ -11594,77 +10969,6 @@ fold_builtin_strpbrk (location_t loc, tree s1, tree s2, tree type)
}
}
/* Simplify a call to the strcat builtin. DST and SRC are the arguments
to the call.
Return NULL_TREE if no simplification was possible, otherwise return the
simplified form of the call as a tree.
The simplified form may be a constant or other expression which
computes the same value, but in a more efficient manner (including
calls to other builtin functions).
The call may contain arguments which need to be evaluated, but
which are not useful to determine the result of the call. In
this case we return a chain of COMPOUND_EXPRs. The LHS of each
COMPOUND_EXPR will be an argument which must be evaluated.
COMPOUND_EXPRs are chained through their RHS. The RHS of the last
COMPOUND_EXPR in the chain will contain the tree for the simplified
form of the builtin function call. */
tree
fold_builtin_strcat (location_t loc ATTRIBUTE_UNUSED, tree dst, tree src,
tree len)
{
if (!validate_arg (dst, POINTER_TYPE)
|| !validate_arg (src, POINTER_TYPE))
return NULL_TREE;
else
{
const char *p = c_getstr (src);
/* If the string length is zero, return the dst parameter. */
if (p && *p == '\0')
return dst;
if (optimize_insn_for_speed_p ())
{
/* See if we can store by pieces into (dst + strlen(dst)). */
tree newdst, call;
tree strlen_fn = builtin_decl_implicit (BUILT_IN_STRLEN);
tree memcpy_fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
if (!strlen_fn || !memcpy_fn)
return NULL_TREE;
/* If the length of the source string isn't computable don't
split strcat into strlen and memcpy. */
if (! len)
len = c_strlen (src, 1);
if (! len || TREE_SIDE_EFFECTS (len))
return NULL_TREE;
/* Stabilize the argument list. */
dst = builtin_save_expr (dst);
/* Create strlen (dst). */
newdst = build_call_expr_loc (loc, strlen_fn, 1, dst);
/* Create (dst p+ strlen (dst)). */
newdst = fold_build_pointer_plus_loc (loc, dst, newdst);
newdst = builtin_save_expr (newdst);
len = fold_convert_loc (loc, size_type_node, len);
len = size_binop_loc (loc, PLUS_EXPR, len,
build_int_cst (size_type_node, 1));
call = build_call_expr_loc (loc, memcpy_fn, 3, newdst, src, len);
return build2 (COMPOUND_EXPR, TREE_TYPE (dst), call, dst);
}
return NULL_TREE;
}
}
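The store-by-pieces path above splits strcat into a strlen of the destination followed by a memcpy that appends the source including its NUL (illustration only, applied when optimizing for speed and the source length is known):

#include <string.h>

void before (char *d) { strcat (d, "!"); }
/* Append strlen ("!") + 1 == 2 bytes at the current end of D.  */
void after  (char *d) { memcpy (d + strlen (d), "!", 2); }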
/* Simplify a call to the strncat builtin. DST, SRC, and LEN are the
arguments to the call.
@@ -11822,84 +11126,6 @@ fold_builtin_strcspn (location_t loc, tree s1, tree s2)
}
}
/* Fold a call to the fputs builtin. ARG0 and ARG1 are the arguments
to the call. IGNORE is true if the value returned
by the builtin will be ignored. UNLOCKED is true if this is
actually a call to fputs_unlocked. If LEN is non-NULL, it represents
the known length of the string. Return NULL_TREE if no simplification
was possible. */
tree
fold_builtin_fputs (location_t loc, tree arg0, tree arg1,
bool ignore, bool unlocked, tree len)
{
/* If we're using an unlocked function, assume the other unlocked
functions exist explicitly. */
tree const fn_fputc = (unlocked
? builtin_decl_explicit (BUILT_IN_FPUTC_UNLOCKED)
: builtin_decl_implicit (BUILT_IN_FPUTC));
tree const fn_fwrite = (unlocked
? builtin_decl_explicit (BUILT_IN_FWRITE_UNLOCKED)
: builtin_decl_implicit (BUILT_IN_FWRITE));
/* If the return value is used, don't do the transformation. */
if (!ignore)
return NULL_TREE;
/* Verify the arguments in the original call. */
if (!validate_arg (arg0, POINTER_TYPE)
|| !validate_arg (arg1, POINTER_TYPE))
return NULL_TREE;
if (! len)
len = c_strlen (arg0, 0);
/* Get the length of the string passed to fputs. If the length
can't be determined, punt. */
if (!len
|| TREE_CODE (len) != INTEGER_CST)
return NULL_TREE;
switch (compare_tree_int (len, 1))
{
case -1: /* length is 0, delete the call entirely. */
return omit_one_operand_loc (loc, integer_type_node,
integer_zero_node, arg1);
case 0: /* length is 1, call fputc. */
{
const char *p = c_getstr (arg0);
if (p != NULL)
{
if (fn_fputc)
return build_call_expr_loc (loc, fn_fputc, 2,
build_int_cst
(integer_type_node, p[0]), arg1);
else
return NULL_TREE;
}
}
/* FALLTHROUGH */
case 1: /* length is greater than 1, call fwrite. */
{
/* If optimizing for size keep fputs. */
if (optimize_function_for_size_p (cfun))
return NULL_TREE;
/* New argument list transforming fputs(string, stream) to
fwrite(string, 1, len, stream). */
if (fn_fwrite)
return build_call_expr_loc (loc, fn_fwrite, 4, arg0,
size_one_node, len, arg1);
else
return NULL_TREE;
}
default:
gcc_unreachable ();
}
return NULL_TREE;
}
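The switch on compare_tree_int (len, 1) above corresponds to three user-visible rewrites, applied only when the return value is ignored (illustration only):

#include <stdio.h>

void example (FILE *f)
{
  fputs ("", f);     /* length 0: the call can be removed entirely */
  fputs ("x", f);    /* length 1: can become fputc ('x', f) */
  fputs ("abc", f);  /* length > 1: can become fwrite ("abc", 1, 3, f)
                        unless optimizing for size */
}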
/* Fold the next_arg or va_start call EXP. Returns true if there was an error
produced. False otherwise. This is done so that we don't output the error
or warning twice or three times. */
@@ -12571,240 +11797,6 @@ fold_builtin_object_size (tree ptr, tree ost)
return NULL_TREE;
}
/* Fold a call to the __mem{cpy,pcpy,move,set}_chk builtin.
DEST, SRC, LEN, and SIZE are the arguments to the call.
IGNORE is true, if return value can be ignored. FCODE is the BUILT_IN_*
code of the builtin. If MAXLEN is not NULL, it is maximum length
passed as third argument. */
tree
fold_builtin_memory_chk (location_t loc, tree fndecl,
tree dest, tree src, tree len, tree size,
tree maxlen, bool ignore,
enum built_in_function fcode)
{
tree fn;
if (!validate_arg (dest, POINTER_TYPE)
|| !validate_arg (src,
(fcode == BUILT_IN_MEMSET_CHK
? INTEGER_TYPE : POINTER_TYPE))
|| !validate_arg (len, INTEGER_TYPE)
|| !validate_arg (size, INTEGER_TYPE))
return NULL_TREE;
/* If SRC and DEST are the same (and not volatile), return DEST
(resp. DEST+LEN for __mempcpy_chk). */
if (fcode != BUILT_IN_MEMSET_CHK && operand_equal_p (src, dest, 0))
{
if (fcode != BUILT_IN_MEMPCPY_CHK)
return omit_one_operand_loc (loc, TREE_TYPE (TREE_TYPE (fndecl)),
dest, len);
else
{
tree temp = fold_build_pointer_plus_loc (loc, dest, len);
return fold_convert_loc (loc, TREE_TYPE (TREE_TYPE (fndecl)), temp);
}
}
if (! tree_fits_uhwi_p (size))
return NULL_TREE;
if (! integer_all_onesp (size))
{
if (! tree_fits_uhwi_p (len))
{
/* If LEN is not constant, try MAXLEN too.
For MAXLEN only allow optimizing into non-_ocs function
if SIZE is >= MAXLEN, never convert to __ocs_fail (). */
if (maxlen == NULL_TREE || ! tree_fits_uhwi_p (maxlen))
{
if (fcode == BUILT_IN_MEMPCPY_CHK && ignore)
{
/* (void) __mempcpy_chk () can be optimized into
(void) __memcpy_chk (). */
fn = builtin_decl_explicit (BUILT_IN_MEMCPY_CHK);
if (!fn)
return NULL_TREE;
return build_call_expr_loc (loc, fn, 4, dest, src, len, size);
}
return NULL_TREE;
}
}
else
maxlen = len;
if (tree_int_cst_lt (size, maxlen))
return NULL_TREE;
}
fn = NULL_TREE;
/* If __builtin_mem{cpy,pcpy,move,set}_chk is used, assume
mem{cpy,pcpy,move,set} is available. */
switch (fcode)
{
case BUILT_IN_MEMCPY_CHK:
fn = builtin_decl_explicit (BUILT_IN_MEMCPY);
break;
case BUILT_IN_MEMPCPY_CHK:
fn = builtin_decl_explicit (BUILT_IN_MEMPCPY);
break;
case BUILT_IN_MEMMOVE_CHK:
fn = builtin_decl_explicit (BUILT_IN_MEMMOVE);
break;
case BUILT_IN_MEMSET_CHK:
fn = builtin_decl_explicit (BUILT_IN_MEMSET);
break;
default:
break;
}
if (!fn)
return NULL_TREE;
return build_call_expr_loc (loc, fn, 3, dest, src, len);
}
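When both the object size and the copy length are known and the length fits, the _chk wrapper can never trigger, so it degrades to the plain call. A sketch using the GCC checking builtins (illustration only):

#include <string.h>

char buf[32];

void before (const char *s)
{
  __builtin___memcpy_chk (buf, s, 16, __builtin_object_size (buf, 0));
}
/* Object size 32 is known and the constant length 16 fits,
   so the check can never fail.  */
void after (const char *s)
{
  memcpy (buf, s, 16);
}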
/* Fold a call to the __st[rp]cpy_chk builtin.
DEST, SRC, and SIZE are the arguments to the call.
IGNORE is true if return value can be ignored. FCODE is the BUILT_IN_*
code of the builtin. If MAXLEN is not NULL, it is maximum length of
strings passed as second argument. */
tree
fold_builtin_stxcpy_chk (location_t loc, tree fndecl, tree dest,
tree src, tree size,
tree maxlen, bool ignore,
enum built_in_function fcode)
{
tree len, fn;
if (!validate_arg (dest, POINTER_TYPE)
|| !validate_arg (src, POINTER_TYPE)
|| !validate_arg (size, INTEGER_TYPE))
return NULL_TREE;
/* If SRC and DEST are the same (and not volatile), return DEST. */
if (fcode == BUILT_IN_STRCPY_CHK && operand_equal_p (src, dest, 0))
return fold_convert_loc (loc, TREE_TYPE (TREE_TYPE (fndecl)), dest);
if (! tree_fits_uhwi_p (size))
return NULL_TREE;
if (! integer_all_onesp (size))
{
len = c_strlen (src, 1);
if (! len || ! tree_fits_uhwi_p (len))
{
/* If LEN is not constant, try MAXLEN too.
For MAXLEN only allow optimizing into non-_ocs function
if SIZE is >= MAXLEN, never convert to __ocs_fail (). */
if (maxlen == NULL_TREE || ! tree_fits_uhwi_p (maxlen))
{
if (fcode == BUILT_IN_STPCPY_CHK)
{
if (! ignore)
return NULL_TREE;
/* If return value of __stpcpy_chk is ignored,
optimize into __strcpy_chk. */
fn = builtin_decl_explicit (BUILT_IN_STRCPY_CHK);
if (!fn)
return NULL_TREE;
return build_call_expr_loc (loc, fn, 3, dest, src, size);
}
if (! len || TREE_SIDE_EFFECTS (len))
return NULL_TREE;
/* If c_strlen returned something, but not a constant,
transform __strcpy_chk into __memcpy_chk. */
fn = builtin_decl_explicit (BUILT_IN_MEMCPY_CHK);
if (!fn)
return NULL_TREE;
len = fold_convert_loc (loc, size_type_node, len);
len = size_binop_loc (loc, PLUS_EXPR, len,
build_int_cst (size_type_node, 1));
return fold_convert_loc (loc, TREE_TYPE (TREE_TYPE (fndecl)),
build_call_expr_loc (loc, fn, 4,
dest, src, len, size));
}
}
else
maxlen = len;
if (! tree_int_cst_lt (maxlen, size))
return NULL_TREE;
}
/* If __builtin_st{r,p}cpy_chk is used, assume st{r,p}cpy is available. */
fn = builtin_decl_explicit (fcode == BUILT_IN_STPCPY_CHK
? BUILT_IN_STPCPY : BUILT_IN_STRCPY);
if (!fn)
return NULL_TREE;
return build_call_expr_loc (loc, fn, 2, dest, src);
}
/* Fold a call to the __st{r,p}ncpy_chk builtin. DEST, SRC, LEN, and SIZE
are the arguments to the call. If MAXLEN is not NULL, it is maximum
length passed as third argument. IGNORE is true if return value can be
ignored. FCODE is the BUILT_IN_* code of the builtin. */
tree
fold_builtin_stxncpy_chk (location_t loc, tree dest, tree src,
tree len, tree size, tree maxlen, bool ignore,
enum built_in_function fcode)
{
tree fn;
if (!validate_arg (dest, POINTER_TYPE)
|| !validate_arg (src, POINTER_TYPE)
|| !validate_arg (len, INTEGER_TYPE)
|| !validate_arg (size, INTEGER_TYPE))
return NULL_TREE;
if (fcode == BUILT_IN_STPNCPY_CHK && ignore)
{
/* If return value of __stpncpy_chk is ignored,
optimize into __strncpy_chk. */
fn = builtin_decl_explicit (BUILT_IN_STRNCPY_CHK);
if (fn)
return build_call_expr_loc (loc, fn, 4, dest, src, len, size);
}
if (! tree_fits_uhwi_p (size))
return NULL_TREE;
if (! integer_all_onesp (size))
{
if (! tree_fits_uhwi_p (len))
{
/* If LEN is not constant, try MAXLEN too.
For MAXLEN only allow optimizing into non-_ocs function
if SIZE is >= MAXLEN, never convert to __ocs_fail (). */
if (maxlen == NULL_TREE || ! tree_fits_uhwi_p (maxlen))
return NULL_TREE;
}
else
maxlen = len;
if (tree_int_cst_lt (size, maxlen))
return NULL_TREE;
}
/* If __builtin_st{r,p}ncpy_chk is used, assume st{r,p}ncpy is available. */
fn = builtin_decl_explicit (fcode == BUILT_IN_STPNCPY_CHK
? BUILT_IN_STPNCPY : BUILT_IN_STRNCPY);
if (!fn)
return NULL_TREE;
return build_call_expr_loc (loc, fn, 3, dest, src, len);
}
/* Fold a call to the __strcat_chk builtin FNDECL. DEST, SRC, and SIZE
are the arguments to the call. */
@@ -12888,201 +11880,6 @@ fold_builtin_strncat_chk (location_t loc, tree fndecl,
return build_call_expr_loc (loc, fn, 3, dest, src, len);
}
/* Fold a call EXP to __{,v}sprintf_chk having NARGS passed as ARGS.
Return NULL_TREE if a normal call should be emitted rather than
expanding the function inline. FCODE is either BUILT_IN_SPRINTF_CHK
or BUILT_IN_VSPRINTF_CHK. */
static tree
fold_builtin_sprintf_chk_1 (location_t loc, int nargs, tree *args,
enum built_in_function fcode)
{
tree dest, size, len, fn, fmt, flag;
const char *fmt_str;
/* Verify the required arguments in the original call. */
if (nargs < 4)
return NULL_TREE;
dest = args[0];
if (!validate_arg (dest, POINTER_TYPE))
return NULL_TREE;
flag = args[1];
if (!validate_arg (flag, INTEGER_TYPE))
return NULL_TREE;
size = args[2];
if (!validate_arg (size, INTEGER_TYPE))
return NULL_TREE;
fmt = args[3];
if (!validate_arg (fmt, POINTER_TYPE))
return NULL_TREE;
if (! tree_fits_uhwi_p (size))
return NULL_TREE;
len = NULL_TREE;
if (!init_target_chars ())
return NULL_TREE;
/* Check whether the format is a literal string constant. */
fmt_str = c_getstr (fmt);
if (fmt_str != NULL)
{
/* If the format doesn't contain % args or %%, we know the size. */
if (strchr (fmt_str, target_percent) == 0)
{
if (fcode != BUILT_IN_SPRINTF_CHK || nargs == 4)
len = build_int_cstu (size_type_node, strlen (fmt_str));
}
/* If the format is "%s" and first ... argument is a string literal,
we know the size too. */
else if (fcode == BUILT_IN_SPRINTF_CHK
&& strcmp (fmt_str, target_percent_s) == 0)
{
tree arg;
if (nargs == 5)
{
arg = args[4];
if (validate_arg (arg, POINTER_TYPE))
{
len = c_strlen (arg, 1);
if (! len || ! tree_fits_uhwi_p (len))
len = NULL_TREE;
}
}
}
}
if (! integer_all_onesp (size))
{
if (! len || ! tree_int_cst_lt (len, size))
return NULL_TREE;
}
/* Only convert __{,v}sprintf_chk to {,v}sprintf if flag is 0
or if format doesn't contain % chars or is "%s". */
if (! integer_zerop (flag))
{
if (fmt_str == NULL)
return NULL_TREE;
if (strchr (fmt_str, target_percent) != NULL
&& strcmp (fmt_str, target_percent_s))
return NULL_TREE;
}
/* If __builtin_{,v}sprintf_chk is used, assume {,v}sprintf is available. */
fn = builtin_decl_explicit (fcode == BUILT_IN_VSPRINTF_CHK
? BUILT_IN_VSPRINTF : BUILT_IN_SPRINTF);
if (!fn)
return NULL_TREE;
return rewrite_call_expr_array (loc, nargs, args, 4, fn, 2, dest, fmt);
}
/* Fold a call EXP to __{,v}sprintf_chk. Return NULL_TREE if
a normal call should be emitted rather than expanding the function
inline. FCODE is either BUILT_IN_SPRINTF_CHK or BUILT_IN_VSPRINTF_CHK. */
static tree
fold_builtin_sprintf_chk (location_t loc, tree exp,
enum built_in_function fcode)
{
return fold_builtin_sprintf_chk_1 (loc, call_expr_nargs (exp),
CALL_EXPR_ARGP (exp), fcode);
}
/* Fold a call EXP to {,v}snprintf having NARGS passed as ARGS. Return
NULL_TREE if a normal call should be emitted rather than expanding
the function inline. FCODE is either BUILT_IN_SNPRINTF_CHK or
BUILT_IN_VSNPRINTF_CHK. If MAXLEN is not NULL, it is maximum length
passed as second argument. */
static tree
fold_builtin_snprintf_chk_1 (location_t loc, int nargs, tree *args,
tree maxlen, enum built_in_function fcode)
{
tree dest, size, len, fn, fmt, flag;
const char *fmt_str;
/* Verify the required arguments in the original call. */
if (nargs < 5)
return NULL_TREE;
dest = args[0];
if (!validate_arg (dest, POINTER_TYPE))
return NULL_TREE;
len = args[1];
if (!validate_arg (len, INTEGER_TYPE))
return NULL_TREE;
flag = args[2];
if (!validate_arg (flag, INTEGER_TYPE))
return NULL_TREE;
size = args[3];
if (!validate_arg (size, INTEGER_TYPE))
return NULL_TREE;
fmt = args[4];
if (!validate_arg (fmt, POINTER_TYPE))
return NULL_TREE;
if (! tree_fits_uhwi_p (size))
return NULL_TREE;
if (! integer_all_onesp (size))
{
if (! tree_fits_uhwi_p (len))
{
/* If LEN is not constant, try MAXLEN too.
For MAXLEN only allow optimizing into non-_ocs function
if SIZE is >= MAXLEN, never convert to __ocs_fail (). */
if (maxlen == NULL_TREE || ! tree_fits_uhwi_p (maxlen))
return NULL_TREE;
}
else
maxlen = len;
if (tree_int_cst_lt (size, maxlen))
return NULL_TREE;
}
if (!init_target_chars ())
return NULL_TREE;
/* Only convert __{,v}snprintf_chk to {,v}snprintf if flag is 0
or if format doesn't contain % chars or is "%s". */
if (! integer_zerop (flag))
{
fmt_str = c_getstr (fmt);
if (fmt_str == NULL)
return NULL_TREE;
if (strchr (fmt_str, target_percent) != NULL
&& strcmp (fmt_str, target_percent_s))
return NULL_TREE;
}
/* If __builtin_{,v}snprintf_chk is used, assume {,v}snprintf is
available. */
fn = builtin_decl_explicit (fcode == BUILT_IN_VSNPRINTF_CHK
? BUILT_IN_VSNPRINTF : BUILT_IN_SNPRINTF);
if (!fn)
return NULL_TREE;
return rewrite_call_expr_array (loc, nargs, args, 5, fn, 3, dest, len, fmt);
}
/* Fold a call EXP to {,v}snprintf. Return NULL_TREE if
a normal call should be emitted rather than expanding the function
inline. FCODE is either BUILT_IN_SNPRINTF_CHK or
BUILT_IN_VSNPRINTF_CHK. If MAXLEN is not NULL, it is maximum length
passed as second argument. */
static tree
fold_builtin_snprintf_chk (location_t loc, tree exp, tree maxlen,
enum built_in_function fcode)
{
return fold_builtin_snprintf_chk_1 (loc, call_expr_nargs (exp),
CALL_EXPR_ARGP (exp), maxlen, fcode);
}
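In user-level terms, the two _chk folders above turn the fortified printf entry points back into the plain functions when the flag is zero and the output is known to fit, or when the format is harmless ("%s" or no '%' at all). A sketch with the GCC checking builtins (illustration only; the exact conditions are the ones checked above):

#include <stdio.h>

char buf[64];

void before (const char *s)
{
  __builtin___sprintf_chk (buf, 0, __builtin_object_size (buf, 1), "%s", "hi");
  __builtin___snprintf_chk (buf, sizeof buf, 0,
                            __builtin_object_size (buf, 1), "%s", s);
}
void after (const char *s)
{
  sprintf (buf, "%s", "hi");
  snprintf (buf, sizeof buf, "%s", s);
}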
/* Builtins with folding operations that operate on "..." arguments
need special handling; we need to store the arguments in a convenient
data structure before attempting any folding. Fortunately there are
@@ -13099,16 +11896,6 @@ fold_builtin_varargs (location_t loc, tree fndecl, tree exp,
switch (fcode)
{
case BUILT_IN_SPRINTF_CHK:
case BUILT_IN_VSPRINTF_CHK:
ret = fold_builtin_sprintf_chk (loc, exp, fcode);
break;
case BUILT_IN_SNPRINTF_CHK:
case BUILT_IN_VSNPRINTF_CHK:
ret = fold_builtin_snprintf_chk (loc, exp, NULL_TREE, fcode);
break;
case BUILT_IN_FPCLASSIFY:
ret = fold_builtin_fpclassify (loc, exp);
break;
@@ -13376,7 +12163,7 @@ fold_builtin_fprintf (location_t loc, tree fndecl, tree fp,
/* Initialize format string characters in the target charset. */
static bool
bool
init_target_chars (void)
{
static bool init;
@@ -13993,76 +12780,6 @@ do_mpc_arg2 (tree arg0, tree arg1, tree type, int do_nonfinite,
return result;
}
/* Fold a call STMT to __{,v}sprintf_chk. Return NULL_TREE if
a normal call should be emitted rather than expanding the function
inline. FCODE is either BUILT_IN_SPRINTF_CHK or BUILT_IN_VSPRINTF_CHK. */
static tree
gimple_fold_builtin_sprintf_chk (gimple stmt, enum built_in_function fcode)
{
int nargs = gimple_call_num_args (stmt);
return fold_builtin_sprintf_chk_1 (gimple_location (stmt), nargs,
(nargs > 0
? gimple_call_arg_ptr (stmt, 0)
: &error_mark_node), fcode);
}
/* Fold a call STMT to {,v}snprintf. Return NULL_TREE if
a normal call should be emitted rather than expanding the function
inline. FCODE is either BUILT_IN_SNPRINTF_CHK or
BUILT_IN_VSNPRINTF_CHK. If MAXLEN is not NULL, it is maximum length
passed as second argument. */
tree
gimple_fold_builtin_snprintf_chk (gimple stmt, tree maxlen,
enum built_in_function fcode)
{
int nargs = gimple_call_num_args (stmt);
return fold_builtin_snprintf_chk_1 (gimple_location (stmt), nargs,
(nargs > 0
? gimple_call_arg_ptr (stmt, 0)
: &error_mark_node), maxlen, fcode);
}
/* Builtins with folding operations that operate on "..." arguments
need special handling; we need to store the arguments in a convenient
data structure before attempting any folding. Fortunately there are
only a few builtins that fall into this category. FNDECL is the
function, EXP is the CALL_EXPR for the call, and IGNORE is true if the
result of the function call is ignored. */
static tree
gimple_fold_builtin_varargs (tree fndecl, gimple stmt,
bool ignore ATTRIBUTE_UNUSED)
{
enum built_in_function fcode = DECL_FUNCTION_CODE (fndecl);
tree ret = NULL_TREE;
switch (fcode)
{
case BUILT_IN_SPRINTF_CHK:
case BUILT_IN_VSPRINTF_CHK:
ret = gimple_fold_builtin_sprintf_chk (stmt, fcode);
break;
case BUILT_IN_SNPRINTF_CHK:
case BUILT_IN_VSNPRINTF_CHK:
ret = gimple_fold_builtin_snprintf_chk (stmt, NULL_TREE, fcode);
default:
break;
}
if (ret)
{
ret = build1 (NOP_EXPR, TREE_TYPE (ret), ret);
TREE_NO_WARNING (ret) = 1;
return ret;
}
return NULL_TREE;
}
/* A wrapper function for builtin folding that prevents warnings for
"statement without effect" and the like, caused by removing the
call node earlier than the warning is generated. */
@@ -14093,8 +12810,6 @@ fold_call_stmt (gimple stmt, bool ignore)
{
if (nargs <= MAX_ARGS_TO_FOLD_BUILTIN)
ret = fold_builtin_n (loc, fndecl, args, nargs, ignore);
if (!ret)
ret = gimple_fold_builtin_varargs (fndecl, stmt, ignore);
if (ret)
{
/* Propagate location information from original call to
@@ -71,29 +71,23 @@ extern void expand_builtin_trap (void);
extern rtx expand_builtin (tree, rtx, rtx, enum machine_mode, int);
extern enum built_in_function builtin_mathfn_code (const_tree);
extern tree fold_builtin_expect (location_t, tree, tree, tree);
extern tree fold_builtin_strcpy (location_t, tree, tree, tree, tree);
extern tree fold_builtin_strncpy (location_t, tree, tree, tree, tree, tree);
extern tree fold_fma (location_t, tree, tree, tree, tree);
extern bool avoid_folding_inline_builtin (tree);
extern tree fold_call_expr (location_t, tree, bool);
extern tree fold_builtin_call_array (location_t, tree, tree, int, tree *);
extern bool validate_gimple_arglist (const_gimple, ...);
extern rtx default_expand_builtin (tree, rtx, rtx, enum machine_mode, int);
extern tree fold_builtin_strcat (location_t, tree, tree, tree);
extern tree fold_builtin_fputs (location_t, tree, tree, bool, bool, tree);
extern bool fold_builtin_next_arg (tree, bool);
extern tree fold_builtin_memory_chk (location_t, tree, tree, tree, tree, tree,
tree, bool, enum built_in_function);
extern tree fold_builtin_stxcpy_chk (location_t, tree, tree, tree, tree, tree,
bool, enum built_in_function);
extern tree fold_builtin_stxncpy_chk (location_t, tree, tree, tree, tree, tree,
bool, enum built_in_function);
extern tree gimple_fold_builtin_snprintf_chk (gimple, tree,
enum built_in_function);
extern tree do_mpc_arg2 (tree, tree, tree, int, int (*)(mpc_ptr, mpc_srcptr, mpc_srcptr, mpc_rnd_t));
extern tree fold_call_stmt (gimple, bool);
extern void set_builtin_user_assembler_name (tree decl, const char *asmspec);
extern bool is_simple_builtin (tree);
extern bool is_inexpensive_builtin (tree);
extern bool readonly_data_expr (tree exp);
extern const char *c_getstr (tree);
extern bool init_target_chars (void);
extern unsigned HOST_WIDE_INT target_percent;
extern char target_percent_s[3];
#endif
@@ -54,6 +54,7 @@ along with GCC; see the file COPYING3. If not see
#include "gimplify-me.h"
#include "dbgcnt.h"
#include "builtins.h"
#include "output.h"
/* Return true when DECL can be referenced from current unit.
FROM_DECL (if non-null) specify constructor of variable DECL was taken from.
@@ -626,6 +627,79 @@ fold_gimple_cond (gimple stmt)
return false;
}
/* Replace a statement at *SI_P with a sequence of statements in STMTS,
adjusting the replacement stmts location and virtual operands.
If the statement has a lhs the last stmt in the sequence is expected
to assign to that lhs. */
static void
gsi_replace_with_seq_vops (gimple_stmt_iterator *si_p, gimple_seq stmts)
{
gimple stmt = gsi_stmt (*si_p);
if (gimple_has_location (stmt))
annotate_all_with_location (stmts, gimple_location (stmt));
/* First iterate over the replacement statements backward, assigning
virtual operands to their defining statements. */
gimple laststore = NULL;
for (gimple_stmt_iterator i = gsi_last (stmts);
!gsi_end_p (i); gsi_prev (&i))
{
gimple new_stmt = gsi_stmt (i);
if ((gimple_assign_single_p (new_stmt)
&& !is_gimple_reg (gimple_assign_lhs (new_stmt)))
|| (is_gimple_call (new_stmt)
&& (gimple_call_flags (new_stmt)
& (ECF_NOVOPS | ECF_PURE | ECF_CONST | ECF_NORETURN)) == 0))
{
tree vdef;
if (!laststore)
vdef = gimple_vdef (stmt);
else
vdef = make_ssa_name (gimple_vop (cfun), new_stmt);
gimple_set_vdef (new_stmt, vdef);
if (vdef && TREE_CODE (vdef) == SSA_NAME)
SSA_NAME_DEF_STMT (vdef) = new_stmt;
laststore = new_stmt;
}
}
/* Second iterate over the statements forward, assigning virtual
operands to their uses. */
tree reaching_vuse = gimple_vuse (stmt);
for (gimple_stmt_iterator i = gsi_start (stmts);
!gsi_end_p (i); gsi_next (&i))
{
gimple new_stmt = gsi_stmt (i);
/* If the new statement possibly has a VUSE, update it with exact SSA
name we know will reach this one. */
if (gimple_has_mem_ops (new_stmt))
gimple_set_vuse (new_stmt, reaching_vuse);
gimple_set_modified (new_stmt, true);
if (gimple_vdef (new_stmt))
reaching_vuse = gimple_vdef (new_stmt);
}
/* If the new sequence does not do a store release the virtual
definition of the original statement. */
if (reaching_vuse
&& reaching_vuse == gimple_vuse (stmt))
{
tree vdef = gimple_vdef (stmt);
if (vdef
&& TREE_CODE (vdef) == SSA_NAME)
{
unlink_stmt_vdef (stmt);
release_ssa_name (vdef);
}
}
/* Finally replace the original statement with the sequence. */
gsi_replace_with_seq (si_p, stmts, false);
}
/* Convert EXPR into a GIMPLE value suitable for substitution on the
   RHS of an assignment.  Insert the necessary statements before
   iterator *SI_P.  The statement at *SI_P, which must be a GIMPLE_CALL
...@@ -643,8 +717,6 @@ gimplify_and_update_call_from_tree (gimple_stmt_iterator *si_p, tree expr)
  gimple stmt, new_stmt;
  gimple_stmt_iterator i;
  gimple_seq stmts = NULL;
gimple laststore;
tree reaching_vuse;
  stmt = gsi_stmt (*si_p);
...@@ -681,240 +753,1431 @@ gimplify_and_update_call_from_tree (gimple_stmt_iterator *si_p, tree expr)
  pop_gimplify_context (NULL);
  gsi_replace_with_seq_vops (si_p, stmts);
}

/* Replace the call at *GSI with the gimple value VAL.  */

static void
replace_call_with_value (gimple_stmt_iterator *gsi, tree val)
{
  gimple stmt = gsi_stmt (*gsi);
  tree lhs = gimple_call_lhs (stmt);
  gimple repl;
  if (lhs)
    {
      if (!useless_type_conversion_p (TREE_TYPE (lhs), TREE_TYPE (val)))
        val = fold_convert (TREE_TYPE (lhs), val);
      repl = gimple_build_assign (lhs, val);
    }
  else
    repl = gimple_build_nop ();
  tree vdef = gimple_vdef (stmt);
  if (vdef && TREE_CODE (vdef) == SSA_NAME)
    {
      unlink_stmt_vdef (stmt);
      release_ssa_name (vdef);
    }
  gsi_replace (gsi, repl, true);
}
/* Replace the call at *GSI with the new call REPL and fold that
again. */
static void
replace_call_with_call_and_fold (gimple_stmt_iterator *gsi, gimple repl)
{
gimple stmt = gsi_stmt (*gsi);
gimple_call_set_lhs (repl, gimple_call_lhs (stmt));
gimple_set_location (repl, gimple_location (stmt));
if (gimple_vdef (stmt)
&& TREE_CODE (gimple_vdef (stmt)) == SSA_NAME)
{
gimple_set_vdef (repl, gimple_vdef (stmt));
gimple_set_vuse (repl, gimple_vuse (stmt));
SSA_NAME_DEF_STMT (gimple_vdef (repl)) = repl;
}
gsi_replace (gsi, repl, true);
fold_stmt (gsi);
}
/* Return true if VAR is a VAR_DECL or a component thereof. */
static bool
var_decl_component_p (tree var)
{
tree inner = var;
while (handled_component_p (inner))
inner = TREE_OPERAND (inner, 0);
return SSA_VAR_P (inner);
}
/* Fold a call to the builtin mem{{,p}cpy,move} at *GSI.  Return
   false if no simplification can be made.
If ENDP is 0, return DEST (like memcpy).
If ENDP is 1, return DEST+LEN (like mempcpy).
If ENDP is 2, return DEST+LEN-1 (like stpcpy).
If ENDP is 3, return DEST, additionally *SRC and *DEST may overlap
(memmove). */
static bool
gimple_fold_builtin_memory_op (gimple_stmt_iterator *gsi,
tree dest, tree src, int endp)
{
gimple stmt = gsi_stmt (*gsi);
tree lhs = gimple_call_lhs (stmt);
tree len = gimple_call_arg (stmt, 2);
tree destvar, srcvar;
location_t loc = gimple_location (stmt);
/* If the LEN parameter is zero, return DEST. */
if (integer_zerop (len))
{
gimple repl;
if (gimple_call_lhs (stmt))
repl = gimple_build_assign (gimple_call_lhs (stmt), dest);
      else
        repl = gimple_build_nop ();
      tree vdef = gimple_vdef (stmt);
      if (vdef && TREE_CODE (vdef) == SSA_NAME)
        {
          unlink_stmt_vdef (stmt);
          release_ssa_name (vdef);
        }
      gsi_replace (gsi, repl, true);
      return true;
    }

  /* If SRC and DEST are the same (and not volatile), return
     DEST{,+LEN,+LEN-1}.  */
  if (operand_equal_p (src, dest, 0))
    {
      unlink_stmt_vdef (stmt);
      if (gimple_vdef (stmt) && TREE_CODE (gimple_vdef (stmt)) == SSA_NAME)
        release_ssa_name (gimple_vdef (stmt));
      if (!lhs)
        {
          gsi_replace (gsi, gimple_build_nop (), true);
          return true;
        }
goto done;
}
else
{
tree srctype, desttype;
unsigned int src_align, dest_align;
tree off0;
/* Build accesses at offset zero with a ref-all character type. */
off0 = build_int_cst (build_pointer_type_for_mode (char_type_node,
ptr_mode, true), 0);
      /* If we can perform the copy efficiently by first doing all loads
         and then all stores, inline it that way.  Currently, "efficiently"
means that we can load all the memory into a single integer
register which is what MOVE_MAX gives us. */
src_align = get_pointer_alignment (src);
dest_align = get_pointer_alignment (dest);
if (tree_fits_uhwi_p (len)
&& compare_tree_int (len, MOVE_MAX) <= 0
      /* ??? Don't transform copies from strings with known length, as
         this confuses tree-ssa-strlen.c.  This doesn't handle
the case in gcc.dg/strlenopt-8.c which is XFAILed for that
reason. */
&& !c_strlen (src, 2))
{
unsigned ilen = tree_to_uhwi (len);
if (exact_log2 (ilen) != -1)
{
tree type = lang_hooks.types.type_for_size (ilen * 8, 1);
if (type
&& TYPE_MODE (type) != BLKmode
&& (GET_MODE_SIZE (TYPE_MODE (type)) * BITS_PER_UNIT
== ilen * 8)
/* If the destination pointer is not aligned we must be able
to emit an unaligned store. */
&& (dest_align >= GET_MODE_ALIGNMENT (TYPE_MODE (type))
|| !SLOW_UNALIGNED_ACCESS (TYPE_MODE (type), dest_align)))
{
tree srctype = type;
tree desttype = type;
if (src_align < GET_MODE_ALIGNMENT (TYPE_MODE (type)))
srctype = build_aligned_type (type, src_align);
tree srcmem = fold_build2 (MEM_REF, srctype, src, off0);
tree tem = fold_const_aggregate_ref (srcmem);
if (tem)
srcmem = tem;
else if (src_align < GET_MODE_ALIGNMENT (TYPE_MODE (type))
&& SLOW_UNALIGNED_ACCESS (TYPE_MODE (type),
src_align))
srcmem = NULL_TREE;
if (srcmem)
{
gimple new_stmt;
if (is_gimple_reg_type (TREE_TYPE (srcmem)))
{
new_stmt = gimple_build_assign (NULL_TREE, srcmem);
if (gimple_in_ssa_p (cfun))
srcmem = make_ssa_name (TREE_TYPE (srcmem),
new_stmt);
else
srcmem = create_tmp_reg (TREE_TYPE (srcmem),
NULL);
gimple_assign_set_lhs (new_stmt, srcmem);
gimple_set_vuse (new_stmt, gimple_vuse (stmt));
gsi_insert_before (gsi, new_stmt, GSI_SAME_STMT);
}
if (dest_align < GET_MODE_ALIGNMENT (TYPE_MODE (type)))
desttype = build_aligned_type (type, dest_align);
new_stmt
= gimple_build_assign (fold_build2 (MEM_REF, desttype,
dest, off0),
srcmem);
gimple_set_vuse (new_stmt, gimple_vuse (stmt));
gimple_set_vdef (new_stmt, gimple_vdef (stmt));
if (gimple_vdef (new_stmt)
&& TREE_CODE (gimple_vdef (new_stmt)) == SSA_NAME)
SSA_NAME_DEF_STMT (gimple_vdef (new_stmt)) = new_stmt;
if (!lhs)
{
gsi_replace (gsi, new_stmt, true);
return true;
}
gsi_insert_before (gsi, new_stmt, GSI_SAME_STMT);
goto done;
}
}
}
}
if (endp == 3)
{
/* Both DEST and SRC must be pointer types.
??? This is what old code did. Is the testing for pointer types
really mandatory?
If either SRC is readonly or length is 1, we can use memcpy. */
if (!dest_align || !src_align)
return false;
if (readonly_data_expr (src)
|| (tree_fits_uhwi_p (len)
&& (MIN (src_align, dest_align) / BITS_PER_UNIT
>= tree_to_uhwi (len))))
{
tree fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
if (!fn)
return false;
gimple_call_set_fndecl (stmt, fn);
gimple_call_set_arg (stmt, 0, dest);
gimple_call_set_arg (stmt, 1, src);
fold_stmt (gsi);
return true;
}
/* If *src and *dest can't overlap, optimize into memcpy as well. */
if (TREE_CODE (src) == ADDR_EXPR
&& TREE_CODE (dest) == ADDR_EXPR)
{
tree src_base, dest_base, fn;
HOST_WIDE_INT src_offset = 0, dest_offset = 0;
HOST_WIDE_INT size = -1;
HOST_WIDE_INT maxsize = -1;
srcvar = TREE_OPERAND (src, 0);
src_base = get_ref_base_and_extent (srcvar, &src_offset,
&size, &maxsize);
destvar = TREE_OPERAND (dest, 0);
dest_base = get_ref_base_and_extent (destvar, &dest_offset,
&size, &maxsize);
if (tree_fits_uhwi_p (len))
maxsize = tree_to_uhwi (len);
else
maxsize = -1;
src_offset /= BITS_PER_UNIT;
dest_offset /= BITS_PER_UNIT;
if (SSA_VAR_P (src_base)
&& SSA_VAR_P (dest_base))
{
if (operand_equal_p (src_base, dest_base, 0)
&& ranges_overlap_p (src_offset, maxsize,
dest_offset, maxsize))
return false;
}
else if (TREE_CODE (src_base) == MEM_REF
&& TREE_CODE (dest_base) == MEM_REF)
{
if (! operand_equal_p (TREE_OPERAND (src_base, 0),
TREE_OPERAND (dest_base, 0), 0))
return false;
offset_int off = mem_ref_offset (src_base) + src_offset;
if (!wi::fits_shwi_p (off))
return false;
src_offset = off.to_shwi ();
off = mem_ref_offset (dest_base) + dest_offset;
if (!wi::fits_shwi_p (off))
return false;
dest_offset = off.to_shwi ();
if (ranges_overlap_p (src_offset, maxsize,
dest_offset, maxsize))
return false;
}
else
return false;
fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
if (!fn)
return false;
gimple_call_set_fndecl (stmt, fn);
gimple_call_set_arg (stmt, 0, dest);
gimple_call_set_arg (stmt, 1, src);
fold_stmt (gsi);
return true;
}
/* If the destination and source do not alias optimize into
memcpy as well. */
if ((is_gimple_min_invariant (dest)
|| TREE_CODE (dest) == SSA_NAME)
&& (is_gimple_min_invariant (src)
|| TREE_CODE (src) == SSA_NAME))
{
ao_ref destr, srcr;
ao_ref_init_from_ptr_and_size (&destr, dest, len);
ao_ref_init_from_ptr_and_size (&srcr, src, len);
if (!refs_may_alias_p_1 (&destr, &srcr, false))
{
tree fn;
fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
if (!fn)
return false;
gimple_call_set_fndecl (stmt, fn);
gimple_call_set_arg (stmt, 0, dest);
gimple_call_set_arg (stmt, 1, src);
fold_stmt (gsi);
return true;
}
}
return false;
}
if (!tree_fits_shwi_p (len))
return false;
/* FIXME:
     This logic loses for arguments like (type *)malloc (sizeof (type)),
since we strip the casts of up to VOID return value from malloc.
Perhaps we ought to inherit type from non-VOID argument here? */
STRIP_NOPS (src);
STRIP_NOPS (dest);
if (!POINTER_TYPE_P (TREE_TYPE (src))
|| !POINTER_TYPE_P (TREE_TYPE (dest)))
return false;
/* In the following try to find a type that is most natural to be
used for the memcpy source and destination and that allows
the most optimization when memcpy is turned into a plain assignment
using that type. In theory we could always use a char[len] type
but that only gains us that the destination and source possibly
no longer will have their address taken. */
/* As we fold (void *)(p + CST) to (void *)p + CST undo this here. */
if (TREE_CODE (src) == POINTER_PLUS_EXPR)
{
tree tem = TREE_OPERAND (src, 0);
STRIP_NOPS (tem);
if (tem != TREE_OPERAND (src, 0))
src = build1 (NOP_EXPR, TREE_TYPE (tem), src);
}
if (TREE_CODE (dest) == POINTER_PLUS_EXPR)
{
tree tem = TREE_OPERAND (dest, 0);
STRIP_NOPS (tem);
if (tem != TREE_OPERAND (dest, 0))
dest = build1 (NOP_EXPR, TREE_TYPE (tem), dest);
}
srctype = TREE_TYPE (TREE_TYPE (src));
if (TREE_CODE (srctype) == ARRAY_TYPE
&& !tree_int_cst_equal (TYPE_SIZE_UNIT (srctype), len))
{
srctype = TREE_TYPE (srctype);
STRIP_NOPS (src);
src = build1 (NOP_EXPR, build_pointer_type (srctype), src);
}
desttype = TREE_TYPE (TREE_TYPE (dest));
if (TREE_CODE (desttype) == ARRAY_TYPE
&& !tree_int_cst_equal (TYPE_SIZE_UNIT (desttype), len))
{
desttype = TREE_TYPE (desttype);
STRIP_NOPS (dest);
dest = build1 (NOP_EXPR, build_pointer_type (desttype), dest);
}
if (TREE_ADDRESSABLE (srctype)
|| TREE_ADDRESSABLE (desttype))
return false;
/* Make sure we are not copying using a floating-point mode or
a type whose size possibly does not match its precision. */
if (FLOAT_MODE_P (TYPE_MODE (desttype))
|| TREE_CODE (desttype) == BOOLEAN_TYPE
|| TREE_CODE (desttype) == ENUMERAL_TYPE)
desttype = bitwise_type_for_mode (TYPE_MODE (desttype));
if (FLOAT_MODE_P (TYPE_MODE (srctype))
|| TREE_CODE (srctype) == BOOLEAN_TYPE
|| TREE_CODE (srctype) == ENUMERAL_TYPE)
srctype = bitwise_type_for_mode (TYPE_MODE (srctype));
if (!srctype)
srctype = desttype;
if (!desttype)
desttype = srctype;
if (!srctype)
return false;
src_align = get_pointer_alignment (src);
dest_align = get_pointer_alignment (dest);
if (dest_align < TYPE_ALIGN (desttype)
|| src_align < TYPE_ALIGN (srctype))
return false;
destvar = dest;
STRIP_NOPS (destvar);
if (TREE_CODE (destvar) == ADDR_EXPR
&& var_decl_component_p (TREE_OPERAND (destvar, 0))
&& tree_int_cst_equal (TYPE_SIZE_UNIT (desttype), len))
destvar = fold_build2 (MEM_REF, desttype, destvar, off0);
else
destvar = NULL_TREE;
srcvar = src;
STRIP_NOPS (srcvar);
if (TREE_CODE (srcvar) == ADDR_EXPR
&& var_decl_component_p (TREE_OPERAND (srcvar, 0))
&& tree_int_cst_equal (TYPE_SIZE_UNIT (srctype), len))
{
if (!destvar
|| src_align >= TYPE_ALIGN (desttype))
srcvar = fold_build2 (MEM_REF, destvar ? desttype : srctype,
srcvar, off0);
else if (!STRICT_ALIGNMENT)
{
srctype = build_aligned_type (TYPE_MAIN_VARIANT (desttype),
src_align);
srcvar = fold_build2 (MEM_REF, srctype, srcvar, off0);
}
else
srcvar = NULL_TREE;
}
else
srcvar = NULL_TREE;
if (srcvar == NULL_TREE && destvar == NULL_TREE)
return false;
if (srcvar == NULL_TREE)
{
STRIP_NOPS (src);
if (src_align >= TYPE_ALIGN (desttype))
srcvar = fold_build2 (MEM_REF, desttype, src, off0);
else
{
if (STRICT_ALIGNMENT)
return false;
srctype = build_aligned_type (TYPE_MAIN_VARIANT (desttype),
src_align);
srcvar = fold_build2 (MEM_REF, srctype, src, off0);
}
}
else if (destvar == NULL_TREE)
{
STRIP_NOPS (dest);
if (dest_align >= TYPE_ALIGN (srctype))
destvar = fold_build2 (MEM_REF, srctype, dest, off0);
else
{
if (STRICT_ALIGNMENT)
return false;
desttype = build_aligned_type (TYPE_MAIN_VARIANT (srctype),
dest_align);
destvar = fold_build2 (MEM_REF, desttype, dest, off0);
}
}
gimple new_stmt;
if (is_gimple_reg_type (TREE_TYPE (srcvar)))
{
new_stmt = gimple_build_assign (NULL_TREE, srcvar);
if (gimple_in_ssa_p (cfun))
srcvar = make_ssa_name (TREE_TYPE (srcvar), new_stmt);
else
srcvar = create_tmp_reg (TREE_TYPE (srcvar), NULL);
gimple_assign_set_lhs (new_stmt, srcvar);
gimple_set_vuse (new_stmt, gimple_vuse (stmt));
gsi_insert_before (gsi, new_stmt, GSI_SAME_STMT);
}
new_stmt = gimple_build_assign (destvar, srcvar);
gimple_set_vuse (new_stmt, gimple_vuse (stmt));
gimple_set_vdef (new_stmt, gimple_vdef (stmt));
if (gimple_vdef (new_stmt)
&& TREE_CODE (gimple_vdef (new_stmt)) == SSA_NAME)
SSA_NAME_DEF_STMT (gimple_vdef (new_stmt)) = new_stmt;
if (!lhs)
{
gsi_replace (gsi, new_stmt, true);
return true;
}
gsi_insert_before (gsi, new_stmt, GSI_SAME_STMT);
}
done:
if (endp == 0 || endp == 3)
len = NULL_TREE;
else if (endp == 2)
len = fold_build2_loc (loc, MINUS_EXPR, TREE_TYPE (len), len,
ssize_int (1));
if (endp == 2 || endp == 1)
dest = fold_build_pointer_plus_loc (loc, dest, len);
dest = force_gimple_operand_gsi (gsi, dest, false, NULL_TREE, true,
GSI_SAME_STMT);
gimple repl = gimple_build_assign (lhs, dest);
gsi_replace (gsi, repl, true);
return true;
}
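At the source level, the fold above targets calls like the one in the following sketch (illustrative only, not part of the patch; names and values are made up).  With optimization enabled, a 4-byte memcpy whose length is a known power of two and whose source is not a string of known length is a candidate for a single word-sized load and store instead of a library call:

/* Illustrative test (assumed name fold-memcpy.c).  */
#include <stdio.h>
#include <string.h>

struct s { char buf[4]; };

int
main (void)
{
  struct s a = { { 1, 2, 3, 4 } };
  struct s b;
  memcpy (b.buf, a.buf, 4);   /* length is a known power of two */
  printf ("%d\n", b.buf[2]);  /* prints 3 */
  return 0;
}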
/* Fold a call to the memset or bzero builtin at *GSI, setting the first
   LEN bytes of the destination to the value C.  Return whether a
   simplification was made.  */
static bool
gimple_fold_builtin_memset (gimple_stmt_iterator *gsi, tree c, tree len)
{
gimple stmt = gsi_stmt (*gsi);
tree etype;
unsigned HOST_WIDE_INT length, cval;
/* If the LEN parameter is zero, return DEST. */
if (integer_zerop (len))
{
replace_call_with_value (gsi, gimple_call_arg (stmt, 0));
return true;
}
if (! tree_fits_uhwi_p (len))
return false;
if (TREE_CODE (c) != INTEGER_CST)
return false;
tree dest = gimple_call_arg (stmt, 0);
tree var = dest;
if (TREE_CODE (var) != ADDR_EXPR)
return false;
var = TREE_OPERAND (var, 0);
if (TREE_THIS_VOLATILE (var))
return false;
etype = TREE_TYPE (var);
if (TREE_CODE (etype) == ARRAY_TYPE)
etype = TREE_TYPE (etype);
if (!INTEGRAL_TYPE_P (etype)
&& !POINTER_TYPE_P (etype))
return NULL_TREE;
if (! var_decl_component_p (var))
return NULL_TREE;
length = tree_to_uhwi (len);
if (GET_MODE_SIZE (TYPE_MODE (etype)) != length
|| get_pointer_alignment (dest) / BITS_PER_UNIT < length)
return NULL_TREE;
if (length > HOST_BITS_PER_WIDE_INT / BITS_PER_UNIT)
return NULL_TREE;
if (integer_zerop (c))
cval = 0;
else
{
if (CHAR_BIT != 8 || BITS_PER_UNIT != 8 || HOST_BITS_PER_WIDE_INT > 64)
return NULL_TREE;
cval = TREE_INT_CST_LOW (c);
cval &= 0xff;
cval |= cval << 8;
cval |= cval << 16;
cval |= (cval << 31) << 1;
}
var = fold_build2 (MEM_REF, etype, dest, build_int_cst (ptr_type_node, 0));
gimple store = gimple_build_assign (var, build_int_cst_type (etype, cval));
gimple_set_vuse (store, gimple_vuse (stmt));
tree vdef = gimple_vdef (stmt);
if (vdef && TREE_CODE (vdef) == SSA_NAME)
{
gimple_set_vdef (store, gimple_vdef (stmt));
SSA_NAME_DEF_STMT (gimple_vdef (stmt)) = store;
}
gsi_insert_before (gsi, store, GSI_SAME_STMT);
if (gimple_call_lhs (stmt))
{
gimple asgn = gimple_build_assign (gimple_call_lhs (stmt), dest);
gsi_replace (gsi, asgn, true);
}
else
{
gimple_stmt_iterator gsi2 = *gsi;
gsi_prev (gsi);
gsi_remove (&gsi2, true);
}
return true;
}
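The cval computation above replicates the fill byte across a host wide int so that a word-sized memset can become a single constant store; the (cval << 31) << 1 step avoids an undefined shift by 32 when the host wide int is only 32 bits wide.  A standalone sketch of the same arithmetic (illustrative only; the fill byte 0x5a is an assumption of the example):

#include <stdio.h>

int
main (void)
{
  unsigned long long cval = 0x5aULL & 0xff;
  cval |= cval << 8;            /* 0x5a5a */
  cval |= cval << 16;           /* 0x5a5a5a5a */
  cval |= (cval << 31) << 1;    /* 0x5a5a5a5a5a5a5a5a, without shifting by 32 */
  printf ("%#llx\n", cval);
  return 0;
}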
/* Return the string length, maximum string length or maximum value of
ARG in LENGTH.
If ARG is an SSA name variable, follow its use-def chains. If LENGTH
is not NULL and, for TYPE == 0, its value is not equal to the length
we determine or if we are unable to determine the length or value,
return false. VISITED is a bitmap of visited variables.
TYPE is 0 if string length should be returned, 1 for maximum string
length and 2 for maximum value ARG can have. */
static bool
get_maxval_strlen (tree arg, tree *length, bitmap visited, int type)
{
tree var, val;
gimple def_stmt;
if (TREE_CODE (arg) != SSA_NAME)
{
/* We can end up with &(*iftmp_1)[0] here as well, so handle it. */
if (TREE_CODE (arg) == ADDR_EXPR
&& TREE_CODE (TREE_OPERAND (arg, 0)) == ARRAY_REF
&& integer_zerop (TREE_OPERAND (TREE_OPERAND (arg, 0), 1)))
{
tree aop0 = TREE_OPERAND (TREE_OPERAND (arg, 0), 0);
if (TREE_CODE (aop0) == INDIRECT_REF
&& TREE_CODE (TREE_OPERAND (aop0, 0)) == SSA_NAME)
return get_maxval_strlen (TREE_OPERAND (aop0, 0),
length, visited, type);
}
if (type == 2)
{
val = arg;
if (TREE_CODE (val) != INTEGER_CST
|| tree_int_cst_sgn (val) < 0)
return false;
}
else
val = c_strlen (arg, 1);
if (!val)
return false;
if (*length)
{
if (type > 0)
{
if (TREE_CODE (*length) != INTEGER_CST
|| TREE_CODE (val) != INTEGER_CST)
return false;
if (tree_int_cst_lt (*length, val))
*length = val;
return true;
}
else if (simple_cst_equal (val, *length) != 1)
return false;
}
*length = val;
return true;
}
/* If ARG is registered for SSA update we cannot look at its defining
statement. */
if (name_registered_for_update_p (arg))
return false;
/* If we were already here, break the infinite cycle. */
if (!bitmap_set_bit (visited, SSA_NAME_VERSION (arg)))
return true;
var = arg;
def_stmt = SSA_NAME_DEF_STMT (var);
switch (gimple_code (def_stmt))
{
case GIMPLE_ASSIGN:
/* The RHS of the statement defining VAR must either have a
constant length or come from another SSA_NAME with a constant
length. */
if (gimple_assign_single_p (def_stmt)
|| gimple_assign_unary_nop_p (def_stmt))
{
tree rhs = gimple_assign_rhs1 (def_stmt);
return get_maxval_strlen (rhs, length, visited, type);
}
else if (gimple_assign_rhs_code (def_stmt) == COND_EXPR)
{
tree op2 = gimple_assign_rhs2 (def_stmt);
tree op3 = gimple_assign_rhs3 (def_stmt);
return get_maxval_strlen (op2, length, visited, type)
&& get_maxval_strlen (op3, length, visited, type);
}
return false;
case GIMPLE_PHI:
{
/* All the arguments of the PHI node must have the same constant
length. */
unsigned i;
for (i = 0; i < gimple_phi_num_args (def_stmt); i++)
{
tree arg = gimple_phi_arg (def_stmt, i)->def;
/* If this PHI has itself as an argument, we cannot
determine the string length of this argument. However,
if we can find a constant string length for the other
PHI args then we can still be sure that this is a
constant string length. So be optimistic and just
continue with the next argument. */
if (arg == gimple_phi_result (def_stmt))
continue;
if (!get_maxval_strlen (arg, length, visited, type))
return false;
}
}
return true;
default:
return false;
}
}
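The GIMPLE_PHI and COND_EXPR cases mean a usable maximum length can still be derived when the argument is chosen between several strings of known length.  A hedged source-level sketch of such a pattern (whether the later folds actually fire on it depends on the pass pipeline and optimization level):

#include <stdio.h>
#include <string.h>

int
main (int argc, char **argv)
{
  const char *p = argc > 1 ? "longer" : "short";
  printf ("%zu\n", strlen (p));   /* both reaching definitions have known length */
  return 0;
}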
/* Fold function call to builtin strcpy with arguments DEST and SRC.
   If LEN is not NULL, it represents the length of the string to be
   copied.  Return false if no simplification can be made.  */
static bool
gimple_fold_builtin_strcpy (gimple_stmt_iterator *gsi,
location_t loc, tree dest, tree src, tree len)
{
tree fn;
/* If SRC and DEST are the same (and not volatile), return DEST. */
if (operand_equal_p (src, dest, 0))
{
replace_call_with_value (gsi, dest);
return true;
}
if (optimize_function_for_size_p (cfun))
return false;
fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
if (!fn)
return false;
if (!len)
{
len = c_strlen (src, 1);
if (! len || TREE_SIDE_EFFECTS (len))
return NULL_TREE;
}
len = fold_convert_loc (loc, size_type_node, len);
len = size_binop_loc (loc, PLUS_EXPR, len, build_int_cst (size_type_node, 1));
len = force_gimple_operand_gsi (gsi, len, true,
NULL_TREE, true, GSI_SAME_STMT);
gimple repl = gimple_build_call (fn, 3, dest, src, len);
replace_call_with_call_and_fold (gsi, repl);
return true;
}
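At the source level this is the classic strcpy-to-memcpy rewrite: when the source length is known, the copy, including the terminating NUL, can be done by memcpy.  A small sketch, illustrative only:

#include <stdio.h>
#include <string.h>

int
main (void)
{
  char buf[16];
  strcpy (buf, "hello");   /* candidate for memcpy (buf, "hello", 6) */
  puts (buf);
  return 0;
}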
/* Fold function call to builtin strncpy with arguments DEST, SRC, and LEN.
   If SLEN is not NULL, it represents the length of the source string.
   Return false if no simplification can be made.  */
static bool
gimple_fold_builtin_strncpy (gimple_stmt_iterator *gsi, location_t loc,
tree dest, tree src, tree len, tree slen)
{
tree fn;
/* If the LEN parameter is zero, return DEST. */
if (integer_zerop (len))
{
replace_call_with_value (gsi, dest);
return true;
}
/* We can't compare slen with len as constants below if len is not a
constant. */
if (len == 0 || TREE_CODE (len) != INTEGER_CST)
return false;
if (!slen)
slen = c_strlen (src, 1);
/* Now, we must be passed a constant src ptr parameter. */
if (slen == 0 || TREE_CODE (slen) != INTEGER_CST)
return false;
slen = size_binop_loc (loc, PLUS_EXPR, slen, ssize_int (1));
/* We do not support simplification of this case, though we do
support it when expanding trees into RTL. */
/* FIXME: generate a call to __builtin_memset. */
if (tree_int_cst_lt (slen, len))
return false;
/* OK transform into builtin memcpy. */
fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
if (!fn)
return false;
len = fold_convert_loc (loc, size_type_node, len);
len = force_gimple_operand_gsi (gsi, len, true,
NULL_TREE, true, GSI_SAME_STMT);
gimple repl = gimple_build_call (fn, 3, dest, src, len);
replace_call_with_call_and_fold (gsi, repl);
return true;
}
/* Simplify a call to the strcat builtin.  DST and SRC are the arguments
   to the call.

   Return false if no simplification was possible; otherwise replace the
   call at *GSI with a cheaper equivalent (possibly calls to other
   builtins such as strlen and memcpy) and return true.  */
static bool
gimple_fold_builtin_strcat (gimple_stmt_iterator *gsi,
location_t loc ATTRIBUTE_UNUSED, tree dst, tree src,
tree len)
{
gimple stmt = gsi_stmt (*gsi);
const char *p = c_getstr (src);
/* If the string length is zero, return the dst parameter. */
if (p && *p == '\0')
{
replace_call_with_value (gsi, dst);
return true;
}
if (!optimize_bb_for_speed_p (gimple_bb (stmt)))
return false;
/* See if we can store by pieces into (dst + strlen(dst)). */
tree newdst;
tree strlen_fn = builtin_decl_implicit (BUILT_IN_STRLEN);
tree memcpy_fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
if (!strlen_fn || !memcpy_fn)
return false;
/* If the length of the source string isn't computable don't
split strcat into strlen and memcpy. */
if (! len)
len = c_strlen (src, 1);
if (! len || TREE_SIDE_EFFECTS (len))
return false;
/* Create strlen (dst). */
gimple_seq stmts = NULL, stmts2;
gimple repl = gimple_build_call (strlen_fn, 1, dst);
gimple_set_location (repl, loc);
if (gimple_in_ssa_p (cfun))
newdst = make_ssa_name (size_type_node, NULL);
else
newdst = create_tmp_reg (size_type_node, NULL);
gimple_call_set_lhs (repl, newdst);
gimple_seq_add_stmt_without_update (&stmts, repl);
/* Create (dst p+ strlen (dst)). */
newdst = fold_build_pointer_plus_loc (loc, dst, newdst);
newdst = force_gimple_operand (newdst, &stmts2, true, NULL_TREE);
gimple_seq_add_seq_without_update (&stmts, stmts2);
len = fold_convert_loc (loc, size_type_node, len);
len = size_binop_loc (loc, PLUS_EXPR, len,
build_int_cst (size_type_node, 1));
len = force_gimple_operand (len, &stmts2, true, NULL_TREE);
gimple_seq_add_seq_without_update (&stmts, stmts2);
repl = gimple_build_call (memcpy_fn, 3, newdst, src, len);
gimple_seq_add_stmt_without_update (&stmts, repl);
if (gimple_call_lhs (stmt))
{
repl = gimple_build_assign (gimple_call_lhs (stmt), dst);
gimple_seq_add_stmt_without_update (&stmts, repl);
gsi_replace_with_seq_vops (gsi, stmts);
/* gsi now points at the assignment to the lhs, get a
stmt iterator to the memcpy call.
??? We can't use gsi_for_stmt as that doesn't work when the
CFG isn't built yet. */
gimple_stmt_iterator gsi2 = *gsi;
gsi_prev (&gsi2);
fold_stmt (&gsi2);
}
else
{
gsi_replace_with_seq_vops (gsi, stmts);
fold_stmt (gsi);
}
return true;
}
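The resulting sequence computes strlen (dst) once and then memcpy's the source plus its terminating NUL at that offset.  A sketch of a call this targets (illustrative only; the split is done only when optimizing the block for speed):

#include <stdio.h>
#include <string.h>

int
main (void)
{
  char buf[32] = "abc";
  strcat (buf, ", world");   /* candidate for strlen (buf) + memcpy of 8 bytes */
  puts (buf);
  return 0;
}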
/* Fold a call to the fputs builtin.  ARG0 and ARG1 are the arguments
   to the call.  IGNORE is true if the value returned
   by the builtin will be ignored.  UNLOCKED is true if this is
   actually a call to fputs_unlocked.  If LEN is non-NULL, it represents
   the known length of the string.  Return false if no simplification
   was possible.  */
static bool
gimple_fold_builtin_fputs (gimple_stmt_iterator *gsi,
location_t loc ATTRIBUTE_UNUSED,
tree arg0, tree arg1,
bool ignore, bool unlocked, tree len)
{
/* If we're using an unlocked function, assume the other unlocked
functions exist explicitly. */
tree const fn_fputc = (unlocked
? builtin_decl_explicit (BUILT_IN_FPUTC_UNLOCKED)
: builtin_decl_implicit (BUILT_IN_FPUTC));
tree const fn_fwrite = (unlocked
? builtin_decl_explicit (BUILT_IN_FWRITE_UNLOCKED)
: builtin_decl_implicit (BUILT_IN_FWRITE));
/* If the return value is used, don't do the transformation. */
if (!ignore)
return false;
if (! len)
len = c_strlen (arg0, 0);
/* Get the length of the string passed to fputs. If the length
can't be determined, punt. */
if (!len
|| TREE_CODE (len) != INTEGER_CST)
return false;
switch (compare_tree_int (len, 1))
{
case -1: /* length is 0, delete the call entirely . */
replace_call_with_value (gsi, integer_zero_node);
return true;
case 0: /* length is 1, call fputc. */
{
const char *p = c_getstr (arg0);
if (p != NULL)
{
if (!fn_fputc)
return false;
gimple repl = gimple_build_call (fn_fputc, 2,
build_int_cst
(integer_type_node, p[0]), arg1);
replace_call_with_call_and_fold (gsi, repl);
return true;
}
}
/* FALLTHROUGH */
case 1: /* length is greater than 1, call fwrite. */
{
/* If optimizing for size keep fputs. */
if (optimize_function_for_size_p (cfun))
return false;
/* New argument list transforming fputs(string, stream) to
fwrite(string, 1, len, stream). */
if (!fn_fwrite)
return false;
gimple repl = gimple_build_call (fn_fwrite, 4, arg0,
size_one_node, len, arg1);
replace_call_with_call_and_fold (gsi, repl);
return true;
}
default:
gcc_unreachable ();
}
return false;
}
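So with the return value unused, a one-character literal becomes fputc and a longer known-length literal becomes fwrite (the call is kept as fputs when optimizing for size).  Illustrative sketch, not part of the patch:

#include <stdio.h>

int
main (void)
{
  fputs ("x", stdout);      /* candidate for fputc ('x', stdout) */
  fputs ("hello", stdout);  /* candidate for fwrite ("hello", 1, 5, stdout) */
  fputs ("\n", stdout);
  return 0;
}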
/* Fold a call to the __mem{cpy,pcpy,move,set}_chk builtin.
DEST, SRC, LEN, and SIZE are the arguments to the call.
IGNORE is true if the return value can be ignored.  FCODE is the BUILT_IN_*
code of the builtin. If MAXLEN is not NULL, it is maximum length
passed as third argument. */
static bool
gimple_fold_builtin_memory_chk (gimple_stmt_iterator *gsi,
location_t loc,
tree dest, tree src, tree len, tree size,
tree maxlen, bool ignore,
enum built_in_function fcode)
{
tree fn;
/* If SRC and DEST are the same (and not volatile), return DEST
(resp. DEST+LEN for __mempcpy_chk). */
if (fcode != BUILT_IN_MEMSET_CHK && operand_equal_p (src, dest, 0))
{
if (fcode != BUILT_IN_MEMPCPY_CHK)
{
replace_call_with_value (gsi, dest);
return true;
}
else
{
tree temp = fold_build_pointer_plus_loc (loc, dest, len);
temp = force_gimple_operand_gsi (gsi, temp,
false, NULL_TREE, true,
GSI_SAME_STMT);
replace_call_with_value (gsi, temp);
return true;
}
}
if (! tree_fits_uhwi_p (size))
return false;
if (! integer_all_onesp (size))
{
if (! tree_fits_uhwi_p (len))
{
/* If LEN is not constant, try MAXLEN too.
For MAXLEN only allow optimizing into non-_ocs function
if SIZE is >= MAXLEN, never convert to __ocs_fail (). */
if (maxlen == NULL_TREE || ! tree_fits_uhwi_p (maxlen))
{
if (fcode == BUILT_IN_MEMPCPY_CHK && ignore)
{
/* (void) __mempcpy_chk () can be optimized into
(void) __memcpy_chk (). */
fn = builtin_decl_explicit (BUILT_IN_MEMCPY_CHK);
if (!fn)
return false;
gimple repl = gimple_build_call (fn, 4, dest, src, len, size);
replace_call_with_call_and_fold (gsi, repl);
return true;
}
return false;
}
}
else
maxlen = len;
if (tree_int_cst_lt (size, maxlen))
return false;
}
fn = NULL_TREE;
/* If __builtin_mem{cpy,pcpy,move,set}_chk is used, assume
mem{cpy,pcpy,move,set} is available. */
switch (fcode)
{
case BUILT_IN_MEMCPY_CHK:
fn = builtin_decl_explicit (BUILT_IN_MEMCPY);
break;
case BUILT_IN_MEMPCPY_CHK:
fn = builtin_decl_explicit (BUILT_IN_MEMPCPY);
break;
case BUILT_IN_MEMMOVE_CHK:
fn = builtin_decl_explicit (BUILT_IN_MEMMOVE);
break;
case BUILT_IN_MEMSET_CHK:
fn = builtin_decl_explicit (BUILT_IN_MEMSET);
break;
default:
break;
}
if (!fn)
return false;
gimple repl = gimple_build_call (fn, 3, dest, src, len);
replace_call_with_call_and_fold (gsi, repl);
return true;
}
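When the object-size argument is unknown (all ones) or provably no smaller than the length, the _chk call degrades to the plain call.  A sketch using the GCC builtins __builtin___memcpy_chk and __builtin_object_size (the sizes 16 and 8 are assumptions of the example):

#include <stdio.h>

int
main (void)
{
  char dst[16];
  const char src[8] = "0123456";
  /* Object size 16 >= length 8, so this is a candidate for plain memcpy.  */
  __builtin___memcpy_chk (dst, src, 8, __builtin_object_size (dst, 0));
  printf ("%.7s\n", dst);
  return 0;
}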
/* Fold a call to the __st[rp]cpy_chk builtin.
DEST, SRC, and SIZE are the arguments to the call.
IGNORE is true if return value can be ignored. FCODE is the BUILT_IN_*
code of the builtin. If MAXLEN is not NULL, it is maximum length of
strings passed as second argument. */
static bool
gimple_fold_builtin_stxcpy_chk (gimple_stmt_iterator *gsi,
location_t loc, tree dest,
tree src, tree size,
tree maxlen, bool ignore,
enum built_in_function fcode)
{
tree len, fn;
/* If SRC and DEST are the same (and not volatile), return DEST. */
if (fcode == BUILT_IN_STRCPY_CHK && operand_equal_p (src, dest, 0))
{
replace_call_with_value (gsi, dest);
return true;
    }

  if (! tree_fits_uhwi_p (size))
    return false;

  if (! integer_all_onesp (size))
    {
      len = c_strlen (src, 1);
      if (! len || ! tree_fits_uhwi_p (len))
        {
          /* If LEN is not constant, try MAXLEN too.
             For MAXLEN only allow optimizing into non-_ocs function
             if SIZE is >= MAXLEN, never convert to __ocs_fail ().  */
if (maxlen == NULL_TREE || ! tree_fits_uhwi_p (maxlen))
{
if (fcode == BUILT_IN_STPCPY_CHK)
{
if (! ignore)
return false;
/* If return value of __stpcpy_chk is ignored,
optimize into __strcpy_chk. */
fn = builtin_decl_explicit (BUILT_IN_STRCPY_CHK);
if (!fn)
return false;
gimple repl = gimple_build_call (fn, 3, dest, src, size);
replace_call_with_call_and_fold (gsi, repl);
return true;
}
if (! len || TREE_SIDE_EFFECTS (len))
return false;
/* If c_strlen returned something, but not a constant,
transform __strcpy_chk into __memcpy_chk. */
fn = builtin_decl_explicit (BUILT_IN_MEMCPY_CHK);
if (!fn)
return false;
len = fold_convert_loc (loc, size_type_node, len);
len = size_binop_loc (loc, PLUS_EXPR, len,
build_int_cst (size_type_node, 1));
len = force_gimple_operand_gsi (gsi, len, true, NULL_TREE,
true, GSI_SAME_STMT);
gimple repl = gimple_build_call (fn, 4, dest, src, len, size);
replace_call_with_call_and_fold (gsi, repl);
return true;
            }
        }
      else
        maxlen = len;

      if (! tree_int_cst_lt (maxlen, size))
        return false;
    }
/* If __builtin_st{r,p}cpy_chk is used, assume st{r,p}cpy is available. */
fn = builtin_decl_explicit (fcode == BUILT_IN_STPCPY_CHK
? BUILT_IN_STPCPY : BUILT_IN_STRCPY);
if (!fn)
return false;
gimple repl = gimple_build_call (fn, 2, dest, src);
replace_call_with_call_and_fold (gsi, repl);
return true;
}
/* Fold a call to the __st{r,p}ncpy_chk builtin.  DEST, SRC, LEN, and SIZE
   are the arguments to the call.  If MAXLEN is not NULL, it is maximum
   length passed as third argument.  IGNORE is true if return value can be
   ignored.  FCODE is the BUILT_IN_* code of the builtin.  */

static bool
gimple_fold_builtin_stxncpy_chk (gimple_stmt_iterator *gsi,
                                 tree dest, tree src,
                                 tree len, tree size, tree maxlen, bool ignore,
                                 enum built_in_function fcode)
{
  tree fn;

  if (fcode == BUILT_IN_STPNCPY_CHK && ignore)
    {
      /* If return value of __stpncpy_chk is ignored,
         optimize into __strncpy_chk.  */
      fn = builtin_decl_explicit (BUILT_IN_STRNCPY_CHK);
      if (fn)
        {
          gimple repl = gimple_build_call (fn, 4, dest, src, len, size);
          replace_call_with_call_and_fold (gsi, repl);
          return true;
        }
    }

  if (! tree_fits_uhwi_p (size))
    return false;

  if (! integer_all_onesp (size))
    {
      if (! tree_fits_uhwi_p (len))
        {
          /* If LEN is not constant, try MAXLEN too.
             For MAXLEN only allow optimizing into non-_ocs function
             if SIZE is >= MAXLEN, never convert to __ocs_fail ().  */
          if (maxlen == NULL_TREE || ! tree_fits_uhwi_p (maxlen))
            return false;
        }
      else
        maxlen = len;

      if (tree_int_cst_lt (size, maxlen))
        return false;
    }

  /* If __builtin_st{r,p}ncpy_chk is used, assume st{r,p}ncpy is available.  */
  fn = builtin_decl_explicit (fcode == BUILT_IN_STPNCPY_CHK
                              ? BUILT_IN_STPNCPY : BUILT_IN_STRNCPY);
  if (!fn)
    return false;

  gimple repl = gimple_build_call (fn, 3, dest, src, len);
  replace_call_with_call_and_fold (gsi, repl);
  return true;
}
/* Fold a call to __{,v}snprintf_chk at *GSI.  Return false if a normal
   call should be emitted rather than simplifying the call inline.
   FCODE is either BUILT_IN_SNPRINTF_CHK or BUILT_IN_VSNPRINTF_CHK.
   If MAXLEN is not NULL, it is the maximum length passed as the
   second argument.  */
static bool
gimple_fold_builtin_snprintf_chk (gimple_stmt_iterator *gsi,
tree maxlen, enum built_in_function fcode)
{
gimple stmt = gsi_stmt (*gsi);
tree dest, size, len, fn, fmt, flag;
const char *fmt_str;
/* Verify the required arguments in the original call. */
if (gimple_call_num_args (stmt) < 5)
return false;
dest = gimple_call_arg (stmt, 0);
len = gimple_call_arg (stmt, 1);
flag = gimple_call_arg (stmt, 2);
size = gimple_call_arg (stmt, 3);
fmt = gimple_call_arg (stmt, 4);
if (! tree_fits_uhwi_p (size))
return false;
if (! integer_all_onesp (size))
{
if (! tree_fits_uhwi_p (len))
{
/* If LEN is not constant, try MAXLEN too.
For MAXLEN only allow optimizing into non-_ocs function
if SIZE is >= MAXLEN, never convert to __ocs_fail (). */
if (maxlen == NULL_TREE || ! tree_fits_uhwi_p (maxlen))
return false;
} }
      else
        maxlen = len;

      if (tree_int_cst_lt (size, maxlen))
        return false;
    }

  if (!init_target_chars ())
    return false;
/* Only convert __{,v}snprintf_chk to {,v}snprintf if flag is 0
or if format doesn't contain % chars or is "%s". */
if (! integer_zerop (flag))
{
fmt_str = c_getstr (fmt);
if (fmt_str == NULL)
return false;
if (strchr (fmt_str, target_percent) != NULL
&& strcmp (fmt_str, target_percent_s))
return false;
} }
  /* If __builtin_{,v}snprintf_chk is used, assume {,v}snprintf is
     available.  */
  fn = builtin_decl_explicit (fcode == BUILT_IN_VSNPRINTF_CHK
                              ? BUILT_IN_VSNPRINTF : BUILT_IN_SNPRINTF);
  if (!fn)
    return false;

  /* Replace the called function and the first 5 arguments by 3, retaining
     trailing varargs.  */
gimple_call_set_fndecl (stmt, fn);
gimple_call_set_fntype (stmt, TREE_TYPE (fn));
gimple_call_set_arg (stmt, 0, dest);
gimple_call_set_arg (stmt, 1, len);
gimple_call_set_arg (stmt, 2, fmt);
for (unsigned i = 3; i < gimple_call_num_args (stmt) - 2; ++i)
gimple_call_set_arg (stmt, i, gimple_call_arg (stmt, i + 2));
gimple_set_num_ops (stmt, gimple_num_ops (stmt) - 2);
fold_stmt (gsi);
  return true;
}
/* Fold a call to __{,v}sprintf_chk at *GSI.  Return false if a normal
   call should be emitted rather than simplifying the call inline.
   FCODE is either BUILT_IN_SPRINTF_CHK or BUILT_IN_VSPRINTF_CHK.  */

static bool
gimple_fold_builtin_sprintf_chk (gimple_stmt_iterator *gsi,
enum built_in_function fcode)
{
gimple stmt = gsi_stmt (*gsi);
tree dest, size, len, fn, fmt, flag;
const char *fmt_str;
unsigned nargs = gimple_call_num_args (stmt);
/* Verify the required arguments in the original call. */
if (nargs < 4)
return false;
dest = gimple_call_arg (stmt, 0);
flag = gimple_call_arg (stmt, 1);
size = gimple_call_arg (stmt, 2);
fmt = gimple_call_arg (stmt, 3);
if (! tree_fits_uhwi_p (size))
return false;
len = NULL_TREE;
if (!init_target_chars ())
return false;
/* Check whether the format is a literal string constant. */
fmt_str = c_getstr (fmt);
if (fmt_str != NULL)
    {
      /* If the format doesn't contain % args or %%, we know the size.  */
      if (strchr (fmt_str, target_percent) == 0)
        {
          if (fcode != BUILT_IN_SPRINTF_CHK || nargs == 4)
            len = build_int_cstu (size_type_node, strlen (fmt_str));
        }
      /* If the format is "%s" and first ... argument is a string literal,
we know the size too. */
else if (fcode == BUILT_IN_SPRINTF_CHK
&& strcmp (fmt_str, target_percent_s) == 0)
        {
          tree arg;

          if (nargs == 5)
            {
              arg = gimple_call_arg (stmt, 4);
              if (POINTER_TYPE_P (TREE_TYPE (arg)))
                {
                  len = c_strlen (arg, 1);
                  if (! len || ! tree_fits_uhwi_p (len))
                    len = NULL_TREE;
                }
            }
        }
    }

  if (! integer_all_onesp (size))
{
if (! len || ! tree_int_cst_lt (len, size))
return false; return false;
} }
/* Only convert __{,v}sprintf_chk to {,v}sprintf if flag is 0
or if format doesn't contain % chars or is "%s". */
if (! integer_zerop (flag))
{
if (fmt_str == NULL)
return false;
if (strchr (fmt_str, target_percent) != NULL
&& strcmp (fmt_str, target_percent_s))
return false;
    }

  /* If __builtin_{,v}sprintf_chk is used, assume {,v}sprintf is available.  */
fn = builtin_decl_explicit (fcode == BUILT_IN_VSPRINTF_CHK
? BUILT_IN_VSPRINTF : BUILT_IN_SPRINTF);
if (!fn)
    return false;

  /* Replace the called function and the first 4 arguments by 2, retaining
trailing varargs. */
gimple_call_set_fndecl (stmt, fn);
gimple_call_set_fntype (stmt, TREE_TYPE (fn));
gimple_call_set_arg (stmt, 0, dest);
gimple_call_set_arg (stmt, 1, fmt);
for (unsigned i = 2; i < gimple_call_num_args (stmt) - 2; ++i)
gimple_call_set_arg (stmt, i, gimple_call_arg (stmt, i + 2));
gimple_set_num_ops (stmt, gimple_num_ops (stmt) - 2);
fold_stmt (gsi);
return true;
}
/* Fold a call to __builtin_strlen with known length LEN. */
static bool
gimple_fold_builtin_strlen (gimple_stmt_iterator *gsi, tree len)
{
if (!len)
{
gimple stmt = gsi_stmt (*gsi);
len = c_strlen (gimple_call_arg (stmt, 0), 0);
    }
  if (!len)
    return false;
  replace_call_with_value (gsi, len);
  return true;
}
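At the source level, strlen of a literal, or of a buffer whose contents have a provable length, can be replaced by a constant.  Illustrative sketch only; the second call relies on the length information tracked for buf:

#include <stdio.h>
#include <string.h>

int
main (void)
{
  char buf[8];
  strcpy (buf, "abc");
  printf ("%zu %zu\n", strlen ("hello"), strlen (buf));  /* candidates for 5 and 3 */
  return 0;
}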
/* Fold builtins at *GSI with knowledge about a length argument.  */

static bool
gimple_fold_builtin_with_strlen (gimple_stmt_iterator *gsi)
{
  gimple stmt = gsi_stmt (*gsi);
  tree val[3];
  tree a;
  int arg_idx, type;
  bitmap visited;
  bool ignore;
  location_t loc = gimple_location (stmt);

  ignore = (gimple_call_lhs (stmt) == NULL);

  /* Limit the work only for builtins we know how to simplify.  */
  tree callee = gimple_call_fndecl (stmt);
  switch (DECL_FUNCTION_CODE (callee))
    {
    case BUILT_IN_STRLEN:
...@@ -949,11 +2212,12 @@ gimple_fold_builtin (gimple stmt)
      type = 2;
      break;
    default:
      return false;
    }

  int nargs = gimple_call_num_args (stmt);
  if (arg_idx >= nargs)
    return false;

  /* Try to use the dataflow information gathered by the CCP process.  */
  visited = BITMAP_ALLOC (NULL);
...@@ -961,120 +2225,150 @@ gimple_fold_builtin (gimple stmt)
  memset (val, 0, sizeof (val));
  a = gimple_call_arg (stmt, arg_idx);
  if (!get_maxval_strlen (a, &val[arg_idx], visited, type)
      || !is_gimple_val (val[arg_idx]))
    val[arg_idx] = NULL_TREE;

  BITMAP_FREE (visited);

  switch (DECL_FUNCTION_CODE (callee))
    {
    case BUILT_IN_STRLEN:
      return gimple_fold_builtin_strlen (gsi, val[0]);
    case BUILT_IN_STRCPY:
      return gimple_fold_builtin_strcpy (gsi, loc,
                                         gimple_call_arg (stmt, 0),
                                         gimple_call_arg (stmt, 1),
                                         val[1]);
    case BUILT_IN_STRNCPY:
      return gimple_fold_builtin_strncpy (gsi, loc,
                                          gimple_call_arg (stmt, 0),
                                          gimple_call_arg (stmt, 1),
                                          gimple_call_arg (stmt, 2),
                                          val[1]);
    case BUILT_IN_STRCAT:
      return gimple_fold_builtin_strcat (gsi, loc, gimple_call_arg (stmt, 0),
                                         gimple_call_arg (stmt, 1),
                                         val[1]);
    case BUILT_IN_FPUTS:
      return gimple_fold_builtin_fputs (gsi, loc, gimple_call_arg (stmt, 0),
                                        gimple_call_arg (stmt, 1),
                                        ignore, false, val[0]);
    case BUILT_IN_FPUTS_UNLOCKED:
      return gimple_fold_builtin_fputs (gsi, loc, gimple_call_arg (stmt, 0),
                                        gimple_call_arg (stmt, 1),
                                        ignore, true, val[0]);
    case BUILT_IN_MEMCPY_CHK:
    case BUILT_IN_MEMPCPY_CHK:
    case BUILT_IN_MEMMOVE_CHK:
    case BUILT_IN_MEMSET_CHK:
      return gimple_fold_builtin_memory_chk (gsi, loc,
                                             gimple_call_arg (stmt, 0),
                                             gimple_call_arg (stmt, 1),
                                             gimple_call_arg (stmt, 2),
                                             gimple_call_arg (stmt, 3),
                                             val[2], ignore,
                                             DECL_FUNCTION_CODE (callee));
    case BUILT_IN_STRCPY_CHK:
    case BUILT_IN_STPCPY_CHK:
      return gimple_fold_builtin_stxcpy_chk (gsi, loc,
                                             gimple_call_arg (stmt, 0),
                                             gimple_call_arg (stmt, 1),
                                             gimple_call_arg (stmt, 2),
                                             val[1], ignore,
                                             DECL_FUNCTION_CODE (callee));
    case BUILT_IN_STRNCPY_CHK:
    case BUILT_IN_STPNCPY_CHK:
      return gimple_fold_builtin_stxncpy_chk (gsi,
                                              gimple_call_arg (stmt, 0),
                                              gimple_call_arg (stmt, 1),
                                              gimple_call_arg (stmt, 2),
                                              gimple_call_arg (stmt, 3),
                                              val[2], ignore,
                                              DECL_FUNCTION_CODE (callee));
    case BUILT_IN_SNPRINTF_CHK:
    case BUILT_IN_VSNPRINTF_CHK:
      return gimple_fold_builtin_snprintf_chk (gsi, val[1],
                                               DECL_FUNCTION_CODE (callee));
    default:
      gcc_unreachable ();
    }

  return false;
}
/* Fold the non-target builtin at *GSI and return whether any simplification
was made. */
static bool
gimple_fold_builtin (gimple_stmt_iterator *gsi)
{
gimple stmt = gsi_stmt (*gsi);
tree callee = gimple_call_fndecl (stmt);
/* Give up for always_inline inline builtins until they are
inlined. */
if (avoid_folding_inline_builtin (callee))
return false;
if (gimple_fold_builtin_with_strlen (gsi))
return true;
switch (DECL_FUNCTION_CODE (callee))
{
case BUILT_IN_BZERO:
return gimple_fold_builtin_memset (gsi, integer_zero_node,
gimple_call_arg (stmt, 1));
case BUILT_IN_MEMSET:
return gimple_fold_builtin_memset (gsi,
gimple_call_arg (stmt, 1),
gimple_call_arg (stmt, 2));
case BUILT_IN_BCOPY:
return gimple_fold_builtin_memory_op (gsi, gimple_call_arg (stmt, 1),
gimple_call_arg (stmt, 0), 3);
case BUILT_IN_MEMCPY:
return gimple_fold_builtin_memory_op (gsi, gimple_call_arg (stmt, 0),
gimple_call_arg (stmt, 1), 0);
case BUILT_IN_MEMPCPY:
return gimple_fold_builtin_memory_op (gsi, gimple_call_arg (stmt, 0),
gimple_call_arg (stmt, 1), 1);
case BUILT_IN_MEMMOVE:
return gimple_fold_builtin_memory_op (gsi, gimple_call_arg (stmt, 0),
gimple_call_arg (stmt, 1), 3);
case BUILT_IN_SPRINTF_CHK:
case BUILT_IN_VSPRINTF_CHK:
return gimple_fold_builtin_sprintf_chk (gsi, DECL_FUNCTION_CODE (callee));
default:;
}
/* Try the generic builtin folder. */
bool ignore = (gimple_call_lhs (stmt) == NULL);
tree result = fold_call_stmt (stmt, ignore);
if (result)
{
if (ignore)
STRIP_NOPS (result);
else
result = fold_convert (gimple_call_return_type (stmt), result);
if (!update_call_from_tree (gsi, result))
gimplify_and_update_call_from_tree (gsi, result);
return true;
}
return false;
}
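Routing bzero and bcopy through the memset and memory_op folders gives the legacy BSD calls the same treatment as memset and memmove.  A hedged sketch of such a caller (bzero/bcopy are POSIX, declared in strings.h):

#include <stdio.h>
#include <strings.h>

int
main (void)
{
  int a = 42, b;
  bzero (&b, sizeof b);       /* treated like memset (&b, 0, sizeof b) */
  bcopy (&a, &b, sizeof b);   /* treated like memmove (&b, &a, sizeof b) */
  printf ("%d\n", b);
  return 0;
}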
/* Attempt to fold a call statement referenced by the statement iterator GSI.
   The statement may be replaced by another statement, e.g., if the call
   simplifies to a constant value.  Return true if any changes were made.
...@@ -1186,16 +2480,13 @@ gimple_fold_call (gimple_stmt_iterator *gsi, bool inplace)

  /* Check for builtins that CCP can handle using information not
     available in the generic fold routines.  */
  if (gimple_call_builtin_p (stmt, BUILT_IN_NORMAL))
    {
      if (gimple_fold_builtin (gsi))
        changed = true;
    }
  else if (gimple_call_builtin_p (stmt, BUILT_IN_MD))
    {
      changed |= targetm.gimple_fold_builtin (gsi);
    }
  else if (gimple_call_internal_p (stmt))
......
...@@ -1178,6 +1178,21 @@ gimple_seq_add_seq (gimple_seq *dst_p, gimple_seq src)
  gsi_insert_seq_after (&si, src, GSI_NEW_STMT);
}
/* Append sequence SRC to the end of sequence *DST_P. If *DST_P is
NULL, a new sequence is allocated. This function is
similar to gimple_seq_add_seq, but does not scan the operands. */
void
gimple_seq_add_seq_without_update (gimple_seq *dst_p, gimple_seq src)
{
gimple_stmt_iterator si;
if (src == NULL)
return;
si = gsi_last (*dst_p);
gsi_insert_seq_after_without_update (&si, src, GSI_NEW_STMT);
}
/* Determine whether to assign a location to the statement GS.  */

static bool
......
...@@ -1226,6 +1226,7 @@ gimple gimple_build_predict (enum br_predictor, enum prediction);
extern void gimple_seq_add_stmt (gimple_seq *, gimple);
extern void gimple_seq_add_stmt_without_update (gimple_seq *, gimple);
void gimple_seq_add_seq (gimple_seq *, gimple_seq);
void gimple_seq_add_seq_without_update (gimple_seq *, gimple_seq);
extern void annotate_all_with_location_after (gimple_seq, gimple_stmt_iterator, extern void annotate_all_with_location_after (gimple_seq, gimple_stmt_iterator,
location_t); location_t);
extern void annotate_all_with_location (gimple_seq, location_t); extern void annotate_all_with_location (gimple_seq, location_t);
......
...
@@ -339,7 +339,8 @@ optimize_function_for_speed_p (struct function *fun)
bool
optimize_bb_for_size_p (const_basic_block bb)
{
-  return optimize_function_for_size_p (cfun) || !maybe_hot_bb_p (cfun, bb);
+  return (optimize_function_for_size_p (cfun)
+          || (bb && !maybe_hot_bb_p (cfun, bb)));
}

/* Return TRUE when BB should be optimized for speed.  */
...
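The NULL check matters because the GIMPLE folders can ask this question for a statement that is not (yet) attached to a basic block. A hedged illustration of such a call site, not taken verbatim from this commit:

/* Sketch only: gimple_bb may return NULL for a statement that is not in
   the CFG yet; with the change above the query then degrades to the
   function-level size check instead of dereferencing a null block.  */
bool for_size = optimize_bb_for_size_p (gimple_bb (stmt));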
2014-08-08 Richard Biener <rguenther@suse.de>
* gcc.dg/strlenopt-8.c: Remove XFAIL.
* gcc.dg/tree-prof/stringop-2.c: Adjust.
* gfortran.dg/array_memcpy_4.f90: Likewise.
* gfortran.dg/trim_optimize_1.f90: Likewise.
* gfortran.dg/trim_optimize_2.f90: Likewise.
2014-08-08  Kugan Vivekanandarajah  <kuganv@linaro.org>

	* gcc.dg/zero_sign_ext_test.c: New test.
...
...
@@ -43,8 +43,8 @@ main ()
  return 0;
}

-/* { dg-final { scan-tree-dump-times "strlen \\(" 0 "strlen" { xfail *-*-* } } } */
-/* { dg-final { scan-tree-dump-times "memcpy \\(" 4 "strlen" { xfail *-*-* } } } */
+/* { dg-final { scan-tree-dump-times "strlen \\(" 0 "strlen" } } */
+/* { dg-final { scan-tree-dump-times "memcpy \\(" 4 "strlen" } } */
/* { dg-final { scan-tree-dump-times "strcpy \\(" 0 "strlen" } } */
/* { dg-final { scan-tree-dump-times "strcat \\(" 0 "strlen" } } */
/* { dg-final { scan-tree-dump-times "strchr \\(" 0 "strlen" } } */
...
...
@@ -19,6 +19,6 @@ main()
}
/* { dg-final-use { scan-ipa-dump "Single value 4 stringop" "profile"} } */
/* The versioned memset of size 4 should be optimized to an assignment.  */
-/* { dg-final-use { scan-tree-dump "a\\\[0\\\] = 168430090" "optimized"} } */
+/* { dg-final-use { scan-tree-dump "MEM\\\[\\(void .\\)&a\\\] = 168430090" "optimized"} } */
/* { dg-final-use { cleanup-tree-dump "optimized" } } */
/* { dg-final-use { cleanup-ipa-dump "profile" } } */
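For reference, the constant the test still looks for is unchanged: a memset of value 10 over 4 bytes folds to a single 4-byte store of 0x0a0a0a0a, i.e. 168430090; only the printed destination changed from an array reference to a MEM expression. A self-contained illustration of the arithmetic (not the test itself, and it assumes a 32-bit unsigned int):

#include <stdio.h>
#include <string.h>

int
main (void)
{
  unsigned int a;                 /* assumed to be 4 bytes wide */
  memset (&a, 10, sizeof a);      /* every byte becomes 0x0a */
  printf ("%u\n", a);             /* prints 168430090 == 0x0a0a0a0a */
  return 0;
}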
...
@@ -9,5 +9,5 @@
  d = s
end

-! { dg-final { scan-tree-dump-times "MEM.*d\\\] = MEM" 1 "original" } }
+! { dg-final { scan-tree-dump-times "memcpy" 1 "original" } }
! { dg-final { cleanup-tree-dump "original" } }
...
@@ -11,6 +11,6 @@ program main
  if (c /= 'abc') call abort
end program main

-! { dg-final { scan-tree-dump-times "memmove" 2 "original" } }
+! { dg-final { scan-tree-dump-times "memmove" 3 "original" } }
! { dg-final { scan-tree-dump-times "string_trim" 0 "original" } }
! { dg-final { cleanup-tree-dump "original" } }
...
@@ -32,6 +32,6 @@ contains
  end subroutine foo
end program main

-! { dg-final { scan-tree-dump-times "memmove" 4 "original" } }
+! { dg-final { scan-tree-dump-times "memmove" 6 "original" } }
! { dg-final { scan-tree-dump-times "string_trim" 0 "original" } }
! { dg-final { cleanup-tree-dump "original" } }