20 Mar, 2020 (24 commits)
    • Fix verifier ICE on wrong comdat local flag [PR93347] · 72b3bc89
      gcc/ChangeLog:
      
      2020-03-20  Jan Hubicka  <hubicka@ucw.cz>
      
      	PR ipa/93347
      	* cgraph.c (symbol_table::create_edge): Update calls_comdat_local flag.
      	(cgraph_edge::redirect_callee): Move here; likewise.
      	(cgraph_node::remove_callees): Update calls_comdat_local flag.
	(cgraph_node::verify_node): Verify that the calls_comdat_local flag
	matches reality.
      	(cgraph_node::check_calls_comdat_local_p): New member function.
      	* cgraph.h (cgraph_node::check_calls_comdat_local_p): Declare.
      	(cgraph_edge::redirect_callee): Move offline.
      	* ipa-fnsummary.c (compute_fn_summary): Do not compute
      	calls_comdat_local flag here.
      	* ipa-inline-transform.c (inline_call): Fix updating of
      	calls_comdat_local flag.
      	* ipa-split.c (split_function): Use true instead of 1 to set the flag.
      	* symtab.c (symtab_node::add_to_same_comdat_group): Update
      	calls_comdat_local flag.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Jan Hubicka  <hubicka@ucw.cz>
      
      	* g++.dg/torture/pr93347.C: New test.
      Jan Hubicka committed
    • adjust SLP tree dumping · a89349e6
      This also dumps the root node we eventually smuggle in.
      
      2020-03-20  Richard Biener  <rguenther@suse.de>
      
      	* tree-vect-slp.c (vect_analyze_slp_instance): Dump SLP tree
      	from the possibly modified root.
      Richard Biener committed
    • c++: Add testcases from PR c++/69694 · a23eff1b
      These testcases are compiling successfully since 7.1.
      
      gcc/testsuite/ChangeLog:
      
      	PR c++/69694
      	* g++.dg/cpp0x/decltype74.C: New test.
      	* g++.dg/cpp0x/decltype75.C: New test.
      Patrick Palka committed
    • [ARM][GCC][11x]: MVE ACLE vector interleaving store and deinterleaving load intrinsics and also aliases to vstr and vldr intrinsics. · 1dfcc3b5
      
      This patch supports the following MVE ACLE intrinsics, which are aliases of the
      vstr and vldr intrinsics.
      
      vst1q_p_u8, vst1q_p_s8, vld1q_z_u8, vld1q_z_s8, vst1q_p_u16, vst1q_p_s16,
      vld1q_z_u16, vld1q_z_s16, vst1q_p_u32, vst1q_p_s32, vld1q_z_u32, vld1q_z_s32,
      vld1q_z_f16, vst1q_p_f16, vld1q_z_f32, vst1q_p_f32.
      
      This patch also supports the following MVE ACLE vector deinterleaving loads and
      vector interleaving stores.
      
      vst2q_s8, vst2q_u8, vld2q_s8, vld2q_u8, vld4q_s8, vld4q_u8, vst2q_s16, vst2q_u16,
      vld2q_s16, vld2q_u16, vld4q_s16, vld4q_u16, vst2q_s32, vst2q_u32, vld2q_s32,
      vld2q_u32, vld4q_s32, vld4q_u32, vld4q_f16, vld2q_f16, vst2q_f16, vld4q_f32,
      vld2q_f32, vst2q_f32.
      
      Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* config/arm/arm_mve.h (vst1q_p_u8): Define macro.
      	(vst1q_p_s8): Likewise.
      	(vst2q_s8): Likewise.
      	(vst2q_u8): Likewise.
      	(vld1q_z_u8): Likewise.
      	(vld1q_z_s8): Likewise.
      	(vld2q_s8): Likewise.
      	(vld2q_u8): Likewise.
      	(vld4q_s8): Likewise.
      	(vld4q_u8): Likewise.
      	(vst1q_p_u16): Likewise.
      	(vst1q_p_s16): Likewise.
      	(vst2q_s16): Likewise.
      	(vst2q_u16): Likewise.
      	(vld1q_z_u16): Likewise.
      	(vld1q_z_s16): Likewise.
      	(vld2q_s16): Likewise.
      	(vld2q_u16): Likewise.
      	(vld4q_s16): Likewise.
      	(vld4q_u16): Likewise.
      	(vst1q_p_u32): Likewise.
      	(vst1q_p_s32): Likewise.
      	(vst2q_s32): Likewise.
      	(vst2q_u32): Likewise.
      	(vld1q_z_u32): Likewise.
      	(vld1q_z_s32): Likewise.
      	(vld2q_s32): Likewise.
      	(vld2q_u32): Likewise.
      	(vld4q_s32): Likewise.
      	(vld4q_u32): Likewise.
      	(vld4q_f16): Likewise.
      	(vld2q_f16): Likewise.
      	(vld1q_z_f16): Likewise.
      	(vst2q_f16): Likewise.
      	(vst1q_p_f16): Likewise.
      	(vld4q_f32): Likewise.
      	(vld2q_f32): Likewise.
      	(vld1q_z_f32): Likewise.
      	(vst2q_f32): Likewise.
      	(vst1q_p_f32): Likewise.
      	(__arm_vst1q_p_u8): Define intrinsic.
      	(__arm_vst1q_p_s8): Likewise.
      	(__arm_vst2q_s8): Likewise.
      	(__arm_vst2q_u8): Likewise.
      	(__arm_vld1q_z_u8): Likewise.
      	(__arm_vld1q_z_s8): Likewise.
      	(__arm_vld2q_s8): Likewise.
      	(__arm_vld2q_u8): Likewise.
      	(__arm_vld4q_s8): Likewise.
      	(__arm_vld4q_u8): Likewise.
      	(__arm_vst1q_p_u16): Likewise.
      	(__arm_vst1q_p_s16): Likewise.
      	(__arm_vst2q_s16): Likewise.
      	(__arm_vst2q_u16): Likewise.
      	(__arm_vld1q_z_u16): Likewise.
      	(__arm_vld1q_z_s16): Likewise.
      	(__arm_vld2q_s16): Likewise.
      	(__arm_vld2q_u16): Likewise.
      	(__arm_vld4q_s16): Likewise.
      	(__arm_vld4q_u16): Likewise.
      	(__arm_vst1q_p_u32): Likewise.
      	(__arm_vst1q_p_s32): Likewise.
      	(__arm_vst2q_s32): Likewise.
      	(__arm_vst2q_u32): Likewise.
      	(__arm_vld1q_z_u32): Likewise.
      	(__arm_vld1q_z_s32): Likewise.
      	(__arm_vld2q_s32): Likewise.
      	(__arm_vld2q_u32): Likewise.
      	(__arm_vld4q_s32): Likewise.
      	(__arm_vld4q_u32): Likewise.
      	(__arm_vld4q_f16): Likewise.
      	(__arm_vld2q_f16): Likewise.
      	(__arm_vld1q_z_f16): Likewise.
      	(__arm_vst2q_f16): Likewise.
      	(__arm_vst1q_p_f16): Likewise.
      	(__arm_vld4q_f32): Likewise.
      	(__arm_vld2q_f32): Likewise.
      	(__arm_vld1q_z_f32): Likewise.
      	(__arm_vst2q_f32): Likewise.
      	(__arm_vst1q_p_f32): Likewise.
      	(vld1q_z): Define polymorphic variant.
      	(vld2q): Likewise.
      	(vld4q): Likewise.
      	(vst1q_p): Likewise.
      	(vst2q): Likewise.
      	* config/arm/arm_mve_builtins.def (STORE1): Use builtin qualifier.
      	(LOAD1): Likewise.
      	* config/arm/mve.md (mve_vst2q<mode>): Define RTL pattern.
      	(mve_vld2q<mode>): Likewise.
      	(mve_vld4q<mode>): Likewise.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vld1q_z_f16.c: New test.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_u8.c: Likewise.
      Srinath Parvathaneni committed
    • d: Fix SEGV in hash_table<odr_name_hasher, false, xcallocator>::find_slot_with_hash · b5446d0c
      This patch fixes an LTO bug with the D front-end.  DECL_ASSEMBLER_NAME
      is set on the TYPE_DECL, so TYPE_CXX_ODR_P must also be set on the type.
      
      The addition of merge_aggregate_types is not strictly needed now, but it
      fixes a problem introduced in newer versions of the dmd front-end where
      templated types could be sent more than once to the D code generator.
      
      gcc/d/ChangeLog:
      
      2020-03-20  Iain Buclaw  <ibuclaw@gdcproject.org>
      
      	PR lto/91027
      	* d-tree.h (struct GTY): Add daggregate field.
      	(IDENTIFIER_DAGGREGATE): Define.
      	(d_mangle_decl): Add declaration.
      	* decl.cc (mangle_decl): Remove static linkage, rename to...
      	(d_mangle_decl): ...this, update all callers.
      	* types.cc (merge_aggregate_types): New function.
      	(TypeVisitor::visit (TypeStruct *)): Call merge_aggregate_types, set
      	IDENTIFIER_DAGGREGATE and TYPE_CXX_ODR_P.
      	(TypeVisitor::visit (TypeClass *)): Likewise.
      Iain Buclaw committed
    • c-family: Tighten vector handling in type_for_mode [PR94072] · 1aa22b19
      In this PR we had a 512-bit VECTOR_TYPE whose mode is XImode
      (an integer mode used for four 128-bit vectors).  When trying
      to expand a zero constant for it, we hit code in expand_expr_real_1
      that tries to use the associated integer type instead.  The code used
      type_for_mode (XImode, 1) to get this integer type.
      
      However, the c-family implementation of type_for_mode checks for
      any registered built-in type that matches the mode and has the
      right signedness.  This meant that it could return a built-in
      vector type when given an integer mode (particularly if, as here,
      the vector type isn't supported by the current subtarget and so
      TYPE_MODE != TYPE_MODE_RAW).  The expand code would then cycle
      endlessly trying to use this "new" type instead of the original
      vector type.
      
      2020-03-20  Richard Sandiford  <richard.sandiford@arm.com>
      
      gcc/c-family/
      	PR middle-end/94072
      	* c-common.c (c_common_type_for_mode): Before using a registered
      	built-in type, check that the vectorness of the type matches
      	the vectorness of the mode.
      
      gcc/testsuite/
      	PR middle-end/94072
      	* gcc.target/aarch64/pr94072.c: New test.
      Richard Sandiford committed
    • [ARM][GCC][10x]: MVE ACLE intrinsics "add with carry across beats" and "beat-wise subtract". · c3562f81
      This patch supports the following MVE ACLE "add with carry across beats" intrinsics and "beat-wise subtract" intrinsics.
      
      vadciq_s32, vadciq_u32, vadciq_m_s32, vadciq_m_u32, vadcq_s32, vadcq_u32, vadcq_m_s32, vadcq_m_u32, vsbciq_s32, vsbciq_u32, vsbciq_m_s32, vsbciq_m_u32, vsbcq_s32, vsbcq_u32, vsbcq_m_s32, vsbcq_m_u32.
      
      Please refer to M-profile Vector Extension (MVE) intrinsics [1]  for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* config/arm/arm-builtins.c (ARM_BUILTIN_GET_FPSCR_NZCVQC): Define.
      	(ARM_BUILTIN_SET_FPSCR_NZCVQC): Likewise.
      	(arm_init_mve_builtins): Add "__builtin_arm_get_fpscr_nzcvqc" and
      	"__builtin_arm_set_fpscr_nzcvqc" to arm_builtin_decls array.
      	(arm_expand_builtin): Define case ARM_BUILTIN_GET_FPSCR_NZCVQC
      	and ARM_BUILTIN_SET_FPSCR_NZCVQC.
      	* config/arm/arm_mve.h (vadciq_s32): Define macro.
      	(vadciq_u32): Likewise.
      	(vadciq_m_s32): Likewise.
      	(vadciq_m_u32): Likewise.
      	(vadcq_s32): Likewise.
      	(vadcq_u32): Likewise.
      	(vadcq_m_s32): Likewise.
      	(vadcq_m_u32): Likewise.
      	(vsbciq_s32): Likewise.
      	(vsbciq_u32): Likewise.
      	(vsbciq_m_s32): Likewise.
      	(vsbciq_m_u32): Likewise.
      	(vsbcq_s32): Likewise.
      	(vsbcq_u32): Likewise.
      	(vsbcq_m_s32): Likewise.
      	(vsbcq_m_u32): Likewise.
      	(__arm_vadciq_s32): Define intrinsic.
      	(__arm_vadciq_u32): Likewise.
      	(__arm_vadciq_m_s32): Likewise.
      	(__arm_vadciq_m_u32): Likewise.
      	(__arm_vadcq_s32): Likewise.
      	(__arm_vadcq_u32): Likewise.
      	(__arm_vadcq_m_s32): Likewise.
      	(__arm_vadcq_m_u32): Likewise.
      	(__arm_vsbciq_s32): Likewise.
      	(__arm_vsbciq_u32): Likewise.
      	(__arm_vsbciq_m_s32): Likewise.
      	(__arm_vsbciq_m_u32): Likewise.
      	(__arm_vsbcq_s32): Likewise.
      	(__arm_vsbcq_u32): Likewise.
      	(__arm_vsbcq_m_s32): Likewise.
      	(__arm_vsbcq_m_u32): Likewise.
      	(vadciq_m): Define polymorphic variant.
      	(vadciq): Likewise.
      	(vadcq_m): Likewise.
      	(vadcq): Likewise.
      	(vsbciq_m): Likewise.
      	(vsbciq): Likewise.
      	(vsbcq_m): Likewise.
      	(vsbcq): Likewise.
      	* config/arm/arm_mve_builtins.def (BINOP_NONE_NONE_NONE): Use builtin
      	qualifier.
      	(BINOP_UNONE_UNONE_UNONE): Likewise.
      	(QUADOP_NONE_NONE_NONE_NONE_UNONE): Likewise.
      	(QUADOP_UNONE_UNONE_UNONE_UNONE_UNONE): Likewise.
      	* config/arm/mve.md (VADCIQ): Define iterator.
      	(VADCIQ_M): Likewise.
      	(VSBCQ): Likewise.
      	(VSBCQ_M): Likewise.
      	(VSBCIQ): Likewise.
      	(VSBCIQ_M): Likewise.
      	(VADCQ): Likewise.
      	(VADCQ_M): Likewise.
      	(mve_vadciq_m_<supf>v4si): Define RTL pattern.
      	(mve_vadciq_<supf>v4si): Likewise.
      	(mve_vadcq_m_<supf>v4si): Likewise.
      	(mve_vadcq_<supf>v4si): Likewise.
      	(mve_vsbciq_m_<supf>v4si): Likewise.
      	(mve_vsbciq_<supf>v4si): Likewise.
      	(mve_vsbcq_m_<supf>v4si): Likewise.
      	(mve_vsbcq_<supf>v4si): Likewise.
	(get_fpscr_nzcvqc): Define insn.
	(set_fpscr_nzcvqc): Likewise.
      	* config/arm/unspecs.md (UNSPEC_GET_FPSCR_NZCVQC): Define.
      	(UNSPEC_SET_FPSCR_NZCVQC): Define.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
            Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vadciq_m_s32.c: New test.
      	* gcc.target/arm/mve/intrinsics/vadciq_m_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vadciq_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vadciq_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vadcq_m_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vadcq_m_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vadcq_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vadcq_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbciq_m_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbciq_m_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbciq_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbciq_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbcq_m_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbcq_m_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbcq_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbcq_u32.c: Likewise.
      Srinath Parvathaneni committed
    • c++: Include the constraint parameter mapping in diagnostic constraint contexts · 828878c3
      When diagnosing a constraint error, we currently try to print the constraint
      inside a diagnostic constraint context with its template arguments substituted
      in.  If substitution fails, then we instead just print the dependent form, as in
      the test case below:
      
        .../diagnostic6.C:14:15: error: static assertion failed
           14 | static_assert(E<int>); // { dg-error "static assertion failed|not a class" }
              |               ^~~~~~
        .../diagnostic6.C:14:15: note: constraints not satisfied
        .../diagnostic6.C:4:11:   required for the satisfaction of ‘C<T>’
        .../diagnostic6.C:8:11:   required for the satisfaction of ‘D<typename T::type>’
        .../diagnostic6.C:14:15: error: ‘int’ is not a class, struct, or union type
      
      But printing just the dependent form sometimes makes it difficult to understand
      the underlying failure.  In the above example, for instance, there's no
      indication of how the template argument 'int' relates to either of the 'T's.
      
      This patch improves the situation by changing these diagnostics to always print
      the dependent form of the constraint, and alongside it the (preferably
      substituted) constraint parameter mapping.  So with the same test case below we
      now get:
      
        .../diagnostic6.C:14:15: error: static assertion failed
           14 | static_assert(E<int>); // { dg-error "static assertion failed|not a class" }
              |               ^~~~~~
        .../diagnostic6.C:14:15: note: constraints not satisfied
        .../diagnostic6.C:4:11:   required for the satisfaction of ‘C<T>’ [with T = typename T::type]
        .../diagnostic6.C:8:11:   required for the satisfaction of ‘D<typename T::type>’ [with T = int]
        .../diagnostic6.C:14:15: error: ‘int’ is not a class, struct, or union type
      
      This change arguably makes it easier to figure out what's going on whenever a
      constraint fails due to substitution creating an invalid type rather than
      failing due to the constraint evaluating to false.
      
      gcc/cp/ChangeLog:
      
      	* cxx-pretty-print.c (pp_cxx_parameter_mapping): Make extern.  Move
      	the "[with ]" bits to here from ...
      	(pp_cxx_atomic_constraint): ... here.
      	* cxx-pretty-print.h (pp_cxx_parameter_mapping): Declare.
      	* error.c (rebuild_concept_check): Delete.
      	(print_concept_check_info): Print the dependent form of the constraint and the
      	preferably substituted parameter mapping alongside it.
      
      gcc/testsuite/ChangeLog:
      
      	* g++.dg/concepts/diagnostic6.C: New test.
      Patrick Palka committed
    • [ARM][GCC][9x]: MVE ACLE predicated intrinsics with (don't-care) variant. · 261014a1
      This patch supports the following MVE ACLE predicated intrinsics with the `_x` (don't-care) variant.
      The ``_x`` (don't-care) suffix indicates that the false-predicated lanes have undefined values;
      these are syntactic sugar for merge intrinsics with a ``vuninitializedq`` inactive parameter.
      
      vabdq_x_f16, vabdq_x_f32, vabdq_x_s16, vabdq_x_s32, vabdq_x_s8, vabdq_x_u16, vabdq_x_u32, vabdq_x_u8,
      vabsq_x_f16, vabsq_x_f32, vabsq_x_s16, vabsq_x_s32, vabsq_x_s8, vaddq_x_f16, vaddq_x_f32, vaddq_x_n_f16,
      vaddq_x_n_f32, vaddq_x_n_s16, vaddq_x_n_s32, vaddq_x_n_s8, vaddq_x_n_u16, vaddq_x_n_u32, vaddq_x_n_u8,
      vaddq_x_s16, vaddq_x_s32, vaddq_x_s8, vaddq_x_u16, vaddq_x_u32, vaddq_x_u8, vandq_x_f16, vandq_x_f32,
      vandq_x_s16, vandq_x_s32, vandq_x_s8, vandq_x_u16, vandq_x_u32, vandq_x_u8, vbicq_x_f16, vbicq_x_f32,
      vbicq_x_s16, vbicq_x_s32, vbicq_x_s8, vbicq_x_u16, vbicq_x_u32, vbicq_x_u8, vbrsrq_x_n_f16,
      vbrsrq_x_n_f32, vbrsrq_x_n_s16, vbrsrq_x_n_s32, vbrsrq_x_n_s8, vbrsrq_x_n_u16, vbrsrq_x_n_u32,
      vbrsrq_x_n_u8, vcaddq_rot270_x_f16, vcaddq_rot270_x_f32, vcaddq_rot270_x_s16, vcaddq_rot270_x_s32,
      vcaddq_rot270_x_s8, vcaddq_rot270_x_u16, vcaddq_rot270_x_u32, vcaddq_rot270_x_u8, vcaddq_rot90_x_f16,
      vcaddq_rot90_x_f32, vcaddq_rot90_x_s16, vcaddq_rot90_x_s32, vcaddq_rot90_x_s8, vcaddq_rot90_x_u16,
      vcaddq_rot90_x_u32, vcaddq_rot90_x_u8, vclsq_x_s16, vclsq_x_s32, vclsq_x_s8, vclzq_x_s16, vclzq_x_s32,
      vclzq_x_s8, vclzq_x_u16, vclzq_x_u32, vclzq_x_u8, vcmulq_rot180_x_f16, vcmulq_rot180_x_f32,
      vcmulq_rot270_x_f16, vcmulq_rot270_x_f32, vcmulq_rot90_x_f16, vcmulq_rot90_x_f32, vcmulq_x_f16,
      vcmulq_x_f32, vcvtaq_x_s16_f16, vcvtaq_x_s32_f32, vcvtaq_x_u16_f16, vcvtaq_x_u32_f32, vcvtbq_x_f32_f16,
      vcvtmq_x_s16_f16, vcvtmq_x_s32_f32, vcvtmq_x_u16_f16, vcvtmq_x_u32_f32, vcvtnq_x_s16_f16,
      vcvtnq_x_s32_f32, vcvtnq_x_u16_f16, vcvtnq_x_u32_f32, vcvtpq_x_s16_f16, vcvtpq_x_s32_f32,
      vcvtpq_x_u16_f16, vcvtpq_x_u32_f32, vcvtq_x_f16_s16, vcvtq_x_f16_u16, vcvtq_x_f32_s32, vcvtq_x_f32_u32,
      vcvtq_x_n_f16_s16, vcvtq_x_n_f16_u16, vcvtq_x_n_f32_s32, vcvtq_x_n_f32_u32, vcvtq_x_n_s16_f16,
      vcvtq_x_n_s32_f32, vcvtq_x_n_u16_f16, vcvtq_x_n_u32_f32, vcvtq_x_s16_f16, vcvtq_x_s32_f32,
      vcvtq_x_u16_f16, vcvtq_x_u32_f32, vcvttq_x_f32_f16, vddupq_x_n_u16, vddupq_x_n_u32, vddupq_x_n_u8,
      vddupq_x_wb_u16, vddupq_x_wb_u32, vddupq_x_wb_u8, vdupq_x_n_f16, vdupq_x_n_f32, vdupq_x_n_s16,
      vdupq_x_n_s32, vdupq_x_n_s8, vdupq_x_n_u16, vdupq_x_n_u32, vdupq_x_n_u8, vdwdupq_x_n_u16, vdwdupq_x_n_u32,
      vdwdupq_x_n_u8, vdwdupq_x_wb_u16, vdwdupq_x_wb_u32, vdwdupq_x_wb_u8, veorq_x_f16, veorq_x_f32, veorq_x_s16,
      veorq_x_s32, veorq_x_s8, veorq_x_u16, veorq_x_u32, veorq_x_u8, vhaddq_x_n_s16, vhaddq_x_n_s32,
      vhaddq_x_n_s8, vhaddq_x_n_u16, vhaddq_x_n_u32, vhaddq_x_n_u8, vhaddq_x_s16, vhaddq_x_s32, vhaddq_x_s8,
      vhaddq_x_u16, vhaddq_x_u32, vhaddq_x_u8, vhcaddq_rot270_x_s16, vhcaddq_rot270_x_s32, vhcaddq_rot270_x_s8,
      vhcaddq_rot90_x_s16, vhcaddq_rot90_x_s32, vhcaddq_rot90_x_s8, vhsubq_x_n_s16, vhsubq_x_n_s32,
      vhsubq_x_n_s8, vhsubq_x_n_u16, vhsubq_x_n_u32, vhsubq_x_n_u8, vhsubq_x_s16, vhsubq_x_s32, vhsubq_x_s8,
      vhsubq_x_u16, vhsubq_x_u32, vhsubq_x_u8, vidupq_x_n_u16, vidupq_x_n_u32, vidupq_x_n_u8, vidupq_x_wb_u16,
      vidupq_x_wb_u32, vidupq_x_wb_u8, viwdupq_x_n_u16, viwdupq_x_n_u32, viwdupq_x_n_u8, viwdupq_x_wb_u16,
      viwdupq_x_wb_u32, viwdupq_x_wb_u8, vmaxnmq_x_f16, vmaxnmq_x_f32, vmaxq_x_s16, vmaxq_x_s32, vmaxq_x_s8,
      vmaxq_x_u16, vmaxq_x_u32, vmaxq_x_u8, vminnmq_x_f16, vminnmq_x_f32, vminq_x_s16, vminq_x_s32, vminq_x_s8,
      vminq_x_u16, vminq_x_u32, vminq_x_u8, vmovlbq_x_s16, vmovlbq_x_s8, vmovlbq_x_u16, vmovlbq_x_u8,
      vmovltq_x_s16, vmovltq_x_s8, vmovltq_x_u16, vmovltq_x_u8, vmulhq_x_s16, vmulhq_x_s32, vmulhq_x_s8,
      vmulhq_x_u16, vmulhq_x_u32, vmulhq_x_u8, vmullbq_int_x_s16, vmullbq_int_x_s32, vmullbq_int_x_s8,
      vmullbq_int_x_u16, vmullbq_int_x_u32, vmullbq_int_x_u8, vmullbq_poly_x_p16, vmullbq_poly_x_p8,
      vmulltq_int_x_s16, vmulltq_int_x_s32, vmulltq_int_x_s8, vmulltq_int_x_u16, vmulltq_int_x_u32,
      vmulltq_int_x_u8, vmulltq_poly_x_p16, vmulltq_poly_x_p8, vmulq_x_f16, vmulq_x_f32, vmulq_x_n_f16,
      vmulq_x_n_f32, vmulq_x_n_s16, vmulq_x_n_s32, vmulq_x_n_s8, vmulq_x_n_u16, vmulq_x_n_u32, vmulq_x_n_u8,
      vmulq_x_s16, vmulq_x_s32, vmulq_x_s8, vmulq_x_u16, vmulq_x_u32, vmulq_x_u8, vmvnq_x_n_s16, vmvnq_x_n_s32,
      vmvnq_x_n_u16, vmvnq_x_n_u32, vmvnq_x_s16, vmvnq_x_s32, vmvnq_x_s8, vmvnq_x_u16, vmvnq_x_u32, vmvnq_x_u8,
      vnegq_x_f16, vnegq_x_f32, vnegq_x_s16, vnegq_x_s32, vnegq_x_s8, vornq_x_f16, vornq_x_f32, vornq_x_s16,
      vornq_x_s32, vornq_x_s8, vornq_x_u16, vornq_x_u32, vornq_x_u8, vorrq_x_f16, vorrq_x_f32, vorrq_x_s16,
      vorrq_x_s32, vorrq_x_s8, vorrq_x_u16, vorrq_x_u32, vorrq_x_u8, vrev16q_x_s8, vrev16q_x_u8, vrev32q_x_f16,
      vrev32q_x_s16, vrev32q_x_s8, vrev32q_x_u16, vrev32q_x_u8, vrev64q_x_f16, vrev64q_x_f32, vrev64q_x_s16,
      vrev64q_x_s32, vrev64q_x_s8, vrev64q_x_u16, vrev64q_x_u32, vrev64q_x_u8, vrhaddq_x_s16, vrhaddq_x_s32,
      vrhaddq_x_s8, vrhaddq_x_u16, vrhaddq_x_u32, vrhaddq_x_u8, vrmulhq_x_s16, vrmulhq_x_s32, vrmulhq_x_s8,
      vrmulhq_x_u16, vrmulhq_x_u32, vrmulhq_x_u8, vrndaq_x_f16, vrndaq_x_f32, vrndmq_x_f16, vrndmq_x_f32,
      vrndnq_x_f16, vrndnq_x_f32, vrndpq_x_f16, vrndpq_x_f32, vrndq_x_f16, vrndq_x_f32, vrndxq_x_f16,
      vrndxq_x_f32, vrshlq_x_s16, vrshlq_x_s32, vrshlq_x_s8, vrshlq_x_u16, vrshlq_x_u32, vrshlq_x_u8,
      vrshrq_x_n_s16, vrshrq_x_n_s32, vrshrq_x_n_s8, vrshrq_x_n_u16, vrshrq_x_n_u32, vrshrq_x_n_u8,
      vshllbq_x_n_s16, vshllbq_x_n_s8, vshllbq_x_n_u16, vshllbq_x_n_u8, vshlltq_x_n_s16, vshlltq_x_n_s8,
      vshlltq_x_n_u16, vshlltq_x_n_u8, vshlq_x_n_s16, vshlq_x_n_s32, vshlq_x_n_s8, vshlq_x_n_u16, vshlq_x_n_u32,
      vshlq_x_n_u8, vshlq_x_s16, vshlq_x_s32, vshlq_x_s8, vshlq_x_u16, vshlq_x_u32, vshlq_x_u8, vshrq_x_n_s16,
      vshrq_x_n_s32, vshrq_x_n_s8, vshrq_x_n_u16, vshrq_x_n_u32, vshrq_x_n_u8, vsubq_x_f16, vsubq_x_f32,
      vsubq_x_n_f16, vsubq_x_n_f32, vsubq_x_n_s16, vsubq_x_n_s32, vsubq_x_n_s8, vsubq_x_n_u16, vsubq_x_n_u32,
      vsubq_x_n_u8, vsubq_x_s16, vsubq_x_s32, vsubq_x_s8, vsubq_x_u16, vsubq_x_u32, vsubq_x_u8.
      
      Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
      
      	* config/arm/arm_mve.h (vddupq_x_n_u8): Define macro.
      	(vddupq_x_n_u16): Likewise.
      	(vddupq_x_n_u32): Likewise.
      	(vddupq_x_wb_u8): Likewise.
      	(vddupq_x_wb_u16): Likewise.
      	(vddupq_x_wb_u32): Likewise.
      	(vdwdupq_x_n_u8): Likewise.
      	(vdwdupq_x_n_u16): Likewise.
      	(vdwdupq_x_n_u32): Likewise.
      	(vdwdupq_x_wb_u8): Likewise.
      	(vdwdupq_x_wb_u16): Likewise.
      	(vdwdupq_x_wb_u32): Likewise.
      	(vidupq_x_n_u8): Likewise.
      	(vidupq_x_n_u16): Likewise.
      	(vidupq_x_n_u32): Likewise.
      	(vidupq_x_wb_u8): Likewise.
      	(vidupq_x_wb_u16): Likewise.
      	(vidupq_x_wb_u32): Likewise.
      	(viwdupq_x_n_u8): Likewise.
      	(viwdupq_x_n_u16): Likewise.
      	(viwdupq_x_n_u32): Likewise.
      	(viwdupq_x_wb_u8): Likewise.
      	(viwdupq_x_wb_u16): Likewise.
      	(viwdupq_x_wb_u32): Likewise.
      	(vdupq_x_n_s8): Likewise.
      	(vdupq_x_n_s16): Likewise.
      	(vdupq_x_n_s32): Likewise.
      	(vdupq_x_n_u8): Likewise.
      	(vdupq_x_n_u16): Likewise.
      	(vdupq_x_n_u32): Likewise.
      	(vminq_x_s8): Likewise.
      	(vminq_x_s16): Likewise.
      	(vminq_x_s32): Likewise.
      	(vminq_x_u8): Likewise.
      	(vminq_x_u16): Likewise.
      	(vminq_x_u32): Likewise.
      	(vmaxq_x_s8): Likewise.
      	(vmaxq_x_s16): Likewise.
      	(vmaxq_x_s32): Likewise.
      	(vmaxq_x_u8): Likewise.
      	(vmaxq_x_u16): Likewise.
      	(vmaxq_x_u32): Likewise.
      	(vabdq_x_s8): Likewise.
      	(vabdq_x_s16): Likewise.
      	(vabdq_x_s32): Likewise.
      	(vabdq_x_u8): Likewise.
      	(vabdq_x_u16): Likewise.
      	(vabdq_x_u32): Likewise.
      	(vabsq_x_s8): Likewise.
      	(vabsq_x_s16): Likewise.
      	(vabsq_x_s32): Likewise.
      	(vaddq_x_s8): Likewise.
      	(vaddq_x_s16): Likewise.
      	(vaddq_x_s32): Likewise.
      	(vaddq_x_n_s8): Likewise.
      	(vaddq_x_n_s16): Likewise.
      	(vaddq_x_n_s32): Likewise.
      	(vaddq_x_u8): Likewise.
      	(vaddq_x_u16): Likewise.
      	(vaddq_x_u32): Likewise.
      	(vaddq_x_n_u8): Likewise.
      	(vaddq_x_n_u16): Likewise.
      	(vaddq_x_n_u32): Likewise.
      	(vclsq_x_s8): Likewise.
      	(vclsq_x_s16): Likewise.
      	(vclsq_x_s32): Likewise.
      	(vclzq_x_s8): Likewise.
      	(vclzq_x_s16): Likewise.
      	(vclzq_x_s32): Likewise.
      	(vclzq_x_u8): Likewise.
      	(vclzq_x_u16): Likewise.
      	(vclzq_x_u32): Likewise.
      	(vnegq_x_s8): Likewise.
      	(vnegq_x_s16): Likewise.
      	(vnegq_x_s32): Likewise.
      	(vmulhq_x_s8): Likewise.
      	(vmulhq_x_s16): Likewise.
      	(vmulhq_x_s32): Likewise.
      	(vmulhq_x_u8): Likewise.
      	(vmulhq_x_u16): Likewise.
      	(vmulhq_x_u32): Likewise.
      	(vmullbq_poly_x_p8): Likewise.
      	(vmullbq_poly_x_p16): Likewise.
      	(vmullbq_int_x_s8): Likewise.
      	(vmullbq_int_x_s16): Likewise.
      	(vmullbq_int_x_s32): Likewise.
      	(vmullbq_int_x_u8): Likewise.
      	(vmullbq_int_x_u16): Likewise.
      	(vmullbq_int_x_u32): Likewise.
      	(vmulltq_poly_x_p8): Likewise.
      	(vmulltq_poly_x_p16): Likewise.
      	(vmulltq_int_x_s8): Likewise.
      	(vmulltq_int_x_s16): Likewise.
      	(vmulltq_int_x_s32): Likewise.
      	(vmulltq_int_x_u8): Likewise.
      	(vmulltq_int_x_u16): Likewise.
      	(vmulltq_int_x_u32): Likewise.
      	(vmulq_x_s8): Likewise.
      	(vmulq_x_s16): Likewise.
      	(vmulq_x_s32): Likewise.
      	(vmulq_x_n_s8): Likewise.
      	(vmulq_x_n_s16): Likewise.
      	(vmulq_x_n_s32): Likewise.
      	(vmulq_x_u8): Likewise.
      	(vmulq_x_u16): Likewise.
      	(vmulq_x_u32): Likewise.
      	(vmulq_x_n_u8): Likewise.
      	(vmulq_x_n_u16): Likewise.
      	(vmulq_x_n_u32): Likewise.
      	(vsubq_x_s8): Likewise.
      	(vsubq_x_s16): Likewise.
      	(vsubq_x_s32): Likewise.
      	(vsubq_x_n_s8): Likewise.
      	(vsubq_x_n_s16): Likewise.
      	(vsubq_x_n_s32): Likewise.
      	(vsubq_x_u8): Likewise.
      	(vsubq_x_u16): Likewise.
      	(vsubq_x_u32): Likewise.
      	(vsubq_x_n_u8): Likewise.
      	(vsubq_x_n_u16): Likewise.
      	(vsubq_x_n_u32): Likewise.
      	(vcaddq_rot90_x_s8): Likewise.
      	(vcaddq_rot90_x_s16): Likewise.
      	(vcaddq_rot90_x_s32): Likewise.
      	(vcaddq_rot90_x_u8): Likewise.
      	(vcaddq_rot90_x_u16): Likewise.
      	(vcaddq_rot90_x_u32): Likewise.
      	(vcaddq_rot270_x_s8): Likewise.
      	(vcaddq_rot270_x_s16): Likewise.
      	(vcaddq_rot270_x_s32): Likewise.
      	(vcaddq_rot270_x_u8): Likewise.
      	(vcaddq_rot270_x_u16): Likewise.
      	(vcaddq_rot270_x_u32): Likewise.
      	(vhaddq_x_n_s8): Likewise.
      	(vhaddq_x_n_s16): Likewise.
      	(vhaddq_x_n_s32): Likewise.
      	(vhaddq_x_n_u8): Likewise.
      	(vhaddq_x_n_u16): Likewise.
      	(vhaddq_x_n_u32): Likewise.
      	(vhaddq_x_s8): Likewise.
      	(vhaddq_x_s16): Likewise.
      	(vhaddq_x_s32): Likewise.
      	(vhaddq_x_u8): Likewise.
      	(vhaddq_x_u16): Likewise.
      	(vhaddq_x_u32): Likewise.
      	(vhcaddq_rot90_x_s8): Likewise.
      	(vhcaddq_rot90_x_s16): Likewise.
      	(vhcaddq_rot90_x_s32): Likewise.
      	(vhcaddq_rot270_x_s8): Likewise.
      	(vhcaddq_rot270_x_s16): Likewise.
      	(vhcaddq_rot270_x_s32): Likewise.
      	(vhsubq_x_n_s8): Likewise.
      	(vhsubq_x_n_s16): Likewise.
      	(vhsubq_x_n_s32): Likewise.
      	(vhsubq_x_n_u8): Likewise.
      	(vhsubq_x_n_u16): Likewise.
      	(vhsubq_x_n_u32): Likewise.
      	(vhsubq_x_s8): Likewise.
      	(vhsubq_x_s16): Likewise.
      	(vhsubq_x_s32): Likewise.
      	(vhsubq_x_u8): Likewise.
      	(vhsubq_x_u16): Likewise.
      	(vhsubq_x_u32): Likewise.
      	(vrhaddq_x_s8): Likewise.
      	(vrhaddq_x_s16): Likewise.
      	(vrhaddq_x_s32): Likewise.
      	(vrhaddq_x_u8): Likewise.
      	(vrhaddq_x_u16): Likewise.
      	(vrhaddq_x_u32): Likewise.
      	(vrmulhq_x_s8): Likewise.
      	(vrmulhq_x_s16): Likewise.
      	(vrmulhq_x_s32): Likewise.
      	(vrmulhq_x_u8): Likewise.
      	(vrmulhq_x_u16): Likewise.
      	(vrmulhq_x_u32): Likewise.
      	(vandq_x_s8): Likewise.
      	(vandq_x_s16): Likewise.
      	(vandq_x_s32): Likewise.
      	(vandq_x_u8): Likewise.
      	(vandq_x_u16): Likewise.
      	(vandq_x_u32): Likewise.
      	(vbicq_x_s8): Likewise.
      	(vbicq_x_s16): Likewise.
      	(vbicq_x_s32): Likewise.
      	(vbicq_x_u8): Likewise.
      	(vbicq_x_u16): Likewise.
      	(vbicq_x_u32): Likewise.
      	(vbrsrq_x_n_s8): Likewise.
      	(vbrsrq_x_n_s16): Likewise.
      	(vbrsrq_x_n_s32): Likewise.
      	(vbrsrq_x_n_u8): Likewise.
      	(vbrsrq_x_n_u16): Likewise.
      	(vbrsrq_x_n_u32): Likewise.
      	(veorq_x_s8): Likewise.
      	(veorq_x_s16): Likewise.
      	(veorq_x_s32): Likewise.
      	(veorq_x_u8): Likewise.
      	(veorq_x_u16): Likewise.
      	(veorq_x_u32): Likewise.
      	(vmovlbq_x_s8): Likewise.
      	(vmovlbq_x_s16): Likewise.
      	(vmovlbq_x_u8): Likewise.
      	(vmovlbq_x_u16): Likewise.
      	(vmovltq_x_s8): Likewise.
      	(vmovltq_x_s16): Likewise.
      	(vmovltq_x_u8): Likewise.
      	(vmovltq_x_u16): Likewise.
      	(vmvnq_x_s8): Likewise.
      	(vmvnq_x_s16): Likewise.
      	(vmvnq_x_s32): Likewise.
      	(vmvnq_x_u8): Likewise.
      	(vmvnq_x_u16): Likewise.
      	(vmvnq_x_u32): Likewise.
      	(vmvnq_x_n_s16): Likewise.
      	(vmvnq_x_n_s32): Likewise.
      	(vmvnq_x_n_u16): Likewise.
      	(vmvnq_x_n_u32): Likewise.
      	(vornq_x_s8): Likewise.
      	(vornq_x_s16): Likewise.
      	(vornq_x_s32): Likewise.
      	(vornq_x_u8): Likewise.
      	(vornq_x_u16): Likewise.
      	(vornq_x_u32): Likewise.
      	(vorrq_x_s8): Likewise.
      	(vorrq_x_s16): Likewise.
      	(vorrq_x_s32): Likewise.
      	(vorrq_x_u8): Likewise.
      	(vorrq_x_u16): Likewise.
      	(vorrq_x_u32): Likewise.
      	(vrev16q_x_s8): Likewise.
      	(vrev16q_x_u8): Likewise.
      	(vrev32q_x_s8): Likewise.
      	(vrev32q_x_s16): Likewise.
      	(vrev32q_x_u8): Likewise.
      	(vrev32q_x_u16): Likewise.
      	(vrev64q_x_s8): Likewise.
      	(vrev64q_x_s16): Likewise.
      	(vrev64q_x_s32): Likewise.
      	(vrev64q_x_u8): Likewise.
      	(vrev64q_x_u16): Likewise.
      	(vrev64q_x_u32): Likewise.
      	(vrshlq_x_s8): Likewise.
      	(vrshlq_x_s16): Likewise.
      	(vrshlq_x_s32): Likewise.
      	(vrshlq_x_u8): Likewise.
      	(vrshlq_x_u16): Likewise.
      	(vrshlq_x_u32): Likewise.
      	(vshllbq_x_n_s8): Likewise.
      	(vshllbq_x_n_s16): Likewise.
      	(vshllbq_x_n_u8): Likewise.
      	(vshllbq_x_n_u16): Likewise.
      	(vshlltq_x_n_s8): Likewise.
      	(vshlltq_x_n_s16): Likewise.
      	(vshlltq_x_n_u8): Likewise.
      	(vshlltq_x_n_u16): Likewise.
      	(vshlq_x_s8): Likewise.
      	(vshlq_x_s16): Likewise.
      	(vshlq_x_s32): Likewise.
      	(vshlq_x_u8): Likewise.
      	(vshlq_x_u16): Likewise.
      	(vshlq_x_u32): Likewise.
      	(vshlq_x_n_s8): Likewise.
      	(vshlq_x_n_s16): Likewise.
      	(vshlq_x_n_s32): Likewise.
      	(vshlq_x_n_u8): Likewise.
      	(vshlq_x_n_u16): Likewise.
      	(vshlq_x_n_u32): Likewise.
      	(vrshrq_x_n_s8): Likewise.
      	(vrshrq_x_n_s16): Likewise.
      	(vrshrq_x_n_s32): Likewise.
      	(vrshrq_x_n_u8): Likewise.
      	(vrshrq_x_n_u16): Likewise.
      	(vrshrq_x_n_u32): Likewise.
      	(vshrq_x_n_s8): Likewise.
      	(vshrq_x_n_s16): Likewise.
      	(vshrq_x_n_s32): Likewise.
      	(vshrq_x_n_u8): Likewise.
      	(vshrq_x_n_u16): Likewise.
      	(vshrq_x_n_u32): Likewise.
      	(vdupq_x_n_f16): Likewise.
      	(vdupq_x_n_f32): Likewise.
      	(vminnmq_x_f16): Likewise.
      	(vminnmq_x_f32): Likewise.
      	(vmaxnmq_x_f16): Likewise.
      	(vmaxnmq_x_f32): Likewise.
      	(vabdq_x_f16): Likewise.
      	(vabdq_x_f32): Likewise.
      	(vabsq_x_f16): Likewise.
      	(vabsq_x_f32): Likewise.
      	(vaddq_x_f16): Likewise.
      	(vaddq_x_f32): Likewise.
      	(vaddq_x_n_f16): Likewise.
      	(vaddq_x_n_f32): Likewise.
      	(vnegq_x_f16): Likewise.
      	(vnegq_x_f32): Likewise.
      	(vmulq_x_f16): Likewise.
      	(vmulq_x_f32): Likewise.
      	(vmulq_x_n_f16): Likewise.
      	(vmulq_x_n_f32): Likewise.
      	(vsubq_x_f16): Likewise.
      	(vsubq_x_f32): Likewise.
      	(vsubq_x_n_f16): Likewise.
      	(vsubq_x_n_f32): Likewise.
      	(vcaddq_rot90_x_f16): Likewise.
      	(vcaddq_rot90_x_f32): Likewise.
      	(vcaddq_rot270_x_f16): Likewise.
      	(vcaddq_rot270_x_f32): Likewise.
      	(vcmulq_x_f16): Likewise.
      	(vcmulq_x_f32): Likewise.
      	(vcmulq_rot90_x_f16): Likewise.
      	(vcmulq_rot90_x_f32): Likewise.
      	(vcmulq_rot180_x_f16): Likewise.
      	(vcmulq_rot180_x_f32): Likewise.
      	(vcmulq_rot270_x_f16): Likewise.
      	(vcmulq_rot270_x_f32): Likewise.
      	(vcvtaq_x_s16_f16): Likewise.
      	(vcvtaq_x_s32_f32): Likewise.
      	(vcvtaq_x_u16_f16): Likewise.
      	(vcvtaq_x_u32_f32): Likewise.
      	(vcvtnq_x_s16_f16): Likewise.
      	(vcvtnq_x_s32_f32): Likewise.
      	(vcvtnq_x_u16_f16): Likewise.
      	(vcvtnq_x_u32_f32): Likewise.
      	(vcvtpq_x_s16_f16): Likewise.
      	(vcvtpq_x_s32_f32): Likewise.
      	(vcvtpq_x_u16_f16): Likewise.
      	(vcvtpq_x_u32_f32): Likewise.
      	(vcvtmq_x_s16_f16): Likewise.
      	(vcvtmq_x_s32_f32): Likewise.
      	(vcvtmq_x_u16_f16): Likewise.
      	(vcvtmq_x_u32_f32): Likewise.
      	(vcvtbq_x_f32_f16): Likewise.
      	(vcvttq_x_f32_f16): Likewise.
      	(vcvtq_x_f16_u16): Likewise.
      	(vcvtq_x_f16_s16): Likewise.
      	(vcvtq_x_f32_s32): Likewise.
      	(vcvtq_x_f32_u32): Likewise.
      	(vcvtq_x_n_f16_s16): Likewise.
      	(vcvtq_x_n_f16_u16): Likewise.
      	(vcvtq_x_n_f32_s32): Likewise.
      	(vcvtq_x_n_f32_u32): Likewise.
      	(vcvtq_x_s16_f16): Likewise.
      	(vcvtq_x_s32_f32): Likewise.
      	(vcvtq_x_u16_f16): Likewise.
      	(vcvtq_x_u32_f32): Likewise.
      	(vcvtq_x_n_s16_f16): Likewise.
      	(vcvtq_x_n_s32_f32): Likewise.
      	(vcvtq_x_n_u16_f16): Likewise.
      	(vcvtq_x_n_u32_f32): Likewise.
      	(vrndq_x_f16): Likewise.
      	(vrndq_x_f32): Likewise.
      	(vrndnq_x_f16): Likewise.
      	(vrndnq_x_f32): Likewise.
      	(vrndmq_x_f16): Likewise.
      	(vrndmq_x_f32): Likewise.
      	(vrndpq_x_f16): Likewise.
      	(vrndpq_x_f32): Likewise.
      	(vrndaq_x_f16): Likewise.
      	(vrndaq_x_f32): Likewise.
      	(vrndxq_x_f16): Likewise.
      	(vrndxq_x_f32): Likewise.
      	(vandq_x_f16): Likewise.
      	(vandq_x_f32): Likewise.
      	(vbicq_x_f16): Likewise.
      	(vbicq_x_f32): Likewise.
      	(vbrsrq_x_n_f16): Likewise.
      	(vbrsrq_x_n_f32): Likewise.
      	(veorq_x_f16): Likewise.
      	(veorq_x_f32): Likewise.
      	(vornq_x_f16): Likewise.
      	(vornq_x_f32): Likewise.
      	(vorrq_x_f16): Likewise.
      	(vorrq_x_f32): Likewise.
      	(vrev32q_x_f16): Likewise.
      	(vrev64q_x_f16): Likewise.
      	(vrev64q_x_f32): Likewise.
      	(__arm_vddupq_x_n_u8): Define intrinsic.
      	(__arm_vddupq_x_n_u16): Likewise.
      	(__arm_vddupq_x_n_u32): Likewise.
      	(__arm_vddupq_x_wb_u8): Likewise.
      	(__arm_vddupq_x_wb_u16): Likewise.
      	(__arm_vddupq_x_wb_u32): Likewise.
      	(__arm_vdwdupq_x_n_u8): Likewise.
      	(__arm_vdwdupq_x_n_u16): Likewise.
      	(__arm_vdwdupq_x_n_u32): Likewise.
      	(__arm_vdwdupq_x_wb_u8): Likewise.
      	(__arm_vdwdupq_x_wb_u16): Likewise.
      	(__arm_vdwdupq_x_wb_u32): Likewise.
      	(__arm_vidupq_x_n_u8): Likewise.
      	(__arm_vidupq_x_n_u16): Likewise.
      	(__arm_vidupq_x_n_u32): Likewise.
      	(__arm_vidupq_x_wb_u8): Likewise.
      	(__arm_vidupq_x_wb_u16): Likewise.
      	(__arm_vidupq_x_wb_u32): Likewise.
      	(__arm_viwdupq_x_n_u8): Likewise.
      	(__arm_viwdupq_x_n_u16): Likewise.
      	(__arm_viwdupq_x_n_u32): Likewise.
      	(__arm_viwdupq_x_wb_u8): Likewise.
      	(__arm_viwdupq_x_wb_u16): Likewise.
      	(__arm_viwdupq_x_wb_u32): Likewise.
      	(__arm_vdupq_x_n_s8): Likewise.
      	(__arm_vdupq_x_n_s16): Likewise.
      	(__arm_vdupq_x_n_s32): Likewise.
      	(__arm_vdupq_x_n_u8): Likewise.
      	(__arm_vdupq_x_n_u16): Likewise.
      	(__arm_vdupq_x_n_u32): Likewise.
      	(__arm_vminq_x_s8): Likewise.
      	(__arm_vminq_x_s16): Likewise.
      	(__arm_vminq_x_s32): Likewise.
      	(__arm_vminq_x_u8): Likewise.
      	(__arm_vminq_x_u16): Likewise.
      	(__arm_vminq_x_u32): Likewise.
      	(__arm_vmaxq_x_s8): Likewise.
      	(__arm_vmaxq_x_s16): Likewise.
      	(__arm_vmaxq_x_s32): Likewise.
      	(__arm_vmaxq_x_u8): Likewise.
      	(__arm_vmaxq_x_u16): Likewise.
      	(__arm_vmaxq_x_u32): Likewise.
      	(__arm_vabdq_x_s8): Likewise.
      	(__arm_vabdq_x_s16): Likewise.
      	(__arm_vabdq_x_s32): Likewise.
      	(__arm_vabdq_x_u8): Likewise.
      	(__arm_vabdq_x_u16): Likewise.
      	(__arm_vabdq_x_u32): Likewise.
      	(__arm_vabsq_x_s8): Likewise.
      	(__arm_vabsq_x_s16): Likewise.
      	(__arm_vabsq_x_s32): Likewise.
      	(__arm_vaddq_x_s8): Likewise.
      	(__arm_vaddq_x_s16): Likewise.
      	(__arm_vaddq_x_s32): Likewise.
      	(__arm_vaddq_x_n_s8): Likewise.
      	(__arm_vaddq_x_n_s16): Likewise.
      	(__arm_vaddq_x_n_s32): Likewise.
      	(__arm_vaddq_x_u8): Likewise.
      	(__arm_vaddq_x_u16): Likewise.
      	(__arm_vaddq_x_u32): Likewise.
      	(__arm_vaddq_x_n_u8): Likewise.
      	(__arm_vaddq_x_n_u16): Likewise.
      	(__arm_vaddq_x_n_u32): Likewise.
      	(__arm_vclsq_x_s8): Likewise.
      	(__arm_vclsq_x_s16): Likewise.
      	(__arm_vclsq_x_s32): Likewise.
      	(__arm_vclzq_x_s8): Likewise.
      	(__arm_vclzq_x_s16): Likewise.
      	(__arm_vclzq_x_s32): Likewise.
      	(__arm_vclzq_x_u8): Likewise.
      	(__arm_vclzq_x_u16): Likewise.
      	(__arm_vclzq_x_u32): Likewise.
      	(__arm_vnegq_x_s8): Likewise.
      	(__arm_vnegq_x_s16): Likewise.
      	(__arm_vnegq_x_s32): Likewise.
      	(__arm_vmulhq_x_s8): Likewise.
      	(__arm_vmulhq_x_s16): Likewise.
      	(__arm_vmulhq_x_s32): Likewise.
      	(__arm_vmulhq_x_u8): Likewise.
      	(__arm_vmulhq_x_u16): Likewise.
      	(__arm_vmulhq_x_u32): Likewise.
      	(__arm_vmullbq_poly_x_p8): Likewise.
      	(__arm_vmullbq_poly_x_p16): Likewise.
      	(__arm_vmullbq_int_x_s8): Likewise.
      	(__arm_vmullbq_int_x_s16): Likewise.
      	(__arm_vmullbq_int_x_s32): Likewise.
      	(__arm_vmullbq_int_x_u8): Likewise.
      	(__arm_vmullbq_int_x_u16): Likewise.
      	(__arm_vmullbq_int_x_u32): Likewise.
      	(__arm_vmulltq_poly_x_p8): Likewise.
      	(__arm_vmulltq_poly_x_p16): Likewise.
      	(__arm_vmulltq_int_x_s8): Likewise.
      	(__arm_vmulltq_int_x_s16): Likewise.
      	(__arm_vmulltq_int_x_s32): Likewise.
      	(__arm_vmulltq_int_x_u8): Likewise.
      	(__arm_vmulltq_int_x_u16): Likewise.
      	(__arm_vmulltq_int_x_u32): Likewise.
      	(__arm_vmulq_x_s8): Likewise.
      	(__arm_vmulq_x_s16): Likewise.
      	(__arm_vmulq_x_s32): Likewise.
      	(__arm_vmulq_x_n_s8): Likewise.
      	(__arm_vmulq_x_n_s16): Likewise.
      	(__arm_vmulq_x_n_s32): Likewise.
      	(__arm_vmulq_x_u8): Likewise.
      	(__arm_vmulq_x_u16): Likewise.
      	(__arm_vmulq_x_u32): Likewise.
      	(__arm_vmulq_x_n_u8): Likewise.
      	(__arm_vmulq_x_n_u16): Likewise.
      	(__arm_vmulq_x_n_u32): Likewise.
      	(__arm_vsubq_x_s8): Likewise.
      	(__arm_vsubq_x_s16): Likewise.
      	(__arm_vsubq_x_s32): Likewise.
      	(__arm_vsubq_x_n_s8): Likewise.
      	(__arm_vsubq_x_n_s16): Likewise.
      	(__arm_vsubq_x_n_s32): Likewise.
      	(__arm_vsubq_x_u8): Likewise.
      	(__arm_vsubq_x_u16): Likewise.
      	(__arm_vsubq_x_u32): Likewise.
      	(__arm_vsubq_x_n_u8): Likewise.
      	(__arm_vsubq_x_n_u16): Likewise.
      	(__arm_vsubq_x_n_u32): Likewise.
      	(__arm_vcaddq_rot90_x_s8): Likewise.
      	(__arm_vcaddq_rot90_x_s16): Likewise.
      	(__arm_vcaddq_rot90_x_s32): Likewise.
      	(__arm_vcaddq_rot90_x_u8): Likewise.
      	(__arm_vcaddq_rot90_x_u16): Likewise.
      	(__arm_vcaddq_rot90_x_u32): Likewise.
      	(__arm_vcaddq_rot270_x_s8): Likewise.
      	(__arm_vcaddq_rot270_x_s16): Likewise.
      	(__arm_vcaddq_rot270_x_s32): Likewise.
      	(__arm_vcaddq_rot270_x_u8): Likewise.
      	(__arm_vcaddq_rot270_x_u16): Likewise.
      	(__arm_vcaddq_rot270_x_u32): Likewise.
      	(__arm_vhaddq_x_n_s8): Likewise.
      	(__arm_vhaddq_x_n_s16): Likewise.
      	(__arm_vhaddq_x_n_s32): Likewise.
      	(__arm_vhaddq_x_n_u8): Likewise.
      	(__arm_vhaddq_x_n_u16): Likewise.
      	(__arm_vhaddq_x_n_u32): Likewise.
      	(__arm_vhaddq_x_s8): Likewise.
      	(__arm_vhaddq_x_s16): Likewise.
      	(__arm_vhaddq_x_s32): Likewise.
      	(__arm_vhaddq_x_u8): Likewise.
      	(__arm_vhaddq_x_u16): Likewise.
      	(__arm_vhaddq_x_u32): Likewise.
      	(__arm_vhcaddq_rot90_x_s8): Likewise.
      	(__arm_vhcaddq_rot90_x_s16): Likewise.
      	(__arm_vhcaddq_rot90_x_s32): Likewise.
      	(__arm_vhcaddq_rot270_x_s8): Likewise.
      	(__arm_vhcaddq_rot270_x_s16): Likewise.
      	(__arm_vhcaddq_rot270_x_s32): Likewise.
      	(__arm_vhsubq_x_n_s8): Likewise.
      	(__arm_vhsubq_x_n_s16): Likewise.
      	(__arm_vhsubq_x_n_s32): Likewise.
      	(__arm_vhsubq_x_n_u8): Likewise.
      	(__arm_vhsubq_x_n_u16): Likewise.
      	(__arm_vhsubq_x_n_u32): Likewise.
      	(__arm_vhsubq_x_s8): Likewise.
      	(__arm_vhsubq_x_s16): Likewise.
      	(__arm_vhsubq_x_s32): Likewise.
      	(__arm_vhsubq_x_u8): Likewise.
      	(__arm_vhsubq_x_u16): Likewise.
      	(__arm_vhsubq_x_u32): Likewise.
      	(__arm_vrhaddq_x_s8): Likewise.
      	(__arm_vrhaddq_x_s16): Likewise.
      	(__arm_vrhaddq_x_s32): Likewise.
      	(__arm_vrhaddq_x_u8): Likewise.
      	(__arm_vrhaddq_x_u16): Likewise.
      	(__arm_vrhaddq_x_u32): Likewise.
      	(__arm_vrmulhq_x_s8): Likewise.
      	(__arm_vrmulhq_x_s16): Likewise.
      	(__arm_vrmulhq_x_s32): Likewise.
      	(__arm_vrmulhq_x_u8): Likewise.
      	(__arm_vrmulhq_x_u16): Likewise.
      	(__arm_vrmulhq_x_u32): Likewise.
      	(__arm_vandq_x_s8): Likewise.
      	(__arm_vandq_x_s16): Likewise.
      	(__arm_vandq_x_s32): Likewise.
      	(__arm_vandq_x_u8): Likewise.
      	(__arm_vandq_x_u16): Likewise.
      	(__arm_vandq_x_u32): Likewise.
      	(__arm_vbicq_x_s8): Likewise.
      	(__arm_vbicq_x_s16): Likewise.
      	(__arm_vbicq_x_s32): Likewise.
      	(__arm_vbicq_x_u8): Likewise.
      	(__arm_vbicq_x_u16): Likewise.
      	(__arm_vbicq_x_u32): Likewise.
      	(__arm_vbrsrq_x_n_s8): Likewise.
      	(__arm_vbrsrq_x_n_s16): Likewise.
      	(__arm_vbrsrq_x_n_s32): Likewise.
      	(__arm_vbrsrq_x_n_u8): Likewise.
      	(__arm_vbrsrq_x_n_u16): Likewise.
      	(__arm_vbrsrq_x_n_u32): Likewise.
      	(__arm_veorq_x_s8): Likewise.
      	(__arm_veorq_x_s16): Likewise.
      	(__arm_veorq_x_s32): Likewise.
      	(__arm_veorq_x_u8): Likewise.
      	(__arm_veorq_x_u16): Likewise.
      	(__arm_veorq_x_u32): Likewise.
      	(__arm_vmovlbq_x_s8): Likewise.
      	(__arm_vmovlbq_x_s16): Likewise.
      	(__arm_vmovlbq_x_u8): Likewise.
      	(__arm_vmovlbq_x_u16): Likewise.
      	(__arm_vmovltq_x_s8): Likewise.
      	(__arm_vmovltq_x_s16): Likewise.
      	(__arm_vmovltq_x_u8): Likewise.
      	(__arm_vmovltq_x_u16): Likewise.
      	(__arm_vmvnq_x_s8): Likewise.
      	(__arm_vmvnq_x_s16): Likewise.
      	(__arm_vmvnq_x_s32): Likewise.
      	(__arm_vmvnq_x_u8): Likewise.
      	(__arm_vmvnq_x_u16): Likewise.
      	(__arm_vmvnq_x_u32): Likewise.
      	(__arm_vmvnq_x_n_s16): Likewise.
      	(__arm_vmvnq_x_n_s32): Likewise.
      	(__arm_vmvnq_x_n_u16): Likewise.
      	(__arm_vmvnq_x_n_u32): Likewise.
      	(__arm_vornq_x_s8): Likewise.
      	(__arm_vornq_x_s16): Likewise.
      	(__arm_vornq_x_s32): Likewise.
      	(__arm_vornq_x_u8): Likewise.
      	(__arm_vornq_x_u16): Likewise.
      	(__arm_vornq_x_u32): Likewise.
      	(__arm_vorrq_x_s8): Likewise.
      	(__arm_vorrq_x_s16): Likewise.
      	(__arm_vorrq_x_s32): Likewise.
      	(__arm_vorrq_x_u8): Likewise.
      	(__arm_vorrq_x_u16): Likewise.
      	(__arm_vorrq_x_u32): Likewise.
      	(__arm_vrev16q_x_s8): Likewise.
      	(__arm_vrev16q_x_u8): Likewise.
      	(__arm_vrev32q_x_s8): Likewise.
      	(__arm_vrev32q_x_s16): Likewise.
      	(__arm_vrev32q_x_u8): Likewise.
      	(__arm_vrev32q_x_u16): Likewise.
      	(__arm_vrev64q_x_s8): Likewise.
      	(__arm_vrev64q_x_s16): Likewise.
      	(__arm_vrev64q_x_s32): Likewise.
      	(__arm_vrev64q_x_u8): Likewise.
      	(__arm_vrev64q_x_u16): Likewise.
      	(__arm_vrev64q_x_u32): Likewise.
      	(__arm_vrshlq_x_s8): Likewise.
      	(__arm_vrshlq_x_s16): Likewise.
      	(__arm_vrshlq_x_s32): Likewise.
      	(__arm_vrshlq_x_u8): Likewise.
      	(__arm_vrshlq_x_u16): Likewise.
      	(__arm_vrshlq_x_u32): Likewise.
      	(__arm_vshllbq_x_n_s8): Likewise.
      	(__arm_vshllbq_x_n_s16): Likewise.
      	(__arm_vshllbq_x_n_u8): Likewise.
      	(__arm_vshllbq_x_n_u16): Likewise.
      	(__arm_vshlltq_x_n_s8): Likewise.
      	(__arm_vshlltq_x_n_s16): Likewise.
      	(__arm_vshlltq_x_n_u8): Likewise.
      	(__arm_vshlltq_x_n_u16): Likewise.
      	(__arm_vshlq_x_s8): Likewise.
      	(__arm_vshlq_x_s16): Likewise.
      	(__arm_vshlq_x_s32): Likewise.
      	(__arm_vshlq_x_u8): Likewise.
      	(__arm_vshlq_x_u16): Likewise.
      	(__arm_vshlq_x_u32): Likewise.
      	(__arm_vshlq_x_n_s8): Likewise.
      	(__arm_vshlq_x_n_s16): Likewise.
      	(__arm_vshlq_x_n_s32): Likewise.
      	(__arm_vshlq_x_n_u8): Likewise.
      	(__arm_vshlq_x_n_u16): Likewise.
      	(__arm_vshlq_x_n_u32): Likewise.
      	(__arm_vrshrq_x_n_s8): Likewise.
      	(__arm_vrshrq_x_n_s16): Likewise.
      	(__arm_vrshrq_x_n_s32): Likewise.
      	(__arm_vrshrq_x_n_u8): Likewise.
      	(__arm_vrshrq_x_n_u16): Likewise.
      	(__arm_vrshrq_x_n_u32): Likewise.
      	(__arm_vshrq_x_n_s8): Likewise.
      	(__arm_vshrq_x_n_s16): Likewise.
      	(__arm_vshrq_x_n_s32): Likewise.
      	(__arm_vshrq_x_n_u8): Likewise.
      	(__arm_vshrq_x_n_u16): Likewise.
      	(__arm_vshrq_x_n_u32): Likewise.
      	(__arm_vdupq_x_n_f16): Likewise.
      	(__arm_vdupq_x_n_f32): Likewise.
      	(__arm_vminnmq_x_f16): Likewise.
      	(__arm_vminnmq_x_f32): Likewise.
      	(__arm_vmaxnmq_x_f16): Likewise.
      	(__arm_vmaxnmq_x_f32): Likewise.
      	(__arm_vabdq_x_f16): Likewise.
      	(__arm_vabdq_x_f32): Likewise.
      	(__arm_vabsq_x_f16): Likewise.
      	(__arm_vabsq_x_f32): Likewise.
      	(__arm_vaddq_x_f16): Likewise.
      	(__arm_vaddq_x_f32): Likewise.
      	(__arm_vaddq_x_n_f16): Likewise.
      	(__arm_vaddq_x_n_f32): Likewise.
      	(__arm_vnegq_x_f16): Likewise.
      	(__arm_vnegq_x_f32): Likewise.
      	(__arm_vmulq_x_f16): Likewise.
      	(__arm_vmulq_x_f32): Likewise.
      	(__arm_vmulq_x_n_f16): Likewise.
      	(__arm_vmulq_x_n_f32): Likewise.
      	(__arm_vsubq_x_f16): Likewise.
      	(__arm_vsubq_x_f32): Likewise.
      	(__arm_vsubq_x_n_f16): Likewise.
      	(__arm_vsubq_x_n_f32): Likewise.
      	(__arm_vcaddq_rot90_x_f16): Likewise.
      	(__arm_vcaddq_rot90_x_f32): Likewise.
      	(__arm_vcaddq_rot270_x_f16): Likewise.
      	(__arm_vcaddq_rot270_x_f32): Likewise.
      	(__arm_vcmulq_x_f16): Likewise.
      	(__arm_vcmulq_x_f32): Likewise.
      	(__arm_vcmulq_rot90_x_f16): Likewise.
      	(__arm_vcmulq_rot90_x_f32): Likewise.
      	(__arm_vcmulq_rot180_x_f16): Likewise.
      	(__arm_vcmulq_rot180_x_f32): Likewise.
      	(__arm_vcmulq_rot270_x_f16): Likewise.
      	(__arm_vcmulq_rot270_x_f32): Likewise.
      	(__arm_vcvtaq_x_s16_f16): Likewise.
      	(__arm_vcvtaq_x_s32_f32): Likewise.
      	(__arm_vcvtaq_x_u16_f16): Likewise.
      	(__arm_vcvtaq_x_u32_f32): Likewise.
      	(__arm_vcvtnq_x_s16_f16): Likewise.
      	(__arm_vcvtnq_x_s32_f32): Likewise.
      	(__arm_vcvtnq_x_u16_f16): Likewise.
      	(__arm_vcvtnq_x_u32_f32): Likewise.
      	(__arm_vcvtpq_x_s16_f16): Likewise.
      	(__arm_vcvtpq_x_s32_f32): Likewise.
      	(__arm_vcvtpq_x_u16_f16): Likewise.
      	(__arm_vcvtpq_x_u32_f32): Likewise.
      	(__arm_vcvtmq_x_s16_f16): Likewise.
      	(__arm_vcvtmq_x_s32_f32): Likewise.
      	(__arm_vcvtmq_x_u16_f16): Likewise.
      	(__arm_vcvtmq_x_u32_f32): Likewise.
      	(__arm_vcvtbq_x_f32_f16): Likewise.
      	(__arm_vcvttq_x_f32_f16): Likewise.
      	(__arm_vcvtq_x_f16_u16): Likewise.
      	(__arm_vcvtq_x_f16_s16): Likewise.
      	(__arm_vcvtq_x_f32_s32): Likewise.
      	(__arm_vcvtq_x_f32_u32): Likewise.
      	(__arm_vcvtq_x_n_f16_s16): Likewise.
      	(__arm_vcvtq_x_n_f16_u16): Likewise.
      	(__arm_vcvtq_x_n_f32_s32): Likewise.
      	(__arm_vcvtq_x_n_f32_u32): Likewise.
      	(__arm_vcvtq_x_s16_f16): Likewise.
      	(__arm_vcvtq_x_s32_f32): Likewise.
      	(__arm_vcvtq_x_u16_f16): Likewise.
      	(__arm_vcvtq_x_u32_f32): Likewise.
      	(__arm_vcvtq_x_n_s16_f16): Likewise.
      	(__arm_vcvtq_x_n_s32_f32): Likewise.
      	(__arm_vcvtq_x_n_u16_f16): Likewise.
      	(__arm_vcvtq_x_n_u32_f32): Likewise.
      	(__arm_vrndq_x_f16): Likewise.
      	(__arm_vrndq_x_f32): Likewise.
      	(__arm_vrndnq_x_f16): Likewise.
      	(__arm_vrndnq_x_f32): Likewise.
      	(__arm_vrndmq_x_f16): Likewise.
      	(__arm_vrndmq_x_f32): Likewise.
      	(__arm_vrndpq_x_f16): Likewise.
      	(__arm_vrndpq_x_f32): Likewise.
      	(__arm_vrndaq_x_f16): Likewise.
      	(__arm_vrndaq_x_f32): Likewise.
      	(__arm_vrndxq_x_f16): Likewise.
      	(__arm_vrndxq_x_f32): Likewise.
      	(__arm_vandq_x_f16): Likewise.
      	(__arm_vandq_x_f32): Likewise.
      	(__arm_vbicq_x_f16): Likewise.
      	(__arm_vbicq_x_f32): Likewise.
      	(__arm_vbrsrq_x_n_f16): Likewise.
      	(__arm_vbrsrq_x_n_f32): Likewise.
      	(__arm_veorq_x_f16): Likewise.
      	(__arm_veorq_x_f32): Likewise.
      	(__arm_vornq_x_f16): Likewise.
      	(__arm_vornq_x_f32): Likewise.
      	(__arm_vorrq_x_f16): Likewise.
      	(__arm_vorrq_x_f32): Likewise.
      	(__arm_vrev32q_x_f16): Likewise.
      	(__arm_vrev64q_x_f16): Likewise.
      	(__arm_vrev64q_x_f32): Likewise.
      	(vabdq_x): Define polymorphic variant.
      	(vabsq_x): Likewise.
      	(vaddq_x): Likewise.
      	(vandq_x): Likewise.
      	(vbicq_x): Likewise.
      	(vbrsrq_x): Likewise.
      	(vcaddq_rot270_x): Likewise.
      	(vcaddq_rot90_x): Likewise.
      	(vcmulq_rot180_x): Likewise.
      	(vcmulq_rot270_x): Likewise.
      	(vcmulq_x): Likewise.
      	(vcvtq_x): Likewise.
      	(vcvtq_x_n): Likewise.
      	(vcvtnq_m): Likewise.
      	(veorq_x): Likewise.
      	(vmaxnmq_x): Likewise.
      	(vminnmq_x): Likewise.
      	(vmulq_x): Likewise.
      	(vnegq_x): Likewise.
      	(vornq_x): Likewise.
      	(vorrq_x): Likewise.
      	(vrev32q_x): Likewise.
      	(vrev64q_x): Likewise.
      	(vrndaq_x): Likewise.
      	(vrndmq_x): Likewise.
      	(vrndnq_x): Likewise.
      	(vrndpq_x): Likewise.
      	(vrndq_x): Likewise.
      	(vrndxq_x): Likewise.
      	(vsubq_x): Likewise.
      	(vcmulq_rot90_x): Likewise.
      	(vadciq): Likewise.
      	(vclsq_x): Likewise.
      	(vclzq_x): Likewise.
      	(vhaddq_x): Likewise.
      	(vhcaddq_rot270_x): Likewise.
      	(vhcaddq_rot90_x): Likewise.
      	(vhsubq_x): Likewise.
      	(vmaxq_x): Likewise.
      	(vminq_x): Likewise.
      	(vmovlbq_x): Likewise.
      	(vmovltq_x): Likewise.
      	(vmulhq_x): Likewise.
      	(vmullbq_int_x): Likewise.
      	(vmullbq_poly_x): Likewise.
      	(vmulltq_int_x): Likewise.
      	(vmulltq_poly_x): Likewise.
      	(vmvnq_x): Likewise.
      	(vrev16q_x): Likewise.
      	(vrhaddq_x): Likewise.
      	(vrmulhq_x): Likewise.
      	(vrshlq_x): Likewise.
      	(vrshrq_x): Likewise.
      	(vshllbq_x): Likewise.
      	(vshlltq_x): Likewise.
      	(vshlq_x_n): Likewise.
      	(vshlq_x): Likewise.
      	(vdwdupq_x_u8): Likewise.
      	(vdwdupq_x_u16): Likewise.
      	(vdwdupq_x_u32): Likewise.
      	(viwdupq_x_u8): Likewise.
      	(viwdupq_x_u16): Likewise.
      	(viwdupq_x_u32): Likewise.
      	(vidupq_x_u8): Likewise.
      	(vddupq_x_u8): Likewise.
      	(vidupq_x_u16): Likewise.
      	(vddupq_x_u16): Likewise.
      	(vidupq_x_u32): Likewise.
      	(vddupq_x_u32): Likewise.
      	(vshrq_x): Likewise.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vabdq_x_f16.c: New test.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabsq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabsq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabsq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabsq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabsq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclsq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclsq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclsq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclzq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclzq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclzq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclzq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclzq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclzq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_rot180_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_rot180_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_rot270_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_rot270_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_rot90_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_rot90_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtaq_x_s16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtaq_x_s32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtaq_x_u16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtaq_x_u32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtbq_x_f32_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtmq_x_s16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtmq_x_s32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtmq_x_u16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtmq_x_u32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtnq_x_s16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtnq_x_s32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtnq_x_u16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtnq_x_u32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtpq_x_s16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtpq_x_s32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtpq_x_u16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtpq_x_u32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_f16_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_f16_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_f32_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_f32_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f16_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f16_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f32_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f32_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_s16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_s32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_u16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_u32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_s16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_s32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_u16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_u32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvttq_x_f32_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_x_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_x_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_x_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_x_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_x_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_x_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_x_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_x_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_x_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_x_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_x_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_x_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxnmq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxnmq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminnmq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminnmq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovlbq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovlbq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovlbq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovlbq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovltq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovltq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovltq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovltq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulhq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulhq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulhq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulhq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulhq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulhq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_poly_x_p16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_poly_x_p8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_poly_x_p16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_poly_x_p8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vnegq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vnegq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vnegq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vnegq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vnegq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev16q_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev16q_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev32q_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev32q_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev32q_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev32q_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev32q_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrhaddq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrhaddq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrhaddq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrhaddq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrhaddq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrhaddq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrmulhq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrmulhq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrmulhq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrmulhq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrmulhq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrmulhq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndaq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndaq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndmq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndmq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndnq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndnq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndpq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndpq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndxq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndxq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshlq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshlq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshlq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshlq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshlq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshlq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshllbq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshllbq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshllbq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshllbq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlltq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlltq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlltq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlltq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshrq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshrq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshrq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshrq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshrq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_u8.c: Likewise.
      Srinath Parvathaneni committed
    • fix CTOR vectorization · 3d42842c
      We failed to handle pattern stmts appropriately.
      
      2020-03-20  Richard Biener  <rguenther@suse.de>
      
      	* tree-vect-slp.c (vect_analyze_slp_instance): Push the stmts
      	to vectorize for CTOR defs.
      Richard Biener committed
    • [ARM][GCC][2/8x]: MVE ACLE gather load and scatter store intrinsics with writeback. · 41e1a7ff
      This patch supports the following MVE ACLE intrinsics with writeback.
      
      vldrdq_gather_base_wb_s64, vldrdq_gather_base_wb_u64, vldrdq_gather_base_wb_z_s64,
      vldrdq_gather_base_wb_z_u64, vldrwq_gather_base_wb_f32, vldrwq_gather_base_wb_s32,
      vldrwq_gather_base_wb_u32, vldrwq_gather_base_wb_z_f32, vldrwq_gather_base_wb_z_s32,
      vldrwq_gather_base_wb_z_u32, vstrdq_scatter_base_wb_p_s64, vstrdq_scatter_base_wb_p_u64,
      vstrdq_scatter_base_wb_s64, vstrdq_scatter_base_wb_u64, vstrwq_scatter_base_wb_p_s32,
      vstrwq_scatter_base_wb_p_f32, vstrwq_scatter_base_wb_p_u32, vstrwq_scatter_base_wb_s32,
      vstrwq_scatter_base_wb_u32, vstrwq_scatter_base_wb_f32.
      
      Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* config/arm/arm-builtins.c (LDRGBWBS_QUALIFIERS): Define builtin
      	qualifier.
      	(LDRGBWBU_QUALIFIERS): Likewise.
      	(LDRGBWBS_Z_QUALIFIERS): Likewise.
      	(LDRGBWBU_Z_QUALIFIERS): Likewise.
      	(STRSBWBS_QUALIFIERS): Likewise.
      	(STRSBWBU_QUALIFIERS): Likewise.
      	(STRSBWBS_P_QUALIFIERS): Likewise.
      	(STRSBWBU_P_QUALIFIERS): Likewise.
      	* config/arm/arm_mve.h (vldrdq_gather_base_wb_s64): Define macro.
      	(vldrdq_gather_base_wb_u64): Likewise.
      	(vldrdq_gather_base_wb_z_s64): Likewise.
      	(vldrdq_gather_base_wb_z_u64): Likewise.
      	(vldrwq_gather_base_wb_f32): Likewise.
      	(vldrwq_gather_base_wb_s32): Likewise.
      	(vldrwq_gather_base_wb_u32): Likewise.
      	(vldrwq_gather_base_wb_z_f32): Likewise.
      	(vldrwq_gather_base_wb_z_s32): Likewise.
      	(vldrwq_gather_base_wb_z_u32): Likewise.
      	(vstrdq_scatter_base_wb_p_s64): Likewise.
      	(vstrdq_scatter_base_wb_p_u64): Likewise.
      	(vstrdq_scatter_base_wb_s64): Likewise.
      	(vstrdq_scatter_base_wb_u64): Likewise.
      	(vstrwq_scatter_base_wb_p_s32): Likewise.
      	(vstrwq_scatter_base_wb_p_f32): Likewise.
      	(vstrwq_scatter_base_wb_p_u32): Likewise.
      	(vstrwq_scatter_base_wb_s32): Likewise.
      	(vstrwq_scatter_base_wb_u32): Likewise.
      	(vstrwq_scatter_base_wb_f32): Likewise.
      	(__arm_vldrdq_gather_base_wb_s64): Define intrinsic.
      	(__arm_vldrdq_gather_base_wb_u64): Likewise.
      	(__arm_vldrdq_gather_base_wb_z_s64): Likewise.
      	(__arm_vldrdq_gather_base_wb_z_u64): Likewise.
      	(__arm_vldrwq_gather_base_wb_s32): Likewise.
      	(__arm_vldrwq_gather_base_wb_u32): Likewise.
      	(__arm_vldrwq_gather_base_wb_z_s32): Likewise.
      	(__arm_vldrwq_gather_base_wb_z_u32): Likewise.
      	(__arm_vstrdq_scatter_base_wb_s64): Likewise.
      	(__arm_vstrdq_scatter_base_wb_u64): Likewise.
      	(__arm_vstrdq_scatter_base_wb_p_s64): Likewise.
      	(__arm_vstrdq_scatter_base_wb_p_u64): Likewise.
      	(__arm_vstrwq_scatter_base_wb_p_s32): Likewise.
      	(__arm_vstrwq_scatter_base_wb_p_u32): Likewise.
      	(__arm_vstrwq_scatter_base_wb_s32): Likewise.
      	(__arm_vstrwq_scatter_base_wb_u32): Likewise.
      	(__arm_vldrwq_gather_base_wb_f32): Likewise.
      	(__arm_vldrwq_gather_base_wb_z_f32): Likewise.
      	(__arm_vstrwq_scatter_base_wb_f32): Likewise.
      	(__arm_vstrwq_scatter_base_wb_p_f32): Likewise.
      	(vstrwq_scatter_base_wb): Define polymorphic variant.
      	(vstrwq_scatter_base_wb_p): Likewise.
      	(vstrdq_scatter_base_wb_p): Likewise.
      	(vstrdq_scatter_base_wb): Likewise.
      	* config/arm/arm_mve_builtins.def (LDRGBWBS_QUALIFIERS): Use builtin
      	qualifier.
      	* config/arm/mve.md (mve_vstrwq_scatter_base_wb_<supf>v4si): Define RTL
      	pattern.
      	(mve_vstrwq_scatter_base_wb_add_<supf>v4si): Likewise.
      	(mve_vstrwq_scatter_base_wb_<supf>v4si_insn): Likewise.
      	(mve_vstrwq_scatter_base_wb_p_<supf>v4si): Likewise.
      	(mve_vstrwq_scatter_base_wb_p_add_<supf>v4si): Likewise.
      	(mve_vstrwq_scatter_base_wb_p_<supf>v4si_insn): Likewise.
      	(mve_vstrwq_scatter_base_wb_fv4sf): Likewise.
      	(mve_vstrwq_scatter_base_wb_add_fv4sf): Likewise.
      	(mve_vstrwq_scatter_base_wb_fv4sf_insn): Likewise.
      	(mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise.
      	(mve_vstrwq_scatter_base_wb_p_add_fv4sf): Likewise.
      	(mve_vstrwq_scatter_base_wb_p_fv4sf_insn): Likewise.
      	(mve_vstrdq_scatter_base_wb_<supf>v2di): Likewise.
      	(mve_vstrdq_scatter_base_wb_add_<supf>v2di): Likewise.
      	(mve_vstrdq_scatter_base_wb_<supf>v2di_insn): Likewise.
      	(mve_vstrdq_scatter_base_wb_p_<supf>v2di): Likewise.
      	(mve_vstrdq_scatter_base_wb_p_add_<supf>v2di): Likewise.
      	(mve_vstrdq_scatter_base_wb_p_<supf>v2di_insn): Likewise.
      	(mve_vldrwq_gather_base_wb_<supf>v4si): Likewise.
      	(mve_vldrwq_gather_base_wb_<supf>v4si_insn): Likewise.
      	(mve_vldrwq_gather_base_wb_z_<supf>v4si): Likewise.
      	(mve_vldrwq_gather_base_wb_z_<supf>v4si_insn): Likewise.
      	(mve_vldrwq_gather_base_wb_fv4sf): Likewise.
      	(mve_vldrwq_gather_base_wb_fv4sf_insn): Likewise.
      	(mve_vldrwq_gather_base_wb_z_fv4sf): Likewise.
      	(mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise.
      	(mve_vldrdq_gather_base_wb_<supf>v2di): Likewise.
      	(mve_vldrdq_gather_base_wb_<supf>v2di_insn): Likewise.
      	(mve_vldrdq_gather_base_wb_z_<supf>v2di): Likewise.
      	(mve_vldrdq_gather_base_wb_z_<supf>v2di_insn): Likewise.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_s64.c: New test.
      	* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_u64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_z_s64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_z_u64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_z_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_z_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_z_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_p_s64.c:
      	Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_p_u64.c:
      	Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_s64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_u64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_p_f32.c:
      	Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_p_s32.c:
      	Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_p_u32.c:
      	Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_u32.c: Likewise.
      Srinath Parvathaneni committed
    • [ARM][GCC][1/8x]: MVE ACLE vidup, vddup, viwdup and vdwdup intrinsics with writeback. · 92f80065
      This patch supports the following MVE ACLE intrinsics with writeback.
      
      vddupq_m_n_u8, vddupq_m_n_u32, vddupq_m_n_u16, vddupq_m_wb_u8, vddupq_m_wb_u16,
      vddupq_m_wb_u32, vddupq_n_u8, vddupq_n_u32, vddupq_n_u16, vddupq_wb_u8,
      vddupq_wb_u16, vddupq_wb_u32, vdwdupq_m_n_u8, vdwdupq_m_n_u32, vdwdupq_m_n_u16,
      vdwdupq_m_wb_u8, vdwdupq_m_wb_u32, vdwdupq_m_wb_u16, vdwdupq_n_u8,
      vdwdupq_n_u32, vdwdupq_n_u16, vdwdupq_wb_u8, vdwdupq_wb_u32, vdwdupq_wb_u16,
      vidupq_m_n_u8, vidupq_m_n_u32, vidupq_m_n_u16, vidupq_m_wb_u8, vidupq_m_wb_u16,
      vidupq_m_wb_u32, vidupq_n_u8, vidupq_n_u32, vidupq_n_u16, vidupq_wb_u8,
      vidupq_wb_u16, vidupq_wb_u32, viwdupq_m_n_u8, viwdupq_m_n_u32, viwdupq_m_n_u16,
      viwdupq_m_wb_u8, viwdupq_m_wb_u32, viwdupq_m_wb_u16, viwdupq_n_u8,
      viwdupq_n_u32, viwdupq_n_u16, viwdupq_wb_u8, viwdupq_wb_u32, viwdupq_wb_u16.
      
      Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* config/arm/arm-builtins.c
      	(QUINOP_UNONE_UNONE_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Define quinary
      	builtin qualifier.
      	* config/arm/arm_mve.h (vddupq_m_n_u8): Define macro.
      	(vddupq_m_n_u32): Likewise.
      	(vddupq_m_n_u16): Likewise.
      	(vddupq_m_wb_u8): Likewise.
      	(vddupq_m_wb_u16): Likewise.
      	(vddupq_m_wb_u32): Likewise.
      	(vddupq_n_u8): Likewise.
      	(vddupq_n_u32): Likewise.
      	(vddupq_n_u16): Likewise.
      	(vddupq_wb_u8): Likewise.
      	(vddupq_wb_u16): Likewise.
      	(vddupq_wb_u32): Likewise.
      	(vdwdupq_m_n_u8): Likewise.
      	(vdwdupq_m_n_u32): Likewise.
      	(vdwdupq_m_n_u16): Likewise.
      	(vdwdupq_m_wb_u8): Likewise.
      	(vdwdupq_m_wb_u32): Likewise.
      	(vdwdupq_m_wb_u16): Likewise.
      	(vdwdupq_n_u8): Likewise.
      	(vdwdupq_n_u32): Likewise.
      	(vdwdupq_n_u16): Likewise.
      	(vdwdupq_wb_u8): Likewise.
      	(vdwdupq_wb_u32): Likewise.
      	(vdwdupq_wb_u16): Likewise.
      	(vidupq_m_n_u8): Likewise.
      	(vidupq_m_n_u32): Likewise.
      	(vidupq_m_n_u16): Likewise.
      	(vidupq_m_wb_u8): Likewise.
      	(vidupq_m_wb_u16): Likewise.
      	(vidupq_m_wb_u32): Likewise.
      	(vidupq_n_u8): Likewise.
      	(vidupq_n_u32): Likewise.
      	(vidupq_n_u16): Likewise.
      	(vidupq_wb_u8): Likewise.
      	(vidupq_wb_u16): Likewise.
      	(vidupq_wb_u32): Likewise.
      	(viwdupq_m_n_u8): Likewise.
      	(viwdupq_m_n_u32): Likewise.
      	(viwdupq_m_n_u16): Likewise.
      	(viwdupq_m_wb_u8): Likewise.
      	(viwdupq_m_wb_u32): Likewise.
      	(viwdupq_m_wb_u16): Likewise.
      	(viwdupq_n_u8): Likewise.
      	(viwdupq_n_u32): Likewise.
      	(viwdupq_n_u16): Likewise.
      	(viwdupq_wb_u8): Likewise.
      	(viwdupq_wb_u32): Likewise.
      	(viwdupq_wb_u16): Likewise.
      	(__arm_vddupq_m_n_u8): Define intrinsic.
      	(__arm_vddupq_m_n_u32): Likewise.
      	(__arm_vddupq_m_n_u16): Likewise.
      	(__arm_vddupq_m_wb_u8): Likewise.
      	(__arm_vddupq_m_wb_u16): Likewise.
      	(__arm_vddupq_m_wb_u32): Likewise.
      	(__arm_vddupq_n_u8): Likewise.
      	(__arm_vddupq_n_u32): Likewise.
      	(__arm_vddupq_n_u16): Likewise.
      	(__arm_vdwdupq_m_n_u8): Likewise.
      	(__arm_vdwdupq_m_n_u32): Likewise.
      	(__arm_vdwdupq_m_n_u16): Likewise.
      	(__arm_vdwdupq_m_wb_u8): Likewise.
      	(__arm_vdwdupq_m_wb_u32): Likewise.
      	(__arm_vdwdupq_m_wb_u16): Likewise.
      	(__arm_vdwdupq_n_u8): Likewise.
      	(__arm_vdwdupq_n_u32): Likewise.
      	(__arm_vdwdupq_n_u16): Likewise.
      	(__arm_vdwdupq_wb_u8): Likewise.
      	(__arm_vdwdupq_wb_u32): Likewise.
      	(__arm_vdwdupq_wb_u16): Likewise.
      	(__arm_vidupq_m_n_u8): Likewise.
      	(__arm_vidupq_m_n_u32): Likewise.
      	(__arm_vidupq_m_n_u16): Likewise.
      	(__arm_vidupq_n_u8): Likewise.
      	(__arm_vidupq_m_wb_u8): Likewise.
      	(__arm_vidupq_m_wb_u16): Likewise.
      	(__arm_vidupq_m_wb_u32): Likewise.
      	(__arm_vidupq_n_u32): Likewise.
      	(__arm_vidupq_n_u16): Likewise.
      	(__arm_vidupq_wb_u8): Likewise.
      	(__arm_vidupq_wb_u16): Likewise.
      	(__arm_vidupq_wb_u32): Likewise.
      	(__arm_vddupq_wb_u8): Likewise.
      	(__arm_vddupq_wb_u16): Likewise.
      	(__arm_vddupq_wb_u32): Likewise.
      	(__arm_viwdupq_m_n_u8): Likewise.
      	(__arm_viwdupq_m_n_u32): Likewise.
      	(__arm_viwdupq_m_n_u16): Likewise.
      	(__arm_viwdupq_m_wb_u8): Likewise.
      	(__arm_viwdupq_m_wb_u32): Likewise.
      	(__arm_viwdupq_m_wb_u16): Likewise.
      	(__arm_viwdupq_n_u8): Likewise.
      	(__arm_viwdupq_n_u32): Likewise.
      	(__arm_viwdupq_n_u16): Likewise.
      	(__arm_viwdupq_wb_u8): Likewise.
      	(__arm_viwdupq_wb_u32): Likewise.
      	(__arm_viwdupq_wb_u16): Likewise.
      	(vidupq_m): Define polymorphic variant.
      	(vddupq_m): Likewise.
      	(vidupq_u16): Likewise.
      	(vidupq_u32): Likewise.
      	(vidupq_u8): Likewise.
      	(vddupq_u16): Likewise.
      	(vddupq_u32): Likewise.
      	(vddupq_u8): Likewise.
      	(viwdupq_m): Likewise.
      	(viwdupq_u16): Likewise.
      	(viwdupq_u32): Likewise.
      	(viwdupq_u8): Likewise.
      	(vdwdupq_m): Likewise.
      	(vdwdupq_u16): Likewise.
      	(vdwdupq_u32): Likewise.
      	(vdwdupq_u8): Likewise.
      	* config/arm/arm_mve_builtins.def
      	(QUINOP_UNONE_UNONE_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Use builtin
      	qualifier.
      	* config/arm/mve.md (mve_vidupq_n_u<mode>): Define RTL pattern.
      	(mve_vidupq_u<mode>_insn): Likewise.
      	(mve_vidupq_m_n_u<mode>): Likewise.
      	(mve_vidupq_m_wb_u<mode>_insn): Likewise.
      	(mve_vddupq_n_u<mode>): Likewise.
      	(mve_vddupq_u<mode>_insn): Likewise.
      	(mve_vddupq_m_n_u<mode>): Likewise.
      	(mve_vddupq_m_wb_u<mode>_insn): Likewise.
      	(mve_vdwdupq_n_u<mode>): Likewise.
      	(mve_vdwdupq_wb_u<mode>): Likewise.
      	(mve_vdwdupq_wb_u<mode>_insn): Likewise.
      	(mve_vdwdupq_m_n_u<mode>): Likewise.
      	(mve_vdwdupq_m_wb_u<mode>): Likewise.
      	(mve_vdwdupq_m_wb_u<mode>_insn): Likewise.
      	(mve_viwdupq_n_u<mode>): Likewise.
      	(mve_viwdupq_wb_u<mode>): Likewise.
      	(mve_viwdupq_wb_u<mode>_insn): Likewise.
      	(mve_viwdupq_m_n_u<mode>): Likewise.
      	(mve_viwdupq_m_wb_u<mode>): Likewise.
      	(mve_viwdupq_m_wb_u<mode>_insn): Likewise.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vddupq_m_n_u16.c: New test.
      	* gcc.target/arm/mve/intrinsics/vddupq_m_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_m_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_m_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_m_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_m_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_m_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_m_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_m_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_m_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_m_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_m_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_m_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_m_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_m_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_m_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_m_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_m_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_m_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_m_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_m_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_m_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_m_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_m_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_wb_u8.c: Likewise.
      Srinath Parvathaneni committed
    • [ARM][GCC][7x]: MVE vreinterpretq and vuninitializedq intrinsics. · 85a94e87
      This patch supports the following MVE ACLE intrinsics.
      
      vreinterpretq_s16_s32, vreinterpretq_s16_s64, vreinterpretq_s16_s8, vreinterpretq_s16_u16,
      vreinterpretq_s16_u32, vreinterpretq_s16_u64, vreinterpretq_s16_u8, vreinterpretq_s32_s16,
      vreinterpretq_s32_s64, vreinterpretq_s32_s8, vreinterpretq_s32_u16, vreinterpretq_s32_u32,
      vreinterpretq_s32_u64, vreinterpretq_s32_u8, vreinterpretq_s64_s16, vreinterpretq_s64_s32,
      vreinterpretq_s64_s8, vreinterpretq_s64_u16, vreinterpretq_s64_u32, vreinterpretq_s64_u64,
      vreinterpretq_s64_u8, vreinterpretq_s8_s16, vreinterpretq_s8_s32, vreinterpretq_s8_s64,
      vreinterpretq_s8_u16, vreinterpretq_s8_u32, vreinterpretq_s8_u64, vreinterpretq_s8_u8,
      vreinterpretq_u16_s16, vreinterpretq_u16_s32, vreinterpretq_u16_s64, vreinterpretq_u16_s8,
      vreinterpretq_u16_u32, vreinterpretq_u16_u64, vreinterpretq_u16_u8, vreinterpretq_u32_s16,
      vreinterpretq_u32_s32, vreinterpretq_u32_s64, vreinterpretq_u32_s8, vreinterpretq_u32_u16,
      vreinterpretq_u32_u64, vreinterpretq_u32_u8, vreinterpretq_u64_s16, vreinterpretq_u64_s32,
      vreinterpretq_u64_s64, vreinterpretq_u64_s8, vreinterpretq_u64_u16, vreinterpretq_u64_u32,
      vreinterpretq_u64_u8, vreinterpretq_u8_s16, vreinterpretq_u8_s32, vreinterpretq_u8_s64,
      vreinterpretq_u8_s8, vreinterpretq_u8_u16, vreinterpretq_u8_u32, vreinterpretq_u8_u64,
      vreinterpretq_s32_f16, vreinterpretq_s32_f32, vreinterpretq_u16_f16, vreinterpretq_u16_f32,
      vreinterpretq_u32_f16, vreinterpretq_u32_f32, vreinterpretq_u64_f16, vreinterpretq_u64_f32,
      vreinterpretq_u8_f16, vreinterpretq_u8_f32, vreinterpretq_f16_f32, vreinterpretq_f16_s16,
      vreinterpretq_f16_s32, vreinterpretq_f16_s64, vreinterpretq_f16_s8, vreinterpretq_f16_u16,
      vreinterpretq_f16_u32, vreinterpretq_f16_u64, vreinterpretq_f16_u8, vreinterpretq_f32_f16,
      vreinterpretq_f32_s16, vreinterpretq_f32_s32, vreinterpretq_f32_s64, vreinterpretq_f32_s8,
      vreinterpretq_f32_u16, vreinterpretq_f32_u32, vreinterpretq_f32_u64, vreinterpretq_f32_u8,
      vreinterpretq_s16_f16, vreinterpretq_s16_f32, vreinterpretq_s64_f16, vreinterpretq_s64_f32,
      vreinterpretq_s8_f16, vreinterpretq_s8_f32, vuninitializedq_u8, vuninitializedq_u16,
      vuninitializedq_u32, vuninitializedq_u64, vuninitializedq_s8, vuninitializedq_s16,
      vuninitializedq_s32, vuninitializedq_s64, vuninitializedq_f16, vuninitializedq_f32 and
      vuninitializedq.
      
      Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
      
      	* config/arm/arm_mve.h (vreinterpretq_s16_s32): Define macro.
      	(vreinterpretq_s16_s64): Likewise.
      	(vreinterpretq_s16_s8): Likewise.
      	(vreinterpretq_s16_u16): Likewise.
      	(vreinterpretq_s16_u32): Likewise.
      	(vreinterpretq_s16_u64): Likewise.
      	(vreinterpretq_s16_u8): Likewise.
      	(vreinterpretq_s32_s16): Likewise.
      	(vreinterpretq_s32_s64): Likewise.
      	(vreinterpretq_s32_s8): Likewise.
      	(vreinterpretq_s32_u16): Likewise.
      	(vreinterpretq_s32_u32): Likewise.
      	(vreinterpretq_s32_u64): Likewise.
      	(vreinterpretq_s32_u8): Likewise.
      	(vreinterpretq_s64_s16): Likewise.
      	(vreinterpretq_s64_s32): Likewise.
      	(vreinterpretq_s64_s8): Likewise.
      	(vreinterpretq_s64_u16): Likewise.
      	(vreinterpretq_s64_u32): Likewise.
      	(vreinterpretq_s64_u64): Likewise.
      	(vreinterpretq_s64_u8): Likewise.
      	(vreinterpretq_s8_s16): Likewise.
      	(vreinterpretq_s8_s32): Likewise.
      	(vreinterpretq_s8_s64): Likewise.
      	(vreinterpretq_s8_u16): Likewise.
      	(vreinterpretq_s8_u32): Likewise.
      	(vreinterpretq_s8_u64): Likewise.
      	(vreinterpretq_s8_u8): Likewise.
      	(vreinterpretq_u16_s16): Likewise.
      	(vreinterpretq_u16_s32): Likewise.
      	(vreinterpretq_u16_s64): Likewise.
      	(vreinterpretq_u16_s8): Likewise.
      	(vreinterpretq_u16_u32): Likewise.
      	(vreinterpretq_u16_u64): Likewise.
      	(vreinterpretq_u16_u8): Likewise.
      	(vreinterpretq_u32_s16): Likewise.
      	(vreinterpretq_u32_s32): Likewise.
      	(vreinterpretq_u32_s64): Likewise.
      	(vreinterpretq_u32_s8): Likewise.
      	(vreinterpretq_u32_u16): Likewise.
      	(vreinterpretq_u32_u64): Likewise.
      	(vreinterpretq_u32_u8): Likewise.
      	(vreinterpretq_u64_s16): Likewise.
      	(vreinterpretq_u64_s32): Likewise.
      	(vreinterpretq_u64_s64): Likewise.
      	(vreinterpretq_u64_s8): Likewise.
      	(vreinterpretq_u64_u16): Likewise.
      	(vreinterpretq_u64_u32): Likewise.
      	(vreinterpretq_u64_u8): Likewise.
      	(vreinterpretq_u8_s16): Likewise.
      	(vreinterpretq_u8_s32): Likewise.
      	(vreinterpretq_u8_s64): Likewise.
      	(vreinterpretq_u8_s8): Likewise.
      	(vreinterpretq_u8_u16): Likewise.
      	(vreinterpretq_u8_u32): Likewise.
      	(vreinterpretq_u8_u64): Likewise.
      	(vreinterpretq_s32_f16): Likewise.
      	(vreinterpretq_s32_f32): Likewise.
      	(vreinterpretq_u16_f16): Likewise.
      	(vreinterpretq_u16_f32): Likewise.
      	(vreinterpretq_u32_f16): Likewise.
      	(vreinterpretq_u32_f32): Likewise.
      	(vreinterpretq_u64_f16): Likewise.
      	(vreinterpretq_u64_f32): Likewise.
      	(vreinterpretq_u8_f16): Likewise.
      	(vreinterpretq_u8_f32): Likewise.
      	(vreinterpretq_f16_f32): Likewise.
      	(vreinterpretq_f16_s16): Likewise.
      	(vreinterpretq_f16_s32): Likewise.
      	(vreinterpretq_f16_s64): Likewise.
      	(vreinterpretq_f16_s8): Likewise.
      	(vreinterpretq_f16_u16): Likewise.
      	(vreinterpretq_f16_u32): Likewise.
      	(vreinterpretq_f16_u64): Likewise.
      	(vreinterpretq_f16_u8): Likewise.
      	(vreinterpretq_f32_f16): Likewise.
      	(vreinterpretq_f32_s16): Likewise.
      	(vreinterpretq_f32_s32): Likewise.
      	(vreinterpretq_f32_s64): Likewise.
      	(vreinterpretq_f32_s8): Likewise.
      	(vreinterpretq_f32_u16): Likewise.
      	(vreinterpretq_f32_u32): Likewise.
      	(vreinterpretq_f32_u64): Likewise.
      	(vreinterpretq_f32_u8): Likewise.
      	(vreinterpretq_s16_f16): Likewise.
      	(vreinterpretq_s16_f32): Likewise.
      	(vreinterpretq_s64_f16): Likewise.
      	(vreinterpretq_s64_f32): Likewise.
      	(vreinterpretq_s8_f16): Likewise.
      	(vreinterpretq_s8_f32): Likewise.
      	(vuninitializedq_u8): Likewise.
      	(vuninitializedq_u16): Likewise.
      	(vuninitializedq_u32): Likewise.
      	(vuninitializedq_u64): Likewise.
      	(vuninitializedq_s8): Likewise.
      	(vuninitializedq_s16): Likewise.
      	(vuninitializedq_s32): Likewise.
      	(vuninitializedq_s64): Likewise.
      	(vuninitializedq_f16): Likewise.
      	(vuninitializedq_f32): Likewise.
      	(__arm_vuninitializedq_u8): Define intrinsic.
      	(__arm_vuninitializedq_u16): Likewise.
      	(__arm_vuninitializedq_u32): Likewise.
      	(__arm_vuninitializedq_u64): Likewise.
      	(__arm_vuninitializedq_s8): Likewise.
      	(__arm_vuninitializedq_s16): Likewise.
      	(__arm_vuninitializedq_s32): Likewise.
      	(__arm_vuninitializedq_s64): Likewise.
      	(__arm_vreinterpretq_s16_s32): Likewise.
      	(__arm_vreinterpretq_s16_s64): Likewise.
      	(__arm_vreinterpretq_s16_s8): Likewise.
      	(__arm_vreinterpretq_s16_u16): Likewise.
      	(__arm_vreinterpretq_s16_u32): Likewise.
      	(__arm_vreinterpretq_s16_u64): Likewise.
      	(__arm_vreinterpretq_s16_u8): Likewise.
      	(__arm_vreinterpretq_s32_s16): Likewise.
      	(__arm_vreinterpretq_s32_s64): Likewise.
      	(__arm_vreinterpretq_s32_s8): Likewise.
      	(__arm_vreinterpretq_s32_u16): Likewise.
      	(__arm_vreinterpretq_s32_u32): Likewise.
      	(__arm_vreinterpretq_s32_u64): Likewise.
      	(__arm_vreinterpretq_s32_u8): Likewise.
      	(__arm_vreinterpretq_s64_s16): Likewise.
      	(__arm_vreinterpretq_s64_s32): Likewise.
      	(__arm_vreinterpretq_s64_s8): Likewise.
      	(__arm_vreinterpretq_s64_u16): Likewise.
      	(__arm_vreinterpretq_s64_u32): Likewise.
      	(__arm_vreinterpretq_s64_u64): Likewise.
      	(__arm_vreinterpretq_s64_u8): Likewise.
      	(__arm_vreinterpretq_s8_s16): Likewise.
      	(__arm_vreinterpretq_s8_s32): Likewise.
      	(__arm_vreinterpretq_s8_s64): Likewise.
      	(__arm_vreinterpretq_s8_u16): Likewise.
      	(__arm_vreinterpretq_s8_u32): Likewise.
      	(__arm_vreinterpretq_s8_u64): Likewise.
      	(__arm_vreinterpretq_s8_u8): Likewise.
      	(__arm_vreinterpretq_u16_s16): Likewise.
      	(__arm_vreinterpretq_u16_s32): Likewise.
      	(__arm_vreinterpretq_u16_s64): Likewise.
      	(__arm_vreinterpretq_u16_s8): Likewise.
      	(__arm_vreinterpretq_u16_u32): Likewise.
      	(__arm_vreinterpretq_u16_u64): Likewise.
      	(__arm_vreinterpretq_u16_u8): Likewise.
      	(__arm_vreinterpretq_u32_s16): Likewise.
      	(__arm_vreinterpretq_u32_s32): Likewise.
      	(__arm_vreinterpretq_u32_s64): Likewise.
      	(__arm_vreinterpretq_u32_s8): Likewise.
      	(__arm_vreinterpretq_u32_u16): Likewise.
      	(__arm_vreinterpretq_u32_u64): Likewise.
      	(__arm_vreinterpretq_u32_u8): Likewise.
      	(__arm_vreinterpretq_u64_s16): Likewise.
      	(__arm_vreinterpretq_u64_s32): Likewise.
      	(__arm_vreinterpretq_u64_s64): Likewise.
      	(__arm_vreinterpretq_u64_s8): Likewise.
      	(__arm_vreinterpretq_u64_u16): Likewise.
      	(__arm_vreinterpretq_u64_u32): Likewise.
      	(__arm_vreinterpretq_u64_u8): Likewise.
      	(__arm_vreinterpretq_u8_s16): Likewise.
      	(__arm_vreinterpretq_u8_s32): Likewise.
      	(__arm_vreinterpretq_u8_s64): Likewise.
      	(__arm_vreinterpretq_u8_s8): Likewise.
      	(__arm_vreinterpretq_u8_u16): Likewise.
      	(__arm_vreinterpretq_u8_u32): Likewise.
      	(__arm_vreinterpretq_u8_u64): Likewise.
      	(__arm_vuninitializedq_f16): Likewise.
      	(__arm_vuninitializedq_f32): Likewise.
      	(__arm_vreinterpretq_s32_f16): Likewise.
      	(__arm_vreinterpretq_s32_f32): Likewise.
      	(__arm_vreinterpretq_s16_f16): Likewise.
      	(__arm_vreinterpretq_s16_f32): Likewise.
      	(__arm_vreinterpretq_s64_f16): Likewise.
      	(__arm_vreinterpretq_s64_f32): Likewise.
      	(__arm_vreinterpretq_s8_f16): Likewise.
      	(__arm_vreinterpretq_s8_f32): Likewise.
      	(__arm_vreinterpretq_u16_f16): Likewise.
      	(__arm_vreinterpretq_u16_f32): Likewise.
      	(__arm_vreinterpretq_u32_f16): Likewise.
      	(__arm_vreinterpretq_u32_f32): Likewise.
      	(__arm_vreinterpretq_u64_f16): Likewise.
      	(__arm_vreinterpretq_u64_f32): Likewise.
      	(__arm_vreinterpretq_u8_f16): Likewise.
      	(__arm_vreinterpretq_u8_f32): Likewise.
      	(__arm_vreinterpretq_f16_f32): Likewise.
      	(__arm_vreinterpretq_f16_s16): Likewise.
      	(__arm_vreinterpretq_f16_s32): Likewise.
      	(__arm_vreinterpretq_f16_s64): Likewise.
      	(__arm_vreinterpretq_f16_s8): Likewise.
      	(__arm_vreinterpretq_f16_u16): Likewise.
      	(__arm_vreinterpretq_f16_u32): Likewise.
      	(__arm_vreinterpretq_f16_u64): Likewise.
      	(__arm_vreinterpretq_f16_u8): Likewise.
      	(__arm_vreinterpretq_f32_f16): Likewise.
      	(__arm_vreinterpretq_f32_s16): Likewise.
      	(__arm_vreinterpretq_f32_s32): Likewise.
      	(__arm_vreinterpretq_f32_s64): Likewise.
      	(__arm_vreinterpretq_f32_s8): Likewise.
      	(__arm_vreinterpretq_f32_u16): Likewise.
      	(__arm_vreinterpretq_f32_u32): Likewise.
      	(__arm_vreinterpretq_f32_u64): Likewise.
      	(__arm_vreinterpretq_f32_u8): Likewise.
      	(vuninitializedq): Define polymorphic variant.
      	(vreinterpretq_f16): Likewise.
      	(vreinterpretq_f32): Likewise.
      	(vreinterpretq_s16): Likewise.
      	(vreinterpretq_s32): Likewise.
      	(vreinterpretq_s64): Likewise.
      	(vreinterpretq_s8): Likewise.
      	(vreinterpretq_u16): Likewise.
      	(vreinterpretq_u32): Likewise.
      	(vreinterpretq_u64): Likewise.
      	(vreinterpretq_u8): Likewise.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vuninitializedq_float.c: New test.
      	* gcc.target/arm/mve/intrinsics/vuninitializedq_float1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vuninitializedq_int.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vuninitializedq_int1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_s64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_u64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vuninitializedq_float.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vuninitializedq_float1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vuninitializedq_int.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vuninitializedq_int1.c: Likewise.
      Srinath Parvathaneni committed
    • [ARM][GCC][6x]: MVE ACLE vaddq intrinsics using arithmetic plus operator. · 3eff57aa
      This patch supports the following MVE ACLE vaddq intrinsics.  The RTL patterns for these intrinsics are added using the arithmetic "plus" operator.
      
      vaddq_s8, vaddq_s16, vaddq_s32, vaddq_u8, vaddq_u16, vaddq_u32, vaddq_f16, vaddq_f32.
      
      Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* config/arm/arm_mve.h (vaddq_s8): Define macro.
      	(vaddq_s16): Likewise.
      	(vaddq_s32): Likewise.
      	(vaddq_u8): Likewise.
      	(vaddq_u16): Likewise.
      	(vaddq_u32): Likewise.
      	(vaddq_f16): Likewise.
      	(vaddq_f32): Likewise.
      	(__arm_vaddq_s8): Define intrinsic.
      	(__arm_vaddq_s16): Likewise.
      	(__arm_vaddq_s32): Likewise.
      	(__arm_vaddq_u8): Likewise.
      	(__arm_vaddq_u16): Likewise.
      	(__arm_vaddq_u32): Likewise.
      	(__arm_vaddq_f16): Likewise.
      	(__arm_vaddq_f32): Likewise.
      	(vaddq): Define polymorphic variant.
      	* config/arm/iterators.md (VNIM): Define mode iterator for common types
      	Neon, IWMMXT and MVE.
      	(VNINOTM): Likewise.
      	* config/arm/mve.md (mve_vaddq<mode>): Define RTL pattern.
      	(mve_vaddq_f<mode>): Define RTL pattern.
      	* config/arm/neon.md (add<mode>3): Rename to addv4hf3 RTL pattern.
      	(addv8hf3_neon): Define RTL pattern.
      	* config/arm/vec-common.md (add<mode>3): Modify standard add RTL pattern
      	to support MVE.
      	(addv8hf3): Define standard RTL pattern for MVE and Neon.
      	(add<mode>3): Modify existing standard add RTL pattern for Neon and IWMMXT.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vaddq_f16.c: New test.
      	* gcc.target/arm/mve/intrinsics/vaddq_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_u8.c: Likewise.
      Srinath Parvathaneni committed
    • Fix correct offset in ipa_get_jf_ancestor_result. · 7d4549b2
      	PR ipa/94232
      	* ipa-cp.c (ipa_get_jf_ancestor_result): Use offset in bytes.
      	Previously the build_ref_for_offset function was used and it
      	converted the offset from bits to bytes.
      Martin Liska committed
    • tree-optimization/94266 - fix object type extraction heuristics · 8fefa21f
      This fixes the heuristic that derives an actual object type from a
      MEM_REF's pointer operand so that it uses the more sensible type of an
      actual object instead of the pointed-to type.
      
      2020-03-20  Richard Biener  <rguenther@suse.de>
      
      	PR tree-optimization/94266
      	* gimple-ssa-sprintf.c (get_origin_and_offset): Use the
      	type of the underlying object to adjust for the containing
      	field if available.
      Richard Biener committed
    • gcc, Arm: Revert changes to {get,set}_fpscr · 719c8642
      MVE made changes to {get,set}_fpscr to enable the compiler to optimize
      away unnecessary gets and sets when using these for intrinsics that use
      and/or write the carry bit.  However, these actually get and set the full
      FPSCR register and are used by the fp env intrinsics to modify the fp
      context, so MVE should not be using them.
      
      gcc/ChangeLog:
      2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>
      
      	* config/arm/unspecs.md (UNSPEC_GET_FPSCR): Rename this to ...
      	(VUNSPEC_GET_FPSCR): ... this, and move it to vunspec.
      	* config/arm/vfp.md: (get_fpscr, set_fpscr): Revert to old patterns.
      Andre Simoes Dias Vieira committed
    • gcc, Arm: Fix testisms for MVE testsuite · 005f6fc5
      This patch fixes some testisms where -mfpu=auto was missing or where we
      could end up with -mfloat-abi=hard and soft on the same command line.
      
      gcc/testsuite/ChangeLog:
      2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/mve_fp_fpu1.c: Fix testisms.
      	* gcc.target/arm/mve/intrinsics/mve_fp_fpu2.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_fpu1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_fpu2.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_fpu3.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_libcall1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_libcall2.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_float.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_float1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_float2.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_int.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_int1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_int2.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_uint.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_uint1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_uint2.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshrntq_m_n_u32.c: Likewise.
      Andre Simoes Dias Vieira committed
    • gcc, Arm: Fix MVE move from GPR -> GPR · 0efe7d87
      This patch fixes the pattern mve_mov for the case where both MVE vectors are in
      R registers and the move does not get optimized away.  I use the same approach
      as we do for NEON, where we use four register moves.
      
      gcc/ChangeLog:
      2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>
      
      	* config/arm/mve.md (mve_mov<mode>): Fix R->R case.
      
      gcc/testsuite/ChangeLog:
      2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/mve_move_gpr_to_gpr.c: New test.
      Andre Simoes Dias Vieira committed
    • store-merging: Fix up -fnon-call-exceptions handling [PR94224] · 4119cd69
      When we are adding a single store into a store group, we are already
      checking that store->lp_nr matches, but we have also code to add further
      INTEGER_CST stores into the group right away if the ordering requires that
      either we put there all or none from a certain set of stores.  And in those
      cases we weren't doing these lp_nr checks, which means we could end up with
      stores with different lp_nr in the same group, which then ICEs during
      output_merged_store.
      
      2020-03-20  Jakub Jelinek  <jakub@redhat.com>
      
      	PR tree-optimization/94224
      	* gimple-ssa-store-merging.c
      	(imm_store_chain_info::coalesce_immediate): Don't consider overlapping
      	or adjacent INTEGER_CST rhs_code stores as mergeable if they have
      	different lp_nr.
      
      	* g++.dg/tree-ssa/pr94224.C: New test.
      Jakub Jelinek committed
    • gcc, Arm: Fix no_cond issue introduced by MVE · 05009698
      This was a matter of mistaken logic in (define_attr "conds" ..), which
      was setting the conds attribute for any Neon instruction to no_cond and
      thereby messing up code generation.
      
      gcc/ChangeLog:
      2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>
      
      	* config/arm/arm.md (define_attr "conds"): Fix logic for neon and mve.
      Andre Simoes Dias Vieira committed
    • [rs6000] Rewrite the declaration of a variable · 4a18f168
      Rewrite the declaration of toc_section, moving it from the source file
      rs6000.c to its header file to standardize the code.
      
      Bootstrap and regression were done on powerpc64le-linux-gnu (LE) with no
      regressions.
      
      gcc/ChangeLog
      
      2020-03-20  Bin Bin Lv  <shlb.linux.ibm.com>
      
      	* config/rs6000/rs6000-internal.h (toc_section): Remove the
      	declaration.
      	* config/rs6000/rs6000.h (toc_section): Add the declaration.
      	* config/rs6000/rs6000.c (toc_section): Remove the declaration.
      Bin Bin Lv committed
    • c++: Avoid unnecessary empty class copy [94175]. · 94e24187
      A simple empty class copy is still simple when wrapped in a TARGET_EXPR, so
      we need to strip that as well.  This change also exposed some unnecessary
      copies in return statements, which when returning by invisible reference led
      to <RETURN_EXPR <MEM_REF <RESULT_DECL>>>, which gimplify_return_expr didn't
      like.  So we also need to strip the _REF when we eliminate the INIT_EXPR.
      
      gcc/cp/ChangeLog
      2020-03-19  Jason Merrill  <jason@redhat.com>
      
      	PR c++/94175
      	* cp-gimplify.c (simple_empty_class_p): Look through
      	SIMPLE_TARGET_EXPR_P.
      	(cp_gimplify_expr) [MODIFY_EXPR]: Likewise.
      	[RETURN_EXPR]: Avoid producing 'return *retval;'.
      	* call.c (build_call_a): Strip TARGET_EXPR from empty class arg.
      	* cp-tree.h (SIMPLE_TARGET_EXPR_P): Check that TARGET_EXPR_INITIAL
      	is non-null.
      Jason Merrill committed
    • Daily bump. · 3373d3e3
      GCC Administrator committed
  2. 19 Mar, 2020 12 commits
    • Fix cgraph_node::function_symbol availability computation [PR94202] · f7dceb4e
      This fixes an ICE in the inliner cache sanity check which is caused by a
      very old bug in the visibility calculation in cgraph_node::function_symbol
      and cgraph_node::function_or_virtual_thunk_symbol.

      In the testcase there is an indirect call to a thunk.  At the beginning we
      correctly see its body as AVAIL_AVAILABLE, but later we inline into the
      thunk and this turns it into AVAIL_INTERPOSABLE.

      This is because function_symbol incorrectly overwrites the availability
      parameter with the availability of the alias used in the call within the
      thunk, which is a local alias.
      
      gcc/ChangeLog:
      
      2020-03-19  Jan Hubicka  <hubicka@ucw.cz>
      
      	PR ipa/94202
      	* cgraph.c (cgraph_node::function_symbol): Fix availability computation.
      	(cgraph_node::function_or_virtual_thunk_symbol): Likewise.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-19  Jan Hubicka  <hubicka@ucw.cz>
      
      	PR ipa/94202
      	* g++.dg/torture/pr94202.C: New test.
      Jan Hubicka committed
    • c: Fix up cfun->function_end_locus from the C FE [PR94029] · 9def91e9
      On the following testcase we ICE because while
            DECL_STRUCT_FUNCTION (current_function_decl)->function_start_locus
              = c_parser_peek_token (parser)->location;
      and similarly DECL_SOURCE_LOCATION (fndecl) is set from some token's
      location, the end is set as:
        /* Store the end of the function, so that we get good line number
           info for the epilogue.  */
        cfun->function_end_locus = input_location;
      and the thing is that input_location is only very rarely set in the C FE
      (the primary spot that changes it is the cb_line_change/fe_file_change).
      Which means, e.g. for pretty much all C functions that are on a single line,
      function_start_locus column is > than function_end_locus column, and the
      testcase even has smaller line in function_end_locus because cb_line_change
      isn't performed while parsing multi-line arguments of a function-like macro.
      
      Attached are two possible fixes to achieve what the C++ FE does, in
      particular that cfun->function_end_locus is the locus of the closing } of
      the function.  The first one updates input_location when we see a closing }
      of a compound statement (though any, not just the function body) and thus
      input_location in the finish_function call is what we need.
      The second instead propagates the location_t from the parsing of the
      outermost compound statement (the function body) to finish_function.
      The second one is this version.
      
      2020-03-19  Jakub Jelinek  <jakub@redhat.com>
      
      	PR gcov-profile/94029
      	* c-tree.h (finish_function): Add location_t argument defaulted to
      	input_location.
      	* c-parser.c (c_parser_compound_statement): Add endlocp argument and
      	set it to the locus of closing } if non-NULL.
      	(c_parser_compound_statement_nostart): Return locus of closing }.
      	(c_parser_parse_rtl_body): Likewise.
      	(c_parser_declaration_or_fndef): Propagate locus of closing } to
      	finish_function.
      	* c-decl.c (finish_function): Add end_loc argument, use it instead of
      	input_location to set function_end_locus.
      
      	* gcc.misc-tests/gcov-pr94029.c: New test.
      Jakub Jelinek committed
    • d/dmd: Merge upstream dmd d1a606599 · 37482edc
      Fixes a long-standing regression in the D front-end implementation, and
      adds a new field to allow retrieving a list of all content imports from
      the code generator.
      
      Reviewed-on: https://github.com/dlang/dmd/pull/10913
      	     https://github.com/dlang/dmd/pull/10933
      Iain Buclaw committed
    • Fix inliner ICE on alias with flatten attribute [PR92372] · f22712bd
      gcc/ChangeLog:
      
      2020-03-19  Jan Hubicka  <hubicka@ucw.cz>
      
      	PR ipa/92372
      	* cgraphunit.c (process_function_and_variable_attributes): Warn
      	for flatten attribute on alias.
      	* ipa-inline.c (ipa_inline): Do not ICE on flatten attribute on alias.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-19  Jan Hubicka  <hubicka@ucw.cz>
      
      	PR ipa/92372
      	* gcc.c-torture/pr92372.c: New test.
      	* gcc.dg/attr-flatten-1.c: New test.
      Jan Hubicka committed
    • API extension for binutils (type of symbols). · c8429c2a
      	* lto-section-in.c: Add ext_symtab.
      	* lto-streamer-out.c (write_symbol_extension_info): New.
      	(produce_symtab_extension): New.
      	(produce_asm_for_decls): Stream also produce_symtab_extension.
      	* lto-streamer.h (enum lto_section_type): New section.
      	* lto-symtab.h (enum gcc_plugin_symbol_type): New.
      	(enum gcc_plugin_symbol_section_kind): Likewise.
      	* lto-plugin.c (LTO_SECTION_PREFIX): Rename to ...
      	(LTO_SYMTAB_PREFIX): ... this.
      	(LTO_SECTION_PREFIX_LEN): Rename to ...
      	(LTO_SYMTAB_PREFIX_LEN): ... this.
      	(LTO_SYMTAB_EXT_PREFIX): New.
      	(LTO_SYMTAB_EXT_PREFIX_LEN): New.
      	(LTO_LTO_PREFIX): New.
      	(LTO_LTO_PREFIX_LEN): New.
      	(parse_table_entry): Fill up unused to zero.
      	(parse_table_entry_extension): New.
      	(parse_symtab_extension): New.
      	(finish_conflict_resolution): Change type
      	for resolution.
      	(process_symtab): Use new macro name.
      	(process_symtab_extension): New.
      	(claim_file_handler): Also call process_symtab_extension.
      	(onload): Call new add_symbols_v2.
      Martin Liska committed
    • Update include/plugin-api.h. · f5389e17
      	* plugin-api.h (struct ld_plugin_symbol): Split the int def field
      	into 4 char fields.
      	(enum ld_plugin_symbol_type): New.
      	(enum ld_plugin_symbol_section_kind): New.
      	(enum ld_plugin_tag): Add LDPT_ADD_SYMBOLS_V2.
      Martin Liska committed
    • c++: Fix up handling of captured vars in lambdas in OpenMP clauses [PR93931] · 02f7334a
      Without the parser.c change we were ICEing on the testcase, because while the
      uses of the captured vars inside of the constructs were replaced with capture
      proxy decls, we didn't do that for decls in OpenMP clauses.
      
      With that fixed, we don't ICE anymore, but the testcase is miscompiled and FAILs
      at runtime.  This is because the capture proxy decls have DECL_VALUE_EXPR and
      during gimplification we were gimplifying those to their DECL_VALUE_EXPRs.
      That is fine for shared vars, but for privatized ones we must not do that.
      So that is what the cp-gimplify.c changes do.  Had to add a DECL_CONTEXT check
      before calling is_capture_proxy because some VAR_DECLs don't have DECL_CONTEXT
      set (yet) and is_capture_proxy relies on it always being non-NULL.
      
      2020-03-19  Jakub Jelinek  <jakub@redhat.com>
      
      	PR c++/93931
      	* parser.c (cp_parser_omp_var_list_no_open): Call process_outer_var_ref
      	on outer_automatic_var_p decls.
      	* cp-gimplify.c (cxx_omp_disregard_value_expr): Return true also for
      	capture proxy decls.
      
      	* testsuite/libgomp.c++/pr93931.C: New test.
      Jakub Jelinek committed
    • libgomp/testsuite: ignore blank-line output for function-not-offloaded.c · bb83e069
      	* testsuite/libgomp.c-c++-common/function-not-offloaded.c: Add
      	dg-allow-blank-lines-in-output.
      Tobias Burnus committed
    • phiopt: Avoid -fcompare-debug bug in phiopt [PR94211] · c7e90196
      Two years ago, I added support for up to 2 simple preparation statements
      in value_replacement, but the
      -      && estimate_num_insns (assign, &eni_time_weights)
      +      && estimate_num_insns (bb_seq (middle_bb), &eni_time_weights)
      change, meant to compute the cost of all those statements rather than
      just the single assign that had been the only supported non-debug
      statement in the bb before, doesn't do what I thought it would:
      gimple_seq is just gimple *, so estimate_num_insns can't really be
      overloaded on whether it is passed a single gimple * or a whole
      sequence.  As a result, for the last two years it has counted only the
      first statement rather than all of them.  With -g that happens to be a
      DEBUG_STMT, or it could be e.g. the first preparation statement, which
      could be much cheaper than the actual assign.
      
      2020-03-19  Jakub Jelinek  <jakub@redhat.com>
      
      	PR tree-optimization/94211
      	* tree-ssa-phiopt.c (value_replacement): Use estimate_num_insns_seq
      	instead of estimate_num_insns for bb_seq (middle_bb).  Rename
      	emtpy_or_with_defined_p variable to empty_or_with_defined_p, adjust
      	all uses.
      
      	* gcc.dg/pr94211.c: New test.
      Jakub Jelinek committed
    • ipa/94217 simplify offsetted address build · f3280e4c
      This avoids using build_ref_for_offset and build_fold_addr_expr
      where type mixup easily results in something not IP invariant.
      
      2020-03-19  Richard Biener  <rguenther@suse.de>
      
      	PR ipa/94217
      	* ipa-cp.c (ipa_get_jf_ancestor_result): Avoid build_fold_addr_expr
      	and build_ref_for_offset.
      Richard Biener committed
    • middle-end/94216 fix another build_fold_addr_expr use · 73bc09fa
      2020-03-19  Richard Biener  <rguenther@suse.de>
      
      	PR middle-end/94216
      	* fold-const.c (fold_binary_loc): Avoid using
      	build_fold_addr_expr when we really want an ADDR_EXPR.
      
      	* g++.dg/torture/pr94216.C: New testcase.
      Richard Biener committed
    • Daily bump. · b5562f11
      GCC Administrator committed
  3. 18 Mar, 2020 4 commits
    • libstdc++: Fix is_trivially_constructible (PR 94033) · b3341826
      This attempts to make is_nothrow_constructible more robust (and
      efficient to compile) by not depending on is_constructible. Instead the
      __is_constructible intrinsic is used directly. The helper class
      __is_nt_constructible_impl which checks whether the construction is
      non-throwing now takes a bool template parameter that is substituted by
      the result of the intrinsic. This fixes the reported bug by not using
      the already-instantiated (and incorrect) value of std::is_constructible.
      I don't think it really fixes the problem in general, because
      std::is_nothrow_constructible itself could already have been
      instantiated in a context where it gives the wrong result. A proper fix
      needs to be done in the compiler.
      
      	PR libstdc++/94033
      	* include/std/type_traits (__is_nt_default_constructible_atom): Remove.
      	(__is_nt_default_constructible_impl): Remove.
      	(__is_nothrow_default_constructible_impl): Remove.
      	(__is_nt_constructible_impl): Add bool template parameter. Adjust
      	partial specializations.
      	(__is_nothrow_constructible_impl): Replace class template with alias
      	template.
      	(is_nothrow_default_constructible): Derive from alias template
      	__is_nothrow_constructible_impl instead of
      	__is_nothrow_default_constructible_impl.
      	* testsuite/20_util/is_nothrow_constructible/94003.cc: New test.
      Jonathan Wakely committed
    • rs6000: Add back some w* constraints (PR91886) · 07fe4af4
      In May and June last year I deleted many of our (vector) constraints.
      We can now just use "wa" for those, together with some other
      conditions, which can be per alternative using the "enabled" attribute
      (which in turn primarily uses the "isa" attribute).
      
      But, it turns out that Clang implements some of those constraints as
      well, and at least musl uses some of them.  It is easy for us to add
      those constraints back (as undocumented aliases to "wa", which always
      did mean the same thing for valid inline assembler code), so do that.
      
      gcc/
      	* config/rs6000/constraints.md (wd, wf, wi, ws, ww): New undocumented
      	aliases for "wa".
      Segher Boessenkool committed
    • Complete change to resolve pr90275. · 529ea7d9
      	PR rtl-optimization/90275
      	* cse.c (cse_insn): Delete no-op register moves too.
      Jeff Law committed
    • PR ipa/92799 - ICE on a weakref function definition followed by a declaration · 3512dc01
      gcc/testsuite/ChangeLog:
      
      	PR ipa/92799
      	* gcc.dg/attr-weakref-5.c: New test.
      
      gcc/ChangeLog:
      
      	PR ipa/92799
      	* cgraphunit.c (process_function_and_variable_attributes): Also
      	complain about weakref function definitions and drop all effects
      	of the attribute.
      Martin Sebor committed