1. 23 Mar, 2020 1 commit
  2. 22 Mar, 2020 4 commits
    • Daily bump. · 83aa5aa3
      GCC Administrator committed
    • d: Generate phony targets for content imported files (PR93038) · fbe60463
      This is in addition to the last change which started including them in
      the make dependency list.
      
      gcc/d/ChangeLog:
      
      2020-03-22  Iain Buclaw  <ibuclaw@gdcproject.org>
      
      	PR d/93038
      	* d-lang.cc (deps_write): Generate phony targets for content imported
      	files.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-22  Iain Buclaw  <ibuclaw@gdcproject.org>
      
      	PR d/93038
      	* gdc.dg/pr93038b.d: New test.
      Iain Buclaw committed
    • Darwin: Fix i686 bootstrap when the assembler supports GOTOFF in data. · 85e10e4f
      When we use an assembler that supports " .long XX@GOTOFF", the current
      combination of configuration parameters and conditional compilation
      (when building an i686-darwin compiler with mdynamic-no-pic) assumes that
      it's OK to put jump tables in the .const section.
      
      However, when we encounter a weak function with a jump table, this
      produces relocations that directly access the weak symbol section from
      the .const section - which is deemed illegal by the linker (since that
      would mean that the weak symbol could not be replaced).
      
      Arguably, this is a limitation (maybe even a bug) in the linker - but
      it seems that we'd have to change the ABI to fix it - since it would
      require some annotation (maybe just using a special section for the
      jump tables) to tell the linker that this specific circumstance is OK
      because the direct access to the weak symbol can only occur from that
      symbol itself.
      
      The fix is to force jump tables into the text section for all X86 Darwin
      versions (PIC code already had this change).
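
      As a hedged illustration (not code taken from the PR), the kind of input
      that exposes this is a weak function whose switch is large enough for the
      compiler to emit a jump table:

        /* Hypothetical example: a weak function with enough cases that a jump
           table is emitted.  Placing that table in .const would create direct
           relocations against the weak symbol's section, which the Darwin
           linker rejects; keeping the table in .text avoids the issue.  */
        __attribute__((weak)) int classify (int x)
        {
          switch (x)
            {
            case 0: return 10;
            case 1: return 11;
            case 2: return 12;
            case 3: return 13;
            case 4: return 14;
            case 5: return 15;
            default: return -1;
            }
        }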
      
      gcc/ChangeLog:
      
      2020-03-22  Iain Sandoe  <iain@sandoe.co.uk>
      
      	* config/i386/darwin.h (JUMP_TABLES_IN_TEXT_SECTION): Remove
      	references to Darwin.
      	* config/i386/i386.h (JUMP_TABLES_IN_TEXT_SECTION): Define this
      	unconditionally and comment on why.
      Iain Sandoe committed
    • testsuite: Fix lambda-vis.C for targets with user label prefix '_'. · 88d7d0ce
      This prepends an optional match for the additional USER_LABEL_PREFIX
      to the scan assembler checks.
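
      A sketch of the idea, using a made-up mangled name rather than the test's
      real symbols:

        // Hypothetical directive, not taken from lambda-vis.C: the optional
        // leading "_?" lets the same pattern match both plain targets and
        // targets whose USER_LABEL_PREFIX is '_'.
        // { dg-final { scan-assembler {_?_Z3foov} } }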
      
      2020-03-22  Iain Sandoe  <iain@sandoe.co.uk>
      
      	* g++.dg/abi/lambda-vis.C: Amend assembler match
      	strings for targets using a USER_LABEL_PREFIX.
      Iain Sandoe committed
  3. 21 Mar, 2020 11 commits
    • d: Fix missing dependencies in depfile for imported files (PR93038) · 4a01f7b1
      A new field for tracking imported files was added to the front-end; this
      change makes use of it by writing all such files to the make dependency list.
      
      gcc/d/ChangeLog:
      
      2020-03-22  Iain Buclaw  <ibuclaw@gdcproject.org>
      
      	PR d/93038
      	* d-lang.cc (deps_write): Add content imported files to the make
      	dependency list.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-22  Iain Buclaw  <ibuclaw@gdcproject.org>
      
      	PR d/93038
      	* gdc.dg/fileimports/pr93038.txt: New test.
      	* gdc.dg/pr93038.d: New test.
      Iain Buclaw committed
    • libstdc++: Fix experimental::path::generic_string (PR 93245) · a577c0c2
      This function was unimplemented, simply returning the native format
      string instead.
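
      A small usage sketch of the overload named in the ChangeLog below; it
      assumes a target whose native format differs from the generic one (e.g.
      Windows-style '\' separators):

        #include <experimental/filesystem>
        #include <string>

        // The templated overload must return the generic format (with '/'
        // separators) rather than just echoing the native string; e.g. for a
        // native "a\\b\\c" the expected result is "a/b/c".
        std::string as_generic (const std::experimental::filesystem::path &p)
        {
          return p.generic_string<char> ();
        }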
      
      	PR libstdc++/93245
      	* include/experimental/bits/fs_path.h (path::generic_string<C,T,A>()):
      	* testsuite/experimental/filesystem/path/generic/generic_string.cc:
      	Improve test coverage.
      Jonathan Wakely committed
    • libstdc++: Fix path::generic_string allocator handling (PR 94242) · 9fc98511
      It's not possible to construct a path::string_type from an allocator of
      a different type. Create the correct specialization of basic_string, and
      adjust path::_S_str_convert to use a basic_string_view so that it is
      independent of the allocator type.
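
      A usage sketch of the allocator-aware overload; the explicit
      std::allocator<char> argument stands in for whatever allocator type a
      caller might pass:

        #include <filesystem>
        #include <memory>
        #include <string>

        // After the fix the returned string is built as
        // std::basic_string<char, std::char_traits<char>, Alloc>, i.e. with
        // the requested allocator, instead of path::string_type's allocator.
        std::string generic_with_alloc (const std::filesystem::path &p)
        {
          std::allocator<char> alloc;
          return p.generic_string<char, std::char_traits<char>,
                                  std::allocator<char>> (alloc);
        }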
      
      	PR libstdc++/94242
      	* include/bits/fs_path.h (path::_S_str_convert): Replace first
      	parameter with basic_string_view so that strings with different
      	allocators can be accepted.
      	(path::generic_string<C,T,A>()): Use basic_string object that uses the
      	right allocator type.
      	* testsuite/27_io/filesystem/path/generic/94242.cc: New test.
      	* testsuite/27_io/filesystem/path/generic/generic_string.cc: Improve
      	test coverage.
      Jonathan Wakely committed
    • Darwin: Handle NULL DECL_SIZE_TYPE in machopic_select_section (PR94237). · dfb25dfe
      A recent change in the LTO streaming arrangement means that it is
      now possible for machopic_select_section () to be called with a NULL
      value for DECL_SIZE_TYPE - corresponding to an incomplete or not-yet-
      laid out type.
      
      When section anchors are present, and we are generating assembler, we
      normally need to know the object size when choosing the section, since
      zero-sized objects must be placed in sections that forbid section
      anchors.
      
      In the current circumstance, the objective of the earlier streaming of
      this data is to allow nm to distinguish BSS from Data symbols (when used
      with the LTO plugin).  Since Darwin does not yet make use of the plugin,
      this fix is a bit of future-proofing.  We now emit the 'generic' section
      for the decl (absent knowledge of its size), which will still be correct
      in distinguishing the BSS and Data cases.
      
      gcc/ChangeLog:
      
      2020-03-21  Iain Sandoe  <iain@sandoe.co.uk>
      
      	PR lto/94237
      	* config/darwin.c (darwin_mergeable_constant_section): Collect
      	section anchor checks into the caller.
      	(machopic_select_section): Collect section anchor checks into
      	the determination of 'effective zero-size' objects.  When the
      	size is unknown, assume it is non-zero, and thus return the
      	'generic' section for the DECL.
      Iain Sandoe committed
    • Darwin: Address translation comments (PR93694). · 837cece8
      This updates the options descriptions after feedback from
      a translator.
      
      gcc/ChangeLog:
      
      2020-03-21  Iain Sandoe  <iain@sandoe.co.uk>
      
      	PR target/93694
      	* gcc/config/darwin.opt: Amend options descriptions.
      Iain Sandoe committed
    • d: Fix ICE in add_symbol_to_partition_1, at lto/lto-partition.c:215 · 98eb7b2e
      This patch addresses two problems with TypeInfo initializer generation.
      
      1. D array fields pointing to compiler-generated data reference public
      symbols with no unique prefix, which can lead to duplicate definition
      errors in some hard-to-reduce cases.  To avoid name clashes,
      all symbols that are generated for TypeInfo initializers now use the
      assembler name of the TypeInfo decl as a prefix.
      
      2. An ICE would occur during the LTO pass because these same decls are
      considered to be part of the same comdat group as the TypeInfo decl that
      references them, despite not themselves being marked public or comdat.
      This resulted in decls being added to the LTRANS partition out of order,
      triggering an assert when add_symbol_to_partition_1 attempted to add
      them again.  To remedy, TREE_PUBLIC and DECL_COMDAT are now set on all
      generated symbols.
      
      gcc/d/ChangeLog:
      
      2020-03-21  Iain Buclaw  <ibuclaw@gdcproject.org>
      
      	PR d/94290
      	* typeinfo.cc (class TypeInfoVisitor): Replace type_ field with decl_.
      	(TypeInfoVisitor::TypeInfoVisitor): Set decl_.
      	(TypeInfoVisitor::result): Update.
      	(TypeInfoVisitor::internal_reference): New function.
      	(TypeInfoVisitor::layout_string): Use internal_reference.
      	(TypeInfoVisitor::visit (TypeInfoTupleDeclaration *)): Likewise.
      	(layout_typeinfo): Construct TypeInfoVisitor with typeinfo decl.
      	(layout_classinfo): Likewise.
      Iain Buclaw committed
    • c++: Reject changing active member of union during initialization [PR94066] · b599bf9d
      This patch adds a check to detect changing the active union member during
      initialization of another member of the union in cxx_eval_store_expression.  It
      uses the CONSTRUCTOR_NO_CLEARING flag as a proxy for whether the non-empty
      CONSTRUCTOR of UNION_TYPE we're assigning to is in the process of being
      initialized.
      
      This patch additionally fixes an issue in reduced_constant_expression_p where we
      were returning false for an uninitialized union with no active member.  This
      lets us correctly reject the uninitialized use in the testcase
      constexpr-union4.C that we weren't rejecting before.
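
      For background, a minimal sketch (not one of the new testcases) of the
      C++20 rule the check enforces: the active member of a union may be
      changed during constant evaluation, but only once the union object's
      initialization has completed.

        union U { int a; float b; };

        constexpr int ok ()
        {
          U u { .a = 1 };  // initialization completes with 'a' active
          u.b = 2.0f;      // changing the active member afterwards is fine (C++20)
          return static_cast<int> (u.b);
        }
        static_assert (ok () == 2);

        // What the new check diagnoses is the converse: an initializer that
        // switches the union's active member while the union itself is still
        // being initialized.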
      
      gcc/cp/ChangeLog:
      
      	PR c++/94066
      	* constexpr.c (reduced_constant_expression_p) [CONSTRUCTOR]: Properly
      	handle unions without an initializer.
      	(cxx_eval_component_reference): Emit a different diagnostic when the
      	constructor element corresponding to a union member is NULL.
      	(cxx_eval_bare_aggregate): When constructing a union, always set the
      	active union member before evaluating the initializer.  Relax assertion
      	that verifies the index of the constructor element we're initializing
      	hasn't been changed.
      	(cxx_eval_store_expression): Diagnose changing the active union member
      	while the union is in the process of being initialized.  After setting
      	an active union member, clear CONSTRUCTOR_NO_CLEARING on the underlying
      	CONSTRUCTOR.
      	(cxx_eval_constant_expression) [PLACEHOLDER_EXPR]: Don't re-reduce a
      	CONSTRUCTOR returned by lookup_placeholder.
      
      gcc/testsuite/ChangeLog:
      
      	PR c++/94066
      	* g++.dg/cpp1y/constexpr-union2.C: New test.
      	* g++.dg/cpp1y/constexpr-union3.C: New test.
      	* g++.dg/cpp1y/constexpr-union4.C: New test.
      	* g++.dg/cpp1y/constexpr-union5.C: New test.
      	* g++.dg/cpp1y/pr94066.C: New test.
      	* g++.dg/cpp1y/pr94066-2.C: New test.
      	* g++.dg/cpp1y/pr94066-3.C: New test.
      	* g++.dg/cpp2a/constexpr-union1.C: New test.
      Patrick Palka committed
    • lra: Tighten check for reloading paradoxical subregs [PR94052] · 497498c8
      simplify_operand_subreg tries to detect whether the allocation for
      a pseudo in a paradoxical subreg is also valid for the outer mode.
      The condition it used to check for an invalid combination was:
      
        else if (REG_P (reg)
      	   && REGNO (reg) >= FIRST_PSEUDO_REGISTER
      	   && (hard_regno = lra_get_regno_hard_regno (REGNO (reg))) >= 0
      	   && (hard_regno_nregs (hard_regno, innermode)
      	       < hard_regno_nregs (hard_regno, mode))
      	   && (regclass = lra_get_allocno_class (REGNO (reg)))
      	   && (type != OP_IN
      	       || !in_hard_reg_set_p (reg_class_contents[regclass],
      				      mode, hard_regno)
      	       || overlaps_hard_reg_set_p (lra_no_alloc_regs,
      					   mode, hard_regno)))
      
      I think there are two problems with this:
      
      (1) It never actually checks whether the hard register is valid for the
          outer mode (in the hard_regno_mode_ok sense).  If it isn't, any attempt
          to reload in the outer mode is likely to cycle, because the implied
          regno/mode combination will be just as invalid next time
          curr_insn_transform sees the subreg.
      
      (2) The check is valid for little-endian only.  For big-endian we need
          to move hard_regno backwards.
      
      Using simplify_subreg_regno should avoid both problems.
      
      As the existing comment says, IRA should always take subreg references
      into account when allocating hard registers, so this fix-up should only
      really be needed for pseudos allocated by LRA itself.
      
      gcc/
      2020-03-21  Richard Sandiford  <richard.sandiford@arm.com>
      
      	PR rtl-optimization/94052
      	* lra-constraints.c (simplify_operand_subreg): Reload the inner
      	register of a paradoxical subreg if simplify_subreg_regno fails
      	to give a valid hard register for the outer mode.
      
      gcc/testsuite/
      2020-03-21  Tamar Christina  <tamar.christina@arm.com>
      
      	PR target/94052
      	* gcc.target/aarch64/pr94052.C: New test.
      Richard Sandiford committed
    • Fix comma at end of enumerator list seen with -std=c++98. · 15711e83
      	* plugin-api.h (enum ld_plugin_symbol_type): Remove
      	comma after last value of an enum.
      	* lto-symtab.h (enum gcc_plugin_symbol_type): Likewise.
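
      For illustration (made-up names, not the plugin-api.h enumerators): with
      -std=c++98 -pedantic a comma after the final enumerator is diagnosed, so
      the headers now end their enums like this:

        enum example_symbol_kind
        {
          EXAMPLE_KIND_DEFINED,
          EXAMPLE_KIND_UNDEFINED   /* no trailing comma: accepted by C++98 */
        };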
      Martin Liska committed
    • Daily bump. · 84166020
      GCC Administrator committed
  4. 20 Mar, 2020 24 commits
    • sra: Cap number of sub-access propagations with a param (PR 93435) · 29f23ed7
      PR 93435 is a perfect SRA bomb.  It initializes an array of 16 chars
      element-wise, then uses that to initialize an aggregate that consists
      of four such arrays, uses that one to initialize an aggregate four times
      as big as the previous one, and so on all the way up to an aggregate of
      64kB.
      
      This causes the sub-access propagation across assignments to create
      thousands of byte-sized artificial accesses which are then eligible to
      be replaced - they do facilitate forward propagation, but there are
      enough of them that DSE never finishes.
      
      This patch avoids that situation by accounting how many of such
      replacements can be created per SRA candidate.  The default value of
      32 was just the largest power of two that did not slow down
      compilation of the testcase, but it should also hopefully be big
      enough for any reasonable input that might rely on the optimization.
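
      A rough sketch of the shape of the input described above (hypothetical
      types, not the PR's source, compiled with optimization so SRA runs); the
      new cap can be raised with --param sra-max-propagations=N if a workload
      legitimately needs more:

        /* Each level is built from four copies of the previous one, and each
           aggregate assignment lets sub-access propagation mint more byte-sized
           artificial accesses; the real testcase keeps going up to 64kB.  */
        struct level0 { char c[16]; };
        struct level1 { struct level0 e[4]; };   /*   64 bytes */
        struct level2 { struct level1 e[4]; };   /*  256 bytes */
        struct level3 { struct level2 e[4]; };   /* 1024 bytes */

        void
        build (struct level3 *out, const char *bytes)
        {
          struct level0 l0;
          for (int i = 0; i < 16; i++)
            l0.c[i] = bytes[i];

          struct level1 l1;
          for (int i = 0; i < 4; i++)
            l1.e[i] = l0;

          struct level2 l2;
          for (int i = 0; i < 4; i++)
            l2.e[i] = l1;

          for (int i = 0; i < 4; i++)
            out->e[i] = l2;
        }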
      
      2020-03-20  Martin Jambor  <mjambor@suse.cz>
      
      	PR tree-optimization/93435
      	* params.opt (sra-max-propagations): New parameter.
      	* tree-sra.c (propagation_budget): New variable.
      	(budget_for_propagation_access): New function.
      	(propagate_subaccesses_from_rhs): Use it.
      	(propagate_subaccesses_from_lhs): Likewise.
      	(propagate_all_subaccesses): Set up and destroy propagation_budget.
      
      	gcc/testsuite/
      	* gcc.dg/tree-ssa/pr93435.c: New test.
      Martin Jambor committed
    • Regenerate gcc.pot. · cc3afc9d
      	* gcc.pot: Regenerate.
      Joseph Myers committed
    • rs6000: Add command line and builtin compatibility check · 68dd5780
      2020-03-20  Carl Love  <cel@us.ibm.com>
      
      	PR target/87583
      	* gcc/config/rs6000/rs6000.c (rs6000_option_override_internal):
      	Add check for TARGET_FPRND for Power 7 or newer.
      Carl Love committed
    • Fix verifier ICE on wrong comdat local flag [PR93347] · 72b3bc89
      gcc/ChangeLog:
      
      2020-03-20  Jan Hubicka  <hubicka@ucw.cz>
      
      	PR ipa/93347
      	* cgraph.c (symbol_table::create_edge): Update calls_comdat_local flag.
      	(cgraph_edge::redirect_callee): Move here; likewise.
      	(cgraph_node::remove_callees): Update calls_comdat_local flag.
      	(cgraph_node::verify_node): Verify that calls_comdat_local flag match
      	reality.
      	(cgraph_node::check_calls_comdat_local_p): New member function.
      	* cgraph.h (cgraph_node::check_calls_comdat_local_p): Declare.
      	(cgraph_edge::redirect_callee): Move offline.
      	* ipa-fnsummary.c (compute_fn_summary): Do not compute
      	calls_comdat_local flag here.
      	* ipa-inline-transform.c (inline_call): Fix updating of
      	calls_comdat_local flag.
      	* ipa-split.c (split_function): Use true instead of 1 to set the flag.
      	* symtab.c (symtab_node::add_to_same_comdat_group): Update
      	calls_comdat_local flag.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Jan Hubicka  <hubicka@ucw.cz>
      
      	* g++.dg/torture/pr93347.C: New test.
      Jan Hubicka committed
    • adjust SLP tree dumping · a89349e6
      This also dumps the root node we eventually smuggle in.
      
      2020-03-20  Richard Biener  <rguenther@suse.de>
      
      	* tree-vect-slp.c (vect_analyze_slp_instance): Dump SLP tree
      	from the possibly modified root.
      Richard Biener committed
    • c++: Add testcases from PR c++/69694 · a23eff1b
      These testcases have been compiling successfully since 7.1.
      
      gcc/testsuite/ChangeLog:
      
      	PR c++/69694
      	* g++.dg/cpp0x/decltype74.C: New test.
      	* g++.dg/cpp0x/decltype75.C: New test.
      Patrick Palka committed
    • [ARM][GCC][11x]: MVE ACLE vector interleaving store and deinterleaving load intrinsics and also aliases to vstr and vldr intrinsics. · 1dfcc3b5
      
      This patch supports the following MVE ACLE intrinsics, which are aliases of
      the vstr and vldr intrinsics.
      
      vst1q_p_u8, vst1q_p_s8, vld1q_z_u8, vld1q_z_s8, vst1q_p_u16, vst1q_p_s16,
      vld1q_z_u16, vld1q_z_s16, vst1q_p_u32, vst1q_p_s32, vld1q_z_u32, vld1q_z_s32,
      vld1q_z_f16, vst1q_p_f16, vld1q_z_f32, vst1q_p_f32.
      
      This patch also supports the following MVE ACLE vector deinterleaving loads
      and vector interleaving stores.
      
      vst2q_s8, vst2q_u8, vld2q_s8, vld2q_u8, vld4q_s8, vld4q_u8, vst2q_s16, vst2q_u16,
      vld2q_s16, vld2q_u16, vld4q_s16, vld4q_u16, vst2q_s32, vst2q_u32, vld2q_s32,
      vld2q_u32, vld4q_s32, vld4q_u32, vld4q_f16, vld2q_f16, vst2q_f16, vld4q_f32,
      vld2q_f32, vst2q_f32.
      
      Please refer to M-profile Vector Extension (MVE) intrinsics [1]  for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
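
      A short usage sketch (hypothetical helper, assuming an MVE-enabled target
      such as -march=armv8.1-m.main+mve): deinterleave 32 bytes with vld2q and
      store one half back under a tail predicate with vst1q_p.

        #include <arm_mve.h>

        /* Deinterleave 32 bytes: even-indexed bytes land in pair.val[0], odd
           ones in pair.val[1].  The predicated store writes only the first n
           lanes.  */
        void copy_even_bytes (uint8_t *dst, const uint8_t *src, unsigned n)
        {
          uint8x16x2_t pair = vld2q_u8 (src);
          mve_pred16_t p = vctp8q (n);
          vst1q_p_u8 (dst, pair.val[0], p);
        }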
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* config/arm/arm_mve.h (vst1q_p_u8): Define macro.
      	(vst1q_p_s8): Likewise.
      	(vst2q_s8): Likewise.
      	(vst2q_u8): Likewise.
      	(vld1q_z_u8): Likewise.
      	(vld1q_z_s8): Likewise.
      	(vld2q_s8): Likewise.
      	(vld2q_u8): Likewise.
      	(vld4q_s8): Likewise.
      	(vld4q_u8): Likewise.
      	(vst1q_p_u16): Likewise.
      	(vst1q_p_s16): Likewise.
      	(vst2q_s16): Likewise.
      	(vst2q_u16): Likewise.
      	(vld1q_z_u16): Likewise.
      	(vld1q_z_s16): Likewise.
      	(vld2q_s16): Likewise.
      	(vld2q_u16): Likewise.
      	(vld4q_s16): Likewise.
      	(vld4q_u16): Likewise.
      	(vst1q_p_u32): Likewise.
      	(vst1q_p_s32): Likewise.
      	(vst2q_s32): Likewise.
      	(vst2q_u32): Likewise.
      	(vld1q_z_u32): Likewise.
      	(vld1q_z_s32): Likewise.
      	(vld2q_s32): Likewise.
      	(vld2q_u32): Likewise.
      	(vld4q_s32): Likewise.
      	(vld4q_u32): Likewise.
      	(vld4q_f16): Likewise.
      	(vld2q_f16): Likewise.
      	(vld1q_z_f16): Likewise.
      	(vst2q_f16): Likewise.
      	(vst1q_p_f16): Likewise.
      	(vld4q_f32): Likewise.
      	(vld2q_f32): Likewise.
      	(vld1q_z_f32): Likewise.
      	(vst2q_f32): Likewise.
      	(vst1q_p_f32): Likewise.
      	(__arm_vst1q_p_u8): Define intrinsic.
      	(__arm_vst1q_p_s8): Likewise.
      	(__arm_vst2q_s8): Likewise.
      	(__arm_vst2q_u8): Likewise.
      	(__arm_vld1q_z_u8): Likewise.
      	(__arm_vld1q_z_s8): Likewise.
      	(__arm_vld2q_s8): Likewise.
      	(__arm_vld2q_u8): Likewise.
      	(__arm_vld4q_s8): Likewise.
      	(__arm_vld4q_u8): Likewise.
      	(__arm_vst1q_p_u16): Likewise.
      	(__arm_vst1q_p_s16): Likewise.
      	(__arm_vst2q_s16): Likewise.
      	(__arm_vst2q_u16): Likewise.
      	(__arm_vld1q_z_u16): Likewise.
      	(__arm_vld1q_z_s16): Likewise.
      	(__arm_vld2q_s16): Likewise.
      	(__arm_vld2q_u16): Likewise.
      	(__arm_vld4q_s16): Likewise.
      	(__arm_vld4q_u16): Likewise.
      	(__arm_vst1q_p_u32): Likewise.
      	(__arm_vst1q_p_s32): Likewise.
      	(__arm_vst2q_s32): Likewise.
      	(__arm_vst2q_u32): Likewise.
      	(__arm_vld1q_z_u32): Likewise.
      	(__arm_vld1q_z_s32): Likewise.
      	(__arm_vld2q_s32): Likewise.
      	(__arm_vld2q_u32): Likewise.
      	(__arm_vld4q_s32): Likewise.
      	(__arm_vld4q_u32): Likewise.
      	(__arm_vld4q_f16): Likewise.
      	(__arm_vld2q_f16): Likewise.
      	(__arm_vld1q_z_f16): Likewise.
      	(__arm_vst2q_f16): Likewise.
      	(__arm_vst1q_p_f16): Likewise.
      	(__arm_vld4q_f32): Likewise.
      	(__arm_vld2q_f32): Likewise.
      	(__arm_vld1q_z_f32): Likewise.
      	(__arm_vst2q_f32): Likewise.
      	(__arm_vst1q_p_f32): Likewise.
      	(vld1q_z): Define polymorphic variant.
      	(vld2q): Likewise.
      	(vld4q): Likewise.
      	(vst1q_p): Likewise.
      	(vst2q): Likewise.
      	* config/arm/arm_mve_builtins.def (STORE1): Use builtin qualifier.
      	(LOAD1): Likewise.
      	* config/arm/mve.md (mve_vst2q<mode>): Define RTL pattern.
      	(mve_vld2q<mode>): Likewise.
      	(mve_vld4q<mode>): Likewise.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vld1q_z_f16.c: New test.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld1q_z_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld2q_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vld4q_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst1q_p_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vst2q_u8.c: Likewise.
      Srinath Parvathaneni committed
    • d: Fix SEGV in hash_table<odr_name_hasher, false, xcallocator>::find_slot_with_hash · b5446d0c
      This patch fixes an LTO bug with the D front-end.  As DECL_ASSEMBLER_NAME
      is set on the TYPE_DECL, TYPE_CXX_ODR_P must also be set on the type.
      
      The addition of merge_aggregate_types is not strictly needed now, but it
      fixes a problem introduced in newer versions of the dmd front-end where
      templated types could be sent more than once to the D code generator.
      
      gcc/d/ChangeLog:
      
      2020-03-20  Iain Buclaw  <ibuclaw@gdcproject.org>
      
      	PR lto/91027
      	* d-tree.h (struct GTY): Add daggregate field.
      	(IDENTIFIER_DAGGREGATE): Define.
      	(d_mangle_decl): Add declaration.
      	* decl.cc (mangle_decl): Remove static linkage, rename to...
      	(d_mangle_decl): ...this, update all callers.
      	* types.cc (merge_aggregate_types): New function.
      	(TypeVisitor::visit (TypeStruct *)): Call merge_aggregate_types, set
      	IDENTIFIER_DAGGREGATE and TYPE_CXX_ODR_P.
      	(TypeVisitor::visit (TypeClass *)): Likewise.
      Iain Buclaw committed
    • c-family: Tighten vector handling in type_for_mode [PR94072] · 1aa22b19
      In this PR we had a 512-bit VECTOR_TYPE whose mode is XImode
      (an integer mode used for four 128-bit vectors).  When trying
      to expand a zero constant for it, we hit code in expand_expr_real_1
      that tries to use the associated integer type instead.  The code used
      type_for_mode (XImode, 1) to get this integer type.
      
      However, the c-family implementation of type_for_mode checks for
      any registered built-in type that matches the mode and has the
      right signedness.  This meant that it could return a built-in
      vector type when given an integer mode (particularly if, as here,
      the vector type isn't supported by the current subtarget and so
      TYPE_MODE != TYPE_MODE_RAW).  The expand code would then cycle
      endlessly trying to use this "new" type instead of the original
      vector type.
      
      2020-03-20  Richard Sandiford  <richard.sandiford@arm.com>
      
      gcc/c-family/
      	PR middle-end/94072
      	* c-common.c (c_common_type_for_mode): Before using a registered
      	built-in type, check that the vectorness of the type matches
      	the vectorness of the mode.
      
      gcc/testsuite/
      	PR middle-end/94072
      	* gcc.target/aarch64/pr94072.c: New test.
      Richard Sandiford committed
    • [ARM][GCC][10x]: MVE ACLE intrinsics "add with carry across beats" and "beat-wise subtract". · c3562f81
      This patch supports the following MVE ACLE "add with carry across beats" and "beat-wise subtract" intrinsics.
      
      vadciq_s32, vadciq_u32, vadciq_m_s32, vadciq_m_u32, vadcq_s32, vadcq_u32,
      vadcq_m_s32, vadcq_m_u32, vsbciq_s32, vsbciq_u32, vsbciq_m_s32, vsbciq_m_u32,
      vsbcq_s32, vsbcq_u32, vsbcq_m_s32, vsbcq_m_u32.
      
      Please refer to M-profile Vector Extension (MVE) intrinsics [1]  for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
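
      A usage sketch (hypothetical wrapper, MVE target assumed): vadciq starts
      with a zero carry-in and reports the carry-out, while vadcq consumes and
      updates the carry through its pointer argument, so multi-word additions
      can be chained.

        #include <arm_mve.h>

        /* Add two 256-bit values held as two uint32x4_t halves each; the carry
           produced by the low half feeds the addition of the high half.  */
        void add256 (uint32x4_t a_lo, uint32x4_t a_hi,
                     uint32x4_t b_lo, uint32x4_t b_hi,
                     uint32x4_t *r_lo, uint32x4_t *r_hi)
        {
          unsigned carry;
          *r_lo = vadciq_u32 (a_lo, b_lo, &carry);   /* carry-in is 0 */
          *r_hi = vadcq_u32 (a_hi, b_hi, &carry);    /* uses and updates carry */
        }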
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* config/arm/arm-builtins.c (ARM_BUILTIN_GET_FPSCR_NZCVQC): Define.
      	(ARM_BUILTIN_SET_FPSCR_NZCVQC): Likewise.
      	(arm_init_mve_builtins): Add "__builtin_arm_get_fpscr_nzcvqc" and
      	"__builtin_arm_set_fpscr_nzcvqc" to arm_builtin_decls array.
      	(arm_expand_builtin): Define case ARM_BUILTIN_GET_FPSCR_NZCVQC
      	and ARM_BUILTIN_SET_FPSCR_NZCVQC.
      	* config/arm/arm_mve.h (vadciq_s32): Define macro.
      	(vadciq_u32): Likewise.
      	(vadciq_m_s32): Likewise.
      	(vadciq_m_u32): Likewise.
      	(vadcq_s32): Likewise.
      	(vadcq_u32): Likewise.
      	(vadcq_m_s32): Likewise.
      	(vadcq_m_u32): Likewise.
      	(vsbciq_s32): Likewise.
      	(vsbciq_u32): Likewise.
      	(vsbciq_m_s32): Likewise.
      	(vsbciq_m_u32): Likewise.
      	(vsbcq_s32): Likewise.
      	(vsbcq_u32): Likewise.
      	(vsbcq_m_s32): Likewise.
      	(vsbcq_m_u32): Likewise.
      	(__arm_vadciq_s32): Define intrinsic.
      	(__arm_vadciq_u32): Likewise.
      	(__arm_vadciq_m_s32): Likewise.
      	(__arm_vadciq_m_u32): Likewise.
      	(__arm_vadcq_s32): Likewise.
      	(__arm_vadcq_u32): Likewise.
      	(__arm_vadcq_m_s32): Likewise.
      	(__arm_vadcq_m_u32): Likewise.
      	(__arm_vsbciq_s32): Likewise.
      	(__arm_vsbciq_u32): Likewise.
      	(__arm_vsbciq_m_s32): Likewise.
      	(__arm_vsbciq_m_u32): Likewise.
      	(__arm_vsbcq_s32): Likewise.
      	(__arm_vsbcq_u32): Likewise.
      	(__arm_vsbcq_m_s32): Likewise.
      	(__arm_vsbcq_m_u32): Likewise.
      	(vadciq_m): Define polymorphic variant.
      	(vadciq): Likewise.
      	(vadcq_m): Likewise.
      	(vadcq): Likewise.
      	(vsbciq_m): Likewise.
      	(vsbciq): Likewise.
      	(vsbcq_m): Likewise.
      	(vsbcq): Likewise.
      	* config/arm/arm_mve_builtins.def (BINOP_NONE_NONE_NONE): Use builtin
      	qualifier.
      	(BINOP_UNONE_UNONE_UNONE): Likewise.
      	(QUADOP_NONE_NONE_NONE_NONE_UNONE): Likewise.
      	(QUADOP_UNONE_UNONE_UNONE_UNONE_UNONE): Likewise.
      	* config/arm/mve.md (VADCIQ): Define iterator.
      	(VADCIQ_M): Likewise.
      	(VSBCQ): Likewise.
      	(VSBCQ_M): Likewise.
      	(VSBCIQ): Likewise.
      	(VSBCIQ_M): Likewise.
      	(VADCQ): Likewise.
      	(VADCQ_M): Likewise.
      	(mve_vadciq_m_<supf>v4si): Define RTL pattern.
      	(mve_vadciq_<supf>v4si): Likewise.
      	(mve_vadcq_m_<supf>v4si): Likewise.
      	(mve_vadcq_<supf>v4si): Likewise.
      	(mve_vsbciq_m_<supf>v4si): Likewise.
      	(mve_vsbciq_<supf>v4si): Likewise.
      	(mve_vsbcq_m_<supf>v4si): Likewise.
      	(mve_vsbcq_<supf>v4si): Likewise.
      	(get_fpscr_nzcvqc): Define insn.
      	(set_fpscr_nzcvqc): Define insn.
      	* config/arm/unspecs.md (UNSPEC_GET_FPSCR_NZCVQC): Define.
      	(UNSPEC_SET_FPSCR_NZCVQC): Define.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                 Andre Vieira  <andre.simoesdiasvieira@arm.com>
                 Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vadciq_m_s32.c: New test.
      	* gcc.target/arm/mve/intrinsics/vadciq_m_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vadciq_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vadciq_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vadcq_m_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vadcq_m_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vadcq_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vadcq_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbciq_m_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbciq_m_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbciq_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbciq_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbcq_m_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbcq_m_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbcq_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsbcq_u32.c: Likewise.
      Srinath Parvathaneni committed
    • c++: Include the constraint parameter mapping in diagnostic constraint contexts · 828878c3
      When diagnosing a constraint error, we currently try to print the constraint
      inside a diagnostic constraint context with its template arguments substituted
      in.  If substitution fails, then we instead just print the dependent form, as in
      the test case below:
      
        .../diagnostic6.C:14:15: error: static assertion failed
           14 | static_assert(E<int>); // { dg-error "static assertion failed|not a class" }
              |               ^~~~~~
        .../diagnostic6.C:14:15: note: constraints not satisfied
        .../diagnostic6.C:4:11:   required for the satisfaction of ‘C<T>’
        .../diagnostic6.C:8:11:   required for the satisfaction of ‘D<typename T::type>’
        .../diagnostic6.C:14:15: error: ‘int’ is not a class, struct, or union type
      
      But printing just the dependent form sometimes makes it difficult to understand
      the underlying failure.  In the above example, for instance, there's no
      indication of how the template argument 'int' relates to either of the 'T's.
      
      This patch improves the situation by changing these diagnostics to always print
      the dependent form of the constraint, and alongside it the (preferably
      substituted) constraint parameter mapping.  So with the same test case below we
      now get:
      
        .../diagnostic6.C:14:15: error: static assertion failed
           14 | static_assert(E<int>); // { dg-error "static assertion failed|not a class" }
              |               ^~~~~~
        .../diagnostic6.C:14:15: note: constraints not satisfied
        .../diagnostic6.C:4:11:   required for the satisfaction of ‘C<T>’ [with T = typename T::type]
        .../diagnostic6.C:8:11:   required for the satisfaction of ‘D<typename T::type>’ [with T = int]
        .../diagnostic6.C:14:15: error: ‘int’ is not a class, struct, or union type
      
      This change arguably makes it easier to figure out what's going on whenever a
      constraint fails due to substitution creating an invalid type rather than
      failing due to the constraint evaluating to false.
      
      gcc/cp/ChangeLog:
      
      	* cxx-pretty-print.c (pp_cxx_parameter_mapping): Make extern.  Move
      	the "[with ]" bits to here from ...
      	(pp_cxx_atomic_constraint): ... here.
      	* cxx-pretty-print.h (pp_cxx_parameter_mapping): Declare.
      	* error.c (rebuild_concept_check): Delete.
      	(print_concept_check_info): Print the dependent form of the constraint and the
      	preferably substituted parameter mapping alongside it.
      
      gcc/testsuite/ChangeLog:
      
      	* g++.dg/concepts/diagnostic6.C: New test.
      Patrick Palka committed
    • [ARM][GCC][9x]: MVE ACLE predicated intrinsics with (don't-care) variant. · 261014a1
      This patch supports the following MVE ACLE predicated intrinsics with the ``_x``
      (don't-care) variant, which indicates that the false-predicated lanes have
      undefined values.  These are syntactic sugar for merge intrinsics with a
      ``vuninitializedq`` inactive parameter.
      
      vabdq_x_f16, vabdq_x_f32, vabdq_x_s16, vabdq_x_s32, vabdq_x_s8, vabdq_x_u16, vabdq_x_u32, vabdq_x_u8,
      vabsq_x_f16, vabsq_x_f32, vabsq_x_s16, vabsq_x_s32, vabsq_x_s8, vaddq_x_f16, vaddq_x_f32, vaddq_x_n_f16,
      vaddq_x_n_f32, vaddq_x_n_s16, vaddq_x_n_s32, vaddq_x_n_s8, vaddq_x_n_u16, vaddq_x_n_u32, vaddq_x_n_u8,
      vaddq_x_s16, vaddq_x_s32, vaddq_x_s8, vaddq_x_u16, vaddq_x_u32, vaddq_x_u8, vandq_x_f16, vandq_x_f32,
      vandq_x_s16, vandq_x_s32, vandq_x_s8, vandq_x_u16, vandq_x_u32, vandq_x_u8, vbicq_x_f16, vbicq_x_f32,
      vbicq_x_s16, vbicq_x_s32, vbicq_x_s8, vbicq_x_u16, vbicq_x_u32, vbicq_x_u8, vbrsrq_x_n_f16,
      vbrsrq_x_n_f32, vbrsrq_x_n_s16, vbrsrq_x_n_s32, vbrsrq_x_n_s8, vbrsrq_x_n_u16, vbrsrq_x_n_u32,
      vbrsrq_x_n_u8, vcaddq_rot270_x_f16, vcaddq_rot270_x_f32, vcaddq_rot270_x_s16, vcaddq_rot270_x_s32,
      vcaddq_rot270_x_s8, vcaddq_rot270_x_u16, vcaddq_rot270_x_u32, vcaddq_rot270_x_u8, vcaddq_rot90_x_f16,
      vcaddq_rot90_x_f32, vcaddq_rot90_x_s16, vcaddq_rot90_x_s32, vcaddq_rot90_x_s8, vcaddq_rot90_x_u16,
      vcaddq_rot90_x_u32, vcaddq_rot90_x_u8, vclsq_x_s16, vclsq_x_s32, vclsq_x_s8, vclzq_x_s16, vclzq_x_s32,
      vclzq_x_s8, vclzq_x_u16, vclzq_x_u32, vclzq_x_u8, vcmulq_rot180_x_f16, vcmulq_rot180_x_f32,
      vcmulq_rot270_x_f16, vcmulq_rot270_x_f32, vcmulq_rot90_x_f16, vcmulq_rot90_x_f32, vcmulq_x_f16,
      vcmulq_x_f32, vcvtaq_x_s16_f16, vcvtaq_x_s32_f32, vcvtaq_x_u16_f16, vcvtaq_x_u32_f32, vcvtbq_x_f32_f16,
      vcvtmq_x_s16_f16, vcvtmq_x_s32_f32, vcvtmq_x_u16_f16, vcvtmq_x_u32_f32, vcvtnq_x_s16_f16,
      vcvtnq_x_s32_f32, vcvtnq_x_u16_f16, vcvtnq_x_u32_f32, vcvtpq_x_s16_f16, vcvtpq_x_s32_f32,
      vcvtpq_x_u16_f16, vcvtpq_x_u32_f32, vcvtq_x_f16_s16, vcvtq_x_f16_u16, vcvtq_x_f32_s32, vcvtq_x_f32_u32,
      vcvtq_x_n_f16_s16, vcvtq_x_n_f16_u16, vcvtq_x_n_f32_s32, vcvtq_x_n_f32_u32, vcvtq_x_n_s16_f16,
      vcvtq_x_n_s32_f32, vcvtq_x_n_u16_f16, vcvtq_x_n_u32_f32, vcvtq_x_s16_f16, vcvtq_x_s32_f32,
      vcvtq_x_u16_f16, vcvtq_x_u32_f32, vcvttq_x_f32_f16, vddupq_x_n_u16, vddupq_x_n_u32, vddupq_x_n_u8,
      vddupq_x_wb_u16, vddupq_x_wb_u32, vddupq_x_wb_u8, vdupq_x_n_f16, vdupq_x_n_f32, vdupq_x_n_s16,
      vdupq_x_n_s32, vdupq_x_n_s8, vdupq_x_n_u16, vdupq_x_n_u32, vdupq_x_n_u8, vdwdupq_x_n_u16, vdwdupq_x_n_u32,
      vdwdupq_x_n_u8, vdwdupq_x_wb_u16, vdwdupq_x_wb_u32, vdwdupq_x_wb_u8, veorq_x_f16, veorq_x_f32, veorq_x_s16,
      veorq_x_s32, veorq_x_s8, veorq_x_u16, veorq_x_u32, veorq_x_u8, vhaddq_x_n_s16, vhaddq_x_n_s32,
      vhaddq_x_n_s8, vhaddq_x_n_u16, vhaddq_x_n_u32, vhaddq_x_n_u8, vhaddq_x_s16, vhaddq_x_s32, vhaddq_x_s8,
      vhaddq_x_u16, vhaddq_x_u32, vhaddq_x_u8, vhcaddq_rot270_x_s16, vhcaddq_rot270_x_s32, vhcaddq_rot270_x_s8,
      vhcaddq_rot90_x_s16, vhcaddq_rot90_x_s32, vhcaddq_rot90_x_s8, vhsubq_x_n_s16, vhsubq_x_n_s32,
      vhsubq_x_n_s8, vhsubq_x_n_u16, vhsubq_x_n_u32, vhsubq_x_n_u8, vhsubq_x_s16, vhsubq_x_s32, vhsubq_x_s8,
      vhsubq_x_u16, vhsubq_x_u32, vhsubq_x_u8, vidupq_x_n_u16, vidupq_x_n_u32, vidupq_x_n_u8, vidupq_x_wb_u16,
      vidupq_x_wb_u32, vidupq_x_wb_u8, viwdupq_x_n_u16, viwdupq_x_n_u32, viwdupq_x_n_u8, viwdupq_x_wb_u16,
      viwdupq_x_wb_u32, viwdupq_x_wb_u8, vmaxnmq_x_f16, vmaxnmq_x_f32, vmaxq_x_s16, vmaxq_x_s32, vmaxq_x_s8,
      vmaxq_x_u16, vmaxq_x_u32, vmaxq_x_u8, vminnmq_x_f16, vminnmq_x_f32, vminq_x_s16, vminq_x_s32, vminq_x_s8,
      vminq_x_u16, vminq_x_u32, vminq_x_u8, vmovlbq_x_s16, vmovlbq_x_s8, vmovlbq_x_u16, vmovlbq_x_u8,
      vmovltq_x_s16, vmovltq_x_s8, vmovltq_x_u16, vmovltq_x_u8, vmulhq_x_s16, vmulhq_x_s32, vmulhq_x_s8,
      vmulhq_x_u16, vmulhq_x_u32, vmulhq_x_u8, vmullbq_int_x_s16, vmullbq_int_x_s32, vmullbq_int_x_s8,
      vmullbq_int_x_u16, vmullbq_int_x_u32, vmullbq_int_x_u8, vmullbq_poly_x_p16, vmullbq_poly_x_p8,
      vmulltq_int_x_s16, vmulltq_int_x_s32, vmulltq_int_x_s8, vmulltq_int_x_u16, vmulltq_int_x_u32,
      vmulltq_int_x_u8, vmulltq_poly_x_p16, vmulltq_poly_x_p8, vmulq_x_f16, vmulq_x_f32, vmulq_x_n_f16,
      vmulq_x_n_f32, vmulq_x_n_s16, vmulq_x_n_s32, vmulq_x_n_s8, vmulq_x_n_u16, vmulq_x_n_u32, vmulq_x_n_u8,
      vmulq_x_s16, vmulq_x_s32, vmulq_x_s8, vmulq_x_u16, vmulq_x_u32, vmulq_x_u8, vmvnq_x_n_s16, vmvnq_x_n_s32,
      vmvnq_x_n_u16, vmvnq_x_n_u32, vmvnq_x_s16, vmvnq_x_s32, vmvnq_x_s8, vmvnq_x_u16, vmvnq_x_u32, vmvnq_x_u8,
      vnegq_x_f16, vnegq_x_f32, vnegq_x_s16, vnegq_x_s32, vnegq_x_s8, vornq_x_f16, vornq_x_f32, vornq_x_s16,
      vornq_x_s32, vornq_x_s8, vornq_x_u16, vornq_x_u32, vornq_x_u8, vorrq_x_f16, vorrq_x_f32, vorrq_x_s16,
      vorrq_x_s32, vorrq_x_s8, vorrq_x_u16, vorrq_x_u32, vorrq_x_u8, vrev16q_x_s8, vrev16q_x_u8, vrev32q_x_f16,
      vrev32q_x_s16, vrev32q_x_s8, vrev32q_x_u16, vrev32q_x_u8, vrev64q_x_f16, vrev64q_x_f32, vrev64q_x_s16,
      vrev64q_x_s32, vrev64q_x_s8, vrev64q_x_u16, vrev64q_x_u32, vrev64q_x_u8, vrhaddq_x_s16, vrhaddq_x_s32,
      vrhaddq_x_s8, vrhaddq_x_u16, vrhaddq_x_u32, vrhaddq_x_u8, vrmulhq_x_s16, vrmulhq_x_s32, vrmulhq_x_s8,
      vrmulhq_x_u16, vrmulhq_x_u32, vrmulhq_x_u8, vrndaq_x_f16, vrndaq_x_f32, vrndmq_x_f16, vrndmq_x_f32,
      vrndnq_x_f16, vrndnq_x_f32, vrndpq_x_f16, vrndpq_x_f32, vrndq_x_f16, vrndq_x_f32, vrndxq_x_f16,
      vrndxq_x_f32, vrshlq_x_s16, vrshlq_x_s32, vrshlq_x_s8, vrshlq_x_u16, vrshlq_x_u32, vrshlq_x_u8,
      vrshrq_x_n_s16, vrshrq_x_n_s32, vrshrq_x_n_s8, vrshrq_x_n_u16, vrshrq_x_n_u32, vrshrq_x_n_u8,
      vshllbq_x_n_s16, vshllbq_x_n_s8, vshllbq_x_n_u16, vshllbq_x_n_u8, vshlltq_x_n_s16, vshlltq_x_n_s8,
      vshlltq_x_n_u16, vshlltq_x_n_u8, vshlq_x_n_s16, vshlq_x_n_s32, vshlq_x_n_s8, vshlq_x_n_u16, vshlq_x_n_u32,
      vshlq_x_n_u8, vshlq_x_s16, vshlq_x_s32, vshlq_x_s8, vshlq_x_u16, vshlq_x_u32, vshlq_x_u8, vshrq_x_n_s16,
      vshrq_x_n_s32, vshrq_x_n_s8, vshrq_x_n_u16, vshrq_x_n_u32, vshrq_x_n_u8, vsubq_x_f16, vsubq_x_f32,
      vsubq_x_n_f16, vsubq_x_n_f32, vsubq_x_n_s16, vsubq_x_n_s32, vsubq_x_n_s8, vsubq_x_n_u16, vsubq_x_n_u32,
      vsubq_x_n_u8, vsubq_x_s16, vsubq_x_s32, vsubq_x_s8, vsubq_x_u16, vsubq_x_u32, vsubq_x_u8.
      
      Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
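
      A small usage sketch (hypothetical wrapper, MVE target assumed) showing
      the relationship described above between the _x and _m forms:

        #include <arm_mve.h>

        /* The _x form leaves false-predicated lanes undefined; it corresponds
           to the _m (merge) form with a vuninitializedq inactive argument.  */
        int32x4_t add_active_lanes (int32x4_t a, int32x4_t b, mve_pred16_t p)
        {
          return vaddq_x_s32 (a, b, p);
          /* roughly: vaddq_m_s32 (vuninitializedq_s32 (), a, b, p) */
        }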
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
      
      	* config/arm/arm_mve.h (vddupq_x_n_u8): Define macro.
      	(vddupq_x_n_u16): Likewise.
      	(vddupq_x_n_u32): Likewise.
      	(vddupq_x_wb_u8): Likewise.
      	(vddupq_x_wb_u16): Likewise.
      	(vddupq_x_wb_u32): Likewise.
      	(vdwdupq_x_n_u8): Likewise.
      	(vdwdupq_x_n_u16): Likewise.
      	(vdwdupq_x_n_u32): Likewise.
      	(vdwdupq_x_wb_u8): Likewise.
      	(vdwdupq_x_wb_u16): Likewise.
      	(vdwdupq_x_wb_u32): Likewise.
      	(vidupq_x_n_u8): Likewise.
      	(vidupq_x_n_u16): Likewise.
      	(vidupq_x_n_u32): Likewise.
      	(vidupq_x_wb_u8): Likewise.
      	(vidupq_x_wb_u16): Likewise.
      	(vidupq_x_wb_u32): Likewise.
      	(viwdupq_x_n_u8): Likewise.
      	(viwdupq_x_n_u16): Likewise.
      	(viwdupq_x_n_u32): Likewise.
      	(viwdupq_x_wb_u8): Likewise.
      	(viwdupq_x_wb_u16): Likewise.
      	(viwdupq_x_wb_u32): Likewise.
      	(vdupq_x_n_s8): Likewise.
      	(vdupq_x_n_s16): Likewise.
      	(vdupq_x_n_s32): Likewise.
      	(vdupq_x_n_u8): Likewise.
      	(vdupq_x_n_u16): Likewise.
      	(vdupq_x_n_u32): Likewise.
      	(vminq_x_s8): Likewise.
      	(vminq_x_s16): Likewise.
      	(vminq_x_s32): Likewise.
      	(vminq_x_u8): Likewise.
      	(vminq_x_u16): Likewise.
      	(vminq_x_u32): Likewise.
      	(vmaxq_x_s8): Likewise.
      	(vmaxq_x_s16): Likewise.
      	(vmaxq_x_s32): Likewise.
      	(vmaxq_x_u8): Likewise.
      	(vmaxq_x_u16): Likewise.
      	(vmaxq_x_u32): Likewise.
      	(vabdq_x_s8): Likewise.
      	(vabdq_x_s16): Likewise.
      	(vabdq_x_s32): Likewise.
      	(vabdq_x_u8): Likewise.
      	(vabdq_x_u16): Likewise.
      	(vabdq_x_u32): Likewise.
      	(vabsq_x_s8): Likewise.
      	(vabsq_x_s16): Likewise.
      	(vabsq_x_s32): Likewise.
      	(vaddq_x_s8): Likewise.
      	(vaddq_x_s16): Likewise.
      	(vaddq_x_s32): Likewise.
      	(vaddq_x_n_s8): Likewise.
      	(vaddq_x_n_s16): Likewise.
      	(vaddq_x_n_s32): Likewise.
      	(vaddq_x_u8): Likewise.
      	(vaddq_x_u16): Likewise.
      	(vaddq_x_u32): Likewise.
      	(vaddq_x_n_u8): Likewise.
      	(vaddq_x_n_u16): Likewise.
      	(vaddq_x_n_u32): Likewise.
      	(vclsq_x_s8): Likewise.
      	(vclsq_x_s16): Likewise.
      	(vclsq_x_s32): Likewise.
      	(vclzq_x_s8): Likewise.
      	(vclzq_x_s16): Likewise.
      	(vclzq_x_s32): Likewise.
      	(vclzq_x_u8): Likewise.
      	(vclzq_x_u16): Likewise.
      	(vclzq_x_u32): Likewise.
      	(vnegq_x_s8): Likewise.
      	(vnegq_x_s16): Likewise.
      	(vnegq_x_s32): Likewise.
      	(vmulhq_x_s8): Likewise.
      	(vmulhq_x_s16): Likewise.
      	(vmulhq_x_s32): Likewise.
      	(vmulhq_x_u8): Likewise.
      	(vmulhq_x_u16): Likewise.
      	(vmulhq_x_u32): Likewise.
      	(vmullbq_poly_x_p8): Likewise.
      	(vmullbq_poly_x_p16): Likewise.
      	(vmullbq_int_x_s8): Likewise.
      	(vmullbq_int_x_s16): Likewise.
      	(vmullbq_int_x_s32): Likewise.
      	(vmullbq_int_x_u8): Likewise.
      	(vmullbq_int_x_u16): Likewise.
      	(vmullbq_int_x_u32): Likewise.
      	(vmulltq_poly_x_p8): Likewise.
      	(vmulltq_poly_x_p16): Likewise.
      	(vmulltq_int_x_s8): Likewise.
      	(vmulltq_int_x_s16): Likewise.
      	(vmulltq_int_x_s32): Likewise.
      	(vmulltq_int_x_u8): Likewise.
      	(vmulltq_int_x_u16): Likewise.
      	(vmulltq_int_x_u32): Likewise.
      	(vmulq_x_s8): Likewise.
      	(vmulq_x_s16): Likewise.
      	(vmulq_x_s32): Likewise.
      	(vmulq_x_n_s8): Likewise.
      	(vmulq_x_n_s16): Likewise.
      	(vmulq_x_n_s32): Likewise.
      	(vmulq_x_u8): Likewise.
      	(vmulq_x_u16): Likewise.
      	(vmulq_x_u32): Likewise.
      	(vmulq_x_n_u8): Likewise.
      	(vmulq_x_n_u16): Likewise.
      	(vmulq_x_n_u32): Likewise.
      	(vsubq_x_s8): Likewise.
      	(vsubq_x_s16): Likewise.
      	(vsubq_x_s32): Likewise.
      	(vsubq_x_n_s8): Likewise.
      	(vsubq_x_n_s16): Likewise.
      	(vsubq_x_n_s32): Likewise.
      	(vsubq_x_u8): Likewise.
      	(vsubq_x_u16): Likewise.
      	(vsubq_x_u32): Likewise.
      	(vsubq_x_n_u8): Likewise.
      	(vsubq_x_n_u16): Likewise.
      	(vsubq_x_n_u32): Likewise.
      	(vcaddq_rot90_x_s8): Likewise.
      	(vcaddq_rot90_x_s16): Likewise.
      	(vcaddq_rot90_x_s32): Likewise.
      	(vcaddq_rot90_x_u8): Likewise.
      	(vcaddq_rot90_x_u16): Likewise.
      	(vcaddq_rot90_x_u32): Likewise.
      	(vcaddq_rot270_x_s8): Likewise.
      	(vcaddq_rot270_x_s16): Likewise.
      	(vcaddq_rot270_x_s32): Likewise.
      	(vcaddq_rot270_x_u8): Likewise.
      	(vcaddq_rot270_x_u16): Likewise.
      	(vcaddq_rot270_x_u32): Likewise.
      	(vhaddq_x_n_s8): Likewise.
      	(vhaddq_x_n_s16): Likewise.
      	(vhaddq_x_n_s32): Likewise.
      	(vhaddq_x_n_u8): Likewise.
      	(vhaddq_x_n_u16): Likewise.
      	(vhaddq_x_n_u32): Likewise.
      	(vhaddq_x_s8): Likewise.
      	(vhaddq_x_s16): Likewise.
      	(vhaddq_x_s32): Likewise.
      	(vhaddq_x_u8): Likewise.
      	(vhaddq_x_u16): Likewise.
      	(vhaddq_x_u32): Likewise.
      	(vhcaddq_rot90_x_s8): Likewise.
      	(vhcaddq_rot90_x_s16): Likewise.
      	(vhcaddq_rot90_x_s32): Likewise.
      	(vhcaddq_rot270_x_s8): Likewise.
      	(vhcaddq_rot270_x_s16): Likewise.
      	(vhcaddq_rot270_x_s32): Likewise.
      	(vhsubq_x_n_s8): Likewise.
      	(vhsubq_x_n_s16): Likewise.
      	(vhsubq_x_n_s32): Likewise.
      	(vhsubq_x_n_u8): Likewise.
      	(vhsubq_x_n_u16): Likewise.
      	(vhsubq_x_n_u32): Likewise.
      	(vhsubq_x_s8): Likewise.
      	(vhsubq_x_s16): Likewise.
      	(vhsubq_x_s32): Likewise.
      	(vhsubq_x_u8): Likewise.
      	(vhsubq_x_u16): Likewise.
      	(vhsubq_x_u32): Likewise.
      	(vrhaddq_x_s8): Likewise.
      	(vrhaddq_x_s16): Likewise.
      	(vrhaddq_x_s32): Likewise.
      	(vrhaddq_x_u8): Likewise.
      	(vrhaddq_x_u16): Likewise.
      	(vrhaddq_x_u32): Likewise.
      	(vrmulhq_x_s8): Likewise.
      	(vrmulhq_x_s16): Likewise.
      	(vrmulhq_x_s32): Likewise.
      	(vrmulhq_x_u8): Likewise.
      	(vrmulhq_x_u16): Likewise.
      	(vrmulhq_x_u32): Likewise.
      	(vandq_x_s8): Likewise.
      	(vandq_x_s16): Likewise.
      	(vandq_x_s32): Likewise.
      	(vandq_x_u8): Likewise.
      	(vandq_x_u16): Likewise.
      	(vandq_x_u32): Likewise.
      	(vbicq_x_s8): Likewise.
      	(vbicq_x_s16): Likewise.
      	(vbicq_x_s32): Likewise.
      	(vbicq_x_u8): Likewise.
      	(vbicq_x_u16): Likewise.
      	(vbicq_x_u32): Likewise.
      	(vbrsrq_x_n_s8): Likewise.
      	(vbrsrq_x_n_s16): Likewise.
      	(vbrsrq_x_n_s32): Likewise.
      	(vbrsrq_x_n_u8): Likewise.
      	(vbrsrq_x_n_u16): Likewise.
      	(vbrsrq_x_n_u32): Likewise.
      	(veorq_x_s8): Likewise.
      	(veorq_x_s16): Likewise.
      	(veorq_x_s32): Likewise.
      	(veorq_x_u8): Likewise.
      	(veorq_x_u16): Likewise.
      	(veorq_x_u32): Likewise.
      	(vmovlbq_x_s8): Likewise.
      	(vmovlbq_x_s16): Likewise.
      	(vmovlbq_x_u8): Likewise.
      	(vmovlbq_x_u16): Likewise.
      	(vmovltq_x_s8): Likewise.
      	(vmovltq_x_s16): Likewise.
      	(vmovltq_x_u8): Likewise.
      	(vmovltq_x_u16): Likewise.
      	(vmvnq_x_s8): Likewise.
      	(vmvnq_x_s16): Likewise.
      	(vmvnq_x_s32): Likewise.
      	(vmvnq_x_u8): Likewise.
      	(vmvnq_x_u16): Likewise.
      	(vmvnq_x_u32): Likewise.
      	(vmvnq_x_n_s16): Likewise.
      	(vmvnq_x_n_s32): Likewise.
      	(vmvnq_x_n_u16): Likewise.
      	(vmvnq_x_n_u32): Likewise.
      	(vornq_x_s8): Likewise.
      	(vornq_x_s16): Likewise.
      	(vornq_x_s32): Likewise.
      	(vornq_x_u8): Likewise.
      	(vornq_x_u16): Likewise.
      	(vornq_x_u32): Likewise.
      	(vorrq_x_s8): Likewise.
      	(vorrq_x_s16): Likewise.
      	(vorrq_x_s32): Likewise.
      	(vorrq_x_u8): Likewise.
      	(vorrq_x_u16): Likewise.
      	(vorrq_x_u32): Likewise.
      	(vrev16q_x_s8): Likewise.
      	(vrev16q_x_u8): Likewise.
      	(vrev32q_x_s8): Likewise.
      	(vrev32q_x_s16): Likewise.
      	(vrev32q_x_u8): Likewise.
      	(vrev32q_x_u16): Likewise.
      	(vrev64q_x_s8): Likewise.
      	(vrev64q_x_s16): Likewise.
      	(vrev64q_x_s32): Likewise.
      	(vrev64q_x_u8): Likewise.
      	(vrev64q_x_u16): Likewise.
      	(vrev64q_x_u32): Likewise.
      	(vrshlq_x_s8): Likewise.
      	(vrshlq_x_s16): Likewise.
      	(vrshlq_x_s32): Likewise.
      	(vrshlq_x_u8): Likewise.
      	(vrshlq_x_u16): Likewise.
      	(vrshlq_x_u32): Likewise.
      	(vshllbq_x_n_s8): Likewise.
      	(vshllbq_x_n_s16): Likewise.
      	(vshllbq_x_n_u8): Likewise.
      	(vshllbq_x_n_u16): Likewise.
      	(vshlltq_x_n_s8): Likewise.
      	(vshlltq_x_n_s16): Likewise.
      	(vshlltq_x_n_u8): Likewise.
      	(vshlltq_x_n_u16): Likewise.
      	(vshlq_x_s8): Likewise.
      	(vshlq_x_s16): Likewise.
      	(vshlq_x_s32): Likewise.
      	(vshlq_x_u8): Likewise.
      	(vshlq_x_u16): Likewise.
      	(vshlq_x_u32): Likewise.
      	(vshlq_x_n_s8): Likewise.
      	(vshlq_x_n_s16): Likewise.
      	(vshlq_x_n_s32): Likewise.
      	(vshlq_x_n_u8): Likewise.
      	(vshlq_x_n_u16): Likewise.
      	(vshlq_x_n_u32): Likewise.
      	(vrshrq_x_n_s8): Likewise.
      	(vrshrq_x_n_s16): Likewise.
      	(vrshrq_x_n_s32): Likewise.
      	(vrshrq_x_n_u8): Likewise.
      	(vrshrq_x_n_u16): Likewise.
      	(vrshrq_x_n_u32): Likewise.
      	(vshrq_x_n_s8): Likewise.
      	(vshrq_x_n_s16): Likewise.
      	(vshrq_x_n_s32): Likewise.
      	(vshrq_x_n_u8): Likewise.
      	(vshrq_x_n_u16): Likewise.
      	(vshrq_x_n_u32): Likewise.
      	(vdupq_x_n_f16): Likewise.
      	(vdupq_x_n_f32): Likewise.
      	(vminnmq_x_f16): Likewise.
      	(vminnmq_x_f32): Likewise.
      	(vmaxnmq_x_f16): Likewise.
      	(vmaxnmq_x_f32): Likewise.
      	(vabdq_x_f16): Likewise.
      	(vabdq_x_f32): Likewise.
      	(vabsq_x_f16): Likewise.
      	(vabsq_x_f32): Likewise.
      	(vaddq_x_f16): Likewise.
      	(vaddq_x_f32): Likewise.
      	(vaddq_x_n_f16): Likewise.
      	(vaddq_x_n_f32): Likewise.
      	(vnegq_x_f16): Likewise.
      	(vnegq_x_f32): Likewise.
      	(vmulq_x_f16): Likewise.
      	(vmulq_x_f32): Likewise.
      	(vmulq_x_n_f16): Likewise.
      	(vmulq_x_n_f32): Likewise.
      	(vsubq_x_f16): Likewise.
      	(vsubq_x_f32): Likewise.
      	(vsubq_x_n_f16): Likewise.
      	(vsubq_x_n_f32): Likewise.
      	(vcaddq_rot90_x_f16): Likewise.
      	(vcaddq_rot90_x_f32): Likewise.
      	(vcaddq_rot270_x_f16): Likewise.
      	(vcaddq_rot270_x_f32): Likewise.
      	(vcmulq_x_f16): Likewise.
      	(vcmulq_x_f32): Likewise.
      	(vcmulq_rot90_x_f16): Likewise.
      	(vcmulq_rot90_x_f32): Likewise.
      	(vcmulq_rot180_x_f16): Likewise.
      	(vcmulq_rot180_x_f32): Likewise.
      	(vcmulq_rot270_x_f16): Likewise.
      	(vcmulq_rot270_x_f32): Likewise.
      	(vcvtaq_x_s16_f16): Likewise.
      	(vcvtaq_x_s32_f32): Likewise.
      	(vcvtaq_x_u16_f16): Likewise.
      	(vcvtaq_x_u32_f32): Likewise.
      	(vcvtnq_x_s16_f16): Likewise.
      	(vcvtnq_x_s32_f32): Likewise.
      	(vcvtnq_x_u16_f16): Likewise.
      	(vcvtnq_x_u32_f32): Likewise.
      	(vcvtpq_x_s16_f16): Likewise.
      	(vcvtpq_x_s32_f32): Likewise.
      	(vcvtpq_x_u16_f16): Likewise.
      	(vcvtpq_x_u32_f32): Likewise.
      	(vcvtmq_x_s16_f16): Likewise.
      	(vcvtmq_x_s32_f32): Likewise.
      	(vcvtmq_x_u16_f16): Likewise.
      	(vcvtmq_x_u32_f32): Likewise.
      	(vcvtbq_x_f32_f16): Likewise.
      	(vcvttq_x_f32_f16): Likewise.
      	(vcvtq_x_f16_u16): Likewise.
      	(vcvtq_x_f16_s16): Likewise.
      	(vcvtq_x_f32_s32): Likewise.
      	(vcvtq_x_f32_u32): Likewise.
      	(vcvtq_x_n_f16_s16): Likewise.
      	(vcvtq_x_n_f16_u16): Likewise.
      	(vcvtq_x_n_f32_s32): Likewise.
      	(vcvtq_x_n_f32_u32): Likewise.
      	(vcvtq_x_s16_f16): Likewise.
      	(vcvtq_x_s32_f32): Likewise.
      	(vcvtq_x_u16_f16): Likewise.
      	(vcvtq_x_u32_f32): Likewise.
      	(vcvtq_x_n_s16_f16): Likewise.
      	(vcvtq_x_n_s32_f32): Likewise.
      	(vcvtq_x_n_u16_f16): Likewise.
      	(vcvtq_x_n_u32_f32): Likewise.
      	(vrndq_x_f16): Likewise.
      	(vrndq_x_f32): Likewise.
      	(vrndnq_x_f16): Likewise.
      	(vrndnq_x_f32): Likewise.
      	(vrndmq_x_f16): Likewise.
      	(vrndmq_x_f32): Likewise.
      	(vrndpq_x_f16): Likewise.
      	(vrndpq_x_f32): Likewise.
      	(vrndaq_x_f16): Likewise.
      	(vrndaq_x_f32): Likewise.
      	(vrndxq_x_f16): Likewise.
      	(vrndxq_x_f32): Likewise.
      	(vandq_x_f16): Likewise.
      	(vandq_x_f32): Likewise.
      	(vbicq_x_f16): Likewise.
      	(vbicq_x_f32): Likewise.
      	(vbrsrq_x_n_f16): Likewise.
      	(vbrsrq_x_n_f32): Likewise.
      	(veorq_x_f16): Likewise.
      	(veorq_x_f32): Likewise.
      	(vornq_x_f16): Likewise.
      	(vornq_x_f32): Likewise.
      	(vorrq_x_f16): Likewise.
      	(vorrq_x_f32): Likewise.
      	(vrev32q_x_f16): Likewise.
      	(vrev64q_x_f16): Likewise.
      	(vrev64q_x_f32): Likewise.
      	(__arm_vddupq_x_n_u8): Define intrinsic.
      	(__arm_vddupq_x_n_u16): Likewise.
      	(__arm_vddupq_x_n_u32): Likewise.
      	(__arm_vddupq_x_wb_u8): Likewise.
      	(__arm_vddupq_x_wb_u16): Likewise.
      	(__arm_vddupq_x_wb_u32): Likewise.
      	(__arm_vdwdupq_x_n_u8): Likewise.
      	(__arm_vdwdupq_x_n_u16): Likewise.
      	(__arm_vdwdupq_x_n_u32): Likewise.
      	(__arm_vdwdupq_x_wb_u8): Likewise.
      	(__arm_vdwdupq_x_wb_u16): Likewise.
      	(__arm_vdwdupq_x_wb_u32): Likewise.
      	(__arm_vidupq_x_n_u8): Likewise.
      	(__arm_vidupq_x_n_u16): Likewise.
      	(__arm_vidupq_x_n_u32): Likewise.
      	(__arm_vidupq_x_wb_u8): Likewise.
      	(__arm_vidupq_x_wb_u16): Likewise.
      	(__arm_vidupq_x_wb_u32): Likewise.
      	(__arm_viwdupq_x_n_u8): Likewise.
      	(__arm_viwdupq_x_n_u16): Likewise.
      	(__arm_viwdupq_x_n_u32): Likewise.
      	(__arm_viwdupq_x_wb_u8): Likewise.
      	(__arm_viwdupq_x_wb_u16): Likewise.
      	(__arm_viwdupq_x_wb_u32): Likewise.
      	(__arm_vdupq_x_n_s8): Likewise.
      	(__arm_vdupq_x_n_s16): Likewise.
      	(__arm_vdupq_x_n_s32): Likewise.
      	(__arm_vdupq_x_n_u8): Likewise.
      	(__arm_vdupq_x_n_u16): Likewise.
      	(__arm_vdupq_x_n_u32): Likewise.
      	(__arm_vminq_x_s8): Likewise.
      	(__arm_vminq_x_s16): Likewise.
      	(__arm_vminq_x_s32): Likewise.
      	(__arm_vminq_x_u8): Likewise.
      	(__arm_vminq_x_u16): Likewise.
      	(__arm_vminq_x_u32): Likewise.
      	(__arm_vmaxq_x_s8): Likewise.
      	(__arm_vmaxq_x_s16): Likewise.
      	(__arm_vmaxq_x_s32): Likewise.
      	(__arm_vmaxq_x_u8): Likewise.
      	(__arm_vmaxq_x_u16): Likewise.
      	(__arm_vmaxq_x_u32): Likewise.
      	(__arm_vabdq_x_s8): Likewise.
      	(__arm_vabdq_x_s16): Likewise.
      	(__arm_vabdq_x_s32): Likewise.
      	(__arm_vabdq_x_u8): Likewise.
      	(__arm_vabdq_x_u16): Likewise.
      	(__arm_vabdq_x_u32): Likewise.
      	(__arm_vabsq_x_s8): Likewise.
      	(__arm_vabsq_x_s16): Likewise.
      	(__arm_vabsq_x_s32): Likewise.
      	(__arm_vaddq_x_s8): Likewise.
      	(__arm_vaddq_x_s16): Likewise.
      	(__arm_vaddq_x_s32): Likewise.
      	(__arm_vaddq_x_n_s8): Likewise.
      	(__arm_vaddq_x_n_s16): Likewise.
      	(__arm_vaddq_x_n_s32): Likewise.
      	(__arm_vaddq_x_u8): Likewise.
      	(__arm_vaddq_x_u16): Likewise.
      	(__arm_vaddq_x_u32): Likewise.
      	(__arm_vaddq_x_n_u8): Likewise.
      	(__arm_vaddq_x_n_u16): Likewise.
      	(__arm_vaddq_x_n_u32): Likewise.
      	(__arm_vclsq_x_s8): Likewise.
      	(__arm_vclsq_x_s16): Likewise.
      	(__arm_vclsq_x_s32): Likewise.
      	(__arm_vclzq_x_s8): Likewise.
      	(__arm_vclzq_x_s16): Likewise.
      	(__arm_vclzq_x_s32): Likewise.
      	(__arm_vclzq_x_u8): Likewise.
      	(__arm_vclzq_x_u16): Likewise.
      	(__arm_vclzq_x_u32): Likewise.
      	(__arm_vnegq_x_s8): Likewise.
      	(__arm_vnegq_x_s16): Likewise.
      	(__arm_vnegq_x_s32): Likewise.
      	(__arm_vmulhq_x_s8): Likewise.
      	(__arm_vmulhq_x_s16): Likewise.
      	(__arm_vmulhq_x_s32): Likewise.
      	(__arm_vmulhq_x_u8): Likewise.
      	(__arm_vmulhq_x_u16): Likewise.
      	(__arm_vmulhq_x_u32): Likewise.
      	(__arm_vmullbq_poly_x_p8): Likewise.
      	(__arm_vmullbq_poly_x_p16): Likewise.
      	(__arm_vmullbq_int_x_s8): Likewise.
      	(__arm_vmullbq_int_x_s16): Likewise.
      	(__arm_vmullbq_int_x_s32): Likewise.
      	(__arm_vmullbq_int_x_u8): Likewise.
      	(__arm_vmullbq_int_x_u16): Likewise.
      	(__arm_vmullbq_int_x_u32): Likewise.
      	(__arm_vmulltq_poly_x_p8): Likewise.
      	(__arm_vmulltq_poly_x_p16): Likewise.
      	(__arm_vmulltq_int_x_s8): Likewise.
      	(__arm_vmulltq_int_x_s16): Likewise.
      	(__arm_vmulltq_int_x_s32): Likewise.
      	(__arm_vmulltq_int_x_u8): Likewise.
      	(__arm_vmulltq_int_x_u16): Likewise.
      	(__arm_vmulltq_int_x_u32): Likewise.
      	(__arm_vmulq_x_s8): Likewise.
      	(__arm_vmulq_x_s16): Likewise.
      	(__arm_vmulq_x_s32): Likewise.
      	(__arm_vmulq_x_n_s8): Likewise.
      	(__arm_vmulq_x_n_s16): Likewise.
      	(__arm_vmulq_x_n_s32): Likewise.
      	(__arm_vmulq_x_u8): Likewise.
      	(__arm_vmulq_x_u16): Likewise.
      	(__arm_vmulq_x_u32): Likewise.
      	(__arm_vmulq_x_n_u8): Likewise.
      	(__arm_vmulq_x_n_u16): Likewise.
      	(__arm_vmulq_x_n_u32): Likewise.
      	(__arm_vsubq_x_s8): Likewise.
      	(__arm_vsubq_x_s16): Likewise.
      	(__arm_vsubq_x_s32): Likewise.
      	(__arm_vsubq_x_n_s8): Likewise.
      	(__arm_vsubq_x_n_s16): Likewise.
      	(__arm_vsubq_x_n_s32): Likewise.
      	(__arm_vsubq_x_u8): Likewise.
      	(__arm_vsubq_x_u16): Likewise.
      	(__arm_vsubq_x_u32): Likewise.
      	(__arm_vsubq_x_n_u8): Likewise.
      	(__arm_vsubq_x_n_u16): Likewise.
      	(__arm_vsubq_x_n_u32): Likewise.
      	(__arm_vcaddq_rot90_x_s8): Likewise.
      	(__arm_vcaddq_rot90_x_s16): Likewise.
      	(__arm_vcaddq_rot90_x_s32): Likewise.
      	(__arm_vcaddq_rot90_x_u8): Likewise.
      	(__arm_vcaddq_rot90_x_u16): Likewise.
      	(__arm_vcaddq_rot90_x_u32): Likewise.
      	(__arm_vcaddq_rot270_x_s8): Likewise.
      	(__arm_vcaddq_rot270_x_s16): Likewise.
      	(__arm_vcaddq_rot270_x_s32): Likewise.
      	(__arm_vcaddq_rot270_x_u8): Likewise.
      	(__arm_vcaddq_rot270_x_u16): Likewise.
      	(__arm_vcaddq_rot270_x_u32): Likewise.
      	(__arm_vhaddq_x_n_s8): Likewise.
      	(__arm_vhaddq_x_n_s16): Likewise.
      	(__arm_vhaddq_x_n_s32): Likewise.
      	(__arm_vhaddq_x_n_u8): Likewise.
      	(__arm_vhaddq_x_n_u16): Likewise.
      	(__arm_vhaddq_x_n_u32): Likewise.
      	(__arm_vhaddq_x_s8): Likewise.
      	(__arm_vhaddq_x_s16): Likewise.
      	(__arm_vhaddq_x_s32): Likewise.
      	(__arm_vhaddq_x_u8): Likewise.
      	(__arm_vhaddq_x_u16): Likewise.
      	(__arm_vhaddq_x_u32): Likewise.
      	(__arm_vhcaddq_rot90_x_s8): Likewise.
      	(__arm_vhcaddq_rot90_x_s16): Likewise.
      	(__arm_vhcaddq_rot90_x_s32): Likewise.
      	(__arm_vhcaddq_rot270_x_s8): Likewise.
      	(__arm_vhcaddq_rot270_x_s16): Likewise.
      	(__arm_vhcaddq_rot270_x_s32): Likewise.
      	(__arm_vhsubq_x_n_s8): Likewise.
      	(__arm_vhsubq_x_n_s16): Likewise.
      	(__arm_vhsubq_x_n_s32): Likewise.
      	(__arm_vhsubq_x_n_u8): Likewise.
      	(__arm_vhsubq_x_n_u16): Likewise.
      	(__arm_vhsubq_x_n_u32): Likewise.
      	(__arm_vhsubq_x_s8): Likewise.
      	(__arm_vhsubq_x_s16): Likewise.
      	(__arm_vhsubq_x_s32): Likewise.
      	(__arm_vhsubq_x_u8): Likewise.
      	(__arm_vhsubq_x_u16): Likewise.
      	(__arm_vhsubq_x_u32): Likewise.
      	(__arm_vrhaddq_x_s8): Likewise.
      	(__arm_vrhaddq_x_s16): Likewise.
      	(__arm_vrhaddq_x_s32): Likewise.
      	(__arm_vrhaddq_x_u8): Likewise.
      	(__arm_vrhaddq_x_u16): Likewise.
      	(__arm_vrhaddq_x_u32): Likewise.
      	(__arm_vrmulhq_x_s8): Likewise.
      	(__arm_vrmulhq_x_s16): Likewise.
      	(__arm_vrmulhq_x_s32): Likewise.
      	(__arm_vrmulhq_x_u8): Likewise.
      	(__arm_vrmulhq_x_u16): Likewise.
      	(__arm_vrmulhq_x_u32): Likewise.
      	(__arm_vandq_x_s8): Likewise.
      	(__arm_vandq_x_s16): Likewise.
      	(__arm_vandq_x_s32): Likewise.
      	(__arm_vandq_x_u8): Likewise.
      	(__arm_vandq_x_u16): Likewise.
      	(__arm_vandq_x_u32): Likewise.
      	(__arm_vbicq_x_s8): Likewise.
      	(__arm_vbicq_x_s16): Likewise.
      	(__arm_vbicq_x_s32): Likewise.
      	(__arm_vbicq_x_u8): Likewise.
      	(__arm_vbicq_x_u16): Likewise.
      	(__arm_vbicq_x_u32): Likewise.
      	(__arm_vbrsrq_x_n_s8): Likewise.
      	(__arm_vbrsrq_x_n_s16): Likewise.
      	(__arm_vbrsrq_x_n_s32): Likewise.
      	(__arm_vbrsrq_x_n_u8): Likewise.
      	(__arm_vbrsrq_x_n_u16): Likewise.
      	(__arm_vbrsrq_x_n_u32): Likewise.
      	(__arm_veorq_x_s8): Likewise.
      	(__arm_veorq_x_s16): Likewise.
      	(__arm_veorq_x_s32): Likewise.
      	(__arm_veorq_x_u8): Likewise.
      	(__arm_veorq_x_u16): Likewise.
      	(__arm_veorq_x_u32): Likewise.
      	(__arm_vmovlbq_x_s8): Likewise.
      	(__arm_vmovlbq_x_s16): Likewise.
      	(__arm_vmovlbq_x_u8): Likewise.
      	(__arm_vmovlbq_x_u16): Likewise.
      	(__arm_vmovltq_x_s8): Likewise.
      	(__arm_vmovltq_x_s16): Likewise.
      	(__arm_vmovltq_x_u8): Likewise.
      	(__arm_vmovltq_x_u16): Likewise.
      	(__arm_vmvnq_x_s8): Likewise.
      	(__arm_vmvnq_x_s16): Likewise.
      	(__arm_vmvnq_x_s32): Likewise.
      	(__arm_vmvnq_x_u8): Likewise.
      	(__arm_vmvnq_x_u16): Likewise.
      	(__arm_vmvnq_x_u32): Likewise.
      	(__arm_vmvnq_x_n_s16): Likewise.
      	(__arm_vmvnq_x_n_s32): Likewise.
      	(__arm_vmvnq_x_n_u16): Likewise.
      	(__arm_vmvnq_x_n_u32): Likewise.
      	(__arm_vornq_x_s8): Likewise.
      	(__arm_vornq_x_s16): Likewise.
      	(__arm_vornq_x_s32): Likewise.
      	(__arm_vornq_x_u8): Likewise.
      	(__arm_vornq_x_u16): Likewise.
      	(__arm_vornq_x_u32): Likewise.
      	(__arm_vorrq_x_s8): Likewise.
      	(__arm_vorrq_x_s16): Likewise.
      	(__arm_vorrq_x_s32): Likewise.
      	(__arm_vorrq_x_u8): Likewise.
      	(__arm_vorrq_x_u16): Likewise.
      	(__arm_vorrq_x_u32): Likewise.
      	(__arm_vrev16q_x_s8): Likewise.
      	(__arm_vrev16q_x_u8): Likewise.
      	(__arm_vrev32q_x_s8): Likewise.
      	(__arm_vrev32q_x_s16): Likewise.
      	(__arm_vrev32q_x_u8): Likewise.
      	(__arm_vrev32q_x_u16): Likewise.
      	(__arm_vrev64q_x_s8): Likewise.
      	(__arm_vrev64q_x_s16): Likewise.
      	(__arm_vrev64q_x_s32): Likewise.
      	(__arm_vrev64q_x_u8): Likewise.
      	(__arm_vrev64q_x_u16): Likewise.
      	(__arm_vrev64q_x_u32): Likewise.
      	(__arm_vrshlq_x_s8): Likewise.
      	(__arm_vrshlq_x_s16): Likewise.
      	(__arm_vrshlq_x_s32): Likewise.
      	(__arm_vrshlq_x_u8): Likewise.
      	(__arm_vrshlq_x_u16): Likewise.
      	(__arm_vrshlq_x_u32): Likewise.
      	(__arm_vshllbq_x_n_s8): Likewise.
      	(__arm_vshllbq_x_n_s16): Likewise.
      	(__arm_vshllbq_x_n_u8): Likewise.
      	(__arm_vshllbq_x_n_u16): Likewise.
      	(__arm_vshlltq_x_n_s8): Likewise.
      	(__arm_vshlltq_x_n_s16): Likewise.
      	(__arm_vshlltq_x_n_u8): Likewise.
      	(__arm_vshlltq_x_n_u16): Likewise.
      	(__arm_vshlq_x_s8): Likewise.
      	(__arm_vshlq_x_s16): Likewise.
      	(__arm_vshlq_x_s32): Likewise.
      	(__arm_vshlq_x_u8): Likewise.
      	(__arm_vshlq_x_u16): Likewise.
      	(__arm_vshlq_x_u32): Likewise.
      	(__arm_vshlq_x_n_s8): Likewise.
      	(__arm_vshlq_x_n_s16): Likewise.
      	(__arm_vshlq_x_n_s32): Likewise.
      	(__arm_vshlq_x_n_u8): Likewise.
      	(__arm_vshlq_x_n_u16): Likewise.
      	(__arm_vshlq_x_n_u32): Likewise.
      	(__arm_vrshrq_x_n_s8): Likewise.
      	(__arm_vrshrq_x_n_s16): Likewise.
      	(__arm_vrshrq_x_n_s32): Likewise.
      	(__arm_vrshrq_x_n_u8): Likewise.
      	(__arm_vrshrq_x_n_u16): Likewise.
      	(__arm_vrshrq_x_n_u32): Likewise.
      	(__arm_vshrq_x_n_s8): Likewise.
      	(__arm_vshrq_x_n_s16): Likewise.
      	(__arm_vshrq_x_n_s32): Likewise.
      	(__arm_vshrq_x_n_u8): Likewise.
      	(__arm_vshrq_x_n_u16): Likewise.
      	(__arm_vshrq_x_n_u32): Likewise.
      	(__arm_vdupq_x_n_f16): Likewise.
      	(__arm_vdupq_x_n_f32): Likewise.
      	(__arm_vminnmq_x_f16): Likewise.
      	(__arm_vminnmq_x_f32): Likewise.
      	(__arm_vmaxnmq_x_f16): Likewise.
      	(__arm_vmaxnmq_x_f32): Likewise.
      	(__arm_vabdq_x_f16): Likewise.
      	(__arm_vabdq_x_f32): Likewise.
      	(__arm_vabsq_x_f16): Likewise.
      	(__arm_vabsq_x_f32): Likewise.
      	(__arm_vaddq_x_f16): Likewise.
      	(__arm_vaddq_x_f32): Likewise.
      	(__arm_vaddq_x_n_f16): Likewise.
      	(__arm_vaddq_x_n_f32): Likewise.
      	(__arm_vnegq_x_f16): Likewise.
      	(__arm_vnegq_x_f32): Likewise.
      	(__arm_vmulq_x_f16): Likewise.
      	(__arm_vmulq_x_f32): Likewise.
      	(__arm_vmulq_x_n_f16): Likewise.
      	(__arm_vmulq_x_n_f32): Likewise.
      	(__arm_vsubq_x_f16): Likewise.
      	(__arm_vsubq_x_f32): Likewise.
      	(__arm_vsubq_x_n_f16): Likewise.
      	(__arm_vsubq_x_n_f32): Likewise.
      	(__arm_vcaddq_rot90_x_f16): Likewise.
      	(__arm_vcaddq_rot90_x_f32): Likewise.
      	(__arm_vcaddq_rot270_x_f16): Likewise.
      	(__arm_vcaddq_rot270_x_f32): Likewise.
      	(__arm_vcmulq_x_f16): Likewise.
      	(__arm_vcmulq_x_f32): Likewise.
      	(__arm_vcmulq_rot90_x_f16): Likewise.
      	(__arm_vcmulq_rot90_x_f32): Likewise.
      	(__arm_vcmulq_rot180_x_f16): Likewise.
      	(__arm_vcmulq_rot180_x_f32): Likewise.
      	(__arm_vcmulq_rot270_x_f16): Likewise.
      	(__arm_vcmulq_rot270_x_f32): Likewise.
      	(__arm_vcvtaq_x_s16_f16): Likewise.
      	(__arm_vcvtaq_x_s32_f32): Likewise.
      	(__arm_vcvtaq_x_u16_f16): Likewise.
      	(__arm_vcvtaq_x_u32_f32): Likewise.
      	(__arm_vcvtnq_x_s16_f16): Likewise.
      	(__arm_vcvtnq_x_s32_f32): Likewise.
      	(__arm_vcvtnq_x_u16_f16): Likewise.
      	(__arm_vcvtnq_x_u32_f32): Likewise.
      	(__arm_vcvtpq_x_s16_f16): Likewise.
      	(__arm_vcvtpq_x_s32_f32): Likewise.
      	(__arm_vcvtpq_x_u16_f16): Likewise.
      	(__arm_vcvtpq_x_u32_f32): Likewise.
      	(__arm_vcvtmq_x_s16_f16): Likewise.
      	(__arm_vcvtmq_x_s32_f32): Likewise.
      	(__arm_vcvtmq_x_u16_f16): Likewise.
      	(__arm_vcvtmq_x_u32_f32): Likewise.
      	(__arm_vcvtbq_x_f32_f16): Likewise.
      	(__arm_vcvttq_x_f32_f16): Likewise.
      	(__arm_vcvtq_x_f16_u16): Likewise.
      	(__arm_vcvtq_x_f16_s16): Likewise.
      	(__arm_vcvtq_x_f32_s32): Likewise.
      	(__arm_vcvtq_x_f32_u32): Likewise.
      	(__arm_vcvtq_x_n_f16_s16): Likewise.
      	(__arm_vcvtq_x_n_f16_u16): Likewise.
      	(__arm_vcvtq_x_n_f32_s32): Likewise.
      	(__arm_vcvtq_x_n_f32_u32): Likewise.
      	(__arm_vcvtq_x_s16_f16): Likewise.
      	(__arm_vcvtq_x_s32_f32): Likewise.
      	(__arm_vcvtq_x_u16_f16): Likewise.
      	(__arm_vcvtq_x_u32_f32): Likewise.
      	(__arm_vcvtq_x_n_s16_f16): Likewise.
      	(__arm_vcvtq_x_n_s32_f32): Likewise.
      	(__arm_vcvtq_x_n_u16_f16): Likewise.
      	(__arm_vcvtq_x_n_u32_f32): Likewise.
      	(__arm_vrndq_x_f16): Likewise.
      	(__arm_vrndq_x_f32): Likewise.
      	(__arm_vrndnq_x_f16): Likewise.
      	(__arm_vrndnq_x_f32): Likewise.
      	(__arm_vrndmq_x_f16): Likewise.
      	(__arm_vrndmq_x_f32): Likewise.
      	(__arm_vrndpq_x_f16): Likewise.
      	(__arm_vrndpq_x_f32): Likewise.
      	(__arm_vrndaq_x_f16): Likewise.
      	(__arm_vrndaq_x_f32): Likewise.
      	(__arm_vrndxq_x_f16): Likewise.
      	(__arm_vrndxq_x_f32): Likewise.
      	(__arm_vandq_x_f16): Likewise.
      	(__arm_vandq_x_f32): Likewise.
      	(__arm_vbicq_x_f16): Likewise.
      	(__arm_vbicq_x_f32): Likewise.
      	(__arm_vbrsrq_x_n_f16): Likewise.
      	(__arm_vbrsrq_x_n_f32): Likewise.
      	(__arm_veorq_x_f16): Likewise.
      	(__arm_veorq_x_f32): Likewise.
      	(__arm_vornq_x_f16): Likewise.
      	(__arm_vornq_x_f32): Likewise.
      	(__arm_vorrq_x_f16): Likewise.
      	(__arm_vorrq_x_f32): Likewise.
      	(__arm_vrev32q_x_f16): Likewise.
      	(__arm_vrev64q_x_f16): Likewise.
      	(__arm_vrev64q_x_f32): Likewise.
      	(vabdq_x): Define polymorphic variant.
      	(vabsq_x): Likewise.
      	(vaddq_x): Likewise.
      	(vandq_x): Likewise.
      	(vbicq_x): Likewise.
      	(vbrsrq_x): Likewise.
      	(vcaddq_rot270_x): Likewise.
      	(vcaddq_rot90_x): Likewise.
      	(vcmulq_rot180_x): Likewise.
      	(vcmulq_rot270_x): Likewise.
      	(vcmulq_x): Likewise.
      	(vcvtq_x): Likewise.
      	(vcvtq_x_n): Likewise.
      	(vcvtnq_m): Likewise.
      	(veorq_x): Likewise.
      	(vmaxnmq_x): Likewise.
      	(vminnmq_x): Likewise.
      	(vmulq_x): Likewise.
      	(vnegq_x): Likewise.
      	(vornq_x): Likewise.
      	(vorrq_x): Likewise.
      	(vrev32q_x): Likewise.
      	(vrev64q_x): Likewise.
      	(vrndaq_x): Likewise.
      	(vrndmq_x): Likewise.
      	(vrndnq_x): Likewise.
      	(vrndpq_x): Likewise.
      	(vrndq_x): Likewise.
      	(vrndxq_x): Likewise.
      	(vsubq_x): Likewise.
      	(vcmulq_rot90_x): Likewise.
      	(vadciq): Likewise.
      	(vclsq_x): Likewise.
      	(vclzq_x): Likewise.
      	(vhaddq_x): Likewise.
      	(vhcaddq_rot270_x): Likewise.
      	(vhcaddq_rot90_x): Likewise.
      	(vhsubq_x): Likewise.
      	(vmaxq_x): Likewise.
      	(vminq_x): Likewise.
      	(vmovlbq_x): Likewise.
      	(vmovltq_x): Likewise.
      	(vmulhq_x): Likewise.
      	(vmullbq_int_x): Likewise.
      	(vmullbq_poly_x): Likewise.
      	(vmulltq_int_x): Likewise.
      	(vmulltq_poly_x): Likewise.
      	(vmvnq_x): Likewise.
      	(vrev16q_x): Likewise.
      	(vrhaddq_x): Likewise.
      	(vrmulhq_x): Likewise.
      	(vrshlq_x): Likewise.
      	(vrshrq_x): Likewise.
      	(vshllbq_x): Likewise.
      	(vshlltq_x): Likewise.
      	(vshlq_x_n): Likewise.
      	(vshlq_x): Likewise.
      	(vdwdupq_x_u8): Likewise.
      	(vdwdupq_x_u16): Likewise.
      	(vdwdupq_x_u32): Likewise.
      	(viwdupq_x_u8): Likewise.
      	(viwdupq_x_u16): Likewise.
      	(viwdupq_x_u32): Likewise.
      	(vidupq_x_u8): Likewise.
      	(vddupq_x_u8): Likewise.
      	(vidupq_x_u16): Likewise.
      	(vddupq_x_u16): Likewise.
      	(vidupq_x_u32): Likewise.
      	(vddupq_x_u32): Likewise.
      	(vshrq_x): Likewise.
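
      For illustration only (not part of this patch): a minimal usage sketch of the new "_x" variants, assuming an MVE-enabled toolchain (e.g. arm-none-eabi-gcc with -march=armv8.1-m.main+mve -mfloat-abi=hard); the helper name is made up, and vctp32q is an existing MVE intrinsic used here only to build the predicate.

	#include <arm_mve.h>

	/* Unlike the "_m" (merging) variants, the "_x" variants leave lanes
	   whose predicate bit is false with unspecified values.  */
	int32x4_t
	scale_head (int32x4_t a, int32x4_t b, uint32_t n)
	{
	  mve_pred16_t p = vctp32q (n);           /* predicate for the first n lanes */
	  int32x4_t sum = vaddq_x_s32 (a, b, p);  /* also available as polymorphic vaddq_x */
	  return vmulq_x_n_s32 (sum, 2, p);
	}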
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vabdq_x_f16.c: New test.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabdq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabsq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabsq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabsq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabsq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vabsq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vandq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbicq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclsq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclsq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclsq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclzq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclzq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclzq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclzq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclzq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vclzq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_rot180_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_rot180_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_rot270_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_rot270_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_rot90_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_rot90_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcmulq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtaq_x_s16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtaq_x_s32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtaq_x_u16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtaq_x_u32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtbq_x_f32_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtmq_x_s16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtmq_x_s32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtmq_x_u16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtmq_x_u32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtnq_x_s16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtnq_x_s32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtnq_x_u16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtnq_x_u32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtpq_x_s16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtpq_x_s32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtpq_x_u16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtpq_x_u32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_f16_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_f16_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_f32_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_f32_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f16_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f16_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f32_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f32_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_s16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_s32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_u16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_u32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_s16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_s32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_u16_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvtq_x_u32_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vcvttq_x_f32_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_x_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_x_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_x_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdupq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_x_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_x_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_x_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/veorq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhaddq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vhsubq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_x_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_x_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_x_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_x_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_x_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_x_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxnmq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxnmq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmaxq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminnmq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminnmq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vminq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovlbq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovlbq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovlbq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovlbq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovltq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovltq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovltq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmovltq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulhq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulhq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulhq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulhq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulhq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulhq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_poly_x_p16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmullbq_poly_x_p8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_poly_x_p16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulltq_poly_x_p8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmulq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vmvnq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vnegq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vnegq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vnegq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vnegq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vnegq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vornq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vorrq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev16q_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev16q_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev32q_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev32q_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev32q_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev32q_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev32q_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrev64q_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrhaddq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrhaddq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrhaddq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrhaddq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrhaddq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrhaddq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrmulhq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrmulhq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrmulhq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrmulhq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrmulhq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrmulhq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndaq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndaq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndmq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndmq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndnq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndnq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndpq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndpq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndxq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrndxq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshlq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshlq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshlq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshlq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshlq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshlq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshllbq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshllbq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshllbq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshllbq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlltq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlltq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlltq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlltq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshlq_x_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshrq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshrq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshrq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshrq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshrq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vsubq_x_u8.c: Likewise.
      Srinath Parvathaneni committed
    • fix CTOR vectorization · 3d42842c
      We failed to handle pattern stmts appropriately.
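
      For illustration only (hypothetical, not the PR testcase): one C shape that exercises vector-CONSTRUCTOR vectorization where the element computations can be replaced by vectorizer pattern stmts (a widening multiply here), assuming a target with suitable vector support (e.g. -O3 -mavx2 on x86_64).

	typedef int v4si __attribute__ ((vector_size (16)));

	v4si
	f (short *a, short *b)
	{
	  /* Each element is a short-by-short product promoted to int, which the
	     vectorizer can recognize as a widening-multiply pattern before the
	     constructor itself is vectorized.  */
	  return (v4si) { a[0] * b[0], a[1] * b[1], a[2] * b[2], a[3] * b[3] };
	}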
      
      2020-03-20  Richard Biener  <rguenther@suse.de>
      
      	* tree-vect-slp.c (vect_analyze_slp_instance): Push the stmts
      	to vectorize for CTOR defs.
      Richard Biener committed
    • [ARM][GCC][2/8x]: MVE ACLE gather load and scatter store intrinsics with writeback. · 41e1a7ff
This patch supports the following MVE ACLE intrinsics with writeback.
      
      vldrdq_gather_base_wb_s64, vldrdq_gather_base_wb_u64, vldrdq_gather_base_wb_z_s64,
      vldrdq_gather_base_wb_z_u64, vldrwq_gather_base_wb_f32, vldrwq_gather_base_wb_s32,
      vldrwq_gather_base_wb_u32, vldrwq_gather_base_wb_z_f32, vldrwq_gather_base_wb_z_s32,
      vldrwq_gather_base_wb_z_u32, vstrdq_scatter_base_wb_p_s64, vstrdq_scatter_base_wb_p_u64,
      vstrdq_scatter_base_wb_s64, vstrdq_scatter_base_wb_u64, vstrwq_scatter_base_wb_p_s32,
      vstrwq_scatter_base_wb_p_f32, vstrwq_scatter_base_wb_p_u32, vstrwq_scatter_base_wb_s32,
      vstrwq_scatter_base_wb_u32, vstrwq_scatter_base_wb_f32.
      
Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
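
      For illustration only (not part of this patch): a minimal usage sketch of the writeback forms, assuming an MVE-enabled toolchain; the function name and the byte offset of 16 are illustrative. The "_wb" forms take the vector of base addresses by pointer and store the updated address vector back, so repeated calls advance through memory.

	#include <arm_mve.h>

	int32x4_t
	copy_and_advance (uint32x4_t *src_bases, uint32x4_t *dst_bases)
	{
	  /* Gather four words relative to *src_bases, writing the advanced
	     base addresses back through the pointer.  */
	  int32x4_t data = vldrwq_gather_base_wb_s32 (src_bases, 16);
	  /* Scatter them relative to *dst_bases, likewise updating it.  */
	  vstrwq_scatter_base_wb_s32 (dst_bases, 16, data);
	  return data;
	}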
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* config/arm/arm-builtins.c (LDRGBWBS_QUALIFIERS): Define builtin
      	qualifier.
      	(LDRGBWBU_QUALIFIERS): Likewise.
      	(LDRGBWBS_Z_QUALIFIERS): Likewise.
      	(LDRGBWBU_Z_QUALIFIERS): Likewise.
      	(STRSBWBS_QUALIFIERS): Likewise.
      	(STRSBWBU_QUALIFIERS): Likewise.
      	(STRSBWBS_P_QUALIFIERS): Likewise.
      	(STRSBWBU_P_QUALIFIERS): Likewise.
      	* config/arm/arm_mve.h (vldrdq_gather_base_wb_s64): Define macro.
      	(vldrdq_gather_base_wb_u64): Likewise.
      	(vldrdq_gather_base_wb_z_s64): Likewise.
      	(vldrdq_gather_base_wb_z_u64): Likewise.
      	(vldrwq_gather_base_wb_f32): Likewise.
      	(vldrwq_gather_base_wb_s32): Likewise.
      	(vldrwq_gather_base_wb_u32): Likewise.
      	(vldrwq_gather_base_wb_z_f32): Likewise.
      	(vldrwq_gather_base_wb_z_s32): Likewise.
      	(vldrwq_gather_base_wb_z_u32): Likewise.
      	(vstrdq_scatter_base_wb_p_s64): Likewise.
      	(vstrdq_scatter_base_wb_p_u64): Likewise.
      	(vstrdq_scatter_base_wb_s64): Likewise.
      	(vstrdq_scatter_base_wb_u64): Likewise.
      	(vstrwq_scatter_base_wb_p_s32): Likewise.
      	(vstrwq_scatter_base_wb_p_f32): Likewise.
      	(vstrwq_scatter_base_wb_p_u32): Likewise.
      	(vstrwq_scatter_base_wb_s32): Likewise.
      	(vstrwq_scatter_base_wb_u32): Likewise.
      	(vstrwq_scatter_base_wb_f32): Likewise.
      	(__arm_vldrdq_gather_base_wb_s64): Define intrinsic.
      	(__arm_vldrdq_gather_base_wb_u64): Likewise.
      	(__arm_vldrdq_gather_base_wb_z_s64): Likewise.
      	(__arm_vldrdq_gather_base_wb_z_u64): Likewise.
      	(__arm_vldrwq_gather_base_wb_s32): Likewise.
      	(__arm_vldrwq_gather_base_wb_u32): Likewise.
      	(__arm_vldrwq_gather_base_wb_z_s32): Likewise.
      	(__arm_vldrwq_gather_base_wb_z_u32): Likewise.
      	(__arm_vstrdq_scatter_base_wb_s64): Likewise.
      	(__arm_vstrdq_scatter_base_wb_u64): Likewise.
      	(__arm_vstrdq_scatter_base_wb_p_s64): Likewise.
      	(__arm_vstrdq_scatter_base_wb_p_u64): Likewise.
      	(__arm_vstrwq_scatter_base_wb_p_s32): Likewise.
      	(__arm_vstrwq_scatter_base_wb_p_u32): Likewise.
      	(__arm_vstrwq_scatter_base_wb_s32): Likewise.
      	(__arm_vstrwq_scatter_base_wb_u32): Likewise.
      	(__arm_vldrwq_gather_base_wb_f32): Likewise.
      	(__arm_vldrwq_gather_base_wb_z_f32): Likewise.
      	(__arm_vstrwq_scatter_base_wb_f32): Likewise.
      	(__arm_vstrwq_scatter_base_wb_p_f32): Likewise.
      	(vstrwq_scatter_base_wb): Define polymorphic variant.
      	(vstrwq_scatter_base_wb_p): Likewise.
      	(vstrdq_scatter_base_wb_p): Likewise.
      	(vstrdq_scatter_base_wb): Likewise.
      	* config/arm/arm_mve_builtins.def (LDRGBWBS_QUALIFIERS): Use builtin
      	qualifier.
      	* config/arm/mve.md (mve_vstrwq_scatter_base_wb_<supf>v4si): Define RTL
      	pattern.
      	(mve_vstrwq_scatter_base_wb_add_<supf>v4si): Likewise.
      	(mve_vstrwq_scatter_base_wb_<supf>v4si_insn): Likewise.
      	(mve_vstrwq_scatter_base_wb_p_<supf>v4si): Likewise.
      	(mve_vstrwq_scatter_base_wb_p_add_<supf>v4si): Likewise.
      	(mve_vstrwq_scatter_base_wb_p_<supf>v4si_insn): Likewise.
      	(mve_vstrwq_scatter_base_wb_fv4sf): Likewise.
      	(mve_vstrwq_scatter_base_wb_add_fv4sf): Likewise.
      	(mve_vstrwq_scatter_base_wb_fv4sf_insn): Likewise.
      	(mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise.
      	(mve_vstrwq_scatter_base_wb_p_add_fv4sf): Likewise.
      	(mve_vstrwq_scatter_base_wb_p_fv4sf_insn): Likewise.
      	(mve_vstrdq_scatter_base_wb_<supf>v2di): Likewise.
      	(mve_vstrdq_scatter_base_wb_add_<supf>v2di): Likewise.
      	(mve_vstrdq_scatter_base_wb_<supf>v2di_insn): Likewise.
      	(mve_vstrdq_scatter_base_wb_p_<supf>v2di): Likewise.
      	(mve_vstrdq_scatter_base_wb_p_add_<supf>v2di): Likewise.
      	(mve_vstrdq_scatter_base_wb_p_<supf>v2di_insn): Likewise.
      	(mve_vldrwq_gather_base_wb_<supf>v4si): Likewise.
      	(mve_vldrwq_gather_base_wb_<supf>v4si_insn): Likewise.
      	(mve_vldrwq_gather_base_wb_z_<supf>v4si): Likewise.
      	(mve_vldrwq_gather_base_wb_z_<supf>v4si_insn): Likewise.
      	(mve_vldrwq_gather_base_wb_fv4sf): Likewise.
      	(mve_vldrwq_gather_base_wb_fv4sf_insn): Likewise.
      	(mve_vldrwq_gather_base_wb_z_fv4sf): Likewise.
      	(mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise.
      	(mve_vldrdq_gather_base_wb_<supf>v2di): Likewise.
      	(mve_vldrdq_gather_base_wb_<supf>v2di_insn): Likewise.
      	(mve_vldrdq_gather_base_wb_z_<supf>v2di): Likewise.
      	(mve_vldrdq_gather_base_wb_z_<supf>v2di_insn): Likewise.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_s64.c: New test.
      	* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_u64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_z_s64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_z_u64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_z_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_z_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_z_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_p_s64.c:
      	Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_p_u64.c:
      	Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_s64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_u64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_p_f32.c:
      	Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_p_s32.c:
      	Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_p_u32.c:
      	Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_u32.c: Likewise.
      Srinath Parvathaneni committed
    • [ARM][GCC][1/8x]: MVE ACLE vidup, vddup, viwdup and vdwdup intrinsics with writeback. · 92f80065
This patch supports the following MVE ACLE intrinsics with writeback.
      
vddupq_m_n_u8, vddupq_m_n_u32, vddupq_m_n_u16, vddupq_m_wb_u8, vddupq_m_wb_u16, vddupq_m_wb_u32,
vddupq_n_u8, vddupq_n_u32, vddupq_n_u16, vddupq_wb_u8, vddupq_wb_u16, vddupq_wb_u32,
vdwdupq_m_n_u8, vdwdupq_m_n_u32, vdwdupq_m_n_u16, vdwdupq_m_wb_u8, vdwdupq_m_wb_u32, vdwdupq_m_wb_u16,
vdwdupq_n_u8, vdwdupq_n_u32, vdwdupq_n_u16, vdwdupq_wb_u8, vdwdupq_wb_u32, vdwdupq_wb_u16,
vidupq_m_n_u8, vidupq_m_n_u32, vidupq_m_n_u16, vidupq_m_wb_u8, vidupq_m_wb_u16, vidupq_m_wb_u32,
vidupq_n_u8, vidupq_n_u32, vidupq_n_u16, vidupq_wb_u8, vidupq_wb_u16, vidupq_wb_u32,
viwdupq_m_n_u8, viwdupq_m_n_u32, viwdupq_m_n_u16, viwdupq_m_wb_u8, viwdupq_m_wb_u32, viwdupq_m_wb_u16,
viwdupq_n_u8, viwdupq_n_u32, viwdupq_n_u16, viwdupq_wb_u8, viwdupq_wb_u32, viwdupq_wb_u16.
      
Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
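
      For illustration only (not part of this patch): a minimal usage sketch, assuming an MVE-enabled toolchain; the helper name is illustrative. vidupq produces an incrementing sequence of lane values, and the "_wb" form additionally writes the next start value back through the pointer so it can feed the following iteration.

	#include <arm_mve.h>

	uint32x4_t
	next_indices (uint32_t *start)
	{
	  /* Lanes are {*start, *start + 2, *start + 4, *start + 6}; *start is
	     then advanced past the lanes just produced.  */
	  return vidupq_wb_u32 (start, 2);
	}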
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* config/arm/arm-builtins.c
      	(QUINOP_UNONE_UNONE_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Define quinary
      	builtin qualifier.
      	* config/arm/arm_mve.h (vddupq_m_n_u8): Define macro.
      	(vddupq_m_n_u32): Likewise.
      	(vddupq_m_n_u16): Likewise.
      	(vddupq_m_wb_u8): Likewise.
      	(vddupq_m_wb_u16): Likewise.
      	(vddupq_m_wb_u32): Likewise.
      	(vddupq_n_u8): Likewise.
      	(vddupq_n_u32): Likewise.
      	(vddupq_n_u16): Likewise.
      	(vddupq_wb_u8): Likewise.
      	(vddupq_wb_u16): Likewise.
      	(vddupq_wb_u32): Likewise.
      	(vdwdupq_m_n_u8): Likewise.
      	(vdwdupq_m_n_u32): Likewise.
      	(vdwdupq_m_n_u16): Likewise.
      	(vdwdupq_m_wb_u8): Likewise.
      	(vdwdupq_m_wb_u32): Likewise.
      	(vdwdupq_m_wb_u16): Likewise.
      	(vdwdupq_n_u8): Likewise.
      	(vdwdupq_n_u32): Likewise.
      	(vdwdupq_n_u16): Likewise.
      	(vdwdupq_wb_u8): Likewise.
      	(vdwdupq_wb_u32): Likewise.
      	(vdwdupq_wb_u16): Likewise.
      	(vidupq_m_n_u8): Likewise.
      	(vidupq_m_n_u32): Likewise.
      	(vidupq_m_n_u16): Likewise.
      	(vidupq_m_wb_u8): Likewise.
      	(vidupq_m_wb_u16): Likewise.
      	(vidupq_m_wb_u32): Likewise.
      	(vidupq_n_u8): Likewise.
      	(vidupq_n_u32): Likewise.
      	(vidupq_n_u16): Likewise.
      	(vidupq_wb_u8): Likewise.
      	(vidupq_wb_u16): Likewise.
      	(vidupq_wb_u32): Likewise.
      	(viwdupq_m_n_u8): Likewise.
      	(viwdupq_m_n_u32): Likewise.
      	(viwdupq_m_n_u16): Likewise.
      	(viwdupq_m_wb_u8): Likewise.
      	(viwdupq_m_wb_u32): Likewise.
      	(viwdupq_m_wb_u16): Likewise.
      	(viwdupq_n_u8): Likewise.
      	(viwdupq_n_u32): Likewise.
      	(viwdupq_n_u16): Likewise.
      	(viwdupq_wb_u8): Likewise.
      	(viwdupq_wb_u32): Likewise.
      	(viwdupq_wb_u16): Likewise.
      	(__arm_vddupq_m_n_u8): Define intrinsic.
      	(__arm_vddupq_m_n_u32): Likewise.
      	(__arm_vddupq_m_n_u16): Likewise.
      	(__arm_vddupq_m_wb_u8): Likewise.
      	(__arm_vddupq_m_wb_u16): Likewise.
      	(__arm_vddupq_m_wb_u32): Likewise.
      	(__arm_vddupq_n_u8): Likewise.
      	(__arm_vddupq_n_u32): Likewise.
      	(__arm_vddupq_n_u16): Likewise.
      	(__arm_vdwdupq_m_n_u8): Likewise.
      	(__arm_vdwdupq_m_n_u32): Likewise.
      	(__arm_vdwdupq_m_n_u16): Likewise.
      	(__arm_vdwdupq_m_wb_u8): Likewise.
      	(__arm_vdwdupq_m_wb_u32): Likewise.
      	(__arm_vdwdupq_m_wb_u16): Likewise.
      	(__arm_vdwdupq_n_u8): Likewise.
      	(__arm_vdwdupq_n_u32): Likewise.
      	(__arm_vdwdupq_n_u16): Likewise.
      	(__arm_vdwdupq_wb_u8): Likewise.
      	(__arm_vdwdupq_wb_u32): Likewise.
      	(__arm_vdwdupq_wb_u16): Likewise.
      	(__arm_vidupq_m_n_u8): Likewise.
      	(__arm_vidupq_m_n_u32): Likewise.
      	(__arm_vidupq_m_n_u16): Likewise.
      	(__arm_vidupq_n_u8): Likewise.
      	(__arm_vidupq_m_wb_u8): Likewise.
      	(__arm_vidupq_m_wb_u16): Likewise.
      	(__arm_vidupq_m_wb_u32): Likewise.
      	(__arm_vidupq_n_u32): Likewise.
      	(__arm_vidupq_n_u16): Likewise.
      	(__arm_vidupq_wb_u8): Likewise.
      	(__arm_vidupq_wb_u16): Likewise.
      	(__arm_vidupq_wb_u32): Likewise.
      	(__arm_vddupq_wb_u8): Likewise.
      	(__arm_vddupq_wb_u16): Likewise.
      	(__arm_vddupq_wb_u32): Likewise.
      	(__arm_viwdupq_m_n_u8): Likewise.
      	(__arm_viwdupq_m_n_u32): Likewise.
      	(__arm_viwdupq_m_n_u16): Likewise.
      	(__arm_viwdupq_m_wb_u8): Likewise.
      	(__arm_viwdupq_m_wb_u32): Likewise.
      	(__arm_viwdupq_m_wb_u16): Likewise.
      	(__arm_viwdupq_n_u8): Likewise.
      	(__arm_viwdupq_n_u32): Likewise.
      	(__arm_viwdupq_n_u16): Likewise.
      	(__arm_viwdupq_wb_u8): Likewise.
      	(__arm_viwdupq_wb_u32): Likewise.
      	(__arm_viwdupq_wb_u16): Likewise.
      	(vidupq_m): Define polymorphic variant.
      	(vddupq_m): Likewise.
      	(vidupq_u16): Likewise.
      	(vidupq_u32): Likewise.
      	(vidupq_u8): Likewise.
      	(vddupq_u16): Likewise.
      	(vddupq_u32): Likewise.
      	(vddupq_u8): Likewise.
      	(viwdupq_m): Likewise.
      	(viwdupq_u16): Likewise.
      	(viwdupq_u32): Likewise.
      	(viwdupq_u8): Likewise.
      	(vdwdupq_m): Likewise.
      	(vdwdupq_u16): Likewise.
      	(vdwdupq_u32): Likewise.
      	(vdwdupq_u8): Likewise.
      	* config/arm/arm_mve_builtins.def
      	(QUINOP_UNONE_UNONE_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Use builtin
      	qualifier.
      	* config/arm/mve.md (mve_vidupq_n_u<mode>): Define RTL pattern.
      	(mve_vidupq_u<mode>_insn): Likewise.
      	(mve_vidupq_m_n_u<mode>): Likewise.
      	(mve_vidupq_m_wb_u<mode>_insn): Likewise.
      	(mve_vddupq_n_u<mode>): Likewise.
      	(mve_vddupq_u<mode>_insn): Likewise.
      	(mve_vddupq_m_n_u<mode>): Likewise.
      	(mve_vddupq_m_wb_u<mode>_insn): Likewise.
      	(mve_vdwdupq_n_u<mode>): Likewise.
      	(mve_vdwdupq_wb_u<mode>): Likewise.
      	(mve_vdwdupq_wb_u<mode>_insn): Likewise.
      	(mve_vdwdupq_m_n_u<mode>): Likewise.
      	(mve_vdwdupq_m_wb_u<mode>): Likewise.
      	(mve_vdwdupq_m_wb_u<mode>_insn): Likewise.
      	(mve_viwdupq_n_u<mode>): Likewise.
      	(mve_viwdupq_wb_u<mode>): Likewise.
      	(mve_viwdupq_wb_u<mode>_insn): Likewise.
      	(mve_viwdupq_m_n_u<mode>): Likewise.
      	(mve_viwdupq_m_wb_u<mode>): Likewise.
      	(mve_viwdupq_m_wb_u<mode>_insn): Likewise.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vddupq_m_n_u16.c: New test.
      	* gcc.target/arm/mve/intrinsics/vddupq_m_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_m_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_m_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_m_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_m_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vddupq_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_m_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_m_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_m_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_m_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_m_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_m_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vdwdupq_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_m_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_m_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_m_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_m_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_m_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_m_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vidupq_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_m_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_m_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_m_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_m_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_m_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_m_wb_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_n_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_n_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_n_u8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_wb_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_wb_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/viwdupq_wb_u8.c: Likewise.
      Srinath Parvathaneni committed
    • [ARM][GCC][7x]: MVE vreinterpretq and vuninitializedq intrinsics. · 85a94e87
      This patch supports the following MVE ACLE intrinsics.
      
      vreinterpretq_s16_s32, vreinterpretq_s16_s64, vreinterpretq_s16_s8, vreinterpretq_s16_u16,
      vreinterpretq_s16_u32, vreinterpretq_s16_u64, vreinterpretq_s16_u8, vreinterpretq_s32_s16,
      vreinterpretq_s32_s64, vreinterpretq_s32_s8, vreinterpretq_s32_u16, vreinterpretq_s32_u32,
      vreinterpretq_s32_u64, vreinterpretq_s32_u8, vreinterpretq_s64_s16, vreinterpretq_s64_s32,
      vreinterpretq_s64_s8, vreinterpretq_s64_u16, vreinterpretq_s64_u32, vreinterpretq_s64_u64,
      vreinterpretq_s64_u8, vreinterpretq_s8_s16, vreinterpretq_s8_s32, vreinterpretq_s8_s64,
      vreinterpretq_s8_u16, vreinterpretq_s8_u32, vreinterpretq_s8_u64, vreinterpretq_s8_u8,
      vreinterpretq_u16_s16, vreinterpretq_u16_s32, vreinterpretq_u16_s64, vreinterpretq_u16_s8,
      vreinterpretq_u16_u32, vreinterpretq_u16_u64, vreinterpretq_u16_u8, vreinterpretq_u32_s16,
      vreinterpretq_u32_s32, vreinterpretq_u32_s64, vreinterpretq_u32_s8, vreinterpretq_u32_u16,
      vreinterpretq_u32_u64, vreinterpretq_u32_u8, vreinterpretq_u64_s16, vreinterpretq_u64_s32,
      vreinterpretq_u64_s64, vreinterpretq_u64_s8, vreinterpretq_u64_u16, vreinterpretq_u64_u32,
      vreinterpretq_u64_u8, vreinterpretq_u8_s16, vreinterpretq_u8_s32, vreinterpretq_u8_s64,
      vreinterpretq_u8_s8, vreinterpretq_u8_u16, vreinterpretq_u8_u32, vreinterpretq_u8_u64,
      vreinterpretq_s32_f16, vreinterpretq_s32_f32, vreinterpretq_u16_f16, vreinterpretq_u16_f32,
      vreinterpretq_u32_f16, vreinterpretq_u32_f32, vreinterpretq_u64_f16, vreinterpretq_u64_f32,
      vreinterpretq_u8_f16, vreinterpretq_u8_f32, vreinterpretq_f16_f32, vreinterpretq_f16_s16,
      vreinterpretq_f16_s32, vreinterpretq_f16_s64, vreinterpretq_f16_s8, vreinterpretq_f16_u16,
      vreinterpretq_f16_u32, vreinterpretq_f16_u64, vreinterpretq_f16_u8, vreinterpretq_f32_f16,
      vreinterpretq_f32_s16, vreinterpretq_f32_s32, vreinterpretq_f32_s64, vreinterpretq_f32_s8,
      vreinterpretq_f32_u16, vreinterpretq_f32_u32, vreinterpretq_f32_u64, vreinterpretq_f32_u8,
      vreinterpretq_s16_f16, vreinterpretq_s16_f32, vreinterpretq_s64_f16, vreinterpretq_s64_f32,
      vreinterpretq_s8_f16, vreinterpretq_s8_f32, vuninitializedq_u8, vuninitializedq_u16,
      vuninitializedq_u32, vuninitializedq_u64, vuninitializedq_s8, vuninitializedq_s16,
      vuninitializedq_s32, vuninitializedq_s64, vuninitializedq_f16, vuninitializedq_f32 and
      vuninitializedq.
      
      Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
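
      As a rough usage illustration (a minimal sketch, not part of the patch;
      it assumes an MVE-enabled target, e.g. -march=armv8.1-m.main+mve.fp
      -mfloat-abi=hard):

        #include <arm_mve.h>

        uint32x4_t
        reinterpret_example (int16x8_t a)
        {
          /* Explicit form: reinterpret the 128-bit value as four u32 lanes.  */
          uint32x4_t b = vreinterpretq_u32_s16 (a);

          /* Polymorphic form: the target type is in the name, the source
             type is inferred from the argument.  */
          uint32x4_t c = vreinterpretq_u32 (a);
          (void) c;

          /* An intentionally uninitialized vector, usable as a don't-care
             input to other intrinsics.  */
          uint32x4_t d = vuninitializedq_u32 ();
          (void) d;

          return b;
        }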
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
      
      	* config/arm/arm_mve.h (vreinterpretq_s16_s32): Define macro.
      	(vreinterpretq_s16_s64): Likewise.
      	(vreinterpretq_s16_s8): Likewise.
      	(vreinterpretq_s16_u16): Likewise.
      	(vreinterpretq_s16_u32): Likewise.
      	(vreinterpretq_s16_u64): Likewise.
      	(vreinterpretq_s16_u8): Likewise.
      	(vreinterpretq_s32_s16): Likewise.
      	(vreinterpretq_s32_s64): Likewise.
      	(vreinterpretq_s32_s8): Likewise.
      	(vreinterpretq_s32_u16): Likewise.
      	(vreinterpretq_s32_u32): Likewise.
      	(vreinterpretq_s32_u64): Likewise.
      	(vreinterpretq_s32_u8): Likewise.
      	(vreinterpretq_s64_s16): Likewise.
      	(vreinterpretq_s64_s32): Likewise.
      	(vreinterpretq_s64_s8): Likewise.
      	(vreinterpretq_s64_u16): Likewise.
      	(vreinterpretq_s64_u32): Likewise.
      	(vreinterpretq_s64_u64): Likewise.
      	(vreinterpretq_s64_u8): Likewise.
      	(vreinterpretq_s8_s16): Likewise.
      	(vreinterpretq_s8_s32): Likewise.
      	(vreinterpretq_s8_s64): Likewise.
      	(vreinterpretq_s8_u16): Likewise.
      	(vreinterpretq_s8_u32): Likewise.
      	(vreinterpretq_s8_u64): Likewise.
      	(vreinterpretq_s8_u8): Likewise.
      	(vreinterpretq_u16_s16): Likewise.
      	(vreinterpretq_u16_s32): Likewise.
      	(vreinterpretq_u16_s64): Likewise.
      	(vreinterpretq_u16_s8): Likewise.
      	(vreinterpretq_u16_u32): Likewise.
      	(vreinterpretq_u16_u64): Likewise.
      	(vreinterpretq_u16_u8): Likewise.
      	(vreinterpretq_u32_s16): Likewise.
      	(vreinterpretq_u32_s32): Likewise.
      	(vreinterpretq_u32_s64): Likewise.
      	(vreinterpretq_u32_s8): Likewise.
      	(vreinterpretq_u32_u16): Likewise.
      	(vreinterpretq_u32_u64): Likewise.
      	(vreinterpretq_u32_u8): Likewise.
      	(vreinterpretq_u64_s16): Likewise.
      	(vreinterpretq_u64_s32): Likewise.
      	(vreinterpretq_u64_s64): Likewise.
      	(vreinterpretq_u64_s8): Likewise.
      	(vreinterpretq_u64_u16): Likewise.
      	(vreinterpretq_u64_u32): Likewise.
      	(vreinterpretq_u64_u8): Likewise.
      	(vreinterpretq_u8_s16): Likewise.
      	(vreinterpretq_u8_s32): Likewise.
      	(vreinterpretq_u8_s64): Likewise.
      	(vreinterpretq_u8_s8): Likewise.
      	(vreinterpretq_u8_u16): Likewise.
      	(vreinterpretq_u8_u32): Likewise.
      	(vreinterpretq_u8_u64): Likewise.
      	(vreinterpretq_s32_f16): Likewise.
      	(vreinterpretq_s32_f32): Likewise.
      	(vreinterpretq_u16_f16): Likewise.
      	(vreinterpretq_u16_f32): Likewise.
      	(vreinterpretq_u32_f16): Likewise.
      	(vreinterpretq_u32_f32): Likewise.
      	(vreinterpretq_u64_f16): Likewise.
      	(vreinterpretq_u64_f32): Likewise.
      	(vreinterpretq_u8_f16): Likewise.
      	(vreinterpretq_u8_f32): Likewise.
      	(vreinterpretq_f16_f32): Likewise.
      	(vreinterpretq_f16_s16): Likewise.
      	(vreinterpretq_f16_s32): Likewise.
      	(vreinterpretq_f16_s64): Likewise.
      	(vreinterpretq_f16_s8): Likewise.
      	(vreinterpretq_f16_u16): Likewise.
      	(vreinterpretq_f16_u32): Likewise.
      	(vreinterpretq_f16_u64): Likewise.
      	(vreinterpretq_f16_u8): Likewise.
      	(vreinterpretq_f32_f16): Likewise.
      	(vreinterpretq_f32_s16): Likewise.
      	(vreinterpretq_f32_s32): Likewise.
      	(vreinterpretq_f32_s64): Likewise.
      	(vreinterpretq_f32_s8): Likewise.
      	(vreinterpretq_f32_u16): Likewise.
      	(vreinterpretq_f32_u32): Likewise.
      	(vreinterpretq_f32_u64): Likewise.
      	(vreinterpretq_f32_u8): Likewise.
      	(vreinterpretq_s16_f16): Likewise.
      	(vreinterpretq_s16_f32): Likewise.
      	(vreinterpretq_s64_f16): Likewise.
      	(vreinterpretq_s64_f32): Likewise.
      	(vreinterpretq_s8_f16): Likewise.
      	(vreinterpretq_s8_f32): Likewise.
      	(vuninitializedq_u8): Likewise.
      	(vuninitializedq_u16): Likewise.
      	(vuninitializedq_u32): Likewise.
      	(vuninitializedq_u64): Likewise.
      	(vuninitializedq_s8): Likewise.
      	(vuninitializedq_s16): Likewise.
      	(vuninitializedq_s32): Likewise.
      	(vuninitializedq_s64): Likewise.
      	(vuninitializedq_f16): Likewise.
      	(vuninitializedq_f32): Likewise.
      	(__arm_vuninitializedq_u8): Define intrinsic.
      	(__arm_vuninitializedq_u16): Likewise.
      	(__arm_vuninitializedq_u32): Likewise.
      	(__arm_vuninitializedq_u64): Likewise.
      	(__arm_vuninitializedq_s8): Likewise.
      	(__arm_vuninitializedq_s16): Likewise.
      	(__arm_vuninitializedq_s32): Likewise.
      	(__arm_vuninitializedq_s64): Likewise.
      	(__arm_vreinterpretq_s16_s32): Likewise.
      	(__arm_vreinterpretq_s16_s64): Likewise.
      	(__arm_vreinterpretq_s16_s8): Likewise.
      	(__arm_vreinterpretq_s16_u16): Likewise.
      	(__arm_vreinterpretq_s16_u32): Likewise.
      	(__arm_vreinterpretq_s16_u64): Likewise.
      	(__arm_vreinterpretq_s16_u8): Likewise.
      	(__arm_vreinterpretq_s32_s16): Likewise.
      	(__arm_vreinterpretq_s32_s64): Likewise.
      	(__arm_vreinterpretq_s32_s8): Likewise.
      	(__arm_vreinterpretq_s32_u16): Likewise.
      	(__arm_vreinterpretq_s32_u32): Likewise.
      	(__arm_vreinterpretq_s32_u64): Likewise.
      	(__arm_vreinterpretq_s32_u8): Likewise.
      	(__arm_vreinterpretq_s64_s16): Likewise.
      	(__arm_vreinterpretq_s64_s32): Likewise.
      	(__arm_vreinterpretq_s64_s8): Likewise.
      	(__arm_vreinterpretq_s64_u16): Likewise.
      	(__arm_vreinterpretq_s64_u32): Likewise.
      	(__arm_vreinterpretq_s64_u64): Likewise.
      	(__arm_vreinterpretq_s64_u8): Likewise.
      	(__arm_vreinterpretq_s8_s16): Likewise.
      	(__arm_vreinterpretq_s8_s32): Likewise.
      	(__arm_vreinterpretq_s8_s64): Likewise.
      	(__arm_vreinterpretq_s8_u16): Likewise.
      	(__arm_vreinterpretq_s8_u32): Likewise.
      	(__arm_vreinterpretq_s8_u64): Likewise.
      	(__arm_vreinterpretq_s8_u8): Likewise.
      	(__arm_vreinterpretq_u16_s16): Likewise.
      	(__arm_vreinterpretq_u16_s32): Likewise.
      	(__arm_vreinterpretq_u16_s64): Likewise.
      	(__arm_vreinterpretq_u16_s8): Likewise.
      	(__arm_vreinterpretq_u16_u32): Likewise.
      	(__arm_vreinterpretq_u16_u64): Likewise.
      	(__arm_vreinterpretq_u16_u8): Likewise.
      	(__arm_vreinterpretq_u32_s16): Likewise.
      	(__arm_vreinterpretq_u32_s32): Likewise.
      	(__arm_vreinterpretq_u32_s64): Likewise.
      	(__arm_vreinterpretq_u32_s8): Likewise.
      	(__arm_vreinterpretq_u32_u16): Likewise.
      	(__arm_vreinterpretq_u32_u64): Likewise.
      	(__arm_vreinterpretq_u32_u8): Likewise.
      	(__arm_vreinterpretq_u64_s16): Likewise.
      	(__arm_vreinterpretq_u64_s32): Likewise.
      	(__arm_vreinterpretq_u64_s64): Likewise.
      	(__arm_vreinterpretq_u64_s8): Likewise.
      	(__arm_vreinterpretq_u64_u16): Likewise.
      	(__arm_vreinterpretq_u64_u32): Likewise.
      	(__arm_vreinterpretq_u64_u8): Likewise.
      	(__arm_vreinterpretq_u8_s16): Likewise.
      	(__arm_vreinterpretq_u8_s32): Likewise.
      	(__arm_vreinterpretq_u8_s64): Likewise.
      	(__arm_vreinterpretq_u8_s8): Likewise.
      	(__arm_vreinterpretq_u8_u16): Likewise.
      	(__arm_vreinterpretq_u8_u32): Likewise.
      	(__arm_vreinterpretq_u8_u64): Likewise.
      	(__arm_vuninitializedq_f16): Likewise.
      	(__arm_vuninitializedq_f32): Likewise.
      	(__arm_vreinterpretq_s32_f16): Likewise.
      	(__arm_vreinterpretq_s32_f32): Likewise.
      	(__arm_vreinterpretq_s16_f16): Likewise.
      	(__arm_vreinterpretq_s16_f32): Likewise.
      	(__arm_vreinterpretq_s64_f16): Likewise.
      	(__arm_vreinterpretq_s64_f32): Likewise.
      	(__arm_vreinterpretq_s8_f16): Likewise.
      	(__arm_vreinterpretq_s8_f32): Likewise.
      	(__arm_vreinterpretq_u16_f16): Likewise.
      	(__arm_vreinterpretq_u16_f32): Likewise.
      	(__arm_vreinterpretq_u32_f16): Likewise.
      	(__arm_vreinterpretq_u32_f32): Likewise.
      	(__arm_vreinterpretq_u64_f16): Likewise.
      	(__arm_vreinterpretq_u64_f32): Likewise.
      	(__arm_vreinterpretq_u8_f16): Likewise.
      	(__arm_vreinterpretq_u8_f32): Likewise.
      	(__arm_vreinterpretq_f16_f32): Likewise.
      	(__arm_vreinterpretq_f16_s16): Likewise.
      	(__arm_vreinterpretq_f16_s32): Likewise.
      	(__arm_vreinterpretq_f16_s64): Likewise.
      	(__arm_vreinterpretq_f16_s8): Likewise.
      	(__arm_vreinterpretq_f16_u16): Likewise.
      	(__arm_vreinterpretq_f16_u32): Likewise.
      	(__arm_vreinterpretq_f16_u64): Likewise.
      	(__arm_vreinterpretq_f16_u8): Likewise.
      	(__arm_vreinterpretq_f32_f16): Likewise.
      	(__arm_vreinterpretq_f32_s16): Likewise.
      	(__arm_vreinterpretq_f32_s32): Likewise.
      	(__arm_vreinterpretq_f32_s64): Likewise.
      	(__arm_vreinterpretq_f32_s8): Likewise.
      	(__arm_vreinterpretq_f32_u16): Likewise.
      	(__arm_vreinterpretq_f32_u32): Likewise.
      	(__arm_vreinterpretq_f32_u64): Likewise.
      	(__arm_vreinterpretq_f32_u8): Likewise.
      	(vuninitializedq): Define polymorphic variant.
      	(vreinterpretq_f16): Likewise.
      	(vreinterpretq_f32): Likewise.
      	(vreinterpretq_s16): Likewise.
      	(vreinterpretq_s32): Likewise.
      	(vreinterpretq_s64): Likewise.
      	(vreinterpretq_s8): Likewise.
      	(vreinterpretq_u16): Likewise.
      	(vreinterpretq_u32): Likewise.
      	(vreinterpretq_u64): Likewise.
      	(vreinterpretq_u8): Likewise.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vuninitializedq_float.c: New test.
      	* gcc.target/arm/mve/intrinsics/vuninitializedq_float1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vuninitializedq_int.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vuninitializedq_int1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_f16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_s64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_u64.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vreinterpretq_u8.c: Likewise.
      Srinath Parvathaneni committed
    • [ARM][GCC][6x]:MVE ACLE vaddq intrinsics using arithmetic plus operator. · 3eff57aa
      This patch supports the following MVE ACLE vaddq intrinsics.  The RTL
      patterns for these intrinsics are added using the arithmetic "plus" operator.
      
      vaddq_s8, vaddq_s16, vaddq_s32, vaddq_u8, vaddq_u16, vaddq_u32, vaddq_f16, vaddq_f32.
      
      Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
      [1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
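
      As a rough usage sketch (not part of the patch; it assumes an MVE-enabled
      target, e.g. -march=armv8.1-m.main+mve.fp -mfloat-abi=hard):

        #include <arm_mve.h>

        int32x4_t
        add_s32 (int32x4_t a, int32x4_t b)
        {
          /* Explicit form; since the backing pattern is the standard "plus"
             RTL, plain element-wise vector additions can use it as well.  */
          return vaddq_s32 (a, b);
        }

        float16x8_t
        add_f16 (float16x8_t a, float16x8_t b)
        {
          /* Polymorphic form: the element type is inferred from the arguments.  */
          return vaddq (a, b);
        }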
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* config/arm/arm_mve.h (vaddq_s8): Define macro.
      	(vaddq_s16): Likewise.
      	(vaddq_s32): Likewise.
      	(vaddq_u8): Likewise.
      	(vaddq_u16): Likewise.
      	(vaddq_u32): Likewise.
      	(vaddq_f16): Likewise.
      	(vaddq_f32): Likewise.
      	(__arm_vaddq_s8): Define intrinsic.
      	(__arm_vaddq_s16): Likewise.
      	(__arm_vaddq_s32): Likewise.
      	(__arm_vaddq_u8): Likewise.
      	(__arm_vaddq_u16): Likewise.
      	(__arm_vaddq_u32): Likewise.
      	(__arm_vaddq_f16): Likewise.
      	(__arm_vaddq_f32): Likewise.
      	(vaddq): Define polymorphic variant.
      	* config/arm/iterators.md (VNIM): Define mode iterator for types
      	common to Neon, IWMMXT and MVE.
      	(VNINOTM): Likewise.
      	* config/arm/mve.md (mve_vaddq<mode>): Define RTL pattern.
      	(mve_vaddq_f<mode>): Define RTL pattern.
      	* config/arm/neon.md (add<mode>3): Rename to addv4hf3 RTL pattern.
      	(addv8hf3_neon): Define RTL pattern.
      	* config/arm/vec-common.md (add<mode>3): Modify standard add RTL pattern
      	to support MVE.
      	(addv8hf3): Define standard RTL pattern for MVE and Neon.
      	(add<mode>3): Modify existing standard add RTL pattern for Neon and IWMMXT.
      
      gcc/testsuite/ChangeLog:
      
      2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
                  Andre Vieira  <andre.simoesdiasvieira@arm.com>
                  Mihail Ionescu  <mihail.ionescu@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/vaddq_f16.c: New test.
      	* gcc.target/arm/mve/intrinsics/vaddq_f32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_s16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_s32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_s8.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_u16.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_u32.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vaddq_u8.c: Likewise.
      Srinath Parvathaneni committed
    • Fix correct offset in ipa_get_jf_ancestor_result. · 7d4549b2
      	PR ipa/94232
      	* ipa-cp.c (ipa_get_jf_ancestor_result): Use the offset in bytes.
      	Previously the build_ref_for_offset function was used, which
      	converted the offset from bits to bytes.
      Martin Liska committed
    • tree-optimization/94266 - fix object type extraction heuristics · 8fefa21f
      This fixes the heuristic that derives an actual object type from a
      MEM_REF's pointer operand so that it uses the more sensible type of the
      actual object instead of the pointed-to type.
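
      For illustration only (a hypothetical C sketch, not the PR testcase),
      the distinction involved is between the pointed-to type of the pointer
      operand and the type of the object actually accessed:

        char buf[16];             /* the actual underlying object: char[16] */

        int
        read_as_int (void)
        {
          /* Ignoring strict-aliasing concerns for the sake of illustration:
             the MEM_REF '*p' has a pointer operand of type 'int *', but the
             underlying object is 'buf'; the heuristic now prefers the type
             of that object when it can be determined.  */
          int *p = (int *) buf;
          return *p;
        }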
      
      2020-03-20  Richard Biener  <rguenther@suse.de>
      
      	PR tree-optimization/94266
      	* gimple-ssa-sprintf.c (get_origin_and_offset): Use the
      	type of the underlying object to adjust for the containing
      	field if available.
      Richard Biener committed
    • gcc, Arm: Revert changes to {get,set}_fpscr · 719c8642
      MVE made changes to {get,set}_fpscr to enable the compiler to optimize
      away unnecessary gets and sets when they are used for intrinsics that
      use and/or write the carry bit.  However, these patterns actually get
      and set the full FPSCR register and are used by the fp env intrinsics
      to modify the fp context, so MVE should not be using them.
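
      A minimal sketch of the fp-env style use that relies on full-register
      semantics (it assumes the GCC builtins __builtin_arm_get_fpscr and
      __builtin_arm_set_fpscr, which read and write the whole FPSCR):

        void
        call_with_saved_fp_context (void (*fn) (void))
        {
          /* Save the entire FP context, not just the carry bit...  */
          unsigned int saved_fpscr = __builtin_arm_get_fpscr ();
          fn ();
          /* ...and restore it afterwards; these accesses must not be removed
             or reordered as if only selected bits of the register mattered.  */
          __builtin_arm_set_fpscr (saved_fpscr);
        }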
      
      gcc/ChangeLog:
      2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>
      
      	* config/arm/unspecs.md (UNSPEC_GET_FPSCR): Rename this to ...
      	(VUNSPEC_GET_FPSCR): ... this, and move it to vunspec.
      	* config/arm/vfp.md (get_fpscr, set_fpscr): Revert to old patterns.
      Andre Simoes Dias Vieira committed
    • gcc, Arm: Fix testisms for MVE testsuite · 005f6fc5
      This patch fixes some testisms where -mfpu=auto was missing or where we
      could end up with both -mfloat-abi=hard and -mfloat-abi=soft on the same
      command line.
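
      For illustration, the adjustments are of this general shape (the option
      strings below are illustrative, not the exact testsuite changes):

        /* { dg-do compile } */
        /* Derive the FPU from the architecture with -mfpu=auto and pick a
           single float ABI rather than mixing hard and soft.  */
        /* { dg-additional-options "-march=armv8.1-m.main+mve.fp -mfpu=auto -mfloat-abi=hard" } */

        #include <arm_mve.h>

        float32x4_t
        f (float32x4_t a, float32x4_t b)
        {
          return vaddq_f32 (a, b);
        }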
      
      gcc/testsuite/ChangeLog:
      2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/mve_fp_fpu1.c: Fix testisms.
      	* gcc.target/arm/mve/intrinsics/mve_fp_fpu2.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_fpu1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_fpu2.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_fpu3.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_libcall1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_libcall2.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_float.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_float1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_float2.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_int.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_int1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_int2.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_uint.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_uint1.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/mve_vector_uint2.c: Likewise.
      	* gcc.target/arm/mve/intrinsics/vshrntq_m_n_u32.c: Likewise.
      Andre Simoes Dias Vieira committed
    • gcc, Arm: Fix MVE move from GPR -> GPR · 0efe7d87
      This patch fixes the mve_mov<mode> pattern for the case where both MVE
      vector operands are in R (core) registers and the move does not get
      optimized away.  I use the same approach as we do for Neon, where we use
      four register moves.
      
      gcc/ChangeLog:
      2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>
      
      	* config/arm/mve.md (mve_mov<mode>): Fix R->R case.
      
      gcc/testsuite/ChangeLog:
      2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>
      
      	* gcc.target/arm/mve/intrinsics/mve_move_gpr_to_gpr.c: New test.
      Andre Simoes Dias Vieira committed
    • store-merging: Fix up -fnon-call-exceptions handling [PR94224] · 4119cd69
      When we are adding a single store into a store group, we already check
      that store->lp_nr matches, but we also have code that adds further
      INTEGER_CST stores into the group right away if the ordering requires
      that we put either all or none of a certain set of stores there.  In
      those cases we weren't doing the lp_nr checks, which means we could end
      up with stores with different lp_nr in the same group, which then ICEs
      during output_merged_store.
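
      A hedged sketch (in C, compiled with -O2 -fexceptions
      -fnon-call-exceptions; the actual new test is C++) of the kind of shape
      involved, where adjacent constant stores sit in different EH regions:

        struct S { int a, b, c, d; };

        static void release (int *p) { (void) p; }

        void
        fill (struct S *x)
        {
          x->a = 1;      /* constant store outside the cleanup scope */
          {
            int guard __attribute__ ((cleanup (release))) = 0;
            /* Under -fnon-call-exceptions these stores may throw, and the
               pending cleanup puts them in a different EH region (different
               lp_nr) from the stores outside the scope, so they must not be
               merged into the same store group.  */
            x->b = 2;
            x->c = 3;
          }
          x->d = 4;      /* constant store outside the cleanup scope again */
        }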
      
      2020-03-20  Jakub Jelinek  <jakub@redhat.com>
      
      	PR tree-optimization/94224
      	* gimple-ssa-store-merging.c
      	(imm_store_chain_info::coalesce_immediate): Don't consider overlapping
      	or adjacent INTEGER_CST rhs_code stores as mergeable if they have
      	different lp_nr.
      
      	* g++.dg/tree-ssa/pr94224.C: New test.
      Jakub Jelinek committed
    • gcc, Arm: Fix no_cond issue introduced by MVE · 05009698
      This was a matter of mistaken logic in (define_attr "conds" ..): it was
      setting the conds attribute for any Neon instruction to no_cond, which
      was messing up code generation.
      
      gcc/ChangeLog:
      2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>
      
      	* config/arm/arm.md (define_attr "conds"): Fix logic for neon and mve.
      Andre Simoes Dias Vieira committed