[J3] New intrinsic XXX_PREFIX and XXX_SUFFIX procedures
Brad Richardson
everythingfunctional at protonmail.com
Tue Jan 31 15:50:48 UTC 2023
Hi All,
I have produced a new draft available for comment here:
https://github.com/j3-fortran/fortran_proposals/pull/290
Any further feedback is still welcome; I will be uploading it to J3
first thing next week.
Regards,
Brad
On Thu, 2023-01-26 at 16:52 +0000, Brad Richardson via J3 wrote:
> Hi all,
>
> I've been pondering this more, and I think the way I'd like to propose
> this is as a single new intrinsic. I think it could in theory be named
> SCAN, as the interface would be fully distinguishable from the
> existing SCAN intrinsic for character set membership search. The
> interface would be:
>
> SCAN(ARRAY, OPERATION[, IDENTITY, DIM, MASK, SEGMENT, EXCLUSIVE,
> REVERSED, ORDERED])
>
> The type of the result would be taken from IDENTITY (if present), and
> ARRAY and IDENTITY need not have the same type. The first argument and
> result of OPERATION must have the same type as IDENTITY, and the
> second argument must have the same type as ARRAY. If IDENTITY is not
> present, then the result of SCAN and the first argument and result of
> OPERATION all have the same type as ARRAY. If EXCLUSIVE is present
> with the value true, or the MASK or SEGMENT arguments are present with
> values that cause an element of the result to have no contributing
> elements from ARRAY, then IDENTITY must be present or error
> termination is initiated.
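> 
> To make the intended behavior concrete, here is a minimal, purely
> illustrative sketch of the simple inclusive case, with a plain DO loop
> standing in for the proposed intrinsic (nothing below is part of the
> proposed wording itself):
> 
> PROGRAM scan_semantics
>   implicit none
>   integer :: a(5), res(5), i
> 
>   a = [1, 2, 3, 4, 5]
>   ! reference loop for the proposed inclusive behavior of
>   !   res = SCAN(a, ADD)
>   ! i.e. res(i) = a(1) + a(2) + ... + a(i)
>   res(1) = a(1)
>   do i = 2, size(a)
>     res(i) = ADD(res(i-1), a(i))
>   end do
>   print *, res ! prints 1 3 6 10 15
> CONTAINS
>   PURE FUNCTION ADD(x, y) RESULT(r)
>     integer, intent(in) :: x, y
>     integer :: r
> 
>     r = x + y
>   END FUNCTION
> END PROGRAM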
>
> I'm leaning (but not strongly) towards interpreting the SEGMENT
> argument the same way that HPF did - consecutive elements of SEGMENT
> with .EQV. values constitute a segment of the result - because it
> makes the interactions between the MASK and DIM arguments easier to
> understand, and the value of the first (or last) element of SEGMENT
> actually has meaning. If there are arguments that the opposite
> interpretation - .true. values of SEGMENT mark the first element of a
> segment of the result - has better performance, is easier to
> understand, or has some other benefit I'm missing, please let me know.
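> 
> As a purely illustrative reference for the .EQV. interpretation (the
> name SEGMENTED_SUM_SCAN and the DO-loop formulation below are mine,
> just for this sketch, not part of the proposal):
> 
> PROGRAM segmented_scan_demo
>   implicit none
> 
>   ! under the .EQV. interpretation this prints 1 3 3 4 9
>   print *, SEGMENTED_SUM_SCAN([1, 2, 3, 4, 5], &
>       [.true., .true., .false., .true., .true.])
> CONTAINS
>   PURE FUNCTION SEGMENTED_SUM_SCAN(a, segment) RESULT(res)
>     integer, intent(in) :: a(:)
>     logical, intent(in) :: segment(:)
>     integer :: res(size(a))
>     integer :: i
> 
>     res(1) = a(1)
>     do i = 2, size(a)
>       if (segment(i) .eqv. segment(i-1)) then
>         ! same .EQV. value as the previous element: same segment
>         res(i) = res(i-1) + a(i)
>       else
>         ! the value changed, so a new segment starts here
>         res(i) = a(i)
>       end if
>     end do
>   END FUNCTION
> END PROGRAM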
>
> By defining SCAN in this way, every single one of the
> XXX_{PRE,SUF}FIX functions from HPF can be easily defined in terms of
> SCAN. I.e.
> 
> COUNT_SUFFIX(MASK, DIM, SEGMENT, EXCLUSIVE) ==
>     SCAN(ARRAY = MASK, OPERATION = INCREMENT, IDENTITY = 0, DIM = DIM,
>          SEGMENT = SEGMENT, EXCLUSIVE = EXCLUSIVE, REVERSED = .TRUE.)
>
> where
>
> PURE FUNCTION INCREMENT(x, y)
>   integer, intent(in) :: x
>   logical, intent(in) :: y
>   integer :: INCREMENT
> 
>   ! add 1 to the running count only where the mask element is true
>   if (y) then
>     INCREMENT = x + 1
>   else
>     INCREMENT = x
>   end if
> END FUNCTION
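> 
> As a sanity check on that equivalence, here is a plain reference loop
> for the reversed (suffix) count, ignoring SEGMENT and EXCLUSIVE for
> simplicity; the driver program is only for illustration:
> 
> PROGRAM count_suffix_demo
>   implicit none
>   logical :: mask(4)
>   integer :: res(4), i
> 
>   mask = [.true., .false., .true., .true.]
>   ! res(i) counts the .true. values in mask(i:), which is what
>   ! COUNT_SUFFIX(mask), i.e. the reversed SCAN above, should produce
>   res(size(mask)) = INCREMENT(0, mask(size(mask)))
>   do i = size(mask) - 1, 1, -1
>     res(i) = INCREMENT(res(i+1), mask(i))
>   end do
>   print *, res ! prints 3 2 2 1
> CONTAINS
>   PURE FUNCTION INCREMENT(x, y)
>     integer, intent(in) :: x
>     logical, intent(in) :: y
>     integer :: INCREMENT
> 
>     if (y) then
>       INCREMENT = x + 1
>     else
>       INCREMENT = x
>     end if
>   END FUNCTION
> END PROGRAM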
>
> I think CO_SCAN could be defined in a similar way. I.e.
>
> CO_SCAN(A, OPERATION[, IDENTITY, MASK, SEGMENT, EXCLUSIVE, REVERSED,
> STAT, ERRMSG])
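> 
> For intuition only, here is what the result of an inclusive
> integer-sum CO_SCAN would look like, written as a deliberately naive
> coarray loop (this is just my sketch, not a proposed implementation):
> 
> PROGRAM co_scan_sketch
>   implicit none
>   integer :: x[*], prefix, i
> 
>   x = this_image() ! each image contributes its own value
>   sync all
>   ! naive O(num_images) inclusive scan across images, illustrating
>   ! the intended result of CO_SCAN with an integer "+" operation
>   prefix = 0
>   do i = 1, this_image()
>     prefix = prefix + x[i]
>   end do
>   print *, 'image', this_image(), 'prefix sum =', prefix
> END PROGRAM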
>
> Any feedback on this idea is welcome. I'll endeavor to have a paper
> uploaded, including any feedback provided, by two weeks prior to the
> February meeting.
>
> Regards,
> Brad
>
> On Fri, 2023-01-13 at 22:26 +0000, Brad Richardson via J3 wrote:
> > Hi all,
> >
> > First, thank you all for your comments and suggestions. They have
> > revealed some aspects that I had not initially considered. I would
> > like to try and address some of the various comments.
> >
> > The most common suggestion has amounted to something like, "Why not
> > just do exactly what HPF did?"
> >
> > Looking at what HPF did has been very valuable as a reference. I can
> > certainly understand vendors wanting to be able to reuse existing
> > work, but I still want to explore whether HPF necessarily did it
> > perfectly, or whether we could potentially do better now. If the way
> > we choose to do it now is slightly different from HPF, it doesn't
> > mean that prior work is not reusable, just that it may require some
> > modification.
> >
> > HPF provided the combinatorial set of operations as
> > operation-specific {PRE,SUF}FIX procedures, each with an optional
> > argument to select inclusive or exclusive behavior. With the recent
> > sentiments expressed that all new features should have compelling
> > use cases, there is a much greater burden in justifying each one
> > individually. By providing a generic procedure that is applicable to
> > all operations, only a few use cases are needed to justify it,
> > including use cases not covered by the operation-specific versions.
> >
> > It has been suggested that the operation-specific versions mean that
> > the code can be type checked, but the compiler absolutely could type
> > check the generic version. It is required that ARRAY and the
> > arguments and result of OPERATION have the same type, which the
> > compiler can see at the call site and enforce. This is just like the
> > generic REDUCE function. I will note that this does mean
> > COUNT_{PRE,SUF}FIX is not possible with the generic version. A
> > slight change to the description and arguments could re-enable it
> > though. I.e.
> >
> > PURE FUNCTION OPERATION(x, y) RESULT(res)
> >   TYPE(<type_of_identity_argument>), INTENT(in) :: x
> >   TYPE(<type_of_array_argument>), INTENT(in) :: y
> >   TYPE(<type_of_identity_argument>) :: res
> > END FUNCTION
> >
> > and make IDENTITY a required argument.
> >
> > HPF provided {PRE/SUF}FIX functions, with an optional argument to
> > select inclusive vs exclusive behavior. It would be reasonable to do
> > the opposite arrangement: {IN/EX}CLUSIVE functions with an optional
> > argument to select forward vs backward iteration. Or there could be
> > just a single function with optional arguments for both, or
> > individual functions for each combination. Each option has certain
> > advantages and disadvantages that should be considered.
> >
> > HPF defined that the result of a {PRE/SUF}FIX function has the same
> > shape as the array argument, regardless of the presence of a MASK
> > argument, and that elements of the result to which no elements of
> > the input contribute are given a "default" value. This works for the
> > HPF functions because all the specified operations either have a
> > meaningful "default" value or, in the case of COPY, don't allow a
> > MASK or EXCLUSIVE argument (i.e. no chance of zero elements
> > contributing). This is not necessarily possible in the generic case.
> > The generic REDUCE function overcomes this with an IDENTITY
> > argument. My initial thought, though, was that the behavior would be
> > more like SUM_PREFIX(ARRAY, MASK) == SUM_PREFIX(PACK(ARRAY, MASK))
> > (a small sketch contrasting the two readings follows the example
> > below). I'm not sure I know enough about the various use cases to
> > decide which is more appropriate. I'm open to suggestions here. I'll
> > just note that I believe most other intrinsics with a MASK argument
> > do follow this pattern (or a similar pattern with loops over
> > dimensions other than DIM). For example
> >
> > res = MINLOC(ARRAY, DIM, MASK)
> > res(s1, ..., sdim-1, sdim+1, ..., sn) ==
> >     MINLOC(PACK(ARRAY(s1, ..., sdim-1, :, sdim+1, ..., sn),
> >                 MASK(s1, ..., sdim-1, :, sdim+1, ..., sn)))
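> > 
> > Purely as a sketch of how I read the two alternatives for MASK in
> > the SUM case (my own illustration, not HPF's wording), using plain
> > loops:
> > 
> > PROGRAM mask_interpretations
> >   implicit none
> >   integer :: a(5), same_shape(5), running, i
> >   logical :: mask(5)
> > 
> >   a = [1, 2, 3, 4, 5]
> >   mask = [.true., .false., .true., .true., .false.]
> > 
> >   ! same-shape reading: masked-out elements simply don't contribute,
> >   ! and the "default" value for SUM is 0
> >   running = 0
> >   do i = 1, size(a)
> >     if (mask(i)) running = running + a(i)
> >     same_shape(i) = running
> >   end do
> >   print *, same_shape ! prints 1 1 4 8 8
> > 
> >   ! pack-like reading: the result has only COUNT(mask) = 3 elements,
> >   ! the prefix sums of PACK(a, mask) = [1, 3, 4], i.e. [1, 4, 8]
> > END PROGRAM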
> >
> > HPF treated the SEGMENT argument such that adjacent elements with
> > .EQV. corresponding elements in SEGMENT are members of the same
> > segment. I.e. SUM_PREFIX([1, 2, 3, 4, 5], SEGMENT=[.true., .true.,
> > .false., .true., .true.]) == [1, 3, 3, 4, 9]. However, many
> > implementations in other languages treat .true. values in the
> > SEGMENT argument as signifying the first element of a segment. I.e.
> > SUM_PREFIX([1, 2, 3, 4, 5], SEGMENT=[.true., .false., .true.,
> > .true., .false.]) == [1, 3, 3, 4, 9]. Is it better to be consistent
> > with HPF or with other languages?
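> > 
> > For comparison, the two encodings carry the same segment
> > information. A small, purely illustrative helper (the name
> > STARTS_TO_EQV is mine, not HPF's) converting the "first element"
> > form into the .EQV. form might look like:
> > 
> > PURE FUNCTION STARTS_TO_EQV(starts) RESULT(seg)
> >   logical, intent(in) :: starts(:)
> >   logical :: seg(size(starts))
> >   integer :: i
> > 
> >   seg(1) = .true.
> >   do i = 2, size(starts)
> >     if (starts(i)) then
> >       seg(i) = .not. seg(i-1) ! flip the value at each segment start
> >     else
> >       seg(i) = seg(i-1)       ! otherwise stay in the same segment
> >     end if
> >   end do
> > END FUNCTION
> > 
> > Applied to the second example's [.true., .false., .true., .true.,
> > .false.] this gives [.true., .true., .false., .true., .true.], i.e.
> > exactly the SEGMENT used in the first example.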
> >
> > HPF did not provide collective subroutine versions for any of these
> > operations. Depending on the chosen design, which operations should
> > we provide collective subroutines for?
> >
> > For all of these various dimensions of the possible design of this
> > feature I of course have opinions, but I'm open to any
> > considerations I may not have thought of. Overall my "values" for
> > the design would be, in order:
> >
> > * Is easy for users to understand and use correctly
> > * Allows for efficient implementations by vendors
> > * Doesn't add too much to the standard
> >
> > I'm also open to the idea of starting with a restricted subset of
> > the above-discussed functionality, such that we can avoid having to
> > settle on certain aspects of the design initially.
> >
> > Looking forward to hearing more ideas.
> >
> > Regards,
> > Brad
> >
>