[J3] asynchronous coarray collectives?
Jeff Hammond
jehammond at nvidia.com
Sun May 7 05:28:21 UTC 2023
Bill,
How would reordering those statements make them execute simultaneously or offload to the network coprocessor?
I recognized the typo and fixed it in the version of the code for which test results were reported.
Thanks,
Jeff
Sent from my iPhone
On 7. May 2023, at 5.41, Long, Bill F <william.long at hpe.com> wrote:
Hi Jeff,
The declaration "double" is not going to get past any compiler I know about. Maybe DOUBLE PRECISION or REAL(8), ...
The three calls and the print statement don't have data dependencies, so a compiler could reorder the statements. If this were a real-life code, a case might be made.
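For example, something along these lines (assuming double precision is the intended kind; REAL(8) would also work on most compilers):

  double precision, intent(inout) :: A, B, C
  double precision, intent(in)    :: D(:)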
Cheers,
Bill
________________________________
From: J3 <j3-bounces at mailman.j3-fortran.org> on behalf of Jeff Hammond via J3 <j3 at mailman.j3-fortran.org>
Sent: Saturday, May 6, 2023 2:59 AM
To: j3 <j3 at j3-fortran.org>
Cc: Jeff Hammond <jehammond at nvidia.com>
Subject: [J3] asynchronous coarray collectives?
How do I make a Fortran coarray program like this...
subroutine stuff(A,B,C,D)
implicit none
double, intent(inout) :: A, B, C
double, intent(in) :: D(:)
call co_sum(A)
call co_min(B)
call co_max(C)
print*,D
end subroutine stuff
...behave like this...
subroutine stuff(A,B,C,D)
use mpi_f08
implicit none
double, intent(inout) :: A, B, C
double, intent(in) :: D(:)
type(MPI_Request) :: R(3)
call MPI_Iallreduce(MPI_IN_PLACE, A, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD, R(1))
call MPI_Iallreduce(MPI_IN_PLACE, B, 1, MPI_DOUBLE, MPI_MIN, MPI_COMM_WORLD, R(2))
call MPI_Iallreduce(MPI_IN_PLACE, C, 1, MPI_DOUBLE, MPI_MAX, MPI_COMM_WORLD, R(3))
print*,D
call MPI_Waitall(3,R,MPI_STATUSES_IGNORE)
end subroutine stuff
...in the sense that it is possible for the network to execute the communication operations asynchronously relative to the print statement?
Do any compilers, e.g. Cray's, automatically convert coarray operations to asynchronous communication and push the completion of those operations as far down as possible?
Jeff
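For reference, here is a corrected sketch of the two subroutines above, with the "double" declarations replaced by DOUBLE PRECISION as Bill suggests and MPI_DOUBLE_PRECISION used as the matching mpi_f08 datatype. This assumes double precision is the intended kind, and it is a reconstruction rather than the fixed version Jeff says was actually tested.

! Coarray version: co_sum/co_min/co_max return with A, B, C already
! holding the reduced values, so there is no request handle whose
! completion could be deferred past the print.
subroutine stuff(A, B, C, D)
  implicit none
  double precision, intent(inout) :: A, B, C
  double precision, intent(in)    :: D(:)
  call co_sum(A)
  call co_min(B)
  call co_max(C)
  print *, D
end subroutine stuff

! MPI version: the nonblocking reductions are started immediately but
! only required to complete at MPI_Waitall, so the print can overlap
! with the communication.
subroutine stuff(A, B, C, D)
  use mpi_f08
  implicit none
  double precision, intent(inout) :: A, B, C
  double precision, intent(in)    :: D(:)
  type(MPI_Request) :: R(3)
  call MPI_Iallreduce(MPI_IN_PLACE, A, 1, MPI_DOUBLE_PRECISION, MPI_SUM, MPI_COMM_WORLD, R(1))
  call MPI_Iallreduce(MPI_IN_PLACE, B, 1, MPI_DOUBLE_PRECISION, MPI_MIN, MPI_COMM_WORLD, R(2))
  call MPI_Iallreduce(MPI_IN_PLACE, C, 1, MPI_DOUBLE_PRECISION, MPI_MAX, MPI_COMM_WORLD, R(3))
  print *, D
  call MPI_Waitall(3, R, MPI_STATUSES_IGNORE)
end subroutine stuff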