
FFTW on subarray with MPI

Computational Science: asked by Hannes on October 1, 2021

With the guru interfaces of FFTW, I can apply transforms only to parts of a multidimensional array by modifying the fftw_iodim parameter (e.g., with fftw_plan_guru_r2r(...)). Unfortunately, there seems to be no MPI interface with the same arguments (see, e.g., fftw_mpi_plan_r2r(...)). Does anybody know how I could apply one of the transforms of FFTW to only part of a multidimensional array (a subarray) using MPI?

One Answer

The whole point of the guru interface is to perform complicated FFTs without copying the data into contiguous arrays. That advantage matters much less when the data has to be communicated through MPI anyway: supporting a guru-like interface in an MPI setting would be complicated and largely ineffective.

In other words, if your data is not in the 2D/3D/ND format that FFTW's MPI interface expects, copying it into that format and applying the existing (non-guru) functions is not terrible compared to the cost of the communication itself.

As for partial-dimension FFTs, the reasoning is basically the same: either you apply a local (possibly guru) FFTW plan along the dimensions that are entirely local, or you copy the data into one or more FFTW-MPI-compatible layouts, perform the MPI FFT, and copy the result back into your layout.

I have never used it, but FFTW-MPI has fftw_mpi_plan_many_dft (written XM(plan_many_dft) in the manual's prefix notation), which probably helps in the many-transform case, with signature

     fftw_plan fftw_mpi_plan_many_dft(int rnk, const ptrdiff_t *n,
                                      ptrdiff_t howmany, ptrdiff_t block,
                                      ptrdiff_t tblock, fftw_complex *in,
                                      fftw_complex *out, MPI_Comm comm,
                                      int sign, unsigned flags);

http://www.fftw.org/fftw3_doc/Advanced-distributed_002dtranspose-interface.html

Answered by alfC on October 1, 2021
