Article
Preserved in Portico. This version is not peer-reviewed.
How to Parallelize “Non-Parallelizable” Minimization Functions
Version 1 : Received: 24 May 2024 / Approved: 10 June 2024 / Online: 11 June 2024 (10:27:40 CEST)
Version 2 : Received: 21 June 2024 / Approved: 22 June 2024 / Online: 24 June 2024 (11:27:38 CEST)
How to cite: Lukyanenko, D.; Torbin, S.; Shinkarev, V. How to Parallelize “Non-Parallelizable” Minimization Functions. Preprints 2024, 2024060571. https://doi.org/10.20944/preprints202406.0571.v1
Abstract
The paper proposes a universal algorithm for parallelizing the calculations that arise when using the highly optimized minimization functions available in many computing packages. The main idea of the proposed algorithm rests on the following observation: although the “inner workings” of the minimization function may be unknown to the user, it inevitably relies on auxiliary functions that compute the minimized functional and its gradient. These auxiliary functions are usually implemented by the user, which means that in most cases they can be parallelized relatively easily. The paper discusses in detail both the parallelization algorithm and its software implementation using MPI parallel programming technology. The software examples for the proposed algorithm are given in the Python programming language, but they can easily be rewritten in C/C++/Fortran.
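The abstract's core idea — that a black-box minimizer only ever touches the user-supplied objective and gradient callbacks, so parallelizing those callbacks parallelizes the whole minimization — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the paper uses MPI, while this self-contained sketch substitutes a thread pool, and the toy gradient-descent loop stands in for a library minimizer whose internals are hidden from the user. All function names here are illustrative.

```python
# Sketch: the "minimizer" below is a stand-in for an opaque library routine.
# It only calls f and grad, so distributing the work inside those two
# user-written callbacks (here: across a thread pool, chunk by chunk)
# parallelizes the minimization without touching the minimizer itself.
from concurrent.futures import ThreadPoolExecutor


def chunk_value(x, chunk):
    # Partial sum of the functional over one data chunk.
    return sum((x - d) ** 2 for d in chunk)


def chunk_grad(x, chunk):
    # Partial sum of the gradient over one data chunk.
    return sum(2.0 * (x - d) for d in chunk)


def minimize_gd(f, grad, x0, lr=0.001, steps=200):
    # Toy gradient descent, standing in for a black-box library minimizer:
    # like the real thing, it only interacts with f and grad.
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x


def parallel_minimize():
    data = [float(i) for i in range(100)]     # f is minimized at mean(data)
    chunks = [data[i::4] for i in range(4)]   # split the data across 4 workers
    with ThreadPoolExecutor(max_workers=4) as ex:
        # The callbacks handed to the minimizer fan the work out and
        # reduce the partial results -- the minimizer never notices.
        f = lambda x: sum(ex.map(lambda c: chunk_value(x, c), chunks))
        g = lambda x: sum(ex.map(lambda c: chunk_grad(x, c), chunks))
        return minimize_gd(f, g, x0=0.0)
```

In the MPI setting described by the paper, the fan-out/reduce inside `f` and `g` would instead broadcast the current point to worker processes and combine their partial sums, while the minimizer itself runs unchanged on the root process.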
Keywords
minimization; parallel algorithm; parallelization; MPI; MPI-4
Subject
Computer Science and Mathematics, Data Structures, Algorithms and Complexity
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.