rosetta mpi build


I'm trying to build Rosetta with MPI and found that the build fails because line 178 of `src/utility/` is missing a semicolon at the end of the line.

This happens with versions 3.11 and 2019.40 (the latest).

Also, is it possible to build Rosetta with CMake?

I'm currently trying to build with CMake in the `cmake/build_sharedmpi` directory, using Ninja and the Intel compiler, but the build fails.

Is there a compiler known to be suitable for this build (maybe GCC)?


Thu, 2019-10-17 23:58

You're right, this looks like a bug. We can fix it in master, but the fix may take some time to reach you. Are you comfortable editing `Rosetta/main/source/src/utility/` and adding a semicolon at the end of that line? That should fix your problem immediately.

Fri, 2019-10-18 06:11

Yes, I already fixed the source code on my PC, and that file now compiles fine.



Sun, 2019-10-20 23:14

Out of curiosity, which MPI implementation are you using?  (This error should only show up if you're using an implementation of MPI that we didn't consider, and I thought we had considered most of the major ones.)


Regarding CMake: the CMake build is somewhat supported, but only to a limited extent (mainly for internal developers to use). Scons is still the official way to build Rosetta. One issue with CMake is that there are different levels of support for the various builds. The regular release and debug builds are well-used under CMake, but some of the other builds are less so. In particular, some of the more niche builds (build_sharedmpi is probably one of them) aren't necessarily robust; they may have been written for a particular person on a particular system, without regard for how well they work on other machines or with other compilers. (For build_sharedmpi I'd probably assume GCC with OpenMPI, though I don't know for sure.)

Generally, if you're having problems building, I would recommend sticking with the scons build commands.
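For reference, the standard scons invocation for an MPI build looks something like the following. The job count and mode are just examples, and the `site.settings` path is from memory, so double-check it against your checkout:

```shell
# Run from the Rosetta/main/source directory.
# extras=mpi switches the build to the MPI-enabled settings;
# mode=release builds optimized binaries; -j8 uses 8 parallel jobs.
./scons.py -j8 mode=release extras=mpi bin

# Which mpicxx wrapper and compiler get used can be adjusted in
# tools/build/site.settings (copy site.settings.template if it
# doesn't exist yet).
```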

Fri, 2019-10-18 08:23

I'm currently trying to compile it on a computer cluster, which uses HPE MPI or the Intel MPI library.

I loaded the Intel MPI library, which is based on MPICH2, so it might work if I change some build options. I'm not sure, though, because the error during the CMake build looks like an executable linking problem, as below.

[1/597] Linking CXX executable torsional_potential_corrections
FAILED: torsional_potential_corrections 
: && /home/app/intel/impi/2018.4.274/bin64/mpicxx  -O3 -finline-limit=20000 -s -pipe -w -O3 -ffast-math -fno-finite-math-only -funroll-loops -finline-functions -finline-limit=20000 -s -std=c++11 -pipe -ffor-scope -ftemplate-depth-256 -fPIC -DBOOST_ERROR_CODE_HEADER_ONLY -DBOOST_SYSTEM_NO_DEPRECATED -I /usr/include -I /usr/local/include -I src -I external/include -I src/platform/linux -Wl,--no-as-needed -O3 -ffast-math -fno-finite-math-only -funroll-loops -finline-functions -finline-limit=20000 -s -Wall -Wextra -pedantic -Werror -Wno-long-long -Wno-strict-aliasing -Wno-unused-variable -Wno-unused-parameter -Wno-unused-function  -rdynamic CMakeFiles/torsional_potential_corrections.dir/home/userid/rosetta_src_2019.35.60890_bundle/main/source/src/apps/public/weight_optimization/  -o torsional_potential_corrections -L/home/userid/rosetta_src_2019.35.60890_bundle/main/source/cmake/build_sharedmpi/../../external/boost_1_55_0  -L/home/userid/rosetta_src_2019.35.60890_bundle/main/source/cmake/build_sharedmpi/../../external/lib -Wl,-rpath,/home/userid/rosetta_src_2019.35.60890_bundle/main/source/cmake/build_sharedmpi/../../external/boost_1_55_0:/home/userid/rosetta_src_2019.35.60890_bundle/main/source/cmake/build_sharedmpi/../../external/lib:/home/userid/rosetta_src_2019.35.60890_bundle/main/source/cmake/build_sharedmpi: -lz -ldl -lstdc++ && : undefined reference to `protocols::genetic_algorithm::EntityElement::operator=(protocols::genetic_algorithm::EntityElement const&)'
ninja: build stopped: subcommand failed.

I'm also trying scons, though it stops compiling with a different error message, which might be caused by my environment variables.

I did successfully finish the compile on my local PC (with OpenMPI installed), so this might be compiler dependent.
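To answer the earlier question about which MPI implementation is in use: the wrapper itself can usually tell you. These are the common introspection flags (they vary slightly by implementation):

```shell
# Print the underlying compiler and flags the wrapper invokes:
mpicxx -show      # MPICH-family wrappers, including Intel MPI
mpicxx --showme   # Open MPI's wrappers

# Most implementations also identify themselves here:
mpirun --version
```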


Mon, 2019-10-21 10:01