Fix #914 (#916)


Merged 1 commit into MFlowCode:master on Jul 3, 2025

Conversation

@wilfonba (Contributor) commented Jul 2, 2025

User description

Description

This fixes the bug with boundary condition patches in post process.

Fixes #914

Type of change


  • Bug fix (non-breaking change which fixes an issue)

Scope

  • This PR comprises a set of related changes with a common goal


How Has This Been Tested?

Ran examples/2D_jet/case.py with 4 ranks. Output and video are attached.
out.txt

test.mp4

PR Type

Bug fix


Description

  • Fix boundary condition patches in post process

  • Add the missing bc_io variable to the MPI broadcast list (see the sketch below)
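
A sketch of the relevant fypp loop in src/post_process/m_mpi_proxy.fpp (the variable list is abridged here): the fix appends 'bc_io' to the list so the flag parsed on rank 0 reaches every rank.

    #:for VAR in ['relativity', 'cont_damage', 'bc_io']
        call MPI_BCAST(${VAR}$, 1, MPI_LOGICAL, 0, MPI_COMM_WORLD, ierr)
    #:endfor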


Changes diagram

flowchart LR
  A["MPI broadcast variables"] --> B["Add bc_io variable"] --> C["Fix 2D_jet post process"]

Changes walkthrough 📝

Relevant files (Bug fix):

m_mpi_proxy.fpp (src/post_process/m_mpi_proxy.fpp, +1/-1): Add missing bc_io to MPI broadcast

  • Add bc_io variable to the MPI broadcast variable list
  • Fix the missing variable that caused post process to hang

    Copilot AI review requested due to automatic review settings, July 2, 2025 18:53
    @wilfonba requested a review from a team as a code owner, July 2, 2025 18:53
    Copilot AI (Contributor) left a comment:


    Pull Request Overview

    This PR adds support for the new bc_io boundary-condition flag in the MPI proxy broadcast routine.

    • Included bc_io in the list of flags to be broadcast
    • Ensures that post-process modules receive the updated boundary-condition setting
    Comments suppressed due to low confidence (1)

    src/post_process/m_mpi_proxy.fpp:110

    • Consider adding a focused test (unit or integration) to verify that the bc_io flag is correctly broadcast and applied across all ranks in the MPI proxy.
                & 'output_partial_domain', 'relativity', 'cont_damage', 'bc_io']
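
    A minimal sketch of such a focused test (hypothetical; not part of this PR) could look like:

        ! Hypothetical test program: a logical flag set only on rank 0
        ! must be .true. on every rank after the broadcast.
        program test_bc_io_bcast
            use mpi
            implicit none
            integer :: rank, ierr
            logical :: bc_io

            call MPI_Init(ierr)
            call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)

            bc_io = (rank == 0) ! mimic: only root parsed the input file
            call MPI_BCAST(bc_io, 1, MPI_LOGICAL, 0, MPI_COMM_WORLD, ierr)

            if (.not. bc_io) then
                print *, 'rank', rank, ': bc_io was not broadcast'
                call MPI_Abort(MPI_COMM_WORLD, 1, ierr)
            end if

            call MPI_Finalize(ierr)
        end program test_bc_io_bcast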
    

    qodo-merge-pro bot commented Jul 2, 2025

    PR Reviewer Guide 🔍

    Here are some key observations to aid the review process:

    🎫 Ticket compliance analysis ✅

    #914 - PR Code Verified

    Compliant requirements:

    • Fix the 2D_jet example that gets stuck during post process
    • Resolve the hanging issue that prevents the example from completing

    Requires further human verification:

    • Verify that the 2D_jet example now runs successfully without hanging
    • Confirm that other examples still work correctly after this change

    ⏱️ Estimated effort to review: 2 🔵🔵⚪⚪⚪
    🧪 No relevant tests
    🔒 No security concerns identified
    ⚡ Recommended focus areas for review

    Missing Tests

    The fix adds a missing variable to MPI broadcast but lacks automated tests to prevent regression. Consider adding a test that verifies all required variables are properly broadcast in MPI operations.

    & 'output_partial_domain', 'relativity', 'cont_damage', 'bc_io']
    call MPI_BCAST(${VAR}$, 1, MPI_LOGICAL, 0, MPI_COMM_WORLD, ierr)

    qodo-merge-pro bot commented Jul 2, 2025

    PR Code Suggestions ✨

    No code suggestions found for the PR.

    @sbryngelson (Member) commented:

    This closes #914, but it surfaces another issue (please open one, @wilfonba): we are seemingly not testing correctly. Can you figure out why this example case was passing when it should have been failing?

    @wilfonba (Contributor, Author) commented Jul 2, 2025:

    > This closes #914, but it surfaces another issue (please open one, @wilfonba): we are seemingly not testing correctly. Can you figure out why this example case was passing when it should have been failing?

    The test suite doesn't test multiple-rank cases very thoroughly; it only has 3 MPI cases in the no-RDMA configuration.

    @sbryngelson (Member) commented:

    > This closes #914, but it surfaces another issue (please open one, @wilfonba): we are seemingly not testing correctly. Can you figure out why this example case was passing when it should have been failing?

    > The test suite doesn't test multiple-rank cases very thoroughly; it only has 3 MPI cases in the no-RDMA configuration.

    Was that the actual problem? This problem occurred for a single-rank CPU case on my laptop...

    @wilfonba (Contributor, Author) commented Jul 2, 2025:

    > This closes #914, but it surfaces another issue (please open one, @wilfonba): we are seemingly not testing correctly. Can you figure out why this example case was passing when it should have been failing?

    > The test suite doesn't test multiple-rank cases very thoroughly; it only has 3 MPI cases in the no-RDMA configuration.

    > Was that the actual problem? This problem occurred for a single-rank CPU case on my laptop...

    I just ran it with one rank on the master branch on my laptop, and it ran fine. It only got stuck in post process when I added a second rank. I'm not sure what you did differently on your end. The issue is that when bc_io isn't broadcast, the non-root processes don't attempt to open an MPI_IO file, and a deadlock occurs.
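
    A minimal standalone sketch of that failure mode (hypothetical program, not MFC source): MPI_File_open is collective over the communicator, so if only rank 0 has the flag set, rank 0 blocks forever waiting for the other ranks.

        ! Hypothetical illustration of the hang: with more than one rank,
        ! only rank 0 enters the collective MPI_File_open and waits
        ! forever for the others.
        program bc_io_deadlock_sketch
            use mpi
            implicit none
            integer :: rank, ierr, fh
            logical :: bc_io

            call MPI_Init(ierr)
            call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)

            bc_io = (rank == 0) ! flag read from input on root, never broadcast

            if (bc_io) then
                call MPI_File_open(MPI_COMM_WORLD, 'data.dat', MPI_MODE_RDONLY, &
                                   MPI_INFO_NULL, fh, ierr)
                call MPI_File_close(fh, ierr)
            end if

            call MPI_Finalize(ierr)
        end program bc_io_deadlock_sketch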

    @sbryngelson (Member) commented:

    I'm not sure what I did either, I guess we will just proceed.

    codecov bot commented Jul 2, 2025

    Codecov Report

    All modified and coverable lines are covered by tests ✅

    Project coverage is 44.03%. Comparing base (f1fea7a) to head (1c3cd99).
    Report is 2 commits behind head on master.

    Additional details and impacted files
    @@           Coverage Diff           @@
    ##           master     #916   +/-   ##
    =======================================
      Coverage   44.03%   44.03%           
    =======================================
      Files          68       68           
      Lines       18395    18395           
      Branches     2227     2227           
    =======================================
      Hits         8101     8101           
      Misses       8991     8991           
      Partials     1303     1303           


    @sbryngelson sbryngelson merged commit 47053f8 into MFlowCode:master Jul 3, 2025
    31 of 32 checks passed
    prathi-wind pushed a commit to prathi-wind/MFC-prathi that referenced this pull request Jul 13, 2025
    @wilfonba wilfonba deleted the BC_IOFix branch July 22, 2025 14:43
    Development

    Successfully merging this pull request may close these issues.

    2D_jet example broken @ post_process
    2 participants