
Conversation

CharlieFModo
Contributor

Closes #484.

Changes proposed in this Pull Request

Makes lp-polars API behave the same as lp. All screenshots below use the same problem.

linopy 0.5.6 - io_api="lp-polars"
image

this branch - io_api="lp-polars"
image

linopy 0.5.6 - io_api="lp"
image

Checklist

  • Code changes are sufficiently documented; i.e. new functions contain docstrings and further explanations may be given in doc.
  • Unit tests for new features were added (if applicable).
  • A note for the release notes doc/release_notes.rst of the upcoming release is included.
  • I consent to the release of this PR's code under the MIT license.

@Bauer1610

@FabianHofmann - appreciate your time with looking into this bug fix. We have seen a massive speed up!

@FabianHofmann
Collaborator

FabianHofmann commented Sep 4, 2025

The speed-up is really crazy, and it seems to have improved over time. I am just wondering: where is the filtering applied in the ordinary lp export? Your change does indeed seem to fix the issue, but at the moment I cannot understand why (is it a precision difference between polars and pandas?). If it really is just about filtering out near-zero coefficients, couldn't we then simply alter the current `filter_nulls_polars` function to

```python
import operator
from functools import reduce

import polars as pl


def filter_nulls_polars(df: pl.DataFrame) -> pl.DataFrame:
    """
    Filter out rows containing "empty" values from a polars DataFrame.

    Args:
    ----
        df (pl.DataFrame): The DataFrame to filter.

    Returns:
    -------
        pl.DataFrame: The filtered DataFrame.
    """
    cond = []
    varcols = [c for c in df.columns if c.startswith("vars")]
    if varcols:
        cond.append(reduce(operator.or_, [pl.col(c).ne(-1) for c in varcols]))
    if "coeffs" in df.columns:
        cond.append(pl.col("coeffs").abs().gt(1e-12))
    if "labels" in df.columns:
        cond.append(pl.col("labels").ne(-1))

    cond = reduce(operator.and_, cond)  # type: ignore
    return df.filter(cond)
```

@CharlieFModo
Contributor Author

CharlieFModo commented Sep 4, 2025

The numerical filtering isn't required to fix the infeasibilities and can be removed if you prefer; I added it to remove numerical noise. What actually fixes the issue is the filtering that ensures each constraint has valid coefficients and a valid RHS.

We found the lp-polars API was writing malformed constraints to the .lp file and this filtering removed them and matched the set written by the default API. These malformed constraints occurred where either the coeffs or RHS were invalid.

I already have a copy of this without the numerical filtering if you would like me to update.

@FabianHofmann
Collaborator

I understand, thank you for clarifying. Just to finally get me on board: invalid coeffs and rhs means NaNs?

@CharlieFModo
Contributor Author

Yes - or "null"s in polars speak. I check that every constraint has coefficients that are not null and a sign that is not null. It is assumed that if the sign is not null, the rhs is also not null.

A more robust check would be something like:

```python
(pl.col("sign").is_not_null() & pl.col("rhs").is_not_null()).sum().alias("rhs_rows")
```

@FBumann
Contributor

FBumann commented Sep 4, 2025

I can't add anything, but just wanted to give props to you guys for working on this!

@FabianHofmann
Collaborator

Great, thanks. I need a bit more time to review this properly, and would be happy about some effort to make this a more minor change (code-wise), as I'm still getting used to the polars syntax. But I'd be happy to trigger a new release once this is in (early next week, I would suggest).

@CharlieFModo
Contributor Author

Sure thing - I've pushed some changes to reduce the scope. I've removed the dropping of small values and focused only on ensuring the constraints are written in the right format. I hope that makes it easier to review and incorporate.

@FabianHofmann FabianHofmann mentioned this pull request Sep 8, 2025
Collaborator

@FabianHofmann FabianHofmann left a comment


Great @CharlieFModo, happy to merge this. Out of curiosity, did you give #493 a try? I don't know if my tagging reached you.

@FabianHofmann FabianHofmann merged commit 12b0da3 into PyPSA:master Sep 8, 2025
21 of 22 checks passed
@CharlieFModo
Contributor Author

@FabianHofmann - I had a look but didn't attempt to implement it. I think if I attempt anything else in the short term it will be the direct api for the xpress solver or other performance improvements to reduce the overhead to and from the solver. Hope that's ok!

@FabianHofmann
Collaborator

> @FabianHofmann - I had a look but didn't attempt to implement it. I think if I attempt anything else in the short term it will be the direct api for the xpress solver or other performance improvements to reduce the overhead to and from the solver. Hope that's ok!

no problem!
