Merge branch 'main' into audit_10_24
andrewlee94 authored Oct 28, 2024
2 parents 04dabcc + d931060 commit a5574f1
Showing 17 changed files with 920 additions and 83 deletions.
2 changes: 1 addition & 1 deletion docs/explanations/scaling_toolbox/scaling_theory.rst
@@ -14,7 +14,7 @@ Types of Scaling
* Constraint residual scaling refers to the magnitude of the residuals of all constraints in the model. This is important because residuals are what the solver checks to determine whether a model has converged; it is therefore important that a residual equal to the solver tolerance does not significantly alter the solution of the model.

* E.g., consider a constraint A = B. If A and B have magnitude 1e6 and the solver tolerance is 1e-6, then A and B must agree to 12 significant figures to converge the constraint (which may be unnecessarily onerous). Similarly, if A and B are of order 1e-6, they will be considered equal even if A = 1e-6 and B = 0, i.e., ±100% error.

* Jacobian scaling refers to the overall scaling and conditioning of the problem Jacobian. This is important because it determines the solver's ability to find a path from the initial state to the final solution. A well-conditioned Jacobian is needed both for good numerical behavior from the solver and to ensure that floating-point round-off error does not result in large changes to the variables at the solution.

These aspects are not always complementary, and there are often cases where improving one aspect of model scaling can negatively affect another (e.g., focusing too much on the Jacobian condition number can result in poorly scaled constraint residuals). Additionally, each aspect affects different parts of the solver routine: Jacobian scaling is important for determining the step direction and size at each iteration, whilst constraint residual scaling is important for determining if and when a solution is found. The relative importance of each depends on the solver being used (e.g., for IPOPT, constraint residual scaling is more important than Jacobian scaling, as IPOPT does its own internal scaling of the Jacobian); however, in general all of these affect solver behavior, and users should endeavor to have good scaling for all aspects of the model.
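A quick numerical illustration of the residual-scaling point above (a sketch, not part of the documentation change):

# The same absolute residual tolerance implies very different relative
# accuracy depending on the magnitude of the constraint terms A and B.
tol = 1e-6  # typical solver convergence tolerance on residuals

for magnitude in (1e6, 1.0, 1e-6):
    a = magnitude
    b = magnitude - tol  # residual |a - b| sits exactly at the tolerance
    print(f"|A| ~ {magnitude:.0e}: relative error at convergence = {abs(a - b) / abs(a):.0e}")

# |A| ~ 1e+06: relative error 1e-12 (agreement to 12 significant figures)
# |A| ~ 1e+00: relative error 1e-06
# |A| ~ 1e-06: relative error 1e+00 (A and B may differ by 100% yet "converge")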
@@ -51,3 +51,9 @@ EquilibriumReactorData Class

.. autoclass:: EquilibriumReactorData
   :members:

+EquilibriumReactorScaler Class
+------------------------------
+
+.. autoclass:: EquilibriumReactorScaler
+   :members:
8 changes: 8 additions & 0 deletions idaes/core/base/property_base.py
@@ -593,6 +593,14 @@ def include_inherent_reactions(self):
        # pylint: disable-next=protected-access
        return self.parent_component()._include_inherent_reactions()

+    @property
+    def default_initializer(self):
+        return self.parent_component().default_initializer
+
+    @property
+    def default_scaler(self):
+        return self.parent_component().default_scaler
+
    def build(self):
        """
        General build method for StateBlockDatas.
8 changes: 8 additions & 0 deletions idaes/core/base/reaction_base.py
@@ -267,6 +267,14 @@ def phase_list(self):
    def phase_component_set(self):
        return self.state_ref.phase_component_set

+    @property
+    def default_initializer(self):
+        return self.parent_component().default_initializer
+
+    @property
+    def default_scaler(self):
+        return self.parent_component().default_scaler
+
    def lock_attribute_creation_context(self):
        """Returns a context manager that does not allow attributes to be created
        while in the context and allows attributes to be created normally outside
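The properties added here mirror the ones added to property_base.py above: default_initializer and default_scaler are looked up on the parent (indexed) component, so individual data elements of an indexed block report the same defaults as the block itself, which is what the revised call_submodel_scaler_method below relies on. A standalone sketch of the delegation pattern (illustrative classes, not IDAES code):

class _ParentComponent:
    # Set once on the parent (indexed) component.
    default_scaler = None  # e.g., a CustomScalerBase subclass

class _ElementData:
    def __init__(self, parent):
        self._parent = parent

    def parent_component(self):
        return self._parent

    @property
    def default_scaler(self):
        # Delegate the lookup to the parent component, as the commit does.
        return self.parent_component().default_scaler

parent = _ParentComponent()
element = _ElementData(parent)
assert element.default_scaler is parent.default_scaler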
2 changes: 1 addition & 1 deletion idaes/core/scaling/__init__.py
@@ -11,7 +11,7 @@
# for full copyright and license information.
#################################################################################
from .autoscaling import AutoScaler
-from .custom_scaler_base import CustomScalerBase
+from .custom_scaler_base import CustomScalerBase, ConstraintScalingScheme
from .scaler_profiling import ScalingProfiler
from .util import (
    scaling_factors_from_json_file,
30 changes: 11 additions & 19 deletions idaes/core/scaling/custom_scaler_base.py
@@ -554,8 +554,7 @@ def propagate_state_scaling(

    def call_submodel_scaler_method(
        self,
-        model,
-        submodel: str,
+        submodel,
        method: str,
        submodel_scalers: ComponentMap = None,
        overwrite: bool = False,
@@ -567,54 +566,47 @@ def call_submodel_scaler_method(
        default scaler for the submodel is used.

        Args:
-            model: parent model of submodel
-            submodel: local name of submodel to be scaled as str
+            submodel: submodel to be scaled
            submodel_scalers: user provided ComponentMap of Scalers to use for submodels
            method: name of method to call from submodel (as string)
            overwrite: whether to overwrite existing scaling factors

        Returns:
            None
        """
-        # Get actual submodel object from name
-        # For this method, we have to use the component name as the Scaler is written
-        # before the model is constructed.
-        sm_obj = model.find_component(submodel)
-
        if submodel_scalers is None:
            submodel_scalers = {}

        # Iterate over indices of submodel
-        for smdata in sm_obj.values():
+        for smdata in submodel.values():
            # Get Scaler for submodel
-            if sm_obj in submodel_scalers:
-                scaler = submodel_scalers[sm_obj]
+            if submodel in submodel_scalers:
+                scaler = submodel_scalers[submodel]
                if callable(scaler):
                    # Check to see if Scaler is callable - this implies it is a class and not an instance
                    # Call the class to create an instance
                    scaler = scaler()
-                _log.debug(f"Using user-defined Scaler for {model}.{submodel}.")
+                _log.debug(f"Using user-defined Scaler for {submodel}.")
            else:
                try:
                    scaler = smdata.default_scaler
-                    _log.debug(f"Using default Scaler for {model}.{submodel}.")
+                    _log.debug(f"Using default Scaler for {submodel}.")
                except AttributeError:
                    _log.debug(
-                        f"No default Scaler set for {model}.{submodel}. Cannot call {method}."
+                        f"No default Scaler set for {submodel}. Cannot call {method}."
                    )
                    return
                if scaler is not None:
                    scaler = scaler()
                else:
-                    _log.debug(
-                        f"No Scaler found for {model}.{submodel}. Cannot call {method}."
-                    )
+                    _log.debug(f"No Scaler found for {submodel}. Cannot call {method}.")

            # If a Scaler is found, call desired method
            if scaler is not None:
                try:
                    smeth = getattr(scaler, method)
                except AttributeError:
                    raise AttributeError(
-                        f"Scaler for {model}.{submodel} does not have a method named {method}."
+                        f"Scaler for {submodel} does not have a method named {method}."
                    )
                smeth(smdata, overwrite=overwrite)
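Under the revised signature, callers pass the submodel component itself instead of its local name, so the method no longer needs a parent model to resolve strings via find_component. A minimal sketch of a call under the new API (the component and scaler-map names are illustrative, not from this commit):

# Old call style: self.call_submodel_scaler_method(model, "properties_in", ...)
# New call style: pass the component object directly.
self.call_submodel_scaler_method(
    model.properties_in,                # submodel block, not a string name
    method="variable_scaling_routine",  # Scaler method to apply to each element
    submodel_scalers=submodel_scalers,  # optional ComponentMap of user overrides
    overwrite=False,
)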
16 changes: 10 additions & 6 deletions idaes/core/scaling/scaler_profiling.py
@@ -19,7 +19,11 @@
from pyomo.common.tempfiles import TempfileManager

from idaes.core.util.scaling import jacobian_cond
-from idaes.core.scaling import AutoScaler, CustomScalerBase
+from idaes.core.scaling.autoscaling import AutoScaler
+from idaes.core.scaling.custom_scaler_base import (
+    CustomScalerBase,
+    ConstraintScalingScheme,
+)
from idaes.core.solvers import get_solver


@@ -83,23 +87,23 @@ def __init__(
"Vars Only": (None, {}),
"Harmonic": (
cscaler.scale_constraint_by_nominal_value,
{"scheme": "harmonic_mean"},
{"scheme": ConstraintScalingScheme.harmonicMean},
),
"Inverse Sum": (
cscaler.scale_constraint_by_nominal_value,
{"scheme": "inverse_sum"},
{"scheme": ConstraintScalingScheme.inverseSum},
),
"Inverse Root Sum Squares": (
cscaler.scale_constraint_by_nominal_value,
{"scheme": "inverse_root_sum_squared"},
{"scheme": ConstraintScalingScheme.inverseRSS},
),
"Inverse Maximum": (
cscaler.scale_constraint_by_nominal_value,
{"scheme": "inverse_maximum"},
{"scheme": ConstraintScalingScheme.inverseMaximum},
),
"Inverse Minimum": (
cscaler.scale_constraint_by_nominal_value,
{"scheme": "inverse_minimum"},
{"scheme": ConstraintScalingScheme.inverseMinimum},
),
"Nominal L1 Norm": (
cscaler.scale_constraint_by_nominal_derivative_norm,
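These replacements swap free-form strings for members of the ConstraintScalingScheme enum, which this commit also re-exports from idaes.core.scaling (see the __init__.py change earlier), so a mistyped scheme now fails at attribute lookup rather than deep inside a profiling run. A short usage sketch (the model and constraint names are illustrative):

from idaes.core.scaling import CustomScalerBase, ConstraintScalingScheme

cscaler = CustomScalerBase()
# Scale a constraint by the inverse of its largest nominal term, matching
# the profiler's "Inverse Maximum" entry above.
cscaler.scale_constraint_by_nominal_value(
    m.my_constraint,
    scheme=ConstraintScalingScheme.inverseMaximum,
    overwrite=False,
)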
44 changes: 8 additions & 36 deletions idaes/core/scaling/tests/test_custom_scaler_base.py
@@ -608,15 +608,12 @@ def test_call_submodel_scaler_method_no_scaler(self, caplog):
        m.b = Block([1, 2, 3])

        sb = CustomScalerBase()
-        sb.call_submodel_scaler_method(m, "b", method="dummy_method", overwrite=True)
+        sb.call_submodel_scaler_method(m.b, method="dummy_method", overwrite=True)

        for bd in m.b.values():
            assert not hasattr(bd, "_dummy_scaler_test")

-        assert (
-            "No default Scaler set for unknown.b. Cannot call dummy_method."
-            in caplog.text
-        )
+        assert "No default Scaler set for b. Cannot call dummy_method." in caplog.text

    @pytest.mark.unit
    def test_call_submodel_scaler_method_default_scaler(self, caplog):
@@ -629,12 +626,12 @@ def test_call_submodel_scaler_method_default_scaler(self, caplog):
            bd.default_scaler = DummyScaler

        sb = CustomScalerBase()
-        sb.call_submodel_scaler_method(m, "b", method="dummy_method", overwrite=True)
+        sb.call_submodel_scaler_method(m.b, method="dummy_method", overwrite=True)

        for bd in m.b.values():
            assert bd._dummy_scaler_test

-        assert "Using default Scaler for unknown.b." in caplog.text
+        assert "Using default Scaler for b." in caplog.text

    @pytest.mark.unit
    def test_call_submodel_scaler_method_user_scaler(self, caplog):
@@ -649,8 +646,7 @@ def test_call_submodel_scaler_method_user_scaler(self, caplog):

        sb = CustomScalerBase()
        sb.call_submodel_scaler_method(
-            m,
-            "b",
+            m.b,
            method="dummy_method",
            submodel_scalers=scaler_map,
            overwrite=False,
@@ -659,7 +655,7 @@ def test_call_submodel_scaler_method_user_scaler(self, caplog):
        for bd in m.b.values():
            assert not bd._dummy_scaler_test

-        assert "Using user-defined Scaler for unknown.b." in caplog.text
+        assert "Using user-defined Scaler for b." in caplog.text

    @pytest.mark.unit
    def test_call_submodel_scaler_method_user_scaler_class(self, caplog):
@@ -674,8 +670,7 @@ def test_call_submodel_scaler_method_user_scaler_class(self, caplog):

        sb = CustomScalerBase()
        sb.call_submodel_scaler_method(
-            m,
-            "b",
+            m.b,
            method="dummy_method",
            submodel_scalers=scaler_map,
            overwrite=False,
@@ -684,27 +679,4 @@ def test_call_submodel_scaler_method_user_scaler_class(self, caplog):
        for bd in m.b.values():
            assert not bd._dummy_scaler_test

-        assert "Using user-defined Scaler for unknown.b." in caplog.text
-
-    @pytest.mark.unit
-    def test_call_submodel_scaler_method_invalid_method(self):
-        # Dummy up a nested model
-        m = ConcreteModel()
-        m.b = Block([1, 2, 3])
-
-        scaler_map = ComponentMap()
-        scaler_map[m.b] = DummyScaler()
-
-        sb = CustomScalerBase()
-
-        with pytest.raises(
-            AttributeError,
-            match="Scaler for unknown.b does not have a method named foo.",
-        ):
-            sb.call_submodel_scaler_method(
-                m,
-                "b",
-                method="foo",
-                submodel_scalers=scaler_map,
-                overwrite=False,
-            )
+        assert "Using user-defined Scaler for b." in caplog.text
2 changes: 0 additions & 2 deletions idaes/core/scaling/tests/test_custom_scaling_integration.py
@@ -38,8 +38,6 @@
from idaes.models.properties.activity_coeff_models.methane_combustion_ideal import (
    MethaneParameterBlock as MethaneCombustionParameterBlock,
)
-from idaes.core.util.testing import PhysicalParameterTestBlock, initialization_tester
-from idaes.core.solvers import get_solver
from idaes.core.util import to_json, from_json, StoreSpec
from idaes.core.util.scaling import jacobian_cond
from idaes.core.scaling import AutoScaler, CustomScalerBase, set_scaling_factor
74 changes: 74 additions & 0 deletions idaes/models/properties/examples/saponification_reactions.py
@@ -31,11 +31,13 @@
from idaes.core.util.misc import add_object_reference
from idaes.core.util.constants import Constants as const
import idaes.logger as idaeslog
+from idaes.core.scaling import CustomScalerBase


# Some more information about this module
__author__ = "Andrew Lee"

+from idaes.core.util.scaling import get_scaling_factor

# Set up logger
_log = idaeslog.getLogger(__name__)
@@ -111,12 +113,84 @@ def define_metadata(cls, obj):
)


+class SaponificationReactionScaler(CustomScalerBase):
+    """
+    Scaler for saponification reaction package.
+    Variables are scaled by nominal order of magnitude, and constraints
+    using the inverse maximum scheme.
+    """
+
+    DEFAULT_SCALING_FACTORS = {"reaction_rate": 1e2}
+
+    def variable_scaling_routine(
+        self, model, overwrite: bool = False, submodel_scalers: dict = None
+    ):
+        if model.is_property_constructed("k_rxn"):
+            # First check to see if k_rxn is already scaled
+            sf = get_scaling_factor(model.k_rxn)
+
+            if sf is not None and not overwrite:
+                # k_rxn already has a scaling factor and we are not set to overwrite - move on
+                pass
+            else:
+                # Hopefully temperature has been scaled, so we can get the nominal value of k_rxn
+                # by walking the expression in the constraint.
+                nominals = self.get_expression_nominal_values(model.arrhenius_eqn)
+
+                # We should get two values, k_rxn (LHS) and the Arrhenius equation (RHS)
+                # As of 10/3/2024, the LHS will be the 0-th element of the list, and the RHS the 1st
+                # However, we cannot assume this will always be the case
+
+                # If LHS has been scaled, nominal will be 1/sf, otherwise it will be k_rxn.value
+                # Find the value which does NOT match this - guess that this is the 1st element
+                if nominals[1] != model.k_rxn.value and sf is None:
+                    # This is the most likely case, so check it first
+                    nominal = nominals[1]
+                elif sf is not None and nominals[1] != 1 / sf:
+                    # Next, check for case where k_rxn was already scaled
+                    nominal = nominals[1]
+                else:
+                    # Otherwise we have the case where something changed in Pyomo since 10/3/2024
+                    nominal = nominals[0]
+
+                self.set_variable_scaling_factor(
+                    model.k_rxn,
+                    1 / nominal,
+                    overwrite=overwrite,
+                )
+
+        if model.is_property_constructed("reaction_rate"):
+            for j in model.reaction_rate.values():
+                self.scale_variable_by_default(j, overwrite=overwrite)
+
+    def constraint_scaling_routine(
+        self, model, overwrite: bool = False, submodel_scalers: dict = None
+    ):
+        if model.is_property_constructed("arrhenius_eqn"):
+            self.scale_constraint_by_nominal_value(
+                model.arrhenius_eqn,
+                scheme="inverse_maximum",
+                overwrite=overwrite,
+            )
+
+        if model.is_property_constructed("rate_expression"):
+            for j in model.rate_expression.values():
+                self.scale_constraint_by_nominal_value(
+                    j,
+                    scheme="inverse_maximum",
+                    overwrite=overwrite,
+                )
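A sketch of how this scaler is typically applied (the flowsheet path is illustrative, not from the commit): because _ReactionBlock below registers it as default_scaler, a parent unit-model Scaler can discover it automatically via call_submodel_scaler_method, while reaction_rate variables fall back to the 1e2 factor declared in DEFAULT_SCALING_FACTORS.

# Applying the scaler directly to one element of an indexed reaction block;
# a parent Scaler would normally find it through default_scaler instead.
scaler = SaponificationReactionScaler()
rblock = m.fs.unit.control_volume.reactions  # illustrative flowsheet path
scaler.variable_scaling_routine(rblock[0], overwrite=False)
scaler.constraint_scaling_routine(rblock[0], overwrite=False)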


class _ReactionBlock(ReactionBlockBase):
    """
    This Class contains methods which should be applied to Reaction Blocks as a
    whole, rather than individual elements of indexed Reaction Blocks.
    """

+    default_scaler = SaponificationReactionScaler
+
    def initialize(blk, outlvl=idaeslog.NOTSET, **kwargs):
        """
        Initialization routine for reaction package.