Conversation
Codecov Report

Attention: Patch coverage is

Additional details and impacted files

@@            Coverage Diff             @@
##             main     #404      +/-   ##
==========================================
+ Coverage   74.88%   74.92%   +0.03%
==========================================
  Files          22       22
  Lines        3325     3334       +9
==========================================
+ Hits         2490     2498       +8
- Misses        835      836       +1

☔ View full report in Codecov by Sentry.
ewu63 left a comment
Looks good; like you said, this is a breaking change but probably not too bad for downstream users. Another design pattern I had considered previously was for the user to tell pyOptSparse which information to request back from snstop, and have those fields returned (either as a single dict or just as a bunch of args). That would be the most flexible but would require some further implementation.
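That alternative pattern could be sketched roughly as follows. This is a hedged illustration, not pyOptSparse's actual API: the helper name `call_snstop_handle`, the field names, and the callback signature are all assumptions for the sake of the example.

```python
# Sketch of the "user requests fields back from snstop" design.
# All names here (call_snstop_handle, the field keys, the return-0-to-continue
# convention) are illustrative assumptions, not the real pyOptSparse interface.

def call_snstop_handle(handle, available, requested):
    """Forward only the user-requested fields to the snstop callback."""
    payload = {key: available[key] for key in requested if key in available}
    return handle(payload)

def my_snstop(info):
    # User callback: sees only the fields it asked for.
    print(sorted(info))
    return 0  # assumed convention: nonzero would request early termination

available = {"xs": [0.1, 0.2], "funcs": {"obj": 1.0}, "restartDict": {"major": 7}}
status = call_snstop_handle(my_snstop, available, requested=["funcs", "restartDict"])
```

The appeal of this design is that the wrapper does not have to build every possible payload on every major iteration; it only gathers what the user registered up front.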
    iterDict["funcs"].update(self.cache["funcs"])

    # Create the restart dictionary to be passed to snstop_handle
    restartDict = {
TODO for the future: this should probably be a dataclass instead of a dict. That way we can avoid some of the code duplication above.
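A dataclass version of that restart container might look like the sketch below. The field names (SNOPT-style `cw`/`iw`/`rw` work arrays) are assumptions for illustration; the actual contents of `restartDict` are defined by the wrapper, not by this comment.

```python
from dataclasses import asdict, dataclass, field

# Hypothetical dataclass replacement for the restart dict.
# The cw/iw/rw field names are illustrative assumptions.
@dataclass
class RestartInfo:
    cw: list = field(default_factory=list)  # character work array
    iw: list = field(default_factory=list)  # integer work array
    rw: list = field(default_factory=list)  # real work array

info = RestartInfo(iw=[1, 2, 3])
as_dict = asdict(info)  # still trivially serializable where a dict is expected
```

One nice property: the fields are declared once, so the wrapper and `snstop_handle` cannot drift apart on which keys exist, which is exactly the duplication concern raised above.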
@ewu63 I can do it if you open an issue with the specification of what you would like to have ;-)
Thanks for the comments. It would be nice for the function to be more flexible. I'm not sure how returning values would work though because the user does not call _snstop. It seems like the dictionaries have to be passed in.
Maybe also do a version bump here (minor?)
The way I'm thinking is the same as how we implemented

Another thought: we can also make the "save work arrays at every iter" an option, instead of asking the user to write the snstop function. We can add an option
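The option-based idea could be sketched like this. The option name `"Save restart file"` and the work-array layout are hypothetical, invented only to show the shape of the design: the wrapper saves state itself on each major iteration, so the user never has to supply an snstop callback just to get restart files.

```python
import os
import pickle
import tempfile

# Hedged sketch of saving work arrays every major iteration via an option.
# The option name "Save restart file" and the cw/iw/rw dict layout are
# assumptions, not pyOptSparse's real option names.

def autosave_snstop(work_arrays, options):
    """Pickle the work arrays at a major iteration if the option is set."""
    path = options.get("Save restart file")
    if path is not None:
        with open(path, "wb") as f:
            pickle.dump(work_arrays, f)

path = os.path.join(tempfile.mkdtemp(), "restart.pkl")
autosave_snstop({"cw": [], "iw": [1, 2], "rw": [0.5]}, {"Save restart file": path})
with open(path, "rb") as f:
    restored = pickle.load(f)
```

The trade-off versus the callback approach: less flexibility (the user cannot inspect the data in flight), but a much smaller API surface for the common "just let me restart later" use case.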
This is ready for another review. I implemented @ewu63's suggestions, so this is more usable and is no longer a breaking change.
ewu63 left a comment
I think the version bump got swallowed in the merge resolution. Otherwise LGTM, thanks!
marcomangano left a comment
Just a couple of minor comments, looks great. I will bump the minor version in a minute, but I would like to have #410 merged before this so we can include that in the new release.
So I forgot that we made it not a breaking change, meaning we should've bumped the patch and not the minor version... oh well, a bit late now. I'd rather not leave any commits on
I bumped the minor version because this is technically a new feature. I think we got a bit loose with version numbers, tbh. Not a big deal, but we can be a bit more strict in the future.
In general I have shied away from semver when I was in the lab, and instead stuck largely to NumPy's versioning scheme, for the reasons outlined there. There is nothing inherently wrong with bumping the minor version frequently, but it should carry some weight behind it; otherwise we quickly move to v2.52.0 without actual substantial changes to the software.
Purpose

I added an option to save `restartDict` after each major iteration. This is helpful when you want to restart optimizations after they crash unexpectedly or run out of time. In the latter case, this avoids the guesswork of setting a conservative time limit. I also added an option that allows `restartDict` to be passed to `snstop_handle` if the user wants to access it after every major iteration.

Expected time until merged
1-2 weeks
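The crash-and-restart workflow from the Purpose section can be sketched in plain Python. This stand-in optimizer and the `restart_dict` layout are invented for illustration; the real feature stores SNOPT's state in `restartDict`, but the shape of the workflow (snapshot every major iteration, resume from the last snapshot) is the same.

```python
# Hedged sketch of the restart workflow: run, crash, resume from the last
# per-iteration snapshot. The function and dict layout are illustrative
# assumptions, not the pyOptSparse/SNOPT implementation.

def run_optimization(restart_dict=None, crash_at=None):
    """Stand-in optimizer that snapshots its state every major iteration."""
    state = dict(restart_dict) if restart_dict else {"major": 0}
    saved = None
    while state["major"] < 5:
        state["major"] += 1
        saved = dict(state)           # what would be written to disk each iter
        if crash_at is not None and state["major"] == crash_at:
            return saved, False       # simulated crash / wall-time limit
    return saved, True

snapshot, done = run_optimization(crash_at=3)             # first run dies early
snapshot, done = run_optimization(restart_dict=snapshot)  # resume and finish
```

Because a snapshot exists for every major iteration, the time limit no longer needs to be conservative: whatever iteration the job dies at, the latest snapshot is on disk.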
Type of change
Testing
I added a test that writes `restartDict` using `snstop` and uses the dictionary to restart the optimization. I also tested this with my aerodynamic shape optimization cases.

Checklist
- I ran `flake8` and `black` to make sure the Python code adheres to PEP-8 and is consistently formatted
- I formatted Fortran code with `fprettify` or C/C++ code with `clang-format` as applicable