Releases: Netflix/metaflow
2.15.4
What's Changed
- check gum and helm is installed properly by @savingoyal in #2326
- Allow users to pass a list of values as varargs to click API by @talsperre in #2330
- Yet another patch release by @talsperre in #2332
Full Changelog: 2.15.3...2.15.4
2.15.3
What's Changed
- Fix issue with parsers not being compatible with extensions by @romain-intel in #2320
- change default make goal to help by @savingoyal in #2323
- configure local events in makefile by @savingoyal in #2324
- yet another patch release by @savingoyal in #2325
Full Changelog: 2.15.2...2.15.3
2.15.2
What's Changed
- look for Makefile in more places by @savingoyal in #2315
- yet another patch release by @savingoyal in #2316
Full Changelog: 2.15.1...2.15.2
2.15.1
What's Changed
- support alternate architectures for @conda/@pypi by @savingoyal in #2304
- more realtime log fetching by @savingoyal in #2307
- @pypi/@conda parsers ... by @savingoyal in #2306
- Add metaflow-diff to main CLI by @npow in #2282
- [jobsets] Fix bug in jobset atexit handler by @valayDave in #2312
- Fix issue in stubs where step decorators dont have any parameters by @talsperre in #2313
- release version 2.15.1 by @talsperre in #2314
Full Changelog: 2.15.0...2.15.1
2.15.0
What's Changed
- mute stubs test for linux py3.13 by @savingoyal in #2295
- fix test configuration by @savingoyal in #2294
- feat: update metadata on ec2 spot termination for batch by @felippemr in #2271
- feature: pin fast-init binary to latest by @saikonen in #2300
- skip signal handling in non main thread by @madhur-ob in #2299
- local kubernetes setup by @savingoyal in #2292
- release: 2.15.0 by @saikonen in #2303
New Contributors
- @felippemr made their first contribution in #2271
Full Changelog: 2.14.3...2.15.0
2.14.3
What's Changed
- Revert "change default micromamba for s3 datastore" by @savingoyal in #2288
- release 2.14.3 by @savingoyal in #2289
Full Changelog: 2.14.2...2.14.3
2.14.2
What's Changed
- [Ready for Review] Make TOGGLE_DECOSPECS only add default decorator once. by @darinyu in #2097
- release 2.14.2 by @savingoyal in #2287
Full Changelog: 2.14.1...2.14.2
2.14.1
What's Changed
- change default micromamba for s3 datastore by @savingoyal in #2262
- Tracing cleanups by @savingoyal in #2265
- More varargs tweaks by @romain-intel in #2249
- Update flag for including for each stack info in metadata by @talsperre in #2266
- fix save_logs.py by @savingoyal in #2268
- Update argo_workflows.py by @savingoyal in #2270
- [argo] fix bug in jobset owner reference by @valayDave in #2280
- [argo] add additional labels to jobsets created by argo by @valayDave in #2281
- add extract method to MetaflowCode by @savingoyal in #2246
- Support escape init function by @wangchy27 in #2267
- fix: Argo notifications issues by @saikonen in #2285
- Add static and runtime dag info, API to fetch ancestor and successor tasks by @talsperre in #2124
- bump version to 2.14.1 by @wangchy27 in #2286
Full Changelog: 2.14.0...2.14.1
2.14.0
Improvements
Fix regression with Metaflow Deployer
This release reverts a change that caused the Metaflow deployer for Argo Workflows and AWS Step Functions not to work.
Minor version bump
The minor version is being bumped because the last release changed the defaults for artifact serialization (#2243), even though this should not have any functional effect with Python versions >=3.4.
Connection pooling for metadata service
Improves the time it takes to launch the first task by using a connection pool for metadata service traffic.
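To illustrate what connection pooling buys here, below is a minimal, hypothetical sketch (not Metaflow's internal client): a shared `requests.Session` backed by an `HTTPAdapter` pool reuses TCP connections across metadata-service calls instead of opening a new connection per request. The `METADATA_SERVICE_URL` endpoint and the `get_flow` helper are illustrative assumptions only.

```python
# Illustrative sketch of HTTP connection pooling; not Metaflow's internal code.
import requests
from requests.adapters import HTTPAdapter

METADATA_SERVICE_URL = "http://localhost:8080"  # hypothetical endpoint

session = requests.Session()
adapter = HTTPAdapter(pool_connections=10, pool_maxsize=10)  # keep up to 10 pooled connections
session.mount("http://", adapter)
session.mount("https://", adapter)

def get_flow(flow_name: str):
    # Each call reuses a pooled connection, avoiding a fresh TCP/TLS handshake per request.
    resp = session.get(f"{METADATA_SERVICE_URL}/flows/{flow_name}", timeout=5)
    resp.raise_for_status()
    return resp.json()
```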
What's Changed
- Introduce connection pooling in metadata service by @savingoyal in #2258
- fix: Deployer issue by @saikonen in #2261
- release: 2.14.0 by @saikonen in #2263
Full Changelog: 2.13.10...2.14.0
2.13.10
Features
Incident.io alerts for Argo Workflows
This release introduces support for Incident.io alerts with Argo Workflows. Enabling these requires some additional configuration compared to other notification implementations. As an example:
```bash
python alerting_flow.py argo-workflows create \
  --notify-on-error \
  --notify-on-success \
  --notify-incident-io-api-key API-KEY \
  --incident-io-error-severity-id ERROR-ID \
  --incident-io-success-severity-id SUCCESS-ID
```
The API key used should have permissions to create incidents.
The severity IDs are a requirement from incident.io, as this is how alerts are categorized. All severity IDs are account-based and users can create new ones as they please, which is why they must be set as part of the flow deployment.
Improvements
Default to Pickle protocol 4 for artifacts
This release changes the default artifact serialization to use protocol 4 for pickling. The change should lead to storage savings for small (<2 GB) artifacts, along with faster serialization since protocol 2 is no longer tried first.
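As a rough illustration of the change (not Metaflow's serializer), the sketch below compares pickle protocol 2 with protocol 4, which has been available since Python 3.4; the artifact data is made up.

```python
# Minimal sketch comparing pickle protocols for a small artifact-like object.
# Protocol 4 can produce a more compact payload for some objects and avoids
# the extra pass of attempting protocol 2 first.
import pickle

artifact = {"rows": list(range(10_000)), "label": "example"}

p2 = pickle.dumps(artifact, protocol=2)
p4 = pickle.dumps(artifact, protocol=4)

print(f"protocol 2: {len(p2)} bytes")
print(f"protocol 4: {len(p4)} bytes")
assert pickle.loads(p4) == artifact  # round-trip still works
```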
What's Changed
- Update cli.py to use `echo_always()` for method `output_raw` and `out…` by @xujiboy in #2244
- feature: basic support for incident.io in Argo Workflows by @saikonen in #2245
- Fix issues with configs and Runner by @romain-intel in #2234
- Update project doc with new possible options by @romain-intel in #2220
- serialize artifacts with pickle protocol 4 if possible by @amerberg in #2243
- change default micromamba for s3 datastore by @savingoyal in #2254
- Revert "change default micromamba for s3 datastore" by @savingoyal in #2255
- skip boto3 compilation for code download by @savingoyal in #2257
- release: 2.13.10 by @saikonen in #2260
Full Changelog: 2.13.9...2.13.10