Releases: Netflix/metaflow
2.12.19
What's Changed
- add local root for @pypi by @savingoyal in #2004
- fix: support user-defined attribute checks for pypi_base as well by @saikonen in #2007
- Bump svelte from 4.2.8 to 4.2.19 in /metaflow/plugins/cards/ui by @dependabot in #2008
- feature: add is_attribute_user_defined check to conda decorators by @saikonen in #2011
Full Changelog: 2.12.18...2.12.19
2.12.18
2.12.17
What's Changed
- Fix bootstrap script command by @ob-uk in #1994
- metaflow 2.12.17 by @savingoyal in #1995
New Contributors
Full Changelog: 2.12.16...2.12.17
2.12.16
What's Changed
- fix: deal with empty string and None values in s3 endpoint_url by @saikonen in #1991
- fix: boto3 bootstrap cleanup by @saikonen in #1992
- metaflow 2.12.16 by @savingoyal in #1993
Full Changelog: 2.12.15...2.12.16
2.12.15
What's Changed
- add snowpark by @madhur-ob in #1986
- metaflow 2.12.15 by @savingoyal in #1987
- don't support slurm, snowpark, nvidia by @madhur-ob in #1988
Full Changelog: 2.12.14...2.12.15
2.12.14
What's Changed
- feature: add executable attribute to batch/kubernetes decorators by @saikonen in #1976
- fix: rename docker environment to fast-bakery by @saikonen in #1978
- feature: add is_attribute_user_defined check to pypi decorator by @saikonen in #1979
- make timeout configurable by @madhur-ob in #1980
- azure creds by @madhur-ob in #1977
- add wait_for_run with timeout by @madhur-ob in #1984
- metaflow 2.12.14 by @savingoyal in #1983
Full Changelog: 2.12.13...2.12.14
2.12.13
What's Changed
- enable heart beat daemon and error tracking exit hook by default by @savingoyal in #1953
- Add local configuration option by @romain-intel in #1850
- support compute pools by @savingoyal in #1952
- fix: Metaflow local config init by @saikonen in #1967
- Fix an issue with _graph_info by @romain-intel in #1966
- [Ready for Review] Fix issue where resuming on successful run will fail. by @darinyu in #1956
- Align cloning log msgs by @savingoyal in #1963
- fix: passing kubernetes compute pool option by @saikonen in #1968
- feature: better bootstrap dependencies by @saikonen in #1972
- fix(MetaflowData): Raise AttributeError on failed __getattr__ access by @mutongx in #1971
- Update version to 2.12.13 by @romain-intel in #1973
New Contributors
Full Changelog: 2.12.12...2.12.13
2.12.12
What's Changed
- fix env var by @iamsgarg-ob in #1959
- release 2.12.12 by @savingoyal in #1960
Full Changelog: 2.12.11...2.12.12
2.12.11
What's Changed
- Sanitize path names when creating cache directories in FileCache by @romain-intel in #1931
- comment clean in argo workflows by @savingoyal in #1942
- Capture stack traces from Argo errors by @savingoyal in #1941
- add secrets to exit hook by @savingoyal in #1944
- Fix error capture commands by @iamsgarg-ob in #1945
- adding additional env vars to capture error by @iamsgarg-ob in #1946
- upgrade black by @savingoyal in #1949
- Inject ns by @madhur-ob in #1950
- 2.12.11 release by @savingoyal in #1951
Full Changelog: 2.12.10...2.12.11
2.12.10
Release Notes
This patch release introduces support for emitting run heartbeats for the entire duration of a run's lifecycle on Argo Workflows. Metaflow's UI relies on a complex state machine to ascertain the correct state of any run, step, task, and attempt. For runs executed on workflow orchestrators like AWS Step Functions, Argo Workflows, and Airflow, there is by design no execution-wide supervisor process that can monitor and track the state of the execution for Metaflow. Instead, Metaflow relies on a task-local process to report this state, which helps Metaflow scale effortlessly to millions of tasks. However, if no task is scheduled for a while, Metaflow's UI might temporarily show the run as failed (red) before correcting it (green) once a task is scheduled, which can be confusing.
With the latest release, executions on Argo Workflows can kick off a daemon process that stays alive for the entire duration of the execution and reliably emits a liveness signal throughout. You can enable this by simply adding the --enable-heartbeat-daemon flag:
python flow.py argo-workflows create --enable-heartbeat-daemon
The next release will enable this functionality by default. If this turns out to be useful for you, or if you have any feedback, ping us at chat.metaflow.org!
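For illustration, here is a minimal sketch of a flow that could be deployed with the command above. The flow name (HelloFlow), file name (flow.py), and step contents are placeholders for this example and are not part of the release itself; only the argo-workflows create command and the --enable-heartbeat-daemon flag come from the release notes.

# flow.py -- hypothetical minimal flow used only to illustrate the deployment command
from metaflow import FlowSpec, step

class HelloFlow(FlowSpec):

    @step
    def start(self):
        # First step of the flow; runs as a single task on Argo Workflows.
        print("starting")
        self.next(self.end)

    @step
    def end(self):
        # Final step; the run-level heartbeat daemon covers the whole run,
        # including any idle time between scheduled tasks.
        print("done")

if __name__ == "__main__":
    HelloFlow()

# Deploy to Argo Workflows with the run-level heartbeat daemon enabled:
#   python flow.py argo-workflows create --enable-heartbeat-daemon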