Blog for 0.7 release #49
Conversation
✔️ Deploy Preview for elastic-nobel-0aef7a ready! 🔨 Explore the source changes: 4a7e3cc 🔍 Inspect the deploy log: https://app.netlify.com/sites/elastic-nobel-0aef7a/deploys/61689f4ebe866b000811d89c 😎 Browse the preview: https://deploy-preview-49--elastic-nobel-0aef7a.netlify.app
@js-ts Can you help squash the commits? There are 30 commits.
---
### Authors
**Dan Sun**,**Animesh Singh**,**Vedant Padwal** on behalf of the KServe Community
Suggested change:
**Dan Sun**, **Animesh Singh**, **Vedant Padwal** on behalf of the KServe Working Group.
This helps to migrate existing inference services running in the cluster from KFServing to KServe without downtime.
The script installs `KServe` from the 0.7 release and deletes your `KFServing` installation after migrating the inference services from `serving.kubeflow.org` to `serving.kserve.io`.
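As a concrete illustration of what that migration means for a manifest (not taken from this PR): the spec stays the same and only the API group changes. A minimal sketch; the name, namespace, and storage URI below are placeholders.

```yaml
# Before migration the apiVersion was serving.kubeflow.org/v1beta1;
# after migration only this field changes.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris        # placeholder name
  namespace: kserve-test    # placeholder namespace
spec:
  predictor:
    sklearn:
      storageUri: gs://kfserving-examples/models/sklearn/iris   # placeholder model location
```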
No need to repeat the docs here since you already provided the link; I think the migration instructions still need to be tweaked.
<br>
- `InferenceService` API group is changed from `serving.kubeflow.org` to `serving.kserve.io` [#1826](https://github.com/kserve/kserve/issues/1826).
- Python SDK name is changed from `kfserving` to `kserve` [#1827](https://github.com/kserve/kserve/issues/1827).
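To make the SDK rename concrete, here is a small hedged sketch of the new import path, assuming the `kserve` package published on PyPI; the service name and namespace are placeholders, not values from this PR.

```python
# pip install kserve   (the package was previously published as "kfserving")
from kserve import KServeClient  # previously: from kfserving import KFServingClient

# The client reads the local kubeconfig, as the old KFServingClient did.
client = KServeClient()

# Fetch an existing InferenceService, now served under the serving.kserve.io API group.
isvc = client.get("sklearn-iris", namespace="kserve-test")
print(isvc.get("status", {}).get("url"))
```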
Can you provide a link to the PyPI package (https://pypi.org/project/kserve)?
- KServe Installation manifests [#1824](https://github.com/kserve/kserve/issues/1824).
- Model-web-app is separated out of the kserve repository to [#1820](https://github.com/kserve/kserve/issues/1820).
This should link to https://github.com/kserve/models-web-app.
### **🐞 What's Fixed?**
- Bug fix for Azure blob
Link to the issue?
KFServing depends entirely on Knative to bring serving features such as scale-to-zero, the Knative Pod Autoscaler (KPA), and canary rollouts to ML/DL deployments, and the serving runtime contract requires HTTP as the inbound transport. There are use cases where Knative would be a great solution but HTTP is not used; in particular, due to safety concerns, some cases need alternative transports such as UDP, TCP, or even custom protocols. So we need an effective way to solve this problem and make both Knative Serving and native Kubernetes serving available in the cluster.
This is not quite accurate and needs to be rephrased. The raw Kubernetes deployment mode is about keeping the dependencies lightweight and removing the Knative limitations.
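For context on the raw deployment mode discussed here, it is typically enabled per InferenceService with a deployment-mode annotation; a hedged sketch, with metadata and storage URI as placeholders rather than values from the post:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris-raw                             # placeholder name
  annotations:
    # Use a plain Deployment/Service/HPA instead of Knative-managed revisions.
    serving.kserve.io/deploymentMode: RawDeployment
spec:
  predictor:
    sklearn:
      storageUri: gs://kfserving-examples/models/sklearn/iris   # placeholder model location
```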
trade-off between responsiveness to users and computational footprint.
To learn more about ModelMesh features and components, check out the [ModelMesh release blog](https://developer.ibm.com/blogs/kserve-and-watson-modelmesh-extreme-scale-model-inferencing-for-trusted-ai/).
<br>
- #### (Alpha feature) Raw kubernetes deployment support,Istio/Knative dependency is now optional [#1508](https://github.com/kserve/kserve/issues/1508),[#1830](https://github.com/kserve/kserve/issues/1830).
Suggested change:
- #### (Alpha feature) Raw kubernetes deployment support, Istio/Knative dependency is now optional [#1508](https://github.com/kserve/kserve/issues/1508),[#1830](https://github.com/kserve/kserve/issues/1830).
You can point to the installation guide: https://kserve.github.io/website/admin/kubernetes_deployment
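For readers who reach this thread before the guide is linked, the cluster-wide default mode can also be switched in the `inferenceservice-config` ConfigMap; the sketch below assumes the `deploy.defaultDeploymentMode` key described in that admin guide, so verify the exact keys there. Only the relevant key is shown; it should be merged into the existing ConfigMap rather than replacing it.

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: inferenceservice-config
  namespace: kserve
data:
  deploy: |
    {
      "defaultDeploymentMode": "RawDeployment"
    }
```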
Update 2021-10-11-KServe-0.7-release.md: add ModelMesh blog link, edit ModelMesh blog text.
Co-authored-by: Dan Sun <[email protected]>
/lgtm
@js-ts: you cannot LGTM your own PR. In response to this:
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
/lgtm
[APPROVALNOTIFIER] This PR is APPROVED. This pull request has been approved by: js-ts, yuzisun. The full list of commands accepted by this bot can be found here. The pull request process is described here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing `/approve` in a comment.