Commit 125b591 (1 parent 24b69da)

Readme updates for diffusion model

File tree: 1 file changed (+11, -11 lines)

  • model-deployment/diffusion-models/bentoml

# Introduction

Diffusion models are a type of generative model that learns to create new data samples by reversing a gradual noising process. They work by progressively adding noise to real data points until they become pure noise, and then training a neural network to reverse this process, effectively learning to generate data from noise.

[BentoML](https://github.com/bentoml/BentoML) is a Python library for building online serving systems optimized for AI apps and model inference. What sets it apart from other text generation frameworks is that it also supports image generation use cases with Stable Diffusion 3 Medium, Stable Video Diffusion, Stable Diffusion XL Turbo, ControlNet, and LCM LoRAs.

In this sample, we are going to deploy [Stable Diffusion 3 Medium](https://github.com/bentoml/BentoDiffusion/tree/main/sd3-medium) with BentoML.

# Steps

## Dockerize

First, let's dockerize our model serving framework using the [Dockerfile](./Dockerfile).
```
docker build -f Dockerfile -t bentoml:latest .
```
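
The referenced Dockerfile is not reproduced on this page. As a rough, hypothetical sketch only (base image, installed packages, port, and service entrypoint name are all assumptions, not the repository's actual file), it could look something like:

```
# Hypothetical sketch -- see the repository's actual Dockerfile.
FROM python:3.10-slim

WORKDIR /app

# BentoML plus the libraries a Stable Diffusion service typically needs (assumed set)
RUN pip install --no-cache-dir bentoml torch diffusers transformers

# Copy the BentoML service code
COPY sd3-medium/ .

# BentoML's default serving port
EXPOSE 3000

# Serve the BentoML service (module and class name are assumptions)
CMD ["bentoml", "serve", "service:SD3Medium"]
```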

## Create BentoML framework API code to serve Stable Diffusion 3 Medium

Refer to the code in this [directory](./sd3-medium).
Note the changes made in order to support it on OCI Data Science Model Deployment:
* Add readiness logic if needed, for checking the health of the model server.
* Add a route in the BentoML API to support a `predict` endpoint for image generation.
* Check the OCI Buckets integration using resource principal to put the generated images in a bucket of your choice.

NOTE - In order to allow the model deployment service to create objects in your bucket, add the policy below:
```
allow any-user to manage objects in compartment <compartment> where ALL { request.principal.type='datasciencemodeldeployment', target.bucket.name='<BUCKET_NAME>' }
```
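
To make the bullet points above concrete, a BentoML service along these lines could expose the `predict` route and upload results using resource principal auth. This is a hypothetical sketch, not the repository's actual service code; the class name, model id, bucket name, and object naming are assumptions:

```
# Hypothetical sketch only -- refer to the actual code in ./sd3-medium.
import io

import bentoml


@bentoml.service(traffic={"timeout": 300})
class SD3Medium:
    def __init__(self):
        import torch
        from diffusers import StableDiffusion3Pipeline

        # Load the SD3 Medium pipeline (model id is an assumption)
        self.pipe = StableDiffusion3Pipeline.from_pretrained(
            "stabilityai/stable-diffusion-3-medium-diffusers",
            torch_dtype=torch.float16,
        ).to("cuda")

    @bentoml.api(route="/predict")
    def predict(
        self,
        prompt: str,
        num_inference_steps: int = 10,
        guidance_scale: float = 7.0,
    ) -> str:
        image = self.pipe(
            prompt=prompt,
            num_inference_steps=num_inference_steps,
            guidance_scale=guidance_scale,
        ).images[0]

        # Upload the generated image to Object Storage via resource principal
        import oci

        signer = oci.auth.signers.get_resource_principals_signer()
        client = oci.object_storage.ObjectStorageClient(config={}, signer=signer)
        namespace = client.get_namespace().data
        buf = io.BytesIO()
        image.save(buf, format="PNG")
        client.put_object(namespace, "<BUCKET_NAME>", "generated.png", buf.getvalue())
        return "generated.png"
```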

…

Note - Create a VCN, Subnet with internet connectivity in order to fetch the model.

Create model deployment using the [file](./model-deployment.py) as reference.

## Prediction
Once the model deployment is active, use the OCI CLI request below to send an image generation call:
```
oci raw-request --http-method POST --target-uri <MODEL_DEPLOYMENT_ENDPOINT> --request-body '{ "prompt": "A cat holding a sign that says hello World", "num_inference_steps": 10, "guidance_scale": 7.0 }' --request-headers '{"Content-Type":"application/json"}'
```
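
The request body in the call above is plain JSON, so it can also be built programmatically before being passed to `oci raw-request`. A stdlib-only sketch:

```
import json

# Build the same image generation request body used in the oci raw-request call above
payload = {
    "prompt": "A cat holding a sign that says hello World",
    "num_inference_steps": 10,
    "guidance_scale": 7.0,
}
body = json.dumps(payload)
print(body)
```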

The generated image will be placed in the chosen bucket.
