---
title: Continue Integration
---
import { Aside, CardGrid, Card, LinkCard, Steps, Badge } from '@astrojs/starlight/components';
import { Image } from 'astro:assets';

This document provides an in-depth tutorial and demo for using the Continue VS Code extension with Db2 for i.

## Getting Started: Continue

Continue is the leading open-source AI code assistant for VS Code. It provides a wide range of AI features:

* Chat interface
* Code autocomplete
* Inline editing

### Install the Continue extension for VS Code

<Steps>
  1. Install the Continue extension from the [VS Code Marketplace](https://marketplace.visualstudio.com/items?itemName=Continue.continue).
   
  2. Once installed, there will be a new Continue icon in your VS Code menu (mine is on the top right). Click the icon to open the chat window.
   
</Steps>

Once the extension is installed, you can configure the AI provider you want to use. Continue supports multiple AI providers (including [Watsonx](https://docs.continue.dev/customize/model-providers/more/watsonx)!). Choose a provider by clicking the settings icon in the chat window.

For demonstration purposes, we will use the Ollama provider to host LLMs locally on your machine.

### Setting up the Ollama Provider

Here is a step-by-step guide to setting up the Ollama provider with the IBM Granite models in Continue:

#### 1. Install Ollama

Install Ollama on your machine by following the link below:
<LinkCard title="Install Ollama" href="https://ollama.com/download" />

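As a quick sanity check after installing, you can confirm the `ollama` CLI is on your PATH (on macOS and Windows the desktop app also starts the background server for you):

```bash
# Verify the CLI is installed and report its version
ollama --version

# Start the local server manually if it is not already running
ollama serve
```
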
#### 2. Fetch the IBM Granite 3.0 models

The IBM Granite 3.0 models are available in the Ollama model registry. More information about the IBM Granite models can be found [here](https://ollama.com/blog/ibm-granite).

Using the Ollama CLI, fetch the IBM Granite 3.0 8b model by running the following command:
```bash
ollama pull granite3-dense:8b
```
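
To confirm the model downloaded successfully, list your local models and optionally give it a quick smoke test:

```bash
# List locally available models; granite3-dense:8b should appear
ollama list

# Optional: run a one-off prompt against the model
ollama run granite3-dense:8b "Say hello"
```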

#### 3. Configure the Ollama provider in Continue

Open the VS Code Command Palette (press Ctrl+Shift+P) and search for `Continue: open config.json`. This will open the central Continue config file `$HOME/.continue/config.json` in your editor. To enable the Granite models in Ollama, add the following configuration to the `models` section:

```json title="~/.continue/config.json"
"models": [
    {
      "title": "Granite Code 8b",
      "provider": "ollama",
      "model": "granite3-dense:8b"
    }
  ],
```

Save this file, then select the Granite model in the chat window.



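Continue can also drive inline autocomplete from a local model through the top-level `tabAutocompleteModel` entry in the same config file. A minimal sketch, reusing the Granite model pulled above (a smaller model generally gives faster completions, so treat this choice as a starting point):

```json title="~/.continue/config.json"
"tabAutocompleteModel": {
  "title": "Granite Code 8b",
  "provider": "ollama",
  "model": "granite3-dense:8b"
}
```
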
### Examples

Once you have the extension installed and an AI provider configured, you can ask questions about your database in the chat window using the `@db2i` context provider. In Continue, a context provider is very similar to a chat participant in GitHub Copilot: it supplies additional context to the AI model, helping it generate more accurate SQL queries.

More on context providers can be found [here](https://docs.continue.dev/customize/context-providers/).

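For example, including `@db2i` in the chat input attaches database context to your message, so a prompt like `@db2i Summarize the columns in the EMPLOYEE table` can be answered against your actual schema rather than from the model's general knowledge.
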
**Example 1:** Summarize the columns in the `EMPLOYEE` table



**Example 2:** Get the department name for each employee


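
For a prompt like Example 2, the generated answer is typically a join along these lines (sketched here against the standard Db2 sample schema, where `EMPLOYEE.WORKDEPT` references `DEPARTMENT.DEPTNO`; the query your model produces may differ):

```sql
-- Department name for each employee (standard Db2 sample schema)
SELECT E.FIRSTNME,
       E.LASTNAME,
       D.DEPTNAME
  FROM EMPLOYEE E
  JOIN DEPARTMENT D
    ON E.WORKDEPT = D.DEPTNO;
```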