
Commit a4e543c

refactor(google_gemini.py): simplify system prompt handling and update documentation
1 parent 12b75f2


2 files changed: +49 −120 lines


docs/google-gemini-integration.md

Lines changed: 19 additions & 43 deletions
@@ -251,72 +251,48 @@ To use this filter, ensure it's enabled in your Open WebUI configuration. Then,
 
 Native tool calling is enabled/disabled via the standard 'Function calling' Open Web UI toggle.
 
-## System Prompt Hierarchy
+## Default System Prompt
 
-The Google Gemini pipeline supports a hierarchical system prompt configuration that combines multiple sources. This allows for flexible customization at different levels: global defaults and user preferences.
-
-### Prompt Sources (in order of combination)
-
-1. **Default System Prompt** (`GOOGLE_DEFAULT_SYSTEM_PROMPT`): Global default applied to all chats, configurable via environment variable or Admin UI valves.
-
-2. **User System Prompt**: The user's personalized system prompt from either:
-   - **Chat Controls**: The system message passed with individual chat messages
-   - **User Settings** (Settings > Personalization): Stored in `settings.ui.system`
-
-   Note: Chat controls take precedence over user settings if both are set.
+The Google Gemini pipeline supports a configurable default system prompt that is applied to all chats. This is useful when you want to consistently apply certain behaviors or instructions to all Gemini models without having to configure each model individually.
 
 ### How It Works
 
-All available prompts are combined in order, separated by blank lines:
-
-```
-{Default System Prompt}
-
-{User System Prompt}
-```
-
-If only one prompt source is set, it is used as-is without any additional formatting.
+- **Default Only**: If only `GOOGLE_DEFAULT_SYSTEM_PROMPT` is set and no user-defined system prompt exists, the default prompt is used as the system instruction.
+- **User Only**: If only a user-defined system prompt exists (from model settings), it is used as-is.
+- **Both**: If both are set, the default system prompt is **prepended** to the user-defined prompt, separated by a blank line. This allows you to have base instructions that apply to all chats while still allowing model-specific customization.
 
 ### Configuration
 
-**Environment Variable:**
+Set via environment variable:
 
 ```bash
 # Default system prompt applied to all chats
-# Combined with user prompts if they exist
+# If a user-defined system prompt exists, this is prepended to it
 GOOGLE_DEFAULT_SYSTEM_PROMPT="You are a helpful AI assistant. Always be concise and accurate."
 ```
 
 Or configure through the pipeline valves in Open WebUI's Admin panel.
 
-**User System Prompt:**
-
-Users can set their personalized system prompt in Open WebUI:
-1. Go to Settings > Personalization
-2. Enter your preferred system prompt in the "System Prompt" field
-3. Save settings
-
-Alternatively, users can override the system prompt per-chat using chat controls. If both are set, the chat controls value takes precedence.
-
 ### Example
 
-If your configuration is:
+If your default system prompt is:
 
-**Default system prompt (`GOOGLE_DEFAULT_SYSTEM_PROMPT`):**
 ```
 You are a helpful AI assistant.
 ```
 
-**User system prompt (chat controls OR Settings > Personalization):**
+And your model-specific system prompt is:
+
 ```
-My name is John. I prefer detailed explanations.
+Always respond in formal English.
 ```
 
 The combined system prompt sent to Gemini will be:
+
 ```
 You are a helpful AI assistant.
 
-My name is John. I prefer detailed explanations.
+Always respond in formal English.
 ```
 
 ## Thinking Configuration

@@ -438,9 +414,9 @@ print(response.text)
 
 ### Model Compatibility
 
-| Model | thinking_level | thinking_budget |
-|-------|---------------|-----------------|
-| gemini-3-* | ✅ Supported ("low", "high") | ❌ Not used |
-| gemini-2.5-* | ❌ Not used | ✅ Supported (0-32768) |
-| gemini-2.5-flash-image-* | ❌ Not supported | ❌ Not supported |
-| Other models | ❌ Not used | ✅ May be supported |
+| Model                     | thinking_level               | thinking_budget        |
+| ------------------------- | ---------------------------- | ---------------------- |
+| gemini-3-\*               | ✅ Supported ("low", "high") | ❌ Not used            |
+| gemini-2.5-\*             | ❌ Not used                  | ✅ Supported (0-32768) |
+| gemini-2.5-flash-image-\* | ❌ Not supported             | ❌ Not supported       |
+| Other models              | ❌ Not used                  | ✅ May be supported    |
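The combination rules documented above can be sketched as a small standalone function. This is a simplified illustration of the documented behavior, not the pipeline's actual code; the function name is hypothetical:

```python
from typing import Optional


def combine_system_prompts(default_prompt: str, user_prompt: Optional[str]) -> Optional[str]:
    """Sketch of the documented rules: default only, user only, or both
    (default prepended, separated by a blank line)."""
    default = default_prompt.strip()
    user = user_prompt.strip() if user_prompt else ""
    if default and user:
        # Both set: default is prepended, separated by a blank line
        return f"{default}\n\n{user}"
    # Otherwise whichever is set is used as-is; None if neither is set
    return default or user or None


combined = combine_system_prompts(
    "You are a helpful AI assistant.",
    "Always respond in formal English.",
)
# combined == "You are a helpful AI assistant.\n\nAlways respond in formal English."
```

Note that whitespace-only prompts are treated as unset, matching the `strip()` handling in the pipeline code below.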

pipelines/google/google_gemini.py

Lines changed: 30 additions & 77 deletions
@@ -34,7 +34,6 @@
 - Flexible upload fallback options and optimization controls
 - Configurable thinking levels (low/high) for Gemini 3 models
 - Configurable thinking budgets (0-32768 tokens) for Gemini 2.5 models
-- Hierarchical system prompts (default, user)
 """
 
 import os

@@ -209,8 +208,8 @@ class Valves(BaseModel):
     )
     DEFAULT_SYSTEM_PROMPT: str = Field(
         default=os.getenv("GOOGLE_DEFAULT_SYSTEM_PROMPT", ""),
-        description="Default system prompt applied to all chats. Combined with user prompt "
-        "in a hierarchy: default → user. Leave empty to disable.",
+        description="Default system prompt applied to all chats. If a user-defined system prompt exists, "
+        "this is prepended to it. Leave empty to disable.",
     )
 
     # Image Processing Configuration

@@ -291,78 +290,38 @@ def _deduplicate_images(self, images: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
             result.append(part)
         return result
 
-    def _get_user_personalization_prompt(
-        self, __user__: Optional[dict] = None
-    ) -> Optional[str]:
-        """Get the per-user system prompt from user settings (Personalization).
-
-        In Open WebUI, users can configure a personalized system prompt
-        in Settings > Personalization. This is stored in __user__["settings"]["ui"]["system"].
-
-        Args:
-            __user__: The user dict passed to the pipe method
-
-        Returns:
-            The user's personalized system prompt or None if not set
-        """
-        if __user__ is None:
-            return None
-
-        try:
-            settings = __user__.get("settings")
-            if settings and isinstance(settings, dict):
-                ui_settings = settings.get("ui")
-                if ui_settings and isinstance(ui_settings, dict):
-                    system_prompt = ui_settings.get("system")
-                    if system_prompt and isinstance(system_prompt, str):
-                        return system_prompt.strip() or None
-        except Exception as e:
-            self.log.debug(f"Could not retrieve user personalization prompt: {e}")
-
-        return None
-
     def _combine_system_prompts(
-        self,
-        chat_system_prompt: Optional[str],
-        __user__: Optional[dict] = None,
+        self, user_system_prompt: Optional[str]
     ) -> Optional[str]:
-        """Combine default and user system prompts.
+        """Combine default system prompt with user-defined system prompt.
 
-        Prompt hierarchy (all prompts are combined if set):
-        1. DEFAULT_SYSTEM_PROMPT (environment/valve setting)
-        2. User system prompt (from chat controls OR user settings - chat controls take precedence)
+        If DEFAULT_SYSTEM_PROMPT is set and user_system_prompt exists,
+        the default is prepended to the user's prompt.
+        If only DEFAULT_SYSTEM_PROMPT is set, it is used as the system prompt.
+        If only user_system_prompt is set, it is used as-is.
 
         Args:
-            chat_system_prompt: The chat-level system prompt from messages (may be None)
-            __user__: The user dict passed to the pipe method
+            user_system_prompt: The user-defined system prompt from messages (may be None)
 
         Returns:
-            Combined system prompt or None if none are set
+            Combined system prompt or None if neither is set
         """
-        default_prompt = self.valves.DEFAULT_SYSTEM_PROMPT.strip() or None
-        user_personalization = self._get_user_personalization_prompt(__user__)
-        chat_prompt = chat_system_prompt.strip() if chat_system_prompt else None
+        default_prompt = self.valves.DEFAULT_SYSTEM_PROMPT.strip()
+        user_prompt = user_system_prompt.strip() if user_system_prompt else ""
 
-        # User prompt = chat controls OR user settings (chat controls take precedence if both are set)
-        user_prompt = chat_prompt or user_personalization
-
-        prompts = [p for p in [default_prompt, user_prompt] if p]
-
-        if not prompts:
-            return None
-
-        if len(prompts) == 1:
-            self.log.debug(f"Using single system prompt ({len(prompts[0])} chars)")
-            return prompts[0]
-
-        combined = "\n\n".join(prompts)
-        self.log.debug(
-            f"Combined system prompts: "
-            f"default={len(default_prompt) if default_prompt else 0}, "
-            f"user={len(user_prompt) if user_prompt else 0} -> "
-            f"total={len(combined)} chars"
-        )
-        return combined
+        if default_prompt and user_prompt:
+            combined = f"{default_prompt}\n\n{user_prompt}"
+            self.log.debug(
+                f"Combined system prompts: default ({len(default_prompt)} chars) + "
+                f"user ({len(user_prompt)} chars) = {len(combined)} chars"
+            )
+            return combined
+        elif default_prompt:
+            self.log.debug(f"Using default system prompt ({len(default_prompt)} chars)")
+            return default_prompt
+        elif user_prompt:
+            return user_prompt
+        return None
 
     def _apply_order_and_limit(
         self,

@@ -449,7 +408,6 @@ async def _build_image_generation_contents(
         self,
         messages: List[Dict[str, Any]],
         __event_emitter__: Callable,
-        __user__: Optional[dict] = None,
     ) -> Tuple[List[Dict[str, Any]], Optional[str]]:
         """Construct the contents payload for image-capable models.

@@ -462,9 +420,7 @@ async def _build_image_generation_contents(
         )
 
         # Combine with default system prompt if configured
-        system_instruction = self._combine_system_prompts(
-            user_system_instruction, __user__
-        )
+        system_instruction = self._combine_system_prompts(user_system_instruction)
 
         last_user_msg = next(
             (m for m in reversed(messages) if m.get("role") == "user"), None

@@ -923,16 +879,13 @@ def _prepare_model_id(self, model_id: str) -> str:
         return model_id
 
     def _prepare_content(
-        self,
-        messages: List[Dict[str, Any]],
-        __user__: Optional[dict] = None,
+        self, messages: List[Dict[str, Any]]
     ) -> Tuple[List[Dict[str, Any]], Optional[str]]:
         """
         Prepare messages content for the API and extract system message if present.
 
         Args:
             messages: List of message objects from the request
-            __user__: The user dict passed to the pipe method
 
         Returns:
             Tuple of (prepared content list, system message string or None)

@@ -944,7 +897,7 @@ def _prepare_content(
         )
 
         # Combine with default system prompt if configured
-        system_message = self._combine_system_prompts(user_system_message, __user__)
+        system_message = self._combine_system_prompts(user_system_message)
 
         # Prepare contents for the API
         contents = []

@@ -2203,7 +2156,7 @@ async def pipe(
                 contents,
                 system_instruction,
             ) = await self._build_image_generation_contents(
-                messages, __event_emitter__, __user__
+                messages, __event_emitter__
             )
             # For image generation, system_instruction is integrated into the prompt
             # so it will be None here (this is expected and correct)

@@ -2215,7 +2168,7 @@ async def pipe(
         else:
             # For non-image generation models, use the full conversation history
            # Prepare content and extract system message normally
-            contents, system_instruction = self._prepare_content(messages, __user__)
+            contents, system_instruction = self._prepare_content(messages)
             if not contents:
                 return "Error: No valid message content found"
             self.log.debug(
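As a rough illustration of what the simplified `_prepare_content` path now does with the system message, here is a hypothetical standalone sketch (the function name, module-level default, and message shapes are assumptions for illustration, not the pipeline's actual code):

```python
from typing import Any, Dict, List, Optional, Tuple

# Stands in for the DEFAULT_SYSTEM_PROMPT valve / GOOGLE_DEFAULT_SYSTEM_PROMPT env var
DEFAULT_SYSTEM_PROMPT = "You are a helpful AI assistant."


def prepare_content(
    messages: List[Dict[str, Any]],
) -> Tuple[List[Dict[str, Any]], Optional[str]]:
    """Split out the system message and prepend the configured default to it."""
    # Extract any system-role messages from the conversation
    system_parts = [m["content"] for m in messages if m.get("role") == "system"]
    user_system = "\n\n".join(p.strip() for p in system_parts if p.strip()) or None

    # Combine with the default system prompt if configured
    default = DEFAULT_SYSTEM_PROMPT.strip()
    if default and user_system:
        system = f"{default}\n\n{user_system}"
    else:
        system = default or user_system or None

    # Remaining messages become the contents payload
    contents = [m for m in messages if m.get("role") != "system"]
    return contents, system


contents, system = prepare_content([
    {"role": "system", "content": "Always respond in formal English."},
    {"role": "user", "content": "Hi!"},
])
```

Because the user-supplied system prompt now arrives as part of the messages alone, the `__user__` dict no longer needs to be threaded through `_prepare_content` and `_build_image_generation_contents`, which is what this commit removes.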
