
Commit ced388c

feat(prompts): add configurable response language (#1246)

Closes #1086
Signed-off-by: Tomas Slusny <slusnucky@gmail.com>

1 parent: c3d0048

File tree: 4 files changed (+8, −4 lines)

README.md (4 additions, 4 deletions)

````diff
@@ -171,14 +171,13 @@ The mappings can be customized by setting the `mappings` table in your configuration
 - `normal`: Key for normal mode
 - `insert`: Key for insert mode
 
-For example, to change the submit prompt mapping or show_diff full diff option:
+For example, to change the complete mapping to Tab or show_diff full diff option:
 
 ```lua
 {
   mappings = {
-    submit_prompt = {
-      normal = '<Leader>s',
-      insert = '<C-s>'
+    complete = {
+      insert = '<Tab>'
     }
     show_diff = {
       full_diff = true
@@ -444,6 +443,7 @@ Below are all available configuration options with their default values:
     model = 'gpt-4.1', -- Default model to use, see ':CopilotChatModels' for available models (can be specified manually in prompt via $).
     tools = nil, -- Default tool or array of tools (or groups) to share with LLM (can be specified manually in prompt via @).
     sticky = nil, -- Default sticky prompt or array of sticky prompts to use at start of every new chat (can be specified manually in prompt via >).
+    language = 'English', -- Default language to use for answers
 
     resource_processing = false, -- Enable intelligent resource processing (skips unnecessary resources to save tokens)
````
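With the new option, a user can change the reply language in their plugin configuration. A minimal sketch, assuming the plugin's usual `setup()` entry point; the option name and its `'English'` default come from the diff above, while `'French'` is just an illustrative value:

```lua
-- Minimal sketch: ask CopilotChat to answer in a different language.
-- 'French' is an arbitrary example; 'English' remains the default.
require('CopilotChat').setup({
  language = 'French', -- substituted into the {LANGUAGE} prompt placeholder
})
```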

lua/CopilotChat/config.lua (2 additions, 0 deletions)

```diff
@@ -18,6 +18,7 @@
 ---@field model string?
 ---@field tools string|table<string>|nil
 ---@field sticky string|table<string>|nil
+---@field language string?
 ---@field resource_processing boolean?
 ---@field temperature number?
 ---@field headless boolean?
@@ -58,6 +59,7 @@ return {
   model = 'gpt-4.1', -- Default model to use, see ':CopilotChatModels' for available models (can be specified manually in prompt via $).
   tools = nil, -- Default tool or array of tools (or groups) to share with LLM (can be specified manually in prompt via @).
   sticky = nil, -- Default sticky prompt or array of sticky prompts to use at start of every new chat (can be specified manually in prompt via >).
+  language = 'English', -- Default language to use for answers
 
   resource_processing = false, -- Enable intelligent resource processing (skips unnecessary resources to save tokens)
```

lua/CopilotChat/config/prompts.lua (1 addition, 0 deletions)

```diff
@@ -2,6 +2,7 @@ local COPILOT_BASE = [[
 When asked for your name, you must respond with "GitHub Copilot".
 Follow the user's requirements carefully & to the letter.
 Keep your answers short and impersonal.
+Always answer in {LANGUAGE} unless explicitly asked otherwise.
 <userEnvironment>
 The user works in editor called Neovim which has these core concepts:
 - Buffer: An in-memory text content that may be associated with a file
```

lua/CopilotChat/init.lua (1 addition, 0 deletions)

```diff
@@ -473,6 +473,7 @@ function M.resolve_prompt(prompt, config)
 
   if config.system_prompt then
     config.system_prompt = config.system_prompt:gsub('{OS_NAME}', jit.os)
+    config.system_prompt = config.system_prompt:gsub('{LANGUAGE}', config.language)
     if state.source then
       config.system_prompt = config.system_prompt:gsub('{DIR}', state.source.cwd())
     end
```
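The substitution above is plain Lua pattern replacement; `{` and `}` are not magic characters in Lua patterns, so the placeholder matches literally. A standalone sketch of the same mechanism, with the prompt line and default value taken from the diffs above:

```lua
-- Standalone sketch of the {LANGUAGE} placeholder substitution.
local system_prompt = 'Always answer in {LANGUAGE} unless explicitly asked otherwise.'
local language = 'English' -- the configured default from config.lua
system_prompt = system_prompt:gsub('{LANGUAGE}', language)
print(system_prompt) -- Always answer in English unless explicitly asked otherwise.
```

One caveat of `gsub`: the replacement string treats `%` specially, so a `language` value containing a literal `%` would need escaping; ordinary language names are unaffected.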
