
Interface: LLMProviderConfig

Defined in: packages/detector/src/llm/client.ts:113

Configuration passed to an LLM provider at construction time.


Pitfalls

  • model controls which checkpoint is used. Leaving it undefined causes the provider to select its default, which can change between SDK versions — pin the model to avoid silent classification drift.
  • maxTokens caps the completion, not the prompt. Very large repository files will still consume prompt budget; use Detector.ignorePatterns to exclude large generated files.
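The pitfalls above can be illustrated with a minimal sketch. The interface shape below is transcribed from the properties table; the concrete model name and environment variable are hypothetical placeholders, not values the package prescribes.

```typescript
// Sketch of the LLMProviderConfig shape (field names from the properties table);
// the real interface is defined in packages/detector/src/llm/client.ts.
interface LLMProviderConfig {
  apiKey?: string;
  endpoint?: string;
  model?: string;
  maxTokens?: number;
  temperature?: number;
}

const config: LLMProviderConfig = {
  apiKey: process.env.LLM_API_KEY,  // hypothetical env var name
  model: "gpt-4o-2024-08-06",       // hypothetical checkpoint; pin it explicitly
  maxTokens: 1024,                  // caps the completion only, not the prompt
  temperature: 0,                   // low temperature for stable classification
};
```

Pinning `model` to a dated checkpoint string, rather than leaving it `undefined`, keeps classification behavior stable across SDK upgrades.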

Properties

| Property | Type | Description | Defined in |
| --- | --- | --- | --- |
| `apiKey?` | `string` | - | packages/detector/src/llm/client.ts:114 |
| `endpoint?` | `string` | - | packages/detector/src/llm/client.ts:115 |
| `model?` | `string` | Model identifier; pin this value to avoid classification drift. | packages/detector/src/llm/client.ts:117 |
| `maxTokens?` | `number` | - | packages/detector/src/llm/client.ts:118 |
| `temperature?` | `number` | - | packages/detector/src/llm/client.ts:119 |

Released under the MIT License.