
Conversation


@dependabot dependabot bot commented on behalf of github Aug 11, 2025

Updates the requirements on sentence-transformers to permit the latest version.

Release notes

Sourced from sentence-transformers's releases.

v5.1.0 - ONNX and OpenVINO backends offering 2-3x speedups; more hard negatives mining formats

This release introduces two new efficient compute backends for SparseEncoder embedding models, ONNX and OpenVINO, plus optimization and quantization, allowing for speedups of up to 2-3x; a new "n-tuple-score" output format for hard negative mining for distillation; gathering across devices for a free lunch in multi-GPU training; trackio support; MTEB documentation; and many small fixes and features.

Install this version with:

```
# Training + Inference
pip install sentence-transformers[train]==5.1.0

# Inference only, use one of:
pip install sentence-transformers==5.1.0
pip install sentence-transformers[onnx-gpu]==5.1.0
pip install sentence-transformers[onnx]==5.1.0
pip install sentence-transformers[openvino]==5.1.0
```

Faster ONNX and OpenVINO backends for SparseEncoder models (#3475)

This release introduces a new backend keyword argument to the SparseEncoder initialization, accepting the values "torch" (default), "onnx", and "openvino". These require installing sentence-transformers with specific extras:

```
pip install sentence-transformers[onnx-gpu]
# or ONNX for CPU only:
pip install sentence-transformers[onnx]
# or
pip install sentence-transformers[openvino]
```

It's as simple as:

```python
from sentence_transformers import SparseEncoder

# Load a SparseEncoder model with the ONNX backend
model = SparseEncoder("naver/splade-v3", backend="onnx")

query = "Which planet is known as the Red Planet?"
documents = [
    "Venus is often called Earth's twin because of its similar size and proximity.",
    "Mars, known for its reddish appearance, is often referred to as the Red Planet.",
    "Jupiter, the largest planet in our solar system, has a prominent red spot.",
    "Saturn, famous for its rings, is sometimes mistaken for the Red Planet.",
]

query_embeddings = model.encode_query(query)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# => torch.Size([30522]) torch.Size([4, 30522])

similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# => tensor([[12.1450, 26.1040, 22.0025, 23.3877]])
```
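The similarity scores in the example above are plain dot products between vocabulary-sized sparse vectors (30522 dimensions for BERT's vocabulary), in which almost every entry is zero. A minimal sketch of that computation, using hypothetical token ids and weights rather than real SPLADE outputs:

```python
def sparse_dot(a: dict, b: dict) -> float:
    """Dot product of two sparse vectors stored as {token_id: weight}."""
    # Iterate over the smaller dict; only shared token ids contribute.
    if len(b) < len(a):
        a, b = b, a
    return sum(w * b[i] for i, w in a.items() if i in b)

# Made-up query/document expansions for illustration: token id -> weight.
query = {2005: 1.2, 4011: 0.8, 7561: 2.1}
doc_mars = {2005: 1.0, 4011: 0.9, 3563: 0.4}   # overlaps on two tokens
doc_saturn = {9031: 1.1, 4011: 0.5}            # overlaps on one token

print(sparse_dot(query, doc_mars))    # 1.2*1.0 + 0.8*0.9 ≈ 1.92
print(sparse_dot(query, doc_saturn))  # 0.8*0.5 ≈ 0.40
```

Because only non-zero entries matter, documents sharing more (and more heavily weighted) expansion tokens with the query score higher, which is why the Mars document wins in the release-notes example.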

... (truncated)

Commits
  • a5e0425 Release v5.1.0
  • 2cf753f Update main sbert.net page with v5.1 mention (#3482)
  • 58622d4 [feat] Allow n-tuples for CE MarginMSE training (#3481)
  • 6e7d64e docs: add MTEB evaluation guide and update usage.rst (#3477)
  • 08446fc Use SHA256 with usedforsecurity=False in hard negatives caching (#3479)
  • 55f1dd0 chore: Handle error when predict is called with an empty sentence list (#3466)
  • 99dda77 [feat] Add ONNX, OV support for SparseEncoder; refactor ONNX/OV (#3475)
  • 69406a3 [docs] Add splade_index semantic search example (#3473)
  • 240cf30 docs: Fix dead link in ContrastiveLoss references (#3476)
  • 7e8d2cc Fix: prevent loading best model when PEFT adapters are active (#3470)
  • Additional commits viewable in compare view

Most Recent Ignore Conditions Applied to This Pull Request

  • sentence-transformers: [>= 3.3.dev0, < 3.4]
  • sentence-transformers: [>= 3.dev0, < 4]

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

@dependabot dependabot bot added the dependencies and python labels Aug 11, 2025
@coveralls

Coverage Status

coverage: 52.931% (+0.05%) from 52.88%
when pulling 5f924ec on dependabot/pip/sentence-transformers-gte-2.7.0-and-lt-6
into 3158296 on dev.

Updates the requirements on [sentence-transformers](https://github.com/UKPLab/sentence-transformers) to permit the latest version.
- [Release notes](https://github.com/UKPLab/sentence-transformers/releases)
- [Commits](UKPLab/sentence-transformers@v2.7.0...v5.1.0)

---
updated-dependencies:
- dependency-name: sentence-transformers
  dependency-version: 5.1.0
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot force-pushed the dependabot/pip/sentence-transformers-gte-2.7.0-and-lt-6 branch from 5f924ec to 90bfa5e on August 19, 2025 at 05:08
