# Changelog
All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
## [Unreleased]
## [0.3.2] - 2025-12-08

### Added

- Configuration option to set the LLM's maximum context window
### Changed

- Changed the default model shipped with paperless-llm-workflow to Ministral 8B base (a smaller model with better results)
### Fixed

- Increased the default number of GPU layers to 1024 for better GPU performance
- Updated llama-cpp bindings to version b7314 (2025-12-07)
## [0.3.1] - 2025-11-26

### Added

- New API endpoint to enable decision-based workflows
## [0.3.0] - 2025-11-25

- Renamed the project to `paperless-llm-workflows`
### Added

- Added OpenAPI specs for the workflow trigger server endpoints
- Added `next_tag` parameter to webhook endpoints, allowing better workflow stages
- Changed the architecture to receive webhooks that trigger document processing
- Decoupled sending updated documents to Paperless from the LLM processing pipeline
- Correspondent suggestion feature: let this software suggest correspondents
### Fixed

- Better error handling for model generation errors
- Support for bigger context windows for larger documents
- Better model sampling pipeline that penalizes duplicate generations
## [0.2.1] - 2025-11-06

### Removed

- Disabled generation of alternative value fields (a currently unused feature anyway)
## [0.2.0] - 2025-11-02

### Added

- Better guiding of the model for `select` and `date` custom fields
## [0.1.2] - 2025-10-30

### Added

- Expanded documentation on containerized usage

### Fixed
- Server configuration fixes
## [0.1.1] - 2025-10-30

### Fixed

- Do not tag documents when running in `dry-run` mode
- Limit currency custom fields to 2 decimal places
## [0.1.0] - 2025-10-29
Initial Release