✨ feat: add batch processing, auto-sort, custom prompt, and debug mode #7

Open
rfiacne wants to merge 6 commits into Nigel1992:main from rfiacne:main

Conversation


@rfiacne rfiacne commented Apr 22, 2026

Summary

This PR adds several features to the AutoSort+ Thunderbird extension to improve the email-sorting workflow (and it is all AI-generated, if you didn't notice):

🚀 New Features

1. Batch Processing with Controls

  • Process multiple selected emails with progress tracking
  • Progress bar and status panel showing sorted/skipped/failed counts

2. Batch Chunk Processing

  • Configurable chunk size (1-20 emails per batch)
  • Process N emails, await all AI responses, then continue
  • Prevents overwhelming AI endpoints with too many simultaneous requests
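The chunking loop above can be sketched roughly as follows (helper names like `processInChunks` and `classify` are illustrative, not the actual background.js identifiers):

```javascript
// Sketch of chunked batch processing: fire one request per message in the
// chunk, await all of them, then move to the next chunk. This caps how many
// requests hit the AI endpoint at once.
async function processInChunks(messages, chunkSize, classify) {
  const results = { sorted: 0, skipped: 0, failed: 0 };
  for (let i = 0; i < messages.length; i += chunkSize) {
    const chunk = messages.slice(i, i + chunkSize);
    const outcomes = await Promise.allSettled(chunk.map(classify));
    for (const o of outcomes) {
      if (o.status === "rejected") results.failed++;
      else if (o.value) results.sorted++; // classify returned a target folder
      else results.skipped++;             // classify declined to sort
    }
  }
  return results;
}
```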

3. Auto-Sort New Emails

  • Automatic sorting of incoming Inbox emails when they arrive
  • Single toggle control in General Settings
  • Works for all AI providers (Gemini, OpenAI, Ollama, etc.)
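The auto-sort hook presumably hangs off Thunderbird's `messages.onNewMailReceived` event; a minimal sketch (settings key, guard helper, and `sortMessage` are assumptions, not the PR's actual code):

```javascript
// Only act when the user's toggle is on and the mail landed in an inbox.
function shouldAutoSort(settings, folder) {
  return Boolean(settings.autoSortEnabled) && folder.type === "inbox";
}

// In the extension, the listener would be wired up roughly like this:
// browser.messages.onNewMailReceived.addListener(async (folder, page) => {
//   const settings = await browser.storage.local.get({ autoSortEnabled: false });
//   if (!shouldAutoSort(settings, folder)) return;
//   for (const header of page.messages) await sortMessage(header);
// });
```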

4. Custom Prompt Feature

  • Full prompt customization via text area in Options page
  • Two placeholders: {labels} for folder list, {email} for email content
  • Reset to Default button to restore hardcoded prompt
  • Graceful fallback to the original default prompt when placeholders are missing
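The substitution-with-fallback logic can be sketched like this (the default prompt text and function names are placeholders, not the extension's real strings):

```javascript
// Stand-in for the hardcoded prompt shipped with the extension.
const DEFAULT_PROMPT =
  "Pick the best folder for this email.\nFolders: {labels}\nEmail:\n{email}";

function buildPrompt(customPrompt, labels, email) {
  // Fall back to the default when either placeholder is missing.
  const template =
    customPrompt &&
    customPrompt.includes("{labels}") &&
    customPrompt.includes("{email}")
      ? customPrompt
      : DEFAULT_PROMPT;
  return template
    .replaceAll("{labels}", labels.join(", "))
    .replaceAll("{email}", email);
}
```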

5. OpenAI-Compatible Provider

  • Support for custom endpoints (LM Studio, LocalAI, vLLM, Together AI, etc.)
  • Model dropdown populated from /v1/models endpoint
  • Optional API key for cloud providers
  • Direct fetch for local endpoints, tab injection for localhost
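Populating the dropdown likely boils down to a GET against `/v1/models` and reading the standard OpenAI list shape `{"data":[{"id":...}]}`; a sketch (function names and trailing-slash handling are illustrative):

```javascript
// Extract model ids from an OpenAI-style model-list response body.
function modelIdsFromResponse(body) {
  return (body.data || []).map((m) => m.id);
}

async function fetchModelIds(baseUrl, apiKey) {
  const headers = {};
  if (apiKey) headers.Authorization = `Bearer ${apiKey}`; // optional for local servers
  const res = await fetch(`${baseUrl.replace(/\/+$/, "")}/v1/models`, { headers });
  if (!res.ok) throw new Error(`Model list request failed: HTTP ${res.status}`);
  return modelIdsFromResponse(await res.json());
}
```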

6. Debug Mode

  • Collapsible console groups for API request/response logging
  • Toggle in General Settings to enable/disable
  • Works for all providers (Gemini, OpenAI, Ollama, OpenAI-compatible)
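The collapsible logging is presumably a thin wrapper over `console.groupCollapsed`; a sketch (the toggle is read from storage in the real extension, and the helper name is made up):

```javascript
// Module-level flag, synced from browser.storage.local in the real extension.
let debugEnabled = false;

// Emit one folded console group per API call; returns whether it logged,
// purely so the behavior is observable here.
function logApiCall(tag, request, response) {
  if (!debugEnabled) return false;
  console.groupCollapsed(`[AutoSort+] ${tag}`);
  console.log("request:", request);
  console.log("response:", response);
  console.groupEnd();
  return true;
}
```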

🐛 Bug Fixes

  • Fixed async convertToPlainText not being awaited, causing [object Promise] in AI prompts
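A minimal reproduction of that bug (the converter body is a simplified stand-in, not the extension's real implementation):

```javascript
// Async converter: interpolating its return value WITHOUT await stringifies
// the Promise itself, producing "[object Promise]" in the AI prompt.
async function convertToPlainText(html) {
  return html.replace(/<[^>]+>/g, "");
}

async function buildEmailBody(html) {
  const text = await convertToPlainText(html); // the fix: await the result
  return `Email:\n${text}`;
}
```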

📁 Files Changed

  • background.js - Core processing logic, AI provider calls, event listeners
  • options.html - UI for all new settings and controls
  • options.js - Load/save logic for all settings
  • styles.css - Styling for batch progress panel and custom prompt textarea
  • manifest.json - Permissions cleanup

rfiacne and others added 5 commits April 21, 2026 10:44
OpenAI-compatible provider:
- Custom endpoint support with base URL + model selection
- Model dropdown with Fetch Models button
- API key optional for local endpoints (no auth required)
- max_tokens: 8192 for reasoning models (Qwen3 etc)

Debug mode:
- Toggle in General Settings (Ctrl+Shift+I to view logs)
- Colored console tags: [AutoSort+], [Gemini], [Ollama], [Custom]
- apiRequest/apiResponse collapsible groups
- Cross-context sync via browser.storage.local

Bug fixes:
- Provider switch: hide/show correct subsections
- Save button state updates on all input changes
- Input validation for Ollama and OpenAI-compatible configs
- Extract subject, author, attachments for better AI categorization

- Build folder Map to avoid N+1 recursion in batch processing

- Cache accounts to avoid N+1 fetching

- Parallel auto-sort with provider-based concurrency limit

- Gemini rate limit mutex for atomic operations

- Add shared tab fetch utility module
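The folder-Map commit above presumably flattens the account's folder tree once, so each message lookup is O(1) instead of a fresh recursive walk; a sketch with assumed folder shapes (`path`, `subFolders` match Thunderbird's MailFolder, but the helper is illustrative):

```javascript
// Walk the folder tree once and index every folder by its path.
function buildFolderMap(folders, map = new Map()) {
  for (const folder of folders) {
    map.set(folder.path, folder);
    if (folder.subFolders) buildFolderMap(folder.subFolders, map);
  }
  return map;
}
```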
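The provider-based concurrency limit and the Gemini mutex above can both be expressed as a counting semaphore (a mutex is just the limit-1 case); the class below is an illustrative sketch, not the PR's actual identifiers:

```javascript
// Counting semaphore: at most `limit` tasks run at once; finished tasks hand
// their slot directly to the next waiter to avoid over-admitting.
class Semaphore {
  constructor(limit) {
    this.limit = limit;
    this.active = 0;
    this.waiters = [];
  }
  async acquire() {
    if (this.active < this.limit) { this.active++; return; }
    await new Promise((resolve) => this.waiters.push(resolve));
  }
  release() {
    const next = this.waiters.shift();
    if (next) next();     // pass the slot on without decrementing
    else this.active--;
  }
  async run(task) {
    await this.acquire();
    try { return await task(); } finally { this.release(); }
  }
}
```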

Bug fixes:

- pct undefined crash in batch progress
- error message lost in catch block
- missing return true for async message handler
- message.folder null pointer risk
- ollamaNumCtx setting ignored
- context menu stale after label changes
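
The "missing return true" and "message.folder null pointer" fixes come from the WebExtension `runtime.onMessage` contract: an async handler must return `true` (or a Promise) or the channel closes before `sendResponse` fires. A sketch with an assumed message shape and a stand-in `doSort`:

```javascript
// Stand-in for the real sorting routine.
async function doSort(folderPath) {
  return folderPath ?? "inbox";
}

function onMessage(message, sender, sendResponse) {
  if (message.type === "sortNow") {
    // Guard against a null folder before touching its properties.
    const folderPath = message.folder ? message.folder.path : null;
    doSort(folderPath).then((result) => sendResponse({ ok: true, result }));
    return true; // keep the channel open for the async sendResponse
  }
  return false;
}
```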
@rfiacne rfiacne force-pushed the main branch 2 times, most recently from 81385ab to b8820b8 on April 23, 2026 at 10:34
Add Thunderbird _locales/ based internationalization with en and zh_CN translations. All UI text, provider info, badges, and dynamic status messages are now localized.