Release version 1.4.3
This commit is contained in:

Changed files:

.github/workflows/docker-publish.yml (vendored): 1 line changed
CHANGELOG.md: 76 lines changed
@@ -2,7 +2,6 @@ name: Build and Publish Docker Image
 on:
   push:
     branches: [ "main" ]
     tags: [ 'v*.*.*' ]
     paths:
       - 'backend/**'
CHANGELOG.md
@@ -5,6 +5,82 @@ All notable changes to this project will be documented in this file.
 
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [1.4.3] - 2026-01-01
+
+### Changed
+
+#### Configuration
+- **Automatic Domain Detection**: Removed the `MAILCOW_LOCAL_DOMAINS` environment variable requirement
+  - Active domains are now fetched automatically from the Mailcow API (`/api/v1/get/domain/all`)
+  - Only active domains are used
+  - Domains are cached on application startup
+  - Manual domain configuration is no longer needed
+
+#### UI Improvements
+- **Local Domains Display**: Improved the domain display on the Settings page
+  - Changed from a comma-separated list to a grid layout (columns)
+  - Scrollable container for large numbers of domains
+
+#### Code Quality
+- **Code Cleanup**: Removed unnecessary comments from the codebase
+  - Removed verbose comments that added no value
+  - Cleaned up phase markers and redundant inline comments
+  - Improved code readability
+
+### Fixed
+
+#### Security Tab
+- **Timestamp Formatting**: Fixed timestamp display in the Security tab to match the Messages page format
+  - All timestamps are now properly formatted with the UTC timezone ('Z' suffix)
+  - Consistent date/time display across all tabs
+- **Banned Filter**: Fixed the filter not working correctly for "Banning" messages
+  - Now correctly identifies "Banning" (present tense) messages as ban actions
+  - Uses the priority field ("crit") to determine ban status when message parsing is ambiguous
+  - Added support for CIDR notation in ban messages (e.g., "Banning 3.134.148.0/24")
+- **View Consistency**: Removed the old table view that was sometimes displayed
+  - Only the card-based view is now used
+  - Smart refresh now uses the same rendering function as the initial load
+- **Duplicate Log Prevention**: Fixed duplicate security events appearing in the Security tab
+  - Added deduplication logic based on the message + time + priority combination
+  - The frontend filters duplicates before display (handles legacy data)
+  - The backend import now checks the database for existing logs with the same message + time + priority before inserting
+  - Prevents duplicate entries from being stored in the database during import
+
+#### Import Status
+- **Last Fetch Run Time**: Added tracking of when imports run (not just when data is imported)
+  - The Status page now shows "Last Fetch Run" (when the import job ran) separately from "Last Import" (when data was actually imported)
+  - Resolves confusion when imports run but no new logs are available
+  - All three log types (Postfix, Rspamd, Netfilter) now track fetch run times
+
+#### Netfilter Logging
+- **Enhanced Logging**: Added detailed debug logs for the Netfilter import process
+  - Logs show when a fetch starts, how many logs were received, how many were imported, and how many were skipped as duplicates
+  - Better error tracking for troubleshooting import delays
+- **Import Deduplication**: Improved duplicate detection during Netfilter log import
+  - Now checks the database for existing logs with the same message + time + priority before inserting
+  - Uses the message + time + priority combination as the unique identifier (instead of time + IP + message)
+  - Prevents duplicate entries from being stored in the database
+
+### Added
+
+#### Version Management
+- **VERSION File**: The version number is now managed in a single `VERSION` file instead of being hardcoded in multiple places
+  - Supports both Docker and development environments
+
+#### Footer
+- **Application Footer**: Added a footer to all pages with:
+  - Application name and current version
+  - An "Update Available" badge when a new version is detected
+
+#### Settings Page
+- **Version Information Section**: Added a version display to the Settings page
+  - Shows the currently installed version
+  - Shows the latest available version from GitHub
+  - Displays "Update Available" or "Up to Date" status
+  - Links to the release notes when an update is available
+
+---
+
 ## [1.4.2] - 2025-12-31
 
 ### Fixed
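The deduplication rule in the changelog above treats the (message, time, priority) triple as the identity of a security event. A minimal standalone sketch of that rule follows; `dedupe_logs` is a hypothetical helper for illustration, not code from this commit:

```python
def dedupe_logs(entries):
    """Drop entries whose (message, time, priority) triple was already seen,
    keeping the first occurrence - the rule described in the changelog."""
    seen = set()
    unique = []
    for entry in entries:
        key = (entry["message"], entry["time"], entry["priority"])
        if key in seen:
            continue  # duplicate security event, skip it
        seen.add(key)
        unique.append(entry)
    return unique

logs = [
    {"message": "Banning 3.134.148.0/24", "time": 1735689600, "priority": "crit"},
    {"message": "Banning 3.134.148.0/24", "time": 1735689600, "priority": "crit"},  # exact duplicate
    {"message": "Banning 3.134.148.0/24", "time": 1735689660, "priority": "crit"},  # different time, kept
]
print(len(dedupe_logs(logs)))  # 2
```

The same key is what the backend import checks against the database before inserting, so frontend and backend agree on what counts as a duplicate.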
@@ -22,6 +22,9 @@ COPY backend/app/ /app/app/
 # Copy frontend files
 COPY frontend/ /app/frontend/
 
+# Copy VERSION file
+COPY VERSION /app/VERSION
+
 # Create non-root user
 RUN useradd -m -u 1000 appuser && \
     chown -R appuser:appuser /app
@@ -111,9 +111,10 @@ All settings via environment variables. See **[env.example](env.example)** for f
 | Variable | Description |
 |----------|-------------|
 | `MAILCOW_URL` | Mailcow instance URL |
 | `MAILCOW_API_KEY` | Mailcow API key |
-| `MAILCOW_LOCAL_DOMAINS` | Your email domains |
 | `POSTGRES_PASSWORD` | Database password |
 
+**Note:** Active domains are automatically fetched from the Mailcow API - no configuration needed!
+
 ### Key Optional Settings
 
 | Variable | Default | Description |
@@ -1,25 +1,21 @@
 """
 Configuration management using Pydantic Settings
-ALL settings loaded from environment variables - NO hardcoded values!
 """
 from pydantic_settings import BaseSettings
 from pydantic import Field, validator
-from typing import List
+from typing import List, Optional
+import logging
+
+logger = logging.getLogger(__name__)
+
+_cached_active_domains: Optional[List[str]] = None
 
 
 class Settings(BaseSettings):
-    """Application settings - ALL from environment variables"""
+    """Application settings"""
 
     # Mailcow Configuration
     mailcow_url: str = Field(..., description="Mailcow instance URL")
     mailcow_api_key: str = Field(..., description="Mailcow API key")
-    mailcow_local_domains: str = Field(
-        default="sendmail.co.il",
-        description="Comma-separated list of local domains"
-    )
     mailcow_api_timeout: int = Field(default=30, description="API request timeout in seconds")
 
     # Blacklist Configuration

@@ -32,11 +28,11 @@ class Settings(BaseSettings):
     fetch_interval: int = Field(default=60, description="Seconds between log fetches")
     fetch_count_postfix: int = Field(
         default=2000,
-        description="Postfix logs to fetch per request (higher because each email = ~7-10 log lines)"
+        description="Postfix logs to fetch per request"
     )
     fetch_count_rspamd: int = Field(
         default=500,
-        description="Rspamd logs to fetch per request (1 log = 1 email)"
+        description="Rspamd logs to fetch per request"
     )
     fetch_count_netfilter: int = Field(
         default=500,

@@ -44,7 +40,7 @@ class Settings(BaseSettings):
     )
     retention_days: int = Field(default=7, description="Days to keep logs")
 
-    # Correlation Configuration (NEW!)
+    # Correlation Configuration
     max_correlation_age_minutes: int = Field(
         default=10,
         description="Stop searching for correlations older than this (minutes)"

@@ -69,7 +65,7 @@ class Settings(BaseSettings):
     )
     tz: str = Field(
         default="UTC",
-        description="Timezone (e.g. Asia/Jerusalem, America/New_York)"
+        description="Timezone"
     )
     app_title: str = Field(default="Mailcow Logs Viewer", description="Application title")
     app_logo_url: str = Field(default="", description="Application logo URL (optional)")

@@ -111,8 +107,12 @@ class Settings(BaseSettings):
 
     @property
     def local_domains_list(self) -> List[str]:
-        """Parse local domains into a list"""
-        return [d.strip() for d in self.mailcow_local_domains.split(',') if d.strip()]
+        """Get active domains from Mailcow API cache"""
+        global _cached_active_domains
+        if _cached_active_domains is None:
+            logger.warning("Local domains cache not yet populated")
+            return []
+        return _cached_active_domains
 
     @property
     def blacklist_emails_list(self) -> List[str]:

@@ -142,14 +142,11 @@ class Settings(BaseSettings):
         case_sensitive = False
 
 
 # Global settings instance
 settings = Settings()
 
 
 def setup_logging():
-    """Configure application logging based on LOG_LEVEL from .env"""
-
-    # Simple format
+    """Configure application logging"""
     log_format = '%(levelname)s - %(message)s'
 
     logging.basicConfig(

@@ -157,7 +154,6 @@ def setup_logging():
         format=log_format
     )
 
-    # Silence noisy third-party libraries
     logging.getLogger('httpx').setLevel(logging.ERROR)
     logging.getLogger('httpcore').setLevel(logging.ERROR)
     logging.getLogger('urllib3').setLevel(logging.ERROR)

@@ -165,8 +161,20 @@ def setup_logging():
     logging.getLogger('apscheduler').setLevel(logging.WARNING)
 
     if settings.debug:
-        logger.warning("[WARNING] Debug mode is enabled")
+        logger.warning("Debug mode is enabled")
 
 
 # Initialize logging
 setup_logging()
 
 
+def set_cached_active_domains(domains: List[str]) -> None:
+    """Set the cached active domains list"""
+    global _cached_active_domains
+    _cached_active_domains = domains
+    logger.info(f"Cached {len(domains)} active domains from Mailcow API")
+
+
+def get_cached_active_domains() -> Optional[List[str]]:
+    """Get the cached active domains list"""
+    return _cached_active_domains
@@ -77,10 +77,6 @@ def parse_postfix_message(message: str) -> Dict[str, Any]:
     """
     Parse Postfix log message to extract structured data
 
-    Handles two types of message-id lines:
-    1. Standalone: "CFCF76E2F45: message-id=<83b54793-ae84-7f0f-5eb5-1a203fd1227e@sendmail.co.il>"
-    2. Inline in delivery: "CFCF76E2F45: to=<recipient@domain.com>, relay=..., status=sent (250 2.6.0 <message-id@domain.com> ...)"
-
     Args:
         message: Postfix log message string
@@ -297,6 +297,31 @@ class MailcowAPI:
             logger.error(f"Failed to fetch domains: {e}")
             return []
 
+    async def get_active_domains(self) -> List[str]:
+        """
+        Fetch active domains from Mailcow and return domain names only
+
+        Returns:
+            List of active domain names (where active=1)
+        """
+        logger.info("Fetching active domains")
+        try:
+            domains = await self.get_domains()
+
+            # Filter active domains and extract domain_name
+            active_domains = [
+                domain.get('domain_name', '')
+                for domain in domains
+                if domain.get('active') == 1 and domain.get('domain_name')
+            ]
+
+            logger.info(f"Found {len(active_domains)} active domains: {', '.join(active_domains)}")
+            return active_domains
+
+        except Exception as e:
+            logger.error(f"Failed to fetch active domains: {e}")
+            return []
+
     async def get_mailboxes(self) -> List[Dict[str, Any]]:
         """
         Fetch all mailboxes from Mailcow

@@ -359,5 +384,4 @@ class MailcowAPI:
         return False
 
 
 # Global API client instance
 mailcow_api = MailcowAPI()
@@ -9,7 +9,7 @@ from fastapi.responses import HTMLResponse, JSONResponse
 from fastapi.middleware.cors import CORSMiddleware
 from contextlib import asynccontextmanager
 
-from .config import settings
+from .config import settings, set_cached_active_domains
 from .database import init_db, check_db_connection
 from .scheduler import start_scheduler, stop_scheduler
 from .mailcow_api import mailcow_api

@@ -17,10 +17,10 @@ from .routers import logs, stats
 from .routers import export as export_router
 from .migrations import run_migrations
 from .auth import BasicAuthMiddleware
+from .version import __version__
 
 logger = logging.getLogger(__name__)
 
 # Import status and messages routers
 try:
     from .routers import status as status_router
     from .routers import messages as messages_router

@@ -63,11 +63,21 @@ async def lifespan(app: FastAPI):
         logger.error(f"Failed to initialize database: {e}")
         raise
 
-    # Test Mailcow API connection
+    # Test Mailcow API connection and fetch active domains
     try:
         api_ok = await mailcow_api.test_connection()
         if not api_ok:
             logger.warning("Mailcow API connection test failed - check your configuration")
+        else:
+            try:
+                active_domains = await mailcow_api.get_active_domains()
+                if active_domains:
+                    set_cached_active_domains(active_domains)
+                    logger.info(f"Loaded {len(active_domains)} active domains from Mailcow API")
+                else:
+                    logger.warning("No active domains found in Mailcow - check your configuration")
+            except Exception as e:
+                logger.error(f"Failed to fetch active domains: {e}")
     except Exception as e:
         logger.error(f"Mailcow API test failed: {e}")
@@ -92,7 +102,7 @@ async def lifespan(app: FastAPI):
 app = FastAPI(
     title="Mailcow Logs Viewer",
     description="Modern dashboard for viewing and analyzing Mailcow mail server logs",
-    version="1.2.0",
+    version=__version__,
     lifespan=lifespan
 )

@@ -157,7 +167,7 @@ async def health_check():
     return {
         "status": "healthy" if db_ok else "unhealthy",
         "database": "connected" if db_ok else "disconnected",
-        "version": "1.2.0",
+        "version": __version__,
         "config": {
             "fetch_interval": settings.fetch_interval,
             "retention_days": settings.retention_days,

@@ -173,7 +183,7 @@ async def app_info():
     """Application information endpoint"""
     return {
         "name": "Mailcow Logs Viewer",
-        "version": "1.2.0",
+        "version": __version__,
         "mailcow_url": settings.mailcow_url,
         "local_domains": settings.local_domains_list,
         "fetch_interval": settings.fetch_interval,
@@ -5,7 +5,7 @@ import logging
 from fastapi import APIRouter, Depends, Query, HTTPException
 from sqlalchemy.orm import Session
 from sqlalchemy import or_, and_, desc, func
-from datetime import datetime, timedelta
+from datetime import datetime, timedelta, timezone
 from typing import List, Optional
 
 from ..database import get_db

@@ -18,6 +18,25 @@ logger = logging.getLogger(__name__)
 router = APIRouter()
 
 
+def format_datetime_utc(dt: Optional[datetime]) -> Optional[str]:
+    """
+    Format datetime for API response with proper UTC timezone
+    Always returns ISO format with 'Z' suffix so browser knows it's UTC
+    """
+    if dt is None:
+        return None
+
+    # If naive (no timezone), assume UTC
+    if dt.tzinfo is None:
+        dt = dt.replace(tzinfo=timezone.utc)
+
+    # Convert to UTC if not already
+    dt_utc = dt.astimezone(timezone.utc)
+
+    # Format as ISO string with 'Z' suffix for UTC
+    return dt_utc.replace(microsecond=0).isoformat().replace('+00:00', 'Z')
+
+
 @router.get("/logs/postfix/by-queue/{queue_id}")
 async def get_postfix_logs_by_queue(
     queue_id: str,

@@ -380,7 +399,7 @@ async def get_netfilter_logs(
         "data": [
             {
                 "id": log.id,
-                "time": log.time.isoformat(),
+                "time": format_datetime_utc(log.time),
                 "priority": log.priority,
                 "message": log.message,
                 "ip": log.ip,
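The `format_datetime_utc` helper added in the hunk above can be exercised in isolation. This is a self-contained copy of the same logic, shown purely to illustrate the 'Z'-suffix behavior the changelog describes:

```python
from datetime import datetime, timezone
from typing import Optional

def format_datetime_utc(dt: Optional[datetime]) -> Optional[str]:
    """Format a datetime as ISO-8601 UTC with a 'Z' suffix.
    Naive datetimes are assumed to already be UTC."""
    if dt is None:
        return None
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=timezone.utc)
    dt_utc = dt.astimezone(timezone.utc)
    # isoformat() emits '+00:00' for UTC; swap it for the shorter 'Z'
    return dt_utc.replace(microsecond=0).isoformat().replace('+00:00', 'Z')

print(format_datetime_utc(datetime(2026, 1, 1, 12, 30, 0)))  # 2026-01-01T12:30:00Z
```

Because the output always carries an explicit UTC marker, `new Date(...)` in the browser parses it unambiguously, which is what makes the Security tab match the Messages page.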
@@ -12,6 +12,7 @@ from typing import Dict, Any, Optional
 from ..database import get_db
 from ..models import PostfixLog, RspamdLog, NetfilterLog, MessageCorrelation
 from ..config import settings
+from ..scheduler import last_fetch_run_time
 
 logger = logging.getLogger(__name__)

@@ -99,16 +100,19 @@ async def get_settings_info(db: Session = Depends(get_db)):
         "import_status": {
             "postfix": {
                 "last_import": format_datetime_utc(last_postfix),
+                "last_fetch_run": format_datetime_utc(last_fetch_run_time.get('postfix')),
                 "total_entries": total_postfix or 0,
                 "oldest_entry": format_datetime_utc(oldest_postfix)
             },
             "rspamd": {
                 "last_import": format_datetime_utc(last_rspamd),
+                "last_fetch_run": format_datetime_utc(last_fetch_run_time.get('rspamd')),
                 "total_entries": total_rspamd or 0,
                 "oldest_entry": format_datetime_utc(oldest_rspamd)
             },
             "netfilter": {
                 "last_import": format_datetime_utc(last_netfilter),
+                "last_fetch_run": format_datetime_utc(last_fetch_run_time.get('netfilter')),
                 "total_entries": total_netfilter or 0,
                 "oldest_entry": format_datetime_utc(oldest_netfilter)
             }
@@ -8,6 +8,7 @@ from datetime import datetime, timedelta
 from typing import Dict, Any
 
 from ..mailcow_api import mailcow_api
+from ..version import __version__
 
 logger = logging.getLogger(__name__)

@@ -22,6 +23,15 @@ version_cache = {
     "changelog": None
 }
 
+# Cache for app version check (check once per day)
+app_version_cache = {
+    "checked_at": None,
+    "current_version": __version__,  # Read from VERSION file
+    "latest_version": None,
+    "update_available": False,
+    "changelog": None
+}
+
 @router.get("/status/containers")
 async def get_containers_status():
     """

@@ -160,6 +170,74 @@ async def get_version_status():
         }
     except Exception as e:
         logger.error(f"Error fetching version status: {e}")
+
+
+@router.get("/status/app-version")
+async def get_app_version_status():
+    """
+    Get current app version and check for updates from GitHub
+    Checks GitHub once per day and caches the result
+    """
+    try:
+        global app_version_cache
+
+        # Check if we need to refresh the cache (once per day)
+        now = datetime.utcnow()
+        if (app_version_cache["checked_at"] is None or
+                now - app_version_cache["checked_at"] > timedelta(days=1)):
+
+            logger.info("Checking app version and updates from GitHub...")
+
+            # Check GitHub for latest version
+            try:
+                async with httpx.AsyncClient(timeout=10) as client:
+                    response = await client.get(
+                        "https://api.github.com/repos/ShlomiPorush/mailcow-logs-viewer/releases/latest"
+                    )
+
+                if response.status_code == 200:
+                    release_data = response.json()
+                    latest_version = release_data.get('tag_name', 'unknown')
+                    # Remove 'v' prefix if present
+                    if latest_version.startswith('v'):
+                        latest_version = latest_version[1:]
+                    changelog = release_data.get('body', '')
+
+                    app_version_cache["latest_version"] = latest_version
+                    app_version_cache["changelog"] = changelog
+
+                    # Compare versions (simple string comparison)
+                    app_version_cache["update_available"] = app_version_cache["current_version"] != latest_version
+
+                    logger.info(f"App version check: Current={app_version_cache['current_version']}, Latest={latest_version}")
+                else:
+                    logger.warning(f"GitHub API returned status {response.status_code}")
+                    app_version_cache["latest_version"] = "unknown"
+                    app_version_cache["update_available"] = False
+
+            except Exception as e:
+                logger.error(f"Failed to check GitHub for app updates: {e}")
+                app_version_cache["latest_version"] = "unknown"
+                app_version_cache["update_available"] = False
+
+            app_version_cache["checked_at"] = now
+
+        return {
+            "current_version": app_version_cache["current_version"],
+            "latest_version": app_version_cache["latest_version"],
+            "update_available": app_version_cache["update_available"],
+            "changelog": app_version_cache["changelog"],
+            "last_checked": app_version_cache["checked_at"].isoformat() if app_version_cache["checked_at"] else None
+        }
+    except Exception as e:
+        logger.error(f"Error fetching app version status: {e}")
+        return {
+            "current_version": app_version_cache["current_version"],
+            "latest_version": "unknown",
+            "update_available": False,
+            "changelog": None,
+            "last_checked": None
+        }
         raise HTTPException(status_code=500, detail=str(e))
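The endpoint above flags an update whenever the two version strings differ, so a local build ahead of the latest GitHub release would also show "Update Available". A possible refinement (not part of this commit; `is_newer` is a hypothetical helper) is to compare numeric version tuples instead, assuming plain MAJOR.MINOR.PATCH strings:

```python
def is_newer(latest: str, current: str) -> bool:
    """Return True only if `latest` is a strictly newer semantic version
    than `current`. Assumes plain MAJOR.MINOR.PATCH strings."""
    def parts(version: str):
        # "1.4.3" -> (1, 4, 3); tuples compare element-wise left to right
        return tuple(int(p) for p in version.split('.'))
    return parts(latest) > parts(current)

print(is_newer("1.4.3", "1.4.2"))   # True
print(is_newer("1.4.3", "1.10.0"))  # False: 10 > 4 numerically
```

String comparison would get the second case wrong ("1.10.0" < "1.4.3" lexicographically), which is the main argument for the tuple form.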
@@ -1,11 +1,5 @@
 """
-Background scheduler - REFACTORED
-
-Architecture:
-1. IMPORT PHASE: Fetch logs from API and store in DB (NO correlation!)
-2. CORRELATION PHASE: Separate job that links logs together
-
-This fixes timing issues where rspamd arrives before postfix.
+Background scheduler
 """
 import logging
 import asyncio

@@ -29,11 +23,16 @@ logger = logging.getLogger(__name__)
 
 scheduler = AsyncIOScheduler()
 
 # Track seen logs to avoid duplicates within session
 seen_postfix: Set[str] = set()
 seen_rspamd: Set[str] = set()
 seen_netfilter: Set[str] = set()
 
+last_fetch_run_time: Dict[str, Optional[datetime]] = {
+    'postfix': None,
+    'rspamd': None,
+    'netfilter': None
+}
+
 
 def is_blacklisted(email: Optional[str]) -> bool:
     """
@@ -61,21 +60,10 @@ def is_blacklisted(email: Optional[str]) -> bool:
     return is_blocked
 
 
-# =============================================================================
-# PHASE 1: IMPORT LOGS (No correlation during import!)
-# =============================================================================
-
 async def fetch_and_store_postfix():
-    """
-    Fetch Postfix logs from API and store in DB.
-    NO correlation here - that happens in a separate job.
-
-    BLACKLIST LOGIC:
-    When we see a blacklisted email in any log, we:
-    1. Mark that Queue ID as blacklisted
-    2. Delete ALL existing logs with that Queue ID from DB
-    3. Skip importing any future logs with that Queue ID
-    """
+    """Fetch Postfix logs from API and store in DB"""
+    last_fetch_run_time['postfix'] = datetime.now(timezone.utc)
     try:
         logs = await mailcow_api.get_postfix_logs(count=settings.fetch_count_postfix)

@@ -87,7 +75,6 @@ async def fetch_and_store_postfix():
         skipped_blacklist = 0
         blacklisted_queue_ids: Set[str] = set()
 
-        # First pass: identify blacklisted queue IDs
         for log_entry in logs:
             message = log_entry.get('message', '')
             parsed = parse_postfix_message(message)

@@ -119,7 +106,6 @@ async def fetch_and_store_postfix():
 
         db.commit()
 
-        # Second pass: import non-blacklisted logs
         for log_entry in logs:
             try:
                 time_str = str(log_entry.get('time', ''))

@@ -148,7 +134,6 @@ async def fetch_and_store_postfix():
                 sender = parsed.get('sender')
                 recipient = parsed.get('recipient')
 
-                # Create and save log (NO correlation here!)
                 postfix_log = PostfixLog(
                     time=timestamp,
                     program=log_entry.get('program'),

@@ -173,7 +158,6 @@ async def fetch_and_store_postfix():
                 logger.error(f"Error processing Postfix log: {e}")
                 continue
 
-        # Commit all at once
         db.commit()
 
         if new_count > 0:

@@ -182,7 +166,6 @@ async def fetch_and_store_postfix():
             msg += f" (skipped {skipped_blacklist} blacklisted)"
         logger.info(msg)
 
-        # Clear cache if too large
         if len(seen_postfix) > 10000:
             seen_postfix.clear()
@@ -191,19 +174,9 @@ async def fetch_and_store_postfix():
 
 
 async def fetch_and_store_rspamd():
-    """
-    Fetch Rspamd logs from API and store in DB.
-    NO correlation here - that happens in a separate job.
-
-    API returns:
-    - 'message-id' (with dash!) not 'message_id'
-    - 'sender_smtp' not 'from'
-    - 'rcpt_smtp' not 'rcpt'
-
-    BLACKLIST LOGIC:
-    When we see a blacklisted email, we also delete any existing
-    correlations with that message_id to prevent orphaned data.
-    """
+    """Fetch Rspamd logs from API and store in DB"""
+    last_fetch_run_time['rspamd'] = datetime.now(timezone.utc)
     try:
         logs = await mailcow_api.get_rspamd_logs(count=settings.fetch_count_rspamd)

@@ -218,16 +191,10 @@ async def fetch_and_store_rspamd():
         for log_entry in logs:
             try:
                 unix_time = log_entry.get('unix_time', 0)
 
-                # FIXED: API returns 'message-id' with DASH!
                 message_id = log_entry.get('message-id', '')
                 if message_id == 'undef' or not message_id:
                     message_id = None
 
-                # FIXED: API returns 'sender_smtp' not 'from'!
                 sender = log_entry.get('sender_smtp')
 
-                # FIXED: API returns 'rcpt_smtp' not 'rcpt'!
                 recipients = log_entry.get('rcpt_smtp', [])
 
                 unique_id = f"{unix_time}:{message_id if message_id else 'no-id'}"

@@ -235,7 +202,6 @@ async def fetch_and_store_rspamd():
                 if unique_id in seen_rspamd:
                     continue
 
-                # Check blacklist - sender
                 if is_blacklisted(sender):
                     skipped_blacklist += 1
                     seen_rspamd.add(unique_id)

@@ -243,7 +209,6 @@ async def fetch_and_store_rspamd():
                     blacklisted_message_ids.add(message_id)
                     continue
 
-                # Check blacklist - any recipient
                 if recipients and any(is_blacklisted(r) for r in recipients):
                     skipped_blacklist += 1
                     seen_rspamd.add(unique_id)

@@ -251,13 +216,9 @@ async def fetch_and_store_rspamd():
                     blacklisted_message_ids.add(message_id)
                     continue
 
-                # Parse timestamp with timezone
                 timestamp = datetime.fromtimestamp(unix_time, tz=timezone.utc)
 
-                # Detect direction
                 direction = detect_direction(log_entry)
 
-                # Create and save log (NO correlation here!)
                 rspamd_log = RspamdLog(
                     time=timestamp,
                     message_id=message_id,

@@ -287,9 +248,7 @@ async def fetch_and_store_rspamd():
                 logger.error(f"Error processing Rspamd log: {e}")
                 continue
 
-        # Delete correlations for blacklisted message IDs
         if blacklisted_message_ids:
-            # Get queue IDs from correlations before deleting
             correlations_to_delete = db.query(MessageCorrelation).filter(
                 MessageCorrelation.message_id.in_(blacklisted_message_ids)
             ).all()

@@ -299,12 +258,10 @@ async def fetch_and_store_rspamd():
                 if corr.queue_id:
                     queue_ids_to_delete.add(corr.queue_id)
 
-            # Delete correlations
             deleted_corr = db.query(MessageCorrelation).filter(
                 MessageCorrelation.message_id.in_(blacklisted_message_ids)
             ).delete(synchronize_session=False)
 
-            # Delete Postfix logs for those queue IDs
             if queue_ids_to_delete:
                 deleted_postfix = db.query(PostfixLog).filter(
                     PostfixLog.queue_id.in_(queue_ids_to_delete)

@@ -316,7 +273,6 @@ async def fetch_and_store_rspamd():
             if deleted_corr > 0:
                 logger.info(f"[BLACKLIST] Deleted {deleted_corr} correlations for blacklisted message IDs")
 
-        # Commit all at once
         db.commit()
 
         if new_count > 0:

@@ -325,7 +281,6 @@ async def fetch_and_store_rspamd():
             msg += f" (skipped {skipped_blacklist} blacklisted)"
         logger.info(msg)
 
-        # Clear cache if too large
         if len(seen_rspamd) > 10000:
             seen_rspamd.clear()
@@ -333,68 +288,65 @@ async def fetch_and_store_rspamd():
|
||||
logger.error(f"[ERROR] Rspamd fetch error: {e}")
|
||||
|
||||
|
||||
def parse_netfilter_message(message: str) -> Dict[str, Any]:
|
||||
"""
|
||||
Parse Netfilter log message to extract structured data.
|
||||
def parse_netfilter_message(message: str, priority: Optional[str] = None) -> Dict[str, Any]:
|
||||
|
||||
Examples:
|
||||
- "9 more attempts in the next 600 seconds until 80.178.113.140/32 is banned"
|
||||
- "80.178.113.140 matched rule id 3 (warning: 80.178.113.140.adsl.012.net.il[80.178.113.140]: SASL LOGIN authentication failed: ...)"
|
||||
- "Banned 80.178.113.140 for 600 seconds"
|
||||
"""
|
||||
result = {}
|
||||
message_lower = message.lower()
|
||||
|
||||
# Extract IP address - multiple patterns
|
||||
# Pattern 1: IP at start of message
|
||||
ip_match = re.match(r'^(\d+\.\d+\.\d+\.\d+)', message)
|
||||
if ip_match:
|
||||
result['ip'] = ip_match.group(1)
|
||||
|
||||
# Pattern 2: IP in "until X.X.X.X/32 is banned"
|
||||
if not result.get('ip'):
|
||||
ban_match = re.search(r'until\s+(\d+\.\d+\.\d+\.\d+)', message)
|
||||
if ban_match:
|
||||
result['ip'] = ban_match.group(1)
|
||||
|
||||
# Pattern 3: IP in brackets [X.X.X.X]
|
||||
if not result.get('ip'):
|
||||
bracket_match = re.search(r'\[(\d+\.\d+\.\d+\.\d+)\]', message)
|
||||
    if bracket_match:
        result['ip'] = bracket_match.group(1)

    # Pattern 4: "Banned X.X.X.X"
    if not result.get('ip'):
-        banned_match = re.search(r'Banned\s+(\d+\.\d+\.\d+\.\d+)', message)
+        banned_match = re.search(r'Ban(?:ned|ning)\s+(\d+\.\d+\.\d+\.\d+)', message, re.IGNORECASE)
        if banned_match:
            result['ip'] = banned_match.group(1)

    # Extract username (sasl_username=xxx@yyy)
+    if not result.get('ip'):
+        cidr_match = re.search(r'Ban(?:ned|ning)\s+(\d+\.\d+\.\d+\.\d+/\d+)', message, re.IGNORECASE)
+        if cidr_match:
+            ip_part = cidr_match.group(1).split('/')[0]
+            result['ip'] = ip_part

    username_match = re.search(r'sasl_username=([^\s,\)]+)', message)
    if username_match:
        result['username'] = username_match.group(1)

    # Extract auth method (SASL LOGIN, SASL PLAIN, etc.)
    auth_match = re.search(r'SASL\s+(\w+)', message)
    if auth_match:
        result['auth_method'] = f"SASL {auth_match.group(1)}"

    # Extract rule ID
    rule_match = re.search(r'rule id\s+(\d+)', message)
    if rule_match:
        result['rule_id'] = int(rule_match.group(1))

    # Extract attempts left
    attempts_match = re.search(r'(\d+)\s+more\s+attempt', message)
    if attempts_match:
        result['attempts_left'] = int(attempts_match.group(1))

    # Determine action
-    if 'is banned' in message.lower() or 'banned' in message.lower():
-        if 'more attempts' in message.lower():
+    if 'banning' in message_lower or 'banned' in message_lower:
+        if 'more attempts' in message_lower:
            result['action'] = 'warning'
        else:
            result['action'] = 'banned'
-    elif 'warning' in message.lower():
+    elif priority and priority.lower() == 'crit':
+        if 'unbanning' in message_lower or 'unban' in message_lower:
+            result['action'] = 'info'
+        elif 'banning' in message_lower:
+            result['action'] = 'banned'
+        else:
+            result['action'] = 'banned'
+    elif 'warning' in message_lower:
        result['action'] = 'warning'
    else:
        result['action'] = 'info'
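The widened `Ban(?:ned|ning)` pattern and the CIDR fallback in the hunk above can be exercised standalone; the sample messages below are hypothetical stand-ins for real netfilter log lines:

```python
import re

# Hypothetical sample messages; real Mailcow netfilter lines may differ.
samples = [
    "Banning 203.0.113.7 for 1 hour",
    "Banned 203.0.113.7",
    "banning 198.51.100.0/24",
]

cidr_re = re.compile(r'Ban(?:ned|ning)\s+(\d+\.\d+\.\d+\.\d+/\d+)', re.IGNORECASE)
ban_re = re.compile(r'Ban(?:ned|ning)\s+(\d+\.\d+\.\d+\.\d+)', re.IGNORECASE)

ips = []
for msg in samples:
    # Try the CIDR form first, then the plain-address form
    m = cidr_re.search(msg) or ban_re.search(msg)
    # Strip the prefix length so a /24 ban still yields a plain address
    ips.append(m.group(1).split('/')[0] if m else None)

print(ips)
```

With `re.IGNORECASE`, the lowercase "banning" line matches as well, which is what the "Banned Filter" fix in the changelog relies on.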
@@ -403,35 +355,51 @@ def parse_netfilter_message(message: str) -> Dict[str, Any]:


async def fetch_and_store_netfilter():
-    """Fetch Netfilter logs from API and store in DB."""
+    """Fetch Netfilter logs from API and store in DB"""
+    last_fetch_run_time['netfilter'] = datetime.now(timezone.utc)

    try:
+        logger.debug(f"[NETFILTER] Starting fetch (count: {settings.fetch_count_netfilter})")
        logs = await mailcow_api.get_netfilter_logs(count=settings.fetch_count_netfilter)

        if not logs:
+            logger.debug("[NETFILTER] No logs returned from API")
            return

+        logger.debug(f"[NETFILTER] Received {len(logs)} logs from API")

        with get_db_context() as db:
            new_count = 0
+            skipped_count = 0

            for log_entry in logs:
                try:
                    time_val = log_entry.get('time', 0)
                    message = log_entry.get('message', '')

-                    # Parse the message to extract structured data
-                    parsed = parse_netfilter_message(message)
-
-                    ip = parsed.get('ip', '')
-                    unique_id = f"{time_val}:{ip}:{message[:50]}"
+                    priority = log_entry.get('priority', 'info')
+                    unique_id = f"{time_val}:{priority}:{message}"

                    if unique_id in seen_netfilter:
+                        skipped_count += 1
                        continue

                    timestamp = datetime.fromtimestamp(time_val, tz=timezone.utc)
+                    existing = db.query(NetfilterLog).filter(
+                        NetfilterLog.message == message,
+                        NetfilterLog.time == timestamp,
+                        NetfilterLog.priority == priority
+                    ).first()
+
+                    if existing:
+                        skipped_count += 1
+                        seen_netfilter.add(unique_id)
+                        continue
+
+                    parsed = parse_netfilter_message(message, priority=priority)

                    netfilter_log = NetfilterLog(
                        time=timestamp,
-                        priority=log_entry.get('priority', 'info'),
+                        priority=priority,
                        message=message,
                        ip=parsed.get('ip'),
                        username=parsed.get('username'),
@@ -447,38 +415,45 @@ async def fetch_and_store_netfilter():
                    new_count += 1

                except Exception as e:
-                    logger.error(f"Error processing Netfilter log: {e}")
+                    logger.error(f"[NETFILTER] Error processing log entry: {e}")
                    continue

            db.commit()

            if new_count > 0:
-                logger.info(f"[OK] Imported {new_count} Netfilter logs")
+                logger.info(f"[OK] Imported {new_count} Netfilter logs (skipped {skipped_count} duplicates)")
+            elif skipped_count > 0:
+                logger.debug(f"[NETFILTER] All {skipped_count} logs were duplicates, nothing new to import")

        if len(seen_netfilter) > 10000:
+            logger.debug("[NETFILTER] Clearing seen_netfilter cache (size > 10000)")
            seen_netfilter.clear()

    except Exception as e:
-        logger.error(f"[ERROR] Netfilter fetch error: {e}")
+        logger.error(f"[ERROR] Netfilter fetch error: {e}", exc_info=True)

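The revised `unique_id` above keys the in-memory dedup cache on the full message plus priority rather than a truncated message. A minimal sketch of that behavior, with the entry fields assumed from the surrounding code:

```python
seen = set()

def dedup_key(entry):
    # Full message + priority + timestamp: truncating the message (as the old
    # key did) could collapse distinct log lines into a single cache entry.
    return f"{entry.get('time', 0)}:{entry.get('priority', 'info')}:{entry.get('message', '')}"

a = {"time": 1735689600, "priority": "warn", "message": "Banning 203.0.113.7"}
b = {"time": 1735689600, "priority": "warn", "message": "Banning 203.0.113.8"}

outcomes = []
for entry in (a, b, a):
    key = dedup_key(entry)
    outcomes.append("duplicate" if key in seen else "new")
    seen.add(key)

print(outcomes)
```

Under the old truncated key, two long messages sharing their first 50 characters and timestamp would have deduplicated against each other.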
async def fetch_all_logs():
    """Fetch all log types concurrently"""
    try:
-        await asyncio.gather(
+        logger.debug("[FETCH] Starting fetch_all_logs")
+        results = await asyncio.gather(
            fetch_and_store_postfix(),
            fetch_and_store_rspamd(),
            fetch_and_store_netfilter(),
            return_exceptions=True
        )

+        for i, result in enumerate(results):
+            if isinstance(result, Exception):
+                log_type = ["Postfix", "Rspamd", "Netfilter"][i]
+                logger.error(f"[ERROR] {log_type} fetch failed: {result}", exc_info=result)
+
+        logger.debug("[FETCH] Completed fetch_all_logs")
    except Exception as e:
-        logger.error(f"[ERROR] Fetch all logs error: {e}")
+        logger.error(f"[ERROR] Fetch all logs error: {e}", exc_info=True)


# =============================================================================
# PHASE 2: CORRELATION (Separate from import!)
# =============================================================================

async def cleanup_blacklisted_queues():
    """
    Clean up Postfix queues where the recipient is blacklisted.
@@ -497,12 +472,9 @@ async def cleanup_blacklisted_queues():

    try:
        with get_db_context() as db:
-            # Find queue_ids where recipient is blacklisted
            blacklisted_queue_ids = set()

-            # Query Postfix logs that have a recipient in the blacklist
            for email in blacklist:
-                # Find logs where recipient matches this blacklisted email
                logs_with_blacklisted_recipient = db.query(PostfixLog).filter(
                    PostfixLog.recipient == email,
                    PostfixLog.queue_id.isnot(None)
@@ -515,7 +487,6 @@ async def cleanup_blacklisted_queues():
            if not blacklisted_queue_ids:
                return

-            # Delete ALL Postfix logs with these queue_ids
            deleted_count = 0
            for queue_id in blacklisted_queue_ids:
                count = db.query(PostfixLog).filter(
@@ -550,7 +521,6 @@ async def run_correlation():

    try:
        with get_db_context() as db:
-            # Find Rspamd logs without correlation (limit to avoid overload)
            uncorrelated_rspamd = db.query(RspamdLog).filter(
                RspamdLog.correlation_key.is_(None),
                RspamdLog.message_id.isnot(None),
@@ -566,9 +536,7 @@ async def run_correlation():

        for rspamd_log in uncorrelated_rspamd:
            try:
-                # Check blacklist (for legacy logs that were imported before blacklist was set)
                if is_blacklisted(rspamd_log.sender_smtp):
-                    # Mark as correlated so we don't keep trying
                    rspamd_log.correlation_key = "BLACKLISTED"
                    db.commit()
                    skipped_blacklist += 1

37
backend/app/version.py
Normal file
@@ -0,0 +1,37 @@
+"""
+Version management - reads version from VERSION file
+"""
+import os
+from pathlib import Path
+
+def get_version() -> str:
+    """
+    Read version from VERSION file in project root
+
+    Returns:
+        Version string (e.g., "1.4.2")
+    """
+    # Try multiple possible paths
+    possible_paths = [
+        Path("/app/VERSION"),  # Docker container path
+        Path(__file__).parent.parent.parent / "VERSION",  # Development path
+        Path(__file__).parent.parent.parent.parent / "VERSION",  # Alternative dev path
+    ]
+
+    for version_path in possible_paths:
+        if version_path.exists():
+            try:
+                with open(version_path, "r") as f:
+                    version = f.read().strip()
+                    if version:
+                        return version
+            except Exception as e:
+                print(f"Error reading VERSION file from {version_path}: {e}")
+                continue
+
+    # Fallback to default version if file not found
+    return "1.4.2"
+
+# Cache the version on import
+__version__ = get_version()

@@ -4,21 +4,66 @@ This document describes all available API endpoints for the Mailcow Logs Viewer

**Base URL:** `http://your-server:8080/api`

+**Authentication:** When `AUTH_ENABLED=true`, all API endpoints (except `/api/health`) require HTTP Basic Authentication. Include the `Authorization: Basic <base64(username:password)>` header in all requests.

---

## Table of Contents

-1. [Health & Info](#health--info)
-2. [Messages (Unified View)](#messages-unified-view)
-3. [Logs](#logs)
+1. [Authentication](#authentication)
+2. [Health & Info](#health--info)
+3. [Messages (Unified View)](#messages-unified-view)
+4. [Logs](#logs)
   - [Postfix Logs](#postfix-logs)
   - [Rspamd Logs](#rspamd-logs)
   - [Netfilter Logs](#netfilter-logs)
-4. [Queue & Quarantine](#queue--quarantine)
-5. [Statistics](#statistics)
-6. [Status](#status)
-7. [Settings](#settings)
-8. [Export](#export)
+5. [Queue & Quarantine](#queue--quarantine)
+6. [Statistics](#statistics)
+7. [Status](#status)
+8. [Settings](#settings)
+9. [Export](#export)

---

+## Authentication
+
+### Overview
+
+When authentication is enabled (`AUTH_ENABLED=true`), all API endpoints except `/api/health` require HTTP Basic Authentication.
+
+**Public Endpoints (No Authentication Required):**
+- `GET /api/health` - Health check (for Docker monitoring)
+- `GET /login` - Login page (HTML)
+
+**Protected Endpoints (Authentication Required):**
+- All other `/api/*` endpoints
+
+### Authentication Method
+
+Use HTTP Basic Authentication with the credentials configured in your environment:
+- Username: `AUTH_USERNAME` (default: `admin`)
+- Password: `AUTH_PASSWORD`
+
+**Example Request:**
+```bash
+curl -u username:password http://your-server:8080/api/info
+```
+
+Or with explicit header:
+```bash
+curl -H "Authorization: Basic $(echo -n 'username:password' | base64)" \
+  http://your-server:8080/api/info
+```
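For completeness, the same header the curl calls send can be built with Python's standard library alone; the credentials below are placeholders for your configured `AUTH_USERNAME` / `AUTH_PASSWORD`:

```python
import base64

# Placeholder credentials; substitute your real AUTH_USERNAME / AUTH_PASSWORD.
username, password = "admin", "secret"

# HTTP Basic auth: base64 of "username:password", prefixed with "Basic "
token = base64.b64encode(f"{username}:{password}".encode()).decode()
headers = {"Authorization": f"Basic {token}"}

print(headers["Authorization"])  # → Basic YWRtaW46c2VjcmV0
```

Pass `headers` to any HTTP client, e.g. `urllib.request.Request(url, headers=headers)`.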
+
+### Login Endpoint
+
+#### GET /login
+
+Serves the login page (HTML). This endpoint is always publicly accessible.
+
+**Response:** HTML page with login form
+
+**Note:** When authentication is disabled, accessing this endpoint will automatically redirect to the main application.
+
---

@@ -28,17 +73,20 @@ This document describes all available API endpoints for the Mailcow Logs Viewer

Health check endpoint for monitoring and load balancers.

+**Authentication:** Not required (public endpoint for Docker health checks)

**Response:**
```json
{
  "status": "healthy",
  "database": "connected",
-  "version": "1.3.0",
+  "version": "1.4.3",
  "config": {
    "fetch_interval": 60,
    "retention_days": 7,
    "mailcow_url": "https://mail.example.com",
-    "blacklist_enabled": true
+    "blacklist_enabled": true,
+    "auth_enabled": false
  }
}
```
@@ -53,7 +101,7 @@ Application information and configuration.
```json
{
  "name": "Mailcow Logs Viewer",
-  "version": "1.3.0",
+  "version": "1.4.3",
  "mailcow_url": "https://mail.example.com",
  "local_domains": ["example.com", "mail.example.com"],
  "fetch_interval": 60,
@@ -61,7 +109,8 @@ Application information and configuration.
  "timezone": "UTC",
  "app_title": "Mailcow Logs Viewer",
  "app_logo_url": "",
-  "blacklist_count": 3
+  "blacklist_count": 3,
+  "auth_enabled": false
}
```

@@ -661,6 +710,25 @@ Get Mailcow version and update status.

---

+### GET /status/app-version
+
+Get application version and check for updates from GitHub.
+
+**Response:**
+```json
+{
+  "current_version": "1.4.3",
+  "latest_version": "1.4.3",
+  "update_available": false,
+  "changelog": "Release notes...",
+  "last_checked": "2026-01-01T10:30:00Z"
+}
+```
+
+**Note:** This endpoint checks GitHub once per day and caches the result.
+
---

### GET /status/mailcow-info

Get Mailcow system information.
@@ -725,7 +793,9 @@ Get system configuration and status information.
    "mailcow_url": "https://mail.example.com",
    "local_domains": ["example.com"],
    "fetch_interval": 60,
-    "fetch_count": 500,
+    "fetch_count_postfix": 2000,
+    "fetch_count_rspamd": 500,
+    "fetch_count_netfilter": 500,
    "retention_days": 7,
    "timezone": "UTC",
    "app_title": "Mailcow Logs Viewer",
@@ -734,21 +804,26 @@ Get system configuration and status information.
    "blacklist_count": 3,
    "max_search_results": 1000,
    "csv_export_limit": 10000,
-    "scheduler_workers": 4
+    "scheduler_workers": 4,
+    "auth_enabled": false,
+    "auth_username": null
  },
  "import_status": {
    "postfix": {
      "last_import": "2025-12-25T10:30:00Z",
+      "last_fetch_run": "2025-12-25T10:35:00Z",
      "total_entries": 50000,
      "oldest_entry": "2025-12-18T00:00:00Z"
    },
    "rspamd": {
      "last_import": "2025-12-25T10:30:00Z",
+      "last_fetch_run": "2025-12-25T10:35:00Z",
      "total_entries": 45000,
      "oldest_entry": "2025-12-18T00:00:00Z"
    },
    "netfilter": {
      "last_import": "2025-12-25T10:30:00Z",
+      "last_fetch_run": "2025-12-25T10:35:00Z",
      "total_entries": 1000,
      "oldest_entry": "2025-12-18T00:00:00Z"
    }
@@ -929,6 +1004,15 @@ All endpoints may return the following error responses:
  }
```

+### 401 Unauthorized
+```json
+{
+  "detail": "Authentication required"
+}
+```
+
+**Note:** Returned when authentication is enabled but no valid credentials are provided. The response does not include `WWW-Authenticate` header to prevent browser popup dialogs.
+
### 500 Internal Server Error
```json
{

@@ -46,9 +46,10 @@ nano .env
|----------|-------------|---------|
| `MAILCOW_URL` | Your Mailcow instance URL | `https://mail.example.com` |
| `MAILCOW_API_KEY` | Your Mailcow API key | `abc123-def456...` |
-| `MAILCOW_LOCAL_DOMAINS` | Your email domains | `example.com,domain.net` |
| `POSTGRES_PASSWORD` | Database password<br>⚠️ Avoid special chars (`@:/?#`) - breaks connection strings<br>💡 Use UUID: Linux/Mac: `uuidgen`<br>or online https://it-tools.tech/uuid-generator | Example: `a7f3c8e2-4b1d-4f9a-8c3e-7d2f1a9b5e4c` |

+**Note:** Active domains are automatically fetched from Mailcow API (`/api/v1/get/domain/all`) - no need to configure `MAILCOW_LOCAL_DOMAINS` anymore!
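The auto-detection the note describes can be pictured as filtering the documented endpoint's response down to active entries; the field names (`domain_name`, `active`) here are assumptions about the Mailcow payload, not a verified schema:

```python
def active_domains(api_response):
    # Keep only domains flagged active; Mailcow may report the flag as "1" or 1.
    return [d["domain_name"] for d in api_response if str(d.get("active")) == "1"]

# Hypothetical response shape for /api/v1/get/domain/all
sample = [
    {"domain_name": "example.com", "active": "1"},
    {"domain_name": "old.example.net", "active": "0"},
]

print(active_domains(sample))  # → ['example.com']
```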
|
||||
**Review all other settings** and adjust as needed for your environment (timezone, fetch intervals, retention period, etc.)
|
||||
|
||||
**🔐 Optional: Enable Authentication**
|
||||
|
||||
@@ -12,8 +12,8 @@ MAILCOW_URL=https://mail.example.com
|
||||
# Required permissions: Read access to logs
|
||||
MAILCOW_API_KEY=your-api-key-here
|
||||
|
||||
# Your local email domains (comma-separated, no spaces)
|
||||
MAILCOW_LOCAL_DOMAINS=example.com,example.net
|
||||
# Note: Active domains are automatically fetched from Mailcow API
|
||||
# No need to configure MAILCOW_LOCAL_DOMAINS anymore
|
||||
|
||||
# =============================================================================
|
||||
# DATABASE CONFIGURATION
|
||||
|
||||
273
frontend/app.js
@@ -263,6 +263,12 @@ async function loadAppInfo() {
        if (data.app_title) {
            document.getElementById('app-title').textContent = data.app_title;
            document.title = data.app_title;
+
+            // Update footer app name
+            const footerName = document.getElementById('app-name-footer');
+            if (footerName) {
+                footerName.textContent = data.app_title;
+            }
        }

        if (data.app_logo_url) {
@@ -272,6 +278,12 @@ async function loadAppInfo() {
            document.getElementById('default-logo').classList.add('hidden');
        }

+        // Update footer version
+        const footerVersion = document.getElementById('app-version-footer');
+        if (footerVersion && data.version) {
+            footerVersion.textContent = `v${data.version}`;
+        }
+
        // Show/hide logout button based on auth status
        const logoutBtn = document.getElementById('logout-btn');
        if (logoutBtn) {
@@ -281,11 +293,33 @@ async function loadAppInfo() {
                logoutBtn.classList.add('hidden');
            }
        }
+
+        // Load app version status for update check
+        await loadAppVersionStatus();
    } catch (error) {
        console.error('Failed to load app info:', error);
    }
}

+async function loadAppVersionStatus() {
+    try {
+        const response = await authenticatedFetch('/api/status/app-version');
+        if (!response.ok) return;
+
+        const data = await response.json();
+        const updateBadge = document.getElementById('update-badge');
+
+        if (updateBadge && data.update_available) {
+            updateBadge.classList.remove('hidden');
+            updateBadge.title = `Update available: v${data.latest_version}`;
+        } else if (updateBadge) {
+            updateBadge.classList.add('hidden');
+        }
+    } catch (error) {
+        console.error('Failed to load app version status:', error);
+    }
+}
+
// =============================================================================
// AUTO-REFRESH SYSTEM - Smart refresh (only updates when data changes)
// =============================================================================
@@ -300,6 +334,12 @@ let lastDataCache = {
    settings: null
};

+// Cache for version info (separate from settings cache, doesn't update on smart refresh)
+let versionInfoCache = {
+    app_version: null,
+    version_info: null
+};
+
function startAutoRefresh() {
    // Clear existing timer if any
    if (autoRefreshTimer) {
@@ -443,11 +483,71 @@ function renderMessagesData(data) {
    `;
}

+// Deduplicate netfilter logs based on message + time + priority
+function deduplicateNetfilterLogs(logs) {
+    if (!logs || logs.length === 0) return [];
+
+    const seen = new Set();
+    const uniqueLogs = [];
+
+    for (const log of logs) {
+        // Create unique key from message + time + priority
+        const key = `${log.message || ''}|${log.time || ''}|${log.priority || ''}`;
+
+        if (!seen.has(key)) {
+            seen.add(key);
+            uniqueLogs.push(log);
+        }
+    }
+
+    return uniqueLogs;
+}
+
+// Render netfilter without loading spinner (for smart refresh)
+function renderNetfilterData(data) {
+    const container = document.getElementById('netfilter-logs');
+    if (!container) return;
+
+    if (!data.data || data.data.length === 0) {
+        container.innerHTML = '<p class="text-gray-500 dark:text-gray-400 text-center py-8">No logs found</p>';
+        return;
+    }
+
+    // Deduplicate logs
+    const uniqueLogs = deduplicateNetfilterLogs(data.data);
+
+    // Update count display with deduplicated count
+    const countEl = document.getElementById('security-count');
+    if (countEl) {
+        countEl.textContent = uniqueLogs.length > 0 ? `(${uniqueLogs.length.toLocaleString()} results)` : '';
+    }
+
+    container.innerHTML = `
+        <div class="space-y-3">
+            ${uniqueLogs.map(log => `
+                <div class="border border-gray-200 dark:border-gray-700 rounded-lg p-4 bg-white dark:bg-gray-800 hover:bg-gray-50 dark:hover:bg-gray-700/50 transition">
+                    <div class="flex flex-col sm:flex-row sm:items-center justify-between gap-2 mb-2">
+                        <div class="flex flex-wrap items-center gap-2">
+                            <span class="font-mono text-sm font-semibold text-gray-900 dark:text-white">${log.ip || '-'}</span>
+                            ${log.username && log.username !== '-' ? `<span class="text-sm text-blue-600 dark:text-blue-400">${escapeHtml(log.username)}</span>` : ''}
+                            <span class="inline-block px-2 py-0.5 text-xs font-medium rounded ${log.action === 'banned' ? 'bg-red-100 dark:bg-red-900/30 text-red-800 dark:text-red-300' : 'bg-yellow-100 dark:bg-yellow-900/30 text-yellow-800 dark:text-yellow-300'}">${log.action || 'warning'}</span>
+                            ${log.attempts_left !== null && log.attempts_left !== undefined ? `<span class="text-xs text-gray-500 dark:text-gray-400">${log.attempts_left} attempts left</span>` : ''}
+                        </div>
+                        <span class="text-xs text-gray-500 dark:text-gray-400">${formatTime(log.time)}</span>
+                    </div>
+                    <p class="text-sm text-gray-700 dark:text-gray-300 break-words">${escapeHtml(log.message || '-')}</p>
+                </div>
+            `).join('')}
+        </div>
+        ${renderPagination('netfilter', data.page, data.pages)}
+    `;
+}

// Smart refresh for Netfilter
async function smartRefreshNetfilter() {
    const filters = currentFilters.netfilter || {};
    const params = new URLSearchParams({
-        page: currentPage.netfilter,
+        page: currentPage.netfilter || 1,
        limit: 50,
        ...filters
    });
@@ -460,52 +560,11 @@ async function smartRefreshNetfilter() {
    if (hasDataChanged(data, 'netfilter')) {
        console.log('[REFRESH] Netfilter data changed, updating UI');
        lastDataCache.netfilter = data;
+        // Use renderNetfilterData to update content without loading spinner (like Messages page)
        renderNetfilterData(data);
    }
}

-// Render netfilter without loading spinner
-function renderNetfilterData(data) {
-    const container = document.getElementById('netfilter-logs');
-    if (!container) return;
-
-    if (!data.data || data.data.length === 0) {
-        container.innerHTML = '<p class="text-gray-500 dark:text-gray-400 text-center py-8">No logs found</p>';
-        return;
-    }
-
-    container.innerHTML = `
-        <div class="mobile-scroll overflow-x-auto">
-            <table class="min-w-full divide-y divide-gray-200 dark:divide-gray-700">
-                <thead class="bg-gray-50 dark:bg-gray-700">
-                    <tr>
-                        <th class="px-3 sm:px-4 py-3 text-left text-xs font-medium text-gray-500 dark:text-gray-300 uppercase tracking-wider">Time</th>
-                        <th class="px-3 sm:px-4 py-3 text-left text-xs font-medium text-gray-500 dark:text-gray-300 uppercase tracking-wider">IP</th>
-                        <th class="px-3 sm:px-4 py-3 text-left text-xs font-medium text-gray-500 dark:text-gray-300 uppercase tracking-wider">Username</th>
-                        <th class="px-3 sm:px-4 py-3 text-left text-xs font-medium text-gray-500 dark:text-gray-300 uppercase tracking-wider">Auth Method</th>
-                        <th class="px-3 sm:px-4 py-3 text-left text-xs font-medium text-gray-500 dark:text-gray-300 uppercase tracking-wider">Action</th>
-                        <th class="px-3 sm:px-4 py-3 text-left text-xs font-medium text-gray-500 dark:text-gray-300 uppercase tracking-wider hide-mobile">Attempts Left</th>
-                    </tr>
-                </thead>
-                <tbody class="bg-white dark:bg-gray-800 divide-y divide-gray-200 dark:divide-gray-700">
-                    ${data.data.map(log => `
-                        <tr class="hover:bg-gray-50 dark:hover:bg-gray-700">
-                            <td class="px-3 sm:px-4 py-3 text-xs sm:text-sm text-gray-900 dark:text-gray-100 whitespace-nowrap">${formatTime(log.time)}</td>
-                            <td class="px-3 sm:px-4 py-3 text-xs sm:text-sm font-mono text-gray-900 dark:text-gray-100">${log.ip || '-'}</td>
-                            <td class="px-3 sm:px-4 py-3 text-xs sm:text-sm text-gray-900 dark:text-gray-100">${escapeHtml(log.username || '-')}</td>
-                            <td class="px-3 sm:px-4 py-3 text-xs sm:text-sm text-gray-600 dark:text-gray-300">${log.auth_method || '-'}</td>
-                            <td class="px-3 sm:px-4 py-3 text-xs sm:text-sm">
-                                <span class="inline-block px-2 py-1 text-xs font-medium rounded ${log.action === 'banned' ? 'bg-red-100 dark:bg-red-900/30 text-red-800 dark:text-red-300' : 'bg-yellow-100 dark:bg-yellow-900/30 text-yellow-800 dark:text-yellow-300'}">${log.action || 'warning'}</span>
-                            </td>
-                            <td class="px-3 sm:px-4 py-3 text-xs sm:text-sm text-gray-600 dark:text-gray-300 hide-mobile">${log.attempts_left !== null ? log.attempts_left : '-'}</td>
-                        </tr>
-                    `).join('')}
-                </tbody>
-            </table>
-        </div>
-        ${renderPagination('netfilter', data.page, data.pages)}
-    `;
-}

// Smart refresh for Queue
async function smartRefreshQueue() {
@@ -610,6 +669,14 @@ async function smartRefreshSettings() {

    const content = document.getElementById('settings-content');
    if (content && !content.classList.contains('hidden')) {
+        // Preserve version info from cache (don't reload it on smart refresh)
+        if (versionInfoCache.app_version) {
+            data.app_version = versionInfoCache.app_version;
+        }
+        if (versionInfoCache.version_info) {
+            data.version_info = versionInfoCache.version_info;
+        }
+
        renderSettings(content, data);
    }
}
@@ -1103,20 +1170,25 @@ async function loadNetfilterLogs(page = 1) {
        const data = await response.json();
        console.log('Netfilter data:', data);

-        // Update count display
-        const countEl = document.getElementById('security-count');
-        if (countEl) {
-            countEl.textContent = data.total ? `(${data.total.toLocaleString()} results)` : '';
-        }
-
        if (!data.data || data.data.length === 0) {
            container.innerHTML = '<p class="text-gray-500 dark:text-gray-400 text-center py-8">No logs found</p>';
+            const countEl = document.getElementById('security-count');
+            if (countEl) countEl.textContent = '';
            return;
        }

+        // Deduplicate logs based on message + time + priority
+        const uniqueLogs = deduplicateNetfilterLogs(data.data);
+
+        // Update count display with deduplicated count
+        const countEl = document.getElementById('security-count');
+        if (countEl) {
+            countEl.textContent = uniqueLogs.length > 0 ? `(${uniqueLogs.length.toLocaleString()} results)` : '';
+        }
+
        container.innerHTML = `
            <div class="space-y-3">
-                ${data.data.map(log => `
+                ${uniqueLogs.map(log => `
                    <div class="border border-gray-200 dark:border-gray-700 rounded-lg p-4 bg-white dark:bg-gray-800 hover:bg-gray-50 dark:hover:bg-gray-700/50 transition">
                        <div class="flex flex-col sm:flex-row sm:items-center justify-between gap-2 mb-2">
                            <div class="flex flex-wrap items-center gap-2">
@@ -2574,14 +2646,33 @@ async function loadSettings() {
    content.classList.add('hidden');

    try {
-        const response = await authenticatedFetch('/api/settings/info');
-        if (!response.ok) {
-            throw new Error(`HTTP ${response.status}`);
+        // Load settings info and app version status in parallel
+        const [settingsResponse, appInfoResponse, versionResponse] = await Promise.all([
+            authenticatedFetch('/api/settings/info'),
+            authenticatedFetch('/api/info'),
+            authenticatedFetch('/api/status/app-version')
+        ]);
+
+        if (!settingsResponse.ok) {
+            throw new Error(`HTTP ${settingsResponse.status}`);
        }

-        const data = await response.json();
+        const data = await settingsResponse.json();
+        const appInfo = appInfoResponse.ok ? await appInfoResponse.json() : null;
+        const versionInfo = versionResponse.ok ? await versionResponse.json() : null;

        console.log('Settings loaded:', data);

+        // Add version info to data and cache it
+        if (appInfo) {
+            data.app_version = appInfo.version;
+            versionInfoCache.app_version = appInfo.version;
+        }
+        if (versionInfo) {
+            data.version_info = versionInfo;
+            versionInfoCache.version_info = versionInfo;
+        }
+
        renderSettings(content, data);

        loading.classList.add('hidden');
@@ -2603,8 +2694,55 @@ async function loadSettings() {

function renderSettings(content, data) {
    const config = data.configuration || {};
+    const appVersion = data.app_version || 'Unknown';
+    const versionInfo = data.version_info || {};

    content.innerHTML = `
+        <!-- Version Information Section -->
+        <div class="bg-white dark:bg-gray-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700 mb-6">
+            <div class="p-4 border-b border-gray-200 dark:border-gray-700">
+                <h3 class="text-lg font-semibold text-gray-900 dark:text-white flex items-center gap-2">
+                    <svg class="w-5 h-5 text-blue-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M7 16a4 4 0 01-.88-7.903A5 5 0 1115.9 6L16 6a5 5 0 011 9.9M15 13l-3-3m0 0l-3 3m3-3v12"></path>
+                    </svg>
+                    Version Information
+                </h3>
+            </div>
+            <div class="p-4">
+                <div class="grid grid-cols-1 md:grid-cols-2 gap-4">
+                    <div class="p-4 bg-gray-50 dark:bg-gray-700/30 rounded-lg">
+                        <p class="text-xs font-medium text-gray-500 dark:text-gray-400 uppercase mb-2">Current Version</p>
+                        <p class="text-lg font-semibold text-gray-900 dark:text-white">v${appVersion}</p>
+                    </div>
+                    <div class="p-4 bg-gray-50 dark:bg-gray-700/30 rounded-lg">
+                        <p class="text-xs font-medium text-gray-500 dark:text-gray-400 uppercase mb-2">Latest Version</p>
+                        <div class="flex items-center gap-2">
+                            <p class="text-lg font-semibold text-gray-900 dark:text-white">${versionInfo.latest_version ? `v${versionInfo.latest_version}` : 'Checking...'}</p>
+                            ${versionInfo.update_available ? `
+                                <span class="px-2 py-1 bg-green-100 dark:bg-green-900/30 text-green-800 dark:text-green-300 rounded text-xs font-medium">
+                                    Update Available
+                                </span>
+                            ` : versionInfo.latest_version && !versionInfo.update_available ? `
+                                <span class="px-2 py-1 bg-blue-100 dark:bg-blue-900/30 text-blue-800 dark:text-blue-300 rounded text-xs font-medium">
+                                    Up to Date
+                                </span>
+                            ` : ''}
+                        </div>
+                    </div>
+                </div>
+                ${versionInfo.update_available ? `
+                    <div class="mt-4 p-3 bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800 rounded-lg">
+                        <p class="text-sm text-green-800 dark:text-green-300">
+                            <strong>Update available!</strong> A new version (v${versionInfo.latest_version}) is available on GitHub.
+                        </p>
+                        <a href="https://github.com/ShlomiPorush/mailcow-logs-viewer/releases/latest" target="_blank" rel="noopener noreferrer" class="text-sm text-green-600 dark:text-green-400 hover:underline mt-2 inline-block">
+                            View release notes →
+                        </a>
+                    </div>
+                ` : ''}
+            </div>
+        </div>
+
        <!-- Configuration Section -->
        <div class="bg-white dark:bg-gray-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700">
            <div class="p-4 border-b border-gray-200 dark:border-gray-700">
@@ -2622,9 +2760,26 @@ function renderSettings(content, data) {
                    <p class="text-xs font-medium text-gray-500 dark:text-gray-400 uppercase">Mailcow URL</p>
                    <p class="text-sm text-gray-900 dark:text-white mt-1 font-mono break-all">${escapeHtml(config.mailcow_url || 'N/A')}</p>
                </div>
-                <div class="p-3 bg-gray-50 dark:bg-gray-700/30 rounded-lg">
-                    <p class="text-xs font-medium text-gray-500 dark:text-gray-400 uppercase">Local Domains</p>
-                    <p class="text-sm text-gray-900 dark:text-white mt-1">${config.local_domains ? config.local_domains.join(', ') : 'N/A'}</p>
+                <div class="p-3 bg-gray-50 dark:bg-gray-700/30 rounded-lg ${config.local_domains && config.local_domains.length > 0 ? 'col-span-1 md:col-span-2 lg:col-span-3' : ''}">
+                    <p class="text-xs font-medium text-gray-500 dark:text-gray-400 uppercase mb-2">
+                        Local Domains
+                        ${config.local_domains && config.local_domains.length > 0 ?
+                            `<span class="ml-1 text-gray-400 dark:text-gray-500 font-normal">(${config.local_domains.length})</span>` :
+                            ''
+                        }
+                    </p>
+                    ${config.local_domains && config.local_domains.length > 0 ?
+                        `<div class="mt-2 max-h-64 overflow-y-auto">
+                            <div class="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-2">
+                                ${config.local_domains.map(domain => `
+                                    <div class="text-sm text-gray-900 dark:text-white font-mono px-3 py-1.5 bg-white dark:bg-gray-800 rounded border border-gray-200 dark:border-gray-600 truncate" title="${escapeHtml(domain)}">
+                                        ${escapeHtml(domain)}
+                                    </div>
+                                `).join('')}
+                            </div>
+                        </div>` :
+                        '<p class="text-sm text-gray-500 dark:text-gray-400 mt-1">N/A</p>'
+                    }
                </div>
                <div class="p-3 bg-gray-50 dark:bg-gray-700/30 rounded-lg">
                    <p class="text-xs font-medium text-gray-500 dark:text-gray-400 uppercase">Fetch Interval</p>
@@ -2714,6 +2869,10 @@ function renderImportCard(title, data, color) {
        <div class="p-4 border ${colorClasses[color]} rounded-lg">
            <p class="font-semibold text-gray-900 dark:text-white mb-3">${title}</p>
            <div class="space-y-2 text-sm">
+                <div>
+                    <p class="text-xs text-gray-500 dark:text-gray-400">Last Fetch Run</p>
+                    <p class="text-gray-900 dark:text-white font-medium">${data.last_fetch_run ? formatTime(data.last_fetch_run) : 'Never'}</p>
+                </div>
                <div>
                    <p class="text-xs text-gray-500 dark:text-gray-400">Last Import</p>
                    <p class="text-gray-900 dark:text-white">${data.last_import ? formatTime(data.last_import) : 'Never'}</p>
@@ -65,7 +65,7 @@
|
||||
</head>
|
||||
<body class="h-full bg-gray-50 dark:bg-gray-900">
|
||||
<!-- Main App Content -->
|
||||
<div id="app-content" class="h-full">
|
||||
<div id="app-content" class="h-full overflow-auto">
|
||||
<!-- Navigation -->
|
||||
<nav class="bg-white dark:bg-gray-800 shadow dark:shadow-gray-900/30">
|
||||
<div class="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8">
|
||||
@@ -566,6 +566,32 @@
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Footer -->
|
||||
<footer id="app-footer" class="bg-white dark:bg-gray-800 border-t border-gray-200 dark:border-gray-700 mt-6">
|
||||
<div class="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-4">
|
||||
<div class="flex flex-col sm:flex-row justify-between items-center gap-2 text-sm text-gray-600 dark:text-gray-400">
|
||||
<div class="flex flex-wrap items-center gap-2 justify-center sm:justify-start">
|
||||
<span id="app-name-footer">Mailcow Logs Viewer</span>
|
||||
<span class="text-gray-400 dark:text-gray-500">•</span>
|
||||
<span id="app-version-footer">v1.4.2</span>
|
||||
<span id="update-badge" class="hidden ml-2 px-2 py-0.5 bg-green-100 dark:bg-green-900/30 text-green-800 dark:text-green-300 rounded text-xs font-medium">
|
||||
Update Available
|
||||
</span>
|
||||
</div>
|
||||
<div class="flex flex-wrap items-center gap-2 justify-center sm:justify-end">
|
||||
<span>Created with ❤️</span>
|
||||
<span class="text-gray-400 dark:text-gray-500">•</span>
|
||||
<a href="https://github.com/ShlomiPorush/mailcow-logs-viewer" target="_blank" rel="noopener noreferrer" class="text-blue-600 dark:text-blue-400 hover:text-blue-800 dark:hover:text-blue-300 transition flex items-center gap-1">
|
||||
<svg class="w-4 h-4" fill="currentColor" viewBox="0 0 24 24">
|
||||
<path d="M12 0c-6.626 0-12 5.373-12 12 0 5.302 3.438 9.8 8.207 11.387.599.111.793-.261.793-.577v-2.234c-3.338.726-4.033-1.416-4.033-1.416-.546-1.387-1.333-1.756-1.333-1.756-1.089-.745.083-.729.083-.729 1.205.084 1.839 1.237 1.839 1.237 1.07 1.834 2.807 1.304 3.492.997.107-.775.418-1.305.762-1.604-2.665-.305-5.467-1.334-5.467-5.931 0-1.311.469-2.381 1.236-3.221-.124-.303-.535-1.524.117-3.176 0 0 1.008-.322 3.301 1.23.957-.266 1.983-.399 3.003-.404 1.02.005 2.047.138 3.006.404 2.291-1.552 3.297-1.23 3.297-1.23.653 1.653.242 2.874.118 3.176.77.84 1.235 1.911 1.235 3.221 0 4.609-2.807 5.624-5.479 5.921.43.372.823 1.102.823 2.222v3.293c0 .319.192.694.801.576 4.765-1.589 8.199-6.086 8.199-11.386 0-6.627-5.373-12-12-12z"/>
|
||||
</svg>
|
||||
GitHub
|
||||
</a>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</footer>
|
||||
</div> <!-- Close app-content div -->
|
||||
|
||||
<script src="/static/app.js"></script>
|
||||
|
||||