Release version 1.4.8

shlomi
2026-01-08 12:55:03 +02:00
parent 712cc3e20b
commit ccc87b8f4f
19 changed files with 1341 additions and 222 deletions

View File

@@ -5,6 +5,81 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [1.4.8] - 2026-01-08
### Added
#### Automated Domains DNS Validation
- **Automated Background Checks**:
- DNS checks run automatically every 6 hours via scheduler
- Checks only active domains to optimize performance
- Results cached with timestamps for quick display
- **Manual DNS Verification**:
- **Global Check**: "Check Now" button in Domains Overview header
- Updates all active domains simultaneously
- Updates global "Last checked" timestamp
- **Single Domain Check**: Individual "Check" button per domain
- Updates only the specific domain without page refresh
- Partial UI update for better UX
- Toast notifications for user feedback on all check operations
- **DNS Check Results Display**:
- Last check timestamp displayed in page header (global checks only)
- Last check timestamp per domain in DNS Security Records section
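The cache-plus-cadence behaviour described above can be sketched as a simple staleness test (a minimal sketch: the 6-hour threshold mirrors the scheduler interval, and the helper name is illustrative, not part of this commit):

```python
# Sketch: a cached DNS result is considered stale if it is missing or older
# than one background-check interval (6 hours, matching the scheduler).
from datetime import datetime, timedelta, timezone
from typing import Optional

CHECK_INTERVAL = timedelta(hours=6)

def is_stale(checked_at: Optional[datetime], now: Optional[datetime] = None) -> bool:
    """Return True if the cached row is absent or older than one interval."""
    if checked_at is None:
        return True
    now = now or datetime.now(timezone.utc)
    return now - checked_at > CHECK_INTERVAL

# Example with a fixed reference time:
now = datetime(2026, 1, 8, 12, 0, tzinfo=timezone.utc)
print(is_stale(now - timedelta(hours=7), now))  # -> True (older than 6h)
print(is_stale(now - timedelta(hours=1), now))  # -> False (still fresh)
```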
#### Backend Infrastructure
- **New Database Table**: `domain_dns_checks`
- Stores SPF, DKIM, DMARC validation results as JSONB
- Includes `checked_at` timestamp and `is_full_check` flag
- Automatic migration with PostgreSQL artifact cleanup
- **New API Endpoints**:
- `GET /api/domains/all` - Fetch all domains with cached DNS results
- `POST /api/domains/check-all-dns` - Trigger global DNS check (manual)
- `POST /api/domains/{domain}/check-dns` - Check specific domain DNS
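The new endpoints can be exercised from any HTTP client; a minimal standard-library sketch follows (the `BASE_URL` is an assumption for a local deployment, not part of this commit):

```python
# Sketch: triggering the manual DNS-check endpoints added in this release.
# BASE_URL is assumed; adjust to wherever the dashboard API is served.
import json
import urllib.request

BASE_URL = "http://localhost:8000/api"

def dns_check_url(domain=None):
    """URL for the single-domain manual check, or the global check-all endpoint."""
    if domain:
        return f"{BASE_URL}/domains/{domain}/check-dns"
    return f"{BASE_URL}/domains/check-all-dns"

def trigger_dns_check(domain=None):
    """POST to the manual-check endpoint and return the parsed JSON response."""
    req = urllib.request.Request(dns_check_url(domain), method="POST")
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

`trigger_dns_check("example.com")` hits the per-domain endpoint; calling it with no argument triggers the global check.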
#### Frontend Enhancements
- **Responsive Design**: Mobile-optimized layout
- Header elements stack vertically on mobile, horizontally on desktop
- Centered content on mobile for better readability
- Check button and timestamp properly aligned on all screen sizes
- **Toast Notifications**: User feedback system
- Success, error, warning, and info message types
- Color-coded with icons (✓, ✗, ⚠, ℹ)
- Auto-dismiss after 4 seconds
- Manual dismiss option
#### Background Jobs Monitoring & Enhanced UI
- **Real-time Status Tracking**: All background jobs now report execution status (running/success/failed/idle/scheduled), last run timestamp, and error messages
- **Enhanced Visual Design**:
- Compact mobile-optimized layout
- Full-color status badges (solid green/blue/red/gray/purple backgrounds with white text)
- Icon indicators: ⏱ interval, 📅 schedule, 🗂 retention, ⏳ max age, 📋 pending items
- Always-visible last run timestamps
- **Complete Job Coverage**: All 7 background jobs now visible in UI (previously only 5 were displayed):
- Fetch Logs, Complete Correlations, Update Final Status, Expire Correlations, Cleanup Logs, Check App Version, DNS Check
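The status bookkeeping behind this can be sketched as a small in-memory registry (names mirror the scheduler code later in this commit; the sketch itself is illustrative):

```python
# Sketch: per-job status registry, one entry per background job listed above.
from datetime import datetime, timezone

JOB_NAMES = [
    "fetch_logs", "complete_correlations", "update_final_status",
    "expire_correlations", "cleanup_logs", "check_app_version", "dns_check",
]

# Every job starts idle with no recorded run or error.
job_status = {name: {"last_run": None, "status": "idle", "error": None}
              for name in JOB_NAMES}

def update_job_status(job_name, status, error=None):
    """Record the outcome and timestamp of a job execution."""
    job_status[job_name] = {
        "last_run": datetime.now(timezone.utc),
        "status": status,
        "error": error,
    }

update_job_status("dns_check", "success")
```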
### Changed
#### Queue and Quarantine Page
- **Display Order**: Quarantine page now displays newest messages first
- Messages sorted by creation timestamp in descending order (newest → oldest)
- Backend sorting ensures consistent ordering
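The descending sort described above amounts to the following (a sketch; `arrival_time` mirrors the Unix-timestamp field in the Mailcow queue payload):

```python
# Sketch: newest-first ordering by a Unix-timestamp field.
# Items missing the field fall back to 0 and therefore sort last.
messages = [
    {"id": "a", "arrival_time": 100},
    {"id": "b", "arrival_time": 300},
    {"id": "c"},  # no timestamp -> treated as oldest
]

newest_first = sorted(messages, key=lambda m: m.get("arrival_time", 0), reverse=True)
print([m["id"] for m in newest_first])  # -> ['b', 'a', 'c']
```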
#### Dashboard - Recent Activity
- **Layout Improvement**: Reorganized Status & Direction display for better readability
- Status and Direction badges now displayed on first line, right-aligned
- Timestamp moved to second line below badges
#### Background Jobs and Status Page
- Background job status badges now use consistent full-color styling across all themes
- Check App Version and DNS Check jobs now properly displayed in Status page
- Simplified function signatures by removing redundant description parameters
---
## [1.4.7] - 2026-01-06
### Added

View File

@@ -6,10 +6,14 @@ A modern, self-hosted dashboard for viewing and analyzing Mailcow mail server lo
![Messages](images/Messages.png)
![Message Details](images/Message%20Details.png)
![Message Details](images/Message_Details_Overview.png)
![Message Logs](images/Message_Details_Logs.png)
![Security](images/Security.png)
![Domains](images/Domains.png)
![Status](images/Status.png)
---
@@ -116,8 +120,6 @@ All settings via environment variables. See **[env.example](env.example)** for f
| `MAILCOW_API_KEY` | Mailcow API key |
| `POSTGRES_PASSWORD` | Database password |
**Note:** Active domains are automatically fetched from Mailcow API - no configuration needed!
### Key Optional Settings
| Variable | Default | Description |

View File

@@ -1 +1 @@
1.4.7
1.4.8

View File

@@ -240,6 +240,136 @@ def add_is_complete_column(db: Session):
db.rollback()
def ensure_domain_dns_checks_table(db: Session):
"""Ensure domain_dns_checks table exists"""
logger.info("Checking if domain_dns_checks table exists...")
try:
result = db.execute(text("""
SELECT EXISTS (
SELECT FROM information_schema.tables
WHERE table_schema = 'public'
AND table_name = 'domain_dns_checks'
);
"""))
table_exists = result.fetchone()[0]
if table_exists:
logger.info("domain_dns_checks table already exists")
return
logger.info("Creating domain_dns_checks table...")
try:
db.execute(text("""
CREATE TABLE domain_dns_checks (
id SERIAL PRIMARY KEY,
domain_name VARCHAR(255) NOT NULL UNIQUE,
spf_check JSONB,
dkim_check JSONB,
dmarc_check JSONB,
checked_at TIMESTAMP NOT NULL,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
"""))
db.execute(text("""
CREATE INDEX idx_domain_dns_checks_domain
ON domain_dns_checks(domain_name);
"""))
db.execute(text("""
CREATE INDEX idx_domain_dns_checks_checked_at
ON domain_dns_checks(checked_at);
"""))
db.commit()
logger.info("✓ domain_dns_checks table created successfully")
except Exception as create_error:
db.rollback()
if "duplicate key value violates unique constraint" in str(create_error).lower():
logger.warning("Detected PostgreSQL artifact, cleaning up...")
try:
# Clean up ALL artifacts
db.execute(text("DROP SEQUENCE IF EXISTS domain_dns_checks_id_seq CASCADE;"))
db.execute(text("DROP TABLE IF EXISTS domain_dns_checks CASCADE;"))
db.execute(text("DROP TYPE IF EXISTS domain_dns_checks CASCADE;"))
db.commit()
logger.info("Cleaned up PostgreSQL artifacts")
# Retry
db.execute(text("""
CREATE TABLE domain_dns_checks (
id SERIAL PRIMARY KEY,
domain_name VARCHAR(255) NOT NULL UNIQUE,
spf_check JSONB,
dkim_check JSONB,
dmarc_check JSONB,
checked_at TIMESTAMP NOT NULL,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
"""))
db.execute(text("""
CREATE INDEX idx_domain_dns_checks_domain
ON domain_dns_checks(domain_name);
"""))
db.execute(text("""
CREATE INDEX idx_domain_dns_checks_checked_at
ON domain_dns_checks(checked_at);
"""))
db.commit()
logger.info("✓ domain_dns_checks table created after cleanup")
except Exception as retry_error:
logger.error(f"Failed after cleanup: {retry_error}")
db.rollback()
raise
else:
logger.error(f"Failed to create table: {create_error}")
raise
except Exception as e:
logger.error(f"Error ensuring domain_dns_checks table: {e}")
db.rollback()
def add_is_full_check_column(db: Session):
"""Add is_full_check column to domain_dns_checks"""
logger.info("Checking if is_full_check column exists...")
try:
result = db.execute(text("""
SELECT column_name
FROM information_schema.columns
WHERE table_name='domain_dns_checks'
AND column_name='is_full_check'
"""))
if result.fetchone() is None:
logger.info("Adding is_full_check column...")
db.execute(text("""
ALTER TABLE domain_dns_checks
ADD COLUMN is_full_check BOOLEAN DEFAULT FALSE
"""))
db.commit()
logger.info("is_full_check column added")
else:
logger.info("is_full_check column already exists")
except Exception as e:
logger.error(f"Error adding is_full_check column: {e}")
db.rollback()
def run_migrations():
"""
Run all database migrations and maintenance tasks
@@ -255,6 +385,10 @@ def run_migrations():
# Add is_complete column if missing (for tracking correlation completion)
add_is_complete_column(db)
# Domain DNS table
ensure_domain_dns_checks_table(db)
add_is_full_check_column(db)
# Clean up duplicate correlations
removed = cleanup_duplicate_correlations(db)

View File

@@ -190,3 +190,20 @@ class MessageCorrelation(Base):
def __repr__(self):
return f"<MessageCorrelation(message_id={self.message_id}, status={self.final_status})>"
class DomainDNSCheck(Base):
"""Cached DNS check results for domains"""
__tablename__ = "domain_dns_checks"
id = Column(Integer, primary_key=True, index=True)
domain_name = Column(String(255), unique=True, index=True, nullable=False)
spf_check = Column(JSONB)
dkim_check = Column(JSONB)
dmarc_check = Column(JSONB)
checked_at = Column(DateTime, nullable=False)
is_full_check = Column(Boolean, default=False)
created_at = Column(DateTime, default=datetime.utcnow)
updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)

View File

@@ -11,6 +11,13 @@ from datetime import datetime, timezone
from app.mailcow_api import mailcow_api
from sqlalchemy.orm import Session
from sqlalchemy import text
from app.database import get_db
from app.models import DomainDNSCheck
from fastapi import Depends
from app.utils import format_datetime_for_api
logger = logging.getLogger(__name__)
router = APIRouter()
@@ -432,7 +439,7 @@ async def check_domain_dns(domain: str) -> Dict[str, Any]:
'spf': spf_result,
'dkim': dkim_result,
'dmarc': dmarc_result,
'checked_at': datetime.now(timezone.utc).isoformat()
'checked_at': format_datetime_for_api(datetime.now(timezone.utc))
}
except Exception as e:
@@ -440,59 +447,39 @@ async def check_domain_dns(domain: str) -> Dict[str, Any]:
return {
'domain': domain,
'error': str(e),
'checked_at': datetime.now(timezone.utc).isoformat()
'checked_at': format_datetime_for_api(datetime.now(timezone.utc))
}
@router.get("/domains/all")
async def get_all_domains_with_dns():
"""
Get all domains from Mailcow with DNS validation checks
Returns:
List of domains with detailed information and DNS checks
"""
async def get_all_domains_with_dns(db: Session = Depends(get_db)):
"""Get all domains with cached DNS checks"""
try:
# Fetch domains from Mailcow
domains = await mailcow_api.get_domains()
# Get last DNS check time FIRST
last_check = db.query(DomainDNSCheck).filter(
DomainDNSCheck.is_full_check == True
).order_by(
DomainDNSCheck.checked_at.desc()
).first()
if not domains:
return {
'domains': [],
'total': 0,
'active': 0
'active': 0,
'last_dns_check': format_datetime_for_api(last_check.checked_at) if (last_check and last_check.checked_at) else None
}
# Process each domain and add DNS checks
domain_tasks = []
for domain_data in domains:
domain_name = domain_data.get('domain_name')
if domain_name:
domain_tasks.append(check_domain_dns(domain_name))
# Run DNS checks in parallel
dns_results = await asyncio.gather(*domain_tasks, return_exceptions=True)
# Combine domain data with DNS results
result_domains = []
for i, domain_data in enumerate(domains):
for domain_data in domains:
domain_name = domain_data.get('domain_name')
if not domain_name:
continue
# Get DNS results (if available and not an exception)
dns_data = {}
if i < len(dns_results):
if isinstance(dns_results[i], Exception):
logger.error(f"DNS check failed for {domain_name}: {dns_results[i]}")
dns_data = {
'error': str(dns_results[i]),
'spf': {'status': 'error', 'message': 'Check failed'},
'dkim': {'status': 'error', 'message': 'Check failed'},
'dmarc': {'status': 'error', 'message': 'Check failed'}
}
else:
dns_data = dns_results[i]
# Get cached DNS check
dns_checks = get_cached_dns_check(db, domain_name)
result_domains.append({
'domain_name': domain_name,
@@ -510,16 +497,17 @@ async def get_all_domains_with_dns():
'max_quota_for_domain': domain_data.get('max_quota_for_domain', 0),
'backupmx': domain_data.get('backupmx', 0) == 1,
'relay_all_recipients': domain_data.get('relay_all_recipients', 0) == 1,
'dns_checks': dns_data
'relay_unknown_only': domain_data.get('relay_unknown_only', 0) == 1,
'dns_checks': dns_checks or {}
})
# Count active domains
active_count = sum(1 for d in result_domains if d.get('active'))
return {
'domains': result_domains,
'total': len(result_domains),
'active': active_count
'active': active_count,
'last_dns_check': format_datetime_for_api(last_check.checked_at) if (last_check and last_check.checked_at) else None
}
except Exception as e:
@@ -544,3 +532,123 @@ async def check_single_domain_dns(domain: str):
except Exception as e:
logger.error(f"Error checking DNS for {domain}: {e}")
raise HTTPException(status_code=500, detail=str(e))
async def save_dns_check_to_db(db: Session, domain_name: str, dns_data: Dict[str, Any], is_full_check: bool = False):
"""Save DNS check results to database (upsert)"""
try:
checked_at = datetime.now(timezone.utc)
existing = db.query(DomainDNSCheck).filter(
DomainDNSCheck.domain_name == domain_name
).first()
if existing:
existing.spf_check = dns_data.get('spf')
existing.dkim_check = dns_data.get('dkim')
existing.dmarc_check = dns_data.get('dmarc')
existing.checked_at = checked_at
existing.updated_at = checked_at
existing.is_full_check = is_full_check  # newly added field
else:
new_check = DomainDNSCheck(
domain_name=domain_name,
spf_check=dns_data.get('spf'),
dkim_check=dns_data.get('dkim'),
dmarc_check=dns_data.get('dmarc'),
checked_at=checked_at,
is_full_check=is_full_check  # newly added field
)
db.add(new_check)
db.commit()
logger.info(f"Saved DNS check for {domain_name}")
except Exception as e:
logger.error(f"Error saving DNS check for {domain_name}: {e}")
db.rollback()
raise
def get_cached_dns_check(db: Session, domain_name: str) -> Dict[str, Any]:
"""Get cached DNS check from database"""
try:
cached = db.query(DomainDNSCheck).filter(
DomainDNSCheck.domain_name == domain_name
).first()
if cached:
return {
'spf': cached.spf_check,
'dkim': cached.dkim_check,
'dmarc': cached.dmarc_check,
'checked_at': format_datetime_for_api(cached.checked_at) if cached.checked_at else None
}
return None
except Exception as e:
logger.error(f"Error getting cached DNS for {domain_name}: {e}")
return None
@router.post("/domains/check-all-dns")
async def check_all_domains_dns_manual(db: Session = Depends(get_db)):
"""Manually trigger DNS check for all active domains"""
try:
domains = await mailcow_api.get_domains()
if not domains:
return {
'status': 'success',
'message': 'No domains to check',
'domains_checked': 0,
'errors': []
}
active_domains = [d for d in domains if d.get('active', 0) == 1]
checked_count = 0
errors = []
for domain_data in active_domains:
domain_name = domain_data.get('domain_name')
if not domain_name:
continue
try:
dns_data = await check_domain_dns(domain_name)
await save_dns_check_to_db(db, domain_name, dns_data, is_full_check=True)
checked_count += 1
except Exception as e:
errors.append(f"{domain_name}: {str(e)}")
status = 'success' if checked_count == len(active_domains) else 'partial'
return {
'status': status,
'message': f'Checked {checked_count} domains',
'domains_checked': checked_count,
'errors': errors
}
except Exception as e:
logger.error(f"Error in manual DNS check: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.post("/domains/{domain}/check-dns")
async def check_single_domain_dns_manual(domain: str, db: Session = Depends(get_db)):
"""Manually trigger DNS check for a single domain"""
try:
dns_data = await check_domain_dns(domain)
await save_dns_check_to_db(db, domain, dns_data, is_full_check=False)
return {
'status': 'success',
'message': f'DNS checked for {domain}',
'data': dns_data
}
except Exception as e:
logger.error(f"Error checking DNS for {domain}: {e}")
raise HTTPException(status_code=500, detail=str(e))

View File

@@ -430,25 +430,33 @@ async def get_netfilter_logs(
async def get_queue():
"""
Get current mail queue from Mailcow (real-time)
Returns messages sorted by newest first (by arrival_time)
"""
try:
queue = await mailcow_api.get_queue()
# Sort by arrival_time - newest first (descending order)
# arrival_time is a Unix timestamp (integer)
queue_sorted = sorted(
queue,
key=lambda x: x.get('arrival_time', 0),
reverse=True # Newest first
)
return {
"total": len(queue),
"data": queue
"total": len(queue_sorted),
"data": queue_sorted
}
except Exception as e:
logger.error(f"Error fetching queue: {e}")
raise HTTPException(status_code=500, detail=str(e))
from datetime import datetime, timezone
@router.get("/quarantine")
async def get_quarantine():
"""
Get quarantined messages from Mailcow (real-time)
Returns messages sorted by newest first
"""
try:
quarantine = await mailcow_api.get_quarantine()
@@ -461,6 +469,8 @@ async def get_quarantine():
# Convert Unix timestamp to ISO format with 'Z' suffix
dt = datetime.fromtimestamp(item['created'], tz=timezone.utc)
item['created'] = dt.replace(microsecond=0).isoformat().replace('+00:00', 'Z')
# Store the numeric value for sorting
item['_created_timestamp'] = dt.timestamp()
elif isinstance(item['created'], str):
# Parse ISO string and ensure it has 'Z' suffix for UTC
try:
@@ -470,12 +480,26 @@ async def get_quarantine():
else:
dt = dt.astimezone(timezone.utc)
item['created'] = dt.replace(microsecond=0).isoformat().replace('+00:00', 'Z')
# Store the datetime object for sorting
item['_created_timestamp'] = dt.timestamp()
except (ValueError, AttributeError):
pass
# Sort by created timestamp - newest first (descending order)
# Items without valid timestamp will be at the end
quarantine_sorted = sorted(
quarantine,
key=lambda x: x.get('_created_timestamp', 0),
reverse=True # Newest first
)
# Remove the temporary sorting field before returning
for item in quarantine_sorted:
item.pop('_created_timestamp', None)
return {
"total": len(quarantine),
"data": quarantine
"total": len(quarantine_sorted),
"data": quarantine_sorted
}
except Exception as e:
logger.error(f"Error fetching quarantine: {e}")

View File

@@ -12,7 +12,7 @@ from typing import Dict, Any, Optional
from ..database import get_db
from ..models import PostfixLog, RspamdLog, NetfilterLog, MessageCorrelation
from ..config import settings
from ..scheduler import last_fetch_run_time
from ..scheduler import last_fetch_run_time, get_job_status
logger = logging.getLogger(__name__)
@@ -89,6 +89,8 @@ async def get_settings_info(db: Session = Depends(get_db)):
MessageCorrelation.is_complete == False
).order_by(desc(MessageCorrelation.created_at)).limit(5).all()
jobs_status = get_job_status()
return {
"configuration": {
"mailcow_url": settings.mailcow_url,
@@ -142,28 +144,57 @@ async def get_settings_info(db: Session = Depends(get_db)):
"background_jobs": {
"fetch_logs": {
"interval": f"{settings.fetch_interval} seconds",
"status": "running"
"description": "Imports logs from Mailcow API",
"status": jobs_status.get('fetch_logs', {}).get('status', 'unknown'),
"last_run": format_datetime_utc(jobs_status.get('fetch_logs', {}).get('last_run')),
"error": jobs_status.get('fetch_logs', {}).get('error')
},
"complete_correlations": {
"interval": f"{settings.correlation_check_interval} seconds ({settings.correlation_check_interval // 60} minutes)",
"status": "running",
"description": "Links Postfix logs to messages",
"status": jobs_status.get('complete_correlations', {}).get('status', 'unknown'),
"last_run": format_datetime_utc(jobs_status.get('complete_correlations', {}).get('last_run')),
"error": jobs_status.get('complete_correlations', {}).get('error'),
"pending_items": incomplete_correlations or 0
},
"update_final_status": {
"interval": f"{settings.correlation_check_interval} seconds ({settings.correlation_check_interval // 60} minutes)",
"description": "Updates final status for correlations with late-arriving Postfix logs",
"max_age": f"{settings.max_correlation_age_minutes} minutes",
"status": "running",
"status": jobs_status.get('update_final_status', {}).get('status', 'unknown'),
"last_run": format_datetime_utc(jobs_status.get('update_final_status', {}).get('last_run')),
"error": jobs_status.get('update_final_status', {}).get('error'),
"pending_items": correlations_needing_status or 0
},
"expire_correlations": {
"interval": "60 seconds (1 minute)",
"description": "Marks old incomplete correlations as expired",
"expire_after": f"{settings.max_correlation_age_minutes} minutes",
"status": "running"
"status": jobs_status.get('expire_correlations', {}).get('status', 'unknown'),
"last_run": format_datetime_utc(jobs_status.get('expire_correlations', {}).get('last_run')),
"error": jobs_status.get('expire_correlations', {}).get('error')
},
"cleanup_logs": {
"schedule": "Daily at 2 AM",
"description": "Removes old logs based on retention period",
"retention": f"{settings.retention_days} days",
"status": "scheduled"
"status": jobs_status.get('cleanup_logs', {}).get('status', 'unknown'),
"last_run": format_datetime_utc(jobs_status.get('cleanup_logs', {}).get('last_run')),
"error": jobs_status.get('cleanup_logs', {}).get('error')
},
"check_app_version": {
"interval": "6 hours",
"description": "Checks for application updates from GitHub",
"status": jobs_status.get('check_app_version', {}).get('status', 'unknown'),
"last_run": format_datetime_utc(jobs_status.get('check_app_version', {}).get('last_run')),
"error": jobs_status.get('check_app_version', {}).get('error')
},
"dns_check": {
"interval": "6 hours",
"description": "Validates DNS records (SPF, DKIM, DMARC) for all active domains",
"status": jobs_status.get('dns_check', {}).get('status', 'unknown'),
"last_run": format_datetime_utc(jobs_status.get('dns_check', {}).get('last_run')),
"error": jobs_status.get('dns_check', {}).get('error')
}
},
"recent_incomplete_correlations": [

View File

@@ -9,6 +9,7 @@ from typing import Dict, Any
from ..mailcow_api import mailcow_api
from ..version import __version__
from ..scheduler import check_app_version_update, get_app_version_cache
logger = logging.getLogger(__name__)
@@ -23,58 +24,6 @@ version_cache = {
"changelog": None
}
# Cache for app version check (check once per day)
app_version_cache = {
"checked_at": None,
"current_version": __version__, # Read from VERSION file
"latest_version": None,
"update_available": False,
"changelog": None
}
async def check_app_version_update():
"""
Check for app version updates from GitHub and update the cache.
This function can be called from both the API endpoint and the scheduler.
"""
global app_version_cache
logger.info("Checking app version and updates from GitHub...")
# Check GitHub for latest version
try:
async with httpx.AsyncClient(timeout=10) as client:
response = await client.get(
"https://api.github.com/repos/ShlomiPorush/mailcow-logs-viewer/releases/latest"
)
if response.status_code == 200:
release_data = response.json()
latest_version = release_data.get('tag_name', 'unknown')
# Remove 'v' prefix if present
if latest_version.startswith('v'):
latest_version = latest_version[1:]
changelog = release_data.get('body', '')
app_version_cache["latest_version"] = latest_version
app_version_cache["changelog"] = changelog
# Compare versions (simple string comparison)
app_version_cache["update_available"] = app_version_cache["current_version"] != latest_version
logger.info(f"App version check: Current={app_version_cache['current_version']}, Latest={latest_version}")
else:
logger.warning(f"GitHub API returned status {response.status_code}")
app_version_cache["latest_version"] = "unknown"
app_version_cache["update_available"] = False
except Exception as e:
logger.error(f"Failed to check GitHub for app updates: {e}")
app_version_cache["latest_version"] = "unknown"
app_version_cache["update_available"] = False
app_version_cache["checked_at"] = datetime.now(timezone.utc)
@router.get("/status/containers")
async def get_containers_status():
"""
@@ -237,7 +186,8 @@ async def get_app_version_status(force: bool = Query(False, description="Force a
force: If True, force a fresh check regardless of cache age
"""
try:
global app_version_cache
# Get cache from scheduler
app_version_cache = get_app_version_cache()
# Force check or check if cache is stale (more than 1 day old) and refresh if needed
# This is a fallback in case the scheduler hasn't run yet
@@ -246,6 +196,7 @@ async def get_app_version_status(force: bool = Query(False, description="Force a
app_version_cache["checked_at"] is None or
now - app_version_cache["checked_at"] > timedelta(days=1)):
await check_app_version_update()
app_version_cache = get_app_version_cache() # Get updated cache
# Format last_checked with UTC timezone indicator ('Z' suffix)
last_checked = None
@@ -269,13 +220,13 @@ async def get_app_version_status(force: bool = Query(False, description="Force a
except Exception as e:
logger.error(f"Error fetching app version status: {e}")
return {
"current_version": app_version_cache["current_version"],
"current_version": __version__,
"latest_version": "unknown",
"update_available": False,
"changelog": None,
"last_checked": None
"last_checked": None,
"error": str(e)
}
raise HTTPException(status_code=500, detail=str(e))
@router.get("/status/mailcow-info")

View File

@@ -5,6 +5,7 @@ import logging
import asyncio
import hashlib
import re
import httpx
from datetime import datetime, timedelta, timezone
from typing import Set, Optional, List, Dict, Any
from apscheduler.schedulers.asyncio import AsyncIOScheduler
@@ -18,10 +19,105 @@ from .database import get_db_context
from .mailcow_api import mailcow_api
from .models import PostfixLog, RspamdLog, NetfilterLog, MessageCorrelation
from .correlation import detect_direction, parse_postfix_message
from .routers.status import check_app_version_update
from .models import DomainDNSCheck
from .routers.domains import check_domain_dns, save_dns_check_to_db
logger = logging.getLogger(__name__)
# Job execution tracking
job_status = {
'fetch_logs': {'last_run': None, 'status': 'idle', 'error': None},
'complete_correlations': {'last_run': None, 'status': 'idle', 'error': None},
'update_final_status': {'last_run': None, 'status': 'idle', 'error': None},
'expire_correlations': {'last_run': None, 'status': 'idle', 'error': None},
'cleanup_logs': {'last_run': None, 'status': 'idle', 'error': None},
'check_app_version': {'last_run': None, 'status': 'idle', 'error': None},
'dns_check': {'last_run': None, 'status': 'idle', 'error': None}
}
def update_job_status(job_name: str, status: str, error: str = None):
"""Update job execution status"""
job_status[job_name] = {
'last_run': datetime.now(timezone.utc),
'status': status,
'error': error
}
def get_job_status():
"""Get all job statuses"""
return job_status
# App version cache (shared with status router)
app_version_cache = {
"checked_at": None,
"current_version": None, # Will be set on first check
"latest_version": None,
"update_available": False,
"changelog": None
}
async def check_app_version_update():
"""
Check for app version updates from GitHub and update the cache.
This function is called by the scheduler and can also be called from the API endpoint.
"""
update_job_status('check_app_version', 'running')
global app_version_cache
# Get current version from VERSION file
try:
from .version import __version__
current_version = __version__
app_version_cache["current_version"] = current_version
except Exception as e:
logger.error(f"Failed to read current version: {e}")
update_job_status('check_app_version', 'failed', str(e))
return
logger.info("Checking app version and updates from GitHub...")
# Check GitHub for latest version
try:
async with httpx.AsyncClient(timeout=10) as client:
response = await client.get(
"https://api.github.com/repos/ShlomiPorush/mailcow-logs-viewer/releases/latest"
)
if response.status_code == 200:
release_data = response.json()
latest_version = release_data.get('tag_name', 'unknown')
# Remove 'v' prefix if present
if latest_version.startswith('v'):
latest_version = latest_version[1:]
changelog = release_data.get('body', '')
app_version_cache["latest_version"] = latest_version
app_version_cache["changelog"] = changelog
# Compare versions (simple string comparison)
app_version_cache["update_available"] = current_version != latest_version
logger.info(f"App version check: Current={current_version}, Latest={latest_version}")
update_job_status('check_app_version', 'success')
else:
logger.warning(f"GitHub API returned status {response.status_code}")
app_version_cache["latest_version"] = "unknown"
app_version_cache["update_available"] = False
update_job_status('check_app_version', 'failed', f"GitHub API returned {response.status_code}")
except Exception as e:
logger.error(f"Failed to check GitHub for app updates: {e}")
app_version_cache["latest_version"] = "unknown"
app_version_cache["update_available"] = False
update_job_status('check_app_version', 'failed', str(e))
app_version_cache["checked_at"] = datetime.now(timezone.utc)
def get_app_version_cache():
"""Get app version cache (for API endpoint)"""
return app_version_cache
scheduler = AsyncIOScheduler()
seen_postfix: Set[str] = set()
@@ -34,7 +130,6 @@ last_fetch_run_time: Dict[str, Optional[datetime]] = {
'netfilter': None
}
def is_blacklisted(email: Optional[str]) -> bool:
"""
Check if email is in blacklist.
@@ -438,7 +533,9 @@ async def fetch_and_store_netfilter():
async def fetch_all_logs():
"""Fetch all log types concurrently"""
try:
update_job_status('fetch_logs', 'running')
logger.debug("[FETCH] Starting fetch_all_logs")
results = await asyncio.gather(
fetch_and_store_postfix(),
fetch_and_store_rspamd(),
@@ -452,7 +549,10 @@ async def fetch_all_logs():
logger.error(f"[ERROR] {log_type} fetch failed: {result}", exc_info=result)
logger.debug("[FETCH] Completed fetch_all_logs")
update_job_status('fetch_logs', 'success')
except Exception as e:
update_job_status('fetch_logs', 'failed', str(e))
logger.error(f"[ERROR] Fetch all logs error: {e}", exc_info=True)
@@ -748,6 +848,7 @@ async def complete_incomplete_correlations():
This handles the case where rspamd was processed before postfix logs arrived.
"""
update_job_status('complete_correlations', 'running')
try:
with get_db_context() as db:
# Find incomplete correlations (have message_id but missing queue_id or postfix logs)
@@ -823,9 +924,11 @@ async def complete_incomplete_correlations():
if completed_count > 0:
logger.info(f"[OK] Completed {completed_count} correlations")
update_job_status('complete_correlations', 'success')
except Exception as e:
logger.error(f"[ERROR] Complete correlations error: {e}")
update_job_status('complete_correlations', 'failed', str(e))
async def expire_old_correlations():
@@ -841,6 +944,7 @@ async def expire_old_correlations():
Uses datetime.utcnow() (naive) to match the naive datetime in created_at.
"""
update_job_status('expire_correlations', 'running')
try:
with get_db_context() as db:
# Use naive datetime for comparison (DB stores naive UTC)
@@ -867,9 +971,11 @@ async def expire_old_correlations():
if expired_count > 0:
logger.info(f"[EXPIRED] Marked {expired_count} correlations as expired (older than {settings.max_correlation_age_minutes}min)")
update_job_status('expire_correlations', 'success')
except Exception as e:
logger.error(f"[ERROR] Expire correlations error: {e}")
update_job_status('expire_correlations', 'failed', str(e))
async def update_final_status_for_correlations():
@@ -886,6 +992,7 @@ async def update_final_status_for_correlations():
This runs independently from correlation creation to ensure we catch
late-arriving Postfix logs.
"""
update_job_status('update_final_status', 'running')
try:
with get_db_context() as db:
# Only check correlations within Max Correlation Age
@@ -956,9 +1063,11 @@ async def update_final_status_for_correlations():
if updated_count > 0:
logger.info(f"[STATUS] Updated final_status for {updated_count} correlations")
update_job_status('update_final_status', 'success')
except Exception as e:
logger.error(f"[ERROR] Update final status error: {e}")
update_job_status('update_final_status', 'failed', str(e))
# =============================================================================
@@ -967,6 +1076,7 @@ async def update_final_status_for_correlations():
async def cleanup_old_logs():
"""Delete logs older than retention period"""
update_job_status('cleanup_logs', 'running')
try:
with get_db_context() as db:
cutoff_date = datetime.now(timezone.utc) - timedelta(
@@ -995,9 +1105,11 @@ async def cleanup_old_logs():
if total > 0:
logger.info(f"[CLEANUP] Cleaned up {total} old entries")
update_job_status('cleanup_logs', 'success')
except Exception as e:
logger.error(f"[ERROR] Cleanup error: {e}")
update_job_status('cleanup_logs', 'failed', str(e))
def cleanup_blacklisted_data():
@@ -1094,6 +1206,43 @@ def cleanup_blacklisted_data():
logger.error(f"[BLACKLIST] Cleanup error: {e}")
async def check_all_domains_dns_background():
    """Background job to check DNS for all domains"""
    logger.info("Starting background DNS check...")
    update_job_status('dns_check', 'running')
    try:
        domains = await mailcow_api.get_domains()
        if not domains:
            # No domains to check still counts as a successful run
            update_job_status('dns_check', 'success')
            return
        checked_count = 0
        for domain_data in domains:
            domain_name = domain_data.get('domain_name')
            if not domain_name or domain_data.get('active', 0) != 1:
                continue
            try:
                dns_data = await check_domain_dns(domain_name)
                with get_db_context() as db:
                    await save_dns_check_to_db(db, domain_name, dns_data, is_full_check=True)
                checked_count += 1
                # Brief pause between domains to avoid hammering resolvers
                await asyncio.sleep(0.5)
            except Exception as e:
                logger.error(f"Failed DNS check for {domain_name}: {e}")
        logger.info(f"DNS check completed: {checked_count} domains")
        update_job_status('dns_check', 'success')
    except Exception as e:
        logger.error(f"Background DNS check failed: {e}")
        update_job_status('dns_check', 'failed', str(e))
# =============================================================================
# SCHEDULER SETUP
# =============================================================================
@@ -1177,6 +1326,24 @@ def start_scheduler():
next_run_time=datetime.now(timezone.utc) # Run immediately on startup
)
    # Job 8: DNS Check
    scheduler.add_job(
        check_all_domains_dns_background,
        trigger=IntervalTrigger(hours=6),
        id='dns_check_background',
        name='DNS Check (All Domains)',
        replace_existing=True,
        max_instances=1
    )
    scheduler.add_job(
        check_all_domains_dns_background,
        'date',
        run_date=datetime.now(timezone.utc) + timedelta(seconds=30),
        id='dns_check_startup',
        name='DNS Check (Startup)'
    )
    scheduler.start()
    logger.info("[OK] Scheduler started")
@@ -1186,6 +1353,7 @@ def start_scheduler():
logger.info(f" [STATUS] Update final status: every {settings.correlation_check_interval}s (max age: {settings.max_correlation_age_minutes}min)")
logger.info(f" [EXPIRE] Old correlations: every 60s (expire after {settings.max_correlation_age_minutes}min)")
logger.info(f" [VERSION] Check app version updates: every 6 hours")
logger.info(f" [DNS] Check all domains DNS: every 6 hours")
# Log blacklist status
blacklist = settings.blacklist_emails_list

@@ -12,17 +12,18 @@ This document describes all available API endpoints for the Mailcow Logs Viewer
1. [Authentication](#authentication)
2. [Health & Info](#health--info)
3. [Job Status Tracking](#job-status-tracking)
4. [Domains](#domains)
5. [Messages (Unified View)](#messages-unified-view)
6. [Logs](#logs)
- [Postfix Logs](#postfix-logs)
- [Rspamd Logs](#rspamd-logs)
- [Netfilter Logs](#netfilter-logs)
7. [Queue & Quarantine](#queue--quarantine)
8. [Statistics](#statistics)
9. [Status](#status)
10. [Settings](#settings)
11. [Export](#export)
---
@@ -81,7 +82,7 @@ Health check endpoint for monitoring and load balancers.
{
"status": "healthy",
"database": "connected",
"version": "1.4.3",
"version": "1.4.9",
"config": {
"fetch_interval": 60,
"retention_days": 7,
@@ -102,7 +103,7 @@ Application information and configuration.
```json
{
"name": "Mailcow Logs Viewer",
"version": "1.4.3",
"version": "1.4.9",
"mailcow_url": "https://mail.example.com",
"local_domains": ["example.com", "mail.example.com"],
"fetch_interval": 60,
@@ -117,39 +118,118 @@ Application information and configuration.
---
## Job Status Tracking
### Overview
The application includes a real-time job status tracking system that monitors all background jobs. Each job reports its execution status, timestamp, and any errors that occurred.
### Job Status Data Structure
```python
job_status = {
'fetch_logs': {'last_run': datetime, 'status': str, 'error': str|None},
'complete_correlations': {'last_run': datetime, 'status': str, 'error': str|None},
'update_final_status': {'last_run': datetime, 'status': str, 'error': str|None},
'expire_correlations': {'last_run': datetime, 'status': str, 'error': str|None},
'cleanup_logs': {'last_run': datetime, 'status': str, 'error': str|None},
'check_app_version': {'last_run': datetime, 'status': str, 'error': str|None},
'dns_check': {'last_run': datetime, 'status': str, 'error': str|None}
}
```
### Status Values
| Status | Description | Badge Color |
|--------|-------------|-------------|
| `running` | Job is currently executing | Blue (bg-blue-500) |
| `success` | Job completed successfully | Green (bg-green-600) |
| `failed` | Job encountered an error | Red (bg-red-600) |
| `idle` | Job hasn't run yet | Gray (bg-gray-500) |
| `scheduled` | Job is scheduled but runs infrequently | Purple (bg-purple-600) |
### Accessing Job Status
Job status is accessible through:
1. **Backend Function**: `get_job_status()` in `scheduler.py`
2. **API Endpoint**: `GET /api/settings/info` (includes `background_jobs` field)
3. **Frontend Display**: Settings page > Background Jobs section
### Background Jobs List
| Job Name | Interval | Description |
|----------|----------|-------------|
| **Fetch Logs** | 60 seconds | Imports Postfix, Rspamd, and Netfilter logs from Mailcow API |
| **Complete Correlations** | 120 seconds (2 min) | Links Postfix logs to message correlations |
| **Update Final Status** | 120 seconds (2 min) | Updates message delivery status for late-arriving logs |
| **Expire Correlations** | 60 seconds (1 min) | Marks old incomplete correlations as expired (after 10 minutes) |
| **Cleanup Logs** | Daily at 2 AM | Removes logs older than retention period |
| **Check App Version** | 6 hours | Checks GitHub for application updates |
| **DNS Check** | 6 hours | Validates DNS records (SPF, DKIM, DMARC) for all active domains |
### Implementation Details
**Update Function:**
```python
def update_job_status(job_name: str, status: str, error: str = None):
"""Update job execution status"""
job_status[job_name] = {
'last_run': datetime.now(timezone.utc),
'status': status,
'error': error
}
```
**Usage in Jobs:**
```python
async def some_background_job():
try:
update_job_status('job_name', 'running')
# ... job logic ...
update_job_status('job_name', 'success')
except Exception as e:
update_job_status('job_name', 'failed', str(e))
```
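To avoid repeating this boilerplate in every job, the same pattern could be factored into a decorator. This is a sketch only — the `track_job_status` helper is hypothetical and not part of the codebase:

```python
import functools
from datetime import datetime, timezone

# In-memory registry, mirroring the job_status dict shown above
job_status = {}

def update_job_status(job_name, status, error=None):
    """Record a job's latest execution state (as in scheduler.py)."""
    job_status[job_name] = {
        'last_run': datetime.now(timezone.utc),
        'status': status,
        'error': error,
    }

def track_job_status(job_name):
    """Wrap an async job so it reports running/success/failed automatically."""
    def decorator(func):
        @functools.wraps(func)
        async def wrapper(*args, **kwargs):
            update_job_status(job_name, 'running')
            try:
                result = await func(*args, **kwargs)
                update_job_status(job_name, 'success')
                return result
            except Exception as e:
                update_job_status(job_name, 'failed', str(e))
                raise
        return wrapper
    return decorator
```

A decorated job then only contains its own logic; status reporting happens in one place.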
**UI Display:**
- Compact card layout with status badges
- Icon indicators (⏱ ⏳ 📅 🗂 📋)
- Last run timestamp always visible
- Error messages displayed in red alert boxes
- Pending items count for correlation jobs
---
## Domains
### GET /api/domains/all
Get list of all domains with statistics and cached DNS validation results.
**Response:**
```json
{
"total": 10,
"active": 8,
"last_dns_check": "2026-01-08T01:34:08Z",
"domains": [
{
"domain_name": "example.com",
"description": "Main domain",
"aliases": "example.com,mail.example.com",
"mailboxes": 25,
"mailbox_quota": 102400,
"max_num_aliases_for_domain": 400,
"max_num_mboxes_for_domain": 1000,
"max_quota_for_domain": 10240000,
"quota_used_in_domain": 1572864,
"bytes_total": 1572864,
"msgs_total": 1234,
"active": true,
"mboxes_in_domain": 5,
"mboxes_left": 995,
"max_num_mboxes_for_domain": 1000,
"aliases_in_domain": 3,
"aliases_left": 397,
"max_num_aliases_for_domain": 400,
"created": "2025-01-01T00:00:00Z",
"active": true,
"backupmx": 0,
"relay_all_recipients": 0,
"relay_unknown_only": 0,
"bytes_total": 1572864,
"msgs_total": 1234,
"quota_used_in_domain": "1572864",
"max_quota_for_domain": 10240000,
"backupmx": false,
"relay_all_recipients": false,
"relay_unknown_only": false,
"dns_checks": {
"spf": {
"status": "success",
@@ -165,8 +245,8 @@ Get list of all domains with statistics and DNS validation.
"message": "DKIM configured correctly",
"selector": "dkim",
"dkim_domain": "dkim._domainkey.example.com",
"expected_record": "v=DKIM1;k=rsa;...",
"actual_record": "v=DKIM1;k=rsa;...",
"expected_record": "v=DKIM1;k=rsa;p=MIIBIjANBg...",
"actual_record": "v=DKIM1;k=rsa;p=MIIBIjANBg...",
"match": true
},
"dmarc": {
@@ -175,36 +255,222 @@ Get list of all domains with statistics and DNS validation.
"record": "v=DMARC1; p=reject; rua=mailto:dmarc@example.com",
"policy": "reject",
"subdomain_policy": null,
"pct": "100"
}
"pct": "100",
"is_strong": true,
"warnings": []
},
"checked_at": "2026-01-08T01:34:08Z"
}
}
]
}
```
**Response Fields:**
- `total`: Total number of domains
- `active`: Number of active domains
- `last_dns_check`: Timestamp of last global DNS check (only updated by scheduled or manual full checks)
- `domains`: Array of domain objects
**Domain Object Fields:**
- `domain_name`: Domain name
- `active`: Boolean indicating if domain is active
- `mboxes_in_domain`: Number of mailboxes
- `mboxes_left`: Available mailbox slots
- `max_num_mboxes_for_domain`: Maximum mailboxes allowed
- `aliases_in_domain`: Number of aliases
- `aliases_left`: Available alias slots
- `max_num_aliases_for_domain`: Maximum aliases allowed
- `created`: Domain creation timestamp (UTC)
- `bytes_total`: Total storage used (bytes)
- `msgs_total`: Total messages
- `quota_used_in_domain`: Storage quota used (string format)
- `max_quota_for_domain`: Maximum storage quota
- `backupmx`: Boolean - true if domain is backup MX
- `relay_all_recipients`: Boolean - true if relaying all recipients
- `relay_unknown_only`: Boolean - true if relaying only unknown recipients
- `dns_checks`: DNS validation results (cached from database)
**DNS Check Status Values:**
- `success`: Check passed with no issues
- `warning`: Check passed but with recommendations for improvement
- `error`: Check failed or record not found
- `unknown`: Check not yet performed
**SPF Status Indicators:**
- `-all`: Strict policy (status: success)
- `~all`: Soft fail (status: warning) - Consider using -all for stricter policy
- `?all`: Neutral (status: warning) - Provides minimal protection
- `+all`: Pass all (status: error) - Provides no protection
- Missing `all`: No policy defined (status: error)
**DKIM Validation:**
- Fetches expected DKIM record from Mailcow API
- Queries DNS for actual DKIM record
- Compares expected vs actual records
- `match`: Boolean indicating if records match
**DMARC Policy Types:**
- `reject`: Strict policy (status: success)
- `quarantine`: Moderate policy (status: warning) - Consider upgrading to reject
- `none`: Monitor only (status: warning) - Provides no protection
**Notes:**
- DNS checks are cached in the database for performance
- `last_dns_check` only updates from global/scheduled checks, not individual domain checks
- `checked_at` (per domain) updates whenever that specific domain is checked
- All timestamps include UTC timezone indicator ('Z' suffix)
---
### POST /api/domains/check-all-dns
Manually trigger DNS validation for all active domains.
**Description:**
Performs DNS checks (SPF, DKIM, DMARC) for all active domains and updates the global `last_dns_check` timestamp. Results are cached in the database.
**Authentication:** Required
**Response:**
```json
{
"status": "success",
"message": "Checked 8 domains",
"domains_checked": 8,
"errors": []
}
```
**Response Fields:**
- `status`: `success` (all domains checked) or `partial` (some domains failed)
- `message`: Summary message
- `domains_checked`: Number of domains successfully checked
- `errors`: Array of error messages for failed domains (empty if all successful)
**Error Response (partial success):**
```json
{
"status": "partial",
"message": "Checked 7 domains",
"domains_checked": 7,
"errors": [
"example.com: DNS timeout"
]
}
```
**Notes:**
- Only checks active domains
- Updates `is_full_check=true` flag in database
- Updates global `last_dns_check` timestamp
- Frontend shows progress with toast notifications
- The request runs the checks and returns once all domains are processed (as reflected in `domains_checked` and `errors`)
---
### POST /api/domains/{domain}/check-dns
Manually trigger DNS validation for a specific domain.
**Path Parameters:**
| Parameter | Type | Description |
|-----------|------|-------------|
| `domain` | string | Domain name to check |
**Authentication:** Required
**Example Request:**
```
POST /api/domains/example.com/check-dns
```
**Response:**
```json
{
"status": "success",
"message": "DNS checked for example.com",
"data": {
"domain": "example.com",
"spf": {
"status": "success",
"message": "SPF configured correctly with strict -all policy",
"record": "v=spf1 mx include:_spf.google.com -all",
"has_strict_all": true,
"includes_mx": true,
"includes": ["_spf.google.com"],
"warnings": []
},
"dkim": {
"status": "success",
"message": "DKIM configured correctly",
"selector": "dkim",
"dkim_domain": "dkim._domainkey.example.com",
"expected_record": "v=DKIM1;k=rsa;p=MIIBIjANBg...",
"actual_record": "v=DKIM1;k=rsa;p=MIIBIjANBg...",
"match": true
},
"dmarc": {
"status": "success",
"message": "DMARC configured with strict policy",
"record": "v=DMARC1; p=reject; rua=mailto:dmarc@example.com",
"policy": "reject",
"is_strong": true,
"warnings": []
},
"checked_at": "2026-01-08T01:45:23Z"
}
}
```
**Notes:**
- Only checks the specified domain
- Updates `is_full_check=false` flag in database
- Does NOT update global `last_dns_check` timestamp
- Frontend updates only that domain's section (no page refresh)
- Useful for verifying DNS changes immediately
---
### DNS Check Technical Details
**Async DNS Validation:**
- All DNS queries use async resolvers with 5-second timeout
- Queries run in parallel for performance
- Comprehensive error handling for timeouts, NXDOMAIN, NoAnswer
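The concurrency pattern can be sketched as follows. The pluggable `resolve` coroutine stands in for a real async resolver (e.g. dnspython's `dns.asyncresolver`); the function names and the fixed `dkim` selector are illustrative assumptions, not the actual implementation:

```python
import asyncio

async def query_with_timeout(resolve, name, timeout=5.0):
    """Run one DNS lookup with a hard timeout, mapping failures to a status dict."""
    try:
        records = await asyncio.wait_for(resolve(name), timeout=timeout)
        return {'status': 'success', 'records': records}
    except asyncio.TimeoutError:
        return {'status': 'error', 'message': f'Timeout querying {name}'}
    except Exception as e:
        return {'status': 'error', 'message': str(e)}

async def check_domain_dns(domain, resolve):
    """Query the SPF, DKIM, and DMARC locations in parallel."""
    spf, dkim, dmarc = await asyncio.gather(
        query_with_timeout(resolve, domain),                       # TXT at apex -> SPF
        query_with_timeout(resolve, f'dkim._domainkey.{domain}'),  # DKIM selector record
        query_with_timeout(resolve, f'_dmarc.{domain}'),           # DMARC policy record
    )
    return {'spf': spf, 'dkim': dkim, 'dmarc': dmarc}
```

Because the three lookups are independent, `asyncio.gather` keeps total latency close to the slowest single query rather than the sum of all three.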
**SPF Validation:**
- Queries TXT records for SPF (`v=spf1`)
- Detects policy: `-all`, `~all`, `?all`, `+all`, or missing
- Checks for `mx` mechanism
- Extracts `include:` directives
- Provides policy-specific warnings
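The policy detection above could look roughly like this — a sketch with illustrative function names, not the actual implementation:

```python
def classify_spf(record):
    """Map an SPF record's 'all' mechanism to a check status."""
    if not record.startswith('v=spf1'):
        return ('error', 'Not an SPF record')
    mechanisms = record.split()
    if '-all' in mechanisms:
        return ('success', 'Strict -all policy')
    if '~all' in mechanisms:
        return ('warning', 'Soft fail (~all); consider -all')
    if '?all' in mechanisms:
        return ('warning', 'Neutral (?all) provides minimal protection')
    if '+all' in mechanisms or 'all' in mechanisms:
        return ('error', '+all provides no protection')
    return ('error', 'No "all" mechanism defined')

def spf_includes(record):
    """Extract include: targets, e.g. for display in the UI."""
    return [m.split(':', 1)[1] for m in record.split() if m.startswith('include:')]
```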
**DKIM Validation:**
- Fetches expected DKIM value from Mailcow API (`/api/v1/get/dkim/{domain}`)
- Queries DNS at `{selector}._domainkey.{domain}`
- Compares expected vs actual records (whitespace-normalized)
- Reports mismatch details
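The whitespace-normalized comparison might be sketched as below (helper names are illustrative; the real code may normalize differently):

```python
def normalize_dkim(record):
    """Strip whitespace and quote artifacts so split TXT chunks compare equal."""
    return ''.join(record.split()).replace('"', '')

def dkim_matches(expected, actual):
    """Whitespace-normalized comparison of expected vs published DKIM records."""
    return normalize_dkim(expected) == normalize_dkim(actual)
```

Normalizing both sides matters because long DKIM keys are often published as several quoted TXT string chunks, which resolvers may return with separators that differ from the Mailcow-provided value.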
**DMARC Validation:**
- Queries TXT records at `_dmarc.{domain}`
- Parses policy (`p=` tag)
- Checks for subdomain policy (`sp=` tag)
- Validates percentage (`pct=` tag)
- Provides policy upgrade recommendations
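Tag parsing for this step can be sketched as follows (illustrative helpers, not the actual code):

```python
def parse_dmarc(record):
    """Parse tag=value pairs from a DMARC TXT record."""
    tags = {}
    for part in record.split(';'):
        part = part.strip()
        if '=' in part:
            key, value = part.split('=', 1)
            tags[key.strip()] = value.strip()
    return tags

def dmarc_status(tags):
    """Map the p= policy to a check status, per the policy table above."""
    policy = tags.get('p')
    if policy == 'reject':
        return ('success', 'DMARC configured with strict policy')
    if policy == 'quarantine':
        return ('warning', 'Consider upgrading to p=reject')
    if policy == 'none':
        return ('warning', 'Monitor-only policy provides no protection')
    return ('error', 'No valid p= tag found')
```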
**Background Checks:**
- Automated DNS checks run every 6 hours via scheduler
- Only checks active domains
- All automated checks marked as `is_full_check=true`
- Results cached in `domain_dns_checks` table
**Caching:**
- DNS results stored in PostgreSQL with JSONB columns
- Indexed on `domain_name` and `checked_at` for performance
- Upsert pattern (update if exists, insert if new)
- `is_full_check` flag distinguishes check types
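The upsert pattern can be illustrated with an in-memory SQLite table standing in for the PostgreSQL one — the schema and function name below are simplified assumptions (the real table uses JSONB columns), but Postgres supports the same `ON CONFLICT ... DO UPDATE` clause:

```python
import json
import sqlite3

# Illustrative schema; the real table uses PostgreSQL JSONB for results
conn = sqlite3.connect(':memory:')
conn.execute("""
    CREATE TABLE domain_dns_checks (
        domain_name   TEXT PRIMARY KEY,
        results       TEXT NOT NULL,     -- JSONB in PostgreSQL
        is_full_check INTEGER NOT NULL,
        checked_at    TEXT NOT NULL
    )
""")

def save_dns_check(domain, results, is_full_check, checked_at):
    """Upsert: update the row if the domain exists, insert otherwise."""
    conn.execute(
        """
        INSERT INTO domain_dns_checks (domain_name, results, is_full_check, checked_at)
        VALUES (?, ?, ?, ?)
        ON CONFLICT(domain_name) DO UPDATE SET
            results = excluded.results,
            is_full_check = excluded.is_full_check,
            checked_at = excluded.checked_at
        """,
        (domain, json.dumps(results), int(is_full_check), checked_at),
    )
```

Keying on `domain_name` guarantees at most one cached row per domain, so repeated checks overwrite rather than accumulate history.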
---
@@ -816,18 +1082,28 @@ Get application version and check for updates from GitHub.
**Response:**
```json
{
"current_version": "1.4.6",
"latest_version": "1.4.6",
"current_version": "1.4.9",
"latest_version": "1.4.9",
"update_available": false,
"changelog": "Release notes in Markdown format...",
"last_checked": "2026-01-05T15:52:46Z"
"changelog": "### Added\n\n#### Background Jobs Enhanced UI\n- Compact layout...",
"last_checked": "2026-01-08T15:52:46Z"
}
```
**Implementation Notes:**
- Version checks are performed by the scheduler every 6 hours
- Results are cached in `app_version_cache` (managed by `scheduler.py`)
- Status endpoint retrieves cached data via `get_app_version_cache()`
- Use `force=true` parameter to bypass cache and trigger immediate check
- All timestamps include UTC timezone indicator ('Z' suffix)
- Changelog is retrieved from GitHub releases in Markdown format
**Version Check Process:**
1. Scheduler job `check_app_version_update` runs every 6 hours
2. Fetches latest release from `https://api.github.com/repos/ShlomiPorush/mailcow-logs-viewer/releases/latest`
3. Compares current version (from `/app/VERSION` file) with latest GitHub release
4. Updates cache with result and changelog
5. Job status tracked with `update_job_status()` (visible in Settings > Background Jobs)
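The comparison in step 3 should be numeric rather than lexicographic, so that e.g. `1.4.10` correctly beats `1.4.9`. A sketch (helper names are illustrative, not the actual code):

```python
def parse_version(version):
    """'1.4.9' -> (1, 4, 9); tolerates a leading 'v' as GitHub tags often carry one."""
    return tuple(int(part) for part in version.lstrip('v').split('.'))

def update_available(current, latest):
    """Tuple comparison gives correct numeric semver ordering."""
    return parse_version(latest) > parse_version(current)
```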
---
@@ -979,28 +1255,57 @@ Get system configuration and status information.
"background_jobs": {
"fetch_logs": {
"interval": "60 seconds",
"status": "running"
"description": "Imports logs from Mailcow API",
"status": "success",
"last_run": "2026-01-08T12:14:56Z",
"error": null
},
"complete_correlations": {
"interval": "120 seconds (2 minutes)",
"description": "Links Postfix logs to messages",
"status": "running",
"pending_items": 500
"last_run": "2026-01-08T12:13:56Z",
"error": null,
"pending_items": 93
},
"update_final_status": {
"interval": "120 seconds (2 minutes)",
"description": "Updates final status for correlations with late-arriving Postfix logs",
"max_age": "10 minutes",
"status": "running",
"pending_items": 150
"status": "success",
"last_run": "2026-01-08T12:13:56Z",
"error": null,
"pending_items": 25
},
"expire_correlations": {
"interval": "60 seconds (1 minute)",
"description": "Marks old incomplete correlations as expired",
"expire_after": "10 minutes",
"status": "running"
"status": "success",
"last_run": "2026-01-08T12:14:45Z",
"error": null
},
"cleanup_logs": {
"schedule": "Daily at 2 AM",
"description": "Removes old logs based on retention period",
"retention": "7 days",
"status": "scheduled"
"status": "scheduled",
"last_run": "2026-01-08T02:00:00Z",
"error": null
},
"check_app_version": {
"interval": "6 hours",
"description": "Checks for application updates from GitHub",
"status": "success",
"last_run": "2026-01-08T10:00:00Z",
"error": null
},
"dns_check": {
"interval": "6 hours",
"description": "Validates DNS records (SPF, DKIM, DMARC) for all active domains",
"status": "success",
"last_run": "2026-01-08T08:00:00Z",
"error": null
}
},
"recent_incomplete_correlations": [
@@ -1016,6 +1321,37 @@ Get system configuration and status information.
}
```
**Background Jobs Status Tracking:**
Each background job reports real-time execution status:
| Field | Type | Description |
|-------|------|-------------|
| `interval` / `schedule` | string | How often the job runs |
| `description` | string | Human-readable job description |
| `status` | string | Current status: `running`, `success`, `failed`, `idle`, `scheduled` |
| `last_run` | datetime | UTC timestamp of last execution (with 'Z' suffix) |
| `error` | string / null | Error message if job failed, otherwise null |
| `pending_items` | int | Number of items waiting (for correlation jobs only) |
| `max_age` / `expire_after` / `retention` | string | Job-specific configuration |
**Status Values:**
- `running` - Job is currently executing
- `success` - Job completed successfully
- `failed` - Job encountered an error
- `idle` - Job hasn't run yet
- `scheduled` - Job is scheduled but runs infrequently (e.g., daily cleanup)
**Job Descriptions:**
1. **fetch_logs**: Fetches Postfix, Rspamd, and Netfilter logs from Mailcow API every 60 seconds
2. **complete_correlations**: Links Postfix logs to message correlations every 2 minutes
3. **update_final_status**: Updates message delivery status when late-arriving Postfix logs are found
4. **expire_correlations**: Marks old incomplete correlations as expired after 10 minutes
5. **cleanup_logs**: Removes logs older than retention period (runs daily at 2 AM)
6. **check_app_version**: Checks GitHub for application updates every 6 hours
7. **dns_check**: Validates DNS records (SPF, DKIM, DMARC) for all active domains every 6 hours
---
### GET /settings/health

@@ -48,8 +48,6 @@ nano .env
| `MAILCOW_API_KEY` | Your Mailcow API key | `abc123-def456...` |
| `POSTGRES_PASSWORD` | Database password<br>⚠️ Avoid special chars (`@:/?#`) - breaks connection strings<br>💡 Use UUID: Linux/Mac: `uuidgen` <br> or online https://it-tools.tech/uuid-generator | Example: `a7f3c8e2-4b1d-4f9a-8c3e-7d2f1a9b5e4c` |
**Note:** Active domains are automatically fetched from Mailcow API (`/api/v1/get/domain/all`) - no need to configure `MAILCOW_LOCAL_DOMAINS` anymore!
**Review all other settings** and adjust as needed for your environment (timezone, fetch intervals, retention period, etc.)
**🔐 Optional: Enable Authentication**

@@ -12,9 +12,6 @@ MAILCOW_URL=https://mail.example.com
# Required permissions: Read access to logs
MAILCOW_API_KEY=your-api-key-here
# Note: Active domains are automatically fetched from Mailcow API
# No need to configure MAILCOW_LOCAL_DOMAINS anymore
# =============================================================================
# DATABASE CONFIGURATION
# =============================================================================

@@ -959,12 +959,14 @@ async function loadRecentActivity() {
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 5l7 7-7 7"></path>
</svg>
<span class="text-sm text-gray-600 dark:text-gray-300">${escapeHtml(msg.recipient || 'Unknown')}</span>
</div>
<p class="text-xs text-gray-500 dark:text-gray-400 truncate" title="${escapeHtml(msg.subject || 'No subject')}">${escapeHtml(msg.subject || 'No subject')}</p>
</div>
<div class="flex items-center gap-2 flex-shrink-0 sm:justify-end">
<div class="flex flex-col items-end gap-1 flex-shrink-0">
<div class="flex items-center gap-2">
<span class="inline-block px-2 py-1 text-xs font-medium rounded ${getStatusClass(msg.status)}">${msg.status || 'unknown'}</span>
${msg.direction ? `<span class="inline-block px-2 py-0.5 text-xs font-medium rounded ${getDirectionClass(msg.direction)}">${msg.direction}</span>` : ''}
</div>
<p class="text-xs text-gray-500 dark:text-gray-400 whitespace-nowrap">${formatTime(msg.time)}</p>
</div>
</div>
@@ -1894,11 +1896,13 @@ function renderStatusJobs(jobs) {
const container = document.getElementById('status-jobs');
container.innerHTML = `
<div class="space-y-3">
${renderJobCard('Fetch Logs', jobs.fetch_logs)}
${renderJobCard('Complete Correlations', jobs.complete_correlations)}
${renderJobCard('Update Final Status', jobs.update_final_status)}
${renderJobCard('Expire Correlations', jobs.expire_correlations)}
${renderJobCard('Cleanup Logs', jobs.cleanup_logs)}
${renderJobCard('Check App Version', jobs.check_app_version)}
${renderJobCard('DNS Check (All Domains)', jobs.dns_check)}
</div>
`;
}
@@ -2994,6 +2998,29 @@ async function loadDomains() {
function renderDomains(container, data) {
const domains = data.domains || [];
const dnsCheckInfo = document.getElementById('dns-check-info');
if (dnsCheckInfo) {
const lastCheck = data.last_dns_check
? formatTime(data.last_dns_check)
: '<span class="text-gray-400">Never</span>';
dnsCheckInfo.innerHTML = `
<div class="text-right">
<p class="text-xs text-gray-500 dark:text-gray-400">Last checked:</p>
<p class="text-sm font-medium text-gray-900 dark:text-white">${lastCheck}</p>
</div>
<button
id="check-all-dns-btn"
onclick="checkAllDomainsDNS()"
class="px-4 py-2 bg-blue-600 hover:bg-blue-700 text-white rounded-lg transition text-sm font-medium flex items-center gap-2">
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15"></path>
</svg>
Check Now
</button>
`;
}
if (domains.length === 0) {
container.innerHTML = `
<div class="text-center py-12">
@@ -3324,12 +3351,31 @@ function renderDomainAccordionRow(domain) {
<!-- DNS Checks -->
<div class="p-6">
<h4 class="text-sm font-semibold text-gray-900 dark:text-white mb-4 flex items-center gap-2">
<div class="flex items-center justify-between mb-4">
<h4 class="text-sm font-semibold text-gray-900 dark:text-white flex items-center gap-2">
<svg class="w-5 h-5 text-blue-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 12l2 2 4-4m5.618-4.016A11.955 11.955 0 0112 2.944a11.955 11.955 0 01-8.618 3.04A12.02 12.02 0 003 9c0 5.591 3.824 10.29 9 11.622 5.176-1.332 9-6.03 9-11.622 0-1.042-.133-2.052-.382-3.016z"></path>
</svg>
DNS Security Records
</h4>
<div class="flex items-center gap-3">
<div class="text-right">
<p class="text-xs text-gray-500 dark:text-gray-400">Last checked:</p>
<p class="text-xs font-medium text-gray-900 dark:text-white">
${dns.checked_at ? formatTime(dns.checked_at) : '<span class="text-gray-400">Not checked</span>'}
</p>
</div>
<button
onclick="event.stopPropagation(); checkSingleDomainDNS('${escapeHtml(domain.domain_name)}')"
class="px-3 py-1.5 text-xs bg-blue-600 hover:bg-blue-700 text-white rounded transition flex items-center gap-1.5"
title="Check DNS for this domain">
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15"></path>
</svg>
Check
</button>
</div>
</div>
<div class="grid grid-cols-1 lg:grid-cols-3 gap-4">
${renderDNSCheck('SPF', spf)}
${renderDNSCheck('DKIM', dkim)}
@@ -3428,6 +3474,165 @@ function renderDNSCheck(type, check) {
`;
}
let dnsCheckInProgress = false;
async function checkAllDomainsDNS() {
if (dnsCheckInProgress) {
showToast('DNS check already in progress', 'warning');
return;
}
const button = document.getElementById('check-all-dns-btn');
if (button) {
button.disabled = true;
button.innerHTML = '<svg class="animate-spin w-4 h-4" fill="none" viewBox="0 0 24 24"><circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4"></circle><path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"></path></svg> Checking...';
}
dnsCheckInProgress = true;
try {
const response = await authenticatedFetch('/api/domains/check-all-dns', {
method: 'POST'
});
const result = await response.json();
if (result.status === 'success') {
showToast(`✓ Checked ${result.domains_checked} domains`, 'success');
setTimeout(() => loadDomains(), 1000);
} else {
showToast('DNS check failed', 'error');
}
} catch (error) {
console.error('Failed:', error);
showToast('Failed to check DNS', 'error');
} finally {
dnsCheckInProgress = false;
if (button) {
button.disabled = false;
button.innerHTML = '<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15"></path></svg> Check Now';
}
}
}
async function checkSingleDomainDNS(domainName) {
if (dnsCheckInProgress) {
showToast('DNS check already in progress', 'warning');
return;
}
dnsCheckInProgress = true;
showToast(`Checking DNS for ${domainName}...`, 'info');
// Locate this domain's expanded details section for a partial update
const domainId = `domain-${domainName.replace(/\./g, '-')}`;
const detailsDiv = document.getElementById(`${domainId}-details`);
try {
const response = await authenticatedFetch(`/api/domains/${encodeURIComponent(domainName)}/check-dns`, {
method: 'POST'
});
const result = await response.json();
if (result.status === 'success') {
showToast(`✓ DNS checked for ${domainName}`, 'success');
// Update only this domain's DNS section
if (detailsDiv) {
const dnsSection = detailsDiv.querySelector('.p-6:last-child');
if (dnsSection) {
// Get updated domain data
const domainsResponse = await authenticatedFetch('/api/domains/all');
const domainsData = await domainsResponse.json();
const updatedDomain = domainsData.domains.find(d => d.domain_name === domainName);
if (updatedDomain) {
// Re-render just the DNS section
const dns = updatedDomain.dns_checks || {};
const spf = dns.spf || { status: 'unknown', message: 'Not checked' };
const dkim = dns.dkim || { status: 'unknown', message: 'Not checked' };
const dmarc = dns.dmarc || { status: 'unknown', message: 'Not checked' };
dnsSection.innerHTML = `
<div class="flex items-center justify-between mb-4">
<h4 class="text-sm font-semibold text-gray-900 dark:text-white flex items-center gap-2">
<svg class="w-5 h-5 text-blue-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 12l2 2 4-4m5.618-4.016A11.955 11.955 0 0112 2.944a11.955 11.955 0 01-8.618 3.04A12.02 12.02 0 003 9c0 5.591 3.824 10.29 9 11.622 5.176-1.332 9-6.03 9-11.622 0-1.042-.133-2.052-.382-3.016z"></path>
</svg>
DNS Security Records
</h4>
<div class="flex items-center gap-3">
<div class="text-right">
<p class="text-xs text-gray-500 dark:text-gray-400">Last checked:</p>
<p class="text-xs font-medium text-gray-900 dark:text-white">
${dns.checked_at ? formatTime(dns.checked_at) : '<span class="text-gray-400">Not checked</span>'}
</p>
</div>
<button
data-domain="${escapeHtml(updatedDomain.domain_name)}"
onclick="event.stopPropagation(); checkSingleDomainDNS(this.dataset.domain)"
class="px-3 py-1.5 text-xs bg-blue-600 hover:bg-blue-700 text-white rounded transition flex items-center gap-1.5"
title="Check DNS for this domain">
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15"></path>
</svg>
Check
</button>
</div>
</div>
<div class="grid grid-cols-1 lg:grid-cols-3 gap-4">
${renderDNSCheck('SPF', spf)}
${renderDNSCheck('DKIM', dkim)}
${renderDNSCheck('DMARC', dmarc)}
</div>
`;
// Update inline badges in summary row
const summaryRow = document.querySelector(`[onclick*="toggleDomainDetails('${domainId}')"]`);
if (summaryRow) {
const getStatusIcon = (status) => {
if (status === 'success') return '<span class="text-green-500" title="OK">✓</span>';
if (status === 'warning') return '<span class="text-amber-500" title="Warning">⚠</span>';
if (status === 'error') return '<span class="text-red-500" title="Error">✗</span>';
return '<span class="text-gray-400" title="Unknown">?</span>';
};
const badgesContainer = summaryRow.querySelector('.flex.items-center.gap-2.text-base');
if (badgesContainer) {
badgesContainer.innerHTML = `
<span class="flex items-center gap-1">
<span class="text-xs text-gray-500 dark:text-gray-400">SPF:</span>
${getStatusIcon(spf.status)}
</span>
<span class="flex items-center gap-1">
<span class="text-xs text-gray-500 dark:text-gray-400">DKIM:</span>
${getStatusIcon(dkim.status)}
</span>
<span class="flex items-center gap-1">
<span class="text-xs text-gray-500 dark:text-gray-400">DMARC:</span>
${getStatusIcon(dmarc.status)}
</span>
`;
}
}
}
}
}
} else {
showToast(`Failed to check DNS for ${domainName}`, 'error');
}
} catch (error) {
console.error('Failed to check DNS:', error);
showToast('Failed to check DNS', 'error');
} finally {
dnsCheckInProgress = false;
}
}
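For reference, the single-domain flow above ends up calling the `POST /api/domains/{domain}/check-dns` endpoint introduced in this release. A minimal sketch of building and issuing that request (the helper names and the response shape are assumptions, not the project's actual code):

```javascript
// Hypothetical helper: builds the endpoint path used by the single-domain check.
// Encoding the domain guards against characters that are unsafe in a URL path.
function dnsCheckUrl(domainName) {
  return `/api/domains/${encodeURIComponent(domainName)}/check-dns`;
}

// Hypothetical fetch wrapper; the response shape is an assumption.
async function fetchDomainDNS(domainName) {
  const res = await fetch(dnsCheckUrl(domainName), { method: 'POST' });
  if (!res.ok) throw new Error(`DNS check failed: ${res.status}`);
  return res.json(); // assumed shape: { spf, dkim, dmarc, checked_at }
}
```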
function formatBytes(bytes) {
if (bytes === 0 || bytes === '0') return '0 B';
const k = 1024;
@@ -4013,43 +4218,110 @@ function renderImportCard(title, data, color) {
`;
}
function renderJobCard(name, job) {
if (!job) {
return '';
}
let statusBadge = '';
switch(job.status) {
case 'running':
statusBadge = '<span class="px-2 py-1 text-xs font-medium rounded bg-blue-500 text-white">running</span>';
break;
case 'success':
statusBadge = '<span class="px-2 py-1 text-xs font-medium rounded bg-green-600 dark:bg-green-500 text-white">success</span>';
break;
case 'failed':
statusBadge = '<span class="px-2 py-1 text-xs font-medium rounded bg-red-600 dark:bg-red-500 text-white">failed</span>';
break;
case 'scheduled':
statusBadge = '<span class="px-2 py-1 text-xs font-medium rounded bg-purple-600 dark:bg-purple-500 text-white">scheduled</span>';
break;
default:
statusBadge = '<span class="px-2 py-1 text-xs font-medium rounded bg-gray-500 text-white">idle</span>';
}
return `
<div class="p-3 bg-gray-50 dark:bg-gray-700/30 rounded-lg">
<div class="flex items-start justify-between gap-3 mb-2">
<div class="flex-1 min-w-0">
<h4 class="font-semibold text-gray-900 dark:text-white text-sm">${name}</h4>
<p class="text-xs text-gray-500 dark:text-gray-400 mt-0.5">${job.description || ''}</p>
</div>
${statusBadge}
</div>
<div class="flex flex-wrap gap-x-4 gap-y-1 text-xs text-gray-600 dark:text-gray-400">
${job.interval ? `<span>${job.interval}</span>` : ''}
${job.schedule ? `<span>📅 ${job.schedule}</span>` : ''}
${job.retention ? `<span>🗂 ${job.retention}</span>` : ''}
${job.max_age ? `<span>Max: ${job.max_age}</span>` : ''}
${job.expire_after ? `<span>⏱ Expire: ${job.expire_after}</span>` : ''}
${job.pending_items !== undefined ? `<span class="font-medium text-yellow-600 dark:text-yellow-400">📋 Pending: ${job.pending_items}</span>` : ''}
</div>
${job.last_run ? `
<div class="mt-2 pt-2 border-t border-gray-200 dark:border-gray-600">
<p class="text-xs text-gray-500 dark:text-gray-400">
Last run: <span class="text-gray-900 dark:text-white font-medium">${formatTime(job.last_run)}</span>
</p>
</div>
` : ''}
${job.error ? `
<div class="mt-2 p-2 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded">
<p class="text-xs text-red-700 dark:text-red-300 font-mono break-all">${escapeHtml(job.error)}</p>
</div>
` : ''}
</div>
`;
}
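The card renderer above consumes a job object from the scheduler status payload. A sketch of the assumed field names and the "idle" fallback behavior (sample values are illustrative, not real API output):

```javascript
// Illustrative job payload; field names mirror the template above, values are made up.
const sampleJob = {
  status: 'success',
  description: 'Runs DNS checks for all active domains',
  interval: '6h',
  pending_items: 0,
  last_run: '2026-01-08T10:00:00Z',
};

// Mirrors the switch above: any unrecognized status falls through to "idle".
function badgeLabel(status) {
  return ['running', 'success', 'failed', 'scheduled'].includes(status) ? status : 'idle';
}

console.log(badgeLabel(sampleJob.status)); // "success"
console.log(badgeLabel('paused'));         // "idle"
```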
function showToast(message, type = 'info') {
// Remove existing toast if any
const existingToast = document.getElementById('toast-notification');
if (existingToast) {
existingToast.remove();
}
const colors = {
'success': 'bg-green-100 dark:bg-green-900/30 text-green-800 dark:text-green-200 border-green-500',
'error': 'bg-red-100 dark:bg-red-900/30 text-red-800 dark:text-red-200 border-red-500',
'warning': 'bg-yellow-100 dark:bg-yellow-900/30 text-yellow-800 dark:text-yellow-200 border-yellow-500',
'info': 'bg-blue-100 dark:bg-blue-900/30 text-blue-800 dark:text-blue-200 border-blue-500'
};
const icons = {
'success': '✓',
'error': '✗',
'warning': '⚠',
'info': 'ℹ'
};
const toast = document.createElement('div');
toast.id = 'toast-notification';
toast.className = `fixed bottom-4 right-4 z-50 ${colors[type] || colors.info} border-l-4 p-4 rounded shadow-lg max-w-md animate-slide-in`;
toast.innerHTML = `
<div class="flex items-start gap-3">
<span class="text-xl font-bold flex-shrink-0">${icons[type] || icons.info}</span>
<p class="text-sm flex-1">${escapeHtml(message)}</p>
<button onclick="this.parentElement.parentElement.remove()" class="text-lg font-bold hover:opacity-70 flex-shrink-0">×</button>
</div>
`;
document.body.appendChild(toast);
// Auto-remove after 4 seconds
setTimeout(() => {
if (toast.parentElement) {
toast.style.opacity = '0';
toast.style.transition = 'opacity 0.3s';
setTimeout(() => toast.remove(), 300);
}
}, 4000);
}
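showToast interpolates strings into innerHTML, so it relies on the `escapeHtml` helper defined elsewhere in this file. A typical implementation looks like the sketch below (the project's own version may differ):

```javascript
// Sketch of an HTML-escaping helper equivalent to the escapeHtml used above.
// Ampersands must be replaced first so earlier substitutions aren't double-escaped.
function escapeHtml(str) {
  return String(str)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}

console.log(escapeHtml('<b>"hi" & bye</b>'));
```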
// =============================================================================
// CONSOLE LOG
// =============================================================================


@@ -700,9 +700,15 @@
<!-- Domains Tab -->
<div id="content-domains" class="tab-content hidden">
<div class="mb-6">
<div class="flex flex-col lg:flex-row lg:items-center lg:justify-between gap-4">
<div class="text-center lg:text-left">
<h2 class="text-2xl font-bold text-gray-900 dark:text-white">Domains Overview</h2>
<p class="text-gray-600 dark:text-gray-400 mt-1">View domain settings with DNS record validation</p>
</div>
<div id="dns-check-info" class="flex items-center gap-3 justify-center lg:justify-end">
</div>
</div>
</div>
<div id="domains-loading" class="text-center py-12">
<div class="loading mx-auto mb-4"></div>
<p class="text-gray-500 dark:text-gray-400">Loading domains...</p>

BIN  images/Domains.png (new file, 86 KiB)
BIN  three further binary image changes (sizes: 213 KiB before; 107 KiB and 127 KiB after)