Release version 2.0.0

This commit is contained in:
shlomi
2026-01-14 23:02:39 +02:00
parent ccc87b8f4f
commit 71c99550d9
796 changed files with 8203 additions and 269 deletions

.gitignore

@@ -73,4 +73,7 @@ temp/
# OS files
Thumbs.db
.DS_Store
# Maxmind
data


@@ -5,6 +5,227 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [2.0.0] - 2026-01-09
### Added
#### DMARC Backend
- Daily data aggregation for performance
- GeoIP enrichment with MaxMind database support (City + ASN)
- Automatic MaxMind database downloads and updates
- Weekly scheduler for MaxMind database updates (Sunday 3 AM)
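The weekly schedule above reduces to computing the next Sunday 03:00 slot. A minimal stdlib sketch (the function name is illustrative; the app's actual scheduler implementation is not shown in this commit):

```python
from datetime import datetime, timedelta

def next_sunday_3am(now: datetime) -> datetime:
    """Next Sunday 03:00 strictly after `now` (weekday(): Monday=0 ... Sunday=6)."""
    days_ahead = (6 - now.weekday()) % 7
    candidate = (now + timedelta(days=days_ahead)).replace(
        hour=3, minute=0, second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(days=7)  # already past this week's slot
    return candidate
```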
#### DMARC Frontend - Complete UI Implementation
- **Domains List View**:
- Stats dashboard showing total domains, messages, pass rate, and unique IPs
- Full domain overview with 30-day statistics
- Color-coded pass rates (green ≥95%, yellow ≥80%, red <80%)
- Policy badges (reject/quarantine/none) with appropriate styling
- Empty state with helpful messaging for first-time users
- **Domain Overview Page**:
- Breadcrumb navigation on the DMARC page
- Domain-specific stats cards (total messages, compliance rate, unique sources)
- Daily Volume Graph showing 30-day email trends
- **Daily Reports Tab**:
- Aggregated daily report cards
- Shows report count, unique IPs, total messages per day
- SPF and DKIM pass percentages displayed
- Overall DMARC pass rate
- Chronological ordering (newest first)
- **Source IPs Tab with Complete GeoIP Info**:
- City names from MaxMind City database
- ISP/Organization names
- Country flag emoji display
- Message counts and pass rates per IP
- **Upload DMARC Functionality**:
- Upload button
- Supports XML, GZ, and ZIP file formats
- Toast notifications for success/duplicate/error states
- Auto-refresh of current view after successful upload
- Client-side file validation
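Handling the three upload formats comes down to unwrapping the compressed containers before parsing. A stdlib sketch of that step, assuming each archive holds a single aggregate-report XML (the helper name is illustrative):

```python
import gzip
import io
import zipfile

def extract_dmarc_xml(filename: str, payload: bytes) -> bytes:
    """Return the raw aggregate-report XML from an .xml, .gz, or .zip upload."""
    name = filename.lower()
    if name.endswith(".gz"):
        return gzip.decompress(payload)
    if name.endswith(".zip"):
        with zipfile.ZipFile(io.BytesIO(payload)) as zf:
            # DMARC zip archives are expected to contain one XML report
            inner = next(n for n in zf.namelist() if n.lower().endswith(".xml"))
            return zf.read(inner)
    if name.endswith(".xml"):
        return payload
    raise ValueError("Unsupported file type: expected .xml, .gz, or .zip")
```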
#### DMARC IMAP Auto-Import System
- **Automatic Report Fetching**: Complete IMAP integration for automatic DMARC report imports
- Configurable sync interval (default: 1 hour) via `DMARC_IMAP_INTERVAL`
- Automatic connection to IMAP mailbox and report processing
- Supports SSL/TLS connections (`DMARC_IMAP_USE_SSL`)
- Configurable folder monitoring (default: INBOX via `DMARC_IMAP_FOLDER`)
- Optional email deletion after processing (`DMARC_IMAP_DELETE_AFTER`)
- Background job runs automatically at specified intervals
- Manual sync trigger available on the DMARC page
- **DMARC IMAP Sync History**:
- Comprehensive sync statistics tracking (emails found, processed, created, duplicates, failed)
- Interactive modal showing all past sync operations
- Color-coded status indicators (success/error)
- Duration display for each sync
- Failed email count with highlighting
- "View History" button in the DMARC tab
- Sync history persists across restarts
- **DMARC Error Notifications**: Automatic email alerts for IMAP sync failures
- Sends detailed error reports when IMAP sync encounters failures
- Email includes: failed email count, message IDs, subjects, and error descriptions
- Link to sync history in notification email
- Only sends when failures occur and SMTP is configured
- Configurable error recipient via `DMARC_ERROR_EMAIL` (defaults to `ADMIN_EMAIL`)
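A condensed sketch of one IMAP sync pass using the standard-library `imaplib` and `email` modules (helper names are illustrative; the retry, statistics, and error-notification logic described above are omitted):

```python
import email
import imaplib
from email.message import Message

def report_attachments(msg: Message) -> list:
    """Collect (filename, bytes) pairs that look like DMARC report attachments."""
    out = []
    for part in msg.walk():
        fname = part.get_filename() or ""
        if fname.lower().endswith((".xml", ".gz", ".zip")):
            out.append((fname, part.get_payload(decode=True) or b""))
    return out

def sync_mailbox(host: str, user: str, password: str, folder: str = "INBOX",
                 delete_after: bool = True) -> int:
    """One minimal pass: fetch each message, count those carrying reports."""
    processed = 0
    with imaplib.IMAP4_SSL(host) as imap:
        imap.login(user, password)
        imap.select(folder)
        _, data = imap.search(None, "ALL")
        for num in data[0].split():
            _, fetched = imap.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(fetched[0][1])
            if report_attachments(msg):
                processed += 1
                if delete_after:
                    imap.store(num, "+FLAGS", "\\Deleted")
        imap.expunge()
    return processed
```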
#### Global SMTP Configuration & Notifications
- **Centralized SMTP Service**: Generic email infrastructure for all notification types
- Configured via environment variables: `SMTP_HOST`, `SMTP_PORT`, `SMTP_USER`, `SMTP_PASSWORD`
- Support for TLS/SSL connections (`SMTP_USE_TLS`)
- Configurable sender address (`SMTP_FROM`) and admin email (`ADMIN_EMAIL`)
- Can be enabled/disabled globally (`SMTP_ENABLED`)
- Ready for future notification types beyond DMARC
- **Settings UI Enhancements**:
- New "Global SMTP Configuration" section showing current SMTP settings
- New "DMARC Management" section showing manual upload and IMAP status
- Display of SMTP server, port, and admin email when configured
- Display of IMAP server when auto-import enabled
- **Test Connection Buttons**:
- Added diagnostic test buttons in Settings page for both SMTP and IMAP
- Interactive popup showing connection attempt logs in real-time
- Tests authentication, server connectivity, mailbox access, and email sending
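The notification path can be sketched with `smtplib`, using the environment variable names listed above (the helper names and the dict-based config are illustrative, not the app's actual service):

```python
import smtplib
from email.message import EmailMessage

def smtp_configured(cfg: dict) -> bool:
    """Enabled flag plus host/user/password must all be present."""
    return bool(cfg.get("SMTP_ENABLED")) and all(
        cfg.get(k) for k in ("SMTP_HOST", "SMTP_USER", "SMTP_PASSWORD"))

def send_notification(cfg: dict, subject: str, body: str) -> bool:
    """Send one admin notification; returns False when SMTP is not configured."""
    if not smtp_configured(cfg):
        return False
    msg = EmailMessage()
    msg["From"] = cfg.get("SMTP_FROM") or cfg["SMTP_USER"]
    msg["To"] = cfg["ADMIN_EMAIL"]
    msg["Subject"] = subject
    msg.set_content(body)
    with smtplib.SMTP(cfg["SMTP_HOST"], int(cfg.get("SMTP_PORT", 587))) as s:
        if cfg.get("SMTP_USE_TLS", True):
            s.starttls()
        s.login(cfg["SMTP_USER"], cfg["SMTP_PASSWORD"])
        s.send_message(msg)
    return True
```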
#### DMARC Tab Enhancements
- **IMAP Sync Controls**: Dynamic UI based on configuration
- "Sync from IMAP" button appears when IMAP auto-import enabled
- "Upload Report" button hidden when manual upload disabled (`DMARC_MANUAL_UPLOAD_ENABLED=false`)
- Last sync information displayed below sync button (time and status icon)
#### MaxMind GeoIP Integration
- **Configuration**:
- MaxMind account ID and license key via .env
- `MAXMIND_ACCOUNT_ID` - MaxMind account ID
- `MAXMIND_LICENSE_KEY` - MaxMind license key
- Free GeoLite2 databases available at maxmind.com
- Databases stored in `/app/data/` directory
- **Automatic Database Management**:
- Auto-downloads MaxMind GeoLite2 databases on first startup
- Dual database support: GeoLite2-City + GeoLite2-ASN
- Weekly automatic updates (Sunday 3 AM via scheduler)
- Database persistence via Docker volume mount (`./data:/app/data`)
- **GeoIP Enrichment Service**:
- Enriches all DMARC source IPs automatically during upload
- Dual readers for City and ASN lookups
- City name, country code, country name, and country flag emoji
- ASN number and ASN organization
- **Graceful Degradation**:
- Works without MaxMind license key (returns null for geo fields)
- Continues operation if databases unavailable
- Default globe emoji (🌍) for unknown locations
- Non-blocking errors (logs warnings but doesn't crash)
- **Background Job**:
- Runs weekly on Sunday at 3 AM
- Checks database age (updates if >7 days old)
- Downloads both City and ASN databases
- Automatic retry with exponential backoff
- Status tracking in Status page
- **MaxMind License Validation**: Added real-time validation of MaxMind license key in Settings page
- Validates license key using MaxMind's validation API
- Displays status badge: "Configured" (green with checkmark) or "Not configured" (gray)
- Shows error details if validation fails (red badge with X icon)
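The enrichment itself reads the MaxMind databases via the `geoip2` reader; the flag-emoji rendering and the globe fallback described above can be sketched purely (the function name is illustrative):

```python
from typing import Optional

def country_flag(country_code: Optional[str]) -> str:
    """ISO 3166-1 alpha-2 code -> regional-indicator emoji; 🌍 when unknown."""
    if not country_code or len(country_code) != 2 or not country_code.isalpha():
        return "🌍"  # default globe for unknown locations
    base = 0x1F1E6  # REGIONAL INDICATOR SYMBOL LETTER A
    return "".join(chr(base + ord(c) - ord("A")) for c in country_code.upper())
```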
#### SPF Validation Enhancements
- **DNS Lookup Counter**: SPF validation now counts and validates DNS lookups according to RFC 7208
- Recursive counting through `include:` directives
- Counts `a`, `mx`, `exists:`, `redirect=`, and `include:` mechanisms
- Maximum limit of 10 DNS lookups enforced
- Returns error when limit exceeded: "SPF has too many DNS lookups (X). Maximum is 10"
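The counting rule above can be sketched as follows (the helper and its `resolve_include` callback are illustrative, not the app's actual function; real resolution would query DNS):

```python
def count_spf_lookups(record: str, resolve_include=None, _depth: int = 0) -> int:
    """Count the DNS-lookup terms of one SPF record (RFC 7208 caps these at 10).

    `resolve_include` maps an included domain to its SPF record so that
    `include:` chains are counted recursively; pass None to skip recursion.
    """
    if _depth > 10:
        raise ValueError("SPF include recursion too deep")
    count = 0
    for term in record.split():
        mech = term.lstrip("+-~?").lower()
        if mech in ("a", "mx", "ptr") or mech.startswith(
                ("a:", "mx:", "ptr:", "exists:", "redirect=")):
            count += 1
        elif mech.startswith("include:"):
            count += 1
            child = resolve_include(mech.split(":", 1)[1]) if resolve_include else None
            if child:
                count += count_spf_lookups(child, resolve_include, _depth + 1)
    return count
```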
- **Server IP Authorization Check**: SPF validation now verifies mail server IP is authorized
- Fetches server IP from Mailcow API on startup
- Caches IP in memory for performance (no repeated API calls)
- Checks if server IP is authorized via:
- Direct `ip4:` match (including CIDR ranges)
- `a` record lookup
- `mx` record lookup
- Recursive `include:` resolution
- Returns error if server IP not found in SPF: "Server IP X.X.X.X is NOT authorized in SPF record"
- Shows authorization method in success message: "Server IP authorized via ip4:X.X.X.X"
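The direct `ip4:` check, including CIDR ranges, can be sketched with the stdlib `ipaddress` module (the helper name is illustrative; the `a`/`mx`/`include:` paths need DNS and are out of scope here):

```python
import ipaddress

def ip_authorized_by_ip4(record: str, server_ip: str):
    """Return the matching ip4: mechanism of an SPF record, or None."""
    addr = ipaddress.ip_address(server_ip)
    for term in record.split():
        mech = term.lstrip("+-~?")
        if mech.lower().startswith("ip4:"):
            # strict=False tolerates host bits set in the CIDR notation
            network = ipaddress.ip_network(mech[4:], strict=False)
            if addr in network:
                return mech
    return None
```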
- **Enhanced SPF Validation**: Complete SPF record validation
- Detects multiple SPF records (RFC violation - only one allowed)
- Validates basic syntax (`v=spf1` with space)
- Checks for valid mechanisms only (ip4, ip6, a, mx, include, exists, all)
- Validates presence of `all` mechanism
- Prevents infinite loops in circular includes
- Depth protection (maximum 10 recursion levels)
#### DKIM Parameter Validation
- **Testing Mode Detection** (`t=y`): Critical error detection
- Detects DKIM testing mode flag
- Returns error status with message: "DKIM is in TESTING mode (t=y)"
- Warning: "Emails will pass validation even with invalid signatures. Remove t=y for production!"
- Prevents false validation in production environments
- **Strict Subdomain Mode Detection** (`t=s`): Informational flag
- Detects strict subdomain restriction flag
- Displayed as informational text (not warning)
- Message: "DKIM uses strict subdomain mode (t=s)"
- Does NOT affect DKIM status (remains "success")
- **Revoked Key Detection** (`p=` empty): Error detection
- Detects intentionally disabled DKIM keys
- Returns error status with message: "DKIM key is revoked (p= is empty)"
- Indicates DKIM record has been decommissioned
- **Weak Hash Algorithm Detection** (`h=sha1`): Security warning
- Detects deprecated SHA1 hash algorithm
- Returns warning status with message: "DKIM uses SHA1 hash algorithm (h=sha1)"
- Recommendation: "SHA1 is deprecated and insecure. Upgrade to SHA256 (h=sha256)"
- **Key Type Validation** (`k=`): Configuration check
- Validates key type is `rsa` or `ed25519`
- Warning for unknown key types
- Helps identify configuration errors
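The checks above all operate on the tag=value pairs of the DKIM TXT record; a minimal sketch of the parsing and classification (function names and message wording are illustrative):

```python
def parse_dkim_tags(record: str) -> dict:
    """Split a DKIM TXT record ('v=DKIM1; k=rsa; t=y; p=...') into a tag dict."""
    tags = {}
    for part in record.split(";"):
        if "=" in part:
            k, _, v = part.strip().partition("=")
            tags[k.strip()] = v.strip()
    return tags

def dkim_issues(record: str) -> list:
    """Return error/warning/info strings for the flags discussed above."""
    tags = parse_dkim_tags(record)
    issues = []
    t_flags = tags.get("t", "").split(":")
    if "y" in t_flags:
        issues.append("error: DKIM is in TESTING mode (t=y)")
    if "s" in t_flags:
        issues.append("info: DKIM uses strict subdomain mode (t=s)")
    if "p" in tags and tags["p"] == "":
        issues.append("error: DKIM key is revoked (p= is empty)")
    if "sha1" in tags.get("h", "").split(":"):
        issues.append("warning: DKIM uses SHA1 hash algorithm (h=sha1)")
    if tags.get("k", "rsa") not in ("rsa", "ed25519"):
        issues.append("warning: unknown key type (k=%s)" % tags["k"])
    return issues
```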
### Fixed
#### Message Correlation System
- **Final Status Update Job Enhancement**: Fixed correlations not updating when Postfix logs arrive milliseconds after correlation creation
- Increased batch size from 100 to 500 correlations per run for faster processing
- Fixes a race condition where `status=sent` logs arrived just after the correlation was marked complete
- Improved logging to show how many logs were added to each correlation
#### Postfix Log Deduplication
- **UNIQUE Constraint Added**: Postfix logs now have database-level duplicate prevention
- Automatic cleanup of existing duplicate logs on startup (keeps oldest entry)
- Import process now silently skips duplicate logs (no error logging)
- Batched deletion (1000 records at a time) to prevent database locks
- Handles NULL `queue_id` values correctly using `COALESCE`
- Prevents duplicate log imports when fetch job runs faster than log generation rate
- Improved logging shows count of duplicates skipped during import
### Technical
#### New API Endpoints
```
GET /api/dmarc/domains?days=30
GET /api/dmarc/domains/{domain}/overview?days=30
GET /api/dmarc/domains/{domain}/reports?days=30
GET /api/dmarc/domains/{domain}/sources?days=30
POST /api/dmarc/upload
GET /api/dmarc/imap/status
POST /api/dmarc/imap/sync
GET /api/dmarc/imap/history
POST /api/settings/test/smtp
POST /api/settings/test/imap
```
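The GET endpoints above can be exercised with any HTTP client; a tiny URL-builder sketch for reference (`BASE` assumes the default port 8080 used by the healthcheck, and the helper name is illustrative):

```python
from typing import Optional
from urllib.parse import quote, urlencode

BASE = "http://localhost:8080"  # assumption: default port from the healthcheck

def dmarc_endpoint(path: str, domain: Optional[str] = None, days: int = 30) -> str:
    """Compose one of the GET DMARC API URLs listed above."""
    if domain is not None:
        full = f"/api/dmarc/domains/{quote(domain)}/{path}"
    else:
        full = f"/api/dmarc/{path}"
    return f"{BASE}{full}?{urlencode({'days': days})}"
```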
---
## [1.4.8] - 2026-01-08
### Added


@@ -36,4 +36,4 @@ EXPOSE 8080
HEALTHCHECK --interval=30s --timeout=10s --start-period=40s --retries=3 \
CMD curl -f http://localhost:8080/api/health || exit 1
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8080", "--workers", "2"]
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8080", "--workers", "1"]


@@ -95,6 +95,8 @@ docker compose up -d
📖 **Full installation guide:** [Getting Started](documentation/GETTING_STARTED.md)
📘 **Technical Overview: Email Authentication & Monitoring:** learn how **mailcow-logs-viewer** can help with email authentication: [Read more](documentation/Email_Authentication_Monitoring.md)
---
## Architecture
@@ -232,4 +234,11 @@ MIT License
- **Logs**: `docker compose logs app`
- **Health**: `http://localhost:8080/api/health`
- **Issues**: Open issue on GitHub
---
## Credits
* **Flags**: Flag icons are sourced from [Flagpedia.net](https://flagpedia.net/).
* **Location Data**: This product includes GeoLite2 data created by MaxMind, available from [https://www.maxmind.com](https://www.maxmind.com).


@@ -1 +1 @@
1.4.8
2.0.0


@@ -2,7 +2,7 @@
Configuration management using Pydantic Settings
"""
from pydantic_settings import BaseSettings
from pydantic import Field, validator
from pydantic import Field, validator, field_validator
from typing import List, Optional
import logging
@@ -10,7 +10,6 @@ logger = logging.getLogger(__name__)
_cached_active_domains: Optional[List[str]] = None
class Settings(BaseSettings):
"""Application settings"""
@@ -89,7 +88,144 @@ class Settings(BaseSettings):
default="",
description="Basic auth password (required if auth_enabled=True)"
)
# DMARC configuration
dmarc_retention_days: int = Field(
default=60,
env="DMARC_RETENTION_DAYS"
)
dmarc_manual_upload_enabled: bool = Field(
default=True,
env='DMARC_MANUAL_UPLOAD_ENABLED',
description='Allow manual upload of DMARC reports via UI'
)
# DMARC IMAP Configuration
dmarc_imap_enabled: bool = Field(
default=False,
env='DMARC_IMAP_ENABLED',
description='Enable automatic DMARC report import from IMAP'
)
dmarc_imap_host: Optional[str] = Field(
default=None,
env='DMARC_IMAP_HOST',
description='IMAP server hostname (e.g., imap.gmail.com)'
)
dmarc_imap_port: Optional[int] = Field(
default=993,
env='DMARC_IMAP_PORT',
description='IMAP server port (993 for SSL, 143 for non-SSL)'
)
dmarc_imap_use_ssl: bool = Field(
default=True,
env='DMARC_IMAP_USE_SSL',
description='Use SSL/TLS for IMAP connection'
)
dmarc_imap_user: Optional[str] = Field(
default=None,
env='DMARC_IMAP_USER',
description='IMAP username (email address)'
)
dmarc_imap_password: Optional[str] = Field(
default=None,
env='DMARC_IMAP_PASSWORD',
description='IMAP password'
)
dmarc_imap_folder: str = Field(
default='INBOX',
env='DMARC_IMAP_FOLDER',
description='IMAP folder to scan for DMARC reports'
)
dmarc_imap_delete_after: bool = Field(
default=True,
env='DMARC_IMAP_DELETE_AFTER',
description='Delete emails after successful processing'
)
dmarc_imap_interval: Optional[int] = Field(
default=3600,
env='DMARC_IMAP_INTERVAL',
description='Interval between IMAP syncs in seconds (default: 3600 = 1 hour)'
)
dmarc_imap_run_on_startup: bool = Field(
default=True,
env='DMARC_IMAP_RUN_ON_STARTUP',
description='Run IMAP sync once on application startup'
)
dmarc_error_email: Optional[str] = Field(
default=None,
env='DMARC_ERROR_EMAIL',
description='Email address for DMARC error notifications (defaults to ADMIN_EMAIL if not set)'
)
# SMTP Configuration
smtp_enabled: bool = Field(
default=False,
env='SMTP_ENABLED',
description='Enable SMTP for sending notifications'
)
smtp_host: Optional[str] = Field(
default=None,
env='SMTP_HOST',
description='SMTP server hostname'
)
smtp_port: Optional[int] = Field(
default=587,
env='SMTP_PORT',
description='SMTP server port (587 for TLS, 465 for SSL, 25 for plain)'
)
smtp_use_tls: bool = Field(
default=True,
env='SMTP_USE_TLS',
description='Use STARTTLS for SMTP connection'
)
smtp_user: Optional[str] = Field(
default=None,
env='SMTP_USER',
description='SMTP username (usually email address)'
)
smtp_password: Optional[str] = Field(
default=None,
env='SMTP_PASSWORD',
description='SMTP password'
)
smtp_from: Optional[str] = Field(
default=None,
env='SMTP_FROM',
description='From address for emails (defaults to SMTP user if not set)'
)
# Global Admin Email
admin_email: Optional[str] = Field(
default=None,
env='ADMIN_EMAIL',
description='Administrator email for system notifications'
)
@field_validator('smtp_port', 'dmarc_imap_port', 'dmarc_imap_interval', mode='before')
@classmethod
def empty_str_to_none(cls, v):
"""Convert empty string to None so default value is used"""
if v == '':
return None
return v
@validator('mailcow_url')
def validate_mailcow_url(cls, v):
"""Ensure URL doesn't end with slash"""
@@ -121,6 +257,16 @@ class Settings(BaseSettings):
return []
return [e.strip().lower() for e in self.blacklist_emails.split(',') if e.strip()]
@property
def notification_smtp_configured(self) -> bool:
"""Check if SMTP is properly configured for notifications"""
return (
self.smtp_enabled and
self.smtp_host is not None and
self.smtp_user is not None and
self.smtp_password is not None
)
@property
def database_url(self) -> str:
"""Construct PostgreSQL connection URL"""
@@ -147,11 +293,18 @@ settings = Settings()
def setup_logging():
"""Configure application logging"""
root = logging.getLogger()
# Remove ALL existing handlers
for handler in root.handlers[:]:
root.removeHandler(handler)
log_format = '%(levelname)s - %(message)s'
logging.basicConfig(
level=getattr(logging, settings.log_level),
format=log_format
format=log_format,
force=True
)
logging.getLogger('httpx').setLevel(logging.ERROR)
@@ -177,4 +330,5 @@ def set_cached_active_domains(domains: List[str]) -> None:
def get_cached_active_domains() -> Optional[List[str]]:
"""Get the cached active domains list"""
return _cached_active_domains
global _cached_active_domains
return _cached_active_domains if _cached_active_domains else []


@@ -3,6 +3,9 @@ Main FastAPI application
Entry point for the Mailcow Logs Viewer backend
"""
import logging
root = logging.getLogger()
root.handlers = []
from fastapi import FastAPI, Request
from fastapi.staticfiles import StaticFiles
from fastapi.responses import HTMLResponse, JSONResponse
@@ -16,10 +19,18 @@ from .mailcow_api import mailcow_api
from .routers import logs, stats
from .routers import export as export_router
from .routers import domains as domains_router
from .routers import dmarc as dmarc_router
from .routers import documentation
from .migrations import run_migrations
from .auth import BasicAuthMiddleware
from .version import __version__
from .services.geoip_downloader import (
update_geoip_database_if_needed,
is_license_configured,
get_geoip_status
)
logger = logging.getLogger(__name__)
try:
@@ -64,6 +75,32 @@ async def lifespan(app: FastAPI):
logger.error(f"Failed to initialize database: {e}")
raise
# Initialize GeoIP database (if configured)
try:
if is_license_configured():
logger.info("MaxMind license key configured, checking GeoIP database...")
# This will:
# 1. Check if database exists
# 2. Check if it's older than 7 days
# 3. Download if needed
# 4. Skip if database is fresh
db_available = update_geoip_database_if_needed()
if db_available:
status = get_geoip_status()
city_info = status['City']
asn_info = status['ASN']
logger.info(f"✓ GeoIP ready: City {city_info['size_mb']}MB ({city_info['age_days']}d), ASN {asn_info['size_mb']}MB ({asn_info['age_days']}d)")
else:
logger.warning("⚠ GeoIP database unavailable, features will be disabled")
else:
logger.info("MaxMind license key not configured, GeoIP features disabled")
logger.info("To enable: Set MAXMIND_ACCOUNT_ID and MAXMIND_LICENSE_KEY environment variables")
except Exception as e:
logger.error(f"Error initializing GeoIP database: {e}")
logger.info("Continuing without GeoIP features...")
# Test Mailcow API connection and fetch active domains
try:
api_ok = await mailcow_api.test_connection()
@@ -79,6 +116,12 @@ async def lifespan(app: FastAPI):
logger.warning("No active domains found in Mailcow - check your configuration")
except Exception as e:
logger.error(f"Failed to fetch active domains: {e}")
# Initialize server IP cache for SPF checks
try:
from app.routers.domains import init_server_ip
await init_server_ip()
except Exception as e:
logger.warning(f"Failed to initialize server IP cache: {e}")
except Exception as e:
logger.error(f"Mailcow API test failed: {e}")
@@ -128,6 +171,8 @@ app.include_router(status_router.router, prefix="/api", tags=["Status"])
app.include_router(messages_router.router, prefix="/api", tags=["Messages"])
app.include_router(settings_router.router, prefix="/api", tags=["Settings"])
app.include_router(domains_router.router, prefix="/api", tags=["Domains"])
app.include_router(dmarc_router.router, prefix="/api", tags=["DMARC"])
app.include_router(documentation.router, prefix="/api", tags=["Documentation"])
# Mount static files (frontend)
app.mount("/static", StaticFiles(directory="/app/frontend"), name="static")


@@ -370,6 +370,456 @@ def add_is_full_check_column(db: Session):
db.rollback()
def add_postfix_unique_constraint(db: Session):
"""
Add UNIQUE constraint to postfix_logs to prevent duplicate logs
Uses a safer approach with batched deletes and proper error handling
"""
logger.info("Adding UNIQUE constraint to postfix_logs...")
try:
# Check if the unique index already exists
# (a UNIQUE constraint's backing index also appears in pg_indexes)
result = db.execute(text("""
SELECT indexname
FROM pg_indexes
WHERE tablename='postfix_logs'
AND indexname='uq_postfix_log'
"""))
if result.fetchone():
logger.info("UNIQUE index uq_postfix_log already exists, skipping...")
return
# Step 1: Delete duplicates in small batches to avoid deadlock
logger.info("Cleaning up duplicate Postfix logs in batches...")
batch_size = 1000
total_deleted = 0
while True:
try:
result = db.execute(text(f"""
DELETE FROM postfix_logs
WHERE id IN (
SELECT id
FROM (
SELECT id,
ROW_NUMBER() OVER (
PARTITION BY time, program, COALESCE(queue_id, ''), message
ORDER BY created_at ASC
) as row_num
FROM postfix_logs
) t
WHERE t.row_num > 1
LIMIT {batch_size}
)
"""))
deleted = result.rowcount
total_deleted += deleted
db.commit()
if deleted == 0:
break # No more duplicates
logger.info(f"Deleted {deleted} duplicates (total: {total_deleted})...")
except Exception as e:
logger.warning(f"Error deleting batch: {e}")
db.rollback()
break # Skip if there's a lock issue
if total_deleted > 0:
logger.info(f"Deleted {total_deleted} duplicate Postfix logs")
else:
logger.info("No duplicate Postfix logs found")
# Step 2: Add a UNIQUE index. PostgreSQL does not allow expressions such as
# COALESCE() inside a table constraint, so a unique expression index is used.
logger.info("Creating UNIQUE index...")
db.execute(text("""
CREATE UNIQUE INDEX uq_postfix_log
ON postfix_logs (time, program, COALESCE(queue_id, ''), message);
"""))
db.commit()
logger.info("✓ UNIQUE constraint added successfully")
except Exception as e:
error_msg = str(e).lower()
if "already exists" in error_msg or "duplicate" in error_msg:
logger.info("UNIQUE constraint already exists, skipping...")
db.rollback()
elif "deadlock" in error_msg or "lock" in error_msg:
logger.warning(f"Could not add UNIQUE constraint due to lock (will retry on next startup): {e}")
db.rollback()
else:
logger.error(f"Error adding UNIQUE constraint: {e}")
db.rollback()
# Don't raise - allow app to start
def ensure_dmarc_tables(db: Session):
"""Ensure DMARC tables exist with proper structure"""
logger.info("Checking if DMARC tables exist...")
try:
# Check if dmarc_reports table exists
result = db.execute(text("""
SELECT EXISTS (
SELECT FROM information_schema.tables
WHERE table_schema = 'public'
AND table_name = 'dmarc_reports'
);
"""))
reports_exists = result.fetchone()[0]
# Check if dmarc_records table exists
result = db.execute(text("""
SELECT EXISTS (
SELECT FROM information_schema.tables
WHERE table_schema = 'public'
AND table_name = 'dmarc_records'
);
"""))
records_exists = result.fetchone()[0]
if reports_exists and records_exists:
logger.info("DMARC tables already exist")
return
# If tables exist partially, clean up
if reports_exists or records_exists:
logger.warning("DMARC tables exist partially, cleaning up...")
try:
db.execute(text("DROP TABLE IF EXISTS dmarc_records CASCADE;"))
db.execute(text("DROP TABLE IF EXISTS dmarc_reports CASCADE;"))
db.execute(text("DROP SEQUENCE IF EXISTS dmarc_reports_id_seq CASCADE;"))
db.execute(text("DROP SEQUENCE IF EXISTS dmarc_records_id_seq CASCADE;"))
db.commit()
logger.info("Cleaned up partial DMARC tables")
except Exception as cleanup_error:
logger.error(f"Error during cleanup: {cleanup_error}")
db.rollback()
raise
logger.info("Creating DMARC tables...")
try:
# Create dmarc_reports table
db.execute(text("""
CREATE TABLE dmarc_reports (
id SERIAL PRIMARY KEY,
report_id VARCHAR(255) NOT NULL UNIQUE,
domain VARCHAR(255) NOT NULL,
org_name VARCHAR(255) NOT NULL,
email VARCHAR(255),
extra_contact_info TEXT,
begin_date INTEGER NOT NULL,
end_date INTEGER NOT NULL,
policy_published JSONB,
domain_id VARCHAR(255),
raw_xml TEXT,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
"""))
# Create indexes for dmarc_reports
db.execute(text("""
CREATE INDEX idx_dmarc_report_domain_date
ON dmarc_reports(domain, begin_date);
"""))
db.execute(text("""
CREATE INDEX idx_dmarc_report_org
ON dmarc_reports(org_name);
"""))
db.execute(text("""
CREATE INDEX idx_dmarc_report_created
ON dmarc_reports(created_at);
"""))
# Create dmarc_records table
db.execute(text("""
CREATE TABLE dmarc_records (
id SERIAL PRIMARY KEY,
dmarc_report_id INTEGER NOT NULL,
source_ip VARCHAR(50) NOT NULL,
count INTEGER NOT NULL,
disposition VARCHAR(20),
dkim_result VARCHAR(20),
spf_result VARCHAR(20),
header_from VARCHAR(255),
envelope_from VARCHAR(255),
envelope_to VARCHAR(255),
auth_results JSONB,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
"""))
# Create indexes for dmarc_records
db.execute(text("""
CREATE INDEX idx_dmarc_record_report
ON dmarc_records(dmarc_report_id);
"""))
db.execute(text("""
CREATE INDEX idx_dmarc_record_ip
ON dmarc_records(source_ip);
"""))
db.execute(text("""
CREATE INDEX idx_dmarc_record_results
ON dmarc_records(dkim_result, spf_result);
"""))
db.commit()
logger.info("✓ DMARC tables created successfully")
except Exception as create_error:
db.rollback()
# Handle duplicate key errors (PostgreSQL artifacts)
if "duplicate key value violates unique constraint" in str(create_error).lower():
logger.warning("Detected PostgreSQL artifacts, cleaning up...")
try:
# Clean up ALL artifacts
db.execute(text("DROP TABLE IF EXISTS dmarc_records CASCADE;"))
db.execute(text("DROP TABLE IF EXISTS dmarc_reports CASCADE;"))
db.execute(text("DROP SEQUENCE IF EXISTS dmarc_reports_id_seq CASCADE;"))
db.execute(text("DROP SEQUENCE IF EXISTS dmarc_records_id_seq CASCADE;"))
db.commit()
logger.info("Cleaned up PostgreSQL artifacts")
# Retry - create tables again
db.execute(text("""
CREATE TABLE dmarc_reports (
id SERIAL PRIMARY KEY,
report_id VARCHAR(255) NOT NULL UNIQUE,
domain VARCHAR(255) NOT NULL,
org_name VARCHAR(255) NOT NULL,
email VARCHAR(255),
extra_contact_info TEXT,
begin_date INTEGER NOT NULL,
end_date INTEGER NOT NULL,
policy_published JSONB,
domain_id VARCHAR(255),
raw_xml TEXT,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
"""))
db.execute(text("""
CREATE INDEX idx_dmarc_report_domain_date
ON dmarc_reports(domain, begin_date);
"""))
db.execute(text("""
CREATE INDEX idx_dmarc_report_org
ON dmarc_reports(org_name);
"""))
db.execute(text("""
CREATE INDEX idx_dmarc_report_created
ON dmarc_reports(created_at);
"""))
db.execute(text("""
CREATE TABLE dmarc_records (
id SERIAL PRIMARY KEY,
dmarc_report_id INTEGER NOT NULL,
source_ip VARCHAR(50) NOT NULL,
count INTEGER NOT NULL,
disposition VARCHAR(20),
dkim_result VARCHAR(20),
spf_result VARCHAR(20),
header_from VARCHAR(255),
envelope_from VARCHAR(255),
envelope_to VARCHAR(255),
auth_results JSONB,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
"""))
db.execute(text("""
CREATE INDEX idx_dmarc_record_report
ON dmarc_records(dmarc_report_id);
"""))
db.execute(text("""
CREATE INDEX idx_dmarc_record_ip
ON dmarc_records(source_ip);
"""))
db.execute(text("""
CREATE INDEX idx_dmarc_record_results
ON dmarc_records(dkim_result, spf_result);
"""))
db.commit()
logger.info("✓ DMARC tables created after cleanup")
except Exception as retry_error:
logger.error(f"Failed after cleanup: {retry_error}")
db.rollback()
raise
else:
logger.error(f"Failed to create DMARC tables: {create_error}")
raise
except Exception as e:
logger.error(f"Error ensuring DMARC tables: {e}")
db.rollback()
raise
def add_geoip_fields_to_dmarc(db: Session):
"""Add GeoIP fields to dmarc_records table"""
logger.info("Checking if GeoIP fields exist in dmarc_records...")
try:
result = db.execute(text("""
SELECT column_name
FROM information_schema.columns
WHERE table_name='dmarc_records'
AND column_name='country_code'
"""))
if result.fetchone() is None:
logger.info("Adding GeoIP fields to dmarc_records...")
db.execute(text("""
ALTER TABLE dmarc_records
ADD COLUMN country_code VARCHAR(2),
ADD COLUMN country_name VARCHAR(100),
ADD COLUMN country_emoji VARCHAR(10),
ADD COLUMN city VARCHAR(100),
ADD COLUMN asn VARCHAR(20),
ADD COLUMN asn_org VARCHAR(255);
"""))
db.execute(text("""
CREATE INDEX IF NOT EXISTS idx_dmarc_record_country
ON dmarc_records(country_code);
"""))
db.commit()
logger.info("✓ GeoIP fields added to dmarc_records")
else:
logger.info("✓ GeoIP fields already exist in dmarc_records")
except Exception as e:
logger.error(f"Error adding GeoIP fields: {e}")
db.rollback()
def add_geoip_fields_to_rspamd(db: Session):
"""Add GeoIP fields to rspamd_logs table"""
logger.info("Checking if GeoIP fields exist in rspamd_logs...")
try:
result = db.execute(text("""
SELECT column_name
FROM information_schema.columns
WHERE table_name='rspamd_logs'
AND column_name='country_code'
"""))
if result.fetchone() is None:
logger.info("Adding GeoIP fields to rspamd_logs...")
db.execute(text("""
ALTER TABLE rspamd_logs
ADD COLUMN country_code VARCHAR(2),
ADD COLUMN country_name VARCHAR(100),
ADD COLUMN city VARCHAR(100),
ADD COLUMN asn VARCHAR(20),
ADD COLUMN asn_org VARCHAR(255);
"""))
db.execute(text("""
CREATE INDEX IF NOT EXISTS idx_rspamd_country
ON rspamd_logs(country_code);
"""))
db.commit()
logger.info("✓ GeoIP fields added to rspamd_logs")
else:
logger.info("✓ GeoIP fields already exist in rspamd_logs")
except Exception as e:
logger.error(f"Error adding GeoIP fields to rspamd_logs: {e}")
db.rollback()
def create_dmarc_sync_table(db: Session):
"""
Create dmarc_syncs table for tracking IMAP sync operations
This table tracks automatic and manual DMARC report imports from IMAP
"""
logger.info("Checking if dmarc_syncs table exists...")
try:
# Check if table already exists
result = db.execute(text("""
SELECT EXISTS (
SELECT FROM information_schema.tables
WHERE table_name = 'dmarc_syncs'
);
"""))
table_exists = result.scalar()
if table_exists:
logger.info("✓ dmarc_syncs table already exists")
return
logger.info("Creating dmarc_syncs table...")
# Create table
db.execute(text("""
CREATE TABLE dmarc_syncs (
id SERIAL PRIMARY KEY,
sync_type VARCHAR(20) NOT NULL,
started_at TIMESTAMP NOT NULL,
completed_at TIMESTAMP,
status VARCHAR(20) NOT NULL,
emails_found INTEGER DEFAULT 0,
emails_processed INTEGER DEFAULT 0,
reports_created INTEGER DEFAULT 0,
reports_duplicate INTEGER DEFAULT 0,
reports_failed INTEGER DEFAULT 0,
error_message TEXT,
failed_emails JSONB,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
"""))
# Create indexes
db.execute(text("""
CREATE INDEX idx_dmarc_sync_type_status
ON dmarc_syncs(sync_type, status);
"""))
db.execute(text("""
CREATE INDEX idx_dmarc_sync_started
ON dmarc_syncs(started_at);
"""))
db.commit()
logger.info("✓ dmarc_syncs table created successfully")
except Exception as e:
logger.error(f"Error creating dmarc_syncs table: {e}")
db.rollback()
raise
def run_migrations():
"""
Run all database migrations and maintenance tasks
@@ -389,8 +839,19 @@ def run_migrations():
ensure_domain_dns_checks_table(db)
add_is_full_check_column(db)
# UNIQUE postfix logs
add_postfix_unique_constraint(db)
# Clean up duplicate correlations
removed = cleanup_duplicate_correlations(db)
# DMARC table
ensure_dmarc_tables(db)
create_dmarc_sync_table(db)
# GeoIP fields
add_geoip_fields_to_dmarc(db)
add_geoip_fields_to_rspamd(db)
if removed > 0:
logger.info(f"Migration complete: Cleaned up {removed} duplicate correlations")


@@ -6,7 +6,7 @@ SIMPLIFIED VERSION:
- Removed old generate_correlation_key function
- Correlation key is now SHA256 of Message-ID
"""
from sqlalchemy import Column, Integer, String, Float, DateTime, Boolean, Text, Index, JSON
from sqlalchemy import Column, Integer, String, Float, DateTime, Boolean, Text, Index, JSON, UniqueConstraint
from sqlalchemy.dialects.postgresql import JSONB
from datetime import datetime
@@ -23,30 +23,27 @@ class PostfixLog(Base):
priority = Column(String(20))
message = Column(Text)
# Parsed fields from message
queue_id = Column(String(50), index=True)
message_id = Column(String(255), index=True)
sender = Column(String(255), index=True)
recipient = Column(String(255), index=True)
status = Column(String(50), index=True) # sent, bounced, deferred, rejected
status = Column(String(50), index=True)
relay = Column(String(255))
delay = Column(Float)
dsn = Column(String(20))
# Correlation
correlation_key = Column(String(64), index=True) # SHA256 hash of Message-ID
correlation_key = Column(String(64), index=True)
# Raw data
raw_data = Column(JSONB)
# Metadata
created_at = Column(DateTime, default=datetime.utcnow)
__table_args__ = (
Index('idx_postfix_time_queue', 'time', 'queue_id'),
Index('idx_postfix_sender_recipient', 'sender', 'recipient'),
Index('idx_postfix_correlation', 'correlation_key'),
Index('idx_postfix_message_id', 'message_id'),  # Critical for Message-ID lookup
UniqueConstraint('time', 'program', 'queue_id', 'message', name='uq_postfix_log'),
)
def __repr__(self):
class RspamdLog(Base):
id = Column(Integer, primary_key=True, index=True)
time = Column(DateTime, index=True, nullable=False)
# Message details
message_id = Column(String(255), index=True)  # CRITICAL: Used for correlation
queue_id = Column(String(50), index=True)
subject = Column(Text)
size = Column(Integer)
# Email addresses
sender_smtp = Column(String(255), index=True)
sender_mime = Column(String(255))
recipients_smtp = Column(JSONB)  # List of recipients
recipients_mime = Column(JSONB)
# Spam analysis
score = Column(Float, index=True)
required_score = Column(Float)
action = Column(String(50), index=True)  # no action, greylist, add header, reject
symbols = Column(JSONB)  # Spam detection symbols
# Authentication & Direction
user = Column(String(255), index=True)  # Authenticated user (for outbound)
direction = Column(String(20), index=True)  # inbound, outbound, unknown
ip = Column(String(50), index=True)
# Flags
country_code = Column(String(2), index=True)
country_name = Column(String(100))
city = Column(String(100))
asn = Column(String(20))
asn_org = Column(String(255))
is_spam = Column(Boolean, index=True)
is_skipped = Column(Boolean)
has_auth = Column(Boolean, index=True)  # Has MAILCOW_AUTH symbol
# Correlation
correlation_key = Column(String(64), index=True)
# Raw data
raw_data = Column(JSONB)
# Metadata
created_at = Column(DateTime, default=datetime.utcnow)
__table_args__ = (
Index('idx_rspamd_recipients', 'recipients_smtp', postgresql_using='gin'),
Index('idx_rspamd_score', 'score', 'action'),
Index('idx_rspamd_correlation', 'correlation_key'),
Index('idx_rspamd_message_id', 'message_id'),  # Critical for Message-ID lookup
)
def __repr__(self):
class NetfilterLog(Base):
priority = Column(String(20))
message = Column(Text)
# Parsed fields
ip = Column(String(50), index=True)
rule_id = Column(Integer)
attempts_left = Column(Integer)
username = Column(String(255), index=True)
auth_method = Column(String(50))  # SASL LOGIN, SASL PLAIN, etc.
action = Column(String(50), index=True)  # warning, banned
# Raw data
raw_data = Column(JSONB)
# Metadata
created_at = Column(DateTime, default=datetime.utcnow)
__table_args__ = (
class MessageCorrelation(Base):
id = Column(Integer, primary_key=True, index=True)
correlation_key = Column(String(64), unique=True, index=True, nullable=False)
# Message identifiers - BOTH are now critical
message_id = Column(String(255), index=True, unique=True)  # Primary correlation identifier
queue_id = Column(String(50), index=True)  # Secondary identifier
# Related log IDs
postfix_log_ids = Column(JSONB)  # List of ALL postfix log IDs for this message
rspamd_log_id = Column(Integer, index=True)
# Message summary
sender = Column(String(255), index=True)
recipient = Column(String(255), index=True)  # First/primary recipient
subject = Column(Text)
direction = Column(String(20))  # inbound, outbound, unknown
final_status = Column(String(50))  # delivered, bounced, deferred, rejected, spam
# Completion tracking
is_complete = Column(Boolean, default=False, index=True)  # Has Queue-ID and Postfix logs
# Timeline
first_seen = Column(DateTime, index=True)
last_seen = Column(DateTime)
# Metadata
created_at = Column(DateTime, default=datetime.utcnow)
updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
__table_args__ = (
Index('idx_correlation_message_id', 'message_id'),  # CRITICAL INDEX
Index('idx_correlation_queue_id', 'queue_id'),
Index('idx_correlation_sender_recipient', 'sender', 'recipient'),
)
class DomainDNSCheck(Base):
checked_at = Column(DateTime, nullable=False)
is_full_check = Column(Boolean, default=False)
created_at = Column(DateTime, default=datetime.utcnow)
updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
class DMARCReport(Base):
"""DMARC aggregate reports received from email providers"""
__tablename__ = "dmarc_reports"
id = Column(Integer, primary_key=True, index=True)
report_id = Column(String(255), unique=True, index=True, nullable=False)
domain = Column(String(255), index=True, nullable=False)
org_name = Column(String(255), index=True, nullable=False)
email = Column(String(255))
extra_contact_info = Column(Text)
begin_date = Column(Integer, nullable=False)
end_date = Column(Integer, nullable=False)
policy_published = Column(JSONB)
domain_id = Column(String(255), index=True)
raw_xml = Column(Text)
created_at = Column(DateTime, default=datetime.utcnow, index=True)
updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
__table_args__ = (
Index('idx_dmarc_report_domain_date', 'domain', 'begin_date'),
Index('idx_dmarc_report_org', 'org_name'),
)
def __repr__(self):
return f"<DMARCReport(report_id={self.report_id}, domain={self.domain}, org={self.org_name})>"
class DMARCRecord(Base):
"""Individual records within a DMARC report (one per source IP)"""
__tablename__ = "dmarc_records"
id = Column(Integer, primary_key=True, index=True)
dmarc_report_id = Column(Integer, index=True, nullable=False)
source_ip = Column(String(50), index=True, nullable=False)
count = Column(Integer, nullable=False)
disposition = Column(String(20), index=True)
dkim_result = Column(String(20), index=True)
spf_result = Column(String(20), index=True)
header_from = Column(String(255))
envelope_from = Column(String(255))
envelope_to = Column(String(255))
auth_results = Column(JSONB)
country_code = Column(String(2))
country_name = Column(String(100))
country_emoji = Column(String(10))
city = Column(String(100))
asn = Column(String(20))
asn_org = Column(String(255))
created_at = Column(DateTime, default=datetime.utcnow)
__table_args__ = (
Index('idx_dmarc_record_report', 'dmarc_report_id'),
Index('idx_dmarc_record_ip', 'source_ip'),
Index('idx_dmarc_record_results', 'dkim_result', 'spf_result'),
)
def __repr__(self):
return f"<DMARCRecord(ip={self.source_ip}, count={self.count}, dkim={self.dkim_result}, spf={self.spf_result})>"
class DMARCSync(Base):
"""History of DMARC IMAP sync operations"""
__tablename__ = "dmarc_syncs"
id = Column(Integer, primary_key=True, index=True)
sync_type = Column(String(20), nullable=False)
started_at = Column(DateTime, nullable=False, index=True)
completed_at = Column(DateTime)
status = Column(String(20), nullable=False, index=True)
emails_found = Column(Integer, default=0)
emails_processed = Column(Integer, default=0)
reports_created = Column(Integer, default=0)
reports_duplicate = Column(Integer, default=0)
reports_failed = Column(Integer, default=0)
error_message = Column(Text)
failed_emails = Column(JSONB)
created_at = Column(DateTime, default=datetime.utcnow)
__table_args__ = (
Index('idx_dmarc_sync_type_status', 'sync_type', 'status'),
Index('idx_dmarc_sync_started', 'started_at'),
)
def __repr__(self):
return f"<DMARCSync(type={self.sync_type}, status={self.status}, reports={self.reports_created})>"
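As the model comments note, the correlation key is the SHA-256 hex digest of the Message-ID, which is why the `correlation_key` columns are `String(64)` (a SHA-256 digest is 64 hex characters). A minimal sketch of the derivation; the normalization step (stripping angle brackets, lowercasing) is an assumption for illustration and may differ from the project's parser:

```python
import hashlib

def correlation_key(message_id: str) -> str:
    # Normalize so "<id@host>" and "id@host" map to the same key
    # (normalization is illustrative, not taken from this file)
    normalized = message_id.strip().strip("<>").lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

key = correlation_key("<abc123@mail.example.com>")
```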


"""
DMARC Router - Domain-centric view (Cloudflare style)
"""
import logging
from typing import List, Optional
from datetime import datetime, timedelta, timezone
from fastapi import APIRouter, Depends, HTTPException, UploadFile, File, BackgroundTasks
from sqlalchemy.orm import Session
from sqlalchemy import func, and_, or_, case
from ..database import get_db
from ..models import DMARCReport, DMARCRecord, DMARCSync
from ..services.dmarc_parser import parse_dmarc_file
from ..services.geoip_service import enrich_dmarc_record
from ..services.dmarc_imap_service import sync_dmarc_reports_from_imap
from ..config import settings
from ..scheduler import update_job_status
logger = logging.getLogger(__name__)
router = APIRouter()
# =============================================================================
# DOMAINS LIST
# =============================================================================
@router.get("/dmarc/domains")
async def get_domains_list(
db: Session = Depends(get_db)
):
"""
Get list of all domains with DMARC reports and their statistics
Similar to Cloudflare's domain list
"""
try:
domains_query = db.query(
DMARCReport.domain,
func.count(DMARCReport.id).label('report_count'),
func.min(DMARCReport.begin_date).label('first_report'),
func.max(DMARCReport.end_date).label('last_report')
).group_by(
DMARCReport.domain
).all()
domains_list = []
thirty_days_ago = int((datetime.now() - timedelta(days=30)).timestamp())
for domain, report_count, first_report, last_report in domains_query:
stats = db.query(
func.sum(DMARCRecord.count).label('total_messages'),
func.count(func.distinct(DMARCRecord.source_ip)).label('unique_ips'),
func.sum(
case(
(and_(DMARCRecord.spf_result == 'pass', DMARCRecord.dkim_result == 'pass'), DMARCRecord.count),
else_=0
)
).label('dmarc_pass_count')
).join(
DMARCReport,
DMARCRecord.dmarc_report_id == DMARCReport.id
).filter(
and_(
DMARCReport.domain == domain,
DMARCReport.begin_date >= thirty_days_ago
)
).first()
total_msgs = stats.total_messages or 0
dmarc_pass = stats.dmarc_pass_count or 0
domains_list.append({
'domain': domain,
'report_count': report_count,
'first_report': first_report,
'last_report': last_report,
'stats_30d': {
'total_messages': total_msgs,
'unique_ips': stats.unique_ips or 0,
'dmarc_pass_pct': round((dmarc_pass / total_msgs * 100) if total_msgs > 0 else 0, 2)
}
})
return {
'domains': sorted(domains_list, key=lambda x: x['last_report'], reverse=True),
'total': len(domains_list)
}
except Exception as e:
logger.error(f"Error fetching domains list: {e}")
raise HTTPException(status_code=500, detail=str(e))
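The conditional `SUM(CASE WHEN ...)` used above reduces to a simple fold in Python. Note that this endpoint counts a message toward the DMARC pass total only when both SPF and DKIM pass, which is stricter than RFC 7489 (where one aligned mechanism suffices). A pure-Python sketch of the same aggregation:

```python
def pass_pct(records):
    # records: dicts with 'count', 'spf_result', 'dkim_result'
    total = sum(r["count"] for r in records)
    passed = sum(
        r["count"]
        for r in records
        if r["spf_result"] == "pass" and r["dkim_result"] == "pass"
    )
    return round(passed / total * 100, 2) if total else 0

records = [
    {"count": 90, "spf_result": "pass", "dkim_result": "pass"},
    {"count": 10, "spf_result": "fail", "dkim_result": "pass"},
]
```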
# =============================================================================
# DOMAIN OVERVIEW
# =============================================================================
@router.get("/dmarc/domains/{domain}/overview")
async def get_domain_overview(
domain: str,
days: int = 30,
db: Session = Depends(get_db)
):
"""
Get overview for specific domain with daily aggregated stats
Includes data for charts similar to Cloudflare
"""
try:
cutoff_timestamp = int((datetime.now() - timedelta(days=days)).timestamp())
reports = db.query(DMARCReport).filter(
and_(
DMARCReport.domain == domain,
DMARCReport.begin_date >= cutoff_timestamp
)
).all()
if not reports:
return {
'domain': domain,
'policy': None,
'daily_stats': [],
'totals': {
'total_messages': 0,
'dmarc_pass': 0,
'dmarc_fail': 0,
'unique_ips': 0,
'unique_reporters': 0
}
}
latest_report = max(reports, key=lambda r: r.end_date)
policy = latest_report.policy_published or {}
daily_data = {}
all_ips = set()
all_reporters = set()
for report in reports:
report_date = datetime.fromtimestamp(report.begin_date).date().isoformat()
if report_date not in daily_data:
daily_data[report_date] = {
'date': report_date,
'total': 0,
'dmarc_pass': 0,
'dmarc_fail': 0,
'spf_pass': 0,
'dkim_pass': 0
}
all_reporters.add(report.org_name)
records = db.query(DMARCRecord).filter(
DMARCRecord.dmarc_report_id == report.id
).all()
for record in records:
all_ips.add(record.source_ip)
daily_data[report_date]['total'] += record.count
if record.spf_result == 'pass' and record.dkim_result == 'pass':
daily_data[report_date]['dmarc_pass'] += record.count
else:
daily_data[report_date]['dmarc_fail'] += record.count
if record.spf_result == 'pass':
daily_data[report_date]['spf_pass'] += record.count
if record.dkim_result == 'pass':
daily_data[report_date]['dkim_pass'] += record.count
daily_stats = sorted(daily_data.values(), key=lambda x: x['date'])
total_messages = sum(d['total'] for d in daily_stats)
total_dmarc_pass = sum(d['dmarc_pass'] for d in daily_stats)
total_dmarc_fail = sum(d['dmarc_fail'] for d in daily_stats)
return {
'domain': domain,
'policy': {
'p': policy.get('p', 'none'),
'sp': policy.get('sp'),
'pct': policy.get('pct', 100),
'adkim': policy.get('adkim', 'r'),
'aspf': policy.get('aspf', 'r')
},
'daily_stats': daily_stats,
'totals': {
'total_messages': total_messages,
'dmarc_pass': total_dmarc_pass,
'dmarc_pass_pct': round((total_dmarc_pass / total_messages * 100) if total_messages > 0 else 0, 2),
'dmarc_fail': total_dmarc_fail,
'unique_ips': len(all_ips),
'unique_reporters': len(all_reporters)
}
}
except Exception as e:
logger.error(f"Error fetching domain overview: {e}")
raise HTTPException(status_code=500, detail=str(e))
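The overview endpoint's daily aggregation boils down to bucketing record counts by the report's begin date. A minimal sketch with plain dicts (UTC is used here for determinism; the endpoint itself uses server-local time):

```python
from datetime import datetime, timezone

def group_by_day(rows):
    # rows: (begin_date_epoch, count, spf_result, dkim_result) tuples
    daily = {}
    for begin, count, spf, dkim in rows:
        day = datetime.fromtimestamp(begin, tz=timezone.utc).date().isoformat()
        bucket = daily.setdefault(day, {"total": 0, "dmarc_pass": 0})
        bucket["total"] += count
        if spf == "pass" and dkim == "pass":
            bucket["dmarc_pass"] += count
    return dict(sorted(daily.items()))

rows = [
    (1736380800, 5, "pass", "pass"),  # 2025-01-09 00:00 UTC
    (1736380800, 2, "fail", "pass"),
    (1736467200, 3, "pass", "pass"),  # 2025-01-10 00:00 UTC
]
```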
# =============================================================================
# DOMAIN REPORTS (by day)
# =============================================================================
@router.get("/dmarc/domains/{domain}/reports")
async def get_domain_reports(
domain: str,
days: int = 30,
page: int = 1,
limit: int = 50,
db: Session = Depends(get_db)
):
"""
Get daily aggregated reports for a domain
Groups multiple reports from same day together
"""
try:
cutoff_timestamp = int((datetime.now() - timedelta(days=days)).timestamp())
reports = db.query(DMARCReport).filter(
and_(
DMARCReport.domain == domain,
DMARCReport.begin_date >= cutoff_timestamp
)
).all()
daily_reports = {}
for report in reports:
report_date = datetime.fromtimestamp(report.begin_date).date().isoformat()
if report_date not in daily_reports:
daily_reports[report_date] = {
'date': report_date,
'total_messages': 0,
'dmarc_pass': 0,
'spf_pass': 0,
'dkim_pass': 0,
'unique_ips': set(),
'reporters': set(),
'reports': []
}
records = db.query(DMARCRecord).filter(
DMARCRecord.dmarc_report_id == report.id
).all()
total_for_report = sum(r.count for r in records)
dmarc_pass_for_report = sum(r.count for r in records if r.spf_result == 'pass' and r.dkim_result == 'pass')
spf_pass_for_report = sum(r.count for r in records if r.spf_result == 'pass')
dkim_pass_for_report = sum(r.count for r in records if r.dkim_result == 'pass')
daily_reports[report_date]['reports'].append({
'report_id': report.report_id,
'org_name': report.org_name,
'begin_date': report.begin_date,
'end_date': report.end_date,
'volume': total_for_report,
'dmarc_pass_pct': round((dmarc_pass_for_report / total_for_report * 100) if total_for_report > 0 else 0, 2)
})
daily_reports[report_date]['total_messages'] += total_for_report
daily_reports[report_date]['dmarc_pass'] += dmarc_pass_for_report
daily_reports[report_date]['spf_pass'] += spf_pass_for_report
daily_reports[report_date]['dkim_pass'] += dkim_pass_for_report
daily_reports[report_date]['reporters'].add(report.org_name)
for record in records:
daily_reports[report_date]['unique_ips'].add(record.source_ip)
daily_list = []
for date, data in daily_reports.items():
total = data['total_messages']
daily_list.append({
'date': date,
'total_messages': total,
'dmarc_pass_pct': round((data['dmarc_pass'] / total * 100) if total > 0 else 0, 2),
'spf_pass_pct': round((data['spf_pass'] / total * 100) if total > 0 else 0, 2),
'dkim_pass_pct': round((data['dkim_pass'] / total * 100) if total > 0 else 0, 2),
'unique_ips': len(data['unique_ips']),
'reporters': list(data['reporters']),
'reports': data['reports']
})
daily_list.sort(key=lambda x: x['date'], reverse=True)
total = len(daily_list)
start = (page - 1) * limit
end = start + limit
return {
'domain': domain,
'total': total,
'page': page,
'limit': limit,
'pages': (total + limit - 1) // limit if total > 0 else 0,
'data': daily_list[start:end]
}
except Exception as e:
logger.error(f"Error fetching domain reports: {e}")
raise HTTPException(status_code=500, detail=str(e))
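The `(total + limit - 1) // limit` expression used for the `pages` field is integer ceiling division: it rounds up so a partial last page still counts, without resorting to floats:

```python
def page_count(total: int, limit: int) -> int:
    # Ceiling division: 51 items at 50 per page -> 2 pages
    return (total + limit - 1) // limit if total > 0 else 0
```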
# =============================================================================
# REPORT DETAILS (specific date)
# =============================================================================
@router.get("/dmarc/domains/{domain}/reports/{report_date}/details")
async def get_report_details(
domain: str,
report_date: str,
db: Session = Depends(get_db)
):
"""
Get detailed information for a specific report date
Shows all sources (IPs) that sent emails on that day
"""
try:
date_obj = datetime.strptime(report_date, '%Y-%m-%d').date()
start_timestamp = int(datetime.combine(date_obj, datetime.min.time()).timestamp())
end_timestamp = int(datetime.combine(date_obj, datetime.max.time()).timestamp())
reports = db.query(DMARCReport).filter(
and_(
DMARCReport.domain == domain,
DMARCReport.begin_date >= start_timestamp,
DMARCReport.begin_date <= end_timestamp
)
).all()
if not reports:
raise HTTPException(status_code=404, detail="Report not found")
sources = {}
total_messages = 0
dmarc_pass_count = 0
spf_pass_count = 0
dkim_pass_count = 0
reporters = set()
for report in reports:
reporters.add(report.org_name)
records = db.query(DMARCRecord).filter(
DMARCRecord.dmarc_report_id == report.id
).all()
for record in records:
ip = record.source_ip
if ip not in sources:
source_data = enrich_dmarc_record({'source_ip': ip})
sources[ip] = {
'source_ip': ip,
'source_name': source_data.get('asn_org', 'Unknown'),
'country_code': source_data.get('country_code'),
'country_name': source_data.get('country_name'),
'city': source_data.get('city'),
'asn': source_data.get('asn'),
'asn_org': source_data.get('asn_org'),
'header_from': record.header_from,
'envelope_from': record.envelope_from,
'reporter': report.org_name,
'volume': 0,
'dmarc_pass': 0,
'dmarc_fail': 0,
'spf_pass': 0,
'dkim_pass': 0
}
sources[ip]['volume'] += record.count
total_messages += record.count
if record.spf_result == 'pass' and record.dkim_result == 'pass':
sources[ip]['dmarc_pass'] += record.count
dmarc_pass_count += record.count
else:
sources[ip]['dmarc_fail'] += record.count
if record.spf_result == 'pass':
sources[ip]['spf_pass'] += record.count
spf_pass_count += record.count
if record.dkim_result == 'pass':
sources[ip]['dkim_pass'] += record.count
dkim_pass_count += record.count
sources_list = []
for source_data in sources.values():
volume = source_data['volume']
sources_list.append({
**source_data,
'dmarc_pass_pct': round((source_data['dmarc_pass'] / volume * 100) if volume > 0 else 0, 2),
'spf_pass_pct': round((source_data['spf_pass'] / volume * 100) if volume > 0 else 0, 2),
'dkim_pass_pct': round((source_data['dkim_pass'] / volume * 100) if volume > 0 else 0, 2)
})
sources_list.sort(key=lambda x: x['volume'], reverse=True)
return {
'domain': domain,
'date': report_date,
'totals': {
'total_messages': total_messages,
'dmarc_pass': dmarc_pass_count,
'dmarc_pass_pct': round((dmarc_pass_count / total_messages * 100) if total_messages > 0 else 0, 2),
'spf_pass': spf_pass_count,
'spf_pass_pct': round((spf_pass_count / total_messages * 100) if total_messages > 0 else 0, 2),
'dkim_pass': dkim_pass_count,
'dkim_pass_pct': round((dkim_pass_count / total_messages * 100) if total_messages > 0 else 0, 2),
'unique_ips': len(sources_list),
'reporters': list(reporters)
},
'sources': sources_list
}
except ValueError:
raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")
except HTTPException:
raise
except Exception as e:
logger.error(f"Error fetching report details: {e}")
raise HTTPException(status_code=500, detail=str(e))
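The date window above spans from midnight to the last microsecond of the same day, built with `datetime.min.time()` and `datetime.max.time()`. A self-contained sketch of that computation:

```python
from datetime import datetime

def day_bounds(report_date: str):
    # Parse YYYY-MM-DD and return (start, end) epoch seconds for that day
    date_obj = datetime.strptime(report_date, "%Y-%m-%d").date()
    start = int(datetime.combine(date_obj, datetime.min.time()).timestamp())
    end = int(datetime.combine(date_obj, datetime.max.time()).timestamp())
    return start, end

start, end = day_bounds("2026-01-09")
```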
# =============================================================================
# DOMAIN SOURCES
# =============================================================================
@router.get("/dmarc/domains/{domain}/sources")
async def get_domain_sources(
domain: str,
days: int = 30,
page: int = 1,
limit: int = 50,
db: Session = Depends(get_db)
):
"""
Get aggregated sources (IPs) for a domain
With GeoIP enrichment
"""
try:
cutoff_timestamp = int((datetime.now() - timedelta(days=days)).timestamp())
records_query = db.query(
DMARCRecord.source_ip,
func.sum(DMARCRecord.count).label('total_count'),
func.sum(
case(
(and_(DMARCRecord.spf_result == 'pass', DMARCRecord.dkim_result == 'pass'), DMARCRecord.count),
else_=0
)
).label('dmarc_pass_count'),
func.sum(
case(
(DMARCRecord.spf_result == 'pass', DMARCRecord.count),
else_=0
)
).label('spf_pass_count'),
func.sum(
case(
(DMARCRecord.dkim_result == 'pass', DMARCRecord.count),
else_=0
)
).label('dkim_pass_count')
).join(
DMARCReport,
DMARCRecord.dmarc_report_id == DMARCReport.id
).filter(
and_(
DMARCReport.domain == domain,
DMARCReport.begin_date >= cutoff_timestamp
)
).group_by(
DMARCRecord.source_ip
).order_by(
func.sum(DMARCRecord.count).desc()
).all()
sources_list = []
for ip, total, dmarc_pass, spf_pass, dkim_pass in records_query:
source_data = enrich_dmarc_record({'source_ip': ip})
sources_list.append({
'source_ip': ip,
'country_code': source_data.get('country_code'),
'country_name': source_data.get('country_name'),
'country_emoji': source_data.get('country_emoji', '🌍'),
'city': source_data.get('city'),
'asn': source_data.get('asn'),
'asn_org': source_data.get('asn_org'),
'total_count': total,
'dmarc_pass': dmarc_pass,
'dmarc_pass_pct': round((dmarc_pass / total * 100) if total > 0 else 0, 2),
'spf_pass': spf_pass,
'spf_pass_pct': round((spf_pass / total * 100) if total > 0 else 0, 2),
'dkim_pass': dkim_pass,
'dkim_pass_pct': round((dkim_pass / total * 100) if total > 0 else 0, 2)
})
total = len(sources_list)
start = (page - 1) * limit
end = start + limit
return {
'domain': domain,
'total': total,
'page': page,
'limit': limit,
'pages': (total + limit - 1) // limit if total > 0 else 0,
'data': sources_list[start:end]
}
except Exception as e:
logger.error(f"Error fetching domain sources: {e}")
raise HTTPException(status_code=500, detail=str(e))
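The `country_emoji` field can be derived from an ISO 3166-1 alpha-2 code by mapping each letter onto a Unicode regional-indicator symbol; the globe fallback below matches the default used above. A sketch (not necessarily how the project's `geoip_service` computes it):

```python
def flag_emoji(country_code):
    # Map 'A'..'Z' onto U+1F1E6..U+1F1FF (regional indicator symbols)
    if not country_code or len(country_code) != 2 or not country_code.isalpha():
        return "🌍"
    return "".join(chr(0x1F1E6 + ord(c) - ord("A")) for c in country_code.upper())
```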
# =============================================================================
# SOURCE DETAILS (specific IP aggregated across dates)
# =============================================================================
@router.get("/dmarc/domains/{domain}/sources/{source_ip}/details")
async def get_source_details(
domain: str,
source_ip: str,
days: int = 30,
db: Session = Depends(get_db)
):
"""
Get detailed information for a specific source IP
Shows all dates when this IP sent emails, grouped by envelope_from
"""
try:
cutoff_timestamp = int((datetime.now() - timedelta(days=days)).timestamp())
records = db.query(DMARCRecord, DMARCReport).join(
DMARCReport,
DMARCRecord.dmarc_report_id == DMARCReport.id
).filter(
and_(
DMARCReport.domain == domain,
DMARCRecord.source_ip == source_ip,
DMARCReport.begin_date >= cutoff_timestamp
)
).all()
if not records:
raise HTTPException(status_code=404, detail="Source not found")
source_data = enrich_dmarc_record({'source_ip': source_ip})
envelope_from_groups = {}
total_messages = 0
dmarc_pass_count = 0
spf_pass_count = 0
dkim_pass_count = 0
reporters = set()
for record, report in records:
envelope = record.envelope_from
reporters.add(report.org_name)
if envelope not in envelope_from_groups:
envelope_from_groups[envelope] = {
'envelope_from': envelope,
'header_from': record.header_from,
'reporter': report.org_name,
'volume': 0,
'dmarc_pass': 0,
'dmarc_fail': 0,
'spf_aligned': 0,
'dkim_aligned': 0,
'spf_result': record.spf_result,
'dkim_result': record.dkim_result
}
envelope_from_groups[envelope]['volume'] += record.count
total_messages += record.count
if record.spf_result == 'pass' and record.dkim_result == 'pass':
envelope_from_groups[envelope]['dmarc_pass'] += record.count
dmarc_pass_count += record.count
else:
envelope_from_groups[envelope]['dmarc_fail'] += record.count
if record.spf_result == 'pass':
envelope_from_groups[envelope]['spf_aligned'] += record.count
spf_pass_count += record.count
if record.dkim_result == 'pass':
envelope_from_groups[envelope]['dkim_aligned'] += record.count
dkim_pass_count += record.count
envelope_list = sorted(envelope_from_groups.values(), key=lambda x: x['volume'], reverse=True)
return {
'domain': domain,
'source_ip': source_ip,
'source_name': source_data.get('asn_org', 'Unknown'),
'country_code': source_data.get('country_code'),
'country_name': source_data.get('country_name'),
'city': source_data.get('city'),
'asn': source_data.get('asn'),
'asn_org': source_data.get('asn_org'),
'totals': {
'total_messages': total_messages,
'dmarc_pass': dmarc_pass_count,
'dmarc_pass_pct': round((dmarc_pass_count / total_messages * 100) if total_messages > 0 else 0, 2),
'spf_pass': spf_pass_count,
'spf_pass_pct': round((spf_pass_count / total_messages * 100) if total_messages > 0 else 0, 2),
'dkim_pass': dkim_pass_count,
'dkim_pass_pct': round((dkim_pass_count / total_messages * 100) if total_messages > 0 else 0, 2),
'unique_envelopes': len(envelope_list),
'reporters': list(reporters)
},
'envelope_from_groups': envelope_list
}
except HTTPException:
raise
except Exception as e:
logger.error(f"Error fetching source details: {e}")
raise HTTPException(status_code=500, detail=str(e))
# =============================================================================
# IMAP SYNC STATUS
# =============================================================================
@router.get("/dmarc/imap/status")
async def get_imap_status(db: Session = Depends(get_db)):
"""
Get IMAP sync configuration and status
"""
try:
# Get latest sync
latest_sync = db.query(DMARCSync).order_by(
DMARCSync.started_at.desc()
).first()
# Get sync stats (last 24 hours)
twenty_four_hours_ago = datetime.now() - timedelta(hours=24)
recent_syncs = db.query(DMARCSync).filter(
DMARCSync.started_at >= twenty_four_hours_ago
).all()
total_reports_24h = sum(s.reports_created for s in recent_syncs)
total_failed_24h = sum(s.reports_failed for s in recent_syncs)
return {
'enabled': settings.dmarc_imap_enabled,
'configuration': {
'host': settings.dmarc_imap_host if settings.dmarc_imap_enabled else None,
'port': settings.dmarc_imap_port if settings.dmarc_imap_enabled else None,
'user': settings.dmarc_imap_user if settings.dmarc_imap_enabled else None,
'folder': settings.dmarc_imap_folder if settings.dmarc_imap_enabled else None,
'delete_after': settings.dmarc_imap_delete_after if settings.dmarc_imap_enabled else None,
'interval_seconds': settings.dmarc_imap_interval if settings.dmarc_imap_enabled else None,
'interval_minutes': round(settings.dmarc_imap_interval / 60, 1) if settings.dmarc_imap_enabled else None
},
'latest_sync': {
'id': latest_sync.id,
'sync_type': latest_sync.sync_type,
'started_at': latest_sync.started_at.strftime('%Y-%m-%dT%H:%M:%SZ') if latest_sync.started_at else None,
'completed_at': latest_sync.completed_at.strftime('%Y-%m-%dT%H:%M:%SZ') if latest_sync.completed_at else None,
'status': latest_sync.status,
'emails_found': latest_sync.emails_found,
'emails_processed': latest_sync.emails_processed,
'reports_created': latest_sync.reports_created,
'reports_duplicate': latest_sync.reports_duplicate,
'reports_failed': latest_sync.reports_failed,
'error_message': latest_sync.error_message
} if latest_sync else None,
'stats_24h': {
'total_syncs': len(recent_syncs),
'total_reports_created': total_reports_24h,
'total_reports_failed': total_failed_24h
}
}
except Exception as e:
logger.error(f"Error fetching IMAP status: {e}")
raise HTTPException(status_code=500, detail=str(e))
# =============================================================================
# MANUAL IMAP SYNC
# =============================================================================
@router.post("/dmarc/imap/sync")
async def trigger_manual_sync(background_tasks: BackgroundTasks, db: Session = Depends(get_db)):
"""
Manually trigger IMAP sync and update global job status for UI visibility
"""
if not settings.dmarc_imap_enabled:
raise HTTPException(
status_code=400,
detail="DMARC IMAP sync is not enabled."
)
try:
# Cleanup any stuck 'running' status in the specific sync table
db.query(DMARCSync).filter(DMARCSync.status == 'running').update({
"status": "failed",
"error_message": "Interrupted by manual restart"
})
db.commit()
# Update the global job status that the UI monitors
# This ensures the UI shows "Running" immediately
update_job_status('dmarc_imap_sync', 'running')
# We define a wrapper function to handle the background task status
def manual_sync_wrapper():
try:
# Perform the actual sync
result = sync_dmarc_reports_from_imap(sync_type='manual')
if result.get('status') == 'error':
update_job_status('dmarc_imap_sync', 'failed', result.get('error_message'))
else:
update_job_status('dmarc_imap_sync', 'success')
except Exception as e:
logger.error(f"Manual sync background error: {e}")
update_job_status('dmarc_imap_sync', 'failed', str(e))
# Trigger the wrapper in background
background_tasks.add_task(manual_sync_wrapper)
return {
'status': 'started',
'message': 'DMARC IMAP sync started'
}
except Exception as e:
db.rollback()
logger.error(f"Error triggering manual sync: {e}")
# If triggering fails, mark job as failed
update_job_status('dmarc_imap_sync', 'failed', str(e))
raise HTTPException(status_code=500, detail="Internal Server Error")
# =============================================================================
# IMAP SYNC HISTORY
# =============================================================================
@router.get("/dmarc/imap/history")
async def get_sync_history(
limit: int = 20,
page: int = 1,
db: Session = Depends(get_db)
):
"""
Get history of IMAP sync operations
"""
try:
# Get total count
total = db.query(DMARCSync).count()
# Get paginated results
offset = (page - 1) * limit
syncs = db.query(DMARCSync).order_by(
DMARCSync.started_at.desc()
).offset(offset).limit(limit).all()
return {
'total': total,
'page': page,
'limit': limit,
'pages': (total + limit - 1) // limit if total > 0 else 0,
'data': [
{
'id': sync.id,
'sync_type': sync.sync_type,
'status': sync.status,
'started_at': sync.started_at.strftime('%Y-%m-%dT%H:%M:%SZ') if sync.started_at else None,
'completed_at': sync.completed_at.strftime('%Y-%m-%dT%H:%M:%SZ') if sync.completed_at else None,
'emails_found': sync.emails_found,
'emails_processed': sync.emails_processed,
'reports_created': sync.reports_created,
'reports_duplicate': sync.reports_duplicate,
'reports_failed': sync.reports_failed,
'error_message': sync.error_message,
'failed_emails': sync.failed_emails
}
for sync in syncs
]
}
except Exception as e:
logger.error(f"Error fetching sync history: {e}")
raise HTTPException(status_code=500, detail=str(e))
# =============================================================================
# UPLOAD
# =============================================================================
@router.post("/dmarc/upload")
async def upload_dmarc_report(
file: UploadFile = File(...),
db: Session = Depends(get_db)
):
"""Upload and parse DMARC report file (XML, GZ, or ZIP)"""
if not settings.dmarc_manual_upload_enabled:
raise HTTPException(
status_code=403,
detail="Manual DMARC report upload is disabled"
)
try:
file_content = await file.read()
parsed_data = parse_dmarc_file(file_content, file.filename)
if not parsed_data:
raise HTTPException(status_code=400, detail="Failed to parse DMARC report")
records_data = parsed_data.pop('records', [])
report_data = parsed_data
existing = db.query(DMARCReport).filter(
DMARCReport.report_id == report_data['report_id']
).first()
if existing:
return {
'status': 'duplicate',
'message': f'Report {report_data["report_id"]} already exists'
}
report = DMARCReport(**report_data)
db.add(report)
db.flush()
for record_data in records_data:
record_data['dmarc_report_id'] = report.id
enriched = enrich_dmarc_record(record_data)
record = DMARCRecord(**enriched)
db.add(record)
db.commit()
return {
'status': 'success',
'message': f'Uploaded report for {report.domain} from {report.org_name}',
'report_id': report.id,
'records_count': len(records_data)
}
except HTTPException:
raise
except Exception as e:
db.rollback()
logger.error(f"Error uploading DMARC report: {e}", exc_info=True)
raise HTTPException(status_code=500, detail=str(e))


@@ -0,0 +1,35 @@
from fastapi import APIRouter, HTTPException
from fastapi.responses import PlainTextResponse
import httpx
import logging
logger = logging.getLogger(__name__)
router = APIRouter()
GITHUB_DOCS_BASE_URL = "https://raw.githubusercontent.com/ShlomiPorush/mailcow-logs-viewer/main/documentation/HelpDocs"
ALLOWED_DOCS = {
"Domains": "Domains.md",
"DMARC": "DMARC.md",
}
@router.get("/docs/{doc_name}", response_class=PlainTextResponse)
async def get_documentation(doc_name: str):
if doc_name not in ALLOWED_DOCS:
raise HTTPException(status_code=404, detail="Documentation not found")
filename = ALLOWED_DOCS[doc_name]
    url = f"{GITHUB_DOCS_BASE_URL}/{filename}"
try:
async with httpx.AsyncClient(timeout=10.0) as client:
response = await client.get(url)
response.raise_for_status()
return response.text
except httpx.HTTPStatusError as e:
logger.error(f"Failed to fetch documentation {doc_name}: HTTP {e.response.status_code}")
raise HTTPException(status_code=404, detail="Documentation not found")
except httpx.RequestError as e:
logger.error(f"Failed to fetch documentation {doc_name}: {e}")
raise HTTPException(status_code=503, detail="Failed to fetch documentation")
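The dictionary allowlist above is what keeps `doc_name` from reaching the URL directly; a minimal sketch of the same lookup (helper name hypothetical):

```python
ALLOWED_DOCS = {"Domains": "Domains.md", "DMARC": "DMARC.md"}

def resolve_doc_filename(doc_name: str) -> str:
    # Unknown names (including traversal attempts like "../config") are
    # rejected before any URL is built, so only known files can be fetched.
    if doc_name not in ALLOWED_DOCS:
        raise KeyError(f"unknown doc: {doc_name}")
    return ALLOWED_DOCS[doc_name]
```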


@@ -3,53 +3,118 @@ API endpoints for domains management with DNS validation
"""
import logging
import asyncio
import ipaddress
import httpx
from fastapi import APIRouter, HTTPException, Depends
from typing import Dict, Any, List
import dns.resolver
import dns.asyncresolver
from datetime import datetime, timezone
from sqlalchemy.orm import Session
from sqlalchemy import text
from app.database import get_db
from app.models import DomainDNSCheck
from app.utils import format_datetime_for_api
from app.config import settings
from app.mailcow_api import mailcow_api
logger = logging.getLogger(__name__)
router = APIRouter()
_server_ip_cache = None
async def init_server_ip():
"""
Initialize and cache server IP address from Mailcow API
Called once during application startup
"""
global _server_ip_cache
if _server_ip_cache is not None:
return _server_ip_cache
try:
async with httpx.AsyncClient(timeout=10) as client:
response = await client.get(
f"{settings.mailcow_url}/api/v1/get/status/host/ip",
headers={"X-API-Key": settings.mailcow_api_key}
)
response.raise_for_status()
data = response.json()
logger.debug(f"Mailcow IP API response: {data}")
if isinstance(data, list) and len(data) > 0:
_server_ip_cache = data[0].get('ipv4')
if _server_ip_cache:
logger.info(f"Server IP cached successfully: {_server_ip_cache}")
return _server_ip_cache
else:
logger.warning(f"API response missing 'ipv4' field. Response: {data[0]}")
elif isinstance(data, dict):
_server_ip_cache = data.get('ipv4')
if _server_ip_cache:
logger.info(f"Server IP cached successfully: {_server_ip_cache}")
return _server_ip_cache
else:
logger.warning(f"API response missing 'ipv4' field. Response: {data}")
else:
logger.warning(f"Unexpected API response format. Type: {type(data)}, Data: {data}")
logger.warning("Could not fetch server IP from Mailcow - no valid IP in response")
return None
except httpx.HTTPStatusError as e:
logger.error(f"HTTP error fetching server IP: {e.response.status_code} - {e.response.text}")
return None
except Exception as e:
logger.error(f"Failed to fetch server IP: {type(e).__name__} - {str(e)}")
return None
def get_cached_server_ip() -> str:
"""
Get the cached server IP address
Returns None if not yet cached or failed to fetch
"""
global _server_ip_cache
return _server_ip_cache
async def check_spf_record(domain: str) -> Dict[str, Any]:
    """
    Check SPF record for a domain with full validation
    """
try:
resolver = dns.asyncresolver.Resolver()
resolver.timeout = 5
resolver.lifetime = 5
# Query TXT records
answers = await resolver.resolve(domain, 'TXT')
# Find SPF record
spf_records = []
for rdata in answers:
    txt_data = b''.join(rdata.strings).decode('utf-8')
    if txt_data.startswith('v=spf1'):
        spf_records.append(txt_data)
if len(spf_records) > 1:
return {
'status': 'error',
'message': f'Multiple SPF records found ({len(spf_records)}). Only one is allowed',
'record': '; '.join(spf_records),
'has_strict_all': False,
'includes_mx': False,
'includes': [],
'warnings': ['Multiple SPF records invalidate ALL records']
}
if not spf_records:
return {
'status': 'error',
'message': 'SPF record not found',
@@ -59,44 +124,105 @@ async def check_spf_record(domain: str) -> Dict[str, Any]:
'includes': []
}
# Check for different 'all' policies
spf_record = spf_records[0]
if not spf_record.startswith('v=spf1 ') and spf_record != 'v=spf1':
return {
'status': 'error',
'message': 'Invalid SPF syntax - must start with "v=spf1 " (with space)',
'record': spf_record,
'has_strict_all': False,
'includes_mx': False,
'includes': []
}
parts = spf_record.split()
mechanisms = parts[1:] if len(parts) > 1 else []
valid_prefixes = ['ip4:', 'ip6:', 'a', 'mx', 'include:', 'exists:', 'all']
invalid_mechanisms = []
for mechanism in mechanisms:
clean_mech = mechanism.lstrip('+-~?')
is_valid = any(clean_mech == prefix or clean_mech.startswith(prefix) for prefix in valid_prefixes)
if not is_valid:
invalid_mechanisms.append(mechanism)
if invalid_mechanisms:
return {
'status': 'error',
'message': f'Invalid SPF mechanisms: {", ".join(invalid_mechanisms)}',
'record': spf_record,
'has_strict_all': False,
'includes_mx': False,
'includes': []
}
spf_lower = spf_record.lower()
has_strict_all = '-all' in spf_lower
has_soft_fail = '~all' in spf_lower
has_neutral = '?all' in spf_lower
has_pass_all = '+all' in spf_lower or ' all' in spf_lower
if not (has_strict_all or has_soft_fail or has_neutral or has_pass_all):
return {
'status': 'error',
'message': 'SPF record missing "all" mechanism',
'record': spf_record,
'has_strict_all': False,
'includes_mx': False,
'includes': [],
'warnings': ['SPF should end with -all or ~all']
}
includes_mx = any(m.lstrip('+-~?') in ['mx'] or m.lstrip('+-~?').startswith('mx:') for m in mechanisms)
includes = [m.replace('include:', '') for m in mechanisms if m.startswith('include:')]
dns_lookup_count = await count_spf_dns_lookups(domain, spf_record, resolver)
global _server_ip_cache
server_ip = _server_ip_cache
if not server_ip:
server_ip = await init_server_ip()
server_authorized = False
authorization_method = None
if server_ip:
server_authorized, authorization_method = await check_ip_in_spf(domain, server_ip, spf_record, resolver)
warnings = []
if dns_lookup_count > 10:
status = 'error'
message = f'SPF has too many DNS lookups ({dns_lookup_count}). Maximum is 10'
warnings = [f'SPF record exceeds the 10 DNS lookup limit with {dns_lookup_count} lookups', 'This will cause SPF validation to fail']
elif has_pass_all:
status = 'error'
message = 'SPF uses +all (allows any server). This provides no protection!'
warnings = ['+all allows anyone to send email as your domain']
elif not server_authorized and server_ip:
    status = 'error'
    message = f'Server IP {server_ip} is NOT authorized in SPF record'
    warnings = ['Mail server IP not found in SPF record']
elif has_strict_all:
status = 'success'
message = f'SPF configured correctly with strict -all policy{f". Server IP authorized via {authorization_method}" if server_authorized else ""}'
warnings = []
elif has_soft_fail:
status = 'success'
message = f'SPF uses ~all (soft fail){f". Server IP authorized via {authorization_method}" if server_authorized else ""}. Consider using -all for stricter policy'
warnings = []
elif has_neutral:
status = 'warning'
message = 'SPF uses ?all (neutral). Consider using -all for stricter policy'
warnings = ['Using ?all provides minimal protection']
else:
status = 'success'
message = 'SPF record found'
warnings = []
return {
'status': status,
@@ -105,7 +231,8 @@ async def check_spf_record(domain: str) -> Dict[str, Any]:
'has_strict_all': has_strict_all,
'includes_mx': includes_mx,
'includes': includes,
'warnings': warnings,
'dns_lookups': dns_lookup_count
}
except dns.resolver.NXDOMAIN:
@@ -138,6 +265,295 @@ async def check_spf_record(domain: str) -> Dict[str, Any]:
}
async def check_ip_in_spf(domain: str, ip_to_check: str, spf_record: str, resolver, visited_domains: set = None, depth: int = 0) -> tuple:
"""
Check if IP is authorized in SPF record recursively
Returns: (authorized: bool, method: str or None)
"""
if depth > 10:
return False, None
if visited_domains is None:
visited_domains = set()
if domain in visited_domains:
return False, None
visited_domains.add(domain)
parts = spf_record.split()
for part in parts:
clean_part = part.lstrip('+-~?')
if clean_part.startswith('ip4:'):
ip_spec = clean_part.replace('ip4:', '')
try:
if '/' in ip_spec:
network = ipaddress.ip_network(ip_spec, strict=False)
if ipaddress.ip_address(ip_to_check) in network:
return True, f'ip4:{ip_spec}'
else:
if ip_to_check == ip_spec:
return True, f'ip4:{ip_spec}'
except:
pass
elif clean_part in ['a'] or clean_part.startswith('a:'):
check_domain = domain if clean_part == 'a' else clean_part.split(':', 1)[1]
try:
a_records = await resolver.resolve(check_domain, 'A')
for rdata in a_records:
if str(rdata) == ip_to_check:
return True, f'a:{check_domain}' if clean_part.startswith('a:') else 'a'
except:
pass
elif clean_part in ['mx'] or clean_part.startswith('mx:'):
check_domain = domain if clean_part == 'mx' else clean_part.split(':', 1)[1]
try:
mx_records = await resolver.resolve(check_domain, 'MX')
for mx in mx_records:
try:
mx_a_records = await resolver.resolve(str(mx.exchange), 'A')
for rdata in mx_a_records:
if str(rdata) == ip_to_check:
return True, f'mx:{check_domain}' if clean_part.startswith('mx:') else 'mx'
except:
pass
except:
pass
elif clean_part.startswith('include:'):
include_domain = clean_part.replace('include:', '')
try:
include_answers = await resolver.resolve(include_domain, 'TXT')
for rdata in include_answers:
include_spf = b''.join(rdata.strings).decode('utf-8')
if include_spf.startswith('v=spf1'):
authorized, method = await check_ip_in_spf(
include_domain,
ip_to_check,
include_spf,
resolver,
visited_domains.copy(),
depth + 1
)
if authorized:
return True, f'include:{include_domain} ({method})'
except:
pass
return False, None
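The `ip4:` branch above delegates CIDR membership to the standard `ipaddress` module; the core test in isolation (helper name illustrative):

```python
import ipaddress

def ip_matches_spec(ip: str, spec: str) -> bool:
    # "ip4:192.0.2.0/24" style specs are networks; bare specs are exact IPs.
    if '/' in spec:
        return ipaddress.ip_address(ip) in ipaddress.ip_network(spec, strict=False)
    return ip == spec
```

`strict=False` accepts specs whose host bits are set (e.g. `192.0.2.10/24`), which real-world SPF records sometimes publish.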
async def count_spf_dns_lookups(domain: str, spf_record: str, resolver, visited_domains: set = None, depth: int = 0) -> int:
"""
Count DNS lookups in SPF record recursively
SPF limit is 10 DNS lookups
"""
if depth > 10:
return 999
if visited_domains is None:
visited_domains = set()
if domain in visited_domains:
return 0
visited_domains.add(domain)
parts = spf_record.split()
lookup_count = 0
for part in parts:
clean_part = part.lstrip('+-~?')
if clean_part.startswith('include:'):
lookup_count += 1
include_domain = clean_part.replace('include:', '')
try:
include_answers = await resolver.resolve(include_domain, 'TXT')
for rdata in include_answers:
include_spf = b''.join(rdata.strings).decode('utf-8')
if include_spf.startswith('v=spf1'):
nested_count = await count_spf_dns_lookups(
include_domain,
include_spf,
resolver,
visited_domains.copy(),
depth + 1
)
lookup_count += nested_count
break
except:
pass
elif clean_part in ['a'] or clean_part.startswith('a:'):
lookup_count += 1
elif clean_part in ['mx'] or clean_part.startswith('mx:'):
lookup_count += 1
elif clean_part.startswith('exists:'):
lookup_count += 1
elif clean_part.startswith('redirect='):
lookup_count += 1
return lookup_count
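For a record with no nested `include:` chains, the lookup counting above reduces to a single pass over the mechanisms (a sketch; per RFC 7208 the total across all nesting is capped at 10):

```python
def count_flat_lookups(spf_record: str) -> int:
    # a, mx, include:, exists: and redirect= each cost one DNS lookup;
    # ip4:/ip6: literals and the all mechanism are free.
    count = 0
    for part in spf_record.split()[1:]:  # skip the v=spf1 version tag
        mech = part.lstrip('+-~?')
        if mech in ('a', 'mx') or mech.startswith(('a:', 'mx:', 'include:', 'exists:', 'redirect=')):
            count += 1
    return count
```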
def parse_dkim_parameters(dkim_record: str) -> Dict[str, Any]:
"""
Parse and validate DKIM record parameters
Args:
dkim_record: DKIM TXT record string
Returns:
Dictionary with parameter validation results
"""
issues = []
info = []
params = {}
for part in dkim_record.split(';'):
part = part.strip()
if '=' in part:
key, value = part.split('=', 1)
params[key.strip()] = value.strip()
if 'p' in params and params['p'] == '':
issues.append({
'level': 'error',
'message': 'DKIM key is revoked (p= is empty)',
'description': 'This DKIM record has been intentionally disabled'
})
if 't' in params:
flags = params['t']
if 'y' in flags:
issues.append({
'level': 'critical',
'message': 'DKIM is in TESTING mode (t=y)',
'description': 'Emails will pass validation even with invalid signatures. Remove t=y for production!'
})
if 's' in flags:
info.append({
'level': 'info',
'message': 'DKIM uses strict subdomain mode (t=s)',
'description': 'Only the main domain can send emails. Subdomains like mail.example.com will fail DKIM validation'
})
if 'h' in params:
hash_algo = params['h'].lower()
if hash_algo == 'sha1':
issues.append({
'level': 'warning',
'message': 'DKIM uses SHA1 hash algorithm (h=sha1)',
'description': 'SHA1 is deprecated and insecure. Upgrade to SHA256 (h=sha256)'
})
if 'k' in params:
key_type = params['k'].lower()
if key_type not in ['rsa', 'ed25519']:
issues.append({
'level': 'warning',
'message': f'Unknown key type: {key_type}',
'description': 'Expected rsa or ed25519'
})
return {
'has_issues': len(issues) > 0,
'issues': issues,
'info': info,
'parameters': params
}
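The tag parsing above follows the usual `key=value;` DKIM layout; a standalone run of just that split (reimplemented here for illustration):

```python
def split_dkim_tags(record: str) -> dict:
    # Split "v=DKIM1; k=rsa; t=y; p=..." into {'v': 'DKIM1', 'k': 'rsa', ...}
    params = {}
    for part in record.split(';'):
        part = part.strip()
        if '=' in part:
            key, value = part.split('=', 1)
            params[key.strip()] = value.strip()
    return params
```

A record carrying `t=y` would then be flagged as testing mode, and an empty `p=` as a revoked key, by the checks above.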
async def check_dkim_record(domain: str) -> Dict[str, Any]:
"""
Check DKIM record for a domain
@@ -169,7 +585,10 @@ async def check_dkim_record(domain: str) -> Dict[str, Any]:
'selector': None,
'expected_record': None,
'actual_record': None,
'match': False
'match': False,
'warnings': [],
'info': [],
'parameters': {}
}
except httpx.RequestError as e:
logger.error(f"Request error fetching DKIM from Mailcow for {domain}: {e}")
@@ -179,7 +598,10 @@ async def check_dkim_record(domain: str) -> Dict[str, Any]:
'selector': None,
'expected_record': None,
'actual_record': None,
'match': False
'match': False,
'warnings': [],
'info': [],
'parameters': {}
}
# Validate response structure - API can return either dict or list
@@ -196,7 +618,10 @@ async def check_dkim_record(domain: str) -> Dict[str, Any]:
'selector': None,
'expected_record': None,
'actual_record': None,
'match': False
'match': False,
'warnings': [],
'info': [],
'parameters': {}
}
# Get first element from list
dkim_config = dkim_data[0]
@@ -209,6 +634,10 @@ async def check_dkim_record(domain: str) -> Dict[str, Any]:
'expected_record': None,
'actual_record': None,
'match': False,
'warnings': [],
'info': [],
'parameters': {}
}
# Validate required fields
@@ -220,7 +649,10 @@ async def check_dkim_record(domain: str) -> Dict[str, Any]:
'selector': None,
'expected_record': None,
'actual_record': None,
'match': False
'match': False,
'warnings': [],
'info': [],
'parameters': {}
}
selector = dkim_config.get('dkim_selector', 'dkim')
@@ -234,7 +666,10 @@ async def check_dkim_record(domain: str) -> Dict[str, Any]:
'selector': selector,
'expected_record': None,
'actual_record': None,
'match': False
'match': False,
'warnings': [],
'info': [],
'parameters': {}
}
# Construct DKIM domain
@@ -260,19 +695,52 @@ async def check_dkim_record(domain: str) -> Dict[str, Any]:
match = expected_clean == actual_clean
dkim_params = parse_dkim_parameters(actual_record)
warnings = []
info_messages = []
critical_issues = []
for issue in dkim_params['issues']:
if issue['level'] == 'critical':
critical_issues.append(f"{issue['message']} - {issue['description']}")
elif issue['level'] == 'error':
warnings.append(f"{issue['message']}")
elif issue['level'] == 'warning':
warnings.append(f"⚠️ {issue['message']}")
for item in dkim_params['info']:
info_messages.append(item['message'])
if critical_issues:
status = 'error'
message = critical_issues[0]
elif not match:
status = 'error'
message = 'DKIM record mismatch'
elif warnings:
status = 'warning'
message = 'DKIM configured but has warnings'
else:
status = 'success'
message = 'DKIM configured correctly'
if match:
logger.info(f"DKIM check passed for {domain}")
else:
logger.warning(f"DKIM mismatch for {domain}")
return {
'status': status,
'message': message,
'selector': selector,
'dkim_domain': dkim_domain,
'expected_record': expected_value,
'actual_record': actual_record,
'match': match,
'warnings': warnings,
'info': info_messages,
'parameters': dkim_params['parameters']
}
except dns.resolver.NXDOMAIN:
@@ -285,6 +753,10 @@ async def check_dkim_record(domain: str) -> Dict[str, Any]:
'expected_record': expected_value,
'actual_record': None,
'match': False,
'warnings': [],
'info': [],
'parameters': {}
}
except dns.resolver.NoAnswer:
logger.warning(f"No TXT record at {dkim_domain} for {domain}")
@@ -296,6 +768,10 @@ async def check_dkim_record(domain: str) -> Dict[str, Any]:
'expected_record': expected_value,
'actual_record': None,
'match': False,
'warnings': [],
'info': [],
'parameters': {}
}
except dns.exception.Timeout:
logger.error(f"DNS timeout checking DKIM for {domain}")
@@ -306,7 +782,10 @@ async def check_dkim_record(domain: str) -> Dict[str, Any]:
'dkim_domain': dkim_domain,
'expected_record': expected_value,
'actual_record': None,
'match': False
'match': False,
'warnings': [],
'info': [],
'parameters': {}
}
except Exception as e:
@@ -318,6 +797,10 @@ async def check_dkim_record(domain: str) -> Dict[str, Any]:
'expected_record': None,
'actual_record': None,
'match': False,
'warnings': [],
'info': [],
'parameters': {}
}


@@ -363,7 +363,12 @@ async def get_message_full_details(
"ip": rspamd_log.ip,
"user": rspamd_log.user,
"has_auth": rspamd_log.has_auth,
"size": rspamd_log.size,
"country_code": rspamd_log.country_code,
"country_name": rspamd_log.country_name,
"city": rspamd_log.city,
"asn": rspamd_log.asn,
"asn_org": rspamd_log.asn_org
} if rspamd_log else None,
"postfix_by_recipient": _group_postfix_by_recipient(postfix_logs),
"postfix": [


@@ -3,6 +3,7 @@ API endpoints for settings and system information
Shows configuration, last import times, and background job status
"""
import logging
import httpx
from fastapi import APIRouter, Depends
from sqlalchemy.orm import Session
from sqlalchemy import func, desc, text, or_
@@ -13,6 +14,9 @@ from ..database import get_db
from ..models import PostfixLog, RspamdLog, NetfilterLog, MessageCorrelation
from ..config import settings
from ..scheduler import last_fetch_run_time, get_job_status
from ..services.connection_test import test_smtp_connection, test_imap_connection
from ..services.geoip_downloader import is_license_configured, get_geoip_status
from .domains import get_cached_server_ip
logger = logging.getLogger(__name__)
@@ -94,6 +98,7 @@ async def get_settings_info(db: Session = Depends(get_db)):
return {
"configuration": {
"mailcow_url": settings.mailcow_url,
"server_ip": get_cached_server_ip(),
"local_domains": settings.local_domains_list,
"fetch_interval": settings.fetch_interval,
"fetch_count_postfix": settings.fetch_count_postfix,
@@ -111,7 +116,8 @@ async def get_settings_info(db: Session = Depends(get_db)):
"csv_export_limit": settings.csv_export_limit,
"scheduler_workers": settings.scheduler_workers,
"auth_enabled": settings.auth_enabled,
"auth_username": settings.auth_username if settings.auth_enabled else None,
"maxmind_status": await validate_maxmind_license()
},
"import_status": {
"postfix": {
@@ -195,6 +201,56 @@ async def get_settings_info(db: Session = Depends(get_db)):
"status": jobs_status.get('dns_check', {}).get('status', 'unknown'),
"last_run": format_datetime_utc(jobs_status.get('dns_check', {}).get('last_run')),
"error": jobs_status.get('dns_check', {}).get('error')
},
"sync_local_domains": {
"interval": "6 hours",
"description": "Syncs active domains list from Mailcow API",
"status": jobs_status.get('sync_local_domains', {}).get('status', 'unknown'),
"last_run": format_datetime_utc(jobs_status.get('sync_local_domains', {}).get('last_run')),
"error": jobs_status.get('sync_local_domains', {}).get('error')
},
"dmarc_imap_sync": {
"interval": f"{settings.dmarc_imap_interval} seconds ({settings.dmarc_imap_interval // 60} minutes)" if settings.dmarc_imap_enabled else "Disabled",
"description": "Imports DMARC reports from IMAP mailbox",
"enabled": settings.dmarc_imap_enabled,
"status": jobs_status.get('dmarc_imap_sync', {}).get('status', 'idle') if settings.dmarc_imap_enabled else 'disabled',
"last_run": format_datetime_utc(jobs_status.get('dmarc_imap_sync', {}).get('last_run')) if settings.dmarc_imap_enabled else None,
"error": jobs_status.get('dmarc_imap_sync', {}).get('error') if settings.dmarc_imap_enabled else None
},
"update_geoip": {
"schedule": "Weekly (Sunday 3 AM)" if is_license_configured() else "Disabled",
"description": "Updates MaxMind GeoIP databases (City & ASN)",
"enabled": is_license_configured(),
"status": jobs_status.get('update_geoip', {}).get('status', 'idle') if is_license_configured() else 'disabled',
"last_run": format_datetime_utc(jobs_status.get('update_geoip', {}).get('last_run')) if is_license_configured() else None,
"error": jobs_status.get('update_geoip', {}).get('error') if is_license_configured() else None
}
},
"smtp_configuration": {
"enabled": settings.smtp_enabled,
"host": settings.smtp_host if settings.smtp_enabled else None,
"port": settings.smtp_port if settings.smtp_enabled else None,
"user": settings.smtp_user if settings.smtp_enabled else None,
"from_address": settings.smtp_from if settings.smtp_enabled else None,
"use_tls": settings.smtp_use_tls if settings.smtp_enabled else None,
"admin_email": settings.admin_email if settings.smtp_enabled else None,
"configured": settings.notification_smtp_configured
},
"dmarc_configuration": {
"manual_upload_enabled": settings.dmarc_manual_upload_enabled,
"imap_sync_enabled": settings.dmarc_imap_enabled,
"imap_host": settings.dmarc_imap_host if settings.dmarc_imap_enabled else None,
"imap_user": settings.dmarc_imap_user if settings.dmarc_imap_enabled else None,
"imap_folder": settings.dmarc_imap_folder if settings.dmarc_imap_enabled else None,
"imap_delete_after": settings.dmarc_imap_delete_after if settings.dmarc_imap_enabled else None,
"imap_interval_minutes": round(settings.dmarc_imap_interval / 60, 1) if settings.dmarc_imap_enabled else None,
"smtp_configured": settings.notification_smtp_configured
},
"geoip_configuration": {
"enabled": is_license_configured(),
"databases": get_geoip_status() if is_license_configured() else {
"City": {"installed": False, "version": None, "last_updated": None},
"ASN": {"installed": False, "version": None, "last_updated": None}
}
},
"recent_incomplete_correlations": [
@@ -220,6 +276,17 @@ async def get_settings_info(db: Session = Depends(get_db)):
"background_jobs": {}
}
@router.post("/settings/test/smtp")
async def test_smtp():
"""Test SMTP connection with detailed logging"""
result = test_smtp_connection()
return result
@router.post("/settings/test/imap")
async def test_imap():
"""Test IMAP connection with detailed logging"""
result = test_imap_connection()
return result
@router.get("/settings/health")
async def get_health_detailed(db: Session = Depends(get_db)):
@@ -270,4 +337,30 @@ async def get_health_detailed(db: Session = Depends(get_db)):
"status": "unhealthy",
"timestamp": format_datetime_utc(datetime.now(timezone.utc)),
"error": str(e)
}
async def validate_maxmind_license() -> Dict[str, Any]:
"""Validate MaxMind license key"""
import os
license_key = os.getenv('MAXMIND_LICENSE_KEY')
if not license_key:
return {"configured": False, "valid": False, "error": None}
try:
async with httpx.AsyncClient(timeout=5.0) as client:
response = await client.post(
"https://secret-scanning.maxmind.com/secrets/validate-license-key",
data={"license_key": license_key},
headers={"Content-Type": "application/x-www-form-urlencoded"}
)
if response.status_code == 204:
return {"configured": True, "valid": True, "error": None}
elif response.status_code == 401:
return {"configured": True, "valid": False, "error": "Invalid"}
else:
return {"configured": True, "valid": False, "error": f"Status {response.status_code}"}
except Exception:
return {"configured": True, "valid": False, "error": "Connection error"}


@@ -13,14 +13,25 @@ from apscheduler.triggers.interval import IntervalTrigger
from apscheduler.triggers.cron import CronTrigger
from sqlalchemy.orm import Session
from sqlalchemy import desc, or_
from sqlalchemy.exc import IntegrityError
from .config import settings, set_cached_active_domains
from .database import get_db_context
from .mailcow_api import mailcow_api
from .models import PostfixLog, RspamdLog, NetfilterLog, MessageCorrelation
from .correlation import detect_direction, parse_postfix_message
from .models import DomainDNSCheck
from .routers.domains import check_domain_dns, save_dns_check_to_db
from .services.dmarc_imap_service import sync_dmarc_reports_from_imap
from .services.dmarc_notifications import send_dmarc_error_notification
from .services import geoip_service
from .services.geoip_downloader import (
    update_geoip_database_if_needed,
    is_license_configured,
    get_geoip_status
)
logger = logging.getLogger(__name__)
@@ -32,7 +43,9 @@ job_status = {
'expire_correlations': {'last_run': None, 'status': 'idle', 'error': None},
'cleanup_logs': {'last_run': None, 'status': 'idle', 'error': None},
'check_app_version': {'last_run': None, 'status': 'idle', 'error': None},
'dns_check': {'last_run': None, 'status': 'idle', 'error': None}
'dns_check': {'last_run': None, 'status': 'idle', 'error': None},
'update_geoip': {'last_run': None, 'status': 'idle', 'error': None},
'dmarc_imap_sync': {'last_run': None, 'status': 'idle', 'error': None}
}
def update_job_status(job_name: str, status: str, error: str = None):
@@ -168,6 +181,7 @@ async def fetch_and_store_postfix():
with get_db_context() as db:
new_count = 0
skipped_count = 0
skipped_blacklist = 0
blacklisted_queue_ids: Set[str] = set()
@@ -209,6 +223,7 @@ async def fetch_and_store_postfix():
unique_id = f"{time_str}:{message[:100]}"
if unique_id in seen_postfix:
skipped_count += 1
continue
# Parse message for fields
@@ -247,17 +262,29 @@ async def fetch_and_store_postfix():
)
db.add(postfix_log)
db.flush()
seen_postfix.add(unique_id)
new_count += 1
except IntegrityError:
# Duplicate log - skip silently
db.rollback()
seen_postfix.add(unique_id)
skipped_count += 1
continue
except Exception as e:
logger.error(f"Error processing Postfix log: {e}")
db.rollback()
continue
db.commit()
if new_count > 0 or skipped_count > 0:
msg = f"[OK] Imported {new_count} Postfix logs"
if skipped_count > 0:
msg += f" (skipped {skipped_count} duplicates)"
if skipped_blacklist > 0:
msg += f" (skipped {skipped_blacklist} blacklisted)"
logger.info(msg)
@@ -335,6 +362,14 @@ async def fetch_and_store_rspamd():
size=log_entry.get('size'),
raw_data=log_entry
)
if geoip_service.is_geoip_available() and rspamd_log.ip:
geo_info = geoip_service.lookup_ip(rspamd_log.ip)
rspamd_log.country_code = geo_info.get('country_code')
rspamd_log.country_name = geo_info.get('country_name')
rspamd_log.city = geo_info.get('city')
rspamd_log.asn = geo_info.get('asn')
rspamd_log.asn_org = geo_info.get('asn_org')
db.add(rspamd_log)
seen_rspamd.add(unique_id)
@@ -987,7 +1022,7 @@ async def update_final_status_for_correlations():
1. Finds correlations without a definitive final_status
2. Only checks correlations within Max Correlation Age
3. Looks for new Postfix logs that may have arrived
4. Updates final_status, postfix_log_ids, and correlation_key
This runs independently from correlation creation to ensure we catch
late-arriving Postfix logs.
@@ -1013,7 +1048,7 @@ async def update_final_status_for_correlations():
MessageCorrelation.final_status.is_(None),
MessageCorrelation.final_status.notin_(['delivered', 'bounced', 'rejected', 'expired'])
)
).limit(500).all()  # Increased from 100 to 500
if not correlations_to_check:
return
@@ -1032,7 +1067,6 @@ async def update_final_status_for_correlations():
# Determine best final status from all Postfix logs
# Priority: bounced > rejected > sent (delivered) > deferred
# We check all logs to find the best status
new_final_status = correlation.final_status
for plog in all_postfix:
@@ -1047,13 +1081,33 @@ async def update_final_status_for_correlations():
elif plog.status == 'deferred' and new_final_status not in ['bounced', 'rejected', 'delivered']:
new_final_status = 'deferred'
# FIX #1: Update postfix_log_ids - add any missing logs
current_ids = list(correlation.postfix_log_ids or [])
ids_added = 0
for plog in all_postfix:
if plog.id and plog.id not in current_ids:
current_ids.append(plog.id)
ids_added += 1
if ids_added > 0:
correlation.postfix_log_ids = current_ids
# FIX #2: Update correlation_key in ALL Postfix logs
for plog in all_postfix:
if not plog.correlation_key or plog.correlation_key != correlation.correlation_key:
plog.correlation_key = correlation.correlation_key
# Update if we found a better status or added logs
if (new_final_status and new_final_status != correlation.final_status) or ids_added > 0:
old_status = correlation.final_status
correlation.final_status = new_final_status
correlation.last_seen = datetime.now(timezone.utc)
updated_count += 1
if ids_added > 0:
logger.debug(f"Updated correlation {correlation.id}: added {ids_added} logs, status {old_status} -> {new_final_status}")
else:
logger.debug(f"Updated final_status for correlation {correlation.id} ({correlation.message_id[:40] if correlation.message_id else 'no-id'}...): {old_status} -> {new_final_status}")
except Exception as e:
logger.warning(f"Failed to update final_status for correlation {correlation.id}: {e}")
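The status-priority loop above (bounced > rejected > delivered > deferred) can be captured as a pure helper. A hedged sketch; `best_final_status` and the numeric ranking are ours, introduced only to illustrate the ordering the hunk implements:

```python
# Sketch of the final-status priority used above:
# bounced > rejected > delivered ('sent') > deferred.
PRIORITY = {"bounced": 3, "rejected": 2, "delivered": 1, "deferred": 0}

def best_final_status(statuses, current=None):
    best = current
    for s in statuses:
        s = "delivered" if s == "sent" else s  # Postfix logs 'sent' for deliveries
        if s in PRIORITY and (best is None or PRIORITY[s] > PRIORITY.get(best, -1)):
            best = s
    return best

best_final_status(["deferred", "sent"])  # -> "delivered"
```

Scanning every correlated Postfix log and keeping the highest-priority outcome is what lets a late-arriving bounce overwrite an earlier "deferred".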
@@ -1070,6 +1124,89 @@ async def update_final_status_for_correlations():
update_job_status('update_final_status', 'failed', str(e))
async def update_geoip_database():
"""Background job: Update GeoIP databases"""
from .services.geoip_downloader import (
update_geoip_database_if_needed,
is_license_configured
)
try:
update_job_status('update_geoip', 'running')
if not is_license_configured():
update_job_status('update_geoip', 'idle', 'License key not configured')
return
update_geoip_database_if_needed()
update_job_status('update_geoip', 'success')
except Exception as e:
logger.error(f"GeoIP update failed: {e}")
update_job_status('update_geoip', 'failed', str(e))
async def dmarc_imap_sync_job():
"""
Scheduled job to sync DMARC reports from IMAP mailbox
Runs every hour (configurable via DMARC_IMAP_INTERVAL)
"""
if not settings.dmarc_imap_enabled:
logger.debug("DMARC IMAP sync is disabled, skipping")
return
# Global cleanup to ensure no other job is stuck in 'running' state
try:
from .database import SessionLocal
with SessionLocal() as db:
db.query(DMARCSync).filter(DMARCSync.status == 'running').update({
"status": "failed",
"error_message": "Stale job cleaned by scheduler"
})
db.commit()
except Exception as cleanup_err:
logger.warning(f"Background cleanup failed: {cleanup_err}")
# Start the current job
update_job_status('dmarc_imap_sync', 'running')
try:
logger.info("Starting DMARC IMAP sync...")
# Execute the actual IMAP sync logic
result = sync_dmarc_reports_from_imap(sync_type='auto')
if result.get('status') == 'error':
error_msg = result.get('error_message', 'Unknown error')
logger.error(f"DMARC IMAP sync failed: {error_msg}")
update_job_status('dmarc_imap_sync', 'failed', error_msg)
# Send notification if needed
failed_emails = result.get('failed_emails')
if failed_emails and settings.notification_smtp_configured:
try:
send_dmarc_error_notification(failed_emails, result.get('sync_id'))
except Exception as e:
logger.error(f"Failed to send error notification: {e}")
else:
# Sync finished successfully
logger.info(f"DMARC IMAP sync completed: {result.get('reports_created', 0)} created")
update_job_status('dmarc_imap_sync', 'success')
except Exception as e:
# Catch-all for unexpected crashes
logger.error(f"DMARC IMAP sync job error: {e}", exc_info=True)
update_job_status('dmarc_imap_sync', 'failed', str(e))
finally:
# Ensure the state is never left as 'running' if the code reaches here
logger.debug("DMARC IMAP sync job cycle finished")
# =============================================================================
# CLEANUP
# =============================================================================
@@ -1243,6 +1380,32 @@ async def check_all_domains_dns_background():
update_job_status('dns_check', 'failed', str(e))
async def sync_local_domains():
"""
Sync local domains from Mailcow API
Runs every 6 hours
"""
logger.info("Starting background local domains sync...")
update_job_status('sync_local_domains', 'running')
try:
active_domains = await mailcow_api.get_active_domains()
if active_domains:
set_cached_active_domains(active_domains)
logger.info(f"✓ Local domains synced: {len(active_domains)} domains")
update_job_status('sync_local_domains', 'success')
return True
else:
logger.warning("⚠ No active domains retrieved")
update_job_status('sync_local_domains', 'failed', 'No active domains retrieved')
return False
except Exception as e:
logger.error(f"✗ Failed to sync local domains: {e}")
update_job_status('sync_local_domains', 'failed', str(e))
return False
# =============================================================================
# SCHEDULER SETUP
# =============================================================================
@@ -1323,7 +1486,7 @@ def start_scheduler():
name='Check App Version Updates',
replace_existing=True,
max_instances=1,
next_run_time=datetime.now(timezone.utc)
)
# Job 8: DNS Check
@@ -1336,16 +1499,73 @@ def start_scheduler():
max_instances=1
)
# Job 8b: Initial DNS check on startup
scheduler.add_job(
check_all_domains_dns_background,
'date',
run_date=datetime.now(timezone.utc) + timedelta(seconds=60),
id='dns_check_startup',
name='DNS Check (Startup)'
)
# Job 9: Sync local domains (every 6 hours)
scheduler.add_job(
sync_local_domains,
IntervalTrigger(hours=6),
id='sync_local_domains',
name='Sync Local Domains',
replace_existing=True,
max_instances=1,
next_run_time=datetime.now(timezone.utc)
)
# Job 11: Update GeoIP database (weekly, Sunday at 3 AM)
# Only runs if MaxMind license key is configured
if is_license_configured():
scheduler.add_job(
update_geoip_database,
trigger=CronTrigger(day_of_week='sun', hour=3, minute=0),
id='update_geoip',
name='Update GeoIP',
replace_existing=True
)
# Run initial check on startup (after 60 seconds to let everything settle)
scheduler.add_job(
update_geoip_database,
'date',
run_date=datetime.now(timezone.utc) + timedelta(seconds=60),
id='geoip_startup',
name='GeoIP Check (Startup)'
)
logger.info(" [GEOIP] Initial GeoIP check scheduled (60 seconds after startup)")
else:
logger.info(" [GEOIP] MaxMind license key not configured, GeoIP features disabled")
# Job 12: DMARC IMAP Sync - runs at configured interval (default: hourly)
if settings.dmarc_imap_enabled:
scheduler.add_job(
dmarc_imap_sync_job,
IntervalTrigger(seconds=settings.dmarc_imap_interval),
id='dmarc_imap_sync',
name='DMARC IMAP Sync',
replace_existing=True
)
logger.info(f"Scheduled DMARC IMAP sync job (interval: {settings.dmarc_imap_interval}s)")
# Run once on startup if configured
if settings.dmarc_imap_run_on_startup:
scheduler.add_job(
dmarc_imap_sync_job,
'date',
run_date=datetime.now(timezone.utc) + timedelta(seconds=30),
id='dmarc_imap_sync_startup',
name='DMARC IMAP Sync (Startup)'
)
logger.info("Scheduled initial DMARC IMAP sync on startup")
scheduler.start()
logger.info("[OK] Scheduler started")
logger.info(f" [INFO] Import: every {settings.fetch_interval}s")
logger.info(f" [LINK] Correlation: every 30s")
@@ -1354,6 +1574,13 @@ def start_scheduler():
logger.info(f" [EXPIRE] Old correlations: every 60s (expire after {settings.max_correlation_age_minutes}min)")
logger.info(f" [VERSION] Check app version updates: every 6 hours")
logger.info(f" [DNS] Check all domains DNS: every 6 hours")
logger.info(" [GEOIP] Update GeoIP database: weekly (Sunday 3 AM)")
if settings.dmarc_imap_enabled:
logger.info(f" [DMARC] IMAP sync: every {settings.dmarc_imap_interval // 60} minutes")
else:
logger.info(" [DMARC] IMAP sync: disabled")
# Log blacklist status
blacklist = settings.blacklist_emails_list
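The scheduler section above registers the GeoIP refresh on a weekly cron (Sunday 03:00). As a stdlib-only illustration of what that trigger resolves to, here is a hedged sketch computing the next Sunday 03:00 UTC; APScheduler's CronTrigger performs an equivalent calculation internally, and the helper name is ours:

```python
from datetime import datetime, timedelta, timezone

def next_sunday_3am(now: datetime) -> datetime:
    """Next occurrence of Sunday 03:00 UTC at or after `now`."""
    candidate = now.replace(hour=3, minute=0, second=0, microsecond=0)
    days_ahead = (6 - now.weekday()) % 7  # Monday=0 ... Sunday=6
    candidate += timedelta(days=days_ahead)
    if candidate <= now:  # this week's slot already passed
        candidate += timedelta(days=7)
    return candidate

run = next_sunday_3am(datetime(2026, 1, 14, 23, 2, tzinfo=timezone.utc))
```

For a Wednesday-evening start, for example, the job would next fire the following Sunday at 03:00, matching the `CronTrigger(day_of_week='sun', hour=3, minute=0)` registration above.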

View File

@@ -0,0 +1,3 @@
"""
Routers package initialization
"""

View File

@@ -0,0 +1,145 @@
"""
Connection testing utilities for SMTP and IMAP
Provides detailed logging for debugging
"""
import imaplib
import smtplib
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
from typing import Dict, List
from ..config import settings
def test_smtp_connection() -> Dict:
"""Test SMTP connection and return detailed log"""
logs = []
success = False
try:
logs.append("Starting SMTP connection test...")
logs.append(f"Host: {settings.smtp_host}")
logs.append(f"Port: {settings.smtp_port}")
logs.append(f"Use TLS: {settings.smtp_use_tls}")
logs.append(f"User: {settings.smtp_user}")
if not settings.smtp_host or not settings.smtp_user or not settings.smtp_password:
logs.append("ERROR: SMTP not fully configured")
return {"success": False, "logs": logs}
logs.append("Connecting to SMTP server...")
if settings.smtp_port == 465:
server = smtplib.SMTP_SSL(settings.smtp_host, settings.smtp_port, timeout=10)
logs.append("Connected using SSL")
else:
server = smtplib.SMTP(settings.smtp_host, settings.smtp_port, timeout=10)
logs.append("Connected")
if settings.smtp_use_tls and settings.smtp_port != 465:
logs.append("Starting TLS...")
server.starttls()
logs.append("TLS established")
logs.append("Logging in...")
server.login(settings.smtp_user, settings.smtp_password)
logs.append("Login successful")
logs.append("Sending test email...")
msg = MIMEMultipart()
msg['From'] = settings.smtp_from or settings.smtp_user
msg['To'] = settings.admin_email or settings.smtp_user
msg['Subject'] = 'SMTP Test - Mailcow Logs Viewer'
body = "This is a test email from Mailcow Logs Viewer.\n\nIf you received this, SMTP is working correctly."
msg.attach(MIMEText(body, 'plain'))
server.send_message(msg)
logs.append("Test email sent successfully")
server.quit()
logs.append("Connection closed")
success = True
logs.append("✓ SMTP test completed successfully")
except smtplib.SMTPAuthenticationError as e:
logs.append(f"✗ Authentication failed: {e}")
except smtplib.SMTPException as e:
logs.append(f"✗ SMTP error: {e}")
except Exception as e:
logs.append(f"✗ Unexpected error: {type(e).__name__}: {e}")
return {
"success": success,
"logs": logs
}
def test_imap_connection() -> Dict:
"""Test IMAP connection and return detailed log"""
logs = []
success = False
try:
logs.append("Starting IMAP connection test...")
logs.append(f"Host: {settings.dmarc_imap_host}")
logs.append(f"Port: {settings.dmarc_imap_port}")
logs.append(f"Use SSL: {settings.dmarc_imap_use_ssl}")
logs.append(f"User: {settings.dmarc_imap_user}")
logs.append(f"Folder: {settings.dmarc_imap_folder}")
if not settings.dmarc_imap_host or not settings.dmarc_imap_user or not settings.dmarc_imap_password:
logs.append("ERROR: IMAP not fully configured")
return {"success": False, "logs": logs}
logs.append("Connecting to IMAP server...")
if settings.dmarc_imap_use_ssl:
connection = imaplib.IMAP4_SSL(settings.dmarc_imap_host, settings.dmarc_imap_port, timeout=30)
logs.append("Connected using SSL")
else:
connection = imaplib.IMAP4(settings.dmarc_imap_host, settings.dmarc_imap_port, timeout=30)
logs.append("Connected without SSL")
logs.append("Logging in...")
connection.login(settings.dmarc_imap_user, settings.dmarc_imap_password)
logs.append("Login successful")
logs.append(f"Listing mailboxes...")
status, mailboxes = connection.list()
if status == 'OK':
logs.append(f"Found {len(mailboxes)} mailboxes:")
for mb in mailboxes[:5]:
logs.append(f" - {mb.decode()}")
if len(mailboxes) > 5:
logs.append(f" ... and {len(mailboxes) - 5} more")
logs.append(f"Selecting folder: {settings.dmarc_imap_folder}")
status, data = connection.select(settings.dmarc_imap_folder, readonly=True)
if status == 'OK':
logs.append(f"Folder selected: {data[0].decode()} messages")
else:
logs.append(f"✗ Failed to select folder: {data}")
return {"success": False, "logs": logs}
logs.append("Searching for emails...")
status, messages = connection.search(None, 'ALL')
if status == 'OK':
email_ids = messages[0].split()
logs.append(f"Found {len(email_ids)} emails in folder")
connection.logout()
logs.append("Connection closed")
success = True
logs.append("✓ IMAP test completed successfully")
except imaplib.IMAP4.error as e:
logs.append(f"✗ IMAP error: {e}")
except Exception as e:
logs.append(f"✗ Unexpected error: {type(e).__name__}: {e}")
return {
"success": success,
"logs": logs
}
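test_smtp_connection above uses implicit TLS (SMTPS) for port 465 and otherwise optionally upgrades a plain connection via STARTTLS. That decision can be isolated as a small pure function; a sketch only, and `smtp_mode` is an illustrative name, not part of the module:

```python
def smtp_mode(port: int, use_tls: bool) -> str:
    """Return how to secure the SMTP session; port 465 takes precedence."""
    if port == 465:
        return "ssl"       # implicit TLS from the first byte (SMTPS)
    if use_tls:
        return "starttls"  # plain connect, then upgrade
    return "plain"

smtp_mode(587, True)  # -> "starttls"
```

Keeping the port-465 case first matters: issuing STARTTLS on a session that is already TLS-wrapped fails, so an SSL connection must never attempt the upgrade.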

View File

@@ -0,0 +1,434 @@
"""
DMARC IMAP Service
Automatically fetches and processes DMARC reports from email inbox
"""
import logging
import imaplib
import email
import gzip
import zipfile
import io
from datetime import datetime, timezone
from typing import List, Dict, Optional, Tuple
from email.message import EmailMessage
from ..config import settings
from ..database import SessionLocal
from ..models import DMARCSync, DMARCReport, DMARCRecord
from ..services.dmarc_parser import parse_dmarc_file
from ..services.geoip_service import enrich_dmarc_record
from ..services.dmarc_notifications import send_dmarc_error_notification
logger = logging.getLogger(__name__)
class DMARCImapService:
"""Service to fetch DMARC reports from IMAP inbox"""
def __init__(self):
self.host = settings.dmarc_imap_host
self.port = settings.dmarc_imap_port
self.use_ssl = settings.dmarc_imap_use_ssl
self.user = settings.dmarc_imap_user
self.password = settings.dmarc_imap_password
self.folder = settings.dmarc_imap_folder
self.delete_after = settings.dmarc_imap_delete_after
self.connection = None
def connect(self) -> bool:
"""Connect to IMAP server"""
try:
if self.use_ssl:
self.connection = imaplib.IMAP4_SSL(self.host, self.port, timeout=30)
else:
self.connection = imaplib.IMAP4(self.host, self.port, timeout=30)
self.connection.login(self.user, self.password)
logger.info(f"Successfully connected to IMAP server {self.host}")
return True
except Exception as e:
logger.error(f"Failed to connect to IMAP server: {e}")
raise
def disconnect(self):
"""Disconnect from IMAP server"""
if self.connection:
try:
self.connection.logout()
logger.info("Disconnected from IMAP server")
except Exception as e:
logger.error(f"Error disconnecting from IMAP: {e}")
def select_folder(self) -> bool:
"""Select the mailbox folder"""
try:
status, messages = self.connection.select(self.folder)
if status != 'OK':
logger.error(f"Failed to select folder {self.folder}")
return False
logger.info(f"Selected folder: {self.folder}")
return True
except Exception as e:
logger.error(f"Error selecting folder: {e}")
return False
def search_dmarc_emails(self) -> List[bytes]:
"""
Search for DMARC report emails
Looking for emails with subject containing:
- "Report Domain:"
- "Submitter:"
- "Report-ID:"
Returns list of email IDs
"""
try:
# Search for emails with DMARC-related subject
# Using OR to be more flexible
search_criteria = '(OR (SUBJECT "Report Domain:") (OR (SUBJECT "DMARC") (SUBJECT "Report-ID:")))'
status, messages = self.connection.search(None, search_criteria)
if status != 'OK':
logger.error("Failed to search for DMARC emails")
return []
email_ids = messages[0].split()
logger.info(f"Found {len(email_ids)} potential DMARC emails")
return email_ids
except Exception as e:
logger.error(f"Error searching for emails: {e}")
return []
def is_valid_dmarc_email(self, msg: EmailMessage) -> bool:
"""
Validate that this is a genuine DMARC report email
Checks:
1. Subject contains "Report Domain:" AND ("Submitter:" OR "Report-ID:")
2. Has at least one compressed attachment (.xml.gz or .zip)
"""
try:
subject = msg.get('subject', '').lower()
# Check subject format
has_report_domain = 'report domain:' in subject
has_submitter = 'submitter:' in subject
has_report_id = 'report-id:' in subject
if not (has_report_domain and (has_submitter or has_report_id)):
logger.debug(f"Email does not match DMARC subject pattern: {subject}")
return False
# Check for compressed attachments
has_attachment = False
for part in msg.walk():
filename = part.get_filename()
if filename:
filename_lower = filename.lower()
if filename_lower.endswith('.xml.gz') or filename_lower.endswith('.zip'):
has_attachment = True
break
if not has_attachment:
logger.debug(f"Email has no compressed DMARC attachment: {subject}")
return False
return True
except Exception as e:
logger.error(f"Error validating DMARC email: {e}")
return False
def extract_attachments(self, msg: EmailMessage) -> List[Tuple[str, bytes]]:
"""
Extract compressed attachments from email
Returns list of (filename, content) tuples
"""
attachments = []
try:
for part in msg.walk():
filename = part.get_filename()
if not filename:
continue
filename_lower = filename.lower()
if not (filename_lower.endswith('.xml.gz') or filename_lower.endswith('.zip')):
continue
content = part.get_payload(decode=True)
if content:
attachments.append((filename, content))
logger.debug(f"Extracted attachment: {filename}")
except Exception as e:
logger.error(f"Error extracting attachments: {e}")
return attachments
def process_email(self, email_id: str, db: SessionLocal) -> Dict:
"""
Process a single DMARC email
Returns dict with:
- success: bool
- reports_created: int
- reports_duplicate: int
- error: str or None
"""
result = {
'success': False,
'reports_created': 0,
'reports_duplicate': 0,
'error': None,
'message_id': None,
'subject': None
}
try:
# Fetch email (email_id is already a string)
status, msg_data = self.connection.fetch(email_id, '(RFC822)')
if status != 'OK':
result['error'] = f"Failed to fetch email {email_id}"
return result
# Parse email
msg = email.message_from_bytes(msg_data[0][1])
result['message_id'] = msg.get('message-id', 'unknown')
result['subject'] = msg.get('subject', 'unknown')
# Validate it's a DMARC email
if not self.is_valid_dmarc_email(msg):
result['error'] = "Not a valid DMARC report email"
return result
# Extract attachments
attachments = self.extract_attachments(msg)
if not attachments:
result['error'] = "No DMARC attachments found"
return result
# Process each attachment
for filename, content in attachments:
try:
# Parse DMARC report
parsed_data = parse_dmarc_file(content, filename)
if not parsed_data:
logger.warning(f"Failed to parse attachment: {filename}")
continue
# Extract records
records_data = parsed_data.pop('records', [])
report_data = parsed_data
# Check for duplicate
existing = db.query(DMARCReport).filter(
DMARCReport.report_id == report_data['report_id']
).first()
if existing:
result['reports_duplicate'] += 1
logger.info(f"Duplicate report: {report_data['report_id']}")
continue
# Create report
report = DMARCReport(**report_data)
db.add(report)
db.flush()
# Create records with GeoIP enrichment
for record_data in records_data:
record_data['dmarc_report_id'] = report.id
enriched = enrich_dmarc_record(record_data)
record = DMARCRecord(**enriched)
db.add(record)
db.commit()
result['reports_created'] += 1
logger.info(f"Created DMARC report: {report_data['report_id']}")
except Exception as e:
db.rollback()
logger.error(f"Error processing attachment {filename}: {e}")
if not result['error']:
result['error'] = str(e)
# Mark as success if at least one report was created
if result['reports_created'] > 0:
result['success'] = True
return result
except Exception as e:
logger.error(f"Error processing email {email_id}: {e}")
result['error'] = str(e)
return result
def mark_as_processed(self, email_id: str):
"""Mark email as processed (flag or move)"""
try:
# Add a flag to mark as processed
self.connection.store(email_id, '+FLAGS', '\\Seen')
logger.debug(f"Marked email {email_id} as seen")
except Exception as e:
logger.error(f"Error marking email as processed: {e}")
def delete_email(self, email_id: str):
"""Delete email from server"""
try:
self.connection.store(email_id, '+FLAGS', '\\Deleted')
self.connection.expunge()
logger.debug(f"Deleted email {email_id}")
except Exception as e:
logger.error(f"Error deleting email: {e}")
def sync_reports(self, sync_type: str = 'auto') -> Dict:
"""
Main sync function
Returns statistics about the sync operation
"""
sync_record = DMARCSync(
sync_type=sync_type,
started_at=datetime.now(timezone.utc),
status='running'
)
db = SessionLocal()
try:
db.add(sync_record)
db.commit()
db.refresh(sync_record)
# Connect to IMAP
self.connect()
# Select folder
if not self.select_folder():
raise Exception(f"Failed to select folder {self.folder}")
# Search for DMARC emails
email_ids = self.search_dmarc_emails()
sync_record.emails_found = len(email_ids)
db.commit()
if not email_ids:
logger.info("No DMARC emails found")
sync_record.status = 'success'
sync_record.completed_at = datetime.now(timezone.utc)
db.commit()
return self._build_result(sync_record)
# Process each email
failed_emails = []
for email_id in email_ids:
email_id = email_id.decode() if isinstance(email_id, bytes) else email_id
result = self.process_email(email_id, db)
sync_record.emails_processed += 1
if result['success']:
sync_record.reports_created += result['reports_created']
sync_record.reports_duplicate += result['reports_duplicate']
# Delete or mark as processed
if self.delete_after:
self.delete_email(email_id)
else:
self.mark_as_processed(email_id)
else:
sync_record.reports_failed += 1
failed_emails.append({
'email_id': email_id,
'message_id': result['message_id'],
'subject': result['subject'],
'error': result['error']
})
db.commit()
# Update sync record
sync_record.status = 'success'
sync_record.completed_at = datetime.now(timezone.utc)
sync_record.failed_emails = failed_emails if failed_emails else None
if failed_emails:
sync_record.error_message = f"{len(failed_emails)} emails failed to process"
db.commit()
logger.info(f"DMARC sync completed: {sync_record.reports_created} created, "
f"{sync_record.reports_duplicate} duplicates, "
f"{sync_record.reports_failed} failed")
# Send email notification if there were failures
if failed_emails and settings.notification_smtp_configured:
logger.info(f"Sending error notification for {len(failed_emails)} failed emails")
try:
send_dmarc_error_notification(failed_emails, sync_record.id)
logger.info("Error notification sent successfully")
except Exception as email_error:
logger.error(f"Failed to send error notification: {email_error}")
return self._build_result(sync_record)
except Exception as e:
logger.error(f"DMARC sync failed: {e}")
sync_record.status = 'error'
sync_record.completed_at = datetime.now(timezone.utc)
sync_record.error_message = str(e)
db.commit()
raise
finally:
self.disconnect()
db.close()
def _build_result(self, sync_record: DMARCSync) -> Dict:
"""Build result dictionary from sync record"""
return {
'sync_id': sync_record.id,
'sync_type': sync_record.sync_type,
'status': sync_record.status,
'started_at': sync_record.started_at.strftime('%Y-%m-%dT%H:%M:%SZ') if sync_record.started_at else None,
'completed_at': sync_record.completed_at.strftime('%Y-%m-%dT%H:%M:%SZ') if sync_record.completed_at else None,
'emails_found': sync_record.emails_found,
'emails_processed': sync_record.emails_processed,
'reports_created': sync_record.reports_created,
'reports_duplicate': sync_record.reports_duplicate,
'reports_failed': sync_record.reports_failed,
'error_message': sync_record.error_message,
'failed_emails': sync_record.failed_emails
}
def sync_dmarc_reports_from_imap(sync_type: str = 'auto') -> Dict:
"""
Convenience function to sync DMARC reports
Can be called from scheduler or API endpoint
"""
if not settings.dmarc_imap_enabled:
logger.info("DMARC IMAP sync is disabled")
return {
'status': 'disabled',
'message': 'DMARC IMAP sync is not enabled'
}
service = DMARCImapService()
return service.sync_reports(sync_type=sync_type)
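search_dmarc_emails above nests its ORs because IMAP's SEARCH `OR` (RFC 3501) takes exactly two operands, so three subject terms need two chained ORs. A hedged sketch of that construction; `build_or_criteria` is an illustrative name, not part of the service:

```python
def build_or_criteria(terms):
    """Fold SUBJECT terms into nested binary ORs, right-associated."""
    parts = [f'(SUBJECT "{t}")' for t in terms]
    crit = parts[-1]
    for p in reversed(parts[:-1]):
        crit = f'(OR {p} {crit})'
    return crit

build_or_criteria(["Report Domain:", "DMARC", "Report-ID:"])
# -> '(OR (SUBJECT "Report Domain:") (OR (SUBJECT "DMARC") (SUBJECT "Report-ID:")))'
```

For the three subjects used by the service, this reproduces the hard-coded criteria string verbatim, which makes the nesting easier to audit than counting parentheses by hand.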

View File

@@ -0,0 +1,161 @@
"""
DMARC Notification Module
Uses the global SMTP service to send DMARC-specific notifications
"""
import logging
from typing import List, Dict
from datetime import datetime
from ..config import settings
from .smtp_service import send_notification_email, get_notification_email
logger = logging.getLogger(__name__)
def send_dmarc_error_notification(failed_emails: List[Dict], sync_id: int) -> bool:
"""
Send notification about failed DMARC report processing
Uses global SMTP service
Args:
failed_emails: List of failed email dicts with message_id, subject, error
sync_id: ID of the sync operation
Returns:
True if email was sent successfully, False otherwise
"""
if not failed_emails:
return True
# Get recipient: DMARC_ERROR_EMAIL or fallback to ADMIN_EMAIL
recipient = get_notification_email(settings.dmarc_error_email)
if not recipient:
logger.warning("No recipient configured (DMARC_ERROR_EMAIL or ADMIN_EMAIL)")
return False
# Build email content
subject = f"DMARC Processing Errors - Sync #{sync_id}"
text_content = _create_text_content(failed_emails, sync_id)
html_content = _create_html_content(failed_emails, sync_id)
# Send via global SMTP service
return send_notification_email(recipient, subject, text_content, html_content)
def _create_text_content(failed_emails: List[Dict], sync_id: int) -> str:
"""Create plain text email content"""
lines = [
f"DMARC Report Processing Errors - Sync #{sync_id}",
f"Date: {datetime.utcnow().strftime('%Y-%m-%d %H:%M:%S UTC')}",
"",
f"Failed to process {len(failed_emails)} DMARC report email(s):",
""
]
for i, email_data in enumerate(failed_emails, 1):
lines.append(f"{i}. Email ID: {email_data.get('email_id', 'unknown')}")
lines.append(f" Message-ID: {email_data.get('message_id', 'unknown')}")
lines.append(f" Subject: {email_data.get('subject', 'unknown')}")
lines.append(f" Error: {email_data.get('error', 'unknown')}")
lines.append("")
lines.append("---")
lines.append("This is an automated notification from Mailcow Logs Viewer")
lines.append(f"DMARC IMAP Sync Service")
return "\n".join(lines)
def _create_html_content(failed_emails: List[Dict], sync_id: int) -> str:
"""Create HTML email content"""
html = f"""
<!DOCTYPE html>
<html>
<head>
<style>
body {{
font-family: Arial, sans-serif;
line-height: 1.6;
color: #333;
}}
.header {{
background-color: #dc3545;
color: white;
padding: 20px;
border-radius: 5px;
}}
.content {{
padding: 20px;
}}
.error-list {{
background-color: #f8f9fa;
border-left: 4px solid #dc3545;
padding: 15px;
margin: 20px 0;
}}
.error-item {{
margin-bottom: 20px;
padding-bottom: 20px;
border-bottom: 1px solid #dee2e6;
}}
.error-item:last-child {{
border-bottom: none;
}}
.label {{
font-weight: bold;
color: #495057;
}}
.value {{
margin-left: 10px;
color: #212529;
}}
.error {{
color: #dc3545;
margin-top: 5px;
}}
.footer {{
margin-top: 30px;
padding-top: 20px;
border-top: 1px solid #dee2e6;
font-size: 12px;
color: #6c757d;
}}
</style>
</head>
<body>
<div class="header">
<h2>⚠️ DMARC Processing Errors</h2>
<p>Sync #{sync_id} - {datetime.utcnow().strftime('%Y-%m-%d %H:%M:%S UTC')}</p>
</div>
<div class="content">
<p>Failed to process <strong>{len(failed_emails)}</strong> DMARC report email(s):</p>
<div class="error-list">
"""
for i, email_data in enumerate(failed_emails, 1):
html += f"""
<div class="error-item">
<div><span class="label">#{i}</span></div>
<div><span class="label">Email ID:</span><span class="value">{email_data.get('email_id', 'unknown')}</span></div>
<div><span class="label">Message-ID:</span><span class="value">{email_data.get('message_id', 'unknown')}</span></div>
<div><span class="label">Subject:</span><span class="value">{email_data.get('subject', 'unknown')}</span></div>
<div class="error"><span class="label">Error:</span> {email_data.get('error', 'unknown')}</div>
</div>
"""
html += """
</div>
<div class="footer">
<p>This is an automated notification from <strong>Mailcow Logs Viewer</strong></p>
<p>DMARC IMAP Sync Service</p>
</div>
</div>
</body>
</html>
"""
return html
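The HTML builder above interpolates subjects and error strings verbatim; since those values originate from untrusted mail headers, escaping them before interpolation is the safer pattern. A minimal stdlib sketch, with `safe` being our illustrative wrapper name:

```python
import html

def safe(value, default="unknown"):
    """HTML-escape an untrusted header/error value before templating."""
    return html.escape(str(value if value is not None else default))

safe('<script>alert(1)</script>')  # -> '&lt;script&gt;alert(1)&lt;/script&gt;'
```

Wrapping each `email_data.get(...)` access this way would neutralize markup smuggled into a report email's Subject or Message-ID before it reaches the admin's mail client.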

View File

@@ -0,0 +1,235 @@
"""
DMARC Report Parser
Handles parsing of DMARC aggregate reports in XML format (GZ or ZIP compressed)
"""
import gzip
import zipfile
import xml.etree.ElementTree as ET
import logging
from typing import Dict, List, Any, Optional
from io import BytesIO
logger = logging.getLogger(__name__)
def parse_dmarc_file(file_content: bytes, filename: str) -> Optional[Dict[str, Any]]:
"""
Parse DMARC report from compressed file (GZ or ZIP)
Args:
file_content: Raw bytes of the compressed file
filename: Original filename (to determine compression type)
Returns:
Parsed DMARC data or None if parsing failed
"""
try:
# Determine file type and extract XML
xml_content = None
if filename.endswith('.gz'):
# Gzip compressed
with gzip.open(BytesIO(file_content), 'rb') as f:
xml_content = f.read()
elif filename.endswith('.zip'):
# ZIP compressed (Google uses this)
with zipfile.ZipFile(BytesIO(file_content)) as z:
# Get first XML file in zip
xml_files = [name for name in z.namelist() if name.endswith('.xml')]
if xml_files:
xml_content = z.read(xml_files[0])
else:
logger.error(f"No XML file found in ZIP: {filename}")
return None
else:
logger.error(f"Unsupported file format: {filename}")
return None
if not xml_content:
logger.error(f"Failed to extract XML content from {filename}")
return None
# Parse XML
xml_str = xml_content.decode('utf-8')
return parse_dmarc_xml(xml_str, xml_str)
except Exception as e:
logger.error(f"Error parsing DMARC file {filename}: {e}")
return None
def parse_dmarc_xml(xml_string: str, raw_xml: str) -> Dict[str, Any]:
"""
Parse DMARC XML content
Args:
xml_string: XML content as string
raw_xml: Original raw XML for storage
Returns:
Dictionary with parsed DMARC data
"""
try:
root = ET.fromstring(xml_string)
# Parse report metadata
metadata = root.find('report_metadata')
if metadata is None:
raise ValueError("Missing report_metadata element")
org_name = get_element_text(metadata, 'org_name')
email = get_element_text(metadata, 'email')
extra_contact_info = get_element_text(metadata, 'extra_contact_info')
report_id = get_element_text(metadata, 'report_id')
date_range = metadata.find('date_range')
if date_range is None:
raise ValueError("Missing date_range element")
begin_date = int(get_element_text(date_range, 'begin'))
end_date = int(get_element_text(date_range, 'end'))
# Parse published policy
policy = root.find('policy_published')
if policy is None:
raise ValueError("Missing policy_published element")
domain = get_element_text(policy, 'domain')
policy_published = {
'adkim': get_element_text(policy, 'adkim'),
'aspf': get_element_text(policy, 'aspf'),
'p': get_element_text(policy, 'p'),
'sp': get_element_text(policy, 'sp'),
'pct': get_element_text(policy, 'pct'),
'fo': get_element_text(policy, 'fo'),
'np': get_element_text(policy, 'np'),
}
# Remove None values
policy_published = {k: v for k, v in policy_published.items() if v is not None}
# Parse records
records = []
for record_elem in root.findall('record'):
record_data = parse_dmarc_record(record_elem)
if record_data:
records.append(record_data)
return {
'report_id': report_id,
'org_name': org_name,
'email': email,
'extra_contact_info': extra_contact_info,
'domain': domain,
'begin_date': begin_date,
'end_date': end_date,
'policy_published': policy_published,
'records': records,
'raw_xml': raw_xml
}
except Exception as e:
logger.error(f"Error parsing DMARC XML: {e}")
raise
def parse_dmarc_record(record_elem: ET.Element) -> Optional[Dict[str, Any]]:
"""
Parse a single DMARC record element
Args:
record_elem: XML element for a record
Returns:
Dictionary with parsed record data
"""
try:
row = record_elem.find('row')
if row is None:
return None
# Source and count
source_ip = get_element_text(row, 'source_ip')
count = int(get_element_text(row, 'count', '0'))
# Policy evaluation
policy_eval = row.find('policy_evaluated')
disposition = get_element_text(policy_eval, 'disposition') if policy_eval is not None else None
dkim_result = get_element_text(policy_eval, 'dkim') if policy_eval is not None else None
spf_result = get_element_text(policy_eval, 'spf') if policy_eval is not None else None
# Identifiers
identifiers = record_elem.find('identifiers')
header_from = get_element_text(identifiers, 'header_from')
envelope_from = get_element_text(identifiers, 'envelope_from')
envelope_to = get_element_text(identifiers, 'envelope_to')
# Auth results
auth_results = {}
auth_results_elem = record_elem.find('auth_results')
if auth_results_elem is not None:
# Parse DKIM results
dkim_results = []
for dkim_elem in auth_results_elem.findall('dkim'):
dkim_data = {
'domain': get_element_text(dkim_elem, 'domain'),
'selector': get_element_text(dkim_elem, 'selector'),
'result': get_element_text(dkim_elem, 'r') or get_element_text(dkim_elem, 'result')
}
dkim_results.append({k: v for k, v in dkim_data.items() if v})
if dkim_results:
auth_results['dkim'] = dkim_results
# Parse SPF results
spf_results = []
for spf_elem in auth_results_elem.findall('spf'):
spf_data = {
'domain': get_element_text(spf_elem, 'domain'),
'scope': get_element_text(spf_elem, 'scope'),
'result': get_element_text(spf_elem, 'r') or get_element_text(spf_elem, 'result')
}
spf_results.append({k: v for k, v in spf_data.items() if v})
if spf_results:
auth_results['spf'] = spf_results
return {
'source_ip': source_ip,
'count': count,
'disposition': disposition,
'dkim_result': dkim_result,
'spf_result': spf_result,
'header_from': header_from,
'envelope_from': envelope_from,
'envelope_to': envelope_to,
'auth_results': auth_results if auth_results else None
}
except Exception as e:
logger.error(f"Error parsing DMARC record: {e}")
return None
def get_element_text(parent: Optional[ET.Element], tag: str, default: Optional[str] = None) -> Optional[str]:
"""
Safely get text from XML element
Args:
parent: Parent XML element
tag: Tag name to find
default: Default value if not found
Returns:
Element text or default value
"""
if parent is None:
return default
elem = parent.find(tag)
if elem is not None and elem.text:
return elem.text.strip()
return default


@@ -0,0 +1,315 @@
"""
MaxMind GeoIP Auto-Downloader
Downloads and updates GeoLite2 databases automatically
"""
import logging
import os
import tarfile
import tempfile
from pathlib import Path
from datetime import datetime, timedelta
import requests
logger = logging.getLogger(__name__)
# Configuration from environment
MAXMIND_LICENSE_KEY = os.getenv('MAXMIND_LICENSE_KEY', '')
MAXMIND_ACCOUNT_ID = os.getenv('MAXMIND_ACCOUNT_ID', '')
GEOIP_DB_DIR = os.getenv('GEOIP_DB_DIR', '/app/data')
# Database paths
GEOIP_CITY_DB_PATH = os.path.join(GEOIP_DB_DIR, 'GeoLite2-City.mmdb')
GEOIP_ASN_DB_PATH = os.path.join(GEOIP_DB_DIR, 'GeoLite2-ASN.mmdb')
# MaxMind download URL
MAXMIND_DOWNLOAD_URL = "https://download.maxmind.com/app/geoip_download"
# Update frequency (days)
UPDATE_CHECK_DAYS = 7
# Databases to download
DATABASES = {
'City': {
'edition_id': 'GeoLite2-City',
'path': GEOIP_CITY_DB_PATH,
'description': 'Country + City + Coordinates'
},
'ASN': {
'edition_id': 'GeoLite2-ASN',
'path': GEOIP_ASN_DB_PATH,
'description': 'ASN + ISP information'
}
}
def is_license_configured() -> bool:
"""Check if MaxMind license key is configured"""
return bool(MAXMIND_LICENSE_KEY and MAXMIND_ACCOUNT_ID)
def get_db_age_days(db_path: str) -> int:
"""
Get age of database in days
Returns -1 if database doesn't exist
"""
path = Path(db_path)
if not path.exists():
return -1
# Get file modification time
mtime = path.stat().st_mtime
modified_date = datetime.fromtimestamp(mtime)
age_days = (datetime.now() - modified_date).days
return age_days
def should_update_database(db_name: str) -> bool:
"""
Check if database should be updated
Returns True if:
- Database doesn't exist
- Database is older than UPDATE_CHECK_DAYS days
"""
db_path = DATABASES[db_name]['path']
age_days = get_db_age_days(db_path)
if age_days == -1:
logger.info(f"{db_name} database not found, download required")
return True
if age_days >= UPDATE_CHECK_DAYS:
logger.info(f"{db_name} database is {age_days} days old, update required")
return True
logger.info(f"{db_name} database is {age_days} days old, no update needed")
return False
def download_single_database(db_name: str) -> bool:
"""
Download a single GeoIP database from MaxMind
Args:
db_name: 'City' or 'ASN'
Returns:
True if successful, False otherwise
"""
db_info = DATABASES[db_name]
try:
logger.info(f"Downloading GeoLite2-{db_name} database from MaxMind...")
logger.info(f" ({db_info['description']})")
# Construct download URL
params = {
'edition_id': db_info['edition_id'],
'license_key': MAXMIND_LICENSE_KEY,
'suffix': 'tar.gz'
}
# Download
response = requests.get(MAXMIND_DOWNLOAD_URL, params=params, stream=True, timeout=300)
if response.status_code == 401:
logger.error("MaxMind license key is invalid or expired")
return False
if response.status_code != 200:
logger.error(f"Failed to download {db_name} database: HTTP {response.status_code}")
return False
# Create temp file
with tempfile.NamedTemporaryFile(delete=False, suffix='.tar.gz') as tmp_file:
tmp_path = tmp_file.name
# Download with progress
total_size = int(response.headers.get('content-length', 0))
downloaded = 0
for chunk in response.iter_content(chunk_size=8192):
tmp_file.write(chunk)
downloaded += len(chunk)
if total_size > 0 and downloaded % (5 * 1024 * 1024) == 0: # Log every 5MB
progress = (downloaded / total_size) * 100
logger.info(f" Download progress: {progress:.1f}%")
size_mb = downloaded / (1024 * 1024)
logger.info(f" Downloaded {size_mb:.1f}MB")
# Extract tar.gz
logger.info(f" Extracting GeoLite2-{db_name} database...")
with tempfile.TemporaryDirectory() as tmp_dir:
with tarfile.open(tmp_path, 'r:gz') as tar:
tar.extractall(tmp_dir)
# Find the .mmdb file (it's in a subdirectory)
mmdb_files = list(Path(tmp_dir).rglob('*.mmdb'))
if not mmdb_files:
logger.error(f"No .mmdb file found in downloaded {db_name} archive")
os.unlink(tmp_path)
return False
mmdb_file = mmdb_files[0]
# Ensure destination directory exists
os.makedirs(GEOIP_DB_DIR, exist_ok=True)
# Move to destination
import shutil
shutil.copy2(mmdb_file, db_info['path'])
logger.info(f"✓ GeoLite2-{db_name} database installed at {db_info['path']}")
# Cleanup
os.unlink(tmp_path)
# Log database info
db_size = Path(db_info['path']).stat().st_size / (1024 * 1024)
logger.info(f" Database size: {db_size:.1f}MB")
return True
except requests.exceptions.RequestException as e:
logger.error(f"Network error downloading {db_name} database: {e}")
return False
except Exception as e:
logger.error(f"Error downloading {db_name} database: {e}")
return False
def download_geoip_databases() -> dict:
"""
Download both City and ASN databases from MaxMind
Returns:
{'City': bool, 'ASN': bool} - success status for each database
"""
if not is_license_configured():
logger.warning("MaxMind license key not configured, skipping download")
return {'City': False, 'ASN': False}
results = {}
for db_name in ['City', 'ASN']:
if should_update_database(db_name):
results[db_name] = download_single_database(db_name)
else:
logger.info(f"{db_name} database is up to date, skipping download")
results[db_name] = True # Already exists and up-to-date
return results
def update_geoip_database_if_needed() -> dict:
"""
Update GeoIP databases if needed
Called on startup and periodically
Returns:
{
'City': {'available': bool, 'updated': bool},
'ASN': {'available': bool, 'updated': bool}
}
"""
if not is_license_configured():
logger.info("MaxMind license key not configured, GeoIP features will be disabled")
return {
'City': {'available': False, 'updated': False},
'ASN': {'available': False, 'updated': False}
}
status = {}
for db_name in ['City', 'ASN']:
db_path = DATABASES[db_name]['path']
needs_update = should_update_database(db_name)
if not needs_update:
# Already up-to-date
status[db_name] = {
'available': True,
'updated': False # Didn't need update
}
continue
# Download
success = download_single_database(db_name)
if success:
status[db_name] = {
'available': True,
'updated': True
}
else:
# Check if old database exists
if Path(db_path).exists():
logger.info(f"Using existing (outdated) {db_name} database")
status[db_name] = {
'available': True,
'updated': False
}
else:
logger.error(f"No {db_name} database available")
status[db_name] = {
'available': False,
'updated': False
}
return status
def get_geoip_status() -> dict:
"""
Get current GeoIP databases status
Returns:
{
'configured': bool,
'City': {
'available': bool,
'age_days': int,
'size_mb': float,
'last_modified': str
},
'ASN': {
'available': bool,
'age_days': int,
'size_mb': float,
'last_modified': str
}
}
"""
status = {
'configured': is_license_configured(),
'City': {
'available': False,
'age_days': -1,
'size_mb': 0,
'last_modified': None
},
'ASN': {
'available': False,
'age_days': -1,
'size_mb': 0,
'last_modified': None
}
}
for db_name in ['City', 'ASN']:
db_path = Path(DATABASES[db_name]['path'])
if db_path.exists():
status[db_name]['available'] = True
status[db_name]['age_days'] = get_db_age_days(str(db_path))
status[db_name]['size_mb'] = round(db_path.stat().st_size / (1024 * 1024), 1)
mtime = db_path.stat().st_mtime
status[db_name]['last_modified'] = datetime.fromtimestamp(mtime).isoformat()
return status


@@ -0,0 +1,226 @@
"""
GeoIP Service for DMARC
Uses MaxMind GeoLite2-City and GeoLite2-ASN databases
"""
import logging
from typing import Optional, Dict
from pathlib import Path
logger = logging.getLogger(__name__)
GEOIP_CITY_DB_PATH = "/app/data/GeoLite2-City.mmdb"
GEOIP_ASN_DB_PATH = "/app/data/GeoLite2-ASN.mmdb"
_city_reader = None
_asn_reader = None
_geoip_available = None
def is_geoip_available() -> bool:
"""Check if GeoIP databases are available"""
global _geoip_available
if _geoip_available is None:
city_exists = Path(GEOIP_CITY_DB_PATH).exists()
asn_exists = Path(GEOIP_ASN_DB_PATH).exists()
_geoip_available = city_exists
if not city_exists:
logger.warning(f"GeoIP City database not found at {GEOIP_CITY_DB_PATH}")
logger.info("GeoIP features will be disabled. To enable, configure MAXMIND_LICENSE_KEY")
if not asn_exists:
logger.warning(f"GeoIP ASN database not found at {GEOIP_ASN_DB_PATH}")
logger.info("ASN information will not be available")
return _geoip_available
def get_city_reader():
"""Get or create GeoIP City database reader"""
global _city_reader
if not Path(GEOIP_CITY_DB_PATH).exists():
return None
if _city_reader is None:
try:
import geoip2.database
_city_reader = geoip2.database.Reader(GEOIP_CITY_DB_PATH)
logger.info(f"✓ GeoIP City database loaded from {GEOIP_CITY_DB_PATH}")
except ImportError:
logger.error("geoip2 module not installed. Install with: pip install geoip2")
_city_reader = None
except Exception as e:
logger.error(f"Failed to load GeoIP City database: {e}")
_city_reader = None
return _city_reader
def get_asn_reader():
"""Get or create GeoIP ASN database reader"""
global _asn_reader
if not Path(GEOIP_ASN_DB_PATH).exists():
return None
if _asn_reader is None:
try:
import geoip2.database
_asn_reader = geoip2.database.Reader(GEOIP_ASN_DB_PATH)
logger.info(f"✓ GeoIP ASN database loaded from {GEOIP_ASN_DB_PATH}")
except ImportError:
logger.error("geoip2 module not installed. Install with: pip install geoip2")
_asn_reader = None
except Exception as e:
logger.error(f"Failed to load GeoIP ASN database: {e}")
_asn_reader = None
return _asn_reader
def get_country_emoji(country_code: str) -> str:
"""
Convert ISO country code to flag emoji
Example: 'US' -> '🇺🇸'
"""
if not country_code or len(country_code) != 2:
return '🌍'
try:
code_points = [127462 + ord(c) - ord('A') for c in country_code.upper()]
return ''.join(chr(c) for c in code_points)
except (TypeError, ValueError):
return '🌍'
def lookup_ip(ip_address: str) -> Dict[str, Optional[str]]:
"""
Lookup IP address and return geo information
Uses both City and ASN databases
Returns:
{
'country_code': 'US',
'country_name': 'United States',
'city': 'Mountain View',
'asn': 'AS15169',
'asn_org': 'Google LLC'
}
If GeoIP is not available, returns all None values (graceful degradation)
"""
result = {
'country_code': None,
'country_name': None,
'city': None,
'asn': None,
'asn_org': None
}
city_reader = get_city_reader()
if city_reader:
try:
import geoip2.errors
response = city_reader.city(ip_address)
if response.country.iso_code:
result['country_code'] = response.country.iso_code
result['country_name'] = response.country.name
if response.city.name:
result['city'] = response.city.name
except geoip2.errors.AddressNotFoundError:
pass
except Exception as e:
logger.debug(f"Error looking up IP {ip_address} in City database: {e}")
asn_reader = get_asn_reader()
if asn_reader:
try:
import geoip2.errors
response = asn_reader.asn(ip_address)
if response.autonomous_system_number:
result['asn'] = f"AS{response.autonomous_system_number}"
if response.autonomous_system_organization:
result['asn_org'] = response.autonomous_system_organization
except geoip2.errors.AddressNotFoundError:
pass
except Exception as e:
logger.debug(f"Error looking up IP {ip_address} in ASN database: {e}")
return result
def enrich_dmarc_record(record_data: Dict) -> Dict:
"""
Enrich DMARC record with GeoIP data
Args:
record_data: Dictionary with 'source_ip' key
Returns:
Enhanced dictionary with geo data (or None values if GeoIP unavailable)
"""
if not is_geoip_available():
record_data.update({
'country_code': None,
'country_name': None,
'country_emoji': '🌍',
'city': None,
'asn': None,
'asn_org': None
})
return record_data
if 'source_ip' in record_data:
geo_info = lookup_ip(record_data['source_ip'])
record_data.update(geo_info)
record_data['country_emoji'] = get_country_emoji(geo_info.get('country_code'))
return record_data
def reload_geoip_readers():
"""
Reload GeoIP readers (after database update)
Call this after downloading new databases
"""
global _city_reader, _asn_reader, _geoip_available
if _city_reader:
try:
_city_reader.close()
except Exception:
pass
_city_reader = None
if _asn_reader:
try:
_asn_reader.close()
except Exception:
pass
_asn_reader = None
_geoip_available = None
city_ok = get_city_reader() is not None
asn_ok = get_asn_reader() is not None
if city_ok and asn_ok:
logger.info("✓ GeoIP databases reloaded successfully (City + ASN)")
return True
elif city_ok:
logger.info("✓ GeoIP City database reloaded (ASN unavailable)")
return True
else:
logger.warning("Failed to reload GeoIP databases")
return False


@@ -0,0 +1,130 @@
"""
Global SMTP Service
Generic email notification service for all system modules
"""
import logging
import smtplib
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
from datetime import datetime
from typing import Optional
from ..config import settings
logger = logging.getLogger(__name__)
class SmtpService:
"""Generic service for sending email notifications"""
def __init__(self):
self.host = settings.smtp_host
self.port = settings.smtp_port
self.use_tls = settings.smtp_use_tls
self.user = settings.smtp_user
self.password = settings.smtp_password
self.from_address = settings.smtp_from or settings.smtp_user
def is_configured(self) -> bool:
"""Check if SMTP is properly configured"""
return settings.notification_smtp_configured
def send_email(
self,
recipient: str,
subject: str,
text_content: str,
html_content: Optional[str] = None
) -> bool:
"""
Send email via SMTP
Args:
recipient: Email address to send to
subject: Email subject
text_content: Plain text content
html_content: Optional HTML content
Returns:
True if email was sent successfully, False otherwise
"""
if not self.is_configured():
logger.warning("SMTP not configured, skipping email")
return False
if not recipient:
logger.warning("No recipient email provided, skipping email")
return False
try:
msg = MIMEMultipart('alternative')
msg['Subject'] = subject
msg['From'] = self.from_address
msg['To'] = recipient
msg['Date'] = datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S +0000')
part1 = MIMEText(text_content, 'plain')
msg.attach(part1)
if html_content:
part2 = MIMEText(html_content, 'html')
msg.attach(part2)
if self.use_tls:
server = smtplib.SMTP(self.host, self.port)
server.starttls()
else:
server = smtplib.SMTP_SSL(self.host, self.port)
server.login(self.user, self.password)
server.sendmail(self.from_address, [recipient], msg.as_string())
server.quit()
logger.info(f"Email sent successfully to {recipient}: {subject}")
return True
except Exception as e:
logger.error(f"Failed to send email: {e}")
return False
def get_notification_email(module_specific: Optional[str] = None) -> str:
"""
Get email address for notifications with fallback logic
Args:
module_specific: Optional module-specific email override
Returns:
Email address to use (module email or admin email)
"""
if module_specific:
return module_specific
return settings.admin_email
def send_notification_email(
recipient: str,
subject: str,
text_content: str,
html_content: Optional[str] = None
) -> bool:
"""
Convenience function to send notification email
Args:
recipient: Email address
subject: Email subject
text_content: Plain text content
html_content: Optional HTML content
Returns:
True if sent successfully
"""
service = SmtpService()
if not service.is_configured():
logger.info("SMTP not configured, skipping notification")
return False
return service.send_email(recipient, subject, text_content, html_content)


@@ -34,6 +34,11 @@ tenacity==8.2.3
# DNS Queries - Required for domains management
dnspython==2.6.1
# MaxMind
geoip2>=4.7.0
maxminddb>=2.4.0
requests>=2.31.0
# Testing (optional, for development)
pytest==7.4.4
pytest-asyncio==0.23.3


@@ -28,6 +28,8 @@ services:
condition: service_healthy
ports:
- "${APP_PORT:-8080}:8080"
volumes:
- ./data:/app/data
networks:
- mailcow-logs-network
healthcheck:

File diff suppressed because it is too large


@@ -0,0 +1,49 @@
# Technical Overview: Email Authentication & Monitoring
To maintain high deliverability and robust domain security, **mailcow-logs-viewer** provides deep inspection and automated monitoring of the three core email authentication protocols: **SPF**, **DKIM**, and **DMARC**.
### The Authentication Stack
| Protocol | Technical Purpose | System Validation Logic |
| --- | --- | --- |
| **SPF** | **Identity Authorization:** Defines which IP addresses/hosts are authorized to send mail for a domain. | Validates against **RFC 7208**, checking the **10-DNS lookup limit**, recursive `include:` mechanisms, and verifying if the Mailcow server IP is explicitly authorized. |
| **DKIM** | **Message Integrity:** Provides a cryptographic signature to ensure the email content hasn't been altered in transit. | Inspects public keys for **SHA1 (weak hash)**, detects **revoked keys**, and warns if the record is stuck in **Testing Mode (`t=y`)**. |
| **DMARC** | **Policy Enforcement:** Provides instructions to receivers on how to handle failed SPF/DKIM checks. | Aggregates XML reports via IMAP, performing **Identifier Alignment** analysis and visualizing global mail flow. |
---
### Advanced Monitoring & Intelligence
**mailcow-logs-viewer** goes beyond basic record checking by providing a comprehensive analysis of your mail flow:
* **GeoIP & ASN Enrichment:** Integrated with **MaxMind GeoLite2**, the system enriches source IPs from DMARC reports with city-level location and Autonomous System (ASN) data. This allows you to identify legitimate third-party senders (like SendGrid or M365) versus malicious spoofing attempts.
* **Automated Data Ingestion:** An automated **IMAP worker** polls your designated reporting mailbox and processes `zip`/`gz` attachments.
* **SPF Recursion Analysis:** The validator simulates the receiver's evaluation process, detecting deep-nested includes that might cause the SPF check to fail due to the 10-lookup limit, a common issue in complex enterprise environments.
* **Compliance Dashboard:** Visualize a 30-day trend of your authentication pass rates. The UI provides color-coded compliance metrics (green ≥95%, yellow ≥80%, red <80%) and immediate visibility into `quarantine` or `reject` policy effectiveness.
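The dashboard's color bands can be sketched as a simple threshold function. `pass_rate_color` is a hypothetical helper written for illustration; the UI's actual implementation may differ, but the cutoffs mirror the documented bands:

```python
def pass_rate_color(pass_rate: float) -> str:
    """Map a DMARC pass-rate percentage to a dashboard color band."""
    if pass_rate >= 95.0:
        return "green"   # healthy: nearly all mail authenticates
    if pass_rate >= 80.0:
        return "yellow"  # degraded: some sources failing alignment
    return "red"         # at risk: investigate failing sources
```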
---
### 🚀 Implementation: Enabling DMARC Reporting
To leverage the monitoring capabilities, you must publish a DMARC record in your DNS. This triggers global receivers (Google, Microsoft, etc.) to generate and send aggregate reports (`rua`) to your system.
#### 1. DNS Configuration
Create a **TXT** record at the `_dmarc` subdomain (e.g., `_dmarc.example.com`):
```text
v=DMARC1; p=none; rua=mailto:dmarc-reports@yourdomain.com;
```
#### 2. Parameter Details
* **`p=none` (Monitoring Mode):** The recommended starting point. It ensures no mail is blocked while you collect data to verify that all legitimate sources are correctly authenticated.
* **`rua=mailto:...`:** This is the feedback loop trigger. Ensure this address is the one configured in the **IMAP Settings** of Mailcow Logs Viewer.
* **`v=DMARC1`:** Required version prefix.
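A published record is just a semicolon-separated list of `tag=value` pairs, which makes it easy to sanity-check before deployment. The sketch below splits a record into its tags; `parse_dmarc_record_txt` is a hypothetical helper for illustration, not part of the project API:

```python
def parse_dmarc_record_txt(record: str) -> dict:
    """Split a DMARC TXT record into its tag=value pairs."""
    tags = {}
    for part in record.split(';'):
        part = part.strip()
        if '=' in part:
            # partition at the first '=' so values like mailto: URIs stay intact
            key, _, value = part.partition('=')
            tags[key.strip()] = value.strip()
    return tags
```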
#### 3. Transitioning to Enforcement
Once the dashboard confirms that your legitimate traffic (including third-party SaaS) is passing SPF/DKIM alignment, you should update your policy to `p=quarantine` or `p=reject` to fully secure your domain against spoofing.
---


@@ -2,6 +2,87 @@
Get up and running in 5 minutes! 🚀
---
# Quick Start (TL;DR)
## Minimum Required Configuration
```bash
mkdir mailcow-logs-viewer && cd mailcow-logs-viewer
# Download docker-compose.yml and env.example, then:
mv env.example .env
nano .env
```
**Update these required settings in `.env`:**
```env
MAILCOW_URL=https://mail.example.com
MAILCOW_API_KEY=your_api_key_here
POSTGRES_PASSWORD=a7f3c8e2-4b1d-4f9a-8c3e-7d2f1a9b5e4c
ADMIN_EMAIL=admin@yourdomain.com
```
**Start:**
```bash
docker compose up -d
```
**Access:** `http://localhost:8080`
---
## Optional Features (all disabled by default)
Add to your `.env` file to enable:
**MaxMind GeoIP** (geographic location data):
```env
MAXMIND_ACCOUNT_ID=your_id
MAXMIND_LICENSE_KEY=your_key
```
And add data volume in `docker-compose.yml`:
```yaml
services:
app:
volumes:
- ./data:/app/data
```
**SMTP Notifications:**
```env
SMTP_ENABLED=true
SMTP_HOST=smtp.yourdomain.com
SMTP_PORT=587
SMTP_USE_TLS=true
SMTP_USER=user
SMTP_PASSWORD=pass
SMTP_FROM=noreply@yourdomain.com
```
**DMARC IMAP Auto-Import:**
```env
DMARC_IMAP_ENABLED=true
DMARC_IMAP_HOST=imap.yourdomain.com
DMARC_IMAP_PORT=993
DMARC_IMAP_USE_SSL=true
DMARC_IMAP_USER=dmarc@yourdomain.com
DMARC_IMAP_PASSWORD=your_password
```
**Authentication:**
```env
AUTH_ENABLED=true
AUTH_USERNAME=your_username
AUTH_PASSWORD=your_secure_password
```
---
# Detailed Installation Guide
## Prerequisites
- Docker & Docker Compose installed
@@ -10,7 +91,7 @@ Get up and running in 5 minutes! 🚀
---
## Installation Steps
### Step 1: Create Project Directory
@@ -40,17 +121,20 @@ Edit the `.env` file and configure the settings for your environment:
nano .env
```
#### Required Settings
**⚠️ You must update these settings:**
| Variable | Description | Example |
|----------|-------------|---------|
| `MAILCOW_URL` | Your Mailcow instance URL | `https://mail.example.com` |
| `MAILCOW_API_KEY` | Your Mailcow API key | `abc123-def456...` |
| `POSTGRES_PASSWORD` | Database password<br>⚠️ Avoid special chars (`@:/?#`) - breaks connection strings<br>💡 Use UUID: `uuidgen` or https://it-tools.tech/uuid-generator | `a7f3c8e2-4b1d-4f9a-8c3e-7d2f1a9b5e4c` |
| `ADMIN_EMAIL` | Admin email for notifications | `admin@yourdomain.com` |
**Review all other settings** and adjust as needed for your environment (timezone, fetch intervals, retention period, etc.)
#### Optional: Enable Authentication
For production deployments, enable HTTP Basic Authentication:
@@ -72,13 +156,13 @@ When enabled:
2. Navigate to **System** → **Configuration** → **Access**
3. Extend **API** section
4. Copy & Enable **Read-Only Access**
5. Paste the generated API key to your `.env` file
### Step 5: Configure Postfix (Important!)
For optimal message correlation, add this line to your Postfix configuration:
#### Add to `data/conf/postfix/extra.cf`:
```conf
always_add_missing_headers = yes
```
@@ -125,7 +209,7 @@ Expected response:
{
"status": "healthy",
"database": "connected",
"version": "1.3.0"
"version": "2.0.0"
}
```
@@ -146,6 +230,122 @@ INFO - ✅ Imported 45 Rspamd logs
---
# Optional Features Configuration
## MaxMind GeoIP Integration
Add geographic location data to your DMARC reports and log analysis.
### Setup Steps:
1. Sign up for a free MaxMind account at [https://www.maxmind.com/](https://www.maxmind.com/)
2. Create a **License Key**
3. Copy your **Account ID** and **License Key**
4. Add the credentials to your `.env` file:
```env
MAXMIND_ACCOUNT_ID=your_account_id
MAXMIND_LICENSE_KEY=your_license_key
```
5. **Map the data volume** in your `docker-compose.yml`:
```yaml
services:
app:
# ... other configurations
volumes:
- ./data:/app/data
```
> [!NOTE]
> The application will automatically download and update the GeoIP database into this folder using the credentials provided.
**If not configured:** The application works normally without GeoIP data.
---
## SMTP Email Notifications
Configure email notifications for system alerts and DMARC processing errors.
Add to your `.env` file:
```env
SMTP_ENABLED=true
SMTP_HOST=smtp.yourdomain.com
SMTP_PORT=587
SMTP_USE_TLS=true
SMTP_USER=your_smtp_user
SMTP_PASSWORD=your_smtp_password
SMTP_FROM=noreply@yourdomain.com
```
**If not configured:** No email notifications will be sent (default: `SMTP_ENABLED=false`).
---
## DMARC Configuration
### Retention Period
Control how long DMARC reports are stored:
```env
DMARC_RETENTION_DAYS=60
```
**Default:** 60 days if not specified.
### Manual Upload
Enable/disable manual DMARC report upload via the web interface:
```env
DMARC_MANUAL_UPLOAD_ENABLED=true
```
**Default:** `true` (enabled).
### IMAP Auto-Import
Automatically fetch DMARC reports from an email inbox:
```env
DMARC_IMAP_ENABLED=true
DMARC_IMAP_HOST=imap.yourdomain.com
DMARC_IMAP_PORT=993
DMARC_IMAP_USE_SSL=true
DMARC_IMAP_USER=dmarc@yourdomain.com
DMARC_IMAP_PASSWORD=your_password
DMARC_IMAP_FOLDER=INBOX
DMARC_IMAP_DELETE_AFTER=true
DMARC_IMAP_INTERVAL=3600
DMARC_IMAP_RUN_ON_STARTUP=true
```
**Configuration options:**
- `DMARC_IMAP_DELETE_AFTER`: Delete emails after processing (default: `true`)
- `DMARC_IMAP_INTERVAL`: Check interval in seconds (default: 3600 = 1 hour)
- `DMARC_IMAP_RUN_ON_STARTUP`: Process existing emails on startup (default: `true`)
**If not configured:** IMAP auto-import remains disabled (default: `DMARC_IMAP_ENABLED=false`).
### DMARC Error Notifications
Override the admin email specifically for DMARC processing errors:
```env
DMARC_ERROR_EMAIL=dmarc-alerts@yourdomain.com
```
**If not configured:** Uses `ADMIN_EMAIL` by default.
---
# Troubleshooting
## Common Issues
### No logs appearing?
@@ -166,6 +366,11 @@ INFO - ✅ Imported 45 Rspamd logs
- Check database password in `.env`
- Restart: `docker compose restart`
### Container won't start?
- Verify `ADMIN_EMAIL` is set
- Check Docker logs: `docker compose logs -f`
### Port 8080 already in use?
Change the port mapping in `docker-compose.yml` and restart:
@@ -174,14 +379,34 @@ docker compose down
docker compose up -d
```
### IMAP not working?
- Verify credentials and connection settings
- Check firewall allows outbound connections to IMAP server
- For Gmail: use App Passwords, not your regular password
### No email notifications?
- Ensure `SMTP_ENABLED=true`
- Verify SMTP credentials and server settings
- Check Docker logs for SMTP errors
---
# Updating the Application
To update to the latest version:
```bash
docker compose pull
docker compose up -d
```
**That's it!** The application will automatically:
- Run database migrations
- Initialize new features
- Apply your configuration
---
## Documentation
@@ -195,4 +420,13 @@ docker compose up -d
**Logs**: `docker compose logs app`
**Health**: `http://localhost:8080/api/health`
**Issues**: Open issue on GitHub
---
## Need Help?
If you encounter any issues, please open an issue on GitHub with:
- Your Docker logs
- Your `.env` configuration (remove sensitive data)
- Description of the problem


@@ -0,0 +1,207 @@
# DMARC Reports - User Guide
## Overview
The DMARC Reports page provides detailed analysis of DMARC aggregate reports received from email service providers. These reports show how your domain's emails are being handled across the internet and help identify authentication issues and potential email spoofing attempts.
## What is DMARC?
**DMARC (Domain-based Message Authentication, Reporting & Conformance)** is an email authentication protocol that:
- Validates that emails claiming to be from your domain are legitimate
- Tells receiving servers what to do with emails that fail validation
- Provides reports about email authentication results
## Report Types
### Aggregate Reports (XML)
Most common type of DMARC report, containing:
- **Statistics**: How many emails passed/failed authentication
- **Sources**: IP addresses sending email claiming to be from your domain
- **Results**: SPF and DKIM authentication outcomes
- **Disposition**: How receiving servers handled the emails
### Report Organization
The DMARC interface has multiple navigation levels:
#### 1. Domains View (Main Page)
- Lists all domains with DMARC reporting enabled
- Shows summary statistics:
- Total reports received
- Date range of reports
- Overall DMARC compliance rate
#### 2. Domain Overview
Click a domain to see:
- **Report Timeline**: Graph showing reports over time
- **Top Sending Sources**: Most active IP addresses
- **Compliance Summary**: Pass/fail statistics
- **Policy Effectiveness**: How well your DMARC policy is working
#### 3. Individual Report Details
Click a specific report to view:
- **Report Metadata**:
- Reporting organization (e.g., Gmail, Outlook)
- Date range covered
- Report ID
- **Authentication Results**:
- SPF alignment status
- DKIM alignment status
- Overall DMARC result
- **Message Statistics**:
- Total messages evaluated
- Disposition applied (none/quarantine/reject)
#### 4. Source IP Details
Click an IP address to see:
- **Geographic Information**:
- Country
- Region/City
- ISP/Organization
- **Authentication Details**:
- SPF check result
- DKIM check result
- DMARC alignment status
- **Volume**: Number of messages from this source
- **Reverse DNS**: Hostname associated with the IP
## Understanding Report Data
### DMARC Alignment
For an email to pass DMARC, it must pass either:
- **SPF alignment**: The sending domain passes SPF AND matches the From: header domain
- **DKIM alignment**: The email has a valid DKIM signature AND the domain matches the From: header
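The pass rule above can be sketched as a small predicate. This is a simplified illustration, not the project's evaluator: it models strict alignment only (exact domain match) and omits relaxed, organizational-domain alignment for brevity:

```python
def dmarc_passes(spf_result: str, spf_domain: str,
                 dkim_result: str, dkim_domain: str,
                 header_from: str) -> bool:
    """A message passes DMARC if SPF or DKIM both passes AND aligns
    with the From: header domain (strict alignment shown here)."""
    spf_aligned = spf_result == "pass" and spf_domain == header_from
    dkim_aligned = dkim_result == "pass" and dkim_domain == header_from
    return spf_aligned or dkim_aligned
```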
### Disposition
What the receiving server did with the email:
- **none**: Delivered normally (monitoring mode)
- **quarantine**: Moved to spam/junk folder
- **reject**: Bounced/blocked entirely
### Policy vs. Disposition
- **Policy**: What your DMARC record tells servers to do
- **Disposition**: What servers actually did (they may override your policy)
## Key Features
### Geographic Visualization
- Country flags show where emails are being sent from
- Hover over flags to see country names
- Click to filter by geographic region
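The flag rendering mirrors the backend's approach: a two-letter ISO country code maps onto Unicode regional-indicator symbols, with a globe fallback for unknown locations. A minimal sketch:

```python
def country_flag(country_code: str) -> str:
    """Convert an ISO 3166-1 alpha-2 code to a flag emoji, e.g. 'US' -> 🇺🇸."""
    if not country_code or len(country_code) != 2:
        return '🌍'  # globe fallback when no GeoIP match is available
    # Regional Indicator Symbol Letter A starts at U+1F1E6
    return ''.join(chr(0x1F1E6 + ord(c) - ord('A')) for c in country_code.upper())
```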
### Trend Analysis
- Charts show authentication patterns over time
- Identify sudden changes in email volume or sources
- Spot potential spoofing attempts
### Source Identification
- IP addresses with reverse DNS lookup
- ISP/organization information
- Historical data per source
### Compliance Tracking
- Pass rate percentage for SPF and DKIM
- DMARC policy effectiveness
- Recommendations for policy adjustments
## Common Scenarios
### Legitimate Sources Failing
**Symptom**: Known good sources showing failures
**Causes**:
- Third-party email services not properly configured
- Marketing platforms lacking DKIM signatures
- Forwarded emails breaking SPF
**Solutions**:
- Add third-party IPs to SPF record
- Configure DKIM with third-party services
- Consider relaxed alignment (`aspf=r`/`adkim=r`) if strict alignment breaks legitimate mail
### Unknown Sources Appearing
**Symptom**: Unexpected IP addresses in reports
**Investigation**:
1. Check reverse DNS and ISP
2. Look for geographic anomalies
3. Compare message volume
4. Review authentication failures
**Action**: If suspicious, strengthen DMARC policy
### High Failure Rate
**Symptom**: Low DMARC pass percentage
**Diagnosis**:
- Review which sources are failing
- Check SPF record completeness
- Verify DKIM is configured on all sending systems
- Look for email forwarding issues
## Best Practices
### Policy Progression
1. **Start**: `p=none` (monitoring only)
2. **Observe**: Collect reports for 2-4 weeks
3. **Identify**: Find all legitimate sending sources
4. **Fix**: Configure SPF/DKIM for all sources
5. **Upgrade**: Move to `p=quarantine`
6. **Monitor**: Watch for issues
7. **Final**: Move to `p=reject` for maximum protection
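For example, a DMARC record at step 5 of this progression might look like the following (the domain and reporting address are placeholders):

```
_dmarc.example.com.  IN  TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com; pct=100"
```

Lowering `pct` (e.g., `pct=25`) lets you apply the stricter policy to only a fraction of mail while you build confidence.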
### Regular Review
- Check reports at least weekly
- Look for new sources or suspicious patterns
- Monitor DMARC compliance rate
- Update SPF/DKIM as infrastructure changes
### Third-Party Services
When using email services (marketing, support desk, etc.):
- Request DKIM signing
- Add their IPs to SPF record
- Test before going live
- Monitor their authentication success
## Troubleshooting
### No Reports Appearing
- **Check DMARC Record**: Verify `rua=` tag has correct email
- **Wait**: Reports can take 24-48 hours to arrive
- **Email Access**: Ensure reporting email is accessible
### Reports Not Parsing
- **Format Issues**: Some providers send non-standard XML
- **Upload Manually**: Use upload button for problematic reports
- **Contact Support**: Report parsing issues
### Confusing Results
- **Multiple Sources**: Different email systems may show different results
- **Forwarding**: Email forwarding can break SPF
- **Subdomains**: Check if subdomain policy is needed
## Report Retention
- Reports are stored according to your configured retention period
- Default: 60 days
- Older reports are automatically deleted to save space
- Export reports before they're deleted if long-term analysis is needed
## Security Considerations
### Identifying Spoofing
Watch for:
- Unusual geographic sources
- High volume from unknown IPs
- 100% authentication failures from specific sources
- Mismatched reverse DNS
### Response to Threats
1. Document the suspicious activity
2. Strengthen DMARC policy if not already at `reject`
3. Review and tighten SPF records
4. Consider adding forensic reporting (`ruf=`)
5. Contact abuse departments at sending ISPs
## Additional Resources
- [DMARC Official Site](https://dmarc.org/)
- [DMARC Alignment Guide](https://dmarc.org/overview/)
- [RFC 7489 - DMARC Specification](https://tools.ietf.org/html/rfc7489)

@@ -0,0 +1,90 @@
# Domains Page - User Guide
## Overview
The Domains page displays all email domains configured in your Mailcow server, along with comprehensive DNS validation and domain statistics.
## Key Features
### Domain Information
- **Domain Name**: Your email domain
- **Active Status**: Whether the domain is currently active
- **Mailboxes**: Current/Maximum mailbox count and available slots
- **Aliases**: Current/Maximum alias count and available slots
- **Storage**: Total storage used and quota (if applicable)
### DNS Security Validation
The system automatically validates three critical DNS records:
#### SPF (Sender Policy Framework)
- **Purpose**: Specifies which mail servers can send email on behalf of your domain
- **Status Indicators**:
  - ✅ **Success**: SPF record exists and is properly configured
  - ⚠️ **Warning**: SPF record exists but may need optimization
  - ❌ **Error**: SPF record is missing or incorrect
  - ❓ **Unknown**: Not yet checked
#### DKIM (DomainKeys Identified Mail)
- **Purpose**: Adds a digital signature to outgoing emails
- **Validation**: Compares your DNS record with Mailcow's configured DKIM key
- **Status**: Same indicators as SPF
#### DMARC (Domain-based Message Authentication)
- **Purpose**: Defines how recipients should handle emails that fail authentication
- **Policy Levels**:
- `reject`: Strongest protection (recommended)
- `quarantine`: Moderate protection
- `none`: Monitoring only (weakest)
- **Status**: Same indicators as SPF
## How to Use
### Viewing Domains
1. All domains are displayed in an expandable list
2. Quick overview shows domain name, status, and DNS validation summary
3. Click any domain row to expand and view detailed information
### DNS Validation
- **Automatic Checks**: DNS records are validated every 6 hours in the background
- **Manual Check**: Click the "Check DNS" button within any domain's details to force an immediate validation
- **Last Checked**: Timestamp shows when DNS was last validated
### Search & Filter
- **Search Box**: Filter domains by name
- **Issues Filter**: Check "Show DNS Issues Only" to display only domains with DNS problems
### Understanding DNS Status
When you expand a domain, the DNS Security section shows:
- Detailed status message for each record type
- The actual DNS record value (for DKIM and DMARC)
- Specific warnings or recommendations
- Time of last validation
## Best Practices
1. **Regular Monitoring**: Review DNS status regularly, especially after DNS changes
2. **Fix Issues Promptly**: Address DNS warnings and errors as soon as possible
3. **Strong DMARC Policy**: Aim for `quarantine` or `reject` policy
4. **SPF Optimization**: Keep SPF records concise (under 10 DNS lookups)
5. **DKIM Key Rotation**: Periodically rotate DKIM keys for security
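On the lookup limit: the `include`, `a`, `mx`, `exists`, and `ptr` mechanisms and the `redirect` modifier each cost a DNS query, and RFC 7208 caps these at 10 per evaluation. A rough counter, as a sketch (it does not recurse into included records, which also count toward the limit):

```python
def spf_lookup_count(record: str) -> int:
    """Roughly count SPF terms that trigger a DNS lookup (RFC 7208 limit: 10)."""
    count = 0
    for term in record.split():
        term = term.lstrip("+-~?")  # strip qualifiers
        if term.startswith(("include:", "exists:", "redirect=", "ptr")):
            count += 1
        elif term in ("a", "mx") or term.startswith(("a:", "mx:", "a/", "mx/")):
            count += 1
    return count
```

`ip4:`/`ip6:` mechanisms are free, which is why flattening includes into IP ranges is a common way to stay under the limit.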
## Troubleshooting
### DNS Changes Not Reflected
- DNS changes can take 24-72 hours to propagate globally
- Use the manual "Check DNS" button to verify after waiting
- Check your DNS provider's interface to confirm records are published
### "DNS Query Timeout" Errors
- Indicates temporary DNS server issues
- Wait a few minutes and try again
- If persistent, check your DNS provider's status
### "Record Mismatch" Warnings
- Compare the "Expected" vs "Actual" record values
- Update your DNS to match the expected value
- Wait for DNS propagation, then check again
## Related Resources
- [SPF Record Syntax](https://en.wikipedia.org/wiki/Sender_Policy_Framework)
- [DKIM Overview](https://en.wikipedia.org/wiki/DomainKeys_Identified_Mail)
- [DMARC Policy Guide](https://dmarc.org/)

documentation/UpdateV2.md
@@ -0,0 +1,298 @@
# Upgrade Guide v2.0 - New Environment Variables
## Overview
This update introduces several new optional features and configuration options for the mailcow-logs-viewer v2 application. All changes are **backward compatible** - existing installations will continue to work without any modifications.
## What's New
### 1. **GeoIP Integration (MaxMind)**
Add geographic location data to your DMARC reports and log analysis.
### 2. **SMTP Email Notifications**
Configure email notifications for system alerts and DMARC processing errors.
### 3. **Admin Email**
Centralized admin contact for system notifications.
### 4. **Enhanced DMARC Features**
- Configurable retention period
- Manual report upload capability
- Automatic IMAP import for DMARC reports
---
# TL;DR
## Optional Features (all disabled by default)
Add to your `.env` file:
**MaxMind GeoIP:**
```env
MAXMIND_ACCOUNT_ID=your_id
MAXMIND_LICENSE_KEY=your_key
```
Mount a data volume for the MaxMind databases:
```yaml
services:
app:
volumes:
- ./data:/app/data
```
**SMTP Notifications:**
```env
SMTP_ENABLED=true
SMTP_HOST=smtp.yourdomain.com
SMTP_PORT=587
SMTP_USER=user
SMTP_PASSWORD=pass
SMTP_FROM=noreply@yourdomain.com
```
**DMARC IMAP Auto-Import:**
```env
DMARC_IMAP_ENABLED=true
DMARC_IMAP_HOST=imap.yourdomain.com
DMARC_IMAP_PORT=993
DMARC_IMAP_USE_SSL=true
DMARC_IMAP_USER=dmarc@yourdomain.com
DMARC_IMAP_PASSWORD=your_password
```
**Other optional settings:**
- `ADMIN_EMAIL=admin@yourdomain.com`
- `DMARC_RETENTION_DAYS=60` (default: 60)
- `DMARC_MANUAL_UPLOAD_ENABLED=true` (default: true)
- `DMARC_ERROR_EMAIL=` (optional, uses ADMIN_EMAIL if not set)
**Upgrade**
```bash
docker compose pull
docker compose up -d
```
**That's it!**
---
## Changes
### Admin Email
Add this variable to your `.env` file:
```env
ADMIN_EMAIL=admin@yourdomain.com
```
**Replace `admin@yourdomain.com` with your actual email address.** This email will receive system notifications and error alerts.
---
## Optional Features
### MaxMind GeoIP (Optional)
To enable geographic location enrichment in Email Source IP & DMARC reports:
* [ ] Sign up for a free MaxMind account at [https://www.maxmind.com/](https://www.maxmind.com/)
* [ ] Create a **License Key**
* [ ] Copy your **Account ID** and **License Key**
* [ ] Add the credentials to your `.env` file:
```env
MAXMIND_ACCOUNT_ID=your_account_id
MAXMIND_LICENSE_KEY=your_license_key
```
* [ ] **Map the data volume** in your `docker-compose.yml` to persist the database after a container restart:
```yaml
services:
app:
# ... other configurations
volumes:
- ./data:/app/data
```
> [!NOTE]
> The application will automatically download and update the GeoIP database into this folder using the credentials provided.
**If not configured:** The application works normally without GeoIP data.
---
### SMTP Email Notifications (Optional)
To enable email notifications:
```env
SMTP_ENABLED=true
SMTP_HOST=smtp.yourdomain.com
SMTP_PORT=587
SMTP_USE_TLS=true
SMTP_USER=your_smtp_user
SMTP_PASSWORD=your_smtp_password
SMTP_FROM=noreply@yourdomain.com
```
**If not configured:** No email notifications will be sent (default: `SMTP_ENABLED=false`).
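As an illustration of how these variables are typically consumed, a minimal notification helper might look like this (a hypothetical sketch using Python's standard `smtplib`, not the application's actual code):

```python
import os
import smtplib
from email.message import EmailMessage

def send_notification(subject: str, body: str) -> bool:
    """Send an alert using the SMTP_* environment variables; no-op when disabled."""
    if os.environ.get("SMTP_ENABLED", "false").lower() != "true":
        return False  # notifications disabled (the default)
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = os.environ["SMTP_FROM"]
    msg["To"] = os.environ.get("ADMIN_EMAIL", os.environ["SMTP_FROM"])
    msg.set_content(body)
    with smtplib.SMTP(os.environ["SMTP_HOST"], int(os.environ.get("SMTP_PORT", "587"))) as smtp:
        if os.environ.get("SMTP_USE_TLS", "true").lower() == "true":
            smtp.starttls()  # upgrade to TLS, typical for port 587
        smtp.login(os.environ["SMTP_USER"], os.environ["SMTP_PASSWORD"])
        smtp.send_message(msg)
    return True
```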
---
### DMARC Configuration (Optional)
#### Retention Period
Control how long DMARC reports are stored:
```env
DMARC_RETENTION_DAYS=60
```
**Default:** 60 days if not specified.
---
#### Manual Upload
Enable/disable manual DMARC report upload via the web interface:
```env
DMARC_MANUAL_UPLOAD_ENABLED=true
```
**Default:** `true` (enabled).
---
#### IMAP Auto-Import
Automatically fetch DMARC reports from an email inbox:
```env
DMARC_IMAP_ENABLED=true
DMARC_IMAP_HOST=imap.yourdomain.com
DMARC_IMAP_PORT=993
DMARC_IMAP_USE_SSL=true
DMARC_IMAP_USER=dmarc@yourdomain.com
DMARC_IMAP_PASSWORD=your_password
DMARC_IMAP_FOLDER=INBOX
DMARC_IMAP_DELETE_AFTER=true
DMARC_IMAP_INTERVAL=3600
DMARC_IMAP_RUN_ON_STARTUP=true
```
**Configuration options:**
- `DMARC_IMAP_DELETE_AFTER`: Delete emails after processing (default: `true`)
- `DMARC_IMAP_INTERVAL`: Check interval in seconds (default: 3600 = 1 hour)
- `DMARC_IMAP_RUN_ON_STARTUP`: Process existing emails on startup (default: `true`)
**If not configured:** IMAP auto-import remains disabled (default: `DMARC_IMAP_ENABLED=false`).
---
#### DMARC Error Notifications
Override the admin email specifically for DMARC processing errors:
```env
DMARC_ERROR_EMAIL=dmarc-alerts@yourdomain.com
```
**If not configured:** Uses `ADMIN_EMAIL` by default.
---
## Upgrade Steps
1. **Update your `.env` file:**
- Add `ADMIN_EMAIL=your@email.com`
- Add any optional features you want to enable
2. **Pull the latest image:**
```bash
docker compose pull
```
3. **Start the container:**
```bash
docker compose up -d
```
**That's it!** The application will automatically:
- Run database migrations
- Initialize new features
- Apply your configuration
---
## Full v2 Configuration Example
Complete example with all features enabled:
```env
# Required
ADMIN_EMAIL=admin@yourdomain.com
# MaxMind GeoIP
MAXMIND_ACCOUNT_ID=123456
MAXMIND_LICENSE_KEY=your_license_key_here
# SMTP Notifications
SMTP_ENABLED=true
SMTP_HOST=smtp.gmail.com
SMTP_PORT=587
SMTP_USE_TLS=true
SMTP_USER=notifications@yourdomain.com
SMTP_PASSWORD=your_app_password
SMTP_FROM=noreply@yourdomain.com
# DMARC Settings
DMARC_RETENTION_DAYS=90
DMARC_MANUAL_UPLOAD_ENABLED=true
# DMARC IMAP Auto-Import
DMARC_IMAP_ENABLED=true
DMARC_IMAP_HOST=imap.gmail.com
DMARC_IMAP_PORT=993
DMARC_IMAP_USE_SSL=true
DMARC_IMAP_USER=dmarc-reports@yourdomain.com
DMARC_IMAP_PASSWORD=your_app_password
DMARC_IMAP_FOLDER=INBOX
DMARC_IMAP_DELETE_AFTER=true
DMARC_IMAP_INTERVAL=3600
DMARC_IMAP_RUN_ON_STARTUP=true
# Optional: Separate email for DMARC errors
DMARC_ERROR_EMAIL=dmarc-admin@yourdomain.com
```
---
## Troubleshooting
**Container won't start after update:**
- Verify `ADMIN_EMAIL` is set
- Check Docker logs: `docker compose logs -f`
**IMAP not working:**
- Verify credentials and connection settings
- Check firewall allows outbound connections to IMAP server
- For Gmail: use App Passwords, not your regular password
**No email notifications:**
- Ensure `SMTP_ENABLED=true`
- Verify SMTP credentials and server settings
- Check Docker logs for SMTP errors
---
## Need Help?
If you encounter any issues during the upgrade, please open an issue on GitHub with:
- Your Docker logs
- Your `.env` configuration (remove sensitive data)
- Description of the problem

@@ -25,6 +25,13 @@ POSTGRES_DB=mailcowlogs
POSTGRES_HOST=db
POSTGRES_PORT=5432
# =============================================================================
# MAXMIND (Optional)
# =============================================================================
MAXMIND_ACCOUNT_ID=
MAXMIND_LICENSE_KEY=
# =============================================================================
# FETCH CONFIGURATION
# =============================================================================
@@ -55,6 +62,50 @@ MAX_CORRELATION_AGE_MINUTES=10
# Correlation check interval (seconds)
CORRELATION_CHECK_INTERVAL=120
# =============================================================================
# Global SMTP CONFIGURATION (Optional)
# =============================================================================
SMTP_ENABLED=false
SMTP_HOST=
SMTP_PORT=
SMTP_USE_TLS=true
SMTP_USER=
SMTP_PASSWORD=
SMTP_FROM=noreply@yourdomain.com
# =============================================================================
# Admin Email
# =============================================================================
ADMIN_EMAIL=admin@yourdomain.com
# =============================================================================
# DMARC CONFIGURATION (Optional)
# =============================================================================
# DMARC reports retention in days
# Default: 60 days
DMARC_RETENTION_DAYS=60
# DMARC Manual Upload
DMARC_MANUAL_UPLOAD_ENABLED=true
# DMARC IMAP Auto-Import Configuration
DMARC_IMAP_ENABLED=false
DMARC_IMAP_HOST=
DMARC_IMAP_PORT=993
DMARC_IMAP_USE_SSL=true
DMARC_IMAP_USER=
DMARC_IMAP_PASSWORD=
DMARC_IMAP_FOLDER=INBOX
DMARC_IMAP_DELETE_AFTER=true
DMARC_IMAP_INTERVAL=3600
DMARC_IMAP_RUN_ON_STARTUP=true
# DMARC Error Email Override (optional - uses ADMIN_EMAIL if not set)
DMARC_ERROR_EMAIL=
# =============================================================================
# BLACKLIST CONFIGURATION (Optional)
# =============================================================================

File diff suppressed because it is too large

Some files were not shown because too many files have changed in this diff.