# VC Processing Cron System

## Overview

The VC Processing Cron System automatically processes Verifiable Credential (VC) events from the Dhiway Analytics API in time-based windows. It runs on a configurable schedule (default: every 2 hours) and processes events from the last processed timestamp up to the current time minus a lookback window.
## Architecture

### Components
- **DhiwayVcProcessingCron** (`src/modules/users/crons/dhiway-vc-processing.cron.ts`)
  - Main cron worker that runs on the schedule
  - Manages time windows and state
  - Processes records from the Analytics API
- **VcProcessingService** (`src/modules/users/services/vc-processing.service.ts`)
  - Business logic extracted from `UserService.processVcEvent()`
  - Processes VC events without HTTP responses
  - Handles document updates, verification, and profile updates
- **DhiwayAnalyticsService** (`src/services/dhiway-analytics/dhiway-analytics.service.ts`)
  - Fetches VC summary data from the Dhiway Analytics API
  - Handles API errors and timeouts
### Entities

- **CronState** - tracks the last processed timestamp
- **VcEventProcessingLog** - logs processing results for audit and idempotency
## Configuration

### Environment Variables

Add these to your `.env` file:
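The exact variable names are defined by the project's configuration module; the names below are hypothetical placeholders illustrating the three knobs the cron relies on (schedule, lookback window, and Analytics API base URL):

```env
# Hypothetical names -- check the project's config module for the real ones
VC_PROCESSING_CRON_SCHEDULE="0 */2 * * *"   # default: every 2 hours
VC_PROCESSING_LOOKBACK_MINUTES=10           # lookback window subtracted from NOW()
DHIWAY_ANALYTICS_BASE_URL=https://analytics.example.com
```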
### Cron Schedule Format

The cron schedule follows the standard cron format: `minute hour day month day-of-week`
Examples:

- `0 */2 * * *` - every 2 hours at minute 0
- `0 */4 * * *` - every 4 hours at minute 0
- `0 9,17 * * *` - at 9 AM and 5 PM daily
### Type Mapping

Dhiway Analytics types are mapped to internal VC statuses in `src/config/dhiway-analytics.config.ts`:

- `record_anchored` → `issued`
Add more mappings as needed for future types.
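As a sketch of what that mapping might look like (the exported name and shape here are assumptions for illustration, not the actual file contents):

```typescript
// src/config/dhiway-analytics.config.ts (illustrative sketch only)
// The constant name and value type are assumptions for this example.
export const DHIWAY_TYPE_TO_VC_STATUS: Record<string, string> = {
  record_anchored: 'issued',
  // e.g. record_revoked: 'revoked', once such a type is needed (hypothetical)
};
```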
## Database Setup

### Run Migration

Execute the migration script to create the required tables:
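One way to apply it, assuming a PostgreSQL database and `psql` on the PATH (the repository may ship its own runner script instead):

```bash
psql "$DATABASE_URL" -f migrations/create_vc_processing_cron_tables.sql
```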
Or manually run the SQL from `migrations/create_vc_processing_cron_tables.sql`.
### Tables Created

**`cron_state`**

- Stores cron job state
- Tracks the `last_processed_to` timestamp
- Initialized with default values on the first run

**`vc_event_processing_log`**

- Logs all processing attempts
- Used for idempotency checks
- Provides an audit trail
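For orientation, a plausible shape for these tables; the authoritative DDL is the migration file above, and every column beyond `last_processed_to` and `vc_public_id` is an assumption:

```sql
-- Illustrative sketch only; see migrations/create_vc_processing_cron_tables.sql
CREATE TABLE cron_state (
  id                SERIAL PRIMARY KEY,
  last_processed_to TIMESTAMPTZ NOT NULL
);

CREATE TABLE vc_event_processing_log (
  id            SERIAL PRIMARY KEY,
  vc_public_id  TEXT NOT NULL,
  status        TEXT NOT NULL,     -- e.g. 'success' | 'failed' (assumed)
  error_message TEXT,
  created_at    TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
```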
## How It Works

### Time Window Processing
1. **First Run**: Initializes `cron_state` with `last_processed_to = NOW()`
2. **Subsequent Runs**:
   - Reads `last_processed_to` from `cron_state`
   - Calculates the time window: `from = last_processed_to`, `to = NOW()` (minus the configured lookback window, per the Overview)
   - Processes all records in that window
   - Updates `last_processed_to` to `to` after a successful batch
### Processing Flow
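A simplified TypeScript sketch of one cron pass, tying the pieces above together. Every interface and method name below is an assumption for illustration; the real implementation lives in `dhiway-vc-processing.cron.ts` and `vc-processing.service.ts`.

```typescript
// Sketch of one cron pass; names and shapes are assumptions, not the real API.
interface AnalyticsRecord { vcPublicId: string; type: string }

interface Deps {
  getLastProcessedTo(): Promise<Date>;                     // reads cron_state
  setLastProcessedTo(to: Date): Promise<void>;             // updates cron_state
  fetchVcSummaries(from: Date, to: Date): Promise<AnalyticsRecord[]>;
  alreadyProcessed(vcPublicId: string): Promise<boolean>;  // vc_event_processing_log lookup
  logResult(vcPublicId: string, status: 'success' | 'failed', error?: string): Promise<void>;
  processVcEvent(record: AnalyticsRecord): Promise<void>;  // VcProcessingService logic
}

async function runVcProcessingPass(deps: Deps, lookbackMs: number): Promise<void> {
  const from = await deps.getLastProcessedTo();   // NOW() on first run
  const to = new Date(Date.now() - lookbackMs);   // current time minus lookback window

  // If the Analytics API call throws, the pass aborts and cron_state is not updated.
  const records = await deps.fetchVcSummaries(from, to);

  let success = 0;
  let failed = 0;
  for (const record of records) {
    if (await deps.alreadyProcessed(record.vcPublicId)) continue; // idempotency check
    try {
      await deps.processVcEvent(record);
      await deps.logResult(record.vcPublicId, 'success');
      success++;
    } catch (err) {
      // Individual failures are logged; processing continues for remaining records.
      await deps.logResult(record.vcPublicId, 'failed',
        err instanceof Error ? err.message : String(err));
      failed++;
    }
  }

  await deps.setLastProcessedTo(to); // advance the window only after the batch completes
  console.log(`Completed processing. Success: ${success}, Failed: ${failed}`);
}
```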
### Idempotency

- Checks `vc_event_processing_log` for records already processed in the same batch
- Prevents duplicate processing if the cron runs multiple times
- Safe to process the same `vc_public_id` multiple times (document updates are idempotent)
### Error Handling

- **Analytics API errors**: Logged; batch processing stops (state is not updated)
- **Individual record failures**: Logged; processing continues for the remaining records
- **Missing documents**: Logged as a warning and skipped
- **Processing errors**: Logged to `vc_event_processing_log` with an error message
Failed records will be picked up in the next cron run if they fall within the time window.
## Monitoring

### Logs

All logs include the service context: `dhiway-vc-processing`
Key log messages:

- `Starting VC processing cron job` - cron start
- `Processing time window: {from} to {to}` - window being processed
- `Found {count} records to process` - records found
- `Successfully processed record {id}` - success
- `Failed to process record {id}` - failure
- `Completed processing. Success: X, Failed: Y` - summary
### Database Queries
Check last processed time:
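The queries in this section are sketches against the schema outlined above; adjust names to match the actual migration.

```sql
SELECT last_processed_to FROM cron_state;
```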
View processing logs:
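A sketch, assuming `status`, `error_message`, and `created_at` columns:

```sql
SELECT vc_public_id, status, error_message, created_at
FROM vc_event_processing_log
ORDER BY created_at DESC
LIMIT 50;
```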
Success rate:
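A sketch, assuming PostgreSQL and a `status` column holding `'success'`/`'failed'`:

```sql
SELECT
  COUNT(*) FILTER (WHERE status = 'success') AS succeeded,
  COUNT(*) FILTER (WHERE status = 'failed')  AS failed,
  ROUND(100.0 * COUNT(*) FILTER (WHERE status = 'success')
        / NULLIF(COUNT(*), 0), 2)            AS success_rate_pct
FROM vc_event_processing_log;
```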
## Troubleshooting

### Cron Not Running

- Check that `ScheduleModule.forRoot()` is imported in `users.module.ts` (see the sketch after this list)
- Verify the cron schedule format is correct
- Check application logs for errors
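If the schedule module is missing, NestJS `@Cron` decorators never fire. A minimal sketch of the registration, with the module's other imports and providers omitted:

```typescript
// users.module.ts (sketch)
import { Module } from '@nestjs/common';
import { ScheduleModule } from '@nestjs/schedule';

@Module({
  imports: [ScheduleModule.forRoot()],
  // providers: [DhiwayVcProcessingCron, ...]
})
export class UsersModule {}
```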
### No Records Processed

- Check `cron_state.last_processed_to` - it may be ahead of the current time
- Verify the Analytics API is returning data for the time window
- Check the Analytics API base URL configuration
Failed Records
Check
vc_event_processing_logfor error messagesVerify documents exist with matching
vc_public_idCheck adapter processing logs
### Reset Cron State

To reset and start from a specific time:
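A sketch, assuming a single-row `cron_state` table; add a `WHERE` clause if state is keyed per job, and replace the placeholder timestamp with your desired start time:

```sql
UPDATE cron_state
SET last_processed_to = '2025-01-01 00:00:00+00';
```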
## API Endpoint

The existing `/users/vc/process-event` API endpoint remains unchanged and continues to work independently. The cron system uses the same business logic internally without triggering the API.
## Future Enhancements

- Add a retry mechanism for failed records (if needed)
- Add metrics/alerting integration
- Support for multiple issuer types
- Batch size limits for large datasets