
Monitoring & Analytics

Comprehensive guide for monitoring your EvoBin integration, tracking performance, and gathering insights.

Monitoring Architecture​

graph TB
subgraph "Your Application"
A[EvoBin Client] --> B[Monitoring Agent]
B --> C[Local Storage]
B --> D[Remote Monitoring]
end
subgraph "Monitoring Infrastructure"
D --> E[Metrics Collector]
D --> F[Log Aggregator]
D --> G[Alert Manager]
end
subgraph "Dash Platform"
A --> H[Blockchain Network]
H --> I[Transaction Metrics]
H --> J[Network Health]
end
subgraph "Observability Tools"
E --> K[Prometheus/Grafana]
F --> L[ELK Stack]
G --> M[Alerting System]
end
K --> N[Dashboards]
L --> O[Log Search]
M --> P[Notifications]

Key Metrics to Track​

Storage Metrics​

interface StorageMetrics {
// Capacity metrics
totalStorageUsed: number // bytes
storageByType: Record<string, number> // bytes by file type
storageGrowthRate: number // bytes/day
// File metrics
totalFiles: number
filesByType: Record<string, number>
averageFileSize: number // bytes
// Performance metrics
uploadSuccessRate: number // 0-1
downloadSuccessRate: number // 0-1
averageUploadTime: number // ms
averageDownloadTime: number // ms
// Cost metrics
totalCreditsUsed: number
creditsPerMb: number
estimatedMonthlyCost: number
// Redundancy metrics
replicationFactor: number // average
chunksLost: number
chunksRecovered: number
}
class StorageMonitor {
private metrics: StorageMetrics = {
totalStorageUsed: 0,
storageByType: {},
storageGrowthRate: 0,
totalFiles: 0,
filesByType: {},
averageFileSize: 0,
uploadSuccessRate: 0,
downloadSuccessRate: 0,
averageUploadTime: 0,
averageDownloadTime: 0,
totalCreditsUsed: 0,
creditsPerMb: 0,
estimatedMonthlyCost: 0,
replicationFactor: 3,
chunksLost: 0,
chunksRecovered: 0
}
private uploadTimes: number[] = []
private downloadTimes: number[] = []
private uploadSuccesses = 0
private uploadFailures = 0
private downloadSuccesses = 0
private downloadFailures = 0
recordFileUpload(file: File, uploadTime: number, creditsUsed: number, success: boolean): void {
const fileType = this.getFileType(file.name)
const fileSize = file.size
// Update storage metrics
this.metrics.totalStorageUsed += fileSize
this.metrics.storageByType[fileType] = (this.metrics.storageByType[fileType] || 0) + fileSize
this.metrics.totalFiles++
this.metrics.filesByType[fileType] = (this.metrics.filesByType[fileType] || 0) + 1
// Update performance metrics
if (success) {
this.uploadSuccesses++
this.uploadTimes.push(uploadTime)
this.metrics.averageUploadTime = this.calculateMovingAverage(this.uploadTimes)
this.metrics.totalCreditsUsed += creditsUsed
this.metrics.creditsPerMb = this.metrics.totalCreditsUsed / (this.metrics.totalStorageUsed / (1024 * 1024))
} else {
this.uploadFailures++
}
this.metrics.uploadSuccessRate = this.uploadSuccesses / (this.uploadSuccesses + this.uploadFailures)
this.updateAverageFileSize()
this.calculateStorageGrowth()
}
recordFileDownload(fileId: string, downloadTime: number, success: boolean): void {
if (success) {
this.downloadSuccesses++
this.downloadTimes.push(downloadTime)
this.metrics.averageDownloadTime = this.calculateMovingAverage(this.downloadTimes)
} else {
this.downloadFailures++
}
this.metrics.downloadSuccessRate = this.downloadSuccesses / (this.downloadSuccesses + this.downloadFailures)
}
recordChunkHealth(chunks: { id: string; healthy: boolean; replicas: number }[]): void {
let totalReplicas = 0
let lostChunks = 0
let recoveredChunks = 0
chunks.forEach(chunk => {
totalReplicas += chunk.replicas
if (!chunk.healthy) {
lostChunks++
// Check if chunk was recovered
if (chunk.replicas > 0) {
recoveredChunks++
}
}
})
this.metrics.replicationFactor = totalReplicas / chunks.length
this.metrics.chunksLost += lostChunks
this.metrics.chunksRecovered += recoveredChunks
}
private getFileType(filename: string): string {
const extension = filename.split('.').pop()?.toLowerCase() || 'unknown'
const types: Record<string, string> = {
'jpg': 'image', 'jpeg': 'image', 'png': 'image', 'gif': 'image', 'webp': 'image',
'mp4': 'video', 'avi': 'video', 'mov': 'video', 'mkv': 'video',
'mp3': 'audio', 'wav': 'audio', 'flac': 'audio',
'pdf': 'document', 'doc': 'document', 'docx': 'document', 'txt': 'document',
'zip': 'archive', 'rar': 'archive', '7z': 'archive'
}
return types[extension] || 'other'
}
private calculateMovingAverage(times: number[]): number {
if (times.length === 0) return 0
const recent = times.slice(-100) // Last 100 samples
return recent.reduce((sum, time) => sum + time, 0) / recent.length
}
private updateAverageFileSize(): void {
if (this.metrics.totalFiles === 0) return
this.metrics.averageFileSize = this.metrics.totalStorageUsed / this.metrics.totalFiles
}
private calculateStorageGrowth(): void {
// Store daily usage and calculate growth rate
const today = new Date().toISOString().split('T')[0]
const dailyUsage = JSON.parse(localStorage.getItem('daily_usage') || '{}')
if (!dailyUsage[today]) {
dailyUsage[today] = this.metrics.totalStorageUsed
localStorage.setItem('daily_usage', JSON.stringify(dailyUsage))
// Calculate growth rate based on last 7 days
const last7Days = Object.values(dailyUsage).slice(-7) as number[]
if (last7Days.length >= 2) {
const oldest = last7Days[0]
const newest = last7Days[last7Days.length - 1]
this.metrics.storageGrowthRate = (newest - oldest) / (last7Days.length - 1) // bytes per day
}
}
}
getMetrics(): StorageMetrics {
return { ...this.metrics }
}
generateReport(): StorageReport {
const metrics = this.getMetrics()
return {
timestamp: Date.now(),
metrics,
recommendations: this.generateRecommendations(metrics),
healthScore: this.calculateHealthScore(metrics)
}
}
private generateRecommendations(metrics: StorageMetrics): string[] {
const recommendations: string[] = []
if (metrics.uploadSuccessRate < 0.95) {
recommendations.push('Upload success rate is low. Check network connectivity and chunk size settings.')
}
if (metrics.averageUploadTime > 10000) { // 10 seconds
recommendations.push('Upload performance is slow. Consider reducing chunk size or increasing parallel uploads.')
}
if (metrics.replicationFactor < 3) {
recommendations.push('Replication factor below optimal level (3). Some chunks may be at risk.')
}
if (metrics.creditsPerMb > 100) { // Arbitrary threshold
recommendations.push('Storage costs are high. Consider compressing files before upload.')
}
return recommendations
}
private calculateHealthScore(metrics: StorageMetrics): number {
let score = 100
// Deduct points for poor performance
if (metrics.uploadSuccessRate < 0.95) score -= 20
if (metrics.downloadSuccessRate < 0.95) score -= 20
if (metrics.averageUploadTime > 10000) score -= 10
if (metrics.replicationFactor < 3) score -= 15
if (metrics.chunksLost > 0) score -= 5 * metrics.chunksLost
// Cap at 0
return Math.max(0, Math.min(100, score))
}
}
interface StorageReport {
timestamp: number
metrics: StorageMetrics
recommendations: string[]
healthScore: number // 0-100
}
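A minimal usage sketch for the monitor above. The `uploadToEvoBin` helper and the shape of its result are assumptions; substitute your actual upload path and the credits figure it reports.

// Hypothetical wrapper that feeds the monitor from an upload call
declare function uploadToEvoBin(file: File): Promise<{ creditsUsed: number }> // placeholder upload API

const storageMonitor = new StorageMonitor()

async function monitoredUpload(file: File): Promise<void> {
  const started = Date.now()
  try {
    const result = await uploadToEvoBin(file)
    storageMonitor.recordFileUpload(file, Date.now() - started, result.creditsUsed, true)
  } catch (error) {
    storageMonitor.recordFileUpload(file, Date.now() - started, 0, false)
    throw error
  }
}

// Surface the health report periodically, e.g. once per hour
setInterval(() => {
  const report = storageMonitor.generateReport()
  console.log('Storage health score:', report.healthScore)
  report.recommendations.forEach(r => console.log('Recommendation:', r))
}, 60 * 60 * 1000)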

Blockchain Metrics​

interface BlockchainMetrics {
// Network metrics
blockHeight: number
blockTime: number // seconds
networkHashrate: number
activeMasternodes: number
// Transaction metrics
transactionCount: number
averageFee: number // credits
confirmationTime: number // seconds
// Identity metrics
totalIdentities: number
activeIdentities: number
identityGrowthRate: number // per day
// Contract metrics
totalContracts: number
contractsByType: Record<string, number>
documentCount: number
documentsPerContract: number // average
// Error metrics
failedTransactions: number
validationErrors: number
networkErrors: number
}
class BlockchainMonitor {
private sdk: any
private metrics: BlockchainMetrics
private lastBlockHeight = 0
private blockTimes: number[] = []
constructor(sdk: any) {
this.sdk = sdk
this.metrics = this.initializeMetrics()
this.startMonitoring()
}
private initializeMetrics(): BlockchainMetrics {
return {
blockHeight: 0,
blockTime: 0,
networkHashrate: 0,
activeMasternodes: 0,
transactionCount: 0,
averageFee: 0,
confirmationTime: 0,
totalIdentities: 0,
activeIdentities: 0,
identityGrowthRate: 0,
totalContracts: 0,
contractsByType: {},
documentCount: 0,
documentsPerContract: 0,
failedTransactions: 0,
validationErrors: 0,
networkErrors: 0
}
}
private async startMonitoring(): Promise<void> {
// Update metrics every minute
setInterval(async () => {
try {
await this.updateMetrics()
} catch (error) {
console.error('Failed to update blockchain metrics:', error)
}
}, 60000)
// Initial update
await this.updateMetrics()
}
private async updateMetrics(): Promise<void> {
try {
const [blockchainInfo, networkInfo] = await Promise.all([
this.sdk.getBlockchainInfo(),
this.sdk.getNetworkInfo()
])
// Update block metrics
this.metrics.blockHeight = blockchainInfo.blocks
if (this.lastBlockHeight > 0) {
const newBlocks = blockchainInfo.blocks - this.lastBlockHeight
if (newBlocks > 0) {
// Estimate block time (simplified): seconds per block over the 1-minute polling window
this.metrics.blockTime = 60 / newBlocks
this.blockTimes.push(this.metrics.blockTime)
// Keep only last 100 measurements
if (this.blockTimes.length > 100) {
this.blockTimes.shift()
}
}
}
this.lastBlockHeight = blockchainInfo.blocks
// Update network metrics
this.metrics.networkHashrate = networkInfo.hashrate || 0
this.metrics.activeMasternodes = networkInfo.masternodes || 0
// Update transaction metrics (this would be more complex in reality)
const mempoolInfo = await this.sdk.getMempoolInfo().catch(() => ({ size: 0 }))
this.metrics.transactionCount = mempoolInfo.size || 0
// Update other metrics periodically (less frequently)
await this.updateExtendedMetrics()
} catch (error) {
console.error('Error updating blockchain metrics:', error)
this.metrics.networkErrors++
}
}
private async updateExtendedMetrics(): Promise<void> {
try {
// These would be more expensive calls, so do them less frequently
if (Date.now() % 300000 < 60000) { // Every 5 minutes, for 1 minute
// Update identity metrics
// Note: This is simplified - actual implementation would need proper queries
this.metrics.totalIdentities = await this.estimateTotalIdentities()
this.metrics.activeIdentities = await this.estimateActiveIdentities()
// Update contract metrics
this.metrics.totalContracts = await this.estimateTotalContracts()
this.metrics.documentCount = await this.estimateTotalDocuments()
this.metrics.documentsPerContract = this.metrics.totalContracts > 0
? this.metrics.documentCount / this.metrics.totalContracts
: 0
}
} catch (error) {
console.error('Error updating extended metrics:', error)
this.metrics.networkErrors++
}
}
private async estimateTotalIdentities(): Promise<number> {
// Simplified estimation - in reality you'd query the blockchain
return 1000 // Placeholder
}
private async estimateActiveIdentities(): Promise<number> {
// Simplified estimation
return 500 // Placeholder
}
private async estimateTotalContracts(): Promise<number> {
return 100 // Placeholder
}
private async estimateTotalDocuments(): Promise<number> {
return 5000 // Placeholder
}
recordTransaction(fee: number, success: boolean, confirmationTime?: number): void {
if (success) {
this.metrics.transactionCount++
if (confirmationTime) {
this.metrics.confirmationTime = confirmationTime
}
// Update average fee
const oldTotal = this.metrics.averageFee * (this.metrics.transactionCount - 1)
this.metrics.averageFee = (oldTotal + fee) / this.metrics.transactionCount
} else {
this.metrics.failedTransactions++
}
}
recordError(errorType: 'validation' | 'network'): void {
if (errorType === 'validation') {
this.metrics.validationErrors++
} else {
this.metrics.networkErrors++
}
}
getMetrics(): BlockchainMetrics {
return { ...this.metrics }
}
getHealthStatus(): 'healthy' | 'degraded' | 'unhealthy' {
const errors = this.metrics.networkErrors + this.metrics.validationErrors
const errorRate = errors / Math.max(this.metrics.transactionCount, 1)
if (errorRate > 0.1) return 'unhealthy'
if (errorRate > 0.05) return 'degraded'
return 'healthy'
}
}
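A short usage sketch, assuming `sdk` is an already-connected Dash Platform SDK instance exposing the calls used above (`getBlockchainInfo`, `getNetworkInfo`, `getMempoolInfo`); the fee and confirmation values are illustrative.

// `sdk` is assumed to be your initialized Dash Platform SDK client
declare const sdk: any

const blockchainMonitor = new BlockchainMonitor(sdk)

// Record the outcome of a state transition after broadcasting it
blockchainMonitor.recordTransaction(1200 /* credits */, true, 4 /* seconds to confirm */)

// Record a failed validation separately so error rates stay meaningful
blockchainMonitor.recordError('validation')

// Combine the raw metrics with the derived health status
console.log(blockchainMonitor.getHealthStatus(), blockchainMonitor.getMetrics())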

User Activity Metrics​

interface UserActivity {
userId: string
timestamp: number
action: 'upload' | 'download' | 'share' | 'delete' | 'view'
fileId?: string
fileSize?: number // bytes
success: boolean
error?: string
duration?: number // ms
creditsUsed?: number
}
interface UserMetrics {
// Session metrics
totalSessions: number
averageSessionDuration: number // minutes
activeUsers: number // last 24h
returningUsers: number // percentage
// Activity metrics
uploadsPerUser: number // average
downloadsPerUser: number // average
storagePerUser: number // bytes, average
// Engagement metrics
averageFilesPerSession: number
mostActiveHour: number // 0-23
retentionRate: number // percentage
// Error metrics
errorsPerUser: number // average
mostCommonError: string
}
class UserActivityMonitor {
private activities: UserActivity[] = []
private sessions: Map<string, { start: number; end?: number }> = new Map()
private readonly MAX_ACTIVITIES = 10000
recordActivity(activity: UserActivity): void {
this.activities.push(activity)
// Keep only recent activities
if (this.activities.length > this.MAX_ACTIVITIES) {
this.activities = this.activities.slice(-this.MAX_ACTIVITIES)
}
// Update session
this.updateSession(activity.userId)
}
startSession(userId: string): void {
this.sessions.set(userId, {
start: Date.now(),
end: undefined
})
}
endSession(userId: string): void {
const session = this.sessions.get(userId)
if (session) {
session.end = Date.now()
}
}
private updateSession(userId: string): void {
if (!this.sessions.has(userId)) {
this.startSession(userId)
}
}
getMetrics(timeRange: number = 24 * 60 * 60 * 1000): UserMetrics {
const now = Date.now()
const recentActivities = this.activities.filter(a => now - a.timestamp <= timeRange)
const recentSessions = Array.from(this.sessions.entries())
.filter(([_, session]) => session.start >= now - timeRange)
const uniqueUsers = new Set(recentActivities.map(a => a.userId))
const activeHours = new Map<number, number>()
let totalUploads = 0
let totalDownloads = 0
let totalStorage = 0
let totalErrors = 0
const errorCounts = new Map<string, number>()
recentActivities.forEach(activity => {
// Count activities by hour
const hour = new Date(activity.timestamp).getHours()
activeHours.set(hour, (activeHours.get(hour) || 0) + 1)
// Aggregate metrics
if (activity.action === 'upload') {
totalUploads++
totalStorage += activity.fileSize || 0
} else if (activity.action === 'download') {
totalDownloads++
}
if (!activity.success && activity.error) {
totalErrors++
errorCounts.set(activity.error, (errorCounts.get(activity.error) || 0) + 1)
}
})
// Calculate session durations
const sessionDurations = recentSessions
.map(([_, session]) => session.end ? session.end - session.start : 0)
.filter(duration => duration > 0)
const averageSessionDuration = sessionDurations.length > 0
? sessionDurations.reduce((sum, duration) => sum + duration, 0) / sessionDurations.length / 60000
: 0
// Find most active hour
let mostActiveHour = 0
let maxActivities = 0
activeHours.forEach((count, hour) => {
if (count > maxActivities) {
maxActivities = count
mostActiveHour = hour
}
})
return {
totalSessions: recentSessions.length,
averageSessionDuration,
activeUsers: uniqueUsers.size,
returningUsers: this.calculateReturningUsers(recentActivities, timeRange),
uploadsPerUser: uniqueUsers.size > 0 ? totalUploads / uniqueUsers.size : 0,
downloadsPerUser: uniqueUsers.size > 0 ? totalDownloads / uniqueUsers.size : 0,
storagePerUser: uniqueUsers.size > 0 ? totalStorage / uniqueUsers.size : 0,
averageFilesPerSession: recentSessions.length > 0 ? totalUploads / recentSessions.length : 0,
mostActiveHour,
retentionRate: this.calculateRetentionRate(timeRange),
errorsPerUser: uniqueUsers.size > 0 ? totalErrors / uniqueUsers.size : 0,
mostCommonError: this.getMostCommonError(errorCounts)
}
}
private calculateReturningUsers(activities: UserActivity[], timeRange: number): number {
const userFirstSeen = new Map<string, number>()
const userLastSeen = new Map<string, number>()
activities.forEach(activity => {
if (!userFirstSeen.has(activity.userId)) {
userFirstSeen.set(activity.userId, activity.timestamp)
}
userLastSeen.set(activity.userId, activity.timestamp)
})
let returningUsers = 0
const sevenDays = 7 * 24 * 60 * 60 * 1000
userFirstSeen.forEach((firstSeen, userId) => {
const lastSeen = userLastSeen.get(userId) || firstSeen
if (lastSeen - firstSeen > sevenDays) {
returningUsers++
}
})
return userFirstSeen.size > 0 ? (returningUsers / userFirstSeen.size) * 100 : 0
}
private calculateRetentionRate(timeRange: number): number {
// Simplified retention calculation
const now = Date.now()
const userActivity = new Map<string, number[]>()
this.activities.forEach(activity => {
if (!userActivity.has(activity.userId)) {
userActivity.set(activity.userId, [])
}
userActivity.get(activity.userId)!.push(activity.timestamp)
})
let retainedUsers = 0
const userCount = userActivity.size
userActivity.forEach(timestamps => {
// Check if user was active in first and last third of time range
timestamps.sort((a, b) => a - b)
const firstActivity = timestamps[0]
const lastActivity = timestamps[timestamps.length - 1]
const timeFromStart = firstActivity - (now - timeRange)
const timeFromEnd = now - lastActivity
if (timeFromStart < timeRange / 3 && timeFromEnd < timeRange / 3) {
retainedUsers++
}
})
return userCount > 0 ? (retainedUsers / userCount) * 100 : 0
}
private getMostCommonError(errorCounts: Map<string, number>): string {
let mostCommon = 'none'
let maxCount = 0
errorCounts.forEach((count, error) => {
if (count > maxCount) {
maxCount = count
mostCommon = error
}
})
return mostCommon
}
generateHeatmapData(): { hour: number; day: number; count: number }[] {
const heatmap: Map<string, number> = new Map()
this.activities.forEach(activity => {
const date = new Date(activity.timestamp)
const hour = date.getHours()
const day = date.getDay() // 0 = Sunday, 1 = Monday, etc.
const key = `${day}-${hour}`
heatmap.set(key, (heatmap.get(key) || 0) + 1)
})
return Array.from(heatmap.entries()).map(([key, count]) => {
const [day, hour] = key.split('-').map(Number)
return { hour, day, count }
})
}
}
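A usage sketch for the activity monitor; the user and file identifiers are placeholders.

const userActivityMonitor = new UserActivityMonitor()

// Track one user's upload inside a session
userActivityMonitor.startSession('user-123')
userActivityMonitor.recordActivity({
  userId: 'user-123',
  timestamp: Date.now(),
  action: 'upload',
  fileId: 'file-abc',
  fileSize: 2 * 1024 * 1024, // 2 MB
  success: true,
  duration: 1850,
  creditsUsed: 40
})
userActivityMonitor.endSession('user-123')

// 24-hour rollup plus hour/day counts for a heatmap widget
const userMetrics = userActivityMonitor.getMetrics()
const heatmapData = userActivityMonitor.generateHeatmapData()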

Logging Strategy​

enum LogLevel {
DEBUG = 'DEBUG',
INFO = 'INFO',
WARN = 'WARN',
ERROR = 'ERROR',
FATAL = 'FATAL'
}
interface LogEntry {
timestamp: number
level: LogLevel
component: string
message: string
data?: any
userId?: string
sessionId?: string
traceId?: string
error?: Error
}
class Logger {
private static instance: Logger
private logLevel: LogLevel = LogLevel.INFO
private transports: LogTransport[] = []
private constructor() {}
static getInstance(): Logger {
if (!Logger.instance) {
Logger.instance = new Logger()
// Add default transports
Logger.instance.addTransport(new ConsoleTransport())
Logger.instance.addTransport(new LocalStorageTransport())
}
return Logger.instance
}
setLogLevel(level: LogLevel): void {
this.logLevel = level
}
addTransport(transport: LogTransport): void {
this.transports.push(transport)
}
private shouldLog(level: LogLevel): boolean {
const levels = [LogLevel.DEBUG, LogLevel.INFO, LogLevel.WARN, LogLevel.ERROR, LogLevel.FATAL]
const currentLevelIndex = levels.indexOf(this.logLevel)
const messageLevelIndex = levels.indexOf(level)
return messageLevelIndex >= currentLevelIndex
}
log(entry: Omit<LogEntry, 'timestamp'>): void {
if (!this.shouldLog(entry.level)) return
const fullEntry: LogEntry = {
timestamp: Date.now(),
...entry
}
// Send to all transports
this.transports.forEach(transport => {
try {
transport.log(fullEntry)
} catch (error) {
// Don't fail if a transport fails
console.error('Log transport failed:', error)
}
})
}
debug(component: string, message: string, data?: any): void {
this.log({
level: LogLevel.DEBUG,
component,
message,
data
})
}
info(component: string, message: string, data?: any): void {
this.log({
level: LogLevel.INFO,
component,
message,
data
})
}
warn(component: string, message: string, data?: any, error?: Error): void {
this.log({
level: LogLevel.WARN,
component,
message,
data,
error
})
}
error(component: string, message: string, data?: any, error?: Error): void {
this.log({
level: LogLevel.ERROR,
component,
message,
data,
error
})
}
fatal(component: string, message: string, data?: any, error?: Error): void {
this.log({
level: LogLevel.FATAL,
component,
message,
data,
error
})
// Fatal errors might need special handling
this.handleFatalError(component, message, error)
}
private handleFatalError(component: string, message: string, error?: Error): void {
// Send to error tracking service
this.sendToErrorTracking({
component,
message,
error: error?.toString(),
stack: error?.stack,
timestamp: Date.now()
})
// Alert administrators
this.sendAlert(`Fatal error in ${component}: ${message}`)
}
private sendToErrorTracking(errorData: any): void {
// Implement integration with error tracking service (Sentry, LogRocket, etc.)
if (window.Sentry) {
window.Sentry.captureException(new Error(errorData.message), {
extra: errorData
})
}
}
private sendAlert(message: string): void {
// Implement alerting (email, Slack, etc.)
console.error('ALERT:', message)
}
}
interface LogTransport {
log(entry: LogEntry): void
}
class ConsoleTransport implements LogTransport {
log(entry: LogEntry): void {
const formatted = this.formatEntry(entry)
switch (entry.level) {
case LogLevel.DEBUG:
console.debug(...formatted)
break
case LogLevel.INFO:
console.info(...formatted)
break
case LogLevel.WARN:
console.warn(...formatted)
break
case LogLevel.ERROR:
case LogLevel.FATAL:
console.error(...formatted)
break
}
}
private formatEntry(entry: LogEntry): any[] {
const timestamp = new Date(entry.timestamp).toISOString()
const prefix = `[${timestamp}] [${entry.level}] [${entry.component}]`
const args: any[] = [prefix, entry.message]
if (entry.data) {
args.push(entry.data)
}
if (entry.error) {
args.push('\nError:', entry.error)
if (entry.error.stack) {
args.push('\nStack:', entry.error.stack)
}
}
if (entry.userId) {
args.push(`(user: ${entry.userId})`)
}
if (entry.traceId) {
args.push(`(trace: ${entry.traceId})`)
}
return args
}
}
class LocalStorageTransport implements LogTransport {
private readonly MAX_LOGS = 1000
private readonly LOG_KEY = 'evobin_logs'
log(entry: LogEntry): void {
const logs = this.getLogs()
logs.push(entry)
// Keep only recent logs
if (logs.length > this.MAX_LOGS) {
logs.splice(0, logs.length - this.MAX_LOGS)
}
localStorage.setItem(this.LOG_KEY, JSON.stringify(logs))
}
getLogs(): LogEntry[] {
try {
const logs = localStorage.getItem(this.LOG_KEY)
return logs ? JSON.parse(logs) : []
} catch {
return []
}
}
clearLogs(): void {
localStorage.removeItem(this.LOG_KEY)
}
getErrors(): LogEntry[] {
return this.getLogs().filter(log =>
log.level === LogLevel.ERROR || log.level === LogLevel.FATAL
)
}
exportLogs(): string {
return JSON.stringify(this.getLogs(), null, 2)
}
}
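The usage example below also registers a `RemoteLogTransport`, which is not defined above. A minimal sketch, assuming the endpoint accepts a JSON array of log entries via POST; the batch size and fire-and-forget error handling are choices, not requirements.

class RemoteLogTransport implements LogTransport {
  private buffer: LogEntry[] = []
  private readonly BATCH_SIZE = 20

  constructor(private endpoint: string) {}

  log(entry: LogEntry): void {
    this.buffer.push(entry)
    if (this.buffer.length >= this.BATCH_SIZE) {
      this.flush()
    }
  }

  private flush(): void {
    const batch = this.buffer.splice(0, this.buffer.length)
    fetch(this.endpoint, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(batch)
    }).catch(() => {
      // Best effort: drop the batch rather than let logging break the app
    })
  }
}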
// Usage example
const logger = Logger.getInstance()
// Configure logging level based on environment
if (process.env.NODE_ENV === 'development') {
logger.setLogLevel(LogLevel.DEBUG)
} else {
logger.setLogLevel(LogLevel.INFO)
}
// Add remote logging for production
if (process.env.NODE_ENV === 'production') {
logger.addTransport(new RemoteLogTransport('https://logs.evobin.com/api/logs'))
}
// Log examples
logger.info('FileUploader', 'Starting upload process', { fileSize: 1024 * 1024 })
logger.debug('EncryptionService', 'Generated encryption key', { algorithm: 'AES-256-GCM' })
logger.warn('NetworkManager', 'Slow network detected', { latency: 5000 })
logger.error('BlockchainService', 'Transaction failed', { txHash: 'abc123' }, new Error('Insufficient balance'))
// Structured logging helper
class EvoBinLogger {
private logger = Logger.getInstance()
private sessionId: string
private userId?: string
constructor(sessionId: string, userId?: string) {
this.sessionId = sessionId
this.userId = userId
}
logUploadStart(file: File): void {
this.logger.info('FileUpload', 'Upload started', {
sessionId: this.sessionId,
userId: this.userId,
fileName: file.name,
fileSize: file.size,
fileType: file.type
})
}
logUploadProgress(fileId: string, progress: number): void {
this.logger.debug('FileUpload', 'Upload progress', {
sessionId: this.sessionId,
userId: this.userId,
fileId,
progress
})
}
logUploadComplete(fileId: string, duration: number, chunks: number): void {
this.logger.info('FileUpload', 'Upload complete', {
sessionId: this.sessionId,
userId: this.userId,
fileId,
duration,
chunks,
timestamp: Date.now()
})
}
logUploadError(fileId: string, error: Error, retryCount: number): void {
this.logger.error('FileUpload', 'Upload failed', {
sessionId: this.sessionId,
userId: this.userId,
fileId,
error: error.message,
retryCount,
timestamp: Date.now()
}, error)
}
logTransaction(txHash: string, operation: string, credits: number): void {
this.logger.info('Blockchain', 'Transaction submitted', {
sessionId: this.sessionId,
userId: this.userId,
txHash,
operation,
credits,
timestamp: Date.now()
})
}
}
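A brief usage sketch for the structured helper; identifiers are placeholders and `selectedFile` would come from e.g. a file input.

declare const selectedFile: File // e.g. from an <input type="file"> element

const evoBinLogger = new EvoBinLogger('session-42', 'user-123')

evoBinLogger.logUploadStart(selectedFile)
evoBinLogger.logUploadProgress('file-abc', 0.5)
evoBinLogger.logUploadComplete('file-abc', 2300 /* ms */, 8 /* chunks */)
evoBinLogger.logTransaction('tx-789', 'document.create', 1200 /* credits */)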

Alerting System​

enum AlertSeverity {
INFO = 'info',
WARNING = 'warning',
ERROR = 'error',
CRITICAL = 'critical'
}
interface AlertRule {
name: string
description: string
severity: AlertSeverity
condition: (metrics: any) => boolean
cooldown: number // milliseconds
actions: AlertAction[]
lastTriggered?: number
}
interface Alert {
id: string
ruleId: string
severity: AlertSeverity
message: string
timestamp: number
data: any
}
interface AlertAction {
type: 'email' | 'slack' | 'webhook' | 'console'
config: any
}
class AlertManager {
private rules: AlertRule[] = []
private alerts: Alert[] = []
private readonly MAX_ALERTS = 1000
constructor() {
this.setupDefaultRules()
this.startMonitoring()
}
private setupDefaultRules(): void {
this.rules = [
{
name: 'High Upload Failure Rate',
description: 'More than 10% of uploads are failing',
severity: AlertSeverity.ERROR,
condition: (metrics) => metrics.uploadSuccessRate < 0.9,
cooldown: 5 * 60 * 1000, // 5 minutes
actions: [
{ type: 'slack', config: { channel: '#alerts' } },
{ type: 'email', config: { to: 'admin@evobin.com' } }
]
},
{
name: 'Low Replication Factor',
description: 'Average replication factor below 2',
severity: AlertSeverity.WARNING,
condition: (metrics) => metrics.replicationFactor < 2,
cooldown: 15 * 60 * 1000, // 15 minutes
actions: [
{ type: 'slack', config: { channel: '#alerts' } }
]
},
{
name: 'High Network Latency',
description: 'Average upload time exceeds 30 seconds',
severity: AlertSeverity.WARNING,
condition: (metrics) => metrics.averageUploadTime > 30000,
cooldown: 10 * 60 * 1000, // 10 minutes
actions: [
{ type: 'console', config: {} }
]
},
{
name: 'Low Credit Balance',
description: 'Average user credits below threshold',
severity: AlertSeverity.INFO,
condition: (metrics) => metrics.averageCredits < 100,
cooldown: 60 * 60 * 1000, // 1 hour
actions: [
{ type: 'slack', config: { channel: '#notifications' } }
]
},
{
name: 'Chunk Loss Detected',
description: 'More than 5 chunks lost in 24 hours',
severity: AlertSeverity.CRITICAL,
condition: (metrics) => metrics.chunksLost > 5,
cooldown: 0, // Immediate
actions: [
{ type: 'slack', config: { channel: '#critical-alerts' } },
{ type: 'email', config: { to: 'ops@evobin.com' } },
{ type: 'webhook', config: { url: 'https://hooks.pagerduty.com/...' } }
]
}
]
}
evaluate(metrics: any): void {
const now = Date.now()
this.rules.forEach(rule => {
// Check cooldown
if (rule.lastTriggered && now - rule.lastTriggered < rule.cooldown) {
return
}
// Evaluate condition
if (rule.condition(metrics)) {
this.triggerAlert(rule, metrics)
rule.lastTriggered = now
}
})
}
private triggerAlert(rule: AlertRule, metrics: any): void {
const alert: Alert = {
id: this.generateAlertId(),
ruleId: rule.name,
severity: rule.severity,
message: `${rule.name}: ${rule.description}`,
timestamp: Date.now(),
data: {
rule,
metrics,
context: this.getAlertContext()
}
}
this.alerts.push(alert)
// Keep only recent alerts
if (this.alerts.length > this.MAX_ALERTS) {
this.alerts = this.alerts.slice(-this.MAX_ALERTS)
}
// Execute actions
this.executeActions(rule, alert)
// Log the alert
const logger = Logger.getInstance()
logger.warn('AlertManager', `Alert triggered: ${rule.name}`, {
ruleId: rule.name,
severity: rule.severity,
metrics
})
}
private getAlertContext(): any {
return {
userAgent: navigator.userAgent,
timestamp: Date.now(),
url: window.location.href,
platform: 'web', // or 'mobile', 'desktop', etc.
version: process.env.VERSION || 'unknown'
}
}
private generateAlertId(): string {
return `alert_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`
}
private async executeActions(rule: AlertRule, alert: Alert): Promise<void> {
for (const action of rule.actions) {
try {
switch (action.type) {
case 'console':
console[alert.severity === AlertSeverity.CRITICAL ? 'error' : 'warn'](
`[${alert.severity.toUpperCase()}] ${alert.message}`,
alert.data
)
break
case 'slack':
await this.sendSlackAlert(action.config, alert)
break
case 'email':
await this.sendEmailAlert(action.config, alert)
break
case 'webhook':
await this.sendWebhookAlert(action.config, alert)
break
}
} catch (error) {
console.error(`Failed to execute alert action ${action.type}:`, error)
}
}
}
private async sendSlackAlert(config: any, alert: Alert): Promise<void> {
const message = {
channel: config.channel,
text: `:warning: *${alert.severity.toUpperCase()} Alert*`,
blocks: [
{
type: 'section',
text: {
type: 'mrkdwn',
text: `*${alert.message}*\n\nSeverity: ${alert.severity}\nTime: ${new Date(alert.timestamp).toISOString()}`
}
},
{
type: 'section',
fields: [
{
type: 'mrkdwn',
text: `*Rule:*\n${alert.data.rule.name}`
},
{
type: 'mrkdwn',
text: `*ID:*\n${alert.id}`
}
]
}
]
}
// In production, you would make an actual API call
// await fetch(config.webhookUrl, {
// method: 'POST',
// body: JSON.stringify(message)
// })
console.log('Sending Slack alert:', message)
}
private async sendEmailAlert(config: any, alert: Alert): Promise<void> {
const subject = `[${alert.severity.toUpperCase()}] EvoBin Alert: ${alert.data.rule.name}`
const body = `
Alert Details:
-------------
Message: ${alert.message}
Severity: ${alert.severity}
Time: ${new Date(alert.timestamp).toISOString()}
Rule: ${alert.data.rule.name}
Alert ID: ${alert.id}
Metrics:
${JSON.stringify(alert.data.metrics, null, 2)}
Context:
${JSON.stringify(alert.data.context, null, 2)}
`
console.log('Sending email alert:', { to: config.to, subject, body })
// In production, integrate with your email service
}
private async sendWebhookAlert(config: any, alert: Alert): Promise<void> {
const payload = {
alert,
timestamp: new Date().toISOString(),
source: 'evobin-monitoring'
}
// In production:
// await fetch(config.url, {
// method: 'POST',
// headers: { 'Content-Type': 'application/json' },
// body: JSON.stringify(payload)
// })
console.log('Sending webhook alert:', config.url, payload)
}
getAlerts(severity?: AlertSeverity, startTime?: number, endTime?: number): Alert[] {
let filtered = this.alerts
if (severity) {
filtered = filtered.filter(alert => alert.severity === severity)
}
if (startTime) {
filtered = filtered.filter(alert => alert.timestamp >= startTime)
}
if (endTime) {
filtered = filtered.filter(alert => alert.timestamp <= endTime)
}
return filtered.sort((a, b) => b.timestamp - a.timestamp) // Most recent first
}
getAlertSummary(): AlertSummary {
const now = Date.now()
const last24h = now - (24 * 60 * 60 * 1000)
const lastWeek = now - (7 * 24 * 60 * 60 * 1000)
const alerts24h = this.getAlerts(undefined, last24h)
const alertsWeek = this.getAlerts(undefined, lastWeek)
const bySeverity = {
critical: alerts24h.filter(a => a.severity === AlertSeverity.CRITICAL).length,
error: alerts24h.filter(a => a.severity === AlertSeverity.ERROR).length,
warning: alerts24h.filter(a => a.severity === AlertSeverity.WARNING).length,
info: alerts24h.filter(a => a.severity === AlertSeverity.INFO).length
}
const byRule = new Map<string, number>()
alerts24h.forEach(alert => {
byRule.set(alert.ruleId, (byRule.get(alert.ruleId) || 0) + 1)
})
return {
total24h: alerts24h.length,
totalWeek: alertsWeek.length,
bySeverity,
byRule: Array.from(byRule.entries()).map(([ruleId, count]) => ({
ruleId,
count
})),
mostCommonRule: byRule.size > 0
? Array.from(byRule.entries()).sort((a, b) => b[1] - a[1])[0]
: null
}
}
}
interface AlertSummary {
total24h: number
totalWeek: number
bySeverity: Record<string, number>
byRule: Array<{ ruleId: string; count: number }>
mostCommonRule: [string, number] | null
}
// Usage example
const alertManager = new AlertManager()
// Periodically evaluate metrics
setInterval(() => {
const storageMetrics = storageMonitor.getMetrics()
const blockchainMetrics = blockchainMonitor.getMetrics()
const userMetrics = userActivityMonitor.getMetrics()
const allMetrics = {
...storageMetrics,
...blockchainMetrics,
...userMetrics
}
alertManager.evaluate(allMetrics)
}, 60000) // Evaluate every minute
// Get current alerts
const recentAlerts = alertManager.getAlerts(AlertSeverity.CRITICAL)
const alertSummary = alertManager.getAlertSummary()
console.log('Critical alerts:', recentAlerts.length)
console.log('Alert summary:', alertSummary)

Dashboard Implementation​

interface DashboardConfig {
refreshInterval: number // milliseconds
metrics: DashboardMetric[]
layout: DashboardLayout
theme: 'light' | 'dark'
}
interface DashboardMetric {
id: string
title: string
type: 'gauge' | 'line' | 'bar' | 'pie' | 'stat'
dataSource: () => Promise<any>
refreshRate: number // milliseconds
thresholds?: {
warning: number
critical: number
}
}
interface DashboardLayout {
columns: number
widgets: DashboardWidget[]
}
interface DashboardWidget {
id: string
metricId: string
position: { x: number; y: number; width: number; height: number }
options: any
}
class MonitoringDashboard {
private config: DashboardConfig
private metrics = new Map<string, any>()
private widgets = new Map<string, HTMLElement>()
private isInitialized = false
constructor(config: DashboardConfig) {
this.config = config
this.initializeDashboard()
}
private async initializeDashboard(): Promise<void> {
if (this.isInitialized) return
// Create dashboard container
const container = document.getElementById('dashboard') || this.createDashboardContainer()
// Initialize widgets
await this.initializeWidgets(container)
// Start auto-refresh
this.startAutoRefresh()
this.isInitialized = true
}
private createDashboardContainer(): HTMLElement {
const container = document.createElement('div')
container.id = 'dashboard'
container.style.cssText = `
display: grid;
grid-template-columns: repeat(${this.config.layout.columns}, 1fr);
gap: 1rem;
padding: 1rem;
background: ${this.config.theme === 'dark' ? '#1a1a1a' : '#f5f5f5'};
color: ${this.config.theme === 'dark' ? '#fff' : '#000'};
min-height: 100vh;
`
document.body.appendChild(container)
return container
}
private async initializeWidgets(container: HTMLElement): Promise<void> {
// Sort widgets by position for proper grid placement
const sortedWidgets = [...this.config.layout.widgets].sort((a, b) => {
if (a.position.y === b.position.y) {
return a.position.x - b.position.x
}
return a.position.y - b.position.y
})
for (const widgetConfig of sortedWidgets) {
const widget = await this.createWidget(widgetConfig)
if (widget) {
container.appendChild(widget)
this.widgets.set(widgetConfig.id, widget)
}
}
}
private async createWidget(widgetConfig: DashboardWidget): Promise<HTMLElement | null> {
const metricConfig = this.config.metrics.find(m => m.id === widgetConfig.metricId)
if (!metricConfig) return null
const widget = document.createElement('div')
widget.id = `widget-${widgetConfig.id}`
widget.style.cssText = `
grid-column: span ${widgetConfig.position.width};
grid-row: span ${widgetConfig.position.height};
background: ${this.config.theme === 'dark' ? '#2d2d2d' : '#fff'};
border-radius: 8px;
padding: 1rem;
box-shadow: 0 2px 4px rgba(0,0,0,0.1);
`
// Widget header
const header = document.createElement('div')
header.style.cssText = `
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 1rem;
padding-bottom: 0.5rem;
border-bottom: 1px solid ${this.config.theme === 'dark' ? '#444' : '#eee'};
`
const title = document.createElement('h3')
title.textContent = metricConfig.title
title.style.cssText = 'margin: 0; font-size: 1rem;'
const refreshBtn = document.createElement('button')
refreshBtn.textContent = '↻'
refreshBtn.style.cssText = `
background: none;
border: none;
cursor: pointer;
font-size: 1.2rem;
color: ${this.config.theme === 'dark' ? '#888' : '#666'};
padding: 0.25rem 0.5rem;
border-radius: 4px;
`
refreshBtn.onclick = () => this.refreshWidget(widgetConfig.id)
header.appendChild(title)
header.appendChild(refreshBtn)
widget.appendChild(header)
// Widget content
const content = document.createElement('div')
content.id = `content-${widgetConfig.id}`
widget.appendChild(content)
// Initial data load
await this.refreshWidget(widgetConfig.id)
// Auto-refresh for this widget
if (metricConfig.refreshRate > 0) {
setInterval(() => {
this.refreshWidget(widgetConfig.id)
}, metricConfig.refreshRate)
}
return widget
}
private async refreshWidget(widgetId: string): Promise<void> {
const widgetConfig = this.config.layout.widgets.find(w => w.id === widgetId)
if (!widgetConfig) return
const metricConfig = this.config.metrics.find(m => m.id === widgetConfig.metricId)
if (!metricConfig) return
try {
const data = await metricConfig.dataSource()
this.metrics.set(widgetConfig.metricId, data)
await this.updateWidgetContent(widgetId, data, metricConfig, widgetConfig.options)
} catch (error) {
console.error(`Failed to refresh widget ${widgetId}:`, error)
this.showError(widgetId, error)
}
}
private async updateWidgetContent(
widgetId: string,
data: any,
metricConfig: DashboardMetric,
options: any
): Promise<void> {
const content = document.getElementById(`content-${widgetId}`)
if (!content) return
content.innerHTML = ''
switch (metricConfig.type) {
case 'gauge':
this.renderGauge(content, data, metricConfig, options)
break
case 'line':
this.renderLineChart(content, data, metricConfig, options)
break
case 'bar':
this.renderBarChart(content, data, metricConfig, options)
break
case 'pie':
this.renderPieChart(content, data, metricConfig, options)
break
case 'stat':
this.renderStat(content, data, metricConfig, options)
break
}
// Apply threshold styling if applicable
if (metricConfig.thresholds && typeof data === 'number') {
this.applyThresholdStyling(content, data, metricConfig.thresholds)
}
}
private renderGauge(container: HTMLElement, value: number, config: DashboardMetric, options: any): void {
const size = 150
const strokeWidth = 15
const radius = (size - strokeWidth) / 2
const circumference = 2 * Math.PI * radius
const progress = Math.min(Math.max(value / 100, 0), 1)
const offset = circumference - progress * circumference
const svg = document.createElementNS('http://www.w3.org/2000/svg', 'svg')
svg.setAttribute('width', size.toString())
svg.setAttribute('height', size.toString())
svg.setAttribute('viewBox', `0 0 ${size} ${size}`)
// Background circle
const bgCircle = document.createElementNS('http://www.w3.org/2000/svg', 'circle')
bgCircle.setAttribute('cx', (size / 2).toString())
bgCircle.setAttribute('cy', (size / 2).toString())
bgCircle.setAttribute('r', radius.toString())
bgCircle.setAttribute('fill', 'none')
bgCircle.setAttribute('stroke', this.config.theme === 'dark' ? '#444' : '#eee')
bgCircle.setAttribute('stroke-width', strokeWidth.toString())
// Progress circle
const progressCircle = document.createElementNS('http://www.w3.org/2000/svg', 'circle')
progressCircle.setAttribute('cx', (size / 2).toString())
progressCircle.setAttribute('cy', (size / 2).toString())
progressCircle.setAttribute('r', radius.toString())
progressCircle.setAttribute('fill', 'none')
progressCircle.setAttribute('stroke', this.getProgressColor(value, config.thresholds))
progressCircle.setAttribute('stroke-width', strokeWidth.toString())
progressCircle.setAttribute('stroke-dasharray', circumference.toString())
progressCircle.setAttribute('stroke-dashoffset', offset.toString())
progressCircle.setAttribute('transform', `rotate(-90 ${size / 2} ${size / 2})`)
progressCircle.setAttribute('stroke-linecap', 'round')
// Value text
const text = document.createElementNS('http://www.w3.org/2000/svg', 'text')
text.setAttribute('x', (size / 2).toString())
text.setAttribute('y', (size / 2).toString())
text.setAttribute('text-anchor', 'middle')
text.setAttribute('dominant-baseline', 'middle')
text.setAttribute('fill', this.config.theme === 'dark' ? '#fff' : '#000')
text.setAttribute('font-size', '24')
text.setAttribute('font-weight', 'bold')
text.textContent = value.toFixed(1) + '%'
svg.appendChild(bgCircle)
svg.appendChild(progressCircle)
svg.appendChild(text)
container.appendChild(svg)
}
private getProgressColor(value: number, thresholds?: { warning: number; critical: number }): string {
if (!thresholds) return '#8b5cf6' // Default purple
if (value <= thresholds.critical) return '#ef4444' // Red: at or below the critical floor
if (value <= thresholds.warning) return '#f59e0b' // Yellow: at or below the warning floor
return '#10b981' // Green
}
private renderLineChart(container: HTMLElement, data: number[], config: DashboardMetric, options: any): void {
const canvas = document.createElement('canvas')
canvas.width = container.clientWidth || 300
canvas.height = 200
const ctx = canvas.getContext('2d')
if (!ctx) return
// Clear canvas
ctx.clearRect(0, 0, canvas.width, canvas.height)
if (data.length === 0) {
ctx.fillStyle = this.config.theme === 'dark' ? '#888' : '#666'
ctx.font = '14px sans-serif'
ctx.textAlign = 'center'
ctx.fillText('No data available', canvas.width / 2, canvas.height / 2)
container.appendChild(canvas)
return
}
// Chart configuration
const padding = 40
const chartWidth = canvas.width - 2 * padding
const chartHeight = canvas.height - 2 * padding
// Find min and max values
const maxValue = Math.max(...data)
const minValue = Math.min(...data)
const valueRange = maxValue - minValue || 1
// Draw grid
ctx.strokeStyle = this.config.theme === 'dark' ? '#444' : '#eee'
ctx.lineWidth = 1
// Horizontal grid lines
const gridLines = 5
for (let i = 0; i <= gridLines; i++) {
const y = padding + (chartHeight * i) / gridLines
ctx.beginPath()
ctx.moveTo(padding, y)
ctx.lineTo(canvas.width - padding, y)
ctx.stroke()
// Y-axis labels
const value = maxValue - (valueRange * i) / gridLines
ctx.fillStyle = this.config.theme === 'dark' ? '#888' : '#666'
ctx.font = '12px sans-serif'
ctx.textAlign = 'right'
ctx.fillText(value.toFixed(1), padding - 10, y + 4)
}
// Vertical grid lines
for (let i = 0; i < data.length; i++) {
const x = padding + (chartWidth * i) / (data.length - 1 || 1)
ctx.beginPath()
ctx.moveTo(x, padding)
ctx.lineTo(x, canvas.height - padding)
ctx.stroke()
}
// Draw line
ctx.strokeStyle = '#8b5cf6'
ctx.lineWidth = 2
ctx.beginPath()
data.forEach((value, index) => {
const x = padding + (chartWidth * index) / (data.length - 1 || 1)
const y = padding + chartHeight - ((value - minValue) / valueRange) * chartHeight
if (index === 0) {
ctx.moveTo(x, y)
} else {
ctx.lineTo(x, y)
}
})
ctx.stroke()
// Draw data points
data.forEach((value, index) => {
const x = padding + (chartWidth * index) / (data.length - 1 || 1)
const y = padding + chartHeight - ((value - minValue) / valueRange) * chartHeight
ctx.fillStyle = '#8b5cf6'
ctx.beginPath()
ctx.arc(x, y, 4, 0, Math.PI * 2)
ctx.fill()
})
// X-axis labels
ctx.fillStyle = this.config.theme === 'dark' ? '#888' : '#666'
ctx.font = '12px sans-serif'
ctx.textAlign = 'center'
data.forEach((_, index) => {
const x = padding + (chartWidth * index) / (data.length - 1 || 1)
const label = `-${data.length - 1 - index}m` // Minutes ago
ctx.fillText(label, x, canvas.height - padding + 20)
})
container.appendChild(canvas)
}
private renderBarChart(container: HTMLElement, data: Record<string, number>, config: DashboardMetric, options: any): void {
const canvas = document.createElement('canvas')
canvas.width = container.clientWidth || 300
canvas.height = 200
const ctx = canvas.getContext('2d')
if (!ctx) return
ctx.clearRect(0, 0, canvas.width, canvas.height)
if (Object.keys(data).length === 0) {
ctx.fillStyle = this.config.theme === 'dark' ? '#888' : '#666'
ctx.font = '14px sans-serif'
ctx.textAlign = 'center'
ctx.fillText('No data available', canvas.width / 2, canvas.height / 2)
container.appendChild(canvas)
return
}
const padding = 40
const chartWidth = canvas.width - 2 * padding
const chartHeight = canvas.height - 2 * padding
const barCount = Object.keys(data).length
const barWidth = chartWidth / (barCount * 1.5)
const gap = (chartWidth - barCount * barWidth) / (barCount + 1)
const values = Object.values(data)
const maxValue = Math.max(...values)
const colors = ['#8b5cf6', '#10b981', '#f59e0b', '#ef4444', '#3b82f6']
// Draw bars
Object.entries(data).forEach(([label, value], index) => {
const x = padding + gap + index * (barWidth + gap)
const barHeight = (value / maxValue) * chartHeight
const y = canvas.height - padding - barHeight
ctx.fillStyle = colors[index % colors.length]
ctx.fillRect(x, y, barWidth, barHeight)
// Value label
ctx.fillStyle = this.config.theme === 'dark' ? '#fff' : '#000'
ctx.font = '12px sans-serif'
ctx.textAlign = 'center'
ctx.fillText(value.toString(), x + barWidth / 2, y - 5)
// X-axis label
ctx.fillStyle = this.config.theme === 'dark' ? '#888' : '#666'
ctx.fillText(label, x + barWidth / 2, canvas.height - padding + 20)
})
// Y-axis
ctx.strokeStyle = this.config.theme === 'dark' ? '#444' : '#eee'
ctx.lineWidth = 1
ctx.beginPath()
ctx.moveTo(padding, padding)
ctx.lineTo(padding, canvas.height - padding)
ctx.lineTo(canvas.width - padding, canvas.height - padding)
ctx.stroke()
container.appendChild(canvas)
}
private renderPieChart(container: HTMLElement, data: Record<string, number>, config: DashboardMetric, options: any): void {
const canvas = document.createElement('canvas')
canvas.width = 200
canvas.height = 200
const ctx = canvas.getContext('2d')
if (!ctx) return
ctx.clearRect(0, 0, canvas.width, canvas.height)
if (Object.keys(data).length === 0) {
ctx.fillStyle = this.config.theme === 'dark' ? '#888' : '#666'
ctx.font = '14px sans-serif'
ctx.textAlign = 'center'
ctx.fillText('No data available', canvas.width / 2, canvas.height / 2)
container.appendChild(canvas)
return
}
const colors = ['#8b5cf6', '#10b981', '#f59e0b', '#ef4444', '#3b82f6']
const total = Object.values(data).reduce((sum, value) => sum + value, 0)
const centerX = canvas.width / 2
const centerY = canvas.height / 2
const radius = Math.min(centerX, centerY) - 20
let startAngle = 0
Object.entries(data).forEach(([label, value], index) => {
const sliceAngle = (2 * Math.PI * value) / total
// Draw slice
ctx.fillStyle = colors[index % colors.length]
ctx.beginPath()
ctx.moveTo(centerX, centerY)
ctx.arc(centerX, centerY, radius, startAngle, startAngle + sliceAngle)
ctx.closePath()
ctx.fill()
// Draw label
const angle = startAngle + sliceAngle / 2
const labelRadius = radius + 20
const labelX = centerX + Math.cos(angle) * labelRadius
const labelY = centerY + Math.sin(angle) * labelRadius
const percentage = ((value / total) * 100).toFixed(1)
ctx.fillStyle = this.config.theme === 'dark' ? '#fff' : '#000'
ctx.font = '12px sans-serif'
ctx.textAlign = 'center'
ctx.fillText(`${percentage}%`, labelX, labelY)
// Legend
const legendX = 10
const legendY = 10 + index * 20
ctx.fillStyle = colors[index % colors.length]
ctx.fillRect(legendX, legendY - 10, 15, 15)
ctx.fillStyle = this.config.theme === 'dark' ? '#fff' : '#000'
ctx.textAlign = 'left'
ctx.fillText(`${label}: ${value}`, legendX + 20, legendY)
startAngle += sliceAngle
})
container.appendChild(canvas)
}
private renderStat(container: HTMLElement, data: any, config: DashboardMetric, options: any): void {
const valueElement = document.createElement('div')
valueElement.style.cssText = `
font-size: 2.5rem;
font-weight: bold;
text-align: center;
margin: 1rem 0;
color: ${this.config.theme === 'dark' ? '#fff' : '#000'};
`
let displayValue: string
if (typeof data === 'number') {
// Format numbers with appropriate units
if (data >= 1000000) {
displayValue = (data / 1000000).toFixed(2) + 'M'
} else if (data >= 1000) {
displayValue = (data / 1000).toFixed(1) + 'K'
} else {
displayValue = data.toFixed(2)
}
} else if (typeof data === 'string') {
displayValue = data
} else {
displayValue = JSON.stringify(data)
}
valueElement.textContent = displayValue
if (options.unit) {
const unitElement = document.createElement('span')
unitElement.textContent = ` ${options.unit}`
unitElement.style.cssText = 'font-size: 1rem; opacity: 0.7;'
valueElement.appendChild(unitElement)
}
container.appendChild(valueElement)
// Optional description
if (options.description) {
const descElement = document.createElement('div')
descElement.textContent = options.description
descElement.style.cssText = `
text-align: center;
font-size: 0.875rem;
opacity: 0.7;
margin-top: 0.5rem;
`
container.appendChild(descElement)
}
// Optional trend indicator
if (typeof data === 'number' && options.previousValue !== undefined) {
const trend = data - options.previousValue
const trendElement = document.createElement('div')
trendElement.style.cssText = `
text-align: center;
font-size: 0.75rem;
margin-top: 0.5rem;
color: ${trend >= 0 ? '#10b981' : '#ef4444'};
`
const trendIcon = trend >= 0 ? '↗' : '↘'
trendElement.textContent = `${trendIcon} ${Math.abs(trend).toFixed(2)}`
container.appendChild(trendElement)
}
}
private applyThresholdStyling(container: HTMLElement, value: number, thresholds: { warning: number; critical: number }): void {
let color: string
if (value <= thresholds.critical) {
color = '#ef4444' // Red
} else if (value <= thresholds.warning) {
color = '#f59e0b' // Yellow
} else {
color = '#10b981' // Green
}
const valueElement = container.querySelector<HTMLElement>('div:first-child')
if (valueElement) {
valueElement.style.color = color
}
}
private showError(widgetId: string, error: Error): void {
const content = document.getElementById(`content-${widgetId}`)
if (!content) return
content.innerHTML = ''
const errorElement = document.createElement('div')
errorElement.style.cssText = `
color: #ef4444;
text-align: center;
padding: 1rem;
font-size: 0.875rem;
`
errorElement.textContent = `Error: ${error.message}`
content.appendChild(errorElement)
}
private startAutoRefresh(): void {
if (this.config.refreshInterval > 0) {
setInterval(() => {
this.refreshAllWidgets()
}, this.config.refreshInterval)
}
}
private async refreshAllWidgets(): Promise<void> {
const promises = Array.from(this.widgets.keys()).map(widgetId =>
this.refreshWidget(widgetId).catch(error => {
console.error(`Failed to refresh widget ${widgetId}:`, error)
})
)
await Promise.all(promises)
}
getWidgetData(widgetId: string): any {
const widgetConfig = this.config.layout.widgets.find(w => w.id === widgetId)
if (!widgetConfig) return null
return this.metrics.get(widgetConfig.metricId)
}
exportDashboard(): DashboardConfig {
return { ...this.config }
}
updateLayout(newLayout: DashboardLayout): void {
this.config.layout = newLayout
this.rebuildDashboard()
}
private rebuildDashboard(): void {
const container = document.getElementById('dashboard')
if (container) {
container.innerHTML = ''
this.widgets.clear()
this.initializeWidgets(container)
}
}
}
// Example dashboard configuration
const dashboardConfig: DashboardConfig = {
refreshInterval: 30000, // 30 seconds
theme: 'dark',
metrics: [
{
id: 'upload-success-rate',
title: 'Upload Success Rate',
type: 'gauge',
dataSource: async () => {
const metrics = storageMonitor.getMetrics()
return metrics.uploadSuccessRate * 100 // Convert to percentage
},
refreshRate: 10000, // 10 seconds
thresholds: {
warning: 90,
critical: 80
}
},
{
id: 'storage-growth',
title: 'Storage Growth (Last 7 Days)',
type: 'line',
dataSource: async () => {
// Return array of daily storage usage
const dailyUsage = JSON.parse(localStorage.getItem('daily_usage') || '{}')
return Object.values(dailyUsage).slice(-7) as number[]
},
refreshRate: 60000 // 1 minute
},
{
id: 'file-types',
title: 'Files by Type',
type: 'pie',
dataSource: async () => {
const metrics = storageMonitor.getMetrics()
return metrics.storageByType
},
refreshRate: 30000 // 30 seconds
},
{
id: 'active-users',
title: 'Active Users (24h)',
type: 'stat',
dataSource: async () => {
const metrics = userActivityMonitor.getMetrics()
return metrics.activeUsers
},
refreshRate: 60000 // 1 minute
},
{
id: 'network-health',
title: 'Network Health',
type: 'gauge',
dataSource: async () => {
const status = blockchainMonitor.getHealthStatus()
switch (status) {
case 'healthy': return 100
case 'degraded': return 60
case 'unhealthy': return 20
default: return 0
}
},
refreshRate: 15000, // 15 seconds
thresholds: {
warning: 60,
critical: 20
}
},
{
id: 'error-rate',
title: 'Error Rate (Last Hour)',
type: 'bar',
dataSource: async () => {
// Simulated error data by type
return {
'Network': 5,
'Validation': 2,
'Authentication': 1,
'Storage': 3,
'Other': 4
}
},
refreshRate: 60000 // 1 minute
}
],
layout: {
columns: 3,
widgets: [
{
id: 'upload-success-rate',
metricId: 'upload-success-rate',
position: { x: 0, y: 0, width: 1, height: 1 },
options: {}
},
{
id: 'storage-growth',
metricId: 'storage-growth',
position: { x: 1, y: 0, width: 2, height: 1 },
options: {}
},
{
id: 'file-types',
metricId: 'file-types',
position: { x: 0, y: 1, width: 1, height: 1 },
options: {}
},
{
id: 'active-users',
metricId: 'active-users',
position: { x: 1, y: 1, width: 1, height: 1 },
options: { unit: 'users' }
},
{
id: 'network-health',
metricId: 'network-health',
position: { x: 2, y: 1, width: 1, height: 1 },
options: {}
},
{
id: 'error-rate',
metricId: 'error-rate',
position: { x: 0, y: 2, width: 3, height: 1 },
options: {}
}
]
}
}
// Initialize dashboard
const dashboard = new MonitoringDashboard(dashboardConfig)
// Export dashboard data for external analysis
function exportDashboardData(): DashboardExport {
const metrics: Record<string, any> = {}
dashboardConfig.metrics.forEach(metric => {
const data = dashboard.getWidgetData(
dashboardConfig.layout.widgets.find(w => w.metricId === metric.id)?.id || ''
)
metrics[metric.id] = data
})
const logs = new LocalStorageTransport().getLogs() // reads the shared localStorage log buffer
const alerts = alertManager.getAlerts()
const userMetrics = userActivityMonitor.getMetrics()
const storageMetrics = storageMonitor.getMetrics()
const blockchainMetrics = blockchainMonitor.getMetrics()
return {
timestamp: Date.now(),
metrics,
logs: logs.slice(-1000), // Last 1000 logs
alerts: alerts.filter(a => a.timestamp > Date.now() - 24 * 60 * 60 * 1000), // Last 24h
userMetrics,
storageMetrics,
blockchainMetrics,
healthScore: storageMonitor.generateReport().healthScore
}
}
interface DashboardExport {
timestamp: number
metrics: Record<string, any>
logs: LogEntry[]
alerts: Alert[]
userMetrics: UserMetrics
storageMetrics: StorageMetrics
blockchainMetrics: BlockchainMetrics
healthScore: number
}
// Schedule automatic export (e.g., daily)
setInterval(() => {
const exportData = exportDashboardData()
// Send to analytics endpoint
fetch('/api/analytics/dashboard', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(exportData)
}).catch(error => {
console.error('Failed to export dashboard data:', error)
})
}, 24 * 60 * 60 * 1000) // Daily

Self-Hosted Options​

# docker-compose.yml for monitoring stack
version: '3.8'

services:
  # Metrics collection
  prometheus:
    image: prom/prometheus:latest
    volumes:
      - ./prometheus.yml:/etc/prometheus/prometheus.yml
      - prometheus_data:/prometheus
    command:
      - '--config.file=/etc/prometheus/prometheus.yml'
      - '--storage.tsdb.path=/prometheus'
    ports:
      - "9090:9090"
    networks:
      - monitoring

  # Visualization
  grafana:
    image: grafana/grafana:latest
    volumes:
      - grafana_data:/var/lib/grafana
    environment:
      - GF_SECURITY_ADMIN_PASSWORD=admin
    ports:
      - "3000:3000"
    networks:
      - monitoring

  # Log aggregation
  elasticsearch:
    image: elasticsearch:8.10.0
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
    volumes:
      - elasticsearch_data:/usr/share/elasticsearch/data
    ports:
      - "9200:9200"
    networks:
      - monitoring

  logstash:
    image: logstash:8.10.0
    volumes:
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    ports:
      - "5000:5000"
    networks:
      - monitoring

  kibana:
    image: kibana:8.10.0
    ports:
      - "5601:5601"
    networks:
      - monitoring

  # Alerting
  alertmanager:
    image: prom/alertmanager:latest
    volumes:
      - ./alertmanager.yml:/etc/alertmanager/alertmanager.yml
    ports:
      - "9093:9093"
    networks:
      - monitoring

volumes:
  prometheus_data:
  grafana_data:
  elasticsearch_data:

networks:
  monitoring:
    driver: bridge

Prometheus Configuration​

# prometheus.yml
global:
  scrape_interval: 15s
  evaluation_interval: 15s

rule_files:
  - "alerts.yml"

alerting:
  alertmanagers:
    - static_configs:
        - targets: ['alertmanager:9093']

scrape_configs:
  - job_name: 'evobin-web'
    static_configs:
      - targets: ['localhost:3000']
    metrics_path: '/api/metrics'

  - job_name: 'evobin-api'
    static_configs:
      - targets: ['api.evobin.com:8080']
    metrics_path: '/metrics'

  - job_name: 'dash-platform'
    static_configs:
      - targets: ['api.dash.org:8080']
    metrics_path: '/metrics'

Alert Rules​

# alerts.yml
groups:
  - name: evobin-alerts
    rules:
      - alert: HighUploadFailureRate
        expr: rate(evobin_upload_failures_total[5m]) / rate(evobin_upload_attempts_total[5m]) > 0.1
        for: 5m
        labels:
          severity: warning
        annotations:
          summary: "High upload failure rate detected"
          description: "Upload failure rate is {{ $value | humanizePercentage }} (threshold: 10%)"

      - alert: LowReplicationFactor
        expr: evobin_chunk_replication_factor < 2
        for: 10m
        labels:
          severity: critical
        annotations:
          summary: "Low chunk replication factor"
          description: "Average replication factor is {{ $value }} (threshold: 2)"

      - alert: SlowUploadPerformance
        expr: histogram_quantile(0.95, rate(evobin_upload_duration_seconds_bucket[5m])) > 30
        for: 5m
        labels:
          severity: warning
        annotations:
          summary: "Slow upload performance"
          description: "95th percentile upload duration is {{ $value | humanizeDuration }} (threshold: 30s)"

      - alert: HighErrorRate
        expr: rate(evobin_errors_total[5m]) > 10
        for: 2m
        labels:
          severity: warning
        annotations:
          summary: "High error rate detected"
          description: "{{ $value }} errors per minute (threshold: 10)"

Cloud-Based Options​

For smaller deployments or easier setup, consider these cloud services:

  1. Datadog - All-in-one monitoring
  2. New Relic - Application performance monitoring
  3. Sentry - Error tracking and performance monitoring
  4. LogRocket - Session replay and error tracking
  5. CloudWatch - AWS monitoring (if using AWS)
  6. Azure Monitor - Azure monitoring (if using Azure)

Integration with External Tools​

// docs/advanced/monitoring.md
class ExternalMonitoringIntegration {
// Sentry integration
static initSentry(dsn: string): void {
if (window.Sentry) {
window.Sentry.init({
dsn,
environment: process.env.NODE_ENV,
release: process.env.VERSION,
tracesSampleRate: 1.0,
integrations: [
new window.Sentry.BrowserTracing({
tracingOrigins: ['localhost', 'evobin.xyz'],
}),
new window.Sentry.Replay()
]
})
}
}
// Datadog integration
static initDatadog(clientToken: string, applicationId: string): void {
if (window.DD_RUM) {
window.DD_RUM.init({
applicationId,
clientToken,
site: 'datadoghq.com',
service: 'evobin-web',
env: process.env.NODE_ENV,
version: process.env.VERSION,
sampleRate: 100,
premiumSampleRate: 100,
trackInteractions: true,
defaultPrivacyLevel: 'mask-user-input'
})
window.DD_RUM.startSessionReplayRecording()
}
}
// Cloudflare Analytics
static trackToCloudflare(beaconToken: string, data: any): void {
fetch('https://cloudflare-analytics.com/api/v4/send', {
method: 'POST',
headers: {
'Authorization': `Bearer ${beaconToken}`,
'Content-Type': 'application/json'
},
body: JSON.stringify({
...data,
timestamp: Date.now(),
userAgent: navigator.userAgent,
url: window.location.href
})
}).catch(() => {
// Silently fail analytics
})
}
// Custom metrics endpoint
static async sendMetrics(endpoint: string, metrics: any): Promise<void> {
try {
await fetch(endpoint, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
timestamp: Date.now(),
metrics,
sessionId: localStorage.getItem('session_id'),
userId: localStorage.getItem('user_id')
})
})
} catch (error) {
console.error('Failed to send metrics:', error)
}
}
}
// Initialize external monitoring based on environment
if (process.env.NODE_ENV === 'production') {
// Sentry for error tracking
if (process.env.SENTRY_DSN) {
ExternalMonitoringIntegration.initSentry(process.env.SENTRY_DSN)
}
// Datadog for performance monitoring
if (process.env.DATADOG_CLIENT_TOKEN && process.env.DATADOG_APPLICATION_ID) {
ExternalMonitoringIntegration.initDatadog(
process.env.DATADOG_CLIENT_TOKEN,
process.env.DATADOG_APPLICATION_ID
)
}
// Send periodic metrics
setInterval(() => {
const metrics = {
storage: storageMonitor.getMetrics(),
blockchain: blockchainMonitor.getMetrics(),
user: userActivityMonitor.getMetrics(),
alerts: alertManager.getAlertSummary()
}
if (process.env.METRICS_ENDPOINT) {
ExternalMonitoringIntegration.sendMetrics(
process.env.METRICS_ENDPOINT,
metrics
)
}
}, 60000) // Every minute
}

Best Practices for Monitoring​

1. Log Everything Important​

// Good logging practices
logger.info('FileUpload', 'Upload started', {
fileId: 'abc123',
fileSize: 1024 * 1024,
chunkSize: 256 * 1024,
userId: 'user_123'
})
logger.error('BlockchainService', 'Transaction failed', {
txHash: '0x123...',
errorCode: 'INSUFFICIENT_BALANCE',
retryCount: 3
}, error)

2. Monitor Key Performance Indicators (KPIs)​

  • Upload success rate - Should be > 95%
  • Average upload time - Should be < 30 seconds for 100MB files
  • Chunk replication factor - Should be >= 3
  • Error rate - Should be < 1%
  • User retention - Should be increasing over time
  • Storage growth - Should match expectations
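
These targets can also be checked in code against the metrics the monitors already collect. A minimal sketch, assuming the storageMonitor instance used throughout this guide; the notify callback and the five-minute interval are illustrative.

// Sketch only: thresholds mirror the KPI targets listed above; notify() is a hypothetical hook.
function checkKpis(metrics: StorageMetrics, notify: (msg: string) => void): void {
  if (metrics.uploadSuccessRate < 0.95) {
    notify(`Upload success rate ${(metrics.uploadSuccessRate * 100).toFixed(1)}% is below the 95% target`)
  }
  if (metrics.averageUploadTime > 30_000) {
    // Simplification: the 30s target above is stated for 100MB files
    notify(`Average upload time ${Math.round(metrics.averageUploadTime / 1000)}s exceeds the 30s target`)
  }
  if (metrics.replicationFactor < 3) {
    notify(`Replication factor ${metrics.replicationFactor.toFixed(1)} is below the target of 3`)
  }
}

// Example usage: run the check every five minutes
setInterval(() => checkKpis(storageMonitor.getMetrics(), msg => console.warn('[KPI]', msg)), 5 * 60 * 1000)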

3. Set Meaningful Alerts​

  • Alert on error rate > 5% for 5 minutes
  • Alert on upload success rate < 90% for 10 minutes
  • Alert on replication factor < 2 for 30 minutes
  • Alert on unusual storage growth (> 50% daily increase)
  • Alert on credit balance depletion for top users

4. Regular Review and Optimization​

  1. Daily: Check critical alerts and error rates
  2. Weekly: Review performance trends and user metrics
  3. Monthly: Analyze costs, storage growth, and retention
  4. Quarterly: Review and update alert thresholds

5. Privacy Considerations​

  • Never log personally identifiable information (PII)
  • Anonymize user data before analytics
  • Comply with GDPR, CCPA, and other regulations
  • Allow users to opt-out of analytics
  • Use end-to-end encryption for sensitive metrics
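
For the anonymization point above, one common approach is to hash identifiers client-side before they are attached to analytics events. A minimal sketch using the Web Crypto API; the salt value and where you call the helper are assumptions to adapt.

// Sketch only: hashes a user ID with SHA-256 so raw identifiers never leave the client.
// APP_ANALYTICS_SALT is a hypothetical per-deployment salt, not a secret key.
async function anonymizeUserId(userId: string, salt = 'APP_ANALYTICS_SALT'): Promise<string> {
  const data = new TextEncoder().encode(`${salt}:${userId}`)
  const digest = await crypto.subtle.digest('SHA-256', data)
  // Hex-encode the digest for use as an opaque analytics identifier
  return Array.from(new Uint8Array(digest))
    .map(b => b.toString(16).padStart(2, '0'))
    .join('')
}

// Example: attach the anonymized ID instead of the raw one
// ExternalMonitoringIntegration.sendMetrics(endpoint, { ...metrics, userId: await anonymizeUserId(rawId) })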

Troubleshooting Common Issues​

Issue: High Error Rates​

Symptoms:

  • Alerts for high error rates
  • Users reporting upload failures
  • Increased support tickets

Investigation Steps:

  1. Check error logs for patterns
  2. Review network connectivity metrics
  3. Verify blockchain node status
  4. Check credit balances for affected users
  5. Review recent code changes

Solutions:

  • Increase chunk size for better network conditions
  • Implement better retry logic with exponential backoff (see the sketch below)
  • Add more DAPI endpoints for redundancy
  • Optimize gas price calculations
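
A minimal sketch of the retry-with-exponential-backoff suggestion above; the attempt count, delays, and the uploadChunk call in the usage comment are illustrative.

// Sketch only: wraps any async operation with exponential backoff plus jitter.
async function withRetry<T>(
  operation: () => Promise<T>,
  maxAttempts = 5,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await operation()
    } catch (error) {
      lastError = error
      // Exponential backoff: 500ms, 1s, 2s, 4s ... plus random jitter
      const delay = baseDelayMs * 2 ** attempt + Math.random() * 250
      await new Promise(resolve => setTimeout(resolve, delay))
    }
  }
  throw lastError
}

// Example usage with a hypothetical uploadChunk function:
// const result = await withRetry(() => uploadChunk(chunk))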

Issue: Slow Performance​

Symptoms:

  • Long upload times
  • User complaints about speed
  • Dashboard showing high latency

Investigation Steps:

  1. Check network latency to DAPI nodes
  2. Review chunk size settings
  3. Monitor parallel upload counts
  4. Check for network congestion

Solutions:

  • Implement dynamic chunk sizing (sketched below)
  • Increase parallel upload count (cautiously)
  • Add geographically distributed DAPI endpoints
  • Implement client-side compression
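
A minimal sketch of dynamic chunk sizing, choosing the next chunk size from the measured throughput of the previous chunk; the size bounds and throughput thresholds are illustrative rather than tuned values.

// Sketch only: grows the chunk size while uploads are fast, shrinks it when they are slow.
const MIN_CHUNK = 64 * 1024    // 64 KB
const MAX_CHUNK = 1024 * 1024  // 1 MB

function nextChunkSize(currentSize: number, lastChunkMs: number): number {
  const throughputBps = currentSize / (lastChunkMs / 1000)
  if (throughputBps > 1_000_000) {
    // Fast link: larger chunks reduce per-chunk overhead
    return Math.min(currentSize * 2, MAX_CHUNK)
  }
  if (throughputBps < 100_000) {
    // Slow or congested link: smaller chunks fail and retry more cheaply
    return Math.max(Math.floor(currentSize / 2), MIN_CHUNK)
  }
  return currentSize
}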

Issue: Storage Issues​

Symptoms:

  • Files failing to retrieve
  • High chunk loss rate
  • Replication factor warnings

Investigation Steps:

  1. Check blockchain node health
  2. Verify data contract state
  3. Monitor chunk distribution
  4. Review storage credit balances

Solutions:

  • Implement chunk repair mechanisms (see the sketch below)
  • Increase redundancy factor
  • Add backup storage providers
  • Optimize storage contract parameters
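
A minimal sketch of a chunk repair pass built on the replica counts recorded by the storage monitor; listChunks and reuploadChunk are hypothetical helpers standing in for your own storage layer.

// Sketch only: re-replicates any chunk that has fallen below the target replica count.
// listChunks() and reuploadChunk() are hypothetical placeholders for your storage layer.
interface ChunkStatus {
  id: string
  healthy: boolean
  replicas: number
}

async function repairChunks(
  listChunks: () => Promise<ChunkStatus[]>,
  reuploadChunk: (id: string) => Promise<void>,
  targetReplicas = 3
): Promise<number> {
  const chunks = await listChunks()
  const degraded = chunks.filter(c => !c.healthy || c.replicas < targetReplicas)
  for (const chunk of degraded) {
    await reuploadChunk(chunk.id) // restore redundancy for this chunk
  }
  return degraded.length // number of chunks repaired
}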

Next Steps​

  1. Implement monitoring in your EvoBin integration
  2. Set up dashboards for key metrics
  3. Configure alerts for critical issues
  4. Regularly review metrics and adjust thresholds
  5. Continuously optimize based on monitoring data

Remember: Good monitoring is not about collecting all data, but about collecting the right data and acting on it effectively.