iot-auto-fe-be
commit ae4842eb75
.gitignore (vendored, new file, 150 lines)
@@ -0,0 +1,150 @@
# Dependencies
node_modules/
**/node_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local
**/.env
**/.env.local
**/.env.development.local
**/.env.test.local
**/.env.production.local

# Logs
logs/
**/logs/
*.log
**/*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*

# Runtime data
pids/
*.pid
*.seed
*.pid.lock

# Coverage directory used by tools like istanbul
coverage/
**/coverage/
*.lcov

# nyc test coverage
.nyc_output

# Grunt intermediate storage
.grunt

# Bower dependency directory
bower_components

# node-waf configuration
.lock-wscript

# Compiled binary addons
build/Release

# Dependency directories
jspm_packages/

# TypeScript cache
*.tsbuildinfo

# Optional npm cache directory
.npm

# Optional eslint cache
.eslintcache

# Microbundle cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/

# Optional REPL history
.node_repl_history

# Output of 'npm pack'
*.tgz

# Yarn Integrity file
.yarn-integrity

# dotenv environment variables file
.env.test

# parcel-bundler cache
.cache
.parcel-cache

# Next.js build output
.next

# Nuxt.js build / generate output
.nuxt
dist

# Gatsby files
.cache/
public

# Storybook build outputs
.out
.storybook-out

# Temporary folders
tmp/
temp/
**/tmp/
**/temp/

# Editor directories and files
.vscode/
.idea/
*.swp
*.swo
*~

# OS generated files
.DS_Store
.DS_Store?
._*
.Spotlight-V100
.Trashes
ehthumbs.db
Thumbs.db

# Database files
*.sqlite
*.db
**/*.sqlite
**/*.db

# SSL certificates
*.pem
*.key
*.crt

# PM2 files
.pm2/

# Docker
.dockerignore
Dockerfile
docker-compose.yml

# Backup files
*.bak
*.backup
**/*.bak
**/*.backup
README.md (new file, 337 lines)
@@ -0,0 +1,337 @@
# AI Agent Backend - IoT Dashboard

A comprehensive Node.js Express backend for IoT device management with AI-powered analytics, real-time streaming, self-healing capabilities, and intelligent suggestions.

## Features

### 🔐 Authentication & Authorization
- JWT-based authentication with Redis session management
- Role-based access control (Admin, Operator, Viewer)
- Password hashing with bcrypt
- Rate limiting for security

### 📡 Real-time Data Streaming
- Apache StreamPipes integration for IoT data ingestion
- WebSocket support for real-time communication
- Kafka/MQTT support for message queuing
- Redis caching for high-performance data access

### 🤖 AI Agent Capabilities
- Intelligent anomaly detection
- Predictive maintenance suggestions
- Performance optimization recommendations
- Self-healing automation
- Machine learning model management

### 🚨 Alert Management
- Real-time alert generation
- Configurable alert rules and thresholds
- Multi-channel notifications (Email, SMS, WebSocket)
- Alert acknowledgment and resolution tracking

### 🔧 Self-Healing System
- Automated problem detection and resolution
- Device restart and configuration management
- Load balancing and circuit breaker patterns
- Maintenance scheduling

### 📊 Analytics & Insights
- Real-time dashboard metrics
- Performance analytics
- Device health monitoring
- Historical data analysis

### 📱 Notification System
- Multi-channel notifications (Email, SMS, In-app)
- Configurable notification preferences
- Notification history and status tracking

## Tech Stack

- **Runtime**: Node.js 16+
- **Framework**: Express.js
- **Database**: MySQL 8.0+
- **Cache**: Redis 6.0+
- **Authentication**: JWT + bcrypt
- **Real-time**: Socket.IO
- **Streaming**: Apache StreamPipes, Kafka, MQTT
- **Notifications**: Nodemailer, Twilio
- **Logging**: Winston
- **Validation**: Express-validator

## Prerequisites

- Node.js 16+
- MySQL 8.0+
- Redis 6.0+
- Apache StreamPipes (optional)
- Kafka (optional)
- MQTT Broker (optional)

## Installation

1. **Clone the repository**
   ```bash
   git clone <repository-url>
   cd ai-agent-backend
   ```

2. **Install dependencies**
   ```bash
   npm install
   ```

3. **Environment Configuration**
   ```bash
   cp env.example .env
   ```

   Edit `.env` with your configuration:
   ```env
   # Database
   DB_HOST=localhost
   DB_PORT=3306
   DB_USER=root
   DB_PASSWORD=your_password
   DB_NAME=ai_agent_iot

   # JWT
   JWT_SECRET=your_super_secret_jwt_key_here

   # Redis
   REDIS_HOST=localhost
   REDIS_PORT=6379

   # StreamPipes
   STREAMPIPES_HOST=localhost
   STREAMPIPES_PORT=8080
   STREAMPIPES_USERNAME=admin
   STREAMPIPES_PASSWORD=admin

   # Notifications
   SMTP_HOST=smtp.gmail.com
   SMTP_USER=your_email@gmail.com
   SMTP_PASS=your_app_password

   TWILIO_ACCOUNT_SID=your_twilio_account_sid
   TWILIO_AUTH_TOKEN=your_twilio_auth_token
   ```

4. **Database Setup**
   ```bash
   # Run migrations
   npm run migrate

   # Seed initial data (optional)
   npm run seed
   ```

5. **Start the server**
   ```bash
   # Development
   npm run dev

   # Production
   npm start
   ```

## API Endpoints

### Authentication
- `POST /api/auth/register` - Register new user
- `POST /api/auth/login` - User login
- `POST /api/auth/logout` - User logout
- `GET /api/auth/profile` - Get user profile
- `PUT /api/auth/profile` - Update user profile
- `POST /api/auth/refresh` - Refresh JWT token

### Devices
- `GET /api/devices` - Get all devices
- `GET /api/devices/:deviceId` - Get device details
- `POST /api/devices` - Create new device
- `PUT /api/devices/:deviceId` - Update device
- `DELETE /api/devices/:deviceId` - Delete device
- `GET /api/devices/:deviceId/data` - Get device data
- `POST /api/devices/:deviceId/command` - Send command to device
- `GET /api/devices/:deviceId/stats` - Get device statistics

### Alerts
- `GET /api/alerts` - Get all alerts
- `GET /api/alerts/:alertId` - Get alert details
- `POST /api/alerts/:alertId/acknowledge` - Acknowledge alert
- `POST /api/alerts/:alertId/resolve` - Resolve alert
- `GET /api/alerts/stats/overview` - Get alert statistics

### Analytics
- `GET /api/analytics/dashboard` - Get dashboard overview
- `GET /api/analytics/performance` - Get performance metrics

### Healing
- `GET /api/healing` - Get healing actions

### Suggestions
- `GET /api/suggestions` - Get AI suggestions

### Notifications
- `GET /api/notifications` - Get user notifications

### StreamPipes
- `GET /api/streampipes/health` - Check StreamPipes health
- `GET /api/streampipes/streams` - Get data streams
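
All routes return a `{ success, ... }` envelope and, apart from registration and login, expect the JWT issued at login in an `Authorization: Bearer` header. A quick client sketch of that flow, using `axios` from the dependency list (placeholder credentials, local server on the default `PORT=5000`):

```js
const axios = require('axios');

const api = axios.create({ baseURL: 'http://localhost:5000/api' });

async function demo() {
  // Login returns { success, message, data: { user, token } }
  const login = await api.post('/auth/login', {
    username: 'admin',        // placeholder credentials
    password: 'secret123'
  });
  const token = login.data.data.token;

  // The protect middleware expects an Authorization: Bearer <token> header
  const devices = await api.get('/devices?page=1&limit=10', {
    headers: { Authorization: `Bearer ${token}` }
  });
  console.log(devices.data.pagination, devices.data.data.length);
}

demo().catch(console.error);
```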

## WebSocket Events

### Client to Server
- `subscribe_device` - Subscribe to device updates
- `unsubscribe_device` - Unsubscribe from device updates
- `subscribe_alerts` - Subscribe to alerts
- `unsubscribe_alerts` - Unsubscribe from alerts
- `device_control` - Send device control command
- `acknowledge_alert` - Acknowledge alert

### Server to Client
- `device_data_update` - Device data update
- `new_alert` - New alert notification
- `alert_acknowledged` - Alert acknowledged
- `healing_approved` - Healing action approved
- `suggestion_feedback_received` - Suggestion feedback
- `new_notification` - New notification
- `dashboard_update` - Dashboard update
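
A client-side sketch of the event flow above. The socket handlers live in `socket/` and are not part of this commit, so `socket.io-client`, the token handshake, and the payload shapes are assumptions here:

```js
const { io } = require('socket.io-client');

const socket = io('http://localhost:5000', {
  auth: { token: '<jwt from /api/auth/login>' } // assumed handshake, mirrors the REST auth
});

socket.on('connect', () => {
  socket.emit('subscribe_device', { deviceId: 'sensor-001' }); // payload shape assumed
  socket.emit('subscribe_alerts');
});

socket.on('device_data_update', (payload) => console.log('data', payload));
socket.on('new_alert', (alert) => console.log('alert', alert));
socket.on('dashboard_update', (stats) => console.log('dashboard', stats));
```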

## Database Schema

### Core Tables
- `users` - User accounts and authentication
- `devices` - IoT device information
- `device_data` - Raw device data storage
- `alerts` - Alert records
- `notifications` - User notifications

### AI & Analytics Tables
- `ai_suggestions` - AI-generated suggestions
- `ai_analysis_results` - AI analysis results
- `healing_actions` - Self-healing actions
- `device_controls` - Device control commands
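
The authoritative DDL lives in `scripts/migrate.js`, which is not part of this excerpt. As a rough orientation only, the sketch below declares the columns that the route queries in this commit actually reference; types, sizes, and constraints are assumptions:

```js
const database = require('../config/database');

// Sketch of two of the core tables; the real migration may differ.
async function createCoreTablesSketch() {
  await database.query(`
    CREATE TABLE IF NOT EXISTS devices (
      id INT AUTO_INCREMENT PRIMARY KEY,
      device_id VARCHAR(100) NOT NULL UNIQUE,
      name VARCHAR(255) NOT NULL,
      device_type VARCHAR(100) NOT NULL,
      location VARCHAR(255),
      configuration JSON,
      status VARCHAR(20) DEFAULT 'offline',
      created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    )
  `);

  await database.query(`
    CREATE TABLE IF NOT EXISTS alerts (
      id INT AUTO_INCREMENT PRIMARY KEY,
      device_id VARCHAR(100),
      type VARCHAR(100),
      severity VARCHAR(20),
      status VARCHAR(20) DEFAULT 'active',
      created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    )
  `);
}
```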

## Services Architecture

### Core Services
- **StreamPipesService** - Handles Apache StreamPipes integration
- **AIAgentService** - Manages AI analysis and suggestions
- **AlertService** - Handles alert generation and management
- **HealingService** - Manages self-healing actions
- **NotificationService** - Handles multi-channel notifications

### Configuration
- **Database** - MySQL connection and query management
- **Redis** - Caching and session management
- **Logger** - Structured logging with Winston

## Development

### Project Structure
```
├── config/          # Configuration files
├── middleware/      # Express middleware
├── routes/          # API route handlers
├── services/        # Business logic services
├── socket/          # WebSocket handlers
├── utils/           # Utility functions
├── scripts/         # Database scripts
├── logs/            # Application logs
├── server.js        # Main application file
└── package.json     # Dependencies and scripts
```

### Available Scripts
- `npm start` - Start production server
- `npm run dev` - Start development server with nodemon
- `npm run migrate` - Run database migrations
- `npm run seed` - Seed database with initial data
- `npm test` - Run tests

### Environment Variables
See `env.example` for all available configuration options.

## Deployment

### Production Setup
1. Set `NODE_ENV=production`
2. Configure production database and Redis
3. Set up SSL certificates
4. Configure reverse proxy (nginx)
5. Set up process manager (PM2)
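
For step 5, a possible PM2 process file. This file is not part of the commit; everything beyond `NODE_ENV` and `PORT` is an assumption:

```js
// ecosystem.config.js (hypothetical)
module.exports = {
  apps: [{
    name: 'ai-agent-backend',
    script: 'server.js',
    instances: 1,
    env_production: {
      NODE_ENV: 'production',
      PORT: 5000
    }
  }]
};
```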

### Docker Deployment
```bash
# Build image
docker build -t ai-agent-backend .

# Run container
docker run -p 5000:5000 --env-file .env ai-agent-backend
```

## Monitoring & Logging

### Health Checks
- `GET /health` - Application health status
- Service-specific health checks available

### Logging
- Structured JSON logging
- Log rotation and archiving
- Different log levels (error, warn, info, debug)

### Metrics
- Request/response metrics
- Database query performance
- Service health metrics
- Custom business metrics

## Security

### Authentication
- JWT tokens with configurable expiration
- Redis-based session management
- Password hashing with bcrypt

### Authorization
- Role-based access control
- Route-level permissions
- Device-level access control

### Security Headers
- Helmet.js for security headers
- CORS configuration
- Rate limiting
- Input validation
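
These pieces are typically wired in `server.js`, which is not included in this commit. A sketch of what that wiring could look like with the dependencies declared in `package.json` (`helmet`, `cors`, `express-rate-limit`) and the rate-limit settings from `env.example`; treat it as illustrative, not the actual implementation:

```js
const express = require('express');
const helmet = require('helmet');
const cors = require('cors');
const rateLimit = require('express-rate-limit');

const app = express();

app.use(helmet());                                    // security headers
app.use(cors({ origin: process.env.CORS_ORIGIN }));   // CORS configuration
app.use(express.json());                              // body parsing before validation

const limiter = rateLimit({
  windowMs: parseInt(process.env.RATE_LIMIT_WINDOW_MS) || 900000, // 15 minutes
  max: parseInt(process.env.RATE_LIMIT_MAX_REQUESTS) || 100
});
app.use('/api/', limiter);                            // rate limiting on API routes
```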

## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests
5. Submit a pull request

## License

MIT License - see LICENSE file for details

## Support

For support and questions:
- Create an issue in the repository
- Check the documentation
- Review the API examples

## Roadmap

- [ ] Advanced ML model integration
- [ ] GraphQL API support
- [ ] Microservices architecture
- [ ] Kubernetes deployment
- [ ] Advanced analytics dashboard
- [ ] Mobile app support
- [ ] Multi-tenant support
- [ ] Advanced security features
config/database.js (new file, 79 lines)
@@ -0,0 +1,79 @@
const mysql = require('mysql2/promise');
const logger = require('../utils/logger');

class Database {
  constructor() {
    this.pool = null;
  }

  async connect() {
    try {
      this.pool = mysql.createPool({
        host: process.env.DB_HOST || 'localhost',
        port: process.env.DB_PORT || 3306,
        user: process.env.DB_USER || 'root',
        password: process.env.DB_PASSWORD || '',
        database: process.env.DB_NAME || 'ai_agent_iot',
        waitForConnections: true,
        connectionLimit: 10,
        queueLimit: 0,
        acquireTimeout: 60000,
        charset: 'utf8mb4'
      });

      // Test the connection
      const connection = await this.pool.getConnection();
      await connection.ping();
      connection.release();

      logger.info('Database connection pool created successfully');
    } catch (error) {
      logger.error('Database connection failed:', error);
      throw error;
    }
  }

  async disconnect() {
    if (this.pool) {
      await this.pool.end();
      logger.info('Database connection pool closed');
    }
  }

  async query(sql, params = []) {
    try {
      const [rows] = await this.pool.execute(sql, params);
      return rows;
    } catch (error) {
      logger.error('Database query error:', error);
      throw error;
    }
  }

  async transaction(callback) {
    const connection = await this.pool.getConnection();
    try {
      await connection.beginTransaction();
      const result = await callback(connection);
      await connection.commit();
      return result;
    } catch (error) {
      await connection.rollback();
      throw error;
    } finally {
      connection.release();
    }
  }

  async healthCheck() {
    try {
      await this.query('SELECT 1 as health');
      return true;
    } catch (error) {
      logger.error('Database health check failed:', error);
      return false;
    }
  }
}

module.exports = new Database();
config/redis.js (new file, 189 lines)
@@ -0,0 +1,189 @@
const Redis = require('ioredis');
const logger = require('../utils/logger');

class RedisClient {
  constructor() {
    this.client = null;
    this.subscriber = null;
  }

  async connect() {
    try {
      this.client = new Redis({
        host: process.env.REDIS_HOST || 'localhost',
        port: process.env.REDIS_PORT || 6379,
        password: process.env.REDIS_PASSWORD || undefined,
        retryDelayOnFailover: 100,
        maxRetriesPerRequest: 3,
        lazyConnect: true,
        keepAlive: 30000,
        family: 4,
        db: 0
      });

      this.subscriber = new Redis({
        host: process.env.REDIS_HOST || 'localhost',
        port: process.env.REDIS_PORT || 6379,
        password: process.env.REDIS_PASSWORD || undefined,
        retryDelayOnFailover: 100,
        maxRetriesPerRequest: 3,
        lazyConnect: true,
        keepAlive: 30000,
        family: 4,
        db: 0
      });

      // Test connection
      await this.client.ping();
      await this.subscriber.ping();

      logger.info('Redis connections established successfully');
    } catch (error) {
      logger.error('Redis connection failed:', error);
      throw error;
    }
  }

  async disconnect() {
    if (this.client) {
      await this.client.quit();
      logger.info('Redis client disconnected');
    }
    if (this.subscriber) {
      await this.subscriber.quit();
      logger.info('Redis subscriber disconnected');
    }
  }

  async set(key, value, ttl = null) {
    try {
      if (ttl) {
        await this.client.setex(key, ttl, JSON.stringify(value));
      } else {
        await this.client.set(key, JSON.stringify(value));
      }
    } catch (error) {
      logger.error('Redis set error:', error);
      throw error;
    }
  }

  async get(key) {
    try {
      const value = await this.client.get(key);
      return value ? JSON.parse(value) : null;
    } catch (error) {
      logger.error('Redis get error:', error);
      throw error;
    }
  }

  async del(key) {
    try {
      await this.client.del(key);
    } catch (error) {
      logger.error('Redis del error:', error);
      throw error;
    }
  }

  async exists(key) {
    try {
      return await this.client.exists(key);
    } catch (error) {
      logger.error('Redis exists error:', error);
      throw error;
    }
  }

  async expire(key, seconds) {
    try {
      await this.client.expire(key, seconds);
    } catch (error) {
      logger.error('Redis expire error:', error);
      throw error;
    }
  }

  async publish(channel, message) {
    try {
      await this.client.publish(channel, JSON.stringify(message));
    } catch (error) {
      logger.error('Redis publish error:', error);
      throw error;
    }
  }

  async subscribe(channel, callback) {
    try {
      await this.subscriber.subscribe(channel);
      this.subscriber.on('message', (chan, message) => {
        if (chan === channel) {
          callback(JSON.parse(message));
        }
      });
    } catch (error) {
      logger.error('Redis subscribe error:', error);
      throw error;
    }
  }

  async unsubscribe(channel) {
    try {
      await this.subscriber.unsubscribe(channel);
    } catch (error) {
      logger.error('Redis unsubscribe error:', error);
      throw error;
    }
  }

  async healthCheck() {
    try {
      await this.client.ping();
      return true;
    } catch (error) {
      logger.error('Redis health check failed:', error);
      return false;
    }
  }

  // Device data caching methods
  async cacheDeviceData(deviceId, data, ttl = 300) {
    const key = `device:${deviceId}:data`;
    await this.set(key, data, ttl);
  }

  async getDeviceData(deviceId) {
    const key = `device:${deviceId}:data`;
    return await this.get(key);
  }

  // Alert caching methods
  async cacheAlert(alertId, alert, ttl = 3600) {
    const key = `alert:${alertId}`;
    await this.set(key, alert, ttl);
  }

  async getAlert(alertId) {
    const key = `alert:${alertId}`;
    return await this.get(key);
  }

  // User session caching
  async cacheUserSession(userId, sessionData, ttl = 86400) {
    const key = `session:${userId}`;
    await this.set(key, sessionData, ttl);
  }

  async getUserSession(userId) {
    const key = `session:${userId}`;
    return await this.get(key);
  }

  async invalidateUserSession(userId) {
    const key = `session:${userId}`;
    await this.del(key);
  }
}

module.exports = new RedisClient();
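
Both config modules export ready-made singletons. A minimal sketch of how `server.js` (not part of this excerpt) presumably boots them before serving traffic; only `connect()` and `healthCheck()` are taken from the classes above, the rest is an assumption:

```js
const database = require('./config/database');
const redis = require('./config/redis');
const logger = require('./utils/logger');

async function bootstrap() {
  // Open the MySQL pool and both Redis connections before accepting requests
  await database.connect();
  await redis.connect();

  const dbOk = await database.healthCheck();
  const cacheOk = await redis.healthCheck();
  logger.info(`Startup health - db: ${dbOk}, redis: ${cacheOk}`);
}

bootstrap().catch((err) => {
  logger.error('Startup failed:', err);
  process.exit(1);
});
```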
env.example (new file, 63 lines)
@@ -0,0 +1,63 @@
# Server Configuration
PORT=5000
NODE_ENV=development

# Database Configuration
DB_HOST=localhost
DB_PORT=3306
DB_USER=root
DB_PASSWORD=Admin@123
DB_NAME=ai_agent_iot

# JWT Configuration
JWT_SECRET=Zr8#vP!eK2$9nL@3^aW7xYb*TqM1UzGcR4sDfHjKlQ
JWT_EXPIRES_IN=24h

# Redis Configuration
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=

# Apache StreamPipes Configuration
STREAMPIPES_BASE_URL=https://datastream.cloudtopiaa.com
STREAMPIPES_USERNAME=admin
STREAMPIPES_PASSWORD=admin

# Kafka Configuration
KAFKA_BROKERS=localhost:9092
KAFKA_TOPIC_DEVICE_DATA=iot-device-data
KAFKA_TOPIC_ALERTS=iot-alerts

# MQTT Configuration
MQTT_BROKER=localhost
MQTT_PORT=1883
MQTT_USERNAME=
MQTT_PASSWORD=

# Email Configuration (for notifications)
SMTP_HOST=smtp.gmail.com
SMTP_PORT=587
SMTP_USER=your_email@gmail.com
SMTP_PASS=your_app_password

# Twilio Configuration (for SMS notifications)
TWILIO_ACCOUNT_SID=your_twilio_account_sid
TWILIO_AUTH_TOKEN=your_twilio_auth_token
TWILIO_PHONE_NUMBER=+1234567890

# AI Agent Configuration
AI_AGENT_ENABLED=true
AI_LEARNING_RATE=0.1
AI_THRESHOLD_ANOMALY=0.8
AI_HEALING_ENABLED=true

# Logging
LOG_LEVEL=info
LOG_FILE=logs/app.log

# Rate Limiting
RATE_LIMIT_WINDOW_MS=900000
RATE_LIMIT_MAX_REQUESTS=100

# CORS
CORS_ORIGIN=http://localhost:3000
iot-dashboard (submodule, new, 1 line)
@@ -0,0 +1 @@
Subproject commit fdb7a83ad9fbffbccd2ef31c9864191a0aa6ea40
middleware/auth.js (new file, 199 lines)
@@ -0,0 +1,199 @@
const jwt = require('jsonwebtoken');
const bcrypt = require('bcryptjs');
const logger = require('../utils/logger');
const database = require('../config/database');
const redis = require('../config/redis');

// Protect routes - require authentication
const protect = async (req, res, next) => {
  let token;

  if (req.headers.authorization && req.headers.authorization.startsWith('Bearer')) {
    token = req.headers.authorization.split(' ')[1];
  }

  if (!token) {
    return res.status(401).json({
      success: false,
      message: 'Not authorized to access this route'
    });
  }

  try {
    // Verify token
    const decoded = jwt.verify(token, process.env.JWT_SECRET);

    // Check if user session exists in Redis
    const session = await redis.getUserSession(decoded.id);
    if (!session) {
      return res.status(401).json({
        success: false,
        message: 'Session expired, please login again'
      });
    }

    // Get user from database (database.query already returns the rows array)
    const users = await database.query(
      'SELECT id, username, email, role, status, created_at FROM users WHERE id = ? AND status = "active"',
      [decoded.id]
    );

    if (users.length === 0) {
      return res.status(401).json({
        success: false,
        message: 'User not found or inactive'
      });
    }

    req.user = users[0];
    next();
  } catch (error) {
    logger.error('Token verification failed:', error);
    return res.status(401).json({
      success: false,
      message: 'Not authorized to access this route'
    });
  }
};

// Grant access to specific roles
const authorize = (...roles) => {
  return (req, res, next) => {
    if (!req.user) {
      return res.status(401).json({
        success: false,
        message: 'User not authenticated'
      });
    }

    if (!roles.includes(req.user.role)) {
      return res.status(403).json({
        success: false,
        message: `User role ${req.user.role} is not authorized to access this route`
      });
    }

    next();
  };
};

// Optional authentication - doesn't fail if no token
const optionalAuth = async (req, res, next) => {
  let token;

  if (req.headers.authorization && req.headers.authorization.startsWith('Bearer')) {
    token = req.headers.authorization.split(' ')[1];
  }

  if (!token) {
    req.user = null;
    return next();
  }

  try {
    const decoded = jwt.verify(token, process.env.JWT_SECRET);
    const session = await redis.getUserSession(decoded.id);

    if (session) {
      const users = await database.query(
        'SELECT id, username, email, role, status FROM users WHERE id = ? AND status = "active"',
        [decoded.id]
      );

      if (users.length > 0) {
        req.user = users[0];
      }
    }
  } catch (error) {
    logger.debug('Optional auth failed:', error.message);
  }

  next();
};

// Rate limiting for authentication attempts
const authRateLimit = (req, res, next) => {
  const key = `auth_attempts:${req.ip}`;

  redis.get(key).then(attempts => {
    const attemptCount = attempts ? parseInt(attempts) : 0;

    if (attemptCount >= 5) {
      return res.status(429).json({
        success: false,
        message: 'Too many authentication attempts. Please try again later.'
      });
    }

    next();
  }).catch(error => {
    logger.error('Rate limit check failed:', error);
    next(); // Continue if Redis is unavailable
  });
};

// Log authentication attempts
const logAuthAttempt = (req, res, next) => {
  const originalSend = res.send;

  res.send = function(data) {
    const response = JSON.parse(data);

    if (response.success === false && req.path.includes('/login')) {
      const key = `auth_attempts:${req.ip}`;
      redis.get(key).then(attempts => {
        const attemptCount = attempts ? parseInt(attempts) : 0;
        redis.set(key, attemptCount + 1, 900); // 15 minutes
      }).catch(error => {
        logger.error('Failed to log auth attempt:', error);
      });

      logger.logSecurity('failed_login', {
        ip: req.ip,
        userAgent: req.get('User-Agent'),
        username: req.body.username
      }, 'warn');
    } else if (response.success === true && req.path.includes('/login')) {
      logger.logSecurity('successful_login', {
        ip: req.ip,
        userAgent: req.get('User-Agent'),
        username: req.body.username
      }, 'info');
    }

    originalSend.call(this, data);
  };

  next();
};

// Generate JWT token
const generateToken = (userId) => {
  return jwt.sign(
    { id: userId },
    process.env.JWT_SECRET,
    { expiresIn: process.env.JWT_EXPIRES_IN || '24h' }
  );
};

// Hash password
const hashPassword = async (password) => {
  const salt = await bcrypt.genSalt(10);
  return bcrypt.hash(password, salt);
};

// Compare password
const comparePassword = async (password, hashedPassword) => {
  return bcrypt.compare(password, hashedPassword);
};

module.exports = {
  protect,
  authorize,
  optionalAuth,
  authRateLimit,
  logAuthAttempt,
  generateToken,
  hashPassword,
  comparePassword
};
middleware/errorHandler.js (new file, 94 lines)
@@ -0,0 +1,94 @@
const logger = require('../utils/logger');

// Error handling middleware
const errorHandler = (err, req, res, next) => {
  let error = { ...err };
  error.message = err.message;

  // Log error
  logger.error('Error occurred:', {
    message: err.message,
    stack: err.stack,
    url: req.url,
    method: req.method,
    ip: req.ip,
    userAgent: req.get('User-Agent')
  });

  // Mongoose bad ObjectId
  if (err.name === 'CastError') {
    const message = 'Resource not found';
    error = { message, statusCode: 404 };
  }

  // Mongoose duplicate key
  if (err.code === 11000) {
    const message = 'Duplicate field value entered';
    error = { message, statusCode: 400 };
  }

  // Mongoose validation error
  if (err.name === 'ValidationError') {
    const message = Object.values(err.errors).map(val => val.message).join(', ');
    error = { message, statusCode: 400 };
  }

  // JWT errors
  if (err.name === 'JsonWebTokenError') {
    const message = 'Invalid token';
    error = { message, statusCode: 401 };
  }

  if (err.name === 'TokenExpiredError') {
    const message = 'Token expired';
    error = { message, statusCode: 401 };
  }

  // MySQL errors
  if (err.code === 'ER_DUP_ENTRY') {
    const message = 'Duplicate entry found';
    error = { message, statusCode: 400 };
  }

  if (err.code === 'ER_NO_REFERENCED_ROW_2') {
    const message = 'Referenced record not found';
    error = { message, statusCode: 400 };
  }

  if (err.code === 'ER_ROW_IS_REFERENCED_2') {
    const message = 'Cannot delete record - it is referenced by other records';
    error = { message, statusCode: 400 };
  }

  // Redis errors
  if (err.code === 'ECONNREFUSED' && err.syscall === 'connect') {
    const message = 'Cache service unavailable';
    error = { message, statusCode: 503 };
  }

  // Network errors
  if (err.code === 'ENOTFOUND') {
    const message = 'Service not found';
    error = { message, statusCode: 503 };
  }

  if (err.code === 'ETIMEDOUT') {
    const message = 'Request timeout';
    error = { message, statusCode: 408 };
  }

  // Default error
  const statusCode = error.statusCode || 500;
  const message = error.message || 'Server Error';

  // Don't leak error details in production
  const response = {
    success: false,
    error: message,
    ...(process.env.NODE_ENV === 'development' && { stack: err.stack })
  };

  res.status(statusCode).json(response);
};

module.exports = errorHandler;
package-lock.json (generated, new file, 8010 lines)
File diff suppressed because it is too large.
package.json (new file, 55 lines)
@@ -0,0 +1,55 @@
{
  "name": "ai-agent-backend",
  "version": "1.0.0",
  "description": "AI Agent Backend for IoT Dashboard with real-time streaming, self-healing, and intelligent suggestions",
  "main": "server.js",
  "scripts": {
    "start": "node server.js",
    "dev": "nodemon server.js",
    "test": "jest",
    "migrate": "node scripts/migrate.js",
    "seed": "node scripts/seed.js"
  },
  "keywords": ["iot", "ai", "streaming", "realtime", "dashboard"],
  "author": "AI Agent Team",
  "license": "MIT",
  "dependencies": {
    "express": "^4.18.2",
    "mysql2": "^3.6.5",
    "socket.io": "^4.7.4",
    "bcryptjs": "^2.4.3",
    "jsonwebtoken": "^9.0.2",
    "cors": "^2.8.5",
    "helmet": "^7.1.0",
    "express-rate-limit": "^7.1.5",
    "dotenv": "^16.3.1",
    "winston": "^3.11.0",
    "node-cron": "^3.0.3",
    "axios": "^1.6.2",
    "multer": "^1.4.5-lts.1",
    "express-validator": "^7.0.1",
    "compression": "^1.7.4",
    "morgan": "^1.10.0",
    "uuid": "^9.0.1",
    "moment": "^2.29.4",
    "lodash": "^4.17.21",
    "node-schedule": "^2.1.1",
    "nodemailer": "^6.9.7",
    "twilio": "^4.19.0",
    "redis": "^4.6.10",
    "ioredis": "^5.3.2",
    "kafkajs": "^2.2.4",
    "mqtt": "^5.3.4",
    "ws": "^8.14.2"
  },
  "devDependencies": {
    "nodemon": "^3.0.2",
    "jest": "^29.7.0",
    "supertest": "^6.3.3",
    "eslint": "^8.55.0",
    "prettier": "^3.1.0"
  },
  "engines": {
    "node": ">=16.0.0"
  }
}
routes/alerts.js (new file, 225 lines)
@@ -0,0 +1,225 @@
const express = require('express');
const { body, validationResult } = require('express-validator');
const { protect, authorize } = require('../middleware/auth');
const database = require('../config/database');
const alertService = require('../services/alertService');
const logger = require('../utils/logger');

const router = express.Router();

// Get all alerts
router.get('/', protect, async (req, res) => {
  try {
    const { page = 1, limit = 50, status, severity, device_id } = req.query;
    const offset = (page - 1) * limit;

    let query = 'SELECT * FROM alerts WHERE 1=1';
    const params = [];

    if (status) {
      query += ' AND status = ?';
      params.push(status);
    }

    if (severity) {
      query += ' AND severity = ?';
      params.push(severity);
    }

    if (device_id) {
      query += ' AND device_id = ?';
      params.push(device_id);
    }

    query += ' ORDER BY created_at DESC LIMIT ? OFFSET ?';
    params.push(parseInt(limit), offset);

    const alerts = await database.query(query, params);

    // Get total count
    let countQuery = 'SELECT COUNT(*) as total FROM alerts WHERE 1=1';
    const countParams = [];

    if (status) {
      countQuery += ' AND status = ?';
      countParams.push(status);
    }

    if (severity) {
      countQuery += ' AND severity = ?';
      countParams.push(severity);
    }

    if (device_id) {
      countQuery += ' AND device_id = ?';
      countParams.push(device_id);
    }

    const [countResult] = await database.query(countQuery, countParams);
    const total = countResult.total;

    res.json({
      success: true,
      data: alerts,
      pagination: {
        page: parseInt(page),
        limit: parseInt(limit),
        total,
        pages: Math.ceil(total / limit)
      }
    });
  } catch (error) {
    logger.error('Get alerts error:', error);
    res.status(500).json({
      success: false,
      message: 'Failed to get alerts'
    });
  }
});

// Get alert by ID
router.get('/:alertId', protect, async (req, res) => {
  try {
    const { alertId } = req.params;

    const alerts = await database.query(
      'SELECT * FROM alerts WHERE id = ?',
      [alertId]
    );

    if (alerts.length === 0) {
      return res.status(404).json({
        success: false,
        message: 'Alert not found'
      });
    }

    res.json({
      success: true,
      data: alerts[0]
    });
  } catch (error) {
    logger.error('Get alert error:', error);
    res.status(500).json({
      success: false,
      message: 'Failed to get alert'
    });
  }
});

// Acknowledge alert
router.post('/:alertId/acknowledge', protect, async (req, res) => {
  try {
    const { alertId } = req.params;

    await alertService.acknowledgeAlert(alertId, req.user.id);

    res.json({
      success: true,
      message: 'Alert acknowledged successfully'
    });
  } catch (error) {
    logger.error('Acknowledge alert error:', error);
    res.status(500).json({
      success: false,
      message: 'Failed to acknowledge alert'
    });
  }
});

// Resolve alert
router.post('/:alertId/resolve', protect, [
  body('resolution').optional().isString().withMessage('Resolution must be a string')
], async (req, res) => {
  try {
    const errors = validationResult(req);
    if (!errors.isEmpty()) {
      return res.status(400).json({
        success: false,
        errors: errors.array()
      });
    }

    const { alertId } = req.params;
    const { resolution } = req.body;

    await alertService.resolveAlert(alertId, req.user.id, resolution);

    res.json({
      success: true,
      message: 'Alert resolved successfully'
    });
  } catch (error) {
    logger.error('Resolve alert error:', error);
    res.status(500).json({
      success: false,
      message: 'Failed to resolve alert'
    });
  }
});

// Get alert statistics
router.get('/stats/overview', protect, async (req, res) => {
  try {
    const { period = '24h' } = req.query;

    let timeFilter;
    switch (period) {
      case '1h':
        timeFilter = 'DATE_SUB(NOW(), INTERVAL 1 HOUR)';
        break;
      case '24h':
        timeFilter = 'DATE_SUB(NOW(), INTERVAL 24 HOUR)';
        break;
      case '7d':
        timeFilter = 'DATE_SUB(NOW(), INTERVAL 7 DAY)';
        break;
      case '30d':
        timeFilter = 'DATE_SUB(NOW(), INTERVAL 30 DAY)';
        break;
      default:
        timeFilter = 'DATE_SUB(NOW(), INTERVAL 24 HOUR)';
    }

    // Get total alerts
    const [totalAlerts] = await database.query(
      `SELECT COUNT(*) as count FROM alerts WHERE created_at >= ${timeFilter}`
    );

    // Get alerts by severity
    const severityStats = await database.query(
      `SELECT severity, COUNT(*) as count FROM alerts WHERE created_at >= ${timeFilter} GROUP BY severity`
    );

    // Get alerts by status
    const statusStats = await database.query(
      `SELECT status, COUNT(*) as count FROM alerts WHERE created_at >= ${timeFilter} GROUP BY status`
    );

    // Get alerts by type
    const typeStats = await database.query(
      `SELECT type, COUNT(*) as count FROM alerts WHERE created_at >= ${timeFilter} GROUP BY type ORDER BY count DESC LIMIT 10`
    );

    const stats = {
      period,
      total_alerts: totalAlerts.count,
      by_severity: severityStats,
      by_status: statusStats,
      by_type: typeStats
    };

    res.json({
      success: true,
      data: stats
    });
  } catch (error) {
    logger.error('Get alert stats error:', error);
    res.status(500).json({
      success: false,
      message: 'Failed to get alert statistics'
    });
  }
});

module.exports = router;
routes/analytics.js (new file, 99 lines)
@@ -0,0 +1,99 @@
const express = require('express');
const { protect } = require('../middleware/auth');
const database = require('../config/database');
const logger = require('../utils/logger');

const router = express.Router();

// Get dashboard overview
router.get('/dashboard', protect, async (req, res) => {
  try {
    // Get total devices
    const [deviceCount] = await database.query('SELECT COUNT(*) as count FROM devices');

    // Get online devices
    const [onlineDevices] = await database.query("SELECT COUNT(*) as count FROM devices WHERE status = 'online'");

    // Get total alerts in last 24h
    const [alerts24h] = await database.query(
      'SELECT COUNT(*) as count FROM alerts WHERE created_at >= DATE_SUB(NOW(), INTERVAL 24 HOUR)'
    );

    // Get critical alerts
    const [criticalAlerts] = await database.query(
      "SELECT COUNT(*) as count FROM alerts WHERE severity = 'critical' AND status = 'active'"
    );

    const overview = {
      total_devices: deviceCount.count,
      online_devices: onlineDevices.count,
      offline_devices: deviceCount.count - onlineDevices.count,
      alerts_24h: alerts24h.count,
      critical_alerts: criticalAlerts.count,
      uptime_percentage: deviceCount.count > 0 ? ((onlineDevices.count / deviceCount.count) * 100).toFixed(2) : 0
    };

    res.json({
      success: true,
      data: overview
    });
  } catch (error) {
    logger.error('Get dashboard overview error:', error);
    res.status(500).json({
      success: false,
      message: 'Failed to get dashboard overview'
    });
  }
});

// Get device performance metrics
router.get('/performance', protect, async (req, res) => {
  try {
    const { period = '24h' } = req.query;

    let timeFilter;
    switch (period) {
      case '1h':
        timeFilter = 'DATE_SUB(NOW(), INTERVAL 1 HOUR)';
        break;
      case '24h':
        timeFilter = 'DATE_SUB(NOW(), INTERVAL 24 HOUR)';
        break;
      case '7d':
        timeFilter = 'DATE_SUB(NOW(), INTERVAL 7 DAY)';
        break;
      default:
        timeFilter = 'DATE_SUB(NOW(), INTERVAL 24 HOUR)';
    }

    // Get data points count
    const [dataPoints] = await database.query(
      `SELECT COUNT(*) as count FROM device_data WHERE created_at >= ${timeFilter}`
    );

    // Get average data size
    const [avgDataSize] = await database.query(
      `SELECT AVG(JSON_LENGTH(raw_data)) as avg_size FROM device_data WHERE created_at >= ${timeFilter}`
    );

    const performance = {
      period,
      data_points: dataPoints.count,
      avg_data_size: avgDataSize.avg_size || 0,
      data_rate_per_hour: period === '1h' ? dataPoints.count : Math.round(dataPoints.count / 24)
    };

    res.json({
      success: true,
      data: performance
    });
  } catch (error) {
    logger.error('Get performance metrics error:', error);
    res.status(500).json({
      success: false,
      message: 'Failed to get performance metrics'
    });
  }
});

module.exports = router;
routes/auth.js (new file, 410 lines)
@@ -0,0 +1,410 @@
const express = require('express');
const { body, validationResult } = require('express-validator');
const { protect, generateToken, hashPassword, comparePassword, authRateLimit, logAuthAttempt } = require('../middleware/auth');
const database = require('../config/database');
const redis = require('../config/redis');
const logger = require('../utils/logger');

const router = express.Router();

// Register new user
router.post('/register', [
  body('username').isLength({ min: 3 }).withMessage('Username must be at least 3 characters'),
  body('email').isEmail().withMessage('Must be a valid email'),
  body('password').isLength({ min: 6 }).withMessage('Password must be at least 6 characters'),
  body('role').optional().isIn(['admin', 'operator', 'viewer']).withMessage('Invalid role')
], async (req, res) => {
  try {
    const errors = validationResult(req);
    if (!errors.isEmpty()) {
      return res.status(400).json({
        success: false,
        errors: errors.array()
      });
    }

    const { username, email, password, role = 'viewer' } = req.body;

    // Check if user already exists
    const existingUsers = await database.query(
      'SELECT id FROM users WHERE username = ? OR email = ?',
      [username, email]
    );

    if (existingUsers.length > 0) {
      return res.status(400).json({
        success: false,
        message: 'Username or email already exists'
      });
    }

    // Hash password
    const passwordHash = await hashPassword(password);

    // Create user
    const result = await database.query(
      'INSERT INTO users (username, email, password_hash, role) VALUES (?, ?, ?, ?)',
      [username, email, passwordHash, role]
    );

    const userId = result.insertId;

    // Generate token
    const token = generateToken(userId);

    // Store session in Redis
    await redis.cacheUserSession(userId, {
      userId,
      username,
      email,
      role,
      loginTime: new Date().toISOString()
    });

    logger.logSecurity('user_registered', { username, email, role }, 'info');

    res.status(201).json({
      success: true,
      message: 'User registered successfully',
      data: {
        user: {
          id: userId,
          username,
          email,
          role
        },
        token
      }
    });
  } catch (error) {
    logger.error('Registration error:', error);
    res.status(500).json({
      success: false,
      message: 'Registration failed'
    });
  }
});

// Login user
router.post('/login', [
  body('username').notEmpty().withMessage('Username is required'),
  body('password').notEmpty().withMessage('Password is required')
], authRateLimit, logAuthAttempt, async (req, res) => {
  try {
    const errors = validationResult(req);
    if (!errors.isEmpty()) {
      return res.status(400).json({
        success: false,
        errors: errors.array()
      });
    }

    const { username, password } = req.body;

    // Find user
    const users = await database.query(
      'SELECT id, username, email, password_hash, role, status FROM users WHERE username = ? OR email = ?',
      [username, username]
    );

    if (users.length === 0) {
      return res.status(401).json({
        success: false,
        message: 'Invalid credentials'
      });
    }

    const user = users[0];

    // Check if user is active
    if (user.status !== 'active') {
      return res.status(401).json({
        success: false,
        message: 'Account is not active'
      });
    }

    // Verify password
    const isValidPassword = await comparePassword(password, user.password_hash);
    if (!isValidPassword) {
      return res.status(401).json({
        success: false,
        message: 'Invalid credentials'
      });
    }

    // Generate token
    const token = generateToken(user.id);

    // Store session in Redis
    await redis.cacheUserSession(user.id, {
      userId: user.id,
      username: user.username,
      email: user.email,
      role: user.role,
      loginTime: new Date().toISOString()
    });

    // Update last login
    await database.query(
      'UPDATE users SET last_login = NOW() WHERE id = ?',
      [user.id]
    );

    logger.logSecurity('user_login', { username: user.username, userId: user.id }, 'info');

    res.json({
      success: true,
      message: 'Login successful',
      data: {
        user: {
          id: user.id,
          username: user.username,
          email: user.email,
          role: user.role
        },
        token
      }
    });
  } catch (error) {
    logger.error('Login error:', error);
    res.status(500).json({
      success: false,
      message: 'Login failed'
    });
  }
});

// Get current user profile
router.get('/profile', protect, async (req, res) => {
  try {
    const user = await database.query(
      'SELECT id, username, email, role, status, created_at, last_login FROM users WHERE id = ?',
      [req.user.id]
    );

    if (user.length === 0) {
      return res.status(404).json({
        success: false,
        message: 'User not found'
      });
    }

    res.json({
      success: true,
      data: user[0]
    });
  } catch (error) {
    logger.error('Get profile error:', error);
    res.status(500).json({
      success: false,
      message: 'Failed to get profile'
    });
  }
});

// Update user profile
router.put('/profile', protect, [
  body('email').optional().isEmail().withMessage('Must be a valid email'),
  body('currentPassword').optional().notEmpty().withMessage('Current password is required for changes')
], async (req, res) => {
  try {
    const errors = validationResult(req);
    if (!errors.isEmpty()) {
      return res.status(400).json({
        success: false,
        errors: errors.array()
      });
    }

    const { email, currentPassword, newPassword } = req.body;
    const updates = {};
    const params = [];

    // Check if user wants to change password
    if (newPassword) {
      if (!currentPassword) {
        return res.status(400).json({
          success: false,
          message: 'Current password is required to change password'
        });
      }

      // Verify current password
      const user = await database.query(
        'SELECT password_hash FROM users WHERE id = ?',
        [req.user.id]
      );

      const isValidPassword = await comparePassword(currentPassword, user[0].password_hash);
      if (!isValidPassword) {
        return res.status(400).json({
          success: false,
          message: 'Current password is incorrect'
        });
      }

      // Hash new password
      const newPasswordHash = await hashPassword(newPassword);
      updates.password_hash = newPasswordHash;
      params.push(newPasswordHash);
    }

    // Update email if provided
    if (email) {
      // Check if email is already taken
      const existingUser = await database.query(
        'SELECT id FROM users WHERE email = ? AND id != ?',
        [email, req.user.id]
      );

      if (existingUser.length > 0) {
        return res.status(400).json({
          success: false,
          message: 'Email is already taken'
        });
      }

      updates.email = email;
      params.push(email);
    }

    if (Object.keys(updates).length === 0) {
      return res.status(400).json({
        success: false,
        message: 'No updates provided'
      });
    }

    // Build update query
    const updateFields = Object.keys(updates).map(key => `${key} = ?`).join(', ');
    params.push(req.user.id);

    await database.query(
      `UPDATE users SET ${updateFields} WHERE id = ?`,
      params
    );

    logger.logSecurity('profile_updated', { userId: req.user.id }, 'info');

    res.json({
      success: true,
      message: 'Profile updated successfully'
    });
  } catch (error) {
    logger.error('Update profile error:', error);
    res.status(500).json({
      success: false,
      message: 'Failed to update profile'
    });
  }
});

// Logout user
router.post('/logout', protect, async (req, res) => {
  try {
    // Invalidate session in Redis
    await redis.invalidateUserSession(req.user.id);

    logger.logSecurity('user_logout', { userId: req.user.id }, 'info');

    res.json({
      success: true,
      message: 'Logout successful'
    });
  } catch (error) {
    logger.error('Logout error:', error);
    res.status(500).json({
      success: false,
      message: 'Logout failed'
    });
  }
});

// Refresh token
router.post('/refresh', protect, async (req, res) => {
  try {
    // Generate new token
    const newToken = generateToken(req.user.id);

    // Update session in Redis
    await redis.cacheUserSession(req.user.id, {
      userId: req.user.id,
      username: req.user.username,
      email: req.user.email,
      role: req.user.role,
      loginTime: new Date().toISOString()
    });

    res.json({
      success: true,
      message: 'Token refreshed successfully',
      data: {
        token: newToken
      }
    });
  } catch (error) {
    logger.error('Token refresh error:', error);
    res.status(500).json({
      success: false,
      message: 'Token refresh failed'
    });
  }
});

// Change password
router.post('/change-password', protect, [
  body('currentPassword').notEmpty().withMessage('Current password is required'),
  body('newPassword').isLength({ min: 6 }).withMessage('New password must be at least 6 characters')
], async (req, res) => {
  try {
    const errors = validationResult(req);
    if (!errors.isEmpty()) {
      return res.status(400).json({
        success: false,
        errors: errors.array()
      });
    }

    const { currentPassword, newPassword } = req.body;

    // Get current user
    const user = await database.query(
      'SELECT password_hash FROM users WHERE id = ?',
      [req.user.id]
    );

    // Verify current password
    const isValidPassword = await comparePassword(currentPassword, user[0].password_hash);
    if (!isValidPassword) {
      return res.status(400).json({
        success: false,
        message: 'Current password is incorrect'
      });
    }

    // Hash new password
    const newPasswordHash = await hashPassword(newPassword);

    // Update password
    await database.query(
      'UPDATE users SET password_hash = ? WHERE id = ?',
      [newPasswordHash, req.user.id]
    );

    logger.logSecurity('password_changed', { userId: req.user.id }, 'info');

    res.json({
      success: true,
      message: 'Password changed successfully'
    });
  } catch (error) {
    logger.error('Change password error:', error);
    res.status(500).json({
      success: false,
      message: 'Failed to change password'
    });
  }
});

module.exports = router;
routes/devices.js (new file, 544 lines)
@@ -0,0 +1,544 @@
|
||||
const express = require('express');
|
||||
const { body, validationResult } = require('express-validator');
|
||||
const { protect, authorize } = require('../middleware/auth');
|
||||
const database = require('../config/database');
|
||||
const redis = require('../config/redis');
|
||||
const streamPipesService = require('../services/streamPipesService');
|
||||
const logger = require('../utils/logger');
|
||||
|
||||
const router = express.Router();
|
||||
|
||||
// Get all devices
|
||||
router.get('/', protect, async (req, res) => {
|
||||
try {
|
||||
const { page = 1, limit = 10, status, device_type } = req.query;
|
||||
const offset = (page - 1) * limit;
|
||||
|
||||
let query = 'SELECT * FROM devices WHERE 1=1';
|
||||
const params = [];
|
||||
|
||||
if (status) {
|
||||
query += ' AND status = ?';
|
||||
params.push(status);
|
||||
}
|
||||
|
||||
if (device_type) {
|
||||
query += ' AND device_type = ?';
|
||||
params.push(device_type);
|
||||
}
|
||||
|
||||
query += ' ORDER BY created_at DESC LIMIT ? OFFSET ?';
|
||||
params.push(parseInt(limit), offset);
|
||||
|
||||
const devices = await database.query(query, params);
|
||||
|
||||
// Get total count
|
||||
let countQuery = 'SELECT COUNT(*) as total FROM devices WHERE 1=1';
|
||||
const countParams = [];
|
||||
|
||||
if (status) {
|
||||
countQuery += ' AND status = ?';
|
||||
countParams.push(status);
|
||||
}
|
||||
|
||||
if (device_type) {
|
||||
countQuery += ' AND device_type = ?';
|
||||
countParams.push(device_type);
|
||||
}
|
||||
|
||||
const [countResult] = await database.query(countQuery, countParams);
|
||||
const total = countResult.total;
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: devices,
|
||||
pagination: {
|
||||
page: parseInt(page),
|
||||
limit: parseInt(limit),
|
||||
total,
|
||||
pages: Math.ceil(total / limit)
|
||||
}
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Get devices error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to get devices'
|
||||
});
|
||||
}
|
||||
});
|
||||
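// Example request (assumes a valid JWT and the default port 5000; the query string
// shows the optional filters handled above):
//   curl -H "Authorization: Bearer <token>" \
//     "http://localhost:5000/api/devices?status=online&device_type=temperature_sensor&page=1&limit=10"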
|
||||
// Get device by ID
|
||||
router.get('/:deviceId', protect, async (req, res) => {
|
||||
try {
|
||||
const { deviceId } = req.params;
|
||||
|
||||
const devices = await database.query(
|
||||
'SELECT * FROM devices WHERE device_id = ?',
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
if (devices.length === 0) {
|
||||
return res.status(404).json({
|
||||
success: false,
|
||||
message: 'Device not found'
|
||||
});
|
||||
}
|
||||
|
||||
// Get latest device data from Redis
|
||||
const latestData = await redis.getDeviceData(deviceId);
|
||||
|
||||
const device = {
|
||||
...devices[0],
|
||||
latest_data: latestData
|
||||
};
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: device
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Get device error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to get device'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Create new device
|
||||
router.post('/', protect, authorize('admin', 'operator'), [
|
||||
body('device_id').notEmpty().withMessage('Device ID is required'),
|
||||
body('name').notEmpty().withMessage('Device name is required'),
|
||||
body('device_type').notEmpty().withMessage('Device type is required')
|
||||
], async (req, res) => {
|
||||
try {
|
||||
const errors = validationResult(req);
|
||||
if (!errors.isEmpty()) {
|
||||
return res.status(400).json({
|
||||
success: false,
|
||||
errors: errors.array()
|
||||
});
|
||||
}
|
||||
|
||||
const { device_id, name, device_type, location, configuration } = req.body;
|
||||
|
||||
// Check if device already exists
|
||||
const existingDevices = await database.query(
|
||||
'SELECT id FROM devices WHERE device_id = ?',
|
||||
[device_id]
|
||||
);
|
||||
|
||||
if (existingDevices.length > 0) {
|
||||
return res.status(400).json({
|
||||
success: false,
|
||||
message: 'Device ID already exists'
|
||||
});
|
||||
}
|
||||
|
||||
// Create device
|
||||
const result = await database.query(
|
||||
'INSERT INTO devices (device_id, name, device_type, location, configuration) VALUES (?, ?, ?, ?, ?)',
|
||||
[device_id, name, device_type, location, JSON.stringify(configuration || {})]
|
||||
);
|
||||
|
||||
logger.logSecurity('device_created', { device_id, name, created_by: req.user.id }, 'info');
|
||||
|
||||
res.status(201).json({
|
||||
success: true,
|
||||
message: 'Device created successfully',
|
||||
data: {
|
||||
id: result.insertId,
|
||||
device_id,
|
||||
name,
|
||||
device_type,
|
||||
location,
|
||||
configuration
|
||||
}
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Create device error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to create device'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Update device
|
||||
router.put('/:deviceId', protect, authorize('admin', 'operator'), [
|
||||
body('name').optional().notEmpty().withMessage('Device name cannot be empty'),
|
||||
body('device_type').optional().notEmpty().withMessage('Device type cannot be empty')
|
||||
], async (req, res) => {
|
||||
try {
|
||||
const errors = validationResult(req);
|
||||
if (!errors.isEmpty()) {
|
||||
return res.status(400).json({
|
||||
success: false,
|
||||
errors: errors.array()
|
||||
});
|
||||
}
|
||||
|
||||
const { deviceId } = req.params;
|
||||
const { name, device_type, location, configuration, status } = req.body;
|
||||
|
||||
// Check if device exists
|
||||
const existingDevices = await database.query(
|
||||
'SELECT id FROM devices WHERE device_id = ?',
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
if (existingDevices.length === 0) {
|
||||
return res.status(404).json({
|
||||
success: false,
|
||||
message: 'Device not found'
|
||||
});
|
||||
}
|
||||
|
||||
// Build update query
|
||||
const updates = {};
|
||||
const params = [];
|
||||
|
||||
if (name) {
|
||||
updates.name = name;
|
||||
params.push(name);
|
||||
}
|
||||
|
||||
if (device_type) {
|
||||
updates.device_type = device_type;
|
||||
params.push(device_type);
|
||||
}
|
||||
|
||||
if (location !== undefined) {
|
||||
updates.location = location;
|
||||
params.push(location);
|
||||
}
|
||||
|
||||
if (configuration !== undefined) {
|
||||
updates.configuration = JSON.stringify(configuration);
|
||||
params.push(JSON.stringify(configuration));
|
||||
}
|
||||
|
||||
if (status) {
|
||||
updates.status = status;
|
||||
params.push(status);
|
||||
}
|
||||
|
||||
if (Object.keys(updates).length === 0) {
|
||||
return res.status(400).json({
|
||||
success: false,
|
||||
message: 'No updates provided'
|
||||
});
|
||||
}
|
||||
|
||||
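// Object.keys preserves insertion order for these string keys, so the generated SET
// clause lines up one-to-one with the values already pushed into params above.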
const updateFields = Object.keys(updates).map(key => `${key} = ?`).join(', ');
|
||||
params.push(deviceId);
|
||||
|
||||
await database.query(
|
||||
`UPDATE devices SET ${updateFields} WHERE device_id = ?`,
|
||||
params
|
||||
);
|
||||
|
||||
logger.logSecurity('device_updated', { device_id: deviceId, updated_by: req.user.id }, 'info');
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
message: 'Device updated successfully'
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Update device error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to update device'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Delete device
|
||||
router.delete('/:deviceId', protect, authorize('admin'), async (req, res) => {
|
||||
try {
|
||||
const { deviceId } = req.params;
|
||||
|
||||
// Check if device exists
|
||||
const existingDevices = await database.query(
|
||||
'SELECT id FROM devices WHERE device_id = ?',
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
if (existingDevices.length === 0) {
|
||||
return res.status(404).json({
|
||||
success: false,
|
||||
message: 'Device not found'
|
||||
});
|
||||
}
|
||||
|
||||
// Delete device
|
||||
await database.query(
|
||||
'DELETE FROM devices WHERE device_id = ?',
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
// Clear device data from Redis
|
||||
await redis.del(`device:${deviceId}:data`);
|
||||
|
||||
logger.logSecurity('device_deleted', { device_id: deviceId, deleted_by: req.user.id }, 'info');
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
message: 'Device deleted successfully'
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Delete device error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to delete device'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Get device data
|
||||
router.get('/:deviceId/data', protect, async (req, res) => {
|
||||
try {
|
||||
const { deviceId } = req.params;
|
||||
const { limit = 100, start_date, end_date } = req.query;
|
||||
|
||||
// Check if device exists
|
||||
const existingDevices = await database.query(
|
||||
'SELECT id FROM devices WHERE device_id = ?',
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
if (existingDevices.length === 0) {
|
||||
return res.status(404).json({
|
||||
success: false,
|
||||
message: 'Device not found'
|
||||
});
|
||||
}
|
||||
|
||||
let query = 'SELECT * FROM device_data WHERE device_id = ?';
|
||||
const params = [deviceId];
|
||||
|
||||
if (start_date && end_date) {
|
||||
query += ' AND timestamp BETWEEN ? AND ?';
|
||||
params.push(start_date, end_date);
|
||||
}
|
||||
|
||||
query += ' ORDER BY timestamp DESC LIMIT ?';
|
||||
params.push(parseInt(limit));
|
||||
|
||||
const data = await database.query(query, params);
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: data
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Get device data error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to get device data'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Send command to device
|
||||
router.post('/:deviceId/command', protect, authorize('admin', 'operator'), [
|
||||
body('command').notEmpty().withMessage('Command is required'),
|
||||
body('parameters').optional().isObject().withMessage('Parameters must be an object')
|
||||
], async (req, res) => {
|
||||
try {
|
||||
const errors = validationResult(req);
|
||||
if (!errors.isEmpty()) {
|
||||
return res.status(400).json({
|
||||
success: false,
|
||||
errors: errors.array()
|
||||
});
|
||||
}
|
||||
|
||||
const { deviceId } = req.params;
|
||||
const { command, parameters = {} } = req.body;
|
||||
|
||||
// Check if device exists
|
||||
const existingDevices = await database.query(
|
||||
'SELECT id FROM devices WHERE device_id = ?',
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
if (existingDevices.length === 0) {
|
||||
return res.status(404).json({
|
||||
success: false,
|
||||
message: 'Device not found'
|
||||
});
|
||||
}
|
||||
|
||||
// Store command in database
|
||||
const result = await database.query(
|
||||
'INSERT INTO device_controls (device_id, action, parameters, initiated_by, status) VALUES (?, ?, ?, ?, ?)',
|
||||
[deviceId, command, JSON.stringify(parameters), req.user.id, 'pending']
|
||||
);
|
||||
|
||||
logger.logSecurity('device_command_sent', {
|
||||
device_id: deviceId,
|
||||
command,
|
||||
parameters,
|
||||
initiated_by: req.user.id
|
||||
}, 'info');
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
message: 'Command sent successfully',
|
||||
data: {
|
||||
control_id: result.insertId,
|
||||
device_id: deviceId,
|
||||
command,
|
||||
parameters,
|
||||
status: 'pending'
|
||||
}
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Send device command error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to send command'
|
||||
});
|
||||
}
|
||||
});
|
||||
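// Example (hypothetical payload, for illustration only):
//   POST /api/devices/sensor-001/command
//   { "command": "restart", "parameters": { "delay_seconds": 5 } }
// The command is only persisted as a 'pending' row in device_controls here; delivery to
// the physical device is presumably handled elsewhere.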
|
||||
// Get device statistics
|
||||
router.get('/:deviceId/stats', protect, async (req, res) => {
|
||||
try {
|
||||
const { deviceId } = req.params;
|
||||
const { period = '24h' } = req.query;
|
||||
|
||||
// Check if device exists
|
||||
const existingDevices = await database.query(
|
||||
'SELECT id FROM devices WHERE device_id = ?',
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
if (existingDevices.length === 0) {
|
||||
return res.status(404).json({
|
||||
success: false,
|
||||
message: 'Device not found'
|
||||
});
|
||||
}
|
||||
|
||||
let timeFilter;
|
||||
switch (period) {
|
||||
case '1h':
|
||||
timeFilter = 'DATE_SUB(NOW(), INTERVAL 1 HOUR)';
|
||||
break;
|
||||
case '24h':
|
||||
timeFilter = 'DATE_SUB(NOW(), INTERVAL 24 HOUR)';
|
||||
break;
|
||||
case '7d':
|
||||
timeFilter = 'DATE_SUB(NOW(), INTERVAL 7 DAY)';
|
||||
break;
|
||||
case '30d':
|
||||
timeFilter = 'DATE_SUB(NOW(), INTERVAL 30 DAY)';
|
||||
break;
|
||||
default:
|
||||
timeFilter = 'DATE_SUB(NOW(), INTERVAL 24 HOUR)';
|
||||
}
|
||||
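// timeFilter comes from the fixed whitelist above (never from request input), so
// interpolating it into the SQL strings below does not open an injection vector.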
|
||||
// Get data count
|
||||
const [dataCount] = await database.query(
|
||||
`SELECT COUNT(*) as count FROM device_data WHERE device_id = ? AND created_at >= ${timeFilter}`,
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
// Get latest data
|
||||
const [latestData] = await database.query(
|
||||
'SELECT * FROM device_data WHERE device_id = ? ORDER BY timestamp DESC LIMIT 1',
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
// Get alerts count
|
||||
const [alertsCount] = await database.query(
|
||||
`SELECT COUNT(*) as count FROM alerts WHERE device_id = ? AND created_at >= ${timeFilter}`,
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
const stats = {
|
||||
device_id: deviceId,
|
||||
period,
|
||||
data_count: dataCount.count,
|
||||
alerts_count: alertsCount.count,
|
||||
latest_data: latestData || null,
|
||||
last_seen: latestData ? latestData.timestamp : null
|
||||
};
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: stats
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Get device stats error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to get device statistics'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Get all device types
|
||||
router.get('/types/list', protect, async (req, res) => {
|
||||
try {
|
||||
const types = await database.query(
|
||||
'SELECT DISTINCT device_type FROM devices ORDER BY device_type'
|
||||
);
|
||||
|
||||
const deviceTypes = types.map(type => type.device_type);
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: deviceTypes
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Get device types error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to get device types'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Get devices by type
|
||||
router.get('/types/:deviceType', protect, async (req, res) => {
|
||||
try {
|
||||
const { deviceType } = req.params;
|
||||
const { page = 1, limit = 10 } = req.query;
|
||||
const offset = (page - 1) * limit;
|
||||
|
||||
const devices = await database.query(
|
||||
'SELECT * FROM devices WHERE device_type = ? ORDER BY created_at DESC LIMIT ? OFFSET ?',
|
||||
[deviceType, parseInt(limit), offset]
|
||||
);
|
||||
|
||||
const [countResult] = await database.query(
|
||||
'SELECT COUNT(*) as total FROM devices WHERE device_type = ?',
|
||||
[deviceType]
|
||||
);
|
||||
|
||||
const total = countResult.total;
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: devices,
|
||||
pagination: {
|
||||
page: parseInt(page),
|
||||
limit: parseInt(limit),
|
||||
total,
|
||||
pages: Math.ceil(total / limit)
|
||||
}
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Get devices by type error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to get devices by type'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
module.exports = router;
|
||||
45
routes/healing.js
Normal file
@ -0,0 +1,45 @@
|
||||
const express = require('express');
|
||||
const { protect } = require('../middleware/auth');
|
||||
const database = require('../config/database');
|
||||
const logger = require('../utils/logger');
|
||||
|
||||
const router = express.Router();
|
||||
|
||||
// Get healing actions
|
||||
router.get('/', protect, async (req, res) => {
|
||||
try {
|
||||
const { page = 1, limit = 50, status, device_id } = req.query;
|
||||
const offset = (page - 1) * limit;
|
||||
|
||||
let query = 'SELECT * FROM healing_actions WHERE 1=1';
|
||||
const params = [];
|
||||
|
||||
if (status) {
|
||||
query += ' AND status = ?';
|
||||
params.push(status);
|
||||
}
|
||||
|
||||
if (device_id) {
|
||||
query += ' AND device_id = ?';
|
||||
params.push(device_id);
|
||||
}
|
||||
|
||||
query += ' ORDER BY created_at DESC LIMIT ? OFFSET ?';
|
||||
params.push(parseInt(limit), offset);
|
||||
|
||||
const actions = await database.query(query, params);
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: actions
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Get healing actions error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to get healing actions'
|
||||
});
|
||||
}
|
||||
});
|
||||
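// Example (placeholder values): GET /api/healing?status=pending&device_id=sensor-001
// returns up to `limit` healing_actions rows for that device, newest first.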
|
||||
module.exports = router;
|
||||
40
routes/notifications.js
Normal file
@ -0,0 +1,40 @@
|
||||
const express = require('express');
|
||||
const { protect } = require('../middleware/auth');
|
||||
const database = require('../config/database');
|
||||
const logger = require('../utils/logger');
|
||||
|
||||
const router = express.Router();
|
||||
|
||||
// Get user notifications
|
||||
router.get('/', protect, async (req, res) => {
|
||||
try {
|
||||
const { page = 1, limit = 50, status } = req.query;
|
||||
const offset = (page - 1) * limit;
|
||||
|
||||
let query = 'SELECT * FROM notifications WHERE user_id = ?';
|
||||
const params = [req.user.id];
|
||||
|
||||
if (status) {
|
||||
query += ' AND status = ?';
|
||||
params.push(status);
|
||||
}
|
||||
|
||||
query += ' ORDER BY created_at DESC LIMIT ? OFFSET ?';
|
||||
params.push(parseInt(limit), offset);
|
||||
|
||||
const notifications = await database.query(query, params);
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: notifications
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Get notifications error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to get notifications'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
module.exports = router;
|
||||
44
routes/streamPipes.js
Normal file
@ -0,0 +1,44 @@
|
||||
const express = require('express');
|
||||
const { protect } = require('../middleware/auth');
|
||||
const streamPipesService = require('../services/streamPipesService');
|
||||
const logger = require('../utils/logger');
|
||||
|
||||
const router = express.Router();
|
||||
|
||||
// Get StreamPipes health status
|
||||
router.get('/health', protect, async (req, res) => {
|
||||
try {
|
||||
const health = await streamPipesService.healthCheck();
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: health
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('StreamPipes health check error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to check StreamPipes health'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Get data streams
|
||||
router.get('/streams', protect, async (req, res) => {
|
||||
try {
|
||||
const streams = await streamPipesService.getDataStreams();
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: streams
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Get StreamPipes streams error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to get data streams'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
module.exports = router;
|
||||
45
routes/suggestions.js
Normal file
@ -0,0 +1,45 @@
|
||||
const express = require('express');
|
||||
const { protect } = require('../middleware/auth');
|
||||
const database = require('../config/database');
|
||||
const logger = require('../utils/logger');
|
||||
|
||||
const router = express.Router();
|
||||
|
||||
// Get AI suggestions
|
||||
router.get('/', protect, async (req, res) => {
|
||||
try {
|
||||
const { page = 1, limit = 50, status, device_id } = req.query;
|
||||
const offset = (page - 1) * limit;
|
||||
|
||||
let query = 'SELECT * FROM ai_suggestions WHERE 1=1';
|
||||
const params = [];
|
||||
|
||||
if (status) {
|
||||
query += ' AND status = ?';
|
||||
params.push(status);
|
||||
}
|
||||
|
||||
if (device_id) {
|
||||
query += ' AND device_id = ?';
|
||||
params.push(device_id);
|
||||
}
|
||||
|
||||
query += ' ORDER BY created_at DESC LIMIT ? OFFSET ?';
|
||||
params.push(parseInt(limit), offset);
|
||||
|
||||
const suggestions = await database.query(query, params);
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: suggestions
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Get suggestions error:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
message: 'Failed to get suggestions'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
module.exports = router;
|
||||
200
scripts/migrate.js
Normal file
@ -0,0 +1,200 @@
|
||||
require('dotenv').config();
|
||||
const database = require('../config/database');
|
||||
const logger = require('../utils/logger');
|
||||
|
||||
async function runMigrations() {
|
||||
try {
|
||||
await database.connect();
|
||||
logger.info('Starting database migrations...');
|
||||
|
||||
// Create users table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS users (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
username VARCHAR(50) UNIQUE NOT NULL,
|
||||
email VARCHAR(100) UNIQUE NOT NULL,
|
||||
password_hash VARCHAR(255) NOT NULL,
|
||||
role ENUM('admin', 'operator', 'viewer') DEFAULT 'viewer',
|
||||
status ENUM('active', 'inactive') DEFAULT 'active',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
|
||||
// Create devices table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS devices (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
device_id VARCHAR(100) UNIQUE NOT NULL,
|
||||
name VARCHAR(100) NOT NULL,
|
||||
device_type VARCHAR(50) NOT NULL,
|
||||
status ENUM('online', 'offline') DEFAULT 'offline',
-- location, configuration and updated_at are referenced by routes/devices.js and the services
location VARCHAR(255),
configuration JSON,
ai_model_config JSON,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
|
||||
// Create device_data table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS device_data (
|
||||
id BIGINT AUTO_INCREMENT PRIMARY KEY,
|
||||
device_id VARCHAR(100) NOT NULL,
|
||||
raw_data JSON NOT NULL,
|
||||
timestamp DATETIME NOT NULL,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
|
||||
// Create alerts table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS alerts (
|
||||
id BIGINT AUTO_INCREMENT PRIMARY KEY,
|
||||
device_id VARCHAR(100),
|
||||
type VARCHAR(50) NOT NULL,
|
||||
severity ENUM('info', 'warning', 'critical') NOT NULL,
|
||||
message TEXT NOT NULL,
|
||||
status ENUM('active', 'resolved') DEFAULT 'active',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
|
||||
// Create alert_rules table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS alert_rules (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
name VARCHAR(100) NOT NULL,
|
||||
description TEXT,
|
||||
device_type VARCHAR(50),
|
||||
condition_type ENUM('threshold', 'anomaly', 'pattern') NOT NULL,
|
||||
condition_config JSON NOT NULL,
|
||||
severity ENUM('info', 'warning', 'critical') NOT NULL,
|
||||
actions JSON,
|
||||
status ENUM('active', 'inactive') DEFAULT 'active',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
|
||||
// Create healing_rules table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS healing_rules (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
name VARCHAR(100) NOT NULL,
|
||||
description TEXT,
|
||||
trigger_condition JSON NOT NULL,
|
||||
healing_actions JSON NOT NULL,
|
||||
priority INT DEFAULT 1,
|
||||
status ENUM('active', 'inactive') DEFAULT 'active',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
|
||||
// Create healing_actions table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS healing_actions (
|
||||
id BIGINT AUTO_INCREMENT PRIMARY KEY,
|
||||
rule_id INT,
|
||||
device_id VARCHAR(100),
|
||||
action_type VARCHAR(50) NOT NULL,
|
||||
action_config JSON NOT NULL,
|
||||
status ENUM('pending', 'in_progress', 'completed', 'failed') DEFAULT 'pending',
|
||||
result JSON,
|
||||
started_at TIMESTAMP NULL,
|
||||
completed_at TIMESTAMP NULL,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
|
||||
// Create ai_suggestions table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS ai_suggestions (
|
||||
id BIGINT AUTO_INCREMENT PRIMARY KEY,
|
||||
device_id VARCHAR(100),
|
||||
suggestion_type ENUM('maintenance', 'optimization', 'alert', 'healing') NOT NULL,
|
||||
title VARCHAR(200) NOT NULL,
|
||||
description TEXT NOT NULL,
|
||||
priority ENUM('low', 'medium', 'high', 'critical') DEFAULT 'medium',
|
||||
confidence_score DECIMAL(5,2),
|
||||
status ENUM('pending', 'approved', 'rejected', 'implemented') DEFAULT 'pending',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
|
||||
// Create notifications table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS notifications (
|
||||
id BIGINT AUTO_INCREMENT PRIMARY KEY,
|
||||
user_id INT,
|
||||
type VARCHAR(50) NOT NULL,
|
||||
title VARCHAR(200) NOT NULL,
|
||||
message TEXT NOT NULL,
|
||||
channel ENUM('email', 'sms', 'in_app') NOT NULL,
|
||||
status ENUM('pending', 'sent', 'failed') DEFAULT 'pending',
|
||||
sent_at TIMESTAMP NULL,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
|
||||
// Create device_thresholds table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS device_thresholds (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
device_id VARCHAR(100) NOT NULL,
|
||||
parameter_name VARCHAR(100) NOT NULL,
|
||||
min_value DECIMAL(10,2),
|
||||
max_value DECIMAL(10,2),
|
||||
warning_min DECIMAL(10,2),
|
||||
warning_max DECIMAL(10,2),
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
|
||||
// Create user_sessions table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS user_sessions (
|
||||
id BIGINT AUTO_INCREMENT PRIMARY KEY,
|
||||
user_id INT NOT NULL,
|
||||
session_token VARCHAR(255) UNIQUE NOT NULL,
|
||||
expires_at TIMESTAMP NOT NULL,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
|
||||
// Create audit_logs table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS audit_logs (
|
||||
id BIGINT AUTO_INCREMENT PRIMARY KEY,
|
||||
user_id INT,
|
||||
action VARCHAR(100) NOT NULL,
|
||||
resource_type VARCHAR(50),
|
||||
resource_id VARCHAR(100),
|
||||
details JSON,
|
||||
ip_address VARCHAR(45),
|
||||
user_agent TEXT,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
|
||||
// Create system_config table
|
||||
await database.query(`
|
||||
CREATE TABLE IF NOT EXISTS system_config (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
config_key VARCHAR(100) UNIQUE NOT NULL,
|
||||
config_value TEXT,
|
||||
description TEXT,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
|
||||
)
|
||||
`);
|
||||
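// Note: routes/devices.js and services/aiAgentService.js also write to device_controls
// and ai_analysis_results, which this migration does not create. The DDL below is a
// minimal sketch inferred from how those tables are used in this commit; the exact
// column types are assumptions.
await database.query(`
  CREATE TABLE IF NOT EXISTS device_controls (
    id BIGINT AUTO_INCREMENT PRIMARY KEY,
    device_id VARCHAR(100) NOT NULL,
    action VARCHAR(100) NOT NULL,
    parameters JSON,
    initiated_by INT,
    status VARCHAR(20) NOT NULL DEFAULT 'pending',
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
  )
`);

await database.query(`
  CREATE TABLE IF NOT EXISTS ai_analysis_results (
    id BIGINT AUTO_INCREMENT PRIMARY KEY,
    device_id VARCHAR(100) NOT NULL,
    anomaly_score DECIMAL(5,3),
    insights JSON,
    optimizations JSON,
    suggestions JSON,
    timestamp DATETIME,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
  )
`);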
|
||||
logger.info('Database migrations completed successfully');
|
||||
} catch (error) {
|
||||
logger.error('Migration failed:', error);
|
||||
process.exit(1);
|
||||
} finally {
|
||||
await database.disconnect();
|
||||
}
|
||||
}
|
||||
|
||||
if (require.main === module) {
|
||||
runMigrations();
|
||||
}
|
||||
|
||||
module.exports = { runMigrations };
|
||||
139
scripts/seed.js
Normal file
@ -0,0 +1,139 @@
|
||||
require('dotenv').config();
|
||||
const database = require('../config/database');
|
||||
const { hashPassword } = require('../middleware/auth');
|
||||
const logger = require('../utils/logger');
|
||||
|
||||
async function seedDatabase() {
|
||||
try {
|
||||
await database.connect();
|
||||
logger.info('Starting database seeding...');
|
||||
|
||||
// Create admin user
|
||||
const adminPassword = await hashPassword('admin123');
|
||||
await database.query(
|
||||
'INSERT INTO users (username, email, password_hash, role) VALUES (?, ?, ?, ?) ON DUPLICATE KEY UPDATE id=id',
|
||||
['admin', 'admin@aiagent.com', adminPassword, 'admin']
|
||||
);
|
||||
|
||||
// Create operator user
|
||||
const operatorPassword = await hashPassword('operator123');
|
||||
await database.query(
|
||||
'INSERT INTO users (username, email, password_hash, role) VALUES (?, ?, ?, ?) ON DUPLICATE KEY UPDATE id=id',
|
||||
['operator', 'operator@aiagent.com', operatorPassword, 'operator']
|
||||
);
|
||||
|
||||
// Create viewer user
|
||||
const viewerPassword = await hashPassword('viewer123');
|
||||
await database.query(
|
||||
'INSERT INTO users (username, email, password_hash, role) VALUES (?, ?, ?, ?) ON DUPLICATE KEY UPDATE id=id',
|
||||
['viewer', 'viewer@aiagent.com', viewerPassword, 'viewer']
|
||||
);
|
||||
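// The ON DUPLICATE KEY UPDATE id=id clause makes these user inserts idempotent:
// re-running the seed script leaves existing rows (matched on the unique username or
// email) untouched instead of failing.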
|
||||
// Create sample devices
|
||||
const sampleDevices = [
|
||||
{
|
||||
device_id: 'sensor-001',
|
||||
name: 'Temperature Sensor 1',
|
||||
device_type: 'temperature_sensor',
|
||||
status: 'online'
|
||||
},
|
||||
{
|
||||
device_id: 'sensor-002',
|
||||
name: 'Humidity Sensor 1',
|
||||
device_type: 'humidity_sensor',
|
||||
status: 'online'
|
||||
},
|
||||
{
|
||||
device_id: 'actuator-001',
|
||||
name: 'Smart Valve 1',
|
||||
device_type: 'actuator',
|
||||
status: 'online'
|
||||
},
|
||||
{
|
||||
device_id: 'gateway-001',
|
||||
name: 'IoT Gateway 1',
|
||||
device_type: 'gateway',
|
||||
status: 'online'
|
||||
}
|
||||
];
|
||||
|
||||
for (const device of sampleDevices) {
|
||||
await database.query(
|
||||
'INSERT INTO devices (device_id, name, device_type, status) VALUES (?, ?, ?, ?) ON DUPLICATE KEY UPDATE id=id',
|
||||
[device.device_id, device.name, device.device_type, device.status]
|
||||
);
|
||||
}
|
||||
|
||||
// Create sample device data
|
||||
const sampleData = [
|
||||
{
|
||||
device_id: 'sensor-001',
|
||||
raw_data: JSON.stringify({
|
||||
temperature: 24.5,
|
||||
humidity: 45.2,
|
||||
timestamp: new Date().toISOString()
|
||||
}),
|
||||
timestamp: new Date()
|
||||
},
|
||||
{
|
||||
device_id: 'sensor-002',
|
||||
raw_data: JSON.stringify({
|
||||
humidity: 52.8,
|
||||
pressure: 1013.25,
|
||||
timestamp: new Date().toISOString()
|
||||
}),
|
||||
timestamp: new Date()
|
||||
}
|
||||
];
|
||||
|
||||
for (const data of sampleData) {
|
||||
await database.query(
|
||||
'INSERT INTO device_data (device_id, raw_data, timestamp) VALUES (?, ?, ?)',
|
||||
[data.device_id, data.raw_data, data.timestamp]
|
||||
);
|
||||
}
|
||||
|
||||
// Create sample alerts
|
||||
const sampleAlerts = [
|
||||
{
|
||||
device_id: 'sensor-001',
|
||||
type: 'temperature_high',
|
||||
severity: 'warning',
|
||||
message: 'Temperature is above normal range',
|
||||
status: 'active'
|
||||
},
|
||||
{
|
||||
device_id: 'sensor-002',
|
||||
type: 'humidity_low',
|
||||
severity: 'info',
|
||||
message: 'Humidity is below normal range',
|
||||
status: 'active'
|
||||
}
|
||||
];
|
||||
|
||||
for (const alert of sampleAlerts) {
|
||||
await database.query(
|
||||
'INSERT INTO alerts (device_id, type, severity, message, status) VALUES (?, ?, ?, ?, ?)',
|
||||
[alert.device_id, alert.type, alert.severity, alert.message, alert.status]
|
||||
);
|
||||
}
|
||||
|
||||
logger.info('Database seeding completed successfully');
|
||||
logger.info('Default users created:');
|
||||
logger.info('- admin/admin123 (admin@aiagent.com)');
|
||||
logger.info('- operator/operator123 (operator@aiagent.com)');
|
||||
logger.info('- viewer/viewer123 (viewer@aiagent.com)');
|
||||
|
||||
} catch (error) {
|
||||
logger.error('Database seeding failed:', error);
|
||||
process.exit(1);
|
||||
} finally {
|
||||
await database.disconnect();
|
||||
}
|
||||
}
|
||||
|
||||
if (require.main === module) {
|
||||
seedDatabase();
|
||||
}
|
||||
|
||||
module.exports = { seedDatabase };
|
||||
185
server.js
Normal file
@ -0,0 +1,185 @@
|
||||
const express = require('express');
|
||||
const cors = require('cors');
|
||||
const helmet = require('helmet');
|
||||
const compression = require('compression');
|
||||
const morgan = require('morgan');
|
||||
const rateLimit = require('express-rate-limit');
|
||||
const { createServer } = require('http');
|
||||
const { Server } = require('socket.io');
|
||||
require('dotenv').config();
|
||||
|
||||
const logger = require('./utils/logger');
|
||||
const database = require('./config/database');
|
||||
const redis = require('./config/redis');
|
||||
const socketHandler = require('./socket/socketHandler');
|
||||
const errorHandler = require('./middleware/errorHandler');
|
||||
|
||||
// Import routes
|
||||
const authRoutes = require('./routes/auth');
|
||||
const deviceRoutes = require('./routes/devices');
|
||||
const alertRoutes = require('./routes/alerts');
|
||||
const analyticsRoutes = require('./routes/analytics');
|
||||
const healingRoutes = require('./routes/healing');
|
||||
const suggestionsRoutes = require('./routes/suggestions');
|
||||
const notificationsRoutes = require('./routes/notifications');
|
||||
const streamPipesRoutes = require('./routes/streamPipes');
|
||||
|
||||
// Import services
|
||||
const streamPipesService = require('./services/streamPipesService');
|
||||
const aiAgentService = require('./services/aiAgentService');
|
||||
const alertService = require('./services/alertService');
|
||||
const healingService = require('./services/healingService');
|
||||
|
||||
const app = express();
|
||||
const server = createServer(app);
|
||||
const io = new Server(server, {
|
||||
cors: {
|
||||
origin: process.env.CORS_ORIGIN || "http://localhost:3000",
|
||||
methods: ["GET", "POST"]
|
||||
}
|
||||
});
|
||||
|
||||
// Rate limiting
|
||||
const limiter = rateLimit({
|
||||
windowMs: parseInt(process.env.RATE_LIMIT_WINDOW_MS) || 15 * 60 * 1000, // 15 minutes
|
||||
max: parseInt(process.env.RATE_LIMIT_MAX_REQUESTS) || 100, // limit each IP to 100 requests per windowMs
|
||||
message: 'Too many requests from this IP, please try again later.'
|
||||
});
|
||||
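// With the defaults above, each client IP is limited to 100 requests per 15-minute
// window across every route, because the limiter is registered globally below.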
|
||||
// Middleware
|
||||
app.use(helmet());
|
||||
app.use(compression());
|
||||
app.use(cors({
|
||||
origin: process.env.CORS_ORIGIN || "http://localhost:3000",
|
||||
credentials: true
|
||||
}));
|
||||
app.use(limiter);
|
||||
app.use(morgan('combined', { stream: { write: message => logger.info(message.trim()) } }));
|
||||
app.use(express.json({ limit: '10mb' }));
|
||||
app.use(express.urlencoded({ extended: true, limit: '10mb' }));
|
||||
|
||||
// Health check endpoint
|
||||
app.get('/health', (req, res) => {
|
||||
res.status(200).json({
|
||||
status: 'OK',
|
||||
timestamp: new Date().toISOString(),
|
||||
uptime: process.uptime(),
|
||||
environment: process.env.NODE_ENV
|
||||
});
|
||||
});
|
||||
|
||||
// API Routes
|
||||
app.use('/api/auth', authRoutes);
|
||||
app.use('/api/devices', deviceRoutes);
|
||||
app.use('/api/alerts', alertRoutes);
|
||||
app.use('/api/analytics', analyticsRoutes);
|
||||
app.use('/api/healing', healingRoutes);
|
||||
app.use('/api/suggestions', suggestionsRoutes);
|
||||
app.use('/api/notifications', notificationsRoutes);
|
||||
app.use('/api/streampipes', streamPipesRoutes);
|
||||
|
||||
// Socket.io connection handling
|
||||
socketHandler(io);
|
||||
|
||||
// Error handling middleware
|
||||
app.use(errorHandler);
|
||||
|
||||
// 404 handler
|
||||
app.use('*', (req, res) => {
|
||||
res.status(404).json({
|
||||
success: false,
|
||||
message: 'Route not found'
|
||||
});
|
||||
});
|
||||
|
||||
const PORT = process.env.PORT || 5000;
|
||||
|
||||
// Initialize services
|
||||
async function initializeServices() {
|
||||
try {
|
||||
// Initialize database connection
|
||||
await database.connect();
|
||||
logger.info('Database connected successfully');
|
||||
|
||||
// Initialize Redis connection
|
||||
await redis.connect();
|
||||
logger.info('Redis connected successfully');
|
||||
|
||||
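// Each optional service below is initialized in its own try/catch so that a failure in
// StreamPipes, the AI agent, alerts, or healing only logs a warning and the HTTP API
// still starts; the database and Redis connections above remain hard requirements.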
// Initialize StreamPipes service
|
||||
try {
|
||||
await streamPipesService.initialize();
|
||||
logger.info('StreamPipes service initialized');
|
||||
} catch (error) {
|
||||
logger.warn('StreamPipes service initialization failed - continuing without it');
|
||||
}
|
||||
|
||||
// Initialize AI Agent service
|
||||
try {
|
||||
await aiAgentService.initialize();
|
||||
logger.info('AI Agent service initialized');
|
||||
} catch (error) {
|
||||
logger.warn('AI Agent service initialization failed - continuing without it');
|
||||
}
|
||||
|
||||
// Initialize Alert service
|
||||
try {
|
||||
await alertService.initialize();
|
||||
logger.info('Alert service initialized');
|
||||
} catch (error) {
|
||||
logger.warn('Alert service initialization failed - continuing without it');
|
||||
}
|
||||
|
||||
// Initialize Healing service
|
||||
try {
|
||||
await healingService.initialize();
|
||||
logger.info('Healing service initialized');
|
||||
} catch (error) {
|
||||
logger.warn('Healing service initialization failed - continuing without it');
|
||||
}
|
||||
|
||||
// Start the server
|
||||
server.listen(PORT, () => {
|
||||
logger.info(`AI Agent Backend server running on port ${PORT}`);
|
||||
logger.info(`Environment: ${process.env.NODE_ENV}`);
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
logger.error('Failed to initialize services:', error);
|
||||
process.exit(1);
|
||||
}
|
||||
}
|
||||
|
||||
// Graceful shutdown
|
||||
process.on('SIGTERM', async () => {
|
||||
logger.info('SIGTERM received, shutting down gracefully');
|
||||
await database.disconnect();
|
||||
await redis.disconnect();
|
||||
server.close(() => {
|
||||
logger.info('Server closed');
|
||||
process.exit(0);
|
||||
});
|
||||
});
|
||||
|
||||
process.on('SIGINT', async () => {
|
||||
logger.info('SIGINT received, shutting down gracefully');
|
||||
await database.disconnect();
|
||||
await redis.disconnect();
|
||||
server.close(() => {
|
||||
logger.info('Server closed');
|
||||
process.exit(0);
|
||||
});
|
||||
});
|
||||
|
||||
// Handle unhandled promise rejections
|
||||
process.on('unhandledRejection', (reason, promise) => {
|
||||
logger.error('Unhandled Rejection at:', promise, 'reason:', reason);
|
||||
});
|
||||
|
||||
// Handle uncaught exceptions
|
||||
process.on('uncaughtException', (error) => {
|
||||
logger.error('Uncaught Exception:', error);
|
||||
process.exit(1);
|
||||
});
|
||||
|
||||
// Start the application
|
||||
initializeServices();
|
||||
551
services/aiAgentService.js
Normal file
@ -0,0 +1,551 @@
|
||||
const logger = require('../utils/logger');
|
||||
const database = require('../config/database');
|
||||
const redis = require('../config/redis');
|
||||
const healingService = require('./healingService');
|
||||
const alertService = require('./alertService');
|
||||
|
||||
class AIAgentService {
|
||||
constructor() {
|
||||
this.learningRate = parseFloat(process.env.AI_LEARNING_RATE) || 0.1;
|
||||
this.anomalyThreshold = parseFloat(process.env.AI_THRESHOLD_ANOMALY) || 0.8;
|
||||
this.healingEnabled = process.env.AI_HEALING_ENABLED === 'true';
|
||||
this.deviceModels = new Map();
|
||||
this.isInitialized = false;
|
||||
}
|
||||
|
||||
async initialize() {
|
||||
try {
|
||||
await this.loadDeviceModels();
|
||||
await this.setupPeriodicAnalysis();
|
||||
this.isInitialized = true;
|
||||
logger.info('AI Agent service initialized successfully');
|
||||
} catch (error) {
|
||||
logger.error('Failed to initialize AI Agent service:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async loadDeviceModels() {
|
||||
try {
|
||||
// Load device-specific AI models from database
|
||||
const devices = await database.query('SELECT id, device_type, ai_model_config FROM devices WHERE ai_model_config IS NOT NULL');
|
||||
|
||||
for (const device of devices) {
|
||||
if (device.ai_model_config) {
|
||||
const config = JSON.parse(device.ai_model_config);
|
||||
this.deviceModels.set(device.id, {
|
||||
deviceType: device.device_type,
|
||||
config: config,
|
||||
lastUpdated: new Date()
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
logger.info(`Loaded AI models for ${devices.length} devices`);
|
||||
} catch (error) {
|
||||
logger.error('Failed to load device models:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async setupPeriodicAnalysis() {
|
||||
// Run periodic analysis every 5 minutes
|
||||
setInterval(async () => {
|
||||
await this.runPeriodicAnalysis();
|
||||
}, 5 * 60 * 1000);
|
||||
}
|
||||
|
||||
async processDeviceData(deviceId, data, timestamp) {
|
||||
try {
|
||||
const startTime = Date.now();
|
||||
|
||||
// Analyze data for anomalies
|
||||
const anomalyScore = await this.detectAnomalies(deviceId, data);
|
||||
|
||||
// Generate insights
|
||||
const insights = await this.generateInsights(deviceId, data, timestamp);
|
||||
|
||||
// Check for optimization opportunities
|
||||
const optimizations = await this.findOptimizations(deviceId, data);
|
||||
|
||||
// Update device model
|
||||
await this.updateDeviceModel(deviceId, data, anomalyScore);
|
||||
|
||||
// Generate suggestions
|
||||
const suggestions = await this.generateSuggestions(deviceId, data, insights, optimizations);
|
||||
|
||||
// Store analysis results
|
||||
await this.storeAnalysisResults(deviceId, {
|
||||
anomalyScore,
|
||||
insights,
|
||||
optimizations,
|
||||
suggestions,
|
||||
timestamp
|
||||
});
|
||||
|
||||
const processingTime = Date.now() - startTime;
|
||||
logger.logAIAction('device_data_analysis', {
|
||||
deviceId,
|
||||
anomalyScore,
|
||||
insightsCount: insights.length,
|
||||
optimizationsCount: optimizations.length,
|
||||
suggestionsCount: suggestions.length
|
||||
}, anomalyScore);
|
||||
|
||||
logger.logPerformance('ai_data_processing', processingTime, {
|
||||
deviceId,
|
||||
dataSize: JSON.stringify(data).length
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
logger.error('Error processing device data with AI:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async detectAnomalies(deviceId, data) {
|
||||
try {
|
||||
// Get historical data for comparison
|
||||
const historicalData = await this.getHistoricalData(deviceId, 100);
|
||||
|
||||
if (historicalData.length < 10) {
|
||||
return 0.5; // Neutral score if insufficient data
|
||||
}
|
||||
|
||||
// Calculate statistical measures
|
||||
const values = historicalData.map(d => this.extractNumericValues(d.raw_data)).flat();
|
||||
const mean = values.reduce((a, b) => a + b, 0) / values.length;
|
||||
const variance = values.reduce((a, b) => a + Math.pow(b - mean, 2), 0) / values.length;
|
||||
const stdDev = Math.sqrt(variance);
|
||||
|
||||
// Calculate anomaly score for current data
|
||||
const currentValues = this.extractNumericValues(data);
|
||||
const anomalyScores = currentValues.map(value => {
|
||||
const zScore = Math.abs((value - mean) / stdDev);
|
||||
return Math.min(zScore / 3, 1); // Normalize to 0-1
|
||||
});
|
||||
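// Dividing the z-score by 3 maps the usual three-sigma rule onto [0, 1]: readings more
// than three standard deviations from the historical mean saturate at 1. A near-constant
// history (stdDev of ~0) degenerates here, yielding scores of 1 or NaN.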
|
||||
const avgAnomalyScore = anomalyScores.reduce((a, b) => a + b, 0) / anomalyScores.length;
|
||||
|
||||
// Trigger alert if anomaly score is high
|
||||
if (avgAnomalyScore > this.anomalyThreshold) {
|
||||
await alertService.createAlert({
|
||||
deviceId,
|
||||
type: 'anomaly_detected',
|
||||
severity: avgAnomalyScore > 0.9 ? 'critical' : 'warning',
|
||||
message: `Anomaly detected in device ${deviceId} with score ${avgAnomalyScore.toFixed(3)}`,
|
||||
data: { anomalyScore: avgAnomalyScore, values: currentValues }
|
||||
});
|
||||
}
|
||||
|
||||
return avgAnomalyScore;
|
||||
} catch (error) {
|
||||
logger.error('Error detecting anomalies:', error);
|
||||
return 0.5;
|
||||
}
|
||||
}
|
||||
|
||||
async generateInsights(deviceId, data, timestamp) {
|
||||
try {
|
||||
const insights = [];
|
||||
|
||||
// Analyze data patterns
|
||||
const patterns = await this.analyzePatterns(deviceId, data);
|
||||
if (patterns.length > 0) {
|
||||
insights.push({
|
||||
type: 'pattern_detected',
|
||||
description: `Detected ${patterns.length} new patterns in device behavior`,
|
||||
patterns: patterns
|
||||
});
|
||||
}
|
||||
|
||||
// Analyze performance trends
|
||||
const trends = await this.analyzeTrends(deviceId, data);
|
||||
if (trends.length > 0) {
|
||||
insights.push({
|
||||
type: 'trend_detected',
|
||||
description: `Identified ${trends.length} performance trends`,
|
||||
trends: trends
|
||||
});
|
||||
}
|
||||
|
||||
// Analyze efficiency
|
||||
const efficiency = await this.analyzeEfficiency(deviceId, data);
|
||||
if (efficiency.score < 0.7) {
|
||||
insights.push({
|
||||
type: 'efficiency_issue',
|
||||
description: `Device efficiency is ${(efficiency.score * 100).toFixed(1)}%`,
|
||||
recommendations: efficiency.recommendations
|
||||
});
|
||||
}
|
||||
|
||||
return insights;
|
||||
} catch (error) {
|
||||
logger.error('Error generating insights:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async findOptimizations(deviceId, data) {
|
||||
try {
|
||||
const optimizations = [];
|
||||
|
||||
// Check for energy optimization opportunities
|
||||
const energyOpt = await this.checkEnergyOptimization(deviceId, data);
|
||||
if (energyOpt.opportunity) {
|
||||
optimizations.push({
|
||||
type: 'energy_optimization',
|
||||
description: energyOpt.description,
|
||||
potentialSavings: energyOpt.savings,
|
||||
action: energyOpt.action
|
||||
});
|
||||
}
|
||||
|
||||
// Check for performance optimization
|
||||
const perfOpt = await this.checkPerformanceOptimization(deviceId, data);
|
||||
if (perfOpt.opportunity) {
|
||||
optimizations.push({
|
||||
type: 'performance_optimization',
|
||||
description: perfOpt.description,
|
||||
improvement: perfOpt.improvement,
|
||||
action: perfOpt.action
|
||||
});
|
||||
}
|
||||
|
||||
// Check for maintenance optimization
|
||||
const maintOpt = await this.checkMaintenanceOptimization(deviceId, data);
|
||||
if (maintOpt.opportunity) {
|
||||
optimizations.push({
|
||||
type: 'maintenance_optimization',
|
||||
description: maintOpt.description,
|
||||
benefit: maintOpt.benefit,
|
||||
action: maintOpt.action
|
||||
});
|
||||
}
|
||||
|
||||
return optimizations;
|
||||
} catch (error) {
|
||||
logger.error('Error finding optimizations:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async generateSuggestions(deviceId, data, insights, optimizations) {
|
||||
try {
|
||||
const suggestions = [];
|
||||
|
||||
// Generate suggestions based on insights
|
||||
for (const insight of insights) {
|
||||
const suggestion = await this.createSuggestionFromInsight(deviceId, insight);
|
||||
if (suggestion) {
|
||||
suggestions.push(suggestion);
|
||||
}
|
||||
}
|
||||
|
||||
// Generate suggestions based on optimizations
|
||||
for (const optimization of optimizations) {
|
||||
const suggestion = await this.createSuggestionFromOptimization(deviceId, optimization);
|
||||
if (suggestion) {
|
||||
suggestions.push(suggestion);
|
||||
}
|
||||
}
|
||||
|
||||
// Generate proactive suggestions
|
||||
const proactiveSuggestions = await this.generateProactiveSuggestions(deviceId, data);
|
||||
suggestions.push(...proactiveSuggestions);
|
||||
|
||||
// Store suggestions in database
|
||||
for (const suggestion of suggestions) {
|
||||
await this.storeSuggestion(deviceId, suggestion);
|
||||
}
|
||||
|
||||
return suggestions;
|
||||
} catch (error) {
|
||||
logger.error('Error generating suggestions:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async createSuggestionFromInsight(deviceId, insight) {
|
||||
try {
|
||||
const suggestion = {
|
||||
deviceId,
|
||||
type: 'ai_insight',
|
||||
title: `Action based on ${insight.type}`,
|
||||
description: insight.description,
|
||||
priority: insight.type === 'efficiency_issue' ? 'high' : 'medium',
|
||||
category: 'optimization',
|
||||
confidence: 0.8,
|
||||
actions: [],
|
||||
created_at: new Date()
|
||||
};
|
||||
|
||||
// Add specific actions based on insight type
|
||||
switch (insight.type) {
|
||||
case 'efficiency_issue':
|
||||
suggestion.actions = insight.recommendations;
|
||||
break;
|
||||
case 'pattern_detected':
|
||||
suggestion.actions = ['Monitor pattern evolution', 'Adjust thresholds if needed'];
|
||||
break;
|
||||
case 'trend_detected':
|
||||
suggestion.actions = ['Review trend direction', 'Plan preventive measures'];
|
||||
break;
|
||||
}
|
||||
|
||||
return suggestion;
|
||||
} catch (error) {
|
||||
logger.error('Error creating suggestion from insight:', error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
async createSuggestionFromOptimization(deviceId, optimization) {
|
||||
try {
|
||||
const suggestion = {
|
||||
deviceId,
|
||||
type: 'ai_optimization',
|
||||
title: optimization.description,
|
||||
description: `AI detected optimization opportunity: ${optimization.description}`,
|
||||
priority: 'medium',
|
||||
category: 'optimization',
|
||||
confidence: 0.7,
|
||||
actions: [optimization.action],
|
||||
metadata: {
|
||||
optimizationType: optimization.type,
|
||||
potentialBenefit: optimization.potentialSavings || optimization.improvement || optimization.benefit
|
||||
},
|
||||
created_at: new Date()
|
||||
};
|
||||
|
||||
return suggestion;
|
||||
} catch (error) {
|
||||
logger.error('Error creating suggestion from optimization:', error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
async generateProactiveSuggestions(deviceId, data) {
|
||||
try {
|
||||
const suggestions = [];
|
||||
|
||||
// Check device health
|
||||
const health = await this.assessDeviceHealth(deviceId, data);
|
||||
if (health.score < 0.6) {
|
||||
suggestions.push({
|
||||
deviceId,
|
||||
type: 'health_warning',
|
||||
title: 'Device Health Alert',
|
||||
description: `Device health is at ${(health.score * 100).toFixed(1)}%. Consider maintenance.`,
|
||||
priority: 'high',
|
||||
category: 'maintenance',
|
||||
confidence: 0.9,
|
||||
actions: ['Schedule maintenance', 'Check device logs', 'Review recent alerts'],
|
||||
created_at: new Date()
|
||||
});
|
||||
}
|
||||
|
||||
// Check for predictive maintenance
|
||||
const maintenance = await this.predictMaintenance(deviceId, data);
|
||||
if (maintenance.needed) {
|
||||
suggestions.push({
|
||||
deviceId,
|
||||
type: 'predictive_maintenance',
|
||||
title: 'Predictive Maintenance Recommended',
|
||||
description: `Maintenance recommended within ${maintenance.timeframe} days`,
|
||||
priority: 'medium',
|
||||
category: 'maintenance',
|
||||
confidence: maintenance.confidence,
|
||||
actions: ['Schedule maintenance', 'Order parts if needed'],
|
||||
metadata: { timeframe: maintenance.timeframe },
|
||||
created_at: new Date()
|
||||
});
|
||||
}
|
||||
|
||||
return suggestions;
|
||||
} catch (error) {
|
||||
logger.error('Error generating proactive suggestions:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async storeSuggestion(deviceId, suggestion) {
|
||||
try {
|
||||
await database.query(
|
||||
`INSERT INTO ai_suggestions
|
||||
(device_id, type, title, description, priority, category, confidence, actions, metadata, created_at)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
|
||||
[
|
||||
deviceId,
|
||||
suggestion.type,
|
||||
suggestion.title,
|
||||
suggestion.description,
|
||||
suggestion.priority,
|
||||
suggestion.category,
|
||||
suggestion.confidence,
|
||||
JSON.stringify(suggestion.actions),
|
||||
JSON.stringify(suggestion.metadata || {}),
|
||||
suggestion.created_at
|
||||
]
|
||||
);
|
||||
} catch (error) {
|
||||
logger.error('Failed to store suggestion:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async runPeriodicAnalysis() {
|
||||
try {
|
||||
logger.info('Running periodic AI analysis...');
|
||||
|
||||
// Get all online devices (the devices.status enum is 'online' / 'offline')
const devices = await database.query('SELECT device_id FROM devices WHERE status = "online"');

for (const device of devices) {
// Get recent data for analysis (device_data is keyed by the string device_id)
const recentData = await database.query(
'SELECT * FROM device_data WHERE device_id = ? ORDER BY timestamp DESC LIMIT 10',
[device.device_id]
);

if (recentData.length > 0) {
// Run comprehensive analysis
await this.runComprehensiveAnalysis(device.device_id, recentData);
|
||||
}
|
||||
}
|
||||
|
||||
logger.info('Periodic AI analysis completed');
|
||||
} catch (error) {
|
||||
logger.error('Error in periodic analysis:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async runComprehensiveAnalysis(deviceId, data) {
|
||||
try {
|
||||
// Analyze device behavior patterns
|
||||
const behaviorPatterns = await this.analyzeBehaviorPatterns(deviceId, data);
|
||||
|
||||
// Check for system-wide optimizations
|
||||
const systemOptimizations = await this.findSystemOptimizations(deviceId, data);
|
||||
|
||||
// Generate long-term recommendations
|
||||
const longTermRecommendations = await this.generateLongTermRecommendations(deviceId, data);
|
||||
|
||||
// Store comprehensive analysis
|
||||
await this.storeComprehensiveAnalysis(deviceId, {
|
||||
behaviorPatterns,
|
||||
systemOptimizations,
|
||||
longTermRecommendations,
|
||||
timestamp: new Date()
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
logger.error('Error in comprehensive analysis:', error);
|
||||
}
|
||||
}
|
||||
|
||||
// Helper methods
|
||||
extractNumericValues(data) {
|
||||
const values = [];
|
||||
const extract = (obj) => {
|
||||
for (const key in obj) {
|
||||
if (typeof obj[key] === 'number') {
|
||||
values.push(obj[key]);
|
||||
} else if (typeof obj[key] === 'object' && obj[key] !== null) {
|
||||
extract(obj[key]);
|
||||
}
|
||||
}
|
||||
};
|
||||
extract(data);
|
||||
return values;
|
||||
}
|
||||
|
||||
async getHistoricalData(deviceId, limit) {
|
||||
try {
|
||||
return await database.query(
|
||||
'SELECT raw_data FROM device_data WHERE device_id = ? ORDER BY timestamp DESC LIMIT ?',
|
||||
[deviceId, limit]
|
||||
);
|
||||
} catch (error) {
|
||||
logger.error('Error getting historical data:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async updateDeviceModel(deviceId, data, anomalyScore) {
|
||||
try {
|
||||
// Update device model with new data
|
||||
const model = this.deviceModels.get(deviceId) || {
|
||||
deviceType: 'unknown',
|
||||
config: {},
|
||||
lastUpdated: new Date()
|
||||
};
|
||||
|
||||
// Update model parameters based on new data
|
||||
model.lastUpdated = new Date();
|
||||
model.lastAnomalyScore = anomalyScore;
|
||||
|
||||
this.deviceModels.set(deviceId, model);
|
||||
|
||||
// Store updated model in database
|
||||
await database.query(
|
||||
'UPDATE devices SET ai_model_config = ?, updated_at = NOW() WHERE id = ?',
|
||||
[JSON.stringify(model), deviceId]
|
||||
);
|
||||
} catch (error) {
|
||||
logger.error('Error updating device model:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async storeAnalysisResults(deviceId, results) {
|
||||
try {
|
||||
await database.query(
|
||||
`INSERT INTO ai_analysis_results
|
||||
(device_id, anomaly_score, insights, optimizations, suggestions, timestamp, created_at)
|
||||
VALUES (?, ?, ?, ?, ?, ?, NOW())`,
|
||||
[
|
||||
deviceId,
|
||||
results.anomalyScore,
|
||||
JSON.stringify(results.insights),
|
||||
JSON.stringify(results.optimizations),
|
||||
JSON.stringify(results.suggestions),
|
||||
results.timestamp
|
||||
]
|
||||
);
|
||||
} catch (error) {
|
||||
logger.error('Failed to store analysis results:', error);
|
||||
}
|
||||
}
|
||||
|
||||
// Placeholder methods for complex AI operations
|
||||
async analyzePatterns(deviceId, data) { return []; }
|
||||
async analyzeTrends(deviceId, data) { return []; }
|
||||
async analyzeEfficiency(deviceId, data) { return { score: 0.8, recommendations: [] }; }
|
||||
async checkEnergyOptimization(deviceId, data) { return { opportunity: false }; }
|
||||
async checkPerformanceOptimization(deviceId, data) { return { opportunity: false }; }
|
||||
async checkMaintenanceOptimization(deviceId, data) { return { opportunity: false }; }
|
||||
async assessDeviceHealth(deviceId, data) { return { score: 0.8 }; }
|
||||
async predictMaintenance(deviceId, data) { return { needed: false }; }
|
||||
async analyzeBehaviorPatterns(deviceId, data) { return []; }
|
||||
async findSystemOptimizations(deviceId, data) { return []; }
|
||||
async generateLongTermRecommendations(deviceId, data) { return []; }
|
||||
async storeComprehensiveAnalysis(deviceId, analysis) { }
|
||||
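// A minimal sketch of how analyzeTrends could be filled in (an assumption, not part of
// the original implementation): fit a least-squares slope over recent numeric readings
// and report a trend when the slope is clearly non-zero (the 0.01 cut-off is arbitrary).
//
//   async analyzeTrends(deviceId, data) {
//     const history = await this.getHistoricalData(deviceId, 50);
//     const values = history.map(d => this.extractNumericValues(d.raw_data)).flat();
//     if (values.length < 10) return [];
//     const n = values.length;
//     const xMean = (n - 1) / 2;
//     const yMean = values.reduce((a, b) => a + b, 0) / n;
//     let num = 0, den = 0;
//     values.forEach((y, x) => { num += (x - xMean) * (y - yMean); den += (x - xMean) ** 2; });
//     const slope = den === 0 ? 0 : num / den;
//     if (Math.abs(slope) < 0.01) return [];
//     return [{ metric: 'aggregate', direction: slope > 0 ? 'rising' : 'falling', slope }];
//   }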
|
||||
async healthCheck() {
|
||||
try {
|
||||
return {
|
||||
status: this.isInitialized ? 'healthy' : 'not_initialized',
|
||||
message: this.isInitialized ? 'AI Agent service is healthy' : 'Service not initialized',
|
||||
deviceModels: this.deviceModels.size,
|
||||
learningRate: this.learningRate,
|
||||
anomalyThreshold: this.anomalyThreshold,
|
||||
healingEnabled: this.healingEnabled
|
||||
};
|
||||
} catch (error) {
|
||||
return {
|
||||
status: 'unhealthy',
|
||||
message: 'AI Agent service health check failed',
|
||||
error: error.message
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = new AIAgentService();
|
||||
568
services/alertService.js
Normal file
@ -0,0 +1,568 @@
|
||||
const logger = require('../utils/logger');
|
||||
const database = require('../config/database');
|
||||
const redis = require('../config/redis');
|
||||
const notificationService = require('./notificationService');
|
||||
|
||||
class AlertService {
|
||||
constructor() {
|
||||
this.alertRules = new Map();
|
||||
this.isInitialized = false;
|
||||
}
|
||||
|
||||
async initialize() {
|
||||
try {
|
||||
await this.loadAlertRules();
|
||||
await this.setupAlertMonitoring();
|
||||
this.isInitialized = true;
|
||||
logger.info('Alert service initialized successfully');
|
||||
} catch (error) {
|
||||
logger.error('Failed to initialize Alert service:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async loadAlertRules() {
|
||||
try {
|
||||
const rules = await database.query('SELECT * FROM alert_rules WHERE status = "active"');
|
||||
|
||||
for (const rule of rules) {
|
||||
this.alertRules.set(rule.id, {
|
||||
...rule,
|
||||
conditions: JSON.parse(rule.conditions),
|
||||
actions: JSON.parse(rule.actions)
|
||||
});
|
||||
}
|
||||
|
||||
logger.info(`Loaded ${rules.length} alert rules`);
|
||||
} catch (error) {
|
||||
logger.error('Failed to load alert rules:', error);
|
||||
}
|
||||
}
|
||||
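// Illustrative only: the loader above assumes alert_rules rows store `conditions` and
// `actions` as JSON strings. A hypothetical row (not taken from the actual schema) might be:
//
//   { id: 1, device_id: null, alert_type: 'high_temperature', severity: 'warning',
//     category: 'threshold', message: 'Temperature above safe range', status: 'active',
//     conditions: '[{"field":"sensors.temperature","operator":"gt","value":75}]',
//     actions: '[{"type":"email","recipients":["ops@example.com"]}]' }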
|
||||
async setupAlertMonitoring() {
|
||||
// Monitor for alert rule updates every 30 seconds
|
||||
setInterval(async () => {
|
||||
await this.refreshAlertRules();
|
||||
}, 30 * 1000);
|
||||
}
|
||||
|
||||
async checkDeviceAlerts(deviceId, data, timestamp) {
|
||||
try {
|
||||
const alerts = [];
|
||||
|
||||
// Check predefined alert rules
|
||||
for (const [ruleId, rule] of this.alertRules) {
|
||||
if (rule.device_id === deviceId || rule.device_id === null) {
|
||||
const alert = await this.evaluateAlertRule(rule, data, timestamp);
|
||||
if (alert) {
|
||||
alerts.push(alert);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Check threshold-based alerts
|
||||
const thresholdAlerts = await this.checkThresholdAlerts(deviceId, data, timestamp);
|
||||
alerts.push(...thresholdAlerts);
|
||||
|
||||
// Check anomaly-based alerts
|
||||
const anomalyAlerts = await this.checkAnomalyAlerts(deviceId, data, timestamp);
|
||||
alerts.push(...anomalyAlerts);
|
||||
|
||||
// Process and store alerts
|
||||
for (const alert of alerts) {
|
||||
await this.processAlert(alert);
|
||||
}
|
||||
|
||||
return alerts;
|
||||
} catch (error) {
|
||||
logger.error('Error checking device alerts:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async evaluateAlertRule(rule, data, timestamp) {
|
||||
try {
|
||||
const conditions = rule.conditions;
|
||||
let allConditionsMet = true;
|
||||
|
||||
for (const condition of conditions) {
|
||||
const value = this.extractValueFromData(data, condition.field);
|
||||
|
||||
if (!this.evaluateCondition(value, condition.operator, condition.value)) {
|
||||
allConditionsMet = false;
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
if (allConditionsMet) {
|
||||
return {
|
||||
deviceId: rule.device_id,
|
||||
type: rule.alert_type,
|
||||
severity: rule.severity,
|
||||
message: rule.message,
|
||||
category: rule.category,
|
||||
data: data,
|
||||
timestamp: timestamp,
|
||||
ruleId: rule.id,
|
||||
actions: rule.actions
|
||||
};
|
||||
}
|
||||
|
||||
return null;
|
||||
} catch (error) {
|
||||
logger.error('Error evaluating alert rule:', error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
async checkThresholdAlerts(deviceId, data, timestamp) {
|
||||
try {
|
||||
const alerts = [];
|
||||
|
||||
// Get device thresholds
|
||||
const thresholds = await database.query(
|
||||
'SELECT * FROM device_thresholds WHERE device_id = ? AND status = "active"',
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
for (const threshold of thresholds) {
|
||||
const value = this.extractValueFromData(data, threshold.metric);
|
||||
|
||||
if (value !== null && value !== undefined) {
|
||||
let alert = null;
|
||||
|
||||
if (threshold.operator === 'gt' && value > threshold.value) {
|
||||
alert = {
|
||||
deviceId,
|
||||
type: 'threshold_exceeded',
|
||||
severity: threshold.severity,
|
||||
message: `${threshold.metric} exceeded threshold: ${value} > ${threshold.value}`,
|
||||
category: 'threshold',
|
||||
data: { metric: threshold.metric, value, threshold: threshold.value },
|
||||
timestamp,
|
||||
thresholdId: threshold.id
|
||||
};
|
||||
} else if (threshold.operator === 'lt' && value < threshold.value) {
|
||||
alert = {
|
||||
deviceId,
|
||||
type: 'threshold_below',
|
||||
severity: threshold.severity,
|
||||
message: `${threshold.metric} below threshold: ${value} < ${threshold.value}`,
|
||||
category: 'threshold',
|
||||
data: { metric: threshold.metric, value, threshold: threshold.value },
|
||||
timestamp,
|
||||
thresholdId: threshold.id
|
||||
};
|
||||
}
|
||||
|
||||
if (alert) {
|
||||
alerts.push(alert);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return alerts;
|
||||
} catch (error) {
|
||||
logger.error('Error checking threshold alerts:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
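// Illustrative only: a hypothetical device_thresholds row this check would act on
// (column names follow the query above, values are made up):
//
//   { id: 7, device_id: 'pump-01', metric: 'sensors.pressure', operator: 'gt',
//     value: 6.5, severity: 'warning', status: 'active' }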
|
||||
async checkAnomalyAlerts(deviceId, data, timestamp) {
|
||||
try {
|
||||
const alerts = [];
|
||||
|
||||
// Get historical data for anomaly detection
|
||||
const historicalData = await database.query(
|
||||
'SELECT raw_data FROM device_data WHERE device_id = ? ORDER BY timestamp DESC LIMIT 50',
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
if (historicalData.length < 10) {
|
||||
return alerts; // Not enough data for anomaly detection
|
||||
}
|
||||
|
||||
// Calculate statistical measures
|
||||
const values = historicalData.map(d => this.extractNumericValues(d.raw_data)).flat();
|
||||
const mean = values.reduce((a, b) => a + b, 0) / values.length;
|
||||
const variance = values.reduce((a, b) => a + Math.pow(b - mean, 2), 0) / values.length;
|
||||
const stdDev = Math.sqrt(variance);
|
||||
|
||||
// Check current data for anomalies
|
||||
const currentValues = this.extractNumericValues(data);
|
||||
const anomalyScores = currentValues.map(value => {
|
||||
const zScore = Math.abs((value - mean) / stdDev);
|
||||
return zScore;
|
||||
});
|
||||
|
||||
const maxAnomalyScore = Math.max(...anomalyScores);
|
||||
|
||||
if (maxAnomalyScore > 3) { // 3 standard deviations
|
||||
alerts.push({
|
||||
deviceId,
|
||||
type: 'anomaly_detected',
|
||||
severity: maxAnomalyScore > 5 ? 'critical' : 'warning',
|
||||
message: `Anomaly detected with z-score ${maxAnomalyScore.toFixed(2)}`,
|
||||
category: 'anomaly',
|
||||
data: { anomalyScore: maxAnomalyScore, values: currentValues },
|
||||
timestamp
|
||||
});
|
||||
}
|
||||
|
||||
return alerts;
|
||||
} catch (error) {
|
||||
logger.error('Error checking anomaly alerts:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
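// Worked example of the z-score check above (made-up readings): with history
// [10, 12, 11, 13, 10] the mean is 11.2, the variance is 1.36 and the standard
// deviation is about 1.17. A new reading of 20 gives z = (20 - 11.2) / 1.17 ≈ 7.5,
// which exceeds both the 3-sigma anomaly threshold and the 5-sigma 'critical' cutoff.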
|
||||
async processAlert(alert) {
|
||||
try {
|
||||
// Store alert in database
|
||||
const alertId = await this.storeAlert(alert);
alert.id = alertId; // expose the stored id to downstream actions (device control, healing)
|
||||
|
||||
// Cache alert in Redis
|
||||
await redis.cacheAlert(alertId, alert);
|
||||
|
||||
// Log alert
|
||||
logger.logAlert(alert.type, alert.severity, alert.message, alert.deviceId);
|
||||
|
||||
// Execute alert actions
|
||||
await this.executeAlertActions(alert);
|
||||
|
||||
// Send notifications
|
||||
await this.sendAlertNotifications(alert);
|
||||
|
||||
// Trigger healing if enabled
|
||||
if (alert.severity === 'critical' && process.env.AI_HEALING_ENABLED === 'true') {
|
||||
await this.triggerHealing(alert);
|
||||
}
|
||||
|
||||
return alertId;
|
||||
} catch (error) {
|
||||
logger.error('Error processing alert:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async storeAlert(alert) {
|
||||
try {
|
||||
const result = await database.query(
|
||||
`INSERT INTO alerts
|
||||
(device_id, type, severity, message, category, data, timestamp, status, created_at)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, NOW())`,
|
||||
[
|
||||
alert.deviceId,
|
||||
alert.type,
|
||||
alert.severity,
|
||||
alert.message,
|
||||
alert.category,
|
||||
JSON.stringify(alert.data),
|
||||
alert.timestamp,
|
||||
'active'
|
||||
]
|
||||
);
|
||||
|
||||
return result.insertId;
|
||||
} catch (error) {
|
||||
logger.error('Failed to store alert:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async executeAlertActions(alert) {
|
||||
try {
|
||||
if (alert.actions && Array.isArray(alert.actions)) {
|
||||
for (const action of alert.actions) {
|
||||
await this.executeAction(action, alert);
|
||||
}
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error('Error executing alert actions:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async executeAction(action, alert) {
|
||||
try {
|
||||
switch (action.type) {
|
||||
case 'email':
|
||||
// sendEmail formats the message via formatEmailContent, which expects title/message/severity
await notificationService.sendEmail(action.recipients, {
title: `Alert: ${alert.type}`,
message: alert.message,
severity: alert.severity,
deviceId: alert.deviceId,
data: alert.data
});
|
||||
break;
|
||||
|
||||
case 'sms':
|
||||
// sendSMS formats the message via formatSMSContent, which reads title and severity as well
await notificationService.sendSMS(action.recipients, {
title: alert.type,
message: alert.message,
severity: alert.severity,
deviceId: alert.deviceId
});
|
||||
break;
|
||||
|
||||
case 'webhook':
|
||||
await notificationService.sendWebhook(action.url, {
|
||||
alert: alert,
|
||||
timestamp: new Date().toISOString()
|
||||
});
|
||||
break;
|
||||
|
||||
case 'device_control':
|
||||
await this.executeDeviceControl(action, alert);
|
||||
break;
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error('Error executing action:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async executeDeviceControl(action, alert) {
|
||||
try {
|
||||
// Store device control action
|
||||
await database.query(
|
||||
`INSERT INTO device_controls
|
||||
(device_id, action, parameters, triggered_by_alert, status)
|
||||
VALUES (?, ?, ?, ?, ?)`,
|
||||
[
|
||||
alert.deviceId,
|
||||
action.control,
|
||||
JSON.stringify(action.parameters),
|
||||
alert.id,
|
||||
'pending'
|
||||
]
|
||||
);
|
||||
|
||||
logger.info(`Device control action triggered by alert: ${action.control}`);
|
||||
} catch (error) {
|
||||
logger.error('Error executing device control:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async sendAlertNotifications(alert) {
|
||||
try {
|
||||
// Get users who should be notified
|
||||
const users = await this.getUsersToNotify(alert);
|
||||
|
||||
for (const user of users) {
|
||||
await notificationService.sendNotification(user.id, {
|
||||
type: 'alert',
|
||||
title: `Alert: ${alert.type}`,
|
||||
message: alert.message,
|
||||
severity: alert.severity,
|
||||
deviceId: alert.deviceId,
|
||||
data: alert.data
|
||||
});
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error('Error sending alert notifications:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async getUsersToNotify(alert) {
|
||||
try {
|
||||
// Get users based on alert severity and device access
|
||||
let query = `
|
||||
SELECT DISTINCT u.id, u.username, u.email, u.notification_preferences
|
||||
FROM users u
|
||||
LEFT JOIN user_device_access uda ON u.id = uda.user_id
|
||||
WHERE u.status = 'active'
|
||||
`;
|
||||
|
||||
const params = [];
|
||||
|
||||
if (alert.deviceId) {
|
||||
query += ' AND (uda.device_id = ? OR u.role = "admin")';
|
||||
params.push(alert.deviceId);
|
||||
} else {
|
||||
query += ' AND u.role = "admin"';
|
||||
}
|
||||
|
||||
const users = await database.query(query, params);
|
||||
|
||||
// Filter users based on notification preferences
|
||||
return users.filter(user => {
|
||||
const preferences = JSON.parse(user.notification_preferences || '{}');
|
||||
return preferences[alert.severity] !== false;
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('Error getting users to notify:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async triggerHealing(alert) {
|
||||
try {
|
||||
// Import healing service dynamically to avoid circular dependency
|
||||
const healingService = require('./healingService');
|
||||
await healingService.triggerHealing(alert);
|
||||
} catch (error) {
|
||||
logger.error('Error triggering healing:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async acknowledgeAlert(alertId, userId) {
|
||||
try {
|
||||
await database.query(
|
||||
'UPDATE alerts SET acknowledged_by = ?, acknowledged_at = NOW(), status = "acknowledged" WHERE id = ?',
|
||||
[userId, alertId]
|
||||
);
|
||||
|
||||
logger.info(`Alert ${alertId} acknowledged by user ${userId}`);
|
||||
} catch (error) {
|
||||
logger.error('Error acknowledging alert:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async resolveAlert(alertId, userId, resolution = '') {
|
||||
try {
|
||||
await database.query(
|
||||
'UPDATE alerts SET resolved_by = ?, resolved_at = NOW(), status = "resolved", resolution = ? WHERE id = ?',
|
||||
[userId, resolution, alertId]
|
||||
);
|
||||
|
||||
logger.info(`Alert ${alertId} resolved by user ${userId}`);
|
||||
} catch (error) {
|
||||
logger.error('Error resolving alert:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async getActiveAlerts(deviceId = null, limit = 100) {
|
||||
try {
|
||||
let query = 'SELECT * FROM alerts WHERE status = "active"';
|
||||
const params = [];
|
||||
|
||||
if (deviceId) {
|
||||
query += ' AND device_id = ?';
|
||||
params.push(deviceId);
|
||||
}
|
||||
|
||||
query += ' ORDER BY timestamp DESC LIMIT ?';
|
||||
params.push(limit);
|
||||
|
||||
return await database.query(query, params);
|
||||
} catch (error) {
|
||||
logger.error('Error getting active alerts:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async getAlertHistory(deviceId = null, startDate = null, endDate = null, limit = 100) {
|
||||
try {
|
||||
let query = 'SELECT * FROM alerts WHERE 1=1';
|
||||
const params = [];
|
||||
|
||||
if (deviceId) {
|
||||
query += ' AND device_id = ?';
|
||||
params.push(deviceId);
|
||||
}
|
||||
|
||||
if (startDate) {
|
||||
query += ' AND timestamp >= ?';
|
||||
params.push(startDate);
|
||||
}
|
||||
|
||||
if (endDate) {
|
||||
query += ' AND timestamp <= ?';
|
||||
params.push(endDate);
|
||||
}
|
||||
|
||||
query += ' ORDER BY timestamp DESC LIMIT ?';
|
||||
params.push(limit);
|
||||
|
||||
return await database.query(query, params);
|
||||
} catch (error) {
|
||||
logger.error('Error getting alert history:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async refreshAlertRules() {
|
||||
try {
|
||||
await this.loadAlertRules();
|
||||
} catch (error) {
|
||||
logger.error('Error refreshing alert rules:', error);
|
||||
}
|
||||
}
|
||||
|
||||
// Helper methods
|
||||
extractValueFromData(data, field) {
|
||||
try {
|
||||
const keys = field.split('.');
|
||||
let value = data;
|
||||
|
||||
for (const key of keys) {
|
||||
if (value && typeof value === 'object' && key in value) {
|
||||
value = value[key];
|
||||
} else {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
return value;
|
||||
} catch (error) {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
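// Example (made-up payload): extractValueFromData({ sensors: { temperature: 78.2 } }, 'sensors.temperature')
// returns 78.2; a missing key at any level returns null.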
|
||||
evaluateCondition(value, operator, expectedValue) {
|
||||
try {
|
||||
switch (operator) {
|
||||
case 'eq':
|
||||
return value === expectedValue;
|
||||
case 'ne':
|
||||
return value !== expectedValue;
|
||||
case 'gt':
|
||||
return value > expectedValue;
|
||||
case 'gte':
|
||||
return value >= expectedValue;
|
||||
case 'lt':
|
||||
return value < expectedValue;
|
||||
case 'lte':
|
||||
return value <= expectedValue;
|
||||
case 'contains':
|
||||
return String(value).includes(String(expectedValue));
|
||||
case 'regex':
|
||||
return new RegExp(expectedValue).test(String(value));
|
||||
default:
|
||||
return false;
|
||||
}
|
||||
} catch (error) {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
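// Example: evaluateCondition(78.2, 'gt', 75) returns true, and evaluateCondition('ERR_42', 'regex', '^ERR_')
// also returns true. Unknown operators and evaluation errors fall through to false.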
|
||||
extractNumericValues(data) {
|
||||
const values = [];
|
||||
const extract = (obj) => {
|
||||
for (const key in obj) {
|
||||
if (typeof obj[key] === 'number') {
|
||||
values.push(obj[key]);
|
||||
} else if (typeof obj[key] === 'object' && obj[key] !== null) {
|
||||
extract(obj[key]);
|
||||
}
|
||||
}
|
||||
};
|
||||
extract(data);
|
||||
return values;
|
||||
}
|
||||
|
||||
async healthCheck() {
|
||||
try {
|
||||
return {
|
||||
status: this.isInitialized ? 'healthy' : 'not_initialized',
|
||||
message: this.isInitialized ? 'Alert service is healthy' : 'Service not initialized',
|
||||
activeRules: this.alertRules.size
|
||||
};
|
||||
} catch (error) {
|
||||
return {
|
||||
status: 'unhealthy',
|
||||
message: 'Alert service health check failed',
|
||||
error: error.message
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = new AlertService();
|
||||
622
services/healingService.js
Normal file
@ -0,0 +1,622 @@
|
||||
const logger = require('../utils/logger');
|
||||
const database = require('../config/database');
|
||||
const redis = require('../config/redis');
|
||||
const notificationService = require('./notificationService');
|
||||
|
||||
class HealingService {
|
||||
constructor() {
|
||||
this.healingRules = new Map();
|
||||
this.activeHealingActions = new Map();
|
||||
this.isInitialized = false;
|
||||
}
|
||||
|
||||
async initialize() {
|
||||
try {
|
||||
await this.loadHealingRules();
|
||||
await this.setupHealingMonitoring();
|
||||
this.isInitialized = true;
|
||||
logger.info('Healing service initialized successfully');
|
||||
} catch (error) {
|
||||
logger.error('Failed to initialize Healing service:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async loadHealingRules() {
|
||||
try {
|
||||
const rules = await database.query('SELECT * FROM healing_rules WHERE status = "active"');
|
||||
|
||||
for (const rule of rules) {
|
||||
this.healingRules.set(rule.id, {
|
||||
...rule,
|
||||
conditions: JSON.parse(rule.conditions),
|
||||
actions: JSON.parse(rule.actions)
|
||||
});
|
||||
}
|
||||
|
||||
logger.info(`Loaded ${rules.length} healing rules`);
|
||||
} catch (error) {
|
||||
logger.error('Failed to load healing rules:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async setupHealingMonitoring() {
|
||||
// Monitor active healing actions every 30 seconds
|
||||
setInterval(async () => {
|
||||
await this.monitorActiveHealingActions();
|
||||
}, 30 * 1000);
|
||||
}
|
||||
|
||||
async triggerHealing(alert) {
|
||||
try {
|
||||
const startTime = Date.now();
|
||||
|
||||
// Find applicable healing rules
|
||||
const applicableRules = await this.findApplicableHealingRules(alert);
|
||||
|
||||
if (applicableRules.length === 0) {
|
||||
logger.info(`No healing rules found for alert: ${alert.type}`);
|
||||
return null;
|
||||
}
|
||||
|
||||
// Execute healing actions
|
||||
const healingActions = [];
|
||||
for (const rule of applicableRules) {
|
||||
const action = await this.executeHealingRule(rule, alert);
|
||||
if (action) {
|
||||
healingActions.push(action);
|
||||
}
|
||||
}
|
||||
|
||||
const processingTime = Date.now() - startTime;
|
||||
logger.logHealingAction('trigger_healing', alert.deviceId, {
|
||||
alertType: alert.type,
|
||||
rulesApplied: applicableRules.length,
|
||||
actionsExecuted: healingActions.length
|
||||
}, processingTime);
|
||||
|
||||
return healingActions;
|
||||
} catch (error) {
|
||||
logger.error('Error triggering healing:', error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
async findApplicableHealingRules(alert) {
|
||||
try {
|
||||
const applicableRules = [];
|
||||
|
||||
for (const [ruleId, rule] of this.healingRules) {
|
||||
if (this.isRuleApplicable(rule, alert)) {
|
||||
applicableRules.push(rule);
|
||||
}
|
||||
}
|
||||
|
||||
return applicableRules;
|
||||
} catch (error) {
|
||||
logger.error('Error finding applicable healing rules:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
isRuleApplicable(rule, alert) {
|
||||
try {
|
||||
// Check if rule applies to this device
|
||||
if (rule.device_id && rule.device_id !== alert.deviceId) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Check if rule applies to this alert type
|
||||
if (rule.alert_types && !rule.alert_types.includes(alert.type)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Check if rule applies to this severity
|
||||
if (rule.severity_levels && !rule.severity_levels.includes(alert.severity)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
return true;
|
||||
} catch (error) {
|
||||
logger.error('Error checking rule applicability:', error);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
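// Illustrative only: a hypothetical healing_rules row as this check and executeHealingRule
// assume it (alert_types and severity_levels as arrays, actions stored as JSON in the DB):
//
//   { id: 3, device_id: null, alert_types: ['threshold_exceeded'], severity_levels: ['critical'],
//     healing_type: 'auto_recovery', description: 'Lower sampling rate under overload',
//     priority: 'high', actions: [{ type: 'parameter_adjustment', parameter: 'sampling_rate',
//       value: 30, adjustment_type: 'auto' }], status: 'active' }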
|
||||
async executeHealingRule(rule, alert) {
|
||||
try {
|
||||
const healingAction = {
|
||||
ruleId: rule.id,
|
||||
deviceId: alert.deviceId,
|
||||
alertId: alert.id,
|
||||
type: rule.healing_type,
|
||||
description: rule.description,
|
||||
actions: rule.actions,
|
||||
priority: rule.priority,
|
||||
status: 'pending',
|
||||
created_at: new Date()
|
||||
};
|
||||
|
||||
// Store healing action
|
||||
const actionId = await this.storeHealingAction(healingAction);
|
||||
healingAction.id = actionId;
|
||||
|
||||
// Execute healing actions
|
||||
const results = await this.executeHealingActions(healingAction);
|
||||
|
||||
// Update action status
|
||||
const success = results.every(result => result.success);
|
||||
await this.updateHealingActionStatus(actionId, success ? 'completed' : 'failed', results);
|
||||
|
||||
// Log healing action
|
||||
logger.logHealingAction(healingAction.type, alert.deviceId, {
|
||||
ruleId: rule.id,
|
||||
actionsCount: rule.actions.length,
|
||||
success: success
|
||||
}, 0);
|
||||
|
||||
return healingAction;
|
||||
} catch (error) {
|
||||
logger.error('Error executing healing rule:', error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
async executeHealingActions(healingAction) {
|
||||
try {
|
||||
const results = [];
|
||||
|
||||
for (const action of healingAction.actions) {
|
||||
const result = await this.executeAction(action, healingAction);
|
||||
results.push(result);
|
||||
}
|
||||
|
||||
return results;
|
||||
} catch (error) {
|
||||
logger.error('Error executing healing actions:', error);
|
||||
return [{ success: false, error: error.message }];
|
||||
}
|
||||
}
|
||||
|
||||
async executeAction(action, healingAction) {
|
||||
try {
|
||||
const startTime = Date.now();
|
||||
|
||||
switch (action.type) {
|
||||
case 'device_restart':
|
||||
return await this.executeDeviceRestart(action, healingAction);
|
||||
|
||||
case 'parameter_adjustment':
|
||||
return await this.executeParameterAdjustment(action, healingAction);
|
||||
|
||||
case 'configuration_update':
|
||||
return await this.executeConfigurationUpdate(action, healingAction);
|
||||
|
||||
case 'maintenance_schedule':
|
||||
return await this.executeMaintenanceSchedule(action, healingAction);
|
||||
|
||||
case 'backup_restore':
|
||||
return await this.executeBackupRestore(action, healingAction);
|
||||
|
||||
case 'load_balancing':
|
||||
return await this.executeLoadBalancing(action, healingAction);
|
||||
|
||||
case 'circuit_breaker':
|
||||
return await this.executeCircuitBreaker(action, healingAction);
|
||||
|
||||
default:
|
||||
return { success: false, error: `Unknown action type: ${action.type}` };
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error('Error executing action:', error);
|
||||
return { success: false, error: error.message };
|
||||
}
|
||||
}
|
||||
|
||||
async executeDeviceRestart(action, healingAction) {
|
||||
try {
|
||||
// Store restart command
|
||||
await database.query(
|
||||
`INSERT INTO device_controls
|
||||
(device_id, action, parameters, triggered_by_healing, status)
|
||||
VALUES (?, ?, ?, ?, ?)`,
|
||||
[
|
||||
healingAction.deviceId,
|
||||
'restart',
|
||||
JSON.stringify(action.parameters || {}),
|
||||
healingAction.id,
|
||||
'pending'
|
||||
]
|
||||
);
|
||||
|
||||
// Send notification
|
||||
await notificationService.sendNotification('admin', {
|
||||
type: 'healing_action',
|
||||
title: 'Device Restart Initiated',
|
||||
message: `Device ${healingAction.deviceId} restart initiated by healing system`,
|
||||
severity: 'info',
|
||||
deviceId: healingAction.deviceId
|
||||
});
|
||||
|
||||
return { success: true, action: 'device_restart' };
|
||||
} catch (error) {
|
||||
logger.error('Error executing device restart:', error);
|
||||
return { success: false, error: error.message };
|
||||
}
|
||||
}
|
||||
|
||||
async executeParameterAdjustment(action, healingAction) {
|
||||
try {
|
||||
const { parameter, value, adjustment_type } = action;
|
||||
|
||||
// Store parameter adjustment
|
||||
await database.query(
|
||||
`INSERT INTO device_parameter_adjustments
|
||||
(device_id, parameter, old_value, new_value, adjustment_type, healing_action_id, created_at)
|
||||
VALUES (?, ?, ?, ?, ?, ?, NOW())`,
|
||||
[
|
||||
healingAction.deviceId,
|
||||
parameter,
|
||||
action.current_value || 'unknown',
|
||||
value,
|
||||
adjustment_type || 'manual',
|
||||
healingAction.id
|
||||
]
|
||||
);
|
||||
|
||||
// Store control command
|
||||
await database.query(
|
||||
`INSERT INTO device_controls
|
||||
(device_id, action, parameters, triggered_by_healing, status)
|
||||
VALUES (?, ?, ?, ?, ?)`,
|
||||
[
|
||||
healingAction.deviceId,
|
||||
'parameter_adjustment',
|
||||
JSON.stringify({ parameter, value, adjustment_type }),
|
||||
healingAction.id,
|
||||
'pending'
|
||||
]
|
||||
);
|
||||
|
||||
return { success: true, action: 'parameter_adjustment', parameter, value };
|
||||
} catch (error) {
|
||||
logger.error('Error executing parameter adjustment:', error);
|
||||
return { success: false, error: error.message };
|
||||
}
|
||||
}
|
||||
|
||||
async executeConfigurationUpdate(action, healingAction) {
|
||||
try {
|
||||
const { configuration, backup_existing } = action;
|
||||
|
||||
if (backup_existing) {
|
||||
// Create backup of current configuration
|
||||
await this.createConfigurationBackup(healingAction.deviceId);
|
||||
}
|
||||
|
||||
// Store configuration update
|
||||
await database.query(
|
||||
`INSERT INTO device_configuration_updates
|
||||
(device_id, configuration, healing_action_id, created_at)
|
||||
VALUES (?, ?, ?, NOW())`,
|
||||
[
|
||||
healingAction.deviceId,
|
||||
JSON.stringify(configuration),
|
||||
healingAction.id
|
||||
]
|
||||
);
|
||||
|
||||
// Store control command
|
||||
await database.query(
|
||||
`INSERT INTO device_controls
|
||||
(device_id, action, parameters, triggered_by_healing, status)
|
||||
VALUES (?, ?, ?, ?, ?)`,
|
||||
[
|
||||
healingAction.deviceId,
|
||||
'configuration_update',
|
||||
JSON.stringify({ configuration, backup_existing }),
|
||||
healingAction.id,
|
||||
'pending'
|
||||
]
|
||||
);
|
||||
|
||||
return { success: true, action: 'configuration_update' };
|
||||
} catch (error) {
|
||||
logger.error('Error executing configuration update:', error);
|
||||
return { success: false, error: error.message };
|
||||
}
|
||||
}
|
||||
|
||||
async executeMaintenanceSchedule(action, healingAction) {
|
||||
try {
|
||||
const { maintenance_type, priority, estimated_duration } = action;
|
||||
|
||||
// Schedule maintenance
|
||||
await database.query(
|
||||
`INSERT INTO maintenance_schedules
|
||||
(device_id, maintenance_type, priority, estimated_duration, healing_action_id, status, created_at)
|
||||
VALUES (?, ?, ?, ?, ?, ?, NOW())`,
|
||||
[
|
||||
healingAction.deviceId,
|
||||
maintenance_type,
|
||||
priority || 'medium',
|
||||
estimated_duration || 60,
|
||||
healingAction.id,
|
||||
'scheduled'
|
||||
]
|
||||
);
|
||||
|
||||
// Send notification
|
||||
await notificationService.sendNotification('admin', {
|
||||
type: 'maintenance_scheduled',
|
||||
title: 'Maintenance Scheduled',
|
||||
message: `${maintenance_type} maintenance scheduled for device ${healingAction.deviceId}`,
|
||||
severity: 'info',
|
||||
deviceId: healingAction.deviceId
|
||||
});
|
||||
|
||||
return { success: true, action: 'maintenance_schedule', maintenance_type };
|
||||
} catch (error) {
|
||||
logger.error('Error executing maintenance schedule:', error);
|
||||
return { success: false, error: error.message };
|
||||
}
|
||||
}
|
||||
|
||||
async executeBackupRestore(action, healingAction) {
|
||||
try {
|
||||
const { backup_id, restore_point } = action;
|
||||
|
||||
// Store backup restore action
|
||||
await database.query(
|
||||
`INSERT INTO backup_restore_actions
|
||||
(device_id, backup_id, restore_point, healing_action_id, status, created_at)
|
||||
VALUES (?, ?, ?, ?, ?, NOW())`,
|
||||
[
|
||||
healingAction.deviceId,
|
||||
backup_id,
|
||||
restore_point || 'latest',
|
||||
healingAction.id,
|
||||
'pending'
|
||||
]
|
||||
);
|
||||
|
||||
// Store control command
|
||||
await database.query(
|
||||
`INSERT INTO device_controls
|
||||
(device_id, action, parameters, triggered_by_healing, status)
|
||||
VALUES (?, ?, ?, ?, ?)`,
|
||||
[
|
||||
healingAction.deviceId,
|
||||
'backup_restore',
|
||||
JSON.stringify({ backup_id, restore_point }),
|
||||
healingAction.id,
|
||||
'pending'
|
||||
]
|
||||
);
|
||||
|
||||
return { success: true, action: 'backup_restore', backup_id };
|
||||
} catch (error) {
|
||||
logger.error('Error executing backup restore:', error);
|
||||
return { success: false, error: error.message };
|
||||
}
|
||||
}
|
||||
|
||||
async executeLoadBalancing(action, healingAction) {
|
||||
try {
|
||||
const { target_devices, load_distribution } = action;
|
||||
|
||||
// Store load balancing action
|
||||
await database.query(
|
||||
`INSERT INTO load_balancing_actions
|
||||
(source_device_id, target_devices, load_distribution, healing_action_id, status, created_at)
|
||||
VALUES (?, ?, ?, ?, ?, NOW())`,
|
||||
[
|
||||
healingAction.deviceId,
|
||||
JSON.stringify(target_devices),
|
||||
JSON.stringify(load_distribution),
|
||||
healingAction.id,
|
||||
'pending'
|
||||
]
|
||||
);
|
||||
|
||||
return { success: true, action: 'load_balancing', target_devices };
|
||||
} catch (error) {
|
||||
logger.error('Error executing load balancing:', error);
|
||||
return { success: false, error: error.message };
|
||||
}
|
||||
}
|
||||
|
||||
async executeCircuitBreaker(action, healingAction) {
|
||||
try {
|
||||
const { circuit_state, timeout } = action;
|
||||
|
||||
// Store circuit breaker action
|
||||
await database.query(
|
||||
`INSERT INTO circuit_breaker_actions
|
||||
(device_id, circuit_state, timeout, healing_action_id, status, created_at)
|
||||
VALUES (?, ?, ?, ?, ?, NOW())`,
|
||||
[
|
||||
healingAction.deviceId,
|
||||
circuit_state || 'open',
|
||||
timeout || 300,
|
||||
healingAction.id,
|
||||
'pending'
|
||||
]
|
||||
);
|
||||
|
||||
// Store control command
|
||||
await database.query(
|
||||
`INSERT INTO device_controls
|
||||
(device_id, action, parameters, triggered_by_healing, status)
|
||||
VALUES (?, ?, ?, ?, ?)`,
|
||||
[
|
||||
healingAction.deviceId,
|
||||
'circuit_breaker',
|
||||
JSON.stringify({ circuit_state, timeout }),
|
||||
healingAction.id,
|
||||
'pending'
|
||||
]
|
||||
);
|
||||
|
||||
return { success: true, action: 'circuit_breaker', circuit_state };
|
||||
} catch (error) {
|
||||
logger.error('Error executing circuit breaker:', error);
|
||||
return { success: false, error: error.message };
|
||||
}
|
||||
}
|
||||
|
||||
async storeHealingAction(healingAction) {
|
||||
try {
|
||||
const result = await database.query(
|
||||
`INSERT INTO healing_actions
|
||||
(rule_id, device_id, alert_id, type, description, actions, priority, status, created_at)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)`,
|
||||
[
|
||||
healingAction.ruleId,
|
||||
healingAction.deviceId,
|
||||
healingAction.alertId,
|
||||
healingAction.type,
|
||||
healingAction.description,
|
||||
JSON.stringify(healingAction.actions),
|
||||
healingAction.priority,
|
||||
healingAction.status,
|
||||
healingAction.created_at
|
||||
]
|
||||
);
|
||||
|
||||
return result.insertId;
|
||||
} catch (error) {
|
||||
logger.error('Failed to store healing action:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async updateHealingActionStatus(actionId, status, results = []) {
|
||||
try {
|
||||
await database.query(
|
||||
`UPDATE healing_actions
|
||||
SET status = ?, results = ?, updated_at = NOW()
|
||||
WHERE id = ?`,
|
||||
[status, JSON.stringify(results), actionId]
|
||||
);
|
||||
} catch (error) {
|
||||
logger.error('Failed to update healing action status:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async monitorActiveHealingActions() {
|
||||
try {
|
||||
const activeActions = await database.query(
|
||||
'SELECT * FROM healing_actions WHERE status IN ("pending", "in_progress")'
|
||||
);
|
||||
|
||||
for (const action of activeActions) {
|
||||
await this.checkHealingActionProgress(action);
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error('Error monitoring active healing actions:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async checkHealingActionProgress(action) {
|
||||
try {
|
||||
// Check if healing action has been completed
|
||||
const controls = await database.query(
|
||||
'SELECT status FROM device_controls WHERE triggered_by_healing = ?',
|
||||
[action.id]
|
||||
);
|
||||
|
||||
if (controls.length > 0) {
|
||||
const allCompleted = controls.every(control => control.status === 'completed');
|
||||
const anyFailed = controls.some(control => control.status === 'failed');
|
||||
|
||||
if (allCompleted) {
|
||||
await this.updateHealingActionStatus(action.id, 'completed');
|
||||
} else if (anyFailed) {
|
||||
await this.updateHealingActionStatus(action.id, 'failed');
|
||||
}
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error('Error checking healing action progress:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async createConfigurationBackup(deviceId) {
|
||||
try {
|
||||
// Get current device configuration
|
||||
const [device] = await database.query(
|
||||
'SELECT configuration FROM devices WHERE id = ?',
|
||||
[deviceId]
|
||||
);
|
||||
|
||||
if (device && device.configuration) {
|
||||
await database.query(
|
||||
`INSERT INTO device_configuration_backups
|
||||
(device_id, configuration, backup_type, created_at)
|
||||
VALUES (?, ?, ?, NOW())`,
|
||||
[deviceId, device.configuration, 'healing_backup']
|
||||
);
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error('Error creating configuration backup:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async getHealingHistory(deviceId = null, limit = 100) {
|
||||
try {
|
||||
let query = 'SELECT * FROM healing_actions WHERE 1=1';
|
||||
const params = [];
|
||||
|
||||
if (deviceId) {
|
||||
query += ' AND device_id = ?';
|
||||
params.push(deviceId);
|
||||
}
|
||||
|
||||
query += ' ORDER BY created_at DESC LIMIT ?';
|
||||
params.push(limit);
|
||||
|
||||
return await database.query(query, params);
|
||||
} catch (error) {
|
||||
logger.error('Error getting healing history:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async getHealingStatistics() {
|
||||
try {
|
||||
const stats = await database.query(`
|
||||
SELECT
|
||||
status,
|
||||
COUNT(*) as count,
|
||||
AVG(TIMESTAMPDIFF(SECOND, created_at, updated_at)) as avg_duration
|
||||
FROM healing_actions
|
||||
WHERE created_at >= DATE_SUB(NOW(), INTERVAL 24 HOUR)
|
||||
GROUP BY status
|
||||
`);
|
||||
|
||||
return stats;
|
||||
} catch (error) {
|
||||
logger.error('Error getting healing statistics:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async healthCheck() {
|
||||
try {
|
||||
return {
|
||||
status: this.isInitialized ? 'healthy' : 'not_initialized',
|
||||
message: this.isInitialized ? 'Healing service is healthy' : 'Service not initialized',
|
||||
activeRules: this.healingRules.size,
|
||||
activeActions: this.activeHealingActions.size
|
||||
};
|
||||
} catch (error) {
|
||||
return {
|
||||
status: 'unhealthy',
|
||||
message: 'Healing service health check failed',
|
||||
error: error.message
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = new HealingService();
|
||||
465
services/notificationService.js
Normal file
@ -0,0 +1,465 @@
|
||||
const nodemailer = require('nodemailer');
|
||||
const twilio = require('twilio');
|
||||
const logger = require('../utils/logger');
|
||||
const database = require('../config/database');
|
||||
const redis = require('../config/redis');
|
||||
|
||||
class NotificationService {
|
||||
constructor() {
|
||||
this.emailTransporter = null;
|
||||
this.twilioClient = null;
|
||||
this.isInitialized = false;
|
||||
}
|
||||
|
||||
async initialize() {
|
||||
try {
|
||||
await this.setupEmailTransporter();
|
||||
await this.setupTwilioClient();
|
||||
this.isInitialized = true;
|
||||
logger.info('Notification service initialized successfully');
|
||||
} catch (error) {
|
||||
logger.error('Failed to initialize Notification service:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async setupEmailTransporter() {
|
||||
try {
|
||||
this.emailTransporter = nodemailer.createTransport({
|
||||
host: process.env.SMTP_HOST,
|
||||
port: process.env.SMTP_PORT,
|
||||
secure: process.env.SMTP_PORT === '465',
|
||||
auth: {
|
||||
user: process.env.SMTP_USER,
|
||||
pass: process.env.SMTP_PASS
|
||||
}
|
||||
});
|
||||
|
||||
// Verify connection
|
||||
await this.emailTransporter.verify();
|
||||
logger.info('Email transporter configured successfully');
|
||||
} catch (error) {
|
||||
logger.error('Failed to setup email transporter:', error);
|
||||
this.emailTransporter = null;
|
||||
}
|
||||
}
|
||||
|
||||
async setupTwilioClient() {
|
||||
try {
|
||||
if (process.env.TWILIO_ACCOUNT_SID && process.env.TWILIO_AUTH_TOKEN) {
|
||||
this.twilioClient = twilio(process.env.TWILIO_ACCOUNT_SID, process.env.TWILIO_AUTH_TOKEN);
|
||||
logger.info('Twilio client configured successfully');
|
||||
} else {
|
||||
logger.warn('Twilio credentials not provided, SMS notifications disabled');
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error('Failed to setup Twilio client:', error);
|
||||
this.twilioClient = null;
|
||||
}
|
||||
}
|
||||
|
||||
async sendNotification(userId, notification) {
|
||||
try {
|
||||
const startTime = Date.now();
|
||||
|
||||
// Get user notification preferences
|
||||
const user = await this.getUserNotificationPreferences(userId);
|
||||
if (!user) {
|
||||
logger.warn(`User ${userId} not found for notification`);
|
||||
return false;
|
||||
}
|
||||
|
||||
// Store notification in database
|
||||
const notificationId = await this.storeNotification(userId, notification);
|
||||
|
||||
// Send notifications based on user preferences
|
||||
const results = {
|
||||
email: false,
|
||||
sms: false,
|
||||
inApp: true // Always store in-app
|
||||
};
|
||||
|
||||
// Send email if enabled
|
||||
if (user.email_enabled && user.email) {
|
||||
results.email = await this.sendEmail(user.email, notification);
|
||||
}
|
||||
|
||||
// Send SMS if enabled
|
||||
if (user.sms_enabled && user.phone && this.twilioClient) {
|
||||
results.sms = await this.sendSMS(user.phone, notification);
|
||||
}
|
||||
|
||||
// Update notification status
|
||||
await this.updateNotificationStatus(notificationId, results);
|
||||
|
||||
const processingTime = Date.now() - startTime;
|
||||
logger.logNotification('multi_channel', userId, notification.title, results);
|
||||
logger.logPerformance('notification_sending', processingTime, {
|
||||
userId,
|
||||
channels: Object.keys(results).filter(k => results[k]).length
|
||||
});
|
||||
|
||||
return results;
|
||||
} catch (error) {
|
||||
logger.error('Error sending notification:', error);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
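// Illustrative only: the payload shape the other services pass into sendNotification
// (values made up): { type: 'alert', title: 'Alert: threshold_exceeded',
//   message: 'sensors.pressure exceeded threshold: 7.1 > 6.5', severity: 'warning',
//   deviceId: 'pump-01', data: { metric: 'sensors.pressure', value: 7.1, threshold: 6.5 } }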
|
||||
async sendEmail(recipients, notification) {
|
||||
try {
|
||||
if (!this.emailTransporter) {
|
||||
logger.warn('Email transporter not configured');
|
||||
return false;
|
||||
}
|
||||
|
||||
const emailContent = this.formatEmailContent(notification);
|
||||
|
||||
const mailOptions = {
|
||||
from: process.env.SMTP_USER,
|
||||
to: Array.isArray(recipients) ? recipients.join(',') : recipients,
|
||||
subject: emailContent.subject,
|
||||
html: emailContent.html,
|
||||
text: emailContent.text
|
||||
};
|
||||
|
||||
const result = await this.emailTransporter.sendMail(mailOptions);
|
||||
|
||||
logger.logNotification('email', recipients, notification.title, 'sent');
|
||||
return true;
|
||||
} catch (error) {
|
||||
logger.error('Error sending email:', error);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
async sendSMS(recipients, notification) {
|
||||
try {
|
||||
if (!this.twilioClient) {
|
||||
logger.warn('Twilio client not configured');
|
||||
return false;
|
||||
}
|
||||
|
||||
const message = this.formatSMSContent(notification);
|
||||
const phoneNumbers = Array.isArray(recipients) ? recipients : [recipients];
|
||||
|
||||
const results = await Promise.allSettled(
|
||||
phoneNumbers.map(phone =>
|
||||
this.twilioClient.messages.create({
|
||||
body: message,
|
||||
from: process.env.TWILIO_PHONE_NUMBER,
|
||||
to: phone
|
||||
})
|
||||
)
|
||||
);
|
||||
|
||||
const successCount = results.filter(r => r.status === 'fulfilled').length;
|
||||
const success = successCount === phoneNumbers.length;
|
||||
|
||||
if (success) {
|
||||
logger.logNotification('sms', recipients, notification.title, 'sent');
|
||||
} else {
|
||||
logger.error('Some SMS messages failed to send');
|
||||
}
|
||||
|
||||
return success;
|
||||
} catch (error) {
|
||||
logger.error('Error sending SMS:', error);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
async sendWebhook(url, data) {
|
||||
try {
|
||||
const response = await fetch(url, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
body: JSON.stringify(data)
|
||||
});
|
||||
|
||||
const success = response.ok;
|
||||
|
||||
if (success) {
|
||||
logger.logNotification('webhook', url, 'Webhook notification', 'sent');
|
||||
} else {
|
||||
logger.error(`Webhook failed with status: ${response.status}`);
|
||||
}
|
||||
|
||||
return success;
|
||||
} catch (error) {
|
||||
logger.error('Error sending webhook:', error);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
formatEmailContent(notification) {
|
||||
const severityColors = {
|
||||
critical: '#dc3545',
|
||||
warning: '#ffc107',
|
||||
info: '#17a2b8',
|
||||
success: '#28a745'
|
||||
};
|
||||
|
||||
const color = severityColors[notification.severity] || '#6c757d';
|
||||
|
||||
const html = `
|
||||
<!DOCTYPE html>
|
||||
<html>
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<title>${notification.title}</title>
|
||||
<style>
|
||||
body { font-family: Arial, sans-serif; line-height: 1.6; color: #333; }
|
||||
.container { max-width: 600px; margin: 0 auto; padding: 20px; }
|
||||
.header { background-color: ${color}; color: white; padding: 20px; text-align: center; }
|
||||
.content { padding: 20px; background-color: #f8f9fa; }
|
||||
.footer { text-align: center; padding: 20px; color: #6c757d; font-size: 12px; }
|
||||
.severity { display: inline-block; padding: 4px 8px; border-radius: 4px; color: white; font-size: 12px; }
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
<div class="container">
|
||||
<div class="header">
|
||||
<h1>${notification.title}</h1>
|
||||
<span class="severity" style="background-color: ${color};">${notification.severity.toUpperCase()}</span>
|
||||
</div>
|
||||
<div class="content">
|
||||
<p>${notification.message}</p>
|
||||
${notification.deviceId ? `<p><strong>Device ID:</strong> ${notification.deviceId}</p>` : ''}
|
||||
${notification.data ? `<p><strong>Details:</strong> ${JSON.stringify(notification.data, null, 2)}</p>` : ''}
|
||||
<p><strong>Timestamp:</strong> ${new Date().toLocaleString()}</p>
|
||||
</div>
|
||||
<div class="footer">
|
||||
<p>This is an automated notification from the AI Agent IoT Dashboard.</p>
|
||||
</div>
|
||||
</div>
|
||||
</body>
|
||||
</html>
|
||||
`;
|
||||
|
||||
const text = `
|
||||
${notification.title}
|
||||
Severity: ${notification.severity.toUpperCase()}
|
||||
|
||||
${notification.message}
|
||||
|
||||
${notification.deviceId ? `Device ID: ${notification.deviceId}` : ''}
|
||||
${notification.data ? `Details: ${JSON.stringify(notification.data)}` : ''}
|
||||
Timestamp: ${new Date().toLocaleString()}
|
||||
|
||||
This is an automated notification from the AI Agent IoT Dashboard.
|
||||
`;
|
||||
|
||||
return {
|
||||
subject: `[${notification.severity.toUpperCase()}] ${notification.title}`,
|
||||
html: html,
|
||||
text: text
|
||||
};
|
||||
}
|
||||
|
||||
formatSMSContent(notification) {
|
||||
let message = `${notification.severity.toUpperCase()}: ${notification.title}`;
|
||||
|
||||
if (notification.message) {
|
||||
message += `\n${notification.message}`;
|
||||
}
|
||||
|
||||
if (notification.deviceId) {
|
||||
message += `\nDevice: ${notification.deviceId}`;
|
||||
}
|
||||
|
||||
return message.substring(0, 160); // SMS character limit
|
||||
}
|
||||
|
||||
async getUserNotificationPreferences(userId) {
|
||||
try {
|
||||
// database.query returns the rows array directly elsewhere in these services
const users = await database.query(
|
||||
`SELECT id, username, email, phone, email_enabled, sms_enabled, notification_preferences
|
||||
FROM users WHERE id = ? AND status = "active"`,
|
||||
[userId]
|
||||
);
|
||||
|
||||
if (users.length === 0) {
|
||||
return null;
|
||||
}
|
||||
|
||||
const user = users[0];
|
||||
const preferences = JSON.parse(user.notification_preferences || '{}');
|
||||
|
||||
return {
|
||||
...user,
|
||||
notification_preferences: preferences
|
||||
};
|
||||
} catch (error) {
|
||||
logger.error('Error getting user notification preferences:', error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
async storeNotification(userId, notification) {
|
||||
try {
|
||||
const result = await database.query(
|
||||
`INSERT INTO notifications
|
||||
(user_id, type, title, message, severity, device_id, data, status, created_at)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, NOW())`,
|
||||
[
|
||||
userId,
|
||||
notification.type,
|
||||
notification.title,
|
||||
notification.message,
|
||||
notification.severity,
|
||||
notification.deviceId,
|
||||
JSON.stringify(notification.data || {}),
|
||||
'sent'
|
||||
]
|
||||
);
|
||||
|
||||
return result.insertId;
|
||||
} catch (error) {
|
||||
logger.error('Failed to store notification:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async updateNotificationStatus(notificationId, results) {
|
||||
try {
|
||||
const status = results.email || results.sms ? 'delivered' : 'failed';
|
||||
const deliveryInfo = JSON.stringify(results);
|
||||
|
||||
await database.query(
|
||||
'UPDATE notifications SET status = ?, delivery_info = ?, updated_at = NOW() WHERE id = ?',
|
||||
[status, deliveryInfo, notificationId]
|
||||
);
|
||||
} catch (error) {
|
||||
logger.error('Failed to update notification status:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async getUserNotifications(userId, limit = 50, offset = 0) {
|
||||
try {
|
||||
const notifications = await database.query(
|
||||
`SELECT * FROM notifications
|
||||
WHERE user_id = ?
|
||||
ORDER BY created_at DESC
|
||||
LIMIT ? OFFSET ?`,
|
||||
[userId, limit, offset]
|
||||
);
|
||||
|
||||
return notifications;
|
||||
} catch (error) {
|
||||
logger.error('Error getting user notifications:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async markNotificationAsRead(notificationId, userId) {
|
||||
try {
|
||||
await database.query(
|
||||
'UPDATE notifications SET read_at = NOW() WHERE id = ? AND user_id = ?',
|
||||
[notificationId, userId]
|
||||
);
|
||||
|
||||
logger.info(`Notification ${notificationId} marked as read by user ${userId}`);
|
||||
} catch (error) {
|
||||
logger.error('Error marking notification as read:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async markAllNotificationsAsRead(userId) {
|
||||
try {
|
||||
await database.query(
|
||||
'UPDATE notifications SET read_at = NOW() WHERE user_id = ? AND read_at IS NULL',
|
||||
[userId]
|
||||
);
|
||||
|
||||
logger.info(`All notifications marked as read for user ${userId}`);
|
||||
} catch (error) {
|
||||
logger.error('Error marking all notifications as read:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async deleteNotification(notificationId, userId) {
|
||||
try {
|
||||
await database.query(
|
||||
'DELETE FROM notifications WHERE id = ? AND user_id = ?',
|
||||
[notificationId, userId]
|
||||
);
|
||||
|
||||
logger.info(`Notification ${notificationId} deleted by user ${userId}`);
|
||||
} catch (error) {
|
||||
logger.error('Error deleting notification:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async getNotificationStatistics(userId = null) {
|
||||
try {
|
||||
let query = `
|
||||
SELECT
|
||||
type,
|
||||
severity,
|
||||
status,
|
||||
COUNT(*) as count,
|
||||
DATE(created_at) as date
|
||||
FROM notifications
|
||||
WHERE created_at >= DATE_SUB(NOW(), INTERVAL 7 DAY)
|
||||
`;
|
||||
|
||||
const params = [];
|
||||
|
||||
if (userId) {
|
||||
query += ' AND user_id = ?';
|
||||
params.push(userId);
|
||||
}
|
||||
|
||||
query += ' GROUP BY type, severity, status, DATE(created_at) ORDER BY date DESC';
|
||||
|
||||
const stats = await database.query(query, params);
|
||||
return stats;
|
||||
} catch (error) {
|
||||
logger.error('Error getting notification statistics:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async cleanupOldNotifications(daysToKeep = 30) {
|
||||
try {
|
||||
const result = await database.query(
|
||||
'DELETE FROM notifications WHERE created_at < DATE_SUB(NOW(), INTERVAL ? DAY)',
|
||||
[daysToKeep]
|
||||
);
|
||||
|
||||
logger.info(`Cleaned up ${result.affectedRows} old notifications`);
|
||||
return result.affectedRows;
|
||||
} catch (error) {
|
||||
logger.error('Error cleaning up old notifications:', error);
|
||||
return 0;
|
||||
}
|
||||
}
|
||||
|
||||
async healthCheck() {
|
||||
try {
|
||||
const emailStatus = this.emailTransporter ? 'configured' : 'not_configured';
|
||||
const smsStatus = this.twilioClient ? 'configured' : 'not_configured';
|
||||
|
||||
return {
|
||||
status: this.isInitialized ? 'healthy' : 'not_initialized',
|
||||
message: this.isInitialized ? 'Notification service is healthy' : 'Service not initialized',
|
||||
email: emailStatus,
|
||||
sms: smsStatus
|
||||
};
|
||||
} catch (error) {
|
||||
return {
|
||||
status: 'unhealthy',
|
||||
message: 'Notification service health check failed',
|
||||
error: error.message
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = new NotificationService();
|
||||
309
services/streamPipesService.js
Normal file
@ -0,0 +1,309 @@
|
||||
const axios = require('axios');
|
||||
const WebSocket = require('ws');
|
||||
const logger = require('../utils/logger');
|
||||
const database = require('../config/database');
|
||||
const redis = require('../config/redis');
|
||||
const aiAgentService = require('./aiAgentService');
|
||||
const alertService = require('./alertService');
|
||||
|
||||
class StreamPipesService {
|
||||
constructor() {
|
||||
// Support both old format (host:port) and new format (full URL)
|
||||
if (process.env.STREAMPIPES_BASE_URL) {
|
||||
this.baseUrl = process.env.STREAMPIPES_BASE_URL;
|
||||
} else {
|
||||
this.baseUrl = `http://${process.env.STREAMPIPES_HOST || 'localhost'}:${process.env.STREAMPIPES_PORT || 8080}`;
|
||||
}
|
||||
this.username = process.env.STREAMPIPES_USERNAME || 'admin';
|
||||
this.password = process.env.STREAMPIPES_PASSWORD || 'admin';
|
||||
this.token = null;
|
||||
this.wsConnections = new Map();
|
||||
this.isInitialized = false;
|
||||
}
|
||||
|
||||
async initialize() {
|
||||
try {
|
||||
await this.authenticate();
|
||||
if (this.token) {
|
||||
await this.setupDataStreams();
|
||||
this.isInitialized = true;
|
||||
logger.info('StreamPipes service initialized successfully');
|
||||
} else {
|
||||
logger.warn('StreamPipes service not available - continuing without StreamPipes integration');
|
||||
this.isInitialized = false;
|
||||
}
|
||||
} catch (error) {
|
||||
logger.warn('StreamPipes service not available - continuing without StreamPipes integration');
|
||||
this.isInitialized = false;
|
||||
}
|
||||
}
|
||||
|
||||
async authenticate() {
|
||||
try {
|
||||
const response = await axios.post(`${this.baseUrl}/streampipes-backend/api/v2/auth/login`, {
|
||||
username: this.username,
|
||||
password: this.password
|
||||
});
|
||||
|
||||
this.token = response.data.token;
|
||||
logger.info('StreamPipes authentication successful');
|
||||
} catch (error) {
|
||||
logger.error('StreamPipes authentication failed:', {
|
||||
message: error.message,
|
||||
status: error.response?.status,
|
||||
statusText: error.response?.statusText
|
||||
});
|
||||
// Don't throw error, just log it and continue
|
||||
this.token = null;
|
||||
}
|
||||
}
|
||||
|
||||
async setupDataStreams() {
|
||||
try {
|
||||
// Get all data streams
|
||||
const streams = await this.getDataStreams();
|
||||
|
||||
for (const stream of streams) {
|
||||
await this.subscribeToStream(stream);
|
||||
}
|
||||
|
||||
logger.info(`Subscribed to ${streams.length} data streams`);
|
||||
} catch (error) {
|
||||
logger.error('Failed to setup data streams:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async getDataStreams() {
|
||||
try {
|
||||
const response = await axios.get(`${this.baseUrl}/streampipes-backend/api/v2/streams`, {
|
||||
headers: {
|
||||
'Authorization': `Bearer ${this.token}`
|
||||
}
|
||||
});
|
||||
|
||||
return response.data || [];
|
||||
} catch (error) {
|
||||
logger.error('Failed to get data streams:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async subscribeToStream(stream) {
|
||||
try {
|
||||
// Convert HTTP URL to WebSocket URL
|
||||
let wsUrl;
|
||||
if (process.env.STREAMPIPES_BASE_URL) {
|
||||
wsUrl = process.env.STREAMPIPES_BASE_URL.replace('https://', 'wss://').replace('http://', 'ws://');
|
||||
} else {
|
||||
wsUrl = `ws://${process.env.STREAMPIPES_HOST || 'localhost'}:${process.env.STREAMPIPES_PORT || 8080}`;
|
||||
}
|
||||
wsUrl += `/streampipes-backend/api/v2/streams/${stream.elementId}/data`;
|
||||
|
||||
const ws = new WebSocket(wsUrl, {
|
||||
headers: {
|
||||
'Authorization': `Bearer ${this.token}`
|
||||
}
|
||||
});
|
||||
|
||||
ws.on('open', () => {
|
||||
logger.info(`Connected to StreamPipes stream: ${stream.name}`);
|
||||
this.wsConnections.set(stream.elementId, ws);
|
||||
});
|
||||
|
||||
ws.on('message', async (data) => {
|
||||
try {
|
||||
const message = JSON.parse(data);
|
||||
await this.processStreamData(stream, message);
|
||||
} catch (error) {
|
||||
logger.error('Error processing stream data:', error);
|
||||
}
|
||||
});
|
||||
|
||||
ws.on('error', (error) => {
|
||||
logger.error(`WebSocket error for stream ${stream.name}:`, error);
|
||||
});
|
||||
|
||||
ws.on('close', () => {
|
||||
logger.info(`Disconnected from StreamPipes stream: ${stream.name}`);
|
||||
this.wsConnections.delete(stream.elementId);
|
||||
|
||||
// Attempt to reconnect after 5 seconds
|
||||
setTimeout(() => {
|
||||
this.subscribeToStream(stream);
|
||||
}, 5000);
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
logger.error(`Failed to subscribe to stream ${stream.name}:`, error);
|
||||
}
|
||||
}
|
||||
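// Example (hypothetical values): with STREAMPIPES_BASE_URL=https://pipes.example.com and
// stream.elementId 'urn:sp:stream:abc123', the derived WebSocket URL becomes
// wss://pipes.example.com/streampipes-backend/api/v2/streams/urn:sp:stream:abc123/data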
|
||||
async processStreamData(stream, data) {
|
||||
try {
|
||||
const startTime = Date.now();
|
||||
|
||||
// Extract device information
|
||||
const deviceId = data.deviceId || data.sensorId || stream.elementId;
|
||||
const timestamp = data.timestamp || new Date().toISOString();
|
||||
|
||||
// Log the incoming data
|
||||
logger.logDeviceData(deviceId, 'stream', data);
|
||||
|
||||
// Store raw data in database
|
||||
await this.storeDeviceData(deviceId, stream, data, timestamp);
|
||||
|
||||
// Cache latest data in Redis
|
||||
await redis.cacheDeviceData(deviceId, {
|
||||
stream: stream.name,
|
||||
data: data,
|
||||
timestamp: timestamp
|
||||
});
|
||||
|
||||
// Process with AI Agent
|
||||
if (process.env.AI_AGENT_ENABLED === 'true') {
|
||||
await aiAgentService.processDeviceData(deviceId, data, timestamp);
|
||||
}
|
||||
|
||||
// Check for alerts
|
||||
await alertService.checkDeviceAlerts(deviceId, data, timestamp);
|
||||
|
||||
const processingTime = Date.now() - startTime;
|
||||
logger.logPerformance('stream_data_processing', processingTime, {
|
||||
deviceId,
|
||||
streamName: stream.name,
|
||||
dataSize: JSON.stringify(data).length
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
logger.error('Error processing stream data:', error);
|
||||
}
|
||||
}
|
||||
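// Illustrative only: a hypothetical stream message this handler would receive; when
// deviceId/sensorId is absent it falls back to the stream's elementId:
//
//   { deviceId: 'pump-01', timestamp: '2024-01-01T12:00:00Z',
//     sensors: { temperature: 72.4, pressure: 5.8 } }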
|
||||
async storeDeviceData(deviceId, stream, data, timestamp) {
|
||||
try {
|
||||
await database.query(
|
||||
`INSERT INTO device_data
|
||||
(device_id, stream_id, stream_name, raw_data, timestamp, created_at)
|
||||
VALUES (?, ?, ?, ?, ?, NOW())`,
|
||||
[deviceId, stream.elementId, stream.name, JSON.stringify(data), timestamp]
|
||||
);
|
||||
} catch (error) {
|
||||
logger.error('Failed to store device data:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async getDeviceData(deviceId, limit = 100) {
|
||||
try {
|
||||
const data = await database.query(
|
||||
`SELECT * FROM device_data
|
||||
WHERE device_id = ?
|
||||
ORDER BY timestamp DESC
|
||||
LIMIT ?`,
|
||||
[deviceId, limit]
|
||||
);
|
||||
return data;
|
||||
} catch (error) {
|
||||
logger.error('Failed to get device data:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async getDeviceDataByTimeRange(deviceId, startTime, endTime) {
|
||||
try {
|
||||
const data = await database.query(
|
||||
`SELECT * FROM device_data
|
||||
WHERE device_id = ?
|
||||
AND timestamp BETWEEN ? AND ?
|
||||
ORDER BY timestamp ASC`,
|
||||
[deviceId, startTime, endTime]
|
||||
);
|
||||
return data;
|
||||
} catch (error) {
|
||||
logger.error('Failed to get device data by time range:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async getStreamStatistics() {
|
||||
try {
|
||||
const stats = await database.query(`
|
||||
SELECT
|
||||
stream_name,
|
||||
COUNT(*) as message_count,
|
||||
MIN(timestamp) as first_message,
|
||||
MAX(timestamp) as last_message,
|
||||
AVG(JSON_LENGTH(raw_data)) as avg_data_size
|
||||
FROM device_data
|
||||
WHERE created_at >= DATE_SUB(NOW(), INTERVAL 24 HOUR)
|
||||
GROUP BY stream_name
|
||||
ORDER BY message_count DESC
|
||||
`);
|
||||
|
||||
return stats;
|
||||
} catch (error) {
|
||||
logger.error('Failed to get stream statistics:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async cleanupOldData(daysToKeep = 30) {
|
||||
try {
|
||||
const result = await database.query(
|
||||
'DELETE FROM device_data WHERE created_at < DATE_SUB(NOW(), INTERVAL ? DAY)',
|
||||
[daysToKeep]
|
||||
);
|
||||
|
||||
logger.info(`Cleaned up ${result.affectedRows} old data records`);
|
||||
return result.affectedRows;
|
||||
} catch (error) {
|
||||
logger.error('Failed to cleanup old data:', error);
|
||||
return 0;
|
||||
}
|
||||
}
|
||||
|
||||
async healthCheck() {
|
||||
try {
|
||||
if (!this.isInitialized) {
|
||||
return { status: 'not_initialized', message: 'Service not initialized' };
|
||||
}
|
||||
|
||||
if (!this.token) {
|
||||
return { status: 'not_authenticated', message: 'Not authenticated' };
|
||||
}
|
||||
|
||||
// Test API connection
|
||||
await axios.get(`${this.baseUrl}/streampipes-backend/api/v2/streams`, {
|
||||
headers: { 'Authorization': `Bearer ${this.token}` }
|
||||
});
|
||||
|
||||
return {
|
||||
status: 'healthy',
|
||||
message: 'Service is healthy',
|
||||
activeConnections: this.wsConnections.size
|
||||
};
|
||||
} catch (error) {
|
||||
return {
|
||||
status: 'unhealthy',
|
||||
message: 'Service health check failed',
|
||||
error: error.message
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
async disconnect() {
|
||||
try {
|
||||
// Close all WebSocket connections
|
||||
for (const [streamId, ws] of this.wsConnections) {
|
||||
ws.close();
|
||||
}
|
||||
this.wsConnections.clear();
|
||||
|
||||
logger.info('StreamPipes service disconnected');
|
||||
} catch (error) {
|
||||
logger.error('Error disconnecting StreamPipes service:', error);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = new StreamPipesService();
|
||||
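How other parts of the backend consume this service is not shown in this part of the commit. The following is a minimal sketch, assuming an Express router and a require path of `../services/streamPipesService` (both are assumptions); it only calls methods defined above (`healthCheck`, `getDeviceData`).

```javascript
// Hypothetical usage sketch (not part of this commit): an Express router that
// exposes the StreamPipes service's health and a device's recent data.
const express = require('express');
const streamPipesService = require('../services/streamPipesService'); // assumed path

const router = express.Router();

// GET /health -> result of the service's healthCheck()
router.get('/health', async (req, res) => {
  const health = await streamPipesService.healthCheck();
  res.json(health);
});

// GET /devices/:deviceId/data?limit=50 -> recent rows stored by storeDeviceData()
router.get('/devices/:deviceId/data', async (req, res) => {
  const limit = parseInt(req.query.limit, 10) || 100;
  const data = await streamPipesService.getDeviceData(req.params.deviceId, limit);
  res.json({ deviceId: req.params.deviceId, count: data.length, data });
});

module.exports = router;
```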
111
socket/socketHandler.js
Normal file
@ -0,0 +1,111 @@
const jwt = require('jsonwebtoken');
const logger = require('../utils/logger');
const redis = require('../config/redis');
const database = require('../config/database');

const socketHandler = (io) => {
  // Authentication middleware for Socket.IO
  io.use(async (socket, next) => {
    try {
      const token = socket.handshake.auth.token || socket.handshake.headers.authorization;

      if (!token) {
        return next(new Error('Authentication error: No token provided'));
      }

      const cleanToken = token.replace('Bearer ', '');
      const decoded = jwt.verify(cleanToken, process.env.JWT_SECRET);

      // Verify user session
      const session = await redis.getUserSession(decoded.id);
      if (!session) {
        return next(new Error('Authentication error: Session expired'));
      }

      // Get user info
      const [users] = await database.query(
        'SELECT id, username, email, role FROM users WHERE id = ? AND status = "active"',
        [decoded.id]
      );

      if (users.length === 0) {
        return next(new Error('Authentication error: User not found'));
      }

      socket.user = users[0];
      next();
    } catch (error) {
      logger.error('Socket authentication failed:', error);
      next(new Error('Authentication error: Invalid token'));
    }
  });

  io.on('connection', (socket) => {
    logger.info(`User connected: ${socket.user.username} (${socket.id})`);

    // Join user to their personal room
    socket.join(`user:${socket.user.id}`);

    // Join admin users to admin room
    if (socket.user.role === 'admin') {
      socket.join('admin');
    }

    // Handle device data subscriptions
    socket.on('subscribe_device', (deviceId) => {
      socket.join(`device:${deviceId}`);
      logger.info(`User ${socket.user.username} subscribed to device ${deviceId}`);
    });

    socket.on('unsubscribe_device', (deviceId) => {
      socket.leave(`device:${deviceId}`);
      logger.info(`User ${socket.user.username} unsubscribed from device ${deviceId}`);
    });

    // Handle alert subscriptions
    socket.on('subscribe_alerts', () => {
      socket.join('alerts');
      logger.info(`User ${socket.user.username} subscribed to alerts`);
    });

    socket.on('unsubscribe_alerts', () => {
      socket.leave('alerts');
      logger.info(`User ${socket.user.username} unsubscribed from alerts`);
    });

    // Handle disconnect
    socket.on('disconnect', () => {
      logger.info(`User disconnected: ${socket.user.username} (${socket.id})`);
    });
  });

  // Export socket functions for use in other modules
  return {
    // Emit device data updates
    emitDeviceData: (deviceId, data) => {
      io.to(`device:${deviceId}`).emit('device_data_update', {
        deviceId,
        data,
        timestamp: new Date().toISOString()
      });
    },

    // Emit alerts
    emitAlert: (alert) => {
      io.to('alerts').emit('new_alert', {
        ...alert,
        timestamp: new Date().toISOString()
      });
    },

    // Emit to admin users
    emitToAdmin: (event, data) => {
      io.to('admin').emit(event, {
        ...data,
        timestamp: new Date().toISOString()
      });
    }
  };
};

module.exports = socketHandler;
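The handler above registers the JWT middleware and room logic and returns emit helpers. A minimal wiring sketch, assuming a `server.js` entry point and the `socket.io` package (the file name, port handling, and CORS setting are assumptions), could look like this:

```javascript
// Hypothetical wiring sketch (not part of this commit): attach socketHandler to a
// Socket.IO server and keep the returned emit helpers for other modules to use.
const http = require('http');
const express = require('express');
const { Server } = require('socket.io');
const socketHandler = require('./socket/socketHandler');

const app = express();
const httpServer = http.createServer(app);
const io = new Server(httpServer, { cors: { origin: '*' } }); // CORS policy assumed

// Registers authentication middleware and room subscriptions, returns emitters
const socketApi = socketHandler(io);

// Example: push a reading to everyone subscribed to device "sensor-42"
socketApi.emitDeviceData('sensor-42', { temperature: 21.5 });

httpServer.listen(process.env.PORT || 3000);
```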
187
utils/logger.js
Normal file
@ -0,0 +1,187 @@
const winston = require('winston');
const path = require('path');
const fs = require('fs');

// Create logs directory if it doesn't exist
const logsDir = path.join(__dirname, '..', 'logs');
if (!fs.existsSync(logsDir)) {
  fs.mkdirSync(logsDir, { recursive: true });
}

// Define log format
const logFormat = winston.format.combine(
  winston.format.timestamp({
    format: 'YYYY-MM-DD HH:mm:ss'
  }),
  winston.format.errors({ stack: true }),
  winston.format.json(),
  winston.format.printf(({ timestamp, level, message, stack, ...meta }) => {
    let log = `${timestamp} [${level.toUpperCase()}]: ${message}`;

    if (stack) {
      log += `\n${stack}`;
    }

    if (Object.keys(meta).length > 0) {
      log += `\n${JSON.stringify(meta, null, 2)}`;
    }

    return log;
  })
);

// Create logger instance
const logger = winston.createLogger({
  level: process.env.LOG_LEVEL || 'info',
  format: logFormat,
  defaultMeta: { service: 'ai-agent-backend' },
  transports: [
    // Console transport
    new winston.transports.Console({
      format: winston.format.combine(
        winston.format.colorize(),
        winston.format.simple()
      )
    }),

    // File transport for all logs
    new winston.transports.File({
      filename: path.join(logsDir, 'app.log'),
      maxsize: 5242880, // 5MB
      maxFiles: 5,
      tailable: true
    }),

    // File transport for error logs
    new winston.transports.File({
      filename: path.join(logsDir, 'error.log'),
      level: 'error',
      maxsize: 5242880, // 5MB
      maxFiles: 5,
      tailable: true
    })
  ],

  // Handle exceptions
  exceptionHandlers: [
    new winston.transports.File({
      filename: path.join(logsDir, 'exceptions.log')
    })
  ],

  // Handle rejections
  rejectionHandlers: [
    new winston.transports.File({
      filename: path.join(logsDir, 'rejections.log')
    })
  ]
});

// Add request logging middleware
logger.logRequest = (req, res, next) => {
  const start = Date.now();

  res.on('finish', () => {
    const duration = Date.now() - start;
    const logData = {
      method: req.method,
      url: req.url,
      status: res.statusCode,
      duration: `${duration}ms`,
      userAgent: req.get('User-Agent'),
      ip: req.ip || req.connection.remoteAddress
    };

    if (res.statusCode >= 400) {
      logger.warn('HTTP Request', logData);
    } else {
      logger.info('HTTP Request', logData);
    }
  });

  next();
};

// Add database query logging
logger.logQuery = (sql, params, duration) => {
  logger.debug('Database Query', {
    sql,
    params,
    duration: `${duration}ms`
  });
};

// Add device data logging
logger.logDeviceData = (deviceId, dataType, data) => {
  logger.info('Device Data Received', {
    deviceId,
    dataType,
    timestamp: new Date().toISOString(),
    dataSize: JSON.stringify(data).length
  });
};

// Add alert logging
logger.logAlert = (alertType, severity, message, deviceId = null) => {
  logger.warn('Alert Generated', {
    alertType,
    severity,
    message,
    deviceId,
    timestamp: new Date().toISOString()
  });
};

// Add AI agent logging
logger.logAIAction = (action, details, confidence = null) => {
  logger.info('AI Agent Action', {
    action,
    details,
    confidence,
    timestamp: new Date().toISOString()
  });
};

// Add healing action logging
logger.logHealingAction = (action, deviceId, result, duration) => {
  logger.info('Healing Action', {
    action,
    deviceId,
    result,
    duration: `${duration}ms`,
    timestamp: new Date().toISOString()
  });
};

// Add performance logging
logger.logPerformance = (operation, duration, metadata = {}) => {
  logger.info('Performance Metric', {
    operation,
    duration: `${duration}ms`,
    ...metadata,
    timestamp: new Date().toISOString()
  });
};

// Add security logging
logger.logSecurity = (event, details, severity = 'info') => {
  const logMethod = severity === 'error' ? 'error' : 'warn';
  logger[logMethod]('Security Event', {
    event,
    details,
    timestamp: new Date().toISOString()
  });
};

// Add notification logging
logger.logNotification = (type, recipient, content, status) => {
  logger.info('Notification Sent', {
    type,
    recipient,
    content: content.substring(0, 100) + (content.length > 100 ? '...' : ''),
    status,
    timestamp: new Date().toISOString()
  });
};

module.exports = logger;
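A short usage sketch for the logger's custom helpers, assuming an Express app (the app setup and route are assumptions); `logRequest` is mounted as middleware and the structured helpers defined above are called directly:

```javascript
// Hypothetical usage sketch (not part of this commit): wire the logger's request
// middleware into Express and call two of its structured helpers.
const express = require('express');
const logger = require('./utils/logger'); // assumed path

const app = express();

// Logs every response with method, URL, status and duration (warn on 4xx/5xx)
app.use(logger.logRequest);

app.get('/ping', (req, res) => {
  const start = Date.now();
  res.json({ ok: true });
  logger.logPerformance('ping_handler', Date.now() - start, { route: '/ping' });
});

// Example of the alert helper defined above
logger.logAlert('threshold', 'high', 'Temperature above 80°C', 'sensor-42');

app.listen(process.env.PORT || 3000, () => logger.info('Server started'));
```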