Kafka vs Traditional Database Demo

This project demonstrates the challenges of using traditional databases for real-time event streaming, and shows how Apache Kafka addresses those challenges, through practical, runnable examples.

🎯 Learning Goals

  • Understand why traditional databases struggle with high-frequency event streaming
  • Learn how Apache Kafka excels at real-time event processing
  • See practical performance differences through metrics and logs
  • Explore event-driven architecture patterns
  • Gain hands-on experience with Kafka producers and consumers

🚀 Quick Start

1. Prerequisites

  • Node.js (v16 or higher)
  • Docker Desktop (for Kafka)
  • Command Prompt (Windows)

2. Setup

# Clone or download this project
cd kafka-vs-database-demo

# Install dependencies and build
npm run setup

# Start Kafka infrastructure
npm run start-kafka

3. Run the Interactive Demo

npm run demo

This launches an interactive menu where you can:

  1. Start Kafka infrastructure
  2. Run the database demo (shows limitations)
  3. Run the Kafka producer demo
  4. Run the Kafka consumer demo
  5. View project structure

Running the Demos

Part 1: Traditional Database Demo

This demo shows the challenges of using traditional databases for high-frequency event ingestion:

# Run with TypeScript directly (development)
npm run dev:db

# Or run compiled JavaScript
npm run start:db-demo

What you'll observe (a sketch of the per-event write pattern follows this list):

  • Slow write performance as event rate increases
  • Database connection overhead for each write
  • No built-in support for real-time processing
  • Difficulty in handling backpressure
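
For reference, the write pattern this demo exercises looks roughly like the sketch below. It is illustrative only: it assumes a PostgreSQL database accessed through node-postgres (pg), plus a hypothetical events table and connection string; the demo's actual database and client code may differ.

// Minimal sketch of a per-event, blocking write loop (not the project's code).
import { Client } from "pg";

interface DemoEvent {
  id: string;
  type: string;
  userId: string;
  timestamp: number;
  payload: Record<string, unknown>;
}

async function writeEventsSequentially(events: DemoEvent[]): Promise<void> {
  // Hypothetical connection string and table name, for illustration only.
  const client = new Client({ connectionString: "postgres://localhost:5432/demo" });
  await client.connect();

  const start = Date.now();
  for (const event of events) {
    // Each event is a separate, awaited INSERT: the producer is blocked until
    // the database acknowledges the row, so per-write latency adds up quickly.
    await client.query(
      "INSERT INTO events (id, type, user_id, ts, payload) VALUES ($1, $2, $3, $4, $5)",
      [event.id, event.type, event.userId, new Date(event.timestamp), JSON.stringify(event.payload)]
    );
  }
  const elapsedMs = Date.now() - start;
  console.log(`Wrote ${events.length} events in ${elapsedMs} ms`);

  await client.end();
}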

Part 2: Kafka Demo

This demo shows how Kafka handles high-throughput event streaming efficiently:

# Start the Kafka producer (in one terminal)
npm run dev:kafka-producer

# Start the Kafka consumer (in another terminal)
npm run dev:kafka-consumer

What you'll observe (a minimal producer sketch follows this list):

  • High-throughput event production
  • Asynchronous, decoupled processing
  • Built-in durability and replay capabilities
  • Easy horizontal scaling
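
The producer side boils down to the pattern sketched below. This is not the project's code: it assumes the KafkaJS client, the broker on localhost:9092 started above, and a hypothetical events topic.

// Minimal KafkaJS producer sketch (illustrative only).
import { Kafka } from "kafkajs";

const kafka = new Kafka({ clientId: "demo-producer", brokers: ["localhost:9092"] });
const producer = kafka.producer();

async function produceEvents(total: number, batchSize: number): Promise<void> {
  await producer.connect();

  let batch: { key: string; value: string }[] = [];
  for (let i = 0; i < total; i++) {
    batch.push({
      key: `user-${i % 100}`, // messages with the same key land on the same partition
      value: JSON.stringify({ id: i, type: "page_view", timestamp: Date.now() }),
    });

    if (batch.length === batchSize) {
      // One send() carries a whole batch of messages, so the broker round-trip
      // is amortized over many events instead of paid once per event.
      await producer.send({ topic: "events", messages: batch });
      batch = [];
    }
  }
  if (batch.length > 0) {
    await producer.send({ topic: "events", messages: batch });
  }

  await producer.disconnect();
}

produceEvents(10_000, 500).catch(console.error);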

Key Learning Points

Traditional Database Challenges:

  1. Performance Bottlenecks: Each event requires a separate database write operation
  2. Blocking Operations: Synchronous writes slow down the producer
  3. No Native Streaming: Databases aren't designed for continuous data streams
  4. Tight Coupling: Producer and consumer are tightly coupled through the database
  5. Limited Scalability: Difficult to scale horizontally for high event rates

Kafka Advantages:

  1. High Throughput: Designed for millions of events per second
  2. Asynchronous Processing: Producers and consumers operate independently
  3. Durability: Events are persisted and can be replayed (see the consumer sketch after this list)
  4. Scalability: Easy horizontal scaling with partitions
  5. Real-time Processing: Built for streaming workloads
  6. Fault Tolerance: Replication and fault recovery built-in
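
To make the durability, replay, and decoupling points above concrete, here is a minimal consumer sketch. Again, it assumes KafkaJS and the hypothetical events topic rather than the demo's actual code.

// Minimal KafkaJS consumer sketch (illustrative only).
import { Kafka } from "kafkajs";

const kafka = new Kafka({ clientId: "demo-consumer", brokers: ["localhost:9092"] });
const consumer = kafka.consumer({ groupId: "demo-consumer-group" });

async function consumeEvents(): Promise<void> {
  await consumer.connect();
  // fromBeginning illustrates replay: a new consumer group can re-read every
  // event still retained on the topic, independently of when it was produced.
  await consumer.subscribe({ topic: "events", fromBeginning: true });

  await consumer.run({
    // eachMessage runs independently of the producer: the two sides are
    // decoupled by the broker and can be scaled or restarted separately.
    eachMessage: async ({ partition, message }) => {
      console.log(`partition=${partition} offset=${message.offset} value=${message.value?.toString()}`);
    },
  });
}

consumeEvents().catch(console.error);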

Event Types Simulated

The demo simulates realistic events:

  • User Activity: clicks, page views, scrolls
  • Financial Transactions: payments, transfers, deposits
  • System Events: logins, logouts, errors

Each event includes (a TypeScript sketch of one possible shape follows this list):

  • Unique event ID
  • Timestamp
  • User/session ID
  • Event type and metadata
  • Realistic data values
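
One possible TypeScript shape for such an event is sketched below; the demo's actual field names and types may differ.

// Illustrative event shape only; not the demo's actual type definitions.
type EventCategory = "user_activity" | "financial_transaction" | "system_event";

interface SimulatedEvent {
  eventId: string;                   // unique event ID (e.g. a UUID)
  timestamp: number;                 // epoch milliseconds
  userId: string;                    // user or session identifier
  category: EventCategory;           // high-level event family
  type: string;                      // e.g. "click", "payment", "login"
  metadata: Record<string, unknown>; // realistic data values (amount, page URL, ...)
}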

Performance Comparison

Run both demos and observe:

  • Database Demo: Watch latency increase with event rate
  • Kafka Demo: Consistent performance even at high rates

The console logs will show timing metrics and throughput statistics to highlight the performance differences.
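
A metrics tracker of the kind those logs suggest can be as simple as the sketch below; this is illustrative only, not the project's implementation.

// Simple throughput/latency tracker sketch (illustrative only).
class ThroughputTracker {
  private count = 0;
  private totalLatencyMs = 0;
  private readonly startedAt = Date.now();

  record(latencyMs: number): void {
    this.count += 1;
    this.totalLatencyMs += latencyMs;
  }

  report(label: string): void {
    const elapsedSec = (Date.now() - this.startedAt) / 1000;
    const throughput = elapsedSec > 0 ? this.count / elapsedSec : 0;
    const avgLatency = this.count > 0 ? this.totalLatencyMs / this.count : 0;
    console.log(
      `[${label}] ${this.count} events | ${throughput.toFixed(0)} events/s | avg latency ${avgLatency.toFixed(2)} ms`
    );
  }
}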

Docker Compose Services

The included docker-compose.yml sets up:

  • Zookeeper: Kafka coordination service
  • Kafka Broker: Single Kafka instance for development
  • Kafka UI (optional): Web interface for monitoring

Next Steps

After running this demo, students should understand:

  1. When to use traditional databases vs. streaming platforms
  2. How event-driven architectures scale
  3. The benefits of decoupled producer-consumer patterns
  4. Why Kafka is the industry standard for event streaming

Troubleshooting

  • Kafka connection errors: Ensure Docker containers are running
  • Port conflicts: Check that ports 9092 (Kafka) and 2181 (Zookeeper) are available
  • Memory issues: Kafka may need more memory in Docker settings
