Development Setup Guide

Overview

This guide will help you set up a complete development environment for JobHive. The platform uses Django for the backend, React for the frontend, and requires several external services for full functionality.

Prerequisites

System Requirements

  • Operating System: macOS, Linux, or Windows with WSL2
  • Python: 3.11 or higher
  • Node.js: 18.x or higher
  • Docker: 20.10 or higher (for services)
  • Git: Latest version
  • Memory: 8GB RAM minimum, 16GB recommended
  • Storage: 10GB free space

Required Tools

# Package managers
pip (Python package manager)
npm or yarn (Node.js package manager)
docker-compose (Container orchestration)

# Development tools  
git (Version control)
curl (HTTP client for testing)
postgresql-client (Database client)
redis-cli (Redis client)

Initial Setup

1. Clone the Repository

# Clone the main repository
git clone https://github.com/your-org/jobhive.git
cd jobhive

# Initialize submodules if any
git submodule update --init --recursive

2. Environment Configuration

# Create environment file from template
cp .env.example .env.local

# Edit environment variables
nano .env.local

Environment Variables (.env.local)

# Django Configuration
DJANGO_SETTINGS_MODULE=config.settings.local
DJANGO_SECRET_KEY=your-super-secret-development-key-here
DEBUG=True
ALLOWED_HOSTS=localhost,127.0.0.1,0.0.0.0

# Database Configuration
DATABASE_URL=postgresql://jobhive_user:jobhive_pass@localhost:5432/jobhive_dev
POSTGRES_DB=jobhive_dev
POSTGRES_USER=jobhive_user
POSTGRES_PASSWORD=jobhive_pass

# Redis Configuration
REDIS_URL=redis://localhost:6379/0
CELERY_BROKER_URL=redis://localhost:6379/0
CELERY_RESULT_BACKEND=redis://localhost:6379/0

# Email Configuration (Development)
EMAIL_BACKEND=django.core.mail.backends.console.EmailBackend
EMAIL_HOST=localhost
EMAIL_PORT=1025
EMAIL_USE_TLS=False

# Storage Configuration (Development)
USE_S3=False
MEDIA_ROOT=./media
STATIC_ROOT=./staticfiles

# API Keys (Development/Test Keys)
OPENAI_API_KEY=your-openai-test-key
STRIPE_PUBLISHABLE_KEY=pk_test_...
STRIPE_SECRET_KEY=sk_test_...
STRIPE_WEBHOOK_SECRET=whsec_test_...

# LiveKit Configuration (Optional for full features)
LIVEKIT_API_KEY=your-livekit-api-key
LIVEKIT_API_SECRET=your-livekit-api-secret
LIVEKIT_WS_URL=ws://localhost:7880

# DataDog (Optional for monitoring)
DD_TRACE_ENABLED=False
DD_LOGS_ENABLED=False
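Django does not read DATABASE_URL on its own; projects usually pull it in via a package such as dj-database-url or django-environ. As a rough stdlib-only sketch of what that parsing does (this is illustrative, not the project's actual settings code), using the example URL from above:

```python
from urllib.parse import urlparse

def parse_database_url(url: str) -> dict:
    """Split a DATABASE_URL into the pieces Django's DATABASES setting needs.

    A stdlib-only sketch of what packages like dj-database-url do;
    not JobHive's actual settings code.
    """
    parts = urlparse(url)
    return {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": parts.path.lstrip("/"),
        "USER": parts.username,
        "PASSWORD": parts.password,
        "HOST": parts.hostname,
        "PORT": parts.port or 5432,
    }

config = parse_database_url(
    "postgresql://jobhive_user:jobhive_pass@localhost:5432/jobhive_dev"
)
print(config["NAME"])  # jobhive_dev
print(config["HOST"])  # localhost
```
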

Backend Setup

1. Python Environment Setup

# Create virtual environment
python -m venv venv

# Activate virtual environment
# On macOS/Linux:
source venv/bin/activate
# On Windows:
venv\Scripts\activate

# Upgrade pip
pip install --upgrade pip setuptools wheel

2. Install Dependencies

# Install Python dependencies
pip install -r requirements/local.txt

# Install additional development tools
pip install pre-commit black flake8 mypy pytest-django

3. Database Setup

# Start PostgreSQL using Docker
docker-compose -f docker-compose.local.yml up -d postgres

# Wait for database to be ready (check logs)
docker-compose -f docker-compose.local.yml logs postgres

# Create database and run migrations
python manage.py migrate

# Create superuser account
python manage.py createsuperuser

# Load initial data (optional)
python manage.py loaddata fixtures/initial_data.json

4. Redis Setup

# Start Redis using Docker
docker-compose -f docker-compose.local.yml up -d redis

# Verify Redis is running
redis-cli ping
# Should return: PONG
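Under the hood, `redis-cli ping` sends a single RESP (Redis Serialization Protocol) command over TCP and expects `+PONG` back. A stdlib-only sketch of that exchange; the encoder is genuine RESP framing, while actually calling `ping()` of course requires the Redis container to be up:

```python
import socket

def encode_resp_command(*args: str) -> bytes:
    """Encode a command as a RESP array of bulk strings, as redis-cli does."""
    out = [f"*{len(args)}\r\n".encode()]
    for arg in args:
        data = arg.encode()
        out.append(f"${len(data)}\r\n".encode() + data + b"\r\n")
    return b"".join(out)

def ping(host: str = "localhost", port: int = 6379) -> str:
    """Send PING and return the raw reply line (e.g. '+PONG')."""
    with socket.create_connection((host, port), timeout=2) as sock:
        sock.sendall(encode_resp_command("PING"))
        return sock.recv(64).decode().strip()

# The wire format for PING:
print(encode_resp_command("PING"))  # b'*1\r\n$4\r\nPING\r\n'
```
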

5. Initial Data Setup

# Create default subscription plans
python manage.py setup_billing

# Create default skill categories and skills
python manage.py seed_initial_data

# Create score weights for different job focuses
python manage.py create_default_score_weights

# Create sample learning resources
python manage.py create_learning_resources

# Load assessment categories and types
python manage.py load_assessment_categories_types

6. Start Development Server

# Start Django development server
python manage.py runserver 0.0.0.0:8000

# The API will be available at:
# http://localhost:8000/api/v1/
# Admin interface: http://localhost:8000/admin/

7. Start Background Workers

# In a new terminal, activate virtual environment
source venv/bin/activate

# Start Celery worker
celery -A config.celery_app worker -l info

# In another terminal, start Celery beat (for scheduled tasks)
celery -A config.celery_app beat -l info

# Optional: Start Flower for monitoring Celery tasks
celery -A config.celery_app flower
# Flower UI: http://localhost:5555/
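Celery beat reads its schedule from the CELERY_BEAT_SCHEDULE setting. A hypothetical entry showing the shape of that config (the task name `expire_stale_interview_sessions` is illustrative only, not necessarily a task the project defines):

```python
from datetime import timedelta

# Hypothetical beat schedule entry; the task path is illustrative only.
CELERY_BEAT_SCHEDULE = {
    "expire-stale-interview-sessions": {
        "task": "jobhive.interview.tasks.expire_stale_interview_sessions",
        "schedule": timedelta(minutes=15),  # run every 15 minutes
    },
}

entry = CELERY_BEAT_SCHEDULE["expire-stale-interview-sessions"]
print(entry["schedule"])  # 0:15:00
```
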

Frontend Setup (Optional)

If you’re working on the full-stack application:

1. Install Node.js Dependencies

# Navigate to frontend directory
cd ../JobHive_Frontend  # Adjust path as needed

# Install dependencies
npm install
# or
yarn install

2. Frontend Environment Configuration

# Create environment file
cp .env.example .env.local

# Edit environment variables
nano .env.local

Frontend Environment Variables

# API Configuration
REACT_APP_API_BASE_URL=http://localhost:8000/api/v1
REACT_APP_WS_BASE_URL=ws://localhost:8000/ws

# Environment
REACT_APP_ENV=development
REACT_APP_DEBUG=true

# Authentication
REACT_APP_JWT_REFRESH_INTERVAL=3600

# Features (Enable/disable features in development)
REACT_APP_ENABLE_ANALYTICS=true
REACT_APP_ENABLE_BILLING=true
REACT_APP_ENABLE_LIVEKIT=false

# External Services (Development keys)
REACT_APP_STRIPE_PUBLISHABLE_KEY=pk_test_...
REACT_APP_LIVEKIT_WS_URL=ws://localhost:7880

3. Start Frontend Development Server

# Start React development server
npm start
# or
yarn start

# Frontend will be available at:
# http://localhost:3000/

Docker Development Environment

For a complete containerized development environment:

1. Docker Compose Setup

# Start all services using Docker Compose
docker-compose -f docker-compose.local.yml up -d

# Services started:
# - postgres (PostgreSQL database)
# - redis (Redis cache and task queue)
# - mailpit (Email testing server)
# - django (Django web application)
# - celery-worker (Background task processor)
# - celery-beat (Scheduled task scheduler)
# - celery-flower (Task monitoring)

2. Verify Services

# Check all services are running
docker-compose -f docker-compose.local.yml ps

# Check service logs
docker-compose -f docker-compose.local.yml logs django
docker-compose -f docker-compose.local.yml logs celery-worker

# Access services:
# Django: http://localhost:8000/
# Flower: http://localhost:5555/
# Mailpit: http://localhost:8025/

3. Django Commands in Docker

# Execute Django commands in the container
docker-compose -f docker-compose.local.yml exec django python manage.py migrate
docker-compose -f docker-compose.local.yml exec django python manage.py createsuperuser
docker-compose -f docker-compose.local.yml exec django python manage.py collectstatic --noinput

# Access Django shell
docker-compose -f docker-compose.local.yml exec django python manage.py shell

Database Management

Development Database Commands

# Reset database (WARNING: Destroys all data)
python manage.py flush --noinput

# Create fresh migrations
python manage.py makemigrations

# Apply migrations
python manage.py migrate

# Show migration status
python manage.py showmigrations

# Create database backup
pg_dump -h localhost -U jobhive_user jobhive_dev > backup.sql

# Restore database backup
psql -h localhost -U jobhive_user jobhive_dev < backup.sql

Useful Database Queries

-- Check recent interview sessions
SELECT id, user_id, status, start_time, completion_percentage 
FROM interview_interviewsession 
ORDER BY start_time DESC LIMIT 10;

-- Check user subscriptions
SELECT u.email, s.name as plan_name, cs.status, cs.start_date
FROM users_user u
JOIN billing_customersubscription cs ON u.id = cs.user_id
JOIN billing_subscriptionplan s ON cs.plan_id = s.id;

-- Check sentiment analysis results
SELECT session_id, overall_sentiment, positive_score, negative_score, created_at
FROM interview_sentimentsession
ORDER BY created_at DESC LIMIT 20;
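The first query above can also be tried outside PostgreSQL. Here is a self-contained sqlite3 stand-in with the same table shape (column types simplified, sample rows invented) to show what the `ORDER BY ... LIMIT` returns:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE interview_interviewsession (
        id INTEGER PRIMARY KEY,
        user_id INTEGER,
        status TEXT,
        start_time TEXT,
        completion_percentage REAL
    )"""
)
conn.executemany(
    "INSERT INTO interview_interviewsession VALUES (?, ?, ?, ?, ?)",
    [
        (1, 10, "completed", "2024-02-14T10:00:00Z", 100.0),
        (2, 11, "in_progress", "2024-02-15T09:30:00Z", 40.0),
    ],
)

# Same query as above: most recent sessions first
rows = conn.execute(
    """SELECT id, user_id, status, start_time, completion_percentage
       FROM interview_interviewsession
       ORDER BY start_time DESC LIMIT 10"""
).fetchall()
print(rows[0])  # (2, 11, 'in_progress', '2024-02-15T09:30:00Z', 40.0)
```
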

Testing Setup

1. Running Tests

# Run all tests
python manage.py test

# Run specific app tests
python manage.py test jobhive.interview

# Run with coverage
coverage run --source='.' manage.py test
coverage report
coverage html  # Generates HTML coverage report

# Run pytest (if preferred)
pytest

# Run with verbose output
pytest -v

# Run specific test file
pytest jobhive/interview/tests/test_models.py
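A test in a file like test_models.py typically has roughly this shape. The `completion_percentage` helper below is hypothetical, included only to illustrate the pytest pattern; real tests exercise actual JobHive models:

```python
# Hypothetical example of the pytest style used above; the helper under
# test is illustrative, not actual JobHive code.
def completion_percentage(answered: int, total: int) -> float:
    """Percentage of interview questions answered, clamped to 0-100."""
    if total <= 0:
        return 0.0
    return round(min(answered, total) / total * 100, 1)

def test_completion_percentage():
    assert completion_percentage(3, 4) == 75.0
    assert completion_percentage(0, 0) == 0.0
    assert completion_percentage(5, 4) == 100.0  # over-answering is clamped

test_completion_percentage()  # pytest would discover and run this automatically
```
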

2. Test Database

# Create test-specific settings
export DJANGO_SETTINGS_MODULE=config.settings.test

# Tests will automatically use test database
# Test database is created and destroyed for each test run

3. API Testing

# Install API testing tools
pip install httpie

# Test API endpoints
http GET localhost:8000/api/v1/users/me/ Authorization:"Bearer your-jwt-token"

# Create interview session
http POST localhost:8000/api/v1/interview/interviews/ \
  Authorization:"Bearer your-jwt-token" \
  job_id:=123 \
  scheduled_time="2024-02-15T14:00:00Z"
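The same POST can be issued from Python with only the standard library. The request below is just constructed, not sent, so the placeholder token and `job_id` mirror the httpie example above:

```python
import json
import urllib.request

payload = {"job_id": 123, "scheduled_time": "2024-02-15T14:00:00Z"}
req = urllib.request.Request(
    "http://localhost:8000/api/v1/interview/interviews/",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer your-jwt-token",
        "Content-Type": "application/json",
    },
    method="POST",
)

print(req.method)                      # POST
print(req.get_header("Content-type"))  # application/json

# To actually send it (requires the dev server to be running):
# with urllib.request.urlopen(req) as resp:
#     print(resp.status, json.load(resp))
```
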

Development Tools and Pre-commit Hooks

1. Code Quality Tools Setup

# Install pre-commit hooks
pre-commit install

# Run pre-commit on all files
pre-commit run --all-files

# Update pre-commit hooks
pre-commit autoupdate

2. Code Formatting

# Format Python code with Black
black .

# Sort imports with isort
isort .

# Check code style with flake8
flake8 .

# Type checking with mypy
mypy jobhive/

3. Pre-commit Configuration (.pre-commit-config.yaml)

repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-added-large-files

  - repo: https://github.com/psf/black
    rev: 23.3.0
    hooks:
      - id: black

  - repo: https://github.com/pycqa/isort
    rev: 5.12.0
    hooks:
      - id: isort

  - repo: https://github.com/pycqa/flake8
    rev: 6.0.0
    hooks:
      - id: flake8

IDE Configuration

Visual Studio Code Setup

1. Recommended Extensions (.vscode/extensions.json)

{
  "recommendations": [
    "ms-python.python",
    "ms-python.black-formatter",
    "ms-python.isort",
    "ms-python.flake8",
    "ms-python.mypy-type-checker",
    "redhat.vscode-yaml",
    "batisteo.vscode-django",
    "formulahendry.auto-rename-tag",
    "bradlc.vscode-tailwindcss"
  ]
}

2. VS Code Settings (.vscode/settings.json)

{
  "python.defaultInterpreterPath": "./venv/bin/python",
  "python.linting.enabled": true,
  "python.linting.flake8Enabled": true,
  "python.formatting.provider": "black",
  "python.sortImports.args": ["--profile", "black"],
  "editor.formatOnSave": true,
  "editor.codeActionsOnSave": {
    "source.organizeImports": true
  },
  "files.exclude": {
    "**/__pycache__": true,
    "**/*.pyc": true,
    ".coverage": true,
    "htmlcov/": true
  }
}

3. Launch Configuration (.vscode/launch.json)

{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Django",
      "type": "python",
      "request": "launch",
      "program": "${workspaceFolder}/manage.py",
      "args": ["runserver", "0.0.0.0:8000"],
      "django": true,
      "envFile": "${workspaceFolder}/.env.local"
    },
    {
      "name": "Django Tests",
      "type": "python",
      "request": "launch",
      "program": "${workspaceFolder}/manage.py",
      "args": ["test"],
      "django": true,
      "envFile": "${workspaceFolder}/.env.local"
    }
  ]
}

PyCharm Setup

1. Project Configuration

  1. Open project in PyCharm
  2. Configure Python interpreter: Settings → Project → Python Interpreter
  3. Select existing virtual environment: ./venv/bin/python
  4. Enable Django support: Settings → Languages & Frameworks → Django
  5. Set Django project root and settings module

2. Run Configurations

<!-- Django Server -->
<configuration name="Django Server" type="Python">
  <option name="INTERPRETER_OPTIONS" value="" />
  <option name="PARENT_ENVS" value="true" />
  <envs>
    <env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
  </envs>
  <option name="SDK_HOME" value="$PROJECT_DIR$/venv/bin/python" />
  <option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
  <option name="IS_MODULE_SDK" value="false" />
  <option name="ADD_CONTENT_ROOTS" value="true" />
  <option name="ADD_SOURCE_ROOTS" value="true" />
  <option name="SCRIPT_NAME" value="manage.py" />
  <option name="PARAMETERS" value="runserver 0.0.0.0:8000" />
  <option name="SHOW_COMMAND_LINE" value="false" />
  <option name="EMULATE_TERMINAL" value="false" />
</configuration>

Troubleshooting Common Issues

Database Connection Issues

# Check if PostgreSQL is running
docker-compose -f docker-compose.local.yml ps postgres

# Check PostgreSQL logs
docker-compose -f docker-compose.local.yml logs postgres

# Reset PostgreSQL container (stops, removes, and recreates it)
docker-compose -f docker-compose.local.yml rm -sf postgres
docker-compose -f docker-compose.local.yml up -d postgres

# Test database connection
python manage.py dbshell

Redis Connection Issues

# Check if Redis is running
docker-compose -f docker-compose.local.yml ps redis

# Test Redis connection
redis-cli ping

# Reset Redis container
docker-compose -f docker-compose.local.yml restart redis

Migration Issues

# Check migration status
python manage.py showmigrations

# Create migrations for changes
python manage.py makemigrations

# Apply specific migration
python manage.py migrate app_name migration_name

# Fake apply migration (if needed)
python manage.py migrate app_name migration_name --fake

# Reset migrations (nuclear option)
find . -path "*/migrations/*.py" -not -name "__init__.py" -delete
find . -path "*/migrations/*.pyc" -delete
python manage.py makemigrations
python manage.py migrate

Celery Issues

# Check Celery worker status
celery -A config.celery_app inspect active

# Check Celery queue status
celery -A config.celery_app inspect reserved

# Purge all tasks from queue
celery -A config.celery_app purge

# Restart Celery worker
# Stop with Ctrl+C and restart
celery -A config.celery_app worker -l info

Permission Issues

# Fix file permissions (Unix/Linux)
chmod +x manage.py
chmod -R 755 staticfiles/
chmod -R 755 media/

# Fix Docker permission issues
sudo chown -R $USER:$USER .

Port Conflicts

# Check what's using port 8000
lsof -i :8000
netstat -tulpn | grep :8000

# Kill process using port
sudo kill -9 $(lsof -t -i:8000)

# Use different port
python manage.py runserver 0.0.0.0:8001
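Rather than guessing that 8001 is free, you can ask the OS for any unused port. A small stdlib helper (the port it prints will vary per machine):

```python
import socket

def find_free_port() -> int:
    """Bind to port 0 so the OS picks an unused port, then release it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.bind(("127.0.0.1", 0))
        return sock.getsockname()[1]

port = find_free_port()
print(f"python manage.py runserver 0.0.0.0:{port}")
```
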

Development Workflow

1. Daily Development Routine

# Start development environment
docker-compose -f docker-compose.local.yml up -d postgres redis

# Activate virtual environment
source venv/bin/activate

# Update dependencies (if needed)
pip install -r requirements/local.txt

# Apply any new migrations
python manage.py migrate

# Start Django server
python manage.py runserver

# Start Celery worker (in another terminal)
celery -A config.celery_app worker -l info

2. Making Changes

# Create new branch for feature
git checkout -b feature/new-interview-analysis

# Make changes to code
# ...

# Create migrations for model changes
python manage.py makemigrations

# Test changes
python manage.py test

# Run code quality checks
pre-commit run --all-files

# Commit changes
git add .
git commit -m "Add new interview analysis feature"

# Push changes
git push origin feature/new-interview-analysis

3. Integration Testing

# Run full test suite
python manage.py test

# Run specific test categories
python manage.py test jobhive.interview.tests.test_ai_analysis
python manage.py test jobhive.billing.tests

# Test API endpoints
python manage.py test jobhive.interview.api.tests

# Performance testing (if implemented)
python manage.py test --tag=performance

Additional Resources

Useful Commands Reference

# Django commands
python manage.py --help
python manage.py shell
python manage.py dbshell
python manage.py collectstatic
python manage.py check

# Docker commands
docker-compose -f docker-compose.local.yml up -d
docker-compose -f docker-compose.local.yml down
docker-compose -f docker-compose.local.yml logs service_name
docker-compose -f docker-compose.local.yml exec service_name bash

# Git commands
git status
git log --oneline
git branch -a
git checkout -b new-branch
git merge main
git rebase main

This development setup guide should get you up and running with a fully functional JobHive development environment. For additional help, refer to the troubleshooting section or reach out to the development team.