Compare commits

...

4 Commits

29 changed files with 1166 additions and 541 deletions

View File

@@ -0,0 +1,31 @@
---
description: Start API dev server with automatic error monitoring and cleanup
---

You need to launch the API development server. Follow these steps carefully:

1. **Kill any existing server on port 3000**:
   - First check if there's a background shell running the dev server in Claude Code and kill it using the KillShell tool
   - Then check for any process using port 3000 with `lsof -ti:3000` and kill it with `kill -9 $(lsof -ti:3000)` if found
   - This ensures a clean restart regardless of where the server was started
2. **Start the dev server in background**:
   - Navigate to the API service directory and start the server: `cd /projects/my-projects/banatie-service/apps/api-service && pnpm dev`
   - Run this command in the background using the Bash tool with `run_in_background: true`
   - Save the shell ID for monitoring
3. **Monitor the logs**:
   - Wait 3-5 seconds for the server to start
   - Use the BashOutput tool to check the background process output for errors
   - Read the api-dev.log file to verify the server started successfully
   - Look for:
     - Server startup message (should show "Server running on port 3000")
     - Any error messages or stack traces
     - Database/MinIO connection status
4. **Report status**:
   - Inform the user if the server started successfully
   - Show any errors found in the logs
   - Provide the shell ID so the user can monitor it later if needed

CRITICAL: Always kill existing servers before starting a new one to avoid port conflicts.
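The restart flow above can be sketched as a small script. The directory, port, and log file name come from the command text; the function names are illustrative, and the full flow is left commented so the sketch is safe to source:

```shell
#!/bin/sh
# Sketch of the restart flow above (assumptions: paths and port from the
# command file; helper names are illustrative, not part of the project).
API_DIR="/projects/my-projects/banatie-service/apps/api-service"
PORT=3000
LOG_FILE="api-dev.log"

# Step 1: free the port (no-op if nothing is listening)
kill_port() {
  pids=$(lsof -ti:"$1" 2>/dev/null || true)
  if [ -n "$pids" ]; then
    kill -9 $pids
    echo "killed: $pids"
  else
    echo "port $1 free"
  fi
}

# Step 2: start the server in the background, saving the PID for monitoring
start_server() {
  (cd "$API_DIR" && pnpm dev >"$LOG_FILE" 2>&1) &
  echo "$!"
}

# Step 3: give the server a few seconds, then check the log
check_log() {
  sleep 5
  if grep -q "Server running on port $PORT" "$API_DIR/$LOG_FILE"; then
    echo "started"
  else
    echo "inspect $API_DIR/$LOG_FILE for errors"
  fi
}

# Full flow (commented out so the sketch is safe to source):
# kill_port "$PORT" && PID=$(start_server) && check_log && echo "shell: $PID"
```

The same steps map onto Claude Code's KillShell, Bash (`run_in_background: true`), and BashOutput tools; the script form is just the plain-shell equivalent.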

View File

@@ -1,49 +0,0 @@
# Application Configuration
NODE_ENV=development
PORT=3000
LOG_LEVEL=info
API_BASE_URL=http://localhost:3000
# CORS Configuration
CORS_ORIGIN=*
# Database Configuration
DB_HOST=postgres
DB_PORT=5432
DB_NAME=banatie_db
DB_USER=banatie_user
DB_PASSWORD=banatie_secure_password
DATABASE_URL=postgresql://banatie_user:banatie_secure_password@postgres:5432/banatie_db
# MinIO Storage Configuration (SNMD)
MINIO_ROOT_USER=banatie_admin
MINIO_ROOT_PASSWORD=banatie_storage_secure_key_2024
STORAGE_TYPE=minio
MINIO_ENDPOINT=minio:9000
MINIO_ACCESS_KEY=banatie_service
MINIO_SECRET_KEY=banatie_service_key_2024
MINIO_USE_SSL=false
MINIO_BUCKET_NAME=banatie
MINIO_PUBLIC_URL=http://localhost:9000
# AI Service Configuration
GEMINI_API_KEY=AIzaSyBaOt9JMPGKA3811FL-ssf1n5Hh9Jauly8
# File Upload Configuration
MAX_FILE_SIZE=5242880
MAX_FILES=3
# Multi-tenancy Configuration (Production-Ready Names)
DEFAULT_ORG_ID=default
DEFAULT_PROJECT_ID=main
DEFAULT_USER_ID=system
# Presigned URL Configuration
PRESIGNED_URL_EXPIRY=86400 # 24 hours
# Directory Configuration
RESULTS_DIR=/app/results
UPLOADS_DIR=/app/uploads/temp
# Logging Configuration
LOG_LEVEL=info

View File

@@ -122,25 +122,36 @@ Key table: `api_keys`
 ## Environment Configuration
-### Root Environment (`.env.docker`)
-- `DATABASE_URL` - PostgreSQL connection string (for Docker: `postgresql://banatie_user:banatie_secure_password@postgres:5432/banatie_db`)
-- `MINIO_ROOT_USER` - MinIO admin username
-- `MINIO_ROOT_PASSWORD` - MinIO admin password
-### API Service Environment (`apps/api-service/.env`)
-Required environment variables:
-- `DATABASE_URL` - PostgreSQL connection string (for local dev: `postgresql://banatie_user:banatie_secure_password@localhost:5434/banatie_db`)
+**Important**: We use TWO `.env` files with different purposes:
+### Root `.env` (Docker Compose Infrastructure)
+Used by Docker Compose services (MinIO, Postgres, API container). Key differences from local:
+- `DATABASE_URL=postgresql://banatie_user:banatie_secure_password@postgres:5432/banatie_db` (Docker network hostname)
+- `MINIO_ENDPOINT=minio:9000` (Docker network hostname)
+- `MINIO_ROOT_USER` and `MINIO_ROOT_PASSWORD` - MinIO admin credentials
+- All variables are passed to the app container via docker-compose.yml environment section
+### API Service `.env` (Local Development Only)
+Located at `apps/api-service/.env` - used ONLY when running `pnpm dev:api` locally:
+- `DATABASE_URL=postgresql://banatie_user:banatie_secure_password@localhost:5434/banatie_db` (port-forwarded)
+- `MINIO_ENDPOINT=localhost:9000` (port-forwarded)
+- **NOTE**: This file is excluded from Docker builds (see Dockerfile.mono)
+### Required Environment Variables
+- `DATABASE_URL` - PostgreSQL connection string
 - `GEMINI_API_KEY` - Google Gemini API key (required)
-- `MINIO_ENDPOINT` - MinIO endpoint (`localhost:9000` for local dev, `minio:9000` for Docker)
-- `MINIO_ACCESS_KEY` - MinIO service account key
-- `MINIO_SECRET_KEY` - MinIO service account secret
+- `MINIO_ENDPOINT` - MinIO endpoint
+- `MINIO_ACCESS_KEY` - MinIO service account key (`banatie_service`)
+- `MINIO_SECRET_KEY` - MinIO service account secret (`banatie_service_key_2024`)
 - `MINIO_BUCKET_NAME` - Storage bucket name (default: `banatie`)
+- `MINIO_ROOT_USER` - MinIO admin user (`banatie_admin`)
+- `MINIO_ROOT_PASSWORD` - MinIO admin password
 - `PORT` - Server port (default: 3000)
 - `NODE_ENV` - Environment mode
-- `CORS_ORIGIN` - CORS origin setting (default: multiple localhost URLs for frontend apps)
+- `CORS_ORIGIN` - CORS origin setting
 ## Key Dependencies
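The required variables in the documentation above can be checked once at startup so a missing key fails fast. A minimal sketch — the `REQUIRED` list mirrors the doc, but the `missingEnv` helper is illustrative, not part of the service:

```typescript
// Startup check for the required variables listed above.
// Assumption: the missingEnv helper name is illustrative, not service code.
const REQUIRED = [
  "DATABASE_URL",
  "GEMINI_API_KEY",
  "MINIO_ENDPOINT",
  "MINIO_ACCESS_KEY",
  "MINIO_SECRET_KEY",
];

// Return the names that are absent or empty in the given environment map.
function missingEnv(env: Record<string, string | undefined>): string[] {
  return REQUIRED.filter((name) => !env[name]);
}

// Example: a config that forgot the Gemini key is caught before boot.
const missing = missingEnv({
  DATABASE_URL: "postgresql://user:pass@localhost:5434/db",
});
console.log(missing.includes("GEMINI_API_KEY")); // true
```

In the real service the map would be `process.env`, and a non-empty result would abort startup with the missing names in the error message.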

View File

@@ -11,7 +11,11 @@ COPY pnpm-workspace.yaml package.json pnpm-lock.yaml ./
 # Copy all workspace packages
 COPY packages/ ./packages/
-COPY apps/api-service/ ./apps/api-service/
+# Copy API service (exclude .env file - it's for local dev only)
+COPY apps/api-service/package.json ./apps/api-service/
+COPY apps/api-service/tsconfig.json ./apps/api-service/
+COPY apps/api-service/src/ ./apps/api-service/src/
 # Install all dependencies (workspace-aware)
 RUN pnpm install --frozen-lockfile

View File

@@ -4,7 +4,8 @@
   "description": "Nano Banana Image Generation Service - REST API for AI-powered image generation using Gemini Flash Image model",
   "main": "dist/server.js",
   "scripts": {
-    "dev": "tsx --watch src/server.ts",
+    "infra:up": "cd ../.. && docker compose up -d postgres minio storage-init",
+    "dev": "npm run infra:up && echo 'Logs will be saved to api-dev.log' && tsx --watch src/server.ts 2>&1 | tee api-dev.log",
     "start": "node dist/server.js",
     "build": "tsc",
     "typecheck": "tsc --noEmit",

View File

@@ -89,35 +89,14 @@ export const createApp = (): Application => {
       const apiKey = await apiKeyService.validateKey(providedKey);
       if (apiKey) {
-        // Query org and project names
-        let organizationName = apiKey.organizationId;
-        let projectName = apiKey.projectId;
-        try {
-          const { db } = await import('./db');
-          const { organizations, projects } = await import('@banatie/database');
-          const { eq } = await import('drizzle-orm');
-          if (apiKey.organizationId) {
-            const org = await db.select().from(organizations).where(eq(organizations.id, apiKey.organizationId)).limit(1);
-            if (org.length > 0) organizationName = org[0].name;
-          }
-          if (apiKey.projectId) {
-            const proj = await db.select().from(projects).where(eq(projects.id, apiKey.projectId)).limit(1);
-            if (proj.length > 0) projectName = proj[0].name;
-          }
-        } catch (dbError) {
-          // Fallback to IDs if DB query fails
-        }
+        // Use slugs from validated API key (already fetched via LEFT JOIN)
         info.authenticated = true;
         info.keyInfo = {
           type: apiKey.keyType,
           organizationId: apiKey.organizationId,
-          organizationName,
+          organizationSlug: apiKey.organizationSlug,
           projectId: apiKey.projectId,
-          projectName,
+          projectSlug: apiKey.projectSlug,
           expiresAt: apiKey.expiresAt
         };
       }

View File

@@ -0,0 +1,42 @@
import { Request, Response, NextFunction } from 'express';

/**
 * Middleware to ensure only project keys can access generation endpoints
 * Master keys are for admin purposes only
 */
export function requireProjectKey(
  req: Request,
  res: Response,
  next: NextFunction
): void {
  // This middleware assumes validateApiKey has already run and attached req.apiKey
  if (!req.apiKey) {
    res.status(401).json({
      error: 'Authentication required',
      message: 'API key validation must be performed first',
    });
    return;
  }

  // Block master keys from generation endpoints
  if (req.apiKey.keyType === 'master') {
    res.status(403).json({
      error: 'Forbidden',
      message: 'Master keys cannot be used for image generation. Please use a project-specific API key.',
    });
    return;
  }

  // Ensure project key has required IDs
  if (!req.apiKey.projectId) {
    res.status(400).json({
      error: 'Invalid API key',
      message: 'Project key must be associated with a project',
    });
    return;
  }

  console.log(`[${new Date().toISOString()}] Project key validated for generation: ${req.apiKey.id}`);
  next();
}

View File

@@ -1,12 +1,11 @@
 import { Request, Response, NextFunction } from 'express';
-import { ApiKeyService } from '../../services/ApiKeyService';
-import type { ApiKey } from '@banatie/database';
-// Extend Express Request type to include apiKey
+import { ApiKeyService, type ApiKeyWithSlugs } from '../../services/ApiKeyService';
+// Extend Express Request type to include apiKey with slugs
 declare global {
   namespace Express {
     interface Request {
-      apiKey?: ApiKey;
+      apiKey?: ApiKeyWithSlugs;
     }
   }
 }

View File

@@ -16,7 +16,17 @@ router.use(requireMasterKey);
  */
 router.post('/', async (req, res) => {
   try {
-    const { type, projectId, organizationId, name, expiresInDays } = req.body;
+    const {
+      type,
+      projectId,
+      organizationId,
+      organizationSlug,
+      projectSlug,
+      organizationName,
+      projectName,
+      name,
+      expiresInDays
+    } = req.body;
     // Validation
     if (!type || !['master', 'project'].includes(type)) {
@@ -26,23 +36,46 @@ router.post('/', async (req, res) => {
       });
     }
-    if (type === 'project' && !projectId) {
+    if (type === 'project' && !projectSlug) {
       return res.status(400).json({
-        error: 'Missing projectId',
-        message: 'Project keys require a projectId',
+        error: 'Missing projectSlug',
+        message: 'Project keys require a projectSlug',
+      });
+    }
+    if (type === 'project' && !organizationSlug) {
+      return res.status(400).json({
+        error: 'Missing organizationSlug',
+        message: 'Project keys require an organizationSlug',
       });
     }
     // Create key
-    const result = type === 'master'
-      ? await apiKeyService.createMasterKey(name, req.apiKey!.id)
-      : await apiKeyService.createProjectKey(
-          projectId,
-          organizationId,
-          name,
-          req.apiKey!.id,
-          expiresInDays || 90
-        );
+    let result;
+    if (type === 'master') {
+      result = await apiKeyService.createMasterKey(name, req.apiKey!.id);
+    } else {
+      // Get or create organization and project
+      const finalOrgId = await apiKeyService.getOrCreateOrganization(
+        organizationSlug,
+        organizationName,
+      );
+      const finalProjectId = await apiKeyService.getOrCreateProject(
+        finalOrgId,
+        projectSlug,
+        projectName,
+      );
+      result = await apiKeyService.createProjectKey(
+        finalProjectId,
+        finalOrgId,
+        name,
+        req.apiKey!.id,
+        expiresInDays || 90
+      );
+    }
     console.log(`[${new Date().toISOString()}] New API key created by admin: ${result.metadata.id} (${result.metadata.keyType}) - by: ${req.apiKey!.id}`);
@@ -52,6 +85,7 @@ router.post('/', async (req, res) => {
       id: result.metadata.id,
       type: result.metadata.keyType,
       projectId: result.metadata.projectId,
+      organizationId: result.metadata.organizationId,
       name: result.metadata.name,
       expiresAt: result.metadata.expiresAt,
       scopes: result.metadata.scopes,
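For reference, a project-key creation request against this route now carries the slug fields destructured above. A hypothetical request body (all values illustrative):

```json
{
  "type": "project",
  "organizationSlug": "acme",
  "projectSlug": "website",
  "organizationName": "Acme Inc",
  "projectName": "Marketing Website",
  "name": "Website generation key",
  "expiresInDays": 90
}
```

Per the validation above, `organizationSlug` and `projectSlug` are required for project keys; the names are optional and default to the slugs when the organization or project has to be created.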

View File

@@ -1,5 +1,5 @@
-import express from 'express';
-import { ApiKeyService } from '../services/ApiKeyService';
+import express from "express";
+import { ApiKeyService } from "../services/ApiKeyService";
 const router = express.Router();
 const apiKeyService = new ApiKeyService();
@@ -10,38 +10,44 @@ const apiKeyService = new ApiKeyService();
  *
  * POST /api/bootstrap/initial-key
  */
-router.post('/initial-key', async (req, res) => {
+router.post("/initial-key", async (req, res) => {
   try {
     // Check if any keys already exist
     const hasKeys = await apiKeyService.hasAnyKeys();
     if (hasKeys) {
-      console.warn(`[${new Date().toISOString()}] Bootstrap attempt when keys already exist`);
+      console.warn(
+        `[${new Date().toISOString()}] Bootstrap attempt when keys already exist`,
+      );
       return res.status(403).json({
-        error: 'Bootstrap not allowed',
-        message: 'API keys already exist. Use /api/admin/keys to create new keys.',
+        error: "Bootstrap not allowed",
+        message:
+          "API keys already exist. Use /api/admin/keys to create new keys.",
       });
     }
     // Create first master key
-    const { key, metadata } = await apiKeyService.createMasterKey('Initial Master Key');
-    console.log(`[${new Date().toISOString()}] Initial master key created via bootstrap: ${metadata.id}`);
+    const { key, metadata } =
+      await apiKeyService.createMasterKey("Initial Master Key");
+    console.log(
+      `[${new Date().toISOString()}] Initial master key created via bootstrap: ${metadata.id}`,
+    );
     res.status(201).json({
       apiKey: key,
       type: metadata.keyType,
       name: metadata.name,
       expiresAt: metadata.expiresAt,
-      message: 'IMPORTANT: Save this key securely. You will not see it again!',
+      message: "IMPORTANT: Save this key securely. You will not see it again!",
     });
   } catch (error) {
     console.error(`[${new Date().toISOString()}] Bootstrap error:`, error);
     res.status(500).json({
-      error: 'Bootstrap failed',
-      message: 'Failed to create initial API key',
+      error: "Bootstrap failed",
+      message: "Failed to create initial API key",
     });
   }
 });
 export default router;

View File

@@ -15,6 +15,7 @@ import {
 } from "../middleware/promptEnhancement";
 import { asyncHandler } from "../middleware/errorHandler";
 import { validateApiKey } from "../middleware/auth/validateApiKey";
+import { requireProjectKey } from "../middleware/auth/requireProjectKey";
 import { rateLimitByApiKey } from "../middleware/auth/rateLimiter";
 import { GenerateImageResponse } from "../types/api";
 // Create router
@@ -30,6 +31,7 @@ generateRouter.post(
   "/generate",
   // Authentication middleware
   validateApiKey,
+  requireProjectKey,
   rateLimitByApiKey,
   // File upload middleware
@@ -64,8 +66,12 @@ generateRouter.post(
     const { prompt, filename } = req.body;
     const files = (req.files as Express.Multer.File[]) || [];
+    // Extract org/project slugs from validated API key
+    const orgId = req.apiKey?.organizationSlug || undefined;
+    const projectId = req.apiKey?.projectSlug!; // Guaranteed by requireProjectKey middleware
     console.log(
-      `[${timestamp}] [${requestId}] Starting image generation process`,
+      `[${timestamp}] [${requestId}] Starting image generation process for org:${orgId}, project:${projectId}`,
     );
     try {
@@ -102,6 +108,8 @@ generateRouter.post(
       const result = await imageGenService.generateImage({
         prompt,
         filename,
+        orgId,
+        projectId,
         ...(referenceImages && { referenceImages }),
       });

View File

@@ -1,6 +1,6 @@
-import { Router, Request, Response } from 'express';
-import { StorageFactory } from '../services/StorageFactory';
-import { asyncHandler } from '../middleware/errorHandler';
+import { Router, Request, Response } from "express";
+import { StorageFactory } from "../services/StorageFactory";
+import { asyncHandler } from "../middleware/errorHandler";
 export const imagesRouter = Router();
@@ -9,54 +9,55 @@ export const imagesRouter = Router();
  * Serves images via presigned URLs (redirect approach)
  */
 imagesRouter.get(
-  '/images/:orgId/:projectId/:category/:filename',
+  "/images/:orgId/:projectId/:category/:filename",
   asyncHandler(async (req: Request, res: Response) => {
     const { orgId, projectId, category, filename } = req.params;
     // Validate category
-    if (!['uploads', 'generated', 'references'].includes(category)) {
+    if (!["uploads", "generated", "references"].includes(category)) {
       return res.status(400).json({
         success: false,
-        message: 'Invalid category'
+        message: "Invalid category",
       });
     }
-    const storageService = StorageFactory.getInstance();
+    const storageService = await StorageFactory.getInstance();
     try {
       // Check if file exists first (fast check)
       const exists = await storageService.fileExists(
         orgId,
         projectId,
-        category as 'uploads' | 'generated' | 'references',
-        filename
+        category as "uploads" | "generated" | "references",
+        filename,
       );
       if (!exists) {
         return res.status(404).json({
           success: false,
-          message: 'File not found'
+          message: "File not found",
         });
       }
       // Determine content type from filename
-      const ext = filename.toLowerCase().split('.').pop();
-      const contentType = {
-        'png': 'image/png',
-        'jpg': 'image/jpeg',
-        'jpeg': 'image/jpeg',
-        'gif': 'image/gif',
-        'webp': 'image/webp',
-        'svg': 'image/svg+xml'
-      }[ext || ''] || 'application/octet-stream';
+      const ext = filename.toLowerCase().split(".").pop();
+      const contentType =
+        {
+          png: "image/png",
+          jpg: "image/jpeg",
+          jpeg: "image/jpeg",
+          gif: "image/gif",
+          webp: "image/webp",
+          svg: "image/svg+xml",
+        }[ext || ""] || "application/octet-stream";
       // Set headers for optimal caching and performance
-      res.setHeader('Content-Type', contentType);
-      res.setHeader('Cache-Control', 'public, max-age=86400, immutable'); // 24 hours + immutable
-      res.setHeader('ETag', `"${orgId}-${projectId}-$(unknown)"`); // Simple ETag
+      res.setHeader("Content-Type", contentType);
+      res.setHeader("Cache-Control", "public, max-age=86400, immutable"); // 24 hours + immutable
+      res.setHeader("ETag", `"${orgId}-${projectId}-$(unknown)"`); // Simple ETag
       // Handle conditional requests (304 Not Modified)
-      const ifNoneMatch = req.headers['if-none-match'];
+      const ifNoneMatch = req.headers["if-none-match"];
       if (ifNoneMatch === `"${orgId}-${projectId}-$(unknown)"`) {
         return res.status(304).end(); // Not Modified
       }
@@ -65,32 +66,31 @@ imagesRouter.get(
       const fileStream = await storageService.streamFile(
         orgId,
         projectId,
-        category as 'uploads' | 'generated' | 'references',
-        filename
+        category as "uploads" | "generated" | "references",
+        filename,
       );
       // Handle stream errors
-      fileStream.on('error', (streamError) => {
-        console.error('Stream error:', streamError);
+      fileStream.on("error", (streamError) => {
+        console.error("Stream error:", streamError);
         if (!res.headersSent) {
           res.status(500).json({
             success: false,
-            message: 'Error streaming file'
+            message: "Error streaming file",
           });
         }
       });
       // Stream the file without loading into memory
       fileStream.pipe(res);
     } catch (error) {
-      console.error('Failed to stream file:', error);
+      console.error("Failed to stream file:", error);
       return res.status(404).json({
         success: false,
-        message: 'File not found'
+        message: "File not found",
       });
     }
-  })
+  }),
 );
 /**
@@ -98,41 +98,40 @@ imagesRouter.get(
  * Returns a presigned URL instead of redirecting
  */
 imagesRouter.get(
-  '/images/url/:orgId/:projectId/:category/:filename',
+  "/images/url/:orgId/:projectId/:category/:filename",
   asyncHandler(async (req: Request, res: Response) => {
     const { orgId, projectId, category, filename } = req.params;
-    const { expiry = '3600' } = req.query; // Default 1 hour
-    if (!['uploads', 'generated', 'references'].includes(category)) {
+    const { expiry = "3600" } = req.query; // Default 1 hour
+    if (!["uploads", "generated", "references"].includes(category)) {
       return res.status(400).json({
         success: false,
-        message: 'Invalid category'
+        message: "Invalid category",
       });
     }
-    const storageService = StorageFactory.getInstance();
+    const storageService = await StorageFactory.getInstance();
     try {
       const presignedUrl = await storageService.getPresignedDownloadUrl(
         orgId,
         projectId,
-        category as 'uploads' | 'generated' | 'references',
+        category as "uploads" | "generated" | "references",
         filename,
-        parseInt(expiry as string, 10)
+        parseInt(expiry as string, 10),
       );
       return res.json({
         success: true,
         url: presignedUrl,
-        expiresIn: parseInt(expiry as string, 10)
+        expiresIn: parseInt(expiry as string, 10),
       });
     } catch (error) {
-      console.error('Failed to generate presigned URL:', error);
+      console.error("Failed to generate presigned URL:", error);
       return res.status(404).json({
         success: false,
-        message: 'File not found or access denied'
+        message: "File not found or access denied",
      });
     }
-  })
+  }),
 );

View File

@@ -11,6 +11,7 @@ import {
 } from "../middleware/promptEnhancement";
 import { asyncHandler } from "../middleware/errorHandler";
 import { validateApiKey } from "../middleware/auth/validateApiKey";
+import { requireProjectKey } from "../middleware/auth/requireProjectKey";
 import { rateLimitByApiKey } from "../middleware/auth/rateLimiter";
 import { GenerateImageResponse } from "../types/api";
@@ -25,6 +26,7 @@ textToImageRouter.post(
   "/text-to-image",
   // Authentication middleware
   validateApiKey,
+  requireProjectKey,
   rateLimitByApiKey,
   // JSON validation middleware
@@ -54,8 +56,12 @@ textToImageRouter.post(
     const requestId = req.requestId;
     const { prompt, filename } = req.body;
+    // Extract org/project slugs from validated API key
+    const orgId = req.apiKey?.organizationSlug || undefined;
+    const projectId = req.apiKey?.projectSlug!; // Guaranteed by requireProjectKey middleware
     console.log(
-      `[${timestamp}] [${requestId}] Starting text-to-image generation process`,
+      `[${timestamp}] [${requestId}] Starting text-to-image generation process for org:${orgId}, project:${projectId}`,
     );
     try {
@@ -67,6 +73,8 @@ textToImageRouter.post(
       const result = await imageGenService.generateImage({
         prompt,
         filename,
+        orgId,
+        projectId,
       });
     // Log the result

View File

@@ -1,23 +1,30 @@
-import crypto from 'crypto';
-import { db } from '../db';
-import { apiKeys, type ApiKey, type NewApiKey } from '@banatie/database';
-import { eq, and, desc } from 'drizzle-orm';
+import crypto from "crypto";
+import { db } from "../db";
+import { apiKeys, organizations, projects, type ApiKey, type NewApiKey } from "@banatie/database";
+import { eq, and, desc } from "drizzle-orm";
+
+// Extended API key type with slugs for storage paths
+export interface ApiKeyWithSlugs extends ApiKey {
+  organizationSlug?: string;
+  projectSlug?: string;
+}
 export class ApiKeyService {
   /**
    * Generate a new API key
    * Format: bnt_{64_hex_chars}
    */
-  private generateKey(): { fullKey: string; keyHash: string; keyPrefix: string } {
-    const secret = crypto.randomBytes(32).toString('hex'); // 64 chars
-    const keyPrefix = 'bnt_';
+  private generateKey(): {
+    fullKey: string;
+    keyHash: string;
+    keyPrefix: string;
+  } {
+    const secret = crypto.randomBytes(32).toString("hex"); // 64 chars
+    const keyPrefix = "bnt_";
     const fullKey = keyPrefix + secret;
     // Hash for storage (SHA-256)
-    const keyHash = crypto
-      .createHash('sha256')
-      .update(fullKey)
-      .digest('hex');
+    const keyHash = crypto.createHash("sha256").update(fullKey).digest("hex");
     return { fullKey, keyHash, keyPrefix };
   }
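The key format documented in the hunk above (`bnt_` prefix + 64 hex chars, with only the SHA-256 hash persisted) can be illustrated standalone. This sketch mirrors `generateKey` but is not the service's code; the function name is illustrative:

```typescript
import * as crypto from "crypto";

// Mirrors the documented format: bnt_{64_hex_chars}. Only the SHA-256 hash
// of the full key would be stored; the plaintext key is shown to the caller
// exactly once.
function sketchGenerateKey(): { fullKey: string; keyHash: string } {
  const secret = crypto.randomBytes(32).toString("hex"); // 64 hex chars
  const fullKey = "bnt_" + secret;
  const keyHash = crypto.createHash("sha256").update(fullKey).digest("hex");
  return { fullKey, keyHash };
}

const { fullKey, keyHash } = sketchGenerateKey();
console.log(/^bnt_[0-9a-f]{64}$/.test(fullKey)); // true
console.log(keyHash.length); // 64 (hex-encoded SHA-256)
```

Validation then only needs to hash an incoming key the same way and compare against the stored digest, which is what `validateKey` below does.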
@@ -25,21 +32,29 @@ export class ApiKeyService {
   /**
    * Create a master key (admin access, never expires)
    */
-  async createMasterKey(name?: string, createdBy?: string): Promise<{ key: string; metadata: ApiKey }> {
+  async createMasterKey(
+    name?: string,
+    createdBy?: string,
+  ): Promise<{ key: string; metadata: ApiKey }> {
     const { fullKey, keyHash, keyPrefix } = this.generateKey();
-    const [newKey] = await db.insert(apiKeys).values({
-      keyHash,
-      keyPrefix,
-      keyType: 'master',
-      projectId: null,
-      scopes: ['*'], // Full access
-      name: name || 'Master Key',
-      expiresAt: null, // Never expires
-      createdBy: createdBy || null,
-    }).returning();
+    const [newKey] = await db
+      .insert(apiKeys)
+      .values({
+        keyHash,
+        keyPrefix,
+        keyType: "master",
+        projectId: null,
+        scopes: ["*"], // Full access
+        name: name || "Master Key",
+        expiresAt: null, // Never expires
+        createdBy: createdBy || null,
+      })
+      .returning();
-    console.log(`[${new Date().toISOString()}] Master key created: ${newKey?.id} - ${newKey?.name}`);
+    console.log(
+      `[${new Date().toISOString()}] Master key created: ${newKey?.id} - ${newKey?.name}`,
+    );
     return { key: fullKey, metadata: newKey! };
   }
@@ -52,26 +67,31 @@ export class ApiKeyService {
     organizationId?: string,
     name?: string,
     createdBy?: string,
-    expiresInDays: number = 90
+    expiresInDays: number = 90,
   ): Promise<{ key: string; metadata: ApiKey }> {
     const { fullKey, keyHash, keyPrefix } = this.generateKey();
     const expiresAt = new Date();
     expiresAt.setDate(expiresAt.getDate() + expiresInDays);
-    const [newKey] = await db.insert(apiKeys).values({
-      keyHash,
-      keyPrefix,
-      keyType: 'project',
-      projectId,
-      organizationId: organizationId || null,
-      scopes: ['generate', 'read'],
-      name: name || `Project Key - ${projectId}`,
-      expiresAt,
-      createdBy: createdBy || null,
-    }).returning();
+    const [newKey] = await db
+      .insert(apiKeys)
+      .values({
+        keyHash,
+        keyPrefix,
+        keyType: "project",
+        projectId,
+        organizationId: organizationId || null,
+        scopes: ["generate", "read"],
+        name: name || `Project Key - ${projectId}`,
+        expiresAt,
+        createdBy: createdBy || null,
+      })
+      .returning();
-    console.log(`[${new Date().toISOString()}] Project key created: ${newKey?.id} - ${projectId} - expires: ${expiresAt.toISOString()}`);
+    console.log(
+      `[${new Date().toISOString()}] Project key created: ${newKey?.id} - ${projectId} - expires: ${expiresAt.toISOString()}`,
    );
     return { key: fullKey, metadata: newKey! };
   }
@ -79,49 +99,74 @@ export class ApiKeyService {
/** /**
* Validate an API key * Validate an API key
* Returns null if invalid/expired/revoked * Returns null if invalid/expired/revoked
* Returns API key with organization and project slugs for storage paths
*/ */
  async validateKey(providedKey: string): Promise<ApiKeyWithSlugs | null> {
    if (!providedKey || !providedKey.startsWith("bnt_")) {
      return null;
    }

    // Hash the provided key
    const keyHash = crypto
      .createHash("sha256")
      .update(providedKey)
      .digest("hex");

    // Find in database with left joins to get slugs
    const [result] = await db
      .select({
        // API key fields
        id: apiKeys.id,
        keyHash: apiKeys.keyHash,
        keyPrefix: apiKeys.keyPrefix,
        keyType: apiKeys.keyType,
        organizationId: apiKeys.organizationId,
        projectId: apiKeys.projectId,
        scopes: apiKeys.scopes,
        createdAt: apiKeys.createdAt,
        expiresAt: apiKeys.expiresAt,
        lastUsedAt: apiKeys.lastUsedAt,
        isActive: apiKeys.isActive,
        name: apiKeys.name,
        createdBy: apiKeys.createdBy,
        // Slug fields
        organizationSlug: organizations.slug,
        projectSlug: projects.slug,
      })
      .from(apiKeys)
      .leftJoin(organizations, eq(apiKeys.organizationId, organizations.id))
      .leftJoin(projects, eq(apiKeys.projectId, projects.id))
      .where(and(eq(apiKeys.keyHash, keyHash), eq(apiKeys.isActive, true)))
      .limit(1);

    if (!result) {
      console.warn(
        `[${new Date().toISOString()}] Invalid API key attempt: ${providedKey.substring(0, 10)}...`,
      );
      return null;
    }

    // Check expiration
    if (result.expiresAt && result.expiresAt < new Date()) {
      console.warn(
        `[${new Date().toISOString()}] Expired API key used: ${result.id} - expired: ${result.expiresAt.toISOString()}`,
      );
      return null;
    }

    // Update last used timestamp (async, don't wait)
    db.update(apiKeys)
      .set({ lastUsedAt: new Date() })
      .where(eq(apiKeys.id, result.id))
      .execute()
      .catch((err) =>
        console.error(
          `[${new Date().toISOString()}] Failed to update lastUsedAt:`,
          err,
        ),
      );

    return result as ApiKeyWithSlugs;
  }
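The lookup above never compares raw keys; only the SHA-256 digest of the provided key is matched against the stored hash. A minimal standalone sketch of that prefix-check-then-hash step (the helper name is illustrative, not part of the service):

```typescript
import { createHash } from "node:crypto";

// Mirror of the service's lookup hashing: reject keys without the
// expected "bnt_" prefix, then produce the hex SHA-256 digest that
// would be compared against apiKeys.keyHash.
function hashApiKey(providedKey: string): string | null {
  if (!providedKey || !providedKey.startsWith("bnt_")) return null;
  return createHash("sha256").update(providedKey).digest("hex");
}
```

Because the digest is deterministic, the database only ever needs an equality match on the hash column, and a leaked database dump does not reveal usable keys.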
@@ -146,21 +191,94 @@ export class ApiKeyService {
  /**
   * List all keys (for admin)
   */
  async listKeys(): Promise<ApiKey[]> {
    return db.select().from(apiKeys).orderBy(desc(apiKeys.createdAt));
  }

  /**
   * Check if any keys exist (for bootstrap)
   */
  async hasAnyKeys(): Promise<boolean> {
    const keys = await db.select({ id: apiKeys.id }).from(apiKeys).limit(1);
    return keys.length > 0;
  }
  /**
   * Get or create organization by slug
   * If organization doesn't exist, create it with provided name (or use slug as name)
   */
  async getOrCreateOrganization(
    slug: string,
    name?: string,
    email?: string,
  ): Promise<string> {
    // Try to find existing organization
    const [existing] = await db
      .select({ id: organizations.id })
      .from(organizations)
      .where(eq(organizations.slug, slug))
      .limit(1);

    if (existing) {
      return existing.id;
    }

    // Create new organization
    const [newOrg] = await db
      .insert(organizations)
      .values({
        slug,
        name: name || slug,
        email: email || `${slug}@placeholder.local`,
      })
      .returning({ id: organizations.id });

    console.log(
      `[${new Date().toISOString()}] Organization created: ${newOrg?.id} - ${slug}`,
    );

    return newOrg!.id;
  }

  /**
   * Get or create project by slug within an organization
   * If project doesn't exist, create it with provided name (or use slug as name)
   */
  async getOrCreateProject(
    organizationId: string,
    slug: string,
    name?: string,
  ): Promise<string> {
    // Try to find existing project
    const [existing] = await db
      .select({ id: projects.id })
      .from(projects)
      .where(
        and(
          eq(projects.organizationId, organizationId),
          eq(projects.slug, slug),
        ),
      )
      .limit(1);

    if (existing) {
      return existing.id;
    }

    // Create new project
    const [newProject] = await db
      .insert(projects)
      .values({
        organizationId,
        slug,
        name: name || slug,
      })
      .returning({ id: projects.id });

    console.log(
      `[${new Date().toISOString()}] Project created: ${newProject?.id} - ${slug} (org: ${organizationId})`,
    );

    return newProject!.id;
  }
}
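Both helpers follow the same select-then-insert shape. A dependency-free sketch of that pattern against an in-memory map (the `Row` type and map are invented for illustration; the real code goes through Drizzle):

```typescript
type Row = { id: string; slug: string; name: string };

// In-memory stand-in for a table with a unique slug column.
const table = new Map<string, Row>();
let nextId = 1;

function getOrCreateBySlug(slug: string, name?: string): string {
  const existing = table.get(slug);
  if (existing) return existing.id; // idempotent: reuse the existing row

  const row: Row = { id: String(nextId++), slug, name: name ?? slug };
  table.set(slug, row); // analogous to "insert ... returning id"
  return row.id;
}

const a = getOrCreateBySlug("acme", "Acme Inc");
const b = getOrCreateBySlug("acme"); // second call returns the same id
```

Worth noting: a bare select-then-insert is racy under concurrent requests; a unique constraint on the slug column plus an upsert (e.g. Drizzle's `onConflictDoNothing`) is the usual hardening.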

View File

@ -1,19 +1,17 @@
import { GoogleGenAI } from "@google/genai";
// eslint-disable-next-line @typescript-eslint/no-var-requires
const mime = require("mime") as any;
import {
  ImageGenerationOptions,
  ImageGenerationResult,
  ReferenceImage,
  GeneratedImageData,
} from "../types/api";
import { StorageFactory } from "./StorageFactory";

export class ImageGenService {
  private ai: GoogleGenAI;
  private primaryModel = "gemini-2.5-flash-image-preview";

  constructor(apiKey: string) {
    if (!apiKey) {
@@ -24,182 +22,168 @@ export class ImageGenService {
  /**
   * Generate an image from text prompt with optional reference images
   * This method separates image generation from storage for clear error handling
   */
  async generateImage(
    options: ImageGenerationOptions,
  ): Promise<ImageGenerationResult> {
    const { prompt, filename, referenceImages, orgId, projectId } = options;

    // Use default values if not provided
    const finalOrgId = orgId || process.env["DEFAULT_ORG_ID"] || "default";
    const finalProjectId =
      projectId || process.env["DEFAULT_PROJECT_ID"] || "main";

    // Step 1: Generate image from Gemini AI
    let generatedData: GeneratedImageData;
    try {
      generatedData = await this.generateImageWithAI(prompt, referenceImages);
    } catch (error) {
      // Generation failed - return explicit error
      return {
        success: false,
        model: this.primaryModel,
        error:
          error instanceof Error ? error.message : "Image generation failed",
        errorType: "generation",
      };
    }

    // Step 2: Save generated image to storage
    try {
      const finalFilename = `$(unknown).${generatedData.fileExtension}`;
      const storageService = await StorageFactory.getInstance();
      const uploadResult = await storageService.uploadFile(
        finalOrgId,
        finalProjectId,
        "generated",
        finalFilename,
        generatedData.buffer,
        generatedData.mimeType,
      );

      if (uploadResult.success) {
        return {
          success: true,
          filename: uploadResult.filename,
          filepath: uploadResult.path,
          url: uploadResult.url,
          model: this.primaryModel,
          ...(generatedData.description && {
            description: generatedData.description,
          }),
        };
      } else {
        // Storage failed but image was generated
        return {
          success: false,
          model: this.primaryModel,
          error: `Image generated successfully but storage failed: ${uploadResult.error || "Unknown storage error"}`,
          errorType: "storage",
          generatedImageData: generatedData,
          ...(generatedData.description && {
            description: generatedData.description,
          }),
        };
      }
    } catch (error) {
      // Storage exception - image was generated but couldn't be saved
      return {
        success: false,
        model: this.primaryModel,
        error: `Image generated successfully but storage failed: ${error instanceof Error ? error.message : "Unknown storage error"}`,
        errorType: "storage",
        generatedImageData: generatedData,
        ...(generatedData.description && {
          description: generatedData.description,
        }),
      };
    }
  }
  /**
   * Generate image using Gemini AI - isolated from storage logic
   * @throws Error if generation fails
   */
  private async generateImageWithAI(
    prompt: string,
    referenceImages?: ReferenceImage[],
  ): Promise<GeneratedImageData> {
    const contentParts: any[] = [];

    // Add reference images if provided
    if (referenceImages && referenceImages.length > 0) {
      for (const refImage of referenceImages) {
        contentParts.push({
          inlineData: {
            mimeType: refImage.mimetype,
            data: refImage.buffer.toString("base64"),
          },
        });
      }
    }

    // Add text prompt
    contentParts.push({
      text: prompt,
    });

    const contents = [
      {
        role: "user" as const,
        parts: contentParts,
      },
    ];

    try {
      const response = await this.ai.models.generateContent({
        model: this.primaryModel,
        config: { responseModalities: ["IMAGE", "TEXT"] },
        contents,
      });

      // Parse response
      if (
        !response.candidates ||
        !response.candidates[0] ||
        !response.candidates[0].content
      ) {
        throw new Error("No response received from Gemini AI");
      }

      const content = response.candidates[0].content;
      let generatedDescription: string | undefined;
      let imageData: { buffer: Buffer; mimeType: string } | null = null;

      // Extract image data and description from response
      for (const part of content.parts || []) {
        if (part.inlineData) {
          const buffer = Buffer.from(part.inlineData.data || "", "base64");
          const mimeType = part.inlineData.mimeType || "image/png";
          imageData = { buffer, mimeType };
        } else if (part.text) {
          generatedDescription = part.text;
        }
      }

      if (!imageData) {
        throw new Error("No image data received from Gemini AI");
      }

      const fileExtension = mime.getExtension(imageData.mimeType) || "png";

      return {
        buffer: imageData.buffer,
        mimeType: imageData.mimeType,
        fileExtension,
        ...(generatedDescription && { description: generatedDescription }),
      };
    } catch (error) {
      // Re-throw with clear error message
      if (error instanceof Error) {
        throw new Error(`Gemini AI generation failed: ${error.message}`);
      }
      throw new Error("Gemini AI generation failed: Unknown error");
    }
  }
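The extraction loop above takes the last `inlineData` part as the image and any `text` part as the description. That loop can be exercised standalone against a mock response (the mock `Part` shape mirrors only the fields the code reads; it is not the full `@google/genai` type):

```typescript
// Minimal mock of the response part shape this parser consumes.
type Part = {
  inlineData?: { data?: string; mimeType?: string };
  text?: string;
};

function extractImage(parts: Part[]) {
  let description: string | undefined;
  let image: { buffer: Buffer; mimeType: string } | null = null;

  for (const part of parts) {
    if (part.inlineData) {
      // Decode the base64 payload; default to PNG when no MIME type is given.
      image = {
        buffer: Buffer.from(part.inlineData.data ?? "", "base64"),
        mimeType: part.inlineData.mimeType ?? "image/png",
      };
    } else if (part.text) {
      description = part.text;
    }
  }

  if (!image) throw new Error("No image data received");
  return { ...image, description };
}

const out = extractImage([
  { text: "a red square" },
  {
    inlineData: {
      data: Buffer.from("png-bytes").toString("base64"),
      mimeType: "image/png",
    },
  },
]);
```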

View File

@@ -1,5 +1,5 @@
import { Client as MinioClient } from "minio";
import { StorageService, FileMetadata, UploadResult } from "./StorageService";

export class MinioStorageService implements StorageService {
  private client: MinioClient;

@@ -11,13 +11,13 @@ export class MinioStorageService implements StorageService {
    accessKey: string,
    secretKey: string,
    useSSL: boolean = false,
    bucketName: string = "banatie",
    publicUrl?: string,
  ) {
    // Parse endpoint to separate hostname and port
    const cleanEndpoint = endpoint.replace(/^https?:\/\//, "");
    const [hostname, portStr] = cleanEndpoint.split(":");
    const port = portStr ? parseInt(portStr, 10) : useSSL ? 443 : 9000;

    if (!hostname) {
      throw new Error(`Invalid MinIO endpoint: ${endpoint}`);

@@ -25,20 +25,20 @@ export class MinioStorageService implements StorageService {
    this.client = new MinioClient({
      endPoint: hostname,
      port,
      useSSL,
      accessKey,
      secretKey,
    });

    this.bucketName = bucketName;
    this.publicUrl = publicUrl || `${useSSL ? "https" : "http"}://${endpoint}`;
  }

  private getFilePath(
    orgId: string,
    projectId: string,
    category: "uploads" | "generated" | "references",
    filename: string,
  ): string {
    // Simplified path without date folder for now
    return `${orgId}/${projectId}/${category}/$(unknown)`;

@@ -50,11 +50,11 @@ export class MinioStorageService implements StorageService {
    const timestamp = Date.now();
    const random = Math.random().toString(36).substring(2, 8);

    const ext = sanitized.includes(".")
      ? sanitized.substring(sanitized.lastIndexOf("."))
      : "";
    const name = sanitized.includes(".")
      ? sanitized.substring(0, sanitized.lastIndexOf("."))
      : sanitized;

    return `${name}-${timestamp}-${random}${ext}`;
@@ -63,49 +63,70 @@ export class MinioStorageService implements StorageService {
  private sanitizeFilename(filename: string): string {
    // Remove dangerous characters and path traversal attempts
    return filename
      .replace(/[<>:"/\\|?*\x00-\x1f]/g, "") // Remove dangerous chars
      .replace(/\.\./g, "") // Remove path traversal
      .replace(/^\.+/, "") // Remove leading dots
      .trim()
      .substring(0, 255); // Limit length
  }
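The sanitizer strips slashes before collapsing `..` pairs, so a traversal attempt degrades into a harmless bare name rather than escaping the prefix. A standalone copy of the two helpers for experimentation (logic mirrored from above; the dot-index variant of the extension split is an equivalent rewrite, not the literal source):

```typescript
function sanitizeFilename(filename: string): string {
  return filename
    .replace(/[<>:"/\\|?*\x00-\x1f]/g, "") // drop dangerous chars, including "/"
    .replace(/\.\./g, "") // collapse path-traversal pairs
    .replace(/^\.+/, "") // no leading dots
    .trim()
    .substring(0, 255); // limit length
}

function uniqueFilename(filename: string): string {
  const s = sanitizeFilename(filename);
  const stamp = Date.now();
  const rand = Math.random().toString(36).substring(2, 8);
  // Split name/extension at the last dot, if any.
  const dot = s.lastIndexOf(".");
  const name = dot >= 0 ? s.substring(0, dot) : s;
  const ext = dot >= 0 ? s.substring(dot) : "";
  return `${name}-${stamp}-${rand}${ext}`;
}
```

For example, `"../../etc/passwd"` loses its slashes first (`"....etcpasswd"`), then the `..` pairs, leaving `"etcpasswd"`.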
  private validateFilePath(
    orgId: string,
    projectId: string,
    category: string,
    filename: string,
  ): void {
    // Validate orgId
    if (!orgId || !/^[a-zA-Z0-9_-]+$/.test(orgId) || orgId.length > 50) {
      throw new Error(
        "Invalid organization ID: must be alphanumeric with dashes/underscores, max 50 chars",
      );
    }

    // Validate projectId
    if (
      !projectId ||
      !/^[a-zA-Z0-9_-]+$/.test(projectId) ||
      projectId.length > 50
    ) {
      throw new Error(
        "Invalid project ID: must be alphanumeric with dashes/underscores, max 50 chars",
      );
    }

    // Validate category
    if (!["uploads", "generated", "references"].includes(category)) {
      throw new Error(
        "Invalid category: must be uploads, generated, or references",
      );
    }

    // Validate filename
    if (!filename || filename.length === 0 || filename.length > 255) {
      throw new Error("Invalid filename: must be 1-255 characters");
    }

    // Check for path traversal and dangerous patterns
    if (
      filename.includes("..") ||
      filename.includes("/") ||
      filename.includes("\\")
    ) {
      throw new Error(
        "Invalid characters in filename: path traversal not allowed",
      );
    }

    // Prevent null bytes and control characters
    if (/[\x00-\x1f]/.test(filename)) {
      throw new Error("Invalid filename: control characters not allowed");
    }
  }
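The same identifier rule (alphanumeric plus dash/underscore, non-empty, at most 50 characters) applies to both the org and project IDs above. Extracted as a predicate for quick checks (the helper name is illustrative):

```typescript
// Matches the org/project ID rule used by validateFilePath.
function isValidIdentifier(id: string): boolean {
  return id.length > 0 && id.length <= 50 && /^[a-zA-Z0-9_-]+$/.test(id);
}
```

Keeping the rule in one predicate makes it easy to reject bad IDs at the route layer before any storage call is attempted.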
  async createBucket(): Promise<void> {
    const exists = await this.client.bucketExists(this.bucketName);
    if (!exists) {
      await this.client.makeBucket(this.bucketName, "us-east-1");
      console.log(`Created bucket: ${this.bucketName}`);
    }

@@ -120,20 +141,20 @@ export class MinioStorageService implements StorageService {
  async uploadFile(
    orgId: string,
    projectId: string,
    category: "uploads" | "generated" | "references",
    filename: string,
    buffer: Buffer,
    contentType: string,
  ): Promise<UploadResult> {
    // Validate inputs first
    this.validateFilePath(orgId, projectId, category, filename);

    if (!buffer || buffer.length === 0) {
      throw new Error("Buffer cannot be empty");
    }

    if (!contentType || contentType.trim().length === 0) {
      throw new Error("Content type is required");
    }

    // Ensure bucket exists

@@ -141,15 +162,20 @@ export class MinioStorageService implements StorageService {
    // Generate unique filename to avoid conflicts
    const uniqueFilename = this.generateUniqueFilename(filename);
    const filePath = this.getFilePath(
      orgId,
      projectId,
      category,
      uniqueFilename,
    );

    const metadata = {
      "Content-Type": contentType,
      "X-Amz-Meta-Original-Name": filename,
      "X-Amz-Meta-Category": category,
      "X-Amz-Meta-Project": projectId,
      "X-Amz-Meta-Organization": orgId,
      "X-Amz-Meta-Upload-Time": new Date().toISOString(),
    };

    console.log(`Uploading file to: ${this.bucketName}/${filePath}`);

@@ -159,7 +185,7 @@ export class MinioStorageService implements StorageService {
      filePath,
      buffer,
      buffer.length,
      metadata,
    );

    const url = this.getPublicUrl(orgId, projectId, category, uniqueFilename);
@@ -172,15 +198,15 @@ export class MinioStorageService implements StorageService {
      path: filePath,
      url,
      size: buffer.length,
      contentType,
    };
  }

  async downloadFile(
    orgId: string,
    projectId: string,
    category: "uploads" | "generated" | "references",
    filename: string,
  ): Promise<Buffer> {
    this.validateFilePath(orgId, projectId, category, filename);
    const filePath = this.getFilePath(orgId, projectId, category, filename);

@@ -189,18 +215,18 @@ export class MinioStorageService implements StorageService {
    return new Promise((resolve, reject) => {
      const chunks: Buffer[] = [];
      stream.on("data", (chunk) => chunks.push(chunk));
      stream.on("end", () => resolve(Buffer.concat(chunks)));
      stream.on("error", reject);
    });
  }

  async streamFile(
    orgId: string,
    projectId: string,
    category: "uploads" | "generated" | "references",
    filename: string,
  ): Promise<import("stream").Readable> {
    this.validateFilePath(orgId, projectId, category, filename);
    const filePath = this.getFilePath(orgId, projectId, category, filename);

@@ -211,8 +237,8 @@ export class MinioStorageService implements StorageService {
  async deleteFile(
    orgId: string,
    projectId: string,
    category: "uploads" | "generated" | "references",
    filename: string,
  ): Promise<void> {
    this.validateFilePath(orgId, projectId, category, filename);
    const filePath = this.getFilePath(orgId, projectId, category, filename);
@@ -222,52 +248,61 @@ export class MinioStorageService implements StorageService {
  getPublicUrl(
    orgId: string,
    projectId: string,
    category: "uploads" | "generated" | "references",
    filename: string,
  ): string {
    this.validateFilePath(orgId, projectId, category, filename);
    // Production-ready: Return API URL for presigned URL access
    const apiBaseUrl = process.env["API_BASE_URL"] || "http://localhost:3000";
    return `${apiBaseUrl}/api/images/${orgId}/${projectId}/${category}/$(unknown)`;
  }

  async getPresignedUploadUrl(
    orgId: string,
    projectId: string,
    category: "uploads" | "generated" | "references",
    filename: string,
    expirySeconds: number,
    contentType: string,
  ): Promise<string> {
    this.validateFilePath(orgId, projectId, category, filename);

    if (!contentType || contentType.trim().length === 0) {
      throw new Error("Content type is required for presigned upload URL");
    }

    const filePath = this.getFilePath(orgId, projectId, category, filename);
    return await this.client.presignedPutObject(
      this.bucketName,
      filePath,
      expirySeconds,
    );
  }

  async getPresignedDownloadUrl(
    orgId: string,
    projectId: string,
    category: "uploads" | "generated" | "references",
    filename: string,
    expirySeconds: number = 86400, // 24 hours default
  ): Promise<string> {
    this.validateFilePath(orgId, projectId, category, filename);
    const filePath = this.getFilePath(orgId, projectId, category, filename);
    const presignedUrl = await this.client.presignedGetObject(
      this.bucketName,
      filePath,
      expirySeconds,
    );

    // Replace internal Docker hostname with public URL if configured
    if (this.publicUrl) {
      const clientEndpoint =
        this.client.host + (this.client.port ? `:${this.client.port}` : "");
      const publicEndpoint = this.publicUrl.replace(/^https?:\/\//, "");
      return presignedUrl.replace(
        `${this.client.protocol}//${clientEndpoint}`,
        this.publicUrl,
      );
    }
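The hostname swap itself is a plain string replacement on the URL prefix. A minimal sketch (endpoints are examples; note that SigV4 signs the `Host` header, so this pattern assumes a reverse proxy in front of MinIO that presents the public hostname to the client and the signed one upstream, or an equivalent setup):

```typescript
// Swap the internal "<proto>://<host>:<port>" prefix for a public base URL,
// leaving the signed path and query string untouched.
function rewriteHost(
  presignedUrl: string,
  internalBase: string,
  publicBase: string,
): string {
  return presignedUrl.startsWith(internalBase)
    ? publicBase + presignedUrl.slice(internalBase.length)
    : presignedUrl;
}

const rewritten = rewriteHost(
  "http://minio:9000/banatie/org/proj/generated/a.png?X-Amz-Signature=abc",
  "http://minio:9000",
  "https://storage.example.com",
);
```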
@@ -277,24 +312,32 @@ export class MinioStorageService implements StorageService {
  async listProjectFiles(
    orgId: string,
    projectId: string,
    category?: "uploads" | "generated" | "references",
  ): Promise<FileMetadata[]> {
    const prefix = category
      ? `${orgId}/${projectId}/${category}/`
      : `${orgId}/${projectId}/`;
    const files: FileMetadata[] = [];

    return new Promise((resolve, reject) => {
      const stream = this.client.listObjects(this.bucketName, prefix, true);

      stream.on("data", async (obj) => {
        try {
          if (!obj.name) return;

          const metadata = await this.client.statObject(
            this.bucketName,
            obj.name,
          );

          const pathParts = obj.name.split("/");
          const filename = pathParts[pathParts.length - 1];
          const categoryFromPath = pathParts[2] as
            | "uploads"
            | "generated"
            | "references";

          if (!filename || !categoryFromPath) {
            return;

@@ -303,28 +346,35 @@ export class MinioStorageService implements StorageService {
          files.push({
            key: `${this.bucketName}/${obj.name}`,
            filename,
            contentType:
              metadata.metaData?.["content-type"] || "application/octet-stream",
            size: obj.size || 0,
            url: this.getPublicUrl(
              orgId,
              projectId,
              categoryFromPath,
              filename,
            ),
            createdAt: obj.lastModified || new Date(),
          });
        } catch (error) {}
      });

      stream.on("end", () => resolve(files));
      stream.on("error", reject);
    });
  }
  parseKey(key: string): {
    orgId: string;
    projectId: string;
    category: "uploads" | "generated" | "references";
    filename: string;
  } | null {
    try {
      const match = key.match(
        /^banatie\/([^/]+)\/([^/]+)\/(uploads|generated|references)\/[^/]+\/(.+)$/,
      );

      if (!match) {
        return null;

@@ -339,20 +389,19 @@ export class MinioStorageService implements StorageService {
      return {
        orgId,
        projectId,
        category: category as "uploads" | "generated" | "references",
        filename,
      };
    } catch {
      return null;
    }
  }
async fileExists( async fileExists(
    orgId: string,
    projectId: string,
    category: "uploads" | "generated" | "references",
    filename: string,
  ): Promise<boolean> {
    try {
      this.validateFilePath(orgId, projectId, category, filename);
@ -367,10 +416,10 @@ export class MinioStorageService implements StorageService {
  async listFiles(
    orgId: string,
    projectId: string,
    category: "uploads" | "generated" | "references",
    prefix?: string,
  ): Promise<FileMetadata[]> {
    this.validateFilePath(orgId, projectId, category, "dummy.txt");

    const basePath = `${orgId}/${projectId}/${category}/`;
    const searchPrefix = prefix ? `${basePath}${prefix}` : basePath;
@ -378,33 +427,40 @@ export class MinioStorageService implements StorageService {
    const files: FileMetadata[] = [];

    return new Promise((resolve, reject) => {
      const stream = this.client.listObjects(
        this.bucketName,
        searchPrefix,
        true,
      );

      stream.on("data", async (obj) => {
        if (!obj.name || !obj.size) return;

        try {
          const pathParts = obj.name.split("/");
          const filename = pathParts[pathParts.length - 1];
          if (!filename) return;

          const metadata = await this.client.statObject(
            this.bucketName,
            obj.name,
          );

          files.push({
            filename,
            size: obj.size,
            contentType:
              metadata.metaData?.["content-type"] || "application/octet-stream",
            lastModified: obj.lastModified || new Date(),
            etag: metadata.etag,
            path: obj.name,
          });
        } catch (error) {}
      });

      stream.on("end", () => resolve(files));
      stream.on("error", reject);
    });
  }
}
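`listFiles` derives the returned `filename` from the last segment of each object key, following the `<orgId>/<projectId>/<category>/<filename>` layout used throughout the service. A minimal sketch of that path convention as a standalone parser — `parseObjectPath` is a hypothetical helper, not part of `MinioStorageService`:

```typescript
// Sketch of the object-key convention listFiles relies on:
//   <orgId>/<projectId>/<category>/<filename>
// parseObjectPath is a hypothetical helper, not part of MinioStorageService.
type Category = "uploads" | "generated" | "references";

interface ParsedPath {
  orgId: string;
  projectId: string;
  category: Category;
  filename: string;
}

function parseObjectPath(objectName: string): ParsedPath | null {
  const parts = objectName.split("/");
  if (parts.length < 4) return null;
  const [orgId, projectId, category, ...rest] = parts;
  const filename = rest.join("/"); // nested filenames keep their sub-path
  if (!orgId || !projectId || !category || !filename) return null;
  if (!["uploads", "generated", "references"].includes(category)) return null;
  return { orgId, projectId, category: category as Category, filename };
}
```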

View File

@ -1,5 +1,5 @@
import { StorageService } from "./StorageService";
import { MinioStorageService } from "./MinioStorageService";

export class StorageFactory {
  private static instance: StorageService | null = null;
@ -30,32 +30,30 @@ export class StorageFactory {
      try {
        this.instance = this.createStorageService();
      } catch (error) {
        throw new Error(
          "Storage service unavailable. Please check MinIO configuration.",
        );
      }
    }
    return this.instance;
  }

  private static async createStorageServiceWithRetry(): Promise<StorageService> {
    const maxRetries = 3;
    const baseDelay = 1000; // 1 second

    for (let attempt = 1; attempt <= maxRetries; attempt++) {
      try {
        const service = this.createStorageService();
        await service.bucketExists();
        return service;
      } catch (error) {
        if (attempt === maxRetries) {
          throw new Error(
            `Failed to initialize storage service after ${maxRetries} attempts. ` +
              `Last error: ${error instanceof Error ? error.message : "Unknown error"}`,
          );
        }
@ -64,40 +62,39 @@ export class StorageFactory {
      }
    }

    throw new Error("Unexpected error in storage service creation");
  }

  private static sleep(ms: number): Promise<void> {
    return new Promise((resolve) => setTimeout(resolve, ms));
  }

  private static createStorageService(): StorageService {
    const storageType = process.env["STORAGE_TYPE"] || "minio";

    try {
      switch (storageType.toLowerCase()) {
        case "minio": {
          const endpoint = process.env["MINIO_ENDPOINT"];
          const accessKey = process.env["MINIO_ACCESS_KEY"];
          const secretKey = process.env["MINIO_SECRET_KEY"];
          const useSSL = process.env["MINIO_USE_SSL"] === "true";
          const bucketName = process.env["MINIO_BUCKET_NAME"] || "banatie";
          const publicUrl = process.env["MINIO_PUBLIC_URL"];

          if (!endpoint || !accessKey || !secretKey) {
            throw new Error(
              "MinIO configuration missing. Required: MINIO_ENDPOINT, MINIO_ACCESS_KEY, MINIO_SECRET_KEY",
            );
          }

          return new MinioStorageService(
            endpoint,
            accessKey,
            secretKey,
            useSSL,
            bucketName,
            publicUrl,
          );
        }
@ -113,4 +110,4 @@ export class StorageFactory {
    this.instance = null;
    this.initializationPromise = null;
  }
}
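`createStorageServiceWithRetry` retries bucket validation up to three times; the delay between attempts sits in a hunk this diff elides. A self-contained sketch of the same retry pattern, assuming the delay doubles each attempt (`baseDelay * 2 ** (attempt - 1)`) — the actual schedule may differ:

```typescript
// Generic retry-with-backoff sketch mirroring createStorageServiceWithRetry.
// The doubling delay (baseDelay * 2 ** (attempt - 1)) is an assumption; the
// real backoff schedule is in a hunk this diff does not show.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxRetries = 3,
  baseDelay = 1000,
  sleep: (ms: number) => Promise<void> = (ms) =>
    new Promise((resolve) => setTimeout(resolve, ms)),
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      if (attempt < maxRetries) {
        await sleep(baseDelay * 2 ** (attempt - 1)); // 1s, 2s, 4s, ...
      }
    }
  }
  throw new Error(
    `Failed after ${maxRetries} attempts. ` +
      `Last error: ${lastError instanceof Error ? lastError.message : "Unknown error"}`,
  );
}
```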

View File

@ -77,6 +77,16 @@ export interface ImageGenerationResult {
  description?: string;
  model: string;
  error?: string;
  errorType?: "generation" | "storage"; // Distinguish between generation and storage errors
  generatedImageData?: GeneratedImageData; // Available when generation succeeds but storage fails
}

// Intermediate result after image generation, before storage
export interface GeneratedImageData {
  buffer: Buffer;
  mimeType: string;
  fileExtension: string;
  description?: string;
}

// Logging types

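The new `errorType` field lets a caller tell a failed generation apart from a successful generation whose upload failed, in which case `generatedImageData` still carries the image bytes. A sketch of caller-side branching — `describeFailure` and its return values are hypothetical, only the field names come from the interface above:

```typescript
// Hypothetical caller-side handling of ImageGenerationResult.errorType.
// Field names follow the interface in the diff; the branching policy is assumed.
interface ImageGenerationResult {
  success: boolean;
  error?: string;
  errorType?: "generation" | "storage";
  generatedImageData?: { buffer: Buffer; mimeType: string };
}

function describeFailure(result: ImageGenerationResult): string {
  if (result.success) return "ok";
  if (result.errorType === "storage" && result.generatedImageData) {
    // Generation succeeded; only the upload failed, so the bytes can be re-uploaded.
    return "retry-upload";
  }
  return "retry-generation";
}
```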
View File

@ -13,9 +13,8 @@ const STORAGE_KEY = 'banatie_master_key';
export default function ApiKeysPage() {
  const router = useRouter();
  const [masterKey, setMasterKey] = useState('');
  const [orgSlug, setOrgSlug] = useState('');
  const [projectSlug, setProjectSlug] = useState('');
  const [generatedKey, setGeneratedKey] = useState('');
  const [apiKeys, setApiKeys] = useState<any[]>([]);
  const [loading, setLoading] = useState(false);
@ -45,15 +44,14 @@ export default function ApiKeysPage() {
    setSuccess('');
    setGeneratedKey('');

    const result = await createProjectApiKey(masterKey, orgSlug, projectSlug);

    if (result.success && result.apiKey) {
      setGeneratedKey(result.apiKey);
      setSuccess('API key created successfully!');
      // Clear form
      setOrgSlug('');
      setProjectSlug('');
      // Reload keys list
      await loadApiKeys();
    } else {
@ -121,25 +119,17 @@ export default function ApiKeysPage() {
        <h2 className="text-2xl font-semibold text-white mb-6">Create New API Key</h2>
        <form onSubmit={handleSubmit} className="space-y-4">
          <AdminFormInput
            label="Organization Slug"
            value={orgSlug}
            onChange={setOrgSlug}
            placeholder="my-org"
            required
          />
          <AdminFormInput
            label="Project Slug"
            value={projectSlug}
            onChange={setProjectSlug}
            placeholder="my-project"
            required
          />
          <AdminButton type="submit" disabled={loading}>

View File

@ -24,8 +24,8 @@ interface GenerationResult {
}

interface ApiKeyInfo {
  organizationSlug?: string;
  projectSlug?: string;
}

export default function DemoTTIPage() {
@ -74,13 +74,13 @@ export default function DemoTTIPage() {
        // Extract org/project info from API response
        if (data.keyInfo) {
          setApiKeyInfo({
            organizationSlug: data.keyInfo.organizationSlug || data.keyInfo.organizationId,
            projectSlug: data.keyInfo.projectSlug || data.keyInfo.projectId,
          });
        } else {
          setApiKeyInfo({
            organizationSlug: 'Unknown',
            projectSlug: 'Unknown',
          });
        }
      } else {
@ -264,7 +264,7 @@ export default function DemoTTIPage() {
        {apiKeyValidated && apiKeyInfo && (
          <div className="mt-3 text-sm text-green-400">
            Validated {apiKeyInfo.organizationSlug} / {apiKeyInfo.projectSlug}
          </div>
        )}
      </div>

View File

@ -1,6 +1,5 @@
'use server';

import { listApiKeys as listApiKeysQuery } from '../db/queries/apiKeys';

const API_BASE_URL = process.env.NEXT_PUBLIC_API_URL || 'http://localhost:3000';
@ -51,15 +50,11 @@ export async function bootstrapMasterKey(): Promise<{ success: boolean; apiKey?:
export async function createProjectApiKey(
  masterKey: string,
  orgSlug: string,
  projectSlug: string
): Promise<{ success: boolean; apiKey?: string; error?: string }> {
  try {
    // Call API service to create the project key (API auto-creates org/project)
    const response = await fetch(`${API_BASE_URL}/api/admin/keys`, {
      method: 'POST',
      headers: {
@ -68,9 +63,9 @@ export async function createProjectApiKey(
      },
      body: JSON.stringify({
        type: 'project',
        organizationSlug: orgSlug,
        projectSlug: projectSlug,
        name: `${orgSlug} - ${projectSlug}`,
      }),
    });
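The request body now identifies the organization and project by slug rather than by pre-created IDs. The API's actual validation rules are not visible in this diff; a sketch of the kind of slug check the placeholders (`my-org`, `my-project`) suggest:

```typescript
// Hypothetical slug validator for the organizationSlug / projectSlug fields.
// This only encodes the convention the placeholders ("my-org", "my-project")
// suggest; the API service's real rules may differ.
function isValidSlug(slug: string): boolean {
  // Lowercase alphanumeric segments separated by single hyphens.
  return /^[a-z0-9]+(-[a-z0-9]+)*$/.test(slug);
}
```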

View File

@ -21,8 +21,27 @@ services:
        condition: service_healthy
    environment:
      - NODE_ENV=development
      - DATABASE_URL=${DATABASE_URL}
      - GEMINI_API_KEY=${GEMINI_API_KEY}
      - STORAGE_TYPE=${STORAGE_TYPE}
      - MINIO_ENDPOINT=${MINIO_ENDPOINT}
      - MINIO_ACCESS_KEY=${MINIO_ACCESS_KEY}
      - MINIO_SECRET_KEY=${MINIO_SECRET_KEY}
      - MINIO_USE_SSL=${MINIO_USE_SSL}
      - MINIO_BUCKET_NAME=${MINIO_BUCKET_NAME}
      - MINIO_PUBLIC_URL=${MINIO_PUBLIC_URL}
      - API_BASE_URL=${API_BASE_URL}
      - DEFAULT_ORG_ID=${DEFAULT_ORG_ID}
      - DEFAULT_PROJECT_ID=${DEFAULT_PROJECT_ID}
      - DEFAULT_USER_ID=${DEFAULT_USER_ID}
      - PRESIGNED_URL_EXPIRY=${PRESIGNED_URL_EXPIRY}
      - MAX_FILE_SIZE=${MAX_FILE_SIZE}
      - MAX_FILES=${MAX_FILES}
      - RESULTS_DIR=${RESULTS_DIR}
      - UPLOADS_DIR=${UPLOADS_DIR}
      - LOG_LEVEL=${LOG_LEVEL}
      - PORT=${PORT}
      - CORS_ORIGIN=${CORS_ORIGIN}
    restart: unless-stopped

  postgres:

View File

@ -50,6 +50,7 @@
  "author": "",
  "license": "MIT",
  "devDependencies": {
    "kill-port": "^2.0.1",
    "typescript": "^5.9.2"
  }
}

View File

@ -0,0 +1,55 @@
CREATE TABLE IF NOT EXISTS "organizations" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"name" text NOT NULL,
"slug" text NOT NULL,
"email" text NOT NULL,
"created_at" timestamp DEFAULT now() NOT NULL,
"updated_at" timestamp DEFAULT now() NOT NULL,
CONSTRAINT "organizations_slug_unique" UNIQUE("slug"),
CONSTRAINT "organizations_email_unique" UNIQUE("email")
);
--> statement-breakpoint
CREATE TABLE IF NOT EXISTS "projects" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"name" text NOT NULL,
"slug" text NOT NULL,
"organization_id" uuid NOT NULL,
"created_at" timestamp DEFAULT now() NOT NULL,
"updated_at" timestamp DEFAULT now() NOT NULL,
CONSTRAINT "projects_organization_id_slug_unique" UNIQUE("organization_id","slug")
);
--> statement-breakpoint
CREATE TABLE IF NOT EXISTS "api_keys" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"key_hash" text NOT NULL,
"key_prefix" text DEFAULT 'bnt_' NOT NULL,
"key_type" text NOT NULL,
"organization_id" uuid,
"project_id" uuid,
"scopes" jsonb DEFAULT '["generate"]'::jsonb NOT NULL,
"created_at" timestamp DEFAULT now() NOT NULL,
"expires_at" timestamp,
"last_used_at" timestamp,
"is_active" boolean DEFAULT true NOT NULL,
"name" text,
"created_by" uuid,
CONSTRAINT "api_keys_key_hash_unique" UNIQUE("key_hash")
);
--> statement-breakpoint
DO $$ BEGIN
ALTER TABLE "projects" ADD CONSTRAINT "projects_organization_id_organizations_id_fk" FOREIGN KEY ("organization_id") REFERENCES "public"."organizations"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
--> statement-breakpoint
DO $$ BEGIN
ALTER TABLE "api_keys" ADD CONSTRAINT "api_keys_organization_id_organizations_id_fk" FOREIGN KEY ("organization_id") REFERENCES "public"."organizations"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
--> statement-breakpoint
DO $$ BEGIN
ALTER TABLE "api_keys" ADD CONSTRAINT "api_keys_project_id_projects_id_fk" FOREIGN KEY ("project_id") REFERENCES "public"."projects"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;

View File

@ -0,0 +1,292 @@
{
"id": "8ec6e31f-1daa-4930-8bf8-2b4996e17270",
"prevId": "00000000-0000-0000-0000-000000000000",
"version": "7",
"dialect": "postgresql",
"tables": {
"public.organizations": {
"name": "organizations",
"schema": "",
"columns": {
"id": {
"name": "id",
"type": "uuid",
"primaryKey": true,
"notNull": true,
"default": "gen_random_uuid()"
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true
},
"slug": {
"name": "slug",
"type": "text",
"primaryKey": false,
"notNull": true
},
"email": {
"name": "email",
"type": "text",
"primaryKey": false,
"notNull": true
},
"created_at": {
"name": "created_at",
"type": "timestamp",
"primaryKey": false,
"notNull": true,
"default": "now()"
},
"updated_at": {
"name": "updated_at",
"type": "timestamp",
"primaryKey": false,
"notNull": true,
"default": "now()"
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {
"organizations_slug_unique": {
"name": "organizations_slug_unique",
"nullsNotDistinct": false,
"columns": [
"slug"
]
},
"organizations_email_unique": {
"name": "organizations_email_unique",
"nullsNotDistinct": false,
"columns": [
"email"
]
}
},
"policies": {},
"checkConstraints": {},
"isRLSEnabled": false
},
"public.projects": {
"name": "projects",
"schema": "",
"columns": {
"id": {
"name": "id",
"type": "uuid",
"primaryKey": true,
"notNull": true,
"default": "gen_random_uuid()"
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true
},
"slug": {
"name": "slug",
"type": "text",
"primaryKey": false,
"notNull": true
},
"organization_id": {
"name": "organization_id",
"type": "uuid",
"primaryKey": false,
"notNull": true
},
"created_at": {
"name": "created_at",
"type": "timestamp",
"primaryKey": false,
"notNull": true,
"default": "now()"
},
"updated_at": {
"name": "updated_at",
"type": "timestamp",
"primaryKey": false,
"notNull": true,
"default": "now()"
}
},
"indexes": {},
"foreignKeys": {
"projects_organization_id_organizations_id_fk": {
"name": "projects_organization_id_organizations_id_fk",
"tableFrom": "projects",
"tableTo": "organizations",
"columnsFrom": [
"organization_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {
"projects_organization_id_slug_unique": {
"name": "projects_organization_id_slug_unique",
"nullsNotDistinct": false,
"columns": [
"organization_id",
"slug"
]
}
},
"policies": {},
"checkConstraints": {},
"isRLSEnabled": false
},
"public.api_keys": {
"name": "api_keys",
"schema": "",
"columns": {
"id": {
"name": "id",
"type": "uuid",
"primaryKey": true,
"notNull": true,
"default": "gen_random_uuid()"
},
"key_hash": {
"name": "key_hash",
"type": "text",
"primaryKey": false,
"notNull": true
},
"key_prefix": {
"name": "key_prefix",
"type": "text",
"primaryKey": false,
"notNull": true,
"default": "'bnt_'"
},
"key_type": {
"name": "key_type",
"type": "text",
"primaryKey": false,
"notNull": true
},
"organization_id": {
"name": "organization_id",
"type": "uuid",
"primaryKey": false,
"notNull": false
},
"project_id": {
"name": "project_id",
"type": "uuid",
"primaryKey": false,
"notNull": false
},
"scopes": {
"name": "scopes",
"type": "jsonb",
"primaryKey": false,
"notNull": true,
"default": "'[\"generate\"]'::jsonb"
},
"created_at": {
"name": "created_at",
"type": "timestamp",
"primaryKey": false,
"notNull": true,
"default": "now()"
},
"expires_at": {
"name": "expires_at",
"type": "timestamp",
"primaryKey": false,
"notNull": false
},
"last_used_at": {
"name": "last_used_at",
"type": "timestamp",
"primaryKey": false,
"notNull": false
},
"is_active": {
"name": "is_active",
"type": "boolean",
"primaryKey": false,
"notNull": true,
"default": true
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false
},
"created_by": {
"name": "created_by",
"type": "uuid",
"primaryKey": false,
"notNull": false
}
},
"indexes": {},
"foreignKeys": {
"api_keys_organization_id_organizations_id_fk": {
"name": "api_keys_organization_id_organizations_id_fk",
"tableFrom": "api_keys",
"tableTo": "organizations",
"columnsFrom": [
"organization_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
},
"api_keys_project_id_projects_id_fk": {
"name": "api_keys_project_id_projects_id_fk",
"tableFrom": "api_keys",
"tableTo": "projects",
"columnsFrom": [
"project_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {
"api_keys_key_hash_unique": {
"name": "api_keys_key_hash_unique",
"nullsNotDistinct": false,
"columns": [
"key_hash"
]
}
},
"policies": {},
"checkConstraints": {},
"isRLSEnabled": false
}
},
"enums": {},
"schemas": {},
"sequences": {},
"roles": {},
"policies": {},
"views": {},
"_meta": {
"columns": {},
"schemas": {},
"tables": {}
}
}

View File

@ -1,5 +1,13 @@
{
  "version": "7",
  "dialect": "postgresql",
  "entries": [
    {
      "idx": 0,
      "version": "7",
      "when": 1759661399219,
      "tag": "0000_curious_wolfsbane",
      "breakpoints": true
    }
  ]
}

View File

@ -5,6 +5,7 @@ export const organizations = pgTable('organizations', {
  // Organization details
  name: text('name').notNull(),
  slug: text('slug').notNull().unique(), // URL-friendly identifier for storage paths
  email: text('email').notNull().unique(),

  // Timestamps

View File

@ -6,14 +6,15 @@ export const projects = pgTable('projects', {
  // Project details
  name: text('name').notNull(),
  slug: text('slug').notNull(), // URL-friendly identifier for storage paths
  organizationId: uuid('organization_id').notNull().references(() => organizations.id, { onDelete: 'cascade' }),

  // Timestamps
  createdAt: timestamp('created_at').notNull().defaultNow(),
  updatedAt: timestamp('updated_at').notNull().defaultNow().$onUpdate(() => new Date()),
}, (table) => ({
  // Unique constraint: one project slug per organization
  uniqueOrgProjectSlug: unique().on(table.organizationId, table.slug),
}));

export type Project = typeof projects.$inferSelect;

View File

@ -8,6 +8,9 @@ importers:
  .:
    devDependencies:
      kill-port:
        specifier: ^2.0.1
        version: 2.0.1
      typescript:
        specifier: ^5.9.2
        version: 5.9.2
@ -163,6 +166,9 @@ importers:
  apps/landing:
    dependencies:
      '@banatie/database':
        specifier: workspace:*
        version: link:../../packages/database
      next:
        specifier: 15.5.4
        version: 15.5.4(react-dom@19.1.0(react@19.1.0))(react@19.1.0)
@ -2997,6 +3003,9 @@ packages:
    resolution: {integrity: sha512-w9UMqWwJxHNOvoNzSJ2oPF5wvYcvP7jUvYzhp67yEhTi17ZDBBC1z9pTdGuzjD+EFIqLSYRweZjqfiPzQ06Ebg==}
    engines: {node: '>= 0.4'}

  get-them-args@1.3.2:
    resolution: {integrity: sha512-LRn8Jlk+DwZE4GTlDbT3Hikd1wSHgLMme/+7ddlqKd7ldwR6LjJgTVWzBnR01wnYGe4KgrXjg287RaI22UHmAw==}

  get-tsconfig@4.10.1:
    resolution: {integrity: sha512-auHyJ4AgMz7vgS8Hp3N6HXSmlMdUyhSUrfBF16w153rxtLIEOE+HGqaBppczZvnHLqQJfiHotCYpNhl0lUROFQ==}
@ -3535,6 +3544,10 @@ packages:
  keyv@4.5.4:
    resolution: {integrity: sha512-oxVHkHR/EJf2CNXnWxRLW6mg7JyCCUcG0DtEGmL2ctUo1PNTin1PUil+r/+4r5MpVgC/fn1kjsx7mjSujKqIpw==}

  kill-port@2.0.1:
    resolution: {integrity: sha512-e0SVOV5jFo0mx8r7bS29maVWp17qGqLBZ5ricNSajON6//kmb7qqqNnml4twNE8Dtj97UQD+gNFOaipS/q1zzQ==}
    hasBin: true

  kleur@3.0.3:
    resolution: {integrity: sha512-eTIzlVOSUR+JxdDFepEYcBMtZ9Qqdef+rnzWdRZuMbOywu5tO2w2N7rqjoANZ5k9vywhL6Br1VRjUIgTQx4E8w==}
    engines: {node: '>=6'}
@ -4312,6 +4325,9 @@ packages:
    resolution: {integrity: sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A==}
    engines: {node: '>=8'}

  shell-exec@1.0.2:
    resolution: {integrity: sha512-jyVd+kU2X+mWKMmGhx4fpWbPsjvD53k9ivqetutVW/BQ+WIZoDoP4d8vUMGezV6saZsiNoW2f9GIhg9Dondohg==}

  side-channel-list@1.0.0:
    resolution: {integrity: sha512-FCLHtRD/gnpCiCHEiJLOwdmFP+wzCmDEkc9y7NsYxeF4u7Btsn1ZuwgwJGxImImHicJArLP4R0yX4c2KCrMrTA==}
    engines: {node: '>= 0.4'}
@ -7263,7 +7279,7 @@ snapshots:
      eslint: 8.57.1
      eslint-import-resolver-node: 0.3.9
      eslint-import-resolver-typescript: 3.10.1(eslint-plugin-import@2.32.0(@typescript-eslint/parser@8.44.0(eslint@8.57.1)(typescript@5.9.2))(eslint@8.57.1))(eslint@8.57.1)
      eslint-plugin-import: 2.32.0(@typescript-eslint/parser@8.44.0(eslint@8.57.1)(typescript@5.9.2))(eslint-import-resolver-typescript@3.10.1)(eslint@8.57.1)
      eslint-plugin-jsx-a11y: 6.10.2(eslint@8.57.1)
      eslint-plugin-react: 7.37.5(eslint@8.57.1)
      eslint-plugin-react-hooks: 5.0.0-canary-7118f5dd7-20230705(eslint@8.57.1)
@ -7297,7 +7313,7 @@ snapshots:
      tinyglobby: 0.2.15
      unrs-resolver: 1.11.1
    optionalDependencies:
      eslint-plugin-import: 2.32.0(@typescript-eslint/parser@8.44.0(eslint@8.57.1)(typescript@5.9.2))(eslint-import-resolver-typescript@3.10.1)(eslint@8.57.1)
    transitivePeerDependencies:
      - supports-color
@ -7312,7 +7328,7 @@ snapshots:
    transitivePeerDependencies:
      - supports-color

  eslint-plugin-import@2.32.0(@typescript-eslint/parser@8.44.0(eslint@8.57.1)(typescript@5.9.2))(eslint-import-resolver-typescript@3.10.1)(eslint@8.57.1):
    dependencies:
      '@rtsao/scc': 1.1.0
      array-includes: 3.1.9
@ -7778,6 +7794,8 @@ snapshots:
      es-errors: 1.3.0
      get-intrinsic: 1.3.0

  get-them-args@1.3.2: {}

  get-tsconfig@4.10.1:
    dependencies:
      resolve-pkg-maps: 1.0.0
@ -8551,6 +8569,11 @@ snapshots:
    dependencies:
      json-buffer: 3.0.1

  kill-port@2.0.1:
    dependencies:
      get-them-args: 1.3.2
      shell-exec: 1.0.2

  kleur@3.0.3: {}

  kuler@2.0.0: {}
@ -9355,6 +9378,8 @@ snapshots:
  shebang-regex@3.0.0: {}

  shell-exec@1.0.2: {}

  side-channel-list@1.0.0:
    dependencies:
      es-errors: 1.3.0