more new feature fixes + sentry logging
Some checks failed
Deploy to Test Environment / deploy-to-test (push) Has been cancelled
@@ -91,7 +91,8 @@
       "Bash(ping:*)",
       "Bash(tee:*)",
       "Bash(timeout 1800 podman exec flyer-crawler-dev npm run test:unit:*)",
-      "mcp__filesystem__edit_file"
+      "mcp__filesystem__edit_file",
+      "Bash(timeout 300 tail:*)"
     ]
   }
 }
@@ -40,10 +40,16 @@ npm run test:integration # Run integration tests (requires DB/Redis)
 
 ### Running Tests via Podman (from Windows host)
 
+**Note:** This project has 2900+ unit tests. For AI-assisted development, pipe output to a file for easier processing.
+
 The command to run unit tests in the dev container via podman:
 
 ```bash
+# Basic (output to terminal)
 podman exec -it flyer-crawler-dev npm run test:unit
+
+# Recommended for AI processing: pipe to file
+podman exec -it flyer-crawler-dev npm run test:unit 2>&1 | tee test-results.txt
 ```
 
 The command to run integration tests in the dev container via podman:
@@ -796,16 +796,18 @@ Logstash aggregates logs from the application and infrastructure, forwarding err
 
 ### Step 1: Create Application Log Directory
 
-Create the log directory and set appropriate permissions:
+The flyer-crawler application automatically creates its log directory on startup, but you need to ensure proper permissions for Logstash to read the logs.
+
+Create the log directories and set appropriate permissions:
 
 ```bash
-# Create log directory for the flyer-crawler application
+# Create log directory for the production application
 sudo mkdir -p /var/www/flyer-crawler.projectium.com/logs
 
-# Set ownership to the user running the application (typically the deploy user or www-data)
-sudo chown -R $USER:$USER /var/www/flyer-crawler.projectium.com/logs
+# Set ownership to root (since PM2 runs as root)
+sudo chown -R root:root /var/www/flyer-crawler.projectium.com/logs
 
-# Ensure logstash user can read the logs
-sudo chmod 755 /var/www/flyer-crawler.projectium.com/logs
+# Make logs readable by logstash user
+sudo chmod 755 /var/www/flyer-crawler.projectium.com/logs
 ```
 
@@ -813,26 +815,47 @@ For the test environment:
 
 ```bash
 sudo mkdir -p /var/www/flyer-crawler-test.projectium.com/logs
-sudo chown -R $USER:$USER /var/www/flyer-crawler-test.projectium.com/logs
+sudo chown -R root:root /var/www/flyer-crawler-test.projectium.com/logs
 sudo chmod 755 /var/www/flyer-crawler-test.projectium.com/logs
 ```
 
-### Step 2: Configure Application to Write File Logs
+### Step 2: Application File Logging (Already Configured)
 
-The flyer-crawler application uses Pino for logging and currently outputs to stdout (captured by PM2). To enable file-based logging for Logstash, you would need to configure Pino to write to files.
+The flyer-crawler application uses Pino for logging and is configured to write logs to files in production/test environments:
 
-**Current Behavior:** Logs go to stdout → PM2 captures them → `~/.pm2/logs/`
+**Log File Locations:**
 
-**For Logstash Integration:** You would need to either:
+| Environment   | Log File Path                                             |
+| ------------- | --------------------------------------------------------- |
+| Production    | `/var/www/flyer-crawler.projectium.com/logs/app.log`      |
+| Test          | `/var/www/flyer-crawler-test.projectium.com/logs/app.log` |
+| Dev Container | `/app/logs/app.log`                                       |
 
-1. Configure Pino to write directly to files (requires code changes)
-2. Use PM2's log files instead (located at `~/.pm2/logs/flyer-crawler-*.log`)
+**How It Works:**
 
-For now, we'll use PM2's log files which already exist:
+- In production/test: Pino writes JSON logs to both stdout (for PM2) AND `logs/app.log` (for Logstash)
+- In development: Pino uses pino-pretty for human-readable console output only
+- The log directory is created automatically if it doesn't exist
+- You can override the log directory with the `LOG_DIR` environment variable
 
+**Verify Logging After Deployment:**
+
+After deploying the application, verify that logs are being written:
+
 ```bash
-# Check PM2 log location
-ls -la ~/.pm2/logs/
+# Check production logs
+ls -la /var/www/flyer-crawler.projectium.com/logs/
+tail -f /var/www/flyer-crawler.projectium.com/logs/app.log
+
+# Check test logs
+ls -la /var/www/flyer-crawler-test.projectium.com/logs/
+tail -f /var/www/flyer-crawler-test.projectium.com/logs/app.log
+```
+
+You should see JSON-formatted log entries like:
+
+```json
+{ "level": 30, "time": 1704067200000, "msg": "Server started on port 3001", "module": "server" }
 ```
 
 ### Step 3: Install Logstash
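> Note: the dual-output behavior described above comes down to a small amount of pino wiring. The following is a minimal, hypothetical sketch for reference only (the project's actual `logger.server.ts` appears later in this commit and additionally handles directory creation, fallbacks, and redaction); `pino.destination()` and `pino.multistream()` are real pino APIs:

```typescript
import pino from 'pino';

// File stream for Logstash; stdout stream for PM2.
// The path is the production location documented above.
const fileStream = pino.destination({
  dest: '/var/www/flyer-crawler.projectium.com/logs/app.log',
  sync: false, // asynchronous writes
  mkdir: true, // create the logs/ directory if missing
});

// One logger, two sinks: every JSON line goes to both streams.
const logger = pino(
  { level: 'info' },
  pino.multistream([{ stream: process.stdout }, { stream: fileStream }]),
);

logger.info({ module: 'server' }, 'Server started on port 3001');
```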
@@ -861,14 +884,14 @@ Create the pipeline configuration file:
 sudo nano /etc/logstash/conf.d/bugsink.conf
 ```
 
-Add the following content (adjust paths as needed):
+Add the following content:
 
 ```conf
 input {
-  # PM2 application logs (Pino JSON format)
-  # PM2 stores logs in the home directory of the user running PM2
+  # Production application logs (Pino JSON format)
+  # The flyer-crawler app writes JSON logs directly to this file
   file {
-    path => "/root/.pm2/logs/flyer-crawler-api-out.log"
+    path => "/var/www/flyer-crawler.projectium.com/logs/app.log"
     codec => json_lines
     type => "pino"
     tags => ["app", "production"]
@@ -876,18 +899,9 @@ input {
     sincedb_path => "/var/lib/logstash/sincedb_pino_prod"
   }
 
-  # PM2 error logs
+  # Test environment logs
   file {
-    path => "/root/.pm2/logs/flyer-crawler-api-error.log"
-    type => "pm2-error"
-    tags => ["app", "production", "error"]
-    start_position => "end"
-    sincedb_path => "/var/lib/logstash/sincedb_pm2_error_prod"
-  }
-
-  # Test environment logs (if running on same server)
-  file {
-    path => "/root/.pm2/logs/flyer-crawler-api-test-out.log"
+    path => "/var/www/flyer-crawler-test.projectium.com/logs/app.log"
     codec => json_lines
     type => "pino"
     tags => ["app", "test"]
@@ -906,10 +920,13 @@ input {
 }
 
 filter {
-  # Pino error detection (level 50 = error, 60 = fatal)
+  # Pino log level detection
+  # Pino levels: 10=trace, 20=debug, 30=info, 40=warn, 50=error, 60=fatal
   if [type] == "pino" and [level] {
     if [level] >= 50 {
       mutate { add_tag => ["error"] }
+    } else if [level] >= 40 {
+      mutate { add_tag => ["warning"] }
     }
   }
 
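> The numeric thresholds in this filter follow pino's standard level scheme. As a worked example, the tagging logic above is equivalent to this hypothetical helper (illustration only, not part of the codebase):

```typescript
// Pino levels: 10=trace, 20=debug, 30=info, 40=warn, 50=error, 60=fatal
type Tag = 'error' | 'warning' | null;

function tagForPinoLevel(level: number): Tag {
  if (level >= 50) return 'error'; // error and fatal -> tagged ["error"]
  if (level >= 40) return 'warning'; // warn -> tagged ["warning"]
  return null; // trace/debug/info -> no extra tag
}

console.log(tagForPinoLevel(60)); // "error" (fatal)
console.log(tagForPinoLevel(30)); // null (info)
```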
@@ -922,11 +939,6 @@ filter {
       mutate { add_tag => ["error"] }
     }
   }
-
-  # PM2 error logs are always errors
-  if [type] == "pm2-error" {
-    mutate { add_tag => ["error"] }
-  }
 }
 
 output {
@@ -966,23 +978,23 @@ sudo mkdir -p /var/lib/logstash
 sudo chown logstash:logstash /var/lib/logstash
 ```
 
-### Step 6: Grant Logstash Access to PM2 Logs
+### Step 6: Grant Logstash Access to Application Logs
 
-Logstash runs as the `logstash` user and needs permission to read PM2 logs:
+Logstash runs as the `logstash` user and needs permission to read the application log files:
 
 ```bash
-# Add logstash user to the group that owns PM2 logs
-# If PM2 runs as root:
-sudo usermod -a -G root logstash
+# Make application log files readable by logstash
+# The directories were already set to 755 in Step 1
 
-# Or, make PM2 logs world-readable (less secure but simpler)
-sudo chmod 644 /root/.pm2/logs/*.log
+# Ensure the log files themselves are readable (they should be created with 644 by default)
+sudo chmod 644 /var/www/flyer-crawler.projectium.com/logs/app.log 2>/dev/null || echo "Production log file not yet created"
+sudo chmod 644 /var/www/flyer-crawler-test.projectium.com/logs/app.log 2>/dev/null || echo "Test log file not yet created"
 
 # For Redis logs
 sudo chmod 644 /var/log/redis/redis-server.log
 ```
 
-**Note:** If PM2 runs as a different user, adjust the group accordingly.
+**Note:** The application log files are created automatically when the application starts. Run the chmod commands after the first deployment.
 
 ### Step 7: Test Logstash Configuration
 
@@ -965,10 +965,23 @@ CREATE TABLE IF NOT EXISTS public.receipt_items (
     updated_at TIMESTAMPTZ DEFAULT now() NOT NULL,
     -- Column from migration 002_expiry_tracking.sql
     upc_code TEXT,
+    -- Columns from migration 004_receipt_items_enhancements.sql
+    line_number INTEGER,
+    match_confidence NUMERIC(5,4) CHECK (match_confidence IS NULL OR (match_confidence >= 0 AND match_confidence <= 1)),
+    is_discount BOOLEAN DEFAULT FALSE NOT NULL,
+    unit_price_cents INTEGER CHECK (unit_price_cents IS NULL OR unit_price_cents >= 0),
+    unit_type TEXT,
+    added_to_pantry BOOLEAN DEFAULT FALSE NOT NULL,
     CONSTRAINT receipt_items_raw_item_description_check CHECK (TRIM(raw_item_description) <> '')
 );
 COMMENT ON TABLE public.receipt_items IS 'Stores individual line items extracted from a user receipt.';
 COMMENT ON COLUMN public.receipt_items.upc_code IS 'UPC code if extracted from receipt or matched during processing.';
+COMMENT ON COLUMN public.receipt_items.line_number IS 'Line number on the receipt for ordering items.';
+COMMENT ON COLUMN public.receipt_items.match_confidence IS 'Confidence score (0.0-1.0) when matching to master_item or product.';
+COMMENT ON COLUMN public.receipt_items.is_discount IS 'Whether this line item represents a discount or coupon.';
+COMMENT ON COLUMN public.receipt_items.unit_price_cents IS 'Price per unit in cents (for items sold by weight/volume).';
+COMMENT ON COLUMN public.receipt_items.unit_type IS 'Unit of measurement (e.g., lb, kg, each) for unit-priced items.';
+COMMENT ON COLUMN public.receipt_items.added_to_pantry IS 'Whether this item has been added to the user pantry inventory.';
 CREATE INDEX IF NOT EXISTS idx_receipt_items_receipt_id ON public.receipt_items(receipt_id);
 CREATE INDEX IF NOT EXISTS idx_receipt_items_master_item_id ON public.receipt_items(master_item_id);
 CREATE INDEX IF NOT EXISTS idx_receipt_items_upc_code ON public.receipt_items(upc_code)
@@ -984,10 +984,23 @@ CREATE TABLE IF NOT EXISTS public.receipt_items (
    updated_at TIMESTAMPTZ DEFAULT now() NOT NULL,
     -- Column from migration 002_expiry_tracking.sql
     upc_code TEXT,
+    -- Columns from migration 004_receipt_items_enhancements.sql
+    line_number INTEGER,
+    match_confidence NUMERIC(5,4) CHECK (match_confidence IS NULL OR (match_confidence >= 0 AND match_confidence <= 1)),
+    is_discount BOOLEAN DEFAULT FALSE NOT NULL,
+    unit_price_cents INTEGER CHECK (unit_price_cents IS NULL OR unit_price_cents >= 0),
+    unit_type TEXT,
+    added_to_pantry BOOLEAN DEFAULT FALSE NOT NULL,
     CONSTRAINT receipt_items_raw_item_description_check CHECK (TRIM(raw_item_description) <> '')
 );
 COMMENT ON TABLE public.receipt_items IS 'Stores individual line items extracted from a user receipt.';
 COMMENT ON COLUMN public.receipt_items.upc_code IS 'UPC code if extracted from receipt or matched during processing.';
+COMMENT ON COLUMN public.receipt_items.line_number IS 'Line number on the receipt for ordering items.';
+COMMENT ON COLUMN public.receipt_items.match_confidence IS 'Confidence score (0.0-1.0) when matching to master_item or product.';
+COMMENT ON COLUMN public.receipt_items.is_discount IS 'Whether this line item represents a discount or coupon.';
+COMMENT ON COLUMN public.receipt_items.unit_price_cents IS 'Price per unit in cents (for items sold by weight/volume).';
+COMMENT ON COLUMN public.receipt_items.unit_type IS 'Unit of measurement (e.g., lb, kg, each) for unit-priced items.';
+COMMENT ON COLUMN public.receipt_items.added_to_pantry IS 'Whether this item has been added to the user pantry inventory.';
 CREATE INDEX IF NOT EXISTS idx_receipt_items_receipt_id ON public.receipt_items(receipt_id);
 CREATE INDEX IF NOT EXISTS idx_receipt_items_master_item_id ON public.receipt_items(master_item_id);
 CREATE INDEX IF NOT EXISTS idx_receipt_items_upc_code ON public.receipt_items(upc_code)
sql/migrations/004_receipt_items_enhancements.sql (new file, 39 lines)
@@ -0,0 +1,39 @@
+-- Migration: 004_receipt_items_enhancements.sql
+-- Description: Add additional columns to receipt_items for better receipt processing
+-- Created: 2026-01-12
+
+-- Add line_number column for ordering items on receipt
+ALTER TABLE public.receipt_items
+  ADD COLUMN IF NOT EXISTS line_number INTEGER;
+COMMENT ON COLUMN public.receipt_items.line_number IS 'Line number on the receipt for ordering items.';
+
+-- Add match_confidence column for tracking matching confidence scores
+ALTER TABLE public.receipt_items
+  ADD COLUMN IF NOT EXISTS match_confidence NUMERIC(5,4);
+ALTER TABLE public.receipt_items
+  ADD CONSTRAINT receipt_items_match_confidence_check
+  CHECK (match_confidence IS NULL OR (match_confidence >= 0 AND match_confidence <= 1));
+COMMENT ON COLUMN public.receipt_items.match_confidence IS 'Confidence score (0.0-1.0) when matching to master_item or product.';
+
+-- Add is_discount column to identify discount/coupon line items
+ALTER TABLE public.receipt_items
+  ADD COLUMN IF NOT EXISTS is_discount BOOLEAN DEFAULT FALSE NOT NULL;
+COMMENT ON COLUMN public.receipt_items.is_discount IS 'Whether this line item represents a discount or coupon.';
+
+-- Add unit_price_cents column for items sold by weight/volume
+ALTER TABLE public.receipt_items
+  ADD COLUMN IF NOT EXISTS unit_price_cents INTEGER;
+ALTER TABLE public.receipt_items
+  ADD CONSTRAINT receipt_items_unit_price_cents_check
+  CHECK (unit_price_cents IS NULL OR unit_price_cents >= 0);
+COMMENT ON COLUMN public.receipt_items.unit_price_cents IS 'Price per unit in cents (for items sold by weight/volume).';
+
+-- Add unit_type column for unit of measurement
+ALTER TABLE public.receipt_items
+  ADD COLUMN IF NOT EXISTS unit_type TEXT;
+COMMENT ON COLUMN public.receipt_items.unit_type IS 'Unit of measurement (e.g., lb, kg, each) for unit-priced items.';
+
+-- Add added_to_pantry column to track pantry additions
+ALTER TABLE public.receipt_items
+  ADD COLUMN IF NOT EXISTS added_to_pantry BOOLEAN DEFAULT FALSE NOT NULL;
+COMMENT ON COLUMN public.receipt_items.added_to_pantry IS 'Whether this item has been added to the user pantry inventory.';
@@ -53,9 +53,15 @@ export class ShoppingRepository {
       const res = await this.db.query<ShoppingList>(query, [userId]);
       return res.rows;
     } catch (error) {
-      handleDbError(error, logger, 'Database error in getShoppingLists', { userId }, {
-        defaultMessage: 'Failed to retrieve shopping lists.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in getShoppingLists',
+        { userId },
+        {
+          defaultMessage: 'Failed to retrieve shopping lists.',
+        },
+      );
     }
   }
 
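> Every change in this file is the same mechanical reflow of `handleDbError` calls from a condensed style to one argument per line. The helper's declaration is not shown in this diff; inferred from the call sites, it is roughly the following. This is a hypothetical reconstruction — the real declaration (and its import path) may differ, and the mapping of message names to Postgres error codes is an assumption:

```typescript
import type { Logger } from 'pino'; // import source assumed

// Hypothetical shape inferred from call sites in this diff.
interface DbErrorMessages {
  uniqueMessage?: string; // presumably unique_violation (23505)
  fkMessage?: string; // presumably foreign_key_violation (23503)
  checkMessage?: string; // presumably check_violation (23514)
  notNullMessage?: string; // presumably not_null_violation (23502)
  defaultMessage: string; // fallback for any other database error
}

// Logs the error with context, then throws a user-facing error chosen
// from `messages`. A `never` return type would match its use as the
// final statement of these catch blocks (assumption).
declare function handleDbError(
  error: unknown,
  logger: Logger,
  logMessage: string,
  context: Record<string, unknown>,
  messages: DbErrorMessages,
): never;
```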
@@ -73,10 +79,16 @@ export class ShoppingRepository {
       );
       return { ...res.rows[0], items: [] };
     } catch (error) {
-      handleDbError(error, logger, 'Database error in createShoppingList', { userId, name }, {
-        fkMessage: 'The specified user does not exist.',
-        defaultMessage: 'Failed to create shopping list.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in createShoppingList',
+        { userId, name },
+        {
+          fkMessage: 'The specified user does not exist.',
+          defaultMessage: 'Failed to create shopping list.',
+        },
+      );
     }
   }
 
@@ -118,9 +130,15 @@ export class ShoppingRepository {
       return res.rows[0];
     } catch (error) {
       if (error instanceof NotFoundError) throw error;
-      handleDbError(error, logger, 'Database error in getShoppingListById', { listId, userId }, {
-        defaultMessage: 'Failed to retrieve shopping list.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in getShoppingListById',
+        { listId, userId },
+        {
+          defaultMessage: 'Failed to retrieve shopping list.',
+        },
+      );
     }
   }
 
@@ -142,9 +160,15 @@ export class ShoppingRepository {
         );
       }
     } catch (error) {
-      handleDbError(error, logger, 'Database error in deleteShoppingList', { listId, userId }, {
-        defaultMessage: 'Failed to delete shopping list.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in deleteShoppingList',
+        { listId, userId },
+        {
+          defaultMessage: 'Failed to delete shopping list.',
+        },
+      );
     }
   }
 
@@ -188,11 +212,17 @@ export class ShoppingRepository {
       return res.rows[0];
     } catch (error) {
       if (error instanceof NotFoundError) throw error;
-      handleDbError(error, logger, 'Database error in addShoppingListItem', { listId, userId, item }, {
-        fkMessage: 'Referenced list or item does not exist.',
-        checkMessage: 'Shopping list item must have a master item or a custom name.',
-        defaultMessage: 'Failed to add item to shopping list.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in addShoppingListItem',
+        { listId, userId, item },
+        {
+          fkMessage: 'Referenced list or item does not exist.',
+          checkMessage: 'Shopping list item must have a master item or a custom name.',
+          defaultMessage: 'Failed to add item to shopping list.',
+        },
+      );
     }
   }
 
@@ -216,9 +246,15 @@ export class ShoppingRepository {
       }
     } catch (error) {
       if (error instanceof NotFoundError) throw error;
-      handleDbError(error, logger, 'Database error in removeShoppingListItem', { itemId, userId }, {
-        defaultMessage: 'Failed to remove item from shopping list.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in removeShoppingListItem',
+        { itemId, userId },
+        {
+          defaultMessage: 'Failed to remove item from shopping list.',
+        },
+      );
     }
   }
   /**
@@ -274,7 +310,11 @@ export class ShoppingRepository {
         logger,
         'Database error in addMenuPlanToShoppingList',
         { menuPlanId, shoppingListId, userId },
-        { fkMessage: 'The specified menu plan, shopping list, or an item within the plan does not exist.', defaultMessage: 'Failed to add menu plan to shopping list.' },
+        {
+          fkMessage:
+            'The specified menu plan, shopping list, or an item within the plan does not exist.',
+          defaultMessage: 'Failed to add menu plan to shopping list.',
+        },
       );
     }
   }
@@ -292,9 +332,15 @@ export class ShoppingRepository {
       );
       return res.rows;
     } catch (error) {
-      handleDbError(error, logger, 'Database error in getPantryLocations', { userId }, {
-        defaultMessage: 'Failed to get pantry locations.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in getPantryLocations',
+        { userId },
+        {
+          defaultMessage: 'Failed to get pantry locations.',
+        },
+      );
     }
   }
 
@@ -316,12 +362,18 @@ export class ShoppingRepository {
       );
       return res.rows[0];
     } catch (error) {
-      handleDbError(error, logger, 'Database error in createPantryLocation', { userId, name }, {
-        uniqueMessage: 'A pantry location with this name already exists.',
-        fkMessage: 'User not found',
-        notNullMessage: 'Pantry location name cannot be null.',
-        defaultMessage: 'Failed to create pantry location.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in createPantryLocation',
+        { userId, name },
+        {
+          uniqueMessage: 'A pantry location with this name already exists.',
+          fkMessage: 'User not found',
+          notNullMessage: 'Pantry location name cannot be null.',
+          defaultMessage: 'Failed to create pantry location.',
+        },
+      );
     }
   }
 
@@ -388,9 +440,15 @@ export class ShoppingRepository {
       ) {
         throw error;
       }
-      handleDbError(error, logger, 'Database error in updateShoppingListItem', { itemId, userId, updates }, {
-        defaultMessage: 'Failed to update shopping list item.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in updateShoppingListItem',
+        { itemId, userId, updates },
+        {
+          defaultMessage: 'Failed to update shopping list item.',
+        },
+      );
     }
   }
 
@@ -414,10 +472,16 @@ export class ShoppingRepository {
       );
       return res.rows[0].complete_shopping_list;
     } catch (error) {
-      handleDbError(error, logger, 'Database error in completeShoppingList', { shoppingListId, userId }, {
-        fkMessage: 'The specified shopping list does not exist.',
-        defaultMessage: 'Failed to complete shopping list.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in completeShoppingList',
+        { shoppingListId, userId },
+        {
+          fkMessage: 'The specified shopping list does not exist.',
+          defaultMessage: 'Failed to complete shopping list.',
+        },
+      );
     }
   }
 
@@ -456,9 +520,15 @@ export class ShoppingRepository {
       const res = await this.db.query<ShoppingTrip>(query, [userId]);
       return res.rows;
     } catch (error) {
-      handleDbError(error, logger, 'Database error in getShoppingTripHistory', { userId }, {
-        defaultMessage: 'Failed to retrieve shopping trip history.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in getShoppingTripHistory',
+        { userId },
+        {
+          defaultMessage: 'Failed to retrieve shopping trip history.',
+        },
+      );
     }
   }
 
@@ -478,10 +548,16 @@ export class ShoppingRepository {
       );
       return res.rows[0];
     } catch (error) {
-      handleDbError(error, logger, 'Database error in createReceipt', { userId, receiptImageUrl }, {
-        fkMessage: 'User not found',
-        defaultMessage: 'Failed to create receipt record.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in createReceipt',
+        { userId, receiptImageUrl },
+        {
+          fkMessage: 'User not found',
+          defaultMessage: 'Failed to create receipt record.',
+        },
+      );
     }
   }
 
@@ -503,6 +579,13 @@ export class ShoppingRepository {
       | 'quantity'
       | 'created_at'
       | 'updated_at'
+      | 'upc_code'
+      | 'line_number'
+      | 'match_confidence'
+      | 'is_discount'
+      | 'unit_price_cents'
+      | 'unit_type'
+      | 'added_to_pantry'
     >[],
     logger: Logger,
   ): Promise<void> {
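> With the widened `Pick` type, callers can pass the migration-004 fields through `processReceiptItems`. A hypothetical item payload for illustration (only keys visible in this diff are used; the full key set and surrounding call shape are not shown here):

```typescript
// Hypothetical example payload; field semantics from the SQL schema above.
const itemUpdates = [
  {
    quantity: 1,
    upc_code: '0123456789012',
    line_number: 1,
    match_confidence: 0.9325, // NUMERIC(5,4), constrained to 0..1
    is_discount: false,
    unit_price_cents: 599, // integer cents, constrained to >= 0
    unit_type: 'each',
    added_to_pantry: false,
  },
];
```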
@@ -530,10 +613,16 @@ export class ShoppingRepository {
           'Failed to update receipt status to "failed" after transaction rollback.',
         );
       }
-      handleDbError(error, logger, 'Database transaction error in processReceiptItems', { receiptId }, {
-        fkMessage: 'The specified receipt or an item within it does not exist.',
-        defaultMessage: 'Failed to process and save receipt items.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database transaction error in processReceiptItems',
+        { receiptId },
+        {
+          fkMessage: 'The specified receipt or an item within it does not exist.',
+          defaultMessage: 'Failed to process and save receipt items.',
+        },
+      );
     }
   }
 
@@ -550,9 +639,15 @@ export class ShoppingRepository {
       );
       return res.rows;
     } catch (error) {
-      handleDbError(error, logger, 'Database error in findDealsForReceipt', { receiptId }, {
-        defaultMessage: 'Failed to find deals for receipt.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in findDealsForReceipt',
+        { receiptId },
+        {
+          defaultMessage: 'Failed to find deals for receipt.',
+        },
+      );
     }
   }
 
@@ -572,9 +667,15 @@ export class ShoppingRepository {
       );
       return res.rows[0];
     } catch (error) {
-      handleDbError(error, logger, 'Database error in findReceiptOwner', { receiptId }, {
-        defaultMessage: 'Failed to retrieve receipt owner from database.',
-      });
+      handleDbError(
+        error,
+        logger,
+        'Database error in findReceiptOwner',
+        { receiptId },
+        {
+          defaultMessage: 'Failed to retrieve receipt owner from database.',
+        },
+      );
     }
   }
 }
@@ -4,13 +4,43 @@ import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
 // Unmock the module we are testing to override the global mock from setupFiles.
 vi.unmock('./logger.server');
 
+// Mock fs to prevent actual file system operations
+vi.mock('fs', () => ({
+  default: {
+    existsSync: vi.fn(() => true),
+    mkdirSync: vi.fn(),
+  },
+  existsSync: vi.fn(() => true),
+  mkdirSync: vi.fn(),
+}));
+
+// Create mock objects for pino's multistream functionality
+const mockDestinationStream = { write: vi.fn() };
+const mockMultistream = { write: vi.fn() };
+
 // Mock pino before importing the logger
-const pinoMock = vi.fn(() => ({
+// The new logger uses pino.destination() and pino.multistream() for production/test
+const mockLoggerInstance = {
   info: vi.fn(),
   warn: vi.fn(),
   error: vi.fn(),
   debug: vi.fn(),
-}));
+  level: 'info',
+  child: vi.fn(() => mockLoggerInstance),
+};
+
+// Create a properly typed mock that includes pino's static methods
+const mockDestination = vi.fn(() => mockDestinationStream);
+const mockMultistreamFn = vi.fn(() => mockMultistream);
+
+const pinoMock = Object.assign(
+  vi.fn(() => mockLoggerInstance),
+  {
+    destination: mockDestination,
+    multistream: mockMultistreamFn,
+  },
+);
 
 vi.mock('pino', () => ({ default: pinoMock }));
 
 describe('Server Logger', () => {
@@ -25,28 +55,73 @@ describe('Server Logger', () => {
     vi.unstubAllEnvs();
   });
 
-  it('should initialize pino with the correct level for production', async () => {
+  it('should initialize pino with multistream for production (stdout + file)', async () => {
     vi.stubEnv('NODE_ENV', 'production');
     await import('./logger.server');
+
+    // Production uses pino.destination for file output
+    expect(mockDestination).toHaveBeenCalledWith(
+      expect.objectContaining({
+        dest: expect.stringContaining('app.log'),
+        sync: false,
+        mkdir: true,
+      }),
+    );
+
+    // Production uses pino.multistream to combine stdout and file streams
+    expect(mockMultistreamFn).toHaveBeenCalledWith(
+      expect.arrayContaining([
+        expect.objectContaining({ stream: process.stdout }),
+        expect.objectContaining({ stream: mockDestinationStream }),
+      ]),
+    );
+
+    // pino is called with level 'info' for production
     expect(pinoMock).toHaveBeenCalledWith(
-      expect.objectContaining({ level: 'info', transport: undefined }),
+      expect.objectContaining({ level: 'info' }),
+      mockMultistream,
     );
   });
 
   it('should initialize pino with pretty-print transport for development', async () => {
     vi.stubEnv('NODE_ENV', 'development');
     await import('./logger.server');
+
+    // Development does NOT use destination or multistream
+    expect(mockDestination).not.toHaveBeenCalled();
+    expect(mockMultistreamFn).not.toHaveBeenCalled();
+
+    // Development uses pino-pretty transport
     expect(pinoMock).toHaveBeenCalledWith(
-      expect.objectContaining({ level: 'debug', transport: expect.any(Object) }),
+      expect.objectContaining({
+        level: 'debug',
+        transport: expect.objectContaining({
+          target: 'pino-pretty',
+        }),
+      }),
     );
   });
 
-  it('should initialize pino with debug level and no transport for test', async () => {
+  it('should initialize pino with multistream for test (stdout + file)', async () => {
     // This is the default for vitest, but we stub it for clarity.
     vi.stubEnv('NODE_ENV', 'test');
     await import('./logger.server');
+
+    // Test env also uses file logging like production
+    expect(mockDestination).toHaveBeenCalledWith(
+      expect.objectContaining({
+        dest: expect.stringContaining('app.log'),
+        sync: false,
+        mkdir: true,
+      }),
+    );
+
+    expect(mockMultistreamFn).toHaveBeenCalled();
+
+    // Test uses debug level
     expect(pinoMock).toHaveBeenCalledWith(
-      expect.objectContaining({ level: 'debug', transport: undefined }),
+      expect.objectContaining({ level: 'debug' }),
+      mockMultistream,
     );
   });
 });
@@ -3,44 +3,126 @@
  * SERVER-SIDE LOGGER
  * This file configures and exports a singleton `pino` logger instance for
  * server-side use, adhering to ADR-004 for structured JSON logging.
+ *
+ * In production/test environments, logs are written to:
+ * - stdout (for PM2 capture and real-time viewing)
+ * - File: logs/app.log (for Logstash aggregation)
+ *
+ * Log files are stored in the application's logs/ directory:
+ * - Production: /var/www/flyer-crawler.projectium.com/logs/
+ * - Test: /var/www/flyer-crawler-test.projectium.com/logs/
+ * - Dev container: /app/logs/
  */
 import pino from 'pino';
+import fs from 'fs';
+import path from 'path';
 
 const isProduction = process.env.NODE_ENV === 'production';
 const isTest = process.env.NODE_ENV === 'test';
+const isDevelopment = !isProduction && !isTest;
 
-export const logger = pino({
-  level: isProduction ? 'info' : 'debug',
-  // Use pino-pretty for human-readable logs in development, and JSON in production.
-  // Disable transport in tests to prevent worker thread issues.
-  transport:
-    isProduction || isTest
-      ? undefined
-      : {
-          target: 'pino-pretty',
-          options: {
-            colorize: true,
-            translateTime: 'SYS:standard',
-            ignore: 'pid,hostname', // These are useful in production, but noisy in dev.
-          },
-        },
-  // As per ADR-004, we centralize sanitization here.
-  // This automatically redacts sensitive fields from all log objects.
-  // The paths target keys within objects passed to the logger.
-  redact: {
-    paths: [
-      'req.headers.authorization',
-      'req.headers.cookie',
-      '*.body.password',
-      '*.body.newPassword',
-      '*.body.currentPassword',
-      '*.body.confirmPassword',
-      '*.body.refreshToken',
-      '*.body.token',
-    ],
-    censor: '[REDACTED]',
-  },
-});
+// Determine log directory based on environment
+// In production/test, use the application directory's logs folder
+// In development, use process.cwd()/logs
+const getLogDirectory = (): string => {
+  // Allow override via environment variable
+  if (process.env.LOG_DIR) {
+    return process.env.LOG_DIR;
+  }
+
+  // Default to logs/ in current working directory
+  return path.join(process.cwd(), 'logs');
+};
+
+// Ensure log directory exists (only in production/test where we write files)
+const ensureLogDirectory = (): string | null => {
+  if (isDevelopment) {
+    return null; // Don't create log files in development
+  }
+
+  const logDir = getLogDirectory();
+  try {
+    if (!fs.existsSync(logDir)) {
+      fs.mkdirSync(logDir, { recursive: true });
+    }
+    return logDir;
+  } catch (error) {
+    // If we can't create the directory, fall back to stdout only
+    console.error(`Failed to create log directory ${logDir}:`, error);
+    return null;
+  }
+};
+
+// Common redaction configuration
+const redactConfig = {
+  paths: [
+    'req.headers.authorization',
+    'req.headers.cookie',
+    '*.body.password',
+    '*.body.newPassword',
+    '*.body.currentPassword',
+    '*.body.confirmPassword',
+    '*.body.refreshToken',
+    '*.body.token',
+  ],
+  censor: '[REDACTED]',
+};
+
+// Create the logger based on environment
+const createLogger = (): pino.Logger => {
+  const logDir = ensureLogDirectory();
+
+  // Development: Use pino-pretty for human-readable output
+  if (isDevelopment) {
+    return pino({
+      level: 'debug',
+      transport: {
+        target: 'pino-pretty',
+        options: {
+          colorize: true,
+          translateTime: 'SYS:standard',
+          ignore: 'pid,hostname',
+        },
+      },
+      redact: redactConfig,
+    });
+  }
+
+  // Production/Test: Write to both stdout and file
+  if (logDir) {
+    const logFilePath = path.join(logDir, 'app.log');
+
+    // Create a multi-stream destination
+    const streams: pino.StreamEntry[] = [
+      // Stream to stdout (for PM2 and real-time viewing)
+      { stream: process.stdout },
+      // Stream to file (for Logstash aggregation)
+      {
+        stream: pino.destination({
+          dest: logFilePath,
+          sync: false, // Async for better performance
+          mkdir: true, // Create directory if needed
+        }),
+      },
+    ];
+
+    return pino(
+      {
+        level: isProduction ? 'info' : 'debug',
+        redact: redactConfig,
+      },
+      pino.multistream(streams),
+    );
+  }
+
+  // Fallback: stdout only (if log directory creation failed)
+  return pino({
+    level: isProduction ? 'info' : 'debug',
+    redact: redactConfig,
+  });
+};
+
+export const logger = createLogger();
 
 const debugModules = (process.env.DEBUG_MODULES || '').split(',').map((s) => s.trim());
 
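> Call sites are unchanged by this refactor — consumers keep importing the singleton. A short usage sketch (the module name is illustrative; `child()` is pino's standard API):

```typescript
import { logger } from './logger.server';

// Per-module child logger; the `module` field appears in every JSON line.
const log = logger.child({ module: 'receipt-worker' });
log.info('Receipt queued for processing');

// Redaction applies to matching paths, e.g. '*.body.password':
log.info({ req: { body: { password: 'hunter2' } } }, 'login attempt');
// -> the password value is logged as "[REDACTED]"
```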
@@ -50,23 +50,22 @@ describe('E2E Inventory/Expiry Management Journey', () => {
 
     // Clean up alert logs
     if (createdInventoryIds.length > 0) {
-      await pool.query('DELETE FROM public.expiry_alert_log WHERE inventory_id = ANY($1::int[])', [
-        createdInventoryIds,
-      ]);
+      await pool.query(
+        'DELETE FROM public.expiry_alert_log WHERE pantry_item_id = ANY($1::bigint[])',
+        [createdInventoryIds],
+      );
     }
 
-    // Clean up inventory items
+    // Clean up inventory items (pantry_items table)
     if (createdInventoryIds.length > 0) {
-      await pool.query('DELETE FROM public.user_inventory WHERE inventory_id = ANY($1::int[])', [
+      await pool.query('DELETE FROM public.pantry_items WHERE pantry_item_id = ANY($1::bigint[])', [
         createdInventoryIds,
       ]);
     }
 
-    // Clean up user alert settings
+    // Clean up user alert settings (expiry_alerts table)
     if (userId) {
-      await pool.query('DELETE FROM public.user_expiry_alert_settings WHERE user_id = $1', [
-        userId,
-      ]);
+      await pool.query('DELETE FROM public.expiry_alerts WHERE user_id = $1', [userId]);
     }
 
     // Clean up user
@@ -110,36 +109,64 @@ describe('E2E Inventory/Expiry Management Journey', () => {
     const formatDate = (d: Date) => d.toISOString().split('T')[0];
 
     // Step 3: Add multiple inventory items with different expiry dates
+    // Note: API requires 'source' field (manual, receipt_scan, upc_scan)
+    // Also: pantry_items table requires master_item_id, so we need to create master items first
+    const pool = getPool();
+
+    // Create master grocery items for our test items
+    const masterItemNames = ['E2E Milk', 'E2E Frozen Pizza', 'E2E Bread', 'E2E Apples', 'E2E Rice'];
+    const masterItemIds: number[] = [];
+
+    for (const name of masterItemNames) {
+      const result = await pool.query(
+        `INSERT INTO public.master_grocery_items (name)
+         VALUES ($1)
+         ON CONFLICT (name) DO UPDATE SET name = EXCLUDED.name
+         RETURNING master_grocery_item_id`,
+        [name],
+      );
+      masterItemIds.push(result.rows[0].master_grocery_item_id);
+    }
+
     const items = [
       {
-        item_name: 'Milk',
+        item_name: 'E2E Milk',
+        master_item_id: masterItemIds[0],
         quantity: 2,
         location: 'fridge',
         expiry_date: formatDate(tomorrow),
-        notes: 'Low-fat milk',
+        source: 'manual',
       },
       {
-        item_name: 'Frozen Pizza',
+        item_name: 'E2E Frozen Pizza',
+        master_item_id: masterItemIds[1],
         quantity: 3,
         location: 'freezer',
         expiry_date: formatDate(nextMonth),
+        source: 'manual',
       },
       {
-        item_name: 'Bread',
+        item_name: 'E2E Bread',
+        master_item_id: masterItemIds[2],
         quantity: 1,
         location: 'pantry',
         expiry_date: formatDate(nextWeek),
+        source: 'manual',
       },
       {
-        item_name: 'Apples',
+        item_name: 'E2E Apples',
+        master_item_id: masterItemIds[3],
         quantity: 6,
         location: 'fridge',
         expiry_date: formatDate(nextWeek),
+        source: 'manual',
      },
       {
-        item_name: 'Rice',
+        item_name: 'E2E Rice',
+        master_item_id: masterItemIds[4],
         quantity: 1,
         location: 'pantry',
+        source: 'manual',
         // No expiry date - non-perishable
       },
     ];
@@ -158,14 +185,36 @@ describe('E2E Inventory/Expiry Management Journey', () => {
     }
 
     // Add an expired item directly to the database for testing expired endpoint
-    const pool = getPool();
-    const expiredResult = await pool.query(
-      `INSERT INTO public.user_inventory (user_id, item_name, quantity, location, expiry_date)
-       VALUES ($1, 'Expired Yogurt', 1, 'fridge', $2)
-       RETURNING inventory_id`,
-      [userId, formatDate(yesterday)],
+    // First create a master_grocery_item and pantry_location for the direct insert
+    // (pool already defined above)
+
+    // Create or get the master grocery item
+    const masterItemResult = await pool.query(
+      `INSERT INTO public.master_grocery_items (name)
+       VALUES ('Expired Yogurt E2E')
+       ON CONFLICT (name) DO UPDATE SET name = EXCLUDED.name
+       RETURNING master_grocery_item_id`,
     );
-    createdInventoryIds.push(expiredResult.rows[0].inventory_id);
+    const masterItemId = masterItemResult.rows[0].master_grocery_item_id;
+
+    // Create or get the pantry location
+    const locationResult = await pool.query(
+      `INSERT INTO public.pantry_locations (user_id, name)
+       VALUES ($1, 'fridge')
+       ON CONFLICT (user_id, name) DO UPDATE SET name = EXCLUDED.name
+       RETURNING pantry_location_id`,
+      [userId],
+    );
+    const pantryLocationId = locationResult.rows[0].pantry_location_id;
+
+    // Insert the expired pantry item
+    const expiredResult = await pool.query(
+      `INSERT INTO public.pantry_items (user_id, master_item_id, quantity, pantry_location_id, best_before_date, source)
+       VALUES ($1, $2, 1, $3, $4, 'manual')
+       RETURNING pantry_item_id`,
+      [userId, masterItemId, pantryLocationId, formatDate(yesterday)],
+    );
+    createdInventoryIds.push(expiredResult.rows[0].pantry_item_id);
 
     // Step 4: View all inventory
     const listResponse = await authedFetch('/inventory', {
@@ -192,7 +241,7 @@ describe('E2E Inventory/Expiry Management Journey', () => {
     expect(fridgeData.data.items.length).toBe(3); // Milk, Apples, Expired Yogurt
 
     // Step 6: View expiring items
-    const expiringResponse = await authedFetch('/inventory/expiring?days_ahead=3', {
+    const expiringResponse = await authedFetch('/inventory/expiring?days=3', {
       method: 'GET',
       token: authToken,
     });
@@ -214,7 +263,7 @@ describe('E2E Inventory/Expiry Management Journey', () => {
 
     // Find the expired yogurt
     const expiredYogurt = expiredData.data.items.find(
-      (i: { item_name: string }) => i.item_name === 'Expired Yogurt',
+      (i: { item_name: string }) => i.item_name === 'Expired Yogurt E2E',
     );
     expect(expiredYogurt).toBeDefined();
 
@@ -244,45 +293,48 @@ describe('E2E Inventory/Expiry Management Journey', () => {
     const updateData = await updateResponse.json();
     expect(updateData.data.quantity).toBe(1);
 
-    // Step 10: Consume some apples
+    // Step 10: Consume some apples (partial consume via update, then mark fully consumed)
+    // First, reduce quantity via update
     const applesId = createdInventoryIds[3];
-    const consumeResponse = await authedFetch(`/inventory/${applesId}/consume`, {
-      method: 'POST',
+    const partialConsumeResponse = await authedFetch(`/inventory/${applesId}`, {
+      method: 'PUT',
       token: authToken,
-      body: JSON.stringify({ quantity_consumed: 2 }),
+      body: JSON.stringify({ quantity: 4 }), // 6 - 2 = 4
     });
 
-    expect(consumeResponse.status).toBe(200);
-    const consumeData = await consumeResponse.json();
-    expect(consumeData.data.quantity).toBe(4); // 6 - 2
+    expect(partialConsumeResponse.status).toBe(200);
+    const partialConsumeData = await partialConsumeResponse.json();
+    expect(partialConsumeData.data.quantity).toBe(4);
 
-    // Step 11: Configure alert settings
-    const alertSettingsResponse = await authedFetch('/inventory/alerts/settings', {
+    // Step 11: Configure alert settings for email
+    // The API uses PUT /inventory/alerts/:alertMethod with days_before_expiry and is_enabled
+    const alertSettingsResponse = await authedFetch('/inventory/alerts/email', {
       method: 'PUT',
       token: authToken,
       body: JSON.stringify({
-        alerts_enabled: true,
+        is_enabled: true,
         days_before_expiry: 3,
-        alert_time: '08:00',
-        email_notifications: true,
-        push_notifications: false,
       }),
     });
 
     expect(alertSettingsResponse.status).toBe(200);
     const alertSettingsData = await alertSettingsResponse.json();
-    expect(alertSettingsData.data.settings.alerts_enabled).toBe(true);
-    expect(alertSettingsData.data.settings.days_before_expiry).toBe(3);
+    expect(alertSettingsData.data.is_enabled).toBe(true);
+    expect(alertSettingsData.data.days_before_expiry).toBe(3);
 
     // Step 12: Verify alert settings were saved
-    const getSettingsResponse = await authedFetch('/inventory/alerts/settings', {
+    const getSettingsResponse = await authedFetch('/inventory/alerts', {
       method: 'GET',
       token: authToken,
     });
 
     expect(getSettingsResponse.status).toBe(200);
     const getSettingsData = await getSettingsResponse.json();
-    expect(getSettingsData.data.settings.alerts_enabled).toBe(true);
+    // Should have email alerts enabled
+    const emailAlert = getSettingsData.data.find(
+      (s: { alert_method: string }) => s.alert_method === 'email',
+    );
+    expect(emailAlert?.is_enabled).toBe(true);
 
     // Step 13: Get recipe suggestions based on expiring items
     const suggestionsResponse = await authedFetch('/inventory/recipes/suggestions', {
@@ -294,17 +346,23 @@ describe('E2E Inventory/Expiry Management Journey', () => {
     const suggestionsData = await suggestionsResponse.json();
     expect(Array.isArray(suggestionsData.data.suggestions)).toBe(true);
 
-    // Step 14: Fully consume an item
+    // Step 14: Fully consume an item (marks as consumed, returns 204)
     const breadId = createdInventoryIds[2];
     const fullConsumeResponse = await authedFetch(`/inventory/${breadId}/consume`, {
       method: 'POST',
       token: authToken,
-      body: JSON.stringify({ quantity_consumed: 1 }),
     });
 
-    expect(fullConsumeResponse.status).toBe(200);
-    const fullConsumeData = await fullConsumeResponse.json();
-    expect(fullConsumeData.data.is_consumed).toBe(true);
+    expect(fullConsumeResponse.status).toBe(204);
+    // Verify the item is now marked as consumed
+    const consumedItemResponse = await authedFetch(`/inventory/${breadId}`, {
+      method: 'GET',
+      token: authToken,
+    });
+    expect(consumedItemResponse.status).toBe(200);
+    const consumedItemData = await consumedItemResponse.json();
+    expect(consumedItemData.data.item.is_consumed).toBe(true);
 
     // Step 15: Delete an item
     const riceId = createdInventoryIds[4];
@@ -54,23 +54,23 @@ describe('E2E Receipt Processing Journey', () => {
   afterAll(async () => {
     const pool = getPool();
 
-    // Clean up inventory items
+    // Clean up inventory items (pantry_items table)
     if (createdInventoryIds.length > 0) {
-      await pool.query('DELETE FROM public.user_inventory WHERE inventory_id = ANY($1::int[])', [
+      await pool.query('DELETE FROM public.pantry_items WHERE pantry_item_id = ANY($1::bigint[])', [
         createdInventoryIds,
       ]);
     }
 
     // Clean up receipt items and receipts
     if (createdReceiptIds.length > 0) {
-      await pool.query('DELETE FROM public.receipt_items WHERE receipt_id = ANY($1::int[])', [
+      await pool.query('DELETE FROM public.receipt_items WHERE receipt_id = ANY($1::bigint[])', [
         createdReceiptIds,
       ]);
       await pool.query(
-        'DELETE FROM public.receipt_processing_logs WHERE receipt_id = ANY($1::int[])',
+        'DELETE FROM public.receipt_processing_log WHERE receipt_id = ANY($1::bigint[])',
         [createdReceiptIds],
       );
-      await pool.query('DELETE FROM public.receipts WHERE receipt_id = ANY($1::int[])', [
+      await pool.query('DELETE FROM public.receipts WHERE receipt_id = ANY($1::bigint[])', [
         createdReceiptIds,
       ]);
     }
@@ -108,23 +108,35 @@ describe('E2E Receipt Processing Journey', () => {

     // Step 3: Create a receipt directly in the database (simulating a completed upload)
     // In a real E2E test with full BullMQ setup, we would upload and wait for processing
+    // Note: receipts table uses store_id (FK to stores) and total_amount_cents (integer cents)
     const pool = getPool();

+    // First, create or get a test store
+    const storeResult = await pool.query(
+      `INSERT INTO public.stores (name)
+       VALUES ('E2E Test Store')
+       ON CONFLICT (name) DO UPDATE SET name = EXCLUDED.name
+       RETURNING store_id`,
+    );
+    const storeId = storeResult.rows[0].store_id;
+
     const receiptResult = await pool.query(
-      `INSERT INTO public.receipts (user_id, receipt_image_url, status, store_name, total_amount, transaction_date)
-       VALUES ($1, '/uploads/receipts/e2e-test.jpg', 'completed', 'E2E Test Store', 49.99, '2024-01-15')
+      `INSERT INTO public.receipts (user_id, receipt_image_url, status, store_id, total_amount_cents, transaction_date)
+       VALUES ($1, '/uploads/receipts/e2e-test.jpg', 'completed', $2, 4999, '2024-01-15')
        RETURNING receipt_id`,
-      [userId],
+      [userId, storeId],
     );
     const receiptId = receiptResult.rows[0].receipt_id;
     createdReceiptIds.push(receiptId);

     // Add receipt items
+    // receipt_items uses: raw_item_description, quantity, price_paid_cents, status
     const itemsResult = await pool.query(
-      `INSERT INTO public.receipt_items (receipt_id, raw_text, parsed_name, quantity, unit_price, total_price, status, added_to_inventory)
+      `INSERT INTO public.receipt_items (receipt_id, raw_item_description, quantity, price_paid_cents, status)
        VALUES
-         ($1, 'MILK 2% 4L', 'Milk 2%', 1, 5.99, 5.99, 'matched', false),
-         ($1, 'BREAD WHITE', 'White Bread', 2, 2.49, 4.98, 'unmatched', false),
-         ($1, 'EGGS LARGE 12', 'Large Eggs', 1, 4.99, 4.99, 'matched', false)
+         ($1, 'MILK 2% 4L', 1, 599, 'matched'),
+         ($1, 'BREAD WHITE', 2, 498, 'unmatched'),
+         ($1, 'EGGS LARGE 12', 1, 499, 'matched')
        RETURNING receipt_item_id`,
       [receiptId],
     );
@@ -146,7 +158,7 @@ describe('E2E Receipt Processing Journey', () => {
       (r: { receipt_id: number }) => r.receipt_id === receiptId,
     );
     expect(ourReceipt).toBeDefined();
-    expect(ourReceipt.store_name).toBe('E2E Test Store');
+    expect(ourReceipt.store_id).toBe(storeId);

     // Step 5: View receipt details
     const detailResponse = await authedFetch(`/receipts/${receiptId}`, {
@@ -295,11 +307,12 @@ describe('E2E Receipt Processing Journey', () => {
     await cleanupDb({ userIds: [otherUserId] });

     // Step 14: Create a second receipt to test listing and filtering
+    // Use the same store_id we created earlier, and use total_amount_cents (integer cents)
     const receipt2Result = await pool.query(
-      `INSERT INTO public.receipts (user_id, receipt_image_url, status, store_name, total_amount)
-       VALUES ($1, '/uploads/receipts/e2e-test-2.jpg', 'failed', 'Failed Store', 25.00)
+      `INSERT INTO public.receipts (user_id, receipt_image_url, status, store_id, total_amount_cents)
+       VALUES ($1, '/uploads/receipts/e2e-test-2.jpg', 'failed', $2, 2500)
        RETURNING receipt_id`,
-      [userId],
+      [userId, storeId],
     );
     createdReceiptIds.push(receipt2Result.rows[0].receipt_id);

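Both receipt fixtures above switch from `total_amount` (numeric dollars) to `total_amount_cents`, and the item rows switch to `price_paid_cents`. Storing money as integer cents sidesteps floating-point rounding when summing line items. A small illustrative sketch of the conversion (the column names come from the diff; the helper functions are not part of the codebase):

```typescript
// Hypothetical conversion helpers for the integer-cents columns
// (total_amount_cents, price_paid_cents). Not from the repository.
function toCents(dollars: number): number {
  // Round rather than truncate: 49.99 * 100 is not exactly 4999 in IEEE-754.
  return Math.round(dollars * 100);
}

function toDollars(cents: number): string {
  return (cents / 100).toFixed(2);
}

console.log(toCents(49.99)); // 4999, the value used in the fixture above
console.log(toDollars(2500)); // "25.00", matching the 'failed' receipt
```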
@@ -91,13 +91,24 @@ describe('E2E UPC Scanning Journey', () => {
     expect(authToken).toBeDefined();

     // Step 3: Create a test product with UPC in the database
+    // Products table requires master_item_id (FK to master_grocery_items), has optional brand_id
     const pool = getPool();
     const testUpc = `${Date.now()}`.slice(-12).padStart(12, '0');

+    // First, create or get a master grocery item
+    const masterItemResult = await pool.query(
+      `INSERT INTO public.master_grocery_items (name)
+       VALUES ('E2E Test Product Item')
+       ON CONFLICT (name) DO UPDATE SET name = EXCLUDED.name
+       RETURNING master_grocery_item_id`,
+    );
+    const masterItemId = masterItemResult.rows[0].master_grocery_item_id;
+
     const productResult = await pool.query(
-      `INSERT INTO public.products (name, brand_id, category_id, upc_code, description)
-       VALUES ('E2E Test Product', 1, 1, $1, 'Product for E2E testing')
+      `INSERT INTO public.products (name, master_item_id, upc_code, description)
+       VALUES ('E2E Test Product', $1, $2, 'Product for E2E testing')
        RETURNING product_id`,
-      [testUpc],
+      [masterItemId, testUpc],
     );
     const productId = productResult.rows[0].product_id;
     createdProductIds.push(productId);
@@ -112,7 +123,7 @@ describe('E2E UPC Scanning Journey', () => {
       }),
     });

-    expect(scanResponse.status).toBe(201);
+    expect(scanResponse.status).toBe(200);
     const scanData = await scanResponse.json();
     expect(scanData.success).toBe(true);
     expect(scanData.data.scan.upc_code).toBe(testUpc);
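The `INSERT ... ON CONFLICT (name) DO UPDATE SET name = EXCLUDED.name ... RETURNING` shape used for the store and master-item fixtures is an idempotent get-or-create: the no-op `DO UPDATE` exists only so that `RETURNING` still yields the existing row on conflict, since `DO NOTHING` would return zero rows. A hedged sketch of the pattern as a reusable test helper (table and column names follow the diff; the `upsertByName` helper itself is hypothetical):

```typescript
import { Pool } from 'pg';

// Hypothetical test-only helper for the get-or-create fixture pattern above.
// Interpolating identifiers is only acceptable because table/idColumn are
// trusted literals in test code, never user input.
async function upsertByName(
  pool: Pool,
  table: string,
  idColumn: string,
  name: string,
): Promise<number> {
  const result = await pool.query(
    `INSERT INTO public.${table} (name)
     VALUES ($1)
     ON CONFLICT (name) DO UPDATE SET name = EXCLUDED.name
     RETURNING ${idColumn}`,
    [name],
  );
  return result.rows[0][idColumn];
}

// e.g. const storeId = await upsertByName(pool, 'stores', 'store_id', 'Test Store');
```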
@@ -238,20 +238,30 @@ describe('Receipt Processing Integration Tests (/api/receipts)', () => {

   beforeAll(async () => {
     const pool = getPool();

+    // First create or get a test store
+    const storeResult = await pool.query(
+      `INSERT INTO public.stores (name)
+       VALUES ('Test Store')
+       ON CONFLICT (name) DO UPDATE SET name = EXCLUDED.name
+       RETURNING store_id`,
+    );
+    const storeId = storeResult.rows[0].store_id;
+
     const result = await pool.query(
-      `INSERT INTO public.receipts (user_id, receipt_image_url, status, store_name, total_amount)
-       VALUES ($1, $2, 'completed', 'Test Store', 99.99)
+      `INSERT INTO public.receipts (user_id, receipt_image_url, status, store_id, total_amount_cents)
+       VALUES ($1, $2, 'completed', $3, 9999)
        RETURNING receipt_id`,
-      [testUser.user.user_id, '/uploads/receipts/detail-test.jpg'],
+      [testUser.user.user_id, '/uploads/receipts/detail-test.jpg', storeId],
     );
     testReceiptId = result.rows[0].receipt_id;
     createdReceiptIds.push(testReceiptId);

     // Add some items to the receipt
     await pool.query(
-      `INSERT INTO public.receipt_items (receipt_id, raw_text, parsed_name, quantity, unit_price, total_price, status)
-       VALUES ($1, 'MILK 2% 4L', 'Milk 2%', 1, 5.99, 5.99, 'matched'),
-             ($1, 'BREAD WHITE', 'White Bread', 2, 2.49, 4.98, 'unmatched')`,
+      `INSERT INTO public.receipt_items (receipt_id, raw_item_description, quantity, price_paid_cents, status)
+       VALUES ($1, 'MILK 2% 4L', 1, 599, 'matched'),
+             ($1, 'BREAD WHITE', 2, 498, 'unmatched')`,
       [testReceiptId],
     );
   });
@@ -265,7 +275,7 @@ describe('Receipt Processing Integration Tests (/api/receipts)', () => {
     expect(response.body.success).toBe(true);
     expect(response.body.data.receipt).toBeDefined();
     expect(response.body.data.receipt.receipt_id).toBe(testReceiptId);
-    expect(response.body.data.receipt.store_name).toBe('Test Store');
+    expect(response.body.data.receipt.store_id).toBeDefined();
     expect(response.body.data.items).toBeDefined();
     expect(response.body.data.items.length).toBe(2);
   });
@@ -372,8 +382,8 @@ describe('Receipt Processing Integration Tests (/api/receipts)', () => {
     createdReceiptIds.push(receiptWithItemsId);

     const itemResult = await pool.query(
-      `INSERT INTO public.receipt_items (receipt_id, raw_text, parsed_name, quantity, unit_price, total_price, status)
-       VALUES ($1, 'EGGS LARGE 12CT', 'Large Eggs', 1, 4.99, 4.99, 'unmatched')
+      `INSERT INTO public.receipt_items (receipt_id, raw_item_description, quantity, price_paid_cents, status)
+       VALUES ($1, 'EGGS LARGE 12CT', 1, 499, 'unmatched')
        RETURNING receipt_item_id`,
       [receiptWithItemsId],
     );
@@ -443,8 +453,8 @@ describe('Receipt Processing Integration Tests (/api/receipts)', () => {
     createdReceiptIds.push(receiptForConfirmId);

     const itemResult = await pool.query(
-      `INSERT INTO public.receipt_items (receipt_id, raw_text, parsed_name, quantity, unit_price, total_price, status, added_to_inventory)
-       VALUES ($1, 'YOGURT GREEK', 'Greek Yogurt', 2, 3.99, 7.98, 'matched', false)
+      `INSERT INTO public.receipt_items (receipt_id, raw_item_description, quantity, price_paid_cents, status, added_to_pantry)
+       VALUES ($1, 'YOGURT GREEK', 2, 798, 'matched', false)
        RETURNING receipt_item_id`,
       [receiptForConfirmId],
     );
@@ -486,8 +496,8 @@ describe('Receipt Processing Integration Tests (/api/receipts)', () => {
   it('should skip items with include: false', async () => {
     const pool = getPool();
     const itemResult = await pool.query(
-      `INSERT INTO public.receipt_items (receipt_id, raw_text, parsed_name, quantity, unit_price, total_price, status, added_to_inventory)
-       VALUES ($1, 'CHIPS BBQ', 'BBQ Chips', 1, 4.99, 4.99, 'matched', false)
+      `INSERT INTO public.receipt_items (receipt_id, raw_item_description, quantity, price_paid_cents, status, added_to_pantry)
+       VALUES ($1, 'CHIPS BBQ', 1, 499, 'matched', false)
        RETURNING receipt_item_id`,
       [receiptForConfirmId],
     );
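All of these integration fixtures now write the same `receipt_items` row shape: `raw_item_description` replaces the old `raw_text`/`parsed_name` pair, and integer `price_paid_cents` replaces `unit_price`/`total_price`. A sketch of that shape as a single parameterized insert (column names and the `status` union come from the diff; the `insertReceiptItem` helper is illustrative):

```typescript
import { Pool } from 'pg';

// Hypothetical fixture helper for the new receipt_items row shape.
interface ReceiptItemFixture {
  rawItemDescription: string;
  quantity: number;
  pricePaidCents: number;
  status: 'unmatched' | 'matched' | 'needs_review' | 'ignored';
}

async function insertReceiptItem(
  pool: Pool,
  receiptId: number,
  item: ReceiptItemFixture,
): Promise<number> {
  const result = await pool.query(
    `INSERT INTO public.receipt_items (receipt_id, raw_item_description, quantity, price_paid_cents, status)
     VALUES ($1, $2, $3, $4, $5)
     RETURNING receipt_item_id`,
    [receiptId, item.rawItemDescription, item.quantity, item.pricePaidCents, item.status],
  );
  return result.rows[0].receipt_item_id;
}
```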
@@ -877,6 +877,13 @@ export const createMockReceiptItem = (overrides: Partial<ReceiptItem> = {}): Rec
   master_item_id: null,
   product_id: null,
   status: 'unmatched',
+  upc_code: null,
+  line_number: null,
+  match_confidence: null,
+  is_discount: false,
+  unit_price_cents: null,
+  unit_type: null,
+  added_to_pantry: false,
   created_at: new Date().toISOString(),
   updated_at: new Date().toISOString(),
 };
@@ -1492,17 +1499,23 @@ export const createMockAppliance = (overrides: Partial<Appliance> = {}): Applian

 // ... existing factories

-export const createMockShoppingListItemPayload = (overrides: Partial<{ masterItemId: number; customItemName: string }> = {}): { masterItemId?: number; customItemName?: string } => ({
+export const createMockShoppingListItemPayload = (
+  overrides: Partial<{ masterItemId: number; customItemName: string }> = {},
+): { masterItemId?: number; customItemName?: string } => ({
   customItemName: 'Mock Item',
   ...overrides,
 });

-export const createMockRecipeCommentPayload = (overrides: Partial<{ content: string; parentCommentId: number }> = {}): { content: string; parentCommentId?: number } => ({
+export const createMockRecipeCommentPayload = (
+  overrides: Partial<{ content: string; parentCommentId: number }> = {},
+): { content: string; parentCommentId?: number } => ({
   content: 'This is a mock comment.',
   ...overrides,
 });

-export const createMockProfileUpdatePayload = (overrides: Partial<Profile> = {}): Partial<Profile> => ({
+export const createMockProfileUpdatePayload = (
+  overrides: Partial<Profile> = {},
+): Partial<Profile> => ({
   full_name: 'Mock User',
   ...overrides,
 });
@@ -1516,14 +1529,20 @@ export const createMockAddressPayload = (overrides: Partial<Address> = {}): Part
   ...overrides,
 });

-export const createMockSearchQueryPayload = (overrides: Partial<Omit<SearchQuery, 'search_query_id' | 'created_at' | 'updated_at' | 'user_id'>> = {}): Omit<SearchQuery, 'search_query_id' | 'created_at' | 'updated_at' | 'user_id'> => ({
+export const createMockSearchQueryPayload = (
+  overrides: Partial<
+    Omit<SearchQuery, 'search_query_id' | 'created_at' | 'updated_at' | 'user_id'>
+  > = {},
+): Omit<SearchQuery, 'search_query_id' | 'created_at' | 'updated_at' | 'user_id'> => ({
   query_text: 'mock search',
   result_count: 5,
   was_successful: true,
   ...overrides,
 });

-export const createMockWatchedItemPayload = (overrides: Partial<{ itemName: string; category: string }> = {}): { itemName: string; category: string } => ({
+export const createMockWatchedItemPayload = (
+  overrides: Partial<{ itemName: string; category: string }> = {},
+): { itemName: string; category: string } => ({
   itemName: 'Mock Watched Item',
   category: 'Pantry',
   ...overrides,
@@ -1544,7 +1563,9 @@ export const createMockRegisterUserPayload = (
   ...overrides,
 });

-export const createMockLoginPayload = (overrides: Partial<{ email: string; password: string; rememberMe: boolean }> = {}) => ({
+export const createMockLoginPayload = (
+  overrides: Partial<{ email: string; password: string; rememberMe: boolean }> = {},
+) => ({
   email: 'mock@example.com',
   password: 'password123',
   rememberMe: false,
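The reflowed factory signatures above all share one defaults-plus-overrides shape. A generic sketch of the pattern, not present in the codebase, that each `createMock*Payload` is a concrete instance of:

```typescript
// Hypothetical generic form of the defaults-plus-overrides factory pattern.
function makeFactory<T extends object>(defaults: T) {
  return (overrides: Partial<T> = {}): T => ({
    ...defaults,
    ...overrides,
  });
}

// Usage mirroring createMockLoginPayload:
const createLoginPayload = makeFactory({
  email: 'mock@example.com',
  password: 'password123',
  rememberMe: false,
});

const payload = createLoginPayload({ rememberMe: true });
// -> { email: 'mock@example.com', password: 'password123', rememberMe: true }
```

Spreading `overrides` last lets callers replace any default while the factory still guarantees a complete object.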
156 src/types.ts
@@ -420,6 +420,13 @@ export interface PantryItem {
   best_before_date?: string | null; // DATE
   pantry_location_id?: number | null;
   readonly notification_sent_at?: string | null; // TIMESTAMPTZ
+  purchase_date?: string | null; // DATE
+  source?: string | null; // 'manual', 'receipt_scan', 'upc_scan'
+  receipt_item_id?: number | null;
+  product_id?: number | null;
+  expiry_source?: string | null; // 'manual', 'calculated', 'package', 'receipt'
+  is_consumed?: boolean;
+  consumed_at?: string | null; // TIMESTAMPTZ
   readonly updated_at: string;
 }

@@ -663,6 +670,13 @@ export interface ReceiptItem {
   master_item_id?: number | null; // Can be updated by admin correction
   product_id?: number | null; // Can be updated by admin correction
   status: 'unmatched' | 'matched' | 'needs_review' | 'ignored';
+  upc_code?: string | null;
+  line_number?: number | null;
+  match_confidence?: number | null;
+  is_discount: boolean;
+  unit_price_cents?: number | null;
+  unit_type?: string | null;
+  added_to_pantry: boolean;
   readonly created_at: string;
   readonly updated_at: string;
 }
@@ -1031,3 +1045,145 @@ export interface UnitConversion {
   readonly created_at: string;
   readonly updated_at: string;
 }
+
+// ============================================================================
+// UPC SCANNING TYPES
+// ============================================================================
+
+export type UpcScanSource = 'image_upload' | 'manual_entry' | 'phone_app' | 'camera_scan';
+
+export interface UpcScanHistory {
+  readonly scan_id: number;
+  readonly user_id: string; // UUID
+  upc_code: string;
+  product_id?: number | null;
+  scan_source: UpcScanSource;
+  scan_confidence?: number | null;
+  raw_image_path?: string | null;
+  lookup_successful: boolean;
+  readonly created_at: string;
+  readonly updated_at: string;
+}
+
+export type UpcExternalSource = 'openfoodfacts' | 'upcitemdb' | 'manual' | 'unknown';
+
+export interface UpcExternalLookup {
+  readonly lookup_id: number;
+  upc_code: string;
+  product_name?: string | null;
+  brand_name?: string | null;
+  category?: string | null;
+  description?: string | null;
+  image_url?: string | null;
+  external_source: UpcExternalSource;
+  lookup_data?: unknown | null; // JSONB
+  lookup_successful: boolean;
+  readonly created_at: string;
+  readonly updated_at: string;
+}
+
+// ============================================================================
+// EXPIRY TRACKING TYPES
+// ============================================================================
+
+export type StorageLocation = 'fridge' | 'freezer' | 'pantry' | 'room_temp';
+export type ExpiryDataSource = 'usda' | 'fda' | 'manual' | 'community';
+
+export interface ExpiryDateRange {
+  readonly expiry_range_id: number;
+  master_item_id?: number | null;
+  category_id?: number | null;
+  item_pattern?: string | null;
+  storage_location: StorageLocation;
+  min_days: number;
+  max_days: number;
+  typical_days: number;
+  notes?: string | null;
+  source?: ExpiryDataSource | null;
+  readonly created_at: string;
+  readonly updated_at: string;
+}
+
+export type ExpiryAlertMethod = 'email' | 'push' | 'in_app';
+
+export interface ExpiryAlert {
+  readonly expiry_alert_id: number;
+  readonly user_id: string; // UUID
+  days_before_expiry: number;
+  alert_method: ExpiryAlertMethod;
+  is_enabled: boolean;
+  last_alert_sent_at?: string | null; // TIMESTAMPTZ
+  readonly created_at: string;
+  readonly updated_at: string;
+}
+
+export type ExpiryAlertType = 'expiring_soon' | 'expired' | 'expiry_reminder';
+
+export interface ExpiryAlertLog {
+  readonly alert_log_id: number;
+  readonly user_id: string; // UUID
+  pantry_item_id?: number | null;
+  alert_type: ExpiryAlertType;
+  alert_method: ExpiryAlertMethod;
+  item_name: string;
+  expiry_date?: string | null; // DATE
+  days_until_expiry?: number | null;
+  readonly sent_at: string; // TIMESTAMPTZ
+}
+
+// ============================================================================
+// RECEIPT PROCESSING TYPES
+// ============================================================================
+
+export type ReceiptProcessingStep =
+  | 'upload'
+  | 'ocr_extraction'
+  | 'text_parsing'
+  | 'store_detection'
+  | 'item_extraction'
+  | 'item_matching'
+  | 'price_parsing'
+  | 'finalization';
+
+export type ReceiptProcessingStatus = 'started' | 'completed' | 'failed' | 'skipped';
+
+export type ReceiptProcessingProvider =
+  | 'tesseract'
+  | 'openai'
+  | 'anthropic'
+  | 'google_vision'
+  | 'aws_textract'
+  | 'internal';
+
+export interface ReceiptProcessingLog {
+  readonly log_id: number;
+  readonly receipt_id: number;
+  processing_step: ReceiptProcessingStep;
+  status: ReceiptProcessingStatus;
+  provider?: ReceiptProcessingProvider | null;
+  duration_ms?: number | null;
+  tokens_used?: number | null;
+  cost_cents?: number | null;
+  input_data?: unknown | null; // JSONB
+  output_data?: unknown | null; // JSONB
+  error_message?: string | null;
+  readonly created_at: string;
+}
+
+export type StoreReceiptPatternType =
+  | 'header_regex'
+  | 'footer_regex'
+  | 'phone_number'
+  | 'address_fragment'
+  | 'store_number_format';
+
+export interface StoreReceiptPattern {
+  readonly pattern_id: number;
+  readonly store_id: number;
+  pattern_type: StoreReceiptPatternType;
+  pattern_value: string;
+  priority: number;
+  is_active: boolean;
+  readonly created_at: string;
+  readonly updated_at: string;
+}
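The new expiry types are designed to work together: `ExpiryAlert` holds a user's per-method threshold, while `PantryItem` now carries `best_before_date`, `is_consumed`, and `consumed_at`. A hedged sketch of how a scheduler might combine them (only the type shapes come from this diff; the import path and the helper logic are assumptions):

```typescript
import type { ExpiryAlert, PantryItem } from './types';

// Whole days from now until an ISO date; negative means already expired.
function daysUntil(dateIso: string, now: Date = new Date()): number {
  const ms = new Date(dateIso).getTime() - now.getTime();
  return Math.ceil(ms / (24 * 60 * 60 * 1000));
}

// Fire an alert when an unconsumed item is inside the user's warning window.
function shouldAlert(item: PantryItem, alert: ExpiryAlert): boolean {
  if (!alert.is_enabled || item.is_consumed || !item.best_before_date) {
    return false;
  }
  return daysUntil(item.best_before_date) <= alert.days_before_expiry;
}
```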