Compare commits

...

79 Commits

Author SHA1 Message Date
Gitea Actions
68aadcaa4e ci: Bump version to 0.1.2 [skip ci] 2025-12-25 01:41:06 +05:00
971d2c3fa7 add ai agent fallbacks
Some checks failed
Deploy to Test Environment / deploy-to-test (push) Failing after 1m18s
2025-12-24 12:39:15 -08:00
Gitea Actions
daaacfde5e ci: Bump version to 0.1.1 [skip ci] 2025-12-24 23:53:27 +05:00
7ac8fe1d29 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
Some checks are pending
Deploy to Test Environment / deploy-to-test (push) Has started running
2025-12-24 10:52:13 -08:00
a2462dfb6b testing push to prod etc 2025-12-24 10:51:43 -08:00
Gitea Actions
a911224fb4 ci: Bump version to 0.1.0 for production release [skip ci] 2025-12-24 23:24:53 +05:00
Gitea Actions
bf4bcef890 ci: Bump version to 0.0.30 [skip ci] 2025-12-24 22:59:36 +05:00
ac6cd2e0a1 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 16m10s
2025-12-24 09:58:45 -08:00
eea03880c1 exclude some dirs from coverage 2025-12-24 09:58:37 -08:00
Gitea Actions
7fc263691f ci: Bump version to 0.0.29 [skip ci] 2025-12-24 22:41:17 +05:00
c0912d36d5 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 15m59s
2025-12-24 09:39:56 -08:00
612c2b5943 deploy to test fix 2025-12-24 09:39:49 -08:00
Gitea Actions
8e787ddcf0 ci: Bump version to 0.0.28 [skip ci] 2025-12-24 22:18:18 +05:00
11c52d284c fixing unit tests
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 15m13s
2025-12-24 09:17:09 -08:00
Gitea Actions
b528bd3651 ci: Bump version to 0.0.27 [skip ci] 2025-12-24 22:06:03 +05:00
4c5ceb1bd6 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
Some checks are pending
Deploy to Test Environment / deploy-to-test (push) Has started running
2025-12-24 09:05:12 -08:00
bcc4ad64dc fixing unit tests 2025-12-24 09:04:10 -08:00
Gitea Actions
d520980322 ci: Bump version to 0.0.26 [skip ci] 2025-12-24 21:23:30 +05:00
d79955aaa0 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 13m50s
2025-12-24 08:22:30 -08:00
e66027dc8e fix e2e and deploy to prod 2025-12-24 08:21:35 -08:00
Gitea Actions
027df989a4 ci: Bump version to 0.0.25 [skip ci] 2025-12-24 12:50:52 +05:00
d4d69caaf7 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 9m47s
2025-12-23 23:50:10 -08:00
03b5af39e1 consolidate some testing functions 2025-12-23 23:50:03 -08:00
Gitea Actions
8a86333f86 ci: Bump version to 0.0.24 [skip ci] 2025-12-24 10:50:48 +05:00
f173f805ea Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 19m55s
2025-12-23 21:49:16 -08:00
d3b0996ad5 create a few initial E2E tests 2025-12-23 21:49:05 -08:00
Gitea Actions
b939262f0c ci: Bump version to 0.0.23 [skip ci] 2025-12-24 10:18:28 +05:00
9437f3d6c6 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 18m50s
2025-12-23 21:16:47 -08:00
f1e028d498 only one error left - huzzah ! 2025-12-23 21:16:40 -08:00
Gitea Actions
5274650aea ci: Bump version to 0.0.22 [skip ci] 2025-12-24 08:10:33 +05:00
de5a9a565b Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 17m19s
2025-12-23 19:08:59 -08:00
10a379c5e3 fix for integration tests 404 ? not sure this is right 2025-12-23 19:08:53 -08:00
a6a484d432 fix for integration tests 404 ? not sure this is right 2025-12-23 18:13:35 -08:00
Gitea Actions
4b0a172c35 ci: Bump version to 0.0.21 [skip ci] 2025-12-24 07:06:33 +05:00
e8c894d5cf Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 15m46s
2025-12-23 18:05:28 -08:00
6c8fd4b126 prettier ! 2025-12-23 17:57:32 -08:00
Gitea Actions
a1f52544d0 ci: Bump version to 0.0.20 [skip ci] 2025-12-24 06:49:05 +05:00
2334359756 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 14m31s
2025-12-23 17:48:07 -08:00
406954ca06 database fixes due to adding foreign keys and seed script improvement 2025-12-23 17:48:00 -08:00
Gitea Actions
95d441be98 ci: Bump version to 0.0.19 [skip ci] 2025-12-24 06:22:03 +05:00
186ed484b7 last test fixes for upcoming V0.1 + pretty
Some checks failed
Deploy to Test Environment / deploy-to-test (push) Failing after 2m40s
2025-12-23 17:20:51 -08:00
Gitea Actions
3669958e9d ci: Bump version to 0.0.18 [skip ci] 2025-12-24 05:32:56 +05:00
5f3daf0539 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 11m23s
2025-12-23 16:32:11 -08:00
ae7afaaf97 integration test fixes 2025-12-23 16:32:05 -08:00
Gitea Actions
3ae7b9e0d4 ci: Bump version to 0.0.17 [skip ci] 2025-12-24 04:51:21 +05:00
921c48fc57 more unit test fixes now the UseProfileAddress OOM has been identified
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 10m14s
2025-12-23 15:50:01 -08:00
Gitea Actions
2571864b91 ci: Bump version to 0.0.16 [skip ci] 2025-12-24 01:52:55 +05:00
065d0c746a Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 12m17s
2025-12-23 12:52:02 -08:00
395f6c21a2 some "push to get dev working" stuff, and possibly found the problem test 2025-12-23 12:51:56 -08:00
Gitea Actions
aec56dfc23 ci: Bump version to 0.0.15 [skip ci] 2025-12-24 01:18:44 +05:00
a12a0e5207 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
Some checks failed
Deploy to Test Environment / deploy-to-test (push) Failing after 1m28s
2025-12-23 12:17:31 -08:00
e337bd67b1 some "push to get dev working" stuff, and possibly found the problem test 2025-12-23 12:17:21 -08:00
Gitea Actions
a8f5b4e51a ci: Bump version to 0.0.14 [skip ci] 2025-12-23 08:45:00 +05:00
d0ce8021d6 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 15m5s
2025-12-22 19:44:11 -08:00
efbb162880 keep disabling tests until the culprit is found this fucking sucks 2025-12-22 19:44:02 -08:00
Gitea Actions
e353ce8a81 ci: Bump version to 0.0.13 [skip ci] 2025-12-23 08:30:20 +05:00
b5cbf271b8 debugging the fucking OOM
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 13m51s
2025-12-22 19:29:10 -08:00
Gitea Actions
2041b4ac3c ci: Bump version to 0.0.12 [skip ci] 2025-12-23 08:19:32 +05:00
e547363a65 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
Some checks failed
Deploy to Test Environment / deploy-to-test (push) Has been cancelled
2025-12-22 19:19:05 -08:00
bddaf765fc disable to stupid system test that fails - fuck it 2025-12-22 19:18:29 -08:00
Gitea Actions
3c0bebb65c ci: Bump version to 0.0.11 [skip ci] 2025-12-23 07:59:09 +05:00
265cc3ffd4 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 14m37s
2025-12-22 18:58:11 -08:00
3d5767b60b roll back changes to src/routes/system.routes.ts hopefully before OOM issues 2025-12-22 18:58:01 -08:00
Gitea Actions
e9cb45efe0 ci: Bump version to 0.0.10 [skip ci] 2025-12-23 07:41:54 +05:00
99a57f3a30 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 15m47s
2025-12-22 18:40:59 -08:00
e46f5eb7f6 roll back changes to src/routes/system.routes.test.ts hopefully before OOM issues 2025-12-22 18:40:37 -08:00
Gitea Actions
034887069c ci: Bump version to 0.0.9 [skip ci] 2025-12-23 07:23:30 +05:00
84b5e0e15e Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
Some checks failed
Deploy to Test Environment / deploy-to-test (push) Has been cancelled
2025-12-22 18:22:23 -08:00
dc0f774699 try to stop system.route test crashes fuck sakes 2025-12-22 18:21:39 -08:00
Gitea Actions
1195b7e87f ci: Bump version to 0.0.8 [skip ci] 2025-12-23 04:45:54 +05:00
e9889f1f1e Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
All checks were successful
Deploy to Test Environment / deploy-to-test (push) Successful in 15m41s
2025-12-22 15:45:11 -08:00
3c7f6429aa try to stop system.route test crashes 2025-12-22 15:43:59 -08:00
Gitea Actions
0db90dfaa6 ci: Bump version to 0.0.7 [skip ci] 2025-12-23 04:33:19 +05:00
b7a1294ae6 fix to versioning
Some checks are pending
Deploy to Test Environment / deploy-to-test (push) Has started running
2025-12-22 15:32:43 -08:00
Gitea Actions
be652f9790 ci: Bump version to 0.0.6 [skip ci] 2025-12-23 04:17:17 +05:00
1a3e6a9ab5 unit test fixes
Some checks failed
Deploy to Test Environment / deploy-to-test (push) Failing after 38s
2025-12-22 15:11:18 -08:00
Gitea Actions
262396ddd0 ci: Bump version to 0.0.5 [skip ci] 2025-12-23 02:29:27 +05:00
c542796048 Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com
Some checks failed
Deploy to Test Environment / deploy-to-test (push) Has been cancelled
2025-12-22 13:22:28 -08:00
5b8f309ad8 oom issue 2025-12-22 13:22:21 -08:00
122 changed files with 5603 additions and 3922 deletions


@@ -0,0 +1,18 @@
{
"name": "Flyer Crawler Dev (Ubuntu 22.04)",
"dockerComposeFile": ["../compose.dev.yml"],
"service": "app",
"workspaceFolder": "/app",
"customizations": {
"vscode": {
"extensions": ["dbaeumer.vscode-eslint", "esbenp.prettier-vscode"]
}
},
"remoteUser": "root",
// Automatically install dependencies when the container is created.
// This runs inside the container, populating the isolated node_modules volume.
"postCreateCommand": "npm install",
"postAttachCommand": "npm run dev:container",
// Try to start podman machine, but exit with success (0) even if it's already running
"initializeCommand": "powershell -Command \"podman machine start; exit 0\""
}


@@ -47,6 +47,19 @@ jobs:
- name: Install Dependencies
run: npm ci
- name: Bump Minor Version and Push
run: |
# Configure git for the commit.
git config --global user.name 'Gitea Actions'
git config --global user.email 'actions@gitea.projectium.com'
# Bump the minor version number. This creates a new commit and a new tag.
# The commit message includes [skip ci] to prevent this push from triggering another workflow run.
npm version minor -m "ci: Bump version to %s for production release [skip ci]"
# Push the new commit and the new tag back to the main branch.
git push --follow-tags
- name: Check for Production Database Schema Changes
env:
DB_HOST: ${{ secrets.DB_HOST }}
@@ -61,9 +74,10 @@ jobs:
echo "--- Checking for production schema changes ---"
CURRENT_HASH=$(cat sql/master_schema_rollup.sql | dos2unix | sha256sum | awk '{ print $1 }')
echo "Current Git Schema Hash: $CURRENT_HASH"
DEPLOYED_HASH=$(PGPASSWORD="$DB_PASSWORD" psql -v ON_ERROR_STOP=1 -h "$DB_HOST" -p 5432 -U "$DB_USER" -d "$DB_NAME" -c "SELECT schema_hash FROM public.schema_info WHERE environment = 'production';" -t -A || echo "none")
# The psql command will now fail the step if the query errors (e.g., column missing), preventing deployment on a bad schema.
DEPLOYED_HASH=$(PGPASSWORD="$DB_PASSWORD" psql -v ON_ERROR_STOP=1 -h "$DB_HOST" -p 5432 -U "$DB_USER" -d "$DB_NAME" -c "SELECT schema_hash FROM public.schema_info WHERE environment = 'production';" -t -A)
echo "Deployed DB Schema Hash: $DEPLOYED_HASH"
if [ "$DEPLOYED_HASH" = "none" ] || [ -z "$DEPLOYED_HASH" ]; then
if [ -z "$DEPLOYED_HASH" ]; then
echo "WARNING: No schema hash found in the production database. This is expected for a first-time deployment."
elif [ "$CURRENT_HASH" != "$DEPLOYED_HASH" ]; then
echo "ERROR: Database schema mismatch detected! A manual database migration is required."
@@ -79,8 +93,9 @@ jobs:
exit 1
fi
GITEA_SERVER_URL="https://gitea.projectium.com"
COMMIT_MESSAGE=$(git log -1 --pretty=%s)
VITE_APP_VERSION="$(date +'%Y%m%d-%H%M'):$(git rev-parse --short HEAD)" \
COMMIT_MESSAGE=$(git log -1 --grep="\[skip ci\]" --invert-grep --pretty=%s)
PACKAGE_VERSION=$(node -p "require('./package.json').version")
VITE_APP_VERSION="$(date +'%Y%m%d-%H%M'):$(git rev-parse --short HEAD):$PACKAGE_VERSION" \
VITE_APP_COMMIT_URL="$GITEA_SERVER_URL/${{ gitea.repository }}/commit/${{ gitea.sha }}" \
VITE_APP_COMMIT_MESSAGE="$COMMIT_MESSAGE" \
VITE_API_BASE_URL=/api VITE_API_KEY=${{ secrets.VITE_GOOGLE_GENAI_API_KEY }} npm run build
@@ -148,7 +163,12 @@ jobs:
echo "Updating schema hash in production database..."
CURRENT_HASH=$(cat sql/master_schema_rollup.sql | dos2unix | sha256sum | awk '{ print $1 }')
PGPASSWORD="$DB_PASSWORD" psql -v ON_ERROR_STOP=1 -h "$DB_HOST" -p 5432 -U "$DB_USER" -d "$DB_NAME" -c \
"INSERT INTO public.schema_info (environment, schema_hash, deployed_at) VALUES ('production', '$CURRENT_HASH', NOW())
"CREATE TABLE IF NOT EXISTS public.schema_info (
environment VARCHAR(50) PRIMARY KEY,
schema_hash VARCHAR(64) NOT NULL,
deployed_at TIMESTAMP DEFAULT NOW()
);
INSERT INTO public.schema_info (environment, schema_hash, deployed_at) VALUES ('production', '$CURRENT_HASH', NOW())
ON CONFLICT (environment) DO UPDATE SET schema_hash = EXCLUDED.schema_hash, deployed_at = NOW();"
UPDATED_HASH=$(PGPASSWORD="$DB_PASSWORD" psql -v ON_ERROR_STOP=1 -h "$DB_HOST" -p 5432 -U "$DB_USER" -d "$DB_NAME" -c "SELECT schema_hash FROM public.schema_info WHERE environment = 'production';" -t -A)


@@ -51,7 +51,14 @@ jobs:
# Bump the patch version number. This creates a new commit and a new tag.
# The commit message includes [skip ci] to prevent this push from triggering another workflow run.
npm version patch -m "ci: Bump version to %s [skip ci]"
# If the tag already exists (e.g. re-running a failed job), we skip the conflicting version.
if ! npm version patch -m "ci: Bump version to %s [skip ci]"; then
echo "⚠️ Version bump failed (likely tag exists). Attempting to skip to next version..."
# Bump package.json to the conflicting version without git tagging
npm version patch --no-git-tag-version > /dev/null
# Bump again to the next version, forcing it because the directory is now dirty
npm version patch -m "ci: Bump version to %s [skip ci]" --force
fi
# Push the new commit and the new tag back to the main branch.
git push --follow-tags
@@ -112,6 +119,11 @@ jobs:
# --- JWT Secret for Passport authentication in tests ---
JWT_SECRET: ${{ secrets.JWT_SECRET }}
# --- V8 Coverage for Server Process ---
# This variable tells the Node.js process (our server, started by globalSetup)
# where to output its raw V8 coverage data.
NODE_V8_COVERAGE: '.coverage/tmp/integration-server'
# --- Increase Node.js memory limit to prevent heap out of memory errors ---
# This is crucial for memory-intensive tasks like running tests and coverage.
NODE_OPTIONS: '--max-old-space-size=8192'
@@ -129,10 +141,16 @@ jobs:
# Run unit and integration tests as separate steps.
# The `|| true` ensures the workflow continues even if tests fail, allowing coverage to run.
echo "--- Running Unit Tests ---"
# npm run test:unit -- --coverage --reporter=verbose --includeTaskLocation --testTimeout=10000 --silent=passed-only || true
npm run test:unit -- --coverage --coverage.exclude='**/*.test.ts' --coverage.exclude='**/tests/**' --coverage.exclude='**/mocks/**' --reporter=verbose --includeTaskLocation --testTimeout=10000 --silent=passed-only --no-file-parallelism || true
echo "--- Running Integration Tests ---"
npm run test:integration -- --coverage --reporter=verbose --includeTaskLocation --testTimeout=10000 --silent=passed-only || true
npm run test:integration -- --coverage --coverage.exclude='**/*.test.ts' --coverage.exclude='**/tests/**' --coverage.exclude='**/mocks/**' --reporter=verbose --includeTaskLocation --testTimeout=10000 --silent=passed-only || true
echo "--- Running E2E Tests ---"
# Run E2E tests using the dedicated E2E config which inherits from integration config.
# We still pass --coverage to enable it, but directory and timeout are now in the config.
npx vitest run --config vitest.config.e2e.ts --coverage --coverage.exclude='**/*.test.ts' --coverage.exclude='**/tests/**' --coverage.exclude='**/mocks/**' --reporter=verbose --no-file-parallelism || true
# Re-enable secret masking for subsequent steps.
echo "::secret-masking::"
@@ -148,6 +166,7 @@ jobs:
echo "Checking for source coverage files..."
ls -l .coverage/unit/coverage-final.json
ls -l .coverage/integration/coverage-final.json
ls -l .coverage/e2e/coverage-final.json || echo "E2E coverage file not found"
# --- V8 Coverage Processing for Backend Server ---
# The integration tests start the server, which generates raw V8 coverage data.
@@ -160,7 +179,7 @@ jobs:
# Run c8: read raw files from the temp dir, and output an Istanbul JSON report.
# We only generate the 'json' report here because it's all nyc needs for merging.
echo "Server coverage report about to be generated..."
npx c8 report --reporter=json --temp-directory .coverage/tmp/integration-server --reports-dir .coverage/integration-server
npx c8 report --exclude='**/*.test.ts' --exclude='**/tests/**' --exclude='**/mocks/**' --reporter=json --temp-directory .coverage/tmp/integration-server --reports-dir .coverage/integration-server
echo "Server coverage report generated. Verifying existence:"
ls -l .coverage/integration-server/coverage-final.json
@@ -179,6 +198,7 @@ jobs:
# We give them unique names to be safe, though it's not strictly necessary.
cp .coverage/unit/coverage-final.json "$NYC_SOURCE_DIR/unit-coverage.json"
cp .coverage/integration/coverage-final.json "$NYC_SOURCE_DIR/integration-coverage.json"
cp .coverage/e2e/coverage-final.json "$NYC_SOURCE_DIR/e2e-coverage.json" || echo "E2E coverage file not found, skipping."
# This file might not exist if integration tests fail early, so we add `|| true`
cp .coverage/integration-server/coverage-final.json "$NYC_SOURCE_DIR/integration-server-coverage.json" || echo "Server coverage file not found, skipping."
echo "Copied coverage files to source directory. Contents:"
@@ -198,7 +218,10 @@ jobs:
--reporter=text \
--reporter=html \
--report-dir .coverage/ \
--temp-dir "$NYC_SOURCE_DIR" \
--exclude "**/*.test.ts" \
--exclude "**/tests/**" \
--exclude "**/mocks/**"
# Re-enable secret masking for subsequent steps.
echo "::secret-masking::"
@@ -249,18 +272,19 @@ jobs:
# We normalize line endings to ensure the hash is consistent across different OS environments.
CURRENT_HASH=$(cat sql/master_schema_rollup.sql | dos2unix | sha256sum | awk '{ print $1 }')
echo "Current Git Schema Hash: $CURRENT_HASH"
# Query the production database to get the hash of the deployed schema.
# The `psql` command requires PGPASSWORD to be set.
# `\t` sets tuples-only mode and `\A` unaligns output to get just the raw value.
# The `|| echo "none"` ensures the command doesn't fail if the table or row doesn't exist yet.
DEPLOYED_HASH=$(PGPASSWORD="$DB_PASSWORD" psql -v ON_ERROR_STOP=1 -h "$DB_HOST" -p 5432 -U "$DB_USER" -d "$DB_NAME" -c "SELECT schema_hash FROM public.schema_info WHERE environment = 'test';" -t -A || echo "none")
# The psql command will now fail the step if the query errors (e.g., column missing), preventing deployment on a bad schema.
DEPLOYED_HASH=$(PGPASSWORD="$DB_PASSWORD" psql -v ON_ERROR_STOP=1 -h "$DB_HOST" -p 5432 -U "$DB_USER" -d "$DB_NAME" -c "SELECT schema_hash FROM public.schema_info WHERE environment = 'test';" -t -A)
echo "Deployed DB Schema Hash: $DEPLOYED_HASH"
# Check if the hash is "none" (command failed) OR if it's an empty string (table exists but is empty).
if [ "$DEPLOYED_HASH" = "none" ] || [ -z "$DEPLOYED_HASH" ]; then
if [ -z "$DEPLOYED_HASH" ]; then
echo "WARNING: No schema hash found in the test database."
echo "This is expected for a first-time deployment. The hash will be set after a successful deployment."
echo "--- Debug: Dumping schema_info table ---"
PGPASSWORD="$DB_PASSWORD" psql -v ON_ERROR_STOP=0 -h "$DB_HOST" -p 5432 -U "$DB_USER" -d "$DB_NAME" -c "SELECT * FROM public.schema_info;" || true
echo "----------------------------------------"
# We allow the deployment to continue, but a manual schema update is required.
# You could choose to fail here by adding `exit 1`.
elif [ "$CURRENT_HASH" != "$DEPLOYED_HASH" ]; then
@@ -284,8 +308,9 @@ jobs:
fi
GITEA_SERVER_URL="https://gitea.projectium.com" # Your Gitea instance URL
COMMIT_MESSAGE=$(git log -1 --pretty=%s)
VITE_APP_VERSION="$(date +'%Y%m%d-%H%M'):$(git rev-parse --short HEAD)" \
COMMIT_MESSAGE=$(git log -1 --grep="\[skip ci\]" --invert-grep --pretty=%s)
PACKAGE_VERSION=$(node -p "require('./package.json').version")
VITE_APP_VERSION="$(date +'%Y%m%d-%H%M'):$(git rev-parse --short HEAD):$PACKAGE_VERSION" \
VITE_APP_COMMIT_URL="$GITEA_SERVER_URL/${{ gitea.repository }}/commit/${{ gitea.sha }}" \
VITE_APP_COMMIT_MESSAGE="$COMMIT_MESSAGE" \
VITE_API_BASE_URL="https://flyer-crawler-test.projectium.com/api" VITE_API_KEY=${{ secrets.VITE_GOOGLE_GENAI_API_KEY_TEST }} npm run build
@@ -347,7 +372,7 @@ jobs:
echo "Installing production dependencies and restarting test server..."
cd /var/www/flyer-crawler-test.projectium.com
npm install --omit=dev # Install only production dependencies
# Use `startOrReload` with the ecosystem file. This is the standard, idempotent way to deploy.
# It will START the process if it's not running, or RELOAD it if it is.
# We also add `&& pm2 save` to persist the process list across server reboots.
@@ -359,7 +384,12 @@ jobs:
echo "Updating schema hash in test database..."
CURRENT_HASH=$(cat sql/master_schema_rollup.sql | dos2unix | sha256sum | awk '{ print $1 }')
PGPASSWORD="$DB_PASSWORD" psql -v ON_ERROR_STOP=1 -h "$DB_HOST" -p 5432 -U "$DB_USER" -d "$DB_NAME" -c \
"INSERT INTO public.schema_info (environment, schema_hash, deployed_at) VALUES ('test', '$CURRENT_HASH', NOW())
"CREATE TABLE IF NOT EXISTS public.schema_info (
environment VARCHAR(50) PRIMARY KEY,
schema_hash VARCHAR(64) NOT NULL,
deployed_at TIMESTAMP DEFAULT NOW()
);
INSERT INTO public.schema_info (environment, schema_hash, deployed_at) VALUES ('test', '$CURRENT_HASH', NOW())
ON CONFLICT (environment) DO UPDATE SET schema_hash = EXCLUDED.schema_hash, deployed_at = NOW();"
# Verify the hash was updated


@@ -60,4 +60,4 @@ jobs:
uses: actions/upload-artifact@v3
with:
name: database-backup
path: ${{ env.backup_filename }}


@@ -144,4 +144,4 @@ jobs:
find "$APP_PATH/flyer-images" -type f -name '*-test-flyer-image.*' -delete
find "$APP_PATH/flyer-images/icons" -type f -name '*-test-flyer-image.*' -delete
find "$APP_PATH/flyer-images/archive" -mindepth 1 -maxdepth 1 -type f -delete || echo "Archive directory not found, skipping."
echo "✅ Flyer asset directories cleared."
echo "✅ Flyer asset directories cleared."


@@ -130,4 +130,4 @@ jobs:
find "$APP_PATH/flyer-images" -mindepth 1 -type f -delete
find "$APP_PATH/flyer-images/icons" -mindepth 1 -type f -delete
find "$APP_PATH/flyer-images/archive" -mindepth 1 -type f -delete || echo "Archive directory not found, skipping."
echo "✅ Test flyer asset directories cleared."
echo "✅ Test flyer asset directories cleared."


@@ -25,7 +25,7 @@ jobs:
DB_USER: ${{ secrets.DB_USER }}
DB_PASSWORD: ${{ secrets.DB_PASSWORD }}
DB_NAME: ${{ secrets.DB_DATABASE_PROD }}
BACKUP_DIR: "/var/www/backups" # Define a dedicated directory for backups
BACKUP_DIR: '/var/www/backups' # Define a dedicated directory for backups
steps:
- name: Validate Secrets and Inputs
@@ -92,4 +92,4 @@ jobs:
echo "Restarting application server..."
cd /var/www/flyer-crawler.projectium.com
pm2 startOrReload ecosystem.config.cjs --env production && pm2 save
echo "✅ Application server restarted."
echo "✅ Application server restarted."


@@ -0,0 +1,181 @@
# .gitea/workflows/manual-deploy-major.yml
#
# This workflow provides a MANUAL trigger to perform a MAJOR version bump
# and deploy the application to the PRODUCTION environment.
name: Manual - Deploy Major Version to Production
on:
workflow_dispatch:
inputs:
confirmation:
description: 'Type "deploy-major-to-prod" to confirm you want to deploy a new major version.'
required: true
default: 'do-not-run'
force_reload:
description: 'Force PM2 reload even if version matches (true/false).'
required: false
type: boolean
default: false
jobs:
deploy-production-major:
runs-on: projectium.com
steps:
- name: Verify Confirmation Phrase
run: |
if [ "${{ gitea.event.inputs.confirmation }}" != "deploy-major-to-prod" ]; then
echo "ERROR: Confirmation phrase did not match. Aborting deployment."
exit 1
fi
echo "✅ Confirmation accepted. Proceeding with major version production deployment."
- name: Checkout Code from 'main' branch
uses: actions/checkout@v3
with:
ref: 'main' # Explicitly check out the main branch for production deployment
fetch-depth: 0
- name: Setup Node.js
uses: actions/setup-node@v3
with:
node-version: '20'
cache: 'npm'
cache-dependency-path: '**/package-lock.json'
- name: Install Dependencies
run: npm ci
- name: Bump Major Version and Push
run: |
# Configure git for the commit.
git config --global user.name 'Gitea Actions'
git config --global user.email 'actions@gitea.projectium.com'
# Bump the major version number. This creates a new commit and a new tag.
# The commit message includes [skip ci] to prevent this push from triggering another workflow run.
npm version major -m "ci: Bump version to %s for major release [skip ci]"
# Push the new commit and the new tag back to the main branch.
git push --follow-tags
- name: Check for Production Database Schema Changes
env:
DB_HOST: ${{ secrets.DB_HOST }}
DB_USER: ${{ secrets.DB_USER }}
DB_PASSWORD: ${{ secrets.DB_PASSWORD }}
DB_NAME: ${{ secrets.DB_DATABASE_PROD }}
run: |
if [ -z "$DB_HOST" ] || [ -z "$DB_USER" ] || [ -z "$DB_PASSWORD" ] || [ -z "$DB_NAME" ]; then
echo "ERROR: One or more production database secrets (DB_HOST, DB_USER, DB_PASSWORD, DB_DATABASE_PROD) are not set."
exit 1
fi
echo "--- Checking for production schema changes ---"
CURRENT_HASH=$(cat sql/master_schema_rollup.sql | dos2unix | sha256sum | awk '{ print $1 }')
echo "Current Git Schema Hash: $CURRENT_HASH"
# The psql command will now fail the step if the query errors (e.g., column missing), preventing deployment on a bad schema.
DEPLOYED_HASH=$(PGPASSWORD="$DB_PASSWORD" psql -v ON_ERROR_STOP=1 -h "$DB_HOST" -p 5432 -U "$DB_USER" -d "$DB_NAME" -c "SELECT schema_hash FROM public.schema_info WHERE environment = 'production';" -t -A)
echo "Deployed DB Schema Hash: $DEPLOYED_HASH"
if [ -z "$DEPLOYED_HASH" ]; then
echo "WARNING: No schema hash found in the production database. This is expected for a first-time deployment."
elif [ "$CURRENT_HASH" != "$DEPLOYED_HASH" ]; then
echo "ERROR: Database schema mismatch detected! A manual database migration is required."
exit 1
else
echo "✅ Schema is up to date. No changes detected."
fi
- name: Build React Application for Production
run: |
if [ -z "${{ secrets.VITE_GOOGLE_GENAI_API_KEY }}" ]; then
echo "ERROR: The VITE_GOOGLE_GENAI_API_KEY secret is not set."
exit 1
fi
GITEA_SERVER_URL="https://gitea.projectium.com"
COMMIT_MESSAGE=$(git log -1 --grep="\[skip ci\]" --invert-grep --pretty=%s)
PACKAGE_VERSION=$(node -p "require('./package.json').version")
VITE_APP_VERSION="$(date +'%Y%m%d-%H%M'):$(git rev-parse --short HEAD):$PACKAGE_VERSION" \
VITE_APP_COMMIT_URL="$GITEA_SERVER_URL/${{ gitea.repository }}/commit/${{ gitea.sha }}" \
VITE_APP_COMMIT_MESSAGE="$COMMIT_MESSAGE" \
VITE_API_BASE_URL=/api VITE_API_KEY=${{ secrets.VITE_GOOGLE_GENAI_API_KEY }} npm run build
- name: Deploy Application to Production Server
run: |
echo "Deploying application files to /var/www/flyer-crawler.projectium.com..."
APP_PATH="/var/www/flyer-crawler.projectium.com"
mkdir -p "$APP_PATH"
mkdir -p "$APP_PATH/flyer-images/icons" "$APP_PATH/flyer-images/archive"
rsync -avz --delete --exclude 'node_modules' --exclude '.git' --exclude 'dist' --exclude 'flyer-images' ./ "$APP_PATH/"
rsync -avz dist/ "$APP_PATH"
echo "Application deployment complete."
- name: Install Backend Dependencies and Restart Production Server
env:
# --- Production Secrets Injection ---
DB_HOST: ${{ secrets.DB_HOST }}
DB_USER: ${{ secrets.DB_USER }}
DB_PASSWORD: ${{ secrets.DB_PASSWORD }}
DB_NAME: ${{ secrets.DB_DATABASE_PROD }}
REDIS_URL: 'redis://localhost:6379'
REDIS_PASSWORD: ${{ secrets.REDIS_PASSWORD_PROD }}
FRONTEND_URL: 'https://flyer-crawler.projectium.com'
JWT_SECRET: ${{ secrets.JWT_SECRET }}
GEMINI_API_KEY: ${{ secrets.VITE_GOOGLE_GENAI_API_KEY }}
GOOGLE_MAPS_API_KEY: ${{ secrets.GOOGLE_MAPS_API_KEY }}
SMTP_HOST: 'localhost'
SMTP_PORT: '1025'
SMTP_SECURE: 'false'
SMTP_USER: ''
SMTP_PASS: ''
SMTP_FROM_EMAIL: 'noreply@flyer-crawler.projectium.com'
run: |
if [ -z "$DB_HOST" ] || [ -z "$DB_USER" ] || [ -z "$DB_PASSWORD" ] || [ -z "$DB_NAME" ]; then
echo "ERROR: One or more production database secrets (DB_HOST, DB_USER, DB_PASSWORD, DB_DATABASE_PROD) are not set."
exit 1
fi
echo "Installing production dependencies and restarting server..."
cd /var/www/flyer-crawler.projectium.com
npm install --omit=dev
# --- Version Check Logic ---
# Get the version from the newly deployed package.json
NEW_VERSION=$(node -p "require('./package.json').version")
echo "Deployed Package Version: $NEW_VERSION"
# Get the running version from PM2 for the main API process
# We use a small node script to parse the JSON output from pm2 jlist
RUNNING_VERSION=$(pm2 jlist | node -e "try { const list = JSON.parse(require('fs').readFileSync(0, 'utf-8')); const app = list.find(p => p.name === 'flyer-crawler-api'); console.log(app ? app.pm2_env.version : ''); } catch(e) { console.log(''); }")
echo "Running PM2 Version: $RUNNING_VERSION"
if [ "${{ gitea.event.inputs.force_reload }}" == "true" ] || [ "$NEW_VERSION" != "$RUNNING_VERSION" ] || [ -z "$RUNNING_VERSION" ]; then
if [ "${{ gitea.event.inputs.force_reload }}" == "true" ]; then
echo "Force reload triggered by manual input. Reloading PM2..."
else
echo "Version mismatch (Running: $RUNNING_VERSION -> Deployed: $NEW_VERSION) or app not running. Reloading PM2..."
fi
pm2 startOrReload ecosystem.config.cjs --env production && pm2 save
echo "Production backend server reloaded successfully."
else
echo "Version $NEW_VERSION is already running. Skipping PM2 reload."
fi
echo "Updating schema hash in production database..."
CURRENT_HASH=$(cat sql/master_schema_rollup.sql | dos2unix | sha256sum | awk '{ print $1 }')
PGPASSWORD="$DB_PASSWORD" psql -v ON_ERROR_STOP=1 -h "$DB_HOST" -p 5432 -U "$DB_USER" -d "$DB_NAME" -c \
"INSERT INTO public.schema_info (environment, schema_hash, deployed_at) VALUES ('production', '$CURRENT_HASH', NOW())
ON CONFLICT (environment) DO UPDATE SET schema_hash = EXCLUDED.schema_hash, deployed_at = NOW();"
UPDATED_HASH=$(PGPASSWORD="$DB_PASSWORD" psql -v ON_ERROR_STOP=1 -h "$DB_HOST" -p 5432 -U "$DB_USER" -d "$DB_NAME" -c "SELECT schema_hash FROM public.schema_info WHERE environment = 'production';" -t -A)
if [ "$CURRENT_HASH" = "$UPDATED_HASH" ]; then
echo "✅ Schema hash successfully updated in the database to: $UPDATED_HASH"
else
echo "ERROR: Failed to update schema hash in the database."
fi
- name: Show PM2 Environment for Production
run: |
echo "--- Displaying recent PM2 logs for flyer-crawler-api ---"
sleep 5
pm2 describe flyer-crawler-api || echo "Could not find production pm2 process."
pm2 logs flyer-crawler-api --lines 20 --nostream || echo "Could not find production pm2 process."
pm2 env flyer-crawler-api || echo "Could not find production pm2 process."


@@ -6,4 +6,4 @@
"tabWidth": 2,
"useTabs": false,
"endOfLine": "auto"
}

Dockerfile.dev Normal file

@@ -0,0 +1,31 @@
# Use Ubuntu 22.04 (LTS) as the base image to match production
FROM ubuntu:22.04
# Set environment variables to non-interactive to avoid prompts during installation
ENV DEBIAN_FRONTEND=noninteractive
# Update package lists and install essential tools
# - curl: for downloading Node.js setup script
# - git: for version control operations
# - build-essential: for compiling native Node.js modules (node-gyp)
# - python3: required by some Node.js build tools
RUN apt-get update && apt-get install -y \
curl \
git \
build-essential \
python3 \
&& rm -rf /var/lib/apt/lists/*
# Install Node.js 20.x (LTS) from NodeSource
RUN curl -fsSL https://deb.nodesource.com/setup_20.x | bash - \
&& apt-get install -y nodejs
# Set the working directory inside the container
WORKDIR /app
# Set default environment variables for development
ENV NODE_ENV=development
ENV NODE_OPTIONS='--max-old-space-size=8192'
# Default command keeps the container running so you can attach to it
CMD ["bash"]

README.md

@@ -2,7 +2,7 @@
Flyer Crawler is a web application that uses the Google Gemini AI to extract, analyze, and manage data from grocery store flyers. Users can upload flyer images or PDFs, and the application will automatically identify items, prices, and sale dates, storing the structured data in a PostgreSQL database for historical analysis, price tracking, and personalized deal alerts.
We are working on an app to help people save money by finding good deals that are advertised only in store flyers/ads. So the primary purpose of the site is to make uploading flyers as easy and as accurate as possible, and to store people's needs, so sales can be matched to needs.
## Features
@@ -45,9 +45,9 @@ This project is configured to run in a CI/CD environment and does not use `.env`
1. **Set up a PostgreSQL database instance.**
2. **Run the Database Schema**:
- Connect to your database using a tool like `psql` or DBeaver.
- Open `sql/schema.sql.txt`, copy its entire contents, and execute it against your database.
- This will create all necessary tables, functions, and relationships.
### Step 2: Install Dependencies and Run the Application
@@ -79,11 +79,11 @@ sudo nano /etc/nginx/mime.types
change
application/javascript js;
TO
application/javascript js mjs;
RESTART NGINX
@@ -95,7 +95,7 @@ actually the proper change was to do this in the /etc/nginx/sites-available/flye
## for OAuth
1. Get Google OAuth Credentials
This is a crucial step that you must do outside the codebase:
Go to the Google Cloud Console.
@@ -112,7 +112,7 @@ Under Authorized redirect URIs, click ADD URI and enter the URL where Google wil
Click Create. You will be given a Client ID and a Client Secret.
2. Get GitHub OAuth Credentials
You'll need to obtain a Client ID and Client Secret from GitHub:
Go to your GitHub profile settings.
@@ -133,21 +133,23 @@ You will be given a Client ID and a Client Secret.
psql -h localhost -U flyer_crawler_user -d "flyer-crawler-prod" -W
## postgis
flyer-crawler-prod=> SELECT version();
version
------------------------------------------------------------------------------------------------------------------------------------------
PostgreSQL 14.19 (Ubuntu 14.19-0ubuntu0.22.04.1) on x86_64-pc-linux-gnu, compiled by gcc (Ubuntu 11.4.0-1ubuntu1~22.04.2) 11.4.0, 64-bit
(1 row)

flyer-crawler-prod=> SELECT PostGIS_Full_Version();
postgis_full_version
--------------------------------------------------------------------------------------------------------------------------------------------------------------------
POSTGIS="3.2.0 c3e3cc0" [EXTENSION] PGSQL="140" GEOS="3.10.2-CAPI-1.16.0" PROJ="8.2.1" LIBXML="2.9.12" LIBJSON="0.15" LIBPROTOBUF="1.3.3" WAGYU="0.5.0 (Internal)"
(1 row)
## production postgres setup
@@ -201,9 +203,13 @@ Step 4: Seed the Admin Account (If Needed)
Your application has a separate script to create the initial admin user. To run it, you must first set the required environment variables in your shell session.
# Set variables for the current session
export DB_USER=flyer_crawler_user DB_PASSWORD=your_password DB_NAME="flyer-crawler-prod" ...
# Run the seeding script
npx tsx src/db/seed_admin_account.ts
Your production database is now ready!
@@ -284,8 +290,6 @@ Test Execution: Your tests run against this clean, isolated schema.
This approach is faster, more reliable, and removes the need for sudo access within the CI pipeline.
gitea-runner@projectium:~$ pm2 install pm2-logrotate
[PM2][Module] Installing NPM pm2-logrotate module
[PM2][Module] Calling [NPM] to install pm2-logrotate ...
@@ -293,7 +297,7 @@ gitea-runner@projectium:~$ pm2 install pm2-logrotate
added 161 packages in 5s
21 packages are looking for funding
run `npm fund` for details
npm notice
npm notice New patch version of npm available! 11.6.3 -> 11.6.4
npm notice Changelog: https://github.com/npm/cli/releases/tag/v11.6.4
@@ -308,23 +312,23 @@ $ pm2 set pm2-logrotate:retain 30
$ pm2 set pm2-logrotate:compress false
$ pm2 set pm2-logrotate:dateFormat YYYY-MM-DD_HH-mm-ss
$ pm2 set pm2-logrotate:workerInterval 30
$ pm2 set pm2-logrotate:rotateInterval 0 0 * * *
$ pm2 set pm2-logrotate:rotateModule true
Modules configuration. Copy/Paste line to edit values.
[PM2][Module] Module successfully installed and launched
[PM2][Module] Checkout module options: `$ pm2 conf`
┌────┬───────────────────────────────────┬─────────────┬─────────┬─────────┬──────────┬────────┬──────┬───────────┬──────────┬──────────┬──────────┬──────────┐
│ id │ name                              │ namespace   │ version │ mode    │ pid      │ uptime │ ↺    │ status    │ cpu      │ mem      │ user     │ watching │
├────┼───────────────────────────────────┼─────────────┼─────────┼─────────┼──────────┼────────┼──────┼───────────┼──────────┼──────────┼──────────┼──────────┤
│ 2  │ flyer-crawler-analytics-worker    │ default     │ 0.0.0   │ fork    │ 3846981  │ 7m     │ 5    │ online    │ 0%       │ 55.8mb   │ git…     │ disabled │
│ 11 │ flyer-crawler-api                 │ default     │ 0.0.0   │ fork    │ 3846987  │ 7m     │ 0    │ online    │ 0%       │ 59.0mb   │ git…     │ disabled │
│ 12 │ flyer-crawler-worker              │ default     │ 0.0.0   │ fork    │ 3846988  │ 7m     │ 0    │ online    │ 0%       │ 54.2mb   │ git…     │ disabled │
└────┴───────────────────────────────────┴─────────────┴─────────┴─────────┴──────────┴────────┴──────┴───────────┴──────────┴──────────┴──────────┴──────────┘
Module
┌────┬──────────────────────────────┬───────────────┬──────────┬──────────┬──────┬──────────┬──────────┬──────────┐
│ id │ module                       │ version       │ pid      │ status   │ ↺    │ cpu      │ mem      │ user     │
├────┼──────────────────────────────┼───────────────┼──────────┼──────────┼──────┼──────────┼──────────┼──────────┤
│ 13 │ pm2-logrotate                │ 3.0.0         │ 3848878  │ online   │ 0    │ 0%       │ 20.1mb   │ git…     │
└────┴──────────────────────────────┴───────────────┴──────────┴──────────┴──────┴──────────┴──────────┴──────────┘
gitea-runner@projectium:~$ pm2 set pm2-logrotate:max_size 10M
[PM2] Module pm2-logrotate restarted
@@ -335,7 +339,7 @@ $ pm2 set pm2-logrotate:retain 30
$ pm2 set pm2-logrotate:compress false
$ pm2 set pm2-logrotate:dateFormat YYYY-MM-DD_HH-mm-ss
$ pm2 set pm2-logrotate:workerInterval 30
$ pm2 set pm2-logrotate:rotateInterval 0 0 * * *
$ pm2 set pm2-logrotate:rotateModule true
gitea-runner@projectium:~$ pm2 set pm2-logrotate:retain 14
[PM2] Module pm2-logrotate restarted
@@ -346,33 +350,31 @@ $ pm2 set pm2-logrotate:retain 14
$ pm2 set pm2-logrotate:compress false
$ pm2 set pm2-logrotate:dateFormat YYYY-MM-DD_HH-mm-ss
$ pm2 set pm2-logrotate:workerInterval 30
$ pm2 set pm2-logrotate:rotateInterval 0 0 * * *
$ pm2 set pm2-logrotate:rotateModule true
gitea-runner@projectium:~$
## dev server setup:
Here are the steps to set up the development environment on Windows using Podman with an Ubuntu container:
1. Install Prerequisites on Windows
Install WSL 2: Podman on Windows relies on the Windows Subsystem for Linux. Install it by running wsl --install in an administrator PowerShell.
Install Podman Desktop: Download and install Podman Desktop for Windows.
2. Set Up Podman
Initialize Podman: Launch Podman Desktop. It will automatically set up its WSL 2 machine.
Start Podman: Ensure the Podman machine is running from the Podman Desktop interface.
3. Set Up the Ubuntu Container
- Pull Ubuntu Image: Open a PowerShell or command prompt and pull the latest Ubuntu image:
podman pull ubuntu:latest
- Create a Podman Volume: Create a volume to persist node_modules and avoid installing them every time the container starts.
podman volume create node_modules_cache
- Run the Ubuntu Container: Start a new container with the project directory mounted and the necessary ports forwarded.
- Open a terminal in your project's root directory on Windows.
- Run the following command, replacing D:\gitea\flyer-crawler.projectium.com\flyer-crawler.projectium.com with the full path to your project:
podman run -it -p 3001:3001 -p 5173:5173 --name flyer-dev -v "D:\gitea\flyer-crawler.projectium.com\flyer-crawler.projectium.com:/app" -v "node_modules_cache:/app/node_modules" ubuntu:latest
@@ -383,46 +385,40 @@ podman run -it -p 3001:3001 -p 5173:5173 --name flyer-dev -v "D:\gitea\flyer-cra
-v "node_modules_cache:/app/node_modules": Mounts the named volume for node_modules.
4. Configure the Ubuntu Environment
You are now inside the Ubuntu container's shell.
- Update Package Lists:
apt-get update
- Install Dependencies: Install curl, git, and nodejs (which includes npm).
apt-get install -y curl git
curl -sL https://deb.nodesource.com/setup_20.x | bash -
apt-get install -y nodejs
- Navigate to Project Directory:
cd /app
- Install Project Dependencies:
npm install
5. Run the Development Server
- Start the Application:
npm run dev
6. Accessing the Application
- Frontend: Open your browser and go to http://localhost:5173.
- Backend: The frontend will make API calls to http://localhost:3001.
Managing the Environment
- Stopping the Container: Press Ctrl+C in the container terminal, then type exit.
- Restarting the Container:
podman start -a -i flyer-dev
## for me:
cd /mnt/d/gitea/flyer-crawler.projectium.com/flyer-crawler.projectium.com
podman run -it -p 3001:3001 -p 5173:5173 --name flyer-dev -v "$(pwd):/app" -v "node_modules_cache:/app/node_modules" ubuntu:latest
## rate limiting

The app respects the AI service's rate limits, making it more stable and robust. You can adjust the GEMINI_RPM environment variable in your production environment as needed without changing the code.
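A minimal sketch of what RPM-based throttling driven by `GEMINI_RPM` can look like (illustrative only; the app's actual limiter may be implemented differently):

```typescript
// Spaces out calls so that at most GEMINI_RPM requests start per minute.
const rpm = Number(process.env.GEMINI_RPM ?? '15'); // default value is an assumption
const intervalMs = 60_000 / rpm;
let lastScheduled = 0;

export async function throttleGemini(): Promise<void> {
  const now = Date.now();
  const wait = Math.max(0, lastScheduled + intervalMs - now);
  lastScheduled = now + wait; // reserve the next slot
  if (wait > 0) await new Promise((resolve) => setTimeout(resolve, wait));
}
```

Calling `await throttleGemini()` before each AI request means the delay adapts automatically when `GEMINI_RPM` changes.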

compose.dev.yml Normal file

@@ -0,0 +1,52 @@
version: '3.8'
services:
app:
container_name: flyer-crawler-dev
build:
context: .
dockerfile: Dockerfile.dev
volumes:
# Mount the current directory to /app in the container
- .:/app
# Create a volume for node_modules to avoid conflicts with Windows host
# and improve performance.
- node_modules_data:/app/node_modules
ports:
- '3000:3000' # Frontend (Vite default)
- '3001:3001' # Backend API
environment:
- NODE_ENV=development
- DB_HOST=postgres
- DB_USER=postgres
- DB_PASSWORD=postgres
- DB_NAME=flyer_crawler_dev
- REDIS_URL=redis://redis:6379
# Add other secrets here or use a .env file
depends_on:
- postgres
- redis
# Keep container running so VS Code can attach
command: tail -f /dev/null
postgres:
image: docker.io/library/postgis/postgis:15-3.4
container_name: flyer-crawler-postgres
ports:
- '5432:5432'
environment:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
POSTGRES_DB: flyer_crawler_dev
volumes:
- postgres_data:/var/lib/postgresql/data
redis:
image: docker.io/library/redis:alpine
container_name: flyer-crawler-redis
ports:
- '6379:6379'
volumes:
postgres_data:
node_modules_data:


@@ -34,7 +34,7 @@ We will adopt a strict, consistent error-handling contract for the service and r
**Robustness**: Eliminates an entire class of bugs where `undefined` is passed to `res.json()`, preventing incorrect `500` errors.
**Consistency & Predictability**: All data-fetching methods now have a predictable contract. They either return the expected data or throw a specific, typed error.
**Developer Experience**: Route handlers become simpler, cleaner, and easier to write correctly. The cognitive load on developers is reduced as they no longer need to remember to check for `undefined`.
**Improved Testability**: Tests become more reliable and realistic. Mocks can now throw the _exact_ error type (`new NotFoundError()`) that the real implementation would, ensuring tests accurately reflect the application's behavior.
**Centralized Control**: Error-to-HTTP-status logic is centralized in the `errorHandler` middleware, making it easy to manage and modify error responses globally.
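For illustration, a minimal sketch of this contract (the `NotFoundError` name comes from the ADR; everything else is an assumption, not the codebase's actual implementation):

```typescript
import { Request, Response, NextFunction } from 'express';

// A typed error that carries its own HTTP status.
export class NotFoundError extends Error {
  public readonly status = 404;
  constructor(message = 'Resource not found') {
    super(message);
    this.name = 'NotFoundError';
  }
}

// Centralized error-to-HTTP-status mapping (Express error middleware takes four arguments).
export const errorHandler = (err: Error, _req: Request, res: Response, _next: NextFunction) => {
  const status = err instanceof NotFoundError ? err.status : 500;
  res.status(status).json({ error: err.message });
};
```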
### Negative


@@ -10,21 +10,19 @@ Following the standardization of error handling in ADR-001, the next most common
This manual approach has several drawbacks:
**Repetitive Boilerplate**: The `try/catch/finally` block for transaction management is duplicated across multiple files.
**Error-Prone**: It is easy to forget to `client.release()` in all code paths, which can lead to connection pool exhaustion and bring down the application.
3. **Poor Composability**: It is difficult to compose multiple repository methods into a single, atomic "Unit of Work". For example, a service function that needs to update a user's points and create a budget in a single transaction cannot easily do so if both underlying repository methods create their own transactions.
## Decision
We will implement a standardized "Unit of Work" pattern through a high-level `withTransaction` helper function. This function will abstract away the complexity of transaction management.
1. **`withTransaction` Helper**: A new helper function, `withTransaction<T>(callback: (client: PoolClient) => Promise<T>): Promise<T>`, will be created. This function will be responsible for:
- Acquiring a client from the database pool.
- Starting a transaction (`BEGIN`).
- Executing the `callback` function, passing the transactional client to it.
- If the callback succeeds, it will `COMMIT` the transaction.
- If the callback throws an error, it will `ROLLBACK` the transaction and re-throw the error.
- In all cases, it will `RELEASE` the client back to the pool.
2. **Repository Method Signature**: Repository methods that need to be part of a transaction will be updated to optionally accept a `PoolClient` in their constructor or as a method parameter. By default, they will use the global pool. When called from within a `withTransaction` block, they will be passed the transactional client.
3. **Service Layer Orchestration**: Service-layer functions that orchestrate multi-step operations will use `withTransaction` to ensure atomicity. They will instantiate or call repository methods, providing them with the transactional client from the callback.
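A minimal sketch of such a helper, assuming a module-level `pg` pool (the actual implementation may differ):

```typescript
import { Pool, PoolClient } from 'pg';

const pool = new Pool(); // assumption: configured elsewhere in the codebase

export async function withTransaction<T>(
  callback: (client: PoolClient) => Promise<T>,
): Promise<T> {
  const client = await pool.connect();
  try {
    await client.query('BEGIN');
    const result = await callback(client); // run the unit of work
    await client.query('COMMIT');
    return result;
  } catch (error) {
    await client.query('ROLLBACK'); // undo everything on any failure
    throw error;
  } finally {
    client.release(); // always return the client to the pool
  }
}
```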
@@ -40,7 +38,7 @@ async function registerUserAndCreateDefaultList(userData) {
const shoppingRepo = new ShoppingRepository(client);
const newUser = await userRepo.createUser(userData);
await shoppingRepo.createShoppingList(newUser.user_id, 'My First List');
return newUser;
});


@@ -20,8 +20,8 @@ We will adopt a schema-based approach for input validation using the `zod` libra
1. **Adopt `zod` for Schema Definition**: We will use `zod` to define clear, type-safe schemas for the `params`, `query`, and `body` of each API request. `zod` provides powerful and declarative validation rules and automatically infers TypeScript types.
2. **Create a Reusable Validation Middleware**: A generic `validateRequest(schema)` middleware will be created. This middleware will take a `zod` schema, parse the incoming request against it, and handle success and error cases.
- On successful validation, the parsed and typed data will be attached to the `req` object (e.g., `req.body` will be replaced with the parsed body), and `next()` will be called.
- On validation failure, the middleware will call `next()` with a custom `ValidationError` containing a structured list of issues, which `ADR-001`'s `errorHandler` can then format into a user-friendly `400 Bad Request` response.
3. **Refactor Routes**: All route handlers will be refactored to use this new middleware, removing all manual validation logic.
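A sketch of what the middleware from point 2 could look like (`ValidationError` and its import path are assumptions carried over from ADR-001):

```typescript
import { Request, Response, NextFunction } from 'express';
import { ZodSchema } from 'zod';
import { ValidationError } from '../errors'; // hypothetical location of the custom error classes

export const validateRequest =
  (schema: ZodSchema) =>
  (req: Request, _res: Response, next: NextFunction) => {
    const result = schema.safeParse({
      params: req.params,
      query: req.query,
      body: req.body,
    });
    if (!result.success) {
      // Structured issues let the central errorHandler emit a 400 with details.
      return next(new ValidationError(result.error.issues));
    }
    // Replace request data with the parsed, typed values (e.g. req.body).
    req.params = result.data.params;
    req.query = result.data.query;
    req.body = result.data.body;
    next();
  };
```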
@@ -46,18 +46,18 @@ const getFlyerSchema = z.object({
type GetFlyerRequest = z.infer<typeof getFlyerSchema>;
// 3. Apply the middleware and use an inline cast for the request
router.get('/:id', validateRequest(getFlyerSchema), async (req, res, next) => {
  // Cast 'req' to the inferred type.
  // This provides full type safety for params, query, and body.
  const { params } = req as unknown as GetFlyerRequest;
  try {
    const flyer = await db.flyerRepo.getFlyerById(params.id); // params.id is 'number'
    res.json(flyer);
  } catch (error) {
    next(error);
  }
});
```
## Consequences


@@ -20,9 +20,9 @@ We will adopt a standardized, application-wide structured logging policy. All lo
**Request-Scoped Logger with Context**: We will create a middleware that runs at the beginning of the request lifecycle. This middleware will:
- Generate a unique `request_id` for each incoming request.
- Create a request-scoped logger instance (a "child logger") that automatically includes the `request_id`, `user_id` (if authenticated), and `ip_address` in every log message it generates.
- Attach this child logger to the `req` object (e.g., `req.log`).
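A minimal sketch of that middleware, assuming `pino` as the logging library (the ADR's actual `requestLogger`, elided in the example below, may differ):

```typescript
import { randomUUID } from 'crypto';
import { Request, Response, NextFunction } from 'express';
import pino from 'pino';

const logger = pino();

// Assumption: an Express type augmentation declaring `log` on Request.
declare global {
  namespace Express {
    interface Request {
      log: pino.Logger;
    }
  }
}

export const requestLogger = (req: Request, _res: Response, next: NextFunction) => {
  // The child logger stamps every message with the request's context.
  req.log = logger.child({
    request_id: randomUUID(),
    user_id: (req as any).user?.user_id, // present only once authenticated
    ip_address: req.ip,
  });
  next();
};
```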
**Mandatory Use of Request-Scoped Logger**: All route handlers and any service functions called by them **MUST** use the request-scoped logger (`req.log`) instead of the global logger instance. This ensures all logs for a given request are automatically correlated.
@@ -32,9 +32,9 @@ We will adopt a standardized, application-wide structured logging policy. All lo
**Standardized Logging Practices**:
**INFO**: Log key business events, such as `User logged in` or `Flyer processed`.
**WARN**: Log recoverable errors or unusual situations that do not break the request, such as `Client Error: 404 on GET /api/non-existent-route` or `Retrying failed database connection`.
**ERROR**: Log only unhandled or server-side errors that cause a request to fail (typically handled by the `errorHandler`). Avoid logging expected client errors (like 4xx) at this level.
**DEBUG**: Log detailed diagnostic information useful during development, such as function entry/exit points or variable states.
**WARN**: Log recoverable errors or unusual situations that do not break the request, such as `Client Error: 404 on GET /api/non-existent-route` or `Retrying failed database connection`.
**ERROR**: Log only unhandled or server-side errors that cause a request to fail (typically handled by the `errorHandler`). Avoid logging expected client errors (like 4xx) at this level.
**DEBUG**: Log detailed diagnostic information useful during development, such as function entry/exit points or variable states.
### Example Usage
@@ -59,15 +59,15 @@ export const requestLogger = (req, res, next) => {
// In a route handler:
router.get('/:id', async (req, res, next) => {
// Use the request-scoped logger
req.log.info({ flyerId: req.params.id }, 'Fetching flyer by ID');
try {
// ... business logic ...
res.json(flyer);
} catch (error) {
// The error itself will be logged with full context by the errorHandler
next(error);
}
// Use the request-scoped logger
req.log.info({ flyerId: req.params.id }, 'Fetching flyer by ID');
try {
// ... business logic ...
res.json(flyer);
} catch (error) {
// The error itself will be logged with full context by the errorHandler
next(error);
}
});
```
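The `requestLogger` body is elided by the hunk above; a minimal sketch consistent with the policy (the import path and the `req.user` shape are assumptions):

```ts
import { randomUUID } from 'node:crypto';
import type { Request, Response, NextFunction } from 'express';
import { logger } from './logger.server'; // assumed path to the global pino instance

export const requestLogger = (req: Request, _res: Response, next: NextFunction) => {
  // A pino child logger stamps every subsequent log line with request context.
  req.log = logger.child({
    request_id: randomUUID(),
    user_id: (req as { user?: { user_id?: string } }).user?.user_id, // set by auth middleware, if any
    ip_address: req.ip,
  });
  next();
};
```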

View File

@@ -14,5 +14,5 @@ We will formalize a centralized Role-Based Access Control (RBAC) or Attribute-Ba
## Consequences
* **Positive**: Ensures authorization logic is consistent, easy to audit, and decoupled from business logic. Improves security by centralizing access control.
* **Negative**: Requires a significant refactoring effort to integrate the new authorization system across all protected routes and features. Introduces a new dependency if an external library is chosen.
- **Positive**: Ensures authorization logic is consistent, easy to audit, and decoupled from business logic. Improves security by centralizing access control.
- **Negative**: Requires a significant refactoring effort to integrate the new authorization system across all protected routes and features. Introduces a new dependency if an external library is chosen.
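Purely as an illustration (this ADR does not commit to an API), a centralized check could be a middleware factory; `requirePermission` and the policy table are hypothetical:

```ts
import type { Request, Response, NextFunction } from 'express';

// Hypothetical role-to-permission table; a real system would load this from config or a library.
const policy: Record<string, string[]> = {
  admin: ['flyer:delete', 'flyer:edit'],
  user: ['flyer:edit'],
};

export const requirePermission =
  (permission: string) => (req: Request, res: Response, next: NextFunction) => {
    const role = (req as { user?: { role?: string } }).user?.role ?? 'anonymous';
    if (!(policy[role]?.includes(permission) ?? false)) {
      return res.status(403).json({ error: 'Forbidden' });
    }
    next();
  };

// Usage: router.delete('/:id', requirePermission('flyer:delete'), handler);
```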

View File

@@ -14,5 +14,5 @@ We will establish a formal Design System and Component Library. This will involv
## Consequences
* **Positive**: Ensures a consistent and high-quality user interface. Accelerates frontend development by providing reusable, well-documented components. Improves maintainability and reduces technical debt.
* **Negative**: Requires an initial investment in setting up Storybook and migrating existing components. Adds a new dependency and a new workflow for frontend development.
- **Positive**: Ensures a consistent and high-quality user interface. Accelerates frontend development by providing reusable, well-documented components. Improves maintainability and reduces technical debt.
- **Negative**: Requires an initial investment in setting up Storybook and migrating existing components. Adds a new dependency and a new workflow for frontend development.

View File

@@ -14,5 +14,5 @@ We will adopt a dedicated database migration tool, such as **`node-pg-migrate`**
## Consequences
* **Positive**: Provides a safe, repeatable, and reversible way to evolve the database schema. Improves team collaboration on database changes. Reduces the risk of data loss or downtime during deployments.
* **Negative**: Requires an initial setup and learning curve for the chosen migration tool. All future schema changes must adhere to the migration workflow.
- **Positive**: Provides a safe, repeatable, and reversible way to evolve the database schema. Improves team collaboration on database changes. Reduces the risk of data loss or downtime during deployments.
- **Negative**: Requires an initial setup and learning curve for the chosen migration tool. All future schema changes must adhere to the migration workflow.
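If `node-pg-migrate` is the tool picked, each migration is a module exporting `up`/`down` functions; a generic sketch (the column is invented for illustration):

```ts
import type { MigrationBuilder } from 'node-pg-migrate';

export const up = (pgm: MigrationBuilder): void => {
  // Hypothetical change, shown only to illustrate the workflow.
  pgm.addColumn('flyers', { source_url: { type: 'text' } });
};

export const down = (pgm: MigrationBuilder): void => {
  pgm.dropColumn('flyers', 'source_url');
};
```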

View File

@@ -14,5 +14,5 @@ We will standardize the deployment process by containerizing the application usi
## Consequences
* **Positive**: Ensures consistency between development and production environments. Simplifies the setup for new developers. Improves portability and scalability of the application.
* **Negative**: Requires learning Docker and containerization concepts. Adds `Dockerfile` and `docker-compose.yml` to the project's configuration.
- **Positive**: Ensures consistency between development and production environments. Simplifies the setup for new developers. Improves portability and scalability of the application.
- **Negative**: Requires learning Docker and containerization concepts. Adds `Dockerfile` and `docker-compose.yml` to the project's configuration.

View File

@@ -18,5 +18,5 @@ We will implement a multi-layered security approach for the API:
## Consequences
* **Positive**: Significantly improves the application's security posture against common web vulnerabilities like XSS, clickjacking, and brute-force attacks.
* **Negative**: Requires careful configuration of CORS and rate limits to avoid blocking legitimate traffic. Content-Security-Policy can be complex to configure correctly.
- **Positive**: Significantly improves the application's security posture against common web vulnerabilities like XSS, clickjacking, and brute-force attacks.
- **Negative**: Requires careful configuration of CORS and rate limits to avoid blocking legitimate traffic. Content-Security-Policy can be complex to configure correctly.
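The usual Express wiring for this layered approach looks roughly like the following; the library choices (helmet, cors, express-rate-limit) and the limits are assumptions:

```ts
import express from 'express';
import helmet from 'helmet';
import cors from 'cors';
import rateLimit from 'express-rate-limit';

const app = express();

app.use(helmet()); // security headers: Content-Security-Policy, X-Frame-Options, etc.
app.use(cors({ origin: process.env.FRONTEND_URL })); // allow only the known frontend origin
app.use('/api', rateLimit({ windowMs: 15 * 60 * 1000, limit: 100 })); // 100 req/IP per 15 min
```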

View File

@@ -14,5 +14,5 @@ We will formalize the end-to-end CI/CD process. This ADR will define the project
## Consequences
* **Positive**: Automates quality control and creates a safe, repeatable path to production. Increases development velocity and reduces deployment-related errors.
* **Negative**: Initial setup effort for the CI/CD pipeline. May slightly increase the time to merge code due to mandatory checks.
- **Positive**: Automates quality control and creates a safe, repeatable path to production. Increases development velocity and reduces deployment-related errors.
- **Negative**: Initial setup effort for the CI/CD pipeline. May slightly increase the time to merge code due to mandatory checks.

View File

@@ -14,5 +14,5 @@ We will adopt **OpenAPI (Swagger)** for API documentation. We will use tools (e.
## Consequences
* **Positive**: Creates a single source of truth for API documentation that stays in sync with the code. Enables auto-generation of client SDKs and simplifies testing.
* **Negative**: Requires developers to maintain JSDoc annotations on all routes. Adds a build step and new dependencies to the project.
- **Positive**: Creates a single source of truth for API documentation that stays in sync with the code. Enables auto-generation of client SDKs and simplifies testing.
- **Negative**: Requires developers to maintain JSDoc annotations on all routes. Adds a build step and new dependencies to the project.
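One common realization (assumed here, since the hunk truncates the tool list) pairs `swagger-jsdoc` with `swagger-ui-express`:

```ts
import express from 'express';
import swaggerJsdoc from 'swagger-jsdoc';
import swaggerUi from 'swagger-ui-express';

const app = express();

const spec = swaggerJsdoc({
  definition: {
    openapi: '3.0.0',
    info: { title: 'Flyer Crawler API', version: '1.0.0' },
  },
  apis: ['./src/routes/*.ts'], // route files carrying OpenAPI JSDoc annotations
});

app.use('/api-docs', swaggerUi.serve, swaggerUi.setup(spec));
```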

View File

@@ -14,5 +14,5 @@ We will implement a formal data backup and recovery strategy. This will involve
## Consequences
* **Positive**: Protects against catastrophic data loss, ensuring business continuity. Provides a clear, tested plan for disaster recovery.
* **Negative**: Requires setup and maintenance of backup scripts and secure storage. Incurs storage costs for backup files.
- **Positive**: Protects against catastrophic data loss, ensuring business continuity. Provides a clear, tested plan for disaster recovery.
- **Negative**: Requires setup and maintenance of backup scripts and secure storage. Incurs storage costs for backup files.

View File

@@ -12,11 +12,11 @@ When the application is containerized (`ADR-014`), the container orchestrator (e
We will implement dedicated health check endpoints in the Express application.
* A **Liveness Probe** (`/api/health/live`) will return a `200 OK` to indicate the server is running. If it fails, the orchestrator should restart the container.
- A **Liveness Probe** (`/api/health/live`) will return a `200 OK` to indicate the server is running. If it fails, the orchestrator should restart the container.
* A **Readiness Probe** (`/api/health/ready`) will return a `200 OK` only if the application is ready to accept traffic (e.g., database connection is established). If it fails, the orchestrator will temporarily remove the container from the load balancer.
- A **Readiness Probe** (`/api/health/ready`) will return a `200 OK` only if the application is ready to accept traffic (e.g., database connection is established). If it fails, the orchestrator will temporarily remove the container from the load balancer.
## Consequences
* **Positive**: Enables robust, automated application lifecycle management in a containerized environment. Prevents traffic from being sent to unhealthy or uninitialized application instances.
* **Negative**: Adds a small amount of code for the health check endpoints. Requires configuration in the container orchestration layer.
- **Positive**: Enables robust, automated application lifecycle management in a containerized environment. Prevents traffic from being sent to unhealthy or uninitialized application instances.
- **Negative**: Adds a small amount of code for the health check endpoints. Requires configuration in the container orchestration layer.
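A minimal sketch of the two probes (the `pool` import path is an assumption):

```ts
import { Router } from 'express';
import { pool } from './db'; // assumed module exposing the shared pg Pool

export const healthRouter = Router();

// Liveness: the process is up and handling requests.
healthRouter.get('/live', (_req, res) => res.status(200).json({ status: 'ok' }));

// Readiness: dependencies (here, the database) are reachable.
healthRouter.get('/ready', async (_req, res) => {
  try {
    await pool.query('SELECT 1');
    res.status(200).json({ status: 'ready' });
  } catch {
    res.status(503).json({ status: 'not ready' });
  }
});

// Mounted as: app.use('/api/health', healthRouter);
```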

View File

@@ -24,8 +24,8 @@ We will adopt a standardized, application-wide structured logging policy for all
**2. Pino-like API for Structured Logging**: The client logger mimics the `pino` API, which is the standard on the backend. It supports two primary call signatures:
* `logger.info('A simple message');`
* `logger.info({ key: 'value' }, 'A message with a structured data payload');`
- `logger.info('A simple message');`
- `logger.info({ key: 'value' }, 'A message with a structured data payload');`
The second signature, which includes a data object as the first argument, is **strongly preferred**, especially for logging errors or complex state.
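Concretely, the preferred shape when logging an error (module path and identifiers are illustrative; the `err` key matches pino's serializer convention):

```ts
import { logger } from './services/logger.client'; // assumed client logger module

const componentId = 'flyer-grid'; // example value

export async function loadComponentData(): Promise<void> {
  try {
    const res = await fetch(`/api/components/${componentId}`);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
  } catch (err) {
    // Preferred signature: structured payload first, message second.
    logger.error({ err, componentId }, 'Failed to fetch component data');
  }
}

// The message-only signature remains fine for simple breadcrumbs:
logger.info('Component module loaded');
```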
@@ -79,7 +79,7 @@ describe('MyComponent', () => {
// Assert that the logger was called with the expected structure
expect(logger.error).toHaveBeenCalledWith(
expect.objectContaining({ err: expect.any(Error) }), // Check for the error object
'Failed to fetch component data' // Check for the message
'Failed to fetch component data', // Check for the message
);
});
});

View File

@@ -15,15 +15,73 @@ module.exports = {
args: 'server.ts', // tsx will execute this file
// Production Environment Settings
env_production: {
NODE_ENV: 'production', // Set the Node.js environment to production
NODE_ENV: 'production', // Set the Node.js environment to production
name: 'flyer-crawler-api',
cwd: '/var/www/flyer-crawler.projectium.com',
// Inherit secrets from the deployment environment
DB_HOST: process.env.DB_HOST,
DB_USER: process.env.DB_USER,
DB_PASSWORD: process.env.DB_PASSWORD,
DB_NAME: process.env.DB_NAME,
REDIS_URL: process.env.REDIS_URL,
REDIS_PASSWORD: process.env.REDIS_PASSWORD,
FRONTEND_URL: process.env.FRONTEND_URL,
JWT_SECRET: process.env.JWT_SECRET,
GEMINI_API_KEY: process.env.GEMINI_API_KEY,
GOOGLE_MAPS_API_KEY: process.env.GOOGLE_MAPS_API_KEY,
SMTP_HOST: process.env.SMTP_HOST,
SMTP_PORT: process.env.SMTP_PORT,
SMTP_SECURE: process.env.SMTP_SECURE,
SMTP_USER: process.env.SMTP_USER,
SMTP_PASS: process.env.SMTP_PASS,
SMTP_FROM_EMAIL: process.env.SMTP_FROM_EMAIL,
},
// Test Environment Settings
env_test: {
NODE_ENV: 'development', // Use 'development' for test to enable more verbose logging if needed
NODE_ENV: 'test', // Set to 'test' to match the environment purpose and disable pino-pretty
name: 'flyer-crawler-api-test',
cwd: '/var/www/flyer-crawler-test.projectium.com',
// Inherit secrets from the deployment environment
DB_HOST: process.env.DB_HOST,
DB_USER: process.env.DB_USER,
DB_PASSWORD: process.env.DB_PASSWORD,
DB_NAME: process.env.DB_NAME,
REDIS_URL: process.env.REDIS_URL,
REDIS_PASSWORD: process.env.REDIS_PASSWORD,
FRONTEND_URL: process.env.FRONTEND_URL,
JWT_SECRET: process.env.JWT_SECRET,
GEMINI_API_KEY: process.env.GEMINI_API_KEY,
GOOGLE_MAPS_API_KEY: process.env.GOOGLE_MAPS_API_KEY,
SMTP_HOST: process.env.SMTP_HOST,
SMTP_PORT: process.env.SMTP_PORT,
SMTP_SECURE: process.env.SMTP_SECURE,
SMTP_USER: process.env.SMTP_USER,
SMTP_PASS: process.env.SMTP_PASS,
SMTP_FROM_EMAIL: process.env.SMTP_FROM_EMAIL,
},
// Development Environment Settings
env_development: {
NODE_ENV: 'development',
name: 'flyer-crawler-api-dev',
watch: true,
ignore_watch: ['node_modules', 'logs', '*.log', 'flyer-images', '.git'],
// Inherit secrets from the deployment environment
DB_HOST: process.env.DB_HOST,
DB_USER: process.env.DB_USER,
DB_PASSWORD: process.env.DB_PASSWORD,
DB_NAME: process.env.DB_NAME,
REDIS_URL: process.env.REDIS_URL,
REDIS_PASSWORD: process.env.REDIS_PASSWORD,
FRONTEND_URL: process.env.FRONTEND_URL,
JWT_SECRET: process.env.JWT_SECRET,
GEMINI_API_KEY: process.env.GEMINI_API_KEY,
GOOGLE_MAPS_API_KEY: process.env.GOOGLE_MAPS_API_KEY,
SMTP_HOST: process.env.SMTP_HOST,
SMTP_PORT: process.env.SMTP_PORT,
SMTP_SECURE: process.env.SMTP_SECURE,
SMTP_USER: process.env.SMTP_USER,
SMTP_PASS: process.env.SMTP_PASS,
SMTP_FROM_EMAIL: process.env.SMTP_FROM_EMAIL,
},
},
{
@@ -36,12 +94,70 @@ module.exports = {
NODE_ENV: 'production',
name: 'flyer-crawler-worker',
cwd: '/var/www/flyer-crawler.projectium.com',
// Inherit secrets from the deployment environment
DB_HOST: process.env.DB_HOST,
DB_USER: process.env.DB_USER,
DB_PASSWORD: process.env.DB_PASSWORD,
DB_NAME: process.env.DB_NAME,
REDIS_URL: process.env.REDIS_URL,
REDIS_PASSWORD: process.env.REDIS_PASSWORD,
FRONTEND_URL: process.env.FRONTEND_URL,
JWT_SECRET: process.env.JWT_SECRET,
GEMINI_API_KEY: process.env.GEMINI_API_KEY,
GOOGLE_MAPS_API_KEY: process.env.GOOGLE_MAPS_API_KEY,
SMTP_HOST: process.env.SMTP_HOST,
SMTP_PORT: process.env.SMTP_PORT,
SMTP_SECURE: process.env.SMTP_SECURE,
SMTP_USER: process.env.SMTP_USER,
SMTP_PASS: process.env.SMTP_PASS,
SMTP_FROM_EMAIL: process.env.SMTP_FROM_EMAIL,
},
// Test Environment Settings
env_test: {
NODE_ENV: 'development',
NODE_ENV: 'test',
name: 'flyer-crawler-worker-test',
cwd: '/var/www/flyer-crawler-test.projectium.com',
// Inherit secrets from the deployment environment
DB_HOST: process.env.DB_HOST,
DB_USER: process.env.DB_USER,
DB_PASSWORD: process.env.DB_PASSWORD,
DB_NAME: process.env.DB_NAME,
REDIS_URL: process.env.REDIS_URL,
REDIS_PASSWORD: process.env.REDIS_PASSWORD,
FRONTEND_URL: process.env.FRONTEND_URL,
JWT_SECRET: process.env.JWT_SECRET,
GEMINI_API_KEY: process.env.GEMINI_API_KEY,
GOOGLE_MAPS_API_KEY: process.env.GOOGLE_MAPS_API_KEY,
SMTP_HOST: process.env.SMTP_HOST,
SMTP_PORT: process.env.SMTP_PORT,
SMTP_SECURE: process.env.SMTP_SECURE,
SMTP_USER: process.env.SMTP_USER,
SMTP_PASS: process.env.SMTP_PASS,
SMTP_FROM_EMAIL: process.env.SMTP_FROM_EMAIL,
},
// Development Environment Settings
env_development: {
NODE_ENV: 'development',
name: 'flyer-crawler-worker-dev',
watch: true,
ignore_watch: ['node_modules', 'logs', '*.log', 'flyer-images', '.git'],
// Inherit secrets from the deployment environment
DB_HOST: process.env.DB_HOST,
DB_USER: process.env.DB_USER,
DB_PASSWORD: process.env.DB_PASSWORD,
DB_NAME: process.env.DB_NAME,
REDIS_URL: process.env.REDIS_URL,
REDIS_PASSWORD: process.env.REDIS_PASSWORD,
FRONTEND_URL: process.env.FRONTEND_URL,
JWT_SECRET: process.env.JWT_SECRET,
GEMINI_API_KEY: process.env.GEMINI_API_KEY,
GOOGLE_MAPS_API_KEY: process.env.GOOGLE_MAPS_API_KEY,
SMTP_HOST: process.env.SMTP_HOST,
SMTP_PORT: process.env.SMTP_PORT,
SMTP_SECURE: process.env.SMTP_SECURE,
SMTP_USER: process.env.SMTP_USER,
SMTP_PASS: process.env.SMTP_PASS,
SMTP_FROM_EMAIL: process.env.SMTP_FROM_EMAIL,
},
},
{
@@ -54,13 +170,71 @@ module.exports = {
NODE_ENV: 'production',
name: 'flyer-crawler-analytics-worker',
cwd: '/var/www/flyer-crawler.projectium.com',
// Inherit secrets from the deployment environment
DB_HOST: process.env.DB_HOST,
DB_USER: process.env.DB_USER,
DB_PASSWORD: process.env.DB_PASSWORD,
DB_NAME: process.env.DB_NAME,
REDIS_URL: process.env.REDIS_URL,
REDIS_PASSWORD: process.env.REDIS_PASSWORD,
FRONTEND_URL: process.env.FRONTEND_URL,
JWT_SECRET: process.env.JWT_SECRET,
GEMINI_API_KEY: process.env.GEMINI_API_KEY,
GOOGLE_MAPS_API_KEY: process.env.GOOGLE_MAPS_API_KEY,
SMTP_HOST: process.env.SMTP_HOST,
SMTP_PORT: process.env.SMTP_PORT,
SMTP_SECURE: process.env.SMTP_SECURE,
SMTP_USER: process.env.SMTP_USER,
SMTP_PASS: process.env.SMTP_PASS,
SMTP_FROM_EMAIL: process.env.SMTP_FROM_EMAIL,
},
// Test Environment Settings
env_test: {
NODE_ENV: 'development',
NODE_ENV: 'test',
name: 'flyer-crawler-analytics-worker-test',
cwd: '/var/www/flyer-crawler-test.projectium.com',
// Inherit secrets from the deployment environment
DB_HOST: process.env.DB_HOST,
DB_USER: process.env.DB_USER,
DB_PASSWORD: process.env.DB_PASSWORD,
DB_NAME: process.env.DB_NAME,
REDIS_URL: process.env.REDIS_URL,
REDIS_PASSWORD: process.env.REDIS_PASSWORD,
FRONTEND_URL: process.env.FRONTEND_URL,
JWT_SECRET: process.env.JWT_SECRET,
GEMINI_API_KEY: process.env.GEMINI_API_KEY,
GOOGLE_MAPS_API_KEY: process.env.GOOGLE_MAPS_API_KEY,
SMTP_HOST: process.env.SMTP_HOST,
SMTP_PORT: process.env.SMTP_PORT,
SMTP_SECURE: process.env.SMTP_SECURE,
SMTP_USER: process.env.SMTP_USER,
SMTP_PASS: process.env.SMTP_PASS,
SMTP_FROM_EMAIL: process.env.SMTP_FROM_EMAIL,
},
// Development Environment Settings
env_development: {
NODE_ENV: 'development',
name: 'flyer-crawler-analytics-worker-dev',
watch: true,
ignore_watch: ['node_modules', 'logs', '*.log', 'flyer-images', '.git'],
// Inherit secrets from the deployment environment
DB_HOST: process.env.DB_HOST,
DB_USER: process.env.DB_USER,
DB_PASSWORD: process.env.DB_PASSWORD,
DB_NAME: process.env.DB_NAME,
REDIS_URL: process.env.REDIS_URL,
REDIS_PASSWORD: process.env.REDIS_PASSWORD,
FRONTEND_URL: process.env.FRONTEND_URL,
JWT_SECRET: process.env.JWT_SECRET,
GEMINI_API_KEY: process.env.GEMINI_API_KEY,
GOOGLE_MAPS_API_KEY: process.env.GOOGLE_MAPS_API_KEY,
SMTP_HOST: process.env.SMTP_HOST,
SMTP_PORT: process.env.SMTP_PORT,
SMTP_SECURE: process.env.SMTP_SECURE,
SMTP_USER: process.env.SMTP_USER,
SMTP_PASS: process.env.SMTP_PASS,
SMTP_FROM_EMAIL: process.env.SMTP_FROM_EMAIL,
},
},
],
};
};
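The committed config repeats the same secret block for every app and environment; one possible consolidation (not what was committed) is to build the block once and spread it:

```js
// Sketch only: shared env built once, spread into each env_* object.
const sharedEnv = {
  DB_HOST: process.env.DB_HOST,
  DB_USER: process.env.DB_USER,
  DB_PASSWORD: process.env.DB_PASSWORD,
  DB_NAME: process.env.DB_NAME,
  REDIS_URL: process.env.REDIS_URL,
  REDIS_PASSWORD: process.env.REDIS_PASSWORD,
  FRONTEND_URL: process.env.FRONTEND_URL,
  JWT_SECRET: process.env.JWT_SECRET,
  GEMINI_API_KEY: process.env.GEMINI_API_KEY,
  GOOGLE_MAPS_API_KEY: process.env.GOOGLE_MAPS_API_KEY,
  // ...SMTP_* entries would follow the same pattern
};

module.exports = {
  apps: [
    {
      name: 'flyer-crawler-api',
      env_production: { NODE_ENV: 'production', ...sharedEnv },
      env_test: { NODE_ENV: 'test', ...sharedEnv },
      env_development: { NODE_ENV: 'development', watch: true, ...sharedEnv },
    },
    // ...the worker apps would repeat the same three env_* lines
  ],
};
```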

View File

@@ -1,21 +1,21 @@
import globals from "globals";
import tseslint from "typescript-eslint";
import pluginReact from "eslint-plugin-react";
import pluginReactHooks from "eslint-plugin-react-hooks";
import pluginReactRefresh from "eslint-plugin-react-refresh";
import globals from 'globals';
import tseslint from 'typescript-eslint';
import pluginReact from 'eslint-plugin-react';
import pluginReactHooks from 'eslint-plugin-react-hooks';
import pluginReactRefresh from 'eslint-plugin-react-refresh';
export default tseslint.config(
{
// Global ignores
ignores: ["dist", ".gitea", "node_modules", "*.cjs"],
ignores: ['dist', '.gitea', 'node_modules', '*.cjs'],
},
{
// All files
files: ["**/*.{js,mjs,cjs,ts,jsx,tsx}"],
files: ['**/*.{js,mjs,cjs,ts,jsx,tsx}'],
plugins: {
react: pluginReact,
"react-hooks": pluginReactHooks,
"react-refresh": pluginReactRefresh,
'react-hooks': pluginReactHooks,
'react-refresh': pluginReactRefresh,
},
languageOptions: {
globals: {
@@ -24,12 +24,9 @@ export default tseslint.config(
},
},
rules: {
"react-refresh/only-export-components": [
"warn",
{ allowConstantExport: true },
],
'react-refresh/only-export-components': ['warn', { allowConstantExport: true }],
},
},
// TypeScript files
...tseslint.configs.recommended,
);
);

express.d.ts (vendored, 4 lines changed)
View File

@@ -1,4 +1,4 @@
// src/types/express.d.ts
// express.d.ts
import { Logger } from 'pino';
/**
@@ -12,4 +12,4 @@ declare global {
log: Logger;
}
}
}
}

View File

@@ -1,20 +1,20 @@
<!DOCTYPE html>
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Grocery Flyer AI Analyzer</title>
<style>
@import url('https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap');
body {
font-family: 'Inter', sans-serif;
}
</style>
<!-- The stylesheet will be injected here by Vite during the build process -->
</head>
<body>
<div id="root"></div>
<!-- Vite will inject the correct <script> tag here during the build process -->
<script type="module" src="/src/index.tsx"></script>
</body>
</html>
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Grocery Flyer AI Analyzer</title>
<style>
@import url('https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap');
body {
font-family: 'Inter', sans-serif;
}
</style>
<!-- The stylesheet will be injected here by Vite during the build process -->
</head>
<body>
<div id="root"></div>
<!-- Vite will inject the correct <script> tag here during the build process -->
<script type="module" src="/src/index.tsx"></script>
</body>
</html>

View File

@@ -1,8 +1,5 @@
{
"name": "Flyer Crawler",
"description": "Upload a grocery store flyer image to extract item details, prices, and quantities using AI. Get insights, meal plans, and compare prices to save money on your shopping.",
"requestFramePermissions": [
"geolocation",
"microphone"
]
}
"requestFramePermissions": ["geolocation", "microphone"]
}

package-lock.json (generated, 4283 lines changed): diff suppressed because it is too large.

View File

@@ -1,17 +1,19 @@
{
"name": "flyer-crawler",
"private": true,
"version": "0.0.4",
"version": "0.1.2",
"type": "module",
"scripts": {
"dev": "concurrently \"npm:start:dev\" \"vite\"",
"dev:container": "concurrently \"npm:start:dev\" \"vite --host\"",
"start": "npm run start:prod",
"build": "vite build",
"preview": "vite preview",
"test": "NODE_ENV=test tsx ./node_modules/vitest/vitest.mjs run",
"test": "cross-env NODE_ENV=test tsx ./node_modules/vitest/vitest.mjs run",
"test-wsl": "cross-env NODE_ENV=test vitest run",
"test:coverage": "npm run clean && npm run test:unit -- --coverage && npm run test:integration -- --coverage",
"test:unit": "NODE_ENV=test tsx ./node_modules/vitest/vitest.mjs run --project unit -c vite.config.ts",
"test:integration": "NODE_ENV=test tsx ./node_modules/vitest/vitest.mjs run --project integration -c vitest.config.integration.ts",
"test:unit": "NODE_ENV=test tsx --max-old-space-size=8192 ./node_modules/vitest/vitest.mjs run --project unit -c vite.config.ts",
"test:integration": "NODE_ENV=test tsx --max-old-space-size=8192 ./node_modules/vitest/vitest.mjs run --project integration -c vitest.config.integration.ts",
"format": "prettier --write .",
"lint": "eslint . --ext ts,tsx --report-unused-disable-directives --max-warnings 0",
"type-check": "tsc --noEmit",
@@ -20,6 +22,7 @@
"start:dev": "NODE_ENV=development tsx watch server.ts",
"start:prod": "NODE_ENV=production tsx server.ts",
"start:test": "NODE_ENV=test NODE_V8_COVERAGE=.coverage/tmp/integration-server tsx server.ts",
"db:reset:dev": "NODE_ENV=development tsx src/db/seed.ts",
"db:reset:test": "NODE_ENV=test tsx src/db/seed.ts",
"worker:prod": "NODE_ENV=production tsx src/services/queueService.server.ts"
},
@@ -58,7 +61,7 @@
"recharts": "^3.4.1",
"sharp": "^0.34.5",
"tsx": "^4.20.6",
"zod": "^4.1.13",
"zod": "^4.2.1",
"zxcvbn": "^4.4.2"
},
"devDependencies": {
@@ -95,6 +98,7 @@
"autoprefixer": "^10.4.22",
"c8": "^10.1.3",
"concurrently": "^9.2.1",
"cross-env": "^10.1.0",
"eslint": "9.39.1",
"eslint-config-prettier": "^9.1.0",
"eslint-plugin-react": "7.37.5",

View File

@@ -10,10 +10,13 @@ const tailwindConfigPath = path.resolve(process.cwd(), 'tailwind.config.js');
console.log(`[POSTCSS] Attempting to use Tailwind config at: ${tailwindConfigPath}`);
// Log to prove the imported config object is what we expect
console.log('[POSTCSS] Imported tailwind.config.js object:', JSON.stringify(tailwindConfig, null, 2));
console.log(
'[POSTCSS] Imported tailwind.config.js object:',
JSON.stringify(tailwindConfig, null, 2),
);
export default {
plugins: {
'@tailwindcss/postcss': {}, // The empty object is correct.
},
};
};

View File

@@ -1030,11 +1030,61 @@ DROP FUNCTION IF EXISTS public.fork_recipe(UUID, BIGINT);
CREATE OR REPLACE FUNCTION public.fork_recipe(p_user_id UUID, p_original_recipe_id BIGINT)
RETURNS SETOF public.recipes
LANGUAGE sql
LANGUAGE plpgsql
SECURITY INVOKER
AS $$
-- The entire forking logic is now encapsulated in a single, atomic database function.
SELECT * FROM public.fork_recipe(p_user_id, p_original_recipe_id);
DECLARE
new_recipe_id BIGINT;
BEGIN
-- 1. Create a copy of the recipe, linking it to the new user and the original recipe.
INSERT INTO public.recipes (
user_id,
original_recipe_id,
name,
description,
instructions,
prep_time_minutes,
cook_time_minutes,
servings,
photo_url,
calories_per_serving,
protein_grams,
fat_grams,
carb_grams,
status -- Forked recipes should be private by default
)
SELECT
p_user_id,
p_original_recipe_id,
original.name || ' (Fork)', -- Append '(Fork)' to distinguish it
original.description,
original.instructions,
original.prep_time_minutes,
original.cook_time_minutes,
original.servings,
original.photo_url,
original.calories_per_serving,
original.protein_grams,
original.fat_grams,
original.carb_grams,
'private'
FROM public.recipes AS original
WHERE original.recipe_id = p_original_recipe_id
RETURNING recipe_id INTO new_recipe_id;
-- If the original recipe didn't exist, new_recipe_id will be null.
IF new_recipe_id IS NULL THEN
RETURN;
END IF;
-- 2. Copy all ingredients, tags, and appliances from the original recipe to the new one.
INSERT INTO public.recipe_ingredients (recipe_id, master_item_id, quantity, unit) SELECT new_recipe_id, master_item_id, quantity, unit FROM public.recipe_ingredients WHERE recipe_id = p_original_recipe_id;
INSERT INTO public.recipe_tags (recipe_id, tag_id) SELECT new_recipe_id, tag_id FROM public.recipe_tags WHERE recipe_id = p_original_recipe_id;
INSERT INTO public.recipe_appliances (recipe_id, appliance_id) SELECT new_recipe_id, appliance_id FROM public.recipe_appliances WHERE recipe_id = p_original_recipe_id;
-- 3. Return the newly created recipe record.
RETURN QUERY SELECT * FROM public.recipes WHERE recipe_id = new_recipe_id;
END;
$$;
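Since `fork_recipe` is now a single atomic plpgsql function returning `SETOF recipes`, the application side reduces to one query; a sketch using node-postgres (pool construction is an assumption):

```ts
import { Pool } from 'pg';

const pool = new Pool(); // connection settings come from the PG* env vars

// Forks a recipe in one round trip; resolves to the new row, or null if the
// original recipe does not exist (the function returns an empty set).
export async function forkRecipe(userId: string, originalRecipeId: number) {
  const { rows } = await pool.query('SELECT * FROM public.fork_recipe($1, $2)', [
    userId,
    originalRecipeId,
  ]);
  return rows[0] ?? null;
}
```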
@@ -1566,4 +1616,3 @@ BEGIN
bp.price_rank = 1;
END;
$$ LANGUAGE plpgsql;

View File

@@ -8,7 +8,23 @@
-- It is idempotent, meaning it can be run multiple times without causing errors.
-- 1. Pre-populate the master grocery items dictionary.
-- This block links generic items to their respective categories.
-- This MUST run after populating categories.
-- Renumbered to 2.
-- 2. Pre-populate the categories table from a predefined list.
-- Renumbered to 1. This MUST run before populating master_grocery_items.
DO $$
BEGIN
INSERT INTO public.categories (name) VALUES
('Fruits & Vegetables'), ('Meat & Seafood'), ('Dairy & Eggs'), ('Bakery & Bread'),
('Pantry & Dry Goods'), ('Beverages'), ('Frozen Foods'), ('Snacks'), ('Household & Cleaning'),
('Personal Care & Health'), ('Baby & Child'), ('Pet Supplies'), ('Deli & Prepared Foods'),
('Canned Goods'), ('Condiments & Spices'), ('Breakfast & Cereal'), ('Organic'),
('International Foods'), ('Other/Miscellaneous')
ON CONFLICT (name) DO NOTHING;
END $$;
-- 2. Pre-populate the master grocery items dictionary.
DO $$
DECLARE
fv_cat_id BIGINT; ms_cat_id BIGINT; de_cat_id BIGINT; bb_cat_id BIGINT; pdg_cat_id BIGINT;
@@ -53,18 +69,6 @@ BEGIN
ON CONFLICT (name) DO NOTHING;
END $$;
-- 2. Pre-populate the categories table from a predefined list.
DO $$
BEGIN
INSERT INTO public.categories (name) VALUES
('Fruits & Vegetables'), ('Meat & Seafood'), ('Dairy & Eggs'), ('Bakery & Bread'),
('Pantry & Dry Goods'), ('Beverages'), ('Frozen Foods'), ('Snacks'), ('Household & Cleaning'),
('Personal Care & Health'), ('Baby & Child'), ('Pet Supplies'), ('Deli & Prepared Foods'),
('Canned Goods'), ('Condiments & Spices'), ('Breakfast & Cereal'), ('Organic'),
('International Foods'), ('Other/Miscellaneous')
ON CONFLICT (name) DO NOTHING;
END $$;
-- 3. Pre-populate the brands and products tables.
-- This block adds common brands and links them to specific products.
DO $$

View File

@@ -92,6 +92,7 @@ CREATE TABLE IF NOT EXISTS public.stores (
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL,
created_by UUID REFERENCES public.users(user_id) ON DELETE SET NULL
);
COMMENT ON TABLE public.stores IS 'Stores metadata for grocery store chains (e.g., Safeway, Kroger).';
-- 5. The 'categories' table for normalized category data.
@@ -109,8 +110,8 @@ CREATE TABLE IF NOT EXISTS public.flyers (
file_name TEXT NOT NULL,
image_url TEXT NOT NULL,
icon_url TEXT,
checksum TEXT UNIQUE,
store_id BIGINT REFERENCES public.stores(store_id),
checksum TEXT UNIQUE,
store_id BIGINT REFERENCES public.stores(store_id) ON DELETE CASCADE,
valid_from DATE,
valid_to DATE,
store_address TEXT,
@@ -138,7 +139,7 @@ CREATE INDEX IF NOT EXISTS idx_flyers_valid_to_file_name ON public.flyers (valid
CREATE TABLE IF NOT EXISTS public.master_grocery_items (
master_grocery_item_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
name TEXT NOT NULL UNIQUE,
category_id BIGINT REFERENCES public.categories(category_id),
category_id BIGINT REFERENCES public.categories(category_id) ON DELETE SET NULL,
is_allergen BOOLEAN DEFAULT false,
allergy_info JSONB,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
@@ -161,6 +162,38 @@ CREATE TABLE IF NOT EXISTS public.user_watched_items (
COMMENT ON TABLE public.user_watched_items IS 'A linking table that represents a user''s personal watchlist of grocery items.';
CREATE INDEX IF NOT EXISTS idx_user_watched_items_master_item_id ON public.user_watched_items(master_item_id);
-- 23. Store brand information. (Moved up due to dependency in flyer_items)
CREATE TABLE IF NOT EXISTS public.brands (
brand_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
name TEXT NOT NULL UNIQUE,
logo_url TEXT,
store_id BIGINT REFERENCES public.stores(store_id) ON DELETE SET NULL,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.brands IS 'Stores brand names like "Coca-Cola", "Maple Leaf", or "Kraft".';
COMMENT ON COLUMN public.brands.store_id IS 'If this is a store-specific brand (e.g., President''s Choice), this links to the parent store.';
-- 24. For specific products, linking a master item with a brand and size. (Moved up due to dependency in flyer_items)
CREATE TABLE IF NOT EXISTS public.products (
product_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE CASCADE,
brand_id BIGINT REFERENCES public.brands(brand_id) ON DELETE SET NULL,
name TEXT NOT NULL,
description TEXT,
size TEXT,
upc_code TEXT UNIQUE,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.products IS 'Represents a specific, sellable product, combining a generic item with a brand and size.';
COMMENT ON COLUMN public.products.upc_code IS 'Universal Product Code, if available, for exact product matching.';
COMMENT ON COLUMN public.products.brand_id IS 'Can be null for generic/store-brand items.';
COMMENT ON COLUMN public.products.name IS 'Prime Raised without Antibiotics Chicken Breast.';
COMMENT ON COLUMN public.products.size IS 'e.g., "4L", "500g".';
CREATE INDEX IF NOT EXISTS idx_products_master_item_id ON public.products(master_item_id);
CREATE INDEX IF NOT EXISTS idx_products_brand_id ON public.products(brand_id);
-- 9. The 'flyer_items' table. This stores individual items from flyers.
CREATE TABLE IF NOT EXISTS public.flyer_items (
flyer_item_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
@@ -170,13 +203,13 @@ CREATE TABLE IF NOT EXISTS public.flyer_items (
price_in_cents INTEGER,
quantity_num NUMERIC,
quantity TEXT NOT NULL,
category_id BIGINT REFERENCES public.categories(category_id),
category_id BIGINT REFERENCES public.categories(category_id) ON DELETE SET NULL,
category_name TEXT,
unit_price JSONB,
view_count INTEGER DEFAULT 0 NOT NULL,
click_count INTEGER DEFAULT 0 NOT NULL,
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id),
product_id BIGINT,
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE SET NULL,
product_id BIGINT REFERENCES public.products(product_id) ON DELETE SET NULL,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
@@ -293,7 +326,7 @@ CREATE INDEX IF NOT EXISTS idx_shopping_lists_user_id ON public.shopping_lists(u
CREATE TABLE IF NOT EXISTS public.shopping_list_items (
shopping_list_item_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
shopping_list_id BIGINT NOT NULL REFERENCES public.shopping_lists(shopping_list_id) ON DELETE CASCADE,
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id),
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE CASCADE,
custom_item_name TEXT,
quantity NUMERIC DEFAULT 1 NOT NULL,
is_purchased BOOLEAN DEFAULT false NOT NULL,
@@ -358,7 +391,7 @@ CREATE INDEX IF NOT EXISTS idx_shared_menu_plans_shared_with_user_id ON public.s
CREATE TABLE IF NOT EXISTS public.suggested_corrections (
suggested_correction_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
flyer_item_id BIGINT NOT NULL REFERENCES public.flyer_items(flyer_item_id) ON DELETE CASCADE,
user_id UUID NOT NULL REFERENCES public.users(user_id),
user_id UUID NOT NULL REFERENCES public.users(user_id) ON DELETE CASCADE,
correction_type TEXT NOT NULL,
suggested_value TEXT NOT NULL,
status TEXT DEFAULT 'pending' NOT NULL,
@@ -378,9 +411,9 @@ CREATE INDEX IF NOT EXISTS idx_suggested_corrections_pending ON public.suggested
-- 21. For prices submitted directly by users from in-store.
CREATE TABLE IF NOT EXISTS public.user_submitted_prices (
user_submitted_price_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
user_id UUID NOT NULL REFERENCES public.users(user_id),
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id),
store_id BIGINT NOT NULL REFERENCES public.stores(store_id),
user_id UUID NOT NULL REFERENCES public.users(user_id) ON DELETE CASCADE,
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE CASCADE,
store_id BIGINT NOT NULL REFERENCES public.stores(store_id) ON DELETE CASCADE,
price_in_cents INTEGER NOT NULL,
photo_url TEXT,
upvotes INTEGER DEFAULT 0 NOT NULL,
@@ -408,38 +441,6 @@ COMMENT ON TABLE public.unmatched_flyer_items IS 'A queue for reviewing flyer it
CREATE INDEX IF NOT EXISTS idx_unmatched_flyer_items_flyer_item_id ON public.unmatched_flyer_items(flyer_item_id);
CREATE INDEX IF NOT EXISTS idx_unmatched_flyer_items_pending ON public.unmatched_flyer_items (created_at) WHERE status = 'pending';
-- 23. Store brand information.
CREATE TABLE IF NOT EXISTS public.brands (
brand_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
name TEXT NOT NULL UNIQUE,
logo_url TEXT,
store_id BIGINT REFERENCES public.stores(store_id) ON DELETE SET NULL,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.brands IS 'Stores brand names like "Coca-Cola", "Maple Leaf", or "Kraft".';
COMMENT ON COLUMN public.brands.store_id IS 'If this is a store-specific brand (e.g., President''s Choice), this links to the parent store.';
-- 24. For specific products, linking a master item with a brand and size.
CREATE TABLE IF NOT EXISTS public.products (
product_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id),
brand_id BIGINT REFERENCES public.brands(brand_id),
name TEXT NOT NULL,
description TEXT,
size TEXT,
upc_code TEXT UNIQUE,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.products IS 'Represents a specific, sellable product, combining a generic item with a brand and size.';
COMMENT ON COLUMN public.products.upc_code IS 'Universal Product Code, if available, for exact product matching.';
COMMENT ON COLUMN public.products.brand_id IS 'Can be null for generic/store-brand items.';
COMMENT ON COLUMN public.products.name IS 'Prime Raised without Antibiotics Chicken Breast.';
COMMENT ON COLUMN public.products.size IS 'e.g., "4L", "500g".';
CREATE INDEX IF NOT EXISTS idx_products_master_item_id ON public.products(master_item_id);
CREATE INDEX IF NOT EXISTS idx_products_brand_id ON public.products(brand_id);
-- 25. Linking table for when one flyer is valid for multiple locations.
CREATE TABLE IF NOT EXISTS public.flyer_locations (
flyer_id BIGINT NOT NULL REFERENCES public.flyers(flyer_id) ON DELETE CASCADE,
@@ -495,7 +496,7 @@ CREATE UNIQUE INDEX IF NOT EXISTS idx_recipes_unique_system_recipe_name ON publi
CREATE TABLE IF NOT EXISTS public.recipe_ingredients (
recipe_ingredient_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
recipe_id BIGINT NOT NULL REFERENCES public.recipes(recipe_id) ON DELETE CASCADE,
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id),
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE CASCADE,
quantity NUMERIC NOT NULL,
unit TEXT NOT NULL,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
@@ -779,7 +780,7 @@ CREATE INDEX IF NOT EXISTS idx_shopping_trips_shopping_list_id ON public.shoppin
CREATE TABLE IF NOT EXISTS public.shopping_trip_items (
shopping_trip_item_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
shopping_trip_id BIGINT NOT NULL REFERENCES public.shopping_trips(shopping_trip_id) ON DELETE CASCADE,
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id),
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE SET NULL,
custom_item_name TEXT,
quantity NUMERIC NOT NULL,
price_paid_cents INTEGER,
@@ -843,7 +844,7 @@ CREATE INDEX IF NOT EXISTS idx_user_follows_following_id ON public.user_follows(
CREATE TABLE IF NOT EXISTS public.receipts (
receipt_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
user_id UUID NOT NULL REFERENCES public.users(user_id) ON DELETE CASCADE,
store_id BIGINT REFERENCES public.stores(store_id),
store_id BIGINT REFERENCES public.stores(store_id) ON DELETE CASCADE,
receipt_image_url TEXT NOT NULL,
transaction_date TIMESTAMPTZ,
total_amount_cents INTEGER,
@@ -864,8 +865,8 @@ CREATE TABLE IF NOT EXISTS public.receipt_items (
raw_item_description TEXT NOT NULL,
quantity NUMERIC DEFAULT 1 NOT NULL,
price_paid_cents INTEGER NOT NULL,
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id),
product_id BIGINT REFERENCES public.products(product_id),
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE SET NULL,
product_id BIGINT REFERENCES public.products(product_id) ON DELETE SET NULL,
status TEXT DEFAULT 'unmatched' NOT NULL CHECK (status IN ('unmatched', 'matched', 'needs_review', 'ignored')),
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL

View File

@@ -126,8 +126,8 @@ CREATE TABLE IF NOT EXISTS public.flyers (
file_name TEXT NOT NULL,
image_url TEXT NOT NULL,
icon_url TEXT,
checksum TEXT UNIQUE,
store_id BIGINT REFERENCES public.stores(store_id),
checksum TEXT UNIQUE,
store_id BIGINT REFERENCES public.stores(store_id) ON DELETE CASCADE,
valid_from DATE,
valid_to DATE,
store_address TEXT,
@@ -155,7 +155,7 @@ CREATE INDEX IF NOT EXISTS idx_flyers_valid_to_file_name ON public.flyers (valid
CREATE TABLE IF NOT EXISTS public.master_grocery_items (
master_grocery_item_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
name TEXT NOT NULL UNIQUE,
category_id BIGINT REFERENCES public.categories(category_id),
category_id BIGINT REFERENCES public.categories(category_id) ON DELETE SET NULL,
is_allergen BOOLEAN DEFAULT false,
allergy_info JSONB,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
@@ -178,6 +178,38 @@ CREATE TABLE IF NOT EXISTS public.user_watched_items (
COMMENT ON TABLE public.user_watched_items IS 'A linking table that represents a user''s personal watchlist of grocery items.';
CREATE INDEX IF NOT EXISTS idx_user_watched_items_master_item_id ON public.user_watched_items(master_item_id);
-- 23. Store brand information. (Moved up due to dependency in flyer_items)
CREATE TABLE IF NOT EXISTS public.brands (
brand_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
name TEXT NOT NULL UNIQUE,
logo_url TEXT,
store_id BIGINT REFERENCES public.stores(store_id) ON DELETE SET NULL,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.brands IS 'Stores brand names like "Coca-Cola", "Maple Leaf", or "Kraft".';
COMMENT ON COLUMN public.brands.store_id IS 'If this is a store-specific brand (e.g., President''s Choice), this links to the parent store.';
-- 24. For specific products, linking a master item with a brand and size. (Moved up due to dependency in flyer_items)
CREATE TABLE IF NOT EXISTS public.products (
product_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE CASCADE,
brand_id BIGINT REFERENCES public.brands(brand_id) ON DELETE SET NULL,
name TEXT NOT NULL,
description TEXT,
size TEXT,
upc_code TEXT UNIQUE,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.products IS 'Represents a specific, sellable product, combining a generic item with a brand and size.';
COMMENT ON COLUMN public.products.upc_code IS 'Universal Product Code, if available, for exact product matching.';
COMMENT ON COLUMN public.products.brand_id IS 'Can be null for generic/store-brand items.';
COMMENT ON COLUMN public.products.name IS 'Prime Raised without Antibiotics Chicken Breast.';
COMMENT ON COLUMN public.products.size IS 'e.g., "4L", "500g".';
CREATE INDEX IF NOT EXISTS idx_products_master_item_id ON public.products(master_item_id);
CREATE INDEX IF NOT EXISTS idx_products_brand_id ON public.products(brand_id);
-- 9. The 'flyer_items' table. This stores individual items from flyers.
CREATE TABLE IF NOT EXISTS public.flyer_items (
flyer_item_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
@@ -187,13 +219,13 @@ CREATE TABLE IF NOT EXISTS public.flyer_items (
price_in_cents INTEGER,
quantity_num NUMERIC,
quantity TEXT NOT NULL,
category_id BIGINT REFERENCES public.categories(category_id),
category_id BIGINT REFERENCES public.categories(category_id) ON DELETE SET NULL,
category_name TEXT,
unit_price JSONB,
view_count INTEGER DEFAULT 0 NOT NULL,
click_count INTEGER DEFAULT 0 NOT NULL,
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id),
product_id BIGINT,
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE SET NULL,
product_id BIGINT REFERENCES public.products(product_id) ON DELETE SET NULL,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
@@ -310,7 +342,7 @@ CREATE INDEX IF NOT EXISTS idx_shopping_lists_user_id ON public.shopping_lists(u
CREATE TABLE IF NOT EXISTS public.shopping_list_items (
shopping_list_item_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
shopping_list_id BIGINT NOT NULL REFERENCES public.shopping_lists(shopping_list_id) ON DELETE CASCADE,
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id),
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE CASCADE,
custom_item_name TEXT,
quantity NUMERIC DEFAULT 1 NOT NULL,
is_purchased BOOLEAN DEFAULT false NOT NULL,
@@ -375,7 +407,7 @@ CREATE INDEX IF NOT EXISTS idx_shared_menu_plans_shared_with_user_id ON public.s
CREATE TABLE IF NOT EXISTS public.suggested_corrections (
suggested_correction_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
flyer_item_id BIGINT NOT NULL REFERENCES public.flyer_items(flyer_item_id) ON DELETE CASCADE,
user_id UUID NOT NULL REFERENCES public.users(user_id),
user_id UUID NOT NULL REFERENCES public.users(user_id) ON DELETE CASCADE,
correction_type TEXT NOT NULL,
suggested_value TEXT NOT NULL,
status TEXT DEFAULT 'pending' NOT NULL,
@@ -395,9 +427,9 @@ CREATE INDEX IF NOT EXISTS idx_suggested_corrections_pending ON public.suggested
-- 21. For prices submitted directly by users from in-store.
CREATE TABLE IF NOT EXISTS public.user_submitted_prices (
user_submitted_price_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
user_id UUID NOT NULL REFERENCES public.users(user_id),
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id),
store_id BIGINT NOT NULL REFERENCES public.stores(store_id),
user_id UUID NOT NULL REFERENCES public.users(user_id) ON DELETE CASCADE,
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE CASCADE,
store_id BIGINT NOT NULL REFERENCES public.stores(store_id) ON DELETE CASCADE,
price_in_cents INTEGER NOT NULL,
photo_url TEXT,
upvotes INTEGER DEFAULT 0 NOT NULL,
@@ -424,38 +456,6 @@ COMMENT ON TABLE public.unmatched_flyer_items IS 'A queue for reviewing flyer it
CREATE INDEX IF NOT EXISTS idx_unmatched_flyer_items_flyer_item_id ON public.unmatched_flyer_items(flyer_item_id);
CREATE INDEX IF NOT EXISTS idx_unmatched_flyer_items_pending ON public.unmatched_flyer_items (created_at) WHERE status = 'pending';
-- 23. Store brand information.
CREATE TABLE IF NOT EXISTS public.brands (
brand_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
name TEXT NOT NULL UNIQUE,
logo_url TEXT,
store_id BIGINT REFERENCES public.stores(store_id) ON DELETE SET NULL,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.brands IS 'Stores brand names like "Coca-Cola", "Maple Leaf", or "Kraft".';
COMMENT ON COLUMN public.brands.store_id IS 'If this is a store-specific brand (e.g., President''s Choice), this links to the parent store.';
-- 24. For specific products, linking a master item with a brand and size.
CREATE TABLE IF NOT EXISTS public.products (
product_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id),
brand_id BIGINT REFERENCES public.brands(brand_id),
name TEXT NOT NULL,
description TEXT,
size TEXT,
upc_code TEXT UNIQUE,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.products IS 'Represents a specific, sellable product, combining a generic item with a brand and size.';
COMMENT ON COLUMN public.products.upc_code IS 'Universal Product Code, if available, for exact product matching.';
COMMENT ON COLUMN public.products.brand_id IS 'Can be null for generic/store-brand items.';
COMMENT ON COLUMN public.products.name IS 'Prime Raised without Antibiotics Chicken Breast.';
COMMENT ON COLUMN public.products.size IS 'e.g., "4L", "500g".';
CREATE INDEX IF NOT EXISTS idx_products_master_item_id ON public.products(master_item_id);
CREATE INDEX IF NOT EXISTS idx_products_brand_id ON public.products(brand_id);
-- 25. Linking table for when one flyer is valid for multiple locations.
CREATE TABLE IF NOT EXISTS public.flyer_locations (
flyer_id BIGINT NOT NULL REFERENCES public.flyers(flyer_id) ON DELETE CASCADE,
@@ -510,7 +510,7 @@ CREATE UNIQUE INDEX IF NOT EXISTS idx_recipes_unique_system_recipe_name ON publi
CREATE TABLE IF NOT EXISTS public.recipe_ingredients (
recipe_ingredient_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
recipe_id BIGINT NOT NULL REFERENCES public.recipes(recipe_id) ON DELETE CASCADE,
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id),
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE CASCADE,
quantity NUMERIC NOT NULL,
unit TEXT NOT NULL,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
@@ -796,7 +796,7 @@ CREATE INDEX IF NOT EXISTS idx_shopping_trips_shopping_list_id ON public.shoppin
CREATE TABLE IF NOT EXISTS public.shopping_trip_items (
shopping_trip_item_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
shopping_trip_id BIGINT NOT NULL REFERENCES public.shopping_trips(shopping_trip_id) ON DELETE CASCADE,
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id),
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE SET NULL,
custom_item_name TEXT,
quantity NUMERIC NOT NULL,
price_paid_cents INTEGER,
@@ -862,7 +862,7 @@ CREATE INDEX IF NOT EXISTS idx_user_follows_following_id ON public.user_follows(
CREATE TABLE IF NOT EXISTS public.receipts (
receipt_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
user_id UUID NOT NULL REFERENCES public.users(user_id) ON DELETE CASCADE,
store_id BIGINT REFERENCES public.stores(store_id),
store_id BIGINT REFERENCES public.stores(store_id) ON DELETE CASCADE,
receipt_image_url TEXT NOT NULL,
transaction_date TIMESTAMPTZ,
total_amount_cents INTEGER,
@@ -883,8 +883,8 @@ CREATE TABLE IF NOT EXISTS public.receipt_items (
raw_item_description TEXT NOT NULL,
quantity NUMERIC DEFAULT 1 NOT NULL,
price_paid_cents INTEGER NOT NULL,
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id),
product_id BIGINT REFERENCES public.products(product_id),
master_item_id BIGINT REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE SET NULL,
product_id BIGINT REFERENCES public.products(product_id) ON DELETE SET NULL,
status TEXT DEFAULT 'unmatched' NOT NULL CHECK (status IN ('unmatched', 'matched', 'needs_review', 'ignored')),
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
@@ -2128,11 +2128,61 @@ DROP FUNCTION IF EXISTS public.fork_recipe(UUID, BIGINT);
CREATE OR REPLACE FUNCTION public.fork_recipe(p_user_id UUID, p_original_recipe_id BIGINT)
RETURNS SETOF public.recipes
LANGUAGE sql
LANGUAGE plpgsql
SECURITY INVOKER
AS $$
-- The entire forking logic is now encapsulated in a single, atomic database function.
SELECT * FROM public.fork_recipe(p_user_id, p_original_recipe_id);
DECLARE
new_recipe_id BIGINT;
BEGIN
-- 1. Create a copy of the recipe, linking it to the new user and the original recipe.
INSERT INTO public.recipes (
user_id,
original_recipe_id,
name,
description,
instructions,
prep_time_minutes,
cook_time_minutes,
servings,
photo_url,
calories_per_serving,
protein_grams,
fat_grams,
carb_grams,
status -- Forked recipes should be private by default
)
SELECT
p_user_id,
p_original_recipe_id,
original.name || ' (Fork)', -- Append '(Fork)' to distinguish it
original.description,
original.instructions,
original.prep_time_minutes,
original.cook_time_minutes,
original.servings,
original.photo_url,
original.calories_per_serving,
original.protein_grams,
original.fat_grams,
original.carb_grams,
'private'
FROM public.recipes AS original
WHERE original.recipe_id = p_original_recipe_id
RETURNING recipe_id INTO new_recipe_id;
-- If the original recipe didn't exist, new_recipe_id will be null.
IF new_recipe_id IS NULL THEN
RETURN;
END IF;
-- 2. Copy all ingredients, tags, and appliances from the original recipe to the new one.
INSERT INTO public.recipe_ingredients (recipe_id, master_item_id, quantity, unit) SELECT new_recipe_id, master_item_id, quantity, unit FROM public.recipe_ingredients WHERE recipe_id = p_original_recipe_id;
INSERT INTO public.recipe_tags (recipe_id, tag_id) SELECT new_recipe_id, tag_id FROM public.recipe_tags WHERE recipe_id = p_original_recipe_id;
INSERT INTO public.recipe_appliances (recipe_id, appliance_id) SELECT new_recipe_id, appliance_id FROM public.recipe_appliances WHERE recipe_id = p_original_recipe_id;
-- 3. Return the newly created recipe record.
RETURN QUERY SELECT * FROM public.recipes WHERE recipe_id = new_recipe_id;
END;
$$;

View File

@@ -36,7 +36,7 @@ vi.mock('pdfjs-dist', () => ({
// Mock the new config module
vi.mock('./config', () => ({
default: {
app: { version: '1.0.0', commitMessage: 'Initial commit', commitUrl: '#' },
app: { version: '20250101-1200:abc1234:1.0.0', commitMessage: 'Initial commit', commitUrl: '#' },
google: { mapsEmbedApiKey: 'mock-key' },
},
}));
@@ -243,9 +243,7 @@ describe('App Component', () => {
mockedApiClient.fetchShoppingLists.mockImplementation(() =>
Promise.resolve(new Response(JSON.stringify([]))),
);
mockedAiApiClient.rescanImageArea.mockResolvedValue(
new Response(JSON.stringify({ text: 'mocked text' })),
); // Mock for FlyerCorrectionTool
mockedAiApiClient.rescanImageArea.mockResolvedValue({ text: 'mocked text' }); // Mock for FlyerCorrectionTool
console.log('[TEST DEBUG] beforeEach: Setup complete');
});
@@ -588,11 +586,11 @@ describe('App Component', () => {
// Mock the config module for this specific test
vi.mock('./config', () => ({
default: {
app: { version: '1.0.1', commitMessage: 'New feature!', commitUrl: '#' },
app: { version: '20250101-1200:abc1234:1.0.1', commitMessage: 'New feature!', commitUrl: '#' },
google: { mapsEmbedApiKey: 'mock-key' },
},
}));
localStorageMock.setItem('lastSeenVersion', '1.0.0');
localStorageMock.setItem('lastSeenVersion', '20250101-1200:abc1234:1.0.0');
renderApp();
await expect(screen.findByTestId('whats-new-modal-mock')).resolves.toBeInTheDocument();
});
@@ -741,7 +739,7 @@ describe('App Component', () => {
vi.mock('./config', () => ({
default: {
app: {
version: '2.0.0',
version: '20250101-1200:abc1234:2.0.0',
commitMessage: 'A new version!',
commitUrl: 'http://example.com/commit/2.0.0',
},
@@ -752,14 +750,14 @@ describe('App Component', () => {
it('should display the version number and commit link', () => {
renderApp();
const versionLink = screen.getByText(`Version: 2.0.0`);
const versionLink = screen.getByText(`Version: 20250101-1200:abc1234:2.0.0`);
expect(versionLink).toBeInTheDocument();
expect(versionLink).toHaveAttribute('href', 'http://example.com/commit/2.0.0');
});
it('should open the "What\'s New" modal when the question mark icon is clicked', async () => {
// Pre-set the localStorage to prevent the modal from opening automatically
localStorageMock.setItem('lastSeenVersion', '2.0.0');
localStorageMock.setItem('lastSeenVersion', '20250101-1200:abc1234:2.0.0');
renderApp();
expect(screen.queryByTestId('whats-new-modal-mock')).not.toBeInTheDocument();

View File

@@ -16,4 +16,4 @@ const config = {
},
};
export default config;
export default config;

View File

@@ -6,10 +6,11 @@
* DO NOT run this on a production database.
*/
import { Pool } from 'pg';
import { Pool, PoolClient } from 'pg';
import fs from 'node:fs/promises';
import path from 'node:path';
import bcrypt from 'bcrypt';
import { logger } from '../services/logger.server';
import { CATEGORIES } from '../types';
const pool = new Pool({
user: process.env.DB_USER,
@@ -20,81 +21,55 @@ const pool = new Pool({
});
async function main() {
// Declare client outside the try block so it's accessible in the finally block.
let client;
let client: PoolClient | undefined;
try {
client = await pool.connect();
logger.info('Connected to the database for seeding.');
await client.query('BEGIN');
// 1. Clean the database
logger.info('--- Wiping existing data... ---');
// Using TRUNCATE ... RESTART IDENTITY CASCADE is a powerful way to clean all tables
// and reset auto-incrementing keys, while respecting foreign key relationships.
const tablesRes = await client.query(`
SELECT tablename
FROM pg_tables
WHERE schemaname = 'public'
-- Exclude PostGIS system tables from truncation to avoid permission errors.
AND tablename NOT IN ('spatial_ref_sys', 'geometry_columns')
`);
const tables = tablesRes.rows.map((row) => `"${row.tablename}"`).join(', ');
if (tables) {
await client.query(`TRUNCATE ${tables} RESTART IDENTITY CASCADE`);
logger.info('All tables in public schema have been truncated.');
}
// 1. Clean the database by dropping and recreating the schema
logger.info('--- Wiping and rebuilding schema... ---');
const dropScriptPath = path.resolve(process.cwd(), 'sql/drop_tables.sql');
const dropSql = await fs.readFile(dropScriptPath, 'utf-8');
await client.query(dropSql);
logger.info('All tables dropped successfully.');
// 2. Seed Categories
logger.info('--- Seeding Categories... ---');
const categoryQuery = `INSERT INTO public.categories (name) VALUES ${CATEGORIES.map((_, i) => `($${i + 1})`).join(', ')} RETURNING category_id, name`;
const seededCategories = (
await client.query<{ category_id: number; name: string }>(categoryQuery, CATEGORIES)
).rows;
const categoryMap = new Map(seededCategories.map((c) => [c.name, c.category_id]));
logger.info(`Seeded ${seededCategories.length} categories.`);
const schemaScriptPath = path.resolve(process.cwd(), 'sql/master_schema_rollup.sql');
const schemaSql = await fs.readFile(schemaScriptPath, 'utf-8');
await client.query(schemaSql);
logger.info(
'Schema rebuilt and static data seeded successfully from master_schema_rollup.sql.',
);
// 3. Seed Stores
// 2. Seed Additional Stores (if any beyond what's in the rollup)
logger.info('--- Seeding Stores... ---');
const stores = ['Safeway', 'No Frills', 'Costco', 'Superstore'];
const storeQuery = `INSERT INTO public.stores (name) VALUES ${stores.map((_, i) => `($${i + 1})`).join(', ')} RETURNING store_id, name`;
const seededStores = (
await client.query<{ store_id: number; name: string }>(storeQuery, stores)
const storeQuery = `INSERT INTO public.stores (name) VALUES ${stores.map((_, i) => `($${i + 1})`).join(', ')} ON CONFLICT (name) DO NOTHING RETURNING store_id, name`;
await client.query<{ store_id: number; name: string }>(storeQuery, stores);
const allStores = (
await client.query<{ store_id: number; name: string }>(
'SELECT store_id, name FROM public.stores',
)
).rows;
const storeMap = new Map(seededStores.map((s) => [s.name, s.store_id]));
logger.info(`Seeded ${seededStores.length} stores.`);
// 4. Seed Master Grocery Items
logger.info('--- Seeding Master Grocery Items... ---');
const masterItems = [
{ name: 'Chicken Breast, Boneless Skinless', category: 'Meat & Seafood' },
{ name: 'Ground Beef, Lean', category: 'Meat & Seafood' },
{ name: 'Avocado', category: 'Fruits & Vegetables' },
{ name: 'Bananas', category: 'Fruits & Vegetables' },
{ name: 'Broccoli', category: 'Fruits & Vegetables' },
{ name: 'Cheddar Cheese, Block', category: 'Dairy & Eggs' },
{ name: 'Milk, 2%', category: 'Dairy & Eggs' },
{ name: 'Eggs, Large', category: 'Dairy & Eggs' },
{ name: 'Whole Wheat Bread', category: 'Bakery & Bread' },
{ name: 'Pasta, Spaghetti', category: 'Pantry & Dry Goods' },
{ name: 'Canned Tomatoes, Diced', category: 'Canned Goods' },
{ name: 'Coca-Cola, 12-pack', category: 'Beverages' },
{ name: 'Frozen Pizza', category: 'Frozen Foods' },
{ name: 'Paper Towels', category: 'Household & Cleaning' },
];
const masterItemValues = masterItems
.map((item) => `('${item.name.replace(/'/g, "''")}', ${categoryMap.get(item.category)})`)
.join(', ');
const masterItemQuery = `INSERT INTO public.master_grocery_items (name, category_id) VALUES ${masterItemValues} RETURNING master_grocery_item_id, name`;
const seededMasterItems = (
await client.query<{ master_grocery_item_id: number; name: string }>(masterItemQuery)
).rows;
const masterItemMap = new Map(
seededMasterItems.map((item) => [item.name, item.master_grocery_item_id]),
const storeMap = new Map(
allStores.map((s: { name: string; store_id: number }) => [s.name, s.store_id]),
);
logger.info(`Seeded ${seededMasterItems.length} master grocery items.`);
logger.info(`Seeded/verified ${allStores.length} total stores.`);
// 5. Seed Users & Profiles
// Fetch maps for items seeded by the master rollup script
const masterItemMap = new Map(
(
await client.query<{ master_grocery_item_id: number; name: string }>(
'SELECT master_grocery_item_id, name FROM public.master_grocery_items',
)
).rows.map((item: { name: string; master_grocery_item_id: number }) => [
item.name,
item.master_grocery_item_id,
]),
);
// 3. Seed Users & Profiles
logger.info('--- Seeding Users & Profiles... ---');
const saltRounds = 10;
const adminPassHash = await bcrypt.hash('adminpass', saltRounds);
@@ -126,7 +101,7 @@ async function main() {
const userId = userRes.rows[0].user_id;
logger.info('Seeded regular user (user@example.com / userpass)');
// 6. Seed a Flyer
// 4. Seed a Flyer
logger.info('--- Seeding a Sample Flyer... ---');
const today = new Date();
const validFrom = new Date(today);
@@ -146,29 +121,29 @@ async function main() {
const flyerId = flyerRes.rows[0].flyer_id;
logger.info(`Seeded flyer for Safeway (ID: ${flyerId}).`);
// 7. Seed Flyer Items
// 5. Seed Flyer Items
logger.info('--- Seeding Flyer Items... ---');
const flyerItems = [
{
name: 'Chicken Breast, Boneless Skinless',
name: 'chicken breast',
price_display: '$3.99 /lb',
price_in_cents: 399,
quantity: 'per lb',
master_item_id: masterItemMap.get('Chicken Breast, Boneless Skinless'),
master_item_id: masterItemMap.get('chicken breast'),
},
{
name: 'Avocado',
name: 'avocados',
price_display: '2 for $5.00',
price_in_cents: 250,
quantity: 'each',
master_item_id: masterItemMap.get('Avocado'),
master_item_id: masterItemMap.get('avocados'),
},
{
name: 'Coca-Cola 12-pack',
name: 'soda',
price_display: '$6.99',
price_in_cents: 699,
quantity: '12x355ml',
master_item_id: masterItemMap.get('Coca-Cola, 12-pack'),
master_item_id: masterItemMap.get('soda'),
},
{
name: 'Unmatched Sample Item',
@@ -194,12 +169,12 @@ async function main() {
}
logger.info(`Seeded ${flyerItems.length} items for the Safeway flyer.`);
// 8. Seed Watched Items for the user
// 6. Seed Watched Items for the user
logger.info('--- Seeding Watched Items... ---');
const watchedItemIds = [
masterItemMap.get('Chicken Breast, Boneless Skinless'),
masterItemMap.get('Avocado'),
masterItemMap.get('Ground Beef, Lean'),
masterItemMap.get('chicken breast'),
masterItemMap.get('avocados'),
masterItemMap.get('ground beef'),
];
for (const itemId of watchedItemIds) {
if (itemId) {
@@ -211,7 +186,7 @@ async function main() {
}
logger.info(`Seeded ${watchedItemIds.length} watched items for Test User.`);
// 9. Seed a Shopping List
// 7. Seed a Shopping List
logger.info('--- Seeding a Shopping List... ---');
const listRes = await client.query<{ shopping_list_id: number }>(
'INSERT INTO public.shopping_lists (user_id, name) VALUES ($1, $2) RETURNING shopping_list_id',
@@ -220,8 +195,8 @@ async function main() {
const listId = listRes.rows[0].shopping_list_id;
const shoppingListItems = [
{ master_item_id: masterItemMap.get('Milk, 2%'), quantity: 1 },
{ master_item_id: masterItemMap.get('Eggs, Large'), quantity: 1 },
{ master_item_id: masterItemMap.get('milk'), quantity: 1 },
{ master_item_id: masterItemMap.get('eggs'), quantity: 1 },
{ custom_item_name: 'Specialty Hot Sauce', quantity: 1 },
];
@@ -235,75 +210,6 @@ async function main() {
`Seeded shopping list "Weekly Groceries" with ${shoppingListItems.length} items for Test User.`,
);
// 10. Seed Brands
logger.info('--- Seeding Brands... ---');
const brands = [
'Coca-Cola',
'Kraft',
'Maple Leaf',
"Dempster's",
'No Name',
"President's Choice",
];
const brandQuery = `INSERT INTO public.brands (name) VALUES ${brands.map((_, i) => `($${i + 1})`).join(', ')} ON CONFLICT (name) DO NOTHING`;
await client.query(brandQuery, brands);
logger.info(`Seeded ${brands.length} brands.`);
// Link store-specific brands
const loblawsId = storeMap.get('Loblaws');
if (loblawsId) {
await client.query('UPDATE public.brands SET store_id = $1 WHERE name = $2 OR name = $3', [
loblawsId,
'No Name',
"President's Choice",
]);
logger.info('Linked store brands to Loblaws.');
}
// 11. Seed Recipes
logger.info('--- Seeding Recipes... ---');
const recipes = [
{
name: 'Simple Chicken and Rice',
description: 'A quick and healthy weeknight meal.',
instructions: '1. Cook rice. 2. Cook chicken. 3. Combine.',
prep: 10,
cook: 20,
servings: 4,
},
{
name: 'Classic Spaghetti Bolognese',
description: 'A rich and hearty meat sauce.',
instructions: '1. Brown beef. 2. Add sauce. 3. Simmer.',
prep: 15,
cook: 45,
servings: 6,
},
{
name: 'Vegetable Stir-fry',
description: 'A fast and flavorful vegetarian meal.',
instructions: '1. Chop veggies. 2. Stir-fry. 3. Add sauce.',
prep: 10,
cook: 10,
servings: 3,
},
];
for (const recipe of recipes) {
await client.query(
`INSERT INTO public.recipes (name, description, instructions, prep_time_minutes, cook_time_minutes, servings, status)
VALUES ($1, $2, $3, $4, $5, $6, 'public') ON CONFLICT (name) WHERE user_id IS NULL DO NOTHING`,
[
recipe.name,
recipe.description,
recipe.instructions,
recipe.prep,
recipe.cook,
recipe.servings,
],
);
}
logger.info(`Seeded ${recipes.length} recipes.`);
// --- SEED SCRIPT DEBUG LOGGING ---
// Corrected the query to be unambiguous by specifying the table alias for each column.
// `id` and `email` come from the `users` table (u), and `role` comes from the `profiles` table (p).
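For orientation, the connection-and-transaction scaffolding the seed script wraps around these steps reduces to the pattern below (a simplified sketch; the COMMIT/ROLLBACK placement is assumed from the excerpt, since those lines fall outside the hunks shown):

import { Pool, PoolClient } from 'pg';

const pool = new Pool(); // connection options omitted for brevity

async function main() {
  // Declared outside `try` so the `finally` block can release it.
  let client: PoolClient | undefined;
  try {
    client = await pool.connect();
    await client.query('BEGIN');
    // ...drop schema, apply master_schema_rollup.sql, seed rows...
    await client.query('COMMIT'); // assumption: the seed commits on success
  } catch (err) {
    if (client) await client.query('ROLLBACK'); // assumption: errors roll back the transaction
    throw err;
  } finally {
    client?.release();
    await pool.end();
  }
}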

View File

@@ -1,4 +1,4 @@
// src/components/PriceHistoryChart.tsx
// src/features/charts/PriceHistoryChart.tsx
import React, { useState, useEffect, useMemo } from 'react';
import {
LineChart,
@@ -142,7 +142,7 @@ export const PriceHistoryChart: React.FC = () => {
const renderContent = () => {
if (isLoading || isLoadingUserData) {
return (
<div role="status" className="flex justify-center items-center h-full min-h-[200px]">
<div role="status" className="flex justify-center items-center h-full min-h-50]">
<LoadingSpinner /> <span className="ml-2">Loading Price History...</span>
</div>
);
@@ -198,7 +198,12 @@ export const PriceHistoryChart: React.FC = () => {
borderRadius: '0.5rem',
}}
labelStyle={{ color: '#F9FAFB' }}
formatter={(value: number) => `$${(value / 100).toFixed(2)}`}
formatter={(value: number | undefined) => {
if (typeof value === 'number') {
return [`$${(value / 100).toFixed(2)}`];
}
return [null];
}}
/>
<Legend wrapperStyle={{ fontSize: '12px' }} />
{availableItems.map((item, index) => (

View File

@@ -73,12 +73,11 @@ describe('FlyerUploader', () => {
it('should handle file upload and start polling', async () => {
console.log('--- [TEST LOG] ---: 1. Setting up mocks for upload and polling.');
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue(
new Response(JSON.stringify({ jobId: 'job-123' }), { status: 200 }),
);
mockedAiApiClient.getJobStatus.mockResolvedValue(
new Response(JSON.stringify({ state: 'active', progress: { message: 'Checking...' } })),
);
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue({ jobId: 'job-123' });
mockedAiApiClient.getJobStatus.mockResolvedValue({
state: 'active',
progress: { message: 'Checking...' },
});
console.log('--- [TEST LOG] ---: 2. Rendering component and preparing file.');
renderComponent();
@@ -131,12 +130,11 @@ describe('FlyerUploader', () => {
it('should handle file upload via drag and drop', async () => {
console.log('--- [TEST LOG] ---: 1. Setting up mocks for drag and drop.');
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue(
new Response(JSON.stringify({ jobId: 'job-dnd' }), { status: 200 }),
);
mockedAiApiClient.getJobStatus.mockResolvedValue(
new Response(JSON.stringify({ state: 'active', progress: { message: 'Dropped...' } })),
);
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue({ jobId: 'job-dnd' });
mockedAiApiClient.getJobStatus.mockResolvedValue({
state: 'active',
progress: { message: 'Dropped...' },
});
console.log('--- [TEST LOG] ---: 2. Rendering component and preparing file for drop.');
renderComponent();
@@ -159,16 +157,10 @@ describe('FlyerUploader', () => {
it('should poll for status, complete successfully, and redirect', async () => {
const onProcessingComplete = vi.fn();
console.log('--- [TEST LOG] ---: 1. Setting up mock sequence for polling.');
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue(
new Response(JSON.stringify({ jobId: 'job-123' }), { status: 200 }),
);
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue({ jobId: 'job-123' });
mockedAiApiClient.getJobStatus
.mockResolvedValueOnce(
new Response(JSON.stringify({ state: 'active', progress: { message: 'Analyzing...' } })),
)
.mockResolvedValueOnce(
new Response(JSON.stringify({ state: 'completed', returnValue: { flyerId: 42 } })),
);
.mockResolvedValueOnce({ state: 'active', progress: { message: 'Analyzing...' } })
.mockResolvedValueOnce({ state: 'completed', returnValue: { flyerId: 42 } });
console.log('--- [TEST LOG] ---: 2. Rendering component and uploading file.');
renderComponent(onProcessingComplete);
@@ -229,12 +221,11 @@ describe('FlyerUploader', () => {
it('should handle a failed job', async () => {
console.log('--- [TEST LOG] ---: 1. Setting up mocks for a failed job.');
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue(
new Response(JSON.stringify({ jobId: 'job-fail' }), { status: 200 }),
);
mockedAiApiClient.getJobStatus.mockResolvedValue(
new Response(JSON.stringify({ state: 'failed', failedReason: 'AI model exploded' })),
);
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue({ jobId: 'job-fail' });
mockedAiApiClient.getJobStatus.mockResolvedValue({
state: 'failed',
failedReason: 'AI model exploded',
});
console.log('--- [TEST LOG] ---: 2. Rendering and uploading.');
renderComponent();
@@ -260,11 +251,82 @@ describe('FlyerUploader', () => {
console.log('--- [TEST LOG] ---: 6. "Upload Another" button confirmed.');
});
it('should clear the polling timeout when a job fails', async () => {
const clearTimeoutSpy = vi.spyOn(global, 'clearTimeout');
console.log('--- [TEST LOG] ---: 1. Setting up mocks for failed job timeout clearance.');
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue({ jobId: 'job-fail-timeout' });
// We need at least one 'active' response to establish a timeout loop so we have something to clear
mockedAiApiClient.getJobStatus
.mockResolvedValueOnce({ state: 'active', progress: { message: 'Working...' } })
.mockResolvedValueOnce({ state: 'failed', failedReason: 'Fatal Error' });
renderComponent();
const file = new File(['content'], 'flyer.pdf', { type: 'application/pdf' });
const input = screen.getByLabelText(/click to select a file/i);
fireEvent.change(input, { target: { files: [file] } });
// Wait for the first poll to complete and UI to update to "Working..."
await screen.findByText('Working...');
// Advance time to trigger the second poll
await act(async () => {
vi.advanceTimersByTime(3000);
});
// Wait for the failure UI
await screen.findByText(/Processing failed: Fatal Error/i);
// Verify clearTimeout was called
expect(clearTimeoutSpy).toHaveBeenCalled();
// Verify no further polling occurs
const callsBefore = mockedAiApiClient.getJobStatus.mock.calls.length;
await act(async () => {
vi.advanceTimersByTime(10000);
});
expect(mockedAiApiClient.getJobStatus).toHaveBeenCalledTimes(callsBefore);
clearTimeoutSpy.mockRestore();
});
it('should clear the polling timeout when the component unmounts', async () => {
const clearTimeoutSpy = vi.spyOn(global, 'clearTimeout');
console.log('--- [TEST LOG] ---: 1. Setting up mocks for unmount timeout clearance.');
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue({ jobId: 'job-unmount' });
mockedAiApiClient.getJobStatus.mockResolvedValue({
state: 'active',
progress: { message: 'Polling...' },
});
const { unmount } = renderComponent();
const file = new File(['content'], 'flyer.pdf', { type: 'application/pdf' });
const input = screen.getByLabelText(/click to select a file/i);
fireEvent.change(input, { target: { files: [file] } });
// Wait for the first poll to complete and the UI to show the polling state
await screen.findByText('Polling...');
// Now that we are in a polling state (and a timeout is set), unmount the component
console.log('--- [TEST LOG] ---: 2. Unmounting component to trigger cleanup effect.');
unmount();
// Verify that the cleanup function in the useEffect hook was called
expect(clearTimeoutSpy).toHaveBeenCalled();
console.log('--- [TEST LOG] ---: 3. clearTimeout confirmed.');
clearTimeoutSpy.mockRestore();
});
it('should handle a duplicate flyer error (409)', async () => {
console.log('--- [TEST LOG] ---: 1. Setting up mock for 409 duplicate error.');
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue(
new Response(JSON.stringify({ flyerId: 99, message: 'Duplicate' }), { status: 409 }),
);
// The API client now throws a structured error for non-2xx responses.
mockedAiApiClient.uploadAndProcessFlyer.mockRejectedValue({
status: 409,
body: { flyerId: 99, message: 'Duplicate' },
});
console.log('--- [TEST LOG] ---: 2. Rendering and uploading.');
renderComponent();
@@ -295,12 +357,11 @@ describe('FlyerUploader', () => {
it('should allow the user to stop watching progress', async () => {
console.log('--- [TEST LOG] ---: 1. Setting up mocks for infinite polling.');
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue(
new Response(JSON.stringify({ jobId: 'job-stop' }), { status: 200 }),
);
mockedAiApiClient.getJobStatus.mockResolvedValue(
new Response(JSON.stringify({ state: 'active', progress: { message: 'Analyzing...' } })),
);
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue({ jobId: 'job-stop' });
mockedAiApiClient.getJobStatus.mockResolvedValue({
state: 'active',
progress: { message: 'Analyzing...' },
} as any);
console.log('--- [TEST LOG] ---: 2. Rendering and uploading.');
renderComponent();
@@ -362,9 +423,11 @@ describe('FlyerUploader', () => {
it('should handle a generic network error during upload', async () => {
console.log('--- [TEST LOG] ---: 1. Setting up mock for generic upload error.');
mockedAiApiClient.uploadAndProcessFlyer.mockRejectedValue(
new Error('Network Error During Upload'),
);
// Simulate a structured error from the API client
mockedAiApiClient.uploadAndProcessFlyer.mockRejectedValue({
status: 500,
body: { message: 'Network Error During Upload' },
});
renderComponent();
const file = new File(['content'], 'flyer.pdf', { type: 'application/pdf' });
const input = screen.getByLabelText(/click to select a file/i);
@@ -379,9 +442,7 @@ describe('FlyerUploader', () => {
it('should handle a generic network error during polling', async () => {
console.log('--- [TEST LOG] ---: 1. Setting up mock for polling error.');
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue(
new Response(JSON.stringify({ jobId: 'job-poll-fail' }), { status: 200 }),
);
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue({ jobId: 'job-poll-fail' });
mockedAiApiClient.getJobStatus.mockRejectedValue(new Error('Polling Network Error'));
renderComponent();
@@ -398,11 +459,9 @@ describe('FlyerUploader', () => {
it('should handle a completed job with a missing flyerId', async () => {
console.log('--- [TEST LOG] ---: 1. Setting up mock for malformed completion payload.');
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue(
new Response(JSON.stringify({ jobId: 'job-no-flyerid' }), { status: 200 }),
);
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue({ jobId: 'job-no-flyerid' });
mockedAiApiClient.getJobStatus.mockResolvedValue(
new Response(JSON.stringify({ state: 'completed', returnValue: {} })), // No flyerId
{ state: 'completed', returnValue: {} }, // No flyerId
);
renderComponent();
@@ -419,6 +478,27 @@ describe('FlyerUploader', () => {
console.log('--- [TEST LOG] ---: 4. Assertions passed.');
});
it('should handle a non-JSON response during polling', async () => {
console.log('--- [TEST LOG] ---: 1. Setting up mock for non-JSON response.');
// The actual function would throw, so we mock the rejection.
// The new getJobStatus would throw an error like "Failed to parse JSON..."
mockedAiApiClient.uploadAndProcessFlyer.mockResolvedValue({ jobId: 'job-bad-json' });
mockedAiApiClient.getJobStatus.mockRejectedValue(
new Error('Failed to parse JSON response from server. Body: <html>502 Bad Gateway</html>'),
);
renderComponent();
const file = new File(['content'], 'flyer.pdf', { type: 'application/pdf' });
const input = screen.getByLabelText(/click to select a file/i);
console.log('--- [TEST LOG] ---: 2. Firing file change event.');
fireEvent.change(input, { target: { files: [file] } });
console.log('--- [TEST LOG] ---: 3. Awaiting error message.');
expect(await screen.findByText(/Failed to parse JSON response from server/i)).toBeInTheDocument();
console.log('--- [TEST LOG] ---: 4. Assertions passed.');
});
it('should do nothing if the file input is cancelled', () => {
renderComponent();
const input = screen.getByLabelText(/click to select a file/i);

View File

@@ -60,14 +60,8 @@ export const FlyerUploader: React.FC<FlyerUploaderProps> = ({ onProcessingComple
const pollStatus = async () => {
console.debug(`[DEBUG] pollStatus(): Polling for jobId: ${jobId}`);
try {
const statusResponse = await getJobStatus(jobId);
console.debug(`[DEBUG] pollStatus(): API response status: ${statusResponse.status}`);
if (!statusResponse.ok) {
throw new Error(`Failed to get job status (HTTP ${statusResponse.status})`);
}
const job = await statusResponse.json();
console.debug('[DEBUG] pollStatus(): Job status received:', job);
const job = await getJobStatus(jobId); // Now returns parsed JSON directly
console.debug('[DEBUG] pollStatus(): Job status received:', job);
// The rest of the logic remains the same.
if (job.progress) {
setProcessingStages(job.progress.stages || []);
@@ -97,7 +91,13 @@ export const FlyerUploader: React.FC<FlyerUploaderProps> = ({ onProcessingComple
console.debug(
`[DEBUG] pollStatus(): Job state is "failed". Reason: ${job.failedReason}`,
);
// Explicitly clear any pending timeout to stop the polling loop immediately.
if (pollingTimeoutRef.current) {
clearTimeout(pollingTimeoutRef.current);
}
setErrorMessage(`Processing failed: ${job.failedReason || 'Unknown error'}`);
// Clear any stale "in-progress" messages to avoid user confusion.
setStatusMessage(null);
setProcessingState('error');
break;
@@ -150,29 +150,24 @@ export const FlyerUploader: React.FC<FlyerUploaderProps> = ({ onProcessingComple
`[DEBUG] processFile(): Checksum generated: ${checksum}. Calling uploadAndProcessFlyer.`,
);
const startResponse = await uploadAndProcessFlyer(file, checksum);
console.debug(`[DEBUG] processFile(): Upload response status: ${startResponse.status}`);
if (!startResponse.ok) {
const errorData = await startResponse.json();
console.debug('[DEBUG] processFile(): Upload failed. Error data:', errorData);
if (startResponse.status === 409 && errorData.flyerId) {
setErrorMessage(`This flyer has already been processed. You can view it here:`);
setDuplicateFlyerId(errorData.flyerId);
} else {
setErrorMessage(errorData.message || `Upload failed with status ${startResponse.status}`);
}
setProcessingState('error');
return;
}
const { jobId: newJobId } = await startResponse.json();
// The API client now returns parsed JSON on success or throws a structured error on failure.
const { jobId: newJobId } = await uploadAndProcessFlyer(file, checksum);
console.debug(`[DEBUG] processFile(): Upload successful. Received jobId: ${newJobId}`);
setJobId(newJobId);
setProcessingState('polling');
} catch (error) {
logger.error('An unexpected error occurred during file upload:', { error });
setErrorMessage(error instanceof Error ? error.message : 'An unexpected error occurred.');
} catch (error: any) {
// Handle the structured error thrown by the API client.
logger.error('An error occurred during file upload:', { error });
// Handle 409 Conflict for duplicate flyers
if (error?.status === 409 && error.body?.flyerId) {
setErrorMessage(`This flyer has already been processed. You can view it here:`);
setDuplicateFlyerId(error.body.flyerId);
} else {
// Handle other errors (e.g., validation, server errors)
const message =
error?.body?.message || error?.message || 'An unexpected error occurred during upload.';
setErrorMessage(message);
}
setProcessingState('error');
}
}, []);
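Taken together with the test mocks above, the refactored API client contract looks roughly like the sketch below: resolve with parsed JSON on 2xx, throw a structured { status, body } object on non-2xx, and throw an Error when the body is not JSON. This is a hedged reconstruction; the actual file and names in the repo may differ.

// aiApiClient sketch — hypothetical reconstruction, not the repo's implementation.
interface ApiError {
  status: number;
  body: { message?: string; flyerId?: number };
}

async function parseJsonOrThrow(response: Response): Promise<unknown> {
  const text = await response.text();
  let body: unknown;
  try {
    body = JSON.parse(text);
  } catch {
    throw new Error(`Failed to parse JSON response from server. Body: ${text}`);
  }
  if (!response.ok) {
    const error: ApiError = { status: response.status, body: body as ApiError['body'] };
    throw error; // structured error, e.g. { status: 409, body: { flyerId: 99, message: 'Duplicate' } }
  }
  return body; // parsed payload, e.g. { jobId } or { state, progress }
}

export const getJobStatus = async (jobId: string) =>
  parseJsonOrThrow(await fetch(`/api/ai/jobs/${jobId}/status`));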

View File

@@ -2,8 +2,8 @@
import React, { ReactNode } from 'react';
import { renderHook, waitFor } from '@testing-library/react';
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { useUserData } from '../hooks/useUserData';
import { useAuth } from '../hooks/useAuth';
import { useUserData } from './useUserData';
import { useAuth } from './useAuth';
import { UserDataProvider } from '../providers/UserDataProvider';
import { useApiOnMount } from './useApiOnMount';
import type { UserProfile } from '../types';

View File

@@ -4,7 +4,7 @@
This single directive replaces @tailwind base, components, and utilities.
It is the new entry point for all of Tailwind's generated CSS.
*/
@import "tailwindcss";
@import 'tailwindcss';
/*
This is the new v4 directive that tells the @tailwindcss/postcss plugin
@@ -12,4 +12,3 @@
Since tailwind.config.js is in the root and this is in src/, the path is '../tailwind.config.js'.
*/
@config '../tailwind.config.js';

View File

@@ -8,17 +8,16 @@ import './index.css';
const rootElement = document.getElementById('root');
if (!rootElement) {
throw new Error("Could not find root element to mount to");
throw new Error('Could not find root element to mount to');
}
const root = ReactDOM.createRoot(rootElement);
root.render(
<React.StrictMode>
<BrowserRouter>
<BrowserRouter>
<AppProviders>
<App />
</AppProviders>
</BrowserRouter>
</React.StrictMode>
</BrowserRouter>
</React.StrictMode>,
);

View File

@@ -5,4 +5,4 @@ import toast from 'react-hot-toast';
// This intermediate file allows us to mock 'src/lib/toast' reliably in tests
// without wrestling with the internal structure of the 'react-hot-toast' package.
export * from 'react-hot-toast';
export default toast;
export default toast;

View File

@@ -15,6 +15,11 @@ import { NotFoundError } from '../services/db/errors.db'; // This can stay, it's
import { createTestApp } from '../tests/utils/createTestApp';
import { mockLogger } from '../tests/utils/mockLogger';
// Mock the file upload middleware to allow testing the controller's internal check
vi.mock('../middleware/fileUpload.middleware', () => ({
requireFileUpload: () => (req: Request, res: Response, next: NextFunction) => next(),
}));
vi.mock('../lib/queue', () => ({
serverAdapter: {
getRouter: () => (req: Request, res: Response, next: NextFunction) => next(), // Return a dummy express handler
@@ -125,12 +130,6 @@ describe('Admin Content Management Routes (/api/admin)', () => {
authenticatedUser: adminUser,
});
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
beforeEach(() => {
vi.clearAllMocks();
});
@@ -262,7 +261,7 @@ describe('Admin Content Management Routes (/api/admin)', () => {
const response = await supertest(app).post('/api/admin/brands/55/logo');
expect(response.status).toBe(400);
expect(response.body.message).toMatch(
/Logo image file is required|The request data is invalid/,
/Logo image file is required|The request data is invalid|Logo image file is missing./,
);
});
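The same inline error handler is removed from several sibling test suites below, so createTestApp presumably now registers an equivalent handler itself. Based on the deleted code, that shared handler would look like the following (its placement inside createTestApp is an assumption):

// Assumed to live inside createTestApp; mirrors the inline handlers removed from the tests.
import type { NextFunction, Request, Response } from 'express';

const testErrorHandler = (err: any, req: Request, res: Response, next: NextFunction) => {
  res.status(err.status || 500).json({ message: err.message, errors: err.errors });
};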

View File

@@ -97,12 +97,6 @@ describe('Admin Job Trigger Routes (/api/admin/trigger)', () => {
authenticatedUser: adminUser,
});
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
beforeEach(() => {
vi.clearAllMocks();
});
@@ -248,6 +242,17 @@ describe('Admin Job Trigger Routes (/api/admin/trigger)', () => {
expect(response.status).toBe(400);
});
it('should return 404 if the queue name is valid but not in the retry map', async () => {
const queueName = 'weekly-analytics-reporting'; // This is in the Zod enum but not the queueMap
const jobId = 'some-job-id';
const response = await supertest(app).post(`/api/admin/jobs/${queueName}/${jobId}/retry`);
// The route throws a NotFoundError, which the error handler should convert to a 404.
expect(response.status).toBe(404);
expect(response.body.message).toBe(`Queue 'weekly-analytics-reporting' not found.`);
});
it('should return 404 if the job ID is not found in the queue', async () => {
vi.mocked(flyerQueue.getJob).mockResolvedValue(undefined);
const response = await supertest(app).post(

View File

@@ -102,12 +102,6 @@ describe('Admin Monitoring Routes (/api/admin)', () => {
authenticatedUser: adminUser,
});
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
beforeEach(() => {
vi.clearAllMocks();
});

View File

@@ -2,12 +2,11 @@
import { Router, NextFunction, Request, Response } from 'express';
import passport from './passport.routes';
import { isAdmin } from './passport.routes'; // Correctly imported
import multer from 'multer'; // --- Zod Schemas for Admin Routes (as per ADR-003) ---
import multer from 'multer';
import { z } from 'zod';
import * as db from '../services/db/index.db';
import { logger } from '../services/logger.server';
import { UserProfile } from '../types';
import type { UserProfile } from '../types';
import { geocodingService } from '../services/geocodingService.server';
import { requireFileUpload } from '../middleware/fileUpload.middleware'; // This was a duplicate, fixed.
import { NotFoundError, ValidationError } from '../services/db/errors.db';
@@ -33,45 +32,27 @@ import {
weeklyAnalyticsWorker,
} from '../services/queueService.server'; // Import your queues
import { getSimpleWeekAndYear } from '../utils/dateUtils';
import {
requiredString,
numericIdParam,
uuidParamSchema,
optionalNumeric,
} from '../utils/zodUtils';
import { logger } from '../services/logger.server';
// Helper for consistent required string validation (handles missing/null/empty)
const requiredString = (message: string) =>
z.preprocess((val) => val ?? '', z.string().min(1, message));
/**
* A factory for creating a Zod schema that validates a UUID in the request parameters.
* @param key The name of the parameter key (e.g., 'userId').
* @param message A custom error message for invalid UUIDs.
*/
const uuidParamSchema = (key: string, message = `Invalid UUID for parameter '${key}'.`) =>
z.object({
params: z.object({ [key]: z.string().uuid({ message }) }),
});
/**
* A factory for creating a Zod schema that validates a numeric ID in the request parameters.
*/
const numericIdParamSchema = (
key: string,
message = `Invalid ID for parameter '${key}'. Must be a positive integer.`,
) =>
z.object({
params: z.object({ [key]: z.coerce.number().int({ message }).positive({ message }) }),
});
const updateCorrectionSchema = numericIdParamSchema('id').extend({
const updateCorrectionSchema = numericIdParam('id').extend({
body: z.object({
suggested_value: requiredString('A new suggested_value is required.'),
}),
});
const updateRecipeStatusSchema = numericIdParamSchema('id').extend({
const updateRecipeStatusSchema = numericIdParam('id').extend({
body: z.object({
status: z.enum(['private', 'pending_review', 'public', 'rejected']),
}),
});
const updateCommentStatusSchema = numericIdParamSchema('id').extend({
const updateCommentStatusSchema = numericIdParam('id').extend({
body: z.object({
status: z.enum(['visible', 'hidden', 'reported']),
}),
@@ -85,8 +66,8 @@ const updateUserRoleSchema = uuidParamSchema('id', 'A valid user ID is required.
const activityLogSchema = z.object({
query: z.object({
limit: z.coerce.number().int().positive().optional().default(50),
offset: z.coerce.number().int().nonnegative().optional().default(0),
limit: optionalNumeric({ default: 50, integer: true, positive: true }),
offset: optionalNumeric({ default: 0, integer: true, nonnegative: true }),
}),
});
@@ -187,10 +168,10 @@ router.get('/stats/daily', async (req, res, next: NextFunction) => {
router.post(
'/corrections/:id/approve',
validateRequest(numericIdParamSchema('id')),
validateRequest(numericIdParam('id')),
async (req: Request, res: Response, next: NextFunction) => {
// Apply ADR-003 pattern for type safety
const { params } = req as unknown as z.infer<ReturnType<typeof numericIdParamSchema>>;
const { params } = req as unknown as z.infer<ReturnType<typeof numericIdParam>>;
try {
await db.adminRepo.approveCorrection(params.id, req.log); // params.id is now safely typed as number
res.status(200).json({ message: 'Correction approved successfully.' });
@@ -202,10 +183,10 @@ router.post(
router.post(
'/corrections/:id/reject',
validateRequest(numericIdParamSchema('id')),
validateRequest(numericIdParam('id')),
async (req: Request, res: Response, next: NextFunction) => {
// Apply ADR-003 pattern for type safety
const { params } = req as unknown as z.infer<ReturnType<typeof numericIdParamSchema>>;
const { params } = req as unknown as z.infer<ReturnType<typeof numericIdParam>>;
try {
await db.adminRepo.rejectCorrection(params.id, req.log); // params.id is now safely typed as number
res.status(200).json({ message: 'Correction rejected successfully.' });
@@ -251,12 +232,12 @@ router.put(
router.post(
'/brands/:id/logo',
validateRequest(numericIdParamSchema('id')),
validateRequest(numericIdParam('id')),
upload.single('logoImage'),
requireFileUpload('logoImage'),
async (req: Request, res: Response, next: NextFunction) => {
// Apply ADR-003 pattern for type safety
const { params } = req as unknown as z.infer<ReturnType<typeof numericIdParamSchema>>;
const { params } = req as unknown as z.infer<ReturnType<typeof numericIdParam>>;
try {
// Although requireFileUpload middleware should ensure the file exists,
// this check satisfies TypeScript and adds robustness.
@@ -288,11 +269,11 @@ router.get('/unmatched-items', async (req, res, next: NextFunction) => {
*/
router.delete(
'/recipes/:recipeId',
validateRequest(numericIdParamSchema('recipeId')),
validateRequest(numericIdParam('recipeId')),
async (req: Request, res: Response, next: NextFunction) => {
const userProfile = req.user as UserProfile;
// Infer the type directly from the schema generator function. // This was a duplicate, fixed.
const { params } = req as unknown as z.infer<ReturnType<typeof numericIdParamSchema>>;
const { params } = req as unknown as z.infer<ReturnType<typeof numericIdParam>>;
try {
// The isAdmin flag bypasses the ownership check in the repository method.
await db.recipeRepo.deleteRecipe(params.recipeId, userProfile.user.user_id, true, req.log);
@@ -308,10 +289,10 @@ router.delete(
*/
router.delete(
'/flyers/:flyerId',
validateRequest(numericIdParamSchema('flyerId')),
validateRequest(numericIdParam('flyerId')),
async (req: Request, res: Response, next: NextFunction) => {
// Infer the type directly from the schema generator function.
const { params } = req as unknown as z.infer<ReturnType<typeof numericIdParamSchema>>;
const { params } = req as unknown as z.infer<ReturnType<typeof numericIdParam>>;
try {
await db.flyerRepo.deleteFlyer(params.flyerId, req.log);
res.status(204).send();
@@ -435,12 +416,10 @@ router.post(
// We call the function but don't wait for it to finish (no `await`).
// This is a "fire-and-forget" operation from the client's perspective.
backgroundJobService.runDailyDealCheck();
res
.status(202)
.json({
message:
'Daily deal check job has been triggered successfully. It will run in the background.',
});
res.status(202).json({
message:
'Daily deal check job has been triggered successfully. It will run in the background.',
});
} catch (error) {
logger.error({ error }, '[Admin] Failed to trigger daily deal check job.');
next(error);
@@ -467,11 +446,9 @@ router.post(
const job = await analyticsQueue.add('generate-daily-report', { reportDate }, { jobId });
res
.status(202)
.json({
message: `Analytics report generation job has been enqueued successfully. Job ID: ${job.id}`,
});
res.status(202).json({
message: `Analytics report generation job has been enqueued successfully. Job ID: ${job.id}`,
});
} catch (error) {
logger.error({ error }, '[Admin] Failed to enqueue analytics report job.');
next(error);
@@ -485,11 +462,11 @@ router.post(
*/
router.post(
'/flyers/:flyerId/cleanup',
validateRequest(numericIdParamSchema('flyerId')),
validateRequest(numericIdParam('flyerId')),
async (req: Request, res: Response, next: NextFunction) => {
const userProfile = req.user as UserProfile;
// Infer type from the schema generator for type safety, as per ADR-003.
const { params } = req as unknown as z.infer<ReturnType<typeof numericIdParamSchema>>; // This was a duplicate, fixed.
const { params } = req as unknown as z.infer<ReturnType<typeof numericIdParam>>; // This was a duplicate, fixed.
logger.info(
`[Admin] Manual trigger for flyer file cleanup received from user: ${userProfile.user.user_id} for flyer ID: ${params.flyerId}`,
);
@@ -541,11 +518,9 @@ router.post(
try {
const keysDeleted = await geocodingService.clearGeocodeCache(req.log);
res
.status(200)
.json({
message: `Successfully cleared the geocode cache. ${keysDeleted} keys were removed.`,
});
res.status(200).json({
message: `Successfully cleared the geocode cache. ${keysDeleted} keys were removed.`,
});
} catch (error) {
logger.error({ error }, '[Admin] Failed to clear geocode cache.');
next(error);
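These routes now pull their shared validators from src/utils/zodUtils. A sketch of that module, reconstructed from the inline helpers deleted above; optionalNumeric is inferred from its call sites (e.g. optionalNumeric({ default: 50, integer: true, positive: true })) and is an assumption:

// src/utils/zodUtils.ts — hedged reconstruction from the removed inline helpers.
import { z } from 'zod';

export const requiredString = (message: string) =>
  z.preprocess((val) => val ?? '', z.string().min(1, message));

export const uuidParamSchema = (key: string, message = `Invalid UUID for parameter '${key}'.`) =>
  z.object({ params: z.object({ [key]: z.string().uuid({ message }) }) });

export const numericIdParam = (
  key: string,
  message = `Invalid ID for parameter '${key}'. Must be a positive integer.`,
) =>
  z.object({
    params: z.object({ [key]: z.coerce.number().int({ message }).positive({ message }) }),
  });

// Inferred from call sites; the option names are assumptions.
export const optionalNumeric = (opts: {
  default: number;
  integer?: boolean;
  positive?: boolean;
  nonnegative?: boolean;
}) => {
  let schema = z.coerce.number();
  if (opts.integer) schema = schema.int();
  if (opts.positive) schema = schema.positive();
  if (opts.nonnegative) schema = schema.nonnegative();
  return schema.optional().default(opts.default);
};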

View File

@@ -73,12 +73,6 @@ describe('Admin Stats Routes (/api/admin/stats)', () => {
authenticatedUser: adminUser,
});
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
beforeEach(() => {
vi.clearAllMocks();
});

View File

@@ -79,12 +79,6 @@ describe('Admin System Routes (/api/admin/system)', () => {
authenticatedUser: adminUser,
});
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
beforeEach(() => {
vi.clearAllMocks();
});

View File

@@ -83,12 +83,6 @@ describe('Admin User Management Routes (/api/admin/users)', () => {
authenticatedUser: adminUser,
});
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
beforeEach(() => {
vi.clearAllMocks();
});

View File

@@ -78,6 +78,7 @@ describe('AI Routes (/api/ai)', () => {
vi.mocked(mockLogger.info).mockImplementation(() => {});
vi.mocked(mockLogger.error).mockImplementation(() => {});
vi.mocked(mockLogger.warn).mockImplementation(() => {});
vi.mocked(mockLogger.debug).mockImplementation(() => {}); // Ensure debug is also mocked
});
const app = createTestApp({ router: aiRouter, basePath: '/api/ai' });
@@ -86,12 +87,15 @@ describe('AI Routes (/api/ai)', () => {
// Arrange
const mkdirError = new Error('EACCES: permission denied');
vi.resetModules(); // Reset modules to re-run top-level code
vi.doMock('node:fs', () => ({
...fs,
mkdirSync: vi.fn().mockImplementation(() => {
throw mkdirError;
}),
}));
vi.doMock('node:fs', () => {
const mockFs = {
...fs,
mkdirSync: vi.fn().mockImplementation(() => {
throw mkdirError;
}),
};
return { ...mockFs, default: mockFs };
});
const { logger } = await import('../services/logger.server');
// Act: Dynamically import the router to trigger the mkdirSync call
@@ -108,10 +112,55 @@ describe('AI Routes (/api/ai)', () => {
});
});
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
// New test to cover the router.use diagnostic middleware's catch block and errMsg branches
describe('Diagnostic Middleware Error Handling', () => {
it('should log an error if logger.debug throws an object with a message property', async () => {
const mockErrorObject = { message: 'Mock debug error' };
vi.mocked(mockLogger.debug).mockImplementationOnce(() => {
throw mockErrorObject;
});
// Make any request to trigger the middleware
const response = await supertest(app).get('/api/ai/jobs/job-123/status');
expect(mockLogger.error).toHaveBeenCalledWith(
{ error: mockErrorObject.message }, // errMsg should extract the message
'Failed to log incoming AI request headers',
);
// The request should still proceed, but might fail later if the original flow was interrupted.
// Here, it will likely hit the 404 for job not found.
expect(response.status).toBe(404);
});
it('should log an error if logger.debug throws a primitive string', async () => {
const mockErrorString = 'Mock debug error string';
vi.mocked(mockLogger.debug).mockImplementationOnce(() => {
throw mockErrorString;
});
// Make any request to trigger the middleware
const response = await supertest(app).get('/api/ai/jobs/job-123/status');
expect(mockLogger.error).toHaveBeenCalledWith(
{ error: mockErrorString }, // errMsg should convert to string
'Failed to log incoming AI request headers',
);
expect(response.status).toBe(404);
});
it('should log an error if logger.debug throws null/undefined', async () => {
vi.mocked(mockLogger.debug).mockImplementationOnce(() => {
throw null; // Simulate throwing null
});
const response = await supertest(app).get('/api/ai/jobs/job-123/status');
expect(mockLogger.error).toHaveBeenCalledWith(
{ error: 'An unknown error occurred.' }, // errMsg should handle null/undefined
'Failed to log incoming AI request headers',
);
expect(response.status).toBe(404);
});
});
describe('POST /upload-and-process', () => {
@@ -304,10 +353,11 @@ describe('AI Routes (/api/ai)', () => {
expect(response.status).toBe(400);
});
it('should return 409 Conflict if flyer checksum already exists', async () => {
it('should return 409 Conflict and delete the uploaded file if flyer checksum already exists', async () => {
// Arrange
const mockExistingFlyer = createMockFlyer({ flyer_id: 99 });
vi.mocked(mockedDb.flyerRepo.findFlyerByChecksum).mockResolvedValue(mockExistingFlyer); // Duplicate found
const unlinkSpy = vi.spyOn(fs.promises, 'unlink').mockResolvedValue(undefined);
// Act
const response = await supertest(app)
@@ -319,6 +369,10 @@ describe('AI Routes (/api/ai)', () => {
expect(response.status).toBe(409);
expect(response.body.message).toBe('This flyer has already been processed.');
expect(mockedDb.createFlyerAndItems).not.toHaveBeenCalled();
// Assert that the file was deleted
expect(unlinkSpy).toHaveBeenCalledTimes(1);
// The filename is predictable in the test environment because of the multer config in ai.routes.ts
expect(unlinkSpy).toHaveBeenCalledWith(expect.stringContaining('flyerImage-test-flyer-image.jpg'));
});
it('should accept payload when extractedData.items is missing and save with empty items', async () => {
@@ -420,6 +474,52 @@ describe('AI Routes (/api/ai)', () => {
expect(mockedDb.createFlyerAndItems).toHaveBeenCalledTimes(1);
});
it('should handle payload where extractedData is null', async () => {
const payloadWithNullExtractedData = {
checksum: 'null-extracted-data-checksum',
originalFileName: 'flyer-null.jpg',
extractedData: null,
};
const response = await supertest(app)
.post('/api/ai/flyers/process')
.field('data', JSON.stringify(payloadWithNullExtractedData))
.attach('flyerImage', imagePath);
expect(response.status).toBe(201);
expect(mockedDb.createFlyerAndItems).toHaveBeenCalledTimes(1);
// Verify that extractedData was correctly defaulted to an empty object
const flyerDataArg = vi.mocked(mockedDb.createFlyerAndItems).mock.calls[0][0];
expect(flyerDataArg.store_name).toContain('Unknown Store'); // Fallback should be used
expect(mockLogger.warn).toHaveBeenCalledWith(
{ bodyData: expect.any(Object) },
'Missing extractedData in /api/ai/flyers/process payload.',
);
});
it('should handle payload where extractedData is a string', async () => {
const payloadWithStringExtractedData = {
checksum: 'string-extracted-data-checksum',
originalFileName: 'flyer-string.jpg',
extractedData: 'not-an-object',
};
const response = await supertest(app)
.post('/api/ai/flyers/process')
.field('data', JSON.stringify(payloadWithStringExtractedData))
.attach('flyerImage', imagePath);
expect(response.status).toBe(201);
expect(mockedDb.createFlyerAndItems).toHaveBeenCalledTimes(1);
// Verify that extractedData was correctly defaulted to an empty object
const flyerDataArg = vi.mocked(mockedDb.createFlyerAndItems).mock.calls[0][0];
expect(flyerDataArg.store_name).toContain('Unknown Store'); // Fallback should be used
expect(mockLogger.warn).toHaveBeenCalledWith(
{ bodyData: expect.any(Object) },
'Missing extractedData in /api/ai/flyers/process payload.',
);
});
it('should handle payload where extractedData is at the root of the body', async () => {
// This simulates a client sending multipart fields for each property of extractedData
const response = await supertest(app)
@@ -435,6 +535,27 @@ describe('AI Routes (/api/ai)', () => {
const flyerDataArg = vi.mocked(mockedDb.createFlyerAndItems).mock.calls[0][0];
expect(flyerDataArg.store_name).toBe('Root Store');
});
it('should default item quantity to 1 if missing', async () => {
const payloadMissingQuantity = {
checksum: 'qty-checksum',
originalFileName: 'flyer-qty.jpg',
extractedData: {
store_name: 'Qty Store',
items: [{ name: 'Item without qty', price: 100 }],
},
};
const response = await supertest(app)
.post('/api/ai/flyers/process')
.field('data', JSON.stringify(payloadMissingQuantity))
.attach('flyerImage', imagePath);
expect(response.status).toBe(201);
expect(mockedDb.createFlyerAndItems).toHaveBeenCalledTimes(1);
const itemsArg = vi.mocked(mockedDb.createFlyerAndItems).mock.calls[0][1];
expect(itemsArg[0].quantity).toBe(1);
});
});
describe('POST /check-flyer', () => {
@@ -554,10 +675,11 @@ describe('AI Routes (/api/ai)', () => {
const mockUser = createMockUserProfile({
user: { user_id: 'user-123', email: 'user-123@test.com' },
});
const authenticatedApp = createTestApp({ router: aiRouter, basePath: '/api/ai', authenticatedUser: mockUser });
beforeEach(() => {
// Inject an authenticated user for this test block
app.use((req, res, next) => {
authenticatedApp.use((req, res, next) => {
req.user = mockUser;
next();
});
@@ -572,7 +694,7 @@ describe('AI Routes (/api/ai)', () => {
.field('cropArea', JSON.stringify({ x: 10, y: 10, width: 50, height: 50 }))
.field('extractionType', 'item_details')
.attach('image', imagePath);
// Use the authenticatedApp instance for requests in this block
expect(response.status).toBe(200);
expect(response.body).toEqual(mockResult);
expect(aiService.aiService.extractTextFromImageArea).toHaveBeenCalled();
@@ -583,7 +705,7 @@ describe('AI Routes (/api/ai)', () => {
new Error('AI API is down'),
);
const response = await supertest(app)
const response = await supertest(authenticatedApp)
.post('/api/ai/rescan-area')
.field('cropArea', JSON.stringify({ x: 10, y: 10, width: 50, height: 50 }))
.field('extractionType', 'item_details')
@@ -599,15 +721,12 @@ describe('AI Routes (/api/ai)', () => {
const mockUserProfile = createMockUserProfile({
user: { user_id: 'user-123', email: 'user-123@test.com' },
});
const authenticatedApp = createTestApp({ router: aiRouter, basePath: '/api/ai', authenticatedUser: mockUserProfile });
beforeEach(() => {
// For this block, simulate an authenticated request by attaching the user.
app.use((req, res, next) => {
req.user = mockUserProfile;
next();
});
// The authenticatedApp instance is already set up with mockUserProfile
});
it('POST /quick-insights should return the stubbed response', async () => {
const response = await supertest(app)
.post('/api/ai/quick-insights')
@@ -617,6 +736,14 @@ describe('AI Routes (/api/ai)', () => {
expect(response.body.text).toContain('server-generated quick insight');
});
it('POST /quick-insights should accept items with "item" property instead of "name"', async () => {
const response = await supertest(app)
.post('/api/ai/quick-insights')
.send({ items: [{ item: 'test item' }] });
expect(response.status).toBe(200);
});
it('POST /quick-insights should return 500 on a generic error', async () => {
// To hit the catch block, we can simulate an error by making the logger throw.
vi.mocked(mockLogger.info).mockImplementationOnce(() => {

View File

@@ -2,7 +2,7 @@
import { Router, Request, Response, NextFunction } from 'express';
import multer from 'multer';
import path from 'path';
import fs from 'fs';
import fs from 'node:fs';
import { z } from 'zod';
import passport from './passport.routes';
import { optionalAuth } from './passport.routes';
@@ -15,6 +15,7 @@ import { logger } from '../services/logger.server';
import { UserProfile, ExtractedCoreData, ExtractedFlyerItem } from '../types';
import { flyerQueue } from '../services/queueService.server';
import { validateRequest } from '../middleware/validation.middleware';
import { requiredString } from '../utils/zodUtils';
const router = Router();
@@ -26,9 +27,6 @@ interface FlyerProcessPayload extends Partial<ExtractedCoreData> {
}
// --- Zod Schemas for AI Routes (as per ADR-003) ---
// Helper for consistent required string validation (handles missing/null/empty)
const requiredString = (message: string) =>
z.preprocess((val) => val ?? '', z.string().min(1, message));
const uploadAndProcessSchema = z.object({
body: z.object({
@@ -52,6 +50,15 @@ const errMsg = (e: unknown) => {
return String(e || 'An unknown error occurred.');
};
const cleanupUploadedFile = async (file?: Express.Multer.File) => {
if (!file) return;
try {
await fs.promises.unlink(file.path);
} catch (err) {
// Ignore cleanup errors (e.g. file already deleted)
}
};
const cropAreaObjectSchema = z.object({
x: z.number(),
y: z.number(),
@@ -88,10 +95,17 @@ const rescanAreaSchema = z.object({
const flyerItemForAnalysisSchema = z
.object({
name: requiredString('Item name is required.'),
// Allow other properties to pass through without validation
item: z.string().nullish(),
name: z.string().nullish(),
})
.passthrough();
.passthrough()
.refine(
(data) =>
(data.item && data.item.trim().length > 0) || (data.name && data.name.trim().length > 0),
{
message: "Item identifier is required (either 'item' or 'name').",
},
);
const insightsSchema = z.object({
body: z.object({
@@ -180,7 +194,7 @@ router.use((req: Request, res: Response, next: NextFunction) => {
'[API /ai] Incoming request',
);
} catch (e: unknown) {
logger.error({ error: e }, 'Failed to log incoming AI request headers');
logger.error({ error: errMsg(e) }, 'Failed to log incoming AI request headers');
}
next();
});
@@ -311,7 +325,7 @@ router.post(
// Try several ways to obtain the payload so we are tolerant to client variations.
let parsed: FlyerProcessPayload = {};
let extractedData: Partial<ExtractedCoreData> = {};
let extractedData: Partial<ExtractedCoreData> | null | undefined = {};
try {
// If the client sent a top-level `data` field (stringified JSON), parse it.
if (req.body && (req.body.data || req.body.extractedData)) {
@@ -332,7 +346,7 @@ router.post(
) as FlyerProcessPayload;
}
// If parsed itself contains an `extractedData` field, use that, otherwise assume parsed is the extractedData
extractedData = parsed.extractedData ?? (parsed as Partial<ExtractedCoreData>);
extractedData = 'extractedData' in parsed ? parsed.extractedData : (parsed as Partial<ExtractedCoreData>);
} else {
// No explicit `data` field found. Attempt to interpret req.body as an object (Express may have parsed multipart fields differently).
try {
@@ -378,6 +392,12 @@ router.post(
// Pull common metadata fields (checksum, originalFileName) from whichever shape we parsed.
const checksum = parsed.checksum ?? parsed?.data?.checksum ?? '';
if (!checksum) {
await cleanupUploadedFile(req.file);
return res.status(400).json({ message: 'Checksum is required.' });
}
const originalFileName =
parsed.originalFileName ?? parsed?.data?.originalFileName ?? req.file.originalname;
const userProfile = req.user as UserProfile | undefined;
@@ -404,6 +424,7 @@ router.post(
const itemsForDb = itemsArray.map((item: Partial<ExtractedFlyerItem>) => ({
...item,
master_item_id: item.master_item_id === null ? undefined : item.master_item_id,
quantity: item.quantity ?? 1, // Default to 1 to satisfy DB constraint
view_count: 0,
click_count: 0,
updated_at: new Date().toISOString(),
@@ -424,6 +445,7 @@ router.post(
const existingFlyer = await db.flyerRepo.findFlyerByChecksum(checksum, req.log);
if (existingFlyer) {
logger.warn(`Duplicate flyer upload attempt blocked for checksum: ${checksum}`);
await cleanupUploadedFile(req.file);
return res.status(409).json({ message: 'This flyer has already been processed.' });
}
@@ -471,6 +493,7 @@ router.post(
res.status(201).json({ message: 'Flyer processed and saved successfully.', flyer: newFlyer });
} catch (error) {
await cleanupUploadedFile(req.file);
next(error);
}
},
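The diagnostic-middleware tests earlier cover three errMsg branches: an object with a message property, a primitive string, and null. Only the helper's final return appears in the hunks above; a reconstruction consistent with those tests follows (the exact object-branch code is an assumption):

// errMsg — hypothetical full form; only the final `return` line is visible in the diff.
const errMsg = (e: unknown): string => {
  if (e && typeof e === 'object' && 'message' in e) {
    return String((e as { message: unknown }).message);
  }
  return String(e || 'An unknown error occurred.');
};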

View File

@@ -192,7 +192,7 @@ describe('Auth Routes (/api/auth)', () => {
// Assert
expect(response.status).toBe(201);
expect(response.body.message).toBe('User registered successfully!');
expect(response.body.user.email).toBe(newUserEmail);
expect(response.body.userprofile.user.email).toBe(newUserEmail);
expect(response.body.token).toBeTypeOf('string'); // This was a duplicate, fixed.
expect(db.userRepo.createUser).toHaveBeenCalled();
});
@@ -295,9 +295,8 @@ describe('Auth Routes (/api/auth)', () => {
// Assert
expect(response.status).toBe(200);
// The API now returns a nested UserProfile object
expect(response.body.user).toEqual(
expect(response.body.userprofile).toEqual(
expect.objectContaining({
user_id: 'user-123',
user: expect.objectContaining({
user_id: 'user-123',
email: loginCredentials.email,
@@ -618,7 +617,9 @@ describe('Auth Routes (/api/auth)', () => {
const setCookieHeader = response.headers['set-cookie'];
expect(setCookieHeader).toBeDefined();
expect(setCookieHeader[0]).toContain('refreshToken=;');
expect(setCookieHeader[0]).toContain('Expires=Thu, 01 Jan 1970');
// Check for Max-Age=0, which is the modern way to expire a cookie.
// The 'Expires' attribute is a fallback and its exact value can be inconsistent.
expect(setCookieHeader[0]).toContain('Max-Age=0');
});
it('should still return 200 OK even if deleting the refresh token from DB fails', async () => {

View File

@@ -1,13 +1,12 @@
// src/routes/auth.routes.ts
import { Router, Request, Response, NextFunction } from 'express';
import * as bcrypt from 'bcrypt';
import zxcvbn from 'zxcvbn';
import { z } from 'zod';
import jwt from 'jsonwebtoken';
import crypto from 'crypto';
import rateLimit from 'express-rate-limit';
import passport from './passport.routes'; // Corrected import path
import passport from './passport.routes';
import { userRepo, adminRepo } from '../services/db/index.db';
import { UniqueConstraintError } from '../services/db/errors.db';
import { getPool } from '../services/db/connection.db';
@@ -15,38 +14,13 @@ import { logger } from '../services/logger.server';
import { sendPasswordResetEmail } from '../services/emailService.server';
import { validateRequest } from '../middleware/validation.middleware';
import type { UserProfile } from '../types';
import { validatePasswordStrength } from '../utils/authUtils';
import { requiredString } from '../utils/zodUtils';
const router = Router();
const JWT_SECRET = process.env.JWT_SECRET!;
/**
* Validates the strength of a password using zxcvbn.
* @param password The password to check.
* @returns An object with `isValid` and an optional `feedback` message.
*/
const validatePasswordStrength = (password: string): { isValid: boolean; feedback?: string } => {
const MIN_PASSWORD_SCORE = 3; // Require a 'Good' or 'Strong' password (score 3 or 4)
const strength = zxcvbn(password);
if (strength.score < MIN_PASSWORD_SCORE) {
const feedbackMessage =
strength.feedback.warning ||
(strength.feedback.suggestions && strength.feedback.suggestions[0]);
return {
isValid: false,
feedback:
`Password is too weak. ${feedbackMessage || 'Please choose a stronger password.'}`.trim(),
};
}
return { isValid: true };
};
// Helper for consistent required string validation (handles missing/null/empty)
const requiredString = (message: string) =>
z.preprocess((val) => val ?? '', z.string().min(1, message));
// Conditionally disable rate limiting for the test environment
const isTestEnv = process.env.NODE_ENV === 'test';
@@ -69,8 +43,6 @@ const resetPasswordLimiter = rateLimit({
skip: () => isTestEnv, // Skip this middleware if in test environment
});
// --- Zod Schemas for Auth Routes (as per ADR-003) ---
const registerSchema = z.object({
body: z.object({
email: z.string().email('A valid email is required.'),
@@ -213,7 +185,7 @@ router.post('/login', (req: Request, res: Response, next: NextFunction) => {
const accessToken = jwt.sign(payload, JWT_SECRET, { expiresIn: '15m' });
try {
const refreshToken = crypto.randomBytes(64).toString('hex'); // This was a duplicate, fixed.
const refreshToken = crypto.randomBytes(64).toString('hex');
await userRepo.saveRefreshToken(userProfile.user.user_id, refreshToken, req.log);
req.log.info(`JWT and refresh token issued for user: ${userProfile.user.email}`);
@@ -381,7 +353,7 @@ router.post('/logout', async (req: Request, res: Response) => {
// Instruct the browser to clear the cookie by setting its expiration to the past.
res.cookie('refreshToken', '', {
httpOnly: true,
expires: new Date(0),
maxAge: 0, // Use maxAge for modern compatibility; Express sets 'Expires' as a fallback.
secure: process.env.NODE_ENV === 'production',
});
res.status(200).json({ message: 'Logged out successfully.' });
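Setting maxAge: 0 makes Express emit a Set-Cookie header containing both Max-Age=0 and an epoch Expires date, which is exactly what the logout test above asserts. Express also has a built-in equivalent; a sketch, assuming the same cookie options:

// Equivalent sketch: clearCookie sets the expiry to the epoch itself.
// Options (other than expires/maxAge) must match those used when the cookie was set.
res.clearCookie('refreshToken', {
  httpOnly: true,
  secure: process.env.NODE_ENV === 'production',
});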

View File

@@ -69,17 +69,7 @@ describe('Budget Routes (/api/budgets)', () => {
vi.mocked(db.budgetRepo.getSpendingByCategory).mockResolvedValue([]);
});
const app = createTestApp({
router: budgetRouter,
basePath: '/api/budgets',
authenticatedUser: mockUser,
});
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
const app = createTestApp({ router: budgetRouter, basePath: '/api/budgets', authenticatedUser: mockUserProfile });
describe('GET /', () => {
it('should return a list of budgets for the user', async () => {

View File
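The inline error handler is deleted from this and the following test files, which only makes sense if createTestApp now registers an equivalent handler itself. A minimal sketch of such a factory under that assumption (option names are taken from the call sites; the handler body mirrors the deleted blocks; this is not the repo's actual file):

// src/tests/utils/createTestApp.ts (sketch)
import express, { type Router, type Request, type Response, type NextFunction } from 'express';
import { mockLogger } from './mockLogger';

interface TestAppOptions {
  router: Router;
  basePath: string;
  authenticatedUser?: unknown;
}

export function createTestApp({ router, basePath, authenticatedUser }: TestAppOptions) {
  const app = express();
  app.use(express.json());
  app.use((req: any, _res: Response, next: NextFunction) => {
    req.log = mockLogger; // handlers call req.log.*
    if (authenticatedUser) req.user = authenticatedUser; // bypass passport in tests
    next();
  });
  app.use(basePath, router);
  // The same JSON error handler the test files above used to register inline.
  app.use((err: any, _req: Request, res: Response, _next: NextFunction) => {
    res.status(err.status || 500).json({ message: err.message, errors: err.errors });
  });
  return app;
}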

@@ -5,20 +5,12 @@ import passport from './passport.routes';
import { budgetRepo } from '../services/db/index.db';
import type { UserProfile } from '../types';
import { validateRequest } from '../middleware/validation.middleware';
import { requiredString, numericIdParam } from '../utils/zodUtils';
const router = express.Router();
// Helper for consistent required string validation (handles missing/null/empty)
const requiredString = (message: string) =>
z.preprocess((val) => val ?? '', z.string().min(1, message));
// --- Zod Schemas for Budget Routes (as per ADR-003) ---
const budgetIdParamSchema = z.object({
params: z.object({
id: z.coerce.number().int().positive("Invalid ID for parameter 'id'. Must be a number."),
}),
});
const budgetIdParamSchema = numericIdParam('id', "Invalid ID for parameter 'id'. Must be a number.");
const createBudgetSchema = z.object({
body: z.object({

View File
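Several route files in this diff now import shared validators from src/utils/zodUtils. The module itself is not shown; a sketch of what it plausibly contains, inferred from the call sites. requiredString and numericIdParam match the inline versions being deleted; optionalNumeric and optionalBoolean are reconstructions from their options objects, and the non-null assertions added at the call sites suggest the real helper's inferred output may stay optional, so treat those two as assumptions:

// src/utils/zodUtils.ts (reconstruction, not the repo's actual file)
import { z } from 'zod';

// Treat missing/null as '' so .min(1) produces one consistent message.
export const requiredString = (message: string) =>
  z.preprocess((val) => val ?? '', z.string().min(1, message));

// { params: { [key]: positive integer } } schema for route params.
export const numericIdParam = (key: string, message?: string) =>
  z.object({
    params: z.object({
      [key]: z.coerce
        .number()
        .int()
        .positive(message ?? `Invalid ID for parameter '${key}'. Must be a number.`),
    }),
  });

interface OptionalNumericOptions {
  default?: number;
  integer?: boolean;
  positive?: boolean;
  nonnegative?: boolean;
  min?: number;
  max?: number;
}

// Coerced, optionally bounded number with an optional default.
export const optionalNumeric = (opts: OptionalNumericOptions = {}) => {
  let schema = z.coerce.number();
  if (opts.integer) schema = schema.int();
  if (opts.positive) schema = schema.positive();
  if (opts.nonnegative) schema = schema.nonnegative();
  if (opts.min !== undefined) schema = schema.min(opts.min);
  if (opts.max !== undefined) schema = schema.max(opts.max);
  return opts.default !== undefined ? schema.optional().default(opts.default) : schema.optional();
};

export const optionalBoolean = (opts: { default?: boolean } = {}) => {
  // z.coerce.boolean() would turn the string 'false' into true, so parse strings explicitly.
  const base = z.preprocess(
    (val) => (typeof val === 'string' ? val === 'true' : val),
    z.boolean(),
  );
  return opts.default !== undefined ? base.optional().default(opts.default) : base.optional();
};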

@@ -54,13 +54,6 @@ describe('Deals Routes (/api/users/deals)', () => {
authenticatedUser: mockUser,
});
const unauthenticatedApp = createTestApp({ router: dealsRouter, basePath });
const errorHandler = (err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
};
// Apply the handler to both app instances
authenticatedApp.use(errorHandler);
unauthenticatedApp.use(errorHandler);
beforeEach(() => {
vi.clearAllMocks();

View File

@@ -40,12 +40,6 @@ describe('Flyer Routes (/api/flyers)', () => {
const app = createTestApp({ router: flyerRouter, basePath: '/api/flyers' });
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
describe('GET /', () => {
it('should return a list of flyers on success', async () => {
const mockFlyers = [createMockFlyer({ flyer_id: 1 }), createMockFlyer({ flyer_id: 2 })];

View File

@@ -3,6 +3,7 @@ import { Router } from 'express';
import * as db from '../services/db/index.db';
import { z } from 'zod';
import { validateRequest } from '../middleware/validation.middleware';
import { optionalNumeric } from '../utils/zodUtils';
const router = Router();
@@ -10,8 +11,8 @@ const router = Router();
const getFlyersSchema = z.object({
query: z.object({
limit: z.coerce.number().int().positive().optional().default(20),
offset: z.coerce.number().int().nonnegative().optional().default(0),
limit: optionalNumeric({ default: 20, integer: true, positive: true }),
offset: optionalNumeric({ default: 0, integer: true, nonnegative: true }),
}),
});

View File

@@ -86,12 +86,6 @@ describe('Gamification Routes (/api/achievements)', () => {
basePath,
authenticatedUser: mockAdminProfile,
});
const errorHandler = (err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
};
unauthenticatedApp.use(errorHandler);
authenticatedApp.use(errorHandler);
adminApp.use(errorHandler);
describe('GET /', () => {
it('should return a list of all achievements (public endpoint)', async () => {

View File

@@ -7,19 +7,16 @@ import { logger } from '../services/logger.server';
import { UserProfile } from '../types';
import { ForeignKeyConstraintError } from '../services/db/errors.db';
import { validateRequest } from '../middleware/validation.middleware';
import { requiredString, optionalNumeric } from '../utils/zodUtils';
const router = express.Router();
const adminGamificationRouter = express.Router(); // Create a new router for admin-only routes.
// Helper for consistent required string validation (handles missing/null/empty)
const requiredString = (message: string) =>
z.preprocess((val) => val ?? '', z.string().min(1, message));
// --- Zod Schemas for Gamification Routes (as per ADR-003) ---
const leaderboardSchema = z.object({
query: z.object({
limit: z.coerce.number().int().positive().max(50).optional().default(10),
limit: optionalNumeric({ default: 10, integer: true, positive: true, max: 50 }),
}),
});

View File

@@ -46,12 +46,6 @@ const { logger } = await import('../services/logger.server');
// 2. Create a minimal Express app to host the router for testing.
const app = createTestApp({ router: healthRouter, basePath: '/api/health' });
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
describe('Health Routes (/api/health)', () => {
beforeEach(() => {
// Clear mock history before each test to ensure isolation.

View File

@@ -30,12 +30,6 @@ vi.mock('../services/logger.server', () => ({
describe('Personalization Routes (/api/personalization)', () => {
const app = createTestApp({ router: personalizationRouter, basePath: '/api/personalization' });
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
beforeEach(() => {
vi.clearAllMocks();
});

View File

@@ -4,8 +4,21 @@ import supertest from 'supertest';
import { createTestApp } from '../tests/utils/createTestApp';
import { mockLogger } from '../tests/utils/mockLogger';
// Mock the price repository
vi.mock('../services/db/price.db', () => ({
priceRepo: {
getPriceHistory: vi.fn(),
},
}));
// Mock the logger to keep test output clean
vi.mock('../services/logger.server', () => ({
logger: mockLogger,
}));
// Import the router AFTER other setup.
import priceRouter from './price.routes';
import { priceRepo } from '../services/db/price.db';
describe('Price Routes (/api/price-history)', () => {
const app = createTestApp({ router: priceRouter, basePath: '/api/price-history' });
@@ -14,32 +27,106 @@ describe('Price Routes (/api/price-history)', () => {
});
describe('POST /', () => {
it('should return 200 OK with an empty array for a valid request', async () => {
const masterItemIds = [1, 2, 3];
const response = await supertest(app).post('/api/price-history').send({ masterItemIds });
it('should return 200 OK with price history data for a valid request', async () => {
const mockHistory = [
{ master_item_id: 1, price_in_cents: 199, date: '2024-01-01T00:00:00.000Z' },
{ master_item_id: 2, price_in_cents: 299, date: '2024-01-08T00:00:00.000Z' },
];
vi.mocked(priceRepo.getPriceHistory).mockResolvedValue(mockHistory);
const response = await supertest(app)
.post('/api/price-history')
.send({ masterItemIds: [1, 2] });
expect(response.status).toBe(200);
expect(response.body).toEqual([]);
expect(response.body).toEqual(mockHistory);
expect(priceRepo.getPriceHistory).toHaveBeenCalledWith([1, 2], expect.any(Object), 1000, 0);
});
it('should pass limit and offset from the body to the repository', async () => {
vi.mocked(priceRepo.getPriceHistory).mockResolvedValue([]);
await supertest(app)
.post('/api/price-history')
.send({ masterItemIds: [1, 2, 3], limit: 50, offset: 10 });
expect(priceRepo.getPriceHistory).toHaveBeenCalledWith(
[1, 2, 3],
expect.any(Object),
50,
10,
);
});
it('should log the request info', async () => {
vi.mocked(priceRepo.getPriceHistory).mockResolvedValue([]);
await supertest(app)
.post('/api/price-history')
.send({ masterItemIds: [1, 2, 3], limit: 25, offset: 5 });
expect(mockLogger.info).toHaveBeenCalledWith(
{ itemCount: masterItemIds.length },
{ itemCount: 3, limit: 25, offset: 5 },
'[API /price-history] Received request for historical price data.',
);
});
it('should return 500 if the database call fails', async () => {
const dbError = new Error('Database connection failed');
vi.mocked(priceRepo.getPriceHistory).mockRejectedValue(dbError);
const response = await supertest(app)
.post('/api/price-history')
.send({ masterItemIds: [1, 2, 3] });
expect(response.status).toBe(500);
expect(response.body.message).toBe('Database connection failed');
});
it('should return 400 if masterItemIds is an empty array', async () => {
const response = await supertest(app).post('/api/price-history').send({ masterItemIds: [] });
expect(response.status).toBe(400);
expect(response.body.errors[0].message).toBe(
'masterItemIds must be a non-empty array of positive integers.',
);
});
it('should return 400 if masterItemIds is not an array', async () => {
const response = await supertest(app)
.post('/api/price-history')
.send({ masterItemIds: 'not-an-array' });
expect(response.status).toBe(400);
expect(response.body.errors[0].message).toMatch(/Expected array, received string/i);
// The actual message is "Invalid input: expected array, received string"
expect(response.body.errors[0].message).toBe('Invalid input: expected array, received string');
});
it('should return 400 if masterItemIds is an empty array', async () => {
const response = await supertest(app).post('/api/price-history').send({ masterItemIds: [] });
it('should return 400 if masterItemIds contains non-positive integers', async () => {
const response = await supertest(app)
.post('/api/price-history')
.send({ masterItemIds: [1, -2, 3] });
expect(response.status).toBe(400);
expect(response.body.errors[0].message).toBe(
'masterItemIds must be a non-empty array of positive integers.',
);
expect(response.body.errors[0].message).toBe('Number must be greater than 0');
});
it('should return 400 if masterItemIds is missing', async () => {
const response = await supertest(app).post('/api/price-history').send({});
expect(response.status).toBe(400);
// The actual message is "Invalid input: expected array, received undefined"
expect(response.body.errors[0].message).toBe('Invalid input: expected array, received undefined');
});
it('should return 400 for invalid limit and offset', async () => {
const response = await supertest(app)
.post('/api/price-history')
.send({ masterItemIds: [1], limit: -1, offset: 'abc' });
expect(response.status).toBe(400);
expect(response.body.errors).toHaveLength(2);
// The actual message is "Too small: expected number to be >0"
expect(response.body.errors[0].message).toBe('Too small: expected number to be >0');
expect(response.body.errors[1].message).toBe('Invalid input: expected number, received NaN');
});
});
});

View File

@@ -1,15 +1,21 @@
// src/routes/price.routes.ts
import { Router, Request, Response } from 'express';
import { Router, Request, Response, NextFunction } from 'express';
import { z } from 'zod';
import { validateRequest } from '../middleware/validation.middleware';
import { priceRepo } from '../services/db/price.db';
import { optionalNumeric } from '../utils/zodUtils';
const router = Router();
const priceHistorySchema = z.object({
body: z.object({
masterItemIds: z.array(z.number().int().positive()).nonempty({
message: 'masterItemIds must be a non-empty array of positive integers.',
}),
masterItemIds: z
.array(z.number().int().positive('Number must be greater than 0'))
.nonempty({
message: 'masterItemIds must be a non-empty array of positive integers.',
}),
limit: optionalNumeric({ default: 1000, integer: true, positive: true }),
offset: optionalNumeric({ default: 0, integer: true, nonnegative: true }),
}),
});
@@ -18,18 +24,23 @@ type PriceHistoryRequest = z.infer<typeof priceHistorySchema>;
/**
* POST /api/price-history - Fetches historical price data for a given list of master item IDs.
* This is a placeholder implementation.
* This endpoint retrieves price points over time for specified master grocery items.
*/
router.post('/', validateRequest(priceHistorySchema), async (req: Request, res: Response) => {
router.post('/', validateRequest(priceHistorySchema), async (req: Request, res: Response, next: NextFunction) => {
// Cast 'req' to the inferred type for full type safety.
const {
body: { masterItemIds },
body: { masterItemIds, limit, offset },
} = req as unknown as PriceHistoryRequest;
req.log.info(
{ itemCount: masterItemIds.length },
{ itemCount: masterItemIds.length, limit, offset },
'[API /price-history] Received request for historical price data.',
);
res.status(200).json([]);
try {
const priceHistory = await priceRepo.getPriceHistory(masterItemIds, req.log, limit, offset);
res.status(200).json(priceHistory);
} catch (error) {
next(error);
}
});
export default router;

View File
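Every route in this diff funnels input through validateRequest. The middleware itself is outside the diff; a minimal sketch consistent with the tests (a 400 response whose errors array carries Zod issues, with anything else forwarded to next):

// src/middleware/validation.middleware.ts (sketch, assuming schemas shaped as { body?, query?, params? })
import type { Request, Response, NextFunction } from 'express';
import { ZodError, type ZodSchema } from 'zod';

export const validateRequest =
  (schema: ZodSchema) => (req: Request, res: Response, next: NextFunction) => {
    try {
      schema.parse({ body: req.body, query: req.query, params: req.params });
      next();
    } catch (error) {
      if (error instanceof ZodError) {
        // Matches the tests: errors[0].message is the first Zod issue's message.
        return res.status(400).json({ message: 'Validation failed', errors: error.issues });
      }
      next(error);
    }
  };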

@@ -35,12 +35,6 @@ const expectLogger = expect.objectContaining({
describe('Recipe Routes (/api/recipes)', () => {
const app = createTestApp({ router: recipeRouter, basePath: '/api/recipes' });
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
beforeEach(() => {
vi.clearAllMocks();
});

View File

@@ -3,24 +3,19 @@ import { Router } from 'express';
import { z } from 'zod';
import * as db from '../services/db/index.db';
import { validateRequest } from '../middleware/validation.middleware';
import { requiredString, numericIdParam, optionalNumeric } from '../utils/zodUtils';
const router = Router();
// Helper for consistent required string validation (handles missing/null/empty)
const requiredString = (message: string) =>
z.preprocess((val) => val ?? '', z.string().min(1, message));
// --- Zod Schemas for Recipe Routes (as per ADR-003) ---
const bySalePercentageSchema = z.object({
query: z.object({
minPercentage: z.coerce.number().min(0).max(100).optional().default(50),
minPercentage: optionalNumeric({ default: 50, min: 0, max: 100 }),
}),
});
const bySaleIngredientsSchema = z.object({
query: z.object({
minIngredients: z.coerce.number().int().positive().optional().default(3),
minIngredients: optionalNumeric({ default: 3, integer: true, positive: true }),
}),
});
@@ -31,11 +26,7 @@ const byIngredientAndTagSchema = z.object({
}),
});
const recipeIdParamsSchema = z.object({
params: z.object({
recipeId: z.coerce.number().int().positive(),
}),
});
const recipeIdParamsSchema = numericIdParam('recipeId');
/**
* GET /api/recipes/by-sale-percentage - Get recipes based on the percentage of their ingredients on sale.
@@ -47,7 +38,7 @@ router.get(
try {
// Explicitly parse req.query to apply coercion (string -> number) and default values
const { query } = bySalePercentageSchema.parse({ query: req.query });
const recipes = await db.recipeRepo.getRecipesBySalePercentage(query.minPercentage, req.log);
const recipes = await db.recipeRepo.getRecipesBySalePercentage(query.minPercentage!, req.log);
res.json(recipes);
} catch (error) {
req.log.error({ error }, 'Error fetching recipes in /api/recipes/by-sale-percentage:');
@@ -67,7 +58,7 @@ router.get(
// Explicitly parse req.query to apply coercion (string -> number) and default values
const { query } = bySaleIngredientsSchema.parse({ query: req.query });
const recipes = await db.recipeRepo.getRecipesByMinSaleIngredients(
query.minIngredients,
query.minIngredients!,
req.log,
);
res.json(recipes);

View File

@@ -28,12 +28,6 @@ const expectLogger = expect.objectContaining({
describe('Stats Routes (/api/stats)', () => {
const app = createTestApp({ router: statsRouter, basePath: '/api/stats' });
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
beforeEach(() => {
vi.clearAllMocks();
});

View File

@@ -3,6 +3,7 @@ import { Router, Request, Response, NextFunction } from 'express';
import { z } from 'zod';
import * as db from '../services/db/index.db';
import { validateRequest } from '../middleware/validation.middleware';
import { optionalNumeric } from '../utils/zodUtils';
const router = Router();
@@ -10,8 +11,8 @@ const router = Router();
// Define the query schema separately so we can use it to parse req.query in the handler
const statsQuerySchema = z.object({
days: z.coerce.number().int().min(1).max(365).optional().default(30),
limit: z.coerce.number().int().min(1).max(50).optional().default(10),
days: optionalNumeric({ default: 30, min: 1, max: 365, integer: true }),
limit: optionalNumeric({ default: 10, min: 1, max: 50, integer: true }),
});
const mostFrequentSalesSchema = z.object({
@@ -31,7 +32,7 @@ router.get(
// Even though validateRequest checks validity, it may not mutate req.query with the parsed result.
const { days, limit } = statsQuerySchema.parse(req.query);
const items = await db.adminRepo.getMostFrequentSaleItems(days, limit, req.log);
const items = await db.adminRepo.getMostFrequentSaleItems(days!, limit!, req.log);
res.json(items);
} catch (error) {
req.log.error(

View File

@@ -1,10 +1,10 @@
// src/routes/system.routes.test.ts
import { describe, it, expect, vi, beforeEach } from 'vitest';
import supertest from 'supertest';
import { exec, type ExecException, type ExecOptions } from 'child_process'; // Keep this for mocking
import systemRouter from './system.routes';
import { exec, type ExecException, type ExecOptions } from 'child_process';
import { geocodingService } from '../services/geocodingService.server';
import { createTestApp } from '../tests/utils/createTestApp';
import { mockLogger } from '../tests/utils/mockLogger';
// FIX: Use the simple factory pattern for child_process to avoid default export issues
vi.mock('child_process', () => {
@@ -30,20 +30,18 @@ vi.mock('../services/geocodingService.server', () => ({
// 3. Mock Logger
vi.mock('../services/logger.server', () => ({
logger: mockLogger,
logger: {
info: vi.fn(),
debug: vi.fn(),
error: vi.fn(),
warn: vi.fn(),
child: vi.fn().mockReturnThis(),
},
}));
// Import the router AFTER all mocks are defined.
import systemRouter from './system.routes';
describe('System Routes (/api/system)', () => {
const app = createTestApp({ router: systemRouter, basePath: '/api/system' });
// Add a basic error handler to capture errors passed to next(err) and return JSON.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
beforeEach(() => {
// We cast here to get type-safe access to mock functions like .mockImplementation
vi.clearAllMocks();

View File

@@ -1,17 +1,14 @@
// src/routes/system.ts
// src/routes/system.routes.ts
import { Router, Request, Response, NextFunction } from 'express';
import { exec } from 'child_process';
import { z } from 'zod';
import { logger } from '../services/logger.server';
import { geocodingService } from '../services/geocodingService.server';
import { validateRequest } from '../middleware/validation.middleware';
import { requiredString } from '../utils/zodUtils';
const router = Router();
// Helper for consistent required string validation (handles missing/null/empty)
const requiredString = (message: string) =>
z.preprocess((val) => val ?? '', z.string().min(1, message));
const geocodeSchema = z.object({
body: z.object({
address: requiredString('An address string is required.'),

View File

@@ -173,12 +173,6 @@ describe('User Routes (/api/users)', () => {
});
const app = createTestApp({ router: userRouter, basePath, authenticatedUser: mockUserProfile });
// Add a basic error handler to capture errors passed to next(err) and return JSON.
// This prevents unhandled error crashes in tests and ensures we get the 500 response we expect.
app.use((err: any, req: any, res: any, next: any) => {
res.status(err.status || 500).json({ message: err.message, errors: err.errors });
});
beforeEach(() => {
// All tests in this block will use the authenticated app
});
@@ -883,20 +877,41 @@ describe('User Routes (/api/users)', () => {
});
describe('Notification Routes', () => {
it('GET /notifications should return notifications for the user', async () => {
it('GET /notifications should return only unread notifications by default', async () => {
const mockNotifications: Notification[] = [
createMockNotification({ user_id: 'user-123', content: 'Test' }),
];
vi.mocked(db.notificationRepo.getNotificationsForUser).mockResolvedValue(mockNotifications);
const response = await supertest(app).get('/api/users/notifications?limit=10&offset=0');
const response = await supertest(app).get('/api/users/notifications?limit=10');
expect(response.status).toBe(200);
expect(response.body).toEqual(mockNotifications);
expect(db.notificationRepo.getNotificationsForUser).toHaveBeenCalledWith(
'user-123',
10,
0,
0, // default offset
false, // default includeRead
expectLogger,
);
});
it('GET /notifications?includeRead=true should return all notifications', async () => {
const mockNotifications: Notification[] = [
createMockNotification({ user_id: 'user-123', content: 'Read', is_read: true }),
createMockNotification({ user_id: 'user-123', content: 'Unread', is_read: false }),
];
vi.mocked(db.notificationRepo.getNotificationsForUser).mockResolvedValue(mockNotifications);
const response = await supertest(app).get('/api/users/notifications?includeRead=true');
expect(response.status).toBe(200);
expect(response.body).toEqual(mockNotifications);
expect(db.notificationRepo.getNotificationsForUser).toHaveBeenCalledWith(
'user-123',
20, // default limit
0, // default offset
true, // includeRead from query param
expectLogger,
);
});

View File

@@ -4,57 +4,24 @@ import passport from './passport.routes';
import multer from 'multer';
import path from 'path';
import fs from 'node:fs/promises';
import * as bcrypt from 'bcrypt';
import zxcvbn from 'zxcvbn';
import * as bcrypt from 'bcrypt';
import { z } from 'zod';
import * as db from '../services/db/index.db';
import { logger } from '../services/logger.server';
import { UserProfile } from '../types';
import { userService } from '../services/userService';
import { ForeignKeyConstraintError } from '../services/db/errors.db';
import { validateRequest } from '../middleware/validation.middleware';
import { validatePasswordStrength } from '../utils/authUtils';
import {
requiredString,
numericIdParam,
optionalNumeric,
optionalBoolean,
} from '../utils/zodUtils';
import * as db from '../services/db/index.db';
const router = express.Router();
/**
* Validates the strength of a password using zxcvbn.
* @param password The password to check.
* @returns An object with `isValid` and an optional `feedback` message.
*/
const validatePasswordStrength = (password: string): { isValid: boolean; feedback?: string } => {
const MIN_PASSWORD_SCORE = 3; // Require a 'Good' or 'Strong' password (score 3 or 4)
const strength = zxcvbn(password);
if (strength.score < MIN_PASSWORD_SCORE) {
const feedbackMessage =
strength.feedback.warning ||
(strength.feedback.suggestions && strength.feedback.suggestions[0]);
return {
isValid: false,
feedback:
`Password is too weak. ${feedbackMessage || 'Please choose a stronger password.'}`.trim(),
};
}
return { isValid: true };
};
// Helper for consistent required string validation (handles missing/null/empty)
const requiredString = (message: string) =>
z.preprocess((val) => val ?? '', z.string().min(1, message));
// --- Zod Schemas for User Routes (as per ADR-003) ---
const numericIdParam = (key: string) =>
z.object({
params: z.object({
[key]: z.coerce
.number()
.int()
.positive(`Invalid ID for parameter '${key}'. Must be a number.`),
}),
});
const updateProfileSchema = z.object({
body: z
.object({ full_name: z.string().optional(), avatar_url: z.string().url().optional() })
@@ -93,8 +60,9 @@ const createShoppingListSchema = z.object({
// Apply the JWT authentication middleware to all routes in this file.
const notificationQuerySchema = z.object({
query: z.object({
limit: z.coerce.number().int().positive().optional().default(20),
offset: z.coerce.number().int().nonnegative().optional().default(0),
limit: optionalNumeric({ default: 20, integer: true, positive: true }),
offset: optionalNumeric({ default: 0, integer: true, nonnegative: true }),
includeRead: optionalBoolean({ default: false }),
}),
});
@@ -173,13 +141,12 @@ router.get(
// Apply ADR-003 pattern for type safety
try {
const { query } = req as unknown as GetNotificationsRequest;
// Explicitly convert to numbers to ensure the repo receives correct types
const limit = query.limit ? Number(query.limit) : 20;
const offset = query.offset ? Number(query.offset) : 0;
const parsedQuery = notificationQuerySchema.parse({ query: req.query }).query;
const notifications = await db.notificationRepo.getNotificationsForUser(
userProfile.user.user_id,
limit,
offset,
parsedQuery.limit!,
parsedQuery.offset!,
parsedQuery.includeRead!,
req.log,
);
res.json(notifications);

View File

@@ -2,6 +2,7 @@
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import * as aiApiClient from './aiApiClient';
import { AiAnalysisService } from './aiAnalysisService';
import { createMockFlyerItem } from '../tests/utils/mockFactories';
// Mock the dependencies
vi.mock('./aiApiClient');
@@ -56,7 +57,7 @@ describe('AiAnalysisService', () => {
json: () => Promise.resolve(mockResponse),
} as Response);
const result = await service.searchWeb([]);
const result = await service.searchWeb([createMockFlyerItem({ item: 'test' })]);
expect(result.text).toBe('Search results');
expect(result.sources).toEqual([{ uri: 'https://example.com', title: 'Example' }]);
@@ -68,7 +69,7 @@ describe('AiAnalysisService', () => {
json: () => Promise.resolve(mockResponse),
} as Response);
const result = await service.searchWeb([]);
const result = await service.searchWeb([createMockFlyerItem({ item: 'test' })]);
expect(result.text).toBe('Search results');
expect(result.sources).toEqual([]);
@@ -83,7 +84,7 @@ describe('AiAnalysisService', () => {
json: () => Promise.resolve(mockResponse),
} as Response);
const result = await service.searchWeb([]);
const result = await service.searchWeb([createMockFlyerItem({ item: 'test' })]);
expect(result.sources).toEqual([{ uri: '', title: 'Untitled' }]);
});
@@ -92,7 +93,9 @@ describe('AiAnalysisService', () => {
const apiError = new Error('API is down');
vi.mocked(aiApiClient.searchWeb).mockRejectedValue(apiError);
await expect(service.searchWeb([])).rejects.toThrow(apiError);
await expect(service.searchWeb([createMockFlyerItem({ item: 'test' })])).rejects.toThrow(
apiError,
);
});
});

View File

@@ -1,15 +1,8 @@
// src/services/aiAnalysisService.ts
import { Flyer, FlyerItem, MasterGroceryItem, GroundedResponse, Source } from '../types';
import { Flyer, FlyerItem, MasterGroceryItem, GroundedResponse } from '../types';
import * as aiApiClient from './aiApiClient';
import { logger } from './logger.client';
interface RawSource {
web?: {
uri?: string;
title?: string;
};
}
/**
* A service class to encapsulate all AI analysis API calls and related business logic.
* This decouples the React components and hooks from the data fetching implementation.
@@ -22,7 +15,8 @@ export class AiAnalysisService {
*/
async getQuickInsights(items: FlyerItem[]): Promise<string> {
logger.info('[AiAnalysisService] getQuickInsights called.');
return aiApiClient.getQuickInsights(items).then((res) => res.json());
const result = await aiApiClient.getQuickInsights(items);
return result.text;
}
/**
@@ -32,7 +26,8 @@ export class AiAnalysisService {
*/
async getDeepDiveAnalysis(items: FlyerItem[]): Promise<string> {
logger.info('[AiAnalysisService] getDeepDiveAnalysis called.');
return aiApiClient.getDeepDiveAnalysis(items).then((res) => res.json());
const result = await aiApiClient.getDeepDiveAnalysis(items);
return result.text;
}
/**
@@ -42,18 +37,9 @@ export class AiAnalysisService {
*/
async searchWeb(items: FlyerItem[]): Promise<GroundedResponse> {
logger.info('[AiAnalysisService] searchWeb called.');
// The API client returns a specific shape that we need to await the JSON from
const response: { text: string; sources: RawSource[] } = await aiApiClient
.searchWeb(items)
.then((res) => res.json());
// Normalize sources to a consistent format.
const mappedSources = (response.sources || []).map(
(s: RawSource) =>
(s.web
? { uri: s.web.uri || '', title: s.web.title || 'Untitled' }
: { uri: '', title: 'Untitled' }) as Source,
);
return { ...response, sources: mappedSources };
// Construct a query string from the item names.
const query = items.map((item) => item.item).join(', ');
return aiApiClient.searchWeb(query);
}
/**
@@ -66,7 +52,7 @@ export class AiAnalysisService {
logger.info('[AiAnalysisService] planTripWithMaps called.');
// Encapsulate geolocation logic within the service.
const userLocation = await this.getCurrentLocation();
return aiApiClient.planTripWithMaps(items, store, userLocation).then((res) => res.json());
return aiApiClient.planTripWithMaps(items, store, userLocation);
}
/**
@@ -76,17 +62,7 @@ export class AiAnalysisService {
*/
async compareWatchedItemPrices(watchedItems: MasterGroceryItem[]): Promise<GroundedResponse> {
logger.info('[AiAnalysisService] compareWatchedItemPrices called.');
const response: { text: string; sources: RawSource[] } = await aiApiClient
.compareWatchedItemPrices(watchedItems)
.then((res) => res.json());
// Normalize sources to a consistent format.
const mappedSources = (response.sources || []).map(
(s: RawSource) =>
(s.web
? { uri: s.web.uri || '', title: s.web.title || 'Untitled' }
: { uri: '', title: 'Untitled' }) as Source,
);
return { ...response, sources: mappedSources };
return aiApiClient.compareWatchedItemPrices(watchedItems);
}
/**
@@ -96,7 +72,8 @@ export class AiAnalysisService {
*/
async generateImageFromText(prompt: string): Promise<string> {
logger.info('[AiAnalysisService] generateImageFromText called.');
return aiApiClient.generateImageFromText(prompt).then((res) => res.json());
const result = await aiApiClient.generateImageFromText(prompt);
return result.imageUrl;
}
/**

View File
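Both searchWeb and compareWatchedItemPrices now return the client's GroundedResponse directly, since normalization moved into aiApiClient. From the assertions in the tests above, its shape is (an inference, not the repo's actual declaration):

// Inferred shape of GroundedResponse.
interface GroundedResponse {
  text: string;
  sources: { uri: string; title: string }[];
}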

@@ -282,15 +282,15 @@ describe('AI API Client (Network Mocking with MSW)', () => {
});
describe('searchWeb', () => {
it('should send items as JSON in the body', async () => {
const items = [createMockFlyerItem({ item: 'search me' })];
await aiApiClient.searchWeb(items, undefined, 'test-token');
it('should send query as JSON in the body', async () => {
const query = 'search me';
await aiApiClient.searchWeb(query, undefined, 'test-token');
expect(requestSpy).toHaveBeenCalledTimes(1);
const req = requestSpy.mock.calls[0][0];
expect(req.endpoint).toBe('search-web');
expect(req.body).toEqual({ items });
expect(req.body).toEqual({ query });
expect(req.headers.get('Authorization')).toBe('Bearer test-token');
});
});

View File

@@ -4,7 +4,13 @@
* It communicates with the application's own backend endpoints, which then securely
* call the Google AI services. This ensures no API keys are exposed on the client.
*/
import type { FlyerItem, Store, MasterGroceryItem } from '../types';
import type {
FlyerItem,
Store,
MasterGroceryItem,
ProcessingStage,
GroundedResponse,
} from '../types';
import { logger } from './logger.client';
import { apiFetch } from './apiClient';
@@ -20,14 +26,14 @@ export const uploadAndProcessFlyer = async (
file: File,
checksum: string,
tokenOverride?: string,
): Promise<Response> => {
): Promise<{ jobId: string }> => {
const formData = new FormData();
formData.append('flyerFile', file);
formData.append('checksum', checksum);
logger.info(`[aiApiClient] Starting background processing for file: ${file.name}`);
return apiFetch(
const response = await apiFetch(
'/ai/upload-and-process',
{
method: 'POST',
@@ -35,26 +41,79 @@ export const uploadAndProcessFlyer = async (
},
{ tokenOverride },
);
if (!response.ok) {
let errorBody;
try {
errorBody = await response.json();
} catch (e) {
errorBody = { message: await response.text() };
}
// Throw a structured error so the component can inspect the status and body
throw { status: response.status, body: errorBody };
}
return response.json();
};
// Define the expected shape of the job status response
export interface JobStatus {
id: string;
state: 'completed' | 'failed' | 'active' | 'waiting' | 'delayed' | 'paused';
progress: {
stages?: ProcessingStage[];
estimatedTimeRemaining?: number;
message?: string;
} | null;
returnValue: {
flyerId?: number;
} | null;
failedReason: string | null;
}
/**
* Fetches the status of a background processing job.
* This is the second step in the new background processing flow.
* @param jobId The ID of the job to check.
* @param tokenOverride Optional token for testing.
* @returns A promise that resolves to the API response with the job's status.
* @returns A promise that resolves to the parsed job status object.
* @throws An error if the network request fails or if the response is not valid JSON.
*/
export const getJobStatus = async (jobId: string, tokenOverride?: string): Promise<Response> => {
return apiFetch(`/ai/jobs/${jobId}/status`, {}, { tokenOverride });
export const getJobStatus = async (
jobId: string,
tokenOverride?: string,
): Promise<JobStatus> => {
const response = await apiFetch(`/ai/jobs/${jobId}/status`, {}, { tokenOverride });
if (!response.ok) {
let errorText = `API Error: ${response.status} ${response.statusText}`;
try {
const errorBody = await response.text();
if (errorBody) errorText = `API Error ${response.status}: ${errorBody}`;
} catch (e) {
// ignore if reading body fails
}
throw new Error(errorText);
}
const rawText = await response.text(); // read once: a body consumed by a failed .json() cannot be re-read
try {
return JSON.parse(rawText);
} catch (error) {
throw new Error(`Failed to parse JSON response from server. Body: ${rawText}`);
}
};
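Together, uploadAndProcessFlyer and getJobStatus form a submit-then-poll flow. A usage sketch (the polling interval is illustrative, not from the repo):

// Hypothetical caller: submit a flyer, then poll until the job settles.
async function processFlyer(file: File, checksum: string) {
  const { jobId } = await uploadAndProcessFlyer(file, checksum);
  for (;;) {
    const status = await getJobStatus(jobId);
    if (status.state === 'completed') return status.returnValue?.flyerId;
    if (status.state === 'failed') throw new Error(status.failedReason ?? 'Job failed');
    await new Promise((resolve) => setTimeout(resolve, 2000)); // poll every 2s (illustrative)
  }
}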
export const isImageAFlyer = async (imageFile: File, tokenOverride?: string): Promise<Response> => {
export const isImageAFlyer = async (
imageFile: File,
tokenOverride?: string,
): Promise<{ is_flyer: boolean }> => {
const formData = new FormData();
formData.append('image', imageFile);
// Use apiFetchWithAuth for FormData to let the browser set the correct Content-Type.
// The URL must be relative, as the helper constructs the full path.
return apiFetch(
const response = await apiFetch(
'/ai/check-flyer',
{
method: 'POST',
@@ -62,16 +121,28 @@ export const isImageAFlyer = async (imageFile: File, tokenOverride?: string): Pr
},
{ tokenOverride },
);
if (!response.ok) {
let errorBody;
try {
errorBody = await response.json();
} catch (e) {
errorBody = { message: await response.text() };
}
throw { status: response.status, body: errorBody };
}
return response.json();
};
export const extractAddressFromImage = async (
imageFile: File,
tokenOverride?: string,
): Promise<Response> => {
): Promise<{ address: string }> => {
const formData = new FormData();
formData.append('image', imageFile);
return apiFetch(
const response = await apiFetch(
'/ai/extract-address',
{
method: 'POST',
@@ -79,18 +150,30 @@ export const extractAddressFromImage = async (
},
{ tokenOverride },
);
if (!response.ok) {
let errorBody;
try {
errorBody = await response.json();
} catch (e) {
errorBody = { message: await response.text() };
}
throw { status: response.status, body: errorBody };
}
return response.json();
};
export const extractLogoFromImage = async (
imageFiles: File[],
tokenOverride?: string,
): Promise<Response> => {
): Promise<{ store_logo_base_64: string | null }> => {
const formData = new FormData();
imageFiles.forEach((file) => {
formData.append('images', file);
});
return apiFetch(
const response = await apiFetch(
'/ai/extract-logo',
{
method: 'POST',
@@ -98,14 +181,26 @@ export const extractLogoFromImage = async (
},
{ tokenOverride },
);
if (!response.ok) {
let errorBody;
try {
errorBody = await response.json();
} catch (e) {
errorBody = { message: await response.text() };
}
throw { status: response.status, body: errorBody };
}
return response.json();
};
export const getQuickInsights = async (
items: Partial<FlyerItem>[],
signal?: AbortSignal,
tokenOverride?: string,
): Promise<Response> => {
return apiFetch(
): Promise<{ text: string }> => {
const response = await apiFetch(
'/ai/quick-insights',
{
method: 'POST',
@@ -115,14 +210,26 @@ export const getQuickInsights = async (
},
{ tokenOverride, signal },
);
if (!response.ok) {
let errorBody;
try {
errorBody = await response.json();
} catch (e) {
errorBody = { message: await response.text() };
}
throw { status: response.status, body: errorBody };
}
return response.json();
};
export const getDeepDiveAnalysis = async (
items: Partial<FlyerItem>[],
signal?: AbortSignal,
tokenOverride?: string,
): Promise<Response> => {
return apiFetch(
): Promise<{ text: string }> => {
const response = await apiFetch(
'/ai/deep-dive',
{
method: 'POST',
@@ -132,23 +239,47 @@ export const getDeepDiveAnalysis = async (
},
{ tokenOverride, signal },
);
if (!response.ok) {
let errorBody;
try {
errorBody = await response.json();
} catch (e) {
errorBody = { message: await response.text() };
}
throw { status: response.status, body: errorBody };
}
return response.json();
};
export const searchWeb = async (
items: Partial<FlyerItem>[],
query: string,
signal?: AbortSignal,
tokenOverride?: string,
): Promise<Response> => {
return apiFetch(
): Promise<GroundedResponse> => {
const response = await apiFetch(
'/ai/search-web',
{
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ items }),
body: JSON.stringify({ query }),
signal,
},
{ tokenOverride, signal },
);
if (!response.ok) {
let errorBody;
try {
errorBody = await response.json();
} catch (e) {
errorBody = { message: await response.text() };
}
throw { status: response.status, body: errorBody };
}
return response.json();
};
// ============================================================================
@@ -161,9 +292,9 @@ export const planTripWithMaps = async (
userLocation: GeolocationCoordinates,
signal?: AbortSignal,
tokenOverride?: string,
): Promise<Response> => {
): Promise<GroundedResponse> => {
logger.debug('Stub: planTripWithMaps called with location:', { userLocation });
return apiFetch(
const response = await apiFetch(
'/ai/plan-trip',
{
method: 'POST',
@@ -172,6 +303,18 @@ export const planTripWithMaps = async (
},
{ signal, tokenOverride },
);
if (!response.ok) {
let errorBody;
try {
errorBody = await response.json();
} catch (e) {
errorBody = { message: await response.text() };
}
throw { status: response.status, body: errorBody };
}
return response.json();
};
/**
@@ -183,9 +326,9 @@ export const generateImageFromText = async (
prompt: string,
signal?: AbortSignal,
tokenOverride?: string,
): Promise<Response> => {
): Promise<{ imageUrl: string }> => {
logger.debug('Stub: generateImageFromText called with prompt:', { prompt });
return apiFetch(
const response = await apiFetch(
'/ai/generate-image',
{
method: 'POST',
@@ -195,6 +338,18 @@ export const generateImageFromText = async (
},
{ tokenOverride, signal },
);
if (!response.ok) {
let errorBody;
try {
errorBody = await response.json();
} catch (e) {
errorBody = { message: await response.text() };
}
throw { status: response.status, body: errorBody };
}
return response.json();
};
/**
@@ -206,9 +361,9 @@ export const generateSpeechFromText = async (
text: string,
signal?: AbortSignal,
tokenOverride?: string,
): Promise<Response> => {
): Promise<{ audioUrl: string }> => {
logger.debug('Stub: generateSpeechFromText called with text:', { text });
return apiFetch(
const response = await apiFetch(
'/ai/generate-speech',
{
method: 'POST',
@@ -218,6 +373,18 @@ export const generateSpeechFromText = async (
},
{ tokenOverride, signal },
);
if (!response.ok) {
let errorBody;
try {
errorBody = await response.json();
} catch (e) {
errorBody = { message: await response.text() };
}
throw { status: response.status, body: errorBody };
}
return response.json();
};
/**
@@ -264,13 +431,29 @@ export const rescanImageArea = async (
cropArea: { x: number; y: number; width: number; height: number },
extractionType: 'store_name' | 'dates' | 'item_details',
tokenOverride?: string,
): Promise<Response> => {
): Promise<{ text: string | undefined }> => {
const formData = new FormData();
formData.append('image', imageFile);
formData.append('cropArea', JSON.stringify(cropArea));
formData.append('extractionType', extractionType);
return apiFetch('/ai/rescan-area', { method: 'POST', body: formData }, { tokenOverride });
const response = await apiFetch(
'/ai/rescan-area',
{ method: 'POST', body: formData },
{ tokenOverride },
);
if (!response.ok) {
let errorBody;
try {
errorBody = await response.json();
} catch (e) {
errorBody = { message: await response.text() };
}
throw { status: response.status, body: errorBody };
}
return response.json();
};
/**
@@ -281,10 +464,10 @@ export const rescanImageArea = async (
export const compareWatchedItemPrices = async (
watchedItems: MasterGroceryItem[],
signal?: AbortSignal,
): Promise<Response> => {
): Promise<GroundedResponse> => {
// Use the apiFetch wrapper for consistency with other API calls in this file.
// This centralizes token handling and base URL logic.
return apiFetch(
const response = await apiFetch(
'/ai/compare-prices',
{
method: 'POST',
@@ -293,4 +476,16 @@ export const compareWatchedItemPrices = async (
},
{ signal },
);
if (!response.ok) {
let errorBody;
try {
errorBody = await response.json();
} catch (e) {
errorBody = { message: await response.text() };
}
throw { status: response.status, body: errorBody };
}
return response.json();
};
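Every function above repeats the same not-ok block. One way to consolidate it, sketched here as a hypothetical helper rather than anything present in the repo; reading the body as text first also avoids the "body already read" failure the inline blocks would hit when an error body is not valid JSON:

// Hypothetical shared helper; each function above could end with `return parseJsonOrThrow(response);`
async function parseJsonOrThrow<T>(response: Response): Promise<T> {
  if (!response.ok) {
    const raw = await response.text(); // read once; body streams cannot be re-read
    let errorBody;
    try {
      errorBody = JSON.parse(raw);
    } catch {
      errorBody = { message: raw };
    }
    throw { status: response.status, body: errorBody };
  }
  return response.json();
}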

View File

@@ -166,6 +166,127 @@ describe('AI Service (Server)', () => {
});
});
describe('Model Fallback Logic', () => {
const originalEnv = process.env;
beforeEach(() => {
vi.unstubAllEnvs();
process.env = { ...originalEnv, GEMINI_API_KEY: 'test-key' };
vi.resetModules(); // Re-import to use the new env var and re-instantiate the service
});
afterEach(() => {
process.env = originalEnv;
vi.unstubAllEnvs();
});
it('should try the next model if the first one fails with a quota error', async () => {
// Arrange
const { AIService } = await import('./aiService.server');
const { logger } = await import('./logger.server');
const serviceWithFallback = new AIService(logger);
const quotaError = new Error('User rate limit exceeded due to quota');
const successResponse = { text: 'Success from fallback model', candidates: [] };
// Mock the generateContent function to fail on the first call and succeed on the second
mockGenerateContent.mockRejectedValueOnce(quotaError).mockResolvedValueOnce(successResponse);
const request = { contents: [{ parts: [{ text: 'test prompt' }] }] };
// Act
const result = await (serviceWithFallback as any).aiClient.generateContent(request);
// Assert
expect(result).toEqual(successResponse);
expect(mockGenerateContent).toHaveBeenCalledTimes(2);
// Check first call
expect(mockGenerateContent).toHaveBeenNthCalledWith(1, {
model: 'gemini-2.5-flash',
...request,
});
// Check second call
expect(mockGenerateContent).toHaveBeenNthCalledWith(2, {
model: 'gemini-3-flash',
...request,
});
// Check that a warning was logged
expect(logger.warn).toHaveBeenCalledWith(
expect.stringContaining(
"Model 'gemini-2.5-flash' failed due to quota/rate limit. Trying next model.",
),
);
});
it('should throw immediately for non-retriable errors', async () => {
// Arrange
const { AIService } = await import('./aiService.server');
const { logger } = await import('./logger.server');
const serviceWithFallback = new AIService(logger);
const nonRetriableError = new Error('Invalid API Key');
mockGenerateContent.mockRejectedValueOnce(nonRetriableError);
const request = { contents: [{ parts: [{ text: 'test prompt' }] }] };
// Act & Assert
await expect((serviceWithFallback as any).aiClient.generateContent(request)).rejects.toThrow(
'Invalid API Key',
);
expect(mockGenerateContent).toHaveBeenCalledTimes(1);
expect(logger.error).toHaveBeenCalledWith(
`[AIService Adapter] Model 'gemini-2.5-flash' failed with a non-retriable error.`,
{ error: nonRetriableError },
);
});
it('should throw the last error if all models fail', async () => {
// Arrange
const { AIService } = await import('./aiService.server');
const { logger } = await import('./logger.server');
const serviceWithFallback = new AIService(logger);
const quotaError1 = new Error('Quota exhausted for model 1');
const quotaError2 = new Error('429 Too Many Requests for model 2');
const quotaError3 = new Error('RESOURCE_EXHAUSTED for model 3');
mockGenerateContent
.mockRejectedValueOnce(quotaError1)
.mockRejectedValueOnce(quotaError2)
.mockRejectedValueOnce(quotaError3);
const request = { contents: [{ parts: [{ text: 'test prompt' }] }] };
// Act & Assert
await expect((serviceWithFallback as any).aiClient.generateContent(request)).rejects.toThrow(
quotaError3,
);
expect(mockGenerateContent).toHaveBeenCalledTimes(3);
expect(mockGenerateContent).toHaveBeenNthCalledWith(1, {
model: 'gemini-2.5-flash',
...request,
});
expect(mockGenerateContent).toHaveBeenNthCalledWith(2, {
model: 'gemini-3-flash',
...request,
});
expect(mockGenerateContent).toHaveBeenNthCalledWith(3, {
model: 'gemini-2.5-flash-lite',
...request,
});
expect(logger.error).toHaveBeenCalledWith(
'[AIService Adapter] All AI models failed. Throwing last known error.',
{ lastError: quotaError3 },
);
});
});
describe('extractItemsFromReceiptImage', () => {
it('should extract items from a valid AI response', async () => {
const mockAiResponseText = `[

View File

@@ -72,6 +72,7 @@ export class AIService {
private fs: IFileSystem;
private rateLimiter: <T>(fn: () => Promise<T>) => Promise<T>;
private logger: Logger;
private readonly models = ['gemini-2.5-flash', 'gemini-3-flash', 'gemini-2.5-flash-lite'];
constructor(logger: Logger, aiClient?: IAiClient, fs?: IFileSystem) {
this.logger = logger;
@@ -121,17 +122,11 @@ export class AIService {
);
}
// do not change "gemini-2.5-flash" - this is correct
const modelName = 'gemini-2.5-flash';
// We create a shim/adapter that matches the old structure but uses the new SDK call pattern.
// This preserves the dependency injection pattern used throughout the class.
this.aiClient = genAI
? {
generateContent: (request) => {
// The model name is now injected here, into every call, as the new SDK requires.
// Architectural guard clause: All requests from this service must have content.
// This prevents sending invalid requests to the API and satisfies TypeScript's strictness.
generateContent: async (request) => {
if (!request.contents || request.contents.length === 0) {
this.logger.error(
{ request },
@@ -140,14 +135,7 @@ export class AIService {
throw new Error('AIService.generateContent requires at least one content element.');
}
// Architectural Fix: After the guard clause, assign the guaranteed-to-exist element
// to a new constant. This provides a definitive type-safe variable for the compiler.
const firstContent = request.contents[0];
this.logger.debug(
{ modelName, requestParts: firstContent.parts?.length ?? 0 },
'[AIService] Calling actual generateContent via adapter.',
);
return genAI.models.generateContent({ model: modelName, ...request });
return this._generateWithFallback(genAI, request);
},
}
: {
@@ -182,6 +170,54 @@ export class AIService {
this.logger.info('---------------- [AIService] Constructor End ----------------');
}
private async _generateWithFallback(
genAI: GoogleGenAI,
request: { contents: Content[]; tools?: Tool[] },
): Promise<GenerateContentResponse> {
let lastError: Error | null = null;
for (const modelName of this.models) {
try {
this.logger.info(
`[AIService Adapter] Attempting to generate content with model: ${modelName}`,
);
const result = await genAI.models.generateContent({ model: modelName, ...request });
// If the call succeeds, return the result immediately.
return result;
} catch (error: unknown) {
lastError = error instanceof Error ? error : new Error(String(error));
const errorMessage = lastError.message || '';
// Check for specific error messages indicating quota issues or model unavailability.
if (
errorMessage.includes('quota') ||
errorMessage.includes('429') || // HTTP 429 Too Many Requests
errorMessage.includes('RESOURCE_EXHAUSTED') ||
errorMessage.includes('model is overloaded')
) {
this.logger.warn(
`[AIService Adapter] Model '${modelName}' failed due to quota/rate limit. Trying next model. Error: ${errorMessage}`,
);
continue; // Try the next model in the list.
} else {
// For other errors (e.g., invalid input, safety settings), fail immediately.
this.logger.error(
{ error: lastError },
`[AIService Adapter] Model '${modelName}' failed with a non-retriable error.`,
);
throw lastError;
}
}
}
// If all models in the list have failed, throw the last error encountered.
this.logger.error(
{ lastError },
'[AIService Adapter] All AI models failed. Throwing last known error.',
);
throw lastError || new Error('All AI models failed to generate content.');
}
private async serverFileToGenerativePart(path: string, mimeType: string) {
const fileData = await this.fs.readFile(path);
return {

View File
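The retriable/non-retriable split in _generateWithFallback above is a set of substring checks on the error message. Factoring it into a predicate would keep the loop readable and give the tests a single seam; a sketch using the same four patterns as the code, offered as a possible refactor rather than anything in the repo:

// Sketch: same heuristics as _generateWithFallback, extracted for reuse and testing.
const RETRIABLE_PATTERNS = ['quota', '429', 'RESOURCE_EXHAUSTED', 'model is overloaded'];

function isRetriableAiError(error: unknown): boolean {
  const message = error instanceof Error ? error.message : String(error);
  return RETRIABLE_PATTERNS.some((pattern) => message.includes(pattern));
}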

@@ -32,7 +32,7 @@ describe('Notification DB Service', () => {
});
describe('getNotificationsForUser', () => {
it('should execute the correct query with limit and offset and return notifications', async () => {
it('should only return unread notifications by default', async () => {
const mockNotifications: Notification[] = [
createMockNotification({
notification_id: 1,
@@ -43,30 +43,59 @@ describe('Notification DB Service', () => {
];
mockPoolInstance.query.mockResolvedValue({ rows: mockNotifications });
const result = await notificationRepo.getNotificationsForUser('user-123', 10, 5, mockLogger);
const result = await notificationRepo.getNotificationsForUser(
'user-123',
10,
5,
false,
mockLogger,
);
expect(mockPoolInstance.query).toHaveBeenCalledWith(
expect.stringContaining('SELECT * FROM public.notifications'),
expect.stringContaining('is_read = false'),
['user-123', 10, 5],
);
expect(result).toEqual(mockNotifications);
});
it('should return all notifications when includeRead is true', async () => {
const mockNotifications: Notification[] = [
createMockNotification({ is_read: true }),
createMockNotification({ is_read: false }),
];
mockPoolInstance.query.mockResolvedValue({ rows: mockNotifications });
await notificationRepo.getNotificationsForUser('user-123', 10, 0, true, mockLogger);
// The query should NOT contain the is_read filter
expect(mockPoolInstance.query.mock.calls[0][0]).not.toContain('is_read = false');
expect(mockPoolInstance.query).toHaveBeenCalledWith(expect.any(String), ['user-123', 10, 0]);
});
it('should return an empty array if the user has no notifications', async () => {
mockPoolInstance.query.mockResolvedValue({ rows: [] });
const result = await notificationRepo.getNotificationsForUser('user-456', 10, 0, mockLogger);
const result = await notificationRepo.getNotificationsForUser(
'user-456',
10,
0,
false,
mockLogger,
);
expect(result).toEqual([]);
expect(mockPoolInstance.query).toHaveBeenCalledWith(expect.any(String), ['user-456', 10, 0]);
expect(mockPoolInstance.query).toHaveBeenCalledWith(
expect.stringContaining('is_read = false'),
['user-456', 10, 0],
);
});
it('should throw an error if the database query fails', async () => {
const dbError = new Error('DB Error');
mockPoolInstance.query.mockRejectedValue(dbError);
await expect(
notificationRepo.getNotificationsForUser('user-123', 10, 5, mockLogger),
notificationRepo.getNotificationsForUser('user-123', 10, 5, false, mockLogger),
).rejects.toThrow('Failed to retrieve notifications.');
expect(mockLogger.error).toHaveBeenCalledWith(
{ err: dbError, userId: 'user-123', limit: 10, offset: 5 },
{ err: dbError, userId: 'user-123', limit: 10, offset: 5, includeRead: false },
'Database error in getNotificationsForUser',
);
});

View File

@@ -95,20 +95,24 @@ export class NotificationRepository {
userId: string,
limit: number,
offset: number,
includeRead: boolean,
logger: Logger,
): Promise<Notification[]> {
try {
const res = await this.db.query<Notification>(
`SELECT * FROM public.notifications
WHERE user_id = $1
ORDER BY created_at DESC
LIMIT $2 OFFSET $3`,
[userId, limit, offset],
);
const params: (string | number)[] = [userId, limit, offset];
let query = `SELECT * FROM public.notifications WHERE user_id = $1`;
if (!includeRead) {
query += ` AND is_read = false`;
}
query += ` ORDER BY created_at DESC LIMIT $2 OFFSET $3`;
const res = await this.db.query<Notification>(query, params);
return res.rows;
} catch (error) {
logger.error(
{ err: error, userId, limit, offset },
{ err: error, userId, limit, offset, includeRead },
'Database error in getNotificationsForUser',
);
throw new Error('Failed to retrieve notifications.');

View File

@@ -0,0 +1,53 @@
// src/services/db/price.db.ts
import type { Logger } from 'pino';
import type { PriceHistoryData } from '../../types';
import { getPool } from './connection.db';
/**
* Repository for fetching price-related data.
*/
export const priceRepo = {
/**
* Fetches the historical price data for a given list of master item IDs.
* It retrieves the price in cents and the start date of the flyer for each item.
*
* @param masterItemIds An array of master grocery item IDs.
* @param logger The pino logger instance.
* @param limit The maximum number of records to return.
* @param offset The number of records to skip.
* @returns A promise that resolves to an array of price history data points.
*/
async getPriceHistory(
masterItemIds: number[],
logger: Logger,
limit: number = 1000,
offset: number = 0,
): Promise<PriceHistoryData[]> {
if (masterItemIds.length === 0) {
return [];
}
const query = `
SELECT
fi.master_item_id,
fi.price_in_cents,
f.valid_from AS date
FROM public.flyer_items fi
JOIN public.flyers f ON fi.flyer_id = f.flyer_id
WHERE
fi.master_item_id = ANY($1::int[])
AND f.valid_from IS NOT NULL
AND fi.price_in_cents IS NOT NULL
ORDER BY
fi.master_item_id, f.valid_from ASC
LIMIT $2 OFFSET $3;
`;
const result = await getPool().query(query, [masterItemIds, limit, offset]);
logger.debug(
{ count: result.rows.length, itemIds: masterItemIds.length, limit, offset },
'Fetched price history from database.',
);
return result.rows;
},
};
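node-postgres maps a JavaScript number[] directly onto the $1::int[] parameter, so no string-building is needed for the IN-style filter. A usage sketch (the pino logger construction is illustrative):

// Hypothetical caller of the repository above.
import pino from 'pino';
import { priceRepo } from './price.db';

async function example() {
  const history = await priceRepo.getPriceHistory([1, 2, 3], pino(), 100, 0);
  return history; // rows of { master_item_id, price_in_cents, date }
}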

View File

@@ -624,14 +624,10 @@ describe('User DB Service', () => {
);
});
it('should throw NotFoundError if token is not found', async () => {
it('should return undefined if token is not found', async () => {
mockPoolInstance.query.mockResolvedValue({ rows: [], rowCount: 0 });
await expect(userRepo.findUserByRefreshToken('a-token', mockLogger)).rejects.toThrow(
NotFoundError,
);
await expect(userRepo.findUserByRefreshToken('a-token', mockLogger)).rejects.toThrow(
'User not found for the given refresh token.',
);
const result = await userRepo.findUserByRefreshToken('a-token', mockLogger);
expect(result).toBeUndefined();
});
it('should throw a generic error if the database query fails', async () => {

View File

@@ -205,7 +205,10 @@ export class UserRepository {
return res.rows[0];
} catch (error) {
if (error instanceof NotFoundError) throw error;
logger.error({ err: error, userId }, 'Database error in findUserById');
logger.error(
{ err: error, userId },
'Database error in findUserById',
);
throw new Error('Failed to retrieve user by ID from database.');
}
}
@@ -229,7 +232,10 @@ export class UserRepository {
return res.rows[0];
} catch (error) {
if (error instanceof NotFoundError) throw error;
logger.error({ err: error, userId }, 'Database error in findUserWithPasswordHashById');
logger.error(
{ err: error, userId },
'Database error in findUserWithPasswordHashById',
);
throw new Error('Failed to retrieve user with sensitive data by ID from database.');
}
}
@@ -275,7 +281,10 @@ export class UserRepository {
if (error instanceof NotFoundError) {
throw error;
}
logger.error({ err: error, userId }, 'Database error in findUserProfileById');
logger.error(
{ err: error, userId },
'Database error in findUserProfileById',
);
throw new Error('Failed to retrieve user profile from database.');
}
}
@@ -321,7 +330,10 @@ export class UserRepository {
if (error instanceof NotFoundError) {
throw error;
}
logger.error({ err: error, userId, profileData }, 'Database error in updateUserProfile');
logger.error(
{ err: error, userId, profileData },
'Database error in updateUserProfile',
);
throw new Error('Failed to update user profile in database.');
}
}
@@ -350,7 +362,10 @@ export class UserRepository {
if (error instanceof NotFoundError) {
throw error;
}
logger.error({ err: error, userId, preferences }, 'Database error in updateUserPreferences');
logger.error(
{ err: error, userId, preferences },
'Database error in updateUserPreferences',
);
throw new Error('Failed to update user preferences in database.');
}
}
@@ -368,7 +383,10 @@ export class UserRepository {
[passwordHash, userId]
);
} catch (error) {
logger.error({ err: error, userId }, 'Database error in updateUserPassword');
logger.error(
{ err: error, userId },
'Database error in updateUserPassword',
);
throw new Error('Failed to update user password in database.');
}
}
@@ -381,8 +399,11 @@ export class UserRepository {
async deleteUserById(userId: string, logger: Logger): Promise<void> {
try {
await this.db.query('DELETE FROM public.users WHERE user_id = $1', [userId]);
} catch (error) {
logger.error({ err: error, userId }, 'Database error in deleteUserById');
} catch (error) {
logger.error(
{ err: error, userId },
'Database error in deleteUserById',
);
throw new Error('Failed to delete user from database.');
}
}
@@ -400,7 +421,10 @@ export class UserRepository {
[refreshToken, userId]
);
} catch (error) {
logger.error({ err: error, userId }, 'Database error in saveRefreshToken');
logger.error(
{ err: error, userId },
'Database error in saveRefreshToken',
);
throw new Error('Failed to save refresh token.');
}
}
@@ -410,19 +434,20 @@ export class UserRepository {
* @param refreshToken The refresh token to look up.
* @returns A promise that resolves to the user object (id, email) or undefined if not found.
*/
// prettier-ignore
async findUserByRefreshToken(refreshToken: string, logger: Logger): Promise<{ user_id: string; email: string; }> {
async findUserByRefreshToken(
refreshToken: string,
logger: Logger,
): Promise<{ user_id: string; email: string } | undefined> {
try {
const res = await this.db.query<{ user_id: string; email: string }>(
'SELECT user_id, email FROM public.users WHERE refresh_token = $1',
[refreshToken]
[refreshToken],
);
if ((res.rowCount ?? 0) === 0) {
throw new NotFoundError('User not found for the given refresh token.');
return undefined;
}
return res.rows[0];
} catch (error) {
if (error instanceof NotFoundError) throw error;
logger.error({ err: error }, 'Database error in findUserByRefreshToken');
throw new Error('Failed to find user by refresh token.'); // Generic error for other failures
}
@@ -461,7 +486,10 @@ export class UserRepository {
if (error instanceof Error && 'code' in error && error.code === '23503') {
throw new ForeignKeyConstraintError('The specified user does not exist.');
}
logger.error({ err: error, userId }, 'Database error in createPasswordResetToken');
logger.error(
{ err: error, userId },
'Database error in createPasswordResetToken',
);
throw new Error('Failed to create password reset token.');
}
}
@@ -478,7 +506,10 @@ export class UserRepository {
);
return res.rows;
} catch (error) {
logger.error({ err: error }, 'Database error in getValidResetTokens');
logger.error(
{ err: error },
'Database error in getValidResetTokens',
);
throw new Error('Failed to retrieve valid reset tokens.');
}
}
@@ -492,7 +523,10 @@ export class UserRepository {
try {
await this.db.query('DELETE FROM public.password_reset_tokens WHERE token_hash = $1', [tokenHash]);
} catch (error) {
logger.error({ err: error, tokenHash }, 'Database error in deleteResetToken');
logger.error(
{ err: error, tokenHash },
'Database error in deleteResetToken',
);
}
}
@@ -634,7 +668,10 @@ export async function exportUserData(userId: string, logger: Logger): Promise<{
return { profile, watchedItems, shoppingLists };
});
} catch (error) {
logger.error({ err: error, userId }, 'Database error in exportUserData');
logger.error(
{ err: error, userId },
'Database error in exportUserData',
);
throw new Error('Failed to export user data.');
}
}
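
The behavioural change buried in this file is findUserByRefreshToken: a missing token now resolves to undefined instead of throwing NotFoundError, so call sites branch on the result rather than catch. A sketch of an adjusted caller, assuming a UserRepository instance and a pino Logger as above; the function name, import path, and response shape are illustrative:

import type { Logger } from 'pino';
import { UserRepository } from './user.db'; // assumed path

async function handleRefreshToken(userRepo: UserRepository, token: string, logger: Logger) {
  const user = await userRepo.findUserByRefreshToken(token, logger);
  if (!user) {
    // An unknown token is an expected outcome now, not an exception.
    return { status: 403, message: 'Invalid or expired refresh token.' };
  }
  return { status: 200, userId: user.user_id, email: user.email };
}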

View File

@@ -87,7 +87,7 @@ describe('Geocoding Service', () => {
// Assert
expect(result).toEqual(coordinates);
expect(logger.error).toHaveBeenCalledWith(
{ err: expect.any(Error), cacheKey: expect.any(String) },
{ err: 'Redis down', cacheKey: expect.any(String) },
'Redis GET or JSON.parse command failed. Proceeding without cache.',
);
expect(mockGoogleService.geocode).toHaveBeenCalled(); // Should still proceed to fetch
@@ -107,7 +107,7 @@ describe('Geocoding Service', () => {
expect(mocks.mockRedis.get).toHaveBeenCalledWith(cacheKey);
// The service should log the JSON parsing error and continue
expect(logger.error).toHaveBeenCalledWith(
{ err: expect.any(SyntaxError), cacheKey: expect.any(String) },
{ err: expect.any(String), cacheKey: expect.any(String) },
'Redis GET or JSON.parse command failed. Proceeding without cache.',
);
expect(mockGoogleService.geocode).toHaveBeenCalledTimes(1);
@@ -185,7 +185,7 @@ describe('Geocoding Service', () => {
// Assert
expect(result).toEqual(coordinates);
expect(logger.error).toHaveBeenCalledWith(
{ err: expect.any(Error) },
{ err: 'Network Error' },
expect.stringContaining('An error occurred while calling the Google Maps Geocoding API'),
);
expect(mockNominatimService.geocode).toHaveBeenCalledWith(address, logger);
@@ -223,7 +223,7 @@ describe('Geocoding Service', () => {
expect(mockGoogleService.geocode).toHaveBeenCalledTimes(1);
expect(mocks.mockRedis.set).toHaveBeenCalledTimes(1);
expect(logger.error).toHaveBeenCalledWith(
{ err: expect.any(Error), cacheKey: expect.any(String) },
{ err: 'Redis SET failed', cacheKey: expect.any(String) },
'Redis SET command failed. Result will not be cached.',
);
});
@@ -271,7 +271,7 @@ describe('Geocoding Service', () => {
// Act & Assert
await expect(geocodingService.clearGeocodeCache(logger)).rejects.toThrow(redisError);
expect(logger.error).toHaveBeenCalledWith(
{ err: redisError },
{ err: redisError.message },
'Failed to clear geocode cache from Redis.',
);
expect(mocks.mockRedis.del).not.toHaveBeenCalled();

View File

@@ -26,7 +26,7 @@ export class GeocodingService {
}
} catch (error) {
logger.error(
{ err: error, cacheKey },
{ err: error instanceof Error ? error.message : error, cacheKey },
'Redis GET or JSON.parse command failed. Proceeding without cache.',
);
}
@@ -44,7 +44,7 @@ export class GeocodingService {
);
} catch (error) {
logger.error(
{ err: error },
{ err: error instanceof Error ? error.message : error },
'An error occurred while calling the Google Maps Geocoding API. Falling back to Nominatim.',
);
}
@@ -73,7 +73,7 @@ export class GeocodingService {
await redis.set(cacheKey, JSON.stringify(result), 'EX', 60 * 60 * 24 * 30); // Cache for 30 days
} catch (error) {
logger.error(
{ err: error, cacheKey },
{ err: error instanceof Error ? error.message : error, cacheKey },
'Redis SET command failed. Result will not be cached.',
);
}
@@ -98,7 +98,10 @@ export class GeocodingService {
logger.info(`Successfully deleted ${totalDeleted} geocode cache entries.`);
return totalDeleted;
} catch (error) {
logger.error({ err: error }, 'Failed to clear geocode cache from Redis.');
logger.error(
{ err: error instanceof Error ? error.message : error },
'Failed to clear geocode cache from Redis.',
);
throw error;
}
}
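
The expression error instanceof Error ? error.message : error now appears at four log sites in this service. If it needed a fifth, it could be factored into a helper; this is a sketch, not something the diff adds. Note the trade-off: logging only the message drops the stack trace, which is exactly why the updated tests above can assert on plain strings like 'Redis down'.

// Possible shared helper (illustrative, not part of the codebase):
function errMessage(error: unknown): unknown {
  return error instanceof Error ? error.message : error;
}

// A log site would then read:
// logger.error({ err: errMessage(error), cacheKey }, 'Redis SET command failed. Result will not be cached.');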

View File

@@ -34,6 +34,9 @@ export const logger = pino({
'*.body.password',
'*.body.newPassword',
'*.body.currentPassword',
'*.body.confirmPassword',
'*.body.refreshToken',
'*.body.token',
],
censor: '[REDACTED]',
},
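
For context, the surrounding logger definition looks roughly like the sketch below; the paths use pino's fast-redact wildcard syntax, so any logged object with a matching body key is censored. Only the three new paths come from this diff; the rest of the shape is inferred from the hunk's context lines:

import pino from 'pino';

export const logger = pino({
  redact: {
    paths: [
      '*.body.password',
      '*.body.newPassword',
      '*.body.currentPassword',
      '*.body.confirmPassword',
      '*.body.refreshToken',
      '*.body.token',
    ],
    censor: '[REDACTED]',
  },
});

// logger.info({ req: { body: { token: 'abc' } } }, '...') logs the token as "[REDACTED]".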

View File

@@ -1,5 +1,5 @@
// src/services/queueService.server.ts
import { Queue, Worker, Job } from 'bullmq';
import { Queue, Worker, Job, UnrecoverableError } from 'bullmq';
import IORedis from 'ioredis';
import fsPromises from 'node:fs/promises';
import { exec } from 'child_process';
@@ -185,9 +185,26 @@ const attachWorkerEventListeners = (worker: Worker) => {
export const flyerWorker = new Worker<FlyerJobData>(
'flyer-processing', // Must match the queue name
(job) => {
// The processJob method creates its own job-specific logger internally.
return flyerProcessingService.processJob(job);
async (job) => {
try {
// The processJob method creates its own job-specific logger internally.
return await flyerProcessingService.processJob(job);
} catch (error: any) {
// Check for quota errors or other unrecoverable errors from the AI service
const errorMessage = error?.message || '';
if (
errorMessage.includes('quota') ||
errorMessage.includes('429') ||
errorMessage.includes('RESOURCE_EXHAUSTED')
) {
logger.error(
{ err: error, jobId: job.id },
'[FlyerWorker] Unrecoverable quota error detected. Failing job immediately.',
);
throw new UnrecoverableError(errorMessage);
}
throw error;
}
},
{
connection,
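
BullMQ treats UnrecoverableError specially: the job is moved straight to the failed state without consuming its remaining retry attempts, which is the point of wrapping quota and 429 failures here instead of rethrowing them as plain errors. Matching on message substrings is admittedly brittle; if the check grows, it could be pulled into a predicate (illustrative only):

// Hypothetical extraction of the substring check above:
const isQuotaError = (message: string): boolean =>
  message.includes('quota') ||
  message.includes('429') ||
  message.includes('RESOURCE_EXHAUSTED');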

View File

@@ -0,0 +1,96 @@
// src/tests/e2e/admin-dashboard.e2e.test.ts
import { describe, it, expect, afterAll } from 'vitest';
import supertest from 'supertest';
import app from '../../../server';
import { getPool } from '../../services/db/connection.db';
/**
* @vitest-environment node
*/
const request = supertest(app);
describe('E2E Admin Dashboard Flow', () => {
// Use a unique email for every run to avoid collisions
const uniqueId = Date.now();
const adminEmail = `e2e-admin-${uniqueId}@example.com`;
const adminPassword = 'StrongPassword123!';
let authToken: string;
let adminUserId: string | null = null;
afterAll(async () => {
// Safety cleanup: Ensure the user is deleted from the DB if the test fails mid-way.
if (adminUserId) {
try {
await getPool().query('DELETE FROM public.users WHERE user_id = $1', [adminUserId]);
} catch (err) {
console.error('Error cleaning up E2E admin user:', err);
}
}
});
it('should allow an admin to log in and access dashboard features', async () => {
// 1. Register a new user (initially a regular user)
const registerResponse = await request.post('/api/auth/register').send({
email: adminEmail,
password: adminPassword,
full_name: 'E2E Admin User',
});
expect(registerResponse.status).toBe(201);
const registeredUser = registerResponse.body.userprofile.user;
adminUserId = registeredUser.user_id;
expect(adminUserId).toBeDefined();
// 2. Promote the user to 'admin' via direct DB access
// (This simulates an existing admin or a manual promotion, as there is no public "register as admin" endpoint)
await getPool().query(`UPDATE public.profiles SET role = 'admin' WHERE user_id = $1`, [
adminUserId,
]);
// 3. Login to get the access token (now with admin privileges)
const loginResponse = await request.post('/api/auth/login').send({
email: adminEmail,
password: adminPassword,
});
expect(loginResponse.status).toBe(200);
authToken = loginResponse.body.token;
expect(authToken).toBeDefined();
// Verify the role returned in the login response is now 'admin'
expect(loginResponse.body.userprofile.role).toBe('admin');
// 4. Fetch System Stats (Protected Admin Route)
const statsResponse = await request
.get('/api/admin/stats')
.set('Authorization', `Bearer ${authToken}`);
expect(statsResponse.status).toBe(200);
expect(statsResponse.body).toHaveProperty('userCount');
expect(statsResponse.body).toHaveProperty('flyerCount');
// 5. Fetch User List (Protected Admin Route)
const usersResponse = await request
.get('/api/admin/users')
.set('Authorization', `Bearer ${authToken}`);
expect(usersResponse.status).toBe(200);
expect(Array.isArray(usersResponse.body)).toBe(true);
// The list should contain the admin user we just created
const self = usersResponse.body.find((u: any) => u.user_id === adminUserId);
expect(self).toBeDefined();
// 6. Check Queue Status (Protected Admin Route)
const queueResponse = await request
.get('/api/admin/queues/status')
.set('Authorization', `Bearer ${authToken}`);
expect(queueResponse.status).toBe(200);
expect(Array.isArray(queueResponse.body)).toBe(true);
// Verify that the 'flyer-processing' queue is present in the status report
const flyerQueue = queueResponse.body.find((q: any) => q.name === 'flyer-processing');
expect(flyerQueue).toBeDefined();
expect(flyerQueue.counts).toBeDefined();
});
});
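
Steps 1 through 3 (register, promote via direct SQL, log in) form a pattern that any future admin E2E test will need. A possible extraction, assuming the request instance and getPool import already present in this file; the helper name is illustrative:

// Hypothetical helper; relies on `request` and `getPool` from this file.
async function registerAndPromoteAdmin(email: string, password: string) {
  const reg = await request
    .post('/api/auth/register')
    .send({ email, password, full_name: 'E2E Admin User' });
  const userId: string = reg.body.userprofile.user.user_id;
  await getPool().query(`UPDATE public.profiles SET role = 'admin' WHERE user_id = $1`, [userId]);
  const login = await request.post('/api/auth/login').send({ email, password });
  return { userId, token: login.body.token as string };
}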

View File

@@ -0,0 +1,110 @@
// src/tests/e2e/flyer-upload.e2e.test.ts
import { describe, it, expect, afterAll } from 'vitest';
import supertest from 'supertest';
import app from '../../../server';
import { getPool } from '../../services/db/connection.db';
import crypto from 'crypto';
import path from 'path';
import fs from 'fs';
/**
* @vitest-environment node
*/
const request = supertest(app);
describe('E2E Flyer Upload and Processing Workflow', () => {
const uniqueId = Date.now();
const userEmail = `e2e-uploader-${uniqueId}@example.com`;
const userPassword = 'StrongPassword123!';
let authToken: string;
let userId: string | null = null;
let flyerId: number | null = null;
afterAll(async () => {
// Cleanup: Delete the flyer and user created during the test
const pool = getPool();
if (flyerId) {
await pool.query('DELETE FROM public.flyers WHERE flyer_id = $1', [flyerId]);
}
if (userId) {
await pool.query('DELETE FROM public.users WHERE user_id = $1', [userId]);
}
});
it('should allow a user to upload a flyer and wait for processing to complete', async () => {
// 1. Register a new user
const registerResponse = await request.post('/api/auth/register').send({
email: userEmail,
password: userPassword,
full_name: 'E2E Flyer Uploader',
});
expect(registerResponse.status).toBe(201);
// 2. Login to get the access token
const loginResponse = await request.post('/api/auth/login').send({
email: userEmail,
password: userPassword,
});
expect(loginResponse.status).toBe(200);
authToken = loginResponse.body.token;
userId = loginResponse.body.userprofile.user.user_id;
expect(authToken).toBeDefined();
// 3. Prepare the flyer file
// We use the existing test asset if available; otherwise we create a dummy buffer.
// Note: In a real E2E scenario against a live AI service, a valid image is required.
// If the AI service is mocked or stubbed in this environment, a dummy buffer might suffice.
let fileBuffer: Buffer;
let fileName = `e2e-test-flyer-${uniqueId}.jpg`;
const assetPath = path.resolve(__dirname, '../assets/test-flyer-image.jpg');
if (fs.existsSync(assetPath)) {
const rawBuffer = fs.readFileSync(assetPath);
// Append unique ID to ensure unique checksum for every test run
fileBuffer = Buffer.concat([rawBuffer, Buffer.from(uniqueId.toString())]);
} else {
// Fallback to a minimal valid JPEG header + random data if asset is missing
// (This might fail if the backend does strict image validation/processing)
fileBuffer = Buffer.concat([
Buffer.from([0xff, 0xd8, 0xff, 0xe0]), // JPEG Start of Image
Buffer.from(uniqueId.toString())
]);
}
// Calculate checksum (required by the API)
const checksum = crypto.createHash('sha256').update(fileBuffer).digest('hex');
// 4. Upload the flyer
const uploadResponse = await request
.post('/api/ai/upload-and-process')
.set('Authorization', `Bearer ${authToken}`)
.field('checksum', checksum)
.attach('flyerFile', fileBuffer, fileName);
expect(uploadResponse.status).toBe(202);
const jobId = uploadResponse.body.jobId;
expect(jobId).toBeDefined();
// 5. Poll for job completion
let jobStatus;
const maxRetries = 30; // Poll for up to 90 seconds
for (let i = 0; i < maxRetries; i++) {
await new Promise((resolve) => setTimeout(resolve, 3000)); // Wait 3s
const statusResponse = await request
.get(`/api/ai/jobs/${jobId}/status`)
.set('Authorization', `Bearer ${authToken}`);
jobStatus = statusResponse.body;
if (jobStatus.state === 'completed' || jobStatus.state === 'failed') {
break;
}
}
expect(jobStatus.state).toBe('completed');
flyerId = jobStatus.returnValue?.flyerId;
expect(flyerId).toBeTypeOf('number');
}, 120000); // Extended timeout for AI processing
});
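
The poll loop (30 retries at 3 s, roughly 90 s total) is the piece most likely to be reused by other job-based E2E tests. Sketched as a standalone helper, assuming the request instance from this file and the GET /api/ai/jobs/:id/status endpoint exercised above:

// Hypothetical helper; `request` is the supertest instance defined in this file.
async function waitForJob(jobId: string, token: string, maxRetries = 30, intervalMs = 3000) {
  let status: { state?: string; returnValue?: { flyerId?: number } } = {};
  for (let i = 0; i < maxRetries; i++) {
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
    const res = await request
      .get(`/api/ai/jobs/${jobId}/status`)
      .set('Authorization', `Bearer ${token}`);
    status = res.body;
    if (status.state === 'completed' || status.state === 'failed') break;
  }
  return status;
}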

View File

@@ -0,0 +1,111 @@
// src/tests/e2e/user-journey.e2e.test.ts
import { describe, it, expect, afterAll } from 'vitest';
import supertest from 'supertest';
import app from '../../../server';
import { getPool } from '../../services/db/connection.db';
/**
* @vitest-environment node
*/
const request = supertest(app);
describe('E2E User Journey', () => {
// Use a unique email for every run to avoid collisions
const uniqueId = Date.now();
const userEmail = `e2e-test-${uniqueId}@example.com`;
const userPassword = 'StrongPassword123!';
let authToken: string;
let userId: string | null = null;
let shoppingListId: number;
afterAll(async () => {
// Safety cleanup: Ensure the user is deleted from the DB if the test fails mid-way.
// If the test succeeds, the user deletes their own account, so this acts as a fallback.
if (userId) {
try {
await getPool().query('DELETE FROM public.users WHERE user_id = $1', [userId]);
} catch (err) {
console.error('Error cleaning up E2E test user:', err);
}
}
});
it('should complete a full user lifecycle: Register -> Login -> Manage List -> Delete Account', async () => {
// 1. Register a new user
const registerResponse = await request.post('/api/auth/register').send({
email: userEmail,
password: userPassword,
full_name: 'E2E Traveler',
});
expect(registerResponse.status).toBe(201);
expect(registerResponse.body.message).toBe('User registered successfully!');
// 2. Login to get the access token
const loginResponse = await request.post('/api/auth/login').send({
email: userEmail,
password: userPassword,
});
expect(loginResponse.status).toBe(200);
authToken = loginResponse.body.token;
userId = loginResponse.body.userprofile.user.user_id;
expect(authToken).toBeDefined();
expect(userId).toBeDefined();
// 3. Create a Shopping List
const createListResponse = await request
.post('/api/users/shopping-lists')
.set('Authorization', `Bearer ${authToken}`)
.send({ name: 'E2E Party List' });
expect(createListResponse.status).toBe(201);
shoppingListId = createListResponse.body.shopping_list_id;
expect(shoppingListId).toBeDefined();
// 4. Add an item to the list
const addItemResponse = await request
.post(`/api/users/shopping-lists/${shoppingListId}/items`)
.set('Authorization', `Bearer ${authToken}`)
.send({ customItemName: 'Chips' });
expect(addItemResponse.status).toBe(201);
expect(addItemResponse.body.custom_item_name).toBe('Chips');
// 5. Verify the list and item exist via GET
const getListsResponse = await request
.get('/api/users/shopping-lists')
.set('Authorization', `Bearer ${authToken}`);
expect(getListsResponse.status).toBe(200);
const myLists = getListsResponse.body;
const targetList = myLists.find((l: any) => l.shopping_list_id === shoppingListId);
expect(targetList).toBeDefined();
expect(targetList.items).toHaveLength(1);
expect(targetList.items[0].custom_item_name).toBe('Chips');
// 6. Delete the User Account (Self-Service)
const deleteAccountResponse = await request
.delete('/api/users/account')
.set('Authorization', `Bearer ${authToken}`)
.send({ password: userPassword });
expect(deleteAccountResponse.status).toBe(200);
expect(deleteAccountResponse.body.message).toBe('Account deleted successfully.');
// 7. Verify Login is no longer possible
const failLoginResponse = await request.post('/api/auth/login').send({
email: userEmail,
password: userPassword,
});
expect(failLoginResponse.status).toBe(401);
// Mark userId as null so afterAll doesn't attempt to delete it again
userId = null;
});
});

View File

@@ -1,10 +1,16 @@
// src/tests/integration/admin.integration.test.ts
import { describe, it, expect, beforeAll, beforeEach } from 'vitest';
import * as apiClient from '../../services/apiClient';
import { describe, it, expect, beforeAll, beforeEach, afterAll } from 'vitest';
import supertest from 'supertest';
import app from '../../../server';
import { getPool } from '../../services/db/connection.db';
import type { UserProfile } from '../../types';
import { createAndLoginUser } from '../utils/testHelpers';
/**
* @vitest-environment node
*/
const request = supertest(app);
describe('Admin API Routes Integration Tests', () => {
let adminToken: string;
let adminUser: UserProfile;
@@ -42,8 +48,10 @@ describe('Admin API Routes Integration Tests', () => {
describe('GET /api/admin/stats', () => {
it('should allow an admin to fetch application stats', async () => {
const response = await apiClient.getApplicationStats(adminToken);
const stats = await response.json();
const response = await request
.get('/api/admin/stats')
.set('Authorization', `Bearer ${adminToken}`);
const stats = response.body;
expect(stats).toBeDefined();
expect(stats).toHaveProperty('flyerCount');
expect(stats).toHaveProperty('userCount');
@@ -51,18 +59,21 @@ describe('Admin API Routes Integration Tests', () => {
});
it('should forbid a regular user from fetching application stats', async () => {
const response = await apiClient.getApplicationStats(regularUserToken);
expect(response.ok).toBe(false);
const response = await request
.get('/api/admin/stats')
.set('Authorization', `Bearer ${regularUserToken}`);
expect(response.status).toBe(403);
const errorData = await response.json();
const errorData = response.body;
expect(errorData.message).toBe('Forbidden: Administrator access required.');
});
});
describe('GET /api/admin/stats/daily', () => {
it('should allow an admin to fetch daily stats', async () => {
const response = await apiClient.getDailyStats(adminToken);
const dailyStats = await response.json();
const response = await request
.get('/api/admin/stats/daily')
.set('Authorization', `Bearer ${adminToken}`);
const dailyStats = response.body;
expect(dailyStats).toBeDefined();
expect(Array.isArray(dailyStats)).toBe(true);
// We just created users in beforeAll, so we should have data
@@ -73,10 +84,11 @@ describe('Admin API Routes Integration Tests', () => {
});
it('should forbid a regular user from fetching daily stats', async () => {
const response = await apiClient.getDailyStats(regularUserToken);
expect(response.ok).toBe(false);
const response = await request
.get('/api/admin/stats/daily')
.set('Authorization', `Bearer ${regularUserToken}`);
expect(response.status).toBe(403);
const errorData = await response.json();
const errorData = response.body;
expect(errorData.message).toBe('Forbidden: Administrator access required.');
});
});
@@ -85,25 +97,30 @@ describe('Admin API Routes Integration Tests', () => {
it('should allow an admin to fetch suggested corrections', async () => {
// This test just verifies access and correct response shape.
// More detailed tests would require seeding corrections.
const response = await apiClient.getSuggestedCorrections(adminToken);
const corrections = await response.json();
const response = await request
.get('/api/admin/corrections')
.set('Authorization', `Bearer ${adminToken}`);
const corrections = response.body;
expect(corrections).toBeDefined();
expect(Array.isArray(corrections)).toBe(true);
});
it('should forbid a regular user from fetching suggested corrections', async () => {
const response = await apiClient.getSuggestedCorrections(regularUserToken);
expect(response.ok).toBe(false);
const response = await request
.get('/api/admin/corrections')
.set('Authorization', `Bearer ${regularUserToken}`);
expect(response.status).toBe(403);
const errorData = await response.json();
const errorData = response.body;
expect(errorData.message).toBe('Forbidden: Administrator access required.');
});
});
describe('GET /api/admin/brands', () => {
it('should allow an admin to fetch all brands', async () => {
const response = await apiClient.fetchAllBrands(adminToken);
const brands = await response.json();
const response = await request
.get('/api/admin/brands')
.set('Authorization', `Bearer ${adminToken}`);
const brands = response.body;
expect(brands).toBeDefined();
expect(Array.isArray(brands)).toBe(true);
// Even if no brands exist, it should return an array.
@@ -112,10 +129,11 @@ describe('Admin API Routes Integration Tests', () => {
});
it('should forbid a regular user from fetching all brands', async () => {
const response = await apiClient.fetchAllBrands(regularUserToken);
expect(response.ok).toBe(false);
const response = await request
.get('/api/admin/brands')
.set('Authorization', `Bearer ${regularUserToken}`);
expect(response.status).toBe(403);
const errorData = await response.json();
const errorData = response.body;
expect(errorData.message).toBe('Forbidden: Administrator access required.');
});
});
@@ -170,8 +188,10 @@ describe('Admin API Routes Integration Tests', () => {
it('should allow an admin to approve a correction', async () => {
// Act: Approve the correction.
const response = await apiClient.approveCorrection(testCorrectionId, adminToken);
expect(response.ok).toBe(true);
const response = await request
.post(`/api/admin/corrections/${testCorrectionId}/approve`)
.set('Authorization', `Bearer ${adminToken}`);
expect(response.status).toBe(200);
// Assert: Verify the flyer item's price was updated and the correction status changed.
const { rows: itemRows } = await getPool().query(
@@ -189,8 +209,10 @@ describe('Admin API Routes Integration Tests', () => {
it('should allow an admin to reject a correction', async () => {
// Act: Reject the correction.
const response = await apiClient.rejectCorrection(testCorrectionId, adminToken);
expect(response.ok).toBe(true);
const response = await request
.post(`/api/admin/corrections/${testCorrectionId}/reject`)
.set('Authorization', `Bearer ${adminToken}`);
expect(response.status).toBe(200);
// Assert: Verify the correction status changed.
const { rows: correctionRows } = await getPool().query(
@@ -202,12 +224,11 @@ describe('Admin API Routes Integration Tests', () => {
it('should allow an admin to update a correction', async () => {
// Act: Update the suggested value of the correction.
const response = await apiClient.updateSuggestedCorrection(
testCorrectionId,
'300',
adminToken,
);
const updatedCorrection = await response.json();
const response = await request
.put(`/api/admin/corrections/${testCorrectionId}`)
.set('Authorization', `Bearer ${adminToken}`)
.send({ suggested_value: '300' });
const updatedCorrection = response.body;
// Assert: Verify the API response and the database state.
expect(updatedCorrection.suggested_value).toBe('300');
@@ -227,8 +248,11 @@ describe('Admin API Routes Integration Tests', () => {
const recipeId = recipeRes.rows[0].recipe_id;
// Act: Update the status to 'public'.
const response = await apiClient.updateRecipeStatus(recipeId, 'public', adminToken);
expect(response.ok).toBe(true);
const response = await request
.put(`/api/admin/recipes/${recipeId}/status`)
.set('Authorization', `Bearer ${adminToken}`)
.send({ status: 'public' });
expect(response.status).toBe(200);
// Assert: Verify the status was updated in the database.
const { rows: updatedRecipeRows } = await getPool().query(

View File

@@ -1,6 +1,7 @@
// src/tests/integration/ai.integration.test.ts
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import * as aiApiClient from '../../services/aiApiClient';
import supertest from 'supertest';
import app from '../../../server';
import fs from 'node:fs/promises';
import path from 'path';
import { createAndLoginUser } from '../utils/testHelpers';
@@ -9,6 +10,8 @@ import { createAndLoginUser } from '../utils/testHelpers';
* @vitest-environment node
*/
const request = supertest(app);
interface TestGeolocationCoordinates {
latitude: number;
longitude: number;
@@ -44,46 +47,67 @@ describe('AI API Routes Integration Tests', () => {
});
it('POST /api/ai/check-flyer should return a boolean', async () => {
const mockImageFile = new File(['content'], 'test.jpg', { type: 'image/jpeg' });
const response = await aiApiClient.isImageAFlyer(mockImageFile, authToken);
const result = await response.json();
const response = await request
.post('/api/ai/check-flyer')
.set('Authorization', `Bearer ${authToken}`)
.attach('image', Buffer.from('content'), 'test.jpg');
const result = response.body;
expect(response.status).toBe(200);
// The backend is stubbed to always return true for this check
expect(result.is_flyer).toBe(true);
});
it('POST /api/ai/extract-address should return a stubbed address', async () => {
const mockImageFile = new File(['content'], 'test.jpg', { type: 'image/jpeg' });
const response = await aiApiClient.extractAddressFromImage(mockImageFile, authToken);
const result = await response.json();
expect(result.address).toBe('123 AI Street, Server City');
const response = await request
.post('/api/ai/extract-address')
.set('Authorization', `Bearer ${authToken}`)
.attach('image', Buffer.from('content'), 'test.jpg');
const result = response.body;
expect(response.status).toBe(200);
expect(result.address).toBe('not identified');
});
it('POST /api/ai/extract-logo should return a stubbed response', async () => {
const mockImageFile = new File(['content'], 'test.jpg', { type: 'image/jpeg' });
const response = await aiApiClient.extractLogoFromImage([mockImageFile], authToken);
const result = await response.json();
const response = await request
.post('/api/ai/extract-logo')
.set('Authorization', `Bearer ${authToken}`)
.attach('images', Buffer.from('content'), 'test.jpg');
const result = response.body;
expect(response.status).toBe(200);
expect(result).toEqual({ store_logo_base_64: null });
});
it('POST /api/ai/quick-insights should return a stubbed insight', async () => {
const response = await aiApiClient.getQuickInsights([], undefined, authToken);
const result = await response.json();
const response = await request
.post('/api/ai/quick-insights')
.set('Authorization', `Bearer ${authToken}`)
.send({ items: [{ item: 'test' }] });
const result = response.body;
expect(response.status).toBe(200);
expect(result.text).toBe('This is a server-generated quick insight: buy the cheap stuff!');
});
it('POST /api/ai/deep-dive should return a stubbed analysis', async () => {
const response = await aiApiClient.getDeepDiveAnalysis([], undefined, authToken);
const result = await response.json();
const response = await request
.post('/api/ai/deep-dive')
.set('Authorization', `Bearer ${authToken}`)
.send({ items: [{ item: 'test' }] });
const result = response.body;
expect(response.status).toBe(200);
expect(result.text).toBe('This is a server-generated deep dive analysis. It is very detailed.');
});
it('POST /api/ai/search-web should return a stubbed search result', async () => {
const response = await aiApiClient.searchWeb([], undefined, authToken);
const result = await response.json();
const response = await request
.post('/api/ai/search-web')
.set('Authorization', `Bearer ${authToken}`)
.send({ query: 'test query' });
const result = response.body;
expect(response.status).toBe(200);
expect(result).toEqual({ text: 'The web says this is good.', sources: [] });
});
it('POST /api/ai/plan-trip should return a stubbed trip plan', async () => {
it('POST /api/ai/plan-trip should return an error as the feature is disabled', async () => {
// The GeolocationCoordinates type requires more than just lat/lng.
// We create a complete mock object to satisfy the type.
const mockLocation: TestGeolocationCoordinates = {
@@ -94,38 +118,50 @@ describe('AI API Routes Integration Tests', () => {
altitudeAccuracy: null,
heading: null,
speed: null,
toJSON: () => ({}),
toJSON: function () {
return {
latitude: this.latitude,
longitude: this.longitude,
accuracy: this.accuracy,
altitude: this.altitude,
altitudeAccuracy: this.altitudeAccuracy,
heading: this.heading,
speed: this.speed,
};
},
};
const response = await aiApiClient.planTripWithMaps(
[],
undefined,
mockLocation,
undefined,
authToken,
);
const result = await response.json();
expect(result).toBeDefined();
// The AI service is mocked in unit tests, but in integration it might be live.
// For now, we just check that we get a text response.
expect(result.text).toBeTypeOf('string');
const mockStore = {
name: 'Test Store for Trip',
store_id: 1,
created_at: new Date().toISOString(),
updated_at: new Date().toISOString(),
};
const response = await request
.post('/api/ai/plan-trip')
.set('Authorization', `Bearer ${authToken}`)
.send({ items: [], store: mockStore, userLocation: mockLocation });
// The service for this endpoint is disabled and throws an error, which results in a 500.
expect(response.status).toBe(500);
const errorResult = response.body;
expect(errorResult.message).toContain('planTripWithMaps');
});
it('POST /api/ai/generate-image should reject because it is not implemented', async () => {
// The backend for this is not stubbed and will throw an error.
// This test confirms that the endpoint is protected and responds as expected to a failure.
const response = await aiApiClient.generateImageFromText('a test prompt', undefined, authToken);
expect(response.ok).toBe(false);
const response = await request
.post('/api/ai/generate-image')
.set('Authorization', `Bearer ${authToken}`)
.send({ prompt: 'a test prompt' });
expect(response.status).toBe(501);
});
it('POST /api/ai/generate-speech should reject because it is not implemented', async () => {
// The backend for this is not stubbed and will throw an error.
const response = await aiApiClient.generateSpeechFromText(
'a test prompt',
undefined,
authToken,
);
expect(response.ok).toBe(false);
const response = await request
.post('/api/ai/generate-speech')
.set('Authorization', `Bearer ${authToken}`)
.send({ text: 'a test prompt' });
expect(response.status).toBe(501);
});
});
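
One subtlety of this migration: the browser File carried an explicit MIME type ('image/jpeg'), while .attach with a bare Buffer and filename leaves superagent to infer the type from the extension. If the backend ever validates Content-Type strictly, superagent's options form can set it explicitly; a sketch against the same endpoint and token, inside a test body:

// Sketch: attach with an explicit content type via superagent's options object.
const res = await request
  .post('/api/ai/check-flyer')
  .set('Authorization', `Bearer ${authToken}`)
  .attach('image', Buffer.from('content'), { filename: 'test.jpg', contentType: 'image/jpeg' });
expect(res.status).toBe(200);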

View File

@@ -1,6 +1,7 @@
// src/tests/integration/auth.integration.test.ts
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import { loginUser } from '../../services/apiClient';
import supertest from 'supertest';
import app from '../../../server';
import { getPool } from '../../services/db/connection.db';
import { createAndLoginUser, TEST_PASSWORD } from '../utils/testHelpers';
import type { UserProfile } from '../../types';
@@ -9,6 +10,8 @@ import type { UserProfile } from '../../types';
* @vitest-environment node
*/
const request = supertest(app);
/**
* These are integration tests that verify the authentication flow against a running backend server.
* Make sure your Express server is running before executing these tests.
@@ -16,30 +19,6 @@ import type { UserProfile } from '../../types';
* To run only these tests: `vitest run src/tests/auth.integration.test.ts`
*/
describe('Authentication API Integration', () => {
// --- START DEBUG LOGGING ---
// Query the DB from within the test file to see its state.
getPool()
.query(
'SELECT u.user_id, u.email, p.role FROM public.users u JOIN public.profiles p ON u.user_id = p.user_id',
)
.then((res) => {
console.log('\n--- [auth.integration.test.ts] Users found in DB from TEST perspective: ---');
console.table(res.rows);
console.log('--------------------------------------------------------------------------\n');
})
.catch((err) => console.error('--- [auth.integration.test.ts] DB QUERY FAILED ---', err));
// --- END DEBUG LOGGING ---
// --- START DEBUG LOGGING ---
// Log the database connection details as seen by an individual TEST FILE.
console.log('\n\n--- [AUTH.INTEGRATION.TEST LOG] DATABASE CONNECTION ---');
console.log(` Host: ${process.env.DB_HOST}`);
console.log(` Port: ${process.env.DB_PORT}`);
console.log(` User: ${process.env.DB_USER}`);
console.log(` Database: ${process.env.DB_NAME}`);
console.log('-----------------------------------------------------\n');
// --- END DEBUG LOGGING ---
let testUserEmail: string;
let testUser: UserProfile;
@@ -57,11 +36,14 @@ describe('Authentication API Integration', () => {
// This test migrates the logic from the old DevTestRunner.tsx component.
it('should successfully log in a registered user', async () => {
// The `rememberMe` parameter is required. For a test, `false` is a safe default.
const response = await loginUser(testUserEmail, TEST_PASSWORD, false);
const data = await response.json();
const response = await request
.post('/api/auth/login')
.send({ email: testUserEmail, password: TEST_PASSWORD, rememberMe: false });
const data = response.body;
// Assert that the API returns the expected structure
expect(data).toBeDefined();
expect(response.status).toBe(200);
expect(data.userprofile).toBeDefined();
expect(data.userprofile.user.email).toBe(testUserEmail);
expect(data.userprofile.user.user_id).toBeTypeOf('string');
@@ -74,9 +56,11 @@ describe('Authentication API Integration', () => {
const wrongPassword = 'wrongpassword';
// The loginUser function returns a Response object. We check its status.
const response = await loginUser(adminEmail, wrongPassword, false);
expect(response.ok).toBe(false);
const errorData = await response.json();
const response = await request
.post('/api/auth/login')
.send({ email: adminEmail, password: wrongPassword, rememberMe: false });
expect(response.status).toBe(401);
const errorData = response.body;
expect(errorData.message).toBe('Incorrect email or password.');
});
@@ -85,9 +69,11 @@ describe('Authentication API Integration', () => {
const anyPassword = 'any-password';
// The loginUser function returns a Response object. We check its status.
const response = await loginUser(nonExistentEmail, anyPassword, false);
expect(response.ok).toBe(false);
const errorData = await response.json();
const response = await request
.post('/api/auth/login')
.send({ email: nonExistentEmail, password: anyPassword, rememberMe: false });
expect(response.status).toBe(401);
const errorData = response.body;
// Security best practice: the error message should be identical for wrong password and wrong email
// to prevent user enumeration attacks.
expect(errorData.message).toBe('Incorrect email or password.');
@@ -96,24 +82,21 @@ describe('Authentication API Integration', () => {
it('should successfully refresh an access token using a refresh token cookie', async () => {
// Arrange: Log in to get a fresh, valid refresh token cookie for this specific test.
// This ensures the test is self-contained and not affected by other tests.
const loginResponse = await loginUser(testUserEmail, TEST_PASSWORD, true);
const setCookieHeader = loginResponse.headers.get('set-cookie');
const refreshTokenCookie = setCookieHeader?.split(';')[0];
const loginResponse = await request
.post('/api/auth/login')
.send({ email: testUserEmail, password: TEST_PASSWORD, rememberMe: true });
const refreshTokenCookie = loginResponse.headers['set-cookie'][0].split(';')[0];
expect(refreshTokenCookie).toBeDefined();
// Act: Make a request to the refresh-token endpoint, including the cookie.
const apiUrl = process.env.VITE_API_BASE_URL || 'http://localhost:3001/api';
const response = await fetch(`${apiUrl}/auth/refresh-token`, {
method: 'POST',
headers: {
Cookie: refreshTokenCookie!,
},
});
const response = await request
.post('/api/auth/refresh-token')
.set('Cookie', refreshTokenCookie!);
// Assert: Check for a successful response and a new access token.
expect(response.ok).toBe(true);
const data = await response.json();
expect(response.status).toBe(200);
const data = response.body;
expect(data.token).toBeTypeOf('string');
});
@@ -122,40 +105,30 @@ describe('Authentication API Integration', () => {
const invalidRefreshTokenCookie = 'refreshToken=this-is-not-a-valid-token';
// Act: Make a request to the refresh-token endpoint with the invalid cookie.
const apiUrl = process.env.VITE_API_BASE_URL || 'http://localhost:3001/api';
const response = await fetch(`${apiUrl}/auth/refresh-token`, {
method: 'POST',
headers: {
Cookie: invalidRefreshTokenCookie,
},
});
const response = await request
.post('/api/auth/refresh-token')
.set('Cookie', invalidRefreshTokenCookie);
// Assert: Check for a 403 Forbidden response.
expect(response.ok).toBe(false);
expect(response.status).toBe(403);
const data = await response.json();
const data = response.body;
expect(data.message).toBe('Invalid or expired refresh token.');
});
it('should successfully log out and clear the refresh token cookie', async () => {
// Arrange: Log in to get a valid refresh token cookie.
const loginResponse = await loginUser(testUserEmail, TEST_PASSWORD, true);
const setCookieHeader = loginResponse.headers.get('set-cookie');
const refreshTokenCookie = setCookieHeader?.split(';')[0];
const loginResponse = await request
.post('/api/auth/login')
.send({ email: testUserEmail, password: TEST_PASSWORD, rememberMe: true });
const refreshTokenCookie = loginResponse.headers['set-cookie'][0].split(';')[0];
expect(refreshTokenCookie).toBeDefined();
// Act: Make a request to the new logout endpoint, including the cookie.
const apiUrl = process.env.VITE_API_BASE_URL || 'http://localhost:3001/api';
const response = await fetch(`${apiUrl}/auth/logout`, {
method: 'POST',
headers: {
Cookie: refreshTokenCookie!,
},
});
const response = await request.post('/api/auth/logout').set('Cookie', refreshTokenCookie!);
// Assert: Check for a successful response and a cookie-clearing header.
expect(response.ok).toBe(true);
const logoutSetCookieHeader = response.headers.get('set-cookie');
expect(response.status).toBe(200);
const logoutSetCookieHeader = response.headers['set-cookie'][0];
expect(logoutSetCookieHeader).toContain('refreshToken=;');
expect(logoutSetCookieHeader).toContain('Max-Age=0');
});
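
Taken together, these integration rewrites all follow one shape: an apiClient fetch wrapper becomes an in-process supertest call plus an Authorization header, and response.json() becomes response.body. A small helper could compress the repetition; illustrative only, written against the request instance each file already defines:

// Hypothetical helper; `request` is the per-file supertest instance.
const authed = (token: string) => ({
  get: (url: string) => request.get(url).set('Authorization', `Bearer ${token}`),
  post: (url: string, body?: object) =>
    request.post(url).set('Authorization', `Bearer ${token}`).send(body),
});

// Usage: const stats = (await authed(adminToken).get('/api/admin/stats')).body;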

Some files were not shown because too many files have changed in this diff.