Compare commits

...

10 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Gitea Actions | 5274650aea | ci: Bump version to 0.0.22 [skip ci] | 2025-12-24 08:10:33 +05:00 |
| | de5a9a565b | Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com (Deploy to Test Environment: successful in 17m19s) | 2025-12-23 19:08:59 -08:00 |
| | 10a379c5e3 | fix for integration tests 404 ? not sure this is right | 2025-12-23 19:08:53 -08:00 |
| | a6a484d432 | fix for integration tests 404 ? not sure this is right | 2025-12-23 18:13:35 -08:00 |
| Gitea Actions | 4b0a172c35 | ci: Bump version to 0.0.21 [skip ci] | 2025-12-24 07:06:33 +05:00 |
| | e8c894d5cf | Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com (Deploy to Test Environment: successful in 15m46s) | 2025-12-23 18:05:28 -08:00 |
| | 6c8fd4b126 | prettier ! | 2025-12-23 17:57:32 -08:00 |
| Gitea Actions | a1f52544d0 | ci: Bump version to 0.0.20 [skip ci] | 2025-12-24 06:49:05 +05:00 |
| | 2334359756 | Merge branch 'main' of https://gitea.projectium.com/torbo/flyer-crawler.projectium.com (Deploy to Test Environment: successful in 14m31s) | 2025-12-23 17:48:07 -08:00 |
| | 406954ca06 | database fixes due to adding foreign keys and seed script improvement | 2025-12-23 17:48:00 -08:00 |
38 changed files with 790 additions and 839 deletions


@@ -6,4 +6,4 @@
"tabWidth": 2,
"useTabs": false,
"endOfLine": "auto"
}

README.md

@@ -2,7 +2,7 @@
Flyer Crawler is a web application that uses the Google Gemini AI to extract, analyze, and manage data from grocery store flyers. Users can upload flyer images or PDFs, and the application will automatically identify items, prices, and sale dates, storing the structured data in a PostgreSQL database for historical analysis, price tracking, and personalized deal alerts.
We are working on an app to help people save money by finding good deals that are only advertised in store flyers/ads. The primary purpose of the site is to make uploading flyers as easy and accurate as possible, and to store people's needs so sales can be matched to them.
## Features
@@ -45,9 +45,9 @@ This project is configured to run in a CI/CD environment and does not use `.env`
1. **Set up a PostgreSQL database instance.**
2. **Run the Database Schema**:
- Connect to your database using a tool like `psql` or DBeaver.
- Open `sql/schema.sql.txt`, copy its entire contents, and execute it against your database.
- This will create all necessary tables, functions, and relationships.
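If you prefer scripting that step instead of copy-pasting, a minimal sketch (assuming a pg-style async `query` interface; `loadSchema` and `applySchema` are hypothetical helpers, not part of the repo):

```typescript
import { readFileSync } from "node:fs";

// Hypothetical helper: read the schema file from the repo.
function loadSchema(path = "sql/schema.sql.txt"): string {
  return readFileSync(path, "utf8");
}

// Hypothetical helper: run the whole schema in one call.
// The schema uses IF NOT EXISTS, so re-running it is safe.
async function applySchema(
  client: { query: (sql: string) => Promise<unknown> },
  sql: string,
): Promise<void> {
  await client.query(sql);
}
```

With the real `pg` package, `client` would be a connected `Client` pointing at the same database `psql` uses.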
### Step 2: Install Dependencies and Run the Application
@@ -79,11 +79,11 @@ sudo nano /etc/nginx/mime.types
change
application/javascript js;
TO
application/javascript js mjs;
RESTART NGINX
@@ -95,7 +95,7 @@ actually the proper change was to do this in the /etc/nginx/sites-available/flye
## for OAuth
1. Get Google OAuth Credentials
This is a crucial step that you must do outside the codebase:
Go to the Google Cloud Console.
@@ -112,7 +112,7 @@ Under Authorized redirect URIs, click ADD URI and enter the URL where Google wil
Click Create. You will be given a Client ID and a Client Secret.
2. Get GitHub OAuth Credentials
You'll need to obtain a Client ID and Client Secret from GitHub:
Go to your GitHub profile settings.
@@ -133,21 +133,23 @@ You will be given a Client ID and a Client Secret.
psql -h localhost -U flyer_crawler_user -d "flyer-crawler-prod" -W
## postgis
flyer-crawler-prod=> SELECT version();
version
---
PostgreSQL 14.19 (Ubuntu 14.19-0ubuntu0.22.04.1) on x86_64-pc-linux-gnu, compiled by gcc (Ubuntu 11.4.0-1ubuntu1~22.04.2) 11.4.0, 64-bit
(1 row)
flyer-crawler-prod=> SELECT PostGIS_Full_Version();
postgis_full_version
---
POSTGIS="3.2.0 c3e3cc0" [EXTENSION] PGSQL="140" GEOS="3.10.2-CAPI-1.16.0" PROJ="8.2.1" LIBXML="2.9.12" LIBJSON="0.15" LIBPROTOBUF="1.3.3" WAGYU="0.5.0 (Internal)"
(1 row)
## production postgres setup
@@ -201,9 +203,13 @@ Step 4: Seed the Admin Account (If Needed)
Your application has a separate script to create the initial admin user. To run it, you must first set the required environment variables in your shell session.
```bash
# Set variables for the current session
export DB_USER=flyer_crawler_user DB_PASSWORD=your_password DB_NAME="flyer-crawler-prod" ...

# Run the seeding script
npx tsx src/db/seed_admin_account.ts
```
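A quick pre-flight check that those variables are actually set avoids a confusing connection error later (a sketch only; the variable list is assumed from the export line above, and `missingVars` is a hypothetical helper, not part of the repo):

```typescript
// Assumed required variables, per the export line above.
const REQUIRED_VARS = ["DB_USER", "DB_PASSWORD", "DB_NAME"];

// Return the names that are unset or empty.
function missingVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED_VARS.filter((name) => !env[name]);
}

const missing = missingVars(process.env);
if (missing.length > 0) {
  console.error(`Set these before running the seed script: ${missing.join(", ")}`);
}
```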
Your production database is now ready!
@@ -284,8 +290,6 @@ Test Execution: Your tests run against this clean, isolated schema.
This approach is faster, more reliable, and removes the need for sudo access within the CI pipeline.
gitea-runner@projectium:~$ pm2 install pm2-logrotate
[PM2][Module] Installing NPM pm2-logrotate module
[PM2][Module] Calling [NPM] to install pm2-logrotate ...
@@ -293,7 +297,7 @@ gitea-runner@projectium:~$ pm2 install pm2-logrotate
added 161 packages in 5s
21 packages are looking for funding
run `npm fund` for details
npm notice
npm notice New patch version of npm available! 11.6.3 -> 11.6.4
npm notice Changelog: https://github.com/npm/cli/releases/tag/v11.6.4
@@ -308,23 +312,23 @@ $ pm2 set pm2-logrotate:retain 30
$ pm2 set pm2-logrotate:compress false
$ pm2 set pm2-logrotate:dateFormat YYYY-MM-DD_HH-mm-ss
$ pm2 set pm2-logrotate:workerInterval 30
$ pm2 set pm2-logrotate:rotateInterval 0 0 * * *
$ pm2 set pm2-logrotate:rotateInterval 0 0 \* \* _
$ pm2 set pm2-logrotate:rotateModule true
Modules configuration. Copy/Paste line to edit values.
[PM2][Module] Module successfully installed and launched
[PM2][Module] Checkout module options: `$ pm2 conf`
┌────┬───────────────────────────────────┬─────────────┬─────────┬─────────┬──────────┬────────┬──────┬───────────┬──────────┬──────────┬──────────┬──────────┐
│ id │ name │ namespace │ version │ mode │ pid │ uptime │ ↺ │ status │ cpu │ mem │ user │ watching │
├────┼───────────────────────────────────┼─────────────┼─────────┼─────────┼──────────┼────────┼──────┼───────────┼──────────┼──────────┼──────────┼──────────┤
│ 2 │ flyer-crawler-analytics-worker │ default │ 0.0.0 │ fork │ 3846981 │ 7m │ 5 │ online │ 0% │ 55.8mb │ git… │ disabled │
│ 11 │ flyer-crawler-api │ default │ 0.0.0 │ fork │ 3846987 │ 7m │ 0 │ online │ 0% │ 59.0mb │ git… │ disabled │
│ 12 │ flyer-crawler-worker │ default │ 0.0.0 │ fork │ 3846988 │ 7m │ 0 │ online │ 0% │ 54.2mb │ git… │ disabled │
└────┴───────────────────────────────────┴─────────────┴─────────┴─────────┴──────────┴────────┴──────┴───────────┴──────────┴──────────┴──────────┴──────────┘
Module
┌────┬──────────────────────────────┬───────────────┬──────────┬──────────┬──────┬──────────┬──────────┬──────────┐
│ id │ module │ version │ pid │ status │ ↺ │ cpu │ mem │ user │
├────┼──────────────────────────────┼───────────────┼──────────┼──────────┼──────┼──────────┼──────────┼──────────┤
│ 13 │ pm2-logrotate │ 3.0.0 │ 3848878 │ online │ 0 │ 0% │ 20.1mb │ git… │
└────┴──────────────────────────────┴───────────────┴──────────┴──────────┴──────┴──────────┴──────────┴──────────┘
gitea-runner@projectium:~$ pm2 set pm2-logrotate:max_size 10M
[PM2] Module pm2-logrotate restarted
@@ -335,7 +339,7 @@ $ pm2 set pm2-logrotate:retain 30
$ pm2 set pm2-logrotate:compress false
$ pm2 set pm2-logrotate:dateFormat YYYY-MM-DD_HH-mm-ss
$ pm2 set pm2-logrotate:workerInterval 30
$ pm2 set pm2-logrotate:rotateInterval 0 0 * * *
$ pm2 set pm2-logrotate:rotateInterval 0 0 _ \* _
$ pm2 set pm2-logrotate:rotateModule true
gitea-runner@projectium:~$ pm2 set pm2-logrotate:retain 14
[PM2] Module pm2-logrotate restarted
@@ -346,33 +350,31 @@ $ pm2 set pm2-logrotate:retain 14
$ pm2 set pm2-logrotate:compress false
$ pm2 set pm2-logrotate:dateFormat YYYY-MM-DD_HH-mm-ss
$ pm2 set pm2-logrotate:workerInterval 30
$ pm2 set pm2-logrotate:rotateInterval 0 0 * * *
$ pm2 set pm2-logrotate:rotateInterval 0 0 _ \* \*
$ pm2 set pm2-logrotate:rotateModule true
gitea-runner@projectium:~$
## dev server setup:
Here are the steps to set up the development environment on Windows using Podman with an Ubuntu container:
1. Install Prerequisites on Windows
Install WSL 2: Podman on Windows relies on the Windows Subsystem for Linux. Install it by running wsl --install in an administrator PowerShell.
Install Podman Desktop: Download and install Podman Desktop for Windows.
2. Set Up Podman
Initialize Podman: Launch Podman Desktop. It will automatically set up its WSL 2 machine.
Start Podman: Ensure the Podman machine is running from the Podman Desktop interface.
3. Set Up the Ubuntu Container
- Pull Ubuntu Image: Open a PowerShell or command prompt and pull the latest Ubuntu image:
podman pull ubuntu:latest
- Create a Podman Volume: Create a volume to persist node_modules and avoid installing them every time the container starts.
podman volume create node_modules_cache
- Run the Ubuntu Container: Start a new container with the project directory mounted and the necessary ports forwarded.
- Open a terminal in your project's root directory on Windows.
- Run the following command, replacing D:\gitea\flyer-crawler.projectium.com\flyer-crawler.projectium.com with the full path to your project:
podman run -it -p 3001:3001 -p 5173:5173 --name flyer-dev -v "D:\gitea\flyer-crawler.projectium.com\flyer-crawler.projectium.com:/app" -v "node_modules_cache:/app/node_modules" ubuntu:latest
@@ -383,46 +385,40 @@ podman run -it -p 3001:3001 -p 5173:5173 --name flyer-dev -v "D:\gitea\flyer-cra
-v "node_modules_cache:/app/node_modules": Mounts the named volume for node_modules.
4. Configure the Ubuntu Environment
You are now inside the Ubuntu container's shell.
- Update Package Lists:
apt-get update
- Install Dependencies: Install curl, git, and nodejs (which includes npm).
apt-get install -y curl git
curl -sL https://deb.nodesource.com/setup_20.x | bash -
apt-get install -y nodejs
- Navigate to Project Directory:
cd /app
- Install Project Dependencies:
npm install
5. Run the Development Server
- Start the Application:
npm run dev
6. Accessing the Application
- Frontend: Open your browser and go to http://localhost:5173.
- Backend: The frontend will make API calls to http://localhost:3001.
Managing the Environment
- Stopping the Container: Press Ctrl+C in the container terminal, then type exit.
- Restarting the Container:
podman start -a -i flyer-dev
## for me:
cd /mnt/d/gitea/flyer-crawler.projectium.com/flyer-crawler.projectium.com
podman run -it -p 3001:3001 -p 5173:5173 --name flyer-dev -v "$(pwd):/app" -v "node_modules_cache:/app/node_modules" ubuntu:latest
## rate limiting
This makes the app respect the AI service's rate limits, keeping it stable and robust. You can adjust the GEMINI_RPM environment variable in your production environment as needed without changing the code.
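As a rough illustration of what a GEMINI_RPM-style throttle does (a sketch only — this is not the app's actual limiter, and the default of 15 RPM is an assumption):

```typescript
// Space calls at least 60000 / GEMINI_RPM milliseconds apart.
const rpm = Number(process.env.GEMINI_RPM ?? "15"); // 15 is an assumed default
const minIntervalMs = 60_000 / rpm;

let nextSlot = 0; // timestamp (ms) when the next call may start

async function throttled<T>(fn: () => Promise<T>): Promise<T> {
  const now = Date.now();
  const wait = Math.max(0, nextSlot - now);
  nextSlot = Math.max(now, nextSlot) + minIntervalMs; // reserve the next slot
  if (wait > 0) await new Promise((resolve) => setTimeout(resolve, wait));
  return fn();
}
```

Wrapping each Gemini request in `throttled(...)` keeps bursts from exceeding the configured rate.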


@@ -15,7 +15,7 @@ module.exports = {
args: 'server.ts', // tsx will execute this file
// Production Environment Settings
env_production: {
NODE_ENV: 'production', // Set the Node.js environment to production
name: 'flyer-crawler-api',
cwd: '/var/www/flyer-crawler.projectium.com',
},
@@ -63,4 +63,4 @@ module.exports = {
},
},
],
};


@@ -1,21 +1,21 @@
import globals from "globals";
import tseslint from "typescript-eslint";
import pluginReact from "eslint-plugin-react";
import pluginReactHooks from "eslint-plugin-react-hooks";
import pluginReactRefresh from "eslint-plugin-react-refresh";
import globals from 'globals';
import tseslint from 'typescript-eslint';
import pluginReact from 'eslint-plugin-react';
import pluginReactHooks from 'eslint-plugin-react-hooks';
import pluginReactRefresh from 'eslint-plugin-react-refresh';
export default tseslint.config(
{
// Global ignores
ignores: ["dist", ".gitea", "node_modules", "*.cjs"],
ignores: ['dist', '.gitea', 'node_modules', '*.cjs'],
},
{
// All files
files: ["**/*.{js,mjs,cjs,ts,jsx,tsx}"],
files: ['**/*.{js,mjs,cjs,ts,jsx,tsx}'],
plugins: {
react: pluginReact,
"react-hooks": pluginReactHooks,
"react-refresh": pluginReactRefresh,
'react-hooks': pluginReactHooks,
'react-refresh': pluginReactRefresh,
},
languageOptions: {
globals: {
@@ -24,12 +24,9 @@ export default tseslint.config(
},
},
rules: {
"react-refresh/only-export-components": [
"warn",
{ allowConstantExport: true },
],
'react-refresh/only-export-components': ['warn', { allowConstantExport: true }],
},
},
// TypeScript files
...tseslint.configs.recommended,
);

express.d.ts (vendored)

@@ -1,4 +1,4 @@
// src/types/express.d.ts
// express.d.ts
import { Logger } from 'pino';
/**
@@ -12,4 +12,4 @@ declare global {
log: Logger;
}
}
}


@@ -1,20 +1,20 @@
<!DOCTYPE html>
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Grocery Flyer AI Analyzer</title>
<style>
@import url('https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap');
body {
font-family: 'Inter', sans-serif;
}
</style>
<!-- The stylesheet will be injected here by Vite during the build process -->
</head>
<body>
<div id="root"></div>
<!-- Vite will inject the correct <script> tag here during the build process -->
<script type="module" src="/src/index.tsx"></script>
</body>
</html>
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Grocery Flyer AI Analyzer</title>
<style>
@import url('https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap');
body {
font-family: 'Inter', sans-serif;
}
</style>
<!-- The stylesheet will be injected here by Vite during the build process -->
</head>
<body>
<div id="root"></div>
<!-- Vite will inject the correct <script> tag here during the build process -->
<script type="module" src="/src/index.tsx"></script>
</body>
</html>


@@ -1,8 +1,5 @@
{
"name": "Flyer Crawler",
"description": "Upload a grocery store flyer image to extract item details, prices, and quantities using AI. Get insights, meal plans, and compare prices to save money on your shopping.",
"requestFramePermissions": [
"geolocation",
"microphone"
]
}
"requestFramePermissions": ["geolocation", "microphone"]
}

package-lock.json (generated)

@@ -1,12 +1,12 @@
{
"name": "flyer-crawler",
"version": "0.0.19",
"version": "0.0.22",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "flyer-crawler",
"version": "0.0.19",
"version": "0.0.22",
"dependencies": {
"@bull-board/api": "^6.14.2",
"@bull-board/express": "^6.14.2",


@@ -1,7 +1,7 @@
{
"name": "flyer-crawler",
"private": true,
"version": "0.0.19",
"version": "0.0.22",
"type": "module",
"scripts": {
"dev": "concurrently \"npm:start:dev\" \"vite\"",


@@ -10,10 +10,13 @@ const tailwindConfigPath = path.resolve(process.cwd(), 'tailwind.config.js');
console.log(`[POSTCSS] Attempting to use Tailwind config at: ${tailwindConfigPath}`);
// Log to prove the imported config object is what we expect
console.log('[POSTCSS] Imported tailwind.config.js object:', JSON.stringify(tailwindConfig, null, 2));
console.log(
'[POSTCSS] Imported tailwind.config.js object:',
JSON.stringify(tailwindConfig, null, 2),
);
export default {
plugins: {
'@tailwindcss/postcss': {}, // The empty object is correct.
},
};


@@ -1030,11 +1030,61 @@ DROP FUNCTION IF EXISTS public.fork_recipe(UUID, BIGINT);
CREATE OR REPLACE FUNCTION public.fork_recipe(p_user_id UUID, p_original_recipe_id BIGINT)
RETURNS SETOF public.recipes
LANGUAGE sql
LANGUAGE plpgsql
SECURITY INVOKER
AS $$
-- The entire forking logic is now encapsulated in a single, atomic database function.
SELECT * FROM public.fork_recipe(p_user_id, p_original_recipe_id);
DECLARE
new_recipe_id BIGINT;
BEGIN
-- 1. Create a copy of the recipe, linking it to the new user and the original recipe.
INSERT INTO public.recipes (
user_id,
original_recipe_id,
name,
description,
instructions,
prep_time_minutes,
cook_time_minutes,
servings,
photo_url,
calories_per_serving,
protein_grams,
fat_grams,
carb_grams,
status -- Forked recipes should be private by default
)
SELECT
p_user_id,
p_original_recipe_id,
original.name || ' (Fork)', -- Append '(Fork)' to distinguish it
original.description,
original.instructions,
original.prep_time_minutes,
original.cook_time_minutes,
original.servings,
original.photo_url,
original.calories_per_serving,
original.protein_grams,
original.fat_grams,
original.carb_grams,
'private'
FROM public.recipes AS original
WHERE original.recipe_id = p_original_recipe_id
RETURNING recipe_id INTO new_recipe_id;
-- If the original recipe didn't exist, new_recipe_id will be null.
IF new_recipe_id IS NULL THEN
RETURN;
END IF;
-- 2. Copy all ingredients, tags, and appliances from the original recipe to the new one.
INSERT INTO public.recipe_ingredients (recipe_id, master_item_id, quantity, unit) SELECT new_recipe_id, master_item_id, quantity, unit FROM public.recipe_ingredients WHERE recipe_id = p_original_recipe_id;
INSERT INTO public.recipe_tags (recipe_id, tag_id) SELECT new_recipe_id, tag_id FROM public.recipe_tags WHERE recipe_id = p_original_recipe_id;
INSERT INTO public.recipe_appliances (recipe_id, appliance_id) SELECT new_recipe_id, appliance_id FROM public.recipe_appliances WHERE recipe_id = p_original_recipe_id;
-- 3. Return the newly created recipe record.
RETURN QUERY SELECT * FROM public.recipes WHERE recipe_id = new_recipe_id;
END;
$$;
@@ -1566,4 +1616,3 @@ BEGIN
bp.price_rank = 1;
END;
$$ LANGUAGE plpgsql;


@@ -8,7 +8,23 @@
-- It is idempotent, meaning it can be run multiple times without causing errors.
-- 1. Pre-populate the master grocery items dictionary.
-- This block links generic items to their respective categories.
-- This MUST run after populating categories.
-- Renumbered to 2.
-- 2. Pre-populate the categories table from a predefined list.
-- Renumbered to 1. This MUST run before populating master_grocery_items.
DO $$
BEGIN
INSERT INTO public.categories (name) VALUES
('Fruits & Vegetables'), ('Meat & Seafood'), ('Dairy & Eggs'), ('Bakery & Bread'),
('Pantry & Dry Goods'), ('Beverages'), ('Frozen Foods'), ('Snacks'), ('Household & Cleaning'),
('Personal Care & Health'), ('Baby & Child'), ('Pet Supplies'), ('Deli & Prepared Foods'),
('Canned Goods'), ('Condiments & Spices'), ('Breakfast & Cereal'), ('Organic'),
('International Foods'), ('Other/Miscellaneous')
ON CONFLICT (name) DO NOTHING;
END $$;
-- 2. Pre-populate the master grocery items dictionary.
DO $$
DECLARE
fv_cat_id BIGINT; ms_cat_id BIGINT; de_cat_id BIGINT; bb_cat_id BIGINT; pdg_cat_id BIGINT;
@@ -53,18 +69,6 @@ BEGIN
ON CONFLICT (name) DO NOTHING;
END $$;
-- 2. Pre-populate the categories table from a predefined list.
DO $$
BEGIN
INSERT INTO public.categories (name) VALUES
('Fruits & Vegetables'), ('Meat & Seafood'), ('Dairy & Eggs'), ('Bakery & Bread'),
('Pantry & Dry Goods'), ('Beverages'), ('Frozen Foods'), ('Snacks'), ('Household & Cleaning'),
('Personal Care & Health'), ('Baby & Child'), ('Pet Supplies'), ('Deli & Prepared Foods'),
('Canned Goods'), ('Condiments & Spices'), ('Breakfast & Cereal'), ('Organic'),
('International Foods'), ('Other/Miscellaneous')
ON CONFLICT (name) DO NOTHING;
END $$;
-- 3. Pre-populate the brands and products tables.
-- This block adds common brands and links them to specific products.
DO $$


@@ -162,6 +162,38 @@ CREATE TABLE IF NOT EXISTS public.user_watched_items (
COMMENT ON TABLE public.user_watched_items IS 'A linking table that represents a user''s personal watchlist of grocery items.';
CREATE INDEX IF NOT EXISTS idx_user_watched_items_master_item_id ON public.user_watched_items(master_item_id);
-- 23. Store brand information. (Moved up due to dependency in flyer_items)
CREATE TABLE IF NOT EXISTS public.brands (
brand_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
name TEXT NOT NULL UNIQUE,
logo_url TEXT,
store_id BIGINT REFERENCES public.stores(store_id) ON DELETE SET NULL,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.brands IS 'Stores brand names like "Coca-Cola", "Maple Leaf", or "Kraft".';
COMMENT ON COLUMN public.brands.store_id IS 'If this is a store-specific brand (e.g., President''s Choice), this links to the parent store.';
-- 24. For specific products, linking a master item with a brand and size. (Moved up due to dependency in flyer_items)
CREATE TABLE IF NOT EXISTS public.products (
product_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE CASCADE,
brand_id BIGINT REFERENCES public.brands(brand_id) ON DELETE SET NULL,
name TEXT NOT NULL,
description TEXT,
size TEXT,
upc_code TEXT UNIQUE,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.products IS 'Represents a specific, sellable product, combining a generic item with a brand and size.';
COMMENT ON COLUMN public.products.upc_code IS 'Universal Product Code, if available, for exact product matching.';
COMMENT ON COLUMN public.products.brand_id IS 'Can be null for generic/store-brand items.';
COMMENT ON COLUMN public.products.name IS 'Prime Raised without Antibiotics Chicken Breast.';
COMMENT ON COLUMN public.products.size IS 'e.g., "4L", "500g".';
CREATE INDEX IF NOT EXISTS idx_products_master_item_id ON public.products(master_item_id);
CREATE INDEX IF NOT EXISTS idx_products_brand_id ON public.products(brand_id);
-- 9. The 'flyer_items' table. This stores individual items from flyers.
CREATE TABLE IF NOT EXISTS public.flyer_items (
flyer_item_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
@@ -409,38 +441,6 @@ COMMENT ON TABLE public.unmatched_flyer_items IS 'A queue for reviewing flyer it
CREATE INDEX IF NOT EXISTS idx_unmatched_flyer_items_flyer_item_id ON public.unmatched_flyer_items(flyer_item_id);
CREATE INDEX IF NOT EXISTS idx_unmatched_flyer_items_pending ON public.unmatched_flyer_items (created_at) WHERE status = 'pending';
-- 23. Store brand information.
CREATE TABLE IF NOT EXISTS public.brands (
brand_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
name TEXT NOT NULL UNIQUE,
logo_url TEXT,
store_id BIGINT REFERENCES public.stores(store_id) ON DELETE SET NULL,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.brands IS 'Stores brand names like "Coca-Cola", "Maple Leaf", or "Kraft".';
COMMENT ON COLUMN public.brands.store_id IS 'If this is a store-specific brand (e.g., President''s Choice), this links to the parent store.';
-- 24. For specific products, linking a master item with a brand and size.
CREATE TABLE IF NOT EXISTS public.products (
product_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE CASCADE,
brand_id BIGINT REFERENCES public.brands(brand_id) ON DELETE SET NULL,
name TEXT NOT NULL,
description TEXT,
size TEXT,
upc_code TEXT UNIQUE,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.products IS 'Represents a specific, sellable product, combining a generic item with a brand and size.';
COMMENT ON COLUMN public.products.upc_code IS 'Universal Product Code, if available, for exact product matching.';
COMMENT ON COLUMN public.products.brand_id IS 'Can be null for generic/store-brand items.';
COMMENT ON COLUMN public.products.name IS 'Prime Raised without Antibiotics Chicken Breast.';
COMMENT ON COLUMN public.products.size IS 'e.g., "4L", "500g".';
CREATE INDEX IF NOT EXISTS idx_products_master_item_id ON public.products(master_item_id);
CREATE INDEX IF NOT EXISTS idx_products_brand_id ON public.products(brand_id);
-- 25. Linking table for when one flyer is valid for multiple locations.
CREATE TABLE IF NOT EXISTS public.flyer_locations (
flyer_id BIGINT NOT NULL REFERENCES public.flyers(flyer_id) ON DELETE CASCADE,


@@ -178,6 +178,38 @@ CREATE TABLE IF NOT EXISTS public.user_watched_items (
COMMENT ON TABLE public.user_watched_items IS 'A linking table that represents a user''s personal watchlist of grocery items.';
CREATE INDEX IF NOT EXISTS idx_user_watched_items_master_item_id ON public.user_watched_items(master_item_id);
-- 23. Store brand information. (Moved up due to dependency in flyer_items)
CREATE TABLE IF NOT EXISTS public.brands (
brand_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
name TEXT NOT NULL UNIQUE,
logo_url TEXT,
store_id BIGINT REFERENCES public.stores(store_id) ON DELETE SET NULL,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.brands IS 'Stores brand names like "Coca-Cola", "Maple Leaf", or "Kraft".';
COMMENT ON COLUMN public.brands.store_id IS 'If this is a store-specific brand (e.g., President''s Choice), this links to the parent store.';
-- 24. For specific products, linking a master item with a brand and size. (Moved up due to dependency in flyer_items)
CREATE TABLE IF NOT EXISTS public.products (
product_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE CASCADE,
brand_id BIGINT REFERENCES public.brands(brand_id) ON DELETE SET NULL,
name TEXT NOT NULL,
description TEXT,
size TEXT,
upc_code TEXT UNIQUE,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.products IS 'Represents a specific, sellable product, combining a generic item with a brand and size.';
COMMENT ON COLUMN public.products.upc_code IS 'Universal Product Code, if available, for exact product matching.';
COMMENT ON COLUMN public.products.brand_id IS 'Can be null for generic/store-brand items.';
COMMENT ON COLUMN public.products.name IS 'Prime Raised without Antibiotics Chicken Breast.';
COMMENT ON COLUMN public.products.size IS 'e.g., "4L", "500g".';
CREATE INDEX IF NOT EXISTS idx_products_master_item_id ON public.products(master_item_id);
CREATE INDEX IF NOT EXISTS idx_products_brand_id ON public.products(brand_id);
-- 9. The 'flyer_items' table. This stores individual items from flyers.
CREATE TABLE IF NOT EXISTS public.flyer_items (
flyer_item_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
@@ -424,38 +456,6 @@ COMMENT ON TABLE public.unmatched_flyer_items IS 'A queue for reviewing flyer it
CREATE INDEX IF NOT EXISTS idx_unmatched_flyer_items_flyer_item_id ON public.unmatched_flyer_items(flyer_item_id);
CREATE INDEX IF NOT EXISTS idx_unmatched_flyer_items_pending ON public.unmatched_flyer_items (created_at) WHERE status = 'pending';
-- 23. Store brand information.
CREATE TABLE IF NOT EXISTS public.brands (
brand_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
name TEXT NOT NULL UNIQUE,
logo_url TEXT,
store_id BIGINT REFERENCES public.stores(store_id) ON DELETE SET NULL,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.brands IS 'Stores brand names like "Coca-Cola", "Maple Leaf", or "Kraft".';
COMMENT ON COLUMN public.brands.store_id IS 'If this is a store-specific brand (e.g., President''s Choice), this links to the parent store.';
-- 24. For specific products, linking a master item with a brand and size.
CREATE TABLE IF NOT EXISTS public.products (
product_id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
master_item_id BIGINT NOT NULL REFERENCES public.master_grocery_items(master_grocery_item_id) ON DELETE CASCADE,
brand_id BIGINT REFERENCES public.brands(brand_id) ON DELETE SET NULL,
name TEXT NOT NULL,
description TEXT,
size TEXT,
upc_code TEXT UNIQUE,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
updated_at TIMESTAMPTZ DEFAULT now() NOT NULL
);
COMMENT ON TABLE public.products IS 'Represents a specific, sellable product, combining a generic item with a brand and size.';
COMMENT ON COLUMN public.products.upc_code IS 'Universal Product Code, if available, for exact product matching.';
COMMENT ON COLUMN public.products.brand_id IS 'Can be null for generic/store-brand items.';
COMMENT ON COLUMN public.products.name IS 'Prime Raised without Antibiotics Chicken Breast.';
COMMENT ON COLUMN public.products.size IS 'e.g., "4L", "500g".';
CREATE INDEX IF NOT EXISTS idx_products_master_item_id ON public.products(master_item_id);
CREATE INDEX IF NOT EXISTS idx_products_brand_id ON public.products(brand_id);
-- 25. Linking table for when one flyer is valid for multiple locations.
CREATE TABLE IF NOT EXISTS public.flyer_locations (
flyer_id BIGINT NOT NULL REFERENCES public.flyers(flyer_id) ON DELETE CASCADE,
@@ -2128,11 +2128,61 @@ DROP FUNCTION IF EXISTS public.fork_recipe(UUID, BIGINT);
CREATE OR REPLACE FUNCTION public.fork_recipe(p_user_id UUID, p_original_recipe_id BIGINT)
RETURNS SETOF public.recipes
LANGUAGE sql
LANGUAGE plpgsql
SECURITY INVOKER
AS $$
-- The entire forking logic is now encapsulated in a single, atomic database function.
SELECT * FROM public.fork_recipe(p_user_id, p_original_recipe_id);
DECLARE
new_recipe_id BIGINT;
BEGIN
-- 1. Create a copy of the recipe, linking it to the new user and the original recipe.
INSERT INTO public.recipes (
user_id,
original_recipe_id,
name,
description,
instructions,
prep_time_minutes,
cook_time_minutes,
servings,
photo_url,
calories_per_serving,
protein_grams,
fat_grams,
carb_grams,
status -- Forked recipes should be private by default
)
SELECT
p_user_id,
p_original_recipe_id,
original.name || ' (Fork)', -- Append '(Fork)' to distinguish it
original.description,
original.instructions,
original.prep_time_minutes,
original.cook_time_minutes,
original.servings,
original.photo_url,
original.calories_per_serving,
original.protein_grams,
original.fat_grams,
original.carb_grams,
'private'
FROM public.recipes AS original
WHERE original.recipe_id = p_original_recipe_id
RETURNING recipe_id INTO new_recipe_id;
-- If the original recipe didn't exist, new_recipe_id will be null.
IF new_recipe_id IS NULL THEN
RETURN;
END IF;
-- 2. Copy all ingredients, tags, and appliances from the original recipe to the new one.
INSERT INTO public.recipe_ingredients (recipe_id, master_item_id, quantity, unit) SELECT new_recipe_id, master_item_id, quantity, unit FROM public.recipe_ingredients WHERE recipe_id = p_original_recipe_id;
INSERT INTO public.recipe_tags (recipe_id, tag_id) SELECT new_recipe_id, tag_id FROM public.recipe_tags WHERE recipe_id = p_original_recipe_id;
INSERT INTO public.recipe_appliances (recipe_id, appliance_id) SELECT new_recipe_id, appliance_id FROM public.recipe_appliances WHERE recipe_id = p_original_recipe_id;
-- 3. Return the newly created recipe record.
RETURN QUERY SELECT * FROM public.recipes WHERE recipe_id = new_recipe_id;
END;
$$;

View File

@@ -16,4 +16,4 @@ const config = {
},
};
export default config;
export default config;

View File

@@ -6,10 +6,11 @@
* DO NOT run this on a production database.
*/
import { Pool } from 'pg';
import { Pool, PoolClient } from 'pg';
import fs from 'node:fs/promises';
import path from 'node:path';
import bcrypt from 'bcrypt';
import { logger } from '../services/logger.server';
import { CATEGORIES } from '../types';
const pool = new Pool({
user: process.env.DB_USER,
@@ -20,81 +21,55 @@ const pool = new Pool({
});
async function main() {
// Declare client outside the try block so it's accessible in the finally block.
let client;
let client: PoolClient | undefined;
try {
client = await pool.connect();
logger.info('Connected to the database for seeding.');
await client.query('BEGIN');
// 1. Clean the database
logger.info('--- Wiping existing data... ---');
// Using TRUNCATE ... RESTART IDENTITY CASCADE is a powerful way to clean all tables
// and reset auto-incrementing keys, while respecting foreign key relationships.
const tablesRes = await client.query(`
SELECT tablename
FROM pg_tables
WHERE schemaname = 'public'
-- Exclude PostGIS system tables from truncation to avoid permission errors.
AND tablename NOT IN ('spatial_ref_sys', 'geometry_columns')
`);
const tables = tablesRes.rows.map((row) => `"${row.tablename}"`).join(', ');
if (tables) {
await client.query(`TRUNCATE ${tables} RESTART IDENTITY CASCADE`);
logger.info('All tables in public schema have been truncated.');
}
// 1. Clean the database by dropping and recreating the schema
logger.info('--- Wiping and rebuilding schema... ---');
const dropScriptPath = path.resolve(process.cwd(), 'sql/drop_tables.sql');
const dropSql = await fs.readFile(dropScriptPath, 'utf-8');
await client.query(dropSql);
logger.info('All tables dropped successfully.');
// 2. Seed Categories
logger.info('--- Seeding Categories... ---');
const categoryQuery = `INSERT INTO public.categories (name) VALUES ${CATEGORIES.map((_, i) => `($${i + 1})`).join(', ')} RETURNING category_id, name`;
const seededCategories = (
await client.query<{ category_id: number; name: string }>(categoryQuery, CATEGORIES)
).rows;
const categoryMap = new Map(seededCategories.map((c) => [c.name, c.category_id]));
logger.info(`Seeded ${seededCategories.length} categories.`);
const schemaScriptPath = path.resolve(process.cwd(), 'sql/master_schema_rollup.sql');
const schemaSql = await fs.readFile(schemaScriptPath, 'utf-8');
await client.query(schemaSql);
logger.info(
'Schema rebuilt and static data seeded successfully from master_schema_rollup.sql.',
);
// 3. Seed Stores
// 2. Seed Additional Stores (if any beyond what's in the rollup)
logger.info('--- Seeding Stores... ---');
const stores = ['Safeway', 'No Frills', 'Costco', 'Superstore'];
const storeQuery = `INSERT INTO public.stores (name) VALUES ${stores.map((_, i) => `($${i + 1})`).join(', ')} RETURNING store_id, name`;
const seededStores = (
await client.query<{ store_id: number; name: string }>(storeQuery, stores)
const storeQuery = `INSERT INTO public.stores (name) VALUES ${stores.map((_, i) => `($${i + 1})`).join(', ')} ON CONFLICT (name) DO NOTHING RETURNING store_id, name`;
await client.query<{ store_id: number; name: string }>(storeQuery, stores);
const allStores = (
await client.query<{ store_id: number; name: string }>(
'SELECT store_id, name FROM public.stores',
)
).rows;
const storeMap = new Map(seededStores.map((s) => [s.name, s.store_id]));
logger.info(`Seeded ${seededStores.length} stores.`);
// 4. Seed Master Grocery Items
logger.info('--- Seeding Master Grocery Items... ---');
const masterItems = [
{ name: 'Chicken Breast, Boneless Skinless', category: 'Meat & Seafood' },
{ name: 'Ground Beef, Lean', category: 'Meat & Seafood' },
{ name: 'Avocado', category: 'Fruits & Vegetables' },
{ name: 'Bananas', category: 'Fruits & Vegetables' },
{ name: 'Broccoli', category: 'Fruits & Vegetables' },
{ name: 'Cheddar Cheese, Block', category: 'Dairy & Eggs' },
{ name: 'Milk, 2%', category: 'Dairy & Eggs' },
{ name: 'Eggs, Large', category: 'Dairy & Eggs' },
{ name: 'Whole Wheat Bread', category: 'Bakery & Bread' },
{ name: 'Pasta, Spaghetti', category: 'Pantry & Dry Goods' },
{ name: 'Canned Tomatoes, Diced', category: 'Canned Goods' },
{ name: 'Coca-Cola, 12-pack', category: 'Beverages' },
{ name: 'Frozen Pizza', category: 'Frozen Foods' },
{ name: 'Paper Towels', category: 'Household & Cleaning' },
];
const masterItemValues = masterItems
.map((item) => `('${item.name.replace(/'/g, "''")}', ${categoryMap.get(item.category)})`)
.join(', ');
const masterItemQuery = `INSERT INTO public.master_grocery_items (name, category_id) VALUES ${masterItemValues} RETURNING master_grocery_item_id, name`;
const seededMasterItems = (
await client.query<{ master_grocery_item_id: number; name: string }>(masterItemQuery)
).rows;
const masterItemMap = new Map(
seededMasterItems.map((item) => [item.name, item.master_grocery_item_id]),
const storeMap = new Map(
allStores.map((s: { name: string; store_id: number }) => [s.name, s.store_id]),
);
logger.info(`Seeded ${seededMasterItems.length} master grocery items.`);
logger.info(`Seeded/verified ${allStores.length} total stores.`);
// 5. Seed Users & Profiles
// Fetch maps for items seeded by the master rollup script
const masterItemMap = new Map(
(
await client.query<{ master_grocery_item_id: number; name: string }>(
'SELECT master_grocery_item_id, name FROM public.master_grocery_items',
)
).rows.map((item: { name: string; master_grocery_item_id: number }) => [
item.name,
item.master_grocery_item_id,
]),
);
// 3. Seed Users & Profiles
logger.info('--- Seeding Users & Profiles... ---');
const saltRounds = 10;
const adminPassHash = await bcrypt.hash('adminpass', saltRounds);
@@ -126,7 +101,7 @@ async function main() {
const userId = userRes.rows[0].user_id;
logger.info('Seeded regular user (user@example.com / userpass)');
// 6. Seed a Flyer
// 4. Seed a Flyer
logger.info('--- Seeding a Sample Flyer... ---');
const today = new Date();
const validFrom = new Date(today);
@@ -146,29 +121,29 @@ async function main() {
const flyerId = flyerRes.rows[0].flyer_id;
logger.info(`Seeded flyer for Safeway (ID: ${flyerId}).`);
// 7. Seed Flyer Items
// 5. Seed Flyer Items
logger.info('--- Seeding Flyer Items... ---');
const flyerItems = [
{
name: 'Chicken Breast, Boneless Skinless',
name: 'chicken breast',
price_display: '$3.99 /lb',
price_in_cents: 399,
quantity: 'per lb',
master_item_id: masterItemMap.get('Chicken Breast, Boneless Skinless'),
master_item_id: masterItemMap.get('chicken breast'),
},
{
name: 'Avocado',
name: 'avocados',
price_display: '2 for $5.00',
price_in_cents: 250,
quantity: 'each',
master_item_id: masterItemMap.get('Avocado'),
master_item_id: masterItemMap.get('avocados'),
},
{
name: 'Coca-Cola 12-pack',
name: 'soda',
price_display: '$6.99',
price_in_cents: 699,
quantity: '12x355ml',
master_item_id: masterItemMap.get('Coca-Cola, 12-pack'),
master_item_id: masterItemMap.get('soda'),
},
{
name: 'Unmatched Sample Item',
@@ -194,12 +169,12 @@ async function main() {
}
logger.info(`Seeded ${flyerItems.length} items for the Safeway flyer.`);
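Each seeded item pairs a human-readable `price_display` with a precomputed `price_in_cents`. A small helper in the same spirit (illustrative only — `priceDisplayToCents` is not a function in this repo) shows how displays like `"2 for $5.00"` reduce to a unit price in cents:

```typescript
// Hypothetical helper: derive a unit price in cents from a flyer display
// string. Handles "$3.99 /lb", "$6.99", and "N for $X.XX" multi-buy formats.
function priceDisplayToCents(display: string): number | null {
  const priceMatch = display.match(/\$(\d+(?:\.\d{1,2})?)/);
  if (!priceMatch) return null; // no dollar amount present
  const totalCents = Math.round(parseFloat(priceMatch[1]) * 100);
  // "2 for $5.00" -> divide the total across the multi-buy quantity.
  const multiBuy = display.match(/^(\d+)\s+for\s+/i);
  return multiBuy ? Math.round(totalCents / parseInt(multiBuy[1], 10)) : totalCents;
}

console.log(priceDisplayToCents('$3.99 /lb')); // 399
console.log(priceDisplayToCents('2 for $5.00')); // 250
```

This matches the seeded values above (399, 250, 699), though the real extraction is done by the AI pipeline rather than a regex.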
// 8. Seed Watched Items for the user
// 6. Seed Watched Items for the user
logger.info('--- Seeding Watched Items... ---');
const watchedItemIds = [
masterItemMap.get('Chicken Breast, Boneless Skinless'),
masterItemMap.get('Avocado'),
masterItemMap.get('Ground Beef, Lean'),
masterItemMap.get('chicken breast'),
masterItemMap.get('avocados'),
masterItemMap.get('ground beef'),
];
for (const itemId of watchedItemIds) {
if (itemId) {
@@ -211,7 +186,7 @@ async function main() {
}
logger.info(`Seeded ${watchedItemIds.length} watched items for Test User.`);
// 9. Seed a Shopping List
// 7. Seed a Shopping List
logger.info('--- Seeding a Shopping List... ---');
const listRes = await client.query<{ shopping_list_id: number }>(
'INSERT INTO public.shopping_lists (user_id, name) VALUES ($1, $2) RETURNING shopping_list_id',
@@ -220,8 +195,8 @@ async function main() {
const listId = listRes.rows[0].shopping_list_id;
const shoppingListItems = [
{ master_item_id: masterItemMap.get('Milk, 2%'), quantity: 1 },
{ master_item_id: masterItemMap.get('Eggs, Large'), quantity: 1 },
{ master_item_id: masterItemMap.get('milk'), quantity: 1 },
{ master_item_id: masterItemMap.get('eggs'), quantity: 1 },
{ custom_item_name: 'Specialty Hot Sauce', quantity: 1 },
];
@@ -235,75 +210,6 @@ async function main() {
`Seeded shopping list "Weekly Groceries" with ${shoppingListItems.length} items for Test User.`,
);
// 10. Seed Brands
logger.info('--- Seeding Brands... ---');
const brands = [
'Coca-Cola',
'Kraft',
'Maple Leaf',
"Dempster's",
'No Name',
"President's Choice",
];
const brandQuery = `INSERT INTO public.brands (name) VALUES ${brands.map((_, i) => `($${i + 1})`).join(', ')} ON CONFLICT (name) DO NOTHING`;
await client.query(brandQuery, brands);
logger.info(`Seeded ${brands.length} brands.`);
// Link store-specific brands
const loblawsId = storeMap.get('Loblaws');
if (loblawsId) {
await client.query('UPDATE public.brands SET store_id = $1 WHERE name = $2 OR name = $3', [
loblawsId,
'No Name',
"President's Choice",
]);
logger.info('Linked store brands to Loblaws.');
}
// 11. Seed Recipes
logger.info('--- Seeding Recipes... ---');
const recipes = [
{
name: 'Simple Chicken and Rice',
description: 'A quick and healthy weeknight meal.',
instructions: '1. Cook rice. 2. Cook chicken. 3. Combine.',
prep: 10,
cook: 20,
servings: 4,
},
{
name: 'Classic Spaghetti Bolognese',
description: 'A rich and hearty meat sauce.',
instructions: '1. Brown beef. 2. Add sauce. 3. Simmer.',
prep: 15,
cook: 45,
servings: 6,
},
{
name: 'Vegetable Stir-fry',
description: 'A fast and flavorful vegetarian meal.',
instructions: '1. Chop veggies. 2. Stir-fry. 3. Add sauce.',
prep: 10,
cook: 10,
servings: 3,
},
];
for (const recipe of recipes) {
await client.query(
`INSERT INTO public.recipes (name, description, instructions, prep_time_minutes, cook_time_minutes, servings, status)
VALUES ($1, $2, $3, $4, $5, $6, 'public') ON CONFLICT (name) WHERE user_id IS NULL DO NOTHING`,
[
recipe.name,
recipe.description,
recipe.instructions,
recipe.prep,
recipe.cook,
recipe.servings,
],
);
}
logger.info(`Seeded ${recipes.length} recipes.`);
// --- SEED SCRIPT DEBUG LOGGING ---
// Corrected the query to be unambiguous by specifying the table alias for each column.
// `id` and `email` come from the `users` table (u), and `role` comes from the `profiles` table (p).
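The seed script builds multi-row parameterized INSERTs by generating `($1), ($2), …` placeholder lists inline with `map`/`join`. That pattern can be factored into a small helper (illustrative only — `buildValuesPlaceholders` is not part of the repo):

```typescript
// Hypothetical helper: build a "($1, $2), ($3, $4), ..." placeholder list
// for a multi-row parameterized INSERT, as done inline in the seed script.
function buildValuesPlaceholders(rowCount: number, colsPerRow = 1): string {
  const rows: string[] = [];
  let param = 1; // pg placeholders are 1-indexed
  for (let i = 0; i < rowCount; i++) {
    const cols = Array.from({ length: colsPerRow }, () => `$${param++}`);
    rows.push(`(${cols.join(', ')})`);
  }
  return rows.join(', ');
}

console.log(buildValuesPlaceholders(4)); // ($1), ($2), ($3), ($4)
console.log(buildValuesPlaceholders(2, 3)); // ($1, $2, $3), ($4, $5, $6)
```

With `pg`, the flattened row values are then passed as the second argument to `client.query`, keeping the statement fully parameterized.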

View File

@@ -4,7 +4,7 @@
This single directive replaces @tailwind base, components, and utilities.
It is the new entry point for all of Tailwind's generated CSS.
*/
@import "tailwindcss";
@import 'tailwindcss';
/*
This is the new v4 directive that tells the @tailwindcss/postcss plugin
@@ -12,4 +12,3 @@
Since tailwind.config.js is in the root and this is in src/, the path is '../tailwind.config.js'.
*/
@config '../tailwind.config.js';

View File

@@ -8,17 +8,16 @@ import './index.css';
const rootElement = document.getElementById('root');
if (!rootElement) {
throw new Error("Could not find root element to mount to");
throw new Error('Could not find root element to mount to');
}
const root = ReactDOM.createRoot(rootElement);
root.render(
<React.StrictMode>
<BrowserRouter>
<BrowserRouter>
<AppProviders>
<App />
</AppProviders>
</BrowserRouter>
</React.StrictMode>
</BrowserRouter>
</React.StrictMode>,
);

View File

@@ -5,4 +5,4 @@ import toast from 'react-hot-toast';
// This intermediate file allows us to mock 'src/lib/toast' reliably in tests
// without wrestling with the internal structure of the 'react-hot-toast' package.
export * from 'react-hot-toast';
export default toast;
export default toast;

View File

@@ -1,10 +1,16 @@
// src/tests/integration/admin.integration.test.ts
import { describe, it, expect, beforeAll, beforeEach, afterAll } from 'vitest';
import * as apiClient from '../../services/apiClient';
import supertest from 'supertest';
import app from '../../../server';
import { getPool } from '../../services/db/connection.db';
import type { UserProfile } from '../../types';
import { createAndLoginUser } from '../utils/testHelpers';
/**
* @vitest-environment node
*/
const request = supertest(app);
describe('Admin API Routes Integration Tests', () => {
let adminToken: string;
let adminUser: UserProfile;
@@ -42,8 +48,10 @@ describe('Admin API Routes Integration Tests', () => {
describe('GET /api/admin/stats', () => {
it('should allow an admin to fetch application stats', async () => {
const response = await apiClient.getApplicationStats(adminToken);
const stats = await response.json();
const response = await request
.get('/api/admin/stats')
.set('Authorization', `Bearer ${adminToken}`);
const stats = response.body;
expect(stats).toBeDefined();
expect(stats).toHaveProperty('flyerCount');
expect(stats).toHaveProperty('userCount');
@@ -51,18 +59,21 @@ describe('Admin API Routes Integration Tests', () => {
});
it('should forbid a regular user from fetching application stats', async () => {
const response = await apiClient.getApplicationStats(regularUserToken);
expect(response.ok).toBe(false);
const response = await request
.get('/api/admin/stats')
.set('Authorization', `Bearer ${regularUserToken}`);
expect(response.status).toBe(403);
const errorData = await response.json();
const errorData = response.body;
expect(errorData.message).toBe('Forbidden: Administrator access required.');
});
});
describe('GET /api/admin/stats/daily', () => {
it('should allow an admin to fetch daily stats', async () => {
const response = await apiClient.getDailyStats(adminToken);
const dailyStats = await response.json();
const response = await request
.get('/api/admin/stats/daily')
.set('Authorization', `Bearer ${adminToken}`);
const dailyStats = response.body;
expect(dailyStats).toBeDefined();
expect(Array.isArray(dailyStats)).toBe(true);
// We just created users in beforeAll, so we should have data
@@ -73,10 +84,11 @@ describe('Admin API Routes Integration Tests', () => {
});
it('should forbid a regular user from fetching daily stats', async () => {
const response = await apiClient.getDailyStats(regularUserToken);
expect(response.ok).toBe(false);
const response = await request
.get('/api/admin/stats/daily')
.set('Authorization', `Bearer ${regularUserToken}`);
expect(response.status).toBe(403);
const errorData = await response.json();
const errorData = response.body;
expect(errorData.message).toBe('Forbidden: Administrator access required.');
});
});
@@ -85,25 +97,30 @@ describe('Admin API Routes Integration Tests', () => {
it('should allow an admin to fetch suggested corrections', async () => {
// This test just verifies access and correct response shape.
// More detailed tests would require seeding corrections.
const response = await apiClient.getSuggestedCorrections(adminToken);
const corrections = await response.json();
const response = await request
.get('/api/admin/corrections')
.set('Authorization', `Bearer ${adminToken}`);
const corrections = response.body;
expect(corrections).toBeDefined();
expect(Array.isArray(corrections)).toBe(true);
});
it('should forbid a regular user from fetching suggested corrections', async () => {
const response = await apiClient.getSuggestedCorrections(regularUserToken);
expect(response.ok).toBe(false);
const response = await request
.get('/api/admin/corrections')
.set('Authorization', `Bearer ${regularUserToken}`);
expect(response.status).toBe(403);
const errorData = await response.json();
const errorData = response.body;
expect(errorData.message).toBe('Forbidden: Administrator access required.');
});
});
describe('GET /api/admin/brands', () => {
it('should allow an admin to fetch all brands', async () => {
const response = await apiClient.fetchAllBrands(adminToken);
const brands = await response.json();
const response = await request
.get('/api/admin/brands')
.set('Authorization', `Bearer ${adminToken}`);
const brands = response.body;
expect(brands).toBeDefined();
expect(Array.isArray(brands)).toBe(true);
// Even if no brands exist, it should return an array.
@@ -112,10 +129,11 @@ describe('Admin API Routes Integration Tests', () => {
});
it('should forbid a regular user from fetching all brands', async () => {
const response = await apiClient.fetchAllBrands(regularUserToken);
expect(response.ok).toBe(false);
const response = await request
.get('/api/admin/brands')
.set('Authorization', `Bearer ${regularUserToken}`);
expect(response.status).toBe(403);
const errorData = await response.json();
const errorData = response.body;
expect(errorData.message).toBe('Forbidden: Administrator access required.');
});
});
@@ -170,8 +188,10 @@ describe('Admin API Routes Integration Tests', () => {
it('should allow an admin to approve a correction', async () => {
// Act: Approve the correction.
const response = await apiClient.approveCorrection(testCorrectionId, adminToken);
expect(response.ok).toBe(true);
const response = await request
.post(`/api/admin/corrections/${testCorrectionId}/approve`)
.set('Authorization', `Bearer ${adminToken}`);
expect(response.status).toBe(200);
// Assert: Verify the flyer item's price was updated and the correction status changed.
const { rows: itemRows } = await getPool().query(
@@ -189,8 +209,10 @@ describe('Admin API Routes Integration Tests', () => {
it('should allow an admin to reject a correction', async () => {
// Act: Reject the correction.
const response = await apiClient.rejectCorrection(testCorrectionId, adminToken);
expect(response.ok).toBe(true);
const response = await request
.post(`/api/admin/corrections/${testCorrectionId}/reject`)
.set('Authorization', `Bearer ${adminToken}`);
expect(response.status).toBe(200);
// Assert: Verify the correction status changed.
const { rows: correctionRows } = await getPool().query(
@@ -202,12 +224,11 @@ describe('Admin API Routes Integration Tests', () => {
it('should allow an admin to update a correction', async () => {
// Act: Update the suggested value of the correction.
const response = await apiClient.updateSuggestedCorrection(
testCorrectionId,
'300',
adminToken,
);
const updatedCorrection = await response.json();
const response = await request
.put(`/api/admin/corrections/${testCorrectionId}`)
.set('Authorization', `Bearer ${adminToken}`)
.send({ suggested_value: '300' });
const updatedCorrection = response.body;
// Assert: Verify the API response and the database state.
expect(updatedCorrection.suggested_value).toBe('300');
@@ -227,8 +248,11 @@ describe('Admin API Routes Integration Tests', () => {
const recipeId = recipeRes.rows[0].recipe_id;
// Act: Update the status to 'public'.
const response = await apiClient.updateRecipeStatus(recipeId, 'public', adminToken);
expect(response.ok).toBe(true);
const response = await request
.put(`/api/admin/recipes/${recipeId}/status`)
.set('Authorization', `Bearer ${adminToken}`)
.send({ status: 'public' });
expect(response.status).toBe(200);
// Assert: Verify the status was updated in the database.
const { rows: updatedRecipeRows } = await getPool().query(

View File

@@ -1,6 +1,7 @@
// src/tests/integration/ai.integration.test.ts
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import * as aiApiClient from '../../services/aiApiClient';
import supertest from 'supertest';
import app from '../../../server';
import fs from 'node:fs/promises';
import path from 'path';
import { createAndLoginUser } from '../utils/testHelpers';
@@ -9,6 +10,8 @@ import { createAndLoginUser } from '../utils/testHelpers';
* @vitest-environment node
*/
const request = supertest(app);
interface TestGeolocationCoordinates {
latitude: number;
longitude: number;
@@ -44,46 +47,63 @@ describe('AI API Routes Integration Tests', () => {
});
it('POST /api/ai/check-flyer should return a boolean', async () => {
const mockImageFile = new File(['content'], 'test.jpg', { type: 'image/jpeg' });
const response = await aiApiClient.isImageAFlyer(mockImageFile, authToken);
const result = await response.json();
const response = await request
.post('/api/ai/check-flyer')
.set('Authorization', `Bearer ${authToken}`)
.attach('image', Buffer.from('content'), 'test.jpg');
const result = response.body;
expect(response.status).toBe(200);
// The backend is stubbed to always return true for this check
expect(result.is_flyer).toBe(true);
});
it('POST /api/ai/extract-address should return a stubbed address', async () => {
const mockImageFile = new File(['content'], 'test.jpg', { type: 'image/jpeg' });
const response = await aiApiClient.extractAddressFromImage(mockImageFile, authToken);
const result = await response.json();
const response = await request
.post('/api/ai/extract-address')
.set('Authorization', `Bearer ${authToken}`)
.attach('image', Buffer.from('content'), 'test.jpg');
const result = response.body;
expect(response.status).toBe(200);
expect(result.address).toBe('not identified');
});
it('POST /api/ai/extract-logo should return a stubbed response', async () => {
const mockImageFile = new File(['content'], 'test.jpg', { type: 'image/jpeg' });
const response = await aiApiClient.extractLogoFromImage([mockImageFile], authToken);
const result = await response.json();
const response = await request
.post('/api/ai/extract-logo')
.set('Authorization', `Bearer ${authToken}`)
.attach('images', Buffer.from('content'), 'test.jpg');
const result = response.body;
expect(response.status).toBe(200);
expect(result).toEqual({ store_logo_base_64: null });
});
it('POST /api/ai/quick-insights should return a stubbed insight', async () => {
const response = await aiApiClient.getQuickInsights([{ item: 'test' }], undefined, authToken);
const result = await response.json();
const response = await request
.post('/api/ai/quick-insights')
.set('Authorization', `Bearer ${authToken}`)
.send({ items: [{ item: 'test' }] });
const result = response.body;
expect(response.status).toBe(200);
expect(result.text).toBe('This is a server-generated quick insight: buy the cheap stuff!');
});
it('POST /api/ai/deep-dive should return a stubbed analysis', async () => {
const response = await aiApiClient.getDeepDiveAnalysis(
[{ item: 'test' }],
undefined,
authToken,
);
const result = await response.json();
const response = await request
.post('/api/ai/deep-dive')
.set('Authorization', `Bearer ${authToken}`)
.send({ items: [{ item: 'test' }] });
const result = response.body;
expect(response.status).toBe(200);
expect(result.text).toBe('This is a server-generated deep dive analysis. It is very detailed.');
});
it('POST /api/ai/search-web should return a stubbed search result', async () => {
const response = await aiApiClient.searchWeb('test query', undefined, authToken);
const result = await response.json();
const response = await request
.post('/api/ai/search-web')
.set('Authorization', `Bearer ${authToken}`)
.send({ query: 'test query' });
const result = response.body;
expect(response.status).toBe(200);
expect(result).toEqual({ text: 'The web says this is good.', sources: [] });
});
@@ -116,36 +136,32 @@ describe('AI API Routes Integration Tests', () => {
created_at: new Date().toISOString(),
updated_at: new Date().toISOString(),
};
const response = await aiApiClient.planTripWithMaps(
[],
mockStore,
mockLocation,
undefined,
authToken,
);
const response = await request
.post('/api/ai/plan-trip')
.set('Authorization', `Bearer ${authToken}`)
.send({ items: [], store: mockStore, userLocation: mockLocation });
// The service for this endpoint is disabled and throws an error, which results in a 500.
expect(response.ok).toBe(false);
expect(response.status).toBe(500);
const errorResult = await response.json();
const errorResult = response.body;
expect(errorResult.message).toContain('planTripWithMaps');
});
it('POST /api/ai/generate-image should reject because it is not implemented', async () => {
// The backend for this is not stubbed and will throw an error.
// This test confirms that the endpoint is protected and responds as expected to a failure.
const response = await aiApiClient.generateImageFromText('a test prompt', undefined, authToken);
expect(response.ok).toBe(false);
const response = await request
.post('/api/ai/generate-image')
.set('Authorization', `Bearer ${authToken}`)
.send({ prompt: 'a test prompt' });
expect(response.status).toBe(501);
});
it('POST /api/ai/generate-speech should reject because it is not implemented', async () => {
// The backend for this is not stubbed and will throw an error.
const response = await aiApiClient.generateSpeechFromText(
'a test prompt',
undefined,
authToken,
);
expect(response.ok).toBe(false);
const response = await request
.post('/api/ai/generate-speech')
.set('Authorization', `Bearer ${authToken}`)
.send({ text: 'a test prompt' });
expect(response.status).toBe(501);
});
});

View File

@@ -1,6 +1,7 @@
// src/tests/integration/auth.integration.test.ts
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import { loginUser } from '../../services/apiClient';
import supertest from 'supertest';
import app from '../../../server';
import { getPool } from '../../services/db/connection.db';
import { createAndLoginUser, TEST_PASSWORD } from '../utils/testHelpers';
import type { UserProfile } from '../../types';
@@ -9,6 +10,8 @@ import type { UserProfile } from '../../types';
* @vitest-environment node
*/
const request = supertest(app);
/**
* These are integration tests that verify the authentication flow against a running backend server.
* Make sure your Express server is running before executing these tests.
@@ -16,30 +19,6 @@ import type { UserProfile } from '../../types';
* To run only these tests: `vitest run src/tests/auth.integration.test.ts`
*/
describe('Authentication API Integration', () => {
// --- START DEBUG LOGGING ---
// Query the DB from within the test file to see its state.
getPool()
.query(
'SELECT u.user_id, u.email, p.role FROM public.users u JOIN public.profiles p ON u.user_id = p.user_id',
)
.then((res) => {
console.log('\n--- [auth.integration.test.ts] Users found in DB from TEST perspective: ---');
console.table(res.rows);
console.log('--------------------------------------------------------------------------\n');
})
.catch((err) => console.error('--- [auth.integration.test.ts] DB QUERY FAILED ---', err));
// --- END DEBUG LOGGING ---
// --- START DEBUG LOGGING ---
// Log the database connection details as seen by an individual TEST FILE.
console.log('\n\n--- [AUTH.INTEGRATION.TEST LOG] DATABASE CONNECTION ---');
console.log(` Host: ${process.env.DB_HOST}`);
console.log(` Port: ${process.env.DB_PORT}`);
console.log(` User: ${process.env.DB_USER}`);
console.log(` Database: ${process.env.DB_NAME}`);
console.log('-----------------------------------------------------\n');
// --- END DEBUG LOGGING ---
let testUserEmail: string;
let testUser: UserProfile;
@@ -57,11 +36,14 @@ describe('Authentication API Integration', () => {
// This test migrates the logic from the old DevTestRunner.tsx component.
it('should successfully log in a registered user', async () => {
// The `rememberMe` parameter is required. For a test, `false` is a safe default.
const response = await loginUser(testUserEmail, TEST_PASSWORD, false);
const data = await response.json();
const response = await request
.post('/api/auth/login')
.send({ email: testUserEmail, password: TEST_PASSWORD, rememberMe: false });
const data = response.body;
// Assert that the API returns the expected structure
expect(data).toBeDefined();
expect(response.status).toBe(200);
expect(data.userprofile).toBeDefined();
expect(data.userprofile.user.email).toBe(testUserEmail);
expect(data.userprofile.user.user_id).toBeTypeOf('string');
@@ -74,9 +56,11 @@ describe('Authentication API Integration', () => {
const wrongPassword = 'wrongpassword';
// Supertest resolves with the full response object; we check its status code.
const response = await loginUser(adminEmail, wrongPassword, false);
expect(response.ok).toBe(false);
const errorData = await response.json();
const response = await request
.post('/api/auth/login')
.send({ email: adminEmail, password: wrongPassword, rememberMe: false });
expect(response.status).toBe(401);
const errorData = response.body;
expect(errorData.message).toBe('Incorrect email or password.');
});
@@ -85,9 +69,11 @@ describe('Authentication API Integration', () => {
const anyPassword = 'any-password';
// Supertest resolves with the full response object; we check its status code.
const response = await loginUser(nonExistentEmail, anyPassword, false);
expect(response.ok).toBe(false);
const errorData = await response.json();
const response = await request
.post('/api/auth/login')
.send({ email: nonExistentEmail, password: anyPassword, rememberMe: false });
expect(response.status).toBe(401);
const errorData = response.body;
// Security best practice: the error message should be identical for wrong password and wrong email
// to prevent user enumeration attacks.
expect(errorData.message).toBe('Incorrect email or password.');
@@ -96,24 +82,21 @@ describe('Authentication API Integration', () => {
it('should successfully refresh an access token using a refresh token cookie', async () => {
// Arrange: Log in to get a fresh, valid refresh token cookie for this specific test.
// This ensures the test is self-contained and not affected by other tests.
const loginResponse = await loginUser(testUserEmail, TEST_PASSWORD, true);
const setCookieHeader = loginResponse.headers.get('set-cookie');
const refreshTokenCookie = setCookieHeader?.split(';')[0];
const loginResponse = await request
.post('/api/auth/login')
.send({ email: testUserEmail, password: TEST_PASSWORD, rememberMe: true });
const refreshTokenCookie = loginResponse.headers['set-cookie'][0].split(';')[0];
expect(refreshTokenCookie).toBeDefined();
// Act: Make a request to the refresh-token endpoint, including the cookie.
const apiUrl = process.env.VITE_API_BASE_URL || 'http://localhost:3001/api';
const response = await fetch(`${apiUrl}/auth/refresh-token`, {
method: 'POST',
headers: {
Cookie: refreshTokenCookie!,
},
});
const response = await request
.post('/api/auth/refresh-token')
.set('Cookie', refreshTokenCookie!);
// Assert: Check for a successful response and a new access token.
expect(response.ok).toBe(true);
const data = await response.json();
expect(response.status).toBe(200);
const data = response.body;
expect(data.token).toBeTypeOf('string');
});
@@ -122,40 +105,30 @@ describe('Authentication API Integration', () => {
const invalidRefreshTokenCookie = 'refreshToken=this-is-not-a-valid-token';
// Act: Make a request to the refresh-token endpoint with the invalid cookie.
const apiUrl = process.env.VITE_API_BASE_URL || 'http://localhost:3001/api';
const response = await fetch(`${apiUrl}/auth/refresh-token`, {
method: 'POST',
headers: {
Cookie: invalidRefreshTokenCookie,
},
});
const response = await request
.post('/api/auth/refresh-token')
.set('Cookie', invalidRefreshTokenCookie);
// Assert: Check for a 403 Forbidden response.
expect(response.ok).toBe(false);
expect(response.status).toBe(403);
const data = await response.json();
const data = response.body;
expect(data.message).toBe('Invalid or expired refresh token.');
});
it('should successfully log out and clear the refresh token cookie', async () => {
// Arrange: Log in to get a valid refresh token cookie.
const loginResponse = await loginUser(testUserEmail, TEST_PASSWORD, true);
const setCookieHeader = loginResponse.headers.get('set-cookie');
const refreshTokenCookie = setCookieHeader?.split(';')[0];
const loginResponse = await request
.post('/api/auth/login')
.send({ email: testUserEmail, password: TEST_PASSWORD, rememberMe: true });
const refreshTokenCookie = loginResponse.headers['set-cookie'][0].split(';')[0];
expect(refreshTokenCookie).toBeDefined();
// Act: Make a request to the new logout endpoint, including the cookie.
const apiUrl = process.env.VITE_API_BASE_URL || 'http://localhost:3001/api';
const response = await fetch(`${apiUrl}/auth/logout`, {
method: 'POST',
headers: {
Cookie: refreshTokenCookie!,
},
});
const response = await request.post('/api/auth/logout').set('Cookie', refreshTokenCookie!);
// Assert: Check for a successful response and a cookie-clearing header.
expect(response.ok).toBe(true);
const logoutSetCookieHeader = response.headers.get('set-cookie');
expect(response.status).toBe(200);
const logoutSetCookieHeader = response.headers['set-cookie'][0];
expect(logoutSetCookieHeader).toContain('refreshToken=;');
expect(logoutSetCookieHeader).toContain('Max-Age=0');
});
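The cookie handling above repeats the same expression: supertest exposes `Set-Cookie` as a string array, and the `name=value` pair is everything before the first `;`. A small extraction helper (hypothetical, not part of this codebase) captures that pattern once:

```typescript
// Pull the "name=value" pair out of a Set-Cookie header array as returned
// by supertest (response.headers['set-cookie']). Attributes such as
// HttpOnly and Max-Age follow the first ';' and are dropped.
function extractCookiePair(setCookie: string[] | undefined): string | undefined {
  if (!setCookie || setCookie.length === 0) return undefined;
  return setCookie[0].split(';')[0];
}
```

With such a helper, `const refreshTokenCookie = extractCookiePair(loginResponse.headers['set-cookie'])` would replace the repeated `[0].split(';')[0]` chains in the login, refresh, and logout tests.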


@@ -1,8 +1,9 @@
// src/tests/integration/flyer-processing.integration.test.ts
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import supertest from 'supertest';
import app from '../../../server';
import fs from 'node:fs/promises';
import path from 'path';
import * as aiApiClient from '../../services/aiApiClient';
import * as db from '../../services/db/index.db';
import { getPool } from '../../services/db/connection.db';
import { generateFileChecksum } from '../../utils/checksum';
@@ -14,6 +15,8 @@ import { createAndLoginUser } from '../utils/testHelpers';
* @vitest-environment node
*/
const request = supertest(app);
describe('Flyer Processing Background Job Integration Test', () => {
const createdUserIds: string[] = [];
const createdFlyerIds: number[] = [];
@@ -68,8 +71,15 @@ describe('Flyer Processing Background Job Integration Test', () => {
const checksum = await generateFileChecksum(mockImageFile);
// Act 1: Upload the file to start the background job.
const uploadResponse = await aiApiClient.uploadAndProcessFlyer(mockImageFile, checksum, token);
const { jobId } = await uploadResponse.json();
const uploadReq = request
.post('/api/ai/upload-and-process')
.field('checksum', checksum)
.attach('flyerFile', uniqueContent, uniqueFileName);
if (token) {
uploadReq.set('Authorization', `Bearer ${token}`);
}
const uploadResponse = await uploadReq;
const { jobId } = uploadResponse.body;
// Assert 1: Check that a job ID was returned.
expect(jobId).toBeTypeOf('string');
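The upload test computes a checksum before posting the file. The repo's `generateFileChecksum` is not shown here; a plausible sketch, assuming SHA-256 over the raw bytes with hex encoding (the real utility may differ), is:

```typescript
import { createHash } from 'node:crypto';

// Assumed behavior of a file-checksum utility: SHA-256 over the raw
// bytes, hex-encoded. Used server-side to deduplicate flyer uploads.
function checksumOf(data: Buffer | string): string {
  return createHash('sha256').update(data).digest('hex');
}
```

Because the flyer table stores the checksum with a uniqueness expectation, the tests above generate unique file content per run so re-uploads are not rejected as duplicates.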
@@ -79,8 +89,12 @@ describe('Flyer Processing Background Job Integration Test', () => {
const maxRetries = 20; // Poll for up to 60 seconds (20 * 3s)
for (let i = 0; i < maxRetries; i++) {
await new Promise((resolve) => setTimeout(resolve, 3000)); // Wait 3 seconds between polls
const statusResponse = await aiApiClient.getJobStatus(jobId, token);
jobStatus = await statusResponse.json();
const statusReq = request.get(`/api/ai/jobs/${jobId}/status`);
if (token) {
statusReq.set('Authorization', `Bearer ${token}`);
}
const statusResponse = await statusReq;
jobStatus = statusResponse.body;
if (jobStatus.state === 'completed' || jobStatus.state === 'failed') {
break;
}
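The polling loop above can be factored into a generic helper: retry an async status check until a terminal state is reached or the retry budget runs out. A sketch (names hypothetical, mirroring the 20 × 3 s budget used in the test):

```typescript
type JobState = 'waiting' | 'active' | 'completed' | 'failed';

// Poll an async state getter until the job reaches a terminal state or
// maxRetries attempts are exhausted; returns the last observed state.
async function pollUntilDone(
  getState: () => Promise<JobState>,
  maxRetries = 20,
  delayMs = 3000,
): Promise<JobState> {
  let state: JobState = 'waiting';
  for (let i = 0; i < maxRetries; i++) {
    await new Promise((resolve) => setTimeout(resolve, delayMs));
    state = await getState();
    if (state === 'completed' || state === 'failed') break;
  }
  return state;
}
```

In the test this would be called with a closure around the supertest status request, keeping the retry/delay policy in one place instead of inline in every background-job test.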


@@ -1,7 +1,8 @@
// src/tests/integration/flyer.integration.test.ts
import { describe, it, expect, beforeAll } from 'vitest';
import * as apiClient from '../../services/apiClient';
import supertest from 'supertest';
import { getPool } from '../../services/db/connection.db';
import app from '../../../server';
import type { Flyer, FlyerItem } from '../../types';
/**
@@ -10,6 +11,8 @@ import type { Flyer, FlyerItem } from '../../types';
describe('Public Flyer API Routes Integration Tests', () => {
let flyers: Flyer[] = [];
// Use a supertest instance for all requests in this file
const request = supertest(app);
let createdFlyerId: number;
// Fetch flyers once before all tests in this suite to use in subsequent tests.
@@ -34,18 +37,16 @@ describe('Public Flyer API Routes Integration Tests', () => {
[createdFlyerId],
);
const response = await apiClient.fetchFlyers();
flyers = await response.json();
const response = await request.get('/api/flyers');
flyers = response.body;
});
describe('GET /api/flyers', () => {
it('should return a list of flyers', async () => {
// Act: Call the API endpoint using the client function.
const response = await apiClient.fetchFlyers();
const flyers: Flyer[] = await response.json();
// Assert: Verify the response is successful and contains the expected data structure.
expect(response.ok).toBe(true);
const response = await request.get('/api/flyers');
const flyers: Flyer[] = response.body;
expect(response.status).toBe(200);
expect(flyers).toBeInstanceOf(Array);
// We created a flyer in beforeAll, so we expect the array not to be empty.
@@ -69,11 +70,10 @@ describe('Public Flyer API Routes Integration Tests', () => {
const testFlyer = flyers[0];
// Act: Fetch items for the first flyer.
const response = await apiClient.fetchFlyerItems(testFlyer.flyer_id);
const items: FlyerItem[] = await response.json();
const response = await request.get(`/api/flyers/${testFlyer.flyer_id}/items`);
const items: FlyerItem[] = response.body;
// Assert: Verify the response and data structure.
expect(response.ok).toBe(true);
expect(response.status).toBe(200);
expect(items).toBeInstanceOf(Array);
// If there are items, check the shape of the first one.
@@ -87,18 +87,16 @@ describe('Public Flyer API Routes Integration Tests', () => {
});
});
describe('POST /api/flyer-items/batch-fetch', () => {
describe('POST /api/flyers/items/batch-fetch', () => {
it('should return items for multiple flyer IDs', async () => {
// Arrange: Get IDs from the flyers fetched in beforeAll.
const flyerIds = flyers.map((f) => f.flyer_id);
expect(flyerIds.length).toBeGreaterThan(0);
// Act: Fetch items for all available flyers.
const response = await apiClient.fetchFlyerItemsForFlyers(flyerIds);
const items: FlyerItem[] = await response.json();
// Assert
expect(response.ok).toBe(true);
const response = await request.post('/api/flyers/items/batch-fetch').send({ flyerIds });
const items: FlyerItem[] = response.body;
expect(response.status).toBe(200);
expect(items).toBeInstanceOf(Array);
// The total number of items should be greater than or equal to the number of flyers (assuming at least one item per flyer).
if (items.length > 0) {
@@ -107,15 +105,15 @@ describe('Public Flyer API Routes Integration Tests', () => {
});
});
describe('POST /api/flyer-items/batch-count', () => {
describe('POST /api/flyers/items/batch-count', () => {
it('should return the total count of items for multiple flyer IDs', async () => {
// Arrange
const flyerIds = flyers.map((f) => f.flyer_id);
expect(flyerIds.length).toBeGreaterThan(0);
// Act
const response = await apiClient.countFlyerItemsForFlyers(flyerIds);
const result = await response.json();
const response = await request.post('/api/flyers/items/batch-count').send({ flyerIds });
const result = response.body;
// Assert
expect(result.count).toBeTypeOf('number');


@@ -1,108 +0,0 @@
// src/tests/integration/public.integration.test.ts
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import * as apiClient from '../../services/apiClient';
import { getPool } from '../../services/db/connection.db';
/**
* @vitest-environment node
*/
describe('Public API Routes Integration Tests', () => {
let createdFlyerId: number;
let createdMasterItemId: number;
beforeAll(async () => {
const pool = getPool();
// Create a store for the flyer
const storeRes = await pool.query(
`INSERT INTO public.stores (name) VALUES ('Public Test Store') RETURNING store_id`,
);
const storeId = storeRes.rows[0].store_id;
// Create a flyer
const flyerRes = await pool.query(
`INSERT INTO public.flyers (store_id, file_name, image_url, item_count, checksum)
VALUES ($1, 'public-test.jpg', 'http://test.com/public.jpg', 0, $2) RETURNING flyer_id`,
[storeId, `checksum-public-${Date.now()}`],
);
createdFlyerId = flyerRes.rows[0].flyer_id;
// Create a master item. Assumes a category with ID 1 exists from static seeds.
const masterItemRes = await pool.query(
`INSERT INTO public.master_grocery_items (name, category_id) VALUES ('Public Test Item', 1) RETURNING master_grocery_item_id`,
);
createdMasterItemId = masterItemRes.rows[0].master_grocery_item_id;
});
afterAll(async () => {
const pool = getPool();
// Cleanup in reverse order of creation
if (createdMasterItemId) {
await pool.query(
'DELETE FROM public.master_grocery_items WHERE master_grocery_item_id = $1',
[createdMasterItemId],
);
}
if (createdFlyerId) {
await pool.query('DELETE FROM public.flyers WHERE flyer_id = $1', [createdFlyerId]);
}
});
describe('Health Check Endpoints', () => {
it('GET /api/health/ping should return "pong"', async () => {
const response = await apiClient.pingBackend();
expect(response.ok).toBe(true);
expect(await response.text()).toBe('pong');
});
it('GET /api/health/db-schema should return success', async () => {
const response = await apiClient.checkDbSchema();
const result = await response.json();
expect(result.success).toBe(true);
expect(result.message).toBe('All required database tables exist.');
});
it('GET /api/health/storage should return success', async () => {
// This assumes the STORAGE_PATH is correctly set up for the test environment
const response = await apiClient.checkStorage();
const result = await response.json();
expect(result.success).toBe(true);
expect(result.message).toContain('is accessible and writable');
});
it('GET /api/health/db-pool should return success', async () => {
const response = await apiClient.checkDbPoolHealth();
// The pingBackend function returns a boolean directly, so no .json() call is needed.
// However, checkDbPoolHealth returns a Response, so we need to parse it.
const result = await response.json();
expect(result.success).toBe(true);
expect(result.message).toContain('Pool Status:');
});
});
describe('Public Data Endpoints', () => {
it('GET /api/flyers should return a list of flyers', async () => {
const response = await apiClient.fetchFlyers();
const flyers = await response.json();
expect(flyers).toBeInstanceOf(Array);
// We created a flyer, so we expect it to be in the list.
expect(flyers.length).toBeGreaterThan(0);
const foundFlyer = flyers.find((f: { flyer_id: number }) => f.flyer_id === createdFlyerId);
expect(foundFlyer).toBeDefined();
expect(foundFlyer).toHaveProperty('store');
});
it('GET /api/master-items should return a list of master items', async () => {
const response = await apiClient.fetchMasterItems();
const masterItems = await response.json();
expect(masterItems).toBeInstanceOf(Array);
// We created a master item, so we expect it to be in the list.
expect(masterItems.length).toBeGreaterThan(0);
const foundItem = masterItems.find(
(i: { master_grocery_item_id: number }) => i.master_grocery_item_id === createdMasterItemId,
);
expect(foundItem).toBeDefined();
expect(foundItem).toHaveProperty('category_name');
});
});
});


@@ -1,6 +1,7 @@
// src/tests/integration/public.routes.integration.test.ts
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import supertest from 'supertest';
import app from '../../../server';
import type {
Flyer,
FlyerItem,
@@ -13,8 +14,11 @@ import type {
import { getPool } from '../../services/db/connection.db';
import { createAndLoginUser } from '../utils/testHelpers';
const API_URL = process.env.VITE_API_BASE_URL || 'http://localhost:3001/api';
const request = supertest(API_URL.replace('/api', '')); // supertest needs the server's base URL
/**
* @vitest-environment node
*/
const request = supertest(app);
describe('Public API Routes Integration Tests', () => {
// Shared state for tests
@@ -97,17 +101,17 @@ describe('Public API Routes Integration Tests', () => {
expect(response.status).toBe(200);
expect(response.body.success).toBe(true);
});
});
describe('Public Data Endpoints', () => {
it('GET /api/time should return the server time', async () => {
const response = await request.get('/api/time');
it('GET /api/health/time should return the server time', async () => {
const response = await request.get('/api/health/time');
expect(response.status).toBe(200);
expect(response.body).toHaveProperty('currentTime');
expect(response.body).toHaveProperty('year');
expect(response.body).toHaveProperty('week');
});
});
describe('Public Data Endpoints', () => {
it('GET /api/flyers should return a list of flyers', async () => {
const response = await request.get('/api/flyers');
const flyers: Flyer[] = response.body;
@@ -126,25 +130,25 @@ describe('Public API Routes Integration Tests', () => {
expect(items[0].flyer_id).toBe(testFlyer.flyer_id);
});
it('POST /api/flyer-items/batch-fetch should return items for multiple flyers', async () => {
it('POST /api/flyers/items/batch-fetch should return items for multiple flyers', async () => {
const flyerIds = [testFlyer.flyer_id];
const response = await request.post('/api/flyer-items/batch-fetch').send({ flyerIds });
const response = await request.post('/api/flyers/items/batch-fetch').send({ flyerIds });
const items: FlyerItem[] = response.body;
expect(response.status).toBe(200);
expect(items).toBeInstanceOf(Array);
expect(items.length).toBeGreaterThan(0);
});
it('POST /api/flyer-items/batch-count should return a count for multiple flyers', async () => {
it('POST /api/flyers/items/batch-count should return a count for multiple flyers', async () => {
const flyerIds = [testFlyer.flyer_id];
const response = await request.post('/api/flyer-items/batch-count').send({ flyerIds });
const response = await request.post('/api/flyers/items/batch-count').send({ flyerIds });
expect(response.status).toBe(200);
expect(response.body.count).toBeTypeOf('number');
expect(response.body.count).toBeGreaterThan(0);
});
it('GET /api/master-items should return a list of master grocery items', async () => {
const response = await request.get('/api/master-items');
it('GET /api/personalization/master-items should return a list of master grocery items', async () => {
const response = await request.get('/api/personalization/master-items');
const masterItems = response.body;
expect(response.status).toBe(200);
expect(masterItems).toBeInstanceOf(Array);
@@ -190,9 +194,9 @@ describe('Public API Routes Integration Tests', () => {
expect(items).toBeInstanceOf(Array);
});
it('GET /api/dietary-restrictions should return a list of restrictions', async () => {
it('GET /api/personalization/dietary-restrictions should return a list of restrictions', async () => {
// This test relies on static seed data for a lookup table, which is acceptable.
const response = await request.get('/api/dietary-restrictions');
const response = await request.get('/api/personalization/dietary-restrictions');
const restrictions: DietaryRestriction[] = response.body;
expect(response.status).toBe(200);
expect(restrictions).toBeInstanceOf(Array);
@@ -200,8 +204,8 @@ describe('Public API Routes Integration Tests', () => {
expect(restrictions[0]).toHaveProperty('dietary_restriction_id');
});
it('GET /api/appliances should return a list of appliances', async () => {
const response = await request.get('/api/appliances');
it('GET /api/personalization/appliances should return a list of appliances', async () => {
const response = await request.get('/api/personalization/appliances');
const appliances: Appliance[] = response.body;
expect(response.status).toBe(200);
expect(appliances).toBeInstanceOf(Array);


@@ -1,6 +1,7 @@
// src/tests/integration/system.integration.test.ts
import { describe, it, expect } from 'vitest';
import * as apiClient from '../../services/apiClient';
import supertest from 'supertest';
import app from '../../../server';
/**
* @vitest-environment node
@@ -9,15 +10,16 @@ import * as apiClient from '../../services/apiClient';
describe('System API Routes Integration Tests', () => {
describe('GET /api/system/pm2-status', () => {
it('should return a status for PM2', async () => {
const request = supertest(app);
// In a typical CI environment without PM2, this will fail gracefully.
// The test verifies that the endpoint responds correctly, even if PM2 isn't running.
const response = await apiClient.checkPm2Status();
const result = await response.json();
const response = await request.get('/api/system/pm2-status');
const result = response.body;
expect(result).toBeDefined();
expect(result).toHaveProperty('message');
// If the response is successful (200 OK), it must have a 'success' property.
// If it's an error (e.g., 500 because pm2 command not found), it will only have 'message'.
if (response.ok) {
if (response.status === 200) {
expect(result).toHaveProperty('success');
}
});


@@ -1,6 +1,7 @@
// src/tests/integration/user.integration.test.ts
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import * as apiClient from '../../services/apiClient';
import supertest from 'supertest';
import app from '../../../server';
import { logger } from '../../services/logger.server';
import { getPool } from '../../services/db/connection.db';
import type { UserProfile, MasterGroceryItem, ShoppingList } from '../../types';
@@ -10,25 +11,12 @@ import { createAndLoginUser, TEST_PASSWORD } from '../utils/testHelpers';
* @vitest-environment node
*/
const request = supertest(app);
describe('User API Routes Integration Tests', () => {
let testUser: UserProfile;
let authToken: string;
// --- START DEBUG LOGGING ---
// Query the DB from within the test file to see its state.
beforeAll(async () => {
const res = await getPool().query(
'SELECT u.user_id, u.email, p.role FROM public.users u JOIN public.profiles p ON u.user_id = p.user_id',
);
console.log(
'\n--- [user.integration.test.ts] Users found in DB from TEST perspective (beforeAll): ---',
);
console.table(res.rows);
console.log(
'-------------------------------------------------------------------------------------\n',
);
});
// --- END DEBUG LOGGING ---
// Before any tests run, create a new user and log them in.
// The token will be used for all subsequent API calls in this test suite.
beforeAll(async () => {
@@ -62,11 +50,13 @@ describe('User API Routes Integration Tests', () => {
it('should fetch the authenticated user profile via GET /api/users/profile', async () => {
// Act: Call the API endpoint using the authenticated token.
const response = await apiClient.getAuthenticatedUserProfile({ tokenOverride: authToken });
const profile = await response.json();
const response = await request
.get('/api/users/profile')
.set('Authorization', `Bearer ${authToken}`);
const profile = response.body;
// Assert: Verify the profile data matches the created user.
expect(profile).toBeDefined();
expect(response.status).toBe(200);
expect(profile.user.user_id).toBe(testUser.user.user_id);
expect(profile.user.email).toBe(testUser.user.email); // This was already correct
expect(profile.full_name).toBe('Test User');
@@ -80,20 +70,21 @@ describe('User API Routes Integration Tests', () => {
};
// Act: Call the update endpoint with the new data and the auth token.
const response = await apiClient.updateUserProfile(profileUpdates, {
tokenOverride: authToken,
});
const updatedProfile = await response.json();
const response = await request
.put('/api/users/profile')
.set('Authorization', `Bearer ${authToken}`)
.send(profileUpdates);
const updatedProfile = response.body;
// Assert: Check that the returned profile reflects the changes.
expect(updatedProfile).toBeDefined();
expect(response.status).toBe(200);
expect(updatedProfile.full_name).toBe('Updated Test User');
// Also, fetch the profile again to ensure the change was persisted.
const refetchResponse = await apiClient.getAuthenticatedUserProfile({
tokenOverride: authToken,
});
const refetchedProfile = await refetchResponse.json();
const refetchResponse = await request
.get('/api/users/profile')
.set('Authorization', `Bearer ${authToken}`);
const refetchedProfile = refetchResponse.body;
expect(refetchedProfile.full_name).toBe('Updated Test User');
});
@@ -104,14 +95,14 @@ describe('User API Routes Integration Tests', () => {
};
// Act: Call the update endpoint.
const response = await apiClient.updateUserPreferences(preferenceUpdates, {
tokenOverride: authToken,
});
const updatedProfile = await response.json();
const response = await request
.put('/api/users/profile/preferences')
.set('Authorization', `Bearer ${authToken}`)
.send(preferenceUpdates);
const updatedProfile = response.body;
// Assert: Check that the preferences object in the returned profile is updated.
expect(updatedProfile).toBeDefined();
expect(updatedProfile.preferences).toBeDefined();
expect(response.status).toBe(200);
expect(updatedProfile.preferences?.darkMode).toBe(true);
});
@@ -122,9 +113,14 @@ describe('User API Routes Integration Tests', () => {
// Act & Assert: Attempt to register and expect the promise to reject
// with an error message indicating the password is too weak.
const response = await apiClient.registerUser(email, weakPassword, 'Weak Password User');
expect(response.ok).toBe(false);
const errorData = (await response.json()) as { message: string; errors: { message: string }[] };
const response = await request.post('/api/auth/register').send({
email,
password: weakPassword,
full_name: 'Weak Password User',
});
expect(response.status).toBe(400);
const errorData = response.body as { message: string; errors: { message: string }[] };
// For validation errors, the detailed messages are in the `errors` array.
// We join them to check for the specific feedback from the password strength checker.
const detailedErrorMessage = errorData.errors?.map((e) => e.message).join(' ');
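The weak-password test joins the per-field messages from the `errors` array to inspect the password-strength feedback. A sketch of that payload shape and the joining logic (the interface is inferred from the assertions above, not from the server code):

```typescript
// Inferred shape of the 400 validation response asserted in the test:
// a top-level message plus an optional array of per-field messages.
interface ValidationErrorBody {
  message: string;
  errors?: { message: string }[];
}

// Collapse detailed messages into one searchable string, falling back to
// the top-level message when no detail array is present.
function joinValidationMessages(body: ValidationErrorBody): string {
  return body.errors?.map((e) => e.message).join(' ') ?? body.message;
}
```

This lets the assertion match substrings from the password-strength checker regardless of which field produced them.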
@@ -137,18 +133,22 @@ describe('User API Routes Integration Tests', () => {
const { token: deletionToken } = await createAndLoginUser({ email: deletionEmail });
// Act: Call the delete endpoint with the correct password and token.
const response = await apiClient.deleteUserAccount(TEST_PASSWORD, {
tokenOverride: deletionToken,
});
const deleteResponse = await response.json();
const response = await request
.delete('/api/users/account')
.set('Authorization', `Bearer ${deletionToken}`)
.send({ password: TEST_PASSWORD });
const deleteResponse = response.body;
// Assert: Check for a successful deletion message.
expect(response.status).toBe(200);
expect(deleteResponse.message).toBe('Account deleted successfully.');
// Assert (Verification): Attempting to log in again with the same credentials should now fail.
const loginResponse = await apiClient.loginUser(deletionEmail, TEST_PASSWORD, false);
expect(loginResponse.ok).toBe(false);
const errorData = await loginResponse.json();
const loginResponse = await request
.post('/api/auth/login')
.send({ email: deletionEmail, password: TEST_PASSWORD });
expect(loginResponse.status).toBe(401);
const errorData = loginResponse.body;
expect(errorData.message).toBe('Incorrect email or password.');
});
@@ -158,12 +158,14 @@ describe('User API Routes Integration Tests', () => {
const { user: resetUser } = await createAndLoginUser({ email: resetEmail });
// Act 1: Request a password reset. In our test environment, the token is returned in the response.
const resetRequestRawResponse = await apiClient.requestPasswordReset(resetEmail);
if (!resetRequestRawResponse.ok) {
const errorData = await resetRequestRawResponse.json();
const resetRequestRawResponse = await request
.post('/api/auth/forgot-password')
.send({ email: resetEmail });
if (resetRequestRawResponse.status !== 200) {
const errorData = resetRequestRawResponse.body;
throw new Error(errorData.message || 'Password reset request failed');
}
const resetRequestResponse = await resetRequestRawResponse.json();
const resetRequestResponse = resetRequestRawResponse.body;
const resetToken = resetRequestResponse.token;
// Assert 1: Check that we received a token.
@@ -172,19 +174,23 @@ describe('User API Routes Integration Tests', () => {
// Act 2: Use the token to set a new password.
const newPassword = 'my-new-secure-password-!@#$';
const resetRawResponse = await apiClient.resetPassword(resetToken!, newPassword);
if (!resetRawResponse.ok) {
const errorData = await resetRawResponse.json();
const resetRawResponse = await request
.post('/api/auth/reset-password')
.send({ token: resetToken!, newPassword });
if (resetRawResponse.status !== 200) {
const errorData = resetRawResponse.body;
throw new Error(errorData.message || 'Password reset failed');
}
const resetResponse = await resetRawResponse.json();
const resetResponse = resetRawResponse.body;
// Assert 2: Check for a successful password reset message.
expect(resetResponse.message).toBe('Password has been reset successfully.');
// Act 3 & Assert 3 (Verification): Log in with the NEW password to confirm the change.
const loginResponse = await apiClient.loginUser(resetEmail, newPassword, false);
const loginData = await loginResponse.json();
const loginResponse = await request
.post('/api/auth/login')
.send({ email: resetEmail, password: newPassword });
const loginData = loginResponse.body;
expect(loginData.userprofile).toBeDefined();
expect(loginData.userprofile.user.user_id).toBe(resetUser.user.user_id);
});
@@ -192,20 +198,21 @@ describe('User API Routes Integration Tests', () => {
describe('User Data Routes (Watched Items & Shopping Lists)', () => {
it('should allow a user to add and remove a watched item', async () => {
// Act 1: Add a new watched item. The API returns the created master item.
const addResponse = await apiClient.addWatchedItem(
'Integration Test Item',
'Other/Miscellaneous',
authToken,
);
const newItem = await addResponse.json();
const addResponse = await request
.post('/api/users/watched-items')
.set('Authorization', `Bearer ${authToken}`)
.send({ itemName: 'Integration Test Item', category: 'Other/Miscellaneous' });
const newItem = addResponse.body;
// Assert 1: Check that the item was created correctly.
expect(newItem).toBeDefined();
expect(addResponse.status).toBe(201);
expect(newItem.name).toBe('Integration Test Item');
// Act 2: Fetch all watched items for the user.
const watchedItemsResponse = await apiClient.fetchWatchedItems(authToken);
const watchedItems = await watchedItemsResponse.json();
const watchedItemsResponse = await request
.get('/api/users/watched-items')
.set('Authorization', `Bearer ${authToken}`);
const watchedItems = watchedItemsResponse.body;
// Assert 2: Verify the new item is in the user's watched list.
expect(
@@ -216,11 +223,16 @@ describe('User API Routes Integration Tests', () => {
).toBe(true);
// Act 3: Remove the watched item.
await apiClient.removeWatchedItem(newItem.master_grocery_item_id, authToken);
const removeResponse = await request
.delete(`/api/users/watched-items/${newItem.master_grocery_item_id}`)
.set('Authorization', `Bearer ${authToken}`);
expect(removeResponse.status).toBe(204);
// Assert 3: Fetch again and verify the item is gone.
const finalWatchedItemsResponse = await apiClient.fetchWatchedItems(authToken);
const finalWatchedItems = await finalWatchedItemsResponse.json();
const finalWatchedItemsResponse = await request
.get('/api/users/watched-items')
.set('Authorization', `Bearer ${authToken}`);
const finalWatchedItems = finalWatchedItemsResponse.body;
expect(
finalWatchedItems.some(
(item: MasterGroceryItem) =>
@@ -231,31 +243,33 @@ describe('User API Routes Integration Tests', () => {
it('should allow a user to manage a shopping list', async () => {
// Act 1: Create a new shopping list.
const createListResponse = await apiClient.createShoppingList(
'My Integration Test List',
authToken,
);
const newList = await createListResponse.json();
const createListResponse = await request
.post('/api/users/shopping-lists')
.set('Authorization', `Bearer ${authToken}`)
.send({ name: 'My Integration Test List' });
const newList = createListResponse.body;
// Assert 1: Check that the list was created.
expect(newList).toBeDefined();
expect(createListResponse.status).toBe(201);
expect(newList.name).toBe('My Integration Test List');
// Act 2: Add an item to the new list.
const addItemResponse = await apiClient.addShoppingListItem(
newList.shopping_list_id,
{ customItemName: 'Custom Test Item' },
authToken,
);
const addedItem = await addItemResponse.json();
const addItemResponse = await request
.post(`/api/users/shopping-lists/${newList.shopping_list_id}/items`)
.set('Authorization', `Bearer ${authToken}`)
.send({ customItemName: 'Custom Test Item' });
const addedItem = addItemResponse.body;
// Assert 2: Check that the item was added.
expect(addedItem).toBeDefined();
expect(addItemResponse.status).toBe(201);
expect(addedItem.custom_item_name).toBe('Custom Test Item');
// Assert 3: Fetch all lists and verify the new item is present in the correct list.
const fetchResponse = await apiClient.fetchShoppingLists(authToken);
const lists = await fetchResponse.json();
const fetchResponse = await request
.get('/api/users/shopping-lists')
.set('Authorization', `Bearer ${authToken}`);
const lists = fetchResponse.body;
expect(fetchResponse.status).toBe(200);
const updatedList = lists.find(
(l: ShoppingList) => l.shopping_list_id === newList.shopping_list_id,
);


@@ -1,42 +1,30 @@
// src/tests/integration/user.routes.integration.test.ts
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import supertest from 'supertest';
import app from '../../../server';
import { getPool } from '../../services/db/connection.db';
import type { UserProfile } from '../../types';
import { createAndLoginUser } from '../utils/testHelpers';
const API_URL = process.env.VITE_API_BASE_URL || 'http://localhost:3001/api';
const request = supertest(API_URL.replace('/api', '')); // supertest needs the server's base URL
/**
* @vitest-environment node
*/
const request = supertest(app);
let authToken = '';
let createdListId: number;
let testUser: UserProfile;
const testPassword = 'password-for-user-routes-test';
describe('User Routes Integration Tests (/api/users)', () => {
// Authenticate once before all tests in this suite to get a JWT.
beforeAll(async () => {
// Create a new user for this test suite to avoid dependency on seeded data
const testEmail = `user-routes-test-${Date.now()}@example.com`;
// 1. Register the user
const registerResponse = await request
.post('/api/auth/register')
.send({ email: testEmail, password: testPassword, full_name: 'User Routes Test User' });
expect(registerResponse.status).toBe(201);
// 2. Log in as the new user
const loginResponse = await request
.post('/api/auth/login')
.send({ email: testEmail, password: testPassword });
if (loginResponse.status !== 200) {
console.error('Login failed in beforeAll hook:', loginResponse.body);
}
expect(loginResponse.status).toBe(200);
expect(loginResponse.body.token).toBeDefined();
authToken = loginResponse.body.token;
testUser = loginResponse.body.userprofile;
// Use the helper to create and log in a user in one step.
const { user, token } = await createAndLoginUser({
fullName: 'User Routes Test User',
});
testUser = user;
authToken = token;
});
afterAll(async () => {
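The refactored `beforeAll` above replaces a two-step register-then-login sequence with a single `createAndLoginUser` helper call. A minimal sketch of what such a helper can look like follows; the `register` and `login` calls here are stubbed in-memory stand-ins, not the project's real API, and every name in this block is illustrative:

```typescript
// Sketch of a createAndLoginUser-style test helper. The auth backend is
// faked with an in-memory Map; only the helper's shape mirrors the diff.
interface UserProfile {
  user_id: string;
  full_name: string;
}

const db = new Map<string, UserProfile>();

async function register(email: string, fullName: string): Promise<UserProfile> {
  const user = { user_id: `u-${db.size + 1}`, full_name: fullName };
  db.set(email, user);
  return user;
}

async function login(email: string): Promise<{ token: string; userprofile: UserProfile }> {
  const userprofile = db.get(email);
  if (!userprofile) throw new Error('login failed');
  return { token: `jwt-for-${userprofile.user_id}`, userprofile };
}

// Consolidates the register + login dance the old beforeAll did inline.
export async function createAndLoginUser(opts: { fullName: string }) {
  const email = `test-${Date.now()}@example.com`;
  await register(email, opts.fullName);
  const { token, userprofile } = await login(email);
  return { user: userprofile, token };
}
```

Folding both steps into one helper keeps each suite's `beforeAll` to a few lines and removes the dependency on seeded accounts.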


@@ -52,7 +52,10 @@ export type FlyerDbInsert = Omit<FlyerInsert, 'store_name'> & { store_id: number
* Represents the data required to insert a new flyer item into the database.
* It's a subset of the full FlyerItem type.
*/
export type FlyerItemInsert = Omit<FlyerItem, 'flyer_item_id' | 'flyer_id' | 'created_at' | 'updated_at'>;
export type FlyerItemInsert = Omit<
FlyerItem,
'flyer_item_id' | 'flyer_id' | 'created_at' | 'updated_at'
>;
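The `Omit`-based insert type above strips the columns the database manages itself. A small self-contained sketch of the same pattern, using a trimmed-down hypothetical `FlyerItem` shape (the fields here are illustrative, not the real schema):

```typescript
// Hypothetical, trimmed-down stand-in for the real FlyerItem row type.
interface FlyerItem {
  flyer_item_id: number;
  flyer_id: number;
  created_at: string;
  updated_at: string;
  name: string;
  price_cents: number;
}

// Same pattern as FlyerItemInsert: drop the DB-managed columns, so callers
// can only supply the fields an INSERT actually needs.
type FlyerItemInsert = Omit<
  FlyerItem,
  'flyer_item_id' | 'flyer_id' | 'created_at' | 'updated_at'
>;

export const insert: FlyerItemInsert = { name: 'Bananas', price_cents: 129 };
```

Attempting to set `flyer_item_id` on `insert` would be a compile-time error, which is the point of deriving the insert type rather than duplicating it.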
export interface UnitPrice {
value: number;
@@ -163,13 +166,12 @@ export interface Profile {
updated_by?: string | null;
}
/**
* Represents the combined user and profile data object returned by the backend's /users/profile endpoint.
* It embeds the User object within the Profile object.
* It also includes the full Address object if one is associated with the profile.
*/
export type UserProfile = Profile & {
export type UserProfile = Profile & {
user: User;
address?: Address | null;
};
@@ -325,7 +327,6 @@ export interface RecipeIngredientSubstitution {
notes?: string | null;
}
export interface Tag {
tag_id: number;
name: string;
@@ -718,7 +719,10 @@ export type AiAnalysisAction =
// Dispatched when an analysis that returns a simple string succeeds.
| { type: 'FETCH_SUCCESS_TEXT'; payload: { analysisType: AnalysisType; data: string } }
// Dispatched when an analysis that returns text and sources succeeds.
| { type: 'FETCH_SUCCESS_GROUNDED'; payload: { analysisType: AnalysisType; data: GroundedResponse } }
| {
type: 'FETCH_SUCCESS_GROUNDED';
payload: { analysisType: AnalysisType; data: GroundedResponse };
}
// Dispatched when the image generation succeeds.
| { type: 'FETCH_SUCCESS_IMAGE'; payload: { data: string } }
// Dispatched when any analysis fails.
@@ -738,11 +742,25 @@ export interface ProcessingStage {
}
export const CATEGORIES = [
'Fruits & Vegetables', 'Meat & Seafood', 'Dairy & Eggs', 'Bakery & Bread',
'Pantry & Dry Goods', 'Beverages', 'Frozen Foods', 'Snacks', 'Household & Cleaning',
'Personal Care & Health', 'Baby & Child', 'Pet Supplies', 'Deli & Prepared Foods',
'Canned Goods', 'Condiments & Spices', 'Breakfast & Cereal', 'Organic',
'International Foods', 'Other/Miscellaneous'
'Fruits & Vegetables',
'Meat & Seafood',
'Dairy & Eggs',
'Bakery & Bread',
'Pantry & Dry Goods',
'Beverages',
'Frozen Foods',
'Snacks',
'Household & Cleaning',
'Personal Care & Health',
'Baby & Child',
'Pet Supplies',
'Deli & Prepared Foods',
'Canned Goods',
'Condiments & Spices',
'Breakfast & Cereal',
'Organic',
'International Foods',
'Other/Miscellaneous',
];
/**
@@ -750,11 +768,11 @@ export const CATEGORIES = [
* This is the structure returned from the backend to the frontend.
*/
export interface ExtractedCoreData {
store_name: string;
valid_from: string | null;
valid_to: string | null;
store_address: string | null;
items: ExtractedFlyerItem[];
store_name: string;
valid_from: string | null;
valid_to: string | null;
store_address: string | null;
items: ExtractedFlyerItem[];
}
/**
@@ -776,19 +794,19 @@ export interface ExtractedFlyerItem {
* Represents the logo data extracted from a flyer by the AI service.
*/
export interface ExtractedLogoData {
store_logo_base_64: string | null;
store_logo_base_64: string | null;
}
/**
* Represents the data extracted from a receipt image by the AI service.
*/
export interface ExtractedReceiptData {
raw_text: string;
items: {
raw_item_description: string;
quantity: number;
price_paid_cents: number;
}[];
raw_text: string;
items: {
raw_item_description: string;
quantity: number;
price_paid_cents: number;
}[];
}
/**
@@ -930,10 +948,10 @@ export interface LeaderboardUser {
* This is a public-facing type and does not include sensitive fields.
*/
export interface AdminUserView {
user_id: string;
email: string;
created_at: string;
role: 'admin' | 'user';
full_name: string | null;
avatar_url: string | null;
}
user_id: string;
email: string;
created_at: string;
role: 'admin' | 'user';
full_name: string | null;
avatar_url: string | null;
}
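The `AiAnalysisAction` union reformatted earlier in this file is a discriminated union keyed on `type`. A reducer consuming it can be sketched as follows; the `AnalysisType`, `GroundedResponse`, and `State` shapes below are minimal stand-ins for the real types, not the project's definitions:

```typescript
// Minimal stand-ins; field names are illustrative only.
type AnalysisType = 'summary' | 'nutrition';
interface GroundedResponse {
  text: string;
  sources: string[];
}

type AiAnalysisAction =
  | { type: 'FETCH_SUCCESS_TEXT'; payload: { analysisType: AnalysisType; data: string } }
  | {
      type: 'FETCH_SUCCESS_GROUNDED';
      payload: { analysisType: AnalysisType; data: GroundedResponse };
    }
  | { type: 'FETCH_SUCCESS_IMAGE'; payload: { data: string } };

interface State {
  text?: string;
  sources?: string[];
  image?: string;
}

// Switching on the `type` discriminant narrows `action.payload` to the
// correct shape in each branch, with no casts.
export function reducer(state: State, action: AiAnalysisAction): State {
  switch (action.type) {
    case 'FETCH_SUCCESS_TEXT':
      return { ...state, text: action.payload.data };
    case 'FETCH_SUCCESS_GROUNDED':
      return { ...state, text: action.payload.data.text, sources: action.payload.data.sources };
    case 'FETCH_SUCCESS_IMAGE':
      return { ...state, image: action.payload.data };
  }
}

export const next = reducer(
  {},
  {
    type: 'FETCH_SUCCESS_GROUNDED',
    payload: { analysisType: 'summary', data: { text: 'ok', sources: ['a'] } },
  },
);
```

This is why the union members carry distinct payload shapes: the discriminant does the narrowing work that would otherwise need runtime checks.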

src/vite-env.d.ts

@@ -10,4 +10,4 @@ interface ImportMetaEnv {
interface ImportMeta {
readonly env: ImportMetaEnv;
}
}


@@ -2,4 +2,4 @@
// This file can be used for global setup logic that applies to ALL test projects
// defined in the workspace. Since our unit and integration tests have distinct
// setup requirements, this file is currently empty.
// setup requirements, this file is currently empty.


@@ -4,8 +4,5 @@ console.log('--- [EXECUTION PROOF] tailwind.config.js is being loaded. ---');
/** @type {import('tailwindcss').Config} */
export default {
content: [
"./index.html",
"./src/**/*.{js,ts,jsx,tsx}",
],
};
content: ['./index.html', './src/**/*.{js,ts,jsx,tsx}'],
};


@@ -17,12 +17,10 @@
"jsx": "react-jsx",
// This line makes Vitest's global APIs (describe, it, expect) available everywhere
// without needing to import them.
"types": [
"vitest/globals"
]
"types": ["vitest/globals"]
},
// This is the most important part: It tells TypeScript to include ALL files
// within the 'src' directory, including your new 'vite-env.d.ts' file.
"include": ["src"],
"references": [{ "path": "./tsconfig.node.json" }]
}
}


@@ -9,5 +9,10 @@
"strict": true, // It's good practice to keep tooling config strict
"types": ["node"]
},
"include": ["vite.config.ts", "vitest.config.ts", "vitest.config.integration.ts", "vitest.workspace.ts"]
}
"include": [
"vite.config.ts",
"vitest.config.ts",
"vitest.config.integration.ts",
"vitest.workspace.ts"
]
}


@@ -22,7 +22,7 @@ export default defineConfig({
resolve: {
alias: {
// Use __dirname for a more robust path resolution
'@': path.resolve(__dirname, './src'),
'@': path.resolve(__dirname, './src'),
// This alias ensures that any import of 'services/logger' is resolved
// to the browser-safe client version during the Vite build process.
// Server-side code should explicitly import 'services/logger.server'.
@@ -51,7 +51,7 @@ export default defineConfig({
'**/node_modules/**',
'**/dist/**',
'src/tests/integration/**', // Exclude the entire integration test directory
'**/*.e2e.test.ts'
'**/*.e2e.test.ts',
],
// Disable file parallelism to run tests sequentially (replaces --no-threads)
fileParallelism: false,
@@ -60,10 +60,12 @@ export default defineConfig({
reporter: [
// Add maxCols to suggest a wider output for the text summary.
['text', { maxCols: 200 }],
'html', 'json'],
'html',
'json',
],
// hanging-process reporter helps identify tests that do not exit properly - comes at a high cost tho
//reporter: ['verbose', 'html', 'json', 'hanging-process'],
reportsDirectory: './.coverage/unit',
reportsDirectory: './.coverage/unit',
clean: true,
reportOnFailure: true, // This ensures the report generates even if tests fail
include: ['src/**/*.{ts,tsx}'],
@@ -71,19 +73,19 @@ export default defineConfig({
// By excluding scripts, setup files, and type definitions, we get a more accurate
// picture of the test coverage for the actual application logic.
exclude: [
'src/index.tsx', // Application entry point
'src/main.tsx', // A common alternative entry point name
'src/index.tsx', // Application entry point
'src/main.tsx', // A common alternative entry point name
'src/types.ts',
'src/tests/**', // Exclude all test setup and helper files
'src/vitest.setup.ts', // Global test setup config
'src/vitest.setup.ts', // Global test setup config
'src/**/*.test.{ts,tsx}', // Exclude test files themselves
'src/**/*.stories.{ts,tsx}', // Exclude Storybook stories
'src/**/*.d.ts', // Exclude type definition files
'src/components/icons/**', // Exclude icon components if they are simple wrappers
'src/db/seed.ts', // Database seeding script
'src/db/seed_admin_account.ts', // Database seeding script
'src/db/backup_user.ts', // Database backup script
'src/db/seed.ts', // Database seeding script
'src/db/seed_admin_account.ts', // Database seeding script
'src/db/backup_user.ts', // Database backup script
],
},
},
});
});


@@ -6,7 +6,7 @@ import viteConfig from './vite.config';
// Ensure NODE_ENV is set to 'test' for all Vitest runs.
process.env.NODE_ENV = 'test';
// 1. Separate the 'test' config (which has Unit Test settings)
// 1. Separate the 'test' config (which has Unit Test settings)
// from the rest of the general Vite config (plugins, aliases, etc.)
// DEBUG: Use console.error to ensure logs appear in CI/CD output
console.error('[DEBUG] Loading vitest.config.integration.ts...');
@@ -30,38 +30,41 @@ console.error('[DEBUG] Base vite config keys:', Object.keys(baseViteConfig));
* It MERGES with the main vite.config.ts to inherit plugins and aliases,
* then overrides the test-specific settings for a Node.js environment.
*/
const finalConfig = mergeConfig(baseViteConfig, defineConfig({
test: {
// Override settings from the main config for this specific test project.
name: 'integration',
environment: 'node',
// Point specifically to the new integration tests directory.
// This pattern will match any test file inside `src/tests/integration/`.
include: ['src/tests/integration/**/*.test.{ts,tsx}'],
// CRITICAL: We must override the `exclude` property from the base vite.config.ts.
// Otherwise, the inherited `exclude` rule will prevent any integration tests from running.
// Setting it to an empty array removes all exclusion rules for this project.
exclude: [],
// This setup script starts the backend server before tests run.
globalSetup: './src/tests/setup/integration-global-setup.ts',
// The default timeout is 5000ms (5 seconds)
testTimeout: 60000, // Increased timeout for server startup and API calls, especially AI services.
// "singleThread: true" is removed in modern Vitest.
// Use fileParallelism: false to ensure test files run one by one to prevent port conflicts.
fileParallelism: false,
coverage: {
provider: 'v8',
// We remove 'text' here. The final text report will be generated by `nyc` after merging.
reporter: ['html', 'json-summary', 'json'],
reportsDirectory: '.coverage/integration',
reportOnFailure: true, // This ensures the report generates even if tests fail
clean: true,
const finalConfig = mergeConfig(
baseViteConfig,
defineConfig({
test: {
// Override settings from the main config for this specific test project.
name: 'integration',
environment: 'node',
// Point specifically to the new integration tests directory.
// This pattern will match any test file inside `src/tests/integration/`.
include: ['src/tests/integration/**/*.test.{ts,tsx}'],
// CRITICAL: We must override the `exclude` property from the base vite.config.ts.
// Otherwise, the inherited `exclude` rule will prevent any integration tests from running.
// Setting it to an empty array removes all exclusion rules for this project.
exclude: [],
// This setup script starts the backend server before tests run.
globalSetup: './src/tests/setup/integration-global-setup.ts',
// The default timeout is 5000ms (5 seconds)
testTimeout: 60000, // Increased timeout for server startup and API calls, especially AI services.
// "singleThread: true" is removed in modern Vitest.
// Use fileParallelism: false to ensure test files run one by one to prevent port conflicts.
fileParallelism: false,
coverage: {
provider: 'v8',
// We remove 'text' here. The final text report will be generated by `nyc` after merging.
reporter: ['html', 'json-summary', 'json'],
reportsDirectory: '.coverage/integration',
reportOnFailure: true, // This ensures the report generates even if tests fail
clean: true,
},
},
}
}));
}),
);
console.error('[DEBUG] Integration Final Config - INCLUDE:', finalConfig.test?.include);
console.error('[DEBUG] Integration Final Config - EXCLUDE:', finalConfig.test?.exclude);
console.error('[DEBUG] ----------------------------------\n');
export default finalConfig;
export default finalConfig;
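The CRITICAL comment in the hunk above explains that the integration project must set `exclude: []` or the rules inherited from the base config would filter out every integration test. A toy illustration of that override, using a simplified merge where override arrays replace base arrays (this is a stand-in for the effect the config relies on, not Vite's actual `mergeConfig` implementation):

```typescript
// Toy config shape; only the two fields relevant to the diff are modeled.
interface TestConfig {
  include: string[];
  exclude: string[];
}

// Simplified merge: properties set on the override replace the base's,
// so an explicit empty array wipes out inherited exclusion rules.
function mergeTestConfig(base: TestConfig, override: Partial<TestConfig>): TestConfig {
  return { ...base, ...override };
}

const base: TestConfig = {
  include: ['src/**/*.test.{ts,tsx}'],
  exclude: ['src/tests/integration/**', '**/*.e2e.test.ts'],
};

export const integration = mergeTestConfig(base, {
  include: ['src/tests/integration/**/*.test.{ts,tsx}'],
  exclude: [], // without this, the inherited exclude would drop every integration test
});
```

Omitting the `exclude: []` line would leave `base.exclude` in force, which is exactly the failure mode the comment warns about.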


@@ -7,8 +7,11 @@
* - Integration tests are defined in `vitest.config.integration.ts` and run in a 'node' environment.
*/
export default [
// DEBUGGING LOG
((): string => { console.error('\n[DEBUG] Loading vitest.workspace.ts'); return ''; })(),
'vite.config.ts', // Defines the 'unit' test project
// DEBUGGING LOG
((): string => {
console.error('\n[DEBUG] Loading vitest.workspace.ts');
return '';
})(),
'vite.config.ts', // Defines the 'unit' test project
'vitest.config.integration.ts', // Defines the 'integration' test project
];
];