imported files from google gemini ai BUILD env

README.md

@@ -1,3 +1,432 @@
# flyer-crawler.projectium.com

# Flyer Crawler - Grocery AI Analyzer

The Google AI-based Flyer Crawler App

Flyer Crawler is a web application that uses the Google Gemini AI to extract, analyze, and manage data from grocery store flyers. Users can upload flyer images or PDFs, and the application will automatically identify items, prices, and sale dates, storing the structured data in a Supabase database for historical analysis, price tracking, and personalized deal alerts.
## Features

- **AI-Powered Data Extraction**: Upload PNG, JPG, or PDF flyers to automatically extract store names, sale dates, and a detailed list of items with prices and quantities.
- **Bulk Import**: Process multiple flyers at once with a summary report of successes, skips (duplicates), and errors.
- **Database Integration**: Securely connect to your own Supabase instance to save all extracted data, enabling long-term persistence and analysis.
- **Personalized Watchlist**: Authenticated users can create a "watchlist" of specific grocery items they want to track.
- **Active Deal Alerts**: The app highlights current sales on your watched items from all valid flyers in the database.
- **Price History Charts**: Visualize the price trends of your watched items over time.
- **Shopping List Management**: Users can create multiple shopping lists, add items from flyers or their watchlist, and track purchased items.
- **User Authentication & Management**: Secure user sign-up, login, and profile management, including a secure account deletion process.
- **Dynamic UI**: A responsive interface with dark mode and a choice between metric/imperial unit systems.

## Tech Stack

- **Frontend**: React, TypeScript, Tailwind CSS
- **AI**: Google Gemini API (`@google/genai`)
- **Backend**: Supabase (PostgreSQL, Auth, Storage, Edge Functions)
- **UI Components**: Recharts for charts

---
## Required Environment Variables & Setup

This project requires several secret keys to function. See the `env.example` file for a complete template.

- **For the Frontend (Interactive Prompt)**: `REACT_APP_SUPABASE_URL` & `REACT_APP_SUPABASE_ANON_KEY`. These are your project's **public-facing** credentials used in the browser.
- **For the AI Service**: `API_KEY`. In this development environment, this is provided for you automatically.

You can find your Supabase keys in your project's **Dashboard > Project Settings > API**.
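
For reference, a minimal sketch of what the `env.example` template might look like; the placeholder values are illustrative, and the file in the repository is the authoritative version:

```bash
# Frontend (public-facing, safe to expose in the browser)
REACT_APP_SUPABASE_URL=https://<your-project-id>.supabase.co
REACT_APP_SUPABASE_ANON_KEY=<your-anon-public-key>

# AI service (provided automatically in the Google AI BUILD environment)
API_KEY=<your-gemini-api-key>
```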

## Setup and Installation

### Step 1: Set Up Supabase Project

1. **Create a New Project**: Go to your [Supabase Dashboard](https://app.supabase.com/) and create a new project.
2. **Run the Database Schema**:
   - Navigate to the **SQL Editor** in your Supabase dashboard.
   - Open `sql/schema.sql.txt`, copy its entire contents, and paste them into the SQL Editor.
   - Click **"Run"**. This creates all necessary tables, relationships, and security policies.

### Step 2: Configure and Deploy Backend Edge Functions

The application uses secure Supabase Edge Functions for sensitive operations. You must deploy them using the Supabase CLI.

1. **Install the Supabase CLI**:

   ```bash
   npm install supabase --save-dev
   ```

2. **Log in and Link Your Project**:

   ```bash
   # Log in to the Supabase CLI
   npx supabase login

   # Link your project (get <project-id> from your Supabase URL)
   npx supabase link --project-ref <project-id>
   ```

3. **Create a Shared CORS File**: All functions need a shared file for CORS headers.
   - Create a folder path: `supabase/functions/_shared`.
   - Inside that folder, create a file named `cors.ts` and paste the code from **PART 7.1** below.

4. **Create and Deploy the Functions**:
   - For each of the three functions (`system-check`, `delete-user`, `seed-database`):
     1. Run `npx supabase functions new <function-name>`.
     2. Copy the corresponding code from **PART 7.2**, **7.3**, or **7.4** into the new `supabase/functions/<function-name>/index.ts` file.
   - Deploy all three functions to your project:

     ```bash
     npx supabase functions deploy system-check
     npx supabase functions deploy delete-user
     npx supabase functions deploy seed-database
     ```

### Step 3: Connect and Run the Frontend

1. **Run the Application**: This project is configured to run in an environment where dependencies are managed automatically.
2. **Connect to Database**: On first launch, the app will prompt you for your Supabase credentials. Enter your **`Supabase URL`** and your **`Supabase Anon Key`** (the public `anon` key). The app saves these in your browser's local storage (see the sketch after this list).
3. **Follow the System Check**: The **System Check** panel will now guide you through verifying your setup. If any steps fail, it will provide instructions.
4. **Seed Dev Users**: If the "Seeded Users" check fails, a **"Seed Dev Users"** button will appear. Click it to create the initial `admin@example.com` and `user@example.com` accounts. The check will re-run automatically and should now pass.
5. **Start Using**: You can now upload flyers or log in with `admin@example.com` (password: `password123`).
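
As referenced in step 2, here is a rough sketch of how the frontend might persist and reuse the credentials it asks for. The storage keys and helper names are illustrative assumptions, not the app's actual identifiers:

```ts
import { createClient, SupabaseClient } from '@supabase/supabase-js';

// Hypothetical localStorage keys -- the real app may use different names.
const URL_KEY = 'supabase_url';
const ANON_KEY = 'supabase_anon_key';

// Save the values entered in the connection prompt.
export function saveSupabaseCredentials(url: string, anonKey: string): void {
  localStorage.setItem(URL_KEY, url);
  localStorage.setItem(ANON_KEY, anonKey);
}

// Recreate the client on later visits from the stored credentials.
export function createSupabaseClientFromStorage(): SupabaseClient | null {
  const url = localStorage.getItem(URL_KEY);
  const anonKey = localStorage.getItem(ANON_KEY);
  if (!url || !anonKey) return null; // Not connected yet -- show the prompt.
  return createClient(url, anonKey);
}
```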

---

-- ============================================================================
-- PART 7: EDGE FUNCTIONS DOCUMENTATION
-- The following sections contain the TypeScript code for the required
-- Supabase Edge Functions.
-- ============================================================================
-- PART 7.1: SHARED CORS HELPER
-- File: supabase/functions/_shared/cors.ts
-- ============================================================================
```ts
export const corsHeaders = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
};
```

-- ============================================================================
-- PART 7.2: SYSTEM CHECK FUNCTION
-- File: supabase/functions/system-check/index.ts
-- ============================================================================
```ts
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2';
import { corsHeaders } from '../_shared/cors.ts';

// Helper function to create a Supabase admin client
const createAdminClient = () => createClient(
  Deno.env.get('SUPABASE_URL')!,
  Deno.env.get('SUPABASE_SERVICE_ROLE_KEY')!,
  { auth: { autoRefreshToken: false, persistSession: false } }
);

// Verify that all required tables exist via the check_schema RPC.
const checkDatabaseSchema = async (client) => {
  const { data, error } = await client.rpc('check_schema');
  if (error) throw new Error(`Schema check failed: ${error.message}`);
  if (!data) throw new Error('Schema check returned no data.');

  const requiredTables = ['flyers', 'flyer_items', 'profiles', 'shopping_lists'];
  const missingTables = requiredTables.filter(t => !data.tables.includes(t));
  if (missingTables.length > 0) {
    return { pass: false, message: `Missing tables: ${missingTables.join(', ')}. Please run the full schema.sql.txt script.` };
  }
  return { pass: true, message: 'All required tables exist.' };
};

// Verify that the key row-level security policies are in place.
const checkRlsPolicies = async (client) => {
  const { data, error } = await client.rpc('check_rls');
  if (error) throw new Error(`RLS check failed: ${error.message}`);

  const requiredPolicies = {
    'profiles': 'Users can update their own profile.',
    'shopping_lists': 'Users can manage their own shopping lists.',
    'flyer_items': `Allow anon users to manage test items`
  };

  for (const table in requiredPolicies) {
    const policyName = requiredPolicies[table];
    if (!data.some(p => p.table_name === table && p.policy_name === policyName)) {
      return { pass: false, message: `Missing RLS policy "${policyName}" on table "${table}". Please run the schema.sql.txt script.` };
    }
  }
  return { pass: true, message: 'Key RLS policies are in place.' };
};

// Verify that the handle_new_user trigger function is configured securely.
const checkUserCreationTrigger = async (client) => {
  const { data, error } = await client.rpc('check_trigger_security');
  if (error) throw new Error(`Trigger check failed: ${error.message}`);

  if (!data || data.length === 0) {
    return { pass: false, message: `Function 'handle_new_user' not found. Please run schema.sql.txt.` };
  }

  const handleNewUser = data[0];
  if (!handleNewUser.is_security_definer) {
    return { pass: false, message: `Trigger function 'handle_new_user' must be 'SECURITY DEFINER'. Please run the schema.sql.txt script to fix it.` };
  }
  if (handleNewUser.owner_role !== 'postgres') {
    return { pass: false, message: `Trigger function 'handle_new_user' must be owned by 'postgres'. Yours is owned by '${handleNewUser.owner_role}'. Please run schema.sql.txt to fix it.` };
  }
  return { pass: true, message: 'User creation trigger is correctly configured.' };
};

// Verify that the public 'flyers' storage bucket exists and is public.
const checkStorageBucket = async (client) => {
  const { data, error } = await client.storage.getBucket('flyers');
  if (error) {
    return { pass: false, message: `Failed to access 'flyers' bucket: ${error.message}. Ensure it exists and permissions are set.` };
  }
  if (!data.public) {
    return { pass: false, message: `Storage bucket 'flyers' is not public. Please enable public access in your Supabase dashboard.` };
  }
  return { pass: true, message: "'flyers' bucket exists and is public." };
};

// Entry point: run all checks with the admin client and return aggregated results.
Deno.serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders });
  }

  try {
    const adminClient = createAdminClient();
    const results = {};

    results['schema'] = await checkDatabaseSchema(adminClient);
    results['rls'] = await checkRlsPolicies(adminClient);
    results['trigger'] = await checkUserCreationTrigger(adminClient);
    results['storage'] = await checkStorageBucket(adminClient);

    return new Response(JSON.stringify({ results }), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      status: 200,
    });
  } catch (error) {
    return new Response(JSON.stringify({ error: error.message, stack: error.stack }), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      status: 500,
    });
  }
});
```

-- ============================================================================
-- PART 7.3: DELETE USER FUNCTION
-- File: supabase/functions/delete-user/index.ts
-- ============================================================================
```ts
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2';
import { corsHeaders } from '../_shared/cors.ts';

Deno.serve(async (req) => {
  // Handle preflight OPTIONS request for CORS
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders });
  }

  try {
    const { password } = await req.json();
    if (!password) {
      throw new Error('Password is required.');
    }

    // Create a Supabase client with the user's authentication token
    const authHeader = req.headers.get('Authorization');
    if (!authHeader) {
      throw new Error('Missing authorization header.');
    }

    const userSupabaseClient = createClient(
      Deno.env.get('SUPABASE_URL')!,
      Deno.env.get('SUPABASE_ANON_KEY')!,
      { global: { headers: { Authorization: authHeader } } }
    );

    // Get the user from the token
    const { data: { user }, error: userError } = await userSupabaseClient.auth.getUser();
    if (userError || !user) {
      return new Response(JSON.stringify({ error: userError?.message || 'User not authenticated.' }), {
        status: 401,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      });
    }

    // Verify the user's password by attempting to sign in
    const { error: signInError } = await userSupabaseClient.auth.signInWithPassword({
      email: user.email!,
      password: password,
    });

    if (signInError) {
      return new Response(JSON.stringify({ error: 'Invalid password.' }), {
        status: 403,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      });
    }

    // If the password is correct, create an admin client with the service_role key
    const adminSupabaseClient = createClient(
      Deno.env.get('SUPABASE_URL')!,
      Deno.env.get('SUPABASE_SERVICE_ROLE_KEY')!,
      { auth: { autoRefreshToken: false, persistSession: false } }
    );

    // Delete the user
    const { error: deleteError } = await adminSupabaseClient.auth.admin.deleteUser(user.id);
    if (deleteError) {
      throw deleteError;
    }

    return new Response(JSON.stringify({ message: 'User deleted successfully.' }), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      status: 200,
    });

  } catch (error) {
    // Return a detailed error with a stack trace for better debugging.
    return new Response(JSON.stringify({ error: error.message, stack: error.stack }), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      status: 500,
    });
  }
});
```

-- ============================================================================
-- PART 7.4: SEED DATABASE FUNCTION
-- File: supabase/functions/seed-database/index.ts
-- ============================================================================
```ts
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2';
import { corsHeaders } from '../_shared/cors.ts';

Deno.serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders });
  }

  try {
    // We create an admin client using the service_role key to perform elevated actions.
    // This key is automatically provided by Supabase in the production environment.
    const adminSupabaseClient = createClient(
      Deno.env.get('SUPABASE_URL')!,
      Deno.env.get('SUPABASE_SERVICE_ROLE_KEY')!,
      { auth: { autoRefreshToken: false, persistSession: false } }
    );

    const usersToSeed = [
      {
        email: 'admin@example.com',
        password: 'password123',
        user_metadata: { full_name: 'Admin User' }
      },
      {
        email: 'user@example.com',
        password: 'password123',
        user_metadata: { full_name: 'Normal User' }
      }
    ];

    const createdUsers = [];
    const existingUsers = [];

    const { data: { users: existingUserList }, error: listError } = await adminSupabaseClient.auth.admin.listUsers();
    if (listError) throw listError;

    const existingEmails = new Set(existingUserList.map(u => u.email));

    for (const user of usersToSeed) {
      if (!existingEmails.has(user.email)) {
        const { error } = await adminSupabaseClient.auth.admin.createUser({
          email: user.email,
          password: user.password,
          user_metadata: user.user_metadata,
          email_confirm: true, // Auto-confirm for dev environment
        });

        if (error) {
          throw new Error(`Failed to create user ${user.email}: ${error.message}`);
        }
        createdUsers.push(user.email);
      } else {
        existingUsers.push(user.email);
      }
    }

    let message = '';
    if (createdUsers.length > 0) {
      message += `Successfully created users: ${createdUsers.join(', ')}. `;
    }
    if (existingUsers.length > 0) {
      message += `Users already existed: ${existingUsers.join(', ')}.`;
    }
    if (message === '') {
      message = 'All development users already exist.';
    }

    return new Response(JSON.stringify({ message: message.trim() }), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      status: 200,
    });

  } catch (error) {
    // Return a detailed error with a stack trace for better debugging.
    return new Response(JSON.stringify({ error: error.message, stack: error.stack }), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      status: 500,
    });
  }
});
```

---

2025-11-08 - from Google AI BUILD env (Gemini)

**Prompt:** OK, time to move this project from the Google AI BUILD environment to a proper build and deploy process using git, etc. Can you make a list of what the overall project's progress has been so far, where we are with the overall goals, what steps are still incomplete, and what areas were started but not finished?

**Answer:**
Of course. Moving from a rapid-prototyping environment to a structured build process is a critical and exciting step for any project. It's the perfect time to take stock of our progress.

Here is a comprehensive overview of the Flyer Crawler project's current status, its alignment with our goals, and a roadmap for what lies ahead.

### Project Status & Next Steps Report

#### 1. Project Goal Recap

Our primary objective is to create a powerful tool that transforms static, often confusing grocery store flyers into structured, actionable data. By leveraging AI, we aim to help users save money, plan meals, and streamline their shopping through features like price tracking, deal alerts, and smart shopping lists.

#### 2. Current Progress: Implemented Features (What's Done & Working)

We have successfully built a robust Minimum Viable Product (MVP) that validates the core concept. The application is functional end-to-end.

**Core AI Pipeline:**

- **Multi-Format Upload**: Users can upload single images (PNG, JPG) and multi-page PDFs. The system automatically converts PDFs into images for the AI.
- **Intelligent Data Extraction**: The Gemini-powered backend can reliably extract:
  - Store name & address
  - Flyer validity dates (from/to)
  - A detailed list of all sale items
- **Granular Item Analysis**: For each item, the AI extracts the display price, quantity/deal info (e.g., "per lb", "2 for $5"), and calculates a standardized `unit_price` and a normalized `price_in_cents` for comparison (see the sketch after this list).
- **Canonical Item Matching**: A critical feature: the AI matches flyer items (e.g., "Dole Bananas") to a canonical entry in our `master_grocery_items` table (e.g., "bananas"), which powers all personalization features.
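
To make the `unit_price` / `price_in_cents` normalization mentioned above concrete, here is a small illustrative sketch. The function name and the deal formats it handles are hypothetical; in the real pipeline this normalization is produced by the AI extraction step rather than a hand-written parser:

```ts
// Hypothetical helper: normalize a flyer deal string into a per-unit price in cents.
// Only two common patterns are sketched here.
export function priceInCents(displayPrice: number, dealText: string): number {
  // "2 for $5" style deals: divide the total by the quantity.
  const multiBuy = dealText.match(/(\d+)\s*for\s*\$?(\d+(?:\.\d+)?)/i);
  if (multiBuy) {
    const quantity = Number(multiBuy[1]);
    const total = Number(multiBuy[2]);
    return Math.round((total / quantity) * 100);
  }
  // Default: treat the display price as the per-unit price (e.g., "per lb").
  return Math.round(displayPrice * 100);
}

// Example: priceInCents(5.00, "2 for $5") === 250
```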

**Backend & Data Persistence (Supabase):**

- **Full Integration**: The application is fully backed by Supabase for the database (PostgreSQL), file storage, and authentication.
- **Durable Storage**: Processed flyers are saved to the database, and their images are uploaded to Supabase Storage, creating a permanent, queryable history.
- **Duplicate Prevention**: A checksum is generated for each uploaded file to prevent processing the same flyer twice (see the sketch after this list).
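
A minimal sketch of how such a checksum could be computed in the browser with the Web Crypto API; the helper name is hypothetical, and the app may hash the file differently:

```ts
// Hypothetical sketch: compute a SHA-256 checksum for an uploaded flyer file
// so it can be compared against flyers already stored in the database.
export async function fileChecksum(file: File): Promise<string> {
  const buffer = await file.arrayBuffer();
  const digest = await crypto.subtle.digest('SHA-256', buffer);
  // Convert the digest bytes to a hex string for easy storage and comparison.
  return Array.from(new Uint8Array(digest))
    .map((byte) => byte.toString(16).padStart(2, '0'))
    .join('');
}
```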

**User Features & Personalization:**

- **Full Authentication**: Complete user auth system with email/password, OAuth (Google/GitHub), password reset, and secure account deletion via Edge Functions.
- **User Profiles & Preferences**: Users can manage their name and preferences (dark mode, metric/imperial units), which are saved to their profile.
- **The Watchlist**: Authenticated users can create a personalized list of items they want to track. The UI clearly highlights these items in any flyer.
- **Active Deals Panel**: A key feature that scans all current, valid flyers in the database and shows a user the best deals on their specific watched items.
- **Shopping Lists**: Users can create and manage multiple shopping lists, add items from flyers or their watchlist, and check them off as "purchased."
- **AI Analysis Panel**: Provides three levels of AI-driven analysis of a selected flyer's contents: Quick Insights, a Deep-Dive meal plan, and a Web Search for recipes/info, complete with source links.

**Development & Onboarding:**

- **Comprehensive Setup Guide**: The `README.md` and `schema.sql.txt` provide a complete guide for a developer to set up the entire Supabase backend from scratch.
- **Automated System Check**: A `SystemCheck` component automatically verifies a developer's backend configuration (schema, RLS policies, Edge Functions) and provides actionable feedback, streamlining setup.
- **Database Seeder**: A one-click tool to seed the database with initial development users.

#### 3. Current State vs. Overall Goals

The project is in an excellent state. The core loop of Upload -> Process -> View -> Personalize is complete and stable. We have successfully de-risked the most technically challenging part of the project: using a large language model to accurately parse unstructured visual data and link it to a relational database.

The current application serves as a powerful proof of concept and a solid foundation for the more advanced features outlined in the database schema.

#### 4. Incomplete Features (What Hasn't Been Started)

The database schema was designed with a larger vision in mind. The following features are fully designed on the backend but have no corresponding UI or logic implemented yet:

- **User Alerts & Notifications**: The `user_alerts` and `notifications` tables exist, but the mechanism to trigger them (e.g., a nightly cron job that checks for new deals on watched items) has not been built. This is the next major feature needed to make the app proactive.
- **Full Recipe & Meal Planning System**: The schema supports recipes, ingredients, menu plans, and a user's pantry. However, there is no UI for users to:
  - Create or import recipes.
  - Build a weekly `menu_plan`.
  - Manage their `pantry_items`.
  - Generate a shopping list from a meal plan while subtracting pantry items (the database function for this, `generate_shopping_list_for_menu_plan`, exists but is not used).
- **Crowdsourced Data Quality**: The schema includes tables for `suggested_corrections` and `user_submitted_prices`. The UI to allow users to submit corrections or report in-store prices has not been created. An admin dashboard to review and approve these submissions would also be needed.
- **Location-Based Services**: The database can store detailed `store_locations` with geographic coordinates, but the app does not currently use this. It doesn't request the user's location to find nearby flyers or stores.

#### 5. Partially Completed Areas (Started but Unfinished)

These are areas where a foundation exists but more work is required before they can be considered production-ready.

- **Price History Chart**: The `PriceHistoryChart` component is built, and the backend has a complex trigger (`update_price_history_on_flyer_item_insert`) to populate the summary table. However, this is a complex data aggregation process that needs extensive testing with real-world scenarios (including flyer deletions and corrections) to ensure accuracy.
- **Scalability of AI Processing**: Currently, all processing happens in the user's browser and blocks the UI. For a production application, this should be moved to a background job. A better architecture would be: Upload to Storage -> Trigger Supabase Edge Function -> Function calls Gemini -> Function writes results to DB. This makes the frontend non-blocking and more scalable (see the sketch after this list).
- **AI Matching Robustness**: The AI's ability to match items to the `master_grocery_items` list is good but can always be improved. This will require ongoing prompt engineering and testing with a wider variety of flyers from different stores and regions. The `master_item_aliases` table, designed for this, is currently unused.
- **Formal Testing**: The project lacks a formal testing suite (e.g., Vitest, Jest, React Testing Library). While the `SystemCheck` is great for setup, it's not a substitute for unit and integration tests to ensure code quality and prevent regressions.
- **Accessibility & Advanced Responsiveness**: The app is functional on mobile, but it would benefit from a dedicated pass to ensure all components are fully responsive, accessible, and navigable via keyboard, adhering to WCAG standards.
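
As referenced in the "Scalability of AI Processing" item, here is a rough skeleton of what a background-processing Edge Function could look like. It reuses the patterns from PART 7 above, but the function itself is an assumption: the request shape and the table/column names are illustrative, and the Gemini call is left as a placeholder rather than a real API invocation:

```ts
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2';
import { corsHeaders } from '../_shared/cors.ts';

// Hypothetical "process-flyer" function: the frontend uploads the file to Storage,
// then invokes this function with the storage path instead of blocking the UI.
Deno.serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders });
  }

  try {
    const { storagePath } = await req.json(); // e.g. "flyers/<uuid>.png" (illustrative)

    const adminClient = createClient(
      Deno.env.get('SUPABASE_URL')!,
      Deno.env.get('SUPABASE_SERVICE_ROLE_KEY')!,
      { auth: { autoRefreshToken: false, persistSession: false } }
    );

    // Download the uploaded flyer image from the 'flyers' bucket.
    const { data: file, error: downloadError } = await adminClient.storage
      .from('flyers')
      .download(storagePath);
    if (downloadError || !file) throw downloadError ?? new Error('Download failed.');

    // TODO: send `file` (a Blob of the flyer image) to the Gemini API and parse
    // the structured flyer data from the response -- omitted in this sketch.
    const extracted = { store_name: 'TODO' }; // placeholder for the parsed result

    // Write the results to the database (column names here are assumptions).
    const { error: insertError } = await adminClient
      .from('flyers')
      .insert({ store_name: extracted.store_name, image_path: storagePath });
    if (insertError) throw insertError;

    return new Response(JSON.stringify({ status: 'processed', storagePath }), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      status: 200,
    });
  } catch (error) {
    return new Response(JSON.stringify({ error: error.message }), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      status: 500,
    });
  }
});
```

A Storage webhook or a small job-queue table could trigger this function instead of a direct call from the browser; either way, the UI would only wait for the upload itself.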

By addressing these areas, we can transition Flyer Crawler from a powerful MVP into a polished, scalable, and feature-rich production application.