
AI-Based Job Position Watching from Company Career Pages (PoC) - Part 1


As an unemployed mid‑career software engineer looking for a new role in 2026, I often find myself repeatedly checking the career pages of companies I am interested in. This process is time‑consuming and tedious, especially when tracking openings across multiple companies. While many career sites provide email‑based job alerts, these alerts are usually driven by opaque AI matching against a previously submitted CV, or by simple keyword matching. In both cases, I have very limited control over the actual matching criteria.

To address this problem, I decided to leverage an AI agent to automatically watch the career pages of my target companies and notify me when new job postings that match my own criteria are published. In this article, I present a proof of concept (PoC) that demonstrates this idea.

In this PoC, I show how to use an AI agent to watch a company’s career page for new software development roles. The agent navigates the job portal, searches for relevant positions, extracts structured information, and stores the results in a SQLite database for later querying and tracking.

PoC

Job Position Data Source

For this experiment, I selected a company whose career page is built on the Eightfold AI platform. If your target company also uses Eightfold AI, you can adapt this PoC with minimal changes.

Eightfold AI is a talent intelligence platform that uses artificial intelligence for hiring, retention, and workforce development. It matches candidates to open positions based on skills and experience and is used by over 100 companies, including Vodafone, Morgan Stanley, and Chevron. The platform is available in more than 155 countries.

Although the Eightfold AI platform itself provides AI‑based job alert subscriptions, I wanted finer‑grained control over the matching logic and the collected data, which motivated this custom solution.

Agent Design

I implemented this PoC inside the VS Code Copilot Chat environment, using the following tools and prompts.

MCP Tools

  • Browser tool (browsermcp): used to navigate and interact with the job portal web pages.
  • SQLite database tool (genai-toolbox): used to persist extracted job posting data.

.vscode/mcp.json:

```json
{
  "servers": {
    "browsermcp": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "@browsermcp/mcp@latest"
      ]
    },
    "sqlite": {
      "command": "~/genai-toolbox/toolbox",
      "args": [
        "--prebuilt",
        "sqlite",
        "--stdio"
      ],
      "env": {
        "SQLITE_DATABASE": "~/jobs/jobs.db"
      }
    }
  }
}
```

Database Schema

```sql
CREATE TABLE IF NOT EXISTS xyz_company_jobs (
    job_id TEXT PRIMARY KEY, -- Unique identifier for the job position
    req_id TEXT,             -- Requisition ID
    job_title TEXT,          -- Job title
    location TEXT,
    date_posted TEXT,        -- Format: 'YYYY-MM-DD'
    business_department TEXT,
    job_description_url TEXT,
    job_description TEXT     -- Main job description content
);
```
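
Once the agent has populated this table, it can be queried directly, for example with the sqlite3 CLI. The queries below are only illustrative examples of the "later querying and tracking" use case; they are not part of the agent prompt, and the keyword filter is just a placeholder.

```sql
-- Positions discovered during the last 7 days, newest first.
SELECT job_title, location, date_posted, job_description_url
FROM xyz_company_jobs
WHERE date_posted >= date('now', '-7 days')
ORDER BY date_posted DESC;

-- Rough keyword filter on the stored data (placeholder keywords).
SELECT job_title, location, date_posted
FROM xyz_company_jobs
WHERE job_description LIKE '%Kubernetes%'
   OR job_title LIKE '%Backend%';
```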

Agent Prompt

````
You are an AI agent with access to a browser tool and a SQLite database tool. Your task is to collect software‑development‑related job positions from the XYZ_Company job portal and store the extracted information in a SQLite database. The page currently open in the browser is the XYZ_Company job search portal.

Database:
```sql
CREATE TABLE IF NOT EXISTS xyz_company_jobs (
    job_id TEXT PRIMARY KEY, -- Unique identifier for job position
    req_id TEXT,             -- Req ID
    job_title TEXT,          -- Job title
    location TEXT,
    date_posted TEXT,        -- Format: 'YYYY-MM-DD'
    business_department TEXT,
    job_description_url TEXT,
    job_description TEXT     -- Job description main content
)
```

Rules:
- After calling a `click()` action, you must wait 10 seconds for the page to load completely, then call `snapshot` to capture the current page state.
- Before inserting a new record into the `xyz_company_jobs` table, check whether the `job_id` already exists to avoid duplicates.
- When generating INSERT SQL statements, ensure that single quotes in values are properly escaped.
- Do not collect or click job positions under `document > main > group Similar Position` sections.

User Prepared Environment:
- The web page opened on the browser is a job search portal of XYZ_Company.

Setup:
1. Create the `xyz_company_jobs` table if it does not already exist. When executing SQL, preserve comment lines in the SQL code block.


Procedure:
1. Each button with text matching the pattern "$Job_Title$ posted since $time_since_publication$" represents a job position. Calculate `Date Posted` as `today - time_since_publication`.
2. Click a job position button to open the job description page. The resulting page URL is the `Job Description URL`.
3. From each job description page, extract: Job Title, Location, Job ID, Business Department (optional), Req ID, and Job Description Main Content.
4. Only collect job positions related to software development.
5. Insert each collected job into the `xyz_company_jobs` table, ensuring no duplicates based on `job_id`.
6. Click the `More job` button to load additional positions if needed.
7. Collect and store at least 10 job positions.
````
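
To make the duplicate and quoting rules concrete, this is the kind of INSERT statement I expect the agent to generate. All values below are made up for illustration; `INSERT OR IGNORE` leans on the `job_id` primary key and achieves the same effect as checking for the id first.

```sql
-- Hypothetical example record; note the doubled single quote ('')
-- used to escape apostrophes inside the description text.
INSERT OR IGNORE INTO xyz_company_jobs (
    job_id, req_id, job_title, location, date_posted,
    business_department, job_description_url, job_description
) VALUES (
    '123456', 'R-7890', 'Senior Software Engineer', 'Berlin, Germany', '2026-01-10',
    'Engineering', 'https://careers.example.com/job/123456',
    'You''ll design, build, and operate the company''s backend services...'
);
```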

Run

  1. Open a Chrome tab with the XYZ_Company job search portal and activate the Browser MCP Chrome extension on that tab.
  2. Start a new Copilot Chat session in VS Code with the above MCP configuration and prompt.

Summary

With the above setup, I successfully implemented an AI agent that automatically watches the career page of a target company for new software development job postings. The agent navigates the job portal, identifies relevant positions, extracts structured data, and stores it in a SQLite database for convenient access and long‑term tracking.

Future Work

This PoC can be extended in several directions:

  • Introduce more sophisticated matching logic, such as CV parsing and semantic skill matching.
  • Export collected job postings as an RSS feed or email digest, enabling a fully self‑hosted job alert system.
  • Add a notification mechanism to alert me immediately when new matching positions are discovered; a minimal sketch of this idea follows below.
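
For the last two items, one lightweight option that stays entirely inside SQLite is a small bookkeeping table of already-reported positions. The snippet below is only a sketch of that idea; the `xyz_company_jobs_notified` table is hypothetical and not part of the PoC above.

```sql
-- Hypothetical bookkeeping table: job_ids that have already been reported.
CREATE TABLE IF NOT EXISTS xyz_company_jobs_notified (
    job_id TEXT PRIMARY KEY
);

-- Newly discovered, not-yet-reported positions; each row would become an
-- RSS item or an entry in an email digest.
SELECT j.job_id, j.job_title, j.location, j.date_posted, j.job_description_url
FROM xyz_company_jobs AS j
LEFT JOIN xyz_company_jobs_notified AS n ON n.job_id = j.job_id
WHERE n.job_id IS NULL
ORDER BY j.date_posted DESC;

-- Mark everything as reported once the notification has been sent.
INSERT OR IGNORE INTO xyz_company_jobs_notified (job_id)
SELECT job_id FROM xyz_company_jobs;
```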