Private Projects
Personal work built outside of office hours
Norwegian AI Jobs Tracker
aivarsel.netlify.app ↗
A full-stack web application that automatically tracks and analyzes AI-related job postings in the Norwegian market.
What it does
- Every morning at 06:00 UTC, a scheduled serverless function fetches all new job postings published that day from NAV (Norway's official labor registry) via their authenticated feed API.
- Each job is sent to an LLM for classification — the model reads the job title, full description, and occupational category, then determines whether the role is AI-related and assigns it a type (engineer, analyst, researcher, manager, consultant).
- Results are stored in a Postgres database and aggregated into daily snapshots tracking counts, top employers, and locations.
- The public dashboard displays trend charts of new AI jobs per day, a breakdown by role type, top hiring companies, top municipalities, and a live feed of the most recent active listings.
Key implementation details
- Feed pagination uses a persistent cursor stored in the database, allowing the daily sync to resume exactly where it left off across invocations.
- Job details are fetched in parallel from the NAV detail endpoint to retrieve full descriptions before classification.
- HTML is stripped from job descriptions before sending to the LLM.
- Inactive listings are flagged promptly in the database per NAV's terms of use.
- A separate pipeline monitoring page provides real-time visibility into sync status, classification counts, and a live table of processed jobs with AI verdicts and reasoning.
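The cursor-based resume logic might look roughly like this Python sketch (the production code is TypeScript; `fetch_page`, `load_cursor`, and `save_cursor` are hypothetical stand-ins for the feed client and the database layer):

```python
# Minimal sketch of resumable cursor pagination. The cursor is persisted
# after every page, so an interrupted sync resumes where it left off.
def sync_feed(fetch_page, load_cursor, save_cursor):
    """Walk the feed from the stored cursor, persisting progress per page."""
    cursor = load_cursor()          # None on the very first run
    processed = []
    while True:
        page = fetch_page(cursor)   # {"items": [...], "next": cursor-or-None}
        processed.extend(page["items"])
        cursor = page["next"]
        save_cursor(cursor)         # persist so a crash resumes here
        if cursor is None:
            return processed
```

Saving the cursor per page rather than per run is what lets the daily sync span multiple serverless invocations without re-fetching or skipping postings.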
Tech stack
React
Vite
Recharts
Tailwind CSS
Netlify
Serverless Functions
TypeScript / ESM
Supabase (PostgreSQL)
Ollama Cloud API
NAV Stilling Feed API
DIY Robot Arm — YOLOv8 + Hailo-8
github ↗
A real-time computer vision pipeline that lets a desktop robot arm autonomously detect and track cubes using a custom-trained neural network accelerated by a dedicated NPU.
What it does
- A camera feed is processed by a YOLOv8s model running on a Hailo-8 M.2 NPU attached to a Raspberry Pi 5, achieving 15–30 FPS versus 1–2 FPS on the Pi CPU alone.
- Detected cube positions are mapped from pixel coordinates to robot workspace XYZ via homography and RBF interpolation, then sent as G-code movement commands to an Arduino-controlled 4-DOF arm.
- A three-thread architecture separates frame capture, NPU inference, and command dispatch to sustain real-time throughput.
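The pixel-to-workspace mapping can be illustrated with the homography step alone. This is a minimal sketch assuming a pre-calibrated 3x3 matrix `H`; the actual pipeline refines the planar mapping with RBF interpolation and supplies the Z coordinate separately.

```python
# Illustrative pixel -> workspace mapping via a 3x3 homography H
# (assumed pre-calibrated from known reference points).
def apply_homography(H, u, v):
    """Map pixel (u, v) to planar workspace (x, y) via homogeneous coords."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return x / w, y / w   # divide out the projective scale
```

The homography gives a good global fit for a flat workspace; the RBF layer then corrects local residual error from lens distortion and imperfect calibration.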
Key implementation details
- Model trained on 119 annotated images (80/20 split), achieving 91.7% mAP50 after 100 epochs; converted from PyTorch → ONNX → HAR → HEF for the Hailo runtime.
- INT8 quantization initially collapsed the class-score heads to near-zero outputs due to saturated zero-points; the fix was calibrating with random uniform noise instead of real images.
- Post-processing (DFL decode, sigmoid, NMS) runs on the Pi CPU while the Hailo-8 handles the neural network backbone and detection heads.
- Commands are forwarded from the Pi to a Windows host over TCP, then relayed to the Arduino via serial at 115200 baud.
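The CPU-side post-processing can be sketched as follows. This is a simplified illustration: the real decode also includes the DFL step, and the box format and IoU threshold here are assumptions.

```python
# Simplified sketch of the CPU-side post-processing: sigmoid on class
# scores and greedy NMS over (x1, y1, x2, y2, score) boxes.
import math

def sigmoid(x: float) -> float:
    """Squash a raw class logit into a 0..1 confidence."""
    return 1.0 / (1.0 + math.exp(-x))

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def nms(boxes, iou_thresh=0.5):
    """Greedy NMS: keep highest-score boxes, drop heavy overlaps."""
    boxes = sorted(boxes, key=lambda b: b[4], reverse=True)
    kept = []
    for b in boxes:
        if all(iou(b, k) < iou_thresh for k in kept):
            kept.append(b)
    return kept
```

Running only this light decode on the Pi CPU keeps the Hailo-8 free to stream backbone and head inference at full rate.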
Tech stack
Raspberry Pi 5
Hailo-8 NPU
YOLOv8
OpenCV
picamera2
SciPy RBFInterpolator
Arduino / G-code
Python