April 29, 2026
Introducing PerifEye — AI-powered home inventory from walkthrough videos
I spent 45 minutes looking for a MacBook charger in my own house. So I built an AI that catalogs everything you own from a walkthrough video.
The problem
Three months ago, I spent 45 minutes looking for a spare MacBook charger. I have a 3-bedroom house — this shouldn't be hard. I knew I had one. I just had no idea where it was.
That frustration sent me down a rabbit hole. Insurance companies estimate the average household contains 300,000 items. Most people can't name 100 of them. We have better inventory systems for warehouses than we do for the places we actually live.
Home inventory apps exist — dozens of them. But they all share the same fatal flaw: manual data entry. Photograph each item. Type a description. Assign a room. Nobody does this. It's a non-starter for 99% of households.
The insight
You already walk through your home every day. Your phone camera is in your pocket. What if building a home inventory required nothing more than that — a walkthrough video?
Video changes the equation completely. Instead of photographing 2,000 individual items, you record a 5-minute walkthrough per room. AI handles the rest: object detection, room mapping, frame deduplication. What would take a human 20+ hours becomes a 30-minute video task.
How PerifEye works
- Record. Walk through each room with your phone camera. 5 minutes per room. Pan slowly — the AI needs clear frames.
- Upload. Send the video. Our computer vision pipeline processes every frame, identifies objects, maps them to rooms, and flags potential repair issues.
- Verify. A human operator reviews the AI output before you see it. This catches edge cases that pure ML misses: mirrors reflecting objects, pets mistaken for furniture, partially occluded items.
- Search. You get a searchable dashboard. Find anything, see where it was last spotted, and get a running repair punch list that updates on every walkthrough.
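The steps above can be sketched as a minimal pipeline. This is an illustrative sketch, not PerifEye's actual implementation: `detect` and `classify_room` are stand-ins for the real models, and the `Detection` fields are assumed names.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "macbook_charger"
    room: str          # e.g. "bedroom_2"
    frame_index: int   # which frame the item was spotted in
    confidence: float
    verified: bool = False  # flipped to True after operator review

def process_walkthrough(frames, detect, classify_room):
    """Run detection on each frame and tag results with the current room.

    `detect(frame)` yields (label, confidence) pairs;
    `classify_room(frame)` returns a room name. Both are stand-ins
    for the object-detection and room-classification models.
    """
    inventory = []
    for i, frame in enumerate(frames):
        room = classify_room(frame)
        for label, conf in detect(frame):
            inventory.append(Detection(label, room, i, conf))
    return inventory
```

Deduplication and repair-issue flagging would run as later stages over this list; the key point is that every detection carries its room and frame, so "where it was last spotted" is just a lookup.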
Why human-in-the-loop matters
Pure AI object detection makes mistakes. It confuses a black sweater with a cat. It misses small items behind furniture. It can't read labels reliably on the first pass.
We added a human operator review step — think of it like a radiology workflow: AI does the first pass (fast, comprehensive), a human verifies (accurate, trustworthy). This takes about 2 minutes per room and catches the edge cases that would erode trust in the system.
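In code, the review step is just a merge of operator verdicts over AI output. A minimal sketch, assuming detections are dicts and corrections map a detection's index to either a corrected label or a rejection (names and shapes are hypothetical):

```python
def apply_operator_review(detections, corrections):
    """Merge operator verdicts into AI detections.

    `corrections` maps a detection's index to either a corrected
    label (str) or None to reject it outright (e.g. a reflection in
    a mirror, or a pet mislabeled as furniture). Detections the
    operator didn't touch pass through unverified.
    """
    reviewed = []
    for i, det in enumerate(detections):
        if i in corrections:
            if corrections[i] is None:
                continue  # operator rejected this detection
            det = {**det, "label": corrections[i], "verified": True}
        reviewed.append(det)
    return reviewed
```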
The long-term goal is to reduce the operator's role as the AI improves. But starting with human verification means every early user gets an accurate inventory from day one.
The repair tracking surprise
When I first built PerifEye, I was focused on inventory — knowing what you have and where. But the repair tracking feature has turned out to be the sleeper hit.
The AI spots things you've tuned out: loose outlet covers, water stains on the ceiling, cracked caulk around windows, worn weatherstripping. Every new walkthrough compares against the previous one. New issues get flagged. Fixed issues get marked resolved.
Your home's maintenance log builds itself. For homeowners, this alone is worth the price of admission.
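The comparison between walkthroughs reduces to a set difference. A minimal sketch, assuming each issue is identified by a (room, description) pair (the real system presumably matches issues more robustly):

```python
def diff_issues(previous, current):
    """Compare issue sets from two consecutive walkthroughs.

    Anything present now but not before gets flagged as new;
    anything that disappeared gets marked resolved.
    """
    prev, curr = set(previous), set(current)
    return {
        "new": sorted(curr - prev),
        "resolved": sorted(prev - curr),
        "still_open": sorted(prev & curr),
    }
```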
The tech stack
For the technically curious, PerifEye is built on:
- Frontend: Next.js 16 (App Router) + Tailwind CSS, deployed on Vercel
- Computer vision pipeline: YOLO-based object detection, room classification from visual context cues, optical flow for frame deduplication
- Operator dashboard: Custom review UI for verifying and editing AI predictions
- Infrastructure: GitHub API-based persistence, serverless API routes, waitlist → cohort onboarding flow
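To give a flavor of the frame-deduplication stage: the idea is to skip frames that barely differ from the last one kept. A simplified stand-in for the optical-flow approach, using mean absolute pixel difference on flat grayscale frames (the threshold is illustrative, not a tuned value):

```python
def mean_abs_diff(a, b):
    """Mean absolute difference between two equal-length pixel lists."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def dedupe_frames(frames, threshold=10.0):
    """Keep only frames that differ meaningfully from the last kept one.

    Frames are flat lists of grayscale pixel values. Real pipelines
    use optical flow for this; pixel difference is the simplest
    stand-in that shows the shape of the algorithm.
    """
    kept = []
    for frame in frames:
        if not kept or mean_abs_diff(kept[-1], frame) > threshold:
            kept.append(frame)
    return kept
```

Because comparison is always against the last *kept* frame, a slow pan that drifts gradually still triggers a new keyframe once enough has changed.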
Where we are now
We're in early access. The landing page is live, the waitlist is open, and we're processing the first batch of beta walkthroughs. We're onboarding in small cohorts (5–10 per week) so every user gets a great first experience.
If this sounds useful — if you've ever lost something in your own house and thought "I should know where this is" — join the waitlist. Early signups get priority access and founder pricing when we launch paid plans.
Questions? Feedback? Email me at arnd@dexmind.ai or find me on Twitter.