AI Products

Collecting the use cases people are building AI for.

🚀 tech

What problems are people trying to solve with LLM-related products?

These products and startups are a window into what founders and investors consider valuable problems to solve with LLMs. The list does not filter for product-market fit; new markets will be created, and products will evolve.

Segments

Conversations

ChatGPT - The original conversational AI.

Cohere - Conversational AI.

Meta AI - Chat on WhatsApp (and other Meta properties); generate stickers, chat with characters.

Workflows

DryMerge - Automate work with plain English.

Speck - AI browser automation. (An AI replacement for Puppeteer?)

Data

Dalmatian - Text2SQL for enterprises.

Scale - Data engine for AI.

Matrices - Spreadsheet AI.

Infrastructure

Modal - Serverless AI infrastructure.

OpenFoundry - Developer infrastructure for open source AI.

Abacus.AI - AI platform for enterprises.

Creative

Eggnog - YouTube for AI-generated content.

Pika - Video on command with AI.

Focal ML - AI movie studio.

Runway - AI creative studio.

Operations

OpenCall - AI call center.

Duckie - Technical support with AI.

Sigma AI - AI-automated e-commerce customer support.

Healthcare

Andy AI - Clinical documentation for home health nurses.

Vetrec - Automated clinical notes for veterinarians.

Co-Pilots / Knowledge Bases

Khoj.AI - AI copilot for your knowledge base.

GitHub Copilot - AI pair programmer.

Glossarie - Language explanations powered by LLMs.

Testing

Baserun - Reliability testing for AI.

Sizeless - Reproducible and safe machine learning.

Search

Perplexity - AI-powered search.

Phind - AI-powered code search.

Hardware

Matic - Smart home robot. (iRobot for the 21st century?)

Thoughts

The big bet with Large Language Models (LLMs) is that they will abstract away the current paradigm of writing explicit instructions for microprocessors, replacing it with more widely accessible natural language. These models remain prone to high error rates, limited in their ability to reason, costly to deploy, poorly understood by researchers, and regarded with skepticism by experts and non-experts alike. None of these challenges, however, is unique to this technology.

When transistors first emerged as a replacement for vacuum tubes, they too were prone to higher error rates and carried very high manufacturing and distribution costs. The market also lacked a supply of engineers and technicians with expertise in working with them. We have since overcome all of that: error rates and costs are now economically viable. I do not see why LLMs cannot follow the same trajectory.

A common theme here is that many products aim to compete with incumbents that already have strong product-market fit, with an LLM tacked on. Perhaps the hope is that LLMs reduce friction at some points of contact with the user. New segment-defining products are still few, and none has yet had an economic breakthrough.

AI Grant