The beginner's guide to AI data annotation jobs
What data annotation is, how much it pays, and how to land your first remote gig — with no prior experience.
AI systems only work as well as the data they're trained on. Someone has to label that data — decide whether a photo shows a cat or a dog, mark the bounding box around a pedestrian, rate whether a chatbot response is helpful. That "someone" is increasingly a distributed workforce of remote annotators, and the field has quietly become one of the more accessible on-ramps into paid remote work.
This guide covers what the work actually looks like day-to-day, what it pays, and the concrete steps to get your first task.
What is data annotation, exactly?
At its core, annotation is adding structured labels to unstructured data so that a machine-learning model can learn from it. The task varies enormously by project:
- Image labelling — drawing bounding boxes, polygons, or pixel masks around objects (cars, faces, tumors, defects).
- Text classification — tagging a message as spam vs. not spam, flagging toxic content, categorizing support tickets by topic.
- Audio transcription — converting speech to text, often with speaker labels and timestamps.
- RLHF (reinforcement learning from human feedback) — ranking chatbot responses, writing better alternatives, flagging unsafe outputs. This is a fast-growing segment driven by the boom in large language models.
- Sentiment & tone tagging — rating reviews, tweets, or comments on scales like positive/negative or on axes like helpful/sarcastic.
- Search-result evaluation — judging whether the results returned by a search engine are relevant to a query.
The common thread: careful, focused human judgment applied at scale. You're the quality signal the model learns from.
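To make "structured labels" concrete, here's what a single completed task might produce. This is a hypothetical record for an image bounding-box task — field names are illustrative, since every project defines its own schema:

```python
# A hypothetical record for one bounding-box annotation task.
# Field names are illustrative; real projects define their own schema.
annotation = {
    "image_id": "img_00412",
    "label": "pedestrian",
    "bbox": {"x": 134, "y": 58, "width": 42, "height": 110},  # pixels, top-left origin
    "annotator_note": "partially occluded by parked car",
}

def bbox_area(record):
    """Area of the labelled box in pixels — a simple sanity check."""
    box = record["bbox"]
    return box["width"] * box["height"]

print(bbox_area(annotation))  # → 4620
```

Thousands of records like this, produced consistently by many annotators, are what a model actually trains on.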
How much does it actually pay?
Rates vary widely, but here's a realistic map of the market in 2025:
| Task type | Typical hourly rate | Notes |
|---|---|---|
| Microtasks, simple classification | $3–8 | Often paid per-task; adds up to an hourly rate once you batch tasks |
| Image bounding boxes, transcription | $8–15 | Needs attention to detail; quality gates common |
| RLHF / LLM evaluation | $15–35 | Premium rates for English fluency + writing skill |
| Domain-expert annotation (medical, legal, code) | $30–80 | Requires credentials or proven expertise |
Numbers on WorkQuay typically fall in the middle two bands, because we only list verified companies and filter out the $3/hr sweatshop listings. Specialist roles show up too — see /jobs for current openings.
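Per-task pay only makes sense once you convert it to an hourly figure. The arithmetic is simple — a quick sketch (the rates and task times below are made-up examples, not quotes from any platform):

```python
def effective_hourly_rate(pay_per_task, seconds_per_task):
    """Effective hourly rate: per-task pay times tasks completed per hour."""
    tasks_per_hour = 3600 / seconds_per_task
    return pay_per_task * tasks_per_hour

# e.g. $0.08 per task at an average of 30 seconds each:
print(round(effective_hourly_rate(0.08, 30), 2))  # → 9.6
```

Worth running before you accept a batch: a task that pays well per unit but takes twice as long as estimated can quietly drop you into the bottom band.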
Do I need experience?
For entry-level work — no. You need:
- A reliable computer and internet connection.
- Good English comprehension (most projects are in English).
- The discipline to follow long, specific instructions carefully.
- Patience. A single task might take 30 seconds. A session might be 4 hours of those.
For RLHF and writing-heavy work, you'll benefit from strong English writing, subject-matter knowledge (coding, medicine, finance), and the ability to explain why one answer is better than another.
What a typical day looks like
- Pick a task batch from the jobs dashboard. Each batch has a time estimate and payout.
- Read the guidelines. Some are 2 pages. Some are 40. Read them. Re-read them. The single biggest predictor of getting rehired is following guidelines.
- Do a calibration set — a handful of pre-labelled examples so you and the system agree on edge cases before you start producing data.
- Annotate. Maintain consistency. When an edge case surprises you, flag it in the notes field rather than guessing silently.
- Submit. Good platforms pay within 1–2 weeks of the batch closing.
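The calibration step above boils down to simple agreement against pre-labelled "gold" examples. A minimal sketch of the idea (the function and the labels are illustrative, not any platform's actual scoring code):

```python
def calibration_score(my_labels, gold_labels):
    """Fraction of calibration examples where your label matches the gold label."""
    matches = sum(mine == gold for mine, gold in zip(my_labels, gold_labels))
    return matches / len(gold_labels)

# Hypothetical spam-classification calibration set of five examples:
mine = ["spam", "not_spam", "spam", "spam", "not_spam"]
gold = ["spam", "not_spam", "not_spam", "spam", "not_spam"]
print(calibration_score(mine, gold))  # → 0.8
```

Platforms typically require agreement above some threshold before releasing the real batch — which is exactly why re-reading the guidelines beats guessing on edge cases.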
Getting started on WorkQuay
- Sign up as a worker.
- Pass the assessment — 20 questions across English and general knowledge. You need 70% on each. You get 3 attempts per 7-day window.
- Complete your profile — languages, skills, preferred work type. This is how projects find you.
- Upload your CV and ID. Required once, and it's what lets legitimate companies trust you with real data.
- Browse verified roles and apply.
Watch-outs
- Anything asking for your bank login, PayPal password, or crypto keys — not legitimate. Ever.
- Free "training" that leads to a paid course — a scam. Legitimate annotation work is learned on the job.
- Guideline shortcuts on forums — don't copy answers from someone else's notes; you'll fail quality checks and get removed.
Start small, read the guidelines, ask questions when you're stuck. The workers who stick around past their first week are the ones who treat it like a craft instead of a grind.