What is Tara?

Tara is an AI product analyst that works alongside your team. She watches session recordings, analyzes real user behavior, and turns what she finds into clear insights and recommended next steps.

Instead of manually reviewing sessions, digging through dashboards, or guessing what to fix, you can simply ask Tara questions about your product, and she does the analysis for you.

Tara helps teams move from data → understanding → decision in minutes.

What can Tara do?

Tara takes over the manual work of product analysis.

She can:

  • Watch session recordings for you

  • Detect friction, confusion, and broken flows

  • Identify patterns across many sessions

  • Explain why users struggle

  • Surface hidden issues that aren’t tracked

  • Provide evidence-backed insights

  • Recommend what to do next

Tara doesn’t just show what happened. She explains what it means and how to act on it.

How Tara works

Tara works like an additional product analyst on your team.

You ask her a question in plain language, such as:

  • “What are the biggest frictions in our checkout?”

  • “Did the last release make onboarding worse?”

  • “Where are users getting stuck?”

  • “What should we fix first?”

Tara then:

  1. Analyzes relevant session recordings

  2. Identifies patterns, anomalies, and friction

  3. Explains what’s happening and why

  4. Returns supporting evidence

  5. Recommends a next step

If Tara doesn’t have enough data to support a conclusion, she will tell you.

Privacy, Security & Compliance

Tara is built on an enterprise-grade, privacy-first architecture.

  • Data Redaction: Before any AI processing begins, UXCam’s on-device redaction ensures that sensitive data (PII) is blurred or removed.

  • No Training on Customer Data: UXCam does not fine-tune, retrain, or use your data to benchmark models. Your data is processed strictly for inference within the scope of your account.

  • Stateless Processing: All interactions with AI models (e.g., Google Gemini, OpenAI) are stateless. Data is not retained by these providers for model improvements.

  • Retention: Data exchanged with external AI providers is retained for a maximum of 30 days exclusively for security auditing, then deleted.

Frequently Asked Questions (FAQ)

Q: Do I need to configure anything?

A: No. Tara works out of the box with your existing UXCam setup. There is no need for extra tagging or event configuration.

Q: How is Tara different from standard LLMs (like ChatGPT)?

A: Standard LLMs require you to provide massive context to get good answers. Tara already "knows" your app structure and has watched the user sessions. She uses Visual Reasoning to understand the user experience (what the user actually sees) rather than just relying on code metadata.

Q: How do Tara credits work?

A: Credits are shared across your organization and reset every month. They can be used both for batch-processing sessions and for asking chat questions. Because the pool is shared, there is no per-app or per-user limit.

Q: Does Tara work with hybrid apps (Flutter/React Native)?

A: Yes. Because Tara relies on visual reasoning (analyzing the frames of the video) rather than just underlying code metadata, she is uniquely suited for frameworks where traditional text-based tracking often fails.

Q: Why do Tara's insights sometimes differ from my dashboard metrics?

A: Dashboards show "Logged Events" (what the code captured). Tara shows "User Reality." If a user clicks a button that fails to trigger an event due to a bug, a dashboard might show no activity, whereas Tara will report that the user experienced friction.
