We Ran Juno on a Real Client Return This Tax Season. Here’s What Happened.


When one of our US clients rolled out Juno to support their tax workflow, we didn’t test it on a demo file. We put it straight onto a live 1040. Full document set. Real client. Real deadlines. That was deliberate. If a tool is going to be part of our workflow, it needs to work under real conditions, not in a controlled demo with the sales team.

The Setup

The return wasn’t simple:

  • Multiple W-2s
  • Several 1099s
  • Brokerage statements (the kind that run for pages)
  • The usual supporting documents you’d expect in a mid-complexity individual return

The idea behind Juno is straightforward: instead of manually keying data into your tax software (UltraTax, in this case), you upload the documents, let the system extract the data, validate it, and then push it in.

In theory, that removes a big chunk of manual effort. So, we tested it exactly that way.

The Workflow We Used

We kept the process tight and repeatable:

  • Upload all documents into Juno
  • Run extraction
  • Validate against source documents
  • Import into UltraTax
  • Annotate and bookmark inside Juno
  • Store the final package in TaxDome

The key step here is validation. That doesn’t go away. If anything, it becomes more important.

Think of it as a pre-flight check: you’re catching issues before they hit the tax software.

Where Juno Actually Delivered

The biggest win showed up where you’d expect: volume. Once the document set gets beyond ~10 pages, the difference is noticeable. Instead of a preparer keying line by line, Juno pulls everything at once and queues it for review. That changes the rhythm of the work. Less typing. More reviewing.

And that matters, not just for speed, but for accuracy. Every time someone manually enters data from a 1099 or brokerage statement, the chance of an error increases. Small mistakes, but they add up.

With AI extracting the data, that risk drops. Not to zero, but materially lower.

The Unexpected Win: Documentation

One thing the team really liked was the annotation layer. Every decision, every check, and every flagged item gets tied back to the source document. So instead of trying to reconstruct what happened during review, the audit trail is already there.

For firms using TaxDome (like this client), that becomes a clean, permanent record for next year. That’s not just nice to have; it’s operationally useful.

Where It Didn’t Work (Yet)

This is the part most write-ups gloss over. We won’t.

Juno is not faster for simple returns.

If the document set is small (say under 10 pages), the overhead of uploading, extracting, and validating can actually slow you down compared to just entering it directly in UltraTax.

That’s not a flaw; it’s a matter of knowing when to use the tool.

There were also a few practical limitations:

  • Formatting issues – payer names often came through in ALL CAPS, which meant cleanup after import
  • Document quality dependency – poor scans or non-standard formats can break extraction
  • Gaps in certain schedules – things like vehicle mileage or home office still required manual entry

None of these are deal breakers, but they are real.

So Where Does It Fit?

After running this live, the answer is pretty clear.

Juno works best when:

  • The document set is large
  • There are multiple income sources
  • Accuracy matters more than raw speed
  • You want a clean review trail

Manual entry still makes sense when:

  • The return is simple
  • The documents are messy
  • You’re dealing with schedules the tool doesn’t support well

The Bottom Line

We didn’t come away thinking “this replaces tax prep.”

We came away thinking:

This changes how prep work gets done, if you use it in the right situations.

That’s the difference.

We’ll keep running it across more returns this season. But now we know where it fits, and where it doesn’t.

And that’s what matters.
