Webinar: How to Optimize your Online Forms and Checkouts

Watch our discussion session with a top Optimizely experimenter - learn tips to improve your forms

Looking to improve your form conversion?

Get a free automated form health check showing you:

  • Friction causing abandonment
  • What's working well
  • Other areas for UX improvement

A video session on using experimentation to improve the conversion of your forms

We discuss forms and experimentation with Griffin Cox, Senior Product Manager at Optimizely. Watch the video to learn things like:

  • Memorable forms we've worked on
  • UX issues we consistently see across online forms
  • How to diagnose / quantify those issues
  • Why so many forms still have bad UX
  • How to use testing to improve forms

For more information on how you can use Zuko with Optimizely, read our Optimizely integration guide. You can also find a guide to A/B testing forms here.

Webinar Transcript: How to Optimise Your Forms and Checkouts

Introductions and welcome

00:00
Alun (Zuko): Welcome to our session today on how to optimise your forms and checkouts. I've got Griffin from Optimizely with me, who'll introduce himself shortly. I'm Alun, Managing Director at Zuko.

For anyone who doesn’t know Zuko: we’re a specialist form analytics product. We provide deep insights into why people drop off forms at a detailed, field-by-field level. We’re the only player focused purely on form analytics.

One of my favourite reports in Zuko is the Field Flow report on a failed submit — it shows what people do after they fail to submit, where they jump back to, and helps you quickly pinpoint the issues on your form.

00:54
Alun (Zuko): Quick housekeeping: we’ll be going for about 30 minutes plus Q&A (so we may overrun). Drop questions or comments in the chat — we may answer as we go or save them to the end depending on timing.

I’ll hand over to Griffin to introduce himself.

01:17
Griffin (Optimizely): Hey everybody — I’m Griffin Cox. I’m the Senior Technical Product Manager on Optimizely’s flagship full-stack experimentation product, Feature Experimentation. I’ve been with Optimizely since April as a PM.

Before that, I was an engineer at a Chicago-based fintech — and my first passion (I’m totally serious) was optimising forms. The brand I worked for was a Brazilian loan product — essentially a glorified form. We had around 30 fields because we needed a lot of data to underwrite and approve loans. There was a lot of friction, especially around Brazil’s equivalent of a social security number.

That got me deep into improving error messaging, breaking the form into multiple steps so it didn't feel as long, and saving progress as users went. I really got into the weeds with it.

Optimizely’s Web and Feature Experimentation products both support experiments on forms. There are two main categories:

  • Between forms: routing users to different URLs and comparing different form versions.
  • Within a form: changing the experience inside the same form (e.g., showing/hiding fields) to see if fewer fields or a different UX improves conversion.

I’m excited for this conversation.
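As a rough illustration of the "within a form" category, a variation can be as small as hiding one optional field for part of your traffic and comparing completion rates. Below is a minimal TypeScript sketch; the variation name and field wrapper ID are hypothetical, and the variation assignment would come from your experimentation tool.

```typescript
// Minimal sketch of a "within a form" variation: hide an optional field for
// visitors bucketed into a hypothetical "short_form" variation and leave the
// control group untouched. Both groups submit the same form; the experiment
// then compares completion rates.
function applyFormVariation(variation: string): void {
  if (variation === "short_form") {
    const wrapper = document.querySelector<HTMLElement>("#fax-number-wrapper");
    wrapper?.style.setProperty("display", "none");
  }
}
```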

03:06
Alun (Zuko): One quick plug: Zuko is a partner of Optimizely — we have a direct integration so you can run A/B tests and see Zuko data on different test variants.

Now, let’s get into forms.

Common questions teams ask about forms

03:25
Alun (Zuko): What are some of the most common questions you get from clients, users, or in previous roles about forms?

03:38
Griffin (Optimizely): I’d almost flip that question — I don’t think I got enough questions about forms from my product team.

At a high level, a form is a collection of fields. It may or may not have validation and help text. Users submit it, the backend receives the data, and then you can change the user experience — whether it’s checkout or account creation. It’s fundamentally a method for users to input data securely and (hopefully) in a user-friendly way.

That last part was always the issue. Early on, before I could show massive conversion gains just by changing those elements, it was hard to convince product teams it was worth trying.

Honestly, I wish we’d had a product like Zuko. We were using Adobe at the time — and it was worse five years ago than it is now. If you’ve used Adobe, you probably feel my pain. All I need to say is: “docs.”

04:54
Alun (Zuko): We mostly speak to clients who already recognise forms need fixing, but we also speak to people who need to justify it internally. We get lots of questions, but the key is always: identify the UX issues and focus your effort.

A classic question is:
If a form has 10–20 fields, should it be one page or multi-step?

One page vs multi-step forms

05:33
Griffin (Optimizely): The form I worked on was multi-step already. The reason was simple: showing someone 10 fields at once can be intimidating.

We’d start with low-friction fields — non-PII questions (loan purpose, job situation, etc.). Then, in steps 2 and 3, we’d ask for sensitive info (like bank routing details). That’s where drop-off happened.

Multi-step forms also give you a way to show progress — without being disingenuous — so users feel they’re making meaningful headway.

06:55
Griffin (Optimizely): Another “joy” pattern: when you ask for billing address, and then offer a simple checkbox like “Delivery address is the same as billing address.” Small moments like that reduce effort.

The opposite is when validation is broken — especially when the form instructs you to enter something, and then throws an error anyway. That’s when people rage quit.

07:39
Alun (Zuko): On “how many fields per step,” we don’t give a universal number. The usual advice is:

  • keep similar questions together
  • and, if you have enough traffic, test the structure.

The order of questions matters a lot — and it varies by sector.

For many forms, asking easy questions like name/email early can help people feel momentum.

But in sectors like insurance, people often just want a quote first. If you ask for personal details too early, they may assume spam and bounce. So a better pattern can be:

  1. ask for the quote-relevant info first
  2. show the quote
  3. then ask for email if they want to proceed

Historically, we’ve seen conversion improve in insurance when personal info is requested later — but for something like education forms, the opposite can be true.

A useful experimentation mindset: remove, don’t just add

09:22
Griffin (Optimizely): A common misconception in experimentation is that it’s all about adding something new and seeing if it’s better.

But often, the easiest (and most revealing) experiments are about removing something:

  • the code already exists
  • you can hide a field and test quickly
  • you don’t need to build new complex UI

When building longer forms:

  1. question whether you truly need the info
  2. if you do need it, consider whether you need it now

Sometimes you have to ask early (e.g., in Brazil, phone number can be used to look things up for underwriting). But if you can delay it until later, it often helps.

10:37
Alun (Zuko): Having a framework upfront is crucial: what do you need and why do you need it?

Form design involves multiple stakeholders — marketing, compliance, legal — and everyone wants “just one more field.” That creates tension.

From our dataset, the biggest friction points in common fields are typically:

  • passwords (often #1 drop-off)
  • telephone number (often next)

Phone number fields: do you need it, and how are you asking?

11:22
Alun (Zuko): The first question is always: Do you actually need the phone number?

It’s often a legacy requirement — a holdover from older forms where phone was the primary contact method. Users frequently think: “You have my email — why do you need my phone number? I don’t want calls.”

And even when you do need it, how you ask for it creates friction. Phone formats vary wildly by country and user habit — spaces, dashes, leading zeros, country codes. We regularly see validation errors triggered by something as trivial as entering a space.

Our general recommendation is: allow free-form entry and clean/standardise in the backend. If you do apply formatting, do it in a way that helps the user rather than rejects them.
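As a sketch of that "accept anything, clean it up later" approach, the normalisation below strips spaces, dashes, dots and brackets rather than rejecting them; the exact rules for country codes and leading zeros will depend on your market.

```typescript
// Sketch: accept free-form phone input and standardise it in the backend
// instead of rejecting the user. Keeps a leading "+" for country codes and
// drops every other non-digit character. Market-specific rules (leading
// zeros, trunk prefixes) would go on top of this.
function normalisePhone(raw: string): string {
  const trimmed = raw.trim();
  const hasPlus = trimmed.startsWith("+");
  const digits = trimmed.replace(/\D/g, ""); // remove spaces, dashes, brackets, dots
  return (hasPlus ? "+" : "") + digits;
}

// normalisePhone(" +44 (0) 7911 123-456 ") returns "+4407911123456"
```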

12:56
Griffin (Optimizely): A quick win I’ve personally seen: tag the input as a telephone field (HTML input type). On mobile, that triggers the numeric keypad — and that alone can make a big difference.
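For reference, that quick win is a one-line change to the input markup. A small sketch (the element IDs are illustrative):

```typescript
// Sketch: marking a phone field as a telephone input so mobile browsers show
// the numeric keypad. The "#phone-field" container ID is illustrative.
const phoneInput = document.createElement("input");
phoneInput.type = "tel";         // triggers the telephone keypad on mobile
phoneInput.autocomplete = "tel"; // lets the browser offer a saved number
phoneInput.name = "phone";
document.querySelector("#phone-field")?.appendChild(phoneInput);
```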

Also: accessibility matters. If screen readers can’t navigate your form, that’s a conversion problem too — and it often gets neglected because it takes real skills and time to fix properly, even if tools like Lighthouse surface the issues.

14:10
Alun (Zuko): Accessibility is a whole topic in itself, but it’s important because you might not realise there’s a problem — it can affect only a segment of your audience. Segmentation by device is also critical: teams often design for desktop and forget how painful the experience can be on mobile.

Passwords: reduce friction with alternatives and better rules

15:07
Griffin (Optimizely): Some companies are finding creative ways to avoid passwords entirely — depending on the product and risk level.

For example, OpenTable-style “magic links” via email/text can work well for lower-stakes accounts. If it’s not high-security, removing password creation can dramatically reduce friction.

If you can’t remove passwords, consider alternative logins (social sign-in), but be aware it depends on industry and audience.

16:27
Alun (Zuko): Social sign-in can help, but it’s more sensitive in financial services due to privacy concerns. Also, if your audience skews older, you still need the “classic” email + password route.

On password rules: don’t be overly restrictive. A minimum length (often 8+) is usually enough. Let people use special characters, but don’t force complexity rules that encourage password reuse or writing passwords down.
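A minimal sketch of that "minimum length only" rule, assuming eight characters as the threshold; anything stricter (breach checks, rate limiting) belongs in the backend rather than in front-end complexity rules.

```typescript
// Sketch of a deliberately permissive password rule: require a minimum length
// and allow any characters, rather than forcing complexity requirements that
// push people towards reuse or writing passwords down.
function passwordIsAcceptable(password: string, minLength = 8): boolean {
  return password.length >= minLength;
}
```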

The analytics → hypothesis → experiment cycle

17:30
Griffin (Optimizely): When it comes to experimentation, it really starts with analytics. Optimizely partners with analytics providers because we don’t aim to compete there — we specialise in experimentation.

A common loop looks like this:

  1. use analytics (e.g., Zuko) to identify where friction is
  2. form a testable hypothesis (“If we remove this field, more people will complete”)
  3. create variations (with vs without the field)
  4. run the experiment and evaluate results using statistical significance
  5. roll out the winner, often using feature flags/targeted delivery

The most successful companies run this continuously — hundreds of experiments per year — iterating conversion up over time.
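As a hedged sketch of steps 3 to 5 of that loop, here is roughly what a "remove a field" experiment could look like with Optimizely's Feature Experimentation JavaScript SDK. The SDK key, flag key, variation key and event key are placeholders; check the SDK documentation for the exact setup in your version.

```typescript
// Hedged sketch of running a "remove a field" variation and tracking the
// conversion event with Optimizely's Feature Experimentation JavaScript SDK.
// All keys below are placeholders.
import { createInstance } from "@optimizely/optimizely-sdk";

const optimizely = createInstance({ sdkKey: "YOUR_SDK_KEY" });

async function applyCheckoutExperiment(userId: string): Promise<void> {
  await optimizely?.onReady();
  const user = optimizely?.createUserContext(userId);
  const decision = user?.decide("shorter_checkout_form"); // placeholder flag key

  if (decision?.enabled && decision.variationKey === "without_phone_field") {
    // Variation: hide the phone field (wrapper ID is illustrative).
    document.querySelector<HTMLElement>("#phone-wrapper")
      ?.style.setProperty("display", "none");
  }

  // Track the conversion event so results can be evaluated for significance.
  document.querySelector("form")?.addEventListener("submit", () => {
    user?.trackEvent("form_completed"); // placeholder event key
  });
}
```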

19:35
Alun (Zuko): The challenge is ensuring the experimentation programme actually extends to forms. Integrations help make that easier.

Conversion rate alone is useful for A/B tests, but field-level behavioural data helps you refine the next hypothesis:

  • where are people jumping back to?
  • which fields trigger corrections?
  • what changes in behaviour happened even if overall conversion didn’t move?

20:59
Alun (Zuko): We recently created an ebook on how to use data to optimise forms — focusing on metrics and what they can imply, not just generic UX advice. We’ll share it afterwards.

Also: yes, this is being recorded. If you signed up via LinkedIn, we’ll message the link afterwards.

Technical section: SPAs vs full page refresh

22:11
Griffin (Optimizely): Let’s get a bit technical: single page applications (SPAs) vs traditional multi-page flows.

In traditional server-rendered apps (e.g., Rails), each step submits, the backend responds with new HTML, and you get a full page refresh (header/footer reload, etc.). Validation patterns tend to be more backend-driven.

In SPAs (React/Vue/Angular), you often load the full multi-step experience at once and show one step at a time. Navigation between steps is handled in code.

I saw a huge lift by converting our highest-volume flow from full page refresh to an SPA — keeping the UI the same but removing the refresh. It was especially impactful on mobile and slow connections. We could even lazy-load future steps in the background.
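On the lazy-loading point: in an SPA you can fetch the next step's code in the background while the user is still filling in the current one. A rough sketch, with a hypothetical module path:

```typescript
// Sketch: prefetch the next step's bundle while the user works on the current
// step, so the step transition doesn't wait on the network. "./steps/payment"
// is a hypothetical module path and prefetching is best-effort.
function prefetchNextStep(): void {
  import("./steps/payment").catch(() => {
    // Ignore failures; the module simply loads on demand when the user gets there.
  });
}

// Start prefetching once the user begins interacting with the current step.
document.querySelector("#delivery-step")
  ?.addEventListener("focusin", prefetchNextStep, { once: true });
```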

24:00
Alun (Zuko): We’ve seen improvements when companies move to SPA-style step transitions too — often a few percentage points, which can still be significant. But it depends heavily on context and sample sizes.

Advanced tips: saving progress and nudges

25:08
Griffin (Optimizely): A few tips for advanced CRO folks:

1) Save progress locally
If a user refreshes, fields should remain populated (e.g., stored in the browser), so one accidental mistake doesn't force users to start over.

2) Save progress in the backend (even before account creation)
Store partial data tied to an anonymous ID so you can analyse and recover journeys. It can be useful for analytics — and potentially ML later.

3) Nudges to recover drop-off
If someone abandons (e.g., no field changes for 15 minutes), send a reminder via SMS/email with a link that resumes where they left off. This can be extremely effective.
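A minimal sketch of tip 1, using the browser's localStorage; the storage key is arbitrary and sensitive fields should never be persisted this way.

```typescript
// Sketch of tip 1: persist non-sensitive field values locally so a refresh
// doesn't wipe the user's progress. The storage key is arbitrary; passwords,
// card numbers and similar fields should be excluded.
const STORAGE_KEY = "checkout-progress";

function saveProgress(form: HTMLFormElement): void {
  const data = Object.fromEntries(new FormData(form).entries());
  delete data["password"]; // never persist sensitive values
  localStorage.setItem(STORAGE_KEY, JSON.stringify(data));
}

function restoreProgress(form: HTMLFormElement): void {
  const saved = localStorage.getItem(STORAGE_KEY);
  if (!saved) return;
  for (const [name, value] of Object.entries(JSON.parse(saved))) {
    const field = form.elements.namedItem(name);
    if (field instanceof HTMLInputElement) field.value = String(value);
  }
}
```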

26:57
Alun (Zuko): In Europe you have to be careful — GDPR and consent can limit what’s viable (e.g., partial form capture). But the principle is sound; you just need to execute it appropriately for your region and compliance requirements.

Q&A: Eye tracking for sign-up pain points

27:42
Alun (Zuko): Question from Arthur: have you used eye tracking, and is it useful for detecting pain points in sign-up?

We’ve done experiments on this (content on our website). One example: date of birth fields. Eye tracking can be great for developing hypotheses, but it’s expensive and sample sizes tend to be small.

We tested different DOB patterns and found the smoothest pathway was often three simple text boxes (DD / MM / YY or similar) — fast and predictable. Dropdowns often created lots of scanning and friction (especially when year dropdowns start at the current year and you have to scroll forever).

Eye tracking gives you a hypothesis; then you test it at scale via A/B testing.
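For what it's worth, the "three simple text boxes" pattern is just separate day, month and year inputs with a numeric keypad hint. A small illustrative sketch:

```typescript
// Sketch of the "three simple text boxes" date-of-birth pattern: separate
// day / month / year inputs with numeric keypads and no dropdown scrolling.
// Field names, placeholders and lengths are illustrative.
function buildDobInputs(container: HTMLElement): void {
  const parts: Array<[string, string, number]> = [
    ["dob-day", "DD", 2],
    ["dob-month", "MM", 2],
    ["dob-year", "YYYY", 4],
  ];
  for (const [name, placeholder, maxLength] of parts) {
    const input = document.createElement("input");
    input.type = "text";
    input.inputMode = "numeric"; // numeric keypad on mobile
    input.name = name;
    input.placeholder = placeholder;
    input.maxLength = maxLength;
    container.appendChild(input);
  }
}
```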

29:36
Griffin (Optimizely): There are multiple “types” of eye tracking:

  • approximations based on viewport/scroll behaviour
  • predictive models based on mouse/viewport signals
  • and predictive “pre-test” models applied to mockups before production

That last one is especially interesting: models predict where attention will go based on training data, helping you refine layouts before you build.

Q&A: Best way to analyse field engagement and related experiments

31:37
Alun (Zuko): How do you analyse form field engagement data and experiments?

You want field-level metrics like:

  • abandonment rate by field
  • time spent (dwell time)
  • returns/backtracks
  • corrections
  • error triggers and patterns

You can hack it together with Tag Manager, but it takes a lot of work to match that depth. Tools designed for form analytics make it much easier.
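If you do hack it together yourself, the raw ingredients are focus and blur listeners with timestamps. A rough sketch of what a dedicated tool does for you automatically (plus abandonment, corrections and error tracking):

```typescript
// Rough sketch of DIY field-level tracking: record dwell time and repeat
// visits per field from focus/blur events. Repeat visits are a proxy for
// returns/backtracks; errors and corrections would need further wiring.
interface FieldStats { visits: number; totalMs: number; }

const fieldStats = new Map<string, FieldStats>();
let focusedAt = 0;

document.querySelectorAll<HTMLInputElement>("form input").forEach((input) => {
  input.addEventListener("focus", () => { focusedAt = Date.now(); });
  input.addEventListener("blur", () => {
    const entry = fieldStats.get(input.name) ?? { visits: 0, totalMs: 0 };
    entry.visits += 1;
    entry.totalMs += Date.now() - focusedAt;
    fieldStats.set(input.name, entry);
  });
});
```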

For experiments, compare how changes impact both:

  • overall conversion
  • and the behavioural metrics (abandonment, dwell, errors, corrections) that explain why.

32:53
Griffin (Optimizely): A common integration pattern is:

  • experimentation tool assigns a variation
  • sends that info to the analytics provider (event/callback)
  • then analytics reporting can be segmented by variation

Another approach is ETL-style:

  • get both datasets into a warehouse
  • join on a shared ID

Larger companies often lean toward warehouse joins; smaller teams often use live events.
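A sketch of the live-events pattern: once a variation is assigned, forward the variation key to the analytics provider as a custom attribute so its reports can be segmented. The sendToFormAnalytics function below is a hypothetical stand-in for whatever API your provider exposes.

```typescript
// Hypothetical stand-in for the analytics provider's own tracking call; in
// practice you would use the provider's SDK method here.
function sendToFormAnalytics(attributes: Record<string, string>): void {
  console.log("analytics event", attributes);
}

// Forward the assigned variation so field-level reports can be segmented by it.
function onVariationAssigned(experimentKey: string, variationKey: string): void {
  sendToFormAnalytics({
    experiment: experimentKey, // e.g. "shorter_checkout_form"
    variation: variationKey,   // e.g. "without_phone_field"
  });
}
```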

Q&A: High abandonment at submit button

36:23
Alun (Zuko): If each step has low abandonment (1–5%) but abandonment is ~70% at the submit button — is that unusual?

No — we see it all the time. Often it happens because validation errors only appear on submit. Users click submit, see a sea of errors, and leave.

Recommendation: inline validation (surface issues earlier).
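A small sketch of the inline idea: validate a field when the user leaves it, so the error appears next to the field instead of in a wall at submit. The email check and element IDs are deliberately simple and illustrative.

```typescript
// Sketch of inline validation: check a field on blur rather than only at
// submit. The email pattern and element IDs are illustrative.
const emailInput = document.querySelector<HTMLInputElement>("#email");
const emailError = document.querySelector<HTMLElement>("#email-error");

emailInput?.addEventListener("blur", () => {
  const looksValid = /\S+@\S+\.\S+/.test(emailInput.value);
  if (emailError) {
    emailError.textContent = looksValid ? "" : "Please check your email address";
  }
});
```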

To find the true cause, use behaviour paths (e.g., field flow after failed submit) to see where people jump back to — often revealing the real problematic field (e.g., date of birth).

37:29
Griffin (Optimizely): Another technique is session recording. It can show issues you’d never spot otherwise — like elements flying off-screen on weird devices. Combine quantitative analytics with qualitative playback for a fuller picture.

Q&A: Date pickers on mobile

38:22
Alun (Zuko): Thoughts on mobile date pickers — especially custom ones that aren’t accessible?

There’s a lot of bad UX out there. Wherever possible, allow manual entry (text input) as an option. What works best also depends on context: date of birth vs travel booking dates are different interactions.

39:14
Griffin (Optimizely): There’s no single silver bullet. Often date pickers are built on open-source libraries. Prioritise accessibility when choosing a library, then style it lightly (e.g., colours, rounding) to match branding rather than building from scratch.

Closing

40:12
Griffin (Optimizely): To really achieve CRO, it helps to use both kinds of tools: analytics to find friction and build hypotheses, and experimentation to test solutions.

40:23
Alun (Zuko): Agreed. Thanks everyone — the recording will be available. Keep an eye on the page; since this is LinkedIn Live, you can rewatch, and we’ll also share the recording link afterwards.

40:45
Griffin (Optimizely): Thanks everybody — it’s been fun.

We wrote the book on form optimization!

"The best book on form design ever written - 80 pages of PURE GOLD"

Craig Sullivan, CEO, Optimise or Die
The Big Guide to Form Optimization and Analytics, by Zuko

Want to get started with Zuko?

Start a free trial that includes all features, or request a demo