Fits.me: Virtual Fitting Room

One of the main problems with buying clothes online is simple: you cannot feel the fit. So you guess, the parcel arrives, and the return loop starts again.

Fits.me, an Estonian company, builds a Virtual Fitting Room around a shape-shifting robotic mannequin. Instead of trying to “predict” fit with a size chart, the mannequin physically changes form to match your body dimensions, letting you preview how different sizes sit on a body shaped like yours.

A mannequin that changes shape so the garment can do the explaining

The mechanism is a shape-adjustable robotic mannequin, the FitBot, which can be tuned across a wide range of body measurements. Clothing is photographed on the mannequin in multiple sizes, so the shopper can compare how the same item behaves as size changes on a body that resembles their own. Because the garment is shown on the same body shape across sizes, the comparison makes fit differences visible and reduces guesswork.
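Fits.me has not published its matching logic, but the core of a system like this is simple to sketch: map the shopper’s measurements to the nearest mannequin preset that was photographed, then pull that preset’s photo set for each garment size. The preset names, measurements, and photo paths below are all hypothetical, purely for illustration:

```python
import math

# Hypothetical mannequin presets: the measurements (in cm) at which a
# FitBot-style rig was photographed. Values are illustrative, not Fits.me data.
PRESETS = {
    "slim":    {"chest": 92,  "waist": 78,  "hips": 94},
    "regular": {"chest": 100, "waist": 88,  "hips": 100},
    "broad":   {"chest": 112, "waist": 102, "hips": 110},
}

def nearest_preset(shopper):
    """Return the preset whose measurements are closest (Euclidean) to the shopper's."""
    def dist(preset):
        return math.sqrt(sum((shopper[k] - preset[k]) ** 2 for k in shopper))
    return min(PRESETS, key=lambda name: dist(PRESETS[name]))

def photos_for(garment_id, preset, sizes=("S", "M", "L")):
    """Build the photo keys a side-by-side comparison UI would load, one per size."""
    return [f"{garment_id}/{preset}/{size}.jpg" for size in sizes]

preset = nearest_preset({"chest": 98, "waist": 86, "hips": 101})
print(preset)                          # → regular
print(photos_for("tee-041", preset))
```

The point of the sketch is the product shape, not the math: because every size is photographed on the same preset, the shopper compares like with like, which is exactly what makes the fit differences visible.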

A robotic mannequin providing a Visual Size Guide.

In online apparel retail, fit uncertainty drives returns and suppresses conversion, so anything that reduces sizing doubt tends to outperform its surface-level novelty.

Why this approach feels more “real” than a size chart

What makes it persuasive is that it turns sizing into a visual comparison instead of a number. The real question for any retailer is whether a shopper can see the trade-offs between sizes before checkout without being asked to trust a black-box recommendation; if that is your problem, this is the pattern to use. You are not told “you are a Medium.” You are shown what Small, Medium, and Large look like on a similar shape, which is closer to the in-store decision process.

Extractable takeaway: When a purchase decision depends on a physical sensation you cannot deliver online, replace the missing sensation with a repeatable visual proof that helps shoppers compare options, not just read recommendations.

What the rollout says about where the pain is

At launch, the system was positioned around a male mannequin first, with Fits.me planning to unveil a female version in November. That sequencing is a reminder that “who we can fit well” is often a product constraint, not a marketing choice, especially when the technology depends on physical ranges and repeatable photography.

For more information visit www.fits.me.

What to steal from Fits.me’s FitBot

  • Make fit a comparison, not a verdict. Let shoppers see multiple sizes side by side on a body-like reference instead of outputting a single “recommended” size.
  • Design for confidence, then measure it. Track size changes after viewing, conversion on fitted items, and return-rate shifts by category.
  • Respect constraint sequencing. If the system only fits certain body ranges well at first, be explicit about where it is reliable and expand the range as the asset library grows.
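The “design for confidence, then measure it” bullet is concrete enough to wire up. A minimal sketch, assuming a hypothetical order log where each row records whether the shopper used the fitting tool, the size they first selected versus bought, and whether the item came back:

```python
# Hypothetical event log (illustrative rows, not real data): did the shopper
# use the fitting tool, which size they first picked vs. bought, and whether
# the order was returned.
orders = [
    {"used_tool": True,  "initial_size": "M", "bought_size": "L", "returned": False},
    {"used_tool": True,  "initial_size": "M", "bought_size": "M", "returned": False},
    {"used_tool": False, "initial_size": "M", "bought_size": "M", "returned": True},
    {"used_tool": False, "initial_size": "S", "bought_size": "S", "returned": False},
]

def rate(rows, pred):
    """Fraction of rows satisfying pred (0.0 for an empty group)."""
    return sum(1 for r in rows if pred(r)) / len(rows) if rows else 0.0

fitted   = [o for o in orders if o["used_tool"]]
unfitted = [o for o in orders if not o["used_tool"]]

# How often did seeing the comparison change the shopper's size choice?
size_change_rate = rate(fitted, lambda o: o["initial_size"] != o["bought_size"])

# Return-rate gap between shoppers who skipped the tool and those who used it.
return_delta = rate(unfitted, lambda o: o["returned"]) - rate(fitted, lambda o: o["returned"])

print(f"size changes after viewing: {size_change_rate:.0%}")
print(f"return-rate reduction:      {return_delta:+.0%}")
```

In practice you would segment these rates by category and by first-time versus repeat shoppers, as the article suggests, and treat tool usage as self-selected rather than a controlled experiment.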

A few fast answers before you act

What is a “Virtual Fitting Room” in this Fits.me context?

It is a system that uses a shape-adjustable robotic mannequin to model how garments look across sizes on a body shaped to match the shopper’s measurements, so shoppers can compare fit visually before buying.

Why does this reduce returns in theory?

Because it reduces guesswork. When shoppers can see how different sizes drape and sit, they are less likely to buy multiple sizes “just in case,” and less likely to be surprised when the item arrives.

What is the key difference versus typical size charts or recommendation widgets?

This approach is comparison-first. It shows a garment on a body-like reference across multiple sizes, rather than outputting a single recommended size and asking the shopper to trust it.

When does a visual fit tool like this not help much?

It helps most with size uncertainty, but it cannot fully replace tactile judgments like fabric feel or personal comfort preferences, so some returns will still be driven by “feel” rather than fit.

What should retailers measure if they deploy something like this?

Engagement with the fitting experience, size-selection changes after viewing, conversion lift on fitted products, and return-rate reduction by category and by first-time versus repeat shoppers.

NOOKA: Augmented Reality Accessorizer

NOOKA watches created a video-led way to try on their watches virtually. All you need is a simple paper strip that stands in for the watch, and once you see it in action, the idea becomes obvious.

A paper strip that turns your webcam into a fitting room

The mechanism is a coded wrist strip and a webcam. You place the strip on your wrist, hold your arm up to the camera, and the watch appears aligned to your wrist as you move. It is a fast, low-friction way to demonstrate “how it looks on me” without needing a physical product in hand.

Because the strip gives the webcam a stable reference, the overlay can track your wrist as it moves, which is what makes the preview feel believable.
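NOOKA never published implementation details, but this is the standard marker-tracking pattern: a detector (for example, OpenCV’s ArUco module) reports the four corner points of the printed marker in each webcam frame, and placing the overlay reduces to reading the marker’s center, apparent size, and rotation from those corners. The geometry below is a self-contained sketch under that assumption; the corner ordering and the watch’s aspect ratio are hypothetical:

```python
import math

def marker_pose(corners):
    """Estimate center, edge length, and rotation of a square marker from its
    four detected corner points, ordered top-left, top-right, bottom-right,
    bottom-left (the ordering assumed here)."""
    (x0, y0), (x1, y1) = corners[0], corners[1]
    cx = sum(x for x, _ in corners) / 4
    cy = sum(y for _, y in corners) / 4
    edge = math.hypot(x1 - x0, y1 - y0)                # pixel length of the top edge
    angle = math.degrees(math.atan2(y1 - y0, x1 - x0)) # tilt of the strip on screen
    return (cx, cy), edge, angle

def overlay_rect(corners, watch_aspect=2.5):
    """Where to draw the watch image: centered on the marker, scaled to its
    apparent size, rotated with the wrist as it moves."""
    (cx, cy), edge, angle = marker_pose(corners)
    return {"center": (cx, cy),
            "size": (edge * watch_aspect, edge),
            "rotation_deg": angle}

# A marker seen upright, 100 px wide, centered at (320, 240):
print(overlay_rect([(270, 190), (370, 190), (370, 290), (270, 290)]))
```

This is why the preview feels believable: because the scale and rotation are re-derived from the strip every frame, the rendered watch moves and resizes with the wrist rather than sitting at a fixed spot on screen.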

In online retail, the fastest way to reduce hesitation is to replace abstract product specs with a visual proof the shopper can control.

The real question is whether you can turn “how will this look on me?” into a live demonstration the shopper steers before they decide.

Why this feels more convincing than a static product shot

Most product pages show the same images to everyone. This flips the experience from passive viewing to live preview. For look-and-fit products, a live preview like this is a stronger trust-builder than piling on more static shots. Even if the rendering is simple, the feeling of personalization comes from movement and alignment, not photorealism.

Extractable takeaway: If your product is bought on look and fit, design a try-on moment that uses a behavior people already understand (webcam + holding up your wrist), then make the payoff immediate so the demo does the selling.

Stealable moves from NOOKA’s print-to-digital bridge

By a “print to digital” bridge, I mean a physical cue that unlocks or anchors a digital preview in a way the viewer can control.

  • Use a physical key. A simple strip, card, or marker makes the digital experience feel tangible and intentional.
  • Keep the interaction one-step. The user should be able to try it within seconds, not after setup friction.
  • Build for sharing. The best proof is something people can show a friend in the moment.
  • Let the demo carry the story. If it needs heavy explanation, simplify the mechanic.

A few fast answers before you act

What is the NOOKA Augmented Reality Accessorizer?

It is an augmented reality try-on concept where a coded paper wrist strip and a webcam let a shopper preview a NOOKA watch aligned to their wrist in real time.

Why does a paper strip matter in an AR try-on?

It provides a consistent reference point for positioning and scale, and it makes the experience feel like a “real” object-assisted try-on rather than a random filter.

What makes this useful for e-commerce?

It reduces uncertainty about appearance and proportion. The shopper can see the watch on a wrist-sized reference and judge the look before buying.

What is one practical lesson to apply without AR?

Use a simple physical reference or on-screen guide that anchors scale and positioning, then let the shopper control the view quickly so the proof feels personal.

What is the main limitation of this type of approach?

It can show appearance and rough scale, but it cannot fully replicate comfort, weight, or how a strap feels. It works best as a confidence booster, not a perfect substitute for trying it on.