Ford C-MAX Augmented Reality

A shopper walks past a JCDecaux Innovate mall “six-sheet” screen and stops. Instead of watching a looped video, they raise their hands and the Ford Grand C-MAX responds. They spin the car 360 degrees, open the doors, fold the seats flat, and flip through feature demos like Active Park Assist. No printed marker. No “scan this” prompt. Just gesture and immediate feedback.

What makes this outdoor AR execution different

This is where augmented reality in advertising moves from a cool, branded desktop experience to a marker-less, educational interaction in public space. The campaign, created by Ogilvy & Mather with London production partner Grand Visual, runs on JCDecaux Innovate’s mall digital screens in UK shopping centres and invites passers-by to explore the product, not just admire it.

The interaction model, in plain terms

Instead of asking people to download an app or scan a code, the screen behaves like a “walk-up showroom.”

  • Hands up. The interface recognises the user and their gestures.
  • Virtual buttons. On-screen controls let people change colour, open doors, fold seats, rotate the car, and trigger feature demos.
  • Learning by doing. The experience is less about spectacle and more about understanding what the 7-seat Grand C-MAX offers in a few seconds.

How the marker-less AR works here

The technical leap is the move away from printed markers or symbols as the anchor for interaction. The interface is based on natural movement and hand gestures, so any passer-by can start immediately without instructions.

Under the hood, a Panasonic D-Imager camera measures real-time spatial depth, and Inition’s augmented reality software merges the live footage with a 3D, photo-real model of the Grand C-MAX on screen.
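The principle is easy to sketch even though the production pipeline is proprietary: in a depth image, a hand raised toward the screen is simply a cluster of pixels closer than everything else. Below is a minimal sketch, assuming a hypothetical DepthFrame source and invented distance thresholds; it illustrates depth gating, not Inition’s actual code.

```typescript
// A minimal sketch of depth-gated hand detection, assuming a
// hypothetical depth-frame source. The real D-Imager/Inition
// pipeline is proprietary; all thresholds here are invented.

type DepthFrame = { width: number; height: number; mm: Uint16Array }; // depth per pixel, in millimetres

const MIN_VALID_MM = 300;    // below this, treat as sensor noise (assumed)
const HAND_MAX_MM = 1200;    // a raised hand sits closer than the body (assumed)
const MIN_HAND_PIXELS = 400; // ignore small noise blobs (assumed)

// Returns the centroid of near-field pixels, or null if no hand is raised.
// The centroid can then drive a cursor over the on-screen controls.
function detectHand(frame: DepthFrame): { x: number; y: number } | null {
  let count = 0, sumX = 0, sumY = 0;
  for (let y = 0; y < frame.height; y++) {
    for (let x = 0; x < frame.width; x++) {
      const d = frame.mm[y * frame.width + x];
      if (d >= MIN_VALID_MM && d <= HAND_MAX_MM) {
        count++;
        sumX += x;
        sumY += y;
      }
    }
  }
  return count >= MIN_HAND_PIXELS ? { x: sumX / count, y: sumY / count } : null;
}
```

Hit-testing that centroid against the virtual buttons is what turns “hands up” into a working control surface with no marker in the scene.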

In retail and out-of-home environments, interactive screens win when they eliminate setup friction and teach the product in seconds.

Why this matters for outdoor digital

If you care about outdoor and retail-media screens as more than “digital posters,” this is a strong pattern:

  • Lower friction beats novelty. The magic is not AR itself. The magic is that the user does not need to learn anything first.
  • Gesture makes the screen feel “alive.” The moment the passer-by sees the car respond, the display stops being media and becomes a product interface.
  • Education scales in public space. Showing how seats fold, how doors open, or what a feature demo looks like is hard to compress into a static ad. Interaction solves that.

Practical takeaways if you want to build something like this

  • Design for instant comprehension. Assume 3 seconds of attention before you earn more. Lead with one obvious gesture and one obvious payoff.
  • Keep the control set small. Colour, rotate, open, fold. A few high-value actions beat a deep menu.
  • Treat it like product UX, not campaign UX. The success metric is “did I understand the car better,” not “did I watch longer.”
  • Instrument it. Track starts, completions, feature selections, and drop-offs. Outdoor can behave like a funnel if you design it that way (see the sketch after this list).
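On the instrumentation point, the logging itself can be tiny. Here is a sketch with hypothetical event names and a placeholder /events collector; none of this comes from the original installation.

```typescript
// Sketch of funnel instrumentation for a walk-up screen. Event names
// and the /events endpoint are placeholders.

type FunnelEvent =
  | { type: "session_start"; ts: number }
  | { type: "feature_selected"; ts: number; feature: string } // e.g. "rotate", "fold_seats"
  | { type: "session_complete"; ts: number }
  | { type: "drop_off"; ts: number; lastFeature?: string };

const buffer: FunnelEvent[] = [];

function track(ev: FunnelEvent): void {
  buffer.push(ev);
  if (buffer.length >= 20) flush(); // batch to keep network chatter low
}

function flush(): void {
  if (buffer.length === 0) return;
  fetch("/events", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buffer.splice(0)), // splice(0) empties the buffer
  }).catch(() => {
    // Public screens drop offline; a real deployment needs retry/persist logic.
  });
}

// Usage: track({ type: "feature_selected", ts: Date.now(), feature: "fold_seats" });
```

Even this much is enough to compare starts against completions per screen, which is the funnel view most outdoor campaigns never get.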

A few fast answers before you act

What is the core innovation here?

Marker-less, gesture-driven AR on mall digital screens that lets passers-by explore product features without scanning a code or using a printed marker.

What does the user actually do?

They raise their hands to start, then use on-screen controls to change colour, open doors, fold seats, rotate the car, and trigger feature demos like Active Park Assist.

What technology enables it?

A depth-imaging camera measures real-time spatial depth, and AR software merges live footage with a 3D model of the vehicle.

Why does “marker-less” matter in public spaces?

Because it removes setup friction. Anyone walking by can immediately interact through natural movement and gestures.

Black Eyed Peas: BEP360 AR music video

In 2011, the smartest artists are starting to behave like brands. Not only by releasing content, but by building experiences around it that fans can actually play with.

BEP360 is a strong example of that thinking. It packages a 360-degree, motion-controlled music video experience around The Black Eyed Peas, designed for iPhone, iPad, and iPod touch.

The core mechanic is simple. You move your device, and the camera view moves with you, giving you viewer control inside the scene. On top of that, BEP360 includes an augmented reality layer triggered by pointing the iPhone camera at the album cover for The Beginning, plus a virtual photo session feature that lets fans stage shots with the band and share them.
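To make the core mechanic concrete, here is how device motion can map to a view inside a 360-degree scene. This is a browser-flavoured sketch using the standard DeviceOrientationEvent; BEP360 itself was a native iOS app, so treat the mapping, angles, and frame size as assumptions.

```typescript
// Sketch of "move the device, move the view" using the browser's
// standard DeviceOrientationEvent. BEP360 was native iOS; this only
// illustrates the orientation-to-viewport mapping.

let yawDeg = 0;   // look left/right
let pitchDeg = 0; // look up/down

window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
  if (e.alpha !== null) yawDeg = e.alpha;                                   // rotation about the vertical axis
  if (e.beta !== null) pitchDeg = Math.max(-90, Math.min(90, e.beta - 90)); // 0 = phone held upright
  renderView(yawDeg, pitchDeg);
});

// Map the viewing angles to a crop of an equirectangular 360 frame.
function renderView(yaw: number, pitch: number): void {
  const frameW = 4096, frameH = 2048;                       // assumed source resolution
  const cx = ((((yaw % 360) + 360) % 360) / 360) * frameW;  // horizontal centre of the crop
  const cy = ((pitch + 90) / 180) * frameH;                 // vertical centre of the crop
  // Draw the region around (cx, cy) onto a canvas or WebGL texture here.
}
```

The design point is that orientation, not touch, is the primary input, which is what makes the replay loop feel physical rather than like scrubbing a timeline.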

In global entertainment marketing, app-based experiences are becoming a practical way to deepen fandom between releases and justify paid content with participation.

It is also an early signal of where “music video” can go when it is treated as a product experience rather than a clip you watch once. The app is billed as a first-of-its-kind 360-degree mobile music video, built under will.i.am’s will.i.apps banner, with augmented reality support via Metaio and 3D360 video technology referenced in early coverage.

Why this is more than a promo gimmick

The best part is the shift from passive viewing to participation. A 360-degree experience creates a reason to replay, because you cannot see everything at once. That replay value is what standard video launches rarely earn.

What the AR layer adds, and what it does not

The AR trigger is not the main event. It is a novelty layer that extends the universe into the physical world, using the album cover as the marker. The real value is the combination of interactive video plus social output. Fans can create something and share it, which keeps the campaign alive without requiring more media spend.

What to steal for your own fan-first experience

  • Give people viewer control. Control creates replay value.
  • Bundle features around one hero action. Here the hero action is “step inside the video”. Everything else supports that.
  • Use AR as an on-ramp, not the whole product. A quick wow moment is fine, but the experience must hold attention afterwards.
  • Design for sharing outputs. Photo sessions and remixable moments extend reach organically.

A few fast answers before you act

What is BEP360?

BEP360 is a Black Eyed Peas iOS app that turns a music video into an interactive 360-degree experience controlled by moving your device, with an added augmented reality layer triggered by the album cover.

What makes the music video “360-degree” in this case?

The camera perspective changes as you rotate or swing the phone, giving you control over where you look inside the scene while the track continues.

How does the augmented reality part work?

You point your iPhone camera at the cover of The Beginning, and the app overlays animated BEP characters and related content on screen.

Why does an app make sense for music marketing?

Because it can bundle interaction, social sharing, and ongoing fan content into one place. It gives people a reason to pay for the experience, not only consume a free clip.

What is the main risk with app-based fan experiences?

Friction. If downloads, device compatibility, or onboarding are annoying, the idea collapses. The experience has to deliver value within seconds.

Yellow Pages: Location-Based Banner

Here is the next generation of interactive web banners. Tel Aviv agency Shalmor Avnon Amichay/Y&R promoted the Yellow Pages augmented reality location-based app by creating a banner that does the same thing.

The banner opens your webcam and lets you see the businesses around you. Wave your hand to switch between businesses. Click a business to jump straight to its Yellow Pages listing.

A banner that behaves like the product

The clever part is that this is not “interactive” for decoration. It is a working demo of the core value proposition. If the app helps you find what is near you, the banner proves that promise immediately, inside the placement, without asking you to imagine anything.

The mechanic: webcam as context, hand wave as UI

The flow is intentionally simple. Turn on the camera. Overlay nearby business options. Use a wave to move through results. Use a click to convert curiosity into action via the listing page.
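A modern reconstruction of the wave mechanic can be as blunt as frame differencing. The sketch below is built on assumptions (thresholds, sample rate, placeholder business data), not the original banner’s code.

```typescript
// Sketch of wave-to-advance via webcam frame differencing. Thresholds,
// the sample rate, and the business list are assumptions to tune.

const video = document.createElement("video");
const canvas = document.createElement("canvas");
const ctx = canvas.getContext("2d")!;
canvas.width = 160;
canvas.height = 120; // low resolution is plenty for motion detection

const businesses = ["Cafe Noa", "Bike Repair", "Pharmacy"]; // placeholder data
let businessIndex = 0;
let prev: Uint8ClampedArray | null = null;
let lastTrigger = 0;

navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
  video.srcObject = stream;
  video.play();
  setInterval(checkMotion, 100); // ~10 samples per second
});

function checkMotion(): void {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);
  if (prev) {
    let moved = 0;
    for (let i = 0; i < data.length; i += 4) {
      if (Math.abs(data[i] - prev[i]) > 40) moved++; // compare red channel only
    }
    // A wave lights up a large share of the frame at once; the debounce
    // makes one sustained wave advance once, keeping the gesture forgiving.
    const now = Date.now();
    if (moved > (data.length / 4) * 0.15 && now - lastTrigger > 800) {
      businessIndex = (businessIndex + 1) % businesses.length;
      lastTrigger = now;
      console.log("Now showing:", businesses[businessIndex]);
    }
  }
  prev = data.slice();
}
```

Whatever detector you use, the feedback rule stays the same: the on-screen result must change the instant the gesture registers, or users will assume the unit is broken.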

In local discovery experiences, the strongest persuasion is a live, context-matched preview of usefulness rather than a feature claim.

Why it lands: it removes the “so what” gap

Most directory and local-search advertising dies in the space between promise and proof. This banner collapses that gap. You see your own context first, then you see results, then you can act. The interaction is the explanation.

The fastest way to make a utility app feel essential is to let people experience the “aha” moment before they ever leave the page they are on.

What Yellow Pages is really trying to achieve

The business intent is to reposition Yellow Pages as modern, digital, and situationally useful, not just a legacy directory brand. The banner also creates a clear performance path. Engagement inside the unit, then click-out to a listing that can drive calls, visits, or follow-on app consideration.

What to steal from this execution

  • Mirror the product in the ad. If the product is a tool, make the ad behave like the tool.
  • Use one gesture people understand. A wave as “next” is instantly legible. No tutorial needed.
  • Keep the ladder of commitment short. Preview. Browse. Click through. No extra steps.
  • Make the experience readable for bystanders. In shared environments, obvious motion plus clear on-screen change sells the mechanic.
  • Watch privacy optics. If you turn on a camera, be explicit that it is for interaction and context, not identification.

A few fast answers before you act

What is a “location-based banner”?

It is a banner ad that adapts its content to the user’s situation, typically location or environment cues, so the ad can show relevant nearby options instead of generic messaging.

How does this Yellow Pages banner work?

It opens a webcam view, overlays nearby business options, lets you wave to cycle through businesses, and lets you click a result to open the corresponding Yellow Pages listing.

Why use a webcam at all?

Because it makes the experience feel immediate and personal. The ad becomes a live “finder” interface rather than a static claim about finding things.

What makes gesture-controlled banners risky?

Friction and variability. If the gesture detection fails or is unclear, users assume the ad is broken. The interaction must be forgiving and the feedback must be instant.

What is the safest way to replicate the idea today?

Keep the mechanic to one simple input, provide clear on-screen feedback, and ensure the user can still get value even if they do not enable the camera.
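In code terms, that fallback rule is a try/catch at initialisation. A tiny sketch follows, with startGestureMode and startListMode as hypothetical stand-ins for the two paths.

```typescript
// Sketch of the graceful-fallback rule: ask for the camera, but keep
// the banner useful without it. startGestureMode and startListMode are
// hypothetical stand-ins for the two experiences.

async function initBanner(): Promise<void> {
  try {
    const stream = await navigator.mediaDevices.getUserMedia({ video: true });
    startGestureMode(stream); // wave-to-browse, as sketched earlier
  } catch {
    startListMode(); // plain clickable list of nearby businesses
  }
}

function startGestureMode(stream: MediaStream): void { /* camera-driven UI */ }
function startListMode(): void { /* static, still useful, still clickable */ }

initBanner();
```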