NIVEA Creme: Second Skin Project

A mother puts on a headset and a skin-like suit. Her son does the same, thousands of kilometres away. The promise is simple. If they cannot be together for Christmas, technology will let them feel a hug anyway.

That is the set-up in NIVEA Creme’s “Second Skin Project” with Leo Burnett Madrid. The film introduces Laura in Madrid and her son Pablo, who is away volunteering in Paraguay. They are invited to test a “Second Skin” garment, presented as a high-tech fabric designed to simulate human skin and transmit the sensation of touch at a distance, paired with virtual reality headsets.

The story then pivots. What looks like a tech demo is used to make a point about touch, not technology. The most persuasive moment is not the suit. It is the human reunion that follows, designed to underline NIVEA Creme’s belief that nothing beats skin-to-skin contact.

The “Second Skin” mechanism that pulls you in

The film borrows credibility from advanced-sounding materials and VR. That framing creates anticipation, because the viewer wants to know whether the experiment can actually work. The suit and headset are the narrative engine that earns attention for long enough to land the real message.

In global consumer brands where heritage products compete with endless alternatives, emotional proof often carries more weight than functional claims.

The real question is whether the tech is the story, or whether it is just a credible pretext for the brand to own the value of touch.

The twist that protects the brand meaning

There is a risk with tech-led emotion: the technology becomes the hero and the brand a mere sponsor. This script avoids that by using the tech as a decoy. The reveal shifts the spotlight back to the product truth. A hug is still the best “gift”, and NIVEA Creme wants to be associated with that intimacy.

Extractable takeaway: When you borrow a shiny mechanism to earn attention, make the emotional payoff explicitly restate what the brand believes, or the gadget takes the credit.

How to use “purpose + tech” without losing the human truth

  • Use technology as the hook, not the conclusion. Let it earn attention, then pay it off with a human truth.
  • Make the brand stance explicit. Here the stance is clear. Technology can be amazing, but touch matters more.
  • Cast real stakes. Distance, holidays, and family history make the outcome feel earned.
  • Keep the product role emotional, not technical. NIVEA Creme is not “the innovation”. It is the comfort cue that frames the story.

A few fast answers before you act

What is the NIVEA Creme Second Skin Project?

It is a Christmas-season film and experiment in which a mother and son test a VR-led “Second Skin” suit presented as transmitting the feeling of touch at a distance; the story then reveals the value of real human contact.

Why does the campaign use VR and a “second skin” suit?

Because it creates a believable question the audience wants answered. Can technology replicate a hug? That curiosity holds attention long enough for the campaign’s real point to land.

What is the core message NIVEA Creme is trying to own?

That skin-to-skin contact matters. The work uses technology to highlight that, even in a world of advanced tools, nothing replaces human touch.

What makes this more than a generic emotional video?

The narrative structure. It starts as a tech experiment, then pivots into a human reunion. That contrast makes the conclusion feel stronger than a straight sentimental story.

What is the biggest risk with “tech-as-story” campaigns?

Audience misattribution. People remember the gadget and forget the brand meaning. The fix is to ensure the emotional payoff clearly belongs to the brand stance, not the device.

Volkswagen Trailer Assist

The Trailer Assist feature allows Volkswagen cars to park semi-autonomously using the rear backup camera. To promote this feature in Norway, Volkswagen created a stunt where a driver appeared to back up his car and trailer at high speed through parking lots, roundabouts and intersections.

The film looks impossible on purpose. The “trailer” was built as a disguised driving rig with a stunt driver inside. One-way transparent plexiglass and film let the driver in the rig see out while selling the illusion from the outside.

What Trailer Assist is actually solving

Reversing with a trailer is where confidence collapses for many drivers. The steering feels counter-intuitive, small corrections compound fast, and stress makes it worse. Trailer Assist flips that experience by turning the job into a simpler “direction setting” task, while the system handles the tricky part of guiding the trailer’s path using the rear camera. By “direction setting,” the driver chooses where the trailer should go rather than constantly counter-steering every correction.

Why the stunt works as marketing

In automotive marketing, driver-assist features are easier to remember when the audience feels the pain before it hears the specification. Because the stunt externalizes the panic of trailer reversing at an exaggerated scale, viewers immediately understand why assistance matters before the feature is explained. This is smart feature marketing because it dramatizes the user problem first and the technology second.

Extractable takeaway: When a feature reduces a known stress point, dramatize the stress first so the assistance feels necessary rather than technical.

What Volkswagen is really demonstrating here

The real question is whether Volkswagen can turn a hidden driver-assist feature into a capability buyers instantly understand and remember. Volkswagen is not selling autonomous driving here. It is selling confidence at the exact moment many drivers feel least competent.

What to steal for tech-feature storytelling

  • Start with a strong visual proof. If the benefit is hard to explain, make it easy to see.
  • Use exaggeration to earn attention, then anchor in reality. The stunt pulls people in. The feature explanation keeps it credible.
  • Pick a scenario your audience already fears. Trailer reversing is a universal stress test.

A few fast answers before you act

What is Volkswagen Trailer Assist?

A driver-assist feature that uses the rear camera to help the driver reverse with a trailer, reducing the counter-intuitive steering challenge.

What did Volkswagen do in Norway to promote it?

They staged a stunt that made it look like a Volkswagen reversed a trailer at very high speed through real-world driving situations.

How did they create the illusion?

A disguised trailer rig with a hidden stunt driver inside made the movement possible while keeping the “reverse drive” effect believable from the outside.

Why was plexiglass part of the setup?

One-way transparent plexiglass and film allowed the driver in the rig to see out while keeping the illusion intact for onlookers and camera angles.

What is the key takeaway for marketers?

When a feature is hard to appreciate in a static demo, create a single dramatic scenario that forces attention, then connect it back to the everyday value.

Hyundai: Virtual Guide AR App for Owners

An owner’s manual you point at the car

To make life easier for car owners, Hyundai has built an augmented reality app called the Virtual Guide. It lets Hyundai owners use their smartphones to get more familiar with their car and learn how to perform basic maintenance without delving into a hundred-page owner’s manual.

Here, augmented reality means on-screen overlays that label real-world parts and show step-by-step guidance while you view the car through the phone camera.

Here is a short demo video of the app from The Verge at CES 2016.

The clever part: help appears exactly where you need it

Instead of searching through pages, you point your phone at the car and learn in context. That one shift, from reading about a feature to seeing guidance on the actual part, makes learning faster and less frustrating.

In consumer product and mobility brands, the highest-value help shows up at the moment of use, not in a document you have to hunt for.

The real question is whether your product help meets people where the problem happens, or sends them off to search.

In-context, camera-based guidance should be the default for “how do I” tasks. Manuals should be the fallback.

Why this is a big deal for everyday ownership

Most drivers do not ignore manuals because they do not care. They ignore them because the effort is too high at the moment they need help. AR lowers that effort by turning “How do I…?” into a quick visual answer while you are standing next to the car.

Extractable takeaway: If you can put guidance on the real object in front of someone, you remove the search step. That makes follow-through more likely.

What Hyundai is really building here

Fewer support moments, fewer avoidable service misunderstandings, and a smoother owner experience that strengthens trust in the brand long after purchase.

The Virtual Guide app will be available in the next month or two for the 2015 and 2016 Hyundai Sonata and will come to the rest of the Hyundai range later this year.

Patterns to borrow for product help

  • Move instruction from documentation into the environment. In-context guidance beats search.
  • Design for the real moment of need. Standing next to the product, phone in hand.
  • Make “basic maintenance” feel doable. Confidence is a retention lever.

A few fast answers before you act

What is Hyundai Virtual Guide?

An augmented reality app that helps Hyundai owners learn car features and perform basic maintenance using a smartphone instead of relying on the printed owner’s manual.

How does it work in practice?

You point your phone at parts of the car and get overlaid guidance that helps you understand features and maintenance steps in context.

Which models does the post say it supports first?

The post says it will be available first for the 2015 and 2016 Hyundai Sonata, then expand across the Hyundai range later in the year.

Where was the demo shown?

The post references a demo video from The Verge at CES 2016.