Pizza Hut: Pie Tops II

Pizza Hut is the official pizza of the NCAA, whose men’s basketball tournament, known informally as March Madness, is played each spring in the United States.

For last year’s tournament, Pizza Hut created what was billed as the world’s first shoe that could order a pizza. Now, to celebrate their second year as the official pizza of the NCAA, Pizza Hut, Droga5 and the Shoe Surgeon have launched Pie Tops II: a limited-edition high-top shoe that not only uses your geolocation to order the current Pizza Hut deal at the press of a button, but also lets you pause the game while you wait for your delivery.

A TV ad has also been released to highlight the new pause feature of the relaunched Pie Tops.

A sneaker button that behaves like a remote

The mechanism is deliberately simple. Put a single button on the shoe. Tie it to an app. Map the press to two jobs: order, then pause. The shoe becomes a physical shortcut for a very specific March Madness moment, when people want food but do not want to miss play. That works because it removes friction at the exact moment attention is highest.
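
The "one press, two jobs" mechanism can be sketched in a few lines. This is purely illustrative: none of the names below (`order_current_deal`, `pause_stream`, `on_button_press`) are Pizza Hut's real API; they only show the pattern of a single physical action fanning out to two outcomes.

```python
# Hypothetical sketch of one button press mapped to two jobs.
from dataclasses import dataclass


@dataclass
class Location:
    lat: float
    lon: float


def order_current_deal(loc: Location) -> str:
    # Stand-in for a geolocated ordering call to the nearest store.
    return f"order placed near ({loc.lat}, {loc.lon})"


def pause_stream() -> str:
    # Stand-in for the second-screen pause command.
    return "stream paused"


def on_button_press(loc: Location) -> list[str]:
    # One physical action, two outcomes: utility first, delight second.
    return [order_current_deal(loc), pause_stream()]


print(on_button_press(Location(40.7, -74.0)))
```

The design choice worth noting is that the button carries no options at all: everything contextual (location, current deal) is resolved by the app, which is what keeps the physical interaction frictionless.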

In second-screen sports viewing, the strongest interactions reduce interruption while keeping attention on the live game.

Why it lands on game day

Pie Tops II works because it converts a familiar tension into a prop. Hunger versus attention. Convenience versus FOMO. The “pause” feature turns a delivery problem into a punchline, and the shoe format makes the whole thing instantly tellable.

Extractable takeaway: If you can turn a high-frequency habit into a one-action ritual, you make the brand feel like part of the event, not just an ad around it.

The real intent behind the novelty

This is not really about footwear. The real question is how Pizza Hut earns a place inside the live ritual instead of advertising around it. It is about owning a behavior loop during March Madness. By behavior loop here, I mean a repeatable sequence of trigger, action, and reward that keeps the brand attached to the moment: pizza ordering, deal recall, and a reason to talk about Pizza Hut in the same breath as the game. The smart move here is not the gadget but the way it turns brand utility into event behavior. Limited-edition scarcity does the rest, because it makes the product itself a piece of shareable culture.

What brands can steal from Pie Tops II

  • Pick one moment to own: design for a specific tension that happens repeatedly during an event, not for “sports fans” in general.
  • One control, two outcomes: a single action that triggers both utility and delight is more memorable than a complex feature list.
  • Make the object do the storytelling: the product should explain the campaign in one sentence, even without a logo.
  • Build viewer control into the idea: letting people keep the game in their hands makes the brand feel helpful, not interruptive.
  • Scarcity as distribution: limited runs can function like media spend when the object is inherently talkable.

A few fast answers before you act

What are Pie Tops II?

They are limited-edition Pizza Hut sneakers designed for March Madness that let you order pizza via a button press and, as described, pause the game while you wait for delivery.

What problem is this campaign solving?

It dramatizes a familiar game-day problem. People want food without missing play. The stunt turns that tension into a memorable product feature and a shareable story.

Why does the “pause” feature matter more than the pizza-ordering feature?

Ordering is convenient. Pausing is emotionally resonant because it speaks directly to FOMO during live sports. It is the twist that makes the idea travel.

Is this wearable tech or brand entertainment?

It is primarily brand entertainment packaged as a functional shortcut. The utility makes it credible. The novelty makes it worth talking about.

What is the reusable pattern for other brands?

Create a physical or tactile shortcut for a high-frequency moment. Keep the interaction to one obvious action. Then tie it to an event where people already have strong emotions and repeat behaviors.

NIVEA Creme: Second Skin Project

A mother puts on a headset and a skin-like suit. Her son does the same, thousands of kilometres away. The promise is simple. If they cannot be together for Christmas, technology will let them feel a hug anyway.

That is the set-up in NIVEA Creme’s “Second Skin Project” with Leo Burnett Madrid. The film introduces Laura in Madrid and her son Pablo, who is away volunteering in Paraguay. They are invited to test a “Second Skin” garment that is presented as a high-tech fabric designed to simulate human skin and transmit the sensation of touch at distance, paired with virtual reality headsets.

The story then pivots. What looks like a tech demo is used to make a point about touch, not technology. The most persuasive moment is not the suit. It is the human reunion that follows, designed to underline NIVEA Creme’s belief that nothing beats skin-to-skin contact.

The “Second Skin” mechanism that pulls you in

The film borrows credibility from advanced-sounding materials and VR. That framing creates anticipation, because the viewer wants to know whether the experiment can actually work. The suit and headset are the narrative engine that earns attention for long enough to land the real message.

In global consumer brands where heritage products compete with endless alternatives, emotional proof often carries more weight than functional claims.

The real question is whether the tech is the story, or whether it is just a credible pretext for the brand to own the value of touch.

The twist that protects the brand meaning

There is a risk with tech-led emotion: the technology becomes the hero and the brand becomes a mere sponsor. This script avoids that by using the tech as a decoy. The reveal shifts the spotlight back to the product truth. A hug is still the best “gift”, and NIVEA Creme wants to be associated with that intimacy.

Extractable takeaway: When you borrow a shiny mechanism to earn attention, make the emotional payoff explicitly restate what the brand believes, or the gadget takes the credit.

How to use “purpose + tech” without losing the human truth

  • Use technology as the hook, not the conclusion. Let it earn attention, then pay it off with a human truth.
  • Make the brand stance explicit. Here the stance is clear. Technology can be amazing, but touch matters more.
  • Cast real stakes. Distance, holidays, and family history make the outcome feel earned.
  • Keep the product role emotional, not technical. NIVEA Creme is not “the innovation”. It is the comfort cue that frames the story.

A few fast answers before you act

What is the NIVEA Creme Second Skin Project?

It is a Christmas-season film and experiment setup where a mother and son test a VR-led “Second Skin” suit that is presented as transmitting the feeling of touch at distance, then the story reveals the value of real human contact.

Why does the campaign use VR and a “second skin” suit?

Because it creates a believable question the audience wants answered. Can technology replicate a hug? That curiosity holds attention long enough for the campaign’s real point to land.

What is the core message NIVEA Creme is trying to own?

That skin-to-skin contact matters. The work uses technology to highlight that, even in a world of advanced tools, nothing replaces human touch.

What makes this more than a generic emotional video?

The narrative structure. It starts as a tech experiment, then pivots into a human reunion. That contrast makes the conclusion feel stronger than a straight sentimental story.

What is the biggest risk with “tech-as-story” campaigns?

Audience misattribution. People remember the gadget and forget the brand meaning. The fix is to ensure the emotional payoff clearly belongs to the brand stance, not the device.

Project Soli: Hands Become the Interface

Google ATAP builds what people actually use

Google ATAP is tasked with creating cool new things that we’ll all actually use. At the recently concluded Google I/O event, the team showcased Project Soli: a new kind of wearable technology that wants to make your hands and fingers the only user interface you’ll ever need.

This is not touchless interaction as a gimmick. It is a rethink of interface itself. Your gestures become input. Your hands become the control surface.

The breakthrough is radar, not cameras

To make this possible, Project Soli uses a radar chip small enough to fit into a wearable like a smartwatch.

The small radar picks up movements in real time and interprets how gestures alter its signal. This enables precise motion sensing without relying on cameras or fixed environmental conditions.
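
As a rough illustration of that idea, gesture interpretation can be framed as mapping radar-derived signal features, such as Doppler shift and range, to a small set of labels. This is not Soli's actual pipeline (which applies machine learning to raw radar returns); the feature names and thresholds below are invented for the sketch.

```python
# Illustrative sketch: map simple radar-derived features to gesture labels.
# Thresholds are arbitrary placeholders, not real sensor values.

def classify_gesture(doppler_hz: float, range_mm: float) -> str:
    """Interpret how a gesture altered the radar signal."""
    if abs(doppler_hz) < 5:
        return "idle"      # little motion energy in the return
    if range_mm < 30 and doppler_hz > 0:
        return "tap"       # fast approach very close to the sensor
    if range_mm < 30 and doppler_hz < 0:
        return "release"   # fast retreat very close to the sensor
    return "swipe"         # larger-range lateral motion


print(classify_gesture(doppler_hz=12.0, range_mm=20.0))
```

The point of the sketch is that the sensor never needs to "see" the hand: everything is inferred from how motion perturbs the signal, which is why lighting and line-of-sight constraints fall away.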

In wearable computing and ambient interfaces, the real unlock is interaction that works in motion, without relying on tiny screens.

The real question is whether wearables can move beyond miniaturized apps and make interaction work in motion, without a screen-first mindset.

The implication is straightforward. Interaction moves from screens to motion. User interfaces become something you do, not something you tap.

Why this matters for wearable tech

Wearables struggle when they copy the smartphone model onto tiny screens. Wearable UX should treat the screen as optional, not primary.

Extractable takeaway: When the screen becomes the bottleneck, shift the interface to sensing and interpretation, then keep the gesture vocabulary small enough to learn fast.

Instead of shrinking interfaces, it removes them. The wearable becomes a sensor-driven layer that listens to intent through movement.

If this approach scales, it changes what wearable interaction can be. Less screen dependency. More natural control. Faster micro-interactions.

What Soli teaches about hands-first UX

  • Start with intent, not UI. Define the handful of moments where a gesture is faster than hunting for a screen.
  • Design for motion. Favor interactions that work while walking, commuting, or doing something else with your attention.
  • Keep the gesture set teachable. A small, consistent vocabulary beats a large library that nobody remembers.
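
The bullets above can be made concrete with a tiny dispatcher. The gesture and action names here are hypothetical; the sketch only shows what a deliberately small, teachable vocabulary looks like in code.

```python
# A deliberately small gesture vocabulary: easy to learn, easy to remember.
GESTURE_ACTIONS = {
    "tap": "select",
    "swipe": "next item",
    "dial": "adjust volume",
}


def handle(gesture: str) -> str:
    # Unknown gestures are ignored rather than added, keeping the
    # vocabulary from growing into a library nobody remembers.
    return GESTURE_ACTIONS.get(gesture, "ignored")


print(handle("dial"))
print(handle("shake"))
```

Capping the mapping at a handful of entries is the design decision: every new gesture adds learning cost, so each one should earn its place by being faster than reaching for a screen.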

A few fast answers before you act

Is Project Soli just gesture control?

It is gesture control powered by a radar sensor small enough for wearables, designed to make hands and fingers the primary interface.

Why use radar instead of cameras?

Radar can sense fine motion without relying on lighting, framing, or line-of-sight in the same way camera-based systems do.

What is the real promise here?

Interfaces that disappear. Interaction becomes physical, immediate, and wearable-friendly.

What should a product team prototype first?

Pick one high-frequency moment where a quick gesture could replace a screen tap, and test whether the sensing feels reliable in motion.

What is the biggest adoption risk?

If gestures feel inconsistent or hard to learn, people will default back to the screen. The bar is effortless, not novel.