A Can Size for Every Aussie

Kraft launches four new sizes of Heinz baked bean cans with a three-minute “life narrative” film. It follows Geoff, a man addicted to beans, and his future wife, whom he meets in the spaghetti department. The story builds to the punchline: Geoff “invents” a range of can sizes that feels perfect for different Australian occasions.

The creative choice is doing a lot of work. It turns something that is normally functional and forgettable, pack size, into a character-driven narrative that is easy to watch and easy to remember.

The insight behind the pack strategy

In 2016, Kraft commissions consumer and shopper research to understand how Australians use Heinz beans and spaghetti. The key finding is straightforward. People want ideal can sizes that suit different occasions.

Four sizes is not “more choice” for its own sake. It is a response to a usage reality. One household does not always need the same portion format.

Why a film is the right container for a packaging story

Packaging benefits can sound like rational product copy. This film makes the point emotionally, then lands it practically.

Extractable takeaway: When the product change is useful but easy to ignore, story can turn the format logic into something people can retell.

In FMCG portfolios, format expansion only scales when shoppers can instantly see why each variant exists.

This is the right strategic move because the job is not to announce four SKUs. It is to make each size feel like an intuitive answer to a real usage moment, so the portfolio looks helpful instead of bloated.

The real question is whether the audience immediately understands why more pack formats improve everyday use.

The narrative format also solves a distribution problem. It gives the campaign a reason to be watched and shared even by people who do not currently care about can sizes.

What to steal if you are launching format variants

  • Start with a concrete usage insight, not a portfolio decision.
  • Give the variant story a memorable mental model. Here, “a can size for every occasion.”
  • Use entertainment to earn attention. Then let the product logic feel obvious, not forced.

A few fast answers before you act

What is being launched here?

Four new sizes of Heinz baked bean cans.

What insight drives the launch?

Kraft’s research shows Australians are looking for ideal can sizes to suit different occasions.

How is the launch communicated?

Through a three-minute life narrative film featuring Geoff and his future wife in the spaghetti department.

What is the core marketing technique?

Use story to make a functional packaging benefit feel human, memorable, and worth sharing.

Why not just announce the new sizes directly?

Because the film helps the audience feel the usefulness of the size range, rather than processing it as a dry packaging update.

Oakley: Pro Vision with Google Cardboard

When you picture a virtual reality (VR) headset, you probably imagine something high-tech and far too expensive to feel practical. Google Cardboard takes that assumption and flips it by turning a simple cardboard cutout into a phone-powered VR viewer.

Oakley borrows that logic and puts it exactly where people already accept cardboard: the packaging. Instead of being thrown away, the box becomes the device that unlocks the experience.

Packaging that turns into a VR product

Google launched Google Cardboard as a cardboard cutout that turns Android phones into a VR headset. Oakley integrates that fold-and-slot concept into its sunglass packaging, so customers can transform the pack into a viewer and use their phone to access 360-degree content.

The payoff is described as a “you are there” look at extreme sports like surfing, skiing, mountain biking, skateboarding, and skydiving. It is less about specs and more about perspective.

In consumer product marketing, converting packaging from waste into a usable experience can create perceived value without adding new components.

Why this lands for an action-sports brand

This works because the medium matches the promise. Oakley is not only showing extreme sports. It is letting you look from inside the moment, using viewer control to make the content feel personal. The “VR made from packaging” twist also creates a good kind of surprise. The customer discovers the brand added value where they expected disposal.

Extractable takeaway: If your story is about immersion or perspective, build the experience trigger into something the customer already touches, then let the first interaction deliver the benefit before they read any explanation.

The commercial intent underneath

This is a purchase-adjacent experience. It turns the post-purchase moment into brand time, and it extends the product narrative beyond the sunglasses themselves. The packaging becomes a bridge between retail and content, with the customer doing the assembly that makes the story memorable.

The real question is whether the packaging can turn post-purchase curiosity into a usable brand experience, not whether it can imitate premium VR hardware.

What to steal from packaging-led immersion

  • Reuse an accepted “throwaway” material. If it is already in hand, it is frictionless distribution.
  • Make the first use obvious. Assembly and activation should be legible without instructions.
  • Match the experience to brand territory. Immersive POV content fits performance and extreme sports.
  • Design for sharing. If it looks clever on camera, people will demonstrate it for you.

A few fast answers before you act

What is Oakley Pro Vision in this context?

It is a packaging-led idea where an Oakley box folds into a Google Cardboard-style VR viewer, using a phone to deliver 360-degree extreme sports content.

Why use Google Cardboard instead of a dedicated headset?

Because it lowers cost and setup. A phone plus folded cardboard is enough to deliver an immersive experience without asking people to buy new hardware.

What does 360-degree content add versus normal video?

It gives viewer control over where to look, which increases the sense of presence and makes the experience feel closer to a real point of view.

Where does the marketing value come from?

From turning packaging into a reusable object and extending brand time after purchase, while linking the product to high-adrenaline moments people want to feel.

What is the main failure mode with this pattern?

If the fold, fit, or onboarding is unclear, people will not assemble it. The physical usability has to be as strong as the content.

iBeacons: Context as the Interface

From proximity to context

iBeacons introduce a simple but powerful idea. The physical world can trigger digital behavior.

A smartphone does not need to be opened. A user does not need to search. The environment itself becomes the signal.

At their core, iBeacons enable proximity-based awareness. When a device enters a defined physical range, a predefined digital action can occur. That action may be a notification, a content change, or a service trigger.

The evolution is not about distance. It is about context.

What iBeacons enable

iBeacons are small Bluetooth Low Energy transmitters. They broadcast an identifier. Nearby devices interpret that signal and respond based on predefined rules.

This creates a new interaction model. Digital systems respond to where someone is, not just what they click. Because that location signal arrives before a click, the system can reduce friction by pre-loading the most relevant content or service for that moment.

Retail stores, public spaces, machines, and even wearable objects become programmable environments. The physical location is no longer passive. It actively participates in the experience.

Why proximity alone is not the breakthrough

Early use cases focus heavily on messaging. Push notifications triggered by presence. Alerts sent when someone enters a zone.

That framing misses the point.

The real value emerges when proximity is combined with intent, permission, and relevance. Without those elements, proximity quickly becomes noise.

iBeacons are not a messaging channel. They are an input layer. Here, “input layer” means a reliable real-world signal that can change digital content or services without requiring a click.

The real question is whether proximity removes a step for the user, or just adds another interruption.

In global retail and consumer-brand environments, iBeacons work best when they connect physical moments to consented digital help at the point of need.

From messaging to contextual experience design

As iBeacon use matures, the focus shifts away from alerts and toward experience orchestration.

Instead of asking “What message do we send here?”, the better question becomes “What should adapt automatically in this moment?”

This is where real-world examples start to matter.

Example 1. When a vending machine becomes a brand touchpoint

The SnackBall Machine demonstrates how iBeacons can turn a physical object into an interactive experience.

Developed for the pet food brand GranataPet in collaboration with agency MRM / McCann Germany, the machine uses iBeacon technology to connect the physical snack dispenser with a digital layer.

The interaction is not about pushing ads. It is about extending the brand experience beyond packaging and into a moment of engagement. The machine becomes a contextual interface, meaning the object itself selects the right digital behavior when someone is present. Presence triggers relevance.

This is iBeacon thinking applied correctly. Not interruption, but augmentation.

Example 2. When wearables make context portable

The Tzukuri iBeacon Glasses, created by Australian company Tzukuri, take the concept one step further.

Instead of fixing context to a location, the context moves with the person.

The glasses interact with nearby beacons and surfaces, enabling hands-free, glance-based, context-aware information. The interface does not demand attention. It integrates into the wearer’s field of view.

This example highlights a critical shift. iBeacons are not limited to phones. They are part of a broader ambient computing layer. Here, “ambient computing layer” means computing embedded in objects and surroundings that responds without demanding a screen-first interaction.

Modern product and experience design is slowly replacing “screen” with “context” as the interface.

Why these examples matter

Both examples share a common pattern.

Extractable takeaway: Treat proximity as a signal to adapt the service in the moment. If it does not reduce friction or increase clarity, it is not context. It is noise.

The user is not asked to do more. The system adapts instead.

The technology fades into the background. The experience becomes situational, timely, and relevant.

That is the real evolution of iBeacons. Not scale, but subtlety.

The real evolution. Invisible interaction

The most important step in the evolution of iBeacons is not adoption. It is disappearance.

The more successful the system becomes, the less visible it feels. No explicit action. No conscious trigger. Just relevance at the right moment.

This aligns with a broader shift in digital design. Interfaces recede. Context takes over. Technology becomes ambient rather than demanding.

Why iBeacons are an early signal, not the end state

iBeacons are not the final form of contextual computing. They are an early, pragmatic implementation.

They prove that location can be a reliable input. They expose the limits of interruption-based design. They push organizations to think in terms of environments rather than channels.

What evolves next builds on the same principle. Context first. Interface second.

Practical rules for context-first experiences

  • Start with the moment, not the message. Define what should adapt automatically when someone is present, before deciding what to notify.
  • Proximity is an input, not a channel. Use beacon signals to change content, offers, or service steps. Do not treat them as another push pipeline.
  • Permission and intent are part of the design. Make opt-in explicit and only trigger actions that match why the user is there.
  • Optimize for invisibility. The best beacon experience feels like the environment helping, not marketing interrupting.
  • Measure behavior change. Track whether friction drops and tasks complete faster, not whether notifications were opened.
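The rules above can be folded into one trigger guard plus one behavior metric. A sketch, with illustrative names and thresholds:

```python
# Sketch turning the rules above into a trigger guard and a behavior metric.
# Function names and thresholds are illustrative assumptions.

def should_trigger(opted_in: bool, matches_intent: bool, steps_saved: int) -> bool:
    """Fire only when presence is consented, relevant, and removes work."""
    if not opted_in:           # permission is part of the design
        return False
    if not matches_intent:     # proximity must match why the user is there
        return False
    return steps_saved > 0     # if nothing gets easier, it is noise

def friction_drop(before_secs: float, after_secs: float) -> float:
    """Fraction of task time removed: measure behavior change, not opens."""
    return (before_secs - after_secs) / before_secs

print(should_trigger(True, True, 2))   # True
print(friction_drop(120.0, 90.0))      # 0.25
```

The guard encodes the checklist; the metric keeps the team honest about whether the experience actually removed a step.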

A few fast answers before you act

What are iBeacons in simple terms?

iBeacons are small Bluetooth Low Energy transmitters that let phones detect proximity to a location or object and trigger a specific experience based on that context.

Do iBeacons automatically track people?

No. The experience usually depends on app presence and permissions. Good implementations make opt-in clear and use proximity as a trigger, not as silent surveillance.

What is the core mechanism marketers should understand?

Proximity becomes an input. When someone is near a shelf, a door, or a counter, the system can change what content or actions are offered, because the context is known.

What makes a beacon experience actually work?

Relevance and timing. The action has to match the moment and reduce friction. If it feels like random messaging, it fails.

What is the main takeaway?

Design the experience around the place, not the screen. Use context to simplify choices and help people complete a task, then measure behavior change, not opens.