Flair: Fashiontag

Women are constantly looking for wardrobe inspiration, and most of the time they find it by looking at other women.

This inspired the agency Duval Guillaume to create the Flair Fashiontag Facebook app for the Belgian women’s magazine Flair. In the app, instead of tagging people, you tag their clothes or accessories and ask where they got them.

All fashiontags are displayed in a Facebook gallery, and the best are published in the weekly edition of Flair, creating constant interaction between the Facebook app and the magazine itself.

Turning social curiosity into a repeatable format

The mechanism is a simple swap. Replace social tagging of people with social tagging of products. A photo becomes a shoppable question. The owner of the outfit becomes the source. The magazine becomes the curator that elevates the best finds from feed to print.

In fashion and lifestyle publishing, converting casual “where did you get that” moments into a structured loop is a practical way to keep community activity and editorial output feeding each other.
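The tag → answer → curation loop described above can be sketched as a minimal data model. All class and field names here are hypothetical illustrations, not Flair’s actual implementation; the sketch only shows the mechanic of collecting item tags and elevating the best answered ones.

```python
# Minimal sketch of the tag -> answer -> curation loop.
# All names are hypothetical, not Flair's actual data model.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FashionTag:
    photo_id: str
    item: str                     # e.g. "red scarf"
    question: str                 # "where did you get that?"
    answer: Optional[str] = None  # filled in by the photo's owner
    likes: int = 0

@dataclass
class Gallery:
    tags: List[FashionTag] = field(default_factory=list)

    def add(self, tag: FashionTag) -> None:
        self.tags.append(tag)

    def curate(self, n: int = 3) -> List[FashionTag]:
        """Editorial step: pick the top-n *answered* tags for the print issue."""
        answered = [t for t in self.tags if t.answer]
        return sorted(answered, key=lambda t: t.likes, reverse=True)[:n]

gallery = Gallery()
gallery.add(FashionTag("p1", "red scarf", "Where is this from?", "Zara", likes=12))
gallery.add(FashionTag("p2", "boots", "Brand?", likes=40))           # unanswered
gallery.add(FashionTag("p3", "tote bag", "Source?", "local market", likes=7))

for tag in gallery.curate(n=2):
    print(f"{tag.item} -> {tag.answer}")
```

Note the design choice in `curate`: only answered tags qualify for print, which is what turns helpfulness (answering) into the prize-winning behavior.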

Why it lands

This works because it formalizes a behavior that already exists. People already look at outfits, notice details, and ask friends for sources. Fashiontag simply gives that behavior a native interface and a public gallery, then adds a prestige layer by featuring the best tags in the weekly magazine.

Extractable takeaway: If your audience already asks each other for product sources, build a lightweight format that captures those questions in the moment and rewards the best contributions with visible amplification.

What to steal

  • Swap the object of attention: tag the item, not the person, when product discovery is the real intent.
  • Close the loop with curation: a gallery is useful. Editorial selection makes it aspirational.
  • Make participation low-friction: one tag, one question, one shareable output.
  • Bridge channels on purpose: use print, site, and social as a single system, not separate campaigns.
  • Protect the social contract: ensure the person in the photo is comfortable with tagging and featuring, especially when content moves into a magazine.

A few fast answers before you act

What is Flair Fashiontag?

It is a Facebook app for Flair magazine that lets users tag clothes or accessories in photos and ask where those items were purchased.

What makes it different from normal photo tagging?

Normal tagging identifies people. Fashiontag identifies items. It turns fashion curiosity into a structured question-and-answer interaction.

How does the magazine benefit from the Facebook app?

The app creates a steady flow of wardrobe inspiration and real questions from readers. The magazine then curates and publishes the best tags, which reinforces participation.

Why is this a strong community mechanic?

Because it rewards helpfulness. People contribute sources and recommendations, and the gallery plus print selection turns that help into recognition.

What is the biggest risk in this format?

Consent and comfort. Tagging items in someone’s photo can feel intrusive if the person did not opt in, especially if content can be featured publicly in print.

Corning: A Day Made of Glass

Here is a future-vision video from Corning on where it sees multi-touch digital displays heading over the next few years. Multi-touch means the surface can track several fingers or hands at once, so gestures like pinch, rotate, and shared interaction become natural.

What the film is really demonstrating

The core mechanic is simple. Turn glass from “protective cover” into “primary interface”. Every surface becomes a screen. Every screen becomes responsive to direct manipulation. Information follows you across contexts, from home to school to office, with the same touch-first language.

In consumer electronics and workplace IT, concept films like this are used to align designers, suppliers, and product teams around a shared interface direction.

Why it lands

It removes the usual friction between people and devices. No boot-up rituals, no “find the remote,” no hunting through menus. You touch the thing you want to change, and the system answers in place. That immediacy is the real promise, not the glass itself.

Extractable takeaway: When you are pitching a new interface paradigm, show behavior before hardware. Make the gestures, feedback loops, and handoffs between screens unmistakable, so the idea remains valuable even if the materials and form factors change.

What to steal for your own work

  • Design the interaction language first. Define the small set of gestures and responses that can travel across surfaces, sizes, and contexts.
  • Keep information anchored to the object or task. The winning moments happen when data appears exactly where the decision is being made.
  • Plan for multi-user moments. Big surfaces invite collaboration. Design for two people at the same time, not just one user plus spectators.
  • Prototype the “seams.” The handoff between phone, table, wall, and car is where most visions break. That is the first place to test.

A few fast answers before you act

What is “A Day Made of Glass” trying to communicate?

It is a vision of glass becoming an interactive medium, where touch-first displays move from dedicated devices into everyday surfaces.

What’s the practical value of watching concept videos like this?

They are useful for spotting interface patterns early, then translating the patterns into near-term prototypes and roadmap language for teams and partners.

What’s the biggest product risk in “glass everywhere” thinking?

Over-indexing on the surface and under-investing in the interaction model. If the gestures, feedback, and context switching are weak, the material does not matter.

What is one immediate takeaway a UX or product team can apply?

Write a short “interaction grammar” for your experience, then test it across at least two form factors. If the grammar does not travel, the concept will not scale.

Volkswagen Norway: Test drive in a print ad

You open a magazine and see a long, empty road. Then you hover an iPhone over the printed page and a Volkswagen appears to “drive” along that road on your screen. It is a test drive that happens inside a print ad, with summer and winter road versions depending on the magazine insert.

Volkswagen Norway builds this as a hybrid print and mobile experience. Readers are prompted to download an app, developed by Mobiento, that turns the printed road into a track. The phone becomes the controller and the page becomes the environment. The payoff is simple: the viewer is in control. You move the phone, and the car moves with you.

An augmented reality print ad is a piece of print that a camera can recognize as a trigger. Once recognized, an app overlays a digital layer onto the page, anchored to the printed design so the interaction feels connected to the physical medium.
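The recognition step above can be illustrated with a toy sketch. Real AR apps (including, presumably, the Mobiento app) use robust feature matching against the printed design; the average-hash comparison below is a deliberately simplified stand-in that shows the same recognize-the-trigger idea in a few lines.

```python
# Toy illustration of print-trigger recognition via average hashing.
# Real AR apps use feature matching (e.g. ORB/SIFT descriptors); this
# simplified stand-in only demonstrates the "camera frame matches the
# stored printed design" step that unlocks the digital overlay.

def average_hash(pixels):
    """Compute a perceptual hash: one bit per pixel, 1 if above the mean."""
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def recognizes(camera_pixels, trigger_pixels, max_distance=6):
    """True if the camera frame is close enough to the stored trigger."""
    return hamming(average_hash(camera_pixels),
                   average_hash(trigger_pixels)) <= max_distance

# Hypothetical 8x8 grayscale grids: the stored "printed road" trigger,
# a slightly noisy camera frame of the same page, and an unrelated page.
trigger = [10, 200, 30, 220, 15, 210, 25, 205] * 8
frame   = [12, 198, 28, 223, 14, 212, 27, 203] * 8
other   = [128] * 64

print(recognizes(frame, trigger))   # same page, minor noise -> recognized
print(recognizes(other, trigger))   # unrelated page -> not recognized
```

Once recognition succeeds, a production app would additionally estimate the page’s pose so the overlaid car stays anchored to the printed road as the phone moves.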

The experience is designed to demo three features in a way print usually cannot: lane assist, adaptive lights, and cruise control. It is not a real test drive, but it is a clear, surprisingly tactile explanation of systems that are otherwise hard to “feel” from a magazine spread.

Why this works as an explanation engine

Driver-assist features are abstract until you see them respond to a road situation. Here, the road is literally in your hands. The app turns a passive reading moment into a short simulation where the benefit is demonstrated rather than claimed.

What the campaign is really doing for the brand

This is a positioning move as much as a product demo. It says Volkswagen brings technology into everyday life and it does it with familiar media, not only with future-facing formats. Print becomes the doorway into a mobile experience, and that contrast makes both feel more interesting.

What to steal for your own print-to-mobile idea

  • Make the printed asset the interface. The road is not decoration. It is the input surface.
  • Choose features that benefit from simulation. Assist systems and “smart” behaviours are ideal for quick demos.
  • Keep the interaction one-step. Download, point, move. Anything more kills curiosity.
  • Provide two contexts. Summer and winter versions make the concept feel robust and replayable.

A few fast answers before you act

What is “test drive in a print ad” in simple terms?

It is a magazine ad that works with an iPhone app. When you hover the phone over the printed road, the app overlays a car on screen and lets you simulate driving along the page.

What features does the VW print-ad test drive demonstrate?

The experience is built around lane assist, adaptive lights, and cruise control, using the printed road as the scenario that triggers the system behaviours.

Why is this better than a normal print ad for tech features?

Because it shows behaviour, not descriptions. The viewer sees the system respond in a road context, which is more memorable than reading about it.

Is it accurate to call it the world’s first?

Volkswagen Norway bills it that way, and the work is widely described as an early example of augmented reality applied to print as a functional “test road”.

What is the main risk with print-to-app activations?

Friction. If install or recognition is slow, people stop. The first payoff has to arrive quickly so the novelty turns into understanding.