Mercedes-Benz: Yes, A.I. Do

For the world premiere of its new EQC at CES 2019 in Las Vegas, Mercedes-Benz transformed the model into a wedding carriage. Four lucky couples were invited to test drive the EQC on the roads of Las Vegas and experience its special A.I. features first hand.

Why this launch twist works

  • It turns a product reveal into a story. A “wedding carriage” reframes a tech premiere into an experience people immediately understand.
  • It makes A.I. tangible. Instead of describing features on a stage, it puts them into a real drive where reactions matter.
  • It earns attention without shouting. The setup is unusual enough to travel, while still keeping the car at the center.

The reusable pattern

Wrap a launch moment in a simple, human ritual. Then invite a small group to experience the product in-context so the story carries the technology, not the other way around.


A few fast answers before you act

What happened in the Mercedes-Benz “Yes, A.I. Do” activation?

For CES 2019 in Las Vegas, Mercedes used the EQC premiere as a wedding-carriage themed experience and invited four couples to test drive the car and experience its A.I. features first hand.

Why use couples and a wedding theme for a car launch?

It creates an instantly recognizable narrative frame, which makes the activation easier to remember and easier to share than a standard demo.

What is the main takeaway for product launches?

Give the viewer a clear story hook, then let the product prove itself through a real experience rather than through claims.

How do you keep a stunt from overshadowing the product?

Make the product the “stage”. The theme should guide attention toward the experience of the product, not away from it.

Ford Noise Cancelling Kennel

An estimated 45% of dogs in the UK show signs of fear when they hear fireworks – causing distress to owners and their families too. So, Ford developed a noise-cancelling kennel concept that applied automotive know-how to help solve this everyday problem.

The idea was inspired by the noise-cancelling technology Ford developed and introduced in its Edge SUV to give passengers a quieter ride. It worked so well that it got Ford thinking about how the same approach could be applied to other facets of everyday life. In this case, Ford applied the technology to a kennel, to help dogs cope with their fear of fireworks.

Feel the View

Ford in Italy, together with agency GTB Rome, teamed up with Aedo, a local start-up that creates devices for people with visual impairments. Together they designed a prototype device that attaches to a car window and decodes the landscape outside, allowing visually impaired passengers to experience it with the tips of their fingers.

The device transforms the flat surface of a car window into a tactile display. The prototype captures photos via an integrated camera and converts them into haptic sensory stimuli. The result is not primarily visual. It is perceptible through touch and hearing.
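The photo-to-haptics step can be sketched as a toy pipeline. This is an illustrative assumption only, not Ford's implementation: the function names, grid shape, and intensity mapping are invented for the example. The idea is simply that each pixel's brightness becomes a vibration intensity level on the window surface.

```python
# Illustrative sketch only (assumed names and mapping, not Ford's code):
# quantize each pixel's brightness (0-255) into one of `levels`
# vibration intensity steps, one step per cell of the window grid.

def frame_to_haptics(frame, levels=8):
    """Map a grayscale frame (rows of 0-255 values) to a same-shaped
    grid of vibration intensity steps in the range 0..levels-1."""
    step = 256 / levels
    return [[int(pixel // step) for pixel in row] for row in frame]

# A tiny 2x3 "photo": dark sky above, bright ground below.
frame = [
    [10, 20, 30],
    [200, 220, 250],
]
print(frame_to_haptics(frame))  # darker pixels map to lower vibration steps
```

A real device would feed camera frames in continuously and drive actuators on the glass; the sketch only shows the brightness-to-intensity translation at the heart of the concept.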

Why this matters as accessible experience design

This is an assistive interface built around a real, emotional moment: looking out of a window during a drive. It treats "the view" as an experience that can be translated into other senses, rather than a privilege reserved for sighted passengers.

The product idea in one line

Capture what is outside the car, then render it on the window surface as a tactile and audio layer that can be explored in real time.

What to take from this if you build inclusive innovation

  • Start with a human moment. Here, it is shared travel and the desire to participate in what others are seeing.
  • Use the environment as the interface. The window is already where attention goes. It becomes the display.
  • Translate, do not replace. The concept does not mimic sight. It converts the same input into touch and sound.

A few fast answers before you act

What is “Feel the View”?

A Ford Italy concept with GTB Rome and Aedo that prototypes a car-window device converting outside landscapes into a tactile and audio experience for visually impaired passengers.

How does the prototype work at a high level?

An integrated camera captures what is outside, then the system transforms the input into haptic stimuli on the window surface, supported by audio cues.

What is the core design principle?

Make the experience accessible by translating the same real-world scene into senses the user can rely on, in the moment.