Feel the View

In Italy, Ford and agency GTB Rome team up with Aedo, a local start-up that creates devices for people with visual impairments. Together they design a prototype device that attaches to a car window and decodes the landscape outside, allowing visually impaired passengers to experience it with their fingertips.

The device transforms the flat surface of a car window into a tactile display. The prototype captures photos via an integrated camera and converts them into haptic sensory stimuli. Here, “haptic” means tactile patterns you can feel with your fingertips. The result is not primarily visual. It is perceptible through touch and hearing.
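As a thought experiment, the photo-to-haptics translation can be sketched as a simple downsampling step: average the brightness of each region of a frame and drive one vibration zone per region. This is a hypothetical sketch, not Ford's implementation; the grid size and the brightness-to-intensity mapping are assumptions.

```python
import numpy as np

def image_to_haptic_grid(gray, rows=8, cols=8):
    """Downsample a grayscale frame into a coarse grid of haptic
    intensities (0.0 to 1.0), one value per vibration zone.

    The 8x8 grid and the brightness-to-intensity mapping are
    illustrative assumptions, not details of the Feel the View
    prototype."""
    h, w = gray.shape
    grid = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            # Average the brightness of the pixels in this zone.
            block = gray[r * h // rows:(r + 1) * h // rows,
                         c * w // cols:(c + 1) * w // cols]
            grid[r, c] = block.mean() / 255.0
    return grid

# Example: a synthetic frame, bright sky on top, dark road below.
frame = np.vstack([np.full((120, 160), 220), np.full((120, 160), 40)])
zones = image_to_haptic_grid(frame)
```

The point of the sketch is the translation-layer idea: the same scene data is preserved, only the output medium changes from pixels to vibration intensities.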

In automotive and mobility experience design, the real bar is whether the same journey can be translated across senses without creating a separate experience.

Why this matters as accessible experience design

This is an assistive interface built around a real, emotional moment. Looking out of a window during a drive. It treats “the view” as an experience that can be translated into other senses, rather than a privilege reserved for sighted passengers. Because the window is where attention naturally goes, using it as the tactile surface makes participation feel shared rather than segregated.

Extractable takeaway: If you want inclusive innovation to land, translate the same moment into multiple senses instead of designing a parallel version of the experience.

Inclusive innovation should be judged by whether it expands participation in the same moment, not by how novel the technology sounds.

The product idea in one line

Capture what is outside the car, then render it on the window surface as a tactile and audio layer that can be explored in real time.

The real question is whether your design lets people participate in the same moment as everyone else, without extra friction or stigma.

What to take from this if you build inclusive innovation

  • Start with a human moment. Here, it is shared travel and the desire to participate in what others are seeing.
  • Use the environment as the interface. The window is already where attention goes. It becomes the display.
  • Translate, do not replace. The concept does not mimic sight. It converts the same input into touch and sound.

A few fast answers before you act

What is “Feel the View”?

A Ford Italy concept with GTB Rome and Aedo that prototypes a car-window device converting outside landscapes into a tactile and audio experience for visually impaired passengers.

How does the prototype work at a high level?

An integrated camera captures what is outside, then the system transforms the input into haptic stimuli on the window surface, supported by audio cues.

What is the core design principle?

Make the experience accessible by translating the same real-world scene into senses the user can rely on, in the moment.

Is this a production product or a prototype concept?

It is described as a prototype concept rather than a production feature, so treat it as a design pattern more than a released product.

What can you apply even if you do not build haptics?

Start from a shared human moment, pick the surface where attention already goes, then translate the same scene into other senses instead of creating a parallel experience.

Wearable Tech: From Abandonment to Empowerment

Wearable tech has a retention problem

Wearable technology adoption looks impressive at first glance. But usage tells a more complex story.

Research from Endeavour Partners shows that one in ten American adults owns an activity tracker, and half of them no longer use it. Similarly, one-third of American consumers who own smartwatches and other wearables stop using them within six months.

Those numbers raise an uncomfortable question: is wearable tech doomed before it has even gone mainstream in the rest of the world? The answer depends on whether a wearable increases capability enough to become essential.

The problem is not the technology

The issue is not sensors, screens, or connectivity.

The issue is meaning.

Many wearables launch with novelty and metrics, but fail to integrate into daily life. Counting steps or tracking sleep is interesting. It is rarely essential.

When a device does not change what people can do, it gets abandoned.

When wearables truly matter

The story changes completely when wearables move from tracking to empowering.

By empowering, I mean they expand what a person can do in the moment, not just what a dashboard can show later.

In its Mobile Minute series, Mashable looks at how wearable technology empowers people in remarkable ways.

These are not incremental conveniences. They are life-changing capabilities.

Wearables that increase quality of life

Wearable technology begins to earn its place when it solves real human problems:

  • Haptic clothing helps visually impaired people navigate the world through touch-based signals.
  • Wearable interfaces allow people with limited mobility to control wheelchairs using subtle movements.
  • Body-mounted cameras enable candid photography without drawing attention or interrupting moments.

In these scenarios, wearables are not gadgets. They are extensions of human ability.

Why abandonment and empowerment coexist

Wearables fail when they demand attention without giving value. They succeed when they quietly enable action, independence, and dignity. A wearable sticks when it reduces attention and maintenance load while delivering capability at the moment of need.

Extractable takeaway: If a wearable cannot clearly increase what someone can do, it will be abandoned, no matter how impressive the metrics look.

In global consumer health and workplace wellbeing programs, wearable tech sticks when it removes daily friction and turns passive tracking into timely, actionable support.

Design rules for wearables that stick

Wearable tech is not going away. It is maturing.

The future of wearable tech is not about more data. It is about more capability.

The devices that survive will be those that:

  • Fade into the background. Minimize interruptions and attention demand.
  • Respect the body and the moment. Prioritize comfort, context, and dignity.
  • Increase quality of life in tangible ways. Deliver capability a person can feel in daily life.

This is how wearable technology moves from early adoption to lasting relevance.


A few fast answers before you act

Does high abandonment mean wearables are failing?

No. It usually means the use case is novelty or measurement-only, so the device never becomes essential in daily life.

What drives people to abandon wearables?

Friction and weak value. Charging hassle, comfort issues, unclear accuracy, notification fatigue, and metrics that do not change behavior.

What separates successful wearables from forgotten ones?

They enable action, independence, safety, or confidence in a specific moment. They do not just report data after the fact.

Where is the biggest long-term opportunity for wearables?

Assistive and supportive scenarios such as accessibility, chronic condition support, mobility, and safety. The value is empowerment, not tracking.

How do you evaluate whether a wearable belongs in daily life?

Ask what it lets a person do that they could not do before, and whether it works with near-zero attention and low maintenance.

What is one practical design rule for sticky wearables?

Reduce upkeep and interruptions. The best wearable fades into the background and proves its value at the moment of need.

Yahoo! JAPAN: Hands On Search

Yahoo! JAPAN introduces what it calls “Hands On Search”: a hands-on search experience that lets visually impaired children explore online concepts through touch, not screens.

A voice-activated kiosk is set up so children can speak what they want to “search” for. The system recognises the verbal request, pulls a corresponding 3D model, and prints a small physical object. For the first time, children can hold what they usually only hear described, from animals to landmarks and buildings.

Search becomes a physical output

The mechanism is voice input plus 3D printing output. Instead of returning text, images, or audio, the search result is manufactured into a tactile model the child can feel in their hands. Because the output is tactile, the child can verify shape and scale directly, which is why the interaction shifts from description to discovery.
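The one-step interaction (spoken request in, physical object out) hinges on a lookup against a curated model catalogue. Below is a minimal sketch with an invented catalogue; the entries, names, and fallback rule are all assumptions, not details of Yahoo! JAPAN's system.

```python
# Hypothetical catalogue; paths and entries are invented for illustration.
CATALOGUE = {
    "giraffe": "models/giraffe.stl",
    "tokyo tower": "models/tokyo_tower.stl",
    "elephant": "models/elephant.stl",
}

def resolve_request(spoken_text):
    """Map a recognised spoken query to a printable 3D model file.

    Returns the model path, or None when the curated catalogue has
    no match -- the thin-catalogue failure mode the article warns
    about."""
    query = spoken_text.strip().lower()
    if query in CATALOGUE:
        return CATALOGUE[query]
    # Loose fallback: match if the query contains a catalogue key.
    for name, path in CATALOGUE.items():
        if name in query:
            return path
    return None

model = resolve_request("a giraffe please")  # "models/giraffe.stl"
```

The sketch makes the design dependency concrete: the interaction is only as good as the catalogue behind it, which is why the object library is part of the product.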

In accessible technology design, the strongest innovation is often a translation layer that converts a dominant medium into the sense that an excluded audience can reliably use. That is the pattern worth copying. Change the output medium, not just the narration layer.

In accessible-learning contexts, the constraint is rarely intent; it is whether the output can be inspected without sight.

Why it lands

It reframes “search” as something more than browsing. It becomes discovery you can share in a classroom. The real question is whether your product can render its core value into the senses your excluded users actually rely on. The moment the object prints is also the moment learning becomes concrete. It is not an abstract promise about inclusion. It is a visible, touchable outcome.

Extractable takeaway: If your experience is inherently visual, do not just add narration. Add an equivalent output that preserves shape and scale in a form people can physically inspect, so learning moves from description to direct exploration.

Tactile-search patterns for product teams

  • Design for the missing sense, not the average user. Start with the constraint, then build the interface around it.
  • Make the interaction one-step. Voice request in. Physical result out. No menus, no setup rituals.
  • Curate the object library. Accessibility fails when content quality is inconsistent. The “catalogue” is part of the product.
  • Prototype in real learning environments. Schools and educators reveal whether the tool supports teaching, not just demos.

A few fast answers before you act

What is Hands On Search in one sentence?

It is a concept machine that turns spoken searches into small 3D-printed models, so visually impaired children can “touch” search results.

Why does 3D printing matter here?

Because it converts information into form. For someone who cannot see images, a physical model can communicate shape, proportion, and structure directly.

Is this a campaign or a product direction?

It plays like a campaign film, but the underlying idea is a product direction: search as an output system that can render to different senses depending on user needs.

What is the biggest risk in copying this idea?

Building a beautiful prototype without a sustainable content pipeline. If the object library is thin, slow to expand, or low fidelity, usefulness drops quickly.

Where should you prototype first?

Prototype where learning happens. Schools and educators will quickly show whether the tool supports teaching, not just demos.