The Moby Mart

Every parking space becomes a 24-hour store. The Moby Mart is designed to turn ordinary parking spots into always-on retail. Roughly the size of a small bus, it carries everyday products such as snacks, meals, basic groceries, and even shoes. To use it, you download an app, register as a customer, and unlock the doors with your smartphone.

The idea is still in trial mode. The store is being tested in Shanghai through a collaboration between Swedish startup Wheelys Inc and China’s Hefei University. For now, the prototype is stationary, based permanently in a car park, but the company says it is working with technology partners to develop the self-driving capability.

What this concept makes tangible

Retail flips from “go to store” to “store comes to you”

The provocation is simple. If the unit can be deployed anywhere, then proximity becomes a variable you can design, not a constraint you accept.

Friction reduction becomes the product

The app unlock and self-service flow compresses the journey. Entry, selection, payment, exit. Less waiting, less staffing, less handoff.

Mobility creates new placement logic

A store on wheels changes what “location strategy” means. Instead of long-term leases, the unit can be positioned where demand spikes, or where fixed retail is uneconomical.

The reusable pattern

  1. Start with a familiar format. People immediately understand a convenience store. That lowers cognitive load.
  2. Make access the first experience. App unlock is the “moment of truth.” If that step is seamless, everything downstream feels modern.
  3. Design for unattended trust. Clear rules, clear prompts, and a clear “this worked” confirmation prevent anxiety in a staffless space (a minimal sketch of such a flow follows this list).
  4. Prototype the operating model early. Mobility, restocking, and support are not secondary. They are the offering.
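
To make the unlock-and-confirm idea concrete, here is a minimal sketch of a staffless access flow. Everything in it (the `StoreGateway` interface, `openDoor`, the result states) is hypothetical, since Wheelys has not published how its system works; the point is the pattern of clear rules up front and an explicit “this worked” confirmation at the end.

```typescript
// Hypothetical sketch of an unattended "unlock as the moment of truth" flow.
// Names and calls are illustrative, not Wheelys' actual system.

type UnlockResult =
  | { status: "unlocked"; storeId: string; sessionId: string }
  | { status: "denied"; reason: "unregistered" }
  | { status: "error"; reason: string };

interface StoreGateway {
  // Hypothetical calls to the customer registry and the door controller.
  isRegistered(customerId: string): Promise<boolean>;
  openDoor(storeId: string): Promise<boolean>;
}

async function unlockStore(
  gateway: StoreGateway,
  customerId: string,
  storeId: string
): Promise<UnlockResult> {
  // 1. Clear rules: only registered customers get in.
  if (!(await gateway.isRegistered(customerId))) {
    return { status: "denied", reason: "unregistered" };
  }

  // 2. Perform the physical action and report an unambiguous outcome.
  const opened = await gateway.openDoor(storeId);
  if (!opened) {
    return { status: "error", reason: "door_controller_unreachable" };
  }

  // 3. Explicit "this worked" confirmation the app can show immediately.
  return { status: "unlocked", storeId, sessionId: crypto.randomUUID() };
}
```

The shape matters more than the implementation: every path ends in a state the app can show the customer, which is what keeps a staffless space from feeling uncertain.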

A few fast answers before you act

What is the Moby Mart?

A bus-sized, staffless, mobile convenience store concept that aims to turn parking spaces into 24-hour retail, accessed via a smartphone app.

How do customers use it?

They download an app, register, and unlock the doors with their phone to shop inside.

Where is it being tested?

It is undergoing trials in Shanghai through a collaboration between Wheelys Inc and China’s Hefei University.

Is it already self-driving?

The trial prototype is stationary in a car park. The company says it is working with partners on self-driving capability.

What is the core lesson for marketers and innovators?

Move the experience to the moment and place of demand. Then design the access, trust, and operations as the real product.

The intelligent car from Mercedes-Benz

Mercedes-Benz has announced that its 2016 and 2017 vehicles in the US can connect with Amazon Echo and Google Home. With that integration in place, owners can remotely start or lock their vehicle, and they can send an address from home straight to the in-car navigation system.

What makes this interesting is not the novelty of voice commands. It is the direction. The car starts behaving like a node in a wider home automation ecosystem, not a standalone product you only interact with once you sit behind the wheel. You speak to your assistant at home. The car responds. The boundary between “home experience” and “driving experience” gets thinner.

The ecosystem move, not a feature add-on

A single capability like “remote start” is useful. But the strategic move is building an intelligent ecosystem around the car, using third-party voice assistants people already trust and use daily. That lowers adoption friction and accelerates habit formation. If a driver already uses Alexa or Google Home for routines, adding the car becomes a natural extension.

This also shifts expectations. Once the car is connected into the household’s digital layer, people start wanting context-aware flows. For example, planning and sending destinations before leaving. Or basic vehicle actions triggered as part of an existing routine.
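
As a rough illustration of that ecosystem move, the sketch below shows how a backend might map a resolved voice intent to vehicle actions. Neither Mercedes-Benz’s connected-car API nor the exact Alexa or Google Home payloads are reproduced here; the `VehicleCloudApi` interface and the intent names are assumptions used only to show the shape of the integration.

```typescript
// Hypothetical sketch: the car as one more endpoint the home assistant can
// address. The vehicle API and intent shapes below are illustrative only.

interface VehicleCloudApi {
  lockDoors(vin: string): Promise<void>;
  startEngine(vin: string): Promise<void>;
  sendDestination(vin: string, address: string): Promise<void>;
}

// What a voice assistant integration might hand to the backend once the
// spoken request has been resolved into a structured intent.
type VoiceIntent =
  | { name: "LockCar" }
  | { name: "StartCar" }
  | { name: "SendDestination"; address: string };

async function handleVoiceIntent(
  api: VehicleCloudApi,
  vin: string,
  intent: VoiceIntent
): Promise<string> {
  switch (intent.name) {
    case "LockCar":
      await api.lockDoors(vin);
      return "Your car is locked.";
    case "StartCar":
      await api.startEngine(vin);
      return "Your car is starting.";
    case "SendDestination":
      // The "plan at home, drive later" flow: the address reaches the
      // in-car navigation before the driver reaches the vehicle.
      await api.sendDestination(vin, intent.address);
      return `Sent ${intent.address} to your car's navigation.`;
  }
}
```

In practice the interesting work sits around this switch: authenticating the household member, permissioning remote actions, and confirming reliably back to the assistant, which is exactly the trust layer discussed below.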

Mercedes is not alone in spotting the pattern

Mercedes-Benz is not the first automaker to recognize the potential of third-party voice assistants. At CES earlier this year, Ford unveiled plans to roll out Alexa-equipped vehicles. Around the same time, Hyundai announced a partnership with Google to add voice control through Google Home.

The competitive question becomes simple. Who turns the car into a meaningful part of the customer’s everyday digital routines first, and who reduces the connected car to a checklist feature?


A few fast answers before you act

What does Mercedes-Benz enable through Alexa and Google Home?

Remote start. Remote lock. Sending addresses from home to the in-car navigation system.

Why is this bigger than “voice control in the car”?

Because it connects the car to an existing smart home ecosystem. That makes the car addressable from outside the vehicle, and it pushes the experience upstream into planning and daily routines.

What should product, CX, and marketing teams watch closely?

The ecosystem choices. The core use cases that become habitual. The trust layer, including permissioning and security for remote actions. The operational reliability, because routines only stick when they work every time.

What is the strategic takeaway in one line?

The “intelligent car” story is increasingly an ecosystem story. It is about where the car lives in the customer’s broader digital life.

Gatebox: The Virtual Home Robot

You come home after work and someone is waiting for you. Not a speaker. Not a disembodied voice. A character in a glass tube that looks up, recognizes you, and says “welcome back.” She can wake you up in the morning, remind you what you need to do today, and act as a simple control layer for your smart home.

That is the proposition behind Gatebox. It positions itself as a virtual home robot, built around a fully interactive holographic character called Azuma Hikari. The pitch is not only automation. It is companionship plus utility. Face recognition. Voice recognition. Daily routines. Home control. A “presence” that turns a smart home from commands into a relationship.

What makes Gatebox different from Alexa, Siri, and Cortana

Gatebox competes on a different axis than mainstream voice assistants.

Voice assistants typically behave like tools. You ask. They answer. You command. They execute.

Gatebox leans into a different model:

  • Character-first interface. A persistent persona you interact with, not just a voice endpoint.
  • Ambient companionship. It is designed to greet you, nudge you, and keep you company, not only respond on demand.
  • Smart home control as a baseline. Home automation is part of the offer, not the story.

The result is a product that feels less like a speaker and more like a “someone” in the room.

Why the “holographic companion” framing matters

A lot of smart home innovation focuses on features. Gatebox focuses on behavior.

It is designed around everyday moments:

  • waking you up
  • reminding you what to remember
  • welcoming you home
  • keeping a simple loop of interaction alive across the day

That is not just novelty. It is a design bet that people want technology to feel relational, not transactional.

What the product is, in practical terms

At its most basic, Gatebox:

  • controls smart home equipment
  • recognizes your face and your voice
  • runs lightweight daily-life interactions through the Azuma Hikari character

It is currently available for pre-order for Japanese-speaking customers in Japan and the USA, at around $2,600 per unit. For more details, visit gatebox.ai.

The bigger signal for interface design

Gatebox is also a clean case study in where interfaces can go next.

Instead of:

  • screens everywhere
  • apps for everything
  • menus and settings

It bets on:

  • a single persistent companion interface
  • a character that anchors interaction
  • a device that makes “home AI” feel present, not hidden in the cloud

That is an important shift for anyone building consumer interaction models. The interface is not the UI. The interface is the relationship.


A few fast answers before you act

Q: What is Gatebox in one sentence?
A virtual home robot that combines smart home control with a holographic companion character, designed for everyday interaction.

Q: Who is Azuma Hikari?
Gatebox’s first character. A fully interactive holographic girl that acts as the interface for utility and companionship.

Q: What can it do at a basic level?
Control smart home equipment, recognize face and voice, run daily routines like wake-up, reminders, and greetings.

Q: Why compare it to Alexa, Siri, and Cortana?
Because it is positioned as more than a voice assistant. It is a character-first, companion-style interface.

Q: What is the commercial status?
Available for pre-order for Japanese-speaking customers in Japan and the USA, at around $2,600 per unit.