How Is the Apple HomePod Different From Other Voice First Devices?

This post was published on the now-closed HuffPost Contributor platform.
Hemera / Getty Images Plus

"What is Apple HomePod and why is it important?" originally appeared on Quora: the place to gain and share knowledge, empowering people to learn from others and better understand the world.

Answer by Brian Roemmele, Founder + Editor at Read Multiple, on Quora:

Apple Creates A New Platform For Voice First, Yet They Want You To Think Just Music—For Now...

On June 5th, 2017, Apple announced the $349 HomePod [1] (on sale in December). HomePod (I had to call it HomeBase for the last year) is Apple’s inaugural Voice First device. It combines Apple-engineered audio technology and advanced software to deliver the highest-fidelity sound throughout the room, no matter where it’s placed. Not content to deliver a minimal sound experience, Apple uses a high-excursion woofer with a custom amplifier to play a wide range of deep, rich bass. A powerful motor drives the diaphragm a full 20 mm, an excursion that is simply unheard of for a speaker this size and not seen in any other Voice First system.

HomePod’s AI control algorithm continuously analyzes the music and dynamically tunes the low frequencies for smooth, distortion-free playback. The seven beam-forming tweeters precisely focus sound, from very narrow beams all the way to true, consistent 360º audio. By aiming those beams throughout the room, the tweeters create an immersive sense of space, no matter where HomePod is or where you’re sitting.

HomePod is a bit of a magical experience to use. The top shows you when Siri is listening with an LED waveform that animates with your every word. This animation seems like no big deal until you experience it: there is a feedback of recognition that the system is actually hearing each word. The effect is subconscious but profound. You tend to lower your tone and talk more naturally with this feedback.

The Living Room Not The Kitchen

HomePod is aimed squarely at the living room and perhaps the bedroom, not the kitchen. Amazon clearly aimed the first Echo at the kitchen, and the Echo Show later cemented this. Although HomePod will not be out of place in the kitchen, and it will be used there, I predict Apple’s kitchen device will be radically different, and we may see it within twelve months.

A powerful Apple A8 chip runs the most complex audio innovations in HomePod, including real-time modeling of the woofer mechanics. The buffering is perhaps the best in the industry, faster than real time, which means there will be no delays in music streaming. Finally, the six far-field beam-forming microphones can hear you over the music, echo and room noise using advanced echo cancellation.

Here is a feature overview:

  • Apple‑designed A8 chip.
  • Seamless mesh fabric in Space Grey or White.
  • High‑excursion woofer.
  • Internal low-frequency calibration microphone for automatic bass correction.
  • Six‑microphone far-field beam-forming array.
  • Seven‑tweeter beam-forming array of horn-loaded tweeters, each with its own custom amplifier.
  • Touch sensitive LED Waveform indicator and control.


Size:

  • 6.8 inches high (172 mm)
  • 5.6 inches wide (142 mm)

Weight:

  • 5.5 pounds (2.5 kg)

If you add another HomePod, it is smart enough to sense where in the room each one is playing. HomePod is controlled by a new version of AirPlay called AirPlay 2, which is designed to let you add many speaker systems to this new network. With two HomePods in the same room, AirPlay 2 will separate the sound into a wide stereo spread. This is a powerful technology.

Far-Field And Waveform LEDs

HomePod is the first far-field Voice First system Apple has created; AirPods were the first near-field one. One of the largest complaints about the remote “Hey Siri” on iOS devices is that the noise-cancelling microphones were designed for cell calls, not speech recognition, which hindered Siri in less-than-ideal situations. HomePod changes this. When you invoke “Hey Siri”, the top portion of HomePod’s LED waveform display lights up to notify you that the system is awake and speech recognition has started. Apple then encrypts and anonymizes your recorded speech for cloud recognition and intent extraction, and a resulting audio response is produced from your request. HomePod uses fundamentally new technology that allowed for some of the best wake-word recognition in my tests.

The aspect of privacy on HomePod, and Siri in general, will play a larger part in the future. Apple is making a point of being clear that HomePod only listens when you say “Hey Siri” and the LED waveform is lit. It is the only system that clearly states that all speech snippets are encrypted and made anonymous. Apple also stated to me privately that the audio and resulting data are erased after the AI extracts intent.

Low Key Domains And Contextual Siri

Apple is taking a very low-key approach to building this new Voice First platform. It started with the head fake that “the best assistant is already on your phone” and continues today with a focus almost entirely on music playback. This focus is wise: it gives Apple and developers time to build out the domains for Siri. The new Siri is full of extended context across all Apple platforms and will allow developers to integrate deeply into this new platform. As domains expand, the utility of this new, more contextually aware Siri will expand. We will see an integration with Apple’s new Workflow app to extend the utility even further, perhaps as soon as this year.

The current HomePod Siri domains are:

  • Music
  • News
  • Unit Conversions
  • Messages
  • Reminders
  • Podcasts
  • Alarms & Timers
  • Translation
  • Stocks
  • General knowledge
  • Weather
  • Traffic & Nearby
  • Sports
  • Home (app)

These domains represent the majority of use on most Voice First systems. Apple has focused on making these experiences better and will grow the longer-tail domains over time, far faster than most skeptics believe today. Using the amazing context and dialog technology acquired from VocalIQ, we will see a far more knowledgeable Siri capable of far more than just Q & A. Some of the new Siri is here already in the learning context engine VocalIQ built; however, the more advanced Siri will slowly reveal herself over the next twenty-four months.

During the announcement, Apple focused HomePod on a domain where the primary initial use case, music, has an expert: a Siri that knows music inside and out. Apple calls this music-focused Siri a Musicologist. This of course includes Apple Music, but ultimately other music platforms as well. The idea is to make HomePod your go-to music playback system. It will also do wonders for growing Apple Music to an even wider audience and will hold back many competing systems.

Echo, Google Home And HomePod

The Voice First market has grown from one device in 2014 to a multitude of devices in 2017. The use cases have gone from music and timers to complex questions. All things being equal, all three major devices have about equal abilities. However, there are pronounced differences in how they arrive at answers. This element will become crucial in the future. It will not impact overall sales but it will impact the developer economy.

The new Siri, with better hearing and context across all of your devices, may tilt some to see Siri as getting more intelligent faster. Combine this with the inevitable addition of the Workflow scripting system, which will connect just about any app (just about all ten million of them) to a Siri domain. In fact, it may become the way domains are extended.

The other platforms will expand, and a rich ecosystem will develop around all of them. I think the average home will have dozens of devices over the arc of ten years, including in their appliances.

The Apple Patents Show The Future

Earlier this year I surfaced a few Apple patents that show the future arc of Siri and Apple’s Voice First platform. These patents show not only a new way to interact with Siri but also a far more powerful Siri, using many of the VocalIQ systems. One feature is being able to interact with Siri using the iOS and macOS Messages platform. This will allow the new context-aware, cross-platform Siri to interact with HomePod, from simple things like basic controls to using the Apple Home system to control devices like lights and door locks [2].

In this illustration Apple presents an amalgamation of Siri + the Messages app + image recognition in a new mixed modality of voice and text interactions that can include other individuals or groups in the chat/audio/video stream. This is the first time we have seen Apple present this exact embodiment. The illustration presents a new way of interacting with Siri and a new way to extend your memory: a real-world bookmarking system using the camera and deep-learning AI to identify things from buildings to products. We are also seeing the beginning of Apple’s Voice Commerce engine telegraphed in this patent, as products are identified and soon entered into a purchase transaction.

On April 12th, 2016, I wrote what became a lightning-rod article defining the Voice First revolution; it was contrarian and controversial at the time, and still is even today. I said [3]:

"The first wave of Voice First devices will likely come from these companies with consumer grade and enterprise grade systems and devices:
  • Apple
  • Microsoft
  • Google
  • IBM
  • Oracle
  • Salesforce
  • Samsung
  • Sony
  • Facebook"

As of today we are 5 out of 9, with more coming very soon. When I wrote it, there was just one device, the Amazon Echo system. I asserted that we would own a multitude of Voice First devices; it is not really an either/or choice. Clearly Apple’s initial list price of $349 is not the $49.95 price of an Echo Dot, but it is not so high as to put owning both out of reach. I said [4]:

"Thus to recap: You will use dozens of Voice platforms for dozens of reasons. You will have Intelligent Assistants that will mediate and manage everything. We are at the precipice of this new Voice First world. At some point much of what we used to do to interact with computers will look unfamiliar. Along the way we will endure the Siri vs Alexa vs Google vs Viv stories, have a chuckle and know they are wrong. They all win, we all win."

Voice Commerce

HomePod will not let Amazon and Google dominate Voice Commerce. Although this element has not been demonstrated publicly, it will play a materially important part in this device and in Siri as a whole ecosystem. Today Apple also announced Apple Pay Cash, a fundamentally new payment system that bypasses not only Visa and Mastercard but also PayPal, Venmo, Square and Stripe. Many see it as a person-to-person payment system, but it will become a merchant payment platform allowing customers to pay merchants directly. Apple Pay Cash is deeply integrated into Messages, and thus it is also deeply integrated into Siri. We will begin to see how this will work in the coming months.

Apple HomePod Will Be A Success

I am certain that HomePod will become one of Apple’s best-selling accessories. I believe HomePod will sell out during the 2017 holiday season and become a must-have product during 2018. In 2018 it will become clear that Apple is building a new computing platform with SiriOS. Some observers in tech refuse to accept Voice First as a new platform because they start from the invalid premise that one must have “general AI” for it to be useful. This is of course a silly notion; the same framing was used against Marc Andreessen in 1992 when he created Mosaic and helped commercialize the World Wide Web. This new platform will prove to be the one that replaces all prior platforms. Of course we will still use gestures on glass screens, and even keyboards and displays, but we will use them less and less. It is not Voice Only, it is Voice First.

Apple is historically late to market, yet its arrival marks a paradigm shift. Apple’s new contextual Siri will redefine not only Voice First but the entire computer industry. I have said that over the next ten years, 50% of human-to-computer interactions will be via voice. This will shift every paradigm in a fundamental way. In history we call this a revolution: a Voice First revolution [3].
