The HomePod is a voice-activated speaker with Siri built in that can stream songs from Apple Music via spoken commands. Songs from other streaming services can be played only through AirPlay, a wireless audio technology available exclusively on devices made by Apple.

In other words, the HomePod, available on Feb. 9 in the US, the UK, and Australia, might be the Apple-iest Apple device in recent history.

I’ve been living with, listening to, and shouting at the $349 plump music machine for a week — I’ve found that the HomePod is a good-sounding speaker with killer features for iPhone users, but it has flawed “smart” functionality that falls short of its closest competitors, and for a much steeper price.

Android users beware: If you don’t have an Apple Music subscription, an extensive iTunes library, or an iPhone, you shouldn’t get the HomePod.

You need an iPhone or iPad to set the HomePod up. Furthermore, the only streaming service you can control with your voice on the HomePod is Apple Music. (A subscription costs $10 per month for individuals and $15 per month for a six-person family.) Voice control also works with purchased iTunes music, free podcasts, and Beats 1 radio, none of which require an Apple Music subscription.

You can play Spotify and other audio from your iPhone or Mac via AirPlay, but you won’t be able to use your voice — and voice control is the main appeal of a smart speaker like the HomePod.

The HomePod comes, of course, with Apple’s voice assistant Siri built in. I found the speaker to be exceptionally good at hearing the wake phrase (“Hey, Siri,” just like on an iPhone) and at understanding what I was saying.

Siri could hear me while I was wearing my retainers (“Hayy Sheeree, remind me teh bring mah headphonez toomerow”), brushing my teeth, or cooking with the overhead vent turned on. Best of all, Siri knew what I meant when I said, “Play SZA” (!!!), pronounced “sizza.” Alexa plays John Philip Sousa or Sizzla instead. You’d think the most nominated female artist of the year would have earned some more RESPECT from the most popular chatty speaker bot.

I thought saying “Hey Siri” around both my iPhone and HomePod would be a nightmare (“OK Google” almost always activated my boyfriend’s Google Pixel and our Google Home at the same time). But it wasn’t: The iPhone is attention aware and can tell when you’re looking at it to determine which Siri you’re talking to.

Other unique HomePod features: podfasters can play podcasts at 1.5 times or double the normal speed (Alexa can't). You can use your voice to send iMessages, SMS texts, and even third-party WhatsApp messages (which, BTW, is one of my favorite features), as long as the connected iPhone is on the same network. Siri on HomePod can also set reminders in the iPhone's Reminders app and transcribe notes in the Notes app.

Having lived in a Google Home household, with plenty of features for Android phones and relatively few for iOS devices, the HomePod’s native iPhone integration felt like sweet victory.


But HomePod lacks some of the basic “smart” features that Amazon Echo and Google Home have long had.

HomePod is tied to one iCloud account, and you have to have an iPhone or iPad to set it up. (Apple wouldn’t comment on whether multiple account support is coming soon.)

If you live in a shared household, you can’t switch profiles to take advantage of Apple Music’s personalized playlist features, or access different accounts’ reminders, notes, or messages, which are what Apple calls “Personal Requests.” Both Amazon and Google’s devices can recognize different people’s voices and automatically switch to their profiles. HomePod can’t do this.

FYI: Personal Requests are optional. They only work when the connected iPhone is on the same Wi-Fi network as the HomePod. So, no — no one can listen to your unread text messages or send them on your behalf while you’re away.

Also, HomePod has no calendar support. Hearing all of my day’s meetings and appointments is a feature I’ve loved on both Google Home and Echo.

You can access the HomePod’s settings from the Home app (which you probably didn’t know existed on your phone). But there isn’t much there. You can’t set your news source (which has to be done with your voice), or change the alarm tone. You can’t see a history of everything you’ve asked HomePod, like you can with other smart speakers, which, depending on your stance on privacy, may be a good thing.


Apple doesn’t care about activity history in the same way that Google and Amazon do. Unlike its competitors, Apple doesn’t store queries. If the government asks for your HomePod data, Apple wouldn’t be able to give it to them, because it doesn’t have it.

You can see your listening history in Apple Music, but only if you’ve asked for specific albums or songs. If you listened to a radio station based on an artist, the app won’t show you the specific tracks you listened to, which kind of ruins the music discovery that comes with those kinds of features. (You can also turn listening history off if you have a roommate who loves country, and you’d rather not get country music served in your personalized playlist.)

Siri’s ability to understand commands is also limited. For denoting musical preference, “I don’t like this” doesn’t work, while “I don’t like this song” does. Siri also didn’t know when pomegranates were in season (October through January), and neither did Alexa, but Google Assistant prevailed.

HomePod is incredibly easy to set up.

I didn’t need to type a single thing. I just held my iPhone close to the HomePod to transfer the Wi-Fi network name and password, as well as my Apple ID credentials. It’s nice seeing the AirPods’ stupid-easy automatic pairing interface working on other Apple products.

It’s much easier to move the HomePod around and set it up on new networks, too. When you plug in your HomePod in a new place, the speaker grabs the Wi-Fi password from your phone and can be used immediately, without additional setup.

The Sonos One, a competing speaker aimed at audiophiles, is a setup nightmare. When I brought the two speakers to my office, I had to jump through hoop after hoop, including factory resetting the device, only to find out that it couldn’t connect to the building’s non-2.4GHz network.

Its design is attractive and understated. It looks and feels expensive. (It is.)

The HomePod comes in two colors: marshmallow (white) and charred marshmallow (space gray). It’s one of the heftiest pineapple-sized things I’ve ever picked up (5.5 pounds).

Bergeron was speaking to a small group of tech bloggers, including myself, last Monday in Apple’s Cupertino, CA-based audio lab, just minutes from the new Apple Park spaceship campus. About six years ago, according to Bergeron, the company began working on HomePod by attempting to answer this question: “What if we decided to design a loudspeaker that we could put in any room, and it wouldn’t affect the sound?”

This question is very different from the question the Amazon Echo and Google Home are trying to address. Those speakers’ primary aim is to offer hands-free help, by way of turning on the lights in the living room, telling you what traffic to work is like, setting timers, and playing podcasts while you’re busy cooking breakfast.

HomePod can do all of those things (with variable degrees of success) — but what it *really* hopes to be is a speaker that sounds really good.

That’s why Apple showed us (the bloggers) their lab full of large anechoic chambers, rooms built on isolating springs to keep vibrations from the outside world out. The company wanted to prove it’s serious about audio quality.

“You have to really want to do this,” said senior director of audio engineering Gary Geaves, as he waved us onto a wire trampoline suspended above giant, asymmetrically-directed spikes of different sizes.

Apple’s anechoic chamber where the woofer and speaker array were developed.

Brooks Kraft / Apple


The chamber, which was seemingly covered, floor-to-ceiling, in Chuckie’s hair from Rugrats, would be terrifying if not for the fact that the spikes are made of a soft, styrofoam-esque material.

Apple wouldn’t show us how testing in the room worked. Geaves asked us to imagine that here, on this wire trampoline, is where Apple developed the “beam-forming speaker array” (human translation: an array of high-frequency tweeters) and the woofer that powers the HomePod’s powerful bass.

Throughout my week of testing, I listened carefully to HomePod to see what the result of all that testing was, and I had a few colleagues do the same.

What I — and a few others — found was that HomePod was more spatially aware than other speakers we listened to. In fact, it has a microphone on board to listen to its surroundings and adjust its sound settings accordingly, as well as an accelerometer to detect when it’s been moved, which will prompt the speaker to refresh its tuning.

The HomePod’s audio sounded warm and immersive, and distinctly bass-y — at times too much so, in the same way many Beats headphones tend to be.

The Sonos One, which has Amazon Alexa built in, and Apple’s HomePod are natural competitors. Both tout themselves as premium audio offerings in the smart speaker category, in contrast to the cheap, tinny speakers that have kept the Amazon Echo and Google Home affordable and accessible.

HomePod (left) and Sonos One (right)

Nicole Nguyen / BuzzFeed News

I compared the HomePod to the Sonos One in the living room of my small one-bedroom apartment, and then in BuzzFeed San Francisco’s 650-square-foot lab with ~15-foot-high ceilings. In a blind listening test in my apartment, my bf Will overwhelmingly preferred the Sonos One, saying “the vocals are really more clear on the Sonos” for the Grateful Dead’s “China Cat Sunflower”; that the Sonos’ “mid-range sounds more prevalent” and “accurate” for Lorde’s “Green Light”; but that there was “more detail” on the HomePod for “God’s Plan” by Drake.

I mostly agreed — except I thought the HomePod spread audio throughout the room more evenly than the Sonos One, which fired audio in one direction, rather than filling the space. I listened to Rhye’s new Blood album, and the bass felt too thump-y on the HomePod, but the bass guitar sounded great. Yaeji’s “Passionfruit” also sounded better on the HomePod, which really highlighted the record’s ethereal/atmospheric vibes. Sonos made Laura Marling’s voice on “Ghosts” and “Rambling Man” sound especially clear, like she was in the room with me, but the HomePod gave the tracks a warmer tone overall.

In the BuzzFeed lab, a challenging room for any speaker, the large echo-y space swallowed the Sonos’ sound. My editor John Paczkowski, a metalhead, described the Sonos’ audio as “thin,” compared to the HomePod, which it definitely was, especially at louder volumes. Neither speaker did a good job of playing Paczkowski’s picks: Yob’s “Prepare the Ground,” which sounded muddy, or Black Sabbath’s “Paranoid.”

In a smaller room with lower ceilings, “Comfortably Numb” first sounded great on Sonos Play:1 speakers (the non-Alexa version of the device), but Paczkowski found the song’s audio quality was notably deeper and richer on the HomePod. The strings in Clint Mansell’s Requiem For A Dream soundtrack sounded better on Sonos, compared to the HomePod. But the HomePod nailed the bass in Beastie Boys’ “Shake Your Rump.”

The takeaway from our tests is that Sonos speakers sound as if they’re calibrated to pay deep attention to everything — every guitar strum, every bass line, every hi-hat tap — because, perhaps, they’re aimed at a very particular type of music listener.

HomePod, meanwhile, has a very thump-y bass, with an onboard algorithm that looks ahead at the track’s next 30 milliseconds and tunes frequencies in real time. But the algorithm isn’t always effective: In certain songs, the bass sounded too prominent, burying vocals and other mid-range sounds.
