
Wednesday, July 1, 2020


Technology - Google News


Spotify Premium Duo account launches for couples; Apple Music to follow? - 9to5Mac

Posted: 01 Jul 2020 05:22 AM PDT

The Spotify Premium Duo account first tested more than a year ago has now launched in the US, UK, and more than 50 other countries. It enables couples, or any two people sharing a home, to each have a Premium account for a total payment of $12.99 per month.

This represents a savings of $2/month over the Premium for Family plan, which many couples use at present …

In addition to each person getting an individual premium account, Spotify says Duo also gives you a shared playlist, curated by Spotify on the basis of your combined tastes.

You love Today's Top Hits. Your partner is obsessed with All Out '80s. So deciding who gets to play DJ at home or in the car is a constant battle.

Luckily, today Spotify is making it even easier for two people to enjoy music together (or separately) with the launch of Spotify Premium Duo – our new, first-of-its-kind subscription plan. Beginning today, Duo will be rolling out in 55 markets globally, including the U.S.

Premium Duo is for two people living at the same home address and costs $12.99 per month. Each individual gets their own Premium account under one plan in addition to the unique benefits like exclusive access to Duo Mix, a regularly updated playlist made just for the two subscribers to discover audio they both love and enjoy together.

To sign up, visit spotify.com/duo and follow the instructions – the two users must reside at the same address to be eligible. Users who haven't tried Premium before may be eligible to get the first month of Premium Duo for free. Existing Premium subscribers can switch to Premium Duo by visiting their account page and changing their subscription. Upgrading to Premium Duo allows subscribers to keep their existing Premium accounts along with saved music, podcasts, playlists and recommendations.

The Duo Mix allows you to choose between Chill and Upbeat playlists, and you can also opt to filter out explicit songs.

You can check whether your country is included by visiting spotify.com/duo.

Historically, Apple Music and Spotify have mirrored each other's membership types, so it is at least possible that Apple will choose to offer a similar account for couples.


Windows 10 May 2020 Update is rolling out much faster than Microsoft’s last big upgrade - TechRadar

Posted: 01 Jul 2020 04:51 AM PDT

Windows 10 May 2020 Update is off to a robust start, accelerating to a solid level of adoption in its first month of release – and moving much faster than Microsoft's previous major update in 2019.

This is going by figures from AdDuplex, which compiles monthly reports breaking down the adoption of different versions of Windows 10, and found that in its first month (plus a few days) of release, the May 2020 Update is now on 7% of PCs.

That's based on a sample of almost 150,000 Windows 10 machines which run AdDuplex's adverts (in apps and games from the Microsoft Store).

What's really interesting is the comparison with AdDuplex's stats for the previous major upgrade to Windows 10 – the May 2019 Update (remember that the November 2019 Update was only a minor, service pack-style affair) – which achieved only 4.9% adoption in its first full month after release (and around 1.4% in its first week of availability before that).

Cautious approach

You may recall that Microsoft adopted a slightly more cautious approach to the May 2019 Update rollout following the disastrous previous upgrade, the October 2018 Update.

Speaking of the latter, the most bug-ridden Windows 10 update ever, it had only reached 6.6% adoption by the end of 2018, three months after release (although the rollout was, of course, paused for a month after that awful file deletion bug reared its ugly head).

So as you can see, the May 2020 Update hitting 7% straight off the bat in its first month represents a much speedier rollout.

That said, the upgrade has not been without problems, most recently interfering with the storage of some PCs on which it's installed. If you've made the move and are encountering problems, we've got a full guide explaining how to fix the most common May 2020 Update issues.


Uncovered: 1,000 phrases that incorrectly trigger Alexa, Siri, and Google Assistant - Ars Technica

Posted: 01 Jul 2020 06:47 AM PDT


As Alexa, Google Home, Siri, and other voice assistants have become fixtures in millions of homes, privacy advocates have grown concerned that their near-constant listening to nearby conversations could pose more risk than benefit to users. New research suggests the privacy threat may be greater than previously thought.

The findings demonstrate how common it is for dialog in TV shows and other sources to produce false triggers that cause the devices to turn on, sometimes sending nearby sounds to Amazon, Apple, Google, or other manufacturers. In all, researchers uncovered more than 1,000 word sequences—including those from Game of Thrones, Modern Family, House of Cards, and news broadcasts—that incorrectly trigger the devices.

"The devices are intentionally programmed in a somewhat forgiving manner, because they are supposed to be able to understand their humans," one of the researchers, Dorothea Kolossa, said. "Therefore, they are more likely to start up once too often rather than not at all."

That which must not be said

Examples of words or word sequences that provide false triggers include

  • Alexa: "unacceptable," "election," and "a letter"
  • Google Home: "OK, cool," and "Okay, who is reading"
  • Siri: "a city" and "hey jerry"
  • Microsoft Cortana: "Montana"

The two videos below show a GoT character saying "a letter" and a Modern Family character uttering "hey Jerry," activating Alexa and Siri, respectively.

[Video: Accidental Trigger #1 - Alexa - Cloud]
[Video: Accidental Trigger #3 - Hey Siri - Cloud]

In both cases, the phrases first activate the devices locally, where on-device algorithms analyze the audio; after mistakenly concluding that it likely contains a wake word, the devices then send the audio to remote servers, where more robust checking mechanisms also mistake the words for wake terms. In other cases, the words or phrases trick only the local wake-word detection but not the algorithms in the cloud.
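To make that two-stage flow concrete, here is a minimal Python sketch of how such a pipeline works in principle: a cheap on-device score decides whether audio is uploaded at all, and a heavier cloud model makes the final call. The function names, threshold, and toy scoring logic are purely illustrative assumptions, not the researchers' code or any vendor's actual implementation.

def local_wake_score(audio_clip: str) -> float:
    # Hypothetical stand-in for a lightweight on-device acoustic model.
    # Here we simply pretend that clips containing "letter" sound like "Alexa".
    return 0.8 if "letter" in audio_clip else 0.1

def cloud_verify(audio_clip: str) -> bool:
    # Hypothetical stand-in for the manufacturer's heavier server-side check,
    # which the researchers found can also be fooled by near-miss phrases.
    return "letter" in audio_clip

def handle_audio(audio_clip: str, local_threshold: float = 0.5) -> str:
    # Stage 1: on-device detection. Below the threshold, nothing leaves the device.
    if local_wake_score(audio_clip) < local_threshold:
        return "ignored locally"
    # Stage 2: the captured audio is uploaded for a second, stricter check.
    if cloud_verify(audio_clip):
        return "false trigger: audio sent to the cloud and accepted there"
    return "audio sent to the cloud but rejected there"

print(handle_audio("a letter"))   # fools both stages in this toy example
print(handle_audio("a lantern"))  # never leaves the device

The privacy-relevant case is the one where both stages misfire, because that is when audio of whatever was being said nearby actually reaches the manufacturer's servers.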

Unacceptable privacy intrusion

When devices wake, the researchers said, they record a portion of what's said and transmit it to the manufacturer. The audio may then be transcribed and checked by employees, in an attempt to improve word recognition. The result: fragments of potentially private conversations can end up in the company logs.

The risk to privacy isn't solely theoretical. In 2016, law enforcement authorities investigating a murder subpoenaed Amazon for Alexa data transmitted in the moments leading up to the crime. Last year, The Guardian reported that Apple employees sometimes transcribe sensitive conversations overheard by Siri. They include private discussions between doctors and patients, business deals, seemingly criminal dealings, and sexual encounters.

The research paper, titled "Unacceptable, where is my privacy?," is the product of Lea Schönherr, Maximilian Golla, Jan Wiele, Thorsten Eisenhofer, Dorothea Kolossa, and Thorsten Holz of Ruhr University Bochum and Max Planck Institute for Security and Privacy. In a brief write-up of the findings, they wrote:

Our setup was able to identify more than 1,000 sequences that incorrectly trigger smart speakers. For example, we found that depending on the pronunciation, «Alexa» reacts to the words "unacceptable" and "election," while «Google» often triggers to "OK, cool." «Siri» can be fooled by "a city," «Cortana» by "Montana," «Computer» by "Peter," «Amazon» by "and the zone," and «Echo» by "tobacco." See videos with examples of such accidental triggers here.

In our paper, we analyze a diverse set of audio sources, explore gender and language biases, and measure the reproducibility of the identified triggers. To better understand accidental triggers, we describe a method to craft them artificially. By reverse-engineering the communication channel of an Amazon Echo, we are able to provide novel insights on how commercial companies deal with such problematic triggers in practice. Finally, we analyze the privacy implications of accidental triggers and discuss potential mechanisms to improve the privacy of smart speakers.

The researchers analyzed voice assistants from Amazon, Apple, Google, Microsoft, and Deutsche Telekom, as well as three Chinese models by Xiaomi, Baidu, and Tencent. Results published on Tuesday focused on the first four. Representatives from Amazon, Apple, Google, and Microsoft didn't immediately respond to a request for comment.

The full paper hasn't yet been published, and the researchers declined to provide a copy ahead of schedule. The general findings, however, already provide further evidence that voice assistants can intrude on users' privacy even when people don't think their devices are listening. For those concerned about the issue, it may make sense to keep voice assistants unplugged, turned off, or blocked from listening except when needed—or to forgo using them at all.

