
Tuesday, June 21, 2022


Technology - Google News


New Philips Hue options include flexible track lighting, battery-powered table lamp, and more - 9to5Mac

Posted: 21 Jun 2022 04:29 AM PDT

Signify has announced a wide array of new Philips Hue lighting products to its ever-growing range, as well as a new switch designed to make it easy to dim lights and select scenes.

Additionally, the Hue app has been updated to offer a new Sunrise option to smart home owners, allowing existing color lights to mimic the colors of a sunrise to gently wake you in the morning.

New Philips Hue track lighting system

Perhaps the most interesting option is a flexible track lighting system. In a way, this is mood lighting come full circle! Long before smart lighting, you could buy colored spotlights to fit to a ceiling track and create color casts on walls. Now Hue owner Signify has brought track lighting up to date.

Key to the new product is its flexibility, letting you mix-and-match different types of lights.

Become your home's personal lighting designer with Philips Hue Perifo track lighting.

Perifo is a new line from Philips Hue that is made up of individual rails that fit together to make a fully customizable track. You choose the layout and length of the track and what lights to include to get full control over the way you light your home.

The track can be attached to the wall or ceiling and connected to a standard outlet or existing wires using the included power supply unit. Then the real fun begins as you click your choice of smart lights into the track and position them exactly how you want.

You can combine your choice of color-capable spotlights, pendants, light bars, and light tubes in a single Perifo track to set the mood for any occasion while creating a unique design feature.

Battery-powered table lamp

The first Hue Go battery-powered lamp was launched back in 2017, and remains popular today for subtle mood lighting. But the latest addition to the portable Go range is a more practical table lamp.

Now you can take smart light with you from the living room to the patio table, or anywhere inside and outside your home with the new Philips Hue Go portable table lamp. The portable table lamp is designed for indoor and outdoor use and features a silicone grip, making it easy to carry wherever you need light. The Hue Go portable table lamp offers up to 48 hours of battery life and is easily recharged using the included charging base. Whether you're reading in bed or having dinner on the patio, the lamp's button lets you cycle through preset light scenes to create just the right mood.

Other new Philips Hue lights

There are a couple of new gradient lamps, which allow a mix of colors to be displayed at the same time.

The new Philips Hue Signe gradient lamp in oak has a slender profile and a natural wood-toned base, and is designed to be both a statement piece and a subtle accent for the bedroom. It's available as both a table lamp and a floor lamp.

There are also a couple of bathroom ceiling lights designed to cope with a high humidity environment.

The Philips Hue Xamento recessed spot in black brings subtle, contemporary design to the bathroom while offering up to 350 lumens in millions of colors of dimmable smart light just where you need it. The Xamento recessed spot in black is available as a single or 3-pack. The Philips Hue Xamento M ceiling light in black makes a decorative centerpiece in the bathroom while filling the space with subtle, diffused light thanks to its unique design.

Philips Hue switch for easy dimming and scene selection

A new Tap dial switch offers four programmable buttons intended for scene selection, while the rotary dial around the outside allows easy dimming.

The Tap dial switch has four buttons, and each button can be set to control smart lights in up to three separate rooms or zones around the home. Tap a button to choose or adjust any light scene instantly. The Tap dial also comes with intuitive dimming control — the faster or slower you turn the dial, the faster or slower your smart lights dim or brighten. The switch also has a sleek, matte design in black or white, making it fit in with decor anywhere around the home. It can also be used as a handy remote control or even mounted magnetically onto any metal surfaces.
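
For readers who automate their lights, the actions a Tap dial press or turn triggers (recalling a scene for a room or zone, or dimming a group) can also be reproduced against the Hue Bridge's local REST API. The sketch below is only an illustration, assuming the v1 bridge API and hypothetical values for the bridge address, application key, group ID, and scene ID; the Tap dial itself is configured in the Hue app, not in code.

```python
# Rough equivalents of a Tap dial button press (recall a scene) and dial turn
# (dim a group), expressed against the Hue Bridge's local v1 REST API.
# BRIDGE_IP, APP_KEY, GROUP_ID, and SCENE_ID are placeholders for your own setup.
import requests

BRIDGE_IP = "192.168.1.2"          # hypothetical bridge address
APP_KEY = "your-authorized-key"    # created by pressing the bridge's link button
GROUP_ID = "1"                     # a room or zone defined in the Hue app
SCENE_ID = "abc123"                # a scene ID, e.g. from GET /api/<key>/scenes

BASE = f"http://{BRIDGE_IP}/api/{APP_KEY}"


def recall_scene(group_id: str, scene_id: str) -> None:
    """Apply a saved scene to a group, as a Tap dial button press would."""
    requests.put(f"{BASE}/groups/{group_id}/action", json={"scene": scene_id})


def dim_group(group_id: str, brightness: int, transition_ds: int = 10) -> None:
    """Set group brightness (1-254), as turning the dial would.

    transition_ds is in deciseconds; a faster "turn" maps to a smaller value.
    """
    requests.put(
        f"{BASE}/groups/{group_id}/action",
        json={
            "on": True,
            "bri": max(1, min(254, brightness)),
            "transitiontime": transition_ds,
        },
    )


if __name__ == "__main__":
    recall_scene(GROUP_ID, SCENE_ID)  # "tap a button to choose a scene"
    dim_group(GROUP_ID, 100)          # "turn the dial to dim"
```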

Sunrise wake-up scene

The Hue app already offers a gentle wake-up scene, designed to gradually fade in light, and this has now been supplemented by a new Sunrise option.

The new Sunrise effect with its rich, colorful transition through blue to soft orange light, mimics the sun appearing over the horizon — giving you a relaxing wake-up call in the morning or at any time of day. The Sunrise wake-up style can be found in the Philips Hue app under the Wake up Automations Tab for Hue Bridge users and the Routines Tab for Bluetooth users. The wake-up style can be customized for duration and time of day. Users of the existing wake-up style — now called Fade to Bright — can upgrade their wake-up experience by simply selecting Sunrise from the edit screen of the Automation.
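
The Sunrise style is built into the Hue app, but the underlying idea, a slow transition from dim blue to bright soft orange over a set duration, can be approximated with a few calls to the bridge's local API. The sketch below is not the app's implementation; it assumes the v1 API, hypothetical bridge, key, and light ID values, and rough hue numbers for blue and orange.

```python
# A rough approximation of a sunrise-style wake-up on one color bulb, via the
# Hue Bridge's local v1 REST API. This is NOT the app's Sunrise automation,
# just an illustration of fading from dim blue to bright soft orange.
import time

import requests

BRIDGE_IP = "192.168.1.2"        # hypothetical bridge address
APP_KEY = "your-authorized-key"  # hypothetical application key
LIGHT_ID = "4"                   # hypothetical color-capable bulb

BASE = f"http://{BRIDGE_IP}/api/{APP_KEY}"


def sunrise(light_id: str, minutes: int = 15, steps: int = 10) -> None:
    """Fade from dim blue to bright soft orange over `minutes`."""
    start_hue, end_hue = 46920, 7000  # approximate blue and orange on the 0-65535 hue wheel
    step_seconds = (minutes * 60) / steps
    for i in range(steps + 1):
        t = i / steps
        state = {
            "on": True,
            "hue": int(start_hue + (end_hue - start_hue) * t),
            "sat": 254,
            "bri": int(1 + 253 * t),                   # 1 (dim) up to 254 (full)
            "transitiontime": int(step_seconds * 10),  # deciseconds
        }
        requests.put(f"{BASE}/lights/{light_id}/state", json=state)
        if i < steps:
            time.sleep(step_seconds)


if __name__ == "__main__":
    sunrise(LIGHT_ID, minutes=15)
```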

Pricing and availability of the new Philips Hue products

Some are available from today in Europe, while the US will need to wait until later in the summer.

  • Philips Hue Perifo rail black or white (End of Summer 2022 in EU)
    EU: EUR 49.99 – 89.99
  • Philips Hue Perifo connectors black or white (End of Summer 2022 in EU)
    EU: EUR 19.99 – 29.99
  • Philips Hue Perifo PSU wall or ceiling (End of Summer 2022 in EU)
    EU: EUR 99.99
  • Philips Hue Perifo track lights (End of Summer 2022 in EU)
    EU: EUR 119.99 – 299.99
  • Philips Hue Go portable table lamp (End of Summer 2022 in EU and NAM)
    EU: EUR 149.99
    NAM: USD 159.99
  • Philips Hue Signe gradient table oak (available June 21 in EU)
    EU: EUR 239.99
  • Philips Hue Signe gradient floor oak (available June 21 in EU and mid-July in NAM)
    EU: EUR 349.99
    NAM: USD 349.99
  • Philips Hue White and color ambiance downlight — generation two (Available June 21)
US and Canada: 4-inch USD 59.99, 5-/6-inch USD 59.99, 5-/6-inch 4-pack USD 219.99
  • Philips Hue White ambiance downlight — generation three (Available June 21)
US and Canada: 4-inch USD 49.99, 5-/6-inch USD 49.99, 5-/6-inch 4-pack USD 189.99
  • Philips Hue Xamento recessed spot black (available June 21 in EU)
    EU: EUR 79.99 single pack / 3-pack EUR 219.99
  • Philips Hue Xamento M ceiling light black (available June 21 in EU)
    EU: EUR 219.99
  • Philips Hue Tap dial switch in black or white (available June 21 in EU and NAM)
    EU: EUR 49.99
    NAM: USD 49.99

More details can be found on the Hue website.


Microsoft Plans to Eliminate Face Analysis Tools in Push for ‘Responsible A.I.’ - The New York Times

Posted: 21 Jun 2022 09:02 AM PDT

The technology giant will stop offering automated tools that predict a person's gender, age and emotional state and will restrict the use of its facial recognition tool.

For years, activists and academics have been raising concerns that facial analysis software that claims to be able to identify a person's age, gender and emotional state can be biased, unreliable or invasive — and shouldn't be sold.

Acknowledging some of those criticisms, Microsoft said on Tuesday that it planned to remove those features from its artificial intelligence service for detecting, analyzing and recognizing faces. They will stop being available to new users this week, and will be phased out for existing users within the year.

The changes are part of a push by Microsoft for tighter controls of its artificial intelligence products. After a two-year review, a team at Microsoft has developed a "Responsible AI Standard," a 27-page document that sets out requirements for A.I. systems to ensure they are not going to have a harmful impact on society.

The requirements include ensuring that systems provide "valid solutions for the problems they are designed to solve" and "a similar quality of service for identified demographic groups, including marginalized groups."

Before they are released, technologies that would be used to make important decisions about a person's access to employment, education, health care, financial services or a life opportunity are subject to a review by a team led by Natasha Crampton, Microsoft's chief responsible A.I. officer.

There were heightened concerns at Microsoft around the emotion recognition tool, which labeled someone's expression as anger, contempt, disgust, fear, happiness, neutral, sadness or surprise.

"There's a huge amount of cultural and geographic and individual variation in the way in which we express ourselves," Ms. Crampton said. That led to reliability concerns, along with the bigger questions of whether "facial expression is a reliable indicator of your internal emotional state," she said.

The age and gender analysis tools being eliminated — along with other tools to detect facial attributes such as hair and smile — could be useful to interpret visual images for blind or low-vision people, for example, but the company decided it was problematic to make the profiling tools generally available to the public, Ms. Crampton said.

In particular, she added, the system's so-called gender classifier was binary, "and that's not consistent with our values."

Microsoft will also put new controls on its face recognition feature, which can be used to perform identity checks or search for a particular person. Uber, for example, uses the software in its app to verify that a driver's face matches the ID on file for that driver's account. Software developers who want to use Microsoft's facial recognition tool will need to apply for access and explain how they plan to deploy it.

Users will also be required to apply and explain how they will use other potentially abusive A.I. systems, such as Custom Neural Voice. The service can generate a human voice print, based on a sample of someone's speech, so that authors, for example, can create synthetic versions of their voice to read their audiobooks in languages they don't speak.

Because of the possible misuse of the tool — to create the impression that people have said things they haven't — speakers must go through a series of steps to confirm that the use of their voice is authorized, and the recordings include watermarks detectable by Microsoft.

"We're taking concrete steps to live up to our A.I. principles," said Ms. Crampton, who has worked as a lawyer at Microsoft for 11 years and joined the ethical A.I. group in 2018. "It's going to be a huge journey."


Microsoft, like other technology companies, has had stumbles with its artificially intelligent products. In 2016, it released a chatbot on Twitter, called Tay, that was designed to learn "conversational understanding" from the users it interacted with. The bot quickly began spouting racist and offensive tweets, and Microsoft had to take it down.

In 2020, researchers discovered that speech-to-text tools developed by Microsoft, Apple, Google, IBM and Amazon worked less well for Black people. Microsoft's system was the best of the bunch but misidentified 15 percent of words for white people, compared with 27 percent for Black people.

The company had collected diverse speech data to train its A.I. system but hadn't understood just how diverse language could be. So it hired a sociolinguistics expert from the University of Washington to explain the language varieties that Microsoft needed to know about. It went beyond demographics and regional variety into how people speak in formal and informal settings.

"Thinking about race as a determining factor of how someone speaks is actually a bit misleading," Ms. Crampton said. "What we've learned in consultation with the expert is that actually a huge range of factors affect linguistic variety."

Ms. Crampton said the journey to fix that speech-to-text disparity had helped inform the guidance set out in the company's new standards.

"This is a critical norm-setting period for A.I.," she said, pointing to Europe's proposed regulations setting rules and limits on the use of artificial intelligence. "We hope to be able to use our standard to try and contribute to the bright, necessary discussion that needs to be had about the standards that technology companies should be held to."

A vibrant debate about the potential harms of A.I. has been underway for years in the technology community, fueled by mistakes and errors that have real consequences on people's lives, such as algorithms that determine whether or not people get welfare benefits. Dutch tax authorities mistakenly took child care benefits away from needy families when a flawed algorithm penalized people with dual nationality.

Automated software for recognizing and analyzing faces has been particularly controversial. Last year, Facebook shut down its decade-old system for identifying people in photos. The company's vice president of artificial intelligence cited the "many concerns about the place of facial recognition technology in society."

Several Black men have been wrongfully arrested after flawed facial recognition matches. And in 2020, at the same time as the Black Lives Matter protests after the police killing of George Floyd in Minneapolis, Amazon and Microsoft issued moratoriums on the use of their facial recognition products by the police in the United States, saying clearer laws on its use were needed.

Since then, Washington and Massachusetts have passed legislation requiring, among other things, judicial oversight of police use of facial recognition tools.

Ms. Crampton said Microsoft had considered whether to start making its software available to the police in states with laws on the books but had decided, for now, not to do so. She said that could change as the legal landscape changed.

Arvind Narayanan, a Princeton computer science professor and prominent A.I. expert, said companies might be stepping back from technologies that analyze the face because they were "more visceral, as opposed to various other kinds of A.I. that might be dubious but that we don't necessarily feel in our bones."

Companies also may realize that, at least for the moment, some of these systems are not that commercially valuable, he said. Microsoft could not say how many users it had for the facial analysis features it is getting rid of. Mr. Narayanan predicted that companies would be less likely to abandon other invasive technologies, such as targeted advertising, which profiles people to choose the best ads to show them, because they were a "cash cow."
