
Friday, February 23, 2018

Technology - Google News


Google is bringing ARCore out of beta and launching Lens search on more phones

Posted: 23 Feb 2018 08:00 AM PST

Google is bringing its augmented reality system ARCore out of beta with new features, and it's making its Lens visual search tool part of Google Photos on all phones. Today, ARCore 1.0 launches on all Google Pixel phones, all recent Samsung flagship phones, the Android O version of LG's V30 and V30 Plus, the Asus ZenFone AR, and the OnePlus 5. In addition to the preview version's capabilities, ARCore 1.0 includes support for anchoring virtual objects to any textured surface, not just flat, horizontal ones. Google boasts that 100 million Android phones currently support the platform, and it's working with several companies (Samsung, Huawei, LGE, Motorola, Asus, Xiaomi, HMD / Nokia, ZTE, Sony Mobile, and Vivo) to certify new ARCore phones in the future.

Lens was previously Pixel-only, but now it's available through Google Photos on Android and on iOS 9 or later, as well as through Google Assistant on several Android flagship phones. It's also supposed to have improved support for recognizing common animal breeds and plant types. Lens can examine photos you've already taken through Photos, and if you're using a Samsung, Huawei, LG, Motorola, Sony, or HMD / Nokia flagship with Assistant, you can simply pull out your phone and point it at something.

ARCore launched as a limited preview in August 2017, and Lens arrived shortly after in October. Google Lens senior director of product Aparna Chennapragada describes them as two sides of the same coin: Lens is a "camera-in" system that helps you make sense of the visual world, and ARCore is a "camera-out" system that makes the world appear differently through your phone. Lens is an extension of Google's text and voice search capabilities, and ARCore lets developers easily build augmented reality apps, similar to Apple's ARKit for iOS. ARKit recently added support for vertical planes and image recognition, and Google's own update could help raise the bar for both mobile AR platforms.

We've seen things like ARCore stickers from Google already, but as of today, developers can upload their own ARCore-based apps to the Play Store. Apps that already feature augmented reality capabilities can also integrate ARCore tech for better performance: Snap, for instance, is supplementing its "world lens" feature with ARCore, introducing a new experience that simulates entering Barcelona's Camp Nou stadium through a high-tech portal. Snap has put a lot of work into augmented reality, but "they don't go out and certify and calibrate millions and millions of cameras," says Amit Singh, VP of business and operations for Google VR. Google is also updating Android Studio to let developers preview AR apps on the desktop.

In addition to Snap's experience, Google is partnering with a few other developers to celebrate ARCore 1.0's launch. Sotheby's International Realty, furniture company Otto, and e-commerce company JD.com will let you preview rooms, furniture, appliances, and other goods. Porsche will let Android users check out a version of its Mission E concept vehicle, and an upcoming mobile game called Ghostbusters World will (naturally) let players trap ghosts that appear in the real world. Google is also taking its platform outside the Play Store ecosystem in China, partnering with Xiaomi and Huawei to distribute ARCore-powered apps through independent app stores.

The long-term vision for ARCore and Lens is pretty exciting. As Chennapragada points out, Google could easily connect Android's "camera-in" and "camera-out" functions. She offered the example of seeing a nice piece of furniture at a friend's house, taking a picture for Google to identify, and automatically calling up a 3D model of it to preview back at home. "There's a reason why we're talking about these two things together," she says. Lens-style visual search could also expand beyond phones to something like Google's VR180 point-and-shoot video camera line, where it could seamlessly identify or annotate objects. "We haven't figured out what the user experience [is like], and you don't want to add more cognitive load to the experience," she says. "But certainly behind the scenes, I think that's one of the things we're looking at."

For now, though, both Lens and ARCore are relatively simple and limited. Even so, the next few weeks should see them become more sophisticated and more widely available.

Sony's new flagship Xperia XZ2 and XZ2 Compact phones leak ahead of MWC

Posted: 23 Feb 2018 06:28 AM PST

Sony's Mobile World Congress plans seem to have leaked ahead of time: details of its new flagship Xperia XZ2 and XZ2 Compact smartphones hit the internet a few days early, according to Evan Blass at VentureBeat.

According to Blass, Sony will be announcing the 5.7-inch XZ2 (a follow-up to last year's XZ model) alongside a smaller 5.0-inch XZ2 Compact model (which had a prototype show up online earlier this week).

Both phones are said to offer Qualcomm's top-of-the-line Snapdragon 845 processor and ship with Android Oreo. Sony will reportedly be offering the same 19-megapixel, ƒ/1.8 camera from the XZ Premium. (That might not be a good thing, given that my colleague Vlad Savov found it to be fairly unreliable in his tests last year.) Also included is a rear fingerprint sensor, as opposed to the button-based sensors Sony has used in the past. Hopefully the company will continue the trend from the Xperia XA2 and offer fingerprint functionality in the US.

There are a few differences between the larger XZ2 and smaller XZ2 Compact beyond size, though. The XZ2 will apparently offer glass on both sides of the phone and support wireless charging in addition to USB-C, while the XZ2 Compact will feature a polycarbonate case back instead and lack the wireless charging found on the larger model.

Additionally, the full-size XZ2 will apparently have a unique new feature that syncs haptic feedback to audio, letting the phone vibrate along with the sound it's playing. (That could be what Sony's mysterious teaser from the beginning of the week was alluding to, if the report is correct.)

More details on the two devices should come when Sony officially announces them at MWC during its press conference on Monday.

Google Assistant is adding Routines and location-based reminders

Posted: 23 Feb 2018 07:52 AM PST

Alongside news of Google Assistant's forthcoming multilingual support and the addition of more languages this year, Google also announced this morning that its smart assistant will soon gain two new features: Routines and location-based reminders.

Google has been promising Routines for some time.

The feature, which lets Google Assistant users string together multiple commands, was first announced back in October 2017. With Routines, you can create personalized commands and responses – for example, saying "OK Google, I'm home" could turn on the lights, adjust the thermostat, and play some music.

Amazon announced a similar feature for its rival Alexa assistant in September 2017 and launched it the following month. Google is playing catch-up here, but it's doing so quickly.

This isn't the only way Google Assistant can multitask, however. Previously, you could already say two commands in one sentence – like "turn on the TV and what's the weather?" – and the smart assistant would perform both actions.

However, Routines will support tying together more than two commands, and will associate them with a trigger phrase.

According to Google, Routines will first launch in the U.S. in the weeks ahead, and will let users personalize six routines that help with their morning, commutes to and from work, and their evening at home.

In addition to Routines, location-based reminders are another new feature set to roll out in the near future.

This option is already available in Google Assistant on smartphones, but it will now be integrated into Google Home devices, as well.

Of course, the Google Home smart speaker itself stays in one place – but it can be used to create location-based reminders that will later alert you on your phone.

For example, you could tell your Google Home device to remind you to buy milk when you're at the grocery store.

Google didn't give an exact launch date for either feature as it's a staged rollout, but says they'll start arriving next week.
