
Monday, December 9, 2019


Technology - Google News


Snapchat's Cameo test slips your face into GIFs - Engadget

Posted: 08 Dec 2019 03:23 PM PST


GIFs are great for conveying your reactions, but they're not usually very personal -- and recording yourself probably won't be that exciting. Snap might have found a way to bridge those two worlds. The company has confirmed to TechCrunch that it's testing a Cameo feature which inserts your face into a selection of (currently pre-made) GIFs. You just take a selfie and pick a generic body type, and the clip animates as if it were you, mouth movements included. Think of it as a deepfake, but friendly instead of creepy.

The company said it was testing the feature in some international markets, including France. It's not certain when Cameo will reach the masses, but Snap said to expect a worldwide release "soon."

This could help Snapchat in multiple ways. It's an alternative to Bitmoji for those who think avatars are too cutesy. It also gives Snapchat a way to stand out among social networks and messaging services at the same time, and might give it a source of revenue if it charges for third-party Cameo clips. At least, up until Facebook copies the feature.


Bose’s Noise Cancelling 700 wireless headphones are $50 off at several retailers - The Verge

Posted: 09 Dec 2019 06:44 AM PST

Many popular headphone models, including Apple's AirPods, Sony's WH-1000XM3, and Beats Studio 3, were cheaper than usual during Black Friday and Cyber Monday. The latest model of Bose's flagship wireless noise-canceling headphones, the Noise Cancelling Headphones 700, was missing from the festivities, but you can now get them for $50 off through Amazon, Best Buy, Walmart, and Target.

The $50 discount brings the headphones down to $349. That may not sound like much, but it's the biggest price drop we've seen on this new model yet. According to The Verge's Chris Welch, they offer some great features that you won't find on any other set of cans. Highlights include USB-C charging, but more importantly, voice calls are fantastic. The Noise Cancelling Headphones 700 can also connect to two devices simultaneously over Bluetooth, and the noise cancellation is (unsurprisingly) great. This model currently sits atop our ranking of the best wireless headphones.

Usually, Beats, Bose, and Sony headphones stubbornly hang on to their retail price for quite a while, so if you've been waiting for a price reduction for 2019's finest over-ear wireless headphones, now is your moment.


Cloudy with a chance of neurons: The tools that make neural networks work - Ars Technica

Posted: 09 Dec 2019 05:00 AM PST

[Image: Machine learning is really good at turning pictures of normal things into pictures of eldritch horrors. Credit: Jim Salter]

Artificial Intelligence—or, if you prefer, Machine Learning—is today's hot buzzword. Unlike many buzzwords that have come before it, though, this isn't vaporware or a dream—it's real, it's here already, and it's changing your life whether you realize it or not.

A quick overview of AI/ML

Before we go too much further, let's talk quickly about that term "Artificial Intelligence." Yes, it's warranted; no, it doesn't mean KITT from Knight Rider, or Samantha, the all-too-human unseen digital assistant voiced by Scarlett Johansson in 2013's Her. Aside from being fictional, KITT and Samantha are examples of strong artificial intelligence, also known as Artificial General Intelligence (AGI). On the other hand, artificial intelligence—without the "strong" or "general" qualifiers—is an established academic term dating back to the 1955 proposal for the Dartmouth Summer Research Project on Artificial Intelligence (DSRPAI), written by John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon.

All "artificial intelligence" really means is a system that emulates problem-solving skills normally seen in humans or animals. Traditionally, there are two branches of AI—symbolic and connectionist. Symbolic means an approach involving traditional rules-based programming—a programmer tells the computer what to expect and how to deal with it, very explicitly. The "expert systems" of the 1980s and 1990s were examples of symbolic (attempts at) AI; while occasionally useful, it's generally considered impossible to scale this approach up to anything like real-world complexity.

[Image: Sadly, we're not here yet. Credit: NBCUniversal]

Artificial Intelligence in the commonly used modern sense almost always refers to connectionist AI. Connectionist AI, unlike symbolic AI, isn't directly programmed by a human. Artificial neural networks are the most common type of connectionist AI, also sometimes referred to as machine learning. My colleague Tim Lee just got done writing about neural networks last week—you can get caught up right here.

If you wanted to build a system that could drive a car, instead of programming it directly you might attach a sufficiently advanced neural network to its sensors and controls, and then let it "watch" a human driving for tens of thousands of hours. The neural network begins to attach weights to events and patterns in the data flow from its sensors that allow it to predict acceptable actions in response to various conditions. Eventually, you might give the network conditional control of the car and allow it to accelerate, brake, and steer on its own—but still with a human available. The partially trained neural network continues learning whenever the human assistant takes the controls away from it: "Whoops, shouldn't have done that," and the neural network adjusts its weighted values again.
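
A toy sketch can make that "adjusts weighted values" step concrete. The snippet below is purely illustrative, not anyone's real driving code: a single linear "neuron" in NumPy nudges its weights whenever its prediction disagrees with what the (simulated) human driver actually did.

    import numpy as np

    # Purely illustrative: one linear "neuron" learns weights mapping three
    # sensor channels to a steering value. Real driving networks are vastly
    # larger, but the weight adjustment works on the same principle.
    rng = np.random.default_rng(0)
    weights = rng.normal(size=3)   # one weight per sensor channel
    learning_rate = 0.01

    # Pretend log of (sensor readings, steering angle the human chose).
    sensor_log = rng.normal(size=(1000, 3))
    human_steering = sensor_log @ np.array([0.5, -0.2, 0.1])

    for sensors, target in zip(sensor_log, human_steering):
        error = sensors @ weights - target          # "whoops, shouldn't have done that"
        weights -= learning_rate * error * sensors  # adjust the weighted values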

Sounds very simple, doesn't it? In practice, not so much—there are many different types of neural networks (simple, convolutional, generative adversarial, and more), and none of them is very bright on its own—the brightest is roughly similar in scale to a worm's brain. Most complex, really interesting tasks will require networks of neural networks that preprocess data to find areas of interest, pass those areas of interest onto other neural networks trained to more accurately classify them, and so forth.

One last piece of the puzzle is that, when dealing with neural networks, there are two major modes of operation: inference and training. Training is just what it sounds like—you give the neural network a large batch of data that represents a problem space, and let it chew through it, identifying things of interest and possibly learning to match them to labels you've provided along with the data. Inference, on the other hand, is using an already-trained neural network to give you answers in a problem space that it understands.
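
In code, the two modes map onto two calls. Here is a minimal sketch using TensorFlow's Keras API with made-up stand-in data: fit() is training, predict() is inference.

    import numpy as np
    import tensorflow as tf

    # Stand-in problem space: random 4-number inputs, labeled by a simple rule.
    X_train = np.random.rand(500, 4).astype("float32")
    y_train = (X_train.sum(axis=1) > 2.0).astype("int32")

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(2, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

    # Training: the network chews through the labeled batch, adjusting weights.
    model.fit(X_train, y_train, epochs=5)

    # Inference: the already-trained network answers about data it has never seen.
    X_new = np.random.rand(3, 4).astype("float32")
    print(model.predict(X_new))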

Both inference and training workloads can operate several orders of magnitude more rapidly on GPUs than on general-purpose CPUs—but that doesn't necessarily mean you want to do absolutely everything on a GPU. It's generally easier and faster to run small jobs directly on CPUs rather than invoking the initial overhead of loading models and data into a GPU and its onboard VRAM, so you'll very frequently see inference workloads run on standard CPUs.
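
With TensorFlow, for example, you can pin a small job to the CPU explicitly rather than paying the transfer overhead. In this sketch, small_model, small_batch, big_model, and training_data are all hypothetical stand-ins:

    import tensorflow as tf

    # Small one-off inference: loading model and data into GPU VRAM can cost
    # more than it saves, so run it on the CPU. (All names here are hypothetical.)
    with tf.device("/CPU:0"):
        answers = small_model.predict(small_batch)

    # A big training run amortizes that transfer cost, so it belongs on the GPU.
    with tf.device("/GPU:0"):
        big_model.fit(training_data, epochs=10)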

AI you interact with daily

Some readers may remember the rallying cry of Linux nerds 20 years ago: "You use Linux every day, you just don't know it," followed by an exhaustive list of everything from your digital watch to your microwave oven that typically ran on embedded Linux kernels. This is where artificial intelligence is today—unnoticed by many, but nearly impossible to avoid. If you're a Facebook user, you see bounding boxes around the faces in all your photographs—and frequently, your friends' faces automatically tagged in freshly uploaded photos, with no interaction from you at all. This is the result of neural networks on the job, first identifying likely face areas in photographs, then passing those bounded areas to more neural networks which have already been trained to identify your friends' faces on sight.

YouTube users get recommendations for new videos to play after the current video ends. These, too, are generated by neural networks, which in this case have been trained to identify patterns leading to deeper user engagement with the service—meaning longer time spent on YouTube, and longer times spent watching each suggested video (not just clicking rapidly from one to the next without watching through). All these neural networks care about is engagement—they don't actually understand or care about the content of the videos in a human sense, which has led to serious problems when "increasing engagement" also happened to mean "radicalizing users into far-right terrorists."

Similarly, your phone uses neural networks to enhance your pictures, to understand your natural language commands—"Where's the nearest gas station?"—and so forth. AI isn't near-future tech. It's here right now.

Discovering AI with Google Colab

It's surprisingly easy to get started playing with AI in the cloud, if you know where to look. Although I am a senior sysadmin, and I've understood the basic concepts of AI for a year or two, I had never directly developed or supported any AI technology prior to this week. But after a couple hours' initial searching, I came across a Medium article that led me to Google's excellent Colab research project—and from there, it was a few short hours to testing, building, and deploying my own AI version of Hello World.

Colab, like many AI exploratory suites, largely consists of a feature-enhanced version of the Jupyter notebook platform. Jupyter can be thought of as a sort of interactive Wikipedia with executable code in it. A Jupyter notebook will associate itself to a running kernel and environment which can automatically download and install applications, frameworks, and data as the user steps through it.
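
For example, the first cell of such a notebook often installs its own dependencies; the "!" prefix is Jupyter's syntax for running a shell command from inside the notebook. A representative sketch, not a cell from any particular Colab notebook:

    # First cell: the notebook installs frameworks into its own kernel environment.
    !pip install tensorflow pillow

    # A later cell then imports and uses what the first cell installed.
    import tensorflow as tf
    print(tf.__version__)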

Breaking the code down into well-organized, individually executable steps—and interspersing Markdown that wraps it into an attractive, readable framework—offers an excellent introduction to new kinds of code. It also makes for a good way to prototype new approaches. While you wouldn't likely use a Jupyter notebook for a production task, you absolutely can use one to see the results of your code, and of changes to that code, in a rapid and discoverable way—and once you've gotten your approach right, you can literally download the code from the notebook as executable Python and modify it from there into something useful in production.

The "Getting started" notebook that I began with downloads the Inception V3 image classification model, along with the necessary frameworks and APIs to interact with it, and it steps you through the process of classifying your first image with a pre-trained neural network. My copy of Inception V3 was trained on the ImageNet dataset, which means it can recognize common objects and animals but not human images—it has no idea what a human being is.

Playing with Inception and feeding it various photos can be an enlightening experience—you rapidly learn to recognize things it has or has not been trained to look for, along with some of the common pitfalls of image recognition. In particular, it gets confused easily by images with lots of potential focal points. You shouldn't really feed an image classifier a large, complex image—you should instead feed the large, complex image into an object detection network, which then isolates areas of potential interest in the photo, tightly crops them, and feeds those to your classifier.
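
That two-stage pipeline is easy to express in outline. In the sketch below, detect_objects is a hypothetical stand-in for any object-detection network that returns bounding boxes, and classify is an Inception-style classifier like the one above:

    from PIL import Image

    # Hypothetical two-stage pipeline: find the focal points first, then
    # classify each tight crop instead of the whole cluttered photo.
    def classify_complex_image(path):
        img = Image.open(path)
        results = []
        for box in detect_objects(img):     # hypothetical detector returning (l, t, r, b) boxes
            crop = img.crop(box)            # tightly crop one area of interest
            results.append(classify(crop))  # the classifier sees a single subject
        return results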

Setting up your own, simpler environment with Digital Ocean

[Image: If you want to set up Jupyter on your own Digital Ocean droplet—or your own Linux computer!—Digital Ocean's tutorial makes it easy. Credit: Jim Salter]

I spent half an hour or so playing with Inception in the Jupyter notebook at Colab, and then decided I wanted to do mass classification. So I downloaded the Python code from the notebook, and headed to Digital Ocean to learn how to install the underlying frameworks necessary for that code to run. Digital Ocean has truly excellent beginner tutorials for all sorts of Linux-based projects, including one for setting up Jupyter Notebook on Ubuntu 18.04.

You won't get big GPU instances on Digital Ocean, but it's an excellent platform to set up inexpensive, reliable virtual machines for all kinds of projects, including inference workloads like this one. Digital Ocean makes it easy to rapidly deploy virtual machines—which they call "droplets"—from a variety of families. If you're going to do any significant AI work, you will need to choose one of their CPU-optimized droplets (beginning at $40/mo), not the cheaper "standard" droplets—otherwise you'll end up getting your droplet disabled for runaway CPU usage patterns.

If you're a Linux user at home, you can also choose to just follow Digital Ocean's tutorial to set up the Jupyter environment on your own hardware, which is exactly what I did. Once Jupyter is installed and its environment created, you can run jupyter notebook from that environment, then download and run the entire .ipynb file you used at Colab on your own hardware. But you can also just download the Python code itself and run it directly from the command line.

It took 15 to 30 seconds to step through the code blocks in the Jupyter notebook I was using, but most of that was setup time—it should be much faster and easier to process lots of images by taking the raw Python code, replacing Google Colab's single-image-upload code with a loop that finds all the images in a given directory, and processing them in a batch. With a little work, I did just that—and sure enough, my Ryzen 7 3700X workstation could chew through about 10 high-res photos a second, one after another in a batch.
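
The change amounts to wrapping the classification step in a directory loop. A minimal sketch, where classify_image stands in for the classification code downloaded from the notebook:

    from pathlib import Path

    # Batch version of the notebook's single-image flow: find every JPEG in
    # a directory and classify them one after another.
    for img_path in sorted(Path("photos").glob("*.jpg")):
        preds = classify_image(img_path)  # hypothetical: the notebook's classify step
        print(img_path.name, preds)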

The last step was modifying the output code I'd lifted from the Jupyter notebook to paste the classifier-generated result chart onto the bottom of the source image I'd fed it, then dump the resulting composite image into an output folder. I wasn't much more experienced with Python code than with AI, but that wasn't too difficult either. Only a few hours into my initial foray into machine learning, I was using my command-line tool to process hundreds of personal photos, memes, and what-have-you at 10 images per second.
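
The compositing itself is only a few lines of Pillow. A sketch under the assumption that chart is already a PIL image of the classifier's result chart:

    from PIL import Image

    # Paste the result chart onto the bottom of the source photo, then save
    # the composite into the output folder.
    def save_composite(source, chart, out_path):
        canvas = Image.new("RGB", (max(source.width, chart.width),
                                   source.height + chart.height), "white")
        canvas.paste(source, (0, 0))             # original photo on top
        canvas.paste(chart, (0, source.height))  # classifier chart below
        canvas.save(out_path)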

Deploying at scale with Google Cloud, Amazon Web Services, or Azure

So far, so good. It was a fairly quick and easy process to go from Google's hosted Jupyter notebooks at Colab to a self-managed Jupyter instance at Digital Ocean (or on your own Linux PC). From there to Python code that can run in production was equally painless. But this is pretty simple, lightweight stuff—what do you do if you need to run massive training workloads, or want to build a scalable, publicly deployed service using this technology?

We couldn't go into as much detail here as we did with the initial steps of getting your feet wet, because we have neither a real project to deploy nor the budget to do so. One thing you realize quickly when looking through the scalable offerings from Amazon, Google, and Microsoft is that the costs can be difficult to forecast, and they can rise fast. All three platforms offer free introductory service periods—but Google's is the only one where I felt completely certain I wouldn't suddenly be charged for something I wasn't expecting.

Setting up services at Azure, Google Cloud, or AWS is a much, much less discoverable process than with Colab, which is entirely free and automatic. I spent several hours on Azure just trying to get my first Jupyter notebook instance running; it's a frustrating loop of "yes, you've done that, but have you done this?" that will probably feel very familiar to any long-term Microsoft sysadmins. The interface felt heavy and ponderous, and most of the account actions necessary took seconds or minutes to complete. This is a shame, because the platform has a wide selection of well-laid-out Jupyter notebooks focusing on real-world problems—seeing them is easy enough, but getting all the this-and-that necessary to be able to actually execute their code is not.

Although I eventually did get a Jupyter notebook running on Azure, we didn't have a chance to do much with it—and I accidentally racked up $3 in service charges along the way, despite the "free" introductory period. When setting up a VM to power a Jupyter notebook, you can choose anything from a simple single-CPU instance all the way up to massive VMs with tons of CPU cores, multiple GPUs, and staggering amounts of RAM—with no prices listed next to any of them. It's not a very reassuring environment for a single developer trying to learn how to use things.

Amazon Web Services offers an even larger assortment of instances from tiny to massive, but the pricing is much more discoverable—listed right where you need it, in operational costs per hour—and it's also easy to see what's on the "Free" tier and what is not. AWS seems far, far more focused on "you know what you're doing, now do it" than Azure; although there are some Jupyter-based environments available, like EMR Notebooks, they're difficult to find among the staggering sprawl of available AWS services.

Google Cloud seems to be a balanced offering, somewhere between Microsoft's research-focused platform and AWS's commerce-focused platform. Although the commercial Google Cloud (as opposed to the free-to-use Google Colab, which we began our journey with) doesn't offer anything like Azure's up-front selection of Jupyter notebooks, the services it offers are less directly bolted to Web commerce than Amazon's and may be more readily accessible for academic work.

