Google included a secret imaging co-processor, dubbed Visual Core, in the second generation of its Pixel devices. The company only revealed the chip's existence shortly after the handsets launched, and just a few months ago it began enabling the co-processor, bringing improved HDR+ processing to the default camera app.
Today, Google is opening up Visual Core to all apps. That means third-party apps like Instagram, WhatsApp, and Snapchat can tap the co-processor to take better-quality pictures while using less power — Google says all three will start using Visual Core on the Pixel 2 and Pixel 2 XL today. The capability is also available to all third-party developers: if you build an app that uses the camera, you can start taking advantage of the co-processor in your app, too.
This is the first time Google has packed a co-processor into its smartphones, which is the main reason for the slow, phased rollout mentioned earlier. “We wanted to make sure it rolls out cleanly and nicely. We wanted to work with our partners and not surprise them. We wanted to make sure we improved on all aspects — not only image quality, but performance and power. That’s why we only launched Pixel Visual Core in developer options back in November, and now we’re continually improving. We want to make sure that our phone keeps on improving with time,” a Google engineer explained to TechCrunch.
Visual Core is likely the beginning of something new on Google’s flagship smartphones. As more companies look to build co-processors for on-device machine learning, we’re likely to see many more chips that offload intensive tasks from the primary processor for better performance and privacy. In other words, Visual Core looks a lot like an experiment for the future.