Android 12 preview: here’s Google’s radical new design

(rubs hands) – Android 12, it is here
or it's being announced. The new beta, where Google actually tells us what the biggest new user-facing features will be, has been announced. And I have seen a demo
and I've played around with the beta here on my phone, and I have some thoughts. Do you wanna hear my thoughts or would you rather just see
what's new in Android 12? Oh, why not both… This is Android 12. (upbeat music) Android 12 looks different
from what you're used to on Android, actually very different. Google says that this is
the biggest visual overhaul since 2014, or maybe ever, depending
on who you're asking. And yeah, a lot of the pieces of this operating system here do look very different, but it all basically still works the same. You've got a home screen, you swipe up for apps, you swipe down for quick settings and for your notifications,
etcetera, etcetera.

What you're really looking at here with these big buttons and
the really big bubbly sliders and so on is how the Android team has decided to implement a new design system that Google is calling Material You. Not Material UX or Material UI, just Material You, like Y-O-U, whatever. Now, when you're looking at the B-roll and the screen recordings and the screenshots on this phone, you should know that this is how Google is implementing Material You on the Pixel. Whether and how Samsung or Xiaomi or OnePlus decide to implement
it is going to be different. And also, you know, much later because their updates always
come later than the Pixel. Anyway, I don't have the full details on Material You and how it works and so on. But, I do know that it's
supposed to apply to everything from the web to Android,
to apps, to even hardware.

What that means is I'm just not going to get into any of the heady UI versus UX versus You stuff here. I'm just going to talk about what I am seeing here on this phone. And what I am seeing is good. For the Android team, the You part of Material You here is an
automatic theming system. So, when you set a new wallpaper, you're gonna be given
the option to have Android pull up some colors from your photo and then, apply that theme with
those colors to the system. So you can see here that the
buttons have turned green, and there's also an algorithm for pulling out complementary
colors from the photo. It's kind of neat, but I don't know that I would have picked
this particular green if I were theming it myself. And the good news is you can pick whatever
colors you really want to. So that's neat, but really I
can tell you the whole story of this visual redesign just by looking at a couple
of screen recordings.
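
Google hasn't detailed the extraction algorithm, but the basic idea of pulling theme colors out of a wallpaper is something you can sketch today with the long-standing AndroidX Palette library. This is a rough illustration of the concept, not the actual Material You engine; the swatch choices and the green fallback color are arbitrary stand-ins:

```kotlin
import android.graphics.Bitmap
import androidx.palette.graphics.Palette

// Rough sketch of wallpaper-driven theming using androidx.palette (palette-ktx).
// Not Google's actual Material You algorithm.
fun themeColorsFrom(wallpaper: Bitmap): List<Int> {
    val palette = Palette.from(wallpaper).generate() // synchronous color analysis
    return listOfNotNull(
        palette.getDominantColor(0xFF4CAF50.toInt()), // dominant hue, arbitrary green fallback
        palette.vibrantSwatch?.rgb,                   // a punchier accent
        palette.mutedSwatch?.rgb                      // a softer, complementary tone
    )
}
```

Android 12's real engine goes further, generating whole tonal palettes and complementary colors system-wide, but the input and output are the same shape: one bitmap in, a handful of theme colors out.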

So, here's Android 11
and here is Android 12. So first, there's a bunch of new, like, lighting effects when you unlock the phone,
you can kinda see colors and shadows and light kinda sweep around. And, in general there's just more animations all over the operating system. And we're gonna come
back to why that is, but look, they're even taking advantage of these animations on
the lock screen buttons, and you can see the little color from the Material You theming as well. Now, when we pull down the quick settings
and notification shade you see that they are just
very big, easy-to-recognize, easy-to-understand buttons. Google's just not afraid of taking up more space with all of their UI and they're not trying to cram everything into the most information
dense thing possible.

I actually think it's
like a nice direction. There is another subtle difference in the notification shade, you can see that it's just covering
the entire screen instead of sort of being a translucent layer over it. It makes it feel like an entirely new space. And if you look at the
notifications themselves you'll see that they're
grouped together and signified by a bunch of bubbles for
each individual group. So there's conversations and silent notifications and whatever. But if you slide an
individual notification away there's this really
subtle effect where the hard corner turns into a bubble for just that notification, to indicate that it's its own separate thing. Now on the home screen, let's just pause a moment
to look at these widgets. They are brand new and they're based on an entirely new
system for making widgets that is based on the principles of the Material You design system.
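
Google hasn't fully documented the new widget APIs at this point in the beta, but the existing android.appwidget entry point gives a sense of what developers would be updating. A minimal provider looks something like this; the layout and view ID names here are hypothetical placeholders:

```kotlin
import android.appwidget.AppWidgetManager
import android.appwidget.AppWidgetProvider
import android.content.Context
import android.widget.RemoteViews

// Bare-bones widget provider using the long-standing android.appwidget APIs.
// R.layout.widget_greeting and R.id.greeting_text are made-up resource names.
class GreetingWidget : AppWidgetProvider() {
    override fun onUpdate(
        context: Context,
        appWidgetManager: AppWidgetManager,
        appWidgetIds: IntArray
    ) {
        for (id in appWidgetIds) {
            val views = RemoteViews(context.packageName, R.layout.widget_greeting)
            views.setTextViewText(R.id.greeting_text, "Hello from a widget")
            appWidgetManager.updateAppWidget(id, views)
        }
    }
}
```

The Android 12 refresh layers niceties on top of this foundation, things like rounded corners and colors drawn from the Material You palette, which is exactly the cruft-cleanup the ecosystem needs.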

So, Google is gonna
update a bunch of their own widgets, but they're also hoping that they can get a bunch of developers on board to update their old
widgets to the new system. And, I really hope it works
because the widget ecosystem on Android has gotten really
crufty and messy over time, and it is due for a refresh. Now, next stop: quick settings. Google changes quick
settings every single year. And this year is no different. The new thing this year is that the buttons are huge! I mean, just look at
them, but I don't know. I kinda like it. Google also finally puts smart home controls and Google Wallet into quick settings, which means that now holding down the power button brings up the Assistant, just like it does on the iPhone and on Galaxy phones. And all of that means: adiós, weird power button menu from Android 11! You tried… Finally, in quick settings there are toggles for
camera and mic access and we're going to get
to those in a minute.

Oh you know what, one more
thing I just have to talk about that's not in the screen
recordings: the new lock screen. When you don't have any notifications, you have this giant clock on it, and it's dope and it matches your color theme. When you do have notifications, it's still pretty big. It just gets a little bit smaller. It's a good lock screen! Now the version of the Android beta that Google is releasing
this month doesn't have all of the gewgaws and bells and
whistles that you just saw but, there's enough here that you can see where
Google is going with it.

Like, even if you just
look at the settings, all of the icons and the text are bigger, and they've got this new overscroll animation that kinda squeezes things together. It's a big redesign but it's not a complete overhaul
of how everything works. Every design gets crufty over time. And Android was definitely
starting to show a lot of inconsistencies as new features piled on and old ones were kind of half forgotten. I see this design as a general cleanup. All the buttons are big and bubbly, and I get a sense that things are going to be a little bit more coherent now, and I dig that. So that is the new design
system, but I wanna come back to a thing I mentioned at the
top: the smoothness thing.

Android has a reputation that the
only way to make it smooth and good-looking is to
throw more powerful hardware at it, with faster refresh-rate screens or more RAM or whatever. With Android 12, Google's promising that they're going to make
the animation smoother for everybody through
software improvements. So, we sat down with Sameer Samat, the VP of product management for
Android and Google Play. And here's how he explains it. – So we've done a few things to make the system feel smooth. We've reduced lock contention in key services: [Sameer] activity, window, and package manager. What that really means is, there are multiple different parts of the system trying to talk to the operating system at the same time. And that's when you see things jitter or jank. By smoothing a lot of that out and by reducing, for example, the amount of time that Android system server uses by 22%, actually, we've been able to make all the motion and animation feel super smooth. – All right, there are
a few other interesting features that are being announced today. So, there is a proper remote
control app for Android TV.
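
Back to that lock-contention point for a second, since it's the most concrete engineering detail in the interview. Here's a toy sketch of the idea, emphatically not Google's code: when unrelated subsystems all funnel through one lock, they serialize; give each its own lock and they stop blocking each other. The subsystem names just echo the quote, and the workload is made up:

```kotlin
import java.util.concurrent.Executors
import java.util.concurrent.TimeUnit
import java.util.concurrent.locks.ReentrantLock
import kotlin.concurrent.withLock

// Toy model of lock contention, not Android's system server.
class Subsystem(val name: String) {
    val lock = ReentrantLock()
    var counter = 0L
}

fun hammer(subsystems: List<Subsystem>, coarse: ReentrantLock?) {
    val pool = Executors.newFixedThreadPool(subsystems.size)
    for (s in subsystems) {
        pool.submit {
            repeat(1_000_000) {
                // With a coarse lock, all three workers contend on it even
                // though they touch unrelated state; without it, each worker
                // only takes its own subsystem's lock.
                (coarse ?: s.lock).withLock { s.counter++ }
            }
        }
    }
    pool.shutdown()
    pool.awaitTermination(1, TimeUnit.MINUTES)
}

fun main() {
    val systems = listOf(Subsystem("activity"), Subsystem("window"), Subsystem("package"))
    for ((label, coarse) in listOf("one coarse lock" to ReentrantLock(), "per-subsystem locks" to null)) {
        val t0 = System.nanoTime()
        hammer(systems, coarse)
        println("$label: ${(System.nanoTime() - t0) / 1_000_000} ms")
    }
}
```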

They're going to have car unlock
that works with NFC or UWB if your phone has it and that'll work with a
few different partners. And later this year, if
you have a Chromebook it's going to be able to
directly access the photo library on your Android phone. So next up is privacy updates. Google is putting privacy updates in every version of Android. That is great. And this year there really are a bunch.

The main thing that Google is trying to do this year is tamp
down on unfettered access to your location, your
camera and your microphone. So there are new indicators in the upper right-hand corner
when they're being accessed. And there are those new
buttons in quick settings that just fully turn off your camera or your microphone. So, when you toggle them off, an app that accesses your camera just gets a black nothing. It thinks the camera's there, but really it's just getting nothing. There is also a new privacy dashboard that will show you how often those sensors have been accessed and by which apps. So you can view your data
from the past 24 hours in a pie chart or in a
timeline, and then turn off all the different
access stuff from there. Now for location, there is a
new kind of permission that you can grant to an app that's
approximate location instead of just precise location. So, say you've got
something like a weather app and you don't want it to
know your precise GPS pin, but you want it to know what
neighborhood you're in, you can give it an approximate location.
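
For developers, that choice surfaces through the two existing location permissions. Here's a sketch of how a weather app might ask, letting the user pick approximate-only on Android 12; the activity and messages are made up, and both permissions would also need to be declared in the manifest:

```kotlin
import android.Manifest
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

// Sketch: request both precise and coarse location. On Android 12 the system
// dialog lets the user choose "approximate," in which case only
// ACCESS_COARSE_LOCATION is granted and the app sees a fuzzed position.
class WeatherActivity : AppCompatActivity() {
    private val locationRequest = registerForActivityResult(
        ActivityResultContracts.RequestMultiplePermissions()
    ) { grants ->
        when {
            grants[Manifest.permission.ACCESS_FINE_LOCATION] == true ->
                println("Precise location granted")
            grants[Manifest.permission.ACCESS_COARSE_LOCATION] == true ->
                println("Approximate location granted: plenty for a forecast")
            else -> println("No location access; fall back to a typed-in city")
        }
    }

    fun askForLocation() {
        locationRequest.launch(
            arrayOf(
                Manifest.permission.ACCESS_FINE_LOCATION,
                Manifest.permission.ACCESS_COARSE_LOCATION
            )
        )
    }
}
```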

So that's all the privacy stuff for sensors, but there's also this new part of the operating system called the Android Private Compute Core. Now, you might think it's a chip because of the word "core," but it's not. Instead, it's like a sandboxed part of the operating system for machine learning things. It doesn't store data. It runs processes. – A good way to think
about it is, when you have these advanced technologies,
like for example speech recognition or
natural language processing, and they need access
to certain information. Another favorite example
of mine is Smart Reply. [Sameer] Awesome feature, looks at your notifications, your chat notifications, and suggests replies based on a speech and language model.

All of that runs on device
in private compute core. – From my perspective, basically
what all that means is that if Google wants Android to be
able to do something with AI that you might think is creepy, now they can put all of
those processes in a box and limit all communication into and out of that box; nothing in the box can access the network, and it's only accessible via limited APIs. So, that all seems great
but is it more secure? We'll see. So that's all the privacy
stuff that Google wants to talk about but, there is another
kind of privacy that Google really isn't keen
on discussing that much. And that is app tracking for ads. Now, there have been rumors
that Google would follow Apple and limit some kind of app
tracking for things like ads but, Google also makes
all of its money on ads.

So – Taking a step back on this one, there's obviously a lot changing in the ecosystem. One thing about Google is that it is a platform company. It's also a company that is deep in the advertising space. So we're thinking very deeply about how we should evolve the advertising ecosystem. You've seen what we're doing on Chrome. [Sameer] From our standpoint on Android, we don't have anything to announce at the moment, but we are taking the position that privacy and advertising don't need to be directly opposed to each other; [Sameer] as a company, we don't believe that would be healthy for the overall ecosystem.

So we're thinking about that, working with our developer partners, and we'll be sharing more later this year. – All right, well, stay tuned for news from Google on that later. And speaking of later, when are you gonna be
able to get Android 12 on your Android phone? Well, do you have a Pixel? Because then the answer is easy. You're going to get it this fall. Do you not have a Pixel?
Well, then the answer is later. Google says that the speed at which companies are updating their phones to the latest version of Android has improved by 30%, but still, other manufacturers besides Google just take a while to get the latest version of Android on their phones. That's just how Android works. All right. That's Android 12, a huge redesign that adds some consistency and coherency with big buttons,
big sliders, big everything! There's more theming options. There's a bunch of privacy indicators.

There's a bunch of stuff that they put in the developer betas that
I haven't even covered here, and a TV remote. This isn't the most massive release ever, but you know what, it's enough. (transition sound) Hey everybody, thanks
so much for watching, right now it is the middle
of Google I/O, which means that there is a lot going
on and we're going to have a lot more coverage of
everything Google has announced, and, you know, in general
it's just a big tech week. So I think there's gonna
be a couple more videos on the verge you're
gonna wanna check out….


Unfolding the first trifold phone

(instrumental music) – Folding phones are boring. Look, we've had the first
wave of foldable devices and they're neat, but you've
seen all that already. This is about what comes next after the current generation of foldables. Something like the tri-fold from TCL, a wild new concept folding tablet-phone thing. (techno-music) Unlike most foldables, the tri-fold has two hinges, which lets it fold up into thirds. You can use it as a phone, you can unfold it once to use as a bigger screen, or unfold it fully into a full-size tablet. It's not a halfway compromise
like some of the other foldables that we've seen which basically turn into just slightly
wider phone displays.

This is a full-blown tablet. It's nearly as big as an iPad, but you can still fit it in your pocket. The screen folds from a 10-inch tablet down to a 6.65-inch phone. You could also open it two-thirds of the way and, ya know, prop that up; it'll auto-rotate so you can use it in whatever orientation you want. There's a lot of weird use cases that you could probably use this for. The screen on this one is a 3K panel, but again, that's just this prototype.

We have no idea if the finished
version will have that, but the tri-fold shows just
how hard it's gonna be to turn these ideas into reality. I've gotten to play around
with the prototype for a bit and it's really rough to use right now. It is incredibly heavy for a phone. It's got those big metal hinges
and there are three separate batteries to power all those displays, it's basically three phones.

And even though it's really
thin as a tablet, the phone mode is super thick, and that's before you're worrying about things like the software, which is basically non-existent. Or durability, which is a huge question. Or price, which, who even knows. (techno music) But looking even further into the future, TCL also had a very early
mock-up of a rollable phone. This is a really cool idea: it's a phone that has a fully flexible display that
slides around the side and behind the phone. And it could roll back out
and become a larger display when you need more space. The way it will, in theory, work is that there's gonna be motors
on the inside of the phone and you'll press a button
and it'll expand out from a 6.75 inch display
to a 7.8 inch screen.

Now, that doesn't sound like a lot, but you're actually getting almost double the screen space; it's almost twice as wide. The whole system is actually
pretty similar to the rollable OLED TV that LG's been
showing off for years. Now again this is a really early concept, it's not even a functional
device, just plastic and a screen that's literally
just a sheet of paper.
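
That "almost twice as wide" claim roughly checks out, under an assumption TCL hasn't confirmed: treat the closed phone as a typical tall 20:9 panel whose height stays fixed while the extra screen unrolls sideways. Then, as a back-of-the-envelope check:

```latex
% Assumed closed aspect ratio of 20:9; TCL has not published these figures.
w_{\mathrm{closed}} = 6.75 \cdot \frac{9}{\sqrt{20^2 + 9^2}} \approx 2.8\ \text{in}, \qquad
h = 6.75 \cdot \frac{20}{\sqrt{20^2 + 9^2}} \approx 6.2\ \text{in}

w_{\mathrm{open}} = \sqrt{7.8^2 - 6.2^2} \approx 4.7\ \text{in} \approx 1.7 \times w_{\mathrm{closed}}
```

So about 1.7 times the width and, at constant height, about 1.7 times the area: "almost double," as claimed.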

Is this a good idea? Who knows. It does avoid some of the
issues of current foldables, like those easily breakable hinges and the creased display which is cool. But it's almost guaranteed
to have issues of its own. Moving parts they're tricky. Now the tri-fold is
just a proof of concept and the plastic sliding one even more so. You won't actually be able to
buy either of these devices and it's not clear if TCL's
actually gonna make products based on these concepts in the future.

So why should you care? Well, first of all, because it's cool. I mean, look at this thing: it's a phone that unfolds into a giant full-size tablet. And it actually turns on and runs Android; it's like a science fiction prop. But it's also important,
because TCL is planning on eventually making
foldable and rollable phones that might actually look like these. Possibly as early as next year. The company says that it's
experimenting with dozens of different form factors right now.

So it's possible that phones like these, could be real one day. Look phones have basically been the same for the last decade. Black boxes with touch screens. Devices like the tri-fold
or that sliding concept, even if they're not here yet, show off what future phones might one day look like. And that's a really exciting idea. Thanks so much for watching. If you want to see more videos
about cool phones, check out our Galaxy S20 Ultra review. You can actually buy that one.

Check out the review. See if you want to. And like and subscribe for
more great videos like this.


Google I/O 2021 keynote in 16 minutes

Good morning, everyone. It's great to be back at I/O. Today I'm excited to share our latest breakthrough in natural language understanding: LaMDA. It's a language model for dialogue applications, and it's open domain, which means it's designed to converse on any topic. And while it's still in research and development, we've been using it internally to explore novel interactions. For example, say you wanted to learn about one of my favorite planets, Pluto. LaMDA already understands quite a lot about Pluto and millions of other topics. Let's listen to a conversation the team had with Pluto a few days ago.

"I'm so curious about you." "I sense your excitement. Ask me anything." "Tell me what I would see if I visited." "You would get to see a massive canyon, some frozen icebergs, geysers, and some craters." "It sounds beautiful." "I assure you it is worth the trip. However, you need to bring your coat, because it gets really cold." "I'll keep that in mind. Hey, I was wondering, have you ever had any visitors?" "Yes, I have had some. The most notable was New Horizons, the spacecraft that visited me."

Let's break down what made it feel so natural. First: learned concepts. As you saw, the model talked about the New Horizons spacecraft and the coldness of space. LaMDA synthesized these concepts from its training data. Because none of the responses were predefined, LaMDA answered with sensible responses, keeping the dialogue open-ended. Natural conversations are generative and they never take the same path twice, and LaMDA is able to carry a conversation no matter what we talk about. Yet it's still early research, so it doesn't get everything right. Sometimes it can give nonsensical responses, imagining Pluto doing flips or playing fetch with its favorite ball, the moon. Other times it just doesn't keep the conversation going.

We believe LaMDA's natural conversation capabilities have the potential to make information and computing radically more accessible and easier to use. We look forward to incorporating better conversational features into products like Google Assistant, Search, and Workspace. LaMDA is a huge step forward in natural conversation, but it is still trained only on text. When people communicate with each other, they do it across images, text, audio, and video. So we need to build models that allow people to naturally ask questions across different types of information. These are called multimodal models. For example, when you say "show me the part where the lion roars at sunset," we will get you to that exact moment in a video.

Advances in AI are helping us reimagine what a map can be, but now you can also use it to explore the world around you. You'll be able to access Live View right from the map and instantly see details about the shops and the restaurants around you, including how busy they are, recent reviews, and photos of those popular dishes. In addition, there are a host of new features coming to Live View later this year. First, we're adding prominent virtual street signs to help you navigate those complex intersections. Second, we'll point you towards key landmarks and places that are important for you, like the direction of your hotel. Third, we're bringing it indoors to help you get around some of the hardest-to-navigate buildings, like airports, transit stations, and malls. Indoor Live View will start rolling out in top train stations and airports in Zurich this week and will come to Tokyo next month.

We're bringing you the most detailed street maps we've ever made. Take this image of Columbus Circle, one of the most complicated intersections in Manhattan. You can now see where the sidewalks, the crosswalks, and the pedestrian islands are, something that might be incredibly helpful if you're taking young children out on a walk, or absolutely essential if you're using a wheelchair. Thanks to our application of advanced AI technology on robust Street View and aerial imagery, we're on track to launch detailed street maps in 50 new cities by the end of the year.

So we're making the map more dynamic and more tailored, highlighting the most relevant information exactly when you need it. If it's 8 a.m. on a weekday, we'll display the coffee shops and bakeries more prominently in the map, while at 5 p.m. we'll highlight the dinner restaurants that match your tastes. You'll start seeing this more tailored map in the coming weeks. People have found it really useful, especially during this pandemic, to see how busy a place is before heading out. Now we're expanding this capability from specific places like restaurants and shops to neighborhoods, with a feature called Area Busyness. Say you're in Rome and want to head over to the Spanish Steps and its nearby shops. With Area Busyness, you'll be able to understand at a glance if it's the right time for you to go, based on how busy that part of the city is in real time. Area Busyness will roll out globally in the coming months.

Let's talk about all the ways we're innovating in shopping. Soon on Chrome, when you open a new tab, you'll be able to see your open carts from the past couple of weeks. We'll also find you promotions and discounts for your open carts, if you choose to opt in. Your personal information and what's in your carts are never shared with anyone externally without your permission.

We capture photos and videos so we can look back and remember. There are more than four trillion photos and videos stored in Google Photos, but having so many photos of loved ones, screenshots, and selfies all stored together makes it hard to rediscover the important moments. Soon we're launching a new way to look back that we're calling Little Patterns. Little Patterns show the magic in everyday moments by identifying not-so-obvious moments and resurfacing them to you. This feature uses machine learning to translate photos into a series of numbers and then compares how visually or conceptually similar these images are. When we find a set of three or more photos with similarities, such as shape or color, we'll surface them as a pattern. When we started testing Little Patterns, we saw some great stories come to life, like how one of our engineers traveled the world with their favorite orange backpack, or how our product manager Christy had a habit of capturing objects of similar shape and color.

We also want to bring these moments to life with cutting-edge effects. Last year we launched Cinematic Photos to help you relive your memories in a more vivid way. Cinematic Moments will take these near-duplicate images and use neural networks to synthesize the movement between image A and image B. We interpolate the photos and fill in the gaps by creating new frames. The end result is a vivid moving picture, and the cool thing about this effect is that it can work on any pair of images, whether they were captured on Android or iOS or scanned from a photo album. In addition to providing personalized content to look back on, we also want to give you more control. We heard from you that controls can be helpful for anyone who has been through a tough life event, breakup, or loss. These insights inspired us to give you the control to hide photos of certain people or time periods from our Memories feature, and soon you'll be able to remove a single photo from a memory, rename the memory, or remove it entirely.

Instead of form following function, what if form followed feeling? Instead of Google blue, we imagined Material You, a new design that includes you as a co-creator, letting you transform the look and feel of all your apps by generating personal Material palettes that mix color science with a designer's eye. A new design that can flex to every screen and fit every device; your apps adapt comfortably every place you go. Beyond light and dark, a mode for every mood. These selections can travel with your account across every app and every device. Material You comes first to Google Pixel this fall, including all of your favorite Google apps, and over the following year we will continue our vision, bringing it to the web, Chrome OS, wearables, smart displays, and all of Google's products.

We've overhauled everything from the lock screen to system settings, revamping the way we use color, shapes, light, and motion. Watch what happens when the wallpaper changes, like if I use this picture of my kids actually getting along for once. I set it as my background, and voilà, the system creates a custom palette based on the colors in my photo. The result is a one-of-a-kind design just for you, and you'll see it first on Google Pixel in the fall.

Starting from the lock screen, the design is more playful, with dynamic lighting. Pick up your phone and it lights up from the bottom of your screen. Press the power button to wake up the phone instead, and the light ripples out from your touch. Even the clock is in tune with you. When you don't have any notifications, it appears larger on the lock screen, so you know you're all caught up. The notification shade is more intuitive, with a crisp, at-a-glance view of your app notifications, whatever you're currently listening to or watching, and quick settings that give you control over the OS with just a swipe and a tap. And now you can invoke the Google Assistant by long-pressing the power button. The team also reduced the CPU time of Android system server by a whopping 22 percent.

And with Android 12, we're going even further to keep your information safe. To give people more transparency and control, we've created a new privacy dashboard that shows you what type of data was accessed and when. This dashboard reports on all the apps on your phone, including all of your Google apps, and we've made it really easy to revoke an app's permission directly from the dashboard. We've also added an indicator to make it clear when an app is using your camera or microphone. But let's take that a step further: if you don't want any apps to access the microphone or camera, even if you've granted them permission in the past, we've added two new toggles in quick settings so you can completely disable those sensors for every app. Android's Private Compute Core enables things like Now Playing, which tells you what song is playing in the background, and Smart Reply, which suggests responses to your chats based on your personal reply patterns, and there's more to come later this year. All of the sensitive audio and language processing happens exclusively on your device, and like the rest of Android, Private Compute Core is open source; it's fully inspectable and verifiable by the security community.

With a single tap, you can unlock and sign into your Chromebook when your phone is nearby. Incoming chat notifications from apps on your phone are right there in Chrome OS, and soon, if you want to share a picture, one click and you can access your phone's most recent photos. To keep movie night on track, we're building TV remote features directly into your phone. You can use voice search or even type with your phone's keyboard. We're also really excited to introduce support for digital car key. Car key will allow you to lock, unlock, and start your car, all from your phone. It works with NFC and ultra-wideband technology, making it super secure and easy to use. And if your friend needs to borrow your car, you can remotely and securely share your digital key with them. Car key is launching this fall with select Google Pixel and Samsung Galaxy smartphones, and we're working with BMW and others across the industry to bring it to their upcoming cars. That was a quick look at Android 12, which will launch this fall, but you can check out many of these features in the Android 12 beta today.

Let's go beyond the phone to what we believe is the next evolution of mobile computing: the smartwatch. First, we're building a unified platform jointly with Samsung, focused on battery life, performance, and making it easier for developers to build great apps for the watch. Second, a whole new consumer experience, including updates to your favorite Google apps. And third, a world-class health and fitness service created by the newest addition to the Google family, Fitbit.

As the world's largest OS, we have a responsibility to build for everyone. But for people of color, photography has not always seen us as we want to be seen, even in some of our own Google products. To make smartphone photography truly for everyone, we've been working with a group of industry experts to build a more accurate and inclusive camera. So far, we've partnered with a range of different expert image makers who've taken thousands of images to diversify our image data sets, helped improve the accuracy of our auto white balance and auto exposure algorithms, and given aesthetic feedback to make our images of people of color more beautiful and more accurate. Although there's still much to do, we're working hard to bring all of what you've seen here, and more, to Google Pixel this fall.

We were all grateful to have video conferencing over the last year. It helped us stay in touch with family and friends and kept businesses and schools going, but there is no substitute for being together in the room with someone. So several years ago, we kicked off a project to use technology to explore what's possible. We call it Project Starline. First, using high-resolution cameras and custom-built depth sensors, we capture your shape and appearance from multiple perspectives and then fuse them together to create an extremely detailed, real-time 3D model. The resulting data is huge, many gigabits per second. To send this 3D imagery over existing networks, we developed novel compression and streaming algorithms that reduce the data by a factor of more than 100. And we have developed a breakthrough light field display that shows you the realistic representation of someone sitting right in front of you, in three dimensions. As you move your head and body, our system adjusts the images to match your perspective. You can talk naturally, gesture, and make eye contact. It's as close as we can get to the feeling of sitting across from someone. We have spent thousands of hours testing it in our own offices, and the results are promising. There's also excitement from our lead enterprise partners, and we plan to expand access to partners in healthcare and media.

Thank you for joining us today. Please enjoy the rest of Google I/O, and stay tuned for the developer keynote coming up next. I hope to see you in person next year. Until then, stay safe and be well.


Google I/O Keynote (Google I/O ’17)

[MUSIC PLAYING] [VIDEO PLAYBACK] [PAPER CRUMPLING] [MUSIC PLAYING] [SQUEAK] [SQUEAK] [SEAGULL CRYING] [CARS HONKING] [ZAP] [CHALK ON CHALKBOARD] [CAR HONKING] [SCRAPING] [TEARING] [CHEERING AND APPLAUSE] [CHIMES] [FOOTSTEPS] [BIRDS CHIRPING] [TAPPING] – Hm. [BIRDS CHIRPING] [POP] – [GASP] [CHUCKLES] [MUSIC PLAYING] – Hm? [BIRDS CHIRPING] [HEAVY FOOTSTEPS] – Mm. [TING] [THUNDERING IN DISTANCE] [RAINFALL] [THUNDER] [THUNDER] – [STRAINING] [THUNDER] – [GASP] – [MANY STRAINING] – [SIGH] – Hmm. [GLEAMING] – Huh? – Oh? – [GASP] Hmm. [CLACK] – Woohoo! Whoa. [ROCKETING] [TAP] [THUMP] [END PLAYBACK] [APPLAUSE] SUNDAR PICHAI: Good morning. Welcome to Google I/O. [CHEERING] AUDIENCE: I love you, Sundar! [LAUGHTER] SUNDAR PICHAI: I
love you guys, too. [LAUGHTER] Can't believe it's
one year already. It's a beautiful day.

We're being joined
by over 7,000 people, and we are live streaming
this, as always, to over 400 events
in 85 countries. Last year was the 10th year
since Google I/O started, and so we moved it closer
to home at Shoreline, back where it all began. It seems to have gone well. I checked the Wikipedia
entry from last year. There were some
mentions of sunburn, so we have plenty of
sunscreen all around. It's on us. Use it liberally. It's been a very busy year
since last year, no different from my 13 years at Google. That's because
we've been focused ever more on our core mission
of organizing the world's information. And we're doing it for everyone. And we approach it by
applying deep computer science and technical
insights to solve problems at scale.

That approach has served
us very, very well. This is what allowed
us to scale up seven of our most
important products and platforms to over a billion
monthly active users each. And it's not just
the scale at which these products
are working, users engage with them very heavily. YouTube, not just has
over a billion users, but every single day, users
watch over 1 billion hours of videos on YouTube. Google Maps. Every single day, users navigate
over 1 billion kilometers with Google Maps. So the scale is
inspiring to see, and there are other products
approaching this scale. We launched Google
Drive five years ago, and today, it is over 800
million monthly active users. And every single week, there
are over 3 billion objects uploaded to Google Drive.

Two years ago at Google I/O,
we launched Photos as a way to organize users' photos
using machine learning. And today, we are over
500 million active users, and every single day, users
upload 1.2 billion photos to Google. So the scale of these
products is amazing, but they are all still working their way up to where Android is, which, I'm excited to say, as of this week crossed over 2 billion
active devices of Android.

[APPLAUSE] As you can see, the robot is
pretty happy, too, behind me, so it's a privilege to
serve users of this scale. And this is all
because of the growth of mobile and smartphones, but
computing is evolving again. We spoke last year about this
important shift in computing from a mobile-first to an AI-first approach. Mobile made us reimagine every
product we were working on. We had to take into account that
the user interaction model had fundamentally changed,
with multi-touch, location, identity, payments, and so on. Similarly, in an AI-first world, we are rethinking all our products
and applying machine learning and AI to solve user problems. And we are doing this across
every one of our products. So today, if you
use Google Search, we rank differently
using machine learning.

Or if you're using Google
Maps, Street View automatically recognizes restaurant
signs, street signs, using machine learning. Duo with video calling
uses machine learning for low bandwidth situations. And Smart Reply and Allo last
year had great reception. And so today, we
are excited that we are rolling out Smart Reply to
over 1 billion users of Gmail. It works really well. Here's a sample email. If you get an email like this,
the machine learning systems learn to be
conversational, and it can reply, I'm fine with
Saturday, or whatever. So it's really nice to see.

Just like with every platform
shift, how users interact with computing changes. Mobile brought multi-touch. We evolved beyond
keyboard and mouse. Similarly, we now have voice and vision as two important new modalities for computing. Humans are interacting
with computing in more natural and immersive ways. Let's start with voice. We've been using
voice as an input across many of our products. That's because computers
are getting much better at understanding speech. We have had significant
breakthroughs, but the pace, even
since last year, has been pretty amazing to see. Our word error rate
continues to improve, even in very noisy environments.

This is why if you speak to
Google on your phone or Google Home, we can pick up
your voice accurately, even in noisy environments. When we were
shipping Google Home, we had originally planned to
include eight microphones so that we could accurately
locate the source of where the user was speaking from. But thanks to deep
learning, we use a technique called neural beamforming. We were able to ship it
with just two microphones and achieve the same quality. Deep learning is what allowed
us about two weeks ago to announce support for
multiple users in Google Home, so that we can recognize up
to six people in your house and personalize the experience
for each and every one. So voice is becoming
an important modality in our products. The same thing is
happening with vision. Similar to speech, we are
seeing great improvements in computer vision. So when we look at
a picture like this, we are able to understand the
attributes behind the picture.

We realize it's your
boy in a birthday party. There was cake and
family involved, and your boy was happy. So we can understand
all that better now. And our computer
vision systems now, for the task of the
image recognition, are even better than humans. So it's astounding
progress and we're using it across our products. So if you used the
Google Pixel, it has the best-in-class camera,
and we do a lot of work with computer vision. You can take a low light picture
like this, which is noisy, and we automatically make
it much clearer for you.

Or coming very soon, if you
take a picture of your daughter at a baseball game, and there
is something obstructing it, we can do the hard work to
remove the obstruction– [APPLAUSE] –and– [APPLAUSE] –have the picture of what
matters to you in front of you. We are clearly at an
inflection point with vision, and so today, we are
announcing a new initiative called Google Lens. [APPLAUSE] Google Lens is a set of
vision-based computing capabilities that can understand
what you're looking at and help you take action
based on that information.

We'll ship it first in
Google Assistant and Photos, and it'll come to
other products. So how does it work? So for example, if
you run into something and you want to know
what it is, say, a flower, you can invoke Google
Lens from your Assistant, point your phone at it, and we
can tell you what flower it is. It's great for someone
like me with allergies. [LAUGHTER] Or if you've ever been
at a friend's place and you have
crawled under a desk just to get the username and
password from a Wi-Fi router, you can point your phone at it. [APPLAUSE] And we can automatically
do the hard work for you. Or if you're walking
in a street downtown and you see a set of
restaurants across you, you can point your phone.

Because we know where you are
and we have our Knowledge Graph and we know what
you're looking at, we can give you the
right information in a meaningful way. As you can see, we're
beginning to understand images and videos. All of Google was built because
we started understanding text and web pages. So the fact that computers can
understand images and videos has profound implications
for our core mission. When we started
working on Search, we wanted to do it at scale. This is why we rethought our
computational architecture. We designed our data
centers from the ground up. And we put a lot
of effort in them. Now that we are evolving for
this machine learning and AI world, we are rethinking our
computational architecture again. We are building what we think
of as AI first data centers. This is why last year,
we launched the tensor processing units. They are custom hardware
for machine learning. They were about 15 to 30 times
faster and 30 to 80 times more power efficient than CPUs
and GPUs at that time. We use TPUs across
all our products, every time you do a search,
every time you speak to Google.

In fact, TPUs are what powered
AlphaGo in its historic match against Lee Sedol. I now see machine learning
as two components. Training, that is, how
we build the neural net. Training is very
computationally intensive, and inference is what
we do at real time, so that when you
show it a picture, we'd recognize whether it's
a dog or a cat, and so on.

Last year's TPU were
optimized for inference. Training is computationally
very intensive. To give you a sense, each one of
our machine translation models takes a training of
over three billion words for a week on about 100 GPUs. So we've been working
hard and I'm really excited to announce our next
generation of TPUs, Cloud TPUs, which are optimized for
both training and inference. What you see behind me
is one Cloud TPU board. It has four chips in
it, and each board is capable of 180
trillion floating point operations per second. [WHOOPING] And we've designed it
for our data centers, so you can easily stack them. You can put 64 of these
into one big supercomputer. We call these TPU
pods, and each pod is capable of 11.5 petaflops. It is an important advance
in technical infrastructure for the AI era. The reason we named
it cloud TPU is because we're bringing it
through the Google Cloud Platform. So cloud TPUs are
coming to Google Compute Engine as of today. [APPLAUSE] We want Google Cloud to be
the best cloud for machine learning, and so we want
to provide our customers with a wide range
of hardware, be it CPUs, GPUs, including the
great GPUs Nvidia announced last week, and now Cloud TPUs.

So this lays the foundation
for significant progress. So we are focused
on driving the shift and applying AI to
solving problems. At Google, we are bringing
our AI efforts together under Google.ai. It's a collection
of efforts and teams across the company focused on
bringing the benefits of AI to everyone. Google.ai will focus
on three areas, state-of-the-art research,
tools, and infrastructure– like TensorFlow and Cloud TPUs– and applied AI.

So let me talk a little
bit about these areas. Talking about research, we're
excited about designing better machine learning
models, but today it is really time consuming. It's a painstaking effort of a
few engineers and scientists, mainly machine learning PhDs. We want it to be possible
for hundreds of thousands of developers to use
machine learning. So what better way to do
this than getting neural nets to design better neural nets? We call this approach AutoML. It's learning to learn. So the way it works is we take
a set of candidate neural nets. Think of these as
little baby neural nets. And we actually use a neural net
to iterate through them till we arrive at the best neural net.
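
Google hasn't published the controller details here; as a loose illustration of "a neural net iterating through candidates," here's a toy search where a bandit-style controller samples architectures and reinforces the choices that score well. The reward function is a stand-in for actually training each candidate, and the real AutoML uses a far more sophisticated controller, so treat this purely as a sketch of the feedback loop:

```kotlin
import kotlin.math.abs
import kotlin.random.Random

// Toy "learning to learn" loop, not Google's AutoML. A tiny controller keeps
// preference weights over architecture choices, samples candidates, and
// reinforces whatever earned a high reward. reward() fakes "train this
// candidate and report validation accuracy," peaking at 6 layers x width 64.
data class Arch(val layers: Int, val width: Int)

fun reward(a: Arch): Double =
    1.0 / (1 + abs(a.layers - 6)) + 1.0 / (1 + abs(a.width - 64) / 16.0)

fun sample(prefs: DoubleArray): Int {
    val r = Random.nextDouble() * prefs.sum()
    var acc = 0.0
    for (i in prefs.indices) {
        acc += prefs[i]
        if (r <= acc) return i
    }
    return prefs.lastIndex
}

fun main() {
    val layerChoices = listOf(2, 4, 6, 8)
    val widthChoices = listOf(16, 32, 64, 128)
    val layerPref = DoubleArray(layerChoices.size) { 1.0 } // start uniform
    val widthPref = DoubleArray(widthChoices.size) { 1.0 }

    repeat(500) {
        val li = sample(layerPref)
        val wi = sample(widthPref)
        val r = reward(Arch(layerChoices[li], widthChoices[wi]))
        layerPref[li] += r // reinforce in proportion to reward
        widthPref[wi] += r
    }

    val best = Arch(
        layerChoices[layerPref.indices.maxByOrNull { layerPref[it] }!!],
        widthChoices[widthPref.indices.maxByOrNull { widthPref[it] }!!]
    )
    println("Controller converged toward: $best")
}
```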

We use a reinforcement
learning approach. And it's– the
results are promising. To do this is
computationally hard, but Cloud TPUs put it in
the realm of possibility. We are already approaching state
of the art in standard tasks like, say,
image recognition. So whenever I spend
time with the team and think about neural nets
building their own neural nets, it reminds me of one of my
favorite movies, "Inception." And I tell them
we must go deeper.

[LAUGHTER] So we are taking all
these AI advances and applying them to
newer, harder problems across a wide range
of disciplines. One such area is health care. Last year, I spoke about our
work on diabetic retinopathy. It's a preventable
cause of blindness. This year, we
published our paper in the "Journal of the
American Medical Association," and Verily is working
on bringing products to the medical community. Another such area is pathology. Pathology is a
very complex area. If you take an area like
breast cancer diagnosis, even amongst highly
trained pathologists, agreement on some
forms of breast cancer can be as low as 48%.

That's because
each pathologist is reviewing the equivalent of
1,000 10-megapixel images for every case. This is a large data problem,
but one which machine learning is uniquely equipped to solve. So we built neural nets
to detect cancer spreading to adjacent lymph nodes. It's early days,
but our neural nets show a much higher
degree of accuracy, 89% compared to previous
methods of 73%. There are important caveats: we
do have higher false positives, but already giving this in
the hands of pathologists, they can improve diagnosis. In general, I think this is
a great approach for machine learning, providing
tools for people to do what they do better. And we're applying it
across even basic sciences. Take biology. We are training
neural nets to improve the accuracy of DNA sequencing.

DeepVariant is a
new tool from Google.ai that identifies genetic variants
more accurately than state-of-the-art methods. Reducing errors is
important in applications. We can more accurately
identify whether or not a patient has genetic disease
and can help with better diagnosis and treatment. We're applying it to chemistry. We're using machine
learning to predict the properties of molecules. Today, it takes an incredible
amount of computing resources to hunt for new
molecules, and we think we can
accelerate timelines by orders of magnitude.

This opens up possibilities
in drug discovery or material sciences. I'm entirely
confident one day, AI will invent new molecules that
behave in predefined ways. Not everything we are
doing is so profound. We are doing even
simple and fun things, like a simple tool which
can help people draw. We call this AutoDraw. Just like today when
you type in Google, we give you suggestions,
we can do the same when you're trying to draw,
even I can draw with this thing.

So it may look
like fun and games, but pushing computers
to do things like this is what helps them
be creative and actually gain knowledge. So we are very excited about
progress even in these areas as well. So we are making
impressive progress in applying machine learning,
and we are applying it across all our products, but
the most important product we are using this is for Google
Search and Google Assistant. We are evolving
Google Search to be more assistive for our users. This is why last
year at Google I/O, we spoke about the Assistant,
and since then, we've launched it on Google
Pixel and Google Home. Scott and team are going
to talk more about it, but before that, let's take a
look at the many amazing ways people have been using
the Google Assistant. [VIDEO PLAYBACK] – OK, Google. [MUSIC PLAYING] – Hey, Google? – Hey, Google. – OK, Google. – Hey, Google. [BLING] – Play some dance music. – Sure. [BLING] – This is "Fresh Air." My guest will be– – Kimmy Schmidt on Netflix. [BLING] – OK, Google. Count to 100.

– Sure. 1, 2, 3– – Play vacuum
harmonica on my TV. [VACUUMING] [HARMONICA PLAYS] – –71, 72– – No! – –73– – Play the "Wonder
Woman" trailer. – Hey, Google. Talk to Domino's. – Talk to Lonely Planet. – Talk to Quora. – Show me my photos
from last weekend. [BLING] [SCREAMING] – Your car is parked at 22B. [BEEP BEEP] – Today in the news– [BLING] – Turn the living
room lights on. – OK, turning on the lights. – I'm back, baby. – Hey, Google. Drop a beat. – Flip a coin. – Call Jill. – Set a timer. – Talk to Headspace. [TING] – And then just
for a moment, I'd like you to let go
of any focus at all. Just let your mind do
whatever it wants to do. – Done. – Hey, Google. Good night. – Turning off all the things. See you tomorrow. [END PLAYBACK] [MUSIC PLAYING] [APPLAUSE] SCOTT HUFFMAN: Hey, everyone. Last year at I/O, we introduced
the Google Assistant, a way for you to have a
conversation with Google to get things done
in your world.

Today, as Sundar
mentioned, we're well on our way,
with the Assistant available on over
100 million devices. But just as Google
Search simplified the web and made it more
useful for everyone, your Google Assistant
simplifies all the technology in your life. You should be able
to just express what you want
throughout your day and the right things
should happen. That's what the Google
Assistant is all about. It's your own individual Google. So that video we
saw really captures the momentum of this project. We've made such big strides
and there's so much more to talk about today. The Assistant is becoming
even more conversational, always available wherever you
need it, and ready to help get even more things done.

First, we fundamentally believe
that the Google Assistant should be, hands
down, the easiest way to accomplish tasks, and
that's through conversation. It comes so naturally to
humans, and now Google is getting really good
at conversation, too. Almost 70% of requests
to the Assistant are expressed in
natural language, not the typical keywords that
people type in a search box. And many requests are follow-ups
that continue the conversation. We're really starting to crack
the hard computer science challenge of conversationality
by combining our strengths in speech recognition, natural
language understanding, and contextual meaning. Now recently, we made
the Assistant even more conversational, so each
member of the family gets relevant
responses just for them by asking with their own voice. And we're continuing to make
interacting with your Assistant more natural. For example, it doesn't always
feel comfortable to speak out loud to your Assistant,
so today, we're adding the ability to type to
your Assistant on the phone.

Now, this is great when
you're in a public place and you don't want
to be overheard. The Assistant's also learning
conversation beyond just words. With another person,
it's really natural to talk about what
you're looking at. Sundar spoke earlier about
how AI and deep learning have led to tremendous
strides in computer vision. Soon, with the smarts
of Google Lens, your Assistant will be able to
have a conversation about what you see. And this is really cool,
and Ibrahim is here to help me show you a couple
of examples of what we'll launch in the coming months. So, last time I
traveled to Osaka, I came across a line of
people waiting to try something that smelled amazing.

Now, I don't speak Japanese,
so I couldn't read the sign out front, but Google Translate
knows over 100 languages, and my Assistant will help
with visual translation. I just tap the Google Lens
icon, point the camera, and my Assistant can instantly
translate the menu to English. And now, I can continue
the conversation. IBRAHIM ULUKAYA: What
does it look like? GOOGLE ASSISTANT: These
pictures should match. SCOTT HUFFMAN: All right. It looks pretty yummy. Now notice, I never had to
type the name of the dish.

My Assistant used visual
context and answered my question conversationally. Let's look at another example. Some of the most tedious
things I do on my phone stem from what I see– a business card I
want to save, details from a receipt I need
to track, and so on. With Google Lens,
my Assistant will be able to help with
those kinds of tasks, too. I love live music,
and sometimes I see info for shows around
town that look like fun.

Now, I can just tap
the Google Lens icon and point the camera
at the venue's marquee. My Assistant instantly
recognizes what I'm looking at. Now, if I wanted to, I could
tap to hear some of this band's songs, and my Assistant offers
other helpful suggestions right in the viewfinder. There's one to buy
tickets from Ticketmaster, and another to add the
show to my calendar. With just a tap, my Assistant
adds the concert details to my schedule. GOOGLE ASSISTANT: Saving event.

Saved Stone Foxes for
May 17th at 9:00 PM. SCOTT HUFFMAN: Awesome. [APPLAUSE] My Assistant will help me
keep track of the event, so I won't miss the
show, and I didn't have to open a bunch of
apps or type anything. Thanks Ibrahim. So that's how the
Assistant is getting better at conversation– by understanding language and
voices, with new input choices, and with the power
of Google Lens. Second, the
Assistant is becoming a more connected experience
that's available everywhere you need help, from your living
room to your morning jog, from your commute to
errands around town, your Assistant should
know how to use all of your connected
devices for your benefit. Now, we're making good progress
in bringing the Assistant to those 2 billion
phones, and other devices powered by Android, like TVs,
wearables, and car systems. And today, I'm
excited to announce that the Google Assistant is
now available on the iPhone. [APPLAUSE] Woo. So no matter what
smartphone you use, you can now get help from
the same smart assistant throughout the day at
home, and on the go. The Assistant brings together
all your favorite Google features on the iPhone.

Just ask to get package
delivery details from Gmail, watch videos from your
favorite YouTube creators, get answers from Google
Search, and much more. You can even turn on the
lights and heat up the house before you get home. Now, Android devices and iPhones
are just part of the story. We think the Assistant should
be available on all kinds of devices where people
might want to ask for help. The new Google Assistant SDK
allows any device manufacturer to easily build the Google
Assistant into whatever they're building. Speakers, toys,
drink-mixing robots, whatever crazy device
all of you think up, now can incorporate
the Google Assistant. And we're working with many
of the world's best consumer brands and their
suppliers, so keep an eye out for the badge that says,
"Google Assistant built-in" when you do your holiday
shopping this year. Now obviously, another aspect
of being useful to people everywhere is support
for many languages. I'm excited to announce
that starting this summer, the Google Assistant
will begin rolling out in French, German,
Brazilian Portuguese, and Japanese on both
Android phones and iPhones.

By the end of the
year, we'll also support Italian,
Spanish and Korean. So that's how the Assistant is
becoming more conversational, and how it will be available
in even more contexts. Finally, the
Assistant needs to be able to get all kinds of
useful things done for people. People sometimes ask if
the Assistant is just a new way to search. Now of course, you
can ask your Assistant to get all sorts of
answers from Google Search, but beyond finding
information, users are also asking
the Assistant to do all sorts of things for them.

Now as you've already
seen, the Assistant can tap into capabilities across
many Google Apps and services, but Google's features are
just part of the story. We also open the Assistant
to third-party developers who are building some
really useful integrations. I'll turn it over to Valerie
to share more about how the developer platform
is getting stronger. [MUSIC PLAYING] [APPLAUSE] VALERIE NYGAARD: Hi. OK, so with the actions
on Google Platform, it's been awesome to
see how developers like you have been engaging
with the Google Assistant. Like honestly, you've built
some really cool integrations. Like, I can ask Food Network
about the recipe that's on TV right now. I can work out with
Fitstar, ask CNBC about the news, or
my husband and I can play name that tune
with SongPop, which he is surprisingly good at.

Until now, these
experiences have been available through the
Assistant on Google Home. But today, we're
also bringing them to Android phones and iPhones. It's over 100 million
devices on Android alone. So now people can get
to Google features and third-party
services from anywhere, and they can even pick up where
they left off across devices. So, not only are
third-party integrations available in more places. They'll be able to do more. Starting today,
actions on Google will be supporting transactions. It's a complete end-to-end
solution for developers, including payments, identity,
notifications, receipts, even account creation. The platform handles
all the complexity. Let me show you
how one will work. GOOGLE ASSISTANT:
Hi, how can I help? VALERIE NYGAARD: I'd like
delivery from Panera. PANERA: Hi, this is Panera. I'll need your delivery address.

Which one can I get from Google? GOOGLE ASSISTANT: We'll
go with 1600 Amphitheater. PANERA: What can I
get you started with? VALERIE NYGAARD: I'll have the
strawberry poppy seed salad with steak instead of chicken. PANERA: Got it. How about one of
these cool drinks? VALERIE NYGAARD: And here, I can
just swipe through my options. See what looks good. Agave lemonade. PANERA: Great. Are you ready to check out? VALERIE NYGAARD: Yep. PANERA: OK, the total is $18.40. Are you ready to
place the order? VALERIE NYGAARD: Yes. I'll just scan my fingerprint to
pay with Google, and that's it. [APPLAUSE] PANERA: Thanks. You're all set. VALERIE NYGAARD:
Yeah, super easy, like I was talking to
someone at the store. So here I was a new
Panera customer. I didn't have to install
anything or create an account. You've also probably
noticed I didn't have to enter my address
or my credit card.

I just saved those
earlier with Google, and Panera used
built-in platform calls to request the information. Now, I was in control over what
I shared every step of the way. So– AUDIENCE: Woo! VALERIE NYGAARD: [CHUCKLES]
The developer platform's also getting much stronger for
home automation integrations. Actions on Google can now
support any smart home developer that wants to
add conversational control. Today, over 70 smart
home companies work with the Google Assistant,
so now in my Google Home or from my phone, I can lock my
front door with August locks, control a range
of LG appliances, or check in on my son's room
by putting the Nest cam on TV. All right, now
that we're talking about making your home smarter,
we also have a lot of news to share today about Google
Home, our own smart speaker with the Google
Assistant built in. Here to tell you more
is Rishi Chandra.

[MUSIC PLAYING] [APPLAUSE] RISHI CHANDRA: Thanks, Valerie. You know, it's really
hard to believe we launched Google Home a
little over six months ago, and we've been really
busy ever since. Since launch, we've added
50 new features, including some of my favorites, like
support for Google Shopping, where I can use my voice
to order items from Costco right to my front door. Or I can get step-by-step
cooking instructions from over 5 million recipes. Or I can even play my favorite
song just by using the lyrics. Now in April, we launched in
the UK to some great reviews. And starting this
summer, we're going to be launching in
Canada, Australia, France, Germany, and Japan. [APPLAUSE] And with support
for multiple users, we can unlock the full
potential of Google Home to offer a truly
personal experience. So now, you can schedule
a meeting, set a reminder, or get your own daily
briefing with My Day by using your own voice.

And you'll get your commute, your
calendar appointments, and your news sources. Now today, I'd like to
share four new features we'll be rolling out
over the coming months. So first, we're
announcing support for proactive assistance
coming to Google Home. Home is great at providing
personally relevant information for you when you
ask for it, but we think it'd be even more
helpful if it could automatically notify you of timely
and important messages. And we do this by understanding
the context of your daily life, and proactively looking for
that really helpful information, and providing it for you
in a hands-free way. So for example, let's say I'm
relaxing and playing a game with the kids. Well, I can see that the Google
Home lights just turned on. Hey, Google, what's up? GOOGLE ASSISTANT: Hi, Rishi. Traffic's heavy
right now, so you'll need to leave in 14 minutes
to get to Shoreline Athletic Fields by 3:30 PM. RISHI CHANDRA:
That's pretty nice. The Assistant saw the game
coming up on my calendar, and got my attention
because I had to leave earlier than normal.

So now, my daughter can
make it to that soccer game right on time. Now, we're going
to start simple, with really important messages
like reminders, traffic delays, and flight status changes. And with multiple-user
support, you have the ability to control the
type of proactive notifications you want over time. All right, and second,
another really common activity we do in the home today is
communicate with others. And a phone call is still the
easiest way to reach someone. So today, I'm excited to
announce hands-free calling coming to Google Home. [CHEERING AND APPLAUSE] It's really simple to use. Just ask the Google
Assistant to make a call, and we'll connect you. You can call any landline
or mobile number in the US or Canada completely free. And it's all done
in a hands-free way. For example, let's say I forgot
to call my mom on Mother's Day. Well now, I can
call her while I'm scrambling to get the kids
ready for school in the morning. I just say, hey Google. Call mom. GOOGLE ASSISTANT:
Sure, calling mom.

[RINGING] [RINGING] SPEAKER 1: So, you're
finally calling. Mother's Day was three days ago. RISHI CHANDRA: Yeah,
sorry about that. They made me rehearse
for I/O on Mother's Day. Speaking of which, you're
on stage right now. Say hi to everyone. SPEAKER 1: Oh, hi, everyone. AUDIENCE: Hi. RISHI CHANDRA: So, hopefully,
this makes up for not calling, right? SPEAKER 1: No, it doesn't. You still need to visit
and bring flowers.

RISHI CHANDRA: OK, I'm on it. Bye. SPEAKER 1: Bye. RISHI CHANDRA: It's that simple. We're just making a standard
phone call through Google Home. So mom didn't need to learn anything new. She just needs to answer her phone. There's no additional setup,
apps, or even phone required. And since the Assistant
recognized my voice, we called my mom.

If my wife had asked,
we would have called her mom. We can personalize calling
just like everything else. And now, anyone at home can
call friends, family, even businesses. Maybe even a local florist to
get some flowers for your mom. Now, by default, we're going to
call out with a private number, but you also have the option
to link your mobile number to the Google Assistant. And we'll use that
number whenever we recognize your voice. So whoever you call will
know it's coming from you. Now, we're rolling out
hands-free calling in the US to all existing
Google Home devices over the next few months. It's the ultimate
hands-free speakerphone. No setup required, call anyone,
including personal contacts or businesses, and even dial out
with your personal number when we detect your voice. We can't wait for
you to try it out.

OK, third, let's talk a
little about entertainment. We designed Google Home
to be a great speaker, one that you can put in any
room of the house or wirelessly connect to other
Chromecast built-in speaker systems. Well today, we're
announcing that Spotify, in addition to their
subscription service, will be adding their free
music service to Google Home, so it's even easier to play
your Spotify playlists. [APPLAUSE] We'll also be adding support
for SoundCloud and Deezer, two of the largest global
music services today. [APPLAUSE] And these music
services will join many of the others
already available through the Assistant. And finally, we'll be
adding Bluetooth support to all existing
Google Home devices. So you can play any audio from
your iOS or Android device. AUDIENCE: Yes! [APPLAUSE] But Google Home can do
much more than just audio. Last year, we
launched the ability to use your voice to play
YouTube, Netflix, and Google Photos right on your TV.

And today, we're announcing
additional partners, including HBO NOW. [APPLAUSE] So just say what you want to watch,
and we'll play it for you all in a hands-free way. With Google Home, we want to
make it really easy to play your favorite entertainment. OK, finally, I want
to talk a little bit about how we see the Assistant
evolving to help you in a more visual way. Voice responses are great,
but sometimes a picture is worth a thousand words. So today, we're announcing
support for visual responses with Google Home. Now to do that,
we need a screen. Well, fortunately,
many of us already have a ton of screens in
our home today, our phones, our tablets, even our TVs.

The Google Assistant
should smartly take advantage of all
these different devices to provide you the best
response on the right device. For example, with Google
Home, I can easily get location information. OK, Google. Where is my next event? GOOGLE ASSISTANT:
Your Pokemon GO hike is at Rancho San
Antonio Reserve. RISHI CHANDRA: It's for my kids. GOOGLE ASSISTANT: It's
at 11:00 AM today. RISHI CHANDRA: It's for my kids. Relax. [LAUGHTER] But if I want to
view the directions, the best place to do
it is on my phone. Well soon, you could
just say, OK, Google. Let's go. GOOGLE ASSISTANT: All right,
I'm sending the best route to your phone. RISHI CHANDRA: And it will
automatically notify your phone,
whether it's Android or iOS, and take you straight
to Google Maps.

So you can glance at directions,
interact with the map, or just start navigation. It's really simple. Now TVs are another
natural place to get help from the
Google Assistant, and we have a great place to start,
with over 50 million Chromecast and Chromecast built-in devices. So today, we're
announcing that we'll be updating Chromecast to show
visual responses on your TV when you ask for help
from Google Home. For example, I can
now say, OK, Google. Show my calendar for Saturday. GOOGLE ASSISTANT:
Showing it on your TV. RISHI CHANDRA: It'll show
up right on the TV screen. I'll immediately get
results from the Assistant. [APPLAUSE] And since the Assistant
detected my voice, we're showing my calendar. Others would see their
calendar by using their voice. We can personalize the
experience, even on the TV. And you can continue to
follow up the conversation. Looks like I have a
biking trip to Santa Cruz.

What's the weather in
Santa Cruz this weekend? GOOGLE ASSISTANT: This
weekend in Santa Cruz, it will be clear and
sunny most of the time. RISHI CHANDRA: So
it's really easy. It's all hands-free. Your Assistant can provide
a visual response on the TV to a lot of different
types of questions. We talked about how
easy it is to play what you want to watch
on the TV screen, but what about those times
you don't know what to watch? Well, soon, you could
just ask, hey, Google. What's on YouTube? GOOGLE ASSISTANT: Here you go.

RISHI CHANDRA: And it'll show
me my personalized results right on the TV screen. If I don't like
any of the options, I can continue the
conversation with my voice. Show my Watch Later list. GOOGLE ASSISTANT: All right. RISHI CHANDRA: Play
"Send My Love." GOOGLE ASSISTANT: Playing
"Send My Love" from YouTube. [MUSIC – "SEND MY LOVE"] RISHI CHANDRA:
It's really simple. Again, no remotes
or phone required. In a short conversation, I found
something really interesting to watch using Google Home. I can even do it
with other things. OK, Google. What's on my DVR? GOOGLE ASSISTANT: Here you go. RISHI CHANDRA:
Here we're showing how it works with YouTube
TV, a new live TV streaming service that gives you
live sports and shows from popular TV networks. And YouTube TV
includes a cloud DVR, so I can easily play
my saved episodes. Everything can be done
in a hands-free way all from the
comfort of my couch. And over time, we're going
to bring all those developer actions that Valerie had already
talked about right to the TV screen.

So we'll do even more over
time with Google Home. And that's our update
for Google Home. Proactive assistance will bring
important information to you at the right time, simple
and easy hands-free calling, more entertainment
options, and evolving the Assistant to provide
visual responses in the home. Next up is Anil, who's going
to talk about Google Photos. [APPLAUSE] [MUSIC PLAYING] ANIL SABHARWAL:
Two years ago, we launched Google Photos
with an audacious goal– to be the home for
all of your photos, automatically organized
and brought to life so that you could easily
share and save what matters.

In doing so, we took a
fundamentally different approach. We built a product from the
ground up with AI at its core. And that's enabled
us to do things in ways that only Google can. Like when you're looking for
that one photo you can't find, Google Photos
organizes your library by people, places, and things. Simply type, "Anil
pineapple Hawaii," and instantly find this gem. [LAUGHTER] Or when you come home
from vacation, overwhelmed by the hundreds of
photos you took, Google Photos will
give you an album curated with only the
best shots, removing duplicates and blurry images. This is the secret ingredient
behind Google Photos, and the momentum we've seen
in these two short years is remarkable. As Sundar mentioned, we now
have more than half a billion monthly active users, uploading
more than 1.2 billion photos and videos per day. And today, I'm
excited to show you three new features
we're launching to make it even easier
to send and receive the meaningful
moments in your life.

Now, at first glance, it
might seem like photo sharing is a solved problem. After all, there's no shortage
of apps out there that are great at keeping you
and your friends and family connected, but we
think there's still a big and different problem
that needs to be addressed. Let me show you what I mean. [VIDEO PLAYBACK] – If there's one
thing you know, it's that you're a
great photographer. If there's a second
thing you know, it's that you're kind
of a terrible person. – What? – Yeah, you heard me. The only photo of the
birthday girl in focus? Never sent it. The best picture of
the entire wedding? Kept it to yourself. This masterpiece of
your best friend? We were going to
send it, but then you were like, oh,
remember that sandwich? I love that sandwich.

If only something could say,
hey, Eric looks great in these. You want to send them to him? And you can be like, great idea. Well, it can. Wait, it can? Yup. With Google Photos. [END PLAYBACK] [APPLAUSE] ANIL SABHARWAL:
So today, to make us all a little less
terrible people, we're announcing Suggested
Sharing, because we've all been there, right? Like when you're
taking that group photo and you insist that it be
taken with your camera, because you know if
it's not on your camera, you are never seeing
that photo ever again. [LAUGHTER] Now thanks to the machine
learning in Google Photos, we'll not only remind you so
you don't forget to share, we'll even suggest
the photos and people you should share with. In one tap, you're done. Let's have a look at
Suggested Sharing in action. I'm once again joined onstage
by my friend, and Google Photos product lead, David Lieb. [APPLAUSE] All right, so here
are a bunch of photos Dave took while bowling
with the team last weekend.

He was too busy
enjoying the moment, so he never got around
to sharing them. But this time, Google
Photos sent him a reminder via
notification, and also by badging the new Sharing tab. The Sharing tab is
where you're going to be able to find all of
your Google Photos sharing activity, and at the top,
your personal suggestions based on your sharing habits and
what's most important to you. Here is the Sharing
Suggestion that Dave got from his day bowling. Google Photos recognized
this was a meaningful moment, it selected the right
shots, and it figured out who he should send it to based
on who was in the photos. In this case, it's Janvi,
Jason, and a few others who were also at the event. Dave can now review
the photos selected, as well as update
the recipients.

Or if he's happy with
it, he can just tap Send. And that's it. Google Photos will even
send an SMS or an email to anyone who
doesn't have the app. And that way, everyone can view
and save the full resolution photos, even if they don't
have Google Photos accounts. And because Google
Photos sharing works on any device,
including iOS, let's have a look at what
Janvi sees on her iPhone. She receives a notification,
and tapping on it lets her quickly jump
right into the album. And look at all the photos
that Dave has shared with her. But notice here at
the bottom, she's asked to contribute the photos
she took from the event, with Google Photos automatically
identifying and suggesting the right ones.

Janvi can review the suggestions
and then simply tap Add. Now all of these photos
are finally pulled together in one place, and Dave gets
some photos he's actually in. [LAUGHTER] Which is great, because a
home for all your photos really should include
photos of you. Now, though Suggested Sharing
takes the work out of sharing, sometimes there's a
special person in your life whom you share just
about everything with. Your partner, your best
friend, your sibling. Wouldn't it be great if
Google Photos automatically shared photos with that person? For example, I would love it
if every photo I ever took of my kids was automatically
shared with my wife. And that's why today, we're also
announcing Shared Libraries. [APPLAUSE] Let me show you how it works. So here, we're now looking
at my Google Photos account. From the menu, I
now have the option to go ahead and
share my library, which I'm going to go ahead
and do with my wife, Jess.

Importantly, I have complete
control over which photos I automatically share. I can share them all,
or I can share a subset, like only photos of
the kids, or only photos from a
certain date forward, like when we first met. In this case, I'm going
to go ahead and share all. [LAUGHTER] [LAUGHS] We did not meet today. [LAUGHTER] And that's all there is to it. I've now gone ahead and shared
my library with my wife, Jess. So, let's switch to her phone
to see what the experience looks like from her end. She receives a notification,
and after accepting, she can now go to see
all the photos that I've shared with her, which she
can access really easily from the menu. If she sees something
she likes, she can go ahead and
select those photos and simply save
them to her library.

We'll even notify
her periodically as I take new photos. Now, this is great,
but what if Jess doesn't want to have to keep
coming back to this view and checking if I shared
new photos with her? She just wants every photo
I take of her or the kids to automatically be
saved to her library, just as if she took
the photos herself. With Shared Libraries,
she can do just that, choosing to autosave
photos of specific people. Now, any time I
take photos of her or the kids, without either
of us having to do anything, they'll automatically appear
in the main view of her app. Let me show you. Now, I couldn't justify
pulling the kids out of school today just to have
their photo taken, but I do have the
next best thing.

[APPLAUSE] Let me introduce you to
[? Eva ?] and [? Lilly. ?] All righty here. So I'm going to go ahead,
take a photo with the girls. Smile, kids! [LAUGHTER] Wow, fantastic. And since this is too
good of an opportunity, I'm going to have to
take one with all of you here, too, all right? [CHEERING] Here we go. Woo! Brilliant. All right. OK, so thank you, girls. Much appreciated. Back to school we go. [LAUGHTER] All right. So, using nothing more
than the standard camera app on my phone, I've
gone ahead and taken one photo with my kids and
one photo with all of you here in the audience. Google Photos is going to
back these two photos up. It's going to share
them with Jess, and then it's going to
recognize the photo that has my kids in them
and automatically save just that one to her library,
like you can see right here. [APPLAUSE] Now finally, Jess and I can
stop worrying about whose phone we're using to take the photos. All the photos of our family
are in my Google Photos app, and they automatically
appear in hers too.

And best of all,
these family photos are part of both of
our search results, and they're included in
the great collages, movies, and other fun creations that
Google Photos makes for us. But notice how only the
photos with the kids showed up in Jess's main view. But because I shared my
entire library with her, I can simply go to the
menu, and Jess can now see all of the photos, including
the one with all of you. [APPLAUSE] And that's how easy sharing
can be in Google Photos. Spend less time worrying
about sharing your memories, and more time actually
enjoying them. Suggested Sharing
and Shared Libraries will be rolling out on
Android, iOS, and web in the coming weeks.

Finally, we know
sharing doesn't always happen through apps and screens. There's still something
pretty special about looking at and even gathering around
an actual printed photo. But printing photos and
albums today is hard. You have to hunt across
devices and accounts to find the right
photos, select the best among the duplicates
and blurry images, upload them to a
printing service, and then arrange them
across dozens of pages. It can take hours of sitting
in front of a computer just to do one thing. Thankfully, the machine
learning in Google Photos already does most of
this work for you, and today, we're
bringing it all together with the launch of Photo Books. [APPLAUSE] They're beautiful, high quality
with a clean and modern design, but the best part
is that they're incredibly easy to make,
even on your phone. What used to take hours
now only takes minutes. I recently made a book
for Jess on Mother's Day. And let me show you just
how easy and fast that was. First, thanks to
unlimited storage, all my life's moments are
already here in Google Photos. No need to upload them to
another website or app.

So I'll select a
bunch of photos here. And the good news is I
don't have to figure out which are the right photos
and which are the good ones because this is where
Google Photos really shines. I'm just going to go
ahead and hit plus. Select Photo book. I'm going to pick
a hardcover book. We offer both a softcover
and a hardcover. And notice what happens. Google Photos is going to
pick the best photos for me, automatically
suggesting photos– 40, in this case. [APPLAUSE] How awesome is that? And it's even going to go ahead
and lay them all out for me. All that's left for me to do
is make a couple of tweaks, check out, and in
a few days, I'll end up with one of these
beautiful printed photo books. [APPLAUSE] And soon, we'll make it
even easier to get started, applying machine learning
to create personalized photo books you'll love.

So when you go to Photo
Books from the menu, you'll see pre-made books
tailored just for you. Your trip to the
Grand Canyon, time with your family during
the holidays, or your pet, or even your kids' artwork,
all easily customizable. We'll even notify you when
there are new Photo Books suggestions. AUDIENCE: [INAUDIBLE] ANIL SABHARWAL: Photo Books
are available today in the US on photos.google.com,
and they'll be rolling out on Android
and iOS next week, and will be expanding
to more countries soon.

[APPLAUSE] I am really excited about this
launch, and I want all of you to be the first to try it out. And that's why
everyone here at I/O will be receiving a free
hardcover photo book. [APPLAUSE] It's a great example of
machine learning at work. AUDIENCE: [? $10? ?] Take
that photo [INAUDIBLE] ANIL SABHARWAL: So those are
the three big updates related to sharing in Google Photos. Suggested Sharing, Shared
Libraries, and Photo Books. Three new features built
from the ground up with AI at their core. I can't wait for all of you
to try them out real soon. Now before I go, I want to
touch on what Sundar mentioned earlier, which is the way we're
taking photos is changing.

Instead of the occasional
photo with friends and family, we now take 30 identical
photos of a sunset. We're also taking different
types of photos, not just photos to capture
a personal memory, but as a way to
get things done– whiteboards we want to remember,
receipts we need to file, books we'd like to read. And that's where Google Lens
and its vision-based computing capabilities come in. It can understand
what's in an image and help you get things done. Scott showed how Google
Lens and the Assistant can identify what you're looking
at and help you on the fly. But what about after
you've taken the photo? There are lots of photos
you want to keep, and then look back on later to
learn more and take action.

And for that, we're
bringing Google Lens right into Google Photos. Let me show you. So let's say you took
a trip to Chicago. There's some beautiful
architecture there. And during your boat tour
down the Chicago River, you took lots of
photos, but it's hard to remember which
building is which later on. Now, by activating
Lens, you can identify some of the cool
buildings in your photos, like the second
tallest skyscraper in the US, Willis Tower. You can even pull up
directions and get the hours for the viewing deck. And later, while visiting
the Art Institute, you might take photos of a
few paintings you really love. In one tap, you can learn
more about the painting and the artist. And the screenshot that
your friend sent you of that bike rental place? Just activate Lens, and you
can tap the phone number and make the call
right from the photo.

[APPLAUSE] Lens will be rolling out in
Google Photos later this year, and we'll be continually
improving the experience so it recognizes
even more objects and lets you do
even more with them. And those are the updates
for Google Photos. [CHEERING AND APPLAUSE] Now, let's see what's
next from YouTube. [MUSIC PLAYING] SUSAN WOJCICKI: All right. Good morning, everyone. I am thrilled to be
here at my first ever I/O on behalf of YouTube. [APPLAUSE] Thank you. So that opening video
that we all just saw, that's a perfect glimpse into
what makes YouTube so special– the incredible
diversity of content. A billion people
around the globe come to YouTube every
month to watch videos from new and unique voices. And we're hard at
work to make sure that we can reach
the next billion viewers, which you'll hear about
in a later I/O session today.

We want to give
everyone the opportunity to watch the content on YouTube. So, YouTube is different
from traditional media in a number of ways. First of all, YouTube is open. Anyone in the world can upload
a video that everyone can watch. You can be a vlogger
broadcasting from your bedroom, a gamer live streaming
from your console, or a citizen
journalist documenting events live from your
phone on the front lines. And what we've seen
is that openness leads to important
conversations that help shape society,
from advancing LGBTQ rights to highlighting
the plight of refugees, to encouraging body positivity. And we've seen in our
numbers that users really want to engage with this
type of diverse content.

We are proud that last year we
passed a billion hours a day watched on YouTube,
and our viewership is not slowing down. The second way that
YouTube is different from traditional media is that
it's not a one-way broadcast. It's a two-way conversation. Viewers interact directly
with their favorite creators via comments, mobile live
streaming, fan polls, animated GIFs, and VR. And these features enable
viewers to come together, and to build communities
around their favorite content. So one of my favorite stories
of a YouTube community is the e-NABLE network. A few years ago, an
engineering professor named Jon Schull saw a YouTube
video about a carpenter who had lost two of his fingers. The carpenter worked
with a colleague for over a year to build
an affordable 3D-printed prosthesis that would enable
him to go back to work.

They then applied
this technology for a young boy who was
born without any fingers. So inspired by this
video, the professor posted a single
comment on the video asking for volunteers
with 3D printers to help print
affordable prostheses. The network has since grown
into a community of over 6,000 people who have
designed, printed, and distributed these
prosthetics to children in over 50 countries. [APPLAUSE] So today, thousands
of children have regained the ability
to walk, touch, play, and all because
of the one video– one comment– and that
incredible YouTube community that formed to help. And that's just one example of
the many passionate communities that are coming together
on YouTube around video. So, the third feature
of this new medium is that video works
on-demand on any screen. Over 60% of our watchtime now
comes from mobile devices. But actually our
fastest growing screen isn't the one in your pocket. It's the one in
your living room. Our watchtime in the living room
is growing at over 90% a year. So, let's now welcome Sarah Ali,
Head of Living Room Products, to the stage to talk about the
latest features in the living room.

[MUSIC PLAYING] [APPLAUSE] SARAH ALI: Thank you, Susan. So earlier today,
you heard from Rishi about how people
are watching YouTube on the TV via the Assistant. But another way
people are enjoying video is through the
YouTube app, which is available on over half a billion
smart TVs, game consoles, and streaming devices. And that number continues
to grow around the world. So, when I think
about why YouTube is so compelling
in the living room, it isn't just about
the size of the screen. It's about giving
you an experience that TV just can't match. First, YouTube offers
you the largest library of on-demand content. Second, our recommendations
build channels and lineups based on your
personal interests, and what you enjoy watching. And third, it's a two-way
interactive experience with features like
voice control.

And today, I'm super
excited to announce that we're taking the
interactive experience a step further by introducing
360 video in the YouTube app on the big screen. And you know that
you can already watch 360 videos on your phone
or in your Daydream headset. But soon, you'll be
able to feel like you're in the middle of the action,
right from your couch, and on the biggest
screen you own. Now, one of my personal
interests outside of work is to travel. And one place I'd
love to visit is Alaska to check out
the Northern Lights. So, let's do a voice search. Aurora Borealis 360. Great. Let's choose that first video. And now, using my TV remote, I'm
able to pan around this video, checking out this awesome
view from every single angle. Traveling is great,
especially when I don't have to get on a flight,
but 360 is now a brand-new way to attend concerts. I didn't make it to Coachella,
but here I can experience it like I was on stage.

And to enhance the
experience even further, we are also introducing
live 360 in the living room. Soon, you'll be able to
witness moments and events as they unfold in a new,
truly immersive way. So whether you have a Sony
Android TV, or an Xbox One console, soon, you'll
be able to explore 360 videos right from
the comfort of your couch and along with your
friends and family. And now, to help
show you another way we're enabling
interactivity, please join me in welcoming Barbara MacDonald,
who's the lead of something we call Super Chat. [MUSIC PLAYING] [APPLAUSE] BARBARA MACDONALD:
Good morning, I/O, and to everybody
on the live stream.

As Susan mentioned, what
makes YouTube special is the relationships
that creators are able to foster with their fans. And one of the best ways to
connect with your fans is to bring them live, behind
the scenes of your videos, offering up can't-miss content. In the past year, the
number of creators live streaming on
YouTube has grown by 4x. This growth is
awesome, and we want to do even more to deepen the
connection between creators and their fans
during live streams. That's why earlier this year,
we rolled out a new feature called Super Chat. When a creator is
live streaming, fans can purchase Super
Chats, which are highlighted, fun chat messages. Not only do fans
love the recognition, but creators earn
extra money from it. In the past three
months since launch, we've been amazed by
the different ways creators are using Super Chat. Even April, our favorite
pregnant giraffe, who unfortunately could
not be here with us today, has raised tens of
thousands of dollars for her home, the
Animal Adventure Park.

But, OK. [CLAPPING] OK, we can clap for that. [APPLAUSE] [LAUGHS] But enough talking from me. We are going to do a live
stream right here, right now, to show all of you
how Super Chat works. And to help me, I am
very excited to introduce top YouTube creators with
9 million subscribers and over 1 billion
lifetime channel views. On the grass back
there, The Slow Mo Guys! [CHEERING AND APPLAUSE] GAVIN FREE: Hello, everyone. DANIEL GRUCHY: Wow, hey. Happy to be here. How's it going? BARBARA MACDONALD:
It's great to have you. So let's pull up
their live stream. And just look. Chat is flying. Now, I love The
Slow Mo Guys, and I want to make sure that
they see my message, so I'm going to Super Chat them. Pulled up the stream. And right from within live chat,
I am able to enter my message, select my amount, make
the purchase, and send.

Boom. See how much that
message stands out? And it gets to the top. It's cool, right? DANIEL GRUCHY: Yeah,
thanks, Barbara. It's actually lovely
at the minute. Although, I feel like there's
a high chance of showers. GAVIN FREE: Very local
showers, like, specifically to this stage. DANIEL GRUCHY: Very sudden. Yeah. BARBARA MACDONALD:
Ooh, I wonder. I wonder. Well, because we know developers
are incredibly creative, we wanted to see what you can
do to make Super Chat even more interactive. So we've launched an API for it.
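(For developers following along: Super Chat purchases surface through the YouTube Data API v3 superChatEvents.list endpoint. A minimal Kotlin sketch of polling it, assuming you already hold an OAuth access token for the creator's channel; fireHorn() is a hypothetical stand-in, not part of the API.)

```kotlin
// A minimal sketch: poll the YouTube Data API v3 superChatEvents.list
// endpoint and hand each response to a real-world trigger. Assumes an
// OAuth access token for the creator's channel.
import java.net.HttpURLConnection
import java.net.URL

fun pollSuperChats(accessToken: String) {
    val url = URL(
        "https://www.googleapis.com/youtube/v3/superChatEvents" +
            "?part=snippet&maxResults=50"
    )
    val conn = url.openConnection() as HttpURLConnection
    conn.setRequestProperty("Authorization", "Bearer $accessToken")
    val body = conn.inputStream.bufferedReader().readText()
    // A real integration would parse the JSON, skip event IDs it has
    // already handled, and read each event's snippet (e.g. amountMicros).
    fireHorn(body)
}

fun fireHorn(rawJson: String) {
    // Hypothetical: trigger lights, horns, or water balloons here.
    println(rawJson)
}
```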

And today, we're taking
it to the next level with a new developer
integration that triggers actions in the real world. This means that when a fan
sends a Super Chat to a creator, things can happen in real life,
such as turning the lights on or off in the creator's
studio, flying a drone around,
or pushing buttons on their toys and gadgets. The Slow Mo Guys are going to
create their next slow motion video using Super Chat's API. We have now rigged things up so
that when I send my next Super Chat, it will
automatically trigger the lights and a big horn
in this amphitheater, OK? And that is going to signal our
friends back there on the lawn to unleash a truckload of water
balloons at The Slow Mo Guys. GAVIN FREE: I'm scared. [CHEERING] DANIEL GRUCHY: Yeah. BARBARA MACDONALD: Yeah. [LAUGHS] DANIEL GRUCHY: That's right. For every dollar, we're going
to take another balloon. So, more money
means more balloons. Although, I did hear
a guy over here go, oh, we're going to
really nail these guys.

All right, that's going to
be at least $4 right there. So, yeah. Each dollar donated goes to
the cause Susan mentioned earlier, the e-NABLE network.
much do you think we can send? I can start at $1 and go
anywhere upwards from there. So, it's for charity. How do we think– $100. How's that sound? AUDIENCE: More. BARBARA MACDONALD: OK,
higher, higher. $200? $200? GAVIN FREE: How about
$500 for 500 balloons? BARBARA MACDONALD: $500? I can do that. I can do that. OK. So I'm going to send my
Super Chat and hit Send. $500. Boom. [HORN BLOWS] DANIEL GRUCHY: Oh! Balloons, oh [INAUDIBLE] god! Agh! BARBARA MACDONALD: [LAUGHS] DANIEL GRUCHY: Ugh. Yep. All right. All right. BARBARA MACDONALD: Keep going. Keep going. DANIEL GRUCHY: Oh! BARBARA MACDONALD: It's 500. DANIEL GRUCHY: It's finished. It's finished. GAVIN FREE: It never ends, ah! DANIEL GRUCHY: Ah! [INAUDIBLE] BARBARA MACDONALD:
That was amazing.

Thank you, everybody,
for your help. So this obviously just
scratches the surface of what is possible using
Super Chat's open APIs. And we are super excited
to see what all of you will do with it next. So Susan, how about
you come back out here, and let's check out the
video we've all made. [VIDEO PLAYBACK] [MUSIC PLAYING] [APPLAUSE] BARBARA MACDONALD: [LAUGHS] AUDIENCE: [? Yeah, guys! ?] BARBARA MACDONALD: Wow.

[APPLAUSE] SUSAN WOJCICKI: Thank you, Slow Mo Guys. Thank you, Barbara. I'm really happy to
announce that YouTube is going to match The
Slow Mo Guys' Super Chat earnings from today
100x to make sure that we're supplying
prosthetics to children in need around the world. [APPLAUSE] So that 360 living room demo
and the Super Chat demo– those are just two
examples of how we are working to connect
people around the globe together with video.

Now, I hope that what
you've seen today is that the future of media
is a future of openness and diversity. A future filled with
conversations, and community. And a future that works
across all screens. Together with creators,
viewers, and partners, we are building the
platform of that future. Thank you, I/O, and please– [APPLAUSE] Please welcome
Dave Burke, joining us to talk about Android. [CHEERING AND APPLAUSE] [VIDEO PLAYBACK] [MUSIC – JACKIE WILSON, "HIGHER
AND HIGHER"] [BUZZING] [CHEERING] [SATELLITE BEEPS] – Yay! Woohoo! [FIREWORKS LAUNCHING] – Yay! Woohoo! [FIREWORKS LAUNCHING] [END PLAYBACK] [CHEERING AND APPLAUSE] DAVE BURKE: Hi, everybody. It's great to be here
at Google I/O 2017. As you can see, we
found some new ways to hardware accelerate Android.

This time, with jet packs. But seriously, 2 billion
active devices is incredible. And that's just
smartphones and tablets. We're also seeing new momentum
in areas such as TVs, and cars, and watches, and
laptops, and beyond. So let me take a
moment and give you a quick update on how Android
is doing in those areas. Android Wear 2.0 launched
earlier this year with a new update for
Android and iPhone users. And with new partners like
Emporio Armani, Movado, and New Balance, we now enable
24 of the world's top watch brands. Android Auto. We've seen a 10x user
growth since last year. It's supported by more than 300 car
models and the Android Auto mobile app. And just this week,
Audi and Volvo announced that their
next generation nav systems will be powered by
Android for a more seamless, connected car experience. Android TV. We partnered with over 100
cable operators and hardware manufacturers around the world. And we're now seeing 1
million device activations every two months.

And there are more than
3,000 Android TV apps in the Play Store. This year, we're releasing a
brand-new launcher interface, and bringing the Google
Assistant to Android TV. Android Things previewed
late last year, and already there are thousands
of developers in over 60 countries using it to
build connected devices with easy access to the
Google Assistant, TensorFlow, and more. The full launch is
coming later this year. Chromebooks comprise almost 60%
of K-12 laptops sold in the US, and the momentum is
growing globally. And now, with the added
ability to run Android apps, you get to target laptops, too. Now, of course,
platforms are only as good as the apps they run. The Google Play ecosystem
is more vibrant than ever. Android users installed a
staggering 82 billion apps and games in the past year. That's 11 apps for every
person on the planet. All right, so let's come
back to smartphones. And the real reason I'm here
is to talk about Android O. Two months ago, we launched our
very first developer preview. So you could kick the tires
on some of the new APIs.

And of course, it's very
much a work in progress, but you can expect the
release later this summer. Today, we want to walk you
through two themes in O that we're excited about. The first is something
called Fluid Experiences. It's pretty incredible what you
can do on a mobile phone today, and how much we rely on them
as computers in our pockets. But there are still
certain things that are tough to do
on a small screen, so we're doing a
couple of features in O that we think will
help with this, which I'll cover
in just a moment. The second theme is
something we call Vitals. And the concept here is to
keep vital system behavior in a healthy state so we can
maximize the user's battery, performance, and reliability. So let's jump
straight in and walk through four new
fluid experiences, with live demos,
done wirelessly.

What could possibly go wrong? [LAUGHTER] All right. These days, we do a lot
at once on our phones, whether it's paying
for groceries while reading a text
message you just received, or looking up guitar chords
while listening to a new song. But conventional
multi-window techniques don't translate well to mobile. They're just too fiddly to
set up when you're on the go. We think Picture-in-Picture
is the answer for many cases. So let's take a look.

My kids recently asked me
to build a lemonade stand. So I opened up YouTube, and I
started researching DIY videos. And I found this one. Now, at the same
time, I want to be able to jot down the
materials I need to build for this lemonade stand. So to multitask, all I do
is press the Home button, and boom, I get
Picture-in-Picture.

You can think of it as a kind
of automatic multi-window. I can move it out of the
way, I can launch Keep, I can add some more materials. So I know I need to get
some wood glue, like so. Then when I'm done, I just
simply swipe it away like that. It's brilliant. Picture-in-Picture lets you
do more with your phone. It works great when
video calling with Duo.

For example, maybe I
need to check my calendar while planning a
barbecue with friends. And there are lots of
other great use cases. For example,
Picture-in-Picture for Maps navigation, or watching
Netflix in the background, and a lot more. And we're also excited
to see what you come up with for this feature.
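(For developers: opting in is small. A minimal Kotlin sketch, assuming a video Activity that declares android:supportsPictureInPicture="true" in its manifest entry; the class name is illustrative.)

```kotlin
// A minimal sketch of entering Picture-in-Picture on Android O.
// Assumes the Activity declares android:supportsPictureInPicture="true".
import android.app.Activity
import android.app.PictureInPictureParams
import android.util.Rational

class PlayerActivity : Activity() {
    // Called when the user presses Home: shrink into a PiP window.
    override fun onUserLeaveHint() {
        val params = PictureInPictureParams.Builder()
            .setAspectRatio(Rational(16, 9)) // keep the video's shape
            .build()
        enterPictureInPictureMode(params)
    }

    // Hide or restore playback controls as the window shrinks and grows.
    override fun onPictureInPictureModeChanged(isInPictureInPictureMode: Boolean) {
        // toggle UI chrome here
    }
}
```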
We're also making notification interactions more fluid for users. From the beginning,
Android has really blazed a trail when it comes
to its advanced notification system. In O, we're extending the
reach of notifications with something we call
Notification Dots. It's a new way
for app developers to indicate that there's
activity in their app, and to drive engagement. So take a look. You'll notice that the Instagram
app icon has a dot in it. And this is it, indicating
that there's a notification
associated with the app. So if I pull down the
shade, sure enough, you can see there's
a notification. In this case,
someone's commented on a photo I'm tagged in. What's really cool is I can
long press the app icon, and we now show the
notification in place.

One of the things I really
like about the Notification Dot mechanism is that it works
with zero effort from the app developer. We even extract the color
of the dot from your icon. Oh, and you can clear
the dot by simply swiping the notification away like that. So you're always in control.
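(Under the hood, dots ride on O's notification channels, and the one switch developers control is per-channel badging. A minimal Kotlin sketch, assuming API 26; the channel id and name are illustrative.)

```kotlin
// A minimal sketch: dots appear automatically for notifications posted
// to a channel; setShowBadge() is the per-channel switch.
import android.app.NotificationChannel
import android.app.NotificationManager
import android.content.Context

fun createCommentsChannel(context: Context) {
    val channel = NotificationChannel(
        "comments",                          // illustrative channel id
        "Comments",                          // user-visible name
        NotificationManager.IMPORTANCE_DEFAULT
    )
    channel.setShowBadge(true) // show a dot for this channel (the default)
    context.getSystemService(NotificationManager::class.java)
        .createNotificationChannel(channel)
}
```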
Another great feature in O that helps make your experience more fluid is Autofill. Now, if you use
Chrome, you're probably already familiar with Autofill
for quickly filling out a username and
password, or credit card information with a single tap. With O, we've extended
Autofill to apps. Let's say I'm setting up a
new phone for the first time, and I open Twitter.

And I want to log in. Now, because I use twitter.com
all the time on Chrome, the system will automatically
suggest my username. I can simply tap it. I get my password. And then, boom. I'm logged in. It's pretty awesome. [APPLAUSE] Autofill takes the
pain out of setting up a new phone or tablet. Once the user opts
in, Autofill will work for most applications. We also provide
APIs for developers to customize Autofill
for their experience.
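(The simplest of those APIs is just labeling your views. A minimal Kotlin sketch, assuming API 26 and login fields created in code; the same hints can go in XML via android:autofillHints.)

```kotlin
// A minimal sketch: tell the Autofill framework what your login
// fields hold so a single tap can fill them.
import android.view.View
import android.widget.EditText

fun markForAutofill(username: EditText, password: EditText) {
    username.setAutofillHints(View.AUTOFILL_HINT_USERNAME)
    password.setAutofillHints(View.AUTOFILL_HINT_PASSWORD)
    // Make the opt-in explicit (auto is the default).
    username.importantForAutofill = View.IMPORTANT_FOR_AUTOFILL_YES
    password.importantForAutofill = View.IMPORTANT_FOR_AUTOFILL_YES
}
```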
I want to show you one more demo of how we're making Android more fluid
by improving copy and paste. The feature is called
Smart Text Selection.

So let's take a look. In Android, you typically
long press or double tap a word to select it. For example, I can open Gmail. I can start composing. If I double tap the word "bite,"
it gets selected like so. Now, we know from user
studies that phone numbers are the most copy-and-pasted items. The second most common are
named entities like businesses, and people, and places. In O, we're applying
on-device machine learning– in this case, a feed-forward
neural network– to recognize these more
complicated entities. So watch this. I can double tap anywhere on
the phrase, "Old Coffee House," and all of it is
selected for me. No more fiddling around
with text selection handles. [APPLAUSE] It even works for addresses. So if I double tap on the
address, all of it is selected. And what's more– [APPLAUSE] There is more. What's more is the
machine learning model classifies
this as an address and automatically suggests Maps. So I can get directions
to it with a single click. And of course, it works as
you'd expect for phone numbers. You get the phone
dialer suggested.

And for email addresses,
you get Gmail suggested. All of this neural
networking processing happens on-device in real time,
and without any data leaving the device. It's pretty awesome.
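(Apps can query the same on-device model through O's TextClassifier API. A minimal Kotlin sketch, assuming API 26; the sample string and span indices are illustrative.)

```kotlin
// A minimal sketch of asking O's on-device TextClassifier what a span
// of text is. The string and indices here are illustrative.
import android.content.Context
import android.os.LocaleList
import android.view.textclassifier.TextClassificationManager

fun classifyEntity(context: Context) {
    val classifier = context
        .getSystemService(TextClassificationManager::class.java)
        .textClassifier
    val text = "Meet me at Old Coffee House at noon"
    // Classify the span [11, 27), which covers "Old Coffee House".
    val result = classifier.classifyText(text, 11, 27, LocaleList.getDefault())
    // result carries entity types (e.g. TextClassifier.TYPE_ADDRESS)
    // plus suggested actions like opening Maps or the dialer.
    println(result.entityCount)
}
```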
Now, on-device machine learning helps to make your phone smarter. And we want to help
you build experiences like what you just saw. So we're doing two
things to help. First, I'm excited to
announce that we're creating a specialized version
of TensorFlow, Google's open source machine
learning library, which we call TensorFlow Lite. It's a library for apps
designed to be fast and small, yet still enabling
state-of-the-art techniques like convnets and LSTMs. Second, we're introducing
a new framework in Android to hardware accelerate
neural computation. TensorFlow Lite will leverage
a new neural network API to tap into silicon-specific
accelerators. And over time, we expect to
see DSPs specifically designed for neural network
inference and training.
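(TensorFlow Lite hadn't shipped at the time of this talk, so as a hedged sketch: on-device inference with the Interpreter API it eventually exposed looks roughly like this. The model file and tensor shapes are illustrative.)

```kotlin
// A hedged sketch of on-device inference with the TensorFlow Lite
// Interpreter API; model path and shapes are illustrative.
import org.tensorflow.lite.Interpreter
import java.io.File

fun runModel(modelFile: File, input: FloatArray): FloatArray {
    val output = Array(1) { FloatArray(10) } // e.g. a 10-class classifier
    val interpreter = Interpreter(modelFile)
    interpreter.run(arrayOf(input), output)  // one forward pass, on-device
    interpreter.close()
    return output[0]
}
```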

We think these new
capabilities will help power our next
generation of on-device speech processing, visual search,
augmented reality, and more. TensorFlow Lite will soon
be part of the open source TensorFlow project, and
the neural network API will be made available later
in an update to O this year. OK, so that's a
quick tour of some of the fluid experiences
in O.

Let's switch gears and talk about Vitals. So to tell you more,
I want to hand it over to Steph, who's been
instrumental in driving this project. Thank you. [MUSIC PLAYING] STEPHANIE SAAD
CUTHBERTSON: Hi, everyone. OK, so all the features
Dave talked about are cool. But we think your phones'
foundations are even more important–
battery life, security, startup time, and stability. After all, if your battery dies
at 4:00 PM, none of the other features that Dave talked
about really matter. So in O, we're investing
in what we call Vitals, keeping your phone secure
and in a healthy state to maximize power
and performance. We've invested in three
foundational building blocks– security enhancements,
OS optimizations, and tools to help
developers build great apps. First, security. Android was built with
security in mind from day one with application sandboxing. As Android has matured, we've
developed vast mobile security services. Now, we use machine learning to
continuously comb apps uploaded to Play, flagging
potentially harmful apps.

Then, we scan over 50
billion apps every day, scanning every installed app
on every connected device. And when we find a
potentially harmful app, we disable it or remove it. And we found most
Android users don't know these services
come built in on Android devices with Play. So for greater
peace of mind, we're making them more
visible and accessible, and doubling down
on our commitment to security, with the
introduction of Google Play Protect. [APPLAUSE] So here, you can see
Play Protect has recently scanned all your apps. No problems found. That's Google Play Protect.

It's available out of the
box on every Android device with Google Play. Second, OS optimizations. The single biggest visible
change in O is boot time. On Pixel, for example,
you'll find, in most cases, your boot time is
now twice as fast. And we've made all
apps faster by default. We did this through extensive
changes to our runtime. Now, this is really cool stuff,
like concurrent compacting garbage collection
and code locality. But all you really need
to know is that your apps will run faster and smoother. Take Google Sheets–
aggregate performance over a bunch of common actions
is now over two times as fast. And that's all from the OS. There are no changes to the app. But we found apps
could still have a huge impact on performance. Some apps were running
in the background, and they were consuming tons
of system resources, especially draining battery.

So in O, we're
adding wise limits to background location
and background execution. These boundaries put
sensible limits on usage. They're protecting battery
life and freeing up memory. Now, our third theme is helping
developers build great apps. And here, I want
to speak directly to all the developers
in the audience. Wouldn't it be cool if Android's
engineering team could show you what causes performance issues? Today, we've launched
Play Console Dashboards that analyze every
app and pinpoint six top issues that
cause battery drain, crashes, and slow UI. For each issue the app
has, we show how many users are affected and provide
guidance on the best way to fix. Now, imagine if developers could
also have a powerful profiler to visualize what's
happening inside the app. In Android Studio, we've also
launched new unified profiling tools for network,
memory, and CPU. So, developers can
now see everything on a unified timeline, and
then dive into each profiler. For example, on CPU, you
can see every thread.

You can look at the call
stack, and the time every call is taking. You can visualize
where the CPU is going. And you can jump to
the exact line of code. OK, so that's Android Vitals. [APPLAUSE] How we're investing
in your phone's foundational security
and performance. Later today, you'll see
Android's developer story from end to end. Our hard work to
help developers build great apps at every stage– writing code, tuning,
launching, and growing.

But there is one more thing. One thing we think would
be an incredible complement to the story. And it is one thing our team
has never done for developers. We have never added a
new programming language to Android. And today, we're making
Kotlin an officially supported language in Android. [APPLAUSE] So, Kotlin– Kotlin is one that our
developer community has already asked for. It makes developers so
much more productive. It is fully Android
runtime compatible. It is totally interoperable
with your existing code. It has fabulous IDE support. And it's mature and
production ready from day one.
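(As a tiny illustration of that productivity, not taken from the talk: a data class and an extension function replace a screenful of Java boilerplate.)

```kotlin
// Illustrative only: equals(), hashCode(), toString(), and copy()
// are generated for free by the data class.
data class Speaker(val name: String, val topic: String)

// Extension function on String, callable as "Steph".talksAbout(...)
fun String.talksAbout(topic: String) = Speaker(this, topic)

fun main() {
    val steph = "Steph".talksAbout("Android Vitals")
    println(steph.copy(topic = "Kotlin")) // Speaker(name=Steph, topic=Kotlin)
}
```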
We are also announcing our plans to partner with JetBrains, creating a foundation
for Kotlin. I am so happy JetBrains CEO,
Max Shafirov, is here today. [APPLAUSE] This new language is
wonderful, but we also thought we should increase
our investment in our existing languages. So we're doing that, too. Please join us at the
developer keynote later today to hear our story
from end to end.

OK, so let's wrap up. There are tons more features
in Android O, which we don't have time to go into today. Everything from
redesigned settings, to Project Treble, which
is one of the biggest changes to the
foundations of Android to date, to downloadable fonts
with new emoji, and much more. If you want to try some of
these features for yourself– and you do– I'm happy to announce we're
making the first beta release of O available today. Head over to android.com/beta. [APPLAUSE] But there's more. [LAUGHS] You probably
thought we were done talking about
Android O, but I'd like you to hear some
more about Android. And for that, please
welcome Sameer. Thank you. [MUSIC PLAYING] [APPLAUSE] SAMEER SAMAT: Thanks, Steph. Hi, everyone. From the beginning,
Android's mission has been to bring the power
of computing to everyone. And we've seen tremendous
growth over the last few years, from the high end to
entry-level devices, in countries like
Indonesia, Brazil and India. In fact, there are now
more users of Android in India than there
are in the US.

And every minute,
seven Brazilians come online for the first time. Now, all this
progress is amazing. For those of us who
have a smartphone, we intuitively understand
the profound impact that computing is having
on our daily lives. And that's why our team
gets so excited about how we can help bring this
technology to everyone. So we took a step back
to think about what it would take to get
smartphones to more people. There are a few
things that are clear. Devices would need to
be more affordable, with entry-level prices
dropping significantly. This means hardware that uses
less powerful processors and far less memory
than on premium devices. But the hardware is
only half the equation. The software also
has to be tuned for users' needs around
limited data connectivity and multilingual use. We learned a lot
from our past efforts here with Project
Svelte and KitKat, and the original
Android One program.

But we felt like the time was
right to take our investment to the next level. So today, I'm
excited to give you a sneak peek into a
new experience we're building for entry-level
Android devices. Internally, we
call it Android Go. Android Go focuses
on three things. First, optimizing the
latest release of Android to run smoothly on
entry-level devices, starting with Android
O. Second, a rebuilt set of Google Apps that
use less memory, storage space, and mobile data. And third, a version
of the Play Store that contains the
whole app catalog, but highlights the apps
designed by all of you for the next billion users. And all three of these
things will ship together as a single experience starting
on Android O devices with 1 gigabyte or less of memory.

Let's take a look at
some of the things we're working on for Android Go. First, let's talk about
the operating system. For manufacturers to make more
affordable entry-level devices, the prices of their
components have to come down. Let's take one example. Memory is an
expensive component. So we're making a
number of optimizations to the system UI and the
kernel to allow an Android O device built with
the Go configuration to run smoothly with as
little as 512 megabytes to 1 gigabyte of memory.
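(Apps can adapt to these configurations too: the platform exposes a low-RAM signal that a Go-class device would surface. A minimal Kotlin sketch; the cache sizes are illustrative.)

```kotlin
// A minimal sketch: shrink an in-memory cache on low-RAM hardware
// using ActivityManager.isLowRamDevice(). Sizes are illustrative.
import android.app.ActivityManager
import android.content.Context

fun chooseCacheSizeBytes(context: Context): Int {
    val am = context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
    return if (am.isLowRamDevice) {
        4 * 1024 * 1024   // 4 MB on a Go-class device
    } else {
        32 * 1024 * 1024  // 32 MB otherwise
    }
}
```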

Now on-device
performance is critical, but data costs and
intermittent connectivity are also big
challenges for users. One person put it best
to me when she said, mobile data feels like currency. And she wanted more control
over the way she spent it. So on these devices, we're
putting data management front and center in Quick Settings. And we've created an API that
carriers can integrate with, so you can see exactly how much
prepaid data you have left, and even top up right
there on the device. But beyond the OS,
the Google Apps are also getting
smarter about data. For example, on these devices,
the Chrome Data Saver feature will be turned on by
default. Data Saver transcodes content on the
server and simplifies pages when you're on a
slow connection. And, well, now we're
making the savings more visible here in the UI.

In aggregate, this
feature is saving users over 750 terabytes
of data every day. I'm really excited that the
YouTube team has designed a new app called YouTube Go for
their users with limited data connectivity. Feedback on the new YouTube
app has been phenomenal, and we're taking many of the
lessons we've learned here and applying them to
several of our Google Apps. Let me show you some of the
things I love about YouTube Go. First, there's a new
preview experience, so you can get a sneak
peek inside a video before you decide to spend
your data to watch it. And when you're sure this
is the video for you, you can select the
streaming quality you want, and see exactly how much mobile
data that's going to cost you. But my favorite
feature of YouTube Go is the ability to save videos
while you're connected. So you can watch them
later when you might not have access to data. And if you want to share any
of those videos with a friend, you can use the built-in
peer-to-peer sharing feature to connect two of your
devices together directly, and share the files
across without using any of your mobile data at all.

[APPLAUSE] But beyond data
management, the Google Apps will also make it
easier to seamlessly go between multiple
languages, which is a really common use case
for people coming online today. For example, Gboard now
supports over 191 languages, including the recent addition
of 22 Indian languages. And there's even a
transliteration feature, which allows you to
spell words phonetically on a QWERTY keyboard to type
in your native language script. Now, Gboard is super cool,
so I want to show it to you. I grew up in the US, so for any
of my family that's watching, don't get too
excited by the demo. I haven't learned Hindi yet. And I'm sorry, mom, OK? [LAUGHTER] So let's say, I want to send a
quick note to my aunt in India. I can open up Allo,
and using Gboard, I can type how it
sounds phonetically. [HINDI], which means
"how are you" in Hindi. And transliteration
automatically gives me Hindi script. That's pretty cool. Now, let's say I want to ask
her how my I/O speech is going, but I don't know how to
say that in Hindi at all.

I can use the built-in
Google Translate feature to say, "how is this going?" And seamlessly, I
get Hindi script, all built right
into the keyboard. [APPLAUSE] My family is apparently
a tough audience. All right. Well, the Google Apps
are getting Go-ified, but what has always
propelled Android forward is the apps from all of you. And no surprise, many of
our developer partners have optimized
their apps already.

So to better connect users
with these experiences, we'll be highlighting
them in the Play Store. One example is right
here on Play's home page. To be eligible for
these new sections, we published a set
of best practices called "Building for Billions,"
which includes recommendations we've seen make a big difference
in the consumer experience. Things such as designing
a useful offline state, reducing your APK size to
less than 10 megabytes, and using GCM or JobScheduler
for better battery and memory performance. And also in "Building
for Billions," you'll find best practices for
optimizing your web experience. We've seen developers
build amazing things with new technologies, such
as progressive web apps. And we hope you can come
to our developer keynote later today to learn
a whole lot more.
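(As a concrete sketch of that JobScheduler recommendation, my illustration, not sample code from the talk: deferring background sync until the device is charging and on an unmetered network looks roughly like this. The SyncJobService here is a hypothetical service in your own app.)

// Sketch: defer background sync with JobScheduler until the device is
// charging and on an unmetered network, for better battery behavior.
import android.app.job.JobInfo;
import android.app.job.JobParameters;
import android.app.job.JobScheduler;
import android.app.job.JobService;
import android.content.ComponentName;
import android.content.Context;

public class SyncScheduler {
    private static final int SYNC_JOB_ID = 42; // arbitrary app-defined ID

    // Hypothetical job that would perform the deferred sync. It must be
    // declared in the manifest with android.permission.BIND_JOB_SERVICE.
    public static class SyncJobService extends JobService {
        @Override public boolean onStartJob(JobParameters params) {
            // Kick off the sync here, then call jobFinished(params, false).
            return false; // false = no work still running on another thread
        }
        @Override public boolean onStopJob(JobParameters params) { return false; }
    }

    public static void scheduleSync(Context context) {
        JobInfo job = new JobInfo.Builder(SYNC_JOB_ID,
                new ComponentName(context, SyncJobService.class))
                .setRequiredNetworkType(JobInfo.NETWORK_TYPE_UNMETERED) // Wi-Fi only
                .setRequiresCharging(true) // wait until plugged in
                .build();
        JobScheduler scheduler =
                (JobScheduler) context.getSystemService(Context.JOB_SCHEDULER_SERVICE);
        scheduler.schedule(job);
    }
}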

OK, that was a quick walkthrough
of some of the things coming in Android Go. Starting with Android
O, all devices with 1 gigabyte of RAM or less
will get the Go configuration. And going forward,
every Android release will have a Go configuration. We'll be unveiling much
more later this year, with the first devices
shipping in 2018. We look forward to
seeing what you'll build, and how we can bring computing
to the next several billion users. Next up– next up, you'll
be hearing from Clay on one of Google's newest platforms
that we're really excited about– VR and AR. Thank you. [APPLAUSE] [MUSIC PLAYING] CLAY BAVOR: Thank you, Sameer. So, Sundar talked about how
technologies like machine learning and
conversational interfaces make computing more intuitive by
enabling our computers to work more like we do. And we see VR and AR
in the same light. They enable us to
experience computing just as we experience
the real world. Virtual reality can
be transporting. And you can experience
not just what it's like to see
someplace, but what it's like to really be there.

And augmented reality uses
your surroundings as context, and puts computing
into the real world. A lot has happened since
Google I/O last year, and I'm excited to share a
bit of what we've been up to. So let's start with VR. Last year, we announced
Daydream, our platform for mobile virtual reality. And then in October, to
kick-start the Daydream ecosystem, we released
Daydream View, a VR headset made by Google. And it's super comfortable. It's really easy to use. And there's tons to do with it. You can play inside
alternate worlds, and games like "Virtual
Virtual Reality." And you can see any
part of our world with apps like Street View.

And you can visit other worlds
with apps like Hello Mars. There's already a great
selection of Daydream phones out there, and we're
working with partners to get Daydream on even more. First, I'm pleased
that LG's next flagship phone, which launches later this
year, will support Daydream. And there's another. I'm excited to announce that
the Samsung Galaxy S8 and S8 Plus will add Daydream support
this summer with a software update. [APPLAUSE] So, Samsung, of
course, they make many of the most popular
phones in the world.

And we're delighted to have
them supporting Daydream. So great momentum in
Daydream's first six months. Let's talk about what's next. So with Daydream,
we showed that you can create high
quality mobile VR experiences with
just a smartphone and a simple headset. And there are a lot of nice
things about smartphone VR. It's easy. There aren't a bunch of cables
and things to fuss with. You can choose from a bunch
of great compatible phones. And of course, it's portable. You can throw your
headset in a bag. We asked, how do we take the
best parts of smartphone VR and create a kind of device
with an even better experience? Well, I'm excited to announce
that an entirely new kind of VR device is coming to Daydream–
what we call standalone VR headsets. And we're working with
partners to make them. So what's a standalone headset? Well, the idea is that
you have everything you need for VR built right
into the headset itself. There's no cables, no phone,
and certainly, no big PC. And the whole device is
designed just for VR. And that's cool for
a couple of reasons.

First, it's easy to use. Getting into VR is as easy
as picking the thing up. And it's one step
and two seconds. And second, presence. And by that, I mean really
feeling like you're there. By building every part of the
device specifically for VR, we've been able to optimize
everything– the displays, the optics, the sensors– all to deliver a stronger
sense of being transported. And nothing
heightens the feeling of presence like
precise tracking– how the headset
tracks your movement. And we've dramatically improved
tracking with the technology that we call WorldSense. So WorldSense enables what's
known as positional tracking. And with it, your view
in the virtual world exactly matches your
movement in the real world. And it works by using
a handful of sensors on the device that look
out into your surroundings. And that means it
works anywhere. There's no setup.

There's no cameras to install. And with it, you really
feel like you're there. Now, just as we did with
Daydream-ready smartphones, we're taking a platform approach
with standalone headsets, working with partners to
build some great devices. To start, we worked
with Qualcomm to create a Daydream
standalone headset reference design, a sort of
device blueprint that partners can build from. And we're working closely
with two amazing consumer electronics companies to
build the first headsets. First, HTC, the company
that created the VIVE. [APPLAUSE] We're excited about it, too. [CHEERING AND APPLAUSE] They're a leader
in VR, and we're delighted to be working
with them on a standalone VR headset for Daydream. And second, Lenovo. We've been partners for years,
working together on Tango. And now, we're excited
to work with them on VR.

These devices will start to
come to market later this year. So that's the update on VR. Great momentum with apps,
more Daydream-ready phones on the way, and a new category
of devices that we think people are going to love. So let's turn to
augmented reality. And a lot of us were
introduced to the idea of AR last year with Pokemon GO. And the app gave
us a glimpse of AR, and it showed us
just how cool it can be to have digital
objects show up in our world. Well, we've been working
in this space since 2013 with Tango, a sensing
technology that enables devices to understand
space more like we do. Two years ago in 2015, we
released a developer kit. And last year, we shipped the
first consumer-ready Tango phone. And I'm excited to announce
that the second generation Tango phone, the ASUS ZenFone AR
will go on sale this summer.

Now, looking at the slides,
you may notice a trend. The devices are getting smaller. And you can imagine
far more devices having this capability in the future. It's been awesome to
see what developers have done with the technology. And one thing we've
seen clearly is that AR is most
powerful when it's tightly coupled to the real
world, and the more precisely, the better. That's why we've been
working with the Google Maps team on a service that
can give devices access to very precise location
information indoors. It's kind of like
GPS, but instead of talking to satellites
to figure out where it is, your phone looks for
distinct visual features in the environment, and it
triangulates with those.

So you have GPS. We call this VPS, Google's
Visual Positioning Service. And we think it's going
to be incredibly useful in a whole bunch of places. For example, imagine you're at
Lowe's, the home improvement store that has
basically everything. And if you've been there,
you know it's really big. And we've all had
that moment when you're struggling to find that
one, weird, random screwdriver thing. Well, imagine in the
future, your phone could just take you to
that exact screwdriver and point it out to
you on the shelf. Turns out we can
do this with VPS.

And let me show you how. And this is working today. So here we are walking
down an aisle at Lowe's. And the phone will find
these key visual feature points, which you can
see there in yellow. By comparing the feature points
against previously observed ones, those colorful
dots in the back, the phone can figure out exactly
where it is in space down to within a few centimeters. So GPS can get you to
the door, and then VPS can get you to the exact
item that you're looking for.
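(To make that idea concrete, here is a toy Java sketch of just the matching step. It is nothing like Google's production VPS, only the nearest-neighbor intuition: observed feature descriptors are compared against previously mapped landmarks whose positions are known, and the best matches anchor the phone in space.)

// Toy sketch of the core VPS idea (not Google's implementation):
// match an observed feature descriptor against a previously mapped set,
// then read position off the matched landmark's surveyed coordinates.
public class VpsSketch {
    // A mapped landmark: a descriptor vector plus its known 3D position.
    static class Landmark {
        final float[] descriptor;
        final float x, y, z;
        Landmark(float[] d, float x, float y, float z) {
            this.descriptor = d; this.x = x; this.y = y; this.z = z;
        }
    }

    static float distSq(float[] a, float[] b) {
        float s = 0;
        for (int i = 0; i < a.length; i++) { float d = a[i] - b[i]; s += d * d; }
        return s;
    }

    // Brute-force nearest neighbor; real systems use approximate search
    // over millions of descriptors, plus geometric verification.
    static Landmark match(float[] observed, Landmark[] map) {
        Landmark best = null;
        float bestDist = Float.MAX_VALUE;
        for (Landmark lm : map) {
            float d = distSq(observed, lm.descriptor);
            if (d < bestDist) { bestDist = d; best = lm; }
        }
        return best;
    }
}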

Further out– [APPLAUSE] Further out, imagine
what this technology could mean to people with
impaired vision, for example. VPS and an audio-based
interface could transform how they make
their way through the world. And it combines so many things
that Google is good at– mapping, computer vision,
distributed computing. And we think precise
location will be critical for
camera-based interfaces. So VPS will be one of the core
capabilities of Google Lens. We're really excited about
the possibilities here. So the last thing
I wanted to share is something that
we've been working on that brings many
of these capabilities together in a really
important area. And that's education. Two years ago, we
launched Expeditions, which is a tool for teachers
to take their classes on virtual reality field trips.

And 2 million
students have used it. Today, we're excited
to announce that we're adding a new capability
to Expeditions– AR mode, which enables kind
of the ultimate show-and-tell right in the classroom. If we could roll
the video, please. [VIDEO PLAYBACK] – All right, who wants
to see a volcano? 3, 2, 1. – Whoa! – Look at that lava. Look at that smoke
coming out of that. Pretend you're an airplane
and fly over the tornado. – That's the top of it. – What do you see? – It's either a
asteroid, meteorite– – We're learning
about DNA and genes– things that we can't see. And so, the most exciting thing
for me with the AR technology was that I could see
kids get an "aha" moment that I couldn't get by
just telling them about it. – The minute I saw it
pop up on the screen, it made me want to
get up and walk to it. – You actually get
to turn around and look at things from all angles, so
it gave us a nice perspective.

– See if you can
figure out what that might be based on what you know
about the respiratory system. – I got to see where the
alveoli branched off, and I could look inside them
and see how everything worked, which I never saw before. And it was really, really cool. [END PLAYBACK] CLAY BAVOR: We're just
delighted with the response we're seeing so far. And we'll be rolling this
out later in the year. So, VR and AR, two
different flavors of what you might call immersive
computing– computing that works more like we do. We think that's a big idea. And in time, we see VR
and AR changing how we work and play, live and learn.

And all that I
talked about here, these are just the first steps. But we can see where
all of this goes, and we're incredibly
excited about what's ahead. Thanks so much. Back to Sundar. [APPLAUSE] [VIDEO PLAYBACK] – We wanted to make machine
learning have an open source project so that everyone
outside of Google could use the same system
we're using inside Google. [MUSIC PLAYING] [END PLAYBACK] [APPLAUSE] SUNDAR PICHAI: It's incredible
[? with ?] any open source platform, when you see what
people can do on top of it. We're really excited about the
momentum behind TensorFlow. It's already the most popular
ML repository on GitHub. And we're going to
push it further. We are also announcing the
TensorFlow Research Cloud.

We are giving away 1,000 cloud TPUs, which is 180 petaflops of computing (1,000 devices at roughly 180 teraflops each), to academics and researchers for free, so that they can do more with it. I'm always amazed by the stories
I hear from developers when I meet them. I want to highlight
one young developer today, Abu Qader from Chicago. He has used TensorFlow to help
improve health for everyone. Let's take a look. [VIDEO PLAYBACK] [MUSIC PLAYING] [CHATTER] – My name is Abu. I am a high school student. 17 years old. My freshman year, I remember
Googling machine learning. I had no clue what it meant. That's a really cool
thing about the internet, is that someone's already doing
it, so you can just YouTube it, and [CLICK] it's right there. Within a minute, I really saw
what machine learning can do. It kind of like hit
something within me. This need to build
things to help people.

My parents are immigrants
from Afghanistan. It's not easy coming in. The only reason we made it
through some of the times that we did was because people
showed acts of kindness. Seeing that at an early
age was enough for me to understand that
helping people always comes back to you. [INAUDIBLE] – How are you? – And then it kind of hit me– a way where I could actually
genuinely help people. Mammograms are the cheapest
imaging format there is. It's the most accessible to
people all around the world. But one of the biggest problems
that we see in breast cancer is misdiagnosis. So I decided I
was going to build a system for early detection
of breast cancer tumors, that's successful to everyone,
and that's more accurate. How was I going to do it? Machine learning. The biggest, most extensive
resource that I've used, is this platform
called TensorFlow. And I've spent so
many hours going really deep into these
open source libraries and just figuring
out how it works. Eventually, I wrote
a whole system that can help radiologists
make their decisions. All right. – Ready? – Yeah. I'm by no means a wizard
at machine learning.

I'm completely self-taught. I'm in high school. I YouTubed and just
found my way through it. You don't know about
that kid in Brazil that might have a groundbreaking
idea, or that kid in Somalia. You don't know that
they have these ideas. But if you can open
source your tools, you can give them a
little bit of hope that they can actually conquer
what they're thinking of.

[END PLAYBACK] [CHEERING AND APPLAUSE] Abu started this as
a school project, and he's continued to
build it on his own. We are very, very fortunate
to have Abu and his family here with us today. [CHEERING AND APPLAUSE] Thank you for joining us. Enjoy I/O. We've been talking
about machine learning in terms of how it will power
new experiences and research. But it's also important we think
about how this technology can have an immediate
impact on people's lives by creating opportunities
for economic empowerment. 46% of US employers say
they faced talent shortages and have issues filling open job
positions, while job seekers may be looking for openings
right next door.

There is a big disconnect here. Just like we focused
our contributions to teachers and students
through Google for Education, we want to better connect
employers and job seekers through a new initiative,
Google for Jobs. Google for Jobs
is our commitment to use our products to
help people find work. It's a complex,
multifaceted problem, but we've been investing
a lot over the past year, and we have made
significant progress. Last November, we announced
the Cloud Jobs API. Think of it as the first
fully end-to-end, pre-trained, vertical machine learning
model through Google Cloud, which we give to employers– FedEx, Johnson & Johnson,
HealthSouth, CareerBuilder, and we're expanding to
many more employers. So in Johnson &
Johnson's career site, they found that applicants
were 18% more likely to apply to a job, suggesting the matching
is working more efficiently. And so far, over 4
and 1/2 million people have interacted with this API. But as we started
working on this, we realized the first
step for many people when they start looking for
a job is searching on Google.

So, it's like other Search challenges we have worked on in the past. So we built a new feature
in Search with a goal that no matter who you
are or what kind of job you are looking for, you can
find the job postings that are right for you. And as part of this
effort, we worked hard to include jobs across
experience and wage levels, including jobs that have
traditionally been much harder to search and classify– think retail jobs,
hospitality jobs, et cetera. To do this, well, we have
worked with many partners– LinkedIn, Monster, Facebook,
CareerBuilder, Glassdoor, and many more. So let's take a look
at how it works. Let's say you come to
Google and you start searching for retail jobs. And you're from Pittsburgh. We understand that. You can scroll down and click
into this immersive experience. And we immediately start showing
the most relevant jobs for you. And you can filter. You can choose Full-time. And as you can see, you
can drill down easily.

I want to look at jobs which are
posted in the past three days. So you can do that. Now, you're looking at retail
jobs in Pittsburgh, posted within the last three days. You can also filter
by job titles. It turns out employees
and employers use many different terminologies. For example, retail could
mean a store clerk, a sales representative, or a store manager. We use machine learning to cluster job titles automatically, so that we can bring up all the relevant jobs for you. As you scroll through, you will notice that we even show commute times. It turns out to be an important criterion for many people. And we'll soon add a
filter for that as well.
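(As a toy illustration of that clustering idea, my sketch, not Google's actual model: once related titles have been learned to belong to one cluster, expanding a user's query becomes a simple lookup.)

// Toy sketch of job-title clustering: related titles map to one canonical
// cluster, and a query expands to every title in its cluster.
// All mappings here are invented for illustration.
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TitleClusters {
    private static final Map<String, String> TITLE_TO_CLUSTER = new HashMap<>();
    private static final Map<String, List<String>> CLUSTER_TO_TITLES = new HashMap<>();
    static {
        List<String> retail =
                Arrays.asList("store clerk", "sales representative", "store manager");
        for (String title : retail) {
            TITLE_TO_CLUSTER.put(title, "retail");
        }
        TITLE_TO_CLUSTER.put("retail", "retail");
        CLUSTER_TO_TITLES.put("retail", retail);
    }

    // Expand a query to all titles in its learned cluster, if one is known.
    public static List<String> expand(String query) {
        String cluster = TITLE_TO_CLUSTER.get(query.toLowerCase());
        return cluster == null ? Arrays.asList(query) : CLUSTER_TO_TITLES.get(cluster);
    }

    public static void main(String[] args) {
        System.out.println(expand("retail")); // all clustered retail titles
    }
}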

And if you find something that's of interest to you– say, the retail position at Ross. You can click on it and go right to it. You can scroll to find more information if you want, and you're one click away from applying there. It's a powerful tool. We are addressing jobs of every
skill level and experience level. And we are committed to making
these tools work for everyone. As part of building
it, we literally talked to hundreds of people. So whether you are in a
community college looking for a barista job, a
teacher who is relocating across the country and you
want teaching jobs, or someone who is looking for
work in construction, the product should
do a great job of bringing that
information to you. We are rolling this out in
the US in the coming weeks, and then we are
going to expand it to more countries in the future.

I'm personally enthusiastic
for this initiative because it addresses
an important need and taps our core
capabilities as a company, from searching and
organizing information, to AI and machine learning. It's been a busy morning. We've talked about
this important shift from a mobile first
to an AI-first world. And we're driving it forward
across all our products and platforms so that all of you
can build powerful experiences for new users everywhere.

It will take all of
us working together to bring the benefits of
technology to everyone. I believe we are on the verge
of solving some of the most important problems we face. That's our hope. Let's do it together. Thanks for your time today,
and enjoy Google I/O. [APPLAUSE] [MUSIC PLAYING].


Google I/O 2010 – Keynote Day 2 Android Demo – Full Length

Ladies and gentlemen, please welcome Vice President of Engineering for Google, Vic Gundotra.

Well, good morning, everybody. You made it, even at this early hour. I hope you enjoyed that party last night. How about that spider, was that cool or what? After a few drinks, I thought that thing was going to chase me. A lot of fun. Let me also welcome the many thousands who are watching our live stream on YouTube. Yesterday, just for your information, we had over 24,000 people watching it concurrently live. So in addition to the five-thousand-plus folks we had here, we had almost 30,000 watch us yesterday. Welcome to everyone watching on YouTube.

To begin today's keynote, I'd like to start with a story: the story of my very first day on the job at Google. Now, I'm sure you've all been at a new job; you understand the apprehensiveness you might feel with a new office and new people. It was on that very first day that I met a man named Mr. Andy Rubin. Now, I suspect most of you know who Andy Rubin is. At the time, he was responsible for what was then a secret project codenamed Android, and on that first day Andy enthusiastically described to me the team's mission and purpose. As he spoke, I'll level with you, I was skeptical. In fact, I interrupted Andy and said, "Andy, I don't get it. Does the world really need another mobile operating system? Google is about advertising. Shouldn't we be on every phone?"

To this day, I remember Andy's response. He made two points. The first point Andy made was that it was critically important to provide a free mobile operating system, an open-source operating system that would enable innovation at every level of the stack. In other words, OEMs should be free to build all kinds of devices: devices with keyboards, without keyboards, with front-facing cameras, two inches, three inches, four inches. Operators should be able to compete on the strength and coverage of their networks: 2G, 3G, 4G, LTE, CDMA. And in the end, with innovation coming at every layer, it would be the consumer who would benefit, by getting the best device on the best network for them.

I remember Andy's second point. He argued that if Google did not act, we faced a draconian future: a future where one man, one company, one device, one carrier would be our only choice. That's a future we don't want. So if you believe in openness, if you believe in choice, if you believe in innovation from everyone, then welcome to Android.

Now let's get started. Let's talk a little bit about the momentum we've achieved in 18 months, the year and a half since we started with Android. How are we doing? Let's do a little bit of a report card. First of all, let's demonstrate some momentum. It's hard to believe that in only 18 months we've achieved over 60 compatible devices where your software, your applications, can run. And these devices are not just from, you know, people you haven't heard of. These are from the leading consumer electronics companies in the world: Sony Ericsson, HTC, Motorola, and many others who are producing devices that meet the needs of consumers. We think that's pretty fantastic progress in just 18 months.

Of course, it's not just the creation of the devices. It's the 21 OEMs in 48 countries, and the over 59 carriers, who've joined the Android revolution. And of course, producing devices and making them available across a multitude of countries and carriers doesn't necessarily mean we're going to see adoption. Have users found Android to be something they desire? Well, late last year we reported that we had reached a sales run rate, a daily activation rate, of over 30,000 units a day. In February, just a few months later, we announced that our daily run rate had reached 60,000 units a day. I'm very proud to announce today that our run rate of daily activations has now passed 100,000 a day. Go Android!

Of course, that momentum has led to some pretty significant milestones. One of the ones we're most proud of is that this quarter we are now second in the United States in smartphone sales, second only to RIM. That's pretty amazing progress in 18 months. We're second in smartphone sales, but according to AdMob data, we are now first this quarter in total web and app usage. That's fantastic. You know, we set a crazy internal milestone for ourselves. The phones are being used by consumers, but what are they being used for? When we shipped turn-by-turn navigation for Android six months ago, we thought it might be possible to have half a billion miles navigated in the first year. In hardly six months, we've now crossed a billion miles navigated with turn-by-turn navigation on Android. So users just love that feature. Thank you.
There are some who say that users don't use Google search on smartphones. Well, we're a company driven by data, not by opinions, and you know what the data shows? The data shows that we've seen 5x growth in the past two years. That's not just on Android; that's across all smartphone categories. People love Google search. You give them a great browser, and they do Google search. Tremendous, awesome usage of the web on these devices.

Of course, what they really love is applications, and today I'm happy to announce that we've crossed 50,000 applications in the Android Market. And really, the credit there goes to you. Thank you to the 180,000 developers who joined the Android revolution. It's your hard work, paired with the innovation coming from OEMs and carriers, that makes the mobile ecosystem work. We certainly couldn't have done this without you. Thank you, thank you, thank you for supporting Android.

Now let's talk about the platform. We've finished the section on momentum; let's talk about what we're doing to make the Android platform continue to evolve and get better. In this section I have over 20 demos, 22 demos I believe, and to help me with that I'd like to invite up onstage Matt Waddell, my partner in crime here. Many of you may remember Matt from last year, and we're going to go through a number of demos that really showcase what we're doing with Android.

There are five major areas of investment that we're making in the platform. Now, we've been quick to iterate with Android; in fact, there have been seven releases in those 18 months. And today we're announcing the next release: Android 2.2, codenamed Froyo. What's in Froyo? Let's talk about five pillars.

Number one, let's begin with speed. As you developers know, the Android architecture is built upon a virtual machine, the Dalvik virtual machine, and we think that's a critical design decision, one that future-proofs your application. We have big dreams for Android, and part of those dreams means Android will go to new places, with new chip architectures. By having your applications target the virtual machine, we believe we can carry the entire ecosystem to exciting new areas. Of course, that only works if the virtual machine is fast, and the Dalvik VM has done its job: fast, efficient, automatic, and easy for developers. But we can do even better, and we're very proud to announce that in Froyo we've added a just-in-time compiler, which gives up to a 2 to 5x speed-up of your apps on the exact same hardware.

Now, this is best demonstrated, so let's go to our first demo. We're going to show you a game that has been modified for the purposes of this demo; you guys may know Replica Island. Here's how we've modified it: the game will now show the frame rate it's rendering at in the bottom right-hand corner, and if the frame rate drops beneath 30 frames a second, the screen will flash red. Exact same hardware, exact same game. The top is running Froyo with JIT compilation; the bottom is running Eclair. We're going to introduce a crazy number of monsters into this game. You see the monsters keep getting added, increasing the complexity of the game, and you see the frame rate at the top. As we add complexity, it starts to slow down. You'll note there at the bottom there are times when we're dropping beneath 30 frames a second, and it's flashing red. You'll note at the top that with Froyo, exactly the same game runs much better, never dropping beneath 30 frames a second, all because of the JIT compiler.
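(A quick aside: the overlay in that demo is essentially a frame-rate counter. A rough reconstruction, not the demo's actual code, might look like this in Java.)

// Toy frame-rate counter like the demo overlay: track frames per second
// and flag when rendering drops below the 30 fps threshold.
public class FpsCounter {
    private static final double THRESHOLD_FPS = 30.0;
    private long windowStartNanos = System.nanoTime();
    private int framesInWindow = 0;
    private double currentFps = 0.0;

    // Call once per rendered frame.
    public void onFrame() {
        framesInWindow++;
        long now = System.nanoTime();
        long elapsed = now - windowStartNanos;
        if (elapsed >= 1_000_000_000L) { // recompute once per second
            currentFps = framesInWindow * 1e9 / elapsed;
            framesInWindow = 0;
            windowStartNanos = now;
        }
    }

    public double fps() { return currentFps; }

    // True when the screen should flash red, as in the demo.
    public boolean belowThreshold() { return currentFps > 0 && currentFps < THRESHOLD_FPS; }
}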
All right, let's go back to slides. Now, as Android adoption has skyrocketed, people have been taking these devices to work, and we've heard loud and clear from the enterprise that they need very specific features. In Froyo we introduced over 20 new features designed to meet the needs of the enterprise. Let me touch on two of them. Number one, we've become Microsoft Exchange friendly. That means, yes, thank you, that means things like auto-discovery, integration with the global address book, and the ability for the security policies available in Exchange to be enforced on the device. Number two, we've added new APIs for device management, so you can build software that does critically important things like remote wipe of the device if necessary. And there are many, many other things we've added that you'll see in the documentation.

Let's talk about new services available for developers in the SDK. One of the first services I want to talk about is the application data backup API. If you use Android and you've moved to another device, you know that Android will automatically back up your applications. In other words, you get a new device, you log in, and your applications come along with you. However, what Android has not done is back up the data associated with those applications. For example, I have a particular favorite application that helps me monitor my exercise and my weight, but when I move to a new Android device, the application moves and all my personal data and history does not. Starting with Froyo, we'll provide an application data backup API that ISVs can take advantage of to move the data along with the application. We think that's a great feature.

In addition to that, we have a brand-new API: a cloud-to-device messaging API. Now, let me be clear: this is not a push notification API designed to compensate for the lack of basic functionality, like multitasking, in the operating system. We've done something very clever that I think you're going to love. As a developer, you can send a message to our servers, which will do very smart things like collapse similar messages and optimize for the latency of mobile networks, and make sure that message gets down to the device. But that's only the first step. We've done deep integration with Android, such that when you send a message, that message can trigger an Android intent. Let me show you how powerful this is; let's go to a demo.

What Matt has on the screen here on his laptop is Google Maps. He's in Google Maps, he's using the Chrome browser, and he's added a Chrome extension, in the upper right-hand corner, that sends things to his phone. So in the map he's got directions, from the Moscone Center to Google headquarters, and he wants to send them to the phone. What do you think happens? Do you think we send a text message that says "this is the address"? Do you think we send an email? No. We send an Android intent. In other words, when he says "send to phone," the message gets sent to our server and pushed down to the device, and the device kicks into navigation mode automatically. Keep your eye on the Android device when he clicks Send to Phone. So: send it to the phone, and on the Android device, boom, right into navigation. How hot is that? That's how you do a cloud-to-device API.
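(For developers, the device side of that demo looks roughly like the sketch below. The broadcast action is standard C2DM, but the "destination" extra and the hand-off to Maps navigation are my assumptions; your server defines the actual payload.)

// Sketch of the device side of cloud-to-device messaging: receive the
// C2DM broadcast and convert the payload into an ordinary Android intent.
// The receiver must be registered in the manifest for the RECEIVE action.
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.net.Uri;

public class C2dmReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if ("com.google.android.c2dm.intent.RECEIVE".equals(intent.getAction())) {
            // "destination" is a hypothetical extra set by our own server.
            String destination = intent.getStringExtra("destination");
            if (destination != null) {
                // Hand off to turn-by-turn navigation, as in the demo.
                Intent navigate = new Intent(Intent.ACTION_VIEW,
                        Uri.parse("google.navigation:q=" + Uri.encode(destination)));
                navigate.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
                context.startActivity(navigate);
            }
        }
    }
}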
Let me show you another example. You're reading an article on the desktop. You think it's awesome, but you're out of time and you need to run, and you want to keep reading it on the Android phone. Why not send it to the device? Matt sends it to the device, the device opens up the browser and takes you right to that article, without your ever having to press any other keys. Isn't that great? We can't wait to see what you're going to do with this API. All right, let's go back to slides.

You're going to love this. I mean, come on: if you're like me, you have a plethora of devices you carry around with you, and all those devices shouldn't mean added complexity, and yet another bill, right? You should be able, at the platform level, to enable tethering. So now your Android device can in fact become a portable hotspot and serve the needs of those other devices you might have with you. In fact, let's show you a demo of this working. Matt has a Nexus One running Froyo. He'll go right into Tethering and Portable Hotspot, he'll enable the hotspot, and he'll give it a name; in fact, I think he's already named this one Android AP. And that hotspot will turn on. We'll give it a second... there we go, the tethering hotspot is now active. Now he'll go to another device that doesn't have connectivity. How about that iPad? And there you go. One bill. Isn't that beautiful? All right, let's go back to slides.

Now, do you know what the most popular thing people do with these smartphones is? What do you think it is? Fairly obvious: they actually use the phone. The number two thing they do is text messaging, and the third most used application is the browser. So it's critically important for us to make the Android browser rock, and we're going to constantly improve that browser. Froyo is a major step in that direction. What have we done in Froyo? Well, I think you're going to love this: we have a 2 to 3x performance improvement in the browser. How? We took the same JavaScript engine that makes Chrome so fast, V8, and we brought it into Froyo.

Now, the best way to show you how much of an advance we've made is to do a demo, so let's show you this fantastic performance boost. We're going to do this demo with three devices: Froyo, Eclair, and an iPad. Here's the demo. There's an industry-standard test, actually a suite of 26 tests, the SunSpider tests, that exercises JavaScript performance of all kinds. When Matt presses start, across these three devices we're going to run each of these 26 tests. As each test completes, that little Android robot will take one breaststroke forward, and when all 26 tests are complete, we'll have completed an entire lap. OK, so how are we going to do here? Let's go ahead and start this test and see what happens. Oh, you started the iPad first, trying to give it a little bit of a lead there, huh? I really wonder if we'll be able to get that in the App Store. Oh, it's a web app. How great is that? All right, let's go back to slides.

With the performance improvements we just showed you, we think we can claim that Froyo has the world's fastest mobile browser, and that's a pretty great accomplishment, one we're extremely proud of. But we're not done. I showed you what's in Froyo, but make no mistake about our commitment to maintaining leadership in the browser on Android. One of the ways we're doing that is by working with standards bodies to enable web developers to get capabilities that were formerly limited to native access to the platform. What kind of capabilities? Think about the things you can access on a device: the magnetometer, the accelerometer. Think about the camera, or being able to access speech. Wouldn't it be great if you could access those from the browser? We're going to show you a sneak peek at something beyond Froyo, an early development build that will give you a flavor of where we're going next. Let's go back to demos.

Remember, we already got some of this work underway last year. We worked with the standards committees to introduce geolocation into the browser, and today, on almost all major modern platforms, you can make an API call from the browser, get location with the user's permission, and then do a feature like My Location right within the browser. But we want to do more. How about the accelerometer? What about the magnetometer, access to the tilt and the direction of the device? Watch this. Isn't that great? Right within the browser.

Now, another key capability developers have told us they want in the browser is the camera. Here, for example, is the Buzz web application. We'd love to not only post a buzz but also include a picture. So how do you access the camera from the web browser? In this development build, we've built those APIs right into the browser, such that Matt can simply select... it looks like this demo is not going to work; we'll give it one shot and move on if it doesn't. But you can access that camera capability right from within the browser, and we'll show you that in the sandbox later.

Let's talk about another capability. Before we do, let me remind you of what's possible in Android. Google, starting several years ago, made a deep investment in voice recognition. We recognized that the mobile device, because of its limited input capability, would be the platform on which people use voice input more than any other, and in fact we see a stunning number of queries on mobile devices where the input is a human voice. As a reminder of how fantastic this capability has become over the past two years, let me do a few demos; or, more accurately, Matt, why don't you highlight a few cases? "Pictures of Barack Obama with the French president at the G8 summit." That's a tough one. Boom, look at that. Because this is so fun, let's do a few more. "Pictures of the Golden Gate Bridge at sunset." These are queries you're likely not to type in, let alone get a response back on in a few seconds. Isn't that great? And one more: "Del Dotto Vineyards, Napa." My favorite vineyard in Napa. Boom.

What Matt just showed you was phenomenal voice recognition that we're now shipping in Mandarin, Japanese, English, a number of languages, and that list is constantly increasing. But what's coming next is the ability to understand human intentions. We'll give you a very simple sneak peek at the kinds of things we're going to enable: "Call Fifth Floor Restaurant." In this case, because he said "call," it triggered the action. And there are many, many more intents we're going to build in, to make it very, very simple for you to use voice input as a first-class way to interact with your Android device.
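(Apps can tap this same plumbing through the standard RecognizerIntent API. A minimal sketch follows; the naive "call" handling is my own illustration and far simpler than the intent system described above.)

// Sketch: launch Android's speech recognizer, then act on a "call ..."
// command by opening the dialer. Contact lookup is omitted; the demo's
// intent handling is far richer than this.
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import android.speech.RecognizerIntent;
import java.util.ArrayList;

public class VoiceActionActivity extends Activity {
    private static final int REQUEST_SPEECH = 1;

    private void startListening() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        startActivityForResult(intent, REQUEST_SPEECH);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_SPEECH && resultCode == RESULT_OK) {
            ArrayList<String> results =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            if (results != null && !results.isEmpty()
                    && results.get(0).toLowerCase().startsWith("call ")) {
                String target = results.get(0).substring(5);
                // Open the dialer; a real app would resolve the name to a number.
                startActivity(new Intent(Intent.ACTION_DIAL,
                        Uri.parse("tel:" + Uri.encode(target))));
            }
        }
    }
}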
"Hey, Vic, let's give this one a try." Oh, you got it? OK, let's go back to the camera and see if this thing works, eh? This is a web app, the Buzz web app, and there's the camera, from the browser. Boom, right into the camera. Isn't that great? Now you can post a picture, and web developers get the same type of capabilities you'd expect in a native application. Glad that worked; no need for you guys to visit the sandbox.

All right, now, getting back to the speech demonstration. What we showed you was capability we make available to developers. But what about accessing that capability from the browser? Here's a web app: a Translate app from Google, with one new feature. You'll note the microphone right there at the top. Matt, if you could just tap the microphone, and watch what Matt does: "Can you help me find the nearest hospital?" [HINDI] Awesome. Awesome. All right, let's go back to slides.

Now, we're not only committed to... you can clap, I'll let you clap. You know, I love it when I don't even have to speak to a slide and we just get applause. But let me make a point here. We're not only committed to having the world's fastest browser; we're committed to having the world's most comprehensive browser. It turns out that on the internet, people use Flash, and part of being open means being inclusive rather than exclusive, and open to innovation. This was driven home to me very powerfully when my daughter picked up my iPad and went to her favorite website, Nickelodeon. This is what she saw on the iPad; can we switch to the iPad? A sea of orange. She said, "Daddy, can I play with your Android device?" And this is what she saw: the full Nickelodeon site. Isn't that great? That's what openness means. And by the way, a special thanks to Adobe for their incredible willingness to work with us and engage with us on Android, on Chrome, and in many other areas. It's really fun to work with other folks in the ecosystem to meet the needs of users; much nicer than just saying no. All right, let's go back to slides.

We've also made significant improvements in the Android Market. We listened carefully to the feedback we've gotten from developers and from users, and I think you're going to like the enhancements. On average, our data shows that users are installing more than 40 apps on their Android device, so finding those apps is a challenge. Users also want to search within the data of those apps, they want to move apps to the SD card, and they want to update everything without having to update each individual app themselves. Let me show you, through a series of demos, what we've done here. Let's go to demos.

The first demonstration is simply searching for apps. With the quick search box, we've made it trivially easy to scope a search to apps: as soon as you start typing, why, the app you're looking for just comes up. It's a simple and easy way to find apps. And not only can you find apps, we've improved the quick search box so developers can plug into the search framework. In this case, that icon is for Mint, a financial application, and now you're searching within the data of that application, bringing up your financial records. We think developers are going to extend this in all sorts of exciting ways, making not only Android more usable but their own applications more discoverable and more fun and engaging for the user.
Another issue we've heard is that people want to take advantage of the openness of Android, the ability to plug in an SD card with arbitrary amounts of memory, and they want to move apps not just within internal memory but off to the SD card. Two points. We've enabled that capability in a secure way with Froyo, and we've also made it so the user never has to worry about it: when they install an application, we'll intelligently look at the available space and, if appropriate, move the app to the SD card. If the user wants to get involved and manually move things around, that's what this demonstration shows. Matt is going to take a brand-new game, Need for Speed, and shift it from internal memory onto the SD card. And there you go: the whole app, 50 megabytes, is being moved over. Why don't we just start the game, so you can see it runs from the SD card once you launch it. Isn't that awesome? A great new game, Need for Speed, that's coming, and it can live on your SD card. OK, let's move on. Matt, we've got to go; I know the game's fun.

All right, let's talk about update-all functionality. Matt, that's you; he's easily distracted. Today, you have to update each application individually. In Froyo, we've made it simple: at the bottom of the Android Market, you see the Update All button. I'm kind of embarrassed you have to clap for that, but we're glad it's there. And we've gone one step further with Froyo: why should the user have to take any action at all? Starting with Froyo, with the user's permission, you can allow automatic updating, and all your apps are updated all the time. Isn't that great? The user doesn't have to worry about this at all. Let's go back to slides.

Another key feature we're adding in Froyo is designed to meet the needs of developers. We want the best apps on Android, the highest quality apps, and that means we need to close the loop when there's a problem. Let's go back to a demonstration. I'm going to show you an app called "Crash E"; it does exactly one thing really well: it crashes. Matt, why don't you start the app... and it crashed. Note that there's now a button called Report. The user can press Report, provide a description of what happened, look at all the data they're about to send to make sure they're comfortable with it, and then hit Send, and that report is sent back to the developer. Note, on this screen of the Android Market, a brand-new tab: Bugs. When you click on Bugs, we show you those reports for your application, and we show you the entire stack trace of what happened. We thought you might like that: closing the loop, allowing apps to get better and better.
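(The Market's Report button does the collection for users, but the core idea, capturing the stack trace at crash time, can be sketched in plain Java. This is my illustration, not the Market's actual mechanism.)

// Sketch: capture an uncaught exception's stack trace so it can be
// attached to a crash report like the one shown in the demo.
import java.io.PrintWriter;
import java.io.StringWriter;

public class CrashCapture {
    public static void install() {
        Thread.UncaughtExceptionHandler previous =
                Thread.getDefaultUncaughtExceptionHandler();
        Thread.setDefaultUncaughtExceptionHandler((thread, throwable) -> {
            StringWriter sw = new StringWriter();
            throwable.printStackTrace(new PrintWriter(sw));
            String report = "Crash on thread " + thread.getName() + ":\n" + sw;
            // In a real app you would persist or upload this report here.
            System.err.println(report);
            // Preserve default behavior (e.g., the crash dialog).
            if (previous != null) previous.uncaughtException(thread, throwable);
        });
    }
}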
All right, let's go back to slides. I want to go a little bit beyond Froyo and show you a sneak peek at other exciting things we're doing in the Android Market. I think you're going to like some of these. Let's go back to demos. Matt is now showing you the Android Market, except this is unlike anything you've seen before: it's the Android Market, accessible from a browser on a PC. Matt can browse apps and look at reviews, but you'll note, in the upper right-hand corner, that Matt is signed in, and because he's signed in, we know every Android device he has. Yeah, he's a little weird; in this case, the device he has on stage he has named Nutter Butter, nutty-buddy, something, whatever. Like I said, he's weird. He's got his phone selected, so now he can go browse a particular app. He's going to find an app that he likes, like OpenTable, and... hold on, Matt, hold on, hold on... he's going to hit Free, and he's going to download the app.

What do you think happens? Well, let's see what happens today on some other systems. You find the app, you download it to your PC or Mac, you then have to tether your device, and once the app is down on your PC, you have to copy it over that tether to your device and make that sync happen. Well, guess what? We discovered something really cool. It's called the Internet. When Matt presses Download, keep your eye on the Android device: over the air. Isn't that totally great? Installed over the air, right onto your device.

Now, it turns out it's called the Market because it's more than apps. In fact, what we're about to show you is a new category: how about some music? Matt, why don't you find an album or song you like, and just send it to the device, using the Internet, over the air. Boom; keep your eye on the Android device. There you go, there's the music coming right on down.

We're not done yet. Some of you may be thinking, "This is really cool, but I already have all my music on my device at home, in my Windows Media or iTunes library. I've got thousands of songs there. What do I do with those?" Well, we recently acquired a company called Simplify Media, and it's our intention to incorporate that technology into the Market as well. It allows you, at home, to run a simple piece of software that makes all of your non-DRM music available to your Android device. Would you like to see this? Let's go to your Android device. The Android device has two songs on it. That's nice, but I want more than two songs; I want all of my music. So Matt will go down and select... I guess he has to plug in the device... ["Thank you for calling the Fifth Floor Restaurant..."] Oh my God, let me offer a public apology. What's really odd is that they didn't pick up all that time. Wow. OK, getting back to music. Two songs, and I want all my music. He's going to select that button that says All, and magically, all the music is now available. You might be wondering what just happened: we are making your home music library available to your Android device as a stream. For example, if Matt selects a particular song, I guess Matt's an Earth, Wind & Fire fan, and hits Play... did this work? Yeah. You're going to love this feature. All right, let's go back to slides.

Let's talk a little bit about advertising. It turns out we know a little bit about advertising; this year happens to be Google's 10th anniversary of providing advertising solutions, and we've learned a few things in that decade. One of the foremost things we've learned is that if you want a healthy advertising ecosystem, you need advertisers, and we have hundreds of thousands of them. That means if you have an ad spot, why, we have a relevant ad that can fill it; it's called inventory. We're not new at this game, and we're not working with a handful of partners and charging them a million dollars each to be part of our program. So we can be your advertising partner. We've also learned that advertisers and publishers have different kinds of needs. There are local advertisers, direct-response advertisers, and brand advertisers, and if you want to play, you need formats that meet the needs of all those kinds of advertisers. We have those formats. A third thing we've learned: advertising needs to be measured, and that means it needs great tools.
Why does advertising need to be measured? Well, you need to know if the money you're spending is giving you a return on the investment. You need to know which campaigns are working, so you can invest more in them or tweak them as necessary. And it turns out we have some tools you might have heard of: DoubleClick, Analytics, AdSense, AdWords. The tools the industry uses, knows, and loves are being extended seamlessly to work in the mobile environment. And finally, we've learned that it's important to be open to innovation. Now, I don't even know what the ponies have to do with this slide, but part of being open is that when a designer gives you a slide, you go with it. Let's show you each of these four points; let's go to demos.

These are real advertisers and real applications; let me make that point clear. You're looking at Backgrounds, one of the most popular applications in the Android Market. At the top, you're looking at a Google ad. Our ad program, which has been available only to a limited number of beta participants, is called AdSense for Mobile Apps, AFMA, and you're looking at one of those AFMA ads: our large inventory of text ads, optimized and served on that device. We're taking advantage of the small space we have, and you're seeing the nice rotation. But that's just one ad format. What happens if Matt selects something like "beaches"? Then we can serve a contextually relevant ad. In this case, Kayak is a popular website and application for travel, and when you click on the ad, we take you deep within the Android Market so you can download the app. That's another important ad format.

Let's show you a third format, for brand advertising: a banner ad, where an advertiser has decided to convey their message through an image. We support that. How about another category? This is something we're announcing as available today: an expandable ad format. One thing we've seen in our research is that users, if at all possible, would like to stay within the context of the application they're using. So in this case, when Matt selects that Volkswagen ad, it expands right out. You can look at the ad, and when he's done with it, he can simply slide it right back in. We think that's great for advertisers, and we think that's great for users. That expandable format is available today.

Let me show you another brand-new one we're making available today. This is an expandable ad format in a Flixster application, and, why don't you expand that ad out, in this case it's not just display, it's rich media. He's linking directly to a trailer; he clicked on it, and the trailer, OK, the movie will go ahead and start. Matt, don't watch the movie; let's come back.

Let me show you another ad format. Sorry, in this case I jumped ahead of you; let's go back to the other one. This is a very popular ad format that has been testing incredibly well: a click-to-call ad. Here we're leveraging the fact that the user has given us permission to use their location. We know they're in San Francisco, DirecTV would like to make a special offer to customers in San Francisco, and the ad has a click-to-call option. Simply tapping that button on the ad lets the customer make a phone call to DIRECTV, and that's incredibly valuable.

Let me talk about one more ad format, an expandable format with a very clever twist. Matt, why don't you expand this one out: it includes a map, and directions, and click-to-call. Isn't that great? That's a new format that will become available shortly as well.

Finally, let me show you what I meant when I said open to innovation. This is the weather.com application, and they're using the industry-standard DoubleClick software to serve this ad from Google; the ad is at the bottom. When Matt clicks on that ad, you get a full-screen immersive ad, and it's a pretty amazing one: it's got trailers and TV spots and galleries. Not only does it have a bunch of media you can interact with; look at the bottom there. The bottom has the ability for you to tap directly into Fandango to purchase movie tickets. But you know what the kicker is? It's not a Google ad. You're seeing an immersive ad format delivered by a great advertising company called Medialets. Because the ISV, the developer, used DoubleClick to serve the ads, DoubleClick is open to having any ads show up. And that's what it means to be open: we serve the most relevant ad to the user, sometimes from Google, sometimes not, but you, the developer and the publisher, have the choice. That's openness. All right, let's go back to slides.

If you'd like to learn more about our advertising solutions, we encourage you to go to google.com mobile ads. And to entice your interest a little bit, we're doing two things today. Number one, we're opening up AFMA, AdSense for Mobile Apps, to everybody at the conference, all 5,000 of you. And we're going to sweeten the deal a little bit: if you sign up, we'll give you $100 of free advertising credit, to start learning the system and advertising your apps in the Android Market. So please go try that.

One of the most powerful ways to demonstrate what I talked about at the beginning, that innovation comes from all levels of the stack, is to highlight one of my favorite devices. This is a device by HTC: the EVO. There are several things I love about this device. One is that it's absolutely gorgeous, with a 4.3-inch screen. I also love how fast it is, and the great work HTC has done to add value. I love the Sprint network; it's a 4G network, and you know what a 4G network can do? It can give you a peak of up to 10 megabits per second. Imagine what you can watch on this device at 10 megabits per second. I love that this device has a great little stand, so you can set it on a desk, or, if you're on an airplane, set it right there and watch your content. It's got a great battery, HDMI output, and an 8-megapixel camera that will do 720p HD video recording. And you know what I love most about this device? In partnership with Sprint and HTC, Google is going to make this device available to every one of you here today. Oh, but those of you watching on YouTube, I'm sorry; remember to register next year, quickly. I think you're going to love that device. We ask you to do two things. Number one, don't run out right now; after the keynotes are over, we're going to make it available all day, until 6 p.m.

So you'll have plenty of time to get this device and really see the latest in Android innovation. The number two thing we're going to ask: thank you for supporting Android, thank you for voting on the side of openness and choice, and please keep building those great apps. Now, we're at the halfway point this morning, and we're not done yet. If you'd like to see the next step in the evolution of Android, and where we're about to go next, hang tight.


Wireless MobilePhone MonoPod Model:Z07-5!

Cube here. It's a new product today: a wireless mobile phone monopod, to take selfies. Bluetooth, of course. And this is nice: you can be more creative and take better pictures when you're on an outing, that's for sure. So this is what comes in the box. Here's the button, the Bluetooth shutter for taking the picture, and here's the port for charging, because there's a battery inside, as you can see. OK, we can turn it on like this. There, the blue light. Yes, a blue light. There's my iPhone, and we turn on Bluetooth on the phone. Here it finds the set: Z07-5. Now it's paired. OK, so now I can mount the phone on the selfie stick, like this. Yeah, like this. There we are, something like that. First I start the camera, and look, this extends, so you can hold it much, much longer than this. See, here's the button. So if we want to take a photo of that box over there, I just touch this Bluetooth button, and it takes the picture. We can look... look at that. OK, now take a picture of that scene over there, very near. Now we can look. See how cool that is? In many situations this is fantastic. This selfie stick can be extended this long, you can turn it around to every different angle, and it's Bluetooth. So that's what comes in this box: a wireless mobile phone monopod, to photograph and video yourself. Wonderful everywhere. Model Z07-5. Cool.


Samsung Galaxy S 2 Unboxing | Pocketnow

Hey guys, it's Brandon Miniman from pocketnow.com. 2010 was the year of the Samsung Galaxy S: we found it on every major carrier, it was a hot device, and it had great battery life, great performance, and a really thin form factor. In 2011, Samsung is back with their new flagship, the Galaxy S 2. This phone ups the ante in every respect: it's faster, it's thinner, and it has a bigger screen. In this video we're going to unbox the Galaxy S 2. Let's get to it.

OK, so here we go. There are a lot of improvements to talk about between the Galaxy S and the Galaxy S 2. We've got the new Exynos processor (I'm not sure if I'm saying that right), the 1.2 gigahertz dual-core Samsung proprietary processor that goes right up against the 1.2 gigahertz Snapdragon coming out on the HTC Sensation. I've got some extra packaging here, and thanks to our friends at Clove UK for sending us the unit to review. Right now this phone is 440 British pounds, which is about 722 dollars. Yes, it's very expensive, but hey, it'll actually work on the US AT&T frequencies, so that's an added little bonus there. Plus, this is a device that just came out, so you've got to expect the price to be very high, and it's very likely that this phone will see release on all major US and international carriers, so you can keep an eye out for lower subsidized prices.

So here it is, a small little box: Galaxy S 2, 1.2 gigahertz. This is incredible. The previous Galaxy S had a single-core 1 gigahertz processor, and now we're going to 1.2 gigahertz with two cores. This is going to be awesome. Let's take a look at what we have on the back here: dual-core, an 8-megapixel camera that shoots 1080p video (and can also play it back in 1080p if you connect it to a television), DLNA support, and a huge 1,650 milliamp-hour battery. The interesting thing about that is that this device is the thinnest smartphone you can buy right now; it's a little bit over 8 millimeters thick. It's got the Super AMOLED Plus screen; you can tell I'm excited about this. Samsung debuted Super AMOLED in the Galaxy S devices, and now they're going with the Plus, and they also made the screen bigger. This screen is going to be awe-inspiring. The only problem with it is that the screen is WVGA resolution, not qHD like all the other higher-end devices are coming with, but maybe the Super AMOLED Plus will be so nice that we won't even care. It comes with 16 gigs of internal memory, we've got Wi-Fi a/b/g/n, Android 2.3... you get the point.

Let's break the seal here, there's no turning back now, and open up this little tiny box. Wow. Social Hub Premium, GPS... OK, let's just see what comes in the box. This thing is remarkably thin, and I wish you could actually be here holding it. I just have to stop and comment on this: I cannot believe how thin this phone is. It's so powerful, it's got such a big screen, and yet it is thinner than any phone I've ever held. Wow. This is unbelievable; it's pencil thin. OK, let's see what else is in the box before I go crazy about the thinness of the phone. We've got headphones, which is nice, kind of nice-looking headphones with little chrome pieces there. We've got a micro USB charger, of course with the European plug, but your standard plugs will work with it if you plan to use this in the US. Extra earbuds, and here's the quite large 1,650 milliamp-hour battery, which should deliver pretty impressive battery life. What we're going to do is put the battery in, and I'm going to get an AT&T SIM card and plug it in immediately, so we can do the first-time boot-up on the beautifully thin Samsung Galaxy S 2.

Let's just take a look at the front. We're getting a lot of reflection here, so I'll turn off some of the lights, but what's interesting is that you can't see any buttons when the phone's off, kind of like the Nexus S. We've got the back button here, and we've got the home button, without a home logo on it, so it's a very sleek-looking design. Ooh, it looks too much like an iPhone; Apple might get mad about that, but maybe not. And we've got the menu button here on the left, and a front-facing camera. Let's get a SIM card in here, let's get the battery in, and we'll be right back.

All right, SIM card's in, battery's in. This thing is very, very light, even with the SIM card, and usually the problem with light is that it feels cheap. This phone does not feel cheap. OK, so let's try to pop on the back battery cover. By the way, back here we've got this really interesting textured plastic; maybe you can see it in the light. It's got these raised little bumps that give it a good in-hand feel. We've got sort of a reverse chin here (usually the chin occurs on the front), a speaker here (it doesn't look like it has dual speakers), something to connect a lanyard loop to, a 3.5-millimeter headphone jack, and a secondary mic for noise cancellation. It is just amazing how thin this thing is. The power/standby button is on the left; that's a little different from what Samsung had been doing, since usually they put the button on the top. So let's press and hold that, and while it's booting up, let's compare it to some other devices.

Here we've got the HTC Inspire 4G. It's got a little bit of fingerprints on it, but it didn't a second ago, which is kind of weird. The Inspire 4G was one of the thinnest phones to come out; it's basically the Desire HD, which was HTC's flagship of last year. So let's see how they compare in terms of thickness. There's no contest here: the Desire HD is about 12 millimeters thick, which is quite thin, but again, we're at under 9 millimeters on the Galaxy S 2. It's just amazing how much thinner it is. So, do we have a network? Does it provide date and time? OK, great. There we go, it just turned to 3G, so we're getting 3G network connectivity here on AT&T. We'll zoom in a little bit. A lot of options here; automatic, and we'll go through this later. There seems to be no confirm button, so I'll hit the back button, and I'll do change language: the United States. It's really cool to see the Super AMOLED Plus screen on such a large display. All of the Galaxy S phones came with a 4-inch display, so it's very interesting to see such incredible contrast over a larger display. Let's see if we can get out of here; I guess we're going to have to go through this. OK, we'll do all of this in a little bit and show you what it's like.

Again, let's bring out some other devices to compare with before we actually do that. Over here is the iPhone 4, which is starting to look kind of small in comparison to these devices with big, bright 4.3-inch displays. The front-facing camera looks a lot bigger here on the Galaxy S 2. In terms of thickness, the Galaxy S 2 is thinner than the iPhone 4, which is a tremendous feat, considering the iPhone 4, like the Desire HD, is one of the thinnest devices out there. Also, next to another device that has a 4.3-inch screen, the big daddy, the HTC Thunderbolt: of course, the Galaxy S 2 does not have 4G, although it's very likely that in 2011 and 2012 we're going to see versions of the Galaxy S 2 come out with LTE and HSPA+
first time boot up on the beautifully thin Samsung Galaxy S 2 and let's just take a look at the front we're getting a lot of reflection from here so off tak turn off some of the lights but what's interesting is that you can't see any buttons when the phone's off kind of like the Nexus S we've got the back button here we've got the home button without a home logo on it so kind of a very sleek looking design oh it looks too much like an iPhone Apple might get mad by that but maybe not and we've got the menu button here on the left front-facing camera let's get a SIM card in here let's get the battery in we'll be right back all right SIM cards in battery is in this thing is very very light even with the SIM card and usually the problem with light is it feels cheap this phone does not feel cheap okay so let's try to pop on the back battery cover by the way back here we've got this really interesting textured plastic maybe you can see it in the light it's got these raised little bumps that give it a good in hand feel we've got sort of a reverse chin here usually the chin occurs on the front speaker here doesn't look like it as dual speakers we've got a you know something to connect a lanyard loop to we've got a 3.5 millimeter headphone jack secondary mic for noise cancellation it is just amazing how thin this thing is power standby button on the left that's a little bit different than Samsung had been doing usually they put the button on the top so let's press and hold that and while that's booting up let's compare that to some other devices so here we've got the htc inspire 4G and it's got a little bit of fingerprints on here but it did in a second ago kind of weird so the inspire 4G was one of the thinnest phones to come out it's basically the Desire HD which was HTC's flagship of last year and let's see how they compare in terms of thickness there's no contest here the Desire HD is about 12 millimeters thick which is quite thin but again we're going at under 9 millimeters on the galaxy s 2 so it's just it's amazing how much thinner it is so here we have a network does it provide dayton time ok great it's kind of up there we go just turn to 3g so we're getting 3g network connectivity here on AT&T we'll zoom in a little bit a lot of options here automatic will go through this later there seems to be no confirm button so if I hit the back button and I'll do change language the United States it's really cool to see the Super AMOLED plus screen on such a large display all of the Galaxy S phones came with a 4-inch display so it's very interesting to see such incredible contrast over a large display let's see if we can kind of get out here guess we're gonna have to go through this okay we'll do all of this in a little bit and we'll show you what it's like again let's let's bring out some other devices to compare with before we actually do that turn that off over here is the iPhone 4 which is starting to look kind of small in comparison to these devices that have these big bright 4.3 inch displays so we've got the front facing camera al that looks a lot bigger here on the galaxy s 2 in terms of thickness the galaxy s 2 is thinner than the iPhone 4 which is a tremendous feat considering the iPhone 4 like the Desire HD is one of the thinnest devices out there also next to another device that has a 4.3 inch screen the Big Daddy the HTC Thunderbolt of course the galaxy s2 does not have 4G although it's very likely in 2011 and 2012 we're going to see versions of the galaxy s2 come out with LTE and HSPA+ 
and WiMAX and all of those good radio technologies all right so I put in the Google account information and what we have here is the new version of TouchWiz TouchWiz 4 which is Samsung's proprietary interface now a lot of people after they got the galaxy s devices put on a third-party launcher like launcher pro or ADW launcher because the galaxy s devices had this weird TouchWiz interface it wasn't that fast it wasn't that usable but it looks like Samsung has really gone above and beyond with their new interface the program tray looks about the same but they've got these new widgets that kind of stick together we're obviously gonna have to spend some time with this and we're gonna have a full video that talks all about what it's like to use TouchWiz 4 and here's something cool move device left or right while holding a selected icon to reposition to another page that is pretty cool so if we tap and hold and we tilt apparently if there's a feature that makes it a lot easier well that looks awesome there's a feature that makes it a lot easier to move widgets from homescreen to homescreen right now we're getting HSDPA on AT&T we should be getting faster data speeds than you can get onto the Atrix 4G and the inspire 4G because those devices don't seem to even advantage of HSDPA speeds so this might be a killer device to get on AT&T if you don't want to wait it will work on t-mobile but you won't get the 3G connectivity we've got a ton more coming up on the Samsung Galaxy S 2 it's really exciting for us to test one of the flagship devices of 2011 HTC's coming out of the gate with a sensation in the near future and Samsung is coming out of the gate with the Galle galaxy s 2 Super AMOLED plus screen 1.2 gigahertz dual-core processor it's pretty awesome if you liked this video please give us a thumbs up and if you think I was too enthusiastic about this phone please give us a thumbs up and thanks for watching we'll be back soon with more that's it for now


Introducing the OnePlus 7 Pro

What does it take to be truly innovative? It all starts before you wake.
Introducing the OnePlus 7 Pro.
We set out to create a fully captivating experience,
with powerful Dolby Atmos sound from top to bottom
and an immersive display that redefines the term bezel-less on mobile devices.
Couple that with a 90Hz refresh rate, and you can scroll smoothly through your favorite apps
or play the most intense mobile games available with no compromises.
That's because we put more computing power into this device than ever before.
Now you can move seamlessly between apps with software that anticipates your next move.
Our adaptive brightness feature gives your eyes the least strenuous experience possible.
No matter the shot or scene, the OnePlus 7 Pro is up to the task.
That's because we've added three rear camera lenses
to make sure you're capable of catching every moment.
And, of course, we didn't forget the selfie camera.
Power through your day and night worry-free,
because with Warp Charge you're good to go in 20 minutes.
That means your phone is ready before you are.
And the camera capabilities don't diminish when the sun goes down:
thanks to our Nightscape feature, you'll be able to take gorgeous photographs no matter the time of day.
That's the OnePlus 7 Pro. Our most advanced phone ever.


Sony Xperia 1 review: a tall order

The thing that I like about Android is there can be so many
different kinds of phones, just a lot of weird choices. But lately, it seems like
there are really only two choices, at least in terms of screen size. There's regular and there's extra large, which is why I was so excited
to try this phone right here, the Sony Xperia 1. I mean, just look at this good tall boy. It's got a 21:9 aspect ratio, which makes it relatively
narrow and super, well, tall. I think it's a fascinating phone, and it's way nicer than
I expected it to be. But I don't think it can
really justify its $950 price. Let me tell you why. Now, a lot of people would
like to have a big-screened phone, but they're put off by
how big these phones feel.

And that's the reason I like the Xperia 1. It has a big screen, there
is no doubt about it. It's 6.5 inches. But it's quite a bit narrower
than this OnePlus 7 Pro here. So you get the benefits of
seeing more stuff on your screen, like on the web or on Twitter, without the drawback of
feeling like you have a big honking glass slab you can barely wrap your fingers around. This phone is also really good if you like to do split-screen apps, which… I don’t know, I guess
people still do that. I never do. Anyway, it's nicer to hold, but that doesn't make
it a one-handed phone by any stretch of the imagination.

You're still going to need
to use your second hand to reach the top of the screen. Sony has a couple of
software tricks that help with how tall this phone is, but neither of them are great. You can double tap the home button to make a smaller version of the screen. Or there's this other thing with… Er, wow. What are you
doing there, Chuckles? Huh. Sorry, let's back up. Or you're supposed to be able to tap either side of the screen or swipe on it to do other stuff. It's called Side Sense,
and it kind of sucks. I can never get it to
work when I want it to, and it pops up all of the time when I don't want it to.

Now the reason that Sony
says it made this phone at this weird, tall aspect ratio is for watching movies, and Sony says that it
has a 4K HDR OLED screen. It also has, quote, “professional
level color reproduction.” So it can be in the DCI-P3 color gamut. It can also be in the BT.2020 color space if you care about that. And it has the D65 white point. There's this whole “Creator Mode” thing. Basically, Sony is
trying to make this phone appeal to people who really
care about video quality, both watching it and recording it. But Sony, the thing is, if you're going to do that, this screen should get way brighter. It is way dimmer than
other OLED screens. Anyway, yeah, I will say
watching a 21:9 movie on this phone with its Dolby
stuff, without letterboxing or weird camera cutouts, is great. But the truth is that most
of the video that I watch is not 21:9.

It’s stuff on YouTube, and so I still end up
having big black bars on the left or the right. Or, if I expand it full screen, I end up cutting off people's heads. Now, I do think this phone is pretty good from a build quality perspective. It's got Gorilla Glass and IP68. It's got some bezels,
but they're not too big, and it's just nice to hold. But, you know, of course
there's no headphone jack. But there's no getting around how it being this tall
makes it really awkward. It's so tall, it doesn't really
fit in my pocket. I was sitting down, and it just slid right out of my pocket and clattered on the concrete, which is why there are dings on the edges of the
phone on our review unit, which is sad. The buttons are also awkward.

They're all on the right side of the phone and, I don’t know, the fingerprint sensor is separate from the power
button for some reason. And sometimes it gets a little dirty and you have to wipe it off
before it will actually work. I do like that there is
a dedicated camera button. But overall, when I'm
trying to use this phone, I just end up hitting the wrong button, like, all of the time.

On the back, there are
three 12-megapixel cameras. There's a regular, a 2X
telephoto, and a wide. Sony put some nice optical
image stabilization on the main lens, and you know what? Finally, Sony has made
a phone with a camera that's pretty good. It's not quite as good as a Pixel 3 or a OnePlus 7 Pro to my eyes, but it's finally respectable.

I do wish that the
telephoto was more than 2X, but the wide angle one, it's really fun. I kind of love it. But I don't love Sony's camera software. The wide angle thing makes you pick between prioritizing image
quality or correcting distortion. The auto mode doesn't do HDR by default, and there's just a bunch of other settings that really look
and feel kind of silly. Anyway, let's get into the results of what I actually get
out of these lenses. I think that Sony
prefers leaving detail in, even though that also
leaves in a bunch of noise. It also doesn't do HDR as
aggressively as I would like unless you, you
know, manually turn it on.

But the thing that did surprise me is that even though there's
no dedicated night mode, sometimes it actually
really nails it anyway, even if it's incredibly dark. Now, you can shoot 4K, and
that's one of the reasons this phone exists. And so Sony also
included a Cinema Pro app that lets you really dial
in all these manual settings for shooting 4K video. Unfortunately, the 8-megapixel camera on the front is junk.

It's really not good. I don't know, man. If this phone didn't cost $950, I'd probably be a little
bit less nit-picky, but you know what? It does. So I am. In terms of software and performance, I actually don't have a ton of complaints. It's a fairly clean version of Android 9 with just a few bells and whistles. It has a Snapdragon 855 processor so it's fast, and there's 6GB of RAM, which is decent, but not stellar. I am a little bit grumpy that there's only one storage option:
128GB. If you're going to want more, and especially if you're
going to want to shoot 4K, you're going to need to expand it. And you can because there's
a microSD card slot. Battery life is
average-ish for big phones. I'm getting over four hours of screen time, and it's lasting through a day, but there's only a 3,300mAh battery in here, and I kind of feel like that's not enough.

I would be happier with that if there was wireless
charging on this phone. But no, there's not. It does do fast charging, but one neat thing Sony does is it won't fast-charge when it knows that you’re charging overnight, which helps with the overall
life span of the battery, which means it should hold up better a year or two from now, at least in theory. Now, after all that, if you're still interested in this phone, you should also know
that Sony as a company has kind of been deemphasizing phones since it hasn't been
really successful with them in the past few years.

And that kind of makes sense, and I also think it makes sense for Sony to try something new and move into this niche
of making tall boys, like this guy right here. Now, of course, you can spend less money and get a better phone
like the OnePlus 7 Pro, but what you can't get is any other phone in this tall aspect ratio, so I like the idea of this form factor. I think that it should exist in
the world of Android phones. So I'm glad that Sony's
trying to make 21:9 happen. But I don't know that I'd
recommend this particular phone to anybody. If you really, really,
really love the tall screen or you really love what
Sony does with video, then maybe.

But there's no getting around the fact that this is an expensive phone. For $950, I expect more,
and you should, too. Hey, thank you for watching. Do you want a tall phone? Let me know in the comments below. Also, if you're wondering if
there are other tall phones, we did review the Xperia 10 last month. It's kind of the same idea but cheaper, and also it's really bad for a whole other set of reasons.

But if you want to see a
review of a good big phone, click here.
