Oh my goodness, this is bad. Really bad. That's right, it's a disaster. Honestly, the situation is much more serious than I thought: a single text character crashes iMessage. One character makes iMessage completely unusable. So I asked Will to look into what on earth was going on, and it turns out the actual situation is even worse. On his MacBook, this character wreaked havoc: the Notes app on his laptop crashed, just because one note in it contained this character. One character is all it takes. That's why I called Will over; he's been digging into this all morning. Let's test it a few different ways and see how bad it really is. His Android phone is here, there's another one here, and then the iPhone X. I'll copy this character into an SMS chat and send it from one Android device to the other, to confirm that this is a problem only on iOS and macOS.
Let's get started. My cell signal is a bit weak, so... okay, there it goes. Look, nothing happens on the Android phone. But what comes next is a bit risky: this one simple character might crash the messaging app on the iPhone X outright. Kind of crazy. Who knows, if we found this, will other apps have problems too? Hand it to Jack, show them. Okay, the character is there, and the moment the cursor touches that character, highlights it... crash! Show them what happens when you open the Notes app. The Notes app won't even open. iPhone X on the left, your phone on the right. I've put my SIM card in; this could be a disaster. I want to warn you: do not try this! I'm not kidding, it's really bad. This character can crash your phone. Right, let's show them what we're talking about. The character is copied in, the SIM is recognized on the iPhone X, send. Will told me earlier that this is from a foreign language. Yeah, I think it's an Indian language; in any case, it's a character from another country's script. What if you live in that country and this character is on your keyboard? Everyone's phone would be crashing. It's sending a bit slowly. Why do I feel like as long as I send it as a test, it won't crash? It takes a moment. Did the iPhone receive it? Oh my god. That was dicey. Wow! The phone just restarted! If I open Messages and tap on this text... ready? But I'm not sure why my iMessage app didn't crash completely. I think it's because my next message pushed the previous one out of view, so it's okay. If someone is mean enough to send you this character, they can also bail you out: they can send you a rescue text. That way, when you open Messages, the phone doesn't have to render the character. Before I send the rescue text, let me first try opening the Messages app. Now send the rescue text, and then...
BOOM! I can get into the Messages app now. But if I actively open the chat thread with Will, the Messages app still crashes. Crashed! And if you copy the character in and send it, the iPhone crashes all over again. The text arrives, you open it... crash. It crashed instantly; Messages is completely unusable. Totally broken. This is worse than I thought. Apple, I hope you get some engineers on this to figure out what's going on, because it's a big problem. Remember, your Uncle Lewis told you not to try this. People don't realize how much we rely on technology now. One tiny bug like this and you panic immediately; just one little character breaks this many things. Willy Do, thank you for investigating this and sacrificing your own computer for the demo. I'm going back to my Pixel 2 XL. Haha, an Android device. Now I want to send this to Jack. No! Don't do that! I was kidding, kidding.
If the ostriches can't be here by 3:00, we can't use them. Ok bye. Dude, what happened to you? Motorcycle wreck? Oh, oh, cracked my phone over the weekend. Oh. Yeah, still got to use it though. [MUSIC PLAYING] It's really not as bad as it looks. You can still hold it. You can touch it. But definitely don't swipe it. Eight's the seven, the six is the five. Every other button's the same. Congratulations. You've just won a free cruise. Well, gee whiz. Let's board the boat. Hi, I'm Jennifer calling about your car's extended warranty.
Well, good golly. Let's extend it. No. I'm sorry. I'm so sorry, man. I'm so sorry. I'm just gonna do it, OK? Yeah! [CHEERING] [MUSIC PLAYING] Whoo. Whoa. No. Oh, no. Oh what are the chances? Oh, please tell me it didn't crack the front? No. I like funny things. Oh, sorry man. You had to be here. I am here. I already scrolled it. It's into the abyss now. You can't just go back. Gone forever, dude. I cannot believe that just– no way. What was it? Huh, oh dude. Sorry. I don't even know where I put it. Can you call it please? We're calling it. Don't all call it at once. Ty look underneath. Are we good? It's not under there, but it is remarkable what is under there. Oh, it's on silent. Guys, going to take your shoes off you're going to feel the vibration.
Oh, I found it. Yes. Bang. Ah, N64 Rumble Pack. It has the same vibration patterns though. I'll meet you in the car dude. We only gave it an hour. [TEXTING SOUNDS] Dude. What? Oh man, I cannot thank you enough for recommending that hemorrhoid cream. Has helped out tremendously. My sister has 15 warts on her hand. We're trying to figure out how to get rid of those. Floor two, please. Dude, another crazy thing I found out– are you listening to my phone call right now? You know that's illegal. Dang it. Hey, man. What's up? Hey. Wait, are you in the bathroom right now? What? Yeah, no. Definitely not. That'd be– Hey is that water running? Yeah. Left the office early to go fish. Can you send me a picture of the fish? No. No. No. FaceTime? Hey, big dog. What's good, good looking? What do you need? Oh, I just wanted to see your face. I'm organizing my undies. My phone is at 1%. If I lose you any time– oh. Hey, don't you do that brother. Don't you– hey, real quick. Got to let the dog out.
Don't move a muscle. Be right back. You called me by the way. Dude, this is like his fifth time calling today. Are you underwater? That's actually kind of impressive. I know you can't hear me, but never call me again. Oh bag of diapers. 4:54 on a Saturday the 13th. I have some illegal dumping activity happening right now. Hey. These gentlemen are trespassing! Whoa! [INTERPOSING VOICES] And we're live.
I am not dumping, sir. I was filming them. This is a citizen's arrest. This is a citizen's arrest. [INTERPOSING VOICES] Looking for the best coffee shop, boom. Right there. Hashtag, sponsored. Not really, I'm not sponsored yet. But put on a show and don't even try to [INAUDIBLE]. They love me. They all love me. Yo, Gar. You good if I add you to that paintball group text? I appreciate the invite, man. I think I'll pass. It's the paintball boys. Seventh grade, your birthday party, we had so much fun. That was like 20 years ago. I got you added. You're in. See you Saturday. Sparky!! People on Facebook, check it out. This is amazing. Hey, hey, hey. Calvin. Calvin. Check out this concert I was at last year, dude. I paid top dollar for these seats. Look at this. That sounds like a broken washing machine, dude. Seven minutes left baby. Let's make something happen. No. No, market doesn't close for another two hours. East coast, got plenty of time. Stay patient. You bring food? Yeah, I brought dip.
I bought the dip too. I just bought the dip. Which one did you get? He brought chips and dip. Oh got it. Oh, it's going down still. It's still going down. Oh it goes up. It goes up. Yes. Lunch on me. Time is money right now. OK? If I got invited to an elementary school graduation, I would walk up on that stage and I would say, ETFs, stocks, crypto, all in one place. SoFi app. Mic drop. And I would walk off the stage. And that is the best piece of advice I could give them. Please, respect the game. Yeah. No. [MUSIC PLAYING] I'm number one for the team. I'm number one for the team. We're in a conference. Yes, that's your phone.
What song is that? One for the team. No, I don't have your phone. It's your phone. It's right there. It was in your front pocket. This little knob right here, silent mode. Let's live life that way, huh? Green bubbles? Who's got the Android? Jonah. I tried giving you an iPhone and you still refused. Another vote for Jonah. Green is my least favorite color. Jonah. One more vote and you will be removed from the group thread. Jonah. We're back to blue bubbles boys. Who wants to rename the group? Spider-man 3. 10 minutes. Let's go. I accidentally called my uncle. Just one second. One second. Hey, did you mean to call? What kind of question is that? Of course I meant to call. I was worried you might have accidentally called me or something. No, not you, Uncle Remus. You boys got to get back down here to Georgia. This old catfish came up and just scooped up the top water right off the top. You wouldn't believe how tall the trees have gotten since you was here last time. I wouldn't? No.
But I don't know if you remember that girl Katie Funchess, that you used to date back in first grade? Tell you what– Ah. Hello. I don't have service unless I'm by the road. Hang on. Can you hear me? I don't know, I can't see it now. Well, I had to go back up there and look. They've got a candle burning at the firework stand. We're lighting these fireworks. We're going to test them. You've got to make sure they work. We don't sell no cheap ones around here, these are quality fireworks. Ow, ow. Dude! Can I talk to a manager? Hey, big dog. You want to talk to him? And I lost you. What seems to be the problem sir? You. Not only will you not buy a firework from my fireworks shop, you ain't buying one in this county nor this state.
You have the audacity to show up on my property after what you've done to my family and my life? Get out of here right now. Nope. Ah. You better get out of here. [INTERPOSING VOICES] That's mine. Oh come on. That's my granddaddy's firework stand! He's had it since 1904. Oh no. No. Come on. Remember the koalas arrive at 2:00. So– Bean sales, through the roof. It's really– Can somebody please figure out whose phone that is? Whoever's phone this is, you're fired. Wait, guys. Sorry. It was actually me. The koalas are here. Oh. Oh. Oh. Oh that is the worst drive ever, so ugly and flat. Only thing worse is Amarillo to Albuquerque. Good shot, dude. I would highly recommend getting lessons. Honey, if you want to take piano seriously, you got to get lessons.
Dude, will you please upgrade your phone? Why? This one works just fine. What is that, the original iPhone? Look how small this is. Looks like a pager. Is that the iPod app? If I want to listen to the Goo Goo Dolls, I just click them. Hold on. It's loading. Sir, do you mind holding that for a second? I'm having trouble seeing it.
Could you move back a little bit? Right about there. That's great. Thank you. My son got a penguin. I just couldn't see the word. Oh wow. Congratulations. Just use my credit card. It's three– Whoa, whoa, whoa shh. Dude. They're listening. What are you talking about? Watch this. Yeah, thanks for coming over, man. I've been meaning to get into a new hobby these days. I've been thinking about getting into soap carving. They listen to everything. Anyways, what was that credit card number again? Oh, it's three six– What's new with you? Glad you asked. It's my granddaughters, grandson, family on the beach. This was a cool story. This guy bought this one at an auction. Brother-in-law who's Sheriff of Ellis County being interviewed after the tornado. You have 12 pictures of the ground? Yeah, that's kind of the way I roll. Three, six– Whoa, whoa, whoa, whoa, whoa. Yo Gar, paintball– Dude, I can't do this. We had to call in a backup. Sometimes you need somebody to drop a phone. And the poor man just couldn't do it. Third person.
Could I get a little credit? Huge thanks to our friends at SoFi for sponsoring this video. Sign up for a SoFi Active Invest account, where you can win up to $1,000 to buy stocks, ETFs, trade crypto, and more. And with SoFi social investing, you can even see how your portfolio stacks up against everyone, including us. Seriously. Just search our names in the app. So what are you waiting for? Click here to download the SoFi app right now. Click here if you want to see the last video. Click here if you're not a subscriber. Signing up now. Pound it. Noggin. See ya. What a talk! That's cheating, it's cheating!
Oh my god, I like this mysterious package best. Unboxing is always exciting, but I happen to know what's inside: a product prototype that attracted a lot of attention on the Kickstarter crowdfunding platform. Stickers. No, wait, this is a business card. It's their CEO's business card, even. Shocking. Toss it. Ah! The smallest mobile phone in the world. This thing is unbelievably small. The ZANCO tiny t1, the world's smallest mobile phone. As you all know, I've unboxed phones before that claimed to be the "smallest," and this one is smaller than any of them. Sync contacts and music from your phone, monochrome screen, voice changer, three days of standby, built-in Bluetooth. Put a nano SIM into the phone, and it can store 500 text messages. You heard that right: this phone has a voice changer. You can sound like a man, a woman, a child, an elderly person, a cartoon character, a teenager, Optimus Prime, a duck, the EVA robot, the WALL·E robot, a rapper guy or a rapper girl. This is starting to get fun. Let's see how small this thing is. Ho ho ho! Wow! It's unbelievably small. Am I dreaming? What a win! It looks like the speaker is on the back, and the SIM card goes in from the side. Next to this, the nano SIM actually looks big. It's smaller than my thumb, about the size of an SD card, about the size of a coin, smaller than a small lighter, much smaller than a ketchup packet, about the size of two sticks of gum. It's just too small! It comes with a micro USB cable for charging, and there's a SIM eject pin too. This sells for $55. You might be wondering who on earth this is for. I think it's made for fun, for people who'll try anything. If I pulled this out at a party, everyone would be stunned. Make a call with this thing and they'd be floored. The buttons on the phone are way smaller than my fingers. Jack, what do you think?
Hahaha. This is the Pixel 2 XL. Of course the two phones have completely different goals, but look at this contrast; it's so tiny. Okay, powering it on. The moment has come, full battery. I want to put the SIM in and try making a call. What do you think? Looks like I have a text message. Let's try a call. Will, what's your phone number? Did you hear that? Not bad. It's connected, which is weird. Oh wow! This speaker is pretty loud. Will: "Yo~" Hey man, your voice sounds really clear. Will: "Uh, yours is really clear too." Really? I didn't expect this thing to have such good call quality. Same on your end, very clear. I feel like a secret agent. You don't need to carry your phone, just bring this; you could tuck it in your sock. Say you want to go for a jog, but you're worried about missing a call and don't want to carry a big phone around. You need a smaller phone. Should I try the voice changer? Okay, I'm going to swap places with Will. Will, come over here. Low voice. Sounds a bit like a robot, but it feels so strange, hahaha. Okay, this thing is made for fun. The body is unbelievably small. It's fine as a toy, but if you had to work on it, say, texting, I shudder just thinking about it. Calls, though, are surprisingly good. Haha. What? What the heck, this phone is smaller than the black part alone. You can buy a normal phone for $50, but it definitely won't be the smallest in the world. For someone who likes collecting things, though, buy them this as a gift and they'll absolutely love it.
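The tiny t1's voice-changer presets (deep voice, child, robot, and so on) mostly boil down to pitch shifting. As a rough illustration of the simplest possible approach, naive resampling, here's a sketch in Python with NumPy; the function name and the synthetic test tone are my own illustration, not anything from ZANCO's firmware:

```python
import numpy as np

def pitch_shift(samples: np.ndarray, factor: float) -> np.ndarray:
    """Naive pitch shift: resample the waveform by `factor`.

    factor > 1 raises pitch (child/chipmunk voice),
    factor < 1 lowers it (deep voice).
    Note this also changes duration; real voice changers use
    phase vocoders or PSOLA to shift pitch while keeping timing.
    """
    n_out = int(len(samples) / factor)
    # Fractional positions in the original signal to read from
    positions = np.arange(n_out) * factor
    return np.interp(positions, np.arange(len(samples)), samples)

# Demo on a synthetic 220 Hz tone, sampled at 8 kHz for 1 second
sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 220 * t)

deep = pitch_shift(tone, 0.5)   # one octave down, ~110 Hz, twice as long
high = pitch_shift(tone, 2.0)   # one octave up, ~440 Hz, half as long
```

The duration side effect is why this naive method makes voices sound sped up or slowed down as well as higher or lower.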
good morning everyone it's great to be back at io today i'm excited to share our latest breakthrough in natural language understanding lambda it's a language model for dialogue applications and it's open domain which means it's designed to converse on any topic and while it's still in research and development we've been using it internally to explore novel interactions for example say you wanted to learn about one of my favorite planets pluto lambda already understands quite a lot about pluto and millions of other topics let's listen to a conversation the team had with pluto a few days ago i'm so curious about you i sense your excitement ask me anything tell me what i would see if i visited you would get to see a massive canyon some frozen icebergs geysers and some craters it sounds beautiful i assure you it is worth the trip however you need to bring your coat because it gets really cold i'll keep that in mind hey i was wondering have you ever had any visitors yes i have had some the most notable was new horizons the spacecraft that visited me let's break down what made it feel so natural first learn concepts as you saw the model talked about the new horizon spacecraft and the coldness of space lambda synthesized these concepts from its training data because none of the responses were predefined lambda answered with sensible responses keeping the dialogue open-ended natural conversations are generative and they never take the same path twice and lambda is able to carry a conversation no matter what we talk about yet it's still early research so it doesn't get everything right sometimes it can give nonsensical responses imagining pluto doing flips or playing fetch with its favorite ball the moon other times it just doesn't keep the conversation go going we believe lambda's natural conversation capabilities have the potential to make information and computing radically more accessible and easier to use we look forward to incorporating better conversational features 
into products like google assistant search and workspace lambda is a huge step forward in natural conversation but it is still trained only on text when people communicate with each other they do it across images text audio and video so we need to build models that allow people to naturally ask questions across different types of information these are called multimodal models for example when you say show me the part where the lion roars at sunset we will get you to that exact moment in a video advances in ai are helping us reimagine what a map can be but now you can also use it to explore the world around you you'll be able to access live view right from the map and instantly see details about the shops and the restaurants around you including how busy they are recent reviews and photos of those popular dishes in addition there are a host of new features coming to live view later this year we're adding prominent virtual street signs to help you navigate those complex intersections second we'll point you towards key alarm landmarks and places that are important for you like the direction of your hotel third we're bringing it indoors to help you get around some of the hardest to navigate buildings like airports transit stations and malls indoor live you will start rolling out in top train stations and airports in zurich this week and will come to tokyo next month we're bringing you the most detailed street maps we've ever made take this image of columbus circle one of the most complicated intersections in manhattan you can now see where the sidewalks the crosswalks the pedestrian islands are something that might be incredibly helpful if you're taking young children out on a walk or absolutely essential if you're using a wheelchair thanks to our application of advanced ai technology on robust street view and aerial imagery we're on track to launch detailed street maps in 50 new cities by the end of the year so we're making the map more dynamic and more tailored 
highlighting the most relevant information exactly when you need it if it's 8 a.m on a weekday we'll display the coffee shops and bakeries more prominently in the map while at 5 pm we'll highlight the dinner restaurants that match your tastes you'll start seeing this more tailored map in the coming weeks people have found it really useful especially during this pandemic to see how busy a place is before heading out now we're expanding this capability from specific places like restaurants and shops to neighborhoods with the feature called area business say you're in rome and want to head over to the spanish steps and its nearby shops with area business you'll be able to understand at a glance if it's the right time for you to go based on how busy that part of the city is in real time area busyness will roll out globally in the coming months let's talk about all the ways we're innovating in shopping soon on chrome when you open a new tab you'll be able to see your open carts from the past couple of weeks we'll also find you promotions and discounts for your open carts if you choose to opt in your personal information and what's in your carts are never shared with anyone externally without your permission we capture photos and videos so we can look back and remember there are more than four trillion photos and videos stored in google photos but having so many photos of loved ones screenshots selfies all stored together makes it hard to rediscover the important moments soon we're launching a new way to look back that we're calling little patterns little patterns show the magic in everyday moments by identifying not so obvious moments and resurfacing them to you this feature uses machine learning to translate photos into a series of numbers and then compares how visually or conceptually similar these images are when we find a set of three or more photos with similarities such as shape or color we'll surface them as a pattern when we started testing little patterns we 
saw some great stories come to life like how one of our engineers traveled the world with their favorite orange backpack or how our product manager christy had a habit of capturing objects of similar shape and color we also want to bring these moments to life with cutting edge effects last year we launched cinematic photos to help you relive your memories in a more vivid way cinematic moments will take these near duplicate images and use neural networks to synthesize the movement between image a and image b we interpolate the photos and fill in the gaps by creating new frames the end result is a vivid moving picture and the cool thing about this effect is it can work on any pair of images whether they were captured on android ios or scanned from a photo album in addition to providing personalized content to look back on we also want to give you more control we heard from you that controls can be helpful for anyone who has been through a tough life event breakup or loss these insights inspired us to give you the control to hide photos of certain people or time periods from our memories feature and soon you'll be able to remove a single photo from a memory rename the memory or remove it entirely instead of form following function what if form followed feeling instead of google blue we imagined material you a new design that includes you as a co-creator letting you transform the look and feel of all your apps by generating personal material palettes that mix color science with a designer's eye a new design that can flex to every screen and fit every device your apps adapt comfortably every place you go beyond light and dark a mode for every mood these selections can travel with your account across every app and every device material u comes first to google pixel this fall including all of your favorite google apps and over the following year we will continue our vision bringing it to the web chrome os wearables smart displays and all of google's products we've 
overhauled everything from the lock screen to system settings revamping the way we use color shapes light and motion watch what happens when the wallpaper changes like if i use this picture of my kids actually getting along for once i set it as my background and voila the system creates a custom palette based on the colors in my photo the result is a one of a kind design just for you and you'll see it first on google pixel in the fall starting from the lock screen the design is more playful with dynamic lighting pick up your phone and it lights up from the bottom of your screen press the power button to wake up the phone instead and the light ripples out from your touch even the clock is in tune with you when you don't have any notifications it appears larger on the lock screen so you know you're all caught up the notification shade is more intuitive with a crisp at a glance view of your app notifications whatever you're currently listening to or watching and quick settings that give you control over the os with just a swipe and a tap and now you can invoke the google assistant by long pressing the power button and the team also reduced the cpu time of android system server by a whopping 22 percent and with android 12 we're going even further to keep your information safe to give people more transparency and control we've created a new privacy dashboard that shows you what type of data was accessed and when this dashboard reports on all the apps on your phone including all of your google apps and we've made it really easy to revoke an app's permission directly from the dashboard we've also added an indicator to make it clear when an app is using your camera or microphone but let's take that a step further if you don't want any apps to access the microphone or camera even if you've granted them permission in the past we've added two new toggles in quick settings so you can completely disable those sensors for every app android's private compute core enables things 
like Now Playing, which tells you what song is playing in the background, and Smart Reply, which suggests responses to your chats based on your personal reply patterns. And there's more to come later this year. All of the sensitive audio and language processing happens exclusively on your device, and like the rest of Android, Private Compute Core is open source: it's fully inspectable and verifiable by the security community. With a single tap, you can unlock and sign into your Chromebook when your phone is nearby. Incoming chat notifications from apps on your phone are right there in Chrome OS, and soon, if you want to share a picture, one click and you can access your phone's most recent photos. To keep movie night on track, we're building TV remote features directly into your phone. You can use voice search or even type with your phone's keyboard. We're also really excited to introduce support for digital car key. Car Key will allow you to lock, unlock, and start your car, all from your phone. It works with NFC and ultra-wideband technology, making it super secure and easy to use. And if your friend needs to borrow your car, you can remotely and securely share your digital key with them. Car Key is launching this fall with select Google Pixel and Samsung Galaxy smartphones, and we're working with BMW and others across the industry to bring it to their upcoming cars. That was a quick look at Android 12, which will launch this fall, but you can check out many of these features in the Android 12 beta today. Let's go beyond the phone to what we believe is the next evolution of mobile computing: the smartwatch. First, building a unified platform jointly with Samsung, focused on battery life, performance, and making it easier for developers to build great apps for the watch. Second, a whole new consumer experience, including updates to your favorite Google apps. And third, a world-class health and fitness service created by the newest addition to the Google family, Fitbit. As the world's largest OS, we have a responsibility to build for everyone. But for people of color, photography has not always seen us as we want to be seen, even in some of our own Google products. To make smartphone photography truly for everyone, we've been working with a group of industry experts to build a more accurate and inclusive camera. So far, we've partnered with a range of different expert image makers, who've taken thousands of images to diversify our image datasets, helped improve the accuracy of our auto white balance and auto exposure algorithms, and given aesthetic feedback to make our images of people of color more beautiful and more accurate. Although there's still much to do, we're working hard to bring all of what you've seen here, and more, to Google Pixel this fall. We were all grateful to have video conferencing over the last year. It helped us stay in touch with family and friends and kept businesses and schools going, but there is no substitute for being together in the room with someone. So several years ago, we kicked off a project to use technology to explore what's possible. We call it Project Starline. First, using high-resolution cameras and custom-built depth sensors, we capture your shape and appearance from multiple perspectives, and then fuse them together to create an extremely detailed, real-time 3D model. The resulting data is huge: many gigabits per second. To send this 3D imagery over existing networks, we developed novel compression and streaming algorithms that reduce the data by a factor of more than 100. And we have developed a breakthrough light field display that shows you a realistic representation of someone sitting right in front of you, in three dimensions. As you move your head and body, our system adjusts the images to match your perspective. You can talk naturally, gesture, and make eye contact. It's as close as we can get to the feeling of sitting across from someone. We have spent thousands of hours testing it in our own offices, and the results are promising. There's also excitement from our lead enterprise partners. We plan to expand access to partners in healthcare and media. Thank you for joining us today. Please enjoy the rest of Google I/O, and stay tuned for the developer keynote coming up next. I hope to see you in person next year. Until then, stay safe and be well.
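The Starline compression claim above can be put in rough numbers. The talk only says "many gigabits per second," so the 5 Gbit/s raw rate below is an assumed example, not a figure from the keynote:

```python
# Back-of-envelope sketch of the Starline bandwidth claim.
# RAW_GBPS is an assumption for illustration; the keynote only says
# "many gigabits per second" reduced "by a factor of more than 100".

RAW_GBPS = 5.0      # assumed raw 3D capture rate, Gbit/s
REDUCTION = 100     # compression factor claimed in the talk

def compressed_mbps(raw_gbps: float, factor: int) -> float:
    """Compressed stream rate in megabits per second."""
    return raw_gbps * 1000 / factor

print(f"{compressed_mbps(RAW_GBPS, REDUCTION):.0f} Mbit/s")
```

Under that assumption the stream lands around 50 Mbit/s, which is why a 100x reduction is what makes sending this over ordinary broadband plausible.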
– There's a separate RGB LED in the little fan, are you kidding me? Everyone and their dog is getting into the gaming phones space now, but not everyone designs their gaming phone for competitive eSports play, that's right, David. No, no, stop it, stop it. This is an eSports phone, and, there is legitimately some stuff about it that is straight up very, very cool. This is the Lenovo Legion Duel Phone 2, not dual phone, it's just one phone, duel like (imitating battle noises) I'm fighting with you, that kind of cool stuff. It's got a Snapdragon 888 processor, up to 18 gigabytes of LPDDR5 memory, up to 512 gigabytes of UFS 3.1 storage, and get this, it's got an active cooling system for its vapor chamber cooler, that's got a 12,500 RPM fan here and then a 15,000 RPM fan at the top for exhaust, so yes, my friends, you are looking at holes in the phone here and here for flipping cooling.
And apparently they have rethought the layout of the internals of this phone, so all the heat-producing devices are aligned along this cooling path. They claim that the air heats up as much as 22 degrees going from the intake to the exhaust. This thing is crazy. Like, yeah, the ROG phone is for gaming or whatever, but this is clearly meant to be held like this, I mean, even if I'm taking a picture. So it's got 64 megapixel and 16 megapixel rear snappers on it, like if I'm taking a picture with this thing, I'm holding it like this, like a freaking camera. Okay, we're, you know what? We're going to straight up open the camera right away here. – Sup Jono? – What's up? (camera clicking) – Look at this guy waving in a picture and now I got a blurry hand to look at. Anyway, the point is, the rear camera is not the main event here, the main event is the pop-out selfie camera.
That's right, you cannot be a legit pro gamer unless you have a face cam while you're gaming. Man, this thing is a chunky boy though, which makes sense, when you've got an active cooling system with not one, but two fans, a 5,500 milliamp hour battery that is split into two, and charges at up to 90 Watts. And of course, like any self-respecting gaming phone, it's got dual forward-facing speakers, so hopefully it's going to sound pretty decent. That's a bit of a slow fingerprint scanner, but let's see how it performs when, okay, that's fine when you're unlocking, just a little slow to register your fingerprints, so that's pretty good. All right, so let's just– (screaming and laughing) There it is, look at that.
Sup David? – [Jono] That's pretty good. – Wait, it's on the, hold on a second, are you kidding me? It is on the lock button, so it… (beeping) Like what? That is crazy. Why would you put the camera right on the lock button? Yeah, I don't know, if you want the lock button positioned for right-handed people, and then if you want sort of, the landscape mode, to be kind of a sensible, you want the volume rocker out of the way while you're gaming, so you got your shoulder buttons and stuff, I mean, I guess that's just where it has to go.
As a PC enthusiast, it drives me crazy how much better the front facing cameras are on phones. Like, are you telling me, PC manufacturers, that you couldn't find a way to put a camera that thick in your laptop? – Impossible? – [Jono] Yeah. – Completely impossible, can't be done? – [David] What about the processing, though? – Okay, Intel, are you telling me that processing cannot be fixed? Okay, it can't be better? At any rate, wow. (laughing in background) That is a lot of smoothing. I am beautiful. Holy crap, look at that, my pores are picture perfect. How much you want to bet that I straight up do not have wrinkles when I smile on this thing? This side is hilariously smoothed.
Wow. (laughing) Oh, beauty mode, you so silly. That is super cool, and I think they did a bang up job of getting the length of the lens just right for a face camera while you're gaming. But of course, there's more to this that makes it a gaming phone than just performance, cooling, and a front facing camera, starting of course, with all of the touch sensitive buttons that are located around the device.
So on each side you've got two trigger buttons or, well, there's two shoulder buttons, because neither of them is pressure sensitive, so you got one, two, three, four. You've got an additional touch sensitive button on the back of the device, which I believe is, you know what? I'm going to have to find that a little bit later. And then finally, this is super cool, they've gone and they've put a more advanced haptic motor system in this phone so that you've actually got two pressure sensitive buttons on either side of the screen. I am really excited to try that out. Naturally, there's RGB lighting. Hey. All right. So it's got a 144 Hertz display, but there's more to the responsiveness of a mobile phone display than just how quickly it refreshes. It actually polls for touches at 720 Hertz, so that means that when you touch the screen, by polling more often, it's getting the most up-to-date possible information so that your inputs are going to have less latency.
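The polling-rate point can be made concrete with a little arithmetic. This is just a sketch of the general sampling-latency relationship, not anything specific to Lenovo's touch controller:

```python
# Why a higher touch polling rate reduces input latency (illustrative).
# If the digitizer samples at f Hz, a touch that lands between samples
# waits on average half a period, and at worst a full period, before
# the system sees it.

def added_latency_ms(poll_hz):
    """Return (average, worst-case) sampling latency in milliseconds."""
    period_ms = 1000.0 / poll_hz
    return period_ms / 2, period_ms

for rate in (60, 120, 720):
    avg, worst = added_latency_ms(rate)
    print(f"{rate:>4} Hz polling: avg {avg:.2f} ms, worst {worst:.2f} ms")
```

At 60 Hz the sampling alone can add over 16 ms in the worst case; at 720 Hz it's under 1.4 ms, which is the latency budget the marketing is pointing at.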
720 Hertz. – [David] Yeah, that's pretty cool. – We're getting dangerously close to like PC peripherals. Now, touch screens are inherently not as responsive as something like a mouse or a keyboard input device, but we can improve that a lot by polling more frequently. Oh my God, hold on a second, this is hilarious. I'm just going to enable turbo fan here. (laughing) This is cool, so you can change your lighting effects depending on what's going on, so it can play with their music app, which I don't know if you would use that much cause you probably have Spotify, or whatever.
(laughing in background) Color changes by device temperature, when you get calls, cool. When did having lights on your phone become not cool anymore? Remember those cool apps where you could have, like, anyone that had an RGB LED, this is back in like the Nexus 5 days, right? – [David] Yeah, the little top LED. – So you could have like color coded flashes and stuff. What was that app called? There was like a cool app everyone loved so much. – No way, that's a separate RGB LED in the little fan.
Are you kidding me? To Lenovo Legion's credit, the design is really well balanced. They said that, like we saw on the ROG Phone 5, they split the battery apart, so this is better for battery longevity, as well as for the balance of the device. And, talking about battery longevity, yeah, it's all fine and good to be able to charge your device to 50% in 13 minutes, but you also have to have protections in place for your battery. So, since this is a device that is very likely to be plugged in via USB Type-C. Of course, there are two Type-C ports, there's one that you can use horizontally, and one that you can use vertically.
You want to make sure that you're not just juicing it up to a hundred percent and leaving it there the whole time while it's running an intensive application, like a game and heating up, cause that's really bad for the battery. So, they've got phalanx battery protection system, not to be confused with phallus battery protection system which is more common in adult toys. Should we fire up some COD? This video is brought to you by Vincero.
Vincero makes timepieces that fit a variety of budgets and they're currently having their spring upgrade sale. You can get up to 30% off and free shipping on everything they sell. And if watches aren't your thing, you can check out their new blue light filtering eyeglasses for some stylish eye protection, and no discount code or anything is required, it is automatically applied at checkout. So don't wait, the sale ends April 12th, so go take advantage of the Vincero spring upgrade sale at the link below. Look at that, panoramic stereo sound, 144 Hertz, this display will do up to 1300 nits peak brightness, it's just shy of seven inches, 6.92 inches, and it's a 1080P-class display, 2460 by 1080. I'm expecting it to be very good; to compete with the ROG Phone 5, it's going to have to be excellent. (COD theme music playing on phone) – [David] Wow. – That's pretty rich, isn't it? – [David] That's really good. – [Jono] Yeah.
(gunshots on phone) – Oh, God. Oh, boy. Hey, I'm out of here. How do I just aim but not fire? Okay. Like that. Wow, these bots are really bad. These are clearly designed to make you feel good about your skills, man. (laughing in background) Cause I have barely been touched. (chuckling) Here we go. Oh, well, sorry, I was confused, I couldn't move.
(laughing in background) Too busy winning. All right, here we go. Here we go, here we go, here we go. Now that I know how to do it, this is fairly intuitive, it's not bad. So you got wide triggers. See, I just didn't realize you can move these things around. Let's see how it handles screen recording while playing. Okay, let's rampage, I think rampage, oh wow, high-speed mode. I mean, this game's locked at 30 to 60 FPS anyway, but there are games that do run at a higher frame rate. Man, having all those extra functions bound to the touch buttons at the back and stuff, not bad. And you know what? To its credit, I've been sitting here playing for, I don't know, what, 10, 15 minutes at this point? It's barely even warm to the touch, except here. So this is where they put all the heat-generating components, and even with the fans running, like it's warm, but the sides where my hands are, not bad.
Hold on, let me see how many of these bots I can line up and kill at the same time. (laughing) – They just stand there. Oh man, I still can't kill them, it's shocking. Like are they pointing at the ground? Watch them. Watch them, watch them, let's see where he shoots. Where are you shooting, dog? Look at this guy, look, he's shooting at the ground. I mean, if the idea of gaming is to make you feel like an unstoppable bad-ass, this is going to do the trick for ya. It's like I might as well be Darth Vader fighting like Ewoks, like an army of Ewoks, like Jar Jar Binks sized Ewoks. (laughing in background) Or, you know, Ewok-sized Jar Jar Binkses, what would be worse? (laughing) I'm impressed, the display looks freaking great, they boast, what is that? 110% coverage of DCI-P3 or something like that, 111. Pretty darn impressive. The sound is great, HDR10+. If you're into gaming, hey, this is looking pretty compelling. Man, that middle part really is toasty, like this thing is freaking, it's going, man. Oh, I just realized I never actually finished doing the unboxing.
It comes with two USB C-to-C cables. What else we got here? The thing is freaking heavy. Hold on a second, do you have to plug both freaking cables in to get the max output? Are you kidding me? Am I understanding this correctly? Okay, what else we got? Stylish outside. Oh God, yeah, that's not what I would describe as stylish outside. Everything else about this has been great so far. Look at that, Type-C to headphone jack, how hard was that, Apple? Really hard, apparently. SIM. Oh, I didn't even mention it has dual SIM slots. So we've got a SIM removal tool and that's it for the accessory package. Man, this really puts the brick in power brick, this thing. This is a really cool device, now that gaming phones have been through their first couple of, sort of, "yeah, does this really need to exist? If it does, not really in this form" kind of iterations. (sighing) It's getting to the point where like, honestly, if most of what you did was game on, you know, a mobile device like an Android device.
There's a lot of gaming to be done on this, and the price doesn't even seem that outlandish compared to another mobile gaming console, like, I mean, well, really there's only the Switch, and Switch Lite, Switch Lite, Switch Lite. – [David] The Vita's still alive. – The Vita is not still alive, David. – [David] It's still alive. – But this channel is still alive, so make sure you're subscribed so you don't miss more videos here on Short Circuit.
This video was sponsored by Skillshare. This is the Zero 18. It's the first phone from a Berlin based start-up called Blloc. It's a minimalist smartphone built to get rid of distractions, and the team behind it decided to completely reinvent the smartphone interface. I've spent a few days with two early prototypes, so let's take a look. This is not just yet another standard Android phone. The Blloc Zero 18 wants to cure your smartphone addiction, although you wouldn't know that it's unique from just looking at the device itself.
In fact the notch on the front and the iPhone-like back actually make this a fairly standard midrange Android phone for 2018. I like the translucent back, but the hardware certainly feels fairly generic. Makes sense: Blloc is a small startup, so the hardware is probably off-the-shelf stuff from China. But as soon as you look at either the space-food inspired packaging, or the company's fancy website, or as soon as you unlock the screen, you notice that something is different. This phone, and the company behind it, is actually rather unique. There are two modes on the phone that you can switch between, and we will first start with the less extreme one that the company calls "MNML (minimal) mode". It's basically just standard Android with a monochrome launcher and a skin applied to it. If you open apps, they launch in monochrome mode by default, because our monkey brains are apparently really easily distracted by colors.
A cool trick here is that you can just tap the fingerprint reader to toggle color on or off anywhere in the system whenever you need it, like when you want to take a photo. It's pretty nifty, but that's pretty much all there is to MNML mode as it's really only meant to be used when the other mode gets a little too restrictive. Because Blloc mode is where the company wants you to spend your time. This mode has 3 main screens, all in black and white of course, and each one of them was designed with two goals in mind: to keep distractions in check as much as possible and to keep you from opening apps as much as possible. By default, your notifications are shown on the tiles as dots, and if an app annoys you, you can easily phase it out by hiding its notifications, muting it, or reducing its brightness because dark stuff apparently attracts your eyes a lot less.
Oh, and you can also lock individual apps from here. Of course this is a grid of icons so you can just use it to launch your apps if you need, but the company is slowly building out functionality so you won't have to. After all, opening apps is usually the first step to a user being sucked into an endless scrolling time waster. And uh, pro tip here: binge watching TechAltar videos or scrolling through TechAltar tweets is definitely not considered a time waster, so just keep watching. Anyway, as a start, some apps like Spotify let you do basic controls from the tile without opening the app and I was told that Blloc wants to add similar controls to most common apps in the future too.
The screen on the right is called the tree which wants to be a unified communication hub. So the system automatically pulls out SMS messages, phone calls, Whatsapp, Telegram, Messenger and other conversations from the apps and puts them all into this page. You can then switch between channels like Whatsapp or SMS per contact and have them all appear in one feed. All without opening an app and without having to care or remember which service a person uses. Similar solutions were available on Blackberry, with Blackberry Hub, as well as Windows Phone in the early days and as far as I know neither of them ever really took off, but you know Blloc's solution is somewhat different, so maybe this one will be a hit. And by the way, you might have noticed that even navigation buttons are hidden in the Blloc UI, as all you do here is swipe either to the right or to the left. Very minimalistic. To the left you have the last screen called the root. Here you are supposed to type in commands and have the phone serve you results.
So you can type "weather Berlin" to get the weather, you can type "news" and select your source to get the headlines, you can set up an alarm from here by typing in what you want and so on. Once again, the idea is to avoid opening apps as much as possible. Now, all of this software is definitely still a work in progress. Many of the integrations are pretty buggy, there just aren't enough integrations yet, and it's unclear if Blloc can keep up with all the changes in the future. After all, Blloc is hacking these integrations into Android and into Android apps, so if those change the way they work, these integrations might just break. I'm interested to see if the company manages to fix everything before they release the phone in November, but even until then, I like the ideas behind this phone and I just love how the software looks. Little details like the awesome boot animation give it a character, a kind of Berlin hipster design vibe that I really dig. And if nothing else comes out of it, this phone will at least be a fantastic UI or UX design case study.
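The root's command-to-action idea can be sketched in a few lines. This is a toy dispatcher with stand-in handlers, purely to illustrate the concept; it is not Blloc's actual implementation:

```python
# Toy sketch of the "root" command screen: map a typed command to an
# action without opening a full app. All handlers are stubs invented
# for illustration.

def weather(args):
    return f"Weather for {' '.join(args)}: 18°C, cloudy (stub)"

def news(args):
    return "Top headlines from your source (stub)"

def alarm(args):
    return f"Alarm set for {' '.join(args)} (stub)"

COMMANDS = {"weather": weather, "news": news, "alarm": alarm}

def run(line: str) -> str:
    """Parse one typed line and dispatch it to a handler."""
    cmd, *args = line.split()
    handler = COMMANDS.get(cmd.lower())
    return handler(args) if handler else f"Unknown command: {cmd}"

print(run("weather Berlin"))
```

The hard part for Blloc isn't this dispatch layer, of course; it's keeping each integration working as the underlying apps and services change underneath it.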
Anyway, the rest of the device is pretty much a standard midrange Android experience. The Helio P23 processor is not too exciting but at least the phone comes with 4 gigs of RAM and 64 gigs of storage so it feels reasonably snappy. The screen is not an OLED which is definitely a missed opportunity given how dark the interface is, but at least the fullHD LCD screen is pretty good. The 3000 mAh battery is also pretty average in this category, and the camera, well, it's too early to say given that this is an early prototype, but so far it didn't seem too impressive. But of course real camera tests will have to wait until we get the final device. Either way, this phone wasn't built to be the best Android phone. It was built with a specific goal in mind. Blloc says your regular smartphones are functional, but too distracting and most specialized phones like the Light phone are distraction free, but not very functional. Blloc tries to integrate the best of both worlds by keeping all of the functionality, but still shielding you from distractions whenever possible.
And if that's what you are looking for, then there aren't many alternatives on the market right now. And at 359 Euros, also including a full year of insurance, I think it's pretty reasonably priced. I actually got to hang out with the Blloc team quite a lot for this video and I always find it inspiring when just a handful of people have the necessary skills to build something cool out of nothing. If you have your own big ideas but don't know how to turn them into a reality, go to Skillshare to learn the necessary skills. They have over 20,000 courses on entrepreneurship, design, animation, photography or whatever you are interested in and right now I'm in love with the design courses from Aaron Draplin. Such a great teacher and a huge character.
There are tons of really great courses like this one, so if you want to get access to all of them for free for 2 months, there's a link in the description, and using that link will really help my channel out as well.
The Galaxy S8, the newest Infinity Display flagship from Samsung. But what happens when that display breaks? This thing already passed my durability test with flying colors. Now it’s time to see what this thing is made of, and what it looks like on the inside. Let’s get started. [Intro] There are no visible screws along the outside, which is pretty normal for Samsung these days. It does make the phone slightly harder to repair, but not impossible. The process I am demonstrating is going to be pretty much the exact same for both the Galaxy S8 and the S8 Plus, but the replacement parts are going to be different.
I’ll have those linked separately in the description. I’ve warmed up the back of the phone with my heat gun or hair dryer until it’s just barely too hot to touch. Then I can stick a thin metal pry tool between the metal frame and the glass of the phone. Lift it up just high enough to slip a playing card or business card inside, and that will help you get around that curve without breaking anything.
If you need a replacement back, or if you break yours during your repair, they are pretty inexpensive, so don’t stress out too much. After slicing through both sides, I’ll slip my green pry tool in to hold the glass up and keep it from resealing itself onto the phone body. The rest of the adhesive will be easy to cut away after that. Remember it’s important not to go too deep inside of the phone because there is important stuff under there that can be punctured, like the wireless charging coil or the battery. So stick around the edges. Once the back glass is free, we can see the fingerprint scanner up along the top. It’s still attached to the rear panel. The first interesting thing that we find is all the warnings on the battery. You’ve got normal stuff like don’t burn, don’t puncture, avoid extreme temperatures. And then you get this no dogs allowed sign. Like, I’m not a dog person myself, but I don’t advocate pet discrimination either.
Are cats and goldfish okay? I don’t really understand the rules anymore and Samsung just kind of made it weird. There are 14 screws holding down the first layer of guts. The circle-y thing is the wireless charging. We cut open one of these on the What’s Inside YouTube channel. The copper wires coil up to receive power through inductance, and then pass that power through the battery into these pins on the motherboard. Pretty sweet technology. Apple will probably invent this technology in the future for one of their next iPhones. So that’s something for iPhone users to look forward to. The battery disconnects from the motherboard easy enough, but there are no magical pull tabs underneath like we’ve seen on some other phones, so it’s time to use brute force.
I’ll use the rounded end of my metal pry tool, taking extreme care not to slice or puncture the battery. I also took special care not to use a dog at any point during this procedure since that’s one of Samsung’s battery requirements. The battery does look pretty cool. It’s got a 3,000 milliamp hour capacity, and it even has a see-through area up at the top for the protection circuit that I talked about during my Note 7 video. The clear plastic on the battery makes me want a clear phone even more. It’s also cool that the inside of the phone is the same color as the outside – just like what we saw with the red iPhone that I took apart a few weeks ago. The loud speaker is the next piece to come out. It’s got a little water damage indicator down at the bottom. Remember, these phones are water resistant and not water proof.
It still has those golden contact points where it receives its power and signal from the phone. Before we can remove the charging port, we have to take out the main board. I’ll start disconnecting the wire cables at the bottom; there are three of those. Then the screen ribbon unsnaps like a little Lego from the side of the motherboard. After that I’ll move up to the front sensor array ribbon cable, and the front facing camera ribbon connector. And then, you know, there’s the SIM card tray that I should have removed before we started. At the base of the motherboard there’s a Lego connector for the charging port, but it’s on the underside of the board, making things a little more complicated than they should be. I’ll give you a better view of that in just a second. Now that the motherboard is out, we have the copper Samsung heat pipe.
This helps keep the processor cool, since copper is a better conductor of heat than aluminum, making the thermal transfer away from the processor more efficient. Now the rear 12 megapixel camera has its own Lego-like connection on the motherboard. I’ll snap that off and push the camera through the board. This is definitely replaceable. Just for kicks and giggles I’ll pull out the front facing camera as well. This little guy is attached to the iris scanner. If you look at the rear camera, you can see it move around inside of the frame. This is called OIS, or optical image stabilization. I’ll show you more of how this works in just a second. On the front camera unit, the iris scanner is solid, and normally the front facing camera is solid as well, but this one has movement. Samsung didn’t advertise having stabilization on this front camera, but it looks like they might have been playing around with the idea of adding it.
OIS takes image quality to the next level so it would be pretty awesome if they did. I’ll tuck that front facing camera back into the frame and clip the rear camera back into place as well. Let’s take a look at that earpiece speaker. Remember, during my durability test I complained that the grill size was way smaller on the new S8 than it was on the older S7.
It turns out that the internal speakers are pretty much the same size. If anything, the S8 might even be a little bit larger of a speaker, so no worries there. Since the speaker does sit a little lower than the actual earpiece slot, this channel directs the sound out of the hole in the front. This sensor array at the front is all connected with this ribbon cable. And the volume and Bixby buttons are all connected with these golden contact pads.
The round vibrator has its own two contact pads. And the power button is built the same way – two little contact pads resting up against the motherboard. Now for the bottom of the phone. The headphone jack is very easily replaceable, just one little screw to hold it in place. And it has the same little Lego style ribbon connector connecting it to the charging port board. You can see the little rubber seal around the headphone jack to help keep the water out. There are 5 more screws holding the charging port board to the frame. And here is the charging port itself. Incredibly nice that we don’t have the front capacitive button reach-around that we saw in the Galaxy S6 – that was a nightmare. The charging port is pretty standard. It’s got the USB-C port and the little microphone off to the side.
This phone is actually pretty easy to work on once you get inside that glued shut back glass. From the exterior you can see that there is metal all around the edge of the phone, but now that we have the guts taken out and the internals of the S8 exposed, we can see that it’s the same hunk of metal throughout the entire device which fully explains the rigidity of the phone. Metal is pretty solid. There’s a little slot in the frame for the screen ribbon to poke through. Speaking of the screen, replacing a cracked display is not cheap or easy with a Samsung. For one, it’s glued into place. And two, the curved AMOLED panels are pretty expensive. I’ll have the current pricing linked in the video description for you. Since the screen is glued in, the old display is essentially sacrificed in the removal process. Once it is heated up and removed, similar to how we did the back panel, just feed the new screen ribbon through the metal frame of the phone and plop it down into place.
I did this with the Galaxy S7 teardown if you’re interested in seeing the exact process. Since this screen is not broken though, I’ll leave it intact. And I’ll talk about a few ways to protect your phone towards the end of the video. Assembling the phone is a piece of cake. Charging port gets tucked back into place along with the headphone jack. This is a pretty great use of space, Samsung. There are 6 screws holding down all the components.
Then get those round wires tucked into the grooves along the metal frame. Now the charging port is connected at the base of the motherboard which is normal for Samsung, but strange to the rest of us. I’ll plug that in before setting the rest of the motherboard into place making sure there are no ribbons or connections stuck underneath the board as it goes down. I’ll clip in the front sensor array, and then the iris scanner and front facing camera. There are those 3 signal wires down at the bottom of the motherboard. The circular heads are pretty fragile so make sure you are gentle as you press them into place. And finally the screen ribbon snaps into place like a little Lego. The loud speaker is next. It’s easiest to snap the plastic into the metal frame from the bottom edge first. And the last thing we plug in is the battery. This is for the phone’s own protection. Normally you’ll want to put adhesive under the battery as well.
And you should definitely not turn your phone on at this point, but I kind of want to show you something cool, so I’m going to do it anyway. Remember the camera stabilization I talked about earlier? Here it is in action. The camera is turned on right now and the phone hardware is physically stabilizing the camera image to compensate for the shakiness or the movement of my hands.
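The OIS behavior seen here boils down to a feedback idea: the gyro measures shake, and the lens is shifted the opposite way to cancel it. The sketch below is purely a toy illustration of that idea; the gain and travel-limit numbers are invented, not Samsung's firmware values:

```python
# Toy illustration of optical image stabilization (OIS): invert the
# measured tilt to compute a counteracting lens shift, clamped to the
# actuator's travel. All constants are hypothetical.

LENS_GAIN_UM_PER_DEG = 100.0   # assumed lens shift per degree of tilt
MAX_SHIFT_UM = 120.0           # assumed actuator travel limit

def ois_offset_um(tilt_deg: float) -> float:
    """Lens offset (micrometers) that counteracts a measured tilt."""
    shift = -tilt_deg * LENS_GAIN_UM_PER_DEG
    return max(-MAX_SHIFT_UM, min(MAX_SHIFT_UM, shift))

for tilt in (0.05, -0.1, 2.0):   # small hand shake vs. a big jerk
    print(f"tilt {tilt:+.2f} deg -> lens shift {ois_offset_um(tilt):+.1f} um")
```

The clamping is why OIS smooths out small hand tremors but can't rescue a large, fast movement; the lens simply runs out of travel.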
Huge thumbs up for that. It’s seriously one of the best features you can have in a smartphone, and not every phone comes with this kind of hardware stabilization. I think it’s pretty sweet. I checked the front camera, but it doesn’t look like there is any kind of movement or stabilization in the lens. So while Samsung might have considered adding OIS on that front camera, it’s definitely not enabled at the moment.
Now the phone is turned off again. I will set the wireless charging into place and get all 14 screws screwed in. And finally I can clip in the fingerprint scanner ribbon. This is a tedious process that reminds me a lot of the iPhone 5s. I maybe could have popped the fingerprint scanner out of the back glass and set it into place on the inside, but I wanted to keep that seal with the back glass as tight as possible, and my green tool worked just fine. The best kind of repair is the one you don’t have to do. The best way to keep your phone from breaking in the future is to protect it with a case or a skin.
A naked phone is just asking for trouble. A skin, like the one you see here from dbrand, goes a long way for adding grip, keeping that phone scratch free, and adding a raised surface around the camera lens for a little extra protection. I’ll toss a link in the description for you. And thanks to dbrand for supporting this video. Hopefully it will save people money when they break their phones in the future. If you want to check out a few other projects I’m working on, Instagram and Twitter have all my behind the scenes. And let me know if you were successful in repairing your own phone. Thanks a ton for watching! I’ll see you around.
[OPENING VIDEO PLAYS] [APPLAUSE] SUNDAR PICHAI: Good morning. Welcome to Google I/O. [CHEERING] AUDIENCE: I love you, Sundar! [LAUGHTER] SUNDAR PICHAI: I love you guys, too. [LAUGHTER] Can't believe it's one year already. It's a beautiful day.
We're being joined by over 7,000 people, and we are live streaming this, as always, to over 400 events in 85 countries. Last year was the 10th year since Google I/O started, and so we moved it closer to home at Shoreline, back where it all began. It seems to have gone well. I checked the Wikipedia entry from last year. There were some mentions of sunburn, so we have plenty of sunscreen all around. It's on us. Use it liberally. It's been a very busy year since last year, no different from my 13 years at Google. That's because we've been focused ever more on our core mission of organizing the world's information. And we're doing it for everyone. And we approach it by applying deep computer science and technical insights to solve problems at scale.
That approach has served us very, very well. This is what allowed us to scale up seven of our most important products and platforms to over a billion monthly active users each. And it's not just the scale at which these products are working, users engage with them very heavily. YouTube not only has over a billion users, but every single day, users watch over 1 billion hours of videos on YouTube. Google Maps. Every single day, users navigate over 1 billion kilometers with Google Maps. So the scale is inspiring to see, and there are other products approaching this scale. We launched Google Drive five years ago, and today, it has over 800 million monthly active users. And every single week, there are over 3 billion objects uploaded to Google Drive.
Two years ago at Google I/O, we launched Photos as a way to organize users' photos using machine learning. And today, we have over 500 million active users, and every single day, users upload 1.2 billion photos to Google. So the scale of these products is amazing, but they are all still working their way up to Android, which, I'm excited to say, as of this week crossed over 2 billion active devices.
[APPLAUSE] As you can see, the robot is pretty happy, too, behind me, so it's a privilege to serve users of this scale. And this is all because of the growth of mobile and smartphones, but computing is evolving again. We spoke last year about this important shift in computing from a mobile-first to an AI-first approach. Mobile made us reimagine every product we were working on. We had to take into account that the user interaction model had fundamentally changed, with multi-touch, location, identity, payments, and so on. Similarly, in an AI-first world, we are rethinking all our products and applying machine learning and AI to solve user problems. And we are doing this across every one of our products. So today, if you use Google Search, we rank differently using machine learning.
Or if you're using Google Maps, Street View automatically recognizes restaurant signs, street signs, using machine learning. Duo with video calling uses machine learning for low bandwidth situations. And Smart Reply in Allo last year had a great reception. And so today, we are excited that we are rolling out Smart Reply to over 1 billion users of Gmail. It works really well. Here's a sample email. If you get an email like this, the machine learning system learns to be conversational, and it can reply, I'm fine with Saturday, or whatever. So it's really nice to see.
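The shape of the Smart Reply problem described above can be illustrated with a toy sketch. To be clear, this is not Google's model, which uses sequence-to-sequence neural networks; the candidate replies and the word-overlap scoring function below are made up purely to show the idea of ranking a fixed set of candidate responses against an incoming message.

```python
# Toy illustration of reply suggestion via candidate ranking.
# NOT Google's Smart Reply model; the candidates and the crude
# word-overlap score are invented for illustration only.

CANDIDATES = [
    "I'm fine with Saturday.",
    "Sounds great, thanks!",
    "Sorry, I can't make it.",
]

def score(message: str, reply: str) -> int:
    """Score a candidate reply by crude word overlap with the message."""
    msg_words = set(message.lower().split())
    return sum(1 for w in reply.lower().rstrip(".").split() if w in msg_words)

def suggest(message: str) -> str:
    """Return the highest-scoring candidate reply."""
    return max(CANDIDATES, key=lambda r: score(message, r))

print(suggest("Are you free on Saturday for dinner?"))  # I'm fine with Saturday.
```

A real system would generate and rank replies with a learned model rather than a fixed list, but the select-the-best-candidate structure is the same.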
Just like with every platform shift, how users interact with computing changes. Mobile brought multi-touch. We evolved beyond keyboard and mouse. Similarly, we now see voice and vision as two new important modalities for computing. Humans are interacting with computing in more natural and immersive ways. Let's start with voice. We've been using voice as an input across many of our products. That's because computers are getting much better at understanding speech. We have had significant breakthroughs, but the pace, even since last year, has been pretty amazing to see. Our word error rate continues to improve, even in very noisy environments.
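Word error rate, the metric mentioned above, is the standard way to measure speech recognition quality: the word-level edit distance between the recognized text and a reference transcript, divided by the length of the reference. A minimal implementation (the example sentences are invented, not Google's benchmark data):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Standard WER: word-level edit distance / number of reference words."""
    ref, hyp = reference.lower().split(), hypothesis.lower().split()
    # Dynamic-programming edit distance over words
    # (substitutions, insertions, deletions all cost 1).
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution / match
    return d[len(ref)][len(hyp)] / len(ref)

# One substituted word out of four -> WER of 0.25.
print(word_error_rate("turn on the lights", "turn on the light"))  # 0.25
```

Reported speech recognition error rates in the low single digits mean roughly one word in twenty or better is transcribed incorrectly.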
This is why if you speak to Google on your phone or Google Home, we can pick up your voice accurately, even in noisy environments. When we were shipping Google Home, we had originally planned to include eight microphones so that we could accurately locate the source of where the user was speaking from. But thanks to deep learning, we use a technique called neural beamforming. We were able to ship it with just two microphones and achieve the same quality. Deep learning is what allowed us about two weeks ago to announce support for multiple users in Google Home, so that we can recognize up to six people in your house and personalize the experience for each and every one. So voice is becoming an important modality in our products. The same thing is happening with vision. Similar to speech, we are seeing great improvements in computer vision. So when we look at a picture like this, we are able to understand the attributes behind the picture.
We realize it's your boy at a birthday party. There was cake and family involved, and your boy was happy. So we can understand all that better now. And our computer vision systems now, for the task of image recognition, are even better than humans. So it's astounding progress and we're using it across our products. So if you use the Google Pixel, it has the best-in-class camera, and we do a lot of work with computer vision. You can take a low light picture like this, which is noisy, and we automatically make it much clearer for you.
Or coming very soon, if you take a picture of your daughter at a baseball game, and there is something obstructing it, we can do the hard work to remove the obstruction– [APPLAUSE] –and– [APPLAUSE] –have the picture of what matters to you in front of you. We are clearly at an inflection point with vision, and so today, we are announcing a new initiative called Google Lens. [APPLAUSE] Google Lens is a set of vision-based computing capabilities that can understand what you're looking at and help you take action based on that information.
We'll ship it first in Google Assistant and Photos, and it'll come to other products. So how does it work? So for example, if you run into something and you want to know what it is, say, a flower, you can invoke Google Lens from your Assistant, point your phone at it, and we can tell you what flower it is. It's great for someone like me with allergies. [LAUGHTER] Or if you've ever been at a friend's place and you have crawled under a desk just to get the username and password from a Wi-Fi router, you can point your phone at it. [APPLAUSE] And we can automatically do the hard work for you. Or if you're walking in a street downtown and you see a set of restaurants across from you, you can point your phone.
Because we know where you are and we have our Knowledge Graph and we know what you're looking at, we can give you the right information in a meaningful way. As you can see, we're beginning to understand images and videos. All of Google was built because we started understanding text and web pages. So the fact that computers can understand images and videos has profound implications for our core mission. When we started working on Search, we wanted to do it at scale. This is why we rethought our computational architecture. We designed our data centers from the ground up. And we put a lot of effort in them. Now that we are evolving for this machine learning and AI world, we are rethinking our computational architecture again. We are building what we think of as AI first data centers. This is why last year, we launched the tensor processing units. They are custom hardware for machine learning. They were about 15 to 30 times faster and 30 to 80 times more power efficient than CPUs and GPUs at that time. We use TPUs across all our products, every time you do a search, every time you speak to Google.
In fact, TPUs are what powered AlphaGo in its historic match against Lee Sedol. I now see machine learning as two components. Training, that is, how we build the neural net. Training is very computationally intensive, and inference is what we do at real time, so that when you show it a picture, we recognize whether it's a dog or a cat, and so on.
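The training/inference split described above can be made concrete with a tiny sketch. The perceptron below learning the AND function is of course nothing like a production-scale neural net; it only shows the asymmetry: training is an iterative, compute-heavy loop that fits the weights, while inference is a single cheap forward pass using weights that are already learned.

```python
# Illustrative only: a one-neuron perceptron learning AND.
# train() is the expensive iterative part; infer() is the cheap part.

def train(samples, epochs=20, lr=0.1):
    """Fit weights with the classic perceptron update rule."""
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):                  # compute-heavy loop
        for (x0, x1), target in samples:
            pred = 1 if w0 * x0 + w1 * x1 + b > 0 else 0
            err = target - pred
            w0 += lr * err * x0
            w1 += lr * err * x1
            b += lr * err
    return w0, w1, b

def infer(weights, x0, x1):
    """Single forward pass with the learned weights."""
    w0, w1, b = weights
    return 1 if w0 * x0 + w1 * x1 + b > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = train(AND)
print([infer(w, x0, x1) for (x0, x1), _ in AND])  # [0, 0, 0, 1]
```

At Google's scale the same asymmetry holds: a model is trained once at enormous cost, then serves billions of cheap inference requests, which is why hardware can be optimized separately for each phase.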
Last year's TPUs were optimized for inference. Training is computationally very intensive. To give you a sense, each one of our machine translation models trains on over three billion words for a week on about 100 GPUs. So we've been working hard and I'm really excited to announce our next generation of TPUs, Cloud TPUs, which are optimized for both training and inference. What you see behind me is one Cloud TPU board. It has four chips in it, and each board is capable of 180 trillion floating point operations per second. [WHOOPING] And we've designed it for our data centers, so you can easily stack them. You can put 64 of these into one big supercomputer. We call these TPU pods, and each pod is capable of 11.5 petaflops. It is an important advance in technical infrastructure for the AI era. The reason we named it Cloud TPU is because we're bringing it through the Google Cloud Platform. So Cloud TPUs are coming to Google Compute Engine as of today. [APPLAUSE] We want Google Cloud to be the best cloud for machine learning, and so we want to provide our customers with a wide range of hardware, be it CPUs, GPUs, including the great GPUs Nvidia announced last week, and now Cloud TPUs.
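The pod figure quoted above follows directly from the per-board number, and a two-line calculation confirms the arithmetic: 180 teraflops per board times 64 boards is 11.52 petaflops, matching the stated ~11.5 petaflops.

```python
# Sanity-checking the Cloud TPU figures from the keynote.
TFLOPS_PER_BOARD = 180      # trillion floating point ops/sec per board
BOARDS_PER_POD = 64         # boards stacked into one TPU pod

pod_tflops = TFLOPS_PER_BOARD * BOARDS_PER_POD
pod_pflops = pod_tflops / 1000   # 1 petaflop = 1,000 teraflops
print(pod_pflops)  # 11.52 -> the quoted "11.5 petaflops" per pod
```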
So this lays the foundation for significant progress. So we are focused on driving the shift and applying AI to solving problems. At Google, we are bringing our AI efforts together under Google.ai. It's a collection of efforts and teams across the company focused on bringing the benefits of AI to everyone. Google.ai will focus on three areas, state-of-the-art research, tools, and infrastructure– like TensorFlow and Cloud TPUs– and applied AI.
So let me talk a little bit about these areas. Talking about research, we're excited about designing better machine learning models, but today it is really time consuming. It's a painstaking effort of a few engineers and scientists, mainly machine learning PhDs. We want it to be possible for hundreds of thousands of developers to use machine learning. So what better way to do this than getting neural nets to design better neural nets? We call this approach AutoML. It's learning to learn. So the way it works is we take a set of candidate neural nets. Think of these as little baby neural nets. And we actually use a neural net to iterate through them till we arrive at the best neural net.
We use a reinforcement learning approach. And the results are promising. To do this is computationally hard, but Cloud TPUs put it in the realm of possibility. We are already approaching state of the art in standard tasks like, say, image recognition. So whenever I spend time with the team and think about neural nets building their own neural nets, it reminds me of one of my favorite movies, "Inception." And I tell them we must go deeper.
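The iterate-through-candidates loop described above can be sketched in a few lines. In the real AutoML system, a controller neural net trained with reinforcement learning proposes candidate architectures and each candidate is trained and evaluated on real data; in this illustration, a random generator and a made-up scoring function stand in for both, purely to show the shape of the search loop.

```python
import random

# Toy sketch of architecture search. The real system replaces
# propose_candidate() with an RL-trained controller network and
# evaluate() with actually training the candidate ("child") net.
random.seed(0)

def propose_candidate():
    """Stand-in for the controller: sample a random architecture."""
    return {
        "layers": random.randint(2, 12),
        "width": random.choice([32, 64, 128, 256]),
    }

def evaluate(arch):
    """Stand-in scoring function (invented): pretend wider nets with
    roughly 8 layers perform best."""
    return arch["width"] / 256 - abs(arch["layers"] - 8) * 0.05

def search(steps=50):
    """Iterate through candidates, keeping the best one seen so far."""
    best_arch, best_score = None, float("-inf")
    for _ in range(steps):
        arch = propose_candidate()
        s = evaluate(arch)
        if s > best_score:
            best_arch, best_score = arch, s
    return best_arch, best_score

best_arch, best_score = search()
print(best_arch, round(best_score, 3))
```

The computational cost Sundar alludes to comes from the fact that every call to `evaluate` in the real system means training a full neural net, which is why Cloud TPUs make the approach practical.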
[LAUGHTER] So we are taking all these AI advances and applying them to newer, harder problems across a wide range of disciplines. One such area is health care. Last year, I spoke about our work on diabetic retinopathy. It's a preventable cause of blindness. This year, we published our paper in the "Journal of the American Medical Association," and Verily is working on bringing products to the medical community. Another such area is pathology. Pathology is a very complex area. If you take an area like breast cancer diagnosis, even amongst highly trained pathologists, agreement on some forms of breast cancer can be as low as 48%.
That's because each pathologist is reviewing the equivalent of 1,000 10-megapixel images for every case. This is a large data problem, but one which machine learning is uniquely equipped to solve. So we built neural nets to detect cancer spreading to adjacent lymph nodes. It's early days, but our neural nets show a much higher degree of accuracy, 89%, compared to 73% for previous methods. There are important caveats: we do have higher false positives, but already, putting this in the hands of pathologists can improve diagnosis. In general, I think this is a great approach for machine learning, providing tools for people to do what they do better. And we're applying it across even basic sciences. Take biology. We are training neural nets to improve the accuracy of DNA sequencing.
DeepVariant is a new tool from Google.ai that identifies genetic variants more accurately than state-of-the-art methods. Reducing errors is important in these applications. We can more accurately identify whether or not a patient has a genetic disease, and that can help with better diagnosis and treatment. We're applying it to chemistry. We're using machine learning to predict the properties of molecules. Today, it takes an incredible amount of computing resources to hunt for new molecules, and we think we can accelerate timelines by orders of magnitude.
This opens up possibilities in drug discovery or material sciences. I'm entirely confident one day, AI will invent new molecules that behave in predefined ways. Not everything we are doing is so profound. We are doing even simple and fun things, like a simple tool which can help people draw. We call this AutoDraw. Just like today when you type in Google, we give you suggestions, we can do the same when you're trying to draw, even I can draw with this thing.
So it may look like fun and games, but pushing computers to do things like this is what helps them be creative and actually gain knowledge. So we are very excited about progress even in these areas as well. So we are making impressive progress in applying machine learning, and we are applying it across all our products, but the most important products we are applying this to are Google Search and the Google Assistant. We are evolving Google Search to be more assistive for our users. This is why last year at Google I/O, we spoke about the Assistant, and since then, we've launched it on Google Pixel and Google Home. Scott and team are going to talk more about it, but before that, let's take a look at the many amazing ways people have been using the Google Assistant. [VIDEO PLAYBACK] – OK, Google. [MUSIC PLAYING] – Hey, Google? – Hey, Google. – OK, Google. – Hey, Google. [BLING] – Play some dance music. – Sure. [BLING] – This is "Fresh Air." My guest will be– – Kimmy Schmidt on Netflix. [BLING] – OK, Google. Count to 100.
– Sure. 1, 2, 3– – Play vacuum harmonica on my TV. [VACUUMING] [HARMONICA PLAYS] – –71, 72– – No! – –73– – Play the "Wonder Woman" trailer. – Hey, Google. Talk to Domino's. – Talk to Lonely Planet. – Talk to Quora. – Show me my photos from last weekend. [BLING] [SCREAMING] – Your car is parked at 22B. [BEEP BEEP] – Today in the news– [BLING] – Turn the living room lights on. – OK, turning on the lights. – I'm back, baby. – Hey, Google. Drop a beat. – Flip a coin. – Call Jill. – Set a timer. – Talk to Headspace. [TING] – And then just for a moment, I'd like you to let go of any focus at all. Just let your mind do whatever it wants to do. – Done. – Hey, Google. Good night. – Turning off all the things. See you tomorrow. [END PLAYBACK] [MUSIC PLAYING] [APPLAUSE] SCOTT HUFFMAN: Hey, everyone. Last year at I/O, we introduced the Google Assistant, a way for you to have a conversation with Google to get things done in your world.
Today, as Sundar mentioned, we're well on our way, with the Assistant available on over 100 million devices. But just as Google Search simplified the web and made it more useful for everyone, your Google Assistant simplifies all the technology in your life. You should be able to just express what you want throughout your day and the right things should happen. That's what the Google Assistant is all about. It's your own individual Google. So that video we saw really captures the momentum of this project. We've made such big strides and there's so much more to talk about today. The Assistant is becoming even more conversational, always available wherever you need it, and ready to help get even more things done.
First, we fundamentally believe that the Google Assistant should be, hands down, the easiest way to accomplish tasks, and that's through conversation. It comes so naturally to humans, and now Google is getting really good at conversation, too. Almost 70% of requests to the Assistant are expressed in natural language, not the typical keywords that people type in a search box. And many requests are follow-ups that continue the conversation. We're really starting to crack the hard computer science challenge of conversationality by combining our strengths in speech recognition, natural language understanding, and contextual meaning. Now recently, we made the Assistant even more conversational, so each member of the family gets relevant responses just for them by asking with their own voice. And we're continuing to make interacting with your Assistant more natural. For example, it doesn't always feel comfortable to speak out loud to your Assistant, so today, we're adding the ability to type to your Assistant on the phone.
Now, this is great when you're in a public place and you don't want to be overheard. The Assistant's also learning conversation beyond just words. With another person, it's really natural to talk about what you're looking at. Sundar spoke earlier about how AI and deep learning have led to tremendous strides in computer vision. Soon, with the smarts of Google Lens, your Assistant will be able to have a conversation about what you see. And this is really cool, and Ibrahim is here to help me show you a couple of examples of what we'll launch in the coming months. So, last time I traveled to Osaka, I came across a line of people waiting to try something that smelled amazing.
Now, I don't speak Japanese, so I couldn't read the sign out front, but Google Translate knows over 100 languages, and my Assistant will help with visual translation. I just tap the Google Lens icon, point the camera, and my Assistant can instantly translate the menu to English. And now, I can continue the conversation. IBRAHIM ULUKAYA: What does it look like? GOOGLE ASSISTANT: These pictures should match. SCOTT HUFFMAN: All right. It looks pretty yummy. Now notice, I never had to type the name of the dish.
My Assistant used visual context and answered my question conversationally. Let's look at another example. Some of the most tedious things I do on my phone stem from what I see– a business card I want to save, details from a receipt I need to track, and so on. With Google Lens, my Assistant will be able to help with those kinds of tasks, too. I love live music, and sometimes I see info for shows around town that look like fun.
Now, I can just tap the Google Lens icon and point the camera at the venue's marquee. My Assistant instantly recognizes what I'm looking at. Now, if I wanted to, I could tap to hear some of this band's songs, and my Assistant offers other helpful suggestions right in the viewfinder. There's one to buy tickets from Ticketmaster, and another to add the show to my calendar. With just a tap, my Assistant adds the concert details to my schedule. GOOGLE ASSISTANT: Saving event.
Saved Stone Foxes for May 17th at 9:00 PM. SCOTT HUFFMAN: Awesome. [APPLAUSE] My Assistant will help me keep track of the event, so I won't miss the show, and I didn't have to open a bunch of apps or type anything. Thanks Ibrahim. So that's how the Assistant is getting better at conversation– by understanding language and voices, with new input choices, and with the power of Google Lens. Second, the Assistant is becoming a more connected experience that's available everywhere you need help, from your living room to your morning jog, from your commute to errands around town, your Assistant should know how to use all of your connected devices for your benefit. Now, we're making good progress in bringing the Assistant to those 2 billion phones, and other devices powered by Android, like TVs, wearables, and car systems. And today, I'm excited to announce that the Google Assistant is now available on the iPhone. [APPLAUSE] Woo. So no matter what smartphone you use, you can now get help from the same smart assistant throughout the day at home, and on the go. The Assistant brings together all your favorite Google features on the iPhone.
Just ask to get package delivery details from Gmail, watch videos from your favorite YouTube creators, get answers from Google Search, and much more. You can even turn on the lights and heat up the house before you get home. Now, Android devices and iPhones are just part of the story. We think the Assistant should be available on all kinds of devices where people might want to ask for help. The new Google Assistant SDK allows any device manufacturer to easily build the Google Assistant into whatever they're building. Speakers, toys, drink-mixing robots, whatever crazy device all of you think up, now can incorporate the Google Assistant. And we're working with many of the world's best consumer brands and their suppliers, so keep an eye out for the badge that says, "Google Assistant built-in" when you do your holiday shopping this year. Now obviously, another aspect of being useful to people everywhere is support for many languages. I'm excited to announce that starting this summer, the Google Assistant will begin rolling out in French, German, Brazilian Portuguese, and Japanese on both Android phones and iPhones.
By the end of the year, we'll also support Italian, Spanish and Korean. So that's how the Assistant is becoming more conversational, and how it will be available in even more contexts. Finally, the Assistant needs to be able to get all kinds of useful things done for people. People sometimes ask if the Assistant is just a new way to search. Now of course, you can ask your Assistant to get all sorts of answers from Google Search, but beyond finding information, users are also asking the Assistant to do all sorts of things for them.
Now as you've already seen, the Assistant can tap into capabilities across many Google Apps and services, but Google's features are just part of the story. We also open the Assistant to third-party developers who are building some really useful integrations. I'll turn it over to Valerie to share more about how the developer platform is getting stronger. [MUSIC PLAYING] [APPLAUSE] VALERIE NYGAARD: Hi. OK, so with the actions on Google Platform, it's been awesome to see how developers like you have been engaging with the Google Assistant. Like honestly, you've built some really cool integrations. Like, I can ask Food Network about the recipe that's on TV right now. I can work out with Fitstar, ask CNBC about the news, or my husband and I can play name that tune with SongPop, which he is surprisingly good at.
Until now, these experiences have been available through the Assistant on Google Home. But today, we're also bringing them to Android phones and iPhones. It's over 100 million devices on Android alone. So now people can get to Google features and third-party services from anywhere, and they can even pick up where they left off across devices. So, not only are third-party integrations available in more places. They'll be able to do more. Starting today, actions on Google will be supporting transactions. It's a complete end-to-end solution for developers, including payments, identity, notifications, receipts, even account creation. The platform handles all the complexity. Let me show you how one will work. GOOGLE ASSISTANT: Hi, how can I help? VALERIE NYGAARD: I'd like delivery from Panera. PANERA: Hi, this is Panera. I'll need your delivery address.
Which one can I get from Google? GOOGLE ASSISTANT: We'll go with 1600 Amphitheater. PANERA: What can I get you started with? VALERIE NYGAARD: I'll have the strawberry poppy seed salad with steak instead of chicken. PANERA: Got it. How about one of these cool drinks? VALERIE NYGAARD: And here, I can just swipe through my options. See what looks good. Agave lemonade. PANERA: Great. Are you ready to check out? VALERIE NYGAARD: Yep. PANERA: OK, the total is $18.40. Are you ready to place the order? VALERIE NYGAARD: Yes. I'll just scan my fingerprint to pay with Google, and that's it. [APPLAUSE] PANERA: Thanks. You're all set. VALERIE NYGAARD: Yeah, super easy, like I was talking to someone at the store. So here I was a new Panera customer. I didn't have to install anything or create an account. You've also probably noticed I didn't have to enter my address or my credit card.
I just saved those earlier with Google, and Panera used built-in platform calls to request the information. Now, I was in control over what I shared every step of the way. So– AUDIENCE: Woo! VALERIE NYGAARD: [CHUCKLES] The developer platform's also getting much stronger for home automation integrations. Actions on Google can now support any smart home developer that wants to add conversational control. Today, over 70 smart home companies work with the Google Assistant, so now in my Google Home or from my phone, I can lock my front door with August locks, control a range of LG appliances, or check in on my son's room by putting the Nest cam on TV. All right, now that we're talking about making your home smarter, we also have a lot of news to share today about Google Home, our own smart speaker with the Google Assistant built in. Here to tell you more is Rishi Chandra.
[MUSIC PLAYING] [APPLAUSE] RISHI CHANDRA: Thanks, Valerie. You know, it's really hard to believe we launched Google Home a little over six months ago, and we've been really busy ever since. Since launch, we've added 50 new features, including some my favorites like support for Google Shopping, where I can use my voice to order items from Costco right to my front door. Or I can get step-by-step cooking instructions from over 5 million recipes. Or I can even play my favorite song just by using the lyrics. Now in April, we launched in the UK to some great reviews. And starting this summer, we're going to be launching in Canada, Australia, France, Germany, and Japan. [APPLAUSE] And with support for multiple users, we can unlock the full potential of Google Home to offer a truly personal experience. So now, you can schedule a meeting, set a reminder, or get your own daily briefing with My Day by using your own voice.
And get your commute, your calendar appointments, and your news sources. Now today, I'd like to share four new features we'll be rolling out over the coming months. So first, we're announcing support for proactive assistance coming to Google Home. Home is great at providing personally relevant information for you when you ask for it, but we think it'd be even more helpful if it can automatically notify you of those timely and important messages. And we do this by understanding the context of your daily life, and proactively looking for that really helpful information, and providing it for you in a hands-free way. So for example, let's say I'm relaxing and playing a game with the kids. Well, I can see that the Google Home lights just turned on. Hey, Google, what's up? GOOGLE ASSISTANT: Hi, Rishi. Traffic's heavy right now, so you'll need to leave in 14 minutes to get to Shoreline Athletic Fields by 3:30 PM. RISHI CHANDRA: That's pretty nice. The Assistant saw the game coming up on my calendar, and got my attention because I had to leave earlier than normal.
So now, my daughter can make it to that soccer game right on time. Now, we're going to start simple, with really important messages like reminders, traffic delays, and flight status changes. And with multiple-user support, you have the ability to control the type of proactive notifications you want over time. All right, and second, another really common activity we do in the home today is communicate with others. And a phone call is still the easiest way to reach someone. So today, I'm excited to announce hands-free calling coming to Google Home. [CHEERING AND APPLAUSE] It's really simple to use. Just ask the Google Assistant to make a call, and we'll connect you. You can call any landline or mobile number in the US or Canada completely free. And it's all done in a hands-free way. For example, let's say I forgot to call my mom on Mother's Day. Well now, I can call her while I'm scrambling to get the kids ready for school in the morning. I just say, hey Google. Call mom. GOOGLE ASSISTANT: Sure, calling mom.
[RINGING] [RINGING] SPEAKER 1: So, you're finally calling. Mother's Day was three days ago. RISHI CHANDRA: Yeah, sorry about that. They made me rehearse for I/O on Mother's Day. Speaking of which, you're on stage right now. Say hi to everyone. SPEAKER 1: Oh, hi, everyone. AUDIENCE: Hi. RISHI CHANDRA: So, hopefully, this makes up for not calling, right? SPEAKER 1: No, it doesn't. You still need to visit and bring flowers.
RISHI CHANDRA: OK, I'm on it. Bye. SPEAKER 1: Bye. RISHI CHANDRA: It's that simple. We're just making a standard phone call through Google Home. So mom didn't need to learn anything new. She just needs to answer her phone. There's no additional setup, apps, or even phone required. And since the Assistant recognized my voice, we called my mom.
If my wife had asked, we would have called her mom. We can personalize calling just like everything else. And now, anyone home can call friends, family, even businesses. Maybe even a local florist to get some flowers for your mom. Now, by default, we're going to call out with a private number, but you also have the option to link your mobile number to the Google Assistant. And we'll use that number whenever we recognize your voice. So whoever you call will know it's coming from you. Now, we're rolling out hands-free calling in the US to all existing Google Home devices over the next few months. It's the ultimate hands-free speakerphone. No setup required, call anyone, including personal contacts or businesses, and even dial out with your personal number when we detect your voice. We can't wait for you to try it out.
OK, third, let's talk a little about entertainment. We designed Google Home to be a great speaker, one that you can put in any room in the house or wirelessly connect to other Chromecast built-in speaker systems. Well today, we're announcing that Spotify, in addition to their subscription service, will be adding their free music service to Google Home, so it's even easier to play your Spotify playlists. [APPLAUSE] We'll also be adding support for SoundCloud and Deezer, two of the largest global music services today. [APPLAUSE] And these music services will join many of the others already available through the Assistant. And finally, we'll be adding Bluetooth support to all existing Google Home devices. So you can play any audio from your iOS or Android device. AUDIENCE: Yes! [APPLAUSE] But Google Home can do much more than just audio. Last year, we launched the ability to use your voice to play YouTube, Netflix, and Google Photos right on your TV.
And today, we're announcing additional partners, including HBO NOW. [APPLAUSE] So just say what you want to watch, and we'll play it for you all in a hands-free way. With Google Home, we want to make it really easy to play your favorite entertainment. OK, finally, I want to talk a little bit about how we see the Assistant evolving to help you in a more visual way. Voice responses are great, but sometimes a picture is worth a thousand words. So today, we're announcing support for visual responses with Google Home. Now to do that, we need a screen. Well, fortunately, many of us already have a ton of screens in our home today, our phones, our tablets, even our TVs.
The Google Assistant should smartly take advantage of all these different devices to provide you the best response on the right device. For example, with Google Home, I can easily get location information. OK, Google. Where is my next event? GOOGLE ASSISTANT: Your Pokemon GO hike is at Rancho San Antonio Reserve. RISHI CHANDRA: It's for my kids. GOOGLE ASSISTANT: It's at 11:00 AM today. RISHI CHANDRA: It's for my kids. Relax. [LAUGHTER] But if I want to view the directions, the best place to do it is on my phone. Well soon, you could just say, OK, Google. Let's go. GOOGLE ASSISTANT: All right, I'm sending the best route to your phone. RISHI CHANDRA: And it will automatically notify your phone, whether it's Android or iOS, and take you straight to Google Maps.
So you can glance at directions, interact with the map, or just start navigation. It's really simple. Now TVs are another natural place to get help from the Google Assistant, and we have a great place to start with over 50 million Chromecast and Chromecast built-in devices. So today, we're announcing that we'll be updating Chromecast to show visual responses on your TV when you ask for help from Google Home. For example, I can now say, OK, Google. Show my calendar for Saturday. GOOGLE ASSISTANT: Showing it on your TV. RISHI CHANDRA: It'll show up right on the TV screen. I'll immediately get results from the Assistant. [APPLAUSE] And since the Assistant detected my voice, we're showing my calendar. Others would see their calendar by using their voice. We can personalize the experience, even on the TV. And I can continue the conversation with follow-ups. Looks like I have a biking trip to Santa Cruz.
What's the weather in Santa Cruz this weekend? GOOGLE ASSISTANT: This weekend in Santa Cruz, it will be clear and sunny most of the time. RISHI CHANDRA: So it's really easy. It's all hands-free. Your Assistant can provide a visual response on the TV for a lot of different types of questions. We talked about how easy it is to play what you want to watch on the TV screen, but what about those times you don't know what to watch? Well, soon, you could just ask, hey, Google. What's on YouTube? GOOGLE ASSISTANT: Here you go.
RISHI CHANDRA: And it'll show me my personalized results right on the TV screen. If I don't like any of the options, I can continue the conversation with my voice. Show my Watch Later list. GOOGLE ASSISTANT: All right. RISHI CHANDRA: Play "Send My Love." GOOGLE ASSISTANT: Playing "Send My Love" from YouTube. [MUSIC – "SEND MY LOVE"] RISHI CHANDRA: It's really simple. Again, no remotes or phone required. In a short conversation, I found something really interesting to watch using Google Home. I can even do it with other things. OK, Google. What's on my DVR? GOOGLE ASSISTANT: Here you go. RISHI CHANDRA: Here we're showing how it works with YouTube TV, a new live TV streaming service that gives you live sports and shows from popular TV networks. And YouTube TV includes a cloud DVR, so I can easily play my saved episodes. Everything can be done in a hands-free way all from the comfort of my couch. And over time, we're going to bring all those developer actions that Valerie had already talked about right to the TV screen.
So we'll do even more over time with Google Home. And that's our update for Google Home. Proactive assistance will bring important information to you at the right time, simple and easy hands-free calling, more entertainment options, and evolving the Assistant to provide visual responses in the home. Next up is Anil, who's going to talk about Google Photos. [APPLAUSE] [MUSIC PLAYING] ANIL SABHARWAL: Two years ago, we launched Google Photos with an audacious goal– to be the home for all of your photos, automatically organized and brought to life so that you could easily share and save what matters.
In doing so, we took a fundamentally different approach. We built a product from the ground up with AI at its core. And that's enabled us to do things in ways that only Google can. Like when you're looking for that one photo you can't find, Google Photos organizes your library by people, places, and things. Simply type, "Anil pineapple Hawaii," and instantly find this gem. [LAUGHTER] Or when you come home from vacation, overwhelmed by the hundreds of photos you took, Google Photos will give you an album curated with only the best shots, removing duplicates and blurry images. This is the secret ingredient behind Google Photos, and the momentum we've seen in these two short years is remarkable. As Sundar mentioned, we now have more than half a billion monthly active users, uploading more than 1.2 billion photos and videos per day. And today, I'm excited to show you three new features we're launching to make it even easier to send and receive the meaningful moments in your life.
Now, at first glance, it might seem like photo sharing is a solved problem. After all, there's no shortage of apps out there that are great at keeping you and your friends and family connected, but we think there's still a big and different problem that needs to be addressed. Let me show you what I mean. [VIDEO PLAYBACK] – If there's one thing you know, it's that you're a great photographer. If there's a second thing you know, it's that you're kind of a terrible person. – What? – Yeah, you heard me. The only photo of the birthday girl in focus? Never sent it. The best picture of the entire wedding? Kept it to yourself. This masterpiece of your best friend? We were going to send it, but then you were like, oh, remember that sandwich? I love that sandwich.
If only something could say, hey, Eric looks great in these. You want to send them to him? And you can be like, great idea. Well, it can. Wait, it can? Yup. With Google Photos. [END PLAYBACK] [APPLAUSE] ANIL SABHARWAL: So today, to make us all a little less terrible people, we're announcing Suggested Sharing, because we've all been there, right? Like when you're taking that group photo and you insist that it be taken with your camera, because you know if it's not on your camera, you are never seeing that photo ever again. [LAUGHTER] Now thanks to the machine learning in Google Photos, we'll not only remind you so you don't forget to share, we'll even suggest the photos and people you should share with. In one tap, you're done. Let's have a look at Suggested Sharing in action. I'm once again joined onstage by my friend, and Google Photos product lead, David Leib. [APPLAUSE] All right, so here are a bunch of photos Dave took while bowling with the team last weekend.
He was too busy enjoying the moment, so he never got around to sharing them. But this time, Google Photos sent him a reminder via notification, and also by badging the new Sharing tab. The Sharing tab is where you're going to be able to find all of your Google Photos sharing activity, and at the top, your personal suggestions based on your sharing habits and what's most important to you. Here is the Sharing Suggestion that Dave got from his day bowling. Google Photos recognized this was a meaningful moment, it selected the right shots, and it figured out who he should send it to based on who was in the photos. In this case, it's Janvi, Jason, and a few others who were also at the event. Dave can now review the photos selected, as well as update the recipients.
Or if he's happy with it, he can just tap Send. And that's it. Google Photos will even send an SMS or an email to anyone who doesn't have the app. And that way, everyone can view and save the full resolution photos, even if they don't have Google Photos accounts. And because Google Photos sharing works on any device, including iOS, let's have a look at what Janvi sees on her iPhone. She receives a notification, and tapping on it lets her quickly jump right into the album. And look at all the photos that Dave has shared with her. But notice here at the bottom, she's asked to contribute the photos she took from the event, with Google Photos automatically identifying and suggesting the right ones.
Janvi can review the suggestions and then simply tap Add. Now all of these photos are finally pulled together in one place, and Dave gets some photos he's actually in. [LAUGHTER] Which is great, because a home for all your photos really should include photos of you. Now, though Suggested Sharing takes the work out of sharing, sometimes there's a special person in your life whom you share just about everything with. Your partner, your best friend, your sibling. Wouldn't it be great if Google Photos automatically shared photos with that person? For example, I would love it if every photo I ever took of my kids was automatically shared with my wife. And that's why today, we're also announcing Shared Libraries. [APPLAUSE] Let me show you how it works. So here, we're now looking at my Google Photos account. From the menu, I now have the option to go ahead and share my library, which I'm going to go ahead and do with my wife, Jess.
Importantly, I have complete control over which photos I automatically share. I can share them all, or I can share a subset, like only photos of the kids, or only photos from a certain date forward, like when we first met. In this case, I'm going to go ahead and share all. [LAUGHTER] [LAUGHS] We did not meet today. [LAUGHTER] And that's all there is to it. I've now gone ahead and shared my library with my wife, Jess. So, let's switch to her phone to see what the experience looks like from her end. She receives a notification, and after accepting, she can now go to see all the photos that I've shared with her, which she can access really easily from the menu. If she sees something she likes, she can go ahead and select those photos and simply save them to her library.
We'll even notify her periodically as I take new photos. Now, this is great, but what if Jess doesn't want to have to keep coming back to this view and checking if I shared new photos with her? She just wants every photo I take of her or the kids to automatically be saved to her library, just as if she took the photos herself. With Shared Libraries, she can do just that, choosing to autosave photos of specific people. Now, any time I take photos of her or the kids, without either of us having to do anything, they'll automatically appear in the main view of her app. Let me show you. Now, I couldn't justify pulling the kids out of school today just to have their photo taken, but I do have the next best thing.
[APPLAUSE] Let me introduce you to [? Eva ?] and [? Lilly. ?] All righty here. So I'm going to go ahead, take a photo with the girls. Smile, kids! [LAUGHTER] Wow, fantastic. And since this is too good of an opportunity, I'm going to have to take one with all of you here, too, all right? [CHEERING] Here we go. Woo! Brilliant. All right. OK, so thank you, girls. Much appreciated. Back to school we go. [LAUGHTER] All right. So, using nothing more than the standard camera app on my phone, I've gone ahead and taken one photo with my kids and one photo with all of you here in the audience. Google Photos is going to back these two photos up. It's going to share them with Jess, and then it's going to recognize the photo that has my kids in it and automatically save just that one to her library, like you can see right here. [APPLAUSE] Now finally, Jess and I can stop worrying about whose phone we're using to take the photos. All the photos of our family are in my Google Photos app, and they automatically appear in hers too.
And best of all, these family photos are part of both of our search results, and they're included in the great collages, movies, and other fun creations that Google Photos makes for us. But notice how only the photos with the kids showed up in Jess's main view. But because I shared my entire library with her, I can simply go to the menu, and Jess can now see all of the photos, including the one with all of you. [APPLAUSE] And that's how easy sharing can be in Google Photos. Spend less time worrying about sharing your memories, and more time actually enjoying them. Suggested Sharing and Shared Libraries will be rolling out on Android, iOS, and web in the coming weeks.
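The Shared Libraries rules described above (share everything, only photos of certain people, or only photos from a certain date forward) can be modeled as a simple filter. This is an illustrative sketch, not Google's implementation; the photo records and field names are invented for the example.

```python
from datetime import date

def should_share(photo, people=None, since=None):
    """Return True if a photo matches the owner's sharing rules.

    people=None and since=None means "share all"; otherwise a photo is
    shared only if it contains one of the named people and was taken on
    or after the cutoff date.
    """
    if people is not None and people.isdisjoint(photo["people"]):
        return False
    if since is not None and photo["taken"] < since:
        return False
    return True

library = [
    {"people": {"Eva", "Lilly"}, "taken": date(2017, 5, 17)},
    {"people": {"Dave"}, "taken": date(2017, 5, 13)},
    {"people": {"Eva"}, "taken": date(2016, 1, 1)},
]

# Share only photos of the kids taken from a certain date forward.
shared = [p for p in library
          if should_share(p, people={"Eva", "Lilly"}, since=date(2017, 1, 1))]
print(len(shared))  # 1
```

The same predicate, applied automatically to each new photo as it is backed up, is all the "autosave photos of specific people" behavior needs on the recipient's side.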
Finally, we know sharing doesn't always happen through apps and screens. There's still something pretty special about looking at and even gathering around an actual printed photo. But printing photos and albums today is hard. You have to hunt across devices and accounts to find the right photos, select the best among the duplicates and blurry images, upload them to a printing service, and then arrange them across dozens of pages. It can take hours of sitting in front of a computer just to do one thing. Thankfully, our machine learning in Google Photos already does most of this work for you, and today, we're bringing it all together with the launch of Photo Books. [APPLAUSE] They're beautiful, high quality with a clean and modern design, but the best part is that they're incredibly easy to make, even on your phone. What used to take hours now only takes minutes. I recently made a book for Jess on Mother's Day. And let me show you just how easy and fast that was. First, thanks to unlimited storage, all my life's moments are already here in Google Photos. No need to upload them to another website or app.
So I'll select a bunch of photos here. And the good news is I don't have to figure out which are the right photos and which are the good ones because this is where Google Photos really shines. I'm just going to go ahead and hit plus. Select Photo book. I'm going to pick a hardcover book. We offer both a softcover and a hardcover. And notice what happens. Google Photos is going to pick the best photos for me, automatically suggesting photos– 40, in this case. [APPLAUSE] How awesome is that? And it's even going to go ahead and lay them all out for me. All that's left for me to do is make a couple of tweaks, check out, and in a few days, I'll end up with one of these beautiful printed photo books. [APPLAUSE] And soon, we'll make it even easier to get started, applying machine learning to create personalized photo books you'll love.
So when you go to Photo Books from the menu, you'll see pre-made books tailored just for you. Your trip to the Grand Canyon, time with your family during the holidays, or your pet, or even your kids' artwork, all easily customizable. We'll even notify you when there are new Photo Book suggestions. AUDIENCE: [INAUDIBLE] ANIL SABHARWAL: Photo Books are available today in the US on photos.google.com, and they'll be rolling out on Android and iOS next week, and will be expanding to more countries soon.
[APPLAUSE] I am really excited about this launch, and I want all of you to be the first to try it out. And that's why everyone here at I/O will be receiving a free hardcover photo book. [APPLAUSE] It's a great example of machine learning at work. AUDIENCE: [? $10? ?] Take that photo [INAUDIBLE] ANIL SABHARWAL: So those are the three big updates related to sharing in Google Photos. Suggested Sharing, Shared Libraries, and Photo Books. Three new features built from the ground up with AI at their core. I can't wait for all of you to try them out real soon. Now before I go, I want to touch on what Sundar mentioned earlier, which is the way we're taking photos is changing.
Instead of the occasional photo with friends and family, we now take 30 identical photos of a sunset. We're also taking different types of photos, not just photos to capture personal memories, but as a way to get things done– whiteboards we want to remember, receipts we need to file, books we'd like to read. And that's where Google Lens and its vision-based computing capabilities come in. It can understand what's in an image and help you get things done. Scott showed how Google Lens and the Assistant can identify what you're looking at and help you on the fly. But what about after you've taken the photo? There are lots of photos you want to keep, and then look back on later to learn more and take action.
And for that, we're bringing Google Lens right into Google Photos. Let me show you. So let's say you took a trip to Chicago. There's some beautiful architecture there. And during your boat tour down the Chicago River, you took lots of photos, but it's hard to remember which building is which later on. Now, by activating Lens, you can identify some of the cool buildings in your photos, like the second tallest skyscraper in the US, Willis Tower. You can even pull up directions and get the hours for the viewing deck. And later, while visiting the Art Institute, you might take photos of a few paintings you really love. In one tap, you can learn more about the painting and the artist. And the screenshot that your friend sent you of that bike rental place? Just activate Lens, and you can tap the phone number and make the call right from the photo.
[APPLAUSE] Lens will be rolling out in Google Photos later this year, and we'll be continually improving the experience so it recognizes even more objects and lets you do even more with them. And those are the updates for Google Photos. [CHEERING AND APPLAUSE] Now, let's see what's next from YouTube. [MUSIC PLAYING] SUSAN WOJCICKI: All right. Good morning, everyone. I am thrilled to be here at my first ever I/O on behalf of YouTube. [APPLAUSE] Thank you. So that opening video that we all just saw, that's a perfect glimpse into what makes YouTube so special– the incredible diversity of content. A billion people around the globe come to YouTube every month to watch videos from new and unique voices. And we're hard at work to make sure that we can reach the next billion viewers, which you'll hear about in a later I/O session today.
We want to give everyone the opportunity to watch the content on YouTube. So, YouTube is different from traditional media in a number of ways. First of all, YouTube is open. Anyone in the world can upload a video that everyone can watch. You can be a vlogger broadcasting from your bedroom, a gamer live streaming from your console, or a citizen journalist documenting events live from your phone on the front lines. And what we've seen is that openness leads to important conversations that help shape society, from advancing LGBTQ rights to highlighting the plight of refugees, to encouraging body positivity. And we've seen in our numbers that users really want to engage with this type of diverse content.
We are proud that last year we passed a billion hours a day being watched on YouTube, and our viewership is not slowing down. The second way that YouTube is different from traditional media is that it's not a one-way broadcast. It's a two-way conversation. Viewers interact directly with their favorite creators via comments, mobile live streaming, fan polls, animated GIFs, and VR. And these features enable viewers to come together, and to build communities around their favorite content. So one of my favorite stories of a YouTube community is the e-NABLE network. A few years ago, an engineering professor named Jon Schull saw a YouTube video about a carpenter who had lost two of his fingers. The carpenter worked with a colleague for over a year to build an affordable 3D-printed prosthesis that would enable him to go back to work.
They then applied this technology for a young boy who was born without any fingers. So inspired by this video, the professor posted a single comment on the video asking for volunteers with 3D printers to help print affordable prostheses. The network has since grown into a community of over 6,000 people who have designed, printed, and distributed these prosthetics to children in over 50 countries. [APPLAUSE] So today, thousands of children have regained the ability to walk, touch, play, all because of one video, one comment, and the incredible YouTube community that formed to help. And that's just one example of the many passionate communities that are coming together on YouTube around video. So, the third feature of this new medium is that video works on-demand on any screen. Over 60% of our watchtime now comes from mobile devices. But actually our fastest growing screen isn't the one in your pocket. It's the one in your living room. Our watchtime in our living room is growing at over 90% a year. So, let's now welcome Sarah Ali, Head of Living Room Products, to the stage to talk about the latest features in the living room.
[MUSIC PLAYING] [APPLAUSE] SARAH ALI: Thank you, Susan. So earlier today, you heard from Rishi about how people are watching YouTube on the TV via the Assistant. But another way people are enjoying video is through the YouTube app, which is available on over half a billion smart TVs, game consoles, and streaming devices. And that number continues to grow around the world. So, when I think about why YouTube is so compelling in the living room, it isn't just about the size of the screen. It's about giving you an experience that TV just can't match. First, YouTube offers you the largest library of on-demand content. Second, our recommendations build channels and lineups based on your personal interests, and what you enjoy watching. And third, it's a two-way interactive experience with features like voice control.
And today, I'm super excited to announce that we're taking the interactive experience a step further by introducing 360 video in the YouTube app on the big screen. And you know that you can already watch 360 videos on your phone or in your Daydream headset. But soon, you'll be able to feel like you're in the middle of the action, right from your couch, and on the biggest screen you own. Now, one of my personal interests outside of work is to travel. And one place I'd love to visit is Alaska to check out the Northern Lights. So, let's do a voice search. Aurora Borealis 360. Great. Let's choose that first video. And now, using my TV remote, I'm able to pan around this video, checking out this awesome view from every single angle. Traveling is great, especially when I don't have to get on a flight, but 360 is now a brand-new way to attend concerts. I didn't make it to Coachella, but here I can experience it like I was on stage.
And to enhance the experience even further, we are also introducing live 360 in the living room. Soon, you'll be able to witness moments and events as they unfold in a new, truly immersive way. So whether you have a Sony Android TV, or an Xbox One console, soon, you'll be able to explore 360 videos right from the comfort of your couch and along with your friends and family. And now, to help show you another way we're enabling interactivity, please join me in welcoming Barbara MacDonald, who's the lead of something we call Super Chat. [MUSIC PLAYING] [APPLAUSE] BARBARA MACDONALD: Good morning I/O, and to everybody on the live stream.
As Susan mentioned, what makes YouTube special is the relationships that creators are able to foster with their fans. And one of the best ways to connect with your fans is to bring them live, behind the scenes of your videos, offering up can't-miss content. In the past year, the number of creators live streaming on YouTube has grown by 4x. This growth is awesome, and we want to do even more to deepen the connection between creators and their fans during live streams. That's why earlier this year, we rolled out a new feature called Super Chat. When a creator is live streaming, fans can purchase Super Chats, which are highlighted, fun chat messages. Not only do fans love the recognition, but creators earn extra money from it. In the past three months since launch, we've been amazed by the different ways creators are using Super Chat. Even April, our favorite pregnant giraffe, who unfortunately could not be here with us today, has raised tens of thousands of dollars for her home, the Animal Adventure Park.
But, OK. [CLAPPING] OK, we can clap for that. [APPLAUSE] [LAUGHS] But enough talking from me. We are going to do a live stream right here, right now, to show all of you how Super Chat works. And to help me, I am very excited to introduce top YouTube creators with 9 million subscribers and over 1 billion lifetime channel views. On the grass back there, The Slow Mo Guys! [CHEERING AND APPLAUSE] GAVIN FREE: Hello, everyone. DANIEL GRUCHY: Wow, hey. Happy to be here. How's it going? BARBARA MACDONALD: It's great to have you. So let's pull up their live stream. And just look. Chat is flying. Now, I love The Slow Mo Guys, and I want to make sure that they see my message, so I'm going to Super Chat them. Pulled up the stream. And right from within live chat, I am able to enter my message, select my amount, make the purchase, and send.
Boom. See how much that message stands out? And it gets to the top. It's cool, right? DANIEL GRUCHY: Yeah, thanks, Barbara. It's actually lovely at the minute. Although, I feel like there's a high chance of showers. GAVIN FREE: Very local showers, like, specifically to this stage. DANIEL GRUCHY: Very sudden. Yeah. BARBARA MACDONALD: Ooh, I wonder. I wonder. Well, because we know developers are incredibly creative, we wanted to see what you can do to make Super Chat even more interactive. So we've launched an API for it.
And today, we're taking it to the next level with a new developer integration that triggers actions in the real world. This means that when a fan sends a Super Chat to a creator, things can happen in real life, such as turning the lights on or off in the creator's studio, flying a drone around, or pushing buttons on their toys and gadgets. The Slow Mo Guys are going to create their next slow motion video using Super Chat's API. We have now rigged things up so that when I send my next Super Chat, it will automatically trigger the lights and a big horn in this amphitheater, OK? And that is going to signal our friends back there on the lawn to unleash a truckload of water balloons at The Slow Mo Guys. GAVIN FREE: I'm scared. [CHEERING] DANIEL GRUCHY: Yeah. BARBARA MACDONALD: Yeah. [LAUGHS] DANIEL GRUCHY: That's right. For every dollar, we're going to take another balloon. So, more money means more balloons. Although, I did hear a guy over here go, oh, we're going to really nail these guys.
All right, that's going to be at least $4 right there. So, yeah. Each dollar donated goes to the cause Susan mentioned earlier, the e-NABLE network. BARBARA MACDONALD: OK, so, how much do you think we can send? I can start at $1 and go anywhere upwards from there. So, it's for charity. How do we think– $100. How's that sound? AUDIENCE: More. BARBARA MACDONALD: OK, higher, higher. $200? $200? GAVIN FREE: How about $500 for 500 balloons? BARBARA MACDONALD: $500? I can do that. I can do that. OK. So I'm going to send my Super Chat and hit Send. $500. Boom. [HORN BLOWS] DANIEL GRUCHY: Oh! Balloons, oh [INAUDIBLE] god! Agh! BARBARA MACDONALD: [LAUGHS] DANIEL GRUCHY: Ugh. Yep. All right. All right. BARBARA MACDONALD: Keep going. Keep going. DANIEL GRUCHY: Oh! BARBARA MACDONALD: It's 500. DANIEL GRUCHY: It's finished. It's finished. GAVIN FREE: It never ends, ah! DANIEL GRUCHY: Ah! [INAUDIBLE] BARBARA MACDONALD: That was amazing.
Thank you, everybody, for your help. So this obviously just scratches the surface of what is possible using Super Chat's open APIs. And we are super excited to see what all of you will do with it next. So Susan, how about you come back out here, and let's check out the video we've all made. [VIDEO PLAYBACK] [MUSIC PLAYING] [APPLAUSE] BARBARA MACDONALD: [LAUGHS] AUDIENCE: [? Yeah, guys! ?] BARBARA MACDONALD: Wow.
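The open API behind this demo is real: the YouTube Data API v3 exposes a `superChatEvents.list` method whose events report the purchase amount in micros of the currency. What follows is only a sketch of the trigger side; the `launch` hardware hook and the one-balloon-per-dollar rule (taken from the demo) are assumptions, and the API polling itself is omitted.

```python
# Sketch of a Super Chat real-world trigger. The superChatEvents resource
# and its snippet.amountMicros field come from the YouTube Data API v3;
# the balloon launcher and the one-balloon-per-dollar rule are assumptions
# for illustration.

def balloons_for(amount_micros: int) -> int:
    """One balloon per whole dollar; the API reports amounts in micros."""
    return amount_micros // 1_000_000

def handle_super_chat(event: dict, launch) -> None:
    """Fire the real-world action for a single Super Chat event."""
    launch(balloons_for(int(event["snippet"]["amountMicros"])))

# Barbara's $500 Super Chat would release 500 balloons:
launched = []
handle_super_chat({"snippet": {"amountMicros": 500 * 1_000_000}},
                  launch=launched.append)
print(launched)  # [500]
```

In a live integration, `handle_super_chat` would be called for each new event returned by polling `superChatEvents.list`, with `launch` wired to lights, a horn, or a drone.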
[APPLAUSE] SUSAN WOJCICKI: Thank you, Slow Mo Guys. Thank you, Barbara. I'm really happy to announce that YouTube is going to match The Slow Mo Guys' Super Chat earnings from today 100x to make sure that we're supplying prosthetics to children in need around the world. [APPLAUSE] So that 360 living room demo and the Super Chat demo– those are just two examples of how we are working to connect people around the globe together with video.
Now, I hope that what you've seen today is that the future of media is a future of openness and diversity. A future filled with conversations, and community. And a future that works across all screens. Together with creators, viewers, and partners, we are building the platform of that future. Thank you, I/O, and please– [APPLAUSE] Please welcome Dave Burke, joining us to talk about Android. [CHEERING AND APPLAUSE] [VIDEO PLAYBACK] [MUSIC – JACKIE WILSON, "HIGHER AND HIGHER"] [BUZZING] [CHEERING] [SATELLITE BEEPS] – Yay! Woohoo! [FIREWORKS LAUNCHING] – Yay! Woohoo! [FIREWORKS LAUNCHING] [END PLAYBACK] [CHEERING AND APPLAUSE] DAVE BURKE: Hi, everybody. It's great to be here at Google I/O 2017. As you can see, we found some new ways to hardware accelerate Android.
This time, with jet packs. But seriously, 2 billion active devices is incredible. And that's just smartphones and tablets. We're also seeing new momentum in areas such as TVs, and cars, and watches, and laptops, and beyond. So let me take a moment and give you a quick update on how Android is doing in those areas. Android Wear 2.0 launched earlier this year with a new update for Android and iPhone users. And with new partners like Emporio Armani, Movado, and New Balance, we now enable 24 of the world's top watch brands. Android Auto. We've seen a 10x user growth since last year. It's supported by more than 300 car models and the Android Auto mobile app. And just this week, Audi and Volvo announced that their next generation nav systems will be powered by Android for a more seamless, connected car experience. Android TV. We partnered with over 100 cable operators and hardware manufacturers around the world. And we're now seeing 1 million device activations every two months.
And there are more than 3,000 Android TV apps in the Play Store. This year, we're releasing a brand-new launcher interface, and bringing the Google Assistant to Android TV. Android Things previewed late last year, and already there are thousands of developers in over 60 countries using it to build connected devices with easy access to the Google Assistant, TensorFlow, and more. The full launch is coming later this year. Chromebooks comprise almost 60% of K-12 laptops sold in the US, and the momentum is growing globally. And now, with the added ability to run Android apps, you get to target laptops, too. Now, of course, platforms are only as good as the apps they run. The Google Play ecosystem is more vibrant than ever. Android users installed a staggering 82 billion apps and games in the past year. That's 11 apps for every person on the planet. All right, so let's come back to smartphones. And the real reason I'm here is to talk about Android O. Two months ago, we launched our very first developer preview. So you could kick the tires on some of the new APIs.
And of course, it's very much a work in progress, but you can expect the release later this summer. Today, we want to walk you through two themes in O that we're excited about. The first is something called Fluid Experiences. It's pretty incredible what you can do on a mobile phone today, and how much we rely on them as computers in our pockets. But there are still certain things that are tough to do on a small screen, so we're doing a couple of features in O that we think will help with this, which I'll cover in just a moment. The second theme is something we call Vitals. And the concept here is to keep vital system behavior in a healthy state so we can maximize the user's battery, performance, and reliability. So let's jump straight in and walk through four new fluid experiences, with live demos, done wirelessly.
What could possibly go wrong? [LAUGHTER] All right. These days, we do a lot of [? wants ?] on our phones, whether it's paying for groceries while reading a text message you just received, or looking up guitar chords while listening to a new song. But conventional multi-window techniques don't translate well to mobile. They're just too fiddly to set up when you're on the go. We think Picture-in-Picture is the answer for many cases. So let's take a look.
My kids recently asked me to build a lemonade stand. So I opened up YouTube, and I started researching DIY videos. And I found this one. Now, at the same time, I want to be able to jot down the materials I need to build this lemonade stand. So to multitask, all I do is press the Home button, and boom, I get Picture-in-Picture.
You can think of it as a kind of automatic multi-window. I can move it out of the way, I can launch Keep, I can add some more materials. So I know I need to get some wood glue, like so. Then when I'm done, I just simply swipe it away like that. It's brilliant. Picture-in-Picture lets you do more with your phone. It works great when video calling with Duo.
For example, maybe I need to check my calendar while planning a barbecue with friends. And there are lots of other great use cases. For example, Picture-in-Picture for Maps navigation, or watching Netflix in the background, and a lot more. And we're also excited to see what you come up with for this feature. We're also making notification interactions more fluid for users. From the beginning, Android has really blazed a trail when it comes to its advanced notification system. In O, we're extending the reach of notifications with something we call Notification Dots. It's a new way for app developers to indicate that there's activity in their app, and to drive engagement. So take a look. You'll notice that the Instagram app icon has a dot in it. And this is indicating that there's a notification associated with the app. So if I pull down the shade, sure enough, you can see there's a notification. In this case, someone's commented on a photo I'm tagged in. What's really cool is I can long press the app icon, and we now show the notification in place.
One of the things I really like about the Notification Dot mechanism is that it works with zero effort from the app developer. We even extract the color of the dot from your icon. Oh, and you can clear the dot by simply swiping the notification away like that. So you're always in control. Another great feature in O that helps make your experience more fluid is Autofill. Now, if you use Chrome, you're probably already familiar with Autofill for quickly filling out a username and password, or credit card information with a single tap. With O, we've extended Autofill to apps. Let's say I'm setting up a new phone for the first time, and I open Twitter.
And I want to log in. Now, because I use twitter.com all the time on Chrome, the system will automatically suggest my username. I can simply tap it. I get my password. And then, boom. I'm logged in. It's pretty awesome. [APPLAUSE] Autofill takes the pain out of setting up a new phone or tablet. Once the user opts in, Autofill will work for most applications. We also provide APIs for developers to customize Autofill for their experience. I want to show you one more demo of how we're making Android more fluid by improving copy and paste. The feature is called Smart Text selection.
So let's take a look. In Android, you typically long press or double tap a word to select it. For example, I can open Gmail. I can start composing. If I double tap the word "bite," it gets selected like so. Now, we know from user studies that phone numbers are the most copy-and-pasted items. The second most common are named entities like businesses, and people, and places. In O, we're applying on-device machine learning– in this case, a feedforward neural network– to recognize these more complicated entities. So watch this. I can double tap anywhere on the phrase, "Old Coffee House," and all of it is selected for me. No more fiddling around with text selection handles. [APPLAUSE] It even works for addresses. So if I double tap on the address, all of it is selected. And what's more– [APPLAUSE] There is more. What's more is the machine learning model classifies this as an address and automatically suggests Maps. So I can get directions to it with a single click. And of course, it works as you'd expect for phone numbers. You get the phone dialer suggested.
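Smart Text Selection's actual on-device model isn't shown in the talk; as a rough illustration of the behavior described above, here is a toy Python sketch that expands a double-tap into the longest known entity phrase containing it. The `ENTITIES` table, the `smart_select` helper, and the sample strings are all invented for this sketch; the real feature uses a trained neural network, not a lookup table.

```python
# Toy sketch of smart text selection: expand a double-tapped character
# position to the longest known entity phrase that contains it.
# Android's real feature uses an on-device feedforward neural network;
# the hand-made gazetteer below is a stand-in for illustration only.

ENTITIES = {
    "old coffee house": "business",
    "1600 amphitheatre parkway": "address",
}

def smart_select(text, tap_index):
    """Return (selected_phrase, entity_type) for a tap at character tap_index."""
    lowered = text.lower()
    for phrase, kind in ENTITIES.items():
        start = lowered.find(phrase)
        while start != -1:
            end = start + len(phrase)
            if start <= tap_index < end:
                return text[start:end], kind
            start = lowered.find(phrase, end)
    # No entity covers the tap: fall back to the single word under it.
    left = text.rfind(" ", 0, tap_index) + 1
    right = text.find(" ", tap_index)
    right = len(text) if right == -1 else right
    return text[left:right], None

# Tapping inside "Coffee" selects the whole business name.
print(smart_select("Meet me at Old Coffee House today", 16))
```

The entity type returned alongside the span is what lets the system suggest Maps for addresses or the dialer for phone numbers, as in the demo.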
And for email addresses, you get Gmail suggested. All of this neural network processing happens on-device in real time, and without any data leaving the device. It's pretty awesome. Now, on-device machine learning helps to make your phone smarter. And we want to help you build experiences like what you just saw. So we're doing two things to help. First, I'm excited to announce that we're creating a specialized version of TensorFlow, Google's open source machine learning library, which we call TensorFlow Lite. It's a library for apps designed to be fast and small, yet still enabling state-of-the-art techniques like convnets and LSTMs. Second, we're introducing a new framework in Android to hardware-accelerate neural computation. TensorFlow Lite will leverage a new neural network API to tap into silicon-specific accelerators. And over time, we expect to see DSPs specifically designed for neural network inference and training.
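To make "feedforward neural network inference on-device" concrete, here is a minimal pure-Python forward pass through a tiny two-layer network. The weights and input are made up for illustration; TensorFlow Lite and the neural network API exist precisely to run this kind of computation efficiently on mobile silicon.

```python
import math

# Minimal feedforward network forward pass -- the kind of computation
# TensorFlow Lite and the neural network API accelerate on-device.
# All weights below are made-up; a real model would be trained offline.

def relu(v):
    return [max(0.0, x) for x in v]

def dense(x, weights, bias):
    # weights: one row of input weights per output unit
    return [sum(wi * xi for wi, xi in zip(row, x)) + b
            for row, b in zip(weights, bias)]

def forward(x):
    # hidden layer: 2 inputs -> 2 units, ReLU activation
    h = relu(dense(x, [[1.0, -1.0], [0.5, 0.5]], [0.0, 0.1]))
    # output layer: 2 units -> 1 logit
    out = dense(h, [[1.0, 2.0]], [0.0])
    # squash the logit to a probability with a sigmoid
    return 1.0 / (1.0 + math.exp(-out[0]))

score = forward([0.2, 0.7])
```

Every operation here is a handful of multiply-adds, which is why dedicated accelerators and quantized kernels can speed these models up so dramatically on a phone.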
We think these new capabilities will help power our next generation of on-device speech processing, visual search, augmented reality, and more. TensorFlow Lite will soon be part of that open source TensorFlow project, and the neural network API will be made available later in an update to O this year. OK, so that's a quick tour of some of the fluid experiences in O.
Let's switch gears and talk about Vitals. So to tell you more, I want to hand it over to Steph, who's been instrumental in driving this project. Thank you. [MUSIC PLAYING] STEPHANIE SAAD CUTHBERTSON: Hi, everyone. OK, so all the features Dave talked about are cool. But we think your phones' foundations are even more important– battery life, security, startup time, and stability. After all, if your battery dies at 4:00 PM, none of the other features that Dave talked about really matter. So in O, we're investing in what we call Vitals, keeping your phone secure and in a healthy state to maximize power and performance. We've invested in three foundational building blocks– security enhancements, OS optimizations, and tools to help developers build great apps. First, security. Android was built with security in mind from day one with application sandboxing. As Android has matured, we've developed vast mobile security services. Now, we use machine learning to continuously comb apps uploaded to Play, flagging potentially harmful apps.
We scan over 50 billion apps every day, checking every installed app on every connected device. And when we find a potentially harmful app, we disable it or remove it. And we found most Android users don't know these services come built in on Android devices with Play. So for greater peace of mind, we're making them more visible and accessible, and doubling down on our commitment to security, with the introduction of Google Play Protect. [APPLAUSE] So here, you can see Play Protect has recently scanned all your apps. No problems found. That's Google Play Protect.
It's available out of the box on every Android device with Google Play. Second, OS optimizations. The single biggest visible change in O is boot time. On Pixel, for example, you'll find, in most cases, your boot time is now twice as fast. And we've made all apps faster by default. We did this through extensive changes to our runtime. Now, this is really cool stuff, like concurrent compacting garbage collection and code locality. But all you really need to know is that your apps will run faster and smoother. Take Google Sheets– aggregate performance over a bunch of common actions is now over two times as fast. And that's all from the OS. There are no changes to the app. But we found apps could still have a huge impact on performance. Some apps were running in the background, and they were consuming tons of system resources, especially draining battery.
So in O, we're adding Wise Limits to background location and background execution. These boundaries put sensible limits on usage. They're protecting battery life and freeing up memory. Now, our third theme is helping developers build great apps. And here, I want to speak directly to all the developers in the audience. Wouldn't it be cool if Android's engineering team could show you what causes performance issues? Today, we've launched Play Console Dashboards that analyze every app and pinpoint six top issues that cause battery drain, crashes, and slow UI. For each issue the app has, we show how many users are affected and provide guidance on the best way to fix. Now, imagine if developers could also have a powerful profiler to visualize what's happening inside the app. In Android Studio, we've also launched new unified profiling tools for network, memory, and CPU. So, developers can now see everything on a unified timeline, and then dive into each profiler. For example, on CPU, you can see every thread.
You can look at the call stack, and the time every call is taking. You can visualize where the CPU is going. And you can jump to the exact line of code. OK, so that's Android Vitals. [APPLAUSE] How we're investing in your phone's foundational security and performance. Later today, you'll see Android's developer story from end to end. Our hard work to help developers build great apps at every stage– writing code, tuning, launching, and growing.
But there is one more thing. One thing we think would be an incredible complement to the story. And it is one thing our team has never done for developers. We have never added a new programming language to Android. And today, we're making Kotlin an officially supported language in Android. [APPLAUSE] So, Kotlin– Kotlin is a language our developer community has already asked for. It makes developers so much more productive. It is fully Android runtime compatible. It is totally interoperable with your existing code. It has fabulous IDE support. And it's mature and production ready from day one. We are also announcing our plans to partner with JetBrains, creating a foundation for Kotlin. I am so happy JetBrains CEO, Max Shafirov, is here today. [APPLAUSE] This new language is wonderful, but we also thought we should increase our investment in our existing languages. So we're doing that, too. Please join us at the developer keynote later today to hear our story from end to end.
OK, so let's wrap up. There are tons more features in Android O, which we don't have time to go into today. Everything from redesigned Settings, to Project Treble, which is one of the biggest changes to the foundations of Android to date, to downloadable fonts with new emoji, and much more. If you want to try some of these features for yourself– and you do– I'm happy to announce we're making the first beta release of O available today. Head over to android.com/beta. [APPLAUSE] But there's more. [LAUGHS] You probably thought we were done talking about Android O, but, I'd like you to hear some more about Android. And from that, please welcome Sameer. Thank you. [MUSIC PLAYING] [APPLAUSE] SAMEER SAMAT: Thanks, Steph. Hi, everyone. From the beginning, Android's mission has been to bring the power of computing to everyone. And we've seen tremendous growth over the last few years, from the high end to entry-level devices, in countries like Indonesia, Brazil and India. In fact, there are now more users of Android in India than there are in the US.
And every minute, seven Brazilians come online for the first time. Now, all this progress is amazing. For those of us who have a smartphone, we intuitively understand the profound impact that computing is having on our daily lives. And that's why our team gets so excited about how we can help bring this technology to everyone. So we took a step back to think about what it would take to get smartphones to more people. There are a few things that are clear. Devices would need to be more affordable, with entry-level prices dropping significantly. This means hardware that uses less powerful processors and far less memory than premium devices. But the hardware is only half the equation. The software also has to be tuned for users' needs around limited data connectivity and multilingual use. We learned a lot from our past efforts here with Project Svelte and KitKat, and the original Android One program.
But we felt like the time was right to take our investment to the next level. So today, I'm excited to give you a sneak peek into a new experience we're building for entry-level Android devices. Internally, we call it Android Go. Android Go focuses on three things. First, optimizing the latest release of Android to run smoothly on entry-level devices, starting with Android O. Second, a rebuilt set of Google Apps that use less memory, storage space, and mobile data. And third, a version of the Play Store that contains the whole app catalog, but highlights the apps designed by all of you for the next billion users. And all three of these things will ship together as a single experience starting on Android O devices with 1 gigabyte or less of memory.
Let's take a look at some of the things we're working on for Android Go. First, let's talk about the operating system. For manufacturers to make more affordable entry-level devices, the prices of their components have to come down. Let's take one example. Memory is an expensive component. So we're making a number of optimizations to the system UI and the kernel to allow an Android O device built with the Go configuration to run smoothly with as little as 512 megabytes to 1 gigabyte of memory.
Now on-device performance is critical, but data costs and intermittent connectivity are also big challenges for users. One person put it best to me when she said, mobile data feels like currency. And she wanted more control over the way she spent it. So on these devices, we're putting data management front and center in Quick Settings. And we've created an API that carriers can integrate with, so you can see exactly how much prepaid data you have left, and even top up right there on the device. But beyond the OS, the Google Apps are also getting smarter about data. For example, on these devices, the Chrome Data Saver feature will be turned on by default. Data Saver transcodes content on the server and simplifies pages when you're on a slow connection. And, well, now we're making the savings more visible here in the UI.
In aggregate, this feature is saving users over 750 terabytes of data every day. I'm really excited that the YouTube team has designed a new app called YouTube Go for their users with limited data connectivity. Feedback on the new YouTube app has been phenomenal, and we're taking many of the lessons we've learned here and applying them to several of our Google Apps. Let me show you some of the things I love about YouTube Go. First, there's a new preview experience, so you can get a sneak peek inside a video before you decide to spend your data to watch it. And when you're sure this is the video for you, you can select the streaming quality you want, and see exactly how much mobile data that's going to cost you. But my favorite feature of YouTube Go is the ability to save videos while you're connected. So you can watch them later when you might not have access to data. And if you want to share any of those videos with a friend, you can use the built-in peer-to-peer sharing feature to connect two of your devices together directly, and share the files across without using any of your mobile data at all.
[APPLAUSE] But beyond data management, the Google Apps will also make it easier to seamlessly go between multiple languages, which is a really common use case for people coming online today. For example, Gboard now supports over 191 languages, including the recent addition of 22 Indian languages. And there's even a transliteration feature, which allows you to spell words phonetically on a QWERTY keyboard to type in your native language script. Now, Gboard is super cool, so I want to show it to you. I grew up in the US, so for any of my family that's watching, don't get too excited by the demo. I haven't learned Hindi yet. And I'm sorry, mom, OK? [LAUGHTER] So let's say, I want to send a quick note to my aunt in India. I can open up Allo, and using Gboard, I can type how it sounds phonetically. [HINDI], which means "how are you" in Hindi. And transliteration automatically gives me Hindi script. That's pretty cool. Now, let's say I want to ask her how my I/O speech is going, but I don't know how to say that in Hindi at all.
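The transliteration feature described above can be illustrated with a toy greedy longest-match scheme. The tiny syllable table and the `transliterate` function are invented for this sketch; Gboard's real transliteration is learned from data and handles vastly more of the language than a hand-made table ever could.

```python
# Toy transliteration: greedy longest-match from a phonetic Latin
# spelling to Devanagari script. This tiny hand-made table is for
# illustration only -- Gboard's real transliteration is ML-driven.

SYLLABLES = {
    "na": "न",
    "ma": "म",
    "ste": "स्ते",
    "te": "ते",
}

def transliterate(word):
    out, i = [], 0
    while i < len(word):
        # try the longest syllable first, then shorter ones
        for length in (3, 2, 1):
            chunk = word[i:i + length]
            if chunk in SYLLABLES:
                out.append(SYLLABLES[chunk])
                i += length
                break
        else:
            out.append(word[i])  # pass unknown characters through
            i += 1
    return "".join(out)

print(transliterate("namaste"))  # "नमस्ते"
```

The greedy longest-match step matters: "ste" must win over "te" so the conjunct consonant comes out right, which hints at why real transliteration needs a model rather than a table.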
I can use the built-in Google Translate feature to say, "how is this going?" And seamlessly, I get Hindi script, all built right into the keyboard. [APPLAUSE] My family is apparently a tough audience. All right. Well, the Google Apps are getting Go-ified, but what has always propelled Android forward is the apps from all of you. And no surprise, many of our developer partners have optimized their apps already.
So to better connect users with these experiences, we'll be highlighting them in the Play Store. One example is right here on Play's home page. To be eligible for these new sections, we published a set of best practices called "Building for Billions," which includes recommendations we've seen make a big difference in the consumer experience. Things such as designing a useful offline state, reducing your APK size to less than 10 megabytes, and using GCM or JobScheduler for better battery and memory performance. And also in "Building for Billions," you'll find best practices for optimizing your web experience. We've seen developers build amazing things with new technologies, such as progressive web apps. And we hope you can come to our developer keynote later today to learn a whole lot more.
OK, that was a quick walkthrough of some of the things coming in Android Go. Starting with Android O, all devices with 1 gigabyte of RAM or less will get the Go configuration. And going forward, every Android release will have a Go configuration. We'll be unveiling much more later this year, with the first devices shipping in 2018. We look forward to seeing what you'll build, and how we can bring computing to the next several billion users. Next up– next up, you'll be hearing from Clay on one of Google's newest platforms that we're really excited about– VR and AR. Thank you. [APPLAUSE] [MUSIC PLAYING] CLAY BAVOR: Thank you, Sameer. So, Sundar talked about how technologies like machine learning and conversational interfaces make computing more intuitive by enabling our computers to work more like we do. And we see VR and AR in the same light. They enable us to experience computing just as we experience the real world. Virtual reality can be transporting. And you can experience not just what it's like to see someplace, but what it's like to really be there.
And augmented reality uses your surroundings as context, and puts computing into the real world. A lot has happened since Google I/O last year, and I'm excited to share a bit of what we've been up to. So let's start with VR. Last year, we announced Daydream, our platform for mobile virtual reality. And then in October, to kick-start the Daydream ecosystem, we released Daydream View, a VR headset made by Google. And it's super comfortable. It's really easy to use. And there's tons to do with it. You can play inside alternate worlds, and games like "Virtual Virtual Reality." And you can see any part of our world with apps like Street View.
And you can visit other worlds with apps like Hello Mars. There's already a great selection of Daydream phones out there, and we're working with partners to get Daydream on even more. First, I'm pleased that LG's next flagship phone, which launches later this year, will support Daydream. And there's another. I'm excited to announce that the Samsung Galaxy S8 and S8 Plus will add Daydream support this summer with a software update. [APPLAUSE] So, Samsung, of course, they make many of the most popular phones in the world.
And we're delighted to have them supporting Daydream. So great momentum in Daydream's first six months. Let's talk about what's next. So with Daydream, we showed that you can create high quality mobile VR experiences with just a smartphone and a simple headset. And there are a lot of nice things about smartphone VR. It's easy. There aren't a bunch of cables and things to fuss with. You can choose from a bunch of great compatible phones. And of course, it's portable. You can throw your headset in a bag. We asked: how do we take the best parts of smartphone VR and create a kind of device with an even better experience? Well, I'm excited to announce that an entirely new kind of VR device is coming to Daydream– what we call standalone VR headsets. And we're working with partners to make them. So what's a standalone headset? Well, the idea is that you have everything you need for VR built right into the headset itself. There's no cables, no phone, and certainly, no big PC. And the whole device is designed just for VR. And that's cool for a couple of reasons.
First, it's easy to use. Getting into VR is as easy as picking the thing up. And it's one step and two seconds. And second, presence. And by that, I mean really feeling like you're there. By building every part of the device specifically for VR, we've been able to optimize everything– the displays, the optics, the sensors– all to deliver a stronger sense of being transported. And nothing heightens the feeling of presence like precise tracking– how the headset tracks your movement. And we've dramatically improved tracking with the technology that we call WorldSense. So WorldSense enables what's known as positional tracking. And with it, your view in the virtual world exactly matches your movement in the real world. And it works by using a handful of sensors on the device that look out into your surroundings. And that means it works anywhere. There's no setup.
There's no cameras to install. And with it, you really feel like you're there. Now, just as we did with Daydream-ready smartphones, we're taking a platform approach with standalone headsets, working with partners to build some great devices. To start, we worked with Qualcomm to create a Daydream standalone headset reference design, a sort of device blueprint that partners can build from. And we're working closely with two amazing consumer electronics companies to build the first headsets. First, HTC, the company that created the VIVE. [APPLAUSE] We're excited about it, too. [CHEERING AND APPLAUSE] They're a leader in VR, and we're delighted to be working with them on a standalone VR headset for Daydream. And second, Lenovo. We've been partners for years, working together on Tango. And now, we're excited to work with them on VR.
These devices will start to come to market later this year. So that's the update on VR. Great momentum with apps, more Daydream-ready phones on the way, and a new category of devices that we think people are going to love. So let's turn to augmented reality. And a lot of us were introduced to the idea of AR last year with Pokemon GO. And the app gave us a glimpse of AR, and it showed us just how cool it can be to have digital objects show up in our world. Well, we've been working in this space since 2013 with Tango, a sensing technology that enables devices to understand space more like we do. Two years ago in 2015, we released a developer kit. And last year, we shipped the first consumer-ready Tango phone. And I'm excited to announce that the second generation Tango phone, the ASUS ZenFone AR will go on sale this summer.
Now, looking at the slides, you may notice a trend. The devices are getting smaller. And you can imagine far more devices having this capability in the future. It's been awesome to see what developers have done with the technology. And one thing we've seen clearly is that AR is most powerful when it's tightly coupled to the real world, and the more precisely, the better. That's why we've been working with the Google Maps team on a service that can give devices access to very precise location information indoors. It's kind of like GPS, but instead of talking to satellites to figure out where it is, your phone looks for distinct visual features in the environment, and it triangulates with those.
So you have GPS. We call this VPS, Google's Visual Positioning Service. And we think it's going to be incredibly useful in a whole bunch of places. For example, imagine you're at Lowe's, the home improvement store that has basically everything. And if you've been there, you know it's really big. And we've all had that moment when you're struggling to find that one, weird, random screwdriver thing. Well, imagine in the future, your phone could just take you to that exact screwdriver and point it out to you on the shelf. Turns out we can do this with VPS.
And let me show you how. And this is working today. So here we are walking down an aisle at Lowe's. And the phone will find these key visual feature points, which you can see there in yellow. By comparing the feature points against previously observed ones, those colorful dots in the back, the phone can figure out exactly where it is in space down to within a few centimeters. So GPS can get you to the door, and then VPS can get you to the exact item that you're looking for.
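As a rough sketch of the idea behind VPS described above (matching observed visual feature points against previously mapped ones to infer position), here is a toy nearest-neighbor matcher. The descriptors and aisle labels are invented; Google's actual pipeline involves triangulation and robust geometric estimation, not simple voting.

```python
import math

# Conceptual sketch of visual positioning: match the descriptors the
# phone currently sees against previously mapped descriptors, each
# tagged with a location, then vote for the best-supported location.
# All data here is made up; real VPS triangulates precise 3D poses.

# mapped descriptors: (feature_vector, location) pairs
MAPPED = [
    ((0.1, 0.9), "aisle 7"),
    ((0.2, 0.8), "aisle 7"),
    ((0.9, 0.1), "aisle 12"),
]

def nearest_location(descriptor):
    """Location of the mapped descriptor closest to the observed one."""
    return min(MAPPED, key=lambda m: math.dist(m[0], descriptor))[1]

def locate(observed_descriptors):
    """Vote across all observed descriptors and return the winner."""
    votes = {}
    for d in observed_descriptors:
        loc = nearest_location(d)
        votes[loc] = votes.get(loc, 0) + 1
    return max(votes, key=votes.get)

print(locate([(0.15, 0.85), (0.8, 0.2), (0.2, 0.9)]))  # "aisle 7"
```

Voting across many feature matches is what makes the estimate robust to the occasional bad match, which is the same intuition behind the centimeter-level accuracy claim.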
Further out– [APPLAUSE] Further out, imagine what this technology could mean to people with impaired vision, for example. VPS and an audio-based interface could transform how they make their way through the world. And it combines so many things that Google is good at– mapping, computer vision, distributed computing. And we think precise location will be critical for camera-based interfaces. So VPS will be one of the core capabilities of Google Lens. We're really excited about the possibilities here. So the last thing I wanted to share is something that we've been working on that brings many of these capabilities together in a really important area. And that's education. Two years ago, we launched Expeditions, which is a tool for teachers to take their classes on virtual reality field trips.
And 2 million students have used it. Today, we're excited to announce that we're adding a new capability to Expeditions– AR mode, which enables kind of the ultimate show-and-tell right in the classroom. If we could roll the video, please. [VIDEO PLAYBACK] – All right, who wants to see a volcano? 3, 2, 1. – Whoa! – Look at that lava. Look at that smoke coming out of that. Pretend you're an airplane and fly over the tornado. – That's the top of it. – What do you see? – It's either an asteroid, meteorite– – We're learning about DNA and genes– things that we can't see. And so, the most exciting thing for me with the AR technology was that I could see kids get an "aha" moment that I couldn't get by just telling them about it. – The minute I saw it pop up on the screen, it made me want to get up and walk to it. – You actually get to turn around and look at things from all angles, so it gave us a nice perspective.
– See if you can figure out what that might be based on what you know about the respiratory system. – I got to see where the alveoli branched off, and I could look inside them and see how everything worked, which I never saw before. And it was really, really cool. [END PLAYBACK] CLAY BAVOR: We're just delighted with the response we're seeing so far. And we'll be rolling this out later in the year. So, VR and AR, two different flavors of what you might call immersive computing– computing that works more like we do. We think that's a big idea. And in time, we see VR and AR changing how we work and play, live and learn.
And all that I talked about here, these are just the first steps. But we can see where all of this goes, and we're incredibly excited about what's ahead. Thanks so much. Back to Sundar. [APPLAUSE] [VIDEO PLAYBACK] – We wanted to make machine learning have an open source project so that everyone outside of Google could use the same system we're using inside Google. [MUSIC PLAYING] [END PLAYBACK] [APPLAUSE] SUNDAR PICHAI: It's incredible, with any open source platform, when you see what people can do on top of it. We're really excited about the momentum behind TensorFlow. It's already the most popular ML repository on GitHub. And we're going to push it further. We are also announcing the TensorFlow Research Cloud.
We are giving away 1,000 cloud TPUs, which is 180 petaflops of computing to academics and researchers for free so that they can do more stuff with it. I'm always amazed by the stories I hear from developers when I meet them. I want to highlight one young developer today, Abu Qader from Chicago. He has used TensorFlow to help improve health for everyone. Let's take a look. [VIDEO PLAYBACK] [MUSIC PLAYING] [CHATTER] – My name is Abu. I am a high school student. 17 years old. My freshman year, I remember Googling machine learning. I had no clue what it meant. That's a really cool thing about the internet, is that someone's already doing it, so you can just YouTube it, and [CLICK] it's right there. Within a minute, I really saw what machine learning can do. It kind of like hit something within me. This need to build things to help people.
My parents are immigrants from Afghanistan. It's not easy coming in. The only reason we made it through some of the times that we did was because people showed acts of kindness. Seeing that at an early age was enough for me to understand that helping people always comes back to you. [INAUDIBLE] – How are you? – And then it kind of hit me– a way where I could actually help people. Mammograms are the cheapest imaging format there is. It's the most accessible to people all around the world. But one of the biggest problems that we see in breast cancer is misdiagnosis. So I decided I was going to build a system for early detection of breast cancer tumors, that's accessible to everyone, and that's more accurate. How was I going to do it? Machine learning. The biggest, most extensive resource that I've used, is this platform called TensorFlow. And I've spent so many hours going really deep into these open source libraries and just figuring out how it works. Eventually, I wrote a whole system that can help radiologists make their decisions. All right. – Ready? – Yeah. I'm by no means a wizard at machine learning.
I'm completely self-taught. I'm in high school. I YouTubed and just found my way through it. You don't know about that kid in Brazil that might have a groundbreaking idea, or that kid in Somalia. You don't know that they have these ideas. But if you can open source your tools, you can give them a little bit of hope that they can actually conquer what they're thinking of.
[END PLAYBACK] [CHEERING AND APPLAUSE] Abu started this as a school project, and he's continued to build it on his own. We are very, very fortunate to have Abu and his family here with us today. [CHEERING AND APPLAUSE] Thank you for joining us. Enjoy I/O. We've been talking about machine learning in terms of how it will power new experiences and research. But it's also important we think about how this technology can have an immediate impact on people's lives by creating opportunities for economic empowerment. 46% of US employers say they face talent shortages and have issues filling open job positions, while job seekers may be looking for openings right next door.
There is a big disconnect here. Just like we focused our contributions to teachers and students through Google for Education, we want to better connect employers and job seekers through a new initiative, Google for Jobs. Google for Jobs is our commitment to use our products to help people find work. It's a complex, multifaceted problem, but we've been investing a lot over the past year, and we have made significant progress. Last November, we announced the Cloud Jobs API. Think of it as the first fully end-to-end, pre-trained, vertical machine learning model through Google Cloud, which we give to employers– FedEx, Johnson & Johnson, HealthSouth, CareerBuilder, and we're expanding to many more employers. So in Johnson & Johnson's career site, they found that applicants were 18% more likely to apply to a job, suggesting the matching is working more efficiently. And so far, over 4 and 1/2 million people have interacted with this API. But as we started working on this, we realized the first step for many people when they start looking for a job is searching on Google.
So, it's like other Search challenges we have worked on in the past. So we built a new feature in Search with a goal that no matter who you are or what kind of job you are looking for, you can find the job postings that are right for you. And as part of this effort, we worked hard to include jobs across experience and wage levels, including jobs that have traditionally been much harder to search and classify– think retail jobs, hospitality jobs, et cetera. To do this well, we have worked with many partners– LinkedIn, Monster, Facebook, CareerBuilder, Glassdoor, and many more. So let's take a look at how it works. Let's say you come to Google and you start searching for retail jobs. And you're from Pittsburgh. We understand that. You can scroll down and click into this immersive experience. And we immediately start showing the most relevant jobs for you. And you can filter. You can choose Full-time. And as you can see, you can drill down easily.
I want to look at jobs which are posted in the past three days. So you can do that. Now, you're looking at retail jobs in Pittsburgh, posted within the last three days. You can also filter by job titles. It turns out employees and employers use many different terminologies. For example, retail could mean a store clerk, a sales representative, a store manager. We use machine learning to cluster these automatically, so that we can bring all the relevant jobs to you. As you scroll through it, you will notice that we even show commute times. It turns out to be an important criterion for many people. And we'll soon add a filter for that as well.
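The job-title clustering idea just described can be illustrated with a toy keyword map. Everything here (the `CANONICAL` table, the sample titles, the matching logic) is invented for the sketch; the real system uses trained machine-learning models to learn which titles mean the same thing.

```python
# Toy illustration of clustering job titles that mean roughly the same
# thing, so that a search for "retail" surfaces all of them. The hand-made
# keyword map below stands in for Google's actual learned clustering.

CANONICAL = {
    "retail": {"store clerk", "sales representative", "store manager",
               "retail associate"},
    "teaching": {"teacher", "instructor", "tutor"},
}

def cluster_for(title):
    """Map a job title or query to its canonical cluster, if known."""
    title = title.lower()
    for cluster, members in CANONICAL.items():
        if title in members or cluster in title:
            return cluster
    return None

def matching_jobs(query, postings):
    """Return postings whose cluster matches the query's cluster."""
    want = cluster_for(query)
    return [p for p in postings if cluster_for(p) == want]

jobs = ["Store Clerk", "Tutor", "Sales Representative"]
print(matching_jobs("retail jobs", jobs))
```

The point of the sketch is the indirection: query and posting are compared through a shared cluster rather than by string match, which is why "retail" can find a "Sales Representative" posting.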
And if you find something that's of interest to you, maybe the retail position at Ross, you can click on it and go right to it. You can scroll to find more information if you want, and you're one click away from applying there. It's a powerful tool. We are addressing jobs of every skill level and experience level. And we are committed to making these tools work for everyone. As part of building it, we literally talked to hundreds of people. So whether you are in a community college looking for a barista job, a teacher who is relocating across the country and wants teaching jobs, or someone who is looking for work in construction, the product should do a great job of bringing that information to you. We are rolling this out in the US in the coming weeks, and then we are going to expand it to more countries in the future.
I'm personally enthusiastic about this initiative because it addresses an important need and taps our core capabilities as a company, from searching and organizing information to AI and machine learning. It's been a busy morning. We've talked about this important shift from a mobile-first to an AI-first world. And we're driving it forward across all our products and platforms so that all of you can build powerful experiences for new users everywhere.
It will take all of us working together to bring the benefits of technology to everyone. I believe we are on the verge of solving some of the most important problems we face. That's our hope. Let's do it together. Thanks for your time today, and enjoy Google I/O. [APPLAUSE] [MUSIC PLAYING].
So here's a new one for you, a smartphone with two screens. This is the Nubia X, an Android smartphone that comes with two built-in screens, one on the front and one on the back. You know we couldn't let this one slip by without a full-on durability test. Let's get started. [Intro] So supposedly, if you want to work this contraption, there are two fingerprint readers on the sides that, when gripped, allow whatever's displayed on the front screen to slip over onto the back panel while retaining full functionality. You can run the whole phone from back here. When the rear screen is turned off, it looks just like a normal phone.
Let's see what happens when we scratch test both screens. This Nubia X comes with a built-in screen protector, which is nice. I'll pull that off. The interesting thing with this phone is that there's no notch, and there are no front-facing cameras or sensors that need to be hidden, which allows the whole front of the phone to light up edge to edge as a display, with pretty much no bezels. As we see from the scratch test, the front is made with tempered glass, scratching at a level 6, with deeper grooves at a level 7. We've got to be fair to both screens though. I'll grip the sides and flip it around. The rear screen has a much lower refresh rate than the front screen, and has kind of a yellow tinge to it. It's also covered with a screen protector. I've found that the yellowness is just a blue light filter that can be turned off. I'll show you that in a second. The blue backing on the phone is a reflective film that acts as a two-way mirror, hiding the second screen when it's turned off.
And once again we get scratches at a level 6, with deeper grooves at a level 7. The rear screen is covered with the same glass. Part of what makes the front screen so bezel-less is how small this earpiece is. I almost missed it. It's barely the thickness of my razor blade. The sides of the phone sound and look like anodized aluminum. You can see the silver color shining through under the blue coating, even here on the blue power button. One possible weak point in the frame is this flattened portion where the fingerprint scanner resides.
It looks suspiciously like the design flaw we saw on the iPad Pro, and we know how that one turned out. This phone has 2 of them, one on either side. I'm kind of nervous. The top of the phone has more metal and what looks like an IR blaster up top for changing the channels on your TV. On the other side we have the SIM card tray, volume rocker, and another flattened fingerprint reader…because why have just one fingerprint reader when you can have two? Two is kind of the theme of this phone.
There are 2 SIM card slots as well, which pair nicely with the two screens. Just one USB-C port down here at the bottom, alongside the loudspeaker holes. You might be thinking to yourself, 'Jerry, why in the world would anyone want two screens on their phone?' Well, let me tell you. There are no front-facing cameras on the Nubia X, just the dual rear cameras: a 16 megapixel and a 24 megapixel. One regular and the other one for that portrait mode stuff. So when you want to take a selfie, you get to use the powerful rear-facing cameras while seeing your face at the same time. It's an interesting solution to the notch problem, but also a solution that actually works. I'm impressed with the ingenuity – I genuinely did not see this one coming. Like dual everything else on the phone, there are two fingerprint scanners. You can use one or set up both for added security. And even with the damage inflicted on the right side, the phone can still sense and unlock with my fingerprint. Not too shabby, and at least there's a backup if one ever fails. Now here's where things get interesting.
This front screen is a 6.2 inch 1080p IPS LCD, meaning that after about 10 seconds under the flame we see the pixels getting hot, turning off and going black, until the heat is removed, and then they slowly recover. They do recover completely though. Checking the back screen, though, is where we see one of the brilliant parts of the Nubia X design. An LCD screen, when turned on, has light shining through every single pixel, even the black ones. So the whole display lights up. An OLED or AMOLED screen does not have light shining through the black pixels, so you really can't tell when the screen is on or off because the blacks are so black and emit no light. The Nubia X is using a 5.1 inch 720p AMOLED screen on the back: an LCD on the front and AMOLED for the rear. You can see the pixels going white and not recovering. Both technologies on one single device – burn test justification. Checkmate on the haters.
The rear screen can act like an always-on display without showing the actual edges of the screen. This provides the aesthetic illusion that there's no screen at all on the back panel. Whether the screen is lit up or not, you can't really see the rectangular edges due to the super black AMOLED pixels. It's interesting how all these innovative smartphones keep popping up out of nowhere. No complaints from me. Here's a quick look at the rear display with the blue light filter turned off. It looks completely normal and is just as good quality as the front screen. I'm not sure why the Nubia X had that setting turned on right out of the box. I think it looks pretty good. All of these awesome gimmicks, though, mean nothing if the phone can't survive in your pocket for a few years. Here on my channel we put phones through years of abuse in just a few minutes. It's time for the bend test. The dual fingerprint design on both side rails might just be fatal.
With a solid flex from the front, we get minimal bend, but no catastrophic damage, creaks, or snaps. The rear screen looks fine. Even when bent from the front, the phone remains intact and fully functional. No complaints here. The Nubia X is a structurally solid device, even with all that fragile tech packed in, and while having screens on both sides, this phone survives. As a side note, if one screen does actually break, you can always just use the other as a backup, since it does come with two. Do you think this phone is the future? Is a dual screen phone a better solution than a motorized front-facing camera? Which one do you prefer? Back-to-back screens are definitely fascinating. Hit that subscribe button if you haven't already. And come hang out with me on Instagram and Twitter.