Google I/O 2021 keynote in 16 minutes

Good morning, everyone. It's great to be back at I/O. Today I'm excited to share our latest breakthrough in natural language understanding: LaMDA. It's a language model for dialogue applications, and it's open domain, which means it's designed to converse on any topic. And while it's still in research and development, we've been using it internally to explore novel interactions. For example, say you wanted to learn about one of my favorite planets, Pluto. LaMDA already understands quite a lot about Pluto and millions of other topics. Let's listen to a conversation the team had with Pluto a few days ago.

"I'm so curious about you." "I sense your excitement. Ask me anything." "Tell me what I would see if I visited." "You would get to see a massive canyon, some frozen icebergs, geysers, and some craters." "It sounds beautiful." "I assure you it is worth the trip. However, you need to bring your coat, because it gets really cold." "I'll keep that in mind. Hey, I was wondering, have you ever had any visitors?" "Yes, I have had some. The most notable was New Horizons, the spacecraft that visited me."

Let's break down what made it feel so natural. First, learned concepts: as you saw, the model talked about the New Horizons spacecraft and the coldness of space. LaMDA synthesized these concepts from its training data. Because none of the responses were predefined, LaMDA answered with sensible responses, keeping the dialogue open-ended. Natural conversations are generative, and they never take the same path twice, and LaMDA is able to carry a conversation no matter what we talk about. Yet it's still early research, so it doesn't get everything right. Sometimes it can give nonsensical responses, imagining Pluto doing flips or playing fetch with its favorite ball, the moon. Other times it just doesn't keep the conversation going. We believe LaMDA's natural conversation capabilities have the potential to make information and computing radically more accessible and easier to use, and we look forward to incorporating better conversational features
into products like Google Assistant, Search, and Workspace.

LaMDA is a huge step forward in natural conversation, but it is still trained only on text. When people communicate with each other, they do it across images, text, audio, and video. So we need to build models that allow people to naturally ask questions across different types of information. These are called multimodal models. For example, when you say "show me the part where the lion roars at sunset," we will get you to that exact moment in a video.

Advances in AI are helping us reimagine what a map can be, and now you can also use it to explore the world around you. You'll be able to access Live View right from the map and instantly see details about the shops and restaurants around you, including how busy they are, recent reviews, and photos of those popular dishes. In addition, there are a host of new features coming to Live View later this year. First, we're adding prominent virtual street signs to help you navigate those complex intersections. Second, we'll point you towards key landmarks and places that are important for you, like the direction of your hotel. Third, we're bringing it indoors to help you get around some of the hardest-to-navigate buildings, like airports, transit stations, and malls. Indoor Live View will start rolling out in top train stations and airports in Zurich this week, and will come to Tokyo next month.

We're also bringing you the most detailed street maps we've ever made. Take this image of Columbus Circle, one of the most complicated intersections in Manhattan. You can now see where the sidewalks, the crosswalks, and the pedestrian islands are, something that might be incredibly helpful if you're taking young children out on a walk, or absolutely essential if you're using a wheelchair. Thanks to our application of advanced AI technology on robust Street View and aerial imagery, we're on track to launch detailed street maps in 50 new cities by the end of the year. So we're making the map more dynamic and more tailored,
highlighting the most relevant information exactly when you need it. If it's 8 a.m. on a weekday, we'll display the coffee shops and bakeries more prominently in the map, while at 5 p.m. we'll highlight the dinner restaurants that match your tastes. You'll start seeing this more tailored map in the coming weeks.

People have found it really useful, especially during this pandemic, to see how busy a place is before heading out. Now we're expanding this capability from specific places, like restaurants and shops, to neighborhoods, with a feature called Area Busyness. Say you're in Rome and want to head over to the Spanish Steps and its nearby shops. With Area Busyness, you'll be able to understand at a glance if it's the right time for you to go, based on how busy that part of the city is in real time. Area Busyness will roll out globally in the coming months.

Let's talk about all the ways we're innovating in shopping. Soon on Chrome, when you open a new tab, you'll be able to see your open carts from the past couple of weeks. We'll also find you promotions and discounts for your open carts, if you choose to opt in. Your personal information and what's in your carts are never shared with anyone externally without your permission.

We capture photos and videos so we can look back and remember. There are more than four trillion photos and videos stored in Google Photos, but having so many photos of loved ones, screenshots, and selfies all stored together makes it hard to rediscover the important moments. Soon we're launching a new way to look back that we're calling little patterns. Little patterns show the magic in everyday moments by identifying not-so-obvious moments and resurfacing them to you. This feature uses machine learning to translate photos into a series of numbers and then compares how visually or conceptually similar these images are. When we find a set of three or more photos with similarities, such as shape or color, we'll surface them as a pattern. When we started testing little patterns, we
saw some great stories come to life, like how one of our engineers traveled the world with their favorite orange backpack, or how our product manager Christy had a habit of capturing objects of similar shape and color.

We also want to bring these moments to life with cutting-edge effects. Last year we launched cinematic photos to help you relive your memories in a more vivid way. Cinematic moments will take these near-duplicate images and use neural networks to synthesize the movement between image A and image B. We interpolate the photos and fill in the gaps by creating new frames. The end result is a vivid moving picture, and the cool thing about this effect is that it can work on any pair of images, whether they were captured on Android or iOS, or scanned from a photo album.

In addition to providing personalized content to look back on, we also want to give you more control. We heard from you that controls can be helpful for anyone who has been through a tough life event, breakup, or loss. These insights inspired us to give you the control to hide photos of certain people or time periods from our Memories feature, and soon you'll be able to remove a single photo from a memory, rename the memory, or remove it entirely.

Instead of form following function, what if form followed feeling? Instead of Google blue, we imagined Material You, a new design that includes you as a co-creator, letting you transform the look and feel of all your apps by generating personal material palettes that mix color science with a designer's eye. A new design that can flex to every screen and fit every device: your apps adapt comfortably every place you go. Beyond light and dark, a mode for every mood. These selections can travel with your account across every app and every device. Material You comes first to Google Pixel this fall, including all of your favorite Google apps, and over the following year we will continue our vision, bringing it to the web, Chrome OS, wearables, smart displays, and all of Google's products. We've
overhauled everything from the lock screen to system settings, revamping the way we use color, shapes, light, and motion. Watch what happens when the wallpaper changes. If I use this picture of my kids, actually getting along for once, and set it as my background, voila: the system creates a custom palette based on the colors in my photo. The result is a one-of-a-kind design just for you, and you'll see it first on Google Pixel in the fall.

Starting from the lock screen, the design is more playful, with dynamic lighting. Pick up your phone, and it lights up from the bottom of your screen. Press the power button to wake up the phone instead, and the light ripples out from your touch. Even the clock is in tune with you: when you don't have any notifications, it appears larger on the lock screen, so you know you're all caught up. The notification shade is more intuitive, with a crisp, at-a-glance view of your app notifications, whatever you're currently listening to or watching, and quick settings that give you control over the OS with just a swipe and a tap. And now you can invoke the Google Assistant by long-pressing the power button. The team also reduced the CPU time of Android's system server by a whopping 22 percent.

With Android 12, we're going even further to keep your information safe. To give people more transparency and control, we've created a new privacy dashboard that shows you what type of data was accessed, and when. This dashboard reports on all the apps on your phone, including all of your Google apps, and we've made it really easy to revoke an app's permission directly from the dashboard. We've also added an indicator to make it clear when an app is using your camera or microphone. But let's take that a step further: if you don't want any apps to access the microphone or camera, even if you've granted them permission in the past, we've added two new toggles in quick settings, so you can completely disable those sensors for every app.

Android's Private Compute Core enables features
like Now Playing, which tells you what song is playing in the background, and Smart Reply, which suggests responses to your chats based on your personal reply patterns, and there's more to come later this year. All of the sensitive audio and language processing happens exclusively on your device, and like the rest of Android, Private Compute Core is open source: it's fully inspectable and verifiable by the security community.

With a single tap, you can unlock and sign into your Chromebook when your phone is nearby. Incoming chat notifications from apps on your phone are right there in Chrome OS, and soon, if you want to share a picture, one click and you can access your phone's most recent photos. To keep movie night on track, we're building TV remote features directly into your phone: you can use voice search or even type with your phone's keyboard. We're also really excited to introduce support for digital car key. Car key will allow you to lock, unlock, and start your car, all from your phone. It works with NFC and ultra-wideband technology, making it super secure and easy to use. And if your friend needs to borrow your car, you can remotely and securely share your digital key with them. Car key is launching this fall with select Google Pixel and Samsung Galaxy smartphones, and we're working with BMW and others across the industry to bring it to their upcoming cars. That was a quick look at Android 12, which will launch this fall, but you can check out many of these features in the Android 12 beta today.

Let's go beyond the phone to what we believe is the next evolution of mobile computing: the smartwatch. First, we're building a unified platform jointly with Samsung, focused on battery life, performance, and making it easier for developers to build great apps for the watch. Second, a whole new consumer experience, including updates to your favorite Google apps. And third, a world-class health and fitness service created by the newest addition to the Google family, Fitbit.

As the world's largest OS, we have a
responsibility to build for everyone. But for people of color, photography has not always seen us as we want to be seen, even in some of our own Google products. To make smartphone photography truly for everyone, we've been working with a group of industry experts to build a more accurate and inclusive camera. So far, we've partnered with a range of different expert image makers who've taken thousands of images to diversify our image datasets, helped improve the accuracy of our auto white balance and auto exposure algorithms, and given aesthetic feedback to make our images of people of color more beautiful and more accurate. Although there's still much to do, we're working hard to bring all of what you've seen here, and more, to Google Pixel this fall.

We were all grateful to have video conferencing over the last year. It helped us stay in touch with family and friends and kept businesses and schools going. But there is no substitute for being together in the room with someone, so several years ago we kicked off a project to use technology to explore what's possible. We call it Project Starline. First, using high-resolution cameras and custom-built depth sensors, we capture your shape and appearance from multiple perspectives, and then fuse them together to create an extremely detailed, real-time 3D model. The resulting data is huge: many gigabits per second. To send this 3D imagery over existing networks, we developed novel compression and streaming algorithms that reduce the data by a factor of more than 100. And we have developed a breakthrough light field display that shows you a realistic representation of someone sitting right in front of you, in three dimensions. As you move your head and body, our system adjusts the images to match your perspective. You can talk naturally, gesture, and make eye contact. It's as close as we can get to the feeling of sitting across from someone. We have spent thousands of hours testing it in our own offices, and the results are promising. There's also
excitement from our lead enterprise partners, and we plan to expand access to partners in healthcare and media.

Thank you for joining us today. Please enjoy the rest of Google I/O, and stay tuned for the developer keynote coming up next. I hope to see you in person next year. Until then, stay safe and be well.
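The little patterns feature described in the keynote, which translates photos into a series of numbers and then compares how similar those numbers are, can be illustrated with a minimal sketch. This is a toy, not Google Photos' actual system: a real pipeline would use learned image embeddings, whereas here the "photos" are hand-made vectors, grouped by a simple mutual cosine-similarity rule where any set of three or more similar photos becomes a pattern.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_patterns(embeddings, threshold=0.9, min_size=3):
    """Greedily collect groups of photos whose embeddings are all
    mutually similar; groups of min_size or more become a 'pattern'."""
    unused = set(range(len(embeddings)))
    patterns = []
    for i in range(len(embeddings)):
        if i not in unused:
            continue
        group = [i]
        for j in sorted(unused - {i}):
            if all(cosine(embeddings[j], embeddings[k]) >= threshold for k in group):
                group.append(j)
        if len(group) >= min_size:
            patterns.append(group)
            unused -= set(group)
    return patterns

# Toy embeddings: three "orange backpack" shots plus two unrelated photos.
photos = [
    np.array([0.90, 0.10, 0.00]),
    np.array([0.85, 0.15, 0.05]),
    np.array([0.95, 0.05, 0.00]),
    np.array([0.00, 1.00, 0.00]),
    np.array([0.00, 0.00, 1.00]),
]
print(find_patterns(photos))  # [[0, 1, 2]]
```

The first three vectors point in nearly the same direction, so they form one pattern; the two dissimilar photos are left ungrouped, matching the keynote's "three or more photos with similarities" criterion.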
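The Material You moment where "the system creates a custom palette based on the colors in my photo" is, at its core, a dominant-color extraction problem. As a hedged illustration only (Android's real color-science pipeline also reasons about tone, contrast, and accessibility), here is a tiny k-means sketch over RGB pixels with a deterministic initialization so the result is reproducible:

```python
import numpy as np

def dominant_colors(pixels, k=3, iters=10):
    """Tiny k-means over RGB pixels; returns the k cluster-centre
    colours ordered from most to least common."""
    pixels = np.asarray(pixels, dtype=float).reshape(-1, 3)
    # Deterministic init: spread the starting centres across the pixel list.
    centres = pixels[:: max(1, len(pixels) // k)][:k].copy()
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(iters):
        # Assign every pixel to its nearest centre...
        dists = np.linalg.norm(pixels[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # ...then move each centre to the mean of its members.
        for c in range(k):
            members = pixels[labels == c]
            if len(members):
                centres[c] = members.mean(axis=0)
    counts = np.bincount(labels, minlength=k)
    return centres[counts.argsort()[::-1]].round().astype(int)

# A toy "wallpaper": mostly deep blue, some green, a little white.
wallpaper = [[10, 40, 200]] * 60 + [[30, 160, 60]] * 30 + [[250, 250, 250]] * 10
palette = dominant_colors(wallpaper)
print(palette[0])  # most dominant colour is the blue: [ 10  40 200]
```

A theming system would then derive lighter and darker tones from `palette[0]` for backgrounds, accents, and text, which is the "personal material palettes" idea in miniature.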

As found on YouTube