Android 12 preview: here’s Google’s radical new design

(rubs hands) Android 12: it is here, or at least it's being announced. The new beta, the one where Google actually tells us what the biggest new user-facing features will be, has been announced. And I have seen a demo and I've played around with the beta here on my phone, and I have some thoughts. Do you wanna hear my thoughts, or would you rather just see what's new in Android 12? Oh, why not both… This is Android 12. (upbeat music) Android 12 looks different from what you're used to on Android, actually very different. Google says that this is the biggest visual overhaul since 2014, or maybe ever, depending on who you ask. And yeah, a lot of the pieces of this operating system here do look very different, but it all basically still works the same. You've got a home screen, you swipe up for apps, you swipe down for quick settings and for your notifications, etcetera, etcetera.

What you're really looking at here, with these big buttons and the really big bubbly sliders and so on, is how the Android team has decided to implement a new design system that Google is calling Material You. Not Material UX or Material UI, just Material You, like Y-O-U, whatever. Now, when you're looking at the B-roll and the screen recordings and the screenshots on this phone, you should know that this is how Google is implementing Material You on the Pixel. Whether and how Samsung or Xiaomi or OnePlus decide to implement it is going to be different. And also, you know, much later, because their updates always come later than the Pixel's. Anyway, I don't have the full details on Material You and how it works and so on. But I do know that it's supposed to apply to everything from the web to Android, to apps, to even hardware.

What that means is I'm just not going to get into any of the heady UI-versus-UX-versus-you stuff here. I'm just going to talk about what I am seeing here on this phone. And what I am seeing is good. For the Android team, the "you" part of Material You here is an automatic theming system. So, when you set a new wallpaper, you're gonna be given the option to have Android pull some colors out of your photo and then apply a theme with those colors to the system. So you can see here that the buttons have turned green, and there's also an algorithm for pulling out complementary colors from the photo. It's kind of neat, but I don't know that I would have picked this particular green if I were theming it myself. And the good news is you can pick whatever colors you really want to. So that's neat, but really I can tell you the whole story of this visual redesign just by looking at a couple of screen recordings.
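(A quick aside for the developers reading along: Google hasn't published the exact color-extraction algorithm it uses here, but if you want a feel for what "pull some colors out of a photo" looks like in code, the AndroidX Palette library is a rough stand-in. The helper below is a hypothetical sketch, not the actual Material You theming engine.)

```kotlin
// Rough illustration only: the real Material You theming engine in Android 12
// is a system-level feature, but androidx.palette shows the general idea of
// pulling prominent colors out of a wallpaper bitmap.
import android.graphics.Bitmap
import androidx.palette.graphics.Palette

// Hypothetical helper: given a wallpaper bitmap, pick a primary and an
// accent color to build an app theme around.
fun extractThemeColors(wallpaper: Bitmap): Pair<Int, Int> {
    val palette = Palette.from(wallpaper).generate()
    val primary = palette.getDominantColor(/* defaultColor = */ 0xFF4CAF50.toInt())
    // Vibrant swatches often work as a complementary accent.
    val accent = palette.getVibrantColor(/* defaultColor = */ primary)
    return primary to accent
}
```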

So, here's Android 11 and here is Android 12. First, there's a bunch of new lighting effects when you unlock the phone; you can kinda see colors and shadows and light sweep around. And in general there are just more animations all over the operating system. We're gonna come back to why that is, but look, they're even taking advantage of these animations on the lock screen buttons, and you can see a little color from the Material You theming as well. Now, when we pull down the quick settings and notification shade, you see that they are just very big, easy-to-recognize, easy-to-understand buttons. Google's just not afraid of taking up more space with all of their UI, and they're not trying to cram everything into the most information-dense thing possible.

I actually think it's a nice direction. There is another subtle difference in the notification shade: you can see that it's just covering the entire screen instead of sort of being a translucent layer over it. It makes it into an entirely new space. And if you look at the notifications themselves, you'll see that they're grouped together and signified by a bunch of bubbles for each individual group. So there's conversations and silent notifications and whatever. But if you slide an individual notification away, there's this really subtle effect where the hard corner turns into a bubble for just that notification, to indicate that it is its own separate thing. Now, on the home screen, let's just pause a moment to look at these widgets. They are brand new, and they're based on an entirely new system for making widgets that follows the principles of the Material You design system.
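(For the curious: the Android 12 widget improvements layer on top of the same AppWidgetProvider and RemoteViews plumbing that widgets have always used, with new niceties like rounded corners and dynamic colors. Here's a minimal, hypothetical sketch of that plumbing; the layout and view IDs are made up for illustration.)

```kotlin
// Minimal sketch of how an Android widget is wired up; the Android 12 widget
// improvements (rounded corners, Material You colors, a better picker) layer
// on top of this same AppWidgetProvider / RemoteViews model.
// R.layout.weather_widget and R.id.widget_text are hypothetical resources.
import android.appwidget.AppWidgetManager
import android.appwidget.AppWidgetProvider
import android.content.Context
import android.widget.RemoteViews

class WeatherWidgetProvider : AppWidgetProvider() {
    override fun onUpdate(
        context: Context,
        appWidgetManager: AppWidgetManager,
        appWidgetIds: IntArray
    ) {
        for (id in appWidgetIds) {
            // Build the widget's view tree remotely and push it to the launcher.
            val views = RemoteViews(context.packageName, R.layout.weather_widget)
            views.setTextViewText(R.id.widget_text, "72° and sunny")
            appWidgetManager.updateAppWidget(id, views)
        }
    }
}
```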

So, Google is gonna update a bunch of their own widgets, but they're also hoping that they can get a bunch of developers on board to update their old widgets to the new system. And I really hope it works, because the widget ecosystem on Android has gotten really crufty and messy over time, and it is due for a refresh. Now, next stop is quick settings, and Google changes quick settings every single year. This year is no different. The new thing this year is that the buttons are huge! I mean, just look at them. But I don't know, I kinda like it. Google also put smart home controls and Google Pay into quick settings, finally, which means that now holding down the power button brings up the Assistant, just like it does on the iPhone and on Galaxy phones. And all of that means "adiós, weird power button menu from Android 11!" You tried… Finally, in quick settings there are toggles for camera and mic access, and we're going to get to those in a minute.
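(One more developer-facing note on quick settings before we move on: third-party apps hook into that same tile row through the TileService API. A hypothetical tile looks roughly like the sketch below; it would also need a service entry in the app manifest with the BIND_QUICK_SETTINGS_TILE permission.)

```kotlin
// Hedged sketch: third-party apps plug into the quick settings tray via
// TileService. This hypothetical tile just flips between on and off; it is
// not one of the system tiles the video is describing.
import android.service.quicksettings.Tile
import android.service.quicksettings.TileService

class NightModeTileService : TileService() {
    override fun onClick() {
        val tile = qsTile ?: return
        // Toggle the tile's visual state and push the update to the shade.
        tile.state = if (tile.state == Tile.STATE_ACTIVE) {
            Tile.STATE_INACTIVE
        } else {
            Tile.STATE_ACTIVE
        }
        tile.updateTile()
    }
}
```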

Oh, you know what, one more thing I just have to talk about that's not in the screen recordings: the new lock screen. When you don't have any notifications, you have this giant clock on it, and it's dope, and it matches your color theme. When you do have notifications, it's still pretty big; it just gets a little bit smaller. It's a good lock screen! Now, the version of the Android beta that Google is releasing this month doesn't have all of the gewgaws and bells and whistles that you just saw, but there's enough here that you can see where Google is going with it.

Like, even if you just look at the Settings app, all of the icons and the text are bigger, and they've got this new overscroll animation that kinda squeezes things together. It's a big redesign, but it's not a complete overhaul of how everything works. Every design gets crufty over time, and Android was definitely starting to show a lot of inconsistencies as new features piled on and old ones were kind of half forgotten. I see this design as a general cleanup. All the buttons are big and bubbly, and I get a sense that things are going to be a little bit more coherent now, and I dig that. So that is the new design system, but I wanna come back to a thing I mentioned at the top: the smoothness thing.

Android has a reputation that the only way to make it smooth and good-looking is to throw more powerful hardware at it, with faster refresh-rate screens or more RAM or whatever. With Android 12, Google's promising that they're going to make the animations smoother for everybody through software improvements. So, we sat down with Sameer Samat, the VP of product management for Android and Google Play, and here's how he explains it. – So we've done a few things to make the system feel smooth. We've reduced lock contention in key services, [Sameer] activity, window, and package manager. What that really means is, there are multiple different parts of the system trying to talk to the operating system at the same time, and that's when you see things jitter or jank. By smoothing a lot of that out, and by reducing, for example, the amount of CPU time that Android's system server uses, by 22 percent actually, we've been able to make all the motion and animation feel super smooth.
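(To make that "lock contention" point a bit more concrete, here's a generic, hedged illustration of the technique rather than Google's actual system_server code: one coarse lock guarding unrelated state gets split into per-subsystem locks, so callers stop waiting behind each other.)

```kotlin
// Generic illustration of "reducing lock contention", not Google's actual
// system_server changes: instead of one big lock that every caller waits on,
// split state behind separate locks so unrelated requests don't serialize.
import java.util.concurrent.locks.ReentrantLock
import kotlin.concurrent.withLock

class CoarselyLockedRegistry {
    private val lock = ReentrantLock()
    private val windows = mutableMapOf<Int, String>()
    private val packages = mutableMapOf<String, Int>()

    // Every caller, window-related or package-related, contends on one lock.
    fun windowTitle(id: Int) = lock.withLock { windows[id] }
    fun packageUid(name: String) = lock.withLock { packages[name] }
}

class FinelyLockedRegistry {
    private val windowLock = ReentrantLock()
    private val packageLock = ReentrantLock()
    private val windows = mutableMapOf<Int, String>()
    private val packages = mutableMapOf<String, Int>()

    // Window lookups no longer wait behind package lookups, and vice versa,
    // so under load fewer threads block and animations have more headroom.
    fun windowTitle(id: Int) = windowLock.withLock { windows[id] }
    fun packageUid(name: String) = packageLock.withLock { packages[name] }
}
```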
– All right, there are a few other interesting features that are being announced today. So, there is a proper remote control app for Android TV.

They're going to have car unlock that works with NFC, or UWB if your phone has it, and that'll work with a few different partners. And later this year, if you have a Chromebook, it's going to be able to directly access the photo library on your Android phone. So, next up is privacy updates. Google is putting privacy updates in every version of Android, and that is great. And this year there really are a bunch.

The main thing that Google is trying to do this year is tamp down on unfettered access to your location, your camera, and your microphone. So there are new indicators in the upper right-hand corner when they're being accessed, and there are those new buttons in quick settings that just fully turn off your camera or your microphone. So, when you toggle them off, an app that looks at your camera just gets a black nothing. It thinks the camera's there, but really it's just getting nothing. There is also a new privacy dashboard that will show you how often those sensors have been accessed and by which apps. So you can view your data from the past 24 hours in a pie chart or in a timeline, and then turn off all the different access stuff from there. Now, for location, there is a new kind of permission that you can grant to an app: approximate location instead of just precise location. So, say you've got something like a weather app and you don't want it to know your precise GPS pin, but you do want it to know what neighborhood you're in; you can give it an approximate location.
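(On the developer side, the approximate-location choice shows up when an app asks for precise location on Android 12: the app requests both permissions together, and the user can decide to grant only the coarse one. The sketch below is a simplified, hypothetical example of that request-and-check flow.)

```kotlin
// Sketch of the app side of "approximate vs. precise" location on Android 12.
// An app that wants precise location requests both permissions together; the
// user can then choose to grant only the approximate one. A weather app could
// simply request ACCESS_COARSE_LOCATION on its own.
import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.ComponentActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

private const val LOCATION_REQUEST_CODE = 1001  // arbitrary request code

fun requestNeighborhoodLevelLocation(activity: ComponentActivity) {
    ActivityCompat.requestPermissions(
        activity,
        arrayOf(
            Manifest.permission.ACCESS_FINE_LOCATION,
            Manifest.permission.ACCESS_COARSE_LOCATION
        ),
        LOCATION_REQUEST_CODE
    )
}

fun hasOnlyApproximateLocation(activity: ComponentActivity): Boolean {
    val fine = ContextCompat.checkSelfPermission(
        activity, Manifest.permission.ACCESS_FINE_LOCATION
    ) == PackageManager.PERMISSION_GRANTED
    val coarse = ContextCompat.checkSelfPermission(
        activity, Manifest.permission.ACCESS_COARSE_LOCATION
    ) == PackageManager.PERMISSION_GRANTED
    return coarse && !fine  // the user opted for "approximate" only
}
```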

So that's all the privacy stuff for sensors, but there's also this new part of the operating system called the Android Private Compute Core. Now, you might think it's a chip because of the word "core," but it's not. Instead, it's like a sandboxed part of the operating system for machine learning things. It doesn't store data; it runs processes. – A good way to think about it is, when you have these advanced technologies, like, for example, speech recognition or natural language processing, they need access to certain information. Another favorite example of mine is smart reply. [Sameer] Awesome feature: it looks at your notifications, your chat notifications, and suggests replies based on a speech and language model.

All of that runs on-device in Private Compute Core. – From my perspective, basically what all that means is that if Google wants Android to be able to do something with AI that you might think is creepy, now they can put all of those processes in a box and limit all communication into and out of that box: nothing in the box can access the network, and it's only accessible via limited APIs. So, that all seems great, but is it more secure? We'll see. So that's all the privacy stuff that Google wants to talk about, but there is another kind of privacy that Google really isn't keen on discussing that much, and that is app tracking for ads. Now, there have been rumors that Google would follow Apple and limit some kind of app tracking for things like ads, but Google also makes all of its money on ads.

So… – Taking a step back on this one, there's obviously a lot changing in the ecosystem. One thing about Google is that it is a platform company. It's also a company that is deep in the advertising space. So we're thinking very deeply about how we should evolve the advertising ecosystem. You've seen what we're doing on Chrome. [Sameer] From our standpoint on Android, we don't have anything to announce at the moment, but we are taking the position that privacy and advertising don't need to be directly opposed to each other; [Sameer] that, we don't believe, is healthy for the overall ecosystem or for us as a company.

So we're thinking about that, working with our developer partners, and we'll be sharing more later this year. – All right, well, stay tuned for news from Google on that later. And speaking of later, when are you gonna be able to get Android 12 on your Android phone? Well, do you have a Pixel? Because then the answer is easy: you're going to get it this fall. Do you not have a Pixel? Well, then the answer is later. Google says that the speed at which companies are updating their phones to the latest version of Android has improved by 30 percent, but still, manufacturers besides Google just take a while to get the latest version of Android on their phones. That's just how Android works. All right. That's Android 12: a huge redesign that adds some consistency and coherency with big buttons, big sliders, big everything! There are more theming options. There's a bunch of privacy indicators.

There's a bunch of stuff that they put in the developer betas that I haven't even covered here, and a TV remote. This isn't the most massive release ever, but you know what, it's enough. (transition sound) Hey everybody, thanks so much for watching. Right now it is the middle of Google I/O, which means that there is a lot going on, and we're going to have a lot more coverage of everything Google has announced. And, you know, in general it's just a big tech week, so I think there's gonna be a couple more videos on The Verge you're gonna wanna check out.


Google I/O 2021 keynote in 16 minutes

Good morning, everyone. It's great to be back at I/O. Today I'm excited to share our latest breakthrough in natural language understanding: LaMDA. It's a language model for dialogue applications, and it's open domain, which means it's designed to converse on any topic. And while it's still in research and development, we've been using it internally to explore novel interactions. For example, say you wanted to learn about one of my favorite planets, Pluto. LaMDA already understands quite a lot about Pluto and millions of other topics. Let's listen to a conversation the team had with Pluto a few days ago. "I'm so curious about you." "I sense your excitement. Ask me anything." "Tell me what I would see if I visited." "You would get to see a massive canyon, some frozen icebergs, geysers, and some craters." "It sounds beautiful." "I assure you it is worth the trip. However, you need to bring your coat, because it gets really cold." "I'll keep that in mind. Hey, I was wondering, have you ever had any visitors?" "Yes, I have had some. The most notable was New Horizons, the spacecraft that visited me." Let's break down what made it feel so natural. First, learned concepts: as you saw, the model talked about the New Horizons spacecraft and the coldness of space. LaMDA synthesized these concepts from its training data. Because none of the responses were predefined, LaMDA answered with sensible responses, keeping the dialogue open-ended. Natural conversations are generative, and they never take the same path twice, and LaMDA is able to carry a conversation no matter what we talk about. Yet it's still early research, so it doesn't get everything right. Sometimes it can give nonsensical responses, imagining Pluto doing flips or playing fetch with its favorite ball, the Moon. Other times it just doesn't keep the conversation going. We believe LaMDA's natural conversation capabilities have the potential to make information and computing radically more accessible and easier to use, and we look forward to incorporating better conversational features into products like Google Assistant, Search, and Workspace.

LaMDA is a huge step forward in natural conversation, but it is still trained only on text. When people communicate with each other, they do it across images, text, audio, and video. So we need to build models that allow people to naturally ask questions across different types of information. These are called multimodal models. For example, when you say "show me the part where the lion roars at sunset," we will get you to that exact moment in a video.

Advances in AI are helping us reimagine what a map can be, and now you can also use it to explore the world around you. You'll be able to access Live View right from the map and instantly see details about the shops and the restaurants around you, including how busy they are, recent reviews, and photos of those popular dishes. In addition, there are a host of new features coming to Live View later this year. First, we're adding prominent virtual street signs to help you navigate those complex intersections. Second, we'll point you towards key landmarks and places that are important for you, like the direction of your hotel. Third, we're bringing it indoors to help you get around some of the hardest-to-navigate buildings, like airports, transit stations, and malls. Indoor Live View will start rolling out in top train stations and airports in Zurich this week and will come to Tokyo next month. We're also bringing you the most detailed street maps we've ever made. Take this image of Columbus Circle, one of the most complicated intersections in Manhattan. You can now see where the sidewalks, the crosswalks, and the pedestrian islands are, something that might be incredibly helpful if you're taking young children out on a walk, or absolutely essential if you're using a wheelchair. Thanks to our application of advanced AI technology on robust Street View and aerial imagery, we're on track to launch detailed street maps in 50 new cities by the end of the year. So we're making the map more dynamic and more tailored, highlighting the most relevant information exactly when you need it. If it's 8 a.m. on a weekday, we'll display the coffee shops and bakeries more prominently in the map, while at 5 p.m. we'll highlight the dinner restaurants that match your tastes. You'll start seeing this more tailored map in the coming weeks. People have found it really useful, especially during this pandemic, to see how busy a place is before heading out. Now we're expanding this capability from specific places, like restaurants and shops, to neighborhoods, with a feature called Area Busyness. Say you're in Rome and want to head over to the Spanish Steps and its nearby shops; with Area Busyness, you'll be able to understand at a glance if it's the right time for you to go, based on how busy that part of the city is in real time. Area Busyness will roll out globally in the coming months.

Let's talk about all the ways we're innovating in shopping. Soon on Chrome, when you open a new tab, you'll be able to see your open carts from the past couple of weeks. We'll also find you promotions and discounts for your open carts, if you choose to opt in. Your personal information and what's in your carts are never shared with anyone externally without your permission.

We capture photos and videos so we can look back and remember. There are more than four trillion photos and videos stored in Google Photos, but having so many photos of loved ones, screenshots, and selfies all stored together makes it hard to rediscover the important moments. Soon we're launching a new way to look back that we're calling Little Patterns. Little Patterns show the magic in everyday moments by identifying not-so-obvious moments and resurfacing them to you. This feature uses machine learning to translate photos into a series of numbers and then compares how visually or conceptually similar these images are. When we find a set of three or more photos with similarities, such as shape or color, we'll surface them as a pattern. When we started testing Little Patterns, we saw some great stories come to life, like how one of our engineers traveled the world with their favorite orange backpack, or how our product manager Christy had a habit of capturing objects of similar shape and color. We also want to bring these moments to life with cutting-edge effects. Last year we launched Cinematic Photos to help you relive your memories in a more vivid way. Cinematic Moments will take these near-duplicate images and use neural networks to synthesize the movement between image A and image B. We interpolate the photos and fill in the gaps by creating new frames. The end result is a vivid, moving picture, and the cool thing about this effect is that it can work on any pair of images, whether they were captured on Android or iOS or scanned from a photo album. In addition to providing personalized content to look back on, we also want to give you more control. We heard from you that controls can be helpful for anyone who has been through a tough life event, breakup, or loss. These insights inspired us to give you the control to hide photos of certain people or time periods from our Memories feature, and soon you'll be able to remove a single photo from a memory, rename the memory, or remove it entirely.
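(Google doesn't spell out the model behind Little Patterns, but the general technique the keynote gestures at, turning each photo into a vector of numbers and grouping photos whose vectors are close, can be sketched roughly as below. The embedding model itself is assumed and out of scope; this is illustrative, not Google Photos code.)

```kotlin
// Hedged sketch of the general technique behind "translate photos into a
// series of numbers and compare similarity": embed each image as a vector
// (the embedding model itself is out of scope here) and group photos whose
// embeddings have high cosine similarity. This is not Google Photos code.
import kotlin.math.sqrt

fun cosineSimilarity(a: FloatArray, b: FloatArray): Float {
    require(a.size == b.size) { "embeddings must have the same dimension" }
    var dot = 0f; var normA = 0f; var normB = 0f
    for (i in a.indices) {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / (sqrt(normA) * sqrt(normB))
}

// Greedy grouping: a photo joins a pattern if it is close enough to the
// pattern's first member; patterns with 3+ photos would be surfaced.
fun groupIntoPatterns(
    embeddings: List<FloatArray>,
    threshold: Float = 0.85f
): List<List<Int>> {
    val patterns = mutableListOf<MutableList<Int>>()
    for ((index, embedding) in embeddings.withIndex()) {
        val home = patterns.firstOrNull {
            cosineSimilarity(embeddings[it.first()], embedding) >= threshold
        }
        if (home != null) home.add(index) else patterns.add(mutableListOf(index))
    }
    return patterns.filter { it.size >= 3 }
}
```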
Instead of form following function, what if form followed feeling? Instead of Google blue, we imagined Material You, a new design that includes you as a co-creator, letting you transform the look and feel of all your apps by generating personal Material palettes that mix color science with a designer's eye. A new design that can flex to every screen and fit every device, so your apps adapt comfortably every place you go. Beyond light and dark, a mode for every mood. These selections can travel with your account across every app and every device. Material You comes first to Google Pixel this fall, including all of your favorite Google apps, and over the following year we will continue our vision, bringing it to the web, Chrome OS, wearables, smart displays, and all of Google's products.

We've overhauled everything from the lock screen to system settings, revamping the way we use color, shapes, light, and motion. Watch what happens when the wallpaper changes, like if I use this picture of my kids actually getting along for once. I set it as my background and voilà, the system creates a custom palette based on the colors in my photo. The result is a one-of-a-kind design just for you, and you'll see it first on Google Pixel in the fall. Starting from the lock screen, the design is more playful, with dynamic lighting. Pick up your phone and it lights up from the bottom of your screen; press the power button to wake up the phone instead, and the light ripples out from your touch. Even the clock is in tune with you: when you don't have any notifications, it appears larger on the lock screen, so you know you're all caught up. The notification shade is more intuitive, with a crisp, at-a-glance view of your app notifications, whatever you're currently listening to or watching, and quick settings that give you control over the OS with just a swipe and a tap. And now you can invoke the Google Assistant by long-pressing the power button. The team also reduced the CPU time of Android's system server by a whopping 22 percent.

And with Android 12 we're going even further to keep your information safe. To give people more transparency and control, we've created a new privacy dashboard that shows you what type of data was accessed and when. This dashboard reports on all the apps on your phone, including all of your Google apps, and we've made it really easy to revoke an app's permission directly from the dashboard. We've also added an indicator to make it clear when an app is using your camera or microphone. But let's take that a step further: if you don't want any apps to access the microphone or camera, even if you've granted them permission in the past, we've added two new toggles in quick settings so you can completely disable those sensors for every app. Android's Private Compute Core enables things like Now Playing, which tells you what song is playing in the background, and Smart Reply, which suggests responses to your chats based on your personal reply patterns, and there's more to come later this year. All of the sensitive audio and language processing happens exclusively on your device, and like the rest of Android, Private Compute Core is open source; it's fully inspectable and verifiable by the security community.

With a single tap you can unlock and sign into your Chromebook when your phone is nearby. Incoming chat notifications from apps on your phone are right there in Chrome OS, and soon, if you want to share a picture, one click and you can access your phone's most recent photos. To keep movie night on track, we're building TV remote features directly into your phone; you can use voice search or even type with your phone's keyboard. We're also really excited to introduce support for digital car key. Car key will allow you to lock, unlock, and start your car, all from your phone. It works with NFC and ultra-wideband technology, making it super secure and easy to use, and if your friend needs to borrow your car, you can remotely and securely share your digital key with them. Car key is launching this fall with select Google Pixel and Samsung Galaxy smartphones, and we're working with BMW and others across the industry to bring it to their upcoming cars. That was a quick look at Android 12, which will launch this fall, but you can check out many of these features in the Android 12 beta today.

Let's go beyond the phone to what we believe is the next evolution of mobile computing: the smartwatch. First, we're building a unified platform jointly with Samsung, focused on battery life, performance, and making it easier for developers to build great apps for the watch. Second, a whole new consumer experience, including updates to your favorite Google apps. And third, a world-class health and fitness service created by the newest addition to the Google family, Fitbit. As the world's largest OS, we have a responsibility to build for everyone. But for people of color, photography has not always seen us as we want to be seen, even in some of our own Google products. To make smartphone photography truly for everyone, we've been working with a group of industry experts to build a more accurate and inclusive camera. So far we've partnered with a range of different expert image makers, who've taken thousands of images to diversify our image datasets, helped improve the accuracy of our auto white balance and auto exposure algorithms, and given aesthetic feedback to make our images of people of color more beautiful and more accurate. Although there's still much to do, we're working hard to bring all of what you've seen here, and more, to Google Pixel this fall.

We were all grateful to have video conferencing over the last year. It helped us stay in touch with family and friends and kept businesses and schools going, but there is no substitute for being together in the room with someone. So several years ago we kicked off a project to use technology to explore what's possible. We call it Project Starline. First, using high-resolution cameras and custom-built depth sensors, we capture your shape and appearance from multiple perspectives and then fuse them together to create an extremely detailed, real-time 3D model. The resulting data is huge, many gigabits per second, so to send this 3D imagery over existing networks we developed novel compression and streaming algorithms that reduce the data by a factor of more than 100. And we have developed a breakthrough light-field display that shows you a realistic representation of someone sitting right in front of you, in three dimensions. As you move your head and body, our system adjusts the images to match your perspective. You can talk naturally, gesture, and make eye contact. It's as close as we can get to the feeling of sitting across from someone. We have spent thousands of hours testing it in our own offices, and the results are promising. There's also excitement from our lead enterprise partners, and we plan to expand access to partners in healthcare and media. Thank you for joining us today. Please enjoy the rest of Google I/O, and stay tuned for the developer keynote coming up next. I hope to see you in person next year. Until then, stay safe and be well.
