Google I/O is one of the most popular software conferences in the world. Hosted by Google, the event showcases the new technology emerging at the company and the enhancements coming to its existing products.
The Google I/O event is usually held around May or June and lasts up to three days. It could not be organised last year due to the coronavirus pandemic.
The highlights from the Google I/O event held from 18 May to 20 May 2021 are described below:
Android, first released in 2008, now powers over 3 billion active devices, making it the world's most widely used mobile operating system. Around February of this year, Google released the first preview of Android 12, the latest version of the OS, and it revealed at the Google I/O event that Android 12 would have a fresh look.
Android 12 introduces a new design language called "Material You". Material You is a flexible user interface system that personalises the entire experience, from colours and forms to light and motion, so the phone feels tailored to its owner.
We can customise our phones with wallpapers and custom colour palettes, and the OS will extract colours from them. It determines which colour is dominant, selects shades that match the wallpaper and our chosen colours, and applies them across the OS, including the notification UI.
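Google has not published the extraction algorithm behind this here, but the core idea of picking a dominant colour can be sketched with a toy quantise-and-count approach (the function name and bucket size below are illustrative, not Google's implementation):

```python
from collections import Counter

def dominant_colour(pixels, bucket=32):
    """Return the most frequent colour bucket in a list of (r, g, b) pixels.

    A toy stand-in for wallpaper colour extraction: quantise each channel
    into coarse buckets, count the buckets, and pick the winner.
    """
    quantised = [
        (r // bucket * bucket, g // bucket * bucket, b // bucket * bucket)
        for r, g, b in pixels
    ]
    return Counter(quantised).most_common(1)[0][0]

# A mostly-blue "wallpaper" with a couple of red pixels.
wallpaper = [(10, 20, 200)] * 8 + [(250, 10, 10)] * 2
print(dominant_colour(wallpaper))  # (0, 0, 192) — the blue bucket wins
```

A real pipeline would also derive complementary and accent shades from the winning colour; this sketch only finds the dominant one.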
The new Android 12 also adds privacy features such as the Privacy Dashboard, which gives a transparent view of which applications have accessed data on the phone, including SMS, calls, location, and the camera.
Two of the biggest tech companies, Google and Samsung, have formed an alliance and announced that they will be creating a merged wearable platform. This platform will enable developers to build applications that work on both Samsung's Tizen and Google's Wear OS.
This platform will be based on Samsung's Tizen in combination with Google's Wear OS. Google says apps will start up to 30% faster than on older devices, with longer-lasting battery life.
In recent years, Google has been bridging distances; this time, it is attempting to bring people closer to home. We may feel comfortable calling or chatting with loved ones over Zoom, and Google Meet does a great job of connecting us, but nothing beats sitting down and talking with someone face to face.
This is why Google announced Project Starline, which it demonstrated at the Google I/O event, though the project is still experimental. The technology enables 3D calls, using depth sensors, high-resolution cameras, and a light field display to show realistic 3D representations of the people we communicate with. It relies on computer vision, real-time compression, spatial audio, and machine learning.
Google has been deeply involved in machine learning and artificial intelligence, and this time it is introducing a new AI language model, LaMDA. It was created as an improvement over earlier language models, designed to make conversations flow more naturally.
LaMDA would be able to hold more intuitive discussions, handle queries it was not explicitly trained for, and answer more open-ended questions.
Google is overhauling its productivity suite to make it more integrated. It believes its communication and collaboration platforms do not yet truly bring people together: users collaborating on a Docs project are still separated from one another, so Google wanted a tool that unites them. As a result, "Smart Canvas" was revealed.
The Smart Canvas tool is integrated into Google Workspace to let users plan and brainstorm projects together. While working on a project, users can share their views, leave notes, and start a Google Meet video call, and this applies across Google Docs, Sheets, and Slides.
Google is known for its heavy investment in AI, and now it is bringing AI to Google Photos. At the Google I/O event, it was announced that Google Photos will create a cinematic experience from your photos: from two images taken seconds apart, it will generate an animation.
How does it do that? With the help of AI, Google Photos predicts the frames that would fall in between the two shots and stitches them into a video animation or GIF.
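Google's model predicts motion between the shots; the simplest possible illustration of "in-between" frames is a plain cross-fade, sketched below (a toy stand-in, not the actual technique):

```python
def interpolate_frames(frame_a, frame_b, steps=3):
    """Linearly blend two frames into `steps` intermediate frames.

    Real interpolation models estimate motion between the shots; this toy
    version just cross-fades pixel values, which illustrates the idea of
    synthesising frames that never existed.
    """
    frames = []
    for i in range(1, steps + 1):
        t = i / (steps + 1)  # blend weight for this intermediate frame
        frames.append([
            [round(a + (b - a) * t) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)
        ])
    return frames

first = [[0, 0], [0, 0]]        # a dark 2x2 "photo"
second = [[100, 100], [100, 100]]  # a brighter one, seconds later
print(interpolate_frames(first, second, steps=1)[0])  # [[50, 50], [50, 50]]
```

Playing the original, the generated frames, and the second shot in sequence gives the animation effect described above.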
A new Locked Folder feature will also be added to Google Photos. It lets users create a private folder that can be unlocked using a fingerprint or face sensor, password, PIN, or pattern. Photos in the folder are kept hidden and will not appear while scrolling.
Google Password Manager is a tool that stores passwords and retrieves them on the fly. Google has now revealed an update that will let users change a website's saved password from inside Google Password Manager itself.
You may have noticed that some routes suggested by Google Maps are neither the safest nor the most fuel-efficient. Google announced a better route-selection method for Google Maps. It will employ machine learning over data about the road network, weather patterns, and traffic to propose routes that are more stable and less vulnerable to traffic jams, hard braking, and weather disruptions.
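Google has not detailed its model, but the underlying idea of folding risk penalties into route costs can be sketched with an ordinary shortest-path search (the graph, penalty factors, and function below are all hypothetical):

```python
import heapq

def safest_route(graph, start, goal):
    """Dijkstra-style search over edge costs that fold in risk penalties.

    graph: {node: [(neighbour, base_minutes, penalty_factor), ...]}
    penalty_factor > 1 models congestion, hard-braking risk, or bad
    weather on that road, making it less attractive than its raw time.
    """
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, minutes, penalty in graph.get(node, []):
            if neighbour not in seen:
                heapq.heappush(queue, (cost + minutes * penalty, neighbour, path + [neighbour]))
    return float("inf"), []

roads = {
    "home": [("highway", 10, 2.0), ("backroad", 15, 1.0)],  # highway is congested
    "highway": [("office", 5, 2.0)],
    "backroad": [("office", 8, 1.0)],
}
print(safest_route(roads, "home", "office"))  # (23.0, ['home', 'backroad', 'office'])
```

With the penalties applied, the nominally longer back road beats the congested highway, which is the kind of trade-off the announcement describes.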
The latest addition to Google Maps is the Live View AR tool, which helps users find their way through indoor spaces as well, for example inside train stations, airports, and other large indoor venues.
This functionality uses AR to overlay digital arrows and markers on the real world for easy navigation. Google announced that the new feature will also be able to surface details such as busy areas, landmarks, reviews, and photos.
Google's Android TV has been a success: at the Google I/O event, Google announced that Android TV has reached 80 million monthly active devices. It is expanding the platform's capabilities and revealed a TV remote built into Android smartphones.
Users will be able to control the TV from their phones, including adjusting the volume, switching apps, and operating other Android TVs paired with the phone.
Google revealed a health tool for skin conditions. The app lets you take a close-up picture of the affected area of skin, after which you are asked questions about your skin type, symptoms, any underlying diseases, and general health. The tool then assesses your skin condition based on your answers and the photo.
The results will include the name of the skin condition, its symptoms, whether it is contagious, how serious it is, possible treatments, how long it typically lasts, how common it is, and much more. It is like having a dermatologist on call at all times.
Google has released a new iteration of its TPU processors: TPUv4, the fourth generation. Tensor Processing Units (TPUs) are Google's custom-built application-specific integrated circuits for neural-network machine learning, and TPUv4 is much faster than previous models.
These chips will be assembled into pods, with each pod containing 4,096 v4 TPUs and a processing capacity of one exaflop (10^18 floating-point operations per second), according to Google CEO Sundar Pichai. Developers will be able to use these TPUs on the Google Cloud Platform.
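The keynote did not break the figure down per chip, but the two numbers above imply a rough per-chip average, which a quick back-of-the-envelope check makes concrete:

```python
# Back-of-the-envelope check: a pod rated at one exaflop with
# 4,096 TPUv4 chips implies roughly this much compute per chip.
pod_flops = 1e18       # 1 exaflop, as stated in the keynote
chips_per_pod = 4096
per_chip = pod_flops / chips_per_pod
print(f"{per_chip:.3e} FLOP/s per chip")  # 2.441e+14, i.e. roughly 244 teraflops
```

This is only an average derived from the stated pod figures, not an official per-chip specification.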
Being a powerhouse of machine learning, Google announced that it will bring a Private Compute Core feature to Android. This software will keep sensitive user data processed by the operating system safe, local, and private.
Google confirmed that Wear OS, its smartwatch operating system, will get more fitness applications. It will also host other healthcare applications that monitor things like heart rate and other health signals. The announcement follows Google's purchase of Fitbit, a wearables pioneer in fitness bands.
Cameras have historically been tuned for lighter skin and struggle with darker skin. Google has been improving its imaging pipeline, and it revealed that its latest Android camera will capture and render dark skin tones and natural hair more accurately.
There will be adjustments to auto-white-balance and related algorithms to produce accurate images of skin and hair.
Google is working with BMW to develop a digital car key that lets owners lock, unlock, and start their vehicles using their Android smartphones. This will be available in Android 12, Google's most recent smartphone operating system.
When the user approaches the vehicle, the phone can unlock the car and exchange simple commands with it. The feature uses UWB (Ultra-Wideband) and NFC (Near-Field Communication) technologies.
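The two radio technologies play different roles: UWB gives a secure distance estimate for hands-free unlock, while NFC requires an explicit tap. A toy sketch of that gating logic (the function, threshold, and parameters are hypothetical, not the actual protocol):

```python
def can_unlock(distance_m, uwb_available, nfc_tapped):
    """Toy gating logic for a digital car key (illustrative only).

    NFC needs a deliberate tap on the door handle, so it always unlocks;
    UWB ranges the phone securely, so allow hands-free unlock only when
    the phone is measured to be close to the car.
    """
    if nfc_tapped:
        return True
    return uwb_available and distance_m <= 2.0  # hypothetical proximity threshold

print(can_unlock(1.5, uwb_available=True, nfc_tapped=False))   # True: phone is nearby
print(can_unlock(5.0, uwb_available=True, nfc_tapped=False))   # False: too far away
print(can_unlock(5.0, uwb_available=False, nfc_tapped=True))   # True: explicit tap
```

The real system adds cryptographic authentication on top; the sketch only shows why distance matters for the hands-free path.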
The important highlights of the Google I/O event for a mobile application development company are as follows:
At the Google I/O event, it was revealed that Flutter, Google's UI toolkit, will get a new upgrade, Flutter 2.2, bringing improvements across the toolkit.
Google has released an update to Firebase. The majority of the changes were made to its tools, such as Remote Config and the Firebase Personalization feature. Google also enhanced Firebase's analytics and monitoring services, especially Crashlytics, which helps app and game developers determine why their apps have crashed.
Google has made several changes to its popular IDE, Android Studio. Google announced that Android Studio will include tooling for Jetpack Compose, Android's new UI toolkit. This new design environment will also help developers build applications that are accessible to people with disabilities.
With technology advancing every day, it is hard for companies to keep up. For many business owners, taking their services digital is a huge step. Therefore, business owners need to reach out to the right mobile application development company to upgrade their business.