First Look: Android 16 Beta Released

The first Android 16 beta is here, welcoming both developers and early adopters to explore the latest features. Pixel users can enroll their supported devices to receive over-the-air updates for this and future betas. This build introduces support for enhanced app adaptivity, Live Updates, the Advanced Professional Video format, and more. We value your feedback as we continue to refine Android into a platform for everyone.

After two rounds of rigorous testing through the developer preview program, Google has officially launched the first public beta for Android 16, marking a significant step forward in the development cycle of the next major Android release.

This eagerly awaited public beta is now rolling out as an over-the-air (OTA) update to eligible Google Pixel devices, specifically the Pixel 6 series and newer devices enrolled in the Android Beta Program. This broader release allows a wider range of users to experience the latest features, design tweaks, and under-the-hood improvements that Android 16 offers, providing valuable real-world feedback to Google’s engineering teams.

The public beta represents a critical phase in refining the user experience, stability, and overall performance of the upcoming operating system, as Google gears up for its eventual full release. Enrolled Pixel users can anticipate a fresh look at the Android ecosystem and contribute directly to shaping the final version of Android 16 by participating and reporting any bugs or usability concerns they encounter.

You can enroll any supported Pixel device in the Android Beta Program to receive the update over the air.

Developing Adaptable Android Apps

The landscape of Android app development is evolving, driven by the growing diversity of devices and user expectations for seamless experiences. No longer confined to the limitations of specific screen sizes and orientations, users demand that apps adapt gracefully across their entire ecosystem of devices, from compact smartphones to expansive tablets and foldable screens.

Recognizing this shift, Android 16 marks a significant step forward by phasing out the ability for applications to restrict screen orientation and resizability on larger displays. This change, mirroring features already implemented by OEMs on large-screen devices, empowers users to run apps in any window size and aspect ratio they prefer.

Consequently, targeting API level 36 will require developers to embrace adaptability, ensuring their application UIs scale smoothly on screens exceeding 600dp in width. This means that your application must now be ready to handle both portrait and landscape orientations effectively, maintaining usability and visual appeal regardless of the aspect ratio.

This transition doesn’t come without support; Google is providing comprehensive frameworks, tooling, and libraries specifically designed to aid developers in building robust, responsive applications that seamlessly adjust to different screen sizes and configurations. This push towards adaptive design underscores a critical shift in how developers must approach Android app development – building for fluidity and flexibility is no longer optional, but a core requirement for creating applications that deliver a modern and satisfying user experience.
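As a rough illustration of what adaptive UI can look like in Compose (a sketch using the Material 3 window size class library, not official sample code, with placeholder pane contents):

import android.app.Activity
import androidx.compose.foundation.layout.Row
import androidx.compose.material3.Text
import androidx.compose.material3.windowsizeclass.ExperimentalMaterial3WindowSizeClassApi
import androidx.compose.material3.windowsizeclass.WindowWidthSizeClass
import androidx.compose.material3.windowsizeclass.calculateWindowSizeClass
import androidx.compose.runtime.Composable

// Minimal sketch: choose a layout from the current window width rather than
// locking orientation. Uses the material3-window-size-class library.
@OptIn(ExperimentalMaterial3WindowSizeClassApi::class)
@Composable
fun AdaptiveHomeScreen(activity: Activity) {
    when (calculateWindowSizeClass(activity).widthSizeClass) {
        // Widths under 600dp: keep a single pane
        WindowWidthSizeClass.Compact -> Text("Single pane")
        // Widths of 600dp and above: show list and detail side by side
        else -> Row {
            Text("List pane")
            Text("Detail pane")
        }
    }
}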

Significant Updates:

The behavior of Android applications regarding screen orientation and resizing on large-screen devices like tablets and foldable phones is undergoing a significant shift, with a notable divergence in how apps and games are handled.

Going forward, the system will effectively disregard manifest attributes and APIs that attempt to restrict an app’s orientation (e.g., forcing portrait mode) or prevent resizing. This means that applications, unless they are specifically designated as games, will be expected to adapt fluidly to varying screen sizes and orientations. This change reflects a move towards embracing the diverse form factors and user experiences offered by large screens, ensuring that apps can take full advantage of the available real estate. Previously, many apps, particularly those designed primarily for phones, would struggle with improper layouts, stretching, or a generally poor user experience when displayed on a large screen.

By ignoring these restrictive settings, the Android system encourages developers to build more responsive and adaptable interfaces, thus creating a richer and more seamless experience for users on tablets, foldable devices, and other large-screen formats. While this change is designed to improve app usability across various screen sizes, it specifically excludes games. This allows them to retain the capability to manage orientation and resizing as designed by the developers, acknowledging the unique control requirements often needed for optimal game performance and presentation.

This distinction highlights the different needs of each category and aims to ensure that all types of experiences are optimized for their specific use case.
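To make the change concrete, here is a minimal illustration rather than official sample code: a runtime orientation request like the one below is simply disregarded for non-game apps targeting API level 36 when they run on a display that is at least 600dp wide.

import android.app.Activity
import android.content.pm.ActivityInfo

// On Android 16, for non-game apps targeting API level 36, a request like this is
// disregarded on large screens (600dp and wider); the window remains resizable.
fun lockToPortrait(activity: Activity) {
    activity.requestedOrientation = ActivityInfo.SCREEN_ORIENTATION_PORTRAIT
}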

Timeline:

The landscape of Android app development is undergoing a significant shift, particularly for applications designed to run on larger screens. With the advent of Android 16 (scheduled for 2025), Google is introducing substantial changes aimed at ensuring a more consistent and optimized experience across various device sizes. Specifically, apps targeting API level 36 will be subject to enforced adaptations on large screens (displays at least 600dp wide), encouraging developers to embrace layouts that can fluidly adjust to different form factors.

While this initial implementation provides developers with an opt-out mechanism, this reprieve is temporary. By the time of the Android release in 2026, the changes related to large-screen adaptation will become mandatory for apps targeting API level 37; the opt-out will no longer be available, signaling Google’s commitment to a consistent user experience across tablets, foldables, and other large-screen devices. This transition represents a crucial turning point for developers, urging them to proactively design apps with adaptive layouts in mind.

Thankfully, the shift doesn’t have to be a daunting leap into the unknown. Developers can utilize the app compatibility framework and enable the UNIVERSAL_RESIZABLE_BY_DEFAULT flag to test these overrides without having to change their application’s target API level, giving them valuable time to refine and optimize their layouts. The emphasis is clear: the future of Android app development is adaptive, and this development cycle is a call to action for developers to embrace the new standards and create more flexible, dynamic applications for all users.

This also includes exploring the changes to orientation and resizability APIs introduced in Android 16, ultimately resulting in more polished and user-friendly app experiences across the ever-evolving range of Android devices.

Real-Time Updates

Live Updates introduce a novel approach to notifications, designed to keep users informed and effortlessly connected to important, ongoing activities. This innovative class of notifications transcends simple alerts, providing a dynamic and real-time window into the progress of key events.

At the heart of this new feature lies the ProgressStyle notification template, meticulously crafted to ensure a consistent and intuitive user experience. This template empowers developers to build for progress-centric user journeys, particularly those common in ride-sharing, food delivery, and navigation apps. The ProgressStyle template goes beyond a basic progress bar, offering a rich canvas for user journey visualization.

It provides support for custom icons that delineate the start, end, and current progress points of a journey, while also incorporating segments and milestones to provide comprehensive tracking. Furthermore, it allows for the display of user journey states, keeping users abreast of the current status and any crucial changes.

Although incredibly versatile, ProgressStyle notifications are specifically suggested for ride-sharing, food delivery, and navigation use cases, ensuring their use remains focused on the user experiences where their rich feature set can offer the most value. This new capability allows for a much deeper and more engaging interaction with users, transforming notifications from simple interruptions into an active element of an ongoing activity.

@Override
protected Notification getNotification() {
    // Live Update notification built with the new ProgressStyle template
    return new Notification.Builder(mContext, CHANNEL_ID)
            .setSmallIcon(R.drawable.ic_app_icon)
            .setContentTitle("Ride requested")
            .setContentText("Looking for nearby drivers")
            .setStyle(
                    new Notification.ProgressStyle()
                            // One 100-unit segment; COLOR_ORANGE is an app-defined color constant
                            .addProgressSegment(
                                    new Notification.ProgressStyle.Segment(100)
                                            .setColor(COLOR_ORANGE))
                            // Indeterminate while the app is still matching a driver
                            .setProgressIndeterminate(true))
            .build();
}

New Camera & Media Features

Android 16 is poised to significantly elevate the media capabilities of the platform, bringing substantial advancements to the playback, creation, and editing of high-quality content. This iteration focuses heavily on addressing a critical use case, particularly for social and productivity applications where rich media experiences are paramount.

Users can anticipate a smoother and more responsive playback experience for high-resolution videos and intricate audio files, leveraging improvements in codec support and hardware acceleration. Android 16 introduces enhanced tools and frameworks designed to empower developers to create more sophisticated media editing features directly within their apps.

This means that tasks like trimming videos, applying filters, adjusting audio levels, and adding overlays will be more seamless and accessible than ever before. The platform’s updated architecture will also facilitate more efficient processing of complex media tasks, minimizing lag and maximizing device performance.

The combined effect of these improvements is a significant step forward, enabling a new era of dynamic media interactions within Android applications, directly benefiting the end-user with greater creative flexibility and enjoyment.

Unlocking the Power of Professional Video

Android 16 marks a significant leap forward in mobile video capabilities with the introduction of native support for the Advanced Professional Video (APV) codec. This codec is meticulously engineered to cater to the demanding requirements of professional-level, high-quality video recording and post-production workflows.

Unlike commonly used codecs geared towards compression and efficiency for general consumption, APV prioritizes fidelity and flexibility, ensuring that filmmakers and content creators can capture and manipulate footage with minimal loss of information. The implementation of APV in Android 16 allows for the recording of video with enhanced dynamic range, richer color depth, and lower compression artifacts, ultimately resulting in a superior final product.

This advancement empowers mobile devices to become increasingly viable tools for professional video production, blurring the lines between dedicated camera equipment and the convenience of smartphones. By embracing the APV codec, Android 16 opens the door to a future where mobile video is not just convenient but also consistently delivers the quality and flexibility demanded by the most discerning professionals.

The ability to seamlessly integrate this high-end codec also has implications for post-production workflows, enabling smoother transitions into professional editing suites and providing filmmakers with the granular control necessary to craft compelling visual narratives. This move solidifies Android’s position as a platform that is not just for casual users but also a powerful tool for professional content creators.

Introduction to the APV Codec:

  • Perceptually Lossless Quality: Delivers video quality that is virtually indistinguishable from the original raw video.

  • Low Complexity, High Throughput Intra-Frame Coding: Utilizes intra-frame-only coding (no pixel domain prediction) for faster processing, specifically designed to enhance editing workflows.

  • High Bit-Rate Support: Accommodates high bit-rates up to several gigabits per second (Gbps) for 2K, 4K, and 8K resolution content, facilitated by a lightweight entropy coding scheme.

  • Frame Tiling: Implements frame tiling to support immersive content and enable parallel encoding and decoding processes for faster performance.

  • Flexible Chroma Sampling and Bit-Depths: Supports various chroma sampling formats and bit-depths, offering flexibility for different recording needs.

  • Multiple Decoding/Re-encoding Support: Allows multiple decoding and re-encoding cycles with minimal visual quality degradation.

  • Multi-View and Auxiliary Video Support: Handles multi-view video and additional video data such as depth maps, alpha channels, and previews.

  • HDR Support and Metadata: Supports HDR10/10+ and also allows for the inclusion of user-defined metadata.

The Advanced Professional Video (APV) codec standard is gaining significant traction, particularly with the emergence of a reference implementation provided by the OpenAPV project. This open-source initiative offers developers and researchers a concrete foundation for exploring and utilizing the APV standard’s capabilities.

A key development that underscores the codec’s growing importance is its planned integration into the Android ecosystem. Specifically, Android 16 is slated to include native support for the APV 422-10 Profile. This profile is designed to deliver high-fidelity video by employing YUV 422 color sampling, which preserves more color detail compared to 420 sampling. Furthermore, the 10-bit encoding capability of the APV 422-10 Profile allows for a wider range of color gradations and finer detail in luminance, significantly reducing banding artifacts and enhancing overall image quality.

The target bitrate capacity of up to 2 Gigabits per second (Gbps) for this profile demonstrates its ability to accommodate demanding high-resolution, high-frame-rate video content. This combination of features positions the APV 422-10 Profile as a powerful tool for applications requiring premium video quality, from professional video production workflows to high-end mobile video consumption, ensuring a visually rich and immersive experience for end-users.

The upcoming support in Android 16 marks a major step in the broader adoption and impact of the APV codec.
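Before an app relies on APV for recording or playback, a simple capability check is worth doing. The sketch below is an illustration; the "video/apv" MIME string is an assumption based on the MediaFormat.MIMETYPE_VIDEO_APV constant Android 16 introduces and should be verified against the final SDK.

import android.media.MediaCodecList

// Sketch: returns true if the device exposes a decoder for the APV MIME type.
// The "video/apv" string is an assumption based on the MediaFormat.MIMETYPE_VIDEO_APV
// constant introduced in Android 16; verify it against the final SDK.
fun isApvDecoderAvailable(): Boolean {
    val allCodecs = MediaCodecList(MediaCodecList.REGULAR_CODECS)
    return allCodecs.codecInfos.any { info ->
        !info.isEncoder && info.supportedTypes.any { it.equals("video/apv", ignoreCase = true) }
    }
}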

Night Mode Scene Detection

Android 16 introduces a significant enhancement for developers seeking to seamlessly integrate night mode functionality within their camera applications: the EXTENSION_NIGHT_MODE_INDICATOR. This new API, accessible within the CaptureResult of the Camera2 framework, provides a crucial signal indicating when the camera system is actively engaged in a night mode capture session.

This seemingly small addition unlocks a powerful capability for apps to intelligently manage their user interfaces and camera processing pipelines. By monitoring this indicator, developers can ensure a consistent and optimal user experience, transitioning to appropriate display modes, or adjusting post-processing algorithms in real-time as the device dynamically switches between standard and night mode capture.

The EXTENSION_NIGHT_MODE_INDICATOR represents a crucial advancement that allows for fine-grained control of the camera’s night mode behavior. This feature directly addresses a key challenge highlighted in a recent blog post, “How Instagram enabled users to take stunning low light photos,” which explored practical implementations of night mode and demonstrated how enabling this feature resulted in more high-quality in-app photos, leading to an increase in users sharing these images.

The post teased the upcoming availability of this API, and now, with its release in Android 16, developers have a powerful tool to emulate, and potentially improve upon, the seamless night mode experience offered by apps like Instagram. By leveraging this indicator, developers can move past guesswork and consistently optimize their camera implementations, directly impacting user satisfaction and engagement, and showcasing the benefits of utilizing Android’s evolving camera capabilities.
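As a hedged sketch of how an app might consume this signal, the callback below reads the indicator from each extension capture result; the ON/OFF value constant names are assumptions to confirm against the Android 16 SDK.

import android.hardware.camera2.CameraExtensionSession
import android.hardware.camera2.CaptureRequest
import android.hardware.camera2.CaptureResult
import android.hardware.camera2.TotalCaptureResult
import android.util.Log

// Sketch: watch the night mode indicator reported with each capture result of a
// Camera2 extension session and react in the app's UI or processing pipeline.
// The ON/OFF value constants are assumptions to verify against the final SDK.
val nightModeCallback = object : CameraExtensionSession.ExtensionCaptureCallback() {
    override fun onCaptureResultAvailable(
        session: CameraExtensionSession,
        request: CaptureRequest,
        result: TotalCaptureResult
    ) {
        when (result.get(CaptureResult.EXTENSION_NIGHT_MODE_INDICATOR)) {
            CaptureResult.EXTENSION_NIGHT_MODE_INDICATOR_ON ->
                Log.d("NightMode", "Night mode capture active; keep the capture UI steady")
            CaptureResult.EXTENSION_NIGHT_MODE_INDICATOR_OFF ->
                Log.d("NightMode", "Standard capture in progress")
            else ->
                Log.d("NightMode", "Indicator unknown or not reported on this device")
        }
    }
}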

Vertical Text

The Android 16 update introduces a significant enhancement for text rendering capabilities, specifically addressing the complexities of vertical writing systems. This advancement empowers developers with low-level tools to effectively support languages, like Japanese, that often employ vertical text layouts. The core of this enhancement lies in the introduction of the VERTICAL_TEXT_FLAG within the Paint class.

By setting this flag using Paint.setFlags, developers can instruct the Paint object to interpret and handle text metrics vertically rather than horizontally. This subtle but powerful change has far-reaching implications. Firstly, Paint’s text measurement APIs, which are crucial for calculating the dimensions and positioning of text, are now capable of reporting vertical advances—the distance the text moves vertically on a page or display—instead of their traditional horizontal counterparts. This allows for accurate layout calculations in vertical contexts.

Secondly, the Canvas object, which is the primary drawing surface in Android, is now equipped to draw text vertically when the VERTICAL_TEXT_FLAG is active. This means that developers can seamlessly render vertical text, preserving correct character orientation and line progression.

The impact of this feature is that it provides a crucial foundation for library developers to create sophisticated text-processing and rendering components that handle vertical text layouts efficiently and accurately, opening up new possibilities for user interfaces and applications that cater to diverse linguistic needs.

This fundamental shift towards native support for vertical text in Android eliminates the need for cumbersome workarounds and makes it easier to develop apps and experiences for users whose native languages use vertical writing systems. 

The current landscape of high-level text APIs in popular frameworks like Jetpack Compose and Android’s traditional TextView, along with their associated Layout classes and subclasses, exhibits a significant limitation: a lack of inherent support for vertical writing systems. This means that developers cannot easily render text in a vertical orientation using standard API calls and components. The crucial VERTICAL_TEXT_FLAG, intended to enable vertical text rendering, remains largely unimplemented or ineffective within these readily available UI tools.

val text = "「春は、曙。」"
Box(Modifier
    .padding(innerPadding)
    .background(Color.White)
    .fillMaxSize()
    .drawWithContent {
        drawIntoCanvas { canvas ->
            val paint = Paint().apply { textSize = 64.sp.toPx() }
            // Draw text vertically
            paint.flags = paint.flags or VERTICAL_TEXT_FLAG
            // With the flag set, measureText reports the vertical advance
            val height = paint.measureText(text)
            canvas.nativeCanvas.drawText(
                text, 0, text.length,
                size.width / 2, (size.height - height) / 2, paint
            )
        }
    }) {}

Accessibility: Design for Everyone

Android 16 introduces a suite of new accessibility APIs, empowering developers to create more inclusive and user-friendly applications for everyone. These enhancements aim to make apps more usable and navigable for individuals with diverse needs, ultimately ensuring a better experience for every user.

Supplemental Information

In the realm of Android accessibility, crafting a user experience that is both informative and navigable for individuals with disabilities requires careful consideration of how screen readers interpret UI elements. A common challenge arises when working with ViewGroup objects, containers that hold other views.

Accessibility services, such as TalkBack, often attempt to provide a comprehensive description of a ViewGroup by concatenating the content labels of its child views. While this behavior is generally helpful, it can lead to unintended consequences. If a developer sets a contentDescription for the ViewGroup itself, accessibility services interpret this as an instruction to override any content labels provided by non-focusable child views.

This can be particularly problematic when dealing with composite elements like dropdowns. Imagine a dropdown labeled “Font Family,” which currently has “Roboto” selected. If you set a contentDescription on the ViewGroup to simply “Font Family,” the screen reader will no longer announce “Roboto” as the current selection, creating a confusing and less informative experience. To address this, Android 16 introduced the setSupplementalDescription method, which allows developers to add descriptive text to a ViewGroup without disrupting the content of its child views.

Using setSupplementalDescription, you can label the dropdown as “Font Family” while still allowing accessibility services to announce the current selection, such as “Roboto”, thus creating a more complete and informative user experience. This provides a significant improvement to the ability of developers to provide clear and contextually appropriate accessibility information.
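A hedged sketch of how this might be wired up, assuming the method is surfaced through AccessibilityNodeInfo (as setFieldRequired, described below, is); the fontFamilyDropdown container is hypothetical:

import android.view.View
import android.view.ViewGroup
import android.view.accessibility.AccessibilityNodeInfo

// Sketch only: label the dropdown container "Font Family" without overriding the
// labels of its children, so TalkBack can still announce the selected "Roboto".
// Assumes setSupplementalDescription is exposed on AccessibilityNodeInfo; the
// fontFamilyDropdown parameter is a hypothetical ViewGroup from your layout.
fun labelFontFamilyDropdown(fontFamilyDropdown: ViewGroup) {
    fontFamilyDropdown.accessibilityDelegate = object : View.AccessibilityDelegate() {
        override fun onInitializeAccessibilityNodeInfo(host: View, info: AccessibilityNodeInfo) {
            super.onInitializeAccessibilityNodeInfo(host, info)
            info.setSupplementalDescription("Font Family")
        }
    }
}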

Required form fields

The introduction of setFieldRequired to AccessibilityNodeInfo in Android 16 marks a significant advancement in accessibility support for form fields. This seemingly small addition carries substantial implications for users navigating digital forms, offering a crucial mechanism for apps to communicate the required status of form elements to accessibility services.

Before this update, visually impaired users or those relying on assistive technologies often struggled to distinguish required form fields from optional ones. This ambiguity could lead to incomplete form submissions, frustration, and wasted time. With setFieldRequired, apps can now programmatically declare whether a field mandates user input, allowing accessibility services to relay this information to the user effectively.

This enhancement is particularly vital in scenarios where numerous form fields are present, even for something as fundamental as a required terms and conditions checkbox. By enabling users to identify required fields consistently and intuitively, setFieldRequired facilitates swift and efficient navigation between these critical elements, ensuring that users can complete forms accurately and without unnecessary hurdles. Ultimately, this contributes to a smoother, more inclusive, and more user-friendly digital experience for a wider range of users.
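A minimal sketch, assuming a plain CheckBox hosts the terms and conditions (the view name is hypothetical; setFieldRequired is the AccessibilityNodeInfo API described above):

import android.view.View
import android.view.accessibility.AccessibilityNodeInfo
import android.widget.CheckBox

// Sketch: mark a terms-and-conditions checkbox as a required field so that
// accessibility services can announce it as required. termsCheckBox is a
// hypothetical view; setFieldRequired is the Android 16 API described above.
fun markTermsCheckboxRequired(termsCheckBox: CheckBox) {
    termsCheckBox.accessibilityDelegate = object : View.AccessibilityDelegate() {
        override fun onInitializeAccessibilityNodeInfo(host: View, info: AccessibilityNodeInfo) {
            super.onInitializeAccessibilityNodeInfo(host, info)
            info.setFieldRequired(true)
        }
    }
}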

Exploring Gemini Extensions

The launch of new Gemini Extensions on Samsung’s Galaxy S25 series marks a significant leap forward in the integration of large language models with mobile ecosystems. This groundbreaking development showcases how Android applications can harness the power of Gemini, Google’s cutting-edge AI model, to deliver richer and more intuitive user experiences.

These extensions aren’t just about adding fancy features; they represent a fundamental shift in how apps interact with users, offering the potential for contextual understanding, intelligent task automation, and personalized content generation directly within the applications we use every day. Imagine an email app that intelligently summarizes long threads or a note-taking app that suggests relevant connections between your thoughts, all powered by the seamless integration of Gemini.

This is not just limited to smartphones; the ambition is to extend this innovative functionality to a wider array of devices and form factors, paving the way for a future where AI is deeply woven into the fabric of our digital lives, making technology more accessible, intelligent, and ultimately, more human-centric. This initial deployment on the Galaxy S25 series serves as an exciting preview of what’s to come as we work towards bringing this power to even more users and devices.

Two Android API Releases in 2025

In 2025, the Android platform will see a significant evolution through a dual-release strategy, designed to balance impactful updates with developer stability. The first major release, slated for Q2, will introduce not only new developer APIs but also critical behavior changes that could potentially affect existing applications.

This Q2 launch will be the sole release in 2025 to incorporate such changes, emphasizing the importance of developer preparedness and thorough testing. Following this, a second, minor release will arrive in Q4, bringing with it a suite of feature updates, performance optimizations, and bug fixes. Crucially, the Q4 release will not introduce any app-impacting behavior changes, focusing instead on refinement and platform stability.

While these two key releases anchor the year, Android will maintain its commitment to continuous quality with quarterly updates in Q1 and Q3, providing incremental improvements and ensuring a smooth user experience. The team is also prioritizing collaboration with device partners to ensure the wide and rapid adoption of the Q2 major release across the ecosystem. Finally, Google Play’s annual target API level requirement for app updates will remain tied to the major release, aligning with the Q2 launch and simplifying planning for developers.

This structured approach aims to provide developers with a clear understanding of the year’s roadmap, allowing them to efficiently manage their update cycles while also providing users with the latest innovations and the highest level of quality.

App Compatibility Guide

The journey to Android 16 begins in November 2024 with the launch of the preview program, providing developers with a valuable head start to prepare their applications for the next iteration of the Android ecosystem. This program will run until the final public release in the second quarter of 2025, offering a series of iterative updates at key development milestones.

Each update will furnish developers with essential tools, including SDK tools, system images, emulators, comprehensive API references, and detailed API diffs, enabling them to thoroughly explore and integrate new features into their apps. Throughout the preview program, critical APIs will be highlighted in blogs and on the Android 16 developer website, ensuring developers remain informed and can proactively test the newest functionalities.

A significant milestone within the preview program is the Platform Stability target, slated for March 2025. At this point, the final SDK/NDK APIs, as well as final internal APIs and app-facing system behaviors, will be delivered. This marks a pivotal moment in the development cycle, granting developers several months to finalize their testing efforts and address any potential compatibility issues before the official launch of Android 16.

Detailed information regarding the release timeline is available on the Android 16 developer website, empowering developers to align their workflows with the Android 16 release schedule.

Android 16: First Steps

The Android 16 beta program is now officially open, offering users a chance to experience the latest features and improvements firsthand. If you own a compatible Pixel device, enrolling in the beta program is straightforward: you can now receive Android 16 beta updates, and subsequent updates, directly over-the-air, just like regular software updates. This eliminates the need for manual flashing and provides a seamless way to test the new Android version.

For those without a Pixel device, don’t fret – you can still participate by utilizing 64-bit system images in the Android Emulator within Android Studio, allowing developers and enthusiasts alike to explore the new operating system within a simulated environment. Notably, if you’re already part of the Android 16 development cycle, specifically on Developer Preview 2, or are currently enrolled in the existing Android Beta program, you will automatically be offered an over-the-air update to Beta 1 of Android 16.

This streamlined update process ensures that everyone actively involved in testing can effortlessly transition to the latest beta iteration. Furthermore, a critical note for users involved in the Android 25Q1 Beta: If your intention is to exit the Beta program and receive the final, stable release of 25Q1, it’s crucial to disregard the incoming over-the-air update that will bring you to 25Q2 Beta 1.

Instead, you must patiently wait for the official release of the 25Q1 final stable version. Proceeding with the 25Q2 Beta 1 update would place you on the next development cycle and potentially hinder your ability to easily return to the final stable release of the previous version. This important distinction highlights the necessity of understanding the release cycles within the Android ecosystem.
