LuchaKliq.com
Home base for mutual aid, antifascism, counter culture, activism, news, truth, peace, love, gaming, and more!
Sunday, December 7, 2025
Android OS: FLAWS and Security threats
The Silent Erosion of Trust: An Exhaustive Analysis of Unmitigated Architectural Flaws and the Pixnapping Side-Channel in the Android Ecosystem
Executive Summary
The security posture of the Android operating system in late 2025 presents a complex paradox. On the surface, the ecosystem appears robust, fortified by a predictable cadence of monthly security bulletins and the mature "Project Mainline" initiative designed to modularize updates. However, beneath this veneer of rigorous patch management lies a systemic and widening fracture in the platform's architectural integrity. While high-profile, actively exploited zero-day vulnerabilities such as CVE-2025-48633 (Information Disclosure) and CVE-2025-48572 (Elevation of Privilege) capture the attention of the public and the cybersecurity industry, they are symptoms rather than the disease: transient defects that obscure deeper, structural weaknesses in the platform.
This report posits that the single "biggest" flaw in Android is not a transient code error, but the persistence of the Pixnapping side-channel vulnerability (CVE-2025-48561) combined with the enabling "Won't Fix" policies regarding application enumeration and the structural "Patch Gap" inherent to the fragmented hardware supply chain. Unlike traditional vulnerabilities that can be remediated with a binary patch, Pixnapping exploits the fundamental physics of modern mobile hardware—specifically, the data-dependent compression algorithms of Graphics Processing Units (GPUs)—to bypass the entirety of the Android permission model.
Furthermore, the analysis reveals that the most dangerous threat vectors are no longer the "Zero-Days" unknown to the vendor, but the "Forever Days"—vulnerabilities in drivers (Qualcomm Adreno, Arm Mali) and baseband firmware that remain unpatched on consumer devices for months or years despite the existence of upstream fixes.
1. The Architectural Crisis of Android Security
To understand the magnitude of the flaws discussed in this report, one must first dismantle the assumption that a "patched" Android device is a secure one. The traditional metric of security—the "Security Patch Level" date displayed in the system settings—has become an increasingly unreliable indicator of actual resilience against sophisticated threat actors.
1.1 The Illusion of the Monthly Patch
The monthly Android Security Bulletin serves as the primary mechanism for disclosing and remediating vulnerabilities. However, this bulletin represents a bifurcation of responsibility. It addresses vulnerabilities in the Android Open Source Project (AOSP) code, which Google controls, and lists vulnerabilities in vendor-specific components (Closed-Source Components), which Google does not control.
As of December 2025, the gap between the disclosure of a vendor vulnerability and its remediation on end-user devices has widened significantly. For example, critical vulnerabilities in Qualcomm's Adreno GPU drivers (CVE-2025-21479) were patched by the vendor in May 2025 but are only beginning to appear in OEM firmware updates in December 2025.
1.2 Defining the "Biggest" Flaw: Systemic vs. Specific
In the context of this report, the "biggest" flaw is defined not by the Common Vulnerability Scoring System (CVSS) score alone, but by a matrix of impact, exploitability, and persistence. A vulnerability that allows Remote Code Execution (RCE) but is patched within days is severe but transient. Conversely, a vulnerability like Pixnapping, which allows for the silent exfiltration of two-factor authentication (2FA) codes and encrypted messages without any permissions, and which remains unmitigated due to architectural constraints ("Won't Fix"), represents a systemic failure of the platform's security guarantees.
The Pixnapping vulnerability is particularly pernicious because it invalidates the user's trust in the visual output of their device. If an application can infer the contents of the screen by measuring the electromagnetic or timing side-effects of the GPU, then the visual isolation that underpins the security of banking apps, password managers, and secure messengers is broken.
1.3 The "Won't Fix" Culture and Its Consequences
A recurring theme in the analysis of Android's insecurity is the classification of dangerous behaviors as "Intended Behavior" or "Infeasible to Fix." The Google Issue Tracker and Project Zero archives are replete with reports of vulnerabilities—ranging from Intent enumeration to kernel address leakage—that are closed with a "Won't Fix" status.
This culture of acceptance stems from the tension between security and usability. Fixing the ability of apps to probe for the existence of other apps (a key enabler for targeted attacks) would break the rich inter-app integration that users expect (e.g., "Share to Instagram"). However, by prioritizing this interoperability, Android has codified a level of "permissive insecurity" that malware authors have learned to exploit as a reliable feature of the OS environment.
2. Pixnapping: The Unmitigated Visual Side-Channel
The emergence of the Pixnapping attack vector (CVE-2025-48561) in 2025 marks a watershed moment in mobile security research. It demonstrates that the software-defined permission model of Android (AndroidManifest.xml) is impotent against attacks that leverage hardware side-channels.
2.1 Theoretical Foundations: Side-Channels in Modern Computing
Side-channel attacks exploit the physical implementation of a computer system rather than weaknesses in the implemented algorithms. Classic examples include analyzing power consumption (Differential Power Analysis) or electromagnetic emissions to recover cryptographic keys. In the web context, researchers like Paul Stone demonstrated in 2013 that timing differences in browser rendering could leak information about visited links ("Pixel Perfect Timing Attacks").
Pixnapping brings this concept to the native Android application environment. It relies on the observation that modern computing is optimized for speed and efficiency, often at the cost of constant-time execution. When a system processes data A faster than data B, and the user (or attacker) can measure that time difference, the content of the data leaks.
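The principle can be illustrated with a toy early-exit comparison. The function and digits below are purely illustrative (not any real Android API), and an operation count stands in for elapsed time so the leak is deterministic:

```python
def compare_leaky(secret: str, guess: str) -> tuple[bool, int]:
    """Early-exit comparison; the step count stands in for elapsed time."""
    steps = 0
    for s, g in zip(secret, guess):
        steps += 1
        if s != g:
            return False, steps          # bails out at the first mismatch
    return len(secret) == len(guess), steps

# A guess sharing a longer prefix with the secret "runs longer", leaking
# roughly one character of the secret per round of guessing.
_, t_miss = compare_leaky("482913", "900000")    # wrong first digit
_, t_prefix = compare_leaky("482913", "480000")  # first two digits right
```

Here `t_prefix > t_miss`, so an attacker who can only observe "how long it took" still learns how much of the guess was correct. Pixnapping applies the same logic with GPU frame times in place of loop iterations.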
2.2 The GPU.zip Precursor and Hardware Roots
The foundational research enabling Pixnapping is the "GPU.zip" vulnerability, disclosed in 2023.
Table 1: GPU Compression Characteristics
| GPU Architecture | Compression Strategy | Impact on Rendering Time |
| --- | --- | --- |
| Arm Mali | Transaction Elimination / Smart Composition | High Variance: Skips rendering for static blocks; highly sensitive to color complexity. |
| Qualcomm Adreno | Universal Bandwidth Compression (UBWC) | Moderate Variance: Compresses blocks based on entropy; faster fetching for low-entropy (solid color) blocks. |
| Implication | Data Leakage | Rendering time $T$ is a function of pixel color $C$: $T(C_{white}) \neq T(C_{black})$. |
Because the GPU shares main memory (RAM) with the CPU in mobile System-on-Chip (SoC) architectures, the contention for memory bandwidth created by these compression algorithms is measurable by the CPU. Pixnapping weaponizes this correlation.
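The data-dependence at the heart of this can be sketched with zlib standing in for the GPU's proprietary bandwidth compression (UBWC/AFBC are not publicly documented, so this is an analogy, not their actual algorithm); the tile size and contents are invented for illustration:

```python
import random
import zlib

TILE = 64 * 64                           # one 64x64 8-bit tile, a stand-in framebuffer block

solid = bytes([255]) * TILE              # uniform "white" background tile
rng = random.Random(0)                   # seeded so the demo is reproducible
noisy = bytes(rng.randrange(256) for _ in range(TILE))  # high-entropy tile (glyph edges, detail)

c_solid = len(zlib.compress(solid))      # compresses to almost nothing
c_noisy = len(zlib.compress(noisy))      # barely compresses at all
```

Less compressed data means less memory traffic per tile, and on a bandwidth-bound mobile SoC, less traffic means a measurably faster render pass: the software-visible timing becomes a function of the pixel content.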
2.3 Mechanism of Action: CVE-2025-48561
The Pixnapping exploit operates by creating a precise feedback loop between the attacker's application and the target application's visual output. The mechanism requires no special permissions—not READ_FRAME_BUFFER, not PROJECT_MEDIA, and certainly not root access.
The attack leverages the Android WindowManager and the concept of Overlays. Android allows applications to spawn "Activities" (screens) that are partially transparent. When a transparent activity is placed on top of another application, the GPU must composite both layers.
The attacker forces the target application (e.g., Google Authenticator) to launch via an Intent. The malware then immediately overlays its own transparent activity. To the user, it might look like the phone momentarily lagged, or the malware might present a benign loading screen that hides the target app underneath. Crucially, the target app is rendering its content to the GPU, even if it is visually obscured by the malware's UI.
2.4 The Attack Chain: From Intent to Extraction
The extraction process is a sophisticated exercise in "blind" reading, akin to braille, but using timing signals instead of touch.
Isolation (The Keyhole): The malware creates a custom UI view that is opaque everywhere except for a single pixel or a small region. This forces the GPU to render the complex target app only through this tiny "keyhole." The malware moves this keyhole programmatically across the screen coordinates where sensitive data (like a 2FA code) is expected to be.
Amplification (The Blur Hammer): A single pixel's rendering time difference is too small to measure reliably against the noise of the OS. To amplify this signal, the malware utilizes the Android Blur API (or similar computationally expensive shader operations). It requests the system to apply a blur effect to the content behind the transparent keyhole.
If the pixel behind the keyhole is White (background), the GPU compresses it efficiently. The blur operation fetches compressed data, executes quickly, and writes back compressed data.
If the pixel is Black (text), the GPU achieves lower compression. The blur operation fetches more data, stalls on memory bandwidth, and takes longer to execute.
Measurement (VSync Monitoring): The malware registers a FrameMetrics listener or monitors the VSync signal (the refresh rate of the screen). It measures the time elapsed between submitting the blur request and the completion of the frame.
Reconstruction: By iterating this process thousands of times per second (a rate achievable on modern 120Hz displays), the malware builds a 2D map of rendering times. This map corresponds to the visual contrast of the underlying screen. Simple Optical Character Recognition (OCR) algorithms are then applied to this map to decode the text.
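The reconstruction step can be simulated in a few lines. The frame-time constants and Gaussian jitter below are hypothetical stand-ins for real GPU timings and OS noise; the point is that averaging many repeated samples per keyhole position recovers the underlying pixel pattern despite noise far larger than the signal:

```python
import random

rng = random.Random(42)                    # seeded for reproducibility

TRUE_PIXELS = [1, 0, 0, 1, 1, 0, 1, 0]     # 1 = dark glyph pixel (slow), 0 = background (fast)
T_FAST, T_SLOW, NOISE = 2.0, 2.4, 1.0      # hypothetical frame times (ms) and jitter std-dev

def sample(pixel: int) -> float:
    """One noisy frame-time measurement for the keyhole over this pixel."""
    return (T_SLOW if pixel else T_FAST) + rng.gauss(0, NOISE)

def reconstruct(samples_per_pixel: int) -> list[int]:
    """Average repeated measurements per position, then threshold to black/white."""
    threshold = (T_FAST + T_SLOW) / 2
    recovered = []
    for p in TRUE_PIXELS:
        mean = sum(sample(p) for _ in range(samples_per_pixel)) / samples_per_pixel
        recovered.append(1 if mean > threshold else 0)
    return recovered
```

With a few thousand samples per pixel (seconds of work on a 120Hz display), the averaged signal cleanly separates dark from light positions even though a single sample is dominated by noise.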
2.5 The Failed Mitigation and the Embargoed Bypass
Google's response to Pixnapping highlights the difficulty of patching architectural flaws. In the September 2025 Android Security Bulletin, Google issued a patch that attempted to mitigate the attack by rate-limiting the Blur API. The logic was that no legitimate application needs to request blur operations at the frequency required for the side-channel attack.
However, researchers immediately found a bypass. While the specific details remain under partial embargo pending a comprehensive fix in late 2025/early 2026, the bypass likely exploits:
Parallelism: Spawning multiple threads or distinct processes to distribute the blur requests, keeping each individual thread under the rate limit while maintaining the aggregate stress on the GPU.
Alternative Stressors: The Blur API is just one way to induce GPU load. Researchers have indicated that other graphical operations—such as complex alpha-blending, specific shader compilations, or manipulating RenderNode properties—can generate similar timing signals.
Because the vulnerability is rooted in the hardware's compression behavior, software patches that target specific APIs (like Blur) are essentially "Whac-A-Mole" solutions. A true fix would require disabling data-dependent compression (ruining battery life and performance) or strictly isolating GPU contexts in a way that prevents timing leaks (a massive kernel/driver re-architecture).
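Why a per-caller rate limit fails against parallelism can be shown with a toy limiter; the limit value and worker counts are invented for illustration, but the arithmetic is the whole bypass:

```python
from collections import Counter

PER_CALLER_LIMIT = 10      # hypothetical: blur requests allowed per caller per window

class PerCallerLimiter:
    """Rate limiter keyed on the requesting thread/process identity."""
    def __init__(self):
        self.counts = Counter()

    def allow(self, caller: str) -> bool:
        if self.counts[caller] >= PER_CALLER_LIMIT:
            return False               # this caller is throttled...
        self.counts[caller] += 1
        return True

limiter = PerCallerLimiter()
# Eight workers each stay politely under their own limit...
granted = sum(limiter.allow(f"worker-{w}") for w in range(8) for _ in range(50))
# ...yet the aggregate GPU load is 8x what the limit was meant to cap.
```

Unless the limiter is keyed on the shared resource (the GPU) rather than the requester, splitting the attack across processes restores the full sampling rate.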
2.6 Implications for Secure Applications (2FA, Banking)
The existence of Pixnapping neutralizes the security model of high-value applications.
Google Authenticator / Authy: These apps rely on the premise that the TOTP code is safe because it is only displayed locally. Pixnapping allows that code to be read by a malicious game installed on the same device.
Signal / WhatsApp: End-to-end encryption protects data in transit. Pixnapping attacks the "Endpoint" of the encryption tunnel—the screen. It can read the decrypted messages as they are displayed to the user.
Banking Apps: Even with FLAG_SECURE enabled (which prevents screenshots), Pixnapping can theoretically succeed because it does not capture the framebuffer; it measures the effort required to generate the framebuffer. While FLAG_SECURE prevents the GPU from writing the final image to a readable buffer, the rendering pipeline still executes, potentially leaving the timing side-channel open in certain driver implementations.
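The value of a leaked authenticator code follows directly from the TOTP algorithm: the six digits on screen are the entire secret for one 30-second window. Below is a standard RFC 6238/4226 computation (a generic sketch, not code from any particular authenticator app):

```python
import hashlib
import hmac
import struct

def totp(key: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP over HMAC-SHA1, the scheme common authenticator apps use."""
    counter = struct.pack(">Q", unix_time // step)         # 64-bit big-endian time step
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # RFC 4226 dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Anyone who reads those digits off the screen within the time step can replay them; no key material, root access, or further cryptanalysis is required, which is precisely why a pixel-level leak defeats the whole scheme.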
3. The "Forever Day" Phenomenon: Anatomy of a Failure
While Pixnapping represents the cutting edge of research, the "Patch Gap" represents the blunt trauma of ecosystem fragmentation. A "Forever Day" is a vulnerability that is technically "known" and "patched" by the maintainer (e.g., the Linux Kernel team or Qualcomm), but remains unpatched on the user's device indefinitely.
3.1 The Supply Chain of Insecurity
The delivery of an Android security update is a relay race with too many participants, where dropping the baton is the norm.
Upstream (Kernel/Vendor): A vulnerability is found in a Qualcomm driver. Qualcomm creates a fix.
Google (AOSP): Google integrates the fix into the Generic Kernel Image (GKI) or publishes it in the Android Security Bulletin.
Silicon Vendor (SoC): The SoC vendor (if not Qualcomm) must update their Board Support Package (BSP).
OEM (Device Maker): Samsung, Motorola, or Xiaomi must merge this new kernel/driver into their device-specific source tree. They must test it against their proprietary modifications (skins like OneUI).
Carrier (ISP): In many markets (especially the US), carriers like Verizon or AT&T must "certify" the update, adding weeks of delay.
User: The user receives the OTA notification.
This pipeline introduces a latency of 3 to 9 months for non-Pixel devices. During this time, the vulnerability is public (N-day), documented in CVE databases, and often has Proof-of-Concept (PoC) exploit code available on GitHub.
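The cumulative exposure can be modeled as a simple sum of per-stage latencies. The stage durations below are assumptions chosen to be consistent with the 3-to-9-month range described above, not measured figures:

```python
from datetime import date, timedelta

# Illustrative per-stage delays (days) for a non-Pixel device.
STAGE_DELAY_DAYS = {
    "vendor fix -> AOSP/GKI integration": 30,
    "AOSP -> OEM source merge and QA": 90,
    "OEM build -> carrier certification": 45,
    "carrier approval -> staged OTA rollout": 30,
}

vendor_patch = date(2025, 5, 1)               # e.g., the SoC vendor ships the fix to OEMs
total_days = sum(STAGE_DELAY_DAYS.values())
user_patched = vendor_patch + timedelta(days=total_days)
```

Under these assumptions a fix released in May does not reach the handset until mid-November, and every day in between is an N-day window in which the bug is public and the device is vulnerable.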
3.2 Case Study: The Arm Mali Driver Saga
The handling of vulnerabilities in the Arm Mali GPU driver provides a quintessential example of this failure. The vulnerability CVE-2022-38181 (and subsequent related flaws in 2024/2025) involved a Use-After-Free (UAF) condition in the GPU's memory management.
Discovery: Reported to Google in July 2022.
Rejection: Google's Android Security Team initially marked it "Won't Fix" because the bug was in Arm's driver code, not AOSP code.
Vendor Fix: Arm released a patched driver version (r40p0) in October 2022.
Exploitation: By November 2022, Google's own Threat Analysis Group (TAG) detected spyware campaigns actively exploiting this exact bug.
Deployment: The fix did not reach widespread Android devices until April 2023—six months after the vendor patch and five months after active exploitation began.
This timeline is not an anomaly; it is the standard operating procedure. In late 2025, similar delays plague the rollout of fixes for CVE-2025-21479 (Qualcomm Adreno), ensuring that attackers have a reliable window of opportunity that lasts nearly half a year.
3.3 Case Study: Qualcomm Adreno Vulnerabilities
In December 2025, the Android Security Bulletin highlighted critical flaws in the Qualcomm Adreno GPU driver (CVE-2025-21479, CVE-2025-21480). These vulnerabilities allow for local privilege escalation due to "Incorrect Authorization" in the GPU microcode.
Table 2: Timeline of Vulnerability Exposure (Qualcomm Adreno Case Study)
| Event | Date | Status | Implications |
| --- | --- | --- | --- |
| Vendor Notification | Jan 2025 | Reported to Qualcomm | Vulnerability known to insiders. |
| Vendor Patch | May 2025 | Released to OEMs | OEMs begin testing; "Patch Gap" starts. |
| Wild Exploitation | June-Aug 2025 | Active Attacks | Spyware vendors deploy N-day exploits. |
| Public Disclosure | Dec 2025 | Android Bulletin | Public becomes aware; exploit becomes commodity. |
| Mass Remediation | Q1-Q2 2026 | OEM OTA Updates | Majority of fleet finally secured. |
The crucial takeaway is the "Wild Exploitation" phase. For three to five months, sophisticated actors used these bugs to compromise high-value targets who were ostensibly "up to date" with their monthly patches, because the monthly patch only covered AOSP bugs, not the delayed vendor blobs.
3.4 The Role of Vendor Binaries (Blobs)
The root cause of these delays is the use of closed-source binary blobs. Unlike the Linux kernel drivers for Intel or AMD graphics on desktop Linux (which are open source and updated with the kernel), mobile GPU drivers are proprietary.
Opacity: Security researchers cannot audit the code.
Dependency: Google cannot patch the driver itself; it must wait for Qualcomm or Arm.
Obsolescence: Once a vendor stops supporting a chipset (usually after 3-4 years), no further security updates are produced. This leaves perfectly functional hardware (e.g., a 4-year-old flagship phone) permanently vulnerable to any new driver flaw discovered. This creates the "Zombie Fleet" of millions of devices that will never be patched.
4. "Intended Behavior": The Vulnerabilities Google Refuses to Fix
Beyond the logistical failures of patching, there exists a category of vulnerabilities that are preserved by administrative decision. These are flaws that Google acknowledges but refuses to fix, citing backward compatibility or "intended design."
4.1 Application Enumeration and Reconnaissance
Information is power in an exploit chain. To launch a successful attack—whether it's Pixnapping or a phishing overlay—the malware must first know what to attack. Does the user have the "Bank of America" app installed? Do they use "Coinbase"?
Prior to Android 11, the getInstalledPackages() API allowed any app to request a full inventory of the device. Android 11 restricted this with the QUERY_ALL_PACKAGES permission. However, researchers quickly found a bypass: Intent Enumeration.
The Flaw: By attempting to send an Intent to a specific package name (e.g., com.coinbase.android), an app can infer the presence of that package based on the system's return code or the time it takes to process the request.
Google's Stance: Google marked reports of this technique as "Won't Fix (Infeasible)". The rationale is that the Intent system is designed to facilitate communication; if the system lied and said an app wasn't there when it was, it would break legitimate "Share" functionality.
Consequence: This decision leaves the reconnaissance phase of the "Cyber Kill Chain" permanently open. Malware can silently profile a user's digital life, identifying high-value targets and tailoring their payload accordingly, all without requesting a single permission.
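The oracle can be modeled abstractly: all that matters is that the system responds differently for installed and absent packages, and that nothing gates the probe. The package names and the INSTALLED set below are hypothetical device state, not real resolution logic:

```python
# Hypothetical device state; the point is the shape of the oracle.
INSTALLED = {"com.example.mailer", "com.coinbase.android"}

class ActivityNotFound(Exception):
    """Stands in for the error the system returns for an unresolvable Intent."""

def send_intent(package: str) -> str:
    if package not in INSTALLED:
        raise ActivityNotFound(package)
    return "DELIVERED"

def probe(package: str) -> bool:
    """No permission needed: the success/failure difference IS the leak."""
    try:
        send_intent(package)
        return True
    except ActivityNotFound:
        return False

profile = {pkg: probe(pkg) for pkg in ("com.coinbase.android", "com.bank.example")}
```

Iterating this over a list of banking, crypto, and messaging package names yields a complete target profile without a single permission prompt.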
4.2 The Kernel ASLR (KASLR) Defeat on ARM64
Address Space Layout Randomization (ASLR) is a critical defense that randomizes the memory locations of key system components, making it harder for exploits to find the code they need to execute.
In July 2025, Google Project Zero identified a flaw in the implementation of KASLR on ARM64 kernels used in Android.
The Issue: On devices using 3-level paging (common in mobile configurations to save memory), the virtual address space available for the kernel is constrained. The calculation used to determine the randomization range (linear_region_size - physical_memory_size) often results in a range of zero or near-zero.
The Result: The kernel is mapped to the same virtual address on every boot.
The "Won't Fix": Google initially categorized this as "Intended Behavior" because it is a mathematical constraint of the hardware/paging choice. While they later agreed to "derestrict" the bug report, the fundamental hardware constraint means that KASLR is effectively non-functional on millions of devices, significantly lowering the bar for developing reliable kernel exploits.
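The arithmetic behind the zero-entropy outcome is short enough to write down. The sizes below are illustrative assumptions (a configuration where installed RAM fills the linear-map window), not the exact constants of any specific kernel build:

```python
import math

GB = 1 << 30

# Illustrative 3-level-paging configuration: the kernel's linear-map window
# is no larger than installed RAM, leaving no slack for the random slide.
linear_region_size = 16 * GB
physical_memory_size = 16 * GB

slide_range = max(0, linear_region_size - physical_memory_size)
entropy_bits = math.log2(slide_range) if slide_range else 0
# With a zero slide range, the kernel lands at the same address every boot,
# and KASLR contributes exactly 0 bits against an exploit author.
```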
4.3 OTA Signature Verification Flaws
The mechanism for updating Android—the Over-The-Air (OTA) update—relies on the Recovery partition to verify the cryptographic signature of the update package.
The Flaw: Researchers at Quarkslab found a bug in the AOSP recovery code where the verification logic checks that a certificate exists in the block but fails to rigorously enforce that it is the correct certificate in specific chaining scenarios.
Impact: An attacker with physical access (or malware with root access trying to gain persistence) could potentially flash a malicious update package that appears legitimate to the recovery system.
The Resolution: Google rated this as "Moderate" and declined to issue a comprehensive fix for older branches, leaving legacy devices permanently exposed to this supply-chain interdiction vector.
4.4 The Legacy of StrandHogg and Task Hijacking
"StrandHogg" is a class of vulnerabilities that exploit the Android multitasking system (taskAffinity). By manipulating how tasks are grouped, a malicious app can inject an activity into the task stack of a benign app. When the user taps the icon for "Gmail," they might actually be shown a phishing login screen injected by malware.
While StrandHogg 1.0 and 2.0 were patched in Android 10/11, the underlying feature—Task Reparenting—remains part of the OS. "Intended Behavior" dictates that apps should be able to group tasks. This insistence on flexibility means that variants of task hijacking continue to emerge (e.g., "Tapjacking" using overlays, or "Task Injection" using new Intent flags). The refusal to strictly enforce task isolation (e.g., "One App, One Task Stack") preserves this entire category of UI deception attacks.
5. The Hardware-Software Divide: Firmware Vulnerabilities
The most opaque and arguably the most dangerous "unpublicized" flaws reside in the firmware that runs on the peripheral processors of the smartphone: the Cellular Baseband and the Digital Signal Processor (DSP).
5.1 Baseband Remote Code Execution (The Silent Threat)
The Baseband Processor (BP) is a separate computer inside the phone that handles 4G/5G radio communication. It runs a proprietary Real-Time Operating System (RTOS).
CVE-2025-21483: In late 2025, a critical vulnerability was disclosed in the Qualcomm modem firmware handling RTP (Real-time Transport Protocol) packets.
Attack Vector: An attacker can exploit this over the air by sending malformed video packets (via VoLTE or Wi-Fi Calling) to the target. No user interaction is required.
Impact: A heap overflow in the modem allows the attacker to execute code on the baseband. From there, they can intercept all calls/SMS, or use the baseband's DMA (Direct Memory Access) privileges to attack the main Android OS.
Detection: Because the baseband runs separately from Android, this exploitation is invisible to the user and to security software running on the phone.
5.2 DSP Vulnerabilities and Audio/Video Processing
The Hexagon DSP in Qualcomm chips handles audio and video decoding to save battery. It also runs proprietary firmware.
The Risk: Vulnerabilities in the DSP (often related to parsing complex media formats) can be triggered by a malicious media file sent via MMS or WhatsApp.
The "Won't Fix" Connection: Similar to GPU drivers, DSP firmware updates are binary blobs that are slow to deploy. A DSP vulnerability discovered in January might not be patched on a user's device until the following year, creating a massive window for "Zero-Click" media exploits.
5.3 The Limitations of Antivirus and EDR
The existence of firmware and hardware side-channel attacks highlights the irrelevance of traditional Mobile Threat Defense (MTD) and Antivirus (AV) solutions on Android.
Visibility Gap: AV apps run as unprivileged user-space applications. They cannot scan the kernel memory for rootkits. They cannot scan the baseband firmware for implants. They cannot monitor the GPU for Pixnapping timing side-channels.
Signature Failure: Pixnapping apps do not have malicious signatures; they look like benign games. They do not request dangerous permissions. Therefore, AV engines—which rely on permission heuristics and signature matching—are blind to the most advanced threats facing the platform.
6. Vendor-Specific Fragmentation and Negligence
The fragmentation of Android is not just about version numbers; it is about the quality of the code added by OEMs. While AOSP code is heavily audited, OEM skins (OneUI, OxygenOS, HyperOS) often introduce elementary security flaws.
6.1 The OnePlus SMS Vulnerability (CVE-2025-10184)
A glaring example of OEM negligence occurred in late 2025 with OnePlus devices.
The Flaw: OnePlus added a custom "PushShopProvider" to their OxygenOS layer. This component exposed the SMS/MMS database to other apps without enforcing the standard Android READ_SMS permission check.
The Negligence: Security researchers at Rapid7 reported this to OnePlus. The vendor ignored the report for five months.
The Fix: A patch was only released after public disclosure forced a PR crisis.
The Lesson: This incident proves that a device can run the latest AOSP security patch and still be critically vulnerable due to "value-add" bloatware introduced by the manufacturer.
6.2 Samsung's Proprietary Attack Surface (Landfall)
Samsung devices, which make up the largest share of the Android market, are frequent targets for spyware due to their custom components.
CVE-2025-21042: This vulnerability in Samsung's proprietary image processing library (libimagecodec.quram.so) was exploited by the "Landfall" spyware campaign.
Zero-Click: The exploit was triggered by processing a malformed DNG image file, likely delivered via a messaging app.
Exclusivity: This vulnerability did not exist in Stock Android or Pixel phones. It was a vulnerability introduced solely by Samsung's desire to have a custom image codec. This demonstrates how OEM customization expands the attack surface significantly.
6.3 The "Stock vs. Skin" Security Gap
The data suggests a clear hierarchy of security:
Google Pixel (Stock/AOSP): Smallest attack surface, fastest updates, fewer custom components.
Top-Tier OEM (Samsung): Large attack surface (many custom apps), but generally good update cadence for flagships.
Mid-Tier/Budget OEM: Large attack surface, poor update cadence, high risk of unpatched "Forever Days."
7. The Commercial Spyware Ecosystem
The "Biggest" flaw in Android is arguably the economic ecosystem that sustains these vulnerabilities. The "Patch Gap" and "Won't Fix" lists are the inventory catalogs for companies like NSO Group, Intellexa, and Candiru.
7.1 Weaponizing N-Days: The Economic Rationale
Developing a true zero-day exploit (finding a bug nobody knows about) is expensive and difficult, often costing millions of dollars.
The N-Day Strategy: It is far cheaper to take a vulnerability that Qualcomm has just patched (but which hasn't reached users yet), reverse-engineer the patch to find the bug, and write an exploit.
Efficiency: This exploit will work on 90% of the target population for the next 6 months.
Attribution: If discovered, the exploit looks like a known bug, making it harder to attribute to a sophisticated actor compared to a custom Zero-day.
7.2 The Role of "Limited, Targeted Exploitation"
Google's security bulletins frequently use the phrase "indications of limited, targeted exploitation". In practice, this wording typically signals that commercial spyware operators are already deploying the bug against a small set of high-value targets, months before the broader device fleet receives the fix.
8. Strategic Recommendations and Future Outlook
The current approach to Android security—reactive patching and permission management—has reached its point of diminishing returns. The existence of Pixnapping proves that software permissions cannot contain hardware side-channels.
8.1 The Necessity of Hardware Isolation (MTE, CHERI)
The only robust defense against memory safety bugs (like the Adreno/Mali UAFs) is hardware enforcement.
MTE (Memory Tagging Extension): Arm's MTE, deployed in the Pixel 8/9, allows the hardware to detect when a pointer accesses memory it shouldn't. This effectively kills entire classes of memory corruption exploits.
CHERI (Capability Hardware Enhanced RISC Instructions): A more advanced future architecture that enforces fine-grained memory protection at the hardware level.
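The MTE idea can be captured in a toy model. This is an analogy, not the real hardware semantics (actual MTE stores a 4-bit tag per 16-byte granule and checks it in the load/store unit; here a Python dict plays that role):

```python
import random

class TaggedHeap:
    """Toy MTE: memory granules and pointers both carry a small tag."""
    def __init__(self, seed: int = 0):
        self.rng = random.Random(seed)
        self.granule_tags = {}                  # address -> current memory tag

    def malloc(self, addr: int) -> tuple[int, int]:
        tag = self.rng.randrange(1, 16)         # assign a random non-zero tag
        self.granule_tags[addr] = tag
        return (addr, tag)                      # the "tagged pointer"

    def free(self, ptr: tuple[int, int]) -> None:
        addr, _ = ptr
        # Retag on free, so any stale pointer no longer matches the memory.
        self.granule_tags[addr] = (self.granule_tags[addr] + 1) % 16

    def load(self, ptr: tuple[int, int]) -> None:
        addr, ptr_tag = ptr
        if self.granule_tags.get(addr) != ptr_tag:
            raise MemoryError("tag check fault: use-after-free detected")

heap = TaggedHeap()
p = heap.malloc(0x1000)
heap.load(p)       # tags match: the access proceeds silently
heap.free(p)       # retagging turns p into a detectable dangling pointer
```

Any subsequent `heap.load(p)` now faults instead of silently reading reallocated memory, which is why MTE turns the Adreno/Mali UAF class from "reliable exploit primitive" into "immediate crash".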
8.2 Architectural Reforms: Virtualization and Micro-Segmentation
To solve the driver problem, Android must move away from the monolithic kernel model.
pKVM (Protected KVM): Google is pushing for "Android Virtualization Framework," where the OS runs as a guest and sensitive computations (like DRM or biometric processing) run in isolated VMs.
Userspace Drivers: Moving GPU and Wi-Fi drivers out of the kernel and into userspace (Microkernel style) would ensure that a driver crash or exploit does not compromise the entire system.
8.3 Conclusion
The "biggest" bug in Android in 2025 is not a single CVE. It is the Pixnapping vulnerability, which stands as a monument to the failure of the permission model. It is the Patch Gap, which turns months-old bugs into active weapons. And it is the "Won't Fix" culture, which normalizes reconnaissance and surveillance as "intended behavior."
Until the ecosystem addresses the fundamental disconnect between hardware reality (side-channels, proprietary blobs) and software security (permissions, open source), the Android platform will remain structurally vulnerable to those with the patience to exploit the gap between the "Patch Released" date and the "Update Installed" date.
Table 3: Summary of Key Unmitigated Risks (Late 2025)
| Vulnerability Domain | Specific Threat | Status | Security Impact |
| --- | --- | --- | --- |
| Side-Channel | Pixnapping (CVE-2025-48561) | Unmitigated / Bypassable | Critical: Silent theft of 2FA/Secrets; bypasses permissions. |
| Kernel/Driver | Adreno/Mali GPU Exploits | Forever Day (Patch Gap) | Critical: Root access; active spyware utilization. |
| Firmware | Baseband RCE (Qualcomm) | Opaque / Unpatched | Critical: Remote compromise; invisible to user. |
| Privacy Design | Intent Enumeration | Won't Fix | High: Enables targeted attacks and profiling. |
| Supply Chain | OTA Verification Flaw | Won't Fix (Legacy) | Medium: Physical access compromise of older fleets. |
For the professional security community, the implication is stark: reliance on "Up to Date" status is insufficient. True security on Android currently requires specific hardware (MTE-enabled devices), minimizing the app footprint (to reduce Intent surface), and a recognition that the screen itself is a compromised medium.