Automotive & IVI

Car Service in AOSP

Car Service in AOSP Explained Simply: For Beginners in Android Automotive

If you’re getting started with Android Automotive OS (AAOS), you’ll quickly run into something called Car Service in AOSP. It’s one of those essential components that makes Android work inside a car — not on your phone, but actually on the car’s infotainment system. In this guide, we’ll break down Car Service in AOSP step-by-step, explain how...

AOSP Architecture in Automotive

AOSP Architecture in Automotive: Building Smarter Infotainment and Connected Car Systems

The automotive industry is going through a digital revolution. Cars are no longer just mechanical marvels; they are becoming smart, connected, and software-driven. At the heart of many modern infotainment and connected car systems is AOSP Architecture in Automotive — the Android Open Source Project adapted for in-vehicle environments. In this blog, we’ll break down AOSP Architecture...

aosp

AOSP Explained: How Google’s Android Without Google Actually Works

If you’ve ever wondered what powers Android at its core, you’ve probably stumbled across the term AOSP — short for Android Open Source Project.

It’s Android… but without Google. Sounds strange, right? Let’s unpack what that really means, why it exists, and how it works in practice.

What is AOSP?

At its simplest, AOSP is the open-source base of Android. It’s the version of Android that Google publishes for anyone to use, modify, and build on — all under the Apache 2.0 open-source license.

Think of it like a barebones Android:

  • It has the operating system code.
  • It has basic apps like a simple dialer, messaging app, and browser.
  • It has the kernel (based on Linux) and system frameworks.

What it doesn’t have: Google’s proprietary services and apps — like Gmail, Google Maps, YouTube, or the Google Play Store. Those are separate from AOSP and require Google licensing.

Why Does AOSP Exist?

When Google first created Android, the goal was to make it free and open so device makers could adapt it to different screen sizes, hardware types, and use cases.

AOSP is Google’s way of ensuring:

  1. Openness: Developers and manufacturers can use Android without asking for permission.
  2. Standardization: There’s a single, consistent base for all Android devices.
  3. Innovation: The community can modify and experiment with Android’s code.

AOSP vs. “Google Android”

Most Android phones you buy (Samsung, Pixel, OnePlus) run a Google-certified Android build, which is AOSP + Google Mobile Services (GMS).

Here’s the difference: AOSP provides the operating system, frameworks, and basic apps, while GMS adds the Play Store, Google apps (Gmail, Maps, YouTube), and the Play Services APIs that many third-party apps depend on. GMS is available only to manufacturers that license and certify their devices with Google.

In short: AOSP is the foundation; GMS is the layer of Google extras.

Where is AOSP Used Without Google?

Not every Android device needs Google. Examples include:

  • Custom ROMs like LineageOS, /e/OS, and GrapheneOS.
  • Chinese smartphones (due to lack of Google licensing).
  • Embedded systems like car dashboards, TVs, and kiosks.
  • Android forks for specialized industries.

These systems use AOSP as a clean slate and replace Google services with their own or open-source alternatives.

How AOSP is Built and Used

The AOSP source code is hosted publicly on android.googlesource.com. Anyone can clone it and build it.

Here’s a simplified example of how a developer might build AOSP for a device:

Bash
# Install required packages
sudo apt-get update
sudo apt-get install git openjdk-11-jdk

# Download the repo tool
mkdir ~/bin
curl https://storage.googleapis.com/git-repo-downloads/repo > ~/bin/repo
chmod a+x ~/bin/repo
export PATH=~/bin:$PATH   # make the repo tool available on PATH

# Initialize the AOSP source for Android 14
repo init -u https://android.googlesource.com/platform/manifest -b android-14.0.0_r1

# Download the source code (this will take a while)
repo sync

# Build the system image
source build/envsetup.sh
lunch aosp_arm64-eng
make -j$(nproc)

  • repo init sets up which Android version you’re working with.
  • repo sync downloads all the AOSP code.
  • lunch selects the target device configuration.
  • make compiles the OS into a system image you can flash.

But Without Google, What’s Missing?

Running pure AOSP is like having a new phone without the “modern conveniences.”

  • No Play Store (you’ll need F-Droid or Aurora Store instead).
  • No Google account syncing.
  • Some apps won’t work if they depend on Google Play Services.

This is why most people using pure AOSP need replacement apps and services.

Why AOSP Matters

Even though most people never use plain AOSP, it’s crucial for:

  • Freedom: Developers can create custom systems without being locked into Google’s ecosystem.
  • Security & Privacy: Privacy-focused ROMs strip out tracking features.
  • Innovation: New Android features often start as AOSP experiments.

Without AOSP, Android wouldn’t be the flexible, global platform it is today.

Conclusion

AOSP is Android’s open heart — the part that anyone can see, modify, and improve. It’s the foundation that makes Android the most widely used mobile OS in the world, while still leaving room for choice between a Google-powered experience or something entirely different.

If you’ve ever thought about building your own OS, customizing an old device, or exploring privacy-first alternatives, AOSP is where that journey begins.

VHAL Interfaces (IVehicle)

Understanding VHAL Interfaces (IVehicle): The Backbone of Android Automotive Integration

Android Automotive OS is powering a growing number of infotainment systems, and at the heart of its vehicle interaction lies a critical component: VHAL Interfaces (IVehicle). These interfaces are what allow Android to talk to your car’s hardware. From reading the speedometer to turning on climate control, everything hinges on this system.

In this post, we’ll break down how VHAL Interfaces (IVehicle) work, why they’re essential, and how developers can work with them to build automotive apps that actually connect with vehicle systems.

What Is VHAL?

VHAL stands for Vehicle Hardware Abstraction Layer. Think of it as a translator between Android and the car’s underlying ECUs (Electronic Control Units). Cars have multiple ECUs controlling everything from brakes to lights, and Android Automotive needs a way to communicate with them.

That’s where VHAL Interfaces (IVehicle) come in. They define how Android gets and sets data to and from the vehicle hardware.

The Role of IVehicle Interface

In Android Automotive, IVehicle is the AIDL (Android Interface Definition Language) interface that enables communication between the Vehicle HAL and the framework.

You can think of IVehicle as a contract. It defines methods the Android system can call to:

  • Get vehicle property values (e.g., speed, fuel level)
  • Set values (e.g., adjust HVAC settings)
  • Subscribe to updates

This interface must be implemented by car manufacturers or Tier-1 suppliers so that Android can access real-time vehicle data.

Anatomy of IVehicle Interface

Here’s a simplified look at what an IVehicle interface might look like:

AIDL
interface IVehicle {
    VehiclePropValue get(in VehiclePropGetRequest request);
    StatusCode set(in VehiclePropValue value);
    void subscribe(in IVehicleCallback callback, in SubscribeOptions[] options);
    void unsubscribe(in IVehicleCallback callback, in int[] propIds);
}

Here,

  • get(): Used to read a vehicle property.
  • set(): Used to write or modify a property (like setting temperature).
  • subscribe(): Listen for changes (like speed updates).
  • unsubscribe(): Stop listening to property changes.

These methods form the foundation of vehicle interaction in Android Automotive.

What Is a Vehicle Property?

A vehicle property is any data point Android can interact with. Each has a unique ID, data type, and permission level. For example:

  • VehicleProperty::PERF_VEHICLE_SPEED: Car speed
  • VehicleProperty::HVAC_TEMPERATURE_SET: Climate control temperature
  • VehicleProperty::FUEL_LEVEL: Fuel level

Each property is defined in the VehicleProperty.aidl file.
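These IDs are not arbitrary numbers: each 32-bit property ID packs together a property group, an area, a data type, and a unique identifier. The plain-Java sketch below composes PERF_VEHICLE_SPEED’s numeric value using the bit-field layout from AOSP’s VehiclePropertyGroup, VehicleArea, and VehiclePropertyType enums (constants copied here for illustration):

```java
public class VhalPropertyId {
    // Bit-field constants mirroring AOSP's VehiclePropertyGroup,
    // VehicleArea and VehiclePropertyType enums.
    static final int GROUP_SYSTEM = 0x10000000;
    static final int AREA_GLOBAL  = 0x01000000;
    static final int TYPE_FLOAT   = 0x00600000;

    /** Compose a full property ID by OR-ing its four bit fields together. */
    static int propId(int group, int area, int type, int uniqueId) {
        return group | area | type | uniqueId;
    }

    public static void main(String[] args) {
        // PERF_VEHICLE_SPEED: a global, float-typed system property with unique ID 0x0207
        int perfVehicleSpeed = propId(GROUP_SYSTEM, AREA_GLOBAL, TYPE_FLOAT, 0x0207);
        System.out.printf("PERF_VEHICLE_SPEED = 0x%08X%n", perfVehicleSpeed); // 0x11600207
    }
}
```

Decomposing an ID the same way tells you at a glance whether a property is global or zoned, and what payload type to expect.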

Implementing a Custom VHAL

Let’s say you’re a car maker. You want Android to read your custom battery voltage data. You’d do something like this:

1. Define the Property

C++
#define VEHICLE_PROPERTY_CUSTOM_BATTERY_VOLTAGE (0x12345678)

2. Add It to Your Property List

C++
VehiclePropConfig config = {
    .prop = VEHICLE_PROPERTY_CUSTOM_BATTERY_VOLTAGE,
    .access = VehiclePropertyAccess::READ,
    .changeMode = VehiclePropertyChangeMode::ON_CHANGE,
    .configArray = {},
    .configString = "Custom Battery Voltage",
};

3. Implement Logic in get()

C++
VehiclePropValue get(const VehiclePropGetRequest& request) override {
    VehiclePropValue value = {};
    if (request.prop == VEHICLE_PROPERTY_CUSTOM_BATTERY_VOLTAGE) {
        value.prop = request.prop;
        value.value.floatValues = {12.6};
    }
    return value;
}

And that’s it. Now Android can read your custom battery voltage.
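Stripped of the HAL plumbing, the logic behind a get()/set() pair is essentially a lookup keyed by property ID. Here is a plain-Java sketch of that idea — not the real VHAL types, purely illustrative:

```java
import java.util.HashMap;
import java.util.Map;

/** Illustrative in-memory store of the kind a VHAL implementation sits on top of. */
public class PropertyStore {
    // The custom property ID from the example above
    public static final int CUSTOM_BATTERY_VOLTAGE = 0x12345678;

    private final Map<Integer, Float> values = new HashMap<>();

    /** Write a property value (what set() ultimately does). */
    public void set(int propId, float value) {
        values.put(propId, value);
    }

    /** Read a property value (what get() ultimately does). */
    public float get(int propId) {
        Float v = values.get(propId);
        if (v == null) {
            throw new IllegalArgumentException("Unknown property: 0x" + Integer.toHexString(propId));
        }
        return v;
    }

    public static void main(String[] args) {
        PropertyStore store = new PropertyStore();
        store.set(CUSTOM_BATTERY_VOLTAGE, 12.6f);
        System.out.println(store.get(CUSTOM_BATTERY_VOLTAGE)); // 12.6
    }
}
```

In a real implementation the value would come from the vehicle bus rather than a map, but the contract — ID in, typed value out — is the same.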

Why Are VHAL Interfaces (IVehicle) Important?

Without VHAL, Android is blind to the vehicle. These interfaces power key services like:

  • HVAC control UI
  • Instrument cluster apps
  • Battery status for EVs
  • Safety features

By standardizing communication, VHAL Interfaces (IVehicle) make it possible for third-party developers to build real, vehicle-aware apps. That’s a game-changer.

Example: Reading Vehicle Speed

Let’s look at a code snippet that reads the vehicle speed.

Requesting Speed in Framework (Java)

Java
Car car = Car.createCar(context);
CarPropertyManager propertyManager =
        (CarPropertyManager) car.getCarManager(Car.PROPERTY_SERVICE);
float currentSpeed = propertyManager.getFloatProperty(
        VehiclePropertyIds.PERF_VEHICLE_SPEED, /* areaId= */ 0);

Here,

  • The CarPropertyManager call goes through the Car Service, which invokes the get() method on the IVehicle HAL interface.
  • The request travels through the HAL to the car’s CAN bus or hardware.
  • The current speed is returned as a float.

Best Practices for Working with VHAL

  1. Don’t poll: Use subscribe() instead of calling get() in a loop.
  2. Permission-aware: Some properties require special permissions.
  3. Optimize data flow: Avoid flooding the system with updates.
  4. Test on real hardware: Simulators are helpful, but actual ECUs may behave differently.
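The “avoid flooding the system” advice boils down to rate-limiting how often updates are forwarded. In a real subscription you express this as an update rate in SubscribeOptions, but the underlying idea can be shown in a small, framework-free Java sketch (illustrative only):

```java
import java.util.concurrent.TimeUnit;

/** Drops samples that arrive faster than a configured minimum interval. */
public class UpdateThrottler {
    private final long minIntervalNanos;
    private long lastAcceptedNanos = Long.MIN_VALUE;

    public UpdateThrottler(long minIntervalMillis) {
        this.minIntervalNanos = TimeUnit.MILLISECONDS.toNanos(minIntervalMillis);
    }

    /** Returns true if a sample arriving at nowNanos should be forwarded. */
    public boolean accept(long nowNanos) {
        if (lastAcceptedNanos != Long.MIN_VALUE
                && nowNanos - lastAcceptedNanos < minIntervalNanos) {
            return false; // Too soon since the last forwarded sample: drop it
        }
        lastAcceptedNanos = nowNanos;
        return true;
    }

    public static void main(String[] args) {
        UpdateThrottler throttler = new UpdateThrottler(100); // at most one sample per 100 ms
        System.out.println(throttler.accept(0));                                  // true
        System.out.println(throttler.accept(TimeUnit.MILLISECONDS.toNanos(50)));  // false
        System.out.println(throttler.accept(TimeUnit.MILLISECONDS.toNanos(150))); // true
    }
}
```

A high-frequency property like vehicle speed can emit many samples per second; thinning them before they reach your UI keeps the system responsive.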

Conclusion

If you want to build or integrate automotive systems with Android Automotive OS, you must understand how VHAL Interfaces (IVehicle) work. They’re the core pathway between Android and the car’s brain.

With the right implementation, you can create apps that do more than just run in the dashboard — they interact with the vehicle in real-time, improving safety, convenience, and experience.

VHAL Interfaces (IVehicle) are not just another Android abstraction. They’re what make Android truly automotive.

Vehicle HAL

What Is Vehicle HAL? How Vehicle HAL Is Changing the Way You Drive

In today’s rapidly evolving automotive world, technology increasingly powers every aspect of your driving experience. One such advancement making a significant impact behind the scenes is Vehicle HAL. You might be wondering, what exactly is Vehicle HAL, and how does it affect the way you drive? Let’s break it down clearly and simply.

What Is Vehicle HAL? 

Vehicle HAL stands for Vehicle Hardware Abstraction Layer. Think of it as the translator between the car’s hardware (like sensors, cameras, control units) and the software apps that make your driving experience smarter and safer. It sits in the middle, handling the nitty‑gritty so app developers can focus on features — not on hardware quirks.

With Vehicle HAL, your car’s systems talk in a standard language. Whether it’s braking, lane‑keeping, infotainment, or diagnostics, everything works through that common interface. That consistency simplifies development, improves safety, and speeds up innovation.

Why Vehicle HAL Matters

1. One Interface, Many Devices

Vehicle HAL gives developers a single, reliable interface to access diverse hardware. Instead of building custom code for each sensor or device, they write once and it works across models — much faster and safer.

2. Faster Updates, Smarter Features

Need to add voice commands, predictive cruise control, or advanced diagnostics? Vehicle HAL decouples hardware from apps. That means updates come quicker and you get new features without long delays.

3. Safety First

By enforcing consistent behavior across hardware components, Vehicle HAL helps reduce bugs and improves reliability. Consistency boosts safety — especially in critical systems like braking or collision avoidance.

4. Interoperability & Modularity

Automakers and suppliers can plug in different parts — cameras, sensors, processors — from various vendors. As long as they follow Vehicle HAL standards, everything integrates seamlessly. This encourages competition and innovation while keeping quality high.

How Vehicle HAL Works

Let’s look at a basic, illustrative example of how a vehicle’s power state might be controlled through a Vehicle HAL-style interface. The class and constant names below (VehicleHal, VehiclePropertyIds.POWER_STATE, and so on) are simplified stand-ins for explanation, not the exact AOSP APIs.

Java
// Example: Controlling Vehicle Power State with Vehicle HAL

public class VehiclePowerController {
    private VehicleHal vehicleHal;

    public VehiclePowerController(VehicleHal hal) {
        this.vehicleHal = hal;
    }

    // Method to turn vehicle power on or off
    public void setPowerState(boolean on) {
        try {
            int powerState = on ? VehiclePropertyIds.POWER_STATE_ON : VehiclePropertyIds.POWER_STATE_OFF;
            vehicleHal.setProperty(VehiclePropertyIds.POWER_STATE, powerState);
            System.out.println("Vehicle power turned " + (on ? "ON" : "OFF"));
        } catch (VehicleHalException e) {
            System.err.println("Failed to set power state: " + e.getMessage());
        }
    }
}

Here,

  • VehicleHal is an object representing the hardware abstraction layer interface.
  • The method setPowerState takes a boolean to turn the vehicle power on or off.
  • VehiclePropertyIds.POWER_STATE_ON and POWER_STATE_OFF are constants representing the hardware power states.
  • The setProperty method sends the command down to the hardware, abstracted away from the specific implementation.

This simple code showcases how Vehicle HAL hides the hardware complexities and presents a clean way to control vehicle functions programmatically.

Benefits of Vehicle HAL for Developers and Drivers

  • For developers: Simplifies app development and testing across multiple vehicle platforms.
  • For drivers: You get a smooth, consistent driving experience with new features delivered faster and more safely.
  • For manufacturers: Promotes modular design, reducing costs and accelerating innovation.

The Future of Driving with Vehicle HAL

As connected and autonomous vehicles advance, the role of Vehicle HAL will grow even more crucial. It will support complex sensor networks, cloud integration, AI-driven decisions, and real-time data sharing between vehicles to make driving smarter, safer, and more enjoyable.

Conclusion

In conclusion, Vehicle HAL is revolutionizing the automotive space by breaking down the barriers between hardware and software. It’s making cars more adaptable, feature-rich, and user-friendly, changing the way you interact with your vehicle every day. Whether it’s through better safety, easier updates, or improved performance, Vehicle HAL is quietly refashioning the future of driving, one line of code at a time.

Drive smarter, safer, and connected — thanks to Vehicle HAL.

Android Automotive OS Architecture

Android Automotive OS Architecture: A High‑Level Overview

Android Automotive OS is Google’s in‑car operating system that runs directly on a vehicle’s hardware. Not to be confused with Android Auto (a phone projection platform), Android Automotive OS Architecture is a complete software stack, ready for infotainment, driver assistance apps, and full vehicle integration.  Let’s dive into its main layers. Android Automotive Architecture A...

GAS

Google Automotive Services (GAS) Compliance: A Developer’s Guide to Licensing, Integration, and Certification

If you’re an OEM or Tier 1 developer integrating Google Automotive Services (GAS) into your Android Automotive OS (AAOS) stack, compliance isn’t just a formality — it’s a binding agreement with Google. Their guidelines are intentionally strict to preserve platform security, ensure a consistent user experience, and maintain API reliability across the ecosystem.

This article takes a deep dive into what GAS compliance actually entails — offering actionable insights for engineers, system architects, and product owners navigating the AAOS landscape.

Quick Primer: What Is GAS?

Google Automotive Services (GAS) is a proprietary suite of applications running on Android Automotive OS (AAOS). It includes:

  • com.google.android.apps.maps (Google Maps)
  • com.google.android.googlequicksearchbox (Google Assistant)
  • com.android.vending (Play Store)
  • com.google.android.gms (Play Services)

Unlike Android Auto, which mirrors from a paired phone, GAS apps run natively on the IVI (In-Vehicle Infotainment) hardware. That requires full-stack integration — kernel to UI.

Licensing GAS (OEM Legal Requirement)

Before any technical work begins, your OEM must sign a GAS License Agreement with Google. This is model-specific, meaning:

  • Each vehicle/trim with a different infotainment configuration = separate GAS approval
  • Google reserves the right to audit or revoke if compliance slips

As a developer, you’ll typically get access to the GAS Partner Portal after your OEM is approved — where SDKs, sample projects, and certification tools are hosted.

Hardware & OS Prerequisites

To be GAS-compliant, your hardware must meet strict thresholds.

Minimum Hardware Spec

Component     | Requirement
RAM           | ≥ 2GB (realistically 4GB+ recommended)
Storage       | ≥ 32GB eMMC or UFS
Connectivity  | Wi-Fi, Bluetooth Classic + LE
GNSS / GPS    | Required for Maps integration
Microphones   | High SNR, beamforming preferred
Audio DSP     | For voice recognition preprocessing

Android Automotive OS

To integrate Google Automotive Services, your IVI system must use a Google-certified build of Android Automotive OS. This typically involves:

  • A certified AOSP base, often from a recent LTS (Long-Term Support) branch
  • HALs and BSPs tailored for IVI use cases, compliant with VHAL (Vehicle HAL) standards
  • A custom UI that respects Google Automotive Services guidelines for system behavior, Assistant integration, and safe navigation

Note: Google prohibits UI customizations that interfere with system-level navigation, Assistant triggers, or driving safety workflows. GAS will not support heavily skinned or fragmented UI shells that break these requirements.

The Test Suites — All Mandatory

Google requires your system to pass a set of test suites to ensure stability and UX consistency.

Compatibility Test Suite (CTS)

Tests Android APIs, permissions, and behavior.

Bash
$ cts-tradefed run cts --module CtsAppSecurityHostTestCases
$ cts-tradefed run cts --module CtsMediaTestCases

Failures often involve:

  • Custom permission models
  • Background activity restrictions
  • Missing system apps

Vendor Test Suite (VTS)

Validates hardware interface layers. You’ll need to flash your build and execute these over adb/fastboot.

Bash
$ vts-tradefed run vts --plan VtsKernelTest

Typical failures:

  • Bad binder transaction handling
  • Incomplete HIDL implementation

Automotive Test Suite (ATS)

Tests GAS apps in the context of AAOS.

Key checks include:

  • Intent resolution from Assistant (ACTION_NAVIGATE_TO)
  • Overlay permission use
  • Play Store update flow

Drivable Test Suite (DTS)

DTS evaluates runtime behavior during actual vehicle use. Google may perform this directly or via OEM-conducted telemetry logs.

Integration Tips for GAS Developers

1. Use CarApp API for Custom Apps

If you’re building companion apps, use the androidx.car.app APIs (Jetpack):

Kotlin
class MyCarScreen(carContext: CarContext) : Screen(carContext) {
    override fun onGetTemplate(): Template {
        return MessageTemplate.Builder("Welcome to MyCar App")
            .setTitle("MyCar")
            .setHeaderAction(Action.APP_ICON)
            .build()
    }
}

2. Use MediaBrowserServiceCompat for Media Apps

GAS expects media apps to use Android’s MediaBrowserServiceCompat so that Assistant can control them:

Kotlin
class MyMediaService : MediaBrowserServiceCompat() {
    override fun onCreate() {
        super.onCreate()
        // Set up your media session and player here
    }

    override fun onGetRoot(
        clientPackageName: String,
        clientUid: Int,
        rootHints: Bundle?
    ): BrowserRoot? = BrowserRoot("root", null)

    override fun onLoadChildren(
        parentId: String,
        result: Result<MutableList<MediaBrowserCompat.MediaItem>>
    ) {
        // Populate the browsable media tree here
        result.sendResult(mutableListOf())
    }
}

3. Assistant Support = Deep Linking Required

Make sure you support Google Assistant voice intents. This requires implementing App Actions schema or handling common Intents.

XML
<intent-filter>
    <action android:name="android.media.action.MEDIA_PLAY_FROM_SEARCH" />
</intent-filter>

Handle queries like “Play Arijit Singh songs on MyCar App”.

Privacy & Data Handling for GAS Compliance

As a developer, your GAS integration must comply with Google and regional privacy rules.

You must:

  • Avoid tracking without user consent
  • Route sensitive data via Android Keystore or SafetyNet
  • Support user-level account deletion (GDPR/CCPA)
  • Never misuse the Location or Microphone data exposed via GAS APIs

Pro Tips for Dev Teams

  • Use Emulator Images: GAS builds aren’t public, but you can prototype using the AAOS emulator system images Google distributes (for example, through the Android SDK manager).
  • Leverage VHAL correctly: Don’t shortcut vehicle HAL integrations — Google’s certification expects clean VehicleProp handling.
  • Automate testing with TradeFed: You’ll be running these tests often. Use TradeFederation to orchestrate builds and reports.

Conclusion: Build for Compliance, Not Just Launch

GAS compliance is a high bar. But it’s not just bureaucracy — it’s about delivering a polished, secure, responsive infotainment system users can trust.

As a developer, your role is to make sure the AAOS stack:

  • Runs clean, certified builds
  • Passes all test suites
  • Delivers a user experience aligned with Google’s best practices
  • Handles data securely and transparently

Once certified, your GAS integration unlocks the full power of Google’s ecosystem — and keeps your vehicles competitive in a connected world.

Jetpack Glance Media Playback

A Deep Dive into Using Jetpack Glance for Media Playback in Android Automotive OS

As vehicles evolve into digital experiences, the need for glanceable, fast, and distraction-free interfaces becomes paramount. In Android Automotive OS (AAOS), this demand has led to the emergence of the Jetpack Glance framework — a powerful tool for creating UI surfaces that are lightweight, fast to load, and safe for drivers to interact with.

In this blog post, we’ll explore how Jetpack Glance can be used to build a media playback card for Android Automotive OS. From setting up dependencies to implementing a full-featured glanceable media widget with play/pause/skip functionality — we’ll walk through the full picture with code, context, and best practices.

What is Jetpack Glance?

Jetpack Glance is a declarative UI library designed for building remote user interfaces, including:

  • App widgets (for Android homescreens)
  • Glanceable UIs for wearables (e.g., Tiles)
  • Future-facing vehicle dashboards and clusters in Android Automotive

Think of Glance as the Compose-inspired sibling of RemoteViews, but tailored for rendering quickly, efficiently, and safely on surfaces with strict interaction rules — like a car’s infotainment screen.

Why Use Glance in Android Automotive?

Using Glance in AAOS allows developers to:

  • Create lightweight UIs for media, navigation, or vehicle info
  • Ensure low distraction by adhering to system-level constraints
  • Maintain fast rendering even on constrained hardware
  • Leverage Jetpack Compose-like syntax without full Compose overhead

Key Use Cases in AAOS

Use Case            | Description
Media Cards         | Display now-playing info and basic playback controls
Navigation Previews | Show turn-by-turn summaries or route cards
Vehicle Status      | Fuel, tire pressure, battery charge level
Contextual Alerts   | Door open, low fuel, safety notifications

Setting Up Jetpack Glance in Your Project

Add Required Dependencies

Update your build.gradle with the latest Glance libraries:

Groovy
dependencies {
    implementation "androidx.glance:glance:1.0.0"
    implementation "androidx.glance:glance-appwidget:1.0.0"
    implementation "androidx.glance:glance-wear-tiles:1.0.0" // optional
    implementation "androidx.core:core-ktx:1.12.0"
}

Tip: Glance renders its UI through RemoteViews under the hood, which keeps it compatible across a wide range of Android versions and makes it suitable for most AAOS setups.

Creating a Glanceable Media Widget for AAOS

Let’s walk through a full example where we build a media playback widget that can be shown in a center display or cluster (with OEM support).

Define the Glance Widget

Kotlin
class MediaGlanceWidget : GlanceAppWidget() {
    override suspend fun provideGlance(context: Context, id: GlanceId) {
        provideContent { MediaContent(context) }
    }

    @Composable
    private fun MediaContent(context: Context) {
        val title = "Song Title"
        val artist = "Artist Name"

        // Glance click handlers take an Action; this helper builds one that
        // starts MediaControlService with the given action string
        fun controlAction(action: String) = actionStartService(
            Intent(context, MediaControlService::class.java).setAction(action)
        )

        Column(
            modifier = GlanceModifier
                .fillMaxSize()
                .padding(16.dp)
                .background(Color.DarkGray),
            verticalAlignment = Alignment.CenterVertically,
            horizontalAlignment = Alignment.CenterHorizontally
        ) {
            Text("Now Playing", style = TextStyle(fontWeight = FontWeight.Bold, color = ColorProvider(Color.White)))
            Spacer(GlanceModifier.height(8.dp))
            Text(title, style = TextStyle(color = ColorProvider(Color.White)))
            Text(artist, style = TextStyle(color = ColorProvider(Color.LightGray)))

            Spacer(GlanceModifier.height(16.dp))
            Row(horizontalAlignment = Alignment.CenterHorizontally) {
                Image(
                    provider = ImageProvider(R.drawable.ic_previous),
                    contentDescription = "Previous",
                    modifier = GlanceModifier.size(32.dp).clickable(controlAction("ACTION_PREVIOUS"))
                )
                Spacer(GlanceModifier.width(16.dp))
                Image(
                    provider = ImageProvider(R.drawable.ic_play),
                    contentDescription = "Play",
                    modifier = GlanceModifier.size(32.dp).clickable(controlAction("ACTION_PLAY_PAUSE"))
                )
                Spacer(GlanceModifier.width(16.dp))
                Image(
                    provider = ImageProvider(R.drawable.ic_next),
                    contentDescription = "Next",
                    modifier = GlanceModifier.size(32.dp).clickable(controlAction("ACTION_NEXT"))
                )
            }
        }
    }
}

// The receiver that registers this widget with the system
// (this is the class referenced from AndroidManifest.xml)
class MediaGlanceWidgetReceiver : GlanceAppWidgetReceiver() {
    override val glanceAppWidget: GlanceAppWidget = MediaGlanceWidget()
}

Handling Playback Actions: MediaControlService

Since Glance click handlers dispatch predefined Actions (such as actionStartService) rather than arbitrary Compose lambdas, we use a Service to act on UI interactions.

Kotlin
class MediaControlService : Service() {
    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        when (intent?.action) {
            "ACTION_PLAY_PAUSE" -> togglePlayPause()
            "ACTION_NEXT" -> skipToNext()
            "ACTION_PREVIOUS" -> skipToPrevious()
        }
        return START_NOT_STICKY
    }

    private fun togglePlayPause() {
        // Hook into MediaSession or ExoPlayer
    }

    private fun skipToNext() {
        // Forward playback command
    }

    private fun skipToPrevious() {
        // Rewind playback
    }

    override fun onBind(intent: Intent?): IBinder? = null
}

Integrating with AndroidManifest.xml

To register the widget and service:

XML
<receiver
    android:name=".MediaGlanceWidgetReceiver"
    android:exported="true">
    <intent-filter>
        <action android:name="android.appwidget.action.APPWIDGET_UPDATE" />
    </intent-filter>
    <meta-data
        android:name="android.appwidget.provider"
        android:resource="@xml/media_widget_info" />
</receiver>

<service
    android:name=".MediaControlService"
    android:exported="false" />

Widget Configuration XML

In res/xml/media_widget_info.xml:

XML
<appwidget-provider
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:minWidth="180dp"
    android:minHeight="100dp"
    android:updatePeriodMillis="60000"
    android:widgetCategory="home_screen" />

Best Practices for Automotive Glance UI

  • Keep UI distraction-optimized
  • Use readable font sizes and sufficient contrast
  • Avoid overloading the interface — 2–3 actions max
  • Make controls large and touch-friendly
  • Always test on real AAOS hardware or emulator

Conclusion

Jetpack Glance is quickly becoming a go-to tool for developers looking to build safe, fast, and flexible UI surfaces across Android form factors. In the automotive space, it shines by helping deliver minimalist, glanceable media controls that respect both performance and safety constraints.

As AAOS continues to evolve, expect more OEM support for Glance in clusters, dashboards, and center displays — especially with the push toward custom car launchers and immersive media experiences.

android in automotive industry

The Journey of Android in the Automotive Industry

Android has come a long way since powering our phones. Today, it’s in dashboards, infotainment systems, and even under the hood of cars. But how exactly did Android evolve from a smartphone OS to a critical player in the automotive world?

In this blog post, we’ll explore the fascinating journey of Android in the automotive industry, from its humble beginnings to the modern Android Automotive OS (AAOS). We’ll explain everything in a clear and easy-to-understand way, covering code examples, system architecture, and how this evolution affects developers, car manufacturers, and everyday drivers.

From Mobile OS to Infotainment: The Early Days

When Android was first introduced by Google in 2008, its open-source nature caught the attention of many industries — including automotive.

The Introduction of Android Auto

In 2015, Google officially launched Android Auto — a platform that allowed Android smartphones to project a simplified interface onto the car’s infotainment system. Drivers could use apps like Google Maps, Spotify, and WhatsApp with voice commands and touch input, enhancing safety and usability.

How It Works:
 Android Auto runs on the phone, not the car. The car merely acts as a display and controller.

Kotlin
// Example: Launching a voice command with Google Assistant
val intent = Intent(Intent.ACTION_VOICE_COMMAND)
startActivity(intent)

This architecture meant quick updates and a wide range of compatible vehicles. But it also had limitations — OEMs (Original Equipment Manufacturers) had little control over the UI or deep integration with car hardware.

The Rise of Android Automotive OS (AAOS)

Recognizing the limitations of projection-based systems, Google introduced Android Automotive OS — a full-fledged, car-ready version of Android that runs natively on the vehicle’s hardware.

What Makes Android Automotive OS Special?

  • Embedded OS: No need for a phone. The OS is pre-installed and controls the infotainment system.
  • Deeper Hardware Access: Unlike Android Auto, AAOS can integrate with HVAC, seat controls, vehicle telemetry, and more.
  • Customizable UI: OEMs can customize the look and feel while still leveraging the power of Android.

Architecture of Android Automotive OS

Let’s break down how Android Automotive OS works under the hood.

1. HAL (Hardware Abstraction Layer)

This layer interacts directly with the vehicle’s hardware. OEMs implement Vehicle HALs to expose data like speed, fuel level, and climate control to Android.

2. Vehicle HAL Interface (AIDL-based)

Android Automotive uses AIDL (Android Interface Definition Language) to define communication between system services and vehicle HALs.

AIDL
// Simplified AIDL sketch of a vehicle property accessor
// (the real IVehicle interface in AOSP is considerably richer)
interface IVehicle {
    int getProperty(int propertyId);
}

3. Car Services Layer

These are system services provided by AAOS (like CarSensorManager, CarInfoManager, etc.) that expose car-related data to apps.

Kotlin
val car = Car.createCar(context)
// Note: CarSensorManager is deprecated in recent releases;
// CarPropertyManager is the preferred way to read vehicle properties.
val sensorManager = car.getCarManager(Car.SENSOR_SERVICE) as CarSensorManager
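Car managers like this typically push continuous updates (speed, fuel level) to registered callbacks rather than being polled. The subscribe/notify pattern can be sketched in plain Kotlin; the names below are illustrative and not the real Car API:

```kotlin
// Conceptual sketch of the callback pattern used by
// CarPropertyManager-style services; not the actual Android API.

const val PROP_SPEED = 1 // illustrative property ID

fun interface PropertyCallback {
    fun onChange(propertyId: Int, value: Float)
}

class FakeCarPropertyService {
    private val subscribers = mutableMapOf<Int, MutableList<PropertyCallback>>()

    // Apps register interest in a specific property
    fun registerCallback(propertyId: Int, cb: PropertyCallback) {
        subscribers.getOrPut(propertyId) { mutableListOf() }.add(cb)
    }

    // In a real system this would be driven by the Vehicle HAL
    fun publish(propertyId: Int, value: Float) {
        subscribers[propertyId]?.forEach { it.onChange(propertyId, value) }
    }
}

fun main() {
    val service = FakeCarPropertyService()
    service.registerCallback(PROP_SPEED) { _, v -> println("speed=$v") }
    service.publish(PROP_SPEED, 88.0f)
}
```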

Developer Experience: Building Apps for Android in Cars

With AAOS, developers now build apps that run directly on the car. These can be media, navigation, or communication apps.

App Categories Supported:

  • Media (e.g., Spotify)
  • Messaging (e.g., WhatsApp)
  • Navigation (e.g., Google Maps alternatives)

Sample Media App Setup

Android Automotive media apps are built on the MediaBrowserService framework:

Kotlin
class CarMediaService : MediaBrowserServiceCompat() {
    override fun onGetRoot(
        clientPackageName: String,
        clientUid: Int,
        rootHints: Bundle?
    ): BrowserRoot? {
        return BrowserRoot("root", null)
    }

    override fun onLoadChildren(
        parentId: String,
        result: Result<MutableList<MediaBrowserCompat.MediaItem>>
    ) {
        result.sendResult(mutableListOf()) // Placeholder: return real media items here
    }
}

This setup allows your media app to appear natively within the car’s infotainment system.

OEM Adoption and Industry Impact

More manufacturers are embracing Android in the automotive industry due to its flexibility and Google ecosystem support.

Popular Cars Running AAOS:

  • Volvo XC40 Recharge
  • Polestar 2
  • Renault Mégane E-Tech
  • GM, Honda, Ford, and Stellantis have also announced future integration

OEMs can add their own app stores, integrate voice assistants like Alexa, and modify the interface, all while running on a solid Android foundation.

Privacy, Security & Updates

One of the major concerns with embedded software in cars is security. Google addresses this with:

  • Verified boot & partitioned OS layers
  • Google Play Protect (on supported systems)
  • Monthly security patches (when implemented by OEMs)
  • OTA (Over-the-Air) updates to push bug fixes and new features

What’s in It for You, the Driver?

Faster Access

No waiting for your phone to connect. No dropped Bluetooth. Everything just works.

Built-In Voice Control

“Hey Google, take me to the nearest gas station.” Simple, natural, and hands-free.

Fewer Distractions

Designed with safety in mind, the interface limits visual overload. You only see what you need, when you need it.

Better Personalization

Since AAOS runs directly in the car, it can save preferences across profiles and adapt to whoever’s behind the wheel.

The Future of Android in the Automotive Industry

We’re only scratching the surface of what Android can do for mobility.

Upcoming Trends:

  • Integration with EV battery data
  • Smart assistant for predictive driving
  • Multi-screen support (rear-seat entertainment)
  • Seamless phone-to-car sync with Android 15+

Google is also working on extending AI-powered user experiences using contextual data like location, calendar events, and habits to provide real-time driving recommendations and proactive assistance.

Conclusion

The journey of Android in the automotive industry showcases how adaptable and scalable the Android ecosystem truly is. From phone projection systems to embedded car platforms, it has revolutionized how drivers interact with their vehicles.

For developers, this is a golden era — you can now build apps not just for phones and tablets, but for the road itself. For OEMs, it’s an opportunity to build smarter, more connected vehicles. And for users, it means safer, more personalized, and enjoyable driving experiences.

As Android continues its journey on wheels, the road ahead looks smarter, safer, and more open than ever.

FAQ

Q: What is Android Automotive OS?
 A: Android Automotive OS (AAOS) is an operating system developed by Google that runs directly on a vehicle’s hardware, unlike Android Auto which runs on a smartphone.

Q: How is Android Auto different from Android Automotive OS?
 A: Android Auto is a projection system that mirrors apps from your phone. AAOS is a standalone OS installed in the car, offering deeper integration with vehicle functions.

Q: Can I build apps for Android Automotive?
 A: Yes! You can build navigation, media, and communication apps using standard Android tools and frameworks, with slight modifications for car compliance.

Q: Which cars use Android Automotive OS?
 A: Cars like the Polestar 2, Volvo XC40, and some models by GM, Honda, and Renault run Android Automotive OS.

Glanceable UI

Implementing Glanceable UI with Jetpack Glance in Android Automotive OS

Glanceable UI is a key pillar of in-car user experiences, especially in vehicles powered by Android Automotive OS (AAOS). With safety at the core, designing for “at-a-glance” interaction ensures drivers get the information they need with minimal distraction.

In this blog post, we’ll explore:

  • What Jetpack Glance is
  • How it helps build Glanceable UIs for AAOS
  • A practical example: Media playback glance widget
  • Best practices for in-car glance design

What Is Jetpack Glance?

Jetpack Glance is a lightweight UI toolkit by Google that allows developers to build glanceable app widgets using Jetpack Compose principles.

While it’s widely used for Android home screen widgets, it also plays a growing role in automotive contexts, where modular, safe, and context-aware UI components are essential.

Key Benefits:

  • Declarative UI (Compose-style)
  • Lightweight and fast
  • Surface-aware (homescreen, dashboard, etc.)
  • Seamlessly integrates with AAOS and Assistant

Why Glanceable UI Matters in Cars

An ideal in-car interface (for media, navigation, etc.) should be so intuitive, voice-driven, glanceable, and minimal that drivers can use it with little or no visual attention — keeping their eyes on the road and hands on the wheel.

Android Automotive is designed to operate under strict UX restrictions to reduce cognitive load and visual complexity while driving. Glanceable UI is about showing just enough information for a quick decision or action.

Example Use Cases:

  • Resume last media playback
  • Quick access to a recent contact
  • Show estimated time to destination
  • Weather or fuel level notifications
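Glanceable surfaces budget very little text, so the real work is condensing app state into one short, readable line. A tiny helper like this (purely illustrative, not part of any Android API) shows the kind of condensation involved:

```kotlin
// Illustrative helper: condense media state into one glance-sized line.

data class NowPlaying(val title: String, val artist: String)

// Keep the line within a small character budget, truncating with an
// ellipsis rather than wrapping onto a second line.
fun glanceLine(state: NowPlaying, maxChars: Int = 24): String {
    val line = "${state.title} - ${state.artist}"
    return if (line.length <= maxChars) line
           else line.take(maxChars - 1) + "…"
}

fun main() {
    println(glanceLine(NowPlaying("Bohemian Rhapsody", "Queen")))
}
```

The same budget-and-truncate idea applies to ETA strings, contact names, and notification summaries on glance surfaces.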

Setup: Adding Jetpack Glance to Your AAOS Project

First, make sure to add the required dependencies:

Kotlin
dependencies {
    implementation("androidx.glance:glance:1.1.1")
    implementation("androidx.glance:glance-appwidget:1.1.1")
}

Jetpack Glance is still evolving, so check the AndroidX release notes for the latest versions.

Example: Glanceable Media Playback Widget

Let’s say you want to display a media card with the currently playing song, a thumbnail, and basic controls like play/pause.

Step 1: Create Your Glance Widget

Kotlin
class MediaPlaybackWidget : GlanceAppWidget() {
    override suspend fun provideGlance(context: Context, id: GlanceId) {
        provideContent {
            MediaPlaybackContent()
        }
    }
}

Step 2: Build the UI Using Glance

Kotlin
@Composable
fun MediaPlaybackContent() {
    val mediaState = rememberMediaState() // Hypothetical helper exposing current playback state

    Column(
        modifier = GlanceModifier
            .fillMaxWidth()
            .padding(16.dp)
            .background(ImageProvider(R.drawable.widget_bg))
    ) {
        Text(
            text = mediaState.title,
            style = TextStyle(fontSize = 18.sp, fontWeight = FontWeight.Bold),
            maxLines = 1
        )
        Text(
            text = mediaState.artist,
            style = TextStyle(fontSize = 14.sp, color = ColorProvider(Color.Gray)),
            maxLines = 1
        )

        Row(
            horizontalAlignment = Alignment.CenterHorizontally,
            modifier = GlanceModifier.fillMaxWidth()
        ) {
            Image(
                provider = ImageProvider(R.drawable.ic_prev),
                contentDescription = "Previous",
                modifier = GlanceModifier.clickable { /* skipPrevious() */ }
            )
            Spacer(modifier = GlanceModifier.width(8.dp))
            Image(
                provider = if (mediaState.isPlaying)
                    ImageProvider(R.drawable.ic_pause)
                else
                    ImageProvider(R.drawable.ic_play),
                contentDescription = "Play/Pause",
                modifier = GlanceModifier.clickable { /* togglePlayback() */ }
            )
            Spacer(modifier = GlanceModifier.width(8.dp))
            Image(
                provider = ImageProvider(R.drawable.ic_next),
                contentDescription = "Next",
                modifier = GlanceModifier.clickable { /* skipNext() */ }
            )
        }
    }
}

Pro Tips for Designing Glanceable UI in AAOS

  • Visibility: large touch targets (min 48×48dp), no nested menus
  • Context awareness: surface-aware widgets (only show glance cards when relevant)
  • Minimized screen time: display key actions only (Resume, Pause, Next)
  • Distraction-free UX: avoid animations, complex visuals, or swipe gestures
  • Test in a driving simulator: use the Android Emulator’s automotive mode or in-car dev hardware if possible

UX Restrictions You Must Follow

Google enforces strict glanceable design rules for AAOS:

  • No scrolling UI
  • No long lists
  • No non-essential buttons
  • Only relevant, context-sensitive info
  • Safety is non-negotiable: Design for drivers, not passengers

If your app violates these, it may be rejected or hidden when driving.
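These restrictions can be thought of as a filter over your UI: given the current driving state, only a small whitelist of interaction types remains available. The following is a conceptual sketch of that idea in plain Kotlin; the real Android mechanism is CarUxRestrictionsManager, whose actual API differs:

```kotlin
// Conceptual model of driving-state based UX gating; not the real
// CarUxRestrictionsManager API.

enum class DrivingState { PARKED, MOVING }

enum class UiFeature { SCROLLING_LIST, LONG_TEXT, MEDIA_CONTROLS, KEYBOARD }

// While moving, only glance-safe features remain available.
fun allowedFeatures(state: DrivingState): Set<UiFeature> =
    when (state) {
        DrivingState.PARKED -> UiFeature.values().toSet()
        DrivingState.MOVING -> setOf(UiFeature.MEDIA_CONTROLS)
    }

fun main() {
    println(allowedFeatures(DrivingState.MOVING))
}
```

Designing your screens against such a whitelist from the start is far easier than retrofitting restrictions after a rejection.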

Conclusion

Jetpack Glance enables a new era of modular, composable, and glance-friendly UI for Android Automotive OS. By respecting the driving context and focusing on essential, actionable information, developers can create interfaces that enhance the in-car experience—without compromising safety.

Remember: In the car, less is more. Show less, do more.
