# Speaker 1: Mike Wolfson
Lead Android Engineer for Able AI LLC
Mike is a product-focused developer working out of Phoenix. He has been working in the software field for more than 20 years, and with Android since its introduction. He is a Google Developer Expert in Android and the author of the book "Android Developer Tools Essentials," published by O'Reilly. He currently works as a Lead Android Engineer at Able AI LLC.
He has spoken about software development at a variety of conferences and user groups (including Google I/O, Oredev, OSCON, GDG Silicon Valley, Droidcon NYC & Turin, AnDevCon, and others). When he is not geeking out about phones, he enjoys the outdoors (snowboarding, hiking, scuba diving), collecting PEZ dispensers, and chasing his young (but quick) daughter.
## UI of AI
From punch cards to touchscreens, the evolution of UI is a story of constant adaptation to bridge the gap between our intentions and the machine's capabilities. This talk will draw parallels between the pivotal shift to mobile-first, touch-based interfaces that revolutionized Android development and the impending transformation driven by AI-powered UIs, aiming to prepare Android developers and designers for this crucial next step. I'll discuss conversational interfaces, explaining how natural language interactions are becoming integral to the user experience. I will also examine adaptive UIs, where AI enables interfaces to adjust dynamically, creating truly personalized and efficient interactions. These AI-driven paradigms are not just trends; they will fundamentally reshape how we design and interact with digital products. Attendees will leave this session understanding that, just as the transition to mobile demanded new ways of thinking about layout, interaction, and user context, the integration of AI will require a similar adaptation. This presentation will help Android developers and designers prepare for the challenges and opportunities presented by AI, so they can build the intuitive, intelligent, and effective user interfaces of tomorrow.
# Speaker 2: Tunji Dahunsi
Software Engineer
Tj is a software engineer on the media foundations team at Airbnb. He previously spent three years researching architecture and testing for Android apps built with Jetpack Compose on the Android Developer Relations team at Google.
## UI Layer Architecture for Adaptive Apps with Compose Multiplatform
Innovation, inspiration, and economies of scale are increasingly pushing apps to become more adaptive, and Jetpack Compose's declarative API makes it a perfect fit for the dynamic requirements of building adaptive apps for multiple form factors and platforms. How, then, does one go about architecting and building the UI layer for adaptive Compose Multiplatform apps without compromising the developer or user experience on any form factor or platform?

This talk presents a case study of a real-world, production-scale social media Compose Multiplatform app (an open-source Bluesky client) targeting phones, tablets, and desktops, and the architecture principles used to build its UI layer. Specifically, it covers:

Business logic in the UI layer:
* State production for adaptive apps
* How is adaptive conditional logic handled?
* Navigation for adaptive apps
* What is an adaptive pane?
* What can change within a pane?
* How do panes change with navigation?

UI logic in the UI layer:
* How are persistent UI elements like app bars and FABs managed across device configurations and platforms?
* Which panes own navigation UI elements like the nav rail?
* How do shared element transitions work in adaptive apps?
* How are resource-intensive components, like video players, shared?
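As a taste of the "adaptive conditional logic" the talk covers, here is a minimal, hedged Kotlin sketch (not code from the app itself): it maps a window width to a window size class using the Material 3 breakpoints (compact < 600dp, medium < 840dp, expanded otherwise), then derives a pane arrangement from that class. The names `WindowWidthClass`, `PaneArrangement`, and `arrangementFor` are illustrative assumptions, not APIs from Compose or the client discussed.

```kotlin
// Illustrative sketch of adaptive conditional logic in the UI layer.
// Breakpoints follow Material 3 window size class guidance; the types
// and function names here are hypothetical, for explanation only.

enum class WindowWidthClass { COMPACT, MEDIUM, EXPANDED }

enum class PaneArrangement { SINGLE_PANE, LIST_DETAIL, THREE_PANE }

// Classify a window width (in dp) into a size class.
fun widthClassOf(widthDp: Int): WindowWidthClass = when {
    widthDp < 600 -> WindowWidthClass.COMPACT   // typical phone
    widthDp < 840 -> WindowWidthClass.MEDIUM    // small tablet / foldable
    else -> WindowWidthClass.EXPANDED           // large tablet / desktop
}

// Decide how many panes to show from the size class, so composables
// branch on one derived value instead of raw pixel checks everywhere.
fun arrangementFor(widthClass: WindowWidthClass): PaneArrangement =
    when (widthClass) {
        WindowWidthClass.COMPACT -> PaneArrangement.SINGLE_PANE
        WindowWidthClass.MEDIUM -> PaneArrangement.LIST_DETAIL
        WindowWidthClass.EXPANDED -> PaneArrangement.THREE_PANE
    }

fun main() {
    println(arrangementFor(widthClassOf(411)))   // phone-width window
    println(arrangementFor(widthClassOf(800)))   // tablet-width window
    println(arrangementFor(widthClassOf(1280)))  // desktop-width window
}
```

Centralizing the width-to-arrangement decision like this is one way the UI layer can keep adaptive branching out of individual composables; each pane then only needs to know its own arrangement, not the device it runs on.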