Boagworld: UX, Design Leadership, Marketing & Conversion Optimization


By: Paul Boag and Marcus Lillington

About this listen

Boagworld: The podcast where digital best practice meets a terrible sense of humor! Join us for a relaxed chat about all things digital design. We dish out practical advice and industry insights, all wrapped up in friendly conversation. Whether you're looking to improve your user experience, boost your conversions, or become a better design lead, we've got something for you. With over 400 episodes, we're like the cool grandads of web design podcasts – experienced, slightly inappropriate, but always entertaining. So grab a drink, get comfy, and join us on a journey through the life of a digital professional.

Boagworks Ltd · Economics
Episodes
  • Why UX Teams Need a Maturity Audit Right Now
    Apr 23 2026
    UX is under pressure. A proactive maturity audit gives you a voice before leadership makes decisions about your team without you.

    Something uncomfortable is happening in organizations right now. UX teams are being quietly reassessed. AI has disrupted the field, leadership expectations have gone unmet, and there's a growing sense that UX hasn't delivered what it promised. The conversations are happening, but often not with the people who actually do UX work. If you're in a UX role, decisions about your team's future might be forming in rooms you're not in.

    That's the situation I've been thinking about lately, and it's why I want to talk about UX maturity audits. Not as a defensive measure or a tick-box exercise, but as a genuinely useful tool for getting ahead of a conversation that's already underway.

    The expectation gap is real

    A lot of the cynicism toward UX right now traces back to one thing: overselling. Leadership was told UX would deliver a hundredfold return on every dollar spent. That figure gets thrown around a lot, and someone took it seriously enough to hire one UX person and wait for the magic to happen. It didn't.

    That disappointment is partly our industry's fault, though it's not something we often admit openly. We've marketed UX with promises that assume a level of organizational change nobody warned leadership they'd have to make. Hiring one person doesn't transform an organization into a user-centric one. It never did. There's a certain naivety in the idea that a single hire will magically produce amazing experiences without an understanding of the breadth of change required for an organization to become truly user-focused. But plenty of people implied it would. The result is a leadership team that feels, not unreasonably, like it was sold something that didn't arrive.

    Why waiting is a bad idea

    The natural response to this situation is to keep your head down and hope things settle. Understandable, but a mistake. If leadership is already souring on UX, the absence of any structured conversation about what UX is actually delivering gives that skepticism room to grow unchallenged. Decisions start getting made. Quietly, and without much input from the people who understand what's actually happening.

    A proactive UX maturity audit changes that dynamic. Instead of waiting to be judged, you're shaping the conversation. You're the one bringing evidence, framing the questions, and defining what success looks like. That's a considerably better position to be in.

    And it's not just damage control. Even mature, well-functioning UX teams benefit from this kind of review. There's always a next stage. Whether it's wider adoption, better integration with product teams, or moving toward something more democratized, an audit helps you see where you are and decide where to go.

    What a solid audit covers

    A UX maturity audit should cover five areas. Not exhaustively, but enough to give you a real picture.

    • Strategy and leadership. Does UX have a seat at the table? Is there genuine sponsorship from someone with budget and influence, or is UX being practiced in a corner while real decisions happen elsewhere?
    • Culture and capability. How widely does the organization understand what UX actually involves? Are there training pathways and career development? Or is it just a job title a few people happen to have?
    • Research and design processes. Is UX practice consistent, or does it depend entirely on who's available? Are designers and researchers involved early, or called in after the big decisions are already made?
    • Outcomes and measurement. Can the team point to specific improvements in user outcomes? Are there agreed definitions of what success looks like, and is anyone actually tracking it?
    • Cross-functional integration. Is UX embedded across teams, or sitting in its own silo waiting for people to come to it?

    None of these are particularly complicated questions. The hard part is being honest about the answers.

    The difference between a real audit and a survey

    An audit that just collects opinions tells you what people think, which is interesting but not necessarily accurate. A good audit looks for evidence. That means checking whether research plans actually exist. Whether findings get used or disappear into a folder. Whether design systems are maintained or quietly falling apart. Whether the team can point to specific recent changes that improved user outcomes rather than just shipped features.

    But the more revealing question is often why these things aren't happening, because the answer usually points straight to the organizational problems that stop UX from gaining traction in the first place. A missing research plan isn't just an admin gap. It's often a signal that no one with authority has made space for it, or that the team has learned it wouldn't be taken seriously anyway.

    The questions worth asking aren't simply "how good is our UX?" They're "how well is UX supported here? How consistently is it practiced? What would ...
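    The five audit areas can be tracked with something as simple as a scoring sheet. A minimal sketch of that idea in Python, assuming a 1–5 scale (the dimension names follow the episode; the scale and the `summarize_audit` helper are hypothetical illustrations, not part of any published method):

    ```python
    # Hypothetical sketch: score each audit dimension 1-5 based on
    # evidence rather than opinion, then surface the weakest area.
    # The 1-5 scale and this helper are assumptions for illustration.
    AUDIT_DIMENSIONS = [
        "Strategy and leadership",
        "Culture and capability",
        "Research and design processes",
        "Outcomes and measurement",
        "Cross-functional integration",
    ]

    def summarize_audit(scores: dict[str, int]) -> str:
        """Return the weakest dimension so the team knows where to focus."""
        missing = [d for d in AUDIT_DIMENSIONS if d not in scores]
        if missing:
            raise ValueError(f"Unscored dimensions: {missing}")
        weakest = min(AUDIT_DIMENSIONS, key=lambda d: scores[d])
        return f"Lowest maturity: {weakest} ({scores[weakest]}/5)"
    ```

    The point is less the tooling than the discipline: every score should be backed by evidence of the kind the episode describes, not a gut feeling.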
    6 mins
  • AI Is Showing UI Designers the Door
    Apr 21 2026
    So this month Marcus and I get into a slightly uncomfortable question. If AI can knock out decent interfaces from a text prompt, where does that leave the people whose day job is opening Figma and making screens look nice? We start with Google Stitch, which has been getting a lot of attention lately. Then we zoom out into something I have become mildly obsessed with, which is building AI skills. Not prompt snippets, but reusable, documented processes that let you get consistent work out of AI without drowning it in context.

    App of the Month

    This month's tool is Google Stitch (v2), Google's AI UI generator. You describe what you want, it produces an interface, and you can do some light manual tweaking. It is not a full replacement for Figma. The editing controls are basic. The bigger story is what it represents. We are now at the point where a decent, usable UI can be generated fast enough that the real value shifts from "can you draw the screens" to "can you judge what good looks like". That is where experience, and yes, taste, starts to matter. If you want to compare approaches, I mentioned Figr again, which I still prefer for the quality of what it produces.

    Are UI Designers Becoming Vinyl?

    The question Stitch raises is not "can AI design interfaces". It clearly can. The question is what happens to the job market when "good enough" becomes cheap, fast, and widely available. I found myself telling two different clients recently that they could probably skip hiring a UI designer. They had tight budgets, tight timelines, and already had solid brand guidelines or a design system. In those situations, I could push the work through AI, iterate on it a bit, and get something perfectly serviceable.

    That line of advice made me feel a bit grubby. Not because it was wrong for those clients, but because it hints at a bigger shift. My worry is that UI design becomes like vinyl records. Most people will not need it. A small number will care deeply and pay for it. The middle ground shrinks.

    Marcus made the important caveat here. Some designers will still be in demand because they bring something AI cannot easily fake. A distinctive visual style. Creative judgment. Brand thinking. The ability to make something feel like it came from a real point of view, not a model averaging the internet.

    We also talked about where UI designers can expand their value, because "I make pretty screens" is not a great long-term career plan.

    • Broaden into UX and problem solving. Look past the interface and into the business problem, user needs, and research.
    • Own the stuff between screens. AI still tends to think screen by screen. Humans are better at flows, journeys, and the messy reality of how people actually get from A to B.
    • Lean into information architecture. For websites especially, the structure and content model matter as much as the visual design.

    We used a music analogy that will probably annoy some people, which makes it perfect. AI tools can generate "background" output that is fine for low-stakes use. They will not replace great musicians. But they will reduce the number of gigs available.

    AI Skills As a Career Asset

    After we finished terrifying UI designers, we moved on to something more useful. I think a lot of roles are going to need an AI toolkit. Not a handful of clever prompts, but a proper library of reusable skills. When I say "AI skills," I mean documented processes that an AI can follow reliably. Think SOPs you can run repeatedly, not prompt snippets you copy and paste. I now have around 60 skills in my library, and it is growing constantly. Outside of the Boagworld website, it might be the most valuable business asset I have.

    The reason is consistency and context management. AI can produce terrible output when you dump too much information on it at once. Skills let you break work into focused chunks and chain them. We talked about three levels of skills:

    • Company-level skills. Standard processes that keep things consistent. Proposals. Expense claims. Holiday booking. The sort of stuff that should not depend on one person remembering every step.
    • Team or discipline skills. For example, UX teams can create skills for personas, journey mapping, surveys, and top task analysis. That helps remove bottlenecks and lets colleagues do decent work without reinventing the wheel.
    • Individual skills. This is where it gets interesting for your career. These are the skills that capture how you do something, including all the weird little bits you have learned over the years.

    A key point here is that the value is not only in having the skill. It is in creating it. Writing down a process forces you to surface assumptions and explain what "good" looks like.

    We also got into AI agents. If you describe your skills well, an agent can chain them to complete bigger jobs. I gave a sales example where a meeting transcript can be turned into a CRM entry, follow-up tasks, company research, and a draft proposal with very little manual effort. That is ...
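    The idea of skills as documented, chainable processes can be sketched in code. A hypothetical illustration, not Paul's actual setup: each skill pairs a name with written instructions, and chaining feeds one skill's output into the next so the model only ever sees one focused chunk of context (the `Skill` class and `chain` helper are invented for this sketch; in practice `ai` would be a call to an LLM API).

    ```python
    # Hypothetical sketch of a "skill" library: each skill is a documented
    # process (an SOP) an AI follows, and skills chain so the model handles
    # one focused chunk of context at a time instead of everything at once.
    from dataclasses import dataclass

    @dataclass
    class Skill:
        name: str
        instructions: str  # the documented process the AI follows

        def run(self, ai, context: str) -> str:
            # `ai` is any callable taking a prompt and returning text
            # (in a real setup, a call to an LLM API).
            return ai(f"{self.instructions}\n\nInput:\n{context}")

    def chain(skills: list[Skill], ai, initial_input: str) -> str:
        """Feed each skill's output into the next, keeping context focused."""
        result = initial_input
        for skill in skills:
            result = skill.run(ai, result)
        return result
    ```

    The sales example from the episode would be a chain of this shape: transcript in, then skills for the CRM entry, follow-up tasks, company research, and the draft proposal, each working only from the previous step's output.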
    53 mins
  • Website Rebuilds, AI Tools, and UX in 2026
    Mar 17 2026
    This month, Paul and Marcus get into a tool that has made Paul cancel his Figma subscription, walk through how Paul has completely changed the way he approaches website rebuilds thanks to AI, and round things off with the latest thinking from Nielsen Norman Group on where UX is heading in 2026.

    App of the Week: figr.design

    Paul has been road-testing AI design tools as part of a workshop he ran on AI and UI, and after going through dozens of them, one stood out: figr.design. What makes it work where others fall short? A few things.

    It lets you feed in a significant amount of context upfront, things like style guides, design systems, and personas, which means the output is far more tailored than the generic average you often get from AI design tools. Iteration is also genuinely fast. You can queue up a whole list of changes and it processes them all in one go, rather than making you wait between each tweak. The prototypes it produces are more realistic than what you would typically get out of Figma: text fields you can actually type in, accordion states that open and close, button states, fully responsive layouts. Not exactly revolutionary in theory, but refreshingly functional in practice. Export to Figma is available when you need it. The main limitation is that you cannot manually adjust elements yourself. Everything goes through the conversational interface.

    Paul has also been looking at a tool called Inspector, which runs locally and connects to the Claude API, so you pay as you go rather than for a flat monthly token allocation. It has been a bit fiddly to set up, but it is worth keeping an eye on.

    For anyone regularly using Figma for wireframing and prototyping, it is worth giving figr.design a proper look. The shift Paul describes, from hunching over Figma to leaning back and having a conversation with the tool, is a fairly good summary of where this kind of work is heading.

    Rebuilding a Website in 2026

    Paul has fundamentally changed how he approaches website rebuilds, and the shift is largely down to AI making a genuinely hard problem, getting good content onto a website, a lot easier.

    The old problem

    Website rebuilds have traditionally meant migrating existing content into a new design. Which sounds fine until you remember that most of that content was written by subject matter experts who know their field but have never thought about writing for the web. The result is pages that lecture rather than help, that bury the things users actually want to know, and that rarely arrive on time, because the content phase is almost always where projects stall.

    Why things are different now

    AI has changed three things meaningfully.

    • First, generating content is no longer the enormous manual effort it used to be.
    • Second, doing the research that informs good content, finding out what users actually ask, worry about, and need, is much simpler with tools like Perplexity.
    • Third, AI-powered search engines are pushing toward a more question-oriented approach to content anyway, which makes getting this right more important than it used to be.

    How Paul works now

    Here is the process Paul walks through for a rebuild project.

    1. Online research. Using Perplexity, Paul researches the audience. For a well-known client, he'll ask specifically about them. For a smaller or niche client, he looks at the sector. He is looking for the questions people are asking, the tasks they are trying to complete, their objections, goals, and pain points. This takes about 10 minutes.
    2. Personas. The research output goes into AI, which identifies patterns and segments it into a set of personas. A couple of hours of back and forth to get these right.
    3. Company overview. Paul records his kickoff meeting with the client and points AI at the transcript. Out comes a clean summary of what the company does, its products and services, and how it talks about itself. An hour for the meeting, plus 10 minutes for the summary.
    4. Top task analysis and information architecture. If time and budget allow, Paul runs a formal top task analysis, collecting and prioritizing the questions users most want answered. For card sorting, he uses UX Metrics. If there is no time for that, AI brainstorms the top tasks from the personas and company overview. Either way, those tasks get fed into an AI-generated information architecture.
    5. Building out the IA. Paul builds the IA in the CMS or in Notion, assigning the relevant tasks and questions to each page. Stakeholders can see the structure and understand what each page is there to do before a word of copy is written.
    6. Getting stakeholders to contribute. Rather than asking stakeholders to write content (a recipe for delays), Paul asks them to do two simpler things for each page: bullet-point answers to the questions assigned to that page, and any other talking points they want included. Bullets only. No pressure to write.
    7. Writing the content with AI. This is where it all comes together. Paul sets up an AI project with four ...
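    The "assign tasks and questions to each page" step amounts to a simple content model. A hypothetical sketch, assuming a plain dictionary rather than Notion or a CMS (the page names, questions, and `unanswered_pages` helper are invented for illustration):

    ```python
    # Hypothetical sketch: an information architecture where each page
    # carries the user questions it must answer, so stakeholders can see
    # what a page is for before a word of copy is written.
    site_ia = {
        "Services": {
            "questions": [
                "What exactly do you offer?",
                "How much does it cost?",
            ],
            "talking_points": [],  # bullet answers collected from stakeholders
        },
        "About": {
            "questions": ["Who are you and why should I trust you?"],
            "talking_points": [],
        },
    }

    def unanswered_pages(ia: dict) -> list[str]:
        """Pages still waiting on stakeholder bullet points."""
        return [page for page, spec in ia.items()
                if spec["questions"] and not spec["talking_points"]]
    ```

    A structure like this makes the stakeholder step concrete: their job is only to fill in `talking_points` with bullets, never to write finished copy.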
    1 hr