From Forecasting to Foresight: Redesigning AI for Human Relevance
Key Learnings
- Diagnose the Forecasting Fallacy: Uncover why AI fails when driven only by past and present data. Learn to integrate backcasting with forecasting to design systems that align with preferable futures.
- Design for User Agency: Use Fogg’s Model within the BCD framework to design AI that empowers users—aligning motivation, ability, and timing so people act with control, not react to predictions.
- Personalise & Contextualise Interventions: Apply the Transtheoretical Model and Nudge Theory to tailor AI experiences to a user's readiness for change, ensuring timely, ethical interventions that respect intent.
Speakers
Speaker: Joana Cerejo
Profession: Experience Designer
Workplace: Hexagon
Description
Despite billions invested, 80% of AI solutions fail to deliver promised value or achieve long-term adoption, and 95% of generative AI pilot projects fail to generate measurable value. This catastrophe signals a fundamental design failure, not a technical one: we are creating systems that are "technologically brilliant... but, humanly irrelevant."

This workshop moves beyond surface-level design and tackles the systemic flaw: the industry's reliance on forecasting. Forecasting uses the past to predict the future, excelling at optimising what is probable but failing to perceive what is preferable. The result is systems that are "statistically correct, but humanly wrong", automating the user's existing dysfunction.

Participants will learn how to integrate foresight and the powerful methodology of backcasting. Backcasting turns possibilities into plans by starting from a desirable future state (habits, priorities, or skills) and working backward to define today's design decisions.

Building on my doctoral research, I'll introduce behaviour-change frameworks built on backcasting. These models show that designing AI isn't just about creating functionality; it's about projecting human change. Attendees will learn how to shift AI from being merely accurate to being relevant: systems that understand human change, not just predict behaviour.