I’m Chris Deaton
a product manager, innovation professor, and conscientious builder
I work at the intersection of AI education, accountability, ethics, and narrative design.
I design and build responsible, bias-declared AI assistants, operational innovation frameworks, and fictional story worlds that help people navigate complexity without losing their values.
Innovation Professor is where the tools and processes I am developing through the Responsible Innovation Lab (RIL) frameworks get tested in public on YouTube, with students, partners, and continual pilots, then turned into real tools, courses, and systems. Here, I model and develop the responsible AI chatbots used by the lab, along with others I rely on for my own productivity and creator ventures.
Currently, I am helping develop the next phase of the RIL, learning how to be a content creator myself, and working on the next installments in both the Noir Nexus and Tales of Grump book series.
A complete, but probably already inaccurate, list is on my About Chris Deaton page. Innovation moves pretty fast around here.

Not sure where to start? Start with intent.
Innovation Professor is an operating system for responsible innovation — not a blog, not a pitch deck. Choose the path that matches why you’re here.
🧭 I want to partner or build something
You’re exploring collaboration, pilots, or applied work, and you care about impact, not hype.
🎓 I want to learn or teach
You’re here for AI literacy, systems thinking, or applied education — for yourself, students, or an institution.
📚 I want to explore your applied projects
You’re curious about how fiction, narrative, and simulation are used to pressure-test real systems and ethics. After much experimentation, I have found that YouTube gives me the best umbrella and platform to do the work I want to do and test the things we need to test.
If someone sent you this site, they likely thought one of these paths fit you.
Trust that instinct.
Responsible Innovation Lab
Frameworks, pilots, partnerships, and real-world deployment. This is where ideas are tested, refined, and shipped with students and collaborators.
Give Me Back My Bias
My commercial startup: persona-driven, bias-declared AI assistants built for people who want honesty, boundaries, and usefulness over fake neutrality.
Fictional Story Worlds
Noir Nexus, Tales of Grump, SolHaven — and the narrative infrastructure I use to explore power, belief, ethics, and systems through story instead of abstraction.
Chris Deaton
The builder behind these ventures — product manager, innovation professor, and designer of responsible, bias-declared AI assistants.
Why I built Innovation Professor
Most people don’t need more tech hype. They need help seeing what’s real, what’s risky, and what’s worth building. This site is where I do that work — openly, in plain language, and with declared values instead of fake neutrality. Transparency is important. The future is for everyone, and everyone should be at the table.
Responsible AI Assistant Design
At the Responsible Innovation Lab and Give Me Back My Bias, Responsible AI assistant design begins with one principle: honesty beats neutrality. Every assistant declares its values, boundaries, and purpose up front — no hidden agendas, no over-friendly tone, no pretending to be “objective.” These assistants are built with clear roles, declared assumptions, and explicit limits, so they strengthen human judgment instead of replacing it.
Catch up on some of my work
- 🎅 Tales of Grump: Santa’s Shutdown: Grumpmas – Book 1 – start here where my AI Assistant, Grumpy Old Man, shines
- 🪖 Grumpendence Day: Introducing Law and Order – Book 2 – the sequel
- 🕵️‍♂️ NoirNexus: The Shard and the Shadow: A NeuNexus Files Mystery – meet the AI assistants Kade and Riley in their natural habitat
- ♾️ PolyINNOVATE – the INNOVATE Framework applied to relationships and personal growth
Frequently Asked Questions
What does “Innovation Professor” actually mean?
It means I sit at the intersection of product management, AI education, and narrative design. My work focuses on translating complex systems into practical frameworks, tools, and stories people can actually use.
What makes your responsible assistant design different?
Most assistants hide their assumptions. Mine declare them. Every assistant I build comes with stated values, boundaries, and learning goals, which makes them more trustworthy and more useful in real-world contexts.
How do students and graduates work with you?
Through the Responsible Innovation Lab, I help students and early-career innovators take ideas from concept to deployable products. We run pilots, build prototypes, test in public, and ship tools that create real impact.
Do you work with organizations outside the lab?
Yes. I partner with nonprofits, educators, founders, and public agencies on AI literacy, workforce development, ethical technology, and innovation strategy — without chasing the hype cycle.