Mike Aron
8 min read · Builds

Launching StoryCraftr: What 1,000 Parents Taught Me About AI Products

I built a bedtime story app so I could put my kids in their own stories. Then 1,000+ other parents showed me what hyper-personalization actually means — and why generic AI products are going to feel more and more hollow.

I have a six-year-old, a four-year-old, and a ten-month-old.

Reading to them at night is one of the best parts of my day. But at some point during a stretch of bedtimes — kids piled on me, book in hand — I noticed something that wouldn't leave me alone.

I had almost no control over the content.

Whatever the book said, that's what I was reading. The words, the scenes, the values baked into the story — none of it was mine. Some of it was fine. Some of it was vaguely off. A lot of it was just… generic. And my kids weren't in any of it. They were listening to stories written for no one in particular, about no one in particular.

So I had the thought a lot of builders have at some point: I could just build this.

That's how StoryCraftr started.

The solo prototype

The first version was for me.

I wanted a story that was tuned for how I actually read to my kids. So I started with the constraints. A five-minute bedtime story means a specific word count. I worked out the word count per page. Then the number of pages. Then one image per page. Then the tone. Then the structure — a beginning that hooks, a middle that teaches, an end that lands soft so they can fall asleep.
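The constraint math above can be sketched in a few lines. This is a minimal illustration, not the actual prototype — the reading pace and page-size constants are assumptions I've picked for the example:

```python
# Derive a story spec from a target read-aloud time.
# Both constants are illustrative assumptions, not StoryCraftr's real values.

READ_ALOUD_WPM = 130   # assumed pace when reading aloud to a child
WORDS_PER_PAGE = 65    # assumed words that fit comfortably on a picture-book page

def story_spec(minutes: float) -> dict:
    """Turn a target read time into word, page, and image counts."""
    total_words = round(minutes * READ_ALOUD_WPM)
    pages = max(1, round(total_words / WORDS_PER_PAGE))
    return {
        "total_words": total_words,
        "pages": pages,
        "words_per_page": total_words // pages,
        "images": pages,  # one illustration per page
    }

spec = story_spec(5)  # a five-minute bedtime story
```

Everything downstream — prompts, page breaks, image count — hangs off a spec like this.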

I wired up the generation pipeline. Text engine writing the story. Image engine generating a visual for each page. My kids' names, my kids' traits, the places we actually go, and — importantly to me — the values I wanted to pass down. I was raised with Christian values and I wanted stories that reinforced that for my kids. Not in a preachy way. In the way a good bedtime story reinforces anything: through a character doing a thing, a small lesson learned, a kind moment at the right beat.
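The shape of that pipeline is simple enough to sketch. Everything here is hypothetical — the `Child` dataclass, the function names, and the stub bodies all stand in for real model calls and the real personalization data:

```python
# A sketch of the prototype's shape: a text model writes page by page,
# an image model illustrates each page. The stubs below stand in for
# real API calls; names and fields are illustrative.
from dataclasses import dataclass, field

@dataclass
class Child:
    name: str
    age: int
    traits: list = field(default_factory=list)
    favorite_places: list = field(default_factory=list)

def write_page(child: Child, lesson: str, page: int, pages: int) -> str:
    # Stand-in for a text-model call with a structured, personalized prompt.
    return f"Page {page}/{pages}: {child.name} learns about {lesson}."

def illustrate(page_text: str) -> str:
    # Stand-in for an image-model call; returns an image path or URL.
    return f"image_for({page_text[:20]}...)"

def generate_story(child: Child, lesson: str, pages: int) -> list:
    story = []
    for p in range(1, pages + 1):
        text = write_page(child, lesson, p, pages)
        story.append({"text": text, "image": illustrate(text)})
    return story
```

The point isn't the stubs; it's that the kid's name, traits, and the week's lesson flow into every page, not just a find-and-replace on a template.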

The prototype was ugly. It wasn't a product. It was a Python pipeline that spat out a personalized story I could read aloud. But it worked. And the first time I sat on the edge of my kid's bed and read him a story that had him in it — about a thing he cared about, with the small lesson I was trying to teach that week — I knew I was onto something.

Meet Ryan

I called my friend Ryan.

Ryan's a developer. A good one. And he has kids. I walked him through what I'd built and what I wanted to do with it next — turn my one-off pipeline into something other families could use. User accounts. A library of every story a kid had ever had read to them. Parent controls. A real product, not just my personal toy.

Ryan was in.

Our split made sense quickly. I drove product and the AI side — the story engine, the image pipeline, the personalization layer, the tone and structure decisions. Ryan drove production engineering — the infrastructure to let thousands of parents have their own libraries without the whole thing falling over. We built StoryCraftr together.

When we launched

My kids loved the stories. Ryan's kids loved them. Then we launched — and other parents' kids loved them too.

Over time it grew. 1,000+ parents. 2,000+ stories written. 10,000+ AI-generated illustrations. Every one of those is a real family, a real bedtime, a real kid hearing a story tuned for them.

Not everything worked out of the gate. The text was great — the stories were genuinely good, and parents told us they were. But the image models at the time weren't there yet. The thing that drove us crazy was character consistency. Same character on page one vs. page four would come out looking slightly different. Sometimes a lot different. Occasionally an image would land with weird artifacts and we'd have to regenerate.

The stories held up. The pictures held up about 80% of the time. That 20% was a constant fight.
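The fight mostly looked like a regenerate-and-check loop. Here's a hedged sketch of that shape — the quality check is a stand-in (ours mixed automated checks and eyeballing), and the pass rate is wired in just to echo the 80% figure:

```python
# Sketch of a regenerate-on-artifact loop: generate an illustration,
# run a (hypothetical) quality check, retry a few times before giving up.
# Both inner functions are stand-ins for real model calls and review.

def generate_illustration(prompt: str, seed: int) -> dict:
    # Stand-in for an image-model call.
    return {"prompt": prompt, "seed": seed}

def passes_quality_check(image: dict) -> bool:
    # Stand-in for artifact/character-consistency checks.
    return image["seed"] % 5 != 0  # illustrative ~80% pass rate

def illustrate_with_retries(prompt: str, max_tries: int = 3) -> dict:
    for attempt in range(max_tries):
        image = generate_illustration(prompt, seed=attempt)
        if passes_quality_check(image):
            return image
    raise RuntimeError("could not produce a clean illustration")
```

Retries fix artifacts; they don't fix character drift across pages, which is why that one stayed a fight.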

What 1,000 parents actually taught me

Here's the real insight. The thing I didn't expect going in, and the thing I now think about on every AI product I touch.

People want hyper-personalization. A lot more than builders assume they do.

When I say hyper-personalization I don't mean "type your kid's name into a box and we paste it into a generic story." That's the baseline version. The baseline version is table stakes and, honestly, it's been around forever.

What parents wanted went way deeper. They wanted stories in the family's voice. They wanted the values they were trying to teach to show up organically in the plot. They wanted reading level matched to the kid, tone matched to the hour, length matched to their bedtime routine. Some parents wanted stories about specific places they'd been. Some wanted characters based on real people. Some wanted a story that would help their kid process a hard thing that had happened that week.

Every one of those is another dimension of what "personal" actually means. And once you've given a parent a story that hits all of those, a generic kids' book feels flat. It feels like somebody talking about kids in the abstract instead of talking to their kid.

That's the shift I think most AI products haven't caught up to yet. Generic AI is going to feel more and more hollow as hyper-personalized versions become the norm. This isn't specific to bedtime stories. It's going to happen in every category where the user is a real human with a real life, not a persona in a pitch deck.

The other thing I learned: parents pay for this. They come back for more. They tell other parents. The demand curve for hyper-personalization is steeper than most builders think, and I don't see it flattening any time soon.

The thing you can't move fast on

Here's the counterweight.

Building AI products for kids comes with a bar you don't fully understand until you're inside it. Every piece of content that gets generated is going to be read by a child. There is no "ship fast and iterate" version of that. A weird line in an adult app is an awkward UX bug. A weird line in a kid's bedtime story is a parent who never trusts you again, and probably shouldn't.

That's part of why I've deprioritized StoryCraftr a bit compared to some of my other projects. It's still live. Parents still use it. Families still come back. But it's not where my 2am nights go right now, and the honest reason is that the stakes when kids are your users are higher than the stakes when adults are — and the kind of aggressive velocity that works for the other stuff doesn't feel right here.

I'd rather let it grow carefully than try to force it.

The bigger lesson: composability

The thing StoryCraftr taught me that I now use everywhere is composability.

Modern AI products aren't one big model doing one big thing. They're compositions. A story engine. An image engine. A safety layer. A personalization layer. A state system that remembers what story a kid heard last week so this week's isn't repetitive. Each of those is its own component, with its own failure modes, and the actual craft of building a good AI product is in how you wire those components together.
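That wiring can be shown in miniature. This is an illustration of the composition idea, not StoryCraftr's actual architecture — each stage is an independent callable with its own failure mode, joined by a thin orchestrator:

```python
# A minimal composition pattern: independent stages share a context dict,
# and a thin orchestrator wires them in order. Stage names are illustrative.

def compose(*stages):
    """Run stages in order, threading a shared context through each."""
    def pipeline(ctx):
        for stage in stages:
            ctx = stage(ctx)
        return ctx
    return pipeline

def personalize(ctx):
    ctx["prompt"] = f"A story for {ctx['child']} about {ctx['theme']}"
    return ctx

def write_text(ctx):
    # A real system would call a text model here.
    ctx["story"] = f"Once upon a time, {ctx['child']} ..."
    return ctx

def safety_check(ctx):
    banned = {"scary", "violent"}
    if any(word in ctx["story"].lower() for word in banned):
        raise ValueError("story failed safety review")  # trigger regeneration
    return ctx

bedtime = compose(personalize, write_text, safety_check)
result = bedtime({"child": "Ava", "theme": "sharing"})
```

Each stage can be swapped, tested, and made to fail independently — which is exactly what you need when one of the stages is a safety layer standing between a model and a child.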

I didn't fully internalize that until I'd spent a year making a text model, an image model, and a user-preference layer cooperate on a bedtime story.

That same pattern shows up everywhere in what I've built since. Alfred is 15 specialized agents working together — each one good at one thing, orchestrated by a layer that knows when to call which. AutoRev uses specialized models for different parts of the build journey — research, fitment, compatibility, pricing — because no single model is best at all of them. SirHENRY follows the same shape.

None of those architectures would be in my head if I hadn't spent those first months wiring a bedtime story.

What's next for it

StoryCraftr is still live at storycraftr.ai. My kids still read its stories. Other families do too. The image quality is dramatically better now than it was at launch — what we couldn't crack then, the current generation of models handles without flinching.

Someday I'll probably invest another stretch of 2am nights into making it what it could be with today's tools. The foundation is there. The audience is there. The only thing missing is the time I'd have to give it.

But for now, it already taught me the lesson it was supposed to teach.

Hyper-personalized AI is real. It works. Once you've built it for someone, they can't go back to anything less.

That's the bet for every product I've shipped since.