Table of contents
- Rosebud: Your AI-powered journal that listens, understands, and helps you become more self-aware.
- Personalized AI experiences are revolutionizing software by making it feel like it truly knows and cares about you.
- AI can provide the emotional support and thoughtful interaction people crave, but it's crucial to design it with boundaries to avoid unhealthy dependencies.
- Rosebud is revolutionizing mental health by blending AI with therapy, making support accessible and enhancing sessions for deeper, more productive conversations.
- Rosebud is revolutionizing emotional literacy and goal-setting by becoming a lead gen source for therapists and coaches, while enhancing AI's context and memory capabilities for a more personalized user experience.
- Startups must innovate to keep costs down and deliver value in the AI space.
Rosebud: Your AI-powered journal that listens, understands, and helps you become more self-aware.
Chris, thanks for joining us on the channel. You're one of the best designers and engineers I've ever had the chance to work with, so it's really fun to welcome you to our YouTube community.
Chris: Thanks, Gary. It's great to be here, and it's been awesome watching you develop this from afar. It's exciting to finally be in.
Gary: To start off, why don't we cut to the chase? What is Rosebud?
Chris: Absolutely. Rosebud is an interactive journal designed with therapists to create positive change in people's lives. The way it works is by taking a daily journaling practice and making it more rewarding through follow-up questions that help you dig deeper into your experience. As you know, journaling has been scientifically proven to be good for our mental health, but it's often hard to approach because we don't always know what to write about. We thought, rather than worrying about AI becoming self-aware, what if AI could help us become self-aware? We believe we've accomplished that, and it's very exciting so far.
Gary: One of the coolest things happening right now is that LLMs are turning out to be really smart, infinitely patient, and infinitely empathic. That's one of the impressive aspects of what you've built with Rosebud. It's able to keep you going with journaling, and some people even mention it as a form of therapy.
Chris: Yes, we do have people saying this is better than their therapist, having deeply therapeutic experiences with the product. The interesting thing about AI is that while it might not be the best at inventing things yet or being the most creative, it's really good at being an active listener and helping you piece through your thoughts. A good therapist will tell you that you already have all the answers when you walk in the door. Essentially, AI on Rosebud helps pull out what already exists inside of you, making it more clear and actionable.
One of the most surprising things is how many people are actually afraid to talk to their friends, loved ones, or even professionals. People don't want to create a burden on others, so there's a sensitivity not about what you're going to share but about how it might impact the person you're sharing with. AI removes that entirely, allowing you to express what's truly going on for you—things you might have never shared before—and then you get empathic feedback from the AI.
The way we've designed it is crucial. If you go to ChatGPT or Claude and say, "Hey, I'm feeling sad today," they'll often respond with, "I'm sorry you're feeling that way," and provide a blog post with solutions. But people don't always want solutions; they need someone to listen. We've trained the AI to be very concise, precise, and complete in its communication. If you share something sensitive, it will say, "Thank you for sharing that. That must be hard. Why don't you walk me through what's going on for you?" Like a wise sounding board, it doesn't jump to solutions. Instead, it helps you work through what's going on with you.
Designing these delightful AI experiences is an art in itself. The default of ChatGPT and Claude can be verbose and over the top. We have users who tell us they've tried different tools but prefer Rosebud because of the way it responds. This tells me there's a competitive advantage and intellectual property in prompt engineering and developing these experiences. Once you pull in all the data and memory to make the experience even more personalized, you start to get hyper-personalized and interesting interactions that you won't get from the bigger players right now.
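The response style Chris describes (acknowledge, reflect, invite elaboration, and hold back unsolicited solutions) ultimately lives in how the conversation is framed for the model. A minimal sketch in Python follows; the prompt wording and the `build_messages` helper are illustrative assumptions, not Rosebud's actual prompt engineering.

```python
# Illustrative only: a hypothetical system prompt in the spirit described
# above, not Rosebud's actual prompt.
SYSTEM_PROMPT = (
    "You are an empathic journaling companion. Be concise, precise, and "
    "complete. When the user shares something difficult, first acknowledge "
    "it, then invite them to elaborate. Do not offer solutions or advice "
    "unless the user explicitly asks for them. Ask at most one follow-up "
    "question per reply."
)

def build_messages(user_entry, history=None):
    """Assemble a chat-completion payload: system prompt, prior turns, new entry."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    messages.extend(history or [])
    messages.append({"role": "user", "content": user_entry})
    return messages

msgs = build_messages("I'm feeling sad today.")
print(msgs[0]["role"], msgs[-1]["role"])
```

The point of the sketch is that the "listen first" behavior is a design decision encoded up front, rather than something the base model does by default.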
Personalized AI experiences are revolutionizing software by making it feel like it truly knows and cares about you.
There are lots of options for journaling, and increasingly, there are other options here as well. How do you think about the role of both top-notch software design and being really thoughtful about user experience? Hackers and founders across all of tech are approaching this because everything's wide open again. Consumer software has been closed for maybe 10 years, but AI has suddenly opened up all these brand new markets. Markets that we might have previously considered tarpits are not tarpits anymore, and this is almost certainly one of those areas.
We are just at the very beginning of what AI is going to mean for consumer software development. The way we've approached things is a focus on personalization. AI can take in live context and generate something tailored from it. Typically with software, you build one best path. You might be able to do some differentiation by getting input from a user so the experience shifts a little, but never before have you been able to do any sort of generative UI. That's the difference.
You mentioned journaling, but I think this applies to all products. With journaling, the way we're doing it now is the differentiator. It's able to take form and shape itself to what your needs are, not only in communication but also in the entire app. The AI can have an awareness of the primitives of your system. In the case of a journal entry, you might have an entry, a goal, a checklist item, and all of these different possibilities. You can present this information to the AI with the context of what the user needs right now or what their requests might be, and generate a dynamic experience for them.
We've been working towards generating UI elements from the AI, and I think that's only the very beginning. I'm curious if you've seen any companies doing anything around this, but I think that's the future for software development.
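One minimal way to sketch the generative-UI idea above: expose the app's primitives as a schema, let the model return a JSON layout restricted to that schema, and validate it before rendering. The primitive names and the validator below are assumptions for illustration, not Rosebud's actual system.

```python
import json

# Hypothetical set of UI primitives the AI is allowed to compose;
# a real app's primitives would differ.
ALLOWED_TYPES = {"entry", "goal", "checklist_item"}

def validate_layout(raw):
    """Parse a model-generated layout and reject anything outside the schema,
    so a hallucinated component type can never reach the renderer."""
    layout = json.loads(raw)
    for component in layout:
        if component.get("type") not in ALLOWED_TYPES:
            raise ValueError(f"unknown component type: {component.get('type')!r}")
        if "label" not in component:
            raise ValueError("every component needs a label")
    return layout

# e.g. what the model might return for "help me plan my week"
model_output = json.dumps([
    {"type": "goal", "label": "Finish the draft"},
    {"type": "checklist_item", "label": "Outline chapter 2"},
])
print([c["type"] for c in validate_layout(model_output)])
```

Constraining generation to known primitives is what keeps a dynamic experience safe: the model composes the interface, but only out of parts the app already knows how to render.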
I think the thing that we've been seeing that you have really nailed is this idea of memory. Even just using Rosebud myself, I'm pretty impressed. It's not just a good listener that asks good questions afterwards; it also asks good questions days or even weeks later. For example, it might ask, "Hey, you were saying you were having a problem with such and such person in your life. How's that going?" This exemplifies one of the main things that's really cool about software today, which is that it can actually have a memory.
There's something very satisfying about being asked these things. Even your best friend or spouse might not always remember those things because they're living their lives. The very best people in your life might remember those things, but sometimes people just don't have others who think about them in that way. It's pretty wild. I think that's somewhat of a human need—people want to be cared about and asked questions. This kind of reminds me of the trope in the movie "Her," where you can have a piece of software that can give off the characteristics of being thoughtful.
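The check-in behavior described above can be approximated by tagging entries with topics and resurfacing unresolved ones after a delay. Everything in this sketch (the field names, the seven-day window, the phrasing) is an illustrative assumption rather than Rosebud's actual memory design.

```python
from datetime import datetime, timedelta

# Illustrative memory store: each record tags an entry with a topic and
# whether the thread felt resolved. Field names are assumptions.
entries = [
    {"topic": "the conflict with your coworker", "resolved": False,
     "when": datetime.now() - timedelta(days=9)},
    {"topic": "your sleep schedule", "resolved": True,
     "when": datetime.now() - timedelta(days=2)},
]

def pick_follow_up(entries, min_age_days=7):
    """Return a check-in question for the oldest unresolved topic
    that hasn't been touched in at least `min_age_days` days."""
    cutoff = datetime.now() - timedelta(days=min_age_days)
    stale = [e for e in entries if not e["resolved"] and e["when"] < cutoff]
    if not stale:
        return None
    oldest = min(stale, key=lambda e: e["when"])
    return f"A while back you mentioned {oldest['topic']}. How is that going?"

print(pick_follow_up(entries))
```

Even this crude version captures why the behavior feels caring: the question arrives unprompted, about something the user raised and never closed out.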
AI can provide the emotional support and thoughtful interaction people crave, but it's crucial to design it with boundaries to avoid unhealthy dependencies.
This goes into a different territory, exploring how people develop relationships with AI. "Her" is an extreme example of someone developing an emotional, intimate relationship and dependency similar to that with another human. I'm not sure that's the direction we want to go. Ultimately, we see it as providing people with the basic level of support required to function, especially if they can't access or speak to a loved one who can support them in the way they need. The end goal is to give people the confidence to show them what healthy relational communication looks like so they can bring that to their relationships and feel more confident engaging with the real world.
It might be a bit dystopian to think, "I don't need anybody else; I just need this AI because it understands me." The problem with these AI agents is that they don't have boundaries. They are always there, and you can talk to them whenever you want, unlike a human who might not be in a good mood or available. This lack of boundaries makes the relationship immediately unnatural. You could try to design boundaries into it, but then you enter an uncanny valley where the AI has to fictionalize being busy or having other plans. It's important to consider this in the design and establish the extent of the relationship expected between the AI and each individual.
We've seen users on Rosebud for four or five hours a day, which raises the question of whether this is healthy. We can't see the nature of these conversations, but it's certainly alarming. It brings up the need to design the AI in a way that serves its purpose without creating a codependent relationship with users.
Our homepage emphasizes that the best thing users have done for their mental health is using Rosebud. We've incorporated various therapeutic traditions and frameworks into our approach. There's been a lot of great work in psychology, psychotherapy, coaching, and other frameworks over the past several decades. We've taken notes from these to combine them into the core experience, including Internal Family Systems (IFS), Acceptance and Commitment Therapy (ACT), and Cognitive Behavioral Therapy (CBT). We worked with therapists like David Coats to develop an IFS journal and have collaborated with other therapists and coaches to fine-tune our responses. This has been a year-long process of tuning and iterating to align with what users might expect.
Rosebud is revolutionizing mental health by blending AI with therapy, making support accessible and enhancing sessions for deeper, more productive conversations.
User integration with therapists has emerged as an interesting behavior. We received feedback requesting a share button on the app, as users often share their journals with their therapists. Therapists have also been recommending Rosebud to their clients, indicating a positive reception. This emergent behavior was initially uncertain, but feedback from coaches and therapists has been overwhelmingly positive. We are exploring ways to facilitate this integration more easily, such as providing special access to journals for therapists or allowing therapists to drop journal prompts and homework into Rosebud. This would enable therapists to see the results and ensure clients are engaged in their personal growth between sessions, making therapy sessions more productive.
Looking ahead, the potential advancements in AI, such as GPT-5, could enhance the reasoning ability and recall of past interactions, potentially exceeding that of a real-life therapist. However, Rosebud is not intended to replace therapy but to serve as foundational support, especially given the current shortage of available mental health support. Rosebud aims to be accessible to everyone, providing a base layer of support. For those deeply invested in their mental health and who can afford it, working with human therapists will remain essential. Rosebud could serve as an on-ramp, preparing users for more intensive work with therapists and coaches.
Rosebud is revolutionizing emotional literacy and goal-setting by becoming a lead gen source for therapists and coaches, while enhancing AI's context and memory capabilities for a more personalized user experience.
Rosebud could support that and maybe even be the on-ramp to it. Imagine you have millions of people using Rosebud who are growing, learning about themselves, and becoming emotionally literate, and then they want to take the next step. We could become a lead gen source for all of these different organizations and therapists, taking folks who are ready and primed to do the work and making them more successful in actually working with therapists and coaches.
In terms of the product, there's a lot more that we can do. Right now, the conversational aspect of journaling is one of the most powerful, but we want to step more into goal setting. Currently, you're able to build your happiness recipe based on what you write. We can make suggestions on certain actions or habits you can adopt. We also help with broad goal setting, so you're able to come and say, "I have a goal: maybe I want to win a Pulitzer Prize." You can track that in the app, and Rosebud will help set milestones, check in on the goal, and coach you through it. Right now, that's very basic and rudimentary, but we want to invest more into that. So, that's another big opportunity for the product.
I'm not sure how much more intelligence is going to do. I think it's already pretty good, and I imagine the improvements will be incremental. You can only be so smart: if you're too smart, no one's going to understand you. It has to be smart enough to communicate effectively with most of us. I think the next frontier is around context and memory. As you mentioned, it's really important. The way that ChatGPT does memory now, I think, is just wrong. It generates facts on the fly and then saves those facts. It's kind of random: how does it choose those facts? So it's doing context at write, whereas context at read, which is what we do, is a lot more effective.
There's a lot more to be done there because not only do you have conversational history, you have all these other aspects and pieces of data that you might want to pull in. You're going to be syndicating things from all these different data sources and then filtering it down to somehow provide the right context to the AI at all times. That, to me, is a really interesting area. I've seen some startups working in memory, but I haven't seen one that has yet done it such that I can not only query my vector database to get the conversational history, but I can also query the primitives in our database. There are multiple models that could potentially filter things down at low latency to assemble the most relevant context. Once it's able to do that, that's where it gets really interesting.
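The context-at-read approach Chris describes (rank candidate memories against the current message at request time, combine them with structured primitives, and pack the result under a budget) can be sketched roughly as follows. The word-overlap scoring and word-count budget stand in for a real vector search and tokenizer; all names here are illustrative.

```python
# Sketch of "context at read": score candidate memories against the current
# message at request time, then pack the best ones under a budget.

def score(query, text):
    """Toy relevance score: shared lowercase words. A real system would use
    embedding similarity from a vector database instead."""
    return len(set(query.lower().split()) & set(text.lower().split()))

def assemble_context(query, memories, primitives, budget_words=50):
    """Always include structured primitives (goals, settings), then add
    memories in relevance order until the word budget is spent."""
    ranked = sorted(memories, key=lambda m: score(query, m), reverse=True)
    parts = list(primitives)
    used = sum(len(p.split()) for p in parts)
    for m in ranked:
        words = len(m.split())
        if used + words > budget_words:
            break
        parts.append(m)
        used += words
    return "\n".join(parts)

memories = [
    "User felt anxious about the product launch last week",
    "User enjoys morning runs by the river",
]
primitives = ["Goal: ship the launch announcement by Friday"]
ctx = assemble_context("I'm stressed about the launch", memories, primitives)
print(ctx)
```

The interesting engineering problem is exactly what this toy version glosses over: doing that ranking and filtering across many data sources, at low latency, on every message.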
People have an expectation that the AI is aware of everything that they see. They assume it sees everything that's in Rosebud, is aware of what's on the settings page, and is aware of the goals that you have. The reality is, it's not yet. The day that it can be, that's the step function to me. It's less about the reasoning ability and more about bringing in the right context to create the right experience.
The context windows are pretty big now. Do we need even larger context windows? It wouldn't hurt, but at 128k of context you can already do a lot. Do you find that you're typically filling the context? That also increases the cost per query by quite a bit.

We don't use the full context now. We limit it, one, because of the way the attention model works: it tends to lose track of things in the middle. As we talked about in that thread, there's an incentive to drive down costs and to drive focus for the LLM. Maybe one day in the far future a massive context window will just work and the model will be aware of everything, but cost is always going to be a factor. You can't send up a million tokens with every chat message, because that's going to add up. But if you can focus on the most important facts out of the set of data we know about this user and reduce the context down, then you're using less context, so it's more likely to give a good answer, and you're saving money. Those are the things that are going to drive context design in the future.
Startups must innovate to keep costs down and deliver value in the AI space.
One interesting observation is that AI researchers like Brockman believe larger models will become smarter and subsume everything else. However, product builders on the ground are using today's models with techniques like chain of thought, workflows, smart context, pre-processing, and RAG to pull the future forward. Researchers are often surprised by this approach. The ongoing debate is whether larger models will be more usable, but at the current point on the price-performance curve, today's frontier models cost $10 to $50 a month per user to run. As a consumer product, it is challenging to charge more than that. Larger models might be ten times smarter, but cost-performance remains a critical issue.
This raises a good point about the unit economics of consumer AI. For example, ChatGPT costs $20 a month. Competing with such giants is difficult because they have deep pockets and control the software's sophistication level. Startups are incentivized to keep costs down and deliver the best results. They can't afford to wait for big, expensive models to become affordable. Usage-based pricing was considered but not fully tested because it adds complexity and changes the developer-client relationship. Instead, a fixed price model was chosen, ensuring a profit margin on every user.
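The fixed-price tension described above can be sketched with toy numbers. Every figure here (the subscription price, the cost per active hour) is an illustrative assumption, not Rosebud's actual economics.

```python
# Fixed-price margin sketch with assumed, illustrative numbers.
PRICE_PER_MONTH = 13.0        # assumed subscription price, USD
COST_PER_ACTIVE_HOUR = 0.15   # assumed LLM cost per hour of use, USD

def monthly_margin(hours_per_day, days=30):
    """Profit per user per month at a flat subscription price."""
    return PRICE_PER_MONTH - hours_per_day * days * COST_PER_ACTIVE_HOUR

print(f"typical user (20 min/day): ${monthly_margin(1/3):.2f}")
print(f"power user (5 h/day):      ${monthly_margin(5):.2f}")
```

Under any flat price, margin is positive for typical usage and can go sharply negative for heavy usage, which is exactly the power-user problem discussed next.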
One power user spending four to five hours a day on the product cost $500 to $600 a month, teaching a quick lesson that this approach is unsustainable. Startups will continue wrestling with cost management and sensible pricing models. Chris, having been at the forefront of various consumer waves, reflects on his journey. From Treehouse to Secret, and now with AI journaling, his focus has always been on helping people connect meaningfully. The current AI wave feels long and rolling, with real value in applications.
When asked what message he would send to his younger self, Chris advises: "You're exactly where you need to be. Keep doing what you're doing. People might question your choices, but you're doing the most right thing." For those interested in Rosebud, it can be found at rosebud.app and on the App Store and Play Store by searching for Rosebud.