[CakeMix: AI Companions] - Make your own characters and live out your fantasies

Would you consider making a Mac version?

2 Likes

I had a Mac a few years back and I loved it. But… I sold it when I moved. Without one, I can’t make a Mac version. I don’t know if this game would even work on a Mac, to be honest. If I ever get another Mac, though, I’ll take a look.

CakeMix Version 1.2.1

The latest full update is out and available on Patreon. I’ll be updating the demo to this newer version soon and I’ll post here when that’s ready.

What’s new in this update?

Additional Weight-Gaming features: Aside from bug fixes, I’ve added the ability for characters to gain/lose weight over “chat time” so you have more control over it all. Once in chat/story mode, click the little Action Modifiers button above your text input box to bring up the new options. You can also add them directly into a story prompt using [ ] commands (see below and ask for help if you need it)…

  • Added ability to use [ ] in scenarios for instant weight gain on specific chat turns: [3]GainWeight 5 (will gain 5 pounds on turn 3)

  • Added ability to use [ ] in scenarios for instant weight loss on specific chat turns: [12]LoseWeight 23 (will lose 23 pounds on turn 12)

  • Added ability to use [ ] in scenarios for gradual weight changes over the course of the chat: [0]AutoWeight (1–5) (1 gains lowest amount, 4 gains highest, 5 gains and loses randomly)

  • Added gradual weight change over session to weight-based scenario starters.

  • Improved weight gain/loss visuals.

  • Added scale icon on Character Chooser card when a character gains weight over time.
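As a minimal sketch of how the instant `[turn]GainWeight`/`[turn]LoseWeight` tags described above could be pulled out of a scenario prompt (hypothetical Python for illustration only, not CakeMix's actual implementation, and it ignores the `AutoWeight` variant):

```python
import re

# Matches tags like "[3]GainWeight 5" or "[12]LoseWeight 23".
TAG_RE = re.compile(r"\[(\d+)\](GainWeight|LoseWeight)\s+(\d+)")

def parse_weight_tags(prompt: str):
    """Return ({turn: signed weight delta in pounds}, prompt with tags removed)."""
    changes = {}
    for turn, cmd, amount in TAG_RE.findall(prompt):
        delta = int(amount) if cmd == "GainWeight" else -int(amount)
        changes[int(turn)] = changes.get(int(turn), 0) + delta
    cleaned = TAG_RE.sub("", prompt).strip()
    return changes, cleaned

changes, text = parse_weight_tags(
    "A quiet cafe date. [3]GainWeight 5 [12]LoseWeight 23"
)
# changes maps chat turn -> pounds gained (positive) or lost (negative)
```

So that example prompt would schedule a 5-pound gain on turn 3 and a 23-pound loss on turn 12, with the tags stripped from the text the AI actually sees.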

Chat Text Accessibility:

  • Added Color Scheme option in General Settings for using alternate text/background color schemes. You can choose from the default dynamic colors, dark/light, or light/dark.

What else is new in this update?

  • Added Maren, a new alternative/goth character, to Downloaded characters.

  • Moved Steampunk Seductress personality from Mature to Alternative.

  • Changed “Chooser” button in Stories Panel to “Load Prompt.”

  • Changed “Save” button in Stories Panel to “Save Prompt.”

  • Updated Story-related wording to “Prompts” for clarity.

  • Added ability to include the current music track in a saved prompt so it loads with it.

  • Added music playlists loading on custom/downloaded characters.

  • Added controls help button and panel to Scenario and Story screens.

  • Fixed controls help button not being clickable in Character Creator.

  • Fixed various resolution issues, including camera movement and misplaced panels.

  • Made fading message stand out more.

  • Added Action Modifiers panel for gradual weight gain/loss during chat and instant weight changes.

  • Removed > chat commands and replaced them with Action Modifiers panel controls.

  • Improved AI back-end in several areas.

  • Reduced AI tendency to generate player responses at the end of story parts.

  • Reduced AI tendency to introduce the same plot twists too often (rival lovers, etc.)

  • Added ability to change plot twist frequency and character quirk frequency in cakemix.ini.

  • Removed numGPULayers being automatically set in cakemix.ini; it’s now manually set only, and has been renamed to num_GPU_Layers. See here for more info on that.

  • Added warning if installation path may cause AI loading issues; warning can be disabled in cakemix.ini.

  • Added note on LLM download panel that VPN use may prevent downloads.

  • Fixed crash when clicking Stories button while not using an official character; now loads a random scenario.

  • Fixed various typos and grammatical errors in prompts and text.

  • More internal bug fixes, improvements, and additions.
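For reference, the cakemix.ini options mentioned above might look something like the sketch below. Only num_GPU_Layers is a name confirmed in this changelog; the other key names and all of the values are placeholders, so check your own cakemix.ini for the real spellings:

```ini
; Confirmed in the changelog: renamed from numGPULayers, now manually set only.
num_GPU_Layers = 20              ; example value only, tune for your GPU

; Placeholder names -- the changelog mentions these options but not their keys:
; PlotTwistFrequency = ...
; CharacterQuirkFrequency = ...
; ShowInstallPathWarning = ...
```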

-----------------------------------------------------

Thanks so much for the response, and as always I’d love to hear your thoughts and suggestions. If you need any help, just ask.

~ piper

9 Likes

I’m in need of some help. I downloaded the demo and selected Lyric and her basic scenario, but I’m stuck forever on loading into it. Everything else seems fine, if a little slow, but I think my computer will live. Does anyone know what the problem is?

Sorry to hear you’re having troubles. Please see this page, and follow the second set of steps.

Can you add some commands, similar to the weight gain ones in chat, for breast and ass expansion? That way specific body parts grow instead of everything at once.

2 Likes

I had the same problem. My game folder was hidden on my desktop inside a folder with an invisible name. Later, I moved the game folder directly to the C:\ drive, and the problem was solved.

1 Like

Sooooo, uhh. Thank you for replying so fast, but I’m not really good at this stuff. Could you explain how to fix it, step by step?

If your path to the game folder looks something like this: “C:\Users\Admin\Desktop\New Folder\CakeMix_1.2.1”, then shorten it by moving the folder to the “C:\” drive so that it looks like this: “C:\CakeMix_1.2.1”.

I kind of did it? I dragged the whole folder from Downloads into the C:\ drive but it still loads forever.

Yup I started doing that already and decided to work on it some more before putting it in a build. It’s coming. :slight_smile:

2 Likes

Which part of the steps did you not understand? The only way I can really help figure out what the problem might be is if I have some indication of your system specs, etc., which are all in that Player.log file.

New CakeMix Free Demo is out, btw. I forgot to post it here.

More info and download here.

There are several bug fixes and improvements over the previous demo, so be sure to get the latest if you’re having trouble.

I extracted it using WinRAR to the C:\ drive

Edit: I sent it to you

1 Like

I just took a look and wrote back. Your video card is not powerful enough to use AI, sorry. Thank you for trying, though, and for sending that over.

Hi there. I really like what you’ve put together with this project. Combining a local LLM with a hand-crafted character creator is not something I expected to see, but it works surprisingly well and you seem to have made great improvements and incorporated a lot of community feedback in a short time.

I wanted to offer some suggestions that I hope are constructive for a project like this. Obviously I don’t know anything about how it all works under the hood, so these are offered as ideas to consider, even if actually implementing any of them might not be on the cards.

Ability to edit AI responses

It’s an LLM, it’s going to do weird things sometimes and rolling with that can be a lot of fun. However, sometimes the LLM totally deviates from what you’ve asked it to do or doesn’t properly consider the context. You can hit the Interrupt button but by then it’s usually too late, and trying to issue subsequent prompts to the LLM to revert and change things is clunky and doesn’t always achieve what you want.

I’m assuming that the LLM is considering its own recent output when deciding what to do next, so with that in mind it seems like being able to edit the text the LLM has just produced before inputting your own prompt would allow you to erase something you don’t like that the model just did and steer it down a different path. I’ve seen the model do things as basic as getting names wrong, or introducing story elements that don’t make any sense (e.g. having a character pull out a phone and start texting when I told it we’re in a medieval fantasy setting). It’s an LLM, so you obviously can’t stop it from doing things like this, but maybe we could have a way to edit the most egregious issues ourselves.

If you were able to implement this I think it would fix the majority of issues I’ve experienced personally, and would be far and away the most impactful change you could make.

More Personalities

I’m not even close to having tried out all of the personalities that sound interesting, but working under an assumption (again) that the selected personality is being fed to the LLM and should influence its behaviour, I feel like there are a few character archetypes that would be of interest to users on here that didn’t immediately seem to be represented in the options I looked through.

Assuming new personalities aren’t too difficult to add (as there seem to be a lot of them already), this could be an interesting area to open up to polls/suggestions. For example, you could look to add some options that are less positive about weight gain (especially now there’s an option to have a character start thin and gain weight during a scenario), e.g:

  • A character who prefers being thin and isn’t happy with their weight increasing/wants to lose weight (this would tie in well with the program directly tracking weight and knowing if/how much it has changed, which you’ve alluded to in other posts)
  • A character who is in denial about their weight and insists they are skinny or tries to make excuses and convince themselves (and others) that nothing has changed
  • A character who is sensitive about their weight and prefers to avoid the subject/gets upset if it is brought up

Maybe some of the above could be solved by a general option to set if a character should be positive/negative/neutral towards weight gain or being above a certain size, which would then let us combine the interesting array of existing personalities with this kind of story element.

Storing Persistent Information Mid-Session

I’m not sure to what extent the LLM remembers the original scenario prompt even as you add more and more responses. I’ve had some luck getting the LLM to accept different genres and story elements by including them in the character biography, preferences, scenario, etc. However, one weakness right now is that the LLM can dynamically introduce an interesting detail that it then quickly forgets about, or else you can’t get any changes from the starting point to stick (e.g. the character announces spontaneously they are pregnant and then keeps forgetting about it until you prompt it back in again).

I don’t know what the technical or performance impact would be, but this could maybe be solved by including a small additional text field that the user can edit during a scenario and that is kept in the LLM’s “memory” of what is happening. Even one sentence would be enough to make a difference.

Something else that might just be a weird quirk of my sessions is that the LLM seems to like describing the character’s underwear even in situations where that doesn’t make sense (since they have clothes on top). I’m assuming this is because the LLM is aware of the options you selected for clothing in the character creator. Being able to either edit the LLM’s responses or insert a short description of the character’s outfit that remains in “memory” would help with this.

The User Persona

This is where I think I might be straying too far from what you had in mind with this project for this to be constructive, but I found it a bit weird that the LLM always assumes that the user and the character are interested in one another even if you try to prompt around it or use the sexuality settings for the user. I get that this was more or less the point of what you originally built here, but it closes out a lot of scenario options people might be interested in. For example, I tried a scenario where I was playing the character’s roommate and the scenario was meant to focus on them being bad influences on each other, but then the character starts kissing me out of nowhere a few messages in.

The current approach also rules out something like a “slice of life” scenario where the character moves between different situations and characters, as the user is always seemingly forced to have their own persona present in every scene (or else the LLM keeps inserting you back in).

I don’t know if this is something you’d even want to “solve”, but personally I’d be interested to see what would happen if you could start a scenario without a single fixed user persona, or else edit it on the fly. You can sort of do this right now by ending and starting new scenarios with new personas, but then you lose temporary edits you made to the character and the LLM won’t remember what was happening previously. It feels like what you’ve built is very close to being able to do this already, but I absolutely understand that I’m probably in the minority for even wanting to do this.

2 Likes

Thanks so much for your write-up. I’ve been down with a cold the past few days so I apologize for the delay in getting back to you.

The system is currently set up for modern style stories, which would explain why the AI wants to take a call in a medieval scenario, lol. There are a bunch of messages I send to the AI in the background that have to do with modern romance/sexual type things, so that’s why it keeps wanting to go there.

I started working on some options to solve that a few months back and stopped to work on some other things, but I will definitely revisit it. I basically set up various story styles to choose from, like sci-fi, fantasy, thriller, historical, etc., which help guide the AI in those directions. So yeah, I’ll try to prioritize getting some of that in for the next update or the one after it.

As far as editing the LLM responses mid-stream, I’m not sure if that’s even possible, but I will give it a look. Like I say, right now I send background messages to it for all kinds of things so that’s likely a possible way I can keep it better on track. Because of the small size of the LLMs, they are always going to do some weird things, but I like the idea of “nudging” it back where you want it. That’s been on my to-do list for a while now and I just keep working with it on each update. So, little improvements all the time, but I doubt it will ever be as solid on track as I’d like, again given the small size of the LLMs.

I like the idea of those additional personalities, so I will likely get some of those in for the upcoming update. I also realize that more can be done for the player persona, which I’ve been working on, so I’ll consider your ideas along those lines as well.

Thanks again for your thoughts! Much appreciated. :slight_smile:

2 Likes

I don’t know about the technical side of things, but every LLM based text roleplay site/program I’ve ever seen has had this as a feature, going back to AI Dungeon which predates the modern LLM craze by a good few years and was using a much more basic model than what we have now. To be fair, I don’t know how they’re doing it or if there’s anything special about the LLM running locally that makes this harder to achieve. If this isn’t something that can be added then it’s not like the current setup is unusable without it, but I do think it would go a long way towards working around the inherent limitations of anything LLM based that may be very hard to solve with just your system prompts (e.g. excessive repetition, where the LLM gets into a “feedback loop” by repeating itself, and then those repetitions it’s already made seem to influence it to repeat the same thing in the future).

That’s interesting to know; I assumed it was just the training data being skewed towards elements you see a lot in the real world. In that case it’s actually pretty surprising how well this works just by prompting a genre or setting and occasionally mentioning it in instructions. I would be genuinely interested in story styles as an official feature, but this already seems to work quite well at the moment.

One final piece of feedback I wanted to share: I saw you mention in a previous post that the 100 character response “limit” was included as a suggestion to encourage us to keep inputs short. At first when I tried the demo I had assumed this was a hard limit of 100 characters (due to the local LLM) and that seemed like pretty much a deal breaker as it’s way too short to do anything. I spent a while trimming everything to be as concise as possible and save tiny amounts of input space until I read more into how it actually works and realised it’s not a hard limit. I imagine others might be doing the same and getting a bad experience as a result.

I’ve never had issues with putting in much longer inputs than 100 characters (even going above 800 seemed fine). I’d suggest either making it clear that this isn’t a hard limit, or else rethinking how this is tracked/conveyed to the user as it might be giving people the wrong idea and putting them off like it almost did with me.

1 Like

I’m really curious about this but I’m afraid I can’t quite picture in my mind what it really is or how it would work. Could you explain in detail, perhaps, how that might work in CakeMix? Like what the interface might include, etc? I think I’m just not fully understanding the concept. I’m assuming that you’ve used the [OOC:] commands, which can help put things back on track, so if there’s a way you’re thinking would work, please let me know.

I’ll take a look at that. It is indeed a soft limit as you can type however much you’d like, but past a certain point the AI gets confused. If you’re regularly typing ~800 chars into responses and the LLM is going off track, that’s why the soft limit is there.

I’m super curious as to how people use this thing, and I’m mostly just guessing, as I don’t collect any data. If you’re so inclined, do some chats and send me the logs; I would love to take a look. It would help me hone the system towards real-world use rather than just my own biases. If you don’t mind doing that, you can send some chat logs to ripenedpeach[@]gmail.com – that goes for anyone else reading this.

Thanks again!