Entwined Verdancy v0.1.2 - A Tentacly Text Adventure Game


I made a text adventure game with weight gain, tentacle bondage, cum inflation, pregnancy, and egg laying. Play as a lone woman who finds herself trapped in a treacherous forest full of ravenous tentacle monsters and strange glowing exotic fruits that fatten you with just one bite!

Important: The game is narrated by a large language model similar to ChatGPT, and the images are made with Stable Diffusion; both are generative AIs.

This game is a purely experimental project. I wanted to construct an entire fictional world and let players explore it with as much freedom as they want. To keep things consistent, I tried to let the AI handle the player stat updates, but it's hit or miss for now. I'll improve it when I have time.
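One way this can be wired up (a minimal sketch in TypeScript with made-up names and a made-up prompt format, not the game's actual code): ask the model to append a machine-readable line with stat deltas after the narration, then parse and clamp it on the client side. Clamping per-turn changes is what keeps a single over-enthusiastic reply from maxing a stat out.

```ts
// Sketch of LLM-driven stat updates (hypothetical names, not the game's actual code).
// The model is asked to append a machine-readable line with stat deltas after the
// narration; the client parses it and clamps the result so one reply can't go wild.

interface PlayerStats {
  weight: number;   // 0..100, also used to drive the morph targets
  breasts: number;
  belly: number;
}

interface StatDelta {
  weight?: number;
  breasts?: number;
  belly?: number;
}

// Appended to the system prompt so the narration ends with a parseable stats line.
const STAT_INSTRUCTION = `
After the narration, output one final line of the form:
STATS {"weight": 2, "belly": 10}
Use negative numbers for decreases and only include stats that changed this turn.`;

function parseStatDelta(reply: string): StatDelta {
  // Look for the trailing "STATS {...}" line the prompt asks for.
  const match = reply.match(/STATS\s+(\{.*\})\s*$/m);
  if (!match) return {};
  try {
    return JSON.parse(match[1]) as StatDelta;
  } catch {
    return {}; // malformed JSON from the model -> ignore this turn
  }
}

function applyDelta(stats: PlayerStats, delta: StatDelta, maxStep = 5): PlayerStats {
  const clamp = (v: number) => Math.min(100, Math.max(0, v));
  // Cap how much a single reply can change any one stat.
  const step = (v?: number) => Math.max(-maxStep, Math.min(maxStep, v ?? 0));
  return {
    weight: clamp(stats.weight + step(delta.weight)),
    breasts: clamp(stats.breasts + step(delta.breasts)),
    belly: clamp(stats.belly + step(delta.belly)),
  };
}
```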

For the user interface, there's a 3D view that shows the character model so you can roughly gauge her current condition. I added morph targets for her body (weight gain), chest (breast expansion), and belly (inflation). The narration also comes with 2D AI-generated images that try to match the player's current state (walking, sleeping, being ravaged, …) along with her belly size and weight level.
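To give an idea of the 3D side: in three.js, morph targets loaded with a model are exposed through morphTargetDictionary and morphTargetInfluences, so driving them is just a matter of mapping the player's stats onto 0–1 influence values. A rough sketch below; the blend shape names are hypothetical, since a real VRoid model exposes its own names.

```ts
import * as THREE from "three";

// Sketch: mapping player stats (0..100) onto morph target influences (0..1).
// The blend shape names ("BodyFat", "BreastBig", "BellyInflate") are made up;
// a real VRoid model lists its own names in morphTargetDictionary.
function updateBodyMorphs(
  mesh: THREE.SkinnedMesh,
  stats: { weight: number; breasts: number; belly: number },
) {
  const dict = mesh.morphTargetDictionary;
  const influences = mesh.morphTargetInfluences;
  if (!dict || !influences) return;

  const set = (name: string, value01: number) => {
    const index = dict[name];
    if (index !== undefined) {
      influences[index] = THREE.MathUtils.clamp(value01, 0, 1);
    }
  };

  set("BodyFat", stats.weight / 100);
  set("BreastBig", stats.breasts / 100);
  set("BellyInflate", stats.belly / 100);
}
```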

Note that you'll need to enable weight gain in the settings. The AI loves to rapidly increase the player's weight no matter what (brush past a fruit? Gain weight. Dip your feet in water? Get super fat, …), so I toggled it off by default for now.
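The toggle itself is nothing fancy; conceptually it just drops any weight change the AI proposes while the setting is off, along these lines (hypothetical names again, just to illustrate the idea):

```ts
// Sketch: gating AI-proposed weight changes behind a settings flag (hypothetical names).
type StatDelta = { weight?: number; breasts?: number; belly?: number };

interface Settings {
  weightGainEnabled: boolean;
}

function filterDelta(delta: StatDelta, settings: Settings): StatDelta {
  if (settings.weightGainEnabled) return delta;
  // Weight gain toggled off: zero out the proposed weight change, keep the rest.
  return { ...delta, weight: 0 };
}
```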

Play the game here:

Screenshots




Oops, fixed a bug where fatness was stuck at 0.

Was a fun time. Thanks for sharing.


The game seems to have… “crashed”?
To elaborate slightly, there is no notification for this. It occurs mid-generation.
Is it possible to run the game locally?

Sorry, but you should save often.
I can't make the game work with a locally hosted model because I don't have a PC capable of running LLMs to test with.

Interesting idea. I know some will call it lazy and such, but I like the idea of using AI for niche things like this. I myself used AI to make a little adventure game, but nothing like this.

I am curious though, what did you use for the 3D model?

It's a free VRoid model.

Where can I access the saves that I download from this game?

You should have seen the file download like a normal file. If you don't see anything, try enabling popups, disabling ad blockers, and so on.

Also, unfortunately, save files don't last very long because I'm constantly revamping the game and can't promise to support old save versions.

Great game with a lot of potential! 🙂


I see. Very good game; I'm certainly looking forward to what you have in store next =D


Played quite a bit of it already and can say it's quite fun with the content that's there at the moment, so I do hope there will be new events and other characters introduced as the game gets updates.

This looks interesting. I took a glance at it the other day and then clicked off. But I assume the API keys are for a specific AI.

It would be great if I could also somehow use my NovelAI API key.

Edit: seems I have to save it within the browser first, THEN I can successfully download the save; now I get the prompt to save it to my desktop.

You have to use “Download Save”. Then, when you come back, use “Upload Save” and then “Load Save” to load up the save you just uploaded.
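Under the hood this is just standard browser file handling: “Download Save” serializes the state into a file the browser downloads, and “Upload Save” reads the chosen file back in. A rough sketch, assuming a JSON save format (the game's actual code and format may differ):

```ts
// Sketch of browser-side save export/import (not the game's actual code).

// "Download Save": serialize the game state and hand it to the browser as a file.
function downloadSave(state: object, filename = "entwined-verdancy-save.json"): void {
  const blob = new Blob([JSON.stringify(state)], { type: "application/json" });
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(url);
}

// "Upload Save": read the chosen file back in; "Load Save" then applies the parsed state.
async function uploadSave(file: File): Promise<object> {
  return JSON.parse(await file.text());
}
```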

I would really like to see this generalized into some sort of gaming frontend for LLMs.

Something like a much more powerful successor to AI Dungeon.

Downloadable, supporting other LLM APIs (local ones like oobabooga), custom world scenarios, and maybe even an optional image generation API (local as well, e.g. SD.next). All the pieces are already there and the execution is awesome; it would be a shame to lock this kind of framework to just one scenario (which is a little too tentacular for me). A sketch of what the local plumbing could look like follows below.
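For what it's worth, that plumbing isn't exotic: oobabooga's text-generation-webui can expose an OpenAI-compatible chat endpoint, and SD.next / AUTOMATIC1111 expose a txt2img endpoint. Something like the sketch below could work; the ports, paths, and payload fields are the commonly documented defaults and would need verifying against a particular local install.

```ts
// Sketch: pointing a game like this at local backends instead of a hosted API.
// The endpoint paths below are the commonly documented ones for oobabooga's
// OpenAI-compatible API and for SD.next / AUTOMATIC1111's web API; the ports
// and payload fields may need adjusting for your local setup.

async function localNarrate(prompt: string): Promise<string> {
  const res = await fetch("http://127.0.0.1:5000/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [{ role: "user", content: prompt }],
      max_tokens: 400,
      temperature: 0.8,
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content as string;
}

async function localImage(prompt: string): Promise<string> {
  const res = await fetch("http://127.0.0.1:7860/sdapi/v1/txt2img", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, steps: 20, width: 512, height: 768 }),
  });
  const data = await res.json();
  return data.images[0] as string; // base64-encoded image
}
```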


With previous Twine games I never had an issue saving directly to my computer without saving within the browser, but this game is the first to prompt me to save inside the browser before I can download it to my computer. At least now I know what to do for future builds.

That's gonna take another couple of hardware generations to be plausible. I'm running a 5800X and a 7900 XT. Neither of those is optimized for AI work (getting image generation to run on AMD right now is like pulling impacted teeth).

Yeah, I know, same. Dunno if it’s the way this one’s written, or an artifact of the AI processing bits. Just thought I’d note it since I figured it out with a little experimentation.

Nvidia is the way to go. I have 2x RTX 3060 (12 GB), and they can run a fairly clever LLM with a 20k token context (split across both cards) at reading speed, or a smaller model on one card and Stable Diffusion on the other in parallel. Also, my CPU is prehistoric and doesn't have AVX2 instructions, but current CPUs can run LLMs too (not sure about the speed, though).

Running anything on AMD cards is a pain (or was some time ago; maybe things are changing).

This particular game would be perfectly runnable locally on a single 12 GB Nvidia card.