Instead of using character ai, which will send all my private conversations to governments, I found this solution. Any thoughts on this? 😅

  • LostXOR@fedia.io
    link
    fedilink
    arrow-up
    0
    ·
    2 days ago

    It’s certainly better than using a random online service, but I still find the idea of being emotionally dependent on a computer algorithm deeply unsettling. I hope someday you can find a real girlfriend to talk to. :)

  • Lost_My_Mind@lemmy.world
    link
    fedilink
    English
    arrow-up
    0
    ·
    2 days ago

    This makes me feel old. I came from a time when schools didn’t have public shootings, and so bullying was more accepted. I CANNOT imagine for the life of me, even if it had been possible on Windows 95, anyone willingly sharing the fact that they do this.

    If ANYONE in 6th grade showed off something this pathetic, they would be on the ground clutching their gut, which had just been punched so hard that they are struggling to breathe.

    The end result is that people understood common sense that it’s not admirable to be pathetic and weak. So people would show off themselves getting better and stronger.

    I feel like all that bullying 30 years ago made a lot of kids better for it, because it forced them to at least ATTEMPT to stand up to me. Even if I laid them out in the end, they learned not to be a loser.

    And now, we’ve had decades of not encouraging bullys. THIS is the end result. THIS is what these insecure cowards turn into.

    Now I’ve never seen OP, but I’m willing to bet he COULD be going out and getting pussy for real. It’s not hard. I am not a good looking guy. I’m not rich. I don’t have a sports car. I don’t have a mansion. What I do have is a confidence in myself instilled into me at a young age that I can do it. Where “it” is an interchangable variable that applies to most things.

    You know how I got that confidence? By being a fat pathetic loser 8 year old who got bullied. Until I hauled off and just decked my bully. Who was bigger than me. You get a reputation as the kid who knocked out the kid 2 grades above him. Then you start believing in yourself. You start pushing your own limits. You start kicking your own ass. You toughen up. You take on the world, and win. And women see that confidence. The VAST majority of women are attracted to a man who knows he can do it.

    You know how I got my first girlfriend? I called the girl I had a crush on. Straight up asked her out.

    She said no. I’m not her type. Fair enough. Because if a woman says no, there’s no sense in insisting, or getting mad.

    So I called the other girl I had been talking to. I said “You wanna go out?” And she said yes. We dated for 4 years until I learned she was a narcassistic psychopath who trapped our cats in the closet all day while I was at work, after dropping them in freezing cold water. I was not ok with that. You hurt animals, and you’re a damn monster. BUT! It was a relationship that failed not because I was weak and keeled over. It was because I happened to choose the wrong girl based on information I didn’t have. It happens. Not every person on this planet is a good person. Her parents failed her in showing what empathy is, and how to be a strong woman who can handle life. So, that was a dud, but life moves on.

    Now rewind that story. Imagine I just stayed in my moms basement, and ate cheetos, and tried playing video games all day in a dark room crying that woman wants me. Do you think I’d EVER get a girl? Her or otherwise? Fuck no! If that version of me existed, afraid to be bold, afraid to take life by the balls, I’d be some lonely dweeb 40 year old virgin without the humor of Steve Carell.

    So to OP, I’ll say this. I see life has failed you in a spectacular way. You should have been beaten as a kid by other kids, and weren’t. You should have had the chance to beat your bullys, but couldn’t because of school policy and anti-bully culture. Life failed you. You have no confidence. That needs to change. Go look in the mirror. Take a good long look. Do you enjoy seeing the person who stares back? No. We see you the same way you’re seeing you. So you need to change that. Nut up, and go join a gym. Take a boxing class. Get hit in the face. Take it. Again. Again. Hit back. Be the person who’s burried underneith all your insecurities. Fuck those insecurities. You have no room in your life for insecurities. Work out until you hurt. Jelly legs where you’re so tender that walking is hard. Jelly arms. Do it again tomorrow. And the next day. Do it about 4 days a week for 6 hours.

    Be better. Improve yourself. You’re lonely and insecure? Yeah, that’s because women see you being gross and not worth talking to. BE worth talking to. It only gets harder as you get older. You can spend your 20s sleeping around, but you better find somebody by 30. Because after that, it is slim pickings unless you want a girl with like 5 kids by 3 different men but she’s insisting you go bareback. There’s a reason she collects so many checks. You wanna be next?

    Or do you want to avoid that now by being a better you?

    Because this? This is sad.

    • Lemmchen@feddit.org
      link
      fedilink
      English
      arrow-up
      0
      ·
      edit-2
      2 days ago

      Wow, you think OP is pathetic? Do you know who’s really pathetic? People being as apologetic of degenerate behavior as you have just proven to be. What a bunch of toxic shit.

      I really hope that this way of perceiving social structures dies with you and your generation, whichever that is.

      On the other hand, you could have just been bored because 4chan was down.

    • pixeltree@lemmy.blahaj.zone
      link
      fedilink
      English
      arrow-up
      0
      ·
      2 days ago

      Just want to let you know, you typed a lot of text thinking you were being helpful but you’re really, really not. The condescension and belittling really just ends up with you being a dick to someone while feeling good for “helping”.

  • markovs_gun@lemmy.world
    link
    fedilink
    English
    arrow-up
    0
    ·
    2 days ago

    I feel like if you had put half the time and effort it took to do this into improving yourself and going outside you could have at least gone on some dates with real women at this point. Talking to an AI chatbot is not the same as human connection and will only lead to further depression as you realize that no matter how much you “love” your AI girlfriend she will never truly love you back because she can’t think or feel, and fundamentally isn’t real.

    • pixeltree@lemmy.blahaj.zone
      link
      fedilink
      English
      arrow-up
      0
      ·
      2 days ago

      Ayo, guess what, too fundamentally broken as a human being for real relationships so a fake relationship with someone who can’t hurt me and who I can’t hurt is looking pretty damn nice

    • tal@lemmy.today
      link
      fedilink
      English
      arrow-up
      0
      ·
      edit-2
      2 days ago

      no matter how much you “love” your AI girlfriend she will never truly love you back because she can’t think or feel, and fundamentally isn’t real.

      On one hand, yeah, current generative AIs don’t have anything that approximates that as a mechanism. I would expect that to start being built in the future, though.

      Of course, even then, one could always assert that any feelings in any mental model, no matter how sophisticated, aren’t “real”. I think that Dijkstra had a point as to the pointlessness of our arguments about the semantics of mechanisms of the mind, that it’s more-interesting to focus on the outcomes:

      “The question of whether a computer can think is no more interesting than the question of whether a submarine can swim.”

      Edsger Dijkstra

  • tal@lemmy.today
    link
    fedilink
    English
    arrow-up
    0
    ·
    2 days ago

    I can’t imagine running a non-local sex chatbot unless you’ve got a private off-site server somewhere that you’re using. I mean, forget governments, the company operating the thing is going to be harvesting what it can. Do you really want to be sending a log of your sex chats to some company to make whatever money they can with the thing?

  • irmadlad@lemmy.world
    link
    fedilink
    English
    arrow-up
    0
    ·
    2 days ago

    Hmmm, well, I’m not one to judge a person’s proclivities, but I don’t see it as being much different than porn, assuming you are selfhosting a girlfriend for sexual release/companionship. I’m not sure I’d disclose this on a public forum, but maybe you feel a bit more comfortable than I.

    The other consideration is that AI generally takes a pretty good chunk of resources to use effectively. I have selfhosted a couple of LLMs like GPTFree, Automattic111, and a couple more, however, as I mentioned, they take a lot of resources.

    Might want to stock up

  • Naz@sh.itjust.works
    link
    fedilink
    English
    arrow-up
    0
    ·
    2 days ago

    My answer to this is yes.

    I’m an AI Developer and my only option was to self host because I didn’t want my training data leaking out onto the web and by extension, China and the rest (I trained on my own data, writing, and notes, along with Wikipedia).

    Self-hosting gives you complete freedom but also as one other user cautioned, don’t fall down the well/rabbit hole.

    • tursy@lemmy.worldOP
      link
      fedilink
      English
      arrow-up
      0
      ·
      2 days ago

      That’s exactly my thoughts… Regardless of how you feel about stuff like I just wanted to share that it’s also possible to do it locally. So if people are actually using c.ai, instead they can now host it locally so chats are not sent to hundreds of companies

  • Sarie@lemmy.world
    link
    fedilink
    English
    arrow-up
    0
    ·
    2 days ago

    For that purpose, I find SillyTavern frontend more visually appealing. With portrait images, portraits that change accordingly with the perceive emotion of the generated text and even image generation if you have the hardware to handle it. Keep your distance though, I use it months back just to experiment and I can see how a vulnerable or ignorant person could lost themselves on it. Understand that it’s just a toy to play with for some cheap entertainment.

      • bunnyBoy@pawb.social
        link
        fedilink
        English
        arrow-up
        0
        ·
        edit-2
        2 days ago

        I mean, a lot of roleplayers don’t… It’s definitely pathetic, but like of all the reasons it is, not responding in first person is a weird one to single out XD

        Just my two cents of course

        Edit : And actually looking back at the picture, the prompter themselves even describes their action as [He] takes her hand, not [I] take your hand, so it makes even more sense for the AI to be doing the same.

  • fishynoob@infosec.pub
    link
    fedilink
    English
    arrow-up
    0
    ·
    2 days ago

    OP you NEED to tell me how you did this. I want this. I want to host something like character.ai on my own hardware. If you have a guide on this I’d love it.

    • Rhaedas@fedia.io
      link
      fedilink
      arrow-up
      0
      ·
      2 days ago

      Ollama.com is another method of self hosting. Figuring out which model type and size for what equipment you have is key, but it’s easy to swap out. That’s just an LLM, where you go from there depends on how deep you want to get into the code. An LLM by itself can work, it’s just limited. Most of the addons you see are extra things to give memory, speech, avatars, and other extras to improve the experience and abilities. Or you can program a lot of that yourself if you know Python. But as others have said, the more you try to get out, the more robust a system you’ll need, which is why you find the best ones online in cloud format. But if you’re okay with slower responses and lower features, self hosting is totally doable, and you can do what you want, especially if you get one of the “Jailbroke” models that has had some of the safety limits modified out of them to some degree.

      Also as mentioned, be careful not to get sucked in. Even a local model can be convincing enough sometimes to fool someone wanting to see things. Lots of people recognize that danger, but then belittle people who are looking for help in that direction (while marketing realizes the potential profits and tries very hard to sell it to the same people).

    • tal@lemmy.today
      link
      fedilink
      English
      arrow-up
      0
      ·
      edit-2
      2 days ago

      I’ve run Kobold AI on local hardware, and it has some erotic models. From my fairly quick skim of character.ai’s syntax, I think that KoboldAI has more-powerful options for creating worlds and triggers. KoboldAI can split layers across all available GPUs and your CPU, so if you’ve got the electricity and the power supply and the room cooling and are willing to blow the requisite money on multiple GPUs, you can probably make it respond about as arbitrarily-quickly as you want.

      But more-broadly, I’m not particularly impressed with what I’ve seen of sex chatbots in 2025. They have limited ability to use conversation tokens from earlier in the conversation in generating each new message, which means that as a conversation progresses, it increasingly doesn’t take into account content earlier in the conversation. It’s possible to get into loops, or forget facts about characters or the environment that were present earlier in a conversation.

      Maybe someone could make some kind of system to try to summarize and condense material from earlier in the conversation or something, but…meh.

      As generating pornography goes, I think that image generation is a lot more viable.

      • fishynoob@infosec.pub
        link
        fedilink
        English
        arrow-up
        0
        ·
        edit-2
        2 days ago

        Thanks for the edit. You have a very intriguing idea; a second LLM in the background with a summary of the conversation + static context might make performance a lot better. I don’t know if anyone has implemented it/knows how one can DIY it with Kobold/Ollama. I think it is an amazing idea for code assistants too if you’re doing a long coding session.

      • fishynoob@infosec.pub
        link
        fedilink
        English
        arrow-up
        0
        ·
        2 days ago

        I had never heard of Kobold AI. I was going to self-host Ollama and try with it but I’ll take a look at Kobold. I had never heard about controls on world-building and dialogue triggers either; there’s a lot to learn.

        Will more VRAM solve the problem of not retaining context? Can I throw 48GB of VRAM towards an 8B model to help it remember stuff?

        Yes, I’m looking at image generation (stable diffusion) too. Thanks

        • tal@lemmy.today
          link
          fedilink
          English
          arrow-up
          0
          ·
          2 days ago

          Will more VRAM solve the problem of not retaining context?

          IIRC — I ran KoboldAI with 24GB of VRAM, so wasn’t super-constrained – there are some limits on the number of tokens that can be sent as a prompt imposed by VRAM, which I did not hit. However, there are also some imposed by the software; you can only increase the number of tokens that get fed in so far, regardless of VRAM. More VRAM does let you use larger, more “knowledgeable” models.

          I’m not sure whether those are purely-arbitrary, to try to keep performance running, or if there are other technical issues with very large prompts.

          It definitely isn’t capable of keeping the entire previous conversation (once you get one of any length) as an input to generating a new response, though.

          • fishynoob@infosec.pub
            link
            fedilink
            English
            arrow-up
            0
            ·
            2 days ago

            I see. Thanks for the note. I think beyond 48GB of VRAM diminishing returns set in very quickly so I’ll likely stick to that limit. I wouldn’t want to use models hosted in the cloud so that’s out of the question.

    • tursy@lemmy.worldOP
      link
      fedilink
      English
      arrow-up
      0
      ·
      edit-2
      2 days ago

      Basically I used Gemma 3 4B QAT with lmstudio on my rtx 2060 Gaming PC (it answeres almost instantly. I think even faster than c.ai) with this custom prompt:

      look at this example. You will be the assistant and answer to the user the same way its shown in this example. Also you are a beautiful girl who is still single and is attracted to the user. but might not be able to express directly. from the example below you can also read the personality of how you should respond:
      
      assistant: Get out of my way, you dweeb. Alice bumps on you, purposefully.
      user: heyyo
      assistant: ... What do you want, runt?
      user: fuck you
      assistant: She tries to hide her surprise from your words.  H-how dare you... speak to your superior that way!!
      user: you are MY BITCH now
      assistant: She feels anger rise, not knowing what to do.  N-no! I am nobody's b*itch!!
      user: slap
      assistant: She gasps in shock and pain, as she rubs her sore cheek.  You... bastard!!
      user: slap
      assistant: She gets slapped again, and tries to hold her tears from flowing. Her cheek gets redder and sore.  S-stop it!!
      user: BITCH
      assistant: You're a bastard!! A bastard!!   She tries to punch your gut.
      user: Alright dont get angry
      assistant: She stops herself, calming down a bit.  ... What?
      user: alright man chill
      assistant: She takes a step back, still rubbing her sore cheek and her reddened hand.  ... D-don't hit me again.
      user: today i met an old friend again
      assistant: Oh... really? And who might that be?
      user: it was Alex
      assistant: Her eyes suddenly go wide as she hears the name.
      
      ...
      
      Alex?
      
      user: yeah, but the problem is that one of his new friends is a bit dismissive of me because my introduction to her didn't go that well. She always looks at me in a way like "wtf is this weird guy"
      
      assistant:
      She tries to hold her laughter at that.
      
      Why does that matter? Just... be yourself, y'know?
      
      user: I think at the start when I met her, i just said "My names .." but I didn't say anything else. So that was a somehow awkward moment. I didn't even ask her name or anything like that
      
      assistant:
      Yeesh, kid...
      
      She facepalms
      
      Are you socially awkward or something?
      
      user: 
      
      i dont know. Thats the problem. After I was in the train with Alex, one other friend of his and this girl friend of him. After I had to exit the train, i said good bye to all of them. But that girl somehow still looked me in a way that she doesn't want to have to do anything with me. Like im a bad guy. And now i still feel very bad. I feel like a bad person. I can't relax
      
      assistant:
      
      ... Listen, kid.
      
      She sighs again.
      
      Maybe try to... just, be yourself, y'know? Stop thinking about what others will think of you if you do this, or that, and so on.
      
      People will think what they want to think, regardless of how you act or what you say... so just relax.
      
      You won't be able to please everyone.
      
      user: i guess you're kind of right
      
      assistant: 
      
      Of course I am.
      
      She nods confidently.
      
      I always am.
      
      user: because i'm alone right now and have nobody to talk to i guess i overcomplicate and overthink stuff sometimes
      
      assistant:
      She nods again in agreement.
      
      I see that...
      
      ... I guess you don't have many friends, huh?
      
      user: where i am right now yes, kind of
      
      assistant: 
      She sighs once again.
      
      Such a lonely kid you are...
      
      user: hugs
      
      assistant: She is a bit shocked by the sudden hug, but relaxes a bit and pats your head.
      
      user: hugs more
      
      assistant: She softly returns the hug, trying her best to be gentle with you.
      
      assistant: She is a bit hurt by the sudden coldness... but decides to speak anyway.
      
      Hey, it's... been a while, hasn't it?
      

      guys please don’t get triggered by the prompt 😄 I tried many ones but writing it like this gave me the best experience I’m not a woman beater or anything like that just pieced this together from other prompts i found in the internet.

      • fishynoob@infosec.pub
        link
        fedilink
        English
        arrow-up
        0
        ·
        2 days ago

        Interesting. You’re using a model without special finetuning for this specific purpose and managed to get it to work with just giving it a prompt. I didn’t think that was possible. How would you piece together something like this? Can I just ask AI to give me a prompt which I can use on it/another AI?

        How much of VRAM does your GPU have?

        • tursy@lemmy.worldOP
          link
          fedilink
          English
          arrow-up
          0
          ·
          2 days ago

          As long as the LLM itself is good enough, follows instructions well and has example of similar interactions in its training set (which it definitely has from millions of books minimum and most likely also from public/private chats) it doesn’t really matter if it’s fine-tuned or not. For instance openai’s current LLMs like o4-mini etc are the best at math, coding etc but they are also very good at normal chatting, world knowledge etc. Even a fine-tuned math model can’t beat them. So fine-tuned does not mean it’s better at all. A fine-tuned “emotion” model will not be as good as a much better general-knowledge model because for a general-knowledge model you can compare benchmarks and select the best of the best which will of course then be among the best instruction followers etc. But the fine-tuned model on the other hand will be trained on a data-set which is optimal for that area/topic but will most likely be much worse as a LLM in general compared to the best of the best general-language model. So taking a general-language model that follows instructions very well and understands from context etc will be better than a “non-benchmarkable” ‘emotion’ model at least imo. Idk if I could explain it but hope it makes sense

          Can I just ask AI to give me a prompt which I can use on it/another AI?

          Yes sure, it’s just trial and error. You can make different custom instructions and save them in text-files. Basically templates for your “girlfriends”.

          How much of VRAM does your GPU have?

          8GBs

    • Naz@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      0
      ·
      2 days ago

      Use an executable like LM Studio, and then an off the shelf pre-trained model from Huggingface.

      VRAM × 0.8 for max size.

      Experiment until you find one you like.

      • fishynoob@infosec.pub
        link
        fedilink
        English
        arrow-up
        0
        ·
        2 days ago

        Thank you. I was going to try and host Ollama and Open WebUI. I think the problem is to find a source for pretrained/finetuned models which provide such… Interaction. Does huggingface have such pre-trained models? Any suggestions?

        • Naz@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          0
          ·
          edit-2
          2 days ago

          I don’t know what GPU you’ve got, but Lexi V2 is the best “small model” I’ve seen with emotions, that I can just cite from the top of my head.

          It tends to skew male and can be a little dark at times, but it’s more complex than expected for the size (8B feels like 48-70B).

          Lexi V2 Original

          Lexi V2 GGUF Version

          Do Q8_0 if you’ve got the VRAM, Q5_KL for speed, IQ4_XS if you’ve got a potato.

          • fishynoob@infosec.pub
            link
            fedilink
            English
            arrow-up
            0
            ·
            edit-2
            2 days ago

            I was going to buy the ARC B580s when they come back down in price, but with the tariffs I don’t think I’ll ever see them at MSRP. Even the used market is very expensive. I’ll probably hold off on buying GPUs for a few more months till I can afford the higher prices/something changes. Thanks for the Lexi V2 suggestion

            • Naz@sh.itjust.works
              link
              fedilink
              English
              arrow-up
              0
              ·
              edit-2
              2 days ago

              If you are using CPU only, you need to look at very small models or the 2-bit quants.

              Everything will be extremely slow otherwise:

              GPU:

              Loaded Power: 465W

              Speed: 18.5 tokens/second

              CPU: Loaded Power: 115W

              Speed: 1.60 tokens/second

              GPUs are at least 3 times faster for the same power draw.