• El Barto@lemmy.world

          May I ask what hardware you use to run this application? The community’s about section mentioned that the pinned post had some info, but I couldn’t find it…

            • El Barto@lemmy.world

              Get the hell out, for reals?!

              I always thought that this kind of application would require a beefy desktop computer with powerful GPUs.

              Are these images generated offline?

              • planish382@aiparadise.moe

                I think it’s that you need to be able to throw parallel processing at a lot of RAM. If you want to do that on a PC, you need a GPU with a lot of VRAM, and that much VRAM only really comes on the beefy cards. You can’t buy a midrange GPU and duct-tape DIMMs onto it.

                The Apple Silicon architecture has an OK GPU in it, but because of how it’s integrated with the CPU, all the RAM in the system is GPU RAM. So Apple Silicon Macs can really punch above their weight for AI applications, because they can use a lot more RAM.
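
                To make that concrete, here’s a minimal sketch of generating an image on the Apple GPU through PyTorch’s “mps” backend, assuming the Hugging Face diffusers library is installed (the checkpoint name and prompt are just placeholders):

                ```python
                import torch
                from diffusers import StableDiffusionPipeline

                # On Apple Silicon the integrated GPU is exposed through the "mps" backend,
                # and the model weights live in the same unified memory the CPU uses.
                device = "mps" if torch.backends.mps.is_available() else "cpu"

                pipe = StableDiffusionPipeline.from_pretrained(
                    "runwayml/stable-diffusion-v1-5",  # placeholder checkpoint
                    torch_dtype=torch.float16,
                )
                pipe = pipe.to(device)
                pipe.enable_attention_slicing()  # commonly recommended on MPS to keep memory spikes down

                image = pipe("a lighthouse at sunset", num_inference_steps=30).images[0]
                image.save("out.png")
                ```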

              • ChipthensfwMonk@lemmynsfw.com (OP)

                It’s a ridiculously powerful machine. Running AI stuff caused the fan to spin up for the first time. It destroys everything else.

                • El Barto@lemmy.world

                  May I ask what models you use to generate these? I got DiffusionBee, and while the two models it downloaded by default are impressive on their own (I mean… text to images? Magic!), the results are nowhere near as good as your images.

                  • ChipthensfwMonk@lemmynsfw.com (OP)

                    It’s a merge of Realistic Vision 4, LazyMix+, and URPM, with the weighting towards Realistic Vision 4. There are probably better merges and maybe even better models to use, but these have helped me generate some realistic stuff. I also do a fair amount of experimenting with LoRAs and ControlNets, along with dynamic prompts to test variables (prompt components or parts).
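
                    For anyone curious what the weighting means mechanically, a merge like this is basically a weighted sum of the checkpoints’ tensors (a three-way merge just repeats the same step). A rough sketch of the idea, with made-up file names; tools like the AUTOMATIC1111 checkpoint merger do this for you:

                    ```python
                    import torch

                    # Hypothetical local checkpoint files; a real merge would use your own paths.
                    ckpt_a = torch.load("realistic_vision_4.ckpt", map_location="cpu")["state_dict"]
                    ckpt_b = torch.load("lazymix_plus.ckpt", map_location="cpu")["state_dict"]

                    alpha = 0.7  # weighting towards model A (here, Realistic Vision 4)

                    merged = {}
                    for key, tensor_a in ckpt_a.items():
                        tensor_b = ckpt_b.get(key)
                        if torch.is_tensor(tensor_a) and torch.is_tensor(tensor_b) and tensor_a.shape == tensor_b.shape:
                            merged[key] = alpha * tensor_a + (1 - alpha) * tensor_b
                        else:
                            merged[key] = tensor_a  # fall back to model A where the keys don't line up

                    torch.save({"state_dict": merged}, "merged_model.ckpt")
                    ```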

              • HoldingMyDick@lemmynsfw.com

                M2 is a damn good chip, and this is coming from someone who has never bought and probably never will buy an Apple product.

                Yes, they’re generated offline.

    • imPastaSyndrome@lemm.ee

      What did you run to generate these? They turned out pretty coherent if not fantastic.

      Model, sampler and postprocessing?

      • ChipthensfwMonk@lemmynsfw.com (OP)

        These took a while to get right. The model is a blend of Realistic Vision 4, LazyMix+, and URPM. The top three were done with a ControlNet (I can’t remember if it was Canny or Depth) to aid the pose, and the face was improved with inpainting. The bottom one uses the “openblouse” LoRA, I think. I’m finding that DPM++ SDE Karras produces the best results, but I haven’t tried them all systematically to prove that.
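
        For anyone who wants to try a similar setup outside a web UI, here’s a rough sketch of the ControlNet-plus-sampler part using the Hugging Face diffusers library. The base model path, reference image, and prompt are placeholders, and the scheduler settings only approximate the “DPM++ SDE Karras” sampler, so treat it as a starting point rather than the exact recipe:

        ```python
        import cv2
        import numpy as np
        import torch
        from PIL import Image
        from diffusers import (
            ControlNetModel,
            DPMSolverMultistepScheduler,
            StableDiffusionControlNetPipeline,
        )

        device = "mps" if torch.backends.mps.is_available() else "cpu"

        # Turn a reference photo into a Canny edge map that guides the pose.
        reference = np.array(Image.open("pose_reference.png").convert("RGB"))  # placeholder image
        edges = cv2.Canny(reference, 100, 200)
        control_image = Image.fromarray(np.stack([edges] * 3, axis=-1))

        controlnet = ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-canny")
        pipe = StableDiffusionControlNetPipeline.from_pretrained(
            "path/to/merged-model",  # placeholder; the merged checkpoint in diffusers format
            controlnet=controlnet,
        ).to(device)

        # Approximates the "DPM++ SDE Karras" sampler setting from A1111-style UIs.
        pipe.scheduler = DPMSolverMultistepScheduler.from_config(
            pipe.scheduler.config,
            algorithm_type="sde-dpmsolver++",
            use_karras_sigmas=True,
        )

        image = pipe(
            "portrait photo, natural light",  # placeholder prompt
            image=control_image,
            num_inference_steps=30,
        ).images[0]
        image.save("controlled.png")
        ```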

        • imPastaSyndrome@lemm.ee

          Wow! Cool! Seems like quite a bit more work than I’ve put into this type of thing so far. Excited to learn more!