NodeBB › Gaming

Epic’s AI Darth Vader tech is about to be all over Fortnite

29 Posts · 11 Posters
This topic has been deleted. Only users with topic management privileges can see it.
#1 [email protected]:
This post did not contain any content.
    • R [email protected]
      This post did not contain any content.
      theangriestbird@beehaw.orgT This user is from outside of this forum
      theangriestbird@beehaw.orgT This user is from outside of this forum
      [email protected]
      wrote last edited by
      #2

Like, honestly, on paper this sounds so cool. If you told me in 2015 that in 10 years' time I would be able to play online games with NPC allies that chat with me kind of like real people, I would be super excited about that (and maybe just a little unsettled).

      Of course, in practice, I hate this. I don't care how cool the tech is, the energy cost of running this tech for even half of Fortnite's active daily users on a daily basis must be eyewatering. No LLM tech in videogames is worth cooking the planet over. And we all know that the tech companies want utility customers to help foot the bill for these moronic uses of energy, whether we like it or not.

      So ultimately: fuck Epic and anyone else trying to use this LLM tech for anything other than life-or-death situations.

#3 [email protected], replying to #2:

I'm with you on the whole "let's not cook the planet for this" thing.

        It'd be nice if instead we had more focus on integrating lightweight and focused models that can run on local hardware. I think that will happen eventually.

#4 [email protected], replying to #2:

I disagree; the shareholders have never been happier. Who matters more to Epic Games, people or shareholders?

#5 [email protected], replying to #2:

I'm going to bet your video card uses more energy than the AI does while you play the game.

            • M [email protected]

              In going to bet your video card uses more energy than the AI while you play the game.

              theangriestbird@beehaw.orgT This user is from outside of this forum
              theangriestbird@beehaw.orgT This user is from outside of this forum
              [email protected]
              wrote last edited by [email protected]
              #6

That would be a safe bet, given that none of these AI companies disclose their actual energy usage; you would never have to pay out that bet, because we would never find out if you were right.

              What we do know is that generating a single text response on the largest open source AI models takes about 6,500 joules, if you don't include the exorbitant energy cost of training the model. We know that most of the closed source models are way more complicated, so let's say they take 3 times the cost to generate a response. That's 19,500 joules. Generating an AI voice to speak the lines increases that energy cost exponentially. MIT found that generating a grainy, five-second video at 8 frames per second on an open source model took about 109,000 joules.

My 3080 Ti is rated at 350 W. If I played a single half-hour match of Fortnite, my GPU would use about 630,000 joules (and that assumes my GPU runs at max capacity the entire time, which never happens). Epic's AI voice model is pretty high quality, so let's estimate the cost of a single AI voice response at about 100,000 joules, similar to the low-quality video generation mentioned above. If these estimates are close, then asking Fortnite Darth Vader just 7 questions costs more energy than my GPU uses playing the game on max settings.
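
The back-of-envelope math above can be written out directly. Note that only the GPU wattage is a spec-sheet figure; the 100,000 J per voice line is this thread's guess, not a measured number:

```python
import math

GPU_WATTS = 350               # 3080 Ti board power, assumed pinned at full load
MATCH_SECONDS = 30 * 60       # one half-hour Fortnite match

gpu_energy_j = GPU_WATTS * MATCH_SECONDS   # watts x seconds = joules
print(gpu_energy_j)                        # 630000

VOICE_RESPONSE_J = 100_000    # thread's estimate, near low-quality video gen

# Questions to Vader needed to exceed the GPU's per-match energy:
print(math.ceil(gpu_energy_j / VOICE_RESPONSE_J))   # 7
```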

              • A [email protected]

                I with you on the whole let's not cook the planet for this.

                It'd be nice if instead we had more focus on integrating lightweight and focused models that can run on local hardware. I think that will happen eventually.

                theangriestbird@beehaw.orgT This user is from outside of this forum
                theangriestbird@beehaw.orgT This user is from outside of this forum
                [email protected]
                wrote last edited by
                #7

Yeah, if OpenAI and the rest were talking today about making the models more efficient rather than focusing on making them more accurate, I would be way less of a Luddite about this. If our energy grids were mostly made up of clean sources of energy, I would be way less of a Luddite about this. But neither of those things is true, so I remain a Luddite about AI.

#8 [email protected], replying to #6:

                  We know that most of the closed source models are way more complicated, so let's say they take 3 times the cost to generate a response.

This is completely arbitrary, pure supposition. Is it 3x a "regular" response? I have no idea. How do you even arrive at that guess? Is a more complex prompt exponentially more expensive? Linearly? Logarithmically? And how complex are we talking, when system prompts themselves can be 10k tokens?

                  Generating an AI voice to speak the lines increases that energy cost exponentially. MIT found that generating a grainy, five-second video at 8 frames per second on an open source model took about 109,000 joules

Why did you go from voice gen to video gen? I don't know whether video gen takes more joules or not, but there's no actual connection here. You just decided that a line of audio gen is equivalent to 40 frames of video. What if they generate the text and then use a conventional voice synthesizer? And what does that have to do with video gen?

                  If these estimates are close

                  Who even knows, mate? You've been completely fucking arbitrary and, shocker, your analysis supports your supposition, kinda. How many Vader lines are you going to get in 30 minutes? When it's brand new probably a lot, but after the luster wears off?

                  I'm not even telling you you're wrong, just that your methodology here is complete fucking bullshit.

It could be as low as 6,500 joules (based on your link), which changes the calculus to roughly 97 lines per half hour. Is it that low? Probably not, but that is every bit as valid as your math, and I'm even using your numbers without double-checking.

At the end of the day, maybe I lose the bet. Fair. I've been wondering for a while how they actually stack up, and I'm willing to be shown. But I suspect that using it for piddly day-to-day stuff is a drop in the bucket compared to all the mass corporate spam. I'm aware that's nothing but a hypothesis, and I'm willing to be proven wrong. But not based on this.
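
For what it's worth, the break-even count under each per-line figure floated in this thread can be checked directly (both figures are guesses, as discussed; the GPU energy assumes full load for a half hour):

```python
GPU_ENERGY_J = 350 * 30 * 60   # 630,000 J for a half-hour match at full load

# Lines of Vader dialogue before the AI matches the GPU, per estimate:
for label, per_line_j in [("low bound (6,500 J text figure)", 6_500),
                          ("high guess (100,000 J, video-like)", 100_000)]:
    print(label, round(GPU_ENERGY_J / per_line_j, 1))
# low bound: ~96.9 lines; high guess: ~6.3 lines
```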

                  • A [email protected]

                    I with you on the whole let's not cook the planet for this.

                    It'd be nice if instead we had more focus on integrating lightweight and focused models that can run on local hardware. I think that will happen eventually.

                    B This user is from outside of this forum
                    B This user is from outside of this forum
                    [email protected]
                    wrote last edited by
                    #9

                    I’ve been relatively impressed that the image/emoji generation on my iPhone has all been done on my device. Every time I’ve used it, I’ve checked the “Apple Intelligence” server requests log and it’s always empty (until the one time I asked it to generate a photo memory).

My phone gets pretty hot instantly and then churns through 3% of the battery in a minute, but it's still running on a local system with the local battery.

                    • M [email protected]

                      We know that most of the closed source models are way more complicated, so let's say they take 3 times the cost to generate a response.

                      This is completely arbitrary and supposition. Is it 3x "regular" response? I have no idea. How do you even arrive at that guess? Is a more complex prompt exponential more expensive? Linearly? Logarithmically? And how complex are we talking when system prompts themselves can be 10k tokens?

                      Generating an AI voice to speak the lines increases that energy cost exponentially. MIT found that generating a grainy, five-second video at 8 frames per second on an open source model took about 109,000 joules

                      Why did you go from voice gen to video gen? I mean I don't know whether video gen takes more joules or not but there's no actual connection here. You just decided that a line of audio gen is equivalent to 40 frames of video. What if they generate the text and then use conventional voice synthesizers? And what does that have to do with video gen?

                      If these estimates are close

                      Who even knows, mate? You've been completely fucking arbitrary and, shocker, your analysis supports your supposition, kinda. How many Vader lines are you going to get in 30 minutes? When it's brand new probably a lot, but after the luster wears off?

                      I'm not even telling you you're wrong, just that your methodology here is complete fucking bullshit.

                      It could be as low as 6500 joules (based on your link) which changes the calculus to 60 lines per half hour. Is it that low? Probably not, but that is every bit as valid as your math and I'm even using your numbers without double checking.

                      At the end of the day maybe I lose the bet. Fair. I've been wondering for a bit how they actually stack up, and I'm willing to be shown. But I suspect using it for piddly shit day to day is a drop in the bucket compared to all the mass corporate spam. But I'm aware it's nothing but a hypothesis and I'm willing to be proven wrong. But not based on this.

                      theangriestbird@beehaw.orgT This user is from outside of this forum
                      theangriestbird@beehaw.orgT This user is from outside of this forum
                      [email protected]
                      wrote last edited by [email protected]
                      #10

                      This is completely arbitrary and supposition

It is; that's the point. We don't know, because the AI companies are intentionally hiding that detail. My estimates are based on the real numbers we do have, and all we know about the closed-source models is that they contain more parameters than the open-source models, and more parameters = more energy use.

When I started adding multipliers to take a stab at the numbers, I was being conservative. A single AI voice response definitely takes more than 6,500 joules; we just don't know how much more. It's not much of a stretch to assume the energy cost of a voice generation falls somewhere between that of a text generation and that of a video generation.

If my numbers were accurate, that would actually be great news for the AI companies. They would be shouting these numbers from the fucking rooftops to get the stink of this energy-usage story off their backs. Corporations never disclose anything unless it is good news. Their silence says everything. If we were actually betting, I would gladly bet that my single video card uses way less energy than their data centers packed to the brim with higher-end GPUs. It's just a no-brainer.

#11 [email protected], replying to #10:

                        What I said was I'll bet one person uses more power running the game than the AI uses to respond to them. Just that.

                        Then you started inventing scenarios and moving goalposts to comparing one single video card to an entire data center. I guess because you didn't want to let my statement go unchallenged, but you had nothing solid to back you up. You're the one that posted 6500 joules, which you supported, and I appreciate that, but after that it's all just supposition and guesses.

                        You're right that it's almost certainly higher than that. But I can generate text and images on my home PC. Not at the quality and speed of OpenAI or whatever they have on the back-end, but it can be done on my 1660. So my suggestion that running a 3D game consumes more power than generating a few lines seems pretty reasonable.

                        But I know someone who works for a company that has an A100 used for serving AI. I'll ask and see if he has more information or even a better-educated guess than I do, and if I find out I'm wrong, I won't suggest otherwise in the future.

                        • M [email protected]

                          What I said was I'll bet one person uses more power running the game than the AI uses to respond to them. Just that.

                          Then you started inventing scenarios and moving goalposts to comparing one single video card to an entire data center. I guess because you didn't want to let my statement go unchallenged, but you had nothing solid to back you up. You're the one that posted 6500 joules, which you supported, and I appreciate that, but after that it's all just supposition and guesses.

                          You're right that it's almost certainly higher than that. But I can generate text and images on my home PC. Not at the quality and speed of OpenAI or whatever they have on the back-end, but it can be done on my 1660. So my suggestion that running a 3D game consumes more power than generating a few lines seems pretty reasonable.

                          But I know someone who works for a company that has an A100 used for serving AI. I'll ask and see if he has more information or even a better-educated guess than I do, and if I find out I'm wrong, I won't suggest otherwise in the future.

                          theangriestbird@beehaw.orgT This user is from outside of this forum
                          theangriestbird@beehaw.orgT This user is from outside of this forum
                          [email protected]
                          wrote last edited by
                          #12

I appreciate that you're engaging deeply. I don't really have anything else to say that I haven't said already; I just wanted to show my respect.

#13 [email protected], replying to #6:

                            Generating an AI voice to speak the lines increases that energy cost exponentially.

TTS models are tiny in comparison to LLMs, so how does this track? The biggest I could find was Orpheus-TTS, which comes in 3B/1B/400M/150M parameter sizes. And they are not using a 600-billion-parameter LLM to generate the text for Vader's responses; that is likely way too big. Next to generating the text, speech isn't even a drop in the bucket.

                            You need to include parameter counts in your calculations. A lot of these assumptions are so wrong it borders on misinformation.
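
To see why parameter count dominates, here's a crude FLOPs-per-token sketch. The token counts and the 600B size are illustrative assumptions, not known figures for Epic's setup:

```python
def response_flops(params: float, tokens: int) -> float:
    """Rough decoder inference cost: ~2 FLOPs per parameter per
    generated token (ignores attention and KV-cache overhead)."""
    return 2.0 * params * tokens

llm_flops = response_flops(600e9, 100)  # hypothetical 600B LLM, 100-token reply
tts_flops = response_flops(3e9, 500)    # 3B Orpheus-class TTS, 500 audio tokens

# Even giving the TTS 5x more tokens, the text pass dwarfs the speech pass:
print(llm_flops / tts_flops)   # 40.0
```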

                            • E [email protected]

                              Generating an AI voice to speak the lines increases that energy cost exponentially.

                              TTS models are tiny in comparison to LLMs. How does this track? The biggest I could find was Orpheus-TTS that comes in 3B/1B/400M/150M parameter sizes. And they are not using a 600 billion parameter LLM to generate the text for Vader's responses, that is likely way too big. After generating the text, speech isn't even a drop in the bucket.

                              You need to include parameter counts in your calculations. A lot of these assumptions are so wrong it borders on misinformation.

                              theangriestbird@beehaw.orgT This user is from outside of this forum
                              theangriestbird@beehaw.orgT This user is from outside of this forum
                              [email protected]
                              wrote last edited by
                              #14

I will repeat what I said in another reply: if the cost of running these closed-source AI models were as negligible as you're suggesting, these companies would be screaming it from the rooftops to get the stink of this energy-usage story off their backs. AI right now is all investors and hype, which makes the industry extra vulnerable to negative stories. By staying silent, the AI companies are letting people like me make wild guesses at the numbers and possibly fear-monger with misinformation. They could shut up all the naysayers by simply releasing their numbers. The fact that they stay silent despite all the negative press suggests the energy usage is far worse than anyone is estimating.

#15 [email protected], replying to #14:

That doesn't mean you can misrepresent facts like this, though. The line I quoted is misinformation, and you don't know what you're talking about. I'm not trying to sound so aggressive, but it's the only way I can phrase it.

                                • M [email protected]

                                  What I said was I'll bet one person uses more power running the game than the AI uses to respond to them. Just that.

                                  Then you started inventing scenarios and moving goalposts to comparing one single video card to an entire data center. I guess because you didn't want to let my statement go unchallenged, but you had nothing solid to back you up. You're the one that posted 6500 joules, which you supported, and I appreciate that, but after that it's all just supposition and guesses.

                                  You're right that it's almost certainly higher than that. But I can generate text and images on my home PC. Not at the quality and speed of OpenAI or whatever they have on the back-end, but it can be done on my 1660. So my suggestion that running a 3D game consumes more power than generating a few lines seems pretty reasonable.

                                  But I know someone who works for a company that has an A100 used for serving AI. I'll ask and see if he has more information or even a better-educated guess than I do, and if I find out I'm wrong, I won't suggest otherwise in the future.

                                  V This user is from outside of this forum
                                  V This user is from outside of this forum
                                  [email protected]
                                  wrote last edited by
                                  #16

Other than the obviously missing numbers, this also ignores scale. Sure, one response likely takes less energy than playing the game, but Fortnite averaged 1.4 million daily players over the last year. Granted, not all of them are going to interact with the bot, but a whole hell of a lot are, and they'll do it multiple times in a row.
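
An order-of-magnitude sketch of how that scale multiplies out, using the 1.4M daily-player figure above. The interaction fraction, responses per player, and per-response energy are pure assumptions, chosen only to show the arithmetic:

```python
DAILY_PLAYERS = 1_400_000
INTERACT_FRACTION = 0.25        # assume a quarter of players try the bot
RESPONSES_PER_PLAYER = 10       # assume ten lines each
RESPONSE_J = 6_500              # the thread's low-bound text-generation figure

daily_j = DAILY_PLAYERS * INTERACT_FRACTION * RESPONSES_PER_PLAYER * RESPONSE_J
print(round(daily_j / 3.6e6))   # joules -> kWh: ~6319 kWh/day, under these assumptions
```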

#17 [email protected], replying to #7:

Actually, you might be surprised about the clean-energy thing! Places like east-coast Canada and France are already ~99% clean energy, and even Texas is about 50% of the way there.

                                    • V [email protected]

                                      Other than the obvious missing numbers, this is also missing scale. Sure, one response likely takes less energy than playing the game, but Fortnite averaged 1.4 million daily players the last year. Granted not all of them are going to interact with the bot, but a whole hell of a lot are going to, and do it multiple times in a row.

                                      T This user is from outside of this forum
                                      T This user is from outside of this forum
                                      [email protected]
                                      wrote last edited by [email protected]
                                      #18

                                      Yes, but by definition all of them are also playing the game, and given that this is mostly a novelty feature (and based on how shockingly little the user-facing chatbots I've seen in professional settings actually get used), I personally doubt that the chatbot's energy usage will top the game's.

                                      My guess is there will be 90% of people who use the feature once or twice before ignoring it forever, 9% who will use it occasionally for e.g. video creation purposes, and 1% or less who will actually sit there and use it a bunch just to talk to. That would about match up with ChatGPT's general usage trends.
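                                      A rough back-of-envelope sketch of that comparison, where every number is an illustrative assumption (per-response energy, GPU draw, playtime, and usage rates are guesses, not sourced figures), only the 1.4 million player count comes from the thread:

                                      ```python
                                      # Back-of-envelope daily energy comparison. All constants below
                                      # except PLAYERS are assumptions for illustration, not sourced data.
                                      PLAYERS = 1_400_000          # average daily players (from the thread)
                                      GPU_WATTS = 150              # assumed average client GPU draw while playing
                                      HOURS_PER_SESSION = 1.0      # assumed average daily playtime, hours

                                      WH_PER_LLM_RESPONSE = 3.0    # assumed energy per chatbot response, Wh
                                      RESPONSES_PER_USER = 20      # assumed exchanges per interacting player
                                      INTERACTING_SHARE = 0.10     # assumed fraction of players using the bot daily

                                      # Energy to render the game for everyone vs. energy to answer the bot users.
                                      game_wh = PLAYERS * GPU_WATTS * HOURS_PER_SESSION
                                      bot_wh = PLAYERS * INTERACTING_SHARE * RESPONSES_PER_USER * WH_PER_LLM_RESPONSE

                                      print(f"game: {game_wh / 1e6:.1f} MWh/day")
                                      print(f"bot:  {bot_wh / 1e6:.1f} MWh/day")
                                      print(f"bot/game ratio: {bot_wh / game_wh:.3f}")
                                      ```

                                      Under these made-up inputs the bot lands at a few percent of the game's own energy budget, but the ratio swings hard with the assumed usage share and per-response cost, which is exactly why the missing real numbers matter.
                                      
                                      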

                                      V 1 Reply Last reply
                                      0
                                      • V [email protected]

                                        Other than the obvious missing numbers, this is also missing scale. Sure, one response likely takes less energy than playing the game, but Fortnite averaged 1.4 million daily players over the last year. Granted, not all of them are going to interact with the bot, but a whole hell of a lot are going to, and do it multiple times in a row.

                                        M This user is from outside of this forum
                                        M This user is from outside of this forum
                                        [email protected]
                                        wrote last edited by
                                        #19

                                        And they used 1.4 million video cards. The scale is a wash. And yes, when it's brand new folks are going to sit there for a bit appreciating how cool it is to talk to Darth Vader. And then he's going to say some stupid out-of-character stuff, and the novelty is going to wear off, and the AI usage is going to go down, but the video card usage will stay the same.

                                        V 1 Reply Last reply
                                        1
                                        • M [email protected]

                                          And they used 1.4 million video cards. The scale is a wash. And yes, when it's brand new folks are going to sit there for a bit appreciating how cool it is to talk to Darth Vader. And then he's going to say some stupid out-of-character stuff, and the novelty is going to wear off, and the AI usage is going to go down, but the video card usage will stay the same.

                                          V This user is from outside of this forum
                                          V This user is from outside of this forum
                                          [email protected]
                                          wrote last edited by
                                          #20

                                          Sure, that is true specifically for Vader, but the whole point of the article is that they are rolling out several new bots to keep the novelty up.

                                          M 1 Reply Last reply
                                          1