Epic’s AI Darth Vader tech is about to be all over Fortnite
-
What I said was I'll bet one person uses more power running the game than the AI uses to respond to them. Just that.
Then you started inventing scenarios and moving the goalposts to comparing one single video card to an entire data center. I guess because you didn't want to let my statement go unchallenged, but you had nothing solid to back you up. You're the one that posted 6,500 joules, which you supported, and I appreciate that, but after that it's all just supposition and guesses.
You're right that it's almost certainly higher than that. But I can generate text and images on my home PC. Not at the quality and speed of OpenAI or whatever they have on the back-end, but it can be done on my 1660. So my suggestion that running a 3D game consumes more power than generating a few lines seems pretty reasonable.
But I know someone who works for a company that has an A100 used for serving AI. I'll ask and see if he has more information or even a better-educated guess than I do, and if I find out I'm wrong, I won't suggest otherwise in the future.
I appreciate that you are engaging deeply. I don't really have anything else to say that I haven't said already, but just wanted to show my respect there.
-
That would be a safe bet, given that none of these AI companies disclose their actual energy usage: you would never have to pay out that bet, because we would never find out if you were right.
What we do know is that generating a single text response on the largest open source AI models takes about 6,500 joules, if you don't include the exorbitant energy cost of training the model. We know that most of the closed source models are way more complicated, so let's say they take 3 times the cost to generate a response. That's 19,500 joules. Generating an AI voice to speak the lines increases that energy cost exponentially. MIT found that generating a grainy, five-second video at 8 frames per second on an open source model took about 109,000 joules.
My 3080 Ti is 350W - if I played a single half-hour match of Fortnite, my GPU would use about 630,000 joules (and that's assuming my GPU is running at max capacity the entire time, which never happens). Epic's AI voice model is pretty high quality, so let's estimate that the cost of a single AI voice response is about 100,000 joules, similar to the low-quality video generation mentioned above. If these estimates are close, this would mean that if I ask Fortnite Darth Vader just 7 questions, the AI has used more energy than my GPU does while playing the game on max settings.
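If anyone wants to sanity-check the arithmetic, here it is as a quick Python sketch. Every constant below is one of the estimates above, not a measured number:

```python
# Back-of-envelope only: all figures are the estimates discussed above.
J_TEXT_OPEN = 6_500              # one text response, large open-source model
J_TEXT_CLOSED = 3 * J_TEXT_OPEN  # guess: closed models ~3x the cost -> 19,500 J
J_VOICE = 100_000                # guess: one voiced line, near low-end video gen

GPU_WATTS = 350                  # 3080 Ti board power
MATCH_SECONDS = 30 * 60          # one half-hour match
j_match = GPU_WATTS * MATCH_SECONDS  # 350 W * 1,800 s = 630,000 J

print(j_match / J_VOICE)  # ~6.3 -> about 7 voiced responses per match
```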
-
Generating an AI voice to speak the lines increases that energy cost exponentially.
TTS models are tiny in comparison to LLMs. How does this track? The biggest I could find was Orpheus-TTS, which comes in 3B/1B/400M/150M parameter sizes. And they are not using a 600-billion-parameter LLM to generate the text for Vader's responses; that is likely way too big. After generating the text, speech isn't even a drop in the bucket.
You need to include parameter counts in your calculations. A lot of these assumptions are so wrong it borders on misinformation.
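For a rough sense of scale, here's a crude compute-based comparison using the usual rule of thumb of ~2 FLOPs per parameter per generated token. The TTS sizes are the Orpheus ones above; the assumed 70B LLM and the token counts are illustrative guesses, not anything Epic has disclosed:

```python
# Crude relative cost on the same hardware: energy scales roughly with
# params * tokens (~2 FLOPs per parameter per generated token).
def rel_cost(params: float, tokens: int) -> float:
    return 2 * params * tokens

llm = rel_cost(70e9, 200)           # assumed: a 70B LLM writing a ~200-token line
tts_big = rel_cost(3e9, 1_000)      # assumed: Orpheus-3B emitting ~1,000 audio tokens
tts_small = rel_cost(400e6, 1_000)  # the 400M variant, same line

print(tts_big / llm)    # ~0.21 -- already a fraction of the text step
print(tts_small / llm)  # ~0.03 -- barely registers
```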
-
Generating an AI voice to speak the lines increases that energy cost exponentially.
TTS models are tiny in comparison to LLMs. How does this track? The biggest I could find was Orpheus-TTS, which comes in 3B/1B/400M/150M parameter sizes. And they are not using a 600-billion-parameter LLM to generate the text for Vader's responses; that is likely way too big. After generating the text, speech isn't even a drop in the bucket.
You need to include parameter counts in your calculations. A lot of these assumptions are so wrong it borders on misinformation.
I will repeat what I said in another reply below: if the cost of running these closed source AI models was as negligible as you are suggesting, then these companies would be screaming it from the rooftops to get the stink of this energy usage story off their backs. AI is all investors and hype right now, which means the industry is extra vulnerable to negative stories. By staying silent, the AI companies are allowing people like me to make wild guesses at the numbers and possibly fear-monger with misinformation. They could shut up all the naysayers by simply releasing their numbers. The fact that they are still staying silent despite all the negative press suggests that the energy usage numbers are far worse than anyone is estimating.
-
I will repeat what I said in another reply below: if the cost of running these closed source AI models was as negligible as you are suggesting, then these companies would be screaming it from the rooftops to get the stink of this energy usage story off their backs. AI is all investors and hype right now, which means the industry is extra vulnerable to negative stories. By staying silent, the AI companies are allowing people like me to make wild guesses at the numbers and possibly fear-monger with misinformation. They could shut up all the naysayers by simply releasing their numbers. The fact that they are still staying silent despite all the negative press suggests that the energy usage numbers are far worse than anyone is estimating.
This doesn't mean you can misrepresent facts like this though. The line I quoted is misinformation, and you don't know what you're talking about. I'm not trying to sound so aggressive, but it's the only way I can phrase it.
-
What I said was I'll bet one person uses more power running the game than the AI uses to respond to them. Just that.
Then you started inventing scenarios and moving the goalposts to comparing one single video card to an entire data center. I guess because you didn't want to let my statement go unchallenged, but you had nothing solid to back you up. You're the one that posted 6,500 joules, which you supported, and I appreciate that, but after that it's all just supposition and guesses.
You're right that it's almost certainly higher than that. But I can generate text and images on my home PC. Not at the quality and speed of OpenAI or whatever they have on the back-end, but it can be done on my 1660. So my suggestion that running a 3D game consumes more power than generating a few lines seems pretty reasonable.
But I know someone who works for a company that has an A100 used for serving AI. I'll ask and see if he has more information or even a better-educated guess than I do, and if I find out I'm wrong, I won't suggest otherwise in the future.
Other than the obvious missing numbers, this is also missing scale. Sure, one response likely takes less energy than playing the game, but Fortnite averaged 1.4 million daily players over the last year. Granted, not all of them are going to interact with the bot, but a whole hell of a lot are going to, and do it multiple times in a row.
-
Yeah, if OpenAI and the rest were talking today about making the models more efficient, rather than focusing on making them more accurate, I would be way less of a luddite about this. If our energy grids were mostly made up of clean sources of energy, I would be way less of a luddite about this. But neither of these things is true, so I remain a luddite about AI.
Actually, you might be surprised about the clean energy thing! Places like East Coast Canada and France are already ~99% clean energy, and even Texas is like 50% there.
-
Other than the obvious missing numbers, this is also missing scale. Sure, one response likely takes less energy than playing the game, but Fortnite averaged 1.4 million daily players over the last year. Granted, not all of them are going to interact with the bot, but a whole hell of a lot are going to, and do it multiple times in a row.
Yes, but by definition all of them are also playing the game, and given that this is mostly a novelty feature (and also based on how shockingly little the user-facing chatbots I've seen in professional settings actually get used), I personally doubt that the chatbot energy usage will top the game's.
My guess is there will be 90% of people who use the feature once or twice before ignoring it forever, 9% who will use it occasionally for e.g. video creation purposes, and 1% or less who will actually sit there and use it a bunch just to talk to. That would about match up with ChatGPT's general usage trends.
-
Other than the obvious missing numbers, this is also missing scale. Sure, one response likely takes less energy than playing the game, but Fortnite averaged 1.4 million daily players over the last year. Granted, not all of them are going to interact with the bot, but a whole hell of a lot are going to, and do it multiple times in a row.
And they used 1.4 million video cards. The scale is a wash. And yes, when it's brand new folks are going to sit there for a bit appreciating how cool it is to talk to Darth Vader. And then he's going to say some stupid out-of-character stuff, and the novelty is going to wear off, and the AI usage is going to go down, but the video card usage will stay the same.
-
And they used 1.4 million video cards. The scale is a wash. And yes, when it's brand new folks are going to sit there for a bit appreciating how cool it is to talk to Darth Vader. And then he's going to say some stupid out-of-character stuff, and the novelty is going to wear off, and the AI usage is going to go down, but the video card usage will stay the same.
Sure, that is true specifically for Vader, but the whole point of the article is that they are rolling out several new bots to keep the novelty up.
-
Yes, but by definition all of them are also playing the game, and given that this is mostly a novelty feature (and also based on how shockingly little the user-facing chatbots I've seen in professional settings actually get used), I personally doubt that the chatbot energy usage will top the game's.
My guess is there will be 90% of people who use the feature once or twice before ignoring it forever, 9% who will use it occasionally for e.g. video creation purposes, and 1% or less who will actually sit there and use it a bunch just to talk to. That would about match up with ChatGPT's general usage trends.
For sure that is true for Vader, but the point of the article is that they are rolling out several new bots to keep the novelty up.
-
Sure, that is true specifically for Vader, but the whole point of the article is that they are rolling out several new bots to keep the novelty up.
Okay. So, your position is that 6-year-olds are going to join Fortnite to spam the funny-man-speak button, and because of that AI energy usage will be higher? Okay. Maybe. I'd argue the novelty of AI wears thin really quickly once you interact with it a lot, but I'll grant you some folks might remain excited by AI beyond reason.
So now they are logging into Fortnite and rather than playing the actual game they are just going to talk to characters? It doesn't make a lot of sense to me. But once we throw out the other commenter's numbers and suppose it's not 7 generations to equal 30 minutes of play, maybe it's 20. Maybe it's 40. Maybe it's 100. I honestly don't know. But we're definitely in the realm where I think betting that the video card uses more energy than the AI for a given player (and that all the video cards together use more energy than the AI across all players) is a perfectly reasonable position to take.
I bet that is the case. I don't know it. I can't prove it right or wrong without actual numbers. But based on my ability to generate images and text locally on a shit video card, I am sticking with my bet.
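Just to pin down that 20/40/100 range, here's the break-even count for one half-hour match under the per-response guesses that have been floated in this thread. All of these are estimates, nothing measured:

```python
# Break-even: how many AI responses equal one half-hour match at 350 W?
j_match = 350 * 30 * 60  # 630,000 J for the half-hour match

for j_response in (6_500, 19_500, 100_000):  # the thread's per-response guesses
    print(f"{j_response:>7} J/response -> {j_match / j_response:.0f} responses per match")
# 6,500 J -> ~97, 19,500 J -> ~32, 100,000 J -> ~6 (the "7 questions" from earlier)
```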
-
Okay. So, your position is that 6-year-olds are going to join Fortnite to spam the funny-man-speak button, and because of that AI energy usage will be higher? Okay. Maybe. I'd argue the novelty of AI wears thin really quickly once you interact with it a lot, but I'll grant you some folks might remain excited by AI beyond reason.
So now they are logging into Fortnite and rather than playing the actual game they are just going to talk to characters? It doesn't make a lot of sense to me. But once we throw out the other commenter's numbers and suppose it's not 7 generations to equal 30 minutes of play, maybe it's 20. Maybe it's 40. Maybe it's 100. I honestly don't know. But we're definitely in the realm where I think betting that the video card uses more energy than the AI for a given player (and that all the video cards together use more energy than the AI across all players) is a perfectly reasonable position to take.
I bet that is the case. I don't know it. I can't prove it right or wrong without actual numbers. But based on my ability to generate images and text locally on a shit video card, I am sticking with my bet.
No, my point is one player on a single video card is going to interact with the bots multiple times to try and get it to say something funny (or racist, as real life played out). That takes more than a single interaction.
-
No, my point is one player on a single video card is going to interact with the bots multiple times to try and get it to say something funny (or racist, as real life played out). That takes more than a single interaction.
You start by saying no, but your elaboration says yes.
Maybe I'm wrong. I don't think so. shrug
Not sure what else there is to say at this point. AI uses energy. So do lots of things—video cards in this example. My point is really to put things into perspective here. If the number of video cards running Fortnite weren't cause for worry 3 years ago, why would this use of AI be concerning today?
-
You start by saying no, but your elaboration says yes.
Maybe I'm wrong. I don't think so. shrug
Not sure what else there is to say at this point. AI uses energy. So do lots of things—video cards in this example. My point is really to put things into perspective here. If the number of video cards running Fortnite weren't cause for worry 3 years ago, why would this use of AI be concerning today?
Sorry, I said "no" because it is clearly not just 6-year-olds playing Fortnite and interacting with the bots, but I did not elaborate at all. Brain fart on my part.
If the number of video cards running Fortnite weren’t cause for worry 3 years ago, why would this use of AI be concerning today?
Because it has the potential to be way worse, but it depends heavily on actual energy usage, which we don't have in this case. What we do have is estimates on ChatGPT, and the estimates are pretty bad.
We can only speculate, but if 100 words of text takes 0.14 kWh, we can assume 100 words of voice production is worse.
And maybe if the novelty does not stick it is not something to be concerned about, but that doesn't mean we shouldn't be asking about the energy cost.
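For scale, here's that 0.14 kWh figure next to the open-source joule estimate quoted earlier in the thread; both are other people's estimates, which is exactly the problem:

```python
# The two per-response energy estimates from this thread, in the same units.
j_from_kwh = 0.14 * 3_600_000  # 0.14 kWh -> 504,000 J for ~100 words
j_open_source = 6_500          # the open-source model figure quoted earlier

print(j_from_kwh / j_open_source)  # ~78x apart -- nearly two orders of magnitude
```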
-
Like honestly, on paper this sounds so cool. If you told me in 2015 that in 10 years' time I would be able to play online games with NPC allies that chat with me kinda like real people, I would be super excited about that (and maybe just a little unsettled).
Of course, in practice, I hate this. I don't care how cool the tech is, the energy cost of running this tech for even half of Fortnite's active daily users on a daily basis must be eyewatering. No LLM tech in videogames is worth cooking the planet over. And we all know that the tech companies want utility customers to help foot the bill for these moronic uses of energy, whether we like it or not.
So ultimately: fuck Epic and anyone else trying to use this LLM tech for anything other than life-or-death situations.
The energy usage of AI is grossly overestimated by people. Cooking the planet really should be your last concern. I'd worry more about a possible Skynet if they ever achieve sentient AGI, which at the rate it's going is gonna be within the next 5 years.
-
Sorry, I said "no" because it is clearly not just 6-year-olds playing Fortnite and interacting with the bots, but I did not elaborate at all. Brain fart on my part.
If the number of video cards running Fortnite weren’t cause for worry 3 years ago, why would this use of AI be concerning today?
Because it has the potential to be way worse, but it depends heavily on actual energy usage, which we don't have in this case. What we do have is estimates on ChatGPT, and the estimates are pretty bad.
We can only speculate, but if 100 words of text takes 0.14 kWh, we can assume 100 words of voice production is worse.
And maybe if the novelty does not stick it is not something to be concerned about, but that doesn't mean we shouldn't be asking about the energy cost.
I think it's fair to discuss the energy. I'm not sure where the math comes from that 100 words takes 0.14 kWh. My video card uses 120W pegged and can generate 100 words in, let's say, a nice round 2 minutes. So that works out to 4 Wh, or 0.004 kWh. But of course they are running much more advanced and hungry models, and this is probably generating the text and then generating the voice, and I don't know what that adds. I do know that an AI tool I use added a voice feature and it added nothing to the cost, so it was small enough for them to eat, but also the voices are eh and there are much better voice models out there.
So that's fine, I can pretty well define the lower bound of what a line of text could cost, energy-wise. But this strategy doesn't get us closer to an actual number. What might be helpful... is understanding it from Epic's perspective. They are doing this to increase their bottom line through driving customer engagement and excitement, because I haven't heard anything about this costing the customer anything.
So whatever the cost of all the AI they are using is, it has to be small enough for them to simply absorb in the name of increased player engagement leading to more purchases. The number I just found is $1.2 billion in profit annually. Fuck, that's a lot of money. What do you think they might spend on this? Do you think it would be as high as 2%? I'll be honest, I really don't know. So let's say they are going to spend $24 million on generative AI, and let's just assume for a second that all of it goes to power.
I just checked and the average for 1 kWh nationally is $0.1644, but let's cut that in half assuming they cut some good deals? (I'm trying to be completely fair in these numbers, so disagree if you like. I'm writing this before doing all the math, so I don't even know where this is going.) That looks like about 291 million kWh (or... that's just 291 GWh, right?)
I read global electricity usage is estimated at 25,500 TWh, and (check my math) that works out to about 1/87,000th of the world's annual electricity consumption. Kinda a lot for a single game, but it's pretty popular.
But the ask is how that compares to video cards, and... let's be honest, this is going to be a very slippery, fudge-y number. I was quoted 1.5 million daily players (and I see other sources report up to 30 million, which is a really wide range, but let's go with the lower number). So the question is, how long do they play on average, and how much power do their video cards use? I see estimates of 6-10 hours per week and 8-10 hours per week. Let's make it really easy and assume 7 hours per week, or 1 hour per day.
I have a pretty low-end video card, but it's probably still comparable to or better than some of the devices connecting to Fortnite. I don't have a better number to use, so I'm going to use 120W. There should be a lot of players higher than that, but also probably a lot of Switches and whatnot that are probably lower power. Feel free to disagree.
So 1.5M players x 120W x 1 hour per day = 180 MWh per day, and x 365 that's 65.7 GWh per year.
By these numbers the AI uses about 4.4x the power of the GPUs. So there is that. But also I think I have been extremely generous with these numbers everywhere except maybe the video card wattage, which I really don't have any idea how to estimate. Would Epic spend 2% expecting to recoup that in revenue? What if it's 1%? What if it's 0.5%? At 0.5% they are getting pretty close.
Or if the number of daily players is 15 million instead of 1.5, that alone is enough to tip the scale the other way.
And device power is honestly a wild-ass guess. You could tell me the average is 40W or 250W and I'd have no real basis to argue.
If you have any numbers or suggestions to make any of this more accurate, I'm all ears. The current range of numbers would lean toward me being wrong, but my confidence in any of this is low enough that I consider the matter unresolved. I also didn't dive into how much of AI cost is power vs. infrastructure. If only half the cost of AI is power (and it's probably lower than that) it changes things significantly.
I'm going to stick with my assertion, but my confidence is lower than it was.
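Here's the whole back-of-envelope in one Python snippet so anyone can swap in their own numbers. Every input is one of my guesses above, not a disclosed figure:

```python
# All inputs are guesses from this post, not disclosed figures.
ai_budget_usd = 24e6           # 2% of ~$1.2B annual profit
usd_per_kwh = 0.1644 / 2       # national average, halved for bulk deals
ai_gwh = ai_budget_usd / usd_per_kwh / 1e6  # ~292 GWh/year (the ~291 above)

players = 1.5e6                # daily players, low-end figure
hours_per_day = 1              # ~7 hours/week average
device_watts = 120             # wild-ass guess at average device draw
gpu_gwh = players * hours_per_day * device_watts * 365 / 1e9  # ~65.7 GWh/year

print(ai_gwh / gpu_gwh)        # ~4.4x -- AI ahead under these assumptions
```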
-
I think it's fair to discuss the energy. I'm not sure where the math comes from that 100 words takes 0.14 kWh. My video card uses 120W pegged and can generate 100 words in, let's say, a nice round 2 minutes. So that works out to 4 Wh, or 0.004 kWh. But of course they are running much more advanced and hungry models, and this is probably generating the text and then generating the voice, and I don't know what that adds. I do know that an AI tool I use added a voice feature and it added nothing to the cost, so it was small enough for them to eat, but also the voices are eh and there are much better voice models out there.
So that's fine, I can pretty well define the lower bound of what a line of text could cost, energy-wise. But this strategy doesn't get us closer to an actual number. What might be helpful... is understanding it from Epic's perspective. They are doing this to increase their bottom line through driving customer engagement and excitement, because I haven't heard anything about this costing the customer anything.
So whatever the cost of all the AI they are using is, it has to be small enough for them to simply absorb in the name of increased player engagement leading to more purchases. The number I just found is $1.2 billion in profit annually. Fuck, that's a lot of money. What do you think they might spend on this? Do you think it would be as high as 2%? I'll be honest, I really don't know. So let's say they are going to spend $24 million on generative AI, and let's just assume for a second that all of it goes to power.
I just checked and the average for 1 kWh nationally is $0.1644, but let's cut that in half assuming they cut some good deals? (I'm trying to be completely fair in these numbers, so disagree if you like. I'm writing this before doing all the math, so I don't even know where this is going.) That looks like about 291 million kWh (or... that's just 291 GWh, right?)
I read global electricity usage is estimated at 25,500 TWh, and (check my math) that works out to about 1/87,000th of the world's annual electricity consumption. Kinda a lot for a single game, but it's pretty popular.
But the ask is how that compares to video cards, and... let's be honest, this is going to be a very slippery, fudge-y number. I was quoted 1.5 million daily players (and I see other sources report up to 30 million, which is a really wide range, but let's go with the lower number). So the question is, how long do they play on average, and how much power do their video cards use? I see estimates of 6-10 hours per week and 8-10 hours per week. Let's make it really easy and assume 7 hours per week, or 1 hour per day.
I have a pretty low-end video card, but it's probably still comparable to or better than some of the devices connecting to Fortnite. I don't have a better number to use, so I'm going to use 120W. There should be a lot of players higher than that, but also probably a lot of Switches and whatnot that are probably lower power. Feel free to disagree.
So 1.5M players x 120W x 1 hour per day = 180 MWh per day, and x 365 that's 65.7 GWh per year.
By these numbers the AI uses about 4.4x the power of the GPUs. So there is that. But also I think I have been extremely generous with these numbers everywhere except maybe the video card wattage, which I really don't have any idea how to estimate. Would Epic spend 2% expecting to recoup that in revenue? What if it's 1%? What if it's 0.5%? At 0.5% they are getting pretty close.
Or if the number of daily players is 15 million instead of 1.5, that alone is enough to tip the scale the other way.
And device power is honestly a wild-ass guess. You could tell me the average is 40W or 250W and I'd have no real basis to argue.
If you have any numbers or suggestions to make any of this more accurate, I'm all ears. The current range of numbers would lean toward me being wrong, but my confidence in any of this is low enough that I consider the matter unresolved. I also didn't dive into how much of AI cost is power vs. infrastructure. If only half the cost of AI is power (and it's probably lower than that) it changes things significantly.
I'm going to stick with my assertion, but my confidence is lower than it was.
It is super tough to get more than a ballpark, for sure. The 1.4 million figure is the average daily players over the past year; 30 million is the all-time peak, I believe.
As far as how much Epic/Disney is willing to spend, that is another grey area. Do they consider this marketing? Do they consider this a test bed for AI-generated actors for shows and movies? The motivation and how much they are willing to spend are even more opaque than the energy use numbers.
-
I appreciate that you are engaging deeply. I don't really have anything else to say that I haven't said already, but just wanted to show my respect there.
I appreciate this in return. Online I tend to throw around colorful epithets and I know that can come across as aggressive, and a couple of times I might've phrased things more enthusiastically than I aspire to. I appreciate that you were able to look past that and stay engaged on the topic.