Welcome everyone from
looks at my gtx1080 "You're doing fine, lieutenant. Steady the guns"
My rx470 longs for the sweet embrace of death but he always gets back up again. Proud of ye lad
My 980TI is like that Japanese soldier who kept fighting on a remote island for decades, not knowing the war was lost. One day my sweet prince will finally rest.
Even today, there are still very few games my GTX 1080 cannot run on ultra and maintain a pretty steady 60fps, as long as I stay at 1080p. God of War and AC Valhalla come to mind.
It's only in the last year or so that my GTX 1080 has started having issues keeping up with AAA games. I mainly play JRPGs, so I'm usually way overpowered anyway.
I'm an AMD fanboy, but the 1080 Ti is an example of what Nvidia can do when they want to. They just aren't being pressured enough, and they're always holding back to make sure they can counter whatever AMD launches.
1070 on a 1080p 60fps monitor. Steady on lads
laughs in intel hd 2000 running 640x360 gaming
1080 is more than fine
But the CPU and RAM might be getting an upgrade soon because holy crap does my rig just not run like it used to.
My faithful 1080 saw its replacement in a box next to it yesterday and said “thank you, I can rest now”
GTX 1060 my dude.
My DREAM is still a 1080, a 1080 Ti if I'm lucky, to play Minecraft with shittons of shaders.
I fear my 1080 doesn’t have a lot of time left. Everything runs slower these days, but he is holding on.
1080 ti till I die
gt 1030
My GTX 770 4GB still performs better than it has any right to.
Brother
My GTX970 is chugging along nicely. Solid little ironclad, and about as old as one.
I've got my GTX 1080 powering a 3840x1600 ultrawide, and it's honestly doing just fine. Most modern games at medium settings, native resolution, look pretty good. If the 30 series drops in price enough after the next gen, I might upgrade then, but we'll see.
Hell yeah, my 1080ti is doing just fine with gaming in 2022.
Imagine crying because you have a 1080. That's still a beast dude, you can game comfortably.
1060 reporting in
Went from a 1650 to a 1660 Ti a couple years ago and I feel like the king of the world in most games I play, even at 4K. It plays Doom Eternal at 60fps, which is fine for me.
My EVGA 980 is still going strong after getting it like 8 years ago or whenever they were newest
RISE MY INTEL HD GRAPHICS
GTX 660Ti master race
I upgraded from a 1080 to a 3080 a month ago. I wanted to wait, but the errors were starting to multiply, and I'm not die-hard enough to go a few months without a functioning PC just to save the money. Life's too short!
My 970 is still doing very well, too.
Exactly
I said this to my 1060. The rest of my pc is 10 years old 😂
I've got a 1080 Ti on a 1440p ultrawide doing just fine. Thing is a powerhouse and doesn't need any of that garbage Nvidia keeps dangling.
Looking at my 1080ti I paid 900 for in 2018... hang in there buddy..
My 1070 chugs along no matter what I put it through, for better, for worse, or Forza Horizon 5 at max settings heating up the house better than our HVAC system.
I got a 3080 during the insanity and I’m gonna ride that bitch for years
My gaming laptop from 2020 has an RTX 2060. Sounds fancy, but the GTX 1080 in my desktop outspecs it. That GPU was a beast when it came out, and there are definitely a few years left in it.
cries in 970
Meanwhile my 1050ti in the corner
I had a spare 1070ti laying around. I threw it in a refurb I got off eBay and the thing runs every game I've thrown at it like a dream...given that my discount monitor can only do 1080p 75hz.
Bruh, my Acer Predator 17X with that sweet 1080 Ti prince is pulling its weight in gold. I'm gonna be sad when it finally goes kaput. It's def showing its age :/
As someone who finally upgraded their 970 to a 3060 Ti:
Zotac AMP Extreme 1080 Ti 11GB, happiness in 1.5 kg
FYI, the modern stuff is sitting at or slightly below MSRP.
My 1060 been holding steady for a few years lmao. I’m hoping OW2 doesn’t kill it
Laughs in Gt-610 superiority
My gtx1070 really deserves retirement. He's put in the work. Need to fix this damn budget to find his successor.
GTX1050ti here, “right behind ya chief”
This shit just moves too fast, I don’t even know if my RTX 2080 super is even middle of the pack anymore or not…
Still a good card. Especially the ti versions.
Looks at my ATI Radeon 7000
I literally was playing call of duty 2 right now. That's perfect 😂
My two 980 Tis keep my basement warm in the winter.
1060ti reporting for duty, commander!
I'm all for more graphics power, but it's a bit outlandish that just a graphics card is pulling 700W, if the 4000 series rumours are to be believed. We're approaching the point where a 1000W PSU is going to be the minimum requirement for a system running an xx80 or higher.
Yeah, it's insane. If you get a good chip you can at least undervolt it to run with 25% less power draw, but that's still a lot. If the 4090 drew 700W, even with a good undervolt it would still pull 525 watts...
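Back-of-the-envelope on that, assuming the rumored 700W figure from this thread and a flat 25% reduction from undervolting (a "good chip" assumption; real results vary per card):

```python
# Rough undervolt math using the rumored 700W figure from this thread.
# The 25% savings is an optimistic assumption, not a guarantee.
rumored_draw_w = 700
undervolt_savings = 0.25

undervolted_w = rumored_draw_w * (1 - undervolt_savings)
print(f"{rumored_draw_w}W card after a {undervolt_savings:.0%} undervolt: {undervolted_w:.0f}W")
# -> 700W card after a 25% undervolt: 525W
```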
My thoughts exactly. Plus none of the components are actually getting any cooler either. There's so much focus on efficiency that we forget to look at the gross output.
They won't hit anywhere near that level of power. The leaks said something just as outlandish before the 3000 series released, too.
With energy costs soaring, it'll simply be too expensive for some to upgrade to that level of sustained power usage. If you use a window AC unit it's worse, as your AC and PC perform a boss battle for climate control of the room every time you game. And that's with the 30 series; now imagine the 40 series if the rumors are true.
Those 700w+ numbers weren't for TDP. I'll just quote Guru3d...
Or gpus will just go external and require their own power brick
This is a little overblown. A 3080 draws 50W watching YouTube, browsing the internet, running desktop apps, running many older games, etc. It only hits its 400W in the middle of a game with high resource demand... and it drops down a fair amount as soon as you hit pause.
This will get buried, but a 1000W PSU isn't sustainable, at least here in North America with our weak asthmatic (though marginally safer) residential power.
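For the curious, here's roughly what "weak residential power" means in numbers. This sketch assumes a standard 15A/120V North American branch circuit, the NEC's 80% guidance for continuous loads, and a typical ~90% efficient PSU; check your own breaker and PSU rating:

```python
# Why a 1000W PSU is uncomfortable on North American residential power.
circuit_amps = 15
circuit_volts = 120
continuous_load_factor = 0.80   # NEC guidance: continuous loads <= 80% of rating

circuit_watts = circuit_amps * circuit_volts                    # 1800W absolute
safe_continuous_watts = circuit_watts * continuous_load_factor  # 1440W sustained

psu_dc_output_w = 1000
psu_efficiency = 0.90            # assumed typical efficiency
wall_draw_w = psu_dc_output_w / psu_efficiency  # ~1111W pulled from the outlet

print(f"Circuit limit: {circuit_watts}W, safe continuous: {safe_continuous_watts:.0f}W")
print(f"PC at full load draws ~{wall_draw_w:.0f}W, leaving ~{safe_continuous_watts - wall_draw_w:.0f}W "
      "for the monitor, speakers, and anything else on the same circuit")
```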
Not with transient spikes; we're looking at 1200W supplies.
2 stroke power supply when
Doomguy got upgrades.
Eleven called, wants to use your rig to close the portal.
I started looking for the chad who undervolts
Mine pulls a little over 180 watts. However, that's mostly because it cooks fucking eggs at 95C, is a laptop, and AMD fucking witchcraft.
Directly connected to the top secret nuclear fusion plant
But it costs an ARM and a leg…
Haha I get it! Super funny! But my friend doesn't get it, can somebody explain it? To my friend?
The door is that way - - - - - - - >
Ha!
2023 will be the year of RISC-V and Linux Desktop :-(
Genuinely excited to see RISC CPUs rise. Would be interesting to see how far we can go with it.
The RISC strikes back. RISC was sort of big in the '90s and up to 2001, I think.
Wait, A and M series chips are RISC? That's a name I haven't seen in quite some time.
And please let it be RISC-V; Qualcomm sucks.
M2 is already on the market in the new MacBook Pro though.
Absolutely loving my M1 Max MacBook Pro. I’m all mac for work then obviously pc for gaming. If Apple ever got serious about gaming it would be fun to see what a dedicated gpu could do when paired with the new M chips.
Honestly, nobody should buy it though.
windows 98 machine still chugging doom
Imagine playing Doom on Windows 98, mate. What year is this? Everyone is playing on pregnancy tests, it's the new cool.
Imagine not hitting 640x480 in doom in 1998. Time for an upgrade homie.
ackschually, back then you'd be running 320x200 at 70fps if at all possible, because the refresh rate of the VGA 13h mode was 70hz
We should all be thankful that the M1 and M2 are as powerful as they are while being so insanely power efficient. Even if we never switch to a Mac, the efficiency of those chips proves what can be done with such little wattage and puts pressure on Intel and AMD to produce more efficient chips. More competition the better.
See Apple has a fantastically power efficient gpu, but then they go and crap on their magnificent creation by saying it's a 3090
Really hyped for the M2 chips; I really hope Windows starts catching up and fully supports ARM chips. The power efficiency is insane and the battery life is great. My M1 work laptop survives a full day of work on a single charge (VSCode, Firefox, iTerm2, Teams, VPN, and Docker unit testing running consistently).
efficiency on new Macbooks is just crazy and they barely get hot
It's a shame RISC-V came so much later than ARM; it's having a much harder time solidifying itself in the consumer computer market.
What really needs to happen is breaking x86 backwards compatibility on calls. So much die space is spent so that programs written for the Intel 8086 and everything since still run on a modern x86 CPU.
I'm torn. I don't want an SOC because then it's going to be almost impossible to upgrade or repair my machine, but I want more efficient hardware. Let's see where this goes.
Qualcomm is working on some laptop mobile processors
A new laptop chip from ARM was just announced this week (Cortex-X3). Benchmarks are good, I'd really like to buy a powerful Windows ARM machine soon!
Since when is 8 hours considered insane? My XPS 15 has been doing more than that since I purchased it a couple years ago.
This is why I skip unit testing, to save power.
I've heard rumors that Apple is working on a standalone VR headset based on the M2, which sounds amazing.
I’m not convinced ARM-based chips like the M1/M2 will ever hold a candle to discrete GPUs for gaming, though. The embedded GPUs aren’t really designed for it; they’re designed to transcode video and crank out Blender renders.
My problem with Apple isn't with their ARM chips.
Most games I've tried on mac (of the few that are available) just don't work right. Wish this wasn't the case, but that's my experience.
200w is low af, wdym?
Wish people would realise different devices do different things. Personally I have a MacBook and a PC for different purposes.
Yep. Same here, work with my macbook, game on my PC.
ok mr money bags
That's not really the point of the post. The Mac would be very capable of gaming with better support from game companies and Vulkan support (instead of Apple insisting on Metal). ARM chip designs like the M1/M2 are likely the future (though not the near future) of desktop PCs as well.
Yes indeed. My Mac is my photo editing machine and preferred device for internet use and productivity. The screen is beautiful for movies and tv.
After a lifelong hatred of Apple, I got a used MacBook pro recently. Mostly to better understand the OS when trying to help our Mac users. Gotta be honest, I fucking love the thing. It's not going to replace my gaming rig, but it's been a great daily driver for work.
Bingo, nailed it!
I have both and I use neither.
im free 24/7
same sometimes i say fuck it and watch yt on my phone
That m1 ultra is seriously no joke. Fastest computer I’ve ever seen.
When Asahi Linux gets more developed GPU drivers and customised versions of Box64, we shall see a resurgence of gaming on Macs.
One can actually like and also own both systems at the same time
Yes, if one is rich. For most of us, we have to pick.
And then there's me with a Steam Deck that games well and pulls 40W
the Steam Deck is such a cool device. it would be really cool if someone made a portable dock with a laptop-esque form factor, so you could use it for both work and gaming anywhere
To each their own
The M2 beats a 3090? In productivity I assume?
Apple's way of comparing was holding the 3090 to the amount of power an M2 uses. The M2 is a lot more power efficient, so you're effectively throttling the 3090, since it can use a lot more power to outperform the M2. It invites a lot of car comparisons, for some obvious reasons.
Pretty much only in Final Cut. For gaming, not even close.
In 9/10 things, absolutely not and it isn't even close.
I think they were comparing m1 ultra to the 3090. It does beat it in some productivity stuff, not all of course. And gaming is not even close, but that’s not the point of a mac anyway
It's been widely considered in the computer engineering field that RISC CPUs (like ARM's) are much better than CISC (like the x86 family) in regard to both power and speed. But we've been stuck with Intel's x86 just for the sake of backwards compatibility and industry standards. RISC is the future, and I hope our PCs will follow suit so we have wide support for it without the need for emulation.
The RISC vs. CISC debate isn't that significant and ended up not mattering at all, which is why a RISC revolution never happened; transistor density just made it irrelevant. ARM at this point isn't very RISC-y either, having picked up its own decades of legacy (Thumb-2 says hello, for example).
i386/amd64 is definitely not CISC for the same reason you mentioned. They're essentially RISC processors with some special microcode that helps some very specific processes run much faster.
The lines of RISC and CISC have been blurred for decades.
I do have an M1 sitting next to a rig with a 3080 Ti, and I adore both machines. Why is there even a debate about this? It’s like comparing a car to a bike, different things for different purposes…
it baffled me when the m1 came out and people started comparing it to high-end desktops. like sir, this is a 40-watt chip in an ultrabook
I'm with you bro (3080 not ti tho), they're both for different purposes and both awesome at what they do.
I just like seeing people enjoy things.
I think the direction this is going is great. I fear a bit for customization, like 'you wanna game? Well, you've got to pay $1200 for this chip that locks you into a spec, or you can't run games well.' I love having the option to go the upgrade route on a budget, used or new. But if done well for gaming, a box like this with proper AAA gaming power, maybe a down-shining RGB light... that has its own kind of cool!
I almost want to get a new M1 or M2 Mac, I just don't know if I'd be able to use it much, since I can't natively install Windows on it (as far as I know), and ever since macOS Catalina, my Mac library was cut in half.
It can still run Windows faster than native x86
Yeah I have no need for a PC that uses 1000+ watts
I’d prefer the new MacBook Air for my day to day tasks. They can do it all day…
I just bought a MacBook Air for the portability and efficiency. I won’t game on it but it’s great for traveling and web browsing
gAming pC better hAte aPPle - most people on this sub
I read that as femboy at first and got super confused
be a femboy 😎
Everything has its place. M2 does well in certain workloads while pulling low levels of power. Custom PCs give you more flexibility in exchange for more power draw and footprint.
Why don’t we all start using renewable power sources?
Here are some reasons. There could be more but these are the few that came to my mind.
Your comment, I can't quite follow it
I'm out of the loop here, what is Apple releasing?
their second in-house processor
This meme is of their M1 Ultra, their top end desktop chip. It’s absolutely wicked fast, but everyone hates on it because they compared it to a 3090 at 300W, which is misleading af
I'm far from a Macintosh fan, but I'm very impressed by the capabilities of Apple's M chips. They pull very little power compared to an x86 system with similar horsepower. It's certainly the future of PCs, which is better for general use, but it will make building PCs either obsolete or purely a hobby.
So I’m a PCMR guy through and through. I have a PC I’ve invested a fair bit into, and a gaming laptop so I can take my VR to friends’ places, which I also tried using for school. The battery ran out after like 45 minutes of using MS Word.
People are saying they don’t care about a slightly higher electricity bill, but I think the heating effect is the real issue. Even with my 160W 2060, my room is noticeably warmer after an hour of gaming than the rest of the house.
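The physics backs this up: essentially every watt a PC draws ends up as heat in the room. A quick conversion, using the 160W GPU figure above plus an assumed rest-of-system draw (the 140W is hypothetical, for illustration):

```python
# A gaming PC is effectively a space heater: nearly all electrical draw
# becomes room heat. GPU figure is from the comment above; the rest-of-system
# number is an assumption.
gpu_w = 160
rest_of_system_w = 140   # hypothetical CPU + board + monitor
total_w = gpu_w + rest_of_system_w

WATTS_TO_BTU_PER_HR = 3.412
print(f"{total_w}W gaming load ≈ {total_w * WATTS_TO_BTU_PER_HR:.0f} BTU/h of heating")
# ~1024 BTU/h, about a fifth of a small 5000 BTU/h window AC's cooling capacity
```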
Undervolt, people. It works, and you basically still have the same performance, or sometimes better.
Have been Win desktop + Ubuntu/Linux work laptop for the last 10 years or so, and my phones were always Androids. Last year I got myself an iPhone and Watch 7 and started to appreciate what Apple has done in how those things work together, so I decided to go Mac instead of Ubuntu for work, and oh boy, what I have been missing. This MacBook Pro M1 is hands down the best laptop or system experience I've ever had. And no, I don't work with photo/video editing; I work on cloud engineering projects.
I am in a similar situation. Had a PC for gaming/work with Windows and sometimes Linux all my life. But with my new company I chose a MacBook Pro as my laptop, and boy, it is great. Coming from a typical Windows keyboard, though, it will take me some time to get used to.
Get yourself an iPad Pro too. If you move around a lot at work and work in different areas, it’s a super compact two monitor system.
At the end of the day we don't build gaming rigs to conserve power xD To get better performance, sacrifices need to be made lol
There is a threshold where power consumption becomes important. Miners already hit the wall, gamers will probably reach it sometime in the future.
Sorry, but when you start getting gaming PCs sucking down near a kilowatt of power, that's a problem. With average electricity prices you're paying dollars per day to run that thing, rather than cents, and that really adds up over a month or a year.
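For anyone wanting to plug in their own numbers, the math is simple. Both scenarios below use assumed hours and rates (rates vary hugely by region), so treat them as illustrations, not a verdict:

```python
# Daily/monthly cost of a ~1kW gaming rig. Hours and rates are illustrative
# assumptions; substitute the numbers from your own bill.
avg_draw_kw = 1.0

scenarios = {
    "light use, cheap power":  {"hours_per_day": 2, "usd_per_kwh": 0.12},
    "heavy use, pricey power": {"hours_per_day": 8, "usd_per_kwh": 0.40},
}

for name, s in scenarios.items():
    daily = avg_draw_kw * s["hours_per_day"] * s["usd_per_kwh"]
    print(f"{name}: ${daily:.2f}/day, ~${daily * 30:.0f}/month")
# light use, cheap power: $0.24/day, ~$7/month
# heavy use, pricey power: $3.20/day, ~$96/month
```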
Hmm, yummy anti-right-to-repair machine.
could we focus on power efficiency please?
Apple Silicon could game real well if Apple wasn’t being a little bitch about it.
I agree with the sentiment but how is preferring pc being a fanboy? PC isn't a brand.
Love this !
Genuinely excited about the M2s, but I need Apple to get their shit together with eGPUs. I’m still running a 2019 MacBook Pro that heats up like the sun when rendering video in Premiere/After Effects. I’ve had the option to go M-series at work a few times, but can’t do it (I require mobility as well).
It isn't about what it can do, but what you do with it ✌️
Dear Apple,
You know those 2-in-1 builds where they put an ITX streaming PC into the gaming PC? I want to see that with the Studio or Mini.
So what exactly is holding Nvidia and AMD back on progressing past the x86 architecture?
Two things:
Can’t wait for the m2 air to drop
This is a thread I can get behind
Be the chad guys.
Mac for work, PC for gaming. It’s been like that for ages now.
Muh beepy box is better than your beepy box.
Or you could join team red and not break 200 watts until you need to game. My AMD APU has a TDP of less than 75 watts, so I can cruise along under that until I load up a game and the GPU starts working. Heck, in CPU-bound games the graphics card doesn't even need to work that hard.
the world needed this post. take my free award
I refuse to go with the trend and buy a card with more power draw... not because I'm elitist, but only because my cooling system can't possibly handle any more heat. Smol-PC yaaay.
M series Macs are amazing if you want a powerful multimedia box or a video editing box. I’m a PC user, but new Macs are pretty sweet for certain things.
Oh c'mon don't be so rude, the Mac mini is a very good Facebook machine
I game on my pc and I work on my M1 MBP. Why not enjoy the best of both worlds?
The only thing I dislike about the M1/M2 is that I fear other companies will want to follow Apple’s model, and PC building as we know it could look bleak. Companies may shift only to prebuilts that you can’t take apart or repair, forcing consumers to buy new computers when a component like the RAM dies. It’d also greatly reduce the amount of customization that current PC building allows, so I don’t want M1s to become the future of computing. Let’s hope right to repair becomes more of a thing so that this doesn’t happen.
I bought the 13” M2 MacBook Pro and for video editing (which is what I do) it keeps up with my PC, which has a 3900x and a 2070 Super. I’m not disappointed.
I’m still perfectly fine with my Mac Pro 5,1 with two 6 core Xeons, a Radeon RX 590, 96GB of RAM, with Mac OS and Windows 11. Literally does everything I want including gaming🤷🏼♂️
1 kilowatt smokestack lmao my house is probably gonna burn down
Currently have a MacBook Pro, my old windows pc is slow af. Definitely want something that games. Lol
At this point ARM life is the only way I can stay sane with how hot summers have gotten.
Apple made so much money on free-to-play games that they have no reason to want to, but a Valve-backed Proton for Mac M processors would be *guitar solo*
This is wholesome, I dig!
My Mac Studio arrived yesterday and this makes me happy.
Looks @ my
I have an RX 5700 XT and it is holding strong; it does 1080p 60fps fine in most games, so I'm happy for a while.
Hard to beat Apple silicon when it comes to video editing. Sucks you can't game with it.
I don't care what people use, I just think apple is more suited to business use
I love how half of these comments are fanboys fighting with each other
Don’t forget, in a decade SoCs like the Mac Studio will likely replace gaming PCs as you know them.
This is the first post I’ve ever seen on this sub that has people actually enjoying an apple product and I feel like I’m stepping into a trap