Computing Newslinks - Page: 1
| PC World - 12 minutes ago (PC World)Getting 4TB of fast storage space for $210 seems like a cheat code, but here we are. Right now, you can snatch the Crucial X9 Pro portable SSD for that much, the best price we’ve seen in the past year. That’s a sizable 28 percent discount off its normal price.
This portable SSD doesn’t just offer a ton of storage space, but it’s also super fast. The Crucial X9 Pro hits up to 1,050MB/s read and write speeds, which means you’ll be able to transfer large files in a snap. If you hook it up to your iPhone, you can even capture 4K and 8K videos directly to the Crucial X9 Pro. It’s that fast.
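To put that 1,050MB/s figure in perspective, a quick back-of-the-envelope calculation shows how long a large transfer would take at the rated speed (a rough sketch; sustained real-world speeds depend on file sizes and the host's USB port):

```python
def transfer_seconds(size_gb: float, speed_mb_s: float = 1050) -> float:
    """Approximate time to move a file at a sustained sequential rate."""
    return size_gb * 1000 / speed_mb_s  # GB -> MB, divided by MB per second

# Moving a 100GB game library at the X9 Pro's rated 1,050MB/s:
print(round(transfer_seconds(100), 1))  # ~95 seconds, about a minute and a half
```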
This Crucial SSD doesn’t just work with iPhones but also Android devices, laptops, gaming consoles, and more. It connects via USB-C, which essentially makes it universally compatible.
The drive is also tiny and lightweight, weighing just 2.5 ounces for maximum portability. Plus, your data is well protected thanks to the drive’s IP55 water and dust resistance and its drop rating of up to 7.5 feet.
We reviewed the Crucial X9 Pro and gave it a 4.5-star rating and our Editors’ Choice award, happy with its super small form factor and great performance. Don’t miss out on this chance to get the Crucial X9 Pro for $210 on Amazon with a whopping 4TB of space!
This 4TB portable SSD that’ll fit everything you need is 28% off. Buy now at Amazon. Read...Newslink ©2025 to PC World | |
| | | PC World - 32 minutes ago (PC World)Nvidia’s GeForce Now streaming service is a great way to play PC games on platforms that can’t support them. But you can’t get all the bells and whistles of the PC gaming experience — mods are pretty hard to implement on those remote streaming machines.
Nvidia and developer Larian Studios have been working on it, though, and at least some Baldur’s Gate III mods now work on the service. That’s according to a post on Nvidia’s official blog, which announced that “a range of curated mods” are available for the smash-hit role-playing game, implemented through the official in-game system. I would expect “curated” to mean “heavily edited” in this context. Some popular D&D subclass and species additions might make the list, but your Daisy Duke cutoff jeans for Shadowheart probably won’t.
Sadly, the upgraded BGIII experience is reserved for those subscribed to GeForce Now’s Performance and Ultimate tiers — and those are getting harder to find at the moment.
As usual, GeForce Now continues to add new games to its ever-expanding lineup, including the popular new Jotunnslayer: Hordes of Hel and the Xbox Game Pass version of Among Us. Read...Newslink ©2025 to PC World | |
| | | PC World - 32 minutes ago (PC World)Heads up! While gaming laptops with RTX 4060 GPUs typically go for at least $1,000 — usually more — right now you can grab an Acer Nitro V 15 for just $750 at Best Buy. That’s an amazing discount on a solid machine, a hefty $350 off its original $1,100 price tag.
This configuration comes with a 10-core Intel Core i7-13620H CPU and 16GB of DDR5 RAM, a solid combination that’ll get you through pretty much any app or game you want to run. The laptop also makes it easy to upgrade the RAM, up to 32GB if you swap the 8GB sticks for 16GB each.
But the highlight here is the Nvidia GeForce RTX 4060 GPU. It may not be the newest or greatest of the RTX 40-series, but it’s an incredible pickup at this price. Seriously, you’d have to spend more than $1k for something like this otherwise, and it’s powerful enough to raise those graphics settings (although probably not to the very max).
The Acer Nitro V 15 features a 15.6-inch display with 1080p resolution and 144Hz refresh rate, so your games will look great and your frame rates will be super smooth. It’s all rounded off with a 512GB SSD that’s fast and offers enough space for your apps, games, and files.
Don’t miss out on this amazing deal. It’s not every day you can grab an Acer Nitro V for $750 at Best Buy!
Save $350 on this powerful RTX 4060 gaming laptop. Buy now at Best Buy. Read...Newslink ©2025 to PC World | |
| | | PC World - 52 minutes ago (PC World)I’m old enough to remember when consoles were getting seriously powerful and people were starting to wonder if PC gaming’s days were numbered. Turns out, no. Far from it.
According to a recent professional survey, a shocking 80 percent of game developers are actively working on games for PC, more than double the percentage for the next platform. That’s according to the 2025 State of the Game Industry report at the Game Developers Conference (GDC).
When asked, “Which platforms are being used for your current project?” about 80 percent responded with PC, followed by 38 percent for PlayStation 5, 34 percent for Xbox Series X/S, 29 percent for Android, and 28 percent for iOS. Just 23 percent were working on Mac (though it’s worth pointing out that Mac computers have access to the iPhone/iPad App Store) and 20 percent were working on the Switch.
But there are a few qualifiers to put this data in perspective. Due to the Game Developers Conference being located in San Francisco, it skews attendance towards American developers and those interested in the latest PC hardware tech. (Nvidia and AMD are both just down the road in Santa Clara.) A small indie developer based in Japan and focusing on Switch games is unlikely to spend all that money to attend, especially when most of the information is made available online later.
But even within that context, a definite trend is emerging. The percentage of developers answering “PC” to this same survey question has jumped up rapidly — from 56 percent in 2020 to 58, 63, 65, and 66 percent in the following years, as PC Gamer reports. Even so, the jump this year is notable, likely driven by the spread of the Steam Deck and its competitors. “When asked to name other platforms that interest them, almost half (44%) wrote in Steam Deck,” says the GDC report.
The report also said that an alarming 11 percent of game developers were laid off last year. It’s not entirely unexpected since basically every major developer and publisher announced some kind of large-scale layoffs in 2024 despite the gaming industry continuing to grow at a rapid pace. On top of that, some gamers will probably be dismayed to hear that one-third of developers working for a AAA company said they’re actively developing a live service game. Read...Newslink ©2025 to PC World | |
| | | PC World - 1 hour ago (PC World)At a glance
Expert’s Rating
Pros
Fast PCIe 5.0 performance (12GBps)
700TBW endurance rating
5-year warranty
Cons
Slower than much of the PCIe 5.0 competition
Our Verdict
If you find the Teamgroup GE Pro priced below the competition, you’ll like the performance. That said, speed was below average for a PCIe 5.0 NVMe SSD, and slower than its Z540 cousin.
Price When Reviewed: $259.99
With “Pro” in its name, I was expecting the GE Pro to eclipse the performance of Teamgroup’s own Z540. Alas, it lagged behind its cousin and others in most tests. It’s a good PCIe 5.0 drive, it’s just not among the elite.
What are the Teamgroup GE Pro’s features?
The GE Pro is a PCIe 5.0 x4 (four-lane) NVMe SSD in the standard 2280 (22mm wide, 80mm long) form factor. It features 512MB of DRAM per 1TB of capacity, and uses an InnoGrit IG5666 controller and 232-layer TLC (Triple-Level Cell/3-bit) NAND. Up to 33 percent of the available NAND may serve as secondary cache.
As you can see in the photo at the top of this article, the GE Pro ships with an attractive graphene heat spreader that you can stick to the chips shown above. If you’re using the SSD heavily, the heat spreader will help dissipate heat and stave off any thermal throttling that might occur.
Teamgroup rates the GE Pro for 700TBW (terabytes that may be written), which is about 100TBW more than average. That’s a nice little bonus, though end users would be hard-pressed to write anywhere near that much within the five-year warranty period.
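For context, here's what the 700TBW rating works out to as a sustained daily write volume over the five-year warranty (simple arithmetic on the article's numbers):

```python
tbw = 700                    # rated terabytes written
warranty_days = 5 * 365      # five-year warranty period

daily_writes_gb = tbw * 1000 / warranty_days  # TB -> GB per day
print(round(daily_writes_gb))  # ~384GB of writes every single day
```

Few desktop users write anywhere near 384GB per day, which is why the endurance rating is effectively a non-issue in practice.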
How much is the Teamgroup GE Pro?
Currently available in 2TB and 4TB flavors, the GE Pro will set you back $260 and $450 if you pay full price. A 1TB model will be sold, but wasn’t available or priced at the time of this writing. Figure around $150 retail when it shows up.
The GE Pro ships with a graphene heat spreader, which can help prevent throttling during heavy use.
How fast is the Teamgroup GE Pro?
Being PCIe 5.0, the GE Pro turned in some very good numbers. But they weren’t best-in-class by any means, ranking 13th place overall, behind even some PCIe 4.0 SSDs that offer better real-world performance. Among PCIe 5.0 drives, it ranked ninth across all tests and seventh in CrystalDiskMark 8.
Note the very low single-queue read performance. Single-queue is the way most I/O is handled under Windows.
The GE Pro’s single-queue numbers are a mite concerning, as that’s the way most I/O is done in Windows. Longer bars are better.
Unlike the sequential tests, the GE Pro was fine in CrystalDiskMark 8’s single-queue random tests, but fell off a bit in the multi-queue 4K tests.
The GE Pro was better with single-queue random performance but fell off the pace with multiple queues. Longer bars are better.
Although not a crushing loss, the GE Pro still placed behind the competition in our 48GB transfers. Note that Windows uses only a single queue for file transfers.
This is a good 48GB transfer performance from the GE Pro, but not great, and we’ve seen better from some PCIe 4.0 and HMB designs. Shorter bars are better.
Again, while not too far off the pace, the GE Pro lagged behind its competitors (Crucial T705, Corsair MP700 Pro SE, Teamgroup Z540) in the 450GB write.
Again, it was close but no cigar for the GE Pro. Beaten even by its Z540 cousin. Shorter bars are better.
Overall, the GE Pro is a fast drive — it’s just not as fast as its competitors. Again, that includes Teamgroup’s own Z540, a like-priced SSD that should probably be the model with “Pro” in its moniker.
Also, with PCIe 4.0 and HMB (Host Memory Buffer — using system memory for primary cache) designs available for so much less, there’s the question of whether you really need any PCIe 5.0 SSD at this point in time. You’d be hard-pressed to tell the difference during even stressful use.
Should you buy the Teamgroup GE Pro?
The GE Pro is a very good SSD, but given its premium price and the fact that it placed in the lower echelon of the PCIe 5.0 charts, I can’t wholeheartedly recommend it. If the price is right, you won’t regret the purchase, but there are faster PCIe 5.0 SSDs available.
How we test
Drive tests currently utilize Windows 11 64-bit running on a Z790 (PCIe 4.0/5.0) motherboard/i5-12400 CPU combo with two Kingston Fury 32GB DDR5 4800MHz modules (64GB of memory total). Both 20Gbps USB and Thunderbolt 4 are integrated into the back panel, and Intel CPU/GPU graphics are used. The 48GB transfer tests utilize an ImDisk RAM disk taking up 58GB of the 64GB of total memory. The 450GB file is transferred from a 2TB Samsung 990 Pro that also runs the OS.
Each test is performed on a newly NTFS-formatted and TRIM’d drive so the results are optimal. Note that in normal use, as a drive fills up, performance may decrease due to less NAND for secondary caching, as well as other factors. This can be less of a factor with the current crop of SSDs with far faster late-generation NAND.
Caveat: The performance numbers shown apply only to the drive we were shipped and to the capacity tested. SSD performance can and will vary by capacity due to more or fewer chips to shotgun reads/writes across and the amount of NAND available for secondary caching. Vendors also occasionally swap components. If you ever notice a large discrepancy between the performance you experience and that which we report, by all means, let us know. Read...Newslink ©2025 to PC World | |
| | | PC World - 1 hour ago (PC World)Earlier this week, security firm Trend Micro posted a security advisory (spotted by BleepingComputer) about a big vulnerability in 7-Zip, a popular file archiving utility app that’s used by millions around the world.
According to the advisory, the vulnerability — identified as CVE-2025-0411 — makes it possible for hackers to bypass the Mark of the Web (MotW) security feature in Windows and remotely execute code on your PC when extracting from a malware-loaded archive file.
Igor Pavlov, developer of 7-Zip, actually patched the flaw back in November 2024 with version 24.09. However, 7-Zip doesn’t have an automatic update feature, so many users are still using outdated versions of the app that are still vulnerable to this MotW exploit.
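If you're unsure whether your copy predates the fix, comparing its version string against 24.09 is enough; here's a minimal sketch (the version string itself comes from 7-Zip's Help > About dialog or the banner printed by the `7z` command):

```python
def is_patched(version: str, fixed: str = "24.09") -> bool:
    """True if a 7-Zip version string is at or past the CVE-2025-0411 fix."""
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return parse(version) >= parse(fixed)

print(is_patched("24.09"))  # True  -> includes the MotW fix
print(is_patched("23.01"))  # False -> still vulnerable, update now
```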
If you haven’t updated 7-Zip in a long time, do it now. Head over to the 7-Zip download page and get the latest version, which is 24.09 as of this writing. As long as you’re on that version or later, you’ll be okay. Read...Newslink ©2025 to PC World | |
| | | PC World - 1 hour ago (PC World)On January 21, a photo of an Nvidia GPU with code name GB202-200-A1 (see header image above) surfaced on the infamous Chip Hell forum. The user who posted it claims that it’s a prototype of the most powerful GPU in the upcoming RTX 50 series.
The fully configured GPU allegedly utilizes the maximum possible number of computing units for which the chip was designed. We’re talking about an RTX 50-series chip with 192 streaming multiprocessors and an incredible 24,576 shader units. The current top model RTX 5090 has “only” 21,760 shader units. Mathematically, the GB202-200-A1 has almost 13 percent more raw computing power.
The second striking feature is the pair of 12V-2×6 power connectors. These also appear to be necessary, as the Thermal Design Power (TDP) is said to be an extremely high 800 watts. By comparison, the RTX 5090 has a TDP of 575 watts, making the GB202-200-A1’s power consumption 39 percent higher.
GB202-200-A1 vs. GeForce RTX 5090:
GPU: GB202-200-A1 vs. GB202-300-A1
Codename: David Blackwell (both)
Computing power (FP32): 123.6 TFLOPS vs. 104.9 TFLOPS
Shader units: 24,576 vs. 21,760
Streaming multiprocessors: 192 vs. 170
GPU clock (base/boost): 2,100MHz / 2,514MHz vs. 2,010MHz / 2,410MHz
Memory bus: 512-bit (both)
Memory type: GDDR7 (both)
Graphics memory: 32GB (both)
TGP: 800 watts vs. 575 watts
RRP: ? vs. $1,999
Nvidia appears to be investing that additional power requirement in higher clock frequencies. Here, too, the Chip Hell forum thread provides very specific figures: the base and boost clocks of the GB202-200-A1 are said to be 2,100MHz and 2,514MHz, respectively. For comparison, the RTX 5090 runs at 2,010MHz and 2,410MHz.
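Those shader counts and boost clocks are exactly where the FP32 figures come from: each shader unit executes two floating-point operations per clock via fused multiply-add. The arithmetic checks out against the leaked numbers:

```python
def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    # shaders x 2 FLOPs per clock (FMA) x boost clock in GHz, scaled to TFLOPS
    return shaders * 2 * boost_ghz / 1000

print(round(fp32_tflops(24576, 2.514), 1))  # 123.6 -> GB202-200-A1
print(round(fp32_tflops(21760, 2.410), 1))  # 104.9 -> RTX 5090
```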
As with the RTX 5090, the RAM configuration should be 32GB, connected via a 512-bit memory interface. The picture clearly shows the 16 memory chips surrounding the graphics processor. But here, too, Nvidia could go one better by installing 3GB chips instead, as is already the case with the mobile version of the RTX 5090.
Now the only question that remains is what it should be called. It would be conceivable to bring the GB202-200-A1 to market as the RTX 5090 Ti. But a new entry in the Titan series would also be possible, perhaps as the Nvidia Titan RTX Blackwell. Since the Titan is traditionally aimed more at consumers in the creative sector (e.g., content creators), I’d be disappointed if this card didn’t have 48GB of GDDR7 memory. Read...Newslink ©2025 to PC World | |
| | | PC World - 2 hours ago (PC World)At a glance
Expert’s Rating
Pros
By far the fastest gaming performance ever, even ray traced
DLSS 4 Multi Frame Generation makes games so snappy and smooth, they feel like new — it’s truly game-changing
Huge 32GB of GDDR7 with tons of bandwidth
Tightly engineered Founders Edition model somehow squeezes into a fairly quiet two-slot design
Cons
Bro, it’s $2,000
$500 premium over 4090
Significant power increase requires 1,000W power supply
GPU temp hits 84 degrees Celsius under load
Our Verdict
Nvidia’s GeForce RTX 5090 is the most brutally fast graphics card ever introduced, augmented by new DLSS 4 technology that feels like magic. But you pay dearly for it, and it feels like this GPU was designed more for AI researchers than PC gamers.
The wait is finally over. The long-awaited GeForce RTX 5090 lands on store shelves in January — and friends, the flagship graphics card for Nvidia’s new “Blackwell” architecture is an absolute monster.
It should be for $2,000, of course. While the RTX 5090’s leap in raw gaming performance isn’t anywhere near as massive as the 4090’s was over its predecessor, it blows the pants off any GPU we’ve ever seen before, with no notable flaws in its technical configuration.
But while raw gaming performance is welcome, I suspect that the GeForce RTX 50-series will live or die on the back of DLSS 4, a new generation of Nvidia’s vaunted AI-powered performance-boosting technologies. A lot of Blackwell’s improvements focused on Nvidia’s AI tensor cores, and the architecture was designed hand-in-hand with DLSS 4. And hot damn, friends. Based on our early playtime, DLSS 4’s new Multi Frame Generation AI technology feels like black magic, utterly changing the way games feel and respond to your inputs. It’s amazing, full stop.
Check out our embedded video review below for an in-depth analysis of every benchmark we ran and plenty of additional experiential information. This written review will focus on key things would-be RTX 5090 buyers need to know before slapping down $2,000 for the most badass graphics card ever built.
The Nvidia GeForce RTX 5090 is badass at everything
When we covered (and then analyzed!) the specifications for Nvidia’s initial GeForce RTX 50-series lineup, one thing jumped out: The GeForce RTX 5090 was the clear crown jewel, designed with virtually no technical flaws. That bears out in our testing.
With 33 percent more CUDA cores than the RTX 4090, new RT and tensor AI cores, and more raw power being pumped through its digital veins, there was never any doubt the 5090 would whup on its predecessor in gaming. (Much more on that in the next section.) Its ginormous 32GB of bleeding-edge GDDR7 memory, built with a wide 512-bit bus, will be able to tackle any gaming task you throw at it, regardless of resolution.
But the RTX 5090 is more than just a gaming behemoth.
Adam Patrick Murray / Foundry
We live in an era where GPUs do serious work on AI tasks now, not just gaming. Nvidia optimized its Blackwell architecture to excel at AI workloads, while the RTX 5090’s unrivaled memory configuration can hold much larger AI models than any prior GPU. The results in Procyon’s AI Text Generation benchmark are nothing short of sparkling.
In the “worst case” scenario, with the Phi 3.5 large language model, the RTX 5090 is about 19 percent faster than the 4090; in the best case scenario, Meta’s Llama 3.1, performance jumped 32 percent. AI researchers and engineers will be scrambling to pick this up.
The GeForce RTX 5090 is also a content creation powerhouse. It houses an additional media encoding engine, bringing its total up to three, while the massive memory pool helps with the creation and editing of complex projects.
We were only able to run a couple of Adobe benchmarks from Puget Systems’ excellent PugetBench for this initial review. All modern high-end GPUs offer roughly the same performance in Photoshop (no surprise there), but the RTX 5090 is 8 percent faster than the 4090 in Premiere Pro. Expect that margin to jump with more GPU-centric creation software, like DaVinci Resolve, or if your workloads utilize ray tracing or can tap into Nvidia’s excellent DLSS technologies.
There’s a reason we’re talking about this first: The GeForce RTX 5090 is much more than just a gaming card, an even more amplified version of the 4090 before it. People who use their PCs for real work will be clamoring for this monster GeForce GPU to make real money with it. The RTX 4090 has sold for closer to $2,500 than its $1,600 suggested price for years now, and I expect the demand will be even stronger for the titanic 5090 with its fast, massive memory pool and AI optimizations. This will sell like hotcakes — yes, even at $1,999, which I suspect will look like an absolute steal a few months from now.
But if you are able to snag one, you’ll have your hands on by far the fastest gaming GPU of all time. Onto the fun stuff.
Nvidia GeForce RTX 5090 gaming benchmarks: Brutally fast
The GeForce RTX 4090 stood unopposed as the ultimate gaming GPU since the moment it launched. No longer. The new Blackwell generation uses the same underlying TSMC 4N process technology as the RTX 40-series, so Nvidia couldn’t squeeze easy improvements there. Instead, the company overhauled the RTX 5090’s instruction pipeline, endowed it with 33 percent more CUDA cores, and pushed it to a staggering 575W TGP, up from the 4090’s 450W. Blackwell also introduced a new generation of RT and AI cores.
Add it all up and the RTX 5090 is an unparalleled gaming beast — though the effects hit different depending on whether or not you’re using RTX features like ray tracing and DLSS.
Our gaming benchmark suite tests titles utilizing a variety of different game engines, to try to get a well-rounded view of performance. We’ve decided to focus on 4K gaming performance given this $2,000 graphics card’s might.
In games that don’t use ray tracing or DLSS, relying instead on brute-force rendering, the RTX 5090 offers little more than a mild generational performance upgrade. It runs an average of 27 percent faster in those games — but the splits swing wildly depending on the game: Cyberpunk 2077 is 50 percent faster, Shadow of the Tomb Raider is 32 percent faster, and Rainbow Six Siege is 28 percent faster, but Assassin’s Creed Valhalla and Call of Duty: Black Ops 6 only pick up 15 and 12 percent more performance, respectively.
Performance results are less of a yo-yo once you flip on ray tracing, DLSS upscaling (not Frame Generation), or some mix of the two. Black Myth Wukong, Cyberpunk 2077, and Returnal all run ~30 percent faster on the RTX 5090 versus the 4090, while F1 24 leapt up 40 percent.
Nvidia invested heavy engineering work in its ray tracing and tensor cores this generation, and it shows. You should not pick up the RTX 5090 if you plan on ignoring ray tracing and DLSS in games (unless you plan on using it for work, of course).
But sweet holy moly, you shouldn’t ignore DLSS 4 anyway.
DLSS 4 is a literal game-changer
Much like DLSS, DLSS 2, and DLSS 3 before it, the new DLSS 4 generation is an absolute game-changer. Nvidia’s boundary-pushing AI tech continues to look better, run faster, and now feel smoother. It’s insane.
Nvidia made two monumental changes to DLSS to coincide with the RTX 50-series release. First, all DLSS games will be switching to a new “Transformer” model from the older “Convolutional Neural Network” behind the scenes, on all RTX GPUs going back to the 20-series.
With over 75 games and apps scheduled to support DLSS 4 on the day of the RTX 5090’s launch, you’ll have plenty of opportunities to put your fancy new hardware to work.
More crucially for the RTX 5090 (and future 50-series offerings), DLSS 4 adds a new Multi Frame Generation technology, building upon the success of DLSS 3 Frame Gen. While DLSS 3 uses tensor cores to insert a single AI-generated frame between GPU-rendered frames, supercharging performance, MFG inserts three AI frames between each GPU-rendered frame (which itself may only be rendering an image at quarter resolution, then using DLSS Super Resolution to upscale that to fit your screen).
It’s AI all the way down, just like we predicted. As someone who is sensitive to latency, I was skeptical going in. But friends, DLSS 4’s Multi Frame Generation feels fantastic in our limited playtesting.
Adam Patrick Murray / Foundry
We only pulled benchmark numbers for Cyberpunk 2077 using the RT Overdrive preset and 1.7x DLSS scaling. Flipping on MFG increases the frame rate over the 5090 with DLSS 3 by a whopping 91 percent, hitting a blistering 249 frames per second. Without any sort of Frame Generation on, the game runs at 71fps; enabling DLSS 4 MFG lets it run an absolutely staggering 251 percent faster.
Insanity. And it feels smoother than silk with Nvidia Reflex turned on.
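Those percentages line up with straightforward arithmetic on the reported frame rates:

```python
no_framegen_fps = 71   # RT Overdrive, DLSS upscaling only
mfg_fps = 249          # with DLSS 4 Multi Frame Generation enabled

uplift_pct = (mfg_fps / no_framegen_fps - 1) * 100
print(round(uplift_pct))  # 251 -> the "251 percent faster" figure

# The DLSS 3 single-frame-generation result implied by the 91 percent gap:
dlss3_fps = mfg_fps / 1.91
print(round(dlss3_fps))  # ~130fps
```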
Online forums have been embroiled in debate since DLSS 4’s announcement: Do the AI frames really count as a frame rate increase, since your input only affects the 1/4 of frames actually rendered by the GPU, or is it more like an advanced motion-smoothing technology? I lean towards the latter, but either way: DLSS 4 MFG makes games look and feel so much better, and that’s the important part.
PCWorld video guru Adam Patrick Murray, who ran our benchmarks, normally sits on a couch during gaming sessions, using a small form-factor PC connected to a television. He never bothered turning on DLSS 3 Frame Gen; he felt the occasional visual glitches hurt more than the extra performance helped in that scenario. But he’s a big DLSS 4 MFG believer. “You can absolutely feel the difference now,” he told me, and he turned it on wherever possible during playtesting.
PCWorld contributor Will Smith, who is working on a deeper dive into DLSS 4, delivers even stronger praise: He reports that turning on DLSS 4 makes Star Wars Outlaws, a fun game prone to performance concerns, feel just as good as the legendary Doom 2016, which many gamers consider the paragon of fast-action shooters. “It’s like a whole new game,” he said. Cyberpunk 2077 also looked and felt smoother than ever, though to be fair, that game already handled really well.
Regardless of whether you consider AI frames to boost frame rates or smooth out motion, the end result is a masterpiece in action. Odd visual glitches seem to happen much less often now — though pumping out ray-traced Cyberpunk scenes at 249 frames per second gives your eyes much less time to try to notice them, as well.
I was concerned about added latency, but Will and Adam report that everything feels snappy and even better than usual, though they counsel that you’ll want native in-game frame rates close to 50 to 60fps before turning on MFG, or the latency can start to feel a little weird. That shouldn’t be a problem with the RTX 5090.
And that’s with using the standard version of Nvidia’s latency-reducing Reflex technology, which is needed to counter the latency introduced by adding AI frames that don’t respond to user input. The newer, more complex and performant Reflex 2 isn’t required for DLSS 4, just the original version.
That let Nvidia roll out a truly killer capability for DLSS 4: The ability to force DLSS 3 games to run DLSS 4 MFG instead using an override in the Nvidia app, rather than requiring developers to go back and update older titles. Because of this, you’ll be able to use 75 games and apps with DLSS 4 the day your new RTX 5090 shows up. That’s a huge improvement over the usual long, multi-month (or multi-year) rollout for new AI rendering technologies.
Bottom line: DLSS 4 is a stunning upgrade you must play around with to fully appreciate its benefits. It’s literally a game-changer, once again — though we’ll have to see if it feels this sublime on lower-end Nvidia cards like the more affordable RTX 5070.
We’re gonna need a bigger power supply
Okay, so you’re sold on the RTX 5090! Not so fast. You may need to adjust the rest of your PC build to accommodate it.
Fortunately, the meticulously designed Nvidia RTX 5090 Founders Edition has been engineered down to a slim, svelte two-slot solution that can easily slip into any PC (including Adam’s SFF rig). But adding all the extra hardware and then cranking up the juice means you may need a new power supply.
The GeForce RTX 5090 pushes the power supply requirement goalposts to 1,000W, up from the 4090’s 850W mandate. If you don’t want to use the included 12VHPWR adapter cables, consider picking up a PSU that ships with those included — though Nvidia has listened to past feedback and overhauled its fugly adapter to include a woven-braided design with longer cords. Hallelujah.
Despite its small stature, the RTX 5090 Founders Edition ran quietly enough in our testing, and didn’t heat up our rooms nearly as much as feared. Nvidia nailed the design on this one. The GPU hit 84 degrees Celsius in our testing, which is a bit on the warm side, but well within spec tolerances. Bigger 3- and 4-slot custom 5090s will no doubt tame temperatures in exchange for their larger size and higher sticker prices.
…and a bigger budget
If the RTX 4090 was over your budget, this will be too. The GeForce RTX 5090 Founders Edition starts at $1,999, but I expect it and the MSRP-priced custom cards we see at launch to disappear quickly, driving prices up. Go stand in line at midnight at Best Buy or prepare to battle masses of scalpers and AI researchers online if you want the 5090 day one.
Again, the 4090 is still going for $2,800 to $3,200 online right now because of its immense usefulness to AI researchers and developers — and the RTX 5090 obliterates it in that field.
Bottom line: Bleeding-edge performance at a bleeding-edge price
This is a weird review to have to wrap into a neat conclusion.
Adam Patrick Murray / Foundry
In a vacuum, the RTX 5090 delivers around a 30 percent average boost in gaming performance over the RTX 4090. That’s a solid generational improvement, but one we’ve seen throughout history delivered at the same price point as the older, slower outgoing hardware. Nvidia asking for an extra $500 on top seems garish and overblown from that perspective.
But the RTX 5090 isn’t like past generations. We’re in the AI era now, and AI professionals will no doubt sell their firstborn to get ahold of the awesomely powerful GPU, massive memory pool, and ferociously fast memory bandwidth — three crucial hardware considerations for the field. Given that, I expect prices for this monstrous graphics card to rise as rapidly as Godzilla does from Japan’s seas.
If you demand the very best of bleeding-edge gaming hardware, price be damned, you’ll drool over the GeForce RTX 5090. It’s built to handle anything and everything with aplomb, churning out ray-traced frames at a frenetic pace and then cranking smoothness way past 11 with DLSS 4’s magic.
Adam Patrick Murray / Foundry
While I wouldn’t recommend upgrading to this over the RTX 4090 for gaming (unless you’re giddy to try DLSS 4), it’s a definite upgrade option for the RTX 3090 and anything older. The 4090 was 55 to 83 percent faster than the 3090 in games, and the 5090 is about 30 percent faster than that, with gobs more memory.
At the end of the day, nobody needs a $2,000 graphics card to play games. But if you want one and don’t mind the sticker price, this is easily the most powerful, capable graphics card ever released. The GeForce RTX 5090 is a performance monster supercharged by DLSS 4’s see-it-to-believe it magic. Read...Newslink ©2025 to PC World | |
| | | PC World - 2 hours ago (PC World)This year at CES, Nvidia presented the next generation of its DLSS upscaling technology, which is trained with the help of artificial intelligence, alongside the new GeForce RTX 5090, 5080, and 5070 (Ti) graphics cards. The company touted its major advantages — and now that RTX 5090 reviews are live, we can confirm that DLSS 4 indeed feels like black magic, supercharging frame rates and making games feel just as snappy as the beloved Doom 2016.
That’s because DLSS 4 now supports Multi Frame Generation (MFG), an AI-based multiple intermediate frame calculation that can artificially generate up to three images and insert them between two “real” frames, thus quadrupling the frame rate. Of course, this feature only works on new Blackwell-based RTX 50-series GPUs.
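Conceptually, the output stream slots three generated frames in after each rendered one, which is where the quadrupled frame rate comes from. A toy illustration of the interleaving pattern (not Nvidia's implementation):

```python
def mfg_stream(rendered, ai_per_frame=3):
    """Interleave AI-generated frames after each classically rendered frame."""
    out = []
    for frame in rendered:
        out.append(frame)
        out.extend(f"{frame}-ai{i}" for i in range(1, ai_per_frame + 1))
    return out

print(mfg_stream(["R1", "R2"]))
# ['R1', 'R1-ai1', 'R1-ai2', 'R1-ai3', 'R2', 'R2-ai1', 'R2-ai2', 'R2-ai3']
```

Four displayed frames per rendered frame, hence 4x the frame rate for the same rendering work.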
But are the AI frames generated in this way a step forward, or is it all hogwash? Let’s take a close look at DLSS 4 and how its multi-frame generation works, along with some early impressions.
Nvidia DLSS 4 in detail
The 2-slot RTX 5090 Founders Edition (right) next to the 3-slot RTX 4090. Adam Patrick Murray / Foundry
Nvidia DLSS 4 builds atop the existing DLSS 3 and DLSS 3.5 feature set. It’s made up of the following five functions:
DLSS Deep Learning Anti-Aliasing (“DLAA”)
DLSS Multi Frame Generation (“MFG”)
DLSS Ray Reconstruction (“RR”)
DLSS Frame Generation (“FG”)
DLSS Super Resolution (“SR”)
The classic upscaling technology DLSS Super Resolution renders games at a lower internal resolution, then upscales to a higher output resolution with the help of AI. Super Resolution also works on older graphics cards from the GeForce RTX 20-, 30-, and 40-series, but it gets more complicated with the other DLSS features.
While the simple DLSS Frame Generation (FG) is reserved for the GeForce RTX 40- and 50-series graphics cards, the latest DLSS Multi Frame Generation feature is only supported by the newest generation. DLSS Ray Reconstruction, which improves visual fidelity in ray-traced games, is supported by all RTX graphics processors. Nvidia itself has graphically summarized the entire DLSS 4 feature set by generation.
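That support matrix can be summarized in a small lookup. This is an illustrative sketch built from the breakdown above, not anything from Nvidia’s actual tooling (the DLAA row assumes, per Nvidia’s own chart, that it works on all RTX generations):

```python
# Which DLSS 4 features each GeForce RTX generation supports,
# per the breakdown above (illustrative sketch only).
DLSS_SUPPORT = {
    "Super Resolution (SR)":        {"RTX 20", "RTX 30", "RTX 40", "RTX 50"},
    "Ray Reconstruction (RR)":      {"RTX 20", "RTX 30", "RTX 40", "RTX 50"},
    "Deep Learning AA (DLAA)":      {"RTX 20", "RTX 30", "RTX 40", "RTX 50"},
    "Frame Generation (FG)":        {"RTX 40", "RTX 50"},
    "Multi Frame Generation (MFG)": {"RTX 50"},
}

def supported_features(generation: str) -> list[str]:
    """Return the DLSS 4 features available on a given RTX generation."""
    return sorted(f for f, gens in DLSS_SUPPORT.items() if generation in gens)

print(supported_features("RTX 40"))
```

The takeaway: every RTX card gets the new Transformer-based upscaling, but only Blackwell cards get the headline Multi Frame Generation feature.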
Together with DLSS 4, Nvidia is also introducing a new Transformer AI model, which replaces the previously used Convolutional Neural Network (“CNN”) as the neural network behind DLSS. Nvidia promises gamers even better upscaling, more precise ray reconstruction, refined AI edge smoothing (DLAA), and more performance with the switch — and we can confirm it looks great in action. The new neural network’s advanced architecture uses the principle of “Deep Attention” along with global context to generate significantly sharper details and reduce artifacts such as ghosting.
What is Multi Frame Generation?
Nvidia DLSS 4 now supports Multi Frame Generation, an intermediate-image calculation that can artificially generate up to three AI-calculated frames and insert them between two classically rendered frames. Standard DLSS 3 Frame Generation only inserts a single AI frame between rendered images. Nvidia shows the difference in two easy-to-understand diagrams.
The new technology uses several neural networks, as well as the Nvidia Optical Flow Accelerator and the optimized tensor computing units inside the new Blackwell GPU architecture powering the RTX 50-series. Note that Multi Frame Generation is not backwards compatible with prior RTX generations. Nvidia promises huge leaps in performance from DLSS 4 with Multi Frame Generation, and it makes this clear using the example of the particularly performance-hungry action role-playing game Cyberpunk 2077.
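The frame-pacing arithmetic behind those claims is simple. Here’s a minimal sketch, assuming the GPU renders frames at a fixed rate and every generated frame is displayed:

```python
def displayed_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    """Effective display rate when N AI frames are inserted after each
    rendered frame. DLSS 3 FG inserts 1; DLSS 4 MFG inserts up to 3."""
    return rendered_fps * (1 + generated_per_rendered)

base = 60.0
print(displayed_fps(base, 1))  # DLSS 3 Frame Generation doubles the rate
print(displayed_fps(base, 3))  # DLSS 4 MFG can quadruple it
```

Note that only the rendered frames respond to your inputs, which is why Nvidia pairs frame generation with its Reflex latency-reduction technology.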
In PCWorld’s testing, both visual smoothness and responsiveness skyrocketed after flipping on DLSS 4’s Multi Frame Generation in Cyberpunk 2077 (using the RT Overdrive preset and 1.7x DLSS scaling). Without any sort of Frame Generation on, the game runs at 71fps; enabling DLSS 4 MFG lets it run an absolutely staggering 251 percent faster, and a just-as-insane 91 percent faster than the 5090 with DLSS 3’s single Frame Gen active. Hitting such high speeds looks and feels so good.
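Working backward from those percentages gives the approximate frame rates involved (a rough sketch; the review reports the baseline fps and the relative gains, so the absolute MFG and FG figures here are derived, not measured):

```python
base_fps = 71.0                    # RT Overdrive preset, no Frame Generation
mfg_fps = base_fps * (1 + 2.51)    # "251 percent faster" with DLSS 4 MFG
fg_fps = mfg_fps / (1 + 0.91)      # MFG is "91 percent faster" than DLSS 3 FG
print(f"DLSS 4 MFG: ~{mfg_fps:.0f} fps, implied DLSS 3 FG: ~{fg_fps:.0f} fps")
```

So the same card goes from roughly 71 fps rendered to around 249 fps displayed, with DLSS 3’s single-frame generation landing in between.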
But what effect do these so-called “fake frames” have on the gaming experience and image quality?
DLSS 4: Is it true progress or “fake frames”?
Gamers who use Multi Frame Generation must be aware that when it’s on, “artificial” frames are added to each “real” rendered frame. But is this ultimately real progress, as Nvidia CEO Jensen Huang described the feature at his contentious CES 2025 keynote, or is it just “fake frames,” as some enthusiasts have been shouting from the rooftops in online forums? Of course, the question deserves a nuanced look backed by independent testing. It’s understandable that buyers of a $1,999 graphics card want significantly better performance without “tricks” compared to the previous generation. (Fortunately, the RTX 5090 delivers solid traditional performance gains too.) But as the calendar flipped over to 2025, PCWorld declared that AI upscaling killed native graphics gaming. We’re better off for it.
After using it, DLSS 4 feels like it drives that theory home. Whether DLSS 4 adds true extra “frames” in the traditional sense or acts more like motion smoothing on steroids is an important technical distinction that demands further discussion and testing. But there’s no doubt that games look and feel so much smoother and snappier with DLSS 4 MFG and Nvidia Reflex active. It provides a big boost to the quality of your gaming experiences, period (though I’d still hesitate to turn it on in competitive multiplayer games).
We have more DLSS 4 coverage coming, but here’s a tease from our RTX 5090 review: Turning on DLSS 4 makes Star Wars Outlaws, a fun game prone to performance concerns, feel just as good as the legendary Doom 2016, which many gamers consider the paragon of fast-action shooters. “It’s like a whole new game,” PCWorld’s Will Smith says. It’s high praise coming from a guy who has been reviewing graphics cards and making games for decades.
An initial conclusion
The use of artificial intelligence and features built on it (like DLSS 4) should be welcomed when, for example, intelligent AI upscaling improves both picture quality and frame rate — as DLSS Super Resolution does. The intelligent and extremely powerful AI edge smoothing of DLAA is also a gain, especially for enthusiasts. What isn’t helpful is that Nvidia no longer compares its new generation of graphics cards against the previous generation’s native rendering performance, instead marketing somewhat “embellished” frame rate numbers with long bar charts. Comparisons of a GeForce RTX 4090 with DLSS 3 Frame Generation against a GeForce RTX 5090 with DLSS 4 Multi Frame Generation say nothing about the actual increase in rendering performance and are a source of uncertainty for buyers. The poor initial reception from gamers is understandable and justified.
That being said, now that we’ve tested it, DLSS 4 truly is a revolutionary, game-changing technology. Trying to show such a gigantic experiential leap forward is hard to do with simple frame rate graphs. I don’t like that Nvidia failed to show how the RTX 5090 compares to the 4090 in traditional performance during its blockbuster CES keynote, but it’s somewhat understandable now that we’ve experienced the difference with our own eyes and hands. DLSS 4 is that good.
Perhaps the bigger takeaway here? Hardware and software-side AI features such as upscaling, frame generation, latency improvers, and dedicated AI hardware are here to stay and will play an even greater role in the future. The days of native rendering are coming to an end. You certainly don’t have to like that, but I think enthusiasts need to start accepting it. The game has changed.
| | | BBCWorld - 3 hours ago (BBCWorld)It comes a day after owner OpenAI announced a massive investment in artificial intelligence (AI). Read...Newslink ©2025 to BBCWorld | |