r/nvidia 7d ago

Discussion Game Ready & Studio Driver 537.42 FAQ/Discussion

164 Upvotes

Game Ready & Studio Driver 537.42 has been released.

Article Here: https://www.nvidia.com/en-us/geforce/news/cyberpunk-2077-phantom-liberty-dlss-3-5-ray-reconstruction-game-ready-driver/

Game Ready Driver Download Link: Link Here

Studio Driver Download Link: Link Here

New features and fixes in driver 537.42:

Game Ready - This new Game Ready Driver provides the best gaming experience for the latest new games supporting DLSS 3.5 technology and DLSS Ray Reconstruction including Cyberpunk 2077: Phantom Liberty. Further support for new titles leveraging NVIDIA DLSS technology includes the launch of Warhaven and Witchfire which support DLSS 3, as well as the arrival of Party Animals which supports DLSS Super Resolution and NVIDIA Reflex technology.

Applications - The September NVIDIA Studio Driver provides optimal support for the latest new creative applications and updates. This includes DaVinci Resolve version 18.6 which features NVIDIA TensorRT acceleration as well as the latest Chaos Vantage update which introduces support for DLSS Ray Reconstruction.

Fixed Gaming Bugs

  • N/A

Fixed General Bugs

  • [Octane Render]: intersection shaders cause slowdown in performance [4164876]
  • [Octane Render]: inconsistent behavior and broken motion keys using TLAS with numKeys=2 [4088077]

Open Issues

  • [Halo Infinite] Significant performance drop is observed on Maxwell-based GPUs. [4052711]
  • [DaVinci Resolve] This driver implements a fix for creative application stability issues seen during heavy memory usage. We’ve observed some situations where this fix has resulted in performance degradation when running DaVinci Resolve. This will be addressed in an upcoming driver release. [4172676]

Additional Open Issues from GeForce Forums

Note: This is not new. Manuel from Nvidia has been tracking additional driver issues in a forum post separate from the release notes. The last driver post was the first time I included them here, and I will keep doing so moving forward.

  • Small checkerboard like pattern may randomly appear in Chromium based applications [3992875]
  • Some monitors may display random black screen flicker when in Display Stream Compression mode when using R530 drivers [4034096]
  • [GeForce RTX 4060] GPU monitoring utilities reporting incorrect idle power usage [4186490]
  • Event Viewer logs nvlddmkm error at the end of the OCCT video ram test when memory is full [4049182]
  • [GeForce GTX 10/RTX 20 series] PC may randomly freeze/bugcheck when Windows Hardware-Accelerated GPU Scheduling and NVIDIA SLI are both enabled [4009884]
  • Horizontal band may appear when cloning a G-SYNC display to HDMI monitor [4103923]
  • A new NVIDIA icon is created in the system tray each time a user switch takes place in Windows [4251314]
  • [Alienware X17 R2 w/ GeForce RTX 3080 Ti] Display goes blank when DirectX game is launched while notebook is in dedicated GPU mode [4146369]
  • [RTX 4060 Ti] Display may randomly flicker with a black bar on the top of the screen when using desktop apps [4239893]

Driver Downloads and Tools

Driver Download Page: Nvidia Download Page

Latest Game Ready Driver: 537.42 WHQL

Latest Studio Driver: 537.42 WHQL

DDU Download: Source 1 or Source 2

DDU Guide: Guide Here

DDU/WagnardSoft Patreon: Link Here

Documentation: Game Ready Driver 537.42 Release Notes | Studio Driver 537.42 Release Notes

NVIDIA Driver Forum for Feedback: Link Here

Submit driver feedback directly to NVIDIA: Link Here

RodroG's Driver Benchmark: TBD

r/NVIDIA Discord Driver Feedback: Invite Link Here

Having Issues with your driver? Read here!

Before you start - Make sure you Submit Feedback for your Nvidia Driver Issue

There is only one real way for any of these problems to get solved, and that’s if the driver team at Nvidia knows what those problems are. So if you are having problems with the drivers, please Submit Feedback to Nvidia so they know what’s going on. A guide to the information needed to submit feedback can be found here.

Additionally, if you see someone having the same issue you are having in this thread, reply and mention you are having the same issue. The more people that are affected by a particular bug, the higher the priority that bug will receive from NVIDIA!!

Common Troubleshooting Steps

  • Be sure you are on the latest build of Windows 10 or 11
  • Please visit the following link for the DDU guide, which contains detailed information on how to do a fresh driver install.
  • If your driver still crashes after a DDU reinstall, try going to Nvidia Control Panel -> Manage 3D Settings -> Power Management Mode: Prefer Maximum Performance

If it still crashes, we have a few other troubleshooting steps, but these are fairly involved and you should not attempt them if you do not feel comfortable. Proceed below at your own risk:

  • A lot of driver crashes are caused by the Windows TDR (Timeout Detection and Recovery) issue. There is a huge post on the GeForce forum about this here. The post dates back to 2009 (thanks, Microsoft) and the issue can affect both Nvidia and AMD cards.
  • Unfortunately, this issue can be caused by many different things, so it’s difficult to pin down. However, editing the Windows registry to raise the TDR timeout might solve the problem (see the sketch after this list).
  • Additionally, there is a tool made by Wagnard (the maker of DDU) that can be used to change this TDR value. Download here. Note that I have not personally tested this tool.
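The registry value involved is TdrDelay, a DWORD (in seconds) under HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers. Purely as an illustration (the 10-second value below is just an example, and Wagnard's tool above does the same thing with a GUI), a minimal Python sketch using the standard winreg module, run from an elevated prompt and followed by a reboot, would look like this:

    import winreg

    TDR_KEY = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

    def read_tdr_delay():
        # Returns the current TdrDelay in seconds, or None if the value is unset
        # (Windows then falls back to its default of 2 seconds).
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, TDR_KEY) as key:
            try:
                value, _ = winreg.QueryValueEx(key, "TdrDelay")
                return value
            except FileNotFoundError:
                return None

    def set_tdr_delay(seconds=10):
        # Example only: raises the TDR timeout. Requires administrator rights
        # and a reboot to take effect.
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, TDR_KEY, 0,
                            winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "TdrDelay", 0, winreg.REG_DWORD, seconds)

    if __name__ == "__main__":
        print("Current TdrDelay:", read_tdr_delay())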

If you are still having issues at this point, visit the GeForce Forum for support or contact your manufacturer for RMA.

Common Questions

  • Is it safe to upgrade to <insert driver version here>? The fact of the matter is that results will differ from person to person due to different configurations. The only way to know is to try it yourself. My rule of thumb is to wait a few days; if there’s no confirmed widespread issue, I’ll try the new driver.

Bear in mind that people who have no issues tend not to post on Reddit or forums. Unless there is significant coverage of a specific driver issue, chances are the driver is fine. Try it yourself; you can always DDU and reinstall the old driver if needed.

  • My color is washed out after upgrading/installing the driver. Help! Try going to the Nvidia Control Panel -> Change Resolution -> Scroll all the way down -> Output Dynamic Range = FULL.
  • My game is stuttering when processing physics calculations. Try going to the Nvidia Control Panel, open the Surround and PhysX settings, and ensure the PhysX processor is set to your GPU.
  • What does the new Power Management option “Optimal Power” mean? How does it differ from Adaptive? The new power management mode is related to what was said in the GeForce GTX 1080 keynote video. To further reduce power consumption while the computer is idle and nothing is changing on the screen, the driver will not make the GPU render a new frame; instead, it takes the already rendered frame from the framebuffer and outputs it directly to the monitor.
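Conceptually (an illustrative Python sketch only, not actual driver code), the difference between the two modes looks something like this:

    def frames_rendered(refreshes, screen_changed, optimal_power):
        # Count how many times the GPU actually renders across `refreshes` scan-outs.
        renders = 0
        last_frame = None
        for i in range(refreshes):
            if not optimal_power or last_frame is None or screen_changed(i):
                last_frame = f"frame {i}"   # GPU wakes up and renders new content
                renders += 1
            # Either way, the (possibly cached) frame is scanned out to the monitor.
        return renders

    if __name__ == "__main__":
        # Pretend the desktop only changes once per second on a 60 Hz display.
        changed = lambda i: i % 60 == 0
        print("Adaptive:     ", frames_rendered(600, changed, optimal_power=False), "renders")
        print("Optimal Power:", frames_rendered(600, changed, optimal_power=True), "renders")

With nothing changing on screen, Optimal Power simply re-scans the cached frame instead of waking the GPU, which is where the extra idle power savings come from.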

Remember, driver code is extremely complex and there are billions of different possible configurations. The software will not be perfect and there will be issues for some people. For a more comprehensive list of open issues, please take a look at the Release Notes. Again, I encourage folks who installed the driver to post their experience here... good or bad.

Did you know NVIDIA has a Developer Program with 150+ free SDKs, state-of-the-art Deep Learning courses, certification, and access to expert help? Sound interesting? Learn more here.


r/nvidia 22d ago

Discussion September RTX Updates & Games Discussion - What have you been playing?

17 Upvotes

September RTX Updates

This portion will continuously be updated throughout the month as we get more updates

September 19 Update - https://www.nvidia.com/en-us/geforce/news/dlss-3-5-available-september-21/

  • NVIDIA DLSS 3.5
    • Launches September 21st
  • Cyberpunk 2077
    • Upgrades To DLSS 3.5 On September 21st
  • Chaos Vantage
    • Upgrades To DLSS 3.5 On September 21st
  • NVIDIA Omniverse
    • DLSS 3.5 This October
  • The First Descendant
    • Open Beta Begins Today, Featuring DLSS 3
  • Warhaven
    • Launches September 20th With DLSS 3
  • Witchfire
    • Enters Early Access September 20th With Support For DLSS 3 & Reflex
  • Lies of P
    • Launches Today With DLSS 2
  • Party Animals
    • Launches September 20th With DLSS 2 & Reflex

----------

September 12 Update - https://www.nvidia.com/en-us/geforce/news/dlss-3-icarus-dlss-2-ad-infinitum/

  • ICARUS
    • DLSS 3 Update Available Now
  • Ad Infinitum
    • Launches September 14th With DLSS 2 
  • Mortal Kombat 1
    • Launches With DLSS 2 On September 19th
  • Warstride Challenges
    • Exits Early Access With Support For DLSS 2 & Reflex
  • Arcadegeddon
    • Available Now With DLSS 2
  • Starsiege: Deadzone
    • Available Now With DLSS 2

----------

September 5 Update - https://www.nvidia.com/en-us/geforce/news/dlss-anticipated-games-launching-september-2023/

  • SYNCED
    • Launches September 8th With DLSS 3 
  • Witchfire
    • Enters Early Access On September 20th With Support For DLSS 3 & Reflex
  • Lies of P
    • Launches September 19th With DLSS 2
  • Party Animals
    • Launching September 20th With DLSS 2 & Reflex

Games Discussion - What Have You Been Playing?

Feel free to comment on the games you've been playing and how you're enjoying them. If you have games you're looking forward to, feel free to post them too!! We're always curious what everyone is playing. To start, here's what the mods have been playing:

/u/RenegadeAI

  • "Still no time for video games"

/u/Nestledrink

  • Baldur's Gate 3
  • Tales of Berseria

/u/itbefoxy

  • Starship Troopers Extermination

/u/LooniLuna

  • "Lol. No Games"

/u/Nekrosmas

  • Wargame: Red Dragon
  • F1 Manager 2023
  • World of Tanks

r/nvidia 16h ago

Discussion The Gigabyte OC low profile 4060 is way smaller than I expected. It's tiny. (3 photos)

466 Upvotes

The MSI Ventus 3X is a 3080 To


r/nvidia 15h ago

News Nvidia Offices in France Raided as Part of Inquiry, WSJ Says

finance.yahoo.com
250 Upvotes

r/nvidia 17h ago

Rumor NVIDIA GeForce RTX 50 "GB202" GPU rumored to feature 192 SMs and 512-bit memory bus, according to "kopite7kimi" - VideoCardz.com

videocardz.com
338 Upvotes

r/nvidia 11h ago

Discussion Ray Reconstruction ON/OFF comparison in Cyberpunk regular RT

53 Upvotes

r/nvidia 12h ago

Benchmarks In Cyberpunk/Phantom Liberty you can gain double digit framerates by simply using DLSS Balanced vs Quality with no obvious loss in image quality with RT Overdrive Ultra + Ray Reconstruction at 1440p

imgsli.com
71 Upvotes

r/nvidia 1d ago

Rumor Batman Arkham Knight Steam Update Hints At RTX Remaster

tech4gamers.com
542 Upvotes

r/nvidia 16h ago

Question Is this connection fine or are the cables bent too much?

57 Upvotes

Is this fine or am I putting too much strain on the connector?

It's the adapter that came with the MSI Ventus X3 4090.


r/nvidia 17h ago

Question Is 2080ti worth it in 2023? Found one on FB for $270.

46 Upvotes

Initially I was looking at a 3070, but the extra VRAM of the 2080ti is more appealing to me since I run at 4k. Planning to meet with the guy today, gonna put it on a test bench and benchmark before handing over my money.


r/nvidia 4h ago

Discussion Further observations of DLSS Ray Reconstruction - Increased Ray Counts

3 Upvotes

Some modding tools for Cyberpunk 2077 have been updated for the version 2.0 patch, which allows for easy testing of certain things that were difficult before. So today I will be looking at one of these: the impact of an increased ray count, and the resulting decrease in noise, on DLSS Ray Reconstruction.

Cyberpunk 2077 has a "Path Tracing" mode that traces rays through the scene to calculate lighting data and how it intersects with materials. Typically the number of rays cast is quite low to keep performance high (because tracing more rays, and subsequently doing more shading, requires more processing time). This leads to an issue called "noise", where the information collected from the low number of rays is highly inconsistent from pixel to pixel. To work around this noise issue, we use denoisers: tools that blur the lighting data, combine lighting data over time, and/or predict lighting data based on the noisy inputs. DLSS Ray Reconstruction is a denoiser (plus upscaler?).
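To make the noise-vs-denoising idea concrete, here's a toy Python sketch (nothing to do with DLSS's actual internals): each pixel's lighting is estimated from a handful of noisy ray samples, and a crude "combine over time" accumulator blends those estimates across frames. More rays per pixel means each individual frame is closer to the truth, so less temporal blending is needed.

    import random

    TRUE_LIGHT = 0.6      # the "correct" lighting value the rays are trying to estimate
    NOISE_PER_RAY = 0.3   # how far off a single ray sample can be

    def noisy_estimate(rays_per_pixel):
        # Average a few noisy ray samples; more rays -> less pixel-to-pixel noise.
        samples = [random.gauss(TRUE_LIGHT, NOISE_PER_RAY) for _ in range(rays_per_pixel)]
        return sum(samples) / len(samples)

    def temporal_denoise(frames, rays_per_pixel, blend=0.1):
        # A crude "combine over time" denoiser: exponential moving average of the
        # per-frame estimates. Low ray counts lean harder on this history, which is
        # where slow convergence, sluggish response to lighting changes, and ghosting
        # come from.
        accumulated = noisy_estimate(rays_per_pixel)
        for _ in range(frames - 1):
            accumulated = (1 - blend) * accumulated + blend * noisy_estimate(rays_per_pixel)
        return accumulated

    if __name__ == "__main__":
        for rays in (1, 2, 8):
            single = noisy_estimate(rays)
            denoised = temporal_denoise(frames=60, rays_per_pixel=rays)
            print(f"{rays} rays/px: single frame {single:.3f}, "
                  f"after 60 frames {denoised:.3f} (target {TRUE_LIGHT})")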

Anyway, different denoisers behave differently depending on how noisy the input data is (e.g. a denoiser designed for low ray counts might over-blur details in high-ray-count renders, while a denoiser designed for high ray counts handles those better but does a much worse job of producing a "convincing" result when working with low ray counts). So I wanted to see how DLSS Ray Reconstruction handles this.

From my testing I noticed that increasing the ray count:

  1. Improves the quality of some effects while in motion (e.g. some reflections show more detail as you move). This is expected: there is a higher signal-to-noise ratio, meaning the denoiser doesn't have to blur, combine, or predict as much information, so it can show more "correct" information.
  2. Improves the convergence time while stationary (e.g. when you stop moving, some lighting effects converge to a "high quality denoised" result; increasing the ray count decreases the time required to converge to this image). This is expected: a higher signal-to-noise ratio means the denoiser can use fewer frames to gather the information required to create that "converged" result.
  3. Improves the responsiveness of surfaces to changes in lighting (e.g. when a light changes colour, an increased ray count results in surfaces reacting faster to the change). This is expected: a higher signal-to-noise ratio leads to reduced denoising and reduced reliance on temporal information.
  4. Reduces some denoising-based ghosting on some objects (e.g. ghosting on a bottle rolling across the floor is slightly reduced). This is kind of expected: a higher signal-to-noise ratio leads to reduced denoising and reduced reliance on temporal information, which reduces ghosting. Note: there is still quite a bit of ghosting, it's just slightly reduced on some objects.
  5. Reduces disocclusion artifacts (e.g. artifacts/noise are reduced on surfaces that were first exposed in the current frame). This is expected: a higher signal-to-noise ratio leads to a higher quality denoised result when there is no temporal information available to improve the quality.
  6. Reduces "swimming" or "boiling" artifacts on some surfaces, and subtly changes the brightness or colour of the lighting on those surfaces. This is expected: a higher signal-to-noise ratio leads to more accurate denoising, reducing artifacts.
  7. Increases "boiling" and "swimming" artifacts on some surfaces in certain situations. My guess is that DLSS Ray Reconstruction reduces the amount of denoising it does in response to a higher quality input, but reduces it too far, so the remaining noise negatively impacts the final result.
  8. Sometimes increasing the ray count doesn't increase the quality of the denoised result while in motion on some surfaces. My guess for why this happens is similar to the previous point: DLSS Ray Reconstruction is probably reducing the amount of denoising (and the temporal information it uses) in response to a higher quality input, which results in an image that's more responsive to changes in lighting but otherwise of similar quality to a lower sample count. Since the changes in lighting aren't drastic, you don't notice that advantage, and the denoised result looks similar to before.
  9. DOES NOT remove the "oil painting" look of DLSS Ray Reconstruction.
  10. Reduces performance, meaning some things that rely on temporal information can see a reduction in quality (e.g. some ghosting is actually worse with a higher ray count, probably because the reduced frame rate reduces the quality of the temporal upscaling part of DLSS Ray Reconstruction).

Note: Most of my tests were done at 2560x1440 with DLSS Super Resolution Quality mode and DLSS Ray Reconstruction turned on, on an RTX 4090. I primarily tested 2 rays per pixel (which seems to be the default) vs 8 rays per pixel. I chose 8 rays per pixel because performance is still high enough that the motion between frames remains relatively low, allowing DLSS Ray Reconstruction to easily reuse some temporal information while in motion.

Now, there is an interesting point I wanted to bring up from the previous paragraph: increasing the ray count reduces frame rates. DLSS Ray Reconstruction has a temporal component, and reducing the frame rate decreases its ability to use temporal information due to the increased motion between frames. So is this trade-off between performance and the ability to use temporal information worth it? The answer changes from person to person based on their performance and preferences, but I believe it is worth it to some extent. If you have the performance headroom, increasing the ray count from 2 to 4 or even 8 could be worth it. However, there are diminishing returns as the ray count increases, so going above values like 8 may not be worth it while using a denoiser.
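As a rough rule of thumb (standard Monte Carlo behaviour, not something measured from DLSS itself), noise falls off with the square root of the ray count, which is where the diminishing returns come from:

    import math

    BASE_RAYS = 2  # the apparent default rays per pixel

    for rays in (2, 4, 8, 16, 32):
        # Relative noise compared to the 2-ray baseline scales as sqrt(base / rays).
        relative_noise = math.sqrt(BASE_RAYS / rays)
        print(f"{rays:>2} rays/px -> ~{relative_noise:.2f}x the noise of {BASE_RAYS} rays/px")

Going from 2 to 8 rays roughly halves the noise, but going from 8 to 32 only halves it again while costing far more frame time.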

For anyone interested, here's a video showing a few examples comparing the different ray counts: scenes that benefit, a scene with almost no noticeable benefit, and scenes where increasing the ray count is partially detrimental. https://youtu.be/L4frHe_j8t8


r/nvidia 1h ago

Question RTX 4080 (12 GB) VS RTX 3080 Ti (16 GB)

Upvotes

Hi everyone,

I'm planning to get a gaming laptop (my current laptop has Intel HD 4000 graphics... yeah, so any gaming laptop will be a significant upgrade).

I'm hesitating between these two laptops:

GIGABYTE AORUS 17H BXF (NVIDIA GeForce RTX 4080 12GB GDDR6 - Intel Core i7-13700H)

and this one:

MSI Stealth GS66 (RTX 3080 Ti Max-Q 16GB GDDR6 - Intel Core i7-12700H)

I am a newbie to this and I don't know which option is better. Any input will help; thank you all.


r/nvidia 1h ago

Question Squeaky whisper "mouse like" sound from my GPU fan - 3080FE (Not coil whine)

Upvotes

This is driving me crazy.

I have this random squeaky "mouse like" sound from my GPU fan. (Not coil whine which I'm familiar with)

I've tried cleaning the GPU using a compressed air can, but the sound is still there. I'm 100% sure it's from the GPU fan because zero RPM mode eliminates the sound completely. Anything else I can try besides replacing the fans? (I'm pretty sure I'll screw something up if I try to disassemble it.)


r/nvidia 14h ago

Question Will GPUs like 4090/80/70 have black friday deals?

11 Upvotes

This might be a dumb question, but I’ve never paid attention to new-gen GPUs around Black Friday. I was wondering if it’s the norm to see bundles or deals on high-end GPUs around that time of year.


r/nvidia 11h ago

Question Will this 600w power supply run this 3080?

7 Upvotes

Will this prebuilt work with its GPU and power supply?

Hi folks. I'm looking to buy this PC because it's on sale for a thousand off and the specs look great for 1700 USD. However, I noticed the power supply is only 600W and it has a 3080. Can that power this kind of rig? Should I cancel the order, or do the specs make sense?

I can't imagine they'd sell a prebuilt that couldn't run properly.

https://www.bestbuy.ca/en-ca/product/hp-omen-25l-gaming-pc-intel-core-i7-12700f-1tb-ssd-32gb-ram-geforce-rtx3080-only-at-best-buy/16691486?source=category&adSlot=3
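For a rough sanity check (ballpark figures I'm assuming here, not HP's actual tuned limits for this OMEN), you can add up worst-case component draws and compare against the PSU rating:

    # Ballpark worst-case draws in watts (assumed typical values, not measured
    # from this specific HP OMEN 25L configuration).
    components = {
        "RTX 3080 (reference TGP)": 320,
        "Core i7-12700F (turbo)":   180,
        "Motherboard/RAM/SSD/fans":  75,
    }

    psu_watts = 600
    total = sum(components.values())
    print(f"Estimated worst-case draw: {total} W")
    print(f"PSU rating: {psu_watts} W -> headroom: {psu_watts - total} W")

    # Transient spikes on the 3080 can briefly exceed its rated power, which is
    # why OEM prebuilts usually ship with tuned power limits rather than letting
    # every part hit its worst case at once.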

P.S. I know it's more efficient in the long run to build my own, but my life is hectic and my old PC broke, so I just want to get back to gaming.


r/nvidia 4h ago

Question Non popular brands of 4090 and which one to buy

0 Upvotes

Hey there guys, it is finally time to upgrade my 960 to something much better after God knows how many years.

In my country there is quite a big difference in price between some 4090 models, and since my budget is right around MSRP, I have the choice of:

Inno3D X3 OC and Zotac Trinity

The rest of the brands like MSI / Asus / Gigabyte are priced absurdly high and definitely out of my price range. Zotac and Inno3D are considered low-quality, less popular brands here, hence the lower price, which makes them borderline MSRP.

Even though this is the most frequently asked question, I still want to check whether there is some update about the quality and build of the above-mentioned cards that I have missed, and most of all, which one would be the better choice?

Thanks a lot for your help in advance 😁


r/nvidia 19h ago

Rumor GB202 could have an H100-like structure, with a GPC:TPC:SM layout of 12:8:2 giving 192 SMs

Thumbnail
x.com
15 Upvotes

My Thoughts: The H100 Hopper datacenter chip and the AD102 consumer Ada Lovelace chip were drastically different approaches to making a 144-SM GPU, and the different scaling, as well as the different internal architecture, deserved different names.

The minimum unit of an Nvidia GPU is the Streaming Multiprocessor, the SM. This is where the math happens, for both integer and float.

Two SMs are paired into a structure called a TPC, which houses specialized units like texture units.

The highest level, and what companies like Apple and Intel would call the GPU core, is for Nvidia the GPC. GPCs distribute work across the GPU and house units like ROPs.

Both H100 and AD102 have 144 SMs. H100 is 9 GPC x 8 TPC x 2 SM; AD102 is 12 GPC x 6 TPC x 2 SM; GB202 might be 12 GPC x 8 TPC x 2 SM.
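The SM totals fall straight out of that hierarchy; a trivial sketch of the multiplication:

    # SMs = GPCs x TPCs-per-GPC x SMs-per-TPC
    configs = {
        "H100":            (9, 8, 2),   # 144 SMs
        "AD102":           (12, 6, 2),  # 144 SMs
        "GB202 (rumored)": (12, 8, 2),  # 192 SMs
    }

    for name, (gpcs, tpcs, sms) in configs.items():
        print(f"{name}: {gpcs} GPC x {tpcs} TPC x {sms} SM = {gpcs * tpcs * sms} SMs")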


r/nvidia 9h ago

Question Upgrade from 1080 to rtx 3070?

2 Upvotes

So I've been running a 1080 for a good while now and it's been surprisingly good at keeping up with modern titles, but I think it's about time for an upgrade.

For reference, I use a 1440p monitor and have a Ryzen 5 3600X. I know the absolute basics about builds but am otherwise pretty tech-naive. My question is: would it be worthwhile to upgrade to a 3070 with my build, or is there a better option for me? Price-wise my budget is around 500 (maybe 600 max), and the 3070 seems to tick all the boxes, but I haven't heard too much about this card. What's everyone's experience with it?


r/nvidia 5h ago

Discussion RTX 6000 Ada cards for MSRP

1 Upvotes

Anyone know where I can buy an RTX 6000 Ada for MSRP? Most places online that sell it to the average consumer (non-corporate entity) seem to sell it at a severely marked-up price of $10,000+.

I do super heavy 3D work and need that 48GB of VRAM for 3D rendering; the 4090 is too weak for my purposes.


r/nvidia 11h ago

PSA Quick reminder to 10xx and 20xx series card owners: CLEAN YOUR CARDS! Also, quick question...

3 Upvotes

Some days ago I decided to pull the trigger and remove my good old reliable 1080Ti for a couple of days to have it cleaned up/repasted/repadded completely. It had been a couple of years since I last cleaned it up and I had never repadded it since its launch in 2017 (2016? Can't recall).

Safe to say it sounded like a jet engine, and it was definitely thermal throttling because it would hit 93C under load at 80-100% fan speed. The sound was almost unbearable (especially considering the 10xx FE series has a blower cooler), but at that point I had grown accustomed to it, since it had been running like that for SO LONG. Only now that it is soooooo quiet do I notice how much of a turbojet engine it sounded like before.

I took some before and after pictures, and I also have a quick question to ask. First, the befores:

Crunchyyyy!

SO. MUCH. DUST.

Now it's all ready to go:

Yes I know the blue pads are kinda meh but it's what we have here...

Now for a question (or a bunch of them): I noticed the PCB coating near the GPU core itself has a lot of discoloration, and the board itself is also really discolored on its side (see pics below). Is this a product of the card running so hot for such a long time? Is it expected? Is it just due to the card's age? Should I expect it to die soon because of it? It hasn't shown ANY signs of defects or strange behavior whatsoever, but these spots made me afraid it is counting down the seconds until it craps itself. Here's what I'm talking about:

Discoloration on PCB near GPU core

Compared to a random GPU core we had lying around

This one scares me as well. The PCB profile around the area of the core is much more orange than the parts away from it, which are yellow

Please take care of your cards, guys. Make them last. I'm itching so much to upgrade, but I'll get every last drop of performance from this beast of a 1080 Ti until it dies on me. This card has done so much over the years: it mined, gamed, smashed VR on my Quest 2, and rendered workloads gracefully. It's honestly the GOAT!


r/nvidia 1d ago

News Best Buy has 4090 FE in stock

179 Upvotes

FYI, just grabbed mine


r/nvidia 9h ago

Discussion Ray reconstruction on vs off comparison in Cyberpunk 2077

youtu.be
0 Upvotes

r/nvidia 10h ago

Question 4070 Ti: Asus Strix (non OC) vs Gigabyte Eagle OC vs Gigabyte Aero OC vs MSI Gaming X trio

0 Upvotes

Hi! I've been using the same GPU for 6 years, so it's time for an upgrade. Which of the 4 options above is the best one for me to choose, disregarding prices?

Take into consideration: General Performance, Coil Whine and Temps

Please tell me about your personal experience in case you own one of the models! Thanks in advance!

Notes: I'll be gaming in 1440p and I'm using a Core i7 12700KF


r/nvidia 1d ago

News Counter-Strike 2 with NVIDIA Reflex and GeForce RTX 40 Series GPUs - The Lowest System Latency & Highest FPS

nvidia.com
114 Upvotes

r/nvidia 2d ago

Discussion Cyberpunk 2077 Has The Best Graphics I've Ever Seen in Any Game. Playing in 4K ULTRA + Max Ray Tracing - RTX 4070Ti - 60FPS

1.2k Upvotes

r/nvidia 20h ago

Question Nvidia Jetson Xavier with a Tesla P40 strapped to it

4 Upvotes

The Setup: This is an unusual idea, but I have an Xavier AGX and a Tesla P40 PCIe card. I was thinking about building an "AI" server, if you will, that would run image and language models that I would interact with either by web UI or API (for home automation stuff). The Xavier would be a nice, small, power-friendly machine to run this on. I would need to use external power for the GPU tumor on the side, but I would probably 3D print a nice small case (and fan) for the GPU, etc.

The Question: I've run the P40 on my desktop (Threadripper) and it worked just fine; however, when I've tried to place it in other desktops, Linux/Windows complain about the device not having enough resources available to start. I am assuming they mean PCIe lanes, as the P40 takes advantage of the whole x16 PCIe slot.

Does the Jetson Xavier AGX PCIe slot allow full x16 access, or would I be wasting my time trying to run these two devices together?

*Side note* I've run Stable Diffusion etc. on the Xavier natively, but it was WAY slow. Language models also need a lot of /fast/ VRAM and run nicely on the P40. I know it is an unusual question, but I'm sure there are some lurkers here with the knowledge that can save me a lot of time.
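For what it's worth, once the card is plugged in you can check what the slot actually negotiates under Linux, since the link width and speed are exposed in sysfs (quick sketch below; the PCI address is just an example, substitute the P40's address from lspci):

    from pathlib import Path

    # Replace with the P40's address as reported by lspci (example address below).
    PCI_ADDR = "0000:01:00.0"

    def read_attr(name):
        # Standard PCIe sysfs attributes: current/max link width and speed.
        path = Path(f"/sys/bus/pci/devices/{PCI_ADDR}/{name}")
        return path.read_text().strip() if path.exists() else "n/a"

    if __name__ == "__main__":
        print("max link width:     x" + read_attr("max_link_width"))
        print("current link width: x" + read_attr("current_link_width"))
        print("max link speed:     " + read_attr("max_link_speed"))
        print("current link speed: " + read_attr("current_link_speed"))

lspci -vv reports the same information in its LnkCap/LnkSta lines; even if the link trains at fewer than 16 lanes, the card can still operate, just with less host-to-GPU bandwidth.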

https://preview.redd.it/aeidupg6k0rb1.jpg?width=2727&format=pjpg&auto=webp&s=63b51398f5be864e876de5ce64453bf7fdd7ecfa

https://preview.redd.it/q5bqnwsyh0rb1.jpg?width=1918&format=pjpg&auto=webp&s=9fa891bce061bd2b9013c5daea5b7e4387f80ba2


r/nvidia 1d ago

News Guerrilla Games and Nixxes are bringing Horizon Forbidden West: Complete Edition to PC in early 2024

blog.playstation.com
404 Upvotes