Chat with RTX vs ChatGPT – how are they different?

What's the difference?

NVIDIA Chat with RTX compared to OpenAI ChatGPT.



Chat with RTX is the new AI chatbot from NVIDIA. Its ability to run locally on your own computer is a unique selling point that sets it apart from any other big-tech chatbot. But is that where the differences end? Both generative AI tools can answer questions about your own data, and both come from well-funded, recognizable names in big tech, so the list of differences and similarities is likely to grow over time. Let’s compare Chat with RTX vs ChatGPT.

Chat with RTX and ChatGPT overview

Chat with RTX is an AI chatbot released by NVIDIA on February 13th, 2024. In a general sense, it’s the same kind of tool as OpenAI’s ChatGPT, Microsoft Copilot, or Google Gemini (formerly Bard). However, you won’t find all of the same features between them. Each of these alternatives has pros and cons, but in terms of security and personalization, Chat with RTX looks like a strong contender right off the bat.

The new demo app allows you to personalize a large language model (LLM) with proprietary data from your computer. Upload files, such as documents, and the software will be able to answer questions about the content of those documents. It uses retrieval-augmented generation (RAG), TensorRT-LLM, and RTX acceleration technologies, giving NVIDIA ‘full-stack’ visibility because the same firm also produces the hardware this software runs on. The potential security benefits of AI on a closed system, unconnected to the internet, plus the efficiency of hardware and software that are deeply optimized for one another, will no doubt be an attractive USP for many users.
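The retrieval step behind RAG can be sketched in a few lines. This is a toy illustration, not NVIDIA’s implementation: real systems like Chat with RTX embed text with a neural model and hand the retrieved passage to an LLM, whereas here retrieval is plain word overlap and the “answer” simply surfaces the best-matching passage from your files.

```python
# Toy sketch of retrieval-augmented generation (RAG): find the passage
# from the user's documents most relevant to the question, then build an
# answer around it. Word-overlap scoring stands in for neural embeddings.

def tokenize(text):
    """Lowercase a string and split it into a set of words."""
    return set(text.lower().split())

def retrieve(question, passages):
    """Return the passage sharing the most words with the question."""
    q = tokenize(question)
    return max(passages, key=lambda p: len(q & tokenize(p)))

def answer(question, passages):
    """A real RAG pipeline would prompt an LLM with the retrieved
    context; here we simply return it."""
    context = retrieve(question, passages)
    return f"Based on your files: {context}"

docs = [
    "The quarterly report shows revenue grew 12 percent.",
    "The onboarding guide covers laptop setup and VPN access.",
]
print(answer("How much did revenue grow?", docs))
```

The key design point is that the model never needs your whole document set in its prompt: only the retrieved snippet is passed along, which is what lets a locally stored corpus stay private and still be queryable.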

By contrast, ChatGPT will work on your desktop PC regardless of which graphics card brand you have installed. You can run ChatGPT in any of the well-known browsers (Google Chrome, Mozilla Firefox, Microsoft Edge, Opera, or Safari) as long as you have an internet connection. This is where ChatGPT itself is limited because, without an internet connection, you won’t be able to use the service at all.

Key similarities

Both of these AI tools feature NLP (natural language processing) technology. As the most fundamental aspect of an AI chatbot, this allows the two services to understand your commands (known as text prompts). You can write in natural human language, as you might when texting a friend, and the AI system will ‘understand’ you. It can also write human-like text responses as a result.

In addition, ChatGPT and Chat with RTX can process files you choose to upload from your computer. These file types include .txt, .pdf, .doc, .docx, and .xml, although ChatGPT plugins can augment ChatGPT’s capabilities in this area.

Notable differences

Because Chat with RTX runs locally on your PC, response times can be faster than those of a server-based LLM. Client-side operations use the hardware sitting in front of you, whereas server-side operations send a request over your internet connection to be processed elsewhere, with the result then sent back. As a result, your internet connection becomes a factor in your response speed.
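The latency difference described above comes down to simple addition: a remote request pays the network round trip on top of compute time. A minimal sketch, with made-up illustrative numbers rather than measured ones:

```python
# Why local inference can respond sooner than a server-based chatbot:
# both pay compute time, but only the remote path adds a network round
# trip. All millisecond values here are illustrative, not benchmarks.

def local_latency_ms(compute_ms=50):
    """Local model: the GPU on your desk does the work directly."""
    return compute_ms

def remote_latency_ms(compute_ms=50, network_rtt_ms=80):
    """Server model: same compute, plus the request/response round trip."""
    return compute_ms + network_rtt_ms

print(local_latency_ms(), "ms local vs", remote_latency_ms(), "ms remote")
```

In practice the comparison is less clean, since a data-center GPU may compute faster than a consumer card, but the round-trip cost only ever applies to the server-based path.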

NVIDIA’s Chat with RTX is now free to download, giving anyone with an RTX GPU the power of on-device artificial intelligence.

Chat with RTX | ChatGPT
Created by NVIDIA | Created by OpenAI
Runs locally on your computer | Runs on a server
Requires an NVIDIA RTX 30 or 40 Series GPU, or an RTX Ampere or Ada Generation GPU, with at least 8GB of VRAM | No GPU requirements
Requires Windows 11 | No OS requirements
Requires 16GB of system memory (RAM) | No RAM requirements
No third-party plugins | Third-party plugins

Pricing

The Chat with RTX demo is free to download, though at just over 35GB it might take a while. ChatGPT also has a free-forever plan, in addition to three premium subscription tiers. As a result, you can use both generative AI tools for free.

However, access to the best OpenAI models comes with ChatGPT Plus ($20/month), Team ($25/month per user, minimum two users), Enterprise (scalable pricing), or via Microsoft Copilot. NVIDIA’s AI chatbot offers access to the Mistral model for free, which OpenAI’s alternative doesn’t. Multiple models based on Mistral 7B rank in the top 20 of the LMSYS Chatbot Arena Leaderboard, but OpenAI’s GPT-4 Turbo still holds the number one spot. As a result, ChatGPT’s pricing is still worth it for many users.

Conclusion

So if you were wondering how NVIDIA’s Chat with RTX differs from ChatGPT, there you have it. As mentioned, it’s still very early days for NVIDIA’s new AI, so we’re expecting it to be put through its paces very soon.

If you’re looking to explore some new AIs, check out our top picks for the best AI writers of the year, or our latest reviews – including our updated ChatGPT review.

Steve is the AI Content Writer for PC Guide, writing about all things artificial intelligence. He currently leads the AI reviews on the website.