An anonymous reader writes: Last week, we learned that Microsoft spent hundreds of millions of dollars to buy tens of thousands of Nvidia A100 graphics chips so that partner OpenAI could train the large language models (LLMs) behind Bing's AI chatbot and ChatGPT.
Don't have access to all that capital or space for all that hardware for your own LLM project? Nvidia's DGX Cloud is an attempt to sell remote web access to the very same thing. Announced today at the company's 2023 GPU Technology Conference, the service rents virtual versions of its DGX Server boxes, each containing eight Nvidia H100 or A100 GPUs and 640GB of memory. The service includes interconnects that scale up to the neighborhood of 32,000 GPUs, storage, software, and "direct access to Nvidia AI experts who optimize your code," starting at $36,999 a month for the A100 tier.
Meanwhile, a physical DGX Server box can cost upwards of $200,000 for the same hardware if you're buying it outright, and that doesn't count the efforts companies like Microsoft say they made to build working data centers around the technology.