OpenAI has begun exploring alternatives to Nvidia chips for artificial intelligence workloads. According to Reuters, which cites no fewer than eight sources, the developer of ChatGPT is dissatisfied with some of Nvidia’s recent decisions and is reevaluating its hardware strategy. Driving the shift is the growing role of specialized chips for AI inference, the stage at which a model generates responses to user requests.
Sources indicate that AI inference is becoming a new battleground in the chip market. Although Nvidia continues to dominate the training of large AI models, the search by OpenAI and other companies for alternatives is seen as a serious challenge to its position. The situation is unfolding amid ongoing negotiations over investments and the redistribution of interests within the AI ecosystem.
Nvidia CEO Jensen Huang previously stated that the company intends to participate in OpenAI’s financing, calling such investments beneficial, although he did not name specific amounts. An investment of up to $100 billion had previously been discussed; however, according to The Wall Street Journal, those plans were put on hold at the end of January due to Nvidia’s internal doubts about the deal’s terms.
OpenAI’s search for alternative suppliers could include partnerships with companies such as AMD or Intel, both of which have been making strides in AI chip technology. Experts believe the shift could redefine competitive dynamics in the market, pushing Nvidia to innovate further to retain its dominance. Some commentators note that diversification by OpenAI may lead to more competitive pricing and faster innovation in the AI chip sector, ultimately benefiting the broad range of tech businesses that rely on AI.