
Photonics proves to be a hard nut to crack

The growing amount of energy required to train complex AI models like OpenAI’s ChatGPT could eventually collide with the limits of mainstream chip technology.
In a 2019 analysis, OpenAI found that from 1959 to 2012, the amount of electricity used to train AI models doubled every two years, and that after 2012 usage began growing sevenfold.
That growth is already creating strain. Microsoft is reportedly facing an internal shortage of the server hardware it needs to run its AI, and that scarcity is driving up prices. In conversations with analysts and technologists, CNBC estimated that training a ChatGPT-like model from scratch would currently cost over $4 million.
One proposed solution to the AI training dilemma is photonic chips, which use light rather than electricity to carry signals. In theory, photonic chips could deliver higher training performance because light generates less heat than electricity, travels faster, and is less susceptible to changes in temperature and electromagnetic fields.
Lightmatter, LightOn, Luminous Computing, Intel and NTT are among the companies developing photonic technology. But while the technology created a lot of buzz, and attracted a lot of investment, a few years ago, the industry has cooled significantly since then.
There are many reasons for this, but the overall message from investors and analysts who study photonics is that photonic chips for AI, while promising, are not the panacea they were once believed to be.