China's New Taichi Photonic Chip Could Help Close the Gap to Nvidia's AI Accelerators
#AI #Photonics #OpticalComputing #Semiconductors #AIchips #Tsinghua #EnergyEfficiency #DeepLearning #AIGC #TaichiChip #AIAccelerators
💡 AI trained with lasers? Yes, and it cuts energy use by 95%.
Researchers just trained a 1B-parameter Transformer using optical chips—no backprop, no burnout. Their approach, called ODFA, uses scattered light for fast, parallel training.
It works across language, vision, and even diffusion models.
Would you use optics to scale your next AI model?
🔗 Read the full breakdown:
https://blueheadline.com/tech-breakthroughs/ai-optical-chips/
#AI #MachineLearning #OpticalComputing #EnergyEfficiency #DeepLearning
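For the curious: ODFA is an optical take on direct feedback alignment (DFA), in which the output error is delivered to each hidden layer through a fixed random projection instead of being backpropagated through transposed weights; the optical version realizes that projection with scattered light. Below is a minimal, purely electronic NumPy sketch of the DFA update rule — the network size, toy teacher data, and random feedback matrix are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of direct feedback alignment (DFA), the training rule that
# ODFA carries out in optics: each hidden layer receives the output error
# through a fixed random projection (in ODFA, light scattered through a
# diffusive medium) rather than through a layer-by-layer backward pass.
# Sizes, data, and the feedback matrix are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 64, 128, 10

W1 = rng.normal(0.0, 0.1, (n_hid, n_in))
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))
B1 = rng.normal(0.0, 0.1, (n_hid, n_out))        # fixed random feedback path
teacher = rng.normal(0.0, 0.1, (n_out, n_in))    # toy target function

lr = 0.01
for step in range(1001):
    x = rng.normal(size=n_in)
    y = teacher @ x                              # toy regression target

    # Forward pass
    a1 = W1 @ x
    h1 = np.maximum(a1, 0.0)                     # ReLU
    y_hat = W2 @ h1

    # Output error of a squared-error loss
    e = y_hat - y

    # DFA: route the output error straight to the hidden layer via B1 --
    # no transpose of W2 and no backward sweep through the network.
    delta1 = (B1 @ e) * (a1 > 0)

    # Local, layer-wise updates
    W2 -= lr * np.outer(e, h1)
    W1 -= lr * np.outer(delta1, x)

    if step % 200 == 0:
        print(f"step {step:4d}  per-sample loss {0.5 * float(e @ e):.4f}")
```

Because the feedback path is fixed and random, it maps naturally onto multiple scattering of light, which is what lets the error projection run in parallel at low energy per operation.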
Optical computing for advanced data crunching:
#nphardproblems #mathematics #photonics #opticalcomputing #research
🤓
https://techxplore.com/news/2024-10-hard-problems-3d-photonics.html
🕝 In a twist of irony, U.S. sanctions have fueled the rise of light-based AI! Discover how this innovation in optical computing is poised to revolutionize the future of AI. Read more on "Into the Mind of AI;)" #AI #OpticalComputing #Innovation #Technology
http://intothemindofai.blog/2024/08/13/sanctions-fuel-innovation-rise-light-based-ai/
[PhD position, short deadline]
If you are interested in doing a PhD on #OpticalComputing and #MachineLearning, Mickael Mounaix and I now have an open position!
Apply or contact us for more info.
Boost appreciated for reach.
https://scholarships.uq.edu.au/scholarship/quex-phd-scholarship
If you are looking for a #PhD and are interested in working on #OpticalComputing for #MachineLearning (and to spend some time in the UK and some time in Australia), contact me!
Got the funding but the official advert is not out yet. Will update when it is (but the deadlines are going to be short).
FatNet marks a significant step forward in the realm of AI and computing https://innovationtoronto.com/2023/05/fatnet-marks-a-significant-step-forward-in-the-realm-of-ai-and-computing/
"Optical transformers". Anderson et al. 2023 https://arxiv.org/abs/2302.10360
"we performed small-scale optical experiments with a prototype accelerator to demonstrate that Transformer operations can run on optical hardware despite noise and errors."
Claims a possible energy-efficiency advantage of 8,000x over conventional GPUs, with room for more.
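As a back-of-the-envelope illustration of what "despite noise and errors" means, the sketch below pushes one attention head's matrix multiplies through a noisy stand-in for an analog optical multiplier and measures how far the output drifts from the exact result. The Gaussian noise model, its magnitude, and the helper names are assumptions for illustration, not the paper's hardware characterization.

```python
# Toy check: run single-head scaled dot-product attention through a matrix
# multiply that models an imperfect analog/optical accelerator with additive
# read-out noise, and compare against the exact computation.
import numpy as np

rng = np.random.default_rng(0)

def optical_matmul(A, B, noise_std=0.01):
    """Matrix product plus additive Gaussian noise, standing in for an
    analog optical multiply-accumulate unit."""
    exact = A @ B
    scale = noise_std * np.abs(exact).mean() + 1e-12
    return exact + rng.normal(0.0, scale, exact.shape)

def attention(Q, K, V, matmul=np.matmul):
    """Scaled dot-product attention; 'matmul' lets us swap in the noisy op."""
    scores = matmul(Q, K.T) / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return matmul(weights, V)

seq, d = 16, 32
Q, K, V = (rng.normal(size=(seq, d)) for _ in range(3))

exact = attention(Q, K, V)
noisy = attention(Q, K, V, matmul=optical_matmul)

rel_err = np.linalg.norm(noisy - exact) / np.linalg.norm(exact)
print(f"relative error with noisy 'optical' matmuls: {rel_err:.3%}")
```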
“The speed record for data transmission using a single light source and optical chip has been shattered once again. Engineers have transmitted data at a blistering rate of 1.84 petabits per second (Pbit/s), almost twice the global internet traffic per second.” https://newatlas.com/telecommunications/optical-chip-fastest-data-transmission-record-entire-internet-traffic/ #tech #technology #internet #broadband #data #bandwidth #opticalcomputing #datatransmission #opticalchip
@virginiaheffernan The trend of exponential development in info production and organization has been consistent thru five computing paradigms, and is evident in multiple digital tech streams. The log-log trend extends from the Big Bang https://commons.m.wikimedia.org/wiki/File:ParadigmShiftsFrr15Events.svg #quantumcomputing #3Dtransistors #dnacomputing #memristors #EvolutionaryComputing #opticalcomputing #graphene
Fourier Transforms (and More) Using Light
Linear transforms -- like a Fourier transform -- are a key math tool in engineering and science. A team from UCLA recently published a paper describing how they used deep learning techniques to design an all-optical solution for arbitrary linear transforms. The technique doesn't use any conventional processing elements; instead, it relies on diffractive surfaces. They also describe a "data-free" design approach that does not rely on deep learning.
There is obvious appeal to using light to compute transforms: the computation occurs at the speed of light and in a highly parallel fashion. A complete system cascades multiple diffractive surfaces to produce the result.
The deep learning the authors refer to was set up in TensorFlow using the Adam optimizer. It appears the paper relies on simulations of the diffractive surfaces rather than a physical implementation, and we aren't sure how hard it is to fabricate high-resolution diffractive surfaces with the very specific patterns the designs call for.
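To show the shape of that design loop, here is a toy, self-contained stand-in: a cascade of fixed "propagation" matrices with trainable phase-only masks in between is tuned by gradient descent so that the end-to-end operator approximates a target transform (a small unitary DFT). The paper's pipeline uses TensorFlow with Adam and a physical free-space diffraction model; this sketch substitutes random unitary matrices for propagation and a hand-derived gradient with plain gradient descent, purely for illustration.

```python
# Toy diffractive-design loop: tune phase masks so that
#   T(phi) = P2 · diag(e^{i phi1}) · P1 · diag(e^{i phi0}) · P0
# approximates a target linear transform (here, the unitary DFT).
import numpy as np

rng = np.random.default_rng(0)
N = 16                                              # optical modes / "pixels"

def random_unitary(n):
    """Random unitary matrix, a crude stand-in for free-space propagation."""
    q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
    return q

P = [random_unitary(N) for _ in range(3)]           # fixed propagation sections
phases = [rng.uniform(0, 2 * np.pi, N) for _ in range(2)]  # trainable masks
target = np.fft.fft(np.eye(N)) / np.sqrt(N)         # unitary DFT as the goal

def mask(phi):
    return np.exp(1j * phi)                         # unit-modulus phase mask

def end_to_end(phases):
    T = P[0]
    for Pk, phi in zip(P[1:], phases):
        T = Pk @ (mask(phi)[:, None] * T)
    return T

lr = 0.05
for step in range(2001):
    T = end_to_end(phases)
    E = T - target                                  # complex residual
    loss = np.sum(np.abs(E) ** 2)

    # Everything applied before mask k (B_k) and after it (A_k), so that
    # T = A_k · diag(e^{i phi_k}) · B_k.
    B = [P[0], P[1] @ (mask(phases[0])[:, None] * P[0])]
    A = [P[2] @ (mask(phases[1])[:, None] * P[1]), P[2]]

    # Analytic gradient of the Frobenius-norm loss w.r.t. the phases of mask k:
    #   dL/dphi_k = -2 · Im( e^{i phi_k} ⊙ diag(A_kᵀ · conj(E) · B_kᵀ) )
    for k in range(2):
        g = -2 * np.imag(mask(phases[k]) *
                         np.diagonal(A[k].T @ np.conj(E) @ B[k].T))
        phases[k] = phases[k] - lr * g

    if step % 500 == 0:
        print(f"step {step:5d}  loss {loss:.3f}")
```

With only two small phase masks the fit stays approximate, which is part of why the paper cascades several diffractive surfaces, each with many more "pixels", to realize arbitrary transforms.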
If you are looking to get started with TensorFlow yourself, we've covered quite a few tutorials. On the other hand, we talk quite a bit about Fourier transforms, too.