#opticalcomputing

Blue Headline - Tech News @BlueHeadline
2025-04-03

💡 AI trained with lasers? Yes, and it cuts energy use by 95%.

Researchers just trained a 1B-parameter Transformer using optical chips—no backprop, no burnout. Their approach, called ODFA, uses scattered light for fast, parallel training.

It works across language, vision, and even diffusion models.

Would you use optics to scale your next AI model?

🔗 Read the full breakdown:
blueheadline.com/tech-breakthr
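For context: assuming ODFA follows the usual direct feedback alignment (DFA) recipe, the output error is sent to each layer through a fixed random projection instead of being backpropagated through the weights, and scattered light can compute that random projection in a single optical pass. Below is a minimal numpy sketch of plain DFA on a toy two-layer network; the optical scattering medium is replaced by an ordinary random matrix, and all sizes are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network trained with direct feedback alignment (DFA):
# the output error reaches the hidden layer through a FIXED random matrix B
# rather than through W2.T, so no sequential backward pass is needed.
# Optically, a scattering medium plays the role of B in a single pass of light.
n_in, n_hid, n_out = 32, 64, 4
W1 = rng.standard_normal((n_in, n_hid)) * 0.1
W2 = rng.standard_normal((n_hid, n_out)) * 0.1
B = rng.standard_normal((n_out, n_hid))          # fixed random feedback

X = rng.standard_normal((256, n_in))             # toy inputs
Y = np.eye(n_out)[rng.integers(0, n_out, 256)]   # random one-hot targets

lr = 0.05
for step in range(500):
    h = np.tanh(X @ W1)                  # forward pass
    logits = h @ W2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)    # softmax
    e = (p - Y) / len(X)                 # output error (cross-entropy grad)
    dh = (e @ B) * (1 - h ** 2)          # DFA: random feedback, not W2.T
    W2 -= lr * (h.T @ e)                 # local weight updates
    W1 -= lr * (X.T @ dh)
```

Because every layer's update depends only on the randomly projected output error, all layers can be updated in parallel, which is what makes the scheme a natural fit for fast optical hardware.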

2025-02-07

🥰 #call4reading

✍️ Synthesis of #Binary-Input Multi-Valued Output #Optical Cascades for Reversible and #Quantum Technologies, by Ishani Agarwal, Miroslav Saraivanov, and Marek Perkowski

🔗10.26421/QIC24.15-16-2 (#arXiv:2410.18367)

#opticalcomputing #quantumlayout

Odin Nash @OdinNaah
2024-08-13

🕝 In a twist of irony, U.S. sanctions have fueled the rise of light-based AI! Discover how this innovation in optical computing is poised to revolutionize the future of AI. Read more on "Into the Mind of AI" ;)

intothemindofai.blog/2024/08/1

2024-06-14

[PhD position, short deadline]
If you are interested in doing a PhD on #OpticalComputing and #MachineLearning, Mickael Mounaix and I now have an open position!
Apply or contact us for more info.
Boosts appreciated for reach.
scholarships.uq.edu.au/scholar

2024-06-04

If you are looking for a #PhD and are interested in working on #OpticalComputing for #MachineLearning (and in spending some time in the UK and some time in Australia), contact me!
The funding is secured, but the official advert is not out yet. I will update when it is (the deadlines are going to be short).

2023-02-22

"Optical transformers". Anderson et al. 2023 arxiv.org/abs/2302.10360

"we performed small-scale optical experiments with a prototype accelerator to demonstrate that Transformer operations can run on optical hardware despite noise and errors."

The paper estimates a possible energy-efficiency advantage of 8,000x over conventional GPUs, with room for more.
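The practical question behind that quote is how much a Transformer's matrix multiplies degrade when they run on low-precision, noisy analog hardware. Here is a toy numpy model of that trade-off; the 5-bit quantization and 2% read-out noise are illustrative assumptions, not numbers from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def optical_matmul(x, W, bits=5, noise=0.02):
    """Toy model of an optical matrix multiply: operands are quantized
    (limited modulator precision) and the analog accumulation picks up
    Gaussian read-out noise."""
    scale = 2 ** (bits - 1) - 1
    xq = np.round(np.clip(x, -1, 1) * scale) / scale
    Wq = np.round(np.clip(W, -1, 1) * scale) / scale
    y = xq @ Wq
    return y + noise * np.std(y) * rng.standard_normal(y.shape)

# Compare one attention-style projection, exact vs. "optical".
d = 64
x = rng.standard_normal((8, d)) / np.sqrt(d)
W = rng.standard_normal((d, d)) / np.sqrt(d)
exact = x @ W
noisy = optical_matmul(x, W)
rel_err = np.linalg.norm(noisy - exact) / np.linalg.norm(exact)
print(f"relative error from quantization + noise: {rel_err:.3f}")
```

The paper's experimental point is that Transformer inference can tolerate errors of roughly this kind; the energy claim then comes from how cheaply optics performs the multiply itself.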

#transformers #OpticalComputing #MachineLearning

2022-12-14

“The speed record for data transmission using a single light source and optical chip has been shattered once again. Engineers have transmitted data at a blistering rate of 1.84 petabits per second (Pbit/s), almost twice the global internet traffic per second.” newatlas.com/telecommunication #tech #technology #internet #broadband #data #bandwidth #opticalcomputing #datatransmission #opticalchip

2022-12-01

@virginiaheffernan The trend of exponential development in info production and organization has been consistent through five computing paradigms, and is evident in multiple digital tech streams. The log-log trend extends from the Big Bang commons.m.wikimedia.org/wiki/F #quantumcomputing #3Dtransistors #dnacomputing #memristors #EvolutionaryComputing #opticalcomputing #graphene

heise online (unofficial) @heiseonline@squeet.me
2022-11-25
heise+ | Computing with light: a thousand times faster than electrons in specialized domains

Using light beams for computation is not new. For the first time, there are real chances of outpacing long-established silicon electronics in certain areas.
2021-10-01

Fourier Transforms (and More) Using Light

Linear transforms -- like the Fourier transform -- are a key mathematical tool in engineering and science. A team from UCLA recently published a paper describing how they used deep learning techniques to design an all-optical implementation of arbitrary linear transforms. The technique doesn't use any conventional processing elements and instead relies on diffractive surfaces. They also describe a "data-free" design approach that does not rely on deep learning.

There is obvious appeal to using light to compute transforms: the computation occurs at the speed of light and in a highly parallel fashion. A complete system cascades multiple diffractive surfaces to produce the result.

The deep learning design the paper's authors describe was set up in TensorFlow using the Adam optimizer. It appears that the paper relies on simulations of the diffractive surfaces, not a physical implementation. We aren't sure how hard it would be to fabricate high-resolution diffractive surfaces with the very specific patterns the designs call for.
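To get a flavor of how such a design optimization might look, here is a hedged TensorFlow sketch in the same spirit: a few trainable phase-only masks, separated by a fixed unitary mixing step standing in for free-space propagation, are optimized with Adam so that the cascade approximates a target transform (here the unitary DFT). The propagation model, sizes, and layer count are our own illustrative assumptions, not the paper's.

```python
import numpy as np
import tensorflow as tf

N, LAYERS = 16, 4

# Target transform: the unitary N-point DFT. Any other N x N matrix
# could be substituted as the transform the "optics" should realize.
target = tf.constant(np.fft.fft(np.eye(N), norm="ortho"), tf.complex64)

# Trainable phase-only masks; each surface multiplies the field by exp(i*phi).
phases = [tf.Variable(tf.random.uniform([N], 0.0, 2 * np.pi))
          for _ in range(LAYERS)]

def mix(fields):
    # Crude stand-in for free-space propagation between surfaces: a fixed
    # unitary FFT mixing step. A real design would use angular-spectrum
    # propagation over the actual layer spacing.
    return tf.signal.fft(fields) * tf.cast(1.0 / np.sqrt(N), tf.complex64)

def forward(fields):                     # fields: [batch, N], complex
    for phi in phases:
        fields = fields * tf.exp(tf.complex(tf.zeros_like(phi), phi))
        fields = mix(fields)
    return fields

opt = tf.keras.optimizers.Adam(0.05)
probes = tf.constant(np.eye(N), tf.complex64)    # basis vectors as inputs

for step in range(1000):
    with tf.GradientTape() as tape:
        realized = forward(probes)               # row i holds M @ e_i
        loss = tf.reduce_mean(tf.abs(realized - tf.transpose(target)) ** 2)
    opt.apply_gradients(zip(tape.gradient(loss, phases), phases))

print("final mean-squared error:", float(loss))
```

Whether a handful of masks can realize a given transform at all is exactly the design question the paper studies; their "data-free" method sidesteps this kind of iterative optimization entirely.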

If you are looking to get started with TensorFlow yourself, we've covered quite a few tutorials. On the other hand, we talk quite a bit about Fourier transforms, too.

#news #science #fouriertransform #light #opticalcomputing

