#neuralEmpty

2023-12-06

Findings from #WMT23
Our Chat4 friend is in the winning group across tasks
Most submissions still train from scratch
Fewer constrained (low-resource) submissions than before
More test suite submissions!
Low resource results TBD (tech issue)
#EMNLP2023 #WMT #neuralEmpty #LLMs

The Data Therapist @datatherapist
2023-09-01

Hello, class A1
What possible factors do you think led to this magnificent failure?

2023-02-06

Few-shot learning almost reaches traditional machine translation

arxiv.org/abs/2302.01398
#enough2skim #NLProc #neuralEmpty
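The few-shot setup behind that claim can be sketched as a simple prompt builder; a minimal illustration, assuming a generic "source: / target:" template (not necessarily the paper's exact format), with the function name and languages chosen here for illustration:

```python
# Hedged sketch of few-shot prompting for machine translation:
# a handful of (source, target) example pairs are concatenated
# before the sentence to translate, and the model completes the
# final target line. Template and names are illustrative.
def build_few_shot_prompt(examples, source_sentence,
                          src_lang="English", tgt_lang="German"):
    """Assemble a few-shot translation prompt from (source, target) pairs."""
    lines = []
    for src, tgt in examples:
        lines.append(f"{src_lang}: {src}")
        lines.append(f"{tgt_lang}: {tgt}")
    # The sentence to translate, with an open target line for the model.
    lines.append(f"{src_lang}: {source_sentence}")
    lines.append(f"{tgt_lang}:")
    return "\n".join(lines)

examples = [
    ("Good morning.", "Guten Morgen."),
    ("Thank you very much.", "Vielen Dank."),
]
print(build_few_shot_prompt(examples, "Where is the station?"))
```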

2023-01-23

3 proposed causes of hallucinations were examined
only 2 held up

By studying how networks behave while hallucinating, they
filter hallucinations (with great success)

arxiv.org/abs/2301.07779
#NLProc #neuralEmpty #NLP #deepRead
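The general idea of filtering hallucinated translations can be illustrated with a model-confidence heuristic; this is a hedged sketch of that generic approach, not the paper's actual detector, and all names and the threshold value are assumptions:

```python
# Hedged sketch: flag translations the model itself assigns low
# probability, a common (generic) hallucination signal. This is NOT
# the method from the linked paper; threshold is illustrative.
def mean_log_prob(token_log_probs):
    """Average per-token log-probability of a generated translation."""
    return sum(token_log_probs) / len(token_log_probs)

def filter_hallucinations(candidates, threshold=-2.0):
    """Keep (text, token_log_probs) candidates above the confidence threshold."""
    return [text for text, lps in candidates if mean_log_prob(lps) > threshold]

candidates = [
    ("Das ist ein Test.", [-0.2, -0.3, -0.1, -0.4]),  # confident output
    ("Banane Banane Banane", [-3.5, -4.0, -3.8]),     # likely hallucination
]
print(filter_hallucinations(candidates))  # → ['Das ist ein Test.']
```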

The Data Therapist @datatherapist
2022-12-23

Bold statement (need to think about it more), especially when coming from a machine translation person.

I’d claim MT was no less revolutionary once it became pervasive in industry. But @marian_nmt seems to dismiss it now given ChatGPT

twitter.com/marian_nmt/status/

2022-12-06

At #conll #EMNLP, talk to me about:
ColD Fusion & ibm.github.io/model-recycling/
BabyLM shared task
label-sleuth.org/
Enhancing decoders with syntax

And work I guided (talk to them too):
Estimating #neuralEmpty quality from the source only
Controlling structure at the neuron level
Details:
