I like the insights in this paper review: https://lnkd.in/ggCejTFy The conclusion applies to more than linguistics. In an age where most people think that modeling is purely a data+GPU game, having linguistic knowledge can help a lot in understanding the problem at hand, explaining results, and testing properly. Especially in the language technology space. Of course data+GPU is most likely just as important (as we are seeing with these massive LLMs, MT, and ASR models), but having both will still give you an edge. #ai #datahungry #gpuhungry #linguistics #asr #mt #llm #machinelearning
Since cognition is something LLMs lack, I'd bet that the cognitive sciences (which treat language as a cognitive phenomenon, not just a set of rules) are going to be relevant to the development of AI systems.
Here comes the novice… The question is no longer the syntax of semantics or the semantics of syntax: in an age of MMMs, linguistics is one part of a multimodal symphony in which meaning emanates from beyond the textual. (MMM = Multi-Modal Models.)
Good, I'm glad that my PhD in linguistics still looks directly relevant in at least some corners of the tech world.
Yes! Syntax can vary widely in ways that LLMs have yet to account for.
CTO @ Nym
Insightful! At Nym, I believe we are one of the few places that have both linguists and ML researchers (and ML researchers with a linguistics background), and I'm always amazed at the creative solutions we come up with and how much they shape the way we see data. A lot of the points in the paper really resonate with me.