Can Global Semantic Context Improve Neural Language Models? That's the question asked, and answered, in the latest entry on Apple's Machine Learning Journal:
Today, most techniques for training word embeddings capture the local context of a given word in a sentence as a window containing a relatively small number of words (say, 5) before and after the word in question—"the company it keeps" nearby. For example, the word "self-evident" in the U.S. Declaration of Independence has a local context given by "hold these truths to be" on the left and "that all men are created" on the right.
In this article, we describe an extension of this approach to one that captures instead the entire semantic fabric of the document—for example, the entire Declaration of Independence. Can this global semantic context result in better language models? Let's first take a look at the current use of word embeddings.
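To make the local-context idea concrete, here is a minimal sketch of pulling a fixed-size window around a target word, using the Declaration of Independence example from the quote above (the window size of 5 is illustrative, not a detail of Apple's actual setup):

```python
def local_context(tokens, index, window=5):
    """Return the words within `window` positions before and after tokens[index]."""
    left = tokens[max(0, index - window):index]
    right = tokens[index + 1:index + 1 + window]
    return left, right

declaration = ("we hold these truths to be self-evident "
               "that all men are created equal").split()
left, right = local_context(declaration, declaration.index("self-evident"))
# left  -> ['hold', 'these', 'truths', 'to', 'be']
# right -> ['that', 'all', 'men', 'are', 'created']
```

A global-context approach, by contrast, would condition the embedding on the whole document rather than on a window like this.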
It's heady stuff but a good read for anyone interested in how Apple is working to make Siri and systems like QuickType better.