Can Global Semantic Context Improve Neural Language Models? I don't know, but that's the question asked and answered in the latest entry on Apple's Machine Learning Journal.
Today, most techniques for training word embeddings capture the local context of a given word in a sentence as a window containing a relatively small number of words (say, 5) before and after the word in question—"the company it keeps" nearby. For example, the word "self-evident" in the U.S. Declaration of Independence has a local context given by "hold these truths to be" on the left and "that all men are created" on the right.
In this article, we describe an extension of this approach to one that captures instead the entire semantic fabric of the document—for example, the entire Declaration of Independence. Can this global semantic context result in better language models? Let's first take a look at the current use of word embeddings.
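The "local context" the excerpt describes is just a fixed-size window of tokens on either side of a target word. Here's a minimal sketch of that idea in Python, using the article's own Declaration of Independence example (the window size of 5 and the tokenization are illustrative, not Apple's actual pipeline):

```python
# Minimal sketch of a local context window, the kind used when training
# word embeddings. Window size and tokenization are illustrative only.

def local_context(tokens, index, window=5):
    """Return up to `window` tokens on each side of tokens[index]."""
    left = tokens[max(0, index - window):index]
    right = tokens[index + 1:index + 1 + window]
    return left, right

sentence = ("We hold these truths to be self-evident that all men "
            "are created equal").split()
left, right = local_context(sentence, sentence.index("self-evident"))
# left  -> ['hold', 'these', 'truths', 'to', 'be']
# right -> ['that', 'all', 'men', 'are', 'created']
```

A global-context approach, by contrast, would condition on the whole document rather than a ten-word neighborhood like this one.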
It's heady stuff but a good read for anyone interested in how Apple is working to make Siri and systems like QuickType better.