Microsoft will make one of its most advanced AI tools available to developers as an open-source project on GitHub. As part of the project, Microsoft released examples of the AI techniques it is using, along with a video about the company’s artificial intelligence lab. Many of Microsoft’s advancements in artificial intelligence have been pioneered by Bing’s search team and researchers at Microsoft’s Asia research lab.
The algorithm, Space Partition Tree And Graph (SPTAG), allows developers to take advantage of the intelligence from deep learning models to search through billions of pieces of information, known as vectors, in milliseconds, according to a Microsoft blog post by Charlie Waldburger.
Using Vectors for Much Better Search
Microsoft began using vectors for better search, which searches on the underlying concept rather than the keywords, as a way to address many of the challenges arising from the evolution of search engines. Vector search relies on a deep learning technique Microsoft calls approximate nearest neighbor search.
Using vectors for better search becomes even more valuable as voice search grows in popularity. For example, let’s talk movie trivia. Keyword matching against query words wouldn’t find the film title Rocky IV, because most people searching verbally use the number 4, saying “Rocky 4” rather than searching on the Roman numeral. Little nuances like these have a big impact on voice search.
A vector is essentially a representation of a word, image pixel, or other data point, and it helps capture what a piece of information means. Thanks to advances in AI, Microsoft claims it can comprehend and represent search intent using these vectors. Once a numerical value has been assigned to a piece of information, vectors can be organized, or mapped, with close numbers positioned in proximity to one another to represent similarity. These proximal results are displayed to users, improving search outcomes.
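The mapping idea above can be sketched in a few lines of code. This is a minimal, illustrative example only: the item labels and 3-D vectors are invented, the search is brute-force and exact, and real systems like SPTAG use approximate search over learned, high-dimensional embeddings of billions of items.

```python
import math

# Hypothetical toy "index": each item maps to a made-up embedding vector.
# In a real vector search system these would be high-dimensional vectors
# produced by a deep learning model.
index = {
    "Rocky IV": [0.90, 0.10, 0.00],
    "Rocky 4": [0.88, 0.12, 0.02],
    "rocky coastline": [0.10, 0.90, 0.30],
}

def cosine_similarity(a, b):
    """Similarity of two vectors: 1.0 means pointing the same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def nearest(query_vector, index):
    """Brute-force (exact) nearest neighbor by cosine similarity."""
    return max(index, key=lambda key: cosine_similarity(query_vector, index[key]))

# A query vector that lands close to both "Rocky" movie entries,
# illustrating how "Rocky 4" and "Rocky IV" end up near each other.
query = [0.87, 0.15, 0.01]
print(nearest(query, index))  # prints "Rocky 4"
```

Because both movie titles sit close together in the vector space, a query near one is near the other, which is exactly what keyword matching on “4” versus “IV” fails to capture.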
How The Technology Behind Vector Search Started
The technology behind the vector search Bing uses got its start when company engineers began noticing unusual trends in consumers’ search patterns. In analyzing their logs, the Bing team discovered that search queries were getting longer and longer. Think about your own behavior when it comes to search queries: are your queries short or long? Mine are almost always long.
This suggests that consumers are asking more questions, describing things in greater detail because of bad experiences with keyword search, or trying to describe things the way a computer would, all of which is unnatural and uncomfortable for users. Bing’s vectorizing effort has extended to more than 150 billion pieces of data indexed by the search engine, bringing improvement over traditional keyword matching. These include words, characters, web page snippets, and full queries, along with other media. When a user searches, Bing can scan the indexed vectors and deliver the best match.
But as with any language and any search, there are additional challenges with words that have more than one meaning. For example, the word “bank” may refer to a riverbank or to a financial institution. A searcher’s query must carry context for the search to succeed.
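The role of context can be illustrated with a toy example. Here a simple bag-of-words vector stands in for the deep learning vectors Bing actually uses; the two snippets and the query are invented. The point is only that the words surrounding “bank” supply the context that picks the right sense.

```python
import math
from collections import Counter

# Hypothetical mini-corpus: two snippets using "bank" in different senses.
docs = {
    "river": "the river bank eroded after the flood water rose",
    "finance": "the bank approved the loan and opened a new account",
}

def bow_vector(text, vocab):
    """Turn text into a word-count vector over a shared vocabulary."""
    counts = Counter(text.split())
    return [counts[word] for word in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

query = "open a bank account"
vocab = sorted(set(" ".join(list(docs.values()) + [query]).split()))
query_vec = bow_vector(query, vocab)
scores = {name: cosine(query_vec, bow_vector(text, vocab))
          for name, text in docs.items()}
best = max(scores, key=scores.get)
print(best)  # prints "finance": "account" alongside "bank" supplies context
```

The word “bank” alone matches both snippets equally; it is the extra query word “account” that pushes the financial sense ahead, which is the disambiguation problem context solves.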
Context is King
In Waldburger’s blog post announcing the AI search code news, there is a video of Rangan Majumder, Group Program Manager for Microsoft’s Bing search and AI team, that helps put the Bing search algorithm into perspective. He explains that a stack of 150 billion business cards would stretch from the Earth to the Moon. Using SPTAG, Bing’s search can find 10 different business cards, one after another, within that stack almost instantly. And that’s why Microsoft is making this code available to developers.
According to Majumder:
“If we give it to the community, other people can use it and build their own search around vectors search. If we give it to the community, maybe they will contribute back.”
Microsoft executives hope that researchers and academics will use the code to explore other areas of search.
Why Microsoft Making AI Search Code Available is Important
In making its algorithm accessible to the public, Microsoft continues its broader journey from a closed ecosystem to one that is more accessible and inviting, which has likely factored into the brand’s resurgence over the last couple of decades. The fact that the code was uploaded to GitHub is emblematic of the company’s efforts to court the developer community, something that was once rare on Microsoft’s part. Developers will be able to use Microsoft’s vector search technology to build their own search engines, or help enhance it by submitting updates.
Outside traditional search, the Bing team predicts the technology will be used in business or consumer-facing software, such as identifying a spoken language from an audio snippet or determining an image’s content more quickly.
Microsoft going open source with SPTAG serves a real purpose: it means we may find out more quickly what customers and prospects are looking for based on their search queries and supply it more expeditiously. And in the cases where that’s not possible, we will waste fewer resources pursuing uninterested users. Going open source with SPTAG is a good faith gesture, but it also potentially extends traditional search in ways we can’t yet predict.
Uses for Visual and Audio Search
According to Microsoft’s blog post, the Bing team expects the open source offering could be used for enterprise or consumer-facing applications to identify a language being spoken based on an audio snippet, or for image-heavy services such as an app that lets people take pictures of flowers and identify what type of flower it is. For those types of applications, a slow or irrelevant search experience is frustrating. The team also is hoping that researchers and academics will use it to explore other areas of search breakthroughs.
Video: Rangan Majumder, Group Program Manager, Microsoft’s Bing Search and AI Team
Futurum Research provides industry research and analysis. These columns are for educational purposes only and should not be considered in any way investment advice.
Shelly Kramer is a Principal Analyst and Founding Partner at Futurum Research. A serial entrepreneur with a technology centric focus, she has worked alongside some of the world’s largest brands to embrace disruption and spur innovation, understand and address the realities of the connected customer, and help navigate the process of digital transformation. She brings 20 years' experience as a brand strategist to her work at Futurum, and has deep experience helping global companies with marketing challenges, GTM strategies, messaging development, and driving strategy and digital transformation for B2B brands across multiple verticals. Shelly's coverage areas include Collaboration/CX/SaaS, platforms, ESG, and Cybersecurity, as well as topics and trends related to the Future of Work, the transformation of the workplace and how people and technology are driving that transformation. A transplanted New Yorker, she has learned to love life in the Midwest, and has firsthand experience that some of the most innovative minds and most successful companies in the world also happen to live in “flyover country.”