We’ve been travelling quite a bit at the beginning of this year. One of the best events thus far has been the Re-Work Deep Learning and AI Assistant Summit that took place in San Francisco in January 2018. It hosted a variety of interesting exhibiting companies (including AlphaBlues 👍), close to 1,000 participants, and talks of high quality. I put down some observations from the whole trip to Silicon Valley, as they reflect the state of AI in the NLP & intent-understanding space we are working in.
- There were plenty of tech talks from highly technical people, but the underlying impression was that the majority of the tech is still in the research phase when it comes to understanding conversational dialogue. It is a tough problem, mainly because of the limitations we currently have in context understanding. There are approaches that condition on previous turns in the dialogue, but a common theme across the talks was that the lack of labelled training data is a major bottleneck.
- Several large tech companies are now in the voice assistant game with their home control devices. Voice is an upcoming trend, and as Google, Amazon and Apple fight for the prime spot in the living room, extracting meaning from voice becomes important. These companies get billions of queries each month (that is some sizeable training data), so with more and more use they come into possession of the largest training sets, and are thus best positioned to improve their systems.
- NLU is not limited to understanding real-time human-machine communication; it is also used for searching through logs of messaging content in team collaboration tools. This is an interesting angle, as there is a ton of message-based content left behind in various workplace tools (e.g. Slack). Making those queries intelligent requires understanding the meaning of messages and can help with information retrieval.
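To make the last point concrete, here is a minimal sketch of ranking archived messages against a query by similarity rather than exact keyword match. This is purely illustrative and not any particular product's method: real systems would use learned sentence embeddings, while this toy version stands in with bag-of-words cosine similarity, and all the function names and sample messages are made up.

```python
from collections import Counter
import math

def vectorize(text):
    """Turn a message into a bag-of-words count vector (toy stand-in
    for a learned sentence embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def search(messages, query, top_k=3):
    """Rank archived messages by similarity to the query,
    dropping messages with no overlap at all."""
    qv = vectorize(query)
    scored = sorted(((cosine(vectorize(m), qv), m) for m in messages),
                    reverse=True)
    return [m for score, m in scored if score > 0][:top_k]

messages = [
    "deploy failed on staging again",
    "lunch at noon anyone?",
    "the staging deploy is fixed now",
]
print(search(messages, "staging deploy status"))
```

Even this crude version surfaces the two deployment-related messages and skips the lunch chatter; the gap between this and genuinely understanding what a message *means* is exactly where the research effort described above is going.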