Voice-driven Knowledge Graph Journey with Neo4j and Amazon Alexa
  • Kyle McNamara

Updated: Apr 18

by GraphAware. Content reproduced with the permission of the copyright holder, GraphAware Ltd, UK. Source: (Original Content)


Hashtags: #neo4j #knowledgegraph


In 2016, 25% of web searches on Android were made by voice, and that share was predicted to double by 2018. From Amazon Alexa to Google Home, smartwatches and in-car systems, touch is no longer the primary user interface. In this talk, Alessandro and Christophe will demonstrate how graphs and machine learning are used to build an extracted, enriched graph representation of knowledge from a text corpus and other data sources. This representation is then used to map user intents expressed by voice to an entry point in the Neo4j-backed knowledge graph. Every subsequent user interaction must be taken into account, and the speakers will highlight why graphs are an ideal data structure for maintaining an accurate representation of the user's context, avoiding what is known as machine or bot amnesia. They will conclude the session by explaining how recommendation algorithms are used to predict the next steps of the user's journey.
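The flow the abstract describes — resolve a spoken intent to an entry point in the knowledge graph, then carry the user's position across turns so follow-up questions don't suffer from "bot amnesia" — can be sketched with a toy in-memory graph. This is a minimal illustration only: the entity names, relations, and `Session` class are hypothetical, and in the talk's architecture the graph would live in Neo4j rather than a Python dict.

```python
# Toy knowledge graph as adjacency lists keyed by relation type.
# In the architecture the talk describes, this role is played by Neo4j.
GRAPH = {
    "Neo4j": {"IS_A": ["GraphDatabase"], "BACKS": ["KnowledgeGraph"]},
    "GraphDatabase": {"EXAMPLE": ["Neo4j"]},
    "KnowledgeGraph": {"BACKED_BY": ["Neo4j"]},
}

class Session:
    """Keeps the user's current position in the graph across voice turns,
    so a follow-up like "what does it back?" resolves against context
    instead of forcing the user to repeat the entity."""

    def __init__(self):
        self.focus = None  # entry point resolved from the last intent

    def resolve_intent(self, entity):
        # Map the entity slot of a voice intent to a graph entry point.
        if entity in GRAPH:
            self.focus = entity
        return self.focus

    def follow_up(self, relation):
        # Answer a follow-up relative to the stored context node.
        if self.focus is None:
            return []
        return GRAPH.get(self.focus, {}).get(relation, [])

session = Session()
session.resolve_intent("Neo4j")         # "Tell me about Neo4j"
print(session.follow_up("BACKS"))       # follow-up reuses the stored context
```

The design point the abstract makes is visible even in this sketch: because the user's context is itself a node in the graph, each new turn is just another traversal from that node rather than a fresh, amnesiac lookup.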


Video and Slides


See the video and slides here on SlidesLive.


© 2020 by Graphable®.