Problem being addressed
Hierarchical and power-law-structured data arise often in network learning, translation, and question-answering tasks. What is the most natural geometry for modelling these problems with a neural network?
This research combines ideas from hyperbolic geometry (as opposed to Euclidean geometry) with neural networks. The basic idea is that by embedding a neural network's activations into a hyperbolic space, the network can model data structures that are a more natural fit for that space: hyperbolic space expands exponentially with radius, mirroring the exponential growth of trees and hierarchies. The authors argue that many applications, including those from biology, physics, and social networks, are better modeled by neural networks whose operations have been transferred into this alternate geometric setting. They show that the resulting models, called Hyperbolic Attention Networks, perform well on tasks such as path length prediction in scale-free networks and question-answering tasks.
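To make the geometric intuition concrete, here is a minimal sketch, not taken from the paper, of the Poincaré-ball distance that hyperbolic embedding methods commonly build on; the function name is my own. The key property is that distances blow up near the boundary of the ball, which is what lets the exponentially growing node count of a tree or scale-free network embed with low distortion.

```python
import math

def poincare_distance(u, v):
    """Hyperbolic distance between two points inside the unit Poincare ball.

    d(u, v) = arcosh(1 + 2*||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    """
    sq = lambda x: sum(c * c for c in x)
    diff = sq([a - b for a, b in zip(u, v)])
    denom = (1.0 - sq(u)) * (1.0 - sq(v))
    return math.acosh(1.0 + 2.0 * diff / denom)

# Two pairs of points with the same Euclidean separation (0.1):
# the pair near the boundary is hyperbolically much farther apart.
near_center = poincare_distance([0.0, 0.0], [0.1, 0.0])
near_edge = poincare_distance([0.85, 0.0], [0.95, 0.0])
print(near_center, near_edge)  # near_edge > near_center
```

In a hierarchy, the root sits near the center and leaves near the boundary, so sibling leaves end up far apart while each stays close to its parent, which is exactly the structure attention over hierarchical data wants to exploit.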
Advantages of this solution
This is an interesting take on something fundamental: the question of what underlying space a neural network's representations live in. Where the ambient geometry matters to how the network functions, choosing a geometry that matches the data could improve the network's performance.
Solution originally applied in these industries
The paper draws its motivating examples and evaluations from biology, physics, and social networks, along with question-answering benchmarks.
Possible New Application of the Work
Network and hierarchical data sets play a role in the airline industry for route and fleet management. Applying hyperbolic methods to these problems could perhaps improve performance and results.
Social networks play an important role in recommender systems for the hospitality industry. This research showed that results on social network tasks improve with the hyperbolic approach, so it could perhaps improve the performance of restaurant recommendation engines like Yelp.
Telecommunications routing systems often take the form of scale-free or hierarchical networks. Algorithms for call routing and optimization could benefit from altering the underlying geometry as done in the paper.
Source URL: #############