Four short links: 7 February 2019

VR, Learning Robot, Bubble Sort, and Graph Neural Networks

By Nat Torkington
February 7, 2019
  1. Hamlet in Virtual Reality — context for WGBH’s Hamlet 360. It’s 360º video, so you can choose what you look at, but not where you look from. Interesting work, and a reminder that we’re still trying to figure out what kinds of stories these media lend themselves to, and how best to tell stories with them.
  2. Self-Taught Robot Figures Out What It Looks Like and What It Can Do — To begin with, the robot had no idea what shape it was and behaved like an infant, moving randomly while attempting various tasks. Within about a day of intensive learning, the robot built up an internal picture of its structure and abilities. After 35 hours, the robot could grasp objects from specific locations and drop them in a receptacle with 100% accuracy. Paper is behind a paywall, though Sci-Hub has it.
  3. Bubble Sort: An Archaeological Algorithmic Analysis — Text books, including books for general audiences, invariably mention bubble sort in discussions of elementary sorting algorithms. We trace the history of bubble sort, its popularity, and its endurance in the face of pedagogical assertions that code and algorithmic examples used in early courses should be of high quality and adhere to established best practices. This paper is more an historical analysis than a philosophical treatise for the exclusion of bubble sort from books and courses. However, sentiments for exclusion are supported by Knuth: “In short, the bubble sort seems to have nothing to recommend it, except a catchy name and the fact that it leads to some interesting theoretical problems.” Although bubble sort may not be a best practice sort, perhaps the weight of history is more than enough to compensate and provide for its longevity. (A minimal Python sketch of the algorithm follows this list.)
  4. Comprehensive Survey on Graph Neural Networks — We propose a new taxonomy to divide the state-of-the-art graph neural networks into different categories. With a focus on graph convolutional networks, we review alternative architectures that have recently been developed; these learning paradigms include graph attention networks, graph autoencoders, graph generative networks, and graph spatial-temporal networks. We further discuss the applications of graph neural networks across various domains and summarize the open source codes and benchmarks of the existing algorithms on different learning tasks. Finally, we propose potential research directions in this fast-growing field. (A sketch of a single graph-convolution layer follows this list.)
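
For reference, a minimal bubble sort sketch in Python: not code from the paper, just the textbook algorithm the authors trace, with the usual early exit when a pass makes no swaps.

    def bubble_sort(items):
        """Sort a list in place with bubble sort: O(n^2) comparisons in the worst case."""
        n = len(items)
        for i in range(n - 1):
            swapped = False
            # Each pass bubbles the largest remaining element into position n - 1 - i.
            for j in range(n - 1 - i):
                if items[j] > items[j + 1]:
                    items[j], items[j + 1] = items[j + 1], items[j]
                    swapped = True
            if not swapped:  # no swaps means the list is already sorted
                break
        return items

    print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]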
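
And as a rough illustration of the graph convolutional networks the survey centers on, here is a minimal sketch of the widely used GCN propagation rule, ReLU(D^-1/2 (A + I) D^-1/2 H W), applied to a made-up toy graph; the features and weights are random placeholders, not anything from the survey.

    import numpy as np

    def gcn_layer(A, H, W):
        """One graph-convolution layer: ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
        A_hat = A + np.eye(A.shape[0])                    # add self-loops
        d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
        A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt          # symmetric normalization
        return np.maximum(0.0, A_norm @ H @ W)            # aggregate neighbors, transform, ReLU

    # Toy graph: three nodes in a path (0-1-2), two input features, four hidden units.
    A = np.array([[0., 1., 0.],
                  [1., 0., 1.],
                  [0., 1., 0.]])
    H = np.random.rand(3, 2)         # node feature matrix
    W = np.random.rand(2, 4)         # weight matrix (would be learned in practice)
    print(gcn_layer(A, H, W).shape)  # (3, 4)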