Neurosymbolic AI at ACS 2020

10 August 2020

Today I presented a long paper, “Neurosymbolic AI for Situated Language Understanding,” at the Advances in Cognitive Systems conference, held virtually and hosted by the Palo Alto Research Center.

This was the third of three papers submitted as a postdoc and presented as a professor, and I’m very proud of this one. It’s a detailed yet concise summary of effectively the last five years of work at Brandeis developing situated grounding and embodied AI under the Communicating With Computers program, and it nicely lays out most of my graduate and postdoctoral career.

The work discussed in this paper forms the foundation of the work in the SIGNAL Lab. I believe neurosymbolic AI is just getting started, and it offers a number of opportunities for groundbreaking research that are barely on the horizon for the AI and cognitive systems communities. You can find the paper here, the slides here, and a video of the talk here.

(X-posted on signallab.ai)