Prof. Dr. Udo Kruschwitz has held the Chair of Information Science at the University of Regensburg since July 2019. Before that, he was a professor in the School of Computer Science and Electronic Engineering at the University of Essex.
His research focuses on the intersection of Information Retrieval (IR) and Natural Language Processing (NLP). He has led research projects developing algorithms that transform unstructured and partially structured textual data into structured knowledge and user or cohort models, which have been applied in a variety of areas including search, navigation and summarization.
There is another reason, though, to take AI seriously, and that is the paradigm shift we have witnessed in the last few years. Neural approaches have largely replaced traditional statistical as well as rule-based approaches when you look at the state-of-the-art performance of machine learning algorithms across a broad range of applications. Three key factors have made this development possible: massively increased computing power (e.g. via GPUs), the availability of training data orders of magnitude larger than before and, finally, the emergence of scalable software tools accessible to anyone interested in implementing modern machine learning algorithms. This development can be nicely illustrated by recent advances in natural language processing (NLP). The paradigm of choice for a multitude of NLP problems is an architecture called BERT. Note that the paper this architecture is based on was only published in 2019 at one of the top NLP conferences. As of today it has already been cited 17,623 times according to Google Scholar. In other words, the state of the art in NLP has shifted dramatically within less than three years, and there is no end in sight to this trend.
Mr. Kruschwitz, thank you very much for taking the time to answer these questions. We wish you a nice day!