Research

We believe that a better understanding of the brain plays a vital role in developing intelligent systems. As rapid progress is made in artificial intelligence (AI), it is crucial to study the brain for ideas that can accelerate and guide AI research. We therefore seek to develop novel deep learning algorithms inspired by brain function. Our lab is interested in, but not limited to, the following research areas.

  • Brain-Inspired Artificial Intelligence

    The brain provides a rich source of inspiration for new types of algorithms and architectures for artificial intelligence (AI), and it can also validate AI techniques that already exist: if a known algorithm is subsequently found to be implemented in the brain, that is strong support for its plausibility as an integral component of a general intelligence system. Our view is that leveraging insights gained from neuroscience research will expedite progress in AI. Just as convolutional neural networks were inspired by the brain's visual system, understanding brain function is key to developing novel deep learning models. We therefore study neuroscience to understand the brain and develop brain-like neural networks based on how it works (a minimal convolutional-network sketch follows this list).

  • Generative AI

    Although deep learning has revolutionized computer vision, current approaches have several major problems: typical vision datasets are labor-intensive and costly to create while teaching only a narrow set of visual concepts; standard vision models are good at one task and one task only, and require significant effort to adapt to a new task; and models that perform well on benchmarks can perform disappointingly poorly on stress tests, casting doubt on the entire deep learning approach to computer vision. To address these problems, we study neural networks that are trained on a wide variety of images with the wide variety of natural language supervision that is abundantly available on the internet, and that generate images of high visual quality from text prompts (see the contrastive image-text sketch after this list).

  • Neural Text Generation

    Neural text generation is a type of language modelling problem. Language modelling is the core problem for a number of natural language processing tasks such as speech-to-text, conversational systems, and text summarization. A trained language model learns the likelihood of a word occurring given the preceding sequence of words. Language models can operate at the character level, n-gram level, sentence level, or even paragraph level. Text generation can be addressed with Markov processes or with deep generative models such as LSTMs; more recently, some of the most advanced methods include transformer-based models such as BART and GPT, as well as GAN-based approaches. Text generation systems are evaluated either through human ratings or with automatic metrics such as METEOR, ROUGE, and BLEU. We develop neural networks that generate text from images or sentences (see the character-level language-model sketch after this list).
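
As a minimal illustration of the convolutional architecture mentioned under Brain-Inspired Artificial Intelligence, the sketch below stacks convolution, nonlinearity, and pooling stages, loosely mirroring the hierarchical feature extraction of the visual system. It is a toy example, not a model from our lab: the 28x28 grayscale input size, layer widths, and ten output classes are assumptions for illustration.

```python
# Minimal convolutional classifier, sketched in PyTorch.
# The stacked convolution -> nonlinearity -> pooling stages loosely mirror
# the hierarchical feature extraction of the visual system discussed above.
# The input size (1x28x28) and layer widths are illustrative assumptions.
import torch
import torch.nn as nn

class SmallConvNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # local receptive fields
            nn.ReLU(),
            nn.MaxPool2d(2),                             # spatial pooling
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x)
        return self.classifier(h.flatten(start_dim=1))

if __name__ == "__main__":
    model = SmallConvNet()
    dummy = torch.randn(4, 1, 28, 28)  # a batch of 4 fake grayscale images
    logits = model(dummy)
    print(logits.shape)                # torch.Size([4, 10])
```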
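
For the Generative AI direction, the following sketch shows one common way to learn from natural language supervision: an image encoder and a text encoder are trained with a symmetric contrastive loss so that matching image-caption pairs align in a shared embedding space (a CLIP-style objective). The toy encoders, embedding size, temperature, and random data are assumptions for illustration; a full text-to-image system would additionally condition a generative decoder on such embeddings.

```python
# Sketch of contrastive image-text pretraining (a CLIP-style objective).
# Toy encoders and random tensors stand in for real backbones and data;
# only the shape of the training signal is illustrated here.
import torch
import torch.nn as nn
import torch.nn.functional as F

embed_dim = 64
# Stand-ins for a vision backbone and a text backbone (assumed shapes).
image_encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, embed_dim))
text_encoder = nn.Sequential(nn.Embedding(1000, 64), nn.Flatten(), nn.Linear(16 * 64, embed_dim))

images = torch.randn(8, 3, 32, 32)        # batch of 8 fake images
tokens = torch.randint(0, 1000, (8, 16))  # 8 fake captions, 16 token ids each

img_emb = F.normalize(image_encoder(images), dim=-1)
txt_emb = F.normalize(text_encoder(tokens), dim=-1)

# Cosine-similarity logits between every image and every caption in the batch.
logits = img_emb @ txt_emb.t() / 0.07     # 0.07: assumed temperature
targets = torch.arange(8)                 # matching pairs lie on the diagonal

# Symmetric cross-entropy: each image should pick its caption and vice versa.
loss = (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets)) / 2
print(loss.item())
```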
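
For Neural Text Generation, the sketch below trains a character-level LSTM language model to assign a likelihood to the next character given the preceding ones, then samples new text one character at a time. The tiny corpus, hyperparameters, and single training step are illustrative assumptions only.

```python
# Minimal character-level language model: the network learns the likelihood
# of the next character given the previous ones, then samples new text.
# The tiny corpus and single optimisation step are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

text = "neural text generation is a language modelling problem. "
vocab = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(vocab)}
data = torch.tensor([stoi[ch] for ch in text])

class CharLSTM(nn.Module):
    def __init__(self, vocab_size: int, hidden: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab_size)

    def forward(self, x, state=None):
        h, state = self.lstm(self.embed(x), state)
        return self.head(h), state

model = CharLSTM(len(vocab))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

# One training step: predict each next character from the preceding ones.
inputs, targets = data[:-1].unsqueeze(0), data[1:].unsqueeze(0)
logits, _ = model(inputs)
loss = F.cross_entropy(logits.reshape(-1, len(vocab)), targets.reshape(-1))
loss.backward()
optimizer.step()

# Sampling: feed the model's own predictions back in, one character at a time.
idx, state = data[:1].unsqueeze(0), None
out = []
for _ in range(40):
    logits, state = model(idx, state)
    probs = F.softmax(logits[:, -1], dim=-1)
    idx = torch.multinomial(probs, num_samples=1)
    out.append(vocab[idx.item()])
print("".join(out))
```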