News & Events

Events

June 22, 2017, 11:00 AM
Omid Bakhshandeh Babarsad: Language Learning Through Comparison

[Thursday, June 22, 2017 at 11:00 AM in Wegmans Hall 2506] Natural Language Understanding (NLU) has been one of the longest-running and most challenging areas in artificial intelligence. For any natural language comprehension system, a basic understanding of entities and concepts is a primary requirement. Comparison, in which we identify the similarities and differences between entities and concepts, is a unique human cognitive ability that requires memorizing facts, experiencing the world, and integrating concepts. Clearly, developing NLU systems that are capable of comprehending comparison is a crucial step forward in AI. In this thesis, I will present my research on developing systems that are capable of comprehending comparison, through which systems can learn world knowledge and perform basic commonsense reasoning. The main building block for understanding and modeling comparison is a comprehensive semantic meaning representation. I study the semantics of comparison at three major linguistic levels: the lexical (word) level, the sentence level, and the document (paragraph) level. I will discuss the challenges facing language comprehension at each of these three levels and present a reading comprehension test for evaluating a system’s thorough understanding of comparison and its basic reasoning capabilities.


June 23, 2017, 3:00 PM
Shibo Wang: Content-Aware Memory Systems for High-Performance, Energy-Efficient Data Movement

[Friday, June 23, 2017 at 3:00 PM in Wegmans Hall 2506] Power dissipation and limited memory bandwidth are significant bottlenecks in virtually all computer systems, from datacenters to mobile devices. The memory subsystem is responsible for a significant and growing fraction of the total system energy due to data movement throughout the memory hierarchy. These energy and performance problems become more severe as emerging data-intensive applications place a larger fraction of their data in memory and require substantial data processing and transmission capabilities. As a result, it is critical to architect novel, energy- and bandwidth-efficient memory systems and data access mechanisms for future computer systems. Existing memory systems are largely oblivious to the contents of the transferred or stored data. However, the transmission and storage costs of data with different contents often differ, which creates new opportunities to reduce the attendant data movement overheads. This dissertation investigates content-aware transmission and storage mechanisms in both conventional DRAM systems, such as DDRx, and emerging memory architectures, such as the Hybrid Memory Cube (HMC). Content-aware architectural techniques are developed to improve the performance and energy efficiency of the memory hierarchy.

The dissertation first presents a new energy-efficient data encoding mechanism based on online data clustering that exploits asymmetric data movement costs. Given an interconnect with asymmetric costs for transferring a 1 versus a 0, data movement energy can be reduced by minimizing the number of 1s in each transmitted data block. In the proposed coding scheme, the transmitted data blocks are dynamically grouped into clusters based on the similarities between their binary representations. Each transmitted data block is expressed as the bitwise XOR of the nearest cluster center and a sparse residual with a small number of 1s. By dynamically learning and adjusting the cluster centers, the total number of 1s in the transmitted residuals is lowered, leading to substantial savings in data movement energy.

The dissertation then introduces content-aware refresh, a novel DRAM refresh method that reduces the refresh rate by exploiting the unidirectional nature of DRAM retention errors: assuming that a logical 1 and 0 are represented by the presence and absence of charge, respectively, 1-to-0 failures dominate the retention errors. As a result, in a DRAM system that uses a block error-correcting code (ECC) to protect memory from errors, blocks with fewer 1s exhibit a lower probability of encountering an uncorrectable error. Leveraging this key insight, and without compromising memory reliability, the proposed content-aware refresh mechanism refreshes memory blocks with fewer 1s less frequently.

Finally, the dissertation examines a novel HMC power management solution that enables energy-efficient HMC systems through erasure codes. The key idea is to encode multiple blocks of data into a single coding block that is distributed among all of the HMC modules in the system, and to store the resulting check bits in a dedicated, always-on HMC. The inaccessible data stored in a sleeping HMC module can then be reconstructed by decoding a subset of the remaining memory blocks retrieved from other active HMCs, rather than waiting for the sleeping HMC module to become active.
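As a rough illustration of the first of these techniques, the sketch below models the cluster-based encoding idea in Python: each block is transmitted as the bitwise XOR of its nearest cluster center and a sparse residual, with the center chosen to leave as few 1s as possible in the residual. The block width, cluster count, class names, and the omission of online center updates are assumptions made for this example, not details of the dissertation's design.

```python
import random

BLOCK_BITS = 64      # width of a transmitted data block (assumed for this sketch)
NUM_CLUSTERS = 4     # number of cluster centers (assumed for this sketch)


def popcount(x: int) -> int:
    """Number of 1 bits, used here as a proxy for transmission energy."""
    return bin(x).count("1")


class ClusterEncoder:
    """Toy model of content-aware, cluster-based data encoding."""

    def __init__(self) -> None:
        # Arbitrary initial centers; the real scheme learns and adjusts them online.
        self.centers = [random.getrandbits(BLOCK_BITS) for _ in range(NUM_CLUSTERS)]

    def encode(self, block: int) -> tuple[int, int]:
        """Pick the center whose XOR with the block leaves the fewest 1s."""
        idx = min(range(NUM_CLUSTERS),
                  key=lambda i: popcount(block ^ self.centers[i]))
        return idx, block ^ self.centers[idx]

    def decode(self, idx: int, residual: int) -> int:
        """The receiver recovers the block by XORing the center back in."""
        return self.centers[idx] ^ residual


enc = ClusterEncoder()
block = random.getrandbits(BLOCK_BITS)
idx, residual = enc.encode(block)
assert enc.decode(idx, residual) == block
print(f"1s in raw block: {popcount(block)}, 1s in transmitted residual: {popcount(residual)}")
```

With well-learned centers, the residual typically carries far fewer 1s than the raw block, which is the source of the energy savings described in the abstract; with the random centers used here, a saving is not guaranteed.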


June 26, 2017, 11:00 AM
Nasrin Mostafazadeh: From Event to Story Understanding

[Monday, June 26, 2017 at 11:00 AM in Wegmans Hall 2506] Building systems that have natural language understanding capabilities has been one of the oldest and most challenging pursuits in AI. In this thesis we present our research on modeling language in terms of ‘events’ and how they interact with each other in time, mainly in the domain of stories. Deep language understanding, which enables inference and commonsense reasoning, requires systems with large amounts of knowledge that let them connect surface language to concepts of the world. Part of our work concerns developing approaches for learning semantically rich knowledge bases about events. First, we present an approach to automatically acquire conceptual knowledge about events in the form of inference rules, which can enable commonsense reasoning. We show that the acquired knowledge is precise and informative and can be employed in different NLP tasks. Learning the stereotypical structure of related events, in the form of narrative structures or scripts, has been one of the major goals in AI. Research on narrative understanding has been hindered by the lack of a proper evaluation framework. We address this problem by introducing a new framework for evaluating story understanding and script learning: the ‘Story Cloze Test’ (SCT). In this test, the system is posed with a short four-sentence narrative context along with two alternative endings to the story, and is tasked with choosing the right ending. Along with the SCT, we have developed the ROCStories corpus of about 100K commonsense short stories, which enables building models for story understanding and story generation. We present various models and baselines for tackling the SCT and show that humans can perform the task with an accuracy of 100%. One prerequisite for understanding and properly modeling events and their interactions is a comprehensive semantic framework for representing their variety of relations. We introduce the ‘Causal and Temporal Relation Scheme’ (CaTeRS), a rich semantic representation for event structures, with an emphasis on the domain of stories. The impact of the SCT and the ROCStories project extends beyond this thesis: numerous teams and individuals across academia and industry have used the evaluation framework and the dataset for a variety of purposes. We hope that the methods and resources presented in this thesis will spur further research on building systems that can effectively model eventful context and understand and generate logically sound stories.
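As a rough illustration of the Story Cloze Test format, the sketch below evaluates a deliberately naive word-overlap baseline on the two-ending setup described above. The data structure, the scorer, and the example story are invented for illustration and are not part of the ROCStories data or the models in the thesis.

```python
from dataclasses import dataclass


@dataclass
class StoryClozeItem:
    context: list[str]        # the four-sentence narrative context
    endings: tuple[str, str]  # two candidate fifth sentences
    right_ending: int         # index (0 or 1) of the correct ending


def overlap_score(context: list[str], ending: str) -> int:
    """Naive baseline: count how many words the ending shares with the context."""
    context_words = {w.lower().strip(".,") for s in context for w in s.split()}
    return sum(1 for w in ending.split() if w.lower().strip(".,") in context_words)


def accuracy(items: list[StoryClozeItem]) -> float:
    """Fraction of items for which the higher-scoring ending is the right one."""
    correct = 0
    for item in items:
        scores = [overlap_score(item.context, ending) for ending in item.endings]
        predicted = max(range(2), key=lambda i: scores[i])
        correct += int(predicted == item.right_ending)
    return correct / len(items)


# One invented example item in the SCT format.
example = StoryClozeItem(
    context=["Karen packed for her camping trip.",
             "She drove three hours to the lake.",
             "At night the temperature dropped sharply.",
             "She realized she had forgotten her sleeping bag."],
    endings=("Karen shivered in her tent all night.",
             "Karen enjoyed a warm, comfortable sleep."),
    right_ending=0,
)
print(f"Baseline accuracy on one invented item: {accuracy([example]):.0%}")
```

A real SCT evaluation replaces the overlap scorer with a trained story-understanding model and averages accuracy over the full test set.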