The explosive growth of data and information has motivated technological developments in computing systems that can efficiently discover patterns and extract relevant insights. Inspired by the structure and function of biological synapses and neurons in the brain, neural network algorithms capable of highly parallel computation have been implemented on conventional silicon transistor-based hardware. However, synapses composed of multiple transistors can store only binary information, and processing such digital states through complicated silicon neuron circuits makes low-power, low-latency computing difficult. This review therefore discusses the attractiveness of emerging memories and switches as synaptic and neuronal elements, respectively, for implementing neuromorphic systems suited to energy-efficient cognitive and recognition tasks. Based on a survey of recent literature on emerging memories, novel materials and device engineering strategies that mitigate key challenges, primarily toward achieving nonvolatile analog synaptic characteristics, are presented. Attempts to emulate the role of the neuron in various ways using compact switches and volatile memories are also discussed. It is hoped that this review will help direct future interdisciplinary research at the device, circuit, and architecture levels of neuromorphic systems.