In this demo, we introduce Transformers4Rec, an open-source library for sequential and session-based recommendation that is also presented at ACM RecSys'21. The library works as a bridge between NLP and recommender systems by integrating with HuggingFace Transformers, one of the most popular NLP frameworks, making state-of-the-art Transformer architectures available to RecSys researchers and industry practitioners. Transformers4Rec is flexible and customizable, supports multiple input features, and provides APIs for both PyTorch and TensorFlow. Together with other NVIDIA libraries, NVTabular and the Triton Inference Server, it enables an end-to-end pipeline on GPU, from data preprocessing through model training to deployment and inference. This demo is based on our paper accepted at the ACM RecSys'21 conference.
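To give a flavor of the PyTorch API, the snippet below sketches how a session-based next-item prediction model with an XLNet Transformer body might be defined. It is a minimal sketch adapted from the library's public examples, not the exact code used in this demo; the synthetic testing schema, hyperparameter values, and option defaults are assumptions and may differ across library versions.

```python
from transformers4rec import torch as tr

# Schema describing the sequential input features; here we assume the
# synthetic testing dataset shipped with the library for illustration.
schema = tr.data.tabular_sequence_testing_data.schema

max_sequence_length, d_model = 20, 64  # assumed example values

# Input module: embeds and aggregates the tabular sequence features,
# with causal masking for next-item prediction.
input_module = tr.TabularSequenceFeatures.from_schema(
    schema,
    max_sequence_length=max_sequence_length,
    continuous_projection=d_model,
    aggregation="concat",
    masking="causal",
)

# Transformer architecture configuration (XLNet) from HuggingFace Transformers.
transformer_config = tr.XLNetConfig.build(
    d_model=d_model,
    n_head=4,
    n_layer=2,
    total_seq_length=max_sequence_length,
)

# Next-item prediction head; weight tying shares item embeddings with the output layer.
prediction_task = tr.NextItemPredictionTask(weight_tying=True)

# Assemble the end-to-end PyTorch model.
model = transformer_config.to_torch_model(input_module, prediction_task)
```

The same high-level building blocks (input module, Transformer configuration, prediction task) are also exposed through the TensorFlow API, and the trained model can be exported for serving with the Triton Inference Server.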