So, Sentence-BERT is a modification of the BERT model that uses siamese and triplet network structures and adds a pooling operation to the output of BERT to obtain fixed-size, semantically meaningful sentence embeddings. These sentence embeddings can then be used for sentence-similarity comparison (using cosine similarity), clustering, and semantic search.
In short: Sentence-BERT adjusts BERT so that it outputs a fixed-length vector for each sentence, and each vector reflects the sentence's semantic information.
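A minimal sketch of the pooling-then-compare idea, with random NumPy arrays standing in for real BERT token outputs (the shapes, token counts, and helper names here are illustrative assumptions; in practice you would get embeddings from the sentence-transformers library):

```python
import numpy as np

# Stand-ins for BERT token outputs, shape (num_tokens, hidden_dim).
# In the real model these come from BERT; here they are random for illustration.
rng = np.random.default_rng(0)
tokens_a = rng.normal(size=(7, 768))   # hypothetical 7-token sentence
tokens_b = rng.normal(size=(5, 768))   # hypothetical 5-token sentence

def mean_pool(token_embeddings):
    # SBERT-style mean pooling: average the token vectors into one
    # fixed-size sentence embedding, regardless of sentence length.
    return token_embeddings.mean(axis=0)

def cosine_similarity(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

emb_a = mean_pool(tokens_a)   # shape (768,)
emb_b = mean_pool(tokens_b)   # same fixed size, so directly comparable

score = cosine_similarity(emb_a, emb_b)
```

Because every sentence maps to a vector of the same length, pairwise comparison, clustering, and nearest-neighbor search all reduce to cheap vector operations.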