* Cosine similarity measures the cosine of the angle between two vectors, effectively capturing their orientation similarity while ignoring their magnitude.
* Euclidean distance calculates the straight-line distance between the endpoints of two vectors in the vector space. Highly sensitive to vector magnitude.
* Manhattan distance calculates the sum of the absolute differences between the components of two vectors. Can be more robust to outliers than Euclidean distance.
* Dot product calculates the sum of the products of corresponding vector components. It's essentially unnormalized (influenced by vector magnitudes) cosine similarity.
* Jaccard index (Jaccard similarity) calculates the size of the intersection divided by the size of the union (originally designed for sets). Good for sparse binary data.
* Hamming distance counts the number of positions at which two vectors differ.
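
The metrics above can be sketched with plain standard-library Python; the function names here are illustrative, not from any particular library:

```python
import math


def dot(a, b):
    # Sum of products of corresponding components.
    return sum(x * y for x, y in zip(a, b))


def cosine_similarity(a, b):
    # Dot product normalized by both magnitudes: compares direction only.
    return dot(a, b) / (math.sqrt(dot(a, a)) * math.sqrt(dot(b, b)))


def euclidean(a, b):
    # Straight-line distance between the two vector endpoints.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def manhattan(a, b):
    # Sum of absolute component-wise differences.
    return sum(abs(x - y) for x, y in zip(a, b))


def jaccard(a, b):
    # Intersection size over union size, for two sets.
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb)


def hamming(a, b):
    # Number of positions at which the vectors differ.
    return sum(x != y for x, y in zip(a, b))


u, v = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]
print(cosine_similarity(u, v))          # 1.0: v = 2u, same direction
print(euclidean(u, v))                  # ~3.742, grows with magnitude gap
print(manhattan(u, v))                  # 6.0
print(dot(u, v))                        # 28.0
print(jaccard({"a", "b"}, {"b", "c"}))  # 1/3: one shared item of three total
print(hamming([1, 0, 1, 1], [1, 1, 1, 0]))  # 2 differing positions
```

Note how cosine similarity is 1.0 even though `v` is twice the length of `u`, while Euclidean and Manhattan distance are nonzero: the normalization by magnitude is exactly what distinguishes cosine similarity from the raw dot product.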