@@ -413,13 +413,13 @@ A ranking system usually deals with a set of $M$ users

 $$ U = \left\{u_0, u_1, ..., u_{M-1}\right\} $$

-Each user ($u_i$) having a set of $N$ ground truth relevant documents
+Each user ($u_i$) having a set of $N_i$ ground truth relevant documents

-$$ D_i = \left\{d_0, d_1, ..., d_{N-1}\right\} $$
+$$ D_i = \left\{d_0, d_1, ..., d_{N_i-1}\right\} $$

-And a list of $Q$ recommended documents, in order of decreasing relevance
+And a list of $Q_i$ recommended documents, in order of decreasing relevance

-$$ R_i = \left[r_0, r_1, ..., r_{Q-1}\right] $$
+$$ R_i = \left[r_0, r_1, ..., r_{Q_i-1}\right] $$

 The goal of the ranking system is to produce the most relevant set of documents for each user. The relevance of the
 sets and the effectiveness of the algorithms can be measured using the metrics listed below.
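To make the per-user notation concrete, here is a minimal sketch with toy data (the document IDs and variable names below are illustrative, not from the docs): the ground-truth sets $D_i$, the ranked lists $R_i$, and the binary relevance indicator $rel_D$ used by the metric definitions that follow.

```python
# Illustrative sketch of the notation above (toy data, not from the docs):
# D[i] is the set of N_i ground-truth relevant documents for user u_i,
# R[i] is the list of Q_i recommended documents for u_i, best first.

D = [
    {"d0", "d1", "d2"},   # user 0: N_0 = 3
    {"d1", "d4"},         # user 1: N_1 = 2
]

R = [
    ["d0", "d3", "d1"],   # user 0: Q_0 = 3
    ["d4", "d0", "d5"],   # user 1: Q_1 = 3
]

def rel(D_i, r):
    """The indicator rel_D(r): 1 if document r is ground-truth relevant, else 0."""
    return 1 if r in D_i else 0
```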
@@ -439,21 +439,21 @@ $$rel_D(r) = \begin{cases}1 & \text{if $r \in D$}, \\ 0 & \text{otherwise}.\end{cases}$$
         Precision at k
       </td>
       <td>
-        $p(k)=\frac{1}{M} \sum_{i=0}^{M-1} {\frac{1}{k} \sum_{j=0}^{\text{min}(\left|D\right|, k) - 1} rel_{D_i}(R_i(j))}$
+        $p(k)=\frac{1}{M} \sum_{i=0}^{M-1} {\frac{1}{k} \sum_{j=0}^{\text{min}(Q_i, k) - 1} rel_{D_i}(R_i(j))}$
       </td>
       <td>
-        <a href="https://en.wikipedia.org/wiki/Information_retrieval#Precision_at_K">Precision at k</a> is a measure of
+        <a href="https://en.wikipedia.org/wiki/Evaluation_measures_(information_retrieval)#Precision_at_K">Precision at k</a> is a measure of
         how many of the first k recommended documents are in the set of true relevant documents averaged across all
         users. In this metric, the order of the recommendations is not taken into account.
       </td>
     </tr>
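A plain-Python transcription of the corrected precision-at-k formula above, reusing `D`, `R`, and `rel` from the earlier sketch (a minimal sketch, not a reference implementation):

```python
def precision_at_k(D, R, k):
    """Precision at k averaged over the M users: count hits among the first
    min(Q_i, k) recommendations, divide by k, then average over users.
    Order within the top k does not matter, only membership in D_i."""
    M = len(D)
    total = 0.0
    for D_i, R_i in zip(D, R):
        hits = sum(rel(D_i, r) for r in R_i[:k])  # j = 0 .. min(Q_i, k) - 1
        total += hits / k
    return total / M

print(precision_at_k(D, R, 2))  # toy data above: (1/2 + 1/2) / 2 = 0.5
```

Note the divisor stays $k$ even when a user has fewer than $k$ recommendations, which is exactly what the $\text{min}(Q_i, k)$ upper limit in the sum implies.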
     <tr>
       <td>Mean Average Precision</td>
       <td>
-        $MAP=\frac{1}{M} \sum_{i=0}^{M-1} {\frac{1}{\left|D_i\right|} \sum_{j=0}^{Q-1} \frac{rel_{D_i}(R_i(j))}{j + 1}}$
+        $MAP=\frac{1}{M} \sum_{i=0}^{M-1} {\frac{1}{N_i} \sum_{j=0}^{Q_i-1} \frac{rel_{D_i}(R_i(j))}{j + 1}}$
       </td>
       <td>
-        <a href="https://en.wikipedia.org/wiki/Information_retrieval#Mean_average_precision">MAP</a> is a measure of how
+        <a href="https://en.wikipedia.org/wiki/Evaluation_measures_(information_retrieval)#Mean_average_precision">MAP</a> is a measure of how
         many of the recommended documents are in the set of true relevant documents, where the
         order of the recommendations is taken into account (i.e. penalty for highly relevant documents is higher).
       </td>
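The MAP row can be transcribed the same way. This sketch follows the table's formula literally (each hit at 0-based rank $j$ contributes $1/(j+1)$, normalized by $N_i$), again reusing the toy `D`, `R`, and `rel` from above:

```python
def mean_average_precision(D, R):
    """MAP as written in the table: sum 1/(j+1) over the ranks j where the
    recommendation is a hit, divide by N_i = |D_i|, average over users.
    Hits near the top of the list therefore contribute the most."""
    M = len(D)
    total = 0.0
    for D_i, R_i in zip(D, R):
        ap = sum(rel(D_i, r) / (j + 1) for j, r in enumerate(R_i))
        total += ap / len(D_i)   # the 1 / N_i normalization
    return total / M

print(mean_average_precision(D, R))  # toy data above: ((4/3)/3 + 1/2) / 2 ≈ 0.472
```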
@@ -462,10 +462,10 @@ $$rel_D(r) = \begin{cases}1 & \text{if $r \in D$}, \\ 0 & \text{otherwise}.\end{cases}$$
       <td>Normalized Discounted Cumulative Gain</td>
       <td>
         $NDCG(k)=\frac{1}{M} \sum_{i=0}^{M-1} {\frac{1}{IDCG(D_i, k)}\sum_{j=0}^{n-1}
-          \frac{rel_{D_i}(R_i(j))}{\text{ln}(j+2)}} \\
+          \frac{rel_{D_i}(R_i(j))}{\text{log}(j+2)}} \\
         \text{Where} \\
-        \hspace{5 mm} n = \text{min}\left(\text{max}\left(|R_i|,|D_i|\right),k\right) \\
-        \hspace{5 mm} IDCG(D, k) = \sum_{j=0}^{\text{min}(\left|D\right|, k) - 1} \frac{1}{\text{ln}(j+2)}$
+        \hspace{5 mm} n = \text{min}\left(\text{max}\left(Q_i, N_i\right),k\right) \\
+        \hspace{5 mm} IDCG(D, k) = \sum_{j=0}^{\text{min}(\left|D\right|, k) - 1} \frac{1}{\text{log}(j+2)}$
       </td>
       <td>
         <a href="https://en.wikipedia.org/wiki/Discounted_cumulative_gain#Normalized_DCG">NDCG at k</a> is a