Thursday, April 23, 2009

Dominance approach vs. ideal-point approach in item selection

Dominance approach (Coombs, 1964; Likert, 1932)
  • It is about measuring people's ability
  • It uses items of high internal consistency.
  • Therefore, if a person scores low on one item, he/she should also score low on the total score. Likewise, if I score higher on an item than you do, my ability is said to dominate yours.
  • In IRT terminology, DIF (Differential Item Functioning) refers to "a difference in the probability of endorsing an item for members of a reference group (e.g., US workers) and a focal group (e.g., Chinese workers) having the same standing on the latent attribute measured by a test." It is related to the dominance approach.
Ideal-point approach (Thurstone, 1928)
  • It is about measuring people's attitude
  • Individuals will endorse an item to the degree that it reflects their own standing on the latent attribute (i.e., the item lies close to their ideal point)
  • More neutral items should be included (a sketch contrasting the two response processes follows this list)
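
To make the contrast concrete, here is a minimal Python/NumPy sketch (mine, not from the sources cited above): a dominance-type item is represented by a 2PL-style logistic response function that rises monotonically with the latent trait, while an ideal-point-type item is represented by a simple squared-distance kernel that peaks where the person's trait level matches the item location. The parameter values are arbitrary and purely illustrative.

    import numpy as np

    theta = np.linspace(-3, 3, 7)   # latent trait levels (arbitrary grid)
    a, b = 1.5, 0.0                 # illustrative discrimination and item location

    # Dominance (2PL-style): endorsement probability rises monotonically with theta
    p_dominance = 1 / (1 + np.exp(-a * (theta - b)))

    # Ideal point: endorsement is highest when theta is closest to the item location
    # (a simple squared-distance kernel, used here only for illustration)
    p_ideal_point = np.exp(-a * (theta - b) ** 2)

    for t, p_dom, p_ip in zip(theta, p_dominance, p_ideal_point):
        print(f"theta = {t:+.1f}   dominance: {p_dom:.2f}   ideal point: {p_ip:.2f}")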

Tuesday, April 21, 2009

Metric MDS and software

Metric MDS includes the following transformation families (Borg & Groenen, 2005, p. 203); a small numerical sketch follows the list:
  • ratio MDS:
    • (disparities) = b * (proximities in terms of dissimilarities; short for 'prox' below)

  • interval MDS:
    • (disparities) = a + b * (prox)

  • logarithmic MDS:
    • (disparities) = log(prox)
    • (disparities) = b * log(prox)
    • (disparities) = a + b * log(prox)

  • exponential MDS:
    • (disparities) = exp(prox)
    • (disparities) = b * exp(prox)
    • (disparities) = a + b * exp(prox)

  • power MDS (which includes square root with q = 0.5):
    • (disparities) = (prox)^q
    • (disparities) = b * (prox)^q
    • (disparities) = a + b * (prox)^q

  • polynomial MDS (i.e., spline MDS without interior knots):
    • (disparities) = a + b * (prox) + c * (prox)^2
    • (disparities) = a + b * (prox) + c * (prox)^2 + d * (prox)^3
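
As a quick reference, the sketch below (Python/NumPy; mine, not tied to any of the packages compared next) simply evaluates each transformation family on a toy vector of dissimilarities. The coefficients a, b, c, d and the exponent q are placeholders that a metric MDS program would actually estimate while minimizing stress.

    import numpy as np

    prox = np.array([0.5, 1.0, 2.0, 4.0])       # toy dissimilarities
    a, b, c, d, q = 0.1, 2.0, 0.5, 0.1, 0.5     # placeholder coefficients (normally estimated)

    disparities = {
        "ratio":       b * prox,
        "interval":    a + b * prox,
        "logarithmic": a + b * np.log(prox),
        "exponential": a + b * np.exp(prox),
        "power":       a + b * prox ** q,        # q = 0.5 gives the square-root case
        "polynomial":  a + b * prox + c * prox ** 2 + d * prox ** 3,
    }

    for name, values in disparities.items():
        print(f"{name:12s}", np.round(values, 3))
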
However, software packages are not always explicit about which kinds of metric MDS they perform. Based on my own testing as of 04/21/09, here is a comparison table:

Software package               | Program, version, date                                         | Metric MDS supported
MATLAB 7.8.0.347 (R2009a)      | mdscale() 1.1.6.9, 12/01/08; Criterion = 'metricstress'        | Ratio only
smacof in R 0.9-0 (05/24/08)   | smacofSym(), metric = TRUE                                     | Ratio only
SPSS 17.0.0 (08/23/08)         | Proxscal version 1.0                                           | Ratio, Interval, Spline
SYSTAT 12.02.00                | Multidimensional Scaling; Shape = Square (similarities model) | Interval (Linear), Log, Power

To date, no program in any of these packages provides combinations of two or more transformations, yet such combinations could be very helpful. For example, log + polynomial may be of interest: the log may help normalize the residuals, while the polynomial can pick up the trend of the data (a least-squares sketch follows the formulas below). That is,
  • (disparities) = a + b * log(prox) + c * log(prox)^2
  • (disparities) = a + b * log(prox) + c * log(prox)^2 + d * log(prox)^3
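
Outside an MDS program, fitting such a log-plus-polynomial disparity function is just ordinary least squares on log(prox). Here is a minimal Python/NumPy sketch with made-up numbers (in a real MDS algorithm the coefficients would be re-estimated inside the stress-minimization loop):

    import numpy as np

    prox = np.array([0.3, 0.8, 1.5, 2.5, 4.0, 6.0])    # toy dissimilarities
    dist = np.array([0.2, 0.9, 1.6, 2.2, 2.9, 3.4])    # toy configuration distances

    x = np.log(prox)
    # Design matrix for: disparities = a + b*log(prox) + c*log(prox)^2
    X = np.column_stack([np.ones_like(x), x, x ** 2])
    coef, *_ = np.linalg.lstsq(X, dist, rcond=None)     # least-squares estimates of a, b, c
    disparities = X @ coef

    print("a, b, c =", np.round(coef, 3))
    print("disparities =", np.round(disparities, 3))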

Saturday, April 18, 2009

Eigendecomposition and Singular Value Decomposition

An eigenvalue and its eigenvector are a pair that satisfies the following eigenequation:

matrix(transformation) * eigenvector = eigenvalue * eigenvector

Thus, if we can find such an eigenvector and therefore an eigenvalue, the interpretation is: after being linearly transformed by the matrix, the eigenvector still has the same direction; it is only scaled by the eigenvalue. The eigenvalue can thus be considered an essential part, or characteristic value, of the matrix, and the eigenvector a tool for extracting that essential part.
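
A quick numerical check of the eigenequation with NumPy, using an arbitrary symmetric matrix of my own choosing:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])                    # an arbitrary symmetric matrix

    eigenvalues, eigenvectors = np.linalg.eig(A)  # eigenvectors are the columns

    v = eigenvectors[:, 0]
    lam = eigenvalues[0]

    print(A @ v)     # the transformed eigenvector ...
    print(lam * v)   # ... equals the eigenvector scaled by its eigenvalue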

A nice explanation can be found here; see also Borg and Groenen (2005), Chapter 7.

Eigendecomposition: matrix A = QΛQ', where the columns of Q are the eigenvectors of A and Λ is a diagonal matrix of the eigenvalues
Thus, AQ = QΛQ'Q = QΛ, because for a symmetric A the eigenvector matrix Q is orthonormal, so Q'Q = I
Singular Value Decomposition: matrix A = PΦQ'

P is a matrix of left singular vectors, Φ is a diagonal matrix of singular values, and Q is a matrix of right singular vectors. The name "singular" was probably chosen in the same spirit as "eigen": the two decompositions take very similar forms, and both refer to essential, characteristic quantities of the matrix.
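
To tie the two decompositions together numerically, here is a short NumPy check (again my own arbitrary example). For a symmetric matrix, np.linalg.eigh returns Q and the eigenvalues for A = QΛQ', and np.linalg.svd returns P, the singular values, and Q' for A = PΦQ'.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])                     # same symmetric matrix as above

    # Eigendecomposition: A = Q Λ Q'
    eigenvalues, Q = np.linalg.eigh(A)
    Lambda = np.diag(eigenvalues)
    print(np.allclose(A, Q @ Lambda @ Q.T))        # True

    # Singular value decomposition: A = P Φ Q'
    P, singular_values, Qt = np.linalg.svd(A)
    Phi = np.diag(singular_values)
    print(np.allclose(A, P @ Phi @ Qt))            # True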