Feminism and HCI: New Perspectives

Interacting with Computers, Volume 23, Issue 5, pp. iii-xi, October 2011
Shaowen Bardzell, Elizabeth Churchill
Abstract

As a word and as a set of theories and practices, feminism is a poorly understood concept. However, feminist perspectives share much with the user- and value-centered design processes espoused within the field of Human-Computer Interaction.

Examples include the consideration of alternative viewpoints, attention to agency (who gets to say and do what, and under what circumstances), and the development of reflective and reflexive methods for understanding how, when, where, and why people do what they do.

In the ''Feminism and HCI: New Perspectives'' special issue, we have invited researchers and practitioners to reflect on the ways in which feminist thinking, theory, and practice can and do have an impact on the field of Human-Computer Interaction.

This introductory editorial offers background on our view that there is great value in understanding the actual and potential impact of feminist thinking on HCI, followed by a précis of each paper. We close with some observations regarding common themes, points of contention, and possibilities for future work.
