Epistemic vigilance against the new empiricism of Big Data
Apr. 22, 2019
The Big Data vogue represents the emergence of a new empiricism. Big Data is advocated as the computational means to know everything about anything, free of bias and with a high degree of certainty. The belief is that having enough data makes interpretation and prior theory unnecessary. Big Data is thought of as a direct emergence of knowledge, free of uncertainty, that renders traditional forms of scientific inquiry obsolete. It shifts the focus from causal explanation to correlational analysis and diminishes the relevance of context. Furthermore, this unmediated knowledge is expected to be objective and fair.
Rieder & Simon (2017) state this new empiricism is creating an increasingly opaque algorithmic environment, which they name a black box society. Big data collections are always small or partial, algorithms may perpetuate prejudices of their creators, and forecasts are never certain (except for narrow and controlled environments), They write that this aura of truth objectivity contributes to a algorithmic culture that renders as the normal and the socially acceptable. "Government and corporate secrecy paired with technical inscrutability; obsolescent legal safeguards that are no match for new forms of digital feudalism; algorithmic scapegoating to avoid responsibility and curtail agency – these are the main ingredients of a thoroughly black-boxed data economy, in which opaque technologies are spreading, unmonitored and unregulated" (p. 95). In a black box society, systems work in a mysterious way, distinction between state and marked fades, and people submit to the rule of measurable data, these authors say.
To deal with the negative effects of the Big Data transformation, Rieder & Simon call for epistemic vigilance: an awareness of the potential dangers and pitfalls of an ever more data-driven society. They suggest that people be given access to both the data used and the algorithms applied in Big Data systems, and that they develop the competencies to understand Big Data analytical processes as a specific way of producing knowledge that is "neither inherently objective nor unquestionably fair" (p. 96). All this implies legal reforms, educational measures, and technological interventions. Regarding the latter, technology should make visible what is black-boxed and implement ethical solutions as a form of governance by design.
Reference:
Rieder, G. & Simon, J. (2017). Big Data: A New Empiricism and its Epistemic and Socio-Political Consequences. In W. Pietsch, J. Wernecke, & M. Ott (Eds.), Berechenbarkeit der Welt? Philosophie und Wissenschaft im Zeitalter von Big Data (pp. 85–105). Springer VS. https://doi.org/10.1007/978-3-658-12153-2_4