Everyone with an interest in postsecondary enrollment or the health of the industry seems to be talking about research methodology these days. The National Student Clearinghouse Research Center announced that a data coding error caused a 5.5% increase in first-year student enrollment to be misreported as a 5% decline. Professionals are frustrated that “a thermometer for a feverish sector” could produce such an inaccurate reading; some maintain that a preliminary report created in response to a worldwide pandemic, and possibly carried forward because of its headline-grabbing nature, should never have been treated as a reliable industry indicator.
It’s got me thinking about the bias many people have toward quantitative over qualitative data: the belief that numbers don’t lie, while feelings and experiences are too subjective to trust. Whether you’re engaged in survey research or social intelligence, it’s the adherence to method that matters. As Scott Alberts, Chair of Computer & Data Sciences at Truman State University, said on Bluesky, “Data Cleaning > Data Analysis.”
If you only knew how much time we spend on data cleaning (which we prefer to call data validation) at Campus Sonar… it’s core to our measures of success.
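To make the idea of data validation concrete, here is a minimal Python sketch of the kind of sanity checks an analyst might run on social-media mentions before analysis. The field names, labels, and checks are illustrative assumptions, not Campus Sonar’s actual process.

```python
def validate_mentions(mentions):
    """Return only records that pass basic sanity checks.

    Hypothetical checks: required fields present, duplicate IDs
    dropped, and sentiment restricted to recognized labels.
    """
    seen_ids = set()
    clean = []
    for m in mentions:
        # Required fields must be present and non-empty.
        if not m.get("id") or not m.get("text"):
            continue
        # Drop exact duplicates by record ID.
        if m["id"] in seen_ids:
            continue
        # Sentiment, if present, must be a recognized label.
        if m.get("sentiment") not in (None, "positive", "neutral", "negative"):
            continue
        seen_ids.add(m["id"])
        clean.append(m)
    return clean

# Illustrative sample data
sample = [
    {"id": "1", "text": "Loved the campus tour!", "sentiment": "positive"},
    {"id": "1", "text": "Loved the campus tour!", "sentiment": "positive"},  # duplicate
    {"id": "2", "text": ""},                                                # empty text
    {"id": "3", "text": "Tuition went up again.", "sentiment": "bad"},      # invalid label
    {"id": "4", "text": "Move-in day was smooth.", "sentiment": "neutral"},
]
print(len(validate_mentions(sample)))  # → 2
```

Real validation pipelines are far more involved (bot filtering, spam detection, relevance review), but even a sketch like this shows why cleaning can consume more time than analysis.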
In a fast-moving world, social intelligence enables you to uncover your stakeholders’ voices and use their sentiments and expectations to inform your work and align with your audience. It’s faster, more fluid, and more authentic than traditional research methods. We wouldn’t say this without keen attention to three measures of success.
Errors are possible in any research project, regardless of methodology. They’re not more or less likely because of the nature of the data. Rather, errors are related to the rigor of protocols, checks, and balances. You don’t need to employ traditional research methods to meet that standard. Sonarians will gladly validate data to support your innovative approach to decision-making.