With so many ways to collect, manage, and analyse data in the internet of things (IoT), data quality control may be sacrificed for the sake of quantity as companies endeavour to make fast, radical decisions in a dynamic, ever-changing business world.

Good business decisions are made when there is solid, rich data to base them on and every option is weighed up. That is only possible when research is conducted in a way that brings a company meaningful, usable insights. Data can be gathered for many purposes; collected data can, for example, feed statistical analyses. Depending on the data collected and the information you are looking for, these may offer insights into the number of consumers responding to a survey, consumer behaviour, traffic on your website, or even employee churn within the organisation. Such statistics lead to better decision making, which ideally will positively impact your bottom line.

The hardest part of any research experiment is translating all the acquired data into meaningful insights. Good data, and the measurement thereof, influences decision making; it is vital to any business strategy and depends on an excellent research structure.

Firstly, what is data?

Data is information that is collected, observed, generated, or created; it can take many forms and be gathered by many methods. At Genex, for example, our data-gathering processes include surveys, field research, quota sampling, and focus groups, to name a few.

There are two broad types of data, corresponding to two research approaches. Quantitative research is specifically interested in data that can be counted and measured, and looks to explain, predict, and generalise findings to real-world situations. Qualitative research explores and describes phenomena without measurement: it examines people’s feelings and emotions, which tend to be expressed in words and language.

No matter how this data is collected and which data you choose to collect, it needs to be ‘good enough’ for you to make sense of it all.

Exploring what ‘good data’ actually means

The correct collection and preparation of data is essential to accurate analysis and measurement later in the research process. The only way to truly benefit from data is to use it as a learning tool and a means to change circumstances, and the last thing you want to do is change circumstances or make big decisions based on poor-quality data.

Characteristics that define ‘good’ data:

Good data means quality data, and certain characteristics define it. These include:

  1. Accuracy and precision: Data must be exact, must not be misleading, and should convey the right message.
  2. Completeness and comprehensiveness: Complete data is just as important as accurate data. If any gaps or information are missing, the data does not convey a comprehensive depiction of the current circumstances.
  3. Reliability and consistency: Data that contradicts other sources is not reliable data. There may be slight variations in data, particularly qualitative data, but if the data is reliable and credible, the sentiment will remain consistent and will not contradict other reputable sources.
  4. Relevance and timeliness: Data should be collected at the right moment in time to represent the situation correctly.
  5. Validity: Most surveys and data-collection methods have criteria that make data relevant to the particular study and the data it requires. For example, collecting data on the gender or ethnicity of participants may not be valid for the study at all.

Evaluating data quality

To get a good idea of whether the data you are working with is good-quality data, examine your data set against the characteristics of good data above.

According to Deloitte, these are the questions you should be asking to evaluate data quality:

  1. How comprehensive is the data set?
  2. Is data available when it is needed?
  3. Do attributes conform across various datasets?
  4. Does the data conform to its expected definition?
  5. Does the data accurately represent an object or event?
  6. Is there unwanted duplication in the data?

Data collection and interpretation is a long process that requires deep research expertise, so many organisations choose to outsource it to external service providers. While this saves the business time, it is still important that the business understands the resulting information. Understanding the basics of research helps businesses evaluate the expertise of their vendors and the usefulness of their projects.

If you require more information about the data collection and evaluation process, contact us at Genex Insights.