How to write a successful critical analysis

The topics below explain the content and definition of a critical analysis, and the ways to evaluate a theory, concept, argument, methodology, and research results and conclusions.

For further queries or assistance in writing a critical analysis, email Bill Wrigley.

What do you critically analyse?

In a critical analysis you do not express your own opinion or views on the topic. You need to develop your thesis, position or stance on the topic from the views and research of others. In academic writing you critically analyse other researchers’:

  • theories
  • concepts, terms
  • viewpoints, arguments, positions
  • methodologies, approaches
  • research results and conclusions

This means weighing up the strength of the arguments or research support on the topic, and deciding which has the stronger weight of evidence or support.

Therefore, your thesis argues, with evidence, why a particular theory, concept, viewpoint, methodology, or research result(s) is/are stronger, more sound, or more advantageous than others.

What does ‘analysis’ mean?

A critical analysis means analysing or breaking down the parts of the literature and grouping these into themes, patterns or trends.

In an analysis you need to:

1. Identify and separate out the parts of the topic by grouping the various key theories, main concepts, the main arguments or ideas, and the key research results and conclusions on the topic into themes, patterns or trends of agreement, dispute and omission.

2. Discuss each of these parts by explaining:

i. the areas of agreement/consensus, or similarity

ii. the issues or controversies: areas in dispute or debate, areas of difference

iii. the omissions, gaps, or areas that are under-researched

3. Discuss the relationship between these parts

4. Examine how each contributes to the whole topic

5. Make conclusions about their significance or importance in the topic

What does ‘critical’ mean?

A critical analysis does not mean writing angry, rude or disrespectful comments, or expressing your views in judgmental terms of black and white, good and bad, or right and wrong.

To be critical, or to critique, means to evaluate. Therefore, to write critically in an academic analysis means to:

  • judge the quality, significance or worth of the theories, concepts, viewpoints, methodologies, and research results
  • evaluate in a fair and balanced manner
  • avoid extreme or emotional language

You evaluate or judge the quality, significance or worth by examining the:

  • strengths, advantages, benefits, gains, or improvements
  • disadvantages, weaknesses, shortcomings, limitations, or drawbacks

How to critically analyse a theory, model or framework

The evaluative phrases used most often to refer to a theory, model or framework are a sound theory or a strong theory.

The table below summarizes the criteria for judging the strengths and weaknesses of a theory:

  • comprehensive
  • clear
  • logical
  • practical
  • applicable
  • empirically supported
  • up-to-date
  • parsimonious

Evaluating a Theory, Model or Framework

The table below lists the strengths, and their corresponding weaknesses, that are usually considered when evaluating a theory.

Strength: Comprehensively accounts for the main phenomena
Weakness: overlooks or omits important features or concepts

Strength: Clear, detailed
Weakness: vague, unexplained, ill-defined, misconceived

Strength: Main tenets or concepts are logical and consistent
Weakness: concepts or tenets are inconsistent or contradictory

Strength: Practical, useful
Weakness: impractical, not useful

Strength: Applicable across a range of settings, contexts, groups and conditions
Weakness: limited or narrow applicability

Strength: Empirically supported by a large body of evidence; propositions and predictions are supported by evidence
Weakness: supported by a small or no body of evidence; insufficient empirical support for the propositions and predictions

Strength: Up-to-date, accounts for new developments
Weakness: outdated

Strength: Parsimonious (not excessive): simple, clear, with few variables
Weakness: excessive, overly complex or complicated

Critical analysis examples of theories

The following sentences are examples of the phrases used to explain strengths and weaknesses.

Smith’s (2005) theory appears up to date, practical and applicable across many divergent settings.

Brown’s (2010) theory, although parsimonious and logical, lacks a sufficient body of evidence to support its propositions and predictions.

Little scientific evidence has been presented to support the premises of this theory.

One of the limitations with this theory is that it does not explain why…

A significant strength of this model is that it takes into account …

The propositions of this model appear unambiguous and logical.

A key problem with this framework is the conceptual inconsistency between ….

How to critically analyse a concept

The table below summarizes the criteria for judging the strengths and weaknesses of a concept:

  • key variables identified
  • clear and well-defined
  • meaningful
  • logical
  • relevant
  • up-to-date

Evaluating Concepts

Strength: Key variables or constructs identified
Weakness: key variables or constructs omitted or missed

Strength: Clear, well-defined, specific, precise
Weakness: ambiguous, vague, ill-defined, overly general, imprecise, not sufficiently distinctive; overinclusive, too broad, or too narrowly defined

Strength: Meaningful, useful
Weakness: conceptually flawed

Strength: Relevant
Weakness: questionable relevance

Strength: Up-to-date
Weakness: out of date

Critical analysis examples of concepts

Many researchers have used the concept of control in different ways.

There is little consensus about what constitutes automaticity.

Putting forth a very general definition of motivation means that it is possible that any behaviour could be included.

The concept of global education lacks clarity, is imprecisely defined and is overly complex.

Some have questioned the usefulness of resilience as a concept because it has been used so often and in so many contexts.

Research suggests that the concept of preoperative fasting is an outdated clinical approach.

How to critically analyse arguments, viewpoints or ideas

The table below summarizes the criteria for judging the strengths and weaknesses of an argument, viewpoint or idea:

  • reasons support the argument
  • argument is substantiated by evidence
  • evidence for the argument is relevant
  • evidence for the argument is unbiased, sufficient and important
  • evidence is reputable
  • balanced
  • clear
  • logical
  • convincing

Evaluating Arguments, Views or Ideas

Strength: Reasons and evidence provided support the argument
Weakness: the reasons or evidence do not support the argument; overgeneralization

Strength: Substantiated (supported) by factual evidence
Weakness: insufficient substantiation (support)

Strength: Evidence is relevant and believable
Weakness: based on peripheral or irrelevant evidence

Strength: Unbiased: sufficient or important evidence or ideas included and considered
Weakness: biased: overlooks, omits, disregards, or is selective with important or relevant evidence or ideas

Strength: Evidence from reputable or authoritative sources
Weakness: evidence relies on non-reputable or unrecognized sources

Strength: Balanced: considers opposing views
Weakness: unbalanced: does not consider opposing views

Strength: Clear, not confused, unambiguous
Weakness: confused, ambiguous

Strength: Logical, consistent
Weakness: the reasons do not follow logically from or support the argument; arguments or ideas are inconsistent

Critical analysis examples of arguments, viewpoints or ideas

The validity of this argument is questionable as there is insufficient evidence to support it.

Many writers have challenged Jones’ claim on the grounds that …….

This argument fails to draw on the evidence of others in the field.

This explanation is incomplete because it does not explain why…

The key problem with this explanation is that ……

The existing accounts fail to resolve the contradiction between …

However, there is an inconsistency with this argument. The inconsistency lies in…

Although this argument has been proposed by some, it lacks justification.

However, the body of evidence showing that… contradicts this argument.

How to critically analyse a methodology

The table below provides the criteria for judging the strengths and weaknesses of a methodology.

An evaluation of a methodology usually involves a critical analysis of its main sections:

design; sampling (participants); measurement tools and materials; procedure

  • design tests the hypotheses or research questions
  • method valid and reliable
  • potential bias or measurement error, and confounding variables addressed
  • method allows results to be generalized
  • representative sampling of cohort and phenomena; sufficient response rate
  • valid and reliable measurement tools
  • valid and reliable procedure
  • method clear and detailed to allow replication

Evaluating a Methodology

Strength: Research design tests the hypotheses or research questions
Weakness: research design is inappropriate for the hypotheses or research questions

Strength: Valid and reliable method
Weakness: dubious or questionable validity; insufficiently rigorous

Strength: The method addresses potential sources of bias or measurement error; confounding variables were identified
Weakness: measurement error produces questionable or unreliable results; confounding variables not identified or addressed

Strength: The method (sample, measurement tools, procedure) allows results to be generalized or transferred; sampling was representative to enable generalization
Weakness: generalizability of the results is limited due to an unrepresentative sample: small sample size or limited sample range

Strength: Sampling of the cohort, and of the phenomena under investigation, was sufficiently wide and representative to enable generalization; the sampling response rate was sufficiently high
Weakness: limited generalizability of results due to an unrepresentative sample: small sample size or limited sample range of the cohort or phenomena under investigation; sampling response rate was too low

Strength: Measurement tool(s)/instrument(s) appropriate, reliable and valid; measurements were accurate
Weakness: inappropriate measurement tools; incomplete or ambiguous scale items; inaccurate measurement; reliability statistics from previous research for the measurement tool not reported; measurement instrument items are ambiguous, unclear or contradictory

Strength: Procedure reliable and valid
Weakness: measurement error from administration of the measurement tool(s)

Strength: Method was clearly explained and sufficiently detailed to allow replication
Weakness: explanation of the methodology (or parts of it, for example the procedure) is unclear, confused, imprecise, ambiguous, inconsistent or contradictory

Critical analysis examples of a methodology

The unrepresentativeness of the sample makes these results misleading.

The presence of unmeasured variables in this study limits the interpretation of the results.

Other, unmeasured confounding variables may be influencing this association.

The interpretation of the data requires caution because the effect of confounding variables was not taken into account.

The insufficient control of several response biases in this study means the results are likely to be unreliable.

Although this correlational study shows association between the variables, it does not establish a causal relationship.

Taken together, the methodological shortcomings of this study suggest the need for serious caution in the meaningful interpretation of the study’s results.

How to critically analyse research results and conclusions

The table below provides the criteria for judging the strengths and weaknesses of research results and conclusions:

  • appropriate choice and use of statistics
  • correct interpretation of results
  • all results explained
  • alternative explanations considered
  • significance of all results discussed
  • consistency of results with previous research discussed
  • results add to existing understanding or knowledge
  • limitations discussed
  • results clearly explained
  • conclusions consistent with results

Evaluating the Results and Conclusions

Strength: Chose and used appropriate statistics
Weakness: inappropriate choice or use of statistics

Strength: Results interpreted correctly or accurately
Weakness: incorrect interpretation of results; the results have been over-interpreted (for example, correlation measures incorrectly interpreted to suggest causation rather than association)

Strength: All results were explained, including inconsistent or misleading results
Weakness: inconsistent or misleading results not explained

Strength: Alternative explanations for results were considered
Weakness: unbalanced explanations: alternative explanations for results not explored

Strength: Significance of all results was considered
Weakness: incomplete consideration of results

Strength: Results considered according to consistency with other research or viewpoints; results are conclusive because they have been replicated by other studies
Weakness: consistency of results with other research not considered; results are suggestive rather than conclusive because they have not been replicated by other studies

Strength: Results add significantly to existing understanding or knowledge
Weakness: results do not significantly add to existing understanding or knowledge

Strength: Limitations of the research design or method are acknowledged
Weakness: limitations of the research design or method not considered

Strength: Results were clearly explained, sufficiently detailed and consistent
Weakness: results were unclear, insufficiently detailed, inconsistent, confusing, ambiguous or contradictory

Strength: Conclusions were consistent with and supported by the results
Weakness: conclusions were not consistent with or not supported by the results
