Research Methodology
Guides the design and documentation of research methodologies for academic studies.
Overview
Research methodology is the bridge between a question and an answer. A well-designed methodology produces valid, reliable results; a poorly designed one wastes resources and generates findings that cannot be trusted or replicated.
This skill covers selecting research approaches, designing studies, planning data collection and analysis, and documenting methodology clearly enough for replication. It applies across quantitative, qualitative, and mixed-methods research paradigms.
Core Philosophy
The research question dictates the methodology, never the reverse. Choosing a method because it is familiar or fashionable rather than because it is the best fit for the question produces studies that are technically competent but intellectually hollow. Every methodological decision -- paradigm, approach, sampling strategy, instrument, analysis technique -- must be justified by its alignment with what you are trying to learn.
Transparency is the foundation of methodological credibility. A methods section that describes what was done without explaining why invites skepticism from reviewers and prevents replication by future researchers. Every design choice should be accompanied by its rationale, including the alternatives that were considered and rejected. This level of documentation transforms methodology from a recipe into an intellectual argument.
Rigor is not the exclusive property of any single paradigm. Quantitative studies must address validity and reliability through experimental controls and statistical power. Qualitative studies must address credibility and dependability through triangulation, member checking, and thick description. Mixed-methods studies must justify how the components integrate. Dismissing any paradigm as inherently less rigorous reveals methodological bias, not methodological understanding.
Anti-Patterns
- Choosing a method because it is familiar rather than because it fits the question. A researcher trained in surveys who forces every question into a survey design, or a qualitative researcher who avoids quantification even when the question demands it, is letting comfort drive scholarship. The question comes first; the method follows.
- Under-powering quantitative studies so that results are inconclusive. A study with too few participants to detect a meaningful effect is a waste of resources: it cannot confirm or deny the hypothesis. Conduct a power analysis before data collection and recruit the sample size the analysis requires, not the sample that is convenient.
- Writing methods so vaguely that replication is impossible. "Interviews were conducted" tells the reader nothing actionable. How many interviews? How long? What questions were asked? How were participants selected? How was the data analyzed? The methods section should contain enough procedural detail that another researcher could reproduce the study without contacting the author.
- Conflating correlation with causation in non-experimental designs. Observational and survey studies can identify associations but cannot establish causal relationships. Claiming that X causes Y based on correlational data is a fundamental methodological error that undermines the study's conclusions and the author's credibility.
- Skipping the pilot test and discovering procedural problems mid-study. A pilot study is not optional: it is insurance against wasting your main data collection on a flawed instrument or procedure. Even a small-scale dry run with five participants can reveal ambiguous survey questions, technical failures, or timing problems that are invisible on paper.
Core Framework
The Methodology Decision Tree
Level 1 - Paradigm: Choose your philosophical orientation. Positivist approaches seek generalizable, measurable truths. Interpretivist approaches seek deep understanding of meaning. Pragmatist approaches select methods based on what best answers the question.
Level 2 - Approach: Quantitative (experimental, survey, correlational), qualitative (case study, ethnography, phenomenology, grounded theory), or mixed methods (sequential, concurrent, transformative).
Level 3 - Design: Define variables or constructs, sampling strategy, data collection instruments, and analysis plan. Every design choice must be justified by the research question.
Validity and Reliability
Quantitative work must address internal validity (causal claims), external validity (generalizability), and reliability (consistency). Qualitative work must address credibility, transferability, dependability, and confirmability through triangulation, member checking, and thick description.
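One common way to quantify the reliability (internal consistency) of a multi-item instrument is Cronbach's alpha. The sketch below is a minimal pure-Python illustration with hypothetical pilot data; it is a planning aid, not a substitute for a full psychometric analysis.

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a multi-item instrument.

    scores: one row per respondent, each a list of item scores.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = len(scores[0])                       # number of items
    items = list(zip(*scores))               # transpose: one tuple per item
    item_var = sum(pvariance(item) for item in items)
    total_var = pvariance([sum(row) for row in scores])
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical pilot data: 4 respondents answering 3 Likert items.
pilot = [[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]]
# Perfectly consistent items give alpha = 1.0 (within float tolerance);
# values above roughly 0.7 are conventionally treated as acceptable.
alpha = cronbach_alpha(pilot)
```

A value computed on pilot data like this can support the instrument-validity argument in the methods section, but the threshold you apply should be justified for your field.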
Process
- State the research question precisely; the methodology must serve the question
- Select the paradigm and approach that best fits the question type
- Define the population and justify your sampling strategy with target sample size
- Design or select data collection instruments and establish their validity
- Plan the data collection procedure step by step, including ethical safeguards
- Pre-register the study or analysis plan if applicable to your field
- Specify the analysis techniques before collecting data to prevent fishing for significant results (p-hacking)
- Conduct a pilot study or dry run to identify procedural problems
- Document every methodological decision with its rationale
- Write the methods section with enough detail that another researcher could replicate your study
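The sampling step above can be made concrete. The sketch below draws a proportionally allocated stratified random sample; the helper and the student "frame" are hypothetical, and a real sampling plan also needs a defensible sampling frame and a non-response strategy.

```python
import random

def stratified_sample(population, stratum_of, n, seed=42):
    """Proportionally allocated stratified random sample.

    population: list of records
    stratum_of: function mapping a record to its stratum label
    n: total target sample size
    """
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    strata = {}
    for rec in population:
        strata.setdefault(stratum_of(rec), []).append(rec)
    sample = []
    for members in strata.values():
        # Proportional allocation; rounding can shift the total by 1-2 cases.
        n_h = round(n * len(members) / len(population))
        sample.extend(rng.sample(members, min(n_h, len(members))))
    return sample

# Hypothetical frame: 60 undergraduates and 40 graduates; target n = 10.
frame = [("undergrad", i) for i in range(60)] + [("grad", i) for i in range(40)]
drawn = stratified_sample(frame, lambda r: r[0], 10)
# Proportional allocation yields 6 undergraduates and 4 graduates.
```

Documenting the allocation rule and the seed in the methods section is exactly the kind of procedural detail that makes the sampling strategy replicable.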
Key Principles
- The research question dictates the method, never the reverse
- Justify every design choice; reviewers will question unjustified decisions
- Transparency about limitations strengthens credibility rather than weakening it
- Pilot test instruments and procedures before the main study
- Pre-registration reduces researcher degrees of freedom and increases trust in findings
- Ethical approval must be obtained before any data collection involving human subjects
- Document deviations from the original plan and explain why they occurred
- Power analysis or saturation criteria should determine sample size, not convenience
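The power-analysis principle above can be sketched numerically. The function below uses the standard normal approximation for a two-sample t-test (per-group n for a given Cohen's d, two-sided alpha); it slightly underestimates the exact t-based answer and is a planning heuristic, with all numbers illustrative.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sample t-test.

    Normal approximation: n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value, two-sided test
    z_power = z.inv_cdf(power)          # quantile for the desired power
    return ceil(2 * ((z_alpha + z_power) / effect_size) ** 2)

# Detecting a medium effect (Cohen's d = 0.5) at alpha = .05 with 80% power:
print(n_per_group(0.5))  # 63 per group (the exact t-based answer is ~64)
```

Dedicated tools (e.g. G*Power, or `statsmodels.stats.power` in Python) give exact values and cover more designs; the point is that the target n is computed before recruitment, not chosen for convenience.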
Common Pitfalls
- Choosing a method because it is familiar rather than because it fits the question
- Under-powering quantitative studies, making results inconclusive
- Treating qualitative research as less rigorous than quantitative, skipping validity checks
- Failing to pilot test instruments, discovering problems mid-study
- Conflating correlation with causation in non-experimental designs
- Writing methods so vaguely that replication is impossible
Output Format
Deliver methodology documentation as:
- Research design summary: one-page overview of paradigm, approach, and design with justification
- Sampling plan: population definition, sampling method, target size, and recruitment strategy
- Instruments: questionnaires, interview guides, observation protocols, or measurement tools
- Procedure: step-by-step data collection protocol with timeline
- Analysis plan: specified techniques, software, and decision rules for interpretation
- Ethics documentation: consent forms, IRB application, and data management plan