
JBP Free Access Methods Edition

  • 1.  JBP Free Access Methods Edition

    Posted 09-07-2024 07:26

    Dear Colleagues,

    Our 2024 Methods Feature is freely available for two months. We hope you find it useful in your work.


    Data Aggregation in Multilevel Research: Best Practice Recommendations and Tools for Moving Forward

    James M. LeBreton, Amanda N. Moeller & Jenell L. S. Wittmer

    The multilevel paradigm is omnipresent in the organizational sciences, with scholars recognizing that data are almost always nested, either hierarchically (e.g., individuals within teams) or temporally (e.g., repeated observations within individuals). The paradigm is moored in the assumption that relationships between constructs often span levels, requiring data from a lower level (e.g., employee-level justice perceptions) to be aggregated to a higher level (e.g., team-level justice climate). Given the increased scrutiny in the social sciences around issues of clarity, transparency, and reproducibility, this paper first introduces a set of data aggregation principles that are then used to guide a brief literature review. We found that reporting practices related to data aggregation are quite variable, with little standardization as to what information and statistics authors include. We conclude with a Data Aggregation Checklist and a new R package, WGA (Within-Group Agreement & Aggregation), intended to improve the clarity and transparency of future multilevel studies.
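
    For readers who want a concrete feel for the agreement statistics this abstract alludes to, here is a minimal base-R sketch of the classic r_wg within-group agreement index (James, Demaree, & Wolf, 1984) followed by a simple team-level aggregation. This is not the authors' WGA package, whose API is not shown here; the 5-point scale, the uniform null distribution, and the example data are all illustrative assumptions.

        # Minimal sketch of the r_wg agreement index for a single item, base R only.
        # Assumes a 5-point response scale and the uniform "random responding" null,
        # for which the expected variance is (A^2 - 1) / 12.
        rwg_single <- function(x, scale_points = 5) {
          sigma2_E <- (scale_points^2 - 1) / 12  # expected variance under random responding
          s2 <- var(x)                           # observed within-group variance
          max(0, min(1, 1 - s2 / sigma2_E))      # bounded to [0, 1] by common convention
        }

        # Hypothetical employee-level justice ratings for three teams
        justice <- data.frame(
          team   = rep(c("A", "B", "C"), each = 4),
          rating = c(4, 4, 5, 4,  1, 5, 2, 4,  3, 3, 3, 3)
        )

        by_team   <- split(justice$rating, justice$team)
        agreement <- sapply(by_team, rwg_single)  # agreement justifies (or blocks) aggregation
        climate   <- sapply(by_team, mean)        # team-level "justice climate" score
        print(round(cbind(rwg = agreement, team_mean = climate), 2))

    In practice one would report the agreement index alongside the aggregated means and aggregate only where agreement is adequate.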

    Gone Fishin': Addressing Completeness, Accuracy, and Representativeness in the Search and Coding Processes of Meta-Analyses in the Organizational Sciences

    Ernest H. O'Boyle, Martin Götz & Damian C. Zivic

    No research question is compelling enough, nor is any meta-analytic procedure advanced enough, to overcome an ineffectual search or an inaccurate coding process. The bulk of attention toward meta-analyses in the organizational sciences has been directed at establishing the types of research questions meta-analyses are best equipped to address and how best to analyze secondary data. However, the meta-analytic process requires rigor and transparency at every step. Too often, the search and coding are non-systematic, resulting in a deficient and/or contaminated dataset and, ultimately, an inaccurate reflection of the extant literature. Using the analogy of a fishing trip, where fish are available studies and the oceans, lakes, and rivers are the sources of data, we highlight best practices and offer actionable takeaways for conducting and reporting a thorough and representative search and an accurate and inclusive coding process for meta-analyses in the organizational sciences.
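
    As a small illustration of the kind of coding-accuracy check this abstract argues for, the base-R sketch below computes Cohen's kappa for two hypothetical coders making include/exclude calls on the same studies. The codes are invented and the check is a generic convention, not a procedure taken from the paper.

        # Cohen's kappa for two independent coders screening the same eight studies.
        # Codes below are invented for demonstration.
        coder1 <- c("incl", "incl", "excl", "incl", "excl", "excl", "incl", "excl")
        coder2 <- c("incl", "excl", "excl", "incl", "excl", "incl", "incl", "excl")

        tab <- table(coder1, coder2)
        po  <- sum(diag(tab)) / sum(tab)                      # observed agreement
        pe  <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2  # agreement expected by chance
        kappa <- (po - pe) / (1 - pe)
        round(kappa, 2)  # 0.50 here; disagreements would then be resolved by discussion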

    Assessing Publication Bias: A 7-Step User's Guide with Best-Practice Recommendations

    Sven Kepes, Wenhao Wang & Jose M. Cortina

    Meta-analytic reviews are a primary avenue for the generation of cumulative knowledge in the organizational and psychological sciences. Over the past decade or two, concern has been raised about the possibility of publication bias influencing meta-analytic results, which can distort our cumulative knowledge and lead to erroneous practical recommendations. Unfortunately, no clear guidelines exist for how meta-analysts ought to assess this bias. To address this issue, this paper develops a user's guide with best-practice recommendations for the assessment of publication bias in meta-analytic reviews. To do this, we review the literature on publication bias and develop a step-by-step process to assess the presence of publication bias and gauge its effects on meta-analytic results. Examples of tools and best practices are provided to aid meta-analysts in implementing the process in their own research. Although the paper is written primarily for organizational and psychological scientists, the guide and recommendations are not limited to any particular scientific domain.
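
    For a taste of what such an assessment can involve, here is one widely used diagnostic, Egger's regression test, sketched in base R on simulated data. It is a generic illustration, not the paper's seven-step process, and all effect sizes and standard errors below are invented.

        # Egger's regression test: regress the standardized effect (d / SE) on
        # precision (1 / SE). An intercept far from zero suggests funnel-plot
        # asymmetry, one possible symptom of publication bias. Data are simulated
        # without bias, so the intercept should land near zero.
        set.seed(42)
        k  <- 30
        se <- runif(k, 0.05, 0.40)            # hypothetical standard errors
        d  <- rnorm(k, mean = 0.30, sd = se)  # hypothetical observed effect sizes

        egger <- lm(I(d / se) ~ I(1 / se))
        summary(egger)$coefficients["(Intercept)", ]  # estimate, SE, t value, p value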

    What's Holding You Back? Development of the Multi-Facet Organizational Constraints Scale (MOCS)

    Nathan A. Bowling, Jesse S. Michel, Md Rashedul Islam, Michael A. Rotch, Stephen H. Wagner & Lucian Zelazny

    Organizational constraints, which include any workplace condition that undermines a worker's ability to perform his or her job tasks, are an important type of work stressor. Previous research has typically assessed organizational constraints as a global (i.e., unidimensional) construct. In the current paper, we argue that a facet (i.e., multidimensional) approach to assessing organizational constraints would complement the global approach in important ways. A facet approach, for instance, would provide researchers with new insights into the fundamental nature of the organizational constraints construct, and it would provide practitioners with specific, actionable information that they could use to inform organizational policies and interventions. With these potential benefits of the facet approach in mind, we developed the Multi-Facet Organizational Constraints Scale (MOCS), a self-report measure that yields 16 separate facet-level scores. Across seven samples (total N = 1,600), we found that the MOCS had desirable psychometric properties: it yielded high levels of internal-consistency and test–retest reliability, it produced an interpretable factor structure, and we observed evidence supporting the MOCS's construct validity. By providing a means of assessing organizational constraints facets, the current research has both theoretical and practical implications for various research areas within applied psychology, including occupational stress, organizational culture, employee training, and leadership.
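
    The two reliability claims in this abstract are easy to make concrete. The base-R sketch below computes Cronbach's alpha and a test-retest correlation on simulated data; the four-item structure and the sample are invented for illustration and have nothing to do with the actual MOCS items.

        # Cronbach's alpha and test-retest reliability on simulated scale data.
        set.seed(1)
        n <- 200
        true_score <- rnorm(n)  # latent construct score
        items_t1 <- sapply(1:4, function(i) true_score + rnorm(n, sd = 0.8))  # time 1
        items_t2 <- sapply(1:4, function(i) true_score + rnorm(n, sd = 0.8))  # retest

        cronbach_alpha <- function(items) {
          k <- ncol(items)  # alpha = k/(k-1) * (1 - sum of item variances / total variance)
          (k / (k - 1)) * (1 - sum(apply(items, 2, var)) / var(rowSums(items)))
        }

        round(cronbach_alpha(items_t1), 2)                   # internal consistency
        round(cor(rowSums(items_t1), rowSums(items_t2)), 2)  # test-retest correlation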

    The Development and Validation of an Interpersonal Distrust Scale

    Hanyi Min & Michael J. Zickar

    Though many researchers have studied interpersonal trust, its counterpart, distrust, has been largely ignored. The relative dearth of distrust research may be a result of an early assumption that distrust represents a mere absence of trust. Nevertheless, recent reviews have pointed out that distrust is not the opposite of trust but rather a distinct construct (e.g., Lewicki, Tomlinson, & Gillespie, Academy of Management Review, 23(3), 438–458, 2006; Lumineau, Journal of Management, 0149206314556656, 2015). We use three studies to demonstrate empirically that distrust and trust are descriptively bipolar but functionally distinct constructs. In Study 1, we develop a distrust scale through a methodologically rigorous process; the scale shows good psychometric properties. In Study 2, we cross-validate the distrust scale. Evidence of discriminant validity also demonstrates that the distrust scale is distinct from subscales of trust and from another theoretically relevant construct (i.e., distrust propensity), providing the first empirical evidence that distrust is not redundant with trust. Moreover, we develop a theoretical model of distrust antecedents and outcomes based on social exchange theory and empirically investigate the nomological network of interpersonal distrust in Study 3. Consistent with the hypotheses, interpersonal distrust significantly correlates with the theorized antecedents and consequences across two samples. Additionally, our findings in Study 3 demonstrate that distrust relates to other constructs with significantly different strength than trust does, further supporting the conclusion that distrust and trust are descriptively bipolar but functionally independent constructs.
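
    The Study 3 claim, that distrust and trust relate to other constructs with different strength, is the kind of question typically answered with a test for dependent correlations. Below is a base-R sketch of Williams' t test (as recommended by Steiger, 1980) for two correlations sharing one variable; the correlations and sample size are invented, and the formula follows the standard textbook form rather than anything reported in the paper.

        # Williams' t test for comparing two dependent correlations that share one
        # variable: does the outcome (1) correlate with distrust (2) and with
        # trust (3) at different strengths? df = n - 3.
        williams_t <- function(r12, r13, r23, n) {
          detR <- 1 - r12^2 - r13^2 - r23^2 + 2 * r12 * r13 * r23  # |R| of the 3x3 correlation matrix
          rbar <- (r12 + r13) / 2
          t <- (r12 - r13) * sqrt(((n - 1) * (1 + r23)) /
                 (2 * ((n - 1) / (n - 3)) * detR + rbar^2 * (1 - r23)^3))
          c(t = t, p = 2 * pt(-abs(t), df = n - 3))
        }

        # Invented values: r(outcome, distrust) = -.45, r(outcome, trust) = .30,
        # r(distrust, trust) = -.35, n = 250
        round(williams_t(r12 = -0.45, r13 = 0.30, r23 = -0.35, n = 250), 4)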



    --


    ---------------------------------------------------------------------------
    Steven G. Rogelberg, PhD 
    Chancellor's Professor
    Past-President, Society for Industrial and Organizational Psychology  
    Co-Editor, Journal of Business and Psychology

    Professor, Organizational Science, Psychology, and Management
    University of North Carolina, Charlotte | Colvard 4025 
    9201 University City Blvd. | Charlotte, NC 28223