
Looking Forward

Robert D. Brown, 1983–1988

Rather than wax nostalgic about my editorial tenure 20 years ago, I will focus on the future by sharing my thoughts about the Journal of College Student Development and the student affairs profession. First, I pose three questions that a graduate student might consider for a thesis topic and then I discuss several challenges directed primarily at researchers and graduate program faculty.

Research Questions for a Thesis

Who reads the Journal? What are the demographics of the readers? Are they mostly graduate students? Are they reading for a class assignment or a literature review for a research project? How many readers are faculty members, practitioners, or from other disciplines or work settings? How do they choose which articles to read and do they read the entire article or focus primarily on the recommendations and conclusions?

Do readers understand the articles? Many quantitative articles employ sophisticated statistical analyses taught in advanced statistics courses (e.g., multiple regression and various forms of factor analysis). The same can be said for qualitative methodologies. Yet, data suggest that new professionals feel least prepared, and their supervisors see them as least competent, in program assessment and in quantitative and qualitative research methodologies (Cuyjet, Longwell-Grice, & Molina, 2009). If Journal readers do not have a full grasp of the quantitative or the qualitative methodologies, how do they form a judgment as to whether the analyses were appropriate or appropriately interpreted?

Do readers understand the theories discussed in the thought pieces? Do they rethink their own views of student learning and development because of what they read?

Do readers apply what they have learned from the Journal articles? Do they discuss what they have read with colleagues? Do they process what they have read in a reflective manner? Do they reformulate their conceptions of student learning and development? Do they search out related articles? Do they make an effort to apply the authors' recommendations to their professional practice or to their respective student affairs unit?

Answers to these questions could be informative to graduate training faculty and student affairs administrators.

Thoughts on Improving the Journal's Impact

Here are a few thoughts on how researchers, graduate training programs, and student affairs units can enhance the impact of the Journal on professional practice.

Survey research needs to be followed up with interventions, programs, and treatments. Manuscripts describing studies of student alcohol abuse and student retention exemplify what were pet peeves during my editorial tenure and remain so. One researcher annually sent me a manuscript documenting student drinking during spring break in Florida. Baseline data can be useful, but I suggested to the author that it would be more fruitful if an equal amount of time were spent designing and implementing an intervention program for students with alcohol issues.

The same can be said about retention studies that entail numerous correlations and prediction equations. The literature is replete with studies finding that this or that student variable is related to retention, but studies that follow up on these findings with evaluations of programmatic efforts to reduce drop-out rates are too rare.

Journal editors request, if not require, that authors discuss the implications for practice, as well as the implications for future research, in their manuscripts. But I wonder whether readers apply these suggestions and whether researchers themselves design and evaluate the programs, interventions, or treatments they propose.

Researchers need to move through all phases of a research problem and not stop at the survey or multiple regression stage. That is, the researcher first should pose and address the problem. Then, if practical implications follow from survey or correlation findings, the researcher needs to play a role in implementing and evaluating whatever application (program or intervention) is recommended. Survey and correlation studies are easier to conduct than experimental or program evaluation studies, and they have their place, as does basic research, but I believe researchers have a moral obligation to see that their recommendations are implemented and the results evaluated.

Journal editors are often viewed as gatekeepers of the profession, but they can only publish the manuscripts they have in hand. They can encourage and cajole, but...

