The Connecting Force of Gratitude

So, whatcha getting your granddaughter this year for Christmas?

Wait, you’re not a grandma yet?

Your kids are only 5 and 3?

Me too.

But get this: the latest research coming out of my fave (am I allowed to say that?) research journal, Environmental Psychology, is saying that if we’re grateful…


We’re more likely to care about future generations.

And if we’re aware of a responsibility toward future generations, we’re more likely to make choices that protect the planet they’ll inherit.


I love research.

So that means that if you make an effort to get your grateful on, you’re basically making the planet a better place for future generations. (This, my friends, is an oversimplification, but a reasonable conclusion. Read on to decide for yourself!)

While I love the simplified outcomes of research, I also love digging into research to appreciate how the authors came to their conclusions. I’ll let you in on some of my secrets for determining whether a research paper is up to snuff, and how to tease apart meaning. These days, when you can get numbers to say anything, how can you know if you’re reading truth? I’ll tell you.

  1. Head to the Original Article: 

Yep. If you’re reading a news piece, a summary article (like this one), a montage, or a quote, never take it at face value. If you’re particularly struck by a piece of “research,” make sure that the author cites the piece being referenced. People have a tendency to over-simplify, misunderstand, and poorly regurgitate research. The red flags to look for in social science research (anything about the family), and really in any research, that should make you think twice are:

  • Arguments and quotes that seem really conclusive. Most research comes with pretty significant caveats. Because of the nature of statistical analysis and clinical trials, it’s impossible to test every facet of every situation and make umbrella statements that are true. Instead of words like “caused,” “entirely responsible,” and other black-and-white, all-or-nothing inferences, original research will say “correlated,” “contributes to,” and “influenced.” Often that’s the best we can do. Always think of research as a really, really, really good idea of the process being questioned, with helpful suggestions for implementation.
  • No citation of research or claims.
  • Really strong wording like “proves” or “causes”.
  2. Journal Assessment: 

Whatever research you’re reading, make sure it’s published through a peer-review process. The peer-review process takes ten hundred years and is really grueling. Okay, it takes about 9 months, but I’ve had papers in the cogs of journal review for over a year and a half. This is awesome because when research is actually published this way, it’s been validated and gone over with a fine-tooth comb. Here’s how the sequence of publishing in a peer-reviewed journal works. 

  • Step 1: Conduct a preliminary research review to determine if your hypothesis (Does gratitude affect environmental concern?) has the potential to be valid. AKA, look up all previous research that’s ever been done remotely pertaining to your area of interest. Often this step is refined later, especially when your data surprises you with unforeseen connections.
  • Step 2: You conduct a research study, which entails getting permission from the powers-that-be to work with human subjects, designing materials and scales and questionnaires, putting together your testing process, recruiting participants, and collecting the data. 
  • Step 3: You analyze the data. Run numbers through statistical programs. Make your grad students organize a billion rows of questionnaire input. Determine statistical significance of relationships (Does A impact B enough to really count?). Make charts. Assess the outcomes. 
  • Step 4: Write your paper. Usually about 30 pages, including a thorough literature review, a description of your measures and items, methods of analysis, results, and conclusions. Try to format a table in landscape orientation when the rest of the paper is in portrait. Cry seventeen tears. Put all sources in correct formatting. Have it edited 10* times. * = rough estimate
  • Step 5: Select journal of choice and reformat according to specifications, then submit paper to journal.
  • Step 6: Wait for 1-6 months while your paper is considered and reviewed by 3-5 academic and professional experts to determine validity.
  • Step 7: Hear back from the journal with either a “Rejected” or “Accepted with Revisions.” They say there’s a possibility of pure acceptance without revision, but I’ve never seen it personally. Receive feedback from each reviewer with a list of changes to make and hard questions to address.
  • Step 8: Redo analyses, the lit review, and all other parts of your paper to be better, based on the input from reviewers and the editor. Write up a response documenting the changes you made and why (or why not). Coordinate changes with all authors and contributors to the paper.
  • Step 9: Resubmit
  • Step 10: Wait 1-6 months.
  • Step 11: Receive exhilarating news that your paper has been accepted and will be published in 1-6 more months.

Total time: 2-5 years. For real. Unless it’s a super hot topic and your research process is slick. Then 9 months at best.
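That “Does A impact B enough to really count?” question from Step 3 is what a significance test answers. Here’s a tiny sketch in Python with totally made-up gratitude and environmental-concern scores (my invented numbers, not the study’s):

```python
# Does A track B enough to "really count"? A rough significance check
# on hypothetical data -- these numbers are illustrative, not real.
import math

def pearson_r(xs, ys):
    """Pearson correlation: how tightly two variables move together."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up gratitude scores (A) and environmental-concern scores (B)
gratitude = [2, 3, 5, 4, 6, 7, 8, 6, 9, 10]
concern   = [1, 4, 4, 5, 5, 8, 7, 7, 9, 9]

r = pearson_r(gratitude, concern)
n = len(gratitude)
# t-statistic for r; for n = 10, |t| above ~2.3 roughly means p < 0.05
t = r * math.sqrt((n - 2) / (1 - r ** 2))
print(f"r = {r:.2f}, t = {t:.2f}")
```

The point isn’t the exact numbers; it’s that “significant” has a specific, testable meaning, not just “sounds big.”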

  3. Look for Meta-Analyses:

Meta-analyses are the research gold. These are studies that take many many other publications and analyze those studies together in a statistical program. It’s essentially like taking the best info from a bunch of authors and putting them all together to look for conclusions. These conclusions are particularly powerful because they show trends, averages, and can suggest pretty powerful overall guidelines.
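If you’re curious what “analyzing those studies together” looks like under the hood, here’s the simplest flavor: a fixed-effect pooling sketch in Python, with invented effect sizes. Bigger, more precise studies get more say in the combined answer:

```python
# Toy fixed-effect meta-analysis: pool effect sizes from several
# hypothetical studies, weighting each by the inverse of its variance,
# so large, tight studies count for more than small, noisy ones.
# (effect_size, variance) pairs -- illustrative numbers only.
studies = [(0.30, 0.04), (0.45, 0.09), (0.25, 0.02), (0.50, 0.16)]

weights = [1 / var for _, var in studies]
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
print(f"pooled effect: {pooled:.3f}")
```

Real meta-analyses add a lot on top of this (random-effects models, heterogeneity checks, publication-bias tests), but the weighted-average core is the same idea.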

  4. Check out the Authors and Identify Conflicts of Interest: 

Look into the authors! Who are these people? Usually it’s the editors’ job to make sure there aren’t any conflicts of interest between a topic and a researcher, and if there are, to disclose them. Often though, companies will have their own research entities that have very explicit interests, and their “third party” analyses are very biased, often even using their own publisher, etc. You’ll want to see who sponsored the research (who paid for the 2 years of work these people did?) and the reasons they might have for wanting “proof.” If they stand to gain a lot of moolah, I consider that a red flag. Not a deal breaker, but a red flag.

  5. Note the Restrictions and Limitations:

Each paper is required to list its weaknesses. Inherently, all research involving relationships of any kind has the weakness of being correlational, not causal. Any representation of this research should ideally show this. It’s way more fun to say on TV, though, that “X caused Y” and therefore we should “Z.” And a lot of times the media and others ASSUME that we all know this. Be the smart consumer, and actually know it. Assume that all data presented as fact is more like a really good guess, with the goodness of the guess depending on how well they executed the previous 4 elements of this list. 
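Here’s a toy Python simulation of why “correlated” isn’t “caused.” X and Y below never influence each other at all; a hidden factor Z drives them both, and they still end up strongly correlated. (All numbers invented.)

```python
# Correlation is not causation: X and Y never touch each other --
# both are driven by a hidden confounder Z -- yet they correlate.
import math
import random

random.seed(42)
z = [random.uniform(0, 10) for _ in range(200)]   # hidden confounder
x = [v + random.gauss(0, 0.5) for v in z]         # X depends only on Z
y = [v + random.gauss(0, 0.5) for v in z]         # Y depends only on Z

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    sy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)

r_xy = pearson_r(x, y)
print(f"r(X, Y) = {r_xy:.2f}")  # strong correlation, zero causation
```

Swap in “ice cream sales” for X, “drownings” for Y, and “summer” for Z, and you’ve got the classic cautionary tale.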

  6. Apply science knowing that it’s helpful for the group, but inapplicable for the individual:

Here’s what I mean. If you want to know what 2293 Canadian mothers did to help their kids’ teething pain, great. But if you want to know what to do with your kid? You’re on your own. We can always try the suggestions that come out of research, but know that any one person could fit anywhere on the bell curve of researched attitudes, behaviors, and outcomes. Isn’t that nice? After all that.
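To see the group-versus-individual point in numbers, here’s a quick simulated example (made-up scores, obviously): the group average is tidy and stable, but individuals are scattered all over the bell curve.

```python
# The group average can be rock solid while any one individual sits
# far from it: simulated "outcome scores" for 500 people.
import random
import statistics

random.seed(7)
scores = [random.gauss(50, 10) for _ in range(500)]  # invented distribution

mean = statistics.mean(scores)
lo, hi = min(scores), max(scores)
print(f"group mean: {mean:.1f}, but individuals range {lo:.1f} to {hi:.1f}")
```

The mean tells you about the group; it tells you almost nothing about where any single person (or kid) will land.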

If you can go through all these steps, you’re going to be able to discern the quality of the research being presented. 

Do most reporters or bloggers do this? 

Not a chance. 

They’ll often take a line of text and use it (out of context) as conclusive evidence that their point is pure truth. However, if you ask the authors of the research how to interpret their findings, they’ll almost always say “It depends!” 

Beware of overly simplified conclusions.

In the case of this study, let’s run it through the list.

  1.  The original article is blocked for me because I’m no longer able to access these databases for free. I was totally spoiled before through my former hookups as a university adjunct professor and from my days as a student. I’m taking my info from the original abstract, or summary, that the article authors prepared for publication. You can find the article HERE.
  2. Journal assessment: I’ve published with Environmental Psychology, and it’s a great journal. You can also judge a journal by its impact factor. I found THIS article to be a helpful summary of impact factors.
  3. Meta-analysis? Nope. But kind of. This study is actually an accumulation of seven different studies performed by the same authors. 
  4. This group of authors spans North America. This is particularly fun to me. Oregon, Canada, Massachusetts. It’s a hard thing to coordinate efforts and funding across institutions and geography like that, especially in a pandemic. I didn’t see any conflicts of interest, but I could dig deeper if this was a hot topic or something highly debatable.
  5. I didn’t see any notes about limitations because I didn’t read the full article (interesting, right?). I assume that their hands are tied in the ways that most research studies are: correlation, not causation, and not longitudinal (they didn’t measure data over time, just at one point).
  6. I recognize that while most grateful people are environmentally conscious, I could be the exception.

I hope this little guide helps you break down the research that is misinterpreted, simplified, and thrown around social media like parade candy these days. And I hope that you feel great about giving your future granddaughter a better world because you’re choosing gratitude in the midst of the chaos of 2020!

