Which is a 2020 self-reported online survey. As you have pointed out, there may be issues with that way of getting data. However, the DOD SHARP data comes from our online, anonymous Command Climate Surveys and SHARP surveys, which are the same style of data collection as the study RAINN cited. So again, the numbers I posted are apples-to-apples numbers.
Sorry, but just because two studies have similar structural defects in data collection does not mean they are "apples to apples" comparable. Two self-selecting studies can both be biased, but they will not necessarily produce the same biases, or biases of the same magnitude, which means that comparisons between the two cannot be expected to be meaningful.
Put simply, a self-selected survey relating to sexual assault will typically attract a disproportionate number of responses from people who have suffered from sexual assault, which will significantly inflate the apparent incidence of sexual assault in the results. Thus, any well-constructed survey that intends to determine the actual rate of sexual assault within a given population needs to randomly sample that population. This way you can be more confident in the relationship between the numerator and denominator when you calculate your percentages.
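To make the self-selection point concrete, here's a minimal simulation. All the numbers in it (a 5% true rate, victims being 4x as likely to opt in) are assumptions I've picked purely to illustrate the mechanism, not figures from any of the surveys under discussion:

```python
import random

random.seed(42)

# Hypothetical setup: population of 100,000 with a true 5% victimization rate.
TRUE_RATE = 0.05
population = [random.random() < TRUE_RATE for _ in range(100_000)]

# Random sample: every member is equally likely to be surveyed.
random_sample = random.sample(population, 2_000)
random_est = sum(random_sample) / len(random_sample)

# Self-selected sample: assume victims opt in at 20% and non-victims at 5%
# (a 4x factor chosen only to demonstrate the direction of the bias).
def opts_in(is_victim):
    return random.random() < (0.20 if is_victim else 0.05)

self_selected = [p for p in population if opts_in(p)]
self_est = sum(self_selected) / len(self_selected)

print(f"true rate:         {TRUE_RATE:.1%}")
print(f"random-sample est: {random_est:.1%}")
print(f"self-selected est: {self_est:.1%}")
```

The random sample lands near the true 5%, while the self-selected sample comes out several times higher, even though nobody misreported anything. Different opt-in factors would produce differently-sized distortions, which is exactly why two self-selected surveys aren't automatically comparable to each other.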
On the other hand, an official DoD survey - even one that pinky swears that it is anonymous - might well serve to suppress responses (at least compared to a third-party survey over the same subject conducted on a college campus) given the historically high incidence of retribution for officially reporting sexual assaults - see the RAND surveys relating to sexual assault retaliation in the military.
You are free to dismiss all of them as junk if you like; I'm not going to argue the veracity of data I did not collect.
Thank you, I will, but you are the one who selected the sources for the percentages you posted and compared. Personally, I suspect they might have just been the first search results that agreed with your position, so you didn't bother to vet them any further than that.
"Look, X proves Y!""X doesn't prove Y.""How can you expect me to defend X? But Y has been proven."I will insist on pointing out that the DOJ study you posted gets it's numbers (at least in the chart on page 1) from assaults reported to police, and it's widely recognized that many sexual assaults, especially those falling short of rape, go un-reported.
Please insist all you like, but that is not correct. If you read the caption of the chart on page 1, or skim the very first paragraph of the study, you might notice that it was not in fact claiming to get its numbers from assaults reported to police, but was showing the relative percentage of rape or sexual assault victimizations that were reported to police.
If you were to go on to read the highlights (also on page 1) you would see that:
This report uses the National Crime Victimization Survey (NCVS) to compare the rape and sexual assault victimization of female college students and nonstudents.
The NCVS has its limitations, but compared to your surveys it is unquestionably the better resource for determining rate of victimization within a population.
Page 2 of the DOJ study you cited goes into a little detail on why it's not an apples-to-apples comparison with other surveys on the subject, and why its outcomes vary widely from those of other methods of data collection.
Come back here with those goalposts, dogmush!
You kept saying "apples-to-apples" specifically because the definitions of included offenses were sufficiently similar between your sources.
That said, you make a valid point. I should not have compared the DOD and DOJ studies. My only excuse is that when I looked I could not readily find the methodology of your DOD source and gave it more credence than I should have. Unlike the DOJ survey, the surveys you cited are inherently (maybe by design?) not capable of determining a realistic rate of incidence of rape and sexual assault within their respective populations, and are thus not suitable for comparison to other studies, including ones that are more likely to provide relatively accurate victimization rates.
The comparisons on pages 2 and 3 of the study do show the relative strength of the NCVS versus some other surveys out there. For example, the NCVS is conducted on a nationally representative sample of 240,000 individuals with an overall response rate of 74%. (Those are some pretty good numbers, FYI.)
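For a sense of why those numbers are good: a back-of-the-envelope margin-of-error calculation, using the standard simple-random-sample formula. The NCVS actually uses a more complex stratified/clustered design, and the 1% proportion below is an illustrative placeholder rather than a figure from the study, so treat this strictly as intuition:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p estimated from a
    simple random sample of size n (textbook approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Sample size times response rate, per the figures quoted above.
n_respondents = int(240_000 * 0.74)

# Illustrative 1% proportion (an assumption, not from the study):
moe = margin_of_error(0.01, n_respondents)
print(f"~{n_respondents:,} respondents -> margin of error ±{moe:.3%}")
```

With that many respondents, even a rare outcome can be estimated to within a few hundredths of a percentage point, which is a very different situation from an opt-in survey of unknown composition.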
Once again, I don't have any data that shows you are wrong in your assertion that the military is relatively safe for women to be in. Nor do I have evidence showing that there is a "rape culture" either in the military or in the general population. Hawkmoon and you are the ones who made those assertions, so I'll leave you guys to sort out the validity of your respective claims.
I mostly took issue with the way you tried to gussy up your biases and prior assumptions in pretentious pseudo-scientific dressing and invalid, alarmist data.