
Rape Culture in Dog Parks: Feminist scholarship fooled by clever hoax.

 
 
Reply Sun 14 Oct, 2018 06:36 pm
A trio of researchers has been publishing fake studies in prestigious feminist and gender studies journals. They submitted 20 papers for review, and seven of them were accepted for publication, including...

- A study of rape culture among dogs in a dog park.
- An analysis of the benefits of chaining male students to the floor as a classroom exercise.
- A section lifted from Mein Kampf with phrases like "Jewish conspiracy" changed to phrases like "oppressive patriarchy".
- A study suggesting that advocating male "penetrative" use of anal sex toys as part of masturbation might decrease oppressive behavior.

If you don't see how this is funny, you don't have a sense of humor. I would have loved to help write these articles.

This hoax was uncovered by an astute Wall Street Journal reporter who thought the dog park study published in the academic journal called "Gender, Place & Culture" was a little suspicious.

This also shows the power of a narrative. If people have an unquestioning belief in rape culture and an opposition to what they see as the "patriarchy," they are likely to accept any study that fits this narrative without even questioning whether it makes sense.

You can read the submitted papers and the reviewers' comments here:

https://drive.google.com/drive/folders/19tBy_fVlYIHTxxjuVMFxh4pqLHM_en18

My personal favorite is:

Quote:
Specifically, this study seeks to explore, “Do men who report greater comfort with receptive penetrative anal eroticism also report less transphobia, less obedience to masculine gender norms, greater partner sensitivity, and greater awareness about rape?” This study uses semi-structured interviews with thirteen men to explore this question, analyzed with a naturalist and constructivist grounded theory approach in the context of sexualities research and introduces transhysteria as a parallel concept to Anderson’s homohysteria. This analysis recognizes potential socially remedial value for encouraging male anal eroticism with sex toys.







 
maxdancona
 
  1  
Reply Sun 14 Oct, 2018 06:37 pm
@maxdancona,
This is apparently causing a stir in the academic world

https://www.chronicle.com/article/What-the-Grievance/244753
0 Replies
 
Thomas
 
  2  
Reply Mon 15 Oct, 2018 03:33 am
I wonder if there has been any effort to try this with one of the hard-science or engineering magazines. I suspect you might get away with a nonsense paper full of object-oriented, Agile, next-generation, enterprise-class buzzwords, but I don't know of anyone having made the attempt. Do you?
engineer
 
  2  
Reply Mon 15 Oct, 2018 05:24 am
@Thomas,
An analysis of the twenty papers: https://slate.com/technology/2018/10/grievance-studies-hoax-not-academic-scandal.html

Quote:
What about the seven papers that were accepted for publication? One was a collection of poetry for a journal called Poetry Therapy. Let’s be clear: This was bad poetry. (“Love is my name/ And yours a sweet death.”) But I’m not sure its acceptance sustains the claim that entire fields of academic inquiry have been infiltrated by social constructivism and a lack of scientific rigor.

Another three plants were scholarly essays. Two were boring and confusing; I think it’s fair to call them dreck. That dreck got published in academic journals, a fact worth noting to be sure. The third, a self-referential piece on the ethics of academic hoaxes, makes what strikes me as a somewhat plausible argument about the nature of satire. The fact that its authors secretly disagreed with the paper’s central claim—that they were parroting the sorts of arguments that had been made against them in the past, and with which they’ve strongly disagreed—doesn’t make those arguments a priori ridiculous.

That leaves us with three more examples of the hoax. These were touted as the most revealing ones—the headline grabbers, the real slam dunks: the dog-rape paper, the dildo paper, the breastaurant research. They also share a common trait: Each was presented as a product of empirical research, based on original data. The dog-rape study is supposed to have resulted from nearly 1,000 hours of observation at three dog parks in southeast Portland. The dildo paper pretends to draw from multihour interviews with 13 men—eight straight, two bisexual, three gay—about their sexual behaviors. And the breastaurant research claims to have its basis in a two-year-long project carried out in northern Florida, involving men whose educational backgrounds, ages, and marital statuses were duly recorded and reported.

How absurd was it for such work to get an airing? It may sound silly to investigate the rates at which dog owners intervene in public humping incidents, but that doesn’t mean it’s a total waste of time (as psychologist Daniel Lakens pointed out on Twitter). If the findings had been real, they would have some value irrespective of the pablum that surrounds them in the paper’s introduction and discussion sections.


I don't think it is hard to completely make up data and hoax scientific magazines, even the hard-science ones. It happens periodically, and it's not hard to find examples where scientists falsified data. There is a real debate going on around what the proper significance threshold should be for studies, since the p = 0.05 standard is easily abused. The entire journal system assumes a level of integrity that can be easily abused. Those studies would likely fail peer review, but it's not like the journals replicate the experiments from each article they publish.
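To illustrate the point about the p = 0.05 standard, here is a minimal sketch (my own illustration, not taken from the hoax or any of the linked articles) of how testing many unrelated hypotheses against that threshold produces spurious "significant" results from pure noise; the choice of 20 tests and 30 samples per group is an arbitrary assumption.

Code:
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha = 0.05
n_hypotheses = 20   # hypothetical: 20 unrelated outcome measures, none with a real effect
false_positives = 0

for _ in range(n_hypotheses):
    # Two groups drawn from the SAME distribution, so any "significant" difference is spurious.
    a = rng.normal(size=30)
    b = rng.normal(size=30)
    _, p = stats.ttest_ind(a, b)
    if p < alpha:
        false_positives += 1

# With 20 independent tests, the chance of at least one false positive is about 1 - 0.95**20, roughly 64%.
print(f"Spurious 'significant' results: {false_positives} out of {n_hypotheses}")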
engineer
 
  1  
Reply Mon 15 Oct, 2018 05:37 pm
@engineer,
This topic got me looking around and there are some real doozies out there. This guy could have killed people.
Quote:
One Duke University surgeon called it a “new frontier” in cancer treatment. Another said it could save “10,000 lives a year” or more. A researcher at Mass General Hospital called it “a very, very exciting tool” in the fight against lung cancer. As news spread in 2006 and 2007 of the work of Anil Potti, a star cancer researcher at Duke, the excitement grew.

What he had claimed to achieve, in leading medical journals, was a genomic technology that could predict with up to 90 percent accuracy which early stage lung cancer patients were likely to have a recurrence and therefore benefit from chemotherapy.

He had developed, Potti said in interviews at the time, a genomic “fingerprint unique to the individual patient” that would predict the chances of survival of early stage lung cancer patients.

It was considered a breakthrough because, as the Economist explained at the time, chemotherapy is “a blunt instrument … In most cases a patient’s survival depends on whether he dies from the side effects of chemotherapy before the chemotherapy kills the cancer, or vice versa. A way to pick the right type of chemotherapy would make a big difference. Anil Potti and colleagues, of Duke University in North Carolina, have proven — in principle, at least — that they can do exactly that. Instead of prescribing chemotherapies according to a doctor’s best guess, they propose a genetic analysis to predict which type of chemotherapy would stand the greatest chance of zapping cancerous cells.”

And they had ample reason for their praise. After all, the revolutionary findings by Anil Potti and his team were first published in Nature Medicine, one of the most prestigious peer-reviewed journals in the field, and later in a host of other prestigious journals.

Now, the Office of Research Integrity (ORI), the agency that investigates fraud in federally-funded medical research, has officially declared that the data generated by Potti was not only flawed, but “false.”

The data was “altered,” it said in a report published Monday in the Federal Register, to produce the results desired by the researchers. False data were also submitted to obtain further grants for research, it concluded, citing a claim by Potti that 6 of 33 patients responded favorably to a test when only 4 patients were enrolled in the trial, none of them responding positively.

Or how about this guy, who was just doing it for the money? I love that he sued his employer for pointing out problems in his work.

Quote:
Harvard Medical School and Brigham and Women’s Hospital have recommended that 31 papers from a former lab director be retracted from medical journals.

The papers from the lab of Dr. Piero Anversa, who studied cardiac stem cells, “included falsified and/or fabricated data,” according to a statement to Retraction Watch and STAT from the two institutions.

Last year, the hospital agreed to a $10 million settlement with the U.S. government over allegations Anversa and two colleagues’ work had been used to fraudulently obtain federal funding. Anversa and Dr. Annarosa Leri — who have had at least one paper already retracted, and one subject to an expression of concern — had at one point sued Harvard and the Brigham unsuccessfully for alerting journals to problems in their work back in 2014. Anversa’s lab closed in 2015; Anversa, Leri, and their colleague Dr. Jan Kajstura no longer work at the hospital.

Here's a website specializing in retractions from journals that would rather keep them on the down low: http://retractionwatch.com/
engineer
 
  1  
Reply Mon 15 Oct, 2018 05:42 pm
@engineer,
Here's an article from the Guardian

Quote:
Dozens of recent clinical trials contain suspicious statistical patterns that could indicate incorrect or falsified data, according to a review of thousands of papers published in leading medical journals.

The study, which used statistical tools to identify anomalies hidden in the data, has prompted investigations into some of the trials identified as suspect and raises new concerns about the reliability of some papers published in medical journals.

The analysis was carried out by John Carlisle, a consultant anaesthetist at Torbay Hospital, who previously used similar statistical tools to expose one of the most egregious cases of scientific fraud on record, involving a Japanese anaesthesiologist who was found to have fabricated data in many of his 183 retracted scientific papers.

In the latest study, Carlisle reviewed data from 5,087 clinical trials published during the past 15 years in two prestigious medical journals, Jama and the New England Journal of Medicine, and six anaesthesia journals. In total, 90 published trials had underlying statistical patterns that were unlikely to appear by chance in a credible dataset, the review concluded.
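As a rough illustration of the kind of screening described above (my own sketch, not Carlisle's actual method), a reviewer can check whether the p-values reported for baseline variables in a randomised trial look roughly uniform, as they should if the groups were truly randomised; the function name and the example numbers below are hypothetical.

Code:
from scipy import stats

def baseline_p_values_look_suspicious(p_values, alpha=0.01):
    # Kolmogorov-Smirnov test of the reported baseline p-values against a uniform distribution on [0, 1].
    statistic, p = stats.kstest(p_values, "uniform")
    return p < alpha, statistic

# Hypothetical example: baseline p-values clustered implausibly close to 1.0,
# as if the groups were "too well balanced" to have come from real randomisation.
reported = [0.97, 0.99, 0.95, 0.98, 0.96, 0.99, 0.97, 0.98]
flag, ks_stat = baseline_p_values_look_suspicious(reported)
print(f"Flagged as suspicious: {flag} (KS statistic = {ks_stat:.2f})")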
0 Replies
 
maxdancona
 
  1  
Reply Mon 15 Oct, 2018 06:05 pm
Quote:
Consequently, I examine the following questions, which are underdeveloped within intersectional animal/feminist literature: (1) How do human discourses of rape culture get mapped onto dogs’ sexual encounters at dog parks; particularly, how do companions manage, contribute, and respond to ‘dog rape culture’? (2) What issues surround queer performativity and human reaction to homosexual sex between and among dogs? and (3) Do dogs suffer oppression based upon (perceived) gender?


Come on, you guys! There are no Physics journals considering whether quantum entanglement causes problems for customers at hair salons. There is a big difference between presenting fraudulent data and getting feminist journals to consider whether dogs "suffer oppression based upon (perceived) gender."


These feminist journals were willing to accept studies that were patently absurd, without question, as long as they followed a basic ideological template. You don't see that happen in hard sciences.

That is the scandal. This is a continuation of something called the Sokal affair: a test to see if a social science journal would "publish an article liberally salted with nonsense if (a) it sounded good and (b) it flattered the editors' ideological preconceptions".



maxdancona
 
  1  
Reply Mon 15 Oct, 2018 06:18 pm
@maxdancona,
The hard sciences are different in significant ways.

1. The hard sciences aren't integrally entwined with a political ideology.

2. When people challenge the dominant paradigm and have data to back it up, people are excited. They don't hide the data. I dated a woman who was studying the atmosphere of Mars; her team discovered that its stratification completely defied the current models. They didn't hide this data, they celebrated it (and it turned into a large amount of grant money).

3. Data rules. We take it as a given that faster-than-light information transmission is impossible. If someone were able to break this limit... we wouldn't be threatened by it. It would upset much of what we understand of Physics. People would be very skeptical at first, but they would rush to test it... and if it held and was reproducible, it would be celebrated.

I was disappointed when Pons and Fleischmann didn't pass muster. I wasn't surprised, but it would have been much cooler to see the paradigms of Physics change. If cold fusion could have been reproduced reliably, it would have been accepted in spite of the fact that it breaks the current paradigm.

Data is what matters in hard science, not ideology. (Of course science is not perfect in terms of funding and prestige, etc., but if someone has the data to back up their findings and the results can be reproduced, their idea will prevail.)

This hoax is built on the idea that if you get ideology right, academic feminism will accept it no matter how ill-supported it is.
0 Replies
 
engineer
 
  1  
Reply Tue 16 Oct, 2018 05:49 am
@maxdancona,
maxdancona wrote:

These feminist journals were willing to accept studies that were patently absurd, without question, as long as they followed a basic ideological template.

I think you have it backwards. While the absurd articles were rejected, the ones that presented hard data and claimed to use an appropriate scientific approach were accepted. Your position appears to be that if it doesn't match your viewpoint, it should be censored a priori even though it appears to be rigorous. The purpose of journals is to publish work that seems to be legitimate so that it can be subjected to peer review. My guess is that editors don't sit around making judgment calls on whether something is worthy of study.
maxdancona
 
  1  
Reply Tue 16 Oct, 2018 06:19 am
@engineer,
The rape culture in dog parks article, suggesting that dogs "humping" in the park are oppressed by heteronormativity, was published. You can read the article.

The research involved sitting in the park watching dogs humping and judging if the dogs enjoyed it as a measure of consent. It is patently ridiculous. It was accepted because it fit the ideological template.

So was the article suggesting that convincing heterosexual men to put dildos in their anuses would have a positive effect in reducing sexual violence. The feminist Mein Kampf article was accepted, but not published before the story broke.

Do you really not see how absurd these articles are?

I think they made their point.
engineer
 
  1  
Reply Tue 16 Oct, 2018 07:11 am
@maxdancona,
maxdancona wrote:

The research involved sitting in the park watching dogs humping and judging if the dogs enjoyed it as a measure of consent. It is patently ridiculous. It was accepted because it fit the ideological template.

So your position is that editors should censor articles that you find ridiculous even if the research is solid? Are you willing to consider that what you find ridiculous, others might find mildly interesting or even tangentially related to research they are doing? People in Congress might see someone studying the dynamics of frisbee flight as a ridiculous waste of effort; does that mean no one should publish the results?

I think they made the point that someone who is willing to submit a good enough fake paper can get it published, but a lot of people have already made that point.
tsarstepan
 
  1  
Reply Tue 16 Oct, 2018 07:40 am
@maxdancona,
A handful of feminist idiots were triggered by an obvious hoax and this proves your whole worldview ... how?

Hint: it doesn't. There are dangerous idiots on every side of every political, social, and cultural equation. Fruit rotting on the ground (and that's what these gullible idiots who fell for this Onionesque trap are) isn't even close to low-hanging fruit.

maxdancona wrote:

If you don't see how this is funny, you don't have a sense of humor.

It's not offensive, nor is it funny. It's just kind of lame and mean-spirited.
maxdancona
 
  1  
Reply Tue 16 Oct, 2018 07:54 am
@engineer,
The point of this hoax was to show that you can get ridiculous work accepted as long as it flatters the ideological beliefs of editors and reviewers. Did you read the papers? It isn't just that the conclusions are ridiculous; the entire process is ridiculous. There were leaps in logic and ideological conclusions unsupported by the data given. There were clearly unethical practices described.

You are circling the wagons. But there is no question that the Dog Park article was absurd on every level. The conclusion was absurd, the research was absurd, and the results didn't even match the purported data.

The only reason that these articles were accepted was that they used the correct ideological templates and reached the proper ideological conclusions. Read some of the articles (and they also show the reviewers' notes, which are also funny).

The original articles, and reviewer responses (rather humorous), are all archived here... https://drive.google.com/drive/folders/19tBy_fVlYIHTxxjuVMFxh4pqLHM_en18

The peer review in these cases cared mainly about ideological results. It failed to screen out clear failures in basic logic and even accepted quite unethical processes (including chaining students as part of a classroom exercise).

Look at the articles for yourself, and then tell me that this wasn't a huge failure in the peer review process.

If peer review can't stop breaches in logic, unsupported conclusions and unethical process, then what is it good for?
0 Replies
 
maxdancona
 
  1  
Reply Tue 16 Oct, 2018 07:57 am
@tsarstepan,
I don't think you took the time to understand the story before you responded, Tsar. I will give you some time to read what actually happened, and then we can discuss it if you want.
tsarstepan
 
  1  
Reply Tue 16 Oct, 2018 08:10 am
@maxdancona,
I'm not going to click on a strange Google Drive link. I don't trust you, and I definitely don't trust whoever runs this goofy project, enough to risk stepping into some kind of malware trap or giving away my Gmail info by clicking through this dull nonsense (you'll need a Google account to access it).

What exactly am I getting wrong in the first place? The whole project was formed as a gotcha trap against feminists and the anti-rape-culture movement. Conceded. It caught the stupidest and most ironically unenlightened members of that movement. They're not worthy of being considered low-hanging fruit. They're the bane of actual feminists and progressives, as they're as harmful as lung cancer in a one-lunged sprinter.

Next time, drop a legitimate website link. Oh... wait! No legitimate publication or academic site would publish this reductive drivel.

You just couldn't drop the Wall Street Journal article? Or does it actually not coincide with the points you're trying to make?
maxdancona
 
  1  
Reply Tue 16 Oct, 2018 08:27 am
@tsarstepan,
If someone successfully smuggles weapons onto an airplane as a way to show that the TSA is failing to ensure security... you might be angry at them for breaking the rules, and they certainly are playing "gotcha" with the TSA. You might feel it was justified if they got arrested. But that doesn't change the fact that the TSA failed in an important way. If someone does this, the TSA should think long and hard about how they failed and what they can do to address it.

The peer review process in these prominent journals failed to detect the humorously shoddy work, ridiculous conclusions, unethical practices and failures in basic logic in these papers. The hoaxers set out to prove that all that mattered was flattering the ideological biases of this academic community... and they succeeded.

Circling the wagons to ignore these problems in what is supposed to be an intellectual community is counterproductive.

An academic community that attacks critics rather than responding intelligently and changing when appropriate has a problem.
tsarstepan
 
  1  
Reply Tue 16 Oct, 2018 08:41 am
@maxdancona,
The TSA is a governmental organization. If and when they fail, people could literally die, and die very horrible deaths.

Randomly cherry-picking the stupidest and most easily triggered few from one group? Yeah, they failed the easiest of tests by not using their collective brain cells. So? How many people died (or could possibly die) from this endeavor? Oh wait!

This is a very idiotic comparison, as the two situations literally don't match up. Not even by a nanometer.

Quote:
The peer review process in these prominent journals failed to detect the humorously shoddy work, ridiculous conclusions, unethical practices and failures in basic logic in these papers. The hoaxers set out to prove that all that mattered was flattering the ideological biases of this academic community... and they succeeded.

Obviously conceded by almost everyone involved. They failed and failed miserably. And I do get the dark humor in dwelling on that failure. Yet, that proves none of your really dangerous overreaching conclusions. I'm pretty sure that even you realize this. Talk about stubborn and immovable.
maxdancona
 
  1  
Reply Tue 16 Oct, 2018 10:34 am
@tsarstepan,
Quote:
Obviously conceded by almost everyone involved. They failed and failed miserably. And I do get the dark humor in dwelling on that failure.


Thank you, Tsar. I don't know what conclusions you think I am reaching, but I will accept what agreement I can get.

Unrelated... but amusing to me... I recently heard a gender studies professor on NPR attempt to explain why there aren't more women in STEM fields (it made me chuckle).
0 Replies
 
 
