Thanks very much, Fred, for the kind introduction. I also want to thank Dennis, who couldn't be here today but who extended the invitation and worked with me over the last several months to tell me a little more about your program and to shape what I hope will be an interesting and important subject for you to think about. It's a real pleasure to be here. What I'm going to give today is what I think of as a tour of scientific publishing. Many of you, when you travel to a new city, or maybe when you moved here as students or faculty, took a tour of the city to learn a bit more about it. My wife and I go to places like London and see those double-decker buses; in our hometown of New York we see a lot of double-decker buses too, and with all due respect to those of you who may be on one someday, we try to avoid them. We're the cynical New Yorkers. This isn't that sort of highlights tour. This is a bit more like showing up in London for a Jack the Ripper tour that starts at midnight: you're not quite sure where to go, and when you get there, you're not quite sure you want to be there anymore. But hopefully you'll at least pick up a few things. That's really my M.O. here today. So let me start by asking a question about a paper that was published, which some of you may even have seen. The question is: what is publishing today? This is a paper that came out in a journal called the Journal of Computer Science and Information Technology. That sounds reasonable; those are two things that go together. And it has all of the trappings of a journal. It's got an ISSN, which means the Library of Congress is at least aware of it; a DOI, the digital object identifier; all of that. It looks like a journal.
Now, with all due respect to computer scientists, it's not a pretty-looking journal; it doesn't have a very attractive home page. But that's OK, it presents the information, so far so good. Then I want to draw your attention, and I can see some smiles, so I think some of you have already seen this, to the author list. The paper is titled "Robots No Longer Considered Harmful." Those of us who don't study robots, who aren't engineers, might say, OK, maybe that's a pressing concern in the study of robots. But then you get to the author list. I've only met a few of you, so I don't presume to have the comfort level with you to speak these names aloud, but I will, with apologies; they're better if you read them aloud. Professor I. P. Freely is a very well-known urologist, it turns out; I think I ran into him in the men's room a moment ago. Of course, he often works with Oliver Clothesoff and their French colleague Jacques Strap, who is also very well known. Some of you can obviously tell these are fake names; they're actually all from The Simpsons. So how did a bunch of Simpsons characters end up publishing a paper in the august Journal of Computer Science and Information Technology? Well, the answer, and there are some lessons in this answer, is a program some of you may have heard of. You can raise your hands if you have, or if you don't want to admit it. It's called SCIgen. You can Google it; I never remember exactly what the URL is.
But Google SCIgen and MIT and you'll find it. A couple of grad students at MIT, about ten years ago, came up with this great program, and you can go try it yourselves. Adam Marcus, who co-founded Retraction Watch with me, and I have done this just to have a little giggle periodically when things slow down. You put in any authors you want and out comes a paper. It creates a PDF, and it has graphics and figures in it that, if you don't look too hard, look like real figures; then you realize there's a point to it, like a snake eating its own tail, arrows that point to nowhere or back to the other end of the line. It doesn't mean anything. This was an attempt to mechanize, to industrialize if you will, something called the Sokal hoax. Alan Sokal at NYU, who was not a colleague of mine at the time, because I joined the faculty long after, got very tired of reading what he thought was fashionable nonsense in the literature, in this case the humanities literature, the world of postmodernist literature. So he wrote a paper in that style which was completely made up, complete gibberish, just lots of words strung together into sentences that didn't really make any sense, and he got it accepted in a very august journal. He actually published a book based on the whole experience; it's worth the read. SCIgen is a way to mechanize that. We had the Industrial Revolution so you didn't have to sew buttons anymore; a machine did all that. That's what this does. But it also speaks to something else, because this is a journal, and one would hope that if this paper had actually gone through peer review, somebody would have caught it.
Somebody might have noticed, even with the authors' names blacked out, that the rest of the paper was complete gibberish. Nobody did. And there's probably a reason for that, which is that this particular journal doesn't do peer review. It claims to do peer review. What it really does is take your check for whatever the fee is, run the paper through something they call peer review, and then publish it, to make a profit. Those are known as predatory publishers. The reason predatory publishers have a market, the reason you see studies like this published, and this could be a whole other lecture, is that people have this urge, this need, to publish; otherwise they perish. That's the publish-or-perish problem, and there are lots of ways the system takes advantage of it, most of them bad. Predatory publishing is one example. If you're interested in this subject, it turns out the FTC, the U.S. Federal Trade Commission, has decided to sue one particular predatory publisher, one of the largest ones. They sued last Friday for, basically, making false claims and false advertising, which is a big deal regardless of where the suit goes; it's a shot across the bow of these publishers. So that's a hopefully mildly fun story, not fun for everyone involved, but one to start with. Now let me get a little more serious, if I can, about this whole publish-or-perish problem and tell you another story. This is a piece that Cat Ferguson, who was our first staff writer at Retraction Watch, Adam, and I wrote. It was published not quite two years ago now in Nature, as a feature piece, not a peer-reviewed manuscript. I want to tell you the anecdote it leads with.
It's about someone named Hyung-In Moon. Moon was, and actually still is, a researcher in South Korea, studying medicinal plants, or potential medicinal uses of plants. He knew he needed to get papers published, otherwise he would not get grants, would not get promoted, all of those things. So he wanted to make sure his papers got published; it turned out he had twenty-eight, not about twenty-eight but exactly twenty-eight, papers he wanted to see in print. What did he do? He chose a publisher called Informa Healthcare. Not a huge publisher, but a novel-sized publisher with a bunch of journals. He chose three different journals from that publisher, and the reason he chose them was that he knew they would ask him, when he submitted a paper, to suggest potential reviewers. That's going to be important, so remember it. When they asked for potential reviewers, he would say, well, Dennis Hess would be a great reviewer for this paper, and on the entry form on the site he would enter Dennis Hess, Georgia Tech, and all that. The editor of the journal would look at it and say, ah yes, Dennis gave a great talk at that last plant-medicines meeting, wonderful job, clearly he should review this, a great choice of reviewer, very smart guy, let's have him review it. So the editor clicked invite, and out went a peer-review request, an invitation to Dennis Hess. But here's the catch, and here's why you're listening to me tell you about Moon instead of reading his wonderful papers and citing them: the person getting those invitations was not Dennis Hess. Moon, clever person that he is, instead of giving Dennis Hess's actual Georgia Tech address, would enter something like dennishess123@gmail.com. And who do you think actually had that email address? It was Hyung-In Moon. So Moon would get an invitation to peer review this wonderful paper, by him. He managed to do this twenty-eight different times. All of the papers, you will not be surprised to learn, were accepted. They were good peer reviews too, and by that I mean not just positive reviews but very thoughtful ones, because who knows the flaws in Hyung-In Moon's work better than Hyung-In Moon, right? So why are you hearing about him now? They were all retracted, twenty-eight of them at once. Why was he the lead of our story, and how did he get caught? Because you're all thinking, and I trained a little bit in psychiatry, so I can read minds, clearly you're all thinking, I wish I had thought of that. One or more of you is actually creating a fake email address right now. Go ahead, but it won't work. Number one, because you're hearing about it, we've reported on it, and journals are doing something to fix that loophole, that vulnerability. Number two, the reason he got caught was that all of the reviews came back, from invitation to full peer review, within twenty-four hours. Those of you who have peer reviewed or been editors are laughing, because you know it's impossible to get people to even respond to a peer-review request in twenty-four hours, let alone do the whole review in twenty-four hours, and a good one at that. It's one thing to respond "yeah, sure, accept, great paper"; that's a lousy peer review. So that's how he got caught. Now a little pop quiz, since many of you have students: I told you there were twenty-eight papers retracted by Hyung-In Moon, the perfect peer reviewer, or on his behalf, I should say; he didn't do it voluntarily.
That was August 2012, almost exactly four years ago. How many papers do you think have been retracted since then for that same reason? Just a show of hands; I want to see how cynical this crowd is. It's going to be at least twenty-eight, right? So if you guessed twenty-eight, you were listening, but guess higher. I heard a hundred. A hundred thousand? I admire the try, but even that is way beyond anything we could accomplish; a hundred thousand is a lot. I heard four hundred, a thousand and one, two thousand, two thousand five hundred. The actual number, the current figure we keep, is three hundred nineteen. So more than three hundred papers have been retracted, and those are the ones that got caught, though it's probably most of them at this point. More than three hundred papers retracted over the past four years because of vulnerabilities in the peer-review system, and because somebody thought he was being very clever but of course eventually got caught. So that's a little bit of a fun introduction. Now a little bit of statistics to put some context around those numbers; again, three hundred-plus retractions for that particular reason. These charts are somewhat difficult to read, I apologize, but the trend line is very clear. This is by year, and the most important label over here is retraction notices per one hundred thousand papers; the other line is just the number of publications over that time. It's a pretty clear trend line: it's going up. Apologies that it seems to drop off suddenly in 2015, as if there were no more retractions happening.
It's just that the person who maintained this chart, which was a wonderful thing he did, moved on this year to things that I guess were more interesting to him, so he doesn't refresh it anymore. But I can assure you the line kept going up. Let me give you a couple of data points to give you a sense of what's happening. In the year 2000 there were about forty retractions. In the year 2010 there were about four hundred. So in that ten-year span, retractions went up tenfold. Now you'll say, well, the number of papers went up too; there are more journals, more people publishing. That's true, but the number of papers went up nowhere near tenfold over that period, which is why the per-hundred-thousand rate in the chart shows the same climb. And it kept going up: in fiscal year 2015, which is how the National Library of Medicine counts these things, there were six hundred eighty-four, just shy of seven hundred. So the numbers continue to increase. People often ask why that's happening. It's clearly an increase; why? I'm going to get into a longer, more descriptive answer, but at the end of the day this is a screening issue. There are simply more people looking for problematic images, plagiarism, fake data, all sorts of things. It's certainly possible that there's actually more fraud happening, and two-thirds of retractions are due to fraud or misconduct, I'll get to that in a moment. But we don't have any evidence that there's more fraud happening; we just have evidence that we're better at catching it. It's the same way autism rates have gone up: they probably haven't actually gone up; it's just that we're catching more of it. We're looking for it. We're screening for it.
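To make the normalization behind that chart concrete, here's a small sketch. Only the rough retraction counts (about 40 in 2000, about 400 in 2010) come from the talk; the publication totals are made-up placeholders, since the real volumes aren't quoted here.

```python
# Toy illustration of why retraction *rates* matter, not just raw counts.
# Retraction counts (~40 in 2000, ~400 in 2010) are the rough figures
# quoted above; the publication totals are hypothetical placeholders.

def rate_per_100k(retractions, papers_published):
    """Retraction notices per 100,000 papers published."""
    return retractions / papers_published * 100_000

retractions = {2000: 40, 2010: 400}
papers = {2000: 1_000_000, 2010: 1_400_000}  # hypothetical volumes

for year in (2000, 2010):
    print(year, round(rate_per_100k(retractions[year], papers[year]), 1))

# With output growing ~1.4x but retractions growing ~10x, the rate
# itself rises roughly sevenfold under these assumed volumes.
```

The point of dividing by papers published is exactly the speaker's: more papers alone can't explain a tenfold jump in retractions, so the rate climbs too.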
Certain cancer rates have gone up, again because we're actually looking; the underlying rate of cancer may not have increased at all. That's probably what we're seeing here. But let me talk about some of the more detailed reasons for retractions. First, duplication. Now, you can't actually plagiarize yourself; by definition, plagiarism means stealing somebody else's work. But duplication is often, inelegantly, referred to as self-plagiarism, and what it means is that you publish something somewhere and then try to publish it somewhere else. Sometimes there's a legitimate reason for that, and that's OK; but if it's getting retracted, there wasn't a legitimate reason. For example, it used to be that a lot of authors in Germany would publish in German and then want to publish their work again in an English-language journal to get a larger audience, and as long as everybody knew, as long as it said "previously published in" whatever, that's fine. The problem, of course, is when you don't disclose it, and that's considered a problem. Duplication is responsible for about fifteen percent of retractions. Plagiarism, obviously not a good thing to do, to put it mildly, is responsible for about ten percent of retractions, give or take. I put those two together because it's important to remember that, the same way there are more people looking for fake data and problems in images, and they can do that because most papers, if not all papers, are online, here you have, if you will, robots looking for problems: plagiarism-detection software. iThenticate is, I believe, the most commonly used. Together these two problems, the ones that plagiarism-detection software would find, are responsible for about a quarter of retractions.
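Commercial plagiarism-detection tools are far more sophisticated than anything shown here, and I'm not claiming this is how any particular product works. But the core idea, comparing overlapping word n-grams between two texts, can be sketched in a few lines:

```python
# Minimal sketch of the idea behind plagiarism-detection software:
# compare overlapping word n-grams between two texts. Real tools are
# far more sophisticated; this is only an illustration of the concept.

def ngrams(text, n=3):
    """Set of overlapping lowercase word n-grams in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a, b, n=3):
    """Jaccard similarity (0.0 to 1.0) of the two texts' n-gram sets."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    if not ga or not gb:
        return 0.0
    return len(ga & gb) / len(ga | gb)
```

Identical passages score 1.0, and lightly reworded text still scores high, which is why duplicated and plagiarized passages are exactly the problems software finds first.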
People are looking. It's worth remembering, with image manipulation, and in the fields you study and work in this would certainly be an issue, that a lot of what we see is in the basic life sciences. Western blots; some of you may have run Western blots in the past. You're trying to look at a protein, to see how big it is as you run an electric current through it. That's an oversimplification, but basically you get a mostly black figure with some white bands in it, and people cut and paste those bands to make them look like they're in different places. Or they didn't really run the control, so they just take a control band from the last experiment because they think it's probably the same, and they paste it in. None of this is fine, to put it mildly. You can do all that manipulation using Photoshop; you also end up finding it using Photoshop. Then there's faked data. One of the things that struck us about the Hyung-In Moon case was that when we sent those studies to people in the field who actually knew something about it, because we're not experts in this field, or honestly in basically anything, they all said, you know, these are pretty solid papers. They're not Nature-level papers, no one's going to patent anything in them, but they're solid and they could be published. So they didn't seem fake. On the other hand, and I'll get to some specific stories about fake data, some people just make it all up, completely. And if you're thinking about doing that, which I really discourage, I'll show you in a minute some of the ways you will probably get caught. Hopefully that will discourage you more than my just saying, hey, don't do that.
But people are faking data, and I'll show you a list of people who've done it. We've talked about fake peer reviews. Then there's publisher error; we had a very interesting conversation over lunch with a number of the editors here on the faculty about this. Publisher error is not a particularly important reason numerically, but here's the odd part: publishers don't seem to have a way, when they publish a paper twice by accident, sometimes in the same issue, which boggles our minds a little, to say to the world, hey, we made a mistake, just ignore that, other than retracting it. Retraction is somehow the only mechanism. So think about that for a second. You submit a paper, it goes through peer review, it's published, it may be a good paper. And then, because the publisher has to retract one version of it, when someone runs one of those automated CV checks, pulling your record from one of the indices or from Google Scholar or whatever, you have a retraction on your record. It has nothing to do with you; it's not your fault. And yet that's the only way they can deal with it. Not a huge issue, but it's annoying. Then there are authorship issues, and these are a big headache. We have seen all kinds of authorship problems, and roughly speaking they fall into a couple of categories. There are authors who shouldn't be on papers, honorary or guest authors. There are authors who aren't on papers but should be: why did you leave me off? We hear that a lot from graduate students who've left labs; I should have been on that paper, I did all the work, what happened? And then we see some fun stories.
There's one particular case, and we're still not quite sure why he did this, where a researcher made up an author and added him to more than a dozen papers. The invented co-author was named Javier Grande; if you speak Spanish, that's one way of saying "Mr. Big," which seemed to be why he added the name: it sounded impressive, and he gave him a nice affiliation too. He got all these papers published, and then it turned out he was also faking data, so that was a problem. But there are some more fun ones. Does anyone speak Italian? No one? Well, then I'll have to translate. There was a physicist, I think at Lawrence Livermore National Laboratory, who was very frustrated because he did good work, at least he thought so, but the editors and peer reviewers at a couple of journals didn't agree, and they rejected his papers. So he said, I'm convinced this is a legitimate finding and it should be published. And he submitted the same exact paper, updated a little because a year had passed, with the same author list, except he added one author, and he gave this author a very prestigious affiliation, a very prestigious department in Italy. He figured somebody would read it and say, well, it's from this department, it must be good. But he used what a poker player might call a tell; he wanted people to see this. The name of the author he added was Stronzo Bestiale. I can tell none of you speak Italian, because that would either horrify you or make you laugh, depending on your mood. With apologies for any virgin ears, it basically means giant ****.
And so he had three different papers published in very prestigious physics journals with the co-author Giant ****. Then there's another fun one. Polly Matzinger is a very prominent immunologist. She wasn't so much upset about getting papers published; she was very upset about the use of the passive voice in scientific papers, which is a whole other subject, very interesting, and one dear to my heart as an editor. So here's how she protested, and I don't really know the connection, but she added her dog's name to a paper: Galadriel Mirkwood, which sounds like something out of the Knights of the Round Table or something, but that was her dog; apparently he's no longer with us. Then there are retractions for legal reasons. This is something you may come across: if you're working with particular agents or particular materials, maybe with an industry partner, they may have given you very specific rights to publish or not to publish, and if you violate those, a retraction can happen. But there are also cases where people simply threaten to sue: I don't like your finding. All of that does make Adam and me laugh, though, because when a publisher says a paper was retracted due to legal reasons, that sort of suggests all the other retractions were for illegal reasons. But that's what it means. And then there's "not reproducible," which is actually a bit of a headache for a lot of people. I mean it's a headache in a good way: people are starting to think about the reproducibility problem, and in many fields, not every field, and not every field has really looked at this, but there was certainly a big paper in basic cancer biology, another in psychology, a number of fields have looked at how many findings in key areas are reproducible.
It turns out that in many of those fields, roughly speaking, only about fifty percent of findings are reproducible, and that figure is unfortunately consistent. But irreproducibility isn't really a common reason to retract papers. There are guidelines for this, which I'll get to in a moment, that say retraction should be reserved for serious error, including serious honest error, and of course serious fraud, that invalidates the conclusions. Something not being reproducible, where there wasn't an error, it just didn't work this time, or it worked once and then didn't the next time, isn't necessarily a reason for retraction. We have seen some cases of that too, but it's an ongoing discussion in science about whether that should happen. So I mentioned that two-thirds of retractions are due to misconduct, and I wanted to give you a reference for that. This is a paper that came out four years ago now in PNAS. I mention it for a couple of reasons. One is that if you're interested in these issues, I would urge you to follow, maybe set up an alert on, the first and last authors of this paper, Ferric Fang and Arturo Casadevall. They're both microbiologists and journal editors, and Ferric is on the board of directors of the Center for Scientific Integrity, our parent organization. What they did was look at the reasons for retraction, obviously, that's what the headline says, and they found that two-thirds were due to misconduct, as I mentioned. That was actually very different from what Grant Steen had found about two years earlier, when he looked at reasons for retraction and found fewer than half were due to misconduct. The reason for the difference was not that anything had changed; it was that Fang and Casadevall went beyond just the retraction notices, because retraction notices are often unclear or opaque, and sometimes they're misleading.
Some notices say absolutely nothing at all; I'll get to that in a moment. So Fang and Casadevall also looked at Retraction Watch, we'd been around for a few years by then, and at other media reports. And that's really a main reason we launched Retraction Watch in 2010: retraction notices, and I'll get to some hopefully fun examples in a moment, are often opaque and don't really tell the whole story, or even most of the story. Now, we were journalists at Retraction Watch, and we like our leaderboards, styled a little bit after ESPN or any kind of score standings. This is the top ten on our retraction leaderboard; we actually track it down to number thirty, which I'll mention more in a second, along with some other information. So, how many of you, and I know many of you look like grad students or postdocs, so you certainly wouldn't be in this category yet, though you can aspire to it, but among the faculty, and I think a lot of them are sitting over here, how many of you have published one hundred eighty-three papers or more? OK, so there's hope: anyone who raised a hand could one day top the leaderboard. We can talk later; I accept bribes, or whatever you like. But there are a couple of serious things to notice about this list. One is that the top count is an impressive number: one hundred eighty-three retractions. I'm going to tell you briefly, in a few minutes, the story of how he got caught and why that happened. Numbers one and two on the list are Yoshitaka Fujii and Joachim Boldt.
Boldt was number one until Fujii came along; we've been trying to contact him to get back the certificate, the very nice award we gave him, but he doesn't seem to want to return it. He's very upset about the whole thing. They're both anesthesiology researchers, or were, I should say. Now, you might think I'd say, I hope none of you is having surgery anytime soon that requires anesthesia. I'd actually say kind of the opposite. Not that I hope any of you needs surgery for anything, but seeing retractions in anesthesiology means I trust it more than a lot of other fields; it's the fields where I don't see any retractions that make me wonder whether anyone is actually paying attention. And I'll tell you that story in a second. There are a couple of other things about this list. It's hard to tell, because obviously different names in different cultures and languages follow different naming conventions, but you may notice that all of the people in the top ten are men. And if you go all the way down to number thirty, which is just where we arbitrarily stopped, there are only two women on the list, and they're both near the bottom. Now, I'm really very into equality, equal pay and all the rest, so I want equal retractions, and I just think it's really time for men to stop dominating this field so much. Leave some space for other people; let's promote some women here. Of course there are some reasons for that disparity, but Fang and Casadevall did a study where they took into account the things you would want to take into account, the fact that men are still overrepresented in many scientific fields, certainly in senior positions. Even when you account for that, in terms of authorship, men are nine times more likely to retract papers for fraud.
So congratulations to those of us with a Y chromosome. As men, I'm not sure if we're better at committing fraud or at getting caught, and I'm not sure which one I'd prefer, but there you go. There are some other things on this list, from different fields. I think many of you are familiar with the Schön story; he's down at number nine. You'd think his total would be quite impressive, but he's barely holding onto the top ten. What's interesting, though, is that these ten alone account for a significant percentage of the total number of retractions, which is probably somewhere around seven thousand. And that suggests these are outliers. Most people never retract a paper, and among those who do, one is probably the median number. The other thing that interests people is which journals retractions appear in. So we have who retracts, and we have which journals they retract in. I tried to tease this at lunch today, but I think I gave away most of the answers. This is another paper by Fang and Casadevall. What you're looking at is a plot of impact factor, and we're all familiar with impact factor, of course, against what they call the retraction index, which is just a rate of retraction: the number of retractions per thousand papers published. The relationship is not perfectly linear, obviously, and you can look at the paper to see what the statistics are like, but it's there. The New England Journal of Medicine has the highest impact factor in the world.
It also has the highest retraction index in the world. Nature, Cell, and Science, at least two of which are probably closer to the journals you might publish in at some point in your own fields, are pretty high up; the Lancet sort of hangs out over here. We're not sure exactly why, but they've certainly published their share of clunkers over the years. Then there are people who really hate the impact factor, and I can understand why, and who hate the hegemony that a lot of these journals have, and they say they want to disrupt that; I understand that too. And they say: see, that's proof that these journals are publishing more fraudulent science, more retractable science. I have to say, to me it's still a screening effect. There are more eyeballs on these papers; more people are citing them, which you would hope means they're reading them, though I'll get to why that may not be the case in a second. The point being, again, more eyeballs, so people are finding more things. The New England Journal of Medicine doesn't exactly look at this with pride. Some of the other journals are a little more willing to, but I would argue they should be proud of having more retractions; of course I would say that, because I run Retraction Watch. This next finding, though, was a little more concerning. This was a study, actually a replication of an earlier study showing the same thing, by a number of librarians who looked at what actually happens when retracted papers are cited: are they cited in support of ideas, in support of other papers, or are they cited with a note saying, "by the way, this has been retracted, so don't rely on it anymore"? So what would happen if, and some of you may know lawyers, or may have been lawyers in your previous careers,
you showed up to a courtroom and tried to argue a case based on a precedent that had been overturned by a higher court? Nothing good would happen to you. Losing the case might be the easiest thing that could happen; if the judge realized you knew the precedent had been overturned, you would probably face sanctions from the local bar association. Not so in science. What actually happens in science is that most of the time these papers are cited, more than ninety percent of the time in fact, they're cited as if the original paper had never been retracted. That's kind of a problem, to be a little light about it; it's a big problem, actually. And it's one of the problems that Adam and I and our staff are trying to solve. We're creating a database of retractions, which is what a lot of our grant funding is for, so that any time a paper is retracted or corrected, even if you cited it in the distant past, you'll get a little ping, ideally integrated into Mendeley or whatever you use as your personal library database of papers. You may still choose to cite it, and that may be a reasonable thing to do, but you'll keep in mind that it's been retracted. And again, this is probably a little difficult to read, but this is another one of our leaderboards, and what we look at here is the most widely cited retracted papers. We have ten of them on the list, and we update it relatively often. Number one is a paper in Science from eleven years ago. It's been cited a thousand times overall; what's interesting is that three quarters of those citations came after the retraction, and it turns out a lot of them are actually saying "this paper was retracted, but there's some other valid stuff in it." It was a complicated paper, with a lot of different findings in it.
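The check that such a database enables can be sketched very simply: compare the DOIs in your reference list against a list of retracted DOIs. This is a hedged illustration only; the DOIs below are fabricated placeholders, and the real database carries far more than a bare DOI list (reasons, dates, notices):

```python
# Minimal sketch of checking a reference list against a retraction
# list. All DOIs here are fake placeholders for illustration.
def flag_retracted(references, retracted_dois):
    """Return the subset of reference DOIs that appear in the retraction list."""
    retracted = {doi.strip().lower() for doi in retracted_dois}
    return [doi for doi in references if doi.strip().lower() in retracted]

retracted_list = ["10.1000/fake.retracted.1", "10.1000/fake.retracted.2"]
my_refs = ["10.1000/fake.ok.7", "10.1000/fake.retracted.2"]
print(flag_retracted(my_refs, retracted_list))
```

A reference manager running this kind of lookup on every saved paper is the "little ping in your ear" idea: the citation isn't forbidden, but you cite knowing the paper's status.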
So that's actually OK. Number two, in contrast, is not so OK. I think a lot of you recognize it despite all the jargon: this was the paper from the Lancet in 1998 that claimed a link between autism and vaccines. It took the Lancet twelve years to retract that paper. In that time it was cited six hundred seventy-five times; since then it's been cited three hundred times, again mostly to say "this paper has been retracted," to be fair, but keep it in mind. This continues to be a problem. And part of the reason it's a problem is that journals are not very good at getting the word out, and I think this speaks to ethics, because if you make a mistake, or somebody in your pages made a mistake, the ethical thing to do is to let people know. And often they don't. About a third of the time, when you did everything you could to find out whether a paper had been retracted, you went and searched the journal's site, searched PubMed, searched other indices, there was no mention of the retraction. So you almost can't blame somebody who came across the paper, said "this is interesting," and cited it, when it had actually been retracted. That's a problem. So I'm going to switch gears a little and talk about the notices themselves. This may get a bit nitty-gritty and inside baseball, and I apologize for that, but hopefully you'll find it useful. I want to start in a lighthearted way, and make the case that retraction notices are not as clear as they could be: they don't give you as much information as they could, and they hide information. So I'm going to run through some euphemisms. These are all for one particular behavior that leads to retraction, one a fair number of you know about.
But instead of using that word, which describes the behavior in one word, the editors chose other words. These are all taken from real retraction notices. This first one was described as "an approach to writing," and Adam wrote a post about this a couple of years ago in which he said that this is an approach to writing the way showing up to a bank with a gun is an approach to banking. So you probably have some sense of what this actually was. This next paper had "a significant originality issue"; that's why they retracted it. And that's not some Freudian psychoanalysis thing; that's actually what they wrote. This one was retracted because the authors "inadvertently copied text," which may be true, but I will note that the English language does not have a word "advertently." Then there's this one, which does tell you what happened, but when I was learning how to write and how to edit, I was always taught there's no reason to use fifteen words when one will do; it tends to obfuscate things. "Obfuscate" is a fun word, but not one you should use often in writing, because nobody remembers what it means, even though it sort of defines itself. But here: fifteen words where one would do. And then perhaps my favorite one, the one that gives away the answer, of course: "some sentences are directly taken from other papers, which could be viewed as a form of plagiarism." What's left unsaid, of course, is what else it might be viewed as. OK, so those are some light examples, but on a more serious note, this is the kind of notice that two journals in particular, and actually a number of others as well, publish routinely; one of them, as I'll show you in a second, actually stopped doing this, which we thought was a great thing. This is the standard notice from the Journal of Neurosurgery. Now, how many of you — again, I told you earlier that I can read minds.
So I know exactly what they meant here, of course. But how many of you can tell why those papers were retracted? Obviously no one, and I can't either, because there's no information in the notice. We don't think that's particularly useful to anyone reading it, not useful to the scientific community, and not useful to anyone who might want to build on that work, because there are lots of reasons papers are retracted. Honestly, if a paper was plagiarized but the core findings were OK, I'd want to go cite the original. But now there's this weird, almost conspiracy-theory gap in my mind: I don't know why this was retracted or why it happened. And Nature, which had a little spate of retractions a couple of years ago, some fairly serious ones... Now, back to the lawyers. I tend to beat up on them a lot. I will say, though, that half of my family are lawyers; in my family, you went to med school unless you didn't like the sight of blood, in which case you went to law school and then defended the people who did. The point is, I actually like lawyers; they're very helpful. We at Retraction Watch have people threaten us with legal action all the time, because they don't like what we write, because we're telling the truth about them. But lawyers are entering this process, and so instead of answering a misconduct allegation in a forthright way and dealing with it, you hire a lawyer. And everyone's entitled to — well, this is not a criminal proceeding, so you're not technically entitled, but I do think everyone is entitled to a vigorous defense. The problem is that the scientific community gets shortchanged, because you don't get the kind of information you want. I will note, though, that I don't want to be all doom and gloom.
The Journal of Biological Chemistry, which traditionally had done a really poor job on transparency and published one-line notices like the Journal of Neurosurgery's, has said: you know what, everyone's right, we publish lousy retraction notices. I've bolded — that's my bolding at the bottom there — the part where they say they're actually going to explain what happened, and they've been true to their word, which we think is a good development. There are guidelines for these things, if you're curious. If you are a journal editor and don't know about COPE, which I would doubt, or if you're interested in becoming a journal editor or getting involved in scientific publishing, the Committee on Publication Ethics has lots and lots of useful information, and one particular set of guidelines about retraction notices; I would urge you to take a look at it. But a lot of the reason I mention that there's a screening effect happening, with people finding more things, is the ability people now have to compare papers on their laptops, or blow them up and project them on walls. Journal clubs are no longer just a room you sit in; it used to be that when journal club was over, it was kind of like: the first rule of journal club is nobody talks about journal club once it's over. And that movie is so old now that the joke only works for certain populations; it's Fight Club, by the way, in case anybody is wondering. So there's a site called PubPeer, pubpeer.com, which you may be familiar with. It's much more active in the life sciences, but you can leave a comment on any paper that's been published, as long as it has a digital object identifier, a DOI. Several dozen papers have been retracted or corrected because of comments left there, and the original authors get an email whenever somebody leaves a comment on their paper.
It's pretty powerful, and again, it's like an online journal club, just in public, not one where you're done, you have your sandwich, and then you leave. And then there are individual people doing this kind of work. Ed Yong, who wrote this piece, by the way, just came out with a fantastic book called "I Contain Multitudes"; he's a terrific writer, with a PhD in cancer biology, and he's now full time at the Atlantic. Uri Simonsohn, who's pictured here, did this in psychology. He has a method to figure out how likely particular results are to be real, and by "real" I'm shorthanding: is the variability in those results the same as you would expect to see if they existed in nature? Even that is shorthand, but that's what I'm getting at. He has uncovered all sorts of problems in psychology research, and I think there were six different people who had to retract papers for faked data. So that's happening in psychology. It's also happening, though — you'll remember Yoshitaka Fujii, your friend with one hundred eighty-three retractions. One of the things people often ask me is: how do you get to one hundred eighty-three retractions? You can see it: OK, you make a mistake, or there's actual misconduct, you have to retract one paper; keep an eye on him. That's not so good. Two papers, three papers, maybe even eight or nine papers. If you've published a lot, one hundred eighty-three is still a decent percentage, but OK, people can continue; we should get second chances, third chances, what have you. But was it like: listen, Yoshitaka, we're a little concerned, we're trying to give you enough space to throw the stuff out, one hundred eighty-two retractions is fine, but at number one hundred eighty-three —
we're going to have to ask you to leave. That's not actually how it happened; that would have been even more absurd than the real story. What actually happened was that in 2000, someone wrote a letter to the editor about one of his papers and said: these results are just too beautiful. Now, all of you have spent enough time in academia to know that when someone says your data are "too beautiful," they mean the opposite, right? That they're probably, you know, crap. And that's what the person meant. But the journal editor let Fujii write a letter back saying, "thank you for your concern, but actually we can explain it." Fujii knew, though — and this is sixteen years ago — that the anesthesiologists were onto him. And because he's an anesthesiologist, he can publish in almost any field he wants, right? You're always doing anesthesia on a particular patient population: if you're doing it on children, that would be a pediatrics journal; if you're doing it on women during pregnancy, or right at delivery, that would be an OB-GYN journal. So he started publishing there, because those reviewers wouldn't have been wise to the fact that people in anesthesia had been finding problems with his research. Once he started doing that, and once other people came to the fore, John Carlisle did basically what Uri Simonsohn had done, in a slightly different way: he looked statistically at the likelihood that Fujii's results showed natural variation, and what he found was that the likelihood was one times ten to the minus thirty-third. That's basically zero. I'm not a mathematician, but I'm pretty sure it's closer to zero than any of us will probably ever get to zero, if you will. So obviously he had made up a lot of data, got caught, and had to retract.
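The general statistical idea can be sketched in a few lines. This is a hedged illustration, not Carlisle's actual method or code: under honest randomization, the p-values from baseline comparisons across many independent trials should be roughly uniform between 0 and 1, so a pile-up near 1 (groups implausibly similar, data "too beautiful") can be flagged by combining the values with Fisher's method. All p-values below are hypothetical.

```python
import math

def chi2_sf_even_dof(x, dof):
    """Survival function of a chi-square with even dof (closed form)."""
    k = dof // 2
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= (x / 2) / i   # term = (x/2)^i / i!
        total += term
    return math.exp(-x / 2) * total

def too_similar_pvalue(baseline_pvalues):
    """Fisher-combine (1 - p) values; assumes every p is strictly below 1.
    A tiny result means the trials' groups are implausibly similar."""
    stat = -2 * sum(math.log(1 - p) for p in baseline_pvalues)
    return chi2_sf_even_dof(stat, 2 * len(baseline_pvalues))

# Hypothetical baseline p-values, all clustered near 1:
suspicious = [0.97, 0.99, 0.95, 0.98, 0.99, 0.96]
print(too_similar_pvalue(suspicious))  # very small: "too beautiful"
```

With six moderate, spread-out p-values the combined value is unremarkable; only a consistent cluster near 1 drives it toward zero, which is the pattern at issue in the Fujii case.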
Just to wrap up here, in terms of consequences: in case you've listened to everything I've said and are thinking, well, OK, what are the actual chances of getting caught? They're still relatively small, to be perfectly honest with you, but they're not zero; they're far greater than the likelihood that Fujii's data are real. This was a case of someone who faked results: he spiked rabbit blood samples. He was studying a potential HIV vaccine, using the rabbit as a model, and because he wasn't getting results, he spiked the samples with human blood from a control that had been exposed to HIV, so it had the antibodies you would expect if the vaccine was working. He went to the freezer and spiked all of these samples, and how he got caught is actually very interesting, but I won't take the time now; I'm happy to talk to anybody who's interested. What happened was: he got caught, actually confessed, and then, because he was in Iowa, where Chuck Grassley is senator, Grassley went on a bit of a rampage, as, if you've ever followed him on Twitter, you will know he often does. He wrote letters, did all sorts of things, and the U.S. Attorney ended up prosecuting Dong-Pyou Han, who, having lost his appeal, is now spending close to five years in prison. That's unusual, very unusual actually, but it does happen. But I will leave you with this, which is much happier news, and hopefully it will encourage ethics and thinking about good behavior. And that is that while crime doesn't pay quite as much as it used to, although it still probably pays pretty well — I wouldn't know — doing the right thing, it turns out, actually probably does. Here's what I mean. This is a study, and I'm happy to say it's actually been replicated, at least some of the main results.
It took a look at what happened after a retraction, specifically what happened to citations of that person's other work. Citations are pretty important: they're how you get impact, how you know people are reading your papers; tenure committees look at these sorts of things, impact factor and all that. So the study looked at what happened when papers were retracted for fraud or misconduct, and what happened is what you'd expect. If you hear through the grapevine, or you just see, that someone in your field has retracted a paper, you are probably less likely to cite them, even if some people still continue to cite the paper in the wrong way; other studies back that up. You see a ten to fifteen percent dip in citations to your own work, and actually to your whole subfield, so you're doing something terrible to the pool in addition to just messing with your own work. That's what happens when you're retracted for fraud. But it turns out that when you're retracted for honest error — in other words, you come forward and say, you know what, we made a mistake, we didn't calibrate the instrument right, or we were just using the wrong reagent, which happens; people literally pick the wrong compound off the shelf, and we've seen a couple of cases like that in physics over the years — when you come forward, and the notice is very clear about why you retracted, and you seem to be genuinely regretful, you actually don't see a dip in your citations. There's some evidence, although it's all squishy here, that you even see a bump in your citations, but the important thing is you don't see a dip. So it will not actually hurt you; it shouldn't actually hurt you, according to these studies.
If you retract for honest error, that is. So if there's anything to encourage ethical behavior, and to encourage fessing up about making a mistake, I think that's a pretty good thing to think about. That actually does it for me, hopefully on a positive note. That's just my contact info and acknowledgements, so thank you very much, and I look forward to questions. The question is, what's the average time between publication and retraction? An excellent question. People have studied it, and it looks like it's about three years, thirty-five or thirty-six months. And it's now a little more of a bimodal distribution, so there are two spikes, because we see a lot of retractions that actually happen very quickly. The current record for a retraction, on the other hand, is actually eighty years. It's a fun story, so I'll take thirty seconds. A group of German medical students in the 1920s wanted to have some fun, and so they made up a case report. They claimed to have seen a woman in their clinic who was coughing up a very strange substance, which they claimed to have tested and found was urine. So they had this woman coughing up urine, which is unusual, and they somehow tested and determined that she had a kidney growing in her lung. Why this passed review I'll never know, but it got published in 1923, and eighty years later one of the authors, whether on his deathbed exactly or just in a crisis of conscience late in life, said, you know, actually, no. He said it in German, so I don't know how it sounds, but: maybe not. And so it was retracted. But three years is about the average. That being said, the important figure, which we don't have a good handle on, is how much time it takes from allegation to retraction, and since we don't usually know when the allegation happened, we kind of have to guess; we don't have a date on that.
I will say, just as a general principle, that any trends are difficult to find statistically, because when you're looking at these leaderboards, so to speak, they're outliers, number one, and even so, the whole world of retractions is pretty small, to be honest. We haven't really seen anyone study country of origin carefully. What we have seen, not so much for individuals, and this is changing, is that the reasons for retraction used to differ by country. Although the overall rate, which again is pretty small — you're talking about seven hundred retractions out of two million papers, a very, very small percentage — is about the same in every country, certain parts of the world differed in kind: the developing world had far more plagiarism and duplication, whereas the developed world was much "better" at just outright fakery, or sort of bizarre approaches to this. That's changing, in that, quite frankly, a lot of countries are Westernizing and adopting a lot of our bad practices. I do want to say, just on behalf of the scientific community, thanks for the work you're doing; it really is changing the way people think about this issue.