Fantasy rejects itself

I came across this review when linking to Undead Science on Amazon. It’s old, but there is no other review. I did buy that book, in 2009, from Amazon, used, but never reviewed it, and now Amazon requires that I have spent at least $50 there in the last year before I can review books.

But I can comment on the review, and I will, starting here.


JohnVidale

August 7, 2011

Format: Hardcover|Verified Purchase

I picked up this book on the recommendation of a fellow scientist with good taste in work on the history of science. I’ll update this, should I get further through the book, but halfway through this book is greatly irritating.

The book is a pretty straightforward story by a sociologist of science, something Dr. Vidale is not (he is a professor of seismology). There are many myths, common tropes, about cold fusion, and, since Dr. Vidale likes Gary Taubes (as do I, by the way), perhaps he should learn about information cascades; Taubes has written much about them. He can google “Gary Taubes information cascade.”

An information cascade is a social phenomenon where something comes to be commonly believed without ever having been clearly proven. It happens with scientists as well as with anyone.

The beginning is largely an explanation of how science works theoretically.

It is not. Sociologists of science study how science actually works, not the theory.

The thesis seems to be that science traditionally is thought of as either alive or dead, depending on whether the issues investigated are uncertain or already decided.

Is that a “thesis” or an observation? It becomes very clear in this review that the reviewer thinks “cold fusion” is dead. As with many such opinions, it’s quite likely he has no idea what he is talking about. What is “cold fusion”?

It was a popular name given to an anomalous heat effect, based on ideas about its source, but the scientists who discovered the effect, because they could not explain the heat with chemistry — and they were expert chemists, leaders in their field — called it an “unknown nuclear reaction.” They had not been looking for a source of energy. They were actually testing the Born-Oppenheimer approximation, and thought that the approximation was probably good enough that they would find nothing. And then their experiment melted down.

A third category of “undead” is proposed, in which some scientists think the topic is alive and others think it is dead, and this category has a life of its own. Later, this theme evolves to argue the undead topic of cold fusion still alive, or was long after declared dead.

That is, more or less, the basis for the book. The field is now known by the more neutral term “Condensed Matter Nuclear Science,” sometimes “Low Energy Nuclear Reactions,” and the heat effect is simply called the Anomalous Heat Effect by some. I still use “cold fusion” because the evidence has become overwhelming that the nuclear reaction, whatever it is, is producing helium from deuterium, which is fusion in effect if not in mechanism. The mechanism is still unknown. It is obviously not what was thought of as “fusion” when the AHE was discovered.

The beginning and the last chapter may be of interest to those who seek to categorize varieties in the study of the history of science, but such pigeonholing is of much less value to me than revealing case studies of work well done and poorly done.

That’s Gary Taubes’ professional theme. However, it also can be superficial. There is a fine study by Henry H. Bauer (2002), ‘Pathological Science’ is not Scientific Misconduct (nor is it pathological).

One argument I’m not buying so far is the claim that what killed cold fusion is the consensus among most scientists that it was nonsense, rather than the fact that cold fusion is nonsense.

If not “consensus among most scientists,” how then would it be determined that a field is outside the mainstream? And is “nonsense” a “fact”? Can you weigh it?

There is a large body of experimental evidence, and then there are conclusions drawn from the evidence, and ideas about the evidence and the conclusions. Where does observed fact become “nonsense”?

“Nonsense” is something we say when what is being stated makes no sense to us. It’s highly subjective.

Notice that the reviewer appears to believe that “cold fusion” is “nonsense,” but shows no sign of knowing what this thing is, what exactly is reported and claimed.

No, the reviewer seems to believe “cold fusion is nonsense” as a fact of nature, as a reality, not merely a personal reaction.

More to the point, where and when was the decision made that “cold fusion is dead”? The U.S. Department of Energy held two reviews of the field. The first was in 1989, rushed, and concluded before replications began appearing. Another review was held in 2004. Did these reviews declare that cold fusion was dead?

No. In fact, both recommended further research. One does not recommend further research for a dead field. In 2004, that recommendation was unanimous among the 18-member panel of experts.

This is to me a case study in which many open-minded people looked at a claim and shredded it.

According to Dr. Vidale. Yes, there was very strong criticism, even “vituperation,” in the words of one skeptic. However, the field is very much alive, and publication in mainstream journals has continued (increasing after a nadir in about 2005). Research is being funded. Governmental interest never disappeared, but it is a very difficult field.

There is little difference here between the truth and the scientists consensus about the truth.

What consensus, I must ask? The closest we have to a formal consensus would be the 2004 review, and what it concluded is far from the position Dr. Vidale is asserting. He imagines his view is “mainstream,” but that is simply the effect of an information cascade. Yes, many scientists think as he thinks, still. In other words, scientists can be ignorant of what is happening outside their own fields. But it is not a “consensus,” and never was. It was merely a widespread and very strong opinion, but that opinion was rejecting an idea about the Heat Effect, not the effect itself.

To the extent, though, that they were rejecting experimental evidence, they were engaged in cargo cult science, or scientism, a belief system. Not the scientific method.

The sociological understructure in the book seems to impede rather than aid understanding.

It seems that way to Dr. Vidale because he’s clueless about the reality of cold fusion research.

Specifically, there seems an underlying assumption that claims of excess heat without by-products of fusion reactions are a plausible interpretation, whose investigations deserved funding, but were denied by the closed club of established scientists.

There was a claim of anomalous heat, yes. It was an error for Pons and Fleischmann to claim that it was a nuclear reaction, and to mention “fusion,” based on the evidence they had, which was only circumstantial.

The reaction is definitely not what comes to mind when that word is used.

But . . . a fusion product, helium, was eventually identified (Miles, 1991), correlated with heat, and that correlation has been confirmed by over a dozen research groups. Confirmation and measurement of the ratio with increased precision is under way at Texas Tech, very well funded, as that deserves. Extant measurements of the heat/helium ratio are within experimental error of the theoretical value for deuterium fusion to helium.

(That does not show that the reaction is “d-d fusion,” because any reaction that starts with deuterium and ends with helium, no matter how this is catalyzed, must show that ratio.)
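(For reference, here is the arithmetic behind that theoretical value, from standard atomic masses; this is my own illustrative check, in Python, not something taken from Simon, Vidale, or the papers cited. Any process that begins with deuterium and ends with helium-4 must release about this much energy per helium atom.)

m_D   = 2.014101778    # atomic mass of deuterium, in u (standard value)
m_He4 = 4.002603254    # atomic mass of helium-4, in u
u_to_MeV = 931.494     # energy equivalent of one atomic mass unit, in MeV
Q = (2 * m_D - m_He4) * u_to_MeV
print(round(Q, 1))     # ~23.8 MeV released per helium-4 produced, whatever the mechanism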

That Dr. Vidale believes that no nuclear product was identified simply shows that he’s reacting to what amounts to gossip, rumor, or an information cascade. (Other products have been found; there is strong evidence for tritium, but the levels are very low, and it is the helium that accounts for the heat.)

The author repeatedly cites international experts calling such scenarios impossible or highly implausible to suggest that the experts are libeling cold fusion claims with the label pathological science. I side with the experts rather than the author.

It is obvious that there were experts who did that; this is undeniable. Simon does not suggest “libel.” And Vidale merely joins in the labeling, without being specific enough that one could test his claims. He’s outside of science. He’s taking sides, which sociologists generally don’t do, nor, in fact, do careful scientists within their field. To claim that a scientist is practicing “pathological science” is a deep insult. That is not a scientific category. Langmuir coined the term, and gave characteristics, which only superficially match cold fusion, which long ago moved outside of that box.

Also, the claim is made that this case demonstrates that sociologists are better equipped to mediate disputes involving claims of pathological science than scientists, which is ludicrous.

It would be, if the book claimed that, but it doesn’t. More to the point, who mediates such disputes? What happens in the real world?

Clearly, in the cold fusion case, another decade after the publication of this book has not contradicted any of the condemnations from scientists of cold fusion.

The 2004 U.S. DoE review was after the publication of the book, and it contradicts the position Dr. Vidale is taking, very clearly. While that review erred in many ways (the review was far too superficial and hurried, the process allowed misunderstandings to arise, and some reviewers clearly misread the presented documents), they did not call cold fusion “nonsense.” Several reviewers probably thought that, but they all agreed with “more research.”

Essentially, if one wishes to critically assess the stages through which cold fusion ideas were discarded, it is helpful to understand the nuclear processes involved.

Actually, no. “Cold fusion” appears to be a nuclear physics topic, because of “fusion.” However, it is actually a set of results in chemistry. What an expert in normal nuclear processes knows will not help with cold fusion. It is, at this point, an “unknown nuclear reaction” (which was claimed in the original paper). (Or it is a set of such reactions.) Yes, if someone wants to propose a theory of mechanism, a knowledge of nuclear physics is necessary, and there are physicists, with such understanding, experts, doing just that. So far, no theory has been successful to the point of being widely accepted.

One should not argue, as the author indirectly does, for large federal investments in blue sky reinvention of physics unless one has an imposing reputation of knowing the limitations of existing physics.

Simon does not argue for that. I don’t argue for that. I suggest exactly what both U.S. DoE reviews suggested: modest funding for basic research under existing programs. That is a genuine scientific consensus! However, it is not necessarily a “consensus of scientists,” that is, some majority showing in a poll, as distinct from the genuine scientific process as it functions with peer review and the like.

It appears that Dr. Vidale has an active imagination, and thinks that Simon is a “believer” and thinks that “believers” want massive federal funding, so he reads that into the book. No, the book is about a sociological phenomenon, it was Simon’s doctoral thesis originally, and sociologists of science will continue to study the cold fusion affair, for a very long time. Huizenga called it the “scientific fiasco of the twentieth century.” He was right. It was a perfect storm, in many ways, and there is much that can be learned from it.

Cold fusion is not a “reinvention of physics.” It tells us very little about nuclear physics. “Cold fusion,” as a name for an anomalous heat effect, does not contradict existing physics. It is possible that when the mechanism is elucidated, it will show some contradiction, but what is most likely is that all that has been contradicted were assumptions about what’s possible in condensed matter, not actual physics.

There are theories being worked on that use standard quantum field theory, merely in certain unanticipated circumstances. Quick example: what will happen if two deuterium molecules are trapped in a relationship at low relative momentum, such that the nuclei form the vertices of a tetrahedron? The analysis has been done by Akito Takahashi: they will collapse into a Bose-Einstein condensate within a femtosecond or so, and that will fuse by tunneling within another femtosecond or so, creating 8Be, which can fission into two 4He nuclei, without gamma radiation (as would be expected if two deuterons could somehow fuse to helium without immediately fissioning into the normal d-d fusion products).

That theory is incomplete, I won’t go into details, but it merely shows how there may be surprises lurking in places we never looked before.

I will amend my review if my attention span is long enough, but the collection of objectionable claims has risen too high to warrant spending another few hours finishing this book. Gary Taubes’ book on the same subject, Bad Science, was much more factual and enlightening.

Taubes’ Bad Science is an excellent book on the history of cold fusion, covering the very early days only. The story of the book is well known: he was in a hurry to finish it so he could be paid. As is common with his work, he spent far more time than made sense economically for him. He believed he understood the physics, and sometimes wrote from that perspective, but, in fact, nobody understands what Pons and Fleischmann found. They certainly didn’t.

Gradually, fact is being established, and ways to create reliable experiments are being developed. Measuring the heat/helium ratio is a reliable and replicable experiment. It’s still not easy, but what is cool about it is that, per existing results, if one doesn’t see heat, one doesn’t see helium, period, and if one does see heat (which with a good protocol might be half the time), one sees proportionate helium.

So Dr. Vidale gave the book a poor review, two stars out of five, based on his rejection of what he imagined the book was saying.


There were some comments, which can be seen by following the Unreal arguments link.

postoak, 6 years ago
“Clearly, in the cold fusion case, another decade after the publication of this book has not contradicted any of the condemnations from scientists of cold fusion.” I think this statement is false. Although fusion may not be occurring, there is much, much evidence that some sort of nuclear event is taking place in these experiments. See http://www.youtube.com/watch?v=VymhJCcNBBc
The video was presented by Frank Gordon, of SPAWAR. It is about nuclear effects, including heat.
JohnVidale, 6 years ago, in reply to an earlier post
More telling than the personal opinion of either of us is the fact that 3 MORE years have passed since the video you linked, and no public demonstration of energy from cold fusion has YET been presented.
How does Dr. Vidale know that? The video covers many demonstrations of LENR. What Dr. Vidale may be talking about is practical levels of energy, and he assumes that if such a demonstration existed, he’d have heard about it. There have been many demonstrations. Dr. Vidale’s comments were from August 2011. Earlier that year, there was a major claim of commercial levels of power, kilowatts, with public “demonstrations.” Unfortunately, it was fraud, but my point here is that this was widely known, widely considered, and Dr. Vidale doesn’t seem to know about it at all.
(The state of the art is quite low-power, but visible levels of power have been demonstrated and confirmed.)
Dr. Vidale is all personal opinion and no facts. He simply ignored the video, which is quite good, being a presentation by the SPAWAR group (U.S. Navy Research Laboratory, San Diego) to a conference organized by Dr. Robert Duncan, who was Vice Chancellor for Research at the University of Missouri, and then countered the comment with simple ignorance (that there has been no public demonstration). 
Taser_This, 2 years ago (edited)
The commenters note is an excellent example of the sociological phenomenon related to the field of Cold Fusion, that shall be studied along with the physical phenomenon, once a change of perception of the field occurs. We shall eventually, and possibly soon, see a resolution of the clash of claims of pathological science vs. pathological disbelief. If history is any indicator related to denial in the face of incontrovertible evidence (in this case the observation of excess heat, regardless of the process of origin since we know it is beyond chemical energies) we shall be hearing a lot more about this topic.

Agreed, Dr. Vidale has demonstrated what an information cascade looks like. He’s totally confident that he is standing for the mainstream opinion. Yet “mainstream opinion” is not a judgment of experts, except, of course, in part.

Dr. Vidale is not an expert in this field, and he is not actually aware of expert reviews of “cold fusion.” Perhaps he might consider reading this peer-reviewed review of the field, published the year before he wrote, in Naturwissenschaften, which was, at the time, a venerable multidisciplinary journal with tough peer review: Edmund Storms, Status of cold fusion (2010). (preprint).

There are many, many reviews of cold fusion in mainstream journals, published in the last 15 years. The extreme skepticism, which Vidale thinks is mainstream, has disappeared from the journals. What is undead here is extreme skepticism on this topic, which hasn’t noticed that it died.

So, is cold fusion Undead, or is it simply Alive and never died?


After writing this, I found that Dr. John Vidale was a double major as an undergraduate, in physics and geology, has a PhD from Cal Tech (1986), and his major focus appears to be seismology.

He might be amused by this story from the late Nate Hoffman, who wrote a book for the American Nuclear Society, supported by the Electric Power Research Institute, A Dialogue on Chemically Induced Nuclear Effects: A Guide for the Perplexed About Cold Fusion (1995). Among other things, it accurately reviews Taubes and Huizenga. The book is written as a dialogue between a Young Scientist (YS), who represents common thinking, particularly among physicists, and an Old Metallurgist (OM), who would be Hoffman himself, and who is commonly considered a skeptic by promoters of cold fusion. Actually, to me, he looks normally skeptical, skepticism being essential to science.

YS: I guess the real question has to be this: Is the heat real?

OM: The simple facts are as follows. Scientists experienced in the area of calorimetric measurements are performing these experiments. Long periods occur with no heat production, then, occasionally, periods suddenly occur with apparent heat production. These scientists become irate when so-called experts call them charlatans. The occasions when apparent heat occurs seem to be highly sensitive to the surface conditions of the palladium and are not reproducible at will.

YS: Any phenomenon that is not reproducible at will is most likely not real.

OM: People in the San Fernando Valley, Japanese, Columbians, et al, will be glad to hear that earthquakes are not real.

YS: Ouch. I deserved that. My comment was stupid.

OM: A large number of people who should know better have parroted that inane statement. There are, however, many artifacts that can indicate a false period of heat production. The question of whether heat is being produced is still open, though any such heat is not from deuterium atoms fusing with deuterium atoms to produce equal amounts of 3He + neutron and triton + proton. If the heat is real, it must be from a different nuclear reaction or some totally unknown non-nuclear source of reactions with energies far above the electron-volt levels of chemical reactions.

As with Taubes, Hoffman may have been under some pressure to complete the book. Miles, in 1991, was the first to report, in a conference paper, that helium was being produced, correlated with heat, and this was noticed by Huizenga in the second edition of his book (1993). Hoffman covers some of Miles’ work, and some helium measurements, but does not report the crucial correlation, though this was published in the Journal of Electroanalytical Chemistry in 1993.

I cover heat/helium, as a quantitatively reproducible and widely-confirmed experiment, in my 2015 paper, published in a special section on Low Energy Nuclear Reactions in Current Science.

Of special note in that section would be McKubre, Cold fusion: comments on the state of scientific proof.

McKubre is an electrochemist who, when he saw the Pons and Fleischmann announcement, already was familiar with the palladium-deuterium system, working at SRI International, and immediately recognized that the effect reported must be in relatively unexplored territory, with very high loading ratio. This was not widely understood, and replication efforts that failed to reach a loading threshold, somewhere around 90% atom (D/Pd), reported no results (neither anomalous heat, nor any other nuclear effects). At that time, it was commonly considered that 70% loading was a maximum.

SRI and McKubre were retained by the Electric Power Research Institute, for obvious reasons, to investigate cold fusion, and McKubre spent the rest of his career, until retiring recently, mostly on LENR research.

One of the characteristics of the rejection cascade was cross-disciplinary disrespect. In his review, Dr. Vidale shows no respect for or understanding of sociology and “science studies,” and mistakes his own opinions and those of his friends for “scientific consensus.”

What is scientific consensus? This is a question that sociologists and philosophers of science study. As well, most physicists knew little to nothing about electrochemistry, and there are many stories of Stupid Mistakes, such as reversing the cathode and anode (because of a differing convention) and failing to maintain very high cleanliness of experiments. One electrochemist, visiting such a lab, asked, “And then did you pee in the cell?” The most basic mistake was failing to run the experiment long enough to develop the conditions that create the effect. McKubre covers that in the paper cited.

(An electrolytic cathode will collect cations from the electrolyte, and cathodes may become loaded with fuzzy junk. I fully sympathize with physicists with a distaste for the horrible mess of an electrolytic cathode. For very good reasons, they prefer the simple environment of a plasma, which they can analyze using two-body quantum mechanics.

I sat in Feynman’s lectures at Cal Tech, 1961-63, and, besides his anecdotes that I heard directly from him when he visited Page House, I remember one statement about physics: “We don’t have the math to calculate the solid state, it is far too complex.” Yet too many physicists believed that the approximations they used were reality. No, they were useful approximations, that usually worked. So did Ptolemaic astronomy.)

Dr. Vidale is welcome to comment here and to correct errors, as may anyone.

NASA

This is a subpage of Widom-Larsen theory/Reactions

On New Energy Times, “Third Party References” to W-L theory include two connected with NASA, by Dennis Bushnell (2008) [slide 37] and J. M. Zawodny (2009) (slide 12, date is October 19, 2010, not 2009 as shown by Krivit).

What can be seen in the Zawodny presentation is a researcher who is not familiar with LENR evidence, overall, nor with the broad scope of existing LENR theory, but who has accepted the straw man arguments of WL theorists and Krivit, about other theories, and who treats WL theory as truth without clear verification. NASA proceeded to put about $1 million into LENR research, with no publications coming out of it, at least not associated with WL theory. They did file a patent, and that will be another story.

By 2013, all was not well in the relationship between NASA and Larsen.

To summarize, NASA appears to have spent about a million dollars looking into Widom-Larsen theory, and did not find it adequate for their purposes, nor did they develop, it seems, publishable data in support (or in disconfirmation) of the theory. In 2012, they were still bullish on the idea, but apparently out of steam. Krivit turns this into a conspiracy to deprive Lattice Energy of profit from their “proprietary technology,” which Lattice had not disclosed to NASA. I doubt there is any such technology of any significant value.

NASA’s LENR Article “Nuclear Reactor in Your Basement”

[NET linked to that article, and also to another copy. They are dead links, like many old NET links; NET has moved or removed many pages it cites, and the search function does not find them. But this page I found with Google on phys.org.]

Now, in the Feb. 12, 2013, article, NASA suggests that it does not understand the Widom-Larsen theory well. However, Larsen spent significant time training Zawodny on it. Zawodny also understood the theory well enough to be a co-author on a chapter about the Widom-Larsen theory in the 2011 Wiley Nuclear Energy Encyclopedia. He understood it well enough to give a detailed, technical presentation on it at NASA’s Glenn Research Center on Sept. 22, 2011.

It simply does not occur to Krivit that perhaps NASA found the theory useless. Zawodny was a newcomer to LENR; that’s obvious. Krivit was managing that Wiley encyclopedia. The “technical presentation” linked contains numerous errors that someone familiar with the field would be unlikely to make — unless they were careless. For example, Pons and Fleischmann did not claim “2H + 2H -> 4He.” Zawodny notes that high electric fields will be required for electrons “heavy” enough to form neutrons, but misses that these would have to operate over unphysical distances, for an unphysical accumulation of energy, and misses all the observable consequences.
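(To put rough numbers on “unphysical distances”: a back-of-the-envelope Python sketch of my own, not anything from Zawodny, Widom, or Larsen. It assumes the roughly 0.78 MeV electron-capture threshold and the 10^11 V/m field strength that W-L proponents cite, both discussed further below.)

threshold_eV = 0.78e6         # approximate energy needed for e + p -> n, in eV
field_V_per_m = 1e11          # electric field strength cited by W-L proponents
distance_m = threshold_eV / field_V_per_m   # for unit charge, eV gained per meter equals field in V/m
lattice_m = 4e-10             # palladium lattice constant, roughly 4 Angstroms
print(distance_m)             # ~7.8e-6 m: about eight microns of coherent acceleration
print(round(distance_m / lattice_m))        # ~19500 lattice constants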

In general, as we can see from early reactions to WL Theory, simply to review and understand a paper like those of Widom and Larsen requires study and time, in addition to the follow-up work to confirm a new theory. WL theory was designed by a physicist (Widom; Larsen is not a physicist but an entrepreneur) to seem plausible on casual review.

To actually understand the theory and its viability, one needs expertise in two fields: physics and the experimental findings in Condensed Matter Nuclear Science (mostly chemistry). That combination is not common. So a physicist can look at the theory papers and think, “plausible,” but not see the massive discrepancies with the experimental evidence. They will only see the “hits,” the great example being the plot showing correspondence between WL prediction and Miley data. They will not know that (1) Miley’s results are unconfirmed and (2) other theories might make similar predictions. Physicists may be thrilled to have a LENR theory that is “not fusion,” not noticing that WL theory actually requires higher energies than are needed for ordinary hot fusion.

Also from the page cited:

New Energy Times spoke with Larsen on Feb. 21, 2013, to learn more about what happened with NASA.

“Zawodny contacted me in mid-2008 and said he wanted to learn about the theory,” Larsen said. “He also dangled a carrot in front of me and said that NASA might be able to offer funding as well as give us their Good Housekeeping seal of approval.

Larsen has, for years, been attempting to position himself as a consultant on all things LENR. It wouldn’t take much to attract Larsen.

“So I tutored Zawodny for about half a year and taught him the basics. I did not teach him how to implement the theory to create heat, but I offered to teach them how to use it to make transmutations because technical information about reliable heat production is part of our proprietary know-how.

Others have claimed that Larsen is not hiding stuff. That is obviously false. What is effectively admitted here is that WL theory does not provide enough guidance to create heat, which is the main known effect in LENR, the most widely confirmed. Larsen was oh-so-quick to identify fraud with Rossi, but not fast enough — or too greedy — to consider it possible with Larsen. Larsen was claiming Lattice Energy was ready to produce practical devices for heat in 2003. He mentioned “patent pending, high-temperature electrode designs,” and “proprietary heat sources.” Here is the patent, perhaps. It mentions neither heat nor any nuclear effect. Notice that if a patent does not provide adequate information to allow constructing a working device, it’s invalid. The patent referred to a prior Miley patent, first filed in 1997, which does mention transmutation. Both patents reference Patterson patents from as far back as 1990. There is another Miley patent filed in 2001 that has been assigned to Lattice.

“But then, on Jan. 22, 2009, Zawodny called me up. He said, ‘Sorry, bad news, we’re not going to be able to offer you any funding, but you’re welcome to advise us for free. We’re planning to conduct some experiments in-house in the next three to six months and publish them.’

“I asked Zawodny, ‘What are the objectives of the experiments?’ He answered, ‘We want to demonstrate excess heat.’

Remember that this is hearsay. However, it’s plausible. NASA would not be interested in transmutations, but rather has a declared interest in LENR for heat production for space missions. WL Theory made for decent cover (though it didn’t work; NASA still took flak for supporting Bad Science), but it provides no guidance — at all — for creating reliable effects. It simply attempts to “explain” known effects, in ways that create even more mysteries.

“I told Zawodny, ‘At this point, we’re not doing anything for free. I told you in the beginning that all I was going to do was teach you the basic physics and, if you wish, teach you how to make transmutations every time, but not how to design and fabricate LENR devices that would reliably make excess heat.’

And if Larsen knew how to do that, and could demonstrate it, there are investors lined up with easily a hundred million dollars to throw at it. What I’m reasonably sure of is that those investors have already looked at Lattice and concluded that there is no there there. Can Larsen show how to make transmutations every time? Maybe. That is not so difficult, though still not a slam-dunk.

“About six to nine months later, in mid-2009, Zawodny called me up and said, ‘Lew, you didn’t teach us how to implement this.’ To my amazement, he was still trying to get me to tell him how to reliably make excess heat.

See, Zawodny was interested in heat from the beginning, and the transmutation aspect of WL Theory was a side issue. Krivit has presented WL Theory as a “non-fusion” explanation for LENR, and the interest in LENR, including Krivit’s interest, was about heat; consider the name of his blog (“New Energy”). But the WL papers hardly mention heat. Transmutations are generally a detail in LENR; the main reaction clearly makes heat and helium and very few transmuted elements by comparison. In the fourth WL paper, there is mention of heat, and in the conclusion, there is mention of “energy-producing devices.”

From a technological perspective, we note that energy must first be put into a given metallic hydride system in order to renormalize electron masses and reach the critical threshold values at which neutron production can occur.

This rules out gas-loading, where there is no input energy. This is entirely aside from the problem that neutron production requires very high energies, higher than hot fusion initiation energies.

Net excess energy, actually released and observed at the physical device level, is the result of a complex interplay between the percentage of total surface area having micron-scale E and B field strengths high enough to create neutrons and elemental isotopic composition of near-surface target nuclei exposed to local fluxes of readily captured ultra low momentum neutrons. In many respects, low temperature and pressure low energy nuclear reactions in condensed matter systems resemble r- and s-process nucleosynthetic reactions in stars. Lastly, successful fabrication and operation of long lasting energy producing devices with high percentages of nuclear active surface areas will require nanoscale control over surface composition, geometry and local field strengths.

The situation is even worse with deuterium. This piece of the original W-L paper should have been seen as a red flag:

Since each deuterium electron capture yields two ultra low momentum neutrons, the nuclear catalytic reactions are somewhat more efficient for the case of deuterium.

The basic physics here is simple and easy to understand. Reactions can, in theory, run in reverse, and the energy that is released by fusion or fission is the same as the energy required to create the opposite effect; that’s a basic law of thermodynamics, which I term “path independence.” So the energy that must be input to create a neutron from a proton and an electron is the same energy as is released by ordinary neutron decay (neutrons being unstable, with a half-life of about ten minutes, decaying to a proton, an electron, and a neutrino; forget about the neutrino unless you want the real nitty gritty, as it is apparently not needed for the reverse reaction). That energy is 781 keV.

Likewise, the fusion of a proton and a neutron to make a deuteron releases a prompt gamma ray at 2.22 MeV. So to fission the deuteron back to a proton and a neutron requires energy input of 2.22 MeV, and then to convert the proton to another neutron requires another 0.78 MeV, so the total energy required is 3.00 MeV. What Widom and Larsen did was neglect the binding energy of the deuteron, a basic error in basic physics, and I haven’t seen that this has been caught by anyone else. But it’s so obvious, once seen, that I’m surprised and I will be looking for it.
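(A cross-check of those figures from standard rest masses; this is my own arithmetic, in Python, not taken from the W-L papers.)

m_p = 938.272    # proton rest mass, in MeV
m_n = 939.565    # neutron rest mass, in MeV
m_e = 0.511      # electron rest mass, in MeV
m_d = 1875.613   # deuteron (nucleus) rest mass, in MeV
print(round(m_n - m_p - m_e, 3))        # ~0.782 MeV: the 781 keV needed for e + p -> n
print(round(m_p + m_n - m_d, 2))        # ~2.22 MeV: deuteron binding energy
print(round(2 * m_n - m_d - m_e, 2))    # ~3.01 MeV: total needed for e + d -> 2n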

Bottom line, then, WL theory fails badly with pure deuterium fuel and thus is not an explanation for the FP Heat Effect, the most common and most widely confirmed LENR. Again, the word “hoax” comes to mind. Larsen went on:

I said, ‘Joe, I’m not that stupid. I told you before, I’m only going to teach you the basics, and I’m not going to teach you how to make heat. Nothing’s changed. What did you expect?’”

Maybe he expected not to be treated like a mushroom.

Larsen told New Energy Times that NASA’s stated intent to prove his theory is not consistent with its behavior since then.

Many government scientists were excited by WL Theory. As a supposed “not fusion” theory, it appeared to sidestep the mainstream objection to “cold fusion.” So, yes, NASA wanted to test the theory (“prove” is not a word used commonly by scientists), because if it could be validated, funding floodgates might open. That did not happen. NASA spent about a million dollars and came up with, apparently, practically nothing.

“Not only is there published experimental data that spans one hundred years which supports our theory,” Larsen said, “but if NASA does experiments that produce excess heat, that data will tell them nothing about our theory, but a transmutation experiment, on the other hand, will.

Ah, I will use that image from NET again:

Transmutations have been reported since very early after the FP announcement, and they reported, in fact, tritium and helium, though not convincingly. With one possible exception I will be looking at later, transmutation has never been correlated with heat (nor has tritium; only helium has been found and confirmed to be correlated). Finding low levels of transmuted products has often gotten LENR researchers excited, but this has never been able to overcome common skepticism. Only helium, through correlation with heat, has been able to do that (when skeptics took the time to study the evidence, and most won’t).

Finding some transmutations would not prove WL theory. First of all, it is possible that there is more than one LENR effect (and, depending on how “effect” is defined, it is clear there is more than one). Secondly, other theories also provide transmutation pathways.

“The theory says that ultra-low-momentum neutrons are produced and captured and you make transmutation products. Although heat can be a product of transmutations, by itself it’s not a direct confirmation of our theory. But, in fact, they weren’t interested in doing transmutations; they were only interested in commercially relevant information related to heat production.

Heat is palpable; transmutations are not necessarily so. As well, the analytical work to study transmutations is expensive. Why would NASA invest money in verifying transmutation products, if not in association with heat? From the levels of transmutations found and the likely precursors, heat should be predictable. No, Larsen was looking out for his own business interests, and he can “sell” transmutation with little risk. Selling heat could be much riskier, if he doesn’t actually have a technology. Correlations would be a direct confirmation, far more powerful than the anecdotal evidence alleged. At this point, there is no experimental confirmation of WL theory, in spite of it having been published in 2005. The neutron report cited by Widom in one of his “refutations” — and he was a co-author of that report — actually contradicts WL Theory.

Of course, that report could be showing that some of the neutrons are not ultra-low momentum, and some could then escape the heavy electron patch, but by the same token prompt gammas would then be detected, in addition to the other problem that is solved-by-ignoring-it: delayed gammas from radioactive transmuted isotopes. WL Theory is a house of cards that actually never stood, but it seemed like a good idea at the time! Larsen continued:

“What proves that is that NASA filed a competing patent on top of ours in March 2010, with Zawodny as the inventor.

The NASA initial patent application is clear about the underlying concept (Larsen’s) and the intentions of NASA. Line [25] from NASA’s patent application says, “Once established, SPP [surface plasmon polariton] resonance will be self-sustaining so that large power output-to-input ratios will be possible from [the] device.” This shows that the art embodied in this patent application is aimed toward securing intellectual property rights on LENR heat production.

The Zawodny patent actually is classified as a “fusion reactor.” It cites the Larsen patent described below.

See A. Windom [sic] et al. “Ultra Low Momentum Neutron Catalyzed Nuclear Reactions on Metallic Hydride Surface,” European Physical Journal C-Particles and Fields, 46, pp. 107-112, 2006, and U.S. Pat. No. 7,893,414 issued to Larsen et al. Unfortunately, such heavy electron production has only occurred in small random regions or patches of sample materials/devices. In terms of energy generation or gamma ray shielding, this limits the predictability and effectiveness of the device. Further, random-patch heavy electron production limits the amount of positive net energy that is produced to limit the efficiency of the device in an energy generation application.

They noticed. This patent is not the same as the Larsen patent. It looks like Zawodny may have invented a tweak, possibly necessary for commercial power production.

The Larsen patent was granted in 2011, but was filed in 2006, and is for a gamma shield, which is apparently vaporware, as Larsen later admitted it couldn’t be tested.

I don’t see that Larsen has patented a heat-producing device.

“NASA is not behaving like a government agency that is trying to pursue basic science research for the public good. They’re acting like a commercial competitor,” Larsen said. “This becomes even more obvious when you consider that, in August 2012, a report surfaced revealing that NASA and Boeing were jointly looking at LENRs for space propulsion.” [See New Energy Times article “Boeing and NASA Look at LENRs for Green-Powered Aircraft.”]

I’m so reminded of Rossi’s reaction to the investment of Industrial Heat in standard LENR research in 2015. To Rossi, it was intolerable: IH was allegedly supporting his “competitors.” In fact, in spite of efforts, Rossi was unable to find evidence that IH had shared Rossi secrets, and in hindsight, if Rossi actually had valuable secrets, he withheld them, violating the Agreement.

From NET coverage of the Boeing/NASA cooperation:

[Krivit had moved the page to make it accessible to subscribers only, to avoid “excessive” traffic, but the page was still available with a different URL. I archived it so that the link above won’t increase his traffic. It is a long document. If I find time, I will extract the pages of interest, PDF pages 38-40, 96-97]

The only questionable matter in the report is its mention of Leonardo Corp. and Defkalion as offering commercial LENR systems. In fact, the two companies have delivered no LENR technology. They have failed to provide any convincing scientific evidence and failed to show unambiguous demonstrations of their extraordinary claims. Click here to read New Energy Times’extensive original research and reporting on Andrea Rossi’s Leonardo Corp.

Defkalion is a Greek company that based its technology on Rossi’s claimed Energy Catalyzer (E-Cat) technology . . . Because Rossi apparently has no real technology, Defkalion is unlikely to have any technology, either.

What is actually in the report:

Technology Status:
Multiple coherent theories that explain LENR exist which use the standard Quantum Electrodynamics & Quantum Chromodynamics model. The Widom-Larson(10) theory appears to have the best current understanding, but it is far from being fully validated and applied to current prototype testing. Limited testing is ongoing by NASA and private contractors of nickel-hydrogen LENR systems. Two commercial companies (Leonardo Corp. & Defkalion) are reported to be offering commercial LENR systems. Those systems are advertised to run for 6 months with a single fueling cycle. Although data exists on all of these systems, the current data in each case is lacking in either definition or 3rd party verification. Thus, the current TRL assessment is low.
In this study the SUGAR Team has assumed, for the purposes of technology planning and establishing system requirements that the LENR technology will work. We have not conducted an independent technology feasibility assessment. The technology plan contained in this section merely identifies the steps that would need to take place to develop a propulsion system for aviation that utilizes LENR technology.

This report was issued in May 2012. The descriptions of Leonardo, Defkalion, and WL theory were appropriate for that time. At that point, there was substantially more evidence supporting heat from Leonardo and Defkalion, but no true independent verification. Defkalion vanished in a cloud of bad smell; Leonardo was found to be highly deceptive at best. And WL theory also has, as they point out, no “definition” — as to energy applications — nor 3rd party verification.

Krivit’s articles on Rossi and Leonardo were partly based on innuendo and inference; they had little effect on investment in the Rossi technology, because of the obvious yellow-journalist slant. Industrial Heat decided that they needed to know for sure, and did what it took to become certain, investing about $20 million in the effort. They knew, full well, it was very high-risk, and considered the possible payoff so high, and the benefits to the environment so large, as to be worth that cost, even if it turned out that Rossi was a fraud. The claims were depressing LENR investment. Because they took that risk, Woodford Fund then gave them an additional $50 million for LENR research, and much of current research has been supported by Industrial Heat. Krivit has almost entirely missed this story. As to clear evidence on Rossi, it became public with the lawsuit, Rossi v. Darden, and we have extensive coverage on that here. Krivit was right that Rossi was a fraud . . . but it is very different to claim that from appearances and to actually show it with evidence.

In the Feb. 12, 2013, NASA article, the author, Silberg, said, “But solving that problem can wait until the theory is better understood.”

He quoted Zawodny, who said, “’From my perspective, this is still a physics experiment. I’m interested in understanding whether the phenomenon is real, what it’s all about. Then the next step is to develop the rules for engineering. Once you have that, I’m going to let the engineers have all the fun.’”

In the article, Silberg said that, if the Widom-Larsen theory is shown to be correct, resources to support the necessary technological breakthroughs will come flooding in.

“’All we really need is that one bit of irrefutable, reproducible proof that we have a system that works,’ Zawodny said. ‘As soon as you have that, everybody is going to throw their assets at it. And then I want to buy one of these things and put it in my house.’”

Actually, what everyone says is that if anyone can show a reliable heat-producing device, independently confirmed, investment will pour in, and that’s obvious. With or without a “correct theory.” A plausible theory was simply nice cover to support some level of preliminary research. NASA was in no way prepared to do what it would take to create those conditions. It might take a billion dollars, unless money is spent with high efficiency, and pursuing a theory that falls apart when examined in detail was not efficient, at all. NASA was led down the rosy path by Widom and Larsen and the pretense of “standard physics.” In fact, the NASA/Boeing report was far more sophisticated, pointing out other theories:

Multiple coherent theories that explain LENR exist which use the standard Quantum Electrodynamics & Quantum Chromodynamics model

As an example, Takahashi’s TSC theory. This is actually standard physics, as well, more so than WL theory, but is incomplete. No LENR theory is complete at this time.

There is one theory, I call it a Conjecture, that in the FP Heat Effect, deuterium is being converted to helium, mechanism unknown. This has extensive confirmed experimental evidence behind it, and is being supported by further research to improve precision. It’s well enough funded, it appears.

Back on Jan. 12, 2012, NASA published a short promotional video in which it tried to tell the public that it thought of the idea behind Larsen and Widom’s theory, but it did not mention Widom and Larsen or their theory. At the time, New Energy Times sent an e-mail to Zawodny and asked him why he did not attribute the idea to Widom and Larsen.

“The intended audience is not interested in that level of detail,” Zawodny wrote.

What the video depicted was far outside the capacity of present technology, yet it treats LENR as a done deal, proven to produce clean energy. That’s hype, but Krivit’s only complaint is that they did not credit Widom and Larsen for the theory used. As if they own physics. After all, if that’s standard physics . . . .

(See our articles “LENR Gold Rush Begins — at NASA” and “NASA and Widom-Larsen Theory: Inside Story” for more details.)

The Gold Rush story tells the same tale of woe, implying that NASA scientists are motivated by the pursuit of wealth, whereas, in fact, the Zawodny patent simply protects the U.S. government.

The only thing that is clear is that NASA tries to attract funding to develop LENR. So does Larsen. It has massive physical and human resources. He is a small businessman and has the trade secret. Interesting times lie ahead.

I see no sign that they are continuing to seek funding. They were funded to do limited research. They found nothing worth publishing, apparently. Now, Krivit claims that Larsen has a “trade secret.” Remember, this is about heat, not transmutations. By the standards Krivit followed with Rossi, Larsen’s technology is bullshit. Krivit became a more embarrassing flack for Larsen than Mats Lewan became for Rossi. Why did he ask Zawodny why he didn’t credit Widom and Larsen for the physics in that video? It’s obvious. He’s serving as a public relations officer for Lattice Energy. Widom is the physics front. Krivit talks about a gold rush at NASA. How about at New Energy Times, and with Widom, a “member” of Lattice Energy and a named inventor in the useless gamma shield patent?

NASA started telling the truth about the theory: that it is undeveloped and unproven. Quoted on the Gold Rush page:

“Theories to explain the phenomenon have emerged,” Zawodny wrote, “but the majority have relied on flawed or new physics.

Not only did he fail to mention the Widom-Larsen theory, but he wrote that “a proven theory for the physics of LENR is required before the engineering of power systems can continue.”

Shocking. How dare they imply there is no proven theory? The other page, “Inside Story,” is highly repetitive. Given that Zawodny refused an interview, the “inside story” is told by Larsen.

In the May 23, 2012, video from NASA, Zawodny states that he and NASA are trying to perform a physics experiment to confirm the Widom-Larsen theory. He mentions nothing about the laboratory work that NASA may have performed in August 2011. Larsen told New Energy Times his opinion about this new video.

“NASA’s implication that their claimed experimental work or plans for such work might be in any way a definitive test of the Widom-Larsen theory is nonsense,” Larsen said.

It would be the first independent confirmation, if the test succeeded. Would it be “definitive”? Unlikely. That’s really difficult. Widom-Larsen theory is actually quite vague. It posits reactions that are hidden, gamma rays that are totally absorbed by transient heavy electron patches, which, by the way, would need to handle 2.2 MeV photons from the fusion of a neutron with a proton to form deuterium. But these patches are fleeting, so they can’t be tested. I have not seen specific proposed tests in WL papers. Larsen wanted them to test for transmutations, but transmutations at low levels are not definitive without much more work. What NASA wanted to see was heat, and presumably heat correlated with nuclear products.

“The moment NASA filed a competing patent, it disqualified itself as a credible independent evaluator of the Widom-Larsen theory,” he said. “Lattice Energy is a small, privately held company in Chicago funded by insiders and two angel investors, and we have proprietary knowledge.

Not exactly. Sure, that would be a concern, except that this was a governmental patent, and was for a modification to the Larsen patent intended to create more reliable heat. Consider this: Larsen and Widom both have a financial interest in Lattice Energy, and so are not neutral parties in explaining the physics. If NASA found confirmation of LENR using a Widom-Larsen approach (I’m not sure what that would mean), it would definitely be credible! If they did not confirm, this would be quite like hundreds of negative studies in LENR. Nothing particularly new. Such studies never prove that an original report was wrong.

Cirillo, with Widom as co-author, claimed the detection of neutrons. Does Widom as a co-author discredit that report? To a degree, yes. (But the report did not mention Widom-Larsen theory.) Was that work supported by Lattice Energy?

“NASA offered us nothing, and now, backed by the nearly unlimited resources of the federal government, NASA is clearly eager to get into the LENR business any way it can.”

Nope. They spent about a million dollars, it appears, and filed a patent to protect that investment. There are no signs that they intend to spend more at this point.

New Energy Times asked Larsen for his thoughts about the potential outcome of any NASA experiment to test the theory, assuming details are ever released.

“NASA is behaving no differently than a private-sector commercial competitor,” Larsen said. “If NASA were a private-sector company, why would anyone believe anything that it says about a competitor?”

NASA’s behavior here does not remotely resemble that of a commercial actor. Notice that when NASA personnel said nice things about W-L theory, Krivit was eager to hype it. And when they merely hinted that the theory was just that, a theory, and unproven, suddenly their credibility is called into question.

Krivit is transparent.

Does he really think that if NASA found a working technology, ready to develop for their space flight applications, they would hide it because of “commercial” concerns? Ironically, the one who is openly concealing technology, if he isn’t simply lying, is Larsen. He has the right to do that, as Rossi had the right. Either one or both were lying, though. There is no gamma shield technology, but Larsen used the “proprietary” excuse to avoid disclosing evidence to Richard Garwin. And Krivit reframed that to make it appear that Garwin approved of WL Theory.

 

Reactions

This is a subpage of Widom-Larsen theory

New Energy Times has pages covering reactions to Widom-Larsen theory, listed in the “In the News Media” section of the W-L theory master page:

November 10, 2005, Krivit introduced W-L theory. Larsen is described in this as “mysterious.”

March 10, 2006, Krivit published Widom-Larsen Low Energy Nuclear Reaction Theory, Part 3. (The 2005 story was about “Newcomers,” and had a Part 1 and Part 2, and only Part 2 was about W-L theory.)

March 16, 2007, “Widom Larsen Theory Debate” mentions critical comments by Peter Hagelstein, “choice words” from Scott Chubb, covers the correspondence between a reported prediction by Widom and Larsen re data from George Miley (which is the most striking evidence for the theory I have seen, but I really want to look at how that prediction was made, since this is post-hoc, apparently), presents a critique by Akito Takahashi with little comment, the comment from Scott Chubb mentioned above, an Anonymous reaction from a Navy particle physicist, and a commentary from Robert Deck.

January 11, 2008 The Widom-Larsen Not-Fusion Theory has a detailed history of Krivit’s inquiry into W-L theory, with extensive discussions with critics. Krivit didn’t understand or recognize some of what was written to him. However, he was clearly trying to organize some coherent coverage.

“Non-reviewed peer responses” has three commentaries:

September 11, 2006 from Dave Rees, “particle physicist” with SPAWAR.

March 14, 2007, by Robert Deck of Toledo University.

February 23, 2007, by Hideo Kozima (source of the initial Kozima quote is unclear)

Also cited:

May 27, 2005 Lino Daddi conference paper on Hydrogen Miniatoms. Daddi’s mention of W-L theory is of unclear relationship to the topic of the paper.

(Following up on a dead link on the W-L theory page, I found this article from the Chicago Tribune from April 16, 2007, showing how Lattice Energy was representing itself then. Larsen “predicts that within five years there will be power sources based on LENR technology.”) That page was taken down, but I found it on the internet archive.

Third-Party References:

David Nagel, email to Krivit, May 4, 2005, saying that he’s sending it to “some theoretical physicists for a scrub,” and Nagel slides of May 11, 2005, and Sept. 16, 2005. The first poses “challenges” about W-L theory (some of the same questions I have raised). The second asks the same questions. Nagel is treating the theory as possibly having some promise, in spite of still having questions about it. This was the same year as the original publication.

Lino Daddi is quoted, with no context (the link is to Krivit, NET).

Brian Josephson, the same.

George Miley is also quoted, more extensively, from Krivit.

David Rees (cited above also)

SPAWAR LENR Research Group – 2007: “We find that Widom and Larsen have done a thorough mathematical treatment that describes one mechanism to create…low-energy neutrons.”

There is an erratum that credits Widom and Larsen for the generation of “low energy neutrons.”

Szpak et al (2007), in “Further evidence of nuclear reactions in the Pd/D lattice: emission of charged particles,” were looking at the reverse of neutron decay and, after pointing to the 0.8 MeV required for this with a proton and “25 times” more with a deuteron, inexplicably proposed this:

The reaction e + D+ -> 2n is the source of low energy neutrons (Szpak, unpublished data), which are the product of the energetically weak reaction (with the heat of reaction on the electron volt level) and reactants for the highly energetic nuclear reaction n+ X -> Y.
At that point SPAWAR had evidence they were about to publish for fast neutrons. I’m not aware of any of their work that supports slow neutrons, but maybe Szpak had them in mind for transmutations.

Defense Threat Reduction Agency – 2007: “New theory by Widom[-Larsen] shows promise; collective surface effects, not fusion.”

The NET report is linked, as is the actual report. The comment was an impression from 2007, common then.

Richard Garwin (Physicist, designer of the first hydrogen bomb) – 2007: “…I didn’t say it was wrong”

The comment is presented out of context, to mislead.

Dennis M. Bushnell (Chief Scientist, NASA Langley Research Center) – 2008: “Now, a Viable Theory” (page 37)

See the NASA subpage. All is not well between NASA and Larsen.

Johns Hopkins University – 2008 (pages 25 and 37) [page 25, pdf page 26, has this:]

[About the Fleischmann-Pons affair] . . . Whatever else, this history may stand as one of the more acute examples of the toxic effect of hype on potential technology development. [. . . ]

and they then proceed to repeat some hype:

According to the Larsen-Widom analysis, the tabletop, LENR reactions involve what’s called the “weak nuclear force,” and require no new physics.22 Larsen anticipates that advances in nanotechnology will eventually permit the development of compact, battery-like LENR devices that could, for example, power a cell phone for five hundred hours.

Note 22 is the only relevant information on page 37, and it is only a citation of Krivit’s Widom-Larsen theory portal (but the link was broken: it pointed to “.htm,” which fails; it must now be “.shtml”. This may explain many of the broken links on NET.)

This citation is simply an echo of Krivit’s hype.

Pat McDaniel (retired from Sandia N.L.): “Widom Larsen theory is considered by many [people] in the government bureaucracy to explain LENR.”

J. M. Zawodny (Senior Scientist, NASA Langley Research Center) – 2009: “All theories are based on the Strong Nuclear force and are variants of Cold Fusion except for one new theory. Widom-Larsen Theory is the first theory to not require ‘new physics’.”

DTRA-Sponsored Report – 2010, “Applications of Quantum Mechanics: Black Light Power and the Widom-Larsen Theory of LENR,” Toton, Edward and Ullrich, George

Randy Hekman (2012 Senatorial Candidate) – 2011: “This theory explains the data in ways that are totally consistent with accepted concepts of science.”

CERN March 22, 2012 Colloquium

The link is to an NET page.

Marty K. Bradley and Christopher K. Droney – Boeing (May 2012): “The Widom-Larson theory appears to have the best current understanding.”

In 2007, Krivit solicited comments from LENR researchers on a mailing list.

Explanation

This is a subpage of Widom-Larsen theory

Steve Krivit’s summary:

1. Creation of Heavy Electrons   
Electromagnetic radiation in LENR cells, along with collective effects, creates a heavy surface plasmon polariton (SPP) electron from a sea of SPP electrons.

Part of the hoax involves confusion over “heavy electrons.” The term refers to renormalization of mass, based on the behavior of electrons under some conditions, which can be conceived “as if” they are heavier. There is no gain in rest mass, apparently. That “heavy electrons” can exist, in some sense or other, is not controversial. The question is “how heavy?” We will look at that. In explanations of this, proponents of W-L theory point to evidence of intense electric fields under some conditions; one figure given was 10^11 volts per meter. That certainly sounds like a lot, but … that field strength exists over what distance? To transfer the energy to an electron, it would be accelerated by the field over a distance, and that would give it a “mass” of 10^11 electron volts per meter of travel, but the fields described exist only over very short distances. The lattice constant of palladium is under 4 Angstroms, or 4 × 10^-10 meter. So a field of 10^11 volts/meter would give mass (energy) of under 40 electron volts per lattice constant.
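Just to check that arithmetic, here is a throwaway Python snippet (mine, not from any source; the field strength and lattice constant are the values quoted above):

```python
# Back-of-the-envelope check of the field-strength argument above.
# Assumed values (mine, for illustration): field of 1e11 V/m, Pd lattice
# constant of about 3.9 Angstroms ("under 4 Angstroms," as stated).
field = 1e11            # volts per meter
lattice = 3.9e-10       # meters

energy_per_lattice_eV = field * lattice   # eV gained by one electron charge
print(f"Energy gained across one lattice constant: {energy_per_lattice_eV:.0f} eV")
# -> roughly 39 eV, i.e., "under 40 electron volts per lattice constant"
```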

Generally, this problem is denied by claiming that there is some collective effect where many electrons give up some of their energy to a single electron. This kind of energy collection violates the Second Law of Thermodynamics as it applies to large systems. The reverse, a large energy carried by one electron being distributed to many electrons, is normal.

The energy needed to create a neutron is the same as the energy released in neutron decay, i.e., 781 keV, which is far more than the energy needed to “overcome the Coulomb barrier.” If that energy could be collected in a single particle, then ordinary fusion would be easy to come by. However, this is not happening.

2. Creation of ULM Neutrons  
An electron and a proton combine, through inverse beta decay, into an ultra-low-momentum (ULM) neutron and a neutrino.

Neutrons have a short half-life and undergo beta decay, as mentioned below, so they call this “inverse beta decay,” though the more common term is “electron capture.” What is described is a form of electron capture, of the electron by a proton. By terming the electron “heavy,” they perhaps imagine it could have an orbit closer to the nucleus, and thus be more susceptible to capture. But the heavy electrons are “heavy” because of their momentum, which would cause many other effects that are not observed. They are not “heavy” as muons are heavy, i.e., with higher rest mass. The added mass is associated with high momentum, hence high velocity, not at all conducive to electron capture.

The energy released from neutron decay is 781 keV. So the “heavy electron” would need to collect that much energy from the field, i.e., over about 20,000 lattice constants, roughly 8 microns. Now, if you have any experience with high voltage: what would you expect to happen long before that total field would be reached? Yes. ZAAP!
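Again, a quick check of my own arithmetic, with the same assumed field and lattice constant:

```python
# How far would an electron have to be accelerated by a 1e11 V/m field
# to pick up the 781 keV needed for p + e -> n + neutrino?
# Assumed values (illustration only): threshold 781 keV, Pd lattice ~3.9 A.
field = 1e11                 # volts per meter
threshold_eV = 781e3         # electron volts
lattice = 3.9e-10            # meters

distance_m = threshold_eV / field
print(f"Distance: {distance_m * 1e6:.1f} micrometers")     # ~7.8 um
print(f"Lattice constants: {distance_m / lattice:,.0f}")   # ~20,000
```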

Remember, these are surface phenomena being described, on the surface of a good conductor, and possibly immersed in an electrolyte, also a decent conductor. High field strength can exist, perhaps, very locally. In studies cited by Larsen, he refers to biological catalysis, which is a very, very local phenomenon where high field strength can exist for a very short distance, on the molecular scale, somewhat similar to the lattice constant for Pd, but a bit larger.

Why and how “ultra low momentum”? Because he says so? Momentum must be conserved, so what happens to the momentum of that “heavy electron”? These are questions I will keep in mind as I look at explanations. Most of the explanations, such as those on New Energy Times, make statements that avoid giving quantities; they can seem plausible if we neglect the problems of magnitude and rate. It is with magnitude and rate that conflicts arise between “standard physics” and cold fusion. After all, even d-d fusion is not “impossible,” but is rate-limited. That is, there is an ordinary fusion rate at room temperature, but it’s very, very . . . very low, unless there are collective effects; it was the aim of Pons and Fleischmann, in beginning their research, to see the effect of the condensed matter state on the Born–Oppenheimer approximation. (There are possible collective effects that do not violate the laws of thermodynamics.)

3. Capture of ULM Neutrons  
That ULM neutron is captured by a nearby nucleus, producing, through a chain of nuclear reactions, either a new, stable isotope or an isotope unstable to beta decay.

A free neutron outside of an atomic nucleus is unstable to beta decay; it has a half-life of approximately 13 minutes and decays into a proton, an electron and a neutrino.

If slow neutrons are created, especially “ultra-slow” ones, they will indeed be captured; neutrons are absorbed freely by nuclei, some more easily than others. If the momentum is too high, they bounce. With very slow neutrons (“ultra low momentum”) the capture cross-section becomes very high for many elements, and many such reactions will occur (essentially, in a condensed matter environment, all the neutrons generated will be absorbed). The general result is an isotope with the same atomic number as the target (same number of protons, thus the same positive charge on the nucleus), but one atomic mass unit heavier, because of the added neutron. While some of these will be stable, many will not, and they would be expected to decay, with characteristic half-lives.

Neutron capture on protons would be expected to generate a characteristic prompt gamma photon at 2.223 MeV. Otherwise the deuterium formed is stable. That such photons are not detected is explained by an ad hoc side-theory, that the heavy electron patches are highly absorbent of the photons. Other elements may produce delayed radiation, in particular gammas and electrons.

How these delayed emissions are absorbed, I have never seen W-L theorists explain.

From the Wikipedia article on Neutron activation analysis:

[An excited state is generated by the absorption of a neutron.] This excited state is unfavourable and the compound nucleus will almost instantaneously de-excite (transmutate) into a more stable configuration through the emission of a prompt particle and one or more characteristic prompt gamma photons. In most cases, this more stable configuration yields a radioactive nucleus. The newly formed radioactive nucleus now decays by the emission of both particles and one or more characteristic delayed gamma photons. This decay process is at a much slower rate than the initial de-excitation and is dependent on the unique half-life of the radioactive nucleus. These unique half-lives are dependent upon the particular radioactive species and can range from fractions of a second to several years. Once irradiated, the sample is left for a specific decay period, then placed into a detector, which will measure the nuclear decay according to either the emitted particles, or more commonly, the emitted gamma rays.

So, there will be a characteristic prompt gamma, and then delayed gammas and other particles, such as the electrons (beta particles) mentioned. Notice that if a proton is converted to a neutron by an electron, and then the neutron is absorbed by an element with atomic number X and mass M, the result is an increase in M of one, and it stays at this mass (approximately) with the emission of the prompt gamma. Then if it beta-decays, the mass stays the same, but the neutron becomes a proton and so the atomic number becomes X + 1. The effect is fusion, as if the reaction were the fusion of X with a proton. So making neutrons is one way to cause elements to fuse; this could be called “electron catalysis.”
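The bookkeeping is simple enough to write down explicitly. This is my own toy sketch, nothing more; the Pd-108 example is just an illustration using a real beta-decaying capture product:

```python
# Minimal bookkeeping sketch (my illustration) of the point above:
# neutron capture leaves Z unchanged and adds 1 to A; beta decay adds 1 to Z
# and leaves A unchanged. The net of one capture plus one beta decay is the
# same (Z, A) change as adding one proton, i.e., fusion with hydrogen.

def neutron_capture(z, a):
    return z, a + 1          # (Z, A) -> (Z, A+1)

def beta_decay(z, a):
    return z + 1, a          # (Z, A) -> (Z+1, A)

z, a = 46, 108               # Pd-108, chosen only as an example target
z, a = beta_decay(*neutron_capture(z, a))
print(z, a)                  # 47, 109 -- the same net change as Pd-108 + p -> Ag-109
```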

Yet it’s very important to Krivit to claim that this is not “fusion.” After all, isn’t fusion impossible at low temperatures? Not with an appropriate catalyst! (Muons are the best known and accepted possibility.)

4. Beta Decay Creation of New Elements and Isotopes  
When an unstable nucleus beta-decays, a neutron inside the nucleus decays into a proton, an energetic electron and a neutrino. The energetic electron released in a beta decay exits the nucleus and is detected as a beta particle. Because the number of protons in that nucleus has gone up by one, the atomic number has increased, creating a different element and transmutation product.

That’s correct as to the effect of neutron activation. Sometimes neutrons are considered to be element zero, mass one. So neutron activation is fusion with the element of mass zero. If there is electron capture with deuterium, this would form a di-neutron, which, if ultracold, might survive long enough for direct capture. If the capture is followed by a beta decay, then the result has been deuterium fusion.

In the graphic above, step 2 is listed twice: 2a depicts a normal hydrogen reaction, 2b depicts the same reaction with heavy hydrogen. All steps except the third are weak-interaction processes. Step 3, neutron capture, is a strong interaction but not a nuclear fusion process. (See “Neutron Capture Is Not the New Cold Fusion” in this special report.)

Very important to him: with the appearance of W-L theory, Krivit more or less made it his career, trashing all the other theorists and many of the researchers in the field because of their “fusion theory,” often treating “fusion” as equivalent to “d-d fusion,” which is probably impossible. But fusion is a much more general term. It basically means the formation of heavier elements from lighter ones, and any process that does this is legitimately a “fusion process,” even if it may also have other names.

Given that the fundamental basis for the Widom-Larsen theory is weak-interaction neutron creation and subsequent neutron-catalyzed nuclear reactions, rather than the fusing of deuterons, the Coulomb barrier problem that exists with fusion is irrelevant in this four-step process.

Now, what is the evidence for weak-interaction neutron creation? What reactions would be predicted, and what evidence would be seen, quantitatively? Yes, electron catalysis, which is what this amounts to, is one of a number of ways around the Coulomb barrier. This one involves the electron being captured into an intermediate product. Most electron-capture theories have a quite different problem from the Coulomb barrier: other products would be expected that are not observed, and W-L theory is not an exception.

The most unusual and by far the most significant part of the Widom-Larsen process is step 1, the creation of the heavy electrons. Whereas many researchers in the past two decades have speculated on a generalized concept of an inverse beta decay that would produce either a real or virtual neutron, Widom and Larsen propose a specific mechanism that leads to the production of real ultra-low-momentum neutrons.

It is not the creation of heavy electrons, per se, that is “unusual,” it is that they must have an energy of 781 keV. Notice that 100 keV is quite enough to overcome the Coulomb barrier. (I forget the actual height of the barrier, but fusion occurs by tunnelling at much lower approach velocities.) This avoidance of mentioning the quantity is typical of explanations of W-L theory.
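For reference, a rough estimate of the barrier height can be made from the electrostatic energy of two unit charges at a nuclear contact distance. The 4 fm separation below is my own assumption, purely for illustration:

```python
# Rough estimate (mine) of the d-d Coulomb barrier height, taking the barrier
# as the electrostatic energy of two unit charges at "contact," assumed here
# to be a separation of about 4 fm.
COULOMB_CONST_MeV_fm = 1.44      # e^2 / (4*pi*eps0) expressed in MeV*fm
separation_fm = 4.0              # assumed contact distance, fm

barrier_MeV = COULOMB_CONST_MeV_fm / separation_fm
print(f"Approximate d-d Coulomb barrier: {barrier_MeV * 1000:.0f} keV")   # ~360 keV
# Tunnelling lets fusion proceed at far lower energies, which is the point:
# 781 keV is well above anything needed merely to beat the barrier.
```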

ULM neutrons would produce very observable effects, and that’s hand-waved away.

The theory also proposes that lethal photon radiation (gamma radiation), normally associated with strong interactions, is internally converted into more-benign infrared (heat) radiation by electromagnetic interactions with heavy electrons. Again, for two decades, researchers have seen little or no gamma emissions from LENR experiments.

As critique of the theory mounted, as people started noticing the obvious, the explanation got even more devious. The claim is that the “heavy electron patches” absorb the gammas, and Lattice Energy (Larsen’s company) has patented this as a “gamma shield.” But when the easy testability of such a shield, if it could really absorb all those gammas, was mentioned (originally by Richard Garwin), Larsen first claimed that experimental evidence was “proprietary,” and then later claimed that the patches could not be detected because they were transient, pointing to the flashing spots in a SPAWAR IR video, which was totally bogus. (Consider imaging gammas, which was the proposal, moving parallel to the surface, close to it. Unless the patches are in wells, below the surface, the gammas would be captured by a patch somewhere along the surface. No, more likely: Larsen was blowing smoke, avoiding a difficult question asked by Garwin. That’s certainly what Garwin thought.) Once upon a time, Krivit reported that incident straight (because he was involved in the conversation). Later he reframed it, extracting a comment from Garwin, out of context, to make it look like Garwin approved of W-L theory.

Richard Garwin (Physicist, designer of the first hydrogen bomb) – 2007: “…I didn’t say it was wrong”

The linked page shows the actual conversation. This was far, far from an approval. The “I didn’t say” was literal, and Garwin points out that reading complex papers with understanding is difficult. In the collection of comments, there are many that are based on a quick review, not a detailed critique.

Perhaps the prompt gammas would be absorbed, though I find the idea of a 2 MeV photon being absorbed by a piddly patch, like a truck being stopped by running into a motorcycle, rather weird, and I’d think some would escape around the edges or down into and through the material. But what about the delayed gammas? The patches would be gone if they flash in and out of existence.

However, IANAP. I Am Not A Physicist. I just know a few. When physics gets deep, I am more or less in “If You Say So” territory. What do physicists say? That’s a lot more to the point here than what I say or what Steve Krivit says, or, for that matter, what Lewis Larsen says. Widom is the physicist; Larsen is the entrepreneur and money guy, if I’m correct. His all-but-degree was in biophysics.

Toton-Ullrich DARPA report

This is a subpage of Widom-Larsen theory

From Krivit:

The report was produced in March 2010, when two physicists, Edward Toton and George Ullrich, under contract with the Advanced Systems and Concepts Office, a think tank that is part of the U.S. Defense Threat Reduction Agency, favorably analyzed Larsen and Widom’s theory.

Toton is a consultant with a long history in defense-related research, and Ullrich was, at the time, a senior vice president for Advanced Technology and Programs with Science Applications International Corp.

Toton and Ullrich summarized their evaluation with a question: “Could the Widom-Larsen theory be the breakthrough needed to position LENR as a major source of carbon-free, environmentally clean source of source of low-cost nuclear energy??”

Larsen spoke with the two physicists from 2007 to 2010 to help them understand key details of his and Widom’s theory of LENRs.

The authors summarized their evaluation in a slide presentation on March 31, 2010, in Fort Belvoir, Virginia. Their slides were geared toward a technical audience and included, with acknowledgments, some information and graphics taken directly from Larsen’s slides, originally published on SlideShare.

Larsen tends to publish on SlideShare, which makes it more difficult to criticize. The Toton-Ullrich summary is not independent; it is heavily taken from Larsen.

The Toton-Ullrich summary does an excellent job of distilling Larsen’s explanation of why LENR experiments produce few long-lived radioactive isotopes:

This is the problem: W-L theory appears to explain certain results, but not the full body of results, only selected phenomena. As well, the theory is often accepted based on superficial explanations that are not detailed and not backed by specific evidence. Before I move on to a detailed examination of W-L theory from 2013 (not rehashed and uncooked evidence from 2010, as the Krivit report was), I do want to look at more of what Toton and Ullrich wrote; it was remarkable in several ways.

Krivit has this report here, but the originals are here: Abstract, Report.

As well, I’ve also copied the report: Applications of Quantum Mechanics: Black Light Power and the Widom-Larsen Theory of LENR 

Tasking

• Determine the state of understanding of LENR theoretical modeling, experimental observations
 – Confer with selected Low Energy Nuclear Reactions (LENR) proponents
 – Survey and evaluate competing theories for the observed LENR results
• Catalogue opponent/proponent views on LENR theories and experiments
 – Conduct literature search
 – Seek consultations
• Review data on element transmutation
 – Present alternative explanations
• Prepare assessment and recommendations
 – Include pros & cons for potential DTRA support of LENR research
• Critically examine past and new claims by Black Light Power Inc: power generation using a newly discovered field of hydrogen-based chemistry
 – Investigate the theoretical basis for these claims
 – Assess compatibility with mainstream theories and other observed phenomena

Did they do this, and how well did they do it? Who designed the task? First of all, mixing Black Light Power with LENR is combining radically different ideas and sets of proponents, as if BLP were claiming “LENR,” which they weren’t.

My emphasis:

Recommendations

• DTRA should be cautious in considering contractual relationships with BlackLight Power
 – Reviews & assessments performed throughout the BlackLight Power history have generally revealed serious deficiencies in the CP theory
 – Experimental claims have not enjoyed the benefit of doubt of even those in the LENR field
 – No substantive independent validations (BlackLight Power exercises proprietary constraints)
• DTRA should continue to be receptive to and an advocate for independent laboratory validation
 – Contractual support for participation in independent laboratory validation should be avoided – a full, “honest broker” stance is necessary should promising results emerge in a highly controversial field

Yes. Obviously. Who made the suggestion that BLP has anything to do with LENR?

Then they move on to LENR. They start with a quotation of the 2004 U.S. DoE report:

The lack of testable theories for (LENRs) is a major impediment to acceptance of experimental claims … What is required for the evidence (presented) is either a testable theoretical model or an engineering demonstration of a self-powered system …
2004 DOE LENR Review Panel

Basically, warmed-over bullshit. “Testable theoretical model” is looking for a testable theory of “mechanism,” whereas what is actually testable is a theory of “effect.” Obviously, either of these could suffice, and the first one was satisfied (as to “effect”) by 1991, though it wasn’t understood that way, because it wasn’t a “theory of mechanism.” Rather it was what I have called a Conjecture: that the Fleischmann-Pons Heat Effect with palladium deuteride is the result of the conversion of deuterium to helium. That is (1) testable — and it’s been widely confirmed, with quantitative results — and (2) nuclear, because of the nuclear product. The other alternative is well beyond the state of the art: it requires a reliable reaction, and with present technology, that’s elusive. The preponderance of the evidence is already clear that the effect is real, and the 2004 review almost got there; the process was a mess, but a clear majority of those who were present for the presentation considered the effect real and probably nuclear in nature. Then there were those who just reacted, remotely, without literally giving the presenters the time of day. That took it to a divided result.

W-L theory here will be considered a “testable theory,” perhaps, but it was proposed in 2005 or so. Where are the test results? Sure, you can cobble together various ad hoc assumptions and thus “explain” some results (most notably work by George Miley on transmutations, which is unconfirmed), but there are other results that the theory seems to predict that are simply ignored, as if those were not “tests” of the theory.

Much of the information in this briefing has been drawn from various papers and briefings posted on the Internet and copyrighted by Lattice Energy, LLA. The information is being used with the expressed permission of Dr. Lewis Larsen, President and CEO of Lattice Energy LLC.

They took the easy way and we can see the influence.

On 23 March 1989 Pons and Fleischman [sic] revealed in a news conference that they had achieved thermonuclear fusion (D – D) in an electrochemical cell at standard pressure and temperature

I’m not completely clear on what they claimed in the news conference. In their first published paper, they actually claimed that they had found an “unknown nuclear reaction,” but the idea that if the FP Heat Effect was nuclear, it must be “d-d fusion” was very common, and we can see here how that is proposed as the Big Idea that W-L has corrected. Those who criticize W-L theory are treated in this report as “proponents of d-d fusion.” This was a totally naive acceptance of the Larsen story, as promoted by Krivit.

The Theoretical Dilemma posed by Cold Fusion

• D – D reactions and their branching ratios
 – D + D -> 3He (0.82 MeV) + n0 (2.45 MeV) (slightly less than 50% of the time)
 – D + D -> T (1.01 MeV) + n0 [sic] (3.02 MeV) (slightly less than 50% of the time)
 – D + D -> 4He (0.08 MeV) + γ (23.77 MeV) (less than 1% of the time)

It is actually far less than 1%. It’s hard to find that branching ratio, but 10^-7 comes to mind. The helium branch is very rare, and so the other two branches really are 50% each. And then, to make it even more obvious that this is not your grandfather’s d-d fusion, tritium shows up a million times more than fast neutrons (which are very rare from LENR). The second branch is also written incorrectly: it produces tritium (T) plus a proton (p), not a neutron. It’s hard to find good help.
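To see why the missing neutrons were decisive, here is a back-of-the-envelope rate estimate of my own, using the branch energies quoted in the slide and assuming one watt of heat from ordinary d-d fusion:

```python
# Why "missing neutrons" killed the d-d interpretation: a rough rate estimate
# (my illustration), assuming 1 watt of heat from ordinary d-d fusion with
# the ~50/50 branching shown above.
EV_PER_JOULE = 6.242e18
E_NP_BRANCH_MeV = 3.27     # D + D -> 3He + n   (0.82 + 2.45 MeV)
E_TP_BRANCH_MeV = 4.03     # D + D -> T + p     (1.01 + 3.02 MeV)

avg_MeV_per_fusion = (E_NP_BRANCH_MeV + E_TP_BRANCH_MeV) / 2   # rare He-4 branch ignored
fusions_per_s = (1.0 * EV_PER_JOULE) / (avg_MeV_per_fusion * 1e6)
neutrons_per_s = fusions_per_s / 2        # roughly half the fusions emit a 2.45 MeV neutron

print(f"{neutrons_per_s:.1e} neutrons/s per watt")
# -> roughly 1e12 neutrons per second, easily detectable (and hazardous);
#    nothing remotely like this is seen in the experiments.
```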

• But the Pons & Fleischman [sic]* results did not indicate neutron emissions at expected rates, nor show any evidence of γ emissions
• Subsequent experiments, while continuing to show convincing evidence for nuclear reactions, have largely dispelled thermonuclear fusion as the underlying responsible physical mechanism
• Some other Low Energy Nuclear Reaction (LENR) was likely in play

Which, in fact, Pons and Fleischmann pointed out. (“Unknown nuclear reaction.”)

A new theory was needed to explain “LENR”

Needed by whom, and for what? Apparently, some people need a theory, and probably a deep one, in order to accept experimental evidence, but experimental evidence is just that: evidence. Simple theories can be developed, and have been developed, that don’t explain everything. We will see:

* Pons and Fleischman [sic] reported detecting He4 but subsequently retracted this claim as a flawed measurement.

The reality is that they stopped talking about helium, and why they did this is not clear. By 1991, however, Miles had reported helium correlated with anomalous heat. Pons and Fleischmann had seen helium in a single measurement, and it is entirely possible that this was leakage. (Details are scarce.) That was not the case with later measurements and the many confirmations.

Did these researchers read Storms (2007)? That was the definitive monograph on the field at the time. They don’t seem to be aware of the actual state of the field, but followed Larsen’s explanations.

Observations from LENR Experiments

• Macroscopic “excess heat” measured calorimetrically
 – Weakly repeatable and extremely contentious
 – Richard Garwin says, “Call me when you can boil a cup of tea*”
* Largest amount and duration of excess heat measured in an LENR experiment was 44 W for 24 days (90 MJ) in nickel-light hydrogen gas phase system.

Who is supplying them with these sound bites? Because of the unreliability of the effect (sometimes it’s a lot of heat), experiments were scaled down (since before the 1989 announcement). It’s awkward if an experiment melts down, as the FP one apparently did in 1985. The scientific issue would properly be whether measurements were adequate for correlation with nuclear products, and they have been, for one product: helium. They also correlate with conditions and with material. I.e., some material simply doesn’t work; other material works far more reliably, with material from a single batch. And then a new batch doesn’t work. But that can all be addressed scientifically with controlled experiments and correlations.

The “cup of tea” remark was from Douglas Morrison, the CERN physicist, and has been repeated by Robert Park, author of Voodoo Science. I don’t think Garwin said this, but maybe. These scientists are repeating rumors, from . . . it’s pretty obvious! That or shallow reading. They still end up with something sensible, just . . . off.

• Production of gaseous helium isotopes
 – Difficult to detect reliably and possibility of contamination
 – Observed by only a few researchers but most do not go to the expense of looking for helium

Yes, helium at the levels involved with modest anomalous heat is difficult to measure, but it has long been possible, and has been done, with blind testing by reputable labs. The correlation, across many measurements, given the experimental procedures, rules out “contamination” and, in fact, validates the heat measurements as well. In experimental series, large numbers of cells had no significant heat and also no helium above background. Given that the difference between a heat-active cell and one with no significant excess heat may be only a couple of degrees C, if leakage were the cause, we would not see these correlations. The suggestion of “leakage” was made in the final report of the U.S. DoE panel in 2004, and it was preposterous there . . . but the presentation had been misunderstood; that’s obvious on review. Then “leakage” gets repeated over and over. The field is full of ideas that came up at one time, were thought plausible then, and have since been shown to be way crazy . . . but that still get repeated as if they were fact.
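For a sense of scale, here is my own rough estimate of the helium involved, taking roughly 24 MeV per helium (the value expected if deuterium is converting to helium) and one watt-day of excess heat as assumptions:

```python
# Scale of the helium signal (my rough sketch): if the heat comes from
# deuterium converting to helium at ~24 MeV per helium atom, how much helium
# does one watt-day of excess heat correspond to?
EV_PER_JOULE = 6.242e18
Q_PER_HE_eV = 23.8e6           # ~24 MeV per helium atom, in eV
AVOGADRO = 6.022e23

joules = 1.0 * 86400           # one watt for one day
helium_atoms = joules * EV_PER_JOULE / Q_PER_HE_eV
print(f"{helium_atoms:.2e} He atoms")                      # ~2.3e16
print(f"{helium_atoms / AVOGADRO * 1e9:.0f} nanomoles")    # ~38 nmol
# Small, but well within reach of good mass spectrometry; the decisive point
# is the *correlation* of this helium with the measured heat.
```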

This might as well have been designed as a trap to finger sloppy researchers and reporters, who repeat stuff merely because it’s been repeated in the past.

• Modest production of MeV alpha particles and protons
 – Reproducible and reported by a number of researchers

Sloppy as well. “MeV alpha particles”? No, not many, if any. And there have been no correlations. The tracks reported by SPAWAR were almost certainly not alphas (except for the triple-tracks, which are alphas, from neutron-induced fission of carbon into three alpha particles, and which are found only at very low levels). Again, there is little attention paid to quantity, which feeds into accepting W-L theory.

• Production of a broad spectrum of transmuted elements
 – More repeatable than excess heat but still arguments over possible contamination

This is not more repeatable than excess heat. Don’t mistake “many reports” for “replications,” but they do just that. Contamination is not the only problem.

If, say, deuterium is being converted to helium (which is clear; it is the mechanism and the full pathways that are not clear), then there is 24 MeV per helium, energy released in some way. Because almost all of this energy apparently shows up as heat, there would not be large quantities of “other reactions.” But such a reaction might occasionally proceed through rare branches, or drive secondary reactions involving some other element, so low levels of other transmutations may appear, even though the only transmutation that occurs at high levels is from deuterium to helium. Larsen is not going to point this out! He does produce a speculated reaction pathway to create helium, but that then raises other problems. Why this pathway and not others? What happens to intermediate products?
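Again, rough numbers of my own, to show why low-level transmutations are not surprising even if the main reaction overwhelmingly produces helium. The branch fraction below is an arbitrary assumption, chosen only for illustration:

```python
# At ~24 MeV per helium, one watt of heat implies a large helium production
# rate, so even very rare side branches could leave detectable traces.
EV_PER_JOULE = 6.242e18
Q_PER_HE_eV = 23.8e6

he_per_second_per_watt = EV_PER_JOULE / Q_PER_HE_eV
print(f"{he_per_second_per_watt:.1e} He atoms/s per watt")        # ~2.6e11

rare_branch = 1e-6                      # assumed side-branch fraction (illustrative)
side_products_per_day = he_per_second_per_watt * rare_branch * 86400
print(f"{side_products_per_day:.1e} atoms/day from a 1e-6 branch")  # ~2e10
# Chemically negligible, but within reach of sensitive surface analysis,
# which is consistent with the scattered low-level transmutation reports.
```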

 – Difficult to argue against competent mass spectoscopy [sic]

Right. However, what it means when an element shows up at low levels can be unclear. In a paper presented a month ago at ICCF-21 in Colorado, a researcher showed how samarium appeared on the surface of his cathode. I think this was gas discharge work. The cathode is etched away, and he concluded that this process concentrated samarium on the surface, as it was not ablated. If it is not correlated with heat, it may be some different effect, and there can be fractionation, where something very rare is concentrated in the sample. That is quite distinct from the competence of the mass spectrometry.

There is a whole class of reports that show “some nuclear effect.” That, then, creates a big hoopla, because, we think, there shouldn’t be such effects at low temperatures. But “nuclear effects” are all around us, if we look for them. This is very weak evidence, unless there are correlations showing common causation. Large effects are another story, but the transmutation results are generally not large.

The Widom-Larsen (W-L) theory provides a self-consistent framework for addressing many long-standing issues about LENR

Some and not others.

 – Overcoming the Coulomb barrier – the most significant stumbling block for thermonuclear “Cold Fusion” advocates

Who is that? “Cold fusion,” by definition, is not “thermonuclear.” It looks like the consideration of opposing views, part of the charge, happened only as reported through Larsen.

 – Absence of significant emissions of high-energy neutrons

This only requires that the reaction go through the helium branch, and, as pointed out, there are pathways through 8Be fission to helium with no neutrons. Yes, W-L theory avoids the “missing neutrons” problem. But so does the “gremlin” theory. Basically, we have known since 1990 that “cold fusion” wasn’t ordinary d-d fusion, period. That is where the “neutron problem” comes from. The missing neutrons are a problem for any straight “d-d fusion” theory, because muon-catalyzed fusion, even though it occurs at extremely low temperatures, still generates the same branching ratio. So something else is happening; that’s completely obvious.

 – Absence of large emissions of gamma rays

W-L theory predicts substantial gammas, easily detectable. Just not that monster 24 MeV gamma from d + d -> 4He.

• The W-L theory does not postulate any new physics or invoke any ad hoc mechanisms to describe a wide body of LENR observations, including
 – Source of excess heat in light and heavy water electrochemical cells
 – Transmutation products typically seen in H and D LENR experimental setups
 – Variable fluxes of soft x-rays seen in some experiments
 – Small fluxes of high-energy alpha particles in certain LENR systems

The “gamma shield” proposed to explain the lack of neutron-activation gammas is “new physics,” and so is the idea of “heavy electrons” with increased mass adequate to drive electron capture by protons or deuterons. W-L theory provides no guide to predicting the amount of excess heat, nor the variability and unreliability of the heat effect. (Other theories do, and I have never seen Larsen address that problem. Nor has he shown any experimental results coming out of the theory; nor, in fact, has anyone, in well over a decade since it was first proposed.)

The nature of W-L theory allows making up reactions to take place in series, with multiple neutron captures. That makes no sense once we look at reaction rates. That is, if a neutron is made, there will be a capture, which will create an effect. Because the effects in LENR take place at low levels compared to the number of atoms in the sample, the rate at which atoms are activated by neutrons must be low, so the chance of an additional capture on the same atom will be very low. There is a way around this, but the point is that rate must be considered, something Larsen never does. Transmutation results are not consistent, as implied.
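A toy calculation of my own makes the rate problem vivid; the activation fraction is an assumption chosen only for illustration:

```python
# Toy model (mine) of the rate argument above: if only a small fraction f of
# target nuclei ever capture a neutron during a run, the chance that the SAME
# nucleus is hit k times in succession scales roughly like f**k, assuming
# captures land independently across the sample.
def successive_capture_fraction(f, k):
    """Fraction of nuclei expected to undergo k independent captures."""
    return f ** k

f = 1e-6   # assumed fraction of nuclei activated once (illustrative only)
for k in (1, 2, 4, 8):
    print(k, successive_capture_fraction(f, k))
# Eight captures on one nucleus: ~1e-48 of the sample -- effectively never,
# unless captures are somehow strongly concentrated on a few sites.
```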

There may be soft X-rays; several theories predict them. No comparison is made in this report with other LENR theories, not that any of them are particularly good. Some, however, are more compatible with experimental observations, a crucial issue that the authors totally neglect. They are only looking at the “good points,” and not critically, as they certainly were with BLP ideas.

W-L Theory – The Basics

• Electromagnetic radiation on a metallic hydride surface increases mass of surface plasmon electrons (e-)
• Heavy-mass surface plasmon polariton (SPP) electrons react with surface protons (p+) or deuterons (d+)  to produce ultra low momentum (ULM) neutrons and an electron neutrino (ν)

What is completely missing here is how much mass must be added to the electrons. Peter Hagelstein took a careful look at this in 2013. It is enormous (781 keV), and the conditions required are far from what is possible on the surface of a Fleischmann-Pons cathode. There is no evidence for such reactions taking place, other than this ad hoc theory.

• ULM neutrons are readily captured by nearby atomic nuclei (Z,A), resulting in an increase in the atomic mass (A) by 1 thereby creating a heavier mass isotope (Z,A+1) .
• If the new isotope is unstable it may undergo beta decay*, thereby increasing the atomic number by 1 and producing a new transmuted element (Z+1, A+1) along with a beta particle (e-) and an anti-neutrino (νe )

Yes, that’s what cold neutrons would do. In fact, they would do too much of it. Many results can be predicted that are not seen. Gammas, both prompt and delayed, as well as delayed high-energy electrons (beta radiation), would be generated. Radioactive nuclei (delayed beta emitters) would be generated and would be detectable with mass spectrometry. There is no coherent evidence for this. There are only scattered and incoherent transmutation reports at low levels, very little of which is consistent with the theory. If that’s not correct, where is the paper describing it, clearly?

• The energy released during the beta decay is manifest as “excess heat”

There would also be the absorbed gammas from the prompt radiation. Why don’t they mention that? Are they aware of those prompt gammas? Yes, at least somewhat, there was a note added to the above:

*It could also undergo alpha decay or simply release a gamma ray, which in turn is converted to infrared energy

However, the conversion of gammas to heat is glossed over here. Most gammas would escape the cell, unless something else happens.

W-L Theory Invokes Many Body Effects

This is quite a mess.

• Certain hydride forming elements, e.g., Pd, Ni, Ti, W, can be loaded with H, D, or T, which will ionize, donating their electrons to the sea of free electrons in the metal
• Once formed, ions of hydrogen isotopes migrate to specific interstitial structural sites in the bulk metallic lattice, assemble in many-body patches, and oscillate collectively and coherently (their QM wave functions are effectively entangled) setting the stage for a local breakdown in the Born-Oppenheimer approximation[1]

Embarrassing. These physicists are not familiar with the LENR experimental evidence and what is known about PdD LENR, or they would not make the “interstitial structural sites” mistake. The helium evidence shows clearly that the reaction producing helium is at or very near the surface, not anywhere deep in the lattice. The isotopes will not preferentially collect in “interstitial structural sites” (i.e., voids); there will be a vapor-pressure equilibrium in such sites. W-L theory does not address the issue of the loading ratio of palladium, known to be correlated with excess heat (at least with initiation): below a loading of about 90 atom percent, excess heat is not seen.

W-L theory generally assumes the patches are at the surface, but is unclear on the exact location and local conditions, which would be an essential part of a theory if it is to be of practical utility.

• This, in turn, enables the patches of hydrogenous ions to couple electromagnetically to the nearby sea of collectively oscillating SSP electrons
• The coupling creates strong local electric fields (>10^11 V/m) that can renormalize the mass of the SSPs above the threshold for ULM neutron production

Again, no mention of the magnitude of the renormalization, which must add on the order of 781 keV to the mass-energy of the electron.

• ULM neutrons have huge DeBroglie wavelengths[2] and extremely large capture cross sections with atomic nuclei compared even to thermal neutrons
 – Lattice Energy LLC has estimated the ULM neutron fission capture cross section on U235 to be ~ 1 million barns vs. ~586 barns for thermal neutrons

What is not said is why ULM neutrons are formed. They need ULM neutrons so that the neutrons don’t escape the “patch.” This, by the way, requires that the neutrons be generated in the middle of the patch, not near an edge.
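As a sanity check of my own on the million-barn figure, the standard 1/v behavior of low-energy capture cross sections implies a specific neutron speed:

```python
# Sanity check (my sketch) on the "million barn" claim, using the standard
# 1/v law for low-energy neutron capture: sigma(v) ~ sigma_thermal * v_th / v.
SIGMA_THERMAL_BARNS = 586.0     # U-235 thermal fission cross section, as quoted
V_THERMAL = 2200.0              # m/s, conventional thermal-neutron speed

sigma_ulm = 1.0e6               # barns, the Lattice Energy estimate
v_ulm = V_THERMAL * SIGMA_THERMAL_BARNS / sigma_ulm
print(f"Implied ULM neutron speed: ~{v_ulm:.1f} m/s")    # ~1.3 m/s
# The claim is internally consistent with the 1/v law only if the neutrons
# really move at walking pace, which is what "ultra low momentum" asserts.
```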

It’s not just a two-body collision
[useless image]

[1]The Born-Oppenheimer approximation allows the wavefunction of molecule to be broken down into its electronic and nuclear (vibrational and rotational) components. In this case, the wavefunction must be constructed for the many body patch.

This is getting closer to many-body theory, such as Takahashi or Kim. “Must be constructed.” Must be in order to what? Basically, constructing the wavefunction for an arbitrary and undefined patch is not possible. This is hand-waving. It is on the order of “we can’t calculate this, so it might be possible.”

[2]The DeBroglie wavelength of ULM neutrons produced by a condensed matter collective system must be comparable to the spatial dimension of the many-proton surface patches in which they were produced.

They noticed. “Must be” is in order to avoid the escape of the neutrons from the patch. The “useless image” showed a gaggle of protons huddling together, with electrons dancing apart from them. That is not what would exist. Where did they get that image?
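My own quick check of what that footnote implies, assuming a micron-scale patch:

```python
# De Broglie check (my sketch): for a neutron wavelength comparable to a
# micron-scale "patch," how slow must the neutron be?
H_PLANCK = 6.626e-34     # J*s
M_NEUTRON = 1.675e-27    # kg

wavelength = 1e-6        # meters, assumed patch size (illustration only)
v = H_PLANCK / (M_NEUTRON * wavelength)
print(f"Neutron speed for a 1-micron wavelength: ~{v:.2f} m/s")   # ~0.4 m/s
# Compare ~2200 m/s for ordinary thermal neutrons: "ultra low momentum" is
# not a small correction, it is an extraordinary condition in its own right.
```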

W-L Theory Insights

Insight 1: Overcoming Coulomb energy barrier
 – The primary LENR process is driven by nuclei absorbing ULM neutrons for which there is no Coulomb barrier

No, the primary process proposed is the formation of neutrons from a proton and an electron, which has a 781 keV threshold, larger than the ordinary Coulomb barrier. There is no Coulomb barrier for any neutral particle, which would include what are called femto-atoms, any nucleus with electrons collapsed into a much smaller structure. The formation of the neutrons is what is unexpected. Once they are formed, absorption is normal. But then there is a second miracle:

Insight 2: Suppression of gamma ray emmisions [sic]
 – Compton scattering from heavy SSP electrons creates soft photons
 – Creation of heavy SSP electron-hole pairs in LENR systems have energy spreads in the MeV range, compared to nominal spreads in the eV range for normal conditions in metals, thus enabling gamma ray absorption and conversion to heat

Garwin was quite skeptical and so am I. There is no evidence for this other than what Krivit points out: that gammas aren’t observed. That’s backwards. This “gamma shield” must be about perfect, no leakage. The delayed gammas are ignored. What it means to have many heavy electrons in a patch is ignored. Where does all this mass/energy come from?

Insight 3: Origins of excess heat
 – ULM neutron capture process and subsequent nuclei relaxation through radioactive decay or gamma emission generates excess heat

If we know where it is coming from, it is no longer “excess heat,” but that’s a mere semantic point. There is no doubt that neutrons, if formed, would generate reactions that would create fusion heat, that is, the heat released as elements are walked up in protons and neutrons (toward the maximum binding energy per nucleon, at iron). That’s fusion energy, folks. They are simply doing it with protons and electrons first forming neutrons, and then electrons are often emitted. The gammas will also generate heat, if they are absorbed as claimed. A number of theories postulate low-energy gammas. (If it comes from a nucleus, it’s called a “gamma”; otherwise it’s called an “X-ray.”) If the gammas are of low enough energy, they will be absorbed.

Widom-Larsen theory, however, by postulating neutron absorption, predicts necessary high-energy gammas, which is why it needs the special absorption process. The delayed gammas are ignored.

– Alpha and beta particles transfer kinetic energy to surrounding medium through scattering process

High-energy alphas (above 10–20 keV) would generate secondary radiation that is not observed. This could not be captured by the patches, because those alphas are delayed.

– Gamma rays are converted to infrared photons which are absorbed by nearby matter

So that’s the second miracle.

Insight 4: Elemental transmutation
 – Five-peak transmutation product mass spectra reported by several researchers
   – One researcher (Miley) hypothesized that these peaks were fission products of very neutron-rich compound nuclei with atomic masses of 40, 76, 194, and 310 (a conjectured superheavy element)
 – According to W-L theory, successive rounds of ULM neutron production and capture will create higher atomic mass elements consistent with observations
   – The W-L neutron optical potential model of ULM neutron absorption by nuclei predicts abundance peaks very close to the observed data

First of all, Miley has not been confirmed. Secondly, the transmutation levels observed in most reports are quite low. So successive transmutations must be far lower. By ignoring rate issues, W-L theory can imagine countless possible reactions and then fit them to this or that observation. I’m not sure what the “optical potential model” means. In fact, I have no idea at all. Did they?

W-L Theory Transmutation Pathways for Iwamura Experiments

Transmutation data from Iwamura, Mitsubishi Heavy Industries
 – Experiments involved permeation of a D2 gas through a Pd:Pd/CaO thin-film with Cs and Sr seed elements placed on the outermost surface
 – Cs-133 (Z = 55) target transmuted to Pr-141 (Z = 59); Sr-88 (Z = 38) transmuted to Mo-96 (Z = 42)
 – In both cases* the nuclei grew by 8 nucleons

Others would notice that this is as if there were fusion with a 4D condensate, with the electrons scattering. That those transmutations are only +4D — four protons and four neutrons — is an argument against the complicated W-L process.

 – W-L theory postulates the following plausible nucleosynthesis pathway

(See the document for the list of reactions.) I don’t find this plausible at all. Eight successive neutron captures are required for each single result. The four beta decays, clearly delayed, will also involve radiation; the material would be quite radioactive until the process is complete. Why only 8? Why not 1, 2, 3, 4, 5, 6, 7, 9, 10, etc.?

* Iwamura noted that it took longer to convert Sr into Mo than Cs into Pr. W-L argue that this is because the neutron cross section for Cs is vastly higher than for Sr

This is what Larsen does: he collects facts that can be stuffed into his evidence bag. Instead of making a set of coherent and clear predictions that can be verified, he works ad-hoc and post-hoc. Widom-Larsen theory is not experimentally verified by any published experiments designed to test it. Of course, this is me looking back, after another eight years. To these physicists, before 2010, it looked better than anything they had seen. As long as they didn’t look too closely.

Neutron-rich isotopes build up via neutron captures interspersed with β-decay
 – Neutron capture on stable or unstable isotopes releases substantial nuclear binding energy, mostly in gamma emissions, which convert to IR

So there are twelve reactions that must happen to complete the observed transmutation. In one case, it’s eight neutron captures, then four beta decays. In the other, there are neutron captures mixed with beta decays. Why this particular sequence? As I mention above, why exactly that number of captures? And what about all the intermediate products? They all must disappear. Compare that complicated mess to one reaction with 4D.
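The bookkeeping for the Cs-133 to Pr-141 case can be checked in a couple of lines (my own sketch):

```python
# Bookkeeping check (my sketch) of the Iwamura result discussed above:
# Cs-133 (Z=55, A=133) -> Pr-141 (Z=59, A=141).
z_start, a_start = 55, 133
z_end, a_end = 59, 141

dZ, dA = z_end - z_start, a_end - a_start
print(f"Net change: +{dZ} protons, +{dA - dZ} neutrons, +{dA} nucleons")
# Net change: +4 protons, +4 neutrons, +8 nucleons.
# W-L route: 8 neutron captures (A +8) plus 4 beta decays (Z +4), twelve steps,
# with every intermediate somehow leaving no trace.
# 4D route: one step adding two deuterium molecules' worth of nucleons (4p + 4n).
```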

4D fusion, to a plasma physicist, seems impossible, but … it is, in fact, simply two deuterium molecules that, Takahashi predicts, may collapse to a Bose-Einstein condensate and fuse (and then fission to form helium, with no neutrons). It seems possible in the Iwamura experiment that the condensate may directly fuse with target elements on the surface. It has the electrons with it, so it is a “neutral particle.” There would be no Coulomb barrier. The new physics is only an understanding of how a BEC might behave under these conditions, but that is a “we don’t know yet,” not an “impossible.”

The Widom-Larsen Theory Summary

The Widom-Larsen (W-L) theory of LENR differs from the mainstream understanding in that the governing mechanism for LENR is presumed to be dominated by the weak force of the standard theory, instead of the strong force that governs nuclear fission and fusion

What is the “mainstream understanding of LENR”? W-L theory incorporates strong force mechanisms in the neutron absorptions. It is only the creation of neutrons that is weak force dominated.

 – Assumption of weak interactions leads to a theoretical framework for the LENR energy release mechanism consistent with the observed production of large amounts of energy, over a long time, at moderate conditions of temperature and pressure, without the release of energetic neutrons or gamma radiation

The analysis that leads to no gamma radiation being detected makes unwarranted ad hoc assumptions about the absorption of gamma rays. Even if those assumptions made sense with regard to the expected prompt gammas (they don’t; this is new physics), they would not cover the delayed gammas that would clearly be expected.

• W-L theory is built upon the well-established theory of electro-weak interactions and many-body collective effects

The behavior assumed by W-L theory is far from “well-established.”

 – W-L theory explains the observations from a large body of LENR experiments without invoking new physics or ad-hoc mechanisms

It is not established that W-L theory predicts detailed observations, quantitatively. The reactions proposed are ad hoc, chosen to match experimental results, not predicted from basic principles. W-L theory is clearly an “ad hoc” theory of mechanism, cobbled together to create an appearance of plausibility, if one doesn’t look too closely.

 – So far, no experimental result fatally conflicts with the basic tenets of the W-L theory

Lack of activation gammas, and especially delayed gammas, is fatal to the theory.

 – In fact, an increasing number of LENR anomalies have been explained by W-L

The theory is plastic, amenable to cherry-picking of “plausible reactions” to explain many results. What is missing is clear, testable prediction of phenomena not previously observed, and, in particular, quantitative prediction.

 – In one case, W-L theory provided a plausible explanation for an anomalous observation of transmutation in an exploding wire experiment conducted back in 1922

I have not looked at this.

• Could the W-L theory be the breakthrough needed to position LENR as a major source of carbon-free, environmentally clean source of source of low-cost nuclear energy??

No. W-L theory has not provided guidance for dealing with the major obstacle to LENR progress, the design and demonstration of a “lab rat,” a reliable experiment. There is no sign that any experimental group has benefited from applying W-L theory. It seems to be successful only in that, as allegedly a “non-fusion theory,” it is more readily accepted by those who don’t actually study it in detail, with a knowledge of physics and of the full body of LENR evidence.

LENR State of Play

The Widom-Larsen theory has done little to unify or focus the LENR research community
• If anything, it appears to have increased the resolve of the strongforce D-D fusion advocates to circle the wagons

Again, who are these “strongforce D-D fusion advocates”? That’s a Steve Krivit idea, that researchers are biased toward “D-D fusion,” whereas the field is not at all united on any theory, but . . . the experimental evidence is strong for deuterium conversion to helium in the FP Heat Effect with PdD. Deuterium conversion to helium is possible by other pathways than “D-D fusion.” Key, though, is that the energy per helium would be the same. If there is no radiation leakage or other products, a neutron pathway could also produce helium, in theory, with the same energy/helium. That is, if the neutrons are produced from deuterium and the electrons are recovered. As I have explained, the electron becomes, as it were, a catalyst. The problem with this picture, though, is that neutrons generate very visible effects, which W-L theory waves away. There would be leakages (i.e., radiation or other products).

• LENR is an area of research at the TRL-1 level but the community is already jockeying for position to achieve a competitive TRL-8 position, which further impedes the normal scientific process

Technology Readiness Level. 

The TRL system does not easily apply to LENR. It is not designed to deal with a field that doesn’t have confirmed reliable methods. However, the field could be considered to be spread across TRL-1 to TRL-3. W-L theory has not contributed to progress in this.

• Without a theory to guide the research, LENR will remain in a perpetual cook-and-look mode, which produces some tantalizing results to spur venture capital investments but does little to advance the science

That’s a common idea, but there are “basic theories” that are established, and what is actually needed is more basic research to generate more data for theory formation. There are “tantalizing results” that are never reduced to extensive controlled studies to explore the parameter space.

A “basic theory” is one like what I call the Conjecture, that the FP Heat Effect is the result of the conversion of deuterium to helium, mechanism unknown, with no major leakages (i.e., no major radiation not being converted to heat, and no other major nuclear products). That’s testable, and has been tested and widely confirmed. Another would refer to the generation of anomalous heat under some conditions by metal hydrides, and would look at the involved correlations. These are not theories of mechanism, but of effect.

• DTRA needs to be careful not to get embroiled in the politics of LENR and serve as an honest broker

This report is being used in the “politics of LENR.” It was inadequately critical, it did not point to critiques of W-L theory, but appeared to accept the proponent’s version of the situation.

 – Exploit some common ground, e.g., materials and diagnostics
 – Force a show-down between Widom-Larsen and Cold Fusion advocates
 – Form an expert review panel to guide DTRA-funded LENR research

And here is where, in spite of the shortcomings, they settle on common sense. The failure of the DoE reviews was that they recommended research “under existing programs” but did nothing to facilitate that. And the cold fusion community, on its side, apparently did not request what would have been needed, something like what is suggested here. I called it a “LENR desk,” but it would maintain expert review resources. Was this done? We do know that DTRA has continued to be involved.

As to the “show-down,” what would that involve? The idea is presented as if there are two groups, “W-L” and “Cold Fusion.” In fact, the field is called CMNS and LENR. I use “Cold Fusion,” to be sure, because it is a popular name for the FP Heat Effect, and the main product of that effect is helium, a fusion product if the fuel is deuterium, even if you wave some “heavy electrons” at it.

There are some in the field stuck on “D-D fusion,” but it’s actually few.

Widom-Larsen

DRAFT undergoing revision.

First revision, 7/12/2018: corrected a comment about Widom activity, moved the DARPA report to its own subpage, and added responses, including a reported replication failure, to the Cirillo et al paper.

A discussion on a private mailing list led me to take a new look at Widom-Larsen theory.

This is long. I intend to refactor it and boil it down. There is a lot of material available. This also examines the role of Steve Krivit in promoting W-L theory and generally attacking the cold fusion community. (Here, “cold fusion” only means the heat effect popularly called that; it does not indicate any specific reaction.) What I call the “cold fusion community” is the LENR or CMNS community, which, setting aside a few fanatics, is not divided into factions as Krivit promotes.

I have, in the past, called W-L theory a “hoax.” That has sometimes been misinterpreted. The theory itself is not a hoax; it appears to have been a serious attempt to “explain” LENR phenomena. However, there is a common idea about it, that it does not contradict existing physics, often combined with an idea that “cold fusion” is in such contradiction, which is true only for some interpretations of “cold fusion.” The simplest interpretation, that it is a popular name for a set of experimental results displaying a heat anomaly, doesn’t present any actual contradiction. That the heat is from “d-d fusion,” again a common idea (especially among skeptics!), does present some serious issues. But there are many possible paths and understandings of “fusion.”

No, the hoax is that W-L theory only involves accepted physics.

Explanation of Widom-Larsen theory

The subpage covers the explanation on New Energy Times, and my commentary on it.

Reactions of physicists

So Krivit has many pages on the reactions of physicists and others, covered on Reactions.

The most recent one I see is this:

Larsen Uncovers Favorable Defense Department Evaluation of Widom-Larsen LENR Theory

So this, June 6, 2017, was from Larsen, framed by Larsen. As we will see, it is reasonably true that W-L theory has been “successful,” in the sense of being accepted as possible in many circles, or at least it was true, but there is a problem. Who are these people, what do they know about the specific physics, and, most to the point, what do they know about the very large body of evidence for LENR? One may easily imagine that the LENR evidence is a certain way, if one is not familiar with it.

This “favorable report” was actually old, from 2010. I cover this report on a subpage: Toton-Ullrich DARPA report. While the report presents W-L theory as it was apparently explained to them by Widom and/or Larsen, including comments that reflect their political point of view, the report ends with this:

The Widom-Larsen theory has done little to unify or focus the LENR research community
• If anything, it appears to have increased the resolve of the strongforce D-D fusion advocates to circle the wagons

(No specific references are made to a “strongforce D-D fusion” theory. Ordinary D-D fusion has long been understood as Not Happening in LENR. Most theories (like W-L theory) now focus on collective effects. This concept of an ideological battle has been promoted by Krivit and, I think, Larsen.)

• LENR is an area of research at the TRL-1 level but the community is already jockeying for position to achieve a competitive TRL-8 position, which further impedes the normal scientific process

Depending on definitions, the research is largely at TRL-1, yes, but in some areas perhaps up to TRL-3. Nobody is close to TRL-8. This report was written in 2010, when Rossi was privately demonstrating his devices to government officials. At that time, Rossi wasn’t claiming TRL-8, though possibly close; later he clearly claimed to have market-ready products. He was lying. Yes, there is secrecy and there are non-disclosure agreements; McKubre has been pointing out for the last couple of years how this impedes the normal scientific process. Notice that in the history of Lattice Energy, Larsen invoked “proprietary” to avoid disclosing information about the state of verification of their alleged technology, which was, we can now be reasonably confident, vaporware.

• Without a theory to guide the research, LENR will remain in a perpetual cook-and-look mode, which produces some tantalizing results to spur venture capital investments but does little to advance the science

While a functional theory would certainly be useful, W-L theory does not qualify. A premature theory, largely ad-hoc, as W-L theory is, could mislead research. Such theories can best be used to brainstorm new effects to measure, but at this point the most urgent research need is to verify what has already been found, with increased precision and demonstrated reliability (i.e., real error bars, from real data, from extensive series of tests.)

• DTRA needs to be careful not to get embroiled in the politics of LENR and serve as an honest broker
 Exploit some common ground, e.g., materials and diagnostics
 Force a show-down between Widom-Larsen and Cold Fusion advocates
 Form an expert review panel to guide DTRA-funded LENR research

Great idea. They did not take advantage of the opportunity to do just that, as far as we know. If they did, good for them! The story that there is a battle between W-L theory and “cold fusion advocates” is purely a W-L advocacy story, as is the claim that W-L theory does not conflict with known physics, which the report authors did not critically examine. It is not clear that they read any of the critical literature.

Critiques of W-L theory

Steve Krivit mentions some of the critiques on his blog, but suppresses their visibility. Some, in spite of being published under peer review, he completely ignores.

The subpage, Critiques,  covers

Hagelstein and Chaudhary (2008)

Hagelstein (2013)

Ciuci et al (2012)

Cirillo et al (2012) (experimental neutron finding cited as support of W-L theory)

Faccini et al (2013), critique of Cirillo and replication failure and further response to Widom

Tennefors (2013)

Email critiques from 2007, including two written with explicit “off the record” requests, which Krivit published anyway, claiming that they had not obtained permission first for an off-the-record comment, and that he had explicitly warned them, which he had not. Krivit interprets language however it suits him, and his action might as well have been designed to discourage scientists in the field from talking frankly with him . . . which is the result he obtained.

Vysotskii (2012 and 2014)

Storms (2007 and 2010) and the Krivit comment published by Naturwissenschaften, Storms’ reply, and Krivit’s continued reply on his blog.

Maniani et al (2014)

Takahashi and New Hydrogen Energy

Today I began and completed a review of Akito Takahashi’s presentation on behalf of a collaboration of groups, using the 55 slides made available. Eventually, I hope to see a full paper, which may resolve some ambiguities. Meanwhile, this work shows substantial promise.

This is the first substantial review of mine coming out of ICCF-21, which, I declared on the first day, would be a breakthrough conference.

I was half-way out-of-it for much of the conference, struggling with some health issues, exacerbated by the altitude. I survived. I’m stronger. Yay!

Comments and corrections are invited on the reviews, or on what will become a series of brief summaries.

The title of the presentation: Research Status of Nano-Metal Hydrogen Energy. There are 17 co-authors, affiliated with four universities (Kyushu, Tohoku, Kobe, and Nagoya), and two organizations (Technova and Nissan Motors). Funding was reportedly $1 million US, for October 2015 to October 2017.

This was a major investigation, finding substantial apparent anomalous heat in many experiments, but this work was, in my estimation, exploratory, not designed for clear confirmation of a “lab rat” protocol, which is needed. They came close, however, and, to accomplish that goal, they need do little more than what they have already done, with tighter focus. I don’t like the presentation of “best results” from an extensive experimental series; it can create misleading impressions.

The best results were from experiments at elevated temperatures, which require heating the reactor, which, with the design they used, takes substantial heating power. That is not actually a power input to the reactor, however, and if they can optimize these experiments, as seems quite possible, they appear to be generating sufficient heat to maintain elevated temperature in a reactor designed for that. (Basically, insulate the reactor and provide heating and cooling as needed, heating for startup and cooling once the reactor reaches break-even — i.e., generates enough heat to compensate for heat losses.) The best result was about 25 watts, and they did not complete what I see as possible optimization.
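As a rough sketch of that break-even arithmetic (all numbers are hypothetical placeholders, other than the ~25 W best result mentioned above):

```python
# Break-even sketch for an insulated elevated-temperature reactor.
# All numbers are hypothetical placeholders, not data from the Takahashi report,
# except the ~25 W best excess power mentioned above.

def heat_loss_w(t_reactor_c, t_ambient_c, r_thermal_k_per_w):
    """Steady-state heat loss through the insulation, in watts."""
    return (t_reactor_c - t_ambient_c) / r_thermal_k_per_w

excess_power_w = 25.0   # best reported excess power
t_reactor_c = 300.0     # target elevated temperature (placeholder)
t_ambient_c = 25.0      # room temperature
r_thermal = 12.0        # insulation thermal resistance, K per W (placeholder)

loss = heat_loss_w(t_reactor_c, t_ambient_c, r_thermal)
print(f"heat loss at {t_reactor_c:.0f} C: {loss:.1f} W")
if excess_power_w >= loss:
    print("break-even: excess power alone maintains the elevated temperature")
else:
    print(f"external heating still needed: {loss - excess_power_w:.1f} W")
```

With these made-up numbers, 25 W would just cover the insulation losses; with real data, the same comparison says how much insulation, or how much scale, break-even actually requires.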

They used differential scanning calorimetry to identify the performance of sample fuel mixtures. I’d been hoping to see this kind of study for quite some time. This work was the clearest and most interesting part of the presentation; what I hope is that they will do much more of that, with many more samples. Then, I hope that they will identify a lab rat (material and protocol) and follow it identically with many trials (or sometimes with a single variation, but there should be many iterations with a single protocol).

They are looking forward to optimization for commercial usage, which I think is just slightly premature. But they are close, assuming that followup can confirm their findings and demonstrate adequate reliability.

It is not necessary that this work be fully reliable, as long as results become statistically predictable, as shown by actual variation in results with careful control of conditions.
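A minimal sketch of what “statistically predictable” would mean in practice, using hypothetical replicate values (not published data):

```python
# "Real error bars" from a series of nominally identical runs.
# The values are hypothetical placeholders, not data from this work.
from math import sqrt
from statistics import mean, stdev

excess_heat_mj = [1.8, 2.3, 0.0, 2.1, 1.6, 2.4, 0.2, 1.9]  # replicate runs (placeholder)

m = mean(excess_heat_mj)
s = stdev(excess_heat_mj)            # sample standard deviation
sem = s / sqrt(len(excess_heat_mj))  # standard error of the mean

print(f"mean = {m:.2f} MJ, sd = {s:.2f} MJ, SEM = {sem:.2f} MJ")
print(f"reported result: {m:.2f} +/- {2 * sem:.2f} MJ (rough 95% interval)")
```

Once a fixed protocol has a series like this behind it, the spread itself is the result: a new run either falls inside the expected band or it does not, and that is what “predictable” means.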

Much of the presentation was devoted to Takahashi’s TSC theory, which is interesting in itself, but distracting, in my opinion, from what was most important about this report. The experimental work is consistent with Takahashi theory, but does not require it, and the work was not designed to deeply vet TSC predictions.

Time was wasted in letting us know that if cold fusion can be made practical, it will have a huge impact on society. As if we need to hear that for the n thousandth time. I’ve said that if I see another Rankin diagram, I’d get sick. Well, I didn’t, but be warned. I think there are two of them.

Nevertheless, this is better hot-hydrogen LENR work than I’ve seen anywhere before. I’m hoping they have helium results (I think they might), which could validate the excess heat measures for deuterium devices.

I’m recommending against trying to scale up to higher power until reliability is nailed.

Update, July 1, 2018

There was reference to my Takahashi review on LENR Forum, placed there by Alain Coetmeur, which is appreciated. He misspelled my name. Ah, well!

Some comments from there:

Alan Smith wrote:

Abd wrote to Akito Takahashi elsewhere.

“I am especially encouraged by the appearance of a systematic approach, and want to encourage that.”

A presumptuous comment for for somebody who is not an experimenter to make to a distinguished scientist running a major project don’t you think? I think saying ‘the appearance’ really nails it. He could do so much better.

That comment was on a private mailing list, and Smith violated confidentiality by publishing it. However, no harm done — other than by his showing no respect for list rules.

I’ll point out that I was apparently banned on LENR Forum, in early December, 2016, by Alan Smith. The occasion was shown by my last post. For cause explained there, and pending resolution of the problem (massive and arbitrary deletions of posts — by Alan Smith — without notice or opportunity for recovery of content), I declared a boycott. I was immediately perma-banned, without notice to me or the readership.

There was also an attempt to reject all “referrals” to LENR Forum from this blog, which was easily defeated and was then abandoned. But it showed that the problem on LF was deeper than Alan Smith, since that took server access. Alain Coetmeur (an administrator there) expressed helplessness, which probably implicated the owner, and this may have all been wrapped in support for Andrea Rossi.

Be that as it may, I have excellent long-term communication with Dr. Takahashi. I was surprised to see, recently, that he credited me in a 2013 paper for “critical comments,” mistakenly as “Dr. Lomax”, which is a fairly common error (I notified him I have no degree at all, much less a PhD.) In that comment quoted by Smith, “appearance” was used to mean “an act of becoming visible or noticeable; an arrival,” not as Smith interpreted it. Honi soit qui mal y pense (shame on him who thinks evil of it).

I did, in the review, criticize aspects of the report, but that’s my role in the community, one that I was encouraged to assume, not by myself alone, but by major researchers who realize that the field needs vigorous internal criticism and who have specifically and generously supported me to that end.

Shane D. wrote:

Abd does not have much good to say about the report, or the presentation delivery.

For those new to the discussion, this report…the result of a collaboration between Japanese universities, and business, has been discussed here under various threads since it went public. Here is a good summation: January 2018 Nikkei article about cold fusion

Overall, my fuller reaction was expressed here, on this blog post. I see that the format (blog post here, detailed review as the page linked from LF) made that less visible, so I’ll fix that. The Nikkei article is interesting, and for those interested in Wikipedia process, that would be Reliable Source for Wikipedia. Not that it matters much!

Update July 3, 2018

I did complain to a moderator of that private list, and Alan edited his comment, removing the quotation. However, what he replaced it with is worse.

I really like Akito. Wonderful man. And a great shame Abd treats his work with such disdain.

I have long promoted the work of Akito Takahashi, probably the strongest theoretician working on the physics of LENR. His experimental work has been of high importance, going back decades. It is precisely because of his position in the field that I was careful to critique his report. The overall evaluation was quite positive, so Smith’s comment is highly misleading.

Not that I’m surprised to see this from him. Smith has his own agenda, and has been a disaster as a LENR Forum moderator. While he may have stopped the arbitrary deletions, he still, obviously, edits posts without showing any notice.

This was my full comment on that private list (I can certainly quote myself!)

Thanks, Dr. Takahashi. Your report to ICCF-21 was of high interest, I have reviewed it here:

http://coldfusioncommunity.net/iccf-21/abstracts/review/takahashi/

I am especially encouraged by the appearance of a systematic approach, and want to encourage that.

When the full report appears, I hope to write a summary to help promote awareness of this work.

I would be honored by any corrections or comments.

Disdain? Is Smith daft?

Takahashi

Overall reaction to this presentation is in a blog post. This review goes over each slide with comments, and may seem overly critical. However, from the post:

. . . this is better hot-hydrogen LENR work than I’ve seen anywhere before. 

Abstract

Research Status of Nano-Metal Hydrogen Energy

Akito Takahashi [1], Akira Kitamura [1,6], Koh Takahashi [1], Reiko Seto [1], Yuki Matsuda [1], Yasuhiro Iwamura [4], Takehiko Itoh [4], Jirohta Kasagi [4], Masanori Nakamura [2], Masanobu Uchimura [2], Hidekazu Takahashi [2], Shunsuke Sumitomo [2], Tatsumi Hioki [5], Tomoyoshi Motohiro [5], Yuichi Furuyama [6], Masahiro Kishida [3], Hideki Matsune [3]
[1] Technova Inc., [2] Nissan Motors Co., [3] Kyushu University, [4] Tohoku University, [5] Nagoya University, [6] Kobe University

Two MHE facilities at Kobe University and Tohoku University and a DSC (differential scanning calorimetry) apparatus at Kyushu University have been used for excess-heat generation tests with various multi-metal nano-composite samples under H(or D)-gas charging. Members from 6 participating institutions have joined in planned 16 times test experiments in two years (2016-2017). We have accumulated data for heat generation and related physical quantities at room-temperature and elevated-temperature conditions, in collaboration. Cross-checking-style data analyses were made in each party and compared results for consistency. Used nano-metal composite samples were PS(Pd-SiO2)-type ones and CNS(Cu-Ni-SiO2)-type ones, fabricated by wet-methods, as well as PNZ(Pd-Ni-Zr)-type ones and CNZ(Cu-Ni-Zr)-type ones, fabricated by melt-spinning and oxidation method. Observed heat data for room temperature were of chemical level.

Results for elevated-temperature condition: Significant level excess-heat evolution data were obtained for PNZ-type, CNZ-type, CNS-type samples at 200-400℃ of RC (reaction chamber) temperature, while no excess heat power data were obtained for single nano-metal samples as PS-type and NZ-type. By using binary-nano-metal/ceramics-supported samples as melt-span PNZ-type and CNZ-type and wet-fabricated CNS-type, we observed excess heat data of maximum 26,000 MJ per mol-H(D)-transferred or 85 MJ per mol-D of total absorption in sample, which cleared much over the aimed target value of 2 MJ per mol-H(D) required by NEDO. Excess heat generation with various Pd/Ni ratio PNZ-type samples has been also confirmed by DSC (differential scanning calorimetry) experiments, at Kyushu University, using very small 0.04-0.1 g samples at 200 to 500℃ condition to find optimum conditions for Pd/Ni ratio and temperature. We also observed that the excess power generation was sustainable with power level of 10-24 W for more than one month period, using PNZ6 (Pd1Ni10/ZrO2) sample of 120 g at around 300℃. Detail of DSC results will be reported separately. Summary results of material analyses by XRD, TEM, STEM/EDS, ERDA, etc. are to be reported elsewhere.


Slides

ICCF21AkitoTakahashippt

REVIEW

  • Page 1: ResearchGate cover page
  • Page 2: Title
  • Page 3: MHE Aspect: Anomalously large heat can be generated by the
    interaction of nano-composite metals and H(D)-gas.
  • Page 4: Candidate Reaction Mechanism: CCF/TSC-theory by Akito Takahashi


This is a summary of Takahashi TSC theory. Takahashi found that the rate of 3D fusion in experiments where PdD was bombarded by energetic deuterons was enhanced by a factor of 10^26, as I recall, over naive plasma expectation. This led him to investigate multibody fusion. 4D, to someone accustomed to thinking of plasma fusion, may seem ridiculously unlikely; however, this is actually only two deuterium molecules. We may imagine two deuterium molecules approaching each other in a plasma and coming to rest at the symmetric position as they are slowed by repulsion of the electron clouds. However, this cannot result in fusion in free space, because the forces would dissociate the molecules; they would slice each other in two. In confinement, though, where the dissociating force may be balanced by surrounding electron density, it may be possible. Notable features: the Condensate that Takahashi predicts includes the electrons. Fusion then occurs by tunneling, to 100% within about a femtosecond; Takahashi uses Quantum Field Theory to predict the behavior. To my knowledge, it is standard QFT, but I have never seen a detailed review by someone with adequate knowledge of the relevant physics. Notice that Takahashi does not detail how the TSC arises. We don’t know enough about the energy distribution of deuterium in PdD to do the math. Because the TSC and the resulting 8Be are so transient, verifying this theory could be difficult.

Takahashi posits a halo state resulting from this fusion that allows the 8Be nucleus, with a normal half-life of around a femtosecond, to survive long enough to radiate most of the energy as a Burst of Low-Energy Photons (BOLEP), and suggests a residual energy per resulting helium nucleus of 40 – 50 keV, which is above the Hagelstein limit, but close enough that some possibility remains. (This residual energy is the mass difference between the 8Be ground state and two 4He nuclei.)
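A quick check of those numbers against standard nuclear values (my arithmetic, not from the presentation): taking Q(d + d → 4He) ≈ 23.85 MeV and the 8Be ground state as lying about 92 keV above two 4He nuclei,

$$
Q(4\mathrm{D} \to {}^{8}\mathrm{Be}) \approx 2 \times 23.85\ \mathrm{MeV} - 0.092\ \mathrm{MeV} \approx 47.6\ \mathrm{MeV},
\qquad
Q({}^{8}\mathrm{Be} \to 2\,{}^{4}\mathrm{He}) \approx 0.092\ \mathrm{MeV} \approx 46\ \mathrm{keV\ per\ {}^{4}He}.
$$

So the BOLEP would have to carry away nearly all of the roughly 47.6 MeV, leaving about 46 keV per helium nucleus, which is where the 40 – 50 keV figure comes from.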

Notice that Takahashi does not specify the nature of the confining trap that allows the TSC to arise. From experimental results, particularly where helium is found, the reaction takes place on the surface, not in the bulk, so the trap must only be found on (or very near) the surface. Unless a clear connection is shown, this theory is dicta, not really related to the meat of the presentation, experimental results.

  • Page 5: Comparison of Energy-Density for Various Sources. We don’t need this fluff. (The energy density, if “cold fusion” is as we have found, is actually much higher, because it is a surface reaction, but density is figured for the bulk. Bulk of what? Not shown.) Some LENR papers present a Rankin diagram, which is basically the same. It’s preaching to the choir; it was established long ago and is not actually controversial: if “cold fusion” is real, it could have major implications, provided practical applications can be developed, which remains unclear. What interests us (i.e., the vast majority of those at an ICCF conference) is two-fold: experimental results, rather than complex interpretations, and progress toward control and reliability.
  • Page 6: Comparison of Various Energy Resources. Please, folks, don’t afflict this on us in what is, on the face, an experimental report. What is given in this chart is to some extent obvious, to some extent speculative. We do not know the economics of practical cold fusion, because it doesn’t exist yet. When we present it, and if this is seen by a skeptic, it confirms the view that we are blinded by dreams. We aren’t. There is real science in LENR, but the more speculation we present, the more resistance we create. Facts, please!!!
  • Page 7. Applications to Society. More speculative fluff. Where’s the beef? (I don’t recall if I was present for this talk. There was at least one where I found myself in an intense struggle to stay awake, which was not helped by the habit of some speakers to speak in a monotone, with no visual or auditory cues as to what is important, and, as untrained speakers (most in the Conference, actually), no understanding of how to engage and inspire an audience. Public speaking is not part of the training of scientists, in general. Some are good at it and become famous. . . . ) (I do have a suggested solution, but will present it elsewhere.)
  • Page 8. Required Conditions to Application: COP, E-density, System-cost. More of the same. Remarkable, though: The minimum power level for a practical application shown is 1 KW. The reported present level is 5 to 20 W. Scientifically, that’s a high level, of high interest, and we are all eager to hear what they have done and found. However, practically, this is far, far from the goal. Note that low power, if reliable, can be increased simply by scaling up (either making larger reactors or making many of them; then cost may become an issue. This is all way premature, still.) By this time, if I was still in the room, I’m about to leave, afraid that I’ll actually fall asleep and start snoring. That’s a bit more frank and honest with our Japanese guest than I’d want to be. (And remember, my sense is that Takahashi theory is the strongest in the field, even if quite incomplete. Storms has the context end more or less nailed, but is weak on theory of mechanism. Hagelstein is working on many details, various trees of possible relevance, but still no forest.)

Page 9. NEDO-MHE Project, by 6 Parties.
Project Name: Phenomenology and Controllability of New Exothermic Reaction between Metal and Hydrogen
Parties: Technova Inc., Nissan Motors Co., Kyushu U., Tohoku U., Nagoya U., Kobe U.
Period: October 2015 to October 2017. R. Fund: ca. 1.0 M USD
Aim: To verify existence of anomalous heat effect (AHE) in nano-metal and hydrogen-gas interaction and to seek controllability of effect
Done: New MHE-calorimetry system at Tohoku U. Collaboration experiments to verify AHE. Sample material analyses before and after runs. Study for industrial application

Yay! I’ll keep my peace for now on the “study for industrial application.” Was that part of the charge? It wasn’t mentioned.

Page 10. Major Results Obtained. 
1. Installation of new MHE calorimetry facility and collaborative tests
2. 16 collaborative test experiments to have verified the existence of AHE (Pd-Ni/ZrO2, CuNi/ZrO2)
3. generation of 10,000 times more heat than bulk-Pd H-absorption heat, AHE by Hydrogen, ca. 200 MJ/mol-D is typical case
4. Confirmation of AHE by DSC-apparatus with small samples

“Typical case” hides the variability. The expression of results in heat per mole of deuterium is meaningless without more detail. Not good. The use of differential scanning calorimetry is of high interest.
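For scale, a rough conversion of the quoted 200 MJ/mol-D figure, using the ~0.02 MJ/mol-D enthalpy of PdD formation that the presentation itself cites later (my arithmetic):

$$
\frac{200\ \mathrm{MJ/mol}}{0.02\ \mathrm{MJ/mol}} = 10{,}000,
\qquad
\frac{200 \times 10^{6}\ \mathrm{J/mol}}{6.022 \times 10^{23}\ \mathrm{mol^{-1}}} \approx 3.3 \times 10^{-16}\ \mathrm{J} \approx 2\ \mathrm{keV\ per\ D\ atom}.
$$

So the claimed factor of 10,000 is arithmetically consistent, and 2 keV per atom is far beyond any chemical scale (tenths of an eV per bond); the open question is not the arithmetic but whether the calorimetry and the mole count behind it hold up, which is exactly why the variability matters.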

  • Page 11. New MHE Facility at ELPH Tohoku U. (schematic) (photo)
  • Page 12. MHE Calorimetry Test System at Kobe University, since 2012 (photo)
  • Page 13. Schematics of MHE Calorimetry Test System at Kobe University, since 2012

System has 5 or 6 thermocouples (TC3 is not shown).

  • Page 14. Reaction Chamber (500 cc) and filler + sample; common for Tohoku and Kobe

Reaction chamber is the same for both test systems. It contains 4 RTDs.

  • Page 15. Melt-Spinning/Oxidation Process for Making Sample
  • Page 16. Atomic composition for Pd1Ni10/ZrO2 (PNZ6, PNZ6r) and Pd1Ni7/ZrO2 (PNZ7k)
  • Page 17. 6 [sic, 16?] Collaborative Experiments. Chart showing results from 14 listed tests, 8 from Kobe, 5 from Tohoku, and listing one DSC study from Kyushu.

These were difficult to decode. Some tests were actually two tests, one at RT (Room Temperature) and another at ET (Elevated Temperature). Other than the DSC test, the samples tested were all different in some way, or were they?

  • Page 18. Typical hydrogen evolution of LM and power in PNZ6#1-1 phase at Room Temp. I have a host of questions. “LM” is loading (D/Pd*Ni), and is taken up to 3.5. Pressure?

“20% difference between the integrated values evaluated from TC2 and those from RTDav: due to inhomogeneity of the 124.2-g sample distributed in the ZrO2 [filler].” How do we know that? What calibrations were done? Is this test 14 from Page 17? If so, the more optimistic result was included in the table summary. The behavior is unclear.

Page 19. Using Same Samples divided (CNZ5 = Cu1Ni7/ZrO2), 100 g, parallel tests. This would be test 4 (Kobe, CNZ5), test 6 (Tohoku, CNZ5s).

The labs are not presenting data in the same format. It is unclear what is common and what might be different. The behaviors are not the same, regardless, which is suspicious if the samples are the same and they are treated the same. The difference, then, could be in the calorimetry or other aspects of the protocol not controlled well. The input power is not given in the Kobe plot. (This is the power used to maintain elevated temperature). It is in the Tohoku plot, it is 80 W, initially, then is increased to 134 W.

“2~8W of AHE lasted for a week at Elevated Temp. (H-gas)” is technically sort-of correct for the Kobe test (i.e., between 2 and 8 watts of AHP; this is power, not energy): the signal started out at 8 W average and declined steadily until it reached 2 W after 3.5 days. Then it held at roughly this level for three days, and then there is an unexplained additional brief period at about 4 W. The Tohoku test showed higher power, but quite erratically. After almost rising to 5 W, for almost a day, it collapsed to zero, then rose to 2 W. Then, if this is plotted correctly, the input power was increased to raise the temperature. (For an environmental temperature, which this was intended to be, the maintenance power is actually irrelevant; it should be thermostatically controlled — and recorded, of course. Significant XP would cause a reduction in maintenance power, as a check. But if they used constant maintenance power, then we would want to know the environment temperature, which should rise with XP, though only a little in this experiment, XP being roughly 2% of heating power.) At about 240 hours, the XP jumped to about 3.5 W. I have little confidence in the reliability of this data, without knowing much more than is presented.

Page 20. 14-th Coll. Test(PNZ6): Largest AHE Data 

“Wex: 20W to 10W level excess-power lasted for a month.” This is puffery, cherry-picking data from a large set to create an impressive result. Yes, we would want to know the extremes, but both extremes, and we would even more want to know what is reliable and reproducible. This work is still “exploratory”; it is not designed, so far, to develop reliability and confidence data. The results so far are erratic, indicating poor control. Instead of using one material — it would not need to be the “best” — they have run a modest number of tests with different materials. Because of unclear nomenclature, it’s hard to say how many were different. One test is singled out as being the same material in two batches. I’d be far more interested in the same material in sixteen batches, all with an effort that they be thoroughly mixed, as uniform as possible, before dividing them. Then I’d want to see the exact same protocol run, as far as possible, in the sixteen experiments. Perhaps the only difference would be the exact calorimetric setup, and I’d want to see dummy runs in both setups with “fuel” not expected to be nuclear-active.

One of the major requirements for calorimetric work, too often neglected, is to understand the behavior of the calorimeter thoroughly, across the full range of experimental conditions. This is plodding work, boring. But necessary.
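A minimal sketch of the kind of characterization I mean, assuming a simple linear calorimeter response; the numbers are hypothetical, not from this work:

```python
# Characterize the calorimeter with dummy (non-active) runs, then use that
# calibration to convert an active-run signal into apparent excess power.
# All values are hypothetical placeholders.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept, y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Dummy runs: known electrical input power (W) vs. observed temperature rise (K)
input_w   = [20.0, 40.0, 60.0, 80.0, 100.0]
delta_t_k = [4.9, 10.1, 15.2, 19.8, 25.1]

w_per_k, offset = fit_line(delta_t_k, input_w)   # calibration: watts per kelvin of rise

# Active run at the same input power, with a larger temperature rise
active_input_w, active_delta_t_k = 80.0, 22.3
apparent_total_w = w_per_k * active_delta_t_k + offset
print(f"apparent excess power = {apparent_total_w - active_input_w:.1f} W")
```

The point is not this particular model; it is that an “excess” figure is only as good as the calibration behind it, including its scatter across the whole operating range.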

  • Page 21. Excess power, Wex, integrated excess heat per metal atom, Ea (keV/a-M), and
    excess energy per hydrogen isotope atom absorbed/desorbed, ηav,j (keV/aD(H)),
    in RT and ET phases evaluated by TC2 temp. Re-calcined PNZ6.
  • Page 22. Peculiar evolution of temperature in D-PNZ6r#1-2 phase: Re-calcined PNZ6
  • Page 23. PNZ5r sample: baking (#0) followed by #1 – #3 run (Rf = 20 ccm mostly)
  • Page 24. Local large heat: Pd/Ni = 1/7, after re-calcination of PNZ5. Uses average of RTDs rather than flow thermocouple.
  • Page 25. Excess heat-power evolution for D and H gas: Re-calcined PNZ5.
  • Page 26. About 15 cc 100g PNZ5r powder + D2 gas generated over 100 MJ/mol-D anomalous excess heat:
    Which is 5,000 times of 0.02 MJ/mol-D by PdD formation! More fluff: it assumes there is no systematic error, and it distracts both from the lack of a consistent experiment repeated many times and from the fact that this is not close to commercial practicality. I was really hoping that they had moved into reliability study.
  • Page 27. Radiations and flow rate of coolant BT400; n and gamma levels are natural BG. No radiation above background.
  • Page 28. Excess Power Evolution by CNS2(Cu1Ni7/meso-silica). Appears to show four trials with that sample, from 2014, i.e., before the project period. Erratic results.
  • Page 29. Sample Holder/Temperature-Detection of DSC Apparatus, Kyushu University; M. Kishida, et al. (photo)
  • Page 30. DSC Measuring Conditions: Kyushu University.
    Sample Amount: 40~100 mg
    Temperature : 25 ~ 550 ℃
    Temp. Rise Rate: 5 ℃/min
    Hydrogen Flow: 70 ml/min
    Keeping Temp.: 200~550 ℃,mainly 450℃
    Keeping Period: 2 hr ~ 24 hr,mostly 2hr
    Blank Runs : He gas flow
    Foreground Runs: H2 gas flow

See Wikipedia, Differential Scanning Calorimetry. I don’t like the vague variations: “mainly,” “mostly.” But we’ll see.

  • Page 31. DSC Experiments at Kyushu University. No Anomalous Heat was observed for Ni and ZrO2 samples.
  • Page 32. DSC Experiments at Kyushu University. Anomalous Heat was observed for PNZ(Pd1Ni7/ZrO2 samples. Very nice, clear. 43 mW/gram. Consistency across different sample sizes?
  • Page 33. Results by DSC experiments: Optimum running temperature For Pd1Ni7/zirconia sample.
  • Page 34. Results by DSC experiments; Optimum Pd/Ni Ratio. If anyone doesn’t want more data before concluding that 1:7 is optimal, raise your hand. Don’t be shy! We learn fastest when we are wrong. They have a decent number of samples at low ratio, with the heat increasing with the Ni, but then only one data point above the ratio of 7. That region is of maximum interest if we want to maximize heat. One point can be off for many reasons, and, besides, where is the actual maximum? As well, the data for 7 could be the bad point. It actually looks like the outlier. Correlation! Don’t leave home without it. Gather lots of data with exact replication or a single variable. Science! Later, on P. 44, Takahashi provides a possible explanation for an optimal value somewhere around 1:7, but the existence of an “explanation” does not prove the matter.
  • Page 35. Summary Table of Integrated Data for Observed Heat at RT and ET. 15 samples. The extra one is PNZt, the first listed.
  • Page 36. Largest excess power was observed by PNZ6 (Pd1Ni10/ZrO2) 120g.  That was 25 W. This contradicts the idea that the optimal Pd/Ni ratio is 1:7, pointing to a possible flyer in the DSC data at Pd/Ni 1:7, which was used for many experiments. It is possible from the DSC data, then, that 100% Ni would have even higher power results (or 80 or 90%). Except for that single data point, power was increasing with Ni ratio, consistently and clearly. (I’d want to see a lot more data points, but that’s what appears from what was done.) This result (largest) was consistent between #1 and #2. I’m assuming that (“#”) means two identical subsamples.
  • Page 37. Largest heat per transferred-D, 270 keV/D was observed by PNZ6r (re-oxidized). This result was not consistent between #1 and #2.
  • Page 38. STEM/EDS mapping for CNS2 sample, showing that Ni and Cu atoms are included in the same pores of the mp-silica with a density ratio approximately equal to the mixing ratio.
  • Page 39. Pd-Ni nano-structure components are only partial [partial what?] (images)
  • Page 40. Obtained Knowledge. I want to review again before commenting much on this. Optimal Pd/Ni was not determined. The claim is no XE for pure Pd. I don’t see that pure Ni was tested. (I.e., PZ) Given that the highest power was seen at the highest Ni:Pd (10), that’s a major lacuna.
  • Page 41. 3. Towards Application (next R&D).
    Issue / Subjective [Objective?] / Method
    Increase Power / Present ca. 10W to 500-1000W or more / Increase reaction rate
    ・temperature, pressure
    ・increase sample nano
    ・high density react. site
    Enhance COP / Now 1.2; to 3.0~5.0
    Control / Find factors, theory / Speculation by experiments, construct theory
    Lower cost / Low cost nanocomposites / Optimum binary, lower cost fabrication

I disagree that those are the next phase. The first phase would ideally identify and confirm a reasonably optimal experiment. That is not actually complete, so completing it would be the next phase. This completion would use DSC to more clearly and precisely identify an optimal mixture (with many trials). A single analytical protocol would be chosen and many experiments run with that single mixture and protocol. Combining this with exploration, in an attempt to “improve,” except in a very limited and disciplined way, will increase confusion. The results reported already show very substantial promise. 10-25 watts, if that can be shown to be reasonably reliable and predictable, is quite enough. Higher power at this point could make the work much more complex, so keep it simple.

Higher power, then, could be easy, by scaling up, and increasing COP could be easy by insulating the reactor to reduce the heat loss rate. With sufficient scale and insulation, the reaction should be able to become self-sustaining, i.e., maintaining the necessary elevated environmental temperature with its own power.
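To put rough numbers on the COP point (my arithmetic, with COP taken as total heat out over heater power in, and the heater powers purely illustrative):

$$
\mathrm{COP} = 1 + \frac{P_{\mathrm{excess}}}{P_{\mathrm{heater}}}:
\qquad
1 + \frac{25\ \mathrm{W}}{125\ \mathrm{W}} = 1.2,
\qquad
1 + \frac{25\ \mathrm{W}}{10\ \mathrm{W}} = 3.5 .
$$

That is, the reported ~25 W at a COP of about 1.2 implies roughly 125 W of maintenance heating; if insulation cut that to around 10 W, the same 25 W would already sit in the slide’s 3.0~5.0 target range, with no change at all in the reaction itself. That is why insulation and scale matter more right now than raw power.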

Theory of mechanism is almost completely irrelevant at this point. Once there is an identified lab rat, then there is a test bed for attempting to verify — or rule out — theories. Without that lab rat, it could take centuries. At this point, as well, low cost (i.e., cost of materials and processing) is not of high significance. It is far more important at this time to create and measure reliability. Once there is a reliable experiment, as shown by exact and single-variable replications, then there is a standard to apply in comparing variables and exploring variations, and cost trade-offs can be made. But with no reliable reactor, improving cost is meaningless.

This work was almost there, could have been there, if planned to complete and validate a lab rat. DSC, done just a little more thoroughly, could have strongly verified an optimal material. It is a mystery to me why the researchers settled on Pd/Ni of 7. (I’m not saying that’s wrong, but it was not adequately verified, as far as what is reported in the presentation.)

Within a design that was still exploratory, it makes sense, but moving from exploration to confirmation and measuring reliability is a step that should not be skipped, or the probability is high that millions of dollars in funding could be wasted, or at least not optimally used. One step at a time wins, in the long run.

APPENDIX ON THEORETICAL MODELS

  • Page 42. Brief View of Theoretical Models, Akito Takahashi, Professor Emeritus Osaka U. For appendix of 2016-9-8 NEDO hearing. (title page)
  • Page 43. The Making of Mesoscopic Catalyst To Scope CMNR AHE on/in Nano-Composite particles.
  • Page 44. Binary-Element Metal Nano-Particle Catalyst. This shows the difference between Ni/Pd 3 and Ni/Pd 7, at the size of particle being used. An optimal ratio might vary with particle size, following this thinking. Studying this would be a job for DSC.
  • Page 45. SNH will be sites for TSC-formation. To say that more generically, these would be possible Nuclear Active Environments (NAE). I don’t see that “SNH” is defined, but it would seem to refer to pores in a palladium coating on a nickel nanoparticle, creating possible traps.
  • Page 46. Freedom of rotation is lost for the first trapped D2, and orthogonal coupling
    with the second trapped D2 happens because of high plus charge density localization
    of d-d pair and very dilute minus density spreading of electrons. Plausible.
  • Page 47. TSC Langevin Equation. This equation is from “Study on 4D/Tetrahedral Symmetric Condensate Condensation Motion by Non-Linear Langevin Equation,” Akito Takahashi and Norio Yabuuchi, in Low Energy Nuclear Reactions Sourcebook, American Chemical Society and Oxford University Press, ed. Marwan and Krivit (2008) — not 2007 as shown. See also “Development status of condensed cluster fusion theory,” Akito Takahashi, Current Science, 25 February 2015, and Takahashi, A., “Dynamic Mechanism of TSC Condensation Motion,” in ICCF-14, 2008.
  • Page 48. (Plots showing simulations: first, oscillation of Rdd (d-d separation in pm) and Edd (in eV), with a period of roughly 10 fs, and, second, “4D/TSC Collapse,” which takes about a femtosecond from a separation of about 50 pm to full collapse, Rdd shown as 20 fm.)
  • Page 49. Summary of Simulation Results. for various multibody configurations. (Includes muon-catalyzed fusion.)
  • Page 50.  Trapped D(H)s state in condensed cluster makes very enhanced fusion rate. “Collision Rate Formula UNDERESTIMATES fusion rate of steady molecule/cluster/” Yes, it would, i.e., using plasma collision rates.
  • Page 51. This image is a duplicate of Page 4, reproduced above.
  • Page 52. TSC Condensation Motion; by the Langevin Eq.: Condensation Time = 1.4 fs for 4D and 1.0 fs for 4H Proton Kinetic Energy INCREASES as Rpp decreases.
  • Page 53. 4H/TSC will condense and collapse under rather long time chaotic oscillation near weak nuclear force enhanced p-e distance.
  • Page 54. 4H/TSC Condensation Reactions. Collapse to 4H, emission of electron and neutrino (?) to form 4Li*, prompt decay to 3He + p. Color me skeptical, but maybe. Radiation? 3He (easily detectable)?
  • Page 55. Principle is Radiation-Less Condensed Cluster Fusion. Predictions: see “Nuclear Products of Cold Fusion by TSC Theory,” Akito Takahashi, J. Condensed Matter Nucl. Sci. 15 (2015), pp. 11-22.

Fake facts and true lies

This is a little “relax after getting home” exploration of a corner of Planet Rossi, involving Mats Lewan (but, it turns out, only very peripherally), Frank Acland’s interview of Andrea Rossi just the other day (June 11), and some random comments on E-Cat World, easily categorized under the time-wasting “Someone is wrong on the internet.”

Review

UNDER CONSTRUCTION

This page will collect reviews of the ICCF-21 abstracts. The intention is to support study and preparation for talks and poster sessions, as well as to prioritize attendee time.

The abstracts display a wide range of quality and usefulness. Those two characteristics are personal assessments, not fact. Comments are welcome.

My intention is to update reviews during the conference, and to later integrate what is studied here with actual published papers, where those appear.

My intention is to create a page for each title. The title here will link to the review page for the author (if a link is shown)

(If a reader wants to comment on a presentation that is not linked, request a page be created with a comment below. These requests will be deleted when actioned.)

Afanasyev-Sergei-1 POSTER Cold fusion: superfluidity of deuterons
Alexandrov-Dimiter-1 Experiment and Theory Th 1:52 Nuclear fusion in solids – experiments and theory
Amini-Farzan-1 POSTER Warp Drive Hydro Model For Interactions Between Hydrogen and Nickel
Anderson-Paul-1 POSTER The SAFIRE Project – An overview
Barot-Shriji-1 POSTER Flow Calorimetry Design for Elevated Temperature Experiments with Deuterium
Beiting-Edward-1 Diverse Experiments Th 8:00 Investigation of the Nickel-Hydrogen Anomalous Heat Effect
Beiting-Edward-2 POSTER Generation of High-Temperature Samples and Calorimetric Measurement of Therma
Bergschneider-Matthew-1 POSTER Study of a Calorimetry Apparatus utilizing Radiation based Heat Transfer
Biberian-Jean-Paul-1 Transmutations Tu 10:30 Anomalous Isotopic Composition of Silver in a Palladium Electrode
Blake-Russ-1 Theory Th 3:30 Understanding LENR Using QST
Blake-Russ-2 POSTER Further Foundations of Fusion
Bowen-NL-1 POSTER A Simple Calculation of the Inter-Nucleon Up-to-Down Quark Bond and its Implications for Nuclear Binding
Brink-Simon-1 Experiment and Theory Th 2:37 LENR Catalyst Identification Model
Celani-Francesco-1 Heat Measurements Tu 8:48 Steps to identification of main parameters for AHE generation in submicrometric materials: measurements by isoperibolic and air-flow calorimetry
Cook-Norman-1 Theory M 4:30 The “Renaissance” in Nuclear Physics: Low-energy nuclear reactions and transmutations
Czerski-Konrad-1 Ion Beams – Rydberg Matter Tu 1:30 Influence of Crystal Lattice Defects and the Threshold Resonance on the Deuteron-Deuteron Reaction Rates at Room Temperature
Daggett-David_1 Heat Measurements Tu: 9:36 Positive Result of a Laser-Induced LENR Experiment
David-Fabrice-1 Instrumentation Th 11:37 Alternatives to Calorimetry
Dong-ZM-1 Diverse Experiments Th 8:48 Temperature Dependence of Excess Heat in Gas-Loading Experiments
Dubinko-Vladimir-1 Theory Tu 4:37 Chemical and Nuclear Catalysis Mediated by the Energy Localization in Hydrogenated Crytals and Quasicrystals
Egely-George-1 POSTER Electric Energy Generation by LENR
Egely-George-2 Old and New Experiments W 10:30 Changes of Isotope Ratios in Transmutations
Fomitchev-Zamilov-Max-1 Transmutations Tu 10:52 Synthesis of Lanthanides on Nickel Anode
Fomitchev-Zamilov-Max-2 POSTER Reliable Neutron and Gamma Radiation Detection
Forsley-Lawrence-1 Applications and Close F 10:52 Space Applications of a Hybrid Fusion-Fission Reactor
Fowler-Malcolm-1 Instrumentation Th 10:30 Development of a Sensitive Detection system for the Measurement of Trace Amounts of He4 in Deuterium or Hydrogen
fredericks-keith-1 POSTER Elliptical tracks and magnetic monopoles
Gibson-Martin-1 POSTER A Geometric Understanding of Low Energy Nuclear Reactions in the Palladium-Deuterium Lattice
Gordon-Frank-1 POSTER Real-time Instrumentation and Digital Processing for LENR Characterization
Grimshaw-Thomas-1 POSTER Documentation and Archives of 29 Years of LENR Research by Dr. Edmund Storms
Gutzmann-Emma-GWU-1 POSTER Parametric experimental studies of Ni-H electrochemical cells
Hagelstein-Peter-1 Theory M 3:30 Statistical mechanics models for the PdH, and PdD, phase diagram with both O-site and T-site occupation
Hagelstein-Peter-2 Materials W 8:48 Phonon-mediated excitation transfer involving nuclear excitation
Hagelstein-Peter-3 POSTER Phonon-nuclear coupling matrix element for the low energy E1 transition in Ta-181 and applications
Hatt-Philippe-1 Theory Th 3:52 Cold Nuclear Transmutations Light Atomic Nuclei Binding Energy
Higgins-Bob-1 Instrumentation Th 10:52 Modeling and Simulation of a Gas Discharge LENR Prototype
Hioki_Tatsumi-1 Heat from NanoMaterials M 2:30 XRD and XAFS Analyses for Metal Nanocomposites Used in Anomalous Heat Effect Experiments
Imam-Ashraf-1 Materials W 9:12 Fabrication, Characterization, and Evaluation of Palladium-Born Alloys Use in LENR Experiments
Iwamura-Yasuhiro-1 Heat from NanoMaterials M 2:00 Anomalous Heat Effects Induced by Metal Nanocomposites and Hydrogen Gas
Kaal-Edo-1 POSTER The Structured Atom Model – SAM
Kasagi-Jiro-1 Instrumentation Th 11:15 Search for γ-ray radiation in NiCuZr nano-metals and H2 gas system generating large excess heat.
Katinsky-Steven-1 Introduction M 8:30 LEAP: The LENRIA Experiment and Analysis Program
Kitagawa-Yuta-1 Diverse Experiments Th 9:12 Direct Joule Heating of D-Loaded Bulk Pd Plates in Vacuum
Kornilova-Alla-1 POSTER Stimulation of LENR in Hydroborate Minerals Under the Action of Distant High-Frequency Thermal Waves
Kovacs-Andras-1 Experiment and Theory Th 2:15 Electron mediated nuclear chain reactions
Letts-Dennis-1 Heat Measurements M 10:30 Building and Testing a High Temperature Seebeck Calorimeter
Li-XZ-1 Theory Tu 3:30 Resonant Surface Capture Model
Lomax-Abd-ulRahman-1 POSTER Correlation and cold fusion
Lu-Gongxuan-1 Transmutations Tu 11:15 Photocatalytic hydrogen evolution and induced transmutation of potassium to calcium via low-energy nuclear reaction (LENR) driven by visible light.
McCarthy-William-1 Old and New Experiments W 11:15 Light Hydrogen LENR in Copper Alloys
McKubre-Michael-1 Technical Perspective M 9:30 The Fleischmann-Pons heat and ancillary effects. What do we know, and why? How might we proceed?
Metzler-Florian-1 Old and New Experiments W 10:52 Observation of non-exponential decay of x-ray and γ lines from Co-57 on steel plates
Meulenberg-Andrew-1 Applications and Close F 11:15 Nuclear-waste remediation with femto-atoms and femto-molecules
Meyer-Jacob-1 POSTER On the Oxidation of Palladium
Miles-Melvin-1 Materials W 9:36 Excess Power Measurements For Palladium-Boron Cathodes
Miles-Melvin-2 POSTER Calorimetric Insights From Fleischmann Letters
Miles-Melvin-3 POSTER No Steady State For Open Isoperibolic Calorimetry
Miley-George-1 Heat Measurements M 11:30 Progress in Cluster Enabled LENR
Mizuno-Tadahiko-1 Heat Measurements M 11:00 Excess heat generation by simple treatment of reaction metal in hydrogen gas
Mosier-Boss-Pamela-1 Applications and Close F 10:30 Hybrid Fusion-Fission Reactor Using Pd/D Codeposition
Mosier-Boss-Pamela-2 POSTER Overview of Pd/D Co-deposition
Nee-Han-1 Materials W 8:24 Lattice Confinement of Hydrogen in FCC Metals for Fusion Reaction
Nikitin-Aleksander-1 Transmutations Tu 11:37 Impact of Effective Microorganisms on the Activity of 137Cs in Soil from the Exclusion Zone of Chernobyl NPP
Olafsson-Sveinn-1 Ion Beams – Rydberg Matter Tu 1:52 What is Rydberg Matter and Ultra-Dense Hydrogen?
Olafsson-Sveinn-2 POSTER Adler-Bill-Jakiw anomaly in electroweak interactions, the 3p+  3L* process and links to spontaneous UHD decay and transmutation process
Olafsson-Sveinn-3 POSTER Volcanism in Iceland, Cold fusion and Rydberg matter
Olafsson-Sveinn-4 POSTER Conductivity of Rydberg matter
Olafsson-Sveinn-5 POSTER Rydberg matter experimental setup in Iceland
Paillet-Jean Luc-1 Theory Tu 3:52 On highly relativistic deep electrons
Papadatos-Gabriel-GWU-1 POSTER Electrical, thermal and chemical simulations of Ni-H electrochemical cells
Plekhanov-VG-1 POSTER A possible signature of neutron quarks – lepton interaction in solids
Prevenslik-Thomas-2 POSTER X-ray emission in LENR by Zero Point Energy or simple QED?
Ramarao-Prahlada-1 Diverse Experiments Th 8:24 Observation of Excess Heat in Nickel – LAH System
Roarty-Brian-1 Old and New Experiments W 11:37 A Method to Initiate an LENR Reaction in an Aqueous Solution
Ruer-Jacques-1 POSTER Chemical Heat Generation in LENR
Scholkmann-Felix-GWU-1 POSTER Complex current fluctuations in Ni-H electrochemical experiments: Characterization using non-linear signal analysis
Seccombe-Dana-1 Experiences F 9:30 Experience with Semiconductor Technology Development Potentially Relevant to LENR
Staker-Michael-1 Heat Measurements Tu 9:12 Coupled Calorimetry and Resistivity Measurements, in Conjunction with an Emended and More Complete Phase Diagram of the Palladium – Isotopic Hydrogen System
Stevenson-Cheryl-1 Theory Tu 4:15 Isotope Effects beyond the Electromagnetic Force: 1H and 2H in Palladium Exhibiting LENR
Storms-Edmund-1 Experiences F 8:00 The enthalpy of formation of PdH as a function of H/Pd atom ratio and treatment
Storms-Edmund-2 Materials W 8:00 The Loading and Deloading Behavior of Palladium Hydride
Storms-Edmund-3 POSTER The strange behavior of catalysts made from Pd or Pt applied to Al2O3
Stringham-Roger-1 Diverse Experiments Th 9:36 Investigation of Cavitation Effects Related to LENR
Stringham-Roger-2 POSTER A Deuteron Plasma Driven to Neutrality and 4He
Swartz-Mitchell-1 Heat Measurements Tu 8:24 Aqueous and Nanostructured CF/LANR Systems Each Have Two Electrically Driven Modes
Swartz-Mitchell-2 Experiences F 9:00 Excess Heat is Linked to Deuterium Loss in an Aqueous Nickel CF/LANR System
Takahashi-Akito-1 Heat from NanoMaterials M 1:30 Research Status of Nano-Metal Hydrogen Energy
Tanabe-Katsuaki-1 Theory Th 4:15 Plasmonic Field Enhancement on Planar Metal Surfaces
Tanzella-Fran-1 Heat Measurements Tu 8:00 Nanosecond Pulse Stimulation in the Ni-H2 System.
Tarassenko-Gennadiy-1 POSTER The Mechanism of Formation of LENR in Earth’s Crust
Vysotskii-Vladimir-1 Experiment and Theory Th 1:30 Using the Method of Coherent Correlated States for Realization of Nuclear Interaction of Slow Particles with Crystals and Molecules
Vysotskii-Vladimir-2 Theory M 3:50 Effective LENR in Weakly Ionized Gas Under the Action of Optimal Pulsed Magnetic Fields and Lightning (Theory and Experiments)
Vysotskii-Vladimir-3 POSTER Generation and Registration of Undamped Temperature Waves at Large Distance in LENR Related Experiments
Vysotskii-Vladimir-4 POSTER Controlled transmutation of Na, P and Mn to Fe isotopes in D2O and H2O during growth of yeast Saccharomyces cerevesiae
Whitehouse-Harper-1 POSTER Electrochemical Immittance and Transfer-function Spectroscopy applied to LENR
Wood-Ryan Ion Beams – Rydberg Matter Tu 12:37 Joseph Papp Noble Gas Engine Shows Early LENR?
Yoshimura-Toshihiko-1 Theory Th 4:37 Estimation of bubble fusion requirements during high-pressure, high-temperature cavitation
Zeiner-Gundersen-Sindre-1 Ion Beams – Rydberg Matter Tu 12:15 Hydrogen reactor for Rydberg Matter and Ultra Dense Hydrogen, a replication of Leif Holmlid
Zeiner-Gundersen-Sindre-2 POSTER Distance dependency of spontaneous decay signal from ultra dense hydrogen source
Zeiner-Gundersen-Sindre-3 POSTER Pulse shape and PMT stabilization period from spontaneous signal from a ultra dense hydrogen source
Zhang-Hang-1 POSTER Experimental on hydrogen carrying metal glow discharge
Ziehm-Erik-1 POSTER Detecting Charged Particles in LENR Applications using CR-39
Zuppero-Anthony-1 POSTER Electron Quasiparticle Catalysis of Nuclear Reactions
Zuppero-Anthony-2 Theory M 4:10 Transmutations by Heavy Electron Catalysis

 

Analysis re Kirkegaard

The sources cited by Oliver Smith on his Kirkegaard attack blog.

In the introduction: https://www.theguardian.com/commentisfree/2018/jan/14/how-toby-young-got-where-he-isnt-today-universities-regulator-resignation (archive), Sun 14 Jan 2018 05.00 EST, an attack vituperation (see below), and then:

(the number at the beginning of each link is a link to an anchor for details below. Archive copies are also linked in the details.)

Opinion piece in Guardian

https://www.theguardian.com/commentisfree/2018/jan/14/how-toby-young-got-where-he-isnt-today-universities-regulator-resignation (archive)

(I was astonished that the Guardian would put up a piece like this, even as “Opinion.” This is not listed as an official Guardian editorial. The author, Stewart Lee, is a “standup comedian, writer and director.” His latest “opinion piece” is really sarcastic satire like this, and, in my opinion, funny … as long as one does not take it literally. There are truths behind sarcasm, but it is not “the truth.” These pieces are selected by the Guardian from submissions; see “pitch guidelines.” I have found no disclaimer, however, to warn the reader that pieces are the opinion of the author, not the Guardian. Nor that a piece by a “standup comedian” might be a satirical roast.)

The following is only what relates to Kirkegaard, but the general theme regarding the London Conference on Intelligence is that there would be “nazism … being handed round .”

secret conference on “intelligence”, featuring notorious speakers including in previous years white supremacists and a weird far-right paedophilia apologist called Emil.  

Private Eye

Toby Young Breeds Contempt. Private Eye, 10 Jan. 2018. The link was incorrect, returning an internal link to the deleted blog. With some effort, I found a screenshot of the paper’s article. (archive) Quoting from it, re Kirkegaard, and about the London Conference on Intelligence:

The conference serves as a rendezvous for academic racists and their sympathizers. One speaker, Emil Kirkegaard, whom Young follows on Twitter, presented papers at the 2015, 2016 and 2017 conferences. Writing about pedophilia on his website, he argued in 2012 that a “compromise is having sex with a sleeping child without them knowing it (so, using sleeping medicine). If they dont (sic) notice, it is difficult to see how they cud [sic] be harmed.” He added in April 2017 that he advocated a “frank discussion of pedophilia-related issues”.

The author generally is searching for what is wrong; Private Eye is a scandal sheet. Kirkegaard, in his 2017 comment, did not realize the problem: public reaction to any mention of pedophilia can be highly reactive, high reactivity does not allow clear insight, and “frank discussion” is unsafe outside of a safe context. Even then it can be dangerous! It is easy to say things that are clearly true and be attacked for “advocating pedophilia” — and saying that is not advocating pedophilia, and does not imply that pedophilic activity (that is the legal issue, not pedophilia per se) is or should be acceptable; quite the contrary. However, to actually protect children, we need to step back and generate deeper understanding than the simple horror that is common — and understandable.

Kirkegaard is a hereditarian, i.e., tends to look for genetic causes of behavior. So he was thinking about pedophilia as genetic. If it is genetic, a conclusion can come to mind that pedophiles “can’t help themselves.” So he was thinking about what one could do if one was unfortunate enough to be a pedophile. He is quite correct that if we want to control damage, we must consider this. However, where one goes from there is not obvious. Someone with an innate disposition to uncontrollable violence — which may have genetic roots — is not therefore to be excused and tolerated!

Kirkegaard’s language, “it is hard to see,” would easily be taken as a claim that there is no harm. And so his statement, quoted out of context, seems to be “approving” of child rape. However, he did not claim that, and his post actually goes on to discuss the actual and potential harm. That part was not quoted, and the selective quotation gives the impression that it does not exist.

He meant by this that it might be hard for the pedophile to see the harm. It obviously was not hard for him, as he completed his thinking about the problem. He was not, and probably is not, aware of what real pedophiles, or people who seem (to themselves!) like pedophiles, can actually do to avoid harm to themselves and others.

The author clearly did read the original post, not merely the RationalWiki quotation. But, searching for confirmation of the shocking revelation she had seen there, serving the purpose of her article (to smear Toby Young by association), and reading the post in the context of a claim that Kirkegaard is a pedophile or pedophile apologist, she missed the contradiction.

As well, it is a common public reaction to believe that pedophiles and pedophile apologists want public discussion of pedophilia. No, they don’t. That is the opposite of reality, in my experience. Genuine pedophiles don’t want the topic brought up at all! (I ended up counselling pedophiles in several instances; there are three examples that I came across. Only one of them would want a “frank conversation.” He had been convicted and had thoroughly accepted the serious wrongness of what he had done. He was also involved in counselling pedophiles, and he had actually managed to get the sex-offender registration lifted, having demonstrated to the satisfaction of the courts that re-offense was very unlikely, largely because everyone in his life, his family, etc., knew what he had done.) Pedophiles, far more commonly, want it never mentioned. Pedophilic activity thrives in secrecy.

There are organizations that advocate for “boy love” or the like. They create walled gardens for their members, not public conversation. The “frank public conversation” is actually from academics, professionals, and the like, studying the issue with scientific research and academic publication. Kirkegaard has apparently not been part of that academic conversation.

Hence the author here cites the “frank conversation” comment as if it were advocacy, and that attitude (that conversation must be advocacy) does little or nothing to actually protect children, and may actually cause harm. The archive image on imgur was archived from a photo of the page, for a post on reddit. The title given to the scan, the next day, was “Private Eye: Toby Young attended secret eugenics conference with white supremacists, paedophiles.”

The Evolve Politics page (see coverage here) has a complete scan of the Private Eye article 

The Telegraph

UCL launches ‘eugenics’ probe after it emerges academic held controversial conference. The Telegraph, 10 January 2018, 6:49 PM. Still up, original link.

Since 2015, Dr James Thompson has overseen the London Conference on Intelligence, which has seen a researcher who has previously advocated child rape online speak on campus on three occasions.

. . .

Speakers who have attended include researcher Emil Kirkegaard, who previously wrote that a “compromise” for child pornography would be for paedophiles to have “sex with a sleeping child without them knowing”.
In a blog published on his personal website in 2012, Mr Kirkegaard wrote: “One can have sex with some rather young ones (say, any consenting child in puberty) without any moral problems, especially when one is young oneself.”
“If they don’t notice it is difficult to see how they cud [sic] be harmed, even if it is rape.”

The second quotation is simple truth, actually, and describes behavior that is often legal. What was confusing here was the use of “child” to refer to someone sexually mature. Whether this is legal or not depends on unstated details. “Rather young” doesn’t mean “children” in this context. The juxtaposition of this sentence with the one about rape is confusing as well. The second sentence actually came first, and there was an intermediate sentence in the original post that made it clear that these were about two different situations.

London Student

London’s eugenics conference and its neo-Nazi links. London Student. 10 Jan. 2018.

Among the speakers and attendees over the last four years are a self-taught geneticist who argues in favour of child rape, multiple white supremacists, and ex-board member of the Office for Students Toby Young.

. . .

Another major organiser of the LCI is Emil Kirkegaard, who has attended all four conferences and even designed the website. Although he refers to himself as a “polymath” and Thompson describes him as a “very bright young guy”, Kirkegaard is not an academic. His highest qualification is a Bachelor’s in linguistics.
Having dropped out of his Masters degree, instead preferring to be “self-taught in various subjects”, Kirkegaard now runs OpenPsych, a platform for non-peer reviewed psychology papers, along with Davide Piffer of Mankind Quarterly. Piffer is a fellow LCI-speaker, and was praised by Richard Lynn as having done “brilliant work identifying the genes responsible for race differences in intelligence.”
. . .
Kirkegaard’s own personal blog is home to topics such as ‘Is miscegenation bad for your kids?’ and how one could empirically verify a Jewish conspiracy. His Facebook features alt-right ‘promotional videos’and once featured a friend’s Nazi salute with the caption ‘There will be an heir to the Führer.’
A post on Facebook featuring a Nazi salute behind Kirkegaard alongside his ‘Führer’ comment
By far the most disturbing of part of Kirkegaard’s internet presence, however, is a blog-post in which he justifies child rape. He states that a ‘compromise’ with paedophiles could be:

“having sex with a sleeping child without them knowing it (so, using sleeping medicine. If they dont notice it is difficult to see how they cud be harmed, even if it is rape. One must distinguish between rape becus the other was disconsenting (wanting to not have sex), and rape becus the other is not consenting, but not disconsenting either.”
He qualifies this with a note that “bodily harm” would undermine this justification, and especially “with small children since their bodily openings are not large enuf [sic] for a regular sized male penis. To avoid this one shud [sic] not penetrate.”
Kirkegaard’s reputation as a scientific advocate for neo-Nazism was increased last April when he appeared on Tara McCarthy’s ‘Reality Calls’ to discuss “the future of eugenics.” McCarthy was banned from YouTube for alleging a Jewish conspiracy to commit “white genocide”, supports deporting naturalized citizens and “killing them if they resist”, and said that she hopes “zero” migrants crossing the Mediterranean “make it alive”.

Unfortunately, the author did not complete the quotation, for Kirkegaard ultimately rejects the whole idea.  The author is again using guilt by association, a common theme. 

The Jewish Chronicle

University probes eugenics conference with links to white supremacists. The Jewish Chronicle. 10 Jan. 2018.

The article does not mention Kirkegaard except very indirectly.

University probes eugenics conference with links to white supremacists
The London Conference on Intelligence has connections to a number of controversial racial theorists

There are connections to "racial theorists," and they are "controversial." This article is not a problem at all. The only quibble I'd have is that the Conference is not actually a "eugenics conference," but eugenics, that is, the practical use of information about genetics, is a topic that comes up. Eugenics has a bad name because of Nazi eugenics, which was racist, but eugenics in itself is not racist; genetic testing is commonly used for selective abortion, as one example. This is about genetics, not race.

Evolve Politics

Toby Young spoke alongside Nazi who argues raping unconscious children is fine. Evolve Politics. 11 Jan. 2018. (original post includes scan of Private Eye article. I have saved a copy if needed.)

This is straightforward defamation.

It is fast becoming clear that Toby Young’s controversialist career is far darker than first appeared, with Private Eye drawing attention to his attendance of a secret eugenics conference alongside white supremacists and advocates of paedophilia.

. . .

Several of the speakers were controversial beyond their presentations. One speaker, Emil Kirkegaard, who presented papers three times at the conference, defended paedophilia suggesting that drugging the child before sex might be a “compromise” as they were unlikely to know it had happened and therefore wouldn’t be harmed. He has also advocated a “frank discussion of paedophilia related issues.”

. . .

It goes without saying that Young follows the openly fascist Kirkegaard on Twitter.

. . .

The obvious question that emerges from this is whether Young is fit for his role as director of the state funded New Schools Network who advise on the education of our children. It is the firm opinion of Evolve Politics that anyone palling around with literal paedophile advocates and white supremacists should have absolutely no place whatsoever in education.

I notice the "quotation creep," going further than cherry-picking, where what was actually written is twisted to exaggerate impressions. In that context, then, "frank discussion" can imply advocacy of the alleged "no harm" position, which was not Kirkegaard's expressed view.

Daily Mail

Top London university launches probe into conference that included speakers with controversial views on race and gender. Daily Mail. 11 Jan. 2018. (Original URL still up.)

Presents controversy; seems to be responsible journalism, even if incomplete (i.e., what is the other side of the "controversy"?).

Metro

University investigates 'racist' eugenics conference. Metro. 11 Jan. 2018. (Original URL is still up.)

Defamation.

The London Conference on Intelligence included talks by controversial speakers including white supremacists, child rape advocates, and those with extreme views on race and gender.

. . .

Another speaker was blogger Emil Kirkegaard, who has argued that men should be allowed to have ‘sex with a sleeping child’.

I find it fascinating how collective interpretation can invent what was not in the blog post. There was no "should be allowed" in that post, none at all. The reality of active pedophiles is that one of them might make an argument like that to him or herself; Kirkegaard actually rejects the argument. Being fed an interpretation before reading, and then not reading carefully, can create an impression that one has verified it. In this case, it is not at all certain that the author actually read the blog post itself.

The Independent

University College London launches 'eugenics' probe after controversial conference secretly held on campus. The Independent. 11 Jan. 2018. (Original URL is up.)

Other speakers at the conference have included Emil Kirkegaard, a researcher who justified child sexual abuse on his personal blog. In a 2012 post, he wrote: “Perhaps a compromise is having sex with a sleeping child without them knowing it (so, using sleeping medicine). If they dont [sic] notice it is difficult to see how they cud [sic] be harmed, even if it is rape.”

He later added a note to the post in which he said he did not support the legalisation of paedophilia but advocated “frank discussion of paedophilia-related issues”.
Mr Kirkegaard presented research – which was not published by peer-reviewed journals – on differences in cognitive ability between ethnic groups at the International Society for Intelligence Research (ISID) conference in Montreal last July, at which Mr Young also lectured.
Kirkegaard has published research in peer-reviewed journals, and conference papers are commonly current work, presented before publication. Technically, the research is not on "differences in cognitive ability" but on "differences in performance on tests," and the whole field is a difficult one. The theme being followed here tracks the RationalWiki article, as well as the extreme assessment of Richard Lynn from the SPLC, which is hardly a neutral organization. (A good one, by the way, in my view, but very political, and the topic is one which is highly politicized.)
This author did present the later clarification (the only one so far to note it), but the original post did not “support legalization of pedophilia” at all. Again, the “frank discussion” comment can appear to be supporting pedophilia, or at least a conversation over legalization.
The text simply ignores the “not support” comment added and reports that Kirkegaard “justified” child abuse. That reading only appears when one ignores the context and conclusions of the original post.

The Guardian

UCL to investigate eugenics conference secretly held on campus. The Guardian. 11 Jan. 2018. (Original URL is still up.)

Defamation.

Previous attendees included Richard Lynn, whom the US-based research group Southern Poverty Law Center labelled an “unapologetic eugenicist”, and the blogger Emil Kirkegaard, who has written supportively about pedophiles being allowed to have “sex with a sleeping child”.

. . .

Kirkegaard did not respond to requests for comment. But Thompson told the Daily Telegraph that the conference’s main subject was how IQ was inherited between different groups and races. “Eugenics is one topic, but many topics are discussed,” he said.

At least they tried, but it looks like Kirkegaard had very little time to respond. As professional journalists, they would be presumed to have carefully checked, before going ahead with publication, that the statement they made was true from the evidence they had, and the blog post was available. Did it support "allowing pedophiles to have sex with a sleeping child"?

No, it did not. It did not talk about legality at all. Indeed, in the thought-experiment, which is what it was, imagining the thinking of a pedophile, it would be essential to the idea of no-harm that nobody found out about it (as well as other problems, such as physical injury or … pregnancy, which can happen with a child who has not yet menstruated; it's rare, but it is known to have happened, and precocious puberty is a thing, meaning first menses before age 8; the earliest known pregnancy was a famous case at something like five). And, again, with many illegal actions, a potential offender may think, "if nobody finds out, what's the harm?" Kirkegaard, however, came to the conclusion that potential harm cannot be avoided, that the whole line of thinking was ultimately defective, and that perhaps castration was the solution.

It is unfortunate that Kirkegaard did not respond. When major media contact one for comment, they want to publish ASAP; it's their business. Kirkegaard is young and wanted to consult an attorney, I've been told. My suspicion: he was afraid of being misinterpreted, and fear leads us to make poor decisions. Nevertheless, Kirkegaard is not hiding; he is facing this situation head-on.

The real story here was how a long-term troll managed to manipulate major media, and many others, on RationalWiki (and in the full story, Wikipedia and the WMF wikis) with a story that he largely invented out of thin and misleading evidence.

As well, racism and racialism, and hereditarianism vs. environmentalism regarding intelligence, are complex issues that deserve "frank conversations," because the polarization of hate groups does not allow society to heal and move on; it does not allow us to make sane decisions based on evidence, and based on what we freely choose for society.

RT

Shamed Toby Young 'attended secret eugenics conference with neo-nazis and pedophiles'. RT. 11 Jan. 2018. (Original URL is still up.)

Defamation.

Shamed Toby Young ‘attended secret eugenics conference with neo-nazis and pedophiles’

Also at the lecture series was Emil Kirkegaard, who was invited despite previously claiming a“compromise” for child pornography would be for pedophiles to have “sex with a sleeping child without them knowing.”
Mr Kirkegaard wrote: “One can have sex with some rather young ones (say, any consenting child in puberty) without any moral problems, especially when one is young oneself. If they don’t notice it is difficult to see how they cud [sic] be harmed, even if it is rape.”
The quote is inaccurate, mashing together separate statements. (The original, again.) The headline follows classic trolling practice: first, allegations are stated as truth without qualification ("Shamed," "secret," "eugenics," "neo-nazis," and "pedophiles"), and then what may be a single fact, if true, becomes many. How many "pedophiles"? RationalWiki is full of that. This is the stuff of "genuine fake news."

The Times

University College London under fire over its conferences on 'eugenics'. The Times. 11 Jan. 2018. (Original URL is up, but requires registration for full text, free for two articles a week. I have my own copy, password protected.)

Not defamation. Some of it was conclusory: that the conference was secret, that rules had been violated, and that the conference was characterized by speakers allegedly making certain claims that, I suspect and have some basis to say, are not presented fairly. Commenters on the article noticed! However, this was, by far, the most balanced coverage. There are 9 comments, quite good, considering. Deeper coverage would explore the real controversy, not the social-media version of yelling at each other.

Protecting the fringe allows the mainstream to breathe

Wikipedia is famously biased against fringe points of view or fringe science (and actually the bias can appear with any position considered "truth" by a majority or plurality faction). The pseudoskeptical faction there claims that there is no bias, but it's quite clear that reliable sources, per Wikipedia definitions, exist and are excluded, while weaker sources "debunking" the fringe are allowed. If editors appear to be "fringe," they are readily harassed and blocked or banned, whereas more egregious behavior, violating Wikipedia policies, is overlooked if an editor is allied with the "skeptical" faction. Over time, the original Wikipedians, who actually supported Neutral Point of View policy, have substantially been marginalized and ignored, and the faction has become increasingly bold.

When I first confronted factional editing, before the Arbitration Committee in 2009, the faction was relatively weak. However, over the ensuing years, the debunkers organized, Guerrilla Skeptics on Wikipedia (GSoW) came into existence, and operates openly. People who come to Wikipedia to attempt to push toward neutrality (or toward “believer” positions) are sanctioned for treating Wikipedia as a battleground, but that is exactly what the skeptics have done, and the Guerrilla Skeptics (consider the name!) create a consistent push with a factional position.

There is increasing evidence of additional off-wiki coordination. It would actually be surprising if it did not exist; it can simply be difficult to detect. But now we have an incident.

On February 24, 2018, I was banned by the Wikimedia Foundation. There was no warning, no explanation, and there is no appeal from a global ban. Why? To my knowledge, I did not violate the Terms of Service in any way. There was, however, at least one claim that I did: an allegation by a user that I had "harassed" him by email. The first of our emails was sent through the WMF servers, so if, in fact, that email was harassment, it would be a TOS violation, though a single violation, unless truly egregious, has never been known to result in a ban. I have published all the emails with that user here.

This much is known, however. One of those who claimed to have complained about me to the WMF posted a list of those complaining on the forum Wikipedia Sucks. It is practically identical to the list I had inferred; it is, then, a convenient list of those who likely libelled me. However, I will be, ah, requesting the information from the Wikimedia Foundation.

Meanwhile, the purpose of this post is to consider the situation with fringe science and an encyclopedia project. First of all, what is fringe science?

The Wikipedia article, no surprise, is massively confused on this.

Description

The term “fringe science” denotes unorthodox scientific theories and models. Persons who create fringe science may have employed the scientific method in their work, but their results are not accepted by the mainstream scientific community. Fringe science may be advocated by a scientist who has some recognition within the larger scientific community, but this is not always the case. Usually the evidence provided by fringe science is accepted only by a minority and is rejected by most experts.[citation needed]

Indeed, citation needed! Evidence is evidence, and is often confused with conclusions. Rejection of evidence is essentially a claim of fraud or reporting error, which is rare for professional scientists, because it can be career suicide. Rather, a scientist may discover an anomaly, an unexplained phenomenon or, more precisely, unexplained results. Then a cause may be hypothesized. If this hypothesis is unexpected within existing scientific knowledge, yet the hypothesis is not yet confirmed independently, it may be "rejected" as premature or even wrong. If there are experts in the relevant field who accept it as possible and worthy of investigation, this then is "possible new science." There may be experts who reject the new analysis, for various reasons, and we will look at a well-known example, "continental drift."

There is no "journal of mainstream opinion," but there are journals considered "mainstream." The term "mainstream" is casually used by many authors without any clear definition. In my own work, I defined "mainstream journals" as journals accepted as such by Dieter Britz, a skeptical electrochemist. As well, the issue of specialty arises. If an electrochemical anomaly is discovered, heat that expert chemists cannot explain through chemistry, what is the relevant field of expertise? Often those who claim a field is "fringe" are referring to the opinions of those who are not expert in the directly relevant field, but whose expertise, perhaps, leads to conclusions that are, on their face, contradicted by evidence gathered with expertise other than their own.

With "cold fusion," named after a hypothesized source for the anomalous heat in the Fleischmann-Pons Heat Effect (also found by many others), it was immediately assumed that the relevant field would be nuclear physics. It was also assumed that if "cold fusion" were real, it would overturn established physical theory. That was a blatant analytical error, because it assumed a specific model of the heat source, a specific mechanism, which was actually contradicted by the experimental evidence, most notably by the "dead graduate student effect." If the FPHE were caused by the direct fusion of two deuterons to form helium without the third of Huizenga's three "miracles" (suppression of the gamma radiation that normally accompanies that branch), it would have generated fatal levels of gamma radiation. The second miracle was the reaction being guided into the very rare helium branch, instead of there being fatal levels of neutron radiation, and the first would be the fusion itself. However, that first miracle would not contradict existing physics, because an unknown form of catalysis may exist, and one is already known: muon-catalyzed fusion.
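
For context, these are the conventional deuteron-deuteron fusion branches. The figures are standard textbook values quoted from memory, so treat them as approximate rather than authoritative:

d + d → tritium + proton (roughly half the time; about 4 MeV shared by the charged particles)
d + d → helium-3 + neutron (roughly half the time; a 2.45 MeV neutron)
d + d → helium-4 + gamma (roughly one in ten million; a 23.8 MeV gamma)

This is why the reported results, heat accompanied by helium but without commensurate neutrons, tritium, or gamma radiation, were described as requiring three "miracles."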

Evidence is not provided by "fringe science." It is provided by ordinary scientific study. In cargo cult science, ordinary thinking is worshipped as if conclusive, without the rigorous application of the scientific method. Real science is always open, no matter how well-established a theory. The existing theory may be incomplete. Ptolemaic astronomy provided a model that was quite good at explaining the motions of planets. Ptolemaic astronomy passed into history when a simpler model was found.

Galileo's observations were rejected because they contradicted certain beliefs. The observations were evidence, and "contradiction" is an interpretation, not evidence in itself. (It is not uncommon for apparently contradictory evidence to be later understood as indicating an underlying reality. But with Galileo, his very observations were rejected, I think; it would be interesting to study this in detail. And if he were lying, it would be a serious moral offense, actually heresy.)

The boundary between fringe science and pseudoscience is disputed. The connotation of “fringe science” is that the enterprise is rational but is unlikely to produce good results for a variety of reasons, including incomplete or contradictory evidence.[7]

The "boundary question" is an aspect of the sociology of science. "Unlikely to produce good results," first of all, creates a bias, where results are classified as "good" or "poor" or "wrong," all of which moves away from evidence to opinion and interpretation. "Contradictory evidence," then, suggests anomalies. "Contradiction" does not exist in nature. With cold fusion, an example is the neutron radiation issue. Theory would predict, for two-deuteron fusion, massive neutron radiation. Pons and Fleischmann did report neutron radiation, but at levels far, far below what would be expected for d-d fusion generating the reported heat, which itself contradicted the d-d fusion theory on theoretical grounds. They were quite aware of this; hence what they actually proposed in their first paper was not "d-d fusion" but an "unknown nuclear reaction." That was largely ignored; so much noise was being made about "fusion" that it was practically a Perfect Storm.

Further, any substantial neutron radiation would be remarkable as a result from an electrochemical experiment. As came out rather rapidly, Pons and Fleischmann had erred. Later work that established an upper limit for neutron radiation was itself defective (the FP heat effect was very difficult to set up, and it was not enough to create an alleged "FP cell" and look for neutrons, because many such cells produce no measurable heat), but it is clear from that later work that neutron generation, if it exists at all, is at extremely low levels, basically irrelevant to the main effect.

Such neutron findings were considered “negative” by Britz. In fact, all experimental findings contribute to knowledge; it became a well-established characteristic of the FP Heat Effect that it does not generate significant high-energy radiation, nor has the heat ever been correlated (across multiple experiments and by multiple independent groups) with any other nuclear product except helium. 
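
To illustrate what "correlated" means here, the sketch below uses entirely invented numbers, not data from any actual experiment: given paired measurements of excess energy and helium across several runs, one computes the correlation and compares the helium-per-joule ratio against the roughly 2.6e11 atoms per joule expected if deuterium were being converted to helium-4 at 23.8 MeV per helium atom.

import numpy as np

# Hypothetical paired measurements (not real data): excess energy in joules
# and helium-4 atoms detected, for several imagined runs.
excess_heat_J = np.array([1.2e3, 3.5e3, 0.0, 7.8e3, 2.1e3])
helium_atoms = np.array([2.9e14, 8.8e14, 1.0e13, 1.9e15, 5.5e14])

# Pearson correlation between heat and helium across runs.
r = np.corrcoef(excess_heat_J, helium_atoms)[0, 1]

# Helium atoms per joule for the runs with excess heat, compared with the
# ~2.6e11 atoms/J expected for deuterium converted to helium-4 at 23.8 MeV
# per helium atom.
expected = 1.0 / (23.8e6 * 1.602e-19)
measured = helium_atoms[excess_heat_J > 0] / excess_heat_J[excess_heat_J > 0]

print(f"correlation r = {r:.3f}")
print(f"measured He/J = {measured.mean():.2e}, expected about {expected:.2e}")

The point of reporting a correlation, rather than isolated detections, is that systematic errors in heat measurement and in helium measurement would not be expected to track each other across independent experiments.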

The term may be considered pejorative. For example, Lyell D. Henry Jr. wrote that, “fringe science [is] a term also suggesting kookiness.”[8] This characterization is perhaps inspired by the eccentric behavior of many researchers of the kind known colloquially (and with considerable historical precedent) as mad scientists.[9]

The term does suggest that. The looseness of the definition allows inclusion of many different findings and claims, which do include isolated and idiosyncratic ideas of so-called “mad scientists.” This is all pop science, complicated by the fact that some scientists age and suffer from forms of dementia. However, some highly successful scientists also move into a disregard of popular opinion, which can create an impression of “kookiness,” which is, after all, popular judgment and not objective. They may be willing to consider ideas rejected for social reasons by others.

Although most fringe science is rejected, the scientific community has come to accept some portions of it.[10] One example of such is plate tectonics, an idea which had its origin in the fringe science of continental drift and was rejected for decades.[11]

There are missing, crucial details. Rejected by whom, and when? The present tense is used, and this is common with the anti-fringe faction on Wikipedia. If something was rejected by some or by many, that condition is assumed to continue and is reported in the present tense, as if it were a continuing fact, when an author cannot do more than express an opinion about the future. Now, plate tectonics is mentioned. "Continental drift" is called "fringe science," even after it became widely accepted.

Wegener's proposal of continental drift is a fascinating example. The Wikipedia article on it does not mention "fringe science," and that article is quite good, it seems to me. One particular snippet is of high interest:

David Attenborough, who attended university in the second half of the 1940s, recounted an incident illustrating its lack of acceptance then: “I once asked one of my lecturers why he was not talking to us about continental drift and I was told, sneeringly, that if I could prove there was a force that could move continents, then he might think about it. The idea was moonshine, I was informed.”[47]

As late as 1953 – just five years before Carey[48] introduced the theory of plate tectonics – the theory of continental drift was rejected by the physicist Scheidegger on the following grounds.[49]

That rejection was essentially pseudoskeptical and pseudoscientific. There was observation (experimental evidence) suggesting drift. The lack of explanatory theory is not evidence of anything other than possible ignorance. "Absence of evidence is not evidence of absence."

The fact is that the continental drift hypothesis, as an explanation for the map appearance and fossil record, was not generally accepted. What shifted opinion was the appearance of a plausible theory. Worthy of note is how strongly held the opinion of "impossible" was, such that "proof" was demanded. This is a sign of a fixed mind, not open to new ideas. The history of science is a long story of developing methods to overcome prejudice like that. This is a struggle between established belief and actual fact. Experimental evidence is fact. Such and such was observed, such and such was measured. These are truths, the best we have. It can turn out that recorded data was the result of artifact, and some records are incorrect, but that is relatively rare. Scientists are trained to record data accurately and to report it neutrally. Sometimes they fail; they are human. But science has the potential to grow beyond present limitations because of this habit.

Anomalies, observations that are not understood within existing scientific models, are indications that existing models are incomplete. Rejecting new data or analyses because they don't fit existing models is circular. Rather, a far better understanding is that the evidence for a new idea has not risen to a level of detail, including controlled tests, sufficient to overcome standing ideas. Science, as a whole, properly remains agnostic. Proof is for math, not the rest of science. This does not require acceptance of new ideas until one is convinced by the preponderance of evidence. Pseudoskeptics often demand "proof." "Extraordinary claims require extraordinary evidence." Yes, but what does that actually mean? What if there is "ordinary evidence"? What is the definition of an "extraordinary claim," such that ordinary evidence is to be disregarded?

It's subjective. It means nothing other than "surprising to me," or to "us," often defined to exclude anyone with a contrary opinion. For Wikipedia, a peer-reviewed secondary source in a clearly mainstream journal is rejected because the author is allegedly a "believer." That is editorial opinion, clearly not neutral. Back to the fringe science article:

The confusion between science and pseudoscience, between honest scientific error and genuine scientific discovery, is not new, and it is a permanent feature of the scientific landscape …. Acceptance of new science can come slowly.[12]

This was presented by formatting as a quotation, but was not attributed in the text. It should be "According to Michael W. Friedlander," in his book on the topic, At the Fringes of Science (1995). He is very clear: there is no clear demarcation between "science" and "fringe science."

Friedlander does cover cold fusion, to some degree. He hedges his comments. On page 1, “… after months of independent, costly, and exhaustive checks by hundreds of scientist around the world, the excitement over cold fusion cooled off, and the claim is probably destined to take its place alongside monopoles, N-rays, polywater, and other fly-by-night “discoveries” that flash across our scientific skies to end up as part of our folklore.”

He hedged with "probably." On what evidence was he basing that assessment? Cold fusion was not actually his primary investigation. On pp. 27-34, he reports the early days of the cold fusion fiasco (with some errors), and doesn't report on what came later. He doesn't mention the later confirmations of the heat effect, nor the discovery of a nuclear product, published in 1993 in a mainstream journal (though announced in 1991; Huizenga covered it in 1993). He does not distinguish between the "fusion theory" and the actual report of anomalous heat by experts in heat measurement, not to mention the later discovery of a correlated nuclear product. He closes that section with:

To summarize briefly, the cold fusion "discovery" will surely be remembered as a striking example of how science should not be done. Taubes has compared "many of the proponents of cold fusion" to Blaise Pascal, the seventeenth century scientist who "renounced a life of science for one of faith." [Bad Science (1993), 92] The whole episode certainly illustrates the practical difficulty in implementing an innocuous-sounding "replication" and points to the need for full and open disclosure if there are to be meaningful tests and checks. It has also exposed some unfortunate professional sensitivities, jealousies, and resentments. At least to date, the exercise appears to be devoid of redeeming scientific value — but perhaps something may yet turn up as the few holdouts tenaciously pursue a theory as evasive as the Cheshire cat.

I agree with much of this, excepting his ignorance of results in the field and his idea that what was to be pursued was a "theory." No, what was needed was clear confirmation of the heat anomaly, then confirmation of the direct evidence that it was nuclear in nature (correlated helium!), and then far more intensive study of the effect itself, its conditions and other correlates; only then would a viable theory become likely.

Cold fusion was the "Scientific Fiasco of the Century" (Huizenga, 1992). It looks like Friedlander did not look at the second edition of Huizenga's book, where he pointed to the amazing discovery of correlated helium. There was a problem in cold fusion research: there were many "confirmations" of the heat effect, but they were mostly not exact replications. Much of the rush to confirm — or disconfirm — was premature and focused on what was not present: "expected" nuclear products, i.e., neutrons. Tritium was confirmed, but at very low levels and not correlated with heat (often the tritium studies were of cells where heat was not measured).

Nobody sane would argue that fringe claims should be “believed” without evidence, and where each individual draws the line on what level of evidence is necessary is a personal choice. It is offensive, however, when those who support a fringe claim are attacked and belittled and sometimes hounded. If fringe claims are to be rejected ipso facto, i.e., because they are considered fringe, the possibility of growth in scientific understanding is suppressed. This will be true even if most fringe claims ultimately disappear. Ordinary evidence showing some anomaly is just that, showing an anomaly. By definition, an anomaly indicates something is not understood.

With cold fusion, evidence for a heat anomaly accumulated, and because the conditions required to create the anomaly were very poorly understood, a "negative confirmation" was largely meaningless, indicating only that whatever approach was used did not generate the claimed effect. It could have been understood that the claimed effect was not "fusion," but anomalous heat. If the claim had been understood that way by those spending millions of dollars per month, as the U.S. DoE frantically did in 1989 to test it, and if time had been allowed for confirmation to appear, that money might not have been wasted.

As it is, Bayesian analysis of the major "negative confirmations" shows that, with what became known later, those experiments could be strongly predicted to fail; they simply did not set up the conditions that became known as necessary. This was the result of a rush to judgment: pressure was put on the DoE to come up with quick answers, perhaps because the billion-dollar-per-year hot fusion effort was, it was thought, being threatened, with heavy political implications. Think of a billion dollars per year no longer being available for salaries for, say, plasma physicists.
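
To make the Bayesian point concrete, here is a toy calculation with invented probabilities, not numbers derived from any actual replication attempt. If the heat effect appears only under conditions (such as high deuterium loading) that an early replication probably never achieved, then a null result barely shifts the odds:

# Toy Bayesian update with invented numbers, illustrating why a null
# replication that probably never met the required conditions says little.

p_real = 0.5                # prior probability that the heat effect is real
p_conditions_met = 0.1      # assumed chance this replication achieved the
                            # necessary conditions (e.g., deuterium loading)
p_heat_if_real_met = 0.8    # chance of seeing heat if real and conditions met
p_false_positive = 0.01     # chance of apparent heat if the effect is not real

# Probability of a null result under each hypothesis.
p_null_given_real = 1 - p_conditions_met * p_heat_if_real_met   # = 0.92
p_null_given_not_real = 1 - p_false_positive                    # = 0.99

# Posterior probability that the effect is real after one null replication.
posterior = (p_null_given_real * p_real) / (
    p_null_given_real * p_real + p_null_given_not_real * (1 - p_real))

print(f"P(real | null result) = {posterior:.3f}")   # about 0.482

With these made-up numbers, one null replication moves the probability from 0.50 to only about 0.48; it would take many null results, under conditions known to be adequate, to count as strong evidence against the effect.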

However, though they were widely thought to have “rejected” cold fusion, the reality is that both U.S. DoE reviews were aware of the existence of evidence supporting the heat effect and its nuclear nature, and recommended further research to resolve open questions; in 2004, the 18-member panel was evenly divided on the heat question, with half considering the evidence to be conclusive and half not. Then on the issue of a nuclear origin, a third considered the evidence for a nuclear effect to be “conclusive or somewhat conclusive.”

The heat question has nothing to do with nuclear theory, but it is clear that some panel members rejected the heat evidence because of theory. The most recent major scientific work on cold fusion describes itself as a study of the Anomalous Heat Effect, and that group is working on improving the precision of heat and helium measurements.

If one does not accept the heat results, there would be no reason to accept nuclear evidence! So it is clear from the 2004 DoE review that cold fusion was, by then, moving into the mainstream, even though there was still rampant skepticism.

The rejection of cold fusion became an entrenched idea, an information cascade that, as is normal for such cascades, perpetuates itself, as scientists and others assume that what "everyone thinks" must be true.

In mainstream journals, publication of papers and, more significantly, of reviews that accept the reality of the effect began increasing around 2005. There are no negative reviews that were more than a passing mention. What is missing is reviews in certain major journals that essentially promised, over a quarter-century ago, not to publish on the topic.

One of the difficulties is that the basic research showing, by a preponderance of the evidence, that the effect is real and nuclear in nature was all done more than a decade ago. It is old news, even though it was not widely reported. Hence my proposal, beginning quite a few years ago, has been for replication of that work with increased precision, which tests the classic marker of "pathological science": will the correlation decline or disappear with increased precision?

This is exactly the work that a genuine skeptic would want to see.

I have often written that genuine skepticism is essential to science. As well, those who will give new ideas or reported anomalies enough credence to support testing are also essential. Some of them will be accused of being “believers” or “proponents,” or even “diehards.”

The mainstream needs the fringes to be alive, in order to breathe and grow.

Diehard believers have hope, especially if they also trust reality. Diehard skeptics are simply dying.

(More accurately, “diehard skeptic” is an oxymoron. Such a person is a pseudoskeptic, a negative believer.)

Racialism and racism

Huffington Post, 2015: Racism and Racialism Are Different

Wikipedia, the lede for Racialism:

Racialism is the belief that the human species is naturally divided into races, that are ostensibly distinct biological categories. Most dictionaries define the term racialism as synonymous with racism.[1]

This is fascinating. The meaning of a term, and its synonyms, depend on context. This inspired me to write this page, because I find the statement about "most dictionaries" to be suspect. The reference is to a book by Chester L. Quarles (2004), Christian Identity: The Aryan American Bloodline Religion. McFarland. I have often pointed out that references in the lede are a sign of POV conflict on an article. Whatever is in the lede should clearly be established in the article itself, and no references are needed. But an editor pushing to include something in the lede will cite a reference, and claim that it's sourced and should not be removed.

What is in the source? The reference does not give a page number. Naughty, naughty! However, the book can be searched for "racialist synonymous with racist." It is mostly talking about "identity groups," which may be race-based, or based on other factors. On p. 67, after discussing distinctions between racism and racialism, it has:

In most dictionaries, the terms “racist” and “racialist” are pretty much synonymous. While racism is described as a discriminatory practice, racialism is described as “a doctrine or teaching without scientific support that claims to find racial differences in character, intelligence, etc., that asserts the superiority of one race over another, or others, and that seeks to maintain the purity of the race, or races.” It is clear, however, that right-wing purists separate these terms in their publications.

To purists, racialism is simply a racial preference and a racialist is merely one who wishes to associate primarily with members of his own race. The racist, on the other hand, is often a hater, with severe prejudices and extreme bias against other racial groups.

Notice how the lede, which should be rigorously neutral, has collapsed "pretty much synonymous" into "synonymous." Further, the purpose of the author here is not to establish these as synonyms; he is actually distinguishing the words, but … what he ascribes to "right-wing purists" is arguably racist, and not intrinsic to racialism as it was long understood. Racialism does not necessarily involve a racial preference, but those with a racial preference may certainly be racialist as distinct from racist, as they define the terms. "Racist" is a term of high opprobrium, and those who might think that race is a biological reality — which is the origin of racialism, not the claims about "racial differences" in themselves — are understandably irate when called "racist," without the hatred that they associate with the term.

I have an African daughter, and encountered racism in a surprising place, but, in fact, it was a racialist expectation, subtle, and those afflicted with it would have been horrified to be called racist. But they were racist, in that they had different expectations of my daughter than they’d have had if she had been “white,” and the result was discrimination against her.

The collapse of racialism into racism makes it much more difficult to address. Racism is to be disempowered, but the transformative response to racialism is education and exposure to diversity. If people wish to associate with those they believe are "their own kind," that is properly within their freedom of choice, with limitations as required for fairness. I.e., if one's job is public service, that duty trumps freedom-of-association rights.

Coercive education will be resisted, prolonging conflict and maintaining polarization. The human spirit insists on freedom.

So what about dictionaries? That was an off-hand comment, not a survey. What is actually in dictionaries?

Merriam-Webster online:

Definition of racialism
a theory that race determines human traits and capacities; also racism

That is, in some contexts, “racialism” might be a synonym for “racism.” But the words are defined differently.

Definition of racism
1 : a belief that race is the primary determinant of human traits and capacities and that racial differences produce an inherent superiority of a particular race
2 a : a doctrine or political program based on the assumption of racism and designed to execute its principles
b : a political or social system founded on racism
3 : racial prejudice or discrimination

There is a fascinating note on the history of the word, which, like racialism, apparently was not used before 1900. The note goes on to point out the foolishness of a certain use of dictionaries, quite like the use made in the Wikipedia lede:

Dictionaries are often treated as the final arbiter in arguments over a word’s meaning, but they are not always well suited for settling disputes. The lexicographer’s role is to explain how words are (or have been) actually used, not how some may feel that they should be used, and they say nothing about the intrinsic nature of the thing named by a word, much less the significance it may have for individuals. When discussing concepts like racism, therefore, it is prudent to recognize that quoting from a dictionary is unlikely to either mollify or persuade the person with whom one is arguing.

The Wikipedia lede, then, focuses on the word, not the thing or concept the word points to. And what that is varies with the speaker or writer.

We need to know what “race” is, as well:

Definition of race
1 : a breeding stock of animals
2 a : a family, tribe, people, or nation belonging to the same stock
b : a class or kind of people unified by shared interests, habits, or characteristics
3 a : an actually or potentially interbreeding group within a species; also : a taxonomic category (such as a subspecies) representing such a group
b : breed
c : a category of humankind that shares certain distinctive physical traits
4 obsolete : inherited temperament or disposition

“Stock” refers to genetic similarity.

Something is obvious to me here. Back to the definition of racialism: "a theory that race determines human traits and capacities." "Race" refers to genetics, then, "stock." It is obvious that structures that support, say, intelligence, require genetic provision of the mechanisms. The idea that behavior is primarily determined by genetics is called "hereditarian," and I suspect that racialists are generically hereditarian. But as an extreme position, it's preposterous. The opposite position is "environmentalism," which, with intelligence, is presented as the idea that environmental conditions (which would include education) determine the traits. It is obvious that environment also has a strong effect, and so the scientific question would not be A or B, but how much A and how much B.
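
In statistical terms, "how much A and how much B" is a variance-decomposition question. The sketch below is a toy simulation with arbitrary, made-up weights, not an estimate of actual heritability; it only shows the form the question takes:

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Invented model: a trait built from a "genetic" component, an "environmental"
# component, and unexplained noise. The weights are arbitrary illustrations,
# not empirical estimates of anything.
genes = rng.normal(size=n)
environment = rng.normal(size=n)
noise = rng.normal(size=n)
trait = 0.6 * genes + 0.7 * environment + 0.4 * noise

# Share of trait variance attributable to each independent component.
total_var = trait.var()
print(f"genetic share:       {0.6 ** 2 / total_var:.2f}")
print(f"environmental share: {0.7 ** 2 / total_var:.2f}")
print(f"residual share:      {0.4 ** 2 / total_var:.2f}")

Real behavioral-genetics work estimates those shares from twin, adoption, or genomic data rather than assuming them, and the estimates themselves are contested; the sketch only illustrates why the sensible question is a proportion, not a binary.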

Back to the Wikipedia racialism article, which quotes W. E. B. Du Bois:

In 1903, W. E. B. Du Bois said that racialism is the philosophical position that races existed, and that collective differences existed among such categories, the races.[citation needed] He further stated that racism required advancing the argument that one race is superior to other races of human beings. In In My Father’s House (1992), Kwame Anthony Appiah summarized Du Bois’s philosophical stance that racialism is value-neutral term and that racism is a value-charged term.

Since the word racialism is only found from 1902, it's clear that W. E. B. Du Bois was making a clear distinction. The idea that race is a biological reality is not intrinsically racist. I know of no other word for the relatively value-free meaning, so what I'm seeing may be an example of linguistic fascism, the systematic deprivation of some class of people of language that could be used to describe what they think. Instead, it is all collapsed into racism.

Today, some anthropologists and geneticists point to studies that suggest racialist beliefs are both compatible and incompatible with modern population genetics.[clarification needed][2][3][4][5][6][7][8]

First of all, "skeptics" will focus on "beliefs." Racialism is an idea, a concept, and there can then be racialist beliefs, but it is not, in itself, a belief. The Wikipedia sentence is amazing for an encyclopedia, totally confusing and unclear. What do the sources say? And what is the source for "both compatible and incompatible," or was that synthesis from someone too lazy to actually cover what is in the sources?

As usual, Wikipedia ontology can be astonishingly primitive. In any case, this edit created the unclear language. The prior language was:

Today, some anthropologists and geneticists point to studies that suggest racialist beliefs are incompatible with modern population genetics.[clarification needed][2][3][4][5][6][7][8]

The editor was, in fact, pointing to an alleged case of sources not supporting the article claim. Without having researched those sources myself, I will say that this is what happens when editors have a belief: they see it in the sources! The edit summary was:

(Definitions and differences: none of the links say they are incompatible, nor that racial categories or labels for populations are unsupported)

True? Who cares? This is Wikipedia, where reliability is an unfunded mandate; nobody is responsible.

Without putting too fine a point on it: (1) race as a biological determinant of behavior is pretty much dead in the water, scientifically; it has become fringe. "Race" is not well-defined except as a social reality, i.e., what people think and how that affects many environmental conditions. It is likely that human genetics are sufficiently similar, overall and on average, that environmental differences loom far larger. (2) There remains debate on the issue, and there is research being done. The sources cited generally supported the original text, but this had been handled very sloppily; it was weasel text, not specific, using a pile of sources to make a point about "beliefs," which isn't what most of those sources were about. (3) Because identification of "race" depends on visible markers, it seems to have a reality, but studying how much of that is real and how much is mere appearance is difficult.

Judgment on this is, at present, highly political, which is damaging to science. Calling a hereditarian researcher "racist" does not promote the development of scientific knowledge and consensus; it hinders it.

Hatewatch

If you are reading this on an archive site, be sure to check the original URL for completion, updates, comments and corrections.

A post appeared on Hatewatch, a discussion forum for the Southern Poverty Law Center (an organization which I generally support): Wikipedia wars: inside the fight against far-right editors, vandals and sock puppets.

The post was decent, but seemed to assume that far-right sock puppets were a serious problem on Wikipedia, and it cited RationalWiki, which was unfortunate, since RationalWiki routinely defames with misrepresentations and sometimes downright lies, and has no reliable content review process. (Nor does Wikipedia, but Wikipedia hews far closer to neutrality than RW, which is explicitly and openly biased; site policy is SPOV, Snarky Point of View.)

A "Bill Connors" appeared, promoting RationalWiki. No surprise, because this person is very likely the creator of the articles being cited. (There is an alternate possibility of his twin brother; both are de facto banned since 2011 on Wikipedia. The particular topic here is of Oliver D. Smith interest, but there can be some crossover.)

Characteristic of "Anglo Pyramidologist socks" (either one of the brothers) is an account that appears out of nowhere, with very high interest, highly partisan and inflammatory, and with recognizable arguments (ones that have been repeated in many places, often irrelevant to the topic, and purely or largely ad hominem). This commenter showed all these traits. His Disqus account.

For comparison, my Disqus account.

I responded to a number of the Bill Connors comments on the blog. However, this is distracting from the purpose of the page, so I am creating this as a followup, to avoid adding more noise to that blog. These are comments from Connors (or possibly others) which I am choosing to answer here instead of there. And then I comment in indented italics.

Note Abd Lomax is an internet harasser who has defended Rightpedia and alt-right individuals https://rationalwiki.org/wi…

This is classic AP. I have not "defended Rightpedia," which is a disgusting pile of far-right trash, as far as I've seen. The RationalWiki article on me was written as retaliation for my exposure of AP impersonation socking on Wikipedia, and contains many libels based on taking a fact (I mentioned Rightpedia) and distorting it (I stated that a particular article on Rightpedia, on one of the AP brothers, Oliver D. Smith, very likely "Bill Connors," appeared to be correct in its factual claims). This is hardly "defending Rightpedia."

Some — by no means all — of the individuals attacked by AP socks are or are claimed to be “alt-right.” If I expose the deceptions and impersonations and misrepresentations of sources presented by AP socks (often on RationalWiki), this is not “defending” those individuals, it is defending the truth and the common welfare.

Abd Ul-Rahman Lomax has also defended neo-Nazi Emil O. W. Kirkegaard. https://rationalwiki.org/wi…

“Lomax claims to be a free speech advocate and protector of civil liberties. He has written a series of blog posts defending the far-right child-rape apologist Emil O. W. Kirkegaard and other alt-right activists”

Yes. I was, as an undergraduate at Cal Tech, well over fifty years ago, an officer in the campus chapter of the American Civil Liberties Union. I am a strong supporter of free speech, but not libel under the mask of anonymity. People may make anonymous statements, but it is essential that these not be given the same weight as statements made by people willing to stand up and be counted, with real reputations at stake.

What we see in the "Bill Connors" comments is an anonymous hater attacking real people with real names and real reputations. Recently, Oliver D. Smith came out of the closet and acknowledged he had created RationalWiki articles (including those on Emil Kirkegaard and John Fuerst). I congratulated him on that, on being open, but then he more or less disappeared and has been active with new names.

“Child-rape apologist” is a favorite claim of ODS about Kirkegaard. It is a drastic misrepresentation of something Kirkegaard wrote many years ago. This is a great example of how to abuse sources, picking a phrase out of a paragraph that is actually contradicted by the rest of the source. Sloppy news media, in January, quoted this conclusion as if fact. Smith knows that the claim will elicit immediate reactions of horror, and those reactions can disable the critical thinking that RationalWiki purports to promote.

Kirkegaard denies being “far-right,” and I have seen nothing that indicates he is. However, he is a researcher with a hereditarian focus on “intelligence” in an environment where the popular and more mainstream view is more environmentalist. Both positions, taken as extremes, are unrealistic. Racists and the alt-right might like some of Kirkegaard’s research because it can seem to confirm “racialism,” which then shades into “racist” when emotional content is added, but that does not make him a racist. All of this actually confuses the scientific issues. My own view of Kirkegaard’s research is that it is not well-founded, but such research is properly assessed through normal academic process, not through political polemic.

https://rationalwiki.org/wi… for information about the banned wikipedia troll Abd Ul-Rahman Lomax.

This was irrelevant in context, simply spamming the RW article on me.

Lomax you are a bad liar. Mikemikev confirmed the socks belong to him on his racist Gab account. Mikemikev has written offensive hit-piece articles about anti-fascists on Rightpedia and you have defended the neo-nazi Emil Kirkegaard.

I love stuff where we can actually look at evidence, and that's most of what I do with my time: compile evidence. What lie?

Here, Bill Connors had written:

Mikemikev the Rightpedia cofounder has confirmed the socks talked about in this article belong to him https://gab.ai/Mikemikev/po… and he seems to find the whole situation funny.

I responded:

 

[…] Mikemikev did not “confirm the socks belong to him.”

So what does the source show?

Michael Coombs @Mikemikev

http://www.splcenter.org/hatewatch/2018/03/12/wikipedia-wars-inside-fight-against-far-right-editors-vandals-and-sock-puppets
http://en.wikipedia.org/wiki/Wikipedia:Sockpuppet_…
Their source
http://en.rightpedia.info/w/Magnus_Pharao_Hansen

The Wikipedia link is to the SPI casepage on Mikemikev, so this would be, for him, "my socks." Were he more careful, he might have inserted "alleged." But he DGAF (that part was reasonable; he likely does find this "funny"). What I notice is that many of those socks were not Mikemikev; that is obvious from the individual reports, so he is definitely not "confirming the socks belong to him." As well, Mikemikev, like many other AP targets, has been impersonated. Mikemikev gives as the "source" a person whom he names, in the Rightpedia article, as Wikipedia user Maunus. Notice the first report, by Maunus. The finding: "Unrelated." (But Mikemikev's comment ascribing all this to Maunus was unfair.)

The article had this text on Mikemikev socks:

One of the white nationalists who co-founded Rightpedia, a far-right free encyclopedia that split from Metapedia, created more than 140 accounts in the past 10 years.

That sounds like a lot, though it would only be a little more than one sock per month, and these accounts tend to make few edits before being discovered, so this does not actually bear on the problem of factional bias. This was an obvious reference to Mikemikev, and it was not sourced. However, Bill Connors supplied this:

The cofounder of Rightpedia is neo-Nazi Michael Coombs who users the name Mikemikev, he writes hit-piece articles about anti-fascists on Rightpedia. On Wikipedia he has 143 suspected socks https://en.wikipedia.org/wi…

I reviewed the issue of Mikemikev socks on Wikipedia; the review is found on this subpage. My conclusions: Mikemikev has nowhere near that number of actual socks on Wikipedia. The actual number is unclear, because Wikipedia sock puppet investigations are erratic and a systematic error can be created by impersonations or other misidentification, and there have been impersonations. It seems that nothing recent is reasonably clearly Mikemikev. At least two tagged accounts were far more likely to be Oliver D. Smith (Anglo Pyramidologist on Wikipedia) socks. That is very likely Bill Connors. Haters hate.

Overall conclusion: Bill Connors lied about the alleged "confirmation" that the socks in the article belong to Mikemikev. Mikemikev certainly did not confirm this. Oliver Smith commonly uses accidents of language, the multiple interpretations possible for words, to make definitive claims when deeper examination shows this was not the intended meaning. And then he claimed I was lying, on top of that.

There is one person with an obsession about Mikemikev and me and others, who behaves this way: Oliver D. Smith, with some minor possibility that this is his brother Darryl L. Smith. He will claim that all this is a "defense of Michael Coombs," but it is not. It is a defense of truthfulness and accuracy and rationality.

 

Note Abd Lomax is an internet harasser who has defended Rightpedia and alt-right individuals https://rationalwiki.org/wi..

Pure ad hominem argument, citing the article he or his brother wrote on me on RationalWiki, in a highly misleading way. This was in response to a comment about the alleged Wikipedia socking of Eleonora Dubcziki. That was not her; it was, with high probability, Oliver D. Smith. I did not "defend Rightpedia," and merely examined claims about certain alleged alt-right individuals. My position is that attacking allegedly reprehensible persons with deception and misinformation is itself reprehensible, and sometimes even more so than the original "offenses."

  

Abd Lomax https://meta.wikimedia.org/… has been globally banned from the Wikimedia Foundation for harassment

 

Abd Ul-Rahman Lomax is on the global ban list https://meta.wikimedia.org/… he was globally banned for doxing and harassing users. He is the 27th person to be globally banned by the WMF.

Again, pure ad hominem argument, and highly misleading. Yes, I was “office banned” in February of this year. The reason, however, is obscure. There was no warning, no request to cease and desist anything, and no explanation. A star-chamber process, with unclear purpose. What did it prevent? I had filed a series of successful requests for checkuser on the meta wiki. And then something strangely shifted. Something was going on behind the scenes.

However, what is very likely from context is that there were a number of private complaints claiming harassment. There was no harassment — unless documenting sock puppetry is harassment, which it is not when the reports are reasonable, even if wrong, and, most importantly, when there is no use of WMF facilities to harass. Because the ban is being used to defame me, as here, I am working on a legal appeal process. Otherwise it hardly matters, since I wasn’t using the WMF for anything other than protecting users and resources from disruption by “Anglo Pyramidologist,” who has never been formally banned, though many socks have been globally locked for Long Term Abuse; I was documenting that and was attacked for it. Anglo Pyramidologist, the evidence is overwhelming, is Oliver D. and Darryl L. Smith. That’s “doxxing” to them and to RationalWiki, though they do it routinely on RationalWiki and elsewhere.

Notice that abundant claims about me, a person who uses a real name, with a real reputation, are being made by “Bill Connors” here. So if doxxing is bad, he’s been doing it for years with many people. (I only stumbled across some impersonation socks a few months ago. And when I simply began collecting the evidence, I was attacked. Which gave me a clue that I was onto something.)

 

https://rationalwiki.org/wi… “Lomax claims to be a free speech advocate and protector of civil liberties. He has written a series of blog posts defending the far-right child-rape apologist Emil O. W. Kirkegaard and other alt-right activists”.

Once again, Connors is citing himself. The article is heavily misleading. Let’s look at one claim here: “the far-right child rape apologist Emil O. W. Kirkegaard.”

Kirkegaard is an academic, or at least his approach is academic. I think he would agree that he is a racialist and, on the matter of intelligence, a hereditarian. That does not make him “far-right,” though it might endear him to some who are far right. “Far” would, here, be hyperbole. If you want to see something that looks far-right, look at Rightpedia, which Connors claims I defend. I absolutely do not, but I’m not going over every detail. A liar can toss mud with a few words that then take many words to answer, unless one just wants to say: “Liar!” And what do we think of people who cry “Liar” without providing evidence? Or, say, “fake news!” And who says that phrase over and over?

Is there evidence that Kirkegaard is a “child rape apologist”? That’s highly defamatory, designed that way, in fact.

Yes. There is evidence for the claim, if you take a comment Kirkegaard made in 2012 out of context, ignoring the rest of the page. I like verifiable evidence, but the problem is that some people — and apparently some newspaper reporters — don’t read carefully. They see a claim on RationalWiki that Kirkegaard is a “child rape apologist,” and there is a link. They follow the link, and yes, he wrote what was quoted. They don’t bother to read the rest of it! And so Oliver Smith got his defamation repeated in mainstream media, and then he cited the media as proof it was true. This is the Kirkegaard comment, from 2012.

If anyone can read the whole comment and still think that Kirkegaard is a “child-rape apologist,” well, congratulations on the ability to maintain an opinion in the face of clear contrary evidence.

There is a little more. That was in 2012. Here is what Oliver D. Smith was writing in 2012, as Atlantid:

He has acknowledged being Atlantid. Blatantly racist. Is this unfair? It could be. He changed his mind; see this farewell to Metapedia (which was definitely right-wing). (He did apparently lie, later, just below, about his connection to RationalWiki, but I haven’t checked the exact dates, and some disruption that has been attributed to Oliver Smith was actually — more likely — Darryl.)

The point is that to characterize someone prominently based on an obscure comment they made many years ago is … not fair! In 2012, Darryl was 22, and he has acknowledged a diagnosis of schizophrenia. From my direct email contact with him, he is still unstable; his sanity seems to come and go fairly rapidly. There is correspondence with him reproduced here, and what starts out as reasonably calm and sane, as in the first part of that farewell, becomes highly reactive and blaming. “Lies!!!” Notice the mention of Mikemikev, a long-term obsession of his.

 

https://rationalwiki.org/wi… Liar. You are not a sysop on Rationalwiki you have been trying to vandalize Rationalwiki for years because they exposed your internet harassment and racism.

This is easy. To convince communities and administrators, the Smiths rely on most people not checking sources or investigating independently. RationalWiki allows any sysop to remove the sysop privilege, and it gives the privilege relatively easily. My RationalWiki user rights log:

I did not claim to be a sysop; I claimed that I had been a sysop. The log shows that the right was first given in February 2012. That went back and forth, as these things do when someone is not a hard-core RationalWikian but a neutral skeptic — with some opinions, or wanting to present unpopular evidence. David Gerard removed the right in August 2012. Gerard was then still only a tech, not yet a moderator; he was elected in 2013. There is a story there…. Two weeks later, the right was restored by Blue, who was a moderator. (Moderators have the power to prevent a user from being “demoted,” as they call it, to sysop.) The removal reason shows how RW used to operate: no “cooping,” i.e., no discussion of the removal of rights. Later, Gerard had enough power to ignore that, and he removed the right on 9 October 2017, after a complaint by Anglo Pyramidologist socks. So I had the rights for over five years.

However, my contributions and the history of my RW user page show the situation. The page was vandalized in 2012 by Occasional use. I indicated an intention to avoid RW; this edit shows the summary.

The harassment that appeared in October was preceded by my first edit in years. That edit told what had preceded the creation of the article on me, which was created 5 October 2017 by an obvious AP sock, almost certainly Darryl L. Smith, with this and other actions carrying out threats to retaliate for documenting Wikipedia socking. This pattern of threats and libel and impersonation — which appeared in strength after I was blocked on RW — is a widely known behavior of AP socks. Oliver D. Smith, who was, from interests, probably the RW sysop who later blocked me (Skeptical), claimed in an email to me that most of the socks were his twin brother, Darryl. Sometime around this point, I suspect, Oliver became involved in the attack on me on Wikipedia (where I had not edited since 2011) and Wikiversity (where I had been mostly inactive for two years, but returned to deal with impersonation socks and attacks on Wikiversity and a Wikiversity user).

Those RW attack socks have names characteristic of socks created by AP, probably Darryl, but there is crossover with the brothers. Oliver admitted creating two articles that were started with an impersonation name, and there is circumstantial evidence that Oliver created the lulu book impersonations.

(It is possible it was his brother. There are no other reasonable suspects. Bill Connors appears there, claiming to be an RW sysop. Yes, he is, through several accounts. The photoshopped image with my face on an obese, almost-naked man was definitely first published by Oliver D. Smith; that evidence is very strong. There is a worse image (the man is completely naked) on a book pretending to be by Rome Viharo, but like all the impersonation books so far, the content is from RationalWiki. The image is of Tim Farley. Why him? Well, because Viharo suspects him of behind-the-scenes involvement with the AP socking. I am unconvinced.)

This is a common AP tactic: using impersonations to present the imagined arguments of his targets as extreme parodies, and doing this especially with something that might have some truth to it. It creates an a priori “paranoid conspiracy theory” impression. I’ve seen it work. On RationalWiki all the material relating to the AP socking has been deleted, some of it revision-deleted so that even sysops cannot see it. Yet anyone else socking in other places is documented in articles. It’s a fascinating public relations technique, one that can work for a time.

 

Abd Ul-Rahman Lomax you have been globally banned by the Wikimedia Foundation. Its funny when you call other people socks. You have also defended the neo-Nazi Emil Kirkegaard. You have 0 respectability.

Again, repetitive and irrelevant. It would imply that I am massively socking, which I have not done. This is coming from a person who has about 200 tagged socks on Wikipedia, and who has claimed to have more active accounts. He definitely has active accounts on RationalWiki, see rationalwiki/anglo-pyramidologist/list-of-articles/

(With the retirement of ODS, the current Oliver account appears to be MrsBlintz, and Debunking spiritualism has returned to strenuous activity.)

As to Kirkegaard, I have defended the truth. A judgment of Kirkegaard as a “neo-Nazi” is not supported by anything I’ve seen. However, he’s been accused of being a “child rape apologist,” a claim radically unsupported by the evidence, which has been cherry-picked and presented out of context; the actual context would lead to very different conclusions. Technically, yes, this is defending him from false claims. Is that reprehensible? Apparently, Bill Connors would have it be so. Who cares about the truth? Lynch the lying rapists!!!

 

https://rationalwiki.org/wi… “Lomax takes issue with the Southern Poverty Law Center which he claims is “highly political”. He criticizes their report on Richard Lynn, claiming “this hit piece is simply hitting on stereotypes about racism and sexism, knee-jerk expectations”. On his blog, Lomax links to the neo-Nazi encyclopedia Rightpedia as a “valid” source of information”.

This is a clear example of divisive AP tactics. The RatWiki article sources the claim about the SPLC to a page where I examined Oliver D. Smith’s claim, made in a short-lived blog about Kirkegaard, that the information was “well-sourced.” In fact, the media cited got their leads from RationalWiki or from Oliver Smith himself, and apparently did not carefully check. So I looked at the sources claimed by Oliver, hence that page.

These pages are generally study notes, finding sources and summarizing or analyzing them. This writing is not generally polemic (i.e., succinct, sound-bite) and is not designed to rabble-rouse. The issue I was looking at was attendees at the London Conference on Intelligence. Richard Lynn was listed, and I wrote about him (with comments now inserted):

Richard Lynn (Wikipedia article). A link is given to a web site about Lynn: https://www.splcenter.org/fighting-hate/extremist-files/individual/richard-lynn That is the Southern Poverty Law Center, which is highly political.

This was a comment about Wikipedia reliable-source policy. “Highly political” is not an attack on the SPLC, which is particularly valuable for its political work! The SPLC would not be a reliable source for fact on Lynn, by Wikipedia policy, if policy were being followed. That is generally true for political sources; they are treated like blogs. I could look at how the source was used, but this was not “taking issue with the Southern Poverty Law Center.” I might agree with their conclusions, in fact. I was, instead, examining the solidity of sources, faced with a claim that they were “well-sourced.”

In 2016, Lynn spoke on “Sex differences in intelligence.” If Lynn is smart, he would have talked about how much smarter women are than men. Seriously, I have two immediate reactions: comparing intelligence between women and men is extremely difficult (what one can do is compare measures only), and there are hosts of stereotypes to deal with. Men have trouble understanding cooking and taking care of babies, right? And, famously, men have trouble understanding women.

Did Lynn give a decent speech, raising questions worth considering, or was this uninterrupted racist or sexist propaganda? To know, one would probably have to have been there! This hit piece is simply hitting on stereotypes about racism and sexism, knee-jerk expectations. The Wikipedia article provides much more balance. I’d be amazed at a Conference on Intelligence that did not include Lynn. Yes, his views might be highly controversial, and he might take positions on social issues that I might find offensive, but the man does have academic qualifications.

This does not defend Lynn’s views. It is about objectivity and neutrality, and the SPLC article isn’t neutral; it leads with conclusions. I read what Lynn allegedly wrote there, and I don’t have time to research it in detail. The SPLC page has an apparent political purpose: to discredit Lynn, not simply to inform about his views.

I’m starting to smell academic censorship, rejection of research because it offends political correctness (which is more or less what Kirkegaard has been claiming). The existence of that kind of bias does not mean that the research is sound, but a free academy will not be reasoning from consequences. Data is data. Interpretation of data is distinct from that, and interpretation is often quite biased. According to the Wikipedia article on Lynn, he sits on the board of the journal Intelligence, published by Elsevier. He is also 87 years old. Is anyone surprised that he attends and speaks at a conference on intelligence?

This was much more about the general academic milieu than about the SPLC article, which would simply be reflecting that milieu, from their own political perspective.

The purpose here was to examine how a collection of selected facts was used to create an impression in the media of a blatantly and offensively racist conference. The problem, from my point of view, is that racialism and racism are not the same, and, if we are careful about language, statements that might appear racist, knee-jerk, may simply be factual. These kinds of semantic difficulties foment division among people, and if the goal is to “hate nothing but hatred,” if hatred is the true enemy, then polemic that fails to allow diversity of thought fosters and creates hatred.

On RationalWiki they claim I am vague about racism. I’ve been explicit: racism is, in my view, an extension of an ancient and probably instinctive survival reaction. We evolved under tribal conditions, in small family groups, not large populations with high genetic diversity. The visible markers that people use to quickly discriminate “race” are signs that a person is probably different, from some different tribe, and under some conditions that would be a danger signal, requiring rapid response for survival.

There are many examples where our instinctive survival responses (fight or flight, generally) have become dangerous in themselves. It is difficult to define “intelligence.” It is obvious that there is a genetic component to intelligence, but core to intelligence is experience. Intelligence doesn’t exist in a human brain that has no training or experience. It is obvious that genetic variations might lead to differences in brain performance under differing conditions, but I see no sign that these differences are major; rather, all the evidence I have seen is that the programming of the brain (the accumulated experience, including learned technique and analysis) is environmentally based. It is that programming that is crucial; the hardware, so to speak, is less fundamental. However, no hardware, no intelligence.

“IQ” is not the same thing as “intelligence.” IQ is a test result, on a standardized test. I have no doubt that the tests will include cultural bias, or, more subtly, may include “triggers” that can statistically alter group performance.

Further, even if there were substantial genetic differences affecting intelligence, that would not automatically lead to what amounts to racist political decisions. In fact, some people have disorders or diseases which impair performance. “Eugenics” has a very bad name because the possibly laudable goals of eugenics became mixed up with racist and inhumane policies. Remarkable, here, is what I found about the RW user behind the hatred of the racialists and hereditarians: blatant racism, expressed on Metapedia, that would indicate what kinds of traits he would consider “good,” i.e., “improvements.” Definitely not kinky hair! He later renounced the racism, though I’m not sure he really understood the issues, but he kept the high intolerance, the attempt to attack any views different from his: the intellectual fascism.

Racism is not going to be addressed, long-term, through hating racists. Absolutely, violent racists should be disempowered, but racialism (which I’ll define as the belief in race as a biological reality, more than simple population genetics) is very common. When we label people with these views as “deplorables,” we lose elections. To combat racism, our goal must be to educate (not indoctrinate, i.e., not force education), and when we do that, people will transform themselves. At least most will.

I should mention: Black Lives Matter.

Of course, all lives matter! But that slogan arose in a context where black lives, too often, did not matter. It’s a medicine and certainly not a claim that “white lives” don’t matter, or that police lives don’t matter. An antidote to racism is caring about people. All people.

Continued comment

“Bill Connors,” an obvious fake account, continued to attack. Responding to my single comment, placed in lieu of individual responses (since this mess is not likely of interest to the SPLC blog), he wrote two comments. My responses are in italics.

 

Abd-Ul-Rahman Lomax says in the above link his friend Emil Kirkegaard is not racist or far-right, that is laughable https://rationalwiki.org/wi… for the real facts.

For starters, I didn’t say that. I wrote that I had seen no evidence that Kirkegaard is these things, but that he is a racialist and might have some views that are right-wing. “Far-right” is not generally a fact; it’s a complex judgment relative to an overall view. The RationalWiki article includes much which is not “real fact,” such as the claim that Kirkegaard is a “child rape apologist.” From the evidence, that’s preposterous. And this is irrelevant to the issues raised in the SPLC article.

Racialism is not racist, though some racialists might be racist. See also this HuffPost article, explicitly on this point. The confusion causes political damage to the cause of disempowering racism, a cause which I firmly support. The Wikipedia article on this is … weird, and shows evidence of POV pushing. Instead of a focus on racialism and the history of it, it is mostly an argument that racialism is wrong. I happen to agree with that, but … on a separate page, I explore the distinction between racialism and racism, and what is said in that Wikipedia article.

However, the postings of Bill Connors are an example of how hatred is neither right-wing nor left-wing, and my view is that the real enemy of humanity is hatred, which sometimes masks itself as anti-hate. Falling into hatred is an obvious hazard for anyone who is confronting hatred.

The end of racism will not come from hatred of racists, which perpetuates conflict. Even less will it be furthered by hating racialists, the fundamental concepts of which are very common. Yes, they are in error, but the error is not corrected by hating people with those beliefs, at all. The opposite, in fact.

RationalWiki is a joke wiki, started by mostly adolescent refugees from Conservapedia. It is founded in snark, and snark — dismissive and contemptuous comment — is policy there. (I was a sysop there until the AP affair, but I had given up on using it for anything serious.) Citing it as if it were a reliable source is proof of ignorance or of extreme bias, full of hate. In the case of the author of the article, it would be ignorance; in the case of “Bill Connors,” who, from the evidence, is the primary author of the articles cited, it’s effectively the lies of a hater.

 

“Inflammatory claims”? No all factual. You are globally banned by the Wikimedia Foundation for harassing users https://meta.wikimedia.org/… and you spend your time defending racists like Emil Kirkegaard and Mikemikev.

Bill Connors is not a known person. On another site, he claimed to be a RationalWiki sysop (almost all of whom are anonymous), but there is no sysop by that name. However, behaviorally, he may well be one.

Yes, I am globally banned, but Connors has invented the reason for it. Others think I was banned because I was a critic. Global bans never state a reason; the reason is private (and even if a user waives privacy, the WMF does not explain). I did not harass users, but it is entirely possible that I was accused of this, and, if so, the WMF believed lies. That happens on occasion.

This claim of defending racists is just another deception from one who has been attacking Kirkegaard and Mikemikev — and many others not arguably racist — for many years. I defended a target of this person from the consequences of his impersonation on Wikipedia, and there is clear and abundant evidence of that, which will be cited on request. I exposed the abusive socking and was attacked, and threatened with deletion of all my work. And, remarkably, that happened! Extensive work that had stood without disruption for a decade was deleted. (Though I was able to recover all of it.) How did that happen? I became even more interested! How did an obvious long-term abuser, treated as such by the stewards who have locked many of his socks, manage to accomplish this?

The answer is that there is a much deeper problem than a pair of crazy brothers. The problem cuts to the core defects in Wikipedia and its vulnerability to manipulation by factions. These factions need not have majority support, and years ago, I concluded that a handful of users could strongly manipulate Wikipedia through off-wiki coordination to promote their point of view and attack what offends it.

I previously had seen evidence of such coordination, but … I had dismissed it. “Never ascribe to a conspiracy what could merely be ignorance.” I no longer dismiss it, because, particularly with the ban and with what I found researching this matter, the evidence that collusion exists becomes overwhelming. To discover the full extent of it may take legal action, and the ban gives me a wedge into discovery.

In Memoriam: John Perry Barlow

A page popped up in my Firefox feed: John Perry Barlow’s Tips for Being a Grown Up

The author adds this:

Barlow was determined to adhere to his list of self-imposed virtues, and stated in his original post about the principles in 1977: “Should any of my friends or colleagues catch me violating any one of them, bust me.”

This was written in 1977 when Barlow was 30. It’s a guide to live by, and living by it can be predicted to create a life well worth living. I would nudge a few of his tips, based on more than forty additional years of experience and intense training, but it is astonishing that someone only 30 would be so clear. Whatever he needed beyond that, he would find.

Barlow’s Wikipedia page.

His obituary on the Electronic Frontier Foundation website.

I never met Barlow, but I was a moderator on the W.E.L.L. when he was on the board, and I’d followed EFF in general. This man accomplished much, but there is much left to do. Those who take responsibility are doing that work, and will continue.

While his body passed away, as all bodies do, his spirit is immortal, at least as long as there are people to stand for what he stood for.

We will overcome.

And, yes, “should anyone (friend or otherwise) catch me violating the principles of a powerful life, bust me.” I promise to, at least, consider the objection, and to look at what I can rectify without compromising other basic principles. There is often a way. Enemies may tell me what friends will not, and I learned years ago to listen carefully, and especially to “enemies.”

Farewell, John Barlow. Joy was your birthright and your legacy.

Well-sourced

If you are reading this on an archive site, be sure to check the original URL for updates, corrections, or retractions.

In his attack blog (covered in the page supra), Oliver D. Smith wrote:

Emil O. W. Kirkegaard is a far-right/neo-Nazi child rape apologist who made news headlines in January 2018 about his paedophilia apologism and links to white supremacists and eugenicists:

And then he listed ten sources. What I notice is that none of the headlines mention Kirkegaard by name; they are all about someone or something else, and only two of the headlines refer to him at all. These stories all appeared within two days. They obviously copy from each other. And where did the information come from about what an alleged Nazi allegedly argues? It came from this RationalWiki article written by … Oliver D. Smith. Smith has claimed that I have abused Google to attack critics. He is a hypocrite, accusing me (and others) of doing what he has done for years.

I wrote the above and the rest of this study before I noticed that Smith actually bragged about creating the media flap:

The person who wrote those RationalWiki articles sent a tip-off to some newspapers. The story now has national coverage.[[User:SkepticDave|SkepticDave]] ([[User talk:SkepticDave|talk]]) 23:07, 11 January 2018 (UTC)

(SkepticDave is an obvious AngloPyramidologist sock, i.e., Oliver Smith — or possibly his brother Darryl.)

Smith just demonstrated how lies on a site that appears to be encyclopedic can generate news stories in sloppy media, which are then used to strengthen the original claims (all those stories were then cited on RationalWiki). I will look here at each story and at the claim Smith makes, but first some background:

Hereditarianism

Kirkegaard would be, perhaps, a speaker on hereditarian views on intelligence or related research. The Wikipedia article on hereditarianism has this:

Hereditarianism is the doctrine or school of thought that heredity plays a significant role in determining human nature and character traits, such as intelligence and personality. Hereditarians believe in the power of genetics to explain human character traits and solve human social and political problems. Hereditarians adopt the view that an understanding of human evolution can extend the understanding of human nature.

The statement is unsourced; however, I’m going to assume that a hereditarian would agree with the definition. The article goes on:

Theories opposed to hereditarianism include behaviorism, social determinism and environmental determinism.[citation needed] This disagreement and controversy is part of the nature versus nurture debate. But both are based on the assumption that genes and environment have large independent effects. The dominant view outside psychology among biologists and geneticists is that both of these are gross oversimplifications and that the behavioral/psychological phenotype for human beings is determined by a function of genes and environment which cannot be decomposed into a sum of functions of the two independently. And this especially because human behavior is uniquely plastic compared to that of other animals.

Hereditarianism has major political implications.

Pastore [1949] has claimed that hereditarians were more likely to be conservative,[4] that they view social and economic inequality as a natural result of variation in talent and character. Consequently, they explain class and race differences as the result of partly genetic group differences. Pastore contrasted this with the claim that behaviorists were more likely to be liberals or leftists, that they believe economic disadvantage and structural problems in the social order were to blame for group differences.[4]

The political implications become incendiary when the claim is made of a correlation between race and intelligence. The problem is amplified if “race” is assumed to be a biological reality, which might be one definition of “racialism,” which should be distinguished from “racism,” though obviously racism is racialist.

All this becomes a chaotic mess when implications which may be taken from scientific findings are judged based on the imagined — or real — political consequences. If some fact is shown by scientific research that would lead to a “wrong” policy decision, then the research must be wrong and is to be attacked. That is reasoning from consequences, a major logical error. As well, if research is supported by or funded by or liked by Bad People, with the Wrong Political Views, the research and the researcher are Bad. Guilt by association.

Kirkegaard on pedophiles

I cover this in a comment on an email from Oliver Smith, here. The short of it: Kirkegaard made some socially clumsy statements, but did not intend to legitimate child rape or child sexual abuse. Rather, he “thought out loud” about how a moral pedophile might deal with the “problem” of being a pedophile, writing things that were just plain silly and useless. Many have done that, but it usually isn’t picked up and broadcast six years later, in a totally irrelevant context, like the UCL conference. A speaker at a conference said something dumb many years ago? This is relevant news? Only in the world of “fake news” (and counter-fake news, which is really the same), which seeks the sensational and salacious, regardless of relevance. The UCL Conference organizers would not be responsible for knowing what Kirkegaard wrote many years before, only for his recent activity. The tragedy of this is that “mainstream media” repeated accusations from RationalWiki, which then cites those repetitions and highly biased analysis — not mentioning where the newspapers got the information, which is obvious: RationalWiki. So Oliver Smith created a media nightmare and then cites it as proof that the nightmare is true. Nice trick. Not.

Exposed: London’s eugenics conference and its neo-Nazi links

A eugenics conference held annually at University College London by an honorary professor, the London Conference on Intelligence, is dominated by a secretive group of white supremacists with neo-Nazi links, London Student can exclusively reveal.

First of all, was it a “eugenics conference,” and what is “eugenics”? Wikipedia: Eugenics. The concept has come to refer to attempts to “improve” human genetics, or the study of techniques for doing so, which could range from what was done in the past (such as selective sterilization of people deemed to be carrying “defective heritable characteristics”) to genetic engineering, including selective abortion. I.e., aborting a child because it is shown to be carrying some gene for a genetic disorder would be a form of eugenics. Eugenics, as a field, has a bad name particularly because of concepts and applications in Nazi Germany, where the idea was heavily mixed with concepts of “racial purity.”

Racialism is hereditarian, with a concept of the reality of races as genetic in nature. It’s rather obvious that the characteristics used to identify people racially can have a genetic component. Is this story factual? The official name of the Conference has been the London Conference on Intelligence. Not “Eugenics.”

The article states that the Conference has its own YouTube channel, but that is gone. No details were given. It is unclear why mentioning this matters.

The co-op article is a massive exercise in guilt by association: if a “link” can be found, that shows “domination.” What we have is a list of persons who have participated in the Conference. I would expect, by the way, that racists would be attracted to hereditarianism, but that does not make hereditarianism racist. There is obviously a genetic component to intelligence; what is the difference, otherwise, between a mouse and a human in intelligence? The issue as to racialism would be the extent of genetic differences between the populations called “races” — which can be very poorly defined — and, for the Conference, how they relate to measures of intelligence, intelligence itself being, often, poorly defined. That mouse is pretty smart, when it comes to being a mouse!
So the “speakers and attendees” named, and then the “links” to “neo-Nazis”:
  • Professor James Thompson, who allegedly “doesn’t understand genetics.” The evidence? Another professor said so. Maybe it’s true and maybe it isn’t!
  • “a self-taught geneticist who argues in favour of child rape.” That would be Emil Kirkegaard, based on something he wrote six years ago and did not promote or repeat. Even if he did do what was claimed, it has zero to do with “neo-Nazis” or hereditarianism; it’s simply mud to toss.
  • multiple white supremacists, not named. Out of how many? And are a conference and its organizers to be judged by those who are interested and attend? Invited speakers, yes, but sometimes anyone is allowed to present a paper, generally based on an abstract submitted. A conference will not do deep research to rule out some “neo-Nazi link.” They may not look at presenter qualifications at all; it depends.
  • ex-board member of the Office for Students Toby Young.
  • Richard Lynn (Wikipedia article). A link is given to a web site about Lynn: https://www.splcenter.org/fighting-hate/extremist-files/individual/richard-lynn That is the Southern Poverty Law Center, which is highly political. In 2016, Lynn spoke on “Sex differences in intelligence.” If Lynn is smart, he would have talked about how much smarter women are than men. Seriously, I have two immediate reactions: comparing intelligence between women and men is extremely difficult (what one can do is compare measures only), and there are hosts of stereotypes to deal with. Men have trouble understanding cooking and taking care of babies, right? And, famously, men have trouble understanding women. Did Lynn give a decent speech, raising questions worth considering, or was this uninterrupted racist or sexist propaganda? To know, one would probably have to have been there! This hit piece is simply hitting on stereotypes about racism and sexism, knee-jerk expectations. The Wikipedia article provides much more balance. I’d be amazed at a Conference on Intelligence that did not include Lynn. Yes, his views might be highly controversial, and he might take positions on social issues that I might find offensive, but the man does have academic qualifications. I’m starting to smell academic censorship, rejection of research because it offends political correctness (which is more or less what Kirkegaard has been claiming). The existence of that kind of bias does not mean that the research is sound, but a free academy will not be reasoning from consequences. Data is data. Interpretation of data is distinct from that, and interpretation is often quite biased. According to the Wikipedia article on Lynn, he sits on the board of the journal Intelligence, published by Elsevier. He is also 87 years old. Is anyone surprised that he attends and speaks at a conference on intelligence?
  • four of the six members of the UISR’s Academic Advisory Council. The Ulster Institute for Social Research is, on its face, an academic institution. The members are titled “professors”: Edward Miller, Helmuth Nyborg, Donald Templer, Andrei Grigoriev, James Thompson, Gerhard Meisenberg. James Thompson, of course, was the Conference sponsor at UCL, so he’s been mentioned twice. I will list these separately:
  • Edward M. Miller “is an American economist. He is a professor whose writings on race and intelligence have sparked debates on academic freedom.” Indeed, and it is still happening. Academic freedom must include the right to be wrong. When ideas must be “correct” in order to be considered and discussed, we have a new orthodoxy that can and will crush real progress. Miller is my age, born about four months after me.
  • Helmuth Nyborg is a Danish psychologist and author. He is a former professor of developmental psychology at Aarhus University, Denmark, and an Olympic canoeist. His main research topic is the connection between hormones and intelligence. Among other things, he has worked on increasing the intelligence of girls with Turner’s syndrome by giving them estrogen. His research was censured for political reasons[1] by the administration of Aarhus University in 2007, forcing his retirement. He was later cleared by the governmental Danish Committees on Scientific Dishonesty (DCSD).[1] From my point of view, with any particular measure of intelligence, there may be differences between populations (i.e., “races” or ethnic groups) and between genders. The implications for policies are quite unclear, because there are other issues, and individual differences may be (I’ll simply say are) far larger than the population differences. Nyborg is 81.
  • Donald Templer died in 2016. He was 68 years old and was also quite controversial. Racists will believe in racialism and hereditarianism, and will show interest in these topics, but that does not make the topics “racist.” As well, political views can become highly biased, but if a researcher does good science, the bias can be separated from it; it will show up in how data is interpreted. If a researcher actually falsifies data, they would be rejected by all scientists. It’s rare.
  • Andrei Grigoriev: co-authored with Richard Lynn A_study_of_the_intelligence_of_Kazakhs_Russians_and_Uzbeks_in_Kazakhstan, published in Intelligence (2014).

Reading the paper, the immediate question I looked for was how “intelligence” was measured. I find the research itself interesting, but quite inadequately explored. The paper talks about “intelligence” and does actually consider measures of intelligence, and … this is a general problem with “intelligence”: the test was designed in Great Britain and was administered in Russian, so higher performance for Russians could simply be related to familiarity with that language. Could there be a racialist or cultural bias here? Yes, in my opinion. However, part of a solution would be to repeat the study with a similar test in Uzbek (a toy simulation of this kind of confound is sketched a few paragraphs below). The paper also suggests another problem: cultural emphasis on certain kinds of thinking and de-emphasis on other kinds. That is, the definition of “intelligence” may incorporate cultural bias. And the paper then goes into what I would call “racist or racialist speculation.” I would fault the reviewers at Intelligence for not insisting on skeptical analysis (i.e., the authors could have suggested further research to clear up ambiguities, but they did not). (I could rip this paper to shreds, in my opinion, but … academic freedom can handle this, and should.)

  • James Thompson was the organizer of the UCL conference. He appears to be a recognized academic; see this paper, published by Oxford University Press in the Journal of Biosocial Science. From his Twitter feed, some quotes (all from January 15):

 If you want to combat racism and sexism you need the benchmark of open discussion of racial and sexual differences.

An unpopular idea may be traduced, misrepresented and suppressed and yet be wrong.

We should examine the ideas we cherish with as much ferocity as those we find repellent.

(See Spearman’s hypothesis, which is highly relevant, and G-loading. To my mind, the difference between hereditarian positions and those which consider other factors more important (such as environment, including cultural environment, social expectations, etc.) is one of degree, not absolute. I have been diagnosed with ADHD, and it was not marginal. There are theories that ADHD (which does run in families) is a genetic variant that favors hunter-gatherer survival, whereas “normies” are adapted for settled communities, originally agricultural. Nomadic peoples (like the Kazakhs) would probably fit more on the hunter-gatherer side. All human cultures need intelligence, but the form of intelligence would vary. But how large is the genetic component? The racist aspect of this research shows up in assumptions about what is “better.” It is generally assumed that “higher intelligence” is good, but what is the definition of “good”? It can be highly biased, culturally. The Grigoriev study cites some specific question differences. It seems the authors cannot see the trees for the forest. A question designed to test the operation of logic included words that would be cultural triggers for Kazakhs, causing them to respond from a cultural position rather than from pure logic. These kinds of differences were then interpreted as an “inability to reason logically.” But people, when triggered into well-established patterns of thought, do not apply abstract logic. The test, as designed, apparently, would create a process bias. It is not that a Kazakh could not understand “All A is not B. Given an example of A, is it B?” Different people, on average, and culturally, might have strong reactions to A or B, thus shifting answers. It would only need to shift the answers of some to warp the results. The paper is confused about racialism vs. culturalism.)
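
To make that confound concrete, here is a minimal sketch in Python. Every number in it is invented for illustration (the 0.7 base probability of a correct answer, the 0.05 language penalty, the five “trigger” items); nothing comes from the Grigoriev and Lynn paper. Two groups are given identical underlying ability, one group is handicapped only by test language and a few culturally loaded items, and a gap appears in the measured scores anyway.

    # Toy simulation: identical underlying ability, but a measured score gap
    # appears when the test language or item wording disadvantages one group.
    # All parameters are illustrative assumptions, not data from any study.
    import random
    import statistics

    random.seed(1)

    N = 10_000          # simulated test-takers per group
    ITEMS = 40          # items on the hypothetical test

    def simulate_group(language_penalty, trigger_items=0, trigger_miss_rate=0.0):
        """Mean raw score for a group whose true ability is the same as every
        other group's (base chance 0.7 of answering an ordinary item correctly)."""
        scores = []
        for _ in range(N):
            correct = 0
            for item in range(ITEMS):
                p = 0.7 - language_penalty       # unfamiliar test language lowers p
                if item < trigger_items:         # culturally loaded ("trigger") items
                    p -= trigger_miss_rate
                if random.random() < p:
                    correct += 1
            scores.append(correct)
        return statistics.mean(scores)

    native = simulate_group(language_penalty=0.0)
    non_native = simulate_group(language_penalty=0.05,   # small familiarity handicap
                                trigger_items=5,         # a few loaded items
                                trigger_miss_rate=0.15)

    print(f"native-language group mean:     {native:.1f} / {ITEMS}")
    print(f"non-native-language group mean: {non_native:.1f} / {ITEMS}")
    # The second group scores noticeably lower although its underlying ability
    # parameter is identical; in this toy model the gap is entirely an artifact
    # of test language and item wording.

Obviously a toy model, but it shows why repeating the study with a comparable test in Uzbek, or otherwise equalizing test language, is the natural control before interpreting a score gap as a difference in intelligence.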

The Wikipedia article on Mankind Quarterly covers critique (quite prominently, by the way, a sign of biased editing there.) Generally, controversy should be kept out of the lede, and placing four links to what is obviously political criticism in the lede is not balanced. The Journal itself is clearly a scientific journal, publishing articles in a field with high controversy. For the latest issue, I picked a paper to look at.

“The Relationship between the ‘Smart Fraction’, SES, and Education: The Sudan Case.” From the abstract, this is neither hereditarian nor racialist. There is a paper by Emil Kirkegaard, “Employment Rates for 11 Country of Origin Groups in Scandinavia.” This was the only paper that I noticed as possibly being “politically edgy.” However, such data is needed for public policy review. Without reading the paper itself, I could expect that Kirkegaard might have expressed an anti-immigration position. Whether or not this would discredit the actual research is another issue.

For Kirkegaard: the article has “Kirkegaard’s reputation as a scientific advocate for neo-Nazism was increased last April when he appeared on Tara McCarthy’s ‘Reality Calls’ to discuss ‘the future of eugenics’” … and then evidence is shown that Tara McCarthy is Very Bad. This is guilt by association. Kirkegaard’s actual views were not described (and Kirkegaard has denied being a “neo-Nazi”). His general views on hereditarianism and intelligence — and eugenics — would make him a person of interest to certain racists and white supremacists, but that does not make him one of them. Further, even if he has politically offensive views, that does not discredit his scientific work. The London Student article is attacking an entire field, the study of intelligence and, in particular, the origin of differences in measures of intelligence. Hereditarians consider genetics important, but the more mainstream view (and my view) is now that, among human beings (with very similar genetic coding, and aside from specific genetic disorders), other factors are far more important, and that survival pressure optimized for “general intelligence” in all major populations. However, I will argue strongly for the right of hereditarians and racialists to perform and present research, academically. If offensive racist (not merely racialist) views are presented, or, relatedly, pernicious sexist views (not merely a study of sexual differences), then an academic institution may decide to exclude such work. The hysterical London Student article does not consider the real issues, but has merely fleshed out — a little — what came from the RationalWiki article by Oliver D. Smith, who has acknowledged, through his sock SkepticDave, that he fomented the whole flap by email.

Then the article mentions another Conference speaker as having been interviewed by McCarthy: Adam Perkins. Here is a cogent critique of Perkins’s work. “Cogent” means thoughtful and, to some extent, balanced, not knee-jerk; it does not mean that I necessarily agree. (But I probably would if I studied the book, which I’m not doing.) The political significance is considered, and it is politics that dominates here, not science. The book title is sensationalist: The Welfare Trait: How State Benefits Affect Personality. I have no doubt that this would appeal to conservatives and, as well, to certain neo-Nazis.

I conclude that the London Student article was sensationalist, focused on easy allegations, not distinguishing between the academic study of intelligence and heredity (by no means a resolved scientific controversy) and “neo-Nazism.”

This was a straightforward news report, reporting an investigation, not conclusions.

This is on the face repeating Oliver D. Smith’s attacks and arguments. It’s pure guilt by association.

Kirkegaard is not a “Nazi.” The article is conclusory, making exaggerated claims, such as “The London Conference on Intelligence (LCI) is a secretive, invitation-only event where they appear to discuss only the most bigoted of topics.”

Topics are not bigoted, unless they are, by their nature, conclusory on bigotry. I.e., “Why are Blacks of such Low Intelligence?” would be racist and conclusory (i.e., it incorporates a racist assumption). So far, I haven’t seen topics like that. So this is a hit piece, and we know that Oliver D. Smith contacted media to promote these ideas. They fell for it. They quote Kirkegaard (not mentioning that it was years ago):

He has also advocated a “frank discussion of paedophilia related issues.”

Obviously a pedophile, because any non-pedophile would not want any discussion of such issues; they are unthinkable to any normal person. See Harris Mirkin. (Seriously: I’m a parent, and pedophile hysteria does not protect children; it probably has the opposite effect.)

Top London university launches probe into conference that included speakers with controversial views on race and gender

Yes, they did that. There is some level of incorporated conclusion in the headline. Certainly there are allegations of “controversial views.” But these topics are not generally well-understood. So, looking at details, first in the subheads:

University College London said it was probing a potential breach of policy

Yes, that’s clear. The exact nature of the breach is not clear, not to me, yet.

One professor said the London Conference on Intelligence was ‘pseudoscience’

What did this mean? The source was the London Student. This was a media feeding frenzy. I will later look for follow-up. “Intelligence” is a very hot topic, with strong views being common, and politically fraught.

Some speakers claim certain countries have higher IQs than others, it is alleged

This is shocking, perhaps, until one knows what “IQ” actually is. Intelligence Quotient is measured by performance on standardized tests. Given a set of tests, I don’t think it is controversial: differing populations may have differing average scores on such tests. As a silly point: countries do not have IQ, people do. Or perhaps robots. Siri is pretty smart! It is good, perhaps, that the word “alleged” is put in there. But the claim is not controversial! Except that some people will come unglued if one says it.

Public policy formation should not be knee-jerk from shallow interpretations of data. The public policy implications of the measured differences in IQ are a quite different topic than the raw data. There are many issues to be examined, which will not be examined while there is shouting about “Racism!” Though with countries, it would be Nationalism, right? Which may or may not be racist.

By Eleanor Harding Education Correspondent For The Daily Mail

PUBLISHED: 20:41 EST, 10 January 2018 | UPDATED: 21:01 EST, 10 January 2018

The annual conference, which was first held in 2014, is alleged to have included speakers who have written about people in some countries having on average a higher IQ than those in others.

Again, that is not controversial, once we know what IQ is. It is performance on a standardized test. So, then, it becomes a matter of interest, scientifically (and with public policy implications). Why? Answers to that are not necessarily simple, and would, scientifically, require testing. Or it would be pseudoscience.

It was hosted by an honorary UCL senior lecturer, Professor James Thompson, who taught psychology for 32 years and for the last decade has worked as a consultant psychologist.

Yes. So, surprising that a conference on intelligence is hosted by a psychologist?

Another speaker at the LCI has been Emil Kirkegaard, who gave a talk in 2015 about how far ‘genomic race’ is associated with cognitive ability.

Well, “how far” is actually an open scientific question. It’s difficult to study. Kirkegaard used the term “genomic race.” What is that? Is it different from “race”? How? Here is a blog post by Kirkegaard.

The post shows some problems. First of all, genomic race is race measured by genetic markers, rather than “SIRE” or “self-reported race/ethnicity.” Kirkegaard emphasizes the need for strong evidence because “environmentalists are very stubborn.” He is betraying a strong bias; his research is attempting to prove something, which classically leads to poor research. However, that does not make his results wrong; it only means the results must be interpreted with caution, because he may select for publication data that supports the desired conclusions. He is clearly a hereditarian (opposed to “environmentalism”) and a racialist. That is, he believes that the genetic influence on intelligence is strong — which is not a mainstream position now — and that “race” is a biological reality — also a widely rejected view. In my opinion, the statement “race is a biological reality” is neither true nor false; it confuses interpretation with fact. But the interpretation that race is not a reality (other than as a social construct) is now dominant (and I have expressed that view many times). We will see a comment on this:

Writer and geneticist Adam Rutherford told the London Student that, based on the titles and abstracts at the LCI, some of the views presented by speakers were a ‘pseudoscientific front for bog-standard, old-school racism’.

“Bog-standard” appears to be British for “ordinary.” I don’t agree. Racism, to confront the core, is a manifestation of what may be a human instinct: to mistrust strangers, people who are different. That made some level of sense as a first reaction under tribal conditions. It becomes dangerous and pernicious under more modern conditions. But that kind of reaction will occur; it’s mediated by the amygdala, in my opinion. So most “normal people” will be racist. Under modern conditions, such people are likely to deny it, since racism is Bad. (This has changed radically in my lifetime. When I was growing up in a white community, Manhattan Beach, California, racism was completely normal. That shifted, to the point that racism is suppressed. But people will still have those reactions, and declaring the reactions Bad and Wrong will not move us beyond this. Rather, racism is disappearing mostly because of increased exposure and familiarity, such that “black people” are now part of “our tribe.” The first step in defeating that “inner racism” is to acknowledge it, and the atmosphere of strong rejection makes that more difficult, not less difficult. Basically, blaming people is not a part of any skilled pedagogy or social transformation.)

This was not the comment of some careful academic. Based on the Wikipedia article on Adam Rutherford, I’d expect a certain kind of bias, which is amply displayed here.

“Some views expressed” could refer to one or two speakers.

He added: ‘As soon as you begin to speak about black people and IQ you have a problem, because genetically-speaking “black people” aren’t one homogenous group.

Okay, who spoke about “black people”? Remember, he did not attend the conference and did not read the papers. Here is the list of speakers for the 2016 Conference: It includes a paper that I’d expect might have something like that, by Kirkegaard and Fuerst. http://www.dcscience.net/London-conference-of-Intelligence-2016.pdf#14

‘Any two people of recent African descent are likely to be more genetically distinct from each other than either of them is to anyone else in the world.’

Yes, I understand that is correct. But there is an obvious error here. That fact (i.e., genetic diversity, which can be measured) does not negate the possibility of a genetic influence on intelligence, and the variations in intelligence studied by researchers in the field are not confined to genetic differences. To determine these effects, as well as their causes, research is needed, and especially careful research. But if the field is rejected as intrinsically racist, which is the appearance here, that research will not be done, or if done, may not be reported and criticized.

The London Conference on Intelligence included talks by controversial speakers including white supremacists, child rape advocates, and those with extreme views on race and gender.

This article depends heavily on the London Student article. With “child rape advocates” (how many?) it shows its origin, directly or indirectly: Oliver D. Smith. It is full of the same non sequiturs.

The use of the hyperbolic plural is a tipoff to the yellow journalistic agenda.

Was it a “eugenics” conference? Notice that this is an incorporated assumption in the headline. The 2016 conference document cited above is headed with a photograph of Edward Thorndike, and a saying from him:

Selective breeding can alter man’s capacity to learn, to keep sane, to cherish justice or to be happy. There is no more certain and economical a way to improve man’s environment as to improve his nature.

“Selective breeding” is actually natural and normal. (But Thorndike may have had something more “scientific” in mind.) There is nothing offensive about the statement, though I might disagree with the weight that he put on it. There is nothing “racist” about this comment. If he was a racist — I don’t know, but many were in his day — the comment appears independent of that; he was not talking about “race purity,” which is, actually, genetically dangerous. Diversity is important for the maintenance of healthy populations.

Richard Lynn has an obvious interest in eugenics. He wrote Eugenics: A Reassessment. However, I see no indication that the Conference is fairly called a “eugenics conference.” It was about intelligence and population studies of measures of intelligence. It was accurately named. I saw not one paper in the list that was about eugenics (which in modern times would refer most strongly to genetic engineering). The study of intelligence could have an impact on that. Can genes for “intelligence” be found? Again, how would we know? Genetic engineering will bring many ethical issues — and it’s already happening. It is common to do fetal genetic testing to detect Down syndrome, and then to selectively abort. However, eugenics would probably be focused on increasing desirable characteristics.

 

Is it a “eugenics probe”? Or is it a reaction to a massive flap about alleged racism? Is a topic to be banned because someone interested in the topic, and who writes academic papers on it, has expressed, at some time or other, allegedly abhorrent views? From a comment by a UCL spokesperson:

“Our records indicate the university was not informed in advance about the speakers and content of the conference series, as it should have been for the event to be allowed to go ahead. The conferences were booked and paid for as an external event and without our officials being told of the details. They were therefore not approved or endorsed by UCL.

It would be radically contrary to academic freedom for the university to assert control over speakers and content. From the topics of the 2016 conference, I would expect the normal university response to be to allow a further conference, if they even took that much interest. The conference organizer was apparently a trusted faculty member, and that would be the extent of it.

I would not expect specific conference speakers and content to be approved in advance by the university. That is quite contrary to actual practice, which is that a conference is planned, often very long in advance, the venue secured for the general topic, and then, once a location is secure, the speakers and papers to be delivered are chosen. 

“We have suspended approval for any further conferences of this nature by the honorary lecturer and speakers pending our investigation into the case. As part of that investigation, we will be speaking to the honorary lecturer and seeking an explanation.”

As a temporary measure pending investigation, this makes sense. Oliver D. Smith, who triggered this flap by private email to the media, probably linking to the RationalWiki article that he wrote, crowed on RationalWiki that he got the conference “shut down.” That has not happened yet. There is a temporary suspension pending investigation, and whether or not it affects this year’s conference is unclear. If it stays up in the air, unresolved, Conference organizers may simply move the Conference elsewhere. This was a small conference and does not need to be held at a university. I’d suggest a hotel in Hawaii. Cheaper in China, I’m sure.

The university stressed it was “committed to free speech but also to combatting racism and sexism in all forms”.

We will see how committed they are to free speech.

University College London has launched an urgent investigation into how a senior academic was able to secretly host conferences on eugenics and intelligence with notorious speakers including white supremacists.

The London Conference on Intelligence was said to have been run secretly for at least three years by James Thompson, an honorary senior lecturer at the university, including contributions from a researcher who has previously advocated child rape.

Oliver Smith successfully framed the conversation. The conference was on intelligence, yes. Were any speakers “white supremacists”? That’s quite unclear. Oliver Smith has made this claim about some. The speakers were well-known academics in the field. “Notorious”? Who? This was an appalling piece by the Guardian, polemic, not sober reporting. The “child rape” accusation was false, and the comments in question, which were clearly not advocacy, were made many years earlier, when the researcher was a young blogger.

Any actual journalism here? Okay:

UCL said it had no knowledge of the conference, an invitation-only circle of 24 attendees, which could have led to a breach of the government’s Prevent regulations on campus extremism.

This conference was not “extremist.” It covered, in some respects, fringe or controversial research. The actual Prevent document is about terrorism.

Russia Television: shabby yellow journalism, repeating the Smith claims. Much commentary was about Toby Young, for having “attended” the conference. Young is a highly opinionated journalist and has made comments relating to eugenics. The Wikipedia article on Young is, by the way, afflicted with Oliver Smith fake news; my sense is that it violates biography policy, with recentism and a focus on a splash of claims in media. (The claims actually contradict sources, but … newspapers like the Guardian are “reliable source.”) Nevertheless, it’s up to editors to consider balance. It’s obvious that a series of media sources copied each other, having copied RationalWiki. And there was an Oliver Smith sock (tagged as Anglo Pyramidologist) who edited that article. (“Rebecca Bird.”)

The quality is a little higher, in a dismal field:

One of Britain’s most liberal universities has learnt that it has played host to a conference for controversial academics and experts for three years without knowing it.

More accurately, the University spokesperson has claimed, to repeat:

Our records indicate the university was not informed in advance about the speakers and content of the conference series, as it should have been for the event to be allowed to go ahead. The conferences were booked and paid for as an external event and without our officials being told of the details. They were therefore not approved or endorsed by UCL.

This kind of statement can be quite misleading. “Records indicate” shows that someone didn’t find something in the records, but information may have been provided that was not recorded. “Booked and paid for as an external event” is possible. Who can do that, and under what rules? What information, if any, was actually provided? This was, however, arguably “secretive” — from what Toby Young has written, there was a realization that the content could be controversial — but not “secret.” There was ample information about the conference, in public view. I would not expect the University to be informed of conference details, particularly speakers. Rather, what seems more likely is that the general conference subject would be revealed. Speakers would not necessarily be known until not long before the conference, and it would not be the job of the University to vet speakers. The Times more accurately describes the topic of the conference than any of the other sources:

University College London has been the venue for the London Conference on Intelligence, a secretive, invitation-only event on “empirical studies of intelligence, personality and behaviour”.

Given the apparent function of the conference, I would not be surprised that it was “invitation-only.” That does not, in itself, make it “secretive” or “secret.” Just in the last few days, there was a conference for cold fusion researchers at MIT that was “invitation-only.” This is done where the desire is to create a collaborative working environment among people already familiar with the research.

It has been held at the university every year since 2015 without the authorities being notified, in a breach of its own rules. This year’s conference, scheduled for May, has been suspended while UCL investigates.

The Times states that the rules were breached, but has not provided evidence or a source for that, other than the vague comments of the University spokesperson. The inquiry is into whether or not rules were breached. Who, exactly, was responsible for notifying whom, exactly? Is there a form for booking a conference? Did it contain the required information? My guess would be that it did, and that the idea of a rules violation is CYA from some University officials. But I certainly don’t know.

The conferences have hosted speakers presenting work that claims racial mixing has a negative effect on population “quality”, and that “skin brightness” is a factor in global development.

So, with a rather diverse group of speakers, and many papers over the years, one finds a few studies that sound weird. I could go over all the lists of papers, but I’m not doing that now.

I have seen “skin brightness” used as a measure of “color.” It is a crude marker for certain populations. (Skin brightness can be objectively measured.) Skin brightness might be a factor in global development because of endemic racism. How would one know? It’s obvious that there is an attitude that certain topics are forbidden, to be condemned, which is more or less what Kirkegaard has claimed. “Population quality” is vague, but in the few papers I have read, these terms are defined and may not be at all what a reader of a newspaper would assume.

I find this fascinating: as media picked up the stories, each new report tended to focus on the facts or claims of the prior reports. There is little sign of investigation de novo. So facts or claims that would be, in an unbiased report, considered marginal or irrelevant, not to be covered, are covered, and there is a bias in this toward what is sensational or scandalous.

This is the standard, ancient problem of media bias: not necessarily a bias toward a political position, but toward scandal and the like. The most obvious example here is the often-mentioned alleged advocacy of child rape, which was not advocacy at all. It had nothing to do with the conference (the ostensible topic of the stories) and was a simple ad hominem attack and a claim of guilt by association.

For a very different (and still very political) view, see http://www.vdare.com/articles/then-they-came-for-the-london-conference-on-intelligence

A modest proposal: perhaps there is a gene for racism. (From what I’ve described above, this is not absolutely preposterous. Fear of the “other” may be instinctive and not simply conditioned; it probably has some genetic basis.) So, how about the possibility of a genetic test for racism? There could be fetal tests for it, and then selective abortion to diminish the obviously damaging propensity for racism in the population. Readers should be aware of the history of “a modest proposal.”

My hope here is that UCL makes a sane decision that does protect academic freedom. If there are aspects of the Conference that are gratuitously offensive — I have not seen that yet — then they may sanely place restrictions. In this field, some of the researchers will hold unconventional views. That’s critical for the scientific process. What would truly concern me would be data falsification, and nothing like that has been alleged.

SOS Wikipedia

Original post

I’ve been working on some studies that involve a lot of looking at Wikipedia, and I come across the Same Old S … ah, Stuff! Yeah! Stuff!

Wikipedia has absolutely wonderful policies that are not worth the paper they are not written on, because what actually matters is enforcement. If you push a point of view considered fringe by the administrative cabal (Jimbo’s word for what he created … but shhhh! Don’t write the word on Wikipedia, the sky will fall!) you are in for some, ah, enforcement. But if you hold and push a clear anti-fringe point of view — which is quite distinct from neutrally insisting on policy — nothing will happen, unless you go beyond limits, in which case you might even get blocked until your friends bail you out, as happened with jps, mentioned below. Way beyond limits.

So, an example pushed against my eyeballs today. It’s not about cold fusion, but it shows the thinking of an administrator (JzG is the account, but he signs “Guy”) and a user (the former Science Apologist, who has a deliberately unpronounceable username but signs jps, his real-life initials), both of whom were prominent in establishing the very iffy state of Cold fusion.

Wikipedia:Fringe_theories/Noticeboard


Aron K. Barbey

Before looking at what JzG (Guy) and UnpronounceableUsername (jps) wrote, what happened here? What is the state of the article and the user?

The first thing I find is that the account Aron barbey wrote the article and has almost no other edits. However, he wrote the article through Articles for Creation. Looking at his user talk page, I find:

16 July 2012, Barbey was warned about writing an article about himself, by a user declining a first article creation submission.

9 July 2014, it appears that Aron barbey created a version of the article at Articles for Creation. That day, he was politely and properly warned about conflict of interest.

The article was declined; see the 00:43:46, 9 July 2014 review of submission by Aron barbey

from the log found there:

It appears that the article was actually originally written by Barbey in 2012. See this early copy, and logs for that page.

Barbey continued to work on his article in the new location, and resubmitted it August 2, 2014.

It was accepted August 14, 2014, and moved to mainspace.

Now, the article itself. It has not been written or improved by someone with a clue as to what Wikipedia articles need. As it stands, it will not withstand an Articles for Deletion request. The problem is that there are few, if any, reliable secondary sources. Over three years after the article was accepted, JzG tagged it with multiple issue tags. Those tags are correct: the article does have those problems, some minor, some major. However, this edit was appalling, and the problem shows up in the FTN filing.

The problems with the article would properly suggest AfD if they cannot be resolved. So why did JzG go to FTN? What is the “Fringe Theory” involved? He would go there for one reason: on that page the problems with this article can be seen by anti-fringe users, who may then either sit on the article to support what JzG is doing, or vote for deletion with opinions warped by claims of “fringe,” which actually should be irrelevant. The issue, by policy, would be the existence of reliable secondary sources. If there are not enough, then deletion is appropriate, fringe or not fringe.

So his filing:


The article on Aron Barbey is an obvious autobiography, edited by himself and IP addresses from his university. The only other edits have been removing obvious puffery – and even then, there’s precious little else in the article. What caught my eye is the fact that he’s associated with a Frontiers journal, and promulgates a field called “Nutritional Cognitive Neuroscience”, which was linked in his autobiography not to a Wikipedia article but to a journal article in Frontiers. Virtually all the cites in the article are primary references to his won work, and most of those are in the Frontiers journal he edits. Which is a massive red flag.

Who edited the article is a problem, but the identity of editors is not actually relevant to Keep/Delete decisions or to content. Or it shouldn’t be. In reality, those arguments often prevail. If an edit is made in conflict of interest, it can be reverted. But … what is the problem with that journal? JzG removed the link and explanation. For Wikipedia Reliable Source, the relevant fact is the publisher. But I have seen JzG and jps arguing that something is not a reliable source because the author had fringe opinions — in their opinion!

What JzG removed:

15:48, 15 December 2017 JzG (talk | contribs) . . (27,241 bytes) (-901) . . (remove links to crank journal)

This took out this link:

Nutritional Cognitive Neuroscience

and removed what could show that the journal is not “crank.” There is a better source (showing that the editors of the article didn’t know what they were doing): the Nature Publishing Group press release. This “crank journal” is Reliable Source for Wikipedia, and that is quite clear. (However, there are some problems with all this, complexities. POV-pushing confuses the issues; it doesn’t resolve them.)

Aron Barbey is Associate Editor of Frontiers in Human Neuroscience, Nature Publishing Group journal.[14] Barbey is also on the Editorial Board of NeuroImage,[15] Intelligence,[16] and Thinking & Reasoning,.[17]

Is Barbey an “Associate Editor”? This is the journal home page.

Yes, Barbey is an Associate Editor. There are two Chief Editors. A journal will choose a specialist in the field to participate in the selection and review of articles, so this indicates some notability, but it is a primary source.

And JzG mangled:

Barbey is known for helping to establish the field of Nutritional Cognitive Neuroscience.[36]

was changed to this:

Barbey is known for helping to establish the field of Cognitive Neuroscience.[35]

JzG continues on FTN:

So, I suspect we have a woo-monger here, but I don’t know whether the article needs to be nuked, or expanded to cover reality-based critique, if any exists. Guy (Help!) 16:03, 15 December 2017 (UTC)

“Woo” is a term used by “skeptic” organizations. “Woo-monger” is uncivil, for sure. As well, the standard for inclusion in Wikipedia is not “reality-based” but “verifiable in reliable source.” “Critique” assumes that what Barbey is doing is controversial, and Guy has found no evidence for that other than his own knee-jerk responses to the names of things.

It may be that the article needs to be deleted. It certainly needs to be improved. However, what is obvious is that JzG is not at all shy about displaying blatant bias, and insulting an academic and an academic journal.

And jps does much the same:

This is borderline Men who stare at goats sort of research (not quite as bad as that, but following the tradition) that the US government pushes around. Nutriceuticals? That’s very dodgy. Still, the guy’s won millions of dollars to study this stuff. Makes me think a bit less of IARPA. jps (talk) 20:41, 15 December 2017 (UTC)

This does not even remotely resemble that Army paranormal research, but referring to that project is routine for pseudosceptics whenever there is government support of anything they consider fringe. Does nutrition have any effect on intelligence? Is the effect of nutrition on intelligence of any interest? Apparently, not for these guys. No wonder they are as they are. Not enough kale (or, more accurately, not enough nutritional research, which is what this fellow is doing.)

This is all about warping Wikipedia toward an extreme Skeptical Point of View. This is not about improving the article, or deleting it for lack of reliable secondary sources. It’s about fighting woo and other evils.

In editing the article, JzG used these edit summaries:

  • (remove links to crank journal)
  • (rm. vanispamcruft)
  • (Selected publications: Selected by Barbey, usually published by his own journal. Let’s see if anyone else selects them)
  • (Cognitive Neuroscience Methods to Enhance Human Intelligence: Oh good, they are going to be fad diet sellers too)

These are all uncivil (the least uncivil would be the removal of publications, but even it has no basis; JzG has no idea of what would be notable and what not).

The journal is not “his own journal.” He is merely an Associate Editor, selected for expertise. He would not be involved in selecting his own article to publish. I’ve been through this with jps before, actually, when Ed Storms was a consulting editor for Naturwissenschaften and the claim was made that he had approved his own article, a major peer-reviewed review of cold fusion, still not used in the Wikipedia article. Yet I helped with the writing of that article, and Storms had to go through ordinary peer review. The faction makes up arguments like this all the time.

I saw this happen again and again: an academic edits Wikipedia, in his field. He is not welcomed and guided to support Wikipedia editorial policy. He is, instead, attacked and insulted. Ultimately, if he is not blocked, he goes away and the opinion grows in academia that Wikipedia is hopeless. I have no idea, so far, if this neuroscientist is notable by Wikipedia standards, but he is definitely a real neuroscientist, and being treated as he is being treated is utterly unnecessary. But JzG has done this for years.

Once upon a time, when I saw an article like this up for Deletion, I might stub it, reducing the article to just what is in the strongest sources, which a new editor without experience may not recognize. Later, if the article survives the AfD discussion, more can be added from weaker sources, including some primary sources, if it’s not controversial. If the article isn’t going to survive AfD, I’d move it to user space, pending finding better sources. (I moved a fair number of articles to my own user space so they could be worked on. Those were deleted at the motion of …. JzG.)

(One of the problems with AfD is that if an article is facing deletion, it can be a lot of work to find proper sources. I did the work on some occasions, and the article was deleted anyway, because there had been so many delete !votes (Wikipedia pretends it doesn’t vote, one of the ways the community lies to itself) before the article was improved, and people don’t come back and reconsider, usually. That’s all part of Wikipedia structural dysfunction. Wasted work. Hardly anyone cares.)

Sources on Barbey

Barbey and friends may be aware of sources not easily found on the internet. Any newspaper will generally be a reliable source. If Barbey’s work is covered in a book that is not internet-searchable, it may be a reliable source. Sourcing for the biography should be coverage of Barbey and/or Barbey’s work, attributed to him, and not merely passing mention. Primary sources (such as his university web site) are inadequate. If there were an article on him in the journal where he is Associate Editor, it would probably qualify (because he would not be making the editorial decision on that). If he were the publisher, or controlled the publisher, it would not qualify.

Reliable independent sources
  • WAMC.org BRADLEY CORNELIUS “Dr. Aron Barbey, University of Illinois at Urbana-Champaign – Emotional Intelligence” APR 27, 2013
  • 2013 Carle Research Institute Awards October 2013, Research Newsletter. Singles out a paper for recognition, “Nutrient Biomarker Patterns, Cognitive Function, and MRI Measures of Brain Aging,” however, I found a paper by that title and Barbey is not listed as an author, nor could I find a connection with Barbey.
  • SMITHSONIAN MAGAZINE David Noonan, “How to Plug In Your Brain” MAY 2016
  • The New Yorker.  Emily Anthes  “Vietnam’s Neuroscientific Legacy” October 2, 2014 PASSING MENTION
  • MedicalXpress.com Liz Ahlberg Touchstone “Cognitive cross-training enhances learning, study finds” July 25, 2017

“Aron Barbey, a professor of psychology” (reliable sources make mistakes). Cites a study, the largest and most comprehensive to date, … published in the journal Scientific Reports: N. Ward et al., Enhanced Learning through Multimodal Training: Evidence from a Comprehensive Cognitive, Physical Fitness, and Neuroscience Intervention, Scientific Reports (2017).
The error indicates to me that this was actually written by Touchstone, based on information provided by the University of Illinois, not merely copied from that.

Iffy but maybe

My sense is that continued search could find much more. Barbey is apparently a mainstream neuroscientist, with some level of recognition. His article needs work by an experienced Wikipedian.

Notes for Wikipedians

An IP editor appeared in the Fringe Theories Noticeboard discussion pointing to this CFC post:

Abd is stalking and attacking you both on his blog [25] in regard to Aron Barbey. He has done the same on about 5 other articles of his. [26]. He was banned on Wikipedia yet he is still active on Wiki-media projects. Can this guy get banned for this? The Wikimedia foundation should be informed about his harassment. 82.132.217.30 (talk) 13:30, 16 December 2017 (UTC)

This behavior is clearly that of the sock family called Anglo Pyramidologist on Wikipedia, and when I discovered the massive damage that this family had done, I verified the most recent activity with stewards (many accounts were locked and IPs blocked) and I have continued documentation, which Wikipedia may use or not, as it chooses. It is all verifiable. This IP comment was completely irrelevant to the FTN discussion, but attempting to turn every conversation into an attack on favorite targets is common AP sock behavior. For prior edits in this sequence, see (from the meta documentation):

This new account is not an open proxy. However, I will file a request anyway, because the behavior is so clear, following up on the 193.70.12.231 activity.

I have private technical evidence that this is indeed the same account or strongly related to Anglo Pyramidologist, see the Wikipedia SPI.

(I have found other socks, some blocked, not included in that archive.)

I have also been compiling obvious socks and reasonable suspicions from RationalWiki, for this same user or set of users, after he created a revenge article there on me (as he had previously done with many others).  It’s funny that he is claiming stalking. He has obviously been stalking, finding quite obscure pages and now giving them much more publicity.

And I see that there is now more sock editing on RationalWiki: new accounts with nothing better to do than document that famous troll or pseudoscientist or anti-skeptic (none of which I am, but this is precisely what they claim). Thanks for the incoming links. Every little bit helps.

If anyone thinks that there is private information in posts that should not ethically be revealed, please contact me through my WMF email, it works. Comments are also open on this blog, and corrections are welcome.

On the actual topic of that FTN discussion, the article on Aron Barbey (with whom I have absolutely no connection): I have found better sources, and my guess is that there are even better ones available.

JzG weighs in

Nobody is surprised. Abd is obsessive. He even got banned from RationalWiki because they got bored with him. Not seeing any evidence of meatpuppetry or sockpuppetry here though. Guy (Help!) 20:16, 16 December 2017 (UTC)

This is a blog I started and run; I have control here. Guy behaves as if the Fringe Theories Noticeboard is his personal blog, where he can insult others without any necessity, including scientists like Barbey and a writer like me. And he lies. I cannot correct JzG’s lies on Wikipedia, but I can do it here.

I am not “banned” from RationalWiki. I was blocked by a sock of the massively disruptive user whom I had been documenting: on meta for the WMF, on RationalWiki, and on my blog when that documentation was deleted by the same sock. The stated cause of the block was not being “boring,” though they do block for that on RW. It was “doxxing.” As JzG should know, connecting accounts is not “doxxing.” Doxxing is the revelation of real names behind accounts that have not freely revealed them, or of personal identification, like place of employment.

“Not seeing any evidence of meatpuppetry or sockpuppetry here.” Really? That IP is obviously the same user as the one behind the globally blocked Anglo Pyramidologist accounts, pushing the same agenda, this time likely using a local cell phone provider (because the geolocation matches known AP locations), whereas the other socking, documented above, was done with open proxies.

Properly, that IP should have been blocked and the edits reverted as vandalism. But JzG likes attack dogs. They are useful for his purposes.

Paranoia strikes deep

Evil Big Physics is out to fool and deceive us! They don’t explain everything in ordinary language! If Steve Krivit was Fooled, how about Joe Six-Pack?

Krivit continues to rail at alleged deception.

Nov. 7, 2017 EUROfusion’s Role in the ITER Power Deception 

All his fuss about language ignores the really big problems with this kind of hot fusion research: it is extremely expensive; it is not clear that it will ever truly be practical; the claims of being environmentally benign are not actually proven, because there are problems with the generation of radioactive waste from reactor materials exposed to high neutron flux; and it is simply not clear that this is the best use of research resources.

That is, in fact, a complex problem, not made easier by Krivit’s raucous noises about fraud. Nevertheless, I want to complete this small study of how he approaches the writing of others, in this case, mostly, public relations people working for ITER or related projects.

ITERitation

Krivit continues his crusade against DECEPTION!

Nov. 7, 2017 List of Corrected Fusion Power Statements on the ITER Web Site

What has been done is to replace “input power” with “input heating power.” Krivit says this is to “differentiate between reactor input power and plasma heating input power.” He’s not wrong, but … “Input heating power” could still be misunderstood. In fact, all along what was meant by “input power” was plasma heating power, and it never meant total power consumption, not even total power consumption by the heating system, since there are inefficiencies in converting electrical power to plasma heating.

Krivit calls all this “false and misleading statements about the promised performance of the ITER fusion reactor” and claims “This misrepresentation was a key factor in the ITER organization’s efforts to secure $22 billion of public funding.”

If anyone was misled about ITER operation, they were not paying attention.

Krivit’s ITERation – Deja vu all over again

Krivit must be lonely; there is no news confirming Widom-Larsen theory, which has now been out for a dozen years with zero confirmation, only more post-hoc “explanations” that use or abuse it, for no demonstrated value, so far.

But, hey, he can always bash ITER, and he has done it again.