Archive for the ‘LHC’ tag

Oct 6, 2017

Fundamental Particles & Forces: What do we know?

Posted by in categories: chemistry, general relativity, particle physics, physics, quantum physics, science

Do you remember all the hoopla a few years back, when the Higgs boson was confirmed by physicists at the Large Hadron Collider? That’s the one called the ‘God particle’, because it was touted as helping to resolve the forces of nature into one elegant theory. Well, not so fast, bucko!…

First, some credit where credit is due: The LHC is a 27-kilometer ring of superconducting magnets interspersed with accelerating structures that boost the energy of the particles as they whip around and smash into each other. For physicists, and anyone who seeks a deeper understanding of what goes into everything, it certainly inspires awe.

The existence of the Higgs boson (aka the God particle) was predicted, and physicists were fairly certain it would be observed. But its discovery is, in a sense, a worst-case scenario for the Standard Model of particle physics: it points to shortcomings in our ability to model and predict things. Chemists have long had a master blueprint of atoms in the Periodic Table, which charts all the elements in their basic states. Physicists, however, are a long way from building anything analogous. That’s because we know a lot more about the atomic elements than about the fundamental building blocks of matter and energy.

Continue reading “Fundamental Particles & Forces: What do we know?” »

Aug 21, 2015

Exotic Pentaquark Particle Discovery & CERN’s Massive Data Center

Posted by in categories: big data, engineering, particle physics, physics, science


July 2015, as you know, was all systems go for CERN’s Large Hadron Collider (LHC). On a Saturday evening, proton collisions resumed at the LHC and the experiments began collecting data once again. With the observation of the Higgs already in our back pocket, it was time to turn up the dial and push the LHC into double-digit (TeV) energy levels. From a personal standpoint, I didn’t blink an eye hearing that large amounts of data were being collected at every turn. But I was quite surprised to learn the amount being collected and processed each day: about one petabyte.

Approximately 600 million times per second, particles collide within the LHC. The digitized summary of each crossing is recorded as a “collision event”. Physicists must then sift through the 30 petabytes or so of data produced annually to determine whether the collisions have thrown up any interesting physics. Needless to say, the hunt is on!

The CERN Data Center processes about one petabyte of data every day – the equivalent of around 210,000 DVDs. The center hosts 11,000 servers with 100,000 processor cores, and some 6,000 database changes are performed every second.
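As a quick sanity check on those figures (assuming decimal petabytes and 4.7 GB single-layer DVDs – my assumptions, not CERN’s), the arithmetic holds up:

```python
# Back-of-envelope check of the Data Center figures quoted above.
# Assumptions (mine, not CERN's): 1 PB = 10^6 GB, single-layer DVD = 4.7 GB.

PB_IN_GB = 1_000_000   # decimal petabyte in gigabytes
DVD_GB = 4.7           # single-layer DVD capacity

dvds_per_day = PB_IN_GB / DVD_GB
print(f"{dvds_per_day:,.0f} DVDs/day")  # ~212,766 - close to the quoted 210,000

# One petabyte per day would be ~365 PB/year, yet only ~30 PB or so is
# recorded annually: the experiments' trigger systems discard most raw
# collision data before it is ever stored.
```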

Continue reading “Exotic Pentaquark Particle Discovery & CERN's Massive Data Center” »

Apr 24, 2015

Article: Harnessing “Black Holes”: The Large Hadron Collider – Ultimate Weapon of Mass Destruction

Posted by in categories: astronomy, big data, computing, cosmology, energy, engineering, environmental, ethics, existential risks, futurism, general relativity, governance, government, gravity, information science, innovation, internet, journalism, law, life extension, media & arts, military, nuclear energy, nuclear weapons, open source, particle physics, philosophy, physics, policy, posthumanism, quantum physics, science, security, singularity, space, space travel, supercomputing, sustainability, time travel, transhumanism, transparency, treaties

Harnessing “Black Holes”: The Large Hadron Collider – Ultimate Weapon of Mass Destruction

Why the LHC must be shut down

Apr 24, 2015

CERN-Critics: LHC restart is a sad day for science and humanity!

Posted by in categories: astronomy, big data, complex systems, computing, cosmology, energy, engineering, ethics, existential risks, futurism, general relativity, governance, government, gravity, hardware, information science, innovation, internet, journalism, law, life extension, media & arts, military, nuclear energy, nuclear weapons, particle physics, philosophy, physics, policy, quantum physics, science, security, singularity, space, space travel, supercomputing, sustainability, time travel, transhumanism, transparency, treaties
PRESS RELEASE “LHC-KRITIK”/”LHC-CRITIQUE” www.lhc-concern.info
CERN-Critics: LHC restart is a sad day for science and humanity!

Continue reading “CERN-Critics: LHC restart is a sad day for science and humanity!” »

May 14, 2012

Consideration for Sub-Millisecond Pulsars (or the Lack Thereof)

Posted by in categories: existential risks, particle physics, physics, space

On a casual read last week of Duncan R. Lorimer’s well-regarded review of Binary and Millisecond Pulsars (2005), I noted the reference to the lack of pulsars with P < 1.5 ms. The review cites only a suggestion that this is due to gravitational-wave emission from r-mode instabilities; no solid reason has been offered for their absence from our Universe. As the surface magnetic field strength of such pulsars would be lower (B ∝ (P Ṗ)^(1/2)) than that of other pulsars, one could equally suggest that the lack of sub-millisecond pulsars is due to their weaker magnetic fields allowing cosmic-ray impacts to result in stable micro black hole (MBH) capture. If one could interpret the 10^8 G field strength adopted by G&M (Giddings & Mangano) as an approximate cut-off below which MBH are likely to be captured by neutron stars, then one would perhaps have some phenomenological evidence that MBH capture results in the destruction of neutron stars into black holes. One should note that more typical observed neutron stars have fields around 10^12 G – four orders of magnitude above the borderline-existence cases used in the G&M analysis, and so much less likely to capture.

That is not to say that an MBH would equate to a certain danger if captured by a planet such as Earth, where the density of matter is much lower and accretion rates are much more likely to be outweighed by radiation rates – an understanding backed up by the ‘safety assurance’ in observational evidence of white dwarf longevity. However, it does take us back to the question – regardless of the frequently mentioned theorem here on Lifeboat that states Hawking radiation should be impossible – of whether Hawking radiation, as an unobserved theoretical phenomenon, may be nowhere near as effective as derived in theoretical analysis. This oft-mentioned concern of ‘what if Hawking is wrong’ is of course addressed by the detailed G&M analysis, which set about proving safety in the scenario that Hawking radiation is ineffective at evaporating such phenomena.

Doubts about the neutron-star safety assurance immediately make one question how reliable the safety assurance of white dwarf longevity is. My belief has been that the white-dwarf assurance is highly rational (as derived in a few short pages in the G&M paper, and not particularly challenged except for the hypothesis that they may have over-estimated TeV-scale MBH size, which could reduce the likelihood of capture). It is quite difficult to imagine a body as dense as a white dwarf not capturing such hypothetical stable MBH over its lifetime of cosmic-ray exposure – which validates the G&M position that accretion rates therein must be vastly outweighed by radiation rates, so the even lower accretion rates on a planet such as Earth would be even less of a concern. However, given the gravity of the analysis, the various assumptions on which it is based perhaps deserve greater scrutiny, underscored by a recent concern that 20% of the mass/energy in current LHC collisions is unaccounted for.

Pulsars are often considered among the most accurate references in the Universe due to their regularity and predictability. How ironic if those pulsars which are absent from the Universe also provided a significant measurement.

Binary and Millisecond Pulsars, D.R. Lorimer: http://arxiv.org/pdf/astro-ph/0511258v1.pdf
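For readers who want numbers behind the B ∝ (P Ṗ)^(1/2) scaling above: the standard magnetic-dipole spin-down estimate is B ≈ 3.2 × 10^19 (P Ṗ)^(1/2) gauss, with the period P in seconds. A minimal sketch, using illustrative P and Ṗ values that I have chosen for the example (they are not taken from the post):

```python
import math

def surface_field_gauss(p_s: float, p_dot: float) -> float:
    """Dipole spin-down estimate of a pulsar's surface magnetic field:
    B ~ 3.2e19 * sqrt(P * Pdot) gauss, with the period P in seconds."""
    return 3.2e19 * math.sqrt(p_s * p_dot)

# Illustrative values (my assumptions, not from the post):
slow = surface_field_gauss(0.5, 1e-15)     # typical slow pulsar: of order 10^12 G
fast = surface_field_gauss(0.0015, 1e-20)  # 1.5 ms pulsar: of order 10^8 G
print(f"slow: {slow:.1e} G, fast: {fast:.1e} G")
```

With these inputs the fast, weakly magnetized case lands near the 10^8 G figure discussed above, while the typical case sits about four orders of magnitude higher.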

Apr 9, 2012

LHC-Critique Press Info: Instead of a neutral risk assessment of the LHC: New records and plans for costly upgrades at CERN

Posted by in categories: complex systems, cosmology, engineering, ethics, existential risks, futurism, media & arts, nuclear energy, particle physics, philosophy, physics, policy, scientific freedom, space, sustainability

High-energy experiments like the LHC at the nuclear research centre CERN are extreme energy consumers (needing the power of a nuclear plant). Their construction is extremely costly (around 7 billion euros to date) and practical benefits are not in sight. The experiments may pose existential risks, and these risks have not been properly investigated.

It is not the first time that CERN has announced record energies and news around April 1 – apparently hoping that critique and concerns about the risks might be misinterpreted as an April Fools’ joke. Additionally, CERN regularly starts up the LHC during Easter celebrations and just before weekends, when news offices are empty and people prefer to spend peaceful days with their friends and families.

CERN has just announced new record collision energies at the LHC. And instead of conducting a neutral risk assessment, the nuclear research centre plans costly upgrades of its Big Bang machine. Facing an LHC upgrade in 2013 costing up to CHF 1 billion, and the prospect of a Mega-LHC in 2022: how long will it take until risk researchers are finally integrated into a neutral safety assessment?

There is ample evidence of the necessity of an external and multidisciplinary safety assessment of the LHC. According to a pre-study in risk research, CERN meets less than a fifth of the criteria for a modern risk assessment (see the press release below). It is not acceptable that the clueless member states point at the operator, CERN, while CERN itself regards its self-imposed safety measures as sufficient – in spite of critique from risk researchers, continuous debate, and the publication of further papers pointing at concrete dangers and even existential risks (black holes, strangelets) that could eventually arise from the experiments. At present, science has to admit that the risk is disputed and essentially unknown.

Continue reading “LHC-Critique Press Info: Instead of a neutral risk assessment of the LHC: New records and plans for costly upgrades at CERN” »

Feb 12, 2012

CERN’s annual Chamonix-meeting to fix LHC schedules (Feb. 6–10 2012): Increasing energies. No external and multi-disciplinary risk assessment so far. Future plans targeting at Mega-LHC.

Posted by in categories: cosmology, engineering, ethics, events, existential risks, particle physics, physics, scientific freedom, sustainability, transparency

Info on the outcomes of CERN’s annual meeting in Chamonix this week (Feb. 6–10 2012):

In 2012 the LHC’s collision energy is to be increased from 3.5 to 4 TeV per beam, and the luminosity is planned to be greatly increased. This means many more particle collisions at higher energies.

CERN plans to shut down the LHC in 2013 for about 20 months for a very costly upgrade (CHF 1 billion?) in order to run the LHC at 7 TeV per beam afterwards.

Future plans: A High-Luminosity LHC (HL-LHC) is planned, “tentatively scheduled to start operating around 2022” — with a beam energy increased from 7 to 16.5 TeV(!).

Continue reading “CERN’s annual Chamonix-meeting to fix LHC schedules (Feb. 6-10 2012): Increasing energies. No external and multi-disciplinary risk assessment so far. Future plans targeting at Mega-LHC.” »

Feb 12, 2012

Badly designed to understand the Universe — CERN’s LHC in critical Reflection by great Philosopher H. Maturana and Astrophysicist R. Malina

Posted by in categories: complex systems, cosmology, education, engineering, ethics, existential risks, futurism, media & arts, particle physics, philosophy, physics, scientific freedom, sustainability

Famous Chilean philosopher Humberto Maturana describes “certainty” in science as subjective emotional opinion, to the astonishment of prominent physicists. French astronomer and “Leonardo” publisher Roger Malina hopes that the LHC safety issue will be discussed in a broader social context, not only within the closer scientific framework of CERN.

(Article published in “oekonews”: http://oekonews.at/index.php?mdoc_id=1067777 )

The most recent edition of the renowned “Ars Electronica Festival” in Linz (Austria) was dedicated in part to an uncritical worship of the gigantic particle accelerator LHC (Large Hadron Collider) at the European nuclear research centre CERN, located on the Franco-Swiss border. CERN in turn promoted an art prize with the idea of “cooperating closely” with the arts. This time the objections were of a philosophical nature – and they carried real weight.

In a thought-provoking presentation, Maturana addressed the limits of our knowledge and the intersubjective foundations of what we call “objective” and “reality.” His talk was peppered with excellent remarks and witty asides that did much to make these fundamental philosophical problems accessible: “Be realistic, be objective!”, Maturana pointed out, simply means that we want others to adopt our point of view. The great constructivist and founder of the concept of autopoiesis clearly distinguished his approach from a solipsistic position.

Continue reading “Badly designed to understand the Universe — CERN's LHC in critical Reflection by great Philosopher H. Maturana and Astrophysicist R. Malina” »

Apr 3, 2010

Natural selection of universes and risks for the parent civilization

Posted by in category: existential risks

Lee Smolin is said to believe (according to a personal communication from Danila Medvedev, who was told about it by John Smart; I tried to reach Smolin for comment, but failed) that global catastrophe is impossible, based on the following reasoning: the multiverse is dominated by those universes that are able to replicate. This self-replication occurs in black holes, and especially in those black holes that are created by civilizations. Thus, the parameters of the universe are selected so that civilizations cannot self-destruct before they create black holes. As a result, all physical processes by which a civilization might self-destruct are closed off or highly unlikely. An early version of Smolin’s argument is here: http://en.wikipedia.org/wiki/Lee_Smolin – but this early version was refuted in 2004, so he (probably) added the existence of civilizations as another condition for cosmic natural selection. In any case, even if this is not Smolin’s actual line of thought, it is quite a possible one.

I think this argument is not persuasive, since selection can operate both in the direction of universes with more viable civilizations and in the direction of universes with a larger number of civilizations, just as biological evolution favours fewer, more robust offspring in some species (mammals) and larger numbers of less viable offspring in others (plants, for example the dandelion). Since some parameters relevant to the development of civilizations are extremely difficult to adjust via the basic laws of nature (for example, the chances of nuclear war or a hostile AI), while it is easy to adjust the number of emerging civilizations, it seems to me that universes, if they replicate with the help of civilizations, would use the strategy of dandelions rather than the strategy of mammals. They would thus create many unstable civilizations, and we are most likely one of them (the self-indication assumption also supports this – see a recent post by Katja Grace: http://meteuphoric.wordpress.com/2010/03/23/sia-doomsday-the-filter-is-ahead/)

Still, some pressure can exist for the preservation of civilization. Namely, if an atomic bomb were as easy to create as dynamite – much easier than on Earth (which depends on the quantity of uranium and its chemical and nuclear properties, i.e., is determined by the basic laws of the universe) – then the chances of the average civilization’s survival would be lower. If Smolin’s hypothesis is correct, then we should encounter insurmountable difficulties in creating nano-robots, the microelectronics needed for strong AI, harmful accelerator experiments involving strangelets (except those that lead to the creation of black holes and new universes), and several other potentially dangerous technology trends whose success depends on basic properties of the universe, which may manifest themselves in the peculiarities of its chemistry.

In addition, Smolin’s evolution of universes implies that a civilization should create a black hole as early as possible in the course of its history, leading to replication of universes, because the later it happens, the greater the chance that the civilization will self-destruct before it can create black holes. Moreover, the civilization is not required to survive after the moment of “replication” (though survival may be useful for replication, if a civilization creates many black holes during a long existence). From these two points, it follows that we may underestimate the risks of black hole creation at the Large Hadron Collider.

Continue reading “Natural selection of universes and risks for the parent civilization” »

Mar 27, 2010

Critical Request to CERN Council and Member States on LHC Risks

Posted by in categories: complex systems, cosmology, engineering, ethics, existential risks, particle physics, policy

Experts regard safety report on Big Bang Machine as insufficient and one-dimensional

International critics of the high energy experiments planned to start soon at the particle accelerator LHC at CERN in Geneva have submitted a request to the Ministers of Science of the CERN member states and to the delegates to the CERN Council, the supreme controlling body of CERN.

The paper states that several risk scenarios (which must be described as global or existential risks) cannot currently be excluded. Under present conditions, the critics feel compelled to speak out against operating the LHC.

The submission includes assessments from experts in fields markedly missing from the physicist-only LSAG safety report: risk assessment, law, ethics and statistics. Further weight is added by the fact that these experts are all university-based – from Griffith University, the University of North Dakota and Oxford University respectively. In particular, the critics note that CERN’s official safety report lacks independence – all its authors have a prior interest in the LHC running – and that the report relies on physicist-only authors, when modern risk-assessment guidelines recommend including risk experts and ethicists as well.

Continue reading “Critical Request to CERN Council and Member States on LHC Risks” »