Monday, 26 December 2016

New Scientific and Technological Discoveries That Will Shape Our Future, Part One


At the end of the 20th century and the advent of the 21st, biological discovery came into full swing with the success of the Human Genome Project, whose first draft of the human gene map unfolded the "Book of Life".

After this success we entered the age of biology, which has disclosed many different aspects of the field, especially microbiology, cellular biology, neuroscience and medical biology. From the knowledge gathered in these subjects, stem cell biology and gene therapy came into existence, which may in the near future let us regenerate our faulty cells.

During this age of biology there were also significant developments in the field of quantum physics: the elegant formulation of string theory came into existence, which describes the multiple forms of matter as arising from the different vibrational frequencies of strings, the ultimate structure of matter.

 An artist's conception of a Higgs boson erupting from a collision of protons


The discovery of the Higgs boson (the "God particle") on 4 July 2012 at CERN, and the first direct observation of gravitational waves on 14 September 2015, announced by the LIGO and Virgo collaborations on 11 February 2016 (previously, gravitational waves had been inferred only indirectly, via their effect on the timing of pulsars in binary star systems), shifted theoretical physics into the experimental lab.

 The LIGO facility in Livingston, Louisiana, has a twin in Hanford, Washington.

 A schematic showing aLIGO's interferometer

After these major scientific revelations, some new developments have arisen that will revolutionize and shape our scientific future in the years ahead. We are in a transitional phase of science, which will take a quantum leap within a few years with the help of the knowledge gathered from these new studies and technologies.


These new scientific findings are discussed below:

1) The 5-year-long ENCODE (Encyclopedia of DNA Elements) Project:

2) Developments in the field of Neuroscience:

A) Direct brain-to-brain interface through Brain-Computer Interface (BCI) headgear:

B) Huge Human Brain Mapping Project:

3) Latest developments in gene therapy will make humans nearly immortal in the near future:

4) Endosymbiosis and the gradual evolution of ATP formation (energy production at the cellular level) cause microevolution, which on a large scale affects species evolution:

5) NA64 hunts the mysterious dark photon:

6) Quantum Computers:

7) Nanomachines: The Nobel Prize in Chemistry 2016 was awarded jointly to Jean-Pierre Sauvage, Sir J. Fraser Stoddart and Bernard L. Feringa "for the design and synthesis of molecular machines".

8) Making of an Artificial Leaf to produce efficient, pollution-free energy from CO2, sunlight and water:

9) 3D printing technology will revolutionize the Manufacturing and Medical industries:

1) The 5-year-long ENCODE (Encyclopedia of DNA Elements) Project:

After the success of the Human Genome Project, geneticists still could not define DNA or genetic material to its full extent. They realised that there were large gaps in our knowledge of DNA, which could not properly explain the complexity of all the biochemical reactions within our cells, nor fully cure cellular abnormalities.

They then realized that vast sections of the human genome that were previously thought to have no useful function, and were dismissed as "junk DNA", are in fact involved in key biochemical processes.


 Overview of the Encyclopedia of DNA Elements (ENCODE) project. (A) Genomic elements that are the targets of the ENCODE project (dashed arrows) and some of the methods (gray boxes) used to quantify them in more than 150 human cell lines. (B) Summary of ENCODE’s workflow. To gain access to human ENCODE data, navigate to the UCSC Genome Browser, select the February 2009 assembly, and jump to your genomic region of interest. ENCODE data can be found in the 'Expression and Regulation' and the 'Mapping, Genes, and Variation' track groups.

The five-year-long ENCODE (Encyclopedia of DNA Elements) project has attempted to catalogue the bulk of genetic material that does not fall under the category of protein-coding genes, the building blocks necessary for life, which comprise only about 2% of the human genome. ENCODE has revealed that most of the human genome is involved in the complex molecular choreography required for converting genetic information into living cells and organisms.
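To put that 2% figure in perspective, here is a back-of-the-envelope calculation, assuming the commonly cited haploid genome size of roughly 3.2 billion base pairs (the figure itself is an assumption of this sketch, not taken from ENCODE):

```python
# Rough scale of the protein-coding fraction of the human genome.
GENOME_BP = 3.2e9        # ~3.2 billion base pairs (haploid, commonly cited)
CODING_FRACTION = 0.02   # ~2% is protein-coding

coding_bp = GENOME_BP * CODING_FRACTION
noncoding_bp = GENOME_BP - coding_bp
print(f"coding: ~{coding_bp/1e6:.0f} Mb, non-coding: ~{noncoding_bp/1e9:.2f} Gb")
```

That leaves over three billion base pairs outside the protein-coding genes, which is exactly the territory ENCODE set out to annotate.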

2) Developments in the field of Neuroscience:

A) Direct brain-to-brain interface through Brain-Computer Interface (BCI) headgear:

Scientists at the University of Washington have successfully completed what is believed to be the most complex human brain-to-brain communication experiment ever. It allowed two people located a mile apart to play a game of "20 Questions" using only their brainwaves, a nearly imperceptible flash of light, and an internet connection to communicate.

 The cycle of the experiment. Brain signals from the “Sender” are recorded. When the computer detects imagined hand movements, a “fire” command is transmitted over the Internet to the TMS machine, which causes an upward movement of the right hand of the “Receiver.” This usually results in the “fire” key being hit

Brain-to-brain interfaces have gotten much more complex over the last several years. Miguel Nicolelis, a researcher at Duke University, has even created "organic computers" by connecting the brains of several rats and monkeys together.

 University of Washington researcher Rajesh Rao, left, plays a computer game with his mind. Across campus, researcher Andrea Stocco, right, wears a magnetic stimulation coil over the left motor cortex region of his brain. Stocco’s right index finger moved involuntarily to hit the “fire” button as part of the first human brain-to-brain interface demonstration.University of Washington

But in humans, the technology remains pretty basic, primarily because the most advanced brain-to-brain interfaces require direct access to the brain. We're not exactly willing to saw open a person's skull in the name of performing some rudimentary tasks for science.

Using two well-known technologies, electroencephalography (EEG) and transcranial magnetic stimulation (TMS), Andrea Stocco and Chantel Prat were able to increase the complexity of a human brain-to-brain interface.


The EEG was used to read one person's brain waves, while the TMS machine was used to create a "phosphene"—a ring of light perceptible only to the wearer—on the other.

In the experiment, the EEG wearer was shown an object—a shark, for instance. The TMS wearer then used a computer mouse to ask a question of the EEG wearer—maybe "can it fly?" The EEG wearer then focused on a screen flashing either "yes" or "no." The brain waves were then read and transferred via the internet. If the answer was "yes," the TMS wearer would see a phosphene, suggesting he or she was on the right track to guessing the object. In 72 percent of games, the guesser was able to eventually get to the correct object. In a control group, just 18 percent of guessers were.
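The yes/no channel of that loop can be sketched in a toy simulation. This is my own illustrative code, not the researchers'; the function names, and the assumption that the sender's answer is decoded by comparing EEG power at the two stimulus flicker frequencies, are simplifications of the published setup:

```python
# Conceptual sketch of the EEG "yes"/"no" decoder (hypothetical code).
# The sender attends to a screen region flashing at one of two frequencies;
# we classify the answer by which frequency dominates the EEG signal.
import math

def bandpower(signal, freq, sample_rate):
    """Power of `signal` at `freq` via a single discrete Fourier term."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * i / sample_rate)
             for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / sample_rate)
             for i, s in enumerate(signal))
    return (re * re + im * im) / n

def classify_answer(eeg, sample_rate=256, f_yes=13.0, f_no=12.0):
    """Return 'yes' if the 'yes' flicker frequency dominates, else 'no'."""
    yes_p = bandpower(eeg, f_yes, sample_rate)
    no_p = bandpower(eeg, f_no, sample_rate)
    return "yes" if yes_p > no_p else "no"

def simulated_eeg(attended_freq, sample_rate=256, seconds=2):
    """A toy EEG trace: a pure sinusoid at the attended flash frequency."""
    n = sample_rate * seconds
    return [math.sin(2 * math.pi * attended_freq * i / sample_rate)
            for i in range(n)]

# Sender focuses on the "yes" stimulus; a "yes" would trigger the TMS phosphene.
print(classify_answer(simulated_eeg(13.0)))  # prints "yes"
```

A real decoder would of course face noisy, multi-channel data rather than a clean sinusoid; the point is only that the information sent per question is a single bit.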

As I mentioned, both of these technologies are well known and are used regularly in medical settings. There is perhaps no totally new technological breakthrough here, but it's a clever way of hooking neurological devices to each other to complete a task. In a paper published in PLOS One, Stocco and Prat write that the task is "collaborative, potentially open-ended, operates in real-time, and requires conscious processing of incoming information."

"Because phosphenes are private to the receiver and can be perceived under a variety of conditions, or even while performing other actions, they represent a more versatile and interesting means of transferring information than [previous brain-to-brain interfaces]," they wrote.

Neither Stocco nor Prat was able to talk to me today because they were in the process of moving offices, but in a press release published by the university, they suggest that future experiments could be less gimmicky and more therapeutic. Future experiments will connect the brains of someone who suffers from ADHD and someone who doesn't, to see if it's possible to induce a higher state of attention in the ADHD student.

B) Huge Human Brain Mapping Project:

President Barack Obama announced a new research initiative this morning (April 2) to map the human brain, a project that will launch with $100 million in funding in 2014.

The Brain Activity Map (BAM) project, as it is called, has been in the planning stages for some time. In the June 2012 issue of the journal Neuron, six scientists outlined broad proposals for developing non-invasive sensors and methods to experiment on single cells in neural networks. This February, President Obama made a vague reference to the project in his State of the Union address, mentioning that it could "unlock the answers to Alzheimer's."


In March, the project's visionaries outlined their final goals in the journal Science. They call for an extended effort, lasting several years, to develop tools for monitoring up to a million neurons at a time. The end goal is to understand how brain networks function.

"It could enable neuroscience to really get to the nitty-gritty of brain circuits, which is the piece that's been missing from the puzzle," Rafael Yuste, the co-director of the Kavli Institute for Brain Circuits at Columbia University, who is part of the group spearheading the project, told LiveScience in March. "The reason it's been missing is because we haven't had the techniques, the tools." [Inside the Brain: A Journey Through Time]

Not all neuroscientists support the project, however, with some arguing that it lacks clear goals and may cannibalize funds for other brain research.

Missing puzzle piece

Currently, scientists can monitor the activity of a single neuron using electrodes. They can watch the whole brain in action using functional magnetic resonance imaging (fMRI) and other techniques. But the middle ground eludes them. How do neurons work together in networks? What happens when the brain's circuitry breaks down?

To find out, Yuste and his colleagues say, researchers must be able to monitor whole, interacting networks of neurons at once. Scientists also need tools to alter the action of individual neurons in a circuit in order to test the effects of a single cell on the whole system.

 A high definition fiber-tracking (HDFT) map of a million brain fibers

The plan, as laid out in the journal Science, is to begin with small-brained invertebrates and move up in brain complexity. Within five years, the researchers write, scientists should be able to monitor tens of thousands of neurons at once — within 10 years, hundreds of thousands. By year 15 of the project, the researchers plan to be able to monitor million-neuron networks, the size of an entire zebrafish's brain. This would also allow scientists to study significant chunks of the mouse cortex in one fell swoop.
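That roadmap implies roughly exponential growth in recording capacity, about tenfold every five years. A quick calculation of the implied doubling time (my own arithmetic, not a figure from the proposal):

```python
import math

# BAM roadmap, read as exponential growth: ~10^4 neurons at year 5,
# ~10^5 at year 10, ~10^6 at year 15 -> 10x per 5-year period.
growth_per_period = 10.0
period_years = 5.0

doubling_time = period_years * math.log(2) / math.log(growth_per_period)
print(f"recording capacity doubles roughly every {doubling_time:.1f} years")
```

That works out to a doubling roughly every year and a half, which is why the project's backers compare it to a Moore's-law-style trajectory for neuroscience.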

The scientists argue that the project would help develop technology such as nanoscale neural probes that could be used in the clinical treatment of brain problems. If successful, the project could also help explain the origins of autism, schizophrenia, dementia and depression. Additionally, it could lead to new treatments for stroke, spinal cord injury and other neurological diseases, they wrote. "All these brain diseases are also missing that piece," Yuste told LiveScience, referring to an understanding of neurocircuitry. "It's very likely that there are both mental diseases and also neurological diseases that will be greatly advanced by these technologies."

The Power of Light

Several promising light-based technologies have emerged in recent years for mapping brain activity. One such approach is multiphoton microscopy, where two or more photons are used to excite chromophores in vivo or in brain slices from mice. Using light at the infrared end of the spectrum and moving toward three photons instead of two, which has been the standard since the technique was developed in the 1990s, allows researchers to penetrate deeper into the brain.

Yuste said his lab, in collaboration with Karl Deisseroth at Stanford University and others, is working to combine two-photon microscopy with optogenetics in order to selectively excite neurons at different depths within living tissues. In a paper published in Nature Methods, they used this two-photon optogenetic approach, along with a newly developed light-activated cation channel from Deisseroth’s lab called C1V1, to generate action potentials in neurons with single-cell precision and to map neural circuits in mouse brain slices. In addition, Yuste’s group was also able to split the laser beam, which allowed simultaneous activation of neurons in three dimensions.

"With optogenetics you can really determine the functional connectivity of cells in a neural network," said Ed Boyden, a neuroscientist at MIT. "You can aim light at the exact set of cells you want and thereby activate those cells and not the other ones."

 Neural progenitor cells cultured from the developing brain

Similar to optogenetics, optochemical genetics is another light-based approach that holds promise for mapping brain activity. Optochemical genetics takes advantage of caged ligands—compounds that are synthesized in the lab by taking any neurotransmitters containing nitrogen (glutamate, GABA, serotonin, nicotine, among others) and attaching an inactivating group that is photolabile. The caged ligand remains inactive until exposed to light, at which time the inactivating group is photocleaved and releases the active neurotransmitter. In this way, researchers are able to tease apart the functions of individual neurotransmitters in specific circuits within the brain.

3) Latest developments in gene therapy will make humans nearly immortal in the near future:

 Ex vivo and in vivo strategies for therapeutic genome editing

 Diversity of targets for therapeutic genome editing

Gene therapy has historically been defined as the addition of new genes to human cells. However, the recent advent of genome-editing technologies has enabled a new paradigm in which the sequence of the human genome can be precisely manipulated to achieve a therapeutic effect. This includes the correction of mutations that cause disease, the addition of therapeutic genes to specific sites in the genome, and the removal of deleterious genes or genome sequences.

 Common DNA targeting platforms for genome editing

This review presents the mechanisms of different genome-editing strategies and describes each of the common nuclease-based platforms, including zinc finger nucleases, transcription activator-like effector nucleases (TALENs), meganucleases, and the CRISPR/Cas9 system. We then summarize the progress made in applying genome editing to various areas of gene and cell therapy, including antiviral strategies, immunotherapies, and the treatment of monogenic hereditary disorders. The current challenges and future prospects for genome editing as a transformative technology for gene and cell therapy are also discussed.


The realization of the genetic basis of hereditary disease led to the early concept of gene therapy, in which "exogenous 'good' DNA be used to replace the defective DNA in those who suffer from genetic defects" [1]. More than 40 years of research since this proposal has shown the simple idea of gene replacement to be much more challenging and technically complex to implement, both safely and effectively, than originally appreciated. Many of these challenges centered on fundamental limitations in the ability to precisely control how genetic material was introduced to cells.

Nevertheless, the technologies for addition of exogenous genes have made remarkable progress during this time and are now showing promising clinical results across a range of strategies and medical indications [2]. However, several challenges still remain. Integrating therapeutic genes into the genome for stable maintenance in replicating cells can have unpredictable effects on gene expression and unintended effects on neighboring genes [3]. Moreover, some therapeutic genes are too large to be readily transferred by available delivery vectors. Finally, the addition of exogenous genes cannot always directly address dominant mutations or remove unwanted genetic material such as viral genomes or receptors.

To address these fundamental limitations of conventional methods for gene addition, the field of gene editing has emerged to make precise, targeted modifications to genome sequences. Here we review the recent exciting developments in the ease of use, specificity, and delivery of gene-editing technologies and their application to treating a wide variety of diseases and disorders.

To insert the corrected gene into the patient's targeted cells, a carrier molecule called a vector must be used. The most common form of vector is a virus that has been genetically modified to contain human DNA. The viruses are modified by replacing disease-causing genes with genes encoding the desired effect. Thus the virus can be used as a 'vehicle' to carry the therapeutic genes into the targeted human cells in a non-pathogenic manner (see figure 2 below). The target cells are usually a patient's liver or lung cells, where the viral vector transfers the therapeutic gene into the target cell. The therapeutic gene drives the production of functional proteins and restores the cell to its normal state.

 Mechanisms of double-strand break repair

There are six main types of viruses used as vectors in gene therapy (shown in table below):

1. Retroviruses - A class of viruses that can create double-stranded DNA copies of their RNA genomes. These copies of its genome can be integrated into the chromosomes of host cells. Human immunodeficiency virus (HIV) is a retrovirus.

2. Adenoviruses - A class of viruses with double-stranded DNA genomes that cause respiratory, intestinal, and eye infections in humans. The virus that causes the common cold is an adenovirus.

3. Adeno-associated viruses - A class of small, single-stranded DNA viruses that can insert their genetic material at a specific site on chromosome 19.

4. Herpes simplex viruses - A class of double-stranded DNA viruses that infect a particular cell type, neurons. Herpes simplex virus type 1 is a common human pathogen that causes cold sores [13].

5. Alphaviruses - A class of single-stranded, positive-sense RNA viruses; viral vectors have been developed in particular from Ross River virus, Sindbis virus, Semliki Forest virus and Venezuelan equine encephalitis virus.

6. Vaccinia and other poxviruses - Large, complex, enveloped viruses belonging to the poxvirus family, with a linear, double-stranded DNA genome of approximately 190 kb encoding around 250 genes. Vaccinia can accept as much as 25 kb of foreign DNA, making it especially useful for expressing large genes in gene therapy.

 Showing the viruses used as vectors

Non-Viral Options

The simplest non-viral option is direct insertion of the therapeutic DNA into the target cells. However, this method poses some problems: it requires large amounts of DNA and can only be used with certain tissue types.

Another non-viral approach chemically links the therapeutic DNA to special receptors on the target cells. The DNA binds to the receptors, and the cell membrane engulfs the DNA and transfers it to the interior of the targeted cell. However, this delivery system is not very efficient.

A liposome (an artificial lipid sphere with an aqueous core) can also be used to transfer the therapeutic DNA to the target cell through the cell’s membrane.

Challenges in Gene Therapy

Issues with viral vectors - Viruses, while the most common choice of vector in gene therapy, pose several problems. They can trigger immune and inflammatory responses and induce toxicity in the patient. There are also numerous concerns about a virus's ability to regain its capability of causing disease.

Gene therapy is short-lived - There are problems with integrating the therapeutic DNA into the patient's genome, limiting the long-term benefits of gene therapy, so patients have to undergo a series of treatments. Before gene therapy can become a permanent cure for a disease, the DNA needs to be integrated into the genome and remain functional and stable within the target cell.

Immune response - There is always a risk of triggering an immune attack when inserting the therapeutic gene into the patient, since the immune system recognizes the vector as a foreign particle. Additionally, the immune system's enhanced response to invaders it has seen before makes it difficult to repeat gene therapy in the same patient.

Multigene disorders - Some common diseases, such as Alzheimer's disease, arthritis and diabetes, arise from the combined effect of variations and mutations in many genes. Such multifactorial disorders are hard to treat with gene therapy, since conditions arising from single-gene mutations are the best candidates for gene therapy.

4) Endosymbiosis and the gradual evolution of ATP formation (energy production at the cellular level) cause microevolution, which on a large scale affects species evolution:

This opens up a whole new spectrum of evolutionary biology.

Endosymbiosis: Lynn Margulis


Symbiotic microbes = eukaryote cells?
In the late 1960s Margulis studied the structure of cells. Mitochondria, for example, are wriggly bodies that generate the energy required for metabolism. To Margulis, they looked remarkably like bacteria. She knew that scientists had been struck by the similarity ever since the discovery of mitochondria at the end of the 1800s. Some even suggested that mitochondria began as bacteria that lived in a permanent symbiosis within the cells of animals and plants.

 Mitochondria are thought to have descended from close relatives of typhus-causing bacteria

There were parallel examples in all plant cells. Algae and plant cells have a second set of bodies that they use to carry out photosynthesis. Known as chloroplasts, they capture incoming sunlight energy. The energy drives biochemical reactions including the combination of water and carbon dioxide to make organic matter. Chloroplasts, like mitochondria, bear a striking resemblance to bacteria. Scientists became convinced that chloroplasts (below right), like mitochondria, evolved from symbiotic bacteria — specifically, that they descended from cyanobacteria (above right), the light-harnessing small organisms that abound in oceans and fresh water.

 Margulis and others hypothesized that chloroplasts (bottom) evolved from cyanobacteria (top)

When one of her professors saw DNA inside chloroplasts, Margulis was not surprised. After all, that's just what you'd expect from a symbiotic partner. Margulis spent much of the rest of the 1960s honing her argument that symbiosis (see figure, below) was an unrecognized but major force in the evolution of cells. In 1970 she published her argument in The Origin of Eukaryotic Cells.

The genetic evidence
In the 1970s scientists developed new tools and methods for comparing genes from different species. Two teams of microbiologists — one headed by Carl Woese, and the other by W. Ford Doolittle at Dalhousie University in Nova Scotia — studied the genes inside chloroplasts of some species of algae. They found that the chloroplast genes bore little resemblance to the genes in the algae's nuclei. Chloroplast DNA, it turns out, was cyanobacterial DNA. The DNA in mitochondria, meanwhile, resembles that within a group of bacteria that includes the type of bacteria that causes typhus (see photos, right). Margulis has maintained that earlier symbioses helped to build nucleated cells. For example, spiral-shaped bacteria called spirochetes were incorporated into all organisms that divide by mitosis. Tails on cells such as sperm eventually resulted. Most researchers remain skeptical about this claim.

It has become clear that symbiotic events have had a profound impact on the organization and complexity of many forms of life. Algae have swallowed up bacterial partners, and have themselves been included within other single cells. Nucleated cells are more like tightly knit communities than single individuals. Evolution is more flexible than was once believed.

Endosymbiotic Theory Introduction:

The hypothesized process by which prokaryotes gave rise to the first eukaryotic cells is known as endosymbiosis, and it certainly ranks among the most important evolutionary events. Endosymbiotic theory, which attempts to explain the origins of eukaryotic cell organelles such as mitochondria in animals and fungi and chloroplasts in plants, was greatly advanced by the seminal work of biologist Lynn Margulis in the 1960s. Mitochondria are one of the many different types of organelles in the cells of all eukaryotes. In general, they are considered to have originated from proteobacteria (likely Rickettsiales) through endosymbiosis. Chloroplasts are one of the many different types of organelles in the plant cell. In general, they are considered to have originated from cyanobacteria through endosymbiosis. Endosymbiosis has gained ever more acceptance in the last half century, especially with the relatively recent advent of sequencing technologies. There are many variants of the theory, regarding which organism(s) engulfed which other organism(s), as well as how many times and when it occurred across geological time.

Endosymbiotic Theory and Eukaryotic Origins

Such symbiotic relationships, in which two species are dependent upon one another to varying extents, also served as crucial elements of the evolution of eukaryotic cells. The theory holds that the eukaryotic mitochondrion evolved from a small aerobic bacterium that was engulfed by a larger, primitive, heterotrophic eukaryotic cell. This eukaryotic cell arose when an anaerobic prokaryote (unable to use oxygen for energy) lost its cell wall. The more flexible membrane underneath then began to grow and fold in on itself, which in turn led to the formation of a nucleus and other internal membranes.

Endosymbiosis occurred according to the figure to the right: a) The primitive eukaryotic cell eventually became able to eat prokaryotes, a marked improvement over absorbing small molecules from its environment. b) The process of endosymbiosis commenced when the eukaryote engulfed, but did not digest, an aerobic bacterium. Evidence suggests this engulfed bacterium was an alphaproteobacterium, an aerobe that uses respiration to acquire energy. c) The eukaryote then began a mutually beneficial (symbiotic) relationship with it, whereby the eukaryote provided protection and nutrients to the prokaryote and, in return, the prokaryotic endosymbiont provided additional energy to its eukaryotic host through its respiratory cellular machinery. d) The relationship became permanent over time, completing primary endosymbiosis, as the endosymbiont lost some genes it used for independent life and transferred others to the eukaryote's nucleus. The symbiont thus became dependent on the host cell for organic molecules and inorganic compounds, and the bearer of the respiratory machinery became a mitochondrion. Endosymbiotic theory hypothesizes a similar origin for chloroplasts: a eukaryote that already had mitochondria engulfed a photosynthetic cyanobacterium in a symbiotic relationship that ended in the chloroplast organelle.

Digging deeper, the symbiosis is analogous to that between plants and their "birds and bees" symbionts. The aerobic bacterium thrived within the cell cytoplasm, which provided abundant molecular food for its heterotrophic existence. The bacterium digested these molecules, manufacturing enormous amounts of energy in the form of adenosine triphosphate (ATP), so much that extra ATP was available to the host cell's cytoplasm. This enormously benefited the anaerobic cell, which thereby gained the ability to digest food aerobically. Eventually, the aerobic bacterium could no longer live independently of the cell, and it evolved into the mitochondrion organelle. Such aerobically obtained energy vastly exceeded that of anaerobic respiration, setting the stage for vastly accelerated evolution of eukaryotes.
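The energy advantage is easy to quantify with standard textbook figures (these yields are my addition, not from the passage above): anaerobic glycolysis alone nets about 2 ATP per glucose, while full aerobic respiration yields roughly 30 or more.

```python
# Rough ATP yield per glucose molecule (standard textbook estimates).
ANAEROBIC_ATP = 2    # glycolysis alone (fermentation)
AEROBIC_ATP = 30     # glycolysis + Krebs cycle + oxidative phosphorylation
                     # (modern estimates ~30-32; older texts cite 36-38)

advantage = AEROBIC_ATP / ANAEROBIC_ATP
print(f"Aerobic respiration yields ~{advantage:.0f}x more ATP per glucose")
```

A roughly fifteenfold energy payoff per glucose molecule is what made the engulfed aerobe such a valuable partner to its anaerobic host.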

Endosymbiotic theory posits a later parallel origin of the chloroplasts; a cell ate a photosynthetic cyanobacterium and failed to digest it. The cyanobacterium thrived in the cell and eventually evolved into the first chloroplast. Other eukaryotic organelles may have also evolved through endosymbiosis; it has been proposed that cilia, flagella, centrioles, and microtubules may have originated from a symbiosis between a Spirochaete bacterium and an early eukaryotic cell, but this is not yet broadly accepted among biologists.

 Animated GIF of ATP synthase

In the race of any kind of evolution or modification, the main thing that matters is energy production: an organism that produces and uses energy more efficiently will be the winner in the evolutionary race. At the cellular level the main source of energy is ATP (adenosine triphosphate), and the main biomolecular nanomachine in our cells that produces ATP is ATP synthase.

Everything you do needs energy to drive it, whether it is running up a flight of stairs, eating breakfast, fighting off an infection, or even just producing new proteins for hair growth.  The energy currency used by all cells from bacteria to man is adenosine triphosphate (ATP).  Every process within an organism – DNA and protein synthesis, muscle contraction, active transport of nutrients and ions, neural activity, maintenance of osmosis, carbon fixation – requires a source of ATP.

Why is ATP such a good source of energy?

ATP is a nucleoside triphosphate (ribose sugar, adenine base and three phosphate groups), where a high-energy bond attaches the third phosphate group to the molecule.  This bond is highly unstable, and when it is hydrolysed it releases a substantial amount of free energy (approximately 7 kcal/mole).  In addition to providing energy, ATP has other essential roles within cells:  it is one of the four nucleotides required for the synthesis of DNA (replication) and RNA (protein synthesis); it regulates certain biochemical pathways; in mammals it is released from damaged cells to elicit a pain response; and in photosynthetic organisms it drives carbon fixation.  However, as it is unstable (cannot be stored for long) and is used for almost every conceivable process, each cell in the body must constantly produce ATP to supply its needs.  In total, an organism’s requirement for ATP is substantial: the average human body generates over 100 kg of ATP per day.  ATP synthase is the prime producer of ATP in cells, catalysing the combination of ADP (adenosine diphosphate) with inorganic phosphate to make ATP:

ADP + Pi → ATP + H2O
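It is worth pausing on that "100 kg of ATP per day" figure. Using the molar mass of ATP (about 507 g/mol, a value I am supplying here, not taken from the passage), the molecule count comes out astronomically large:

```python
# Back-of-the-envelope: what does "100 kg of ATP per day" mean in molecules?
ATP_MOLAR_MASS = 507.18   # g/mol for adenosine triphosphate
AVOGADRO = 6.022e23       # molecules per mole

grams_per_day = 100_000   # 100 kg
moles = grams_per_day / ATP_MOLAR_MASS
molecules = moles * AVOGADRO
print(f"{moles:.0f} mol, i.e. about {molecules:.1e} ATP molecules per day")
```

Since the body holds only a small standing pool of ATP at any moment, each ADP/ATP molecule must be recycled through ATP synthase many hundreds of times a day to sustain that turnover.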

ATP Synthase, an Early Enzyme of Life

Because of its fundamental importance in sustaining life, organisms evolved ATP synthase (ATPase) early during evolution, making it one of the oldest of enzymes - even predating photosynthetic and respiratory enzyme machinery.  As a result, ATPase has remained a highly conserved enzyme throughout all kingdoms of life:  the ATPases found in the thylakoid membranes of chloroplasts and in the inner membranes of mitochondria in eukaryotes retain essentially the same structure and function as their enzymatic counterparts in the plasma membranes of bacteria.  In particular, the subunits that are essential for catalysis show striking homology between species.

The ATPase Family
ATPases are membrane-bound ion transporters (often loosely described as channels, though they are not true ion channels) that couple ion movement across a membrane with the synthesis or hydrolysis of a nucleotide, usually ATP. Different forms of membrane-associated ATPases have evolved to meet the specific demands of cells, and have been classified as F-, V-, A-, P- and E-ATPases based on functional differences. All of them catalyse ATP synthesis and/or hydrolysis. The driving force for ATP synthesis is a H+ gradient, while during ATP hydrolysis the energy released by breaking ATP's terminal phosphoanhydride bond drives the creation of an ion gradient. Structurally, these ATPases differ: the F-, V- and A-ATPases are multi-subunit complexes with a similar architecture and possibly a similar catalytic mechanism, transporting ions with rotary motors. The P-ATPases are quite distinct in their subunit composition and in the ions they transport, and do not appear to use a rotary motor. The different types of ATPases are discussed below:

F-ATPases
The F-ATPases (for ‘phosphorylation Factor’, also known as H+-transporting ATPases or F(0)F(1)-ATPases) are the prime enzymes used for ATP synthesis, and are remarkably conserved throughout evolution. They are found in the plasma membranes of bacteria, in the thylakoid membranes of chloroplasts, and in the inner membranes of mitochondria. These membrane proteins synthesize ATP using a H+ gradient, and can work in reverse to create a H+ gradient using the energy gained from the hydrolysis of ATP. In certain bacteria, Na+-transporting F-ATPases have also been found.

V-ATPases
V-ATPases (for ‘Vacuole’) are found in the eukaryotic endomembrane system (vacuoles, Golgi apparatus, endosomes, lysosomes, clathrin-coated vesicles that transport external substances into the cell, and plant tonoplasts), and in the plasma membranes of prokaryotes and certain specialised eukaryotic cells. V-ATPases hydrolyse ATP to drive a proton pump, but cannot work in reverse to synthesize ATP. V-ATPases are involved in a variety of vital intra- and inter-cellular processes such as receptor-mediated endocytosis, protein trafficking, active transport of metabolites, homeostasis and neurotransmitter release.

A-ATPases
A-ATPases (for ‘Archaea’) are found exclusively in Archaea and have a similar function to F-ATPases (reversible ATPases), even though structurally they are closer to V-ATPases.  A-ATPases may have arisen as an adaptation to the different cellular needs and the more extreme environmental conditions faced by Archaeal species.

P-ATPases
P-ATPases (also known as E1-E2 ATPases) are found in bacteria and in a number of eukaryotic plasma membranes and organelles. P-ATPases transport a variety of different compounds, including ions and phospholipids, across a membrane using ATP hydrolysis for energy. There are many different classes of P-ATPases, each of which transports a specific type of ion: H+, Na+, K+, Mg2+, Ca2+, Ag+ and Ag2+, Zn2+, Co2+, Pb2+, Ni2+, Cd2+, Cu+ and Cu2+. For example, the gastric P-ATPase is a H+/K+ pump responsible for acid secretion in the stomach: it transports H+ out of the cytoplasm of stomach parietal cells in exchange for K+ ions moving in, creating a large pH gradient, with ATP hydrolysis as the energy source. P-ATPases can be composed of one or two polypeptides (fewer than the other ATPases), and can assume two conformations, called E1 and E2.
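The scale of that gastric gradient can be put into numbers with a short, illustrative calculation (the pH values and the ΔG = 2.303·R·T·ΔpH formula are assumptions added for this sketch; the membrane-potential contribution is ignored):

```python
# Rough sketch: free energy to move 1 mol of H+ against the gastric pH
# gradient, ignoring the membrane-potential term. The pH values below are
# illustrative assumptions, not figures from the text.
R = 8.314            # J/(mol*K), gas constant
T = 310.0            # K, approximate body temperature
pH_cytoplasm = 7.4   # assumed cytoplasmic pH of a parietal cell
pH_lumen = 1.0       # assumed pH of the stomach lumen

delta_pH = pH_cytoplasm - pH_lumen
dG = 2.303 * R * T * delta_pH   # J/mol of H+ pumped

print(f"~{dG/1000:.1f} kJ/mol to pump H+ across a {delta_pH:.1f}-unit pH gradient")
# -> roughly 38 kJ/mol
```

This is comparable to the free energy available from ATP hydrolysis under cellular conditions, which is why pumping against such an extreme gradient is energetically demanding work for the cell.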

E-ATPases
E-ATPases (for ‘Extracellular’) are membrane-bound cell-surface enzymes with broad substrate specificity, hydrolysing other NTPs besides ATP, as well as NDPs; their most likely physiological substrates are extracellular ATP, ADP and UTP. There are at least three classes of E-ATPases: ecto-ATPases, CD39s, and ecto-ATP/Dases. An example is the ecto-ATPase from chicken smooth muscle membranes, which is thought to exhibit a range of activities determined by the oligomerisation of the enzyme, which in turn is affected by different membrane events.
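The family descriptions above can be condensed into a small lookup table; this is an informal summary paraphrasing the text, not an authoritative classification:

```python
# Informal summary of the ATPase families described above.
ATPASE_FAMILIES = {
    "F": {"mnemonic": "phosphorylation Factor", "reversible": True,
          "rotary_motor": True,
          "location": "bacterial plasma membranes, thylakoids, mitochondria"},
    "V": {"mnemonic": "Vacuole", "reversible": False,  # hydrolysis only
          "rotary_motor": True,
          "location": "eukaryotic endomembrane system"},
    "A": {"mnemonic": "Archaea", "reversible": True,   # functionally like F
          "rotary_motor": True,
          "location": "Archaea"},
    "P": {"mnemonic": "E1-E2", "reversible": False,
          "rotary_motor": False,
          "location": "bacteria, eukaryotic membranes and organelles"},
    "E": {"mnemonic": "Extracellular", "reversible": False,
          "rotary_motor": False,
          "location": "cell surface"},
}

# e.g. list the families thought to work as rotary motors
rotary = [name for name, props in ATPASE_FAMILIES.items() if props["rotary_motor"]]
print(rotary)  # ['F', 'V', 'A']
```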

5) NA64 hunts the mysterious dark photon:

Researchers have studied electron-positron (e⁺e⁻) collisions for interactions that produce a normal photon γ and a dark photon A′ that interacts with ordinary matter particles. The dark photon can potentially decay into an e⁺e⁻ pair (shown here) or a μ⁺μ⁻ pair (not shown). However, the latest results from the BaBar collaboration offer no sign of dark photons, thus placing new limits on these types of models.

One of the biggest puzzles in physics is that eighty-five percent of the matter in our universe is “dark”: it does not interact with the photons of the conventional electromagnetic force and is therefore invisible to our eyes and telescopes. Although the composition and origin of dark matter are a mystery, we know it exists because astronomers observe its gravitational pull on ordinary visible matter such as stars and galaxies.

An overview of the NA64 experimental set-up at CERN. NA64 hunts down dark photons, the hypothetical mediators between dark and visible matter.

Some theories suggest that, in addition to gravity, dark matter particles could interact with visible matter through a new force, which has so far escaped detection. Just as the electromagnetic force is carried by the photon, this dark force is thought to be transmitted by a particle called the “dark photon”, which is predicted to act as a mediator between visible and dark matter.

“To use a metaphor, an otherwise impossible dialogue between two people not speaking the same language (visible and dark matter) can be enabled by a mediator (the dark photon), who understands one language and speaks the other one,” explains Sergei Gninenko, spokesperson for the NA64 collaboration.

CERN’s NA64 experiment looks for signatures of this visible-dark interaction using a simple but powerful physics concept: the conservation of energy. A beam of electrons, whose initial energy is known very precisely, is aimed at a detector. Interactions between the incoming electrons and atomic nuclei in the detector produce visible photons, whose measured energy should be equivalent to that of the electrons. However, if dark photons exist, they will escape the detector and carry away a large fraction of the initial electron energy.

Therefore, the signature of the dark photon is an event registered in the detector with a large amount of “missing energy” that cannot be attributed to a process involving only ordinary particles, providing a strong hint of the dark photon’s existence.

If confirmed, the existence of the dark photon would represent a breakthrough in our understanding of the longstanding dark matter mystery.
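The missing-energy idea can be caricatured in a few lines of code (the beam energy and threshold below are invented for illustration; the real NA64 analysis involves detailed detector reconstruction):

```python
# Toy illustration of NA64's missing-energy signature (hypothetical numbers).
# Energy conservation: if the detected energy falls well short of the known
# beam energy, the difference could have been carried off by an invisible
# particle such as a dark photon.
BEAM_ENERGY_GEV = 100.0        # assumed incoming electron energy
MISSING_ENERGY_CUT_GEV = 50.0  # assumed threshold for a candidate event

def is_dark_photon_candidate(detected_energy_gev: float) -> bool:
    """Flag events where a large fraction of the beam energy is missing."""
    missing = BEAM_ENERGY_GEV - detected_energy_gev
    return missing > MISSING_ENERGY_CUT_GEV

print(is_dark_photon_candidate(98.0))  # ordinary event: energy accounted for
print(is_dark_photon_candidate(30.0))  # 70 GeV missing: candidate event
```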

Dark matter particles may interact with each other via the “dark photon”

Now we know that the word “dark” does not always refer to evil; here it refers to the unknown entities that largely govern our Universe.