A scientific theory is an explanation of an aspect of the natural world that can be repeatedly tested and verified in accordance with the scientific method, using accepted protocols of observation, measurement, and evaluation of results. Where possible, theories are tested under controlled conditions in an experiment. 

Here is a list of a few scientific theories that revolutionized scientific progress and research:

The Grand Unified Theory



A Grand Unified Theory (GUT) is a model in particle physics in which, at high energies, the three gauge interactions of the Standard Model that define the electromagnetic, weak, and strong interactions, or forces, are merged into a single force. Although this unified force has not been directly observed, many GUT models theorize its existence. If unification of these three interactions is possible, it raises the possibility that there was a grand unification epoch in the very early universe in which these three fundamental interactions were not yet distinct.

Experiments have confirmed that at high energy, the electromagnetic interaction and weak interaction unify into a single electroweak interaction. GUT models predict that at even higher energy, the strong interaction and the electroweak interaction will unify into a single electronuclear interaction. This interaction is characterized by one larger gauge symmetry and thus several force carriers, but one unified coupling constant. Unifying gravity with the electronuclear interaction would provide a theory of everything (TOE) rather than a GUT. GUTs are often seen as an intermediate step towards a TOE.

The novel particles predicted by GUT models are expected to have extremely high masses, around the GUT scale of 10^16 GeV (just a few orders of magnitude below the Planck scale of 10^19 GeV), and so are well beyond the reach of any foreseen particle collider experiments. Therefore, the particles predicted by GUT models cannot be observed directly; instead, the effects of grand unification might be detected through indirect observations such as proton decay, electric dipole moments of elementary particles, or the properties of neutrinos. Some GUTs, such as the Pati–Salam model, predict the existence of magnetic monopoles.

GUT models which aim to be completely realistic are quite complicated, even compared to the Standard Model, because they need to introduce additional fields and interactions, or even additional dimensions of space. The main reason for this complexity lies in the difficulty of reproducing the observed fermion masses and mixing angles, which may be related to the existence of additional family symmetries beyond the conventional GUT models. Due to this difficulty, and due to the lack of any observed effect of grand unification so far, there is no generally accepted GUT model.

Models that do not unify the three interactions using one simple group as the gauge symmetry, but do so using semisimple groups, can exhibit similar properties and are sometimes referred to as Grand Unified Theories as well.

Historically, the first true GUT, which was based on the simple Lie group SU(5), was proposed by Howard Georgi and Sheldon Glashow in 1974. The Georgi–Glashow model was preceded by the semisimple Lie algebra Pati–Salam model by Abdus Salam and Jogesh Pati, who pioneered the idea to unify gauge interactions.

The acronym GUT was first coined in 1978 by CERN researchers John Ellis, Andrzej Buras, Mary K. Gaillard, and Dimitri Nanopoulos; however, in the final version of their paper they opted for the less anatomical GUM (Grand Unification Mass). Later that year, Nanopoulos was the first to use the acronym in a paper.


String Theory



In physics, string theory is a theoretical framework in which the point-like particles of particle physics are replaced by one-dimensional objects called strings. It describes how these strings propagate through space and interact with each other. On distance scales larger than the string scale, a string looks just like an ordinary particle, with its mass, charge, and other properties determined by the vibrational state of the string. In string theory, one of the many vibrational states of the string corresponds to the graviton, a quantum mechanical particle that carries gravitational force. Thus string theory is a theory of quantum gravity.

String theory is a broad and varied subject that attempts to address a number of deep questions of fundamental physics. String theory has been applied to a variety of problems in black hole physics, early universe cosmology, nuclear physics, and condensed matter physics, and it has stimulated a number of major developments in pure mathematics. Because string theory potentially provides a unified description of gravity and particle physics, it is a candidate for a theory of everything, a self-contained mathematical model that describes all fundamental forces and forms of matter. Despite much work on these problems, it is not known to what extent string theory describes the real world or how much freedom the theory allows in the choice of its details.

String theory was first studied in the late 1960s as a theory of the strong nuclear force, before being abandoned in favor of quantum chromodynamics. Subsequently, it was realized that the very properties that made string theory unsuitable as a theory of nuclear physics made it a promising candidate for a quantum theory of gravity. The earliest version of string theory, bosonic string theory, incorporated only the class of particles known as bosons. It later developed into superstring theory, which posits a connection called supersymmetry between bosons and the class of particles called fermions. Five consistent versions of superstring theory were developed before it was conjectured in the mid-1990s that they were all different limiting cases of a single theory in 11 dimensions known as M-theory. In late 1997, theorists discovered an important relationship called the AdS/CFT correspondence, which relates string theory to another type of physical theory called a quantum field theory.

One of the challenges of string theory is that the full theory does not have a satisfactory definition in all circumstances. Another issue is that the theory is thought to describe an enormous landscape of possible universes, which has complicated efforts to develop theories of particle physics based on string theory. These issues have led some in the community to criticize these approaches to physics, and to question the value of continued research on string theory unification.

In the 20th century, two theoretical frameworks emerged for formulating the laws of physics. The first is Albert Einstein's general theory of relativity, a theory that explains the force of gravity and the structure of spacetime at the macro-level. The other is quantum mechanics, a completely different formulation that describes physical phenomena using probability principles at the micro-level. By the late 1970s, these two frameworks had proven to be sufficient to explain most of the observed features of the universe, from elementary particles to atoms to the evolution of stars and the universe as a whole.

In spite of these successes, there are still many problems that remain to be solved. One of the deepest problems in modern physics is the problem of quantum gravity. The general theory of relativity is formulated within the framework of classical physics, whereas the other fundamental forces are described within the framework of quantum mechanics. A quantum theory of gravity is needed in order to reconcile general relativity with the principles of quantum mechanics, but difficulties arise when one attempts to apply the usual prescriptions of quantum theory to the force of gravity. In addition to the problem of developing a consistent theory of quantum gravity, there are many other fundamental problems in the physics of atomic nuclei, black holes, and the early universe.

String theory is a theoretical framework that attempts to address these questions and many others. The starting point for string theory is the idea that the point-like particles of particle physics can also be modeled as one-dimensional objects called strings. String theory describes how strings propagate through space and interact with each other. In a given version of string theory, there is only one kind of string, which may look like a small loop or segment of ordinary string, and it can vibrate in different ways. On distance scales larger than the string scale, a string will look just like an ordinary particle, with its mass, charge, and other properties determined by the vibrational state of the string. In this way, all of the different elementary particles may be viewed as vibrating strings. In string theory, one of the vibrational states of the string gives rise to the graviton, a quantum mechanical particle that carries gravitational force. Thus string theory is a theory of quantum gravity.

One of the main developments of the past several decades in string theory was the discovery of certain 'dualities', mathematical transformations that identify one physical theory with another. Physicists studying string theory have discovered a number of these dualities between different versions of string theory, and this has led to the conjecture that all consistent versions of string theory are subsumed in a single framework known as M-theory.

Studies of string theory have also yielded a number of results on the nature of black holes and the gravitational interaction. There are certain paradoxes that arise when one attempts to understand the quantum aspects of black holes, and work on string theory has attempted to clarify these issues. In late 1997 this line of work culminated in the discovery of the anti-de Sitter/conformal field theory correspondence or AdS/CFT. This is a theoretical result which relates string theory to other physical theories which are better understood theoretically. The AdS/CFT correspondence has implications for the study of black holes and quantum gravity, and it has been applied to other subjects, including nuclear and condensed matter physics.

Since string theory incorporates all of the fundamental interactions, including gravity, many physicists hope that it will eventually be developed to the point where it fully describes our universe, making it a theory of everything. One of the goals of current research in string theory is to find a solution of the theory that reproduces the observed spectrum of elementary particles, with a small cosmological constant, containing dark matter and a plausible mechanism for cosmic inflation. While there has been progress toward these goals, it is not known to what extent string theory describes the real world or how much freedom the theory allows in the choice of details.

One of the challenges of string theory is that the full theory does not have a satisfactory definition in all circumstances. The scattering of strings is most straightforwardly defined using the techniques of perturbation theory, but it is not known in general how to define string theory nonperturbatively. It is also not clear whether there is any principle by which string theory selects its vacuum state, the physical state that determines the properties of our universe. These problems have led some in the community to criticize these approaches to the unification of physics and question the value of continued research on these problems.


Quantum Theory



Quantum mechanics, including quantum field theory, is a fundamental theory in physics describing the properties of nature on an atomic scale.

Classical physics, the description of physics that existed before the formulation of the theory of relativity and of quantum mechanics, describes many aspects of nature at ordinary (macroscopic) scales. Quantum mechanics extends this description down to the small (atomic and subatomic) scales, where classical physics breaks down. Most theories in classical physics can be derived from quantum mechanics as an approximation valid at large (macroscopic) scales. Quantum mechanics differs from classical physics in that energy, momentum, angular momentum, and other quantities of a bound system are restricted to discrete values (quantization); objects have characteristics of both particles and waves (wave-particle duality); and there are limits to how accurately the value of a physical quantity can be predicted prior to its measurement, given a complete set of initial conditions (the uncertainty principle).
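
As a concrete illustration of the last point, the position-momentum uncertainty relation can be written in its standard textbook form (shown here purely for reference, not quoted from this article):

```latex
% Heisenberg uncertainty relation: the product of the spreads (standard
% deviations) in position and momentum has a lower bound set by Planck's constant.
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad \hbar = \frac{h}{2\pi} \approx 1.055\times10^{-34}\ \mathrm{J\,s}
```

However precisely a particle is prepared, narrowing the spread in its position necessarily widens the spread of possible momentum outcomes, and vice versa.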

Quantum mechanics arose gradually, from theories to explain observations which could not be reconciled with classical physics, such as Max Planck's solution in 1900 to the black-body radiation problem, and the correspondence between energy and frequency in Albert Einstein's 1905 paper which explained the photoelectric effect. Early quantum theory was profoundly re-conceived in the mid-1920s by Erwin Schrödinger, Werner Heisenberg, Max Born and others. The modern theory is formulated in various specially developed mathematical formalisms. In one of them, a mathematical function, the wave function, provides information about the probability amplitude of energy, momentum, and other physical properties of a particle.



Scientific inquiry into the wave nature of light began in the 17th and 18th centuries, when scientists such as Robert Hooke, Christiaan Huygens and Leonhard Euler proposed a wave theory of light based on experimental observations. In 1803, English polymath Thomas Young described the famous double-slit experiment. This experiment played a major role in the general acceptance of the wave theory of light.

In 1838, Michael Faraday discovered cathode rays. These studies were followed by the 1859 statement of the black-body radiation problem by Gustav Kirchhoff, the 1877 suggestion by Ludwig Boltzmann that the energy states of a physical system can be discrete, and the 1900 quantum hypothesis of Max Planck. Planck's hypothesis that energy is radiated and absorbed in discrete "quanta" (or energy packets) precisely matched the observed patterns of black-body radiation.

In 1896, Wilhelm Wien empirically determined a distribution law of black-body radiation, called Wien's law. Ludwig Boltzmann independently arrived at this result by considerations of Maxwell's equations. However, it was valid only at high frequencies and underestimated the radiance at low frequencies.

The foundations of quantum mechanics were established during the first half of the 20th century by Max Planck, Niels Bohr, Werner Heisenberg, Louis de Broglie, Arthur Compton, Albert Einstein, Erwin Schrödinger, Max Born, John von Neumann, Paul Dirac, Enrico Fermi, Wolfgang Pauli, Max von Laue, Freeman Dyson, David Hilbert, Wilhelm Wien, Satyendra Nath Bose, Arnold Sommerfeld, and others. The Copenhagen interpretation of Niels Bohr became widely accepted.

Max Planck corrected Wien's model using Boltzmann's statistical interpretation of thermodynamics and proposed what is now called Planck's law, which led to the development of quantum mechanics. After Planck's solution in 1900 to the black-body radiation problem (reported 1859), Albert Einstein offered a quantum-based explanation of the photoelectric effect (1905, reported 1887). Around 1900–1910, the atomic theory, but not the corpuscular theory of light, first came to be widely accepted as scientific fact; these theories can be considered quantum theories of matter and electromagnetic radiation, respectively. The photon theory was not widely accepted until about 1915; even at the time of Einstein's Nobel Prize, Niels Bohr did not believe in the photon.
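
For reference, the two distribution laws discussed above can be placed side by side in their standard forms (added here for illustration): Planck's law, and the earlier Wien approximation that it corrected.

```latex
% Planck's law for the spectral radiance of a black body (per unit frequency):
B_\nu(T) = \frac{2 h \nu^{3}}{c^{2}}\,\frac{1}{e^{h\nu/k_B T} - 1}

% Wien's approximation, accurate only when  h\nu \gg k_B T :
B_\nu(T) \approx \frac{2 h \nu^{3}}{c^{2}}\, e^{-h\nu/k_B T}
```

At high frequencies the exponential term dominates and the two expressions agree, which is why Wien's law works there; at low frequencies Planck's law tends to the Rayleigh–Jeans form 2\nu^{2} k_B T / c^{2}, which exceeds the Wien expression, so the Wien approximation underestimates the radiance there, exactly the failure noted earlier.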

Among the first to study quantum phenomena were Arthur Compton, C. V. Raman, and Pieter Zeeman, each of whom has a quantum effect named after him. Robert Andrews Millikan studied the photoelectric effect experimentally, and Albert Einstein developed a theory for it. At the same time, Ernest Rutherford experimentally discovered the nuclear model of the atom, and Niels Bohr developed a theory of atomic structure, confirmed by the experiments of Henry Moseley. In 1913, Peter Debye extended Bohr's theory by introducing elliptical orbits, a concept also introduced by Arnold Sommerfeld. This phase is known as old quantum theory.

Theory of Relativity

Einstein's Mass Energy Equation


The theory of relativity usually encompasses two interrelated theories by Albert Einstein: special relativity and general relativity. Special relativity applies to all physical phenomena in the absence of gravity. General relativity explains the law of gravitation and its relation to other forces of nature. It applies to the cosmological and astrophysical realm, including astronomy.

Albert Einstein

The theory transformed theoretical physics and astronomy during the 20th century, superseding a 200-year-old theory of mechanics created primarily by Isaac Newton. It introduced concepts including spacetime as a unified entity of space and time, relativity of simultaneity, kinematic and gravitational time dilation, and length contraction. In the field of physics, relativity improved the science of elementary particles and their fundamental interactions, along with ushering in the nuclear age. With relativity, cosmology and astrophysics predicted extraordinary astronomical phenomena such as neutron stars, black holes, and gravitational waves.
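
For reference, two of the relations behind these statements can be written out explicitly (standard textbook forms, included as a reminder rather than quoted from this article): the mass-energy equivalence alluded to by the caption above, and the kinematic time dilation of a clock moving at speed v relative to an observer.

```latex
% Mass-energy equivalence: the rest energy of a body of mass m.
E = m c^{2}

% Kinematic time dilation: a moving clock runs slow by the Lorentz factor.
\Delta t' = \gamma\,\Delta t,
\qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
```

Length contraction involves the same factor: a rod of rest length L_0 moving along its length is measured as L = L_0/\gamma.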


Albert Einstein published the theory of special relativity in 1905, building on many theoretical results and empirical findings obtained by Albert A. Michelson, Hendrik Lorentz, Henri Poincaré and others. Max Planck, Hermann Minkowski and others did subsequent work.

Einstein developed general relativity between 1907 and 1915, with contributions by many others after 1915. The final form of general relativity was published in 1916.

The term "theory of relativity" was based on the expression "relative theory" (German: Relativtheorie) used in 1906 by Planck, who emphasized how the theory uses the principle of relativity. In the discussion section of the same paper, Alfred Bucherer used for the first time the expression "theory of relativity" (German: Relativitätstheorie).

By the 1920s, the physics community understood and accepted special relativity. It rapidly became a significant and necessary tool for theorists and experimentalists in the new fields of atomic physics, nuclear physics, and quantum mechanics.

By comparison, general relativity did not appear to be as useful, beyond making minor corrections to predictions of Newtonian gravitation theory. It seemed to offer little potential for experimental test, as most of its assertions were on an astronomical scale. Its mathematics seemed difficult and fully understandable only by a small number of people. Around 1960, general relativity became central to physics and astronomy. New mathematical techniques to apply to general relativity streamlined calculations and made its concepts more easily visualized. As astronomical phenomena were discovered, such as quasars (1963), the 3-kelvin microwave background radiation (1965), pulsars (1967), and the first black hole candidates (1981), the theory explained their attributes, and measurement of them further confirmed the theory.


Evolution by Natural Selection

Charles Darwin


Natural selection is the differential survival and reproduction of individuals due to differences in phenotype. It is a key mechanism of evolution, the change in the heritable traits characteristic of a population over generations. Charles Darwin popularised the term "natural selection", contrasting it with artificial selection, which in his view is intentional, whereas natural selection is not.

Variation exists within all populations of organisms. This occurs partly because random mutations arise in the genome of an individual organism, and their offspring can inherit such mutations. Throughout the lives of the individuals, their genomes interact with their environments to cause variations in traits. The environment of a genome includes the molecular biology in the cell, other cells, other individuals, populations, species, as well as the abiotic environment. Because individuals with certain variants of the trait tend to survive and reproduce more than individuals with other less successful variants, the population evolves. Other factors affecting reproductive success include sexual selection (now often included in natural selection) and fecundity selection.

Natural selection acts on the phenotype, the characteristics of the organism which actually interact with the environment, but the genetic (heritable) basis of any phenotype that gives that phenotype a reproductive advantage may become more common in a population. Over time, this process can result in populations that specialise for particular ecological niches (microevolution) and may eventually result in speciation (the emergence of new species, macroevolution). In other words, natural selection is a key process in the evolution of a population.
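
The logic of the preceding paragraphs can be caricatured in a few lines of code. The sketch below is a deliberately minimal model, with the fitness values and starting frequency invented purely for illustration: it tracks the frequency of a heritable variant whose carriers reproduce slightly more successfully than non-carriers.

```python
# Minimal sketch of selection on a heritable variant (illustrative numbers only).
# p is the frequency of the favoured variant; w_a and w_b are the relative
# reproductive success (fitness) of carriers and non-carriers.

def next_generation(p, w_a=1.05, w_b=1.00):
    """Frequency of the variant after one generation of selection."""
    mean_fitness = p * w_a + (1 - p) * w_b
    return p * w_a / mean_fitness

p = 0.01  # the variant starts out rare
for generation in range(0, 501, 100):
    print(f"generation {generation:3d}: frequency = {p:.3f}")
    for _ in range(100):
        p = next_generation(p)
```

Even a 5% reproductive advantage, compounded over a few hundred generations, carries the variant from rarity toward fixation; mutation, drift, and changing environments are all left out of this toy model.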

Natural selection is a cornerstone of modern biology. The concept, published by Darwin and Alfred Russel Wallace in a joint presentation of papers in 1858, was elaborated in Darwin's influential 1859 book On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life. He described natural selection as analogous to artificial selection, a process by which animals and plants with traits considered desirable by human breeders are systematically favoured for reproduction. The concept of natural selection originally developed in the absence of a valid theory of heredity; at the time of Darwin's writing, science had yet to develop modern theories of genetics. The union of traditional Darwinian evolution with subsequent discoveries in classical genetics formed the modern synthesis of the mid-20th century. The addition of molecular genetics has led to evolutionary developmental biology, which explains evolution at the molecular level. While genotypes can slowly change by random genetic drift, natural selection remains the primary explanation for adaptive evolution.

The Big Bang Theory

The Big Bang theory is a cosmological model of the observable universe from the earliest known periods through its subsequent large-scale evolution. The model describes how the universe expanded from an initial state of very high density and high temperature, and offers a comprehensive explanation for a broad range of observed phenomena, including the abundance of light elements, the cosmic microwave background (CMB) radiation, large-scale structure, and Hubble's law – the farther away galaxies are, the faster they are moving away from Earth. If the observed conditions are extrapolated backwards in time using the known laws of physics, the prediction is that just before a period of very high density there was a singularity. Current knowledge is insufficient to determine if anything existed prior to the singularity.

Georges Lemaître first noted in 1927 that an expanding universe could be traced back in time to an originating single point, calling his theory that of the "primeval atom". For much of the rest of the 20th century, the scientific community was divided between supporters of the Big Bang and the rival steady-state model, but a wide range of empirical evidence has strongly favored the Big Bang, which is now universally accepted. Edwin Hubble concluded from analysis of galactic redshifts in 1929 that galaxies are drifting apart; this is important observational evidence for an expanding universe. In 1964, the CMB was discovered, which was crucial evidence in favor of the hot Big Bang model, since that theory predicted the existence of a background radiation throughout the universe.

The known laws of physics can be used to calculate the characteristics of the universe in detail back in time to an initial state of extreme density and temperature. Detailed measurements of the expansion rate of the universe place the Big Bang at around 13.8 billion years ago, which is thus considered the age of the universe. After its initial expansion, the universe cooled sufficiently to allow the formation of subatomic particles, and later atoms. Giant clouds of these primordial elements – mostly hydrogen, with some helium and lithium – later coalesced through gravity, forming early stars and galaxies, the descendants of which are visible today. Besides these primordial building materials, astronomers observe the gravitational effects of an unknown dark matter surrounding galaxies. Most of the gravitational potential in the universe seems to be in this form, and the Big Bang theory and various observations indicate that it is not conventional baryonic matter that forms atoms. Measurements of the redshifts of supernovae indicate that the expansion of the universe is accelerating, an observation attributed to dark energy's existence.
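
A rough version of the age estimate above can be reproduced with back-of-the-envelope arithmetic: the inverse of the Hubble constant (the "Hubble time") sets the timescale of the expansion. The snippet below assumes a round value of about 70 km/s per megaparsec for the Hubble constant; the quoted 13.8-billion-year figure comes from fitting the full cosmological model to data, not from this shortcut.

```python
# Back-of-the-envelope Hubble time: 1 / H0, converted to years.
# The value of H0 used here (70 km/s per megaparsec) is a round,
# illustrative figure, not a precise measurement.

KM_PER_MPC = 3.086e19       # kilometres in one megaparsec
SECONDS_PER_YEAR = 3.156e7  # seconds in one year

H0 = 70.0 / KM_PER_MPC      # Hubble constant in units of 1/s
hubble_time_years = 1.0 / H0 / SECONDS_PER_YEAR

print(f"Hubble time ~ {hubble_time_years:.2e} years")  # roughly 1.4e10 years
```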


Chaos Theory



Chaos theory is a branch of mathematics focusing on the study of chaos: states of dynamical systems whose apparently random states of disorder and irregularities are often governed by deterministic laws that are highly sensitive to initial conditions. Chaos theory is an interdisciplinary theory stating that, within the apparent randomness of chaotic complex systems, there are underlying patterns, interconnectedness, constant feedback loops, repetition, self-similarity, fractals, and self-organization. The butterfly effect, an underlying principle of chaos, describes how a small change in one state of a deterministic nonlinear system can result in large differences in a later state (meaning that there is sensitive dependence on initial conditions). A metaphor for this behavior is that a butterfly flapping its wings in China can cause a hurricane in Texas.

Small differences in initial conditions, such as those due to rounding errors in numerical computation, can yield widely diverging outcomes for such dynamical systems, rendering long-term prediction of their behavior impossible in general. This can happen even though these systems are deterministic, meaning that their future behavior follows a unique evolution and is fully determined by their initial conditions, with no random elements involved. In other words, the deterministic nature of these systems does not make them predictable. This behavior is known as deterministic chaos, or simply chaos. The theory was summarized by Edward Lorenz as:

Chaos: When the present determines the future, but the approximate present does not approximately determine the future.
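
The divergence of nearly identical starting conditions described above is easy to see numerically. The sketch below uses the logistic map, a standard one-line chaotic system chosen here purely as an illustration (it is not discussed in the text): two trajectories start 10^-10 apart, remain indistinguishable for a while, and then separate completely.

```python
# Sensitive dependence on initial conditions in the logistic map
# x_{n+1} = r * x_n * (1 - x_n), with r = 4 (a chaotic parameter value).

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-10  # two almost identical starting points
for step in range(61):
    if step % 10 == 0:
        print(f"step {step:2d}: a = {a:.6f}  b = {b:.6f}  |a - b| = {abs(a - b):.1e}")
    a, b = logistic(a), logistic(b)
```

Both trajectories are fully deterministic, yet within roughly forty steps the initial 10^-10 discrepancy grows to order one, which is exactly the predictability limit Lorenz's remark describes.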

Chaotic behavior exists in many natural systems, including fluid flow, heartbeat irregularities, weather and climate. It also occurs spontaneously in some systems with artificial components, such as the stock market and road traffic. This behavior can be studied through the analysis of a chaotic mathematical model, or through analytical techniques such as recurrence plots and Poincaré maps. Chaos theory has applications in a variety of disciplines, including meteorology, anthropology, sociology, physics, environmental science, computer science, engineering, economics, biology, ecology, pandemic crisis management, and philosophy. The theory formed the basis for such fields of study as complex dynamical systems, edge of chaos theory, and self-assembly processes.


Atomic Theory

Atom


In chemistry and physics, atomic theory is a scientific theory of the nature of matter, which states that matter is composed of discrete units called atoms. It began as a philosophical concept in ancient Greece and entered the scientific mainstream in the early 19th century when discoveries in the field of chemistry showed that matter did indeed behave as if it were made up of atoms.

The word atom comes from the Ancient Greek adjective atomos, meaning "indivisible". 19th-century chemists began using the term in connection with the growing number of irreducible chemical elements. Around the turn of the 20th century, through various experiments with electromagnetism and radioactivity, physicists discovered that the so-called "uncuttable atom" was actually a conglomerate of various subatomic particles (chiefly electrons, protons, and neutrons) which can exist separately from each other. In fact, in certain extreme environments, such as neutron stars, extreme temperature and pressure prevent atoms from existing at all.

Since atoms were found to be divisible, physicists later invented the term "elementary particles" to describe the "uncuttable", though not indestructible, parts of an atom. The field of science which studies subatomic particles is particle physics, and it is in this field that physicists hope to discover the true fundamental nature of matter.

Plate Tectonics Theory

Map of Major Tectonic Plates in the World


Plate tectonics is a scientific theory describing the large-scale motion of seven large plates and the movements of a larger number of smaller plates of the Earth's lithosphere, since tectonic processes began on Earth between 3.3 and 3.5 billion years ago. The model builds on the concept of continental drift, an idea developed during the first decades of the 20th century. The geoscientific community accepted plate-tectonic theory after seafloor spreading was validated in the late 1950s and early 1960s.

The lithosphere, which is the rigid outermost shell of a planet (the crust and upper mantle), is broken into tectonic plates. The Earth's lithosphere is composed of seven or eight major plates (depending on how they are defined) and many minor plates. Where the plates meet, their relative motion determines the type of boundary: convergent, divergent, or transform. Earthquakes, volcanic activity, mountain-building, and oceanic trench formation occur along these plate boundaries (or faults). The relative movement of the plates typically ranges from zero to 100 mm annually.

Tectonic plates are composed of oceanic lithosphere and thicker continental lithosphere, each topped by its own kind of crust. Along convergent boundaries, subduction, or one plate moving under another, carries the lower one down into the mantle; the material lost is roughly balanced by the formation of new (oceanic) crust along divergent margins by seafloor spreading. In this way, the total surface of the lithosphere remains the same. This prediction of plate tectonics is also referred to as the conveyor belt principle. Earlier theories, since disproven, proposed gradual shrinking (contraction) or gradual expansion of the globe.

Tectonic plates are able to move because the Earth's lithosphere has greater mechanical strength than the underlying asthenosphere. Lateral density variations in the mantle result in convection; that is, the slow creeping motion of Earth's solid mantle. Plate movement is thought to be driven by a combination of the motion of the seafloor away from spreading ridges due to variations in topography (the ridge is a topographic high) and density changes in the crust (density increases as newly formed crust cools and moves away from the ridge). At subduction zones the relatively cold, dense oceanic crust is "pulled" or sinks down into the mantle over the downward convecting limb of a mantle cell. Another explanation lies in the different forces generated by tidal forces of the Sun and Moon. The relative importance of each of these factors and their relationship to each other is unclear, and still the subject of much debate.

Oxygen Theory of Combustion by Antoine Lavoisier

Oxygen Supports Combustion


The oxygen theory of combustion resulted from a demanding and sustained campaign to construct an experimentally grounded chemical theory of combustion, respiration, and calcination. The theory that emerged was in many respects a mirror image of the phlogiston theory, but gaining evidence to support the new theory involved more than merely demonstrating the errors and inadequacies of the previous theory. From the early 1770s until 1785, when the last important pieces of the theory fell into place, Lavoisier and his collaborators performed a wide range of experiments designed to advance many points on their research frontier.

Cell Theory

Cell

In biology, cell theory is the historic scientific theory, now universally accepted, that living organisms are made up of cells, that they are the basic structural/organizational unit of all organisms, and that all cells come from pre-existing cells. Cells are the basic unit of structure in all organisms and also the basic unit of reproduction.

The three tenets to the cell theory are as described below:

All living organisms are composed of one or more cells.
The cell is the basic unit of structure and organization in organisms.
Cells arise from pre-existing cells.

There is no universally accepted definition of life. Some biologists consider non-cellular entities such as viruses to be living organisms, and thus reasonably disagree with the first tenet.

With continual improvements made to microscopes over time, magnification technology advanced enough to discover cells in the 17th century. This discovery is largely attributed to Robert Hooke, and began the scientific study of cells, known as cell biology. Over a century later, many debates about cells began amongst scientists. Most of these debates involved the nature of cellular regeneration, and the idea of cells as a fundamental unit of life. Cell theory was eventually formulated in 1839. This is usually credited to Matthias Schleiden and Theodor Schwann. However, many other scientists like Rudolf Virchow contributed to the theory. It was an important step in the movement away from spontaneous generation.
The discovery of the cell was made possible through the invention of the microscope. In the first century BC, Romans were able to make glass, discovering that objects appeared to be larger under the glass. In Italy during the 12th century, Salvino D'Armate made a piece of glass fit over one eye, allowing for a magnification effect to that eye. The expanded use of lenses in eyeglasses in the 13th century probably led to more widespread use of simple microscopes (magnifying glasses) with limited magnification. Compound microscopes, which combine an objective lens with an eyepiece to view a real image at much higher magnification, first appeared in Europe around 1620. In 1665, Robert Hooke used a microscope about six inches long with two convex lenses inside and examined specimens under reflected light for the observations in his book Micrographia. Hooke also used a simpler microscope with a single lens for examining specimens with directly transmitted light, because this allowed for a clearer image.

Extensive microscopic study was done by Anton van Leeuwenhoek, a draper who took an interest in microscopes after seeing one while on an apprenticeship in Amsterdam in 1648. At some point in his life before 1668, he learned how to grind lenses, which eventually led to Leeuwenhoek making his own unique microscope. His was a simple single-lens microscope rather than a compound microscope: the single lens was a small glass sphere, but it allowed for a magnification of 270x, a large advance at a time when the best existing microscopes reached only about 50x. After Leeuwenhoek, there was not much progress in microscope technology until the 1850s, two hundred years later. Carl Zeiss, a German engineer who manufactured microscopes, began to make changes to the lenses used, but the optical quality did not improve until the 1880s, when he hired Otto Schott and eventually Ernst Abbe.

Optical microscopes can focus only on objects the size of a wavelength of light or larger, which still restricted advances in discoveries involving objects smaller than the wavelengths of visible light. The development of the electron microscope in the 1920s made it possible to view objects smaller than optical wavelengths, once again opening up new possibilities in science.


