Heliography

The earliest known surviving heliographic engraving was printed from a metal plate made in 1825 by Joseph Nicéphore Niépce using his "heliographic process".[1] The plate was exposed under an ordinary engraving, copying it by photographic means. Heliography was also used to capture a scene directly from nature with a camera.

The word has also been used to refer to other phenomena: for description of the sun (cf. geography), for photography in general, for signalling by heliograph (a device less commonly called a heliotrope or helio-telegraph), and for photography of the sun.[2]

The abbreviations héliog. or héliogr., found on old reproductions, may stand for the French word héliogravure, and can then refer to any form of photogravure.

^ a b "The First Photograph — Heliography". Retrieved 2009-09-29. From Helmut Gernsheim's article "The 150th Anniversary of Photography", History of Photography, Vol. I, No. 1, January 1977: "...In 1822, Niépce coated a glass plate... The sunlight passing through... This first permanent example... was destroyed... some years later."

^ Descriptions of the sun, photography in general, and signalling by heliotrope: Oxford English Dictionary, 2nd ed. (1989), s.v. "Heliography". Photography of the sun: as used by and in discussion of Hiroshi Yamazaki.

1.
Camera obscura
–
The surroundings of the projected image have to be relatively dark for the image to be clear, so many historical camera obscura experiments were performed in darkened rooms. The term camera obscura also refers to constructions or devices that make use of the principle within a box. Camerae obscurae with a lens in the opening have been used since the second half of the 16th century; before the term camera obscura was first used in 1604, many other expressions were current, including cubiculum obscurum, cubiculum tenebricosum, conclave obscurum and locus obscurus. Rays of light travel in straight lines and are changed when they are reflected and partly absorbed by an object, retaining information about the object's color; lit objects reflect rays of light in all directions. The human eye itself works much like a camera obscura, with an opening (the pupil) and a biconvex lens. A camera obscura device consists of a box, tent or room with a small hole in one side. Light from a scene passes through the hole and strikes a surface inside, where the scene is reproduced, inverted and reversed. The image can be projected onto paper and then traced to produce an accurate representation. To produce a reasonably clear projected image, the aperture has to be about 1/100th of the distance to the screen. Many camerae obscurae use a lens rather than a pinhole because the larger aperture gives a usable brightness while maintaining focus. As a pinhole is made smaller, the image gets sharper; with too small a pinhole, however, the sharpness worsens again because of diffraction. Using mirrors, as in an 18th-century overhead version, it is possible to project a right-side-up image. Another, more portable type is a box with an angled mirror projecting onto tracing paper placed on the glass top, the image being upright as viewed from the back.
There are theories that occurrences of camera obscura effects inspired Paleolithic cave paintings, and it has also been suggested that camera obscura projections could have played a role in Neolithic structures. Perforated gnomons projecting an image of the sun were described in the Chinese Zhoubi Suanjing writings; the location of the projected circle can be measured to tell the time of day. In Arab and European cultures, the invention of the camera obscura was later attributed to an Egyptian astronomer. Some ancient sightings of gods and spirits, especially in worship, are thought to possibly have been conjured up by means of camera obscura projections. Early Chinese writings explain how the image in a "collecting-point" or "treasure house" is inverted by an intersecting point that collects the light: light coming from the foot of a person would partly be hidden below, while rays from the head would partly be hidden above and partly form the lower part of the image.
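The pinhole trade-off described above – a smaller hole reduces geometric blur but, past a point, diffraction dominates – can be sketched numerically. The formula below is a common rule of thumb often attributed to Lord Rayleigh, not something stated in this article:

```python
import math

def optimal_pinhole_diameter(distance_m, wavelength_m=550e-9):
    """Approximate best pinhole diameter for a screen at the given distance,
    using the rule of thumb d ~ 1.9 * sqrt(f * wavelength), which balances
    geometric blur (shrinks with the hole) against diffraction blur (grows)."""
    return 1.9 * math.sqrt(distance_m * wavelength_m)

# For a screen 1 m from the pinhole and green light (~550 nm):
d = optimal_pinhole_diameter(1.0)
print(f"{d * 1000:.2f} mm")  # about 1.4 mm
```

Note that this optimal diameter is far smaller than the 1/100th-of-distance upper bound the text gives for a merely "reasonably clear" image.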

2.
Geography
–
Geography is a field of science devoted to the study of the lands, features, inhabitants, and phenomena of Earth. The first person to use the word γεωγραφία was Eratosthenes. Geography is an all-encompassing discipline that seeks an understanding of the Earth and its human and natural complexities—not merely where objects are, but how they have changed and come to be. It is often defined in terms of its two branches, human geography and physical geography. Geography has been called "the world discipline" and "the bridge between the human and the physical sciences"; it is a systematic study of the Earth and its features. Traditionally, geography has been associated with cartography and place names; although many geographers are trained in toponymy and cartology, this is not their main preoccupation. Geographers study the spatial and temporal distribution of phenomena and processes; because space and place affect a variety of topics, such as economics, health, climate, plants and animals, geography is highly interdisciplinary. The interdisciplinary nature of the approach depends on an attentiveness to the relationship between physical and human phenomena and their spatial patterns. Mere names of places are not geography; to know by heart a whole gazetteer full of them would not, in itself, make someone a geographer. A description of the world—that is geography. In a word, geography is a science—a thing not of mere names but of argument and reason, of cause and effect. Just as all phenomena exist in time and thus have a history, they also exist in space and have a geography. Geography as a discipline can be split broadly into two main fields, human geography and physical geography. The former largely focuses on the built environment and how humans create, view and manage space. The latter examines the natural environment and how organisms, climate, soil, water and landforms interact. The difference between these approaches led to a third field, environmental geography, which combines physical and human geography.
Physical geography focuses on geography as an Earth science; it aims to understand the physical problems and issues of the lithosphere, hydrosphere, atmosphere, pedosphere, and global flora and fauna patterns, and can be divided into several broad categories. Human geography is a branch of geography that focuses on the study of patterns and processes that shape human society. It encompasses the human, political, cultural, social, and economic aspects, and it requires an understanding of the traditional aspects of physical and human geography, as well as the ways that human societies conceptualize the environment. Integrated geography has emerged as a bridge between human and physical geography, as a result of the increasing specialisation of the two sub-fields. Examples of areas of research in environmental geography include emergency management, environmental management, and sustainability. Geomatics is concerned with the application of computers to the traditional spatial techniques used in cartography and topography.

3.
Heliograph
–
A heliograph is a wireless solar telegraph that signals by flashes of sunlight reflected by a mirror. The flashes are produced by momentarily pivoting the mirror, or by interrupting the beam with a shutter. The heliograph was a simple but effective instrument for instantaneous optical communication over long distances during the late 19th and early 20th century. Its main uses were military, survey and forest protection work; heliographs were standard issue in the British and Australian armies until the 1960s, and were used by the Pakistani army as late as 1975. Most heliographs were variants of the British Army Mance Mark V version, which used a mirror with a small unsilvered spot in the centre. The sender aligned the heliograph to the target by looking at the target in the mirror. Keeping his head still, he adjusted the aiming rod so its cross wires bisected the target, indicating that the sunbeam was pointing at the target. The flashes were produced by a keying mechanism that tilted the mirror up a few degrees at the push of a lever at the back of the instrument. If the sun was in front of the sender, its rays were reflected directly from this mirror to the receiving station. If the sun was behind the sender, the aiming rod was replaced by a second mirror to capture the sunlight from the main mirror. The U.S. Signal Corps heliograph mirror did not tilt; this type produced flashes with a shutter mounted on a second tripod. The heliograph had some great advantages; however, anyone in the beam with the correct knowledge could intercept signals without being detected, so in the Boer War, where both sides used heliographs, tubes were used to decrease the dispersion of the beam, which could range from .5 degrees to 15 degrees. The distance that signals could be seen depended on the clarity of the sky, and a clear line of sight was required, since the Earth's surface is curved. Under ordinary conditions, a flash could be seen 30 miles with the naked eye, and much farther with a telescope.
The maximum range was considered to be 10 miles for each inch of mirror diameter; mirrors ranged from 1.5 inches to 12 inches or more. The record distance was established by a detachment of the U.S. Signal Corps. The German professor Carl Friedrich Gauss of the University of Göttingen developed and used a predecessor of the heliograph in 1821: his device directed a beam of sunlight to a distant station to be used as a marker for geodetic survey work. The story that a shield was used as a heliograph at the Battle of Marathon is a modern myth; what Herodotus actually wrote was that someone was accused of having arranged to hold up a shield as a signal.
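The range figures above can be illustrated with a short sketch. The 10-miles-per-inch rule is the one quoted in the text; the horizon formula d ≈ √(2Rh) is the standard spherical-Earth approximation, and the station height used below is a hypothetical example:

```python
import math

EARTH_RADIUS_MI = 3959.0  # mean Earth radius in miles

def mirror_range_mi(mirror_diameter_in):
    # Rule of thumb from the text: about 10 miles of range per inch of mirror.
    return 10.0 * mirror_diameter_in

def horizon_distance_mi(height_ft):
    # Line-of-sight distance to the horizon for a station at height h,
    # from the geometry of a sphere: d ~ sqrt(2 * R * h).
    height_mi = height_ft / 5280.0
    return math.sqrt(2 * EARTH_RADIUS_MI * height_mi)

print(mirror_range_mi(5))                    # a 5-inch mirror: 50.0 miles
print(round(horizon_distance_mi(1000), 1))   # a station on a 1000 ft hill
```

This shows why heliograph stations were sited on hills and mountains: at ground level the curved Earth, not the mirror, is the limiting factor.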

4.
Heliotrope (instrument)
–
The heliotrope is an instrument that uses a mirror to reflect sunlight over great distances to mark the positions of participants in a land survey. The heliotrope was invented in 1821 by the German mathematician Carl Friedrich Gauss. The word heliotrope is taken from the Greek helios, meaning sun, and tropos, meaning turn; it is a name for an instrument that can be used to turn incoming sunlight. Heliotropes were used in surveys from Gauss's survey in Germany in 1821 through the late 1980s; the Indian specification for heliotropes was updated in 1981, and the American military specification for heliotropes was retired on 8 December 1995. Heliotropes were often used as targets at ranges of over 100 miles. In California in 1878, a heliotrope on Mount Saint Helena was surveyed by B. A. Colonna of the USCGS from Mount Shasta, a distance of 192 miles. The heliotrope was limited to use on sunny days and was further limited to mornings. The inventor of the heliograph, a similar instrument specialized for signalling, was inspired by observing the use of heliotropes in the Survey of India.

5.
Photogravure
–
The earliest forms of photogravure were developed by two of the original pioneers of photography itself: first Nicéphore Niépce in France in the 1820s, and later Henry Fox Talbot in England. Niépce was seeking a means to create images on plates that could then be etched and used to make prints on paper with a traditional printing press; his early images were among the first photographs, pre-dating daguerreotypes. Talbot, inventor of the calotype paper negative process, wanted to make paper prints that would not fade. He worked on his process in the 1850s and patented it in 1852 and 1858. Photogravure in its mature form was developed in 1878 by the Czech painter Karel Klíč. This process, the one still in use today, is called the Talbot-Klič process. Because of its high quality and richness, photogravure was used both for original fine art prints and for photo-reproduction of works from other media such as paintings. In France the correct term for photogravure is héliogravure, while the French term photogravure refers to any photo-based etching technique. Photogravure registers a wide variety of tones through the transfer of etching ink from an etched copper plate to special dampened paper run through an etching press. The unique tonal range comes from photogravure's variable depth of etch: the shadows are etched more deeply than the highlights. Photogravure practitioners such as Peter Henry Emerson and others brought the art to a high standard in the late 19th century. This continued with the work of Alfred Stieglitz in the early 20th century; his publication also featured the photogravures of Alvin Langdon Coburn, who was a fine gravure printer and envisioned his photographic work as gravures rather than other photo-based processes. The speed and convenience of silver-gelatin photography eventually displaced photogravure, which fell into disuse after the Edward S. Curtis gravures in the 1920s.
One of the last major portfolios of fine art photogravures was Paul Strand's Photographs of Mexico from 1940. Many years later, photogravure experienced a revival in the hands of Aperture and Jon Goodman, who studied it in Europe; it is now practiced in several dozen workshops around the world. Photogravure plates go through several stages. First, a continuous-tone film positive is made from the original photographic negative; a smaller negative can be enlarged onto a sheet of film. The second stage is to sensitize a sheet of pigmented gelatin tissue by immersing it in a 3.5% solution of potassium dichromate for 3 minutes. Once dried against a Plexiglas surface, it is ready for the next stage. The third stage is to expose the film positive to the sensitized gravure tissue: the positive is placed on top of the sheet of pigmented gelatin tissue, and the sandwich is then exposed to ultraviolet light. The UV light travels through the positive and screen in succession, each time hardening the gelatin in proportion to the amount of light reaching it. The fourth stage is to adhere the exposed tissue to the copper plate: the gelatin tissue is adhered, or "laid down", onto the highly polished copper plate under a layer of cool water.

6.
Angle of view
–
In photography, angle of view describes the angular extent of a given scene that is imaged by a camera. It is used interchangeably with the more general term field of view. It is important to distinguish the angle of view from the angle of coverage: typically the image circle produced by a lens is large enough to cover the film or sensor completely, possibly including some vignetting toward the edge. A camera's angle of view depends not only on the lens, but also on the sensor. Digital sensors are usually smaller than 35 mm film, and this causes the lens to have a narrower angle of view than with 35 mm film; in everyday digital cameras, the crop factor can range from around 1 to 1.6. For lenses projecting rectilinear images of distant objects, the effective focal length and the image format dimensions completely define the angle of view. Calculations for lenses producing non-rectilinear images are more complex and in the end not very useful in most practical applications. Angle of view may be measured horizontally, vertically, or diagonally; for example, for 35 mm film, which is 36 mm wide and 24 mm high, d = 36 mm would be used to obtain the horizontal angle of view and d = 24 mm for the vertical angle. Because the angle of view involves an arctangent, it does not vary quite linearly with the reciprocal of the focal length; however, except for wide-angle lenses, it is reasonable to approximate α ≈ d/f radians, or 180d/(πf) degrees. The effective focal length is nearly equal to the stated focal length of the lens. Angle of view can also be determined using FOV tables or paper or software lens calculators. Consider a 35 mm camera with a lens having a focal length of F = 50 mm; the dimensions of the 35 mm image format are 24 mm × 36 mm. Here α is defined to be the angle of view, since it is the angle enclosing the largest object whose image can fit on the film. We want to find the relationship between the angle α, the "opposite" side of the right triangle, d/2 (half the image dimension), and the "adjacent" side, S2 (the distance from the lens to the image plane). Using basic trigonometry, we find tan(α/2) = d/(2·S2). Except in macro photography, we can neglect the difference between S2 and F.
From the thin lens formula, 1/F = 1/S1 + 1/S2. A second effect which comes into play in macro photography is lens asymmetry: an asymmetric lens causes an offset between the nodal plane and pupil positions.
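Putting the trigonometry above together, for a rectilinear lens focused at (or near) infinity the angle of view is α = 2·arctan(d/(2F)). A minimal sketch for the 35 mm example in the text:

```python
import math

def angle_of_view_deg(d_mm, f_mm):
    """Angle of view for a rectilinear lens focused at infinity:
    alpha = 2 * arctan(d / (2 f)), where d is the film/sensor dimension
    measured horizontally, vertically, or diagonally."""
    return math.degrees(2 * math.atan(d_mm / (2 * f_mm)))

# 35 mm film (36 mm x 24 mm) with a 50 mm lens:
print(round(angle_of_view_deg(36, 50), 1))  # horizontal: 39.6 degrees
print(round(angle_of_view_deg(24, 50), 1))  # vertical:   27.0 degrees
```

The same function also shows the crop-factor effect mentioned above: feeding in a smaller sensor dimension d at the same focal length yields a narrower angle.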

7.
Aperture
–
In optics, an aperture is a hole or an opening through which light travels. More specifically, the aperture and focal length of an optical system determine the cone angle of a bundle of rays that come to a focus in the image plane. The aperture determines how collimated the admitted rays are, which is of great importance for the appearance at the image plane: if an aperture is narrow, then highly collimated rays are admitted, while a wide aperture admits uncollimated rays, resulting in a sharp focus only for rays coming from a certain distance. This means that a wide aperture results in an image that is sharp only for things at the correct distance. The aperture also determines how many of the incoming rays are actually admitted and thus how much light reaches the image plane. In the human eye, the pupil is the aperture. An optical system typically has many openings or structures that limit the ray bundles. In general, these structures are called stops, and the aperture stop is the stop that primarily determines the ray cone angle. In some contexts, especially in photography and astronomy, aperture refers to the diameter of the aperture stop rather than the physical stop or the opening itself. For example, in a telescope the aperture stop is typically the edges of the objective lens or mirror, and one then speaks of a telescope as having, for example, a 100-centimetre aperture. Note that the aperture stop is not necessarily the smallest stop in the system: magnification and demagnification by lenses and other elements can cause a relatively large stop to be the aperture stop for the system. In astrophotography, the aperture may be given as a linear measure or as the dimensionless ratio between that measure and the focal length; in other photography, it is usually given as a ratio. Sometimes stops and diaphragms are called apertures, even when they are not the aperture stop of the system. The word aperture is also used in other contexts to indicate a system which blocks off light outside a certain region.
In astronomy, for example, a photometric aperture around a star usually corresponds to a circular window around the image of the star, within which the light intensity is summed. The aperture stop is an important element in most optical designs; its most obvious feature is that it limits the amount of light that can reach the image/film plane. This can be unavoidable, as in a telescope where one wants to collect as much light as possible, or deliberate.
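In photography the ratio mentioned above is expressed as the f-number: the focal length divided by the diameter of the aperture (entrance pupil). A minimal sketch:

```python
import math

def f_number(focal_length_mm, aperture_diameter_mm):
    # The f-number N is the ratio of focal length to aperture diameter: N = f / D.
    return focal_length_mm / aperture_diameter_mm

# A 50 mm lens with a 25 mm entrance pupil is an f/2 lens:
print(f_number(50, 25))  # 2.0

# Light admitted scales with the aperture area, i.e. with (1/N)^2,
# so each factor-of-sqrt(2) step in N ("one stop") halves the light:
ratio = (1 / f_number(50, 25)) ** 2 / (1 / f_number(50, 25 / math.sqrt(2))) ** 2
print(round(ratio, 3))  # 2.0 - f/2 admits twice the light of f/2.8
```

This is why lens aperture scales run f/1.4, f/2, f/2.8, f/4, ...: successive values differ by √2 in diameter and by a factor of 2 in admitted light.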

8.
Chromatic aberration
–
In optics, chromatic aberration is an effect resulting from dispersion, in which a lens fails to focus all colors to the same convergence point. It occurs because lenses have different refractive indices for different wavelengths of light; the refractive index of transparent materials decreases with increasing wavelength, to a degree unique to each material. Since the focal length f of a lens is dependent on the refractive index n, different wavelengths are brought to focus differently. There are two types of chromatic aberration: axial and transverse. Axial aberration occurs when different wavelengths of light are focused at different distances from the lens; transverse aberration occurs when different wavelengths are focused at different positions in the focal plane. The acronym LCA is sometimes used, but it is ambiguous, as it may refer to either longitudinal or lateral CA. The two types have different characteristics and may occur together. Axial CA occurs throughout the image and is specified by optical engineers, optometrists, and vision scientists in the unit of focus known widely as diopters; transverse CA does not occur in the center, and increases towards the edge, but is not affected by stopping down. In the earliest uses of lenses, chromatic aberration was reduced by increasing the focal length of the lens where possible; for example, this could result in extremely long telescopes, such as the very long aerial telescopes of the 17th century. Isaac Newton's theories about white light being composed of a spectrum of colors led him to the conclusion that uneven refraction of light caused chromatic aberration. There exists a point called the circle of least confusion, where chromatic aberration can be minimized. It can be further minimized by using an achromatic lens or achromat, in which materials with differing dispersion are assembled together to form a compound lens. The most common type is an achromatic doublet, with elements made of crown and flint glass. This reduces the amount of chromatic aberration over a certain range of wavelengths.
By combining more than two lenses of different composition, the degree of correction can be further increased, as seen in an apochromatic lens or apochromat. The benefit of apochromats is not simply that they focus three wavelengths sharply, but that their residual color error is small across the rest of the spectrum as well. Many types of glass have been developed to reduce chromatic aberration. These are low-dispersion glasses, most notably glasses containing fluorite. These hybridized glasses have a very low level of optical dispersion; only two combined lenses made of these substances can yield a high level of correction. The use of achromats was an important step in the development of the optical microscope. An alternative to achromatic doublets is the use of diffractive optical elements, which are able to generate arbitrary complex wave fronts from a sample of material which is essentially flat. Diffractive optical elements have negative dispersion characteristics, complementary to the positive Abbe numbers of optical glasses.
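The crown-and-flint doublet described above is made achromatic by choosing element powers so that φ₁/V₁ + φ₂/V₂ = 0, the standard thin-lens achromat condition, where V is the Abbe number of each glass. The Abbe numbers below are typical textbook values, not figures from this article:

```python
def achromat_powers(total_power, v1, v2):
    """Element powers (in diopters) of a thin achromatic doublet.
    Achromatism requires phi1/V1 + phi2/V2 = 0, with phi1 + phi2 equal to
    the desired total power; V1, V2 are the Abbe numbers of the two glasses."""
    phi1 = total_power * v1 / (v1 - v2)
    phi2 = -total_power * v2 / (v1 - v2)
    return phi1, phi2

# Typical crown (V ~ 60) and flint (V ~ 36) pair for a 100 mm doublet:
total = 1 / 0.100  # 100 mm focal length = 10 diopters
p1, p2 = achromat_powers(total, 60.0, 36.0)
print(round(p1, 2), round(p2, 2))  # 25.0 -15.0: positive crown, negative flint
```

The signs illustrate the design: a strong positive crown element paired with a weaker negative flint element, so their dispersions cancel while a net positive power remains.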

9.
Color balance
–
In photography and image processing, color balance is the global adjustment of the intensities of the colors. An important goal of this adjustment is to render specific colors – particularly neutral colors – correctly. Hence, the method is sometimes called gray balance, neutral balance, or white balance. Color balance changes the overall mixture of colors in an image and is used for color correction. Generalized versions of color balance are used to correct colors other than neutrals or to change them deliberately for effect. Image data acquired by sensors – either film or electronic image sensors – must be transformed from the acquired values to new values that are appropriate for color reproduction or display. In film photography, color balance is achieved by using color correction filters over the lights or on the camera lens. It is particularly important that neutral colors in a scene appear neutral in the reproduction. Most digital cameras have means to select a color correction based on the type of scene lighting, using either manual lighting selection, automatic white balance, or custom white balance; the algorithms for these processes perform generalized chromatic adaptation. Many methods exist for color balancing. Setting a button on a camera is a way for the user to indicate to the processor the nature of the scene lighting. Another option on some cameras is a button which one may press when the camera is pointed at a gray card or other neutral colored object; this captures an image of the ambient light, which enables a digital camera to set the color balance for that light. There is a large literature on how one might estimate the ambient lighting from the camera data. A variety of algorithms have been proposed, and the quality of these has been debated; a few examples, and examination of the references therein, will lead the reader to many others. Examples are Retinex, a neural network, or a Bayesian method. Color balancing an image affects not only the neutrals, but other colors as well.
An image that is not color balanced is said to have a color cast, and color balancing may be thought of in terms of removing this cast. Color balance is related to color constancy; algorithms and techniques used to achieve color constancy are frequently used for color balancing as well.
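As a toy illustration of estimating the ambient light from the image data itself, the classic gray-world assumption scales each channel so that the image averages to neutral gray. This is a minimal sketch of that one technique, not any particular camera's algorithm:

```python
def gray_world_balance(pixels):
    """Gray-world white balance: assume the scene averages to neutral gray,
    then scale each RGB channel so the three channel means become equal."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]  # per-channel mean
    gray = sum(means) / 3                                      # target neutral level
    gains = [gray / m for m in means]                          # per-channel gain
    return [tuple(min(255.0, p[c] * gains[c]) for c in range(3)) for p in pixels]

# A warm (reddish) cast: the red mean is twice the blue mean.
img = [(200, 150, 100), (100, 75, 50)]
balanced = gray_world_balance(img)
print(balanced[0])  # (150.0, 150.0, 150.0) - the cast is removed
```

Note the failure mode this simple method shares with other gray-world estimators: a scene that genuinely averages to a strong color (a field of grass, say) will be incorrectly "neutralized".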

Joseph Nicéphore Niépce (French: [nisefɔʁ njɛps]; 7 March 1765 – 5 July 1833) was a French inventor, now usually …

Nicéphore Niépce, circa 1795.

Niépce's birthplace at Chalon-sur-Saône.

One of the three earliest known photographic artifacts, created by Nicéphore Niépce in 1825. It is an ink-on-paper print, but the printing plate used to make it was photographically created by Niépce's heliography process. It reproduces a 17th-century Flemish engraving.

In photography, angle of view (AOV) describes the angular extent of a given scene that is imaged by a camera. It is …

How focal length affects perspective: varying focal lengths at identical field size, achieved by different camera-subject distances. Notice that as the focal length gets shorter and the angle of view larger, perspective distortion and size differences increase.

Film speed is the measure of a photographic film's sensitivity to light, determined by sensitometry and measured on …

This film container denotes its speed as ISO 100/21°, including both arithmetic (100 ASA) and logarithmic (21 DIN) components. The second is often dropped, making (e.g.) "ISO 100" effectively equivalent to the older ASA speed. (As is common, the "100" in the film name alludes to its ISO rating).

When setting photoflash exposures, the guide number (GN) of photoflash devices (flashbulbs and electronic devices known …

Image: Vivitar 285 guide numbers

With focal-plane shutters, exposures faster than the X-sync speed can cause the image area to be partially obscured by the closing curtain during the flash.

Before the introduction of flashbulbs, photographers used magnesium flash powder in a flash-lamp. The pneumatic shutter release cords of the era featured rubber bulbs the photographer squeezed to take a photograph.

In photography, an orb is a typically circular artifact on an image, created as a result of flash photography …

Orbs caused by dust.

A hypothetical underwater instance with two conditions in which orbs are (A) likely or (B) unlikely, depending on whether the aspect of particles facing the lens are directly illuminated by the flash, as shown. Elements not shown to scale.

Astrophotography is a specialized type of photography for recording photos of astronomical objects, celestial events, …

An image of Orion's Belt composited from digitized black-and-white photographic plates recorded through red and blue astronomical filters, with a computer synthesized green channel. The plates were taken using the Samuel Oschin Telescope between 1987 and 1991.