Sample records for Uinta Piceance Paradox

Current United States geological tight sand designations in the Piceance and Uinta Basins' Western Gas Sands Project include the Mesaverde Group and the Fort Union and Wasatch Formations. Others, such as the Dakota, Cedar Mountain, Morrison, and Mancos, may eventually be included. Future production from these formations will probably be closely associated with existing trends. Cumulative gas production through December 1979 from the Mesaverde Group and the Fort Union and Wasatch Formations in the Piceance and Uinta Basins is less than 275 billion cubic feet. This contrasts dramatically with potential gas-in-place estimates of 360 trillion cubic feet. If the geology can be fully understood and the engineering problems surmounted, significant potential reserves can be exploited.
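The scale of that contrast can be checked with simple unit bookkeeping; a minimal sketch using only the two figures quoted in the abstract:

```python
# Cumulative production vs. potential gas in place (figures from the abstract).
produced_bcf = 275            # billion cubic feet, through December 1979
in_place_tcf = 360            # trillion cubic feet, potential gas in place

produced_tcf = produced_bcf / 1000            # 1 TCF = 1,000 BCF
fraction_produced = produced_tcf / in_place_tcf

print(f"Produced: {produced_tcf} TCF of {in_place_tcf} TCF in place "
      f"({fraction_produced:.2%})")           # roughly 0.08%
```

Less than a tenth of one percent of the estimated gas in place had been produced, which is the dramatic contrast the abstract describes.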

Attachment 1. Additional key species and plant communities in the Piceance area. This report serves as a comprehensive update to the 2008 plan. The primary audience is intended ...

Resources of potential oil in place in the Green River Formation are measured and estimated for the primary oil-shale resource area east of the Green River in Utah's Uinta Basin. The area evaluated (Ts 7-14 S, Rs 19-25 E) includes most of, and certainly the best of, Utah's oil-shale resource. For resource evaluation, the principal oil-shale section is divided into ten stratigraphic units equivalent to units previously evaluated in the Piceance Creek Basin of Colorado. Detailed evaluation of individual oil-shale units sampled by cores, plus estimates by extrapolation into uncored areas, indicates a total resource of 214 billion barrels of shale oil in place in the eastern Uinta Basin.

The Western Gas Sands Project Core Program was initiated by the U.S. DOE to investigate various low-permeability, gas-bearing sandstones. Research is being conducted to gain a better geological understanding of these sandstones and to improve evaluation and stimulation techniques. Tight gas sands are located in several mid-continent and western basins. This report deals with the Piceance Basin in northwestern Colorado. The discussion attempts to provide a general overview of Piceance Basin stratigraphy and to serve as a useful reference for stratigraphic units and accompanying descriptions.

Heavy oils derived from Uinta Basin bitumens have been hydrotreated under varying conditions. The process variables investigated included total reactor pressure (11.0-16.9 MPa), reactor temperature (616-711 K), feed rate (0.29-1.38 WHSV), and catalyst composition. The extent of heteroatom removal and residuum conversion was determined by the feed molecular weight and catalyst selection. Catalytic activity for heteroatom removal was primarily influenced by metal loading. The heteroatom removal activity of the catalysts studied was ranked HDN catalysts > HDM catalysts > HDN support. Catalytic activity for residuum conversion was influenced by both metal loading and catalyst surface area. The residuum conversion activity of HDN catalysts was always higher than that of HDM catalysts and HDN supports. The residuum conversion activity of HDN supports surpassed that of HDM catalysts at higher temperatures. The conversions achieved with HDN catalysts relative to the HDM catalysts indicated that the low metals contents of the Uinta Basin bitumens obviate the need for hydrodemetallation as an initial upgrading step with these bitumens. The upgrading of Uinta Basin bitumens for integration into refinery feed slates should emphasize molecular weight and boiling-range reduction first, followed by hydrotreating of the total liquid product produced in the pyrolysis process. The kinetics of residuum conversion can be modeled by invoking a consecutive-parallel mechanism in which native residuum in the feed is rapidly converted both to volatile products and to product residuum. Deep conversion of residuum is achieved only when the more refractory product residuum is converted to volatile products.
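The consecutive-parallel mechanism can be sketched numerically. The rate constants below are hypothetical placeholders (the abstract reports no values); only the choice k3 << k1, k2 reflects the abstract's claim that the product residuum is the refractory, rate-limiting species:

```python
# Toy integration of a consecutive-parallel residuum conversion scheme:
#   native residuum --k1--> volatiles
#   native residuum --k2--> product residuum --k3--> volatiles
# k1, k2, k3 are illustrative, not from the paper.
k1, k2, k3 = 1.0, 0.8, 0.05    # 1/h, hypothetical
dt, t_end = 0.001, 20.0        # h

native, product, volatiles = 1.0, 0.0, 0.0   # mass fractions
t = 0.0
while t < t_end:
    d_native = -(k1 + k2) * native
    d_product = k2 * native - k3 * product
    d_volatiles = k1 * native + k3 * product
    native += d_native * dt
    product += d_product * dt
    volatiles += d_volatiles * dt
    t += dt

# Native residuum is essentially gone, yet deep conversion is incomplete:
# a refractory product residuum still remains after 20 h.
print(native, product, volatiles)
```

The qualitative behavior matches the mechanism described: the fast parallel steps exhaust the native residuum quickly, while deep conversion waits on the slow consecutive step.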

The Mesaverde Group of the Piceance Basin in western Colorado has been a pilot study area for government-sponsored tight gas sand research for over 20 years. This study provides a critical comparison of the geologic, production, and reservoir characteristics of existing Mesaverde gas-producing areas within the basin to those same characteristics at the MWX site near Rifle, Colorado. As will be discussed, the basin has been partitioned into three areas having similar geologic and production characteristics. Stimulation techniques have been reviewed for each partitioned area to determine the most effective stimulation technique currently used in the Mesaverde. This study emphasizes the southern Piceance Basin, where production and geologic data are much more abundant. There may be Mesaverde gas production in northern areas, but because of the lack of production and the relatively few penetrations there, the northern Piceance Basin was not included in the detailed parts of this study. 54 refs., 31 figs., 7 tabs.

The purpose of this study was to contribute to a framework for establishing policies to promote efficient use of the nation's oil shale resources. A methodology was developed to explain the effects of federal leasing policies on resource recovery, extraction costs, and development times associated with oil shale surface mines. This report investigates the effects of lease size, industrial development patterns, waste disposal policies, and lease boundaries on the potential of the Piceance Basin oil shale resource. This approach should aid in understanding the relationship between federal leasing policies and requirements for developing Piceance Basin oil shale. 16 refs., 46 figs. (DMC)

P-WAVE TIME-LAPSE SEISMIC DATA INTERPRETATION AT RULISON FIELD, PICEANCE BASIN, COLORADO, by Donald ... Time-lapse seismic surveys, shot by the Reservoir Characterization Project in the fall of 2003 and 2004 at Rulison, show that time-lapse seismic can monitor tight gas reservoirs, to a limited extent, over a short period of time. Repeat surveys ...

The objectives of the study were to increase both primary and secondary hydrocarbon recovery through improved characterization (at the regional, unit, interwell, well, and microscopic scale) of fluvial-deltaic lacustrine reservoirs, thereby preventing premature abandonment of producing wells. The study will encourage exploration and establishment of additional water-flood units throughout the southwest region of the Uinta Basin, and other areas with production from fluvial-deltaic reservoirs.

... and seal rocks of the Green River petroleum system; datum is the Mahogany oil shale bed. Fig. 11: cross-section of thermal maturity of oil accumulations in the Green River petroleum system. Fig. 12: Lake Uinta depositional ... This petroleum system has produced more than 450 MMBO, mainly from two formations, the Green River and Colton Formations. The Green River Formation contains the source rock and most of the reservoir and seal rocks (Fig. 10). Most of the kerogen-rich oil ...

The preeminent view that evaporating black holes should simply be smaller black holes has been challenged by the firewall paradox. In particular, this paradox suggests that something different occurs once a black hole has evaporated to one-half its original surface area. Here we derive variations of the firewall paradox by tracking the thermodynamic entropy within a black hole across its entire lifetime. Our approach sweeps away many unnecessary assumptions, allowing us to demonstrate that a paradox exists even after its initial onset (when conventional assumptions render earlier analyses invalid). Our results suggest not only that the formation of a firewall is the most natural resolution, but also provide a mechanism for it. Finally, although firewalls cannot have evolved for modest-sized black holes within the age of the universe, we speculate on the implications if they were ever unambiguously observed.
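The "one-half the original surface area" turning point can be illustrated with a toy calculation. The assumptions here are not from the abstract: a Schwarzschild scaling M(t) = M0(1 - t/T)^(1/3), entropy S proportional to M^2 (horizon area), and the crude bookkeeping that the radiation's coarse-grained entropy grows by exactly what the hole loses. Under those assumptions the half-entropy (half-area) point sits well past half the hole's lifetime in time:

```python
# Toy bookkeeping for an evaporating Schwarzschild black hole.
# Assumed scalings (not from the abstract): M(t) = M0*(1 - t/T)**(1/3),
# S_bh proportional to M**2, so S_bh(t)/S0 = (1 - t/T)**(2/3).
def s_bh(t_frac):
    """Remaining black-hole entropy as a fraction of the initial entropy S0."""
    return (1.0 - t_frac) ** (2.0 / 3.0)

# Solve (1 - t/T)**(2/3) = 1/2 for the time at which the hole is down
# to half its original entropy (equivalently, half its horizon area).
t_frac = 1.0 - 0.5 ** 1.5
print(round(t_frac, 3), round(s_bh(t_frac), 3))
```

Under these scalings the half-area point falls at roughly 65% of the evaporation lifetime, so "halfway in area" and "halfway in time" are quite different markers.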

Anastomosing, low-gradient distributary channels produce approximately 30° API gravity, paraffinic oils from the Middle Member of the lacustrine Eocene Green River Formation in the south-central portion of the Uinta Basin. This localized depocenter was situated along the fluctuating southern shoreline of Lake Uinta, where complex deposits of marginal-lacustrine to lower delta plain accumulations are especially characteristic. The Middle Member contains several fining-upward parasequences that can be recognized in outcrop, core, and downhole logs. Each parasequence is about 60 to 120 feet thick and consists of strata deposited during multiple lake-level fluctuations that approach 30 to 35 feet in individual thickness. Such parasequences represent 300,000-year cycles based on limited absolute age dating. The subaerial to subaqueous channels commonly possess an erosional base and exhibit a fining-upward character. Accordingly, bedding features commonly range from large-scale trough and planar cross-bedding or lamination at the base to a nonreservoir, climbing-ripple assemblage near the uppermost reservoir boundary. The best reservoir quality occurs within the laminated to cross-stratified portions, and the climbing-ripple phase usually possesses more deleterious micas and/or detrital clays. Diagenesis also exerts a major control on reservoir quality. Certain sandstones were cemented by an early, iron-poor calcite cement, which can be subsequently leached. Secondary intergranular porosity (up to 20%) is largely responsible for the 10-100 millidarcy rock, which represents the petrophysical objective for both primary and secondary production. Otherwise, intense compaction, silicic and iron-rich carbonate cements, and authigenic clays serve to reduce reservoir quality to marginally economic levels.

... occurred. The resulting increase in water supply to Lake Uinta, while sediment remained trapped in the northern basin, caused a period of exceptionally high biologic activity. This allowed the deposition of the rich oil shales for which the Green River ... Brown, brittle shales make up the majority of this unit. Minor amounts of limestone, dolomite, and siltstone are also present. Some of the shales are "oil shales". This 440 ft (134 m) thick member is responsible for most of the production from ...

Klyachko and coworkers consider an orthogonality graph in the form of a pentagram, and in this way derive a Kochen-Specker inequality for spin-1 systems. In some low-dimensional situations, Hilbert spaces are naturally organised, by a magical choice of basis, into SO(N) orbits. Combining these ideas, some very elegant results emerge. We give a careful discussion of the pentagram operator, and then show how the pentagram underlies a number of other quantum "paradoxes", such as that of Hardy.

Geraghty & Miller, Inc. of Midland, Texas conducted geologic and hydrologic feasibility studies, through literature surveys, of the potential applicability of Jack McIntyre's patented process for the recovery of natural gas from coalbed/sand formations in the Piceance Basin. Jack McIntyre's tool separates produced water from gas and disposes of the water downhole into aquifers that are unused because of poor water quality, uneconomic lifting costs, or poor aquifer deliverability. The beneficial aspects of this technology are twofold. The process increases the potential for recovering previously uneconomic gas resources by reducing produced-water lifting, treatment, and disposal costs. Of greater importance is the advantage of lessening the environmental impact of produced water by downhole disposal. Results from the survey indicate that research in the Piceance Basin includes studies of the geology, hydrogeology, and conventional and unconventional oil and gas recovery technologies. Available information is mostly centered on the geology and hydrology of the Paleozoic and Mesozoic sediments. Less information is available on production technology because of the limited number of wells currently producing in the basin. Limited information is available on the baseline geochemistry of the coal/sand formation waters and that of the potential disposal zones. No determination was made of the compatibility of these waters. The study also indicates that water is often produced in variable quantities with gas from several gas-productive formations, which suggests that there are potential applications for Jack McIntyre's patented tool in the Piceance Basin.

Saline water disposal is one of the most pressing issues with regard to increasing petroleum and natural gas production in the Uinta Basin of northeastern Utah. Conventional oil fields in the basin provide 69 percent of Utah's total crude oil production and 71 percent of Utah's total natural gas, the latter of which has increased 208 percent in the past 10 years. Along with hydrocarbons, wells in the Uinta Basin produce significant quantities of saline water: nearly 4 million barrels of saline water per month in Uintah County and nearly 2 million barrels per month in Duchesne County. As hydrocarbon production increases, so does saline water production, creating an increased need for economic and environmentally responsible disposal plans. Current water disposal wells are near capacity, and permitting for new wells is being delayed because of a lack of technical data regarding potential disposal aquifers and questions concerning contamination of freshwater sources. Many companies are reluctantly resorting to evaporation ponds as a short-term solution, but these ponds have limited capacity, are prone to leakage, and pose potential risks to birds and other wildlife. Many Uinta Basin operators claim that oil and natural gas production cannot reach its full potential until a suitable, long-term saline water disposal solution is determined. The project was divided into three parts: 1) re-mapping the base of the moderately saline aquifer in the Uinta Basin, 2) creating a detailed geologic characterization of the Birds Nest aquifer, a potential reservoir for large-scale saline water disposal, and 3) collecting and analyzing water samples from the eastern Uinta Basin to establish baseline water quality. Part 1: Regulators currently stipulate that produced saline water must be disposed of into aquifers that already contain moderately saline water (water that averages at least 10,000 mg/L total dissolved solids).
The UGS has re-mapped the moderately saline water boundary in the subsurface of the Uinta Basin using a combination of water chemistry data collected from various sources and analysis of geophysical well logs. By re-mapping the base of the moderately saline aquifer using more robust data and more sophisticated computer-based mapping techniques, regulators now have the information needed to more expeditiously grant water disposal permits while still protecting freshwater resources. Part 2: Eastern Uinta Basin gas producers have identified the Birds Nest aquifer, located in the Parachute Creek Member of the Green River Formation, as the most promising reservoir suitable for large-volume saline water disposal. This aquifer formed from the dissolution of saline minerals, which left behind large open cavities and fractured rock. This new and complete understanding of the aquifer's areal extent, thickness, water chemistry, and relationship to Utah's vast oil shale resource will help operators and regulators determine safe saline water disposal practices, directly impacting the success of increased hydrocarbon production in the region while protecting potential future oil shale production. Part 3: To establish a baseline of water quality on lands identified by the U.S. Bureau of Land Management as having oil shale development potential in the southeastern Uinta Basin, the UGS collected biannual water samples over a three-year period from near-surface aquifers and surface sites. The near-surface and relatively shallow groundwater quality information will help in the development of environmentally sound water-management solutions for a possible future oil shale and oil sands industry and help assess the sensitivity of the alluvial and near-surface bedrock aquifers.
This multifaceted study will provide a better understanding of the aquifers in Utah's Uinta Basin, giving regulators the tools needed to protect precious freshwater resources while still allowing for increased hydrocarbon production.

An integrated, detailed sedimentologic, stratigraphic, and geochemical study of Utah's Green River Formation has found that Lake Uinta evolved in three phases: (1) a freshwater rising-lake phase below the Mahogany zone, (2) an anoxic deep-lake phase above the base of the Mahogany zone, and (3) a hypersaline lake phase within the middle and upper R-8. This long-term lake evolution was driven by tectonic basin development and the balance of sediment and water fill with the neighboring basins, as postulated by models developed for the Greater Green River Basin by Carroll and Bohacs (1999). Early Eocene abrupt global-warming events may have exerted significant control on deposition through the amount of sediment production and deposition rates, such that lean zones below the Mahogany zone record hyperthermal events and rich zones record periods between hyperthermals. This type of climatic control on short-term and long-term lake evolution and deposition has been previously overlooked. This geologic history contains key points relevant to oil shale development and engineering design, including: (1) Stratigraphic changes in oil shale quality and composition are systematic and can be related to spatial and temporal changes in the depositional environment and basin dynamics. (2) The inorganic mineral matrix of oil shale units changes significantly from clay mineral/dolomite dominated to calcite dominated above the base of the Mahogany zone. This variation may result in significant differences in pyrolysis products and geomechanical properties relevant to development and should be incorporated into engineering experiments. (3) This study identifies a region in the Uinta Basin that would be highly prospective for application of in-situ production techniques. Stratigraphic targets for in-situ recovery techniques should extend above and below the Mahogany zone and include the upper R-6 and lower R-8.

This report is a summary of drilling and testing operations in the four primary study areas of the WGSP for this period: Greater Green River Basin, Northern Great Plains Province, Piceance Basin, and Uinta Basin. (DLC)

This report is a summary of drilling and testing activities in the four primary study areas of the WGSP: Greater Green River Basin, Northern Great Plains Province, Uinta Basin, and Piceance Basin. (DLC)

The US Geological Survey recognizes six major plays for nonassociated gas in Tertiary and Upper Cretaceous low-permeability strata of the Uinta Basin, Utah. For purposes of this study, plays without gas/water contacts are separated from those with such contacts. Continuous-saturation accumulations are essentially single fields, so large in areal extent and so heterogeneous that their development cannot be properly modeled as field growth. Fields developed in gas-saturated plays are not restricted to structural or stratigraphic traps; they are developed in any structural position where permeability conduits occur, such as those provided by natural open fractures. Other fields in the basin have gas/water contacts, and the rocks are water-bearing away from structural culminations. The plays can be assigned to two groups. Group 1 plays are those in which gas/water contacts are rare to absent and the strata are gas saturated. Group 2 plays contain reservoirs in which both gas-saturated strata and rocks with gas/water contacts seem to coexist. Most units in the basin that have received a Federal Energy Regulatory Commission (FERC) designation as tight are in the main producing areas and are within Group 1 plays. Some rocks in Group 2 plays may not meet FERC requirements as tight reservoirs. However, we suggest that in the Uinta Basin the extent of low-permeability rocks, and therefore resources, extends well beyond the limits of current FERC-designated boundaries for tight reservoirs. Potential additions to gas reserves from gas-saturated tight reservoirs in the Tertiary Wasatch Formation and Cretaceous Mesaverde Group in the Uinta Basin, Utah, are 10 TCF. If the potential additions to reserves in strata in which both gas-saturated and free water-bearing rocks exist are added to those of Group 1 plays, the volume is 13 TCF.

The C-a and C-b tracts in the Piceance Creek Basin are potential sites for the development of oil shale by the modified in-situ retorting (MIS) process. Proposed development plans for these tracts require the disturbance of over three billion m³ of oil shale to a depth of about 400 m (1,312 ft) or more below ground level. The study investigates the nature and impacts of the dewatering and reinvasion that are likely to accompany the MIS process. The purpose is to extend earlier investigations through more refined mathematical analysis. Physical phenomena not adequately covered in previous studies, particularly the desaturation process, are investigated. The present study also seeks to identify, through a parametric approach, the key variables required to characterize systems such as those at the C-a and C-b tracts.

It is well known that the Klein paradox is one of the most exotic and counterintuitive consequences of quantum theory. Nevertheless, many discussions of the Klein paradox are based on the single-particle Dirac equation in quantum mechanics rather than on quantum field methods. Using the path-integral formalism, we evaluate the reflection and transmission coefficients to lowest order for electron scattering by a finite square barrier potential. Treating the step potential as the limiting case of the finite square barrier, we explain the Klein paradox as arising from the ill-defined nature of the step potential.

As no heat effect or mechanical work is observed, we have a simple experimental resolution of the Gibbs paradox: both the thermodynamic entropy of mixing and the Gibbs free energy change are zero during the formation of any ideal mixture. Information loss is the driving force of these spontaneous processes. Information is defined as the amount of the compressed data. Information losses due to dynamic motion and to static symmetric structure formation are defined as two kinds of entropy: dynamic entropy and static entropy, respectively. There are three laws of information theory, of which the first and second are analogs of the two thermodynamic laws. However, the third law of information theory is different: for a solid structure of perfect symmetry (e.g., a perfect crystal), the entropy (static entropy, for the solid state) S is at a maximum. More generally, a similarity principle is set up: if all other conditions remain constant, then the higher the similarity among the components, the higher the entropy of the mixture (for fluid phases), the assemblage (for a static structure or a system of condensed phases), or any other structure (such as quantum states in quantum mechanics); the more stable the mixture or assemblage; and the more spontaneous the process leading to such a mixture, assemblage, or chemical bond.
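For contrast with the abstract's claim that the thermodynamic entropy of mixing vanishes for ideal mixtures, the conventional statistical-mechanics expression for mixing distinguishable ideal gases is ΔS = -nR Σ xᵢ ln xᵢ, which is strictly positive. A quick evaluation of that standard textbook formula (not the abstract's framework):

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def mixing_entropy(mole_fractions, n_total=1.0):
    """Conventional ideal entropy of mixing: Delta_S = -n R sum(x ln x)."""
    return -n_total * R * sum(x * math.log(x) for x in mole_fractions if x > 0)

# Equimolar binary mixture of *distinguishable* ideal gases:
print(mixing_entropy([0.5, 0.5]))   # n R ln 2, about 5.76 J/K per mole

# Identical gases: a single component, so nothing mixes and the result is zero,
# which is the discontinuity at the heart of the Gibbs paradox.
print(mixing_entropy([1.0]))        # 0.0
```

The paradox is precisely this discontinuity between "similar but distinguishable" (positive ΔS) and "identical" (zero ΔS) components, which the abstract's similarity principle is intended to smooth out.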

The paradox of enrichment was observed by M. Rosenzweig in a class of predator-prey models. Two of the parameters in these models are crucial for the paradox: the prey's carrying capacity and the prey's half-saturation constant for predation. Intuitively, increasing the carrying capacity by enriching the prey's environment should lead to a more stable predator-prey system. Analytically, it turns out that increasing the carrying capacity always leads to an unstable predator-prey system that is susceptible to extinction from environmental random perturbations. This is the so-called paradox of enrichment. Our resolution rests upon a closer investigation of a dimensionless number $H$ formed from the carrying capacity and the prey's half-saturation. By recasting the models into dimensionless form, we show that they are in fact governed by a few dimensionless numbers, including $H$. The effects of the two parameters, carrying capacity and half-saturation, are incorporated into the number $H$. In fact, increasing the carrying capacity is equivalent to decreasing the half-saturation (i.e., it has the same effect on $H$), which implies more aggressive predation. Since there is no paradox between more aggressive predation and instability of the predator-prey system, the paradox of enrichment is resolved. We also further explore spatially dependent models, for which the phase space is infinite dimensional. The spatially independent limit cycle, which is generated by a Hopf bifurcation from an unstable steady state, is linearly stable in the infinite-dimensional phase space. Numerical simulations indicate that the basin of attraction of the limit cycle is riddled. This shows that spatial perturbations can sometimes (neither always nor never) remove the paradox of enrichment near the limit cycle!
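The destabilizing role of the dimensionless number can be checked numerically. A minimal sketch of the Rosenzweig-MacArthur model in dimensionless form, $u' = u(1-u) - uv/(h+u)$, $v' = \beta uv/(h+u) - \delta v$, where $h$ is the half-saturation scaled by the carrying capacity (playing the role of $H$ above); the parameter values are illustrative choices, not taken from the paper:

```python
def simulate(h, beta=2.0, delta=1.0, u0=0.6, v0=0.3, dt=0.005, t_end=300.0):
    """Forward-Euler integration of the dimensionless Rosenzweig-MacArthur
    model; returns prey values u(t) sampled over the last third of the run."""
    u, v = u0, v0
    steps = int(t_end / dt)
    tail = []
    for i in range(steps):
        du = u * (1.0 - u) - u * v / (h + u)
        dv = beta * u * v / (h + u) - delta * v
        u += du * dt
        v += dv * dt
        if i > 2 * steps // 3:
            tail.append(u)
    return tail

# With beta=2, delta=1 the interior equilibrium is u* = h, and the standard
# Hopf condition u* < (1-h)/2 predicts instability for h < 1/3.
for h in (0.2, 0.5):
    tail = simulate(h)
    print(f"h = {h}: late-time prey oscillation amplitude = "
          f"{max(tail) - min(tail):.4f}")
# h = 0.2 (large carrying capacity relative to half-saturation): limit cycle.
# h = 0.5: oscillations decay to the stable equilibrium.
```

Raising the carrying capacity lowers $h$ and pushes the system across the Hopf threshold, which is exactly the enrichment-driven destabilization the abstract describes.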

It is demonstrated that both the transmission and reflection coefficients associated with the Klein paradox at a step barrier are positive and less than unity, so that the particle-antiparticle pair creation mechanism commonly linked to this phenomenon is not necessary. Because graphene is a solid-state testing ground for quantum electrodynamics phenomena involving massless Dirac fermions, we suggest that the transport characteristics through a p-n graphene junction can decide between the results obtained in this paper and the common Klein paradox theory, which implies negative transmission and higher-than-unity reflection coefficients. Recent experimental evidence supports our findings.
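The disputed sign choice can be made concrete with numbers. In a standard single-particle parameterization of the Dirac step (the symbols and values below are our illustrative choices, not the paper's), the coefficients depend on a kinematic factor κ = (p'/p)(E+m)/(V₀-E+m): one group-velocity convention gives R = ((1-κ)/(1+κ))² < 1, consistent with this paper's claim, while the opposite convention gives the traditional "paradoxical" R = ((1+κ)/(1-κ))² > 1. A quick check with arbitrary Klein-zone parameters (units ħ = c = 1):

```python
import math

# Illustrative Klein-zone parameters (hbar = c = 1), requiring V0 > E + m.
m, E, V0 = 1.0, 2.0, 5.0
assert V0 > E + m

p = math.sqrt(E**2 - m**2)                 # momentum outside the step
p_prime = math.sqrt((V0 - E)**2 - m**2)    # momentum inside the step
kappa = (p_prime / p) * (E + m) / (V0 - E + m)

# Convention A (as argued in this paper): R < 1 and T = 1 - R > 0.
R_a = ((1 - kappa) / (1 + kappa)) ** 2
T_a = 1 - R_a

# Convention B (the traditional paradox): R > 1, forcing T < 0.
R_b = ((1 + kappa) / (1 - kappa)) ** 2
T_b = 1 - R_b

print(kappa, R_a, T_a, R_b, T_b)
```

For any positive κ the two conventions are reciprocal (R_a · R_b = 1), so one always lands below unity and the other above; the physics question the paper addresses is which convention is correct.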

SUMMARY: A widespread molecule can induce opposite effects in the responding cells. For example, the cytokine IL-2 can promote ... the paradoxical effect of IL-2, which increases the proliferation rate cooperatively and the death rate linearly. (http://dx.doi.org/10.1016/j.cell.2014.07.033)

Strategic Environmental Assessment (SEA) is a tool that can facilitate sustainable development and improve decision-making by introducing environmental concerns early in planning processes. However, various international studies conclude that current planning practice is not taking full advantage of the tool, and we therefore define the paradox of SEA as the methodological ambiguity of non-strategic SEA. This article explores causality through a three-step case study on aggregates extraction planning in Denmark, consisting of a document analysis, a questionnaire survey, and follow-up communication with key planners. Though the environmental reports on one hand largely lack strategic considerations, practitioners express an inherent will for strategy and reveal that their SEAs have in fact been an integrated part of the planning process. Institutional context is found to be the most significant barrier to strategy, which suggests that non-strategic planning setups can prove more important than non-strategic planning in SEA practice. Planners may try to execute strategy within the confinements of SEA-restricted planning contexts; however, such efforts can be overlooked if evaluated by a narrow criterion for strategy formation. Consequently, the paradox may also stem from challenged documentation. These findings contribute to the common understanding of SEA quality; however, further research is needed on how to communicate and influence the strategic options which arguably remain inside non-strategic planning realities. Highlights: • International studies conclude that SEAs are not strategic: the paradox of SEA. • Even on the highest managerial level, some contexts do not leave room for strategy. • Non-strategic SEA can derive from challenged documentation. • Descriptive and emergent strategy formation can, in practice, be deemed non-strategic.

Computer Implication and Curry's Paradox. Wayne Aitken, Jeffrey A. Barrett. Abstract: There are theoretical limitations to what can be implemented by a computer program. In this paper we are concerned with a limitation on the strength of computer-implemented deduction. We use a version of Curry's paradox to arrive ...

We use an argument by Page to exhibit a paradox in the global description of the multiverse: the overwhelming majority of observers arise from quantum fluctuations and not by conventional evolution. Unless we are extremely atypical, this contradicts observation. The paradox does not arise in the local description of the multiverse, but similar arguments yield interesting constraints on the maximum lifetime of metastable vacua.

This paper explores the consequences of denying the "emptiness of paths not taken" (EPNT) premise of Bernstein, Greenberger, Horne, and Zeilinger (BGHZ) in their paper titled "Bell theorem without inequalities." Carrying out the negation of EPNT leads to the concept of a "shadow stream." Streams are essentially particle implementations of the paths in Feynman path integrals, resulting in a simple and consistent extension of the standard axioms of quantum mechanics. The construct provides elegant resolutions of single- and multi-particle interference paradoxes. Moreover, combining the argument of this paper with that of BGHZ shows that there are just two choices for quantum foundations: interpretations closely similar to the present one, or those that harbor instantaneous action at a distance.

We demonstrate the appearance of the Einstein-Podolsky-Rosen (EPR) paradox when a radiation field impinges on a movable mirror. Then, a local realism test within a pendular Fabry-Perot cavity is shown to be feasible.

We investigate what it means to apply the solution proposed for the firewall paradox by Harlow and Hayden to the famous quantum paradoxes of Schrödinger's Cat and Wigner's Friend, if one views these as posing a thermodynamic decoding problem (as Hawking radiation does in the firewall paradox). The implications might point to a relevance of the firewall paradox for the axiomatic and set-theoretic foundations underlying mathematics. We reconsider in this context the results of Benioff on the foundational challenges posed by the randomness postulate of quantum theory. A central point in our discussion is that one cannot naturally distinguish mathematically between computational complexity (central to the approach of Harlow and Hayden and further developed by Susskind) and proof-theoretic complexity (since they represent the same concept on a Turing machine), with the latter being related to a finite bound on Kolmogorov entropy (due to Chaitin incompleteness).

Some known relativistic paradoxes are reconsidered for closed spaces, using a simple geometric model. For two twins in a closed space, a real paradox seems to emerge when the traveling twin moves uniformly along a geodesic and returns to the starting point without turning back. Accordingly, the reference frames (RFs) of the two twins seem to be equivalent, which makes the twin paradox irresolvable: each twin can claim to be at rest and therefore to have aged more than the partner upon their reunion. In reality, the paradox has a resolution in this case as well. Apart from the distinction between the two RFs with respect to the actual forces in play, they can be distinguished by clock synchronization. A closed space singles out a truly stationary RF with single-valued global time; in all other frames, time is not a single-valued parameter. This implies that even uniform motion along a spatial geodesic in a compact space is not truly inertial, and there is an effective force on an object in such motion. Therefore, the traveling twin will age less upon circumnavigation than the stationary one, just as in flat space-time. Ironically, Relativity in this case emerges free of paradoxes at the price of bringing back the pre-Galilean concept of absolute rest. An example showing the absence of paradoxes is also considered for the more realistic case of a time-evolving closed space.
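The quantitative outcome is the same as in the flat-space twin paradox: the circumnavigating twin's proper time is the preferred-frame coordinate time scaled by the Lorentz factor. A toy calculation with hypothetical numbers (a closed space of circumference 1 light-year, traveler at 0.6c; both values are illustrative, not from the paper):

```python
import math

# Hypothetical closed universe: spatial circumference of 1 light-year.
circumference_ly = 1.0
v = 0.6          # traveler's speed as a fraction of c (illustrative)

# Duration of one circumnavigation in the preferred (stationary) frame:
t_stationary = circumference_ly / v            # years
# Proper time elapsed for the traveling twin:
t_traveler = t_stationary * math.sqrt(1 - v**2)

print(f"Stationary twin ages {t_stationary:.3f} yr, "
      f"traveler ages {t_traveler:.3f} yr")
```

The traveler returns younger even though no turnaround acceleration ever occurs, which is why the closed-space version needs the clock-synchronization argument above to break the apparent symmetry.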

This conceptual design report summarizes the conceptualized design for an exploratory shaft facility at a representative site in the Paradox Basin located in the southeastern part of Utah. Conceptualized designs for other possible locations (Permian Basin in Texas and Gulf Interior Region salt domes in Louisiana and Mississippi) are summarized in separate reports. The purpose of the exploratory shaft facility is to provide access to the reference repository horizon to permit in situ testing of the salt. The in-situ testing is necessary to verify repository salt design parameters, evaluate isotropy and homogeneity of the salt, and provide a demonstration of the constructability and confirmation of the design to gain access to the repository. The fundamental purpose of this conceptual design report is to assure the feasibility of the exploratory shaft project and to develop a reliable cost estimate and realistic schedule. Because a site has not been selected and site-specific subsurface data are not available, it has been necessary to make certain assumptions in order to develop a conceptual design for an exploratory shaft facility in salt. As more definitive information becomes available to support the design process, adjustments in the projected schedule and estimated costs will be required.

We investigate the role which clouds could play in resolving the Faint Young Sun Paradox (FYSP). Lower solar luminosity in the past means that less energy was absorbed on Earth (a forcing of -50 W m-2 during the late Archean), but geological evidence points to the Earth being at least as warm as it is today, with only very occasional glaciations. We perform radiative calculations on a single global-mean atmospheric column. We select a nominal set of three layered, randomly overlapping clouds, which are both consistent with observed cloud climatologies and reproduce the observed global mean energy budget of Earth. By varying the fraction, thickness, height and particle size of these clouds we conduct a wide exploration of how changed clouds could affect climate, thus constraining how clouds could contribute to resolving the FYSP. Low clouds reflect sunlight but have little greenhouse effect. Removing them entirely gives a forcing of +25 W m-2, whilst a more modest reduction in their efficacy gives a forcing of +10 ...
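
The quoted late-Archean forcing follows from simple top-of-atmosphere bookkeeping. A rough sketch (the 20% luminosity reduction and the 0.3 albedo are assumed round numbers, not values taken from the study):

```python
S0 = 1361.0                  # present-day solar constant, W/m^2
albedo = 0.3                 # planetary albedo (assumed unchanged)
luminosity_fraction = 0.80   # late-Archean Sun ~20% fainter (round number)

# Globally averaged absorbed solar flux: S * (1 - albedo) / 4
absorbed_now = S0 * (1 - albedo) / 4
absorbed_archean = S0 * luminosity_fraction * (1 - albedo) / 4

forcing = absorbed_archean - absorbed_now
# forcing ≈ -47.6 W/m^2, close to the -50 W/m^2 quoted in the abstract
```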

Utah is rich in oil shale and oil sands resources. Chief among the challenges facing prospective unconventional fuel developers is the ability to access these resources. Access is heavily dependent upon land ownership and applicable management requirements. Understanding constraints on resource access and the prospect of consolidating resource holdings across a fragmented management landscape is critical to understanding the role Utah’s unconventional fuel resources may play in our nation’s energy policy. This Topical Report explains the historic roots of the “crazy quilt” of western land ownership, how current controversies over management of federal public land with wilderness character could impact access to unconventional fuels resources, and how land exchanges could improve management efficiency. Upon admission to the Union, the State of Utah received the right to title to more than one-ninth of all land within the newly formed state. This land is held in trust to support public schools and institutions, and is managed to generate revenue for trust beneficiaries. State trust lands are scattered across the state in mostly discontinuous 640-acre parcels, many of which are surrounded by federal land and too small to develop on their own. Where state trust lands are developable but surrounded by federal land, federal land management objectives can complicate state trust land development. The difficulty generating revenue from state trust lands can frustrate state and local government officials as well as citizens advocating for economic development. Likewise, the prospect of industrial development of inholdings within prized conservation landscapes creates management challenges for federal agencies. One major tension involves whether certain federal public lands possess wilderness character, and if so, whether management of those lands should emphasize wilderness values over other uses. 
On December 22, 2010, Secretary of the Interior Ken Salazar issued Secretarial Order 3310, Protecting Wilderness Characteristics on Lands Managed by the Bureau of Land Management. Supporters argue that the Order merely provides guidance regarding implementation of existing legal obligations without creating new rights or duties. Opponents describe Order 3310 as subverting congressional authority to designate Wilderness Areas and as closing millions of acres of public lands to energy development and commodity production. While opponents succeeded in temporarily defunding the Order’s implementation and forcing the Bureau of Land Management (BLM) to adopt a more collaborative approach, the fundamental questions remain: Which federal public lands possess wilderness characteristics and how should those lands be managed? The closely related question is: How might management of such resources impact unconventional fuel development within Utah? These questions remain pressing independent of the Order because the BLM, which manages the majority of federal land in Utah, is statutorily obligated to maintain an up-to-date inventory of federal public lands and the resources they contain, including lands with wilderness characteristics. The BLM is also legally obligated to develop and periodically update land use plans, relying on information obtained in its public lands inventory. The BLM cannot sidestep these hard choices, and failure to consider wilderness characteristics during the planning process will derail the planning effort. Based on an analysis of the most recent inventory data, lands with wilderness characteristics — whether already subject to mandatory protection under the Wilderness Act, subject to discretionary protections as part of BLM Resource Management Plan revisions, or potentially subject to new protections under Order 3310 — are unlikely to profoundly impact oil shale development within Utah’s Uinta Basin. 
Lands with wilderness characteristics are likely to have a greater impact on oil sands resources, particularly those resources found in the southern part of the state. Management requirements independent of l

A normally ordered characteristic function (NOCF) of Bose operators is calculated for a number of discrete-variable entangled states (Greenberger-Horne-Zeilinger (GHZ) and Werner (W) qubit states and a cluster state). It is shown that such NOCFs contain visual information on two types of correlations: pseudoclassical and quantum correlations. The latter manifest themselves in the interference terms of the NOCFs and lead to quantum paradoxes, whereas the pseudoclassical correlations of photons and their cumulants satisfy the relations for classical random variables. Three- and four-qubit states are analyzed in detail. An implementation of an analog of Bernstein's paradox on discrete quantum variables is discussed. A measure of quantumness of an entangled state is introduced that is not related to the entropy approach. It is established that the maximum of the degree of quantumness substantiates the numerical values of the coefficients in multiqubit vector states derived from intuitive considerations.

Solutions of the one-dimensional Dirac equation with piecewise-constant potentials are presented using standard methods. These solutions show that the Klein paradox is non-existent and represents a failure to correctly match solutions across a step potential. Consequences of this exact solution are studied for the step potential and a square barrier. Characteristics of massless Dirac states and the linear band energies for graphene are shown to have quite different current and momentum properties.

Paradoxical games, ratchets, and related phenomena. Juan M.R. Parrondo, Luis Dinís, Javier Buceta. ... states in spatially extended systems [2, 3, 4]. Brownian ratchets show that noise can be rectified ... a Brownian ratchet. In fact, the paradox came up as a translation to gambling games of the flashing ratchet.

Beyond the Productivity Paradox: Computers are the Catalyst for Bigger Changes. Forthcoming. The Wharton School, lhitt@wharton.upenn.edu, http://grace.wharton.upenn.edu/~lhitt/ ... I. Why Should We Care About Productivity? An important question that has been debated

Forty years after the discovery of Hawking radiation, its exact nature remains elusive. If Hawking radiation does not carry any information out from the ever shrinking black hole, it seems that unitarity is violated once the black hole completely evaporates. On the other hand, attempts to recover information via quantum entanglement lead to the firewall controversy. Amid the confusions, the possibility that black hole evaporation stops with a "remnant" has remained unpopular and is often dismissed due to some "undesired properties" of such an object. Nevertheless, as in any scientific debate, the pros and cons of any proposal must be carefully scrutinized. We fill in the void of the literature by providing a timely review of various types of black hole remnants, and provide some new thoughts regarding the challenges that black hole remnants face in the context of information loss paradox and its latest incarnation, namely the firewall controversy. The importance of understanding the role of curvature singularity is also emphasized, after all there remains a possibility that singularity cannot be cured even by quantum gravity. In this context a black hole remnant conveniently serves as a cosmic censor. We conclude that a remnant remains a possible end state of Hawking evaporation, and if it contains large interior geometry, may help to ameliorate information loss and the firewall paradox. We hope that this will raise some interests in the community to investigate remnants more critically but also more thoroughly.

A data base was assembled in 1978 consisting of well records from more than 900 hydraulically fractured wells in the Piceance, Uinta, Washakie, Sand Wash, and Denver Basins. The purpose of the present study is to develop a western gas sand computerized data base for hydraulically stimulated gas wells by adapting and expanding the above-mentioned data file. This report describes the data file, tasks accomplished to date, and a sample well record. (DMC)
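
A minimal sketch of what one record in such a stimulated-well data file might look like; the field names and query helper below are illustrative assumptions, not the actual 1978 file layout:

```python
from dataclasses import dataclass

@dataclass
class WellRecord:
    """One hydraulically stimulated well record (hypothetical layout)."""
    api_number: str        # unique well identifier
    basin: str             # e.g. "Piceance", "Uinta", "Washakie"
    formation: str         # producing formation, e.g. "Mesaverde"
    depth_ft: float        # treated interval depth
    frac_fluid_gal: float  # fracture-fluid volume pumped
    proppant_lb: float     # proppant mass placed
    pre_frac_mcfd: float   # gas rate before stimulation
    post_frac_mcfd: float  # gas rate after stimulation

def by_basin(records, basin):
    """Simple query: all records for one basin."""
    return [r for r in records if r.basin == basin]

# Two toy records, invented for illustration only.
wells = [
    WellRecord("05-045-00001", "Piceance", "Mesaverde", 7500.0,
               100000.0, 200000.0, 150.0, 600.0),
    WellRecord("43-013-00002", "Uinta", "Wasatch", 9000.0,
               80000.0, 150000.0, 90.0, 350.0),
]
piceance = by_basin(wells, "Piceance")   # one record in this toy sample
```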

of the corresponding social choice rule. Most significantly, we illustrate several multiple-election paradoxes ... such paradoxes. We also study the possibility of avoiding the paradoxes for strategic sequential voting

Lucien Hardy's Paradox as Presented by N. David Mermin. Frank Rioux. A source emits two photons ... the result H'H' is never observed. Sources: Lucien Hardy, "Spooky Action at a Distance in Quantum Mechanics"

In the history of cosmology, physical paradoxes have played an important role in the development of contemporary world models. Within the modern standard cosmological model there are both observational and conceptual cosmological paradoxes which stimulate the search for their solution. Confrontation of theoretical predictions of the standard cosmological model with the latest astrophysical observational data is considered. A review of conceptual problems of the Friedmann expanding-space models, which lie at the basis of the modern cosmological model, is discussed. The main paradoxes discussed in the modern literature are the Newtonian character of the exact Friedmann equation, the violation of energy conservation within any comoving local volume, and the violation of the limiting recession velocity of galaxies for observed high-redshift objects. Possible observational tests of the nature of the cosmological redshift are discussed.

The primary objective of this project is to enhance domestic petroleum production by demonstration and technology transfer of an advanced oil recovery technology in the Paradox basin, southeastern Utah. If this project can demonstrate technical and economic feasibility, the technique can be applied to about 100 additional small fields in the Paradox basin alone, and result in increased recovery of 150 to 200 million barrels of oil. This project is designed to characterize five shallow-shelf carbonate reservoirs in the Pennsylvanian (Desmoinesian) Paradox Formation and choose the best candidate for a pilot demonstration project for either a waterflood or carbon-dioxide (CO2) flood project. The field demonstration, monitoring of field performance, and associated validation activities will take place in the Paradox basin within the Navajo Nation. Two activities continued this quarter as part of the geological and reservoir characterization of productive carbonate buildups in the Paradox basin: (1) diagenetic characterization of project field reservoirs, and (2) technology transfer.

This paper identifies specific angles of emission and reception of light for which there exists a mass-energy counterpart to the well known transverse Doppler shift. At these angles, the relationship of proper and relative frequency is the same as that for proper and relative mass-energy of a source. Paradoxically, the transverse Doppler shift is often used to demonstrate that for specific angles of emission and reception the relationship of proper and relative frequency of light is the same as that for proper and relative time. But by carefully defining angles, this apparent paradox is resolved.
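
The angle condition described above can be made concrete. With the standard relativistic Doppler formula, the observed-to-proper frequency ratio equals the same factor γ that relates relative and proper mass-energy (E = γE0) when cos θ = β, with θ measured in the receiver frame. A small numerical check (the specific β is arbitrary):

```python
import math

def doppler_ratio(beta, cos_theta):
    """f_obs / f_src for a source seen at angle theta in the
    receiver's frame: 1 / (gamma * (1 - beta * cos(theta)))."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return 1.0 / (gamma * (1.0 - beta * cos_theta))

beta = 0.6
gamma = 1.0 / math.sqrt(1.0 - beta**2)

# At cos(theta) = beta the frequency ratio equals gamma, matching
# the proper-to-relative mass-energy relation E = gamma * E0.
ratio = doppler_ratio(beta, beta)
# ratio == gamma == 1.25 for beta = 0.6
```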

Constant Proportion Debt Obligations, Zeno's Paradox, and the Spectacular Financial Crisis of 2008. ... the ongoing worldwide financial crisis are heightened by the existence of other financial derivatives more arcane than ... "to the beat of his dying heart, The Devil drum on the darkened pane: 'You did it, but was it Art?'" (Rudyard Kipling)

Bosnia-Herzegovina: Trying to Build a Federal State on Paradoxes. Jens Woelk. Introduction: The basis for federalism in Bosnia-Herzegovina is rather peculiar due to its unique complexity. ... This was to be accomplished by physical reconstruction as well as by preserving Bosnia and Herzegovina as one country.

Chronic Kidney Disease and Hypertension: The Paradox of Treating Patients with Spironolactone. Amber ... 1. Describe the correlation between CKD and hypertension; 2. Describe the role of aldosterone in the renin-angiotensin-aldosterone system (RAAS) and in resistant hypertension; 3. Explain the mechanism of aldosterone antagonists; 4. ...

The paradoxical aspect of the Himalayan granites. Jean-Louis Vigneresse and Jean-Pierre Burg. ... as reference examples of collision-related granites. However, they are much smaller than the Hercynian collision-related granites. Additional comparison with magmatic arcs and cordilleran-type batholiths

The Renewable Electric Plant Information System (REPiS) is a comprehensive database with detailed information on grid-connected renewable electric plants in the US. The current version, REPiS3 beta, was developed in Paradox for Windows. The user interface (UI) was developed to facilitate easy access to information in the database, without the need to have, or know how to use, Paradox for Windows. The UI is designed to provide quick responses to commonly requested sorts of the database. A quick perusal of this manual will familiarize one with the functions of the UI and will make use of the system easier. There are six parts to this manual: (1) Quick Start: Instructions for Users Familiar with Database Applications; (2) Getting Started: The Installation Process; (3) Choosing the Appropriate Report; (4) Using the User Interface; (5) Troubleshooting; (6) Appendices A and B.

It is pointed out that Coleman and Van Vleck make a major blunder in their discussion of the Shockley-James paradox by designating relativistic hidden mechanical momentum as the basis for resolution of the paradox. This blunder has had a wide influence in the current physics literature, including erroneous work on the Shockley-James paradox, on Mansuripur's paradox, on the motion of a magnetic moment, on the Aharonov-Bohm phase shift, and on the Aharonov-Casher phase shift. Although hidden mechanical momentum is indeed dominant for non-interacting particles moving in a closed orbit under the influence of an external electric field, the attention directed toward hidden mechanical momentum represents a fundamental misunderstanding of the classical electromagnetic interaction between a multiparticle magnet and an external point charge. In the interacting multiparticle situation, the external charge induces an electrostatic polarization of the magnet which leads to an internal electromagnetic momentum in the magnet where both the electric and magnetic fields for the momentum are contributed by the magnet particles. This internal electromagnetic momentum for the interacting multiparticle situation is equal in magnitude and opposite in direction compared to the familiar external electromagnetic momentum where the electric field is contributed by the external charged particle and the magnetic field is that due to the magnet. In the present article, the momentum balance of the Shockley-James situation for a system of a magnet and a point charge is calculated in detail for a magnet model consisting of two interacting point charges which are constrained to move in a circular orbit on a frictionless ring with a compensating negative charge at the center.

Citation: David Cateforis, “Calligraphy, Poetry, and Paradoxical Power in Wenda Gu’s Neon Calligraphy Series,” Word & Image 26, no. 1 (January-March 2010): 1-20. Available at http://hdl.handle.net/1808/5633. Abstract: Many contemporary... Chinese artists demonstrate a strong interest in calligraphy, which they creatively reinterpret in the context of China’s swift economic development, physical modernization, and social and cultural transformation. A leader among them is Wenda Gu (also...

Bohmian mechanics solves the wave-particle duality paradox by introducing the concept of a physical particle that is always point-like and a separate wavefunction with some sort of physical reality. However, this model has not been satisfactorily extended to relativistic levels. Here we introduce a model of permanent point-like particles that works at any energy level. Our model seems to have the benefits of Bohmian mechanics without its shortcomings. We propose an experiment for which the standard interpretation of quantum mechanics and our model make different predictions.

Based on a reconsideration of the Gibbs paradox, we show that a residual, non-extensive term in entropy turns up upon mixing identical particles, whether they are indistinguishable or not. The positive contribution from this residual entropy leads to a decrease in free energy, and we suggest that this entropic mechanism may serve as a source of like-charge attractions between a pair of colloidal particles or other macroions. For a system of two colloidal particles along with their neutralizing counterions, such decrease in free energy is of a few thermal energies and therefore crucial to the effective interaction between the particles.
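
For contrast with the residual non-extensive term proposed above, the textbook classical bookkeeping can be sketched directly; this sketch shows only the standard distinguishable-versus-identical mixing entropy, not the paper's correction:

```python
import math

def mixing_entropy(n_a, n_b, identical):
    """Classical entropy of mixing two equal-T, equal-P ideal gases,
    in units of Boltzmann's constant k_B.

    Distinguishable species: Delta S = -N * sum_i x_i ln(x_i) > 0.
    Identical species (the standard resolution of the Gibbs
    paradox): Delta S = 0.
    """
    if identical:
        return 0.0
    n = n_a + n_b
    x_a, x_b = n_a / n, n_b / n
    return -n * (x_a * math.log(x_a) + x_b * math.log(x_b))

N = 1.0e23
dS_different = mixing_entropy(N, N, identical=False)  # = 2 N ln 2 (in k_B)
dS_same = mixing_entropy(N, N, identical=True)        # = 0 in the textbook account
```

The paper's claim is precisely that this second value is not the whole story: a residual, non-extensive term survives even for identical particles.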

The primary objective of this project is to enhance domestic petroleum production by field demonstration and technology transfer of an advanced- oil-recovery technology in the Paradox basin, southeastern Utah. If this project can demonstrate technical and economic feasibility, the technique can be applied to approximately 100 additional small fields in the Paradox basin alone, and result in increased recovery of 150 to 200 million barrels (23,850,000-31,800,000 m3) of oil. This project is designed to characterize five shallow-shelf carbonate reservoirs in the Pennsylvanian (Desmoinesian) Paradox Formation and choose the best candidate for a pilot demonstration project for either a waterflood or carbon-dioxide-(CO2-) miscible flood project. The field demonstration, monitoring of field performance, and associated validation activities will take place within the Navajo Nation, San Juan County, Utah.

The primary objective of this project was to enhance domestic petroleum production by field demonstration and technology transfer of an advanced-oil-recovery technology in the Paradox Basin, southeastern Utah. If this project can demonstrate technical and economic feasibility, the technique can be applied to approximately 100 additional small fields in the Paradox Basin alone, and result in increased recovery of 150 to 200 million barrels (23,850,000-31,800,000 m3) of oil. This project was designed to characterize five shallow-shelf carbonate reservoirs in the Pennsylvanian (Desmoinesian) Paradox Formation and choose the best candidate for a pilot demonstration project for either a waterflood or carbon-dioxide-(CO2-) miscible flood project. The field demonstration, monitoring of field performance, and associated validation activities will take place within the Navajo Nation, San Juan County, Utah.

As the velocity of a rocket in a circular orbit near a black hole increases, the outwardly directed rocket thrust must increase to keep the rocket in its orbit. This feature might appear paradoxical from a Newtonian viewpoint, but we show that it follows naturally from the equivalence principle together with special relativity and a few general features of black holes. We also derive a general relativistic formalism of inertial forces for reference frames with acceleration and rotation. The resulting equation relates the real experienced forces to the time derivative of the speed and the spatial curvature of the particle trajectory relative to the reference frame. We show that an observer who follows the path taken by a free (geodesic) photon will experience a force perpendicular to the direction of motion that is independent of the observer's velocity. We apply our approach to resolve the submarine paradox, which regards whether a submerged submarine in a balanced state of rest will sink or float when given a horizontal velocity if we take relativistic effects into account. We extend earlier treatments of this topic to include spherical oceans and show that for the case of the Earth the submarine floats upward if we take the curvature of the ocean into account.

In this work we suggest a very simple solution of the two-capacitor paradox in a completely ideal (without any electrical resistance or inductance) electrical circuit. Without any loss of generality in the basic conclusions, we explicitly consider the technically much simpler case of the single-capacitor paradox. First (macroscopically), it is shown that the electrical field energy loss corresponds to the work done by the electrical field in moving electrons from the first, initially negatively charged capacitor plate to the second, initially positively charged plate. Second (microscopically), following the well-known Sommerfeld free-electron model of metals, it is supposed that electrons on both capacitor plates are distributed according to Fermi-Dirac quantum statistics. For this reason, dissipative processes (e.g., Joule heating or electromagnetic wave emission) for dissipating the energy of an individual electron on its arrival at the second capacitor plate are not necessary at all. Practically, here we have only a statistical redistribution of electron energies over the redefined quantum statistical ensemble.
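
The energy loss at issue is the standard two-capacitor bookkeeping, which can be checked directly (the component values below are arbitrary):

```python
def two_capacitor_energies(C, V):
    """Ideal two-capacitor paradox: a capacitor C charged to V is
    connected to an identical uncharged capacitor.

    Charge conservation: Q = C*V spreads over total capacitance 2C,
    so both capacitors end at V/2.  Exactly half the initial field
    energy is then unaccounted for in a purely ideal circuit, which
    is the paradox the abstract addresses.
    """
    E_initial = 0.5 * C * V**2
    V_final = V / 2.0                      # Q conserved over capacitance 2C
    E_final = 2 * (0.5 * C * V_final**2)   # two capacitors at V/2 each
    return E_initial, E_final

E_i, E_f = two_capacitor_energies(C=1e-6, V=10.0)
# E_f == E_i / 2: half the energy remains, independent of C and V
```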

The primary objective of this project is to enhance domestic petroleum production by demonstration and technology transfer of an advanced oil recovery technology in the Paradox basin, southeastern Utah. If this project can demonstrate technical and economic feasibility, the technique can be applied to approximately 100 additional small fields in the Paradox basin alone, and result in increased recovery of 150 to 200 million barrels of oil. This project is designed to characterize five shallow-shelf carbonate reservoirs in the Pennsylvanian (Desmoinesian) Paradox Formation and choose the best candidate for a pilot demonstration project for either a waterflood or carbon dioxide- (CO{sub 2}-) flood project. The field demonstration, monitoring of field performance, and associated validation activities will take place in the Paradox basin within the Navajo Nation. The results of this project will be transferred to industry and other researchers through a petroleum extension service, creation of digital databases for distribution, technical workshops and seminars, field trips, technical presentations at national and regional professional meetings, and publication in newsletters and various technical or trade journals.

The primary objective of this project is to enhance domestic petroleum production by demonstration and technology transfer of an advanced oil recovery technology in the Paradox basin, southeastern Utah. If this project can demonstrate technical and economic feasibility, the technique can be applied to about 100 additional small fields in the Paradox basin alone, and result in increased recovery of 150 to 200 million bbl of oil. This project is designed to characterize five shallow-shelf carbonate reservoirs in the Pennsylvanian (Desmoinesian) Paradox Formation and choose the best candidate for a pilot demonstration project for either a waterflood or carbon-dioxide (CO2) flood project. The field demonstration, monitoring of field performance, and associated validation activities will take place in the Paradox basin within the Navajo Nation. The results of this project will be transferred to industry and other researchers through a petroleum extension service, creation of digital databases for distribution, technical workshops and seminars, field trips, technical presentations at national and regional professional meetings, and publication in newsletters and various technical or trade journals.

The Paradoxes of Military Risk Assessment: Will the Enterprise Risk Assessment Model, Composite Risk Management and Associated Techniques Provide the Predicted Benefits? Chris W. Johnson, Glasgow. ... to assess the nation's military preparedness. However, risk management is not a panacea for the problems ...

The Faint Young Sun Paradox comes from the fact that solar luminosity (2-4)x10^9 years ago was insufficient to support the Earth's temperature necessary for the efficient development of geological and biological evolution (particularly, for the existence of considerable volumes of liquid water). It remains unclear by now if the so-called greenhouse effect on the Earth can resolve this problem. An interesting alternative explanation was put forward recently by M.Krizek (New Ast. 2012, 17, 1), who suggested that planetary orbits expand with time due to the local Hubble effect, caused by the uniformly-distributed Dark Energy. Then, under a reasonable value of the local Hubble constant, it is easy to explain why the Earth was receiving an approximately constant amount of solar irradiation for a long period in the past and will continue to do so for a quite long time in future.

Solar models suggest that four billion years ago the young Sun was about 25% fainter than it is today, rendering Earth's oceans frozen and lifeless. However, there is ample geophysical evidence that Earth had a liquid ocean teeming with life 4 Gyr ago. Since $\mathcal{L}_\odot \propto G^7 M_\odot^5$, the Sun's luminosity $\mathcal{L}_\odot$ is exceedingly sensitive to small changes in the gravitational constant $G$. We show that a percent-level increase in $G$ in the past would have prevented Earth's oceans from freezing, resolving the faint young Sun paradox. Such small changes in $G$ are consistent with observational bounds on $\Delta G/G$. Since $\mathcal{L}_{\rm SNIa} \propto G^{-3/2}$, an increase in $G$ leads to fainter supernovae, creating tension between standard candle and standard ruler probes of dark energy. Precisely such a tension has recently been reported by the Planck team.
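
The sensitivity claimed above follows from the quoted power laws: at fixed stellar mass, a small fractional change in G amplifies into roughly seven times that fractional change in solar luminosity, while Type Ia supernova luminosity moves the other way. A quick numerical check (the 1% change stands in for the abstract's "percent-level" figure):

```python
def solar_luminosity_change(dG_over_G):
    """Fractional change in solar luminosity for a small fractional
    change in G, using L ∝ G^7 at fixed stellar mass."""
    return (1.0 + dG_over_G)**7 - 1.0

def snia_luminosity_change(dG_over_G):
    """Same for Type Ia supernovae, using L_SNIa ∝ G^(-3/2)."""
    return (1.0 + dG_over_G)**-1.5 - 1.0

dG = 0.01   # a percent-level increase in G, as in the abstract
dL_sun = solar_luminosity_change(dG)      # ≈ +7.2%: brightens the young Sun
dL_snia = snia_luminosity_change(dG)      # ≈ -1.5%: supernovae slightly fainter
```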

The standard theory of ideal gases ignores the interaction of the gas particles with the thermal radiation (photon gas) that fills the otherwise vacuum space between them. This is an unphysical feature of the theory since every material in this universe, and hence also the particles of a gas, absorbs and radiates thermal energy. The interaction with the thermal radiation that is contained within the volume of the body may be important in gases since the latter, unlike solids and liquids, are capable of undergoing conspicuous volume changes. Taking this interaction into account makes the behaviour of the ideal gases more realistic and removes Gibbs' paradox.

The primary objective of this project is to enhance domestic petroleum production by field demonstration and technology transfer of an advanced-oil-recovery technology in the Paradox basin, southeastern Utah. If this project can demonstrate technical and economic feasibility, the technique can be applied to approximately 100 additional small fields in the Paradox basin alone, and result in increased recovery of 150 to 200 million barrels (23,850,000-31,800,000 m{sup 3}) of oil. This project is designed to characterize five shallow-shelf carbonate reservoirs in the Pennsylvanian (Desmoinesian) Paradox Formation and choose the best candidate for a pilot demonstration project for either a waterflood or carbon-dioxide-miscible flood project. The field demonstration, monitoring of field performance, and associated validation activities will take place within the Navajo Nation, San Juan County, Utah.

The primary objective of this project is to enhance domestic petroleum production by demonstration and technology transfer of an advanced oil recovery technology in the Paradox basin, southeastern Utah. If this project can demonstrate technical and economic feasibility, the technique can be applied to approximately 100 additional small fields in the Paradox basin alone, and result in increased recovery of 150 to 200 million barrels of oil. This project is designed to characterize five shallow-shelf carbonate reservoirs in the Pennsylvanian Paradox Formation and choose the best candidate for a pilot demonstration project for either a waterflood or carbon dioxide-flood project. The field demonstration, monitoring of field performance, and associated validation activities will take place in the Paradox basin within the Navajo Nation. The results of this project will be transferred to industry and other researchers through a petroleum extension service, creation of digital databases for distribution, technical workshops and seminars, field trips, technical presentations at national and regional professional meetings, and publication in newsletters and various technical or trade journals.

The Paradox Basin of Utah, Colorado, and Arizona contains nearly 100 small oil fields producing from shallow-shelf carbonate buildups or mounds within the Desert Creek zone of the Pennsylvanian (Desmoinesian) Paradox Formation. These fields typically have one to four wells with primary production ranging from 700,000 to 2,000,000 barrels (111,300-318,000 m{sup 3}) of oil per field at a 15 to 20 percent recovery rate. Five fields in southeastern Utah were evaluated for waterflood or carbon-dioxide (CO{sub 2})-miscible flood projects based upon geological characterization and reservoir modeling. Geological characterization on a local scale focused on reservoir heterogeneity, quality, and lateral continuity as well as possible compartmentalization within each of the five project fields. The Desert Creek zone includes three generalized facies belts: (1) open-marine, (2) shallow-shelf and shelf-margin, and (3) intra-shelf, salinity-restricted facies. These deposits have modern analogs near the coasts of the Bahamas, Florida, and Australia, respectively, and outcrop analogs along the San Juan River of southeastern Utah. The analogs display reservoir heterogeneity, flow barriers and baffles, and lithofacies geometry observed in the fields; thus, these properties were incorporated in the reservoir simulation models. Productive carbonate buildups consist of three types: (1) phylloid algal, (2) coralline algal, and (3) bryozoan. Phylloid-algal buildups have a mound-core interval and a supra-mound interval. Hydrocarbons are stratigraphically trapped in porous and permeable lithotypes within the mound-core intervals of the lower part of the buildups and the more heterogeneous supramound intervals. 
To adequately represent the observed spatial heterogeneities in reservoir properties, the phylloid-algal bafflestones of the mound-core interval and the dolomites of the overlying supra-mound interval were subdivided into ten architecturally distinct lithotypes, each of which exhibits a characteristic set of reservoir properties obtained from outcrop analogs, cores, and geophysical logs. The Anasazi and Runway fields were selected for geostatistical modeling and reservoir compositional simulations. Models and simulations incorporated variations in carbonate lithotypes, porosity, and permeability to accurately predict reservoir responses. History matches were tied to previous production and reservoir pressure histories so that future reservoir performance could be confidently predicted. The simulation studies showed that, although most of the production came from the mound-core intervals, there was no corresponding decrease in the oil in place in those intervals. This behavior indicates gravity drainage of oil from the supra-mound intervals into the lower mound-core intervals, from which the producing wells draw most of their production. The key to increasing ultimate recovery from these fields (and similar fields in the basin) is to design either waterflood or CO{sub 2}-miscible flood projects capable of forcing oil from high-storage-capacity but low-recovery supra-mound units into the high-recovery mound-core units. Simulation of Anasazi field shows that a CO{sub 2} flood is technically superior to a waterflood and economically feasible. For Anasazi field, an optimized CO{sub 2} flood is predicted to recover a total of 4.21 million barrels (0.67 million m{sup 3}) of oil, representing in excess of 89 percent of the original oil in place. For Runway field, the best CO{sub 2} flood is predicted to recover a total of 2.4 million barrels (0.38 million m{sup 3}) of oil, representing 71 percent of the original oil in place. 
If the CO{sub 2} floods perform as predicted, they offer a financially robust process for increasing reserves in the many small fields of the Paradox Basin. The results can be applied to other fields in the Rocky Mountain region, the Michigan and Illinois Basins, and the Midcontinent.
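The recovery figures quoted above imply original-oil-in-place (OOIP) volumes that can be back-calculated (a sketch; the report states only the recovered volumes and recovery percentages, so these OOIP figures are inferred, not quoted):

```python
# Back-calculate original oil in place (OOIP) implied by the predicted
# CO2-flood recoveries and recovery factors quoted in the abstract.
def ooip_from_recovery(recovered_bbl, recovery_fraction):
    """OOIP implied by a recovered volume and its recovery factor."""
    return recovered_bbl / recovery_fraction

anasazi = ooip_from_recovery(4.21e6, 0.89)  # "in excess of 89 percent"
runway  = ooip_from_recovery(2.4e6, 0.71)   # "71 percent"
print(f"Anasazi OOIP ~ {anasazi / 1e6:.2f} million bbl")  # ~4.73
print(f"Runway  OOIP ~ {runway / 1e6:.2f} million bbl")   # ~3.38
```

The implied OOIP of roughly 3-5 million barrels per field is consistent with the abstract's characterization of these as small fields with 0.7-2 million barrels of primary production at 15-20 percent recovery.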

The Paradox basin of Utah, Colorado, and Arizona contains nearly 100 small oil fields producing from carbonate buildups or mounds within the Pennsylvanian (Desmoinesian) Paradox Formation. These fields typically have one to four wells with primary production ranging from 700,000 to 2,000,000 barrels of oil per field at a 15 to 20% recovery rate. At least 200 million barrels of oil is at risk of being unrecovered in these small fields because of inefficient recovery practices and undrained heterogeneous reservoirs. Five fields (Anasazi, Mule, Blue Hogan, Heron North, and Runway) within the Navajo Nation of southeastern Utah are being evaluated for waterflood or carbon-dioxide-miscible flood projects based upon geological characterization and reservoir modeling. The results can be applied to other fields in the Paradox basin and the Rocky Mountain region, the Michigan and Illinois basins, and the Midcontinent. The Anasazi field was selected for the initial geostatistical modeling and reservoir simulation. A compositional simulation approach is being used to model primary depletion, waterflood, and CO{sub 2}-flood processes. During this second year of the project, team members performed the following reservoir-engineering analysis of Anasazi field: (1) relative permeability measurements of the supra-mound and mound-core intervals, (2) completion of geologic model development of the Anasazi reservoir units for use in reservoir simulation studies, including completion of a series of one-dimensional carbon-dioxide-displacement simulations to analyze the carbon-dioxide-displacement mechanism that could operate in the Paradox basin system of reservoirs, and (3) completion of the first phase of the full-field, three-dimensional Anasazi reservoir simulation model and the start of the history-matching and reservoir-performance-prediction phase of the simulation study.

The Paradox basin of Utah, Colorado, and Arizona contains nearly 100 small oil fields producing from carbonate buildups or mounds within the Pennsylvanian (Desmoinesian) Paradox Formation. These fields typically have one to four wells with primary production ranging from 700,000 to 2,000,000 barrels of oil per field at a 15 to 20% recovery rate. At least 200 million barrels of oil is at risk of being unrecovered in these small fields because of inefficient recovery practices and undrained heterogeneous reservoirs. Five fields (Anasazi, Mule, Blue Hogan, Heron North, and Runway) within the Navajo Nation of southeastern Utah are being evaluated for waterflood or carbon-dioxide-miscible flood projects based upon geological characterization and reservoir modeling. The results can be applied to other fields in the Paradox basin and the Rocky Mountain region, the Michigan and Illinois basins, and the Midcontinent. The reservoir engineering component of the work completed to date included analysis of production data and well tests, comprehensive laboratory programs, and preliminary mechanistic reservoir simulation studies. A comprehensive fluid property characterization program was completed. Mechanistic reservoir production performance simulation studies were also completed.

As an extension of its efforts in the development of the geopressured resources of the Gulf Coast, the Division of Geothermal Energy of the US Department of Energy is interested in determining the extent and characteristics of geopressured occurrences in areas outside the Gulf Coast. The work undertaken involved a literature search of available information documenting such occurrences. Geopressured reservoirs have been reported from various types of sedimentary lithologies representing virtually all geologic ages and in a host of geologic environments, many of which are unlike those of the Gulf Coast. These include many Rocky Mountain basins (Green River, Big Horn, Powder River, Wind River, Uinta, Piceance, Denver, San Juan), Mid-Continent basins (Delaware, Anadarko, Interior Salt, Williston, Appalachian), California basins (Sacramento, San Joaquin, Los Angeles, Ventura, Coast Ranges), Alaskan onshore and offshore basins, Pacific Coast offshore basins, and other isolated occurrences, both onshore and offshore.

Research Highlights: > Information is found to be encoded and carried away by Hawking radiation. > Entropy is conserved in Hawking radiation. > We thus conclude that no information is lost. > The dynamics of the black hole may be unitary. - Abstract: We revisit in detail the paradox of black hole information loss due to Hawking radiation as tunneling. We compute the amount of information encoded in correlations among Hawking radiations for a variety of black holes, including the Schwarzschild black hole, the Reissner-Nordström black hole, the Kerr black hole, and the Kerr-Newman black hole. The special case of tunneling through a quantum horizon is also considered. Within a phenomenological treatment based on the accepted emission probability spectrum from a black hole, we find that information leaks out hidden in the correlations of Hawking radiation. The recovery of this previously unaccounted-for information helps to conserve the total entropy of a system composed of a black hole plus its radiation. We thus conclude that, irrespective of the microscopic picture of black hole collapse, the associated radiation process (Hawking radiation as tunneling) is consistent with unitarity as required by quantum mechanics.
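The correlation referred to above can be made concrete for the Schwarzschild case. The following is a sketch of the standard Parikh-Wilczek tunneling result (in natural units G = c = h-bar = k{sub B} = 1), on which phenomenological treatments of this kind are based:

```latex
% Tunneling emission probability for a Schwarzschild black hole of mass M
% emitting a quantum of energy E equals the exponential of the entropy change:
\Gamma(E) \sim e^{\Delta S_{BH}} = \exp\!\left[-8\pi E\left(M - \tfrac{E}{2}\right)\right]
% Correlation between two successive emissions of energies E_1 and E_2:
\mathcal{C}(E_1,E_2) = \ln\Gamma(E_1+E_2) - \ln\Gamma(E_1) - \ln\Gamma(E_2)
                     = 8\pi E_1 E_2 \;\neq\; 0
```

The nonvanishing correlation 8(pi)E{sub 1}E{sub 2} is the information carried in the radiation: summing the entropy of each emitted quantum conditioned on the previous emissions reproduces the initial black hole entropy, which is the entropy-conservation statement of the abstract.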

If a quantum property is measured, the corresponding state of an entangled partner is also determined. This means that quanta have real features, even in the absence of direct observation. The problem is to establish whether these objective qualities are always well-defined, or whether they exhibit complexity (with many simple states in superposition) as predicted by quantum mechanics. The EPR reality criterion demands the existence of simple properties, even for non-commuting variables. The apparent implication is that Heisenberg's principle can only govern the parameters of measurement outcomes. Yet, the latest results in quantum theory support a different picture: the principle must hold for undetected qualities as well. Here I show a solution to this problem. Well-defined quantum properties can be interpreted as transient net states of superposition between multiple spectral components. Different variables become well-defined at different locations, as predicted by the wave function, whenever component vectors add up (and/or cancel out) to simple linear states. This approach enables the conclusion that quantum mechanics is complete, without contradicting the tenets of Einstein realism.

The Paradox Basin of Utah, Colorado, Arizona, and New Mexico contains nearly 100 small oil fields producing from carbonate buildups within the Pennsylvanian (Desmoinesian) Paradox Formation. These fields typically have one to 10 wells with primary production ranging from 700,000 to 2,000,000 barrels (111,300-318,000 m{sup 3}) of oil per field and a 15 to 20 percent recovery rate. At least 200 million barrels (31.8 million m{sup 3}) of oil will not be recovered from these small fields because of inefficient recovery practices and undrained heterogeneous reservoirs. Several fields in southeastern Utah and southwestern Colorado are being evaluated as candidates for horizontal drilling and enhanced oil recovery from existing vertical wells based upon geological characterization and reservoir modeling case studies. Geological characterization on a local scale is focused on reservoir heterogeneity, quality, and lateral continuity, as well as possible reservoir compartmentalization, within these fields. This study utilizes representative cores, geophysical logs, and thin sections to characterize and grade each field's potential for drilling horizontal laterals from existing development wells. The results of these studies can be applied to similar fields elsewhere in the Paradox Basin and the Rocky Mountain region, the Michigan and Illinois Basins, and the Midcontinent region. This report covers research activities for the first half of the fourth project year (April 6 through October 5, 2003). The work included (1) analysis of well-test data and oil production from Cherokee and Bug fields, San Juan County, Utah, and (2) diagenetic evaluation of stable isotopes from the upper Ismay and lower Desert Creek zones of the Paradox Formation in the Blanding sub-basin, Utah. Production ''sweet spots'' and potential horizontal drilling candidates were identified for Cherokee and Bug fields. 
In Cherokee field, the most productive wells are located in the thickest part of the mound facies of the upper Ismay zone, where microporosity is well developed. In Bug field, the most productive wells are located structurally downdip from the updip porosity pinch out in the dolomitized lower Desert Creek zone, where micro-box-work porosity is well developed. Microporosity and micro-box-work porosity have the greatest hydrocarbon storage and flow capacity, and are the potential horizontal drilling targets in these fields. Diagenesis is the main control on the quality of Ismay and Desert Creek reservoirs. Most of the carbonates present within the lower Desert Creek and Ismay have retained a marine-influenced carbon isotope geochemistry throughout marine cementation as well as through post-burial recycling of marine carbonate components during dolomitization, stylolitization, dissolution, and late cementation. Meteoric waters do not appear to have had any effect on the composition of the dolomites in these zones. Light oxygen values obtained from reservoir samples for wells located along the margins or flanks of Bug field may be indicative of exposure to higher temperatures, to fluids depleted in {sup 18}O relative to sea water, or to hypersaline waters during burial diagenesis. The samples from Bug field with the lightest oxygen isotope compositions are from wells that have produced significantly greater amounts of hydrocarbons. There is no significant difference between the oxygen isotope compositions from lower Desert Creek dolomite samples in Bug field and the upper Ismay limestones and dolomites from Cherokee field. Carbon isotopic compositions for samples from Patterson Canyon field can be divided into two populations: isotopically heavier mound cement and isotopically lighter oolite and banded cement. 
Technology transfer activities consisted of exhibiting a booth display of project materials at the annual national convention of the American Association of Petroleum Geologists, a technical presentation, a core workshop, and publications. The project home page was updated on the Utah Geological Survey Internet web site.

The Paradox Basin of Utah, Colorado, Arizona, and New Mexico contains nearly 100 small oil fields producing from carbonate buildups within the Pennsylvanian (Desmoinesian) Paradox Formation. These fields typically have one to 10 wells with primary production ranging from 700,000 to 2,000,000 barrels (111,300-318,000 m{sup 3}) of oil per field and a 15 to 20 percent recovery rate. At least 200 million barrels (31.8 million m{sup 3}) of oil will not be recovered from these small fields because of inefficient recovery practices and undrained heterogeneous reservoirs. Several fields in southeastern Utah and southwestern Colorado are being evaluated as candidates for horizontal drilling and enhanced oil recovery from existing vertical wells based upon geological characterization and reservoir modeling case studies. Geological characterization on a local scale is focused on reservoir heterogeneity, quality, and lateral continuity, as well as possible reservoir compartmentalization, within these fields. This study utilizes representative cores, geophysical logs, and thin sections to characterize and grade each field's potential for drilling horizontal laterals from existing development wells. The results of these studies can be applied to similar fields elsewhere in the Paradox Basin and the Rocky Mountain region, the Michigan and Illinois Basins, and the Midcontinent region. This report covers research activities for the second half of the third project year (October 6, 2002, through April 5, 2003). The primary work included describing and mapping regional facies of the upper Ismay and lower Desert Creek zones of the Paradox Formation in the Blanding sub-basin, Utah. Regional cross sections show the development of ''clean carbonate'' packages that contain all of the productive reservoir facies. These clean carbonates abruptly change laterally into thick anhydrite packages that filled several small intra-shelf basins in the upper Ismay zone. 
Examination of upper Ismay cores identified seven depositional facies: open marine, middle shelf, inner shelf/tidal flat, bryozoan mounds, phylloid-algal mounds, quartz sand dunes, and anhydritic salinas. Lower Desert Creek facies include open marine, middle shelf, protomounds/collapse breccia, and phylloid-algal mounds. Mapping the upper Ismay zone facies delineates very prospective reservoir trends that contain porous, productive buildups around the anhydrite-filled intra-shelf basins. Facies and reservoir controls imposed by the anhydritic intra-shelf basins should be considered when selecting the optimal location and orientation of any horizontal drilling from known phylloid-algal reservoirs to undrained reserves, as well as identifying new exploration trends. Although intra-shelf basins are not present in the lower Desert Creek zone of the Blanding sub-basin, drilling horizontally along linear shoreline trends could also encounter previously undrilled, porous intervals and buildups. Technology transfer activities consisted of a technical presentation at a Class II Review conference sponsored by the National Energy Technology Laboratory at the Center for Energy and Economic Diversification in Odessa, Texas. The project home page was updated on the Utah Geological Survey Internet web site.

, as abundant coals are found between 2450 and 2630 m. Only three thin coalbeds occur within the Coal Ridge Group between 1950 and 2450 m, so gases from this interval were probably derived from interbedded shales. Core and cuttings samples were also collected...

Federal and State agencies and private campgrounds in geographical areas of concern to determine fee, such as water, sewer, electricity and recreational equipment/infrastructure. 1 An abbreviated version

This report describes the results made in fulfillment of contract DE-FG26-02NT15451, ''Multicomponent Seismic Analysis and Calibration to Improve Recovery from Algal Mounds: Application to the Roadrunner/Towaoc Area of the Paradox Basin, Ute Mountain Ute Reservation, Colorado''. Optimizing development of highly heterogeneous reservoirs where porosity and permeability vary in unpredictable ways due to facies variations can be challenging. An important example of this is in the algal mounds of the Lower and Upper Ismay reservoirs of the Paradox Basin in Utah and Colorado. It is nearly impossible to develop a forward predictive model to delineate regions of better reservoir development, and so enhanced recovery processes must be selected and designed based upon data that can quantitatively or qualitatively distinguish regions of good or bad reservoir permeability and porosity between existing well control. Recent advances in seismic acquisition and processing offer new ways to see smaller features with more confidence, and to characterize the internal structure of reservoirs such as algal mounds. However, these methods have not been tested. This project will acquire cutting edge, three-dimensional, nine-component (3D9C) seismic data and utilize recently-developed processing algorithms, including the mapping of azimuthal velocity changes in amplitude variation with offset, to extract attributes that relate to variations in reservoir permeability and porosity. In order to apply advanced seismic methods a detailed reservoir study is needed to calibrate the seismic data to reservoir permeability, porosity and lithofacies. This will be done by developing a petrological and geological characterization of the mounds from well data; acquiring and processing the 3D9C data; and comparing the two using advanced pattern recognition tools such as neural nets. 
In addition, should the correlation prove successful, the resulting data will be evaluated from the perspective of selecting alternative enhanced recovery processes, and their possible implementation. The work is being carried out on the Roadrunner/Towaoc Fields of the Ute Mountain Ute Tribe, located in the southwestern corner of Colorado. Although this project is focused on development of existing resources, the calibration established between the reservoir properties and the 3D9C seismic data can also enhance exploration success. During the time period covered by this report, the majority of the project effort has gone into the permitting, planning, and design of the 3D seismic survey, and into selecting a well for the VSP acquisition. The business decision in October 2002 by WesternGeco, the project's seismic acquisition contractor, to leave North America has delayed the acquisition until late summer 2003. The project has contracted Solid State, a division of Grant Geophysical, to carry out the acquisition. Moreover, the survey has been upgraded to a 3D9C from the originally planned 3D3C survey, which should provide even greater resolution of mounds and internal mound structure.

Progress made on the Methane Recovery from Coalbeds Project (MRCP), including work in the Raton Mesa Coal Region, is reported. The Uinta and Warrior basin reports have been reviewed and will be published and delivered in early December. A cooperative core test with R and P Coal Company on a well in Indiana County, Pennsylvania, was negotiated. In a cooperative effort with the USGS Coal Branch on three wells in the Wind River Basin, desorption of coal samples showed little or no gas. Field testing at the Dugan Petroleum well in the San Juan Basin was completed; coal samples showed minimal gas. Initial desorption of coal samples suggests that at least a moderate amount of gas was obtained from the Coors well test in the Piceance Basin. Field work for the Piceance Basin Detailed Site Investigation was completed. In the Occidental Research Corporation (ORC) project, a higher capacity vacuum pump has been installed to increase CH{sub 4} venting operations. Drilling of Oxy No. 12 experienced delays caused by mine gas-offs and was eventually terminated at 460 ft after an attempt to drill through a roll produced a severe dog leg and severely damaged the drill pipe. ORC moved the second drill rig and equipment to a new location in the same panel as Oxy No. 12 and set the stand pipe for Oxy No. 13. Drill rig No. 1 has been moved east of the longwall mining area in anticipation of drilling cross-panel on 500-foot intervals. In the Waynesburg College project, Equitable Gas Company has received the contract from Waynesburg College and has applied to the Pennsylvania Public Utilities Commission for a new tariff rate. Waynesburg College has identified a contractor to make the piping connections to the gas line after Equitable establishes their meter and valve requirements.

Inspired by responses to the work (arXiv:1310.6514), we solved the one-dimensional, nonlinear Fourier initial- and boundary-condition problem using the finite element method. Examination of all possible parameters reveals the following: 1. The hydrogen bond has a memory effect, emitting energy at a rate, or with a relaxation time, that depends on the initial energy storage. 2. Skin super-solidity creates gradients in the thermal diffusion coefficient for heat conduction in the liquid, with an optimal skin-bulk ratio of 1.48. 3. Convection alone produces no such effect. 4. The Mpemba effect happens only in a highly non-adiabatic source-path-drain cycling system.
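The skin-bulk diffusivity contrast described above can be illustrated with a minimal one-dimensional conduction model. This is a sketch only: the abstract's authors used the finite element method, while this version uses an explicit finite-difference scheme, and the grid, diffusivity, and boundary temperatures are illustrative assumptions, not values from the paper.

```python
import numpy as np

# 1-D heat conduction with a spatially varying thermal diffusivity:
# a thin "skin" layer near the cold boundary carries 1.48x the bulk
# diffusivity, echoing the optimal skin-bulk ratio quoted above.
nx, dx, dt = 100, 1.0e-3, 1.0e-3       # nodes, spacing (m), time step (s)
alpha_bulk = 1.4e-7                     # bulk diffusivity (m^2/s), water-like
alpha = np.full(nx, alpha_bulk)
alpha[:5] = 1.48 * alpha_bulk           # enhanced-diffusivity skin layer

T = np.full(nx, 298.0)                  # initial temperature (K)
T[0] = 258.0                            # cold boundary (the "drain")

for _ in range(50_000):                 # explicit time marching (stable:
    flux = alpha[:-1] * np.diff(T) / dx #  alpha*dt/dx^2 << 0.5)
    T[1:-1] += dt / dx * np.diff(flux)  # conservative update, interior nodes
    T[0], T[-1] = 258.0, 298.0          # fixed-temperature boundaries

# The higher-diffusivity skin steepens heat extraction near the cold end,
# the qualitative effect attributed to skin super-solidity in the abstract.
```

With the skin layer present, heat leaves the near-boundary region faster than in a uniform-diffusivity run, which is the kind of gradient-driven enhancement the abstract invokes.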

Sedimentological, mineralogical and geochemical studies of two drill cores penetrating the lower Saline zone of the Parachute Creek Member (middle L-4 oil-shale zone through upper R-2 zone) of the Green River Formation in north-central Piceance Creek basin, Colorado, indicate the presence of two distinct oil-shale facies. The most abundant facies has laminated stratification and frequently occurs in the L-4, L-3 and L-2 oil-shale zones. The second, subordinate facies has ''streaked and blebby'' stratification and is most abundant in the R-4, R-3 and R-2 zones. Laminated oil shale originated by slow, regular sedimentation during meromictic phases of ancient Lake Uinta, whereas streaked and blebby oil shale was deposited by episodic, non-channelized turbidity currents. Laminated oil shale has higher contents of nahcolite, dawsonite, quartz, K-feldspar and calcite, but less dolomite/ankerite and albite than streaked and blebby oil shale. Ca-Mg-Fe carbonate minerals in laminated oil shale have more variable compositions than those in streaked and blebby shales. Streaked and blebby oil shale has more kerogen and a greater diversity of kerogen particles than laminated oil shale. Such variations may produce different pyrolysis reactions when each shale type is retorted.

Utah oil fields have produced over 1.2 billion barrels (191 million m{sup 3}). However, the 13.7 million barrels (2.2 million m{sup 3}) of production in 2002 was the lowest level in over 40 years and continued the steady decline that began in the mid-1980s. The Utah Geological Survey believes this trend can be reversed by providing play portfolios for the major oil producing provinces (Paradox Basin, Uinta Basin, and thrust belt) in Utah and adjacent areas in Colorado and Wyoming. Oil plays are geographic areas with petroleum potential caused by favorable combinations of source rock, migration paths, reservoir rock characteristics, and other factors. The play portfolios will include: descriptions and maps of the major oil plays by reservoir; production and reservoir data; case-study field evaluations; summaries of the state-of-the-art drilling, completion, and secondary/tertiary techniques for each play; locations of major oil pipelines; descriptions of reservoir outcrop analogs; and identification and discussion of land use constraints. All play maps, reports, databases, and so forth, produced for the project will be published in interactive, menu-driven digital (web-based and compact disc) and hard-copy formats. This report covers research activities for the third quarter of the first project year (January 1 through March 31, 2003). This work included gathering field data and analyzing best practices in the eastern Uinta Basin, Utah, and the Colorado portion of the Paradox Basin. Best practices used in oil fields of the eastern Uinta Basin consist of conversion of all geophysical well logs into digital form, running small fracture treatments, fingerprinting oil samples from each producing zone, running spinner surveys biannually, mapping each producing zone, and drilling on 80-acre (32 ha) spacing. 
These practices ensure that induced fractures do not extend vertically out of the intended zone, determine the percentage each zone contributes to the overall production of the well, identify areas that may be by-passed by a waterflood, and prevent rapid water breakthrough. In the eastern Paradox Basin, Colorado, optimal drilling, development, and production practices consist of increasing the mud weight during drilling operations before penetrating the overpressured Desert Creek zone; centralizing treatment facilities; and mixing produced water from pumping oil wells with non-reservoir water and injecting the mixture into the reservoir downdip to reduce salt precipitation, dispose of produced water, and maintain reservoir pressure to create a low-cost waterflood. During this quarter, technology transfer activities consisted of technical presentations to members of the Technical Advisory Board in Colorado and the Colorado Geological Survey. The project home page was updated on the Utah Geological Survey Internet web site.

Utah oil fields have produced over 1.2 billion barrels (191 million m{sup 3}). However, the 13.7 million barrels (2.2 million m{sup 3}) of production in 2002 was the lowest level in over 40 years and continued the steady decline that began in the mid-1980s. The Utah Geological Survey believes this trend can be reversed by providing play portfolios for the major oil-producing provinces (Paradox Basin, Uinta Basin, and thrust belt) in Utah and adjacent areas in Colorado and Wyoming. Oil plays are geographic areas with petroleum potential caused by favorable combinations of source rock, migration paths, reservoir rock characteristics, and other factors. The play portfolios will include: descriptions and maps of the major oil plays by reservoir; production and reservoir data; case-study field evaluations; summaries of the state-of-the-art drilling, completion, and secondary/tertiary techniques for each play; locations of major oil pipelines; descriptions of reservoir outcrop analogs; and identification and discussion of land-use constraints. All play maps, reports, databases, and so forth, produced for the project will be published in interactive, menu-driven digital (web-based and compact disc) and hard-copy formats. This report covers research activities for the first quarter of the second project year (July 1 through September 30, 2003). This work included (1) describing the Conventional Southern Uinta Basin Play, subplays, and outcrop reservoir analogs of the Uinta Green River Conventional Oil and Gas Assessment Unit (Eocene Green River Formation), and (2) technology transfer activities. The Conventional Oil and Gas Assessment Unit can be divided into plays having a dominantly southern sediment source (Conventional Southern Uinta Basin Play) and plays having a dominantly northern sediment source (Conventional Northern Uinta Basin Play). 
The Conventional Southern Uinta Basin Play is divided into six subplays: (1) conventional Uteland Butte interval, (2) conventional Castle Peak interval, (3) conventional Travis interval, (4) conventional Monument Butte interval, (5) conventional Beluga interval, and (6) conventional Duchesne interval fractured shale/marlstone. We are currently conducting basin-wide correlations to define the limits of the six subplays. Production-scale outcrop analogs provide an excellent view, often in three dimensions, of reservoir-facies characteristics and boundaries contributing to the overall heterogeneity of reservoir rocks. They can be used as a ''template'' for evaluation of data from conventional core, geophysical and petrophysical logs, and seismic surveys. Outcrop analogs for each subplay except the Travis interval are found in Indian and Nine Mile Canyons. During this quarter, the project team members submitted an abstract to the American Association of Petroleum Geologists for presentation at the 2004 annual national convention in Dallas, Texas. The project home page was updated on the Utah Geological Survey Internet web site.

for accelerating the expansion of our Universe in recent cosmic times, and Cold Dark Matter (CDM) responsible is increasingly focused on information. If ETCs shared a similar growth pattern, and if advanced physics allows an amazing variety of observations. However, there are cracks in its foundation: why does CDM require

, blood quantum as a determinant of citizenship was not a great leap. Blood quantum as a determinant of citizenship might have been new to most Indians, but exogamy was not. As the Cherokee demographer Russell Thornton has pointed out, during early... been common outside of Indian tribes and is now unfortunately also becoming common within them. Citizenship by Blood Quantum Citizenship by blood quantum alone is a guarantee of physical extinction. Know the tribal population, the required blood...

In this article, we develop and empirically test the theoretical argument that when an organizational culture promotes meritocracy (compared with when it does not), managers in that organization may ironically show greater ...

It is shown that, contrary to common intuition, even an arbitrarily weak attenuating mechanism is sufficient to make the background sky quite dark, independently of the size of the universe and the Hubble expansion. It is further shown that such an attenuation already exists in the wave nature of light, due to entrapment and diffusion from successive diffractions. This is a fundamentally new mechanism in physics, as illustrated by application to the solar neutrino attenuation, galactic dark matter, and gamma-ray burst problems. It not only provides a big bang-like cutoff, but also appears to explain the appearance of primeval, metal-deficient galaxies at high redshifts, without deviating from Olbers' premise of an infinite universe.
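The role of attenuation can be made quantitative with a standard back-of-the-envelope integral (a generic sketch of the Olbers argument, not the paper's specific diffraction-entrapment mechanism). For a uniform number density $n$ of sources of luminosity $L$ and any finite attenuation length $\lambda$, the integrated sky brightness converges:

```latex
B \;\propto\; \int_0^\infty n\,\frac{L}{4\pi r^2}\, e^{-r/\lambda}\, 4\pi r^2 \, dr \;=\; n L \lambda \;<\; \infty
```

which is finite even for an infinite static universe, whereas without the factor $e^{-r/\lambda}$ the same integral diverges linearly with distance, which is the classic statement of Olbers' paradox.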

True open access to scientific publications not only gives readers the possibility to read articles without paying a subscription, but also makes the material available for automated ingestion and harvesting by third parties. Once articles and associated data become universally treatable as computable objects, openly available to third-party aggregators and value-added services, what new services can we expect, and how will they change the way that researchers interact with their scholarly communications infrastructure? I will discuss straightforward applications of existing ideas and services, including citation analysis, collaborative filtering, external database linkages, interoperability, and other forms of automated markup, and speculate on the sociology of the next generation of users.

Liquefied natural gas (LNG) has been demonstrating its viability as a clean-burning alternative fuel for buses and medium- and heavy-duty trucks for the past 30 years. The first known LNG vehicle project began in San Diego in 1965, when San Diego Gas and Electric converted 22 utility trucks and three passenger vehicles to dedicated LNG. A surge in LNG vehicle project activity over the past five years has led to a fairly robust variety of vehicles testing the fuel, from Class 8 tractors, refuse haulers, and transit buses to railroad locomotives and ferry boats. Recent technology improvements in engine design, cryogenic tanks, fuel nozzles, and other related equipment have made LNG more practical to use than in the 1960s. LNG delivers more than twice the driving range of a compressed natural gas (CNG) vehicle with the same-sized fuel tank. Although technical and economic hurdles must be overcome before this fuel can achieve widespread use, various ongoing demonstration projects are showing LNG's practicality, while serving the vital role of pinpointing those areas of performance that are the prime candidates for improvement.

Although prediction of future natural gas supply is complicated by uncertainty in such variables as demand, liquefied natural gas supply price and availability, coalbed methane and gas shale development rate, and pipeline availability, all U.S. Energy Information Administration gas supply estimates to date have predicted that unconventional gas sources will be the dominant source of U.S. natural gas supply for at least the next two decades (Fig. 1.1; the period of estimation). Among the unconventional gas supply sources, tight gas sandstones (TGS) will represent 50-70% of the unconventional gas supply in this time period (Fig. 1.2). Rocky Mountain TGS are estimated to be approximately 70% of the total TGS resource base (USEIA, 2005), and the Mesaverde Group (Mesaverde) sandstones represent the principal gas-productive sandstone unit in the largest western U.S. TGS basins, including the basins that are the focus of this study (Washakie, Uinta, Piceance, northern Greater Green River, Wind River, Powder River). Industry assessment of the regional gas resource, projection of future gas supply, and exploration programs require an understanding of reservoir properties and accurate tools for formation evaluation. The goal of this study is to provide petrophysical formation-evaluation tools related to relative permeability, capillary pressure, and electrical properties, together with algorithms for wireline log analysis. Detailed and accurate movable gas-in-place resource assessment is most critical in marginal gas plays; quantitative tools are needed to define the limits on gas producibility imposed by technology and rock physics, and to define water saturation. The results of this study address fundamental questions concerning: (1) gas storage; (2) gas flow; (3) capillary pressure; (4) electrical properties; (5) facies and upscaling issues; (6) wireline log interpretation algorithms; and (7) providing a web-accessible database of advanced rock properties.
The following text briefly discusses the nature of these questions. Section I.2 briefly discusses the objective of the study with respect to the problems reviewed.
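As a concrete illustration of the kind of wireline-log interpretation algorithm such a study calibrates, the sketch below applies Archie's equation for water saturation. The constants a, m, n and the example log readings are hypothetical placeholders, not values derived from this study; tight gas sandstones in practice often require modified (e.g. shaly-sand) variants.

```python
def archie_sw(phi, rt, rw, a=1.0, m=2.0, n=2.0):
    """Archie water saturation from porosity phi (v/v), deep resistivity
    rt (ohm-m), and formation brine resistivity rw (ohm-m).
    a, m, n are the tortuosity factor, cementation exponent, and
    saturation exponent (defaults are illustrative only)."""
    f = a / phi**m        # formation resistivity factor
    ro = f * rw           # resistivity of the fully brine-saturated rock
    return (ro / rt) ** (1.0 / n)

# Hypothetical tight sandstone: 8% porosity, Rt = 80 ohm-m, Rw = 0.05 ohm-m
sw = archie_sw(0.08, 80.0, 0.05)   # fraction of pore space filled by water
```

Gas saturation then follows as 1 - sw, and moveable gas-in-place estimates hinge on exactly the exponents and saturation cutoffs that petrophysical studies of this kind seek to pin down.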

Strain caused by the adsorption of gases was measured in samples of subbituminous coal from the Powder River basin of Wyoming, U.S.A., and high-volatile bituminous coal from the Uinta-Piceance basin of Utah, U.S.A., using a newly developed strain measurement apparatus. The apparatus can be used to measure strain on multiple small coal samples based on the optical detection of the longitudinal strain. The swelling and shrinkage (strain) in the coal samples resulting from the adsorption of carbon dioxide, nitrogen, methane, helium, and a mixture of gases was measured. Sorption-induced strain processes were shown to be reversible and easily modeled with a Langmuir-type equation. Extended Langmuir theory was applied to satisfactorily model strain caused by the adsorption of gas mixtures using the pure gas Langmuir strain constants. The amount of time required to obtain accurate strain data was greatly reduced compared to other strain measurement methods. Sorption-induced changes in permeability were also measured as a function of pressure. Cleat compressibility was found to be variable, not constant. Calculated variable cleat-compressibility constants were found to correlate well with previously published data for other coals. During permeability tests, sorption-induced matrix shrinkage was clearly demonstrated by higher permeability values at lower pore pressures while holding overburden pressure constant. Measured permeability data were modeled using three different permeability models from the open literature that take into account sorption-induced matrix strain. All three models poorly matched the measured permeability data because they overestimated the impact of measured sorption-induced strain on permeability. However, by applying an experimentally derived expression to the measured strain data that accounts for the confining overburden pressure, pore pressure, coal type, and gas type, the permeability models were significantly improved.
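The Langmuir-type strain model and its extended-Langmuir generalization to gas mixtures can be sketched as follows. The strain constants in the example are illustrative placeholders, not measured values from the paper; only the functional form is taken from Langmuir theory.

```python
def langmuir_strain(p, eps_l, p_l):
    """Sorption-induced volumetric strain of coal for a pure gas at
    pressure p, with Langmuir strain constant eps_l (strain at infinite
    pressure) and Langmuir pressure p_l (pressure at half of eps_l)."""
    return eps_l * p / (p_l + p)

def extended_langmuir_strain(p, y, eps_l, p_l):
    """Extended-Langmuir mixture strain built from pure-gas constants.
    p: total pressure; y: per-component mole fractions;
    eps_l, p_l: per-component pure-gas Langmuir strain constants."""
    b = [1.0 / pl for pl in p_l]  # Langmuir pressure -> affinity constant
    denom = 1.0 + sum(bi * p * yi for bi, yi in zip(b, y))
    return sum(ei * bi * p * yi
               for ei, bi, yi in zip(eps_l, b, y)) / denom

# Single-component sanity check: the mixture form collapses to the
# pure-gas curve when one gas has mole fraction 1 (constants hypothetical).
pure = langmuir_strain(5.0, 0.012, 5.0)                  # = eps_l / 2 at p = p_l
mixed = extended_langmuir_strain(5.0, [1.0], [0.012], [5.0])
```

The mixture model needs only the pure-gas constants, which is exactly why the paper can predict mixed-gas strain without fitting new parameters.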

In August 2005, the U.S. Congress enacted the Energy Policy Act of 2005, Public Law 109-58. In Section 369 of this Act, also known as the ''Oil Shale, Tar Sands, and Other Strategic Unconventional Fuels Act of 2005,'' Congress declared that oil shale and tar sands (and other unconventional fuels) are strategically important domestic energy resources that should be developed to reduce the nation's growing dependence on oil from politically and economically unstable foreign sources. In addition, Congress declared that both research- and commercial-scale development of oil shale and tar sands should (1) be conducted in an environmentally sound manner using management practices that will minimize potential impacts, (2) occur with an emphasis on sustainability, and (3) benefit the United States while taking into account concerns of the affected states and communities. To support this declaration of policy, Congress directed the Secretary of the Interior to undertake a series of steps, several of which are directly related to the development of a commercial leasing program for oil shale and tar sands. One of these steps was the completion of a programmatic environmental impact statement (PEIS) to analyze the impacts of a commercial leasing program for oil shale and tar sands resources on public lands, with an emphasis on the most geologically prospective lands in Colorado, Utah, and Wyoming. For oil shale, the scope of the PEIS analysis includes public lands within the Green River, Washakie, Uinta, and Piceance Creek Basins. For tar sands, the scope includes Special Tar Sand Areas (STSAs) located in Utah. This paleontological resources overview report was prepared in support of the Oil Shale and Tar Sands Resource Management Plan Amendments to Address Land Use Allocations in Colorado, Utah, and Wyoming and PEIS, and it is intended to be used by Bureau of Land Management (BLM) regional paleontologists and field office staff to support future project-specific analyses.
Additional information about the PEIS can be found at http://ostseis.anl.gov.

Vast quantities of natural gas are entrapped within various tight formations in the Rocky Mountain area. This report seeks to quantify what proportion of that resource can be considered recoverable under today's technological and economic conditions and discusses factors controlling recovery. The ultimate goal of this project is to encourage development of tight gas reserves by industry through reducing the technical and economic risks of locating, drilling and completing commercial tight gas wells. This report is the fourth in a series and focuses on the Wind River Basin located in west central Wyoming. The first three reports presented analyses of the tight gas reserves and resources in the Greater Green River Basin (Scotia, 1993), Piceance Basin (Scotia, 1995) and the Uinta Basin (Scotia, 1995). Since each report is a stand-alone document, duplication of language will exist where common aspects are discussed. This study, and the previous three, describe basin-centered gas deposits (Masters, 1979) which contain vast quantities of natural gas entrapped in low permeability (tight), overpressured sandstones occupying a central basin location. Such deposits are generally continuous and are not conventionally trapped by a structural or stratigraphic seal. Rather, the tight character of the reservoirs prevents rapid migration of the gas, and where rates of gas generation exceed rates of escape, an overpressured basin-centered gas deposit results (Spencer, 1987). Since the temperature is a primary controlling factor for the onset and rate of gas generation, these deposits exist in the deeper, central parts of a basin where temperatures generally exceed 200 F and drill depths exceed 8,000 feet. The abbreviation OPT (overpressured tight) is used when referring to sandstone reservoirs that comprise the basin-centered gas deposit. 
Because the gas resources trapped in this setting are so large, they represent an important source of future gas supply, prompting studies to understand and quantify the resource itself and to develop technologies that will permit commercial exploitation. This study is a contribution to that process.
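The depth-temperature relationship cited above can be sketched with a simple linear geothermal gradient. The surface temperature and gradient below are illustrative round numbers, not values from the report; real basins show laterally variable gradients.

```python
def depth_to_temperature(t_target_f, t_surface_f=50.0, grad_f_per_100ft=1.5):
    """Drill depth (ft) at which formation temperature reaches t_target_f,
    assuming a linear geothermal gradient. The 50 F mean surface
    temperature and 1.5 F/100 ft gradient are hypothetical examples."""
    return (t_target_f - t_surface_f) / grad_f_per_100ft * 100.0

# With these assumed values, 200 F is reached near 10,000 ft, broadly
# consistent with the >8,000 ft drill depths cited for overpressured
# basin-centered (OPT) gas deposits.
depth = depth_to_temperature(200.0)
```

The point of the sketch is only that the 200 F onset temperature for significant gas generation naturally places these deposits in the deep, central parts of a basin.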

Utah oil fields have produced over 1.2 billion barrels (191 million m{sup 3}). However, the 13.7 million barrels (2.2 million m{sup 3}) of production in 2002 was the lowest level in over 40 years and continued the steady decline that began in the mid-1980s. The Utah Geological Survey believes this trend can be reversed by providing play portfolios for the major oil-producing provinces (Paradox Basin, Uinta Basin, and thrust belt) in Utah and adjacent areas in Colorado and Wyoming. Oil plays are geographic areas with petroleum potential caused by favorable combinations of source rock, migration paths, reservoir rock characteristics, and other factors. The play portfolios will include: descriptions and maps of the major oil plays by reservoir; production and reservoir data; case-study field evaluations; locations of major oil pipelines; identification and discussion of land-use constraints; descriptions of reservoir outcrop analogs; and summaries of the state-of-the-art drilling, completion, and secondary/tertiary techniques for each play. This report covers research activities for the sixth quarter of the project (October 1 through December 31, 2003). This work included describing outcrop analogs for the Jurassic Twin Creek Limestone and Mississippian Leadville Limestone, major oil producers in the thrust belt and Paradox Basin, respectively, and analyzing best practices used in the southern Green River Formation play of the Uinta Basin. Production-scale outcrop analogs provide an excellent view of reservoir petrophysics, facies characteristics, and boundaries contributing to the overall heterogeneity of reservoir rocks. They can be used as a ''template'' for evaluation of data from conventional core, geophysical and petrophysical logs, and seismic surveys. 
In the Utah/Wyoming thrust belt province, the Jurassic Twin Creek Limestone produces from subsidiary closures along major ramp anticlines where the low-porosity limestone beds are extensively fractured and sealed by overlying argillaceous and non-fractured units. The best outcrop analogs for Twin Creek reservoirs are found at Devils Slide and near the town of Peoa, Utah, where fractures in dense, homogeneous non-porous limestone beds are in contact with the basal siltstone units (containing sealed fractures) of the overlying units. The shallow marine, Mississippian Leadville Limestone is a major oil and gas reservoir in the Paradox Basin of Utah and Colorado. Hydrocarbons are produced from basement-involved, northwest-trending structural traps with closure on both anticlines and faults. Excellent outcrops of Leadville-equivalent rocks are found along the south flank of the Uinta Mountains, Utah. For example, like the Leadville, the Mississippian Madison Limestone contains zones of solution breccia, fractures, and facies variations. When combined with subsurface geological and production data, these outcrop analogs can improve (1) development drilling and production strategies such as horizontal drilling, (2) reservoir-simulation models, (3) reserve calculations, and (4) design and implementation of secondary/tertiary oil recovery programs and other best practices used in the oil fields of Utah and vicinity. 
In the southern Green River Formation play of the Uinta Basin, optimal drilling, development, and production practices consist of: (1) owning drilling rigs and frac holding tanks; (2) perforating sandstone beds with more than 8 percent neutron porosity and stimulating them with separate fracture treatments; (3) placing completed wells on primary production using artificial lift; (4) converting wells relatively soon to secondary waterflooding, maintaining reservoir pressure above the bubble point to maximize oil recovery; (5) developing waterflood units using an alternating injector-producer pattern on 40-acre (16-ha) spacing; and (6) recompleting producing wells by perforating all beds that are productive in the waterflood unit. As part of technology transfer activities during this quarter, an abstract describing outcrop reservoir analogs was accepted by the American Association of Petroleum Geologists.
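Practice (2) above, selecting perforation candidates by a neutron-porosity cutoff, can be sketched as a simple screen. The bed records below are hypothetical, invented purely to illustrate the 8 percent cutoff.

```python
# Hypothetical bed records: (top_ft, base_ft, neutron_porosity_fraction).
beds = [
    (5210, 5224, 0.11),
    (5240, 5252, 0.06),   # below cutoff: not perforated
    (5261, 5279, 0.09),
]

POROSITY_CUTOFF = 0.08  # "more than 8 percent neutron porosity"

# Each bed passing the screen would then receive its own frac stage.
candidates = [b for b in beds if b[2] > POROSITY_CUTOFF]
```

The cutoff concentrates completion dollars on beds with enough storage capacity to pay back a dedicated fracture treatment.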

Both siltstone and mudstone are locally bioturbated. Siltstone beds (~20 cm) are sharp-based, light brown to gray in color, and interbedded with mudstone beds. A laterally continuous, distinctively white unit capped with a coal bed occurs at the top of the Rollins Sandstone Member (Figure 9); it is composed of six subunits arranged in an upward-coarsening trend. [Measured-section figure residue: coal with siltstone, trough-cross bedding, laminae/flaser bedding/burrows, 10 m scale bar.]

It is often asserted that consumers purchasing automobiles or other goods and services underweight the costs of gasoline or other "add-ons." We test this hypothesis in the US automobile market by examining the effects of ...

Reprogramming of fibroblasts to induced pluripotent stem cells (iPSCs) entails a mesenchymal to epithelial transition (MET). While attempting to dissect the mechanism of MET during reprogramming, we observed that knockdown ...

It could be argued that the term 'digital' as a prefix to architecture is evidence that contemporary design practice is lost in time. Modernity's predilection of spatial constructs over temporal ones continues to cast a ...

Lessons From 35 Years of Research on Oil Shale Lands in the Piceance Basin (Fort Collins). The project, concerned with oil shale extraction, involved approximately ten independent field studies, which were established on a 20-ha site located near what was then the focal point of oil shale activity in the Piceance Basin.

Intensive pre-project feasibility and engineering studies begun in 1979 have produced an outline plan for development of a major project for production of shale oil from private lands in the Piceance Basin in western Colorado. This outline plan...

Utah oil fields have produced over 1.2 billion barrels (191 million m{sup 3}). However, the 13.7 million barrels (2.2 million m{sup 3}) of production in 2002 was the lowest level in over 40 years and continued the steady decline that began in the mid-1980s. The Utah Geological Survey believes this trend can be reversed by providing play portfolios for the major oil-producing provinces (Paradox Basin, Uinta Basin, and thrust belt) in Utah and adjacent areas in Colorado and Wyoming. Oil plays are geographic areas with petroleum potential caused by favorable combinations of source rock, migration paths, reservoir rock characteristics, and other factors. The play portfolios will include: descriptions and maps of the major oil plays by reservoir; production and reservoir data; case-study field evaluations; summaries of the state-of-the-art drilling, completion, and secondary/tertiary techniques for each play; locations of major oil pipelines; descriptions of reservoir outcrop analogs; and identification and discussion of land use constraints. All play maps, reports, databases, and so forth, produced for the project will be published in interactive, menu-driven digital (web-based and compact disc) and hard-copy formats. This report covers research activities for the fourth quarter of the first project year (April 1 through June 30, 2003). This work included describing outcrop analogs to the Jurassic Nugget Sandstone and Pennsylvanian Paradox Formation, the major oil producers in the thrust belt and Paradox Basin, respectively. Production-scale outcrop analogs provide an excellent view, often in three dimensions, of reservoir-facies characteristics and boundaries contributing to the overall heterogeneity of reservoir rocks. They can be used as a ''template'' for evaluation of data from conventional core, geophysical and petrophysical logs, and seismic surveys. The Nugget Sandstone was deposited in an extensive dune field that extended from Wyoming to Arizona. 
Outcrop analogs are found in the stratigraphically equivalent Navajo Sandstone of southern Utah which displays large-scale dunal cross-strata with excellent reservoir properties and interdunal features such as oases, wadi, and playa lithofacies with poor reservoir properties. Hydrocarbons in the Paradox Formation are stratigraphically trapped in carbonate buildups (or phylloid-algal mounds). Similar carbonate buildups are exposed in the Paradox along the San Juan River of southeastern Utah. Reservoir-quality porosity may develop in the types of facies associated with buildups such as troughs, detrital wedges, and fans, identified from these outcrops. When combined with subsurface geological and production data, these outcrop analogs can improve (1) development drilling and production strategies such as horizontal drilling, (2) reservoir-simulation models, (3) reserve calculations, and (4) design and implementation of secondary/tertiary oil recovery programs and other best practices used in the oil fields of Utah and vicinity. During this quarter, technology transfer activities consisted of exhibiting the project plans, objectives, and products at a booth at the 2003 annual convention of the American Association of Petroleum Geologists. The project home page was updated on the Utah Geological Survey Internet web site.

For a real massless scalar field in general relativity with a negative cosmological constant, we uncover a large class of spherically symmetric initial conditions that are close to AdS, but whose numerical evolution does not result in black hole formation. According to the AdS/CFT dictionary, these bulk solutions are dual to states of a strongly interacting boundary CFT that fail to thermalize at late times. Furthermore, as these states are not stationary, they define dynamical CFT configurations that do not equilibrate. We develop a two-timescale perturbative formalism that captures both direct and inverse cascades of energy and agrees with our fully nonlinear evolutions in the appropriate regime. We also show that this formalism admits a large class of quasi-periodic solutions. Finally, we demonstrate a striking parallel between the dynamics of AdS and the classic Fermi-Pasta-Ulam-Tsingou problem.

This project examines how Presidents Lyndon Johnson, Richard Nixon, Jimmy Carter, Ronald Reagan, and Bill Clinton discussed issues of poverty and welfare from Johnson's declaration of War on Poverty in 1964 to Clinton's signing of the Personal...

Through exploration of William Faulkner's, James Weldon Johnson's and Nella Larsen's "passing novels," this dissertation points out that narrative representation of racial passing facilitates and compromises the authors' challenge to the white...

We examine GRBs with both Fermi-LAT and X-ray afterglow data. Assuming that the 100 MeV (LAT) emission is radiation from cooled electrons accelerated by external shocks, we show that the kinetic energy of the blast wave estimated from the 100 MeV flux is 50 times larger than the one estimated from the X-ray flux. This can be explained if either: (i) electrons radiating at X-rays are significantly cooled by SSC (suppressing the synchrotron flux above the cooling frequency), or (ii) the X-ray emitting electrons, unlike those emitting at 100 MeV energies, are in the slow cooling regime. In both cases the X-ray flux is no longer an immediate proxy of the blast wave kinetic energy. We model the LAT, X-ray, and optical data and show that in general these possibilities are consistent with the data, and explain the apparent disagreement between X-ray and LAT observations. All possible solutions require weak magnetic fields: $10^{-6}$... energy...

Properly interpreted data from nearby galaxies $(z\simeq 0.01)$ lead to $\Omega \simeq 0.082$; data from farther away galaxies $(z\simeq 1)$ with type Ia supernovae, to $\Omega = 0.153$; data to be expected from very highly redshifted galaxies $(z\simeq 10.1)$, to $\Omega = 0.500$; and actual data from the CBR, emitted at the time at which the universe became transparent $(z\simeq 1422)$, to $\Omega \simeq 0.992$. All these data are simultaneously consistent with the standard big-bang picture (no inflation), in which $\Omega$ is time dependent and given by $\Omega(y)=1/\cosh^{2}(y)$, where $y\equiv \sinh^{-1}(T_{+}/T)^{1/2}$.

...about Future Generations Can Result in Poverty for Everyone (Game-Theoretic Analysis), Tanja Magoc: seemingly reasonable altruism can lead to disastrous ''solutions'' such as universal poverty if we spend too much of our resources (if we spend too much of the financial reserves, if we deplete our fuel...

Hybrid organizations combine institutional logics, often in a search for novel solutions to complex problems such as climate change. This dissertation explores the conditions under which hybrid organizations are effective ...

The present paper scrutinizes the principle of quantum determinism, which maintains that complete information about the initial quantum state of a physical system should determine the system's quantum state at any other time. As shown in the paper, assuming the strong exponential time hypothesis (SETH), which conjectures that known algorithms for solving NP-complete problems (often brute-force algorithms) are optimal, the quantum deterministic principle cannot hold generally, i.e., for randomly selected physical systems, particularly macroscopic systems. In other words, even if the initial quantum state of an arbitrary system were precisely known, as long as SETH is true it might be impossible in the real world to predict the system's exact final quantum state. The paper suggests that the breakdown of quantum determinism in a process in which a black hole forms and then completely evaporates might actually be physical evidence supporting SETH.

Energy dissipation and decoherence are at first glance harmful to acquiring long exciton lifetime desired for efficient photovoltaics. In the presence of both optically forbidden (namely, dark) and allowed (bright) excitons, however, they can be instrumental as suggested in photosynthesis. By simulating quantum dynamics of exciton relaxations, we show that the optimized decoherence that imposes a quantum-to-classical crossover with the dissipation realizes a dramatically longer lifetime. In an example of carbon nanotube, the exciton lifetime increases by nearly two orders of magnitude when the crossover triggers stable high population in the dark exciton.

Observation of the decay of muons produced in the Earth's atmosphere by cosmic ray interactions provides a graphic illustration of the counter-intuitive space-time predictions of special relativity theory. Muons at rest in the atmosphere, decaying simultaneously, are subject to a universal time-dilatation effect when viewed from a moving frame and so are also observed to decay simultaneously in all such frames. The analysis of this example reveals the underlying physics of the differential aging effect in Langevin's travelling-twin thought experiment.
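The atmospheric-muon illustration can be made concrete with a short survival-fraction calculation. The 15 km path length and beta = 0.98 below are illustrative textbook numbers, not values from this article; only the muon proper lifetime and the form of the time-dilation factor are standard physics.

```python
import math

C = 2.998e8      # speed of light, m/s
TAU0 = 2.197e-6  # muon proper (rest-frame) mean lifetime, s

def survival_fraction(distance_m, beta, dilated=True):
    """Fraction of muons surviving a flight of distance_m at speed beta*c.
    With dilated=True the lab-frame flight time is converted to proper
    time via the Lorentz factor gamma; with dilated=False it is not."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    t_lab = distance_m / (beta * C)                 # lab-frame flight time
    t_proper = t_lab / gamma if dilated else t_lab  # decay clock time
    return math.exp(-t_proper / TAU0)

# Illustrative: 15 km of atmosphere at beta = 0.98 (gamma ~ 5)
with_sr = survival_fraction(15e3, 0.98, dilated=True)
without_sr = survival_fraction(15e3, 0.98, dilated=False)
```

Without time dilation essentially no muons would reach the ground, while with it a measurable fraction (of order a percent here) survives, which is why the observed sea-level muon flux is itself evidence for special relativity.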

Among the initial results from Kepler are two striking light curves, for KOI-74 and KOI-81, in which the relative depths of the primary and secondary eclipses show that the more compact, less luminous object is hotter than its stellar host. That result becomes particularly intriguing because a substellar mass is derived for the secondary in KOI-74, which would make the high temperature challenging to explain; in KOI-81, the mass range for the companion is also consistent with a substellar object. We re-analyze the Kepler data and demonstrate that both companions are likely to be white dwarfs. We also find that the photometric data for KOI-74 show a modulation in brightness as the more luminous star orbits, due to Doppler boosting. The magnitude of the effect is sufficiently large that we can use it to infer a radial velocity amplitude accurate to 1 km/s. As far as we are aware, this is the first time a radial-velocity curve has been measured photometrically. Combining our velocity amplitude with the inclination...
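The photometric radial-velocity trick rests on first-order Doppler boosting, dF/F ~ 4 v_r/c for bolometric flux. Both the prefactor of 4 (the band-dependent factor for a real Kepler target differs somewhat) and the 15 km/s velocity below are illustrative assumptions, not values from the paper:

```python
C_KM_S = 2.998e5  # speed of light, km/s

def boosting_amplitude(k_kms, factor=4.0):
    """First-order photometric Doppler-boosting amplitude dF/F ~ factor * K / c.
    factor=4 is the bolometric value (assumption for illustration)."""
    return factor * k_kms / C_KM_S

def velocity_from_amplitude(df_over_f, factor=4.0):
    """Invert the relation: photometric amplitude -> radial-velocity amplitude."""
    return df_over_f * C_KM_S / factor

# Illustrative: a 15 km/s orbital velocity gives a ~2e-4 flux modulation,
# comfortably above Kepler's photometric precision on bright stars.
amp = boosting_amplitude(15.0)
print(amp)
print(velocity_from_amplitude(amp))  # recovers the input velocity
```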

When is non-state violence politically effective? Existing scholarship suggests that insurgency and terrorism are generally effective or ineffective based on the analysis of unitary non-state coercers operating solely at ...

Utah oil fields have produced over 1.33 billion barrels (211 million m³) of oil and hold 256 million barrels (40.7 million m³) of proved reserves. The 13.7 million barrels (2.2 million m³) of production in 2002 was the lowest level in over 40 years and continued the steady decline that began in the mid-1980s. However, in late 2005 oil production increased, due in part to the discovery of Covenant field in the central Utah Navajo Sandstone thrust belt ('Hingeline') play and to increased development drilling in the central Uinta Basin, reversing the decline. The Utah Geological Survey believes that providing play portfolios for the major oil-producing provinces (Paradox Basin, Uinta Basin, and thrust belt) in Utah and adjacent areas in Colorado and Wyoming can sustain this new upward production trend. Oil plays are geographic areas with petroleum potential created by favorable combinations of source rock, migration paths, reservoir rock characteristics, and other factors. The play portfolios include descriptions and maps of the major oil plays by reservoir; production and reservoir data; case-study field evaluations; locations of major oil pipelines; identification and discussion of land-use constraints; descriptions of reservoir outcrop analogs; and summaries of state-of-the-art drilling, completion, and secondary/tertiary recovery techniques for each play. The most prolific oil reservoir in the Utah/Wyoming thrust belt province is the eolian Jurassic Nugget Sandstone, which has produced over 288 million barrels (46 million m³) of oil and 5.1 trillion cubic feet (145 billion m³) of gas. Traps form on discrete subsidiary closures along major ramp anticlines where the depositionally heterogeneous Nugget is also extensively fractured. Hydrocarbons in Nugget reservoirs were generated from subthrust Cretaceous source rocks.
The seals for the producing horizons are overlying argillaceous and gypsiferous beds in the Jurassic Twin Creek Limestone, or a low-permeability zone at the top of the Nugget. The Nugget Sandstone thrust belt play is divided into three subplays: (1) Absaroka thrust, Mesozoic-cored shallow structures; (2) Absaroka thrust, Mesozoic-cored deep structures; and (3) Absaroka thrust, Paleozoic-cored shallow structures. Each of the Mesozoic-cored structures subplays represents a linear, hanging-wall ramp anticline parallel to the leading edge of the Absaroka thrust. Fields in the shallow Mesozoic subplay produce crude oil and associated gas; fields in the deep subplay produce retrograde condensate. The Paleozoic-cored structures subplay is located immediately west of the Mesozoic-cored structures subplays. It represents a very continuous and linear, hanging-wall ramp anticline where the Nugget is truncated against a thrust splay. Fields in this subplay produce nonassociated gas and condensate. Traps in these subplays consist of long, narrow, doubly plunging anticlines. Prospective drilling targets are delineated using high-quality two-dimensional and three-dimensional seismic data, forward modeling/visualization tools, and other state-of-the-art techniques. Future Nugget Sandstone exploration could focus on more structurally complex and subtle thrust-related traps. Nugget structures may be present beneath the leading edge of the Hogsback thrust and the North Flank fault of the Uinta uplift. The Jurassic Twin Creek Limestone play in the Utah/Wyoming thrust belt province has produced over 15 million barrels (2.4 million m³) of oil and 93 billion cubic feet (2.6 billion m³) of gas. Traps form on discrete subsidiary closures along major ramp anticlines where the low-porosity Twin Creek is extensively fractured. Hydrocarbons in Twin Creek reservoirs were generated from subthrust Cretaceous source rocks.
The seals for the producing horizons are overlying argillaceous and clastic beds, and non-fractured units within the Twin Creek. The Twin Creek Limestone thrust belt play is divided into two subplays: (1) Absaroka thrust, Mesozoic-cored structures and (2) A
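The barrel/cubic-metre pairs quoted in the abstract above are straightforward to verify; a quick sanity check of the conversions:

```python
BBL_TO_M3 = 0.158987   # one US oil barrel in cubic metres
FT3_TO_M3 = 0.0283168  # one cubic foot in cubic metres

def bbl_to_m3(bbl):
    return bbl * BBL_TO_M3

def ft3_to_m3(ft3):
    return ft3 * FT3_TO_M3

# Figures quoted in the abstract above:
print(bbl_to_m3(1.33e9) / 1e6)   # ~211 million m3 cumulative oil production
print(bbl_to_m3(256e6) / 1e6)    # ~40.7 million m3 proved reserves
print(ft3_to_m3(5.1e12) / 1e9)   # ~144 billion m3 Nugget gas (abstract rounds to 145)
```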

...of Architectural and Environmental Engineering, said, "Lack of streamflow or declining lake or reservoir levels can mean there is not enough water physically available for power plant cooling. The high temperatures have also increased water..." (tx H2O, Fall 2011; story by Danielle Kalisek and Leslie Lee). A sidebar timeline notes that the Sabine River Authority and the city of Dallas signed a contract to move water to the Dallas Water Utilities Eastside Water...

The roles of oxytocin (OT) and vasopressin (AVP) in both basal and estrogen-induced prolactin (PRL) secretion were examined. Adult female Sprague-Dawley rats that had been ovariectomized for 3 weeks and received estrogen treatment for 1 week were used. Intravenous administration of hormones and serial blood sampling were accomplished through indwelling intraatrial catheters implanted two days earlier. Plasma PRL levels were measured by radioimmunoassay. Oxytocin at a dose of 20 µg/rat stimulated a moderate PRL release in the morning, and lower doses were without effect. Vasopressin was most effective at a dose of 5 µg/rat in stimulating PRL release, while consecutive injections of higher doses were less effective. In contrast, TRH, ranging from 1 to 8 µg/rat, induced dose-dependent increases in PRL secretion. Using the effective dosages determined from the morning studies, repeated injections of either OT, AVP, or their specific antagonists (MPOMeOVT) were given hourly between 1300 and 1800 h, and blood samples were obtained hourly from 1100 to 1900 h. It was found that either OT or AVP significantly reduced the afternoon PRL surge, while their antagonists were not as effective.

Understanding the electrical properties of rocks is of fundamental interest. We report on currents generated when stresses are applied. Loading the center of gabbro tiles, 30x30x0.9 cm$^3$, across a 5 cm diameter piston leads to positive currents flowing from the center to the unstressed edges. Changing the constant rate of loading over 5 orders of magnitude, from 0.2 kPa/s to 20 MPa/s, produces positive currents that start to flow already at low stress levels, consistent with the stress-activated break-up of peroxy defects in the rock-forming minerals. The peroxy break-up leads to positive holes h$^{\bullet}$, i.e. electronic states associated with O$^-$ in a matrix of O$^{2-}$, plus electrons, e'. Propagating...

Geomechanical Modeling as a Reservoir Characterization Tool at Rulison Field, Piceance Basin (thesis, Department of Geophysics; Dr. Terence K. Young, Department Head). Abstract: Geomechanics is a powerful reservoir characterization tool. Geomechanical modeling is used here to understand how the in...

Using a geology-based assessment methodology, the U.S. Geological Survey estimated a total of 1.525 trillion barrels of oil in place in seventeen oil shale zones in the Eocene Green River Formation in the Piceance Basin, western Colorado.

This progress report discusses in detail the geologic assessment of the Piceance Creek Basin. Analysis concentrated on the high-resolution aeromagnetic data acquired by World Geoscience, but the interpretation was supplemented by examination of regional published gravity and magnetic data, as well as surface and subsurface geology.

Utah oil fields have produced a total of 1.2 billion barrels (191 million m³) of oil. However, the 15 million barrels (2.4 million m³) of production in 2000 was the lowest level in over 40 years and continued the steady decline that began in the mid-1980s. The Utah Geological Survey believes this trend can be reversed by providing play portfolios for the major oil-producing provinces (Paradox Basin, Uinta Basin, and thrust belt) in Utah and adjacent areas in Colorado and Wyoming. Oil plays are geographic areas with petroleum potential caused by favorable combinations of source rock, migration paths, reservoir rock characteristics, and other factors. The play portfolios will include: descriptions and maps of the major oil plays by reservoir; production and reservoir data; case-study field evaluations; summaries of state-of-the-art drilling, completion, and secondary/tertiary techniques for each play; locations of major oil pipelines; descriptions of reservoir outcrop analogs; and identification and discussion of land-use constraints. All play maps, reports, databases, and so forth produced for the project will be published in interactive, menu-driven digital (web-based and compact disc) and hard-copy formats. This report covers research activities for the first quarter of the first project year (July 1 through September 30, 2002). This work included producing general descriptions of Utah's major petroleum provinces, gathering field data, and analyzing best practices in the Utah/Wyoming thrust belt. Major Utah oil reservoirs and/or source rocks are found in Devonian through Permian, Jurassic, Cretaceous, and Tertiary rocks. Stratigraphic traps include carbonate buildups and fluvial-deltaic pinchouts; structural traps include basement-involved and detached faulted anticlines. Best practices used in Utah's oil fields consist of waterflood, carbon-dioxide flood, gas-injection, and horizontal drilling programs.
Nitrogen injection and horizontal drilling programs have been successfully employed to enhance oil production from the Jurassic Nugget Sandstone (the major thrust belt oil-producing reservoir) in Wyoming's Painter Reservoir and Ryckman Creek fields. At Painter Reservoir field, a tertiary, miscible nitrogen-injection program is being conducted to raise the reservoir pressure to miscible conditions. Supplemented with water injection, the ultimate recovery will be 113 million barrels (18 million m³) of oil (a 68 percent recovery factor over a 60-year period). The Nugget reservoir has significant heterogeneity due to both depositional facies and structural effects. These characteristics create ideal targets for horizontal wells and horizontal laterals drilled from existing vertical wells. Horizontal drilling programs were conducted in both Painter Reservoir and Ryckman Creek fields to encounter potential undrained compartments and increase the overall field recovery by 0.5 to 1.5 percent per horizontal wellbore. Technology transfer activities consisted of exhibiting a booth display of project materials at the Rocky Mountain Section meeting of the American Association of Petroleum Geologists, a technical presentation to the Wyoming State Geological Survey, and two publications. A project home page was set up on the Utah Geological Survey Internet web site.
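The Painter Reservoir figures imply the size of the original oil in place, and, if the quoted 0.5 to 1.5 percent per horizontal wellbore is read as recovery-factor points of that OOIP (an interpretation, not stated in the report), the incremental volumes per well follow directly:

```python
# Back-of-envelope check of the Painter Reservoir figures quoted above.
ultimate_recovery_bbl = 113e6   # projected ultimate recovery, barrels
recovery_factor = 0.68          # quoted recovery factor

# Implied original oil in place (OOIP):
ooip_bbl = ultimate_recovery_bbl / recovery_factor
print(f"implied OOIP: {ooip_bbl / 1e6:.0f} million bbl")

# Incremental recovery, assuming 0.5-1.5 percent of OOIP per horizontal wellbore:
for pct in (0.005, 0.015):
    print(f"{pct:.1%} per wellbore -> {ooip_bbl * pct / 1e6:.2f} million bbl")
```

The implied OOIP is about 166 million barrels, so each horizontal wellbore would add roughly 0.8 to 2.5 million barrels under this reading.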

One of the Department of Energy's (DOE) strategies and objectives in the Natural Gas Program is to conduct activities to transfer technology from R&D programs to potential users. The Slant Hole Completion Test has achieved exactly this objective. The Slant Hole site is essentially the same as the Multiwell site and is located in the southeastern portion of the Piceance Basin near Rifle, Colorado. The Piceance Basin is typical of the Western low-permeability basins that contain thick sequences of sands, silts, and coals deposited during the Cretaceous period. These sequences contain vast amounts of natural gas but have proven resistant to commercial production because of the low permeability of the host rocks. Using the knowledge gained from DOE's earlier Multiwell experiment, the SHCT-1 was drilled to demonstrate that, by intersecting the natural fractures found in these "tight rocks," commercial gas production can be obtained.

This report describes progress in the following tasks: a high-resolution aeromagnetic survey of the southern Piceance Basin of western Colorado; field performance site selection of Rulison Field for seismic acquisition, which covers technical work to be performed; seismic acquisition, processing, and associated costs; theoretical background concerning P-wave multi-azimuth 3D seismic; field data examples of P-wave multi-azimuth data; and 3D basin modeling.

For several decades, the ascendancy of the Pharma & Biotech sector was largely driven by favorable macro-economic conditions combined with an astonishing level of innovation and a clear focus on addressing unmet medical ...

A modified vacuum energy density of the radiation field is evaluated, which leads to the accepted prediction for the radius of the universe. The modification takes into account the existence of a new gauge boson, which can also be used to determine the mass of the boson responsible for the weak decay of the muon.

The goals of this project were: (1) to enhance recovery of oil contained within algal mounds on the Ute Mountain Ute tribal lands; (2) to promote the use of advanced technology and expand the technical capability of the Native American oil production corporations by direct assistance in the current project and dissemination of technology to other Tribes; (3) to develop an understanding of multicomponent seismic data as it relates to variations in permeability and porosity of algal mounds, as well as lateral facies variations, for use in both reservoir development and exploration; (4) to identify any undiscovered algal mounds for field extension within the area of seismic coverage; and (5) to evaluate the potential for applying CO₂ floods, steam floods, waterfloods, or other secondary or tertiary recovery processes to increase production. The technical work scope was carried out by: (1) acquiring multicomponent seismic data over the project area; (2) processing and reprocessing the multicomponent data to extract as much geological and engineering data as possible within the budget and time frame of the project; (3) preparing maps and data volumes of geological and engineering data based on the multicomponent seismic and well data; (4) selecting drilling targets if warranted by the seismic interpretation; (5) constructing a static reservoir model of the project area; and (6) constructing a dynamic, history-matched simulation model from the static model. The original project scope covered a 6 mi² (15.6 km²) area encompassing two algal mound fields (Towaoc and Roadrunner). 3D3C seismic data was to be acquired over this area to delineate mound complexes and image internal reservoir properties such as porosity and fluid saturations. After the project began, the Red Willow Production Company, a project partner and fully owned company of the Southern Ute Tribe, contributed additional money to upgrade the survey to a nine-component (3D9C) survey.
The purpose of this upgrade to nine components was to provide additional shear-wave component data that might prove useful in delineating internal mound reservoir attributes. Red Willow also extended the P-wave portion of the survey beyond the original 6 mi² (15.6 km²) 3D9C area in order to extend coverage northwest to the Marble Wash area. To accomplish this scope of work, a 3D9C seismic data set covering two known reservoirs was acquired and processed. Three-dimensional, zero-offset vertical seismic profile (VSP) data was acquired to determine the shear-wave velocities for processing the shear-wave 3D seismic data. Anisotropic velocity analysis and azimuthal AVO processing were carried out in addition to the conventional 3D P-wave data processing. All P-, PS-, and S-wave volumes of the seismic data were interpreted to map the seismic response. The interpretation consisted of conventional cross-plots of seismic attributes vs. geological and reservoir engineering data, as well as multivariate and neural net analyses to assess whether additional resolution on exploration and engineering parameters could be achieved through the combined use of several seismic variables. Engineering data in the two reservoirs was used to develop a combined lithology, structure, and permeability map. On the basis of the seismic data, a well was drilled into the northern mound trend in the project area. This well, Roadrunner No. 9-2, was brought into production in late April 2006 and continues to produce modest amounts of oil and gas. As of the end of August 2007, the well had produced approximately 12,000 barrels of oil and 32,000 mcf of gas. A static reservoir model was created from the seismic data interpretations and well data. The seismic data was tied to various markers identified in the well logs, which in turn were related to lithostratigraphy.
The tops and thicknesses of the various units were extrapolated from well control based upon the seismic data that was calibrated to the well picks. The reservoir engineering properties were available from a number of wel

Economically viable natural gas production from the low-permeability Mesaverde Formation in the Piceance Basin, Colorado requires the presence of an intense set of open natural fractures. Establishing the regional presence and specific location of such natural fractures is the highest-priority exploration goal in the Piceance and other western US tight gas basins. Recently, Advanced Resources International, Inc. (ARI) completed a field program at Rulison Field, Piceance Basin, to test and demonstrate the use of advanced seismic methods to locate and characterize natural fractures. This project began with a comprehensive review of the tectonic history, state of stress, and fracture genesis of the basin. A high-resolution aeromagnetic survey, interpreted satellite and SLAR imagery, and 400 line miles of 2-D seismic provided the foundation for the structural interpretation. The central feature of the program was the 4.5-square-mile multi-azimuth 3-D seismic P-wave survey to locate natural fracture anomalies. The interpreted seismic attributes are being tested against a control data set of 27 wells. Additional wells are currently being drilled at Rulison, on close 40-acre spacings, to establish the productivity from the seismically observed fracture anomalies. A similar regional prospecting and seismic program is being considered for another part of the basin. The preliminary results indicate that detailed mapping of fault geometries and use of azimuthally defined seismic attributes exhibit close correlation with high-productivity gas wells. The performance of the ten new wells, being drilled in the seismic grid in late 1996 and early 1997, will help demonstrate the reliability of this natural fracture detection and mapping technology.

The overall objectives of this study were to: review and evaluate published information on the disposal, composition, and leachability of solid wastes produced by aboveground shale oil extraction processes; examine the relationship of development to surface and groundwater quality in the Piceance Creek basin of northwestern Colorado; and identify key areas of research necessary for quantitative assessment of impact. Information is presented under the following section headings: proposed surface retorting developments; surface retorting processes; environmental concerns; chemical/mineralogical composition of raw and retorted oil shale; disposal procedures; water quality; and research needs.

The following reports summarize the results of recent exploration, testing, and production in the Wind River Basin, Wyoming; Powder River Basin, Wyoming and Montana; Greater Green River Coal Region, Wyoming and Colorado; Piceance Basin, Colorado; San Juan Basin, Colorado and New Mexico; Raton Basin, Colorado and New Mexico; Black Warrior Basin, Alabama and the Northern and Central Appalachian Basins. Contents also include: Advances in Laboratory Measurement Techniques of Relative Permeability and Capillary Pressure for Coal Seams; Methane from Coal Seams Research; and Technical Events.

Games Based on Brownian Ratchets (Juan M. R. Parrondo, Gregory P. Harmer, and Derek Abbott). Based on Brownian ratchets, a counterintuitive phenomenon has recently emerged: two losing games, when alternated in a periodic or random fashion, can produce a winning game [1,2]. This so-called flashing ratchet is in the class of phenomena known as Brownian ratchets [3].

Although current oil shale developments in the Piceance Basin appear to have had little impact on ecosystems, it is important to recognize that planned expansion of the industry in the Basin will greatly magnify the potential for serious perturbations of the Piceance environs. The relatively small scale of the present oil shale activities in the Basin provides the biologist with a unique opportunity to establish and conduct quantitative studies designed to measure impacts as they occur. This paper is intended to focus attention on some of the problems, perspectives and recommended approaches to conducting ecosystem effects studies that will provide criteria for evaluation and mitigation of impacts should they occur. The purpose of this paper is not to criticize past and current environmental studies on oil shale, but in light of anticipated growth of the industry, to focus attention on the need to carefully define, design and execute ecological effects studies to quantify and provide mitigation criteria for impacts that will undoubtedly result from accelerated industry activities.

This report is an annual summary of ongoing research in modeling and detecting naturally fractured gas reservoirs. The current research is in the Piceance basin of western Colorado. The aim is to use existing information to determine the optimal zone or area of fracturing using a unique reaction-transport-mechanical (RTM) numerical basin model. The RTM model will subsequently help map subsurface lateral and vertical fracture geometries. The base collection techniques include in-situ fracture data, remote sensing, aeromagnetics, 2-D seismic, and regional geologic interpretations. Once identified, high-resolution airborne and spaceborne imagery will be used to verify the RTM model by comparing surficial fractures. If this imagery agrees with the model data, a three-dimensional seismic survey component will be added to the investigation. This report presents an overview of the Piceance Creek basin and then reviews work in the Parachute and Rulison fields and the results of the RTM models in these fields.

The report names and describes the Godiva Rim Member of the Green River Formation in the eastern part of the Washakie basin in southwest Wyoming and the central part of the Sand Wash basin in northwest Colorado. The Godiva Rim Member comprises lithofacies of mixed mudflat and lacustrine origin situated between the overlying lacustrine Laney Member of the Green River Formation and the underlying fluvial Cathedral Bluffs Tongue of the Wasatch Formation. The Godiva Rim Member is laterally equivalent to and grades westward into the LaClede Bed of the Laney Member. The Godiva Rim Member of the Green River Formation was deposited along the southeast margins of Lake Gosiute and is correlated to similar lithologic units that were deposited along the northeast margins of Lake Uinta in the Parachute Creek Member of the Green River Formation. The stratigraphic data presented provide significant evidence that the two lakes were periodically connected around the east end of the Uinta Mountains during the middle Eocene.

The Project Rulison underground nuclear test was conducted in 1969 at a depth of 8,400 ft in the Williams Fork Formation of the Piceance Basin, west-central Colorado (Figure 1). The U.S. Department of Energy Office of Legacy Management (LM) is the steward of the site. Their management is guided by data collected from past site investigations and current monitoring, and by the results of calculations of expected behavior of contaminants remaining in the deep subsurface. The purpose of this screening risk assessment is to evaluate possible health risks from current and future exposure to Rulison contaminants so the information can be factored into LM's stewardship decisions. For example, these risk assessment results can inform decisions regarding institutional controls at the site and appropriate monitoring of nearby natural-gas extraction activities. Specifically, the screening risk analysis can provide guidance for setting appropriate action levels for contaminant monitoring to ensure protection of human health.

An Oil Shale Mining Economic Model (OSMEM) was developed and executed for mining scenarios representative of commercially feasible mining operations. Mining systems were evaluated for candidate sites in the Piceance Creek Basin. Mining methods selected included: (1) room-and-pillar; (2) chamber-and-pillar, with spent-shale backfilling; (3) sublevel stoping; and (4) sublevel stoping, with spent-shale backfilling. Mines were designed to extract oil shale resources to support a 50,000 barrels-per-day surface processing facility. Costs developed for each mining scenario included all capital and operating expenses associated with the underground mining methods. Parametric and sensitivity analyses were performed to determine the sensitivity of mining cost to changes in capital cost, operating cost, return on investment, and cost escalation.
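The parametric and sensitivity analyses described can be sketched as a one-at-a-time sweep over a simple levelized-cost function. Everything below (the cost model, the capital and operating figures, the required return, the mine life) is an assumed placeholder for illustration, not data from OSMEM:

```python
# Illustrative one-at-a-time sensitivity sweep in the spirit of the OSMEM
# parametric analyses described above. All inputs are assumed placeholders.

def cost_per_bbl(capex, opex_per_bbl, roi, bbl_per_day=50_000, life_years=20):
    """Simple levelized mining cost: amortized capital (at the required
    return) plus operating cost, per barrel of shale-oil capacity."""
    annual_bbl = bbl_per_day * 365
    # capital recovery factor for required return `roi` over the mine life
    crf = roi * (1 + roi) ** life_years / ((1 + roi) ** life_years - 1)
    return capex * crf / annual_bbl + opex_per_bbl

base = dict(capex=1.5e9, opex_per_bbl=8.0, roi=0.12)
print(f"base case: ${cost_per_bbl(**base):.2f}/bbl")

# +/-20% swings, one parameter at a time
for key in ("capex", "opex_per_bbl", "roi"):
    for mult in (0.8, 1.2):
        case = dict(base, **{key: base[key] * mult})
        print(f"{key} x{mult}: ${cost_per_bbl(**case):.2f}/bbl")
```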

Geraghty & Miller, Inc. of Midland, Texas conducted geological and hydrological feasibility studies of the potential applicability of Jack W. McIntyre's patented tool for the recovery of natural gas from coalbed formations in the San Juan, Powder River, Greater Green River, Piceance, Black Warrior, Appalachian, and Michigan basins. Results from the surveys indicated that geology dominated research efforts for many of the basins. Limited information exists on the hydrology and water quality of the basins. All of the basins contain some potential for the use of Jack McIntyre's patented production process. This process is designed specifically to separate produced water and produced gas in a downhole environment and may allow for more efficient and economical development of coalbed methane resources in these areas.

The focus of this report was on preparing data and modules for Piceance Basin-wide fracture prediction. A review of the geological data input and the automated history reconstruction approach was made. Fluid pressure data analysis and preliminary basin simulations were carried out. These activities are summarized briefly below and reviewed in more detail in Appendices A-E. Appendix D is a review of the fluid pressure data and its implications for compartmentation. Preliminary fracture prediction computations on generic basins are presented in Appendix E; these were carried out as part of our code testing activities. The results of these two appendices are the beginning of what will be the basis of the model testing; fluid pressures are directly comparable with the model predictions and are a key element of fracture nucleation and propagation. We summarize the tectonic and sedimentary history of the Piceance Basin based on our automated history reconstruction and published interpretations. The narrative and figures provide the basic material we have quantified for our CIRF.B basin simulator input. This data supplements our existing well data interpretation approach and provides an independent check of the automated sedimentary/subsidence history reconstruction module. Fluid pressure data was gathered and analyzed. This data serves two functions. First, the fluid pressure distribution across the basin provides a quantitative test, as it is a direct prediction of CIRF.B. Second, fluid pressure modifies effective stress; it thereby enters the fracture nucleation criteria and the fracture extension rate and aperture laws. The pressure data is presented in Appendix D in terms of overpressure maps and isosurfaces.
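The statement that fluid pressure modifies effective stress is the standard Biot/Terzaghi relation, sigma_eff = sigma_total - alpha * P_pore (a textbook law, not a formula quoted from the report); the numbers below are illustrative only:

```python
# Biot effective-stress law: overpressure lowers effective stress,
# which is why fluid pressure enters the fracture nucleation criteria.

def effective_stress_mpa(sigma_total, p_pore, alpha=1.0):
    """Biot effective stress in MPa; alpha=1 recovers Terzaghi's relation."""
    return sigma_total - alpha * p_pore

# Illustrative numbers: ~60 MPa total vertical stress at depth.
hydrostatic = effective_stress_mpa(60.0, 25.0)    # normally pressured zone
overpressured = effective_stress_mpa(60.0, 45.0)  # overpressured compartment
print(hydrostatic, overpressured)  # overpressure brings rock closer to failure
```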

The Wasatch Formation of the Uinta Basin in eastern Utah is typical of many formations in the Rocky Mountains, having low permeability and high sensitivity to water. Stimulation treatments with several types of fracturing fluids, including oil-water emulsion fluids, complex gel fluids, and foam fluids, have been generally successful. Production decline curves from twenty-four wells in the field were used to compare the different stimulation methods. Although foam fracturing has been used for the shortest period of time, comparison of the production histories shows the relatively higher efficiency of the foam fracturing treatments compared to other stimulation methods in the Wasatch Formation. Foam fluids gave higher production rates and higher flowing pressures than offset wells fractured with complex gel fluids. A stimulation model for oil and gas production was used to match the production history from this reservoir. The model allowed a projection of gas production based on early production from the wells and knowledge of the reservoir.
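The decline-curve comparison described above can be sketched with the Arps rate equation; the abstract gives neither its model nor its data, so the hyperbolic form and every parameter value below are illustrative assumptions:

```python
import math

def arps_rate(t_years, qi, di, b):
    """Arps production rate: exponential when b=0, hyperbolic when 0<b<=1.
    qi: initial rate, di: initial decline (1/yr), b: decline exponent."""
    if b == 0:
        return qi * math.exp(-di * t_years)
    return qi / (1.0 + b * di * t_years) ** (1.0 / b)

# Compare an assumed foam-frac well (higher initial rate) with an assumed
# gel-frac offset, both given the same decline parameters.
for label, qi in (("foam frac", 1200.0), ("gel frac", 800.0)):
    rates = [arps_rate(t, qi, di=0.8, b=0.5) for t in (0, 1, 2, 5)]
    print(label, [round(r) for r in rates])
```

Fitting `qi`, `di`, and `b` to each well's early history and integrating the rate forward is the usual way such a model projects future gas production.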

The effect of confining pressure on the pore volume of some tight sandstones from the Uinta Basin, Utah, was investigated. A new method based on the pressure-volume relationships of a gas was developed and used to measure pore volume reduction. The results were compared with the results obtained using the more common method, which involves measuring the liquid expelled from a saturated core, and were found to be in good agreement. Pore volume compressibility of the samples studied is in the range of values reported by other investigators and ranges from 2.0 × 10⁻⁶ to 1.3 × 10⁻⁵ pv/pv/psi at a confining pressure of 5,000 psi.
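The "pressure-volume relationships of a gas" method is essentially Boyle's-law porosimetry. A minimal sketch of the idea, together with the compressibility definition c_pv = -(1/Vp)(dVp/dP); all numbers are illustrative, not the paper's data:

```python
# Boyle's-law pore-volume measurement plus the compressibility definition.

def pore_volume_boyle(p1, v_ref, p2):
    """Isothermal expansion of gas from a reference cell (p1, v_ref) into an
    evacuated core: p1*v_ref = p2*(v_ref + v_pore), solved for v_pore."""
    return v_ref * (p1 - p2) / p2

def pore_compressibility(v0, v1, dp_confining):
    """c_pv = -(1/Vp) dVp/dP, in pore volume per pore volume per psi."""
    return -(v1 - v0) / (v0 * dp_confining)

vp_0 = pore_volume_boyle(p1=100.0, v_ref=50.0, p2=80.0)  # cm3, at low stress
vp_5000 = vp_0 * (1 - 0.02)  # assume 2% pore-volume loss at 5,000 psi confining
c = pore_compressibility(vp_0, vp_5000, 5000.0)
print(vp_0)  # 12.5 cm3
print(c)     # 4e-6 pv/pv/psi, inside the paper's 2.0e-6 to 1.3e-5 range
```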

Keywords: cylinder; Stokes-type force; hydrodynamic interactions; non-Newtonian fluids; Stokes' paradox. The present work deals with the numerical calculation of the Stokes-type drag undergone by a cylinder, where Stokes' paradox takes place. For an unbounded medium, avoiding these traps, we show that the drag

It is pointed out that Schrödinger's celebrated "cat" paradox contains a simple error in reasoning regarding the definition of life. It is then shown that there is no paradox in the context of life as we currently understand it.

... kinematic waves, transmissivity feedback, exchange between matrix and macropores, and so forth (Beven, 1989). Framing these observations in the following way may prove useful. Paradox 1: Rapid Mobilization of Old Water. The 'old water' paradox is exemplified by Figure 1. In many small catchments, streamflow responds promptly

Waterflooding is by far the most widely used method in the world to increase oil recovery. Historically, little consideration has been given in reservoir engineering practice to the effect of injection brine composition on waterflood displacement efficiency or to the possibility of increased oil recovery through manipulation of the composition of the injected water. However, recent work has shown that oil recovery can be significantly increased by modifying the injection brine chemistry or by injecting diluted or low salinity brine. This paper reports on laboratory work done to increase the understanding of improved oil recovery by waterflooding with low salinity injection water. Porous media used in the studies included outcrop Berea sandstone (Ohio, U.S.A.) and reservoir cores from the Green River formation of the Uinta basin (Utah, U.S.A.). Crude oils used in the experimental protocols were taken from the Minnelusa formation of the Powder River basin (Wyoming, U.S.A.) and from the Green River formation, Monument Butte field in the Uinta basin. Laboratory corefloods using Berea sandstone, Minnelusa crude oil, and simulated Minnelusa formation water found a significant relationship between the temperature at which the oil- and water-saturated cores were aged and the oil recovery resulting from low salinity waterflooding. Lower aging temperatures resulted in very little to no additional oil recovery, while cores aged at higher temperatures resulted in significantly higher recoveries from dilute-water floods. Waterflood studies using reservoir cores and fluids from the Green River formation of the Monument Butte field also showed significantly higher oil recoveries from low salinity waterfloods with cores flooded with fresher water recovering 12.4% more oil on average than those flooded with undiluted formation brine.

twentieth century mathematicians used the expression ``The Crisis in Foundations''. This crisis had many manifestations, one in the form of Russell's paradox, appropriately in the heart of set theory. At first blush one might think

Marine energy has a significant role to play in lowering carbon emissions within the energy sector. Paradoxically, it may be susceptible to changes in climate that will result from rising carbon emissions. Wind patterns are expected to change...

Liberalism faces an apparent paradox. Its commitments to values such as neutrality and tolerance seem to recommend a hands-off attitude toward a society's ethical life. It seems the state should not regulate the value ...

the Unexpected. By Donald G. Saari. Review by David Pritchard. Voting gives rise to many paradoxes. There are times when you should not vote for whom you really want. Can you explain this? Donald Saari can!

This project explores the legal, economic, and social aspects of household and estate management in eighteenth-century France. It investigates two paradoxes surrounding noblewomen and household management. The first involves ...

The use of wood is fraught with paradox. Wood as a building material is embraced for its naturalness, while the cutting of trees is indicted as a destruction of nature. Wood is lauded for its structural properties and ...

(cont.) We apply the framework to get new results, creating (a) encryption schemes with very short keys, and (b) hash functions that leak no information about their input, yet, paradoxically, allow testing if a candidate ...

of the Middle Ages The hermeneutics of the Christian Middle Ages assign the religious image and its aesthetic. This Christian hermeneutics or theory of symbolism endow the corporeal, painted image with a paradox inherent

The complexity of the school, society and policy, and dominant cultural beliefs about teaching, learning, and knowledge constrain people's mindsets, paradoxically preventing the fundamental changes that can take advantage ...

The paradoxical seriousness of games was reviewed synoptically by the Dutch historian Johan Huizinga for the humanistic insights of Huizinga. Although the essential ideas for game theory grew out of the analysis

Proliferating tumor cells use aerobic glycolysis to support their high metabolic demands. Paradoxically, increased glycolysis is often accompanied by expression of the lower activity PKM2 isoform, effectively constraining ...

This thesis studies the concept of sic et non (the paradox of systematically acting in a way which contradicts one's beliefs or of simultaneously holding two contradictory beliefs) in the LBA and its cultural milieu, ...

This paper explores the seeming paradox between the predominant choice of natural gas for capacity additions to generate electricity in the United States and the continuing large share of coal in meeting incremental ...

This thesis explores the paradox faced by 25-34 year-old, White, well-educated persons who choose to live in predominantly low-income neighborhoods. In particular, this thesis asks if gentrifiers are aware of gentrification ...

paradox' has led many investigators to models in which cooling of the hotter Earth took place through' are strongly dependent on lithospheric age and hence on lithospheric recycling rate and mantle temperature

or it is of no consequence whatsoever. The reality is that Earth's atmosphere, land surface, and oceans are not passive by observing the response to increased greenhouse gases. Eons and the Faint Sun Paradox The concept is well

The anticancer drug cisplatin is in widespread use but its mechanism of action is only poorly understood. Moreover, human cancers acquire resistance to the drug, which limits its clinical utility. A paradox in the field ...

The Colorado School of Mines (CSM) was awarded a grant by the National Energy Technology Laboratory (NETL), Department of Energy (DOE) to conduct a research project entitled GIS- and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development in October of 2008. The ultimate goal of this research project is to develop a water resource geospatial infrastructure that serves as “baseline data” for creating solutions for water resource management and for supporting decision making on oil shale resource development. The project came to an end on September 30, 2012. This final project report presents the key findings from the project activity, major accomplishments, and expected impacts of the research. In the meantime, the gamma version (also known as Version 4.0) of the geodatabase, as well as various other deliverables stored on digital storage media, will be sent to the program manager at NETL, DOE via express mail. The key findings from the project activity include the quantitative spatial and temporal distribution of the water resource throughout the Piceance Basin, water consumption with respect to oil shale production, and data gaps identified. Major accomplishments of this project include the creation of a relational geodatabase, automated data processing scripts (Matlab) linking the database with the surface water and geological models, an ArcGIS model for hydrogeologic data processing for groundwater model input, a 3D geological model, surface water/groundwater models, an energy resource development systems model, as well as a web-based geospatial infrastructure for data exploration, visualization, and dissemination. This research will have broad impacts on the development of oil shale resources in the US. The geodatabase provides “baseline” data for further study of oil shale development and identification of further data collection needs.
The 3D geological model provides better understanding, through data interpolation and visualization techniques, of the Piceance Basin structure and the spatial distribution of the oil shale resources. The surface water/groundwater models quantify the water shortage and improve understanding of the spatial distribution of the available water resources. The energy resource development systems model reveals the phase shift between water usage and oil shale production, which will facilitate better planning for oil shale development. Detailed descriptions of the key findings from the project activity, major accomplishments, and expected impacts of the research are given in the “ACCOMPLISHMENTS, RESULTS, AND DISCUSSION” section of this report.

Wind River Resources Corporation (WRRC) received a DOE grant in support of its proposal to acquire, process and interpret fifteen square miles of high-quality 3-D seismic data on non-allotted trust lands of the Uintah and Ouray (Ute) Indian Reservation, northeastern Utah, in 2000. Subsequent to receiving notice that its proposal would be funded, WRRC was able to add ten square miles of adjacent state and federal mineral acreage underlying tribal surface lands by arrangement with the operator of the Flat Rock Field. The twenty-five square mile 3-D seismic survey was conducted during the fall of 2000. The data were processed through the winter of 2000-2001, and initial interpretation took place during the spring of 2001. The initial interpretation identified multiple attractive drilling prospects, two of which were staked and permitted during the summer of 2001. The two initial wells were drilled in September and October of 2001. A deeper test was drilled in June of 2002. Subsequently a ten-well deep drilling evaluation program was conducted from October of 2002 through March 2004. The present report discusses the background of the project; design and execution of the 3-D seismic survey; processing and interpretation of the data; and drilling, completion and production results of a sample of the wells drilled on the basis of the interpreted survey. Fifteen wells have been drilled to test targets identified on the North Hill Creek 3-D Seismic Survey. None of these wildcat exploratory wells has been a dry hole, and several are among the best gas producers in Utah. The quality of the data produced by this first significant exploratory 3-D survey in the Uinta Basin has encouraged other operators to employ this technology. At least two additional 3-D seismic surveys have been completed in the vicinity of the North Hill Creek Survey, and five additional surveys are being planned for the 2004 field season. 
This project was successful in finding commercial oil, natural gas and natural gas liquids production on a remote part of the Uintah & Ouray Reservation. Much of the natural gas and natural gas liquids are being produced from the Wingate Formation, which to our knowledge has never produced commercially anywhere. Another large percentage of the natural gas is being produced from the Entrada Formation which has not previously produced in this part of the Uinta Basin. In all, at least nine geologic formations are contributing hydrocarbons to these wells. This survey has clearly established the fact that high-quality data can be obtained in this area, despite the known obstacles.

The Mesaverde Group of the Piceance Basin in western Colorado has been a pilot study area for government-sponsored tight gas sand research for over twenty years. Early production experiments included nuclear stimulations and massive hydraulic fracture treatments. This work culminated in the US Department of Energy's (DOE) Multiwell Experiment (MWX), a field laboratory designed to study the reservoir and production characteristics of low permeability sands. A key feature of MWX was an infrastructure which included several closely spaced wells that allowed detailed characterization of the reservoir through log and core analysis, and well testing. Interference and tracer tests, as well as the use of fracture diagnostics, gave further information on stimulation and production characteristics. Thus, the Multiwell Experiment provided a unique opportunity for identifying the factors affecting production from tight gas sand reservoirs. The purpose of this operation was to support the gathering of field data that may be used to resolve a number of unknowns associated with measuring and modeling the dimensions of hydraulic fractures. Using the close-well infrastructure at the Multiwell Site near Rifle, Colorado, this operation focused primarily on the field design and execution of experiments. The data derived from the experiments were gathered and analyzed by DOE team contractors.

Commercial processing of oil shale is currently being carried out in two countries, these being Manchuria and Estonia. Germany, Israel, Australia, Brazil and the United States are planning commercial development of oil shale during the 1980s. In the United States, developers currently pursuing production facilities in the Piceance Basin in Colorado are the Union Oil Company; Colony Development Company, now owned by Tosco and Exxon; Occidental Oil Shale Inc.; the Rio Blanco Oil Shale Company (Amoco and Gulf) at the C-a tract; the Cathedral Bluffs Oil Shale Company (Oxy and Tenneco) at the C-b tract; the Anvil Points Bureau of Mines site, under the direction of DOE, which has been leased to the Paraho Development Company to optimize their process; and Superior Oil. Superior Oil plans to recover nahcolite and dawsonite that are associated with their oil shale. The processes used by these companies are described briefly. These are the Union B process, Tosco II process, Paraho process, and Occidental process. It is estimated that between 400,000 and 500,000 barrels per day (63,600 to 79,500 m³/day) production would be achieved by 1990 if all of the effects on the infrastructure are planned for and constructed in an orderly manner.

Progress is outlined on activities leading toward evaluation of ecological and agricultural impacts of shale oil development in the Piceance Creek Basin region of northwestern Colorado. After preliminary review of the problem, it was decided to use a model-based calculation approach in the evaluation. The general rationale and objectives of this approach are discussed. Previous studies were examined to characterize climate, soils, vegetation, animals, and ecosystem response units. System function was methodically defined by developing a master list of variables and flows, structuring a generalized system flow diagram, constructing a flow-effects matrix, and conceptualizing interactive spatial units through spatial matrices. The process of developing individual mathematical functions representing the flow of matter and energy through the various system variables in different submodels is discussed. The system model diagram identified 10 subsystems which separately account for flow of soil temperatures, soil water, herbaceous plant biomass, shrubby plant biomass, tree cover, litter biomass, shrub numbers, animal biomass, animal numbers, and land area. Among these coupled subsystems there are 45 unique kinds of state variables and 150 intra-subsystem flows. The model is generalizable and canonical so that it can be expanded, if required, by disaggregating some of the system state variables and allowing for multiple ecological response units. It integrates information on climate, surface water, ecology, land reclamation, air quality, and solid waste as it is being developed by several other task groups.

When a hydrocarbon reservoir is subjected to a hydraulic fracture treatment, the cracking and slipping of the formation results in the emission of seismic energy. The objective of this study was to determine the advantages of using broadband (100 Hz to 1500 M) microseismic emissions to map a hydraulic fracture treatment. A hydraulic fracture experiment was performed in the Piceance Basin of western Colorado to induce and record broadband microseismic events. The formation was subjected to four processes: breakdown/ballout, step-rate test, KCl mini-fracture, and linear-gel mini-fracture. Broadband microseisms were successfully recorded by a novel three-component wall-locked seismic accelerometer package, placed in an observation well 211 ft (64 m) offset from the treatment well. During the two hours of formation treatment, more than 1200 significant microseismic events were observed. The occurrences of the events strongly correlated with the injection borehole pressures during the treatments. Using both hodogram analysis and time-of-arrival information, estimates of the origination point of the seismic events were computed. A map of the event locations yielded a fracture orientation estimate consistent with the known orientation of the field in the formation. This paper describes the technique for acquiring and analyzing broadband microseismic events and illustrates how the new broadband approach can enhance signal detectability and event location resolution.
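A common way to turn arrival-time and hodogram information into an event location, broadly like the approach described, is to estimate source distance from the S-P arrival lag and direction from particle motion. This is a simplified sketch with assumed velocities, not the authors' actual processing:

```python
import math

def sp_distance_ft(dt_sp_s, vp_ft_s, vs_ft_s):
    """Source distance from the S-P lag: d = dt * Vp*Vs / (Vp - Vs)."""
    return dt_sp_s * vp_ft_s * vs_ft_s / (vp_ft_s - vs_ft_s)

def locate(dt_sp_s, azimuth_deg, vp_ft_s=14000.0, vs_ft_s=8000.0):
    """Combine S-P distance with a hodogram-derived azimuth to get a
    map-view location (east, north) in feet relative to the receiver."""
    d = sp_distance_ft(dt_sp_s, vp_ft_s, vs_ft_s)
    az = math.radians(azimuth_deg)
    return d * math.sin(az), d * math.cos(az)

# Assumed event: S wave arrives 10 ms after P, hodogram azimuth 45 degrees.
east, north = locate(dt_sp_s=0.010, azimuth_deg=45.0)
print(round(sp_distance_ft(0.010, 14000.0, 8000.0), 1))  # 186.7 ft
```

Mapping many such (east, north) estimates is what yields the fracture orientation trend the abstract refers to.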

This report presents research objectives, discusses activities, and presents technical progress for the period April 1, 1993 through June 30, 1993 on Contract No. DE-FC21-86LC11084 with the Department of Energy, Laramie Project Office. The scope of the research program and the continuation is to study interacting hydrologic, geotechnical, and chemical factors affecting the behavior and disposal of combusted processed oil shale. The research combines bench-scale testing with large scale research sufficient to describe commercial scale embankment behavior. The large scale approach was accomplished by establishing five lysimeters, each 7.3 × 3.0 × 3.0 m deep, filled with processed oil shale that has been retorted and combusted by the Lurgi-Ruhrgas (Lurgi) process. Approximately 400 tons of Lurgi processed oil shale waste was provided by Rio Blanco Oil Shale Co., Inc. (RBOSC) through a separate cooperative agreement with the University of Wyoming (UW) to carry out this study. Three of the lysimeters were established at the RBOSC Tract C-a in the Piceance Basin of Colorado. Two lysimeters were established in the Environmental Simulation Laboratory (ESL) at UW. The ESL was specifically designed and constructed so that a large range of climatic conditions could be physically applied to the processed oil shale which was filled in the lysimeter cells.

The recent surge of activity involving self-sourcing reservoirs and horizontal drilling recognizes a little-tapped niche in the domestic energy mix. Such prolific pays as the Devonian-Mississippian Bakken and the Cretaceous Austin Chalk have drawn research interest and large amounts of investment capital. Fluorescence analysis can discern movable oil, as opposed to exhausted source rock, in such reservoirs with an inexpensive test. Other potential targets are the Cretaceous Mesaverde in the Piceance basin, the Devonian New Albany shale in Kentucky, the Devonian Antrim shale in the Michigan basin, and the Cretaceous Niobrara, Mancos, and Pierre formations in Colorado and New Mexico. To ensure success in this niche, this key question must be answered positively: Is movable oil present in the reservoir? Even if tectonic studies verify a system of open fractures, sonic logs confirm overpressuring in the zone, and resistivity logs document the maturity of the source, the ultimate question remains: Is movable oil in the fractures available to flow to the borehole? The paper explains a technique that will answer these questions.

The bitumen from the Whiterocks oil sand deposit in the Uinta Basin of eastern Utah was hydrotreated in a fixed-bed reactor to determine the extent of upgrading as a function of process operating variables. The process variables investigated included reactor pressure (11.2-16.7 MPa), reactor temperature (641-712 K), and liquid hourly space velocity (0.19-0.77 h⁻¹). The hydrogen/oil ratio, 890 m³/m³, was fixed in all experiments. A sulphided Ni-Mo on alumina hydrodenitrogenation catalyst was used in these studies. The deactivation of the catalyst, 0.2 °C/day, was monitored by the decline in the API gravity of the total liquid product with time on-stream at a standard set of conditions. The effects of temperature, WHSV, and pressure on denitrogenation, desulphurization, and metals removal were studied and apparent kinetic parameters determined. The effects of process variables on residue conversion and Conradson carbon residue reduction were also investigated.
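Apparent kinetic parameters of the kind mentioned above are often extracted by assuming first-order plug-flow kinetics and an Arrhenius temperature dependence. The conversions below are invented for illustration; this is a generic sketch, not the study's actual data or rate model:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def k_apparent(lhsv_per_h, conversion):
    """First-order plug-flow rate constant: k = LHSV * ln(1/(1-x))."""
    return lhsv_per_h * math.log(1.0 / (1.0 - conversion))

def activation_energy(k1, t1_k, k2, t2_k):
    """Apparent Ea from two temperatures via Arrhenius:
    ln(k2/k1) = (Ea/R) * (1/T1 - 1/T2)."""
    return R * math.log(k2 / k1) / (1.0 / t1_k - 1.0 / t2_k)

# Hypothetical HDN conversions at fixed LHSV = 0.5 1/h (values assumed):
k1 = k_apparent(0.5, 0.60)   # at 641 K
k2 = k_apparent(0.5, 0.90)   # at 712 K
print(round(activation_energy(k1, 641.0, k2, 712.0) / 1000.0, 1), "kJ/mol")
```

With more than two temperatures, the same relation becomes a linear regression of ln k against 1/T.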

This study characterizes an extremely large gas resource located in low permeability, sandstone reservoirs of the Mesaverde group and Wasatch formation in the Uinta Basin, Utah. Total in-place resource is estimated at 395.5 Tcf. Via application of geologic, engineering and economic criteria, the portion of this resource potentially recoverable as reserves is estimated. Those volumes estimated include probable, possible and potential categories and total 3.8 Tcf as a mean estimate of recoverable gas for all plays considered in the basin. Two plays were included in this study and each was separately analyzed in terms of its tight gas resource, established productive characteristics and future reserves potential based on a constant $2/Mcf wellhead gas price scenario. A scheme has been developed to break the overall resource estimate down into components that can be considered as differing technical and economic challenges that must be overcome in order to exploit such resources; in other words, to convert those resources to economically recoverable reserves. About 82.1% of the total evaluated resource is contained within sandstones that have extremely poor reservoir properties with permeabilities considered too low for commerciality using current frac technology.

Two thermodynamic "paradoxes" of black hole physics are re-examined. The first is that there is a thermal instability involving two coupled blackbody cavities containing two black holes, and the second is that a classical black hole can swallow up entropy in the form of ambient blackbody photons without increasing its mass. The resolution of the second paradox by Bekenstein and by Hawking is revisited. The link between Hawking radiation and Wigner's superluminal tunneling time is discussed using two equivalent Feynman diagrams, and Feynman's re-interpretation principle.

There exists a paradox in quantum field theory: substituting a field configuration which solves a subset of the field equations into the action and varying it is not necessarily equivalent to substituting that configuration into the remaining field equations. We take the $S^4$ and Freund-Rubin-like instantons as two examples to clarify the paradox. One must match the specialized configuration field variables with the corresponding boundary conditions by adding appropriate Legendre terms to the action. Some comments are made regarding exceptional degenerate cases.

Key natural gas reserves in Rocky Mountain and other U.S. basins are in reservoirs whose economic producibility is due to natural fractures. In this project, we evaluate a unique technology for predicting fractured reservoir location and characteristics ahead of drilling based on a 3-D basin/field simulator, Basin RTM. Recommendations are made for making Basin RTM a key element of a practical E&P strategy. A myriad of reaction, transport, and mechanical (RTM) processes underlie the creation, cementation and preservation of fractured reservoirs. These processes are often so strongly coupled that they cannot be understood individually. Furthermore, sedimentary nonuniformity, overall tectonics and basement heat flux histories make a basin a fundamentally 3-D object. Basin RTM is the only 3-D, comprehensive, fully coupled RTM basin simulator available for the exploration of fractured reservoirs. Results of Basin RTM simulations are presented that demonstrate its capabilities and limitations. Furthermore, it is shown how Basin RTM is a basis for a revolutionary automated methodology for simultaneously using a range of remote and other basin datasets to locate reservoirs and to assess risk. Characteristics predicted by our model include reserves and composition, matrix and fracture permeability, reservoir rock strength, porosity, in situ stress and the statistics of fracture aperture, length and orientation. Our model integrates its input data (overall sedimentation, tectonic and basement heat flux histories) via the laws of physics and chemistry that describe the RTM processes to predict reservoir location and characteristics. Basin RTM uses 3-D, finite element solutions of the equations of rock mechanics, organic and inorganic diagenesis and multi-phase hydrology to make its predictions.
As our model predicts reservoir characteristics, it can be used to optimize production approaches (e.g., assess the stability of horizontal wells or vulnerability of fractures to production-induced formation pressure drawdown). The Piceance Basin (Colorado) was chosen for this study because of the extensive set of data provided to us by federal agencies and industry partners, its remaining reserves, and its similarities with other Rocky Mountain basins. We focused on the Rulison Field to test our ability to capture details in a well-characterized area. In this study, we developed a number of general principles, including (1) the importance of even subtle flexure in creating fractures; (2) the tendency to preserve fractures due to the compressibility of gases; (3) the importance of oscillatory fracture/flow cycles in the expulsion of natural gas from source rock; and (4) that predicting fractures requires a basin model that is comprehensive, fully coupled, and fully 3-D. A major difficulty in using Basin RTM or another basin simulator has been overcome in this project; we have set forth an information theory technology for automatically integrating basin modeling with classical database analysis; this technology also provides an assessment of risk. We have created a relational database for the Piceance Basin. We have developed a formulation of devolatilization shrinkage that integrates organic geochemical kinetics into incremental stress theory, allowing for the prediction of coal cleating and associated enhancement of natural gas expulsion from coal. An estimation of the potential economic benefits of the technologies developed or recommended here is set forth. All of the above findings are documented in this report.

Coal is one of the richest known sources of hydrocarbons. This heterogeneous material has the unique characteristic of being both a source and a reservoir of natural gas. By virtue of their maturation to high rank, some coals have the capacity to generate more than 8,000 ft³ of methane per ton of coal. Although most of this gas eventually has been lost, over 400 trillion ft³ remains in place in US coal basins. The Potential Gas Committee has estimated that at least 90 trillion ft³ likely are recoverable. Coal-bed methane exploration requires application of both coal geology and petroleum geology as well as nonconventional approaches to reservoir engineering. With advanced technologies developed largely through cooperative efforts of the Gas Research Institute and industry, researchers and explorationists are better understanding the geological and engineering peculiarities of coal reservoirs. Commercial coal-bed methane development occurs basically in two diverse geologic settings: (1) thin, shallow coals of Pennsylvanian age in the Black Warrior and Appalachian basins and (2) thicker, deeper coals of Cretaceous age in the Rocky Mountains, principally the San Juan, Piceance, Raton, and Green River basins. Recent exploration has targeted shallow, anomalously thick but lower-rank, low-gas-content Tertiary coals in Wyoming. Coal basins in Washington, British Columbia, and Alberta also show potential. Methane in coal beds is an immense, virtually untapped source of environmentally acceptable, pipeline-quality energy. In light of increasing demand for natural gas, coal-bed methane is becoming an economically viable, low-risk exploratory and development objective.
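Gas-in-place figures like those quoted above follow from simple volumetrics: coal tonnage times measured gas content. A minimal sketch, in which the acreage, thickness, in-situ density, and gas content are assumed example values rather than figures from the abstract:

```python
def coal_tons(area_acres, thickness_ft, density_ton_per_acre_ft=1800.0):
    """Coal tonnage from area, net coal thickness, and in-situ density
    (~1,800 short tons per acre-foot is a common bituminous figure)."""
    return area_acres * thickness_ft * density_ton_per_acre_ft

def gas_in_place_scf(tons, gas_content_scf_per_ton):
    """Coal-bed gas in place: tonnage times desorbed gas content."""
    return tons * gas_content_scf_per_ton

# One section (640 acres) of 20-ft net coal holding 400 scf/ton:
tons = coal_tons(640.0, 20.0)
print(f"{gas_in_place_scf(tons, 400.0):.3e} scf")  # 9.216e+09 scf
```

Summing such section-by-section estimates over a basin, with mapped thickness and rank-dependent gas content, is how basin totals on the order of trillions of cubic feet are built up.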

Do dynamics models of higher complexity necessarily lead to better predictions? Christophe Ancey, École Polytechnique. Paradoxically, the substantial increase in model complexity can lead us to lose sight of the empirical nature of the assumptions used to build the models. Human expertise should still be of paramount importance.

Protein Folding. Jörg Enderlein. Protein folding, that is, the organization of proteins into a highly folded structure. To solve Levinthal's paradox it was postulated that proteins fold along specific pathways (older view), or follow one of many parallel paths down the now famous protein folding funnel

Regulating reproduction in India's population: Efforts, Results and Recommendations. By K., Washington University, St. Louis, MO 63130. India's population growth is a paradox. In 1952, India became the first country in the world to institute a national policy to limit population, and the central government has pursued

, and alleviating it reach out to issues such as wealth distribution, social cohesion patterns, and power structure economic growth by promoting a people-centered approach to development whereby people are both the agents attempting at making up people (Hacking, 2000), paradoxically engineering the molding of a new society

On the Security of the CCM Encryption Mode and of a Slight Variant. Pierre-Alain Fouque and Gwena.Valette@dga.defense.gouv.fr. Abstract: In this paper, we present an analysis of the CCM mode of operation and of a slight variant. An important fact is that, while the privacy of CCM is provably guaranteed up to the birthday paradox

, Australia) Title The Paradox of Measurable Counterfactuals: Performing the efficiency of emissions trading Abstract This paper places the more or less uncontested history of emissions trading, beginning with Ronald a century earlier. Of particular relevance is the claim that emissions trading schemes are not only more

I discuss various thoughts, old and new, about the cosmological constant (or dark energy) paradox. In particular, I suggest the possibility that the cosmological ``constant'' may decay as $\Lambda \sim \alpha^2 m_N^3 / \tau$, where $\tau$ is the age of the universe.

"To sense" or "not to sense" in energy-efficient power control games Maël Le Treust Laboratoire des corresponds to a compact power control game). The sensing game is shown to be a weighted potential game, and then playing the power control game. This is an interesting Braess-type paradox to be aware of for energy

1993, Leibniz referred to what we call the infinitesimal calculus as "the calculus of ... finite mind to apprehend God is inherently paradoxical, has now been ... but I don't believe it!," and which happens to have a pleasant melody.

agricultural and forestry practices. By drawing down the CO2 amount we can not only avert catastrophic few years, paradoxically, carry both bad news and good news. The enclosed paper, "Target Atmospheric. The good news is that it is still feasible to solve the problem, to reduce CO2 emissions over coming

This document is a guide for use by the Tank Waste Remediation System (TWRS) Information Locator Database (ILD) System Administrator. The TWRS ILD System is an inventory of information used in the TWRS Systems Engineering process to represent the TWRS Technical Baseline. The inventory is maintained in the form of a relational database developed in Paradox 4.5.

of what to remember and how to do so was the subject of vehement debate. This paradox, of an attempt to create and fix a version of the past by a regime that simultaneously wanted to deny large swathes of history, has long been recognised. From Edgar...

The destruction of quantum coherence can pump energy into a system. For our examples this is paradoxical since the destroyed correlations are ordinarily considered negligible. Mathematically the explanation is straightforward and physically one can identify the degrees of freedom supplying this energy. Nevertheless, the energy input can be calculated without specific reference to those degrees of freedom.

of compressibility, gravity, and some mass transfer (using at least a black-oil system). Further, modern simulators problems. Paradoxically, the equations that describe incompressible flow problems are simpler than those must be based on methods which work for the incompressible case. DIFFERENTIAL EQUATIONS Basic

and also for the rare earth elements Gd, Dy and Tb at various temperatures. In addition, the magnetic, such as that of Brown's paradox [5]. Micromechanical cantilevers used in atomic force microscopy [6] (AFM) are highly sensitive force and torque sensors and therefore ideal tools for detecting magnetic properties of small

We study the recently proposed "stationary measure" in the context of the string landscape scenario. We show that it suffers neither from the "Boltzmann brain" problem nor from the "youngness" paradox that makes some other measures predict a high CMB temperature at present. We also demonstrate a satisfactory performance of this measure in predicting the results of local experiments, such as proton decay.

, and the gap between safety-critical and best-effort engineering practices. We call for a coherent scientific to the electrical engineers, because computation and software are integral parts of embedded systems. Indeed, the shortcomings of current design, validation, and maintenance processes make software, paradoxically, the most

reasoning, Valuation of information technology. © 1994 Mark E. Nissen. Productivity Paradox of IT through the business environment, this need for useful IT-valuation measures will certainly grow through time; moreover, to the extent that measures for IT valuation fail to capture the "true" economic bene

to Napoleon" Thursday 13 November-Sunday 16 November 2014 At the Karl Anatol Center, California State University, Long Beach. Napoleon Bonaparte (1769-1821) remains a fascinating, ambivalent, and polarizing, Napoleon is one of the most paradoxical inspirations ever to ride the horse of (literary and art) history

form is overcome in their theory, but we can still observe some sign of the survival of the paradox. We ... ON THE THREE AXIOMS OF GENERAL DESIGN THEORY. Makoto Kikuchi, Department of Computer and Systems. Yoshikawa's General Design Theory is an axiomatic theory of design in which design is formulated and dis

. 2 Reframing the Statistical Problem Define a reference set, R, as a set of documents, each of which keywords in ad hoc ways, given the lack of formal statistical methods to help. Paradoxically, this often language to evade authorities, seek political advantage, or express creativity; generic web searching; e

in a kitchen chopping up suet" What is the benefit of reading history alongside literature and vice versa intense in the 16th and 17th centuries. Reading history alongside literature and vice versa is not merely... Virginia Woolf presented us with the perplexing paradox of women in history in 'A Room of One's Own': 'She

, the heat capacity of the undercooled liquid is higher than that of the crystalline phase (or phases) ... entropy. Below this temperature, in order to avoid the Kauzmann paradox [3], a drop in heat capacity has to occur; the amorphous alloy is formed and exhibits a heat capacity comparable with that of the stable

typically exhibit little difference between liquid and crystal heat capacities Cp , and tend to have of the latter have liquid heat capacities that are significantly larger than the corresponding crystal values of the so-called "Kauzmann paradox". 5 The supercooled-liquid versus crystal heat capacity discrepancy, when

Warming 4. Jet Plane Traffic Enhancing Global Warming 5. Disappearing Trees and Bushes 6. Peace for Humans. 2.1. Everyone Is A Specialist 2.2. The Paradox of This New Century 2.3. Changes Wrought by Technology 3. Global ... mysterious at that time and our ancestors living in India tried to understand the causes for these mysteries

In general relativity, closed timelike curves can break causality with remarkable and unsettling consequences. At the classical level, they induce causal paradoxes disturbing enough to motivate conjectures that explicitly prevent their existence. At the quantum level, resolving such paradoxes induces radical benefits - from cloning unknown quantum states to solving problems intractable to quantum computers. Instinctively, one expects these benefits to vanish if causality is respected. Here we show that in harnessing entanglement, we can efficiently solve NP-complete problems and clone arbitrary quantum states - even when all time-travelling systems are completely isolated from the past. Thus, the many defining benefits of closed timelike curves can still be harnessed, even when causality is preserved. Our results unveil the subtle interplay between entanglement and general relativity, and significantly improve the potential of probing the radical effects that may exist at the interface between relativity and quantum theory.

The superposition principle is at the heart of quantum mechanics and at the root of many paradoxes arising when trying to extend its predictions to our everyday world. Schroedinger's cat is the prototype of such paradoxes and here, in contrast to many others, we choose to investigate it from the operational point of view. We experimentally demonstrate a universal strategy for producing an unambiguously distinguishable type of superposition, that of an arbitrary pure state and its orthogonal. It relies on only a limited amount of information about the input state to first generate its orthogonal one. Then, a simple change in the experimental parameters is used to produce arbitrary superpositions of the mutually orthogonal states. Constituting a sort of Schroedinger's black box, able to turn a whole zoo of input states into coherent superpositions, our scheme can produce arbitrary continuous-variable optical qubits, which may prove practical for implementing quantum technologies and measurement tasks.

The Einstein-Podolsky-Rosen (EPR) paradox established a link between entanglement and nonlocality in quantum mechanics. EPR steering is the nonlocality associated with the EPR paradox and has traditionally only been investigated between two parties. Here, we present the first experimental observations of multipartite EPR steering, and of the genuine tripartite continuous variable entanglement of three mesoscopic optical systems. We explore different linear optics networks - each one with optimised asymmetries - that create multipartite steerable states containing different numbers of quantised optical modes (qumodes). By introducing asymmetric loss on a 7-qumode state, we characterize 8 regimes of directional steering, showing that N + 1 regimes exist for an N-qumode state. Further, we reveal the directional monogamy of steering, and experimentally demonstrate continuous variable one-sided semi device-independent quantum secret sharing. Our methods establish principles for the development of multiparty quantum communication protocols with asymmetric observers, and can be extended to qubits, whether photonic, atomic, superconducting, or otherwise.

Social behaviors are often contagious, spreading through a population as individuals imitate the decisions and choices of others. A variety of global phenomena, from innovation adoption to the emergence of social norms and political movements, arise as a result of people following a simple local rule, such as copy what others are doing. However, individuals often lack global knowledge of the behaviors of others and must estimate them from the observations of their friends' behaviors. In some cases, the structure of the underlying social network can dramatically skew an individual's local observations, making a behavior appear far more common locally than it is globally. We trace the origins of this phenomenon, which we call "the majority illusion," to the friendship paradox in social networks. As a result of this paradox, a behavior that is globally rare may be systematically overrepresented in the local neighborhoods of many people, i.e., among their friends. Thus, the "majority illusion" may facilitate the ...
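The friendship-paradox mechanism behind the "majority illusion" can be made concrete with a small computation. The toy network below is invented for illustration (it is not from the study's data): two well-connected "hub" nodes carry a globally rare behavior, yet most nodes see it as the majority among their friends.

```python
# Hedged sketch: the friendship paradox and the "majority illusion"
# on a tiny hand-built undirected network (structure is illustrative).

network = {              # adjacency list
    "a": ["h1", "h2"],
    "b": ["h1", "h2"],
    "c": ["h1", "h2"],
    "h1": ["a", "b", "c"],
    "h2": ["a", "b", "c"],
}
active = {"h1", "h2"}    # the behavior is globally rare: 2 of 5 nodes

# Friendship paradox: the average degree of a random friend exceeds
# the average degree of a random node.
degrees = {v: len(nbrs) for v, nbrs in network.items()}
mean_degree = sum(degrees.values()) / len(degrees)
friend_degrees = [degrees[f] for nbrs in network.values() for f in nbrs]
mean_friend_degree = sum(friend_degrees) / len(friend_degrees)

# Majority illusion: fraction of nodes for whom the rare behavior
# is the majority among their friends.
fraction_seeing_majority = sum(
    1 for v, nbrs in network.items()
    if sum(f in active for f in nbrs) > len(nbrs) / 2
) / len(network)

# Only 40% of nodes are active, yet 60% of nodes see an active majority.
print(mean_degree, mean_friend_degree, fraction_seeing_majority)
```

Because the active nodes are also the high-degree nodes, they are overrepresented in every low-degree node's neighborhood, which is exactly the skew the abstract describes.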

We want to understand whether, and to what extent, the maximal (Carnot) efficiency for heat engines can be reached at finite power. To this end we generalize the Carnot cycle so that it is not restricted to slow processes. We show that for realistic (i.e., not purposefully designed) engine-bath interactions, the work-optimal engine performing the generalized cycle close to the maximal efficiency has a long cycle time and hence vanishing power. This aspect is shown to relate to the theory of computational complexity. A physical manifestation of the same effect is Levinthal's paradox in the protein folding problem. The resolution of this paradox for realistic proteins allows one to construct engines that can extract, at finite power, 40% of the maximally possible work while reaching 90% of the maximal efficiency. For purposefully designed engine-bath interactions, the Carnot efficiency is achievable at large power.

In an apparently unexplored region of relativistic spacetime, a simple thought experiment demonstrates that conjoined Lorentz transformations predict that a proper clock at rest will run backwards, a prediction that violates the logical principle of causality. Shown first in a modification of the standard clock paradox thought experiment, this fault carries over to finite accelerations of the moving observer. After re-examination of the standard clock paradox, a logical fault was also found in the concept of spacetime. A two-dimensional treatment of the Earth's orbit predicts that our astronomers should measure proper time on distant variable objects in our own Galaxy as impossibly running backwards on approach-then-recede trajectories. The excellent record of relativity aside, we still have much new physics to learn about our spatially three-dimensional universe. It is suggested that space is not a freely stretching medium but is something that is substantive and is being produced.

is an economic paradox. The traditional view of economic development suggests that Namibia has a comparative advantage in the production of beef and, yet, the country’s production and exportation of beef has enjoyed only modest growth since 2000. This is best... illustrated by the underutilization of national export abattoirs and the failure to meet EU export quotas. Existing literature has only begun to study the reasons for this phenomenon. Some explanations suggest low producer prices are the source...

This article revisits the historiography of the problem of inertial frames. Specifically, the case of the twins in the clock paradox is considered to see that some resolutions implicitly assume inertiality for the non-accelerating twin. If inertial frames are explicitly identified by motion with respect to the large scale structure of the universe, it makes it possible to consider the relative inertiality of different frames.

" does not include ' Bertrand Russell makes use of these interpretive distinctions and anticipates many of their consequences in his seminal paper on vagueness in 1923. According to Keefe and Smith (1996, 1) the theory of supervaluations was first...-paradoxical knowledge must be precisely formulated. Bertrand Russell, for example, writes that most of the problems of formulating knowledge could be most adequately addressed from within the physical sciences, and that "science is perpetually trying to substitute...

called EH). Received: 8 August 1996. Ernst & Young Professor, Director, Ernst & Young Center for Auditing Research and Advanced Technology, Division of Accounting and Information Systems, School of Business, The University of Kansas, Lawrence, KS 66045, USA.... This rule is equivalent to the "minimax" rule in a two-person zero-sum game. It is interesting to note that the proposition developed in the article not only explains Ellsberg's paradox but also models correctly all the behaviors observed by EH. The remaining...

Four might be it. Chapter Five examines A Modest Confutation (1642) and Milton's answer to it, An Apology Against a Pamphlet (1642). The argument in this debate moves away from the particulars of church discipline toward the personal agendas... plans. The paradox of how Milton proposed to connect with an often conservative populist audience, especially in light of the radicalism of much of his intellectual agenda, has just begun to receive its fair share of scholarly attention. Milton...

for a higher transconductance and cutoff frequency which is a paradox. Lower channel resistance results in higher depletion region capacitance and vice versa. Dacey and Ross kept the channel resistance high in order to keep the pinch-off potential... Effect Transistor, JFET, has seen many improvements. Several have been the result of silicon technology which has taken noisy discrete JFETs and turned them into high yield, integrated devices. Finer line widths and proper optimization have further...

liberal organizations provide more education and health services, accounting perhaps for the paradoxical co-existence of a more prosperous economy and more service provision by religious organizations. We emphasize also that counter to some historical... can think of this say as education, health, employment, or other services that arise from membership of the organization). Again we think the poor will value these services more than the rich. In classic economic terms, this creates a game...

open meetings at MLA 2009 (see item 5 above). 3. The James Holly Hanford Award for a distinguished book recognized the excellence of Gordon Campbell, Thomas N. Corns, John K. Hale, and Fiona J. Tweedie, Milton and the Manuscript of De Doctrina... the excellence of Milton and Toleration, ed. Sharon Achinstein and Elizabeth Sauer (Oxford: Oxford University Press, 2007). 5. The James Holly Hanford Essay Awards recognize the excellence of John Creaser, "Service is Perfect Freedom": Paradox and Prosodic...

The study of determinism in the theories of physics is considered. Based on fundamental postulates of physics, it is proved that the evolution of the universe is univocally determined, ultimately implying that free will does not exist. In addition, some contradictions and weaknesses of quantum mechanics are presented, suggesting paradoxes in the theory. Some consequences of the postulates for justice and ethics are also analyzed.

the interrelated symbolic evidence for these processes within a brainwave, biofeedback company called the MindCenter. Three main perspectives are uncovered within the MindCenter's promotional literature which hinge upon the concept of agent and control.... Paradoxes abound when the reasoning within each perspective is examined. This thesis provides a glimpse into the complexity between the concepts: technology, organizations, and individuals, and may provide heuristic ground work for future studies...

incentive to deviate, will also find a lower incentive to implement a punishment for a possible deviant from the co-operative agreement. A possible escape from such a paradox may be found in the construction of more articulate punishment strategies, able... of the community of internet users, calculated by taking into account the quality of interconnection between them. The costs borne by a provider, instead, are assumed to increase with the number of both its users...

The theme of the conference was 'The paradox: today's coal technologies versus tomorrow's promise'. The sessions covered: today's technologies, tomorrow's potential; economic stability; energy security; transition to sustainable energy future; new coal power technologies leading to zero emission coal; existing power plants - improved performance through use of new technology; and carbon capture and storage R & D - challenges and opportunities. Some of the papers only consist of the viewgraphs/overheads.

A strategy for playing the game of roulette is presented in this paper. The strategy is based on the same probabilistic argument that leads to the well-known birthday paradox in probability theory. Following the strategy, a player will have a positive expected gain per spin, as well as in the long run, despite the fact that the pay-off ratios in roulette favor the House.
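The birthday-paradox argument the abstract invokes is easy to reproduce numerically. The sketch below is a generic probability computation, not the paper's actual betting strategy: it asks how many spins of a 37-pocket (European) wheel are needed before a repeated number becomes more likely than not.

```python
# Hedged sketch: the birthday-paradox probability underlying the strategy.
# prob_repeat is a standard computation; the pocket count 37 assumes a
# European wheel. The paper's betting rules are not reproduced here.

def prob_repeat(n_spins: int, pockets: int = 37) -> float:
    """P(at least one repeated outcome among n_spins independent spins)."""
    p_all_distinct = 1.0
    for k in range(n_spins):
        p_all_distinct *= (pockets - k) / pockets
    return 1.0 - p_all_distinct

# As with the classic 23-people birthday result for 365 days, far fewer
# spins than 37 are needed before a repeat is more likely than not.
threshold = next(n for n in range(1, 38) if prob_repeat(n) > 0.5)
print(threshold, round(prob_repeat(threshold), 3))
```

For a 37-pocket wheel the threshold falls at 8 spins, mirroring how the birthday paradox makes collisions appear much earlier than intuition suggests.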

The gravitational effect of vacuum polarization in space exterior to a particle in (2+1)-dimensional Einstein theory is investigated. In the weak field limit this gravitational field corresponds to an inverse square law of gravitational attraction, even though the gravitational mass of the quantum vacuum is negative. The paradox is resolved by considering a particle of finite extension and taking into account the vacuum polarization in its interior.

The quantization of the gravitational Chern-Simons coefficient is investigated in the framework of $ISO(2,1)$ gauge gravity. Some paradoxes involved are cured. The resolution is largely based on the inequivalence of $ISO(2,1)$ gauge gravity and the metric formulation. Both the Lorentzian scheme and the Euclidean scheme lead to the coefficient quantization, which means that the induced spin is not quite exotic in this context.

We extend the worldline measure for pocket formation in eternal inflation to allow for time-ordered bubble formation. Such a time-ordering is equivalent to imposing a preferred time-slicing on the "parent" de Sitter space. Using this measure, we describe a covariant version of the youngness paradox and show that the youngness paradox is a gauge artifact if the parent spacetime is an unbroken de Sitter space, due to the lack of an explicit time-ordering for the bubble nucleation events. We then show that one can add a "clock" to the de Sitter space, in the form of a vector field with a spontaneously broken symmetry that defines a unique timelike direction accessible to all observers. Once this is done, the existence of a preferred slicing means that the youngness paradox cannot be easily resolved. We use this to elucidate the apparent "persistence of memory" discussed recently by Garriga, Guth and Vilenkin, for inflationary universes produced by bubble nucleation.

The Sapient Paradox is the apparently unexplainable time delay of several tens of thousands of years between the arrival of Homo sapiens in Asia and Europe and the introduction of impressive innovations with the agricultural revolution. Renfrew (2007) has suggested that the solution of the paradox has to do with changes in modes of thought that occurred with sedentism. According to Renfrew, this is a subject of study for cognitive archaeology, where the final goal would be to understand the formation of the human mind. Several scholars, however, affirm that climatic change was crucial to such a revolution, as it would have been very difficult to develop agriculture during the Palaeolithic. In other words, sedentism was not justified during the ice age, and that may be the solution to the paradox. It is widely accepted that climate variations were due to so-called orbital forcing, the slow periodic changes of the orbital parameters of the Earth (known also as the Milankovitch theory). These and other astronomical e...

The thought experiment (called the clock paradox or the twin paradox) proposed by Langevin in 1911, of two observers, one staying on Earth and the other making a trip toward a star at a velocity near that of light, is very well known for its surprising result: when the traveler comes back, he is younger than the one who stayed on Earth. This astonishing situation, deduced from the theory of special relativity, sparked such a huge number of articles, discussions and controversies that it remains a particular phenomenon, probably unique in physics. We propose to study it. First we looked for the simplest solutions, observing that the published solutions correspond in fact to two different versions of the experiment. It appears that the complete and simple solution of Moller is neglected in favor of complicated methods of dubious validity. We interpret this avalanche of works as reflecting the difficulty of accepting the conclusions of special relativity, in particular the difference in times indicated by two clocks, one immobile and the second moving and finally stopping. We also suggest that the name "twin paradox" may be related to some subconscious idea concerning conflict between twins, as found in the Bible and in several mythologies.
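The time-dilation arithmetic at the heart of the twin (clock) paradox can be sketched in a few lines. The numbers below are illustrative, not taken from Langevin's original example, and the acceleration phases of the trip are idealized away.

```python
# Hedged sketch: special-relativity time dilation for an idealized
# out-and-back trip at constant speed (turnaround treated as instantaneous).

import math

def traveler_age(earth_years: float, beta: float) -> float:
    """Proper time elapsed for the traveler, given the Earth-frame
    duration and the speed as a fraction of c."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)  # Lorentz factor
    return earth_years / gamma

# A 10-year round trip (Earth frame) at 0.8c ages the traveler only 6 years.
print(traveler_age(10.0, 0.8))
```

The asymmetry the abstract discusses comes from the traveler's change of frame at turnaround; this one-formula sketch only shows why the elapsed proper times differ in magnitude.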

The Hamiltonian of classical anti-de Sitter gravity is a pure boundary term on-shell. If this remains true in nonperturbative quantum gravity then (i) boundary observables will evolve unitarily in time and (ii) the algebra of boundary observables is the same at all times. In particular, information available at the boundary at any one time t{sub 1} remains available at any other time t{sub 2}. Since there is also a sense in which the equations of motion propagate information into the bulk, these observations raise what may appear to be potential paradoxes concerning simultaneous (or spacelike separated) measurements of noncommuting observables, one at the asymptotic boundary and one in the interior. We argue that such potentially paradoxical settings always involve a breakdown of semiclassical gravity. In particular, we present evidence that making accurate holographic measurements over short time scales radically alters the familiar notion of causality. We also describe certain less intrinsically paradoxical settings which illustrate the above boundary unitarity and render the notion more concrete.

Since the discovery of Hawking radiation, its consistency with quantum theory has been widely questioned. In the widely described picture, irrespective of what initial state a black hole starts with before collapsing, it eventually evolves into a thermal state of Hawking radiation after the black hole is exhausted. This scenario violates the principle of unitarity required by quantum mechanics and leads to the so-called "information loss paradox". This paradox has become an obstacle, or a reversed touchstone, for any possible theory attempting to unify gravity and quantum mechanics. Based on the results from Hawking radiation as tunneling, we recently showed that Hawking radiation can carry off all information about the collapsed matter in a black hole. After discovering the existence of information-carrying correlation, we show in great detail that entropy is conserved for Hawking radiation based on standard probability theory and statistics. We claim that information previously considered lost remains hidden inside Hawking radiation; more specifically, it is encoded into correlations between Hawking radiations. Our study thus establishes harmony between Hawking radiation and the unitarity of quantum mechanics, a significant milestone toward resolving the long-standing information loss paradox. The paper provides a brief review of the exciting developments on Hawking radiation. In addition to summarizing our own work on this subject, we compare and address other related studies.

In 1996, Advanced Resources International (ARI) began performing R&D targeted at enhancing production and reserves from natural gas fields. The impetus for the effort was a series of field R&D projects in the early-to-mid 1990s, in eastern coalbed methane and gas shale plays, where well remediation and production enhancement had been successfully demonstrated. As a first step in the R&D effort, an assessment was made of the potential for restimulation to provide meaningful reserve additions to the U.S. gas resource base, and of what technologies were needed to do so. That work concluded that: (1) A significant resource base did exist via restimulation (multiples of Tcf). (2) The greatest opportunities existed in non-conventional plays where completion practices were (relatively) complex and technology advancement was rapid. (3) Accurate candidate selection is the greatest single factor contributing to a successful restimulation program. With these findings, a field-oriented program targeted at tight sand formations was initiated to develop and demonstrate successful candidate recognition technology. In that program, which concluded in 2001, nine wells were restimulated in the Green River, Piceance and East Texas basins, which in total added 2.9 Bcf of reserves at an average cost of $0.26/Mcf. In addition, it was found that in complex and heterogeneous reservoirs (such as tight sand formations), candidate selection procedures should involve a combination of fundamental engineering and advanced pattern recognition approaches, and that simple statistical methods for identifying candidate wells are not effective. In mid-2000, the U.S. Department of Energy (DOE) awarded ARI an R&D contract to determine if the methods employed in that project could also be applied to stripper gas wells.
In addition, the ability of those approaches to identify more general production enhancement opportunities (beyond only restimulation), such as via artificial lift and compression, was also sought. A key challenge in this effort was that, whereas the earlier work suggested that better (producing) wells tended to make better restimulation candidates, stripper wells are by definition low-volume producers (either due to low pressure, low permeability, or both). Nevertheless, the potential application of this technology was believed to hold promise for enhancing production for the thousands of stripper gas wells that exist in the U.S. today. The overall procedure for the project was to select a field test site, apply the candidate recognition methodology to select wells for remediation, remediate them, and gauge project success based on the field results. This report summarizes the activities and results of that project.

The U.S. Department of Energy (DOE) Office of Legacy Management developed this report as a guide for discussions with the Colorado State regulators and other interested stakeholders in response to increased drilling for natural gas reserves near the underground nuclear explosion site at Rulison, Colorado. The Rulison site is located in the Piceance Basin of western Colorado, 40 miles northeast of Grand Junction. The Rulison test was the second natural gas reservoir stimulation experiment in the Plowshare Program, which was designed to develop peaceful uses for nuclear energy. On September 10, 1969, the U.S. Atomic Energy Commission, a predecessor agency of DOE, detonated a 40-kiloton nuclear device 8426 feet below the ground surface in an attempt to release commercially marketable quantities of natural gas. The blast vaporized surrounding rock and formed a cavity about 150 feet in diameter. Although the contaminated materials from drilling operations were subsequently removed from the surface of the blast site, no feasible technology exists to remove subsurface radioactive contamination in or around the test cavity. An increase in drilling for natural gas near the site has raised concern about the possibility of encountering residual radioactivity from the area of the detonation. DOE prohibits drilling in the 40-acre lot surrounding the blast site at a depth below 6000 feet. DOE has no evidence that indicates contamination from the Rulison site detonation has migrated or will ever migrate beyond the 40-acre institutional control boundary. The Colorado Oil and Gas Conservation Commission (COGCC) established two wider boundaries around the site. When a company applies for a permit to drill within a 3-mile radius of surface ground zero, COGCC notifies DOE and provides an opportunity to comment on the application. COGCC also established a half-mile radius around surface ground zero. 
An application to drill within one-half mile requires a full hearing before the commission. This report outlines DOE's recommendation that gas developers adopt a conservative, staged drilling approach allowing gas reserves near the Rulison site to be recovered in a manner that minimizes the likelihood of encountering contamination. This staged approach calls for collecting data from wells outside the half-mile zone before drilling closer, and then drilling within the half-mile zone in a sequential manner, first at low contamination probability locations and then moving inward. DOE's recommended approach for drilling in this area will protect public safety while allowing collection of additional data to confirm that contamination is contained within the 40-acre institutional control boundary.

In order to predict the nature and distribution of natural fracturing, Advanced Resources Inc. (ARI) incorporated concepts of rock mechanics, geologic history, and local geology into a geomechanical approach for natural fracture prediction within mildly deformed, tight (low-permeability) gas reservoirs. Under the auspices of this project, ARI utilized and refined this approach in tight gas reservoir characterization and exploratory activities in three basins: the Piceance, the Wind River, and the Anadarko. The primary focus of this report is the knowledge gained on natural fracture prediction, along with practical applications for enhancing gas recovery and commerciality. Two broad categories of natural fractures are important to tight-formation gas production: (1) shear-related natural fractures and (2) extensional (opening-mode) natural fractures. Although the two types arise from different origins, this morphology-based differentiation is sometimes interrelated in practice. Predicting fracture distribution successfully is largely a function of collecting and understanding the available relevant data in conjunction with a methodology appropriate to the fracture origin. Initially ARI envisioned the geomechanical approach to natural fracture prediction as the use of elastic rock mechanics methods to project the nature and distribution of natural fracturing within mildly deformed, tight (low-permeability) gas reservoirs. Technical issues and inconsistencies encountered during the project prompted re-evaluation of these initial assumptions. ARI's philosophy for the geomechanical tools was one of heuristic development: field site testing followed by iterative enhancements. The technology and underlying concepts were refined considerably during the course of the project. As with any new tool, there was a substantial learning curve.
Through a heuristic approach, addressing these discoveries with additional software and concepts resulted in a stronger set of geomechanical tools. Thus, the outcome of this project is a set of predictive tools with broad applicability across low permeability gas basins where natural fractures play an important role in reservoir permeability. Potential uses for these learnings and tools range from rank exploration to field-development portfolio management. Early incorporation of the permeability development concepts presented here can improve basin assessment and direct focus to the high potential areas within basins. Insight into production variability inherent in tight naturally fractured reservoirs leads to improved wellbore evaluation and reduces the incidence of premature exits from high potential plays. A significant conclusion of this project is that natural fractures, while often an important, overlooked aspect of reservoir geology, represent only one aspect of the overall reservoir fabric. A balanced perspective encompassing all aspects of reservoir geology will have the greatest impact on exploration and development in the low permeability gas setting.

Research highlights: (1) The formation of unique side-to-side RAF dimers is required for full kinase activity. (2) RAF kinase inhibitors block MEK activation in cells containing oncogenic B-RAF. (3) RAF kinase inhibitors can lead to a paradoxical increase in RAF kinase activity. -- Abstract: A-RAF, B-RAF, and C-RAF are a family of three protein-serine/threonine kinases that participate in the RAS-RAF-MEK-ERK signal transduction cascade. This cascade participates in the regulation of a large variety of processes including apoptosis, cell cycle progression, differentiation, proliferation, and transformation to the cancerous state. RAS mutations occur in 15-30% of all human cancers, and B-RAF mutations occur in 30-60% of melanomas, 30-50% of thyroid cancers, and 5-20% of colorectal cancers. Activation of the RAF kinases requires their interaction with RAS-GTP along with dephosphorylation and also phosphorylation by SRC family protein-tyrosine kinases and other protein-serine/threonine kinases. The formation of unique side-to-side RAF dimers is required for full kinase activity. RAF kinase inhibitors are effective in blocking MEK1/2 and ERK1/2 activation in cells containing the oncogenic B-RAF Val600Glu activating mutation. RAF kinase inhibitors lead to the paradoxical increase in RAF kinase activity in cells containing wild-type B-RAF and wild-type or activated mutant RAS. C-RAF plays a key role in this paradoxical increase in downstream MEK-ERK activation.

We apply the Helmholtz program of basic measurements to relativistic motion. We define a spatiotemporal order by practical comparison: one object or process is "longer than" another if it covers the other. To express this value also numerically (how many times longer), we cover both with a locally regular grid of light clocks. We define basic measures from physical operations. Interrelating the measurement operations of different observers yields a genetic derivation of the formal Lorentz transformation. Operationally impracticable configurations for accelerating observers clarify the resolution of the apparent twin paradox. From simple measurement-methodical principles, without mathematical presuppositions, we derive all equations of relativistic kinematics (and subsequently the same for classical and relativistic dynamics).
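
The Lorentz transformation that the measurement operations are shown to generate is the standard one (quoted here for reference, not specific to this derivation):

```latex
x' = \gamma\,(x - vt), \qquad
t' = \gamma\left(t - \frac{vx}{c^{2}}\right), \qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
```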

about the complexity of ecclesiastical polity. Those who continue to urge a portrait of the period as the seedbed of secular liberty will, like his contemporaries, find Goodwin a formidable obstacle and paradox. D. F. McKenzie and Maureen Bell, eds. ... Review by RANDY ROBERTSON, SUSQUEHANNA UNIVERSITY. The Chronology and Calendar is a staggering achievement. Some years ago, D. F. McKenzie began to collect references to the book trade that he discovered in the Calendar of State Papers, Domestic...

The concept of classical indistinguishability is analyzed and defended against a number of well-known criticisms, with particular attention to the Gibbs paradox. Granted that it is as much at home in classical as in quantum statistical mechanics, the question arises as to why indistinguishability, in quantum mechanics but not in classical mechanics, forces a change in statistics. The answer, illustrated with simple examples, is that the equilibrium measure on classical phase space is continuous, whilst on Hilbert space it is discrete. The relevance of names, or equivalently, properties stable in time that can be used as names, is also discussed.
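
The change in statistics can be made concrete with a minimal counting sketch (an illustrative example, not taken from the paper): for two particles distributed over two one-particle states, distinguishable (classical) labels give four microstates, while forgetting the labels leaves only three occupation patterns, the Bose-Einstein count.

```python
from itertools import product

# Two particles over two one-particle states, labelled 0 and 1
micro = list(product(range(2), repeat=2))        # distinguishable particles
print(len(micro))                                # 4 (Maxwell-Boltzmann count)

# Forget the particle labels: only the occupation pattern remains
occupations = {tuple(sorted(m)) for m in micro}
print(len(occupations))                          # 3 (Bose-Einstein count)
```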

understand that the functions tti and ttr completely define the field, as well as the implications hydrodynamic theory has for aerodynamics. Fig. 2.4 A cylinder in an ideal perfect fluid flow. Consider a cylinder in an ideal perfect incompressible fluid... with circulation will be in a direction normal to the linear flow. Hydrodynamic theory was left somewhat incomplete by D'Alembert's paradox and the mathematical model of circulation to develop lift. The problem of the drag can be addressed in part by the lack...

The possibility of planetary mass black hole production by crossing entropy limits is addressed. Such a possibility is given by pointing out that two geophysical quantities have comparable values: first, Earth's total negative entropy flux integrated over geological time and, second, its extensive entropy bound, which follows as a tighter bound to the Bekenstein limit when entropy is an extensive function. The similarity between both numbers suggests that the formation of black holes from planets may be possible through a strong fluctuation toward thermodynamic equilibrium which results in gravothermal instability and final collapse. Briefly discussed are implications for the astronomical observation of low mass black holes and for Fermi's paradox.
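
The Bekenstein limit referred to above is the standard bound on the entropy of a system of total energy $E$ enclosed in a sphere of radius $R$ (the paper's "extensive entropy bound" is a tightening of this; only the standard form is quoted here):

```latex
S \;\le\; \frac{2\pi k_B E R}{\hbar c}
```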

Good quantum codes, such as quantum MDS codes, are typically nondegenerate, meaning that errors of small weight require active error-correction, which is--paradoxically--itself prone to errors. Decoherence free subspaces, on the other hand, do not require active error correction, but perform poorly in terms of minimum distance. In this paper, examples of degenerate quantum codes are constructed that have better minimum distance than decoherence free subspaces and allow some errors of small weight that do not require active error correction. In particular, two new families of [[n,1,>= sqrt(n)

Non-locality of the type first elucidated by Bell in 1964 is a difficult concept to explain to non-specialists and undergraduates. Here we attempt this by showing how such non-locality can be used to solve a problem in which someone might find themselves as the result of a collection of normal, even if somewhat unlikely, events. Our story is told in the style of a Sherlock Holmes mystery, and is based on Mermin's formulation of the "paradoxical" illustration of quantum non-locality discovered by Greenberger, Horne and Zeilinger.
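
The GHZ contradiction at the heart of Mermin's formulation can be checked numerically. The following sketch (illustrative, not from the paper) builds the three-qubit GHZ state and verifies that quantum mechanics assigns XXX the value +1 while XYY, YXY, and YYX each get -1, even though any local-realist assignment of values to the individual X and Y measurements would force the product of the four results to be +1.

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def kron3(a, b, c):
    """Three-qubit operator a (x) b (x) c."""
    return np.kron(np.kron(a, b), c)

# GHZ state (|000> + |111>)/sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

def expval(op):
    """Expectation value <GHZ| op |GHZ>."""
    return float(np.real(ghz.conj() @ op @ ghz))

print(expval(kron3(X, X, X)))  # +1.0
print(expval(kron3(X, Y, Y)))  # -1.0
print(expval(kron3(Y, X, Y)))  # -1.0
print(expval(kron3(Y, Y, X)))  # -1.0
```

Multiplying the four local-realist assignments gives each single-qubit value squared, hence +1, while the quantum product is (+1)(-1)(-1)(-1) = -1: the contradiction the story dramatizes.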

, claiming that his behavior was directly opposed to everything America stood for. At the heart of so much of the rhetorical bullying was this idea of "American." Nobody could decide what it meant or who had the most of it. Was it more American... "Rhetoric Reconsidered: Constitutive Paradoxes in G.W. Bush's Iraq War Speeches." Zagacki focuses on the idea of "prophetic dualism" that he claims guided Bush's rhetoric to the American people in an attempt to create identification between Americans...

, Moors and Christians: Festivals of Reconquest in Mexico and Spain, which appeared just as Fuchs's book was going to press, Max Harris examines the cultural legacy of such interplay and shows its survival into modern times. Fuchs goes further to show how... of America as a continuation of the reconquista, and approached the Amerindians as they did the Muslims and Jews of Spain: as enemies to be either converted or expelled. But as Fuchs carefully shows, the paradox of Spanish mimesis was never fully resolved...

. If the narrative in My Son was his own life story, God was, to use one of his own phrases, “an ornery bastard.” God exacts a price for living a life of leisure and lust; he takes away precious children with the stroke of a razor blade, and he visits gloom upon... tested and punished while on Earth. It is the paradox of faith that you must be both afraid of the wrath of God yet also forgiven for all your sins in the name of his son. This contrast of hope and resurrection with fear and anguish...

Quantum simulations of Bell inequality violations are numerically obtained using probabilistic phase space methods, namely the positive P-representation. In this approach the moments of quantum observables are evaluated as moments of variables that have values outside the normal eigenvalue range. There is thus a parallel with quantum weak measurements and weak values. Nevertheless, the representation is exactly equivalent to quantum mechanics. A number of states violating Bell inequalities are sampled, demonstrating that these quantum paradoxes can be treated with probabilistic methods. We treat quantum dynamics by simulating the time evolution of the Bell state formed via parametric down-conversion, and discuss multi-mode generalizations.
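
As a point of comparison for the sampled violations, the ideal quantum value of the CHSH combination can be computed directly from the textbook singlet-state correlation E(a, b) = -cos(a - b); this is a standard result, not the paper's phase-space method.

```python
import math

def E(a, b):
    """Singlet-state correlation for analyzer angles a, b (radians)."""
    return -math.cos(a - b)

# Standard CHSH settings: a = 0, a' = pi/2, b = pi/4, b' = 3pi/4
a, ap, b, bp = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S))  # 2*sqrt(2) ~= 2.828, exceeding the local-realist bound of 2
```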

be defined by its actions. As Fortune pointed out, $4,000 marked the point above which, instead of just subsisting, families began having choices in what they bought, and they did indeed exercise their options. 2 They bought processed foods, outdoor grills... of paradoxes that revolve around the fact that although Americans have access to an abundance of food, their relationship to food is one marked by anxiety instead of gratitude or relief. In tracing this idea, the book functions as an overview of the major...

These lecture notes are an elementary and pedagogical introduction to black hole evaporation, based on a lecture given by the author at the Ninth Modave Summer School in Mathematical Physics, and are intended for PhD students. First, quantum field theory in curved spacetime is studied and the tools needed for the remainder of the course are introduced. Then, quantum field theory in Rindler spacetime in 1+1 dimensions and in the spacetime of a spherically collapsing star are considered, leading to the Unruh and Hawking effects, respectively. Finally, some consequences such as black hole thermodynamics and the information loss paradox are discussed.
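
The thermodynamic quantities the notes arrive at are the standard Hawking temperature and Bekenstein-Hawking entropy of a Schwarzschild black hole of mass $M$ and horizon area $A$ (quoted for reference):

```latex
T_H = \frac{\hbar c^{3}}{8\pi G M k_B}, \qquad
S_{BH} = \frac{k_B c^{3} A}{4 G \hbar}
```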

DNA looping has been observed to enhance and suppress transcriptional noise but it is uncertain which of these two opposite effects is to be expected for given conditions. Here, we derive analytical expressions for the main quantifiers of transcriptional noise in terms of the molecular parameters and elucidate the role of DNA looping. Our results rationalize paradoxical experimental observations and provide the first quantitative explanation of landmark individual-cell measurements at the single molecule level on the classical lac operon genetic system [Choi et al., Science 322, 442-446 (2008)].

The work is an attempt to model a scenario of inflation in the framework of anti-de Sitter/conformal field theory duality, a potentially complete nonperturbative description of quantum gravity. We study bubble geometries with de Sitter interiors within an ambient Schwarzschild anti-de Sitter black hole spacetime and the properties of the corresponding states in the dual conformal field theory. It is argued that the viable bubble states can be identified with a subset of the black hole microstates. Consistency checks are performed and a number of implications regarding cosmology are discussed, including how the key problems or paradoxes of conventional eternal inflation are overcome in this scenario.

In the classical (non-quantum) relativity theory the course of a moving clock is dilated compared to that of a clock at rest (the Einstein dilation). Any unstable system may be regarded as a clock. The time evolution (e.g., the decay) of a uniformly moving physical system is considered using relativistic quantum theory. An example of a moving system is given whose evolution turns out to be sped up instead of dilated. A discussion of this paradoxical result is presented.

We propose a model for protein folding in vivo based on a Brownian-ratchet mechanism in the multidimensional energy landscape space. The device is able to produce directed transport taking advantage of the assumed intrinsic asymmetric properties of the proteins and employing the consumption of energy provided by an external source. Through such a directed transport phenomenon, the polypeptide finds the native state starting from any initial state in the energy landscape with great efficacy and robustness, even in the presence of different type of obstacles. This model solves Levinthal's paradox without requiring biased transition probabilities but at the expense of opening the system to an external field.

A conventional space-time diagram is an $r$-$ct$ diagram, which satisfies the Minkowski geometry. This geometry conflicts with intuition drawn from Euclidean geometry. In this work a Euclidean space-time diagram is proposed to describe relativistic world lines with an exact Euclidean geometry. Relativistic effects such as the dilation of moving clocks, the contraction of moving lengths, and the twin paradox can be expressed geometrically in the Euclidean space-time diagram. It is applied to the case of a satellite clock to correct for the gravitational effect. This Euclidean space-time diagram is found to be much more intuitive than the conventional space-time diagram.
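
The twin-paradox bookkeeping that such diagrams express can be checked numerically; a minimal sketch of the standard calculation (illustrative, not taken from the paper): proper time along a world line at constant speed v = beta*c is tau = t*sqrt(1 - beta^2).

```python
import math

def proper_time(t_coordinate, beta):
    """Proper time elapsed on a clock moving at v = beta * c,
    over coordinate time t_coordinate (same units as the input)."""
    return t_coordinate * math.sqrt(1.0 - beta**2)

# Out-and-back trip at 0.6c lasting 10 years of Earth time:
# the traveling twin ages only 8 years.
print(proper_time(10.0, 0.6))  # 8.0
```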

The process of cognition is analysed to adjust set theory to physical description. Postulates and basic definitions are revised. Specific sets of predicates, called presets, corresponding to the physical objects identified by an observer during cognition are introduced. Unlike sets, the presets are free of logical or set-theoretical paradoxes and may be consistently used in physical description. Schemes of cognition based on presets are considered. Because the relativistic and quantum theories are different logical systems, they and the observations of modern cosmology cannot be consistently combined in one `unified physical theory', but they can be within the introduced schemes of cognition.

In this addendum to [arXiv:1402.5674] two points are discussed. In the first, additional evidence is provided for a dual connection between the geometric length of an Einstein-Rosen bridge and the computational complexity of the quantum state of the dual CFTs. The relation between the growth of complexity and Page's "Extreme Cosmic Censorship" principle is also remarked on. The second point involves a gedanken experiment in which Alice measures a complete set of commuting observables at her end of an Einstein-Rosen bridge. An apparent paradox is resolved by appealing to the properties of GHZ tripartite entanglement.

A process model of quantum mechanics utilizes a combinatorial game to generate a discrete and finite causal space upon which can be defined a self-consistent quantum mechanics. An emergent space-time M and continuous wave function arise through a non-uniform interpolation process. Standard non-relativistic quantum mechanics emerges under the limit of infinite information (the causal space grows to infinity) and infinitesimal scale (the separation between points goes to zero). The model has the potential to address several paradoxes in quantum mechanics while remaining computationally powerful.

The development of an oil shale industry in northwestern Colorado and northeastern Utah has been forecast at various times since early this century, but the comparatively easy accessibility of other oil sources has forestalled development. Decreasing fuel supplies, increasing energy costs, and the threat of a crippling oil embargo finally may launch a commercial oil shale industry in this region. Concern for the possible impacts on the human environment has been fostered by experiences of rapid population growth in other western towns that have hosted energy resource development. A large number of studies have attempted to evaluate social and economic impacts of energy development and to determine important factors that affect the severity of these impacts. These studies have suggested that successful management of rapid population growth depends on adequate front-end capital for public facilities, availability of housing, attention to human service needs, long-range land use and fiscal planning. This study examines variables that affect the socioeconomic impacts of oil shale development. The study region is composed of four Colorado counties: Mesa, Moffat, Garfield and Rio Blanco. Most of the estimated population of 111 000 resides in a handful of urban areas that are separated by large distances and rugged terrain. We have projected the six largest cities and towns and one planned company town (Battlement Mesa) to be the probable centers for potential population impacts caused by development of an oil shale industry. Local planners expect Battlement Mesa to lessen impacts on small existing communities and indeed may be necessary to prevent severe regional socioeconomic impacts. Section II describes the study region and focuses on the economic trends and present conditions in the area. The population impacts analyzed in this study are contingent on a scenario of oil shale development from 1980-90 provided by the Department of Energy and discussed in Sec. III. 
We recognize that the rate of development, the magnitude of development, and the technology mix that will actually take place remain uncertain. Although we emphasize that other energy and mineral resources besides oil shale may be developed, the conclusions reached in this study reflect only those impacts that would be felt from the oil shale scenario. Socioeconomic impacts in the region reflect the uneven growth rate implied by the scenario and will be affected by the timing of industry developments, the length and magnitude of the construction phase of development, and the shift in employment profiles predicted in the scenario. The facilities in the southern portion of the oil shale region, those along the Colorado River and Parachute Creek, show a peak in the construction work force in the mid-1980s, whereas those facilities in the Piceance Creek Basin to the north show a construction peak in the late 1980s. Together, the facilities will require a large construction work force throughout the decade, with a total of 4800 construction workers required in 1985. Construction at the northern sites and second-phase construction in the south will require 6000 workers in 1988. By 1990, the operation work force will increase to 7950. Two important characteristics of oil shale development emerge from the work force estimates: (1) peak-year construction work forces will be 90-120% the size of the permanent operating work force; and (2) the yearly changes in total work force requirements will be large, as much as 900 in one year at one facility. To estimate population impacts on individual communities, we devised a population distribution method that is described in Sec. IV. Variables associated with the projection of population impacts are discussed and methodologies of previous assessments are compared.
Scenario-induced population impacts estimated by the Los Alamos method are compared to projections of a model employed by the Colorado West Area Council of Governments. Oil shale development in the early decade, as defined by the scenario, will produce growth primarily

Recovery of oil from oil shales and the natural primary migration of hydrocarbons are closely related processes that have received renewed interest in recent years because of the ever tightening supply of conventional hydrocarbons and the growing production of hydrocarbons from low-permeability tight rocks. Quantitative models for conversion of kerogen into oil and gas and the timing of hydrocarbon generation have been well documented. However, lack of consensus about the kinetics of hydrocarbon formation in source rocks, expulsion timing, and how the resulting hydrocarbons escape from or are retained in the source rocks motivates further investigation. In particular, many mechanisms have been proposed for the transport of hydrocarbons from the rocks in which they are generated into adjacent rocks with higher permeabilities and smaller capillary entry pressures, and a better understanding of this complex process (primary migration) is needed. To characterize these processes, it is imperative to use the latest technological advances. In this study, it is shown how insights into hydrocarbon migration in source rocks can be obtained by using sequential high-resolution synchrotron X-ray tomography. Three-dimensional images of several immature "shale" samples were constructed at resolutions close to 5 um. This is sufficient to resolve the source-rock structure down to the grain level, but very-fine-grained silt particles, clay particles, and colloids cannot be resolved. Samples used in this investigation came from the R-8 unit in the upper part of the Green River shale, which is an organic-rich, varved, lacustrine marl formed in Eocene Lake Uinta, USA. One Green River shale sample was heated in situ up to 400 degrees C as X-ray-tomography images were recorded. The other samples were scanned before and after heating at 400 degrees C. During the heating phase, the organic matter was decomposed, and gas was released.
Gas expulsion from the low-permeability shales was coupled with formation of microcracks. The main technical difficulty was numerical extraction of microcracks that have apertures in the 5- to 30-um range (with 5 um being the resolution limit) from a large 3D volume of X-ray attenuation data. The main goal of the work presented here is to develop a methodology to process these 3D data and image the cracks. This methodology is based on several levels of spatial filtering and automatic recognition of connected domains. Supportive petrographic and thermogravimetric data were an important complement to this study. An investigation of the strain field using 2D image correlation analyses was also performed. As one application of the 4D (space + time) microtomography and the developed workflow, we show that fluid generation was accompanied by crack formation. Under different conditions, in the subsurface, this might provide paths for primary migration.
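
The "automatic recognition of connected domains" step in the workflow above is, at its core, connected-component labeling of a thresholded attenuation volume followed by a size filter to discard speckle noise. The following self-contained 2D sketch (an illustration of the generic technique, not the authors' actual pipeline, which operates on large 3D volumes) labels 4-connected components by flood fill and keeps only domains large enough to be crack candidates.

```python
from collections import deque

def label_components(mask):
    """4-connected component labeling of a binary 2D grid (BFS flood fill).
    Returns (labels grid, number of components)."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not labels[r][c]:
                current += 1
                labels[r][c] = current
                q = deque([(r, c)])
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = current
                            q.append((ny, nx))
    return labels, current

# Toy "attenuation slice": 1 = candidate crack pixel after thresholding
mask = [
    [1, 1, 0, 0, 0],
    [0, 1, 0, 0, 1],
    [0, 0, 0, 0, 1],
    [1, 0, 0, 0, 1],
]
labels, n = label_components(mask)

# Size filter: keep connected domains with >= 3 pixels, drop isolated noise
sizes = {}
for row in labels:
    for v in row:
        if v:
            sizes[v] = sizes.get(v, 0) + 1
cracks = [lab for lab, s in sizes.items() if s >= 3]
print(n, cracks)  # 3 [1, 2]
```

In the real workflow the same idea runs in 3D (6- or 26-connectivity) after several levels of spatial filtering, with the size threshold chosen near the 5-um resolution limit.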

In October 2000, the U.S. Department of Energy, through contractor Advanced Resources International, launched a multi-year government-industry R&D collaboration called the Coal-Seq project. The Coal-Seq project is investigating the feasibility of CO{sub 2} sequestration in deep, unmineable coalseams, by performing detailed reservoir studies of two enhanced coalbed methane recovery (ECBM) field projects in the San Juan basin. The two sites are the Allison Unit, operated by Burlington Resources, and into which CO{sub 2} is being injected, and the Tiffany Unit, operated by BP America, into which N{sub 2} is being injected (the interest in understanding the N{sub 2}-ECBM process has important implications for CO{sub 2} sequestration via flue-gas injection). The purposes of the field studies are to understand the reservoir mechanisms of CO{sub 2} and N{sub 2} injection into coalseams, demonstrate the practical effectiveness of the ECBM and sequestration processes, develop an engineering capability to simulate them, and evaluate sequestration economics. In support of these efforts, laboratory and theoretical studies are also being performed to understand and model multi-component isotherm behavior, and coal permeability changes due to swelling with CO{sub 2} injection. This report describes the results of an important component of the overall project, applying the findings from the San Juan Basin to a national scale to develop a preliminary assessment of the CO{sub 2} sequestration and ECBM recovery potential of U.S. coalbeds. Importantly, this assessment improves upon previous investigations by (1) including a more comprehensive list of U.S. coal basins, (2) adopting technical rationale for setting upper-bound limits on the results, and (3) incorporating new information on CO{sub 2}/CH{sub 4} replacement ratios as a function of coal rank. Based on the results of the assessment, the following conclusions have been drawn: (1) The CO{sub 2} sequestration capacity of U.S.
coalbeds is estimated to be about 90 Gt. Of this, about 38 Gt is in Alaska (even after accounting for high costs associated with this province), 14 Gt is in the Powder River basin, 10 Gt is in the San Juan basin, and 8 Gt is in the Greater Green River basin. By comparison, total CO{sub 2} emissions from power generation plants are currently about 2.2 Gt/year. (2) The ECBM recovery potential associated with this sequestration is estimated to be over 150 Tcf. Of this, 47 Tcf is in Alaska (even after accounting for high costs associated with this province), 20 Tcf is in the Powder River basin, 19 Tcf is in the Greater Green River basin, and 16 Tcf is in the San Juan basin. By comparison, total CBM recoverable resources are currently estimated to be about 170 Tcf. (3) Between 25 and 30 Gt of CO{sub 2} can be sequestered at a profit, and 80-85 Gt can be sequestered at costs of less than $5/ton. These estimates do not include any costs associated with CO{sub 2} capture and transportation, and only represent geologic sequestration. (4) Several Rocky Mountain basins, including the San Juan, Raton, Powder River and Uinta appear to hold the most favorable conditions for sequestration economics. The Gulf Coast and the Central Appalachian basin also appear to hold promise as economic sequestration targets, depending upon gas prices. (5) In general, the 'non-commercial' areas (those areas outside the main play area that are not expected to produce primary CBM commercially) appear more favorable for sequestration economics than the 'commercial' areas. This is because there is more in-place methane to recover in these settings (the 'commercial' areas having already been largely depleted of methane).
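
As a back-of-envelope check of the headline numbers (all values taken from the text above), the estimated storage capacity corresponds to roughly four decades of current power-plant emissions:

```python
capacity_gt = 90.0         # estimated U.S. coalbed CO2 storage capacity, Gt
emissions_gt_per_yr = 2.2  # current U.S. power-plant CO2 emissions, Gt/year

years = capacity_gt / emissions_gt_per_yr
print(round(years, 1))  # 40.9
```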

The postulate that coordinate and momentum representations are related to each other by the Fourier transform has been accepted from the beginning of quantum theory by analogy with classical electrodynamics. As a consequence, an inevitable effect in standard theory is the wave packet spreading (WPS) of the photon coordinate wave function in directions perpendicular to the photon momentum. This leads to the following paradoxes: if the major part of photons emitted by stars are in wave packet states (which is the most probable scenario) then we should see not separate stars but only an almost continuous background from all stars; no anisotropy of the CMB radiation should be observable; data on gamma-ray bursts, signals from directional radio antennas (in particular, in experiments on Shapiro delay) and signals from pulsars show no signs of WPS. In addition, a problem arises as to why there are no signs of WPS for protons in the LHC ring. We argue that the above postulate is based neither on strong theoretical arguments nor on experimental data and propose a new consistent definition of the position operator. Then WPS in directions perpendicular to the particle momentum is absent and the paradoxes are resolved. Different components of the new position operator do not commute with each other and, as a consequence, there is no wave function in coordinate representation. Implications of the results for entanglement, quantum locality and the problem of time in quantum theory are discussed.
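
The WPS effect under discussion follows, in the standard theory, from the textbook free-particle spreading law sigma(t) = sigma0 * sqrt(1 + (hbar*t / (2*m*sigma0^2))^2). A minimal sketch for a massive particle (the massless photon case analyzed in the paper is treated differently, so this is only the familiar non-relativistic analogue):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def packet_width(sigma0, mass, t):
    """Width of a free Gaussian wave packet after time t
    (standard non-relativistic QM result)."""
    return sigma0 * math.sqrt(1.0 + (HBAR * t / (2.0 * mass * sigma0**2))**2)

# An electron packet initially localized to 1 nm spreads to tens of nm in 1 ps
m_e = 9.1093837015e-31  # electron mass, kg
print(packet_width(1e-9, m_e, 1e-12) / 1e-9)  # width in nm after 1 ps
```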

The process approach to NRQM offers a fourth framework for the quantization of physical systems. Unlike the standard approaches (Schrodinger-Heisenberg, Feynman, Wigner-Gronewald-Moyal), the process approach is not merely equivalent to NRQM and is not merely a re-interpretation. The process approach provides a dynamical completion of NRQM. Standard NRQM arises as an asymptotic quotient by means of a set-valued process covering map, which links the process algebra to the usual space of wave functions and operators on Hilbert space. The process approach offers an emergentist, discrete, finite, quasi-non-local and quasi-non-contextual realist interpretation which appears to resolve many of the paradoxes and is free of divergences. Nevertheless, it retains the computational power of NRQM and possesses an emergent probability structure which agrees with NRQM in the asymptotic quotient. The paper describes the process algebra, the process covering map for single systems and the configuration process covering map for multiple systems. It demonstrates the link to NRQM through a toy model. Applications of the process algebra to various quantum mechanical situations - superpositions, two-slit experiments, entanglement, Schrodinger's cat - are presented along with an approach to the paradoxes and the issue of classicality.

Radio waves propagating from distant pulsars through the interstellar medium (ISM) are refracted by electron-density inhomogeneities, so that the intensity of the observed pulses fluctuates with time. The theory relating the observed pulse time-shapes to the electron-density correlation function has been developed over 30 years; however, two puzzles have remained. First, the observed scaling of pulse broadening with pulsar distance is anomalously strong; it is consistent with the standard model only when non-uniform statistics of the electron fluctuations along the line of sight are assumed. Second, the observed pulse shapes are consistent with the standard model only when the scattering material is concentrated in a narrow slab between the pulsar and the Earth. We propose that both paradoxes are resolved at once if one assumes stationary and uniform, but non-Gaussian, statistics of the electron-density distribution. Such statistics must be of Lévy type, and the propagating ray should exhibit a Lévy flight. We propose that a natural realization of such statistics may be provided by an interstellar medium with random electron-density discontinuities. We develop a theory of wave propagation in such a non-Gaussian random medium and demonstrate its good agreement with observations. A qualitative introduction of the approach and the resolution of the anomalous-scaling paradox were presented earlier in [PRL 91, 131101 (2003); ApJ 584, 791 (2003)].
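As an illustrative aside (this code and its parameters are assumptions for exposition, not taken from the paper), the qualitative difference between Gaussian and Lévy-type statistics can be sketched in a few lines: with heavy-tailed step lengths, a single rare jump carries a macroscopic fraction of the total path, which is the mechanism behind the Lévy-flight ray wandering invoked above.

```python
import random

def step_lengths(n, heavy_tailed, alpha=1.2, seed=1):
    """Draw n step lengths: Gaussian magnitudes, or Pareto-distributed
    lengths with power-law tail ~ x^-(alpha+1), a stand-in for Levy-type
    statistics (illustrative choice, not the paper's model)."""
    rng = random.Random(seed)
    if heavy_tailed:
        return [rng.paretovariate(alpha) for _ in range(n)]
    return [abs(rng.gauss(0.0, 1.0)) for _ in range(n)]

n = 100_000
gauss = step_lengths(n, heavy_tailed=False)
levy = step_lengths(n, heavy_tailed=True)

# Fraction of the total path contributed by the single largest step:
# negligible for Gaussian steps, macroscopic for heavy-tailed steps,
# where rare large deflections dominate.
gauss_frac = max(gauss) / sum(gauss)
levy_frac = max(levy) / sum(levy)
```

For Gaussian steps the largest of 100,000 draws is only a few standard deviations, so its share of the path is tiny; for the heavy-tailed case the largest step grows like n^(1/alpha) and remains a visible fraction of the whole path.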

Uncertainty in economics still poses some fundamental problems illustrated, e.g., by the Allais and Ellsberg paradoxes. To overcome these difficulties, economists have introduced an interesting distinction between 'risk' and 'ambiguity' depending on the existence of a (classical Kolmogorovian) probabilistic structure modeling these uncertainty situations. On the other hand, evidence of everyday life suggests that 'context' plays a fundamental role in human decisions under uncertainty. Moreover, it is well known from physics that any probabilistic structure modeling contextual interactions between entities structurally needs a non-Kolmogorovian quantum-like framework. In this paper we introduce the notion of 'contextual risk' with the aim of modeling a substantial part of the situations in which usually only 'ambiguity' is present. More precisely, we firstly introduce the essentials of an operational formalism called 'the hidden measurement approach' in which probability is introduced as a consequence of fluctuations in the interaction between entities and contexts. Within the hidden measurement approach we propose a 'sphere model' as a mathematical tool for situations in which contextual risk occurs. We show that a probabilistic model of this kind is necessarily non-Kolmogorovian, hence it requires either the formalism of quantum mechanics or a generalization of it. This insight is relevant, for it explains the presence of quantum or, better, quantum-like, structures in economics, as suggested by some authors, and can serve to solve the aforementioned paradoxes.

One of the most fundamental features of a black hole in general relativity is its event horizon: a boundary from which nothing can escape. There has been a recent surge of interest in the nature of these event horizons and their local neighbourhoods. In an attempt to resolve the black hole information paradox(es) and, more generally, to better understand the path towards quantum gravity, firewalls have been proposed as an alternative to black hole event horizons. In this letter, we explore the phenomenological implications of black holes possessing a surface or firewall. We predict a potentially detectable signature of these firewalls in the form of a high energy astrophysical neutrino flux. We compute the spectrum of this neutrino flux in different models and show that it is a possible candidate for the source of the PeV neutrinos recently detected by IceCube. We further show that, independent of the generation mechanism, IceCube data can be explained (at $1\sigma$ confidence level) by conversion of accretion on...

We discuss the behavior of massive modes near a horizon based on a study of the dispersion relation and wave packet simulations of the Klein-Gordon equation. We point out an apparent paradox between two (in principle equivalent) pictures of black-hole evaporation through Hawking radiation. In the picture in which the evaporation is due to the emission of positive-energy modes, one immediately obtains a threshold for the emission of massive particles. In the picture in which the evaporation is due to the absorption of negative-energy modes, such a threshold apparently does not exist. We resolve this paradox by tracing the evolution of the positive-energy massive modes with an energy below the threshold. These are seen to be emitted and move away from the black-hole horizon, but they bounce back at a 'red horizon' and are reabsorbed by the black hole, thus compensating exactly for the difference between the two pictures. For astrophysical black holes, the consequences are curious but do not affect the terrestrial constraints on observing Hawking radiation. For analogue-gravity systems with massive modes, however, the consequences are crucial and rather surprising.

The purpose of this paper is to investigate the radiation process of a charged particle passing through an external periodic field in a dispersive medium. In the optical range of the spectrum we consider two cases: first, the source has no eigenfrequency, and second, the source has an eigenfrequency. In the first case, when Cherenkov radiation occurs, a non-zero eigenfrequency would produce a paradox for the Doppler effect; it is shown that the absence of the eigenfrequency resolves this paradox, known in the literature. Whether the process is normal (i.e. hard photons are radiated at small angles) or anomalous depends on the law of the medium's dispersion. When the source has an eigenfrequency, the Doppler effect can be either normal or anomalous. In the X-ray range of the oscillator radiation spectrum we have two photons radiated at the same angle: soft and hard. In this case the radiation obeys the so-called complicated Doppler effect, i.e. in the soft-photon region we have the anomalous Doppler effect and in the hard-photon region the normal Doppler effect.

Chirality occupies a central role in fields ranging from biological self-assembly to the design of optical metamaterials. The definition of chirality, as given by Lord Kelvin, associates chirality with the lack of mirror symmetry: the inability to superpose an object on its mirror image. While this definition has guided the classification of chiral objects for over a century, the quantification of handed phenomena based on this definition has proven elusive, if not impossible, as manifest in the paradox of chiral connectedness. In this work, we put forward a quantification scheme in which the handedness of an object depends on the direction in which it is viewed. While consistent with familiar chiral notions, such as the right-hand rule, this framework allows objects to be simultaneously right- and left-handed. We demonstrate this orientation dependence in three different systems - a biomimetic elastic bilayer, a chiral propeller, and an optical metamaterial - and find quantitative agreement with chirality pseudotensors whose form we explicitly compute. The use of this approach resolves the existing paradoxes and naturally enables the design of handed metamaterials from symmetry principles.

This paper illustrates various aspects of the ER=EPR conjecture. It begins with a brief heuristic argument, using the Ryu-Takayanagi correspondence, for why entanglement between black holes implies the existence of Einstein-Rosen bridges. The main part of the paper addresses a fundamental question: Is ER=EPR consistent with the standard postulates of quantum mechanics? Naively it seems to lead to an inconsistency between observations made on entangled systems by different observers. The resolution of the paradox lies in the properties of multiple black holes, entangled in the Greenberger-Horne-Zeilinger pattern. The last part of the paper is about entanglement as a resource for quantum communication. ER=EPR provides a way to visualize protocols like quantum teleportation. In some sense teleportation takes place through the wormhole, but as usual, classical communication is necessary to complete the protocol.

We show that the equation of motion for a rigid one-dimensional elastic body (i.e. a rod or string whose speed of sound is equal to the speed of light) in a two-dimensional spacetime is simply the wave equation. We then solve this equation in a few simple examples: a rigid rod colliding with an unmovable wall, a rigid rod being pushed by a constant force, a rigid string whose endpoints are simultaneously set in motion (seen as a special case of Bell's spaceships paradox), and a radial rigid string that has partially crossed the event horizon of a Schwarzschild black hole while still being held from the outside.
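As a hedged numerical check (illustrative code, not from the paper), one can verify that a d'Alembert-form profile u(x, t) = f(x - t) + g(x + t), with the speed of sound set to 1, satisfies the wave equation u_tt = u_xx, which is the equation of motion claimed above for the rigid rod or string:

```python
import math

def u(x, t):
    """d'Alembert solution: a left-mover f(x - t) plus a right-mover g(x + t).
    The choices f = sin and g = cos are arbitrary smooth profiles."""
    return math.sin(x - t) + math.cos(x + t)

def second_diff(f, a, h=1e-4):
    """Central second difference approximating f''(a)."""
    return (f(a + h) - 2.0 * f(a) + f(a - h)) / h**2

# Residual of u_tt - u_xx at a few sample points; it should vanish up to
# finite-difference error for any profile of the d'Alembert form.
residual = max(
    abs(second_diff(lambda t: u(x0, t), t0) - second_diff(lambda x: u(x, t0), x0))
    for x0, t0 in [(0.3, 0.7), (1.1, -0.4), (-2.0, 1.5)]
)
```

The residual is at the level of the finite-difference truncation error, confirming that any superposition of left- and right-moving profiles solves the wave equation.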

Cascading failures are one of the main causes of blackouts in electrical power grids. Stable power supply requires a robust design of the power grid topology. Currently, the impact of the grid structure on grid robustness is mainly assessed by purely topological metrics that fail to capture fundamental properties of electrical power grids, such as power flow allocation according to Kirchhoff's laws. This paper deploys the effective graph resistance as a metric relating the topology of a grid to its robustness against cascading failures. Specifically, the effective graph resistance is deployed as a metric for network expansions (by means of transmission line additions) of an existing power grid. Four strategies based on network properties are investigated to optimize the effective graph resistance, and accordingly improve the robustness, of a given power grid at low computational complexity. Experimental results suggest the existence of Braess's paradox in power grids: bringing an additional li...
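As a minimal sketch of the metric itself (illustrative code, not the paper's implementation), the effective graph resistance can be computed as n times the sum of reciprocals of the nonzero Laplacian eigenvalues. Note that this metric alone decreases monotonically under any line addition (Rayleigh monotonicity); the Braess-paradox finding concerns the actual cascading-failure behavior, which the metric is used to approximate.

```python
import numpy as np

def effective_graph_resistance(adj):
    """Effective graph resistance of a connected graph: n times the sum of
    reciprocals of the nonzero Laplacian eigenvalues (equivalently,
    n * trace of the Laplacian pseudoinverse)."""
    adj = np.asarray(adj, dtype=float)
    n = adj.shape[0]
    laplacian = np.diag(adj.sum(axis=1)) - adj
    eig = np.linalg.eigvalsh(laplacian)
    nonzero = eig[eig > 1e-9]  # drop the single zero eigenvalue
    return n * float(np.sum(1.0 / nonzero))

# Toy 4-bus ring grid (nodes 0-1-2-3-0)
ring = np.array([[0, 1, 0, 1],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [1, 0, 1, 0]])
r_before = effective_graph_resistance(ring)

# Network expansion: add one transmission line across the ring (0-2)
expanded = ring.copy()
expanded[0, 2] = expanded[2, 0] = 1
r_after = effective_graph_resistance(expanded)
# The added line lowers the effective graph resistance of the topology.
```

For the 4-node ring the Laplacian eigenvalues are 0, 2, 2, 4, giving an effective graph resistance of 5; adding the chord changes them to 0, 2, 4, 4, lowering it to 4.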

We consider the motion of a system of free particles moving in a plane with hard scatterers of regular polygonal shape arranged in a random manner. Calling this the Ehrenfest gas, which is known to be pseudo-integrable, we propose a finite-time Lyapunov exponent characterizing the dynamics. In the limit of a large number of vertices, where the polygon tends to a circle, we recover the Lyapunov exponent for the Lorentz gas. To obtain this result, we generalize the reflection law for a pencil of rays incident on a polygonal scatterer in such a way that the formula for the circular scatterer is recovered in the limit of infinitely many vertices. Thus, seemingly paradoxically, chaos emerges from pseudo-chaos.

In the present paper it will be argued that transport in a 2D electron gas can be implemented with 'local hidden instrument-based' variables. With this concept of instrumentalism it is possible to explain quantum correlation, particle-wave duality and Wheeler's 'backward causation of a particle'. In the case of quantum correlation, the spin-measuring variant of the Einstein-Podolsky-Rosen paradox is studied. In the case of particle-wave duality, the system studied is single-photon Mach-Zehnder (MZ) interferometry with a phase shift of size $\delta$. The idea that the instruments more or less neutrally show us the way to the particle will be replaced by the concept of laboratory equipment contributing in an unexpected way to the measurement.

We apply the methods of transformation optics to theoretical descriptions of spacetimes that support closed null geodesic curves. The metric used is based on frame dragging spacetimes, such as the van Stockum dust or the Kerr black hole. Through transformation optics, this metric is analogous to a material that in theory should allow for communication between past and future. Presented herein is a derivation and description of the spacetime and the resulting permeability, permittivity, and magneto-electric couplings that a material would need in order for light in the material to follow closed null geodesics. We also address the paradoxical implications of such a material, and demonstrate why such a material would not actually result in a violation of causality. A full derivation of the Plebanski equations is also included.

The verification of nuclear warheads for arms control faces a paradox: international inspectors must gain high confidence in the authenticity of submitted items while learning nothing about them. Conventional inspection systems featuring 'information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, designed such that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of neutron transmission and emission. Calculations of diversion scenarios show that a high degree of discrimination can be achieved while revealing zero information. Timely demonstration of the viability of such an approach could be critical for the next round of arms-control negotiations, which will likely require verification of individual warheads, rather than whole delivery systems.

If black holes were able to clone quantum states, a number of paradoxes in black hole physics would disappear. However, the linearity of quantum mechanics forbids exact cloning of quantum states. Here we show that black holes indeed clone incoming quantum states with a fidelity that depends on the black hole's absorption coefficient, without violating the no-cloning theorem because the clones are only approximate. Perfectly reflecting black holes are optimal universal "quantum cloning machines" and operate on the principle of stimulated emission, exactly as their quantum optical counterparts. In the limit of perfect absorption, the fidelity of clones is equal to what can be obtained via quantum state estimation methods. But for any absorption probability less than one, the cloning fidelity is nearly optimal as long as $\omega/T \geq 10$, a common parameter for modest-sized black holes.

We investigate the effects of a squeezed pump on the quantum properties and conversion efficiency of the light produced in single-pass second harmonic generation. Using stochastic integration of the two-mode equations of motion in the positive-P representation, we find that larger violations of continuous-variable harmonic entanglement criteria are available for lesser effective interaction strengths than with a coherent pump. This enhancement of the quantum properties also applies to violations of the Reid-Drummond inequalities used to demonstrate a harmonic version of the Einstein-Podolsky-Rosen paradox. We find that the conversion efficiency is largely unchanged except for very low pump intensities and high levels of squeezing.

Black holes present the extreme limits of physics. They are ubiquitous in the cosmos, and in some extra-dimensional scenarios they could be produced at colliders. They have also yielded a puzzle that challenges the foundations of physics. These talks will begin with an overview of the basics of black hole physics, and then briefly summarize some of the exciting developments with cosmic black holes. They will then turn to properties of quantum black holes, and the question of black hole production in high energy collisions, perhaps beginning with the LHC. I will then overview the apparent paradox emerging from Hawking's discovery of black hole evaporation, and what it could be teaching us about the foundations of quantum mechanics and gravity.

A black hole model with a self-gravitating, charged, spherically symmetric dust thin shell as a source is considered. The Schroedinger-type equation for this model is derived; it turns out to be a finite-difference equation. A theory of such equations is developed, and the general solution is found and investigated in detail. The discrete spectrum of bound-state energy levels is obtained, and all the eigenvalues turn out to be infinitely degenerate. The ground-state wave functions are evaluated explicitly. The quantum black hole states are selected and investigated. It is shown that the obtained black hole mass spectrum is compatible with the existence of Hawking radiation in the limit of low temperatures, both for large and for nearly extreme Reissner-Nordstrom black holes. The above-mentioned infinite degeneracy of the mass (energy) eigenvalues may prove helpful in resolving the well-known information paradox in black hole physics.

A possible resolution of the information loss paradox for black holes is proposed in which a phase transition occurs when the temperature of an evaporating black hole equals a critical value, $T_c$, and Lorentz invariance and diffeomorphism invariance are spontaneously broken. This allows a generalization of Schrödinger's equation for the quantum mechanical density matrix, such that a pure state can evolve into a mixed state, because in the symmetry broken phase the conservation of energy-momentum is spontaneously violated. TCP invariance is also spontaneously broken together with time reversal invariance, allowing the existence of white holes, which are black holes moving backwards in time. Domain walls would form which separate the black holes and white holes (anti-black holes) in the broken symmetry regime, and the system could evolve into equilibrium producing a balance of information loss and gain.

Bohmian mechanics, a hydrodynamic formulation of the quantum theory, constitutes a useful resource to analyze the role of the phase as the mechanism responsible for the dynamical evolution of quantum systems. Here this role is discussed in the context of quantum interference. Specifically, it is shown that when dealing with two wave-packet coherent superpositions this phenomenon is analogous to an effective collision of a single wave packet with a barrier. This effect is illustrated by means of a numerical simulation of Young's two-slit experiment. Furthermore, outcomes from this analysis are also applied to a realistic simulation of Wheeler's delayed choice experiment. As it is shown, in both cases the Bohmian formulation helps to understand in a natural way (and, therefore, to demystify) what are typically regarded as paradoxical aspects of the quantum theory, simply stressing the important dynamical role played by the quantum phase. Accordingly, our conception of quantum systems should not rely on artifici...

The dynamical behavior of a weakly diluted fully inhibitory network of pulse-coupled spiking neurons is investigated. Upon increasing the coupling strength, a transition from regular to stochasticlike regime is observed. In the weak-coupling phase, a periodic dynamics is rapidly approached, with all neurons firing with the same rate and mutually phase locked. The strong-coupling phase is characterized by an irregular pattern, even though the maximum Lyapunov exponent is negative. The paradox is solved by drawing an analogy with the phenomenon of 'stable chaos', i.e., by observing that the stochasticlike behavior is 'limited' to an exponentially long (with the system size) transient. Remarkably, the transient dynamics turns out to be stationary.

In honor of Alan Turing's hundredth birthday, I unwisely set out some thoughts about one of Turing's obsessions throughout his life, the question of physics and free will. I focus relatively narrowly on a notion that I call "Knightian freedom": a certain kind of in-principle physical unpredictability that goes beyond probabilistic unpredictability. Other, more metaphysical aspects of free will I regard as possibly outside the scope of science. I examine a viewpoint, suggested independently by Carl Hoefer, Cristi Stoica, and even Turing himself, that tries to find scope for "freedom" in the universe's boundary conditions rather than in the dynamical laws. Taking this viewpoint seriously leads to many interesting conceptual problems. I investigate how far one can go toward solving those problems, and along the way, encounter (among other things) the No-Cloning Theorem, the measurement problem, decoherence, chaos, the arrow of time, the holographic principle, Newcomb's paradox, Boltzmann brains, algorithmic info...

Preliminary analyses of scenarios for human interference with the performance of a radioactive waste repository in a deep salt formation are presented. The following scenarios are analyzed: (1) the U-Tube Connection Scenario involving multiple connections between the repository and the overlying aquifer system; (2) the Single Borehole Intrusion Scenario involving penetration of the repository by an exploratory borehole that simultaneously connects the repository with overlying and underlying aquifers; and (3) the Pressure Release Scenario involving inflow of water to saturate any void space in the repository prior to creep closure with subsequent release under near lithostatic pressures following creep closure. The methodology to evaluate repository performance in these scenarios is described and this methodology is applied to reference systems in three candidate formations: bedded salt in the Palo Duro Basin, Texas; bedded salt in the Paradox Basin, Utah; and the Richton Salt Dome, Mississippi, of the Gulf Coast Salt Dome Basin.

The scattering of a fermion in the background of a sign potential is considered with a general mixing of vector and scalar Lorentz structures, with the scalar coupling stronger than or equal to the vector coupling, under the Sturm–Liouville perspective. When the vector and scalar couplings have different magnitudes, an isolated solution shows that a fermion under a strong potential can be trapped in a highly localized region without manifestation of Klein's paradox. It is also shown that this isolated bound-state solution disappears asymptotically as one approaches the conditions for the realization of spin and pseudospin symmetries. Highlights: scattering of fermions in a sign potential assessed from a Sturm–Liouville perspective; an isolated bounded solution; no pair production despite the high localization; no bounded solution under exact spin and pseudospin symmetries.

In this paper, we try to construct black hole thermodynamics based on the fact that the formation and evaporation of a black hole can be described by quantum unitary evolutions. First, we show that the Bekenstein-Hawking entropy $S_{BH}$ cannot be a Boltzmann or thermal entropy. To confirm this statement, we show that the original black hole "first law" cannot be treated formally as the first law of thermodynamics, due to some missing metric perturbations caused by matter. Then, by including those (quantum) metric perturbations, we show that black hole formation and evaporation can be described effectively in a unitary manner, through a quantum channel between the exterior and interior of the event horizon. In this way, the paradoxes of information loss and the firewall can be resolved effectively. Finally, we show that black hole thermodynamics can be constructed in an ordinary way, by constructing its statistical mechanics.

The recently proposed 'reheating-volume' (RV) measure promises to solve the long-standing problem of extracting probabilistic predictions from cosmological multiverse scenarios involving eternal inflation. I give a detailed description of the new measure and its applications to generic models of eternal inflation of random-walk type. For those models I derive a general formula for RV-regulated probability distributions that is suitable for numerical computations. I show that the results of the RV cutoff in random-walk type models are always gauge invariant and independent of the initial conditions at the beginning of inflation. In a toy model where equal-time cutoffs lead to the 'youngness paradox', the RV cutoff yields unbiased results that are distinct from previously proposed measures.

I propose a new volume-weighted probability measure for cosmological 'multiverse' scenarios involving eternal inflation. The 'reheating-volume (RV) cutoff' calculates the distribution of observable quantities on a portion of the reheating hypersurface that is conditioned to be finite. The RV measure is gauge-invariant, does not suffer from the 'youngness paradox', and is independent of initial conditions at the beginning of inflation. In slow-roll inflationary models with a scalar inflaton, the RV-regulated probability distributions can be obtained by solving nonlinear diffusion equations. I discuss possible applications of the new measure to 'landscape' scenarios with bubble nucleation. As an illustration, I compute the predictions of the RV measure in a simple toy landscape.

We present a general approach to solving the (1+1)- and (2+1)-dimensional Dirac equation in the presence of static scalar, pseudoscalar and gauge potentials, for the case in which the potentials have the same functional form and thus the factorization method can be applied. We show that the presence of electric potentials in the Dirac equation leads to two Klein-Gordon equations that include an energy-dependent potential. We then generalize the factorization method to the case of energy-dependent Hamiltonians. Additionally, the shape invariance is generalized for a specific class of energy-dependent Hamiltonians. We also present a condition for the absence of Klein's paradox (stability of the Dirac sea), showing how Dirac particles in low dimensions can be confined for a wide family of potentials.

Luka Popov has attempted to advance Machian physics by maintaining that the heliocentric system must be replaced by Tycho Brahe's geocentric system. We show that while geocentrism relies on Mach's contention that accelerations are relative, this contention is untenable because, inter alia, the consequences of an acceleration of an object with respect to the fixed stars cannot be duplicated by an acceleration of the stars with respect to this object, and, if the universe and a co-rotating observer have the same angular velocity, this motion is detectable because they have different linear velocities. Moreover, geocentrism precludes the relativity of accelerations and leads to an absolute space, whereas Mach argued against absolute space; Popov's result that the force exerted by the Earth on the Sun depends on the square of the Sun's mass but is independent of the Earth's mass is paradoxical; and the annual asymmetry of the Cosmic Microwave Background falsifies all geocentric (or 'Tychonic/Brahean') systems.

We present an algorithm to estimate the configurational entropy $S$ of a polymer. The algorithm uses the statistics of coincidences among random samples of configurations and is related to the catch-tag-release method for estimation of population sizes, and to the classic "birthday paradox". Bias in the entropy estimation is decreased by grouping configurations into nearly equiprobable partitions based on their energies, and estimating entropies separately within each partition. Whereas most entropy estimation algorithms require $N \sim 2^S$ samples to achieve small bias, our approach typically needs only $N \sim \sqrt{2^S}$. Thus the algorithm can be applied to estimate protein free energies with increased accuracy and decreased computational cost.
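A hedged sketch of the core coincidence-counting idea (illustrative, not the authors' algorithm; the toy population and sample sizes are assumptions): for N uniform samples from K equiprobable configurations, the expected number of coincident pairs is N(N-1)/(2K), so counting collisions estimates K, and hence S = log2(K), using only on the order of sqrt(2^S) samples.

```python
import random
from math import comb, log2
from collections import Counter

def estimate_size_from_coincidences(samples):
    """Estimate the effective number of equiprobable states from pairwise
    coincidences (birthday-paradox idea): for n uniform samples from K
    states, the expected number of coincident pairs is n(n-1)/(2K),
    so K is estimated as n(n-1)/(2C), with C the observed pair count."""
    n = len(samples)
    c = sum(comb(m, 2) for m in Counter(samples).values())  # coincident pairs
    if c == 0:
        raise ValueError("no coincidences observed; draw more samples")
    return n * (n - 1) / (2 * c)

random.seed(0)
K_true = 4096                # 2**12 equiprobable states, so S = 12 bits
samples = [random.randrange(K_true) for _ in range(2000)]  # ~sqrt(2**S) draws
K_hat = estimate_size_from_coincidences(samples)
S_hat = log2(K_hat)          # configurational entropy estimate in bits
```

With 2000 samples from 4096 states one expects roughly 490 coincident pairs, so the entropy estimate lands close to the true 12 bits despite using far fewer than 2^S samples.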

The behaviors of various confidence/credible interval constructions are explored, particularly in the region of low statistics where methods diverge most. We highlight a number of challenges, such as the treatment of nuisance parameters, and common misconceptions associated with such constructions. An informal survey of the literature suggests that confidence intervals are not always defined in relevant ways and are too often misinterpreted and/or misapplied. This can lead to seemingly paradoxical behaviours and flawed comparisons regarding the relevance of experimental results. We therefore conclude that there is a need for a more pragmatic strategy which recognizes that, while it is critical to objectively convey the information content of the data, there is also a strong desire to derive bounds on models and a natural instinct to interpret things this way. Accordingly, we attempt to put aside philosophical biases in favor of a practical view to propose a more transparent and self-consistent approach that better addresses these issues.

Following the Membrane Paradigm, we show that the stretched horizon of a black hole retains information about particles thrown into the hole for a time of order the scrambling time m ln(m/M_P), after the particles cross the horizon. One can, for example, read off the proper time at which a particle anti-particle pair thrown into the hole, annihilates behind the horizon, if this time is less than the scrambling time. If we believe that the Schwarzschild geometry exterior to the horizon is a robust thermodynamic feature of the quantum black hole, independent of whether it is newly formed, or has undergone a long period of Hawking decay, then this classical computation shows that the "firewall" resolution of the AMPS paradox is not valid.

The cross product occurs frequently in physics and engineering, with wide applications in many contexts, e.g. for calculating angular momenta, torques, rotations, volumes, etc. Though this mathematical operator is widely used, it is commonly expressed in a 3-D notation which gives rise to many paradoxes and difficulties. In fact, unlike other vector operators such as the scalar product, the cross product is defined only in 3-D space, does not respect reflection rules and invokes the concept of "handedness". In this paper we present an extension of the cross product to an arbitrary number N of spatial dimensions, different from the one adopted in exterior algebra and explicitly designed for easy calculation of moments.

In recent years, social networking platforms have developed into extraordinary channels for spreading and consuming information. Along with the rise of such infrastructure, there is continuous progress on techniques for spreading information effectively through influential users. In many applications, one is restricted to select influencers from a set of users who engaged with the topic being promoted, and due to the structure of social networks, these users often rank low in terms of their influence potential. An alternative approach one can consider is an adaptive method which selects users in a manner which targets their influential neighbors. The advantage of such an approach is that it leverages the friendship paradox in social networks: while users are often not influential, they often know someone who is. Despite the various complexities in such optimization problems, we show that scalable adaptive seeding is achievable. In particular, we develop algorithms for linear influence models with provable app...

The idealization of monochromatic plane waves leads to considerable simplifications in the analysis of electromagnetic systems. However, for active systems this idealization may be dangerous due to the presence of growing waves. Here we consider a gainy slab, and use a realistic incident beam, which is both causal and has finite width. This clarifies some apparent paradoxes arising from earlier analyses of this setup. In general it turns out to be necessary to involve complex frequencies $\omega$ and/or complex transversal wavenumbers $k_x$. Simultaneously real $\omega$ and $k_x$ cannot describe amplified waves in a slab which is infinite in the transversal direction. We also show that the only possibility to have an absolute instability for a finite-width beam is if a normally incident plane wave would experience an instability.

Many protein systems fold in a two-state manner. Random models, however, rarely display two-state kinetics, and thus such behavior should not be accepted as a default. To date, many theories for the prevalence of two-state kinetics have been presented, but none sufficiently explains the breadth of experimental observations. A model, making a minimum of assumptions, is introduced that suggests two-state behavior is likely for any system with an overwhelmingly populated native state. We show two-state folding is emergent and strengthened by increasing the population of the native state. Further, the model exhibits hub-like behavior, with slow interconversions between unfolded states. Despite this, the unfolded state equilibrates quickly relative to the folding time. This apparent paradox is readily understood through this model. Finally, our results compare favorably with experimental measurements of protein folding rates as a function of chain length and Keq, and provide new insight into these result...

Einstein-Podolsky-Rosen's paper in 1935 is discussed in parallel with an EPR experiment on the $K^0\bar{K}^0$ system in 1998, yielding a strong hint of distinction in both wave-function and operators between particle and antiparticle at the level of quantum mechanics (QM). Then it is proposed that the CPT invariance in particle physics leads naturally to a basic postulate that the (newly defined) space-time inversion (${\bf x}\to -{\bf x}, t\to -t$) is equivalent to the transformation between a particle and its antiparticle. The evolution of this postulate from nonrelativistic QM via relativistic QM to quantum field theory is discussed in some detail. The Klein paradox for both the Klein-Gordon equation and the Dirac equation is also discussed. Keywords: CPT invariance, Antiparticle, Quantum mechanics, Quantum field theory

In Axelrod's model of cultural dissemination, we consider mobility of cultural agents through the introduction of a density of empty sites and the possibility that agents in a dissimilar neighborhood can move to them if their mean cultural similarity with the neighborhood is below some threshold. While for low values of the density of empty sites mobility enhances the convergence to a global culture, for sufficiently high values the dynamics can lead to the coexistence of disconnected domains of different cultures. In this regime, an increase of initial cultural diversity paradoxically increases the convergence to a dominant culture. Further increase of diversity leads to fragmentation of the dominant culture into domains, forever changing in shape and number, as an effect of the never-ending eroding activity of cultural minorities.
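The baseline dynamics underlying this study can be sketched in a few lines; the code below implements only the standard Axelrod update rule (the grid size, number of features F, and traits Q are arbitrary choices, and the paper's empty-site mobility rule is not included):

```python
import random

random.seed(0)

# Minimal Axelrod culture-dissemination dynamics on an L x L torus.
L, F, Q = 10, 3, 5          # grid size, cultural features, traits per feature
grid = [[tuple(random.randrange(Q) for _ in range(F)) for _ in range(L)]
        for _ in range(L)]

def neighbors(i, j):
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        yield (i + di) % L, (j + dj) % L

def step(grid):
    """One interaction: pick an agent and a random neighbor; with probability
    equal to their cultural similarity, the agent copies one differing feature."""
    i, j = random.randrange(L), random.randrange(L)
    ni, nj = random.choice(list(neighbors(i, j)))
    a, b = grid[i][j], grid[ni][nj]
    shared = sum(x == y for x, y in zip(a, b))
    if 0 < shared < F and random.random() < shared / F:
        k = random.choice([t for t in range(F) if a[t] != b[t]])
        new = list(a)
        new[k] = b[k]
        grid[i][j] = tuple(new)

for _ in range(20000):
    step(grid)

n_cultures = len({c for row in grid for c in row})
print(n_cultures)  # distinct cultures remaining after the sweep
```

Mobility, in the paper's extension, would add empty sites and a relocation move for dissatisfied agents on top of this update rule.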

One of the beneficial outcomes of searching for life in the Universe is that it grants greater awareness of our own problems here on Earth. Lack of contact with alien beings to date might actually comprise a null "signal" pointing humankind toward a viable future. Astrobiology has surprising practical applications to human society; within the larger cosmological context of cosmic evolution, astrobiology clarifies the energetic essence of complex systems throughout the Universe, including technological intelligence, which is intimately dependent on energy and likely will be for as long as it endures. The "message" contained within the "signal" with which today's society needs to cope is arguably this: only solar energy can power our civilization going forward without soiling the environment with increased heat while still robustly driving the economy with increased per capita energy usage. The null "signals" from extraterrestrials also offer a rational solution to the Fermi paradox as a principle of cosmic selection l...

From the principle of equivalence, Einstein predicted that clocks slow down in a gravitational field. Since the general theory of relativity is based on the principle of equivalence, it is essential to test this prediction accurately. Muller, Peters and Chu claim that a reinterpretation of decade-old experiments with atom interferometers leads to a sensitive test of this gravitational redshift effect at the Compton frequency. Wolf et al. dispute this claim and adduce arguments against it. In this article, we distill these arguments to a single fundamental objection: an atom is NOT a clock ticking at the Compton frequency. We conclude that atom interferometry experiments conducted to date do not yield such sensitive tests of the gravitational redshift. Finally, we suggest a new interferometric experiment to measure the gravitational redshift, which realises a quantum version of the classical clock "paradox".

We give an account of the matter-gravity entanglement hypothesis which, unlike the standard approach to entropy based on coarse-graining, offers a definition for the entropy of a closed system as a real and objective quantity. We explain how this new approach offers an explanation for the Second Law of Thermodynamics in general and a non-paradoxical understanding of information loss during black hole formation and evaporation in particular. We also very briefly review some recent related work on the nature of equilibrium states involving quantum black holes and point out how it promises to resolve some puzzling issues in the current version of the string theory approach to black hole entropy.

We reexamine the bending of light issue associated with the metric of the static, spherically symmetric solution of Weyl gravity discovered by Mannheim and Kazanas (1989). To this end we employ the procedure used recently by Rindler and Ishak to obtain the bending angle of light by a centrally concentrated spherically symmetric matter distribution in a Schwarzschild-de Sitter background. In earlier studies the term γr in the metric led to the paradoxical result of a bending angle proportional to the photon impact parameter, when using the usual formalism appropriate to asymptotically flat space-times. However, employing the approach of light bending of Rindler and Ishak we show that the effects of this term are in fact insignificant, with the discrepancy between the two procedures attributed to the definition of the bending angle between the asymptotically flat and nonflat spaces.

We reexamine the deflection of light in conformal Weyl gravity obtained in Sultana and Kazanas (2010), by extending the calculation based on the procedure by Rindler and Ishak, for the bending angle by a centrally concentrated spherically symmetric matter distribution, to second order in M/R, where M is the mass of the source and R is the impact parameter. It has recently been reported in Bhattacharya et al. (JCAP 09 (2010) 004; JCAP 02 (2011) 028), that when this calculation is done to second order, the term γr in the Mannheim-Kazanas metric yields again the paradoxical contribution γR (where the bending angle is proportional to the impact parameter) obtained by standard formalisms appropriate to asymptotically flat spacetimes. We show that no such contribution is obtained for a second order calculation and the effects of the term γr in the metric are again insignificant as reported in our earlier work.

We consider a thought experiment in which an energetic massless string probes a "stringhole" (a heavy string lying on the correspondence curve between strings and black holes) at large enough impact parameter for the regime to be under theoretical control. The corresponding, explicitly unitary, $S$-matrix turns out to be perturbatively sensitive to the microstate of the stringhole: in particular, at leading order in $l_s/b$, it depends on a projection of the stringhole's Lorentz-contracted quadrupole moment. The string-black hole correspondence is therefore violated if one assumes quantum hair to be exponentially suppressed as a function of black-hole entropy. Implications for the information paradox are briefly discussed.

Between 1913 and 1924, several Denver area facilities extracted radium from carnotite ore mined from the Paradox basin region of Colorado. Tailings or abandoned ores from these facilities were apparently incorporated into asphalt used to pave approximately 7.2 kilometers (4.5 miles) of streets in Denver. A majority of the streets are located in residential areas. The radionuclides are bound within the asphalt matrix and pose minimal risk unless they are disturbed. The City and County of Denver (CCoD) is responsible for controlling repairs and maintenance on these impacted streets. Since 2002, the CCoD has embarked on a significant capital improvement project to remove the impacted asphalt for secure disposal followed by street reconstruction. To date, Parsons has removed approximately 55 percent of the impacted asphalt. This paper discusses the history of the Denver Radium Streets and summarizes on-going project efforts. (authors)

The oceans play a critical role in regulating the global carbon cycle. Deep-ocean waters are roughly 200% supersaturated with CO{sub 2} compared to surface waters, which are in contact with the atmosphere. This difference is due to the flux of photosynthetically derived organic material from surface to deep waters and its subsequent remineralization, i.e. the ``biological pump''. The pump is a complex phytoplankton-based ecosystem. The paradoxical nature of ocean regions containing high nutrients and low phytoplankton populations has intrigued biological oceanographers for many years. Hypotheses to explain the paradox include the regulation of productivity by light, temperature, zooplankton grazing, and trace metal limitation and/or toxicity. To date, none of the hypotheses, or combinations thereof, has emerged as a widely accepted explanation for why nitrogen and phosphorus are not depleted in these regions of the oceans. Recently, new evidence has emerged which supports the hypothesis that iron limitation regulates primary production in these areas. This has stimulated discussions of the feasibility of fertilizing parts of the Southern Ocean with iron, and thus sequestering additional atmospheric CO{sub 2} in the deep oceans, where it would remain over the next few centuries. The economic, social, and ethical concerns surrounding such a proposition, along with the outstanding scientific issues, call for rigorous discussion and debate on the regulation of productivity in these regions. To this end, the American Society of Limnology and Oceanography (ASLO) held a Special Symposium on the topic Feb. 22--24, 1991. Participants included leading authorities, from the US and abroad, on physical, chemical, and biological oceanography, plant physiology, microbiology, and trace metal chemistry. Representatives from government agencies and industry were also present.


Over 400 million barrels (64 million m{sup 3}) of oil have been produced from the shallow-shelf carbonate reservoirs in the Pennsylvanian (Desmoinesian) Paradox Formation in the Paradox Basin, Utah and Colorado. With the exception of the giant Greater Aneth field, the other 100 plus oil fields in the basin typically contain 2 to 10 million barrels (0.3-1.6 million m{sup 3}) of original oil in place. Most of these fields are characterized by high initial production rates followed by a very short productive life (primary), and hence premature abandonment. Only 15 to 25 percent of the original oil in place is recoverable during primary production from conventional vertical wells. An extensive and successful horizontal drilling program has been conducted in the giant Greater Aneth field. However, to date, only two horizontal wells have been drilled in the small Ismay and Desert Creek fields. The results from these wells were disappointing due to a poor understanding of the carbonate facies and diagenetic fabrics that create reservoir heterogeneity. These small fields, and similar fields in the basin, are at high risk of premature abandonment. At least 200 million barrels (31.8 million m{sup 3}) of oil will be left behind in these small fields because current development practices leave compartments of the heterogeneous reservoirs undrained. Through proper geological evaluation of the reservoirs, production may be increased by 20 to 50 percent through the drilling of low-cost single or multilateral horizontal legs from existing vertical development wells. In addition, horizontal drilling from existing wells minimizes surface disturbances and costs for field development, particularly in the environmentally sensitive areas of southeastern Utah and southwestern Colorado.

In this work non-additive entropy is examined. It appears in isolated particle systems composed of few components. Therefore, the mixing of isolated particle systems S = S1 + S2 has been studied. Two cases are considered, T1 = T2 and T1 < T2, where T1, T2 are the initial temperatures of the systems S1 and S2, respectively. The concept of similar systems containing interacting particles is introduced. These systems are defined by a common temperature and an identical time-evolution process, i.e. the approach to the same thermodynamic equilibrium. The main results are: 1) The properties of the similar particle systems yield the non-additive entropy and free energy. The Gibbs Paradox is not a paradox. 2) The relation between the initial temperatures T1 and T2 governs the mixing process. 3) In the two cases T1 = T2 and T1 < T2, mixing of the systems S1, S2 results in a uniform union system S = S1 + S2. The systems S, S1, S2 are similar one to the other. 4) The mixing process is independent of the extensive quantities (volume, particle number, energy) and of the particle type. Only the mean energy plays an important role in the mixing of the systems S1, S2. 5) Mixing in the case T1 < T2 is in essence a thermalization process, but mixing in the case T1 = T2 is not a thermodynamic process. 6) Mixing is an irreversible process. Keywords: Entropy; Similar systems of interacting particles; Mixing of systems; Thermal equilibrium
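For context, the textbook form of the puzzle being dissolved here: mixing two classical ideal gases at equal temperature and pressure gives (standard result, not the authors' derivation)

```latex
\Delta S_{\rm mix} \;=\; -\,N k_B \left( x_1 \ln x_1 + x_2 \ln x_2 \right),
\qquad x_i = N_i / N ,
```

which is strictly positive for distinguishable gases yet must drop discontinuously to zero when the two gases become identical; this discontinuity is the Gibbs Paradox.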

Stars formed in galaxy cluster potential wells must be responsible for the high level of enrichment measured in the intracluster medium (ICM); however, there is increasing tension between this truism and the parsimonious assumption that the stars in the generally old population studied optically in cluster galaxies emerged from the same formation sites at the same epochs. We construct a phenomenological cluster enrichment model to demonstrate that ICM elemental abundances are underestimated by a factor >2 for standard assumptions about the stellar population, a discrepancy we call the "cluster elemental abundance paradox". Recent evidence of an elliptical galaxy initial mass function (IMF) skewed to low masses deepens the paradox. We quantify the adjustments to the star formation efficiency, IMF, and Type Ia supernova (SNIa) production efficiency required to resolve this while remaining consistent with the observed ICM abundance pattern. The necessary enhancement in metal enrichment may, in principle, originate in the observed stellar population if a larger fraction of stars in the supernova-progenitor mass range form from an IMF that is either bottom-light or top-heavy, with the latter in some conflict with observed ICM abundance ratios. Other alternatives that imply more modest revisions to the IMF, mass return and remnant fractions, and primordial fraction posit an increase in the fraction of 3-8 M{sub Sun} stars that explode as SNIa, or assume that there are more stars than conventionally thought, although the latter implies a high star formation efficiency. We discuss the feasibility of these various solutions and the implications for the diversity of star formation in the universe, the process of elliptical galaxy formation, and the origin of this "hidden" source of ICM metal enrichment.

The fact that closed timelike curves (CTCs) are permitted by general relativity raises the question as to how quantum systems behave when time travel to the past occurs. Research into answering this question by utilising the quantum circuit formalism has given rise to two theories: Deutschian-CTCs (D-CTCs) and "postselected" CTCs (P-CTCs). In this paper the quantum circuit approach is thoroughly reviewed, and the strengths and shortcomings of D-CTCs and P-CTCs are presented in view of their non-linearity and time travel paradoxes. In particular, the "equivalent circuit model"---which aims to make equivalent predictions to D-CTCs, while avoiding some of the difficulties of the original theory---is shown to contain errors. The discussion of D-CTCs and P-CTCs is used to motivate an analysis of the features one might require of a theory of quantum time travel, following which two overlapping classes of new theories are identified. One such theory, the theory of "transition probability" CTCs (T-CTCs), is fully developed. The theory of T-CTCs is shown not to have certain undesirable features---such as time travel paradoxes, the ability to distinguish non-orthogonal states with certainty, and the ability to clone or delete arbitrary pure states---that are present with D-CTCs and P-CTCs. The problems with non-linear extensions to quantum mechanics are discussed in relation to the interpretation of these theories, and the physical motivations of all three theories are discussed and compared.

Firewalls are controversial principally because they seem to imply departures from general relativistic expectations in regions of spacetime where the curvature need not be particularly large. One of the virtues of the Harlow-Hayden approach to the firewall paradox, concerning the time available for decoding of Hawking radiation emanating from charged AdS black holes, is precisely that it operates in the context of cold black holes, which are not strongly curved outside the event horizon. Here we clarify this point. The approach is based on ideas borrowed from applications of the AdS/CFT correspondence to the quark-gluon plasma. Firewalls aside, our work presents a detailed analysis of the thermodynamics and evolution of evaporating charged AdS black holes with flat event horizons. We show that, in one way or another, these black holes are always eventually destroyed in a time which, while long by normal standards, is short relative to the decoding time of Hawking radiation.

The characterization of global energy storage and release in the coupled solar wind-magnetosphere system remains one of the fundamental problems of space physics. Recently, it has been realised that a new paradigm in physics, that of Self-Organised Criticality (SOC), may encapsulate the mixing and merging of flux on many scales in the magnetotail, prompting bursty energy release and reconfiguration. SOC is consistent with qualitative measures such as power law power spectra and bursty bulk flows, and with more quantitative tests such as power law burst distributions in auroral indices and auroral optical activity. Here, we present a careful classification of the broad range of systems that fall under the general description of "SOC". We argue that some, but not all, of these are consistent with our current understanding of the magnetosphere. We discuss the observed low dimensionality of the dynamic magnetosphere in terms of both SOC model properties and observables. Observations of burst statistics are highlighted; we show that these are currently suggestive but not sufficient to confirm SOC, and in particular we find that auroral indices are not effective at distinguishing the internal dynamics of the magnetosphere from that of the intermittent solar wind driver. This may also elucidate the paradox of predictability and complexity of the coupled solar wind-magnetosphere system.
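As a concrete reference point for what "SOC" means here, the canonical Bak-Tang-Wiesenfeld sandpile reproduces the kind of power-law burst statistics discussed above; the sketch below is the generic toy model, not a magnetotail simulation (lattice size and threshold are arbitrary choices):

```python
import random

random.seed(1)

# Minimal Bak-Tang-Wiesenfeld sandpile, the canonical SOC toy model.
N = 20                      # lattice size (N x N)
THRESH = 4                  # toppling threshold
grid = [[0] * N for _ in range(N)]

def drive_and_relax(grid):
    """Add one grain at a random site, then topple until stable.
    Returns the avalanche size (number of topplings)."""
    i, j = random.randrange(N), random.randrange(N)
    grid[i][j] += 1
    size = 0
    unstable = [(i, j)] if grid[i][j] >= THRESH else []
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < THRESH:
            continue
        grid[x][y] -= THRESH
        size += 1
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < N and 0 <= ny < N:   # grains fall off the edges
                grid[nx][ny] += 1
                if grid[nx][ny] >= THRESH:
                    unstable.append((nx, ny))
    return size

sizes = [drive_and_relax(grid) for _ in range(30000)]
big = [s for s in sizes if s > 0]
print(max(big))  # avalanche sizes span a broad, power-law-like range
```

Slow uniform driving plus fast threshold relaxation is the essential SOC ingredient; the magnetospheric question is whether magnetotail flux transport behaves analogously.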

We review the interface between (theoretical) physics and information for non-experts. The origin of information as related to the notion of entropy is described, first in the context of thermodynamics, then in the context of statistical mechanics. A close examination of the foundations of statistical mechanics and the need to reconcile the probabilistic and deterministic views of the world leads us to a discussion of chaotic dynamics, where information plays a crucial role in quantifying predictability. We then discuss a variety of fundamental issues that emerge in defining information and how one must exercise care in discussing concepts such as order, disorder, and incomplete knowledge. We also discuss an alternative form of entropy and its possible relevance for nonequilibrium thermodynamics. In the final part of the paper we discuss how quantum mechanics gives rise to the very different concept of quantum information. Entirely new possibilities for information storage and computation are possible due to the massive parallel processing inherent in quantum mechanics. We also point out how entropy can be extended to apply to quantum mechanics to provide a useful measure of quantum entanglement. Finally we make a small excursion to the interface between quantum theory and general relativity, where one is confronted with an "ultimate information paradox" posed by the physics of Black Holes. In this review we have limited ourselves; not all relevant topics that touch on physics and information could be covered.

A century after the advent of Quantum Mechanics and General Relativity, both theories enjoy incredible empirical success, constituting the cornerstones of modern physics. Yet, paradoxically, they suffer from deep-rooted, so-far intractable, conflicts. Motivations for violations of the notion of relativistic locality include Bell's inequalities for hidden variable theories, the cosmological horizon problem, and Lorentz-violating approaches to quantum geometrodynamics, such as Horava-Lifshitz gravity. Here, we explore a recent proposal for a "real ensemble" non-local description of quantum mechanics, in which "particles" can copy each others' observable values AND phases, independent of their spatial separation. We first specify the exact theory, ensuring that it is consistent and has (ordinary) quantum mechanics as a fixed point, where all particles with the same values for a given observable have the same phases. We then study the stability of this fixed point numerically, and analytically, for simple models. We provide evidence that most systems (in our study) are locally stable to small deviations from quantum mechanics, and furthermore, the phase variance per value of the observable, as well as systematic deviations from quantum mechanics, decay as $\sim$ (Energy$\times$Time)$^{-2n}$, where $n \geq 1$. Interestingly, this convergence is controlled by the absolute value of energy (and not energy difference), suggesting a possible connection to gravitational physics. Finally, we discuss different issues related to this theory, as well as potential novel applications for the spectrum of primordial cosmological perturbations and the cosmological constant problem.

Graphene is a monoatomic layer of graphite with carbon atoms arranged in a two-dimensional honeycomb lattice configuration. It has been known for more than sixty years that the electronic structure of graphene can be modelled by two-dimensional massless relativistic fermions. This property gives rise to numerous applications, both in applied sciences and in theoretical physics. Electronic circuits made out of graphene could take advantage of its high electron mobility that is witnessed even at room temperature. In the theoretical domain the Dirac-like behavior of graphene can simulate high energy effects, such as the relativistic Klein paradox. Even more surprisingly, topological effects can be encoded in graphene, such as the generation of vortices, charge fractionalization and the emergence of anyons. The impact of the topological effects on graphene's electronic properties can be elegantly described by the Atiyah-Singer index theorem. Here we present a pedagogical account of this theorem and review its various applications to graphene. A direct consequence of the index theorem is charge fractionalization that is usually known from the fractional quantum Hall effect. The charge fractionalization gives rise to the exciting possibility of realizing graphene-based anyons which, unlike bosons or fermions, exhibit fractional statistics. Besides being of theoretical interest, anyons are a strong candidate for performing error-free quantum information processing.

In a spatially infinite and eternal universe approaching ultimately a de Sitter (or quasi-de Sitter) regime, structure can form by thermal fluctuations as such a space is thermal. The models of Dark Energy invoking the holographic principle fit naturally into such a category, and spontaneous formation of isolated brains in otherwise empty space seems the most perplexing, creating the paradox of Boltzmann Brains (BB). It is thus appropriate to ask if such models can be made free from domination by Boltzmann Brains. Here we consider only the simplest model, but adopt both the local and the global viewpoint in the description of the Universe. In the former case, we find that if a parameter $c$, which modulates the Dark Energy density, lies outside the exponentially narrow strip around the most natural $c = 1$ line, the theory is rendered BB-safe. In the latter case, the bound on $c$ is exponentially stronger, and seemingly at odds with the bounds on $c$ obtained from various observational tests.

Assuming spherical symmetry, the extreme UV emitted by a very hot source ionizes low-pressure molecular hydrogen, making a transparent bubble of H II (protons and electrons). As the radius increases, the intensity of the extreme UV and the temperature decrease, so that the plasma contains more and more atoms. A spherical shell, mainly of neutral atoms (H I), appears. If this shell is optically thick at the Lyman frequencies of H I, it is superradiant, and a competition of modes selects modes tangent to a sphere along which many atoms are excited. Thus, a shell of plasma emits, in a given direction, tangential rays showing a ring in which the selected modes are brighter. While, at the Lyman frequencies, absorption of rays emitted by the source excites the atoms able to amplify the superradiance, a more powerful amplification of superradiance results from an induced scattering of the radial beams, which extends to the wings of the lines and progressively to the whole spectrum. Thermodynamics says that the brightnesses of the radial and tangential beams tend to become equal; if the solid angle of observation is much larger for the ring than for the source, almost all the light emitted by the source is transferred to the rings, and the source becomes invisible. Paradoxically, a glow due to incoherent scattering and impurities around the source remains visible. As the scattering decreases with the decrease of the radial intensity, the brightness of the ring decreases with radius. These characteristics are found in supernova remnant 1987A.

This paper develops a deterministic model of quantum mechanics as an accumulation-and-threshold process. The model arises from an analogy with signal processing in wireless communications. Complex wavefunctions are interpreted as expressing the amplitude and phase information of a modulated carrier wave. Particle transmission events are modeled as the outcome of a process of signal accumulation that occurs in an extra (non-spacetime) dimension. Besides giving a natural interpretation of the wavefunction and the Born rule, the model accommodates the collapse of the wave packet and other quantum paradoxes such as EPR and the Aharonov-Bohm effect. The model also gives a new perspective on the 'relational' nature of quantum mechanics: that is, whether the wave function of a physical system is "real" or simply reflects the observer's partial knowledge of the system. We simulate the model for a 2-slit experiment, and indicate possible deviations of the model's predictions from conventional quantum mechanics. We also indicate how the theory may be extended to a field theory.
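For reference, the conventional quantum-mechanical two-slit prediction that any model deviation would be measured against can be sketched as follows (all parameter values are illustrative assumptions, not taken from the paper):

```python
import math

# Conventional two-slit (Fraunhofer) intensity pattern: the baseline
# against which model deviations would be compared.
wavelength = 500e-9      # light wavelength (m)
slit_sep = 50e-6         # d, centre-to-centre slit separation (m)
slit_width = 10e-6       # a, single-slit width (m)
screen_dist = 1.0        # L, distance to screen (m)

def intensity(x):
    """Relative intensity at screen position x (small-angle approximation)."""
    theta = x / screen_dist
    beta = math.pi * slit_width * theta / wavelength   # diffraction term
    delta = math.pi * slit_sep * theta / wavelength    # interference term
    envelope = (math.sin(beta) / beta) ** 2 if beta != 0 else 1.0
    return envelope * math.cos(delta) ** 2

# Fringe spacing is lambda * L / d.
fringe = wavelength * screen_dist / slit_sep
print(round(fringe * 1e3, 2))  # 10.0 (mm)
```

The fringe spacing λL/d and the single-slit envelope are the observables in which any accumulation-and-threshold deviations from standard quantum mechanics would have to show up.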

Performance and scalability are critical quality attributes for server applications in Internet-facing business systems. These applications operate in dynamic environments with rapidly fluctuating user loads and resource levels, and unpredictable system faults. Adaptive (autonomic) systems research aims to augment such server applications with intelligent control logic that can detect and react to sudden environmental changes. However, developing this adaptive logic is complex in itself. In addition, executing the adaptive logic consumes processing resources, and hence may (paradoxically) adversely affect application performance. In this paper we describe an approach for developing high-performance adaptive server applications and the supporting technology. The Adaptive Server Framework (ASF) is built on standard middleware services, and can be used to augment legacy systems with adaptive behavior without needing to change the application business logic. Crucially, ASF provides built-in control loop components to optimize the overall application performance, which comprises both the business and adaptive logic. The control loop is based on performance models and allows systems designers to tune the performance levels simply by modifying high level declarative policies. We demonstrate the use of ASF in a case study.

We investigate the validity of the equivalence principle near horizons in string theory, analyzing the breakdown of effective field theory caused by longitudinal string spreading effects. An experiment is set up where a detector is thrown into a black hole a long time after an early infalling string. Light cone gauge calculations, taken at face value, indicate a detectable level of root-mean-square longitudinal spreading of the initial string as measured by the late infaller. This results from the large relative boost between the string and detector in the near horizon region, which develops automatically despite their modest initial energies outside the black hole and the weak curvature in the geometry. We subject this scenario to basic consistency checks, using these to obtain a relatively conservative criterion for its detectability. In a companion paper, we exhibit longitudinal nonlocality in well-defined gauge-invariant S-matrix calculations, obtaining results consistent with the predicted spreading albeit not in a direct analogue of the black hole process. We discuss applications of this effect to the firewall paradox, and estimate the time and distance scales it predicts for new physics near black hole and cosmological horizons.

The production of natural gas from coal typically requires stimulation in the form of hydraulic fracturing and, more recently, cavity completions. The results of hydraulic fracturing treatments have ranged from extremely successful to less than satisfactory. The purpose of this work is to characterize common and potential fracturing fluids in terms of coal-fluid interactions, to identify reasons for less than satisfactory performance, and ultimately to devise alternative fluids and treatment procedures that optimize production following hydraulic fracturing. The laboratory data reported herein have proven helpful in designing improved hydraulic fracturing treatments and remedial treatments in the Black Warrior Basin. Acid inhibitors, scale inhibitors, additives to improve coal relative permeability to gas, and non-damaging polymer systems for hydraulic fracturing have been screened in coal damage tests. The optimum conditions for creating field-like foams in the laboratory have been explored. Tests have been run to identify minimum polymer and surfactant concentrations for applications of foam in coal. The role of 100 mesh sand in controlling leakoff and impairing conductivity in coal has been investigated. The leakoff and proppant transport of fluids with breaker have been investigated, and recommendations have been made for breaker application to minimize damage potential in coal. A database called COAL'S has been created in Paradox (trademark) for Windows to catalogue coalbed methane activities in the Black Warrior and San Juan Basins.

It is shown that thermal energy from a heat source can be converted to useful work in the form of maser-laser light by using a combination of a Stern-Gerlach device and stimulated emission of excited particles in a maser-laser cavity. We analyze the populations of atoms or quantum dots exiting the cavity, the photon statistics, and the internal entropy as a function of atomic transit time, using the quantum theory of masers and lasers. The power of the laser light is estimated to be sufficiently high for device applications. The thermodynamics of the heat converter is analyzed as a heat engine operating between two reservoirs of different temperature, but is generalized to include the change of internal quantum states. The von Neumann entropies for the internal degree of freedom are obtained. The sum of the internal and external entropies increases after each cycle and the second law is not violated, even if the photon entropy due to the finite photon number distribution is not included. An expression for efficiency relating to the Carnot efficiency is obtained. We resolve the subtle paradox on the reduction of the internal entropy with regard to the path separation after the Stern-Gerlach device.

Recent efforts in cosmic ray (CR) confinement and transport theory are discussed. Three problems are addressed as being crucial for understanding the present day observations and their possible telltale signs of the CR origin. The first problem concerns CR behavior right after their release from a source, such as a supernova remnant (SNR). At this phase the CRs are confined near the source by self-emitted Alfven waves. The second is the problem of diffusive propagation of CRs through the turbulent ISM. This is a seemingly straightforward and long-resolved problem, but it remains controversial and reveals paradoxes. A resolution based on the Chapman-Enskog asymptotic CR transport analysis, which also includes magnetic focusing, is suggested. The third problem concerns puzzlingly sharp ($\sim10^{\circ}$) anisotropies in the CR arrival directions that might bear important clues about their transport between the source and observer. The overarching goal is to improve our understanding of all aspects of the CR's so...

We re-examine the "Regge-Tolman paradox" with reference to some recent experimental results. It is straightforward to find a formula for the velocity v of the moving system required to produce causality violation. This formula typically yields a velocity very close to the speed of light (for instance, v/c > 0.97 for X-shaped microwaves), which raises some doubts about the real physical observability of the violations. We then compute the velocity requirement introducing a delay between the reception of the primary signal and the emission of the secondary. It turns out that in principle for any delay it is possible to find moving observers able to produce active causal violation. This is mathematically due to the singularity of the Lorentz transformations in the limit beta to 1. For a realistic delay due to the propagation of a luminal precursor, we find that causality violations in the reported experiments are still more unlikely (v/c > 0.989), and even in the hypothesis that the superluminal propagation velocity goes to infinity, the velocity requirement is bounded by v/c > 0.62. We also prove that if two macroscopic bodies exchange energy and momentum through superluminal signals, then the swap of signal source and target is incompatible with the Lorentz transformations; therefore it is not possible to distinguish between source and target, even with reference to a definite reference frame.
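The zero-delay version of this velocity requirement can be illustrated with the standard tachyonic-antitelephone condition, v/c > 2(c/u)/(1 + (c/u)^2), for two parties exchanging signals at superluminal speed u in their own rest frames. This is a textbook sketch, not the paper's delay-corrected formula, which is more restrictive:

```python
# Sketch of the zero-delay causality-violation condition for a two-way
# superluminal signal exchange (standard tachyonic-antitelephone result;
# the delay-corrected requirement in the paper is stricter).

def min_boost(u_over_c: float) -> float:
    """Minimum v/c of the moving system needed to close a causal loop
    when both parties signal at speed u > c in their own rest frames."""
    a = 1.0 / u_over_c          # inverse signal speed, c/u
    return 2.0 * a / (1.0 + a * a)

# A signal speed of ~1.28c already demands v/c > 0.97, comparable to
# the threshold quoted for X-shaped microwaves.
print(round(min_boost(1.28), 2))   # 0.97

# As u -> infinity the zero-delay requirement vanishes; it is the
# finite delay (luminal precursor) that restores a bound like 0.62.
print(min_boost(1e9) < 1e-6)       # True
```

Note that with any nonzero delay the singular behavior of the Lorentz transformation as beta approaches 1 is what keeps causal violation formally achievable, as the abstract states.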

This is a study of a possible alternative procedure for adding a potential energy to the free electron Dirac equation. When Dirac added potentials to his free electron equation, there were two alternatives (here called D1 and D2). He chose D1 and lost charge conjugation symmetry, found Ehrenfest equations that depended on the sign of the energy of the state determining the expectation value, and encountered Klein tunneling, zitterbewegung, and the Klein paradox. The D1 alternative also predicted that deep potentials should pull positive energy states down into the negative energy continuum, possibly creating an unstable vacuum. Extensive experiments (1975-1997) found no evidence for this instability, but did find low energy electron-positron pairs with sharply defined energies and unusually low counting statistics. These pairs tended to disappear with higher beam currents. This paper explores the other alternative, here called D2, and finds that charge conjugation symmetry is preserved, the Ehrenfest equations are classical, Klein tunneling is not present, unstable vacua are forbidden, zitterbewegung is absent in the charge current density, new excitations of bound electron-positron pairs are possible in atoms, and the energies at which low energy electron-positron pair production occurs in heavy ion scattering are well described. Also, all of the positive energy calculations, including those with the Coulomb potential, the hydrogen-like atom, are retained exactly the same as found in alternative D1. It might have been better if Dirac had chosen alternative D2.

We investigate the effect of the intrinsic spin of a fundamental spinor field on the surrounding spacetime geometry. We show that despite the lack of a rotating stress-energy source (and despite claims to the contrary) the intrinsic spin of a spin-half fermion gives rise to a frame-dragging effect analogous to that of orbital angular momentum, even in Einstein-Hilbert gravity where torsion is constrained to be zero. This resolves a paradox regarding the counter-force needed to restore Newton's third law in the well-known spin-orbit interaction. In addition, the frame-dragging effect gives rise to a long-range gravitationally mediated spin-spin dipole interaction coupling the internal spins of two sources. We argue that despite the weakness of the interaction, the spin-spin interaction will dominate over the ordinary inverse square Newtonian interaction in any process of sufficiently high energy for quantum field theoretical effects to be non-negligible.

The cosmological Robertson-Walker metric of general relativity is often said to have the consequences that (1) the recessional velocity $v$ of a galaxy at proper distance $\ell$ obeys the Hubble law $v=H\ell$, and therefore galaxies at sufficiently great distance $\ell$ are receding faster than the speed of light $c$; (2) faster than light recession does not violate special relativity theory because the latter is not applicable to the cosmological problem, and because "space itself is receding" faster than $c$ at great distance, and it is velocity relative to local space that is limited by $c$, not the velocity of distant objects relative to nearby ones; (3) we can see galaxies receding faster than the speed of light; and (4) the cosmological redshift is not a Doppler shift, but is due to a stretching of photon wavelength during propagation in an expanding universe. We present a particular Robertson-Walker metric (an empty universe metric) for which a coordinate transformation shows that none of these interpretations necessarily holds. The resulting paradoxes of interpretation lead to a deeper understanding of the meaning of the cosmological metric.
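Claim (1) can be made concrete with a quick calculation of the distance at which $v=H\ell$ reaches $c$, assuming an illustrative present-day Hubble constant of about 70 km/s/Mpc (a conventional value, not one taken from the paper):

```python
# Distance at which the Hubble law v = H * l gives recession at the
# speed of light, for an illustrative H0 (not a value from the paper).

C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # assumed Hubble constant, km/s per Mpc

hubble_radius_mpc = C_KM_S / H0                     # proper distance where v = c
hubble_radius_gly = hubble_radius_mpc * 3.2616e-3   # 1 Mpc ~ 3.2616e6 ly

print(f"{hubble_radius_mpc:.0f} Mpc (~{hubble_radius_gly:.1f} Gly)")
# Under a naive reading of v = H*l, galaxies beyond roughly this
# distance recede faster than c -- the starting point of the paradox.
```

For H0 = 70 km/s/Mpc this gives roughly 4300 Mpc, about 14 billion light-years, which is why superluminal recession is commonly attributed to galaxies near the edge of the observable universe.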

The predictive value of radionuclide ventriculography was studied in 34 patients with depressed left ventricular ejection fraction (less than 40%) and clinically evident congestive heart failure secondary to atherosclerotic coronary artery disease. In addition to left ventricular ejection fraction, right ventricular ejection fraction and extent of left ventricular paradox were obtained in an attempt to identify a subgroup at increased risk of mortality during the ensuing months. The 16 patients who were alive after a 2 year follow-up period had a higher right ventricular ejection fraction and less extensive left ventricular dyskinesia. When a right ventricular ejection fraction of less than 35% was used as a discriminant, mortality was significantly greater among the 21 patients with a depressed right ventricular ejection fraction (71% versus 23%), a finding confirmed by a life table analysis. It appears that the multiple factors contributing to the reduction in right ventricular ejection fraction make it a useful index not only for assessing biventricular function, but also for predicting patient outcome.