Sample records for increasing congestion levels

A motivation exists to formulate and implement new tools and methodologies to address the problem of congestion in the National Airspace System (NAS). This thesis presents a novel methodology for allocating aircraft among ...

The predictive value of radionuclide ventriculography was studied in 34 patients with depressed left ventricular ejection fraction (less than 40%) and clinically evident congestive heart failure secondary to atherosclerotic coronary artery disease. In addition to left ventricular ejection fraction, right ventricular ejection fraction and extent of left ventricular paradox were obtained in an attempt to identify a subgroup at increased risk of mortality during the ensuing months. The 16 patients who were alive after a 2 year follow-up period had a higher right ventricular ejection fraction and less extensive left ventricular dyskinesia. When a right ventricular ejection fraction of less than 35% was used as a discriminant, mortality was significantly greater among the 21 patients with a depressed right ventricular ejection fraction (71 versus 23%), a finding confirmed by a life table analysis. It appears that the multiple factors contributing to the reduction in right ventricular ejection fraction make it a useful index not only for assessing biventricular function, but also for predicting patient outcome.

Congestion begins when there is an excess of vehicles on a segment of roadway at a given time, resulting in speeds that are significantly slower than normal or 'free flow' speeds. Congestion often means stop-and-go traffic. The transition occurs when vehicle density (the number of vehicles per mile in a lane) exceeds a critical level. Once traffic enters a state of congestion, recovery, or the time to return to a free-flow state, is lengthy; and during the recovery process, delay continues to accumulate. The breakdown in speed and flow greatly impedes the efficient operation of the freeway system, resulting in economic, mobility, environmental and safety problems. Freeways are designed to function as access-controlled highways characterized by uninterrupted traffic flow, so references to freeway performance relate primarily to the quality of traffic flow or traffic conditions as experienced by users of the freeway. The maximum flow or capacity of a freeway segment is reached while traffic is moving freely. As a result, freeways are most productive when they carry capacity flows at 60 mph, whereas lower speeds impose freeway delay, resulting in bottlenecks. Bottlenecks may be caused by physical disruptions, such as a reduced number of lanes, a change in grade, or an on-ramp with a short merge lane. This type of bottleneck occurs on a predictable or 'recurrent' basis at the same time of day and same day of week. Recurrent congestion accounts for 45% of total congestion and stems primarily from bottlenecks (40%) as well as inadequate signal timing (5%). Nonrecurring bottlenecks result from crashes, work zone disruptions, adverse weather conditions, and special events that create surges in demand; together these account for the remaining 55% of experienced congestion. Figure 1.1 shows that nonrecurring congestion is composed of traffic incidents (25%), severe weather (15%), work zones (10%), and special events (5%). Between 1995 and 2005, peak traveler delay, measured as hours spent in traffic per year, grew by 22%, as the national average rose from 36 hours to 44 hours. Peak delay per traveler grew by one-third in medium-size urban areas over the 10-year period. The traffic engineering community has developed an arsenal of integrated tools to mitigate the impacts of congestion on freeway throughput and performance, including pricing of capacity to manage demand for travel. Congestion pricing is a strategy that dynamically matches demand with available capacity. A congestion price is a user fee equal to the added cost imposed on other travelers as a result of the last traveler's entry into the highway network. The concept is based on the idea that motorists should pay for the additional congestion they create when entering a congested road. The concept calls for fees to vary according to the level of congestion, with the price mechanism applied to make travelers more fully aware of the congestion externality they impose on other travelers and the system itself. The operational rationales for the institution of pricing strategies are to improve the efficiency of operations in a corridor and/or to better manage congestion. To this end, the objectives of this project were to: (1) better understand and quantify the impacts of congestion pricing strategies on traffic operations through the study of actual projects, and (2) better understand and quantify the impacts of congestion pricing strategies on traffic operations through the use of modeling and other analytical methods.
Specifically, the project was to identify credible analytical procedures that FHWA can use to quantify the impacts of various congestion pricing strategies on traffic flow (throughput) and congestion.
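
As a back-of-the-envelope illustration of the pricing concept described above, the sketch below computes a marginal-external-cost toll for a single freeway segment; the BPR volume-delay function and all numeric parameters are illustrative assumptions, not values from the FHWA project.

```python
# Hedged sketch: marginal-external-cost congestion toll for one freeway segment.
# The BPR volume-delay function and the numeric parameters below are illustrative
# assumptions, not values taken from the project described above.

def bpr_travel_time(q, t0=10.0, capacity=2000.0, alpha=0.15, beta=4.0):
    """Travel time (minutes) on a segment as a function of flow q (veh/h)."""
    return t0 * (1.0 + alpha * (q / capacity) ** beta)

def marginal_external_cost(q, value_of_time=0.30, dq=1.0):
    """Extra delay cost the marginal vehicle imposes on all others ($/veh).

    Each of the q vehicles already on the segment is delayed by dT/dq minutes,
    so the externality is q * dT/dq * value_of_time (here $ per minute).
    """
    dT_dq = (bpr_travel_time(q + dq) - bpr_travel_time(q)) / dq
    return q * dT_dq * value_of_time

if __name__ == "__main__":
    for q in (1000, 1800, 2200):
        print(f"flow {q:4d} veh/h -> toll ${marginal_external_cost(q):.2f}")
```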

... designed into a human space flight vehicle, NASA has created the Function-specific Level of Autonomy and Automation Tool. The CEV will employ higher levels of automation than previous NASA vehicles, due to program requirements for automation, including Automated Rendezvous ... A key technology to the success of the CEV ...

Airport surface congestion results in significant increases in taxi times, fuel burn and emissions at major airports. This paper presents the field tests of a control strategy for airport congestion control at Boston Logan ...

The Impacts of Congestion on Time-Definitive Urban Freight Distribution Networks' CO2 Emissions. Pressures to limit the impacts associated with CO2 emissions are mounting rapidly. A key challenge ... on CO2 emissions are hindered by the complexities of vehicle routing problems with time ...

Figure 1 depicts a conceptual "throughput versus load" curve for a network. In the presence of congestion, beyond a certain point the network throughput decreases with increasing load instead of maintaining its peak value, i.e. it follows the "Congested" curve in Figure 1 and drops toward zero as the load increases.

In a previous study of plateout and resuspension effects for ²²²Rn progeny, an unexpected suppression of the airborne ²¹⁸Po and ²¹⁴Po levels, not predicted by theory or reported in other work, was observed when high ²²²Rn concentrations were used in a 0.283-m³ test chamber. Two separate time-dependent methods were used, and are reported here, to measure this airborne progeny suppression effect in an attempt to determine its magnitude and cause and its possible consequences for prior and ongoing radon research by others. The earlier buildup method was used to observe the buildup phase of ²²²Rn and its daughters from a constant emanation source, a constant air change rate (ACH), and initially zero concentrations of Rn and progeny. The data were compared with theory using Leonard's solutions to the Bateman equations to determine the magnitude of the suppression. The second method, referred to as the "down" method, was to measure the decrease in ²²²Rn and progeny concentrations from an initially injected high ²²²Rn activity concentration in the test chamber, the decrease resulting from a constant ACH of approximately 0.1 h⁻¹ imposed by the gradual removal of air from the chamber at a constant rate of approximately 0.5 L/min. No ²²²Rn emanation source was present during the second method after the initial injection, so the ²²²Rn level underwent a monotonic decrease in concentration.
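
For context, the linear decay-and-ventilation balance that Bateman-type solutions of the kind referenced above describe can be written as follows (notation assumed here; Leonard's specific solutions are not reproduced):

```latex
% Hedged sketch (assumed notation): balance for ^{222}Rn and each progeny i in a
% chamber with constant emanation source Q_i and constant air change rate
% \lambda_v; the referenced solutions solve this linear decay chain.
\[
  \frac{dN_i}{dt} \;=\; Q_i \;+\; \lambda_{i-1} N_{i-1}
  \;-\; \bigl(\lambda_i + \lambda_v + \lambda_{p,i}\bigr)\, N_i ,
\]
where $N_i$ is the airborne number of atoms of species $i$, $\lambda_i$ its decay
constant, $\lambda_v$ the air change rate, and $\lambda_{p,i}$ an effective
plateout (deposition) rate. The buildup method starts from $N_i(0)=0$ with
$Q_i>0$, while the ``down'' method starts from an injected $^{222}\mathrm{Rn}$
concentration with $Q_i=0$.
```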

Transmission congestion occurs when there is insufficient transmission capacity to simultaneously accommodate all requests for transmission service within a region. Historically, vertically integrated utilities managed this condition by constraining the economic dispatch of generators with the objective of ensuring security and reliability of their own and/or neighboring systems. Electric power industry restructuring has moved generation investment and operations decisions into the competitive market but has left transmission as a communal resource in the regulated environment. This mixing of competitive generation and regulated transmission makes congestion management difficult. The difficulty is compounded by increases in the amount of congestion resulting from increased commercial transactions and the relative decline in the amount of transmission. Transmission capacity, relative to peak load, has been declining in all regions of the U.S. for over a decade. This decline is expected to continue. Congestion management schemes used today have negative impacts on energy markets, such as disruptions and monetary penalties, under some conditions. To mitigate these concerns various congestion management methods have been proposed, including redispatch and curtailment of scheduled energy transmission. In the restructured electric energy industry environment, new congestion management approaches are being developed that strive to achieve the desired degree of reliability while supporting competition in the bulk power market. This report first presents an overview and background on key issues and emerging approaches to congestion management. It goes on to identify and describe policies affecting congestion management that are favored and/or are now being considered by FERC, NERC, and one of the regional reliability councils (WSCC). It reviews the operational procedures in use or proposed by three of the leading independent system operators (ISOs) including ERCOT, California ISO, and PJM. Finally, it presents recommendations for evaluating the competing alternative approaches and developing metrics to use in such evaluations. As with any report concerning electricity restructuring, specific details quickly become dated. Individual utilities, states and regions will inevitably change rules and procedures even during the time it takes to publish a report. Hopefully, the general conclusions are more robust and this report will continue to have value even after some of the specific details have changed.

The increasing penetration of electric vehicles over the coming decades, taken together with the high cost to upgrade local distribution networks, and consumer demand for home charging, suggest that managing congestion on low voltage networks will be a crucial component of the electric vehicle revolution and the move away from fossil fuels in transportation. Here, we model the max-flow and proportional fairness protocols for the control of congestion caused by a fleet of vehicles charging on distribution networks. We analyse the inequality in the charging times as the vehicle arrival rate increases, and show that charging times are considerably more uneven in max-flow than in proportional fairness. We also analyse the onset of instability, and find that the critical arrival rate is indistinguishable between the two protocols.
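
To make the comparison concrete, here is a minimal sketch of the two allocation rules on a toy two-constraint feeder; the network, capacities, and the use of numpy/scipy are illustrative assumptions, not the model or networks of the study.

```python
# Hedged sketch: proportional-fair vs. max-flow charging-rate allocation on a toy
# distribution feeder. The two-constraint network and the capacities below are
# illustrative assumptions, not the networks studied in the paper above.
import numpy as np
from scipy.optimize import minimize, linprog

# Each row of A is a network constraint (feeder head, local transformer);
# columns are vehicles. A @ x <= cap limits the aggregate charging rates x (kW).
A = np.array([[1.0, 1.0, 1.0, 1.0],    # feeder head serves all four vehicles
              [0.0, 0.0, 1.0, 1.0]])   # one transformer serves vehicles 3 and 4
cap = np.array([20.0, 5.0])
n = A.shape[1]

# Max-flow: maximize total throughput sum(x); can leave some vehicles starved.
mf = linprog(c=-np.ones(n), A_ub=A, b_ub=cap, bounds=[(0.0, 7.0)] * n)

# Proportional fairness: maximize sum(log x) subject to the same constraints.
pf = minimize(lambda x: -np.sum(np.log(x)),
              x0=np.full(n, 1.0),
              bounds=[(1e-6, 7.0)] * n,
              constraints=[{"type": "ineq", "fun": lambda x: cap - A @ x}],
              method="SLSQP")

print("max-flow rates (kW):          ", np.round(mf.x, 2))
print("proportional-fair rates (kW): ", np.round(pf.x, 2))
```

With these assumed numbers, both rules deliver the same total power, but max-flow may assign the shared transformer capacity very unevenly, whereas proportional fairness splits it equally, which is the kind of inequality contrast the abstract describes for charging times.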

We address here the issue of congestion in the modeling of crowd motion, in the non-smooth framework: contacts between people are not anticipated and avoided; they actually occur, and they are explicitly taken into account in the model. We limit our approach to very basic principles in terms of behavior, to focus on the particular problems raised by the non-smooth character of the models. We consider that individuals tend to move according to a desired, or spontaneous, velocity. We account for congestion by assuming that the evolution realizes at each time an instantaneous balance between individual tendencies and global constraints (overlapping is forbidden): the actual velocity is defined as the closest to the desired velocity among all admissible ones, in a least-squares sense. We develop those principles in the microscopic and macroscopic settings, and we present how the framework of the Wasserstein distance between measures allows us to recover the sweeping process nature of the problem on the macroscopic level, ...
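
In the notation commonly used for this class of non-smooth models (assumed here, not quoted from the paper), the velocity-selection principle reads:

```latex
% Hedged sketch (assumed notation): the actual velocity u is the least-squares
% projection of the desired velocity U onto the set of admissible velocities.
\[
  u(q) \;=\; \operatorname*{argmin}_{v \in C_q} \; \| v - U(q) \|^{2},
  \qquad
  C_q \;=\; \bigl\{\, v \;:\; D_{ij}(q) = 0 \;\Rightarrow\; \nabla D_{ij}(q)\cdot v \ge 0 \,\bigr\},
\]
where $q$ is the configuration of individuals and $D_{ij}(q)$ the distance between
individuals $i$ and $j$, so contacts ($D_{ij}=0$) may occur but overlapping is
forbidden.
```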

In this report, we evaluate individual options that have the potential to stem the decline in the marginal value of variable generation (VG) with increasing penetration levels. We focus only on the effectiveness of mitigation measures for wind and PV.

... of the congestion management allocation scheme. The time frame used here for the consideration of congestion ... attributable to each transaction. The congestion allocation results are explicitly represented in the IGO ... We devote Section 5.4 to the presentation of numerical results and some discussion of the implementation ...

Environmental temperature and turkey performance: the use of diets containing increased levels ... turkeys kept under an 18 or 25 °C ... choice with whole wheat. Turkeys which received the conventional diets (178 or 209 g protein/kg) grew significantly ...

Operating the Irish Power System with Increased Levels of Wind Power. Aidan Tuohy, Student Member ... of Ireland. Using results from various studies performed on this system, it is shown that increased levels of installed wind power will have implications for the operation of power systems. These will be seen ...

Carbon Dioxide-Induced Anesthesia Results in a Rapid Increase in Plasma Levels of Vasopressin. Brian ... Anesthesia with carbon dioxide prior to decapitation is considered a more humane alternative for euthanasia ... with carbon dioxide until recumbent (20-25 sec), immediately killed via decapitation, and trunk blood ...

... Channel and High Performance Computing (HPC) networks over InfiniBand not only increases the cost ... as the unified switch fabric for all of the TCP/IP traffic, the storage traffic and the high performance computing traffic in data centers. Backward Congestion Notification (BCN) is the basic mechanism for the end ...

Air traffic is expected to continue to grow in the future and improved methods for dealing with the increased demand on the system need to be designed and implemented. One method for reducing surface congestion at airports ...

Over the past two decades, the City of Chicago, like many of its counterparts in the U.S., has experienced a great increase in traffic congestion, which limits regional mobility, induces a huge amount of energy waste and ...

A number of waste components in US defense high level radioactive wastes (HLW) have proven challenging for current Joule-heated ceramic melter (JHCM) operations and have limited the ability to increase waste loadings beyond already realized levels. Many of these "troublesome" waste species cause crystallization in the glass melt that can negatively impact product quality or have a deleterious effect on melter processing. Recent efforts at US Department of Energy laboratories have focused on understanding crystallization behavior within HLW glass melts and investigating approaches to mitigate the impacts of crystallization so that increases in waste loading can be realized. Advanced glass formulations have been developed to highlight the unique benefits of next-generation melter technologies such as the Cold Crucible Induction Melter (CCIM). Crystal-tolerant HLW glasses have been investigated to allow sparingly soluble components such as chromium to crystallize in the melter but pass out of the melter before accumulating. The Hanford site AZ-101 tank waste composition represents a waste group that is waste-loading limited primarily due to high concentrations of Fe2O3 (with higher Al2O3). Systematic glass formulation development utilizing slightly higher process temperatures and higher tolerance to spinel crystals demonstrated that an increase in waste loading of more than 20% could be achieved for this waste composition, and by extension higher loadings for wastes in the same group.

Congestion occurs in rural areas due to the important increase of wind generation [1]. In the literature, many methods ... Congestions are affected by errors in load and generation prediction due to element outage or random production, as is wind generation. Therefore, most European TSOs have chosen to manage congestions related to wind generation separately ...

Runway and airspace congestion are the primary causes of flight delays in the US. These delays cost airlines and airline customers billions of dollars per year. This thesis consists of two essays. The first essay focuses ...

... several studies on using economic incentives to help shape consumer behaviors towards more efficient use ... and/or punishments to help reduce network congestion. We found a high level of compliance with economic incentives ... Our results indicate a high level of compliance with economic incentives and disincentives. Detailed ...

The main factors that determine the cost of high-level waste (HLW) vitrification are the waste loading (which determines the volume of glass) and the melting rate. Product quality should be the only factor determining the waste loading, while melter design should provide a rapid melting technology. In reality, the current HLW melters are slow in glass-production rate and are subjected to operational risks that require waste loading to be kept far below its intrinsic level. One of the constraints that decrease waste loading is the liquidus-temperature limit. Close inspection reveals that this constraint is probably too severe, even for the current technology. The purpose of the liquidus-temperature constraint is to prevent solids from settling on the melter bottom. It appears that some limited settling would neither interfere with melter operation nor shorten its lifetime and that the rate of settling can be greatly reduced if only small crystals are allowed to form.

Historically, NASA has operated at low levels of automation and relied heavily on manual control and ground-based planning. In early spacecraft such as Mercury, Gemini, and Apollo, computer technology limited the amount of automation. However, some routine... program and was the primary technical objective of the Gemini missions [5]. In the Apollo program, the Lunar Module (LM) had to successfully rendezvous with the Command and Service Module (CSM) on its return from the lunar surface. Rendezvous also allows...

Benzene is a known human leukemogen, but its role as an in utero leukemogen remains controversial. Epidemiological studies have correlated parental exposure to benzene with an increased incidence of childhood leukemias. We hypothesize that in utero exposure to benzene may cause leukemogenesis by affecting the embryonic c-Myb/Pim-1 signaling pathway and that this is mediated by oxidative stress. To investigate this hypothesis, pregnant CD-1 mice were treated with either 800 mg/kg of benzene or corn oil (i.p.) on days 10 and 11 of gestation and in some cases pretreated with 25 kU/kg of PEG-catalase. Phosphorylated and total embryonic c-Myb and Pim-1 protein levels were assessed using Western blotting and maternal and embryonic oxidative stress were assessed by measuring reduced to oxidized glutathione ratios. Our results show increased oxidative stress at 4 and 24 h after exposure, increased phosphorylated Pim-1 protein levels 4 h after benzene exposure, and increased Pim-1 levels at 24 and 48 h after benzene exposure. Embryonic c-Myb levels were elevated at 24 h after exposure. PEG-catalase pretreatment prevented benzene-mediated increases in embryonic c-Myb and Pim-1 protein levels, and benzene-induced oxidative stress. These results support a role for ROS in c-Myb and Pim-1 alterations after in utero benzene exposure.

Network congestion occurs when offered traffic load exceeds available capacity at any point in a network. In wireless sensor networks, congestion causes overall channel quality to degrade and loss rates to rise, leads to ...

faster than TCP in utilizing high bandwidth links while maintaining promising convergence properties. Third, we study the feasibility of employing congestion avoidance algorithms in TCP. We show that end-host based congestion prediction is more accurate...

Variable generation such as wind and photovoltaic solar power has increased substantially in recent years. Variable generation has unique characteristics compared to the traditional technologies that supply energy in the wholesale electricity markets. These characteristics create unique challenges in planning and operating the power system, and they can also influence the performance and outcomes from electricity markets. This report focuses on two particular issues related to market design: revenue sufficiency for long-term reliability and incentivizing flexibility in short-term operations. The report provides an overview of current design and some designs that have been proposed by industry or researchers.

A Comprehensive Contribution Factor Method for Congestion Management. H. Song, Student Member ... generator re-dispatching alone or combined with demand side management, due to the availability of load ... can be combined with demand side (load) management to solve the congestion. The congestion management scheme ...

The Savannah River Site's (SRS) Defense Waste Processing Facility (DWPF) began stabilizing high level waste (HLW) in a glass matrix in 1996. Over the past few years, there have been several process and equipment improvements at the DWPF to increase the rate at which the high level waste can be stabilized. These improvements have either directly increased waste processing rates or have desensitized the process to upsets, thereby minimizing downtime and increasing production. Optimization of waste throughput with increased HLW loading of the glass resulted in a 6% waste throughput increase based upon operational efficiencies. Improvements in canister production include the pour spout heated bellows liner (5%), glass surge (siphon) protection software (2%), a melter feed pump software logic change to prevent spurious interlocks of the feed pump with subsequent dilution of feed stock (2%), and optimization of the steam atomized scrubber (SAS) operation to minimize downtime (3%), for a total increase in canister production of 12%. A number of process recovery efforts have also allowed continued operation; these include off-gas system pluggage and restoration, slurry mix evaporator (SME) tank repair and replacement, remote cleaning of the melter top head center nozzle, remote melter internal inspection, SAS pump J-tube recovery, inadvertent pour scenario resolutions, dome heater transformer bus bar cooling water leak repair, and a new infrared camera for determination of glass height in the canister.

Transmission lines deliver electricity that is generated at power plants to loads. When there is not sufficient transmission capacity to schedule or transport all desired electricity transfers, the transmission system is constrained, and the particular line, flowgate or interface is congested. While it is useful to measure congestion for several reasons—to identify where and how much congestion exists and how this changes over time, to determine whether or what to do about it, and to assess the effectiveness of actions taken—it is challenging to measure congestion in a meaningful and consistent way across markets or over time in the same market. This paper examines current public reporting of congestion measures for organized markets in the U.S., and what these measures can and cannot tell us about congestion across regions or over time in the same region.

Recently, independent system operators (ISOs) and others have published reports on the costs of transmission congestion. The magnitude of congestion costs cited in these reports has contributed to the national discussion on the current state of the U.S. electricity transmission system and whether it provides an adequate platform for competition in wholesale electricity markets. This report reviews reports of congestion costs and begins to assess their implications for the current national discussion on the importance of the U.S. electricity transmission system for enabling competitive wholesale electricity markets. As a guiding principle, we posit that a more robust electricity system could reduce congestion costs and thereby (1) facilitate more vibrant and fair competition in wholesale electricity markets, and (2) enable consumers to seek out the lowest prices for electricity. Yet, examining the details suggests that, sometimes, there will be trade-offs between these goals. Therefore, when evaluating options (both transmission and non-transmission alternatives) to address transmission congestion, it is essential to understand who pays, how much, and how they benefit. To describe the differences among published estimates of congestion costs, we develop and motivate three ways by which transmission congestion costs are calculated in restructured markets. The assessment demonstrates that published transmission congestion costs are not directly comparable because they have been developed to serve different purposes. More importantly, critical information needed to make them more comparable, for example in order to evaluate the impacts of options to relieve congestion, is sometimes not available.

Estimating the Actual Cost of Transmission System Congestion. Thomas J. Overbye, Department ... Assuming complete knowledge of the power system inputs, such as the loads at all system buses, ... is straightforward. ... congestion on a large-scale power system, such as the North American Eastern Interconnect, the simplicity ...

This thesis makes three related contributions to the broad literature on congestion pricing. First, it examines three policy dimensions that underlie pricing: the economic arguments that motivate it, the technological ...

Understanding policy impacts on freight is essential for planners who have overlooked this transport group in the past and must evaluate new congestion alleviation policies with respect to regional economic and social ...

In wireless sensor networks (WSN), nodes have very limited power due to hardware constraints. Packet losses and retransmissions resulting from congestion cost precious energy and shorten the lifetime of sensor nodes. This problem motivates the need...

Congestion is a major problem for the major cities of today. It reduces mobility, slows economic growth, and is a major cause of emissions. Vehicles traveling at slow speeds emit significantly more pollutants than vehicles ...

... in networking technology, the recent pursuit of better flow control of network traffic has led to the emergence of several explicit-feedback congestion control methods. As a first step towards understanding these methods, we analyze the stability and transient...

In April 2007, New York City's Mayor Bloomberg released PlaNYC, a broad ranging set of planning initiatives for the city. A centerpiece of the plan was a congestion-pricing proposal for the downtown core in Manhattan. The ...

Optimising the flow of traffic on road networks can be achieved through sharing information about congestion events to road users. Using GPS sensors to track vehicle movements can provide information to monitor speeds ...

3D Module Placement for Congestion and Power Noise Reduction. Jacob R. Minz, School of ECE, Georgia ...

In electronic systems, the ever-increasing level of integration is paced by component scaling. Consequently, portable ... as component dimensions scale down. Hence, for the robustness of electronic systems, system-level protection ... improvements in electrostatic discharge (ESD) reliability during a device ...

In mammals and birds, low oxygen levels in the lungs cause a constriction of the pulmonary ... be demonstrated in denervated lungs that are devoid of external neurohumoral influences (e.g. Fishman, 1976) ... and ventilation by diverting pulmonary blood flow from poorly ventilated to well-ventilated regions of the lung ...

... fluorescent tube lighting, PCs and laptops have become more prominent in the residential household. With an expectation of higher penetrations of electric vehicle chargers and renewable energy devices, the increased usage of nonlinear devices ... by type. The disadvantage to this approach is that it relies upon ...

If the ethnic makeup of the astronomy profession is to achieve parity with the general population within one generation (~30 years), the number of underrepresented minorities earning graduate degrees in astronomy and astrophysics must increase in the coming decade by a factor of 5 to 10. To accomplish this, the profession must develop and invest in mechanisms to more effectively move individuals across critical educational junctures to the PhD and beyond. Early and continuous research engagement starting in the undergraduate years is critical to this vision, in which the federally funded research internship programs (e.g. NSF REU, NASA GSRP) and national centers/observatories play a vital role. Regionally based partnerships with minority-serving institutions (MSIs) are crucial for tapping extant pools of minority talent, as are post-baccalaureate and/or master's degree "bridging" programs that provide critical stepping stones to the PhD. Because of the strong undergraduate physics, engineering, and computer sci...

The essential difference between Revision 1 and the original issue of this report is the analysis of the anchor bolts that tie the steel dome of the primary tank to the concrete tank dome. The reevaluation of the AP anchor bolts showed that (for a given temperature increase) the anchor shear load distribution did not change significantly from the initially higher stiffness to the new secant shear stiffness. Therefore, the forces and displacements of the other tank components such as the primary tanks stresses, secondary liner strains, and concrete tank forces and moments also did not change significantly. Consequently, the revised work in Revision 1 focused on the changes in the anchor bolt responses and a full reevaluation of all tank components was judged to be unnecessary.

Congestion on power grids seems a physical reality, a "hard" fact easy to check. Our paper models a different idea: congestion signal may be distorted by transmission system operators (TSOs), which puts the European ...

... of Business, University of British Columbia. Thurs. Feb. 28, 2013, 4:00-5:00 pm. Location: Transportation Center. ... discounted investment costs over a facility's lifetime. If the marginal cost of investment is constant ... of Business at UBC. His research interests include road pricing, traffic congestion models, financing roads ...

... congestion, market-based mechanisms using locational marginal prices (LMPs) have become the most ... transmission services and compute the pricing for those services. The inherent volatility of electricity markets introduces uncertainty in the LMPs and, consequently, in transmission pricing. In order to protect ...

Mitigating traffic congestion on urban roads, with paramount importance in urban development and reduction of energy consumption and air pollution, depends on our ability to foresee road usage and traffic conditions pertaining to the collective behavior of drivers, raising a significant question: to what degree is road traffic predictable in urban areas? Here we rely on the precise records of daily vehicle mobility based on GPS positioning device installed in taxis to uncover the potential daily predictability of urban traffic patterns. Using the mapping from the degree of congestion on roads into a time series of symbols and measuring its entropy, we find a relatively high daily predictability of traffic conditions despite the absence of any a priori knowledge of drivers' origins and destinations and quite different travel patterns between weekdays and weekends. Moreover, we find a counterintuitive dependence of the predictability on travel speed: the road segment associated with intermediate average travel ...
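
As a rough illustration of the symbolization-plus-entropy idea described above, the sketch below estimates an entropy rate with a Lempel-Ziv-style estimator and converts it into an upper bound on predictability via Fano's inequality; the symbol alphabet, estimator, and synthetic speed data are assumptions, not the paper's exact procedure or data.

```python
# Hedged sketch: predictability of a symbolized congestion time series via a
# Lempel-Ziv entropy-rate estimate and Fano's inequality. The symbol alphabet,
# estimator, and random example data are illustrative assumptions only.
import numpy as np

def symbolize(speeds, bins=(10.0, 25.0, 45.0)):
    """Map travel speeds (mph) to congestion symbols 0..len(bins)."""
    return np.digitize(speeds, bins)

def _contains(history, pattern):
    m, k = len(history), len(pattern)
    return any(history[j:j + k] == pattern for j in range(m - k + 1))

def lz_entropy_rate(seq):
    """Kontoyiannis-style Lempel-Ziv estimate of the entropy rate (bits/symbol)."""
    s, n = list(seq), len(seq)
    match_lengths = []
    for i in range(1, n):
        l = 1                                  # shortest new substring at i
        while i + l <= n and _contains(s[:i], s[i:i + l]):
            l += 1
        match_lengths.append(l)
    return (len(match_lengths) * np.log2(n)) / sum(match_lengths)

def max_predictability(entropy_rate, n_symbols):
    """Solve Fano's inequality H = H_b(P) + (1-P) log2(N-1) for the bound P."""
    def fano(p):
        hb = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
        return hb + (1 - p) * np.log2(n_symbols - 1)
    lo, hi = 1.0 / n_symbols, 1 - 1e-9
    for _ in range(60):                        # bisection on the decreasing branch
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if fano(mid) > entropy_rate else (lo, mid)
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    speeds = 30 + 15 * np.sin(np.linspace(0, 8 * np.pi, 500)) + rng.normal(0, 5, 500)
    h = lz_entropy_rate(symbolize(speeds))
    print(f"entropy rate ~ {h:.2f} bits/symbol, "
          f"predictability bound ~ {max_predictability(h, 4):.2f}")
```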

Purpose: c-Met is overexpressed in some non-small cell lung cancer (NSCLC) cell lines and tissues. Cell lines with higher levels of c-Met expression and phosphorylation depend on this receptor for survival. We studied the effects of AMG-458 on 2 NSCLC cell lines. Methods and Materials: 3-(4,5-Dimethylthiazol-2-yl)-5-(3-carboxymethoxyphenyl)-2-(4-sulfophenyl) -2H-tetrazolium assays assessed the sensitivities of the cells to AMG-458. Clonogenic survival assays illustrated the radiosensitizing effects of AMG-458. Western blot for cleaved caspase 3 measured apoptosis. Immunoblotting for c-Met, phospho-Met (p-Met), Akt/p-Akt, and Erk/p-Erk was performed to observe downstream signaling. Results: AMG-458 enhanced radiosensitivity in H441 but not in A549. H441 showed constitutive phosphorylation of c-Met. A549 expressed low levels of c-Met, which were phosphorylated only in the presence of exogenous hepatocyte growth factor. The combination of radiation therapy and AMG-458 treatment was found to synergistically increase apoptosis in the H441 cell line but not in A549. Radiation therapy, AMG-458, and combination treatment were found to reduce p-Akt and p-Erk levels in H441 but not in A549. H441 became less sensitive to AMG-458 after small interfering RNA knockdown of c-Met; there was no change in A549. After overexpression of c-Met, A549 became more sensitive, while H441 became less sensitive to AMG-458. Conclusions: AMG-458 was more effective in cells that expressed higher levels of c-Met/p-Met, suggesting that higher levels of c-Met and p-Met in NSCLC tissue may classify a subset of tumors that are more sensitive to molecular therapies against this receptor.

The choroid plexus, a barrier between the blood and cerebrospinal fluid (CSF), is known to accumulate lead (Pb) and also possibly function to maintain brain's homeostasis of Aβ, an important peptide in the etiology of Alzheimer's disease. This study was designed to investigate if Pb exposure altered Aβ levels at the blood-CSF barrier in the choroid plexus. Rats received ip injection of 27 mg Pb/kg. Twenty-four hours later, a FAM-labeled Aβ (200 pmol) was infused into the lateral ventricle and the plexus tissues were removed to quantify Aβ accumulation. Results revealed a significant increase in intracellular Aβ accumulation in the Pb-exposed animals compared to controls (p < 0.001). When choroidal epithelial Z310 cells were treated with 10 μM Pb for 24 h and 48 h, Aβ (2 μM in culture medium) accumulation was significantly increased by 1.5 fold (p < 0.05) and 1.8 fold (p < 0.05), respectively. To explore the mechanism, we examined the effect of Pb on low-density lipoprotein receptor protein-1 (LRP1), an intracellular Aβ transport protein. Following acute Pb exposure with the aforementioned dose regimen, levels of LRP1 mRNA and proteins in the choroid plexus were decreased by 35% (p < 0.05) and 31.8% (p < 0.05), respectively, in comparison to those of controls. In Z310 cells exposed to 10 μM Pb for 24 h and 48 h, a 33.1% and 33.4% decrease in the protein expression of LRP1 was observed (p < 0.05), respectively. Knocking down LRP1 resulted in even more substantial increases of cellular accumulation of Aβ, from 31% in cells without knockdown to 72% in cells with LRP1 knockdown (p < 0.05). Taken together, these results suggest that the acute exposure to Pb results in an increased accumulation of intracellular Aβ in the choroid plexus; the effect appears to be mediated, at least in part, via suppression of LRP1 production following Pb exposure.

... to pollution in the air. In a traditional pollution monitoring system, a handful of sites across a region of interest are carefully selected for the deployment of air quality monitoring equipment [3]. However, ... of pollution with a maximum error of 0.5 ppm, a negligible amount for the application of interest. ...

Bangkok is widely known for its severe traffic congestion. The Thai government advocates the concept of jobs and housing balance (JHB) as a strategy for reducing traffic congestion in Metropolitan Bangkok. The basic idea is to decentralize the jobs...

... of wind generation [2]. In the literature, many methods have been reported for congestion management ... due to element outage or random production, as is wind generation. ... related to wind generation [8]. This is due to the difficulties of predicting the exact congestion magnitude ...

error rates, like wireless links, packets are lost more due to error than due to congestion. But TCP does not differentiate between error and congestion losses and hence reduces the sending rate for losses due to error also, which unnecessarily reduces...

Congestion in the ISO-NE Electricity Markets. By Anna Barbara Ihrig. Thesis Advisor: Prof. George ... in charge of operation and control, the ISO-NE. We describe how the ISO-NE administers the energy market ... in causing congestion is analyzed; no significant correlation was found. In addition, the impacts of the ISO ...

Can ECN Be Used to Differentiate Congestion Losses from Wireless Losses? Saâd Biaz, Xia Wang (auburn.edu). Technical Report CSSE04-04, May 13, 2004. Abstract: TCP was designed and tuned to work well on networks where losses are mainly congestion losses. The performance of TCP decreases dramatically when a TCP connection ...

An Accumulation-based Congestion Control Model. Yong Xia, David Harrison, Shivkumar Kalyanaraman. ... proposes a model that uses accumulation, the buffered packets of a flow inside network routers, as a congestion measure from which a family of congestion control schemes can be derived. We call this model accumulation ...

... Orthopnea; Paroxysmal Nocturnal Dyspnea; Weight gain; Edema; Fatigue; Malaise; Cough or wheeze; Palpitations ... (not enough air in or out of lungs); Faster breathing; Noisy breathing; Cough which often worsens at night ... mucus production, increases the severity of asthma attacks, and lessens the effectiveness of a cough ...

The Internet is plagued with congestion problems of growing severity which are worst at peak periods. In this paper, we compare two schemes that incentivize users to shift part of their usage from the peak-time to the ...

This thesis analyzes the performance of Quantized Congestion Notification (QCN) during data access from clustered servers in data centers. The reasons why QCN does not perform adequately in these situations are examined and several modifications...

Coordinated Science Laboratory and Department of Electrical and Computer Engineering, University ... versions of TCP as an indication of congestion. In this paper, however, we propose and analyze a pricing ...

We propose a stylized model of a problem-solving organization whose internal communication structure is given by a fixed network. Problems arrive randomly anywhere in this network and must find their way to their respective specialized solvers by relying on local information alone. The organization handles multiple problems simultaneously. For this reason, the process may be subject to congestion. We provide a characterization of the threshold of collapse of the network and of the stock of floating problems (or average delay) that prevails below that threshold. We build upon this characterization to address a design problem: the determination of what kind of network architecture optimizes performance for any given problem arrival rate. We conclude that, for low arrival rates, the optimal network is very polarized (i.e. star-like or centralized), whereas it is largely homogeneous (or decentralized) for high arrival rates. These observations are in line with a common transformation experienced by information-intensive organizations as their work flow has risen in recent years.
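
A toy illustration (not the authors' model) of how a congestion threshold emerges in a fully centralized, star-like structure: every problem must pass through a single hub that forwards at most one problem per step, so the stock of floating problems stays bounded only while the aggregate arrival rate is below the hub's service rate.

```python
# Hedged toy sketch (not the paper's model): congestion collapse at the hub of a
# star-shaped ("centralized") organization. Each of n peripheral nodes generates
# problems at rate rho per step; every problem must pass through the hub, which
# forwards at most one problem per step, so the backlog stays bounded only while
# n * rho < 1.
import random

def hub_backlog(n=10, rho=0.08, steps=20000, seed=0):
    """Average number of problems waiting at the hub over the simulation."""
    rng = random.Random(seed)
    queue, total = 0, 0
    for _ in range(steps):
        queue += sum(rng.random() < rho for _ in range(n))  # new arrivals
        if queue:
            queue -= 1                                       # hub forwards one
        total += queue
    return total / steps

if __name__ == "__main__":
    for rho in (0.05, 0.08, 0.095, 0.11):
        print(f"rho = {rho:.3f} (n*rho = {10 * rho:.2f}) -> "
              f"mean backlog {hub_backlog(rho=rho):.1f}")
```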

In 2004, Australia, through the Australian Nuclear Science and Technology Organisation (ANSTO), created the Regional Security of Radioactive Sources (RSRS) project and partnered with the U.S. Department of Energy’s Global Threat Reduction Initiative (GTRI) and the International Atomic Energy Agency (IAEA) to form the Southeast Asian Regional Radiological Security Partnership (RRSP). The intent of the RRSP is to cooperate with countries in Southeast Asia to improve the security of their radioactive sources. This Southeast Asian Partnership supports objectives to improve the security of high risk radioactive sources by raising awareness of the need and developing national programs to protect and control such materials, improve the security of such materials, and recover and condition the materials no longer in use. The RRSP has utilized many tools to meet those objectives including: provision of physical protection upgrades, awareness training, physical protection training, regulatory development, locating and recovering orphan sources, and most recently - development of model security procedures at a model facility. This paper discusses the benefits of establishing a model facility, the methods employed by the RRSP, and three of the expected outcomes of the Model Facility approach. The first expected outcome is to increase compliance with source security guidance materials and national regulations by adding context to those materials, and illustrating their impact on a facility. Second, the effectiveness of each of the tools above is increased by making them part of an integrated system. Third, the methods used to develop the model procedures establishes a sustainable process that can ultimately be transferred to all facilities beyond the model. Overall, the RRSP has utilized the Model Facility approach as an important tool to increase the security of radioactive sources, and to position facilities and countries for the long term secure management of those sources.

Elevated CO2 increases the intrinsic water use efficiency (WUEi) of forests, but the magnitude of this effect and its interaction with climate is still poorly understood. We combined tree ring analysis with isotope measurements at three Free Air CO2 Enrichment sites (FACE: POP-EUROFACE in Italy; Duke FACE in North Carolina; and ORNL in Tennessee, USA) to cover the entire life of the trees. We used δ13C to assess carbon isotope discrimination (Δ13C, from which ci/ca is derived) and changes in WUEi, while direct CO2 effects on stomatal conductance were explored using δ18O as a proxy. Across all the sites, elevated CO2 increased 13C-derived WUEi on average by 73% for Liquidambar styraciflua, 77% for Pinus taeda and 75% for Populus sp., but through different ecophysiological mechanisms. Our findings provide a robust means of predicting WUEi responses from a variety of tree species exposed to variable environmental conditions over time, and species-specific relationships that can help model elevated CO2 and climate impacts on forest productivity, carbon and water balances.
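
For reference, the textbook leaf-level approximations that link the isotope signals mentioned above to ci/ca and WUEi (standard Farquhar-type relations, not equations quoted from the study) are:

```latex
% Standard approximations (not taken from the study): carbon isotope
% discrimination and intrinsic water-use efficiency as functions of c_i/c_a.
\[
  \Delta^{13}\mathrm{C} \;\approx\; a + (b - a)\,\frac{c_i}{c_a},
  \qquad
  \mathrm{WUE_i} \;=\; \frac{A}{g_s} \;=\; \frac{c_a - c_i}{1.6}
  \;=\; \frac{c_a}{1.6}\Bigl(1 - \frac{c_i}{c_a}\Bigr),
\]
with $a \approx 4.4$‰ (diffusion) and $b \approx 27$‰ (carboxylation), so a
$\Delta^{13}\mathrm{C}$-derived $c_i/c_a$ together with the known $c_a$ yields the
$\mathrm{WUE_i}$ trends reported above.
```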

Three-year increase of gamma-glutamyl transferase level and development of type 2 diabetes. P. André, B. Balkau, C. Born, M. A. Charles, E. Eschwège, and the D... ... is the main predictor for the development of type 2 diabetes, but there are no data on GGT change and type 2 ...

The Impact of Imperfect Permit Market on Congested Electricity Market Equilibrium. Tanachai ... market in conjunction with a permit market to study such interactions. The concept of conjectural variations is proposed to account for imperfect competition in the permit market. The model is then applied ...

Strong and Pareto Price of Anarchy in Congestion Games. Steve Chien, Alistair Sinclair. September 21. ... their strategies so as to improve all of their costs. The strong price of anarchy (see, e.g., [3]) is the ratio ... games, focusing on the relationships between the price of anarchy for these equilibria ...
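
For reference, the standard definitions behind the quantities this snippet mentions (notation assumed, not quoted from the paper):

```latex
% Standard definitions (assumed notation): for a congestion game with social cost
% C and equilibrium set Eq (Nash or strong Nash), the price of anarchy is the
% worst-case ratio of equilibrium cost to optimal cost.
\[
  \mathrm{PoA} \;=\; \frac{\max_{s \in \mathrm{Eq}} C(s)}{\min_{s^{*}} C(s^{*})},
  \qquad
  \text{strong PoA} \;=\; \frac{\max_{s \in \mathrm{SE}} C(s)}{\min_{s^{*}} C(s^{*})},
\]
where SE is the set of strong equilibria, i.e.\ profiles from which no coalition
of players can jointly deviate so that every member of the coalition improves.
```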

Adaptive Congestion Control in Broadband-ISDN: High Throughput with Sustained Quality of Service. ... The controller regulates QoS by manipulating the flow of controllable traffic into the network. Controllability ... In this paper we use an adaptive feedback and feedforward control system to maximise throughput such that the QoS ...

... discusses the effect on transmission congestion management and pricing of dynamic and steady-state models ... A benchmark system is used to illustrate and compare the effect on locational marginal prices and transmission marginal prices obtained from stability-constrained auction models when dynamic and steady-state FACTS ...

... control features) over Internet congested links (the bottlenecks). Within this framework the analysis ... in a general framework, where only some very elementary properties of the protocols and the network come ... network operation? iii) What are the sustainable values of link utilization without the risk of network ...

be incorporated in RED, an active queue management scheme. The proposed scheme possesses all the advantages of RED. In addition, it lowers the drop rates of short-lived flows and also those high bandwidth flows that reduce their sending rate when congestion...

... in the electric power industry in the US and around the world and the rapid growth of interregional trading of electric power require the development of procedures for coordinating congestion management across multiple ... the Program on Workable Energy Regulation (POWER). POWER is a program of the University of California Energy ...

Optimum and equilibrium in assignment problems with congestion: mobile terminals association ... In this setting, the problem corresponds to the determination of the locations at which mobile terminals prefer ... power needed by the mobile terminals over the whole network (global optimum), and a user optimization ...

A Bayesian Approach for TCP to Distinguish Congestion from Wireless Losses. Dhiman Barman ... ({barman,matta}@cs.bu.edu). Technical Report BUCS-2003-030. Abstract: The Transmission Control Protocol (TCP) has been the protocol ... approach to infer at the source host the reason for a packet loss, whether congestion or wireless ...

Using End-to-End Statistics to Distinguish Congestion and Corruption Losses: A Negative Result. Saad ... http://www.cs.tamu.edu/faculty/vaidya/mobile.html. Technical Report 97-009, August 18, 1997. Abstract: On wireless links, the rate of corruption losses can be significant, leading to poor TCP performance. The performance gets worse when these losses are mistaken ...

2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD) administered in vivo causes a drastic reduction in the weight of the mouse thymus at low doses, the reduction becoming statistically significant after 2 days. To understand the cause of such thymic involution, TCDD-evoked changes in various biochemical parameters in this tissue were examined. The most noticeable change was observed in the increased activity of specific protein-tyrosine kinases and protein kinase C and an increased level of p21(ras)-associated binding of [³H]GTP. The above changes appear to be a selective effect on these special classes of proteins. It has become apparent that the rise in protein-tyrosine kinase activities becomes significant within 24 hr, whereas the rise in protein kinase C does not become significant until 48 hr. In view of similarities between TCDD and thyroid hormones in causing thymic involution, the levels of c-erb-A expression were assessed in the liver by using an avian ³²P-labeled v-erb-A probe and the RNA transfer blot hybridization technique. The results clearly indicate that TCDD has the property to elevate levels of mRNA bearing homology to v-erb-A. Based on such observations, a hypothesis has been proposed that TCDD owes its potency to its ability to stimulate the expression of one of a family of DNAs bearing homology to v-erb-A and that one of the major consequences of such an action is stimulation of various tyrosine kinases.

For most fast moving consumer goods, demand patterns are quite erratic, being a function of both sales promotions (i.e., seasonal) and regular demand. In response to this, at the strategic or aggregate level ... demand. Several multinational companies use distribution center networks in this manner. For example ...

... signal on a variable, complex and non-transparent constraint that they may manipulate to benefit from the congestion management's revenues or for any other personal agenda. The "Light Handed Regulation" in Nordic countries makes this strategy even more ... In the optimisation programs, these laws appear as constraints that take into account the way electric power is transmitted in the grid.

and weight, and increase safety and reliability of next-generation hybrid, battery-powered, and fuel cell. Through materials characterization, processing, and systems simulations, lab experts are developing next-generation batteries and manufacturing processes. ORNL is also addressing the bioenergy supply chain to enable large

GCA: Global Congestion Awareness for Load Balance in Networks-on-Chip. (August 2012) Mukund Ramakrishna, B.Tech., International Institute of Information Technology, Hyderabad. Co-Chairs of Advisory Committee: Dr. Paul V. Gratz, Dr. Alexander... ... ⌊√N/2⌋; if √N is odd, the lower bound is ⌊√N/2⌋ − 1. Whenever the node updates one of the links in the map, our route computation module only needs to recompute a subgraph of the network. This is because the traffic flow from the source node...

As the speed of the compressor was increased per test, any excess charge was held in ... COP despite saturated conditions at the inlet to the compressor. Therefore, optimal airflow and refrigerant subcooling at the condenser exit were experimentally determined at discrete compressor speeds.

We propose a framework for constructing microscopic traffic models from microscopic acceleration patterns that can in principle be experimental measured and proper averaged. The exact model thus obtained can be used to justify the consistency of various popular models in the literature. Assuming analyticity of the exact model, we suggest that a controlled expansion around the constant velocity, uniform headway "ground state" is the proper way of constructing various different effective models. Assuming a unique ground state for any fixed average density, we discuss the universal properties of the resulting effective model, focusing on the emergent quantities of the coupled non-linear ODEs. These include the maximum and minimum headway that give the coexistence curve in the phase diagram, as well as an emergent intrinsic scale that characterizes the strength of interaction between clusters, leading to non-trivial cluster statistics when the unstable ground state is randomly perturbed. Utilizing the universal properties of the emergent quantities, a simple algorithm for constructing an effective traffic model is also presented. The algorithm tunes the model with statistically well-defined quantities extracted from the flow-density plot, and the resulting effective model naturally captures and predicts many quantitative and qualitative empirical features of the highway traffic, especially in the presence of an on-ramp bottleneck. The simplicity of the effective model provides strong evidence that stochasticity, diversity of vehicle types and modeling of complicated individual driving behaviors are \\emph{not} fundamental to many observations of the complex spatiotemporal patterns in the real traffic dynamics. We also propose the nature of the congested phase can be well characterized by the long lasting transient states of the effective model, from which the wide moving jams evolve.
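
As a concrete example of the "popular models in the literature" such a framework can be related to, here is a minimal sketch of the optimal-velocity (Bando) car-following model on a ring road; the parameters and setup are illustrative assumptions, and with the chosen sensitivity the uniform state is linearly unstable, so stop-and-go waves emerge from small perturbations.

```python
# Hedged sketch: the optimal-velocity (Bando) car-following model, one popular
# model this framework can relate to; the parameters and ring-road setup are
# illustrative assumptions, not the paper's effective model.
import numpy as np

def optimal_velocity(headway, v_max=30.0, h_c=25.0, width=10.0):
    """Preferred speed as a smooth, increasing function of headway (m -> m/s)."""
    return 0.5 * v_max * (np.tanh((headway - h_c) / width) + 1.0)

def step(x, v, dt=0.1, a=1.0, length=1000.0):
    """One Euler step for N vehicles on a ring road of the given length."""
    headway = (np.roll(x, -1) - x) % length          # distance to the car ahead
    dv = a * (optimal_velocity(headway) - v)         # relax toward preferred speed
    return (x + v * dt) % length, np.maximum(v + dv * dt, 0.0)

if __name__ == "__main__":
    n, length = 40, 1000.0
    rng = np.random.default_rng(1)
    x = np.sort(rng.uniform(0, length, n))           # perturbed initial positions
    v = optimal_velocity(np.full(n, length / n))     # start near the uniform state
    for _ in range(5000):
        x, v = step(x, v, length=length)
    print(f"speed spread after relaxation: {v.max() - v.min():.2f} m/s "
          f"(a large spread indicates emergent stop-and-go waves)")
```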

An engineering information (EI) and information technology (IT) organization that must improve its productivity should work to further its business goals. This paper explores a comprehensive model for increasing EI/IT productivity by supporting organizational objectives.

Congestion Sensitive Downlink Power Control for Wideband CDMA Systems. Vasilios A... ... Sweden, Sep. 9-11, 2002. Abstract: We present a model for efficient and robust power control in the downlink of Wideband CDMA wireless ... Introduction: Current power control algorithms for wireless systems increase the power when the interference ...

This paper is concerned with the issue of side payments between content providers (CPs) and Internet service (access bandwidth) providers (ISPs) in an Internet that is potentially not neutral. We herein generalize past results modeling the ISP and CP interaction as a noncooperative game in two directions. We consider different demand response models (price sensitivities) for different provider types in order to explore when side payments are profitable to the ISP. Also, we consider convex (non-linear) demand response to model demand triggered by traffic which is sensitive to access bandwidth congestion, particularly delay-sensitive interactive real-time applications. Finally, we consider a model with two competing "eyeball" ISPs with transit pricing of net traffic at their peering point to study the effects of caching remote content.

The study quantifies the impact on the cost of experimentation of synergistic advancements in instrumentation, theory, and computation over the last two decades. The study finds that the productivity of experimental investigation (experimental results/$) is increasing as science is transformed from a linear, isolated approach to a hierarchical, multidisciplinary approach. Developments such as massively parallel processors coupled with instrumental systems with multiple probes and diverse data analysis capabilities will further this transformation and increase the productivity of scientific studies. The complexities and scale of today's scientific challenges are much greater than in the past, however, so that the costs of research are increasing. Even though science is much more productive in terms of the experimental results, the challenges facing scientific investigators are increasing at an even faster pace. New approaches to infrastructure investments must capitalize on the changing dynamics of research and allow the scientific community to maximize gains in productivity so that complex problems can be attacked cost-effectively. Research strategies that include user facilities and coordinated experimental, computational, and theoretical research are needed.

In this dissertation I defend some controversial "level-bridging" principles in epistemology. In the first chapter, I defend the KK principle-the principle that if one knows that P, then one knows that one knows that P. I ...

These include air, ground, and river pollution, an increase in accidents, and the depletion of global oil reserves in many countries. Incentivized to drive as much as possible, vehicle users contribute to pollution...

impacts: first of all, the airlines' operational cost increase is estimated by the JEC at $19 billion... Airlines compete for a limited resource, namely airport capacity. The airline business has, like the energy supply or health care sectors, a capital...

An agency may provide a pay increase to allow a senior executive to advance his or her relative position within the SES rate range only upon a determination by the authorized agency official that the executive's individual performance and/or contribution to agency performance so warrant. A senior executive who receives an annual summary rating of Outstanding or equivalent may be considered for an annual pay increase. A senior executive who receives an annual summary rating of less than Fully Successful or equivalent may not receive an increase in pay for the current appraisal period. The expectation is that executives who are paid consistent with their current level of responsibilities and who receive an acceptable ("Fully Successful" or better) annual summary rating will be eligible to receive a discretionary performance-based pay increase. Pay increases that advance an executive's position in the SES rate range restart the clock under the 12-month rule.
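The eligibility rule above reduces to a threshold on the annual summary rating. A minimal sketch of that logic (the rating labels and function name are illustrative placeholders, not taken from any agency system):

```python
# Illustrative sketch only: encodes the rating threshold described above.
RATINGS = ["Unsatisfactory", "Minimally Successful", "Fully Successful",
           "Exceeds Fully Successful", "Outstanding"]  # assumed 5-level scale

def may_receive_increase(annual_summary_rating: str) -> bool:
    """Ratings below 'Fully Successful' bar a pay increase for the current
    appraisal period; 'Fully Successful' or better keeps the executive eligible."""
    return RATINGS.index(annual_summary_rating) >= RATINGS.index("Fully Successful")

for rating in ("Outstanding", "Fully Successful", "Minimally Successful"):
    print(rating, "->", "eligible" if may_receive_increase(rating) else "not eligible")
```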

WHAT DO THREAT LEVELS AND RESPONSE LEVELS MEAN? THREAT LEVELS: The UK Threat Level is decided by the Government's Joint Terrorism Analysis Centre (JTAC); it is the system used to assess the threat to the UK. Low - an attack is unlikely; Moderate - an attack is possible, but not likely; Substantial...

An increase in central blood volume in microgravity may result in increased plasma levels of atrial natriuretic factor (ANF). Since elevations in plasma ANF are found in clinical syndromes associated with edema, and since space motion sickness induced by microgravity is associated with an increase in central blood volume and facial edema, we determined whether ANF increases capillary permeability to plasma protein. Conscious, bilaterally nephrectomized male rats were infused with either saline, ANF + saline, or hexamethonium + saline over 2 h following bolus injections of ¹²⁵I-albumin and ¹⁴C-dextran of similar molecular size. Blood pressure was monitored and serial determinations of hematocrits were made. Animals infused with 1.0 µg·kg⁻¹·min⁻¹ ANF had significantly higher hematocrits than animals infused with saline vehicle. Infusion of ANF increased the extravasation of ¹²⁵I-albumin, but not ¹⁴C-dextran, from the intravascular compartment. ANF also induced a depressor response in rats, but the change in blood pressure did not account for changes in capillary permeability to albumin; similar depressor responses induced by hexamethonium were not accompanied by increased extravasation of albumin from the intravascular compartment. ANF may decrease plasma volume by increasing permeability to albumin, and this effect of ANF may account for some of the signs and symptoms of space motion sickness.

Changes in physical and chemical characteristics of aquatic habitats made to reduce or eliminate ecological risks can sometimes have unforeseen consequences. Environmental management activities on the U.S. Dept. of Energy reservation in Oak Ridge, Tennessee, have succeeded in improving water quality in streams impacted by discharges from industrial facilities and waste disposal sites. The diversity and abundance of pollution-sensitive components of the benthic macroinvertebrate communities of three streams improved after new waste treatment systems or remedial actions reduced inputs of various toxic chemicals. Two of the streams were known to be mercury-contaminated from historical spills and waste disposal practices. Waterborne mercury concentrations in the third were typical of uncontaminated systems. In each case, concentrations of mercury in fish, or the apparent biological availability of mercury, increased over the period during which ecological metrics indicated improved water quality. In the system where waterborne mercury concentrations were at background levels, increased mercury bioaccumulation was probably a result of reduced aqueous selenium concentrations; however, the mechanisms for increased mercury accumulation in the other two streams remain under investigation. In each of the three systems, reduced inputs of metals and inorganic anions were followed by improvements in the health of aquatic invertebrate communities. However, this reduction in risk to aquatic invertebrates was accompanied by increased risk to humans and piscivorous wildlife related to increased mercury concentrations in fish.

This thesis describes the role of short sea shipping within the transportation network in the European Union. It examines the existence of externalities relating to congestion, infrastructure, air pollution, noise, and ...

Coastal regions have a high social, economic and environmental importance. Because of this importance, sea level fluctuations can have many adverse consequences. In this research the correlation between the increasing trend of temperature at coastal stations due to global warming and the Caspian Sea level has been established. The Caspian Sea level data were obtained from the Jason-1 satellite. The results show that the monthly correlation between temperature and sea level is high, positive, and almost the same for all the stations, whereas the yearly correlation is negative, meaning that the sea level has decreased as the temperature has increased.
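As a rough sketch of the monthly-versus-yearly comparison described above (the series below are synthetic placeholders, not the station temperatures or Jason-1 altimetry), the two correlations can be computed as:

```python
# Synthetic illustration: a seasonal cycle plus a warming trend in temperature
# and a declining trend in sea level give a positive monthly correlation but a
# negative correlation between annual means, as reported above.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
t = pd.date_range("2002-01", periods=120, freq="MS")          # 10 years, monthly
temp = (10 + 8 * np.sin(2 * np.pi * t.month / 12)
        + 0.02 * np.arange(120) + rng.normal(0, 0.5, 120))
level = -27.0 + 0.3 * np.sin(2 * np.pi * t.month / 12) - 0.01 * np.arange(120)

df = pd.DataFrame({"temp": temp, "level": level}, index=t)
monthly_r = df["temp"].corr(df["level"])                       # month-to-month
yearly_r = df.groupby(df.index.year).mean().corr().loc["temp", "level"]
print(f"monthly r = {monthly_r:+.2f}, yearly r = {yearly_r:+.2f}")
```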

Virginia Wetlands Report. Sea Level Rise & Other Coastal Hazards: The Risks of Coastal Living. Climate change is bringing increased temperatures, rising sea level, more frequent storms, and increases in tide levels. From these records it is not only clear that water levels are rising; they appear...

This dissertation addresses current distribution phenomena in the smoothing of advancing and receding microprofiles during electrodeposition in the following areas: levelling in the presence of inhibitors, levelling in the presence of corrosive agents, and levelling caused by periodic current reversal. These phenomena are relevant to many commercial electrodeposition processes. Theoretical analysis of moving boundaries in electrodeposition is addressed, focusing on the levelling of microscopic surface contours. The literature relevant to the solution of current distribution problems is reviewed. Convection of inhibitors to the depth of trenches is evaluated using the finite element method, and characterized as a function of Reynolds number, notch angle, and depth. Secondary flows are shown to noticeably enhance transport into microscopic trenches only at high Peclet numbers, i.e. at very high flow velocities. The boundary element method (BEM) is used to analyze levelling caused by inhibitors consumed at the transport limiting rate during electrodeposition. It is predicted that (1) better levelling performance can be obtained if the microscopic surface waviness is oriented perpendicular to the convective flow, and (2) for surface roughness oriented parallel to the flow, there is an optimum boundary layer thickness, or flux of additive, which results in superior levelling performance. Profilometry and photomicrography are applied to obtain the current distribution, current efficiency and levelling performance on novel microprofiled electrodes for two orientations with respect to the fluid flow during nickel electrodeposition in the presence of coumarin. Slightly better levelling occurs in flows transverse to grooves, and the deposit thickness increases in the flow direction. It is concluded that coumarin acts by simultaneously lowering the current efficiency, and blocking metal deposition. 331 refs., 86 figs., 8 tabs.

A liquid-level monitor for tracking the level of a coal slurry in a high-pressure vessel including a toroidal-shaped float with magnetically permeable bands thereon disposed within the vessel, two pairs of magnetic-field generators and detectors disposed outside the vessel adjacent the top and bottom thereof and magnetically coupled to the magnetically permeable bands on the float, and signal-processing circuitry for combining signals from the top and bottom detectors for generating a monotonically increasing analog control signal which is a function of liquid level. The control signal may be utilized to operate high-pressure control valves associated with processes in which the high-pressure vessel is used.

of a sedimentary brine source. These groundwater fluxes, while very small (...). Furthermore, ³⁶Cl/Cl ratios and ²³⁴U values for these brines are close to secular equilibrium, indicating brine ages on the order of millions of years. The recognition of a substantial geologic salinity source...

Continual increase in transport demand and uneven road capacity results in chaotic traffic congestion, bringing with it high levels of air pollution, an elevated number of accidents, and an insatiable demand for oil to satisfy the motorized vehicles...

Capacity increased by more than 4.6% when one dynamic matrix multivariable controller began operating in Valero Refining Company's MTBE production complex in Corpus Christi, Texas. This was on a plant that was already running well above design capacity due to previously made process changes. A single controller was developed to cover an isobutane dehydrogenation (ID) unit and an MTBE reaction and fractionation plant with the intermediate isobutylene surge drum. The overall benefit is realized by a comprehensive constrained multivariable predictive controller that properly handles all sets of limits experienced by the complex, whether limited by the front-end ID or back-end MTBE units. The controller has 20 manipulated, 6 disturbance and 44 controlled variables, and covers widely varying dynamics with settling times ranging from twenty minutes to six hours. The controller executes each minute with a six hour time horizon. A unique achievement is intelligent surge drum level handling by the controller for higher average daily complex capacity as a whole. The ID unit often operates at simultaneous limits on reactor effluent compressor capacity, cold box temperature and hydrogen/hydrocarbon ratio, and the MTBE unit at impurity in butene column overhead as well as impurity in MTBE product. The paper discusses ether production, isobutane dehydrogenation, maximizing production, controller design, and controller performance.

The Maui Smart Grid Project (MSGP) is under the leadership of the Hawaii Natural Energy Institute (HNEI) of the University of Hawaii at Manoa. The project team includes Maui Electric Company, Ltd. (MECO), Hawaiian Electric Company, Inc. (HECO), Sentech (a division of SRA International, Inc.), Silver Spring Networks (SSN), Alstom Grid, Maui Economic Development Board (MEDB), University of Hawaii-Maui College (UHMC), and the County of Maui. MSGP was supported by the U.S. Department of Energy (DOE) under Cooperative Agreement Number DE-FC26-08NT02871, with approximately 50% co-funding supplied by MECO. The project was designed to develop and demonstrate an integrated monitoring, communications, database, applications, and decision support solution that aggregates renewable energy (RE), other distributed generation (DG), energy storage, and demand response technologies in a distribution system to achieve both distribution and transmission-level benefits. The application of these new technologies and procedures will increase MECO’s visibility into system conditions, with the expected benefits of enabling more renewable energy resources to be integrated into the grid, improving service quality, increasing overall reliability of the power system, and ultimately reducing costs to both MECO and its customers.

A tiltmeter device having a pair of orthogonally disposed tilt sensors that are levelable within an inner housing containing the sensors. An outer housing can be rotated to level at least one of the sensor pair while the inner housing can be rotated to level the other sensor of the pair. The sensors are typically rotated up to about plus or minus 100 degrees. The device is effective for measuring tilts in a wide range of angles of inclination of wells and can be employed to level a platform containing a third sensor.

Historic data show an increase in carbon dioxide (CO2) emissions at airports, caused... Design alternatives provide a reduction of CO2 emission levels such that the CO2 emissions for 2050 meet regulations at airports through reduction of CO2 for all components of flight operations. The purpose...

The impact of absorbing aerosols on global climate is not completely understood. Here, we present results of idealized experiments conducted with the Community Atmosphere Model (CAM4) coupled to a slab ocean model (CAM4-SOM) to simulate the climate response to increases in tropospheric black carbon aerosols (BC) by direct and semi-direct effects. CAM4-SOM was forced with 0, 1x, 2x, 5x and 10x an estimate of the present day concentration of BC while maintaining their estimated present day global spatial and vertical distribution. The top of the atmosphere (TOA) radiative forcing of BC in these experiments is positive (warming) and increases linearly as the BC burden increases. The total semi-direct effect for the 1x experiment is positive but becomes increasingly negative for higher BC concentrations. The global average surface temperature response is found to be a linear function of the TOA radiative forcing. The climate sensitivity to BC from these experiments is estimated to be 0.42 K W⁻¹ m² when the semi-direct effects are accounted for and 0.22 K W⁻¹ m² with only the direct effects considered. Global average precipitation decreases linearly as BC increases, with a precipitation sensitivity to atmospheric absorption of 0.4% W⁻¹ m². The hemispheric asymmetry of BC also causes an increase in southward cross-equatorial heat transport and a resulting northward shift of the inter-tropical convergence zone in the simulations at a rate of 4°N PW⁻¹. Global average mid- and high-level clouds decrease, whereas the low-level clouds increase linearly with BC. The increase in marine stratocumulus cloud fraction over the south tropical Atlantic is caused by increased BC-induced diabatic heating of the free troposphere.
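As a minimal numerical sketch of the linear sensitivity estimate quoted above (the forcing/response pairs are invented placeholders, not values from the paper):

```python
# Slope of global-mean warming versus TOA forcing = climate sensitivity (K per W m^-2).
import numpy as np

forcing_wm2 = np.array([0.2, 0.4, 1.0, 2.0])       # assumed forcings for 1x, 2x, 5x, 10x BC
delta_t_k   = np.array([0.08, 0.17, 0.42, 0.84])   # assumed global-mean responses

slope, _ = np.polyfit(forcing_wm2, delta_t_k, 1)
print(f"estimated sensitivity ~ {slope:.2f} K per W m^-2")
```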

This paper provides reduced-form estimates of changes in electricity consumption due to increased use, leading to higher projections of electricity consumption. These increases in projected electricity consumption were... Keywords: climate change, vulnerability, electricity consumption, heating, cooling.

The IPCC (Intergovernmental Panel on Climate Change) 1995 Scientific Assessment, Chapter 7. Sea Level Change, presents a modest revision of the similar chapter in the 1990 Assessment. Principal conclusions on observed sea-level change and the principal terms in the sea-level equation (ocean thermal expansion, glaciers, ice sheets, and land hydrology), including our knowledge of the present-day (defined as the 20th Century) components of sea-level rise, and projections of these for the future, are presented here. Some of the interesting glaciological problems which are involved in these studies are discussed in more detail. The emphasis here is on trends over decades to a century, not on shorter variations nor on those of the geologic past. Unfortunately, some of the IPCC projections had not been agreed at the time of writing of this paper, and these projections will not be given here. 15 refs., 2 figs.

with the ubiquitous standard TCP-Reno, or in some cases even among two connections running over the same path. TCP-Reno requires an unreasonable amount of time for this desired window to be regained after a loss, indeed to recover from a single congestion event. Under the random packet loss model, TCP-Reno can require...
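The slow recovery alluded to above follows from Reno's additive increase of roughly one segment per round-trip time after the window is halved; a back-of-the-envelope sketch (the window and RTT values are illustrative):

```python
# Idealized AIMD recovery time: climb from W/2 back to W at one segment per RTT.
def reno_recovery_time_s(window_segments: int, rtt_s: float) -> float:
    return (window_segments / 2) * rtt_s

# Example: a path needing a 10,000-segment window with a 100 ms RTT
print(f"~{reno_recovery_time_s(10_000, 0.1):.0f} s to recover from one loss")  # ~500 s
```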

A deterministic model was developed to study the effects of inefficient scheduling on flight delays at hub airports. The model bases the delay calculation on published schedule data and on user-defined airport capacities. ...

comparison of market designs. Another strand of literature compares different pricing strategies for real markets, often comparing the system in place with an optimal electricity dispatch based on LMP. Bernard and Guertin (2002) simulate a three-node model of Hydro... possibilities are negligible in most electric power networks, so demand and supply must be instantly balanced. The consequence is that transmission constraints and how they are managed often have a large influence on market prices. The European Union's regulation 1228...

An ultrasonic liquid level detector for use within a shielded container, the detector being tubular in shape with a chamber at its lower end into which liquid in the container may enter and exit, the chamber having an ultrasonic transmitter and receiver in its top wall and a reflector plate or target as its bottom wall, whereby when liquid fills the chamber a complete medium is then present through which an ultrasonic wave may be transmitted and reflected from the target, thus signaling that the liquid is at chamber level.

This paper discusses the environmental effects of incorporating wind energy into the electric power system. We present a detailed emissions analysis based on comprehensive modeling of power system operations with unit commitment and economic dispatch for different wind penetration levels. First, by minimizing cost, the unit commitment model decides which thermal power plants will be utilized based on a wind power forecast, and then, the economic dispatch model dictates the level of production for each unit as a function of the realized wind power generation. Finally, knowing the power production from each power plant, the emissions are calculated. The emissions model incorporates the effects of both cycling and start-ups of thermal power plants in analyzing emissions from an electric power system with increasing levels of wind power. Our results for the power system in the state of Illinois show significant emissions effects from increased cycling and particularly start-ups of thermal power plants. However, we conclude that as the wind power penetration increases, pollutant emissions decrease overall due to the replacement of fossil fuels.
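A highly simplified sketch of the dispatch-then-emissions bookkeeping described above (single hour, merit order only, no unit commitment, cycling, or start-up terms; all plant parameters are invented):

```python
# Toy merit-order dispatch: wind is netted from load, thermal units fill the
# residual in order of marginal cost, and CO2 is summed per unit.
plants = [  # (name, capacity MW, marginal cost $/MWh, CO2 t/MWh)
    ("coal",   800, 25, 0.95),
    ("ccgt",   600, 40, 0.40),
    ("peaker", 300, 90, 0.55),
]

def hourly_emissions_t(load_mw: float, wind_mw: float) -> float:
    residual = max(load_mw - wind_mw, 0.0)
    tons = 0.0
    for _, cap_mw, _, co2_t_per_mwh in sorted(plants, key=lambda p: p[2]):
        gen = min(cap_mw, residual)
        tons += gen * co2_t_per_mwh
        residual -= gen
    return tons

for wind in (0, 200, 500):
    print(f"wind {wind:>3} MW -> {hourly_emissions_t(1500, wind):7.1f} t CO2")
```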

A liquid level sensor comprising a transparent waveguide containing fluorescent material that is excited by light of a first wavelength and emits at a second, longer wavelength. The upper end of the waveguide is connected to a light source at the first wavelength through a beveled portion of the waveguide such that the input light is totally internally reflected within the waveguide above an air/liquid interface in a tank but is transmitted into the liquid below this interface. Light is emitted from the fluorescent material only in those portions of the waveguide that are above the air/liquid interface, to be collected at the upper end of the waveguide by a detector that is sensitive only to the second wavelength. As the interface moves down in the tank, the signal strength from the detector will increase.

affected by human interference, e.g. in-channel dredging, sand mining, the construction of levees, and intensive dredging. The stations along the coastal regions show significantly increasing extreme high/low water levels. The coastal regions are not influenced by in-channel dredging, and furthermore, sediment loads from upper...

Climate Change and Sea-Level Rise in Florida: An Update of the Effects of Climate Change on Florida. Florida Oceans and Coastal Council, 2010. ...sea-level rise, with the expectation that updates for increasing greenhouse gases, air temperature...

We propose dynamic aggregation of virtual tags in TLB to increase its coverage and improve the overall miss ratio during address translation. Dynamic aggregation exploits both the spatial and temporal locality inherent in ...

Plant community changes and related nutrient retention within an aridland constructed wastewater treatment wetland. Introduction: Wetlands are increasingly used for wastewater treatment. How does plant community composition change in an aridland constructed wastewater treatment wetland, and how do those...

A liquid level detector for conductive liquids for vertical installation in a tank, the detector having a probe positioned within a sheath and insulated therefrom by a seal so that the tip of the probe extends proximate to but not below the lower end of the sheath, the lower end terminating in a rim that is provided with notches, said lower end being tapered, the taper and notches preventing debris collection and bubble formation, said lower end when contacting liquid as it rises will form an airtight cavity defined by the liquid, the interior sheath wall, and the seal, the compression of air in the cavity preventing liquid from further entry into the sheath and contact with the seal. As a result, the liquid cannot deposit a film to form an electrical bridge across the seal.

with levels in the 1970s and 1980s. For example, the accumulated cyclone energy (ACE) index in the Atlantic... (Figure: annual number, Aug-Oct, of North Atlantic basin hurricanes, 1980-2005.) ...is a crucial question for the future outlook of hurricane activity in the basin. It is difficult to distinguish...

A RIDGE (Region of IncreaseD Gene Expression), as defined by previous studies, is a consecutive set of active genes on a chromosome that span a region around 110 kbp long. This study investigated RIDGE formation by ...

An apparatus for increasing security in inter-chip communication includes a sending control module, a communication bus, and a receiving control module. The communication bus is coupled between the sending control module and the receiving control module. The sending control module operates to send data on the communication bus, disable the communication bus when threats are detected, or both.

INCREASED LEAD ABSORPTION AND LEAD POISONING IN YOUNG CHILDREN: A STATEMENT BY THE... Since the Surgeon General's statement, "Medical Aspects of Childhood Lead Poisoning," was issued... lead absorption and lead poisoning. Such activities for children will continue to be necessary until sources...

Means and methods for enhancing the output of radiant energy from a porous radiant burner by minimizing the scattering and increasing the absorption, and thus the emission, of such energy by the use of randomly dispersed ceramic fibers of sub-micron diameter in the fabrication of ceramic fiber matrix burners and for use therein.

Benzene Increases Aneuploidy in the Lymphocytes of Exposed Workers: A Comparison of Data Obtained... Benzene is an established human leukemogen that increases the level of chromosome aberrations in lymphocytes... and 8 in healthy benzene-exposed human subjects. Metaphase and interphase cells from the peripheral...

In this paper, a methodology for designing efficient energy scavengers is proposed. The scavenger consists of a cantilever beam on which piezoelectric films and a mass are mounted. The mass at the tip of the beam is known as the proof mass, and the device is called either an energy scavenger or a beam-mass system. The proof mass is a permanent magnet, with attracting permanent magnets placed in its vicinity. It is shown that when the magnets have appropriate strengths and are placed appropriately, the vibration of the beam-mass system can be amplified, thereby increasing the scavenged energy. Examples are given throughout the paper.

We study the problem of thermoelectricity and propose a simple microscopic mechanism for the increase of thermoelectric efficiency. We consider the cross transport of particles and energy in open classical ergodic billiards. We show that, in the linear response regime, where we find exact expressions for all transport coefficients, the thermoelectric efficiency of ideal ergodic gases can approach Carnot efficiency for sufficiently complex charge carrier molecules. Our results are clearly demonstrated with a simple numerical simulation of a Lorentz gas of particles with internal rotational degrees of freedom.
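The abstract does not reproduce its linear-response expressions; for orientation only, the standard relations connecting the transport coefficients to the figure of merit and the maximum efficiency (which tends to the Carnot limit) are:

```latex
% Standard linear-response thermoelectric relations (not the paper's own formulas).
\[
  ZT = \frac{\sigma S^{2} T}{\kappa},
  \qquad
  \eta_{\max} = \eta_C \,\frac{\sqrt{1+ZT}-1}{\sqrt{1+ZT}+T_c/T_h},
  \qquad
  \eta_{\max} \to \eta_C \ \text{as } ZT \to \infty .
\]
```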

A free jet of air is disturbed at a frequency that substantially matches natural turbulences in the free jet to increase the entrainment, mixing, and spreading of air by the free jet, for example in a room or other enclosure. The disturbances are created by pulsing the flow of air that creates the free jet at the desired frequency. Such pulsing of the flow of air can be accomplished by sequentially occluding and opening a duct that confines and directs the flow of air, such as by rotating a disk on an axis transverse to the flow of air in the duct. 11 figs.

Cisco's current process for developing and maintaining product-level bills of materials (BOMs) has resulted in inconsistencies in BOM structure leading to product launch delays, increased product support costs, and lower ...

humidity. The net effect is more low cloud cover with increasing aerosol absorption. The higher specific... by dust radiative heating. Although in some areas our model exhibits a reduction of low cloud cover due... are expected to have a similar effect. Citation: Perlwitz, J., and R. L. Miller (2010), Cloud cover increase...

For more than 50 years, compression equipment along the 2,000-mile Tennessee Gas Pipeline has been helping to supply natural gas needs for the Northeast. But increasing demand and a need for more environmentally safe equipment mean a major replacement program for the compressor stations that make the natural gas transmission possible. Today it is one of the longest gas pipelines in the world, carrying more than 1 Bcf/d of natural gas. New compression equipment is being installed to boost efficiency and meet more stringent environmental standards. In 1993, Tenneco Energy, purchased by El Paso Energy in December 1996, initiated a Horsepower Replacement Program intended to replace older, inefficient technology with more advanced equipment. A major objective was to improve operational effectiveness and to reduce harmful nitrogen oxide and carbon monoxide emissions by converting much of the machinery to electric-driven compression equipment.

experiments, (ii) observational studies on humans, and (iii) human experiments. Animal experiments are beyond... There have been many intervention studies on humans, and several meta-analyses. Although publication bias is a concern, the experiments do suggest some reduction in blood pressure for hypertensive...

In the nuclear industry design specifications for certain quality characteristics require that the final product be inspected by a sampling plan which can demonstrate product conformance to stated assurance levels. The Specified Assurance Level (SAL) Sampling Procedure has been developed to permit the direct selection of attribute sampling plans which can meet commonly used assurance levels. The SAL procedure contains sampling plans which yield the minimum sample size at stated assurance levels. The SAL procedure also provides sampling plans with acceptance numbers ranging from 0 to 10, thus, making available to the user a wide choice of plans all designed to comply with a stated assurance level.
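As a minimal illustration of how an attribute plan demonstrates a stated assurance level (zero-acceptance case only; the SAL procedure itself tabulates minimum-sample-size plans for acceptance numbers 0 through 10, which are not reproduced here):

```python
# Smallest sample size n such that, if the true fraction defective equals the
# limiting quality p, a lot passes a zero-defect inspection with probability
# at most (1 - assurance): (1 - p)^n <= 1 - assurance.
import math

def min_sample_size_c0(limiting_fraction_defective: float, assurance: float) -> int:
    p, conf = limiting_fraction_defective, assurance
    return math.ceil(math.log(1.0 - conf) / math.log(1.0 - p))

# Example: 95% assurance that no more than 10% of items are defective
print(min_sample_size_c0(0.10, 0.95))  # -> 29
```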

environmental effects: the reduced speeds in urban areas increase energy consumption and greenhouse gas emissions (..., 1969; Boiteux et al., 1994). The French Department of Transport commissioned us to update... sector. We provide a method for aggregating local marginal costs and demonstrate how to use...

LNG imports to the US jumped in 1996 as Algerian base-load plants resumed operations following major revamps. Exports from Alaska to Japan grew by nearly 4% over 1995. Total LNG imports to the US in 1996 were 40.27 bcf compared to 17.92 bcf in 1995, an increase of 124.8%. Algeria supplied 35.32 bcf; Abu Dhabi, 4.95 bcf. About 82.3% of the imported LNG was received at Distrigas Corp.'s terminal north of Boston. The remaining LNG was received at the Pan National terminal in Lake Charles, LA. LNG imports during 1995 fell to such a low level not because of depressed US demand but because of limited supply. The paper discusses LNG-receiving terminals, base-load producers, LNG pricing, and exports.

An improved process for liquefying solid carbonaceous materials wherein the solid carbonaceous material is slurried with a suitable solvent and then subjected to liquefaction at elevated temperature and pressure to produce a normally gaseous product, a normally liquid product and a normally solid product. The normally liquid product is further separated into a naphtha boiling range product, a solvent boiling range product and a vacuum gas-oil boiling range product. At least a portion of the solvent boiling-range product and the vacuum gas-oil boiling range product are then combined and passed to a hydrotreater where the mixture is hydrotreated at relatively severe hydrotreating conditions and the liquid product from the hydrotreater then passed to a catalytic cracker. In the catalytic cracker, the hydrotreater effluent is converted partially to a naphtha boiling range product and to a solvent boiling range product. The naphtha boiling range product is added to the naphtha boiling range product from coal liquefaction to thereby significantly increase the production of naphtha boiling range materials. At least a portion of the solvent boiling range product, on the other hand, is separately hydrogenated and used as solvent for the liquefaction. Use of this material as at least a portion of the solvent significantly reduces the amount of saturated materials in said solvent.

The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST System at Hanford. The "Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Project" is in support of Tri-Party Agreement Milestone M-48-14.

Wind power development in the United States is outpacing previous estimates for many regions, particularly those with good wind resources. The pace of wind power deployment may soon outstrip regional capabilities to provide transmission and integration services to achieve the most economic power system operation. Conversely, regions such as the Southeastern United States do not have good wind resources and will have difficulty meeting proposed federal Renewable Portfolio Standards with local supply. There is a growing need to explore innovative solutions for collaborating between regions to achieve the least cost solution for meeting such a renewable energy mandate. The Department of Energy funded the project 'Integrating Midwest Wind Energy into Southeast Electricity Markets' to be led by EPRI in coordination with the main authorities for the regions: SPP, Entergy, TVA, Southern Company and OPC. EPRI utilized several subcontractors for the project including LCG, the developers of the model UPLAN. The study aims to evaluate the operating cost benefits of coordination of scheduling and balancing for Southwest Power Pool (SPP) wind transfers to Southeastern Electric Reliability Council (SERC) Balancing Authorities (BAs). The primary objective of this project is to analyze the benefits of regional cooperation for integrating mid-western wind energy into southeast electricity markets. Scenarios were defined, modeled and investigated to address production variability and uncertainty and the associated balancing of large quantities of wind power in SPP and delivery to energy markets in the southern regions of the SERC. DOE funded Oak Ridge National Laboratory to provide additional support to the project, including a review of results and any side analysis that may provide additional insight. This report is a unit-by-unit analysis of changes in operations due to the different scenarios used in the overall study. It focuses on the change in capacity factors and the number of start-ups required for each unit, since those criteria summarize key aspects of plant operations: how often they are called upon and how much they operate. The primary analysis of the overall project is based on security-constrained unit commitment (SCUC) and economic dispatch (SCED) simulations of the SPP-SERC regions as modeled for the year 2022. The SCUC/SCED models utilized for the project were developed through extensive consultation with the project utility partners, to ensure the various regions and operational practices are represented as well as possible in the model. SPP, Entergy, Oglethorpe Power Company (OPC), Southern Company, and the Tennessee Valley Authority (TVA) actively participated in the project, providing input data for the models and review of simulation results and conclusions. While other SERC utility systems are modeled, the listed SERC utilities were explicitly included as active participants in the project due to the size of their load and relative proximity to SPP for importing wind energy.

European polypropylene (PP) producers are gearing up for yet another attempt to raise prices and stem their losses. Despite a string of pricing initiatives throughout 1992, the oversupplied PP market continued to sink. It slipped again in January, with many producers accusing their competitors of price cutting to raise sales volumes. The difference this time is that all the major players have stated their readiness to hike prices, while output has been cut back considerably to reduce stocks. Sentiment in the market is that prices simply cannot be allowed to go any lower. Neste Chemicals (Helsinki) has led the way by announcing a 40-pfennig/kg increase, effective March 1. Sven Svensson, Neste's v.p./PP, says the increase was announced early to allow converters to adjust the prices of their products. Huels (Marl, Germany) has since announced a 30 pfennig-40 pfennig/kg hike for February or March, Hoechst (Frankfurt) says it will go for a similar increase March 1, Amoco Chemical Europe (Geneva) has promised a hike effective February 1, while Himont (Milan) and Brussels-based Petrofina and Solvay confirm they will also be raising prices. There could be a greater sense of urgency now that propylene contracts have been raised for February. The lowest PP price so far reported in Europe has been BF12.50/kg (DM0.61/kg) for raffia-grade material in Belgium. The French market is about F2.20/kg; the UK at £290/m.t.; German prices are slightly firmer at DM0.70/kg, with injection molding at about DM0.75/kg. PP copolymer prices have fallen precipitously since early December, with German levels dropping by 20 pfennig/kg, to about DM0.90/kg.

The average level spacing for highly excited nuclei is a key parameter in cross section formulas based on statistical nuclear models, and also plays an important role in determining many physics quantities. Various methods to evaluate average level spacings are reviewed. Because of the finite experimental resolution, detecting a complete sequence of levels without mixing other parities is extremely difficult, if not totally impossible. Most methods derive the average level spacings by applying a fit, with different degrees of generality, to the truncated Porter-Thomas distribution for reduced neutron widths. A method that tests both distributions of level widths and positions is discussed extensively with an example of ¹⁶⁸Er data. 19 figures, 2 tables.
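For orientation (these are the standard forms, not the paper's specific fitting expressions), the Porter-Thomas distribution of reduced neutron widths and the spacing estimate it feeds are:

```latex
% Porter-Thomas distribution of reduced neutron widths (chi-squared, one degree
% of freedom), with x the width in units of its average; fitting the observed,
% threshold-truncated width distribution to this form gives the fraction of
% missed levels and hence a corrected average spacing D over an interval dE.
\[
  P(x)\,dx = \frac{1}{\sqrt{2\pi x}}\, e^{-x/2}\, dx ,
  \qquad
  x = \frac{\Gamma_n^{0}}{\langle \Gamma_n^{0}\rangle},
  \qquad
  D \simeq \frac{\Delta E}{N_{\mathrm{corr}} - 1},
\]
% where N_corr is the observed level count in dE corrected for missed levels.
```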

In an effort to enable supply chain visibility for Intel products, the Customer Unit Level Traceability (ULT) Program was formed to help extend Intel's ULT capability to the customer level. Increased traceability of Intel ...

A growing number of low and middle income nations (LMCs) have adopted some sort of system for environmental impact assessment (EIA). However, generally many of these EIA systems are characterised by a low performance in terms of timely information dissemination, monitoring and enforcement after licencing. Donor actors (such as the World Bank) have attempted to contribute to a higher performance of EIA systems in LMCs by intervening at two levels: the project level (e.g. by providing scoping advice or EIS quality review) and the system level (e.g. by advising on EIA legislation or by capacity building). The aims of these interventions are environmental protection in concrete cases and enforcing the institutionalisation of environmental protection, respectively. Learning by actors involved is an important condition for realising these aims. A relatively underexplored form of learning concerns learning at EIA system-level via project level donor interventions. This 'indirect' learning potentially results in system changes that better fit the specific context(s) and hence contribute to higher performances. Our exploratory research in Ghana and the Maldives shows that thus far, 'indirect' learning only occurs incidentally and that donors play a modest role in promoting it. Barriers to indirect learning are related to the institutional context rather than to individual characteristics. Moreover, 'indirect' learning seems to flourish best in large projects where donors achieved a position of influence that they can use to evoke reflection upon system malfunctions. In order to enhance learning at all levels donors should thereby present the outcomes of the intervention elaborately (i.e. discuss the outcomes with a large audience), include practical suggestions about post-EIS activities such as monitoring procedures and enforcement options and stimulate the use of their advisory reports to generate organisational memory and ensure a better information dissemination.

In the past two decades, China’s manufacturing exports have grown spectacularly, U.S. imports from China have surged, but U.S. exports to China have increased only modestly. Using representative, longitudinal data on ...

The Space Elevator (SE) represents a major paradigm shift in space access. It involves new, untried technologies in most of its subsystems. Thus the successful construction of the SE requires a significant amount of development, which in turn implies a high level of risk for the SE. This paper presents a systems-level analysis of the SE by subdividing its components into their subsystems to determine their level of technological maturity. A rational way to manage such a high-risk endeavor is to follow a disciplined approach to the challenges. A systems-level analysis informs this process and is the guide to where resources should be applied in the development process. It is an efficient path that, if followed, minimizes the overall risk of the system's development. One key aspect of a systems-level analysis is that the overall system is divided naturally into its subsystems, and those subsystems are further subdivided as appropriate for the analysis. By dealing with the complex system in layers, the parameter space of decisions is kept manageable. Moreover, resources are not expended capriciously; rather, resources are put toward the biggest challenges and most promising solutions. This overall graded approach is a proven road to success. The analysis includes topics such as nanotube technology, deployment scenario, power beaming technology, ground-based hardware and operations, ribbon maintenance and repair, and climber technology.

Petaflops systems will have tens to hundreds of thousands of compute nodes, which increases the likelihood of faults. Applications use checkpoint/restart to recover from these faults, but even under ideal conditions, applications running on more than 30,000 nodes will likely spend more than half of their total run time saving checkpoints, restarting, and redoing work that was lost. We created a library that performs redundant computations on additional nodes allocated to the application. An active node and its redundant partner form a node bundle which will only fail, and cause an application restart, when both nodes in the bundle fail. The goal of this library is to learn whether this can be done entirely at the user level, what requirements this library places on a Reliability, Availability, and Serviceability (RAS) system, and what its impact on performance and run time is. We find that our redundant MPI layer library imposes a relatively modest performance penalty for applications, but that it greatly reduces the number of application interrupts. This reduction in interrupts leads to huge savings in restart and rework time. For large-scale applications the savings compensate for the performance loss and the additional nodes required for redundant computations.
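A back-of-the-envelope sketch of why node bundles cut interrupts, assuming independent node failures over a run (the node count and failure probability are invented):

```python
# Without redundancy every node failure interrupts the run; with redundancy a
# bundle interrupts the run only when both of its nodes fail.
def expected_interrupts(active_nodes: int, p_fail: float, redundant: bool) -> float:
    return active_nodes * (p_fail ** 2 if redundant else p_fail)

n, p = 30_000, 0.01
print("plain     :", expected_interrupts(n, p, redundant=False))  # ~300
print("redundant :", expected_interrupts(n, p, redundant=True))   # ~3
```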

the wall cavities were not insulated during construction or where the insulating material has settled... properties of building materials, insulation levels, and the temperature dependence of conduction through... NREL researchers discover ways to increase accuracy in building energy simulation tools to improve...

at which inorganic CaCO₃ precipitation occurs, because such precipitation would reverse the initial effect. The safe level of saturation to avoid CaCO₃ precipitation is not well known. If Ca(OH)₂ is spread uniformly... increases in saturation. The rate at which Ca(OH)₂ can be added at such locations without CaCO₃...

Dim Light at Night Increases Immune Function in Nile Grass Rats, a Diurnal Rodent (Laura K. Fonken). ...lighting during the 20th century, human and nonhuman animals became exposed to high levels of light... significant implications for certain ecological niches because of the important influence light exerts...

This paper reports on a model of current distribution and electrode shape change for electrodeposition in the presence of diffusion-controlled leveling agents that have been developed. The system is treated as a special case of secondary current distribution, with the surface overpotential taken to depend on both the current density and the transport-limited flux of the leveling agent, according to an empirical relation adapted from polarization data measured at different conditions of agitation. The spatial variation of the leveling-agent flux is determined from a concentration field problem based on the assumption of a stagnant diffusion layer. The solution is obtained by the boundary element method, with a flexible moving-boundary algorithm for simulating the advancement of the electrode profile. To illustrate the model's performance, the evolution of a groove profile during deposition of nickel from a Watts-type bath containing coumarin is predicted and compared with measurements reported in the literature.

At the Large Hadron Collider at CERN the proton bunches cross at a rate of 40 MHz. At the Compact Muon Solenoid experiment the original collision rate is reduced by a factor of O(1000) using a Level-1 hardware trigger. A subsequent factor of O(1000) data reduction is obtained by a software-implemented High Level Trigger (HLT) selection that is executed on a multi-processor farm. In this review we present in detail prototype CMS HLT physics selection algorithms, expected trigger rates and trigger performance in terms of both physics efficiency and timing.
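For orientation, the two reduction factors quoted above imply, to order of magnitude:

```latex
% Order-of-magnitude event rates implied by the stated O(1000) reduction factors.
\[
  \frac{40\ \mathrm{MHz}}{10^{3}} \approx 40\ \mathrm{kHz}\ \ (\text{after Level-1}),
  \qquad
  \frac{40\ \mathrm{kHz}}{10^{3}} \approx 40\ \mathrm{Hz}\ \ (\text{after the HLT}).
\]
```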

Determining the electronic and dielectric properties of water at high pressure and temperature is an essential prerequisite to understanding the physical and chemical properties of aqueous environments under supercritical conditions, e.g. in the Earth's interior. However, optical measurements of compressed ice and water remain challenging, and it has been common practice to assume that their band gap is inversely correlated to the measured refractive index, consistent with observations reported for hundreds of materials. Here we report ab initio molecular dynamics and electronic structure calculations showing that both the refractive index and the electronic gap of water and ice increase with pressure, at least up to 30 GPa. Subtle electronic effects, related to the nature of interband transitions and band edge localization under pressure, are responsible for this apparently anomalous behavior.

A length of metal sheathed metal oxide cable is perforated to permit liquid access to the insulation about a pair of conductors spaced close to one another. Changes in resistance across the conductors will be a function of liquid level, since the wetted insulation will have greater electrical conductivity than that of the dry insulation above the liquid elevation.
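A simplified sketch of the resistance-versus-level behavior described above, treating the wetted and dry lengths of insulation as parallel conductive paths between the two conductors (all conductance values are illustrative assumptions):

```python
# Total conductance is the sum of the wet and dry sections, so resistance falls
# monotonically as the liquid level (wetted length) rises.
def resistance_ohm(level_m: float, cable_len_m: float = 2.0,
                   g_wet_s_per_m: float = 1e-3, g_dry_s_per_m: float = 1e-8) -> float:
    wet = max(0.0, min(level_m, cable_len_m))
    conductance = g_wet_s_per_m * wet + g_dry_s_per_m * (cable_len_m - wet)
    return 1.0 / conductance

for level in (0.0, 0.5, 1.0, 2.0):
    print(f"level {level:3.1f} m -> {resistance_ohm(level):12.1f} ohm")
```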

L-5455, 10/06. Texas Rangeland Monitoring: Level Three. C. Wayne Hanselka, Charles R. Hart and Allan McGinty. Monitoring is an essential tool in rangeland management. Monitoring is the way to determine whether goals are being achieved with current...

Fluor Daniel Northwest (FDNW) was authorized to address flammable gas issues by reconciling the unexplained surface level increases in Tank 241-S-111. The trapped gas evaluation document states that Tank S-111 exceeds the 25% of the lower flammable-limit criterion, based on a surface level rise evaluation. The Waste Storage Tank Status and Leak Detection Criteria document, commonly referred to as the Welty Report, is the basis for this letter report. The unexplained waste level rises were attributed to the production and retention of gas in the column of waste corresponding to the unaccounted for surface level rise. From 1973 through 1980, the Welty Report tracked Tank S-111 transfers. This surface level increase is from an unknown source or is unaccounted for. Duke Engineering and Services Hanford and Lockheed Martin Hanford Corporation are interested in determining the validity of the unexplained surface level changes reported in the Welty Report based upon other corroborative sources of data. The purpose of this letter report is to assemble detailed surface level and waste addition data from daily tank records, logbooks, and other corroborative data that indicate surface levels, and to reconcile the cumulative unaccounted for surface level changes as shown in the Welty Report from 1973 through 1980. Tank S-111 initially received waste from REDOX in 1952, and after April 1974, primarily received processed waste slurry from the 242-S Evaporator/Crystallizer and transferred supernatant waste to Tank S-102. From the FDNW review and comparisons of the Welty Report versus other daily records for Tank S-111, FDNW determined that the majority of the time, the Welty Report is consistent with daily records. Surface level decreases that occurred following saltwell pumping were identified as unaccounted for decreases in the Welty Report; however, they were probably a continued settlement caused by saltwell pumping of the interstitial liquids. Because the flammable/trapped gas issue is linked to the unexplained increase in the surface level, FDNW recommends that all occurrence reports concerning tank waste level increases or decreases from 1970 through 1980 be reevaluated for acceptability of the evaluation as to the root cause of the occurrence.

Suggested for Track 7: Advances in Reactor Core Design and In-Core Management. Fast Reactor Subassembly Design Modifications for Increasing Electricity Generation Efficiency. R. Wigeland and K. Hamman, Idaho National Laboratory. Given the ability of fast reactors to effectively transmute the transuranic elements as are present in spent nuclear fuel, fast reactors are being considered as one element of future nuclear power systems to enable continued use and growth of nuclear power by limiting high-level waste generation. However, a key issue for fast reactors is higher electricity cost relative to other forms of nuclear energy generation. The economics of the fast reactor are affected by the amount of electric power that can be produced from a reactor, i.e., the thermal efficiency for electricity generation. The present study is examining the potential for fast reactor subassembly design changes to improve the thermal efficiency by increasing the average coolant outlet temperature without increasing peak temperatures within the subassembly, i.e., to make better use of current technology. Sodium-cooled fast reactors operate at temperatures far below the coolant boiling point, so that the maximum coolant outlet temperature is limited by the acceptable peak temperatures for the reactor fuel and cladding. Fast reactor fuel subassemblies have historically been constructed using a large number of small diameter fuel pins contained within a tube of hexagonal cross-section, or hexcan. Due to this design, there is a larger coolant flow area next to the hexcan wall as compared to flow area in the interior of the subassembly. This results in a higher flow rate near the hexcan wall, overcooling the fuel pins next to the wall, and a non-uniform coolant temperature distribution. It has been recognized for many years that this difference in sodium coolant temperature was detrimental to achieving greater thermal efficiency, since it causes the fuel pins in the center of the subassembly to operate at higher temperatures than those near the hexcan walls, and it is the temperature limit(s) for those fuel pins that limits the average coolant outlet temperature. Fuel subassembly design changes are being investigated using computational fluid dynamics (CFD) to quantify the effect that the design changes have on reducing the intra-subassembly coolant flow and temperature distribution. Simulations have been performed for a 19-pin test subassembly geometry using typical fuel pin diameters and wire wrap spacers. The results have shown that it may be possible to increase the average coolant outlet temperature by 20 °C or more without changing the peak temperatures within the subassembly. These design changes should also be effective for reactor designs using subassemblies with larger numbers of fuel pins.

This paper reports on the electrodeposition of nickel into an angular trench in the presence of coumarin, a widely used inhibitor, simulated using boundary layer approximations representative of flow parallel and transverse to the groove. Based on the diffusion-adsorption mechanism of leveling action, the dependence of the developing contours on variations in the Langmuir coefficient and inhibitor/metal-ion flux ratio are investigated. Leveling efficiency is shown to be highest for thin, planar boundary layers, and lowest for contour following boundary layers. The model successfully predicts the leveling-off of the inhibitor effect with increasing inhibitor vs. metal-ion flux, and that there is an optimal mass transfer boundary layer thickness, or flux of additive which results in superior leveling performance. Satisfactory agreement is found between the predicted contours, obtained by solving the model equations using the boundary element method, and the experimental leveling efficiencies determined by previous investigators.

Fluor Daniel Northwest was authorized to address flammable gas issues by reconciling the unexplained surface level increases in Tank 241-SX-105 (SX-105, typical). The trapped gas evaluation document states that Tank SX-105 exceeds the 25% of the lower flammable limit criterion, based on a surface level rise evaluation. The Waste Storage Tank Status and Leak Detection Criteria document, commonly referred to as the Welty Report, is the basis for this letter report. The Welty Report is also a part of the trapped gas evaluation document criteria. The Welty Report contains various tank information, including: physical information, status, levels, and dry wells. The unexplained waste level rises were attributed to the production and retention of gas in the column of waste corresponding to the unaccounted for surface level rise. From 1973 through 1980, the Welty Report tracked Tank SX-105 transfers and reported a net cumulative change of 20.75 in. This surface level increase is from an unknown source or is unaccounted for. Duke Engineering and Services Hanford and Lockheed Martin Hanford Corporation are interested in determining the validity of unexplained surface level changes reported in the Welty Report based upon other corroborative sources of data. The purpose of this letter report is to assemble detailed surface level and waste addition data from daily tank records, logbooks, and other corroborative data that indicate surface levels, and to reconcile the cumulative unaccounted for surface level changes as shown in the Welty Report from 1973 through 1980. Tank SX-105 initially received waste from REDOX starting the second quarter of 1955. After June 1975, the tank primarily received processed waste (slurry) from the 242-S Evaporator/Crystallizer and transferred supernate waste to Tanks S-102 and SX-102. The Welty Report shows a cumulative change of 20.75 in. from June 1973 through December 1980.

Fluor Daniel Northwest (FDNW) was authorized to address flammable gas issues by reconciling the unexplained surface level increases in Tank 241-S-111 (S-111, typical). The trapped gas evaluation document (ref 1) states that Tank SX-102 exceeds the 25% of the lower flammable limit (LFL) criterion (ref 2), based on a surface level rise evaluation. The Waste Storage Tank Status and Leak Detection Criteria document, commonly referred to as the "Welty Report," is the basis for this letter report (ref 3). The Welty Report is also a part of the trapped gas evaluation document criteria. The Welty Report contains various tank information, including: physical information, status, levels, and dry wells, see Appendix A. The unexplained waste level rises were attributed to the production and retention of gas in the column of waste corresponding to the unaccounted for surface level rise. From 1973 through 1980, the Welty Report tracked Tank S-102 transfers and reported a net cumulative change of 19.95 in. This surface level increase is from an unknown source or is unaccounted for. Duke Engineering and Services Hanford (DESH) and Lockheed Martin Hanford Corporation (LMHC) are interested in determining the validity of the unexplained surface level changes reported in the Welty Report based upon other corroborative sources of data. The purpose of this letter report is to assemble detailed surface level and waste addition data from daily tank records, logbooks, and other corroborative data that indicate surface levels, and to reconcile the cumulative unaccounted for surface level changes as shown in the Welty Report from 1973 through 1980.

A new 5-level polysilicon surface micromachine process has been developed that offers significantly increased system complexity, while further promoting the manufacturability and reliability of microscopic mechanical systems. In general, as complexity increases, reliability suffers. This is not necessarily the case, however, with MicroElectroMechanical Systems (MEMS). In fact, utilizing additional levels of polysilicon in structures can greatly increase yield, reliability, and robustness. Surface micromachine devices are built thousands at a time using the infrastructure developed to support the incredibly reliable microelectronics industry, and the batch fabrication process utilized in the 5-level technology further increases reliability and reduces cost by totally eliminating post assembly.

...is also the Berkeley site director of the Power Systems Engineering Research Center (PSerc). Dr. Oren has... of deregulated electricity markets. She received a B.S. in Electrical Engineering from Brown University in 1999, ...engineering and operations research (IEOR) from Columbia University, and an M.S. and Ph.D. in IEOR from...

With increasing delays and airport congestion that disturb airline operations, the development of robust schedules is becoming crucial. Increased traffic and poor weather are a few of the causes of airport congestion, ...