Search results

97 records were found.

This paper presents the results obtained in a continuing investigation of fault-tolerant computing which is being conducted at the Jet Propulsion Laboratory. Initial studies led to the decision to design and construct an experimental computer with dynamic (standby) redundancy, including replaceable subsystems and a program rollback provision to eliminate transient errors. This system, called the STAR computer, began operation in 1969. The following aspects of the STAR system are described: architecture, reliability analysis, software, automatic maintenance of peripheral systems, and adaptation to serve as the central computer of an outer-planet exploration spacecraft.
Comment: To be published in the Proceedings of the 23rd Winter Workshop on Nuclear Dynamics, Big Sky, Montana, February 11-18, 2007. This work is supported by the Yale-Weizmann Collaboration Program of the American Committee for the Weizmann Institute of Science (ACWIS), New York.
A search was made for e⁺e⁻ → X₁X₂, where X₁ consists of one or more light unobservable particles and X₂ decays promptly to a visible jet of particles. One event was found for an integrated luminosity of 176 pb⁻¹, a rate consistent with known backgrounds. This result places a significant constraint on a number of theoretical models.
OBJECTIVES: This study was conducted to evaluate follow-up results in patients with hypertrophic obstructive cardiomyopathy (HOCM) who underwent either percutaneous transluminal septal myocardial ablation (PTSMA) or septal myectomy. BACKGROUND: Controversy exists with regard to these two forms of treatment for patients with HOCM. METHODS: Of the 51 patients with HOCM who were treated, 25 underwent PTSMA and 26 underwent septal myectomy. Two-dimensional echocardiograms were performed before both procedures, immediately afterwards, and at a three-month follow-up. The New York Heart Association (NYHA) functional class was obtained before the procedures and at follow-up. RESULTS: Interventricular septal thickness was significantly reduced at follow-up in both groups (2.3 ± 0.4 cm vs. 1.9 ± 0.4 cm for septal ablation and 2.4 ± 0.6 cm vs. 1.7 ± ...
Context. Photoelectric heating is a dominant heating mechanism for many phases of the interstellar medium. We study this mechanism throughout the Large Magellanic Cloud (LMC). Aims. We aim to quantify the importance of the [C II] cooling line and the photoelectric heating process of various environments in the LMC and to investigate which parameters control the extent of photoelectric heating. Methods. We use the BICE [C II] map and the Spitzer/SAGE infrared maps. We examine the spatial variations in the efficiency of photoelectric heating: photoelectric heating rate over power absorbed by grains, i.e. the observed [C II] line strength over the integrated infrared emission. We correlate the photoelectric heating efficiency and the emission from various dust constituents and study the variations as a function of H emission, dust tempe...
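
A compact way to state the efficiency proxy described in that record (the symbol names here are illustrative, not taken from the original abstract): the photoelectric heating efficiency is the photoelectric heating rate over the power absorbed by grains, approximated observationally by the observed [C II] line strength over the integrated infrared emission.

\[
  \epsilon_{\mathrm{PE}} \;=\; \frac{\Gamma_{\mathrm{PE}}}{\Gamma_{\mathrm{abs}}}
  \;\approx\; \frac{I_{[\mathrm{C\,II}]}}{I_{\mathrm{TIR}}}
\]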
The recent robust and homogeneous analysis of the world's supernova distance-redshift data, together with cosmic microwave background and baryon acoustic oscillation data, provides a powerful tool for constraining cosmological models. Here we examine particular classes of scalar field, modified gravity, and phenomenological models to assess whether they are consistent with observations even when their behavior deviates from the cosmological constant Λ. Some models have tension with the data, while others survive only by approaching the cosmological constant, and a couple are statistically favored over Λ cold dark matter. Dark energy described by two equation-of-state parameters has considerable phase space to avoid Λ, and next-generation data will be required to constrain such physics, with the level of complementarity between probes var...
Want to know more? If you want to know more about this cutting-edge product, or would like to schedule a demonstration for your own organisation, please feel free to contact us or read the available documentation at http://www.keep.pt/produtos/retrievo/?lang=en