If NASA faked the moon landings, does the agency have any credibility at all? Was the Space Shuttle program also a hoax? Is the International Space Station another one? Do not dismiss these hypotheses offhand. Check out our wider NASA research and make up your own mind about it all.

Vladislaw » 02 Jun 2017, 07:50 wrote: Simon, Hoi, this bottom picture contains something strange. I have marked this strange artefact with a yellow box. Pay attention to the lens. What or who is reflected there??

We are told temperatures around the ISS in the sun (as seems to be the case here) are around 250 °F.

Before you ask, the GoPro must be located outside the suit, since not only do we see shots of the actornaut himself, but we are also told that

The camera used during the space-walk works much like the kind of GoPro you can buy here on Earth, only with a one-touch power up and record function. "This makes it much easier to execute while wearing large gloves,” http://time.com/3819293/gopro-spacewalk-astronauts/

If it's operated with the gloves on, it must sit independently outside the suit... So how can it withstand such temperatures? And how can the footage even have sound??

This composite image, made from seven frames, shows the International Space Station, with a crew of six onboard, as it transits the Sun at roughly five miles per second during a partial solar eclipse, Monday, Aug. 21, 2017 near Banner, Wyoming. Onboard as part of Expedition 52 are: NASA astronauts Peggy Whitson, Jack Fischer, and Randy Bresnik; Russian cosmonauts Fyodor Yurchikhin and Sergey Ryazanskiy; and ESA (European Space Agency) astronaut Paolo Nespoli. A total solar eclipse swept across a narrow portion of the contiguous United States from Lincoln Beach, Oregon to Charleston, South Carolina. A partial solar eclipse was visible across the entire North American continent along with parts of South America, Africa, and Europe.

As millions of people across the United States experienced a total eclipse as the umbra, or moon’s shadow passed over them, only six people witnessed the umbra from space. Viewing the eclipse from orbit were NASA’s Randy Bresnik, Jack Fischer and Peggy Whitson, ESA (European Space Agency’s) Paolo Nespoli, and Roscosmos’ Commander Fyodor Yurchikhin and Sergey Ryazanskiy. The space station crossed the path of the eclipse three times as it orbited above the continental United States at an altitude of 250 miles.

From a million miles out in space, NASA’s Earth Polychromatic Imaging Camera (EPIC) captured 12 natural color images of the moon’s shadow crossing over North America on Aug. 21, 2017. EPIC is aboard NOAA’s Deep Space Climate Observatory (DSCOVR), where it photographs the full sunlit side of Earth every day, giving it a unique view of total solar eclipses. EPIC normally takes about 20 to 22 images of Earth per day, so this animation appears to speed up the progression of the eclipse.

It may be meaningless, but here's how they created the 3rd photograph from the above post. The information was obtained from a FotoForensics metadata analysis, of which the below is a small selection. Usually most of this information gets wiped, but for some reason it was left on the uploaded NASA image.

Here's the answer to your question. The atmosphere between us and the moon is much brighter than the moon, so we see that instead, especially since no part of the moon is being reflected towards us.

Here's another question. Since the moon is supposedly 2,000 miles in diameter, shouldn't its shadow be at least that wide? I live near Portland, Oregon and watched the 95% eclipse with my kids. Pretty amazing!

The following cute illustration by NASA is very misleading. Although they admit it's not to scale, they neglect to show the true situation. At the supposed distance of the sun, its rays are pretty much parallel, and the moon's shadow would be cast nearly in a straight line from its edges.

It is interesting that the rays would almost be parallel given official numbers. Here is a very simple diagram I made about the distances a year or so back.

Moon distance from Earth

The Moon and Earth are the top shape. The Sun is the bottom shape.

However, given the strength of the unimpeded light and the fact that the Sun is much wider than the Moon, the shadow should be slightly smaller than the Moon's diameter. I'm guessing NASA's image is supposed to be a 3D computer generated calculation based on official dimensions, but I agree it looks odd and unexpected.

But if we say the maximum angle at which the Sun's light gets around the Moon is 0.15°, then by the time it touches Earth 380,000 km away, that still creeps almost 1000 km into the circle shadow.
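That creep figure is easy to verify with a back-of-the-envelope sketch, using the 0.15° grazing angle and the 380,000 km distance quoted above:

```python
import math

# How far does light grazing the Moon's limb at 0.15 degrees spread into
# the shadow circle over the ~380,000 km to Earth?
grazing_angle_deg = 0.15
earth_moon_km = 380_000

creep_km = math.tan(math.radians(grazing_angle_deg)) * earth_moon_km
print(round(creep_km))  # 995 km, i.e. "almost 1000 km"
```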

By these numbers, NASA's image plays by its own rule book well enough. The fact that these things add up so well during the eclipse does not yet (to me) help explain the amazing angle discrepancies of the Moon's lighting in broad daylight.

Man, we've been answering each other's questions for a while and everyone has a different one, but mine is my favorite since it's part of a "closed discussion" on CluesForum, and perhaps that will have to be the end of this chain letter for now.

Before posting, I just did a quick (hasty) calculation of the angle of the light rays coming from the edge of the sun versus the center (at our distance), but didn't realize the angle was in radians, so of course it didn't reduce the umbra much. Thanks for reposting your calculations.

I, too, expected to see a much sharper shadow edge just based on the proximity of the Sun and Earth, assuming that at that distance the sun would act like a point source.

Here's a pseudo-scientific rendering of a solar eclipse. It's pseudo because all the geometry is on the same plane, with no consideration for the angularity of the orbits. Everything is in line and proportional. I had to create a scale model because my earth-based software couldn't handle the astronomical numbers. Created in 3ds Max 2016 using Mental Ray. The camera has a 40 mm lens @ 62,400 km from Earth's center. I realized afterwards that I should have set up the camera at the stated distance of "a million miles out in space", so I created another camera 1.6M km from Earth and had to use a 10,000 mm lens to achieve similar framing.

Solar System Dimensions [source: Wikipedia]:

My guess is they had to tweak the levels of the shadow to make it more dramatic/noticeable.

Quote from this video [http://www.youtube.com/watch?v=NQkYVr_Wr6k] at 1:55 showing an eclipse from March 2016: "although the view from EPIC is once every two hours"; yet we see a composite of 13 frames to track the moon traversing the planet while Australia moves approximately 1/6th rotation, or a 4 hour window (not 1.5-2.65 hours). So the EPIC must have a "burst mode" for special circumstances because 2 x 13 = 26.

And, according to Wikipedia [https://en.wikipedia.org/wiki/Solar_eclipse_of_August_21,_2017], "The event's shadow began to cover land on the Oregon coast as a partial eclipse at 4:05 p.m. UTC (9:05 a.m. PDT), with the total eclipse beginning there at 5:16 p.m. UTC (10:16 a.m. PDT); the total eclipse's land coverage ended along the South Carolina coast at about 6:44 p.m. UTC"

4:05 p.m. to 6:44 p.m. = 2 hours 39 minutes
5:16 p.m. to 6:44 p.m. = 1 hour 28 minutes
Horizontal width: 2,511 miles from Perth directly east to Sydney.
Horizontal width: 2,680 miles - North America
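The durations above can be double-checked in a couple of lines (the UTC times are the ones quoted from Wikipedia):

```python
from datetime import datetime

fmt = "%H:%M"
partial_start = datetime.strptime("16:05", fmt)  # 4:05 p.m. UTC, partial begins, Oregon
total_start = datetime.strptime("17:16", fmt)    # 5:16 p.m. UTC, totality begins
land_end = datetime.strptime("18:44", fmt)       # 6:44 p.m. UTC, land coverage ends, South Carolina

print(land_end - partial_start)  # 2:39:00
print(land_end - total_start)    # 1:28:00
```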

Nick Java » 31 Aug 2017, 07:20 wrote:Quote from this video [http://www.youtube.com/watch?v=NQkYVr_Wr6k] at 1:55 showing an eclipse from March 2016: "although the view from EPIC is once every two hours"; yet we see a composite of 13 frames to track the moon traversing the planet while Australia moves approximately 1/6th rotation, or a 4 hour window (not 1.5-2.65 hours). So the EPIC must have a "burst mode" for special circumstances because 2 x 13 = 26.

Do you think that NASA would be so stupid?

According to NOAA,

"The time cadence will be no faster than 10 spectral band images every hour."

My apologies (because that's what Canadians do; apologize) to all for the length of this post. If you have a bandwidth budget you may want to turn off images. There are some large images further down.

Originally I was planning on posting a brief addendum showing the "moon's shadow" as the sun diminishes in size (scroll down), but thanks to the hack job by agraposo I have a lot of flak to address.

That move, by the way, intentional or not, is straight out of the Divide and Conquer Strategies Handbook. Full-on bull-dog instead of a couple of corrective comments. A casual dismissal and a passive-aggressive smiley. Negative instead of positive input. What up? Did I strike a nerve?

Demean a member, demean the team. Let's keep our eyes on the common goal.

Or, we can play a game. I like games. How about "Mole or Troll"? Haven't heard of it? That's because I just made it up. There are no rules...

(I can hear the moderators bitchin' already: "no rough-housin' in the forum".) ...

"Why, in the first place, do you try to debunk a video taken from a camera 1,000,000 miles away "

Wasn't trying to debunk the video. I was curious as to what shadow the 3d software would produce, based on a recent comment by Hoi, but when I plugged in the provided numbers, I was surprised to see it was even less pronounced than NASA's rendering. And, in the process, I ended up counting the frames in the eclipse animation and noticed a discrepancy.

Btw, do you think it's harder to debunk because the camera is 1,000,000 miles away?

While we're on the topic of debunking, I thought I'd throw in another observation. It appears that the Earth in the video has a nice specularity and also no atmosphere. Clouds, yes; atmosphere, no. And not a solitary twinkling star. I guess there are no stars in that direction, or they just, believably?, aren't bright enough for EPIC's 30.5 cm dia. f/9.38 lens at 40 ms exposure on its 2048x2048 CCD, downsampled to 1024x1024. How big is a resampled star vs how big is a pixel? The website says "the short exposure times will render points of starlight invisible." And yes, it doesn't prove anything. But the specularity is a nice touch and really adds to the realism/plausibility of the image.

"You should first ask yourself these questions:"

Yeah sure, OK, but before I do I have a few questions of my own. Can I just ask them or do I have to answer them too? Who made you a deputy of the Research Police? What are you trying to accomplish? What's your agenda? Let me see your badge. Is there a book or website that lists all the questions I should ask first? Are these the only questions I should ask first?

"- how did the spacecraft arrive there? (L1 Lagrangian point)"

I didn't put the camera at the Lagrangian point, someone else did. Why do I have to answer that? Maybe Elon Musk would be a better target, he's the one with the paycheck. Btw, looks like NASA hired a new animator.

But I digress, back to: "how did the spacecraft arrive there?" How about a rocket? To be precise, a Falcon 9 v1.1 rocket.

It just flew there and stopped at "the neutral gravity point between Earth and the sun". Then I guess it had to start orbiting the sun at the same relative speed as Earth so it could stay focused on the target. Maybe a better question is: How does it locate and stay focused on Earth with its 10,000 mm (equivalent) lens in all that solar wind? Or does it have to turn around to scan the sun? And does the solar data slow down the earth data, or can it stream simultaneously?

Oh, and what happened to the Falcon 9? Did it drop off the EPIC and return the reusable parts (the word "reusable" is reused/referenced 31 times on the Falcon 9 wikipedia page; searching its derivative "reus" reveals 50 hits), or did the EPIC, like a butterfly, shed its Falcon 9 cocoon and emerge as a glistening geo-stationary telescope?

"- how does the spacecraft circuitry is shielded from space radiation?"

is magic space-age materials?

"- how does the signal from the spacecraft reach the Earth?"

maybe UPS has a secret telegraphic Stellar Delivery Service?

"all your simple trigonometric/euclidean calculations are wrong."

And you know this because? ... No, let me guess: you know what it looks like because you've seen it yourself. You know, or maybe you don't, that sometimes simplicity adds clarity. Ever heard of the KISS method? Did you even take the time to wrap your head around the point I was trying to make?

"As everybody knows, in outer space the space-time is curved, the light bends and the time dilates, according to relativity."

Personally, I haven't been there, so I didn't know that. So, I guess I'm not everybody. Just out of curiosity, how much does the light bend in the relatively short distance between the sun and the earth? What's bending it? Is it a uniform bend? Or do some rays bend more than others? I'll try to factor that into my next flawed observation.

I also didn't factor in the diffractive properties of the thin film of air encompassing the planet. Here I created a 200 km optically dense object representing the atmosphere, and all it did was slightly darken the earth and diffuse the shadow even more. IOR values, Vacuum: 1.0, Air: 1.000029, Water: 1.33, Glass: 1.517. [https://docs.chaosgroup.com/display/VR2R/Refractive+Materials.]

"Do you think that NASA would be so stupid? [...] The time cadence will be no faster than 10 spectral band images every hour."

Thanks for pointing that out. That's quite the way you introduce semi-valid information. Sorry I didn't research the document you linked to. I didn't get that deep into EPIC.

Obviously something is hinky. 10 spectral bands per hour should result in 40 images over a four hour period, not 13, which is 3.25 per hour.

But, how many spectral bands make an "image"? According to the last line (actually, the only line on page 2) in the pdf you quoted "To increase the downlink cadence of retrieved 10-channel image sets, the resolution will be averaged on board to 1024x1024 pixels resulting in spatial sampling at 17 km from pixel to pixel [...] The time cadence will be no faster than 10 spectral band images every hour." Or, one spectral band image set per hour. So, EPIC functions no faster than 10 spectral band images every hour.
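The cadence argument can be put in numbers. This is only a sketch; it assumes, as the tech sheet reads, that "10 spectral band images every hour" means one complete 10-channel set per hour:

```python
# Stated maximum: 10 single-band images per hour = one 10-channel set per hour.
bands_per_set = 10
max_sets_per_hour = 10 / bands_per_set

# Observed: 13 frames while Australia rotates ~1/6 of a turn (~4 hours).
frames = 13
window_hours = 4
observed_sets_per_hour = frames / window_hours

print(max_sets_per_hour, observed_sets_per_hour)  # 1.0 vs 3.25
```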

"That same error ("once every two hours") is repeated in the official NASA Goddard's YouTube video"

And, once every two hours? "Error"? That number is quoted at least twice in the video I watched. At 1:18 it's a qualified "EPIC takes at least 1 set of images about every two hours", and at 1:55 "although the view from EPIC is once every two hours". Then twice again on the Deep Space Climate Observatory page: first in the 2nd paragraph, "It takes full-Earth pictures about every two hours", and again at the article's end, "At least twelve images are released each day at regular intervals". [https://en.wikipedia.org/wiki/Deep_Space_Climate_Observatory] While we're at it, let's not ignore this authoritative site: [https://www.extremetech.com/extreme/199754-new-epic-camera-to-photograph-earth-in-a-way-it-has-never-been] "it will send back images of the sunlit face of the Earth every 110 minutes, in each of the 10 wavelengths it can see."

This is a classic example of obfuscation. [see DCSH]

So which is it? Let's go with the tech sheet and dismiss the narrator, who claims to be Jay Herman, the EPIC lead scientist for the DSCOVR mission. But don't you think the lead scientist on a project would know his specs better? Kind of like the lead engineer at a Formula 1 race engine manufacturer claiming that their new engine (capable of 20,000 rpm) will achieve at least 2,000 rpm, failing to mention that if you step on the gas it will go a lot faster. Where's the pride?

Now all we have to do is figure out how they obtained 3.25 images per hour from a 1-image per hour camera. My guess is, like I said earlier, "burst-mode". Just press turbo but don't hold too long, the CPU might overheat.

Or, here's the real turbo 'out', and it's dirty. So roll up your sleeves and put your gloves on. Here's a bone to chew on. Only scan the RGB bands successively and skip the others, and there you go: 3.33 images per hour. Boom! I saw mention of RGB filters somewhere, but they also mentioned a time-shift problem due to capture rates of ~5 minutes per band, and I see a specularity, but I don't see a time shift, and for the life of me, I can't find the link. Sorry.

If someone sees it let me know, I'll add it here. It's got a picture with cascading RGB channels on the page.
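The RGB-only guess works out arithmetically. A sketch, assuming ~6 minutes per band as implied by the 10-bands-per-hour cadence:

```python
minutes_per_band = 6  # 10 bands per hour -> 6 minutes each
rgb_bands = 3

# Scanning only the three RGB bands in succession:
rgb_sets_per_hour = 60 / (rgb_bands * minutes_per_band)
print(round(rgb_sets_per_hour, 2))  # 3.33, in the ballpark of the observed 3.25
```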

In full-disclosure, I just discovered the wikipedia Deep Space Climate Observatory page says "The animation was composed of monochrome images taken in different color filters at 30 second intervals for each frame, resulting in a slight color fringing for the Moon in each finished frame. "

It looks like someone's already done the math, but with a slight embellishment. So now we're back to burst-mode. How does a 10-image-per-hour camera, that's 6 minutes per image, suddenly work at 30 second intervals? Why don't they do it all the time?

And what about those other 7 channels? When do they get their turn again? How long can they stay off-line? There must be a record of when they switch to 3 channel mode. You'd think they would talk about it like it was some special feature but I haven't run across a single reference to a 3 spectral band image set. Maybe you can find one. Also it would be nice to see a data stream log for the event. I bet Elon's working on it right now.

Distance the Earth travels at the equator over 12 minutes:
Circumference of the Earth: 40,075 km times (12 minutes / 60 / 24 = 0.00833...) = 333.96 km. (12 minutes is two 6-minute cycles to capture 3 images.)

Quote from https://en.wikipedia.org/wiki/Deep_Spac ... bservatory: "resulting in spatial sampling at 17 km/pixel". That would be an overall equatorial rotational shift of 19.64 pixels, which should be noticeable, especially in "original" offerings. But of course, they would have time-shift correction software that no one talks about.
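The rotational-smear estimate is a two-line calculation (a sketch using the 40,075 km circumference, the 12-minute window, and the quoted 17 km/pixel sampling):

```python
earth_circumference_km = 40_075
minutes = 12        # two 6-minute cycles spanning three band captures
km_per_pixel = 17   # quoted spatial sampling

# How far a point on the equator rotates in that window, and in pixels:
shift_km = earth_circumference_km * minutes / (24 * 60)
shift_px = shift_km / km_per_pixel
print(round(shift_km, 2), round(shift_px, 2))  # 333.96 km, 19.64 pixels
```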

While perusing the VIIRS wikipedia page I ran across a simple elementary-school-level geometry problem. We can leave Euclid out of this one. Two circles, two sight lines, not an angle to calculate. I'm hoping that a light-bending expert can help me understand how VIIRS can scan pole to pole.

Warning: This is a trick question.

Here's a riddle for all the no-satellite folks. I've been working on it for a few days, and I have a few theories, but I haven't come up with a plausible solution. If there are no satellites, where does the undeniable satellite data come from?

What is Visible Infrared Imaging Radiometer Suite (VIIRS)?

Wikipedia tells us that:

[it is] a sensor designed and manufactured by the Raytheon Company
[it was] launched on October 28, 2011
VIIRS is a whiskbroom scanning radiometer
VIIRS has a swath width of 3,060 km at the satellite's average altitude of 829 km
The VIIRS instrument can collect data in 22 different spectral bands of the electromagnetic spectrum.
The average orbit power for the instrument is 200 watts. In total the instrument weighs 275 kg.
VIIRS data sets will allow for the assessment of how climate change has affected the earth's surface over the past ~20 years.

But it doesn't tell us how fast it's going. We're going to have to go somewhere else for that. So, here are two sources that kind of corroborate each other:

According to wikipedia Escape Velocity is: 40,270 km/h or 25,020 mph at the surface.

Just a quick aside, the airspeed record: Lockheed SR-71 Blackbird, 3,529.6 km/h (2,193.2 mph), 28 July 1976, Capt. Eldon W. Joersz and Maj. George T. Morgan. SR-71 pilot Brian Shul reported in The Untouchables that he flew in excess of Mach 3.5 on April 15, 1986, over Libya, in order to avoid a missile. [https://en.wikipedia.org/wiki/Flight_airspeed_record]

I did some more "bean counting" and everything seems to check out with the numbers above. Did I find any magic beans while I was counting? In a seven-day window we get about 40 images, so that works out to just under 6 images per day.

Slightly under the expected average of 13/2 = 6.5, allowing for day/night (assuming the region is in every scan swath).
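The tally is trivial to reproduce from the counts above:

```python
images = 40
days = 7
per_day = images / days
print(round(per_day, 2))  # 5.71 images per day, "just under 6"

expected_per_day = 13 / 2  # 6.5, allowing for day/night
print(expected_per_day)
```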

Now that could be because, as we learn in the Beginner's Guide to VIIRS Imagery Data: "However, the satellite orbit is not a circle, the Earth is not a sphere and the satellite does not travel at constant speed." http://rammb.cira.colostate.edu/projects/npp/Beginner_Guide_to_VIIRS_Imagery_Data.pdf

Wow, that raises a lot more questions. Here's one: "Is it because it has to slow down for corners that the satellite does not travel at constant speed?"

Geostationary Operational Environmental Satellite

"The GOES system uses geosynchronous satellites that, since the launch of SMS-1 in 1974, have been a basic element of U.S. weather monitoring and forecasting." The procurement, design, and manufacture of GOES satellites is overseen by the National Aeronautics and Space Administration. [https://en.wikipedia.org/wiki/Geostationary_Operational_Environmental_Satellite]

This one is from GOES West 0.62 um Visible on September the 9th.

This particular data set is updated every half hour and looks like it has been wrapped onto a distorted sphere for visualization purposes.

This is also captured from McIDAS-V: GOES East/West VIS Composite. (It also looks like the data has been wrapped onto a distorted sphere.)

What's with all the data gaps near the pole?

Below are captures of Polar Data viewed in McIDAS-V:

Polar Operational Environmental Satellites

The POES program will be superseded by the Joint Polar Satellite System (JPSS).

Antarctic 2017-08-10

What I'm noticing is the shape of the scans in the composites. There really doesn't seem to be any consistency as far as regularity goes considering the data is coming from an orbital scanner.

It becomes even more apparent when viewed as a 24 hour animated snapshot.

Antarctic 17-08-10

Here's a couple more from a month later. Things don't seem to improve.

Antarctic 17-09-07

I added a red highlight to one frame to draw attention to a large composite layer under the sporadic swaths. And the first image has a red line to indicate the shape of a scan. For a polar orbiting scanner there seems a dearth of over-the-pole scans. Also, I'm noticing that the snapshots are updated hourly, sometimes every half hour. How do they do that? Oh right, I remember reading geosynchronous somewhere. Geosynchronous or orbital? It's so hard to keep track. [ed. it's not geosynchronous.]

Let's take a look at the North pole. Maybe it's different.

This is a compilation of 2 days worth of scans at 1 hour intervals taken on September 6-7, 2017.

Arctic 2017-09-06-7

I'm sure there's a logical explanation for what appears to be a hodgepodge of random scan swaths, and why it's so difficult to make a composite of either pole with readable information. It's more like a pole-dancer skirting center-stage than a polar orbiter. You'd think a program with a name like 'Polar Operational Environmental Satellites' would be more interested in detail regarding the poles of the planet.

Some of the sweeps look more like flight paths along a parallel as opposed to a meridian.

So let's take a look at how the 3d program would wrap a rectangular swath produced on a "parallel" path near the pole.

Enough looking at satellite images; time to get back to matters at hand. I'll leave it to someone else to pick up the mess.

For the time being, I stand by my

Now, to the point I was planning on making more than a week ago.

I did come up with another solution to the "believable shadow" conundrum.

If the results don't match expectations and we really don't care about "geometric integrity" how about we just change the size of the sun in the model just before we hit the render button?

How would YOU like the shadows to appear, based on your superior knowledge? This is what the ray-tracer in the 3d program thinks the shadow looks like, using figures gleaned from wikipedia:

It's not wrong. It may not be 100% accurate, but it's not "wrong". It is what it is, a sun sized light shining at an earth sized sphere with a moon sized shadow-casting sphere directly between at proportional distances using a ray-tracer to calculate the shadow. If you don't like the results, I challenge you to find or build a more accurate renderer, then we can compare results. Who knows, maybe the renderer just can't deal with the astronomical numbers or I keyed in dyslexic numbers. If I did, let me know, and I'll re-assess the situation.
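For anyone who wants a non-ray-traced sanity check, the umbra width at Earth falls out of similar triangles. This is only a sketch with round, Wikipedia-style figures; the Earth-Moon distance below is an assumption for eclipse-day geometry, and changing it by a few thousand km swings the answer a lot:

```python
sun_diameter_km = 1_391_400
moon_diameter_km = 3_474.8
sun_moon_km = 149_600_000  # roughly the Sun-Earth distance
earth_moon_km = 372_000    # assumed: slightly nearer than average

# The umbra narrows from the Moon's full diameter at the rate the Sun's
# disc overtakes the Moon's disc.
taper = (sun_diameter_km - moon_diameter_km) / sun_moon_km
umbra_width_km = moon_diameter_km - earth_moon_km * taper
print(round(umbra_width_km))  # only a few tens of km wide with these inputs
```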

And this is what decreasing the size of the sun produces:

More useless bits of information:

The heaviest non-catadioptric telephoto lens for civilian use was made by Carl Zeiss and has a focal length of 1700 mm with a maximum aperture of f/4, implying a 425 mm (16.7 in) entrance pupil. It is designed for use with a medium format Hasselblad 203 FE camera and weighs 256 kg (564 lb). https://en.wikipedia.org/wiki/Telephoto_lens

Anyways, I've spent way more time than I ever planned looking at satellite imagery and satellite tech data, so let me remind you of the riddle that arose: If there are no satellites where does the undeniable satellite data come from?

Also, I do have one final task for you, intrepid research critic. I'd like to see 2 successive "3-image sets", or 1 "10-image set" containing 3 1/3 successive RGB scans, all properly time-stamped at 30 second intervals, and another random standard "10-image set" for comparison. Good luck with that.

A gauntlet has not been dropped. And I'm sheathing my pen/sword for now, except for this excerpt from the DCSH: "Diversionary Tactics: A diversionary activity is one intended to attract people's attention away from something which you do not want them to think about, know about."

Was the diversion successful? I'll let you, the reader, determine that. I took 10 personal chair days out of my ever-shortening life to compose this reply, when I could have been working on something else. I hope it was worth the effort.

Nick Java's Conspiracy Theory of the Day

Oh, one more thing before I go. There's an old saying: 'curiosity killed the cat'. I suspect that cats and their mind control* decided to rid the planet of the demon "Curiosity" by building an effigy and relegating it to the moon. "No, not far enough; a far, far away planet, Mars. No mice there." "Curiosity to Mars?" "Yeah, and we'll call it Rover." "Like the dog?" "Yeah, like the dog." "Heow, heow, heow." "Get the humans to build a rocket and we will rid ourselves of the blight. No more curiosity, no more death." And that's what happened to curiosity in America. Hey, and what a coincidence, Elon Musk has a new cat**... and a new space suit.

*http://www.bbc.com/future/story/20141212-how-cats-can-control-our-minds
**https://twitter.com/elonmusk/status/192701084932907009?lang=en

As we celebrate, maybe a turkey dinner and a glass of wine, the 16th anniversary of 9-11, let's not forget all the people who died:

And let's remember: not a single warrant, not a single arrest, no serious investigation into the theory of collapse physics. Either our "custodians" are asleep at the wheel or they're too busy eating hand-peeled grapes. The time for answers is: Now! "They" are retiring and dying, and as almost "everyone knows", dead men tell no tales, true or false.

Is there such a thing, in the pool of 323.1 million (2016) American citizens, as one lawyer with the balls, or lips, to challenge NIST? Maybe an insurance company? Like the one financing the rebuild. Where's Ban Ki-moon when we need him?

My apologies, one last time, I didn't mean to turn this into a platform.

I'm going to close with the last hymn of the day, sung by a group of 20th century space pilots. Let's work together.

Canned Heat - Future Blues Cover 1970

Canned Heat - Let's work together

full link: http://www.youtube.com/watch?v=Hom0fYd5uX4

or, if the dancing girls are too much for you, here's the same song but it only has lip-synced guys in it.